Azure Container Service : Using the Azure File Storage as a persistent (kubernetes) volume

Introduction

Today’s post is a brief one… though it packs some punch! In the past I talked about storage patterns for Docker/containers. Today we’ll touch on how you can leverage Azure File Storage as shared & persistent storage for your container deployments. Kubernetes has been gaining a lot of traction, and it has support for Azure File Storage as a persistent volume too.
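As a small teaser ; in Kubernetes, mounting an Azure File share into a pod boils down to referencing the storage account credentials (stored as a Kubernetes secret) from an “azureFile” volume. Below is a minimal sketch, assuming a secret called “azure-secret” and a file share called “myshare” already exist (names are placeholders, and kubectl accepts JSON next to the more common YAML) ;

{
  "apiVersion": "v1",
  "kind": "Pod",
  "metadata": { "name": "azurefile-demo" },
  "spec": {
    "containers": [
      {
        "name": "web",
        "image": "nginx",
        "volumeMounts": [
          { "name": "azurefile", "mountPath": "/usr/share/nginx/html" }
        ]
      }
    ],
    "volumes": [
      {
        "name": "azurefile",
        "azureFile": {
          "secretName": "azure-secret",
          "shareName": "myshare",
          "readOnly": false
        }
      }
    ]
  }
}

The nice part ; the same share can be mounted by multiple pods (even across hosts), which is exactly what makes it usable as shared & persistent storage.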

 

Demo Files

Want to run this yourself? Check out the following GitHub repository!

Continue reading “Azure Container Service : Using the Azure File Storage as a persistent (kubernetes) volume”

Azure : Pushing Azure Resource Manager Templates through a CI/CD (release) pipeline with Visual Studio Team Services

Introduction

From template code to deployment… If we really want to control this, then we’ll push these templates through a CI/CD (continuous integration / continuous deployment) pipeline. What does that mean? We’re going to put our template in a source code repository (like GitHub). Every time we update the code, we’ll kick off a “build”. During this build we’ll package it (read : create a zip file of it), and it’s strongly advised to do testing here too. Next up, if all goes well, we’ll use that package to deploy to all our environments. And in the end, we want to have a nice view on all of this too…
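To make that concrete ; the artifact travelling through this pipeline is nothing more than the ARM template JSON (typically accompanied by a parameter file per environment). A bare-bones skeleton of such a template, with the actual resources left out, looks like this ;

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": { },
  "variables": { },
  "resources": [ ],
  "outputs": { }
}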

Why do it like this? Quality! We all make mistakes. We want to detect them early and not repeat them. And we want to put every change through the exact same process… time and time again!

 

Prerequisites

Starting off, I’m assuming you already have VSTS (Visual Studio Team Services) in place. If not, register for it! It’s free for up to 5 users. And let’s be honest, at about 5€ per user per month & 8€ per build agent per month… it’s still a steal! 😉

Continue reading “Azure : Pushing Azure Resource Manager Templates through a CI/CD (release) pipeline with Visual Studio Team Services”

10 years… Time goes fast!

WOW, really… WOW! It has been 10 years since I started my blog…! And here we are… more than 500 posts later, with more than 300 comments posted. At the moment this blog gets about 7000 visitors per month. My head still cannot fathom this.


When I first started, it was an attempt to document the things I discovered. As someone who has been an active member of the OSS community, I’ve always seen giving back to the community as everyone’s task. I often joke that my job is “Google premium”. Though when looking at ourselves, we must admit that a big chunk of our job relies on finding the right information. Therefore I always publish my findings online, in the hope that they might help someone else out there. That way we can all keep evolving at a great pace. 🙂

 

Anyhow, as a closing thought for today ;

Thanks to every single visitor out there!

Azure : Deploying a domain controller via DSC pull

Introduction

Today’s blog post will showcase how you can leverage the DSC pull feature of Azure Automation when deploying workloads to Azure. For many, the following question will pop up ; “Why use a pull mechanism, whilst I could use the DSC extension to push my configs?”. The answer is pretty simple : the pull mechanism facilitates the lifecycle flow of workloads better. You can easily update the configuration of the virtual machine and follow up on the rollout of your configuration.

 

The Flow

Now how would such a flow go?

  1. We’ll use an ARM template to deploy our Azure Automation Account (and afterwards keep it up-to-date).
  2. We’ll use a script to import the PowerShell modules into our Azure Automation Account, which are needed to compile configurations.
  3. We’ll use a script to import & compile the DSC configurations into our Azure Automation Account.
  4. We’ll use an ARM template to deploy the domain controller.
  5. This ARM template will also register the VM with the Azure Automation Account and link it to a given DSC configuration (see the sketch below).
  6. The configuration will be applied and the updates will be reported back to the Azure Automation Account.
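As a rough illustration of step 5 ; registering the VM as a DSC node typically means adding a “Microsoft.Powershell.DSC” extension resource to the ARM template, pointing it at the registration URL & key of the Azure Automation Account. The sketch below follows the common quickstart pattern (trimmed down, dependsOn omitted) ; the parameter names & api-version are illustrative and may differ from the demo files.

{
  "type": "Microsoft.Compute/virtualMachines/extensions",
  "name": "[concat(parameters('vmName'), '/Microsoft.Powershell.DSC')]",
  "apiVersion": "2015-06-15",
  "location": "[resourceGroup().location]",
  "properties": {
    "publisher": "Microsoft.Powershell",
    "type": "DSC",
    "typeHandlerVersion": "2.22",
    "autoUpgradeMinorVersion": true,
    "protectedSettings": {
      "Items": { "registrationKeyPrivate": "[parameters('registrationKey')]" }
    },
    "settings": {
      "Properties": [
        {
          "Name": "RegistrationKey",
          "TypeName": "System.Management.Automation.PSCredential",
          "Value": { "UserName": "PLACEHOLDER_DONOTUSE", "Password": "PrivateSettingsRef:registrationKeyPrivate" }
        },
        { "Name": "RegistrationUrl", "TypeName": "System.String", "Value": "[parameters('registrationUrl')]" },
        { "Name": "NodeConfigurationName", "TypeName": "System.String", "Value": "[parameters('nodeConfigurationName')]" }
      ]
    }
  }
}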

 

Deep-dive Demo

Continue reading “Azure : Deploying a domain controller via DSC pull”

Azure Governance – Policy Automation

Introduction

In my last post I talked about the possibility to manage “Azure Resource Manager Policies” via the portal. While the portal is a good location to view the policies, it is not the area where you want to be managing them! In today’s post, we’ll look at how we can automate these things. This is to ensure that all policies are effective towards their scope and remain that way. Once your subscription grows, you can have way too many resources & resource groups on your hands. Setting things up manually is not the way to go…
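To give an idea of what such automation ends up doing under the hood ; assigning a policy definition to a scope is a single REST call, a PUT on …/providers/Microsoft.Authorization/policyAssignments/{name} (api-version left out here on purpose). A sketch of the request body, with placeholder ids ;

{
  "properties": {
    "displayName": "Allowed locations on this subscription",
    "policyDefinitionId": "/subscriptions/{subscription-id}/providers/Microsoft.Authorization/policyDefinitions/allowed-locations",
    "scope": "/subscriptions/{subscription-id}",
    "parameters": {
      "allowedLocations": { "value": [ "westeurope", "northeurope" ] }
    }
  }
}

Wrap that call (or its PowerShell equivalent) in a script that loops over your subscriptions & resource groups, and you have one possible basis for automating this.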


 

Microsoft Azure Enterprise Scaffold

How to do governance in Azure is a very common question. So if you have found yourself asking questions in regard to that topic, you are not alone! One of the prime resources I can recommend in this area is the “Microsoft Azure Enterprise Scaffold” ;

The scaffold is based on practices we have gathered from many engagements with clients of various sizes. Those clients range from small organizations developing solutions in the cloud to Fortune 500 enterprises and independent software vendors who are migrating and developing solutions in the cloud. The enterprise scaffold is “purpose-built” to be flexible to support both traditional IT workloads and agile workloads, such as developers creating software-as-a-service (SaaS) applications based on Azure capabilities.

Continue reading “Azure Governance – Policy Automation”

Azure Governance – Policies in public preview on the portal

Introduction

Ever wondered if you can put policies on the deployment of resources in Azure? Yes, you can, via “Resource Policies”.

This used to be only possible via JSON deployments like the following ;

{
  "properties": {
    "parameters": {
      "allowedLocations": {
        "type": "array",
        "metadata": {
          "description": "The list of locations that can be specified when deploying resources",
          "strongType": "location",
          "displayName": "Allowed locations"
        }
      }
    },
    "displayName": "Allowed locations",
    "description": "This policy enables you to restrict the locations your organization can specify when deploying resources.",
    "policyRule": {
      "if": {
        "not": {
          "field": "location",
          "in": "[parameters('allowedLocations')]"
        }
      },
      "then": {
        "effect": "deny"
      }
    }
  }
}

The good news is that this feature is now in public preview and will be available via the portal!

Continue reading “Azure Governance – Policies in public preview on the portal”

Single Page Webapp : How to secure your app and your API with Azure Active Directory

Introduction

A few months ago I did a post on using PHP to connect to the Azure management API. And a week ago I did a demo on how to secure a “classic” webapp with Azure Active Directory. Today we’ll look at how to secure a single page webapp by using Azure Active Directory. For today’s post I’ll be using two webapps ;

  • Front end ; a small webapp based on AngularJS
  • Backend ; also a small webapp based on PHP, which will serve the API calls made from the front end

Why does this kind of setup differ from a “classic” approach? With single page apps, we see a very clear segregation of backend & front end. When the backend & front end are combined, we often see simpler mechanisms used, often based on session information. When the two are clearly separated, we’ll need to authenticate to both individually… I’ve often seen the error where organizations just protect the front end, as this is where the user logs in, and forget to secure the backend API… An unsecured API means that everyone who can reach that API will be able to retrieve (or delete/adjust) the data it serves. Let that one sink in!

 

Flow of the day

So what will we be doing today?

  1. A user accesses our front end
  2. If the user is not authenticated, (s)he will be redirected to Azure Active Directory (AAD) to log in
  3. AAD will redirect back (on success) with an authorization token
  4. We’ll inject this authorization token into the calls made to the backend (to prove the user’s identity)
  5. The backend API will validate the authorization token and verify it against the issuer (AAD) (see the decoded token sketch below)
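To make step 5 a bit more tangible ; the token AAD hands out is a JWT, and the backend essentially verifies its signature and a handful of claims. A decoded access token payload looks roughly like this (all values are placeholders) ;

{
  "aud": "https://myapi.contoso.com",
  "iss": "https://sts.windows.net/{tenant-id}/",
  "iat": 1500000000,
  "nbf": 1500000000,
  "exp": 1500003600,
  "tid": "{tenant-id}",
  "appid": "{client-id-of-the-front-end}",
  "upn": "someone@contoso.com"
}

If the audience (“aud”) or issuer (“iss”) doesn’t match what the backend expects, or the token has expired, the call gets rejected.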

 

Continue reading “Single Page Webapp : How to secure your app and your API with Azure Active Directory”