Let me shamelessly quote the introduction of the “Static website hosting in Azure Storage” documentation page:
As deployments shift toward elastic, cost-effective models, the ability to deliver web content without the need for server management is critical. The introduction of static website hosting in Azure Storage makes this possible, enabling rich backend capabilities with serverless architectures leveraging Azure Functions and other PaaS services.
Which, to me, sounds great! One of my projects (VMchooser) is actually a static site (a VueJS-based Single Page App) that could just as well run on Azure Storage, thus reducing my cost footprint. So today we’re going to test that out, and afterwards integrate it into our existing CI/CD pipeline (powered by Azure DevOps).
Enabling the Static Website feature
How do you enable the feature? Just go to the “Static website” section of the storage account, enable it, and provide an index document name. Done. 😉
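For the automation-minded, the same can be done from the Azure CLI. A minimal sketch, where the storage account name is a hypothetical placeholder:

```shell
# Hypothetical storage account name - replace with your own.
ACCOUNT_NAME="mystaticsite"

# Enable static website hosting on the storage account and set
# the index document (and optionally an error document).
az storage blob service-properties update \
  --account-name "$ACCOUNT_NAME" \
  --static-website \
  --index-document index.html \
  --404-document 404.html
```

This creates the special $web container behind the scenes, which is where the site content will live.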
Surprise of the day: the Azure CLI
Normally I would tend towards the Azure Storage Explorer, though for a CI/CD pipeline… That would become “challenging”. 😉
The next option was to use AzCopy, as this is one of the native extensions in Azure DevOps. Here I must admit that my experience wasn’t that positive: I was unable to tweak the pattern to do a batch upload of all my files/folders recursively. On top of that, I also found that the hosted build agents didn’t include AzCopy yet.
As I was doing some work on the CLI side anyway, I was curious whether the Azure CLI would “perhaps” include AzCopy, since the Azure CLI is included in the hosted build agents. To my big surprise, the Azure CLI even offers a native option for file uploads (even in batch!).
So let’s test that one out…
That’s going to be … wait for it… legendary!
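A minimal sketch of that batch upload, assuming the site build output sits in a local ./dist folder (the account name and key are placeholders):

```shell
# Hypothetical values - replace with your own account name and key.
ACCOUNT_NAME="mystaticsite"
ACCOUNT_KEY="<storage-account-key>"

# Recursively upload everything under ./dist into the $web container
# (single-quoted so the shell does not expand $web as a variable).
az storage blob upload-batch \
  --account-name "$ACCOUNT_NAME" \
  --account-key "$ACCOUNT_KEY" \
  --source ./dist \
  --destination '$web'
```

One command, fully recursive, no AzCopy gymnastics required.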
Proof of the pudding…
So we enabled the feature… We uploaded the site… Now does it work?
Azure DevOps Integration
Now let’s see how we can achieve this in Azure DevOps. Basically, we publish to our “dev” (preview) environment and to our production one, both powered by Azure Storage.
In terms of the tasks involved, this is pretty simple ;
- One to extract our build artifact
- One to upload our new website to the Azure storage
With the extract… nothing fancy to see here.
The Azure CLI task then uses “az storage blob upload-batch” to get everything done ;
As you can see, everything uses variables. I’ve set up the destination ($web) to be the same for all releases, though each environment has its own StorageAccountName & StorageAccountKey.
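A sketch of the inline script for that Azure CLI release task. The variable names (StorageAccountName, StorageAccountKey) are the ones used in my setup and are set per environment; note that the $(…) tokens are substituted by Azure DevOps before the script runs, so they are not shell variables:

```shell
# $(StorageAccountName) / $(StorageAccountKey): pipeline variables,
# defined separately for the dev and production environments.
# The extracted build artifact path below is an assumption - adjust
# it to wherever your extract task drops the site files.
az storage blob upload-batch \
  --account-name "$(StorageAccountName)" \
  --account-key "$(StorageAccountKey)" \
  --source "$(System.DefaultWorkingDirectory)/drop" \
  --destination '$web'
```

The same task definition is reused across both stages; only the variable values differ.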
So I’m currently running two deployments in parallel, and in the (near) future I’ll switch over (in phases) to Azure Storage!
When migrating from my existing implementation to Azure Storage, I’ll also be leveraging Azure Front Door (“AFD”) for SSL termination in combination with custom domains. Once the “naked domain” capability is released for AFD, I’ll start the migration away from App Service.