Using Azure DevOps to deploy your static webpage (SPA) to Azure Storage

Introduction

To, without shame, borrow the introduction of the “Static website hosting in Azure Storage” documentation page:

 

Azure Storage now offers static website hosting, enabling you to deploy cost-effective and scalable modern web applications on Azure. On a static website, webpages contain static content and JavaScript or other client-side code. By contrast, dynamic websites depend on server-side code, and can be hosted using Azure Web Apps.

As deployments shift toward elastic, cost-effective models, the ability to deliver web content without the need for server management is critical. The introduction of static website hosting in Azure Storage makes this possible, enabling rich backend capabilities with serverless architectures leveraging Azure Functions and other PaaS services.

 

Which, to me, sounds great! One of my projects (VMchooser) is actually a static site (a VueJS-based Single Page App) that could just as well run on Azure Storage, thus reducing my cost footprint. So today we’re going to test that out, and afterwards integrate it into our existing CI/CD pipeline (powered by Azure DevOps).

 

 

Enabling the Static Website feature

How to enable the feature? Just go to the “Static website” section of the storage account, enable it, and provide an index document name. Done. 😉

Or almost, you still need to upload the website of course!
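If you prefer the command line over the portal, the same feature can be enabled via the Azure CLI. A minimal sketch, where the account name (and the optional 404 document) are placeholders for your own setup:

```shell
# Enable static website hosting on the storage account
# (<youraccount> is a placeholder for your own storage account name)
az storage blob service-properties update \
    --account-name <youraccount> \
    --static-website \
    --index-document index.html \
    --404-document 404.html

# This also provisions the special "$web" container,
# which will hold the site content.
```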

 

Surprise of the day: the Azure CLI

Normally I would tend towards the Azure Storage Explorer, though for a CI/CD pipeline that would become “challenging”. 😉

The next option was to use AzCopy, as this is one of the native extensions in Azure DevOps. Here I must admit that my experience wasn’t that positive, as I was unable to tweak the pattern to do a batch upload of all my files/folders in a recursive manner. Next to that, I also found that the hosted build agents didn’t include AzCopy yet.

As I was doing some work on the CLI side anyway, I was curious whether the Azure CLI would perhaps include AzCopy, given that the Azure CLI is included in the hosted build agents. To my big surprise, the Azure CLI even offers a native option for file uploads (even in batch!).

So let’s test that one out…
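A minimal sketch of that batch upload, assuming the build output lives in a local ./dist folder and using placeholders for the account name and key:

```shell
# Recursively upload the entire build output to the $web container
# <youraccount> and <yourkey> are placeholders for your own values
az storage blob upload-batch \
    --source ./dist \
    --destination '$web' \
    --account-name <youraccount> \
    --account-key <yourkey>
```

Note the single quotes around '$web', so the shell does not try to expand it as a variable.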

Yihah!

That’s going to be … wait for it… legendary!

 

Proof of the pudding…

So we enabled the feature… We uploaded the site… Now does it work?

Yes it does! Now let’s automate this…
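As a quick smoke test before automating anything, you can hit the static website endpoint with curl. The exact hostname is shown on the “Static website” blade of the storage account; the zone segment (z6 here) varies per account, so treat this URL as a placeholder:

```shell
# Check that the primary static website endpoint serves the index document
# (<youraccount> and the z6 zone segment are placeholders)
curl -I https://<youraccount>.z6.web.core.windows.net/
# Expect an HTTP 200 status for a healthy deployment
```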

 

Azure DevOps Integration

Now let’s see how we can achieve this in Azure DevOps. Basically, we publish to our “dev” (preview) environment and to our production one, where both are powered by Azure Storage.

In terms of involved tasks, this is pretty simple:

  • One to extract our build artifact
  • One to upload our new website to the Azure storage

With the extract… nothing fancy to see here.

The Azure CLI task then uses “az storage blob upload-batch” to get everything done:

As you can see, everything uses variables. I’ve set up the destination ($web) to be the same for all releases, though each environment has its own StorageAccountName & StorageAccountKey.
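For illustration, the inline script of that Azure CLI release task could look roughly like this; the artifact path and variable names are assumptions based on a typical release setup and may differ in yours:

```shell
# Inline script for the Azure CLI release task.
# $(System.DefaultWorkingDirectory), $(StorageAccountName) and
# $(StorageAccountKey) are Azure DevOps pipeline variables;
# the "drop" artifact folder is an assumption.
az storage blob upload-batch \
    --source "$(System.DefaultWorkingDirectory)/drop" \
    --destination '$web' \
    --account-name "$(StorageAccountName)" \
    --account-key "$(StorageAccountKey)"
```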

So now I’m running two deployments in parallel, where in the (near) future I’ll switch over (in phases) towards Azure Storage!

 

Closing Thoughts

When migrating from my existing implementation towards Azure Storage, I’ll also be leveraging Azure Front Door (“AFD”) for SSL termination in combination with custom domains. Once the “naked domain” capability is released for AFD, I’ll start the migration away from App Service.
