Azure : Benchmarking SQL Database Setups – To measure is to know, and being able to improve…


To measure is to know. If you cannot measure it, you cannot improve it!

Today’s post will go more in-depth on what performance to expect from different SQL implementations in Azure. We’ll be focusing on two kinds of benchmarks: the storage subsystem and an industry benchmark for SQL. This way we can compare the different scenarios to each other in the most neutral way possible.


Test Setup

As a test bed, I started from one of my previous posts.


I used a DS1 v2 machine for the single-disk tests and a DS2 v2 machine for the multiple-disk tests. In terms of OS, I’ll be using Windows Server 2012 R2, with MSSQL 2014 (12.0.4100.1) as the database.
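Since we’ll be comparing single-disk and striped setups, it helps to have the theoretical ceilings in mind before looking at any benchmark numbers. A quick sanity-check sketch, using the per-disk limits documented for standard Azure data disks at the time (roughly 500 IOPS and 60 MB/s each; these are the published caps, not measured values):

```python
# Rough theoretical ceilings for a striped volume built from
# standard Azure data disks (documented caps of the era, not measurements).
PER_DISK_IOPS = 500
PER_DISK_MBPS = 60

def volume_ceiling(disk_count: int) -> tuple[int, int]:
    """Best-case IOPS and MB/s for a simple stripe over `disk_count` disks."""
    return disk_count * PER_DISK_IOPS, disk_count * PER_DISK_MBPS

for disks in (1, 2, 4):
    iops, mbps = volume_ceiling(disks)
    print(f"{disks} disk(s): up to ~{iops} IOPS, ~{mbps} MB/s")
```

If a benchmark lands far below these ceilings, the configuration (stripe layout, caching, queue depth) is the first suspect rather than the platform.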

Continue reading “Azure : Benchmarking SQL Database Setups – To measure is to know, and being able to improve…”

Storage Spaces : Create a non-clustered storage pool on a cluster instance


Today I ran into some issues when creating a Storage Spaces volume on a cluster instance. I wanted to use the performance benefits of joining multiple Azure storage disks by using Storage Spaces. Afterwards I wanted to use the volume with SIOS DataKeeper. The issue at hand was that the newly created storage pool would automatically register with the cluster. The cluster would then assume that the Azure disks were shared across the cluster.

Continue reading “Storage Spaces : Create a non-clustered storage pool on a cluster instance”

Azure : Performance limits when using MSSQL datafiles directly on a Storage Account


In a previous post I explained how you can integrate MSSQL with Azure storage by directly storing the data files on the storage account.


Now this made me wonder: what would the performance limitations of this setup be? After doing some research, the basic rule is that the same logic applies to “virtual disks” as to the “data files”… Why is this? They are both blobs; the virtual disk is a page blob presented as a “disk”, and the data files are stored as page blobs as well.
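To put numbers on that rule: at the time, a single page blob (VHD or data file alike) was capped at roughly 500 IOPS, while a standard storage account was capped at roughly 20,000 IOPS in total. A back-of-the-envelope sketch (the figures are the documented standard-storage limits of that era):

```python
# Documented standard-storage limits of the era (assumptions, not measurements).
ACCOUNT_IOPS_LIMIT = 20_000   # aggregate cap per standard storage account
PAGE_BLOB_IOPS_LIMIT = 500    # cap per page blob (VHD or data file alike)

# How many page blobs can run at full speed before the account throttles?
blobs_at_full_speed = ACCOUNT_IOPS_LIMIT // PAGE_BLOB_IOPS_LIMIT
print(blobs_at_full_speed)  # -> 40, whether those blobs are disks or data files
```

So whichever way you slice it, about 40 fully-loaded page blobs saturate one account; spreading data files over more blobs only helps until the account-level cap kicks in.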


Continue reading “Azure : Performance limits when using MSSQL datafiles directly on a Storage Account”

Azure : Setting up a highly available SQL cluster with standard edition


It is important to know that you will only get an SLA (99.95%) with Azure when you have two machines deployed (within one availability set) that do the same thing. If this is not the case, then Microsoft will not guarantee anything. Why is that? Because during service windows, a machine can go down. Those service windows are quite broad in terms of time, and you will not be able to negotiate or know the exact downtime.
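It is worth working out what that 99.95% actually buys you. A quick bit of arithmetic on the monthly downtime budget:

```python
# How much downtime does a 99.95% monthly SLA actually allow?
sla = 0.9995
minutes_per_month = 30 * 24 * 60  # ~43,200 minutes in a 30-day month

allowed_downtime = (1 - sla) * minutes_per_month
print(f"~{allowed_downtime:.1f} minutes per month")  # ~21.6 minutes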
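It is worth working out what that 99.95% actually buys you. A quick bit of arithmetic on the monthly downtime budget:

```python
# How much downtime does a 99.95% monthly SLA actually allow?
sla = 0.9995
minutes_per_month = 30 * 24 * 60  # ~43,200 minutes in a 30-day month

allowed_downtime = (1 - sla) * minutes_per_month
print(f"~{allowed_downtime:.1f} minutes per month")  # ~21.6 minutes
```

Roughly twenty minutes a month, and only when you run the paired setup; with a single machine there is no budget at all, just best effort.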

That being said… Setting up your own highly available SQL database is not that easy. There are several options, though it basically boils down to the following:

  • an AlwaysOn Availability Groups setup
  • a Failover Cluster backed by SIOS DataKeeper

While I really like AlwaysOn, there are two downsides to that approach:

  • to really enjoy it, you need the enterprise edition (which isn’t exactly cheap)
  • not all applications support AlwaysOn with their implementations

So a lot of organisations were stranded when it came to moving SQL to Azure. Though, thank god, a third-party tool introduced itself: SIOS DataKeeper! Now we can build our traditional Failover Cluster on Azure.



Before we start, let’s delve into the design for our setup:


Continue reading “Azure : Setting up a highly available SQL cluster with standard edition”

Azure : Integrating MSSQL data files with Azure Storage


In the previous post I showed you how to set up a backup to Azure storage, and I also mentioned that you can place your data/log files on Azure storage. An important note: this feature does not work with the storage account key; you’ll need to set it up with SAS (Shared Access Signature) tokens.
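The distinction matters because a SAS token is not a master key but a signed query string that scopes permissions and carries an expiry. A minimal sketch that pulls apart a hypothetical (truncated) container SAS token to show the fields involved:

```python
from urllib.parse import parse_qs

# A hypothetical container SAS token (the signature is shortened for readability).
sas_token = "sv=2015-04-05&sr=c&sp=rwl&se=2017-01-01T00:00:00Z&sig=abc123"

fields = parse_qs(sas_token)
print(fields["sv"][0])  # storage service version the token targets
print(fields["sp"][0])  # granted permissions: read, write, list
print(fields["se"][0])  # expiry time; after this the credential is useless
```

Unlike the account key, a leaked SAS token only exposes one container, with limited rights, until it expires, which is exactly why SQL Server requires this form of credential for data files.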

Continue reading “Azure : Integrating MSSQL data files with Azure Storage”

Azure : Debugging VPN Connectivity on Resource Manager

Debugging failed VPN tunnels can be quite annoying… Today we had an issue with a new deployment that had us on a wild goose chase for a while. So here is a quick post to give all of you some tracking points:

  • The first VPN gateway that receives a packet which needs the tunnel will initiate the connection. In ARM you have no way to manually initiate the connection.
    As a side effect, the destination gateway is typically the one with the most useful information regarding the VPN connection. So when debugging, look at that gateway.
    Therefore I would suggest starting a ping from an Azure VM (within the VNET) towards the local network. This will kickstart the connection process.
  • The diagnostics part on the Azure side is quite “basic” and well hidden… Actually, the commands to get diagnostics are only available in “classic” mode, though you can work your way around it. Check out the following post for more information on getting diagnostics for the VNET gateway on Resource Manager.
  • With the change from “Classic” to “Resource Manager”, there was also a change in the naming of the VPN types. Previously we had “static” and “dynamic”; the “static” connection became “policy-based” and the “dynamic” one became “route-based”. In effect, a “route-based” deployment relies on IKEv2, where a “policy-based” deployment relies on IKEv1. This is VERY important to know, as it affects the number of tunnels you can build. In addition, there are a lot of VPN gateways that do not support IKEv2 (at this moment).
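The naming translation above can be summarized in a small lookup table. A sketch (the tunnel counts are the published limits of the time and may differ per gateway SKU):

```python
# Mapping of the old "classic" VPN type names to their ARM equivalents and
# the IKE version each relies on. Tunnel counts are the published limits of
# the era (assumption: Basic/Standard SKU; high-performance SKUs allow more).
VPN_TYPES = {
    "static":  {"arm_name": "policy-based", "ike": "IKEv1", "max_s2s_tunnels": 1},
    "dynamic": {"arm_name": "route-based",  "ike": "IKEv2", "max_s2s_tunnels": 10},
}

def ike_version(classic_name: str) -> str:
    """Return the IKE version a given classic VPN type maps to."""
    return VPN_TYPES[classic_name]["ike"]

print(ike_version("static"))   # IKEv1: single tunnel, widest device support
print(ike_version("dynamic"))  # IKEv2: multiple tunnels, not every device supports it
```

So before picking a gateway type, check which IKE version your on-premises device actually speaks; an IKEv1-only appliance locks you into the single-tunnel, policy-based setup.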

Good luck troubleshooting!

Azure : Storage Explorer Preview


Another Build announcement was the “Storage Explorer Preview”…

  • Main Features
    • Mac OS X, Linux, and Windows versions
    • Sign in to view your Storage Accounts – use your Org Account, Microsoft Account, 2FA, etc
    • Connect to Storage Accounts using:
      • Account name and Key
      • Custom endpoints (including Azure China)
      • SAS URI for Storage Accounts
    • ARM and Classic Storage support
    • Generate SAS keys for blobs, blob containers, queues, or tables
    • Connect to blob containers, queues, or tables with a Shared Access Signature (SAS) key
    • Manage Stored Access Policies for blob containers, queues, and tables
    • Local development storage with Storage Emulator (Windows-only)
    • Create and delete blob containers, queues, or tables
    • Search for specific blobs, queues, or tables
  • Blobs
    • View blobs and navigate through directories
    • Upload, download, and delete blobs and folders
    • Open and view the contents of text and picture blobs
    • View and edit blob properties and metadata
    • Search for blobs by prefix
    • Drag ‘n drop files to upload
  • Tables
    • View and query entities with OData
    • Insert prewritten queries with “Add Query” button
    • Add, edit, delete entities
  • Queues
    • Peek most recent 32 messages
    • Add, dequeue, view messages
    • Clear queue

Continue reading “Azure : Storage Explorer Preview”