Today's post will be a tour of the backend of “Frietjes-of-Niet” (Dutch for “Fries-or-Not?”). A big part of Azure's mission is about democratizing technology so it becomes accessible to organizations and helps them achieve more. AI (Artificial Intelligence) is a key part of that vision.
What will be the flow for today?
- We’ll train a model to recognize fries
- Next we’ll be exporting that model to be used as a container
- Afterwards we’ll build that container
- To end with deploying (and testing) it onto AKS
Sound cool? Let's get to it…
Training a model
Just log in to the Custom Vision portal and click “New project”.
You’ll notice that there are several “domains”, each of which is tailored to a given use case. Do be aware that if you want to export the model, in order to use it inside of a container, only the “compact” domains are supported for export.
So in this case I’ve selected the “General (compact)” domain, and we’ll click on “Create project”.
Next up, we’ll start training the model. So click on the “Add images” link and select the pictures you want to tag.
Here I’ll be uploading “some” pictures of fries…
Hungry yet? 😉
Once they have been added, we’ll tag them with “Frietjes” (the Dutch word for “fries”) and proceed to upload them.
Once uploaded, we can kick off the “training” of our model by clicking on “Train”.
We’ll see that a new training iteration has been kicked off.
Once done, it’ll show us the precision of our model per tag.
Basically, this is the entire training step for our model… Depending on your results, you can add iterations to see what gives the best result for your specific workload. Anyhow, we’ll now do a quick test by pressing “Quick Test” and providing a new picture of fries.
Here we’ll notice that it’s 100% sure they are fries. Now let’s try the same with some yellow/golden twine, in the hope that our model gets fooled into thinking it’s fries…
And luckily it wasn’t fooled! Next to our iteration, we’ll see which images were tested, and we can go back and review the results of those tests later on.
For now we’ll leave the model as is and proceed to exporting it.
Exporting the model
In the performance tab, you’ll notice an option called “Export”. So let’s click on this to get our model.
Next up, you can choose the platform you want the model to be exported for:
I’ll be choosing “Dockerfile” here, as I want to build my own container from it.
After selecting Linux, it’ll start preparing this request…
Which will come as a nice zip file.
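If you just want to try the container locally, that zip already contains everything Docker needs. A minimal sketch, assuming the zip sits in the current directory; the zip name and image tag below are my own placeholders, not what the portal will give you:

```shell
# Names below are assumptions; use whatever your export is actually called.
unzip CustomVision-export.zip -d frietjes-of-niet
docker build -t frietjes-of-niet:dev frietjes-of-niet/
```

In my case I’ll go via GitHub and Docker Hub instead, as shown next.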
(Optional) Source Control & Automated Build for the container
In this phase, I’m going to add that package to GitHub. Afterwards I’ll link that repository to Docker Hub in order to get a quick & easy automated build. So let’s add the code to GitHub… Your method may vary; I’ll be using my WSL for this.
Unzip it to its own directory…
Then add all the files in an initial commit.
And pushing it to GitHub!
And there we go!
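For reference, the whole local sequence boils down to something like this. The directory, placeholder file, and remote URL are illustrative only, not my exact ones:

```shell
# Assumption: the export was unzipped into this directory beforehand;
# the placeholder file below only stands in for the exported model files.
repo=/tmp/frietjes-of-niet
mkdir -p "$repo"                                  # in reality: unzip the export here
echo "exported model files go here" > "$repo/PLACEHOLDER.txt"
git -C "$repo" init
git -C "$repo" add .
git -C "$repo" -c user.name="demo" -c user.email="demo@example.com" \
    commit -m "Initial commit: exported Custom Vision model"
# Then wire it up to GitHub (placeholder remote):
#   git -C "$repo" remote add origin git@github.com:<you>/frietjes-of-niet.git
#   git -C "$repo" push -u origin master
```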
(Sidenote: I love that GitHub provides insight into vulnerable dependencies. So I’ve taken the “liberty” of updating the dependency to a non-vulnerable version.)
Now let’s go over to Docker Hub and create an automated build:
Next up, select GitHub and our repository
Enter a short description and press “Create”.
Now head over to “Build Settings” and press “Trigger”.
And after a while it’ll start building…
And we’ll end up with an image carrying the “latest” tag.
(Sidenote: using the “latest” tag is far from best practice! Though for my quick demo, it’ll do…)
Let’s copy the pull command for later on.
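To double-check the image, you could also pull and run it locally first. A sketch, where the repository name is a placeholder for whatever yours is called; the port mapping follows the exported package’s readme, which runs the scoring API on port 80:

```shell
# Repository name is a placeholder; use the pull command Docker Hub shows you.
docker pull <your-docker-id>/frietjes-of-niet:latest
docker run -d -p 127.0.0.1:80:80 <your-docker-id>/frietjes-of-niet:latest
```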
Deploying the container
Earlier this week I was test-driving Rancher in order to provide workload orchestration on top of AKS. I’ll be continuing down that path for the deployment of our little container. Let’s deploy a workload onto our cluster…
We’ll give it its own namespace… and, last but not least, we’ll also add a port mapping, where we’ll select the L4 load balancer. This will showcase the AKS integration with the Azure Load Balancer!
Once created, we’ll see that the deployment has started.
Which is also visible in the UI.
Now let’s check the exposed service. As I didn’t notice one being created, I created one manually.
Where I ended up with two services for my deployment… I should really learn to be a bit more patient.
Anyhow, here we can see that the services have been exposed externally via the Azure Load Balancer.
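For those not using Rancher, the equivalent can be sketched with plain kubectl against the AKS cluster. The namespace, deployment name, and image are placeholders; a service of type LoadBalancer is what triggers the Azure Load Balancer integration:

```shell
kubectl create namespace frietjes
kubectl -n frietjes create deployment frietjes-of-niet \
  --image=<your-docker-id>/frietjes-of-niet:latest
# --type=LoadBalancer makes AKS provision an Azure Load Balancer rule:
kubectl -n frietjes expose deployment frietjes-of-niet \
  --type=LoadBalancer --port=80 --target-port=80
kubectl -n frietjes get service frietjes-of-niet --watch   # wait for EXTERNAL-IP
```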
Now let’s test, shall we?
Testing the endpoint
What does our readme say?
Then use your favorite tool to connect to the endpoints.
Let’s replace the 127.0.0.1 with one of our service IPs and test!
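In my case that boiled down to roughly the following; the /image endpoint and the imageData form field come from the exported package’s readme, and the IP and filename are placeholders:

```shell
# Replace <EXTERNAL-IP> with your service's external IP; use any test picture.
curl -X POST http://<EXTERNAL-IP>/image -F imageData=@fries.jpg
```

If all is well, the response is a small JSON document listing the predicted tags with their probabilities.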
Every time, our image gets scored by our “FrietjesOrNot” model, all powered by the Custom Vision API!
I hope this post has provided you with the following insights:
- AI has become very accessible.
- Apart from the cloud-powered PaaS service, you can also export the model for your mobile app or edge scenario. And if you’re worried about “vendor lock-in”, you can even opt to export the model and run it anywhere else (for instance on a k8s cluster).
- French fries, despite the name coming from the word “frenched” (deep-fried), are truly Belgian… 😉