Face API – Detection 101
In this post we dive into the Face API, part of the Microsoft Cognitive Services landscape. We will walk through the most basic setup and create a simple Azure Function.
The two most important features of this API are face detection and face identification. To get started you need a Face API endpoint, which you can create through the Azure Portal. There is a free tier that limits the number of requests per second and caps your total calls per month, and a paid tier that gives you unlimited calls per month with a limit of 10 requests per second.
Getting started is very easy: grab the Face API SDK from NuGet or call the REST endpoint directly. In this blog I will use the C# SDK in the examples.
First, let’s dive into the detection part of the API. By sending an image to this endpoint you start the detection process, which gives you back some great information about the faces it sees in the image. The information that the API returns can be divided into four parts.
FaceId This is a unique ID for the detected face. The API will remember this ID for 10 minutes.
FaceRectangle This is the bounding box of the face. It returns the offset from the top and left of the image, plus the width and height, all in pixels.
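As a small sketch of how you might read that bounding box (assuming `face` is a single face result returned by the SDK’s `DetectAsync` call shown later in this post):

```csharp
// Sketch: reading the bounding box of one detected face.
// "face" is assumed to be a single result from DetectAsync.
var rect = face.FaceRectangle;
int right = rect.Left + rect.Width;    // right edge in pixels
int bottom = rect.Top + rect.Height;   // bottom edge in pixels
Console.WriteLine($"Face at ({rect.Left},{rect.Top}), {rect.Width}x{rect.Height} px");
```

This is handy when you want to draw a rectangle over the source image or crop the face out of it.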
FaceAttributes This is, in my opinion, the most interesting part of the API. When calling the API through the SDK you can specify which attributes you want it to detect. The list grows with every release cycle and the service gets better and better over time. Some interesting new features are whether the person is wearing make-up and what their hair color is.
The complete list of attributes as of September 2017 is:
- Gender, the sex of the person
- Age, an estimate of the person’s age, usually a bit too high
- HairColor, newly added. Returns a confidence score per hair color, whether the hair is visible and whether the person is bald.
- Smile
- Headpose
- FacialHair, gives a confidence score for moustache, beard and sideburns.
- Glasses, can detect reading glasses, sunglasses or swimming goggles.
- Makeup, returns true or false for eye and lip makeup
- Emotion, can detect 8 types of emotions
- Occlusion
- Accessories
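To show how attribute selection looks in practice, here is a hedged sketch using the C# SDK: you pass the attributes you want as `FaceAttributeType` values, and read the results back from `FaceAttributes`. The API key and image URL below are placeholders.

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.ProjectOxford.Face;
using Microsoft.ProjectOxford.Face.Contract;

public static async Task PrintAttributesAsync()
{
    // Placeholder key and URL — replace with your own endpoint key and image.
    var client = new FaceServiceClient("[API KEY]");
    Face[] faces = await client.DetectAsync(
        "http://www.test.com/image.jpg",
        returnFaceId: true,
        returnFaceLandmarks: false,
        returnFaceAttributes: new[]
        {
            FaceAttributeType.Age,
            FaceAttributeType.Emotion,
            FaceAttributeType.FacialHair
        });

    foreach (var face in faces)
    {
        var attrs = face.FaceAttributes;
        Console.WriteLine($"Estimated age: {attrs.Age}");
        Console.WriteLine($"Happiness: {attrs.Emotion.Happiness}");
        Console.WriteLine($"Moustache confidence: {attrs.FacialHair.Moustache}");
    }
}
```

Only the attributes you ask for are computed and returned, so requesting fewer attributes keeps the response small.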
FaceLandmarks When making the call you can specify whether you want the facial landmarks to be returned. The landmarks are 27 points that make a face unique.
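Requesting the landmarks is a matter of one flag. A sketch (assuming `client` is a `FaceServiceClient` and `imageUrl` points at an image, as in the detection example in this post):

```csharp
// Sketch: ask for the 27 facial landmarks by passing returnFaceLandmarks: true.
// Each landmark is a coordinate (X, Y) in pixels within the image.
Face[] faces = await client.DetectAsync(imageUrl, returnFaceLandmarks: true);
foreach (var face in faces)
{
    var lm = face.FaceLandmarks;
    Console.WriteLine($"Left pupil:  ({lm.PupilLeft.X}, {lm.PupilLeft.Y})");
    Console.WriteLine($"Right pupil: ({lm.PupilRight.X}, {lm.PupilRight.Y})");
    Console.WriteLine($"Nose tip:    ({lm.NoseTip.X}, {lm.NoseTip.Y})");
}
```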
Let’s dive into some code and get you up and running in less than 3 lines.
// Create the client with your subscription key
FaceServiceClient faceService = new FaceServiceClient("[API KEY]");
string imageUrl = "http://www.test.com/image.jpg";
// Detect faces: return faceIds, skip landmarks, request three attributes
var faceResult = await faceService.DetectAsync(imageUrl, true, false, new[]
    { FaceAttributeType.Age, FaceAttributeType.Gender, FaceAttributeType.Smile });
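With the result in hand, reading back what we asked for is straightforward. A sketch, using the `faceResult` from the call above:

```csharp
// Sketch: reading back the attributes requested in the DetectAsync call.
foreach (var face in faceResult)
{
    Console.WriteLine($"FaceId: {face.FaceId}");
    Console.WriteLine($"Gender: {face.FaceAttributes.Gender}");
    Console.WriteLine($"Age:    {face.FaceAttributes.Age}");
    Console.WriteLine($"Smile:  {face.FaceAttributes.Smile}"); // confidence between 0 and 1
}
```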