I started using Microsoft Cognitive Services a while back, learning their features and applying them to my smart home scenarios. Now that the services have moved to Azure, their old home is being retired and the old access keys are expiring. I couldn't find any way to properly migrate the data I had trained in the old services (like people's faces) to Azure, so I decided to start from scratch, beginning with service creation. This blog post deals with creating a new Cognitive Service, specifically a new Face API service.
Microsoft Cognitive Services
Microsoft Cognitive Services is a set of APIs that employ powerful machine learning algorithms to provide application developers with intelligent features like image and voice recognition, face identification, and language understanding. Started under the code name Project Oxford, with limited availability for testing, Cognitive Services have evolved significantly over time and were recently moved to the Microsoft Azure Portal.
Face API
Face API consists of two main groups of APIs: face detection and face recognition. While the face detection API can detect people's faces in provided images, along with each face's attributes (gender, age, emotion, makeup, ...) and the position of its features on that image (face, mouth, eyes, nose, ...), face recognition can actually recognize the detected faces on an image, if a detected face matches one of those in previously trained data.
Creating a new Face API service in Azure
Head to https://portal.azure.com and log into your account. Hit New (the big green plus) and search for Cognitive Services. You'll get a bunch of available apps in the results; select Face API.
In the above screen, enter the required data and click Create. Before that, consider selecting the right pricing tier for you. At the time of this writing, there are two tiers available:
- Free tier (F0) costs nothing, but is limited to 20 calls per minute and 30,000 calls per month overall,
- Standard tier (S0) is limited to 10 calls per second, and calls are charged per the table below
There is also a cost for stored images, if you need face identification and matching capabilities.
Now that the Face API resource is created, the most important thing is to take note of the pair of access keys available under the Resource Management -> Keys section.
These keys are associated with your Azure account and will allow you to access the Face API endpoints. Let's try them live with the API testing console!
Testing with API keys
Cognitive Services APIs have excellent documentation, with a ready-to-test API console; you can even choose which Azure region you want to use (note that not every service is available in every region).
Here's a link to the West Central US endpoints: https://westcentralus.dev.cognitive.microsoft.com/docs/services/563879b61984550e40cbbe8d. Following it will land you on the API documentation page, with an entry form used to supply required and optional parameters. Note that your API keys only work for the region you picked when creating the API service in Azure.
Query parameters let you pick what you want the service to return. If any faces are detected, setting returnFaceId to true will get you the IDs of those faces, in case you need them for subsequent calls. returnFaceLandmarks will, if set to true, return the positions of various face landmarks, if they were detected. For returnFaceAttributes, simply list the attributes you want returned. In the above example, I'm interested in the age and gender of any detected face.
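Put together, these parameters end up in the request's query string. Assuming the West Central US region, the final detect request URL would look something like this:

https://westcentralus.api.cognitive.microsoft.com/face/v1.0/detect?returnFaceId=true&returnFaceLandmarks=true&returnFaceAttributes=age,gender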
Don't forget to enter or paste your API key into the Ocp-Apim-Subscription-Key field.
Finally, use the request body to enter a valid image URL you want to test with, and hit the Send button below.
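The request body itself is a small JSON object carrying the image URL; the address below is just a placeholder:

{
  "url": "https://example.com/photo-with-faces.jpg"
}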
If everything went OK, you should get an HTTP 200 response with a detailed JSON result (I've clipped parts of the response below to shorten it):
[
  {
    "faceId": "***********************************",
    "faceRectangle": {
      "top": 79,
      "left": 57,
      "width": 43,
      "height": 43
    },
    "faceLandmarks": {
      "pupilLeft": {
        "x": 66.4,
        "y": 94.3
      },
      "pupilRight": {
        "x": 83.9,
        "y": 88.1
      },
      "noseTip": {
        "x": 79.4,
        "y": 102.0
      },
      "mouthLeft": {
        "x": 71.9,
        "y": 113.0
      },
      "mouthRight": {
        "x": 90.5,
        "y": 106.8
      },
      "eyebrowLeftOuter": {
        "x": 59.5,
        "y": 93.6
      },
      "eyebrowLeftInner": {
        "x": 70.2,
        "y": 90.1
      },
      ...
    },
    "faceAttributes": {
      "gender": "male",
      "age": 64.8
    }
  },
  {
    "faceId": "***********************************",
    "faceRectangle": {
      "top": 50,
      "left": 88,
      "width": 39,
      "height": 39
    },
    "faceLandmarks": {
      ...
      "noseRightAlarOutTip": {
        "x": 114.9,
        "y": 71.0
      },
      "upperLipTop": {
        "x": 110.3,
        "y": 77.4
      },
      "upperLipBottom": {
        "x": 110.7,
        "y": 78.2
      },
      "underLipTop": {
        "x": 112.3,
        "y": 81.7
      },
      "underLipBottom": {
        "x": 113.2,
        "y": 83.8
      }
    },
    "faceAttributes": {
      "gender": "female",
      "age": 60.0
    }
  }
]
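Outside the testing console, the same call is easy to make from code. Below is a minimal sketch in Python using the requests library; the subscription key and image URL are placeholders, and the endpoint assumes the West Central US region, so adjust it to wherever you created your resource:

import requests

# Placeholders: use your own Face API key and a publicly reachable image URL
subscription_key = "<your Face API key>"
image_url = "https://example.com/photo-with-faces.jpg"

# Region-specific Face API detect endpoint (here: West Central US)
endpoint = "https://westcentralus.api.cognitive.microsoft.com/face/v1.0/detect"

headers = {
    "Ocp-Apim-Subscription-Key": subscription_key,
    "Content-Type": "application/json"
}

# The same query parameters used in the console example above
params = {
    "returnFaceId": "true",
    "returnFaceLandmarks": "true",
    "returnFaceAttributes": "age,gender"
}

response = requests.post(endpoint, headers=headers, params=params, json={"url": image_url})
response.raise_for_status()

# Print the ID and the requested attributes for every detected face
for face in response.json():
    attributes = face["faceAttributes"]
    print(face["faceId"], attributes["gender"], attributes["age"])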
Wrap up
Now that the Face API Cognitive Service is up and running, you can do interesting things with it. This blog post was about setting the service up; follow-up posts will focus on use cases and other Cognitive Services.