How to use the t6-IoT image preprocessor to trigger actions based on facial expression recognition in images

In this recipe, the goal is to trigger an Email action after t6-IoT detects a human facial expression in a provided image.

Check prerequisites

In this recipe, we’ll use the following concepts:

  • an aidc (Automatic Identification and Data Capture) Preprocessor to identify the facial expression
  • a Flow to store specific measurements
  • a Rule to trigger an email after the facial expression is identified in the image
  • We’ll also set the datatype to String so that the Flow will contain the facial expression

Setup the Flow container

This step is straightforward and does not require anything special. We’ll customize this Flow with a String datatype.

So, the first step is to create this Flow using the following payload. For more details on Flows, read the technical documentation.

{
    "name": "My AIDC Flow to identify facial expression from images",
    "data_type": "a394e18f-12bd-4c22-b9c3-74c387d1a8db",
    "preprocessor": [
        {
            "name": "aidc",
            "mode": "faceExpressionRecognition"
        }
    ]
}

Once your Flow is created, take note of the flow.data.id in the API results. This value will be used when creating datapoints and is referred to below as the variable {{$flow_id}}.
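
If you prefer scripting the calls rather than using a REST client, a minimal Python sketch for creating the Flow could look like the following. The base URL, endpoint path and Bearer-token authentication are assumptions here; adapt them to your own t6 instance and to the authentication flow described in the technical documentation.

# Minimal sketch: create the Flow and keep its id for the next steps.
import requests

T6_API = "https://api.internetcollaboratif.info"   # assumed base URL
TOKEN = "your-JWT-token"                            # assumed Bearer-token authentication

flow_payload = {
    "name": "My AIDC Flow to identify facial expression from images",
    "data_type": "a394e18f-12bd-4c22-b9c3-74c387d1a8db",
    "preprocessor": [{"name": "aidc", "mode": "faceExpressionRecognition"}],
}

response = requests.post(
    f"{T6_API}/v2.0.1/flows",                       # assumed endpoint path
    json=flow_payload,
    headers={"Authorization": f"Bearer {TOKEN}"},
)
response.raise_for_status()
flow_id = response.json()["data"]["id"]             # flow.data.id; adjust to the exact response shape
print("flow_id:", flow_id)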

Create the Rule that will trigger the Email

{
    "name": "Trigger an email when aidc identify a sad facial expression",
    "rule": {
        "conditions": {
            "all": [
                {
                    "fact": "flow",
                    "operator": "equal",
                    "value": "65e2ca88-adf1-431b-a2f4-82497f54f32f"
                },
                {
                    "fact": "value",
                    "operator": "equal",
                    "value": "sad"
                }
            ]
        },
        "event": {
            "type": "email",
            "params": {
                "to": "{{$your_own_email@domain.invalid}}",
                "subject": "Facial recognition on t6 Flow {flow}",
                "text": "Facial recognition on t6 Flow {value}",
                "html": "<h1>Hello</h1>Facial recognition on t6 Flow<br />Value: {value}"
            }
        },
        "priority": 1
    },
    "active": true
}

Need more details on Rules? Read the technical documentation.
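
If you are scripting this step as well, the sketch below posts the Rule payload saved locally as rule.json (a hypothetical file name) and points the first condition at your own Flow: the flow fact must equal the flow.data.id noted earlier, otherwise the Rule will never match. As before, the base URL, endpoint path and authentication are assumptions.

# Minimal sketch: create the Rule, re-using the payload shown above.
import json
import requests

T6_API = "https://api.internetcollaboratif.info"    # assumed base URL
TOKEN = "your-JWT-token"                             # assumed Bearer-token authentication
flow_id = "65e2ca88-adf1-431b-a2f4-82497f54f32f"     # replace with your own flow.data.id

with open("rule.json") as f:                         # hypothetical file holding the payload above
    rule_payload = json.load(f)

# The "flow" condition must target your own Flow for the Rule to match.
rule_payload["rule"]["conditions"]["all"][0]["value"] = flow_id

response = requests.post(
    f"{T6_API}/v2.0.1/rules",                        # assumed endpoint path
    json=rule_payload,
    headers={"Authorization": f"Bearer {TOKEN}"},
)
response.raise_for_status()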

Let’s put it all together: post the image datapoint

Before posting the datapoint, you’ll need to make sure the payload contains a valid base64-encoded image string. You can use an online service to do that, or encode the image locally as shown in the sketch below.
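
A minimal Python sketch for the local encoding, using only the standard library (the file name face.jpg is just an example):

# Base64-encode a local image so it can be sent as the datapoint value.
import base64

with open("face.jpg", "rb") as image_file:
    encoded_value = base64.b64encode(image_file.read()).decode("ascii")

print(encoded_value[:60], "...")   # preview of the string to place in the "value" field

Then place the encoded string in the value field of the datapoint payload: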

{
    "save": false,
    "publish": true,
    "flow_id": "{{$flow_id}}",
    "mqtt_topic": "image-test-processing",
    "preprocessor": [
        {
            "name": "aidc",
            "mode": "faceExpressionRecognition"
        }
    ],
    "value": "/9j/4AAQSkZJRgABAQEASABIAAD[...TREUNCATED...]RzKaSpfrFa3M30wYeFkOfsav/Z"
}
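
If you are scripting the call, a minimal sketch posting this payload could look like the following; the base URL, endpoint path and authentication are assumptions, so check the Datapoints documentation for the exact route on your instance.

# Minimal sketch: post the datapoint carrying the base64-encoded image.
import requests

T6_API = "https://api.internetcollaboratif.info"    # assumed base URL
TOKEN = "your-JWT-token"                             # assumed Bearer-token authentication
flow_id = "your flow.data.id"                        # the {{$flow_id}} noted earlier

datapoint_payload = {
    "save": False,
    "publish": True,
    "flow_id": flow_id,
    "mqtt_topic": "image-test-processing",
    "preprocessor": [{"name": "aidc", "mode": "faceExpressionRecognition"}],
    "value": "<base64-encoded image string>",        # see the encoding sketch above
}

response = requests.post(
    f"{T6_API}/v2.0.1/data/{flow_id}",               # assumed endpoint path
    json=datapoint_payload,
    headers={"Authorization": f"Bearer {TOKEN}"},
)
response.raise_for_status()
result = response.json()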

And voilà, you’ll notice in the API results that the initial value has been transformed into a String describing the expression recognized in the image. Additionally, the preprocessor adds a dedicated expressions node to the result, providing the full expression scores. The Rule then matches that value and triggers the Email notification.

"value": "sad",
"preprocessor": [
    {
        "name": "aidc",
        "mode": "faceExpressionRecognition",
        "initialValue": "1658669719797000000-faceExpressionRecognition.png",
        "status": "completed",
        "expressions": {
            "neutral": 9.418351254453228e-8,
            "happy": 1.1365385715889076e-10,
            "sad": 0.9999997615814209,
            "angry": 7.31789351338108e-10,
            "fearful": 1.6018465487377398e-7,
            "disgusted": 8.681204626687089e-13,
            "surprised": 6.788646977895496e-9
        },
        "recognizedValue": "sad",
        "expressionValue": 0.9999997615814209
    }
],
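
The expressions node carries a score for every candidate expression, and recognizedValue/expressionValue report the best-scoring one, as this quick Python check on the excerpt above illustrates (the dict literal is copied from the result):

# Pick the highest-scoring expression from the preprocessor result.
expressions = {
    "neutral": 9.418351254453228e-8,
    "happy": 1.1365385715889076e-10,
    "sad": 0.9999997615814209,
    "angry": 7.31789351338108e-10,
    "fearful": 1.6018465487377398e-7,
    "disgusted": 8.681204626687089e-13,
    "surprised": 6.788646977895496e-9,
}
recognized = max(expressions, key=expressions.get)
print(recognized, expressions[recognized])   # sad 0.9999997615814209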

For more details on Datapoints, read the technical documentation.