Human pose estimation using Node-RED

Kazuhito Yokoi
5 min read · Dec 10, 2020


I’m Kazuhito Yokoi from the OSS Solution Center at Hitachi, Ltd. In the previous article, I introduced how to use the Node-RED Operator on Red Hat OpenShift. In this article, I will explain the procedure to connect the Node-RED environment to a TensorFlow container on OpenShift.

Use case

Here, we will use an example Node-RED flow that implements an alert system for use while driving a car.

In Japan, as you may know, cars drive on the left side of the road, and cyclists ride on the same side as the traffic. Therefore, drivers always want to avoid a collision when a cyclist approaches the car. To support this situation, the system shows an alert to the driver when the camera detects a cyclist signaling a right turn, using a human pose estimation model.

Before starting the following procedures, you need to create a Node-RED environment using the Node-RED Operator, as described in the previous article. All operations should be performed with a developer account on the OpenShift web console, not with the admin role.

(1) Deploy TensorFlow container

Go to “+Add” -> “Container image” on the web console.

In the next wizard, paste “quay.io/codait/max-human-pose-estimator” as the image name from the external registry. This container includes a REST API server that estimates human poses from an input image (see the details if you are interested). The other items, “Application name” and “Name” in the General section, are filled in automatically after you paste the image name.

About two minutes after clicking the Create button in the wizard, the ring around the max-human-pose-estimator instance will turn blue, which means the deployment succeeded.

By double-clicking the max-human-pose-estimator instance that has the OpenShift logo, you can see the generated endpoint URL of the REST API in the Routes field of the Resources tab in the details side panel.

If you access the endpoint URL, you can see the Swagger UI for calling the methods provided by the REST API server. Remember this endpoint URL because we will use it in the final step when editing the Node-RED flow.
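For reference, the prediction method can also be called directly from a script instead of the Swagger UI. The following is a minimal sketch in TypeScript (Node.js 18+) that posts an image to the /model/predict method typically exposed by MAX model containers; the hostname is a placeholder, and the exact path and form field name should be confirmed in the Swagger UI of your own deployment.

```typescript
import { readFileSync } from "node:fs";

// Placeholder for the route URL generated by OpenShift; replace it with the
// endpoint URL shown in the Routes field of your deployment.
const endpoint = "http://max-human-pose-estimator-example.apps.example.com";

async function predictPose(imagePath: string): Promise<void> {
  // MAX model containers typically expose POST /model/predict, accepting the
  // image as a multipart "file" field (confirm this in the Swagger UI).
  const form = new FormData();
  form.append("file", new Blob([readFileSync(imagePath)]), "frame.jpg");

  const response = await fetch(`${endpoint}/model/predict`, {
    method: "POST",
    body: form,
  });
  const result = await response.json();
  console.log(JSON.stringify(result, null, 2));
}

predictPose("cyclist.jpg").catch(console.error);
```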

(2) Import Node-RED flow

The Node-RED environment created by the Node-RED Operator has the Git integration called the “projects feature” enabled by default. To import the sample flow from GitHub, first select “Projects” -> “New” from the menu of the Node-RED flow editor.

In the following wizard, click the “Clone Repository” button to clone the repository that contains the sample Node-RED flow.

In the next window, you need to enter your name and e-mail address for the git command. However, this information will not be used in these procedures because there is no git commit step, which is what requires it.

Finally, paste “https://github.com/kazuhitoyokoi/node-red-pose-estimation-demo.git” into the Git repository URL field. The project name will be filled in automatically, and the other fields can be left empty.

After the import process, an error notification will pop up because some Node-RED modules are not installed at this point. To resolve this, first click the “Manage project dependencies” button to open the project settings UI (if there is no button, click the ellipsis next to the project name in the information tab instead).

The Dependencies tab of the project settings UI lists the missing Node-RED modules as follows. To install these modules into Node-RED, click each “install” button.

After fixing the missing-module issue, the Node-RED flow no longer contains unknown nodes shown with red dotted outlines.

(3) Configure endpoint

To change the endpoint URL to that of the TensorFlow container deployed on the OpenShift environment, first double-click the “human-pose-estimator” node.

To open the endpoint settings UI, click the pencil button in the node properties UI.

In the Host field of the endpoint settings UI, paste the endpoint URL copied from the max-human-pose-estimator instance deployed previously. Changing the Name field to “openshift” is optional.

After clicking the “Update” button in the endpoint settings UI, click “Done” in the node properties UI, and then deploy the flow using the “Deploy” button at the top right of the Node-RED flow editor.

(4) Check the flow behavior

If you click the button on the left side of the top-left inject node, you will see the annotated image under the image output node at the top right. At the same time, an OK image will appear under the image output node at the bottom right. Between them, the analyze node determines whether the rider’s pose is the normal position while riding the bicycle.

When you next click the button of the second (bottom) inject node, the result will be the alert image. Using the analyze node, the flow decides that the cyclist intends to turn right soon because the right arm is raised.
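To illustrate the kind of check the analyze node performs, here is a minimal sketch in TypeScript. It assumes the prediction result contains a body_parts array with OpenPose-style part names (for example RWrist and RShoulder) and pixel coordinates where y grows downward, so a raised right arm shows up as the wrist keypoint sitting above the shoulder keypoint; the actual field names used by the flow’s analyze node may differ.

```typescript
// Assumed shape of a detected keypoint and of the whole prediction result
// (field names are an assumption based on typical MAX Human Pose Estimator output).
interface BodyPart {
  part_name: string;
  x: number;
  y: number;
}

interface PoseResult {
  predictions: { body_parts: BodyPart[] }[];
}

// Returns true when any detected person has the right wrist above the right
// shoulder, which this flow interprets as a right-turn hand signal.
function isSignalingRightTurn(result: PoseResult): boolean {
  return result.predictions.some((person) => {
    const part = (name: string) =>
      person.body_parts.find((p) => p.part_name === name);
    const wrist = part("RWrist");
    const shoulder = part("RShoulder");
    // Image coordinates: a smaller y value means higher in the frame.
    return wrist !== undefined && shoulder !== undefined && wrist.y < shoulder.y;
  });
}
```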

If needed, you can also connect to a camera mounted on the car, not only via the http request node but also via other protocols.

As described in this article, we saw an example that integrated Node-RED with other components deployed on Red Hat OpenShift. Because there are several useful OpenShift components that can be connected through Node-RED node modules, such as the MySQL node, I will survey these components and continue to come up with other use cases for them.
