Human pose estimation using Node-RED

I’m Kazuhito Yokoi from the OSS Solution Center at Hitachi, Ltd. In the previous article, I introduced how to use the Node-RED Operator on Red Hat OpenShift. In this article, I will explain how to connect the Node-RED environment to a TensorFlow container running on OpenShift.

Use case

Here, we will use an example Node-RED flow that implements an alert system for drivers.

In Japan, as you may know, cars drive on the left side of the road and cyclists ride on the same side, in the same lane as cars. Drivers therefore always want to avoid collisions when a cyclist approaches the car. To support this situation, the system shows an alert to the driver when the camera detects a cyclist signaling a right turn, using the human pose estimation model.

Before starting the following procedures, you need to create a Node-RED environment using the Node-RED Operator, as described in the previous article. All of the operations should be performed with a developer account on the OpenShift web console, not with the admin role.

(1) Deploy TensorFlow container

Go to the “+Add” -> “Container image” item on the web console.

In the next wizard, paste “quay.io/codait/max-human-pose-estimator” as the image name from the external registry. This container includes a REST API server that estimates human pose from an input image (see the details if you are interested). The other items, “application name” and “name” in the general field, are filled in automatically after pasting the image name.

About two minutes after clicking the create button in the wizard, the ring around the max-human-pose-estimator instance turns blue, which means the deployment has succeeded.

By double-clicking the max-human-pose-estimator instance, which has the OpenShift logo, you can see the generated endpoint URL of the REST API in the routes field of the resources tab in the details side panel.

If you access the endpoint URL, you will see the Swagger UI, which lets you call the methods provided by the REST API server. Make a note of this endpoint URL because we will use it in the final step when editing the Node-RED flow.
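If you want to try the REST API outside Node-RED first, you can call it directly from Python. The snippet below is a minimal sketch, assuming the MAX Human Pose Estimator convention of a POST /model/predict method that accepts a multipart “file” parameter; the route URL and image file name are placeholders to replace with your own.

```python
import requests

# Placeholder: replace with the route URL generated by OpenShift.
ENDPOINT = "http://max-human-pose-estimator-myproject.apps.example.com"

# Send a test image to the prediction method exposed by the container.
with open("cyclist.jpg", "rb") as f:
    response = requests.post(
        ENDPOINT + "/model/predict",
        files={"file": ("cyclist.jpg", f, "image/jpeg")},
    )

response.raise_for_status()
print(response.json())  # e.g. {"status": "ok", "predictions": [...]}
```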

(2) Import Node-RED flow

The Node-RED environment created by the Node-RED Operator has the Git-based project feature enabled by default. As the first step in importing the sample flow from GitHub, select “Projects” -> “New” from the menu of the Node-RED flow editor.

In the following wizard, click the “Clone repository” button to clone the repository that contains the sample Node-RED flow.

In the next window, you need to type your name and e-mail address for Git. However, this information will not actually be used in these procedures because there is no git commit step that requires it.

Finally, paste “https://github.com/kazuhitoyokoi/node-red-pose-estimation-demo.git” into the Git repository URL field. The project name will be filled in automatically, and the other fields can be left empty.

After the import process, an error notification will pop up because some Node-RED modules are not installed yet. To resolve this, first click the “Manage project dependencies” button to open the project settings UI (if there is no button, click the ellipsis next to the project name in the information tab instead).

The dependencies tab in the project settings UI lists the missing Node-RED modules as follows. To install these modules into Node-RED, click each “install” button.

After the missing modules are installed, the Node-RED flow no longer contains any unknown nodes drawn with red dashed outlines.

(3) Configure endpoint

To change the endpoint URL to that of the TensorFlow container deployed on the OpenShift environment, first double-click the “human-pose-estimator” node.

To open the endpoint settings UI, click the pencil button in the node properties UI.

In the host field of the endpoint settings UI, paste the endpoint URL copied from the max-human-pose-estimator instance deployed earlier. Changing the name field to “openshift” is optional.

After clicking the “Update” button in the endpoint settings UI, click “Done” in the node properties UI, and then deploy the flow using the “Deploy” button at the top right of the Node-RED flow editor.

(4) Check the flow behavior

If you click the button on the left side of the top-left inject node, you will see the annotated image under the image output node at the top right. At the same time, the OK image appears under the image output node at the bottom right. Between them, the analyze node determines whether the human pose is the normal riding position on the bicycle.

Next, when you click the button of the second inject node from the bottom, the result will be the alert image. Using the analyze node, the flow decides that the cyclist intends to turn right soon because the right arm has been raised.
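The actual decision logic lives in the flow’s analyze node, but as a rough Python illustration of the idea, assuming the response format of the MAX Human Pose Estimator (each prediction carries body_parts entries with part names such as “RWrist” and “RShoulder” and pixel coordinates in which y grows downward), the check could look like this:

```python
def is_turning_right(prediction: dict) -> bool:
    """Return True when the right arm looks raised (wrist above shoulder)."""
    parts = {p["part_name"]: p for p in prediction.get("body_parts", [])}
    wrist = parts.get("RWrist")
    shoulder = parts.get("RShoulder")
    if wrist is None or shoulder is None:
        return False  # arm not detected: treat as the normal riding pose
    # In image coordinates y grows downward, so a raised wrist has a smaller y.
    return wrist["y"] < shoulder["y"]

# Example usage with the JSON returned by /model/predict:
# alerts = [is_turning_right(p) for p in result["predictions"]]
```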

If needed, you can also connect to a camera mounted on the car, not only via the http request node but also via other protocols, as in the sketch below.
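As a hypothetical sketch of such an integration, the loop below polls a camera snapshot URL and forwards each frame to the same REST API; the camera URL, route URL, and one-second interval are assumptions for illustration, not part of the sample flow.

```python
import time

import requests

CAMERA_URL = "http://192.168.0.10/snapshot.jpg"  # hypothetical in-car camera endpoint
ENDPOINT = "http://max-human-pose-estimator-myproject.apps.example.com"  # your route URL

while True:
    # Grab one frame from the camera and forward it to the pose estimator.
    frame = requests.get(CAMERA_URL, timeout=5).content
    result = requests.post(
        ENDPOINT + "/model/predict",
        files={"file": ("frame.jpg", frame, "image/jpeg")},
    ).json()
    print(result.get("status"), len(result.get("predictions", [])), "person(s) detected")
    time.sleep(1)  # assumed polling interval
```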

As described in this article, we saw an example that integrates Node-RED with other components deployed on Red Hat OpenShift. Because there are several useful OpenShift components that Node-RED node modules, such as the MySQL node, can connect to, I will survey these components and continue to come up with more use cases for them.
