Saturday 19 May 2012

Kinect & Programmable Automation Controllers (PACs)

An integrated gesture-based control experiment with a Programmable Automation Controller (PAC). The purpose of this simple follow-on experiment is to demonstrate interactions between a control program running on a PAC and a Windows-based HMI program that accepts both conventional mouse input and gesture input.

PACs are used extensively in automation and machine control applications. They can be programmed to perform advanced data acquisition and control tasks, and offer many flexible options for communicating with other hardware devices and SCADA/HMI systems. The concept I am trying to promote in my experimentation is the use of advanced natural user interface (NUI) devices like the Microsoft Kinect in automation and machine control applications. There are many practical uses for NUI technologies in industrial applications, where conventional "hands-on" human-machine interactions may otherwise prove to be difficult or potentially hazardous.
This project was created with the recently released Kinect for Windows sensor and the Microsoft v1.0 commercial SDK. The hardware device used in this demonstration is a Snap PAC Learning Center courtesy of Opto 22 Corporation, which includes a PAC-R1 controller and I/O rack containing a mix of various input and output modules. The Learning Center provides a convenient platform for these types of experiments, as it includes switches, LEDs and other components for simulating real-world operating conditions.
A simple PAC program monitors state transitions of an ON/OFF toggle switch. The toggle switch provides a voltage to one of the DC input modules on the rack, and a DC output module switches one of the panel LEDs on and off. The PAC program monitors both state transitions of the switch and the on/off status of the LED; if a switch-state change is detected, the program toggles the voltage to the LED. This could easily have been accomplished by mapping the input module directly to the output module, but the idea here is to allow an external HMI program to interact with the I/O points as well.
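The control strategy just described can be sketched in Python. This is only a paraphrase of the logic; the real program runs in the PAC-R1's own control environment, and the class and method names here are hypothetical:

```python
class PacLogic:
    """Sketch of the PAC program: toggle the LED on any switch
    transition, while also exposing a direct write path so an
    external HMI can set the output. Names are hypothetical."""

    def __init__(self):
        self.last_switch = False  # last sampled state of the DC input
        self.led = False          # state of the DC output driving the LED

    def scan(self, switch_input):
        """One control-loop scan: on a switch-state change (either
        direction), flip the LED output."""
        if switch_input != self.last_switch:
            self.led = not self.led
            self.last_switch = switch_input
        return self.led

    def set_led(self, on):
        """External command path used by the HMI program."""
        self.led = on
```

Note that the LED toggles on every transition rather than tracking the switch position directly, which is what lets the HMI's On/Off commands coexist with manual switching.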

The Windows-based HMI program superimposes a simple control panel on the screen with On and Off buttons and an Output Status indicator. As demonstrated in the video (captured using the Kinect's built-in RGB camera), the LED can be switched on or off by any combination of On/Off screen button clicks, physical flips of the toggle switch, or gesture-based control. All gesture recognition and control is done through skeletal tracking of the left hand, with an on-screen indicator showing the relative hand position whenever it is inside the virtual control panel.


As BK Brown demonstrates in the video, body positioning is not critical for mapping the hand coordinates to the On/Off screen buttons. The HMI program interacts with the PAC by using MMP messages both to control the LED and to retrieve the on/off status of the DC output module, which lets the HMI program know the status of the LED even when it is switched on or off manually by the toggle switch.
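The HMI-to-PAC interaction described above is essentially a write path (button or gesture commands) plus a polling path (read the output module so the indicator stays correct after manual switching). A sketch of that structure, with the MMP transport abstracted behind a stand-in object whose `read_output()`/`write_output()` methods are hypothetical:

```python
class HmiPanel:
    """Sketch of the HMI side. 'transport' stands in for the actual
    MMP messaging to the PAC; here it is any object exposing
    read_output() and write_output(bool) (hypothetical names)."""

    def __init__(self, transport):
        self.transport = transport
        self.indicator = False  # Output Status indicator on screen

    def click_on(self):
        self.transport.write_output(True)

    def click_off(self):
        self.transport.write_output(False)

    def poll(self):
        """Refresh the indicator from the PAC, so manual switch
        changes are reflected without any HMI action."""
        self.indicator = self.transport.read_output()
        return self.indicator


class FakePac:
    """In-memory stand-in for the PAC, for demonstration only."""

    def __init__(self):
        self.output = False

    def write_output(self, on):
        self.output = on

    def read_output(self):
        return self.output
```

Polling (or an equivalent subscription mechanism) is what keeps the screen indicator truthful: the PAC, not the HMI, owns the real output state.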

