Human-machine interface advancements could be boon to productivity

Augmented reality and voice recognition are just two of the HMI tools that could help make factory automation more humane and fun -- not to mention profitable.

The way we interact with machines is changing, and that's a very good thing indeed, since machines are taking over more and more of what we do as manufacturers. As our relationship with machines changes, so do the nature and importance of our interactions with them. Fortunately, technology is providing new and exciting ways to display information and communicate with systems, making it easier and more intuitive to interact with our electronic "employees."

The human-machine interface (HMI) is the space where these interactions between humans and machines occur, according to Wikipedia. In the plant, the most common human-machine interface is the control panel or screen mounted at the operator position on a computer numerical control (CNC) mill, lathe or conveyor system. Operators use this panel to interact with that specific machine: to enter and change programming, start and stop activities, and view status, tool position, sensor measurements, control charts and more.
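The character-oriented status display such a panel provides can be pictured with a toy sketch. This is illustrative only, not a real CNC vendor API; the names (`MachineState`, `render_status`) are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class MachineState:
    running: bool
    spindle_rpm: int
    tool_position: tuple  # (x, y, z) in mm

def render_status(state: MachineState) -> str:
    """Format the kind of character data a basic HMI screen shows."""
    mode = "RUN" if state.running else "STOP"
    x, y, z = state.tool_position
    return f"[{mode}] spindle={state.spindle_rpm} rpm  pos=({x:.1f}, {y:.1f}, {z:.1f})"

state = MachineState(running=True, spindle_rpm=1200, tool_position=(10.0, 5.5, -2.0))
print(render_status(state))
```

The point of the sketch is how little of the machine's context such a display carries: a single line of character data per machine, with input limited to whatever the panel's keys allow.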

The next step up the food chain is indirect interaction with production equipment through a plant-level network and manufacturing execution system (MES) software. In this case, the user interface is most often a PC keyboard and screen or a touch-screen terminal. With both the direct and the indirect MES interface, the information displayed is almost exclusively character data with few if any graphics. Input is confined to keyboard entry, mouse clicks or touch-screen taps.

When the ERP system or the engineering systems are tied into MES, input/output options expand to include more graphics, though these are from the ERP or MES to the user (and not to or from the production equipment) in the form of work instructions, drawings, videos and the like. Input options may well include barcode scanning but this, too, is more for interfacing with the other machines (ERP, MES) in the loop. Incidentally, there is an emerging trend toward a consolidation of MES and ERP into integrated “shop floor to top floor” systems containing full MES and ERP functionality in a single system set on a unified platform.

The current state of the art in human-machine interface technology is the mobile device -- smartphone or tablet -- interacting with the shop floor through ERP and MES rather than directly with the production equipment. These devices offer exceptional portability and availability along with stunning graphics and powerful analytics and data-visualization capabilities. They are limited primarily by screen size, but software developers are rolling out new formats designed specifically for mobile devices' limited screen acreage.

The next human-machine interface technology coming to the plant floor is wearable devices. Notwithstanding the disappointing results of early trials with Google Glass, the case for a wearable human-machine interface remains strong. The industrial internet of things (IIoT) is boosting the development of wearable display technologies that can and will be adapted to plant use. Wearables are particularly important in the plant because they allow the user's hands to remain free to do the work. Wearables currently in use rely on voice recognition and voice response, and voice will continue to be an important part of the wearable interface world.

But the most promising new human-machine interface technology is based on augmented reality (AR), which is more likely to rely on gestures and body movement as the primary input method. AR is similar to virtual reality (VR) in that both use a headset or glasses to project a digital image into the user's field of vision. But while the VR image is all the user sees, the AR image is overlaid on physical reality, so the user sees both the digital image and the physical surroundings at the same time. The power of AR is its ability to synchronize the digital and physical views. The digital image can, for example, display an image of a control panel over the actual panel and demonstrate the movement of a lever; the user only has to duplicate the movement in the real world to complete the task. AR technology can incorporate complex and detailed work instructions that improve operational consistency and performance, reduce training requirements, and document precise actions and events.
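The guided work-instruction flow described above can be sketched in a few lines. This is a hypothetical simplification, not any real AR platform's API: each step is "overlaid" until the operator's action is confirmed, and every completion is logged, which is what makes the documented audit trail possible.

```python
steps = [
    "Move lever A to the RUN position",
    "Confirm guard door is closed",
    "Press cycle-start",
]

def run_instructions(steps, action_confirmed):
    """Advance through overlaid steps, logging each confirmed action."""
    log = []
    for i, step in enumerate(steps, start=1):
        # On a real AR headset, the overlay would highlight the physical
        # control; here we simply record that the step was shown and confirmed.
        if action_confirmed(step):
            log.append((i, step, "done"))
    return log

audit_trail = run_instructions(steps, action_confirmed=lambda s: True)
for entry in audit_trail:
    print(entry)
```

Even in this toy form, the structure shows why AR-guided instructions reduce training requirements: the sequence, the confirmation and the record-keeping all live in the system rather than in the operator's head.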

Human-machine interactions are the weak link in automation, the point where error is most apt to be introduced. Any improvement to human-machine interface design and technology that makes this interaction easier and more reliable can't help but improve both quality and performance. New technologies like AR promise to make humans more effective at controlling machines and better able to benefit from the overwhelming amount of data the IIoT collects.
