Your smart device is watching you

The devices around you are already collecting data about their environments through sensors that capture things like temperature, humidity, proximity, and touch. But now those devices can quite literally watch you with 'embedded vision' – technology that uses digital processing and intelligent algorithms to interpret meaning from images or video.

Embedded vision has traditionally been limited to systems such as factory-floor inspection machines that cost tens of thousands of dollars. But with rapid advances in the IoT, and as processors become more powerful, cost-effective, and energy-efficient, that is starting to change.

According to Jeff Bier, founder of the Embedded Vision Alliance, any device or system can be made smarter with embedded vision. "Vision is the richest source of learning," Bier says. "Just as in humans, vision in machines brings the world into vivid, colorful focus and allows devices to understand and interact more efficiently with their surroundings."

One example Bier shares is the robotic vacuum cleaner recently introduced by Dyson. The Dyson 360 Eye robot observes and interprets its surroundings much like a human being. But unlike humans, it has a 360-degree vision system that uses complex mathematics, including probability theory, geometry, and trigonometry, to map and navigate a room. This allows the robot to understand where it is, where it has been, and what has already been cleaned.

No sensor is perfect, Bier says, but he believes the combination of a camera, processor, and software forms one of the most powerful types of technology on the market today.

"Embedded vision can help IoT systems know more about the world around them and the more the devices know, the smarter they will be," Bier says. "Right now, computer vision still takes a lot of battery power, but it can be used with other sensors so that vision is only used when needed."

For instance, in a home security system, a low-power, low-cost infrared sensor might detect an object that is a different temperature than the rest of the home environment. When this occurs, a camera can come on and identify exactly what the object is and whether it is friend or foe.
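To make that gating pattern concrete, here is a minimal sketch in Python. The read_infrared() and classify_frame() helpers are hypothetical stand-ins for real sensor and camera code; only the wake-on-trigger logic reflects the approach described above.

```python
# Minimal sketch of sensor-gated vision: poll a cheap infrared sensor
# continuously, and spend power on the camera only when it fires.
# read_infrared() and classify_frame() are hypothetical stand-ins.
import random
import time

AMBIENT_TEMP_C = 21.0    # assumed baseline temperature of the home
TRIGGER_DELTA_C = 4.0    # temperature difference that wakes the camera

def read_infrared() -> float:
    """Stand-in for a low-power IR sensor; returns a temperature in Celsius."""
    return AMBIENT_TEMP_C + random.choice([0.0, 0.5, 9.0])

def classify_frame() -> str:
    """Stand-in for the power-hungry step: camera capture plus classification."""
    return random.choice(["resident", "pet", "unknown person"])

def monitor(poll_seconds: float = 1.0, cycles: int = 10) -> None:
    """Keep the cheap sensor running; invoke vision only on a trigger."""
    for _ in range(cycles):
        reading = read_infrared()
        if abs(reading - AMBIENT_TEMP_C) >= TRIGGER_DELTA_C:
            # Only now wake the camera and run the classifier.
            label = classify_frame()
            print(f"IR trigger at {reading:.1f} C -> camera saw: {label}")
        time.sleep(poll_seconds)

if __name__ == "__main__":
    monitor()
```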

Another advantage of embedded vision is that, unlike most sensors, which are built for a single task, it can do more than one thing. "A car safety system is a good example of this," Bier says. "It can detect nearby pedestrians, speed limits, lane markers, and oncoming vehicles." Research on automatic emergency braking suggests that self-braking cars can reduce collisions by around 38 percent.

Other examples of embedded vision at work include Amazon's Kiva robots, which work alongside humans to speed up the company's fulfillment process. The embedded vision systems in these types of robots pair a camera with a high-performance image processor for object recognition and a real-time control system for precise placement.

"The more companies like Dyson and Amazon that use technology like this, the more affordable embedded vision is becoming," Bier concludes.

"With each newly released product, engineers are thinking, 'If they did that there, then I can do this here.' And with more cost-effective devices, it's becoming more feasible to put vision in more places."