Eyesight announced at the Samsung Developer Conference on Wednesday that it will embed its computer-vision and deep-learning technology into Samsung’s Artik 10 Internet of Things module. The companies say this will let manufacturers build gesture-recognition capabilities directly into products such as smart light bulbs. That would eliminate the need to grab your smartphone to control a device, and it should significantly reduce response time, because the processing happens on the device itself rather than on a server in the cloud.
Eyesight said it would demonstrate its technology at the conference by using subtle finger movements to control a set of Philips Hue LED light bulbs. The company said future IoT applications could deliver personalized experiences based on factors such as the number of people in sight of the device, or the gender and age of the people in the room.
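Eyesight hasn’t published the SDK it will ship on the Artik module, but the basic flow of the demo is easy to picture: an on-device recognizer fires a gesture event, and that event is mapped to a lighting command sent to the Hue bridge over the local network. The rough sketch below assumes a hypothetical on_gesture callback plus placeholder bridge address, API username, and light ID; the REST calls themselves follow Philips’ documented local Hue bridge API.

    # Illustrative sketch only: maps a hypothetical gesture event to a
    # Philips Hue command via the bridge's documented local REST API.
    import requests

    BRIDGE_IP = "192.168.1.10"      # placeholder: your Hue bridge address
    API_USER = "your-api-username"  # placeholder: key issued by the bridge
    LIGHT_ID = 1                    # placeholder: which bulb to control

    def set_light(on, brightness=254):
        """Turn a Hue bulb on or off and set its brightness (1-254)."""
        url = f"http://{BRIDGE_IP}/api/{API_USER}/lights/{LIGHT_ID}/state"
        requests.put(url, json={"on": on, "bri": brightness}, timeout=2)

    def on_gesture(gesture):
        """Hypothetical callback fired by an on-device gesture recognizer."""
        if gesture == "swipe_up":
            set_light(True, brightness=254)   # full brightness
        elif gesture == "swipe_down":
            set_light(True, brightness=64)    # dim the room
        elif gesture == "pinch":
            set_light(False)                  # lights off

Eyesight’s point is that the recognition step in a setup like this runs locally on the Artik module, so the only network traffic is the short hop to the bridge rather than a round trip to the cloud.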
Samsung’s Artik 10 is based on a quad-core ARM Cortex-A15 paired with a quad-core Cortex-A7 processor, plus DRAM, flash memory, camera and display interfaces, digital I/O, and analog inputs. The module also has an 802.11a/b/g/n/ac Wi-Fi adapter along with Bluetooth LE and ZigBee/Thread radios. The tiny device even has a dedicated GPU (a Mali-T628 MP6), and everything resides on a circuit board that’s just 29mm wide by 39mm tall.
Why this matters: Gesture control will be a boon for the connected home. Small hand movements will be easier and less disruptive than uttering voice commands to thin air. But it could be some time before this technology is ready even for early adopters.