The Role of Sensor Fusion and Remote Emotive Computing (REC) in the Internet of Things
Abstract
The age of sensor technology is upon us. These days, it’s unusual to encounter an electronic consumer product that doesn’t use sensors to create new experiences for its users. Sensors are experiencing a renaissance of sorts as microelectromechanical systems (MEMS) technology becomes less expensive and further miniaturized, in turn fueling the penetration of sensors into new applications and creating new potential for the sensor market.
Introduction
Sensors are now found in a wide variety of applications, such as smart mobile devices, automotive systems, industrial control, healthcare, oil exploration, and climate monitoring. Sensors are used almost everywhere, and sensor technology is beginning to closely mimic the ultimate sensing machine: the human being. The technology that makes this possible is sensor fusion, which leverages a microcontroller (a “brain”) to combine the individual data streams from multiple sensors into a view that is more accurate and reliable than what any single sensor could provide on its own. With sensor fusion, the whole is much greater than the sum of its parts.

Sensor fusion enables context awareness, which has enormous potential for the Internet of Things (IoT). Advances in sensor fusion for remote emotive computing (REC), the sensing and processing of human emotion, could also lead to exciting new applications in the future, including smart healthcare. However, these capabilities raise significant privacy concerns that IoT governance will need to address.

Massive amounts of context-aware data will become available as the use of sensor fusion and REC technologies grows. This data, along with the IoT’s access to the “global neural network in the sky” and cloud-based processing resources, will drive a tremendous expansion in the delivery of context-aware services customized for any given situation. Services could be based on the context of what an individual user is doing, what machines are doing, what the infrastructure is doing, what nature is doing, or any combination of the above.
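The fusion principle described above, combining sensors so that each offsets the others’ weaknesses, can be illustrated with a classic minimal sketch: a complementary filter that blends a gyroscope’s rotation rate (accurate short-term, but drifting) with an accelerometer’s tilt reading (drift-free, but noisy) to estimate device orientation. The function, parameter names, and sample readings below are illustrative assumptions, not taken from any particular product:

```python
import math

def complementary_filter(angle, gyro_rate, ax, az, dt, alpha=0.98):
    """Fuse one gyroscope and one accelerometer sample into a pitch estimate.

    angle     -- previous fused pitch estimate, in degrees
    gyro_rate -- angular rate from the gyroscope, in degrees/second
    ax, az    -- accelerometer readings along the x and z axes (g units)
    dt        -- sample period in seconds
    alpha     -- blend factor: weight given to the integrated gyro path
    """
    # Tilt implied by the gravity vector: noisy, but does not drift.
    accel_angle = math.degrees(math.atan2(ax, az))
    # Blend the short-term gyro integral with the long-term accel reference.
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# Hypothetical scenario: device held still at a 10-degree tilt, gyro reads 0.
angle = 0.0
for _ in range(200):  # 200 samples at 100 Hz
    angle = complementary_filter(angle, gyro_rate=0.0,
                                 ax=math.sin(math.radians(10.0)),
                                 az=math.cos(math.radians(10.0)),
                                 dt=0.01)
```

In this sketch the fused estimate converges toward the true 10-degree tilt even though the gyro alone would report no rotation, which is the “whole greater than the sum of its parts” effect in miniature. Production sensor-fusion stacks typically use richer techniques such as Kalman filtering, but the blending idea is the same.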
by Kaivan Karimi, Executive Director, Global Strategy and Business Development, MCUs, Freescale Semiconductor