With Synsis, customers will be able to start analyzing human states immediately and build empathic features within the vehicle that respond to these states. This new kind of interaction can be operated through any programmable component of the vehicle, such as the infotainment system, environment controls or advanced driver assistance systems. These in-cabin solutions could be differentiators for automotive brands searching for new ways to attract customers, as noisy engines, manual driving and vehicle ownership are replaced with electrification, automation and ridesharing.

Gawain Morrison, CEO and co-founder, Sensum, commented, “Human data will become the most important asset in the next generation of in-cabin products and services. Companies need to put in the data miles now to build customized models, and a solution like ours is the only way to get true data in the real places where people live their lives.”

User experience designers and researchers can now collect human data in the wild, drawing on a wide range of sensors for body, face and voice signals, and integrating context data from almost any source. Sensum’s patent-pending sensor-fusion solution brings all human data streams into one place: synchronized, tagged for events, and ready for analysis.
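To make the sensor-fusion idea concrete, here is a minimal sketch of what merging timestamped streams and tagging them with events could look like. This is purely illustrative: the `Sample` class, the `fuse` function and the stream names are hypothetical and do not reflect Sensum's actual API or data format.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass(order=True)
class Sample:
    # Only timestamp participates in ordering (compare=False on the rest),
    # so merged streams sort chronologically.
    timestamp: float                                  # seconds since session start
    stream: str = field(compare=False, default="")    # e.g. "heart_rate", "gsr", "voice"
    value: float = field(compare=False, default=0.0)
    event: Optional[str] = field(compare=False, default=None)

def fuse(streams, events):
    """Merge timestamped samples from several sensors into one
    time-ordered list, tagging each sample with the most recent event.

    `events` is a list of (timestamp, label) pairs, e.g. from a test
    script or a road-trial annotation tool.
    """
    merged = sorted(s for stream in streams for s in stream)
    events = sorted(events, key=lambda e: e[0])
    tagged, i = [], 0
    for sample in merged:
        # Advance past every event that has already happened.
        while i < len(events) and events[i][0] <= sample.timestamp:
            i += 1
        sample.event = events[i - 1][1] if i else None
        tagged.append(sample)
    return tagged
```

A fused, event-tagged timeline like this is what makes downstream analysis possible: every physiological reading can be lined up against what was happening in the cabin at that moment.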

Engineers can build human-reactive prototypes using Sensum’s scientifically validated human state models, which characterize the human user across a wide range of emotions and other cognitive and physiological states. They can then trigger product features based on the user’s current state, to test and build products that respond empathically, for richer, more personalized human-machine interaction.
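The state-to-feature triggering described above can be sketched as a simple mapping from detected states to in-cabin actions. Everything here is an assumption for illustration: the state names, the action strings and the `respond` function are hypothetical, and a production system would integrate with the vehicle's actual infotainment and climate interfaces.

```python
from enum import Enum, auto

class HumanState(Enum):
    CALM = auto()
    STRESSED = auto()
    DROWSY = auto()

# Hypothetical mapping from a detected human state to in-cabin responses;
# the real feature set would depend on the vehicle platform.
RESPONSES = {
    HumanState.STRESSED: ["dim_ambient_lighting", "play_calming_audio"],
    HumanState.DROWSY:   ["increase_cabin_airflow", "suggest_rest_stop"],
    HumanState.CALM:     [],
}

def respond(state: HumanState) -> list:
    """Return the list of in-cabin actions to trigger for the current state."""
    return RESPONSES.get(state, [])
```

Keeping the mapping declarative like this lets designers iterate on empathic behaviors (which features fire for which states) without touching the detection models themselves.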

By: Dominic Akuffo