Connected cars have been a topic of conversation ever since we first started hearing about Alexa and Google Home coming to the dashboard, extending the home beyond its four walls.
However, automakers are going beyond virtual assistants to connect cars to the human body. Cars will soon be able to recognize the driver and passengers by their eyes, skin, gait, and even heartbeat. Using facial and iris scans, along with voice and fingerprint recognition, the car will be able to accurately identify a person, adjust seats, mirrors, and other settings, and start the engine without a key. The same sensors could also detect factors such as alcohol levels and fatigue, which will almost certainly change how insurance plans are tailored to each person.
While privacy concerns will be a huge hurdle and the exact timing is unknown, this shows that connectivity is becoming more prevalent. Consumers say they are more than 10x as likely to use a connected car as a self-driving one, suggesting we trust ourselves far more than we trust handing 100% of the control over to a machine.
Cars are becoming a second home and will hold more data than ever before. Understanding how to capitalize on this from a QSR standpoint is interesting: will there be body-language signals of hunger that could trigger a message about a Jack in the Box LTO?
