Smart Authentication for the Internet of Things
Ambiently Detecting Individuals Using Bioimpedance Signatures
Over the last few years our computing has spread out into our environment, but the way we interact with it has not: we still use a screen. These days we carry that screen around in our pockets, but it's still a screen.
If the Internet of Things is ever going to live up to the promises being made about it, we're going to have to give up those screens (and the keyboards that go with them) and interact with our computers differently.
If everything around us has sensors embedded, those bits of distributed computing are going to need a way to figure out who is interacting with them, and that's where the Zensei project from the MIT Media Lab comes in.
Zensei uses biosensors, advanced signal processing, and machine learning to identify people solely by their body’s electrical properties—it even works through clothing.
In other words, using this technology the objects around you can figure out who you are without you having to wear or carry a token, or be wired up in any way. For instance, you could authenticate yourself and log in to your cellphone just by picking it up.
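The identification step can be pictured with a minimal sketch (not the team's actual pipeline): treat each person's bioimpedance readings as feature vectors, enroll a few examples per user, and identify a new touch by its nearest enrolled example. The users and feature values below are made up for illustration.

```python
import math

# Hypothetical enrolled bioimpedance feature vectors, one list of
# example readings per user. Real features would come from the sensor
# sweep; these numbers are invented for the sketch.
enrolled = {
    "alice": [[0.9, 0.1, 0.4], [0.88, 0.12, 0.41]],
    "bob":   [[0.2, 0.7, 0.9], [0.22, 0.69, 0.88]],
}

def identify(sample):
    """Return the enrolled user whose example is closest to `sample`
    by Euclidean distance (a stand-in for the real classifier)."""
    best_user, best_dist = None, float("inf")
    for user, examples in enrolled.items():
        for ex in examples:
            d = math.dist(sample, ex)
            if d < best_dist:
                best_user, best_dist = user, d
    return best_user

print(identify([0.21, 0.70, 0.90]))  # -> bob
```

The real system uses a trained machine-learning classifier rather than raw nearest-neighbour matching, but the shape of the problem is the same: map a touch to the closest known electrical signature.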
“Zensei works by sensing the amplitude and phase response of an extremely tiny AC signal through an array of up to 8 electrodes. The signal is modulated through one electrode at a time before being received at the remaining sensing electrodes for every possible electrode rotation. On-board signal generators create sine waves at a range of programmable frequencies (~1KHz-1.5MHz). The signal is then amplified and output at a selected electrode pair. A part of the user’s body touches the electrodes, and the return signal’s amplitude and phase component are captured with the Analog-to-Digital converter (ADC) port of the microprocessor and RF gain and phase detector IC.”
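That sweep can be sketched in a few lines: for every ordered transmit/receive electrode pair, drive a sine wave at each programmable frequency and record the amplitude and phase at the receiver. The `measure()` function here is a placeholder for the real ADC and RF gain/phase detector hardware, and the frequency list is an assumption spanning the quoted ~1 kHz to 1.5 MHz range.

```python
import itertools

N_ELECTRODES = 8
# Assumed sample points across the quoted ~1 kHz - 1.5 MHz range.
FREQS_HZ = [1_000, 10_000, 100_000, 1_000_000, 1_500_000]

def measure(tx, rx, freq_hz):
    """Placeholder for the hardware read: in the real device this
    returns the measured (amplitude, phase) of the return signal."""
    return (0.0, 0.0)

def sweep():
    """One full measurement pass: amplitude and phase for every
    transmit/receive electrode pair at every frequency."""
    features = []
    for tx, rx in itertools.permutations(range(N_ELECTRODES), 2):
        for f in FREQS_HZ:
            amp, phase = measure(tx, rx, f)
            features.extend([amp, phase])
    return features

# 8 electrodes give 56 ordered pairs; with 5 frequencies and 2 values
# per reading, one sweep yields a 560-element feature vector.
print(len(sweep()))  # -> 560
```

A vector like this, collected per touch, is what the machine-learning stage would classify.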
It's important to realize that it's not just smarter sensor technology that will drive the development of the Internet of Things; it will also need smarter interaction methods.
No technology is really mature until it's invisible, and right now the Internet of Things is anything but that. Every time we use it we have to reach into our pockets and pull out our phones, or sit down and open our laptops, to authenticate with the world around us. User interaction innovation like Zensei might mean we'll eventually be moving through a world that will not only be filled with sensors, but will be able to know who we are and react accordingly.
Want to delve deeper into the project? You can download the team's paper, which was presented last week at the Conference on Human Factors in Computing Systems in Denver.