Can a potato chip bag reveal your last conversation?
A team of scientists at MIT, Microsoft and Adobe has figured out a way to turn everyday objects into visual microphones.
Picture this: Two of your office colleagues are standing in the break room chatting, and you think to yourself: What if I could actually hear what they're saying? You're not a Marvel superhero with superhuman powers to hear through walls. So what can you do? Enter science.
In the break room, on a table next to your colleagues, is an open bag of potato chips. What if that bag could listen in on the conversation and report back to you later on what was said? You'd think we were nuts for even suggesting the idea, but allow us to explain.
When people talk, their speech creates tiny, tiny, tiny vibrations in the air. Those vibrations then hit inanimate objects around the room. Now imagine you had a camera zoomed in extremely closely on one of those objects. In theory, you could actually see the object move along with the vibrations. You could then feed that video into some fancy-schmancy computer software and – voila! – play back the audio of the conversation that just took place. You're basically turning everyday objects into visual microphones.
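To get a feel for the idea, here's a minimal toy sketch in Python (not the MIT team's actual algorithm, which uses far more sophisticated phase-based motion analysis). It simulates a tone nudging an object by a fraction of a pixel, "films" it, then recovers the tone by tracking sub-pixel motion across the frames. All the numbers (frame rate, object size, motion amplitude) are made up for illustration:

```python
import numpy as np

# Toy sketch (NOT the researchers' method): simulate a 440 Hz tone that
# shifts a bright blob by a few thousandths of a pixel per frame, then
# recover the tone from the "video" alone.

fps = 2000                          # assumed high-speed camera frame rate
t = np.arange(fps) / fps            # 1 second of footage
tone = np.sin(2 * np.pi * 440 * t)  # the sound vibrating the object

# The "object" is a smooth brightness bump on a 64-pixel line;
# the sound shifts its center by at most 0.005 pixels.
x = np.arange(64)
frames = np.array([np.exp(-0.5 * ((x - 32 - 0.005 * s) / 6) ** 2)
                   for s in tone])

# Estimate the object's position in each frame with an
# intensity-weighted centroid, a crude stand-in for the paper's
# phase-based analysis, and subtract the mean to get the motion signal.
centroids = (frames * x).sum(axis=1) / frames.sum(axis=1)
recovered = centroids - centroids.mean()

# The recovered motion should trace the original tone almost exactly.
corr = np.corrcoef(recovered, tone)[0, 1]
print(f"correlation with original tone: {corr:.4f}")
```

The point of the toy: even motion far smaller than a pixel leaves a statistical trace across many pixels, so averaging over a whole image patch lets software recover a signal no single pixel (and no naked eye) could see.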
Ok, so we know what you're thinking. That "sounds" kind of ridiculous, maybe something you'd see in a sci-fi movie, but can it actually be done? The answer is, surprisingly, yes.
A team of scientists at MIT, Microsoft and Adobe has created software that can pull off this amazing feat. What's more, you don't even need a high-end, fancy camera to capture the vibrations. The researchers found that even an inexpensive, consumer-friendly camera could get the job done.
Watch them explain how it's all done in this video below:
In addition to the bag of chips, the scientists tried another example: They played "Mary Had a Little Lamb" next to a potted plant. If you look closely, you'll see that the sound vibrations caused the leaves of the plant to move ever so slightly with each note. Keep in mind that the naked eye couldn't pick up the movement: the leaves moved by less than a hundredth of a pixel.
The camera was kept outside the room, behind a soundproof glass wall. And it simply zoomed in on the leaves. The researchers took the video, ran it through their computer algorithm and, lo and behold, they heard the song played back to them.
The experiment was first done a few years ago, but the researchers continue to refine the technique. One of the people behind the project is Michael Rubinstein, who graduated summa cum laude from Tel Aviv University and received his Ph.D. from MIT. The Israeli academic is now a research scientist at Google and speaks often on the topic. Here he is giving a TED Talk about it:
So the next time you want in on a conversation at work, not to worry: Just look for a bag of chips and you'll be AOK. Or, if you want, you could simply just walk into the break room and join in on the office gossip.