It’s all I could think as I listened to Facebook’s VP of Engineering Regina Dugan speak about skin and brain interfaces.
Like the first time you encounter Dr. Frankenstein outside his lab, Dugan, who made her F8 developers conference debut on Wednesday, started off sounding reasonable enough, telling us that the choice between paying attention to the person in front of you and checking your smartphone was a false one.
Yeah, I could get on board with that. There’s important stuff on my pocket device, a world of information, social media, and breaking news that I shouldn’t have to miss, just because there’s someone in front of me craving my attention.
But then Dugan slipped on her lab coat, grabbed a burning torch and invited us downstairs into her basement lab, figuratively, of course.
And things got weird.
Our brains are fast and the pipeline for getting that information out into the world (our mouths and voices) is slow. It’s like broadband on one side and a 300-baud modem on the other.
“What if you could type directly from your brain?” she cackled.
Okay, Dugan never cackled. She explained in measured, soothing tones every body-interface concept as if it was the most natural thing in the world.
Typing directly from your brain normally involves some sort of implant and has been tested on ALS patients, giving them the ability to type eight words per minute. However, brain implants couldn’t be done at scale — all that surgery — she said. And Facebook’s goal is, she noted, “To create and ship category-defining products that are social first. At scale.”
Facebook’s brain interface method would involve optic technology (something to do with quasi-ballistic photons) and it would read words you’d committed to saying or typing, not thoughts.
It sounded fantastic, strange, and scary. Dugan promised a working demo in a few years.
With my mind still reeling over the concept of thought-based text, Dugan introduced the concept of hearing skin.
Seriously, what is this fascination with body parts?
The good news is I understand how skin listening can work.
Our skin is a tremendously sensitive organ (yup, your skin is your body’s largest organ). It can tell the difference between a tap and a squeeze, a kiss and a breeze. Facebook has already created the prototype of a device that can translate words into pulses delivered to the skin surface, ones that someone can be trained to understand.
Dugan gleefully proclaimed that it took only hours for the woman in the video demo we saw to learn a few words.
It was then that we saw the reanimated corpse.
No. Sorry, that didn’t happen.
But why is Dugan trying to engineer our bodies? Did she drink from the same cup as Neural Lace-obsessed Elon Musk?
It’s about you as the interface
I think I understand where Dugan and Facebook are going here: They’re preparing for a day when we no longer carry smartphones. Instead, we have AR glasses, haptic clothing and a Terragraph terrestrial network that maintains our connection to the internet and, therefore, Facebook, no matter where we are, 24/7.
If you can think Facebook posts without typing, every post will be perfect, typo-free. And if you can do it without speech-to-text recognition, you won’t distract others around you or even be distracted from the real-world task in front of you — that is, if you’re someone who can walk and chew gum at the same time. I’m not convinced that many of us can think a Facebook post while driving a car.
Yet, if this works, it will turn Facebook engagement into a friction-free experience. Brain-generated posts could just as easily be brain-generated likes.
How do you know what to like on Facebook if you’re not reading? Skin listening.
Your form-fitting haptic skin suit can tap and pulse all over your body to read out posts, alert you to likes and let you conduct surreptitious conversations wherever you are. Facebook Messenger would be alive on your body.
Dugan’s somewhat terrifying plans don’t mean Facebook is any less committed to the smartphone. Look at all the time, money and engineering effort it’s putting into the camera as an AR platform. There is, for now, no way to replace images captured and consumed on the phone with something body-based, though I’m sure Dr. Frankenst … er … Dugan is working on it.
Enriching that experience makes sense even as we eventually shift the conversational portion of Facebook communication to our bodies.
AR glasses, which will surely have built-in HD cameras for capturing photos and video, will eventually replace the smartphone screen. When that happens, we may be truly free of this distracting device and the burden of having to make that false choice, at least from the perspective of the world’s most popular social media platform.
We’ll still need phones for email, though.