But Facebook has visions for this wrist technology beyond AR and VR, Bosworth says. “If you really had access to an interface that allowed you to type or use a mouse, without having to physically type or use a mouse, you could use it anywhere.” The keyboard is a prime example, he says; the wrist device would simply be another means of intentional typing, except you could take it with you anywhere.
The wearable can discern the movements of the wearer’s hand by sensing nerve activity in the person’s wrist. Video: Facebook
Bosworth also suggested the kitchen microwave as a use case – noting that Facebook does not, in fact, build microwaves. Appliance interfaces are all different, so why not let a device like this simply understand that you want to cook something for 10 minutes on medium power?
In the virtual demo that Facebook gave earlier this week, a gamer was shown wearing the wristband and controlling a character in a rudimentary video game on a flat screen, without moving their fingers at all. Demos like these tend to call to mind (pardon the pun) mind-reading tech, which Bosworth insisted this is not. In this case, he says, the mind generates signals identical to those that would move the thumb, but the thumb does not move; the device records an intention to move the thumb. “We don’t know what’s going on in the brain, which is full of thoughts, ideas and notions. We don’t know what’s going on until someone sends a signal over the wire.”
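For readers curious what “recording an intention to move the thumb” could look like in software, here is a minimal, purely illustrative Python sketch. It does not reflect Facebook’s actual implementation: the channel count, the RMS features, the linear scoring model, and the threshold are all assumptions chosen to show the general idea that the motor-nerve signal can be classified whether or not the thumb physically moves.

```python
import numpy as np

def rms_features(window: np.ndarray) -> np.ndarray:
    """Root-mean-square amplitude per channel, a common EMG feature.
    window: (channels, samples) array of raw EMG readings."""
    return np.sqrt(np.mean(window ** 2, axis=1))

def detect_thumb_click(window: np.ndarray, weights: np.ndarray,
                       threshold: float = 0.5) -> bool:
    """Score the window with a simple linear model and fire a 'click'
    if the score crosses the threshold. The nerve signal looks the same
    whether or not the thumb actually moves, which is the point the
    article describes."""
    features = rms_features(window)
    score = 1.0 / (1.0 + np.exp(-(weights @ features)))  # logistic score
    return score > threshold

# Hypothetical usage with synthetic data: 16 channels, 200-sample window.
rng = np.random.default_rng(0)
window = rng.normal(size=(16, 200))
weights = rng.normal(size=16)
print(detect_thumb_click(window, weights))
```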
Bosworth also pointed out that this wrist wearable is different from the invasive implants used in a 2019 brain-computer interface study that Facebook worked on with the University of California, San Francisco; and different from Elon Musk’s Neuralink, a wireless implant that could theoretically allow people to send neuroelectric signals from their brains directly to digital devices. In other words, Facebook isn’t reading our thoughts, even though it already knows a lot about what’s going on in our heads.
Researchers say there is still a lot of work to be done on using EMG sensors as virtual input devices. Precision is a big challenge. Chris Harrison, director of the Future Interfaces Group at Carnegie Mellon University’s Human-Computer Interaction Lab, points out that every human’s nerves are a little different, and so are the shapes of our arms and wrists. “There is always a calibration process that has to happen with any muscle-sensing system or BCI system. It really depends on where the computational intelligence is,” says Harrison.
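Harrison’s point about calibration can be made concrete with a short sketch. The snippet below is an assumption-laden illustration rather than anyone’s actual system: it fits a per-user classifier on a brief labeled recording session, which is roughly what “calibration” means for EMG or BCI input, since every user’s anatomy produces slightly different signals.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def calibrate(feature_windows: np.ndarray,
              gesture_labels: np.ndarray) -> LogisticRegression:
    """Fit a per-user model from a prompted calibration session.
    feature_windows: (n_trials, n_features) array of EMG feature vectors.
    gesture_labels: (n_trials,) array of prompted gesture IDs."""
    model = LogisticRegression(max_iter=1000)
    model.fit(feature_windows, gesture_labels)
    return model

# Hypothetical calibration session: 60 trials, 16-dimensional features,
# 3 prompted gestures. Real systems would use richer features and data.
rng = np.random.default_rng(1)
X = rng.normal(size=(60, 16))
y = rng.integers(0, 3, size=60)
user_model = calibrate(X, y)
print(user_model.predict(X[:5]))
```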