Kate Willette

When my daughter was in middle school, I sometimes found myself at talent shows featuring goofy skits. A favorite was called The Dressing Table. A girl sat with a makeshift dressing table in front of her, pretending to face a mirror. Seated Girl wore a very large sweatshirt, but her arms were not in the sleeves. Kneeling behind her, where the audience couldn’t see, was a friend with her head hidden inside the same sweatshirt and her arms thrust through those sleeves, making it look as though Seated Girl had very short arms. Seated Girl announced theatrically, “I think I’ll put on some lipstick!”

The hands of Kneeling Girl scrambled comically around on the table until they landed on a tube of bright red lipstick, which she then applied, very badly, somewhere in the vicinity of Seated Girl’s mouth, before announcing, “Now I’ll do my hair!”

This skit came to mind while I was thinking about the subject of this article: the brain-computer interface, or BCI. Seated Girl was, in a sense, paralyzed. Tucked inside the giant sweatshirt, her hands were useless. Kneeling Girl’s job was to compensate — to use her hands to do the task that her friend was naming for the audience.

This is a little like how a BCI system is supposed to work, only without words. When the paralyzed person simply thinks of doing a task, that intention