The Department of Defense has already invested in a number of projects relevant to the Minerva research. The Army Research Laboratory, for example, has funded researchers who captured a participant’s thoughts about a character’s movement in a video game, using magnetic stimulation to beam those neural instructions to another person’s brain and cause movement. And it has supported research using deep learning algorithms and EEG readings to predict a person’s “drowsy and alert” states.

Evans points to one project funded by the Defense Advanced Research Projects Agency (DARPA): Scientists tested a BCI that allowed a woman with quadriplegia to drive a wheelchair with her mind. Then, “they disconnected the BCI from the wheelchair and connected it to a flight simulator,” Evans says, and she flew a digital F-35 using only her thoughts. “DARPA has expressed pride that their work can benefit civilians,” says Moreno. “That helps with Congress and with the public so it isn’t just about ‘supersoldiers.’”

Still, this was a civilian participant, in a Defense-funded study, with “fairly explicitly military consequences,” says Evans. And the big question is whether the experiment’s purpose justifies the risks. “There’s no obvious therapeutic reason for learning to fly a fighter jet with a BCI,” he says. “Presumably warfighters have a job that involves, among other things, fighter jets, so there might be a strategic reason to do this experiment. Civilians rarely do.”

It’s worth noting, says Moreno, that warfighters are required to take on more risks than the civilians they protect, and in experiments, service members may similarly be asked to shoulder more risk than a civilian participant would.

DARPA has also worked on implants that monitor mood and nudge the brain back to “normal” if something looks off, created prosthetic limbs animated by thought, and made devices that improve memory. While those programs had therapeutic aims, the applications and follow-on capabilities extend into the enhancement realm: altering mood, building superstrong bionic arms, generating above-par memory.

For both military and civilian applications, and from both therapeutic and enhancement perspectives, ethics researchers are interested in how BCIs alter the people who carry them. Do you feel like your true self when you have an implant attached to your brain? The findings are relevant across the board. One recent project investigated feelings of personhood in six patients with BCIs that alerted them to an impending seizure. While the devices correctly predicted seizures, they didn’t always improve the patients’ well-being. The BCI left one participant, for example, estranged from themselves, constantly aware of their illness.

Another participant, meanwhile, felt freer, because she could do things like shower and not fear for her life. But that success poses its own problems. “She became a de novo person, but that de novo person was dependent on this A.I. system in her brain, but this A.I. system was owned by a company. That wasn’t hers,” says Frederic Gilbert, one of the paper’s authors. “And in a way, the company owned the new person, because as soon as the device was explanted, that person was dead.”

That complication applies to the Minerva team’s persons of interest, too: How are researchers weighing the benefits of BCI, and other smart-software-enabled technology, against the consequences for soldier study participants? It’s an especially knotty conundrum since a device that successfully enhances someone’s abilities, or those of their unit, doesn’t necessarily enhance their quality of life and may, in fact, degrade it. “Is that the kind of benefit you should be able to risk your body — and, in the case of BCI, your identity — for?” asks Evans.

To find out, he and his team will go deep into the world of would-be supersoldiers, before brain-boosters go the way of CrossFit.