This startup wants to put your brain-computer interface in Apple Vision Pro


Now, Cognixion is bringing its AI-powered communication program to the Vision Pro, which Forsland says offers more capability than the Axon-R. "Vision Pro has all your apps, the App Store, gives you whatever you want to do," he says.

Apple opened the door to BCI integration in May, when it announced a new protocol that allows users who are unable to move to control the iPhone, iPad, and Vision Pro without physical movement. Another BCI company, Synchron, whose implant is inserted into a blood vessel adjacent to the brain, has also integrated its system with the Vision Pro. (It is not clear whether Apple is developing its own BCI.)

For the Cognixion prototype, the company has swapped the Vision Pro's forehead pad for six electroencephalography, or EEG, sensors. They gather information from the visual and parietal cortices of the brain, located at the back of the head. Specifically, the Cognixion system detects steady-state visually evoked potentials, signals that occur when a person holds their gaze on an object. This allows users to select from menus of options in the interface using visual attention. A neural computing pack worn on the hip processes brain data outside the Vision Pro.
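To make the idea concrete: in a typical steady-state visually evoked potential (SSVEP) interface, each menu item flickers at a distinct frequency, and the system picks the item whose frequency dominates the EEG power spectrum. The sketch below is an illustrative minimal example of that general technique, not Cognixion's actual pipeline; the function name, frequencies, and synthetic signal are all assumptions for demonstration.

```python
import numpy as np

def detect_ssvep_target(eeg, fs, flicker_freqs, bandwidth=0.5):
    """Guess which flickering menu item the user is looking at.

    eeg           : 1-D array of samples from an occipital EEG channel
    fs            : sampling rate in Hz
    flicker_freqs : flicker frequency (Hz) assigned to each menu item
    Returns the index of the frequency with the most spectral power.
    """
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2           # power spectrum
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)      # bin frequencies
    powers = []
    for f in flicker_freqs:
        # Sum power in a narrow band around each candidate frequency.
        band = (freqs >= f - bandwidth) & (freqs <= f + bandwidth)
        powers.append(spectrum[band].sum())
    return int(np.argmax(powers))

# Simulated 2 seconds of EEG: a 12 Hz evoked response buried in noise.
fs = 250
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 12 * t) + 0.5 * rng.standard_normal(t.size)
choice = detect_ssvep_target(eeg, fs, [8.0, 10.0, 12.0, 15.0])
```

Real systems use multiple channels and more robust statistics (e.g. canonical correlation analysis), but the core selection principle is the same: attention to a flickering target raises power at that target's frequency.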

“The philosophy of our approach is about reducing the amount of load created by one’s communication needs,” says Chris Ulrich, chief technology officer of Cognixion.

Current communication tools can help but are not ideal. For example, low-tech letter boards let patients look at specific letters, words, or images so that a caregiver can interpret their meaning, but they are time-consuming to use. And eye-tracking technology is still expensive and not always reliable.

“We actually create an AI for each of the participants, with a history of their speech, their style of humor, whatever they have written, anything they can say that we can collect,” says Ulrich.
