Researchers at the University of Texas at Austin have developed an AI decoder that can reconstruct human thoughts from functional magnetic resonance imaging (fMRI) scans. In a study published in the journal “Nature Neuroscience,” the researchers reported an AI that could translate human thoughts by analyzing fMRI images, without the need to implant electrodes or other invasive devices.
During the study, the researchers played podcasts for three participants while recording their fMRI scans, which measure blood flow to different areas of the brain. They then used a large language model, similar to OpenAI’s GPT, to link the recorded brain activity to the words the participants heard.
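The article does not detail the algorithm, but one common approach in this line of research is an encoding model: fit a regression from language-model embeddings of heard words to the recorded brain responses, then decode by picking the candidate phrase whose predicted response best matches a new recording. The sketch below illustrates that idea with entirely synthetic data; the embeddings, dimensions, and ridge penalty are all hypothetical, not taken from the study.

```python
import numpy as np

# Illustrative encoding-model decoder on synthetic data (not the study's code).
rng = np.random.default_rng(0)
n_train, n_feat, n_vox = 200, 16, 50

# Synthetic "embeddings" of heard words and the matching brain responses.
X_train = rng.normal(size=(n_train, n_feat))
true_W = rng.normal(size=(n_feat, n_vox))
Y_train = X_train @ true_W + 0.1 * rng.normal(size=(n_train, n_vox))

# Ridge regression: W = (X'X + lam*I)^-1 X'Y maps embeddings -> voxel responses.
lam = 1.0
W = np.linalg.solve(X_train.T @ X_train + lam * np.eye(n_feat),
                    X_train.T @ Y_train)

# A newly recorded response, generated by some "heard" embedding.
x_heard = rng.normal(size=n_feat)
y_recorded = x_heard @ true_W

# Candidate phrase embeddings: the heard one (slightly perturbed) plus distractors.
candidates = [x_heard + 0.05 * rng.normal(size=n_feat)] + \
             [rng.normal(size=n_feat) for _ in range(4)]

def score(x):
    # Correlate the predicted response for a candidate with the recording.
    pred = x @ W
    return np.corrcoef(pred, y_recorded)[0, 1]

best = max(range(len(candidates)), key=lambda i: score(candidates[i]))
print(best)  # index 0: the candidate closest to what was actually heard
```

Because the decoder scores candidates by overall response similarity rather than matching words one-to-one, this framing also suggests why such a system recovers the gist of a thought rather than a verbatim transcript.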
The AI was able to capture the general meaning of the participants’ thoughts, although the decoded text did not match the exact words heard. The study’s authors believe the decoder could benefit individuals with speech difficulties, but they also acknowledged that the technology has limitations.
For example, a model trained on one individual could not read the thoughts of another person, and the researchers raised privacy concerns about the possibility of someone’s thoughts being decoded without consent. Nonetheless, they hope their work will lay the foundation for more advanced brain-machine interfaces in the future.