‘Mind Captioning’ Brain Tech Translates What You’re Seeing (And Imagining) Into Words
studyfinds.org·4h

Measured brain activity was combined with LLMs like ChatGPT to connect the dots between neural patterns and language. (Credit: Teacher Photo on Shutterstock)

In A Nutshell

  • **The technology works like a translator, not a mind reader** – It converts brain scan patterns into coherent sentences by learning which neural patterns correspond to different types of visual content, then building descriptions word by word through 100 rounds of AI-powered optimization.
  • **It identifies the correct video about 50% of the time from 100 options** – Where random guessing would succeed only 1% of the time, the system generates descriptions accurate enough to match videos people watched or recalled from memory, capturing not just objects but relationships and actions.
  • **The brain stores deta…
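The word-by-word optimization described above can be illustrated with a toy sketch. This is an assumption-laden simplification: the vocabulary, the bag-of-words `text_features` stand-in (a real system would use deep language-model embeddings), and the `caption_from_brain` function are all hypothetical. It shows only the loop structure: repeatedly try single-word swaps and keep whichever candidate sentence best matches the features decoded from the brain scan.

```python
import math
import random

# Tiny illustrative vocabulary (assumption; real systems use full lexicons).
VOCAB = ["a", "dog", "cat", "runs", "sleeps", "park", "sofa", "in", "the", "on"]

def text_features(words):
    """Bag-of-words vector over VOCAB -- a placeholder for the deep
    text embeddings a real decoder would use."""
    counts = [0.0] * len(VOCAB)
    for w in words:
        counts[VOCAB.index(w)] += 1.0
    return counts

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return 0.0 if na == 0.0 or nb == 0.0 else dot / (na * nb)

def caption_from_brain(brain_vec, length=5, rounds=100, seed=0):
    """Greedy iterative refinement: each round, try replacing each word
    with every vocabulary word and keep any swap that better matches the
    decoded brain features. Stops early once no swap improves the score."""
    rng = random.Random(seed)
    words = [rng.choice(VOCAB) for _ in range(length)]
    score = cosine(text_features(words), brain_vec)
    for _ in range(rounds):
        improved = False
        for i in range(length):
            for cand in VOCAB:
                trial = words[:i] + [cand] + words[i + 1:]
                s = cosine(text_features(trial), brain_vec)
                if s > score:
                    words, score = trial, s
                    improved = True
        if not improved:
            break
    return words, score

# Pretend these features were decoded from a scan of someone watching
# "a dog runs in the park" (purely illustrative target).
target = text_features(["a", "dog", "runs", "in", "the", "park"])
sentence, score = caption_from_brain(target)
```

Under these toy assumptions, the random starting sentence converges toward words from the target description, mirroring how each optimization round nudges the generated caption closer to the neural signal.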
