Reading thoughts?


You absolutely MUST check out these two recent experiments in neuroscience! In the first, Jack Gallant and his team at UC Berkeley have come a long way in decoding and reconstructing people’s dynamic visual experiences. The group used functional magnetic resonance imaging (fMRI) to measure people’s brain activity while they watched Hollywood movie trailers, and computational models to predict what people were watching given the measured activity. Hold your breath when watching the results (original footage on the left, predicted footage on the right):

The second experiment, carried out by Brian Pasley and his team at UC Berkeley, is similar to the first in that it decodes and reconstructs sensory input, but this time the researchers focused on sound. Working with patients scheduled to undergo brain surgery, they placed electrodes directly onto the patients’ brains and had them listen to specific words while recording their brain activity. Computational models were then used to predict the sounds people had listened to… and the predictions are scarily accurate! (The first sound is always the original one, followed by two predictions using different algorithms.)
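The shared idea in both studies is a "decoding" model: learn a mapping from measured brain activity back to features of the stimulus. Here is a rough toy sketch of that idea in NumPy, using simulated data and a simple ridge-regularized linear decoder; all variable names, dimensions, and noise levels are made up for illustration and are not the researchers' actual pipelines:

```python
import numpy as np

rng = np.random.default_rng(0)

n_trials, n_voxels, n_features = 200, 50, 5

# Simulated stimulus features (think: motion energy of a movie frame,
# or one slice of a sound spectrogram) -- purely illustrative.
stimulus = rng.normal(size=(n_trials, n_features))

# Simulated "encoding": brain activity as a noisy linear function of the stimulus.
encoding_weights = rng.normal(size=(n_features, n_voxels))
activity = stimulus @ encoding_weights + 0.1 * rng.normal(size=(n_trials, n_voxels))

# Hold out some trials to test the decoder on unseen data.
train, test = slice(0, 150), slice(150, 200)

# Fit a ridge-regularized linear decoder: activity -> stimulus features.
lam = 1.0
A = activity[train]
W = np.linalg.solve(A.T @ A + lam * np.eye(n_voxels), A.T @ stimulus[train])

# Decode the held-out trials and check how well predictions match the true stimulus.
predicted = activity[test] @ W
corr = np.corrcoef(predicted.ravel(), stimulus[test].ravel())[0, 1]
print(round(corr, 2))
```

In this toy setup the decoded features correlate strongly with the true ones; the real experiments face far noisier recordings and far richer stimuli, which is what makes the published reconstructions so impressive.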

Read more here and here.
