The trend in user interfaces is to make access to information more natural, convenient, and less intrusive. Multi-touch, gesture, and voice interfaces are recent steps in that direction, and are better than a keyboard and mouse in many situations. But even these interface methods are less than optimal. The NeuroSys project is predicated on the belief that user interfaces won't be completely natural until the interface disappears entirely and accessing information is as easy as thinking. The ultimate vision of the NeuroSys project is computers and mobile devices you interact with using only your thoughts.

While still very exploratory, our research is already showing that thought-based user interfaces are not as far-fetched as one might think. In this joint Intel / CMU / UPitt project, we are investigating what can be inferred about a person's cognitive state from their pattern of neural activity. We are leveraging a variety of brain imaging modalities, including electroencephalography (EEG), functional magnetic resonance imaging (fMRI), and magnetoencephalography (MEG), to gain insights into how the brain processes information and how that information might be employed to build more natural user interfaces.

NeuroSys is a joint project between Intel Labs Pittsburgh, Carnegie Mellon University and the University of Pittsburgh.

Please see the CMU Brain Image Analysis Research website for more details.


Zero-Shot Learning with Semantic Output Codes. Mark Palatucci, Dean Pomerleau, Geoffrey Hinton, and Tom M. Mitchell. In Neural Information Processing Systems '09 (NIPS-09), Vancouver, Canada, December 2009. [PDF]
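The semantic-output-code idea in the paper above has two stages: learn a regression from neural activity into a space of semantic features, then classify a new brain image by nearest neighbor among the semantic codes of candidate classes, including classes never seen during training. The sketch below is a toy illustration with synthetic "neural" data; the class names, semantic codes, and dimensions are invented for the example and are not the authors' code or data.

```python
# Toy sketch of zero-shot classification with semantic output codes.
# All data here is synthetic and illustrative only.
import numpy as np

rng = np.random.default_rng(0)

# Semantic knowledge base: each class (e.g. a word) has a binary semantic code.
codes = {
    "dog":    np.array([1.0, 1.0, 0.0, 0.0]),
    "cat":    np.array([1.0, 0.0, 1.0, 0.0]),
    "house":  np.array([0.0, 0.0, 1.0, 1.0]),
    "car":    np.array([1.0, 0.0, 0.0, 1.0]),
    "hammer": np.array([0.0, 1.0, 1.0, 0.0]),  # held out: never seen in training
}
train_words = ["dog", "cat", "house", "car"]

# Synthetic "brain image": a fixed linear map from semantic code to 20
# recorded features, plus a little noise.
true_map = rng.normal(size=(4, 20))

def record(word):
    return codes[word] @ true_map + 0.05 * rng.normal(size=20)

# Stage 1: learn a decoder from neural activity to semantic space,
# using only the four training classes.
X = np.stack([record(w) for w in train_words * 25])   # 100 x 20 brain images
Y = np.stack([codes[w] for w in train_words * 25])    # 100 x 4 semantic targets
W = np.linalg.lstsq(X, Y, rcond=None)[0]              # least-squares decoder

# Stage 2: classify a brain image of a class unseen in training by
# nearest neighbor in semantic space over ALL candidate codes.
pred_sem = record("hammer") @ W
guess = min(codes, key=lambda w: np.linalg.norm(codes[w] - pred_sem))
print(guess)
```

Because the training classes span the semantic space, the decoder generalizes to the held-out code, so the nearest-neighbor step can recover "hammer" even though no hammer examples were ever decoded during training; that is the zero-shot property.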


  • Intel Labs Pittsburgh: Dean Pomerleau
  • Carnegie Mellon: Tom Mitchell (Professor, Machine Learning), Marcel Just (Professor, Psychology)
  • Students (alphabetical): Mark Palatucci (CMU), Gus Sudre (CMU), Leila Wehbe (CMU)