Imagine you have thousands of photographs and only minutes to find a handful that contain Dalmatian puppies.
Paul Sajda, a professor of biomedical engineering, thinks he's found a solution to such "information overload" that could revolutionize how vast amounts of visual information are processed, allowing users to riffle through potentially millions of images and home in on what they are looking for in record time. He has already used the approach successfully on tasks just like this.
Brain Computer Interfaces Benefit from Cloud Advancements
What do you get when you mix compute clouds and electroencephalograms (EEG)? Ask Kathleen Ericson, a PhD candidate in the Department of Computer Science at Colorado State University, who, in a paper coauthored with Professors Shrideep Pallickara and Charles Anderson, has explored some of the possibilities [1]. The paper won the Best Student Paper award at the IEEE Conference on Cloud Computing Technology & Science in December 2010.
A pianist plays a series of notes, and the woman echoes them on a computerized music system. The woman then goes on to play a simple improvised melody over a looped backing track. It doesn't sound like much of a musical challenge — except that the woman is paralysed after a stroke, and can make only eye, facial and slight head movements. She is making the music purely by thinking.