This week our ED 256 class visited UCSB’s Four Eyes Lab. The lab is part of the university’s Computer Science and Media Arts and Technology programs. The “four eyes (I’s)” – Imaging, Interaction, and Innovative Interfaces – are literally brought to life through the work of faculty and graduate students. The class also visited the surround-view, three-story AlloSphere – an immersive, three-dimensional audio and visual science environment. But that’s all I have to say about the AlloSphere, since I had to leave for another class toward the end of the visit. I will have to see it another time.
Professor Matthew Turk highlighted some of the important design work being done in the Four Eyes Lab. These innovative designs and programs suggest many ways they might apply to the educational challenges we have been developing as a class over the last few weeks. What is most fascinating about this research is that it moves beyond conventional desktop computing. The ideas Professor Turk presented at this session do not involve computer keyboards; rather, they are apparatuses that act more like extensions of your body.
Mobile augmented reality, for instance, overlays computer-generated information on top of the physical world. Turk pointed to the “imaginary” line of scrimmage that now accompanies professional NFL broadcasts as an example of augmented reality. What this might mean for education is staggering. If images and sound can be overlaid onto physical objects, there is no limit to the content that can be added to enhance learning. I imagine a computer image of one human hand and one human foot demonstrating a drum groove that can be imitated at home. Students would not have to remember the exact details of building such a groove; an augmented reality sequence would support both their memory and their motor skills.
Another design I found fascinating was computational photography. Professor Turk described the use of multiflash imaging to highlight hidden or shadowed areas of images. Multiflash imaging already appears to be a reality in some professions – for instance, the medical world. Lighting the contours of objects and revealing their shadows allows for clearer exploration of all sorts of objects, or, in medicine, a clearer look into the far reaches of the human body. For educational purposes, this technology could be harnessed as a tool that asks students to build projects from multiple images. For example, by creating an array of images of one object, how many different perspectives or conclusions can be reached about that object?
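The basic multiflash idea can be sketched in a few lines of code. This is only a toy illustration of the general principle, not the lab’s actual algorithm: each photo is lit from a different flash position, the per-pixel maximum across the photos approximates a shadow-free view, and pixels that are much darker than that maximum in any single photo mark the cast shadows that cling to an object’s contours. The function name and threshold below are my own inventions, for illustration.

```python
import numpy as np

def shadow_contours(flash_images, threshold=0.5):
    """Toy multiflash sketch: mark pixels shadowed in any flash image.

    flash_images: list of 2-D float arrays (grayscale photos), each lit
    from a different flash position. The per-pixel maximum approximates
    a shadow-free image; pixels far below it in some photo lie in cast
    shadow, which traces the object's depth contours.
    """
    max_img = np.maximum.reduce(flash_images)      # shadow-free estimate
    shadow = np.zeros(max_img.shape, dtype=bool)
    for img in flash_images:
        ratio = img / np.maximum(max_img, 1e-6)    # guard against divide-by-zero
        shadow |= ratio < threshold                # dark under this flash only
    return shadow
```

With real photographs one would then trace the borders of the shadow regions to get clean contour lines; this sketch stops at the shadow mask itself.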
Other ideas presented by the lab’s graduate students were just as impressive. I found the “Gibber” project very cool. Building electronic grooves and tunes is right up my alley, musically speaking. Learning JavaScript by programming in Gibber has real potential for students; I cannot think of a more engaging and fun way to learn a programming language. Of course, since I am a musician, that’s a no-brainer for me.
The Four Eyes Lab and the AlloSphere are the future, and it’s really nice to see them as part of our university!