This video is pretty cool – it shows how the new iPhone is already making augmented reality a true reality. It also got me thinking about what this could mean for the future of personal computing. More about that after the video:
What would happen if iPhones could combine with Sixth Sense and video glasses? Maybe the smart-phone would become something you just wear… maybe like one of those headphones that wrap around the back of your head, but thicker and wider around the back (of course, with a smaller microphone and headphone buds instead of those earmuff-looking things in the picture I linked to). I say bigger in the back because this is where the phone, hard drive, GPS, compass, tilt sensor, wi-fi, battery, etc. are. Maybe even some small solar panels to keep power levels up. Then, you buy a pair of sunglasses in whatever style you like, plug those into your smart-phone, and the lenses become heads-up displays for augmented reality. You would see maps like in the video above, but for anything you want – directions, sight-seeing, etc. That is pretty cool… but there could be even more.
People have been talking about getting video calling on smart-phones for a while now. But that is obviously limited. I’ve always wondered if the “G” in 3G stood for “Good-night-i-wish-this-would-hurry-up-and-load.” What if we took the technology that creates realistic 3-D avatars based on photos? You create an avatar for yourself. When you call someone, instead of using video to slow your smart-phone down to 0.5G – you send your avatar. Tilt sensors and maybe even tiny cameras built into the lenses would send info to the avatar, making it mimic your moves. Each person in the conversation would see a realistic CG avatar in front of them, talking to them. No more “freaky eyes staring off in some random direction when you are trying to have a conversation” like in video conferencing.
Those embedded cameras could also follow your hands and give you a cool “Minority Report”-ish interface with your apps that only you can see. Or use voice recognition to control apps. Need to send an email? Speak it out, or have a virtual keyboard float in front of you for more privacy.
What if this could also become the interface for your computer at home? The possibilities are endless: Better control in Second Life. Self-guided field trips for school. Truly secure test-taking in distance learning. Work or learn anywhere you go. Clueless people may never get lost again!
Someone get me 10 million dollars and a development team!
Matt is currently an Instructional Designer II at Orbis Education and a Part-Time Instructor at the University of Texas Rio Grande Valley. Previously he worked as a Learning Innovation Researcher with the UT Arlington LINK Research Lab. His work focuses on learning theory, heutagogy, and learner agency. Matt holds a Ph.D. in Learning Technologies from the University of North Texas, a Master of Education in Educational Technology from UT Brownsville, and a Bachelor of Science in Education from Baylor University. His research interests include instructional design, learning pathways, sociocultural theory, heutagogy, virtual reality, and open networked learning. He has a background in instructional design and teaching at both the secondary and university levels and has been an active blogger and conference presenter. He also enjoys networking and collaborative efforts involving faculty, students, administration, and anyone involved in the education process.