Teched out: Professors demonstrate innovative technologies that respond to gestures

On Tuesday night, Dan Pacheco posed the question, “How long before this is dead?” while pointing to an image of a MacBook Pro.

The question prompted the start of “Waveables, Wearables and Flyables: Your Body Is The Computer,” an event demonstrating how technology will inevitably change everything from education to art in the years to come. Pacheco, chair of journalism innovation at the S.I. Newhouse School of Public Communications, and other journalism professors led the event.

Pacheco said technologies that are currently available, like laptops and smartphones, might soon be replaced by technologies that respond to a voice command or a wave of the hand.

“It’s like the Internet in 1994 all over again. I feel like all of these technologies in particular are going to fundamentally change our world,” Pacheco said.

Some of these technologies are already being produced by companies like Samsung and Google, which have been working on new gadgets: watches that tell users more than just the time, and a pill that, once ingested, turns its user into a walking password, eliminating the need to constantly sign into networks. These companies are going to make current technology look “really boring,” Pacheco said.

“Our bodies are becoming computers. We are merging with technology in ways we don’t even realize,” Pacheco said.

One piece of innovative technology Pacheco presented on Tuesday was Leap Motion, a hardware device that uses a sensor to detect hand motion. It lets users control applications with a wave of the hand, without typing a key or clicking a mouse. The device has its own app store, which includes The New York Times and Pacheco’s personal favorite: AirHarp.

Frank Biocca, the director of SU’s M.I.N.D. Lab, gave a brief presentation about the many ways the body can be combined with technology. His focus was on immersive virtual reality, in which humans react to a virtual environment. An example he showed was a virtual cadaver that can be observed and taken apart for closer inspection.

He said that augmented reality and other virtual interfaces will be available in the M.I.N.D. Lab as soon as January.

Professor Bill Ward, a Google Glass Explorer, spoke briefly about Google Glass and its capabilities. He was followed by 11 student teams pitching their own ideas for apps that would take advantage of the device’s technology.

Among the pitches was Recognize, an app designed to recognize people’s faces and remind the user who they are, using spatial technology and social media to pinpoint a person’s identity. Another was Food Valet, which would let users take a photo of a street sign to identify restaurants on that street; the app would also be able to make reservations and display menus.

Lorne Covington, a responsive-environment digital artist, also spoke, sharing his view that being in control is boring; the interesting parts of technology, he said, are its rich and complex responses. In his latest project, for example, cameras sense Covington’s movement, letting him create ever-changing art simply by moving.

Covington is currently working on responsive ambient video: a scene is displayed on screen, and as people move freely in front of it, the colors on screen shift in response to their motion.

“It’s not about the tech, that’s just my paint that I’m using. What I really want to do is make provocative, immersive, impactful art,” Covington said.

Covington also showed an interactive hover-and-touch model of the “unseen sun,” which lets users hover through images of the sun and examine its different temperature ranges, a task that is physically impossible without this technology.

Covington then demonstrated a Wikipedia webgraph that detects body motion and displays terms related to a prior search on the site. From there, the user can reach similar pages associated with that search simply by gesturing through them. He said that when people search for information this way, they participate more with the content, an experience they cannot get with current technology.

Overall, Covington said, one of his goals for the event was to encourage students to be part of the wave of innovation that is happening now.

“We are built for interaction,” Covington said. “Do it.”