The Future of UI Design and Its Impact on Everyday Life

Science fiction often influences the world of technology by showing us the impossible. Today’s iPad is essentially the tablet computer from Star Trek: The Next Generation, and Back to the Future Part II got a lot right in 1989 about the technology of 2015, a world in which people work and play on tablets and computers every day. In fact, the man who invented the world’s first flip phone, the Motorola StarTAC, was inspired by the Star Trek communicator.

The future is definitely exciting when it comes to innovations in technology, and in many ways, the future is already here! In this article, we review examples of how future UI design will impact our everyday lives.

Gesture Interfaces

Tom Cruise in Minority Report. © 2002 Twentieth Century Fox Film Corporation.

Some of the most memorable futuristic user interfaces appear in Minority Report and Iron Man. These interfaces are the work of inventor John Underkoffler, who says the feedback loop between science fiction and reality is accelerating with every new summer blockbuster. He goes on to say, “there’s an openly symbiotic relationship between science fiction and the technology we use in real life. The interface is the OS – they are one.”

The video above shows a real-life demonstration of the futuristic user interfaces seen in Minority Report. Watch how simple hand gestures perform complex operations, then imagine playing video games with such capabilities!

LightRing

LightRing from Microsoft Research uses infrared to detect finger motion and a gyroscope to determine orientation, and it can turn any surface into an interface. You can tap, draw, flick and drag on a book, your knee, or the wall. For now, the interaction uses only one finger, but it still provides an attractive, natural-looking way to capture user gestures.

This technology takes wearable computing to a whole new level! Imagine controlling your device anywhere and any way you choose. As shown in the video, interacting with it feels much like using a mouse, so the interaction model is already familiar.
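To make the mouse analogy concrete, here is a minimal sketch of how ring readings like LightRing’s might be turned into pointer events. Microsoft Research has not published an API for the prototype, so the RingSample fields, the thresholds, and the toPointerEvents function below are all invented for illustration.

```typescript
// Hypothetical sensor readings: an infrared proximity value for the finger
// and gyroscope rotation rates for the ring's orientation.
interface RingSample {
  irProximity: number;   // 0 = finger lifted, 1 = finger pressed on the surface
  yawRate: number;       // degrees/second, from the gyroscope
  pitchRate: number;     // degrees/second, from the gyroscope
}

interface PointerEventOut {
  type: "move" | "down" | "up";
  dx: number;
  dy: number;
}

// Convert a stream of ring samples into mouse-like pointer events.
function toPointerEvents(samples: RingSample[], dt = 0.01): PointerEventOut[] {
  const events: PointerEventOut[] = [];
  let touching = false;
  for (const s of samples) {
    const pressed = s.irProximity > 0.5;  // crude touch threshold (assumed)
    if (pressed !== touching) {
      events.push({ type: pressed ? "down" : "up", dx: 0, dy: 0 });
      touching = pressed;
    }
    if (pressed) {
      // While the finger is down, integrate rotation rates into cursor motion,
      // much as a mouse integrates displacement.
      events.push({ type: "move", dx: s.yawRate * dt, dy: s.pitchRate * dt });
    }
  }
  return events;
}
```

The point of the sketch is simply that familiar pointer semantics (down, move, up) can sit on top of very unfamiliar sensors, which is why the interaction feels so immediately usable.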

RoomAlive

RoomAlive is Microsoft Research’s follow-up to IllumiRoom, which was presented at CES 2013. Both are steps towards a “this-is-our-house-now” Kinect future. The new system goes beyond projection mapping around a TV by adding input-output pixels on top of everything in the room. RoomAlive uses multiple depth cameras and spatially mapped projectors to overlay an interactive screen from which there is no escape.
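The key trick behind “input-output pixels on everything” is projection mapping: each projector is calibrated against the depth cameras so that any 3D point measured in the room can be mapped to the projector pixel that lights it. Below is a simplified sketch of that single step, assuming an already-calibrated 3×4 projection matrix; the names and structure are hypothetical, and the real RoomAlive calibration pipeline is considerably more involved.

```typescript
// A 3x4 projection matrix maps a 3D room-space point to projector pixels.
type Matrix3x4 = number[][]; // 3 rows x 4 columns (assumed pre-calibrated)

interface Point3D { x: number; y: number; z: number; }
interface Pixel { u: number; v: number; }

// Project a room-space point (e.g. from a depth camera) into projector
// pixel coordinates using the calibrated projection matrix.
function projectToPixel(p: Point3D, P: Matrix3x4): Pixel | null {
  const h = [p.x, p.y, p.z, 1]; // homogeneous coordinates
  const row = (r: number[]) => r[0] * h[0] + r[1] * h[1] + r[2] * h[2] + r[3] * h[3];
  const x = row(P[0]);
  const y = row(P[1]);
  const w = row(P[2]);
  if (w <= 0) return null;        // point is behind the projector
  return { u: x / w, v: y / w };  // perspective divide
}
```

Once every surface in the room can be addressed this way, game content and interactive widgets can be drawn onto walls, furniture, and floors as if they were one continuous display.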

Imagine “real-life” video games that transform your living room into the world of the game. Or imagine virtual home decoration, projecting your vision of what you want to rearrange or add to your home’s decor.

Skin Buttons

The Skin Buttons project uses miniature projectors to display interactive icons on the skin around the watch face. This technology expands the interactive zone around a smartwatch without making it physically bigger. The projector parts cost less than $2 and can even increase battery life by shifting workload from the main display.
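The battery saving comes from routing cheap, glanceable content to the projected icons instead of waking the main display. Here is a hypothetical sketch of that routing decision; the Notification type, the chooseTarget function, and the thresholds are invented for illustration and are not part of the published project.

```typescript
// Hypothetical rendering targets on a smartwatch fitted with skin projectors.
type Target = "main-display" | "skin-projector";

interface Notification {
  kind: "glanceable-icon" | "full-message";
}

// Route low-power, glanceable content (or anything when the battery is nearly
// empty) to the projected skin buttons, and reserve the more power-hungry
// main display for rich content.
function chooseTarget(n: Notification, batteryLevel: number): Target {
  if (n.kind === "glanceable-icon" || batteryLevel < 0.2) {
    return "skin-projector";
  }
  return "main-display";
}
```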

FlexSense

FlexSense is a transparent sheet of plastic whose embedded piezoelectric sensors detect exactly what shape it is in. This allows for all kinds of intuitive, paper-like interactions: for example, flipping up a corner to reveal something underneath, or toggling layers in maps and drawings.
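As a rough illustration of that corner-flip interaction, the sketch below treats a strongly bent corner reading as a “peek” gesture that reveals a map overlay. The SheetState shape, the layer names, and the 0.6 threshold are assumptions for the example, not an actual FlexSense API.

```typescript
// Hypothetical bend readings from the sheet's piezoelectric sensors,
// one value per corner (0 = flat, 1 = fully flipped up).
interface SheetState {
  cornerBend: {
    topLeft: number;
    topRight: number;
    bottomLeft: number;
    bottomRight: number;
  };
}

type Layer = "base-map" | "traffic-overlay";

// Treat a strongly bent top-right corner as a "peek" gesture that reveals the
// overlay layer underneath, like lifting the corner of a sheet of paper.
function visibleLayer(state: SheetState): Layer {
  return state.cornerBend.topRight > 0.6 ? "traffic-overlay" : "base-map";
}
```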

Imagine phone cases that react as you peel back the cover, or interactive children’s books that respond as you turn a page.

HaptoMime

HaptoMime uses ultrasound to create tactile feedback in midair, so you feel like you’re touching a hovering image when there’s nothing there at all. The floating image itself is produced by a hidden LCD and an angled transmissive mirror. This technology has massive potential for any kind of public display.
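Conceptually, the system tracks a fingertip and fires a focused ultrasound pulse whenever the finger crosses the plane of the floating image inside a button’s bounds. The sketch below shows just that hit test; the FloatingButton and Fingertip types and the 5 mm tolerance are hypothetical stand-ins, not the researchers’ implementation.

```typescript
// Hypothetical types for a floating button and a tracked fingertip,
// both expressed in the plane of the midair image (millimetres).
interface FloatingButton { x: number; y: number; width: number; height: number; }
interface Fingertip { x: number; y: number; distanceToImagePlane: number; }

// Return true when the fingertip "touches" the hovering button, which is the
// moment the ultrasound array should focus a tactile pulse at that spot.
function shouldPulse(finger: Fingertip, button: FloatingButton): boolean {
  const inPlane =
    finger.x >= button.x && finger.x <= button.x + button.width &&
    finger.y >= button.y && finger.y <= button.y + button.height;
  return inPlane && Math.abs(finger.distanceToImagePlane) < 5; // within 5 mm
}
```

Because the feedback is contactless, a ticket machine or elevator panel built this way never needs to be touched or cleaned, which is exactly why public displays are such a natural fit.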

Zero UI

https://youtu.be/KkOCeAtKHIc?rel=0

Zero UI isn’t a new idea. If you’ve ever used an Amazon Echo, changed a channel by waving at a Microsoft Kinect, or set up a Nest thermostat, you’ve already used a device that could be considered part of designer Andy Goodman’s Zero UI thinking. It’s all about getting away from the touchscreen and interfacing with the devices around us in more natural ways. With methods such as haptics, computer vision, voice control, and artificial intelligence, Zero UI represents a whole new dimension for designers.
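At the heart of a voice-driven Zero UI device is a step designers rarely see drawn out: turning an utterance into an intent the device can act on. Here is a deliberately tiny, hypothetical sketch of that routing; the Intent type, the parseUtterance function, and the phrasing patterns are invented for the example, and real assistants rely on far more robust natural-language models.

```typescript
// A tiny, hypothetical intent router of the kind a screenless, voice-driven
// device depends on: an utterance in, a device action out.
interface Intent {
  action: "set-temperature" | "play-music" | "unknown";
  value?: number;
}

function parseUtterance(text: string): Intent {
  const temp = text.match(/set (?:the )?temperature to (\d+)/i);
  if (temp) return { action: "set-temperature", value: Number(temp[1]) };
  if (/play (some )?music/i.test(text)) return { action: "play-music" };
  return { action: "unknown" };
}

// Example: "Set the temperature to 70" -> { action: "set-temperature", value: 70 }
console.log(parseUtterance("Set the temperature to 70"));
```

The interface work doesn’t disappear in Zero UI; it simply moves from pixels on a screen to the vocabulary, gestures, and context the device is able to understand.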

As these technologies become more intuitive and natural for a new generation of users, we will be treated to a more immersive computing experience, one that will continually test our ability to digest the flood of information these devices have to share. The potential for change in future user interfaces is both overwhelming and exciting, and it’s definitely something to look forward to as new technologies and groundbreaking products come to market.