At Synaptics, we’re always on the lookout for new trends and approaches that will revolutionize how we interact with our electronic devices. While technology, materials, and computing power have progressed tremendously in recent years, humans simply haven’t. (Well, other than reports of coordination and dexterity improving on account of increased video game playing.) Accordingly, new technologies tend to address the same human interaction challenges that continue to confound users. Over the past several years, touchscreen technology has evolved so much that device manufacturers are now looking to bridge the gap between users and technology in ways that were unimaginable just a few years ago. While touch remains the most effective way for users to control their devices, I want to showcase some of the new trends we’re seeing that will not only enhance the existing touch experience, but change the way we interact with our devices. Let’s take a look!
One of the unintended consequences of touchscreen-based visions of the future is that they are seemingly devoid of any textured surfaces. While a touchscreen is certainly an efficient way to implement a device’s control system, current devices like smartphones, tablets, and cars lack robust tactile feedback. Besides reducing input performance and accuracy, the lack of tactility is potentially dangerous, as users are required to devote complete visual attention to controlling their device.
A number of companies such as Immersion, Senseg, and Artificial Muscle continue to evolve more traditional haptics, in which the flat touchscreen surface is given tactility through vibratory or even electrical actuation. These techniques can help create the illusion of texture, but a new class of technology, spearheaded by Tactus, is creating real, configurable tactility using the latest in mechatronic technology. By using microfluidics and painstakingly matching all components optically, Tactus has created a touchscreen cover lens that can enable physical on-screen buttons to appear when a keyboard is displayed on screen. When the keyboard is closed, the cover lens reverts back to what looks like a standard tablet or smartphone surface. While the location of each button is currently fixed, one can certainly speculate that over time the density of the tactile buttons will increase and each button will become individually addressable, creating a dynamically addressable matrix of tactile buttons.
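To make the idea of a dynamically addressable matrix concrete, here is a minimal sketch of how software might drive such a grid, with each cell raised or flattened independently. All names here (`TactileMatrix`, `raise_buttons`, and so on) are hypothetical illustrations, not Tactus’s actual interface.

```python
# Hypothetical sketch of a dynamically addressable tactile button matrix.
# The class and method names are illustrative assumptions, not a real API.

class TactileMatrix:
    """Models a grid of individually addressable microfluidic buttons."""

    def __init__(self, rows, cols):
        self.rows, self.cols = rows, cols
        # False = flat (fluid withdrawn), True = raised (fluid pumped in)
        self.state = [[False] * cols for _ in range(rows)]

    def raise_buttons(self, cells):
        """Raise the buttons at the given (row, col) positions."""
        for r, c in cells:
            self.state[r][c] = True

    def flatten_all(self):
        """Withdraw fluid everywhere, reverting to a flat cover lens."""
        self.state = [[False] * self.cols for _ in range(self.rows)]

    def raised_count(self):
        return sum(cell for row in self.state for cell in row)


# Example: raise a ten-key number row when an on-screen keyboard appears.
matrix = TactileMatrix(rows=4, cols=10)
matrix.raise_buttons([(0, c) for c in range(10)])
print(matrix.raised_count())  # 10 buttons raised
matrix.flatten_all()
print(matrix.raised_count())  # back to a flat surface: 0
```

The point of the sketch is the addressing model: once every cell can be toggled on its own, the same surface can serve a keyboard, a game pad, or any other layout.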
With larger, higher-resolution displays and the trend toward multi-screen environments, navigating and controlling such large screen regions tends to be highly inefficient with traditional input devices like mice or touchpads. Although touchscreens offer one solution to large-screen navigation, they sacrifice some of the fine positioning control that mouse- or touchpad-based cursor control offers.
Enter eye tracking technology. Simply put: what you see is what you get. That is, users tend to look directly at what interests them. Accordingly, implementing systems such as Tobii Gaze can provide a “warp speed” transition for a cursor to jump from one display to a new text window that the user just opened on another display. Similarly, eye tracking software for handsets, now under development by Umoove, may soon refine and improve the front-camera-based navigation controls recently featured in the Samsung Galaxy S4 and LG Optimus G Pro.
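The “warp speed” behavior can be sketched in a few lines: when the user’s gaze lands far from the current cursor, the cursor jumps near the gaze point, while fine positioning stays with the mouse or touchpad. This is an illustrative assumption about how such a policy might work, not Tobii’s actual implementation; the threshold value is made up.

```python
# Illustrative sketch of gaze-assisted cursor warping (not a real vendor API).
import math

WARP_THRESHOLD_PX = 300  # assumed: only warp on large gaze/cursor separation

def next_cursor_pos(cursor, gaze, mouse_delta):
    """Combine a coarse gaze warp with relative mouse movement."""
    dist = math.hypot(gaze[0] - cursor[0], gaze[1] - cursor[1])
    if dist > WARP_THRESHOLD_PX:
        cursor = gaze  # coarse jump toward the point of attention
    # fine adjustment always comes from the traditional pointing device
    return (cursor[0] + mouse_delta[0], cursor[1] + mouse_delta[1])

# Gaze on a far-away display: the cursor warps, then the mouse refines.
print(next_cursor_pos((100, 100), (1900, 500), (-4, 2)))  # (1896, 502)
# Gaze near the cursor: no warp, normal mouse control.
print(next_cursor_pos((100, 100), (150, 120), (-4, 2)))   # (96, 102)
```

The design choice here is that gaze is used for coarse targeting only, which sidesteps the jitter inherent in raw gaze data.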
Like many new technologies, natural gestures made their initial entrance into the consumer space with games. Sure, we’ve all seen the horrifying video of the mother playing Microsoft’s Kinect. However, natural gesture recognition continues to make its way into the mainstream thanks to higher-quality cameras and increased processing power. Beyond the robust and thriving community hacking Kinect hardware to do all sorts of interesting things, we expect to see a lot of companies innovating in this space in the near future.
While not really a new technology, sensor fusion is another example of the whole being greater than the sum of its parts. With devices coming fully loaded with a number of sensory systems: touch, accelerometers, gyroscopes, compasses, and, of course, cameras, sensor fusion technology promises to deliver even more realism to user interfaces by integrating environmental information from several sensors. Synaptics has been exploring this technology for some time, but we’re also seeing companies like InvenSense continue to refine the integration of various motion sensors for new applications and functionality.
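A classic, minimal example of sensor fusion is the complementary filter, which blends a gyroscope’s fast-but-drifting angle estimate with an accelerometer’s noisy-but-drift-free gravity reference. The sketch below is a textbook illustration of the idea, not any particular vendor’s algorithm; the blend constant and drift figures are assumptions.

```python
# A minimal complementary filter: trust the gyro short-term,
# the accelerometer long-term. ALPHA is an assumed tuning constant.

ALPHA = 0.98

def fuse(angle, gyro_rate, accel_angle, dt):
    """One filter step: integrate the gyro rate, correct with the accel angle."""
    return ALPHA * (angle + gyro_rate * dt) + (1 - ALPHA) * accel_angle

# Simulate a device held still at 10 degrees of tilt. A gyro with a
# 0.5 deg/s bias would drift on its own, but the accelerometer term
# keeps the fused estimate anchored near the true angle.
angle = 0.0
for _ in range(500):  # 5 seconds at 100 Hz
    angle = fuse(angle, gyro_rate=0.5, accel_angle=10.0, dt=0.01)
print(round(angle, 1))  # settles near 10 degrees despite the gyro bias
```

Neither sensor alone gets this right: integrating the biased gyro diverges, and the raw accelerometer angle is too noisy for smooth UI control. Fusing the two is what makes motion-driven interfaces feel stable.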
Ultra-mobility – wearable computing
Finally, an area of technology that continues to drive innovation is wearable computing. By now, everyone has seen the attention surrounding Google Glass. Glass offers additional opportunities for a computing device to further augment reality, as it utilizes heads-up display technology to superimpose a display onto the user’s existing field of view. While Glass is still in its early stages of testing, it will be interesting to see how developers create new ways for it to interact with the real world.
Equally fascinating is the focus on watch-based devices. These watches, such as the Pebble, the Motorola MOTOACTV, and even the rumored iWatch, require pairing with a user’s mobile handset but enable rapid access to many commonly used functions, like receiving text messages and listening to music. Additionally, the watch form factor, along with sensors such as a heart rate monitor or a bicycle power meter, serves as an ideal platform to aggregate fitness data during exercise, a trend already visible in the Fitbit and Nike FuelBand.
With all of these technologies coming to fruition, it’s easy to see that the way we interact with our devices is changing by the minute. Gone are the days when a user interface consisted of just a mouse and QWERTY keyboard. As these technologies continue to evolve, the consumer electronic devices of the future will look nothing like those of today. We’re eagerly looking forward to what the future brings, and we’re excited to help lead the way.