- Jaguar Land Rover and the University of Cambridge have developed a contactless touchscreen that determines where you'll tap before you do it.
- The system uses artificial intelligence combined with in-car gesture sensors to figure out the driver's intention.
- Because the system is software based, it could be deployed to vehicles that already have in-car gesture and eye-tracking sensors installed.
Touchscreens allow automakers to pack a multitude of features into vehicles within a familiar user experience. They also help keep costs down in the long run, because there are fewer physical buttons and knobs to manufacture, install, and service. One downside of these shiny glass interfaces is that bumpy roads can make it difficult to select the correct item on the screen, and, worse, they require drivers to divert their attention from the task at hand—driving.
Yes, a contactless touchscreen is an oxymoron, but stay with us here. The system uses artificial intelligence and sensors to determine where the driver intends to touch and then selects the item without the person actually coming in contact with the glass.
The team calls the technology predictive touch. JLR states that in lab tests, the system reduced both the time drivers spent using a vehicle's touchscreen and their interaction effort with the screen by 50 percent. The automaker says the drop in screen time comes from the system letting drivers select an item more quickly—yes, even on bumpy roads.
Predictive touch uses gesture trackers with vision-based or radio-frequency-based sensors to determine where the driver's hand is located, and combines that with artificial intelligence to figure out what the driver is aiming to select before they touch the glass. JLR and Cambridge also note that other sensors, such as eye-gaze trackers, could be used to determine what spot on the screen someone intends to select.
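Neither JLR nor Cambridge has published the patented algorithm, but the basic idea—track the hand's motion and project where it will land before it arrives—can be sketched with simple trajectory extrapolation. The function below is a hypothetical illustration, not the production system: it takes recent fingertip positions from a tracker, extrapolates the motion to the screen plane, and picks the nearest on-screen target.

```python
# Hypothetical sketch of trajectory-based intent prediction (not JLR's
# actual patented method): extrapolate the fingertip's recent motion to
# the screen plane (z = 0) and select the closest on-screen target.

def predict_target(samples, targets):
    """samples: list of (t, x, y, z) fingertip positions, where z is the
    distance from the screen; targets: dict of name -> (x, y) screen
    coordinates of selectable items. Returns the predicted target name,
    or None if the hand isn't approaching the screen."""
    (t0, x0, y0, z0), (t1, x1, y1, z1) = samples[-2], samples[-1]
    dt = t1 - t0
    vx, vy, vz = (x1 - x0) / dt, (y1 - y0) / dt, (z1 - z0) / dt
    if vz >= 0:           # hand moving away from (or parallel to) the screen
        return None
    t_hit = -z1 / vz      # time until the fingertip reaches the glass
    hx = x1 + vx * t_hit  # projected touch point on the screen plane
    hy = y1 + vy * t_hit
    # choose the target closest to the projected touch point
    return min(targets, key=lambda k: (targets[k][0] - hx) ** 2
                                      + (targets[k][1] - hy) ** 2)

# Example: a fingertip 15 cm from the glass, drifting toward the
# "radio" icon, gets resolved before contact.
samples = [(0.0, 0.00, 0.00, 0.20), (0.1, 0.02, 0.01, 0.15)]
targets = {"radio": (0.10, 0.05), "nav": (0.30, 0.20)}
print(predict_target(samples, targets))  # → radio
```

A real system would filter noisy sensor data (e.g., with a Kalman filter) and fuse eye-gaze input rather than relying on two raw samples, but the selection-before-contact principle is the same.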
"our technology has numerous advantages over more basic midair interaction techniques or conventional gesture recognition because it supports intuitive interactions with legacy interface designs and doesn't require any learning on the part of the user," said cambridge's bashar ahmad.
Because the patented system is software based and relies on sensors that could already be in vehicles, it could be deployed to existing cars and SUVs that are currently on the road.
Both teams also cite reduced pathogen transmission as a bonus, since the driver is no longer touching the screen. But it's not foolproof: if you're sharing a vehicle with another driver, the steering wheel will already be contaminated, and if the passenger is sick, you're already in a confined space with that person.
One issue we see is that interacting with something you don't actually touch could be a bit disorienting. Hovering a hand over a screen as it selects items might eventually become comfortable, but initially it could be weird. The researchers claim there isn't any learning curve, but introducing a new way to interact with something that's rolling down the road at 70 mph is sure to require a bit of practice.