It’s an interesting time in the world of automotive design. It’s not just our phones that are smart these days; so are our cars, and they’re getting smarter.
The self-driving car is close to becoming a practical reality. Even many mid-range hatchbacks now have sensors, cameras, and voice-activated commands. Get a text on your Bluetooth-connected phone? Your car will read it to you. Need to stop in traffic in a manual car on a hill? Don’t worry, it won’t roll back when you take your foot off the brake—it’s too smart for that. Facing embarrassment by stalling out in the middle of the intersection? Your smart car will get you going again before anyone notices. It can all be a little unnerving until you get used to it.
The smart car knows how you like your seat configured, how your partner likes theirs, and that you don’t always agree on the temperature settings. It knows where it is in the world and in relation to its lane, and where all the cars around it are. It knows where the next gas station is, and whether you’re going to make it there before you run out. It knows that there are hold-ups on one route, and free-flowing traffic on another. It knows what the speed limit is, and whether you’re exceeding it. It may even know what exit you should take to capture that particularly rare Pokémon!
The challenge is how to deal with all this data. How can the interface between human and machine be made optimal for today’s iPad generation, who, when they see a display, expect to be able to touch it, to interact with it? Expect it to be beautiful? And animated? This is the generation who grew up with photorealistic CG in their films and games. Analog is so yesterday. This generation expects their car’s interface to reflect and strengthen the brand they’re buying into, and the buttons and dials to match the look and feel of the car’s interior, not to look like some generic set of controls that could be in any car from the last decade. They expect their car to seamlessly connect to their digital world as well as the world around them. And when that’s done well, they can form an emotional connection with their car, and that can influence buying decisions.
This is the challenge today’s automotive designers are facing. A satisfying UI/UX experience can be a differentiating selling feature—as much as performance, comfort, handling or fuel economy—but with current workflows it’s difficult to translate the artist’s vision into the reality of what will run in-car. All too often, the quality of the experience is compromised along the way: the timing is not right, the transitions are jerky, the details are inconsistent with the brand’s promise. The authoring environment is separate from the final delivery platform, making iterations slow and painful, so refining the design to its ultimate potential is not always possible.
Why is there such a gulf between the software used to author the experience and the software used to display it? Computers in cars have seriously limited capacity—the graphics cards are only the size of credit cards. Because they are subject to extreme stresses of movement and temperature, they need to be small and robust. So you can’t just take a standard 3D package or games engine and run it in the car. Therein lies the disconnect between authoring and delivery, and it’s a tough nut to crack.
With the ever-growing Internet of Things (IoT), there’s only going to be more demand for increasingly sophisticated human-machine interfaces. Anywhere there’s a display, there’s an opportunity for interactivity. Whether it’s your fridge, your thermostat, your robot vacuum cleaner or your automatic cat feeder—the way you interact with it is going to change. And in our opinion, change is good.
Want a glimpse of the car dashboard of the future? Check out Project Dash, the innovative project we collaborated on with Mercedes-Benz.