Over the past decade, businesses have begun to recognize the value of incorporating UX design as a key component of the product and service development process. Data strongly suggests that companies focusing on meeting users’ needs perform better than those that remain stuck in earlier generations’ “functionality first” mindset.
The emergence of new technologies (virtual reality, drones, self-driving cars, artificial intelligence, etc.) has reinforced the need for UX design to drive development and has revealed a new set of challenges that cannot be solved using the methods of the past. There is an opportunity to approach these challenges with a fresh perspective and create a solid foundation that focuses on user experience from the beginning. For these new tools and technologies to improve our lives, they must be designed with knowledge of, and an appreciation for, people’s needs and cultural expectations.
Roombas, delivery drones, and digital assistants are now relatively common, but we have yet to define the UX “code of conduct” around our interactions with them. For instance, if you encountered a robot moving towards you in a hotel hallway, or entering your elevator, how would you react? How should it react to you?
Savioke is a company that creates autonomous robots for the service industry. Once they had achieved the goal of creating a robot that could successfully navigate spaces and execute commands, their main concern became: How should their robots behave around humans?
To answer this question, the Savioke team partnered with Google Ventures to conduct a design sprint examining the user experience. The Google design sprint is a five-day collaborative exercise for efficiently answering critical business questions and validating realistic prototypes with actual end users.
Savioke ran this process one month before delivering their Relay robot to a local hotel. As a result, the team was able to update the robot’s software to give it human-like characteristics in time for the initial release. These included a series of expressive sounds (think R2-D2 and WALL-E), a physical interface vaguely resembling a cheerful face, and a “dance” performed after a successful delivery.
These may be small features compared to the challenge of building a robot capable of operating autonomously, but the impact of understanding (or ignoring) the human element is critical to integrating new technology into our society.