Viv is an AI personal assistant launched at TechCrunch Disrupt NY on Monday, May 9. It promises to radically simplify our interface with everything, and it is what Siri cofounders Dag Kittlaus and Adam Cheyer wanted to build with Siri but couldn’t. Viv is, of course, a few years more advanced than Siri, but the big difference is that Steve Jobs preferred to keep Siri in a walled garden rather than allow open partnerships.
“Viv is a global platform that enables developers to plug into and create an intelligent, conversational interface to anything.”
Cheyer and Kittlaus are focused on creating a developer ecosystem so that Viv becomes the way every device interacts with you in the future. They believe that no single company can bring you everything you need, but that one personal assistant can interact with all the companies and services you do need.
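To make the “plug in” idea concrete, here is a minimal, hypothetical sketch of what exposing a third-party service to a platform like this could look like. The `register` decorator, `REGISTRY` and `book_room` handler are invented for illustration; they are not Viv’s actual (and still unreleased) SDK.

```python
# Hypothetical sketch only: these names are illustrative, not Viv's real SDK.
REGISTRY: dict = {}

def register(intent: str, needs: list):
    """Record a handler for a user intent so the platform can dispatch to it."""
    def wrap(fn):
        REGISTRY[intent] = {"needs": needs, "handler": fn}
        return fn
    return wrap

@register("book_room", needs=["city", "night"])
def book_room(city: str, night: str) -> str:
    # A real third-party service would call its own booking API here.
    return f"Room booked in {city} for {night}."

# Having parsed "find me somewhere to stay in Austin tonight", the assistant
# would dispatch to the matching handler with the extracted slots.
print(REGISTRY["book_room"]["handler"]("Austin", "tonight"))
```

The appeal for developers is that they describe what their service needs and can do, and the assistant decides when to invoke it, rather than each company building its own conversational front end.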
Viv does ‘conversational commerce’ with breathtaking speed and accuracy. I think you can forget about deep linking as the next mobile/online paradigm. Kittlaus described Viv as “the next major marketplace and channel for offering content, commerce and services.”
Viv will launch a gradually expanding developer program later in 2016, and as thousands of developers add to Viv’s intelligence, its power expands. While the demo focused on showing Viv’s strengths on a phone, Viv is clearly intended to be the interface for all of our smart connected devices. And robots.
In fact, I’ll bet that my future home robot will be running on Viv.
Is there a downside?
What could possibly go wrong in a future world full of personal assistants?
Well, while Dag Kittlaus was launching Viv at TechCrunch Disrupt NY, cofounder Adam Cheyer was in San Jose on the CHI 2016 panel “On the Future of Personal Assistants,” alongside Phil Cohen (VoiceBox), Eric Horvitz (Microsoft), Rana El Kaliouby (Affectiva) and Steve Whittaker (UCSC).
First, the panel noted that the term ‘personal assistant’ is deceptive, as we are transitioning from helpful but passive assistants such as Siri to active agents such as Echo, Viv and perhaps Jibo or other home robots in the future.
“Assistants are just helpful. Agents go out into the world and perform actions,” said Phil Cohen of VoiceBox.
There are also huge potential implications in the transition from personal assistants to family assistants. Amazon’s Echo is an example of a family assistant, where the device is no longer in our hands or pockets but sits in a room and interacts with a range of people.
How do we deal with multiple devices in different areas of our lives, i.e. from car to kitchen to work to bed? And how do these devices interact across our individual family members? Will we be creating functional or dysfunctional family agents? Navigating these issues of privacy, trust and boundaries should be at the forefront of development.
Eric Horvitz of Microsoft described the ‘penumbra’, or bubble of data and devices, that travels with each individual. We expect these agents to collaborate with us, but who else do our agents collaborate with in order to provide us services?
The next step is to consider how our agents might, in future, interpret our emotional states. What inferences can be made, and shared with whom? What control can be exercised by these agents?
Rana El Kaliouby of Affectiva said, “Your fridge will understand your emotional state and help you make better decisions, if you want to change your behavior. That’s a very powerful promise.”
Cheyer was clear that he doesn’t want “a plethora of different assistants… I want one assistant who can learn my preferences over time and do a better job.”
Although the intelligence might be in the cloud, the form factor of the device will give a very different feel to how we interact, whether it’s a screen, a TV or a robot of some kind. And as we start to embody our assistants, we change the relationship significantly.
Horvitz told an anecdote about heading off on holiday and turning back to bring Alexa along, kind of like a member of the family.
Steve Whittaker of UCSC said that “we can use the metaphor of a house as an approach to designing different categories of personal assistants. A house is a collection of resources for family activities. At some level they are shared facilities for a family, but it is segmented.”
Cheyer responded: “Everyone is talking about the strengths of AI, but it’s not really going to solve all our problems. Where Siri was pretty dumb, Viv has lightweight planning ability. Viv is able to do complex tasks with several steps, which is a lot closer to the way we really ask for things. It’s not going to be doing the life-planning things that Eric was asking for. But we’re working on taking actionable insights out of the unstructured data of our lives. And in the end that’s a very exciting development across all domains.”
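As a toy illustration of what ‘lightweight planning’ means in practice, the sketch below chains together capabilities whose outputs satisfy the inputs of the next step until the user’s goal is produced. The capability names and the greedy chaining loop are assumptions made for illustration; they are not Viv’s implementation.

```python
# Illustrative sketch, not Viv's actual planner: keep applying any registered
# capability whose inputs are already known until the goal concept appears.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Capability:
    name: str
    needs: set        # concepts this step consumes
    provides: set     # concepts this step produces
    run: Callable[[dict], dict]

CAPABILITIES = [
    Capability("lookup_contact", {"person"}, {"address"},
               lambda ctx: {"address": f"home address of {ctx['person']}"}),
    Capability("find_florist", {"address"}, {"florist"},
               lambda ctx: {"florist": f"florist near {ctx['address']}"}),
    Capability("place_order", {"florist", "product"}, {"confirmation"},
               lambda ctx: {"confirmation": f"{ctx['product']} ordered via {ctx['florist']}"}),
]

def plan(goal: str, known: dict) -> dict:
    """Greedily run any capability whose inputs are available until the
    goal has been produced or no further progress can be made."""
    ctx, progress = dict(known), True
    while goal not in ctx and progress:
        progress = False
        for cap in CAPABILITIES:
            if cap.needs <= ctx.keys() and not (cap.provides <= ctx.keys()):
                ctx.update(cap.run(ctx))
                print("step:", cap.name)
                progress = True
    return ctx

# "Send my mom flowers" decomposes into a three-step chain across services.
print(plan("confirmation", {"person": "Mom", "product": "flowers"})["confirmation"])
```

A real planner would search over far richer service descriptions, but the point stands: a multi-step request decomposes into a chain of third-party capabilities rather than a single hard-coded integration.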
In a room full of interactionists and ethicists, it’s no surprise that the first question to the panel was “Is it good for society to be creating a slave class?”
This opened the door to a discussion that ranged from the ethics of ‘nagware’ up to lethal force. At that point, the challenge was made for designers to shift away from negativity and to create delightful solutions that would enhance and improve our lives. People-centered rather than task-centered design.