At I/O 2018, Google CEO Sundar Pichai unveiled the company’s latest creation – Google Duplex. Google kicked things off with a demonstration: the Artificial Intelligence (AI) behind Google Duplex, working through Google Assistant, called a local hair salon and a restaurant and actually held a conversation with the people who answered.
Pichai said that Duplex has been in the works for many years and draws on much of the technology Google has developed recently, including natural language understanding, deep learning, and text-to-speech. Duplex is still under development; however, Google plans to begin testing it and launch it inside Assistant this summer.
“We’re still developing this technology, and we want to work hard to get this right,” Pichai said. “We really want it to work in cases, say, if you’re a busy parent in the morning and your kid is sick and you want to call for a doctor’s appointment.”
“The technology is directed towards completing specific tasks, such as scheduling certain types of appointments. For such tasks, the system makes the conversational experience as natural as possible, allowing people to speak normally, like they would to another person, without having to adapt to a machine.”
Check out the video below to see how naturally Duplex talks to a person on the phone. If this were an audio recording of just the phone calls, I would have no idea that it wasn’t an actual person talking.