Voice – MUXL

Lee Mallon, the technical lead from Rarely Impossible, came to share his experience of working with Alexa voice, giving some useful insight into how this technology can change the way we communicate.

One point raised was quite interesting. We usually assume the younger generation adapts to technology first, but Lee turned this on its head, suggesting voice interaction could actually suit the older generation more: under-25s have grown up researching and asking questions on the internet with a keypad, whereas the 65+ generation have grown up talking to gain insight and knowledge. I thought that was a nice reversal of the usual concept.

He also pointed out that Alexa provides a transcript of everything asked over the past month or so. His nan suffers a little from dementia, and this showed up in the transcript: she had asked the same question a number of times within a short period. That raises some useful thoughts about how this could be turned into something positive, detecting patterns in how a person uses the device that might indicate early onset of dementia or other illnesses.
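As a rough illustration of that idea (not something Lee or Amazon described actually building), a transcript could be scanned for the same question being asked over and over within a short window. The sketch below assumes the transcript is available as a simple list of timestamped utterances; the thresholds and function names are made up for illustration.

from collections import defaultdict
from datetime import datetime, timedelta

REPEAT_WINDOW = timedelta(hours=1)   # assumed meaning of "a short period of time"
REPEAT_THRESHOLD = 3                 # assumed number of repeats before flagging

def normalise(utterance):
    """Crude normalisation so small wording differences still match."""
    return " ".join(utterance.lower().strip(" ?!.").split())

def flag_repeated_questions(transcript):
    """Return questions asked REPEAT_THRESHOLD or more times within REPEAT_WINDOW.

    `transcript` is a list of (timestamp, utterance) pairs, which is an
    assumption about the data rather than Alexa's actual export format.
    """
    asked = defaultdict(list)
    flagged = []
    for timestamp, utterance in transcript:
        question = normalise(utterance)
        asked[question].append(timestamp)
        recent = [t for t in asked[question] if timestamp - t <= REPEAT_WINDOW]
        if len(recent) >= REPEAT_THRESHOLD:
            flagged.append((question, recent))
    return flagged

transcript = [
    (datetime(2017, 10, 2, 9, 0), "What day is it today?"),
    (datetime(2017, 10, 2, 9, 20), "What day is it today?"),
    (datetime(2017, 10, 2, 9, 45), "What day is it today"),
]
print(flag_repeated_questions(transcript))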

It is interesting that Alexa constantly monitors what is being said, waiting for its wake word to be mentioned, but Lee mentioned how you soon forget about it and carry on with normal life. What was alarming is that every query that goes through Alexa is reviewed by a team of people who put transcripts together, although they don't note the location or name of the user.

This led on to how Alexa could be used in offices, for example pulling together sales figures and presenting them to the team – which would save a few people in my team a few hours a week. Companies must be wary, though, that their data is being run through a direct competitor. Lee's team found some interesting things about the placement of the Amazon Echo when testing it in a client's office. Alexa found it hard to pick up voices in a noisy location, which is fairly obvious, but most interestingly it struggled at the end of the day, when the office workers' voices and language became harder to understand as they grew lazy and tired and just wanted to get home. Lee found with his team that there is a real difference in the tone and quality of a user's voice depending on how they feel.

Lee also talked about what he found when he took an Echo home. At first he would talk normally when asking Alexa a question, but he soon started speaking as if he were typing into a Google search box, e.g. saying “Hamilton Malaysia” instead of “Where did Hamilton finish at the Malaysian GP this weekend?”. With his kids everything started fine, until after a couple of weeks one of them went up to him and said “Dad, get me a drink”. Lee found himself replying “I’m not Alexa!”. They then had to download a skill so that Alexa would only carry out requests once it had heard the word please.
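I haven't seen the skill itself, but the behaviour is easy to picture: a request only gets acted on if it contains the word please. A tiny, made-up sketch of that gate (nothing to do with the real skill or the Alexa Skills Kit):

def handle_request(utterance):
    """Hypothetical politeness gate, illustrating the behaviour described above."""
    if "please" not in utterance.lower():
        return "Ask nicely and I'll help."
    return "Okay: " + utterance

print(handle_request("Get me a drink"))         # refused – no please
print(handle_request("Please get me a drink"))  # carried out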

Lee finished by moving on to new places where voice could be used, such as lifts, and how this would affect the interface – for example, how does the user know they are expected to speak when there is no visible interface? Here the lift would have to talk to the user first, asking which floor they want to go to, which would be one of the first times that a computer initiates contact with the user.
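To make that concrete, a voice-controlled lift would essentially run a prompt-first dialogue: the machine speaks, then listens, then confirms. A hypothetical sketch of that flow, with typed text standing in for speech recognition and synthesis:

import re

def lift_dialogue(listen, speak, floors=range(0, 11)):
    """Prompt-first dialogue: the computer opens the conversation
    instead of waiting for a wake word."""
    speak("Hello. Which floor would you like to go to?")
    while True:
        reply = listen()
        match = re.search(r"\d+", reply)
        if match and int(match.group()) in floors:
            speak("Going to floor " + match.group() + ".")
            return int(match.group())
        speak("Sorry, I didn't catch that. Which floor would you like?")

# In this sketch plain text stands in for the microphone and speaker.
lift_dialogue(listen=input, speak=print)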