Saturday, November 16, 2013

Final Project: Smartphones and Dialects

For my final project, I wanted to combine the digital world and linguistics. I was reading online when I came across an article about people in the U.K. having trouble using the voice-activated features on their smartphones because of their dialect or accent. It seems that smartphones aren't really programmed to understand non-American dialects. The article is a bit dated, but it got me thinking, and I did some more research.

My husband has an Australian coworker, and I asked her whether her smartphone understood her voice. She said it was terrible at understanding her, but she thought there might be some way to adjust a smartphone to get it to recognize your dialect. I work at the Office of Information Technology on campus, and I asked some programmers I work with if they knew anything about a dialect app for smartphones. They said they hadn't considered that problem and didn't know of any, but one told me about high-tech voice recognition software called Dragon.

This led to more research. I found a USA Today article detailing helpful smartphone travel apps, including language apps. None of the apps mentioned can recognize foreign dialects; they are mostly for translation, which makes sense, but there is room for a lot of improvement in that field as well.

Then I found another article explaining that Google and other companies are using Dragon software to improve their smartphones' voice recognition, and that they are also crowdsourcing solutions to their phones' voice recognition problems. The article also discusses a little of how complicated it is to get smartphones to recognize dialect. Voice recognition technology relies heavily on statistical models of language that try to predict which word will come next in a sentence based on probability. A Google product manager interviewed in the article gives the example of saying "empire" into your smartphone: statistically, a speaker is more likely to follow that word with "state building" or "strikes back" than with anything else. He also mentions that continuously collecting linguistic field data helps improve voice recognition.
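To make the "empire" example a little more concrete, here is a minimal sketch of that kind of statistical prediction. This is just a toy bigram model over a tiny made-up corpus (the phrases and counts are my own invention, not from the article), but it shows the basic idea: count which words follow which, then rank the candidates by probability.

```python
from collections import Counter, defaultdict

# Tiny hypothetical corpus of phrases users might speak.
corpus = [
    "empire state building",
    "empire state building",
    "empire strikes back",
    "empire of the sun",
]

# Count how often each word follows a given word (a bigram model).
follows = defaultdict(Counter)
for phrase in corpus:
    words = phrase.split()
    for prev, nxt in zip(words, words[1:]):
        follows[prev][nxt] += 1

def predict_next(word):
    """Return candidate next words ranked by estimated probability."""
    counts = follows[word]
    total = sum(counts.values())
    return [(w, c / total) for w, c in counts.most_common()]

print(predict_next("empire"))
# In this toy corpus, "state" comes out on top.
```

Real systems are vastly more sophisticated (longer contexts, acoustic models, huge amounts of field data), but this is the core of why "state building" beats a rarer continuation: the model has simply seen it more often.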

But is there really a demand for an app that can recognize different dialects, not just foreign languages? I found a very recent (2013) Forbes article about what customers want in a new smartphone, and better voice recognition is on the list. The author acknowledges that even with a standard American accent, his iPhone sometimes has trouble understanding him, and he notes that dialects and other "lingo" make a potential smartphone user's experience less than ideal.

So this brings me to my project. I've met with the director of the International Student Association, and he has agreed to send out a Qualtrics survey for me to gather students' opinions about and experiences with the voice-activated features on their smartphones. I chose the International Student Association because I knew its members would all have non-American accents and that, for the non-native speakers, their English would be at a level where they could read and understand the survey questions. I'm sending this survey to get an idea of the demand for a smartphone, or a smartphone app, that would let them use voice-activated features with their variety of English.

What are your thoughts?