Samsung Electronics has released its first AI-powered phone series, the Galaxy S24, which has been applauded for its ability to overcome language barriers.
The Galaxy S24 comes with an interpreter feature that allows users to translate languages on the go. However, the feature appears useful only for limited tasks, such as finding directions or making restaurant reservations while traveling abroad. Moreover, unlike document translation, real-time interpretation is prone to recognition errors and delays, which can frustrate impatient Korean users.
The reporter tested the Galaxy S24’s real-time interpretation feature in a phone call with William, an American in his 40s who works as an English teacher at a middle school, and had no trouble communicating through simple questions. The scenario was a casual conversation with an American taxi driver, with William playing the driver. When William asked in English, “What brings you to the US? Are you enjoying your visit so far?”, his words were translated into Korean in real time.
However, in response to William’s next question, “Did you get a chance to test drive?”, the Galaxy S24 rendered the reporter’s answer as “Drive the cars yourself,” and the conversation ended with William responding, “Oh, the interpretation is weird.”
The reporter also spoke with Hai, a Vietnamese woman in her 30s who lives in Korea through an international marriage. Hai spoke in Vietnamese and English, while the reporter spoke in Korean. When the reporter asked in Korean, “Can meritocracy be called fair?”, the Galaxy S24 correctly recognized the Korean term and delivered it in English as “meritocracy.” However, the follow-up Korean question, “Do you think meritocracy provides equal opportunities?”, was not translated correctly into English, and the reporter had to rephrase it.
In the opposite direction, translating from other languages into Korean, the Galaxy S24 was relatively fast and accurate.
The gap arises because Korean speakers often omit subjects, complements, and objects, leaving meaning to be deduced from context. If the system does not retain the preceding conversation, the interpretation can go entirely wrong.
Kim Soo-yeon, a professor of English at Sejong University who participated in the government’s data construction project for AI training (Korean speech data), attributed the misinterpretations to a lack of Korean speech recognition and interpretation data. “It can be challenging to interpret Korean sentences into English, since Korean usually omits subjects and nouns that are grammatically necessary in English,” Kim explained. Because Korean is a context-dependent language whose meaning is determined by context, training on Korean speech recognition and interpretation data is essential for accurate AI interpretation, she added.