Snapchat adds four Indigenous languages to its lens platform

Popular social app Snapchat has launched a new feature to help Australians learn over 170 Indigenous words.

The social media company has partnered with First Languages Australia to bring Aboriginal and Torres Strait Islander languages to users’ screens through the app’s language-learning lenses.

Each lens uses augmented reality and machine learning to identify objects and display their names in the languages Wiradjuri (Central New South Wales), Yugambeh (South East Queensland), Wakka Wakka (Central Queensland) and Yawuru (Broome in Western Australia).

Shaun Davies, a Yugambeh descendant, said social media is the modern “campfire” where stories are shared.


“In the past, our elders used to teach lingo by the campfire,” he said.

“But the camp has changed, and the fire that people watch every day is no longer the same.

“Technology has become a central place in the house, and now our lingo must go there if it is to survive for mobo jahjum (future generations).”

First Languages Australia chief executive Beau Williams said the lenses would boost language recognition.

“We know millions of young Australians use Snapchat every day – so it’s an amazing opportunity for them to learn about our First Nations languages in a fun and interactive way,” he said.

Among the objects are the ear (wudha in Wiradjuri), the spider (wanggarranggarra in Yawuru) and the hat (binka in Yugambeh).


Snapchat APAC chief executive Kathryn Carter said it was important to share Indigenous languages with young Australians.

“We are delighted to partner with First Languages Australia, and hope these lenses represent our small part in supporting Australia’s Aboriginal and Torres Strait Islander communities in a unique way,” she said.

The lenses can be accessed by searching Learn Wiradjuri, Learn Yugambeh, Learn Wakka Wakka, or Learn Yawuru, or by scanning the lenses’ Snapcodes.

Users point their cameras at an object to scan it, and the lens automatically displays the object’s English and Indigenous names in real time, along with an audio clip.
