Automated speech tools for helping communities process restricted-access corpora for language revival efforts

Nay San, Martijn Bartelds, Tolulope Ogunremi, Alison Mount, Ruben Thompson, Michael Higgins, Roy Barker, Jane Simpson, Dan Jurafsky

    Research output: Contribution to conference › Paper

    Abstract

    Many archival recordings of speech from endangered languages remain unannotated and inaccessible to community members and language learning programs. One bottleneck is the time-intensive nature of annotation. An even narrower bottleneck occurs for recordings with access constraints, such as language that must be vetted or filtered by authorised community members before annotation can begin. We propose a privacy-preserving workflow to widen both bottlenecks for recordings where speech in the endangered language is intermixed with a more widely-used language such as English for meta-linguistic commentary and questions (e.g. What is the word for ‘tree’?). We integrate voice activity detection (VAD), spoken language identification (SLI), and automatic speech recognition (ASR) to transcribe the metalinguistic content, which an authorised person can quickly scan to triage recordings that can be annotated by people with lower levels of access. We report work-in-progress on processing 136 hours of archival audio containing a mix of English and Muruwari. Our collaborative work with the Muruwari custodian of the archival materials shows that this workflow reduces metalanguage transcription time by 20% even given only minimal amounts of annotated training data: 10 utterances per language for SLI, and for ASR at most 39 minutes, possibly as little as 39 seconds.
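    The triage step of the workflow can be sketched as follows. This is a minimal illustration, not the authors' implementation: the Segment records stand in for the output of real VAD and SLI models, the transcribe() stub stands in for a real ASR system, and the language codes and timings are invented examples.

    ```python
    # Hypothetical sketch of the triage stage: given speech segments already
    # labelled by VAD + SLI, transcribe only the English (metalanguage) ones so
    # an authorised person can scan the result and triage the recording.
    from dataclasses import dataclass

    @dataclass
    class Segment:
        start: float   # seconds
        end: float     # seconds
        language: str  # "eng" for English metalanguage, "zmu" for Muruwari (example codes)

    def transcribe(segment: Segment) -> str:
        # Placeholder for a real ASR model; returns a dummy transcript here.
        return f"<english speech {segment.start:.1f}-{segment.end:.1f}s>"

    def metalanguage_transcript(segments: list[Segment]) -> list[str]:
        # Only English-labelled segments are transcribed; Muruwari speech is
        # left untouched until an authorised person has vetted the recording.
        return [transcribe(s) for s in segments if s.language == "eng"]

    segments = [
        Segment(0.0, 2.5, "eng"),  # e.g. "What is the word for 'tree'?"
        Segment(2.5, 4.0, "zmu"),  # Muruwari response, not transcribed
        Segment(4.0, 6.2, "eng"),
    ]
    print(metalanguage_transcript(segments))
    ```

    In a real pipeline the segment boundaries would come from a VAD model, the language labels from an SLI classifier trained on the few annotated utterances mentioned above, and the transcripts from an English ASR system.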
    Original language: English
    Pages: 41-51
    DOIs
    Publication status: Published - 2022
    Event: Fifth Workshop on the Use of Computational Methods in the Study of Endangered Languages - Dublin, Ireland
    Duration: 1 Jan 2022 → …

    Conference

    Conference: Fifth Workshop on the Use of Computational Methods in the Study of Endangered Languages
    Period: 1/01/22 → …