Is artificial intelligence going to destroy the SDH (subtitles for the deaf and hard of hearing) industry? It’s a valid question because, while SDH is the default subtitle format on most platforms, the humans behind it – as with all creative industries – are being increasingly devalued in the age of AI. “SDH is an art, and people outside the industry have no idea,” says Max Deryagin, chair of Subtle, the Subtitlers’ Association.
The thinking is that AI should simplify the process of creating subtitles, but that is way off the mark, says Subtle committee member Meredith Cannella. “There’s an assumption that we now have to do less work because we can trust AI tools, but that hasn’t been my experience on projects over the last five or six years.”
“Auto-transcription is the only place where I have seen some positive advances,” Cannella adds, “but even then, I’m not sure it has affected the total amount of time that it takes to produce an SDH file.” So many corrections are needed that there is no net time saving compared with using older software.
Moreover, the quality of AI-generated SDH files is often so poor that substantial work is needed to bring them up to standard – yet because human subtitlers are assigned this work as “quality control”, it is paid at far lower rates. Subtle notes that many of its members are now unable to make a living wage.
“SDH rates are not great to start with, but now they’re so low that it’s often not even worth taking the work,” says Rachel Jones, audiovisual translator and member of the Subtle committee. “It really undermines the role that we play.”
And it’s a vital role. Teri Devine, associate director for inclusion at the Royal National Institute for Deaf People, says: “For people who are deaf or have hearing loss, subtitles are an essential service – allowing them to enjoy film and TV with loved ones and stay connected to popular culture.”
The deaf and hard-of-hearing community is not monolithic, which means subtitlers are juggling a variety of needs in SDH creation. Jones says: “Some people might say that having the name of a song subtitled is completely useless, while others might connect to it through the song’s title.”
Subtitling involves much creative and emotionally driven decision-making, two things that AI does not currently have the capacity for. When Jones first watches a show, she writes down how the sounds make her feel, then works out how to transfer her reactions into words. Next, she determines which sounds need to be subtitled and which are excess. “You can’t overwhelm the viewer,” she says. It is a delicate balance. “You don’t want to describe something that would be clear to the audience,” Cannella says, “and sometimes, what’s happening on the screen is much more important than the audio!”
AI is unable to decide which sounds are important. “Right now, it’s not even close,” Deryagin says. He also stresses the importance of the broader context of a film, rather than looking at isolated images or scenes. In Blow Out (1981), for example, a mysterious sound is heard. Later, that sound is heard again – and, for hearing viewers, reveals a major plot point. “The same sound can mean a million things.”
“You can’t give an algorithm a sound and say, ‘Here are the sounds, figure it out.’ Even if you give it metadata, it can’t get anywhere near the level of professional work. I’ve done my experiments!”
Netflix shared a glimpse of its SDH processes after subtitles from Stranger Things, such as “[eleven pants]”, won praise from subtitlers. The company declined to comment further on its use of AI in its subtitling. The BBC said: “There is no use of AI for subtitles on TV,” though some of this work is outsourced to Red Bee Media, which last year published a statement regarding Australian broadcaster Network 10.
Jones says that linguists and subtitlers are not necessarily against AI – but at the moment, it is making practitioners’ lives harder rather than easier. “In every industry, AI is being used to replace all the creative things that bring us joy, instead of the boring, tedious tasks we hate doing,” she says.