E3 is underway, attracting tens of thousands of live viewers to its game announcements. There's plenty worth checking out, but if you're relying on the stream's captions to make sense of what's being unveiled today, you might be very confused.
Try to decipher this, for example: "I've seen the biggest things like NY Hawk Pro Star, MROID Prime, Guitar Ho." Or: "I actually own the world record for the person who worked on… St video games motr is very proud."
While you may be able to get the gist of what's being said by reading these full (albeit error-riddled) sentences in an article, imagine the words being spelled out in real time as captioners try to keep pace with a live speaker. It's much harder to follow.
These weren't the only examples of scrambled captions I came across on the E3 livestream, but listing them all would make this article endless – it was that bad. Feel free to browse the gallery of screenshots below for more examples from the third day of the show (the captioning appeared largely normal over the weekend). The weird captions were popping up on both Twitch and YouTube today, which makes it more likely that the issue originated on the end of E3 organizer the Entertainment Software Association (ESA).
Gallery: E3 live captioning fails | 23 photos
To figure out what was going on, I spoke with Dennis Scarna, senior director of streaming technology at Verizon Media Studios (Engadget's parent company). He thinks there are a few possible explanations. "Scrambled caption data can be attributed to a data flow/signal flow issue, poor audio quality or shorthand errors," he said.
The ESA told Engadget that it hired qualified staff to manually transcribe its livestream. Typically, human captioners are given a feed of the live broadcast they need to transcribe, along with an encoder into which they type the captions. Their words are then fed back into the organizer's streaming infrastructure as embedded data and transmitted to the various platforms (Twitch and YouTube, for example).
Anyone who has tried taking notes during a lecture by a fast-talking professor knows how difficult it is to transcribe a live event. Even the best stenographers will have a hard time following a conversation, especially if there's crosstalk. But the degree to which E3's closed captions failed looked less like a human struggling to keep up and more like a technical error; as Scarna theorized, the data "could still be corrupted even if it is manual."
Since live transcription can be incredibly difficult, some mistakes are inevitable. But big chunks of the E3 streams appeared to be pre-recorded, which means they could have been captioned ahead of time. We asked the ESA how much of its broadcast is pre-recorded, and the association said the figure varies for each broadcast, and that a majority of its content is live or comes in live to be included in the broadcast.
We also asked why the pre-recorded segments weren't captioned in advance, and the ESA said it made the decision to provide captioning for the entire program – not just the videos it produced itself, but also those of its partners. But because it had no scripts in advance, its live captioners typed what was being said with no warning of what to expect.
The ESA also stated that it has sign language interpreters available on demand (although it's not clear how a viewer would request an interpreter partway through a stream).
Engadget editors watched the E3 Day 3 livestream nonstop today, and until around 5 p.m. ET the closed captioning was incomprehensible (as in the screenshots we've included). Engadget contacted the ESA around 1 p.m. ET to ask about the issue. We finally got some responses around the time the captions improved, which suggests the ESA had been made aware of the problem and may have been working on a fix. As of this writing, the quality of the captions has improved significantly. We're still awaiting clarification from the ESA on what went wrong and what was done to resolve the issue.
AI-generated captions are unlikely to have helped, either. Although the technology is improving, it's still imperfect, and many broadcasters prefer to rely on human stenographers. And if the problem was corrupted data, it would have hampered AI captions just as much. Accurate transcriptions are a crucial part of making content more inclusive, especially for the deaf and hard-of-hearing community. People with disabilities play games too, and mistakes like this at major events just show how people with different needs can be left out of the conversation. The industry has taken steps to improve the inclusiveness of its products, but it's 2021 and we need to do better than that.