Artificial Intelligence is helping to boost learning outcomes for deaf students.
Hard-of-hearing and deaf students are, unfortunately, often left at a disadvantage as they have an extra set of challenges to contend with on top of the difficulties that come with adjusting to a classroom environment.
This extra work – keeping up with sign language interpreters while simultaneously reading course material and taking notes – often results in learning challenges and lower grades. However, as technology integration grows in schools, new ways to bridge classroom accessibility gaps emerge.
And that’s where technology can really make a difference, according to Tom Livne, CEO of Verbit. His company has developed a cutting-edge transcription process, powered by machine learning technology, that provides 99% transcription accuracy with what he claims is the fastest turnaround time in the industry at any volume.
“Our hybrid model fuses the latest AI technology as well as seasoned transcribers who edit and review each file to polish them to perfection. Our platform is the only solution on the market that incorporates contextual data, specific acoustic models and current events to provide each customer with a tailored model that keeps improving with time.”
Founded in 2016, Verbit has secured $11M in seed funding backed by several prominent venture capitalists. Its platform is now being used by various institutions, including Brigham Young University in Idaho, which has used the technology to provide fast and accurate transcripts and captions for its students, particularly those who are deaf or hard of hearing. Prior to working with Verbit, the university was facing major challenges due to a lack of resources, a high demand for services and a large backlog of inaccessible content that required transcription.
Deploying the technology resulted in a much faster turnaround without compromising on accuracy. Automation also allowed the services to be delivered at a much lower cost, which is of course a crucial issue for many educational institutions trying to balance their budgets.
“We chose Verbit because, out of the multitude of companies on the market, they were a perfect fit for us. We stay with Verbit because the people are fantastic. The customer service has been incredible. The turnaround time, the accuracy, the editing time – all of those things are truly best-in-class,” says Valerie Sturm, BYU-I Coordinator of Services for the Deaf and Hard of Hearing.
Prior to piloting the automation route, BYU-Idaho employed 11 transcribers on campus, tasked with covering over 500 hours of transcription per week. A high demand for services and too few people to provide them made it impossible to meet students’ needs and recover from a lengthy backlog of requests.
The system relied mostly on student transcribers, who would graduate in due course, meaning a constant need to re-train personnel from scratch, which was highly inefficient. The university found itself faced with a constant backlog of requests for content transcription, which it wasn’t able to meet under the old system. At one point the institution had over 1,000 pieces of audio that were never transcribed or captioned. Activities like tutoring sessions or extra content that took place outside of the traditional classroom setting only added to this growing backlog.
This is part of a growing trend which we’ve been seeing at events such as this year’s Microsoft Build, where Satya Nadella launched their company’s AI for Accessibility program. There is little doubt that with the fast pace of innovation we’re seeing in fields such as machine learning, this type of capability has great potential for making society more inclusive for people with disabilities, and that has to be a good thing.
Alice Bonasio is a VR and Digital Transformation Consultant and Tech Trends’ Editor in Chief. She also regularly writes for Fast Company, Ars Technica, Quartz, Wired and others. Connect with her on LinkedIn and follow @alicebonasio on Twitter.
Also published on Medium.