Educational media accessibility is in the spotlight: last week, the National Association of the Deaf filed lawsuits against both Harvard and M.I.T. over the two schools' failure to adequately caption their online learning content, which includes video and audio material such as online lectures and podcasts.
These should be informative legal proceedings and ones to keep an eye on. They may shape the interpretation and enforcement of existing laws governing electronic media access, such as the ADA and the Rehabilitation Act, especially as online learning grows in higher education.
Here at UT, we’re very concerned with accessibility and are proud of the efforts made to caption campus media. However, even with significant progress made thus far, there’s more to be done. These lawsuits underscore the importance of our university’s efforts to level the playing field and give everyone access to the benefits of online learning.
If you are a content owner and have questions or need captions, please contact me or visit our site for more information.
We’re a good ways into the fall semester, so I thought it might be interesting to share some data we’ve gathered through our captioning and transcription service piloting phase.
First up is a chart displaying the total number of minutes of video and audio we’ve transcribed and captioned, along with a trend line indicating staff hours worked. I’ve grouped this data into two-week chunks.
As you can see, we hit a high mark in the second half of October with steady increases up to that point.
Of course, quite a few variables affect how quickly a video or piece of audio can be transcribed and captioned; how quickly someone speaks or how technical the language is can have a big impact on turnaround time.
Below, you’ll find a chart displaying a breakdown of which groups have requested captions or transcription from us over the course of the semester. This is based solely on the number of videos or audio files (not their duration).
I’m always on the lookout for ways to increase efficiency. It’ll be interesting to see how this data develops over time.
Here in TIS we’ve been quietly rolling out a captioning and transcription service. It’s not widely publicized yet, but we’ve captioned or created transcriptions for almost 70 videos, totaling nearly 1,000 minutes of content. That’s roughly the equivalent of watching The Lord of the Rings trilogy (theatrical editions) twice through!
In addition, we have almost 50 hours of new content on the docket for the upcoming months.
We are very fortunate to have hired three fantastic student staff members, who are diligently and efficiently transcribing, timing, and creating very high-quality captions and transcripts at blistering typing speeds (want to challenge us to TypeRacer?).
So far, the feedback has been very positive for both our turnaround time and overall quality of work.
Although this is just the beginning, we’re very excited about the progress, especially since we have some other exciting initiatives in the works. Our next phases will include:
rolling out a website
ramping up our capacity
creating a captioning knowledge base
getting the word out to the entire UT campus
If you are interested in seeing some of our work, here are a few links to public-facing content that we’ve done:
Need captions or transcriptions for your videos? Make your content accessible! Please contact me, Daniel Jacobs (firstname.lastname@example.org) to get the ball rolling. I’d love to talk shop and answer any questions you might have about all things captions/transcription. And please do check back here for more updates in the future.
p.s. I’m serious about TypeRacer. Set up a race, send me the link.
CAPTCHA, or Completely Automated Public Turing test to tell Computers and Humans Apart, is a challenge-response test used to prevent bot abuse and eliminate spam. If you’ve been on the internet anytime in the past decade, you’ve most likely come into contact with (and perhaps been frustrated by) a CAPTCHA. You’ve probably even used one within the past few days – statistics show that 200 million reCAPTCHAs (a specific type of CAPTCHA) are completed daily.
The most common CAPTCHA looks something like the image above: a couple of distorted words are presented to the user. Bots, unable to comprehend the warped text, cannot break the CAPTCHA and are thwarted. There are many kinds of CAPTCHAs, ranging from the classic text and image recognition to logic questions and interactive tasks.

Several CAPTCHA alternatives have also been developed. One particularly interesting alternative is the honeypot method, which uses a hidden HTML field to trick bots. Because humans cannot see this field, they leave it blank. Bots, however, interact with the raw HTML and cannot tell that the field is hidden, so they fill it in and consequently reveal their bot identities.
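The honeypot idea can be sketched in a few lines. This is a minimal illustration, not any particular library's implementation; the field name `website` and the form layout are hypothetical examples.

```python
# Minimal sketch of the honeypot technique described above.
# The form would include a field hidden from human visitors via CSS, e.g.:
#   <input type="text" name="website" style="display:none">
# Humans never see it and leave it empty; bots that fill every field do not.

def is_probably_bot(form_data, honeypot_field="website"):
    """Return True if the hidden honeypot field was filled in."""
    return bool(form_data.get(honeypot_field, "").strip())

# A human submission leaves the hidden field blank:
print(is_probably_bot({"name": "Ada", "comment": "Hello!"}))  # False
# A bot that fills in every field it finds trips the honeypot:
print(is_probably_bot({"name": "x", "comment": "spam",
                       "website": "http://spam.example"}))    # True
```

In practice the hidden field is usually given an enticing, ordinary-looking name so that form-filling bots can't easily skip it.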
For explanations and examples of these specific types of CAPTCHAs, David Pogue’s article, “Use It Better: 8 Alternatives to the Hated Captcha,” is a great starting point. However, none of these CAPTCHA variations is foolproof, and each creates accessibility issues. “In Search of the Perfect CAPTCHA” is another useful article that covers much of the same ground as Pogue’s but suggests that perhaps the burden of preventing spam should not be placed on the user: if web developers can eradicate the incentives to spam, the problem will solve itself and the need for CAPTCHAs will disappear.
However, no single solution to the spam problem has been unanimously championed. The fate of the CAPTCHA remains a much-debated topic in web security: some wish to abolish CAPTCHAs altogether, some concede they are a necessary evil, and others are searching for creative ways to increase security without testing users’ patience or creating web accessibility barriers.
What is web accessibility and why is it important?
Web accessibility is the practice of making web pages usable by people of all abilities and disabilities. Since the main goal of a web page is to convey information, it should be built to allow everyone access to that information.
Disabilities that prevent successful navigation of web pages are more diverse and widespread than you might think. Did you know that nearly 1 in 10 men has some form of color blindness? Color blindness and other visual impairments can make it difficult or impossible for users to differentiate colors and read text. Additionally, many websites are developed without consideration of varying levels of manual dexterity; limiting clickable regions to small text or areas prevents users who cannot precisely maneuver a mouse from accessing content. These are only a few of the many accessibility issues that web developers should be aware of when building a website.
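Color-related readability problems can actually be measured. The sketch below implements the WCAG 2.0 contrast-ratio calculation that accessibility checkers apply to text and background colors (WCAG level AA requires at least 4.5:1 for normal-size text); it is a simplified illustration, not any particular tool's code.

```python
# Sketch of the WCAG 2.0 contrast-ratio formula used by accessibility
# checkers to flag hard-to-read text/background color combinations.

def relative_luminance(rgb):
    """Relative luminance of an sRGB color given as 0-255 channel values."""
    def channel(c):
        c = c / 255
        # Linearize the gamma-encoded sRGB channel value.
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio between two colors; always between 1 and 21."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black text on a white background has the maximum possible contrast:
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
# Dark gray (#555555) on white clears the 4.5:1 AA threshold:
print(contrast_ratio((85, 85, 85), (255, 255, 255)) >= 4.5)  # True
```

Running colors through a check like this during design catches combinations that look fine to a designer but are unreadable for color-blind or low-vision users.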
Another great online resource is WebAIM.org. This website offers informative articles on a range of accessibility topics, from introductory pieces describing what accessibility is and the different disability types to advanced articles providing accessibility guidance for specific web elements. WebAIM also provides WAVE, a web accessibility evaluation tool that lets you enter a URL, run an accessibility test, and view your site’s accessibility violations. A WAVE Firefox toolbar is also available, which allows pages to be evaluated without sending information to the WAVE server. In addition to being convenient, this is especially helpful when evaluating intranet or password-protected pages.
For institutions such as the University of Texas, web accessibility is critical. All University web applications must meet requirements set by state law, and starting in July, the University will be evaluating pages to ensure that they are at least 90% accessible. To make sure pages are compliant, the University will be using a new tool, WorldSpace Sync. More information about the University’s adoption of WorldSpace Sync can be found in a previous blog post. To learn more about the University’s accessibility requirements, visit the Web Accessibility Policy Page.
UT is stepping up its accessibility efforts. Our current tool for automated accessibility checking, WebXM, is no longer being properly updated and supported. Members of TIS attended an information session last week to learn about the university’s plan going forward.
Within a few weeks, departments will begin gaining access to a new tool called WorldSpace Sync. Over the course of 2013, all of UT will transition into the new tool and retire WebXM.
Central ITS will run scans of the Libraries site and provide a report card that highlights issues with image alt text, color contrast, forms, broken links, and so on, while also checking for the presence of mandatory links, including a link to UT’s home page. TIS will be able to assign known issues to various web authors, each of whom will utilize an account within WorldSpace Sync.
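To give a feel for what an automated scan checks, here is a toy version of one of the simplest rules: finding images with no alt text. This is only an illustration of the idea; WorldSpace Sync's actual checks are its own and far more extensive.

```python
# Toy illustration of one automated accessibility check: flagging
# <img> tags that have a missing or empty alt attribute.
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing_alt = []  # src values of images lacking alt text

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):  # attribute absent or empty
                self.missing_alt.append(attrs.get("src", "(no src)"))

checker = AltTextChecker()
checker.feed('<img src="logo.png" alt="UT logo"><img src="chart.png">')
print(checker.missing_alt)  # ['chart.png']
```

A real scanner layers dozens of rules like this (contrast, form labels, link text) over a site crawl and aggregates the results into the kind of report card described above.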
FireEyes, a plug-in to the Firebug extension for Firefox, provides an optional interface that allows web authors to make and verify fixes within the browser, comment on issues, and mark tasks as completed. It does require Firefox 15 or older, so we’ll be working with MCS to determine how staff can best utilize this plug-in.
We’ll provide more information and training as things progress.