Using basic accessibility standards, the presence or absence of essential usability features, and site accessibility statements, this study evaluates the accessibility of the home pages of the federal judiciary — those of the U.S. district, appellate, and specialty courts, the Administrative Office of U.S. Courts (AO), the Federal Judicial Center (FJC), and the main homepage of PACER, the federal judiciary’s e-filing and e-records access system. Software evaluations reveal a narrow set of accessibility issues, including scripts with no accompanying functional text, images and server-side image maps with no text equivalents or descriptors, and inaccessible forms. Manual evaluations of Web sites show that a high proportion (about 67 percent) of the home pages provided skip navigation links, whereas smaller proportions provided direct or indirect links to accessibility information (about 15 percent and 12 percent, respectively) or controls for manipulating font size (about 12 percent). Notably, a sizeable proportion (about 45 percent) of home pages provided direct or indirect links to a “BrowseAloud” explanation and download page, apparently in lieu of information on accessibility. Finally, content analysis of existing Web site accessibility pages and policy statements shows a high degree of variation, with some being exceptionally detailed and informative and others less so.
Summary of results
Study details and results
Discussion and conclusion
Some key implications for design
Advances in Web-based technology have made electronic access a pervasive feature of modern life (Anderson and Rainie, 2014; Ellison, 2004; Hitlin, 2018; Kaplan, 2013; Wentz, et al., 2011). Not surprisingly, use of Web-based technology for conducting official business is also becoming pervasive. According to Jaeger (2006): “E-government is becoming an increasingly important part of the democratic process and of typical activities of citizens in the United States.” Jaeger and Thompson (2004) argue that limitations on such activities effectively restrict participation in the democratic process itself.
Today, digital courts and the electronic submission of pleadings and other materials are on the rise (Kaplan, 2013), and filings, calendars, e-forms, and juror questionnaires are now available for real-time online access. The use of kiosks and electronic portals over the Internet for litigants to access court functions, for example, allows for asynchronous (any time, any place) access (Kaplan, 2013).
The U.S. judiciary pioneered electronic access to court records in developing the PACER (Public Access to Court Electronic Records) e-filing and e-records access system (Kaplan, 2013). An online platform established and maintained by the Administrative Office of the United States Courts, PACER provides electronic access to federal court records. It allows registered users to obtain case and docket information online from federal appellate, district, and bankruptcy courts, and the PACER Case Locator. A newly introduced “Next Generation” (NextGen) Case Management/Electronic Case Filing (CM/ECF) system aims at providing enhanced electronic access and processing. Importantly, the gateway to electronic access to all these features is via the homepages/Web sites of courts. The extent to which these homepages are accessible to all is thus an important research, empirical, and policy question (Jaeger, 2003). Notably, examining the accessibility of home pages is of significance because an inaccessible home page is likely to limit access to any other subsidiary part of a Web site (Lazar and Greenidge, 2006; Olalere and Lazar, 2011).
Early studies of the accessibility of e-government sites (Ellison, 2004; Fagan and Fagan, 2004; Jaeger, 2006; Lazar, et al., 2010; Lazar and Olalere, 2011; Lipowicz, 2010; Loiacono, et al., 2005; Rubaii-Barrett and Wise, 2008; Stowers, 2002) found low levels of accessibility, with fewer than one-third of these sites labeled “accessible” (Jaeger, 2006). Surprisingly, relatively few of these studies examined the home pages of courts. Two early studies included judicial Web sites: Ellison (2004) and Olalere and Lazar (2011). Out of the 50 Web sites examined in Ellison’s (2004) study, however, only one was a judicial site — that of the U.S. Supreme Court (supremecourtus.gov). Olalere and Lazar’s (2011) more recent study included only two judicial Web sites (those of the U.S. Supreme Court and USCourts.gov). By contrast, their sample of 100 Web sites included 80 executive branch sites, eight legislative branch sites, and 10 open government sites. Our own extensive search and review of Web site accessibility studies revealed no study that focused on judicial Web sites. More generally, while access to state and local courts has received a great deal of attention in recent years (Bleyer, et al., 1995; Charmatz and McRea, 2003; Cress, et al., 2006; Linnell and Wieck, 2012; Prescott, 2017; Schwartz, 2005; Udell and Diller, 2007), access to federal courts remains relatively under-explored. Moreover, virtually all previous studies have focused on issues related to physical access to courtrooms, not electronic access. Further, as noted above, digital courts and the electronic submission of pleadings and other materials are on the rise (Kaplan, 2013). Since the home pages of courts are primarily, if not exclusively, the gateway to these court functions, access to court home pages is of paramount importance.
To address this shortcoming, this study examines the home pages of the federal judiciary. Notably, Section 508 of the U.S. Rehabilitation Act (https://www.fcc.gov/general/section-508-rehabilitation-act) requires Web pages of federal government agencies to be accessible to people with disabilities. According to a 2003 DOJ report: “removal of barriers on Federal agencies’ Web sites is simply a matter of good design. It also benefits others, such as those who use low-end technology with lower modem speeds and people who use wireless Internet connections” (U.S. Department of Justice, 2015).
While accessibility laws such as the ADA and Section 504 of the Rehabilitation Act of 1973 do not apply to the federal judiciary, this paper reports results from a recent study undertaken by the first author as part of an American Association for the Advancement of Science (AAAS) fellowship with the Federal Judicial Center in Washington, D.C. Using basic accessibility standards, the presence or absence of essential usability features, and site accessibility statements, the study evaluated the accessibility of the home pages of the federal judiciary — those of the U.S. district, appellate, and specialty courts, the Administrative Office of U.S. Courts (AO), the Federal Judicial Center (FJC), and the main homepage of PACER, the federal judiciary’s e-filing and e-records access system. Specifically, 113 Web sites were investigated, including those of the U.S. Supreme Court, the current 94 U.S. district courts, the 11 U.S. circuit/appellate courts, the U.S. Court of Appeals for the District of Columbia, the U.S. Court of Appeals for the Federal Circuit, and the two U.S. specialty courts (the U.S. Court of International Trade and the U.S. Court of Federal Claims), plus those of the FJC and PACER. To examine federal judicial sites, this study began with the automated Web accessibility evaluation tools AChecker and Cynthia Says. AChecker is an open source accessibility evaluation tool developed in 2009 by the Inclusive Design Research Centre (formerly known as the Adaptive Technology Resource Centre) of the University of Toronto. The Cynthia Says portal is a joint education and outreach project of Cryptzone, ICDRI, and the Internet Society Disability and Special Needs Chapter. As initial approximations or baselines of Web accessibility, these tools review the HTML code of a Web page to look for common accessibility errors.
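To make concrete the kind of review these tools automate, the following is a minimal sketch in Python (our own illustration with hypothetical markup, not the actual AChecker or Cynthia Says logic) of a scanner for two of the error types discussed in this study: images with no text equivalents and form fields with no programmatic label.

```python
from html.parser import HTMLParser

class AccessibilityChecker(HTMLParser):
    """Flags two common issues of the kind automated tools report:
    images with no alt attribute (no text equivalent for screen readers)
    and form inputs with no associated <label for="..."> element."""

    def __init__(self):
        super().__init__()
        self.issues = []          # issues detected so far
        self.labeled_ids = set()  # ids referenced by a <label for="...">
        self.input_ids = []       # ids of form inputs encountered

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.issues.append("img missing alt text")
        elif tag == "label" and "for" in attrs:
            self.labeled_ids.add(attrs["for"])
        elif tag == "input" and attrs.get("type") not in ("hidden", "submit"):
            self.input_ids.append(attrs.get("id"))

    def report(self):
        issues = list(self.issues)
        for input_id in self.input_ids:
            if input_id not in self.labeled_ids:
                issues.append("form input with no associated label")
        return issues

# Hypothetical page fragment: an unlabeled image, an unlabeled input,
# and a properly labeled input.
page = """
<img src="seal.png">
<form><input type="text" id="case">
<label for="docket">Docket number</label><input type="text" id="docket"></form>
"""
checker = AccessibilityChecker()
checker.feed(page)
print(checker.report())  # two issues: the image and the "case" input
```

Real evaluators apply far more rules (and map them to Section 508 or WCAG criteria), but the principle is the same: mechanical inspection of markup for patterns that defeat assistive technology.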
Manual examinations by a blind evaluator of sites for the presence or absence of essential usability features like skip navigation and font size controls provided further tests. These are crucial usability features, particularly for people who use screen readers or navigate by keyboard only (e.g., vision- and motor-impaired users). Skip navigation links, for example, allow blind and visually impaired users to bypass the navigation bars on a Web site and proceed to main content. Skip navigation tools are similarly helpful to users who cannot use a mouse but rely on a keyboard. Finally, content analysis of existing Web site accessibility pages and policy statements provided a final layer of analysis.
Our goal is twofold. First, we wish to highlight the importance of the accessibility of courts to due process, a principle that is said to be foundational to American democracy. As Jaeger (2006) notes: “E-government is becoming an increasingly important part of the democratic process and of typical activities of citizens in the United States.” Jaeger and Thompson (2004) argue that limitations on such activities necessarily restrict participation in the democratic process. Courts are said to be the vanguard of the rule of law in this democracy; access to them is thus foundational in this context. Second, we wish to highlight the issue of the accessibility of federal e-government sites more generally.
Summary of results
Results of the automated Web accessibility evaluations reveal a narrow set of accessibility issues, including scripts with no accompanying functional text, images and server-side image maps with no text equivalents or descriptors, and inaccessible forms. The two examples in Figures 1 and 2 are illustrative.
Figure 1: Web site with an accessibility issue, Example 1. This site displays a feature that decreases its accessibility to people using screen reader software. The slide show numbers (highlighted above in yellow) are read by a screen reader as “1”, “2”, “3”. However, there is no indication as to what these numbers refer to when using a screen reader. Source: “Disability and the federal courts: A study of Web accessibility,” Federal Judicial Center (FJC) report (Corra, 2019), at https://www.fjc.gov/sites/default/files/materials/24/Disability%20and%20the%20Federal%20Courts.pdf.
Figure 2: Web site with an accessibility issue, Example 2. This site has a great deal of empty space that registers as linked artifacts to a screen reader. This leads screen reader users to think there is pertinent information that is simply not coded properly, when in fact there is nothing of value on that portion of the screen. Source: “Disability and the federal courts: A study of Web accessibility,” Federal Judicial Center (FJC) report (Corra, 2019), at https://www.fjc.gov/sites/default/files/materials/24/Disability%20and%20the%20Federal%20Courts.pdf.
Manual evaluation checks show that a high proportion (about 67 percent) of home pages provided skip navigation links for ease of access to main content areas, whereas smaller proportions provided direct or indirect links to accessibility information (about 15 percent and 12 percent, respectively), as well as controls for manipulating font size (about 12 percent). Notably, a sizeable proportion (about 45 percent) of home pages provided direct or indirect links to a “BrowseAloud” explanation and download page, apparently in lieu of information on accessibility. Finally, content analysis of existing Web site accessibility pages and policy statements demonstrates a high degree of variation, with some being exceptionally detailed and informative and others less so. The remainder of this article provides details on each of these areas of study.
Study details and results
In the months of October, November, and December 2018, the homepages of the 113 federal judiciary Web sites noted above were evaluated with the AChecker and Cynthia Says Web accessibility tools, using the Section 508 evaluation standards. Only the main homepages and related content were examined. These results are described in Tables 1 and 2. Manual checks for the presence or absence of essential usability features were subsequently conducted and recorded. Results of these checks are presented in Table 3. Data on existing accessibility policy statements were obtained and compiled during the manual examinations of Web sites. A content analysis of these accessibility policy statements is discussed later.
Results from automated evaluations
The data presented in Tables 1 and 2 represent automated evaluations of the accessibility of the 113 home pages and are presented in the form of summary statistics. Reported data include the total number of pages with each detected accessibility issue, percentage of pages with each issue, and total number of detected instances of each issue.
As can be seen in Tables 1 and 2, the two Web accessibility tools detected several accessibility issues, two of which were identified by both. Like AChecker, Cynthia Says identified several instances of scripts with no accompanying functional text, as well as forms with accessibility issues. Yet Cynthia Says detected a greater number of these instances than AChecker. The AChecker software, for example, detected 170 “known” instances of nontext elements with no text equivalents (see Table 1), whereas Cynthia Says detected 391 (Table 2). Similarly, while the AChecker software identified 65 “known” instances of forms with accessibility issues, Cynthia Says detected 67.
Table 1: AChecker Web assessment results.

| Accessibility issue | Number of home pages with issue | Percentage of home pages with issue | Total instances across all home pages |
| --- | --- | --- | --- |
| Non-text feature with no text descriptor | 34 | 30.91% | 182 |
| Active server-side image map region with no redundant text link | 1 | 0.93% | 1 |
| Script with no functional text identifier | 92 | 83.64% | 317 |
| Inaccessible forms | 35 | 31.82% | 66 |
Table 2: Cynthia Says Web assessment results.

| Home page accessibility issue identified | Number of home pages with issue | Percent of home pages with issue | Total frequency of issue across all home pages |
| --- | --- | --- | --- |
| Provide text equivalents to nontext elements | 91 | 81.98 | 391 |
| Provide readable documents without the need for an accompanying style sheet | 74 | 66.67 | 4,416 |
| Markups are used to associate data cells and header cells in data tables | 2 | 1.80 | 2 |
| Frames titled with text that identifies frames to facilitate navigation | 3 | 2.70 | 6 |
| Applets, plug-ins, or other applications on the user’s computer comply with the Section 508 guidelines for software products and Web sites | 87 | 78.38 | 708 |
| Provide accessible and navigable online electronic forms | 35 | 31.53 | 67 |
| Users are not timed out of applications/available option for more time | 1 | 0.90 | 1 |
Moreover, the AChecker Web accessibility tool indicated that of the 113 home pages, only nine were free of accessibility issues. These were the home pages of the U.S. district courts for Connecticut (http://www.ctd.uscourts.gov/), Guam (http://www.gud.uscourts.gov/), New Mexico (http://www.nmcourt.fed.us/web/index.htm), North Dakota (http://www.ndd.uscourts.gov/), the Eastern District of Kentucky (http://www.kyed.uscourts.gov/), the Northern District of Oklahoma (https://www.oknd.uscourts.gov/), the Northern Mariana Islands (http://www.nmid.uscourts.gov/), and the Southern District of West Virginia (https://www.wvsd.uscourts.gov/), plus the home page of the U.S. Court of Appeals for the Seventh Circuit (http://www.ca7.uscourts.gov/). This represents about eight percent of the Web pages evaluated. By contrast, the Cynthia Says Web accessibility tool indicated that of the 113 homepages, three were free of accessibility issues. These were the district courts for Guam (http://www.gud.uscourts.gov/), the Northern Mariana Islands (http://www.nmid.uscourts.gov/), and the Middle District of Alabama (http://www.almd.uscourts.gov/). This represents only about three percent of the Web pages evaluated.
Notably, the AChecker tool also detected “likely” issues, mainly in documents that were difficult to read without an accompanying style sheet. Cynthia Says detected several issues that AChecker did not identify as “known problems.” For example, Cynthia Says detected 4,416 issues associated with documents that were difficult to read without accompanying style sheets, two instances of data cells and header cells in data tables that were not identified by markup (inadequate labeling of data tables with headers for rows and columns), six instances of frames with no identifying text titles, and one instance of a site that timed out users without warning and/or did not provide an option for more time (see Table 2).
In summary, a key takeaway from the Web accessibility data (presented in Tables 1 and 2) is that both evaluation tools revealed several accessibility issues with the home pages included in this study. However, the AChecker software reported fewer instances of “known” issues than the Cynthia Says tool, which for any given issue also revealed a greater number and proportion of affected pages.
A plausible explanation for the divergence in findings between these two software applications is that Cynthia Says is the newer of the two evaluation tools. As such, the Cynthia Says tool may be more precise and, by extension, able to detect a greater number of issues than AChecker. As Tables 1 and 2 show, for each accessibility issue that both tools identified, the Cynthia Says software detected more instances than AChecker. Moreover, of the 113 home pages, the AChecker Web accessibility tool indicated nine were free of accessibility issues, whereas Cynthia Says identified three. Notably, both accessibility evaluators identified the sites for the U.S. district courts for Guam and the Northern Mariana Islands as being free of accessibility issues. In addition to the listed “known” issues, the AChecker software identified “likely” issues. This suggests less divergence in the evaluations of the two Web accessibility tools than might appear at first glance.
Taken together, the two Web accessibility tools revealed two key commonly detected issues. The first involved images and/or scripts with no accompanying identifying text. That is to say, graphics that were not labelled by providing “alternative text” (alternative text enables screen reader programs to communicate via synthesized speech information that is visually displayed on Web sites). Figures 1 and 2 are illustrative. The second dealt with forms with accessibility issues — that is, fillable online forms that might be difficult to navigate with a keyboard. Cynthia Says also detected other accessibility issues, including documents that were difficult to read without accompanying style sheets, data cells and header cells in data tables that were not identified by markups, and sites that timed-out users without warning and/or had no option for more time.
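As a concrete illustration of the alternative-text issue (hypothetical markup of our own, not drawn from any court site), the fragment below contrasts what a screen reader has available to announce for an unlabeled slide-show image versus a labeled one.

```python
import re

# Hypothetical markup: the first image exposes nothing for a screen
# reader to announce; the second exposes an accessible name via alt.
unlabeled = '<img src="slide1.png">'
labeled = '<img src="slide1.png" alt="Slide 1 of 3: courthouse exterior">'

def announced_name(img_tag):
    """Return the text a screen reader could announce for an <img> tag,
    or None when no alt attribute is present."""
    match = re.search(r'alt="([^"]*)"', img_tag)
    return match.group(1) if match else None

print(announced_name(unlabeled))  # None: nothing to announce
print(announced_name(labeled))    # the image's accessible name
```

The one-attribute difference is why automated tools single out missing alternative text: without it, synthesized speech simply skips or garbles the element.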
Yet, as with most studies, especially ones that evaluate a medium as volatile as the Web, there were limitations. At the outset, it must be noted that the results refer to data collected in the fall of 2018. Web sites are not static, but continually change. Because of a minor change in page construction, a Web page that is accessible today could become inaccessible tomorrow, and vice versa. The findings reported thus apply to the homepages during the period in which they were examined; many may have changed since then. Moreover, accessibility guidelines can be, and are, updated or “tweaked” periodically. In addition, as the findings of the Web accessibility tools above suggest, one software application may identify a given issue as a “likely” accessibility problem that another identifies as a “known” issue. Finally, software analysis tools like AChecker and Cynthia Says are only able to check for basic design errors. At issue here is the content of 113 different sites that were likely designed by a multitude of individuals.
It follows that the key takeaway here is that to have and maintain an accessible Web site requires continual awareness of accessibility issues. Existing technology that is now freely available on the Web makes the evaluation of sites a quick and simple process. Routine and periodic use of these tools will go a long way. One key source is the site of the Web Accessibility Initiative (https://www.w3.org/WAI/), which posts key developments on digital accessibility and emerging technology. Such “assistive” technology includes:
- Alternative mouse and keyboards;
- Text-to-speech software;
- Speech recognition and speech-to-text software;
- Captioning software; and,
- Screen magnifiers and readers (Tam, 2018).
Results from manual evaluations of the presence/absence of essential usability features
One simple test of the usability of a Web site is the presence or absence of essential usability features. Two such features are skip navigation and font size controls. Skip navigation links allow users to get to the main content area of a site without having to go through a whole series of links and sub-links. Font size controls similarly enable the user to increase or decrease the size of the font used on a site.

Each of the homepages was examined to see whether it (1) provided a direct link to accessibility information that was so designated/labelled; (2) provided an “indirect” link to accessibility information; and/or, (3) provided a link to download the screen reader tool, “BrowseAloud.” The direct links appeared under a variety of titles, including “ADA Accommodations,” “Accessibility,” “Request for Accommodation,” “Disability Access Information,” “Access Coordinator/s,” “Accessibility Information,” “Accommodations,” “Courthouse Accessibility Information,” and “Access/Visitors with Disabilities and Special Needs.”

By contrast, Web sites designated as having indirect links did not have clearly demarcated links to accessibility information as such. Closer examination of these sites revealed some to have links like “Court Info,” “Courthouse,” and “Courtroom Technology,” clicking on which took the reader to a link clearly demarcated as containing accessibility information. Other sites included in this category were so designated by searches of the main page with terms like “accessibility,” “accommodation,” “ADA Accommodation,” etc., that revealed subsidiary links with accessibility information.

Finally, several court homepages had a “BrowseAloud” link, clicking on which took the user to a “BrowseAloud Information” page with the following information: “BrowseAloud is a small computer program that works with Web browsers to listen to text that appears on a site as it is read aloud.
Once the required software is installed, a computer-generated voice reads aloud selected text, highlighting specific text as it is read. BrowseAloud is a free download and can be downloaded here [external link provided]. BrowseAloud can be a useful tool for persons who may have difficulty reading text online, or who have literacy problems or a learning disability such as dyslexia. It can also be helpful for persons with limited English language skills who may find it easier to listen to spoken text instead of reading the words on a page.” Notably, each home page that had the BrowseAloud link had this exact text and provided a BrowseAloud download link.
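The manual checks for essential usability features can also be approximated in software. The sketch below uses a heuristic pattern of our own devising (not the study's actual review procedure) to flag whether a page's HTML contains a skip navigation link, i.e., an in-page anchor labeled as skipping to content.

```python
import re

def has_skip_navigation(html):
    """Heuristic check: look for an in-page anchor (href="#...") whose
    link text reads like "Skip to (main) content"."""
    pattern = r'<a[^>]*href="#[^"]+"[^>]*>\s*skip(\s+to)?(\s+main)?\s+content'
    return re.search(pattern, html, re.IGNORECASE) is not None

# Hypothetical homepage fragments, one with and one without a skip link.
with_skip = '<body><a href="#main">Skip to main content</a><nav>...</nav>'
without_skip = '<body><nav><a href="/calendar">Court calendar</a></nav>'
print(has_skip_navigation(with_skip), has_skip_navigation(without_skip))
```

A production check would need to be more forgiving (other phrasings, markup variants, link position), which is one reason a human evaluator using a screen reader remains the more reliable test.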
Table 3 presents data on manual evaluations of the home pages of the federal judiciary for the presence or absence of essential usability features like skip navigation and font size controls. In addition, the table reports data on the presence or absence of a link to “BrowseAloud.” The BrowseAloud feature was incidental to the study (simply uncovered during the manual evaluations), but since it was discovered to be a common feature of a sizeable proportion of the homepages, it is reported here. Table 3 also reports the percentage of home pages with a direct link to accessibility information, as well as those with an indirect link.
Table 3: Manual assessments of the presence/absence of essential usability features.

| Measure | Percentage |
| --- | --- |
| Skip to main content | 66.25% |
| Direct link to accessibility | 15.04% |
| Indirect link to accessibility on homepage | 12.39% |
| BrowseAloud | 45.13% |
| Decrease/reset/increase font size | 11.50% |
As can be seen in Table 3, a sizeable proportion of the homepages provided a skip navigation link to the main content area of the page. About two thirds (67.25 percent) of the homepages provided such a link. By contrast, about one in nine (11.5 percent) provided a font control feature that allowed the user to manipulate the size of content.
Furthermore, about one in seven homepages (14.2 percent) had a direct link to accessibility information. Other home pages instead provided an indirect link to accessibility information; about one in eight (12.39 percent) did so. Taken together, about 27 percent of home pages had a direct or indirect link to accessibility information.
Moreover, about 45 percent of the home pages provided a direct or indirect link to an explanation and download page for “BrowseAloud.” Only six of the 16 home pages that had a direct link to accessibility information also had a link to BrowseAloud. At the time of review, these six were the homepages of the district courts for New Hampshire, New Mexico, and the Eastern District of Pennsylvania, and the Courts of Appeals for the Second, Eighth, and Eleventh Circuits. Similarly, only six of the 13 home pages that had an indirect link to accessibility information also had a link to BrowseAloud. At the time of review, these were the home pages of the district courts for Connecticut, New Jersey, Nebraska, the Northern and Southern Districts of Georgia, and the District of Columbia. All other home pages that had BrowseAloud (about 38) did not have a direct or indirect link to accessibility information. A key issue to note here is that the BrowseAloud software must first be downloaded before it can be used to access content.
Taken together, the manual examination of home pages for the presence or absence of essential usability features revealed at least three distinct patterns. First, a high proportion (about 67 percent) of the home pages provided skip navigation links for ease of access to main content areas. Second, smaller proportions of home pages provided direct or indirect links to accessibility information — about 15 percent for the former, and about 12 percent for the latter — as well as controls for manipulating font size (about 12 percent). Third, a sizeable proportion (about 45 percent) of home pages provided direct or indirect links to a “BrowseAloud” explanation and download page.
Results of content analysis of accessibility policy statements
Information on accessibility and accessibility policy statements was obtained in one of two ways. First, some home pages provided a direct link to accessibility information. Other pages provided an indirect link or links. Accessibility statements and related material were accessed via these links, reviewed, and analyzed. In total, 16 homepages had direct working links to accessibility information; 13 sites had indirect links.
Table 4 presents accessibility information obtained from the 29 home pages that provided such information via direct or indirect links. The table is organized around the presence or absence of seven major “themes”: (1) whether or not the page made any reference to the ADA; (2) if so, whether or not the page explicitly stated that the ADA does not apply to the federal judiciary; (3) whether or not the page provided information on the person or entity to contact for accessibility issues; (4) whether or not the page provided specific information on available accessibility features; (5) whether or not the page identified a specifically designated access coordinator(s) and gave information on such coordinator(s); (6) whether or not the information on accessibility made specific reference to “communication disability”; and, (7) whether or not the site made specific reference to Section 508.
Table 4: Results from content analysis of accessibility pages and statements.

| Content matter | Direct link: No | Direct link: Yes | Indirect link: No | Indirect link: Yes |
| --- | --- | --- | --- | --- |
| Referenced ADA | 8 (50%) | 8 (50%) | 12 (93%) | 1 (7%) |
| Explicitly noted that ADA does not apply | 10 (63%) | 6 (37%) | 12 (93%) | 1 (7%) |
| Provided point of contact for accessibility | 1 (6%) | 15 (94%) | 9 (67%) | 4 (33%) |
| Provided information on available accessibility features | 4 (25%) | 12 (75%) | 6 (46%) | 7 (54%) |
| Identified designated access coordinator(s) | 6 (38%) | 10 (62%) | 10 (77%) | 3 (23%) |
| Specific reference to “communication disability” | 8 (50%) | 8 (50%) | 9 (67%) | 4 (33%) |
| Specific reference to Section 508 | 14 (88%) | 2 (12%) | 13 (100%) | 0 (0%) |
The data in Table 4 show that seven out of 16 pages (about 44 percent) with direct links to accessibility information referenced the ADA. Five of those seven (71 percent) also explicitly noted that the ADA does not apply to the federal judiciary. To this, some also added (not reported in Table 4) that the court provided accommodations pursuant to Judicial Conference policy.
Importantly, a sizeable proportion of the 16 pages with direct links to accessibility information provided it on a variety of accessibility issues, including whom to contact for inquiries, available accessibility features, and contact information on designated access coordinator(s). Fourteen out of 16 pages (88 percent), for example, provided information on point of contact for accessibility inquiries. Likewise, 11 out of 16 pages (69 percent) provided information on available accessibility features. Finally, nine out of 16 (56 percent) identified a designated access coordinator(s). The designation of an access coordinator is one of the key criteria for determining the extent to which state and local courts are deemed “accessible.”
These are impressive statistics, to be sure. We note only that they refer to the 16 home pages with a direct link to accessibility information. Moreover, only one of the pages referenced Section 508, that of the United States Court of Appeals for the Second Circuit (http://www.ca2.uscourts.gov/nav/accessibility.html):
“The Second Circuit is committed to providing access to our Web pages for individuals with disabilities, both members of the public and federal employees. To meet this commitment, we will comply with the requirements of Section 508 of the Rehabilitation Act. Section 508 requires that individuals with disabilities, who are members of the public seeking information or services from us, have access to and use of information and data that is comparable to that provided to the public who are not individuals with disabilities, unless an undue burden would be imposed on us. Section 508 also requires us to ensure that federal employees with disabilities have access to and use of information and data that is comparable to the access to and use of information and data by federal employees who are not individuals with disabilities, unless an undue burden would be imposed on us. If the format of any material on our Web site interferes with your ability to access the information, due to an issue with accessibility caused by a disability as defined in the Rehabilitation Act, please contact the site webmaster for assistance [link provided]. To enable us to respond in a manner most helpful to you, please indicate the nature of your accessibility problem, the preferred format in which to receive the material, the Web address (URL) of the material with which you are having difficulty, and your contact information.”
For home pages with indirect links to accessibility information, we noted that only one referenced the ADA, and it also explicitly recognized the nonapplicability of the ADA to the federal judiciary (see Table 4). Moreover, four pages provided contact information for accessibility inquiries; three identified a designated coordinator(s) and provided contact information on the coordinator(s); and four referenced “communication disability,” limiting the information given to this form of disability. No such pages cited Section 508.
Discussion and conclusion
Using basic accessibility standards and site accessibility pages and statements, this study evaluated the accessibility of the homepages of the federal judiciary. Pages examined included those of the U.S. district, appellate and specialty courts, the Administrative Office of U.S. Courts, the Federal Judicial Center, and the main homepage of PACER, the federal judiciary’s e-filing and e-records access system. Three levels of data collection and analysis were employed. First, the Web accessibility evaluation tools AChecker and Cynthia Says were used for initial assessments of accessibility based on Section 508 guidelines. Accompanying the software evaluation were manual examinations of sites for the presence or absence of essential usability features like skip navigation and font size controls. Content analysis of existing accessibility pages and statements provided a final layer of analysis.
Results from the Web accessibility tools reveal two key issues. First, both tools detected images and/or scripts with no accompanying identifying text. Second, both tools located forms with accessibility issues, i.e., fillable online forms that might be difficult to navigate with a keyboard. Additional issues included documents that were difficult to read without accompanying style sheets; data cells and header cells in data tables that were not identified by markup; and sites that timed out users without warning and/or did not provide the option for more time.
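To make concrete the kind of check such tools perform, the following is a minimal, illustrative sketch in Python (not the actual logic of AChecker or Cynthia Says) that flags images lacking a text equivalent, the first issue class noted above:

```python
from html.parser import HTMLParser

class MissingAltFinder(HTMLParser):
    """Flags <img> tags that lack an alt attribute entirely.
    (An empty alt="" is left alone: it is correct markup for purely
    decorative images, which real evaluators treat separately.)"""

    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if "alt" not in attrs:
                # Record the image source so a reviewer can locate it.
                self.missing_alt.append(attrs.get("src", "(no src)"))

def find_images_missing_alt(html):
    finder = MissingAltFinder()
    finder.feed(html)
    return finder.missing_alt

sample = '<img src="seal.png"><img src="chart.png" alt="Case filings by year">'
print(find_images_missing_alt(sample))  # → ['seal.png']
```

Production evaluators apply many more Section 508/WCAG checks (forms, tables, timeouts), but each follows this same pattern of parsing markup and flagging elements that lack required accessibility attributes.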
Manual examinations of home pages, by contrast, revealed that a large proportion (about 67 percent) provided skip navigation links for direct access to main content areas. Likewise, a sizeable proportion (about 45 percent) of home pages provided direct or indirect links to a “BrowseAloud” explanation and download page. Smaller proportions of home pages provided direct or indirect links to accessibility information (about 15 percent and 12 percent, respectively), as well as font size controls (about 12 percent).
Finally, content analysis of home pages revealed that a very small percentage had clear and easily accessible information on accessibility. At most, 16 home pages (about 14 percent) had direct links to accessibility information, and only an additional 13 home pages (about 12 percent) provided indirect links. Moreover, for some of the home pages that provided accessibility information indirectly, that information could be obtained only by searching the sites with keywords like “ADA,” “accommodation,” and “accessibility.”
Taken together, these findings suggest that the detected problems cluster around a narrow set of accessibility issues. The two Web accessibility tools, for example, detected issues in seven key areas, including images and/or scripts with no accompanying identifying text, forms with accessibility issues, documents that are difficult to read without accompanying style sheets, data cells and header cells in data tables that were not identified by markup, and sites that timed out users without warning and/or did not give users the option for more time. Rather than indicating pervasive inaccessibility, these findings therefore point to a concentration of problems in a narrow and specific set of areas.
The federal judiciary is foundational to our democratic ideals and principles of governance. As such, understanding the workings of this co-equal branch of our government is of paramount importance. The federal judiciary is also the ultimate venue for judicial review; when access to these courts is limited, access to judicial review itself is arguably limited.
Digital courts and the electronic submission of pleadings and other materials are on the rise (Kaplan, 2013). Since court home pages are the primary, if not exclusive, gateway to these court functions, access to them is critically important. In addition, as noted above, relatively few studies have investigated electronic access to courts; studies of government Web sites that include court sites have tended to include only one or two judicial sites. This study, therefore, has several policy implications, notable among which is the need for equal access to courts and court material for persons with disabilities.
Some key implications for design
A key takeaway from this study is for Web designers to keep in mind the variability of design issues associated with access, and how problematic a “single fix for all” mindset can be. Consider, for example, the graph in Figure 1 above that we identified as having accessibility issues. That graph is a screenshot of a court site that we noted “displays a feature which decreases its accessibility to people using screen reader software.” The site presents a series of numbered slide shows with no descriptors associated with the numbers. Hence, the slides are read by a screen reader as “1,” “2,” “3,” and so on, with no indication of what the numbers refer to. A simple fix, of course, is to add descriptors to the slide show describing what the numbers represent and the slides themselves (here, we note that descriptors that simply state “click here” are the worst possible text for a link).
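The problem of uninformative link text can itself be checked mechanically. As an illustrative sketch (again, not part of the study’s own evaluation), a simple scan for generic labels like “click here” might look like this:

```python
from html.parser import HTMLParser

# Generic labels that tell a screen-reader user nothing about the target.
GENERIC_LABELS = {"click here", "here", "more", "read more"}

class GenericLinkFinder(HTMLParser):
    """Collects the visible text of each <a> element and flags
    links whose text is a generic label like 'click here'."""

    def __init__(self):
        super().__init__()
        self._in_anchor = False
        self._buffer = []
        self.generic_links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._in_anchor = True
            self._buffer = []

    def handle_data(self, data):
        if self._in_anchor:
            self._buffer.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._in_anchor:
            self._in_anchor = False
            text = "".join(self._buffer).strip().lower()
            if text in GENERIC_LABELS:
                self.generic_links.append(text)

def find_generic_links(html):
    finder = GenericLinkFinder()
    finder.feed(html)
    return finder.generic_links

sample = '<a href="/forms">Court forms</a> <a href="/a11y">Click here</a>'
print(find_generic_links(sample))  # → ['click here']
```

The fix flagged here mirrors the design advice above: link text should describe the destination (“Accessibility information”), not the act of clicking.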
Yet, a graph can have issues that go beyond the absence of a descriptor. Consider, for example, the second of the graphs presented earlier (Figure 2). That graph has the added issue of having a great deal of empty space that registers as linked artifacts to a screen reader, making it sound like the empty space has pertinent information.
The key takeaway here is that having and maintaining an accessible site requires continual awareness of accessibility issues. Continual testing of sites as an up-front design protocol, perhaps involving stakeholders, would go a long way. By “stakeholders” we mean users representing the multitude of types of disabilities.
Our evaluation of sites for the presence/absence of essential accessibility features points to another important design issue: designers should be mindful that barriers to access are as varied as the types of disabilities users have. This variability can exist even within categories of disability, e.g., the blind/visually impaired. Font size controls, for example, are very useful to individuals with some vision and therefore bear directly on the accessibility of a site. By contrast, a skip navigation tool is especially useful to a user who is totally blind, though it may be more a usability issue than an accessibility one. It follows that having both features on a homepage may be ideal.
The extent to which links are clearly demarcated, e.g., “accessibility information,” is another salient issue this study highlights. Sites providing indirect links to accessibility information, for example, did not clearly designate them as such; examples include links broadly labelled “courthouse,” “court info,” and “courtroom technology.” For a user seeking information on accessibility, these labels are so broad as to be useless. Admittedly, this is more a matter of usability than accessibility, but the usability of a site ultimately enhances its accessibility.
About the authors
Mamadi Corra is Graduate Director and a professor in the Department of Sociology at East Carolina University.
E-mail: corram [at] ecu [dot] edu
Ian McCandliss is a recent graduate of George Mason University.
The first author acknowledges support of the American Association for the Advancement of Science (AAAS) via a 2018-19 AAAS Science & Technology Policy Fellowship placement in the Judicial Branch program, support of the Gordon and Betty Moore Foundation that funds this fellowship, and the first author’s home university, East Carolina University, for accommodating the fellowship, during which research reported here was completed. We thank the Federal Judicial Center (FJC) for hosting the first author during this fellowship year, and Research Division Director Jim Eaglin for his excellent mentorship. Special thanks to Bob Edwards, Susan Pearce, and Sydney Johnson for valuable editorial assistance with the graphs, tables, and references.
1. Jaeger, 2006, p. 187.
2. The full report of this study is accessible at the site of the Federal Judicial Center at https://www.fjc.gov/content/343147/disability-and-federal-courts-study-web-accessibility or directly at https://www.fjc.gov/sites/default/files/materials/24/Disability%20and%20the%20Federal%20Courts.pdf.
3. Jaeger, 2006, p. 187.
4. While the actual Section 508 guidelines are more specific and technical in the explanations, summarizations of them include those noted in Tables 1 and 2; see, for example, Jaeger (2006).
5. These 16 were the homepages of the U.S. district courts for Maine, New Hampshire, Southern District of New York, Eastern District of Pennsylvania, Western District of Virginia, Northern District of Illinois, Western District of Missouri, Central, Eastern, and Northern Districts of California, Western District of Washington, Middle District of Alabama, and Middle District of Florida, plus the homepages of three circuit/appeals courts: those of the Second, Eighth, and Eleventh Circuits. A seventeenth link existed but was not working at the time and took users to a page with information unrelated to accessibility.
6. These 13 were the homepages of the U.S. Supreme Court; district courts for Connecticut, Delaware, New Jersey, Northern and Southern Districts of Iowa, Minnesota, Oregon, Eastern District of Washington, Northern and Southern Districts of Florida, and District of Columbia; and the U.S. Court of Federal Claims.
7. At least at the state and local levels, having an access coordinator(s) is one of the key criteria for determining whether or not a court/court system is accessible and in accord with the ADA. Of the federal judiciary Web sites that provided such information, some simply directed individuals to “contact the court ...” or “contact the clerk of court ...”, or some variant of these; others had designated access coordinators and identified them as such on the sites. Table 4 thus differentiates the two.
8. This is a curious delimiting that showed up in a number of Web sites; in most cases, it confined matters of accessibility to those related to hearing impairments, with the available accessibility features limited to that form of disability.
9. For a statement on why it is important to measure the accessibility of Federal e-government, see, for example, Jaeger (2003).
J. Anderson and L. Rainie, 2014. “Digital life in 2025,” Pew Research Center (11 March), at https://www.pewresearch.org/internet/2014/03/11/digital-life-in-2025/, accessed 12 August 2021.
K. Bleyer, K.S. McCarty, and E. Wood, 1995. “Jury service for people with disabilities,” Judicature, volume 78, number 6, pp. 273–275.
M. Charmatz and A. McRea, 2003. “Access to the courts: A blueprint for successful litigation under the Americans With Disabilities Act and the Rehabilitation Act,” University of Maryland Law Journal of Race, Religion, Gender, & Class, volume 3, number 2, at https://digitalcommons.law.umaryland.edu/rrgc/vol3/iss2/7/, accessed 12 August 2021.
M.K. Corra, 2019. “Disability and the federal courts: A study of Web accessibility,” Federal Judicial Center (FJC), at https://www.fjc.gov/sites/default/files/materials/24/Disability%20and%20the%20Federal%20Courts.pdf, accessed 12 August 2021.
R. Cress, J.N. Grindstaff, and S.E. Malloy, 2006. “Mental health courts and Title II of the ADA: Accessibility to state court systems for individuals with mental disabilities and the need for diversion,” Saint Louis University Public Law Review, volume 25, number 2, at https://scholarship.law.slu.edu/plr/vol25/iss2/6, accessed 12 August 2021.
J. Ellison, 2004. “Assessing the accessibility of fifty United States government Web pages: Using Bobby to check on Uncle Sam,” First Monday, volume 9, number 7, at https://firstmonday.org/article/view/1161/1081, accessed 12 August 2021.
doi: https://doi.org/10.5210/fm.v9i7.1161, accessed 12 August 2021.
J.C. Fagan and B. Fagan, 2004. “An accessibility study of state legislative Web sites,” Government Information Quarterly, volume 21, pp. 65–85.
doi: https://doi.org/10.1016/j.giq.2003.12.010, accessed 12 August 2021.
P. Hitlin, 2018. “Use of Internet, social media use and device ownership in U.S. have plateaued after years of growth,” Pew Research Center (28 September), at https://www.pewresearch.org/fact-tank/2018/09/28/internet-social-media-use-and-device-ownership-in-u-s-have-plateaued-after-years-of-growth/, accessed 12 August 2021.
P.T. Jaeger, 2006. “Assessing Section 508 compliance on federal e-government websites: A multi-method, user-centered evaluation of accessibility for persons with disabilities,” Government Information Quarterly, volume 23, number 2, pp. 169–190.
doi: https://doi.org/10.1016/j.giq.2006.03.002, accessed 12 August 2021.
P.T. Jaeger, 2003. “The importance of measuring the accessibility of the federal e-government: What studies are missing and how these issues can be addressed,” Information Technology and Disabilities, volume 9, number 1, at http://itd.athenpro.org/volume9/number1/jaeger.html, accessed 12 August 2021.
P.T. Jaeger and K.M. Thompson, 2004. “Social information behavior and the democratic process: Information poverty, normative behavior, and electronic government in the United States,” Library & Information Science Research, volume 26, number 1, pp. 94–107.
doi: https://doi.org/10.1016/j.lisr.2003.11.006, accessed 12 August 2021.
K. Kaplan, 2013. “Will virtual courts create courthouse relics?” (1 May), at https://www.americanbar.org/groups/judicial/publications/judges_journal/2013/spring/will_virtual_courts_create_courthouse_relics/, accessed 12 August 2021.
J. Lazar and A. Olalere, 2011. “Investigation of best practices for maintaining Section 508 compliance in U.S. federal Web sites,” In: C. Stephanidis (editor). Universal access in human-computer interaction. Design for all and inclusion. Lecture Notes in Computer Science, volume 6765. Berlin: Springer, pp. 498–506.
doi: https://doi.org/10.1007/978-3-642-21672-5_55, accessed 12 August 2021.
J. Lazar and K.-D. Greenidge, 2006. “One year older, but not necessarily wiser: An evaluation of homepage accessibility problems over time,” Universal Access in the Information Society, volume 4, pp. 285–291.
doi: https://doi.org/10.1007/s10209-003-0087-1, accessed 12 August 2021.
J. Lazar, P. Beavan, J. Brown, D. Coffey, B. Nolf, R. Poole, R. Turk, V. Waith, T. Wall, K. Weber, and B. Wenger, 2010. “Investigating the accessibility of state government Web sites in Maryland,” In: P.M. Langdon, P.J. Clarkson, and P. Robinson (editors). Designing inclusive interactions: Inclusive interactions between people and products in their contexts of use. London: Springer, pp. 69–78.
doi: https://doi.org/10.1007/978-1-84996-166-0_7, accessed 12 August 2021.
B. Linnell and C. Wieck, 2012. “Access to justice: The impact of federal courts on disability rights,” Federal Lawyer, volume 59, number 10, pp. 48–53, and at https://disabilityjustice.org/wp-content/uploads/FLDecember2012-Access-to-Justice-Article1.pdf, accessed 12 August 2021.
A. Lipowicz, 2010. “Government 2.0: Federal Web sites chided for accessibility, usability issues,” Washington Technology (28 July), at https://washingtontechnology.com/blogs/gov-2/2010/07/federal-web-sites-in-spotlight-for-accessibility-usability.aspx, accessed 12 August 2021.
E.T. Loiacono, S. McCoy, and W. Chin, 2005. “Federal Web site accessibility for people with disabilities,” IT Professional, volume 7, number 1, pp. 27–31.
doi: https://doi.org/10.1109/MITP.2005.1407801, accessed 12 August 2021.
A. Olalere and J. Lazar, 2011. “Accessibility of U.S. federal government home pages: Section 508 compliance and site accessibility statements,” Government Information Quarterly, volume 28, number 3, pp. 303–309.
doi: https://doi.org/10.1016/j.giq.2011.02.002, accessed 12 August 2021.
J.J. Prescott, 2017. “Improving access to justice in state courts with platform technology,” Vanderbilt Law Review, volume 70, pp. 1,993–2,050, and at https://repository.law.umich.edu/articles/1912, accessed 12 August 2021.
N. Rubaii-Barrett and L.R. Wise, 2008. “Disability access and e-government: An empirical analysis of state practices,” Journal of Disability Policy Studies, volume 19, number 1, pp. 52–64.
doi: https://doi.org/10.1177/1044207307311533, accessed 12 August 2021.
K.S. Schwartz, 2005. “Applying Section 5: Tennessee v. Lane and judicial conditions on the Congressional enforcement power,” Yale Law Journal, volume 114, number 5, pp. 1,133–1,175, and at https://www.yalelawjournal.org/note/applying-section-5-ligtennessee-v-lanelig-and-judicial-conditions-on-the-congressional-enforcement-power, accessed 12 August 2021.
G.N.L. Stowers, 2002. “The state of federal websites: The pursuit of excellence,” IBM Endowment for the Business of Government, at http://www.businessofgovernment.org/sites/default/files/FederalWebsites.pdf, accessed 12 August 2021.
D. Tam, 2018. “Digital accessibility for law firms: Why lawyers need to prioritize accessibility” (17 October), at https://www.biggerlawfirm.com/digital-accessibility-for-law-firms/, accessed 12 August 2021.
D. Udell and R. Diller, 2007. “Access to justice: Opening the courthouse door,” Brennan Center for Justice White Paper, New York University School of Law (16 April), at https://www.brennancenter.org/our-work/research-reports/access-justice-opening-courthouse-door, accessed 12 August 2021.
U.S. Department of Justice, 2015. “The evaluation tools,” at https://www.justice.gov/crt/evaluation-tools, accessed 12 August 2021.
World Wide Web Consortium, Web Accessibility Initiative, 2021. “Making the Web accessible,” at https://www.w3.org/WAI/, accessed 12 August 2021.
B. Wentz, P.T. Jaeger, and J. Lazar, 2011. “Retrofitting accessibility: The legal inequality of after-the-fact online access for persons with disabilities in the United States,” First Monday, volume 16, number 11, at https://firstmonday.org//article/view/3666/3077, accessed 12 August 2021.
doi: https://doi.org/10.5210/fm.v16i11.3666, accessed 12 August 2021.
Received 17 February 2021; revised 30 March 2021; accepted 6 August 2021.
Copyright © 2021, Mamadi Corra and Ian McCandliss. All Rights Reserved.
Disability and access to courts: Assessing the accessibility of U.S. federal judiciary homepages thirty years after the ADA
by Mamadi Corra and Ian McCandliss.
First Monday, Volume 26, Number 9 - 6 September 2021