Mental Disorders and Accessibility

Paget Michael Creelman (CC BY-SA 4.0)

~5,300 words

Summary

A look at how mental disorders, and those who have them, are underserved in tech. accessibility discussions.

Mental, behavioural and neurodevelopmental disorders are syndromes characterised by clinically significant disturbance in an individual’s cognition, emotional regulation, or behaviour that reflects a dysfunction in the psychological, biological, or developmental processes that underlie mental and behavioural functioning. These disturbances are usually associated with distress or impairment in personal, family, social, educational, occupational, or other important areas of functioning.
World Health Organization, ICD-11

Accessibility, in terms of both tech. in general and IT security more specifically, is a core consideration. I’ve talked before about how designing with accessibility in mind can benefit all kinds of users in ways one can’t possibly predict. However, one aspect of accessibility seems to receive limited attention: mental disorder.

In this short piece, then, I want to relate a couple of case studies I’ve encountered, to assess the current state of accessibility discussion with respect to such disorders and to briefly explore what ensuring accessibility for those with mental disorders might entail. In doing so, I’ll highlight some points of concern I have with the current draft of a key standard document.

Some quick caveats:

  • I have little direct experience with the W3C and this represents my first attempt to contribute to the standards definition process, so I may have missed relevant documents due to my unfamiliarity with their mailing lists, processes, etc.;1
  • I do not, as far as I know, have any personal experience of mental disorder; but
  • I do have experience as a consumer of their standards in a professional capacity; and
  • I (along with pretty much everyone else) have been both benefitted by Web accessibility efforts and disadvantaged by their absence.

Case Studies

Both of these case studies relate to people I have met: one briefly, and the other a long-term friend. I’m a technologist, not a psychiatrist, so I don’t claim to know what, if anything, the individuals may have been diagnosed with; my classifications are non-clinical, based on external symptoms and self-reporting.

A: Psychosis

A had had some sort of psychotic episode shortly before I met them; this seems to be a fairly regular occurrence. During the episode, A changed the password to their Microsoft account. Afterwards, they did not remember the password that they had set, and were unable to log into their Windows laptop.

They asked me if it was possible to get into their laptop. They didn’t have access to the verification email address, so I couldn’t reset the Microsoft account password; they also didn’t know the answers to their security questions, so I couldn’t reset the local user account password. A had to join a video call soon and told me to format the laptop, saying that all of their files were backed up elsewhere.

Having only just met this person, I had no way of knowing whether this was a rationally-made request or another episode. In the end I started the format process, but let A pull the trigger.

B: Anxiety

B has a chronic anxiety disorder that finds particular expression in relation to tech. B described the issue as being sledgehammered by thoughts that they are fully aware are irrational, but can do nothing about.

B described their tipping points as Not Knowing What Maybe Did or Maybe Didn’t Happen and But What If, and provided two specific examples.

Firstly, they described their process for sending an email:

Ah I’m not sure but I’m also not sure what the normal email procedure usually looks like

Like a simply reply email with no attachements is just

check draft text

check draft text

check draft text

go back to inbox

check no new replies to thread

check old replies on thread

check email addresses on old replies

check email addresses on draft

check draft text

check draft text

[Loop until moment of power/Just Feels Right]

[REPEAT process checking ‘Sent’ instead of ‘Drafts’]

B, Signal messages

Secondly, they described their problems with being unable to access detailed logs of system activity, or being shown things that cause alarm and having no way to further investigate them. For example, their experience with viewing their Google account data:

Fine and dandy for the corpos to say oOoOo just ask and I’ll give you all your data

Entirely unhelpful if it’s unmanageable

Case A: I cannot click these to see details on what they actually are or remove them:

[A screenshot of the Google Account ‘Chrome data in your account’ page, showing 0 Apps, 1 Extension, 82 Settings and 0 Auto-fills.]

B

Case B: mysterious group it tells me I joined in 2018 but I can no longer access and doesnt appear in my actual groups list, and therefore can’t remove myself from

Also idk if they change the name to some weird code/group number or whatever if it gets deleted and that’s why it’s weird but I don’t recognise the acronyms at all of that’s the real name

Case C: when I logged into my chronebook it downloaded a couple of apps I only vaguely remember ever using (assumedly on a very old computer) but I cannot find anywhere in settings etc. where to see the settings for this

I could rant about the Google acocunt menus forever

Accessibility and Mental Health

the issue with technological advances being basically omnipresent for the general population is that the general population includes the sick ones

And if they want to make seamless integration with life then it also interfaces the ugly parts

It’s probably just an extension of the world being inhospitable to the non able-bodied/minded

And honestly even worse in tech bc as much as I assume they do user testing and whatever, the people with enough competence to make tech. products will never understand the needs of the many uniquely problematic laypeople that suffer by being users

Bring me the professional programmer at google who has had a panic attack over some settings not being accessible or transparent

I suppose really because the whole definition of mental illness is about impairing your daily activities and tech is now so integrated into daily life then the venn diagram had to exist

But I can totally see new subgenres popping up as we go on

B, Signal messages

Despite their prevalence, mental disorders appear to be largely sidelined from accessibility discussions in tech. In this interview with the first-ever accessibility engineer at The Washington Post, Holden Foreman (for it is he) talks about expanding the scope of accessibility beyond just disabilities related to vision or hearing:

People with the exact same disability can have different resources, needs, and preferences. And issues like low internet bandwidth can correlate with other user demographics like geographic location. There are nuances specific to the accessibility space. Not everyone with a disability has access to the same technology. Screen reader availability varies by operating system. JAWS, one of the popular screen readers, is not free to use.

Whilst acknowledging that it’s essential to think about accessibility not just in the context of disability but also in the context of other inequities, and even publishing an alternate plain language version of his introductory blog post for users with lower reading literacy or cognitive disabilities, Foreman at no point mentions mental disorders. Even beyond the technologist sphere, this article claims that there’s been very little, if any, research done on the subject [of the intersection of OCD and the Internet].

The MDN Web Docs article on Cognitive accessibility briefly mentions mental disorders in its intro, but then never specifically references them again (although this is in part due to its focus on cognitive needs from the Overview section on). The Internet Society’s 2018 Digital Accessibility Guidelines mention mental health disorders only once, as ones that may affect the ability to remember and focus.

The EU’s Web Accessibility Directive does include mental impairments in its list of disabilities (copied from the UN Convention on the Rights of Persons with Disabilities definition), and ditto for the more-recent European Accessibility Act, but the supporting EN 301 549 standard only lists Usage with limited cognition, language or learning in its list of functional performance statements, again showing the focus on cognitive and learning disorders.

Searching for ‘accessibility’ and ‘mental disorder’ returns mostly academic papers about accessing mental health services in general, but I did find a handful about technological accessibility for those with mental disorders. One 2016 meta-review looked at 16 publications (3 of which were W3C guidelines) and concluded that there is little research on the barriers PwMD experience when using digital technology and facilitation measures used to address such barriers, highlighting that included barriers were related to neurocognitive dysfunction—impaired attention, processing and responding to information slowly and problem-solving—and none were associated with sociocognitive deficits—impaired affect regulation and difficulty processing emotional cues. Promisingly, though, they did find that more accessibility and usability research involving PwMD have been done in the last 5 years compared with previous times (though I can’t see much evidence that this trend has continued in the 7 years since the paper’s publication).

One organisation that does appear to be taking action on this blind spot is the W3C, which on its WAI page says that we’re updating several documents to better address the needs of people with cognitive and learning disabilities [and that] a current topic is mental health; they also have a Mental Health Subgroup under the Cognitive and Learning Disabilities Accessibility Task Force (COGA), which is currently conducting a literature review.

As a subset of tech. that nonetheless touches on just about every aspect, discussion of mental disorders within the security field specifically is (in my experience) almost entirely absent; for example, a recent NCSC blog post titled Accessibility as a cyber security priority didn’t make a single mention of mental disorders. That’s not to say that there is no discussion of mental disorders as experienced within the community (e.g., an excellent DEF CON talk from 2021, a report about the rate of disorders amongst IT security professionals and several discussions about whether one can get security clearance with a mental disorder). But, whilst I’m by no means a subject matter expert on this, I can’t think of a single time I’ve seen mental disorders discussed in the context of user accessibility when it comes to security solutions.2

In the rest of this section, I first want to look at the prevalence of mental disorders around the world. Then, I’ll describe the de facto standard guidelines for accessibility and how they address mental disorders. Lastly, I’ll explore the limitations of purely technical solutions, as well as highlighting some potential issues in the current draft of the future version of those guidelines.

Classification and Epidemiology of Mental Disorders

The WHO estimated in 2019 that 1 in 8 people (12.5%, or 970m) are currently living with a mental disorder, in comparison to 16% of people living with a physical disability (although those two categories can overlap).

Specific to the two case studies described above, they also estimate that 301m people were living with an anxiety disorder, whilst the National Institute of Mental Health reports that an estimated 15–100 people out of every 100,000 develop psychosis each year.

In terms of other conditions, this article includes a psychologist who states that the internet is involved in approximately 40 per cent of her clients’ OCD.

There is controversy regarding the classification of a proposed Internet addiction disorder (encompassing sub-conditions like pornography and gaming addiction). The overarching condition is not currently formally recognised, although the ICD-11 does define gaming disorder. The dispute revolves around whether the condition is a bespoke thing, or just an Internet-mediated manifestation of other conditions. The latter view is expressed in that same article:

Technology is not to blame. Before the internet, some people with OCD would compulsively phone telephone helplines or search library books, and Elizabeth herself says neither the internet nor smartphones are the problem. All I needed was my mind, she says.
Amelia Tait, It’s like stepping into the storm: How OCD can affect your online life

That said, they also note that the accessibility of the internet does exacerbate the problem, and another article calls it an emerging manifestation of OCD for which there is no name.3

WCAG 3.0

Approximately 60% of the world’s population have access to the Internet, and the majority of Internet traffic (particularly the kind that users knowingly generate and interact with) is Web traffic. As such, Web accessibility represents one of the most wide-reaching and impactful areas to focus on, and it is consequently also one of the most well-developed.

The Web Content Accessibility Guidelines (WCAG) are the gold-standard for Web accessibility. In their current 2.1 edition, the Guidelines provide a list of requirements for Web content grouped under the top-level requirements that such content must be Perceivable, Operable, Understandable and Robust in the face of user impairment. The Guidelines also provide means of assessing conformance, ranging from Levels A to AAA.

The WCAG are specifically cited by the US Department of Justice as a means of guaranteeing compliance with the ADA, which requires government bodies and some private businesses to make reasonable adjustments to ensure that their services are accessible to users with impairments. In the UK, the Equality Act 2010 imposes similar obligations, and this was later specifically extended to encompass Web and mobile applications for public sector bodies (equivalent to WCAG 2.1 AA compliance). As a result, inaccessibility can be expensive: US supermarket chain Target famously had to pay a $6m settlement for not including alt attributes for images on their Web site.

WCAG 2.2 is currently in draft stage, but will only represent a small change to add new criteria. More interestingly, WCAG 3.0 is also under development, and represents a complete overhaul of how assessments will be conducted and conformance assessed. Alongside other changes like using plain, rather than technical, language and renaming the project to W3C Accessibility Guidelines to reflect a broadened scope that encompasses all digital products including web, ePub, PDF, applications, mobile apps, and other emerging technologies, they have completely reworked the approach to both defining the guidelines and assessing oneself against them.

The 3.0 methodology revolves around the identification of functional needs, which can then be applied to specific topics (for example: contrast, forms, readability, and more) to identify the barriers experienced by people with disabilities; these are grouped into Essential, Sensory, Physical, Cognitive and Independence needs.

These functional needs are then grouped again into functional categories, which can be used when reporting test results. At the time of writing, there are 14 functional categories, including things like Vision and Visual, Mobility and Attention. Each category is divided into several functional needs, but the Mental Health functional category contains only a single functional need at present: use with debilitating fear or anxiety.
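
Schematically, the relationship between categories, needs and topics looks something like the following sketch (my own illustrative encoding in Python, not the Working Group’s actual data model; the needs shown are limited to those mentioned elsewhere in this piece):

```python
# Illustrative only: a rough model of the WCAG 3.0 draft's grouping of
# functional needs into functional categories, not an official data model.
functional_categories: dict[str, list[str]] = {
    "Executive": [
        "use with limited emotional control and self monitoring",
        "use with limited judgement",
    ],
    "Attention": ["use with limited ability to direct attention"],
    "Independence": ["use without autonomy or agency", "use without privacy"],
    # The Mental Health category currently holds a single need:
    "Mental Health": ["use with debilitating fear or anxiety"],
    # ...plus the other categories (Vision and Visual, Mobility, etc.).
}

def barriers(topic: str, category: str) -> list[str]:
    """Applying each functional need in a category to a topic (contrast,
    forms, readability, ...) identifies barriers users may experience."""
    return [f"{topic} / {need}" for need in functional_categories[category]]

print(barriers("forms", "Mental Health"))
# -> ['forms / use with debilitating fear or anxiety']
```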

However, it’s worth stressing that WCAG 3.0 is still in the very early stages of drafting.

WCAG 3.0 and Mental Disorders

The ICD-11 lists 19 basic categories of mental, behavioural or neurodevelopmental disorders, not including those classified in other top-level sections, such as sexual dysfunctions and gender incongruence. Of these, some, such as elimination disorders, are clearly of little relevance to Web accessibility, but others clearly are relevant.

By taking the functional needs approach of WCAG 3.0 and working through the disorders in ICD-11, we can identify some places where already-defined needs are sufficient. For example, use with limited emotional control and self monitoring and use with limited judgement under the Executive category could well be applied to a user affected by schizophrenia or other primary psychotic disorders or bipolar or related disorders as in the case of A above, as could use with limited ability to direct attention under Attention in the case of a user affected by attention deficit hyperactivity disorder. Also, conditions like dementia are addressed piecemeal by many defined needs, across almost every category.

However, this approach also reveals functional needs that are currently lacking. Whilst the existing needs do generally address the likes of anxiety or fear-related disorders and obsessive-compulsive or related disorders, there is nothing specifically for disorders due to addictive behaviours. In fact, there are a number of other disorders that could be covered under a single functional need, something along the lines of use without exacerbating pre-existing disorders, which would also address those with feeding and eating disorders, personality disorders and more.

The Limits of Technical Solutionism

I asked B how they felt about the potential for technological solutions to the issues they face when using services.

They suggested that recording detailed and easily-accessible logs (including colour-coding) would be helpful:

What would realllllyyyy help is devices having accessible history logs that exclude or colour code normal system behaviour/updates etc. and show you every notification received and every action taken on the device

And if Google had every single piece of account data readily accessible and changeable

B, Signal messages

Of course, like everything in accessibility, providing a means of data access and modification is just good practice, not to mention a legal requirement under the GDPR, but B was clear that this needs to be self-service:

I’m sure they theoretically fulfil all the tickboxes to comply with the law, and maybe I can email Google and say hi please help me find and change xyz but not being able to readily independently identify and edit your account settings is the reality and this is from someone who has been up to my ears diving into every menu and setting known to man. Some things are impossible to do. You should be able to do stuff without contacting customer support etc. but of course if they made it easy and obvious how to delete your data then more people would do it, and then they’d lose all their profitable profitable data.

also, even if you’re trying to be healthy and mitigate your responses, a common thing is to stop and look at the hard facts/evidence so you can separate what is reality from what’s your own warped brain jumps, to then address those separately. So if you take away full digital info accessibility or it feels like there’s hidden unknowable sections then it removes your ability to use a literal therapeutic coping technique. So where do you go from there when your tools don’t work?

B, Signal messages

So, by adhering to an agreed standard like the Syslog protocol and providing log and user data in an open format, developers can ensure that logs can be automatically colour-coded by the user’s visualisation tool of choice; this would benefit sysadmins just as much as the cripplingly anxious.
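
To make the shape of that concrete, here’s a minimal sketch (the JSON field names and colour scheme are my own illustration, not any particular standard’s): the producer emits one machine-parseable record per line, and the user’s viewer, not the producer, decides how each severity is presented.

```python
import json
import sys
from datetime import datetime, timezone

# Producer side: one JSON object per line, with a machine-readable
# severity field (an open format any downstream tool can parse).
def emit(severity: str, message: str, stream=sys.stdout) -> None:
    record = {
        "time": datetime.now(timezone.utc).isoformat(),
        "severity": severity,  # e.g. "info", "warning", "error"
        "message": message,
    }
    stream.write(json.dumps(record) + "\n")

# Consumer side: the *user's* viewer chooses the presentation, here by
# dimming routine events and highlighting alarming ones in colour.
ANSI = {"info": "\033[2m", "warning": "\033[33m", "error": "\033[31m"}
RESET = "\033[0m"

def render(line: str) -> str:
    record = json.loads(line)
    colour = ANSI.get(record["severity"], "")
    return f'{colour}{record["time"]} {record["severity"].upper()}: {record["message"]}{RESET}'

if __name__ == "__main__":
    emit("info", "Routine system update applied")
    emit("warning", "New device signed in")
    # A real viewer would read a log file; demonstrated inline here:
    print(render('{"time": "2023-01-01T00:00:00+00:00", "severity": "info", "message": "example"}'))
```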

The Google issues were more debilitating, with B stating that they had literally been the sole trigger of meltdowns, like by creating doubt/threat and then not allowing me to fully investigate or control the situation. The email sending process that B described was a more manageable issue, and they were less convinced by my proposal for an email send delay functionality:

You could easily add an extra middle step of repetition but for the pre-send buffer time message and I’m not sure it would be calming

That being said might be good exposure practice like you could define my email will be in this box for x minutes and that is enough time to check it and that’s all I’ll do and decrease the time gradually?

B, Signal messages

This is actually already a feature in some email clients. In 2008 Google specifically introduced the feature as Mail Goggles and pretty explicitly targeted it at drunk users, confirming my long-held theory that designing something with a drunk person in mind will cover about 80% of accessibility needs.
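
For illustration, here’s a minimal sketch of what such a send-delay buffer might look like, including B’s suggestion of gradually decreasing the hold time as exposure practice (all the names here are hypothetical, not any real client’s API):

```python
from dataclasses import dataclass, field
import time

@dataclass
class Outbox:
    """Holds messages for a fixed window before they are actually sent,
    giving the user a bounded, predictable period in which to re-check
    or recall them."""
    hold_seconds: float = 300.0   # e.g. start at 5 minutes...
    decrease_step: float = 30.0   # ...and shrink the window over time
    pending: list = field(default_factory=list)  # (send_at, message) pairs

    def queue(self, message: str) -> None:
        self.pending.append((time.monotonic() + self.hold_seconds, message))

    def recall(self, message: str) -> bool:
        """Pull a message back out of the buffer before it goes out."""
        for item in self.pending:
            if item[1] == message:
                self.pending.remove(item)
                return True
        return False

    def flush_due(self) -> list:
        """Release anything whose hold window has elapsed."""
        now = time.monotonic()
        due = [m for t, m in self.pending if t <= now]
        self.pending = [(t, m) for t, m in self.pending if t > now]
        return due  # a real client would hand these to its SMTP layer

    def shrink_window(self) -> None:
        """B's exposure-practice idea: gradually reduce the hold time."""
        self.hold_seconds = max(0.0, self.hold_seconds - self.decrease_step)
```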

When it comes to capacity issues, like those raised in A’s case, I think a period of time during which decisions can be reversed would be potentially beneficial, but this would come with a security trade-off and, if made optional and non-default, would require the user to activate it, and not to also disable it during an episode. That said, there is a pre-existing regime around service users who lack capacity, most prominently in the financial sector.4 Power of attorney is a well-established mechanism by which pre-selected representatives can veto important decisions deemed not to be in the best interests of the user. The WCAG 3.0 draft already addresses this under the Independence category, with the functional needs to use without autonomy or agency and use without privacy, which would require developers to incorporate ways for authorised agents to approve relevant decisions.
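
As a sketch of the mechanism (entirely hypothetical names, not a real API or the draft’s wording): a destructive action enters a pending state, and either the user or an authorised agent can veto it before the window closes.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

@dataclass
class PendingAction:
    """A destructive account action (password change, deletion, ...) that
    only takes effect after a reversal window, during which the user, or
    an authorised agent acting for them, can veto it."""
    description: str
    requested_at: datetime
    window: timedelta = timedelta(days=7)
    vetoed_by: Optional[str] = None

    def veto(self, who: str) -> None:
        self.vetoed_by = who  # the user, or a power-of-attorney agent

    def effective(self, now: Optional[datetime] = None) -> bool:
        now = now or datetime.now(timezone.utc)
        return self.vetoed_by is None and now >= self.requested_at + self.window

# Usage: a password change made during an episode can be undone afterwards.
action = PendingAction("Change account password",
                       requested_at=datetime.now(timezone.utc))
assert not action.effective()          # still inside the reversal window
action.veto("attorney@example.org")    # agent steps in; the change never lands
assert not action.effective(datetime.now(timezone.utc) + timedelta(days=8))
```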

Ensuring support for people in pre-existing power of attorney situations is one thing, and perhaps one could be legally assigned power of attorney over account credential changes or deletion requests under the current system; I don’t know, I’m not a lawyer. If such a system is desired, though, implementing it legally is probably better than implementing it technologically: the latter would be a form of accessibility that paradoxically requires one to limit accessibility in a user’s best interests, a technical analogue to the Liberty Protection Safeguards (formerly Deprivation of Liberty Safeguards) mechanism. As B said, this could risk infantilisation and associated issues. This may well be a cause for concern over the current wording of the functional need to use with limited judgement: I see a future of snake oil salesmen flogging AI-powered user second-guessing tools, Clippies for the modern era, no doubt tuned in short order to dissuade users from making judgement errors like requesting their personal data or deleting their accounts.

But, returning to the Washington Post interview referenced above, many business models inherently rely on exploiting some aspect of users to convince them that they need a company’s products, or to increase engagement. Makeup and clothing brands target people’s insecurities about how they look, luxury brands target people’s insecurities about how they live, and health brands target people’s fears about their own morbidity and mortality. For most people these things are manageable, part of day-to-day life, but each of them can also be turned up to 11 for people with mental disorders: targeting body dysmorphia can contribute to eating disorders; lifestyle dissatisfaction can result in depression; and services optimised for user engagement can tap into pre-existing OCD (amongst other disorders) to produce new avenues for self-harm.

Perhaps this is why Foreman made no mention of mental disorders in his interview: because the whole business model of a for-profit news service like the one he works on is generating reader engagement, which has the unavoidable side effect of over-engaging users who are prone to it. Even token efforts by such companies to promote more healthy usage patterns only tend to last until the first bad financial quarter.

Oh no: the problem was capitalism all along!

Many physicians, psychiatrists, psychologists, and other scientists believe that mental diseases have organic causes. Let me make clear that I do not contend that human relations, or mental events, take place in a neurophysiological vacuum. It is more than likely that if a person, say an Englishman, decides to study French, certain chemical (or other) changes will occur in his brain as he learns a language. Nevertheless, I think it would be a mistake to infer from this assumption that the most significant or useful statements about this learning process must be expressed in the language of physics.
Thomas S. Szasz, The Myth of Mental Illness: Foundations of a Theory of Personal Conduct

Mental health is a pretty controversial field of medicine. Some argue that mental disorders are sometimes (or always) best handled through a non-medical framework, and others deny their existence entirely, claiming them to be socially constructed, often as a form of social control. As I understand it, the DSM-5 itself is the result of an uneasy compromise between two rival approaches within the field. I’m by no means qualified to make any wide-ranging assessment on this topic, but what I do feel qualified to say is that the boundaries between mental disorders, the effects of socioeconomic conditions, natural personality variations and rational responses to unjust situations are not necessarily clear. However, a line has to be drawn somewhere, and it will necessarily be somewhat arbitrary.

Tied up in all that is one functional need currently defined in the W3C draft that worries me. Under the Essential category is the sole functional need use without harm or risk. Whereas all of the other needs address barriers to a user being able to interact with an application or service, this one is vague: is it talking about harm or risk to the user, or to other users (or society-at-large)? Or both?

This is a crucial distinction to make, and to make clearly, because the consequences would be significant. Accessibility requirements, as mentioned, are codified into law in several jurisdictions, and the WCAG are cited as a means of guaranteeing compliance. If the need is use without harm or risk to oneself, then that already causes difficulties around user agency. There is a functional need currently defined for users who lack agency (as I’ve already touched on), but nothing about ensuring agency: ensuring the right to use in a way of their choice, and make modifications where necessary.5

Oh no: the problem was unfree software all along!

If, however, the need is use without harm or risk to others, this becomes a case of introducing user restrictions and content moderation requirements through the backdoor of accessibility as, generally, the harms that a user may cause to another due to a mental disorder—be it antisocial behaviour, an impulse control disorder, a paraphilic disorder or otherwise—are already handled within the sphere of criminal law. There is no further clarity provided by the initial proposed set of guidelines, which contains the equally-ambiguous prevent harm (as well as the more-specific harm from motion, which seems clearly aimed at protecting the user).

Perhaps I’m oversensitised to this as someone from the UK, where the government only recently dropped plans to create a new legal category of legal but harmful speech and tried to impose a duty of care on platform providers that would have required them to remove it. The technically-literate community (both academic and commercial) seem to understand pretty uniformly that proposals like the UK Government’s are doomed to failure, and the more conspiratorial (or, perhaps, savvy) amongst us can see how such measures are ripe for abuse. Most relevant for the current discussion is that the UK Government’s attempt represents a fairly novel approach to introducing censorship and control measures through co-option of a pre-existing legal regime (that of the ‘duty of care’).6

As much as I might like to see a company like Meta made vulnerable to ADA lawsuits for its conscious efforts to exacerbate eating disorders, for example, I think accessibility guidelines would be better off limiting their scope to the enabling of diverse forms of access, and not user safeguarding. There is also a practical reason for this: the Online Safety Bill has faced a tortured two-year passage through the UK legislative system (or four, if you start counting from the publication of the Online Harms White Paper), and I think that the functional needs as currently drafted would either serve to delay or limit the adoption of the 3.0 guidelines.

That all said, there is one fairly radical new area that I think would be well within the scope of accessibility guidelines, but which is currently not mentioned. Again quoting the Washington Post accessibility engineer, the Guidelines should think about accessibility not just in the context of disability but also in the context of other inequities. I would like to see a new functional category added—let’s call it Technical—which would contain functional needs like use with limited Internet bandwidth,7 use with unreliable Internet connection, use with low-resource device and use with limited device storage. If this were to force every bloated SPA to provide a simplified alternative instead (think old.reddit.com or Gmail’s basic HTML version), then the world would be a brighter place. By grouping these all under a single heading, it would also be possible to mandate compliance with all but this category in cases where legislation only relates to disability accessibility.
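
As a sketch of what serving such a simplified alternative could look like server-side (a hypothetical minimal WSGI app; the Save-Data request header is a real client hint, and the prefers-reduced-data media query mentioned in the footnotes is its stylesheet-level counterpart):

```python
# Honour the 'Save-Data: on' request header by serving a lightweight
# alternative page instead of the full, bloated one.
from wsgiref.simple_server import make_server

FULL_PAGE = b"<html><!-- imagine a bloated SPA here --></html>"
LITE_PAGE = b"<html><!-- plain HTML, no megabytes of JavaScript --></html>"

def app(environ, start_response):
    # Browsers with data saving enabled send 'Save-Data: on'.
    save_data = environ.get("HTTP_SAVE_DATA", "").lower() == "on"
    body = LITE_PAGE if save_data else FULL_PAGE
    start_response("200 OK", [("Content-Type", "text/html"),
                              ("Vary", "Save-Data")])  # cache per variant
    return [body]

if __name__ == "__main__":
    with make_server("", 8000, app) as server:
        server.serve_forever()
```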

Conclusion

Fundamentally, I don’t think it is possible to make truly accessible software without making it free software, for the simple reason that no developer can ever predict or afford to personally support every possible use case. Additionally (and relatedly), I don’t think it is possible under current business models to make online services that will not be incentivised to exploit tendencies that will result in outsized harm to those with mental disorders.

But we should not let perfect be the enemy of good, and the pursuit of accessibility even within these constraints is still vitally important. The approach taken by WCAG 3.0 is good, but the functional needs to use without harm or risk and use with limited judgement as defined in the current draft need review, and potentially removal entirely. As for the guidelines-to-be that flow from the functional needs, I’ve not spent much time here analysing those, but the harm from motion guideline potentially points to a better, more targeted approach to guaranteeing use without (self-)harm than the problematically vague prevent harm. Similarly, the Mental Health functional category should be expanded to encompass other mental disorders and their sociocognitive effects that are not otherwise addressed by the existing functional needs; or perhaps the whole category could do with rethinking. And, lastly, technical accessibility needs should be considered, which would serve (amongst other things) as a proxy for financial accessibility needs.


  1. Update 2023-08-10: You can check out the thread I created on the WCAG 3.0 repo. for follow-up discussion. ↩︎

  2. I did once work on the security of a proposed new enterprise architecture for a big financial firm and we did talk a lot about ensuring diverse means of access for users with different needs, but this was a result of the financial sector connection, which I’ve touched on elsewhere in this piece. ↩︎

  3. The article also points out that the Internet can also make it easier to access care, as a hospital in Sweden is finding in its ongoing research on internet-based therapy for people with the disorder. ↩︎

  4. UK national treasure Martin Lewis set up the Money and Mental Health Policy Institute in 2016, which works to improve accessibility for people with mental disorders within the financial and utility sectors, amongst others. Their Mental Health Accessible standards don’t appear to be publicly available, but there is useful information to be found for our purposes in a 2018 report of theirs entitled Access Essentials: Giving people with mental health problems equal access to vital services; for example, Section One of the report details issues users face when it comes to account management and Section Two describes issues getting in touch with service providers, both of which were highlighted in B’s case study. ↩︎

  5. There is some ongoing work in this area in the form of the WAI-Adapt project, which hearkens back to the early days of CSS when Web site styling was expected to be the user’s responsibility. ↩︎

  6. That’s not to say that this co-option, itself, is novel. In countries like the US and UK such monitoring and censorship has traditionally been introduced through the national security rubric, but others have used everything from hate speech to the threat of disinformation and fake news to justify getting their way. ↩︎

  7. For this reason, I’m very excited for the prefers-reduced-data media query to take off. ↩︎