The cost of switching for those who are blind or low-vision
This content was originally published on Medium (March 16th, 2026) and has been adapted by The Honking Goose platform.
Imagine it’s Christmas Day: you and those closest to you exchange gifts that reflect each other’s interests. Eventually you’re handed a box and discover you’ve been given the new MacBook Neo. You set it aside with your other presents and keep enjoying time with your circle. You’ve never used a Mac before; you currently have a basic Windows laptop that runs Google Chrome just fine. If you can see, you will likely set up the MacBook visually and adapt within a few hours: the interface is intuitive, and its visual design invites you to explore and learn what works for you.
For people who are blind, the experience is very different. They rely on screen readers (software that converts on-screen content to audio), and they typically need a sighted person to set up the computer and configure the screen reader. There is no universal touch gesture or keyboard command that launches a guided tutorial for blind users, and while accessibility features exist, initial setup often assumes sighted assistance. Switching between operating systems can mean relearning how to use a computer from scratch, which creates a significant barrier for someone who cannot access the visual cues most people take for granted.
As a blind person, I wish learning to use a screen reader and memorizing all of its keyboard commands were easy. In reality it takes countless hours of training with a professional. I can’t see, so I rely on a screen reader, and I own an iPhone, an iPad, and a Windows workstation laptop (it’s more of a portable desktop, but the hardware is powerful enough that Windows never feels slow). When you’re blind or low-vision like me, you do not use a pointing device (a mouse, or a tablet and stylus); you control the computer entirely with keyboard commands. Most operating systems and applications fully support keyboard control, and your screen reader can take over when an application isn’t working for you.

The keyboard commands for Windows and its applications are unique to Windows, the keyboard commands for macOS are unique to macOS, and unfortunately even the keyboard commands on iPadOS are unique to iPadOS. The list could go on, but in short, the skills are not transferable. Every new platform means learning thousands of keyboard commands and figuring out its screen reader and utilities from scratch. My university requires Windows, and most jobs will require Windows as well. A small number of jobs might offer a Mac, but it’s uncommon. It’s hard to justify learning a device I won’t get to use in the workplace.
At home I have my much more accessible Apple iPad, which lets me visit all the websites and run all the apps I rely on. I am typing the draft of this article on an iPad in Microsoft OneNote, and I far prefer it over using a Windows computer. In many of my previous roles I was assigned both a Lenovo ThinkPad and an Apple iPad. The iPad was especially helpful when I was away from home and needed to complete on-call tasks: remote-desktopping from the iPad into a more powerful Windows machine let me work from anywhere without carrying a heavy laptop. While I do less technical work today, the apps I rely on, like Microsoft Word and OneNote, have native iPad apps. I also have Pythonista for writing and running basic Python programs and Juno for working with Jupyter Notebooks, which is helpful for scientific analysis. Having this software available as native apps rather than tools I run in a web browser is powerful and adds another useful layer to my workflows. Apple’s marketing line “Your next computer won’t be a computer” becomes more true with every investment Apple makes in the iOS platform.
Apple devices come with a screen reader called VoiceOver. It does a decent job of reading the contents of the screen and lets you navigate your iPhone with touch gestures and your iPad with gestures and keyboard commands. However, it does not fully replace the utility of screen readers on Windows. A screen reader is far more than a tool that reads the screen and offers a few extra keyboard shortcuts; it comes with dozens if not hundreds of utilities and small tools that make using a computer more productive. JAWS on Windows, for example, has AI features that make inaccessible apps and pages usable anyway: the AI Labeler works out how forms are supposed to be labeled and then presents the user a correctly labeled page. JAWS Scripting lets users write scripts to navigate difficult websites and apps. There is also an easy-to-use OCR tool that reads text from images (Apple can sometimes do this, but it’s not as advanced as JAWS); PictureSmart AI, which lets JAWS take an image file or even a screenshot and use AI to describe its contents; Tandem, which allows a sighted person to control your screen and help you with tasks; and the Virtual Cursor, which makes navigating a website or app feel like navigating a Word document. VoiceOver has nothing like this. These features place the usability of JAWS well above VoiceOver, and they are not optional, nice-to-have extras; they are essential to my being able to use my computer.
I work with a professional consultant every week to improve my computer use with JAWS. A lot of time has gone into learning to use Microsoft Windows and JAWS effectively. Switching to macOS and gaining full Apple integration would save me a lot of time; my current approach of copying things into Microsoft OneNote to shuttle them between my Apple and Windows devices is annoying, to say the least. What if Apple added a “Use Windows Keyboard Shortcuts” accessibility option to reduce the learning curve of macOS? It would reinforce Apple’s promise of an intuitive design that just works.
This post is not just about Apple. Google’s Chromebooks ship with a screen reader called ChromeVox, but it is not a full screen reader and would not give me full independence. Linux does not have full screen reader support yet: its screen reader, Orca, supports most of GNOME, but there is no easy way to change the voice without complex keyboard commands that, even with my background in software engineering and help from AI, I couldn’t get working. Linux cannot be taken seriously as an enterprise desktop operating system while it lacks basic accessibility tools. Microsoft Windows is the only platform that comes close to full accessibility support for blind people.
I think Apple is doing a great job with accessibility on iOS and its mobile platforms. For the most part, things just work there. But on macOS, full feature parity with Microsoft Windows and JAWS has not been achieved, which makes switching difficult or impossible depending on your degree of vision loss. Everyone is unique, and I want to avoid overgeneralizing where possible.
This is an industry-wide crisis, not an issue unique to Apple. Microsoft Windows or a mobile device should not be the only options available to blind and low-vision people. If anyone at Apple is reading this, please share it with the right team. I want to see more choices for blind and low-vision people, not fewer.