The professionals who cannot use cloud AI for writing

When most people talk about AI privacy concerns, they mean something vague. Data harvesting. Surveillance capitalism. A general unease about corporations knowing too much.
For a specific group of professionals, the concern is not vague at all. It is a legal and ethical obligation with real consequences for violation.
The problem with cloud AI for certain work
Every cloud AI writing tool works the same way. You type something; it leaves your device, travels to a server, gets processed, and comes back. The quality of the suggestion depends on what the model learned and how well it can predict what you were going to write next.
For most people, that trip to the server is invisible and inconsequential. For others, it is the thing that makes the tool unusable.
Lawyers
Attorney-client privilege protects communications between a lawyer and their client. The protection can be waived if the communication is shared with a third party. Courts are still working out exactly what counts as a third party when it comes to cloud services, but the direction of the analysis is clear. Sending client information to an external server creates a risk that most attorneys are not willing to take.
Bar associations have been issuing guidance on AI use since 2023. The consistent theme is due diligence on where client data goes. State bars in California, New York and Florida have all issued formal guidance requiring attorneys to understand how AI tools handle confidential information before using them.
In practice, this means a lot of lawyers who would benefit from writing assistance are not using it. They draft briefs, contracts and client communications the same way they did five years ago because the tools that work are the tools they cannot use.
Healthcare professionals
HIPAA created a detailed framework for what counts as protected health information and who can access it. Cloud AI services are not covered entities under HIPAA and do not sign business associate agreements by default. Sending patient information to a general-purpose AI writing tool is, depending on how you read the regulations, a violation.
Physicians write constantly. Referral letters, prior authorizations, patient communications and research notes. The documentation burden is one of the primary drivers of burnout in medicine. AI could help significantly. But most of what a doctor writes contains patient information, which means most of the obvious tools are off-limits.
Therapists and counselors
Therapy notes contain some of the most sensitive information that exists. Mental health disclosures, trauma histories and relationship details. The therapeutic relationship depends entirely on the client trusting that what they share stays between them and their therapist.
Therapists who write session notes, treatment plans or progress summaries using cloud AI are taking on compliance risk and, more fundamentally, are making a decision about their client's data that the client did not consent to. Most would not do this. So they write without assistance.
Consultants and corporate advisors
NDAs and confidentiality agreements cover most consulting engagements. A consultant writing a strategic analysis for a client has typically agreed not to disclose that client's information to third parties. Whether an AI writing assistant counts as a third party is a question most consultants are not willing to test in litigation.
Management consulting firms have issued internal guidance restricting or banning cloud AI use on client work. Individual consultants at smaller firms make the same calculation quietly, on their own, and end up in the same place.
What local AI changes
Local AI does not send your text anywhere. The model runs on your Mac. Autocomplete suggestions are generated on-device and never leave it. There is no server receiving your draft, no third party processing your client's information and no data to be subpoenaed or breached.
This is not a privacy feature layered on top of a cloud product. It is the architecture. The tool cannot send your data to a server because it was not built to do that.
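To make the architectural point concrete, here is a toy sketch of on-device autocomplete. This is not Typeahead's actual model; it is just a minimal bigram predictor built from the user's own text, showing that generating a suggestion can happen entirely in-process, with no network call and therefore no data leaving the machine.

```python
# Toy on-device autocomplete: a bigram model built from local text.
# Everything here runs in-process -- there is no server round trip,
# which is the architectural property the section describes.
from collections import Counter, defaultdict


def build_bigram_model(text: str) -> dict:
    """Count which word follows each word in the local corpus."""
    words = text.lower().split()
    follows = defaultdict(Counter)
    for current, nxt in zip(words, words[1:]):
        follows[current][nxt] += 1
    return follows


def suggest_next(model: dict, word: str):
    """Return the most likely next word, or None if the word is unseen."""
    counts = model.get(word.lower())
    if not counts:
        return None
    return counts.most_common(1)[0][0]


corpus = "the client agreed to the terms and the client signed the engagement letter"
model = build_bigram_model(corpus)
print(suggest_next(model, "the"))  # prints "client"
```

A real local assistant would use a neural language model rather than word counts, but the shape is the same: the corpus, the model, and the prediction all stay on the device.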
For the professionals above, that distinction matters. It turns writing assistance from a compliance risk into a professional tool.
Typeahead runs entirely on your Mac. Your writing never touches a server. All AI processing stays local. The only network activity is license activation and optional update checks.