Privacy Concerns in Health Tech for Seniors


I was interviewing a 72-year-old retired accountant who had unplugged his smart glucose meter. He explained that he “didn’t know who was looking at” his blood sugar data.
He was no stranger to technology: he had used computers successfully for decades during his career, and he was of sound mind. But when it came to his health device, he couldn’t find clear answers about where his data went, who could access it, or how to control it. The instructions were dense and the privacy settings were buried several menus deep. So he made what seemed like the safest choice: he unplugged it, forgoing the real-time blood sugar monitoring his doctor had recommended.
The healthcare IoT (Internet of Things) market is expected to exceed $289 billion by 2028, with older adults accounting for a significant share of users. These devices, from fall detectors and medication reminders to glucometers and heart rate trackers, enable independent living. Yet the gap between deployment and adoption is widening. According to an AARP survey, 34% of adults over 50 cite privacy as the biggest barrier to health technology adoption. That’s millions of people who could benefit from monitoring tools but avoid them because they don’t feel safe.
In my study at the Ritchie School of Engineering and Computer Science at the University of Denver, I surveyed 22 older adults and conducted in-depth interviews with nine participants who use health monitoring devices. The results revealed a critical technical failure: 82% of respondents understood security concepts such as two-factor authentication and encryption, but only 14% felt confident managing their privacy when using these devices. In my research, I also evaluated 28 healthcare apps designed for seniors and found that 79% of them lacked basic breach notification protocols.
One participant told me, “I know there is encryption, but I don’t know if it’s really enough to protect my data.” Another said: “The idea of my health data falling into the wrong hands is very concerning. I am particularly worried about identity theft or my information being used for fraud.”
This is not a problem of user knowledge; it’s an engineering problem. We’ve built systems that require technical expertise to operate safely, then put them in the hands of people managing complex health needs while dealing with age-related changes in vision, cognition and dexterity.
Measuring the Gap
To quantify issues related to transparency of privacy settings, I developed the Privacy Risk Assessment Framework (PRAF), a tool that evaluates health apps in five critical areas.
First, the regulatory compliance area assesses whether apps explicitly state their adherence to the Health Insurance Portability and Accountability Act (HIPAA), the General Data Protection Regulation (GDPR), or other data protection standards. It is not enough to claim compliance: they must provide verifiable evidence.
Second, the security mechanisms area evaluates the implementation of encryption, access controls and, most importantly, breach notification protocols that alert users when their data may have been compromised. Third, the usability and accessibility area examines whether privacy interfaces are readable and navigable for people with age-related visual or cognitive changes. Fourth, data minimization practices assess whether apps collect only necessary information and clearly specify retention periods. Finally, third-party sharing transparency measures whether users can easily understand who has access to their data and why.
When I applied the PRAF to 28 healthcare apps commonly used by older adults, the results revealed systemic gaps. Only 25% explicitly stated HIPAA compliance, and only 18% mentioned GDPR compliance. Even more alarming, 79% had no breach notification protocols in place, meaning users may never know if their data was compromised. Privacy policies averaged a 12th-grade reading level, even though research shows the average older adult reads at an 8th-grade level. No app included accessibility features in its privacy interface.
Think about what happens when an older person opens a typical health app. They are faced with a multi-page privacy policy filled with legal terminology about “data controllers” and “purposes of processing,” followed by settings scattered across multiple menus. One participant told me, “The instructions are hard to understand, the print is too small, and it’s overwhelming.” Another explained: “I don’t feel informed enough about how my data is collected, stored and shared. It seems like most of these companies are profit-driven and don’t make it easy for users to understand what’s happening with their data.”
When protection requires a manual that people can’t read, two outcomes ensue: They either ignore security entirely, thereby making themselves vulnerable, or they abandon the technology altogether, losing its health benefits.
Engineering for Privacy
We need to view trust as a technical specification, not a marketing promise. Based on my research findings and the specific barriers older adults face, three approaches address the root causes of distrust.
The first approach is adaptive security defaults. Rather than requiring users to navigate complex configuration menus, devices should come with preconfigured best practices that automatically adapt to data sensitivity and device type. A fall detection system does not need the same settings as a continuous glucose monitor. This approach is inspired by the principle of “safety by default” in systems engineering.
Biometric or voice authentication can replace passwords that are easily forgotten or written down. The key is to remove the burden of expertise while maintaining strong protection. As one participant said: “Simplified security settings, better educational resources, and more intuitive user interfaces will be beneficial.”
The second approach is real-time transparency. Users shouldn’t have to dig through settings to see where their data is going. Instead, notification systems should display each data access or sharing event in plain language, for example: “Your doctor accessed your heart rate data at 2 p.m. to review it for your next appointment.” A single dashboard should summarize who has access and why.
This addresses a concern that came up repeatedly during my interviews: users want to know who sees their data and why. The engineering challenge here is not technical complexity, but designing interfaces that convey technical realities in a language that everyone can understand. Such systems already exist in other areas; banking apps, for example, send immediate notifications for every transaction. The same principle applies to health data, where the stakes are arguably higher.
The third approach is invisible security updates. Manual patching creates windows of vulnerability. Automatic, transparent updates should be standard for any device managing health data, coupled with a simple status indicator so users can confirm protection at a glance. As one participant said: “The biggest problem we face as older people is the fact that we don’t remember our passwords…New technology is outpacing older people’s ability to keep up.” Automating updates removes a significant source of anxiety and risk.
What Is at Stake
We can continue to build healthcare IoT the way we have been: fast, feature-rich and fundamentally untrustworthy. Or we can design systems that are transparent, secure and usable from the start. Trust is not something that is marketed through slogans or legal notices; it is something you build, line by line, in the code itself. For older adults who rely on technology to maintain their independence, this kind of engineering matters more than any new feature we might add. Every unplugged glucometer, every abandoned fall detector, every health app deleted out of confusion or fear represents not only a lost sale but a missed opportunity to support a person’s health and independence.
The privacy challenge in healthcare IoT goes beyond repairing existing systems: it requires reimagining how we communicate privacy itself. My ongoing research builds on these findings with an AI-powered Data Helper, a system that uses large language models to translate dense legal privacy policies into short, accurate, senior-friendly summaries. By making data practices transparent and comprehension measurable, this approach aims to turn compliance into understanding and trust, advancing the next generation of trusted digital health systems.