A unique design challenge
AI has changed so many parts of our lives, for better and for worse. Beyond the uses we see every day, it’s also quietly serving groups that don’t always get much attention.
There are over 1 billion people aged 60 and older, and over 55 million people living with dementia worldwide. My own family and many people around me have had to take care of an elderly loved one. It’s never easy. It takes constant attention, patience, time, and energy.
With all the advances in technology, could AI actually take some weight off a caregiver’s shoulders, and do it ethically?
What makes this user group different
Older adults aren’t just “users with accessibility needs.” They represent a fundamentally different design paradigm, and that means a lot of the frameworks designers use by default don’t quite fit.
They are afraid of making mistakes
A 2025 systematic review of 132 studies on age-friendly mobile app design highlights several barriers: cognitive overload from complex interfaces, fine motor difficulties that make small touchscreens unusable, and, most importantly, a fear of making errors. Unlike younger people who grew up with technology, older users are more cautious and genuinely afraid they'll break something. That fear can turn into serious anxiety and outright resistance.
They might not be the ones using it
A lot of the time, the resident is the one being assessed, but the caregiver is the one holding the device and reading the screen. That means you’re designing two different experiences at once: one for clinical accuracy and speed, one for comfort and dignity. HCI researchers have recently started pushing for technologies where the person with dementia is “an active rather than a passive user of technology in the management” of their own care, but there isn’t a clear picture about what it will look like yet.
This also creates a consent problem. Someone with moderate-to-severe cognitive impairment can’t really read a privacy policy, can’t fully understand what an audio or visual monitor is doing, can’t meaningfully consent to having their face scanned. So how do you design ethically for someone who can’t choose whether to be designed for?
They might actively resist the product
Compared to other age groups, older adults tend to show the lowest levels of trust in technology. From watching my own family, I also see that when someone is dealing with chronic illness like dementia, anything unfamiliar can spike anxiety or even anger. Wearable fall detectors get taken off or switched off all the time. Anything that looks or feels like surveillance can make them feel extremely uncomfortable and vulnerable.
With most products, the design challenge is to get people to engage. However, with this group, the challenge is more like getting them to tolerate the thing existing in their space. Nielsen Norman Group has noticed something similar: even tech-savvy seniors are quick to uninstall apps they find annoying and delete accounts that feel intrusive. In a care setting, where the product usually isn’t their choice, the pushback can be even stronger.
They’re hard to co-design with
“Involve your users in the design process” is the principle many designers follow. But researchers at JMIR found that co-design sessions with cognitively impaired older adults are extremely challenging. Participants get tired quickly, lose interest, struggle with the hardware, and sometimes can’t hear spoken prompts clearly. Stuff you’d never think about (e.g., glare on a screen, a chair without armrests, ambient noise in the room) can completely derail a session.
Problems and current AI solutions
These constraints push design teams to get more creative. There are already several AI products tackling the biggest issues: undetected pain, falls, loneliness.
Undetected pain
Studies show that pain is drastically underdiagnosed in people with cognitive impairment. Traditional pain assessments rely on a caregiver watching the patient’s face and body language, which is not only subjective and inconsistent, but also vulnerable to racial and gender bias in how expressions get read. Caregivers also can’t monitor every moment, which means pain can go unnoticed for hours.
The product
PainChek, developed in Australia, is an app built to detect pain in people who may not be able to report it themselves. A caregiver records a short video of the resident’s face, and the AI analyzes the tiny muscle movements tied to pain expressions. The caregiver then works through guided checklists across five other areas (voice, movement, behavior, activity, body), and the app combines machine analysis and human observation to generate an overall pain score.

According to MIT Technology Review, a dementia care chain in England that adopted PainChek saw psychotropic prescriptions drop and residents calm down. Turns out a lot of what was being treated as “behavioral issues” was actually unaddressed pain.
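To make the hybrid idea concrete, here is a minimal sketch in code of how an automated facial-analysis count might be combined with caregiver checklist observations into one score. The domain names follow the article; the additive rule, the weights, and the severity cut-offs are invented for illustration and are not PainChek’s actual algorithm.

```python
# Hypothetical hybrid pain score: a model-derived count of pain-related
# facial movements plus caregiver-observed indicators across other domains.
# Scoring rule and severity bands are illustrative assumptions only.
from dataclasses import dataclass, field

DOMAINS = ["voice", "movement", "behavior", "activity", "body"]

@dataclass
class PainAssessment:
    face_indicators: int                          # flagged by the facial-analysis model
    checklist: dict = field(default_factory=dict) # caregiver indicators per domain

    def total_score(self) -> int:
        # Simple additive rule: each observed indicator contributes one point.
        return self.face_indicators + sum(self.checklist.get(d, 0) for d in DOMAINS)

    def severity(self) -> str:
        # Illustrative cut-offs only; a real tool validates its bands clinically.
        score = self.total_score()
        if score <= 6:
            return "no pain"
        if score <= 11:
            return "mild"
        if score <= 15:
            return "moderate"
        return "severe"

assessment = PainAssessment(
    face_indicators=4,
    checklist={"voice": 2, "movement": 3, "behavior": 1, "activity": 0, "body": 2},
)
print(assessment.total_score(), assessment.severity())  # prints: 12 moderate
```

Even in this toy version, the design tension is visible: every threshold is both a technical and an ethical choice about whose pain gets flagged.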
Design/technical challenge
The hybrid system is designed to improve speed and accuracy by combining machine objectivity and human clinical judgment. However, both human and machine can introduce bias. For the human part, one Australian study found Aboriginal and Torres Strait Islander residents were getting lower total pain scores than non-Indigenous residents. For the machine part, facial recognition trained on narrow datasets can misread people from minority groups. How to balance AI and human roles for more accurate results remains an open question.
Worth thinking about
This is a case where the user isn’t the direct operator, but their well-being relies heavily on how accurately someone else uses the tool. Every design choice (binary vs. scale, automated vs. manual) becomes an ethical choice.
There is also a risk that clinicians start relying on the score too heavily, instead of trusting their own experience. A caregiver who knows the patient’s history can bring insights an algorithm can’t.
Falls
Falls are one of the leading causes of unplanned hospital admissions among older adults. Older fall-detection tools (e.g., pendant alarms, wristbands) usually require the person to press a button after a fall, which many can’t manage once they’re down. Many residents refuse to wear them at all because the devices make them feel watched, and many older people simply aren’t used to wearing something on their body all day.
Design challenge
The easy solution to this problem is “just use cameras.” But for some older adults, especially those who don’t trust technology, cameras make them feel watched and powerless. So the challenge is designing something seniors actually feel comfortable having around.
The products
Nobi, a Belgian company, makes AI-powered ceiling lamps that detect falls and keep an eye on sleep and behavior. The U.S. head of the company said their goal is making something “a resident would want in their apartment,” so it’s designed to look attractive and blend into the environment. The lamp, trained on over 250,000 real-life scenarios, uses optical sensors to detect falls. Once a fall is detected, it verbally asks whether the resident is okay, then notifies staff. It can also shift its lighting to match circadian rhythms, dimming at night and brightening in the morning, to help with disorientation and reduce the risk of falls.

Then there is Sensi.AI, which uses a different approach. It’s an always-on audio device — no cameras, no wearables, no screen or interface for the senior to interact with. It just sits in the background and listens for patterns: coughing, tone of voice, movement sounds, changes in eating or sleep. The AI interprets what it hears and sends insights to caregivers.
Worth thinking about
Invisible, background products might be more acceptable for people who don’t want to feel monitored, or who don’t have the tech literacy to use an interface. But they can become so invisible that users forget they exist, which makes privacy, data collection, and data handling all the more important.
Nobi, the AI lamp, tried to address this problem. All image processing happens locally in the lamp. Nothing goes to the cloud. The system only sends events, not raw footage. When imagery does get captured, residents show up as anonymous stick figures. Footage gets deleted automatically unless a fall is detected, and even then it’s kept for a maximum of 14 days.
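The “events, not footage” model described above can be sketched as a small policy: analyze frames locally, transmit only event metadata, and enforce the retention window. All class and function names here are hypothetical illustrations under those assumptions, not Nobi’s actual software.

```python
# Illustrative sketch: frames are analyzed on-device, only event records
# leave the device, and retained clips expire after 14 days.
from datetime import datetime, timedelta

RETENTION = timedelta(days=14)

def detect_fall(frame) -> bool:
    # Stand-in for the on-device vision model; here a "frame" is just a label.
    return frame == "fall"

class OnDeviceMonitor:
    def __init__(self):
        # clip_id -> capture time; populated only when a fall is detected.
        self.retained_clips = {}

    def process_frame(self, frame, now: datetime):
        if not detect_fall(frame):
            return None  # nothing retained, nothing transmitted
        clip_id = f"fall-{now.isoformat()}"
        self.retained_clips[clip_id] = now
        # Only this event record is sent to staff, never the imagery itself.
        return {"type": "fall", "clip": clip_id, "time": now.isoformat()}

    def purge_expired(self, now: datetime):
        # Enforce the retention window: drop anything older than 14 days.
        self.retained_clips = {
            cid: t for cid, t in self.retained_clips.items() if now - t < RETENTION
        }
```

The point of the sketch is where the boundary sits: the raw frame never appears in anything the `process_frame` method returns, so the transmitted surface is events only by construction.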
Loneliness
Many nursing home residents go days without a meaningful conversation. Some can’t physically or mentally get out of their bed to meet people. Social isolation is seriously dangerous for older adults. Research links it to higher rates of dementia, depression, heart disease, and early death. In the U.S., nearly 30% of seniors live alone.
A 2024 scoping review of human-centered design for loneliness tech makes an important point: loneliness in later life has its own causes, such as losing a spouse, moving into a care facility, losing mobility. We can’t design through the lens of the loneliness younger people feel.
Design challenge
For mainstream users, companion products tend to look like cute pets or things you’re meant to bond with. But when researchers asked older adults what they wanted, they said they didn’t want a toy or a pet. They wanted a stationary object that noticed they were there, started conversations, and offered support. They also said they didn’t want to feel emotionally dependent on it, or like it was trying to replace a human relationship.
There is also an engagement problem. Voice assistants like Alexa, Siri, and Google Home wait for a wake word. However, this doesn’t work well for isolated older people, because they’re unlikely to initiate interaction themselves. So the designer has to figure out a way for the product to start the conversation itself.
The product
ElliQ tracks behavior patterns, sentiment, time of day, and conversation history. With all that data, it decides when to check in, suggest an activity, offer some trivia, or just say good morning. The product is designed to look like a friendly object rather than a humanoid robot or cute pet.
According to the design team, the hardest part was dialing in the right amount of proactivity. In early beta testing, some users got startled by how often it jumped in. The team spent years tuning how often it initiates, what it suggests, and when to just back off. In a New York State pilot with over 800 older adults, 95% reported feeling less lonely, and users were interacting with it more than 30 times a day.
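The tuning problem the team describes, deciding when to initiate and when to back off, can be sketched as a simple policy: respect quiet hours, and widen the gap between prompts when the user keeps ignoring them. The rules and thresholds below are invented for illustration and are not ElliQ’s actual logic.

```python
# Hypothetical proactivity policy: quiet hours plus exponential back-off
# when prompts go unanswered. All thresholds are illustrative assumptions.
from datetime import datetime, timedelta

class ProactivityPolicy:
    def __init__(self, min_gap=timedelta(hours=1)):
        self.min_gap = min_gap      # never prompt more often than this
        self.last_prompt = None
        self.ignored_streak = 0     # consecutive prompts with no response

    def should_check_in(self, now: datetime) -> bool:
        if not (8 <= now.hour < 21):   # stay quiet overnight
            return False
        # Double the required gap for each consecutive ignored prompt.
        gap = self.min_gap * (2 ** self.ignored_streak)
        if self.last_prompt is not None and now - self.last_prompt < gap:
            return False
        return True

    def record_prompt(self, now: datetime, responded: bool):
        self.last_prompt = now
        self.ignored_streak = 0 if responded else self.ignored_streak + 1
```

Even this toy version shows why the tuning took years: every parameter (quiet hours, base gap, back-off rate) encodes a guess about when company becomes intrusion.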

The AI-in-elder-care market was worth $56.78 billion in 2025, and it’s growing more than 21% a year. That number will only climb as the world ages and staffing shortages persist. At this point, using technology as part of the solution is inevitable.
But it’s important to remember these aren’t just numbers. The elderly are a vulnerable group, and sometimes they can’t speak up for themselves. That’s why we need to keep asking the harder questions: Is this ethical? Is it replacing something that actually matters in a human life? Take ElliQ’s 95% loneliness-reduction stat. The number is impressive, but even so, the technology should be a supplement, not a replacement. As one AI industry report put it: “AI cannot replace human interaction, which is of the utmost importance to the elderly as loneliness is a negative influence on cognitive decline.” Pain detection is similar. AI can flag it, but a clinician’s experience and judgment are often what’s needed to actually make the right call.
It’s also worth asking whether technology might amplify the very problem it’s trying to solve. Will institutions use these tools as a reason to hire fewer staff and cut costs, making short-staffing even worse? And most of these products are still in early stages. PainChek has a relatively solid evidence base, but the studies still rely on limited sample sizes and specific populations.
At the end of the day, we’re designing for people in one of the most fragile chapters of their lives. That deserves care, gentleness, and rigorous study, maybe more than ever.
Reference:
Alzheimer’s Disease International. (n.d.). Dementia statistics. https://www.alzint.org/about/dementia-facts-figures/dementia-statistics/
Amouzadeh, E., Dianat, I., Faradmal, J., & Babamiri, M. (2025). Optimizing mobile app design for older adults: Systematic review of age-friendly design. Aging Clinical and Experimental Research, 37, 248. https://doi.org/10.1007/s40520-025-03157-7
Brett, K., & Severn, M. (2023). Facial analysis technology for pain detection: A potentially useful tool for people living with dementia (CADTH Horizon Scan Report No. EN0047). Canadian Agency for Drugs and Technologies in Health. https://www.ncbi.nlm.nih.gov/books/NBK595121/
Care Innovation Summit. (n.d.). AI fall prevention with Nobi lights. https://careinnovationsummit.co.uk/ai-fall-prevention-with-nobi-lights/
CBRE. (n.d.). How AI is being adopted across the care homes sector. https://www.cbre.co.uk/insights/articles/how-ai-is-being-adopted-across-the-care-homes-sector
Fuseproject. (n.d.). ElliQ — AI companion for elder care. https://fuseproject.com/case-studies/elliq/
Ghosh, R., Khan, N., Migovich, M., Tate, J. A., Maxwell, C. A., Newhouse, P. A., Scharre, D. W., Tan, A., Mion, L. C., & Sarkar, N. (2025). Engaging older adults and staff in the co-design and evaluation of socially assistive robot and virtual reality activities for long-term care: User-centered study. JMIR Aging, 8, e75288. https://doi.org/10.2196/75288
InsightAce Analytic. (2026). AI in aging and elderly care market. https://www.insightaceanalytic.com/report/ai-in-aging-and-elderly-care-market/2696
Intuition Robotics. (n.d.). ElliQ. https://elliq.com/
McKnight’s Senior Living. (n.d.). Next-gen lamps combine smart lighting and passive monitoring for fewer senior falls incidents. https://www.mcknightsseniorliving.com/news/next-gen-lamps-combine-smart-lighting-and-passive-monitoring-for-fewer-senior-falls-incidents/
Mousa, D. (2025, October 15). AI is changing how we quantify pain. MIT Technology Review. https://www.technologyreview.com/2025/10/15/1125116/ai-is-changing-how-we-quantify-pain/
New York State Office for the Aging. (2023, August 1). NYSOFA’s rollout of AI companion robot ElliQ shows 95% reduction in loneliness. https://aging.ny.gov/news/nysofas-rollout-ai-companion-robot-elliq-shows-95-reduction-loneliness
Nielsen Norman Group. (n.d.). Usability for senior citizens. https://www.nngroup.com/articles/usability-for-senior-citizens/
Nobi. (n.d.). Product. https://www.nobi.life/en/product
PainChek. (n.d.). PainChek® universal pain management app. https://www.painchek.com/
Probst, F., Ratcliffe, J., Molteni, E., Mexia, N., Rees, J., Matcham, F., Antonelli, M., Tinker, A., Shi, Y., Ourselin, S., & Liu, W. (2024). A scoping review on human-centered design approaches and considerations in the design of technologies for loneliness and social isolation in older adults. Design Science, 10, e28. https://doi.org/10.1017/dsj.2024.22
Sensi.AI. (n.d.). Home. https://www.sensi.ai/
Wilson, M., Doyle, J., Turner, J., Nugent, C., & O’Sullivan, D. (2024). Designing technology to support greater participation of people living with dementia in daily and meaningful activities. Digital Health, 10, 20552076231222427. https://doi.org/10.1177/20552076231222427
Wilmink, G., Dupey, K., Alkire, S., Grote, J., Zobel, G., Fillit, H. M., & Movva, S. (2020). Artificial intelligence–powered digital health platform and wearable devices improve outcomes for older adults in assisted living communities: Pilot intervention study. JMIR Aging, 3(2), e19554. https://doi.org/10.2196/19554
World Health Organization. (2025, October 1). Ageing and health. https://www.who.int/news-room/fact-sheets/detail/ageing-and-health
How AI may reshape elderly care was originally published in UX Collective on Medium, where people are continuing the conversation by highlighting and responding to this story.
