Connected Living Role-Based Design
Balancing resident independence and staff operational efficiency

Overview
As smart home adoption grows, older adults are managing an increasingly fragmented ecosystem of connected devices. While these technologies promise greater independence, they can also introduce cognitive load, troubleshooting friction, and safety risks — particularly in assisted living environments where support systems are already stretched.
I led an 8-week product design project exploring Connected Living, an AI-powered mobile platform that unifies smart device management for residents and staff within senior living facilities. The platform is configured based on the devices available in each facility, allowing the experience to adapt to different room setups and infrastructure. The project explored a central tension: how do you empower seniors to remain independent while equipping care teams with tools that improve response time and operational efficiency?
Role
Product Designer
Duration
8 Weeks
Focus Areas
Role-Based Design, AI Integration, Accessibility
Platform
iOS Mobile
The Problem
In senior living environments, residents often use smart home devices to support independence, but those devices exist outside of staff workflows. Care teams have limited visibility into device activity and constrained time to intervene when issues arise. At the same time, age-related changes in vision, dexterity, and cognition can make managing multiple apps and troubleshooting devices increasingly difficult.
The challenge was not simply consolidating devices into one platform — it was designing a role-based system that improves staff visibility and response time while remaining accessible and autonomy-supportive for residents.

Research
To understand how smart devices impact both independence and care workflows, I interviewed adults 55+ who actively use connected home technology and staff working in assisted living facilities. I focused on how devices are used day to day, how issues are resolved, and where visibility breaks down between residents and care teams.
How Smart Devices Fit Into Daily Life
To understand how smart safety systems intersect with daily routines, I mapped the current-state escalation journey when device alerts are triggered. While connected technologies are designed to enhance safety and independence, rigid alert routing and manual escalation pathways often introduce unnecessary friction. In situations where normal activity is misinterpreted as risk, residents and staff experience avoidable stress and workflow disruption.
The journey revealed that independence is situational and closely tied to system design. Residents feel confident during routine use but experience anxiety when alerts escalate without flexibility or clear status visibility. Without shared context and customizable controls, safety technology can unintentionally undermine autonomy and strain care team operations.
Current-State Resident Journey Map

How Care Teams Manage Resident Support
To understand operational realities, I examined how assisted living staff receive and respond to concerns tied to connected devices. While smart technologies are intended to improve safety, their alerts often sit outside of existing care workflows. Instead of surfacing through a shared system, concerns are frequently relayed through phone calls or informal communication, requiring staff to pause and manually assess the situation.
Interviews revealed that time constraints and staffing limitations shape how quickly and confidently staff can respond. Without clear risk categorization or defined escalation thresholds, care teams must rely on judgment and fragmented information to determine urgency. The result is a reactive model of support — not because the technology is lacking, but because it isn’t structured around how care teams actually work.
Staff Member Journey Map

Who We're Designing For
To translate research into product direction, I distilled findings into two focused personas: a self-reliant retiree who values routine and wants technology that works predictably without undermining independence, and an overextended care team member balancing constant demands who needs systems that reduce chaos rather than add to it.
While their goals differ, both rely on clarity and trust. Residents need technology that supports autonomy without adding cognitive burden. Staff need structured visibility that integrates seamlessly into existing workflows without increasing unnecessary intervention.
Resident Persona

Staff Persona

Key Insights

Role-Based Design Must Be Intentional
Residents and care staff share the same environment but operate with different goals and constraints. A unified system must clearly differentiate control and visibility to protect autonomy while supporting oversight.

Proactive Support Should Enhance, Not Override
Monitoring and alerts can improve response time, but intervention must preserve resident-first interaction and staff judgment to maintain trust and protect resident autonomy.

Accessibility Enables Confidence and Independence
Participants consistently emphasized simplicity and ease of use. Clear feedback, intuitive controls, and reduced cognitive load are essential to ensuring technology remains supportive as needs evolve over time.
Strategy & Design
Research insights informed the design strategy, focusing on three priorities: defining clear role boundaries between residents and staff, using AI to support early risk detection and structured escalation, and ensuring the experience remains accessible as residents’ needs evolve. These priorities guided how the system is structured and how residents and staff interact with it.
Role-Based Design
Rather than designing a single shared interface, I defined how core components of the product adapt by role. While additional areas shift throughout the experience, the three components below demonstrate how navigation, homepage structure, and alert visibility are tailored for residents and staff. Residents manage their own environment, while staff monitor risk and intervene when necessary.
- Navigation: Resident navigation supports daily device management, while staff navigation centers on monitoring and response. Both roles share Home and Settings, but residents also have a dedicated Support page. Residents use Activity to view alerts and recent events, whereas staff use Alerts as a focused space for incident triage and resolution.
- Home Page: The homepage shifts based on role and responsibility. For residents, it functions as a personal control center with active alerts and quick access to their devices. For staff, it serves as a monitoring dashboard, prioritizing active alerts and visibility across residents.
- Activity / Alerts: Visibility within Activity and Alerts is structured around user tasks and privacy boundaries. Residents see alerts and recent device events within their own home for awareness. Staff see only incident-based alerts filtered by device and resident, with Active and Resolved tabs to support triage and prevent critical alerts from being buried.
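This visibility split can be expressed as a simple filtering rule over a shared event stream. The sketch below is illustrative only — the data model, field names, and `visible_events` function are assumptions for this example, not the product's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Event:
    resident_id: str
    device: str
    kind: str            # "info" for routine device events, "incident" for alerts
    resolved: bool = False

def visible_events(events, role, user_id=None, assigned=None, tab="active"):
    """Filter the shared event stream by role.

    Residents see every event in their own home for awareness;
    staff see only incident alerts for their assigned residents,
    split into Active and Resolved tabs for triage.
    """
    if role == "resident":
        return [e for e in events if e.resident_id == user_id]
    if role == "staff":
        want_resolved = (tab == "resolved")
        return [e for e in events
                if e.kind == "incident"
                and e.resident_id in assigned
                and e.resolved == want_resolved]
    raise ValueError(f"unknown role: {role}")

events = [
    Event("r1", "stove", "incident"),
    Event("r1", "thermostat", "info"),
    Event("r2", "water", "incident", resolved=True),
]

print(len(visible_events(events, "resident", user_id="r1")))        # 2: both events in r1's home
print(len(visible_events(events, "staff", assigned={"r1", "r2"})))  # 1: only the active incident
```

The key design choice is that the same stream feeds both roles; privacy boundaries live in the filter, not in duplicated data.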
Resident Wireframes

Staff Wireframes

This structure translated into two distinct but connected experiences. The resident interface prioritizes direct device control, simplified actions, and clear system feedback to reinforce independence. Devices are grouped by room, status is visible at a glance, and alerts present limited, high-confidence actions.
The staff interface centers on structured visibility and controlled support. Staff can monitor active alerts, view assigned residents, and respond within defined boundaries. Shared features behave differently by role, ensuring residents remain in control of their environment while staff have the tools needed to step in when necessary.
Resident Experience

Staff Experience

AI Integration
AI was integrated to surface potential safety issues before they escalate. Depending on the device, detection is driven either by fixed time-based thresholds or by deviations from typical behavior. A stove left on follows a defined inactivity rule to ensure predictability in high-risk scenarios, while irregular water usage is evaluated against established patterns, where deviations from a resident's own baseline are more meaningful.
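The two detection styles can be sketched side by side. The thresholds, baseline window, and cutoff below are illustrative assumptions, not values from the product:

```python
import statistics

STOVE_MAX_UNATTENDED_MIN = 45  # illustrative threshold, not a safety specification

def stove_alert(minutes_on_without_motion: float) -> bool:
    """Rule-based: a fixed inactivity threshold keeps high-risk
    detection predictable and easy to explain."""
    return minutes_on_without_motion >= STOVE_MAX_UNATTENDED_MIN

def water_alert(todays_liters: float, history: list[float], z_cutoff: float = 3.0) -> bool:
    """Pattern-based: flag usage that deviates strongly from the
    resident's own baseline rather than from a fixed number."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return todays_liters != mean
    return abs(todays_liters - mean) / stdev > z_cutoff

history = [110, 120, 115, 105, 118, 112, 116]  # liters/day over the past week

print(stove_alert(50))            # True: past the fixed threshold
print(water_alert(300, history))  # True: far outside the usual range
print(water_alert(119, history))  # False: within normal day-to-day variation
```

The split mirrors the rationale in the text: where false negatives are dangerous (stove), a predictable rule wins; where "normal" varies per resident (water), a personal baseline is more meaningful.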
Once triggered, escalation follows a clear hierarchy. Residents are prompted first. If there’s no confirmation, visibility expands to staff. Automated safeguards activate only as a last resort.
Expanding visibility across roles introduces ethical considerations. I conducted an Ethics and Society Review (ESR) to evaluate risks related to overreliance, data privacy, misuse, and accessibility. These findings directly shaped escalation thresholds, role-based boundaries, and the conditions under which automated safeguards activate.
Ethics & Society Review

The escalation framework below applies the ESR findings to a low-risk stove scenario, showing how it advances from Level 1 to Level 3 based on timing and response. Escalation is incident-dependent, so higher-risk events like smoke detection bypass resident prompts and notify staff immediately.
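One way to sketch this hierarchy as decision logic — the level numbering follows the framework above, but the timing windows and the high-risk bypass rule are illustrative assumptions:

```python
from enum import Enum

class Risk(Enum):
    LOW = "low"
    HIGH = "high"

# Levels: 1 = resident prompt, 2 = staff visibility, 3 = automated safeguard.
RESIDENT_PROMPT_WINDOW_MIN = 5   # illustrative timing
STAFF_RESPONSE_WINDOW_MIN = 10   # illustrative timing

def escalation_level(risk: Risk, minutes_unacknowledged: float) -> int:
    """Map incident risk and elapsed response time to an escalation level.

    Low-risk incidents start with a resident prompt and escalate only
    when unacknowledged; high-risk incidents such as smoke detection
    bypass the resident prompt and notify staff immediately.
    """
    if risk == Risk.HIGH:
        return 3 if minutes_unacknowledged > STAFF_RESPONSE_WINDOW_MIN else 2
    if minutes_unacknowledged <= RESIDENT_PROMPT_WINDOW_MIN:
        return 1
    if minutes_unacknowledged <= RESIDENT_PROMPT_WINDOW_MIN + STAFF_RESPONSE_WINDOW_MIN:
        return 2
    return 3

print(escalation_level(Risk.LOW, 2))   # 1: resident is prompted first
print(escalation_level(Risk.LOW, 8))   # 2: unanswered, visibility expands to staff
print(escalation_level(Risk.LOW, 20))  # 3: automated safeguard as a last resort
print(escalation_level(Risk.HIGH, 0))  # 2: high-risk events skip the resident prompt
```

Encoding escalation as explicit levels keeps the ESR commitments auditable: every automated safeguard can be traced to a risk class and an elapsed time rather than an opaque model decision.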
Escalation Framework: Low-Risk Stove Incident

The following screens illustrate how the low-risk stove incident unfolds across two stages. The resident flow captures a Level 1 alert resolved independently. The staff flow demonstrates how the system responds when that alert goes unanswered and escalates to Level 2.
Resident Flow - Level 1

Staff Flow - Level 2

Accessibility
I approached accessibility knowing that residents’ cognitive and physical abilities may shift over time. As previously shown, confirmation prompts, a dedicated Support tab for requesting help, and access to recent activity logs reduce reliance on memory, particularly as cognitive decline progresses.
Accessibility challenges also extend to physical interaction. Many smart home products rely on precise gestures or fine motor control. As shown in the Nest thermostat and video examples below, circular sliders and small touch targets require steady, controlled movement. This can create friction for users experiencing hand tremors or reduced motor stability. Although alternative plus and minus controls are available, they are often visually minimized and harder to locate.
Examples of Interaction Challenges in the Nest App

These considerations are reflected in the interaction patterns incorporated into the app. On the thermostat screen, temperature is adjusted using large plus and minus controls rather than the circular sliders commonly used in other smart thermostat apps, allowing residents to change settings through simple taps instead of precise dragging.
On the security system screen, vertical scrolling was used to review footage. Unlike the horizontal timelines used in some systems, vertical scrolling shows more events at once and requires less controlled movement or repeated tapping. Additional controls such as skipping ahead 30 seconds, rewinding 15 seconds, and previous or next clip buttons allow residents to navigate recordings without relying on the timeline. Together, these patterns make the system easier to use for residents who may experience hand tremors or reduced motor stability.
UI Accessibility Improvements

Evaluation & Impact
After finalizing the designs, I reviewed the system structure to confirm that role boundaries, alert visibility, and escalation logic remained consistent across the experience. I also revisited key accessibility decisions to ensure interaction patterns, touch targets, and feedback supported residents with changing cognitive and physical abilities.
The result is a structured, role-based system that protects independence while giving staff clear visibility and support, with proactive monitoring and accessibility built into the foundation rather than added later.