An iPadOS AAC (Augmentative and Alternative Communication) app that empowers people with motor neuron diseases to communicate again. Uses ARKit facial recognition to control a full communication board through natural expressions: winks, eyebrow raises, puckers, and more. Features combo inputs for enhanced accessibility and SwiftData for persistent storage.
Creating an accessible communication solution for individuals with motor neuron diseases who have lost the ability to speak or use traditional input methods. The challenge was to develop an intuitive system that allows users to communicate effectively using only facial expressions, while ensuring reliability, speed, and ease of use. The app needed to work seamlessly on iPadOS and provide a full-featured AAC board experience.
Built an iPadOS app using SwiftUI and ARKit to detect and interpret facial expressions in real time. Implemented a combo input system that combines multiple facial gestures (winks, eyebrow raises, lip puckers) into a comprehensive control scheme. Integrated SwiftData for persistent storage of user preferences, custom phrases, and communication boards. Created an intuitive AAC board interface with customizable phrases, quick replies, caregiver prompts, and emoji support. The facial recognition system reads natural expressions without requiring controllers or external devices, restoring independence and voice to users. The sketches below illustrate the core ideas.
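
Expression detection of this kind typically rests on ARKit's per-frame blend shape coefficients from an `ARFaceAnchor`. The following is a minimal sketch of that approach, not the project's actual code: the `FacialGesture` enum, `GestureDetector` class, and the 0.8 threshold are illustrative assumptions.

```swift
import ARKit

/// Facial gestures the board responds to (names are illustrative).
enum FacialGesture {
    case winkLeft, winkRight, eyebrowRaise, pucker
}

/// Reads ARKit blend shape coefficients each frame and reports gestures.
/// Assumes the session is running an ARFaceTrackingConfiguration.
final class GestureDetector: NSObject, ARSessionDelegate {
    /// Coefficient above which a blend shape counts as "performed".
    /// 0.8 is an assumed value; a real app would tune this per user.
    private let threshold: Float = 0.8

    var onGesture: ((FacialGesture) -> Void)?

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        let shapes = face.blendShapes

        func value(_ key: ARFaceAnchor.BlendShapeLocation) -> Float {
            shapes[key]?.floatValue ?? 0
        }

        // A wink is one eye closed while the other stays open.
        // Production code would also debounce, since this fires every frame
        // while the expression is held.
        if value(.eyeBlinkLeft) > threshold, value(.eyeBlinkRight) < 0.3 {
            onGesture?(.winkLeft)
        } else if value(.eyeBlinkRight) > threshold, value(.eyeBlinkLeft) < 0.3 {
            onGesture?(.winkRight)
        } else if value(.browInnerUp) > threshold {
            onGesture?(.eyebrowRaise)
        } else if value(.mouthPucker) > threshold {
            onGesture?(.pucker)
        }
    }
}
```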
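
A combo input layer can then buffer detected gestures and match short sequences within a time window. This sketch reuses the `FacialGesture` enum from above; the combo table, the 0.8-second window, and the `ComboRecognizer` name are hypothetical choices for illustration.

```swift
import Foundation

/// Maps short gesture sequences (combos) to board actions.
final class ComboRecognizer {
    enum Action { case select, nextPage, previousPage, speak }

    /// Gesture sequences and the actions they trigger (illustrative mapping).
    private let combos: [[FacialGesture]: Action] = [
        [.winkLeft]: .select,
        [.winkRight]: .nextPage,
        [.eyebrowRaise, .eyebrowRaise]: .speak,
        [.pucker, .winkLeft]: .previousPage
    ]

    private var buffer: [FacialGesture] = []
    private var lastInput = Date.distantPast
    /// Gestures more than 0.8 s apart start a new combo (assumed window).
    private let window: TimeInterval = 0.8

    func handle(_ gesture: FacialGesture) -> Action? {
        let now = Date()
        if now.timeIntervalSince(lastInput) > window { buffer.removeAll() }
        lastInput = now
        buffer.append(gesture)

        // Drop stale leading gestures until the buffer is still a prefix
        // of at least one known combo.
        while !buffer.isEmpty,
              !combos.keys.contains(where: { $0.starts(with: buffer) }) {
            buffer.removeFirst()
        }
        if let action = combos[buffer] {
            buffer.removeAll()
            return action
        }
        return nil
    }
}
```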
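
On the persistence side, SwiftData lets saved phrases live in a single `@Model` class. The schema below is an assumed shape for demonstration, not the app's real model.

```swift
import Foundation
import SwiftData

/// A saved phrase on the user's board. Property names are assumptions;
/// the shipping schema isn't public.
@Model
final class Phrase {
    var text: String
    var emoji: String?
    var isQuickReply: Bool
    var createdAt: Date

    init(text: String, emoji: String? = nil, isQuickReply: Bool = false) {
        self.text = text
        self.emoji = emoji
        self.isQuickReply = isQuickReply
        self.createdAt = .now
    }
}
```

In SwiftUI, a model like this would be fetched with `@Query` and written through the environment's `modelContext`, which keeps custom phrases and quick replies available across launches.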
I served as the iOS Engineer.