Better journeys
inside autonomous
shuttles.
ShuttleSense delivers adaptive, passenger-first interfaces for self-driving shuttles — making smart city transit safer, clearer, and more human.
Task completion rate across 16 live usability test participants
Error recovery — up from 28% baseline (p < 0.001)
Survey participants across German smart city commuter networks
Faster emergency response — 7.6s vs. 14.5s in baseline version
Passengers are left confused and unsafe.
Most autonomous shuttle interfaces were engineered for the vehicle — not the person inside it. The result is a broken, anxiety-inducing experience that drives people away from transit they should trust.
Outdated Static Displays
In-cabin screens show minimal status with no real-time adaptation, leaving passengers guessing throughout their journey.
No Error Recovery
When route deviations or emergencies occur, passengers have no way to understand, react, or reverse unexpected changes.
Inconsistent Feedback
Visual, audio, and haptic signals are fragmented or absent — leaving passengers anxious instead of relaxed and informed.
Accessibility Gaps
Older adults and visually impaired passengers — who need clarity most — are worst served by current in-cabin systems.
A dynamic interface built around the passenger.
ShuttleSense uses multimodal feedback, context-aware modes, and clear error recovery to create a journey passengers can genuinely trust.
Multimodal Communication
Synchronized voice messages, visual alerts, and haptic feedback work in unison. Every passenger receives critical information through their preferred channel.
Visual · Audio · Haptic

Context-Aware Modes
The interface shifts dynamically: boarding, en route, approaching stop, emergency. Each mode surfaces only what matters at that precise moment.
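As a rough sketch of how such mode switching could work (the mode names follow the description above, but the widget names and structure are hypothetical):

```typescript
// Hypothetical sketch of context-aware mode logic: each journey mode
// surfaces only the widgets relevant at that moment, so passengers are
// never shown information that doesn't matter right now.
type JourneyMode = "boarding" | "enRoute" | "approachingStop" | "emergency";

const MODE_WIDGETS: Record<JourneyMode, string[]> = {
  boarding: ["welcome", "routeOverview", "seatGuidance"],
  enRoute: ["liveMap", "eta", "nextStop"],
  approachingStop: ["stopName", "exitSide", "transferInfo"],
  emergency: ["statusBanner", "emergencyActions", "undoAlert"],
};

// Return the widget set for the current mode.
function visibleWidgets(mode: JourneyMode): string[] {
  return MODE_WIDGETS[mode];
}
```

The key design choice is that widgets are opted *in* per mode rather than hidden per mode, so anything not explicitly relevant stays off-screen by default.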
Adaptive UI

Undo Alert System
ShuttleSense's signature feature. When unexpected events occur, passengers can immediately acknowledge, act, or reverse — cutting response time in half.
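A minimal sketch of the underlying pattern (the class and state names here are illustrative, not the project's actual implementation): an unexpected change stays reversible while the alert is pending, and the passenger can acknowledge or undo it.

```typescript
// Hypothetical sketch of the Undo Alert pattern: an unexpected event
// (e.g. a route deviation) is applied optimistically but remains
// reversible until the passenger acknowledges it.
type AlertState = "pending" | "acknowledged" | "reversed";

class UndoAlert {
  private state: AlertState = "pending";

  constructor(
    public readonly message: string,
    private readonly onReverse: () => void, // callback that rolls the change back
  ) {}

  // Passenger confirms the change; it can no longer be reversed.
  acknowledge(): void {
    if (this.state === "pending") this.state = "acknowledged";
  }

  // Reversing only succeeds while the alert is still pending.
  reverse(): boolean {
    if (this.state !== "pending") return false;
    this.state = "reversed";
    this.onReverse();
    return true;
  }

  get status(): AlertState {
    return this.state;
  }
}
```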
Error Recovery

Animated Status Headers
Real-time system status at the top of every screen with smooth transitions. State changes communicate clearly without causing alarm or overload.
Live Status

WCAG 2.1 AA Throughout
High-contrast modes, ARIA live regions for screen readers, 44px minimum touch targets — built in from the start, not added as afterthoughts.
Accessibility

AI Companion (Roadmap)
A conversational ambient assistant that proactively informs passengers of delays and personalises the journey — without requiring any interaction to start.
AI · Personalisation

Grounded in real people,
real rides, real data.
A rigorous mixed-method study across interviews, field observations, large-scale surveys, and controlled A/B testing — not assumptions.
In-Depth Interviews
20 structured interviews with regular shuttle commuters, older adults, and visually impaired users — surfacing hidden pain points that surveys alone would have missed.
Field Observations
25 real autonomous shuttle rides observed across smart city corridors, capturing authentic passenger behaviour with existing in-cabin systems in natural conditions.
Large-Scale Survey
112 participants validated qualitative findings and ranked 15+ user requirements by frequency and impact severity — giving statistical weight to what interviews revealed.
Iterative Prototyping
From paper sketches to clickable mid-fidelity to a comprehensive Figma prototype — refined through heuristic evaluation with Experience Design specialists.
Controlled A/B Testing
Qualitative A/B via UserTesting.com (12 users) and a controlled Google Analytics test (50 users) measuring real behavioural change between versions.
The numbers
speak clearly.
Version B produced statistically significant improvements across every metric — validated at p < 0.001, ω² = 15.31.
"Adaptive, multimodal in-cabin interfaces make driverless shuttle services significantly easier to use, safer, and more satisfying for every passenger."
16 in-person usability test participants
Error recovery after Undo Alert redesign
Emergency response time halved in Version B
Full accessibility audit confirmed compliant
Designed for everyone,
from day one.
ShuttleSense meets WCAG 2.1 AA throughout — not as a checklist, but as a core design principle from the first sketch.
High-Contrast & Screen Reader
All elements exceed WCAG 2.1 AA contrast ratios. Full ARIA landmark and live region support ensures screen readers convey real-time status updates accurately.
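As an illustrative sketch of the live-region approach (attribute values follow standard ARIA semantics; the function itself is hypothetical): routine updates announce politely when the screen reader is idle, while emergencies interrupt immediately.

```typescript
// Hypothetical sketch: ARIA attributes for a live region announcing
// in-cabin status updates. "polite" waits for the screen reader to be
// idle; "assertive" interrupts immediately for emergencies.
type Urgency = "routine" | "emergency";

function liveRegionAttrs(urgency: Urgency): Record<string, string> {
  return urgency === "emergency"
    ? { role: "alert", "aria-live": "assertive", "aria-atomic": "true" }
    : { role: "status", "aria-live": "polite", "aria-atomic": "true" };
}
```

Setting `aria-atomic="true"` makes screen readers announce the whole status message on each change, rather than only the changed fragment.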
Reduced Mental Load
Context-aware modes surface only relevant information at each moment — dramatically reducing cognitive overload for older adults and first-time riders.
Large Touch Targets
Every interactive element meets 44×44px minimum touch targets. Emergency alerts use oversized buttons that work reliably even in a moving vehicle.
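A simple sketch of how the 44×44 px minimum described above could be enforced in a design-lint check (the function and interface names are hypothetical):

```typescript
// Hypothetical sketch: validate that an interactive element meets the
// project's 44x44 px minimum touch target size.
interface TargetSize {
  width: number;  // rendered width in CSS pixels
  height: number; // rendered height in CSS pixels
}

const MIN_TARGET_PX = 44;

function meetsTouchTarget({ width, height }: TargetSize): boolean {
  return width >= MIN_TARGET_PX && height >= MIN_TARGET_PX;
}
```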
Three passengers who
shaped every decision.
Clear, consistent cues that convey exactly what the shuttle is doing — without decoding small icons or fast-moving text in an unfamiliar, moving vehicle.
Rich, real-time information about route, ETA, and system status — and the ability to act immediately when something unexpected disrupts his commute.
A complete journey experience through voice and haptic feedback alone — every update, alert, and interaction working perfectly without any visual dependency.
Ready to bring ShuttleSense to your transit network?
Whether you're a transit operator, smart city planner, or mobility researcher — we'd love to talk about what ShuttleSense can do for your passengers.