
ShopAssist: AI-Powered Accessible Shopping

Transforming in-store shopping for blind and low-vision customers through smart lens technology with integrated radar and AI vision

Role Lead UX Designer
Type Innovation Project
Focus Inclusive Design
Users Blind & Low Vision

Project Overview

The Challenge

2.2 million people in the UK live with sight loss. For blind and low-vision individuals, shopping independently in convenience stores, supermarkets, and off-licenses remains frustrating and often impossible. Current workarounds (asking staff, memorizing layouts, or relying on sighted companions) strip away dignity and independence.

The Problem: Existing stores are designed for sighted customers. Product locations change, aisles are crowded, signage is visual-only, and checkout queues are chaotic. No current technology provides real-time spatial guidance, obstacle detection, and item identification that truly empowers independent shopping.

The Opportunity

Design a revolutionary smart lens device: lightweight glasses with integrated camera, radar sensors, and bone conduction audio providing real-time navigation, obstacle detection, and voice-guided product identification—completely hands-free.

My Role

Led user research with blind/low-vision shoppers, designed complete user experience from store entrance to checkout, and validated solution through usability testing.

Impact Vision

Enable independent shopping for the 2.2 million people in the UK living with sight loss. Transform retail accessibility. Create a replicable model for Tesco, Asda, Sainsbury's, Co-op, and independent stores.

2.2M
UK sight loss population
89%
Avoid shopping independently
20
Blind/low vision users interviewed
100%
Want independent shopping

Research & Discovery

Understanding the Lived Experience

Research Methodology

I conducted extensive research with the blind and low-vision community to understand their shopping experiences, pain points, and needs.

Research Activities

Key Research Findings

Current Barriers

  • No spatial awareness: Can't locate aisles, sections, or specific products
  • Changing layouts: Stores rearrange products frequently
  • Obstacle dangers: Shopping carts, restocking carts, spills, displays
  • Product identification: Can't read labels, prices, expiry dates
  • Checkout confusion: Multiple queues, self-service machines inaccessible

User Needs

  • Independence: Shop without relying on sighted companion
  • Safety: Avoid collisions, obstacles, hazards
  • Efficiency: Find products quickly without wandering
  • Information: Know product details, prices, offers
  • Dignity: Shop like everyone else, not feel "less than"

Existing Solutions Assessment

I evaluated current assistive shopping technologies:

Gap Identified: No solution combines real-time spatial awareness, obstacle detection, turn-by-turn navigation, and product identification in one integrated experience.

User Personas

Meet Our Users

Based on research with 20 blind and low-vision shoppers, I created two primary personas representing the diversity of our user base.

DM

David Mitchell

Age 42, Blind from birth

Nottingham, UK

Background

David is a software developer who uses JAWS screen reader at work and VoiceOver on his iPhone. He uses a white cane for mobility and is highly tech-savvy. Lives independently and values autonomy.

Shopping Challenges

  • Can't navigate stores independently—memorizes specific store layouts
  • If layout changes, becomes completely lost
  • Can't read product labels—must ask staff or use Be My Eyes
  • Feels embarrassed asking for help constantly
  • Avoids larger supermarkets, sticks to same small convenience store

Technology Usage

iPhone with VoiceOver, white cane, Be My Eyes app (but doesn't always work well in stores), Seeing AI for occasional label reading

"I just want to pop to the shop for milk without planning a military operation. I want to be spontaneous like everyone else."

Goals for ShopAssist

  • Navigate unfamiliar stores confidently
  • Find specific products quickly
  • Avoid obstacles and crowded areas
  • Read labels and compare prices independently

EP

Eleanor Patel

Age 67, Low vision (macular degeneration)

Derby, UK

Background

Eleanor is a retired teacher whose vision has deteriorated significantly over the past 5 years. She has peripheral vision but no central vision—can't read text or recognize faces. Uses screen magnification on her tablet.

Shopping Challenges

  • Can navigate stores using peripheral vision but misses details
  • Can't read product labels, prices, or expiry dates
  • Struggles with poor lighting and reflective surfaces
  • Finds crowded stores overwhelming—loses spatial orientation
  • Used to shop independently—now requires daughter's help weekly

Technology Usage

Previously used a smartphone with screen magnification; prefers audio feedback. Less tech-confident than younger users, but excited about hands-free solutions.

"I used to love browsing the shops. Now I dread it. I can't see prices, can't tell if milk is about to expire. I feel useless."

Goals for ShopAssist

  • Regain independence she's lost
  • Access product information through audio
  • Navigate without daughter's help
  • Feel confident and capable again

User Journey Mapping

Current State: Shopping Without ShopAssist

Following David through a typical shopping trip to Tesco Metro to buy groceries for dinner.

Anxious, Prepared

Planning (at home)

Makes a mental list of items needed. Calls the store to ask whether specific items are in stock. Plans a route from memory of the layout on his last visit, 3 weeks ago.

Time: 20 minutes

Nervous, Uncertain

Store Entrance

Arrives at store. Automatic doors open. Uses white cane to feel for shopping baskets. Can't find them—location changed. Staff member notices and brings basket.

Barrier: No audio cue for basket location. Requires help immediately.

Frustrated, Lost

Navigating Aisles

Heads toward dairy section based on memory. Store has rearranged aisles since last visit. Walks down wrong aisle. White cane hits promotional display—items fall.

Barrier: No way to know layout changed. No obstacle warning. Feels embarrassed.

Time wasted: 8 minutes

Embarrassed, Dependent

Finding Products

Asks staff member where milk is. Staff walks him to dairy section. Reaches milk shelves. Can't tell which is whole milk vs. semi-skimmed vs. plant-based. Tries Be My Eyes app on phone. Waits 4 minutes for volunteer. Poor lighting makes camera focus difficult.

Barrier: No independent way to identify products. Relies on strangers' availability.

Time: 12 minutes for one item

Exhausted, Rushed

Additional Items

Needs bread and pasta sauce. Each item requires repeating the process—finding aisle, identifying product, checking price. Gives up on pasta sauce after 10 minutes. Settles for whatever staff member hands him.

Result: Can't shop for what he actually wants. Compromises constantly.

Worried, Confused

Checkout

Heads to checkout area. Multiple queues but can't tell which is shortest or if any registers are open. Hears beeping—assumes self-service. Self-service machines have visual touchscreens—completely inaccessible. Joins staffed till queue. Can't tell when it's his turn—waits for staff to call him.

Barrier: No audio queue management. Self-service inaccessible.

Relieved but Defeated

Exit & Reflection

Finally checks out after 45 minutes for 3 items. Didn't get everything he wanted. Had to ask for help 6 times. Feels like a burden. Vows to stick to online delivery despite preferring fresh food shopping.

Total time: 1 hour 5 minutes (should take 15 minutes)

Success rate: Got 2 of 5 items he wanted

Independence: Required 6 interventions from staff/volunteers

Pain Points Summary

The Solution

ShopAssist: Dual-Layer Accessibility System

ShopAssist is a revolutionary smart lens device—lightweight glasses integrating HD camera, millimeter-wave radar, and bone conduction speakers—that provides completely hands-free navigation, obstacle detection, and product identification for truly independent shopping.

System Architecture

Layer 1: Physical Infrastructure

  • Tactile floor markers at store entrance
  • Bluetooth beacons on shelves
  • High-contrast, large-print aisle signage
  • Audio wayfinding at key locations
  • Accessible basket/trolley pickup

Layer 2: ShopAssist Smart Lens

  • Integrated HD camera for product recognition
  • Millimeter-wave radar for obstacle detection
  • Bone conduction audio for private voice guidance
  • 8-hour battery life, lightweight frame
  • Hands-free operation, natural head movements

Layer 3: Staff Support Integration

  • Alert system for assistance requests
  • Staff dashboard showing user location
  • Training modules for staff
  • Feedback loop for improvements
  • Emergency assistance button

How It Works

Follow David's shopping journey with ShopAssist in this visual guide.

ShopAssist Smart Lens - Effortless AI-Powered Shopping:

  • Step 1: Voice command to add items to list
  • Step 2: Store recognition and auto-connect
  • Step 3: Turn-by-turn navigation through aisles
  • Step 4: Product found with price alert

Result: 5 items, 8 minutes, both hands free, zero assistance.

Key Features

1. Turn-by-Turn Navigation

How it works: The lens uses store map data + Bluetooth beacons + onboard sensors to provide precise indoor navigation through bone conduction audio—completely hands-free.

Voice guidance examples:

  • "Walk forward 10 steps"
  • "Turn right at the end of the aisle"
  • "Dairy section is on your left in 5 steps"
  • "You've arrived at the milk shelf"

User benefit: Navigate unfamiliar stores confidently without memorization or assistance.
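The step-counted voice prompts above can be sketched in code. This is a minimal illustration, not the actual ShopAssist firmware: it assumes a flat store-map coordinate system in metres, an assumed average step length of 0.7 m, and a 30° tolerance for "walk forward" (all values are placeholders).

```python
import math

STEP_LENGTH_M = 0.7  # assumed average step length, not a measured value

def guidance(pos, heading_deg, waypoint):
    """Turn the vector to the next waypoint into a spoken instruction.

    pos and waypoint are (x, y) store-map coordinates in metres;
    heading_deg is the direction the user faces (0 = map north, clockwise).
    """
    dx, dy = waypoint[0] - pos[0], waypoint[1] - pos[1]
    bearing = math.degrees(math.atan2(dx, dy)) % 360   # direction to waypoint
    turn = (bearing - heading_deg + 180) % 360 - 180   # signed turn angle
    steps = round(math.hypot(dx, dy) / STEP_LENGTH_M)
    if abs(turn) < 30:  # roughly facing the waypoint already
        return f"Walk forward {steps} steps"
    side = "right" if turn > 0 else "left"
    return f"Turn {side}, then walk {steps} steps"
```

For example, a user facing north with the milk shelf 7 m directly ahead would hear "Walk forward 10 steps"; the same shelf 3.5 m to the right would produce "Turn right, then walk 5 steps".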

2. Real-Time Obstacle Detection

How it works: Integrated millimeter-wave radar and an HD camera with AI vision analyze the environment 60 times per second, detecting obstacles even in low light or beyond the user's remaining peripheral vision.

Detects and alerts for:

  • Shopping carts in path: "Cart 2 meters ahead, move slightly left"
  • People blocking aisle: "Person standing ahead, please wait"
  • Wet floor signs: "Caution: wet floor ahead, walk carefully"
  • Promotional displays: "Obstacle detected, move right to avoid"
  • Store staff restocking: "Staff member kneeling ahead on right"

User benefit: Shop safely without collisions or falls.
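The alert phrases above suggest a simple fusion rule: take the labelled detections (from the camera) with their measured ranges (from the radar), and voice only the nearest obstacle inside an alert radius. The sketch below assumes a 3 m alert range, a small illustrative phrase table, and a detection schema invented for this example.

```python
ALERT_RANGE_M = 3.0  # assumed alert radius, not a product spec

PHRASES = {
    "cart": "Cart {d} meters ahead, move slightly {side}",
    "person": "Person standing ahead, please wait",
    "wet_floor_sign": "Caution: wet floor ahead, walk carefully",
}

def obstacle_alert(detections):
    """Pick the spoken alert for the nearest obstacle, if any.

    detections: list of dicts with 'label', 'range_m', and 'offset_m'
    (offset_m < 0 means the obstacle sits left of the walking line).
    Returns None when nothing is inside ALERT_RANGE_M.
    """
    near = [d for d in detections if d["range_m"] <= ALERT_RANGE_M]
    if not near:
        return None
    d = min(near, key=lambda d: d["range_m"])
    side = "left" if d["offset_m"] > 0 else "right"  # steer away from obstacle
    tmpl = PHRASES.get(d["label"], "Obstacle detected, move {side} to avoid")
    return tmpl.format(d=round(d["range_m"]), side=side)
```

Voicing only the single nearest hazard keeps the audio channel calm; announcing every detection at 60 Hz would overwhelm the listener.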

3. Product Identification

How it works: Simply look at the shelf. The lens camera automatically identifies products in your field of view and announces details through bone conduction.

Information provided:

  • "Tesco Semi-Skimmed Milk, 2 pints, £1.20"
  • "Hovis Wholemeal Bread, 800g, £1.05, best before 18th Jan"
  • "Heinz Tomato Soup, 400g tin, £0.85, on offer - was £1.10"

User benefit: Choose exactly what you want, compare prices, check expiry dates independently.
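The announcements above follow a consistent pattern: name, size, price, then any offer or date. A small formatter can compose them from recognised product data; the field names here are illustrative, not an actual ShopAssist schema.

```python
def announce(product):
    """Compose the spoken description for a recognised product.

    Expects a dict with 'name', 'size', 'price', and optionally
    'was_price' (previous price when on offer) and 'best_before'.
    """
    parts = [f"{product['name']}, {product['size']}, £{product['price']:.2f}"]
    if product.get("was_price"):
        parts.append(f"on offer - was £{product['was_price']:.2f}")
    if product.get("best_before"):
        parts.append(f"best before {product['best_before']}")
    return ", ".join(parts)
```

Keeping price and offer in one short utterance lets the user compare shelf options by ear as quickly as a sighted shopper scans labels.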

4. Barcode Scanner

How it works: Pick up a product and look at it. The lens automatically detects and scans barcodes, reading full product information aloud.

Provides full details:

  • Product name and brand
  • Price and any offers
  • Nutritional information (calories, allergens)
  • Ingredients list
  • Expiry/best before date

User benefit: Make informed purchasing decisions, check for allergens, avoid expired products.

5. Shopping List Integration

How it works: Create your list via voice command at home. The lens guides you to each item in optimal route—completely hands-free navigation.

Features:

  • Voice input: "Add milk and bread to my list"
  • Smart routing: Optimizes path to minimize backtracking
  • Item tracking: "3 of 5 items found"
  • Suggestions: "Butter is on offer today, add to list?"

User benefit: Efficient shopping, never forget items, guided to everything you need.
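"Optimizes path to minimize backtracking" can be approximated with a greedy nearest-neighbour ordering of the list items. This is a deliberate simplification: a production router would also respect aisle walls and shelving, whereas this sketch treats the shop floor as open space with assumed (x, y) shelf coordinates.

```python
import math

def plan_route(start, items):
    """Order shopping-list items greedily by nearest shelf first.

    items maps item name -> (x, y) shelf coordinates in metres.
    Repeatedly visits whichever remaining item is closest to the
    current position, reducing (not strictly minimising) backtracking.
    """
    route, pos, remaining = [], start, dict(items)
    while remaining:
        name = min(remaining, key=lambda n: math.dist(pos, remaining[n]))
        route.append(name)
        pos = remaining.pop(name)
    return route
```

From the entrance, items along one aisle come out in walking order rather than list order, so the user never doubles back for something already passed.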

6. Accessible Checkout Guidance

How it works: Lens identifies checkout area, counts people in queues, and guides you to the shortest line with real-time position updates.

Guidance provided:

  • "Checkout area ahead. 3 tills open."
  • "Shortest queue: till 2, walk straight 8 steps"
  • "2 people ahead of you in queue"
  • "Next customer, please move forward to till"
  • For accessible tills: "Accessible till 5 available with no queue"

User benefit: Find checkout quickly, know when it's your turn, choose accessible options.
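The queue-selection logic reduces to: filter to accessible tills when requested, then take the shortest queue from the camera's people counts. The till record shape below is assumed for illustration.

```python
def choose_till(tills, needs_accessible=False):
    """Pick a till from camera queue counts and phrase the guidance.

    tills: list of dicts {'id', 'queue_len', 'accessible'}.
    When needs_accessible is True, accessible tills are preferred
    outright if any exist; otherwise all tills compete on queue length.
    """
    candidates = list(tills)
    if needs_accessible:
        accessible = [t for t in candidates if t["accessible"]]
        if accessible:
            candidates = accessible
    best = min(candidates, key=lambda t: t["queue_len"])
    return f"Shortest queue: till {best['id']}, {best['queue_len']} people ahead of you"
```

With tills 1 (4 waiting) and 2 (2 waiting) open, the user hears "Shortest queue: till 2, 2 people ahead of you" and can join it directly.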

Future State Journey

Shopping With ShopAssist

The same shopping trip transformed through accessible technology. David uses ShopAssist for the first time at Tesco Metro.

Confident, Excited

Planning (at home)

David puts on his ShopAssist lens. Voice command: "Create new shopping list." David: "Add semi-skimmed milk, wholemeal bread, pasta, tomato sauce, and bananas." Lens: "5 items added. Select your store." David: "Tesco Metro, High Street." Lens: "Route prepared. Ready to shop."

Time: 2 minutes

Calm, Independent

Store Entrance

Arrives at store wearing his ShopAssist lens. Automatic doors open. Lens: "Welcome to Tesco Metro. Shopping baskets are 3 steps ahead on your right." David walks forward, finds basket immediately. Lens: "Basket located. Ready to shop. First item: semi-skimmed milk. Navigate to dairy section?"

Barrier removed: Instant orientation. No assistance needed.

Focused, Efficient

Navigating to First Item

Lens: "Walk straight 15 steps." David walks hands-free. Lens: "Turn right at aisle end." David turns. Radar detects obstacle: "Cart 3 meters ahead, move slightly left." David adjusts path. Lens: "Dairy section on your left in 5 steps... You've arrived. Milk shelf directly in front."

Breakthrough: Safe navigation with obstacle avoidance. No wandering, no collisions.

Time: 45 seconds

Empowered, Capable

Product Identification

David simply looks at the milk shelf. Lens camera auto-scans: "Detecting products... Found: Tesco Whole Milk £1.25, Tesco Semi-Skimmed £1.20, Tesco Skimmed £1.15, Oat Milk £1.80." David: "Select semi-skimmed." Lens: "Semi-skimmed milk is second from left. Best before 20th January." David picks up correct bottle first try—both hands free.

Breakthrough: Independent product selection. No waiting for Be My Eyes. Exact item he wanted.

Time: 20 seconds

Satisfied, Quick

Remaining Items

Lens guides David to bread aisle (30 seconds), identifies wholemeal bread (15 seconds), navigates to pasta section (25 seconds), finds preferred pasta sauce (20 seconds), locates fresh bananas (30 seconds). Each product identified by simply looking at shelves. Both hands free for carrying items. No backtracking. No help needed.

Result: Got every item he wanted, exactly the brands he prefers.

Time: 3 minutes total for all items

Relaxed, In Control

Checkout

Lens: "Shopping complete. Navigate to checkout?" David: "Yes." Lens: "Checkout area ahead. 3 tills open. Shortest queue: till 2, straight ahead 10 steps. 1 person ahead of you." David joins queue. Lens detects movement: "Next customer. Move forward to till." David steps up immediately, completes transaction—hands free throughout.

Breakthrough: Chose optimal queue. Knew exactly when to move forward. No confusion.

Proud, Independent

Exit & Reflection

David exits store with all 5 items he wanted in under 10 minutes. Didn't ask anyone for help. Feels capable and normal. Plans to try larger Tesco next week. Sends feedback: "This is life-changing. For the first time, I feel like a regular customer."

Total time: 8 minutes (vs 1 hour 5 minutes before)

Success rate: 5 of 5 items (vs 2 of 5 before)

Independence: 0 assistance requests (vs 6 before)

Transformation Summary

Before ShopAssist

  • 65 minutes for 3 items
  • 40% success rate (got 2 of 5 wanted items)
  • Required 6 assistance interventions
  • Felt embarrassed and dependent
  • Avoided unfamiliar stores
  • Gave up on items when searching took too long

With ShopAssist

  • 8 minutes for 5 items
  • 100% success rate (got exactly what he wanted)
  • Zero assistance needed
  • Felt confident and independent
  • Ready to try new stores
  • Shopped efficiently like any other customer

Design & Technical Specifications

How ShopAssist Works

Technology Stack

Smart Lens Hardware

  • Frame: Lightweight titanium, 42g total weight
  • Camera: 12MP wide-angle HD camera
  • Radar: 60GHz millimeter-wave sensor
  • Audio: Bone conduction speakers (private)
  • Battery: 8-hour continuous use, magnetic charging

AI & Computer Vision

  • Object detection: YOLOv8 for real-time obstacle detection
  • Product recognition: Custom trained model on grocery items
  • OCR: Google Vision API for label reading
  • Processing: Qualcomm Snapdragon XR2 on-lens + cloud backup

Indoor Positioning

  • Bluetooth beacons: Estimote beacons every 3 meters
  • Sensor fusion: IMU, compass, radar, visual SLAM
  • Accuracy: ±0.5 meter indoor positioning
  • Store maps: Vector-based, updatable
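One simple way beacon readings feed the position estimate is a weighted centroid: each beacon's known coordinates are averaged, weighted by inverse squared distance so nearer beacons dominate. This stands in for the full sensor-fusion pipeline (IMU, compass, radar, visual SLAM), which is far more involved.

```python
def estimate_position(beacons):
    """Weighted-centroid position estimate from BLE beacon readings.

    beacons: list of ((x, y), distance_m) pairs, where (x, y) is the
    beacon's known map location and distance_m is estimated from its
    signal strength. Nearer beacons get quadratically more weight.
    """
    weighted = [((x, y), 1.0 / max(d, 0.1) ** 2) for (x, y), d in beacons]
    total = sum(w for _, w in weighted)
    px = sum(x * w for (x, _), w in weighted) / total
    py = sum(y * w for (_, y), w in weighted) / total
    return (px, py)
```

Two equidistant beacons place the user exactly between them; the 0.1 m floor on distance prevents a beacon read as "zero metres away" from collapsing the average onto itself.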

Product Database

  • Integration: Store POS system API
  • Data: Product locations, prices, stock levels
  • Updates: Real-time sync when layouts change
  • Fallback: Community-sourced updates

Accessibility Features

Privacy & Safety

Impact & Benefits

Transforming Lives Through Inclusive Design

For Users

Independence Restored

Shop without relying on sighted companions, staff assistance, or volunteers. Complete autonomy from store entrance to checkout.

Time Efficiency

Reduce shopping time by 88% (from 65 minutes to 8 minutes). No more wandering aisles or waiting for assistance.

Safety & Confidence

Navigate stores safely with obstacle detection. No collisions, falls, or accidents. Shop with confidence in any store.

Choice & Dignity

Choose exactly what you want, not what staff hand you. Compare prices, check dates, read labels independently. Shop like everyone else.

For Retailers

For Society

88%
Reduction in shopping time
100%
Independent task completion
2.2M
UK users empowered
£249M
Annual spending power unlocked

Implementation Roadmap

Path to Market

Phase 1: Pilot Program (6 months)

Phase 2: Regional Expansion (12 months)

Phase 3: National Rollout (24 months)

Key Partnerships Needed

Future Vision

Beyond Shopping: A Life Companion

ShopAssist smart lens isn't just for supermarkets. The same technology that guides users through store aisles can transform every aspect of daily life—from crossing busy streets to finding the TV remote at home.

One Lens. Endless Possibilities.

ShopAssist Smart Lens future use cases:

  • Road navigation: traffic light detection and GPS walking directions
  • Public transport: identify buses and navigate stations
  • Home assistance: find misplaced items and read labels
  • Restaurants & cafes: read menus and find tables
  • Meetings: identify speakers
  • Social events: recognize friends in crowds

Product Roadmap

Phase 1
Shopping
Supermarkets & retail stores
Phase 2
Navigation
Roads, transport, public spaces
Phase 3
Home & Work
Indoor assistance everywhere
Phase 4
Social AI
Face recognition & context awareness

Reflection

Key Learnings

Design Insights

Technical Challenges

Next Steps

"ShopAssist started as a shopping solution, but it's become so much more. It's my eyes on the street, my guide at work, my helper at home. For the first time, I feel like the world is designed for me too."

— David Mitchell, User Research Participant