About the Problem

Improving a player’s shot requires accurate feedback on movement and form.

However, most existing solutions rely on phone-recorded videos for AI analysis.

These recordings introduce too many uncontrolled variables:

Camera angle changes

Inconsistent lighting

Different device quality

Manual recording errors

Through discussions with designers and engineers, and a review of existing products, several patterns emerged in how teams were designing and building these training tools.

Img 1 : Different recording setups and camera positions create inconsistent input data, making reliable AI analysis difficult.

Product Direction

To provide reliable AI feedback, we needed to understand how player performance is currently captured and analyzed.

This exploration revealed limitations in existing approaches and helped us define a more reliable system architecture.

Img 2 : Uncontrolled recording conditions—such as camera angle, device type, and environment—produce inconsistent input data for AI analysis.

Market Observation

Most basketball training tools analyze performance using phone-recorded videos combined with AI models.

While accessible, this approach relies on uncontrolled recording environments, making consistent analysis difficult.

Key Insight

AI feedback quality depends on consistent and repeatable input data.

When camera position, lighting, and recording conditions vary, the model receives inconsistent data, reducing the reliability of insights.

Strategic Decision

Instead of analyzing uncontrolled recordings, we shifted the approach to controlling how the data is captured. The team proposed a device–mobile ecosystem: a wearable capture device paired with the mobile app and the AI analysis pipeline.

Design Challenges

The wearable device improved data consistency but introduced a new challenge — coordinating interactions between hardware, mobile, and the AI pipeline.

The mobile app had to manage device connection, recording, data upload, and analysis states while keeping the system reliable.

These constraints shaped how the experience was designed.

Device Connectivity

The device and mobile app required a stable connection during recording and data transfer.
Connection failures could result in incomplete sessions.

Recording Sync

Recording had to trigger simultaneously on both the device and mobile app.
Incorrect state transitions could cause missed or duplicate sessions.

Video Processing

After recording, sessions were uploaded and analyzed by the AI system.
The interface needed to prevent conflicts while communicating progress.

Designing the Device–App Workflow

To ensure reliable analysis, the mobile experience had to coordinate device connection, shot capture, and AI processing.

The challenge was translating complex system states into a clear and predictable workflow for players.

Step 1

Device Connection

Before recording could begin, the app needed to establish a stable connection with the wearable device.
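The connection step above can be sketched as a simple retry loop. This is a hypothetical illustration, not the product's actual pairing code: `connect` stands in for whatever Bluetooth pairing call the app uses, and three attempts is an assumed default.

```typescript
// Hypothetical sketch: retry the wearable connection a few times
// before reporting failure, so a transient Bluetooth drop does not
// block the start of a session. `connect` is a stand-in for the
// real pairing call.
type ConnectFn = () => Promise<boolean>;

async function connectWithRetry(
  connect: ConnectFn,
  maxAttempts = 3,
): Promise<boolean> {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    if (await connect()) return true; // connected: recording can begin
  }
  return false; // surface a "connection failed" state to the player
}
```

Bounding the attempts matters here: an unbounded loop would leave the player staring at a spinner instead of a clear failure state.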

Step 2

Recording Workflow

Recording required synchronization between the device capture and the mobile session.

Clear recording states ensured sessions started and stopped reliably.
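One way to make "clear recording states" concrete is a small transition table that only permits legal moves, so a stray duplicate "start" event cannot spawn a second session. This is a hypothetical sketch; the state names are assumptions, not the shipped implementation.

```typescript
// Hypothetical session state machine. Illegal transitions are
// ignored, which prevents duplicate or missed sessions when events
// arrive out of order from the device and the app.
type SessionState = "idle" | "recording" | "uploading" | "analyzing" | "done";

const allowed: Record<SessionState, SessionState[]> = {
  idle: ["recording"],
  recording: ["uploading"],
  uploading: ["analyzing"],
  analyzing: ["done"],
  done: ["idle"],
};

function transition(current: SessionState, target: SessionState): SessionState {
  // Reject anything the table does not allow; keep the session intact.
  return allowed[current].includes(target) ? target : current;
}
```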

Step 3

Upload and Processing

After recording, sessions were uploaded and analyzed by the AI system.

Progress indicators surfaced background states:

Img 3 : Clear system states guide players through the session workflow, from device connection to recording and AI analysis.
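The upload progress surfaced in this step could be reported to the UI through a simple callback. A hedged sketch: the chunked "upload" is simulated, and the byte counts are invented for illustration.

```typescript
// Hypothetical sketch: send a recorded session in chunks and report
// percentage progress so the app can surface the background state.
async function uploadSession(
  totalBytes: number,
  chunkBytes: number,
  onProgress: (percent: number) => void,
): Promise<void> {
  let sent = 0;
  while (sent < totalBytes) {
    sent = Math.min(sent + chunkBytes, totalBytes); // simulate one chunk
    onProgress(Math.round((sent / totalBytes) * 100));
  }
}
```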

Iteration

As we tested the system, we iterated on how the mobile app communicated device activity and long-running processes during practice sessions.

Img 4 : Initial device setup required connecting the wearable through Bluetooth and Wi-Fi before recording sessions could start.

Version 1

Device-First Recording

The first version focused on establishing device connectivity through Bluetooth and Wi-Fi.

Once connected, recording was triggered directly from the device, with minimal feedback in the mobile app.

This left players unsure about what the device was doing at any given moment.

Version 2

Real-Time Activity Screens

To address this, we introduced real-time activity screens that reflected the device’s current state.

The mobile app now surfaced system progress such as:

Recording video

Analyzing session

Analysis complete

While this improved clarity, users were locked into a single screen during processing, limiting navigation.

Img 5 : Real-time activity screens reflected the device’s current state, helping players understand recording and analysis progress.

Img 8 : Background status indicators provided continuous visibility into session progress without interrupting the user’s workflow.

Version 3

Persistent System Status

To improve flexibility, we introduced persistent status indicators inspired by background task patterns.

These status chips remained visible across the app, allowing players to move freely while still understanding system progress.

Connected

Recording

Uploading

Analyzing

This approach provided continuous system awareness without interrupting exploration.

Outcome

The redesigned workflow improved how players interacted with the device–mobile–AI system during practice sessions, making recording and analysis more predictable.

Faster Device Setup

Bluetooth pairing reduced dependency on shared Wi-Fi networks, making device connection faster and easier to complete.

Reliable Recording Sessions

Clear recording states ensured sessions were triggered and captured consistently, reducing missed or incomplete recordings.

Clear Processing Feedback

Persistent status indicators allowed players to track recording and AI analysis progress while continuing to explore the app.

Learning

Designing a connected device–mobile–AI system required thinking beyond screens and focusing on system states, reliability, and cross-team collaboration.

Designing System Workflows

Working with a wearable device shifted the focus from individual screens to end-to-end workflows across device, mobile, and backend systems.

Making System States Visible

Connectivity limits and processing delays required designing clear system states so users always knew what the product was doing.

Balancing Control with Flexibility

The system required strict state coordination while still allowing users to navigate freely during recording and analysis.

Cross-Team Collaboration

Close collaboration with hardware, mobile, and ML teams ensured the interaction design aligned with technical constraints and system reliability.