The iOS app shares the same database, same RLS policies, and same data model as the web app. Any database migrations created for the web automatically apply to iOS since both platforms use the same Supabase project.
Stack
| Layer | Technology |
|---|---|
| UI | SwiftUI, iOS 18+ |
| Architecture | MVVM + Repository pattern |
| Backend | Supabase (shared project with web) |
| Auth | Supabase Auth + Keychain token storage |
| Wake word | Picovoice Porcupine (on-device) |
| Speech recognition | Apple SFSpeechRecognizer |
| Text-to-speech | AVSpeechSynthesizer |
| LLM classification | Claude Haiku via Supabase Edge Function |
| Dependencies | Supabase Swift SDK (SPM), Porcupine iOS SDK |
Architecture pattern
| Layer | Responsibility | Rules |
|---|---|---|
| View | UI rendering and user interaction | No business logic. No direct Supabase calls. |
| ViewModel | State management and business logic | @MainActor always. No PostgREST imports. |
| Repository | Data access and Supabase queries | Only layer that imports PostgREST. Receives accessToken and facilityId as init params. |
| Model | Data structures | Plain Codable structs. No business logic. |
- Only Repositories import `PostgREST` — Views and ViewModels must never import it directly
- All ViewModels are `@MainActor` — never use `DispatchQueue.main.async`
- Use `@EnvironmentObject` injection, not `.shared` singletons
- Pass `accessToken` and `facilityId` as init params to repositories
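The init-param rule can be sketched as follows. `CaseRepository` is one of the real repository names, but the body here is illustrative: the actual class presumably also configures the Supabase client, which is omitted to keep the sketch self-contained.

```swift
import Foundation

// Sketch of the repository pattern: data-access objects receive
// accessToken and facilityId at init time instead of reading globals.
struct CaseRepository {
    let accessToken: String
    let facilityId: UUID

    init(accessToken: String, facilityId: UUID) {
        self.accessToken = accessToken
        self.facilityId = facilityId
        // In the real app this would also set the client's auth header for RLS.
    }

    /// Every query is scoped to the repository's facility.
    func scopedFilter() -> String {
        "facility_id=eq.\(facilityId.uuidString.lowercased())"
    }
}
```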
Project structure
Core managers
The app uses several shared managers injected via `@EnvironmentObject`:
AuthManager
Manages the full authentication lifecycle:
- Login/logout with Supabase Auth
- Token refresh — automatic refresh 5 minutes before expiry
- Foreground refresh — handles iOS background suspension gracefully
- Keychain storage — tokens encrypted via iOS Keychain Services
- User profile — role, facility_id, and display information
- Auto-logout — if token refresh fails, returns to login screen
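The "refresh 5 minutes before expiry" rule reduces to simple date math. This is a minimal sketch, not AuthManager's actual API; the real manager presumably schedules a timer at the computed date.

```swift
import Foundation

// Compute when a token refresh should fire: 5 minutes before expiry.
func refreshDate(forExpiry expiry: Date, leeway: TimeInterval = 5 * 60) -> Date {
    expiry.addingTimeInterval(-leeway)
}

// True once the current time has reached the refresh window.
func shouldRefresh(now: Date, expiry: Date) -> Bool {
    now >= refreshDate(forExpiry: expiry)
}
```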
NotificationManager
- Supabase Realtime subscriptions for live case updates
- APNs integration — device token registration and push notification handling
- Device token table — stores tokens in `device_tokens` for server-side push
- Foreground handling — banner + sound + badge for in-app notifications
- Tap routing — opens relevant case or screen from notification
ActiveCaseManager
Tracks the current in-progress case for the floating ActiveCaseBar that appears above the tab bar. Polls on a timer and updates via Realtime events.
Design system
The design system is centralized in `Theme.swift`:
Colors
| Token | Usage |
|---|---|
orbitPrimary (#2563EB) | Primary brand blue |
orbitGreen | Success states, ahead-of-pace |
orbitRed | Error states, behind-pace |
orbitOrange | Warning states, on-pace |
orbitSlate | Neutral text and borders |
roomBackground, roomCard, etc. | Room Mode dark palette |
Device scaling
The app applies a scaling factor on iPad via the `.deviceScaled` and `.scaledSystem()` modifiers on fonts and dimensions. This ensures Room Mode text is readable from across the operating room.
Typography
- Dynamic Type support for accessibility
- Rounded design for titles (`Font.Design.rounded`)
- iPad fonts scaled at 1.5x via device scale factor
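The 1.5x iPad rule can be sketched as a pure function. The function name echoes the `.deviceScaled` modifier above, but the signature is illustrative; the real implementation presumably reads the device idiom itself, while here it is passed in so the logic stays testable off-device.

```swift
// Scale a point size by the device factor: 1.5x on iPad, unchanged on iPhone.
func deviceScaled(_ value: Double, isPad: Bool, factor: Double = 1.5) -> Double {
    isPad ? value * factor : value
}
```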
Data layer
Repositories
All database access flows through repositories. Each repository:
- Receives `accessToken` and `facilityId` in its initializer
- Sets the Supabase client’s auth header for RLS
- Returns typed Swift models (never raw JSON)
- Handles error mapping to `ORbitError` types
| Repository | Key operations |
|---|---|
| CaseRepository | Fetch cases by date/room/status, record milestones, update case fields |
| MilestoneRepository | Get facility milestones (ordered), validate sequence, check recorded status |
| RoomRepository | Room list with case counts, room-case relationships |
| StaffRepository | Staff assignments CRUD, role lookups, staff-case links |
| DelayRepository | Add/remove delays with reason codes |
| ImplantRepository | Implant data and sizing |
| ScoreRepository | Surgeon scorecards from pre-computed table |
Key queries
All queries follow these platform-wide rules:
- Every query filters by `facility_id`
- Milestone joins use `facility_milestone_id` (never `milestone_type_id`)
- Soft-delete tables filter `is_active = true`
- Analytics calculations use median, not mean
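Since "median, not mean" is a platform-wide rule, a standard median helper is worth showing. This is a generic sketch, not the app's actual implementation; even-length arrays average the two middle values.

```swift
// Median of a sample: middle element for odd counts,
// average of the two middle elements for even counts.
func median(_ values: [Double]) -> Double? {
    guard !values.isEmpty else { return nil }
    let sorted = values.sorted()
    let mid = sorted.count / 2
    if sorted.count % 2 == 1 {
        return sorted[mid]
    } else {
        return (sorted[mid - 1] + sorted[mid]) / 2
    }
}
```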
Voice command pipeline
The voice system is the most complex subsystem in the iOS app. It lives entirely in `Features/RoomMode/Voice/` and consists of 8 coordinated services.
Architecture overview
Service breakdown
VoiceCommandService (orchestrator)
The central coordinator (~764 lines). Manages the two-stage flow:
- Configures audio session for simultaneous playback + recording
- Starts Porcupine wake word detection (Stage 1)
- On wake word → starts SFSpeechRecognizer (Stage 2)
- On transcription → routes to parser
- On result → triggers feedback and returns to Stage 1
- Handles microphone permissions, audio route changes, and error recovery
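The two-stage flow above can be sketched as a small state machine. The state and event names are illustrative, not VoiceCommandService's actual types; the key property is that every path eventually returns to Stage 1.

```swift
// Sketch of the wake-word / transcription two-stage loop.
enum VoiceStage: Equatable {
    case wakeWordListening   // Stage 1: Porcupine wake word detection
    case transcribing        // Stage 2: SFSpeechRecognizer
}

enum VoiceEvent {
    case wakeWordDetected
    case transcriptionFinished
    case errorRecovered
}

func nextStage(from stage: VoiceStage, on event: VoiceEvent) -> VoiceStage {
    switch (stage, event) {
    case (.wakeWordListening, .wakeWordDetected):
        return .transcribing
    case (.transcribing, .transcriptionFinished):
        return .wakeWordListening   // result handled, back to Stage 1
    default:
        return .wakeWordListening   // error recovery also falls back to Stage 1
    }
}
```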
WakeWordDetector
Picovoice Porcupine wrapper for the “Orbit” wake word:
- Processes audio frames from the shared `AVAudioEngine`
- Converts 48kHz float samples → 16kHz Int16 (Porcupine’s required format)
- Runs at ~1-2% CPU (vs. 15-25% for always-on speech recognition)
- Bundled model file: `Orbit_en_ios_v4_0_0.ppn`
- No network required — fully on-device
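The 48kHz float → 16kHz Int16 conversion can be sketched naively as decimate-by-3 plus rescale. This is an assumption about the approach, not the detector's real code; production code would low-pass filter before decimating to avoid aliasing.

```swift
import Foundation

// Naive sketch: take every third 48 kHz sample (→ 16 kHz) and
// rescale Float in [-1, 1] to Int16 full range.
func toPorcupineFrame(_ samples48k: [Float]) -> [Int16] {
    stride(from: 0, to: samples48k.count, by: 3).map { i in
        let clamped = max(-1.0, min(1.0, samples48k[i]))
        return Int16(clamped * 32767)
    }
}
```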
VoiceCommandParser
Routes transcriptions through the alias dictionary:
- Normalizes input (lowercase, trim whitespace, collapse multiple spaces)
- Checks `MilestoneAliasDictionary` for a match
- Returns `ParsedCommand` with action type, milestone ID, and confidence
- Returns `.noMatch` if no alias found → triggers LLM slow path
MilestoneAliasDictionary
An in-memory hash map of voice command aliases loaded from the `voice_command_aliases` database table:
- Exact match — direct hash lookup (O(1))
- Contains match — checks if the transcription contains a known alias phrase
- Fuzzy match — Levenshtein distance ≤ 2 for close misses
- Auto-learning — new aliases can be added at runtime when the LLM auto-caches
Normalization pipeline: lowercased → trimmed → collapsed whitespace → stripped punctuation
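The normalization pipeline and the three-tier lookup can be sketched together. This is illustrative: the real dictionary is a hash map keyed by phrase (so exact match is O(1)), while the array scan here just keeps the sketch short; the Levenshtein threshold of ≤ 2 comes from the list above.

```swift
import Foundation

// Normalization: lowercase → trim → collapse whitespace → strip punctuation.
func normalize(_ text: String) -> String {
    text.lowercased()
        .trimmingCharacters(in: .whitespacesAndNewlines)
        .components(separatedBy: .whitespaces)
        .filter { !$0.isEmpty }
        .joined(separator: " ")
        .filter { !$0.isPunctuation }
}

// Standard two-row Levenshtein edit distance.
func levenshtein(_ s: String, _ t: String) -> Int {
    let s = Array(s), t = Array(t)
    if s.isEmpty { return t.count }
    if t.isEmpty { return s.count }
    var prev = Array(0...t.count)
    var curr = [Int](repeating: 0, count: t.count + 1)
    for i in 1...s.count {
        curr[0] = i
        for j in 1...t.count {
            let cost = s[i - 1] == t[j - 1] ? 0 : 1
            curr[j] = min(prev[j] + 1, curr[j - 1] + 1, prev[j - 1] + cost)
        }
        swap(&prev, &curr)
    }
    return prev[t.count]
}

enum AliasMatch: Equatable {
    case exact(String), contains(String), fuzzy(String), noMatch
}

// Three-tier lookup: exact → contains → fuzzy (distance ≤ 2).
func match(_ transcription: String, aliases: [String]) -> AliasMatch {
    let text = normalize(transcription)
    if aliases.contains(text) { return .exact(text) }
    if let hit = aliases.first(where: { text.contains($0) }) { return .contains(hit) }
    if let hit = aliases.first(where: { levenshtein(text, $0) <= 2 }) { return .fuzzy(hit) }
    return .noMatch
}
```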
VoiceLLMClassifier
Calls the `classify-voice-command` Supabase Edge Function when no local alias matches. When the response returns `shouldCache = true` and confidence ≥ 85%, the phrase is automatically saved as a new alias in the `voice_command_aliases` table (with `auto_learned = true`).
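The auto-cache decision is a two-condition check. The struct below is an illustrative stand-in for the Edge Function response, not its actual schema; only the `shouldCache` flag and the 85% threshold come from the description above.

```swift
// Hypothetical shape of the classifier response, for illustration only.
struct ClassifierResult {
    let shouldCache: Bool
    let confidence: Double   // 0.0 ... 1.0
}

// A phrase is saved as a new alias only when the Edge Function says
// shouldCache AND the confidence clears the 85% threshold.
func shouldAutoLearn(_ result: ClassifierResult, threshold: Double = 0.85) -> Bool {
    result.shouldCache && result.confidence >= threshold
}
```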
MilestoneValidation
Validates milestone sequence before recording:

| Result | Condition | Behavior |
|---|---|---|
| Immediate | All prior milestones recorded | Record instantly |
| SkippedWarning | 1 milestone skipped | Record with warning toast |
| OutOfOrder | 2+ milestones skipped | Hold pending — require verbal confirmation (15s timeout) |
| AlreadyRecorded | Milestone has existing timestamp | Hold pending — announce existing time, ask to confirm update |
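The table above reduces to a small decision function. The inputs are simplified to the two facts the table keys on; the enum names mirror the Result column, but the signature is illustrative, not MilestoneValidation's real API.

```swift
enum ValidationResult: Equatable {
    case immediate, skippedWarning, outOfOrder, alreadyRecorded
}

// Map (skipped count, already-recorded flag) to the table's result.
func validate(skippedCount: Int, alreadyRecorded: Bool) -> ValidationResult {
    if alreadyRecorded { return .alreadyRecorded }
    switch skippedCount {
    case 0: return .immediate       // all prior milestones recorded
    case 1: return .skippedWarning  // record with warning toast
    default: return .outOfOrder     // hold pending verbal confirmation
    }
}
```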
VoiceFeedbackService
Text-to-speech engine using `AVSpeechSynthesizer`:
- Supports configurable voice selection (system voices + Siri voices)
- Three feedback levels: Full Verbal, Sounds Only, Silent
- Pauses Stage 2 recognition during TTS playback to prevent self-hearing
- Stage 1 keeps running during TTS — wake word detection is never interrupted
- Plays confirmation chimes via `AVAudioPlayer`
VoiceQueryResponder
Handles non-action intents (`query_time`, `query_duration`, `query_case_info`):
- Formats response text based on case state and recorded milestones
- Returns overlay data for UI display (auto-dismiss after 5 seconds)
- Responds verbally via VoiceFeedbackService
VoiceCommandModels
Type definitions for the voice pipeline.
Command routing
The `RoomModeViewModel` routes parsed commands to their handlers.
Database tables
The voice system uses two database tables:

voice_command_aliases
| Column | Type | Description |
|---|---|---|
id | UUID | Primary key |
facility_id | UUID (nullable) | NULL = global template, UUID = facility-specific |
milestone_type_id | UUID (nullable) | Links to milestone type |
facility_milestone_id | UUID (nullable) | Links to facility milestone |
alias_phrase | text | The spoken phrase to match |
action_type | text | Routing key (record, cancel, undo_last, etc.) |
source_alias_id | UUID (nullable) | Template propagation tracking |
is_active | boolean | Soft delete flag |
auto_learned | boolean | Whether this was auto-learned from LLM |
deleted_at | timestamptz (nullable) | Soft delete timestamp |
Global templates (`facility_id = NULL`) are managed by global admins. Facility-specific aliases override or extend the global set. Each facility loads both global and local aliases.
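The "facility overrides global" load can be sketched as a dictionary merge keyed by phrase. The row struct is a simplified stand-in for the table above, not the app's actual model type.

```swift
// Simplified alias row: nil facilityId means a global template.
struct AliasRow {
    let phrase: String
    let facilityId: String?
    let actionType: String
}

// Merge global and facility aliases; facility rows win on phrase collisions.
func effectiveAliases(global: [AliasRow], facility: [AliasRow]) -> [String: AliasRow] {
    var merged: [String: AliasRow] = [:]
    for row in global { merged[row.phrase] = row }
    for row in facility { merged[row.phrase] = row }  // override
    return merged
}
```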
voice_command_logs
| Column | Type | Description |
|---|---|---|
id | UUID | Primary key |
case_id | UUID | The case this command was spoken during |
command_text | text | Raw transcription |
matched_milestone_id | UUID (nullable) | Which milestone was matched |
confidence_level | text | high, medium, low, none |
outcome | text | recorded, pending, rejected, cancelled, timeout, unrecognized |
source_text | text (nullable) | Original speech before normalization |
Feature parity matrix
| Feature | Web | iOS |
|---|---|---|
| Case management | Full CRUD | View + milestone recording |
| Room status board | Full | Full |
| Surgeon home dashboard | Full | Full |
| Device rep tray tracking | Full | Full (differentiator) |
| ORbit Score / Scorecards | Client-side calculation | Via surgeon_scorecards table |
| Voice commands | N/A | Full (iPad differentiator) |
| Room Mode | N/A | Full (iPad only) |
| Face ID auth | N/A | Full |
| Push notifications | N/A | Full (APNs) |
| Analytics dashboards | 6 views | Not started |
| Block scheduling | Full | Not started |
| Admin features | Full | Not planned for mobile |
Auth flow
- Tokens stored in iOS Keychain (encrypted, hardware-backed)
- Auto-refresh 5 minutes before expiry
- Foreground refresh handles iOS background suspension
- Face ID gates app access when enabled
Shared principles
The iOS app follows the same platform-wide rules as the web app:
- Milestone v2.0 — `facility_milestone_id` is the FK, never `milestone_type_id`
- Median over average — all analytics use median, not mean
- Soft deletes — filter `is_active = true` on soft-delete tables
- Facility scoping — every query filters by `facility_id`
Testing
Tests live in `ORbitTests/` using XCTest:
| Coverage area | Status |
|---|---|
| Repositories | Covered |
| ViewModels | Covered |
| Voice pipeline (VoiceCommandService, VoiceLLMClassifier, etc.) | Gap — highest priority |
| UI / integration | Manual testing via Simulator/device |
Build configuration
| Setting | Value |
|---|---|
| Xcode project | ORbit.xcodeproj |
| Target | ORbit |
| Minimum iOS | 18.0 |
| Swift version | 5.9+ |
| Package manager | Swift Package Manager |
| Dependencies | Supabase Swift SDK, Porcupine iOS SDK |
| Entitlements | Keychain access, push notifications, microphone, speech recognition |
FAQ
Can I test iOS against the same database as web?
Yes. Both apps use the same Supabase project. Point the iOS app’s configuration to your Supabase URL and anon key. RLS ensures data isolation per facility.
How do I add a new voice command action type?
- Add the action type string to the `voice_command_aliases` migration (or insert rows directly)
- Add a `case` handler in `RoomModeViewModel`’s command routing switch
- Seed default aliases for the new action type
- Add the action type to the web settings UI filter (in `VoiceCommandsPageClient.tsx`)
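The routing-switch step above can be sketched as follows. The three action-type strings come from the `voice_command_aliases` table description; the enum and function names are illustrative, not RoomModeViewModel's real members.

```swift
enum RoutedAction: Equatable {
    case record, cancel, undoLast, unknown(String)
}

// Map an alias row's action_type string to a handler case.
// A new action type gets a new case in this switch.
func route(actionType: String) -> RoutedAction {
    switch actionType {
    case "record": return .record
    case "cancel": return .cancel
    case "undo_last": return .undoLast
    default: return .unknown(actionType)
    }
}
```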
Why is the RoomModeViewModel so large?
It orchestrates milestone recording, voice command routing, staff/delay/implant modals, pace calculation, timer management, and cement timer state. Consider extracting voice routing into a dedicated `VoiceCommandRouter` if it grows further.
How does the LLM auto-learn new aliases?
When the `classify-voice-command` Edge Function returns `shouldCache = true` with confidence ≥ 85%, the iOS app inserts a new row into `voice_command_aliases` with `auto_learned = true`. The alias is immediately added to the in-memory dictionary for instant matching on subsequent uses.
Next steps
iOS app guide
User-facing guide to the iOS app’s features and pages.
Room Mode
Full-screen OR dashboard with voice commands.
Architecture
Full technical overview of the ORbit platform.
Data model
Database schema and relationships.