Investor Brief
The one-page case for interoceptive training as a product category.
Your body registers emotional shifts before your conscious mind catches up. HRV drops before anxiety. Heart rate climbs before excitement. Skin conductance shifts before you can name the feeling.
This gap is where your relationship to experience gets decided. Miss the shift, and the reaction runs on autopilot — you're inside it before you know it's happening. Catch it, and you can meet what's arising rather than being moved by it. That single difference — noticing before versus after — is what separates reacting from responding. Better regulation, clearer decisions, more skillful relationships: all downstream of closing the gap. No product trains this skill. No product measures the change.
Frame Snap — tap your Apple Watch when you notice a shift. The app captures biometric context, you label the felt sense, and an AI trained in contemplative vocabulary reflects back what the pattern might mean. Four steps: Tap → Capture → Name → Reflect.
Tap — Notice a shift (Watch or iPhone)
Capture — HR, HRV, context (HealthKit + EventKit + CoreLocation)
Name — Describe, label, intensity (subjective before objective)
Reflect — See data + AI reflection (Claude streams contemplative response)
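The four-step loop can be sketched as a minimal record type. This is an illustrative Python sketch; the field names, ranges, and the `name` helper are assumptions, not the shipping schema:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Snap:
    """One Tap -> Capture -> Name -> Reflect cycle (hypothetical schema)."""
    tapped_at: datetime      # Tap: the moment the user notices a shift
    hr_bpm: float            # Capture: heart rate at the tap
    hrv_ms: float            # Capture: HRV at the tap
    context: str             # Capture: calendar/location summary
    label: str = ""          # Name: chosen emotion label
    intensity: float = 0.0   # Name: felt intensity, clamped to 0.0-1.0
    reflection: str = ""     # Reflect: streamed AI response, filled in last

    def name(self, label: str, intensity: float) -> None:
        # Subjective labeling happens before the user sees any numbers.
        self.label = label
        self.intensity = max(0.0, min(1.0, intensity))

snap = Snap(tapped_at=datetime(2025, 1, 6, 9, 0), hr_bpm=88, hrv_ms=28,
            context="before a work meeting")
snap.name("tense", 0.7)
```

The `name` step runs before any biometric numbers are shown, matching the "subjective before objective" ordering above.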
Interoceptive Lead Time (ILT): the temporal gap between a biometric shift and the moment the user consciously notices it. Training shrinks this gap; the user notices sooner. ILT gives Notice a peer-reviewable outcome metric: the first objective measure of interoceptive skill gain, one that can be validated against MAIA-2 self-report assessments.
Three trust paths, one protocol. Apple Watch data stays entirely on-device via WatchConnectivity. Garmin connects directly over BLE — raw data never leaves the phone. Oura data transits Oura's cloud but is fetched client-side and never re-uploaded. All paths converge through relative descriptor functions — the AI sees patterns, never raw biometrics. Privacy by architecture, not policy.
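"Relative descriptor functions" can be as simple as mapping raw readings to comparative phrases before anything leaves the device. A minimal sketch; the thresholds are illustrative assumptions:

```python
def hr_descriptor(hr_bpm: float, weekly_avg_bpm: float) -> str:
    """Describe heart rate relative to the user's own baseline; never expose raw values."""
    ratio = hr_bpm / weekly_avg_bpm
    if ratio >= 1.10:   # assumed threshold for "elevated"
        return "HR elevated"
    if ratio <= 0.90:
        return "HR lowered"
    return "HR near baseline"

def hrv_descriptor(hrv_ms: float, weekly_avg_ms: float) -> str:
    """Describe HRV as a signed offset from the weekly average."""
    delta = round(weekly_avg_ms - hrv_ms)
    if delta > 0:
        return f"HRV {delta}ms below weekly avg"
    if delta < 0:
        return f"HRV {-delta}ms above weekly avg"
    return "HRV at weekly avg"
```

The AI prompt then carries phrases like "HR elevated, HRV 22ms below weekly avg" rather than the readings themselves, which is what makes the three trust paths converge.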
Architecture
Privacy by design. AI at two scales.
On-Device — The Read
Apple Foundation Models · ~3B parameters · Free, offline, no API keys
HealthKit — HR, HRV, 7-day trends
EventKit — calendar context ±2h
CoreLocation — semantic location
SwiftData — recent snap patterns
Felt-Sense Interpreter — "Tight jaw, buzzy" → suggests Stirred texture group → matching labels
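A keyword-overlap sketch of the Felt-Sense Interpreter. Only "Stirred" appears in the brief; the second texture group, the keyword sets, and the label lists are invented for illustration:

```python
TEXTURE_GROUPS = {
    # Keyword and label sets are hypothetical examples.
    "Stirred": ({"tight", "buzzy", "jittery", "clenched"}, ["tense", "anxious", "activated"]),
    "Settled": ({"heavy", "warm", "soft", "slow"},         ["calm", "content", "drowsy"]),
}

def interpret(description: str) -> tuple[str, list[str]]:
    """Match a free-text felt-sense description to the texture group with the most keyword hits."""
    words = set(description.lower().replace(",", " ").split())
    group = max(TEXTURE_GROUPS, key=lambda g: len(words & TEXTURE_GROUPS[g][0]))
    return group, TEXTURE_GROUPS[group][1]

group, labels = interpret("Tight jaw, buzzy")
```

A production interpreter would run on the on-device foundation model rather than keyword overlap; the shape of the mapping (description → texture group → candidate labels) is the point here.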
Cloud — The Reflection
Claude API (Sonnet) · Streaming SSE · Contemplative system prompt
Longitudinal pattern analysis
Contemplative reframing
Scaffolding-aware reflection
Dam Model vocabulary
What Claude receives
emotion: "tense" · intensity: 0.7 · description: "tight jaw, shoulders up"
biometric: HR elevated, HRV 22ms below weekly avg
context: "before a work meeting, third similar pattern this week"
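Assembling that context is a small pure function: subjective fields first, relative descriptors instead of raw readings. The field names mirror the example above; the function itself is an assumed sketch, not the shipping code:

```python
def claude_context(emotion: str, intensity: float, description: str,
                   hr_desc: str, hrv_desc: str, situation: str) -> dict:
    """Build the reflection payload: relative descriptors only, no raw biometrics."""
    return {
        "emotion": emotion,            # subjective, named before data is shown
        "intensity": intensity,
        "description": description,
        "biometric": f"{hr_desc}, {hrv_desc}",  # descriptor strings, never bpm/ms readings
        "context": situation,
    }

payload = claude_context(
    emotion="tense", intensity=0.7, description="tight jaw, shoulders up",
    hr_desc="HR elevated", hrv_desc="HRV 22ms below weekly avg",
    situation="before a work meeting, third similar pattern this week",
)
```

Because the payload is built from descriptor strings, nothing sent over SSE can reconstruct the user's raw heart data.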
Meditation apps teach awareness but ignore the body. Health trackers measure the body but don't build awareness. Mood journals ask you to reflect but only after the fact. Notice connects all three: real-time biometric sensing, somatic awareness training, and AI-assisted reflection. No existing product occupies this intersection.
Product Landscape
Three categories. None training the underlying capacity.
Meditation Apps — Headspace, Calm, Waking Up
Mood Trackers — How We Feel, Daylio, Bearable
Health Platforms — Oura, WHOOP, Apple Health
Notice — all three, plus AI reflection
Capability               Meditation   Mood   Health   Notice
Biometric sensing        —            —      ✓        ✓
Emotion labeling         —            ✓      —        ✓
Contemplative grounding  ✓            —      —        ✓
AI reflection            —            —      —        ✓
Subjective-first         —            —      —        ✓
On-device personalization via LoRA fine-tuning on the user's own patterns. No API dependency for the core reflection loop. Training data never leaves the device. Each user's model improves on their own snaps; no one's data trains anyone else's model. Privacy and personalization compound into a moat that deepens with every snap.
Individual → Relational → Collective. Notice begins with personal interoceptive training. It expands to relational awareness — shared somatic states in couples, teams, therapeutic dyads. The long arc: a collective interoceptive commons. The infrastructure to make this progression possible is the same at every scale: sense, name, reflect, integrate.