The Body Knows First
The Gap
You have felt this before. Your shoulders are tight for twenty minutes before you notice you are bracing. Your heart rate climbs through a conversation you tell yourself is fine. Your breathing shallows during a meeting and you only register it when someone asks if you are okay.
The body leads. The story follows. There is always a gap.
The gap is measurable. Heart rate variability drops before conscious anxiety arrives. Heart rate rises before excitement registers. Skin conductance shifts before you can name the feeling.
Notice exists to train that skill.
> The most valuable structures in our lives — emotional patterns, somatic intelligence, relational dynamics — are invisible until something makes them navigable.
This is part of a broader thesis I have been developing across two projects. Scholion makes the dependency structure of scientific claims visible and navigable — the hidden load-bearing walls behind published findings. Notice does the same thing for the dependency structure of your inner life: the connections between what your body is doing, what you are feeling, and how you are relating to it all. The full technical essay — You’re Already Feeling Something You Haven’t Noticed Yet — covers the architecture and interaction design in depth. Different domains, same problem. The structures that matter most are the ones you cannot see.
What Notice Is
The core interaction is called a Frame Snap.

The same snap-debrief-reflection loop works across Apple Watch, Garmin, and iPhone — and Oura Ring feeds overnight baseline context into the system without requiring a conscious snap. Three hardware ecosystems, three different transport layers, three different trust boundaries — one capture loop.
Core Interaction
The Frame Snap

| Step | What you do | How |
|---|---|---|
| Tap | Notice a shift | Watch or iPhone |
| Capture | HR, HRV, context | HealthKit + EventKit + CoreLocation |
| Name | Describe, label, intensity | Subjective before objective |
| Reflect | See data + AI reflection | Claude streams contemplative response |
Key Design Decision
You commit your subjective assessment before you see the biometric data. You say "I feel tense" and then discover your HRV is 22ms below your weekly average. Over time, that feedback loop trains calibration between felt sense and physiology.
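The commit-before-reveal ordering can be sketched as a tiny state machine. This is a hypothetical illustration — the type and function names here are not from the Notice codebase — showing the one invariant that matters: the subjective report is locked in first, and only then is the biometric delta revealed.

```swift
import Foundation

// Hypothetical sketch of the commit-before-reveal loop.
// A snap stores the felt-sense report *before* exposing biometrics,
// so the subjective assessment cannot be anchored by the numbers.
struct FeltReport {
    let label: String        // e.g. "tense"
    let intensity: Double    // 0.0 ... 1.0
}

struct SnapSession {
    private(set) var report: FeltReport?
    private let hrvDeltaFromBaseline: Double  // ms below weekly average

    init(hrvDeltaFromBaseline: Double) {
        self.hrvDeltaFromBaseline = hrvDeltaFromBaseline
    }

    // Step 1: commit the subjective assessment.
    mutating func commit(_ r: FeltReport) { report = r }

    // Step 2: only after committing may the biometric context be revealed.
    func revealBiometrics() -> Double? {
        report == nil ? nil : hrvDeltaFromBaseline
    }
}

var session = SnapSession(hrvDeltaFromBaseline: 22.0)
assert(session.revealBiometrics() == nil)   // nothing revealed before commit
session.commit(FeltReport(label: "tense", intensity: 0.7))
assert(session.revealBiometrics() == 22.0)  // now the delta is visible
```

Making the ordering structural rather than a UI convention is what lets calibration be measured honestly over time.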
The debrief is not a form. It is a felt-sense encounter. An emotion picker organized by somatic texture invites you to name what you notice: six groups, three labels each, organized by where you feel them in your body.

The taxonomy is sized to a research sweet spot — enough labels to build emotional granularity, few enough to scan in a charged moment.
Then Claude — Anthropic’s AI — generates a contemplative reflection. Not advice. Not a diagnosis. A mirror. The reflection orients toward relation — how you are meeting your experience — never toward object — what the experience supposedly is.
> “You said calm, but your heart rate variability was lower than usual. That is not a contradiction — it is information. What happens when you hold both?”
This distinction — reflecting on how you relate to experience rather than on what the experience supposedly is — shapes every reflection the system generates.
Claude reflects at three timescales: a brief sentence at snap time — a small act of witnessing; an exploratory paragraph during debrief — an invitation toward curiosity; and daily and weekly synthesis — longitudinal pattern detection that surfaces what you cannot see from inside a single moment. The weekly reflections consume daily syntheses hierarchically, so the AI reasons over compressed patterns rather than re-aggregating raw data. Over weeks and months, these syntheses surface the recurring shapes of your inner life — the patterns you did not know you had.
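The hierarchical consumption can be sketched roughly as follows. The types and names here are illustrative, not the actual Notice code; the point is the shape: raw snaps compress into daily syntheses, and the weekly synthesis reasons only over those compressed summaries.

```swift
import Foundation

// Illustrative sketch of hierarchical synthesis:
// raw snaps -> daily synthesis -> weekly synthesis.
struct Snap { let label: String; let intensity: Double }

struct DailySynthesis {
    let dominantLabel: String
    let meanIntensity: Double
    let snapCount: Int
}

// Compress one day of snaps into a daily synthesis.
func synthesizeDay(_ snaps: [Snap]) -> DailySynthesis {
    let counts = Dictionary(grouping: snaps, by: \.label).mapValues { $0.count }
    let dominant = counts.max { $0.value < $1.value }?.key ?? "none"
    let mean = snaps.isEmpty ? 0 : snaps.map(\.intensity).reduce(0, +) / Double(snaps.count)
    return DailySynthesis(dominantLabel: dominant, meanIntensity: mean, snapCount: snaps.count)
}

// The weekly prompt reasons over seven compressed summaries, never raw data.
func weeklyPromptContext(_ days: [DailySynthesis]) -> String {
    days.enumerated()
        .map { "day \($0.offset + 1): \($0.element.dominantLabel) ×\($0.element.snapCount)" }
        .joined(separator: "; ")
}
```

The compression step is also a privacy boundary: the weekly reflection never needs to see the raw samples again.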
What’s Built
Notice is not a concept. It is a working product in closed beta via TestFlight. The core loop has been built three times across three hardware ecosystems — Apple Watch via WatchConnectivity, Garmin Enduro 3 via Connect IQ Companion SDK over BLE, and Oura Ring 3 via cloud REST API — each with different transport mechanics, different epoch formats, and different trust boundaries.
What made this tractable is the BiometricSnapshot protocol: a single abstraction that normalizes heart rate, HRV, stress score, and body-battery readings from all three sources into one shared shape.
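A sketch of what such a normalizing abstraction might look like. Everything here is hypothetical except the idea itself — the essay does not show the real definition — but it illustrates the kind of per-source work the abstraction hides, such as converting the Garmin epoch (seconds since 1989-12-31 UTC) into a standard `Date`.

```swift
import Foundation

// Hypothetical sketch in the spirit of the BiometricSnapshot protocol:
// each hardware source maps its own units and epoch format into one shape.
struct NormalizedSnapshot {
    let heartRate: Double?      // bpm
    let hrvRMSSD: Double?       // ms
    let capturedAt: Date
    let sourceName: String
}

protocol BiometricSource {
    func latestSnapshot() -> NormalizedSnapshot
}

// Garmin timestamps arrive as seconds since the Garmin epoch
// (1989-12-31T00:00:00Z); converting to Date is one normalization step.
struct GarminSource: BiometricSource {
    let garminEpochSeconds: TimeInterval
    let hr: Double
    let rmssd: Double

    static let garminEpochStart = Date(timeIntervalSince1970: 631_065_600)

    func latestSnapshot() -> NormalizedSnapshot {
        NormalizedSnapshot(
            heartRate: hr,
            hrvRMSSD: rmssd,
            capturedAt: Self.garminEpochStart.addingTimeInterval(garminEpochSeconds),
            sourceName: "garmin"
        )
    }
}
```

Once every source conforms, the rest of the app is written once against `NormalizedSnapshot` rather than three times against three device SDKs.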
Voice-initiated snaps via Siri and AirPods let you capture a moment without looking at a screen. On-device intelligence handles context assembly — HealthKit trends, calendar, location, recent snaps — without any data leaving the device.
Privacy by Architecture
Notice’s privacy model is not a policy. It is an architecture. And with three hardware sources, the question is no longer whether data is private but what path it traveled and what boundaries it crossed.
Architecture
Privacy by design. AI at two scales.

On-Device — The Read
Apple Foundation Models · ~3B parameters · free, offline, no API keys

- HealthKit — HR, HRV, 7-day trends
- EventKit — calendar context ±2h
- CoreLocation — semantic location
- SwiftData — recent snap patterns
- Felt-Sense Interpreter — "tight jaw, buzzy" → suggests Stirred texture group → matching labels

Cloud — The Reflection
Claude API (Sonnet) · streaming SSE · contemplative system prompt

- Longitudinal pattern analysis
- Contemplative reframing
- Scaffolding-aware reflection
- Dam Model vocabulary

What Claude receives:

emotion: "tense" · intensity: 0.7 · description: "tight jaw, shoulders up"
biometric: HR elevated, HRV 22ms below weekly avg
context: "before a work meeting, third similar pattern this week"
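As a rough illustration of the felt-sense interpreter's job — the shipping interpreter runs on the ~3B on-device model, so this keyword sketch is purely hypothetical, including every texture-group name except Stirred, which appears in the essay — the mapping goes free text → texture group → candidate labels:

```swift
import Foundation

// Purely illustrative keyword fallback for the felt-sense interpreter.
// The real interpreter is a ~3B on-device model; this only shows the
// shape of the mapping: free text -> suggested texture group.
enum TextureGroup: String {
    case stirred, heavy, open   // only "Stirred" comes from the essay
}

let cues: [(keywords: [String], group: TextureGroup)] = [
    (["tight", "buzzy", "jittery"], .stirred),
    (["sinking", "dull", "leaden"], .heavy),
    (["spacious", "light", "warm"], .open),
]

func suggestGroup(for description: String) -> TextureGroup? {
    let text = description.lowercased()
    // First cue set with any keyword present wins; nil means "no suggestion".
    return cues.first { cue in
        cue.keywords.contains { kw in text.contains(kw) }
    }?.group
}
```

Returning `nil` rather than forcing a match matters: a suggestion is an invitation, and the user's own naming always takes precedence.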
Three trust paths converge on a single protocol:
Apple Watch — single trust domain. Data travels wrist → phone via WatchConnectivity. No network hop. Raw heart rate, HRV, and accelerometer data never leave the device pairing.
Garmin — single trust domain, different mechanics. Data travels wrist → phone via BLE through the Connect IQ Companion SDK. Different transport, different encoding (Garmin epoch, flat dictionaries), same topological guarantee: no cloud hop.
Oura Ring — cloud hop. Data travels ring → Oura app → Oura Cloud → REST API → iPhone. Authentication via OAuth. The raw data reaches the phone but never touches Notice infrastructure — it is a client-side fetch, topologically equivalent to the user reading their own Oura dashboard.
The SnapBiometricSource protocol normalizes these three paths. Relative descriptor functions — relativeHRV, relativeHrvRMSSD, relativeStressScore, relativeBodyBattery — translate device-specific values into contextual language. By the time data reaches Claude, all three architecturally distinct paths have been reduced to the same format: “HRV lower than your baseline.” “Stress elevated.” Never a raw number. Never a source identifier. The full architectural analysis — how three trust topologies converge on a single protocol — is documented in Trust Topologies.
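The reduction to contextual language might look like this in spirit. The code is hypothetical; only the idea — relative descriptors, with raw numbers never reaching Claude — comes from the essay.

```swift
import Foundation

// Sketch: translate a raw device value into contextual language.
// Claude sees only the phrase, never the number or the source device.
func relativeDescriptor(metric: String, value: Double, baseline: Double,
                        tolerance: Double) -> String {
    let delta = value - baseline
    if abs(delta) <= tolerance { return "\(metric) near your baseline" }
    return delta < 0 ? "\(metric) lower than your baseline"
                     : "\(metric) higher than your baseline"
}

// Oura overnight RMSSD of 38 ms against a personal baseline of 52 ms:
let phrase = relativeDescriptor(metric: "HRV", value: 38, baseline: 52, tolerance: 5)
// phrase == "HRV lower than your baseline" — the raw 38 is never sent to Claude
```

Because all three device paths collapse into the same phrasing, the cloud boundary carries the minimum information needed for reflection and nothing more.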
A stateless Cloudflare Worker proxy holds the API key server-side and validates device identity via Apple's App Attest before any reflection request reaches Claude.
This is not just a privacy choice. It is a regulatory strategy. Because Notice surfaces trends and invitations rather than diagnoses or raw clinical numbers, the architecture is designed to keep it in general-wellness territory rather than medical-device territory.
What the Science Says
Three research threads converge in the product.
Affect Labeling
The act of putting feelings into words is not just expressive. It is neurologically active. Affect labeling activates the right ventrolateral prefrontal cortex and downregulates amygdala reactivity — a specific regulatory mechanism distinct from cognitive reappraisal or suppression. The emotion picker is an affect labeling intervention. Every Frame Snap is a micro-dose of this technique.
Emotional Granularity
Barrett’s research shows that people who make finer distinctions between emotional states — distinguishing irritated from frustrated from exasperated — demonstrate better emotion regulation. This is the capacity the six-group taxonomy is built to train.
Interoceptive Lead Time
This is the metric I am most excited about, and as far as I can tell, no one else is measuring it.
Notice already collects both data streams: continuous biometric samples from HealthKit running in the background, and discrete conscious reports from Frame Snaps — the moment you notice a shift. With multiple devices, the system now tracks the interval between the first detectable physiological shift and the moment you consciously report it: your interoceptive lead time.
That is a training outcome you can feel — not a score on a dashboard, but a mirror that shows you something true about your own development. If interoceptive lead time reduction correlates with established measures of interoceptive accuracy, the metric itself becomes validatable.
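One way to operationalize the metric — an illustrative sketch, not the production algorithm: find the deviation run from baseline that is still in progress at snap time, and report how long the body led the mind.

```swift
import Foundation

// Illustrative lead-time computation: given timestamped HRV samples and
// the moment of a conscious Frame Snap, find when physiology first
// deviated from baseline and report the gap.
struct Sample { let t: TimeInterval; let hrv: Double }   // t in seconds

func interoceptiveLeadTime(samples: [Sample], snapAt: TimeInterval,
                           baseline: Double, threshold: Double) -> TimeInterval? {
    // Track the start of the deviation run still in progress at snap time.
    var runStart: TimeInterval? = nil
    for s in samples where s.t <= snapAt {
        if abs(s.hrv - baseline) >= threshold {
            if runStart == nil { runStart = s.t }   // deviation begins
        } else {
            runStart = nil                          // back to baseline; reset
        }
    }
    return runStart.map { snapAt - $0 }
}

let samples = [Sample(t: 0, hrv: 55), Sample(t: 60, hrv: 40),
               Sample(t: 120, hrv: 38), Sample(t: 180, hrv: 36)]
// Baseline 55 ms, threshold 10 ms: deviation begins at t = 60.
// A snap at t = 300 means the body led by 240 seconds.
```

A shrinking value of this gap over weeks of practice is exactly the training outcome described above.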
An App That Gets Quieter
Most apps optimize for engagement. More time on screen. More sessions. More data. Notice is designed to do the opposite.
- Full support: reflections after every snap, active felt-sense suggestions, full biometric context. This is the current app.
- Reduced: reflections shift to on-demand, suggestions fade, biometric display simplifies to trend arrows.
- Minimal: no automatic reflections; the app becomes a quiet archive you consult when you choose to.

Your interoceptive capacity is the primary instrument. Notice is documentation.
Phase transitions are triggered by behavioral signals — snap count thresholds, vocabulary stabilization, biometric-label convergence — and confirmed by the user. The app never decides for you that you are ready. It notices, and invites.
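The trigger-then-confirm pattern could be sketched like this. The thresholds and names are hypothetical; the essay only names the signal categories and the invariant that the user confirms every transition.

```swift
import Foundation

// Sketch of scaffolding decay: behavioral signals can *propose* a
// quieter phase, but only the user's confirmation enacts it.
enum Phase: Int { case fullSupport, reduced, minimal }

struct BehavioralSignals {
    let totalSnaps: Int
    let vocabularyStabilized: Bool       // label choices no longer churning
    let biometricLabelConvergence: Bool  // felt sense tracking physiology
}

func proposedPhase(current: Phase, signals: BehavioralSignals) -> Phase? {
    // Hypothetical thresholds — illustrative only.
    let ready = signals.totalSnaps >= 100
        && signals.vocabularyStabilized
        && signals.biometricLabelConvergence
    guard ready, let next = Phase(rawValue: current.rawValue + 1) else { return nil }
    return next   // an invitation, not a decision
}

func transition(current: Phase, signals: BehavioralSignals,
                userConfirmed: Bool) -> Phase {
    guard userConfirmed, let next = proposedPhase(current: current, signals: signals)
    else { return current }   // the app never decides for you
    return next
}
```

Separating `proposedPhase` from `transition` encodes the design commitment directly: detection and consent are different functions.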
This is a genuine commercial bet. An app that trains its users to need it less sounds counterintuitive in an industry built on retention metrics. But the value of Notice is not in the screen time it captures. It is in the capacity it builds. Users do not churn because they are bored. They graduate because they have developed the skill. And the on-device AI model that personalizes to their practice over months creates a switching cost no competitor can replicate.
The On-Device Future
The Claude API is powerful, but it introduces ongoing cost, latency, and a data pathway outside the phone. The north star is every reflection tier running locally. No API dependency. No data disclosure. No marginal cost per reflection.
Timeline: on-device brief reflections are days away from shipping. Exploratory reflections on-device are plausible within weeks. Daily and weekly synthesis may require a generation of model capability improvement. The cloud API remains available as a fallback throughout.
The Market
Three product categories surround the space Notice occupies. None connect all three layers.
Product Landscape
Three categories. None training the underlying capacity.

- Meditation Apps — Headspace, Calm, Waking Up
- Mood Trackers — How We Feel, Daylio, Bearable
- Health Platforms — Oura, WHOOP, Apple Health
- Notice — all three, plus AI reflection
| Capability | Meditation | Mood | Health | Notice |
|---|---|---|---|---|
| Biometric sensing | — | — | ✓ | ✓ |
| Emotion labeling | — | ✓ | — | ✓ |
| Contemplative grounding | ✓ | — | — | ✓ |
| AI reflection | — | — | — | ✓ |
| Subjective-first | — | — | — | ✓ |
Calm and Headspace own guided meditation. WHOOP and Oura own biometric tracking. How We Feel and Daylio own mood logging. Rosebud owns AI journaling. None connect all three layers — and none support multi-device biometric integration. WHOOP reads only WHOOP. Oura reads only Oura. Calm and Headspace do not read biometrics at all. Notice works across Apple Watch, Garmin, and Oura Ring with unified AI reflection, which means users are not locked to one hardware ecosystem and the addressable market is the union of all three device populations.
The window to establish this position is narrowing. Calm is adding HRV biofeedback. Headspace is integrating with Oura Ring. WHOOP is publishing peer-reviewed mental health research and adding journal prompts. Each competitor is extending toward the triad from their corner. None have arrived yet, but the trajectories are visible. Multi-device support is a structural moat: each new integration widens the addressable market and increases the architectural distance competitors must cross to match it. Time-to-market for the core loop matters more than feature completeness.
Pricing is premium, anchored against WHOOP and Oura rather than meditation apps.
The Larger Vision
Notice’s trajectory follows a natural widening of the aperture of awareness — mirroring the developmental arc of contemplative practice itself.
Individual Interoception — Now
The current product. You learn to read your own internal states: noticing shifts, labeling felt sense, seeing patterns in how you relate to experience. The mirror faces inward.
Relational Attunement — Later
Co-regulation expands the mirror to face the space between two people. The full research program is documented in What If You Could Feel Someone Breathing From Across the Room. The research basis is physiological synchrony — the documented tendency of heart rate and respiration to entrain between people in close interaction.
The research program is staged with kill criteria at each phase. Phase 0: can the hardware extract breathing rate within ±2 BPM? Phase 1: a sham-controlled study with 30 dyads to distinguish real entrainment from placebo. Phase 2: at-home ecological validation. Phase 3: regulatory pathway assessment.
Collective Field Awareness — Speculative
The furthest horizon. If dyadic co-regulation works, the same architecture extends to small groups — a meditation sangha, a therapy group, a team. Group physiological coherence is measurable. Notice could surface how a collective field forms and dissolves, who anchors it, how individual states propagate. This is genuinely speculative. But it is the logical terminus of the thesis: making invisible structures visible and navigable, applied to the most invisible structure of all — the felt sense of being in a room together.
Each expansion is gated by the one before it.
Each expansion widens the moat.
What Comes Next
The on-device reflection model is the key technical milestone. Moving brief reflections on-device eliminates the largest cost center, makes the Core tier economically viable at zero marginal cost, and delivers the privacy promise in its strongest form.
For Apple Watch and Garmin snaps with on-device reflections, nothing leaves your phone — the entire path from wrist to insight stays within a single trust domain. For Oura baseline context, the data transits Oura’s cloud but never touches Notice infrastructure. The privacy guarantee is topological, not absolute: you can trace exactly which boundaries each datum crossed.
Notice is built on a conviction I keep returning to: the most important thing you can learn is how to read your own experience with honesty and precision. Not to fix it. Not to optimize it. To see it clearly enough that you can choose how to respond rather than being carried by reflex.
The technology is in service of that learning. A temporary scaffold that builds a permanent capacity. The biometric data is a mirror for the body you already have. The emotion label is a frame for the feeling you already feel. The AI reflection is an invitation to look at what you already know but have not yet noticed.
The app gets quieter as you get better. That is the design.
APPENDICES
| | Core | Full |
|---|---|---|
| Price | ~$80/year | $149–199/year |
| Reflection tiers | Brief (on-device) | Brief + Exploratory + Daily + Weekly synthesis |
| AI runtime | On-device model (zero marginal cost) | Claude API (cloud) |
| Biometric context | Basic | Full enrichment via Foundation Models |
| Anchor comparison | Oura ($70/yr + $300 hardware) | WHOOP ($239/yr) |
.brief reflections: on-device primary, cloud fallback. .exploratory: cloud primary for now, on-device plausible for 3B models as training data accumulates. .daily and .weekly synthesis: always cloud — requires analytical reasoning over variable-length sequences that exceeds small-model capability. On-device .brief covers ~80% of API calls.

| Week | Milestone | Key Deliverables |
|---|---|---|
| 1 | Foundation & Feedback | Validate Foundation Models on physical hardware (interpreter <1s, context assembly <3s). Incorporate beta taxonomy feedback. Deploy Cloudflare Worker proxy with App Attest. Push new TestFlight build (sessions 11–15). Begin synthetic training corpus generation. |
| 2 | On-Device & Measurement | Ship on-device brief reflections via llama.cpp or MLX (runtime fork resolved by benchmark). Implement scaffolding decay phase 1 triggers. Begin measuring interoceptive lead time across beta cohort. Fix keyboard drawer default on debrief screen. |
| 3+ | Launch Preparation | Expand beta beyond Jhourney (QS/Reddit channels). Launch notice.tools landing page. Finalize App Store metadata and screenshots (FDA-compliant language). Privacy settings view. App Store submission. |