Now

Updated 2026-02-26T12:00:00Z

Notice

I built a biofeedback app for Apple Watch + iPhone that helps you notice internal state shifts. You tap when something shifts, the app captures your heart rate and HRV, you label what you're feeling using a felt-sense taxonomy grounded in affect labeling research, and Claude generates contemplative reflections on your patterns over time. It's in TestFlight now with a growing beta group from the Jhourney contemplative community.
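Concretely, one tapped "shift" boils down to a small timestamped record. A minimal sketch of what that capture might look like — the type and field names (`ShiftEvent`, `hrv_sdnn_ms`, and so on) are illustrative assumptions, not the app's actual schema:

```python
from dataclasses import dataclass, asdict

@dataclass
class ShiftEvent:
    """Hypothetical record for one tapped shift (illustrative fields)."""
    timestamp: str         # ISO 8601 time of the tap
    heart_rate_bpm: float  # heart rate at the moment of the tap
    hrv_sdnn_ms: float     # HRV (SDNN) in the window around the tap
    felt_sense_label: str  # label chosen from the felt-sense taxonomy

# One example event, with invented values.
event = ShiftEvent(
    timestamp="2026-02-26T12:00:00Z",
    heart_rate_bpm=62.0,
    hrv_sdnn_ms=48.5,
    felt_sense_label="softening",
)

print(asdict(event)["felt_sense_label"])  # -> softening
```

The key property is that the physiology and the label are captured together at the moment of the tap, which is what makes the later pattern reflections possible.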

The next frontier: replacing the cloud AI with a fine-tuned model running entirely on-device using Apple's MLX framework and mlx-swift. Train a small open-weights model (Llama 3.2 3B, LoRA fine-tuned) on high-quality contemplative reflections, quantize it, and run inference on the iPhone GPU — no server, no API calls, nothing leaves your phone.
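For the fine-tuning step, MLX's LoRA trainer (`mlx_lm.lora`) consumes JSONL training files; a common format is one `{"text": ...}` object per line. A hedged sketch of building one training example — the session/reflection pairing and its wording are invented for illustration, and the exact data format accepted by your mlx-lm version should be checked against its docs:

```python
import json

# One assumed fine-tuning example: session context plus the target
# reflection, packed into the JSONL "text" format.
example = {
    "text": (
        "Session: HR 62 bpm, HRV 48 ms, label: softening.\n"
        "Reflection: Notice how the softening arrived before you named it; "
        "the body often settles first."
    )
}

# Write a single-line JSONL file and reload it to confirm it round-trips.
with open("train.jsonl", "w") as f:
    f.write(json.dumps(example) + "\n")

with open("train.jsonl") as f:
    loaded = json.loads(f.readline())

print(loaded["text"].startswith("Session:"))  # -> True
```

After training on a file of such examples, the usual path is to fuse the LoRA adapter into the base weights and quantize the result for on-device inference.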

The real unlock beyond privacy is on-device LoRA adaptation: a model that learns your phenomenological vocabulary, your somatic patterns, your relationship to experience over time. Not a generic wellness chatbot — a contemplative mirror that gets more precise the longer you practice with it.

Still working through where the quality boundary falls between what a 3B model can handle (moment-level reflections) and what still needs a larger model (weekly pattern synthesis across dozens of sessions). Likely a hybrid for now, with fully on-device as the long-term goal.
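The hybrid split above can be sketched as a simple router: moment-level reflections go to the on-device 3B model, while synthesis across many sessions goes to the larger cloud model. The names and the session-count threshold here are illustrative assumptions, not a measured quality boundary:

```python
# Assumed cutoff for how much context the small model handles well.
ON_DEVICE_SESSION_LIMIT = 5

def choose_backend(task: str, session_count: int) -> str:
    """Route a reflection task to the on-device or cloud model (sketch)."""
    if task == "moment_reflection" and session_count <= ON_DEVICE_SESSION_LIMIT:
        return "on-device-3b"
    return "cloud-large"

print(choose_backend("moment_reflection", 1))   # -> on-device-3b
print(choose_backend("weekly_synthesis", 40))   # -> cloud-large
```

Wherever the real boundary lands, a router like this lets the on-device share grow over time without changing the app's interface.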