Case Study / March 2026

Brain Link
Building a Personal AI

From concept to a voice-first assistant that lives on your phone — thinks with Claude, remembers with vectors, and acts on the real world.

React Native Claude Sonnet Supabase Expo SDK 55
01 / 10
The Problem

Your phone has a supercomputer.
It still can't think for you.

🚫

Siri / Google can't reason

Stock assistants handle timers and weather. Ask anything requiring real thought and they redirect you to a search page.

🔒

ChatGPT has no memory

Every conversation starts cold. It doesn't know your projects, your decisions, or what you told it yesterday.

Nothing connects your life

Your email, calendar, contacts, and notes live in separate apps. No assistant ties them together into action.

02 / 10
The Vision

What if your phone had a brain?

An AI that knows you, remembers everything you've told it, can read your email, check your calendar, search the web, browse websites, and take action — all through voice.

Voice
First interface. Tap and talk.
Memory
Persistent. It remembers you.
Tools
Email. Calendar. Web. Contacts.
Local
Runs on your phone. Your data.
03 / 10
Tech Stack

How it's built

App Framework

Expo SDK 55 + React Native 0.83 — TypeScript throughout. Single codebase, native Android build via local Gradle. Registered as the system assistant app on Samsung S24 Ultra.

🧠

AI Engine

Claude Sonnet via the Anthropic API with a tool-use loop (up to 8 iterations). The brain that reasons, plans, and decides when to call tools versus simply responding.
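The loop itself is simple: call the model, run whatever tool it asks for, feed the result back, and stop after a fixed number of rounds. A minimal sketch of the pattern, with a stubbed model in place of the Anthropic API (types and names are illustrative, not Brain Link's actual chat-engine.ts):

```typescript
// Minimal bounded tool-use loop. `Model` stands in for an Anthropic
// API round-trip; tool names and shapes are illustrative only.
type ToolCall = { name: string; input: Record<string, unknown> };
type ModelTurn =
  | { type: "text"; text: string }
  | { type: "tool_use"; call: ToolCall };
type Model = (history: string[]) => ModelTurn;

const MAX_ITERATIONS = 8; // hard cap, as in the case study

function runToolLoop(
  model: Model,
  tools: Record<string, (input: Record<string, unknown>) => string>,
  userMessage: string
): string {
  const history: string[] = [`user: ${userMessage}`];

  for (let i = 0; i < MAX_ITERATIONS; i++) {
    const turn = model(history);
    // The model chose to just answer: we're done.
    if (turn.type === "text") return turn.text;

    // The model requested a tool: run it and feed the result back.
    const tool = tools[turn.call.name];
    const result = tool
      ? tool(turn.call.input)
      : `unknown tool: ${turn.call.name}`;
    history.push(`tool(${turn.call.name}): ${result}`);
  }
  return "Reached the tool-iteration limit without a final answer.";
}
```

The cap matters: without it, a confused model can ping-pong between tools forever on a phone battery.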

🌐

Web Search & Browsing

Perplexity Sonar for live web queries. Desktop gateway for full browser automation via Playwright on the host machine.

🎤

Voice Pipeline

expo-speech-recognition (on-device STT) → Claude → expo-speech (on-device TTS). Zero cloud latency on the voice layer. Echo-safe interruption support.
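The turn structure can be sketched with the platform pieces injected as plain functions (the real expo-speech-recognition and expo-speech APIs differ; this shows only the flow, including stopping TTS before replying so the mic never re-captures the assistant's own voice):

```typescript
// One voice turn: STT -> Claude -> TTS. The platform modules are
// injected as an interface so the flow is testable; the actual
// on-device APIs are assumed wrappers, not shown here.
type VoiceIO = {
  transcribe: () => Promise<string>;        // on-device STT
  think: (text: string) => Promise<string>; // Claude round-trip
  speak: (text: string) => Promise<void>;   // on-device TTS
  stopSpeaking: () => void;                 // echo-safe interruption
};

async function voiceTurn(io: VoiceIO): Promise<string> {
  const heard = await io.transcribe();
  // Cut off any in-progress speech before answering, so a barge-in
  // from the user never overlaps with the assistant's own audio.
  io.stopSpeaking();
  const reply = await io.think(heard);
  await io.speak(reply);
  return reply;
}
```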

🐘

Memory System

Supabase + pgvector (Open Brain). Thoughts stored as vectors via Ollama nomic-embed-text running on a DigitalOcean droplet. Semantic search on every query.
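In production the ranking is a pgvector distance query inside Supabase; the same idea in plain TypeScript, using cosine similarity over in-memory vectors, looks like this (a sketch of the retrieval step, not Open Brain's actual code):

```typescript
// Semantic memory search: embed the query, rank stored thoughts by
// cosine similarity, return the top K. In Brain Link this ranking
// happens inside Postgres via pgvector; this is the in-memory analogue.
type Thought = { text: string; embedding: number[] };

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

function searchBrain(query: number[], thoughts: Thought[], topK = 3): Thought[] {
  return [...thoughts]
    .sort(
      (x, y) =>
        cosineSimilarity(query, y.embedding) -
        cosineSimilarity(query, x.embedding)
    )
    .slice(0, topK);
}
```

The embedding itself (nomic-embed-text via Ollama) happens server-side; the app only ships vectors and gets ranked text back.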

👤

Google Integration

Native Google Sign-In with OAuth2 scopes for Gmail (read/send/compose), Calendar (read/events), and Contacts (People API).
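A plausible scope set for the access described above, using Google's standard OAuth2 scope URLs (the exact scopes Brain Link requests may differ):

```typescript
// Candidate OAuth2 scopes matching "Gmail read/send/compose,
// Calendar read/events, Contacts" - illustrative, not confirmed
// against the app's actual sign-in configuration.
const GOOGLE_SCOPES = [
  "https://www.googleapis.com/auth/gmail.readonly",    // read email
  "https://www.googleapis.com/auth/gmail.send",        // send email
  "https://www.googleapis.com/auth/gmail.compose",     // draft replies
  "https://www.googleapis.com/auth/calendar.readonly", // read calendar
  "https://www.googleapis.com/auth/calendar.events",   // create events
  "https://www.googleapis.com/auth/contacts.readonly", // People API lookups
];
```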

04 / 10
Architecture

System Overview

🎤 Voice Input (expo-speech-recognition) → Brain Link App (Expo + React Native) → 🔈 Voice Output (expo-speech)

Brain Link App ←→ Claude Sonnet (Anthropic API) ←→ Tool Router (chat-engine.ts, max 8 iterations)

Tool Router dispatches to:
search_brain → Supabase pgvector
capture_thought → Open Brain
web_search → Perplexity Sonar
web_browse → Desktop Gateway
read_email / send_email → Gmail API
check_calendar → Google Calendar
search_contacts → People API
read_sms → Native Module

Backing services:
Supabase (brain.lock28.com)
Ollama Embeddings (nomic-embed-text)
Cloudflare Tunnel (brainlink.mrqwen.us)
05 / 10
Use Cases

What Brain Link actually does

💬 Context-Aware Conversation

Moe "What did I decide about the auth approach last week?"
Brain "You decided to use native Google Sign-In with OAuth2 instead of expo-auth-session. The reason was that the native SDK handles token refresh better and doesn't require a redirect URI."

📧 Email Triage

Moe "Check my email for anything urgent from today."
Brain "You have 3 new emails today. One from a client about a solar panel diagnostic — they want the report by Thursday. Want me to draft a reply?"

🌐 Live Research

Moe "What's the latest on Expo SDK 56?"
Brain "Expo SDK 56 hasn't been released yet. SDK 55 is current, launched February 2026. The roadmap mentions improved native module support and faster build times for the next version."
06 / 10
Use Cases (continued)

Beyond conversation

📅

Smart Scheduling

"Am I free tomorrow at 2?" → Checks Google Calendar, finds conflicts, suggests alternatives, and can book directly.

💡

Thought Capture

Brain Link silently captures important decisions and insights to Open Brain. Weeks later, you can ask "what did I decide about X?" and get an answer.

📱

SMS & Contacts

"Read my last texts from Sarah" — accesses SMS via a custom native Expo module. Search contacts via Google People API.

🔧

Artifact Preview

"Create a quick landing page mockup" — generates HTML, renders it in an in-app WebView modal with an Export button. Visual output, not just text.
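Before rendering, the model's HTML fragment has to become a complete document for the WebView. A minimal sketch of that wrapping step (the helper name is assumed, not taken from the app, and a real version would also escape the title):

```typescript
// Wrap a model-generated HTML fragment into a full document so the
// in-app WebView renders it with sensible mobile defaults.
// Illustrative helper - not Brain Link's actual artifact code.
function wrapArtifact(fragmentHtml: string, title = "Artifact"): string {
  return [
    "<!DOCTYPE html>",
    '<html><head><meta charset="utf-8">',
    '<meta name="viewport" content="width=device-width, initial-scale=1">',
    `<title>${title}</title></head>`,
    `<body>${fragmentHtml}</body></html>`,
  ].join("\n");
}
```

The resulting string is what gets handed to the WebView as inline HTML, and what the Export button writes to disk.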

🖥

Web Browsing

Routes through a local desktop gateway to control Chrome via Playwright. Can read full pages, fill forms, and extract data from any website.
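The phone-to-desktop contract can be sketched as a small request/response pair, with the Playwright-backed browse function injected on the desktop side (field names here are assumptions, not the gateway's real schema):

```typescript
// Sketch of the gateway contract between the phone and the desktop.
// On the desktop, `browse` would be implemented with Playwright;
// here it is injected so the dispatch logic stands alone.
type BrowseRequest = {
  url: string;
  action: "read" | "extract";
  selector?: string; // only meaningful for "extract"
};
type BrowseResult = { ok: boolean; content: string };

async function handleBrowse(
  req: BrowseRequest,
  browse: (url: string, selector?: string) => Promise<string>
): Promise<BrowseResult> {
  try {
    // A failed navigation or missing selector surfaces as ok: false
    // rather than crashing the gateway.
    return { ok: true, content: await browse(req.url, req.selector) };
  } catch (e) {
    return { ok: false, content: String(e) };
  }
}
```

Keeping the browser on the desktop means the phone never runs Chromium; it just sends small JSON requests over the tunnel.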

🔔

Assistant Trigger

Registered as the Android assistant — double-press the side button to launch. No unlocking, no app drawer. Instant access.

07 / 10
Development Journey

Built in days, not months

Day 1 — March 13, 2026
Voice + Claude
Basic voice loop: tap mic, transcribe speech, send to Claude Haiku, speak response with ElevenLabs TTS. First working prototype.
Day 1 — March 13
Memory & Open Brain
Connected Supabase + pgvector for semantic memory. Auto-capture thoughts. Morning briefing system from GitHub repo.
Day 2 — March 14
Android Assistant
Custom Expo plugin to register as Android assistant. Side button double-press launches Brain Link. Standalone APK with bundled JS.
Day 3 — March 15
Google Integration
Native Google Sign-In with Gmail, Calendar, Contacts. Switched to on-device TTS/STT. SMS native module. Voice interrupt support.
Day 3 — March 15
Browsing + Artifacts
Desktop gateway for full web browsing. HTML artifact preview in WebView. Upgraded to Claude Sonnet. Tool loop extended to 8 iterations.

Built with Claude Code

Every line of Brain Link was written in collaboration with Claude Code. Architecture decisions, debugging, native module creation, API integration — all through the CLI.

Key decisions

Haiku → Sonnet: Upgraded when tool-use required deeper reasoning.

ElevenLabs → expo-speech: Moved to on-device TTS for zero network latency and zero API cost.

EAS Cloud → Local Gradle: Faster iteration, full control over native code.

By the numbers

3
Days to ship
8
Tools built
0
Cloud TTS cost
1
Developer
08 / 10
Current State

Where Brain Link is today

✅ Shipped & Working

Voice-first AI assistant running as standalone APK on Samsung S24 Ultra

Claude Sonnet reasoning over an 8-tool chain (up to 8 tool-use iterations per turn)

Persistent vector memory via Open Brain (Supabase + pgvector)

Gmail read/send, Calendar, Contacts, SMS

Perplexity web search + desktop browser automation

HTML artifact creation with in-app preview

Side-button double-press launch, echo-safe voice interrupt

🚧 Next Up

Wake Word

"Hey Brain Link" via Picovoice Porcupine — always-listening, hands-free activation

Lock Screen Audio

Android foreground service for persistent audio when the screen is off

Morning Briefing

7am auto-notification with daily briefing — calendar, email summary, priorities

Proactive Intelligence

Background analysis — "you have a meeting in 30 minutes and haven't prepped"

09 / 10
The Takeaway

One developer.
Three days.
A phone that thinks.

Brain Link proves that the gap between "AI chatbot" and "personal AI" is just engineering — connecting the right APIs, building persistent memory, and making voice the primary interface. The tools exist. The future is about wiring them together.

Brain Link Open Brain Claude Code Built by Moe

github.com/mrmoe28/brain-link · brain.lock28.com

10 / 10