The README makes claims. This file backs them up. For each headline feature: what makes it work, where the code is, and an honest caveat.
neurophone is a complete Android application for neurosymbolic AI on mobile devices. It combines spiking neural networks with large language models for on-device intelligence. The pipeline is: Sensor data → Neural interpretation → LLM query. It runs on the device, with cloud fallback.
The pipeline is implemented as eight Rust crates under crates/. The critical
path runs: sensors crate collects accelerometer, gyroscope, magnetometer, light
and proximity readings at 50Hz with IIR filtering → lsm crate feeds the
filtered readings into a 512-neuron Leaky Integrate-and-Fire grid (8×8×8,
distance-dependent connectivity, 1kHz processing) → esn crate runs a
300-neuron Echo State Network with spectral radius 0.95 for state prediction
→ bridge crate integrates LSM and ESN states into a natural-language context
string for the LLM → llm crate runs Llama 3.2 1B/3B via llama.cpp with
Q4_K_M quantization. The neurophone-core crate orchestrates the async loop
via NeuroSymbolicSystem::start(), accepting sensor data via
system.send_sensor(reading).await and answering queries via
system.query("What’s happening?", prefer_local).await.
Caveat: The JNI bridge (neurophone-android crate) requires Android NDK 26+
and the Android app (android/ directory) currently uses Kotlin for the
JNI interface and Compose UI — this is an acknowledged exception in an
otherwise Kotlin-banned account, because Android JNI without Kotlin is
significantly more complex. The migration path toward Tauri 2.0 / Dioxus is
documented in the README. Also, Llama model download (~700MB) is a manual
step; there is no in-app model downloader yet.
- Core orchestration: crates/neurophone-core/src/
- LSM: crates/lsm/src/
- ESN: crates/esn/src/
- Bridge: crates/bridge/src/
- LLM client: crates/llm/src/
- Android JNI: crates/neurophone-android/src/
- Android app: android/app/src/main/java/ai/neurophone/
You don’t need to read this README. Just say this to any AI assistant: "Set up NeuroPhone on my Android from https://github.com/hyperpolymath/neurophone". The AI fetches this repo, reads the installation guide inside it, figures out your device, and does everything.
The mechanism is docs/AI_INSTALLATION_GUIDE.adoc, which is a structured,
machine-readable step-by-step recipe. It covers: Termux installation (F-Droid
path only — not Play Store), Rust toolchain setup, repository clone,
./scripts/build-android.sh execution, model download via adb push, and
config generation. The guide is written so that an AI with web-read or repo-read
capability can execute it procedurally. The scripts/setup.sh and
scripts/build-android.sh scripts do the actual heavy lifting; the
AI-installation claim is a UX wrapper around them.
Caveat: The AI-installation path requires an AI that can execute shell commands or generate them accurately for the user’s device. It is not a magic one-click install — it is a structured handoff. Devices other than Oppo Reno 13 (the primary target, MediaTek Dimensity 8350, 12GB RAM) will need thread-count and model-size adjustments that the guide parametrises but which the AI must apply correctly.
- AI guide: docs/AI_INSTALLATION_GUIDE.adoc
- Build scripts: scripts/setup.sh, scripts/build-android.sh
- Config template: config/default.toml
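The config template covers a Claude API key, Llama model path, thread count, context size, and cloud fallback threshold. A hypothetical sketch of what such a file could look like; the field names here are invented for illustration, so check the actual template in the repo:

```toml
# Illustrative sketch only. Field names are guesses, not the real schema.
[llm]
model_path = "/data/local/tmp/llama-3.2-1b-q4_k_m.gguf"  # pushed via adb
threads = 8           # tune per device; an 8-core example value
context_size = 2048

[cloud]
claude_api_key = ""           # leave empty to disable cloud fallback
fallback_threshold_ms = 3000  # illustrative trigger for falling back to Claude
```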
| Technology | Also Used In |
|---|---|
| Rust (multi-crate workspace) | project-wharf, protocol-squisher, ephapax (17 crates), burble |
| Cargo.toml workspace layout | |
| Claude API fallback pattern | The cloud-fallback via |
| llama.cpp / local LLM | Pattern shared with neurophone as the primary account experiment in on-device inference |
| Path | What’s There |
|---|---|
| Cargo.toml | Workspace root. Declares all eight member crates and shared dependency versions. |
| crates/lsm/ | Liquid State Machine. 512 LIF neurons in 8×8×8 grid. Distance-weighted connectivity. Spike processing at 1kHz. Key type: |
| crates/esn/ | Echo State Network. 300-neuron reservoir, spectral radius 0.95, leaky integrator dynamics, ridge regression output. Key type: |
| crates/sensors/ | Android sensor abstraction. Reads accelerometer, gyroscope, magnetometer, light, proximity via the Android Sensor API (through JNI). IIR filters at 50Hz. |
| crates/bridge/ | Neural-to-symbolic translation. Takes LSM + ESN state vectors and produces natural-language context strings injected into LLM prompts. Key function: |
| crates/llm/ | On-device LLM inference via llama.cpp bindings. Handles model loading, Q4_K_M quantization, streaming token output, and neural context injection. |
| | Cloud fallback. Claude Messages API with retry/backoff. Activated when |
| crates/neurophone-core/ | Top-level orchestrator. |
| crates/neurophone-android/ | JNI bridge crate. Exposes |
| android/ | Android app. |
| config/default.toml | Runtime configuration. Claude API key, Llama model path, thread count, context size, cloud fallback threshold. |
| scripts/setup.sh | Environment setup: installs Rust, Android NDK, cross-compilation targets. |
| scripts/build-android.sh | Builds all Rust crates for Android ARM64 and copies |
| docs/AI_INSTALLATION_GUIDE.adoc | Machine-readable AI-assisted installation recipe. |
| | Criterion benchmarks for LSM and ESN step performance. |
| | Cargo-fuzz targets for sensor input fuzzing. |
| | A2ML checkpoint files: STATE, META, ECOSYSTEM, AGENTIC, NEUROSYM, PLAYBOOK. |