Nataris — use P2P Android inference as a backend for your Flowise flows #6281
Sharrmavishal started this conversation in Show and tell
Built this over the past three months with a two-person team. Since Flowise supports ChatOpenAI with a custom base path, you can point it at Nataris and use P2P Android inference in your flows.
What Nataris is
A P2P inference marketplace. Android phones run open-weight models locally (Qwen 2.5 0.5B, Llama 3.2 1B) and serve requests via a standard OpenAI-compatible API. Phone owners earn per token. Developers get inference without managing any infrastructure.
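Because the API is OpenAI-compatible, any client that can build a standard chat-completions request can talk to it. A minimal sketch of that request shape, using only the Python standard library; the endpoint and model names come from this post, everything else (prompt, helper name) is illustrative:

```python
import json

# Chat-completions endpoint from the post (assumed path: /chat/completions,
# the standard OpenAI-compatible route).
ENDPOINT = "https://api.nataris.ai/v1/chat/completions"

def chat_body(model: str, prompt: str) -> str:
    """Build a minimal OpenAI-style chat-completions JSON body."""
    return json.dumps({
        # "nataris-fast" (Qwen 2.5 0.5B) or "nataris-balanced" (Llama 3.2 1B)
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })

print(chat_body("nataris-fast", "Summarize this ticket in one sentence."))
```

POST that body to the endpoint with an `Authorization: Bearer <your key>` header and you get back the usual OpenAI-style completion object.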
No prompt logging. No content filtering. No model training on your queries.
Using with Flowise
In any ChatOpenAI node, set:
- Base Path: https://api.nataris.ai/v1
- Model Name: nataris-fast (Qwen 2.5 0.5B) or nataris-balanced (Llama 3.2 1B)
- Enable streaming in the node settings; mobile cold-starts can time out on non-streaming requests.
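Outside Flowise, the same settings (base path, model, streaming) translate to a plain streaming request. A hedged sketch using only the standard library; the API key is a placeholder and the `/chat/completions` path is the assumed standard OpenAI-compatible route:

```python
import json
import urllib.request

NATARIS_API_KEY = "YOUR_API_KEY"  # placeholder; use your real key from signup

body = {
    # Same model names as the Flowise node settings above.
    "model": "nataris-fast",
    "messages": [{"role": "user", "content": "Hello from a flow"}],
    # Streaming matters here: it keeps the connection alive through
    # mobile cold-starts that would time out a non-streaming request.
    "stream": True,
}

req = urllib.request.Request(
    "https://api.nataris.ai/v1/chat/completions",
    data=json.dumps(body).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {NATARIS_API_KEY}",
    },
    method="POST",
)

# Uncomment to actually send (needs network access and a valid key);
# the response arrives as server-sent-event lines, one chunk per line.
# with urllib.request.urlopen(req) as resp:
#     for line in resp:
#         print(line.decode("utf-8"), end="")
```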
Where we are
21 provider devices on the network, 2,775 inference jobs completed, 350K+ tokens processed during closed beta. Android app just went live on Google Play.
Good fit for flows where 5–20s latency is acceptable and privacy matters (no prompt logging at the API layer).
$5 free credits on signup, no card needed.
API: https://api.nataris.ai/v1
Docs: https://api.nataris.ai/docs
Provider app (earn by running models on your Android): https://play.google.com/store/apps/details?id=ai.nataris.app