This is a work in progress.
We are building a website to address broken supply chains, such as those that occur during natural disasters. The site lets makers and manufacturers match demand for necessary articles with the capacity to make them. Matching demand to supply is a fundamental human problem; solving it could save lives and relieve distress during the serial natural disasters we face.
Our Open Know-How (OKH) and Open Know-Where (OKW) libraries are implemented as publicly accessible Azure blob containers:

```json
"Azure_Storage_ServiceName": "https://projdatablobstorage.blob.core.windows.net",
"Azure_Storage_OKH_ContainerName": "okh",
"Azure_Storage_OKW_ContainerName": "okw"
```
These OKHs and OKWs are taken from our repo: https://github.com/helpfulengineering/library.
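As a sanity check, the public containers can be queried with Azure Blob Storage's List Blobs REST operation. The sketch below only builds the URL; anonymous listing works only if the container's public access level permits it:

```sh
# Build the Azure Blob "List Blobs" URL for the public okh container
# (anonymous listing requires the container's public access level to allow it)
account="https://projdatablobstorage.blob.core.windows.net"
container="okh"
list_url="${account}/${container}?restype=container&comp=list"
echo "$list_url"

# To actually query it:
#   curl -s "$list_url"
```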
Helpful created its own OKW template and added an extension to OKH, both of which are defined here: https://github.com/helpfulengineering/OKF-Schema
We are currently working with the Internet of Production Alliance (IoPA) to unify these extensions with their official schemas.
At present, we are using Microsoft Azure to build a basic website.
Harry Pierson has explored AI-based matching of the tools and capabilities that are available to those needed to make required articles.
The matching engine used for supply-graph flows lives in a separate repository, supply-graph-ai (Open Hardware Manager, OHM): a FastAPI service that this Nuxt app calls from the browser.
We need volunteers. Although we can use a wide variety of skills, we especially need:
- Programmers who can build a website with Azure
- Front-end programmers who, using Nuxt/Vue/Bootstrap, can implement visual components. Apply here: https://helpful.directory/opportunity?id=Software-Developer---Typescript-446
- AI programmers who can use vector matching and LLMs to build a robust matching algorithm.
This repository is a monorepo of separate npm packages (no root package.json). Install and start each part from its package directory.
| Component | Directory | Default URL | Purpose |
|---|---|---|---|
| Azure Functions API | `packages/back-end` | http://127.0.0.1:7071/api | OKH/OKW listing, blob-backed product data, incidents (Postgres), etc. |
| Nuxt front end | `packages/front-end` | http://localhost:3000 | Vue UI |
| Open Hardware Manager (supply-graph-ai) | separate clone (often Docker Compose) | http://localhost:8001 | FastAPI matching / supply-tree API (`POST /v1/api/match`, docs at `/docs`) |
| Mock match API (optional) | `packages/mock-api` | http://localhost:8001 | Tiny Express stub on the same port as OHM; use one or the other, not both |
- Node.js (LTS recommended) and npm
- Azure Functions Core Tools v4 (installed with `packages/back-end` via npm; ensure `func` is on your `PATH` after `npm install`)
- Azure CLI (`az`): install from Microsoft's docs. You need subscription access and permissions for team resources (blob storage, etc.) as appropriate.
- PostgreSQL: only if you use endpoints that query the database (e.g. `/api/incidents`). Configure via `packages/back-end/.env` (see `packages/back-end/README.md`).
- supply-graph-ai: the easiest path is Docker Desktop and `docker compose` in that repo (service `ohm-api` maps host port 8001). Alternative: local Python 3.12 / Conda per the supply-graph-ai README.
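A quick way to check that the prerequisite tools are installed and on your `PATH` (version output varies by machine; `docker` applies only if you take the Docker route to supply-graph-ai):

```sh
# Each command prints a version string if the tool is on PATH
node --version
npm --version
func --version      # Azure Functions Core Tools v4
az --version        # Azure CLI
docker --version    # only needed for the Docker route to supply-graph-ai
```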
- Copy `local.settings.json.template` to `local.settings.json` in `packages/back-end/`. The Functions host reads Azure storage settings from here. The app fails at startup if `Azure_Storage_ServiceName`, `Azure_Storage_OKH_ContainerName`, or `Azure_Storage_OKW_ContainerName` are missing.
- If `func start` reports missing job storage, set `AzureWebJobsStorage` in `local.settings.json` to a valid value (for example, an Azurite connection string for local development).
- For database-backed routes, create `packages/back-end/.env` with `PGHOST`, `PGUSER`, `PGPASSWORD`, `PGDATABASE`, and optionally `PGPORT`.
- From the repo root:

  ```sh
  cd packages/back-end
  npm install
  npm run start
  ```

  `npm run start` runs the TypeScript build, then `func start`. HTTP routes are served under `/api/...` (for example, http://localhost:7071/api/listOKHsummaries).
- `az login`: use the subscription and permissions your team expects for Azure resources (blob access, etc.).
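Once `npm run start` has the Functions host up, you can smoke-test one of its routes from another terminal (the route name appears above; the response shape depends on the blob data):

```sh
# With the Functions host running on port 7071:
curl -s http://localhost:7071/api/listOKHsummaries
```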
```sh
cd packages/front-end
npm install
npm run dev
```

Open http://localhost:3000 (and http://localhost:3000/homepage if you use that route).
The front end uses two different configuration mechanisms:
- `nuxt.config.ts` → `runtimeConfig.public.baseUrl`: used by pages that call `useRuntimeConfig().public.baseUrl` (for example, the home page). The default is `http://127.0.0.1:7071/api` (note `127.0.0.1`, not `localhost`; the project has seen browser/runtime quirks with `localhost` here). Override with `BACKEND_URL` when starting Nuxt, for example:

  ```sh
  BACKEND_URL=http://127.0.0.1:7071/api npm run dev
  ```

- `VITE_*` variables: used by the supply-graph-related pages (supply-graph-api, product supply tree). Set them when starting Nuxt so they match your local Functions host and OHM:

  ```sh
  BACKEND_URL=http://127.0.0.1:7071/api \
  VITE_API_BASE_URL=http://127.0.0.1:7071/api \
  VITE_SUPPLY_GRAPH_AI_URL=http://localhost:8001 \
  npm run dev
  ```

`nuxt.config.ts` also exposes `SUPPLY_GRAPH_AI_URL` as `runtimeConfig.public.supplyGraphAiUrl` for code that reads runtime config; the supply-graph pages above primarily use `VITE_SUPPLY_GRAPH_AI_URL`.
Clone supply-graph-ai outside this repo (for example next to it: ../supply-graph-ai).
With Docker Desktop running, from the supply-graph-ai repository root:
```sh
cd /path/to/supply-graph-ai
cp env.template .env   # if you have not already; edit for storage/API keys as needed
docker compose up ohm-api
```

The Compose file publishes the FastAPI app on host `localhost:8001` (container service `ohm-api`, port mapping `8001:8001`). Confirm that it is up (for example, http://localhost:8001/docs should load FastAPI's interactive docs).

You can use `docker compose up` instead of `docker compose up ohm-api` if you intend to start the full stack defined in that repo's `docker-compose.yml`.
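To confirm from the command line, one option is to request FastAPI's interactive docs route (mentioned in the component table above) and check the HTTP status code:

```sh
# Expect 200 once ohm-api is accepting connections
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8001/docs
```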
For active development inside supply-graph-ai with hot reload:
```sh
cd /path/to/supply-graph-ai
conda create -n supply-graph-ai python=3.12
conda activate supply-graph-ai
pip install -r requirements.txt
pip install -e .
cp env.template .env
python run.py
```

Same default URL: http://localhost:8001 (`API_PORT` in that project's settings).
| Direction | What to configure |
|---|---|
| Browser → Azure Functions | BACKEND_URL / default baseUrl → http://127.0.0.1:7071/api (or your deployed API URL). |
| Browser → OHM | VITE_SUPPLY_GRAPH_AI_URL → base URL only, no path (e.g. http://localhost:8001). The front end appends /v1/api/match for match requests. |
| OKH file URL for match | Optional VITE_PUBLIC_OKH_BLOB_BASE (default: https://projdatablobstorage.blob.core.windows.net/okh). The UI sends okh_url = {base}/{fname} so OHM can fetch the manifest over HTTPS. |
| OHM → Azure Blob | Configure OHM’s .env / storage settings per supply-graph-ai documentation so it can load OKW/OKH data as needed for matching. |
| OKH vs OKW in a match call | This repo sends OKH as okh_url. OKW (capability) files are not attached by the browser; OHM’s OKWService loads them from OHM-configured storage and matches against the fetched OKH. See docs/match-endpoint-integration.md (section OKH + OKW workflow). |
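The `okh_url = {base}/{fname}` construction from the table can be reproduced in the shell. The file name below is a hypothetical placeholder, not a real manifest in the container:

```sh
# Default base matches the documented default of VITE_PUBLIC_OKH_BLOB_BASE
base="${VITE_PUBLIC_OKH_BLOB_BASE:-https://projdatablobstorage.blob.core.windows.net/okh}"
fname="example-manifest.okh.json"   # hypothetical file name
okh_url="${base}/${fname}"
echo "$okh_url"
```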
Match endpoint: POST {VITE_SUPPLY_GRAPH_AI_URL}/v1/api/match — this is the route the Open Hardware Manager exposes; it is not the same as the stub in packages/mock-api (POST /v1/match).
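A sketch of that match request with `curl`, assuming the request body carries the OKH manifest URL in an `okh_url` field as described above (see `docs/match-endpoint-integration.md` for the authoritative body and headers; the manifest file name is a hypothetical placeholder):

```sh
# POST a match request to a locally running OHM
curl -X POST "http://localhost:8001/v1/api/match" \
  -H "Content-Type: application/json" \
  -d '{"okh_url": "https://projdatablobstorage.blob.core.windows.net/okh/example-manifest.okh.json"}'
```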
End-to-end match testing: Follow docs/match-endpoint-integration.md for the ordered verification steps (OHM health, blob URL reachability from Docker, Azure Functions, CORS, and curl). Shared client helpers live in packages/front-end/utils/ohmMatch.ts. To assert the same JSON body and headers as the UI against a running OHM, run OKH_FNAME=… node scripts/verify-ohm-match.mjs from the repo root (see doc §6).
CORS: In development, supply-graph-ai typically allows browser calls; if you change origins or run production-like settings, set CORS_ORIGINS in OHM’s environment so http://localhost:3000 (and/or http://127.0.0.1:3000) is allowed.
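For example, in OHM's `.env` (the `CORS_ORIGINS` name comes from the note above; the comma-separated format is an assumption, so check supply-graph-ai's settings documentation):

```sh
# Assumed format: comma-separated list of allowed browser origins
CORS_ORIGINS=http://localhost:3000,http://127.0.0.1:3000
```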
Ports: Do not run packages/mock-api on port 8001 at the same time as OHM; they conflict. Use the mock only when you want a minimal stub instead of the real Python service.
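Before starting either service, you can check whether something already holds port 8001 (standard `lsof` flags; the command prints nothing and exits nonzero when the port is free):

```sh
# List any process listening on TCP port 8001
lsof -nP -iTCP:8001 -sTCP:LISTEN || echo "port 8001 appears free"
```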
A minimal Express server for quick stubs. The code listens on port 8001 and implements POST /v1/match, which is not the same path as OHM’s POST /v1/api/match. Use it only when you are not running supply-graph-ai on 8001.
```sh
cd packages/mock-api
npm install
npm run dev
```

The older prototype website is implemented with GitHub Pages.
This repository is a TypeScript port of the original Project Data Python code.