ShrishailSGajbhar/fastapi-onnx-inference
Reproducing FastAPI-ONNX Inference for Sentiment Analysis

How to run

Prerequisites:

  • Docker
  • The sentiment analysis model binary in ONNX format. Download it from this link and put it in the webapp folder

Steps:

  1. Create the Docker containers for the frontend and backend services with the command docker-compose up -d
  2. Go to http://localhost:8502 for the Streamlit UI (frontend)
  3. Go to http://localhost:8001/docs for the Swagger UI (backend)
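The docker-compose.yml in the repo wires up these two services. A minimal sketch of what such a file might look like (the service names, build contexts, and internal container ports are assumptions; only the host ports 8502 and 8001 come from the steps above):

```yaml
version: "3.8"
services:
  backend:
    build: ./webapp            # FastAPI service; folder name taken from the prerequisites
    ports:
      - "8001:8000"            # host 8001 -> assumed container port; Swagger UI at /docs
  frontend:
    build: ./frontend          # Streamlit UI; build context is an assumption
    ports:
      - "8502:8501"            # host 8502 -> Streamlit's default port 8501
    depends_on:
      - backend
```

Refer to the actual compose file in the repository for the real service names and ports.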

Example for positive sentiment

sample-1

Example for negative sentiment

sample-2

Swagger UI for backend APIs

sample-3
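Besides the Swagger UI, the backend can be called directly from a script. A minimal Python sketch using only the standard library — the /predict route and the {"text": ...} payload shape are guesses, not confirmed by the repo; check the real schema at http://localhost:8001/docs:

```python
import json
import urllib.request

def build_request(text: str, url: str = "http://localhost:8001/predict") -> urllib.request.Request:
    """Build a POST request for the (assumed) sentiment endpoint."""
    payload = json.dumps({"text": text}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=payload,  # providing data makes this a POST request
        headers={"Content-Type": "application/json"},
    )

# Usage (requires the backend container to be running):
# with urllib.request.urlopen(build_request("I loved this movie!")) as resp:
#     print(json.load(resp))  # e.g. a sentiment label and score
```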

About

ONNX ML model inference using FastAPI and Streamlit
