This project develops an LSTM model to predict Bitcoin prices. It covers data collection, model creation and training with PyTorch, deployment of a prediction API, and experiment tracking and model management with MLflow. The API stack is built with FastAPI for the web framework, Docker for containerization, and Grafana and Prometheus for monitoring. The API provides endpoints for making predictions and monitoring performance. The project is organized modularly, with separate components for data collection, preprocessing, model definition, evaluation, and auxiliary functions.
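The model code itself is not shown in this README; as a rough sketch only, a PyTorch LSTM regressor for next-price prediction might look like the following (the class name and hyperparameters are illustrative assumptions, not the project's actual code):

```python
import torch
import torch.nn as nn

class BitcoinLSTM(nn.Module):
    """Illustrative LSTM regressor: maps a window of past prices to the next price."""

    def __init__(self, n_features: int = 1, hidden_size: int = 64, num_layers: int = 2):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden_size, num_layers, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_features)
        out, _ = self.lstm(x)
        # use the hidden state of the last time step to predict the next value
        return self.head(out[:, -1, :])

model = BitcoinLSTM()
dummy = torch.randn(8, 30, 1)  # batch of 8 windows, 30 time steps each
pred = model(dummy)            # shape: (8, 1)
```

Feeding the last hidden state into a single linear head is a common minimal setup for one-step-ahead forecasting; the real model may differ in depth, features, or output horizon.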
- `/src`: Contains all the source code for the project, including scripts for data collection, preprocessing, model training, and API services.
- `/res`: Stores resources such as configuration files, datasets, and other static files required by the project.
- `/grafana`: Includes configuration files and dashboards for monitoring the model's performance and other metrics using Grafana.
Contains scripts and modules for setting up and running the API service. This includes the implementation of endpoints, request handling, and integration with the prediction model.
Houses utility libraries and helper functions that are used across different parts of the project. This can include custom modules for data processing, logging, configuration management, and other reusable components.
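For example, one reusable preprocessing helper such a library often contains is a sliding-window builder that turns a price series into (input window, next value) training pairs (the function name here is hypothetical):

```python
def make_windows(series: list[float], window: int) -> list[tuple[list[float], float]]:
    """Split a series into (window, target) pairs for supervised training.

    Each input is `window` consecutive values; the target is the value
    that immediately follows the window.
    """
    if window < 1 or len(series) <= window:
        return []
    return [
        (series[i : i + window], series[i + window])
        for i in range(len(series) - window)
    ]

pairs = make_windows([1.0, 2.0, 3.0, 4.0, 5.0], window=3)
# → [([1.0, 2.0, 3.0], 4.0), ([2.0, 3.0, 4.0], 5.0)]
```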
Includes scripts and resources for training the machine learning model. This directory typically contains code for data preprocessing, model training, validation, and saving the trained model.
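The run instructions below mention a saved scaler alongside the trained model; a minimal min-max scaler of the kind typically fitted during training might look like this (a pure-Python sketch, not the project's actual implementation):

```python
class MinMaxScaler:
    """Scale values to [0, 1] using the min/max observed during fit."""

    def fit(self, values: list[float]) -> "MinMaxScaler":
        self.min_, self.max_ = min(values), max(values)
        return self

    def transform(self, values: list[float]) -> list[float]:
        span = (self.max_ - self.min_) or 1.0  # avoid division by zero
        return [(v - self.min_) / span for v in values]

    def inverse_transform(self, scaled: list[float]) -> list[float]:
        span = (self.max_ - self.min_) or 1.0
        return [v * span + self.min_ for v in scaled]

scaler = MinMaxScaler().fit([10.0, 20.0, 30.0])
assert scaler.transform([20.0]) == [0.5]
assert scaler.inverse_transform([0.5]) == [20.0]
```

Persisting the fitted scaler with the model matters because predictions made through the API must be inverse-transformed with the exact same min/max seen at training time.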
- Install all necessary dependencies:
pip install -r requirements.txt
- Configure training parameters in `config.yaml` as needed.
- Execute the main pipeline using:
python -m src.training.main
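The exact keys of `config.yaml` are not shown in this README; a typical layout for such a file might be (all keys and values below are illustrative, not the project's actual configuration):

```yaml
training:
  epochs: 50
  batch_size: 32
  learning_rate: 0.001
model:
  hidden_size: 64
  num_layers: 2
  window_size: 30
```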
- Install all necessary dependencies:
pip install -r requirements.txt
- Run the main pipeline if a trained model and scaler are not already available in `/res/models/saved`:
python -m src.training.main
- Run the docker-compose file to start the API service:
docker-compose up --build