This project integrates with the NASA API to collect and analyze space-related data. The goal of the accompanying notebook is to explore and visualize the data obtained from NASA's API and to apply machine learning algorithms to uncover insights.

Data is fetched from NASA's API using an API key, pre-processed, and fed into machine learning models such as Random Forest, SVM, and Neural Networks. The models are evaluated for their performance, and visualizations are generated with matplotlib and seaborn.
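The train-and-evaluate stage described above can be sketched as follows. This is a minimal illustration, not the project's actual pipeline: it uses a synthetic feature matrix as a stand-in for the NASA-derived data, and only the Random Forest model from the list above.

```python
# Minimal sketch of the train/evaluate loop described above.
# The features and target here are synthetic placeholders for
# the pre-processed NASA data used by the real project.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 4))             # placeholder feature matrix
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # placeholder binary target

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)

acc = accuracy_score(y_test, model.predict(X_test))
print(f"Test accuracy: {acc:.2f}")
```

The same fit/predict/score pattern applies to the SVM and Neural Network models mentioned above, with the estimator class swapped out.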
Below is a video with more details on the data exploration, the models used, and the application.
To run this project locally, follow these steps:

- Clone the repository:

  ```bash
  git clone git@github.com:rodrigosantili/1MLET_FASE3_TECH_CHALLENGE.git
  cd 1MLET_FASE3_TECH_CHALLENGE
  ```

- Install the dependencies:

  ```bash
  pip install -r requirements.txt
  ```

- Add your NASA API key: create a `.env` file containing an `API_KEY_NASA` entry set to your NASA API key. If you do not yet have a key, you can generate one at https://api.nasa.gov/.

- Run the following to generate the models:

  ```bash
  python .\src\main.py
  ```

- Launch the application from the terminal:

  ```bash
  streamlit run .\src\app.py
  ```

- Open the local Streamlit application in your browser at http://localhost:8501.
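The `.env` step above can be sketched in code. This is an illustrative example, not the project's `src/main.py`: it reads `API_KEY_NASA` via python-dotenv and builds a request URL against NASA's NeoWs feed endpoint, which is one of the NASA APIs available with this key (the actual endpoint used by the project may differ). `DEMO_KEY` is NASA's public rate-limited fallback key.

```python
# Sketch: load API_KEY_NASA from .env and build a NASA API request URL.
# The NeoWs "feed" endpoint is an illustrative choice, not necessarily
# the endpoint this project queries.
import os
from urllib.parse import urlencode

try:
    from dotenv import load_dotenv  # provided by python-dotenv (in requirements.txt)
    load_dotenv()                   # reads .env from the working directory
except ImportError:
    pass                            # fall back to plain environment variables

def build_neo_url(api_key: str, start_date: str, end_date: str) -> str:
    """Build a NASA NeoWs feed URL for the given ISO date range."""
    params = {"start_date": start_date, "end_date": end_date, "api_key": api_key}
    return "https://api.nasa.gov/neo/rest/v1/feed?" + urlencode(params)

key = os.getenv("API_KEY_NASA", "DEMO_KEY")
url = build_neo_url(key, "2024-01-01", "2024-01-07")
# data = requests.get(url).json()  # the actual fetch happens in src/main.py
```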
This project uses the following Python libraries:
```
requests==2.32.3
pandas==2.2.3
scikit-learn==1.5.2
matplotlib==3.9.2
seaborn==0.13.2
statsmodels==0.14.3
xgboost==2.1.1
streamlit==1.38.0
python-dotenv==1.0.1
joblib~=1.4.2
numpy~=2.1.1
```
You can install all dependencies by running:

```bash
pip install -r requirements.txt
```

Contributors:

- Bruno Machado Corte Real
- Pedro Henrique Romaoli Garcia
- Rodrigo Santili Sgarioni