
Computer Vision With PyTorch

This repository contains Jupyter notebooks demonstrating fundamental concepts in neural networks and computer vision using PyTorch.

Notebooks

  • Auto_Gradient_Of_Tensor.ipynb: Demonstrates automatic gradient computation for tensors in PyTorch.

  • Backpropagation.ipynb: Explains the backpropagation algorithm for computing gradients in neural networks.

  • Building_a_neural_network_using_PyTorch_on_a_toy_dataset.ipynb: Builds a neural network using PyTorch on a simple toy dataset.

  • Chain_Rule.ipynb: Demonstrates the chain rule in calculus, essential for backpropagation.

  • Feed_Forward_Propagation.ipynb: Implements feed-forward propagation in a neural network, showing how inputs are processed through layers to produce outputs.

  • Gradient_Descent.ipynb: Demonstrates gradient descent optimization algorithm used for training neural networks, including weight updates and loss minimization.

  • Implementing_custom_loss_function.ipynb: Shows how to implement a custom loss function in PyTorch.

  • Initialinzing_A_Tensor.ipynb: Covers different ways to initialize tensors in PyTorch.

  • Learning_Rate.ipynb: Explores the impact of different learning rates on the training process of neural networks.

  • Oeprations_On_Tensors.ipynb: Demonstrates various operations that can be performed on tensors in PyTorch.

  • Specifying_batch_size_while_training_a_model.ipynb: Explains how to specify batch size when training a model in PyTorch.
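As a taste of what Auto_Gradient_Of_Tensor.ipynb covers, here is a minimal autograd sketch (the function y = x² + 3x is illustrative, not necessarily the notebook's example):

```python
import torch

# y = x^2 + 3x, so dy/dx = 2x + 3; at x = 2 the gradient is 7.
x = torch.tensor(2.0, requires_grad=True)
y = x ** 2 + 3 * x
y.backward()   # autograd populates x.grad with dy/dx
print(x.grad)  # tensor(7.)
```

Calling `.backward()` on a scalar tensor walks the computation graph in reverse and accumulates gradients into the `.grad` attribute of every leaf tensor created with `requires_grad=True`.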
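The idea behind Gradient_Descent.ipynb can be sketched without any framework: repeatedly step against the gradient of the loss. Here the loss is (w − 4)², chosen for illustration so the minimum sits at w = 4:

```python
# Minimize loss(w) = (w - 4)^2 by gradient descent; d(loss)/dw = 2 * (w - 4).
w = 0.0    # initial weight
lr = 0.1   # learning rate
for _ in range(100):
    grad = 2 * (w - 4)
    w -= lr * grad  # step against the gradient
print(round(w, 4))  # converges to 4.0
```

The learning rate controls the step size: too small and convergence is slow, too large and the iterates overshoot or diverge, which is exactly the trade-off Learning_Rate.ipynb explores.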
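For Implementing_custom_loss_function.ipynb, a custom loss in PyTorch is just a function (or `nn.Module`) that returns a scalar tensor built from differentiable operations. This sketch uses a hand-written mean squared error, which may differ from the notebook's exact loss:

```python
import torch

def my_mse(pred, target):
    # Custom loss: mean of squared errors, written by hand.
    return ((pred - target) ** 2).mean()

pred = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
target = torch.tensor([1.0, 2.0, 5.0])
loss = my_mse(pred, target)
loss.backward()     # gradients flow through the custom loss
print(loss.item())  # (0 + 0 + 4) / 3 = 1.333...
```

Because the loss is composed of standard tensor operations, autograd differentiates it automatically; no manual gradient code is needed.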
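Specifying a batch size, as in Specifying_batch_size_while_training_a_model.ipynb, is typically done through `torch.utils.data.DataLoader`. The tensors below are random placeholders, not the notebook's data:

```python
import torch
from torch.utils.data import TensorDataset, DataLoader

# Toy dataset: 10 samples with 3 features each (placeholder data).
X = torch.randn(10, 3)
y = torch.randn(10, 1)
loader = DataLoader(TensorDataset(X, y), batch_size=4, shuffle=False)

for xb, yb in loader:
    print(xb.shape)  # batches of 4, 4, then the remaining 2 samples
```

During training, the inner loop would run the forward pass, loss, backward pass, and optimizer step once per batch rather than once per dataset.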

Requirements

  • Python 3.x
  • NumPy
  • Matplotlib
  • PyTorch

Usage

Open the notebooks in JupyterLab or VS Code to run the code cells and view the visualizations.

License

This project is for educational purposes.