Activation_Functions_DL

Built with: Python · NumPy · Matplotlib · Jupyter


*(Activation Functions banner image)*
🌟 Linear & Non-Linear Activation Functions 🌟

Unlock the magic behind neural networks with this **Linear & Non-Linear Activation Functions** Jupyter Notebook! This project offers a vibrant visual exploration of fundamental activation functions like **Linear, ReLU, Sigmoid, and Tanh**. Understand their behavior, impact on neural network capabilities, and why non-linearity is crucial for deep learning. Perfect for students, researchers, and anyone curious about the building blocks of AI! ✨


Table of Contents
  1. 📚 About The Project
  2. 💡 Activation Functions Explored
  3. 🎯 Features
  4. 🛠️ Prerequisites
  5. 📋 How to Run
  6. 📊 Example Output
  7. 🧠 Code Breakdown
  8. 🌈 Customization Ideas
  9. 🤝 Contribute

📚 About The Project

This project offers a clear and concise visualization of various **activation functions** commonly used in artificial neural networks. Understanding activation functions is crucial because they introduce non-linearity into the network, enabling it to learn complex patterns and relationships in data. Without non-linear activation functions, a neural network, no matter how many layers it has, would behave like a simple linear regression model. This notebook uses `NumPy` for numerical operations and `Matplotlib` for plotting, making the concepts easy to grasp visually.
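
The claim that a deep network without non-linear activations collapses to a single linear model can be verified in a few lines. This is a minimal sketch with randomly chosen illustrative weights (the matrices `W1` and `W2` are not from the notebook):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "layers" with no activation between them (weights are illustrative).
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(2, 4))

x = rng.normal(size=3)
two_layer = W2 @ (W1 @ x)      # forward pass through both layers
collapsed = (W2 @ W1) @ x      # a single equivalent linear layer

print(np.allclose(two_layer, collapsed))  # True: the stack is still linear
```

No matter how many such layers you stack, the product of the weight matrices is itself just one matrix, which is exactly why non-linear activations are needed.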


💡 Activation Functions Explored

The notebook provides detailed visualizations and implementations for the following activation functions:

  • 1. Linear Function ($y = mx + b$):

    A simple function where the output is directly proportional to the input. While not commonly used in hidden layers of deep networks due to its inability to introduce non-linearity, it's fundamental for understanding the basics and sometimes used in the output layer for regression tasks.

  • 2. ReLU (Rectified Linear Unit) Function ($y = \max(0, x)$):

    One of the most popular activation functions in deep learning. It outputs the input directly if it's positive, otherwise, it outputs zero. Its simplicity and computational efficiency make it widely used in hidden layers.

  • 3. Sigmoid Function ($y = 1 / (1 + e^{-x})$):

    Also known as the logistic function, it squashes input values into a range between 0 and 1. Historically popular in output layers for binary classification problems, though it suffers from vanishing gradients in deep networks.

  • 4. Tanh (Hyperbolic Tangent) Function ($y = (e^x - e^{-x}) / (e^x + e^{-x})$):

    Similar to Sigmoid but squashes input values into a range between -1 and 1. It's zero-centered, which often helps with training stability compared to Sigmoid, and was a popular choice before ReLU's widespread adoption.


🎯 Features

  • 📈 Interactive Visualizations: See how each activation function transforms input values through clear plots.
  • ✍️ Simple Implementations: Each function is implemented from scratch using NumPy for better understanding.
  • 🔍 Side-by-Side Comparisons: Plots are arranged for easy comparison of different activation functions.
  • 📚 Educational Resource: Well-commented code and clear explanations make it an ideal learning tool for deep learning fundamentals.
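
A side-by-side comparison like the notebook's can be sketched as follows (the 2×2 grid layout and the file name `activations.png` are assumptions, not taken from the notebook):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend; omit this line in Jupyter
import matplotlib.pyplot as plt

x = np.linspace(-5, 5, 200)
functions = {
    "Linear":  x,
    "ReLU":    np.maximum(0, x),
    "Sigmoid": 1 / (1 + np.exp(-x)),
    "Tanh":    np.tanh(x),
}

# One subplot per activation function, arranged in a 2x2 grid.
fig, axes = plt.subplots(2, 2, figsize=(8, 6))
for ax, (name, y) in zip(axes.flat, functions.items()):
    ax.plot(x, y)
    ax.set_title(name)
    ax.grid(True)
fig.tight_layout()
fig.savefig("activations.png")
```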

🛠️ Prerequisites

To run this project, ensure you have the following installed:

  • Python 3.x
  • Jupyter Notebook (or JupyterLab, Google Colab)
  • Required Libraries:
    pip install numpy matplotlib

📋 How to Run

  1. Download the Notebook:

    Download linear_and_Non_Linear_Activation_Function.ipynb from this repository.

    Alternatively, open it directly in Google Colab for a zero-setup experience.

  2. Install Dependencies:
    pip install numpy matplotlib
  3. Run the Notebook:

    Open linear_and_Non_Linear_Activation_Function.ipynb in Jupyter or Colab.

    Execute each cell sequentially to visualize the activation functions!


📊 Example Output

The notebook will generate plots for each activation function, similar to the examples below:

Linear & ReLU Functions:

Linear and ReLU Plot Placeholder

Comparison of Linear and ReLU activation functions.

Sigmoid & Tanh Functions:

Sigmoid and Tanh Plot Placeholder

Comparison of Sigmoid and Tanh activation functions.


🧠 Code Breakdown

Here are the core implementations of the activation functions:

Linear Function:

```python
import numpy as np

def linear_function(x, m=1, b=0):
    return m * x + b
```

ReLU Function:

```python
def relu_function(x):
    return np.maximum(0, x)
```

Sigmoid Function:

```python
def sigmoid(x):
    return 1 / (1 + np.exp(-x))
```

Tanh Function:

```python
def tanh(x):
    # Mathematically equivalent to np.tanh(x), which is the numerically
    # safer choice: this explicit form can overflow for large |x|.
    return (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))
```
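
A quick sanity check of these implementations on a few sample inputs (a small sketch, not part of the notebook):

```python
import numpy as np

x = np.array([-2.0, 0.0, 2.0])

relu_out = np.maximum(0, x)         # ReLU: negatives clipped to zero
sig_out  = 1 / (1 + np.exp(-x))     # Sigmoid: values squashed into (0, 1)
tanh_out = np.tanh(x)               # Tanh: values squashed into (-1, 1)

print(relu_out)   # [0. 0. 2.]
print(sig_out[1]) # 0.5 — sigmoid of zero is exactly one half
print(tanh_out)
```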

🌈 Customization Ideas

Want to expand your understanding of activation functions? Here are some ideas:

  • 📈 Explore Derivatives: Add plots for the derivatives of each non-linear activation function. This is crucial for understanding backpropagation.
  • 🧪 More Functions: Implement and visualize other activation functions like Leaky ReLU, ELU, Swish, or Softmax.
  • 📊 Network Impact: Create a very simple neural network and demonstrate how different activation functions affect its ability to learn a non-linear decision boundary.
  • 🎨 Interactive Widgets: Use Jupyter widgets to create sliders for changing parameters (e.g., `m` and `b` for linear function, or `alpha` for Leaky ReLU) in real-time.
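
As a starting point for the derivatives idea above, the analytical derivatives can be computed directly with NumPy. This is a minimal sketch; the small maximum of the sigmoid derivative (0.25) is what drives the vanishing-gradient problem mentioned earlier:

```python
import numpy as np

x = np.linspace(-5, 5, 200)
sig = 1 / (1 + np.exp(-x))

# Analytical derivatives used during backpropagation.
d_relu    = (x > 0).astype(float)   # 0 for x < 0, 1 for x > 0
d_sigmoid = sig * (1 - sig)         # peaks at 0.25 when x = 0
d_tanh    = 1 - np.tanh(x) ** 2     # peaks at 1 when x = 0

print(d_sigmoid.max())  # ≈ 0.25, far below d_tanh's peak of ≈ 1
```

Plotting these alongside the functions themselves makes it visually clear why gradients shrink as they flow backward through many sigmoid layers.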

🤝 Contribute

Contributions are always welcome! If you have ideas for new activation functions to add, improvements to existing visualizations, or any other enhancements, please feel free to open an issue or submit a pull request. Let's illuminate the concepts of deep learning together! 💡

Star this repo if you find it helpful! ⭐

Created with 💖 by Chirag
