Neural network framework. The back-propagation algorithm is implemented with numpy, and the package supports basic activation functions, loss functions and neural architectures.

Mathiasotnes/back-propagation

Deep-Learning

A simple implementation of back-propagation using numpy.
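To illustrate the algorithm the package implements, here is a minimal back-propagation sketch in plain numpy, independent of the package's API. All names here (weights, the sigmoid helper, the training loop) are illustrative, not part of this library:

```python
import numpy as np

# Minimal back-propagation sketch: a 2-layer network trained on XOR
# with mean-squared-error loss and plain gradient descent.
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Parameters for a 2 -> 4 -> 1 network
W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
losses = []
for _ in range(2000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)           # hidden activations
    out = sigmoid(h @ W2 + b2)         # network output
    losses.append(np.mean((out - y) ** 2))

    # Backward pass: apply the chain rule layer by layer
    d_out = 2 * (out - y) / len(X) * out * (1 - out)
    dW2 = h.T @ d_out
    db2 = d_out.sum(axis=0)
    d_h = d_out @ W2.T * h * (1 - h)
    dW1 = X.T @ d_h
    db1 = d_h.sum(axis=0)

    # Gradient descent step
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print(losses[0], losses[-1])  # loss shrinks as training proceeds
```

The package wraps this same forward/backward/update cycle behind `Layer` and `Network` objects, as shown in the usage example below.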

Installation

You can install Deep-Learning using pip:

pip install git+https://github.com/Mathiasotnes/Deep-Learning.git

Usage

Quickly set up a neural network with multiple layers, including a softmax output layer, using the Deep Learning Library.

Example: Multi-Layer Network with Softmax Output

import numpy as np
from brain_of_mathias.models import Layer, Network
from brain_of_mathias.activations import ReLU, Softmax
from brain_of_mathias.losses import MSE

# Sample data - replace with actual data
X_train = np.array([...])  # Input features
y_train = np.array([...])  # Target labels
X_test = np.array([...])   # Held-out features for prediction

# Define a network with desired layers
layer1 = Layer(input_size=..., number_of_neurons=..., activation=ReLU())
layer2 = Layer(input_size=..., number_of_neurons=..., activation=ReLU())
output_layer = Layer(input_size=..., number_of_neurons=..., activation=Softmax())

# Initialize the network with the layers
network = Network([layer1, layer2, output_layer], loss_function=MSE())

# Train the network
network.fit(X_train, y_train, learning_rate=0.01, epochs=500)

# Predict on unseen data
predictions = network.predict(X_test)

Features

  • Custom activation and loss functions.
  • Extensible model architecture.
  • Utilities for common operations.
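As a sketch of what a custom activation involves, an activation typically bundles a forward map with its derivative, which the backward pass needs for the chain rule. The class below is a hypothetical shape for such a pair in numpy, not this package's actual interface:

```python
import numpy as np

# Hypothetical custom activation: a forward map plus its derivative.
# Illustrates the idea behind ReLU/Softmax-style activation objects;
# this is not the package's actual API.
class LeakyReLU:
    def __init__(self, alpha=0.01):
        self.alpha = alpha

    def forward(self, z):
        # Pass positive values through; scale negative values by alpha
        return np.where(z > 0, z, self.alpha * z)

    def derivative(self, z):
        # Slope is 1 for positive inputs, alpha otherwise
        return np.where(z > 0, 1.0, self.alpha)

act = LeakyReLU()
z = np.array([-2.0, 0.5, 3.0])
print(act.forward(z))     # [-0.02  0.5   3.  ]
print(act.derivative(z))  # [0.01 1.   1.  ]
```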
