A collection of MATLAB implementations demonstrating feedforward neural networks for binary classification tasks with visualization capabilities.
This project contains two comprehensive examples of binary classification using feedforward neural networks in MATLAB. The implementations demonstrate fundamental machine learning concepts including synthetic data generation, neural network training, prediction, and decision boundary visualization.
- Purpose: Demonstrates basic binary classification with two abstract classes
- Data: Synthetic 2D Gaussian clusters centered at different points
- Classes: Two distinct classes separated in feature space
- Purpose: Real-world inspired classification using anthropometric data
- Data: Synthetic height and weight measurements
- Classes: Male vs. Female classification based on physical characteristics
- Synthetic Data Generation: Realistic data simulation with Gaussian distributions
- Neural Network Architecture: Configurable feedforward networks with customizable hidden layers
- Decision Boundary Visualization: Interactive plots showing classification regions
- Performance Metrics: Accuracy calculation and evaluation
- Professional Plotting: High-quality visualizations with proper legends and labels
- Comprehensive Documentation: Well-commented code for educational purposes
- MATLAB R2019b or later (recommended)
- Deep Learning Toolbox (formerly Neural Network Toolbox)
- Statistics and Machine Learning Toolbox (for advanced features)
-
Clone or Download the repository:
git clone <repository-url> # or download and extract the ZIP file
-
Open MATLAB and navigate to the project directory:
cd '/path/to/NN samples'
-
Verify Prerequisites:
% Check if required toolboxes are installed ver('nnet') % Deep Learning Toolbox ver('stats') % Statistics and Machine Learning Toolbox
% Open and run NN1.m
run('NN1.m')
% Open and run NN2.m
run('NN2.m')
hiddenLayerSize = 15; % Change number of hidden neurons
net = feedforwardnet([10, 5]); % Multi-layer architecture
numPoints = 200; % Increase sample size
% Modify distribution parameters for different scenarios
% Adjust grid resolution for decision boundary
[x1Grid, x2Grid] = meshgrid(0:0.05:6, 0:0.05:6); % Higher resolution
NN samples/
├── README.md # This documentation file
├── NN1.m # Abstract binary classification
├── NN2.m # Gender classification example
└── results/ # Generated plots and outputs (created automatically)
- Type: Feedforward Neural Network
- Hidden Layers: 1 layer with 10 neurons (configurable)
- Activation Function: Default MATLAB activation (typically sigmoid/tanh)
- Output Layer: Single neuron with sigmoid activation
- Training Algorithm: Levenberg-Marquardt backpropagation (default)
- Input Features: 2D coordinates (x, y)
- Class 1: Gaussian distribution centered at (1, 1)
- Class 2: Gaussian distribution centered at (5, 5)
- Sample Size: 100 points per class
- Input Features: Height (cm) and Weight (kg)
- Male Class: Height ~175±5cm, Weight ~75±10kg
- Female Class: Height ~165±5cm, Weight ~65±10kg
- Sample Size: 100 points per class
feedforwardnet()
: Creates neural network architecturetrain()
: Trains the network using backpropagationmeshgrid()
: Creates visualization gridcontour()
: Plots decision boundariesscatter()
: Creates scatter plots for data points
- Classification Accuracy: Typically 85-95% for both samples
- Decision Boundary: Clear separation between classes
- Visualization: Professional plots with distinct class regions
- Training Time: 1-5 seconds on modern hardware
- Memory Usage: <100MB for default parameters
- Convergence: Usually within 50-200 epochs
We welcome contributions! Please follow these guidelines:
- Fork the repository
- Create a feature branch (
git checkout -b feature/AmazingFeature
) - Commit your changes (
git commit -m 'Add some AmazingFeature'
) - Push to the branch (
git push origin feature/AmazingFeature
) - Open a Pull Request