Neural Networks from Scratch for Scientific Machine Learning

This repository is a learning-focused implementation of neural networks from scratch, written in Python with NumPy/SciPy, without using deep learning frameworks such as PyTorch or TensorFlow.

The goal is to understand:

  • How neural networks work at a mathematical and algorithmic level
  • How these foundations extend naturally to Scientific Machine Learning (SciML) and physics-informed methods

Motivation

Modern deep learning frameworks hide many important details behind abstractions. This project intentionally avoids those abstractions to:

  • implement forward and backward passes manually
  • derive and code gradients explicitly
  • gain intuition about optimization and stability

This makes the transition to Scientific Machine Learning clearer and more principled.
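As a minimal sketch of what "derive and code gradients explicitly" means (illustrative only, not code from this repository; the data and learning rate are invented for the example), here is linear regression trained with a hand-derived mean-squared-error gradient:

```python
import numpy as np

# Hypothetical minimal example: fit y ≈ X w by gradient descent,
# with the MSE gradient written out by hand instead of autodiff.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([2.0, -1.0, 0.5])           # made-up target weights
y = X @ true_w + 0.1 * rng.normal(size=100)   # noisy observations

w = np.zeros(3)
lr = 0.1
for _ in range(500):
    pred = X @ w                              # forward pass
    # dL/dw for L = mean((pred - y)^2), derived via the chain rule
    grad = 2.0 * X.T @ (pred - y) / len(y)
    w -= lr * grad                            # gradient descent step

# w should now be close to true_w (up to the injected noise)
```

A framework would compute `grad` automatically; writing the one line by hand is exactly the kind of detail this project keeps visible.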


Scope

The repository progresses in stages:

  1. Neural Networks from Scratch
     • Linear regression
     • Logistic regression
     • Simple MLPs
     • Classification and regression examples
  2. Core Training Mechanics
     • Backpropagation
     • Gradient descent and variants
     • Regularization
     • Numerical gradient checking
  3. Scientific Machine Learning
     • Physics-informed neural networks (PINNs)
     • Wave propagation problems
     • Solving PDEs with neural networks
     • Inverse problems (parameter estimation)

All examples are kept as simple as possible to highlight the underlying ideas.
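To make stages 1 and 2 concrete, here is a hedged sketch (not this repository's actual code; shapes and data are arbitrary) of a one-hidden-layer MLP with a hand-derived backward pass, verified by the kind of numerical gradient check listed above:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical tiny MLP: x -> tanh(x W1) W2, squared-error loss.
X = rng.normal(size=(8, 4))
y = rng.normal(size=(8, 1))
W1 = rng.normal(size=(4, 5)) * 0.1
W2 = rng.normal(size=(5, 1)) * 0.1

def loss_and_grads(W1, W2):
    h = np.tanh(X @ W1)                 # forward: hidden layer
    out = h @ W2                        # forward: output layer
    diff = out - y
    loss = np.mean(diff ** 2)
    # backward pass, derived by hand via the chain rule
    dout = 2.0 * diff / y.size          # dL/dout
    dW2 = h.T @ dout
    dh = dout @ W2.T
    dW1 = X.T @ (dh * (1.0 - h ** 2))   # tanh'(z) = 1 - tanh(z)^2
    return loss, dW1, dW2

# numerical gradient check (central differences) on one entry of W1
loss, dW1, dW2 = loss_and_grads(W1, W2)
eps = 1e-6
W1p = W1.copy(); W1p[0, 0] += eps
W1m = W1.copy(); W1m[0, 0] -= eps
num = (loss_and_grads(W1p, W2)[0] - loss_and_grads(W1m, W2)[0]) / (2.0 * eps)
# num and dW1[0, 0] should agree to many decimal places
```

The finite-difference comparison is the standard sanity check for a hand-coded backward pass: if the two numbers disagree, the derivation has a bug.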


Design Principles

  • No PyTorch / TensorFlow / JAX
  • Minimal dependencies (NumPy, SciPy)
  • Explicit math over convenience
  • Readable code over performance
  • Educational value first

Intended Audience

This repository is for:

  • students learning neural networks from first principles
  • researchers interested in Scientific Machine Learning
  • anyone who wants to understand what deep learning frameworks actually do

Basic knowledge of calculus, linear algebra, and Python is assumed.


Repository Structure (WIP)

.
├── data/        # toy datasets and preprocessing
├── nn/          # neural network components
├── pinns/       # physics-informed models
├── examples/    # runnable scripts
└── README.md
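The pinns/ directory targets stage 3 above. As a hedged, illustrative sketch of the idea (not this repository's implementation; the network size, collocation points, and learning rate are invented for the example), a physics-informed loss penalizes the residual of a differential equation at sample points — here u'(x) = -u(x) with u(0) = 1, whose exact solution is exp(-x):

```python
import numpy as np

# Hypothetical PINN-style sketch in pure NumPy: fit u(x) = a · tanh(w x + b)
# so that the ODE residual u' + u is small at collocation points, plus a
# penalty enforcing the initial condition u(0) = 1.
rng = np.random.default_rng(2)
n = 10                                  # hidden units (arbitrary choice)
params = rng.normal(size=3 * n) * 0.5   # flattened [w, b, a]
xs = np.linspace(0.0, 1.0, 25)          # collocation points

def u_and_du(p, x):
    w, b, a = p[:n], p[n:2 * n], p[2 * n:]
    t = np.tanh(np.outer(x, w) + b)     # hidden activations, shape (len(x), n)
    u = t @ a
    du = ((1.0 - t ** 2) * w) @ a       # du/dx by hand: tanh'(z) = 1 - tanh(z)^2
    return u, du

def loss(p):
    u, du = u_and_du(p, xs)
    u0, _ = u_and_du(p, np.array([0.0]))
    return np.mean((du + u) ** 2) + (u0[0] - 1.0) ** 2   # residual + boundary

# Train with central-difference numerical gradients: slow, but transparent,
# and acceptable for 30 parameters.
eps, lr = 1e-6, 0.02
init_loss = loss(params)
for _ in range(2000):
    g = np.zeros_like(params)
    for i in range(params.size):
        pp = params.copy(); pp[i] += eps
        pm = params.copy(); pm[i] -= eps
        g[i] = (loss(pp) - loss(pm)) / (2.0 * eps)
    params -= lr * g
final_loss = loss(params)

# final_loss should be well below init_loss; u then approximates exp(-x)
u, _ = u_and_du(params, xs)
max_err = np.max(np.abs(u - np.exp(-xs)))
```

A framework-based PINN would obtain du/dx by automatic differentiation; in pure NumPy the input derivative must be derived by hand, as in u_and_du, which is precisely the exercise this repository is about.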
