
TensorFlow 2.x implementation of Physics-Informed Neural Networks for solving the nonlinear viscous Burgers equation. This project demonstrates how deep learning can incorporate fundamental physical laws directly into the neural network training process, reducing the reliance on traditional numerical discretization methods.


TechnoAS/PhysicsInformedNuralNetwork-V1


🌊 Physics-Informed Neural Networks for Solving Partial Differential Equations


🔬 Advanced Scientific Computing with Deep Learning

B.Tech Final Year Project - Information Technology

Siddhant Manna | Meghnad Saha Institute of Technology | 2025


📋 Project Overview

This research project explores the application of Physics-Informed Neural Networks (PINNs) to solving complex partial differential equations (PDEs). Unlike traditional numerical methods, PINNs embed physical laws directly into the neural network's training objective, creating a powerful framework that bridges deep learning and computational physics.

🎯 Objectives

  • Develop PINN frameworks for multiple PDE types
  • Achieve high accuracy with minimal training data
  • Demonstrate advantages over traditional numerical methods in accuracy and cost
  • Explore advanced techniques like curriculum learning and Fourier embeddings

📊 Research Statistics

| Metric | Value | Metric | Value |
|--------|-------|--------|-------|
| 📝 Report Pages | 60+ | 🧠 PDEs Solved | 3 |
| 💻 Training Time | 9 min 3 sec | 📚 References | 13+ |
| 🎯 Best L2 Error | 6.8×10⁻⁴ | 🔧 GPU Used | NVIDIA T4 |
| ⚡ Max Iterations | 150,000 | 📈 Convergence | Achieved |

🔬 Solved Equations

1. Burgers' Equation

$$\frac{\partial u}{\partial t} + u\frac{\partial u}{\partial x} = \nu\frac{\partial^2 u}{\partial x^2}$$
  • Application: Fluid dynamics, shock wave modeling
  • Achievement: L2 error of 6.8×10⁻⁴
  • Training Time: 9 minutes on NVIDIA T4
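As a concrete illustration, the Burgers residual can be evaluated with nested `tf.GradientTape` contexts for the second derivative. This is a minimal sketch, not the repository's code; the default viscosity ν = 0.01/π is the standard benchmark value and an assumption here, and `burgers_residual` is an illustrative name:

```python
import tensorflow as tf

def burgers_residual(model, x, t, nu=0.01 / 3.14159265358979):
    """PDE residual f = u_t + u*u_x - nu*u_xx at collocation points (x, t)."""
    with tf.GradientTape(persistent=True) as outer:
        outer.watch([x, t])
        with tf.GradientTape() as inner:
            inner.watch([x, t])
            u = model(tf.concat([x, t], axis=1))   # network prediction u(x, t)
        u_x, u_t = inner.gradient(u, [x, t])       # first derivatives
    u_xx = outer.gradient(u_x, x)                  # second derivative in x
    return u_t + u * u_x - nu * u_xx
```

Driving this residual to zero at the collocation points is what enforces the PDE during training.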

2. Allen-Cahn Equation

$$\frac{\partial u}{\partial t} = \varepsilon^2\nabla^2 u + u(1 - u^2)$$
  • Application: Phase separation, material science
  • Key Feature: Multi-component alloy modeling
  • Enhancement: Curriculum learning integration

3. Time-Dependent Eikonal Equation

$$\frac{\partial S}{\partial t} + |\nabla S| = 1$$
  • Application: Wave propagation, optimal path planning
  • Innovation: Backward time integration
  • Performance: Superior to fast-sweeping methods
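A sketch of the corresponding residual, assuming a 2-D spatial domain S(x, y, t) — the spatial dimensionality is not stated above, and the function name is illustrative:

```python
import tensorflow as tf

def eikonal_residual(model, x, y, t):
    """Residual of S_t + |grad S| - 1 for the time-dependent eikonal equation."""
    with tf.GradientTape() as tape:
        tape.watch([x, y, t])
        S = model(tf.concat([x, y, t], axis=1))
    S_x, S_y, S_t = tape.gradient(S, [x, y, t])
    # A small epsilon keeps the norm differentiable where grad S = 0
    grad_norm = tf.sqrt(S_x**2 + S_y**2 + 1e-12)
    return S_t + grad_norm - 1.0
```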

🎨 Research Visualizations

๐Ÿ“ Collocation Points Distribution

Figure 1: Distribution of 10,000 collocation points, 50 initial-condition points, and 50 boundary points for PINN training
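The sampling setup in Figure 1 can be reproduced with NumPy. The domain (x, t) ∈ [−1, 1] × [0, 1] is the standard Burgers benchmark and an assumption here, as is the even split of boundary points between the two walls:

```python
import numpy as np

rng = np.random.default_rng(0)

# 10,000 interior collocation points, sampled uniformly over the domain
X_f = np.column_stack([rng.uniform(-1.0, 1.0, 10_000),
                       rng.uniform(0.0, 1.0, 10_000)])

# 50 initial-condition points at t = 0
X_ic = np.column_stack([rng.uniform(-1.0, 1.0, 50), np.zeros(50)])

# 50 boundary points alternating between the walls x = -1 and x = +1
x_b = np.where(np.arange(50) % 2 == 0, -1.0, 1.0)
X_bc = np.column_stack([x_b, rng.uniform(0.0, 1.0, 50)])
```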

📉 Training Loss Convergence

Figure 2: PINN training loss convergence over 5,000 epochs with piecewise learning-rate decay, achieving an L2 error of 6.8×10⁻⁴

🌊 3D Burgers Solution Surface

Figure 3: 3D visualization of the Burgers equation solution showing shock formation at t = 0.4 and its temporal evolution

โš–๏ธ Comparative Analysis: Traditional vs PINN

Figure 4: Side-by-side comparison demonstrating PINN advantages over traditional numerical methods


๐Ÿ—๏ธ Architecture & Methodology

🧠 Network Architecture

  • Hidden Layers: 8-9 layers with 20 neurons each
  • Activation: Hyperbolic tangent (tanh)
  • Normalization: Input scaling to [-1,1]
  • Boundary Encoding: Strict Dirichlet condition adherence
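A minimal sketch of such a network, assuming the Burgers setup with homogeneous Dirichlet conditions u(±1, t) = 0 (the project's exact boundary encoding may differ; `BurgersPINN` is an illustrative name):

```python
import tensorflow as tf

class BurgersPINN(tf.keras.Model):
    """8 tanh hidden layers of 20 units; inputs scaled to [-1, 1]; the
    (1 - x^2) output factor hard-encodes the Dirichlet condition u(+-1, t) = 0."""

    def __init__(self, n_hidden=8, width=20):
        super().__init__()
        self.hidden = [tf.keras.layers.Dense(width, activation="tanh")
                       for _ in range(n_hidden)]
        self.out = tf.keras.layers.Dense(1)

    def call(self, xt):
        x, t = xt[:, 0:1], xt[:, 1:2]
        h = tf.concat([x, 2.0 * t - 1.0], axis=1)  # scale t in [0, 1] to [-1, 1]
        for layer in self.hidden:
            h = layer(h)
        return (1.0 - x**2) * self.out(h)          # vanishes exactly at x = +-1
```

Because the boundary condition is built into the output, no boundary-loss penalty is needed for those walls.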

⚡ Advanced Techniques

Curriculum Learning

  • Progressive parameter complexity increase
  • Stage-wise training approach
  • Enhanced stability for complex PDEs
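One way to realize such a stage-wise schedule is to ramp a problem parameter toward its hard target value, training to convergence at each stage. The sketch below ramps the Burgers viscosity geometrically; the choice of parameter and schedule shape are assumptions, not necessarily the report's staging:

```python
def curriculum_stages(nu_start=0.1, nu_target=0.01 / 3.14159265358979, n_stages=4):
    """Geometric viscosity schedule: start with an easy, diffusive problem
    and end at the sharp target problem."""
    ratio = (nu_target / nu_start) ** (1.0 / (n_stages - 1))
    return [nu_start * ratio**k for k in range(n_stages)]
```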

Causal Training

  • Time-dependent accuracy enforcement
  • Temporal causality preservation
  • Improved generalization over time
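A common formulation of causal training (following Wang et al.) weights each time slice's residual loss by the accumulated loss of all earlier slices, so later times only contribute once earlier times are already fit. A sketch, not necessarily the exact scheme used here:

```python
import numpy as np

def causal_weights(slice_losses, eps=1.0):
    """Weight for time slice i: exp(-eps * cumulative loss of earlier slices)."""
    losses = np.asarray(slice_losses, dtype=float)
    # Slice 0 sees no earlier loss, so its weight is always 1
    cumulative = np.concatenate([[0.0], np.cumsum(losses)[:-1]])
    return np.exp(-eps * cumulative)
```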

Fourier Feature Embedding (FFE)

  • Spatial periodicity encoding
  • Dramatic error reduction (2+ orders of magnitude)
  • Enhanced spatial accuracy
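A minimal FFE sketch: inputs are projected onto random frequencies B and passed through sin/cos before entering the first hidden layer. The frequency count and scale below are illustrative assumptions:

```python
import numpy as np

def fourier_features(x, B):
    """gamma(x) = [sin(2*pi*x B), cos(2*pi*x B)], fed to the first hidden layer."""
    proj = 2.0 * np.pi * x @ B
    return np.concatenate([np.sin(proj), np.cos(proj)], axis=1)

rng = np.random.default_rng(0)
B = rng.normal(size=(1, 16))  # 16 random frequencies for a 1-D spatial input
```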

📈 Experimental Results

🎯 Performance Achievements

| Equation | Method | L2 Error | Training Time |
|----------|--------|----------|---------------|
| Burgers | PINN | 6.8×10⁻⁴ | 9 min 3 sec |
| Allen-Cahn | PINN + FFE | Significantly reduced | N/A |
| Eikonal | PINN + Causal | Superior accuracy | 150k iterations |

📊 Comparative Analysis

  • PINNs vs. Traditional: Higher accuracy at lower computational cost
  • FFE Enhancement: 2+ orders of magnitude error reduction
  • Causal Training: Consistent convergence advantages
  • GPU Acceleration: Optimal performance on NVIDIA T4

๐Ÿ› ๏ธ Technical Implementation

Hardware Setup

  • GPU: NVIDIA T4 Tensor Core
  • Platform: Google Colab
  • Precision: Float32 optimization
  • Memory: Optimized for large-scale problems

Software Stack

tensorflow >= 2.x      # Deep learning framework
numpy                  # Numerical computations
matplotlib             # Visualization
scipy                  # Scientific computing

Key Algorithms

  • Automatic Differentiation: TensorFlow GradientTape
  • Optimization: Adam with adaptive learning rates
  • Loss Functions: Multi-component physics-informed loss
  • Sampling: Uniform and adaptive collocation strategies
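These pieces fit together as sketched below: Adam with a piecewise-constant learning-rate schedule drives a weighted, multi-component physics-informed loss. The decay boundaries, values, and loss weights are illustrative, not the report's exact settings:

```python
import tensorflow as tf

# Piecewise learning-rate decay (boundaries and values are illustrative)
lr = tf.keras.optimizers.schedules.PiecewiseConstantDecay(
    boundaries=[1000, 3000], values=[1e-3, 1e-4, 1e-5])
optimizer = tf.keras.optimizers.Adam(learning_rate=lr)

def pinn_loss(f_res, u_ic_pred, u_ic_true, u_bc_pred, u_bc_true,
              w_f=1.0, w_ic=1.0, w_bc=1.0):
    """Weighted sum of PDE-residual, initial-condition, and boundary MSE terms."""
    def mse(a, b):
        return tf.reduce_mean(tf.square(a - b))
    return (w_f * tf.reduce_mean(tf.square(f_res))   # residual driven to zero
            + w_ic * mse(u_ic_pred, u_ic_true)       # initial-condition fit
            + w_bc * mse(u_bc_pred, u_bc_true))      # boundary-condition fit
```

Each gradient step would evaluate the residual at the collocation points, compute this loss, and apply `optimizer.apply_gradients` to the network weights.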

📚 Research Contributions

🔬 Theoretical Advances

  • Novel PINN formulations for time-dependent PDEs
  • Integration of curriculum learning with physics constraints
  • Causal training methodology for temporal problems
  • FFE enhancement for periodic boundary conditions

💻 Practical Implementations

  • Efficient GPU-accelerated training pipelines
  • Boundary-encoded output layers
  • Multi-stage training protocols
  • Comprehensive error analysis frameworks

📊 Experimental Validation

  • Rigorous comparison with analytical solutions
  • Performance benchmarking against traditional methods
  • Scalability analysis across different problem sizes
  • Robustness testing under various conditions

🚀 Applications & Impact

๐Ÿญ Industrial Applications

  • Fluid Dynamics: Turbulence modeling, flow optimization
  • Materials Science: Alloy design, phase prediction
  • Geophysics: Seismic wave propagation, exploration
  • Robotics: Path planning, navigation systems

🎓 Educational Value

  • Graduate Research: Advanced numerical methods
  • Computational Physics: Modern simulation techniques
  • Machine Learning: Physics-informed AI development
  • Engineering: Real-world problem solving

🔮 Future Directions

🚧 Planned Enhancements

  • Multi-GPU Training: Distributed computing support
  • 3D Extensions: Complex geometry handling
  • Real-time Inference: Optimized deployment
  • Uncertainty Quantification: Bayesian extensions
  • Hybrid Methods: Classical-neural combinations

🌟 Research Opportunities

  • Anisotropic Media: Complex material properties
  • Multi-Physics Coupling: Interdisciplinary problems
  • Inverse Problems: Parameter identification
  • Transfer Learning: Cross-domain applications

📄 Academic Details

🎓 Project Information

  • Title: Solution of Partial Differential Equations Using Physics Informed Neural Network
  • Author: Siddhant Manna (Roll: 14200222065, Reg: 221420120620)
  • Supervisor: Assistant Professor Indrajit Das
  • Institution: Meghnad Saha Institute of Technology
  • Department: Information Technology
  • Year: 2025

📚 Key References

  • Raissi et al. (2019): Foundational PINN methodology
  • Karniadakis et al. (2021): Physics-informed machine learning
  • Multiple domain-specific applications and enhancements

๐Ÿ† Achievements

  • ✅ High Accuracy: L2 error of 6.8×10⁻⁴ for the Burgers equation
  • ✅ Computational Efficiency: 9-minute training on a single GPU
  • ✅ Novel Techniques: FFE, curriculum learning, causal training
  • ✅ Comprehensive Validation: Multiple PDE types solved
  • ✅ Research Impact: Advancing scientific computing methods

🚀 Quick Start

Prerequisites

pip install tensorflow numpy matplotlib scipy

Usage

# Clone the repository
git clone <repository-url>
cd pinn-pde-solver

# Run the main PINN implementation
python Physics-Informed-Neural-Network.py

# Visualize results
python visualize_results.py

📞 Contact & Collaboration

For research collaboration, technical discussions, or project inquiries:

LinkedIn


โญ If this research contributes to your work, please consider citing! โญ

Made with โค๏ธ for Science

Advancing the frontiers of computational physics through deep learning ๐Ÿš€
