Hi all,
Today, we’re diving deep into Graph Neural Networks, with the mathematical intuition to match. All you need is a basic understanding of how machine learning works and the maths that it presupposes. Generally, this is an advanced topic, but the original 2009 paper is not that hard. Without further ado, let’s start!
In this article, we will cover the following topics:
- Graph Representation
- The Transition Function
- State Convergence, Fixed Points & Contraction Mapping
- The Output Function
- Complete GNN Architecture
- Training the GNN
1. Graph Representation

Ref : https://huggingface.co/blog/intro-graphml
Before we dive into neural networks, we need to understand how to represent graph datasets in a way a machine can understand.
What is a Graph?
A graph G = (V, E) consists of:
- V: A set of nodes (vertices)
- E: A set of edges connecting these nodes
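To make the definition concrete, here is a minimal sketch of G = (V, E) in plain Python, building an adjacency list from the edge set (the node labels and edges here are made up for illustration):

```python
from collections import defaultdict

# A toy undirected graph G = (V, E): a 4-node cycle.
V = {0, 1, 2, 3}
E = {(0, 1), (1, 2), (2, 3), (3, 0)}

# Adjacency list: for each node, the set of its neighbors.
adj = defaultdict(set)
for u, v in E:
    adj[u].add(v)
    adj[v].add(u)  # add both directions, since the graph is undirected

print(sorted(adj[0]))  # neighbors of node 0 → [1, 3]
```

This neighbor lookup is exactly the operation a GNN relies on when each node gathers information from the nodes it is connected to.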
In the real world, graphs are everywhere:
- Social networks (people are nodes, friendships are edges)
- Molecules (atoms are nodes, bonds are edges) [My favorite part]