Applied Hypergraph Neural Networks: An Implementation and Study on The Effects of Homophily and Construction Techniques
Author: Sharjeel Mustafa
Hypergraph Neural Networks (HGNNs) leverage high-order correlations to improve representation learning beyond pairwise constraints. However, the spectral foundations of these models rely heavily on the assumption of homophily, where connected nodes share similar labels. This study investigates the performance of HGNNs on the Cora, Actor, and Chameleon datasets under varying levels of structural homophily. Our results demonstrate that, as a general trend, model performance improves with increasing homophily across all three datasets. We further show that explicit construction methods generally outperform implicit neighbourhood-based techniques, but that the inherent homophily requirement of the HGNN smoothing regularizer remains a limiting factor for heterophilic data. These findings highlight the need for heterophily-specific architectures that utilize explicit construction techniques.
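As a point of reference for the homophily levels discussed above, a common pairwise measure is the edge-homophily ratio: the fraction of edges whose endpoints share a label. The sketch below is illustrative only (the function name and toy graph are not from this study, and hypergraph homophily can be defined in several other ways):

```python
import numpy as np

def edge_homophily(edges, labels):
    """Fraction of edges whose two endpoints share a label
    (the pairwise edge-homophily ratio)."""
    edges = np.asarray(edges)
    labels = np.asarray(labels)
    same = labels[edges[:, 0]] == labels[edges[:, 1]]
    return float(same.mean())

# Toy graph: 2 of 4 edges connect same-label nodes.
labels = [0, 0, 1, 1]
edges = [(0, 1), (1, 2), (2, 3), (0, 3)]
print(edge_homophily(edges, labels))  # → 0.5
```

Values near 1 indicate homophilic structure (e.g. citation networks like Cora), while values near 0 indicate heterophilic structure (e.g. Actor and Chameleon).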
If you find this work useful, please consider citing:
@misc{sharjeelm,
  title={Applied Hypergraph Neural Networks: An Implementation and Study on The Effects of Homophily and Construction Techniques},
  author={Sharjeel Mustafa},
  year={2025},
  url={https://github.com/Sharjeeliv/C859-final/}
}