This is the PyTorch implementation of Untying the Knots of Heterophily in Hypergraphs via Mixed-Curvature Manifolds.
Hypergraph neural networks (HNNs) have recently emerged as a promising paradigm for modeling higher-order relations, yet their study under heterophilic settings remains highly limited. Empirical evidence shows that classical message-passing-based HNNs suffer severe performance degradation on hypergraphs with low homophily ratios.
Our project addresses this challenge by introducing HyperUnmix, a novel method that disentangles heterophily mixing through a mixed-curvature manifold. Guided by the intuition that nodes of different classes exhibit distinct distributional characteristics, we model the representation space as a Cartesian product of multiple hyperbolic submanifolds, each aligned with a specific class. By constraining information flow to propagate mainly within the submanifold corresponding to its class, HyperUnmix effectively alleviates mixing during aggregation.
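The core idea of the product-of-hyperbolic-spaces representation can be illustrated with a small self-contained sketch. The function below is hypothetical (not the repository's actual code, which builds on geoopt's manifold classes such as `ProductManifold` and `PoincareBall`): it treats the embedding as a Cartesian product of per-class Poincaré balls and projects each class-aligned block strictly inside its open unit ball.

```python
import numpy as np

def project_to_product_of_balls(x, num_classes, dim, eps=1e-5):
    """Project ambient coordinates onto a product of Poincare balls.

    The representation space is modeled as a Cartesian product of
    `num_classes` hyperbolic submanifolds, one per class. Each
    length-`dim` block of `x` is rescaled so that it lies strictly
    inside the open unit ball of its submanifold.
    (Illustrative sketch only; the actual implementation uses geoopt.)
    """
    blocks = x.reshape(num_classes, dim)
    norms = np.linalg.norm(blocks, axis=1, keepdims=True)
    # Shrink any block whose norm reaches the ball boundary.
    scale = np.minimum(1.0, (1.0 - eps) / np.maximum(norms, eps))
    return (blocks * scale).reshape(-1)

# Example: a 12-dim embedding split into 3 class submanifolds of dim 4.
z = project_to_product_of_balls(np.random.randn(12), num_classes=3, dim=4)
```

After projection, every 4-dimensional block has norm below 1, so each class-specific part of the embedding is a valid point of its own Poincaré ball.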
Extensive experiments on both heterophilic and homophilic hypergraph benchmarks demonstrate that our model establishes new state-of-the-art performance, providing fresh insights into heterophilic hypergraph learning.
Environment:
- Python 3.12.3
- Nvidia RTX 4090 with CUDA 12.8
Libraries:
- geoopt==0.5.1
- numpy==2.2.6
- torch==2.7.0+cu128
- torch-geometric==2.6.1
- torch_cluster==1.6.3+pt27cu128
- torch_scatter==2.1.2+pt27cu128
- torch_sparse==0.6.18+pt27cu128
- torch_spline_conv==1.2.2+pt27cu128
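A possible installation sequence for the pinned versions above (assuming pip with the standard PyTorch and PyG wheel indices; adjust the CUDA tag to your setup):

```shell
# Core dependencies at the pinned versions
pip install numpy==2.2.6 geoopt==0.5.1
# PyTorch with CUDA 12.8 wheels
pip install torch==2.7.0 --index-url https://download.pytorch.org/whl/cu128
pip install torch-geometric==2.6.1
# PyG companion libraries built against torch 2.7 + cu128
pip install torch_scatter==2.1.2 torch_sparse==0.6.18 \
    torch_cluster==1.6.3 torch_spline_conv==1.2.2 \
    -f https://data.pyg.org/whl/torch-2.7.0+cu128.html
```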
The project supports the following datasets:
- Heterophilic datasets: Actor, Twitch-gamers, Pokec, Senate, House
- Homophilic datasets: Cora, Citeseer, Pubmed, Cora-CA, DBLP-CA
- Synthetic datasets: Synthetic data for testing different homophily ratios
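One common way to quantify the homophily ratio that the synthetic datasets vary is the average fraction of same-label node pairs inside each hyperedge. The helper below is an illustrative sketch of that measure (the paper may use a different definition):

```python
import numpy as np

def hyperedge_homophily(hyperedges, labels):
    """Mean over hyperedges of the fraction of same-label node pairs.

    `hyperedges` is an iterable of node-index tuples; `labels` is an
    integer array of node class labels. Singleton edges are skipped.
    (Illustrative; definitions of hypergraph homophily vary.)
    """
    ratios = []
    for edge in hyperedges:
        y = labels[list(edge)]
        n = len(y)
        if n < 2:
            continue
        same = sum(int(y[i] == y[j])
                   for i in range(n) for j in range(i + 1, n))
        ratios.append(same / (n * (n - 1) / 2))
    return float(np.mean(ratios))

# Edge (0,1,2) mixes classes (1 of 3 pairs match); edge (2,3) is pure.
h = hyperedge_homophily([(0, 1, 2), (2, 3)], np.array([0, 0, 1, 1]))
```

A value near 1 indicates a homophilic hypergraph, while values near the chance level signal heterophily, which is the regime HyperUnmix targets.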
Use the provided run.sh script to run experiments:
bash run.sh

The hyperparameters are tuned via Optuna. The datasets will be automatically downloaded from the internet during program execution; make sure the folder data exists in the root directory.



