
[PR 2026] The source code of "Exploring Dynamic Interpretable Brain Networks via Hierarchical Graph Transformer"


DIBrain: Exploring Dynamic Interpretable Brain Networks via Hierarchical Graph Transformer


Source code for “Exploring Dynamic Interpretable Brain Networks via Hierarchical Graph Transformer”, published in Pattern Recognition.

Authors: Hao Hu†, Rundong Xue†, Shaoyi Du, Xiangmin Han, Jingxi Feng, Zeyu Zhang, Wei Zeng, Yue Gao, Juan Wang

Highlights

  • Dynamic Brain Transformer: learns time-varying functional connectivity for adaptive graph construction.
  • Hierarchical Representation Learning: models intra-subnetwork homogeneity and inter-subnetwork heterogeneity.
  • Cross-scale Modeling: bridges ROI-level dynamics and subnetwork-level coordination.
  • Validated on 4 datasets for neurological disorder diagnosis.
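The dynamic graphs mentioned above are typically derived from time-varying functional connectivity. As a minimal sketch of that idea, here is a sliding-window Pearson-correlation construction; the window length, stride, and function name are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

def sliding_window_fc(ts: np.ndarray, win: int = 30, stride: int = 10) -> np.ndarray:
    """ts: (T, R) ROI time series -> (num_windows, R, R) correlation graphs.

    Hedged sketch: window/stride values are illustrative, not the paper's.
    """
    T, R = ts.shape
    mats = []
    for start in range(0, T - win + 1, stride):
        seg = ts[start:start + win]      # (win, R) window of the signal
        mats.append(np.corrcoef(seg.T))  # (R, R) Pearson correlation matrix
    return np.stack(mats)

# Toy example: 100 timepoints, 8 ROIs -> 8 windowed graphs of shape 8x8.
rng = np.random.default_rng(0)
fc = sliding_window_fc(rng.standard_normal((100, 8)))
print(fc.shape)  # (8, 8, 8)
```

Each windowed matrix can then serve as an adaptive adjacency for the downstream graph transformer.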

Configuration

Default config: setting/abide.yaml

Common options:

  • data.time_seires: path to the ROI time-series *.npy file (key spelled as in the code)
  • train.epochs / lr / weight_decay
  • Hierarchical constraints:
    • train.group_loss (L_intra + L_inter)
    • train.hierarchical_loss (L_KL)
    • train.hier_alpha / train.hier_beta / train.hier_gamma
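A minimal sketch of how these options might be laid out in setting/abide.yaml; all concrete values below are illustrative placeholders, and only the keys listed above are taken from the repository:

```yaml
data:
  time_seires: ./data/abide.npy   # key spelled as in the code

train:
  epochs: 100                     # illustrative values, not defaults
  lr: 1.0e-4
  weight_decay: 1.0e-4
  group_loss: true                # L_intra + L_inter
  hierarchical_loss: true         # L_KL
  hier_alpha: 1.0                 # weights for the hierarchical constraints
  hier_beta: 1.0
  hier_gamma: 1.0
```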

Usage

1. Data Preparation

Download the ABIDE dataset and place the preprocessed ROI time-series file (*.npy) at the path set by data.time_seires in the config.
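A quick sanity check for the prepared file before training. The (subjects, ROIs, timepoints) layout and the file name are assumptions for illustration, not a format documented by the repo:

```python
import numpy as np

# Hypothetical path; point this at your actual data.time_seires file.
path = "abide_check.npy"

# Build a tiny stand-in array so this check runs self-contained.
np.save(path, np.zeros((4, 116, 200)))  # (subjects, ROIs, timepoints) -- assumed layout

ts = np.load(path)
print(ts.shape, ts.dtype)
assert ts.ndim == 3, "expected a 3-D array of per-subject ROI time series"
```

If the shapes disagree with what the model expects, adjust the preprocessing rather than the config.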

2. Train

cd DIBrain
python main.py --config_filename setting/abide.yaml

Citation

@article{hu2026exploring,
  title={Exploring Dynamic Interpretable Brain Networks via Hierarchical Graph Transformer},
  author={Hu, Hao and Xue, Rundong and Du, Shaoyi and Han, Xiangmin and Feng, Jingxi and Zhang, Zeyu and Zeng, Wei and Gao, Yue and Wang, Juan},
  journal={Pattern Recognition},
  pages={113371},
  year={2026},
  publisher={Elsevier}
}

License

The source code is free for research and educational use only. Formal permission is required for any commercial use.
