This repository contains the implementation of the longitudinal pediatric glioma event-free survival (EFS) pipeline described in [link here].
The project runs in an Anaconda virtual environment. To create the environment for the segmentation code:

```shell
conda env create -f environment.yml
```

The pipeline operates on NIfTI brain MR images (`.nii.gz`, `.nii`). Move the images into the `preprocessed_datadir` folder. Image names must be formatted as `patientID_Scandate.nii.gz`, where the scan date is in `YYYYMMDD` format, for example: `547531_20040101.nii.gz`.
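To catch misnamed files before preprocessing, a quick filename check can help (illustrative sketch; only the naming rule above is assumed, and `find_misnamed` is not part of the repo):

```python
import re
from pathlib import Path

# Expected name: patientID_YYYYMMDD.nii.gz (or .nii), e.g. 547531_20040101.nii.gz
NAME_RE = re.compile(r"^[A-Za-z0-9]+_\d{8}\.nii(\.gz)?$")

def find_misnamed(datadir):
    """Return filenames in datadir that do not match the expected pattern."""
    return sorted(f.name for f in Path(datadir).iterdir()
                  if not NAME_RE.match(f.name))
```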
Once the images are in the data directory, they can be preprocessed (N4 bias field correction, registration to the MNI template, and Z-score normalization). First move the MNI template used for registration into the `mni_template` folder; age-specific templates can be found [here]. Then run:

```shell
python mri_preprocess_3d.py /mni_template/temp_head.nii.gz
```

The preprocessed images will be stored in `processed_datadir/nnunet/imagesTs/`.
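The Z-score step of the preprocessing amounts to the following (a minimal NumPy sketch, not the script's actual code; the optional brain-mask argument is an assumption, and N4 correction and MNI registration are handled by dedicated tools):

```python
import numpy as np

def zscore_normalize(volume, mask=None):
    """Z-score normalize an MR volume to zero mean and unit std.
    If a brain mask is given, statistics use masked voxels only
    (a common choice to avoid background skewing the intensities)."""
    vox = volume[mask > 0] if mask is not None else volume
    return (volume - vox.mean()) / vox.std()
```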
The dataset is loaded from a CSV with columns `[pat_id, scandate, label]`, where `scandate` collates the dates of all scans for the same subject into one string and `label` is the binary 1-year event prediction. A sample row looks like this:
| pat_id | scandate | label |
|---|---|---|
| 458545 | 20040101-20050101-20060101-20070101 | 1 |
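When loading this CSV, the collated `scandate` string can be split back into chronologically ordered dates (a sketch matching the row format above; the helper name is illustrative):

```python
from datetime import datetime

def parse_scandates(scandate):
    """Split a collated string like '20040101-20050101-...' into a
    chronologically sorted list of datetime objects."""
    return sorted(datetime.strptime(s, "%Y%m%d") for s in scandate.split("-"))
```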
To create the longitudinal CSV from the dataset, run:

```shell
python create_longitudinalcsv.py --directory_path /processed_datadir/nnunet/imagesTs/ --output_csv /csvs/longitudinal.csv --labels /path/to/list_of_labels
```

The longitudinal CSV can then be split into train, val, and test CSVs:
```python
import pandas as pd
from sklearn.model_selection import train_test_split

df = pd.read_csv('/csvs/longitudinal.csv')

# 60/20/20 split: hold out 20% for test, then 25% of the remaining 80% for val
train_val, test = train_test_split(df, test_size=0.2, random_state=42)
train, val = train_test_split(train_val, test_size=0.25, random_state=42)

# Save splits
train.to_csv('/csvs/longitudinal_train.csv', index=False)
val.to_csv('/csvs/longitudinal_val.csv', index=False)
test.to_csv('/csvs/longitudinal_test.csv', index=False)
```

To train temporal learning, first create the oversampled temporal-learning CSVs:
```shell
python create_tl_csvs.py --input_path /csvs/longitudinal_train.csv --output_path /csvs/tl_train.csv
```

Training parameters and CSV paths can be specified in the `config.yml` file. To run temporal-learning training or EFS finetuning:
```shell
python train.py
```

Specify the model checkpoints for testing/inference in the `config.yml` file, then run:

```shell
python infer.py
```

To perform intrapatient analysis, run the snippets in the `intrapatient_analysis.ipynb` notebook.
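A note on the oversampled temporal-learning CSVs: the exact scheme lives in `create_tl_csvs.py` and is not documented here, but as a hypothetical illustration (not the script's actual logic), plain minority-class oversampling of the `label` column looks like this:

```python
import pandas as pd

def oversample_minority(df, label_col="label", seed=42):
    """Resample the minority class (with replacement) until both
    classes in label_col are equally frequent."""
    counts = df[label_col].value_counts()
    minority = counts.idxmin()
    extra = df[df[label_col] == minority].sample(
        n=int(counts.max() - counts.min()), replace=True, random_state=seed)
    return pd.concat([df, extra], ignore_index=True)
```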
