Creating Custom Networks

Build brain networks from tractogram data using MRtrix and custom parcellations

TVBO ships with pre-built connectomes (see Defining Networks), but you can also create networks from your own tractography data. This guide walks through the full pipeline: building a parcellation atlas, running streamline-based tractography with MRtrix3, and packaging the result as a schema-compliant TVBO network.

Prerequisites

  • MRtrix3 installed (tck2connectome in PATH)
  • A tractogram in .tck format (e.g. the dTOR group-average tractogram)
  • Python packages: nibabel, numpy, templateflow

Pipeline Overview

The workflow has three steps:

  1. Build a parcellation atlas — a NIfTI image where each voxel is labeled with a region index (1, 2, …, \(N\)).
  2. Run tck2connectome — count streamlines and compute mean tract lengths between regions, producing \(N \times N\) weight and length matrices.
  3. Create a TVBO Network — wrap the matrices and metadata into a schema-valid HDF5 + YAML pair.

Step 1: Build a Parcellation Atlas

You can use any atlas available via TemplateFlow, or create a custom one by remapping existing regions.

Example: Lobar Atlas from DKT31

This example merges the cortical and subcortical regions of the Desikan-Killiany-Tourville (DKT31) atlas into 17 coarse anatomical groups:

  • 6 cortical lobes per hemisphere (Frontal, Parietal, Temporal, Occipital, Cingulate, Insular)
  • 2 subcortical groups per hemisphere (Subcortical, Cerebellum)
  • 1 midline structure (Brain-Stem)

import nibabel as nib
import numpy as np
from pathlib import Path
from templateflow.conf import TF_HOME

# Load DKT31 atlas from TemplateFlow cache
atlas_path = (
    Path(TF_HOME)
    / "tpl-MNI152NLin2009cAsym"
    / "tpl-MNI152NLin2009cAsym_res-02_desc-DKT31_dseg.nii.gz"
)
img = nib.load(atlas_path)
data = np.asarray(img.dataobj, dtype=np.int32)
affine = img.affine

# Define region → lobe mapping
DK_REGION_TO_LOBE = {
    "caudalmiddlefrontal": "Frontal",
    "lateralorbitofrontal": "Frontal",
    "precentral": "Frontal",
    "superiorfrontal": "Frontal",
    # ... (full mapping in scripts/create_lobar_network.py)
    "inferiorparietal": "Parietal",
    "postcentral": "Parietal",
    "precuneus": "Parietal",
    "bankssts": "Temporal",
    "superiortemporal": "Temporal",
    # ... etc.
}

# Lobe ordering — this becomes the node order in the network
LOBE_ORDER = [
    "LH_Frontal", "LH_Parietal", "LH_Temporal", "LH_Occipital",
    "LH_Cingulate", "LH_Insular", "LH_Subcortical", "LH_Cerebellum",
    "RH_Frontal", "RH_Parietal", "RH_Temporal", "RH_Occipital",
    "RH_Cingulate", "RH_Insular", "RH_Subcortical", "RH_Cerebellum",
    "BrainStem",
]

The key step is remapping each voxel from its original FreeSurfer label to the corresponding lobe index (1–17):

lobar = np.zeros_like(data, dtype=np.int32)

for region_name, lobe_name in DK_REGION_TO_LOBE.items():
    # DK_FS_INDEX maps region name -> FreeSurfer cortical index
    # (defined alongside the full mapping in scripts/create_lobar_network.py)
    fs_idx = DK_FS_INDEX[region_name]
    for hemi_prefix, hemi_offset in [("LH", 1000), ("RH", 2000)]:
        fs_label = hemi_offset + fs_idx
        lobe_label = f"{hemi_prefix}_{lobe_name}"
        lobe_idx = LOBE_ORDER.index(lobe_label) + 1
        lobar[data == fs_label] = lobe_idx

# Save as NIfTI (int32 dtype for tck2connectome)
lobar_img = nib.Nifti1Image(lobar, affine, img.header)
lobar_img.header.set_data_dtype(np.int32)  # override the source header's on-disk dtype
nib.save(lobar_img, "lobar_atlas.nii.gz")

Compute MNI centroids for each lobe (used as node positions):

centroids = {}
for idx, lobe_label in enumerate(LOBE_ORDER, start=1):
    voxels = np.argwhere(lobar == idx)
    centroid_vox = voxels.mean(axis=0)
    centroid_mni = affine @ np.array([*centroid_vox, 1.0])
    centroids[lobe_label] = tuple(float(v) for v in centroid_mni[:3])
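
The conversion above is simply the affine applied to homogeneous voxel coordinates. A quick self-contained check, using a hypothetical 2 mm isotropic affine (not the atlas's actual one):

```python
import numpy as np

# Hypothetical 2 mm isotropic affine with an MNI-style origin:
# world = 2 * voxel + origin
affine = np.array([
    [2.0, 0.0, 0.0, -96.0],
    [0.0, 2.0, 0.0, -132.0],
    [0.0, 0.0, 2.0, -78.0],
    [0.0, 0.0, 0.0, 1.0],
])

centroid_vox = np.array([48.0, 66.0, 39.0])  # e.g. mean voxel index of a region
centroid_mni = affine @ np.array([*centroid_vox, 1.0])
print(tuple(centroid_mni[:3]))  # this voxel maps to the world-space origin
```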

Step 2: Run tck2connectome

MRtrix’s tck2connectome assigns each streamline to a source–target region pair based on the parcellation atlas, producing a connectivity matrix.

Two passes are needed: one for streamline counts (weights) and one for mean tract lengths:

# Streamline counts
tck2connectome tractogram.tck lobar_atlas.nii.gz weights.csv \
    -symmetric -zero_diagonal

# Mean tract lengths
tck2connectome tractogram.tck lobar_atlas.nii.gz lengths.csv \
    -scale_length -stat_edge mean -symmetric -zero_diagonal

Or programmatically via subprocess:

import subprocess

def run_tck2connectome(tractogram, atlas, weights_csv, lengths_csv):
    subprocess.run([
        "tck2connectome", str(tractogram), str(atlas), str(weights_csv),
        "-symmetric", "-zero_diagonal", "-force",
    ], check=True)

    subprocess.run([
        "tck2connectome", str(tractogram), str(atlas), str(lengths_csv),
        "-scale_length", "-stat_edge", "mean",
        "-symmetric", "-zero_diagonal", "-force",
    ], check=True)

Load the resulting CSV matrices:

weights = np.loadtxt("weights.csv", delimiter=",")
lengths = np.loadtxt("lengths.csv", delimiter=",")
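
Before handing the matrices to TVBO, it can be worth asserting the invariants that the -symmetric and -zero_diagonal flags guarantee. A minimal sketch, using a synthetic 17×17 matrix in place of the loaded CSVs:

```python
import numpy as np

def check_connectome(mat, n_nodes):
    """Validate the invariants produced by -symmetric -zero_diagonal."""
    assert mat.shape == (n_nodes, n_nodes), f"expected {n_nodes}x{n_nodes}, got {mat.shape}"
    assert np.allclose(mat, mat.T), "matrix is not symmetric"
    assert np.allclose(np.diag(mat), 0.0), "diagonal is not zero"
    assert (mat >= 0).all(), "negative entries"

# Synthetic stand-in for weights.csv (17 lobes)
rng = np.random.default_rng(0)
w = np.triu(rng.random((17, 17)), k=1)
w = w + w.T  # symmetric, zero diagonal
check_connectome(w, 17)
print("ok")
```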

Step 3: Create a TVBO Network

Use Network.from_matrix() to build a network from the connectivity matrices, then attach metadata:

from tvbo import Network

network = Network.from_matrix(
    weights=weights,
    lengths=lengths,
    labels=LOBE_ORDER,
)

Set Node Positions

Assign MNI coordinates from the atlas centroids:

for node in network.nodes:
    c = centroids.get(node.label)
    if c is not None:
        node.position = {
            "x": float(round(c[0], 4)),
            "y": float(round(c[1], 4)),
            "z": float(round(c[2], 4)),
        }

Set Metadata

Provide essential metadata for reproducibility and BIDS compliance:

network.label = "Lobar (dTOR)"
network.descriptor = "SC"
network.distance_unit = "mm"
network.time_unit = "ms"
network.number_of_nodes = len(LOBE_ORDER)

network.parcellation = {
    "atlas": {
        "name": "Lobar",
        "coordinateSpace": "MNI152NLin2009cAsym",
    }
}
network.tractogram = {"name": "dTOR"}
network.bids = {
    "template": "MNI152NLin2009cAsym",
    "cohort": "HCPYA",
    "reconstruction": "dTOR",
    "atlas": "Lobar",
}

Save

The save() method creates both files — YAML sidecar and HDF5 companion:

network.save("my_network.yaml")
# Creates: my_network.yaml  (metadata)
#          my_network.h5     (matrices)

The BIDS-compliant filename is available programmatically:

print(network.bids_filename)
# tpl-MNI152NLin2009cAsym_cohort-HCPYA_rec-dTOR_atlas-Lobar_desc-SC_relmat
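
The entity order mirrors the metadata set above. Purely as an illustration of how such a name is composed (not TVBO's actual implementation), the string can be assembled by hand:

```python
# BIDS entities from the network metadata set earlier
bids = {
    "template": "MNI152NLin2009cAsym",
    "cohort": "HCPYA",
    "reconstruction": "dTOR",
    "atlas": "Lobar",
}
name = (
    f"tpl-{bids['template']}_cohort-{bids['cohort']}"
    f"_rec-{bids['reconstruction']}_atlas-{bids['atlas']}_desc-SC_relmat"
)
print(name)
# tpl-MNI152NLin2009cAsym_cohort-HCPYA_rec-dTOR_atlas-Lobar_desc-SC_relmat
```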

Node Subgroup Mapping

For networks with hierarchical structure (e.g. regions within lobes), set a node-to-group mapping:

groups = []
mapping = []
for label in LOBE_ORDER:
    if label.startswith(("LH_", "RH_")):
        group = label.split("_", 1)[1]
    else:
        group = label
    if group not in groups:
        groups.append(group)
    mapping.append(groups.index(group))

network.set_node_mapping(
    name="lobes",
    group_names=groups,
    mapping=mapping,
)
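
Since the grouping loop is plain Python, its output is easy to verify in isolation: for the 17-label LOBE_ORDER it yields 9 hemisphere-agnostic groups, with left and right hemispheres sharing group indices:

```python
LOBE_ORDER = [
    "LH_Frontal", "LH_Parietal", "LH_Temporal", "LH_Occipital",
    "LH_Cingulate", "LH_Insular", "LH_Subcortical", "LH_Cerebellum",
    "RH_Frontal", "RH_Parietal", "RH_Temporal", "RH_Occipital",
    "RH_Cingulate", "RH_Insular", "RH_Subcortical", "RH_Cerebellum",
    "BrainStem",
]

groups, mapping = [], []
for label in LOBE_ORDER:
    # Strip the hemisphere prefix so LH/RH collapse into one group
    group = label.split("_", 1)[1] if label.startswith(("LH_", "RH_")) else label
    if group not in groups:
        groups.append(group)
    mapping.append(groups.index(group))

print(groups)   # 9 hemisphere-agnostic group names
print(mapping)  # [0..7, 0..7, 8]: LH and RH share group indices
```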

Multi-Layer Networks: Surface Mapping

TVBO supports hierarchical multi-layer networks where a fine-grained network (e.g. surface vertices) references a coarser parent network (e.g. brain regions or lobes). This avoids redundant storage — the surface network stores only the mesh geometry and a vertex→region mapping, while the parent network holds the connectivity matrices.

Architecture

Surface Network (desc-surf)          Parent Network (desc-SC)
┌──────────────────────────┐        ┌──────────────────────┐
│ 64,984 vertices          │        │ 17 lobe nodes        │
│ mesh: vertices, triangles│  ───►  │ edges: weights,      │
│ region_mapping: int[N]   │        │        lengths       │
│ parent_network: *.yaml   │        │ centroids, metadata  │
└──────────────────────────┘        └──────────────────────┘

The parent_network field in the surface YAML points to the SC network file, and node_mapping gives the HDF5 dataset path for the vertex→region index array.

Creating a Surface Network

Load a cortical surface mesh (e.g. fsLR 32k from TemplateFlow) and map each vertex to its parent region by sampling the parcellation atlas:

import nibabel as nib
import numpy as np
from pathlib import Path
from templateflow.conf import TF_HOME

# Load fsLR 32k midthickness surfaces (L + R hemispheres)
fslr_dir = Path(TF_HOME) / "tpl-fsLR"
vertices, triangles = [], []
offset = 0
for hemi in ("L", "R"):
    gii = nib.load(fslr_dir / f"tpl-fsLR_den-32k_hemi-{hemi}_midthickness.surf.gii")
    v = gii.darrays[0].data.astype(np.float32)
    t = gii.darrays[1].data.astype(np.int32)
    vertices.append(v)
    triangles.append(t + offset)
    offset += len(v)

vertices = np.concatenate(vertices)   # (64984, 3)
triangles = np.concatenate(triangles) # (129960, 3)
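
The one subtle point when stitching hemispheres is the triangle-index offset: right-hemisphere triangles must be shifted by the number of left-hemisphere vertices. A toy check with two single-triangle "hemispheres":

```python
import numpy as np

# Two toy "hemispheres", each a single triangle over 3 vertices
v_l = np.zeros((3, 3), dtype=np.float32)
t_l = np.array([[0, 1, 2]], dtype=np.int32)
v_r = np.ones((3, 3), dtype=np.float32)
t_r = np.array([[0, 1, 2]], dtype=np.int32)

vertices, triangles, offset = [], [], 0
for v, t in ((v_l, t_l), (v_r, t_r)):
    vertices.append(v)
    triangles.append(t + offset)  # shift indices into the combined vertex array
    offset += len(v)

vertices = np.concatenate(vertices)
triangles = np.concatenate(triangles)
print(triangles.tolist())  # [[0, 1, 2], [3, 4, 5]]
```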

Map each vertex to a lobe by sampling the atlas in voxel space:

atlas_img = nib.load("lobar_atlas.nii.gz")
atlas_data = np.asarray(atlas_img.dataobj, dtype=np.int32)
inv_affine = np.linalg.inv(atlas_img.affine)

# MNI → voxel → nearest-neighbor sample
ones = np.ones((len(vertices), 1), dtype=np.float32)
vox = (inv_affine @ np.hstack([vertices, ones]).T).T[:, :3]
vi = np.round(vox).astype(np.int32)
for ax in range(3):
    np.clip(vi[:, ax], 0, atlas_data.shape[ax] - 1, out=vi[:, ax])

labels = atlas_data[vi[:, 0], vi[:, 1], vi[:, 2]]
region_mapping = labels.astype(np.int32) - 1  # 0-based, -1 = unmapped
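
The nearest-neighbor sampling logic can be exercised on a tiny synthetic volume with an identity affine (so world and voxel coordinates coincide); the 2×2×2 volume below is a hypothetical stand-in for the real atlas:

```python
import numpy as np

# 2x2x2 label volume, identity affine: world coords == voxel coords
atlas_data = np.zeros((2, 2, 2), dtype=np.int32)
atlas_data[1, 0, 0] = 5
inv_affine = np.eye(4)

# Two "vertices": one near voxel (1, 0, 0), one in background
verts = np.array([[0.9, 0.1, -0.2], [0.0, 0.0, 0.0]], dtype=np.float32)
ones = np.ones((len(verts), 1), dtype=np.float32)
vox = (inv_affine @ np.hstack([verts, ones]).T).T[:, :3]
vi = np.round(vox).astype(np.int32)
for ax in range(3):
    np.clip(vi[:, ax], 0, atlas_data.shape[ax] - 1, out=vi[:, ax])

labels = atlas_data[vi[:, 0], vi[:, 1], vi[:, 2]]
print((labels - 1).tolist())  # [4, -1]: label 5 -> region index 4, background -> -1
```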

Create the surface Network and link it to the parent:

from tvbo import Network
from tvbo.datamodel import tvbo_datamodel

surface_net = Network(nodes=[], edges=[], number_of_nodes=0)
surface_net.number_of_nodes = len(vertices)
surface_net.label = "Lobar surface (fsLR 32k)"
surface_net.descriptor = "surf"
surface_net.distance_unit = "mm"

# Store mesh geometry
mesh = tvbo_datamodel.Mesh(
    label="CorticalSurface",
    element_type="triangle",
    number_of_vertices=len(vertices),
    number_of_elements=len(triangles),
)
object.__setattr__(surface_net, '_mesh', mesh)
object.__setattr__(surface_net, '_mesh_vertices', vertices)
object.__setattr__(surface_net, '_mesh_elements', triangles)

# Link to parent SC network
surface_net.set_node_mapping(
    region_mapping,
    parent_network=sc_network,  # the saved lobar SC Network
    dataset_path="/mesh/region_mapping",
)

surface_net.save("my_surface.yaml")

Output Structure

The surface network produces:

File                     Size    Contents
*_desc-surf_relmat.yaml  ~600 B  Metadata + parent reference
*_desc-surf_relmat.h5    ~2 MB   Mesh (vertices, triangles, normals) + region mapping
*_desc-SC_relmat.yaml    ~2 KB   SC metadata (17 nodes, positions)
*_desc-SC_relmat.h5      ~3 KB   Weight + length matrices (17×17)

No connectivity matrices are duplicated — the surface network references the parent SC network for all edge data.

Complete Script

The full pipeline is implemented in scripts/create_lobar_network.py, which creates both the SC and surface networks with caching of intermediate files:

python scripts/create_lobar_network.py \
    --tractogram /path/to/dTOR.tck

# With persistent cache (skips tck2connectome if CSVs exist)
python scripts/create_lobar_network.py \
    --tractogram /path/to/dTOR.tck \
    --cache-dir /tmp/lobar_cache

Tips

  • Atlas dtype: Use int32 for the parcellation NIfTI to avoid floating-point warnings from tck2connectome.
  • Large tractograms: The dTOR group-average tractogram (~12 GB) takes significant time per tck2connectome pass. Use --cache-dir to avoid re-running.
  • Custom parcellations: Any integer-labeled NIfTI in the same template space as the tractogram works — you’re not limited to TemplateFlow atlases.
  • Surface coverage: Cortical surfaces (fsLR, fsaverage) cover only cortical regions — subcortical and cerebellar vertices won’t be mapped. Unmapped vertices get index -1 in the region mapping.
  • Schema validation: network.save() validates the YAML against the TVBO LinkML schema automatically.