refactor: use attributes #53

Draft: wants to merge 57 commits into base: master

Commits (57)
8d4b417
feat: updated most features (ex. connectivity, bold)
xgui3783 May 23, 2024
64fb5dd
feat: updated space to new schema
xgui3783 May 23, 2024
d15ad22
convert parcellations to attribute system
AhmetNSimsek May 23, 2024
4935823
fix: archive_options
xgui3783 May 23, 2024
56e6e27
fix: publication -> url/doi
xgui3783 May 23, 2024
20c025d
fix: space publication
xgui3783 May 24, 2024
b53f914
fix: dx.doi.org -> doi.org
xgui3783 May 24, 2024
c10f202
fix archive schema
xgui3783 May 27, 2024
0c81587
refactor: converted cnmtx to ecs
xgui3783 May 29, 2024
4d9fc4d
readd shortnames for spaces
AhmetNSimsek Jun 5, 2024
4d977ac
cell density: added aggregate by
xgui3783 Jun 7, 2024
77016e6
fix space_id in layerboundary
xgui3783 Jun 7, 2024
b23b8b0
convert maps to attribute system
AhmetNSimsek Jun 11, 2024
8c74629
Revert "convert maps to attribute system"
AhmetNSimsek Jun 12, 2024
a2831be
visf atlas: remove temp provider in data-proxy
AhmetNSimsek Jun 12, 2024
97ed4a0
convert maps to attribute system
AhmetNSimsek Jun 13, 2024
a203a84
Revert "convert maps to attribute system"
AhmetNSimsek Jun 13, 2024
69e7640
convert maps to attribute system
AhmetNSimsek Jun 13, 2024
0402df2
fix: aggregate by
xgui3783 Jun 13, 2024
d2c2626
fix icbm zip file
xgui3783 Jun 14, 2024
f7f324f
fix: add matrix to connectivity data
xgui3783 Jun 17, 2024
5e31a4c
correct version value for jba 3.1
AhmetNSimsek Jun 24, 2024
aaa50c5
Use mesh attributes for volumes with mesh format
AhmetNSimsek Jul 2, 2024
35fb6f1
Revert map config conversion to attribute system
AhmetNSimsek Jul 3, 2024
02c0f98
Update jba3.1 prerelase to EBRAINS release version
AhmetNSimsek Jul 3, 2024
690b438
reconvert jba 3.1 and 3.0.3 parc configs to attribute collection
AhmetNSimsek Jul 3, 2024
4d3f215
Convert maps to attribute collection
AhmetNSimsek Jul 3, 2024
34551e8
Fix: difumo statistical maps, missing zeroth z vals
AhmetNSimsek Jul 11, 2024
cb4235d
fix: difumo labelled maps
xgui3783 Jul 12, 2024
d37a280
updates cohort, paradigm to aggregate by
xgui3783 Jul 12, 2024
a366642
fix: publication -> doi
xgui3783 Jul 12, 2024
8eb8585
Update maps and spaces to use updated volume schema utilizing mapping
AhmetNSimsek Jul 15, 2024
0d0639b
fix aggregate -> facet
xgui3783 Jul 15, 2024
895d443
fix map schema
xgui3783 Jul 16, 2024
01d8598
feat: add category to modality
xgui3783 Jul 17, 2024
36ffa5d
feat: convert bold
xgui3783 Jul 23, 2024
0c3e953
fix: julich brain doi references
xgui3783 Aug 14, 2024
0b3a3f7
fix ci: check precompmesh, rm connectivity, checkschema
xgui3783 Aug 14, 2024
16d729a
feat: fix parcellation -> parcellationscheme
xgui3783 Aug 15, 2024
f6978f0
fix: parcellationschemes path
xgui3783 Aug 15, 2024
3e5ef86
renamed: facet -> categorization
xgui3783 Aug 19, 2024
d6e3084
tmp: new map under _maps
xgui3783 Aug 19, 2024
3c5ec50
feat: auto calculate/commit sparseindex
xgui3783 Aug 20, 2024
cec440a
fix: json duplicated name
xgui3783 Aug 20, 2024
363eb1b
fix: sparse index trigger
xgui3783 Aug 20, 2024
c4dbb17
trigger workflow
xgui3783 Aug 20, 2024
e72dd1e
fix using secrets
xgui3783 Aug 20, 2024
546d4d3
fix ambiguous statistical map
xgui3783 Aug 20, 2024
ca59f4d
fix: mapname contain spaces
xgui3783 Aug 20, 2024
2e375b3
convert the remaining
xgui3783 Aug 20, 2024
836d21a
Update check-trigger-sparseindex.yaml
xgui3783 Aug 20, 2024
0e5b502
add sparseindex volume to maps
xgui3783 Aug 21, 2024
0d4c2b5
feat: update regional matrix
xgui3783 Aug 26, 2024
40c5b72
discard temp folder `_maps`
AhmetNSimsek Aug 30, 2024
ecb3310
update maps to new schema
AhmetNSimsek Aug 30, 2024
90156b2
WIP: Use attribute_mapping for space template variants
AhmetNSimsek Sep 11, 2024
0ec54a9
fix: matrix row/col remapper
xgui3783 Oct 30, 2024
93 changes: 93 additions & 0 deletions .github/workflows/calc_upload_sparseindex.yaml
@@ -0,0 +1,93 @@
on:
  workflow_call:
    inputs:
      parcellation_id:
        type: string
        required: true
      space_id:
        type: string
        required: true
      filename:
        type: string
        description: ""
        required: true
      mapname:
        type: string
        description: ""
        required: true
    secrets:
      client-id:
        required: true
      client-secret:
        required: true

jobs:
  calculate-sparseindex:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/checkout@v4
        with:
          repository: fzj-inm1-bda/siibra-python
          ref: refactor_attr
          path: siibra-python
      - uses: actions/setup-python@v5
        with:
          python-version: '3.10'
      - run: pip install -e ./siibra-python/
      - run: |
          python _ci/create_sparse_index.py \
            ${{ inputs.parcellation_id }} \
            ${{ inputs.space_id }} \
            ${{ inputs.filename }} \
            statistical \
            '${{ inputs.mapname }}'
      - uses: actions/upload-artifact@v4
        with:
          name: ${{ inputs.filename }}
          path: ${{ inputs.filename }}*
          retention-days: 1
          if-no-files-found: error

  upload-data-file:
    needs: calculate-sparseindex
    runs-on: ubuntu-latest
    strategy:
      fail-fast: true
      matrix:
        extension:
          - .sparseindex.alias.json
          - .sparseindex.probs.txt
          - .sparseindex.voxel.nii.gz

    steps:
      - uses: actions/download-artifact@v4
        with:
          name: ${{ inputs.filename }}

      - uses: FZJ-INM1-BDA/iav-dep-test/.github/actions/upload_dataproxy@master
        with:
          upload-file: ${{ inputs.filename }}${{ matrix.extension }}
          bucket-name: reference-atlas-data
          dest-path: sparse-indices/${{ inputs.filename }}${{ matrix.extension }}
          client-id: ${{ secrets.client-id }}
          client-secret: ${{ secrets.client-secret }}

  upload-meta-file:
    needs:
      - calculate-sparseindex
      - upload-data-file
    runs-on: ubuntu-latest

    steps:
      - uses: actions/download-artifact@v4
        with:
          name: ${{ inputs.filename }}

      - uses: FZJ-INM1-BDA/iav-dep-test/.github/actions/upload_dataproxy@master
        with:
          upload-file: ${{ inputs.filename }}
          bucket-name: reference-atlas-data
          dest-path: sparse-indices/${{ inputs.filename }}
          client-id: ${{ secrets.client-id }}
          client-secret: ${{ secrets.client-secret }}
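The reusable workflow above implies that `_ci/create_sparse_index.py` accepts five positional arguments in the order shown in its `run` step. A minimal sketch of that command-line surface — only the argument order comes from the workflow; the parser details are an assumption, and the real script in this repository does the actual sparse-index computation:

```python
# Hypothetical sketch of the CLI implied by the workflow step:
#   python _ci/create_sparse_index.py <parcellation_id> <space_id> <filename> statistical '<mapname>'
import argparse


def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(
        description="Compute a sparse index for one statistical map (sketch only)"
    )
    parser.add_argument("parcellation_id")
    parser.add_argument("space_id")
    parser.add_argument("filename")  # basename shared by the three output artifacts
    parser.add_argument("maptype")   # the workflow always passes "statistical"
    parser.add_argument("mapname")   # quoted in the workflow, so it may contain spaces
    return parser
```

Note that `mapname` is single-quoted in the workflow, which matters because map names with spaces appear later in this PR ("fix: mapname contain spaces").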
5 changes: 1 addition & 4 deletions .github/workflows/check-maps-precompmesh.yml
@@ -1,9 +1,6 @@
 name: '[test] check maps precompmesh'

-on:
-  push:
-    paths:
-      - 'maps/**.json'
+on: [push]

 jobs:
   check_maps_precompmesh:
12 changes: 6 additions & 6 deletions .github/workflows/check-schema.yml
@@ -12,20 +12,20 @@ jobs:
       with:
         python-version: '3.10'

-    - name: checkout siibra-python
+    - name: checkout siibra-schema
       uses: actions/checkout@v4
       with:
-        repository: FZJ-INM1-BDA/siibra-python
-        path: siibra-python-${{ github.run_id }}-${{ github.run_number }}
+        repository: FZJ-INM1-BDA/siibra-schema
+        path: siibra-schema-${{ github.run_id }}-${{ github.run_number }}
         fetch-depth: 1
         clean: True
         ref: 'main'

     - name: move siibra-python one up from workspace
-      run: mv siibra-python-${{ github.run_id }}-${{ github.run_number }} ../siibra-python
+      run: mv siibra-schema-${{ github.run_id }}-${{ github.run_number }} ../siibra-schema

     - name: Install requirements
-      run: pip install -r ../siibra-python/config_schema/requirements.txt
+      run: pip install -r ../siibra-schema/requirements.txt

     - name: check schema
-      run: python ../siibra-python/config_schema/check_schema.py ./
+      run: python ../siibra-schema/code/validate.py ./
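The new `validate.py ./` step walks the configuration tree and validates each JSON document against the siibra-schema definitions. As a rough stdlib-only sketch of that kind of check — the real validator's logic is not shown in this diff, so the required-key set below is invented:

```python
# Minimal sketch of a config-tree JSON check, assuming each attribute-based
# document must at least parse and carry an "@type" key. The actual
# siibra-schema validator is far more thorough.
import json
import os

REQUIRED_KEYS = {"@type"}  # illustrative only


def check_text(text: str):
    """Return the sorted list of missing required keys ([] means it passes)."""
    doc = json.loads(text)
    return sorted(REQUIRED_KEYS - doc.keys())


def walk_configs(root: str):
    """Yield (path, missing_keys) for every .json file under root."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for filename in filenames:
            if not filename.endswith(".json"):
                continue
            path = os.path.join(dirpath, filename)
            with open(path) as fp:
                yield path, check_text(fp.read())
```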
30 changes: 30 additions & 0 deletions .github/workflows/check-trigger-sparseindex.yaml
@@ -0,0 +1,30 @@
name: '[build] sparse index'

on: [push]
jobs:
  check-sparse-index:
    runs-on: ubuntu-latest
    outputs:
      matrix: ${{ steps.configure-matrix.outputs.matrix }}
    steps:
      - uses: actions/checkout@v4
      - id: configure-matrix
        run: |
          matrix=$(python _ci/check_sparse_index.py)
          echo "matrix=$matrix" >> $GITHUB_OUTPUT

  calc-upload-sparseindex:
    needs: check-sparse-index
    strategy:
      fail-fast: false
      matrix: ${{ fromJSON(needs.check-sparse-index.outputs.matrix) }}
    uses: ./.github/workflows/calc_upload_sparseindex.yaml
    with:
      parcellation_id: ${{ matrix.parcellation_id }}
      space_id: ${{ matrix.space_id }}
      filename: ${{ matrix.filename }}
      mapname: ${{ matrix.mapname }}
    secrets:
      client-id: ${{ secrets.EBRAINS_OIDC_SIIBRA_CI_CLIENT_ID }}
      client-secret: ${{ secrets.EBRAINS_OIDC_SIIBRA_CI_CLIENT_SECRET }}

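`fromJSON(...)` above expects `_ci/check_sparse_index.py` to print a JSON object in GitHub's matrix format, with one `include` entry per map that still needs a sparse index. A sketch of that output contract — the entry keys mirror the `with:` block, while the detection of pending maps (comparing configs against the data proxy) is omitted and assumed:

```python
# Sketch of the matrix JSON consumed via fromJSON() by the caller of the
# reusable workflow. Key names come from the workflow's `with:` inputs;
# the pending-map detection itself is not shown.
import json


def build_matrix(pending_maps) -> str:
    """pending_maps: iterable of (parcellation_id, space_id, filename, mapname)."""
    return json.dumps({
        "include": [
            {
                "parcellation_id": parcellation_id,
                "space_id": space_id,
                "filename": filename,
                "mapname": mapname,
            }
            for parcellation_id, space_id, filename, mapname in pending_maps
        ]
    })
```

The `configure-matrix` step then writes this string to `$GITHUB_OUTPUT` as `matrix=...`, fanning out one `calc-upload-sparseindex` job per entry.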
38 changes: 0 additions & 38 deletions .github/workflows/check_connectivity_json.yml

This file was deleted.

15 changes: 0 additions & 15 deletions .github/workflows/test-configuration.yml
@@ -47,21 +47,6 @@ jobs:
     - name: Region attribute compliance
       run: python _ci/region_attr_compliance.py

-  check_map_volume_idx:
-    runs-on: ubuntu-latest
-    steps:
-      - uses: actions/checkout@v4
-      - name: Set up Python 3.10
-        uses: actions/setup-python@v4
-        with:
-          python-version: '3.10'
-      - name: Install requirements
-        run: |
-          pip install -U pip
-          pip install -r _ci/requirements.txt
-      - name: Verify map volume indices
-        run: python _ci/verify_volume_idx.py

   check_maps:
     runs-on: ubuntu-latest
     steps:
119 changes: 58 additions & 61 deletions _ci/check_precompmesh.py
@@ -3,6 +3,7 @@
 from concurrent.futures import ThreadPoolExecutor
 import requests
 from enum import Enum
+from typing import List

 PATH_TO_MAPS = "./maps"
 MAX_WORKERS = 4
@@ -14,71 +15,43 @@ class ValidationResult(Enum):

 PRECOMPMESH = "neuroglancer/precompmesh"
 PRECOMPUTED = "neuroglancer/precomputed"
+IMAGE_TYPE = "siibra/attr/data/image/v0.1"

-def check_url(url: str, regionname: str):
+def check_url(url: str):
     try:
         resp = requests.get(url)
         resp.raise_for_status()
-        assert "fragments" in resp.json()
+        assert "fragments" in resp.json(), f"fragment key not found for {url}"
         return (
-            url, regionname,
+            url,
             ValidationResult.PASSED,
             None
         )
     except Exception as e:
         return (
-            url, regionname,
+            url,
             ValidationResult.FAILED,
             str(e)
         )

-def check_volume(arg):
-    full_filname, vol_idx, volume, indices = arg
-    try:
-        assert PRECOMPUTED in volume.get("providers"), f"volume should have neuroglancer/precompmesh, but does not. url keys are: {volume.get('providers').keys()}"
-        precomp_url = volume["providers"][PRECOMPUTED]
-
-        def check_provider(url):
-            resp = requests.get(f"{url}/info")
-            resp.raise_for_status()
-            precomp_info = resp.json()
-            assert ("mesh" in precomp_info) == (PRECOMPMESH in volume.get("providers")), f"Error in: {full_filname} volidx: {vol_idx}: mesh key exist in precomputed: {'mesh' in precomp_info}, precomputed mesh url exists: {PRECOMPMESH in volume.get('providers')}"
-
-            if "mesh" in precomp_info:
-                mesh_path = precomp_info["mesh"]
-                regions_to_check = [(region, mapped_index.get("label"))
-                                    for region, mapped_indicies in indices.items()
-                                    for mapped_index in mapped_indicies
-                                    if mapped_index.get("volume") == vol_idx]
-
-                print(f"Checking {url} ... {len(regions_to_check)} labels.")
-                with ThreadPoolExecutor(max_workers=MAX_WORKERS) as ex:
-                    indicies_result = ex.map(
-                        check_url,
-                        [f"{url}/{mesh_path}/{item[1]}:0" for item in regions_to_check],
-                        [item[0] for item in regions_to_check]
-                    )
-                failed = [result for result in indicies_result if result[-2] == ValidationResult.FAILED]
-                assert len(failed) == 0, f"""region indices mapping failed, {', '.join([f[-1] for f in failed])}"""
-
-        if isinstance(precomp_url, dict):
-            for url in precomp_url.values():
-                check_provider(url)
-        elif isinstance(precomp_url, str):
-            check_provider(precomp_url)
-        else:
-            raise ValueError(f"precompurl not a dict nor a str")
-        return (
-            full_filname, vol_idx, volume, indices,
-            ValidationResult.PASSED,
-            None
-        )
-    except Exception as e:
-        return (
-            full_filname, vol_idx, volume, indices,
-            ValidationResult.FAILED,
-            str(e)
-        )
+def check_ng_precomp_volume(url: str, indices: List[int]):
+
+    resp = requests.get(f"{url}/info")
+    resp.raise_for_status()
+    precomp_info = resp.json()
+    assert "mesh" in precomp_info, f"mesh key does not exist in precompmesh"
+
+    mesh_path = precomp_info["mesh"]
+
+    urls_to_check = [f"{url}/{mesh_path}/{idx}:0" for idx in indices]
+    with ThreadPoolExecutor(max_workers=MAX_WORKERS) as ex:
+        return [
+            (result, err)
+            for url, result, err in ex.map(
+                check_url,
+                urls_to_check
+            )
+        ]

 def iterate_jsons(path_to_walk:str="."):
     for dirpath, dirnames, filenames in os.walk(path_to_walk):
@@ -93,25 +93,66 @@ def iterate_jsons(path_to_walk:str="."):


 def main():
-    args = [
-        (full_filname, vol_idx, volume, map_json.get("indices") )
-        for full_filname, _, map_json in iterate_jsons(PATH_TO_MAPS)
-        for vol_idx, volume in enumerate(map_json.get("volumes"))
-        if PRECOMPUTED in volume.get("providers") or PRECOMPMESH in volume.get("providers")
+    failed = []
+    skipped = []
+    attrs = [
+        (full_filname, attr_idx, attr,
+         [
+             (regionname, mapping.get("label"))
+             for regionname, mapping in attr.get("mapping", {}).items()
+         ])
+        for full_filname, _, map_json in iterate_jsons()
+        for attr_idx, attr in enumerate(map_json.get("attributes", []))
+        if (
+            attr.get("@type") == IMAGE_TYPE
+            and attr.get("format") == PRECOMPMESH
+        )
     ]

-    print(f"Main: {len(args)} maps.")
+    filtered_attrs = []
+    for full_filname, attr_idx, attr, list_t_regionname_label in attrs:
+        if any(label is None for regionname, label in list_t_regionname_label):
+            failed.append(
+                (ValidationResult.FAILED, f"{full_filname} validation failed. attribute at index {attr_idx}, some mapping does not have label key")
+            )
+            continue
+        filtered_attrs.append(
+            (full_filname, attr_idx, attr, list_t_regionname_label)
+        )
+    urls = [attr.get("url") for full_filname, attr_idx, attr, list_t_regionname_label in filtered_attrs]
+    indices = [[label for regionname, label in list_t_regionname_label]
+               for full_filname, attr_idx, attr, list_t_regionname_label in filtered_attrs]
+
+    print(f"Main: {len(attrs)} maps.")
     with ThreadPoolExecutor(max_workers=MAX_WORKERS) as ex:
-        result = list(ex.map(check_volume, args))
+        result = [v for ll in list(ex.map(check_ng_precomp_volume, urls, indices))
+                  for v in ll]

-    passed = [r for r in result if r[-2] == ValidationResult.PASSED]
-    failed = [r for r in result if r[-2] == ValidationResult.FAILED]
-    skipped = [r for r in result if r[-2] == ValidationResult.SKIPPED]
+    passed = [(r, text) for r, text in result if r == ValidationResult.PASSED]
+    failed += [(r, text) for r, text in result if r == ValidationResult.FAILED]
+    skipped += [(r, text) for r, text in result if r == ValidationResult.SKIPPED]

-    print(f"PASSED: {len(passed)}, FAILED: {len(failed)}, SKIPPED: {len(skipped)}, TOTAL: {len(args)} {len(result)}")
+    print(f"PASSED: {len(passed)}, FAILED: {len(failed)}, SKIPPED: {len(skipped)}, TOTAL: {len(attrs)} {len(result)}")
     with open("./missing.txt", "w") as fp:
-        fp.write("\n".join([f[-1] for f in failed]))
+        fp.write("\n".join([text for f, text in failed]))
         fp.write("\n")
-    assert len(failed) == 0, "\n".join([f[-1] for f in failed])
+    assert len(failed) == 0, "\n".join([text for f, text in failed])
 if __name__ == "__main__":
     main()
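For each precompmesh image attribute, the refactored checker requests one mesh manifest per mapped label at `<url>/<mesh_path>/<label>:0` and asserts the response carries a `fragments` key. A small offline helper showing how those URLs are assembled from a volume's `info` document — the URL pattern and the `mesh` field come from the checker above, while the sample values in the test are invented:

```python
# Builds the per-label neuroglancer mesh-manifest URLs that
# check_ng_precomp_volume probes. Raises KeyError when the info document
# has no "mesh" entry, mirroring the assert in the checker.
from typing import Dict, List


def mesh_fragment_urls(volume_url: str, info: Dict, labels: List[int]) -> List[str]:
    mesh_path = info["mesh"]  # e.g. "mesh" in a precomputed info file
    return [f"{volume_url.rstrip('/')}/{mesh_path}/{label}:0" for label in labels]
```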