Research Unit MDC

The amount of image data, algorithms and visualization solutions is growing vastly. This results in the urgent demand for integration across multiple modalities and scales in space and time.
Helmholtz Imaging Research Unit MDC

Integrative imaging data sciences: we focus on integrating heterogeneous imaging data across modalities, scales, and time. They develop concepts and algorithms for generic processing, stitching, fusion, and visualization of large, high-dimensional datasets. 

imaging pipeline, two stages in the middle of the imaging workflow

The amount of image data, algorithms and visualization solutions is growing vastly. This results in the urgent demand for integration across multiple modalities and scales in space and time. We develop and provide HI solutions that can handle the very heterogeneous image data from the research areas of the Helmholtz Association without imposing restrictions on the respective image modalities. To lay the groundwork for the implementation of HI solutions, our team at MDC will focus on the following research topics:

  • Develop concepts and algorithms for handling and generic processing of high-dimensional datasets.
  • Develop algorithms for large, high-dimensional image data stitching, fusion and visualization

Our “Integrative Imaging Data Science” Group specializes on new concepts, mathematical approaches and representations for large scale image data. Our work includes the development of algorithms, computational resources and visualization solutions across scales in space, time and modalities. We bundle expertise in:

  • concepts and algorithms for handling, generic processing and representation of high-dimensional datasets
  • image analysis and visualization across scales
  • frameworks for large data management, image analysis and data abstraction

Publications

4725570 HI Science Unit MDC 1 https://helmholtz-imaging.de/apa-bold-title.csl 50 creator asc 189 https://helmholtz-imaging.de/wp-content/plugins/zotpress/
%7B%22status%22%3A%22success%22%2C%22updateneeded%22%3Afalse%2C%22instance%22%3Afalse%2C%22meta%22%3A%7B%22request_last%22%3A0%2C%22request_next%22%3A0%2C%22used_cache%22%3Atrue%7D%2C%22data%22%3A%5B%7B%22key%22%3A%22K5QW62RV%22%2C%22library%22%3A%7B%22id%22%3A4725570%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Baumann%20et%20al.%22%2C%22parsedDate%22%3A%222024-02-13%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%26lt%3Bdiv%20class%3D%26quot%3Bcsl-bib-body%26quot%3B%20style%3D%26quot%3Bline-height%3A%201.35%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%26quot%3B%26gt%3B%5Cn%20%20%26lt%3Bdiv%20class%3D%26quot%3Bcsl-entry%26quot%3B%26gt%3BBaumann%2C%20E.%2C%20Dislich%2C%20B.%2C%20Rumberger%2C%20J.%20L.%2C%20Nagtegaal%2C%20I.%20D.%2C%20Martinez%2C%20M.%20R.%2C%20%26amp%3B%20Zlobec%2C%20I.%20%282024%2C%20February%2013%29.%20%26lt%3Bb%26gt%3BHoVer-NeXt%3A%20A%20Fast%20Nuclei%20Segmentation%20and%20Classification%20Pipeline%20for%20Next%20Generation%20Histopathology%26lt%3B%5C%2Fb%26gt%3B.%20Medical%20Imaging%20with%20Deep%20Learning.%20%26lt%3Ba%20class%3D%26%23039%3Bzp-ItemURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fopenreview.net%5C%2Fforum%3Fid%3D3vmB43oqIO%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fopenreview.net%5C%2Fforum%3Fid%3D3vmB43oqIO%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fdiv%26gt%3B%5Cn%26lt%3B%5C%2Fdiv%26gt%3B%22%2C%22data%22%3A%7B%22itemType%22%3A%22conferencePaper%22%2C%22title%22%3A%22HoVer-NeXt%3A%20A%20Fast%20Nuclei%20Segmentation%20and%20Classification%20Pipeline%20for%20Next%20Generation%20Histopathology%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Elias%22%2C%22lastName%22%3A%22Baumann%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Bastian%22%2C%22lastName%22%3A%22Dislich%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Josef%20Lorenz%22%2C%22lastName%22%3A%22Rumberger%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Iris%20D.%22%2C%22lastName%22%3A%22Nagtegaal%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Maria%20Rodriguez%22%2C%22lastName%22%3A%22Martinez%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Inti%22%2C%22lastName%22%3A%22Zlobec%22%7D%5D%2C%22abstractNote%22%3A%22In%20cancer%2C%20a%20variety%20of%20cell%20types%2C%20along%20with%20their%20local%20density%20and%20spatial%20organization%20within%20tissues%2C%20play%20a%20key%20role%20in%20driving%20cancer%20progression%20and%20modulating%20patient%20outcomes.%20At%20the%20basis%20of%20cancer%20diagnosis%20is%20the%20histopathological%20assessment%20of%20tissues%2C%20stained%20by%20hematoxylin%20%26amp%3B%20eosin%20%28H%26amp%3BE%29%2C%20which%20gives%20the%20nuclei%20of%20cells%20a%20dark%20purple%20appearance%2C%20making%20them%20particularly%20distinguishable%20and%20quantifiable.%20The%20identification%20of%20individual%20nuclei%2C%20whether%20in%20a%20proliferating%20%28mitosis%29%20or%20resting%20state%2C%20and%20their%20further%20phenotyping%20%28e.g.%20immune%20cells%29%20is%20the%20foundation%20on%20which%20histopathology%20images%20can%20be%20used%20for%20further%20investigations%20into%20cellular%20interaction%2C%20prognosis%20or%20response%20prediction.%20To%20this%20end%2C%20we%20develop%20a%20H%26amp%3BE%20based%20nuclei%20segmentation%20and%20classification%20model%20that%20is%20both%20fast%20%281.8s%5C%2Fmm2%20at%200.5mpp%2C%203.2s%5C%2Fmm2%20at%200.25mpp%29%20and%20accurate%20%280.84%20binary%20F1%2C%200.758%20mean%20balanced%20Accuracy%29%20which%20allows%20us%20to%20investigate%20the%20cellular%20composition%20of%20large-scale%20colorectal%20cancer%20%28CRC%29%20cohorts.%20We%20extend%20the%20publicly%20available%20Lizard%20CRC%20nuclei%20dataset%20with%20a%20mitosis%20class%20and%20publish%20further%20validation%20data%20for%20the%20rarest%20classes%3A%20mitosis%20and%20eosinophils.%20Moreover%2C%20our%20pipeline%20is%205%5Cu00d7%20faster%20than%20the%20CellViT%20pipeline%2C%2017%5Cu00d7%20faster%20than%20the%20HoVer-Net%20pipeline%2C%20and%20performs%20competitively%20on%20the%20PanNuke%20pan-cancer%20nuclei%20dataset%20%2847.7%20mPQTiss%2C%20%2B3%25%20over%20HoVer-Net%29.%20Our%20work%20paves%20the%20way%20towards%20extensive%20single-cell%20information%20directly%20from%20H%26amp%3BE%20slides%2C%20leading%20to%20a%20quantitative%20view%20of%20whole%20slide%20images.%20Code%2C%20model%20weights%20as%20well%20as%20all%20additional%20training%20and%20validation%20data%2C%20are%20publicly%20available%20on%20github.%22%2C%22date%22%3A%222024%5C%2F02%5C%2F13%22%2C%22proceedingsTitle%22%3A%22%22%2C%22conferenceName%22%3A%22Medical%20Imaging%20with%20Deep%20Learning%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%22%22%2C%22ISBN%22%3A%22%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fopenreview.net%5C%2Fforum%3Fid%3D3vmB43oqIO%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222025-02-03T08%3A58%3A46Z%22%7D%7D%2C%7B%22key%22%3A%225IN7K2RG%22%2C%22library%22%3A%7B%22id%22%3A4725570%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Cersovsky%20et%20al.%22%2C%22parsedDate%22%3A%222023-11-20%22%2C%22numChildren%22%3A3%7D%2C%22bib%22%3A%22%26lt%3Bdiv%20class%3D%26quot%3Bcsl-bib-body%26quot%3B%20style%3D%26quot%3Bline-height%3A%201.35%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%26quot%3B%26gt%3B%5Cn%20%20%26lt%3Bdiv%20class%3D%26quot%3Bcsl-entry%26quot%3B%26gt%3BCersovsky%2C%20J.%2C%20Mohammadi%2C%20S.%2C%20Kainmueller%2C%20D.%2C%20%26amp%3B%20Hoehne%2C%20J.%20%282023%29.%20%26lt%3Bi%26gt%3B%26lt%3Bb%26gt%3B%26lt%3Bspan%20style%3D%26quot%3Bfont-style%3Anormal%3B%26quot%3B%26gt%3BTowards%20Hierarchical%20Regional%20Transformer-based%20Multiple%20Instance%20Learning%26lt%3B%5C%2Fspan%26gt%3B%26lt%3B%5C%2Fb%26gt%3B%26lt%3B%5C%2Fi%26gt%3B%20%28arXiv%3A2308.12634%29.%20arXiv.%20%26lt%3Ba%20class%3D%26%23039%3Bzp-DOIURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FarXiv.2308.12634%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FarXiv.2308.12634%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fdiv%26gt%3B%5Cn%26lt%3B%5C%2Fdiv%26gt%3B%22%2C%22data%22%3A%7B%22itemType%22%3A%22preprint%22%2C%22title%22%3A%22Towards%20Hierarchical%20Regional%20Transformer-based%20Multiple%20Instance%20Learning%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Josef%22%2C%22lastName%22%3A%22Cersovsky%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Sadegh%22%2C%22lastName%22%3A%22Mohammadi%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Dagmar%22%2C%22lastName%22%3A%22Kainmueller%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Johannes%22%2C%22lastName%22%3A%22Hoehne%22%7D%5D%2C%22abstractNote%22%3A%22The%20classification%20of%20gigapixel%20histopathology%20images%20with%20deep%20multiple%20instance%20learning%20models%20has%20become%20a%20critical%20task%20in%20digital%20pathology%20and%20precision%20medicine.%20In%20this%20work%2C%20we%20propose%20a%20Transformer-based%20multiple%20instance%20learning%20approach%20that%20replaces%20the%20traditional%20learned%20attention%20mechanism%20with%20a%20regional%2C%20Vision%20Transformer%20inspired%20self-attention%20mechanism.%20We%20present%20a%20method%20that%20fuses%20regional%20patch%20information%20to%20derive%20slide-level%20predictions%20and%20show%20how%20this%20regional%20aggregation%20can%20be%20stacked%20to%20hierarchically%20process%20features%20on%20different%20distance%20levels.%20To%20increase%20predictive%20accuracy%2C%20especially%20for%20datasets%20with%20small%2C%20local%20morphological%20features%2C%20we%20introduce%20a%20method%20to%20focus%20the%20image%20processing%20on%20high%20attention%20regions%20during%20inference.%20Our%20approach%20is%20able%20to%20significantly%20improve%20performance%20over%20the%20baseline%20on%20two%20histopathology%20datasets%20and%20points%20towards%20promising%20directions%20for%20further%20research.%22%2C%22genre%22%3A%22%22%2C%22repository%22%3A%22arXiv%22%2C%22archiveID%22%3A%22arXiv%3A2308.12634%22%2C%22date%22%3A%222023-11-20%22%2C%22DOI%22%3A%2210.48550%5C%2FarXiv.2308.12634%22%2C%22citationKey%22%3A%22%22%2C%22url%22%3A%22http%3A%5C%2F%5C%2Farxiv.org%5C%2Fabs%5C%2F2308.12634%22%2C%22language%22%3A%22%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222024-02-28T15%3A33%3A18Z%22%7D%7D%2C%7B%22key%22%3A%22YQAWUPZY%22%2C%22library%22%3A%7B%22id%22%3A4725570%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Cole%22%2C%22parsedDate%22%3A%222020%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%26lt%3Bdiv%20class%3D%26quot%3Bcsl-bib-body%26quot%3B%20style%3D%26quot%3Bline-height%3A%201.35%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%26quot%3B%26gt%3B%5Cn%20%20%26lt%3Bdiv%20class%3D%26quot%3Bcsl-entry%26quot%3B%26gt%3BCole%2C%20J.%20H.%20%282020%29.%20%26lt%3Bb%26gt%3BMultimodality%20neuroimaging%20brain-age%20in%20UK%20biobank%3A%20relationship%20to%20biomedical%2C%20lifestyle%2C%20and%20cognitive%20factors%26lt%3B%5C%2Fb%26gt%3B.%20%26lt%3Bi%26gt%3BNeurobiology%20of%20Aging%26lt%3B%5C%2Fi%26gt%3B%2C%20%26lt%3Bi%26gt%3B92%26lt%3B%5C%2Fi%26gt%3B%2C%2034%26%23x2013%3B42.%20%26lt%3Ba%20class%3D%26%23039%3Bzp-DOIURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1016%5C%2Fj.neurobiolaging.2020.03.014%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1016%5C%2Fj.neurobiolaging.2020.03.014%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fdiv%26gt%3B%5Cn%26lt%3B%5C%2Fdiv%26gt%3B%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Multimodality%20neuroimaging%20brain-age%20in%20UK%20biobank%3A%20relationship%20to%20biomedical%2C%20lifestyle%2C%20and%20cognitive%20factors%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22James%20H.%22%2C%22lastName%22%3A%22Cole%22%7D%5D%2C%22abstractNote%22%3A%22Biomedical%20image%20analysis%20plays%20a%20pivotal%20role%20in%20advancing%20our%20understanding%20of%20the%20human%20body%5Cu2019s%20functioning%20across%20different%20scales%2C%20usually%20based%20on%20deep%20learning-based%20methods.%20However%2C%20deep%20learning%20methods%20are%20notoriously%20data%20hungry%2C%20which%20poses%20a%20problem%20in%20fields%20where%20data%20is%20difficult%20to%20obtain%20such%20as%20in%20neuroscience.%20Transfer%20learning%20%28TL%29%20has%20become%20a%20popular%20and%20successful%20approach%20to%20cope%20with%20this%20issue%2C%20but%20is%20difficult%20to%20apply%20in%20practise%20due%20the%20many%20parameters%20it%20requires%20to%20set%20properly.%20Here%2C%20we%20present%20TLIMB%2C%20a%20novel%20python-based%20framework%20for%20easy%20development%20of%20optimized%20and%20scalable%20TL-based%20image%20analysis%20pipelines%20in%20the%20neurosciences.%20TLIMB%20allows%20for%20an%20intuitive%20configuration%20of%20source%20%5C%2F%20target%20data%20sets%2C%20specific%20TL-approach%20and%20deep%20learning-architecture%2C%20and%20hyperparameter%20optimization%20method%20for%20a%20given%20data%20analysis%20pipeline%20and%20compiles%20these%20into%20a%20nextflow%20workflow%20for%20seamless%20execution%20over%20different%20infrastructures%2C%20ranging%20from%20multicore%20servers%20to%20large%20compute%20clusters.%20Our%20evaluation%20using%20a%20pipeline%20for%20analysing%2010.000%20MRI%20images%20of%20the%20human%20brain%20from%20the%20UK%20Biobank%20shows%20that%20TLIMB%20is%20easy%20to%20use%2C%20incurs%20negligible%20overhead%20and%20can%20scale%20across%20different%20cluster%20sizes.%22%2C%22date%22%3A%2208%5C%2F2020%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.1016%5C%2Fj.neurobiolaging.2020.03.014%22%2C%22ISSN%22%3A%2201974580%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Flinkinghub.elsevier.com%5C%2Fretrieve%5C%2Fpii%5C%2FS0197458020301056%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222025-02-03T08%3A47%3A51Z%22%7D%7D%2C%7B%22key%22%3A%222EUHENGV%22%2C%22library%22%3A%7B%22id%22%3A4725570%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Dohmen%20et%20al.%22%2C%22parsedDate%22%3A%222024-05%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%26lt%3Bdiv%20class%3D%26quot%3Bcsl-bib-body%26quot%3B%20style%3D%26quot%3Bline-height%3A%201.35%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%26quot%3B%26gt%3B%5Cn%20%20%26lt%3Bdiv%20class%3D%26quot%3Bcsl-entry%26quot%3B%26gt%3BDohmen%2C%20M.%2C%20Mittermaier%2C%20M.%2C%20Rumberger%2C%20J.%20L.%2C%20Yang%2C%20L.-L.%2C%20Gruber%2C%20A.%20D.%2C%20Toennies%2C%20M.%2C%20Hippenstiel%2C%20S.%2C%20Kainmueller%2C%20D.%2C%20%26amp%3B%20Hocke%2C%20A.%20C.%20%282024%29.%20%26lt%3Bb%26gt%3BSimultaneous%20Lung%20Cell%20and%20Nucleus%20Segmentation%20From%20Labelled%20Versus%20Unlabelled%20Human%20Lung%20DIC%20Images%26lt%3B%5C%2Fb%26gt%3B.%20%26lt%3Bi%26gt%3B2024%20IEEE%20International%20Symposium%20on%20Biomedical%20Imaging%20%28ISBI%29%26lt%3B%5C%2Fi%26gt%3B%2C%201%26%23x2013%3B5.%20%26lt%3Ba%20class%3D%26%23039%3Bzp-DOIURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1109%5C%2FISBI56570.2024.10635198%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1109%5C%2FISBI56570.2024.10635198%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fdiv%26gt%3B%5Cn%26lt%3B%5C%2Fdiv%26gt%3B%22%2C%22data%22%3A%7B%22itemType%22%3A%22conferencePaper%22%2C%22title%22%3A%22Simultaneous%20Lung%20Cell%20and%20Nucleus%20Segmentation%20From%20Labelled%20Versus%20Unlabelled%20Human%20Lung%20DIC%20Images%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Melanie%22%2C%22lastName%22%3A%22Dohmen%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Mirja%22%2C%22lastName%22%3A%22Mittermaier%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22J.%20Lorenz%22%2C%22lastName%22%3A%22Rumberger%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Li-Ling%22%2C%22lastName%22%3A%22Yang%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Achim%20D.%22%2C%22lastName%22%3A%22Gruber%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Mario%22%2C%22lastName%22%3A%22Toennies%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Stefan%22%2C%22lastName%22%3A%22Hippenstiel%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Dagmar%22%2C%22lastName%22%3A%22Kainmueller%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Andreas%20C.%22%2C%22lastName%22%3A%22Hocke%22%7D%5D%2C%22abstractNote%22%3A%22For%20high-throughput%20and%20quantitative%20analyses%20of%20tissue%20microscopy%20images%20such%20as%20tissue%20morphology%2C%20cell%20type%20allocation%20and%20counting%2C%20simultaneous%20semantic%20and%20instance%20segmentation%20is%20needed.%20However%2C%20only%20few%20public%20data%20is%20available%20for%20developing%20such%20methods.%20Therefore%2C%20we%20provide%20a%20data%20set%20of%20differential%20interference%20contrast%20%28DIC%29%20images%20of%20human%20lung%20tissue%20complemented%20by%20multiplex%20fluorescence%20labelling%20including%20semantic%2C%20cell%20and%20nucleus%20instance%20annotations.We%20examined%20the%20structures%20and%20tasks%20for%20which%20fluorescence%20labeling%20is%20essential%20and%20compared%20one%20set%20of%20deep%20neural%20networks%20%28DNNs%29%20trained%20solely%20on%20unlabeled%20DIC%20images%20with%20another%20set%20that%20also%20utilized%20fluorescence%20labeling.%20Our%20findings%20indicate%20that%20while%20fluorescence%20labeling%20is%20crucial%20for%20the%20detection%20of%20the%20majority%20of%20cell%20nuclei%2C%20certain%20cells%20and%20tissue%20compartments%20can%20be%20quantified%20without%20it.Our%20analysis%20also%20extends%20currently%20limited%20knowledge%20about%20cell%20type%20composition%20in%20normal%20human%20lung%20tissue%20and%20contributes%20to%20further%20advances%20in%20quantitative%20tissue%20characterization%20in%20health%20and%20disease.%22%2C%22date%22%3A%222024-05%22%2C%22proceedingsTitle%22%3A%222024%20IEEE%20International%20Symposium%20on%20Biomedical%20Imaging%20%28ISBI%29%22%2C%22conferenceName%22%3A%222024%20IEEE%20International%20Symposium%20on%20Biomedical%20Imaging%20%28ISBI%29%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%2210.1109%5C%2FISBI56570.2024.10635198%22%2C%22ISBN%22%3A%22%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fieeexplore.ieee.org%5C%2Fdocument%5C%2F10635198%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222025-01-14T13%3A42%3A45Z%22%7D%7D%2C%7B%22key%22%3A%22FLEHEF8A%22%2C%22library%22%3A%7B%22id%22%3A4725570%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Franzen%20et%20al.%22%2C%22parsedDate%22%3A%222024-11-11%22%2C%22numChildren%22%3A3%7D%2C%22bib%22%3A%22%26lt%3Bdiv%20class%3D%26quot%3Bcsl-bib-body%26quot%3B%20style%3D%26quot%3Bline-height%3A%201.35%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%26quot%3B%26gt%3B%5Cn%20%20%26lt%3Bdiv%20class%3D%26quot%3Bcsl-entry%26quot%3B%26gt%3BFranzen%2C%20J.%2C%20Winklmayr%2C%20C.%2C%20Guarino%2C%20V.%20E.%2C%20Karg%2C%20C.%2C%20Yu%2C%20X.%2C%20Koreuber%2C%20N.%2C%20Albrecht%2C%20J.%20P.%2C%20Bischoff%2C%20P.%2C%20%26amp%3B%20Kainmueller%2C%20D.%20%282024%29.%20%26lt%3Bi%26gt%3B%26lt%3Bb%26gt%3B%26lt%3Bspan%20style%3D%26quot%3Bfont-style%3Anormal%3B%26quot%3B%26gt%3BArctique%3A%20An%20artificial%20histopathological%20dataset%20unifying%20realism%20and%20controllability%20for%20uncertainty%20quantification%26lt%3B%5C%2Fspan%26gt%3B%26lt%3B%5C%2Fb%26gt%3B%26lt%3B%5C%2Fi%26gt%3B%20%28arXiv%3A2411.07097%29.%20arXiv.%20%26lt%3Ba%20class%3D%26%23039%3Bzp-DOIURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FarXiv.2411.07097%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FarXiv.2411.07097%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fdiv%26gt%3B%5Cn%26lt%3B%5C%2Fdiv%26gt%3B%22%2C%22data%22%3A%7B%22itemType%22%3A%22preprint%22%2C%22title%22%3A%22Arctique%3A%20An%20artificial%20histopathological%20dataset%20unifying%20realism%20and%20controllability%20for%20uncertainty%20quantification%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jannik%22%2C%22lastName%22%3A%22Franzen%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Claudia%22%2C%22lastName%22%3A%22Winklmayr%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Vanessa%20E.%22%2C%22lastName%22%3A%22Guarino%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Christoph%22%2C%22lastName%22%3A%22Karg%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Xiaoyan%22%2C%22lastName%22%3A%22Yu%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Nora%22%2C%22lastName%22%3A%22Koreuber%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jan%20P.%22%2C%22lastName%22%3A%22Albrecht%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Philip%22%2C%22lastName%22%3A%22Bischoff%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Dagmar%22%2C%22lastName%22%3A%22Kainmueller%22%7D%5D%2C%22abstractNote%22%3A%22Uncertainty%20Quantification%20%28UQ%29%20is%20crucial%20for%20reliable%20image%20segmentation.%20Yet%2C%20while%20the%20field%20sees%20continual%20development%20of%20novel%20methods%2C%20a%20lack%20of%20agreed-upon%20benchmarks%20limits%20their%20systematic%20comparison%20and%20evaluation%3A%20Current%20UQ%20methods%20are%20typically%20tested%20either%20on%20overly%20simplistic%20toy%20datasets%20or%20on%20complex%20real-world%20datasets%20that%20do%20not%20allow%20to%20discern%20true%20uncertainty.%20To%20unify%20both%20controllability%20and%20complexity%2C%20we%20introduce%20Arctique%2C%20a%20procedurally%20generated%20dataset%20modeled%20after%20histopathological%20colon%20images.%20We%20chose%20histopathological%20images%20for%20two%20reasons%3A%201%29%20their%20complexity%20in%20terms%20of%20intricate%20object%20structures%20and%20highly%20variable%20appearance%2C%20which%20yields%20challenging%20segmentation%20problems%2C%20and%202%29%20their%20broad%20prevalence%20for%20medical%20diagnosis%20and%20respective%20relevance%20of%20high-quality%20UQ.%20To%20generate%20Arctique%2C%20we%20established%20a%20Blender-based%20framework%20for%203D%20scene%20creation%20with%20intrinsic%20noise%20manipulation.%20Arctique%20contains%2050%2C000%20rendered%20images%20with%20precise%20masks%20as%20well%20as%20noisy%20label%20simulations.%20We%20show%20that%20by%20independently%20controlling%20the%20uncertainty%20in%20both%20images%20and%20labels%2C%20we%20can%20effectively%20study%20the%20performance%20of%20several%20commonly%20used%20UQ%20methods.%20Hence%2C%20Arctique%20serves%20as%20a%20critical%20resource%20for%20benchmarking%20and%20advancing%20UQ%20techniques%20and%20other%20methodologies%20in%20complex%2C%20multi-object%20environments%2C%20bridging%20the%20gap%20between%20realism%20and%20controllability.%20All%20code%20is%20publicly%20available%2C%20allowing%20re-creation%20and%20controlled%20manipulations%20of%20our%20shipped%20images%20as%20well%20as%20creation%20and%20rendering%20of%20new%20scenes.%22%2C%22genre%22%3A%22%22%2C%22repository%22%3A%22arXiv%22%2C%22archiveID%22%3A%22arXiv%3A2411.07097%22%2C%22date%22%3A%222024-11-11%22%2C%22DOI%22%3A%2210.48550%5C%2FarXiv.2411.07097%22%2C%22citationKey%22%3A%22%22%2C%22url%22%3A%22http%3A%5C%2F%5C%2Farxiv.org%5C%2Fabs%5C%2F2411.07097%22%2C%22language%22%3A%22%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222025-01-14T13%3A38%3A51Z%22%7D%7D%2C%7B%22key%22%3A%225WRPUKVR%22%2C%22library%22%3A%7B%22id%22%3A4725570%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Graham%20et%20al.%22%2C%22parsedDate%22%3A%222024-02-01%22%2C%22numChildren%22%3A2%7D%2C%22bib%22%3A%22%26lt%3Bdiv%20class%3D%26quot%3Bcsl-bib-body%26quot%3B%20style%3D%26quot%3Bline-height%3A%201.35%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%26quot%3B%26gt%3B%5Cn%20%20%26lt%3Bdiv%20class%3D%26quot%3Bcsl-entry%26quot%3B%26gt%3BGraham%2C%20S.%2C%20Vu%2C%20Q.%20D.%2C%20Jahanifar%2C%20M.%2C%20Weigert%2C%20M.%2C%20Schmidt%2C%20U.%2C%20Zhang%2C%20W.%2C%20Zhang%2C%20J.%2C%20Yang%2C%20S.%2C%20Xiang%2C%20J.%2C%20Wang%2C%20X.%2C%20Rumberger%2C%20J.%20L.%2C%20Baumann%2C%20E.%2C%20Hirsch%2C%20P.%2C%20Liu%2C%20L.%2C%20Hong%2C%20C.%2C%20Aviles-Rivero%2C%20A.%20I.%2C%20Jain%2C%20A.%2C%20Ahn%2C%20H.%2C%20Hong%2C%20Y.%2C%20%26%23x2026%3B%20Rajpoot%2C%20N.%20M.%20%282024%29.%20%26lt%3Bb%26gt%3BCoNIC%20Challenge%3A%20Pushing%20the%20frontiers%20of%20nuclear%20detection%2C%20segmentation%2C%20classification%20and%20counting%26lt%3B%5C%2Fb%26gt%3B.%20%26lt%3Bi%26gt%3BMedical%20Image%20Analysis%26lt%3B%5C%2Fi%26gt%3B%2C%20%26lt%3Bi%26gt%3B92%26lt%3B%5C%2Fi%26gt%3B%2C%20103047.%20%26lt%3Ba%20class%3D%26%23039%3Bzp-DOIURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1016%5C%2Fj.media.2023.103047%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1016%5C%2Fj.media.2023.103047%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fdiv%26gt%3B%5Cn%26lt%3B%5C%2Fdiv%26gt%3B%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22CoNIC%20Challenge%3A%20Pushing%20the%20frontiers%20of%20nuclear%20detection%2C%20segmentation%2C%20classification%20and%20counting%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Simon%22%2C%22lastName%22%3A%22Graham%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Quoc%20Dang%22%2C%22lastName%22%3A%22Vu%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Mostafa%22%2C%22lastName%22%3A%22Jahanifar%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Martin%22%2C%22lastName%22%3A%22Weigert%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Uwe%22%2C%22lastName%22%3A%22Schmidt%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Wenhua%22%2C%22lastName%22%3A%22Zhang%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jun%22%2C%22lastName%22%3A%22Zhang%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Sen%22%2C%22lastName%22%3A%22Yang%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jinxi%22%2C%22lastName%22%3A%22Xiang%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Xiyue%22%2C%22lastName%22%3A%22Wang%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Josef%20Lorenz%22%2C%22lastName%22%3A%22Rumberger%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Elias%22%2C%22lastName%22%3A%22Baumann%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Peter%22%2C%22lastName%22%3A%22Hirsch%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Lihao%22%2C%22lastName%22%3A%22Liu%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Chenyang%22%2C%22lastName%22%3A%22Hong%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Angelica%20I.%22%2C%22lastName%22%3A%22Aviles-Rivero%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ayushi%22%2C%22lastName%22%3A%22Jain%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Heeyoung%22%2C%22lastName%22%3A%22Ahn%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Yiyu%22%2C%22lastName%22%3A%22Hong%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Hussam%22%2C%22lastName%22%3A%22Azzuni%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Min%22%2C%22lastName%22%3A%22Xu%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Mohammad%22%2C%22lastName%22%3A%22Yaqub%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Marie-Claire%22%2C%22lastName%22%3A%22Blache%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Beno%5Cu00eet%22%2C%22lastName%22%3A%22Pi%5Cu00e9gu%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Bertrand%22%2C%22lastName%22%3A%22Vernay%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Tim%22%2C%22lastName%22%3A%22Scherr%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Moritz%22%2C%22lastName%22%3A%22B%5Cu00f6hland%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Katharina%22%2C%22lastName%22%3A%22L%5Cu00f6ffler%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jiachen%22%2C%22lastName%22%3A%22Li%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Weiqin%22%2C%22lastName%22%3A%22Ying%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Chixin%22%2C%22lastName%22%3A%22Wang%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22David%22%2C%22lastName%22%3A%22Snead%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Shan%20E.%20Ahmed%22%2C%22lastName%22%3A%22Raza%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Fayyaz%22%2C%22lastName%22%3A%22Minhas%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Nasir%20M.%22%2C%22lastName%22%3A%22Rajpoot%22%7D%5D%2C%22abstractNote%22%3A%22Nuclear%20detection%2C%20segmentation%20and%20morphometric%20profiling%20are%20essential%20in%20helping%20us%20further%20understand%20the%20relationship%20between%20histology%20and%20patient%20outcome.%20To%20drive%20innovation%20in%20this%20area%2C%20we%20setup%20a%20community-wide%20challenge%20using%20the%20largest%20available%20dataset%20of%20its%20kind%20to%20assess%20nuclear%20segmentation%20and%20cellular%20composition.%20Our%20challenge%2C%20named%20CoNIC%2C%20stimulated%20the%20development%20of%20reproducible%20algorithms%20for%20cellular%20recognition%20with%20real-time%20result%20inspection%20on%20public%20leaderboards.%20We%20conducted%20an%20extensive%20post-challenge%20analysis%20based%20on%20the%20top-performing%20models%20using%201%2C658%20whole-slide%20images%20of%20colon%20tissue.%20With%20around%20700%20million%20detected%20nuclei%20per%20model%2C%20associated%20features%20were%20used%20for%20dysplasia%20grading%20and%20survival%20analysis%2C%20where%20we%20demonstrated%20that%20the%20challenge%5Cu2019s%20improvement%20over%20the%20previous%20state-of-the-art%20led%20to%20significant%20boosts%20in%20downstream%20performance.%20Our%20findings%20also%20suggest%20that%20eosinophils%20and%20neutrophils%20play%20an%20important%20role%20in%20the%20tumour%20microevironment.%20We%20release%20challenge%20models%20and%20WSI-level%20results%20to%20foster%20the%20development%20of%20further%20methods%20for%20biomarker%20discovery.%22%2C%22date%22%3A%222024-02-01%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%2210.1016%5C%2Fj.media.2023.103047%22%2C%22ISSN%22%3A%221361-8415%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fwww.sciencedirect.com%5C%2Fscience%5C%2Farticle%5C%2Fpii%5C%2FS1361841523003079%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222024-02-28T16%3A04%3A38Z%22%7D%7D%2C%7B%22key%22%3A%223I45M3VX%22%2C%22library%22%3A%7B%22id%22%3A4725570%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Gutierrez%20Becker%20et%20al.%22%2C%22parsedDate%22%3A%222024-01-01%22%2C%22numChildren%22%3A2%7D%2C%22bib%22%3A%22%26lt%3Bdiv%20class%3D%26quot%3Bcsl-bib-body%26quot%3B%20style%3D%26quot%3Bline-height%3A%201.35%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%26quot%3B%26gt%3B%5Cn%20%20%26lt%3Bdiv%20class%3D%26quot%3Bcsl-entry%26quot%3B%26gt%3BGutierrez%20Becker%2C%20B.%2C%20Fraessle%2C%20S.%2C%20Yao%2C%20H.%2C%20L%26%23xFC%3Bscher%2C%20J.%2C%20Girycki%2C%20R.%2C%20Machura%2C%20B.%2C%20Go%26%23x15B%3Bli%26%23x144%3Bski%2C%20J.%2C%20Czornik%2C%20J.%2C%20Pitura%2C%20M.%2C%20Ar%26%23xFA%3Bs-Pous%2C%20J.%2C%20Fisher%2C%20E.%2C%20Bojic%2C%20D.%2C%20Richmond%2C%20D.%2C%20Bigorgne%2C%20A.%2C%20%26amp%3B%20Prunotto%2C%20M.%20%282024%29.%20%26lt%3Bb%26gt%3BP098%20The%20Endoscopic%20Severity%20Score%20Map%20%28ESSM%29%3A%20An%20Artificial%20Intelligence%20scoring%20system%20providing%20accurate%2C%20objective%20and%20localised%20measurements%20of%20endoscopic%20disease%20severity%20in%20ulcerative%20colitis%26lt%3B%5C%2Fb%26gt%3B.%20%26lt%3Bi%26gt%3BJournal%20of%20Crohn%26%23x2019%3Bs%20and%20Colitis%26lt%3B%5C%2Fi%26gt%3B%2C%20%26lt%3Bi%26gt%3B18%26lt%3B%5C%2Fi%26gt%3B%28Supplement_1%29%2C%20i377%26%23x2013%3Bi378.%20%26lt%3Ba%20class%3D%26%23039%3Bzp-ItemURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1093%5C%2Fecco-jcc%5C%2Fjjad212.0228%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1093%5C%2Fecco-jcc%5C%2Fjjad212.0228%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fdiv%26gt%3B%5Cn%26lt%3B%5C%2Fdiv%26gt%3B%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22P098%20The%20Endoscopic%20Severity%20Score%20Map%20%28ESSM%29%3A%20An%20Artificial%20Intelligence%20scoring%20system%20providing%20accurate%2C%20objective%20and%20localised%20measurements%20of%20endoscopic%20disease%20severity%20in%20ulcerative%20colitis%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22B%22%2C%22lastName%22%3A%22Gutierrez%20Becker%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22S%22%2C%22lastName%22%3A%22Fraessle%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22H%22%2C%22lastName%22%3A%22Yao%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22J%22%2C%22lastName%22%3A%22L%5Cu00fcscher%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22R%22%2C%22lastName%22%3A%22Girycki%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22B%22%2C%22lastName%22%3A%22Machura%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22J%22%2C%22lastName%22%3A%22Go%5Cu015bli%5Cu0144ski%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22J%22%2C%22lastName%22%3A%22Czornik%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22M%22%2C%22lastName%22%3A%22Pitura%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22J%22%2C%22lastName%22%3A%22Ar%5Cu00fas-Pous%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22E%22%2C%22lastName%22%3A%22Fisher%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22D%22%2C%22lastName%22%3A%22Bojic%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22D%22%2C%22lastName%22%3A%22Richmond%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22A%22%2C%22lastName%22%3A%22Bigorgne%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22M%22%2C%22lastName%22%3A%22Prunotto%22%7D%5D%2C%22abstractNote%22%3A%22Commonly%20used%20scoring%20schemes%20as%20the%20Mayo%20Endoscopic%20Subscore%20%28MES%29%20account%20for%20disease%20severity%20only%20at%20specific%20%28i.e.%2C%20the%20worst%29%20segments%20and%20do%20not%20capture%20disease%20extent.%20However%2C%20for%20an%20accurate%20assessment%20of%20disease%20severity%20in%20patients%20with%20ulcerative%20colitis%20%28UC%29%2C%20the%20measure%20of%20the%20precise%20extent%20of%20disease%20activity%20is%20necessary.%20Alternative%20systems%20that%20include%20disease%20extent%20have%20been%20proposed%20%28Balint%202018%29%2C%20but%20their%20implementation%20is%20prohibited%20by%20the%20time%20and%20cost%20constraints%20of%20comprehensively%20scoring%20each%20location%20along%20the%20entire%20colon.%20Here%2C%20we%20present%20the%20Endoscopic%20Severity%20Score%20Map%20%28ESSM%29%2C%20a%20scoring%20system%20based%20on%20Artificial%20Intelligence%2C%20capable%20of%20providing%20an%20assessment%20of%20disease%20severity%20and%20extent%20in%20UC%20in%20a%20fully%20automated%20manner.The%20ESSM%20consists%20of%203%20main%20elements%20%28Fig.%201%29%3A%201%29%20a%20quality%20algorithm%20which%20selects%20readable%20frames%20from%20a%20colonoscopy%20video%2C%202%29%20a%20scoring%20system%20which%20assigns%20an%20MES%20to%20each%20readable%20frame%20%28Gutierrez%20Becker%202020%29%20and%203%29%20a%20camera%20localisation%20algorithm%20that%20assigns%20each%20frame%20to%20an%20anatomical%20location%20within%20the%20colon%20%28Yao%202022%29.%20The%20ESSM%20was%20trained%20and%20tested%20using%204%2C306%20sigmoidoscopy%20videos%20from%20phase%20III%20Etrolizumab%20clinical%20trials%20%28Hickory%20NCT02100696%2C%20Laurel%20NCT02165215%2C%20Hibiscus%20I%20NCT02163759%2C%20Hibiscus%20II%20NCT02171429%20and%20Gardenia%20NCT02136069%29.We%20evaluate%20the%20performance%20of%20the%20ESSM%20by%20first%20assessing%20the%20agreement%20of%20scoring%20as%20compared%20to%20centrally%20read%20MES.%20The%20agreement%20between%20central%20reading%20and%20the%20ESSM%20at%20the%20colon%20section%20level%20was%20high%20%28quadratic-weighted%20kappa%20k%3D0.81%3B%20Tab.%201%29.%20This%20was%20comparable%20to%20the%20agreement%20between%20central%20and%20local%20reading%20%28k%3D0.84%29%2C%20suggesting%20that%20the%20ESSM%20shows%20levels%20of%20inter-rater%20variability%20comparable%20to%20experienced%20readers.%20Finally%2C%20we%20found%20correlations%20between%20the%20average%20ESSM%20at%20all%20anatomical%20locations%20and%20other%20disease%20activity%20markers%20to%20be%20moderate%20to%20high%3A%20faecal%20calprotectin%20rs%3D0.24%2C%20CRP%20rs%3D0.29%2C%20stool%20frequency%20rs%3D0.49%2C%20rectal%20bleeding%20rs%3D0.43%20and%20physician%20global%20assessment%20rs%3D0.47%20%28Tab.%201%29.Here%2C%20we%20introduced%20the%20ESSM%2C%20a%20fully-automated%20AI-based%20scoring%20system%20that%20enables%20accurate%2C%20objective%20and%20localised%20assessment%20of%20disease%20severity%20in%20UC.%20In%20brief%2C%20we%20show%20that%20the%20ESSM%20compares%20well%20with%20central%20reading%20%28at%20the%20colon%20section%20level%29%20and%20has%20clinical%20relevance%20when%20compared%20to%20other%20markers%20of%20disease%20activity.%20This%20tentatively%20suggests%20that%20the%20ESSM%20has%20the%20potential%20to%20augment%20the%20current%20way%20of%20assessing%20disease%20severity%2C%20both%20in%20clinical%20trials%20and%20everyday%20clinical%20practice.%22%2C%22date%22%3A%222024-01-01%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%2210.1093%5C%2Fecco-jcc%5C%2Fjjad212.0228%22%2C%22ISSN%22%3A%221873-9946%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1093%5C%2Fecco-jcc%5C%2Fjjad212.0228%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222025-02-03T08%3A58%3A04Z%22%7D%7D%2C%7B%22key%22%3A%22BKD9A8GF%22%2C%22library%22%3A%7B%22id%22%3A4725570%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Haller%20et%20al.%22%2C%22parsedDate%22%3A%222022-07-29%22%2C%22numChildren%22%3A3%7D%2C%22bib%22%3A%22%26lt%3Bdiv%20class%3D%26quot%3Bcsl-bib-body%26quot%3B%20style%3D%26quot%3Bline-height%3A%201.35%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%26quot%3B%26gt%3B%5Cn%20%20%26lt%3Bdiv%20class%3D%26quot%3Bcsl-entry%26quot%3B%26gt%3BHaller%2C%20S.%2C%20Feineis%2C%20L.%2C%20Hutschenreiter%2C%20L.%2C%20Bernard%2C%20F.%2C%20Rother%2C%20C.%2C%20Kainm%26%23xFC%3Bller%2C%20D.%2C%20Swoboda%2C%20P.%2C%20%26amp%3B%20Savchynskyy%2C%20B.%20%282022%29.%20%26lt%3Bi%26gt%3B%26lt%3Bb%26gt%3B%26lt%3Bspan%20style%3D%26quot%3Bfont-style%3Anormal%3B%26quot%3B%26gt%3BA%20Comparative%20Study%20of%20Graph%20Matching%20Algorithms%20in%20Computer%20Vision%26lt%3B%5C%2Fspan%26gt%3B%26lt%3B%5C%2Fb%26gt%3B%26lt%3B%5C%2Fi%26gt%3B%20%28arXiv%3A2207.00291%29.%20arXiv.%20%26lt%3Ba%20class%3D%26%23039%3Bzp-DOIURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FarXiv.2207.00291%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FarXiv.2207.00291%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fdiv%26gt%3B%5Cn%26lt%3B%5C%2Fdiv%26gt%3B%22%2C%22data%22%3A%7B%22itemType%22%3A%22preprint%22%2C%22title%22%3A%22A%20Comparative%20Study%20of%20Graph%20Matching%20Algorithms%20in%20Computer%20Vision%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Stefan%22%2C%22lastName%22%3A%22Haller%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Lorenz%22%2C%22lastName%22%3A%22Feineis%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Lisa%22%2C%22lastName%22%3A%22Hutschenreiter%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Florian%22%2C%22lastName%22%3A%22Bernard%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Carsten%22%2C%22lastName%22%3A%22Rother%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Dagmar%22%2C%22lastName%22%3A%22Kainm%5Cu00fcller%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Paul%22%2C%22lastName%22%3A%22Swoboda%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Bogdan%22%2C%22lastName%22%3A%22Savchynskyy%22%7D%5D%2C%22abstractNote%22%3A%22The%20graph%20matching%20optimization%20problem%20is%20an%20essential%20component%20for%20many%20tasks%20in%20computer%20vision%2C%20such%20as%20bringing%20two%20deformable%20objects%20in%20correspondence.%20Naturally%2C%20a%20wide%20range%20of%20applicable%20algorithms%20have%20been%20proposed%20in%20the%20last%20decades.%20Since%20a%20common%20standard%20benchmark%20has%20not%20been%20developed%2C%20their%20performance%20claims%20are%20often%20hard%20to%20verify%20as%20evaluation%20on%20differing%20problem%20instances%20and%20criteria%20make%20the%20results%20incomparable.%20To%20address%20these%20shortcomings%2C%20we%20present%20a%20comparative%20study%20of%20graph%20matching%20algorithms.%20We%20create%20a%20uniform%20benchmark%20where%20we%20collect%20and%20categorize%20a%20large%20set%20of%20existing%20and%20publicly%20available%20computer%20vision%20graph%20matching%20problems%20in%20a%20common%20format.%20At%20the%20same%20time%20we%20collect%20and%20categorize%20the%20most%20popular%20open-source%20implementations%20of%20graph%20matching%20algorithms.%20Their%20performance%20is%20evaluated%20in%20a%20way%20that%20is%20in%20line%20with%20the%20best%20practices%20for%20comparing%20optimization%20algorithms.%20The%20study%20is%20designed%20to%20be%20reproducible%20and%20extensible%20to%20serve%20as%20a%20valuable%20resource%20in%20the%20future.%20Our%20study%20provides%20three%20notable%20insights%3A%201.%29%20popular%20problem%20instances%20are%20exactly%20solvable%20in%20substantially%20less%20than%201%20second%20and%2C%20therefore%2C%20are%20insufficient%20for%20future%20empirical%20evaluations%3B%202.%29%20the%20most%20popular%20baseline%20methods%20are%20highly%20inferior%20to%20the%20best%20available%20methods%3B%203.%29%20despite%20the%20NP-hardness%20of%20the%20problem%2C%20instances%20coming%20from%20vision%20applications%20are%20often%20solvable%20in%20a%20few%20seconds%20even%20for%20graphs%20with%20more%20than%20500%20vertices.%22%2C%22genre%22%3A%22%22%2C%22repository%22%3A%22arXiv%22%2C%22archiveID%22%3A%22arXiv%3A2207.00291%22%2C%22date%22%3A%222022-07-29%22%2C%22DOI%22%3A%2210.48550%5C%2FarXiv.2207.00291%22%2C%22citationKey%22%3A%22%22%2C%22url%22%3A%22http%3A%5C%2F%5C%2Farxiv.org%5C%2Fabs%5C%2F2207.00291%22%2C%22language%22%3A%22%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222023-02-13T13%3A45%3A31Z%22%7D%7D%2C%7B%22key%22%3A%226BWRUHU6%22%2C%22library%22%3A%7B%22id%22%3A4725570%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Hirsch%20and%20Kainmueller%22%2C%22parsedDate%22%3A%222020-09-21%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%26lt%3Bdiv%20class%3D%26quot%3Bcsl-bib-body%26quot%3B%20style%3D%26quot%3Bline-height%3A%201.35%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%26quot%3B%26gt%3B%5Cn%20%20%26lt%3Bdiv%20class%3D%26quot%3Bcsl-entry%26quot%3B%26gt%3BHirsch%2C%20P.%2C%20%26amp%3B%20Kainmueller%2C%20D.%20%282020%29.%20%26lt%3Bb%26gt%3BAn%20Auxiliary%20Task%20for%20Learning%20Nuclei%20Segmentation%20in%203D%20Microscopy%20Images%26lt%3B%5C%2Fb%26gt%3B.%20%26lt%3Bi%26gt%3BProceedings%20of%20the%20Third%20Conference%20on%20Medical%20Imaging%20with%20Deep%20Learning%26lt%3B%5C%2Fi%26gt%3B%2C%20304%26%23x2013%3B321.%20%26lt%3Ba%20class%3D%26%23039%3Bzp-ItemURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fproceedings.mlr.press%5C%2Fv121%5C%2Fhirsch20a.html%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fproceedings.mlr.press%5C%2Fv121%5C%2Fhirsch20a.html%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fdiv%26gt%3B%5Cn%26lt%3B%5C%2Fdiv%26gt%3B%22%2C%22data%22%3A%7B%22itemType%22%3A%22conferencePaper%22%2C%22title%22%3A%22An%20Auxiliary%20Task%20for%20Learning%20Nuclei%20Segmentation%20in%203D%20Microscopy%20Images%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Peter%22%2C%22lastName%22%3A%22Hirsch%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Dagmar%22%2C%22lastName%22%3A%22Kainmueller%22%7D%5D%2C%22abstractNote%22%3A%22Segmentation%20of%20cell%20nuclei%20in%20microscopy%20images%20is%20a%20prevalent%20necessity%20in%20cell%20biology.%20Especially%20for%20three-dimensional%20datasets%2C%20manual%20segmentation%20is%20prohibitively%20time-consuming%2C%20motivating%20the%20need%20for%20automated%20methods.%20Learning-based%20methods%20trained%20on%20pixel-wise%20ground-truth%20segmentations%20have%20been%20shown%20to%20yield%20state-of-the-art%20results%20on%202d%20benchmark%20image%20data%20of%20nuclei%2C%20yet%20a%20respective%20benchmark%20is%20missing%20for%203d%20image%20data.%20In%20this%20work%2C%20we%20perform%20a%20comparative%20evaluation%20of%20nuclei%20segmentation%20algorithms%20on%20a%20database%20of%20manually%20segmented%203d%20light%20microscopy%20volumes.%20We%20propose%20a%20novel%20learning%20strategy%20that%20boosts%20segmentation%20accuracy%20by%20means%20of%20a%20simple%20auxiliary%20task%2C%20thereby%20robustly%20outperforming%20each%20of%20our%20baselines.%20Furthermore%2C%20we%20show%20that%20one%20of%20our%20baselines%2C%20the%20popular%20three-label%20model%2C%20when%20trained%20with%20our%20proposed%20auxiliary%20task%2C%20outperforms%20the%20recent%20%7B%5C%5Cem%20StarDist-3D%7D.%20As%20an%20additional%2C%20practical%20contribution%2C%20we%20benchmark%20nuclei%20segmentation%20against%20nuclei%20%7B%5C%5Cem%20detection%7D%2C%20i.e.%20the%20task%20of%20merely%20pinpointing%20individual%20nuclei%20without%20generating%20respective%20pixel-accurate%20segmentations.%20For%20learning%20nuclei%20detection%2C%20large%203d%20training%20datasets%20of%20manually%20annotated%20nuclei%20center%20points%20are%20available.%20However%2C%20the%20impact%20on%20detection%20accuracy%20caused%20by%20training%20on%20such%20sparse%20ground%20truth%20as%20opposed%20to%20dense%20pixel-wise%20ground%20truth%20has%20not%20yet%20been%20quantified.%20To%20this%20end%2C%20we%20compare%20nuclei%20detection%20accuracy%20yielded%20by%20training%20on%20dense%20vs.%20sparse%20ground%20truth.%20Our%20results%20suggest%20that%20training%20on%20sparse%20ground%20truth%20yields%20competitive%20nuclei%20detection%20rates.%22%2C%22date%22%3A%222020-09-21%22%2C%22proceedingsTitle%22%3A%22Proceedings%20of%20the%20Third%20Conference%20on%20Medical%20Imaging%20with%20Deep%20Learning%22%2C%22conferenceName%22%3A%22Medical%20Imaging%20with%20Deep%20Learning%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%22%22%2C%22ISBN%22%3A%22%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fproceedings.mlr.press%5C%2Fv121%5C%2Fhirsch20a.html%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222025-05-02T07%3A41%3A13Z%22%7D%7D%2C%7B%22key%22%3A%22EA3TUYHY%22%2C%22library%22%3A%7B%22id%22%3A4725570%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Hirsch%20et%20al.%22%2C%22parsedDate%22%3A%222022-08-24%22%2C%22numChildren%22%3A3%7D%2C%22bib%22%3A%22%26lt%3Bdiv%20class%3D%26quot%3Bcsl-bib-body%26quot%3B%20style%3D%26quot%3Bline-height%3A%201.35%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%26quot%3B%26gt%3B%5Cn%20%20%26lt%3Bdiv%20class%3D%26quot%3Bcsl-entry%26quot%3B%26gt%3BHirsch%2C%20P.%2C%20Malin-Mayor%2C%20C.%2C%20Santella%2C%20A.%2C%20Preibisch%2C%20S.%2C%20Kainmueller%2C%20D.%2C%20%26amp%3B%20Funke%2C%20J.%20%282022%29.%20%26lt%3Bi%26gt%3B%26lt%3Bb%26gt%3B%26lt%3Bspan%20style%3D%26quot%3Bfont-style%3Anormal%3B%26quot%3B%26gt%3BTracking%20by%20weakly-supervised%20learning%20and%20graph%20optimization%20for%20whole-embryo%20C.%20elegans%20lineages%26lt%3B%5C%2Fspan%26gt%3B%26lt%3B%5C%2Fb%26gt%3B%26lt%3B%5C%2Fi%26gt%3B%20%28arXiv%3A2208.11467%29.%20arXiv.%20%26lt%3Ba%20class%3D%26%23039%3Bzp-DOIURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FarXiv.2208.11467%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FarXiv.2208.11467%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fdiv%26gt%3B%5Cn%26lt%3B%5C%2Fdiv%26gt%3B%22%2C%22data%22%3A%7B%22itemType%22%3A%22preprint%22%2C%22title%22%3A%22Tracking%20by%20weakly-supervised%20learning%20and%20graph%20optimization%20for%20whole-embryo%20C.%20elegans%20lineages%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Peter%22%2C%22lastName%22%3A%22Hirsch%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Caroline%22%2C%22lastName%22%3A%22Malin-Mayor%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Anthony%22%2C%22lastName%22%3A%22Santella%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Stephan%22%2C%22lastName%22%3A%22Preibisch%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Dagmar%22%2C%22lastName%22%3A%22Kainmueller%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jan%22%2C%22lastName%22%3A%22Funke%22%7D%5D%2C%22abstractNote%22%3A%22Tracking%20all%20nuclei%20of%20an%20embryo%20in%20noisy%20and%20dense%20fluorescence%20microscopy%20data%20is%20a%20challenging%20task.%20We%20build%20upon%20a%20recent%20method%20for%20nuclei%20tracking%20that%20combines%20weakly-supervised%20learning%20from%20a%20small%20set%20of%20nuclei%20center%20point%20annotations%20with%20an%20integer%20linear%20program%20%28ILP%29%20for%20optimal%20cell%20lineage%20extraction.%20Our%20work%20specifically%20addresses%20the%20following%20challenging%20properties%20of%20C.%20elegans%20embryo%20recordings%3A%20%281%29%20Many%20cell%20divisions%20as%20compared%20to%20benchmark%20recordings%20of%20other%20organisms%2C%20and%20%282%29%20the%20presence%20of%20polar%20bodies%20that%20are%20easily%20mistaken%20as%20cell%20nuclei.%20To%20cope%20with%20%281%29%2C%20we%20devise%20and%20incorporate%20a%20learnt%20cell%20division%20detector.%20To%20cope%20with%20%282%29%2C%20we%20employ%20a%20learnt%20polar%20body%20detector.%20We%20further%20propose%20automated%20ILP%20weights%20tuning%20via%20a%20structured%20SVM%2C%20alleviating%20the%20need%20for%20tedious%20manual%20set-up%20of%20a%20respective%20grid%20search.%20Our%20method%20outperforms%20the%20previous%20leader%20of%20the%20cell%20tracking%20challenge%20on%20the%20Fluo-N3DH-CE%20embryo%20dataset.%20We%20report%20a%20further%20extensive%20quantitative%20evaluation%20on%20two%20more%20C.%20elegans%20datasets.%20We%20will%20make%20these%20datasets%20public%20to%20serve%20as%20an%20extended%20benchmark%20for%20future%20method%20development.%20Our%20results%20suggest%20considerable%20improvements%20yielded%20by%20our%20method%2C%20especially%20in%20terms%20of%20the%20correctness%20of%20division%20event%20detection%20and%20the%20number%20and%20length%20of%20fully%20correct%20track%20segments.%20Code%3A%20https%3A%5C%2F%5C%2Fgithub.com%5C%2Ffunkelab%5C%2Flinajea%22%2C%22genre%22%3A%22%22%2C%22repository%22%3A%22arXiv%22%2C%22archiveID%22%3A%22arXiv%3A2208.11467%22%2C%22date%22%3A%222022-08-24%22%2C%22DOI%22%3A%2210.48550%5C%2FarXiv.2208.11467%22%2C%22citationKey%22%3A%22%22%2C%22url%22%3A%22http%3A%5C%2F%5C%2Farxiv.org%5C%2Fabs%5C%2F2208.11467%22%2C%22language%22%3A%22%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222023-02-13T13%3A47%3A55Z%22%7D%7D%2C%7B%22key%22%3A%224JWTN2GS%22%2C%22library%22%3A%7B%22id%22%3A4725570%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Hutschenreiter%20et%20al.%22%2C%22parsedDate%22%3A%222021%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%26lt%3Bdiv%20class%3D%26quot%3Bcsl-bib-body%26quot%3B%20style%3D%26quot%3Bline-height%3A%201.35%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%26quot%3B%26gt%3B%5Cn%20%20%26lt%3Bdiv%20class%3D%26quot%3Bcsl-entry%26quot%3B%26gt%3BHutschenreiter%2C%20L.%2C%20Haller%2C%20S.%2C%20Feineis%2C%20L.%2C%20Rother%2C%20C.%2C%20Kainmuller%2C%20D.%2C%20%26amp%3B%20Savchynskyy%2C%20B.%20%282021%29.%20%26lt%3Bb%26gt%3BFusion%20Moves%20for%20Graph%20Matching%26lt%3B%5C%2Fb%26gt%3B.%20%26lt%3Bi%26gt%3B2021%20IEEE%5C%2FCVF%20International%20Conference%20on%20Computer%20Vision%20%28ICCV%29%26lt%3B%5C%2Fi%26gt%3B%2C%206250%26%23x2013%3B6259.%20%26lt%3Ba%20class%3D%26%23039%3Bzp-DOIURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1109%5C%2FICCV48922.2021.00621%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1109%5C%2FICCV48922.2021.00621%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fdiv%26gt%3B%5Cn%26lt%3B%5C%2Fdiv%26gt%3B%22%2C%22data%22%3A%7B%22itemType%22%3A%22conferencePaper%22%2C%22title%22%3A%22Fusion%20Moves%20for%20Graph%20Matching%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Lisa%22%2C%22lastName%22%3A%22Hutschenreiter%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Stefan%22%2C%22lastName%22%3A%22Haller%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Lorenz%22%2C%22lastName%22%3A%22Feineis%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Carsten%22%2C%22lastName%22%3A%22Rother%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Dagmar%22%2C%22lastName%22%3A%22Kainmuller%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Bogdan%22%2C%22lastName%22%3A%22Savchynskyy%22%7D%5D%2C%22abstractNote%22%3A%22We%20contribute%20to%20approximate%20algorithms%20for%20the%20quadratic%20assignment%20problem%20also%20known%20as%20graph%20matching.%20Inspired%20by%20the%20success%20of%20the%20fusion%20moves%20technique%20developed%20for%20multilabel%20discrete%20Markov%20random%20fields%2C%20we%20investigate%20its%20applicability%20to%20graph%20matching.%20In%20particular%2C%20we%20show%20how%20fusion%20moves%20can%20be%20efficiently%20combined%20with%20the%20dedicated%20state-of-the-art%20dual%20methods%20that%20have%20recently%20shown%20superior%20results%20in%20computer%20vision%20and%20bioimaging%20applications.%20As%20our%20empirical%20evaluation%20on%20a%20wide%20variety%20of%20graph%20matching%20datasets%20suggests%2C%20fusion%20moves%20significantly%20improve%20performance%20of%20these%20methods%20in%20terms%20of%20speed%20and%20quality%20of%20the%20obtained%20solutions.%20Our%20method%20sets%20a%20new%20state-of-the-art%20with%20a%20notable%20margin%20with%20respect%20to%20its%20competitors.%22%2C%22date%22%3A%2210%5C%2F2021%22%2C%22proceedingsTitle%22%3A%222021%20IEEE%5C%2FCVF%20International%20Conference%20on%20Computer%20Vision%20%28ICCV%29%22%2C%22conferenceName%22%3A%222021%20IEEE%5C%2FCVF%20International%20Conference%20on%20Computer%20Vision%20%28ICCV%29%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.1109%5C%2FICCV48922.2021.00621%22%2C%22ISBN%22%3A%22978-1-66542-812-5%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fieeexplore.ieee.org%5C%2Fdocument%5C%2F9710698%5C%2F%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222025-05-02T08%3A46%3A56Z%22%7D%7D%2C%7B%22key%22%3A%22IS9DHT3G%22%2C%22library%22%3A%7B%22id%22%3A4725570%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Karg%20et%20al.%22%2C%22parsedDate%22%3A%222025-03-10%22%2C%22numChildren%22%3A2%7D%2C%22bib%22%3A%22%26lt%3Bdiv%20class%3D%26quot%3Bcsl-bib-body%26quot%3B%20style%3D%26quot%3Bline-height%3A%201.35%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%26quot%3B%26gt%3B%5Cn%20%20%26lt%3Bdiv%20class%3D%26quot%3Bcsl-entry%26quot%3B%26gt%3BKarg%2C%20C.%2C%20Stricker%2C%20S.%2C%20Hutschenreiter%2C%20L.%2C%20Savchynskyy%2C%20B.%2C%20%26amp%3B%20Kainmueller%2C%20D.%20%282025%29.%20%26lt%3Bi%26gt%3B%26lt%3Bb%26gt%3B%26lt%3Bspan%20style%3D%26quot%3Bfont-style%3Anormal%3B%26quot%3B%26gt%3BFully%20Unsupervised%20Annotation%20of%20C.%20Elegans%26lt%3B%5C%2Fspan%26gt%3B%26lt%3B%5C%2Fb%26gt%3B%26lt%3B%5C%2Fi%26gt%3B%20%28arXiv%3A2503.07348%29.%20arXiv.%20%26lt%3Ba%20class%3D%26%23039%3Bzp-DOIURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FarXiv.2503.07348%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FarXiv.2503.07348%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fdiv%26gt%3B%5Cn%26lt%3B%5C%2Fdiv%26gt%3B%22%2C%22data%22%3A%7B%22itemType%22%3A%22preprint%22%2C%22title%22%3A%22Fully%20Unsupervised%20Annotation%20of%20C.%20Elegans%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Christoph%22%2C%22lastName%22%3A%22Karg%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Sebastian%22%2C%22lastName%22%3A%22Stricker%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Lisa%22%2C%22lastName%22%3A%22Hutschenreiter%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Bogdan%22%2C%22lastName%22%3A%22Savchynskyy%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Dagmar%22%2C%22lastName%22%3A%22Kainmueller%22%7D%5D%2C%22abstractNote%22%3A%22In%20this%20work%20we%20present%20a%20novel%20approach%20for%20unsupervised%20multi-graph%20matching%2C%20which%20applies%20to%20problems%20for%20which%20a%20Gaussian%20distribution%20of%20keypoint%20features%20can%20be%20assumed.%20We%20leverage%20cycle%20consistency%20as%20loss%20for%20self-supervised%20learning%2C%20and%20determine%20Gaussian%20parameters%20through%20Bayesian%20Optimization%2C%20yielding%20a%20highly%20efficient%20approach%20that%20scales%20to%20large%20datasets.%20Our%20fully%20unsupervised%20approach%20enables%20us%20to%20reach%20the%20accuracy%20of%20state-of-the-art%20supervised%20methodology%20for%20the%20use%20case%20of%20annotating%20cell%20nuclei%20in%203D%20microscopy%20images%20of%20the%20worm%20C.%20elegans.%20To%20this%20end%2C%20our%20approach%20yields%20the%20first%20unsupervised%20atlas%20of%20C.%20elegans%2C%20i.e.%20a%20model%20of%20the%20joint%20distribution%20of%20all%20of%20its%20cell%20nuclei%2C%20without%20the%20need%20for%20any%20ground%20truth%20cell%20annotation.%20This%20advancement%20enables%20highly%20efficient%20annotation%20of%20cell%20nuclei%20in%20large%20microscopy%20datasets%20of%20C.%20elegans.%20Beyond%20C.%20elegans%2C%20our%20approach%20offers%20fully%20unsupervised%20construction%20of%20cell-level%20atlases%20for%20any%20model%20organism%20with%20a%20stereotyped%20cell%20lineage%2C%20and%20thus%20bears%20the%20potential%20to%20catalyze%20respective%20comparative%20developmental%20studies%20in%20a%20range%20of%20further%20species.%22%2C%22genre%22%3A%22%22%2C%22repository%22%3A%22arXiv%22%2C%22archiveID%22%3A%22arXiv%3A2503.07348%22%2C%22date%22%3A%222025-03-10%22%2C%22DOI%22%3A%2210.48550%5C%2FarXiv.2503.07348%22%2C%22citationKey%22%3A%22%22%2C%22url%22%3A%22http%3A%5C%2F%5C%2Farxiv.org%5C%2Fabs%5C%2F2503.07348%22%2C%22language%22%3A%22%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222025-03-12T09%3A44%3A19Z%22%7D%7D%2C%7B%22key%22%3A%22CAUD4GCC%22%2C%22library%22%3A%7B%22id%22%3A4725570%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Maier-Hein%20et%20al.%22%2C%22parsedDate%22%3A%222024-02%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%26lt%3Bdiv%20class%3D%26quot%3Bcsl-bib-body%26quot%3B%20style%3D%26quot%3Bline-height%3A%201.35%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%26quot%3B%26gt%3B%5Cn%20%20%26lt%3Bdiv%20class%3D%26quot%3Bcsl-entry%26quot%3B%26gt%3BMaier-Hein%2C%20L.%2C%20Reinke%2C%20A.%2C%20Godau%2C%20P.%2C%20Tizabi%2C%20M.%20D.%2C%20Buettner%2C%20F.%2C%20Christodoulou%2C%20E.%2C%20Glocker%2C%20B.%2C%20Isensee%2C%20F.%2C%20Kleesiek%2C%20J.%2C%20Kozubek%2C%20M.%2C%20Reyes%2C%20M.%2C%20Riegler%2C%20M.%20A.%2C%20Wiesenfarth%2C%20M.%2C%20Kavur%2C%20A.%20E.%2C%20Sudre%2C%20C.%20H.%2C%20Baumgartner%2C%20M.%2C%20Eisenmann%2C%20M.%2C%20Heckmann-N%26%23xF6%3Btzel%2C%20D.%2C%20R%26%23xE4%3Bdsch%2C%20T.%2C%20%26%23x2026%3B%20J%26%23xE4%3Bger%2C%20P.%20F.%20%282024%29.%20%26lt%3Bb%26gt%3BMetrics%20reloaded%3A%20recommendations%20for%20image%20analysis%20validation%26lt%3B%5C%2Fb%26gt%3B.%20%26lt%3Bi%26gt%3BNature%20Methods%26lt%3B%5C%2Fi%26gt%3B%2C%20%26lt%3Bi%26gt%3B21%26lt%3B%5C%2Fi%26gt%3B%282%29%2C%20195%26%23x2013%3B212.%20%26lt%3Ba%20class%3D%26%23039%3Bzp-DOIURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1038%5C%2Fs41592-023-02151-z%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1038%5C%2Fs41592-023-02151-z%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fdiv%26gt%3B%5Cn%26lt%3B%5C%2Fdiv%26gt%3B%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Metrics%20reloaded%3A%20recommendations%20for%20image%20analysis%20validation%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Lena%22%2C%22lastName%22%3A%22Maier-Hein%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Annika%22%2C%22lastName%22%3A%22Reinke%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Patrick%22%2C%22lastName%22%3A%22Godau%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Minu%20D.%22%2C%22lastName%22%3A%22Tizabi%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Florian%22%2C%22lastName%22%3A%22Buettner%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Evangelia%22%2C%22lastName%22%3A%22Christodoulou%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ben%22%2C%22lastName%22%3A%22Glocker%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Fabian%22%2C%22lastName%22%3A%22Isensee%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jens%22%2C%22lastName%22%3A%22Kleesiek%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Michal%22%2C%22lastName%22%3A%22Kozubek%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Mauricio%22%2C%22lastName%22%3A%22Reyes%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Michael%20A.%22%2C%22lastName%22%3A%22Riegler%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Manuel%22%2C%22lastName%22%3A%22Wiesenfarth%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22A.%20Emre%22%2C%22lastName%22%3A%22Kavur%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Carole%20H.%22%2C%22lastName%22%3A%22Sudre%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Michael%22%2C%22lastName%22%3A%22Baumgartner%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Matthias%22%2C%22lastName%22%3A%22Eisenmann%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Doreen%22%2C%22lastName%22%3A%22Heckmann-N%5Cu00f6tzel%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Tim%22%2C%22lastName%22%3A%22R%5Cu00e4dsch%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Laura%22%2C%22lastName%22%3A%22Acion%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Michela%22%2C%22lastName%22%3A%22Antonelli%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Tal%22%2C%22lastName%22%3A%22Arbel%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Spyridon%22%2C%22lastName%22%3A%22Bakas%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Arriel%22%2C%22lastName%22%3A%22Benis%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Matthew%20B.%22%2C%22lastName%22%3A%22Blaschko%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22M.%20Jorge%22%2C%22lastName%22%3A%22Cardoso%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Veronika%22%2C%22lastName%22%3A%22Cheplygina%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Beth%20A.%22%2C%22lastName%22%3A%22Cimini%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Gary%20S.%22%2C%22lastName%22%3A%22Collins%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Keyvan%22%2C%22lastName%22%3A%22Farahani%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Luciana%22%2C%22lastName%22%3A%22Ferrer%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Adrian%22%2C%22lastName%22%3A%22Galdran%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Bram%22%2C%22lastName%22%3A%22van%20Ginneken%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Robert%22%2C%22lastName%22%3A%22Haase%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Daniel%20A.%22%2C%22lastName%22%3A%22Hashimoto%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Michael%20M.%22%2C%22lastName%22%3A%22Hoffman%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Merel%22%2C%22lastName%22%3A%22Huisman%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Pierre%22%2C%22lastName%22%3A%22Jannin%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Charles%20E.%22%2C%22lastName%22%3A%22Kahn%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Dagmar%22%2C%22lastName%22%3A%22Kainmueller%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Bernhard%22%2C%22lastName%22%3A%22Kainz%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Alexandros%22%2C%22lastName%22%3A%22Karargyris%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Alan%22%2C%22lastName%22%3A%22Karthikesalingam%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Florian%22%2C%22lastName%22%3A%22Kofler%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Annette%22%2C%22lastName%22%3A%22Kopp-Schneider%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Anna%22%2C%22lastName%22%3A%22Kreshuk%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Tahsin%22%2C%22lastName%22%3A%22Kurc%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Bennett%20A.%22%2C%22lastName%22%3A%22Landman%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Geert%22%2C%22lastName%22%3A%22Litjens%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Amin%22%2C%22lastName%22%3A%22Madani%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Klaus%22%2C%22lastName%22%3A%22Maier-Hein%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Anne%20L.%22%2C%22lastName%22%3A%22Martel%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Peter%22%2C%22lastName%22%3A%22Mattson%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Erik%22%2C%22lastName%22%3A%22Meijering%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Bjoern%22%2C%22lastName%22%3A%22Menze%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Karel%20G.%20M.%22%2C%22lastName%22%3A%22Moons%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Henning%22%2C%22lastName%22%3A%22M%5Cu00fcller%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Brennan%22%2C%22lastName%22%3A%22Nichyporuk%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Felix%22%2C%22lastName%22%3A%22Nickel%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jens%22%2C%22lastName%22%3A%22Petersen%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Nasir%22%2C%22lastName%22%3A%22Rajpoot%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Nicola%22%2C%22lastName%22%3A%22Rieke%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Julio%22%2C%22lastName%22%3A%22Saez-Rodriguez%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Clara%20I.%22%2C%22lastName%22%3A%22S%5Cu00e1nchez%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Shravya%22%2C%22lastName%22%3A%22Shetty%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Maarten%22%2C%22lastName%22%3A%22van%20Smeden%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ronald%20M.%22%2C%22lastName%22%3A%22Summers%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Abdel%20A.%22%2C%22lastName%22%3A%22Taha%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Aleksei%22%2C%22lastName%22%3A%22Tiulpin%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Sotirios%20A.%22%2C%22lastName%22%3A%22Tsaftaris%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ben%22%2C%22lastName%22%3A%22Van%20Calster%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ga%5Cu00ebl%22%2C%22lastName%22%3A%22Varoquaux%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Paul%20F.%22%2C%22lastName%22%3A%22J%5Cu00e4ger%22%7D%5D%2C%22abstractNote%22%3A%22Increasing%20evidence%20shows%20that%20flaws%20in%20machine%20learning%20%28ML%29%20algorithm%20validation%20are%20an%20underestimated%20global%20problem.%20In%20biomedical%20image%20analysis%2C%20chosen%20performance%20metrics%20often%20do%20not%20reflect%20the%20domain%20interest%2C%20and%20thus%20fail%20to%20adequately%20measure%20scientific%20progress%20and%20hinder%20translation%20of%20ML%20techniques%20into%20practice.%20To%20overcome%20this%2C%20we%20created%20Metrics%20Reloaded%2C%20a%20comprehensive%20framework%20guiding%20researchers%20in%20the%20problem-aware%20selection%20of%20metrics.%20Developed%20by%20a%20large%20international%20consortium%20in%20a%20multistage%20Delphi%20process%2C%20it%20is%20based%20on%20the%20novel%20concept%20of%20a%20problem%20fingerprint%5Cu2014a%20structured%20representation%20of%20the%20given%20problem%20that%20captures%20all%20aspects%20that%20are%20relevant%20for%20metric%20selection%2C%20from%20the%20domain%20interest%20to%20the%20properties%20of%20the%20target%20structure%28s%29%2C%20dataset%20and%20algorithm%20output.%20On%20the%20basis%20of%20the%20problem%20fingerprint%2C%20users%20are%20guided%20through%20the%20process%20of%20choosing%20and%20applying%20appropriate%20validation%20metrics%20while%20being%20made%20aware%20of%20potential%20pitfalls.%20Metrics%20Reloaded%20targets%20image%20analysis%20problems%20that%20can%20be%20interpreted%20as%20classification%20tasks%20at%20image%2C%20object%20or%20pixel%20level%2C%20namely%20image-level%20classification%2C%20object%20detection%2C%20semantic%20segmentation%20and%20instance%20segmentation%20tasks.%20To%20improve%20the%20user%20experience%2C%20we%20implemented%20the%20framework%20in%20the%20Metrics%20Reloaded%20online%20tool.%20Following%20the%20convergence%20of%20ML%20methodology%20across%20application%20domains%2C%20Metrics%20Reloaded%20fosters%20the%20convergence%20of%20validation%20methodology.%20Its%20applicability%20is%20demonstrated%20for%20various%20biomedical%20use%20cases.%22%2C%22date%22%3A%222024-02%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.1038%5C%2Fs41592-023-02151-z%22%2C%22ISSN%22%3A%221548-7105%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fwww.nature.com%5C%2Farticles%5C%2Fs41592-023-02151-z%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222025-02-03T08%3A34%3A55Z%22%7D%7D%2C%7B%22key%22%3A%22NTXYK4CK%22%2C%22library%22%3A%7B%22id%22%3A4725570%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Mais%20et%20al.%22%2C%22parsedDate%22%3A%222020%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%26lt%3Bdiv%20class%3D%26quot%3Bcsl-bib-body%26quot%3B%20style%3D%26quot%3Bline-height%3A%201.35%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%26quot%3B%26gt%3B%5Cn%20%20%26lt%3Bdiv%20class%3D%26quot%3Bcsl-entry%26quot%3B%26gt%3BMais%2C%20L.%2C%20Hirsch%2C%20P.%2C%20%26amp%3B%20Kainmueller%2C%20D.%20%282020%29.%20%26lt%3Bb%26gt%3BPatchPerPix%20for%20Instance%20Segmentation%26lt%3B%5C%2Fb%26gt%3B.%20In%20A.%20Vedaldi%2C%20H.%20Bischof%2C%20T.%20Brox%2C%20%26amp%3B%20J.-M.%20Frahm%20%28Eds.%29%2C%20%26lt%3Bi%26gt%3BComputer%20Vision%20%26%23x2013%3B%20ECCV%202020%26lt%3B%5C%2Fi%26gt%3B%20%28pp.%20288%26%23x2013%3B304%29.%20Springer%20International%20Publishing.%20%26lt%3Ba%20class%3D%26%23039%3Bzp-DOIURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1007%5C%2F978-3-030-58595-2_18%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1007%5C%2F978-3-030-58595-2_18%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fdiv%26gt%3B%5Cn%26lt%3B%5C%2Fdiv%26gt%3B%22%2C%22data%22%3A%7B%22itemType%22%3A%22conferencePaper%22%2C%22title%22%3A%22PatchPerPix%20for%20Instance%20Segmentation%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Lisa%22%2C%22lastName%22%3A%22Mais%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Peter%22%2C%22lastName%22%3A%22Hirsch%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Dagmar%22%2C%22lastName%22%3A%22Kainmueller%22%7D%2C%7B%22creatorType%22%3A%22editor%22%2C%22firstName%22%3A%22Andrea%22%2C%22lastName%22%3A%22Vedaldi%22%7D%2C%7B%22creatorType%22%3A%22editor%22%2C%22firstName%22%3A%22Horst%22%2C%22lastName%22%3A%22Bischof%22%7D%2C%7B%22creatorType%22%3A%22editor%22%2C%22firstName%22%3A%22Thomas%22%2C%22lastName%22%3A%22Brox%22%7D%2C%7B%22creatorType%22%3A%22editor%22%2C%22firstName%22%3A%22Jan-Michael%22%2C%22lastName%22%3A%22Frahm%22%7D%5D%2C%22abstractNote%22%3A%22We%20present%20a%20novel%20method%20for%20proposal%20free%20instance%20segmentation%20that%20can%20handle%20sophisticated%20object%20shapes%20which%20span%20large%20parts%20of%20an%20image%20and%20form%20dense%20object%20clusters%20with%20crossovers.%20Our%20method%20is%20based%20on%20predicting%20dense%20local%20shape%20descriptors%2C%20which%20we%20assemble%20to%20form%20instances.%20All%20instances%20are%20assembled%20simultaneously%20in%20one%20go.%20To%20our%20knowledge%2C%20our%20method%20is%20the%20first%20non-iterative%20method%20that%20yields%20instances%20that%20are%20composed%20of%20learnt%20shape%20patches.%20We%20evaluate%20our%20method%20on%20a%20diverse%20range%20of%20data%20domains%2C%20where%20it%20defines%20the%20new%20state%20of%20the%20art%20on%20four%20benchmarks%2C%20namely%20the%20ISBI%202012%20EM%20segmentation%20benchmark%2C%20the%20BBBC010%20C.%20elegans%20dataset%2C%20and%202d%20as%20well%20as%203d%20fluorescence%20microscopy%20data%20of%20cell%20nuclei.%20We%20show%20furthermore%20that%20our%20method%20also%20applies%20to%203d%20light%20microscopy%20data%20of%20Drosophila%20neurons%2C%20which%20exhibit%20extreme%20cases%20of%20complex%20shape%20clusters.%22%2C%22date%22%3A%222020%22%2C%22proceedingsTitle%22%3A%22Computer%20Vision%20%5Cu2013%20ECCV%202020%22%2C%22conferenceName%22%3A%22%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.1007%5C%2F978-3-030-58595-2_18%22%2C%22ISBN%22%3A%22978-3-030-58595-2%22%2C%22url%22%3A%22%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222023-03-02T13%3A57%3A33Z%22%7D%7D%2C%7B%22key%22%3A%22YEZZA9X5%22%2C%22library%22%3A%7B%22id%22%3A4725570%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Mais%20et%20al.%22%2C%22parsedDate%22%3A%222024-06%22%2C%22numChildren%22%3A2%7D%2C%22bib%22%3A%22%26lt%3Bdiv%20class%3D%26quot%3Bcsl-bib-body%26quot%3B%20style%3D%26quot%3Bline-height%3A%201.35%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%26quot%3B%26gt%3B%5Cn%20%20%26lt%3Bdiv%20class%3D%26quot%3Bcsl-entry%26quot%3B%26gt%3BMais%2C%20L.%2C%20Hirsch%2C%20P.%2C%20Managan%2C%20C.%2C%20Kandarpa%2C%20R.%2C%20Rumberger%2C%20J.%20L.%2C%20Reinke%2C%20A.%2C%20Maier-Hein%2C%20L.%2C%20Ihrke%2C%20G.%2C%20%26amp%3B%20Kainmueller%2C%20D.%20%282024%29.%20%26lt%3Bb%26gt%3BFISBe%3A%20A%20Real-World%20Benchmark%20Dataset%20for%20Instance%20Segmentation%20of%20Long-Range%20thin%20Filamentous%20Structures%26lt%3B%5C%2Fb%26gt%3B.%20%26lt%3Bi%26gt%3B2024%20IEEE%5C%2FCVF%20Conference%20on%20Computer%20Vision%20and%20Pattern%20Recognition%20%28CVPR%29%26lt%3B%5C%2Fi%26gt%3B%2C%2022249%26%23x2013%3B22259.%20%26lt%3Ba%20class%3D%26%23039%3Bzp-DOIURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1109%5C%2FCVPR52733.2024.02100%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1109%5C%2FCVPR52733.2024.02100%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fdiv%26gt%3B%5Cn%26lt%3B%5C%2Fdiv%26gt%3B%22%2C%22data%22%3A%7B%22itemType%22%3A%22conferencePaper%22%2C%22title%22%3A%22FISBe%3A%20A%20Real-World%20Benchmark%20Dataset%20for%20Instance%20Segmentation%20of%20Long-Range%20thin%20Filamentous%20Structures%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Lisa%22%2C%22lastName%22%3A%22Mais%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Peter%22%2C%22lastName%22%3A%22Hirsch%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Claire%22%2C%22lastName%22%3A%22Managan%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ramya%22%2C%22lastName%22%3A%22Kandarpa%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Josef%20Lorenz%22%2C%22lastName%22%3A%22Rumberger%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Annika%22%2C%22lastName%22%3A%22Reinke%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Lena%22%2C%22lastName%22%3A%22Maier-Hein%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Gudrun%22%2C%22lastName%22%3A%22Ihrke%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Dagmar%22%2C%22lastName%22%3A%22Kainmueller%22%7D%5D%2C%22abstractNote%22%3A%22Instance%20segmentation%20of%20neurons%20in%20volumetric%20light%20microscopy%20images%20of%20nervous%20systems%20enables%20ground-breaking%20research%20in%20neuroscience%20by%20facilitating%20joint%20functional%20and%20morphological%20analyses%20of%20neural%20circuits%20at%20cel-lular%20resolution.%20Yet%20said%20multi-neuron%20light%20microscopy%20data%20exhibits%20extremely%20challenging%20properties%20for%20the%20task%20of%20instance%20segmentation%3A%20Individual%20neurons%20have%20long-ranging%2C%20thin%20filamentous%20and%20widely%20branching%20morpholo-gies%2C%20multiple%20neurons%20are%20tightly%20inter-weaved%2C%20and%20par-tial%20volume%20effects%2C%20uneven%20illumination%20and%20noise%20inherent%20to%20light%20microscopy%20severely%20impede%20local%20disentan-gling%20as%20well%20as%20long-range%20tracing%20of%20individual%20neurons.%20These%20properties%20reflect%20a%20current%20key%20challenge%20in%20machine%20learning%20research%2C%20namely%20to%20effectively%20capture%20long-range%20dependencies%20in%20the%20data.%20While%20respective%20method-ological%20research%20is%20buzzing%2C%20to%20date%20methods%20are%20typically%20benchmarked%20on%20synthetic%20datasets.%20To%20address%20this%20gap%2C%20we%20release%20the%20FlyLight%20Instance%20Segmentation%20Benchmark%20%28FISBe%29%20dataset%2C%20the%20first%20publicly%20available%20multi-neuron%20light%20microscopy%20dataset%20with%20pixel-wise%20annotations.%20In%20addition%2C%20we%20define%20a%20set%20of%20instance%20segmentation%20metrics%20for%20benchmarking%20that%20we%20designed%20to%20be%20meaningful%20with%20regard%20to%20downstream%20analyses.%20Lastly%2C%20we%20provide%20three%20baselines%20to%20kick%20off%20a%20competition%20that%20we%20envision%20to%20both%20advance%20the%20field%20of%20machine%20learning%20regarding%20methodology%20for%20capturing%20long-range%20data%20dependencies%2C%20and%20facilitate%20scientific%20discovery%20in%20basic%20neuroscience.%20Project%20page%3A%20https%3A%5C%2F%5C%2Fkainmueller-lab.github.io%5C%2Fjisbe.%22%2C%22date%22%3A%222024-06%22%2C%22proceedingsTitle%22%3A%222024%20IEEE%5C%2FCVF%20Conference%20on%20Computer%20Vision%20and%20Pattern%20Recognition%20%28CVPR%29%22%2C%22conferenceName%22%3A%222024%20IEEE%5C%2FCVF%20Conference%20on%20Computer%20Vision%20and%20Pattern%20Recognition%20%28CVPR%29%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%2210.1109%5C%2FCVPR52733.2024.02100%22%2C%22ISBN%22%3A%22%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fieeexplore.ieee.org%5C%2Fdocument%5C%2F10656899%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222025-02-03T12%3A13%3A28Z%22%7D%7D%2C%7B%22key%22%3A%22ZEE5L2E9%22%2C%22library%22%3A%7B%22id%22%3A4725570%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Mais%20et%20al.%22%2C%22parsedDate%22%3A%222021-07-26%22%2C%22numChildren%22%3A2%7D%2C%22bib%22%3A%22%26lt%3Bdiv%20class%3D%26quot%3Bcsl-bib-body%26quot%3B%20style%3D%26quot%3Bline-height%3A%201.35%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%26quot%3B%26gt%3B%5Cn%20%20%26lt%3Bdiv%20class%3D%26quot%3Bcsl-entry%26quot%3B%26gt%3BMais%2C%20L.%2C%20Hirsch%2C%20P.%2C%20Managan%2C%20C.%2C%20Wang%2C%20K.%2C%20Rokicki%2C%20K.%2C%20Svirskas%2C%20R.%20R.%2C%20Dickson%2C%20B.%20J.%2C%20Korff%2C%20W.%2C%20Rubin%2C%20G.%20M.%2C%20Ihrke%2C%20G.%2C%20Meissner%2C%20G.%20W.%2C%20%26amp%3B%20Kainmueller%2C%20D.%20%282021%29.%20%26lt%3Bi%26gt%3B%26lt%3Bb%26gt%3B%26lt%3Bspan%20style%3D%26quot%3Bfont-style%3Anormal%3B%26quot%3B%26gt%3BPatchPerPixMatch%20for%20Automated%203d%20Search%20of%20Neuronal%20Morphologies%20in%20Light%20Microscopy%26lt%3B%5C%2Fspan%26gt%3B%26lt%3B%5C%2Fb%26gt%3B%26lt%3B%5C%2Fi%26gt%3B.%20bioRxiv.%20%26lt%3Ba%20class%3D%26%23039%3Bzp-DOIURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1101%5C%2F2021.07.23.453511%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1101%5C%2F2021.07.23.453511%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fdiv%26gt%3B%5Cn%26lt%3B%5C%2Fdiv%26gt%3B%22%2C%22data%22%3A%7B%22itemType%22%3A%22preprint%22%2C%22title%22%3A%22PatchPerPixMatch%20for%20Automated%203d%20Search%20of%20Neuronal%20Morphologies%20in%20Light%20Microscopy%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Lisa%22%2C%22lastName%22%3A%22Mais%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Peter%22%2C%22lastName%22%3A%22Hirsch%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Claire%22%2C%22lastName%22%3A%22Managan%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Kaiyu%22%2C%22lastName%22%3A%22Wang%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Konrad%22%2C%22lastName%22%3A%22Rokicki%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Robert%20R.%22%2C%22lastName%22%3A%22Svirskas%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Barry%20J.%22%2C%22lastName%22%3A%22Dickson%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Wyatt%22%2C%22lastName%22%3A%22Korff%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Gerald%20M.%22%2C%22lastName%22%3A%22Rubin%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Gudrun%22%2C%22lastName%22%3A%22Ihrke%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Geoffrey%20W.%22%2C%22lastName%22%3A%22Meissner%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Dagmar%22%2C%22lastName%22%3A%22Kainmueller%22%7D%5D%2C%22abstractNote%22%3A%22Studies%20of%20individual%20neurons%20in%20the%20Drosophila%20nervous%20system%20are%20facilitated%20by%20transgenic%20lines%20that%20sparsely%20and%20repeatably%20label%20respective%20neurons%20of%20interest.%20Sparsity%20can%20be%20enhanced%20by%20means%20of%20intersectional%20approaches%20like%20the%20split-GAL4%20system%2C%20which%20labels%20the%20positive%20intersection%20of%20the%20expression%20patterns%20of%20two%20%28denser%29%20GAL4%20lines.%20To%20this%20end%2C%20two%20GAL4%20lines%20have%20to%20be%20identified%20as%20labelling%20a%20neuron%20of%20interest.%20Current%20approaches%20to%20tackling%20this%20task%20include%20visual%20inspection%2C%20as%20well%20as%20automated%20search%20in%202d%20projection%20images%2C%20of%20single%20cell%20multi-color%20flip-out%20%28MCFO%29%20acquisitions%20of%20GAL4%20expression%20patterns.%20There%20is%20to%20date%20no%20automated%20method%20available%20that%20performs%20full%203d%20search%20in%20MCFO%20imagery%20of%20GAL4%20lines%2C%20nor%20one%20that%20leverages%20automated%20reconstructions%20of%20the%20labelled%20neuron%20morphologies.%20To%20close%20this%20gap%2C%20we%20propose%20PatchPerPixMatch%2C%20a%20fully%20automated%20approach%20for%20finding%20a%20given%20neuron%20morphology%20in%20MCFO%20acquisitions%20of%20Gen1%20GAL4%20lines.%20PatchPerPixMatch%20performs%20automated%20instance%20segmentation%20of%20MCFO%20acquisitions%2C%20and%20subsequently%20searches%20for%20a%20target%20neuron%20morphology%20by%20minimizing%20an%20objective%20that%20aims%20at%20covering%20the%20target%20with%20a%20set%20of%20well-fitting%20segmentation%20fragments.%20Patch-PerPixMatch%20is%20computationally%20efficient%20albeit%20being%20full%203d%2C%20while%20also%20highly%20robust%20to%20inaccuracies%20in%20the%20automated%20neuron%20instance%20segmentation.%20We%20are%20releasing%20PatchPerPixMatch%20search%20results%20for%20~30%2C000%20neuron%20morphologies%20from%20the%20Drosophila%20hemibrain%20in%20~20%2C000%20MCFO%20acquisitions%20of%20~3%2C500%20Gen1%20GAL4%20lines.%5CnCode%20https%3A%5C%2F%5C%2Fgithub.com%5C%2FKainmueller-Lab%5C%2FPatchPerPixMatch%5CnResults%20https%3A%5C%2F%5C%2Fpppm.janelia.org%22%2C%22genre%22%3A%22%22%2C%22repository%22%3A%22bioRxiv%22%2C%22archiveID%22%3A%22%22%2C%22date%22%3A%222021-07-26%22%2C%22DOI%22%3A%2210.1101%5C%2F2021.07.23.453511%22%2C%22citationKey%22%3A%22%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fwww.biorxiv.org%5C%2Fcontent%5C%2F10.1101%5C%2F2021.07.23.453511v1%22%2C%22language%22%3A%22en%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222025-05-02T12%3A27%3A40Z%22%7D%7D%2C%7B%22key%22%3A%225F8MNE2N%22%2C%22library%22%3A%7B%22id%22%3A4725570%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Malin-Mayor%20et%20al.%22%2C%22parsedDate%22%3A%222023-01%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%26lt%3Bdiv%20class%3D%26quot%3Bcsl-bib-body%26quot%3B%20style%3D%26quot%3Bline-height%3A%201.35%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%26quot%3B%26gt%3B%5Cn%20%20%26lt%3Bdiv%20class%3D%26quot%3Bcsl-entry%26quot%3B%26gt%3BMalin-Mayor%2C%20C.%2C%20Hirsch%2C%20P.%2C%20Guignard%2C%20L.%2C%20McDole%2C%20K.%2C%20Wan%2C%20Y.%2C%20Lemon%2C%20W.%20C.%2C%20Kainmueller%2C%20D.%2C%20Keller%2C%20P.%20J.%2C%20Preibisch%2C%20S.%2C%20%26amp%3B%20Funke%2C%20J.%20%282023%29.%20%26lt%3Bb%26gt%3BAutomated%20reconstruction%20of%20whole-embryo%20cell%20lineages%20by%20learning%20from%20sparse%20annotations%26lt%3B%5C%2Fb%26gt%3B.%20%26lt%3Bi%26gt%3BNature%20Biotechnology%26lt%3B%5C%2Fi%26gt%3B%2C%20%26lt%3Bi%26gt%3B41%26lt%3B%5C%2Fi%26gt%3B%281%29%2C%2044%26%23x2013%3B49.%20%26lt%3Ba%20class%3D%26%23039%3Bzp-DOIURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1038%5C%2Fs41587-022-01427-7%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1038%5C%2Fs41587-022-01427-7%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fdiv%26gt%3B%5Cn%26lt%3B%5C%2Fdiv%26gt%3B%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Automated%20reconstruction%20of%20whole-embryo%20cell%20lineages%20by%20learning%20from%20sparse%20annotations%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Caroline%22%2C%22lastName%22%3A%22Malin-Mayor%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Peter%22%2C%22lastName%22%3A%22Hirsch%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Leo%22%2C%22lastName%22%3A%22Guignard%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Katie%22%2C%22lastName%22%3A%22McDole%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Yinan%22%2C%22lastName%22%3A%22Wan%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22William%20C.%22%2C%22lastName%22%3A%22Lemon%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Dagmar%22%2C%22lastName%22%3A%22Kainmueller%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Philipp%20J.%22%2C%22lastName%22%3A%22Keller%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Stephan%22%2C%22lastName%22%3A%22Preibisch%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jan%22%2C%22lastName%22%3A%22Funke%22%7D%5D%2C%22abstractNote%22%3A%22We%20present%20a%20method%20to%20automatically%20identify%20and%20track%20nuclei%20in%20time-lapse%20microscopy%20recordings%20of%20entire%20developing%20embryos.%20The%20method%20combines%20deep%20learning%20and%20global%20optimization.%20On%20a%20mouse%20dataset%2C%20it%20reconstructs%2075.8%25%20of%20cell%20lineages%20spanning%201%20h%2C%20as%20compared%20to%2031.8%25%20for%20the%20competing%20method.%20Our%20approach%20improves%20understanding%20of%20where%20and%20when%20cell%20fate%20decisions%20are%20made%20in%20developing%20embryos%2C%20tissues%2C%20and%20organs.%22%2C%22date%22%3A%222023-01%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.1038%5C%2Fs41587-022-01427-7%22%2C%22ISSN%22%3A%221546-1696%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fwww.nature.com%5C%2Farticles%5C%2Fs41587-022-01427-7%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222023-02-13T14%3A38%3A57Z%22%7D%7D%2C%7B%22key%22%3A%226U7LI8T6%22%2C%22library%22%3A%7B%22id%22%3A4725570%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Meissner%20et%20al.%22%2C%22parsedDate%22%3A%222022-12-03%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%26lt%3Bdiv%20class%3D%26quot%3Bcsl-bib-body%26quot%3B%20style%3D%26quot%3Bline-height%3A%201.35%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%26quot%3B%26gt%3B%5Cn%20%20%26lt%3Bdiv%20class%3D%26quot%3Bcsl-entry%26quot%3B%26gt%3BMeissner%2C%20G.%20W.%2C%20Nern%2C%20A.%2C%20Dorman%2C%20Z.%2C%20DePasquale%2C%20G.%20M.%2C%20Forster%2C%20K.%2C%20Gibney%2C%20T.%2C%20Hausenfluck%2C%20J.%20H.%2C%20He%2C%20Y.%2C%20Iyer%2C%20N.%2C%20Jeter%2C%20J.%2C%20Johnson%2C%20L.%2C%20Johnston%2C%20R.%20M.%2C%20Lee%2C%20K.%2C%20Melton%2C%20B.%2C%20Yarbrough%2C%20B.%2C%20Zugates%2C%20C.%20T.%2C%20Clements%2C%20J.%2C%20Goina%2C%20C.%2C%20Otsuna%2C%20H.%2C%20%26%23x2026%3B%20Team%2C%20F.%20P.%20%282022%29.%20%26lt%3Bi%26gt%3B%26lt%3Bb%26gt%3B%26lt%3Bspan%20style%3D%26quot%3Bfont-style%3Anormal%3B%26quot%3B%26gt%3BA%20searchable%20image%20resource%20of%20Drosophila%20GAL4-driver%20expression%20patterns%20with%20single%20neuron%20resolution%26lt%3B%5C%2Fspan%26gt%3B%26lt%3B%5C%2Fb%26gt%3B%26lt%3B%5C%2Fi%26gt%3B.%20bioRxiv.%20%26lt%3Ba%20class%3D%26%23039%3Bzp-DOIURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1101%5C%2F2020.05.29.080473%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1101%5C%2F2020.05.29.080473%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fdiv%26gt%3B%5Cn%26lt%3B%5C%2Fdiv%26gt%3B%22%2C%22data%22%3A%7B%22itemType%22%3A%22preprint%22%2C%22title%22%3A%22A%20searchable%20image%20resource%20of%20Drosophila%20GAL4-driver%20expression%20patterns%20with%20single%20neuron%20resolution%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Geoffrey%20W.%22%2C%22lastName%22%3A%22Meissner%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Aljoscha%22%2C%22lastName%22%3A%22Nern%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Zachary%22%2C%22lastName%22%3A%22Dorman%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Gina%20M.%22%2C%22lastName%22%3A%22DePasquale%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Kaitlyn%22%2C%22lastName%22%3A%22Forster%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Theresa%22%2C%22lastName%22%3A%22Gibney%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Joanna%20H.%22%2C%22lastName%22%3A%22Hausenfluck%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Yisheng%22%2C%22lastName%22%3A%22He%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Nirmala%22%2C%22lastName%22%3A%22Iyer%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jennifer%22%2C%22lastName%22%3A%22Jeter%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Lauren%22%2C%22lastName%22%3A%22Johnson%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Rebecca%20M.%22%2C%22lastName%22%3A%22Johnston%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Kelley%22%2C%22lastName%22%3A%22Lee%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Brian%22%2C%22lastName%22%3A%22Melton%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Brianna%22%2C%22lastName%22%3A%22Yarbrough%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Christopher%20T.%22%2C%22lastName%22%3A%22Zugates%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jody%22%2C%22lastName%22%3A%22Clements%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Cristian%22%2C%22lastName%22%3A%22Goina%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Hideo%22%2C%22lastName%22%3A%22Otsuna%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Konrad%22%2C%22lastName%22%3A%22Rokicki%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Robert%20R.%22%2C%22lastName%22%3A%22Svirskas%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Yoshinori%22%2C%22lastName%22%3A%22Aso%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Gwyneth%20M.%22%2C%22lastName%22%3A%22Card%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Barry%20J.%22%2C%22lastName%22%3A%22Dickson%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Erica%22%2C%22lastName%22%3A%22Ehrhardt%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jens%22%2C%22lastName%22%3A%22Goldammer%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Masayoshi%22%2C%22lastName%22%3A%22Ito%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Dagmar%22%2C%22lastName%22%3A%22Kainmueller%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Wyatt%22%2C%22lastName%22%3A%22Korff%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Lisa%22%2C%22lastName%22%3A%22Mais%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ryo%22%2C%22lastName%22%3A%22Minegishi%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Shigehiro%22%2C%22lastName%22%3A%22Namiki%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Gerald%20M.%22%2C%22lastName%22%3A%22Rubin%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Gabriella%20R.%22%2C%22lastName%22%3A%22Sterne%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Tanya%22%2C%22lastName%22%3A%22Wolff%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Oz%22%2C%22lastName%22%3A%22Malkesman%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22FlyLight%20Project%22%2C%22lastName%22%3A%22Team%22%7D%5D%2C%22abstractNote%22%3A%22Precise%2C%20repeatable%20genetic%20access%20to%20specific%20neurons%20via%20GAL4%5C%2FUAS%20and%20related%20methods%20is%20a%20key%20advantage%20of%20Drosophila%20neuroscience.%20Neuronal%20targeting%20is%20typically%20documented%20using%20light%20microscopy%20of%20full%20GAL4%20expression%20patterns%2C%20which%20generally%20lack%20the%20single-cell%20resolution%20required%20for%20reliable%20cell%20type%20identification.%20Here%20we%20use%20stochastic%20GAL4%20labeling%20with%20the%20MultiColor%20FlpOut%20approach%20to%20generate%20cellular%20resolution%20confocal%20images%20at%20large%20scale.%20We%20are%20releasing%20aligned%20images%20of%2074%2C000%20such%20adult%20central%20nervous%20systems.%20An%20anticipated%20use%20of%20this%20resource%20is%20to%20bridge%20the%20gap%20between%20neurons%20identified%20by%20electron%20or%20light%20microscopy.%20Identifying%20individual%20neurons%20that%20make%20up%20each%20GAL4%20expression%20pattern%20improves%20the%20prediction%20of%20split-GAL4%20combinations%20targeting%20particular%20neurons.%20To%20this%20end%20we%20have%20made%20the%20images%20searchable%20on%20the%20NeuronBridge%20website.%20We%20demonstrate%20the%20potential%20of%20NeuronBridge%20to%20rapidly%20and%20effectively%20identify%20neuron%20matches%20based%20on%20morphology%20across%20imaging%20modalities%20and%20datasets.%22%2C%22genre%22%3A%22%22%2C%22repository%22%3A%22bioRxiv%22%2C%22archiveID%22%3A%22%22%2C%22date%22%3A%222022-12-03%22%2C%22DOI%22%3A%2210.1101%5C%2F2020.05.29.080473%22%2C%22citationKey%22%3A%22%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fwww.biorxiv.org%5C%2Fcontent%5C%2F10.1101%5C%2F2020.05.29.080473v3%22%2C%22language%22%3A%22en%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222023-02-13T14%3A41%3A08Z%22%7D%7D%2C%7B%22key%22%3A%22QMJ6F2CE%22%2C%22library%22%3A%7B%22id%22%3A4725570%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Moebel%20et%20al.%22%2C%22parsedDate%22%3A%222021-11%22%2C%22numChildren%22%3A2%7D%2C%22bib%22%3A%22%26lt%3Bdiv%20class%3D%26quot%3Bcsl-bib-body%26quot%3B%20style%3D%26quot%3Bline-height%3A%201.35%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%26quot%3B%26gt%3B%5Cn%20%20%26lt%3Bdiv%20class%3D%26quot%3Bcsl-entry%26quot%3B%26gt%3BMoebel%2C%20E.%2C%20Martinez-Sanchez%2C%20A.%2C%20Lamm%2C%20L.%2C%20Righetto%2C%20R.%20D.%2C%20Wietrzynski%2C%20W.%2C%20Albert%2C%20S.%2C%20Larivi%26%23xE8%3Bre%2C%20D.%2C%20Fourmentin%2C%20E.%2C%20Pfeffer%2C%20S.%2C%20Ortiz%2C%20J.%2C%20Baumeister%2C%20W.%2C%20Peng%2C%20T.%2C%20Engel%2C%20B.%20D.%2C%20%26amp%3B%20Kervrann%2C%20C.%20%282021%29.%20%26lt%3Bb%26gt%3BDeep%20learning%20improves%20macromolecule%20identification%20in%203D%20cellular%20cryo-electron%20tomograms%26lt%3B%5C%2Fb%26gt%3B.%20%26lt%3Bi%26gt%3BNature%20Methods%26lt%3B%5C%2Fi%26gt%3B%2C%20%26lt%3Bi%26gt%3B18%26lt%3B%5C%2Fi%26gt%3B%2811%29%2C%201386%26%23x2013%3B1394.%20%26lt%3Ba%20class%3D%26%23039%3Bzp-DOIURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1038%5C%2Fs41592-021-01275-4%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1038%5C%2Fs41592-021-01275-4%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fdiv%26gt%3B%5Cn%26lt%3B%5C%2Fdiv%26gt%3B%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Deep%20learning%20improves%20macromolecule%20identification%20in%203D%20cellular%20cryo-electron%20tomograms%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Emmanuel%22%2C%22lastName%22%3A%22Moebel%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Antonio%22%2C%22lastName%22%3A%22Martinez-Sanchez%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Lorenz%22%2C%22lastName%22%3A%22Lamm%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ricardo%20D.%22%2C%22lastName%22%3A%22Righetto%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Wojciech%22%2C%22lastName%22%3A%22Wietrzynski%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Sahradha%22%2C%22lastName%22%3A%22Albert%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Damien%22%2C%22lastName%22%3A%22Larivi%5Cu00e8re%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Eric%22%2C%22lastName%22%3A%22Fourmentin%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Stefan%22%2C%22lastName%22%3A%22Pfeffer%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Julio%22%2C%22lastName%22%3A%22Ortiz%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Wolfgang%22%2C%22lastName%22%3A%22Baumeister%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Tingying%22%2C%22lastName%22%3A%22Peng%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Benjamin%20D.%22%2C%22lastName%22%3A%22Engel%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Charles%22%2C%22lastName%22%3A%22Kervrann%22%7D%5D%2C%22abstractNote%22%3A%22Cryogenic%20electron%20tomography%20%28cryo-ET%29%20visualizes%20the%203D%20spatial%20distribution%20of%20macromolecules%20at%20nanometer%20resolution%20inside%20native%20cells.%20However%2C%20automated%20identification%20of%20macromolecules%20inside%20cellular%20tomograms%20is%20challenged%20by%20noise%20and%20reconstruction%20artifacts%2C%20as%20well%20as%20the%20presence%20of%20many%20molecular%20species%20in%20the%20crowded%20volumes.%20Here%2C%20we%20present%20DeepFinder%2C%20a%20computational%20procedure%20that%20uses%20artificial%20neural%20networks%20to%20simultaneously%20localize%20multiple%20classes%20of%20macromolecules.%20Once%20trained%2C%20the%20inference%20stage%20of%20DeepFinder%20is%20faster%20than%20template%20matching%20and%20performs%20better%20than%20other%20competitive%20deep%20learning%20methods%20at%20identifying%20macromolecules%20of%20various%20sizes%20in%20both%20synthetic%20and%20experimental%20datasets.%20On%20cellular%20cryo-ET%20data%2C%20DeepFinder%20localized%20membrane-bound%20and%20cytosolic%20ribosomes%20%28roughly%203.2%5Cu2009MDa%29%2C%20ribulose%201%2C5-bisphosphate%20carboxylase%5Cu2013oxygenase%20%28roughly%20560%5Cu2009kDa%20soluble%20complex%29%20and%20photosystem%20II%20%28roughly%20550%5Cu2009kDa%20membrane%20complex%29%20with%20an%20accuracy%20comparable%20to%20expert-supervised%20ground%20truth%20annotations.%20DeepFinder%20is%20therefore%20a%20promising%20algorithm%20for%20the%20semiautomated%20analysis%20of%20a%20wide%20range%20of%20molecular%20targets%20in%20cellular%20tomograms.%22%2C%22date%22%3A%222021-11%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.1038%5C%2Fs41592-021-01275-4%22%2C%22ISSN%22%3A%221548-7105%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fwww.nature.com%5C%2Farticles%5C%2Fs41592-021-01275-4%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222025-05-02T08%3A11%3A22Z%22%7D%7D%2C%7B%22key%22%3A%225ZQ8WVJ8%22%2C%22library%22%3A%7B%22id%22%3A4725570%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Reinke%20et%20al.%22%2C%22parsedDate%22%3A%222024-02-12%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%26lt%3Bdiv%20class%3D%26quot%3Bcsl-bib-body%26quot%3B%20style%3D%26quot%3Bline-height%3A%201.35%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%26quot%3B%26gt%3B%5Cn%20%20%26lt%3Bdiv%20class%3D%26quot%3Bcsl-entry%26quot%3B%26gt%3BReinke%2C%20A.%2C%20Tizabi%2C%20M.%20D.%2C%20Baumgartner%2C%20M.%2C%20Eisenmann%2C%20M.%2C%20Heckmann-N%26%23xF6%3Btzel%2C%20D.%2C%20Kavur%2C%20A.%20E.%2C%20R%26%23xE4%3Bdsch%2C%20T.%2C%20Sudre%2C%20C.%20H.%2C%20Acion%2C%20L.%2C%20Antonelli%2C%20M.%2C%20Arbel%2C%20T.%2C%20Bakas%2C%20S.%2C%20Benis%2C%20A.%2C%20Buettner%2C%20F.%2C%20Cardoso%2C%20M.%20J.%2C%20Cheplygina%2C%20V.%2C%20Chen%2C%20J.%2C%20Christodoulou%2C%20E.%2C%20Cimini%2C%20B.%20A.%2C%20%26%23x2026%3B%20Maier-Hein%2C%20L.%20%282024%29.%20%26lt%3Bb%26gt%3BUnderstanding%20metric-related%20pitfalls%20in%20image%20analysis%20validation%26lt%3B%5C%2Fb%26gt%3B.%20%26lt%3Bi%26gt%3BNature%20Methods%26lt%3B%5C%2Fi%26gt%3B%2C%201%26%23x2013%3B13.%20%26lt%3Ba%20class%3D%26%23039%3Bzp-DOIURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1038%5C%2Fs41592-023-02150-0%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1038%5C%2Fs41592-023-02150-0%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fdiv%26gt%3B%5Cn%26lt%3B%5C%2Fdiv%26gt%3B%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Understanding%20metric-related%20pitfalls%20in%20image%20analysis%20validation%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Annika%22%2C%22lastName%22%3A%22Reinke%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Minu%20D.%22%2C%22lastName%22%3A%22Tizabi%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Michael%22%2C%22lastName%22%3A%22Baumgartner%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Matthias%22%2C%22lastName%22%3A%22Eisenmann%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Doreen%22%2C%22lastName%22%3A%22Heckmann-N%5Cu00f6tzel%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22A.%20Emre%22%2C%22lastName%22%3A%22Kavur%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Tim%22%2C%22lastName%22%3A%22R%5Cu00e4dsch%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Carole%20H.%22%2C%22lastName%22%3A%22Sudre%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Laura%22%2C%22lastName%22%3A%22Acion%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Michela%22%2C%22lastName%22%3A%22Antonelli%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Tal%22%2C%22lastName%22%3A%22Arbel%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Spyridon%22%2C%22lastName%22%3A%22Bakas%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Arriel%22%2C%22lastName%22%3A%22Benis%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Florian%22%2C%22lastName%22%3A%22Buettner%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22M.%20Jorge%22%2C%22lastName%22%3A%22Cardoso%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Veronika%22%2C%22lastName%22%3A%22Cheplygina%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jianxu%22%2C%22lastName%22%3A%22Chen%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Evangelia%22%2C%22lastName%22%3A%22Christodoulou%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Beth%20A.%22%2C%22lastName%22%3A%22Cimini%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Keyvan%22%2C%22lastName%22%3A%22Farahani%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Luciana%22%2C%22lastName%22%3A%22Ferrer%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Adrian%22%2C%22lastName%22%3A%22Galdran%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Bram%22%2C%22lastName%22%3A%22van%20Ginneken%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ben%22%2C%22lastName%22%3A%22Glocker%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Patrick%22%2C%22lastName%22%3A%22Godau%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Daniel%20A.%22%2C%22lastName%22%3A%22Hashimoto%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Michael%20M.%22%2C%22lastName%22%3A%22Hoffman%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Merel%22%2C%22lastName%22%3A%22Huisman%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Fabian%22%2C%22lastName%22%3A%22Isensee%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Pierre%22%2C%22lastName%22%3A%22Jannin%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Charles%20E.%22%2C%22lastName%22%3A%22Kahn%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Dagmar%22%2C%22lastName%22%3A%22Kainmueller%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Bernhard%22%2C%22lastName%22%3A%22Kainz%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Alexandros%22%2C%22lastName%22%3A%22Karargyris%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jens%22%2C%22lastName%22%3A%22Kleesiek%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Florian%22%2C%22lastName%22%3A%22Kofler%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Thijs%22%2C%22lastName%22%3A%22Kooi%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Annette%22%2C%22lastName%22%3A%22Kopp-Schneider%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Michal%22%2C%22lastName%22%3A%22Kozubek%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Anna%22%2C%22lastName%22%3A%22Kreshuk%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Tahsin%22%2C%22lastName%22%3A%22Kurc%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Bennett%20A.%22%2C%22lastName%22%3A%22Landman%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Geert%22%2C%22lastName%22%3A%22Litjens%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Amin%22%2C%22lastName%22%3A%22Madani%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Klaus%22%2C%22lastName%22%3A%22Maier-Hein%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Anne%20L.%22%2C%22lastName%22%3A%22Martel%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Erik%22%2C%22lastName%22%3A%22Meijering%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Bjoern%22%2C%22lastName%22%3A%22Menze%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Karel%20G.%20M.%22%2C%22lastName%22%3A%22Moons%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Henning%22%2C%22lastName%22%3A%22M%5Cu00fcller%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Brennan%22%2C%22lastName%22%3A%22Nichyporuk%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Felix%22%2C%22lastName%22%3A%22Nickel%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jens%22%2C%22lastName%22%3A%22Petersen%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Susanne%20M.%22%2C%22lastName%22%3A%22Rafelski%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Nasir%22%2C%22lastName%22%3A%22Rajpoot%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Mauricio%22%2C%22lastName%22%3A%22Reyes%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Michael%20A.%22%2C%22lastName%22%3A%22Riegler%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Nicola%22%2C%22lastName%22%3A%22Rieke%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Julio%22%2C%22lastName%22%3A%22Saez-Rodriguez%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Clara%20I.%22%2C%22lastName%22%3A%22S%5Cu00e1nchez%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Shravya%22%2C%22lastName%22%3A%22Shetty%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ronald%20M.%22%2C%22lastName%22%3A%22Summers%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Abdel%20A.%22%2C%22lastName%22%3A%22Taha%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Aleksei%22%2C%22lastName%22%3A%22Tiulpin%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Sotirios%20A.%22%2C%22lastName%22%3A%22Tsaftaris%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ben%22%2C%22lastName%22%3A%22Van%20Calster%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ga%5Cu00ebl%22%2C%22lastName%22%3A%22Varoquaux%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ziv%20R.%22%2C%22lastName%22%3A%22Yaniv%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Paul%20F.%22%2C%22lastName%22%3A%22J%5Cu00e4ger%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Lena%22%2C%22lastName%22%3A%22Maier-Hein%22%7D%5D%2C%22abstractNote%22%3A%22Validation%20metrics%20are%20key%20for%20tracking%20scientific%20progress%20and%20bridging%20the%20current%20chasm%20between%20artificial%20intelligence%20research%20and%20its%20translation%20into%20practice.%20However%2C%20increasing%20evidence%20shows%20that%2C%20particularly%20in%20image%20analysis%2C%20metrics%20are%20often%20chosen%20inadequately.%20Although%20taking%20into%20account%20the%20individual%20strengths%2C%20weaknesses%20and%20limitations%20of%20validation%20metrics%20is%20a%20critical%20prerequisite%20to%20making%20educated%20choices%2C%20the%20relevant%20knowledge%20is%20currently%20scattered%20and%20poorly%20accessible%20to%20individual%20researchers.%20Based%20on%20a%20multistage%20Delphi%20process%20conducted%20by%20a%20multidisciplinary%20expert%20consortium%20as%20well%20as%20extensive%20community%20feedback%2C%20the%20present%20work%20provides%20a%20reliable%20and%20comprehensive%20common%20point%20of%20access%20to%20information%20on%20pitfalls%20related%20to%20validation%20metrics%20in%20image%20analysis.%20Although%20focused%20on%20biomedical%20image%20analysis%2C%20the%20addressed%20pitfalls%20generalize%20across%20application%20domains%20and%20are%20categorized%20according%20to%20a%20newly%20created%2C%20domain-agnostic%20taxonomy.%20The%20work%20serves%20to%20enhance%20global%20comprehension%20of%20a%20key%20topic%20in%20image%20analysis%20validation.%22%2C%22date%22%3A%222024-02-12%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.1038%5C%2Fs41592-023-02150-0%22%2C%22ISSN%22%3A%221548-7105%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fwww.nature.com%5C%2Farticles%5C%2Fs41592-023-02150-0%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222025-02-03T08%3A36%3A30Z%22%7D%7D%2C%7B%22key%22%3A%22PZ27643P%22%2C%22library%22%3A%7B%22id%22%3A4725570%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Rumberger%20et%20al.%22%2C%22parsedDate%22%3A%222022-04-19%22%2C%22numChildren%22%3A2%7D%2C%22bib%22%3A%22%26lt%3Bdiv%20class%3D%26quot%3Bcsl-bib-body%26quot%3B%20style%3D%26quot%3Bline-height%3A%201.35%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%26quot%3B%26gt%3B%5Cn%20%20%26lt%3Bdiv%20class%3D%26quot%3Bcsl-entry%26quot%3B%26gt%3BRumberger%2C%20J.%20L.%2C%20Baumann%2C%20E.%2C%20Hirsch%2C%20P.%2C%20Janowczyk%2C%20A.%2C%20Zlobec%2C%20I.%2C%20%26amp%3B%20Kainmueller%2C%20D.%20%282022%29.%20%26lt%3Bi%26gt%3B%26lt%3Bb%26gt%3B%26lt%3Bspan%20style%3D%26quot%3Bfont-style%3Anormal%3B%26quot%3B%26gt%3BPanoptic%20segmentation%20with%20highly%20imbalanced%20semantic%20labels%26lt%3B%5C%2Fspan%26gt%3B%26lt%3B%5C%2Fb%26gt%3B%26lt%3B%5C%2Fi%26gt%3B%20%28arXiv%3A2203.11692%29.%20arXiv.%20%26lt%3Ba%20class%3D%26%23039%3Bzp-DOIURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FarXiv.2203.11692%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FarXiv.2203.11692%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fdiv%26gt%3B%5Cn%26lt%3B%5C%2Fdiv%26gt%3B%22%2C%22data%22%3A%7B%22itemType%22%3A%22preprint%22%2C%22title%22%3A%22Panoptic%20segmentation%20with%20highly%20imbalanced%20semantic%20labels%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Josef%20Lorenz%22%2C%22lastName%22%3A%22Rumberger%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Elias%22%2C%22lastName%22%3A%22Baumann%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Peter%22%2C%22lastName%22%3A%22Hirsch%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Andrew%22%2C%22lastName%22%3A%22Janowczyk%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Inti%22%2C%22lastName%22%3A%22Zlobec%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Dagmar%22%2C%22lastName%22%3A%22Kainmueller%22%7D%5D%2C%22abstractNote%22%3A%22We%20describe%20here%20the%20panoptic%20segmentation%20method%20we%20devised%20for%20our%20participation%20in%20the%20CoNIC%3A%20Colon%20Nuclei%20Identification%20and%20Counting%20Challenge%20at%20ISBI%202022.%20Key%20features%20of%20our%20method%20are%20a%20weighted%20loss%20specifically%20engineered%20for%20semantic%20segmentation%20of%20highly%20imbalanced%20cell%20types%2C%20and%20a%20state-of-the%20art%20nuclei%20instance%20segmentation%20model%2C%20which%20we%20combine%20in%20a%20Hovernet-like%20architecture.%22%2C%22genre%22%3A%22%22%2C%22repository%22%3A%22arXiv%22%2C%22archiveID%22%3A%22arXiv%3A2203.11692%22%2C%22date%22%3A%222022-04-19%22%2C%22DOI%22%3A%2210.48550%5C%2FarXiv.2203.11692%22%2C%22citationKey%22%3A%22%22%2C%22url%22%3A%22http%3A%5C%2F%5C%2Farxiv.org%5C%2Fabs%5C%2F2203.11692%22%2C%22language%22%3A%22%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222023-02-13T14%3A42%3A21Z%22%7D%7D%2C%7B%22key%22%3A%22IFN4PWXN%22%2C%22library%22%3A%7B%22id%22%3A4725570%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Rumberger%20et%20al.%22%2C%22parsedDate%22%3A%222023-10-02%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%26lt%3Bdiv%20class%3D%26quot%3Bcsl-bib-body%26quot%3B%20style%3D%26quot%3Bline-height%3A%201.35%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%26quot%3B%26gt%3B%5Cn%20%20%26lt%3Bdiv%20class%3D%26quot%3Bcsl-entry%26quot%3B%26gt%3BRumberger%2C%20J.%20L.%2C%20Franzen%2C%20J.%2C%20Hirsch%2C%20P.%2C%20Albrecht%2C%20J.-P.%2C%20%26amp%3B%20Kainmueller%2C%20D.%20%282023%29.%20%26lt%3Bb%26gt%3BACTIS%3A%20Improving%20data%20efficiency%20by%20leveraging%20semi-supervised%20Augmentation%20Consistency%20Training%20for%20Instance%20Segmentation%26lt%3B%5C%2Fb%26gt%3B.%20%26lt%3Bi%26gt%3B2023%20IEEE%5C%2FCVF%20International%20Conference%20on%20Computer%20Vision%20Workshops%20%28ICCVW%29%26lt%3B%5C%2Fi%26gt%3B%2C%203792%26%23x2013%3B3801.%20%26lt%3Ba%20class%3D%26%23039%3Bzp-DOIURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1109%5C%2FICCVW60793.2023.00410%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1109%5C%2FICCVW60793.2023.00410%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fdiv%26gt%3B%5Cn%26lt%3B%5C%2Fdiv%26gt%3B%22%2C%22data%22%3A%7B%22itemType%22%3A%22conferencePaper%22%2C%22title%22%3A%22ACTIS%3A%20Improving%20data%20efficiency%20by%20leveraging%20semi-supervised%20Augmentation%20Consistency%20Training%20for%20Instance%20Segmentation%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Josef%20Lorenz%22%2C%22lastName%22%3A%22Rumberger%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jannik%22%2C%22lastName%22%3A%22Franzen%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Peter%22%2C%22lastName%22%3A%22Hirsch%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jan-Philipp%22%2C%22lastName%22%3A%22Albrecht%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Dagmar%22%2C%22lastName%22%3A%22Kainmueller%22%7D%5D%2C%22abstractNote%22%3A%22%22%2C%22date%22%3A%222023-10-2%22%2C%22proceedingsTitle%22%3A%222023%20IEEE%5C%2FCVF%20International%20Conference%20on%20Computer%20Vision%20Workshops%20%28ICCVW%29%22%2C%22conferenceName%22%3A%222023%20IEEE%5C%2FCVF%20International%20Conference%20on%20Computer%20Vision%20Workshops%20%28ICCVW%29%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.1109%5C%2FICCVW60793.2023.00410%22%2C%22ISBN%22%3A%229798350307443%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fieeexplore.ieee.org%5C%2Fdocument%5C%2F10350921%5C%2F%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222024-02-28T15%3A29%3A39Z%22%7D%7D%2C%7B%22key%22%3A%22ZB2WILP3%22%2C%22library%22%3A%7B%22id%22%3A4725570%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Rumberger%20et%20al.%22%2C%22parsedDate%22%3A%222024-07-24%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%26lt%3Bdiv%20class%3D%26quot%3Bcsl-bib-body%26quot%3B%20style%3D%26quot%3Bline-height%3A%201.35%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%26quot%3B%26gt%3B%5Cn%20%20%26lt%3Bdiv%20class%3D%26quot%3Bcsl-entry%26quot%3B%26gt%3BRumberger%2C%20J.%20L.%2C%20Lim%2C%20W.%2C%20Wildfeuer%2C%20B.%2C%20Sodemann%2C%20E.%20B.%2C%20Lecler%2C%20A.%2C%20Stemplinger%2C%20S.%2C%20Issever%2C%20A.%20S.%2C%20Sepahdari%2C%20A.%20R.%2C%20Langner%2C%20S.%2C%20Kainmueller%2C%20D.%2C%20Hamm%2C%20B.%2C%20%26amp%3B%20Erb-Eigner%2C%20K.%20%282024%29.%20%26lt%3Bi%26gt%3B%26lt%3Bb%26gt%3B%26lt%3Bspan%20style%3D%26quot%3Bfont-style%3Anormal%3B%26quot%3B%26gt%3BContent-based%20image%20retrieval%20assists%20radiologists%20in%20diagnosing%20eye%20and%20orbital%20mass%20lesions%20in%20MRI%26lt%3B%5C%2Fspan%26gt%3B%26lt%3B%5C%2Fb%26gt%3B%26lt%3B%5C%2Fi%26gt%3B.%20medRxiv.%20%26lt%3Ba%20class%3D%26%23039%3Bzp-DOIURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1101%5C%2F2024.07.24.24310920%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1101%5C%2F2024.07.24.24310920%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fdiv%26gt%3B%5Cn%26lt%3B%5C%2Fdiv%26gt%3B%22%2C%22data%22%3A%7B%22itemType%22%3A%22preprint%22%2C%22title%22%3A%22Content-based%20image%20retrieval%20assists%20radiologists%20in%20diagnosing%20eye%20and%20orbital%20mass%20lesions%20in%20MRI%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22J.%20Lorenz%22%2C%22lastName%22%3A%22Rumberger%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Winna%22%2C%22lastName%22%3A%22Lim%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Benjamin%22%2C%22lastName%22%3A%22Wildfeuer%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Elisa%20B.%22%2C%22lastName%22%3A%22Sodemann%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Augustin%22%2C%22lastName%22%3A%22Lecler%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Simon%22%2C%22lastName%22%3A%22Stemplinger%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ahi%20Sema%22%2C%22lastName%22%3A%22Issever%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ali%20R.%22%2C%22lastName%22%3A%22Sepahdari%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22S%5Cu00f6nke%22%2C%22lastName%22%3A%22Langner%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Dagmar%22%2C%22lastName%22%3A%22Kainmueller%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Bernd%22%2C%22lastName%22%3A%22Hamm%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Katharina%22%2C%22lastName%22%3A%22Erb-Eigner%22%7D%5D%2C%22abstractNote%22%3A%22Background%20Diagnoses%20of%20eye%20and%20orbit%20pathologies%20by%20radiological%20imaging%20is%20challenging%20due%20to%20their%20low%20prevalence%20and%20the%20relative%20high%20number%20of%20possible%20pathologies%20and%20variability%20in%20presentation%2C%20thus%20requiring%20substantial%20domain-specific%20experience.%5CnPurpose%20This%20study%20investigates%20whether%20a%20content-based%20image%20retrieval%20%28CBIR%29%20tool%20paired%20with%20a%20curated%20database%20of%20orbital%20MRI%20cases%20with%20verified%20diagnoses%20can%20enhance%20diagnostic%20accuracy%20and%20reduce%20reading%20time%20for%20radiologists%20across%20different%20experience%20levels.%5CnMaterial%20and%20Methods%20We%20tested%20these%20two%20hypotheses%20in%20a%20multi-reader%2C%20multi-case%20study%2C%20with%2036%20readers%20and%2048%20retrospective%20eye%20and%20orbit%20MRI%20cases.%20We%20asked%20each%20reader%20to%20diagnose%20eight%20orbital%20MRI%20cases%2C%20four%20while%20having%20only%20status%20quo%20reference%20tools%20available%20%28e.g.%20Radiopaedia.org%2C%20StatDx%2C%20etc.%29%2C%20and%20four%20while%20having%20a%20CBIR%20reference%20tool%20additionally%20available.%20Then%2C%20we%20analyzed%20and%20compared%20the%20results%20with%20linear%20mixed%20effects%20models%2C%20controlling%20for%20the%20cases%20and%20participants.%5CnResults%20Overall%2C%20we%20found%20a%20strong%20positive%20effect%20on%20diagnostic%20accuracy%20when%20using%20the%20CBIR%20tool%20only%20as%20compared%20to%20using%20status%20quo%20tools%20only%20%28status%20quo%20only%2055.88%25%2C%20CBIR%20only%2070.59%25%2C%2026.32%25%20relative%20improvement%2C%20p%3D.03%2C%20odds%20ratio%3D2.07%29%2C%20and%20an%20even%20stronger%20effect%20when%20using%20the%20CBIR%20tool%20in%20conjunction%20with%20status%20quo%20tools%20%28status%20quo%20only%2055.88%25%2C%20CBIR%20%2B%20status%20quo%2083.33%25%2C%2049%25%20relative%20improvement%2C%20p%3D.02%2C%20odds%20ratio%3D3.65%29.%20Reading%20time%20in%20seconds%20%28s%29%20decreased%20when%20using%20only%20the%20CBIR%20tool%20%28status%20quo%20only%20334s%2C%20CBIR%20only%20236s%2C%2029%25%20decrease%2C%20p%26lt%3B.001%29%2C%20but%20increased%20when%20used%20in%20conjunction%20with%20status%20quo%20tools%20%28status%20quo%20only%20334s%2C%20CBIR%20%2B%20status%20quo%20396s%2C%2019%25%20increase%2C%20p%26lt%3B.001%29.%5CnConclusion%20We%20found%20significant%20positive%20effects%20on%20diagnostic%20accuracy%20and%20mixed%20effects%20on%20reading%20times%20when%20using%20the%20CBIR%20reference%20tool%2C%20indicating%20the%20potential%20benefits%20when%20using%20CBIR%20reference%20tools%20in%20diagnosing%20eye%20and%20orbit%20mass%20lesions%20by%20radiological%20imaging.%5CnSummary%20Using%20a%20content-based%20image%20retrieval%20tool%20significantly%20improved%20diagnostic%20accuracy%20and%20had%20mixed%20effects%20on%20reading%20time%20for%20diagnosing%20MRI%20exams%20of%20patients%20with%20eye%20and%20orbit%20pathologies.%5CnKey%20ResultsUsing%20the%20CBIR%20tool%20alone%20improved%20diagnostic%20accuracy%20from%2055.88%25%20to%2070.59%25%20%28odds%20ratio%3D2.07%2C%20p%3D.03%29%20and%20decreased%20reading%20time%20from%20334s%20to%20236s%20%28p%26lt%3B.001%29%20compared%20to%20SQ%20alone.Using%20CBIR%20together%20with%20SQ%20tools%20further%20increased%20accuracy%20to%2083.33%25%20%28odds%20ratio%3D3.65%2C%20p%3D.02%29%20but%20increased%20reading%20time%20to%20396s%20%28p%26lt%3B.001%29%20compared%20to%20SQ%20only.%22%2C%22genre%22%3A%22%22%2C%22repository%22%3A%22medRxiv%22%2C%22archiveID%22%3A%22%22%2C%22date%22%3A%222024-07-24%22%2C%22DOI%22%3A%2210.1101%5C%2F2024.07.24.24310920%22%2C%22citationKey%22%3A%22%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fwww.medrxiv.org%5C%2Fcontent%5C%2F10.1101%5C%2F2024.07.24.24310920v1%22%2C%22language%22%3A%22en%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222025-01-14T13%3A58%3A31Z%22%7D%7D%2C%7B%22key%22%3A%22PELEC6PN%22%2C%22library%22%3A%7B%22id%22%3A4725570%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Rumberger%20et%20al.%22%2C%22parsedDate%22%3A%222024-06-03%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%26lt%3Bdiv%20class%3D%26quot%3Bcsl-bib-body%26quot%3B%20style%3D%26quot%3Bline-height%3A%201.35%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%26quot%3B%26gt%3B%5Cn%20%20%26lt%3Bdiv%20class%3D%26quot%3Bcsl-entry%26quot%3B%26gt%3BRumberger%2C%20J.%20L.%2C%20Greenwald%2C%20N.%20F.%2C%20Ranek%2C%20J.%20S.%2C%20Boonrat%2C%20P.%2C%20Walker%2C%20C.%2C%20Franzen%2C%20J.%2C%20Varra%2C%20S.%20R.%2C%20Kong%2C%20A.%2C%20Sowers%2C%20C.%2C%20Liu%2C%20C.%20C.%2C%20Averbukh%2C%20I.%2C%20Piyadasa%2C%20H.%2C%20Vanguri%2C%20R.%2C%20Nederlof%2C%20I.%2C%20Wang%2C%20X.%20J.%2C%20Valen%2C%20D.%20V.%2C%20Kok%2C%20M.%2C%20Hollmann%2C%20T.%20J.%2C%20Kainmueller%2C%20D.%2C%20%26amp%3B%20Angelo%2C%20M.%20%282024%29.%20%26lt%3Bi%26gt%3B%26lt%3Bb%26gt%3B%26lt%3Bspan%20style%3D%26quot%3Bfont-style%3Anormal%3B%26quot%3B%26gt%3BAutomated%20classification%20of%20cellular%20expression%20in%20multiplexed%20imaging%20data%20with%20Nimbus%26lt%3B%5C%2Fspan%26gt%3B%26lt%3B%5C%2Fb%26gt%3B%26lt%3B%5C%2Fi%26gt%3B.%20bioRxiv.%20%26lt%3Ba%20class%3D%26%23039%3Bzp-DOIURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1101%5C%2F2024.06.02.597062%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1101%5C%2F2024.06.02.597062%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fdiv%26gt%3B%5Cn%26lt%3B%5C%2Fdiv%26gt%3B%22%2C%22data%22%3A%7B%22itemType%22%3A%22preprint%22%2C%22title%22%3A%22Automated%20classification%20of%20cellular%20expression%20in%20multiplexed%20imaging%20data%20with%20Nimbus%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22J.%20Lorenz%22%2C%22lastName%22%3A%22Rumberger%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Noah%20F.%22%2C%22lastName%22%3A%22Greenwald%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jolene%20S.%22%2C%22lastName%22%3A%22Ranek%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Potchara%22%2C%22lastName%22%3A%22Boonrat%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Cameron%22%2C%22lastName%22%3A%22Walker%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jannik%22%2C%22lastName%22%3A%22Franzen%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Sricharan%20Reddy%22%2C%22lastName%22%3A%22Varra%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Alex%22%2C%22lastName%22%3A%22Kong%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Cameron%22%2C%22lastName%22%3A%22Sowers%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Candace%20C.%22%2C%22lastName%22%3A%22Liu%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Inna%22%2C%22lastName%22%3A%22Averbukh%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Hadeesha%22%2C%22lastName%22%3A%22Piyadasa%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Rami%22%2C%22lastName%22%3A%22Vanguri%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Iris%22%2C%22lastName%22%3A%22Nederlof%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Xuefei%20Julie%22%2C%22lastName%22%3A%22Wang%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22David%20Van%22%2C%22lastName%22%3A%22Valen%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Marleen%22%2C%22lastName%22%3A%22Kok%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Travis%20J.%22%2C%22lastName%22%3A%22Hollmann%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Dagmar%22%2C%22lastName%22%3A%22Kainmueller%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Michael%22%2C%22lastName%22%3A%22Angelo%22%7D%5D%2C%22abstractNote%22%3A%22Multiplexed%20imaging%20offers%20a%20powerful%20approach%20to%20characterize%20the%20spatial%20topography%20of%20tissues%20in%20both%20health%20and%20disease.%20To%20analyze%20such%20data%2C%20the%20specific%20combination%20of%20markers%20that%20are%20present%20in%20each%20cell%20must%20be%20enumerated%20to%20enable%20accurate%20phenotyping%2C%20a%20process%20that%20often%20relies%20on%20unsupervised%20clustering.%20We%20constructed%20the%20Pan-Multiplex%20%28Pan-M%29%20dataset%20containing%20197%20million%20distinct%20annotations%20of%20marker%20expression%20across%2015%20different%20cell%20types.%20We%20used%20Pan-M%20to%20create%20Nimbus%2C%20a%20deep%20learning%20model%20to%20predict%20marker%20positivity%20from%20multiplexed%20image%20data.%20Nimbus%20is%20a%20pre-trained%20model%20that%20uses%20the%20underlying%20images%20to%20classify%20marker%20expression%20across%20distinct%20cell%20types%2C%20from%20different%20tissues%2C%20acquired%20using%20different%20microscope%20platforms%2C%20without%20requiring%20any%20retraining.%20We%20demonstrate%20that%20Nimbus%20predictions%20capture%20the%20underlying%20staining%20patterns%20of%20the%20full%20diversity%20of%20markers%20present%20in%20Pan-M.%20We%20then%20show%20how%20Nimbus%20predictions%20can%20be%20integrated%20with%20downstream%20clustering%20algorithms%20to%20robustly%20identify%20cell%20subtypes%20in%20image%20data.%20We%20have%20open-sourced%20Nimbus%20and%20Pan-M%20to%20enable%20community%20use%20at%20https%3A%5C%2F%5C%2Fgithub.com%5C%2Fangelolab%5C%2FNimbus-Inference.%22%2C%22genre%22%3A%22%22%2C%22repository%22%3A%22bioRxiv%22%2C%22archiveID%22%3A%22%22%2C%22date%22%3A%222024-06-03%22%2C%22DOI%22%3A%2210.1101%5C%2F2024.06.02.597062%22%2C%22citationKey%22%3A%22%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fwww.biorxiv.org%5C%2Fcontent%5C%2F10.1101%5C%2F2024.06.02.597062v1%22%2C%22language%22%3A%22en%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222025-01-14T14%3A57%3A03Z%22%7D%7D%2C%7B%22key%22%3A%22CIBR8WMX%22%2C%22library%22%3A%7B%22id%22%3A4725570%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Rumberger%20et%20al.%22%2C%22parsedDate%22%3A%222021%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%26lt%3Bdiv%20class%3D%26quot%3Bcsl-bib-body%26quot%3B%20style%3D%26quot%3Bline-height%3A%201.35%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%26quot%3B%26gt%3B%5Cn%20%20%26lt%3Bdiv%20class%3D%26quot%3Bcsl-entry%26quot%3B%26gt%3BRumberger%2C%20J.%20L.%2C%20Yu%2C%20X.%2C%20Hirsch%2C%20P.%2C%20Dohmen%2C%20M.%2C%20Guarino%2C%20V.%20E.%2C%20Mokarian%2C%20A.%2C%20Mais%2C%20L.%2C%20Funke%2C%20J.%2C%20%26amp%3B%20Kainmueller%2C%20D.%20%282021%29.%20%26lt%3Bb%26gt%3BHow%20Shift%20Equivariance%20Impacts%20Metric%20Learning%20for%20Instance%20Segmentation%26lt%3B%5C%2Fb%26gt%3B.%20%26lt%3Bi%26gt%3B2021%20IEEE%5C%2FCVF%20International%20Conference%20on%20Computer%20Vision%20%28ICCV%29%26lt%3B%5C%2Fi%26gt%3B%2C%207108%26%23x2013%3B7116.%20%26lt%3Ba%20class%3D%26%23039%3Bzp-DOIURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1109%5C%2FICCV48922.2021.00704%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1109%5C%2FICCV48922.2021.00704%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fdiv%26gt%3B%5Cn%26lt%3B%5C%2Fdiv%26gt%3B%22%2C%22data%22%3A%7B%22itemType%22%3A%22conferencePaper%22%2C%22title%22%3A%22How%20Shift%20Equivariance%20Impacts%20Metric%20Learning%20for%20Instance%20Segmentation%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Josef%20Lorenz%22%2C%22lastName%22%3A%22Rumberger%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Xiaoyan%22%2C%22lastName%22%3A%22Yu%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Peter%22%2C%22lastName%22%3A%22Hirsch%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Melanie%22%2C%22lastName%22%3A%22Dohmen%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Vanessa%20Emanuela%22%2C%22lastName%22%3A%22Guarino%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ashkan%22%2C%22lastName%22%3A%22Mokarian%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Lisa%22%2C%22lastName%22%3A%22Mais%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jan%22%2C%22lastName%22%3A%22Funke%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Dagmar%22%2C%22lastName%22%3A%22Kainmueller%22%7D%5D%2C%22abstractNote%22%3A%22Metric%20learning%20has%20received%20conflicting%20assessments%20concerning%20its%20suitability%20for%20solving%20instance%20segmentation%20tasks.%20It%20has%20been%20dismissed%20as%20theoretically%20flawed%20due%20to%20the%20shift%20equivariance%20of%20the%20employed%20CNNs%20and%20their%20respective%20inability%20to%20distinguish%20same-looking%20objects.%20Yet%20it%20has%20been%20shown%20to%20yield%20state%20of%20the%20art%20results%20for%20a%20variety%20of%20tasks%2C%20and%20practical%20issues%20have%20mainly%20been%20reported%20in%20the%20context%20of%20tile-and-stitch%20approaches%2C%20where%20discontinuities%20at%20tile%20boundaries%20have%20been%20observed.%20To%20date%2C%20neither%20of%20the%20reported%20issues%20have%20undergone%20thorough%20formal%20analysis.%20In%20our%20work%2C%20we%20contribute%20a%20comprehensive%20formal%20analysis%20of%20the%20shift%20equivariance%20properties%20of%20encoder-decoder-style%20CNNs%2C%20which%20yields%20a%20clear%20picture%20of%20what%20can%20and%20cannot%20be%20achieved%20with%20metric%20learning%20in%20the%20face%20of%20same-looking%20objects.%20In%20particular%2C%20we%20prove%20that%20a%20standard%20encoder-decoder%20network%20that%20takes%20d-dimensional%20images%20as%20input%2C%20with%20l%20pooling%20layers%20and%20pooling%20factor%20f%20%2C%20has%20the%20capacity%20to%20distinguish%20at%20most%20f%20dl%20same-looking%20objects%2C%20and%20we%20show%20that%20this%20upper%20limit%20can%20be%20reached.%20Furthermore%2C%20we%20show%20that%20to%20avoid%20discontinuities%20in%20a%20tile-and-stitch%20approach%2C%20assuming%20standard%20batch%20size%201%2C%20it%20is%20necessary%20to%20employ%20valid%20convolutions%20in%20combination%20with%20a%20training%20output%20window%20size%20strictly%20greater%20than%20f%20l%2C%20while%20at%20test-time%20it%20is%20necessary%20to%20crop%20tiles%20to%20size%20n%20%5Cu00b7%20f%20l%20before%20stitching%2C%20with%20n%20%5Cu2265%201.%20We%20complement%20these%20theoretical%20findings%20by%20discussing%20a%20number%20of%20insightful%20special%20cases%20for%20which%20we%20show%20empirical%20results%20on%20synthetic%20and%20real%20data.%22%2C%22date%22%3A%2210%5C%2F2021%22%2C%22proceedingsTitle%22%3A%222021%20IEEE%5C%2FCVF%20International%20Conference%20on%20Computer%20Vision%20%28ICCV%29%22%2C%22conferenceName%22%3A%222021%20IEEE%5C%2FCVF%20International%20Conference%20on%20Computer%20Vision%20%28ICCV%29%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.1109%5C%2FICCV48922.2021.00704%22%2C%22ISBN%22%3A%22978-1-66542-812-5%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fieeexplore.ieee.org%5C%2Fdocument%5C%2F9709951%5C%2F%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222025-05-02T08%3A49%3A11Z%22%7D%7D%2C%7B%22key%22%3A%22LLJ4KLIZ%22%2C%22library%22%3A%7B%22id%22%3A4725570%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Scalia%20et%20al.%22%2C%22parsedDate%22%3A%222024-09-14%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%26lt%3Bdiv%20class%3D%26quot%3Bcsl-bib-body%26quot%3B%20style%3D%26quot%3Bline-height%3A%201.35%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%26quot%3B%26gt%3B%5Cn%20%20%26lt%3Bdiv%20class%3D%26quot%3Bcsl-entry%26quot%3B%26gt%3BScalia%2C%20G.%2C%20Rutherford%2C%20S.%20T.%2C%20Lu%2C%20Z.%2C%20Buchholz%2C%20K.%20R.%2C%20Skelton%2C%20N.%2C%20Chuang%2C%20K.%2C%20Diamant%2C%20N.%2C%20H%26%23xFC%3Btter%2C%20J.-C.%2C%20Luescher%2C%20J.-M.%2C%20Miu%2C%20A.%2C%20Blaney%2C%20J.%2C%20Gendelev%2C%20L.%2C%20Skippington%2C%20E.%2C%20Zynda%2C%20G.%2C%20Dickson%2C%20N.%2C%20Koziarski%2C%20M.%2C%20Bengio%2C%20Y.%2C%20Regev%2C%20A.%2C%20Tan%2C%20M.-W.%2C%20%26amp%3B%20Biancalani%2C%20T.%20%282024%29.%20%26lt%3Bi%26gt%3B%26lt%3Bb%26gt%3B%26lt%3Bspan%20style%3D%26quot%3Bfont-style%3Anormal%3B%26quot%3B%26gt%3BA%20high-throughput%20phenotypic%20screen%20combined%20with%20an%20ultra-large-scale%20deep%20learning-based%20virtual%20screening%20reveals%20novel%20scaffolds%20of%20antibacterial%20compounds%26lt%3B%5C%2Fspan%26gt%3B%26lt%3B%5C%2Fb%26gt%3B%26lt%3B%5C%2Fi%26gt%3B.%20bioRxiv.%20%26lt%3Ba%20class%3D%26%23039%3Bzp-DOIURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1101%5C%2F2024.09.11.612340%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1101%5C%2F2024.09.11.612340%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fdiv%26gt%3B%5Cn%26lt%3B%5C%2Fdiv%26gt%3B%22%2C%22data%22%3A%7B%22itemType%22%3A%22preprint%22%2C%22title%22%3A%22A%20high-throughput%20phenotypic%20screen%20combined%20with%20an%20ultra-large-scale%20deep%20learning-based%20virtual%20screening%20reveals%20novel%20scaffolds%20of%20antibacterial%20compounds%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Gabriele%22%2C%22lastName%22%3A%22Scalia%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Steven%20T.%22%2C%22lastName%22%3A%22Rutherford%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ziqing%22%2C%22lastName%22%3A%22Lu%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Kerry%20R.%22%2C%22lastName%22%3A%22Buchholz%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Nicholas%22%2C%22lastName%22%3A%22Skelton%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Kangway%22%2C%22lastName%22%3A%22Chuang%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Nathaniel%22%2C%22lastName%22%3A%22Diamant%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jan-Christian%22%2C%22lastName%22%3A%22H%5Cu00fctter%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jerome-Maxim%22%2C%22lastName%22%3A%22Luescher%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Anh%22%2C%22lastName%22%3A%22Miu%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jeff%22%2C%22lastName%22%3A%22Blaney%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Leo%22%2C%22lastName%22%3A%22Gendelev%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Elizabeth%22%2C%22lastName%22%3A%22Skippington%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Greg%22%2C%22lastName%22%3A%22Zynda%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Nia%22%2C%22lastName%22%3A%22Dickson%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Micha%5Cu0142%22%2C%22lastName%22%3A%22Koziarski%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Yoshua%22%2C%22lastName%22%3A%22Bengio%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Aviv%22%2C%22lastName%22%3A%22Regev%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Man-Wah%22%2C%22lastName%22%3A%22Tan%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Tommaso%22%2C%22lastName%22%3A%22Biancalani%22%7D%5D%2C%22abstractNote%22%3A%22The%20proliferation%20of%20multi-drug-resistant%20bacteria%20underscores%20an%20urgent%20need%20for%20novel%20antibiotics.%20Traditional%20discovery%20methods%20face%20challenges%20due%20to%20limited%20chemical%20diversity%2C%20high%20costs%2C%20and%20difficulties%20in%20identifying%20structurally%20novel%20compounds.%20Here%2C%20we%20explore%20the%20integration%20of%20small%20molecule%20high-throughput%20screening%20with%20a%20deep%20learning-based%20virtual%20screening%20approach%20to%20uncover%20new%20antibacterial%20compounds.%20Leveraging%20a%20diverse%20library%20of%20nearly%202%20million%20small%20molecules%2C%20we%20conducted%20comprehensive%20phenotypic%20screening%20against%20a%20sensitized%20Escherichia%20coli%20strain%20that%2C%20at%20a%20low%20hit%20rate%2C%20yielded%20thousands%20of%20hits.%20We%20trained%20a%20deep%20learning%20model%2C%20GNEprop%2C%20to%20predict%20antibacterial%20activity%2C%20ensuring%20robustness%20through%20out-of-distribution%20generalization%20techniques.%20Virtual%20screening%20of%20over%201.4%20billion%20compounds%20identified%20potential%20candidates%2C%20of%20which%2082%20exhibited%20antibacterial%20activity%2C%20illustrating%20a%2090X%20improved%20hit%20rate%20over%20the%20high-throughput%20screening%20experiment%20GNEprop%20was%20trained%20on.%20Importantly%2C%20a%20significant%20portion%20of%20these%20newly%20identified%20compounds%20exhibited%20high%20dissimilarity%20to%20known%20antibiotics%2C%20indicating%20promising%20avenues%20for%20further%20exploration%20in%20antibiotic%20discovery.%22%2C%22genre%22%3A%22%22%2C%22repository%22%3A%22bioRxiv%22%2C%22archiveID%22%3A%22%22%2C%22date%22%3A%222024-09-14%22%2C%22DOI%22%3A%2210.1101%5C%2F2024.09.11.612340%22%2C%22citationKey%22%3A%22%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fwww.biorxiv.org%5C%2Fcontent%5C%2F10.1101%5C%2F2024.09.11.612340v1%22%2C%22language%22%3A%22en%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222025-02-03T08%3A57%3A01Z%22%7D%7D%2C%7B%22key%22%3A%22TL65XVJW%22%2C%22library%22%3A%7B%22id%22%3A4725570%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Scheffer%20et%20al.%22%2C%22parsedDate%22%3A%222020-09-03%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%26lt%3Bdiv%20class%3D%26quot%3Bcsl-bib-body%26quot%3B%20style%3D%26quot%3Bline-height%3A%201.35%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%26quot%3B%26gt%3B%5Cn%20%20%26lt%3Bdiv%20class%3D%26quot%3Bcsl-entry%26quot%3B%26gt%3BScheffer%2C%20L.%20K.%2C%20Xu%2C%20C.%20S.%2C%20Januszewski%2C%20M.%2C%20Lu%2C%20Z.%2C%20Takemura%2C%20S.%2C%20Hayworth%2C%20K.%20J.%2C%20Huang%2C%20G.%20B.%2C%20Shinomiya%2C%20K.%2C%20Maitlin-Shepard%2C%20J.%2C%20Berg%2C%20S.%2C%20Clements%2C%20J.%2C%20Hubbard%2C%20P.%20M.%2C%20Katz%2C%20W.%20T.%2C%20Umayam%2C%20L.%2C%20Zhao%2C%20T.%2C%20Ackerman%2C%20D.%2C%20Blakely%2C%20T.%2C%20Bogovic%2C%20J.%2C%20Dolafi%2C%20T.%2C%20%26%23x2026%3B%20Plaza%2C%20S.%20M.%20%282020%29.%20%26lt%3Bb%26gt%3BA%20connectome%20and%20analysis%20of%20the%20adult%20Drosophila%20central%20brain%26lt%3B%5C%2Fb%26gt%3B.%20%26lt%3Bi%26gt%3BeLife%26lt%3B%5C%2Fi%26gt%3B%2C%20%26lt%3Bi%26gt%3B9%26lt%3B%5C%2Fi%26gt%3B%2C%20e57443.%20%26lt%3Ba%20class%3D%26%23039%3Bzp-ItemURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.7554%5C%2FeLife.57443%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.7554%5C%2FeLife.57443%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fdiv%26gt%3B%5Cn%26lt%3B%5C%2Fdiv%26gt%3B%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22A%20connectome%20and%20analysis%20of%20the%20adult%20Drosophila%20central%20brain%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Louis%20K%22%2C%22lastName%22%3A%22Scheffer%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22C%20Shan%22%2C%22lastName%22%3A%22Xu%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Michal%22%2C%22lastName%22%3A%22Januszewski%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Zhiyuan%22%2C%22lastName%22%3A%22Lu%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Shin-ya%22%2C%22lastName%22%3A%22Takemura%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Kenneth%20J%22%2C%22lastName%22%3A%22Hayworth%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Gary%20B%22%2C%22lastName%22%3A%22Huang%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Kazunori%22%2C%22lastName%22%3A%22Shinomiya%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jeremy%22%2C%22lastName%22%3A%22Maitlin-Shepard%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Stuart%22%2C%22lastName%22%3A%22Berg%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jody%22%2C%22lastName%22%3A%22Clements%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Philip%20M%22%2C%22lastName%22%3A%22Hubbard%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22William%20T%22%2C%22lastName%22%3A%22Katz%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Lowell%22%2C%22lastName%22%3A%22Umayam%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ting%22%2C%22lastName%22%3A%22Zhao%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22David%22%2C%22lastName%22%3A%22Ackerman%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Tim%22%2C%22lastName%22%3A%22Blakely%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22John%22%2C%22lastName%22%3A%22Bogovic%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Tom%22%2C%22lastName%22%3A%22Dolafi%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Dagmar%22%2C%22lastName%22%3A%22Kainmueller%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Takashi%22%2C%22lastName%22%3A%22Kawase%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Khaled%20A%22%2C%22lastName%22%3A%22Khairy%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Laramie%22%2C%22lastName%22%3A%22Leavitt%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Peter%20H%22%2C%22lastName%22%3A%22Li%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Larry%22%2C%22lastName%22%3A%22Lindsey%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Nicole%22%2C%22lastName%22%3A%22Neubarth%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Donald%20J%22%2C%22lastName%22%3A%22Olbris%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Hideo%22%2C%22lastName%22%3A%22Otsuna%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Eric%20T%22%2C%22lastName%22%3A%22Trautman%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Masayoshi%22%2C%22lastName%22%3A%22Ito%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Alexander%20S%22%2C%22lastName%22%3A%22Bates%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jens%22%2C%22lastName%22%3A%22Goldammer%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Tanya%22%2C%22lastName%22%3A%22Wolff%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Robert%22%2C%22lastName%22%3A%22Svirskas%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Philipp%22%2C%22lastName%22%3A%22Schlegel%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Erika%22%2C%22lastName%22%3A%22Neace%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Christopher%20J%22%2C%22lastName%22%3A%22Knecht%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Chelsea%20X%22%2C%22lastName%22%3A%22Alvarado%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Dennis%20A%22%2C%22lastName%22%3A%22Bailey%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Samantha%22%2C%22lastName%22%3A%22Ballinger%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jolanta%20A%22%2C%22lastName%22%3A%22Borycz%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Brandon%20S%22%2C%22lastName%22%3A%22Canino%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Natasha%22%2C%22lastName%22%3A%22Cheatham%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Michael%22%2C%22lastName%22%3A%22Cook%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Marisa%22%2C%22lastName%22%3A%22Dreher%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Octave%22%2C%22lastName%22%3A%22Duclos%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Bryon%22%2C%22lastName%22%3A%22Eubanks%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Kelli%22%2C%22lastName%22%3A%22Fairbanks%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Samantha%22%2C%22lastName%22%3A%22Finley%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Nora%22%2C%22lastName%22%3A%22Forknall%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Audrey%22%2C%22lastName%22%3A%22Francis%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Gary%20Patrick%22%2C%22lastName%22%3A%22Hopkins%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Emily%20M%22%2C%22lastName%22%3A%22Joyce%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22SungJin%22%2C%22lastName%22%3A%22Kim%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Nicole%20A%22%2C%22lastName%22%3A%22Kirk%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Julie%22%2C%22lastName%22%3A%22Kovalyak%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Shirley%20A%22%2C%22lastName%22%3A%22Lauchie%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Alanna%22%2C%22lastName%22%3A%22Lohff%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Charli%22%2C%22lastName%22%3A%22Maldonado%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Emily%20A%22%2C%22lastName%22%3A%22Manley%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Sari%22%2C%22lastName%22%3A%22McLin%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Caroline%22%2C%22lastName%22%3A%22Mooney%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Miatta%22%2C%22lastName%22%3A%22Ndama%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Omotara%22%2C%22lastName%22%3A%22Ogundeyi%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Nneoma%22%2C%22lastName%22%3A%22Okeoma%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Christopher%22%2C%22lastName%22%3A%22Ordish%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Nicholas%22%2C%22lastName%22%3A%22Padilla%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Christopher%20M%22%2C%22lastName%22%3A%22Patrick%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Tyler%22%2C%22lastName%22%3A%22Paterson%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Elliott%20E%22%2C%22lastName%22%3A%22Phillips%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Emily%20M%22%2C%22lastName%22%3A%22Phillips%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Neha%22%2C%22lastName%22%3A%22Rampally%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Caitlin%22%2C%22lastName%22%3A%22Ribeiro%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Madelaine%20K%22%2C%22lastName%22%3A%22Robertson%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jon%20Thomson%22%2C%22lastName%22%3A%22Rymer%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Sean%20M%22%2C%22lastName%22%3A%22Ryan%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Megan%22%2C%22lastName%22%3A%22Sammons%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Anne%20K%22%2C%22lastName%22%3A%22Scott%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ashley%20L%22%2C%22lastName%22%3A%22Scott%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Aya%22%2C%22lastName%22%3A%22Shinomiya%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Claire%22%2C%22lastName%22%3A%22Smith%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Kelsey%22%2C%22lastName%22%3A%22Smith%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Natalie%20L%22%2C%22lastName%22%3A%22Smith%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Margaret%20A%22%2C%22lastName%22%3A%22Sobeski%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Alia%22%2C%22lastName%22%3A%22Suleiman%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jackie%22%2C%22lastName%22%3A%22Swift%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Satoko%22%2C%22lastName%22%3A%22Takemura%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Iris%22%2C%22lastName%22%3A%22Talebi%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Dorota%22%2C%22lastName%22%3A%22Tarnogorska%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Emily%22%2C%22lastName%22%3A%22Tenshaw%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Temour%22%2C%22lastName%22%3A%22Tokhi%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22John%20J%22%2C%22lastName%22%3A%22Walsh%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Tansy%22%2C%22lastName%22%3A%22Yang%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jane%20Anne%22%2C%22lastName%22%3A%22Horne%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Feng%22%2C%22lastName%22%3A%22Li%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ruchi%22%2C%22lastName%22%3A%22Parekh%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Patricia%20K%22%2C%22lastName%22%3A%22Rivlin%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Vivek%22%2C%22lastName%22%3A%22Jayaraman%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Marta%22%2C%22lastName%22%3A%22Costa%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Gregory%20SXE%22%2C%22lastName%22%3A%22Jefferis%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Kei%22%2C%22lastName%22%3A%22Ito%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Stephan%22%2C%22lastName%22%3A%22Saalfeld%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Reed%22%2C%22lastName%22%3A%22George%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ian%20A%22%2C%22lastName%22%3A%22Meinertzhagen%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Gerald%20M%22%2C%22lastName%22%3A%22Rubin%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Harald%20F%22%2C%22lastName%22%3A%22Hess%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Viren%22%2C%22lastName%22%3A%22Jain%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Stephen%20M%22%2C%22lastName%22%3A%22Plaza%22%7D%2C%7B%22creatorType%22%3A%22editor%22%2C%22firstName%22%3A%22Eve%22%2C%22lastName%22%3A%22Marder%22%7D%2C%7B%22creatorType%22%3A%22editor%22%2C%22firstName%22%3A%22Michael%20B%22%2C%22lastName%22%3A%22Eisen%22%7D%2C%7B%22creatorType%22%3A%22editor%22%2C%22firstName%22%3A%22Jason%22%2C%22lastName%22%3A%22Pipkin%22%7D%2C%7B%22creatorType%22%3A%22editor%22%2C%22firstName%22%3A%22Chris%20Q%22%2C%22lastName%22%3A%22Doe%22%7D%5D%2C%22abstractNote%22%3A%22The%20neural%20circuits%20responsible%20for%20animal%20behavior%20remain%20largely%20unknown.%20We%20summarize%20new%20methods%20and%20present%20the%20circuitry%20of%20a%20large%20fraction%20of%20the%20brain%20of%20the%20fruit%20fly%20Drosophila%20melanogaster.%20Improved%20methods%20include%20new%20procedures%20to%20prepare%2C%20image%2C%20align%2C%20segment%2C%20find%20synapses%20in%2C%20and%20proofread%20such%20large%20data%20sets.%20We%20define%20cell%20types%2C%20refine%20computational%20compartments%2C%20and%20provide%20an%20exhaustive%20atlas%20of%20cell%20examples%20and%20types%2C%20many%20of%20them%20novel.%20We%20provide%20detailed%20circuits%20consisting%20of%20neurons%20and%20their%20chemical%20synapses%20for%20most%20of%20the%20central%20brain.%20We%20make%20the%20data%20public%20and%20simplify%20access%2C%20reducing%20the%20effort%20needed%20to%20answer%20circuit%20questions%2C%20and%20provide%20procedures%20linking%20the%20neurons%20defined%20by%20our%20analysis%20with%20genetic%20reagents.%20Biologically%2C%20we%20examine%20distributions%20of%20connection%20strengths%2C%20neural%20motifs%20on%20different%20scales%2C%20electrical%20consequences%20of%20compartmentalization%2C%20and%20evidence%20that%20maximizing%20packing%20density%20is%20an%20important%20criterion%20in%20the%20evolution%20of%20the%20fly%5Cu2019s%20brain.%22%2C%22date%22%3A%222020-09-03%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%2210.7554%5C%2FeLife.57443%22%2C%22ISSN%22%3A%222050-084X%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.7554%5C%2FeLife.57443%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222023-03-02T14%3A16%3A15Z%22%7D%7D%2C%7B%22key%22%3A%229HI2I7FP%22%2C%22library%22%3A%7B%22id%22%3A4725570%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Schintke%20et%20al.%22%2C%22parsedDate%22%3A%222024-08-01%22%2C%22numChildren%22%3A2%7D%2C%22bib%22%3A%22%26lt%3Bdiv%20class%3D%26quot%3Bcsl-bib-body%26quot%3B%20style%3D%26quot%3Bline-height%3A%201.35%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%26quot%3B%26gt%3B%5Cn%20%20%26lt%3Bdiv%20class%3D%26quot%3Bcsl-entry%26quot%3B%26gt%3BSchintke%2C%20F.%2C%20Belhajjame%2C%20K.%2C%20De%20Mecquenem%2C%20N.%2C%20Frantz%2C%20D.%2C%20Guarino%2C%20V.%20E.%2C%20Hilbrich%2C%20M.%2C%20Lehmann%2C%20F.%2C%20Missier%2C%20P.%2C%20Sattler%2C%20R.%2C%20Sparka%2C%20J.%20A.%2C%20Speckhard%2C%20D.%20T.%2C%20Stolte%2C%20H.%2C%20Vu%2C%20A.%20D.%2C%20%26amp%3B%20Leser%2C%20U.%20%282024%29.%20%26lt%3Bb%26gt%3BValidity%20constraints%20for%20data%20analysis%20workflows%26lt%3B%5C%2Fb%26gt%3B.%20%26lt%3Bi%26gt%3BFuture%20Generation%20Computer%20Systems%26lt%3B%5C%2Fi%26gt%3B%2C%20%26lt%3Bi%26gt%3B157%26lt%3B%5C%2Fi%26gt%3B%2C%2082%26%23x2013%3B97.%20%26lt%3Ba%20class%3D%26%23039%3Bzp-DOIURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1016%5C%2Fj.future.2024.03.037%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1016%5C%2Fj.future.2024.03.037%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fdiv%26gt%3B%5Cn%26lt%3B%5C%2Fdiv%26gt%3B%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Validity%20constraints%20for%20data%20analysis%20workflows%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Florian%22%2C%22lastName%22%3A%22Schintke%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Khalid%22%2C%22lastName%22%3A%22Belhajjame%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ninon%22%2C%22lastName%22%3A%22De%20Mecquenem%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22David%22%2C%22lastName%22%3A%22Frantz%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Vanessa%20Emanuela%22%2C%22lastName%22%3A%22Guarino%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Marcus%22%2C%22lastName%22%3A%22Hilbrich%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Fabian%22%2C%22lastName%22%3A%22Lehmann%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Paolo%22%2C%22lastName%22%3A%22Missier%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Rebecca%22%2C%22lastName%22%3A%22Sattler%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jan%20Arne%22%2C%22lastName%22%3A%22Sparka%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Daniel%20T.%22%2C%22lastName%22%3A%22Speckhard%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Hermann%22%2C%22lastName%22%3A%22Stolte%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Anh%20Duc%22%2C%22lastName%22%3A%22Vu%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ulf%22%2C%22lastName%22%3A%22Leser%22%7D%5D%2C%22abstractNote%22%3A%22Porting%20a%20scientific%20data%20analysis%20workflow%20%28DAW%29%20to%20a%20cluster%20infrastructure%2C%20a%20new%20software%20stack%2C%20or%20even%20only%20a%20new%20dataset%20with%20some%20notably%20different%20properties%20is%20often%20challenging.%20Despite%20the%20structured%20definition%20of%20the%20steps%20%28tasks%29%20and%20their%20interdependencies%20during%20a%20complex%20data%20analysis%20in%20the%20DAW%20specification%2C%20relevant%20assumptions%20may%20remain%20unspecified%20and%20implicit.%20Such%20hidden%20assumptions%20often%20lead%20to%20crashing%20tasks%20without%20a%20reasonable%20error%20message%2C%20poor%20performance%20in%20general%2C%20non-terminating%20executions%2C%20or%20silent%20wrong%20results%20of%20the%20DAW%2C%20to%20name%20only%20a%20few%20possible%20consequences.%20Searching%20for%20the%20causes%20of%20such%20errors%20and%20drawbacks%20in%20a%20distributed%20compute%20cluster%20managed%20by%20a%20complex%20infrastructure%20stack%2C%20where%20DAWs%20for%20large%20datasets%20typically%20are%20executed%2C%20can%20be%20tedious%20and%20time-consuming.%20We%20propose%20validity%20constraints%20%28VCs%29%20as%20a%20new%20concept%20for%20DAW%20languages%20to%20alleviate%20this%20situation.%20A%20VC%20is%20a%20constraint%20specifying%20logical%20conditions%20that%20must%20be%20fulfilled%20at%20certain%20times%20for%20DAW%20executions%20to%20be%20valid.%20When%20defined%20together%20with%20a%20DAW%2C%20VCs%20help%20to%20improve%20the%20portability%2C%20adaptability%2C%20and%20reusability%20of%20DAWs%20by%20making%20implicit%20assumptions%20explicit.%20Once%20specified%2C%20VCs%20can%20be%20controlled%20automatically%20by%20the%20DAW%20infrastructure%2C%20and%20violations%20can%20lead%20to%20meaningful%20error%20messages%20and%20graceful%20behavior%20%28e.g.%2C%20termination%20or%20invocation%20of%20repair%20mechanisms%29.%20We%20provide%20a%20broad%20list%20of%20possible%20VCs%2C%20classify%20them%20along%20multiple%20dimensions%2C%20and%20compare%20them%20to%20similar%20concepts%20one%20can%20find%20in%20related%20fields.%20We%20also%20provide%20a%20proof-of-concept%20implementation%20for%20the%20workflow%20system%20Nextflow.%22%2C%22date%22%3A%222024-08-01%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%2210.1016%5C%2Fj.future.2024.03.037%22%2C%22ISSN%22%3A%220167-739X%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fwww.sciencedirect.com%5C%2Fscience%5C%2Farticle%5C%2Fpii%5C%2FS0167739X24001079%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222025-02-03T08%3A55%3A49Z%22%7D%7D%2C%7B%22key%22%3A%22FJC9TEB8%22%2C%22library%22%3A%7B%22id%22%3A4725570%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Siegel%20et%20al.%22%2C%22parsedDate%22%3A%222024-08-09%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%26lt%3Bdiv%20class%3D%26quot%3Bcsl-bib-body%26quot%3B%20style%3D%26quot%3Bline-height%3A%201.35%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%26quot%3B%26gt%3B%5Cn%20%20%26lt%3Bdiv%20class%3D%26quot%3Bcsl-entry%26quot%3B%26gt%3BSiegel%2C%20N.%20T.%2C%20Kainmueller%2C%20D.%2C%20Deniz%2C%20F.%2C%20Ritter%2C%20K.%2C%20%26amp%3B%20Schulz%2C%20M.-A.%20%282024%29.%20%26lt%3Bi%26gt%3B%26lt%3Bb%26gt%3B%26lt%3Bspan%20style%3D%26quot%3Bfont-style%3Anormal%3B%26quot%3B%26gt%3BDo%20transformers%20and%20CNNs%20learn%20different%20concepts%20of%20brain%20age%3F%26lt%3B%5C%2Fspan%26gt%3B%26lt%3B%5C%2Fb%26gt%3B%26lt%3B%5C%2Fi%26gt%3B%20bioRxiv.%20%26lt%3Ba%20class%3D%26%23039%3Bzp-DOIURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1101%5C%2F2024.08.09.607321%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1101%5C%2F2024.08.09.607321%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fdiv%26gt%3B%5Cn%26lt%3B%5C%2Fdiv%26gt%3B%22%2C%22data%22%3A%7B%22itemType%22%3A%22preprint%22%2C%22title%22%3A%22Do%20transformers%20and%20CNNs%20learn%20different%20concepts%20of%20brain%20age%3F%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Nys%20Tjade%22%2C%22lastName%22%3A%22Siegel%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Dagmar%22%2C%22lastName%22%3A%22Kainmueller%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Fatma%22%2C%22lastName%22%3A%22Deniz%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Kerstin%22%2C%22lastName%22%3A%22Ritter%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Marc-Andre%22%2C%22lastName%22%3A%22Schulz%22%7D%5D%2C%22abstractNote%22%3A%22%5Cu201cPredicted%20brain%20age%5Cu201d%20refers%20to%20a%20biomarker%20of%20structural%20brain%20health%20derived%20from%20machine%20learning%20analysis%20of%20T1-weighted%20brain%20magnetic%20resonance%20%28MR%29%20images.%20A%20range%20of%20machine%20learning%20methods%20have%20been%20used%20to%20predict%20brain%20age%2C%20with%20convolutional%20neural%20networks%20%28CNNs%29%20currently%20yielding%20state-of-the-art%20accuracies.%20Recent%20advances%20in%20deep%20learning%20have%20introduced%20transformers%2C%20which%20are%20conceptually%20distinct%20from%20CNNs%2C%20and%20appear%20to%20set%20new%20benchmarks%20in%20various%20domains%20of%20computer%20vision.%20However%2C%20transformers%20have%20not%20yet%20been%20applied%20to%20brain%20age%20prediction.%20Thus%2C%20we%20address%20two%20research%20questions%3A%20First%2C%20are%20transformers%20superior%20to%20CNNs%20in%20predicting%20brain%20age%3F%20Second%2C%20do%20conceptually%20different%20deep%20learning%20model%20architectures%20learn%20similar%20or%20different%20%5Cu201cconcepts%20of%20brain%20age%5Cu201d%3F%20We%20adapted%20a%20Simple%20Vision%20Transformer%20%28sViT%29%20and%20a%20Shifted%20Window%20Transformer%20%28SwinT%29%20to%20predict%20brain%20age%2C%20and%20compared%20both%20models%20with%20a%20ResNet50%20on%2046%2C381%20T1-weighted%20structural%20MR%20images%20from%20the%20UK%20Biobank.%20We%20found%20that%20SwinT%20and%20ResNet%20performed%20on%20par%2C%20while%20additional%20training%20samples%20will%20most%20likely%20give%20SwinT%20the%20edge%20in%20prediction%20accuracy.%20We%20identified%20that%20different%20model%20architectures%20may%20characterize%20different%20%28sub-%29sets%20of%20brain%20aging%20effects%2C%20representing%20diverging%20concepts%20of%20brain%20age.%20Thus%2C%20we%20systematically%20tested%20whether%20sViT%2C%20SwinT%20and%20ResNet%20focus%20on%20different%20concepts%20of%20brain%20age%20by%20examining%20variations%20in%20their%20predictions%20and%20clinical%20utility%20for%20indicating%20deviations%20in%20neurological%20and%20psychiatric%20disorders.%20Reassuringly%2C%20we%20did%20not%20find%20substantial%20differences%20in%20the%20structure%20of%20brain%20age%20predictions%20between%20model%20architectures.%20Based%20on%20our%20results%2C%20the%20choice%20of%20deep%20learning%20model%20architecture%20does%20not%20appear%20to%20have%20a%20confounding%20effect%20on%20brain%20age%20studies.%22%2C%22genre%22%3A%22%22%2C%22repository%22%3A%22bioRxiv%22%2C%22archiveID%22%3A%22%22%2C%22date%22%3A%222024-08-09%22%2C%22DOI%22%3A%2210.1101%5C%2F2024.08.09.607321%22%2C%22citationKey%22%3A%22%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fwww.biorxiv.org%5C%2Fcontent%5C%2F10.1101%5C%2F2024.08.09.607321v1%22%2C%22language%22%3A%22en%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222025-01-14T14%3A55%3A31Z%22%7D%7D%2C%7B%22key%22%3A%22CTPYBKQN%22%2C%22library%22%3A%7B%22id%22%3A4725570%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Takemura%20et%20al.%22%2C%22parsedDate%22%3A%222023-06-06%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%26lt%3Bdiv%20class%3D%26quot%3Bcsl-bib-body%26quot%3B%20style%3D%26quot%3Bline-height%3A%201.35%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%26quot%3B%26gt%3B%5Cn%20%20%26lt%3Bdiv%20class%3D%26quot%3Bcsl-entry%26quot%3B%26gt%3BTakemura%2C%20S.%2C%20Hayworth%2C%20K.%20J.%2C%20Huang%2C%20G.%20B.%2C%20Januszewski%2C%20M.%2C%20Lu%2C%20Z.%2C%20Marin%2C%20E.%20C.%2C%20Preibisch%2C%20S.%2C%20Xu%2C%20C.%20S.%2C%20Bogovic%2C%20J.%2C%20Champion%2C%20A.%20S.%2C%20Cheong%2C%20H.%20S.%2C%20Costa%2C%20M.%2C%20Eichler%2C%20K.%2C%20Katz%2C%20W.%2C%20Knecht%2C%20C.%2C%20Li%2C%20F.%2C%20Morris%2C%20B.%20J.%2C%20Ordish%2C%20C.%2C%20Rivlin%2C%20P.%20K.%2C%20%26%23x2026%3B%20Berg%2C%20S.%20%282023%29.%20%26lt%3Bi%26gt%3B%26lt%3Bb%26gt%3B%26lt%3Bspan%20style%3D%26quot%3Bfont-style%3Anormal%3B%26quot%3B%26gt%3BA%20Connectome%20of%20the%20Male%20Drosophila%20Ventral%20Nerve%20Cord%26lt%3B%5C%2Fspan%26gt%3B%26lt%3B%5C%2Fb%26gt%3B%26lt%3B%5C%2Fi%26gt%3B.%20bioRxiv.%20%26lt%3Ba%20class%3D%26%23039%3Bzp-DOIURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1101%5C%2F2023.06.05.543757%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1101%5C%2F2023.06.05.543757%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fdiv%26gt%3B%5Cn%26lt%3B%5C%2Fdiv%26gt%3B%22%2C%22data%22%3A%7B%22itemType%22%3A%22preprint%22%2C%22title%22%3A%22A%20Connectome%20of%20the%20Male%20Drosophila%20Ventral%20Nerve%20Cord%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Shin-ya%22%2C%22lastName%22%3A%22Takemura%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Kenneth%20J.%22%2C%22lastName%22%3A%22Hayworth%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Gary%20B.%22%2C%22lastName%22%3A%22Huang%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Michal%22%2C%22lastName%22%3A%22Januszewski%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Zhiyuan%22%2C%22lastName%22%3A%22Lu%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Elizabeth%20C.%22%2C%22lastName%22%3A%22Marin%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Stephan%22%2C%22lastName%22%3A%22Preibisch%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22C.%20Shan%22%2C%22lastName%22%3A%22Xu%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22John%22%2C%22lastName%22%3A%22Bogovic%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Andrew%20S.%22%2C%22lastName%22%3A%22Champion%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Han%20SJ%22%2C%22lastName%22%3A%22Cheong%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Marta%22%2C%22lastName%22%3A%22Costa%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Katharina%22%2C%22lastName%22%3A%22Eichler%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22William%22%2C%22lastName%22%3A%22Katz%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Christopher%22%2C%22lastName%22%3A%22Knecht%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Feng%22%2C%22lastName%22%3A%22Li%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Billy%20J.%22%2C%22lastName%22%3A%22Morris%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Christopher%22%2C%22lastName%22%3A%22Ordish%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Patricia%20K.%22%2C%22lastName%22%3A%22Rivlin%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Philipp%22%2C%22lastName%22%3A%22Schlegel%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Kazunori%22%2C%22lastName%22%3A%22Shinomiya%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Tomke%22%2C%22lastName%22%3A%22St%5Cu00fcrner%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ting%22%2C%22lastName%22%3A%22Zhao%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Griffin%22%2C%22lastName%22%3A%22Badalamente%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Dennis%22%2C%22lastName%22%3A%22Bailey%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Paul%22%2C%22lastName%22%3A%22Brooks%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Brandon%20S.%22%2C%22lastName%22%3A%22Canino%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jody%22%2C%22lastName%22%3A%22Clements%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Michael%22%2C%22lastName%22%3A%22Cook%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Octave%22%2C%22lastName%22%3A%22Duclos%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Christopher%20R.%22%2C%22lastName%22%3A%22Dunne%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Kelli%22%2C%22lastName%22%3A%22Fairbanks%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Siqi%22%2C%22lastName%22%3A%22Fang%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Samantha%22%2C%22lastName%22%3A%22Finley-May%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Audrey%22%2C%22lastName%22%3A%22Francis%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Reed%22%2C%22lastName%22%3A%22George%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Marina%22%2C%22lastName%22%3A%22Gkantia%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Kyle%22%2C%22lastName%22%3A%22Harrington%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Gary%20Patrick%22%2C%22lastName%22%3A%22Hopkins%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Joseph%22%2C%22lastName%22%3A%22Hsu%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Philip%20M.%22%2C%22lastName%22%3A%22Hubbard%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Alexandre%22%2C%22lastName%22%3A%22Javier%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Dagmar%22%2C%22lastName%22%3A%22Kainmueller%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Wyatt%22%2C%22lastName%22%3A%22Korff%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Julie%22%2C%22lastName%22%3A%22Kovalyak%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Dominik%22%2C%22lastName%22%3A%22Krzemi%5Cu0144ski%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Shirley%20A.%22%2C%22lastName%22%3A%22Lauchie%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Alanna%22%2C%22lastName%22%3A%22Lohff%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Charli%22%2C%22lastName%22%3A%22Maldonado%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Emily%20A.%22%2C%22lastName%22%3A%22Manley%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Caroline%22%2C%22lastName%22%3A%22Mooney%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Erika%22%2C%22lastName%22%3A%22Neace%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Matthew%22%2C%22lastName%22%3A%22Nichols%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Omotara%22%2C%22lastName%22%3A%22Ogundeyi%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Nneoma%22%2C%22lastName%22%3A%22Okeoma%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Tyler%22%2C%22lastName%22%3A%22Paterson%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Elliott%22%2C%22lastName%22%3A%22Phillips%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Emily%20M.%22%2C%22lastName%22%3A%22Phillips%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Caitlin%22%2C%22lastName%22%3A%22Ribeiro%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Sean%20M.%22%2C%22lastName%22%3A%22Ryan%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jon%20Thomson%22%2C%22lastName%22%3A%22Rymer%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Anne%20K.%22%2C%22lastName%22%3A%22Scott%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ashley%20L.%22%2C%22lastName%22%3A%22Scott%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22David%22%2C%22lastName%22%3A%22Shepherd%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Aya%22%2C%22lastName%22%3A%22Shinomiya%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Claire%22%2C%22lastName%22%3A%22Smith%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Natalie%22%2C%22lastName%22%3A%22Smith%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Alia%22%2C%22lastName%22%3A%22Suleiman%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Satoko%22%2C%22lastName%22%3A%22Takemura%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Iris%22%2C%22lastName%22%3A%22Talebi%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Imaan%20FM%22%2C%22lastName%22%3A%22Tamimi%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Eric%20T.%22%2C%22lastName%22%3A%22Trautman%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Lowell%22%2C%22lastName%22%3A%22Umayam%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22John%20J.%22%2C%22lastName%22%3A%22Walsh%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Tansy%22%2C%22lastName%22%3A%22Yang%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Gerald%20M.%22%2C%22lastName%22%3A%22Rubin%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Louis%20K.%22%2C%22lastName%22%3A%22Scheffer%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jan%22%2C%22lastName%22%3A%22Funke%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Stephan%22%2C%22lastName%22%3A%22Saalfeld%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Harald%20F.%22%2C%22lastName%22%3A%22Hess%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Stephen%20M.%22%2C%22lastName%22%3A%22Plaza%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Gwyneth%20M.%22%2C%22lastName%22%3A%22Card%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Gregory%20SXE%22%2C%22lastName%22%3A%22Jefferis%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Stuart%22%2C%22lastName%22%3A%22Berg%22%7D%5D%2C%22abstractNote%22%3A%22Animal%20behavior%20is%20principally%20expressed%20through%20neural%20control%20of%20muscles.%20Therefore%20understanding%20how%20the%20brain%20controls%20behavior%20requires%20mapping%20neuronal%20circuits%20all%20the%20way%20to%20motor%20neurons.%20We%20have%20previously%20established%20technology%20to%20collect%20large-volume%20electron%20microscopy%20data%20sets%20of%20neural%20tissue%20and%20fully%20reconstruct%20the%20morphology%20of%20the%20neurons%20and%20their%20chemical%20synaptic%20connections%20throughout%20the%20volume.%20Using%20these%20tools%20we%20generated%20a%20dense%20wiring%20diagram%2C%20or%20connectome%2C%20for%20a%20large%20portion%20of%20the%20Drosophila%20central%20brain.%20However%2C%20in%20most%20animals%2C%20including%20the%20fly%2C%20the%20majority%20of%20motor%20neurons%20are%20located%20outside%20the%20brain%20in%20a%20neural%20center%20closer%20to%20the%20body%2C%20i.e.%20the%20mammalian%20spinal%20cord%20or%20insect%20ventral%20nerve%20cord%20%28VNC%29.%20In%20this%20paper%2C%20we%20extend%20our%20effort%20to%20map%20full%20neural%20circuits%20for%20behavior%20by%20generating%20a%20connectome%20of%20the%20VNC%20of%20a%20male%20fly.%22%2C%22genre%22%3A%22%22%2C%22repository%22%3A%22bioRxiv%22%2C%22archiveID%22%3A%22%22%2C%22date%22%3A%222023-06-06%22%2C%22DOI%22%3A%2210.1101%5C%2F2023.06.05.543757%22%2C%22citationKey%22%3A%22%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fwww.biorxiv.org%5C%2Fcontent%5C%2F10.1101%5C%2F2023.06.05.543757v1%22%2C%22language%22%3A%22en%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222024-02-28T15%3A47%3A15Z%22%7D%7D%2C%7B%22key%22%3A%22W43AAFNR%22%2C%22library%22%3A%7B%22id%22%3A4725570%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Yu%20et%20al.%22%2C%22parsedDate%22%3A%222024-07-03%22%2C%22numChildren%22%3A2%7D%2C%22bib%22%3A%22%26lt%3Bdiv%20class%3D%26quot%3Bcsl-bib-body%26quot%3B%20style%3D%26quot%3Bline-height%3A%201.35%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%26quot%3B%26gt%3B%5Cn%20%20%26lt%3Bdiv%20class%3D%26quot%3Bcsl-entry%26quot%3B%26gt%3BYu%2C%20X.%2C%20Franzen%2C%20J.%2C%20Samek%2C%20W.%2C%20H%26%23xF6%3Bhne%2C%20M.%20M.-C.%2C%20%26amp%3B%20Kainmueller%2C%20D.%20%282024%29.%20%26lt%3Bi%26gt%3B%26lt%3Bb%26gt%3B%26lt%3Bspan%20style%3D%26quot%3Bfont-style%3Anormal%3B%26quot%3B%26gt%3BModel%20Guidance%20via%20Explanations%20Turns%20Image%20Classifiers%20into%20Segmentation%20Models%26lt%3B%5C%2Fspan%26gt%3B%26lt%3B%5C%2Fb%26gt%3B%26lt%3B%5C%2Fi%26gt%3B%20%28arXiv%3A2407.03009%29.%20arXiv.%20%26lt%3Ba%20class%3D%26%23039%3Bzp-DOIURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FarXiv.2407.03009%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FarXiv.2407.03009%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fdiv%26gt%3B%5Cn%26lt%3B%5C%2Fdiv%26gt%3B%22%2C%22data%22%3A%7B%22itemType%22%3A%22preprint%22%2C%22title%22%3A%22Model%20Guidance%20via%20Explanations%20Turns%20Image%20Classifiers%20into%20Segmentation%20Models%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Xiaoyan%22%2C%22lastName%22%3A%22Yu%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jannik%22%2C%22lastName%22%3A%22Franzen%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Wojciech%22%2C%22lastName%22%3A%22Samek%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Marina%20M.-C.%22%2C%22lastName%22%3A%22H%5Cu00f6hne%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Dagmar%22%2C%22lastName%22%3A%22Kainmueller%22%7D%5D%2C%22abstractNote%22%3A%22Heatmaps%20generated%20on%20inputs%20of%20image%20classification%20networks%20via%20explainable%20AI%20methods%20like%20Grad-CAM%20and%20LRP%20have%20been%20observed%20to%20resemble%20segmentations%20of%20input%20images%20in%20many%20cases.%20Consequently%2C%20heatmaps%20have%20also%20been%20leveraged%20for%20achieving%20weakly%20supervised%20segmentation%20with%20image-level%20supervision.%20On%20the%20other%20hand%2C%20losses%20can%20be%20imposed%20on%20differentiable%20heatmaps%2C%20which%20has%20been%20shown%20to%20serve%20for%20%281%29~improving%20heatmaps%20to%20be%20more%20human-interpretable%2C%20%282%29~regularization%20of%20networks%20towards%20better%20generalization%2C%20%283%29~training%20diverse%20ensembles%20of%20networks%2C%20and%20%284%29~for%20explicitly%20ignoring%20confounding%20input%20features.%20Due%20to%20the%20latter%20use%20case%2C%20the%20paradigm%20of%20imposing%20losses%20on%20heatmaps%20is%20often%20referred%20to%20as%20%26quot%3BRight%20for%20the%20right%20reasons%26quot%3B.%20We%20unify%20these%20two%20lines%20of%20research%20by%20investigating%20semi-supervised%20segmentation%20as%20a%20novel%20use%20case%20for%20the%20Right%20for%20the%20Right%20Reasons%20paradigm.%20First%2C%20we%20show%20formal%20parallels%20between%20differentiable%20heatmap%20architectures%20and%20standard%20encoder-decoder%20architectures%20for%20image%20segmentation.%20Second%2C%20we%20show%20that%20such%20differentiable%20heatmap%20architectures%20yield%20competitive%20results%20when%20trained%20with%20standard%20segmentation%20losses.%20Third%2C%20we%20show%20that%20such%20architectures%20allow%20for%20training%20with%20weak%20supervision%20in%20the%20form%20of%20image-level%20labels%20and%20small%20numbers%20of%20pixel-level%20labels%2C%20outperforming%20comparable%20encoder-decoder%20models.%20Code%20is%20available%20at%20%5C%5Curl%7Bhttps%3A%5C%2F%5C%2Fgithub.com%5C%2FKainmueller-Lab%5C%2FTW-autoencoder%7D.%22%2C%22genre%22%3A%22%22%2C%22repository%22%3A%22arXiv%22%2C%22archiveID%22%3A%22arXiv%3A2407.03009%22%2C%22date%22%3A%222024-07-03%22%2C%22DOI%22%3A%2210.48550%5C%2FarXiv.2407.03009%22%2C%22citationKey%22%3A%22%22%2C%22url%22%3A%22http%3A%5C%2F%5C%2Farxiv.org%5C%2Fabs%5C%2F2407.03009%22%2C%22language%22%3A%22%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222025-02-03T08%3A54%3A29Z%22%7D%7D%2C%7B%22key%22%3A%22WYNILUAN%22%2C%22library%22%3A%7B%22id%22%3A4725570%7D%2C%22meta%22%3A%7B%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%26lt%3Bdiv%20class%3D%26quot%3Bcsl-bib-body%26quot%3B%20style%3D%26quot%3Bline-height%3A%201.35%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%26quot%3B%26gt%3B%5Cn%20%20%26lt%3Bdiv%20class%3D%26quot%3Bcsl-entry%26quot%3B%26gt%3B%26lt%3Bb%26gt%3B%26lt%3Ba%20class%3D%26%23039%3Bzp-ItemURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fceur-ws.org%5C%2FVol-3651%5C%2FDARLI-AP-16.pdf%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fceur-ws.org%5C%2FVol-3651%5C%2FDARLI-AP-16.pdf%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fb%26gt%3B.%20%28n.d.%29.%20Retrieved%20January%2014%2C%202025%2C%20from%20%26lt%3Ba%20class%3D%26%23039%3Bzp-ItemURL%26%23039%3B%20href%3D%26%23039%3Bhttps%3A%5C%2F%5C%2Fceur-ws.org%5C%2FVol-3651%5C%2FDARLI-AP-16.pdf%26%23039%3B%26gt%3Bhttps%3A%5C%2F%5C%2Fceur-ws.org%5C%2FVol-3651%5C%2FDARLI-AP-16.pdf%26lt%3B%5C%2Fa%26gt%3B%26lt%3B%5C%2Fdiv%26gt%3B%5Cn%26lt%3B%5C%2Fdiv%26gt%3B%22%2C%22data%22%3A%7B%22itemType%22%3A%22webpage%22%2C%22title%22%3A%22https%3A%5C%2F%5C%2Fceur-ws.org%5C%2FVol-3651%5C%2FDARLI-AP-16.pdf%22%2C%22creators%22%3A%5B%5D%2C%22abstractNote%22%3A%22%22%2C%22date%22%3A%22%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fceur-ws.org%5C%2FVol-3651%5C%2FDARLI-AP-16.pdf%22%2C%22language%22%3A%22%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222025-05-02T08%3A49%3A58Z%22%7D%7D%5D%7D
Baumann, E., Dislich, B., Rumberger, J. L., Nagtegaal, I. D., Martinez, M. R., & Zlobec, I. (2024, February 13). HoVer-NeXt: A Fast Nuclei Segmentation and Classification Pipeline for Next Generation Histopathology. Medical Imaging with Deep Learning. https://openreview.net/forum?id=3vmB43oqIO
Cersovsky, J., Mohammadi, S., Kainmueller, D., & Hoehne, J. (2023). Towards Hierarchical Regional Transformer-based Multiple Instance Learning (arXiv:2308.12634). arXiv. https://doi.org/10.48550/arXiv.2308.12634
Cole, J. H. (2020). Multimodality neuroimaging brain-age in UK biobank: relationship to biomedical, lifestyle, and cognitive factors. Neurobiology of Aging, 92, 34–42. https://doi.org/10.1016/j.neurobiolaging.2020.03.014
Dohmen, M., Mittermaier, M., Rumberger, J. L., Yang, L.-L., Gruber, A. D., Toennies, M., Hippenstiel, S., Kainmueller, D., & Hocke, A. C. (2024). Simultaneous Lung Cell and Nucleus Segmentation From Labelled Versus Unlabelled Human Lung DIC Images. 2024 IEEE International Symposium on Biomedical Imaging (ISBI), 1–5. https://doi.org/10.1109/ISBI56570.2024.10635198
Franzen, J., Winklmayr, C., Guarino, V. E., Karg, C., Yu, X., Koreuber, N., Albrecht, J. P., Bischoff, P., & Kainmueller, D. (2024). Arctique: An artificial histopathological dataset unifying realism and controllability for uncertainty quantification (arXiv:2411.07097). arXiv. https://doi.org/10.48550/arXiv.2411.07097
Graham, S., Vu, Q. D., Jahanifar, M., Weigert, M., Schmidt, U., Zhang, W., Zhang, J., Yang, S., Xiang, J., Wang, X., Rumberger, J. L., Baumann, E., Hirsch, P., Liu, L., Hong, C., Aviles-Rivero, A. I., Jain, A., Ahn, H., Hong, Y., … Rajpoot, N. M. (2024). CoNIC Challenge: Pushing the frontiers of nuclear detection, segmentation, classification and counting. Medical Image Analysis, 92, 103047. https://doi.org/10.1016/j.media.2023.103047
Gutierrez Becker, B., Fraessle, S., Yao, H., Lüscher, J., Girycki, R., Machura, B., Gośliński, J., Czornik, J., Pitura, M., Arús-Pous, J., Fisher, E., Bojic, D., Richmond, D., Bigorgne, A., & Prunotto, M. (2024). P098 The Endoscopic Severity Score Map (ESSM): An Artificial Intelligence scoring system providing accurate, objective and localised measurements of endoscopic disease severity in ulcerative colitis. Journal of Crohn’s and Colitis, 18(Supplement_1), i377–i378. https://doi.org/10.1093/ecco-jcc/jjad212.0228
Haller, S., Feineis, L., Hutschenreiter, L., Bernard, F., Rother, C., Kainmüller, D., Swoboda, P., & Savchynskyy, B. (2022). A Comparative Study of Graph Matching Algorithms in Computer Vision (arXiv:2207.00291). arXiv. https://doi.org/10.48550/arXiv.2207.00291
Hirsch, P., & Kainmueller, D. (2020). An Auxiliary Task for Learning Nuclei Segmentation in 3D Microscopy Images. Proceedings of the Third Conference on Medical Imaging with Deep Learning, 304–321. https://proceedings.mlr.press/v121/hirsch20a.html
Hirsch, P., Malin-Mayor, C., Santella, A., Preibisch, S., Kainmueller, D., & Funke, J. (2022). Tracking by weakly-supervised learning and graph optimization for whole-embryo C. elegans lineages (arXiv:2208.11467). arXiv. https://doi.org/10.48550/arXiv.2208.11467
Hutschenreiter, L., Haller, S., Feineis, L., Rother, C., Kainmuller, D., & Savchynskyy, B. (2021). Fusion Moves for Graph Matching. 2021 IEEE/CVF International Conference on Computer Vision (ICCV), 6250–6259. https://doi.org/10.1109/ICCV48922.2021.00621
Karg, C., Stricker, S., Hutschenreiter, L., Savchynskyy, B., & Kainmueller, D. (2025). Fully Unsupervised Annotation of C. Elegans (arXiv:2503.07348). arXiv. https://doi.org/10.48550/arXiv.2503.07348
Maier-Hein, L., Reinke, A., Godau, P., Tizabi, M. D., Buettner, F., Christodoulou, E., Glocker, B., Isensee, F., Kleesiek, J., Kozubek, M., Reyes, M., Riegler, M. A., Wiesenfarth, M., Kavur, A. E., Sudre, C. H., Baumgartner, M., Eisenmann, M., Heckmann-Nötzel, D., Rädsch, T., … Jäger, P. F. (2024). Metrics reloaded: recommendations for image analysis validation. Nature Methods, 21(2), 195–212. https://doi.org/10.1038/s41592-023-02151-z
Mais, L., Hirsch, P., & Kainmueller, D. (2020). PatchPerPix for Instance Segmentation. In A. Vedaldi, H. Bischof, T. Brox, & J.-M. Frahm (Eds.), Computer Vision – ECCV 2020 (pp. 288–304). Springer International Publishing. https://doi.org/10.1007/978-3-030-58595-2_18
Mais, L., Hirsch, P., Managan, C., Kandarpa, R., Rumberger, J. L., Reinke, A., Maier-Hein, L., Ihrke, G., & Kainmueller, D. (2024). FISBe: A Real-World Benchmark Dataset for Instance Segmentation of Long-Range thin Filamentous Structures. 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 22249–22259. https://doi.org/10.1109/CVPR52733.2024.02100
Mais, L., Hirsch, P., Managan, C., Wang, K., Rokicki, K., Svirskas, R. R., Dickson, B. J., Korff, W., Rubin, G. M., Ihrke, G., Meissner, G. W., & Kainmueller, D. (2021). PatchPerPixMatch for Automated 3d Search of Neuronal Morphologies in Light Microscopy. bioRxiv. https://doi.org/10.1101/2021.07.23.453511
Malin-Mayor, C., Hirsch, P., Guignard, L., McDole, K., Wan, Y., Lemon, W. C., Kainmueller, D., Keller, P. J., Preibisch, S., & Funke, J. (2023). Automated reconstruction of whole-embryo cell lineages by learning from sparse annotations. Nature Biotechnology, 41(1), 44–49. https://doi.org/10.1038/s41587-022-01427-7
Meissner, G. W., Nern, A., Dorman, Z., DePasquale, G. M., Forster, K., Gibney, T., Hausenfluck, J. H., He, Y., Iyer, N., Jeter, J., Johnson, L., Johnston, R. M., Lee, K., Melton, B., Yarbrough, B., Zugates, C. T., Clements, J., Goina, C., Otsuna, H., … Team, F. P. (2022). A searchable image resource of Drosophila GAL4-driver expression patterns with single neuron resolution. bioRxiv. https://doi.org/10.1101/2020.05.29.080473
Moebel, E., Martinez-Sanchez, A., Lamm, L., Righetto, R. D., Wietrzynski, W., Albert, S., Larivière, D., Fourmentin, E., Pfeffer, S., Ortiz, J., Baumeister, W., Peng, T., Engel, B. D., & Kervrann, C. (2021). Deep learning improves macromolecule identification in 3D cellular cryo-electron tomograms. Nature Methods, 18(11), 1386–1394. https://doi.org/10.1038/s41592-021-01275-4
Reinke, A., Tizabi, M. D., Baumgartner, M., Eisenmann, M., Heckmann-Nötzel, D., Kavur, A. E., Rädsch, T., Sudre, C. H., Acion, L., Antonelli, M., Arbel, T., Bakas, S., Benis, A., Buettner, F., Cardoso, M. J., Cheplygina, V., Chen, J., Christodoulou, E., Cimini, B. A., … Maier-Hein, L. (2024). Understanding metric-related pitfalls in image analysis validation. Nature Methods, 1–13. https://doi.org/10.1038/s41592-023-02150-0
Rumberger, J. L., Baumann, E., Hirsch, P., Janowczyk, A., Zlobec, I., & Kainmueller, D. (2022). Panoptic segmentation with highly imbalanced semantic labels (arXiv:2203.11692). arXiv. https://doi.org/10.48550/arXiv.2203.11692
Rumberger, J. L., Franzen, J., Hirsch, P., Albrecht, J.-P., & Kainmueller, D. (2023). ACTIS: Improving data efficiency by leveraging semi-supervised Augmentation Consistency Training for Instance Segmentation. 2023 IEEE/CVF International Conference on Computer Vision Workshops (ICCVW), 3792–3801. https://doi.org/10.1109/ICCVW60793.2023.00410
Rumberger, J. L., Lim, W., Wildfeuer, B., Sodemann, E. B., Lecler, A., Stemplinger, S., Issever, A. S., Sepahdari, A. R., Langner, S., Kainmueller, D., Hamm, B., & Erb-Eigner, K. (2024). Content-based image retrieval assists radiologists in diagnosing eye and orbital mass lesions in MRI. medRxiv. https://doi.org/10.1101/2024.07.24.24310920
Rumberger, J. L., Greenwald, N. F., Ranek, J. S., Boonrat, P., Walker, C., Franzen, J., Varra, S. R., Kong, A., Sowers, C., Liu, C. C., Averbukh, I., Piyadasa, H., Vanguri, R., Nederlof, I., Wang, X. J., Valen, D. V., Kok, M., Hollmann, T. J., Kainmueller, D., & Angelo, M. (2024). Automated classification of cellular expression in multiplexed imaging data with Nimbus. bioRxiv. https://doi.org/10.1101/2024.06.02.597062
Rumberger, J. L., Yu, X., Hirsch, P., Dohmen, M., Guarino, V. E., Mokarian, A., Mais, L., Funke, J., & Kainmueller, D. (2021). How Shift Equivariance Impacts Metric Learning for Instance Segmentation. 2021 IEEE/CVF International Conference on Computer Vision (ICCV), 7108–7116. https://doi.org/10.1109/ICCV48922.2021.00704
Scalia, G., Rutherford, S. T., Lu, Z., Buchholz, K. R., Skelton, N., Chuang, K., Diamant, N., Hütter, J.-C., Luescher, J.-M., Miu, A., Blaney, J., Gendelev, L., Skippington, E., Zynda, G., Dickson, N., Koziarski, M., Bengio, Y., Regev, A., Tan, M.-W., & Biancalani, T. (2024). A high-throughput phenotypic screen combined with an ultra-large-scale deep learning-based virtual screening reveals novel scaffolds of antibacterial compounds. bioRxiv. https://doi.org/10.1101/2024.09.11.612340
Scheffer, L. K., Xu, C. S., Januszewski, M., Lu, Z., Takemura, S., Hayworth, K. J., Huang, G. B., Shinomiya, K., Maitlin-Shepard, J., Berg, S., Clements, J., Hubbard, P. M., Katz, W. T., Umayam, L., Zhao, T., Ackerman, D., Blakely, T., Bogovic, J., Dolafi, T., … Plaza, S. M. (2020). A connectome and analysis of the adult Drosophila central brain. eLife, 9, e57443. https://doi.org/10.7554/eLife.57443
Schintke, F., Belhajjame, K., De Mecquenem, N., Frantz, D., Guarino, V. E., Hilbrich, M., Lehmann, F., Missier, P., Sattler, R., Sparka, J. A., Speckhard, D. T., Stolte, H., Vu, A. D., & Leser, U. (2024). Validity constraints for data analysis workflows. Future Generation Computer Systems, 157, 82–97. https://doi.org/10.1016/j.future.2024.03.037
Siegel, N. T., Kainmueller, D., Deniz, F., Ritter, K., & Schulz, M.-A. (2024). Do transformers and CNNs learn different concepts of brain age? bioRxiv. https://doi.org/10.1101/2024.08.09.607321
Takemura, S., Hayworth, K. J., Huang, G. B., Januszewski, M., Lu, Z., Marin, E. C., Preibisch, S., Xu, C. S., Bogovic, J., Champion, A. S., Cheong, H. S., Costa, M., Eichler, K., Katz, W., Knecht, C., Li, F., Morris, B. J., Ordish, C., Rivlin, P. K., … Berg, S. (2023). A Connectome of the Male Drosophila Ventral Nerve Cord. bioRxiv. https://doi.org/10.1101/2023.06.05.543757
Yu, X., Franzen, J., Samek, W., Höhne, M. M.-C., & Kainmueller, D. (2024). Model Guidance via Explanations Turns Image Classifiers into Segmentation Models (arXiv:2407.03009). arXiv. https://doi.org/10.48550/arXiv.2407.03009

Other Researches


Projects

Helmholtz Imaging Projects are granted to cross-disciplinary research teams that identify innovative research topics at the intersection of imaging and information & data science, initiate cross-cutting research collaborations, and thus underpin the growth of the Helmholtz Imaging network. These annual calls are based on the general concept for Helmholtz Imaging and are in line with the future topics of the Initiative and Networking Fund (INF).

Research Unit DESY

The Research Unit at DESY focuses on the early stages of the imaging pipeline, developing  methods for advanced image reconstruction, including the optimization of measurements and the combination of classical methods with data-driven approaches.

Our goal is to drag out a maximal amount of (quantitative) information from given or designed measurements. 

Research Unit DKFZ

The Research Unit at DKFZ focuses on the downstream stages of the imaging pipeline, developing robust methods for automated image analysis and emphasizing rigorous validation.

Our goal is to enable trustworthy and generalizable AI across scientific imaging domains. 

Publications

Helmholtz Imaging captures the world of science. Discover unique data sets, ready-to-use software tools, and top-level research papers. The platform’s output originates from our research groups as well as from projects funded by us, theses supervised by us and collaborations initiated through us. Altogether, this showcases the whole diversity of Helmholtz Imaging.