Karol Gotkowski

Machine Learning Engineer

Karol Gotkowski is a machine learning engineer in the Applied Computer Vision Lab (ACVL) at the Deutsches Krebsforschungszentrum (German Cancer Research Center, DKFZ) in Heidelberg.

His work focuses on developing pragmatic, data-driven deep learning solutions together with collaboration partners from across Helmholtz and beyond, expanding the use of AI in a multitude of domains. These collaborations range from diabetes detection in whole-slide images, to mineral particle instance segmentation for automated mineral quantification, to bubble-detachment analysis during electrolysis.

Karol holds an M.Sc. in computer science from the Technische Universität Darmstadt.

Publications

Klein, L., Ziegler, S., Gerst, F., Morgenroth, Y., Gotkowski, K., Schöniger, E., Heni, M., Kipke, N., Friedland, D., Seiler, A., Geibelt, E., Yamazaki, H., Häring, H. U., Wagner, S., Nadalin, S., Königsrainer, A., Mihaljević, A. L., Hartmann, D., Fend, F., … Wagner, R. (2024). Explainable AI-based analysis of human pancreas sections identifies traits of type 2 diabetes. medRxiv. https://doi.org/10.1101/2024.10.23.24315937
Gotkowski, K., Lüth, C., Jäger, P. F., Ziegler, S., Krämer, L., Denner, S., Xiao, S., Disch, N., Maier-Hein, K. H., & Isensee, F. (2024). Embarrassingly simple scribble supervision for 3D medical segmentation (arXiv:2403.12834). arXiv. https://doi.org/10.48550/arXiv.2403.12834
Gupta, S., da Assuncao Godinho, J. R., Gotkowski, K., & Isensee, F. (2024). Standardized and semiautomated workflow for 3D characterization of liberated particles. Powder Technology, 433, 119159. https://doi.org/10.1016/j.powtec.2023.119159
Gotkowski, K., Gupta, S., Godinho, J. R. A., Tochtrop, C. G. S., Maier-Hein, K. H., & Isensee, F. (2024). ParticleSeg3D: A scalable out-of-the-box deep learning segmentation solution for individual particle characterization from micro CT images in mineral processing and recycling. Powder Technology, 434, 119286. https://doi.org/10.1016/j.powtec.2023.119286
Gotkowski, K., Gonzalez, C., Kaltenborn, I. J., Fischbach, R., Bucher, A., & Mukhopadhyay, A. (2022, June 22). i3Deep: Efficient 3D interactive segmentation with the nnU-Net. Medical Imaging with Deep Learning. https://openreview.net/forum?id=R420Pr5vUj3