National Alliance for Medical Image Computing: NA-MIC Ron Kikinis, M.D. http://na-mic.org
Overview • Introduction • Core 1 • Core 2 • Core 3 • Support Cores • Slicer Demo
NIH Roadmap • National Centers for Biomedical Computing (NCBC) will develop and implement the core of a universal computing infrastructure... The centers will create innovative software programs and other tools that enable the biomedical community to integrate, analyze, model, simulate, and share data on human health and disease. • 7 National Centers for Biomedical Computing • Funded for 5 years with the option for a second cycle
NIH NCBCs • Center for Computational Biology (CCB) • Informatics for Integrating Biology and the Bedside (i2b2) • Multiscale Analysis of Genomic and Cellular Networks (MAGNet) • National Alliance for Medical Image Computing (NA-MIC) • The National Center for Biomedical Ontology (NCBO) • Physics-Based Simulation of Biological Structures (SIMBIOS) • National Center for Integrative Biomedical Informatics (NCIBI)
Introduction • What is our problem? • What is our science?
What is our problem? • Diagnostic Imaging produces data in increasing quantity and of increasing complexity • Image Computing is about extracting relevant information from those images
What is our science? • Computational tools for image analysis (algorithms) • Software engineering methods and applications for image analysis (tools)
Overview • Introduction • Core 1 • Core 2 • Core 3 • Support Cores • Slicer Demo
Core 1 • Harvard: 1. Diffusion-based Registration 2. Group Effect Maps 3. Automatic Segmentation • MIT: 1. Shape and Atlas-Based Segmentation 2. Statistical Shape Analysis 3. DTI Connectivity Analysis • Utah: 1. DTI Processing 2. Surface Processing 3. PDE Implementations • UNC: 1. Quantitative DTI Analysis 2. Cross-Sectional Shape Analysis • Georgia Tech: 1. Combined Statistical/PDE Methods 2. Stochastic Flow Models • Themes: Segmentation, Registration, Foundational Methods, Structural Features and Statistics, Connective Features and Statistics
Overview • Introduction • Core 1 • Core 2 • Core 3 • Support Cores • Slicer Demo
Core 2 • GE: 1. Software Architecture 2. Software Process 3. Software Quality • Kitware: 1. Cross-platform Build 2. Cross-platform Distribution 3. Cross-language APIs • UCLA: 1. Graphical programming interfaces 2. Coordinate pre-compiled tools 3. Data format interpreters • UCSD: 1. Grid Middleware 2. Data Grid 3. Data Mediation • Isomics: 1. DBP Applications 2. Application Methodology 3. Application Quality Assurance • Areas: Software Engineering, Software Engineering Tools, Software Quality, Software Integration, Data Access Tools, Distributed Computing, Applications
NA-MIC Kit • Application • 3D Slicer • Toolkits • ITK, VTK, KWWidgets, LONI Pipeline • Software Engineering Tools • CMake, CTest, Dart2 • Doxygen, CableSwig, Valgrind, StyleCheck, SourceNavigator, Ctags, Bug Tracking, CVS, Subversion, Dart, Version Control, Python, Tcl/Tk, Java, C/C++, OpenGL
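To illustrate how the kit's toolkits are typically used (a minimal sketch, not code taken from the NA-MIC Kit itself), the following C++ program shows the standard ITK pipeline pattern: an itk::ImageFileReader feeds a filter, and an itk::ImageFileWriter pulls the data through on Update(). The file-name arguments and the choice of Gaussian smoothing are illustrative assumptions.

// Minimal ITK pipeline sketch (illustrative only, not NA-MIC Kit source):
// read a 3D volume, smooth it, and write the result.
#include "itkImage.h"
#include "itkImageFileReader.h"
#include "itkImageFileWriter.h"
#include "itkDiscreteGaussianImageFilter.h"

int main(int argc, char* argv[])
{
  if (argc < 3) { return 1; }                       // usage: smooth input.nrrd output.nrrd

  typedef itk::Image<float, 3> ImageType;           // 3D scalar volume

  // Source: reads any file format ITK has an ImageIO for
  itk::ImageFileReader<ImageType>::Pointer reader = itk::ImageFileReader<ImageType>::New();
  reader->SetFileName(argv[1]);

  // Processing step: Gaussian smoothing, standing in for any ITK filter
  typedef itk::DiscreteGaussianImageFilter<ImageType, ImageType> SmoothType;
  SmoothType::Pointer smoother = SmoothType::New();
  smoother->SetInput(reader->GetOutput());
  smoother->SetVariance(2.0);                       // example smoothing strength

  // Sink: writing triggers execution of the whole pipeline
  itk::ImageFileWriter<ImageType>::Pointer writer = itk::ImageFileWriter<ImageType>::New();
  writer->SetFileName(argv[2]);
  writer->SetInput(smoother->GetOutput());
  writer->Update();
  return 0;
}

VTK uses the same demand-driven pipeline style, which is part of what makes the two toolkits straightforward to combine inside one application.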
Slicer Today • 460K Lines of Code • Cross-Platform Tcl/Tk GUI • VTK/ITK-Based C++ Computing • www.slicer.org • >7,000 Registered Downloads • >230 on slicer-users • >150 on slicer-devel
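Slicer's computation and visualization layer is C++ built on VTK and ITK, with the GUI written in Tcl/Tk. The sketch below is a hedged, stand-alone VTK example of the kind of rendering Slicer performs, not Slicer source code: it reads a DICOM series, extracts an isosurface with marching cubes, and displays it. The directory path and the iso-value 500 are placeholder assumptions.

// Illustrative VTK C++ isosurface rendering sketch (not Slicer source code).
#include "vtkDICOMImageReader.h"
#include "vtkMarchingCubes.h"
#include "vtkPolyDataMapper.h"
#include "vtkActor.h"
#include "vtkRenderer.h"
#include "vtkRenderWindow.h"
#include "vtkRenderWindowInteractor.h"

int main(int argc, char* argv[])
{
  // Read a DICOM series from a directory (path is a placeholder argument)
  vtkDICOMImageReader* reader = vtkDICOMImageReader::New();
  reader->SetDirectoryName(argc > 1 ? argv[1] : "./dicom");

  // Extract an isosurface; the iso-value 500 is an arbitrary example threshold
  vtkMarchingCubes* surface = vtkMarchingCubes::New();
  surface->SetInputConnection(reader->GetOutputPort());
  surface->SetValue(0, 500.0);

  // Standard VTK rendering pipeline: mapper -> actor -> renderer -> window
  vtkPolyDataMapper* mapper = vtkPolyDataMapper::New();
  mapper->SetInputConnection(surface->GetOutputPort());
  vtkActor* actor = vtkActor::New();
  actor->SetMapper(mapper);

  vtkRenderer* renderer = vtkRenderer::New();
  renderer->AddActor(actor);
  vtkRenderWindow* window = vtkRenderWindow::New();
  window->AddRenderer(renderer);
  vtkRenderWindowInteractor* interactor = vtkRenderWindowInteractor::New();
  interactor->SetRenderWindow(window);

  window->Render();
  interactor->Start();

  // Clean up raw VTK objects (vtkSmartPointer is the more idiomatic alternative)
  interactor->Delete(); window->Delete(); renderer->Delete();
  actor->Delete(); mapper->Delete(); surface->Delete(); reader->Delete();
  return 0;
}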
ITK today • Over 20,000 registered downloads • Mailing lists • ITK Users: >1000 subscribers • ITK Developers: >210 subscribers
National Library of Medicine Segmentation and Registration Toolkit • $12 million over 6 years • Leading-edge algorithms • Open Source Software • www.itk.org
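ITK supplies the segmentation and registration building blocks reused across the NA-MIC Kit. As a minimal, hedged example of the segmentation side (an illustrative sketch, not an official ITK example program), the code below performs seeded region growing with itk::ConnectedThresholdImageFilter; the seed index and the intensity window [100, 300] are assumed values.

// Illustrative ITK region-growing segmentation sketch (assumed parameters).
#include "itkImage.h"
#include "itkImageFileReader.h"
#include "itkImageFileWriter.h"
#include "itkConnectedThresholdImageFilter.h"

int main(int argc, char* argv[])
{
  if (argc < 3) { return 1; }                          // usage: segment input.nrrd labelmap.nrrd

  typedef itk::Image<short, 3>         InputImageType;
  typedef itk::Image<unsigned char, 3> LabelImageType;

  itk::ImageFileReader<InputImageType>::Pointer reader = itk::ImageFileReader<InputImageType>::New();
  reader->SetFileName(argv[1]);

  // Grow a region from a seed voxel, keeping voxels whose intensity lies in [lower, upper]
  typedef itk::ConnectedThresholdImageFilter<InputImageType, LabelImageType> SegmenterType;
  SegmenterType::Pointer segmenter = SegmenterType::New();
  segmenter->SetInput(reader->GetOutput());
  segmenter->SetLower(100);                            // example intensity window
  segmenter->SetUpper(300);
  segmenter->SetReplaceValue(1);                       // label value written into the output mask

  InputImageType::IndexType seed;
  seed[0] = 64; seed[1] = 64; seed[2] = 20;            // example seed voxel (assumed)
  segmenter->SetSeed(seed);

  itk::ImageFileWriter<LabelImageType>::Pointer writer = itk::ImageFileWriter<LabelImageType>::New();
  writer->SetFileName(argv[2]);
  writer->SetInput(segmenter->GetOutput());
  writer->Update();                                    // pulls the whole pipeline
  return 0;
}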
Overview • Introduction • Core 1 • Core 2 • Core 3 • Support Cores • Slicer Demo
Core 3 • Core 3.1: Harvard, Dartmouth • Fronto-temporal connections • Cognitive and behavioral data • Core 3.2: UCI, Toronto • Brain regions involving the DLPFC (dorsolateral prefrontal cortex) • Clinical, cognitive, genetic data
Overview • Introduction • Core 1 • Core 2 • Core 3 • Support Cores • Slicer Demo
Support Cores (#4-7) • Service, Training, Dissemination • Crucial support for the scientific and engineering enterprise • Support core PIs also have strong scientific credentials • Collaboration history through BIRN and ITK
The Philosophy Open Source + Open Data = Open Science
The Open Source Model • Enabling technology for translational research • Compatible with both research and commercialization
The Open Data Model • Massive effort in social engineering (aka “This is MY data, why should I share it?”) • Provides data sets and “problems” for algorithm developers • E.g., BIRN develops data-sharing technology used as a template by other efforts
Challenges • The imaging community is in the early stages of learning about large-scale research • Everybody needs infrastructure, but who will pay for it? • Translational research and software engineering are expensive and difficult. • The traditional academic reward system does not work. A pure business approach does not work either.
For More Information www.na-mic.org