Cerebral Cortical Thickness Estimation using the TINA Open-Source Image Analysis Environment

2005 ◽  
Author(s):  
Marietta Scott ◽  
Paul A. Bromiley ◽  
Neil A. Thacker

This paper gives an overview of the use and development of the TINA open-source medical image analysis environment, with respect to the estimation of human cerebral cortical thickness from magnetic resonance images. The ultimate aim of TINA is to provide a validated system in which the source code and datasets are freely available, in order to allow peer validation of published results.

2005 ◽  
Author(s):  
Vincent Chu ◽  
Ghassan Hamarneh

To facilitate the analysis of medical image data in a research environment, MATITK was developed to allow ITK algorithms to be called from MATLAB. ITK is a powerful open-source image analysis toolkit, but using it requires knowledge of C++. With the help of MATITK, researchers familiar with MATLAB can harness the power of ITK without learning C++ or worrying about low-level programming issues. A common set of C++ classes has also been produced to allow future ITK methods to be added to MATITK, and thereby made callable from MATLAB, without the bothersome translation between MATLAB and ITK.
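The bridge described above exposes many ITK routines through a single MATLAB entry point. As a minimal sketch of that dispatch pattern in Python (the opcode names and filters here are illustrative, not MATITK's real operation codes):

```python
# Sketch of the opcode-dispatch pattern a MATITK-style bridge uses:
# one entry point receives an operation name plus data and routes it
# to the matching wrapped routine. Opcodes and filters are invented
# for illustration only.

def _threshold(image, params):
    """Binarize: 1 where pixel >= threshold, else 0."""
    t = params[0]
    return [[1 if px >= t else 0 for px in row] for row in image]

def _invert(image, params):
    """Invert intensities around the given maximum value."""
    vmax = params[0]
    return [[vmax - px for px in row] for row in image]

_REGISTRY = {"THRESH": _threshold, "INVERT": _invert}

def run_op(opcode, params, image):
    """Single entry point: look up the wrapped routine by opcode string."""
    try:
        op = _REGISTRY[opcode]
    except KeyError:
        raise ValueError(f"unknown operation: {opcode}")
    return op(image, params)

img = [[0, 50], [100, 150]]
print(run_op("THRESH", [100], img))  # [[0, 0], [1, 1]]
```

Adding a new wrapped method then only requires registering another function, which is the extensibility point the common C++ classes provide on the MATITK side.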


Author(s):  
RUDI DEKLERCK ◽  
JUNFENG GUO ◽  
ALEXANDRU SALOMIE ◽  
MAREK SULIGA ◽  
EDGARD NYSSEN ◽  
...  

This paper describes a surface flattening technique, which has been developed in particular to obtain a complete view of the cortical surface of the brain. However, the method is able to produce an overall planar view of any anatomical or real-life object, provided it is topologically compatible with the sphere (i.e. genus 0). It computes the shading of the original surface for rays cast from a nearby surrounding surface and unfolds this surface into a 2D plane, without introducing major distortions. The flat image consisting of the mapped shading results has the advantage that the sulci (i.e. the grooves characterizing the superficial brain geometry) of the cortical surface of the brain can be followed in their entirety, which facilitates the study and the recognition of their patterns. The new visualization method is integrated into a versatile medical image analysis environment. A first study to assess its usefulness has been completed and is also reported in this paper.
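The key geometric idea is that a genus-0 surface can be unfolded into a plane. The paper's method does this for a shaded surrounding surface; as a toy stand-in, the sketch below unfolds points of a unit sphere into longitude/latitude plane coordinates (an equirectangular mapping, not the authors' low-distortion scheme):

```python
import math

def unfold_sphere_point(x, y, z):
    """Map a point on the unit sphere to 2D plane coordinates
    (longitude, latitude) -- a simple equirectangular 'unfolding'.
    This is only a toy illustration of flattening a genus-0 surface,
    not the paper's shading-based, distortion-minimizing method."""
    lon = math.atan2(y, x)                    # in (-pi, pi]
    lat = math.asin(max(-1.0, min(1.0, z)))   # in [-pi/2, pi/2]
    return lon, lat

# A point on the equator maps to the plane origin:
print(unfold_sphere_point(1.0, 0.0, 0.0))  # (0.0, 0.0)
```

A real flattening must additionally control area and angle distortion so that sulcal patterns remain recognizable, which is what the paper's unfolding step addresses.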


2009 ◽  
Author(s):  
Erich Birngruber ◽  
René Donner ◽  
Georg Langs

The rapid and flexible visualization of large amounts of complex data has become a crucial part of medical image analysis. In recent years the Visualization Toolkit (VTK) has evolved into the de-facto standard for open-source medical data visualization. It features a clean design based on a data-flow paradigm, which the existing wrappers for VTK (Python, Tcl/Tk, Simulink) closely follow. This makes it possible to model many types of algorithms elegantly, but presents a steep learning curve for beginners. In contrast to existing approaches, we propose a framework for accessing VTK's capabilities from within MATLAB, using a syntax that closely follows MATLAB's graphics primitives. While providing users with the advanced, fast 3D visualization capabilities that MATLAB lacks, it is easy to learn yet flexible enough to allow for complex plots, large amounts of data, and combinations of visualizations. The proposed framework will be made available as open source with detailed documentation and example data sets.
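The data-flow paradigm mentioned above is what the wrappers mirror: each stage references its upstream input, and updating the final stage pulls data through the whole chain. A minimal pure-Python sketch of that pull-driven pipeline idea (class names are illustrative, not VTK's API):

```python
# Toy sketch of a pull-driven data-flow pipeline in the style VTK
# popularized: updating the sink pulls data through every upstream
# stage. These classes only illustrate the paradigm; VTK's real
# classes (e.g. sources, filters, mappers, actors) are far richer.

class Source:
    def update(self):
        return [3, 1, 2]            # stands in for reading a dataset

class SortFilter:
    def __init__(self, upstream):
        self.upstream = upstream    # reference to the previous stage
    def update(self):
        return sorted(self.upstream.update())

class Sink:
    def __init__(self, upstream):
        self.upstream = upstream
    def update(self):
        data = self.upstream.update()
        return f"rendering {data}"  # stands in for display/output

pipeline = Sink(SortFilter(Source()))
print(pipeline.update())  # rendering [1, 2, 3]
```

The steep learning curve the abstract mentions comes from having to assemble such object graphs explicitly; a MATLAB-style syntax hides the graph behind familiar plotting calls.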


Author(s):  
Gert Wollny ◽  
Peter Kellman ◽  
María-Jesus Ledesma-Carbayo ◽  
Matthew M Skinner ◽  
Jean-Jacques Hublin ◽  
...  

2006 ◽  
Author(s):  
Luis Ibanez ◽  
Lydia Ng ◽  
Josh Cates ◽  
Stephen Aylward ◽  
Bill Lorensen ◽  
...  

This course introduces attendees to select open-source efforts in the field of medical image analysis. Opportunities for users and developers are presented. The course particularly focuses on the open-source Insight Toolkit (ITK) for medical image segmentation and registration. The course describes the procedure for downloading and installing the toolkit and covers the use of its data representation and filtering classes. Attendees are shown how ITK can be used in their research, rapid prototyping, and application development.

LEARNING OUTCOMES
After completing this course, attendees will be able to:
- contribute to and benefit from open-source software for medical image analysis
- download and install the ITK toolkit
- start their own software project based on ITK
- design and construct an image processing pipeline
- combine ITK filters for medical image segmentation
- combine ITK components for medical image registration

INTENDED AUDIENCE
This course is intended for anyone involved in medical image analysis. In particular it targets graduate students, researchers, and professionals in the areas of computer science and medicine. Attendees should have an intermediate level of object-oriented programming in C++ and must be familiar with the basics of medical image processing and analysis.
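The "design and construct an image processing pipeline" outcome is the heart of ITK usage: filters are chained so each consumes the previous filter's output. As a minimal pure-Python stand-in for such a segmentation pipeline (ITK's real filters are C++ template classes; the filters here are simplified illustrations):

```python
# Minimal stand-in for an ITK-style segmentation pipeline: a smoothing
# filter feeding a threshold filter. Real ITK pipelines chain filter
# objects; these plain functions only mirror the composition idea.

def mean_smooth(signal):
    """3-tap mean filter with edge replication (a crude denoiser)."""
    padded = [signal[0]] + list(signal) + [signal[-1]]
    return [(padded[i] + padded[i + 1] + padded[i + 2]) / 3.0
            for i in range(len(signal))]

def threshold(signal, t):
    """Binary segmentation: 1 where the smoothed value >= t."""
    return [1 if v >= t else 0 for v in signal]

# A noisy 1-D "image": a bright region flanked by background.
signal = [0, 0, 9, 9, 9, 0, 0]
segmented = threshold(mean_smooth(signal), 5.0)
print(segmented)  # [0, 0, 1, 1, 1, 0, 0]
```

In ITK itself the same shape appears as filter objects wired together, with the final Update() call pulling the image through the chain.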


2020 ◽  
Vol 13 (5) ◽  
pp. 999-1007
Author(s):  
Karthikeyan Periyasami ◽  
Arul Xavier Viswanathan Mariammal ◽  
Iwin Thanakumar Joseph ◽  
Velliangiri Sarveshwaran

Background: Medical image analysis applications have complex resource requirements, and scheduling them on grid resources is a complex task. A new model is needed to improve the breast cancer screening process. The proposed novel meta-scheduler algorithm allocates image analysis applications to local schedulers; each local scheduler submits a job to a grid node, which analyses the medical image and sends the result back to the meta-scheduler. Meta-schedulers are distinct from local schedulers, but both aim at resource allocation and management. Objective: The main objective of the CDAM meta-scheduler is to maximize the number of jobs accepted. Methods: In the beginning, users send jobs with deadlines to the global grid resource broker. At fixed intervals, resource providers send the broker information about the available resources connected to the network, such as the valuation of each resource and the number of free resources. CDAM requests the available resource details and user jobs from the global grid resource broker and, after receiving this information, matches jobs with resources. CDAM sends each job to a local scheduler, which schedules it on its local grid site. The local grid site executes the jobs and sends the results back to CDAM. On successful completion of a job, the job status and resource status are updated in the auction history database. CDAM collects the results from all local grid sites and returns them to the grid users. Results: CDAM was simulated using a grid simulator. As the number of jobs increases, the percentage of jobs accepted decreases, owing to the scarcity of resources. CDAM provides 2% to 5% better results than the Fairshare meta-scheduling algorithm.
The CDAM bid density value is generated from the user requirement and user history, and the ask value is generated from the resource details. The user with the most significant deadline generates the highest bid value, and the grid resource with the fastest processor generates the lowest ask value. The highest bid is assigned to the lowest ask, meaning that the user with the most significant deadline is assigned the grid resource with the fastest processor. The deadline represents the time by which the user requires the result. The user can define the deadline by which the results are needed, and CDAM will try to find the fastest resource available in order to meet it. If the scheduler detects that the tasks cannot be completed before the deadline, it abandons the current resource, selects the next fastest resource, and retries until the application can complete within the deadline. CDAM provides 25% better results than the GridWay meta-scheduler, because GridWay allocates jobs to resources on a first-come, first-served basis. Conclusion: The proposed CDAM model was validated through simulation and evaluated on the number of jobs accepted. The experimental results clearly show that the CDAM model maximizes the number of jobs accepted compared with a conventional meta-scheduler. We conclude that CDAM is a highly effective meta-scheduling system and can be used in extraordinary situations where jobs have combinatorial requirements.
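The core matching rule described above pairs the highest bid with the lowest ask, i.e. the job with the most pressing deadline with the fastest resource. A minimal sketch of that rule (the bid/ask ordering here is a simplified assumption; the paper derives bid density from user history and asks from resource details):

```python
# Sketch of the highest-bid-to-lowest-ask matching rule: jobs with the
# tightest deadlines are assumed to carry the highest bids, and faster
# resources the lowest asks. The concrete bid/ask formulas in CDAM are
# richer; this only illustrates the pairing.

def match_jobs(jobs, resources):
    """jobs: list of (name, deadline); resources: list of (name, speed).
    Returns (job, resource) pairs: tightest deadline -> fastest resource."""
    by_bid = sorted(jobs, key=lambda j: j[1])        # tighter deadline first
    by_ask = sorted(resources, key=lambda r: -r[1])  # fastest (lowest ask) first
    return [(job[0], res[0]) for job, res in zip(by_bid, by_ask)]

jobs = [("scanA", 10), ("scanB", 3), ("scanC", 7)]
resources = [("node1", 2.0), ("node2", 5.0), ("node3", 1.0)]
print(match_jobs(jobs, resources))
# [('scanB', 'node2'), ('scanC', 'node1'), ('scanA', 'node3')]
```

The fallback behaviour in the abstract (abandon a resource that cannot meet the deadline and try the next fastest) would simply re-run this matching over the remaining resources.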


Author(s):  
Sanket Singh ◽  
Sarthak Jain ◽  
Akshit Khanna ◽  
Anupam Kumar ◽  
Ashish Sharma
