Speaker: Mohammad Reza Azodinia

The daily growth in the amount of medical image data, e.g. Positron Emission Tomography (PET) images, produced by medical data acquisition devices necessitates a shift from traditional medical image analysis towards scalable solutions. However, processing large datasets of 3D/4D PET images is highly demanding in terms of computation time, storage capacity, and network bandwidth.

Among efficient image processing algorithms, median filtering is a principal building block in many applications, since it reduces noise while preserving edges. However, the time complexity and memory consumption of the median filter have always hampered its use, especially when the input images are very large and 3D median filtering is required.
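To make the operation concrete, the following minimal sketch (not the talk's actual implementation) applies a 3x3x3 median filter to a volume stored as a Java double[z][y][x] array; for simplicity it copies border voxels unfiltered, whereas a production version would mirror or clamp at the boundaries.

import java.util.Arrays;

/**
 * Minimal sketch of a 3x3x3 median filter over a 3D volume
 * stored as volume[z][y][x]. Border voxels are copied through
 * unfiltered for simplicity.
 */
public class MedianFilter3D {

    public static double[][][] filter(double[][][] volume) {
        int depth = volume.length, height = volume[0].length, width = volume[0][0].length;
        double[][][] out = new double[depth][height][width];
        double[] window = new double[27];

        for (int z = 0; z < depth; z++) {
            for (int y = 0; y < height; y++) {
                for (int x = 0; x < width; x++) {
                    if (z == 0 || y == 0 || x == 0
                            || z == depth - 1 || y == height - 1 || x == width - 1) {
                        out[z][y][x] = volume[z][y][x]; // borders copied unfiltered
                        continue;
                    }
                    // Gather the 27-voxel neighbourhood, sort, take the middle value.
                    int i = 0;
                    for (int dz = -1; dz <= 1; dz++)
                        for (int dy = -1; dy <= 1; dy++)
                            for (int dx = -1; dx <= 1; dx++)
                                window[i++] = volume[z + dz][y + dy][x + dx];
                    Arrays.sort(window);
                    out[z][y][x] = window[13]; // median of 27 values
                }
            }
        }
        return out;
    }
}

The naive sort-per-voxel approach shown here is O(k log k) per voxel for a window of k voxels, which illustrates why memory and time costs grow quickly on large 3D volumes.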

Moreover, as processor architectures approach their physical limits, the limited hardware resources of a single PC and the processing time it can reasonably tolerate present a bottleneck when processing a large number of large images. Hence, distributed computing technologies have become a popular way to solve problems that do not fit within the confines of a single computer.

MapReduce is a programming model for processing large data sets with a parallel, distributed algorithm, providing high scalability and fault tolerance on a cluster of commodity hardware. We utilize an open-source implementation of MapReduce, the Hadoop framework, to perform 3D median filtering on a large-scale sequence of PET images provided by the Positron Emission Tomography Centre of the University of Debrecen, and we evaluate its performance.
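As a rough illustration of how such filtering might be expressed in the MapReduce model (the talk's actual job design may differ), the sketch below shows a map-only Hadoop mapper that filters independently distributed sub-volumes, reusing MedianFilter3D from the sketch above. The record layout and the VolumeCodec helper are assumptions for illustration, and sub-volumes are assumed to be split with one-voxel overlap so that block borders remain correct.

import java.io.IOException;
import java.nio.ByteBuffer;

import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

/**
 * Sketch of a map-only Hadoop job that applies the 3D median filter
 * to sub-volumes distributed across the cluster. Each input record is
 * assumed to be (volume id, serialized voxel block).
 */
public class MedianFilterMapper
        extends Mapper<Text, BytesWritable, Text, BytesWritable> {

    @Override
    protected void map(Text volumeId, BytesWritable block, Context context)
            throws IOException, InterruptedException {
        double[][][] volume = VolumeCodec.decode(block.copyBytes());
        double[][][] filtered = MedianFilter3D.filter(volume);
        context.write(volumeId, new BytesWritable(VolumeCodec.encode(filtered)));
    }

    /** Hypothetical codec: a 12-byte header (depth, height, width as ints)
     *  followed by voxels in z-y-x order as big-endian doubles. */
    static class VolumeCodec {
        static double[][][] decode(byte[] bytes) {
            ByteBuffer buf = ByteBuffer.wrap(bytes);
            int d = buf.getInt(), h = buf.getInt(), w = buf.getInt();
            double[][][] v = new double[d][h][w];
            for (int z = 0; z < d; z++)
                for (int y = 0; y < h; y++)
                    for (int x = 0; x < w; x++)
                        v[z][y][x] = buf.getDouble();
            return v;
        }

        static byte[] encode(double[][][] v) {
            int d = v.length, h = v[0].length, w = v[0][0].length;
            ByteBuffer buf = ByteBuffer.allocate(12 + 8 * d * h * w);
            buf.putInt(d).putInt(h).putInt(w);
            for (double[][] plane : v)
                for (double[] row : plane)
                    for (double voxel : row)
                        buf.putDouble(voxel);
            return buf.array();
        }
    }
}

A map-only job suffices in this sketch because each sub-volume can be filtered independently; no reduce phase is needed to combine results, which keeps the data movement across the cluster to a minimum.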

Details

Duration

30 + 5
Supervisor: PRIP