Deep Learning Meets Radiomics For End-To-End Brain Tumor MRI Analysis
Wojciech Ponikiewski, Jakub Nalepa
SPS
Length: 00:14:03
Temporal color constancy (TCC) is a recent research direction that uses multiple temporally correlated images (e.g., frames in a video) to perform illuminant estimation. Compared to traditional single-frame color constancy methods, temporal or multi-frame methods can leverage the additional temporal information inherent in image sequences, which makes them naturally suited to processing videos. In this paper, we present a hybrid framework called Fusion Temporal Color Constancy that combines classical color constancy algorithms, convolutional neural networks (CNNs), and recurrent neural networks (RNNs) for the task of TCC. Experimental results show that our hybrid approach achieves state-of-the-art accuracy on several publicly available multi-frame and single-frame color constancy benchmarks.
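To make the hybrid idea concrete, the following is a minimal sketch of one way such a pipeline could be wired together: per-frame CNN features are concatenated with a classical illuminant estimate and fed through an RNN that predicts a single RGB illuminant for the sequence. The class and function names (HybridTCC, grey_world), the choice of grey-world as the classical algorithm, the use of a GRU, and all layer sizes are illustrative assumptions, not the architecture described in the paper.

# Hypothetical sketch of a hybrid temporal color constancy model.
# Per-frame CNN features are fused with a classical (grey-world) estimate
# and aggregated over time by a GRU; the output is a unit-norm RGB illuminant.
import torch
import torch.nn as nn


def grey_world(frames: torch.Tensor) -> torch.Tensor:
    # Classical grey-world estimate per frame.
    # frames: (B, T, 3, H, W) -> (B, T, 3) unit-norm illuminants.
    est = frames.mean(dim=(-1, -2))
    return est / est.norm(dim=-1, keepdim=True).clamp_min(1e-8)


class HybridTCC(nn.Module):
    def __init__(self, feat_dim: int = 64, hidden_dim: int = 128):
        super().__init__()
        # Small per-frame CNN encoder, shared across time steps.
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim), nn.ReLU(),
        )
        # RNN fuses CNN features with the classical estimate over time.
        self.rnn = nn.GRU(feat_dim + 3, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 3)

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (B, T, 3, H, W)
        b, t, c, h, w = frames.shape
        feats = self.cnn(frames.reshape(b * t, c, h, w)).reshape(b, t, -1)
        classical = grey_world(frames)                     # (B, T, 3)
        _, last = self.rnn(torch.cat([feats, classical], dim=-1))
        illum = self.head(last[-1])                        # (B, 3)
        return illum / illum.norm(dim=-1, keepdim=True).clamp_min(1e-8)


if __name__ == "__main__":
    model = HybridTCC()
    video = torch.rand(2, 8, 3, 64, 64)   # 2 sequences of 8 frames
    print(model(video).shape)              # torch.Size([2, 3])

In this sketch the classical estimate acts as a physically motivated prior that the learned components can correct, which is one plausible reading of how classical algorithms, CNNs, and RNNs can complement each other in a TCC pipeline.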