
Did Evolution get it right? An evaluation of Near-Infrared imaging in semantic scene segmentation using deep learning

Animals have evolved to restrict their sensing capabilities to a certain region of the electromagnetic spectrum. On the scale of the full spectrum this is a surprisingly narrow band, which raises the question of whether a systematic bias underlies such selective filtration. The situation becomes even more intriguing given the sharp cutoff at the near-infrared boundary, where almost all animal vision systems appear to share a lower bound. This brings us to an interesting question: did evolution "intentionally" impose such a restriction in order to evolve higher visual cognition? In this work, the question is addressed by experimenting with near-infrared images and their potential applicability to higher visual processing tasks such as semantic segmentation. A modified version of Fully Convolutional Networks is trained on NIR images and on RGB images, and the two are compared for their effectiveness on semantic segmentation. The results of the experiments show that the visible part of the spectrum alone is sufficient for robust semantic segmentation of indoor as well as outdoor scenes.
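The comparison described in the abstract hinges on one architectural detail: an NIR input has a single channel while an RGB input has three, so only the network's input layer differs between the two conditions. The following is a minimal sketch of that setup, using a toy per-pixel linear classifier (a 1x1 convolution, the degenerate case of a fully convolutional network) in place of the paper's modified FCN; the class count and weight shapes are hypothetical, not taken from the paper.

```python
import numpy as np

def segment(image, weights):
    """Toy per-pixel classifier standing in for the paper's modified FCN.

    image:   (H, W, C) float array -- C=3 for RGB, C=1 for NIR
    weights: (C, K) float array    -- K semantic classes (hypothetical)

    A 1x1 convolution is just a per-pixel linear map over channels,
    followed by an argmax over class scores. A real FCN adds deeper
    convolutional layers and upsampling, but the NIR-vs-RGB comparison
    only changes the input channel count C.
    """
    logits = image @ weights        # (H, W, K) per-pixel class scores
    return logits.argmax(axis=-1)   # (H, W) predicted label map

rng = np.random.default_rng(0)
K = 5                               # hypothetical number of semantic classes
rgb = rng.random((64, 64, 3))       # visible-spectrum input, 3 channels
nir = rng.random((64, 64, 1))       # near-infrared input, 1 channel

rgb_labels = segment(rgb, rng.random((3, K)))
nir_labels = segment(nir, rng.random((1, K)))
```

Both calls produce a label map of the same spatial size, which is what makes the two training conditions directly comparable under the same evaluation metric.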
