Applied Sciences, Vol. 15, Pages 4740: A Few-Shot Learning Framework for Depth Completion Based on Self-Training with Noise and Pixel-Wise Knowledge Distillation

Applied Sciences doi: 10.3390/app15094740

Authors:
Shijie Zhang
Shengjie Zhao
Jin Zeng
Hao Deng

Depth completion produces a dense depth map from sparse depth measurements, guided by a corresponding RGB image. Deep neural network (DNN)-based methods depend on annotated datasets for effective training; when training data are scarce, their generalization capability diminishes considerably. Moreover, acquiring a large dataset of depth maps is challenging and resource-intensive. To address these challenges, we introduce a novel few-shot learning approach for depth completion. Our approach integrates noisy-student training with knowledge distillation (KD) to improve model performance and generalization. We incorporate both the noisy-student training and KD modules into a basic deep regression network built on a non-local spatial propagation network (NLSPN) for depth completion. The noisy-student training framework enhances the model’s performance and generalization by introducing controlled noise and a self-learning mechanism, while the pixel-wise KD mechanism transfers the teacher model’s capabilities to the student model. Experimental evaluations demonstrate that our approach effectively addresses depth completion in scenarios where only a limited number of training samples is available.
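The two ingredients of the framework can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: `pixelwise_kd_loss` is a per-pixel L1 distillation loss between the student's and teacher's predicted depth maps, and `add_input_noise` mimics the noisy-student idea by randomly dropping sparse depth samples and jittering the survivors with Gaussian noise. All function names, the drop probability, and the noise scale are illustrative assumptions.

```python
import numpy as np

def pixelwise_kd_loss(student_depth, teacher_depth, valid_mask=None):
    """Per-pixel L1 distillation loss between student and teacher depth maps.

    If a validity mask is given, the loss is averaged only over valid pixels
    (e.g. pixels where the teacher produced a confident prediction).
    """
    diff = np.abs(student_depth - teacher_depth)
    if valid_mask is not None:
        diff = diff * valid_mask
        return diff.sum() / max(valid_mask.sum(), 1)
    return diff.mean()

def add_input_noise(sparse_depth, drop_prob=0.1, sigma=0.01, rng=None):
    """Noisy-student style perturbation of a sparse depth input.

    Randomly drops a fraction of the valid (non-zero) depth samples and adds
    Gaussian jitter to the remaining ones; dropped pixels are set to 0.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    keep = (rng.random(sparse_depth.shape) > drop_prob) & (sparse_depth > 0)
    jitter = rng.normal(0.0, sigma, sparse_depth.shape)
    return np.where(keep, sparse_depth + jitter, 0.0)
```

During self-training, the teacher would label unannotated scenes, the student would be trained on noised versions of those inputs with the distillation loss above added to the usual regression loss, and the roles would then be swapped for the next iteration.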
