Efficient Image Inpainting with Knowledge Distillation
First published: 2020-01-17
Abstract: In recent years, deep learning has achieved outstanding results in image classification, recognition, segmentation, and generation, and research on deep-learning-based image inpainting has also made breakthroughs. Existing algorithms produce good inpainting results, but they cannot run inference in real time. To achieve fast and efficient image inpainting, this work optimizes the existing gated-convolution network in three ways. First, pyramid sampling is used to optimize the dilated gated-convolution layers, yielding a coarse-to-fine pyramid-upsampling inpainting network (Pyramid-Upsample Net, PUNet); compared with the gated-convolution network, PUNet uses less computation and more parameters for feature learning, and can fuse features from different depths. Second, holistic, pair-wise, and pixel-wise loss functions are proposed to improve the local and global consistency of the results. Third, knowledge distillation is introduced into image inpainting through a multi-level self-distillation method. Experiments show that PUNet matches the inpainting quality of the gated-convolution network in 22% of the inference time.
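The gated convolution that PUNet builds on replaces the hard 0/1 validity mask of partial convolution with a learned soft gate: one branch computes features, a parallel branch computes a sigmoid gate, and the two are multiplied elementwise so the network can down-weight responses inside hole regions. The following is a minimal single-channel NumPy sketch of that operation, not the paper's implementation; the function names and the tanh feature activation are illustrative assumptions.

```python
import numpy as np

def conv2d(x, w, b):
    """Naive valid 2D convolution, single channel in and out."""
    kh, kw = w.shape
    H, W = x.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * w) + b
    return out

def gated_conv(x, w_feat, b_feat, w_gate, b_gate):
    """Gated convolution: a learned soft gate in (0, 1) modulates the
    feature response at every spatial location, letting the network
    suppress activations that come from masked (hole) pixels."""
    feat = np.tanh(conv2d(x, w_feat, b_feat))                 # feature branch
    gate = 1.0 / (1.0 + np.exp(-conv2d(x, w_gate, b_gate)))   # sigmoid gate
    return feat * gate
```

Because both branches share the same receptive field, the gate can be learned end-to-end from the data, which is what allows gated convolution to handle free-form masks rather than rectangular ones.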
Keywords: Pattern Recognition; Image Inpainting; Knowledge Distillation