Attention-Aware Multi-Task Convolutional Neural Networks

Lyu, Li, Zhang (2019). Attention-Aware Multi-Task Convolutional Neural Networks. IEEE Trans Image Process.

Abstract

Multi-task deep learning methods learn multiple tasks simultaneously and share representations among them, so that information from related tasks improves learning within each individual task and the generalization of the resulting models is substantially enhanced. Typical multi-task deep learning models share representations across tasks in the lower layers of the network and separate them in the higher layers. However, different groups of tasks have different requirements for sharing, so such a hand-designed criterion does not guarantee that the resulting network architecture is optimal. In addition, most existing methods ignore the redundancy problem and lack a pre-screening process for representations before they are shared. Here, we propose a model called the Attention-aware Multi-task Convolutional Neural Network, which automatically learns an appropriate sharing scheme through end-to-end training. An attention mechanism is introduced into the architecture to suppress the redundant content contained in shared representations, and shortcut connections are adopted to preserve useful information. We evaluate the model on different task groups and different datasets. It improves over existing techniques in many experiments, indicating both its effectiveness and its robustness. We also demonstrate the importance of the attention mechanism and the shortcut connections within the model.
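To make the idea concrete, below is a minimal PyTorch sketch of the two ingredients the abstract names: an attention gate that screens another task's representation before it is shared, and a shortcut connection that preserves the branch's own features. It assumes a squeeze-and-excitation style channel-attention gate; the paper's actual gating and wiring may differ, and all module names (AttentionShare, MultiTaskNet) are illustrative, not taken from the paper.

```python
# Minimal sketch of attention-gated cross-task sharing with a shortcut
# connection. Assumption: a channel-attention (squeeze-and-excitation style)
# gate; the paper's exact mechanism may differ.
import torch
import torch.nn as nn


class AttentionShare(nn.Module):
    """Gate features from the other task's branch before sharing them.

    The attention weights suppress redundant channels in the incoming
    representation; the shortcut connection adds the branch's own features
    back so useful information is preserved.
    """

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),               # squeeze: global context per channel
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),                          # per-channel weights in (0, 1)
        )

    def forward(self, own: torch.Tensor, other: torch.Tensor) -> torch.Tensor:
        attended = other * self.gate(other)        # suppress redundant content
        return own + attended                      # shortcut keeps own features intact


class MultiTaskNet(nn.Module):
    """Two-branch CNN whose branches exchange attention-gated features."""

    def __init__(self, in_ch=3, width=32, n_cls_a=10, n_cls_b=10):
        super().__init__()

        def block(cin, cout):
            return nn.Sequential(nn.Conv2d(cin, cout, 3, padding=1),
                                 nn.BatchNorm2d(cout), nn.ReLU(inplace=True))

        self.a1, self.b1 = block(in_ch, width), block(in_ch, width)
        self.share_a, self.share_b = AttentionShare(width), AttentionShare(width)
        self.a2, self.b2 = block(width, width), block(width, width)
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.head_a = nn.Linear(width, n_cls_a)
        self.head_b = nn.Linear(width, n_cls_b)

    def forward(self, x):
        fa, fb = self.a1(x), self.b1(x)
        # Simultaneous exchange: both gates read the pre-exchange features.
        fa, fb = self.share_a(fa, fb), self.share_b(fb, fa)
        fa, fb = self.a2(fa), self.b2(fb)
        fa = self.pool(fa).flatten(1)
        fb = self.pool(fb).flatten(1)
        return self.head_a(fa), self.head_b(fb)


if __name__ == "__main__":
    net = MultiTaskNet()
    ya, yb = net(torch.randn(2, 3, 64, 64))
    print(ya.shape, yb.shape)  # torch.Size([2, 10]) torch.Size([2, 10])
```

Because the gate and the branch weights are plain differentiable modules, how much each branch borrows from the other is learned end-to-end rather than fixed by a hand-designed layer split, which is the sharing behavior the abstract describes.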

Links

http://www.ncbi.nlm.nih.gov/pubmed/31603785
http://dx.doi.org/10.1109/TIP.2019.2944522
