
Multi-task Guided Blind Omnidirectional Image Quality Assessment with Feature Interaction
  • Sifan Li

Abstract

With the development of virtual reality (VR) applications, omnidirectional image quality assessment (OIQA) has become an increasingly important problem. In this paper, a multi-task guided blind OIQA method with local and global feature interaction and fusion is proposed. Specifically, a bidirectional pseudo-reference (BPR) module is first constructed to capture error maps on viewports using two opposite sources of pseudo-reference information; it is followed by a multi-scale feature extraction module that obtains multi-scale local degradation features. Moreover, to complement the local features on viewports, a Mamba module is adopted to extract multi-scale global features. The features from the local and global branches are then deeply fused by a multi-level aggregation module. Finally, motivated by the multi-task managing mechanism of the human brain, a multi-task learning module is introduced to assist the main quality assessment task. Extensive experimental results demonstrate that the proposed method achieves state-of-the-art performance on the blind OIQA task compared with other models.
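The pipeline described above can be sketched as a dataflow: a local branch built on BPR error maps, a global branch, a fusion stage, and multi-task heads. The following is a minimal, stdlib-only Python sketch of that dataflow, not the paper's implementation; every layer here is a hypothetical stand-in (a fixed random projection replaces learned weights, and the Mamba-based global branch and BPR error maps are crude placeholders).

```python
import random

random.seed(0)

def linear(x, out_dim):
    # Hypothetical stand-in for a learned layer: a fixed random projection.
    w = [[random.uniform(-0.1, 0.1) for _ in x] for _ in range(out_dim)]
    return [sum(wi * xi for wi, xi in zip(row, x)) for row in w]

def bpr_module(viewport):
    # Placeholder for the bidirectional pseudo-reference (BPR) module:
    # two opposite pseudo-reference error signals, concatenated.
    forward_err = [abs(v - 0.0) for v in viewport]
    backward_err = [abs(1.0 - v) for v in viewport]
    return forward_err + backward_err

def local_branch(viewport):
    # Multi-scale local degradation features from the BPR error maps
    # (multi-scale extraction collapsed into one projection here).
    return linear(bpr_module(viewport), 32)

def global_branch(image):
    # Placeholder for the Mamba-based multi-scale global branch.
    return linear(image, 32)

def fuse(local_f, global_f):
    # Multi-level aggregation sketched as concatenation + projection.
    return linear(local_f + global_f, 16)

def multi_task_heads(fused):
    quality = linear(fused, 1)[0]  # main quality-regression head
    aux = linear(fused, 4)         # auxiliary head (e.g. distortion type)
    return quality, aux

image = [random.random() for _ in range(64)]
viewport = image[:16]
q, aux = multi_task_heads(fuse(local_branch(viewport), global_branch(image)))
print("quality score:", round(q, 4), "| aux logits:", len(aux))
```

The sketch only illustrates how the modules connect; in a real system each placeholder would be a trained network and the auxiliary head would carry the paper's multi-task supervision signal.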