
Neural Foveated Super-Resolution for Real-time VR Rendering
  • Jiannan Ye, Shanghai Jiao Tong University School of Software
  • Xiaoxu Meng, Tencent
  • Daiyun Guo, Shanghai Jiao Tong University School of Software
  • Cheng Shang, Shanghai Jiao Tong University School of Software
  • Haotian Mao, Shanghai Jiao Tong University School of Software
  • Xubo Yang, Shanghai Jiao Tong University School of Software

Corresponding Author: yangxubo@sjtu.edu.cn

Abstract

As virtual reality display technologies advance, resolutions and refresh rates continue to approach the limits of human perception, posing a challenge for real-time rendering algorithms. Neural super-resolution is promising for reducing computation cost and boosting the visual experience by scaling up low-resolution renderings, but the added workload of running neural networks cannot be neglected. In this paper, we alleviate this burden by exploiting the foveated nature of the human visual system, in which acuity decreases rapidly from the focal point toward the periphery. With the help of the dynamic and geometric information (i.e., pixel-wise motion vectors, depth, and camera transformations) inherently available in real-time rendering, we propose a neural accumulator that recurrently aggregates amortized low-resolution visual information from frame to frame. Using a partition-assemble scheme, a neural super-resolution module upsamples the low-resolution image tiles to different qualities according to their perceptual importance and reconstructs the final output heterogeneously. Perceptually high-fidelity foveated high-resolution frames are generated in real time, surpassing the quality of other foveated super-resolution methods.
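The partition-assemble idea can be illustrated with a short sketch. The following is a minimal, hypothetical PyTorch example, not the authors' implementation: it splits a low-resolution frame into tiles, routes tiles near the gaze point through a wider upsampler and peripheral tiles through a cheaper one, and reassembles the result. The tile size, the two-level quality split, the toy networks, and the fovea_radius threshold are all illustrative assumptions.

# Minimal sketch of tile-based foveated upsampling (assumed PyTorch setup;
# architecture and parameters are placeholders, not the paper's method).
import torch
import torch.nn as nn

class TileUpsampler(nn.Module):
    """A tiny x2 upsampler; channel width stands in for reconstruction quality."""
    def __init__(self, width):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, width, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(width, 3 * 4, 3, padding=1),  # 4 = 2x2 pixel-shuffle factor
            nn.PixelShuffle(2),
        )

    def forward(self, x):
        return self.body(x)

def foveated_upsample(lr_frame, gaze_uv, tile=32,
                      fovea_net=None, periphery_net=None, fovea_radius=0.25):
    """Partition the LR frame into tiles, upsample foveal tiles with the
    high-quality net and peripheral tiles with the cheap net, then assemble."""
    fovea_net = fovea_net or TileUpsampler(width=64)
    periphery_net = periphery_net or TileUpsampler(width=16)

    _, _, H, W = lr_frame.shape
    out = torch.zeros(lr_frame.shape[0], 3, H * 2, W * 2)
    for y in range(0, H, tile):
        for x in range(0, W, tile):
            patch = lr_frame[:, :, y:y + tile, x:x + tile]
            # Tile centre in normalised [0, 1] image coordinates.
            cu, cv = (x + tile / 2) / W, (y + tile / 2) / H
            dist = ((cu - gaze_uv[0]) ** 2 + (cv - gaze_uv[1]) ** 2) ** 0.5
            net = fovea_net if dist < fovea_radius else periphery_net
            out[:, :, 2 * y:2 * (y + tile), 2 * x:2 * (x + tile)] = net(patch)
    return out

# Usage: upsample a 288x512 LR frame to 576x1024 with the gaze at the centre.
lr = torch.rand(1, 3, 288, 512)
hr = foveated_upsample(lr, gaze_uv=(0.5, 0.5))
print(hr.shape)  # torch.Size([1, 3, 576, 1024])

In the paper's pipeline, the tiles would also draw on the recurrently accumulated history (warped with motion vectors, depth, and camera transformations) rather than on a single raw frame as in this sketch.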
History

08 Nov 2023  Submitted to Computer Animation and Virtual Worlds
08 Nov 2023  Submission Checks Completed
08 Nov 2023  Assigned to Editor
16 Mar 2024  1st Revision Received
18 Mar 2024  Submission Checks Completed
18 Mar 2024  Assigned to Editor
12 Apr 2024  Reviewer(s) Assigned
25 May 2024  Review(s) Completed, Editorial Evaluation Pending
25 May 2024  Editorial Decision: Accept