AUTHOREA
Yuan Xiong

Public Documents 2
Boosting Human Pose Estimation via Heatmap Refinement
Ling Jiang, Zhuocheng Liu, and 5 more

July 16, 2024
Human pose estimation based on heatmap regression has achieved significant success in recent years. However, the semantic ambiguity introduced by traditional hand-crafted heatmaps seriously degrades model performance. Specifically, hand-crafted heatmaps generated with a fixed Gaussian kernel are semantically misaligned: keypoints of the same type may receive differently sized Gaussian covered areas, which confuses model learning. In this paper, we focus on learnable heatmap generation and propose a refined heatmap generator (RHG) to boost human pose estimation. First, we propose a joint training framework that connects the human pose estimator and RHG for end-to-end training; it employs a joint loss function to learn intermediate representations of the network and the dataset. Second, RHG takes annotated keypoints as input and uses scale-aware heatmaps as regression targets to handle scale variation. Scale-aware heatmaps are generated by adjusting the Gaussian covered areas according to geometric priors. Experimental results show that our method achieves 72.0% AP on COCO test-dev2017 and 74.0% AP on the CrowdPose dataset, outperforming state-of-the-art methods.
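The scale-aware heatmap idea in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's RHG (which is learned end-to-end): it only shows how a Gaussian target's covered area can be tied to the person's scale instead of a fixed kernel. The names `person_scale`, `base_sigma`, and `ref_scale` are assumptions introduced here for illustration.

```python
import numpy as np

def gaussian_heatmap(h, w, cx, cy, sigma):
    """Render an h x w heatmap with a 2D Gaussian peaked at keypoint (cx, cy)."""
    ys, xs = np.mgrid[0:h, 0:w]
    return np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2.0 * sigma ** 2))

def scale_aware_heatmap(h, w, cx, cy, person_scale, base_sigma=2.0, ref_scale=128.0):
    """Scale the Gaussian covered area with the person's size, so keypoints of
    the same type get consistent relative coverage across scales (hypothetical
    stand-in for the geometric priors described in the abstract)."""
    sigma = base_sigma * person_scale / ref_scale
    return gaussian_heatmap(h, w, cx, cy, sigma)
```

A fixed-kernel baseline would call `gaussian_heatmap` with the same `sigma` for every instance; the scale-aware variant widens or narrows the target per person, which is the misalignment the paper addresses.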
DreamWalk: Dynamic Remapping and Multiperspectivity for Large-Scale Redirected Walking
Yuan Xiong, Tong Chen, and 3 more

May 29, 2023
Redirected walking (RDW) provides an immersive user experience in virtual reality applications. In RDW, the size of the physical play area is limited, which makes it challenging to design virtual paths in a larger virtual space. Mainstream RDW approaches rigidly manipulate gains to guide the user along predetermined rules, but these methods may cause simulator sickness, boundary collisions, and resets. Static mapping approaches warp the virtual path through expensive vertex replacement during model pre-processing; they are restricted to narrow spaces with non-looping pathways, partition walls, and planar surfaces, and they fail to provide a smooth walking experience in large-scale open scenes. To tackle these problems, we propose a novel approach that dynamically redirects the user to walk through a non-linear virtual space. More specifically, we propose a Bezier-curve-based mapping algorithm to warp the virtual space dynamically and apply multiperspective fusion for visualization augmentation. Comparative experiments on our self-collected photogrammetry dataset show its superiority over state-of-the-art large-scale redirected walking approaches.
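The Bezier-curve-based mapping can be illustrated with a plain cubic Bezier evaluation: a straight virtual corridor parameterized by t in [0, 1] is remapped onto a curved physical path defined by four control points. This is only a sketch of the curve primitive, under the assumption of a cubic curve; the paper's actual dynamic warping and gain handling are not specified in the abstract.

```python
import numpy as np

def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier curve at parameter(s) t in [0, 1].

    p0..p3 are 2D control points; t may be a scalar or an array, in which
    case one curve point is returned per parameter value."""
    t = np.asarray(t, dtype=float)[..., None]  # broadcast t against the 2D points
    return ((1 - t) ** 3 * p0
            + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2
            + t ** 3 * p3)
```

For example, a user who has walked a fraction t of the straight virtual corridor would be placed at `cubic_bezier(p0, p1, p2, p3, t)` in the physical play area, bending a long virtual path into a bounded physical footprint.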
