We recently showed that a novel prosthesis control scheme, based on natural arm movements reconstructed by an Artificial Neural Network (ANN) that receives the movement goal and stump motion as inputs, enables close-to-natural reaching control in Virtual Reality (VR). Here, we applied and tested this control in the physical world using first-person teleoperation of Reachy2, an open-source humanoid robot made by Pollen Robotics that ranked 2nd at the ANA Avatar XPRIZE. Direct teleoperation reproducing all arm joints of able-bodied participants (n=14) yielded a 100% success rate for grasping objects at various locations, with the best possible usability scores and low workload scores. A high success rate (92%) was also obtained when all distal joints from the elbow onward were operated with the novel prosthesis control, using a movement goal identified through gaze-guided computer vision. While usability and workload scores were slightly degraded when able-bodied participants used the prosthesis control compared to direct teleoperation, both control schemes were rated similarly well by a sample of our target population (n=8 participants with transhumeral limb loss). This platform and these control schemes open broad perspectives not only for prosthesis control, but also for teleoperation robotics and human-robot interaction.
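
For a concrete picture of the control mapping described above, the sketch below shows one way such an ANN could be structured: a small feed-forward regressor taking a movement goal (e.g., a target object pose identified through gaze-guided computer vision) and stump motion as inputs, and returning the distal joint angles from the elbow onward. The architecture, input dimensions, and variable names are illustrative assumptions, not the trained model used in the study.

```python
# Minimal sketch (illustrative only): a feed-forward network mapping a movement
# goal and stump motion to distal joint angles, in the spirit of the ANN-based
# control described above. Dimensions, names, and the single-hidden-layer
# architecture are assumptions, not taken from the published model.
import numpy as np

rng = np.random.default_rng(0)

GOAL_DIM = 7        # assumed: target position (3) + orientation quaternion (4)
STUMP_DIM = 4       # assumed: residual-limb (stump) orientation quaternion
DISTAL_JOINTS = 5   # assumed: elbow flexion, forearm rotation, wrist (3 DoF)
HIDDEN = 64

# Randomly initialised weights stand in for the trained parameters.
W1 = rng.standard_normal((GOAL_DIM + STUMP_DIM, HIDDEN)) * 0.1
b1 = np.zeros(HIDDEN)
W2 = rng.standard_normal((HIDDEN, DISTAL_JOINTS)) * 0.1
b2 = np.zeros(DISTAL_JOINTS)

def predict_distal_joints(goal: np.ndarray, stump: np.ndarray) -> np.ndarray:
    """Map (movement goal, stump orientation) to distal joint angles (radians)."""
    x = np.concatenate([goal, stump])
    h = np.tanh(x @ W1 + b1)   # single hidden layer, tanh activation
    return h @ W2 + b2         # linear output: predicted distal joint angles

# Example control tick: goal from gaze-guided object detection (placeholder values),
# stump orientation from a sensor on the residual limb (placeholder values).
goal = np.array([0.4, -0.1, 0.2, 1.0, 0.0, 0.0, 0.0])
stump = np.array([1.0, 0.0, 0.0, 0.0])
print(predict_distal_joints(goal, stump))
```

In such a scheme, the predicted distal joint angles would be sent at each control tick to the robot's (or prosthesis') distal joints, while the proximal joints continue to follow the user's own shoulder and stump motion.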