To deliver on the promise of personalized augmented reality (AR) scaffolding mechanisms that adapt in real time to a user's demonstrated proficiency, we must first be able to assess individual skill levels during task performance. This study introduces a novel data-driven approach for the multidimensional assessment of expertise in AR-guided psychomotor tasks. We evaluate users' performance against standard benchmarks by applying pairwise correlation and principal component analysis (PCA) to multimodal data collected from AR devices and wearable sensors, including hand tracking, galvanic skin response (GSR), and gaze information. The results demonstrate that objective measures, such as visual scanning efficiency, stress indicators, inspection accuracy, and hand dexterity, provide valuable insight into multiple indicators of skill acquisition. These findings validate the potential for AR systems to adapt dynamically to users' expertise levels based on observable data. This approach offers a promising direction for enhancing the effectiveness of AR-based training systems in high-stakes fields such as manufacturing, aviation, and surgery, where precise skill acquisition is critical.
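The analysis pipeline named above, pairwise correlation followed by PCA over multimodal features, can be sketched as follows. This is a minimal illustration under assumed inputs: the feature names and the synthetic trial matrix are hypothetical stand-ins, not the study's actual sensor data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical multimodal features per trial (rows = trials, cols = features),
# e.g. gaze fixation duration, GSR peak count, hand path length, inspection errors.
features = ["gaze_fix_dur", "gsr_peaks", "hand_path_len", "inspect_err"]
X = rng.normal(size=(50, len(features)))  # synthetic placeholder data

# Pairwise correlation across the feature columns.
R = np.corrcoef(X, rowvar=False)  # 4x4 symmetric correlation matrix

# PCA via eigendecomposition of the correlation matrix
# (equivalent to PCA on standardized features).
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]            # sort components by variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
explained = eigvals / eigvals.sum()          # variance explained per component

# Project standardized trials onto the principal components.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
scores = Z @ eigvecs                         # per-trial component scores
```

In practice, the leading components summarize which correlated measures (e.g. scanning efficiency and stress indicators) co-vary with demonstrated proficiency, and per-trial scores can feed an adaptive scaffolding policy.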