Background: Thoracolumbar flexion–extension range of motion (FE ROM) in horses is difficult to assess reliably by subjective evaluation. A handheld, smartphone-based markerless computer vision system (Anonymised, RH) may enable objective field-based assessment but requires validation against an established optical motion capture reference system (Qualisys®; QS). Objectives: To compare the accuracy and precision of RH relative to QS for measuring FE ROM in horses trotting on a straight line and on a circle. Study Design: Cross-sectional comparative validation study of a markerless computer vision algorithm. Methods: Fifty-nine horses were recorded trotting on a straight line; 23 of these were also recorded on a circle. Data were collected simultaneously with RH and QS, using a marker light for temporal synchronisation. Anatomical landmarks at the withers, mid-back, and croup were used to calculate the flexion–extension angle at the mid-back. FE ROM was derived at stride level as the difference between stride-specific maxima and minima. Agreement between RH and QS was analysed separately for straight-line and circular trot at stride and trial levels using mean signed error (MSE), mean absolute error (MAE), and Bland–Altman limits of agreement (LoA). Results: On the straight line, stride-level MSE was −0.08°, MAE 0.96°, and LoA −2.49° to 2.32°. Trial-level agreement improved: MSE −0.13°, MAE 0.44°, and LoA −1.22° to 0.94°. On the circle, stride-level MSE was −0.71°, MAE 1.14°, and LoA −3.17° to 1.74°. At trial level, variability was reduced (MSE −0.62°, MAE 0.78°, LoA −2.18° to 0.93°). Overall agreement was lower on the circle than on the straight line. Conclusions: Stride-level agreement between RH and QS was influenced by expected stride-to-stride variability, particularly during circular exercise, whereas trial-level agreement improved. RH enables objective assessment of FE ROM under field conditions.
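As a minimal sketch of the quantities described in the Methods, the flexion–extension angle at the mid-back can be taken as the interior angle at the mid-back landmark formed by the withers and croup landmarks, with stride-level FE ROM the max-minus-min of that angle within each stride. The landmark inputs, array layout, and the interior-angle convention are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def fe_angle(withers, midback, croup):
    """Interior angle (degrees) at the mid-back, formed by withers and croup landmarks."""
    u = np.asarray(withers, float) - np.asarray(midback, float)
    v = np.asarray(croup, float) - np.asarray(midback, float)
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

def stride_fe_rom(angles, stride_bounds):
    """FE ROM per stride: stride-specific maximum minus minimum of the angle signal."""
    return [max(angles[a:b]) - min(angles[a:b]) for a, b in stride_bounds]
```

With per-frame angles and hypothetical stride boundaries, `stride_fe_rom` yields one ROM value per stride, which can then be averaged for the trial-level comparison.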

Karsten Thuren Key and 4 more authors

Background: A handheld, smartphone-based computer vision algorithm (Anonymised (RH)) offers an accessible alternative for equine gait analysis but requires validation against a gold-standard multicamera optical motion capture system (Qualisys® (QS)). Objectives: To evaluate the accuracy and precision of RH in measuring vertical displacement signals (VDS) at the eye, withers, back, and croup in horses trotting on a straight line and on a circle. Study Design: Cross-sectional comparative validation study of a markerless computer vision algorithm. Methods: Fifty-nine horses were recorded while trotting on a straight line and 24 were lunged on a circle. RH detected two-dimensional anatomical keypoints on each frame, which were used to estimate a dynamic groundline and compute ground-relative VDS with stride-based differences in maxima (Maxdiff) and minima (Mindiff). QS provided synchronous reference VDS values. Agreement was evaluated using mean signed error (MSE), mean absolute error (MAE), and Bland–Altman analysis. Results: On the straight line (n = 2620 strides), the pooled stride-level MAE for Maxdiff and Mindiff was 3.8 mm. Keypoint-specific errors were 5.1 mm (eye), 4.3 mm (withers), and 3.0 mm (croup). On the circle (n = 2419 strides), the pooled stride-level error increased to 5.5 mm. Trial-level analysis (n = 58 straight trials) showed much lower errors: 1.4 mm for both the eye and withers and 1.1 mm for the croup. On the circle (n = 24 trials), trial-level errors were higher: 2.8 mm for the eye, 1.8 mm for the withers, and 3.3 mm for the croup. The back keypoint consistently showed the lowest errors at both stride and trial levels. Main Limitations: Croup Mindiff detection during circle trials had the highest error and bias. Conclusions: RH measured vertical displacement of the eye, withers, back, and croup with higher accuracy and precision than previously reported markerless systems, supporting its use for equine gait analysis.
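The agreement statistics used across these studies (mean signed error, mean absolute error, and Bland–Altman 95% limits of agreement, i.e. mean difference ± 1.96 SD) can be sketched as follows; variable names are illustrative and not taken from the authors' analysis code.

```python
import numpy as np

def agreement(test, reference):
    """MSE (bias), MAE, and Bland-Altman 95% limits of agreement for paired measurements."""
    d = np.asarray(test, float) - np.asarray(reference, float)
    mse = d.mean()              # mean signed error (systematic bias)
    mae = np.abs(d).mean()      # mean absolute error
    sd = d.std(ddof=1)          # sample SD of the differences
    loa = (mse - 1.96 * sd, mse + 1.96 * sd)
    return mse, mae, loa
```

Applied per stride this gives the stride-level agreement; averaging each trial's strides first and then comparing the trial means gives the tighter trial-level agreement reported above.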

Karsten Thuren Key and 3 more authors

Background: Objective gait analysis is valuable in diagnosing and managing equine lameness. Computer vision-based algorithms offer accessible alternatives for equine gait analysis but require thorough assessment of accuracy and precision under diverse conditions. Objectives: To evaluate a proprietary vision-based algorithm’s accuracy and precision in measuring vertical displacement signals (VDS) at the eye, withers, and croup, alongside groundline estimation, for horses trotting on straight lines and circles under field conditions. Study Design: Experimental comparative study against manually annotated references. Methods: We obtained 67 recordings from 37 horses. A vision-based algorithm and, independently, manual annotation produced 2D anatomical keypoints on all frames of the recordings, which were processed to estimate a groundline and compute VDS with stride-based vertical differences in maxima (Maxdiff) and minima (Mindiff). No stride exclusions were applied. Mean signed error (MSE), mean absolute error (MAE), and Bland–Altman plots were used to compare detected and annotated data. Results: At the stride level (n = 1556), the overall MAE for both Maxdiff and Mindiff was 4.3 mm. The eye keypoint exhibited the lowest errors (2.9 mm Maxdiff, 3.0 mm Mindiff), the withers error was 5.5 mm for both Maxdiff and Mindiff, and the croup showed 4.3 mm (Maxdiff) and 4.4 mm (Mindiff). Trial-level analysis (n = 67), despite a below-optimal number of strides per trial in this study, revealed lower overall absolute differences (eye: 2.3 mm, withers: 3.7 mm, croup: 2.7 mm), indicating consistent performance across multiple strides. Eye keypoint accuracy was higher on circles than on straight lines, whereas the withers and croup performed comparably under both conditions. Main Limitations: Groundline estimation accuracy was stress-tested on treadmill data in a separate study. Further clinical comparison with established gait analysis systems is recommended.
Conclusions: The algorithm robustly measured vertical displacements under varied conditions, supporting clinical and field-based utility.
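The stride-based Maxdiff and Mindiff parameters compare the two displacement peaks (and troughs) that a trotting horse produces within one stride, one per half-stride. A minimal sketch, assuming stride boundaries are given and each stride is split at its midpoint into half-strides (the actual segmentation method is not specified here):

```python
import numpy as np

def max_min_diff(vds, stride_bounds):
    """Per-stride Maxdiff/Mindiff: difference between the maxima (and minima)
    of the two half-strides of a vertical displacement signal."""
    maxdiff, mindiff = [], []
    for a, b in stride_bounds:
        m = (a + b) // 2  # assumed midpoint split into half-strides
        h1, h2 = np.asarray(vds[a:m], float), np.asarray(vds[m:b], float)
        maxdiff.append(float(h1.max() - h2.max()))
        mindiff.append(float(h1.min() - h2.min()))
    return maxdiff, mindiff
```

Comparing these per-stride values between the algorithm's keypoints and the manual annotations yields the stride-level errors reported above.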

Karsten Thuren Key and 4 more authors

Background: Equine lameness diagnosis largely relies on subjective visual assessment, which can be biased. Although marker-based methods, force plates, and inertial measurement units (IMUs) provide objective measurements, they require specialised setups. Vision-based algorithms offer a portable, markerless alternative, but their accuracy needs thorough testing. Objectives: To evaluate a custom vision-based algorithm for estimating the groundline across multiple camera angles, including handheld use, in horses trotting on a treadmill. Study Design: Experimental comparative study. Methods: Eight Standardbred trotter mares were recorded trotting on a high-speed treadmill using seven iPhones positioned at various heights and angles, including a handheld device. A trained deep neural network placed 2D keypoints on each video frame. Vertical displacement signals (VDS) for the eye, withers, and croup were computed relative to either an algorithm-estimated or a fixed treadmill groundline. Maximum (Maxdiff) and minimum (Mindiff) stride values were compared using Bland–Altman analysis, scatter plots, and histograms. The effect of handheld use on variability and accuracy was assessed by comparing results from a handheld camera with those from a static camera. Results: Groundline estimation closely matched the fixed reference, exhibiting near-zero mean angle error and low mean absolute error (MAE = 0.45°; n = 242,192). Stride-level (n = 36,981) Maxdiff and Mindiff MAE were 0.5 mm, with clinically acceptable additional variability introduced by handheld use at the trial level (Maxdiff and Mindiff MAE < 1.8 mm; n = 357). Main Limitations: Treadmill-based data and a single breed/coat colour may limit generalisability to other settings. Conclusions: The vision-based algorithm accurately estimates the groundline and stride VDS parameters from various camera setups, including handheld.
Further validation in diverse environments and against other objective gait analysis systems is recommended.
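The groundline evaluation above can be sketched as a straight line fitted to ground-contact keypoints, with body-keypoint VDS measured as the signed vertical offset from that line and the estimation error expressed as the angle between estimated and reference slopes. The line-fitting choice (least squares) and all function names are illustrative assumptions, not the algorithm's internals.

```python
import numpy as np

def fit_groundline(xs, ys):
    """Least-squares line y = m*x + c through ground-contact keypoints."""
    m, c = np.polyfit(xs, ys, 1)
    return m, c

def vertical_to_groundline(x, y, m, c):
    """Signed vertical offset of a keypoint from the groundline; the sign
    convention depends on whether image y increases upward or downward."""
    return (m * x + c) - y

def angle_error_deg(m_est, m_ref):
    """Angle (degrees) between estimated and reference groundline slopes."""
    return abs(np.degrees(np.arctan(m_est) - np.arctan(m_ref)))
```

Evaluating `angle_error_deg` frame by frame against the fixed treadmill line gives a per-frame angle error of the kind summarised by the MAE = 0.45° result.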