1. Introduction
From the very beginning of ski jumping in the 19th century until now, the ski jumping technique has undergone various developments. These include, amongst others, the Kongsberger technique, in which the upper body is bent and the arms are extended over the head. Later, the arms were brought back next to the upper body. During all these changes, the skis remained parallel. The last major change was the introduction of the V-style in 1985, in which the skis are not parallel but form a V [1,2]. Nowadays, this technique is used by almost all athletes. All these developments resulted in larger jumping distances. In the past, new jumping techniques were generally found by coincidence or trial and error. In contrast, today, researchers work hard to find the optimal flight technique [3,4,5,6] and study the biomechanics of ski jumping [7,8].
Therefore, sports scientists need reliable and automatic tracking systems. Various studies with different measurement systems have been conducted to investigate, amongst other things, the jumping distance [9], landing momentum [10], ground reaction forces [11], and ski position [12].
These studies incorporate the use of inertial measurement units (IMUs) [13,14], force insoles, or differential Global Navigation Satellite System (dGNSS) measurements [15]. The latter has the disadvantage of being obtrusive, since antennas on the helmet and a backpack have to be attached to the ski jumpers. Additionally, camera-based tracking methods are used [5,7,16,17,18,19,20,21], which are unobtrusive but cover only part of the jumping hill unless many cameras are combined. Furthermore, the post-processing of video data has high computational costs and does not work in bad weather conditions.
The evaluated system uses ultra-wideband (UWB) radio communication and ranging in combination with IMUs to track the position of the athletes continuously during the flight. UWB has been used successfully in tracking applications in various sports, including indoor sports, such as ice hockey [22] and handball [23], and outdoor sports, such as soccer [24].
In previous studies, ski jumping parameters were obtained offline. For a later application during training sessions or TV broadcasting, a real-time system is required. For example, the speed during the jump can be shown live in the TV broadcast, or jumping trajectories can be compared directly using a 3D visualization. For a live application, wireless transmission of the obtained data is required. This also has several advantages for athletes and coaches: handling is easier, and the athletes are not interrupted in their focus, which is crucial in competitions and training due to the high risk associated with ski jumping.
Within this work, we investigate the accuracy of a wearable tracking system measuring several ski jumping parameters. The tracking system is unobtrusive, which is essential for use in competitions so as not to distract the athletes during the crucial flight phase. The parameters include the jumping distance, in-flight positions, and in-flight orientation of the skis. All parameters are determined and transmitted in real-time. The wearable real-time tracking system (WRRTS) brings the lab to the field. In contrast to previously proposed systems, the evaluated system is real-time capable. Another advantage over previous systems is the wireless data transmission, which requires no interaction with the athlete during training or competition and does not disturb their focus.
The main objective of this study is the systematic validation of a wearable real-time tracking system for ski jumping. Prior studies have only focused on single aspects of tracking ski jumps. The investigated system provides multiple parameters obtained and transmitted in real-time, making it well suited for further research on the kinetics and kinematics of ski jumping. Therefore, the accuracy of the tracking system needs to be investigated so that sports scientists can derive valuable insights.
The remainder of the paper is structured as follows. Section 2 introduces the study procedure with the WRRTS as well as the reference measurement systems. Section 3 presents the results of the comparison of the WRRTS with its respective reference systems. In Section 4, the results of the validation study are discussed. Finally, in Section 5, the conclusions of this work are presented.
2. Materials and Methods
In this section, we introduce the procedure of the data acquisition and the different measuring methods with their respective setup at the ski jumping venue. Furthermore, the evaluation of the study data is presented.
2.1. Measurement Systems
The data of the jumps were acquired using the WRRTS, a camera system operated by ccc software GmbH, the total station tracking system QDaedalus [25,26], and the official video-based jumping distance measurement. All measurement systems are described in more detail in the following.
2.1.1. Wearable Real-Time Tracking System (WRRTS)
We use the tracking system that was developed based on the previous work of Groh et al. [9,10] in combination with UWB-based tracking capabilities.
The WRRTS consists of two main components. The first main component of the tracking system comprises antennas with fixed positions along the jumping hill.
Figure 1 shows the positions of the antennas along the ski jumping facility as well as the positions of the reference measurement devices.
The other main component is the mobile trackers that are attached to the skis of the athletes. In cooperation with the ski binding manufacturer Slatnar, the trackers were designed for easy attachment to the bindings to facilitate their use. An example of the attachment is shown in Figure 2.
The tracking system uses ultra-wideband radio technology and microelectromechanical inertial measurement units. This combination allows the continuous measurement of ski orientation, acceleration, velocity, and position of both skis during the complete jump in real-time. The internal update rate of the inertial measurement units is 1000 Hz, and that of the ultra-wideband radio technology as well as the transmitted output data is 20 Hz.
The measurements are acquired in a local Cartesian coordinate system centered at the middle of the edge of the jump-off platform, as shown in Figure 1.
Furthermore, a 3D scan of the landing hill is acquired using a total station. A total station is a theodolite with integrated distance measurement, thus measuring the vertical and horizontal angles and distance to an aimed point. This is used to obtain a mapping from the local Cartesian coordinate system to measurements with respect to the landing hill (e.g., height over ground, jumping distance).
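As an illustration of such a mapping (our own sketch, not the system's implementation), a scanned hill profile given as hypothetical (distance, height) pairs in the local coordinate system can be interpolated to obtain the height of a measured position over the ground:

```python
import numpy as np

# Hypothetical hill profile from the total station scan:
# each row is (x along the hill, terrain height z) in the local coordinate system, in metres.
hill_profile = np.array([
    [0.0, 0.0], [10.0, -3.5], [20.0, -8.0], [40.0, -18.5], [60.0, -30.0],
])

def height_over_ground(x: float, z: float) -> float:
    """Height of a measured point (x, z) above the interpolated hill surface."""
    z_ground = np.interp(x, hill_profile[:, 0], hill_profile[:, 1])
    return z - z_ground

# Example: a tracker position 2 m above the terrain at x = 30 m.
print(height_over_ground(30.0, -11.25))  # ~2.0 with the profile above
```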
2.1.2. Official Distance Measurements
The jumping distance of the recorded jumps was determined with the official video-based measurement system that has been used in Fédération Internationale de Ski (FIS) competitions for over 25 years. The system operator was certified by FIS for video distance measurements. To determine the jumping distance, the system operator identifies the first camera frame in which both skis are flat on the landing hill. Through the camera calibration, the respective jumping distance is determined and rounded down to the official measurement resolution. An example of the determination of the official jumping distance is shown in Figure 3.
2.1.3. Camera Measurements (ccc Software GmbH)
In the field of sports software, ccc develops video analysis systems for training optimization and competition control, as well as platforms for the storage of training and competition data [27].
ccc provided camera-based measurements of the ski jumps, which were used as a reference for the validation study. To this end, multiple fixed cameras were installed next to the jump-off platform and next to the landing hill at 8 m, 18 m, 30 m, 45 m, and 60 m after the jump-off platform. Additionally, a pan–tilt–zoom (PTZ) camera was mounted at the upper end of the in-run. Due to a defective calibration, the deviation of the WRRTS in the Y-coordinate could not be investigated using the PTZ camera. The position of the PTZ camera was measured using the total station, enabling the projection of the V-opening angle onto the PTZ camera image plane to be determined. The camera at 30 m had a faulty calibration and therefore could not be used. Figure 1 shows the positions of all installed cameras.
For assessing the WRRTS measurements using the ccc videos, two different approaches were taken.
The first one was the projection of the 3D positions of the WRRTS into the 2D image plane using intrinsic and extrinsic camera calibration parameters. This approach does not provide a meaningful quantification of the measurement errors (only in pixels, not in meters). However, it provides an intuitive means for visually assessing the data quality since the WRRTS measurements can be seen directly in the image.
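As an illustration of this first approach, the following sketch (our own, not the authors' implementation) projects a 3D WRRTS position into pixel coordinates using a pinhole camera model; K, R, and t stand for assumed intrinsic and extrinsic calibration parameters, and lens distortion is ignored:

```python
import numpy as np

def project_to_image(p_world, K, R, t):
    """Project a 3D point (world frame) to pixel coordinates with a pinhole model.
    K: 3x3 intrinsic matrix, R: 3x3 world-to-camera rotation, t: 3-vector translation."""
    p_cam = R @ p_world + t          # transform into the camera frame
    uvw = K @ p_cam                  # apply intrinsics
    return uvw[:2] / uvw[2]          # perspective division -> (u, v) in pixels
```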
The second approach is the comparison of the video data with the WRRTS data in 3D. Since the jumper is only visible in one camera image at a time during a jump, full stereoscopic measurements are not possible. However, it is possible to associate every pixel coordinate in the camera image with a 3D vector that is formed by connecting the camera position (i.e., its optical center) with the corresponding point in the image plane. The deviation of the WRRTS positions from this 3D vector can then be determined. Based on this geometric model, comparisons of the ski orientation can also be performed. It should be noted that due to the two-dimensional nature of the camera images, only deviations in the image plane are captured by this method.
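The second approach can be sketched as follows (our own illustration under the same assumed calibration parameters): a pixel coordinate is converted into a viewing ray from the camera center, and the perpendicular distance of a WRRTS position from that ray is computed.

```python
import numpy as np

def pixel_to_ray(u, v, K, R, C):
    """Return the camera center C and a unit direction d of the viewing ray
    through pixel (u, v), given intrinsics K and world-to-camera rotation R."""
    d_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])   # direction in camera frame
    d_world = R.T @ d_cam                              # rotate into the world frame
    return C, d_world / np.linalg.norm(d_world)

def distance_point_to_ray(p, C, d):
    """Perpendicular distance of point p from the ray C + t*d (t >= 0)."""
    t = np.dot(p - C, d)
    closest = C + max(t, 0.0) * d
    return np.linalg.norm(p - closest)
```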
To compare the WRRTS position in 3D with the camera vectors, the trackers and skis are manually labeled in the videos. An example of the manual labeling of the skis and trackers for one of the cameras next to the landing hill is shown in Figure 4. Additionally, Figure 5 shows the manual labeling of the V-angle in the PTZ camera.
For the comparison with the camera measurements, we determine the point of the interpolated WRRTS trajectory with the shortest distance to the camera vector and use the time instant of this point for the position and angle comparison. This procedure is described in more detail in Section 3.
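A minimal sketch of this closest-approach search (our own illustration; the sampling rate and variable names are assumptions): the WRRTS trajectory is linearly interpolated on a fine time grid, and the sample with the smallest distance to the camera ray is selected.

```python
import numpy as np

def closest_approach_time(t, pos, C, d, upsample_hz=1000):
    """Time at which the interpolated trajectory (t [s], pos Nx3 [m]) comes
    closest to the camera ray C + s*d, with d a unit direction vector."""
    t_fine = np.arange(t[0], t[-1], 1.0 / upsample_hz)
    p_fine = np.column_stack([np.interp(t_fine, t, pos[:, k]) for k in range(3)])
    s = (p_fine - C) @ d                        # projection onto the ray
    foot = C + s[:, None] * d                   # closest point on the ray
    dist = np.linalg.norm(p_fine - foot, axis=1)
    i = np.argmin(dist)
    return t_fine[i], dist[i]
```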
2.1.4. Total Station Tracking (QDaedalus System)
Since the camera measurements did not provide full 3D positions of the skis, the 3D tracking system QDaedalus was additionally used for validating the three-dimensional position. QDaedalus is a measurement system developed at the Geodesy and Geodynamics Lab at ETH Zurich and consists of a combination of total stations and CCD cameras that allows the accurate triangulation of objects in 3D. For this study, we used Leica TCA 1205 total stations in the QDaedalus system. The positions of the QDaedalus stations are shown in Figure 1. The raw position data measured with the QDaedalus system were filtered and interpolated using a least-squares collocation. The trajectories measured with QDaedalus were registered to the same coordinate system as the WRRTS measurements (see Figure 1) and compared to the WRRTS data regarding 3D positions.
2.2. Venue
The data acquisition took place at the hill size 100 ski jumping hill in Oberhof (Germany).
Foggy weather conditions impeded data acquisition on the first day, in contrast to the second day. This affected some of the camera measurements, since the manual labeling requires a clear view of the wearable tracker and skis. The QDaedalus measurements were also affected, since the measurement principle requires inter-visibility between the total stations and the athletes.
In total, data of 35 jumps were collected over two days. Four athletes participated in the data acquisition (three male, one female). One of the athletes was part of the German A National Team, and three were part of the German B National Team. Measurements of the jumps were acquired using the WRRTS and the reference measurement systems described above.
2.3. Evaluation
In this subsection, the synchronization of the WRRTS and the camera measurements as well as the procedure for the position and angle comparison are described. Additionally, the statistical analysis is presented.
2.3.1. Synchronization
The WRRTS and the camera system used for evaluation do not share a common synchronized time base. Therefore, the measurements were synchronized in retrospect. Different synchronization procedures were used for the PTZ camera and for the cameras at the side of the jumping hill.
To synchronize the PTZ camera with the WRRTS measurements, we applied a time offset correction. The internal time of the WRRTS was set to zero when passing the edge of the jump-off platform (origin of the local coordinate system). Therefore, by manually labeling the frame in the camera image where the tracker passed the edge of the jump-off platform, we determined the time offset of the PTZ camera. Using this offset, we synchronized both measurement systems for each jump individually.
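As a simple illustration of this offset correction (our own sketch; frame rate and frame index are hypothetical):

```python
# Camera time of the manually labeled frame where the tracker passes the
# take-off edge; the WRRTS clock is zero at this instant by definition.
fps = 50.0                      # hypothetical camera frame rate
edge_frame = 1234               # hypothetical labeled frame index
offset = edge_frame / fps       # camera time corresponding to WRRTS time zero

def camera_to_wrrts_time(t_camera):
    """Convert a camera timestamp (s) to the WRRTS time base (s)."""
    return t_camera - offset
```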
To synchronize the lateral cameras at the side of the jumping hill with the WRRTS, a different approach had to be used. Due to their limited field of view, the edge of the jump-off platform is not visible and therefore cannot be used as a reference point for synchronization. Instead, we take an alternative approach. First, we project the manually annotated pixel coordinates of the tracker into 3D using the camera calibration. Then, we linearly interpolate the measurements of the WRRTS and determine the time offset that minimizes the distance between the interpolated WRRTS measurements and the line of sight between the camera position and the projected annotation. This time offset is then used to align the WRRTS and camera measurements.
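One way this offset search could be implemented is sketched below (our own code; the candidate offset grid and function names are assumptions): for each candidate offset, the annotated frame times are shifted, the WRRTS trajectory is interpolated at the shifted times, and the mean point-to-ray distance is evaluated.

```python
import numpy as np

def find_time_offset(t_wrrts, pos_wrrts, t_frames, rays, offsets):
    """Return the offset (s) minimizing the mean distance between the
    interpolated WRRTS trajectory and the camera lines of sight.
    rays: list of (C, d) with camera center C and unit direction d per frame."""
    best_offset, best_err = None, np.inf
    for dt in offsets:
        t_shifted = t_frames + dt
        p = np.column_stack([np.interp(t_shifted, t_wrrts, pos_wrrts[:, k])
                             for k in range(3)])
        err = 0.0
        for (C, d), pi in zip(rays, p):
            s = np.dot(pi - C, d)
            err += np.linalg.norm(pi - (C + s * d))
        err /= len(rays)
        if err < best_err:
            best_offset, best_err = dt, err
    return best_offset

# Example: search offsets between -2 s and 2 s in 1 ms steps.
# offsets = np.arange(-2.0, 2.0, 0.001)
```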
For the comparison of the X- and Z-coordinates with the camera measurements, we project the manual annotation into 3D using the Y-coordinate of the synchronized WRRTS measurement.
Since the WRRTS and QDaedalus measurements do not share a synchronized time base either, their temporal relation also had to be determined in retrospect. Therefore, the trajectories of both systems are up-sampled via linear interpolation. The up-sampled trajectories are then shifted in time to find the best temporal agreement, using the mean absolute error of the overlapping trajectories as the measure of agreement.
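The following sketch shows one possible implementation of such an alignment (our own illustration; the up-sampling rate and the integer-shift search are assumptions): both trajectories are up-sampled to a common rate, and the sample shift with the lowest MAE over the overlap is selected.

```python
import numpy as np

def upsample(t, x, rate_hz):
    """Linearly interpolate a trajectory (t [s], x Nx3) onto a uniform time grid."""
    t_new = np.arange(t[0], t[-1], 1.0 / rate_hz)
    return t_new, np.column_stack([np.interp(t_new, t, x[:, k]) for k in range(3)])

def best_shift(a, b, max_shift):
    """Integer sample shift of b relative to a that minimizes the MAE
    over the overlapping part of both up-sampled trajectories."""
    best, best_mae = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        ov_a = a[max(s, 0):]
        ov_b = b[max(-s, 0):]
        n = min(len(ov_a), len(ov_b))
        if n == 0:
            continue
        mae = np.mean(np.abs(ov_a[:n] - ov_b[:n]))
        if mae < best_mae:
            best, best_mae = s, mae
    return best, best_mae
```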
2.3.2. Statistical Analysis
The results of the WRRTS and the respective reference system are statistically analyzed using different error measures. These are briefly explained in this subsection.
Bland and Altman introduced a graphical approach to compare two measurement methods [28]. It consists of a scatter plot of the difference between paired measurements against their mean. The mean of the two methods is used since both methods have a measurement error and the true value is not known; therefore, the mean of the paired measurements is the best estimate of the true value. Plotting the difference against the mean makes it possible to see whether the measurement error depends on the magnitude of the measurement.
Additionally, the mean difference $\bar{d}$, as well as the upper and lower limits of agreement (LoA), are plotted as horizontal lines. The limits show the range in which 95% of the differences are located. The limits of agreement are calculated as $\bar{d} \pm 1.96\sigma$, where $\sigma$ is the standard deviation of the differences.
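A minimal sketch of how such a Bland–Altman plot could be produced with matplotlib (our own illustration; the paired example values are made up and not from the study):

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical paired measurements of the same quantity (e.g., jumping distance in m).
wrrts = np.array([92.3, 95.1, 101.4, 88.0, 97.2])
reference = np.array([92.0, 95.5, 101.0, 88.5, 97.0])

diff = wrrts - reference
mean = (wrrts + reference) / 2.0
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)          # half-width of the 95% limits of agreement

plt.scatter(mean, diff)
plt.axhline(bias, color="k", label="mean difference")
plt.axhline(bias + loa, color="k", linestyle="--", label="upper LoA")
plt.axhline(bias - loa, color="k", linestyle="--", label="lower LoA")
plt.xlabel("Mean of WRRTS and reference (m)")
plt.ylabel("Difference WRRTS - reference (m)")
plt.legend()
plt.show()
```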
The assumption for the Bland–Altman plot is that the difference between the two methods is normally distributed. To test whether the assumption of a normal distribution is valid, we use a Kolmogorov–Smirnov test [29]. For a p-value ≥ 0.05, we accept the null hypothesis of a normal distribution. For a p-value < 0.05, the deviations from a normal distribution are significant, and we test for other distributions to calculate the limits of agreement.
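Such a normality check could look like the following sketch (our own code; standardizing the differences before comparing them against a standard normal distribution is an assumption about the exact test setup):

```python
import numpy as np
from scipy import stats

def is_normal(diff, alpha=0.05):
    """Kolmogorov-Smirnov test of the standardized differences against N(0, 1)."""
    diff = np.asarray(diff)
    z = (diff - diff.mean()) / diff.std(ddof=1)
    _, p_value = stats.kstest(z, "norm")
    return p_value >= alpha, p_value
```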
Apart from the mean, we also investigate the standard error of the mean (SEM). It is defined as
$$\mathrm{SEM} = \frac{\sigma}{\sqrt{N}},$$
where $N$ is the number of compared measurements and $\sigma$ the standard deviation. The SEM gives an estimate of how far the mean of the sampled data differs from the mean of the whole population.
The deviation is summarized in terms of the mean absolute error (MAE), which is calculated as
$$\mathrm{MAE} = \frac{1}{N} \sum_{i=1}^{N} \left| x_i - \hat{x}_i \right|,$$
where $N$ is the number of compared measurements, $x_i$ is the $i$-th measurement obtained with the WRRTS (e.g., jumping distance), and $\hat{x}_i$ is the $i$-th measurement obtained with the corresponding reference measurement system. The MAE is an easy-to-interpret measure combining the bias and precision of a distribution. Thus, we use it as the figure of merit for characterizing the accuracy of the WRRTS.
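For completeness, a short sketch (our own code, hypothetical array names) of how the mean difference, SEM, and MAE can be computed from paired measurements:

```python
import numpy as np

def error_measures(wrrts, reference):
    """Return mean difference, SEM of the difference, and MAE for paired data."""
    diff = np.asarray(wrrts) - np.asarray(reference)
    n = diff.size
    mean_diff = diff.mean()
    sem = diff.std(ddof=1) / np.sqrt(n)
    mae = np.mean(np.abs(diff))
    return mean_diff, sem, mae
```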
4. Discussion
4.1. Jumping Distance
The WRRTS determines the jumping distance in good agreement with the official video distance. Even though the accuracy of the WRRTS might not be sufficient for competitions, its usage is helpful for training, since no manual operator is needed to determine each jumping distance individually. It is important to mention that the true jumping distance is unknown due to the rounding during the determination of the official video distance. This rounding down of the official video distance introduces an MAE of its own with respect to the unknown true value. Nevertheless, the MAE introduced by rounding cannot simply be subtracted from the MAE of the difference between the WRRTS and the official video distance, as one might intuitively think.
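To illustrate why the rounding MAE cannot simply be subtracted, a small Monte Carlo sketch (our own illustration with made-up error magnitudes and a hypothetical rounding resolution) shows that absolute errors of independent error sources do not combine linearly:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
truth = rng.uniform(80.0, 110.0, n)            # hypothetical true distances (m)

wrrts = truth + rng.normal(0.0, 0.3, n)        # hypothetical WRRTS measurement error
resolution = 0.5                               # hypothetical rounding resolution (m)
official = np.floor(truth / resolution) * resolution   # rounded-down official value

mae_wrrts_vs_truth = np.mean(np.abs(wrrts - truth))
mae_rounding = np.mean(np.abs(official - truth))
mae_wrrts_vs_official = np.mean(np.abs(wrrts - official))

# mae_wrrts_vs_official - mae_rounding clearly differs from mae_wrrts_vs_truth.
print(mae_wrrts_vs_truth, mae_rounding, mae_wrrts_vs_official)
```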
The main advantage of the investigated system over previous studies [9] is the much higher accuracy; Groh et al. reported a substantially lower accuracy compared to a camera system. Additionally, the system has the advantage over the official jumping distance measurement of automating the process, so no FIS-certified operator is needed to manually label frames in the video. Therefore, the WRRTS is also faster than the official distance measurement.
4.2. Projection of 3D WRRTS Measurements into the 2D Image Plane
The projection of the 3D positions captured with the WRRTS showed good correspondence between the WRRTS measurements and the video material. The tracker positions of the jumpers in the videos are close to the projected WRRTS data, indicating that the WRRTS positions are plausible. Although this method does not allow a quantification of the measurement error in the local 3D coordinate system, it serves to provide a visual impression of the measurement quality in an intuitive way.
4.3. Comparison of 3D WRRTS Position Measurements with 3D Camera Vectors
The 3D in-flight positions of the WRRTS are compared to the QDaedalus and camera measurements. As seen in the corresponding Bland–Altman plots, the precision of the X- and Z-coordinates relative to the camera measurements worsens during the jump. Since the bias also varies, neither a log transformation nor an investigation of the relative error leads to differences that follow a normal distribution. Therefore, over the whole data range, the determined limits of agreement tend to be too wide, an effect also mentioned by Bland and Altman [28].
Calculating the MAE in 3D from the lateral camera measurements alone is not possible, since the missing PTZ camera calibration prevents determining the MAE of the Y-coordinate.
However, we calculate the MAE in 3D by combining the MAE of the X- and Z-coordinates determined with the camera measurements with the MAE of the Y-coordinate determined with the QDaedalus system. The resulting MAE in 3D is more than a factor of two better than the MAE observed with the QDaedalus system. Since the precision of the differences depends on the precision of both measurement methods, this discrepancy indicates that the QDaedalus tracking introduces this difference in the MAE.
In comparison with a recent study that uses dGNSS and achieves a higher accuracy [15], the advantage of the investigated system is that it is unobtrusive, in contrast to the backpack and helmet antennas required for the dGNSS.
Based on the 3D in-flight positions, the WRRTS also measures and transmits the in-flight velocity in real-time. Due to the accurately measured positions, we can assume that the velocity is also measured accurately, with its precision worsening with the distance to the jump-off platform.
4.4. Comparison of 3D WRRTS Angle Measurements with 3D Camera Vectors
The determination of the V-angle has a worse precision than the ski angles determined from the cameras next to the landing hill. One reason for this is that the V-angle is calculated from the orientation of both skis, each of which introduces a measurement error. Apart from this, due to the missing calibration of the PTZ camera, no 3D vectors could be used to measure the V-angles; instead, they were taken directly from the image. The angles measured by the WRRTS are projected onto the image plane, but effects such as radial distortion of the camera lens are not corrected, which may introduce a bias in the comparison of both methods. Another reason for the bias and worse precision may be the less accurate labeling of the PTZ camera images, caused by the bending of the skis in conjunction with the jumper appearing relatively small in the image (see Figure 5). In contrast, in the cameras at the side of the landing hill, the jumper appears large, and thus the unbent part of the skis can be identified more precisely (see Figure 4).
The investigated system clearly outperforms previous IMU-based studies, which achieve a lower angular precision for the lateral angle [13].
5. Conclusions
In this paper, we investigated the accuracy of a WRRTS for ski jumping. The system is based on UWB and IMUs and is capable of real-time data transmission. Overall, we see good correspondence for the investigated parameters.
Table 2 summarizes the findings of this study.
The WRRTS and the respective reference system can be used interchangeably, accepting the determined bias and precision of their difference. The bias and precision of the difference are an estimate of the upper limit of the true, unknown bias and precision of the WRRTS, since the reference system also introduces a measurement error.
Looking at the possible applications of such a tracking system, we see great potential for the live use of the WRRTS during TV broadcasting, since information such as live speed or trajectory comparisons might be fascinating for spectators. Furthermore, for use during training, the WRRTS shows high potential to support coaches and sports scientists in further improving the athletes' technique so that they can jump even farther.