Article

Combination of Sensor Data and Health Monitoring for Early Detection of Subclinical Ketosis in Dairy Cows

1 Linz Center of Mechatronics GmbH, 4040 Linz, Austria
2 Institute of Stochastics, Johannes Kepler University Linz, 4040 Linz, Austria
3 SMARTBOW GmbH, 4675 Weibern, Austria
4 Clinical Unit for Herd Health Management in Ruminants, University Clinic for Ruminants, Department for Farm Animals and Veterinary Public Health, University of Veterinary Medicine Vienna, 1210 Vienna, Austria
* Author to whom correspondence should be addressed.
Sensors 2020, 20(5), 1484; https://doi.org/10.3390/s20051484
Submission received: 31 January 2020 / Revised: 27 February 2020 / Accepted: 4 March 2020 / Published: 8 March 2020
(This article belongs to the Special Issue Sensors for Biomechanics Application)

Abstract

Subclinical ketosis is a metabolic disease occurring in early lactation. It contributes to economic losses through reduced milk yield and may promote the development of secondary diseases. Early detection is therefore desirable, as it enables the farmer to initiate countermeasures. To support early detection, we examine different types of data recordings and use them to build a flexible algorithm that predicts the occurrence of subclinical ketosis. This approach shows promising results and can be seen as a step toward automatic health monitoring in farm animals.

1. Introduction

Subclinical ketosis (SCK) is a common metabolic disease of dairy cows in early lactation, characterised by an increased concentration of ketone bodies in the absence of clinical signs of disease [1]. Analyzing the concentration of ß-hydroxybutyrate (BHB) in blood is the recommended reference test for detecting ketosis in dairy cows [2]. A commonly used threshold to define SCK is a blood BHB concentration >1.2 mmol/L [3,4]. To detect SCK in dairy cows, various hand-held devices are commercially available, which were recently evaluated for use on farms [5,6]. The occurrence of SCK in dairy cows is associated with an increased risk of sequelae (e.g., clinical ketosis, displaced abomasum, metritis), decreased milk yield and impaired reproductive performance [3,7,8], affecting the economics of a dairy farm [9]. Major risk factors for the occurrence of ketosis are an excessive body condition score (BCS) before calving, an increased colostrum volume at first milking and an advanced parity [10]. Recent studies showed that subclinical and clinical diseases are associated with distinct animal behaviours, e.g., with rumination as well as with standing and lying times [11,12]. Nowadays, more and more farmers rely on sophisticated sensor technologies for continuous and automated real-time monitoring of animal behaviours as well as of their health status [13,14]. The aim of this study was to predict the ketosis status of dairy cows within the first two weeks of lactation, based on 12 input variables, including, inter alia, data from the accelerometer system SMARTBOW (Smartbow GmbH, Weibern, Austria). The prediction is made using a flexible classification algorithm that combines time series-based acceleration data with other input and is specifically designed to cope with possibly differing data availability. In Section 2.1, Section 2.2 and Section 2.3, we discuss the different types of recorded data and how they were assessed. 
Moreover, we define the main elements of our proposed algorithm in Section 2.4 and Section 2.5. Section 3 contains the results of a statistical comparison of parts of the recorded data as well as the classification results. We conclude our work with a short summary and discussion in Section 4.

2. Materials and Methods

2.1. Animal Data and Sampling Procedures

Animal sampling and data collection were approved by the institutional ethics committee of the University of Veterinary Medicine Vienna, Austria (ETK-09/02/2016) as well as the Slovakian Regional Veterinary Food Administration. The study was conducted in 2016 and 2017 on a Slovakian dairy farm, housing approximately 2700 Holstein-Friesian cows. Animals were housed in ventilated freestall barns with group pens or cubicles, with rubber mats bedded with dried slurry separator material. To determine influences of barn climate and barn humidity on cows’ health status, climate loggers (Tinytag 2 Plus, Gemini Data Loggers Ltd., Chichester, West Sussex, UK) were installed in all groups. Temperature and humidity were automatically recorded and stored every hour. Animals were enrolled in the study at drying off, approximately 60 days prior to the expected calving date (D0), and followed up for at least 10 days, i.e., up to 10 days in milk (DIM). Blood samples were collected in weeks 8 (day −62 to day −56), 3 (day −21 to day −15), 2 (day −14 to day −8) and 1 (day −7 to day −1) before the expected calving date from a coccygeal vessel using vacuum tubes coated with a clot activator for serum collection (Vacuette, 9 mL, Greiner Bio-One GmbH, Kremsmünster, Austria). After clotting for a minimum of 30 min, samples were centrifuged (10 min, 18 °C, 3000× g; Eppendorf Centrifuge 5804, Eppendorf AG, Hamburg, Germany) to harvest serum. Serum was stored at −20 °C until further analysis at the Clinical Pathology Unit (CPU) of the University of Veterinary Medicine, Vienna, Austria. Samples collected in the week before parturition and at D0 were analyzed for non-esterified fatty acids (NEFA) at the CPU. At days 3, 5, and 8 of lactation, the BHB concentration was determined by use of an electronic hand-held device (FreeStyle Precision Xtra, Abbott GmbH and Co. KG, Wiesbaden, Germany), previously validated for dairy cows [6]. 
Animals showing BHB concentrations of >1.2 mmol/L were defined as suffering from subclinical ketosis and classified as ‘sick’. Body condition score (BCS) was visually estimated according to Edmonson et al. [15] and back fat thickness (BFT) was measured by an ultrasound device (Easy-Scan Linear, IMV Imaging, Meath, Ireland) as previously described by Schröder and Staufenbiel (2006) [16] in weeks 8 and 3 prior to calving and on D0.

2.2. Accelerometer

In this study, we used the accelerometer system SMARTBOW (Smartbow GmbH, Weibern, Austria), which was recently evaluated for monitoring of rumination [17,18] and detecting estrus events [19]. The system includes ear-tags (size 52 × 36 × 17 mm, weight 34 g) equipped with a three-dimensional accelerometer, receivers (SMARTBOW WallPoint) installed in the barn, and a local server (SMARTBOW Farmserver). Recorded data were sent from the ear-tags wirelessly and in real time to the receivers and transmitted to the local server, where data were processed by specific algorithms. In this study, 10 Hz sensors were used for measuring acceleration in three axes of head and ear movements of the animal, with a range from −2 to +2 g. All cows were equipped with the sensor-based ear-tags approximately three weeks before the expected day of calving. The raw recordings were transformed into seven data-streams, which we were provided with. We inspect two time frames: the week before the day of calving and the week after the day of calving. These seven data-streams represent the minutes per hour spent:
  • lying/not lying,
  • ruminating/not ruminating,
  • inactive/active/highly active.
The time spent in the states of each category above adds up to 60 min; thus, we converted the data into five-dimensional data-sets for every individual, namely minutes per hour spent lying, ruminating, inactive, active, and highly active.
An example of such 5 data-streams for both the pre-partal and post-partal time frame can be seen in Figure 1 below. Moreover, we present the average time spent in the respective behaviours for sick and healthy cows after calving in Figure 2.
For a thorough examination, a discussion of how to handle missing data, and some results based on the presented sensor data, we refer to [20]. A summary of the data can be found in Table 1.
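As an illustration, the conversion of the seven complementary streams into the five informative ones can be sketched as follows (a minimal sketch with hypothetical field names, not the SMARTBOW implementation):

```python
# Sketch: collapse the seven complementary minutes-per-hour streams into
# the five informative ones. Each hourly record is assumed to be a dict
# of the seven raw per-hour minute counts (field names are hypothetical).

def to_five_streams(hour):
    """Keep one stream per complementary pair; each category sums to 60 min."""
    assert hour["lying"] + hour["not_lying"] == 60
    assert hour["ruminating"] + hour["not_ruminating"] == 60
    assert hour["inactive"] + hour["active"] + hour["highly_active"] == 60
    return (hour["lying"], hour["ruminating"],
            hour["inactive"], hour["active"], hour["highly_active"])

example = {"lying": 35, "not_lying": 25,
           "ruminating": 28, "not_ruminating": 32,
           "inactive": 30, "active": 22, "highly_active": 8}
print(to_five_streams(example))  # (35, 28, 30, 22, 8)
```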

2.3. Health Data

The second type of data we consider is health data, either directly recorded on the farm site or based on previous calvings. Measurements that are either ordinal or nominal are transformed into metric features. The assessed features are described in detail in the following list:
  • Body Condition Score (BCS): A total of three measurements were made, 8 weeks before calving (−8 w), 3 weeks before calving (−3 w) and on the day of calving (D0):
    1. BCS −8 w
    2. BCS −3 w
    3. BCS D0
    The distribution of these three features is visualised in Figure 3.
  • Back Fat Thickness (BFT): As described above, three measurements were made:
    4. BFT −8 w
    5. BFT −3 w
    6. BFT D0
    The distribution of the BFT is visualised in Figure 4.
  • 7. Non-Esterified Fatty Acids (NEFA): We used the maximal NEFA value of all measurements as described in Section 2.1. This feature is visualised in Figure 5.
  • 8. 305-day Milk-Yield Equivalent: A measure that standardizes the milk yield of the previous lactation. Its impact on different diseases is discussed in [21].
  • 9. Maximum fat/protein ratio: The maximum observed fat/protein ratio during the previous lactation.
  • 10. Parity: We distinguished between primi- and multiparous cows and transformed these categories as follows: primiparous → −1, multiparous → 1.
    Features 8, 9 and 10 are depicted in Figure 6.
  • The following features are based on the locations where the animals spent their time in the last two weeks before calving. We distinguished between three functional areas, namely cubicles (FA 1), feed alley (FA 2), and passageways (FA 3). These nine features are depicted in Figure 7.
    11. Ratio of hours spent only in FA 1
    12. Ratio of hours where the animal spent more time in FA 3 than in FA 1
    13. Ratio of hours where the animal spent more time in FA 2 than in FA 1
    14. Mean time spent per hour in FA 1
    15. Mean time spent per hour in FA 2
    16. Mean time spent per hour in FA 3
    17. Standard deviation of time spent per hour in FA 1
    18. Standard deviation of time spent per hour in FA 2
    19. Standard deviation of time spent per hour in FA 3
  • 20. Heat stress: The number of hours in the last week before calving in which the animal was exposed to a temperature–humidity index (THI) of 72 or higher, where a THI ≥ 72 is defined by the Austrian Chamber of Agriculture as “moderate heat stress”, based on [22,23]. The distribution of this feature is shown in Figure 8.
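The heat-stress feature (feature 20) can be sketched as follows. The paper does not spell out its THI formula (it follows [22,23]); the sketch below substitutes the widely cited NRC-style approximation, which should be treated as an assumption:

```python
# Hedged sketch of feature 20: hours with THI >= 72 in the week before
# calving. The exact THI formula used in the study follows [22,23]; here
# we assume the common NRC-style approximation as a stand-in.

def thi(temp_c, rel_hum):
    """Temperature-humidity index from temperature (degC) and RH (%)."""
    return (1.8 * temp_c + 32) - (0.55 - 0.0055 * rel_hum) * (1.8 * temp_c - 26)

def heat_stress_hours(hourly_climate, threshold=72.0):
    """Count logger hours at or above the 'moderate heat-stress' threshold."""
    return sum(1 for t, rh in hourly_climate if thi(t, rh) >= threshold)

# Toy climate-logger week: 100 cool hours, 68 hot hours (168 h total).
week = [(18.0, 60.0)] * 100 + [(30.0, 70.0)] * 68
print(heat_stress_hours(week))  # 68
```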
The histograms above already give a first overview of the extent to which the respective features differ between healthy and diseased individuals. A statistical comparison of all 20 features is given in Section 3.1.

2.4. Mathematical Section

This section covers the mathematical and algorithmic aspects for classifying the animals’ health status, i.e., we define the central elements that constitute our classification algorithm. As we are provided with two different types of input, namely time series and features, we utilise methods for both types.
Time Series Classification (TSC) is a non-trivial task that is addressed in numerous publications. The state of the art in TSC consists of ensemble methods based on transformations of the original series and on flexible distance measures. Simple approaches, such as a Nearest Neighbour algorithm with a suitable similarity measure, still yield competitive results [24,25]. Deep learning approaches have shown promise but are still outperformed by distance-based methods [26]. As simple methods are desirable both for their simplicity and their competitive performance, we designed a flexible measure to quantify the similarity between two time series, which forms the basis for the first step in classification:
Definition 1 (Distance Matrix (DIMA)).
Let $a, b \in \mathbb{R}^n$, $f: \mathbb{R} \times \mathbb{R} \to \mathbb{R}$ and $g: \{1, \dots, n\}^2 \to \mathbb{R}$ with $n \in \mathbb{N}$. We define the function $D_1$ as

\[ D_1(a, b) := D_1(a, b;\, f, g, p) = \left( \sum_{i=1}^{n} \sum_{j=1}^{n} f(a_i, b_j)\, g(i, j) \right)^{1/p}. \tag{1} \]
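A direct transcription of Definition 1 into code is straightforward; the sketch below (not the authors' implementation) also shows the special case of Theorem 1, where a Kronecker-delta weighting recovers the Minkowski distance:

```python
# Minimal sketch of the DIMA function D_1 from Definition 1. f and g are
# free parameters; with f(x, y) = |x - y|^p and g the Kronecker delta,
# D_1 reduces to the n-dimensional Minkowski distance (Theorem 1).

def dima(a, b, f, g, p=1.0):
    n = len(a)
    total = sum(f(a[i], b[j]) * g(i, j) for i in range(n) for j in range(n))
    return total ** (1.0 / p)

# Special case p = 2: Euclidean distance between two short series.
p = 2.0
d = dima([1.0, 2.0, 3.0], [4.0, 6.0, 3.0],
         f=lambda x, y: abs(x - y) ** p,
         g=lambda i, j: 1.0 if i == j else 0.0,
         p=p)
print(d)  # sqrt(9 + 16 + 0) = 5.0
```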
Unfortunately, we can show that heavy restrictions have to be imposed on the parameters to guarantee metric properties:
Theorem 1.
Let $a = (a_1, \dots, a_n), b = (b_1, \dots, b_n) \in \mathbb{R}^n$ and let $\delta_{ij}$ be Kronecker's delta,

\[ \delta_{ij} := \begin{cases} 1 & \text{if } i = j \\ 0 & \text{if } i \neq j. \end{cases} \]

Consider the function $D_1(a, b; f, g, p)$ defined as in (1). With the choice

\[ f(a_i, b_j)\, g(i, j) := |a_i - b_j|^p\, \delta_{ij} = \begin{cases} |a_i - b_i|^p & \text{if } i = j \\ 0 & \text{if } i \neq j, \end{cases} \]

$D_1$ is a metric on $\mathbb{R}^n$ for $p \geq 1$ and is equivalent to the $n$-dimensional Minkowski distance [27]. More generally,

\[ g(i, j) := \delta_{ij} h_i \]

with weights $h_i$ that fulfil either $\forall h_i: h_i \in \mathbb{R}^+$ or $\forall h_i: h_i \in \mathbb{R}^-$ are the only possible choices of $g$ for which $D_1$ is a metric on $\mathbb{R}^n$.
Proof of Theorem 1.
See Appendix A. □
Although the metric properties are of interest from a mathematical point of view and are needed for some search speed-up algorithms [28], we can nevertheless utilise this function in a learning approach. As the general formulation of DIMA above allows for a variety of parameter settings, we assume that the function can be adjusted to serve as a central element for many other time series classification tasks. In our experiment, we decided on the following set of parameters:
\[ p = 1, \qquad f(a_i, b_j) = |a_i - b_j|. \]

The function $g(i, j)$ is constructed as follows. Given a training set of uni-variate data with two classes, healthy and sick, with $n_h$ and $n_s$ elements in the respective classes,

\[ X_h = \{x_{h,1}, x_{h,2}, \dots, x_{h,n_h}\} \quad \text{and} \quad X_s = \{x_{s,1}, x_{s,2}, \dots, x_{s,n_s}\} \]

with

\[ x_{h,i} := (x_{h,i,1}, \dots, x_{h,i,n}),\ i = 1, \dots, n_h \quad \text{and} \quad x_{s,i} := (x_{s,i,1}, \dots, x_{s,i,n}),\ i = 1, \dots, n_s, \]

we calculate the respective class means

\[ \bar{x}_{healthy} = \frac{1}{n_h} \left( \sum_{j=1}^{n_h} x_{h,j,1}, \dots, \sum_{j=1}^{n_h} x_{h,j,n} \right) \quad \text{and} \quad \bar{x}_{sick} = \frac{1}{n_s} \left( \sum_{j=1}^{n_s} x_{s,j,1}, \dots, \sum_{j=1}^{n_s} x_{s,j,n} \right) \]

and define the matrix $G$, where $I_{n \times n}$ denotes the $n \times n$ identity matrix:

\[ G_{ij} = \operatorname{sign}\Big( \min\big\{ |\bar{x}_{healthy,i} - \bar{x}_{sick,j}|,\; |\bar{x}_{sick,i} - \bar{x}_{healthy,j}| \big\} - \max\big\{ |\bar{x}_{healthy,i} - \bar{x}_{healthy,j}|,\; |\bar{x}_{sick,i} - \bar{x}_{sick,j}| \big\} \Big), \]

\[ G(\lambda) := \lambda I_{n \times n} + (1 - \lambda)\, G, \]

\[ \lambda \in \big\{ 2^{-i} \mid i \in \{0, \dots, 15\} \big\} \cup \big\{ 1 - 2^{-i} \mid i \in \{0, \dots, 15\} \big\}, \qquad g(i, j) := G(\lambda)_{ij}. \]
Please note that, in case of λ = 1 , D 1 reduces to the Manhattan Distance with our choices. In Figure 9 below, we see an exemplary visualisation of the matrix G ( 0 ) = G for our two considered time frames.
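The construction of $G$ and of the blend $G(\lambda)$ can be sketched in a few lines of pure Python. This follows one consistent reading of the definitions above ($\lambda = 1$ corresponding to the identity, i.e., the Manhattan case) and is not the authors' code:

```python
# Sketch of the data-driven weight matrix G and the blend G(lambda).
# Rows of X_h / X_s are uni-variate time series of equal length n.

def sign(v):
    return (v > 0) - (v < 0)

def class_mean(X):
    n = len(X[0])
    return [sum(x[i] for x in X) / len(X) for i in range(n)]

def build_G(X_h, X_s):
    xh, xs = class_mean(X_h), class_mean(X_s)
    n = len(xh)
    G = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            # positive entry when the classes are better separated at
            # positions (i, j) than the within-class spread
            between = min(abs(xh[i] - xs[j]), abs(xs[i] - xh[j]))
            within = max(abs(xh[i] - xh[j]), abs(xs[i] - xs[j]))
            G[i][j] = sign(between - within)
    return G

def G_lambda(G, lam):
    # lam = 1 recovers the identity (Manhattan distance); lam = 0 gives G.
    n = len(G)
    return [[lam * (i == j) + (1 - lam) * G[i][j] for j in range(n)]
            for i in range(n)]

# Candidate grid for lambda used in the learning step:
lambdas = sorted({2 ** -i for i in range(16)} | {1 - 2 ** -i for i in range(16)})

Xh = [[0.0, 0.0], [0.0, 2.0]]   # toy healthy series (n = 2)
Xs = [[5.0, 5.0], [5.0, 7.0]]   # toy sick series
G = build_G(Xh, Xs)
print(G_lambda(G, 1.0))  # identity -> Manhattan distance
```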

2.5. Machine Learning in Animal Disease Detection

Machine Learning approaches have been heavily used in animal behavioural/disease assessment over the last few years. Take as an example [29], where the authors applied different methods for detection of subacute ruminal acidosis in dairy cows. In their results, k-Nearest Neighbors showed the best results, outperforming deep learning methods and decision trees. In [30], the authors test deep learning architectures for early detection of respiratory disease in pigs and compare them with classical time series regression approaches. Their results do not show any significant differences in performance measures of the presented methods. The authors of [31] used a wearable device and a one-class support vector machine algorithm to detect lameness in dairy cows.
Having introduced the first main element of our approach in the last sub-section, we continue by briefly describing two further central elements: we utilise Nearest Centroid Classification (NCC) [32] using the function D 1 with G ( λ ) as above for TSC, and a Naive Bayes classifier [33] for the feature-based classification. We decided on the NCC algorithm as it is simple and inherently avoids bias based on class frequencies. Moreover, we chose the well-known Naive Bayes algorithm as it can easily be adjusted to handle missing features both in the learning step and while testing: we can learn parameters on reduced examples, and simply omit the probabilities where a data-point is missing during testing. Thus, our algorithm can be employed with anywhere from 1 to 20 available features. In case all features are missing, one could return an indecisive result, to indicate that the next steps are up to farm management, or classify solely based on ear-tag data. We omit the description of these two algorithms; details can be found in the respective citations. Moreover, we decided on a feature-selection step using Relief [34].
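The missing-feature handling described above can be sketched as follows. This is a minimal Gaussian Naive Bayes sketch (the paper does not state its distributional model, so the Gaussian likelihood is an assumption), in which a feature recorded as `None` is simply omitted from the log-likelihood:

```python
import math

# Sketch of the missing-feature handling: parameters are learned on the
# non-missing values only, and at test time any feature that is None is
# omitted from the log-likelihood, so 1 to 20 features suffice.

def fit(examples, labels):
    """Per-class prior and per-feature (mean, std), from non-missing values."""
    params = {}
    for c in set(labels):
        rows = [x for x, y in zip(examples, labels) if y == c]
        feats = []
        for k in range(len(examples[0])):
            vals = [r[k] for r in rows if r[k] is not None]
            mu = sum(vals) / len(vals)
            sd = (sum((v - mu) ** 2 for v in vals) / len(vals)) ** 0.5 or 1e-6
            feats.append((mu, sd))
        params[c] = (len(rows) / len(examples), feats)
    return params

def predict(params, x):
    def log_post(c):
        prior, feats = params[c]
        ll = math.log(prior)
        for xv, (mu, sd) in zip(x, feats):
            if xv is None:          # omit missing features
                continue
            ll += -0.5 * ((xv - mu) / sd) ** 2 - math.log(sd)
        return ll
    return max(params, key=log_post)

train = [[1.2, 0.3], [1.0, None], [0.1, 2.0], [0.2, 1.8]]
ys = ["sick", "sick", "healthy", "healthy"]
model = fit(train, ys)
print(predict(model, [1.1, None]))  # classifies using feature 1 only
```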
We describe the tailor-made algorithm in the following section.

2.6. Proposed Algorithm

The first step is to split the data into stratified sets for 10-fold cross validation. About 90% constitute the training data, on which the parameters are learned, while the resulting algorithm is used to classify the remaining ~10%. The results of all folds are aggregated. Thus, we repeat the following steps 10 times:
  • For every data stream of the sensor data, we learn an optimal parameter λ such that the balanced error of a leave-one-out inner cross validation is minimised, using an NCC with distance function D 1 with G ( λ ). In case of ties, the highest value of λ is chosen.
  • Using these 5 (possibly different) parameters, we label an animal as sick or healthy if at least 4 of the 5 NCCs trained in Step 1 decided for that class label.
  • The remaining examples are classified as follows:
    (a) The features are sorted according to the results of running Relief on the whole training set. Because this algorithm requires complete data, we only include examples in which all features are available.
    (b) Afterwards, we employ an inner 10-fold cross validation to find the optimal number of features to take, starting with the highest-ranked one and consecutively adding the next according to our ordering. The optimum is calculated with respect to balanced accuracy. In this step, we again only include training examples that are complete.
    (c) Finally, a Naive Bayes algorithm is applied to these features to classify the yet undecided examples.
The estimated labels are compared with the actual ones and the outcomes are added up to the results presented in Section 3.2.
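The voting rule in Step 2 can be sketched in a few lines (a simplified illustration, not the authors' implementation):

```python
# Sketch of Step 2: an animal gets a definitive label only when at least
# 4 of the 5 per-stream nearest-centroid classifiers agree; otherwise it
# is passed on to the feature-based Naive Bayes stage (Step 3).

def vote(ncc_labels, needed=4):
    """ncc_labels: one predicted label per sensor data-stream (5 total)."""
    for label in ("sick", "healthy"):
        if ncc_labels.count(label) >= needed:
            return label
    return None  # undecided -> classify with the feature-based stage

print(vote(["sick", "sick", "healthy", "sick", "sick"]))     # sick
print(vote(["sick", "healthy", "healthy", "sick", "sick"]))  # None
```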

3. Results

3.1. Statistical Comparison

In this section, we compare the features defined in Section 2.3 using significance tests. We employ a two-sided Mann–Whitney U test at a significance level of 0.05. We opted for a non-parametric test, as our data-set violates assumptions such as normality. Using Bonferroni correction for multiple hypothesis tests, we arrive at a threshold for significant results of 0.05 / 20 = 0.0025. The mean ± standard deviation ( μ ± σ ) of each feature for both sick and healthy animals can be found in Table 2 below. Moreover, we added the exact p-values of each individual comparison, where a * indicates a significant difference.
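A per-feature comparison of this kind can be sketched as follows. The sketch uses the normal approximation to the Mann–Whitney U distribution (the paper may use exact p-values; standard statistics packages offer both):

```python
import math

# Pure-Python sketch of the two-sided Mann-Whitney U test with Bonferroni
# correction (normal approximation; not the authors' implementation).

ALPHA = 0.05 / 20  # Bonferroni-corrected threshold for 20 features

def mann_whitney_u(x, y):
    n1, n2 = len(x), len(y)
    # U = number of (x_i, y_j) pairs with x_i > y_j (+0.5 per tie)
    u = sum((xi > yj) + 0.5 * (xi == yj) for xi in x for yj in y)
    mu = n1 * n2 / 2.0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (u - mu) / sigma
    p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return u, p

# Toy feature values for the two groups (illustrative, not study data):
healthy = [0.2, 0.3, 0.25, 0.22, 0.28, 0.31, 0.27, 0.24]
sick = [0.6, 0.7, 0.65, 0.62, 0.68, 0.71, 0.67, 0.64]
u, p = mann_whitney_u(healthy, sick)
print(p < ALPHA)  # significant at the corrected level
```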
Table 2 above shows that seven of our features do not differ significantly between healthy and sick animals, while 13 (65%) do. We can observe that the BCS before calving is higher for diseased animals, with a significant difference three weeks before calving, a finding that supports the results in [10]. In addition, in accordance with [10], the prevalence of SCK in multiparous animals is slightly higher, although there is no significant difference. Our results corroborate the findings in [35], where the authors also found a significant difference in NEFA concentration between healthy and ketotic cows.
All features based on location show significant differences, a result which we interpret as indicating that animals with a higher risk of SCK tend to move less, i.e., change their location less frequently than healthy cows.
In [36], the authors concluded that heat stress increases the ketosis risk in mid-lactation. Our results point to the hypothesis that even prepartal heat stress may influence the development of SCK, as the average time spent under heat stress is significantly higher in diseased cows than in healthy ones. In contrast, the authors of [37] calculated a 1.6 times higher risk of clinical ketosis in early lactation if the THI was lower than 83 on the day of calving in comparison to hotter days.
As we discussed the topic of possibly missing data, we briefly state the percentages of missing features below in Table 3.

3.2. Classification Results

As we assumed the location features to be rather specific, we evaluated our algorithm a total of four times, where we included the data described as follows:
  • Sensor data before calving, location features included
  • Sensor data before calving, location features not included
  • Sensor data after calving, location features included
  • Sensor data after calving, location features not included
Applying our algorithm using a tenfold cross validation yielded the following results. We begin by stating the confusion matrices for the prepartal sensor data, with location features included (left) and without location features (right):
[Confusion matrices shown as an image in the original article.]
Moreover, we state the results when considering the acceleration data after calving below. On the left-hand side, we find the results when including the location data; on the right-hand side, the results when excluding them:
[Confusion matrices shown as an image in the original article.]
Based on these confusion matrices, we can calculate several standard performance measures, which we present in Table 4 below. The best values in each column are boldfaced.
The results, which are clearly better for the post-partal time frame, are consistent with the findings in [20]. Although using pre-partal data seems desirable with respect to early detection, the results indicate that the difference between healthy and sick animals is more distinct post partum, as all considered performance measures are highest for Experiment 4.
Moreover, when comparing the results with respect to the inclusion of location data, they are slightly better when excluding the location data for both time frames, which is surprising given the statistically significant differences between healthy and sick cows described in Section 3.1.
The percentage of correctly identified sick animals (= sensitivity) ranges from 63.2% to 67.0%, while the ratio of correctly identified healthy animals (= specificity) ranges from 59.3% to 73.6%. The negative predictive value is very high for all experiments (0.91–0.92) but should be treated with caution, as it is strongly affected by an imbalanced data structure. As described in Table 1, we are dealing with an imbalanced class structure.
When looking at the results from Experiment 4, we see that more than two-thirds of the sick animals were detected correctly, while nearly 75% of the healthy animals were labelled correctly. Due to the imbalanced prevalence of class labels, a “sick” prediction is correct only about 32% of the time, as can be seen from the precision, which still leaves room for improvement.
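The measures discussed here follow directly from the confusion-matrix counts; the sketch below computes them for illustrative counts (chosen to mimic an imbalanced data-set, not the paper's actual values):

```python
# Sketch of the performance measures in Table 4, computed from confusion
# matrix counts (tp, fn, fp, tn). The counts below are illustrative only.

def measures(tp, fn, fp, tn):
    sens = tp / (tp + fn)                 # correctly identified sick
    spec = tn / (tn + fp)                 # correctly identified healthy
    prec = tp / (tp + fp)                 # positive predictive value
    npv = tn / (tn + fn)                  # negative predictive value
    bal_acc = (sens + spec) / 2.0
    return {"sensitivity": sens, "specificity": spec,
            "precision": prec, "npv": npv, "balanced_accuracy": bal_acc}

m = measures(tp=35, fn=17, fp=75, tn=215)  # illustrative imbalanced counts
print({k: round(v, 3) for k, v in m.items()})
```

Note how a high NPV coexists with low precision here, purely because healthy animals dominate the data-set.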
The algorithm proposed in Section 2.6 can easily be adjusted to classify more individuals as sick, which is useful, e.g., when the algorithm is used as a first screening step. For that, we can learn a threshold in Naive Bayes such that a certain percentage of “sick” training examples is classified correctly. Moreover, we can filter out data-sets in the first step only when all five NCCs decided for the same class label, which leaves more examples without a definitive decision.

3.3. Parameters and Relevant Features Learned

As we estimated the classification quality in a cross validation scheme, we repeatedly selected possibly different subsets of features. We state the results, as we assume these choices reflect the relevance of the respective features. As we ran two experiments each, either including or excluding location features, we have a total of 20 feature-selection procedures for both scenarios.
Figure 10 shows that the number of times a feature was chosen varies greatly, ranging from 3 up to 19. NEFA was chosen in 19 out of 20 splits, an indicator of its relevance for detecting SCK. Moreover, all BCS and BFT values were chosen in more than a third of all splits. The three least chosen features are all based on location.
Figure 11 shows that, also when excluding the location features, the number of times a feature was chosen varies greatly, from 3 up to 16. NEFA was chosen in 16 out of 20 splits, followed by THI, milk yield and BCS −3 w. Based on this observation, parity and the maximum fat/protein ratio can be considered of low relevance.

4. Discussion

In this article, we presented results from a study to identify indicators for subclinical ketosis in dairy cows around calving. Moreover, we constructed an algorithm that estimates the health status.
We included a statistical comparison of different parameters, based on milk yield and components, animal movements within the barn, ambient temperature, and visual observation. The results showed significant differences between healthy cows and those suffering from SCK in 13 of the examined parameters. A literature review showed that our results partly corroborate the conclusions of other studies.
In a second step, we introduced a flexible machine learning approach, which combines elements of TSC with classical feature based algorithms. It is designed to be simple, interpretable, and flexible with respect to data availability. The results show that our approach is a promising first step for automatic recognition of diseases in dairy cows.
Future work will include more elaborate machine learning approaches to tackle the problem of early detection of SCK.

Author Contributions

Conceptualization, M.I. and M.D.; Data curation, E.G. and M.Ö.; Formal analysis, V.S.; Funding acquisition, D.E., M.I., and M.D.; Investigation, E.G. and M.I.; Methodology, V.S.; Project administration, M.I.; Software, V.S.; Supervision, D.E. and M.D.; Validation, V.S. and E.G.; Visualization, V.S.; Writing—original draft, V.S.; Writing—review and editing, V.S., D.E., E.G., M.I., and M.Ö. All authors have read and agreed to the published version of the manuscript.

Funding

This work has been supported by the COMET-K2 Center of the Linz Center of Mechatronics (LCM) funded by the Austrian federal government and the federal state of Upper Austria. Animal health data were collected in the agriProKnow project funded by the Austrian Research Promotion Agency (FFG, project# 848610).

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

Appendix A

Proof. 
We want to prove the following statements:

\[ f(a_i, b_j)\, g(i, j) = |a_i - b_j|^p\, \delta_{ij} \;\Longrightarrow\; D_1 \text{ is equivalent to the } n\text{-dimensional Minkowski distance,} \tag{A1} \]

\[ D_1 \text{ is a metric on } \mathbb{R}^n \;\Longrightarrow\; g(i, j) = \delta_{ij} h_i \text{ with } h_i \in \mathbb{R}^+\ \forall i \;\lor\; g(i, j) = \delta_{ij} h_i \text{ with } h_i \in \mathbb{R}^-\ \forall i. \tag{A2} \]

Let us begin with (A1). We plug our special choice of f and g into the definition of $D_1$:

\[ D_1(a, b;\, f, g, p) = \left( \sum_{i=1}^{n} \sum_{j=1}^{n} |a_i - b_j|^p\, \delta_{ij} \right)^{1/p} = \Bigg( \underbrace{\sum_{i=1}^{n} \sum_{\substack{j=1 \\ j \neq i}}^{n} |a_i - b_j|^p\, \delta_{ij}}_{=0} + \sum_{i=1}^{n} \underbrace{\delta_{ii}}_{=1}\, |a_i - b_i|^p \Bigg)^{1/p} = \left( \sum_{i=1}^{n} |a_i - b_i|^p \right)^{1/p}. \tag{A3} \]

Since the last expression (A3) is the $n$-dimensional Minkowski distance, the first statement is proved.
Let us continue with the second statement (A2). First, we observe that (A2) is equivalent to the following implication:

\[ D_1 \text{ is a metric on } \mathbb{R}^n \;\Longrightarrow\; g(\hat{n}, \hat{m}) = 0 \quad \forall\, \hat{n}, \hat{m} \in \{1, \dots, n\} \text{ with } \hat{n} \neq \hat{m}, \tag{A4} \]

\[ \big( g(\hat{n}, \hat{n}) > 0\ \forall\, \hat{n} \in \{1, \dots, n\} \big) \;\lor\; \big( g(\hat{n}, \hat{n}) < 0\ \forall\, \hat{n} \in \{1, \dots, n\} \big). \tag{A5} \]
The proof of (A4) is trivial for $n = 1$, as every statement with a universal quantifier (∀) over an empty set is true. The proof of (A5) is also trivial for $n = 1$, since $g(1, 1)$ cannot be zero, as this would imply $D_1(a, b) = 0$ for all $a, b$. Therefore, let us continue proving (A4) by assuming $n \geq 2$. Suppose now that $D_1$ is a metric on $\mathbb{R}^n$, which means that the metric conditions have to hold. First, we derive some properties of f, starting with the observation that $f(x, x)$ has to be 0 for all $x \in \mathbb{R}$:

\[ f(x, x) = 0 \quad \forall\, x \in \mathbb{R}. \tag{A6} \]

Let $a = (x, \dots, x)^T \in \mathbb{R}^n$ with $x \in \mathbb{R}$. From the identity-of-indiscernibles property of a metric, we see:

\[ D_1(a, a) = 0 \;\Longleftrightarrow\; D_1(a, a)^p = 0 \;\Longleftrightarrow\; \sum_{i=1}^{n} \sum_{j=1}^{n} f(a_i, a_j)\, g(i, j) = 0 \;\Longleftrightarrow\; f(x, x) \sum_{i=1}^{n} \sum_{j=1}^{n} g(i, j) = 0 \;\Longleftrightarrow\; f(x, x) = 0 \;\lor\; \sum_{i=1}^{n} \sum_{j=1}^{n} g(i, j) = 0. \]

Next, we can easily show that

\[ \sum_{i=1}^{n} \sum_{j=1}^{n} g(i, j) \neq 0 \tag{A7} \]

has to hold under the assumption that $D_1$ is a metric. To this end, let

\[ a = (x, \dots, x)^T,\; b = (y, \dots, y)^T \in \mathbb{R}^n, \quad x, y \in \mathbb{R},\; x \neq y. \]

We know that $D_1(a, b)^p > 0$ has to hold. Assume now that (A7) is false:

\[ D_1(a, b)^p = \sum_{i=1}^{n} \sum_{j=1}^{n} f(a_i, b_j)\, g(i, j) = \sum_{i=1}^{n} \sum_{j=1}^{n} f(x, y)\, g(i, j) = f(x, y) \sum_{i=1}^{n} \sum_{j=1}^{n} g(i, j) = 0. \]

Seeing this contradiction, we arrive at the conclusion that (A7) has to be true, which together with the derivation above yields

\[ f(x, x) = 0 \quad \forall\, x \in \mathbb{R}. \tag{A8} \]
Our next step is to prove that

\[ f(x, y) = f(y, x) \quad \forall\, x, y \in \mathbb{R}, \text{ and} \tag{A9} \]

\[ f(x, y) \neq 0 \quad \forall\, x, y \in \mathbb{R},\; x \neq y \tag{A10} \]

have to hold. Therefore, we inspect the symmetry condition of our metric using the same $a, b$ as above, which tells us that

\[ D_1(a, b) = D_1(b, a) \;\Longleftrightarrow\; D_1(a, b)^p = D_1(b, a)^p \;\Longleftrightarrow\; \sum_{i=1}^{n} \sum_{j=1}^{n} f(a_i, b_j)\, g(i, j) = \sum_{i=1}^{n} \sum_{j=1}^{n} f(b_i, a_j)\, g(i, j) \;\Longleftrightarrow\; f(x, y) \sum_{i=1}^{n} \sum_{j=1}^{n} g(i, j) = f(y, x) \sum_{i=1}^{n} \sum_{j=1}^{n} g(i, j). \]

Using the fact that (A7) holds, we can simplify to

\[ f(x, y) = f(y, x). \]

Next, we aim for (A10). Since $a \neq b$, we know because of the metric properties that

\[ D_1(a, b) > 0 \;\Longleftrightarrow\; D_1(a, b)^p > 0 \;\Longleftrightarrow\; \sum_{i=1}^{n} \sum_{j=1}^{n} f(a_i, b_j)\, g(i, j) > 0 \;\Longleftrightarrow\; f(x, y) \sum_{i=1}^{n} \sum_{j=1}^{n} g(i, j) > 0. \]

Using (A7), we can derive

\[ f(x, y) \neq 0 \quad \forall\, x, y \in \mathbb{R},\; x \neq y. \]
We can use these proven properties of f to finally show that g has to have the aforementioned properties. Therefore, let 1 n ^ , m ^ n , m ^ n ^ . We define four vectors a , b , c , d R :
a = ( x , x , , x , y , Position n ^ x , , x ) T , b = ( x , x , , x , y , Position m ^ x , , x ) T , c = ( x , , x ) T , d = ( x , x , , x , y , Position n ^ x , x , y , Position m ^ x , , x ) T .
Using the metric properties, we know that
D 1 ( a , a ) = 0 D 1 ( a , a ) p = 0 i = 1 n j = 1 n f ( a i , a j ) g ( i , j ) = 0 i = 1 , i n ^ n j = 1 , j n ^ n f ( x , x ) = 0 ( A 8 ) g ( i , j ) + i = 1 , i n ^ n f ( x , y ) g ( i , n ^ ) + j = 1 , j n ^ n f ( y , x ) g ( n ^ , j ) + f ( y , y ) = 0 ( A 8 ) g ( n ^ , n ^ ) = 0 i = 1 , i n ^ n f ( x , y ) g ( i , n ^ ) + j = 1 , j n ^ n f ( y , x ) g ( n ^ , j ) = 0 j = 1 , j n ^ n f ( y , x ) g ( n ^ , j ) = i = 1 , i n ^ n f ( x , y ) g ( i , n ^ ) f ( y , x ) j = 1 , j n ^ n g ( n ^ , j ) = f ( x , y ) i = 1 , i n ^ n g ( i , n ^ ) .
Using the symmetry property (A9), the fact (A10) and renaming the indices, we get:
f ( x , y ) j = 1 , j n ^ n g ( n ^ , j ) = f ( x , y ) i = 1 , i n ^ n g ( i , n ^ ) n ^ = 1 , , n : i = 1 , i n ^ n g ( n ^ , i ) = i = 1 , i n ^ n g ( i , n ^ )
Since we know that D 1 is a metric, we see:
D 1 ( a , c ) = D 1 ( c , a ) D 1 ( a , c ) p = D 1 ( c , a ) p i = 1 , i n ^ n j = 1 n f ( x , x ) = 0 ( A 8 ) g ( i , j ) + j = 1 n f ( y , x ) g ( n ^ , j ) = i = 1 n j = 1 , j n ^ n f ( x , x ) = 0 ( A 8 ) g ( i , j ) + i = 1 n f ( y , x ) g ( i , n ^ ) f ( y , x ) j = 1 n g ( n ^ , j ) = f ( x , y ) i = 1 n g ( i , n ^ )
Using the symmetry property (A9):
$$
f(y,x)\sum_{j=1}^{n} g(\hat n,j) = f(y,x)\sum_{i=1}^{n} g(i,\hat n).
$$
Using the fact (A10) and renaming the indices, we get
$$
\sum_{i=1}^{n} g(\hat n,i) = \sum_{i=1}^{n} g(i,\hat n) \quad \forall\, \hat n = 1, \dots, n. \tag{A15}
$$
Next, we add and subtract the results (A14) and (A15):
$$
\begin{aligned}
\text{(A14)} + \text{(A15)}:&\quad 2\sum_{\substack{i=1\\ i \neq \hat n}}^{n} g(\hat n,i) + g(\hat n,\hat n) = g(\hat n,\hat n),\\
\text{(A14)} - \text{(A15)}:&\quad g(\hat n,\hat n) = 2\sum_{\substack{i=1\\ i \neq \hat n}}^{n} g(i,\hat n) + g(\hat n,\hat n).
\end{aligned}
$$
Subtracting $g(\hat n,\hat n)$ and dividing by 2 on both sides of both equations gives us
$$
\sum_{\substack{i=1\\ i \neq \hat n}}^{n} g(\hat n,i) = 0 \quad \forall\, \hat n = 1, \dots, n, \tag{A16}
$$
$$
\sum_{\substack{i=1\\ i \neq \hat n}}^{n} g(i,\hat n) = 0 \quad \forall\, \hat n = 1, \dots, n. \tag{A17}
$$
For the case n = 2 , we are done with the proof of (A4), as in this case both (A16) and (A17) read as
$$
g(1,2) = 0 \quad \text{and} \quad g(2,1) = 0.
$$
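The $n = 2$ case can also be sanity-checked numerically: any nonzero off-diagonal weight already violates the identity axiom $D_1(a,a) = 0$. A sketch with a hypothetical $f$ (again not the paper's choice):

```python
import numpy as np

def D1_p(a, b, f, g):
    """D1(a, b)^p = sum_{i,j} f(a_i, b_j) * g(i, j)."""
    n = len(a)
    return sum(f(a[i], b[j]) * g[i, j] for i in range(n) for j in range(n))

f = lambda u, v: abs(u - v)          # satisfies f(x, x) = 0, as in (A8)
g_bad = np.array([[1.0, 1.0],
                  [1.0, 1.0]])       # nonzero off-diagonal entries
a = np.array([0.0, 1.0])

# Identity of indiscernibles is violated: D1(a, a)^p should be 0.
print(D1_p(a, a, f, g_bad))  # 2.0
```

The off-diagonal terms $f(a_1,a_2)\,g(1,2)$ and $f(a_2,a_1)\,g(2,1)$ do not vanish, so $D_1(a,a) \neq 0$.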
Assume now that $n \ge 3$:
$$
\begin{aligned}
D_1(a,b) = D_1(b,a) \;&\Longrightarrow\; D_1(a,b)^p = D_1(b,a)^p\\
&\Longrightarrow\; \underbrace{\sum_{\substack{i=1\\ i \neq \hat n}}^{n}\sum_{\substack{j=1\\ j \neq \hat m}}^{n} f(x,x)\,g(i,j)}_{=\,0\ \text{by (A8)}} + \sum_{\substack{j=1\\ j \neq \hat m}}^{n} f(y,x)\,g(\hat n,j) + \sum_{\substack{i=1\\ i \neq \hat n}}^{n} f(x,y)\,g(i,\hat m) + \underbrace{f(y,y)\,g(\hat n,\hat m)}_{=\,0\ \text{by (A8)}}\\
&\qquad = \underbrace{\sum_{\substack{i=1\\ i \neq \hat m}}^{n}\sum_{\substack{j=1\\ j \neq \hat n}}^{n} f(x,x)\,g(i,j)}_{=\,0\ \text{by (A8)}} + \sum_{\substack{j=1\\ j \neq \hat n}}^{n} f(y,x)\,g(\hat m,j) + \sum_{\substack{i=1\\ i \neq \hat m}}^{n} f(x,y)\,g(i,\hat n) + \underbrace{f(y,y)\,g(\hat m,\hat n)}_{=\,0\ \text{by (A8)}}\\
&\Longrightarrow\; \sum_{\substack{j=1\\ j \neq \hat m}}^{n} f(y,x)\,g(\hat n,j) + \sum_{\substack{i=1\\ i \neq \hat n}}^{n} f(x,y)\,g(i,\hat m) = \sum_{\substack{j=1\\ j \neq \hat n}}^{n} f(y,x)\,g(\hat m,j) + \sum_{\substack{i=1\\ i \neq \hat m}}^{n} f(x,y)\,g(i,\hat n).
\end{aligned}
$$
Using the symmetry property (A9) and the fact (A10), we can pull out a factor $f(x,y)$ in all sums and divide by it. Moreover, we rename the indices and get:
$$
\sum_{\substack{i=1\\ i \neq \hat m}}^{n} g(\hat n,i) + \sum_{\substack{i=1\\ i \neq \hat n}}^{n} g(i,\hat m) = \sum_{\substack{i=1\\ i \neq \hat n}}^{n} g(\hat m,i) + \sum_{\substack{i=1\\ i \neq \hat m}}^{n} g(i,\hat n). \tag{A18}
$$
Recall the facts (A16) and (A17), namely that
$$
\sum_{\substack{i=1\\ i \neq \hat n}}^{n} g(\hat n,i) = 0, \qquad
\sum_{\substack{i=1\\ i \neq \hat m}}^{n} g(\hat m,i) = 0, \qquad
\sum_{\substack{i=1\\ i \neq \hat n}}^{n} g(i,\hat n) = 0, \qquad
\sum_{\substack{i=1\\ i \neq \hat m}}^{n} g(i,\hat m) = 0. \tag{A19}
$$
We subtract the first and fourth expression of (A19) from the left-hand side of (A18), and the second and third expression of (A19) from the right-hand side of (A18), which gives us
$$
g(\hat n,\hat n) - g(\hat n,\hat m) + g(\hat m,\hat m) - g(\hat n,\hat m) = g(\hat m,\hat m) - g(\hat m,\hat n) + g(\hat n,\hat n) - g(\hat m,\hat n),
$$
which simplifies to
$$
g(\hat n,\hat m) = g(\hat m,\hat n) \quad \forall\, \hat m \neq \hat n,\; 1 \le \hat m, \hat n \le n. \tag{A20}
$$
Finally, we look at $D_1(d,d)$:
$$
\begin{aligned}
D_1(d,d) = 0 \;&\Longrightarrow\; D_1(d,d)^p = 0\\
&\Longrightarrow\; \underbrace{\sum_{\substack{i=1\\ i \neq \hat n,\hat m}}^{n}\sum_{\substack{j=1\\ j \neq \hat n,\hat m}}^{n} f(x,x)\,g(i,j)}_{=\,0\ \text{by (A8)}} + \sum_{\substack{i=1\\ i \neq \hat n,\hat m}}^{n} f(x,y)\,g(i,\hat n) + \sum_{\substack{i=1\\ i \neq \hat n,\hat m}}^{n} f(x,y)\,g(i,\hat m)\\
&\qquad + \sum_{\substack{j=1\\ j \neq \hat n,\hat m}}^{n} f(y,x)\,g(\hat n,j) + \sum_{\substack{j=1\\ j \neq \hat n,\hat m}}^{n} f(y,x)\,g(\hat m,j) + \underbrace{f(y,y)\bigl(g(\hat n,\hat n) + g(\hat n,\hat m) + g(\hat m,\hat n) + g(\hat m,\hat m)\bigr)}_{=\,0\ \text{by (A8)}} = 0\\
&\Longrightarrow\; \sum_{\substack{i=1\\ i \neq \hat n,\hat m}}^{n} f(x,y)\,g(i,\hat n) + \sum_{\substack{i=1\\ i \neq \hat n,\hat m}}^{n} f(x,y)\,g(i,\hat m) + \sum_{\substack{j=1\\ j \neq \hat n,\hat m}}^{n} f(y,x)\,g(\hat n,j) + \sum_{\substack{j=1\\ j \neq \hat n,\hat m}}^{n} f(y,x)\,g(\hat m,j) = 0.
\end{aligned}
$$
As above, we use the symmetry property (A9) and the fact (A10), pull out a factor $f(x,y)$ in all sums, and divide by it. Moreover, we rename the indices and get:
$$
\sum_{\substack{i=1\\ i \neq \hat n,\hat m}}^{n} g(i,\hat n) + \sum_{\substack{i=1\\ i \neq \hat n,\hat m}}^{n} g(i,\hat m) + \sum_{\substack{i=1\\ i \neq \hat n,\hat m}}^{n} g(\hat n,i) + \sum_{\substack{i=1\\ i \neq \hat n,\hat m}}^{n} g(\hat m,i) = 0. \tag{A21}
$$
Subtracting all four expressions in (A19) from (A21), we get:
$$
-\,g(\hat m,\hat n) - g(\hat n,\hat m) - g(\hat n,\hat m) - g(\hat m,\hat n) = 0
\;\Longrightarrow\;
g(\hat n,\hat m) = -\,g(\hat m,\hat n) \quad \forall\, \hat m \neq \hat n,\; 1 \le \hat m, \hat n \le n. \tag{A22}
$$
Adding up (A20) + (A22) gives us
$$
g(\hat n,\hat m) = 0 \quad \forall\, \hat m \neq \hat n,\; 1 \le \hat m, \hat n \le n, \tag{A23}
$$
which concludes our proof of (A4) for $n \ge 3$.
To finally show (A2), we still have to prove (A5) for $n > 1$. As we have already shown that $g(i,j) = 0$ for $i \neq j$, $D_1$ reduces to the form
$$
D_1(a,b)^p = \sum_{i=1}^{n} f(a_i,b_i)\,g(i,i). \tag{A24}
$$
We define $a = (x, x, \dots, x)^T$ as above and two more vectors $b$ and $c$:
$$
b = (x, \dots, x, \underset{\text{position } i}{y}, x, \dots, x)^T, \qquad
c = (x, \dots, x, \underset{\text{position } j}{y}, x, \dots, x)^T.
$$
We now prove (A5) by showing that the opposite assumptions lead to a contradiction. Assume therefore that (A5)’ holds:
$$
\exists\, i, j,\; i \neq j:\quad g(i,i) \le 0 \;\wedge\; g(j,j) \ge 0.
$$
As we assume $D_1$ to be a metric, we know that $D_1(a,b)^p > 0 \wedge D_1(a,c)^p > 0$. Using (A7) and the simplified form (A24) of $D_1$, we see that the following statement has to hold:
$$
f(x,y)\,g(i,i) > 0 \;\wedge\; f(x,y)\,g(j,j) > 0. \tag{A25}
$$
Inspecting (A25), we see that neither $g(i,i) = 0$ nor $g(j,j) = 0$ can fulfil these inequalities; thus we assume $g(i,i) < 0 \wedge g(j,j) > 0$. Using these assumptions and dividing by $g(i,i)$ and $g(j,j)$, respectively, we arrive at:
$$
f(x,y) < 0 \;\wedge\; f(x,y) > 0. \tag{A26}
$$
As we arrived at a contradictory result (A26) by assuming (A5)’ to hold, we can conclude that (A5) has to hold and therefore we have proven (A2). □
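In summary, the appendix shows that $D_1$ can only be a metric if $g$ is diagonal with diagonal entries of a single strict sign, so that $D_1^p$ collapses to the weighted coordinate-wise form (A24). The sketch below checks the metric axioms numerically for one admissible instance, $f(u,v) = |u - v|^p$ with $p = 2$ and positive diagonal weights — an illustrative choice, not the paper's actual weighting:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 2.0
w = np.array([0.5, 1.0, 2.0, 4.0])  # diagonal entries g(i, i) > 0

def D1(a, b):
    # Diagonal case of (A24): D1(a, b) = (sum_i g(i, i) * |a_i - b_i|^p)^(1/p),
    # i.e. a weighted Minkowski metric.
    return float((w * np.abs(a - b) ** p).sum() ** (1.0 / p))

# Randomised check of the metric axioms.
for _ in range(1000):
    a, b, c = rng.normal(size=(3, 4))
    assert D1(a, a) == 0.0                          # identity
    assert D1(a, b) == D1(b, a)                     # symmetry
    assert D1(a, b) > 0.0                           # positivity (a != b a.s.)
    assert D1(a, c) <= D1(a, b) + D1(b, c) + 1e-9   # triangle inequality
print("all axioms passed")
```

With off-diagonal entries or mixed-sign diagonal entries, one of these assertions fails, mirroring the contradictions derived above.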

References

  1. Andersson, L. Subclinical Ketosis in Dairy Cows. Vet. Clin. N. Am. Food Anim. Pract. 1988, 4, 233–251. [Google Scholar] [CrossRef]
  2. Duffield, T.; Sandals, D.; Leslie, K.; Lissemore, K.; McBride, B.; Lumsden, J.; Dick, P.; Bagg, R. Efficacy of Monensin for the Prevention of Subclinical Ketosis in Lactating Dairy Cows. J. Dairy Sci. 1998, 81, 2866–2873. [Google Scholar] [CrossRef]
  3. Duffield, T.F.; Lissemore, K.; McBride, B.; Leslie, K. Impact of hyperketonemia in early lactation dairy cows on health and production. J. Dairy Sci. 2009, 92, 571–580. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  4. Geishauser, T.; Leslie, K.; Kelton, D.; Duffield, T. Evaluation of Five Cowside Tests for Use with Milk to Detect Subclinical Ketosis in Dairy Cows. J. Dairy Sci. 1998, 81, 438–443. [Google Scholar] [CrossRef]
  5. Bach, K.; Heuwieser, W.; McArt, J. Technical note: Comparison of 4 electronic handheld meters for diagnosing hyperketonemia in dairy cows. J. Dairy Sci. 2016, 99, 9136–9142. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  6. Iwersen, M.; Klein-Jöbstl, D.; Pichler, M.; Roland, L.; Fidlschuster, B.; Schwendenwein, I.; Drillich, M. Comparison of 2 electronic cowside tests to detect subclinical ketosis in dairy cows and the influence of the temperature and type of blood sample on the test results. J. Dairy Sci. 2013, 96, 7719–7730. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  7. Chapinal, N.; Leblanc, S.J.; Carson, M.; Leslie, K.; Godden, S.; Capel, M.; Santos, J.; Overton, M.; Duffield, T. Herd-level association of serum metabolites in the transition period with disease, milk production, and early lactation reproductive performance. J. Dairy Sci. 2012, 95, 5676–5682. [Google Scholar] [CrossRef]
  8. Suthar, V.; Canelas-Raposo, J.; Deniz, A.; Heuwieser, W. Prevalence of subclinical ketosis and relationships with postpartum diseases in European dairy cows. J. Dairy Sci. 2013, 96, 2925–2938. [Google Scholar] [CrossRef] [Green Version]
  9. Liang, D.; Arnold, L.; Stowe, C.; Harmon, R.; Bewley, J. Estimating US dairy clinical disease costs with a stochastic simulation model. J. Dairy Sci. 2017, 100, 1472–1486. [Google Scholar] [CrossRef] [Green Version]
  10. Vanholder, T.; Papen, J.; Bemers, R.; Vertenten, G.; Berge, A. Risk factors for subclinical and clinical ketosis and association with production parameters in dairy cows in the Netherlands. J. Dairy Sci. 2015, 98, 880–888. [Google Scholar] [CrossRef] [Green Version]
  11. Itle, A.; Huzzey, J.; Weary, D.M.; Von Keyserlingk, M.A.G. Clinical ketosis and standing behavior in transition cows. J. Dairy Sci. 2015, 98, 128–134. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  12. Stangaferro, M.; Wijma, R.; Caixeta, L.; Al Abri, M.A.; Giordano, J. Use of rumination and activity monitoring for the identification of dairy cows with health disorders: Part III. Metritis. J. Dairy Sci. 2016, 99, 7422–7433. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  13. Rutten, C.; Velthuis, A.; Steeneveld, W.; Hogeveen, H. Invited review: Sensors to support health management on dairy farms. J. Dairy Sci. 2013, 96, 1928–1952. [Google Scholar] [CrossRef] [PubMed]
  14. Wathes, C.M.; Kristensen, H.H.; Aerts, J.M.; Berckmans, D. Is precision livestock farming an engineer’s daydream or nightmare, an animal’s friend or foe, and a farmer’s panacea or pitfall? Comput. Electron. Agric. 2008, 64, 2–10. [Google Scholar] [CrossRef]
  15. Edmonson, A.; Lean, I.; Weaver, L.; Farver, T.; Webster, G. A Body Condition Scoring Chart for Holstein Dairy Cows. J. Dairy Sci. 1989, 72, 68–78. [Google Scholar] [CrossRef]
  16. Schröder, U.; Staufenbiel, R. Invited Review: Methods to Determine Body Fat Reserves in the Dairy Cow with Special Regard to Ultrasonographic Measurement of Backfat Thickness. J. Dairy Sci. 2006, 89, 1–14. [Google Scholar] [CrossRef] [Green Version]
  17. Borchers, M.; Chang, Y.; Tsai, I.; Wadsworth, B.; Bewley, J. A validation of technologies monitoring dairy cow feeding, ruminating, and lying behaviors. J. Dairy Sci. 2016, 99, 7458–7466. [Google Scholar] [CrossRef]
  18. Reiter, S.; Sattlecker, G.; Lidauer, L.; Kickinger, F.; Öhlschuster, M.; Auer, W.; Schweinzer, V.; Klein-Jöbstl, D.; Drillich, M.; Iwersen, M. Evaluation of an ear-tag-based accelerometer for monitoring rumination in dairy cows. J. Dairy Sci. 2018, 101, 3398–3411. [Google Scholar] [CrossRef] [Green Version]
  19. Schweinzer, V.; Gusterer, E.; Kanz, P.; Krieger, S.; Süss, D.; Lidauer, L.; Berger, A.; Kickinger, F.; Öhlschuster, M.; Auer, W.; et al. Evaluation of an ear-attached accelerometer for detecting estrus events in indoor housed dairy cows. Theriogenology 2019, 130, 19–25. [Google Scholar] [CrossRef]
  20. Sturm, V.; Efrosinin, D.; Gusterer, E.; Iwersen, M.; Drillich, M.; Öhlschuster, M. Time Series Classification for Detecting Subclinical Ketosis in Dairy Cows. In Proceedings of the 2019 International Conference on Biotechnology and Bioengineering (9th ICBB 2019), Poznan, Poland, 25–28 September 2019. submitted. [Google Scholar]
  21. Grohn, Y.; Eicker, S.; Hertl, J. The Association between Previous 305-day Milk Yield and Disease in New York State Dairy Cows. J. Dairy Sci. 1995, 78, 1693–1702. [Google Scholar] [CrossRef]
  22. Thom, E.C. The Discomfort Index. Weatherwise 1959, 12, 57–61. [Google Scholar] [CrossRef]
  23. Zimbelman, R.B.; Rhoads, R.P.; Rhoads, M.L.; Duff, G.C.; Baumgard, L.H.; Collier, R.J. A re-evaluation of the impact of temperature humidity index (THI) and black globe humidity index (BGHI) on milk production in high producing dairy cows. In Proceedings of the Southwest Nutrition Conference, Tempe, AZ, USA, 26–27 February 2009; pp. 158–169. [Google Scholar]
  24. Bagnall, A.; Lines, J.; Bostrom, A.; Large, J.; Keogh, E. The great time series classification bake off: a review and experimental evaluation of recent algorithmic advances. Data Min. Knowl. Discov. 2016, 31, 606–660. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  25. Lucas, B.; Shifaz, A.; Pelletier, C.; O’Neill, L.; Zaidi, N.; Goethals, B.; Petitjean, F.; Webb, G.I. Proximity Forest: an effective and scalable distance-based classifier for time series. Data Min. Knowl. Discov. 2019, 33, 607–635. [Google Scholar] [CrossRef] [Green Version]
  26. Fawaz, H.I.; Forestier, G.; Weber, J.; Idoumghar, L.; Muller, P.-A. Deep learning for time series classification: A review. Data Min. Knowl. Discov. 2019, 33, 917–963. [Google Scholar] [CrossRef] [Green Version]
  27. Kruskal, J.B. Multidimensional scaling by optimizing goodness of fit to a nonmetric hypothesis. Psychometrika 1964, 29, 1–27. [Google Scholar] [CrossRef]
  28. Clarkson, K.L. Nearest-neighbor searching and metric space dimensions. In Nearest-Neighbor Methods for Learning and Vision: Theory and Practice; MIT Press: Cambridge, MA, USA, 2006; pp. 15–59. [Google Scholar]
  29. Wagner, N.; Antoine, V.; Mialon, M.-M.; Lardy, R.; Silberberg, M.; Koko, J.; Veissier, I. Machine learning to detect behavioural anomalies in dairy cows under subacute ruminal acidosis. Comput. Electron. Agric. 2020, 170, 105233. [Google Scholar] [CrossRef]
  30. Cowton, J.; Kyriazakis, I.; Ploetz, T.; Bacardit, J. A Combined Deep Learning GRU-Autoencoder for the Early Detection of Respiratory Disease in Pigs Using Multiple Environmental Sensors. Sensors 2018, 18, 2521. [Google Scholar] [CrossRef] [Green Version]
  31. Haladjian, J.; Haug, J.; Nüske, S.; Bruegge, B. A Wearable Sensor System for Lameness Detection in Dairy Cattle. Multimodal Technol. Interact. 2018, 2, 27. [Google Scholar] [CrossRef] [Green Version]
  32. Manning, C.D.; Raghavan, P.; Schutze, H. Introduction to Information Retrieval; Cambridge University Press (CUP): Cambridge, UK, 2008; pp. 292–297. [Google Scholar]
  33. Maron, M.E. Automatic Indexing: An Experimental Inquiry. J. ACM 1961, 8, 404–417. [Google Scholar] [CrossRef]
  34. Kira, K.; Rendell, L.A. A practical approach to feature selection. In Machine Learning Proceedings 1992; Morgan Kaufmann: San Francisco, CA, USA, 1992; pp. 249–256. [Google Scholar]
  35. Cao, Y.; Zhang, J.; Yang, W.; Xia, C.; Zhang, H.-Y.; Wang, Y.-H.; Xu, C. Predictive value of plasma parameters in the risk of postpartum ketosis in dairy cows. J. Vet. Res. 2017, 61, 91–95. [Google Scholar] [CrossRef] [Green Version]
  36. Gantner, V.; Kuterovac, K.; Potočnik, K. Effect of Heat Stress on Metabolic Disorders Prevalence Risk and Milk Production in Holstein Cows in Croatia. Ann. Anim. Sci. 2016, 16, 451–461. [Google Scholar] [CrossRef] [Green Version]
  37. Mellado, M.; Davila, A.; Gaytan, L.; Macías-Cruz, U.; Avendaño-Reyes, L.; García, E. Risk factors for clinical ketosis and association with milk production and reproduction variables in dairy cows in a hot environment. Trop. Anim. Health Prod. 2018, 50, 1611–1616. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Depiction of the five time series for a single individual before (left) and after calving (right).
Figure 2. Time spent with respective behaviours after calving, averaged for healthy (dotted) and sick animals.
Figure 3. Histograms of the BCS at different times: 8 weeks prior (left), 3 weeks prior (middle), day of calving (right).
Figure 4. Histograms of BFT in mm at different times: 8 weeks prior (left), 3 weeks prior (middle), day of calving (right).
Figure 5. Histogram of the maximum NEFA value.
Figure 6. Histograms of features 8, 9, and 10.
Figure 7. Histograms of features 11–19.
Figure 8. Histogram of the amount of time spent exposed to a THI ≥ 72.
Figure 9. Depiction of the matrix G(0). Each column represents one of the five behaviours. First row: before calving; second row: after calving. Colouring: 1 = orange, −1 = blue.
Figure 10. Bar chart showing how often each feature was chosen in our inner cross-validation step.
Figure 11. Bar chart showing how often each feature was chosen in our inner cross-validation step when excluding the location features.
Table 1. Number of healthy and diseased animals.

Health Status   Examples   Frequency
Healthy         565        84.20%
Sick            106        15.80%
Table 2. Statistical comparison of features. We report the respective class means ± one standard deviation and the p-value from a Mann–Whitney test.

                 BCS -8 w         BCS -3 w         BCS Day0         BFT -8 w         BFT -3 w
μ ± σ healthy    3.19 ± 0.432     3.426 ± 0.428    3.298 ± 0.41     13.425 ± 5.002   15.393 ± 5.261
μ ± σ sick       3.377 ± 0.446    3.613 ± 0.445    3.395 ± 0.424    15.527 ± 5.83    17.581 ± 5.261
p-value          0.00753024       0.000629145 *    0.0493177        0.0148332        0.000184263 *

                 BFT Day0         NEFA             305-D Milk          Max f/p Ratio    Parity
μ ± σ healthy    15.248 ± 4.846   0.296 ± 0.222    11538.3 ± 1528.18   1.686 ± 0.327    0.096 ± 0.996
μ ± σ sick       16.892 ± 5.577   0.384 ± 0.256    11317.7 ± 1713.15   1.676 ± 0.372    0.151 ± 0.993
p-value          0.00630105       0.00015792 *     0.33808             0.449638         0.600132

                 Location1        Location2        Location3        Time Area 1      Time Area 2
μ ± σ healthy    0.703 ± 0.09     0.074 ± 0.032    0.098 ± 0.042    50.658 ± 3.188   5.564 ± 1.853
μ ± σ sick       0.748 ± 0.093    0.059 ± 0.026    0.08 ± 0.037     52.285 ± 2.96    4.705 ± 1.857
p-value          0.00020492 *     0.000434685 *    0.00242925 *     0.0000765469 *   0.000498528 *

                 Time Area 3      SD Area 1        SD Area 2        SD Area 3        Hours THI ≥ 72
μ ± σ healthy    3.563 ± 1.478    16.821 ± 2.937   11.711 ± 2.309   10.233 ± 2.577   6.981 ± 11.911
μ ± σ sick       2.825 ± 1.155    15.336 ± 2.811   10.538 ± 2.374   8.799 ± 2.039    10.858 ± 12.576
p-value          0.000151992 *    0.0000174387 *   0.0000541604 *   0.0000066027 *   0.0000583004 *
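The comparisons in Table 2 are Mann–Whitney U tests; a minimal sketch with SciPy is shown below. The data are synthetic stand-ins drawn to mimic the "BCS -8 w" row and the class sizes of Table 1 (the study's raw measurements are not available here), so the resulting p-value is illustrative only.

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(42)
# Synthetic stand-ins with the means/SDs and group sizes reported above.
bcs_healthy = rng.normal(loc=3.19, scale=0.432, size=565)
bcs_sick = rng.normal(loc=3.377, scale=0.446, size=106)

stat, p = mannwhitneyu(bcs_healthy, bcs_sick, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p:.2g}")  # small p: the groups differ in location
```

The same call, applied per feature to the real measurements, would reproduce the p-values of Table 2.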
Table 3. Percentages of features missing.

Feature     1       2       3      4       5       6      7      8      9      10     11–19   20
% Missing   31.45   11.03   1.19   31.45   11.03   1.19   1.79   0.15   0.00   0.15   26.23   0.00
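Table 3 shows that several features are frequently missing, and the classifier is explicitly designed to cope with differing data availability. One generic way to do this in a distance-based classifier — an illustrative sketch, not necessarily the authors' mechanism — is to compute the distance only over features observed in both animals and rescale by the covered weight:

```python
import numpy as np

def masked_distance(a, b, w, p=2.0):
    """Weighted Minkowski distance over the features observed in BOTH
    vectors (NaN marks a missing feature), rescaled so that pairs with
    few shared features remain comparable to complete pairs."""
    mask = ~(np.isnan(a) | np.isnan(b))
    if not mask.any():
        return np.nan  # no overlapping features -> distance undefined
    d_p = (w[mask] * np.abs(a[mask] - b[mask]) ** p).sum()
    return float((d_p * w.sum() / w[mask].sum()) ** (1.0 / p))

w = np.ones(4)                       # equal feature weights (hypothetical)
a = np.array([1.0, 2.0, np.nan, 4.0])
b = np.array([1.0, 0.0, 5.0, np.nan])
print(masked_distance(a, b, w))      # distance over features 1 and 2 only
```

The rescaling factor keeps the distance on a common scale regardless of how many features overlap, at the cost of assuming the missing coordinates behave like the observed ones.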
Table 4. Performance measures for our proposed algorithm for both time frames. Acc = accuracy, Sens = sensitivity, Spec = specificity, Prec = precision, J = Youden's index, κ = Cohen's κ, MCC = Matthews correlation coefficient, NPV = negative predictive value.

Experiment   Acc      Sens     Spec     J        κ        F-Score   Prec     MCC      NPV      Lift
1            0.6051   0.6698   0.5929   0.2627   0.1504   0.3489    0.2359   0.1927   0.9054   1.6454
2            0.6140   0.6604   0.6053   0.2657   0.1548   0.3509    0.2389   0.1954   0.9048   1.6732
3            0.7168   0.6321   0.7327   0.3648   0.2553   0.4136    0.3073   0.2841   0.9139   2.3651
4            0.7258   0.6698   0.7363   0.4061   0.2826   0.4356    0.3227   0.3155   0.9224   2.5399
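For reference, all measures in Table 4 can be recomputed from a 2×2 confusion matrix. In the sketch below, the counts are back-computed from Table 1's class sizes (106 sick, 565 healthy) and Experiment 4's sensitivity and specificity, so they are illustrative rather than taken from the paper; note that the table's "Lift" values coincide with the positive likelihood ratio Sens/(1 − Spec).

```python
import math

def scores(tp, fp, fn, tn):
    """Performance measures of Table 4 from confusion-matrix counts."""
    n = tp + fp + fn + tn
    acc = (tp + tn) / n
    sens = tp / (tp + fn)                    # a.k.a. recall
    spec = tn / (tn + fp)
    prec = tp / (tp + fp)
    npv = tn / (tn + fn)
    j = sens + spec - 1                      # Youden's index
    f_score = 2 * prec * sens / (prec + sens)
    p_e = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    kappa = (acc - p_e) / (1 - p_e)          # Cohen's kappa
    mcc = (tp * tn - fp * fn) / math.sqrt(
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    lift = sens / (1 - spec)                 # positive likelihood ratio
    return dict(Acc=acc, Sens=sens, Spec=spec, J=j, kappa=kappa,
                F=f_score, Prec=prec, MCC=mcc, NPV=npv, Lift=lift)

# Counts consistent with Experiment 4: Sens = 71/106, Spec = 416/565.
row4 = scores(tp=71, fp=149, fn=35, tn=416)
print({k: round(v, 4) for k, v in row4.items()})
```

With these counts the function reproduces the Experiment 4 row of Table 4 to four decimal places.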

Sturm, V.; Efrosinin, D.; Öhlschuster, M.; Gusterer, E.; Drillich, M.; Iwersen, M. Combination of Sensor Data and Health Monitoring for Early Detection of Subclinical Ketosis in Dairy Cows. Sensors 2020, 20, 1484. https://doi.org/10.3390/s20051484