Article

A Computer Vision Milky Way Compass

1 School of Engineering, University of South Australia, Mawson Lakes, SA 5095, Australia
2 School of Engineering and Information Technology, University of New South Wales, Canberra, ACT 2610, Australia
3 Department of Biology, Lund University, 22362 Lund, Sweden
4 Joint and Operations Analysis Division, Defence Science and Technology Group, Melbourne, VIC 3207, Australia
* Author to whom correspondence should be addressed.
Appl. Sci. 2023, 13(10), 6062; https://doi.org/10.3390/app13106062
Submission received: 13 April 2023 / Revised: 8 May 2023 / Accepted: 9 May 2023 / Published: 15 May 2023
(This article belongs to the Special Issue Applied Computer Vision in Industry and Agriculture)

Abstract

The Milky Way is used by nocturnal flying and walking insects to maintain heading while navigating. In this study, we explored the feasibility of the method for machine vision systems on autonomous vehicles by measuring the visual features and characteristics of the Milky Way. We also consider the conditions under which the Milky Way is used by insects and the sensory systems that support its detection. Using a combination of simulated and real Milky Way imagery, we demonstrate that appropriate computer vision methods are capable of reliably and accurately extracting the orientation of the Milky Way under an unobstructed night sky. The technique presented achieves an angular accuracy of better than ±2° under moderate light pollution conditions, but we also demonstrate that higher light pollution levels will adversely affect orientation estimates by systems that depend on the Milky Way for navigation.

1. Introduction

Locomotion in straight lines allows organisms to forage, migrate and escape from competition or danger; it is fundamental to the survival of many species. Orientation is a fundamental operation for the closed-loop control that allows effective locomotion. To find the right direction, some animals use landmarks, magnetic cues or wind, while other species use the Sun, the Moon or polarised light as the cue for orientation [1,2,3]. Visual cues from the sky dome are believed to have been used for synchronisation and orientation by organisms almost from the dawn of life, when the first photosensitive cells formed [4]. Navigation using the stars has been undertaken by humans for millennia and has been found in animals such as seals, which have sophisticated brains and high-resolution eyes [5]. Amongst these celestial options for orientation is the Milky Way, although its characteristics differ from the pinpoint precision offered by stars and are more favourable for low angular resolution visual systems.

1.1. The Milky Way

The Milky Way (MW) is a large structure compared to other celestial bodies; a typical image of it is shown in Figure 1. It is an irregular luminous band, as much as 30° wide, composed of stars and gas clouds, stretching across the night sky visible to an observer on Earth (see Figure 2). Astronomers have known since the 1920s that this band is an edge-on view of the MW Galaxy in which we live, a vast pinwheel teeming with nebulae, gas clouds and hundreds of billions of stars, in a class of galaxy known as spiral galaxies. The MW Galaxy is approximately 87,400 light years in diameter and only about 1000 light years thick in the spiral arms. The Sun, Earth’s star, is located in the Orion Spur, about 28,000 light years from the galactic centre. The visible structure of the MW includes dark bands and areas within the luminous band, caused by interstellar clouds of cosmic dust, as shown in Appendix A Figure A1, and collectively known as the Great Rift or Dark Rift. The luminosity of the MW is not constant along its arc; thus, the luminosity of the visible portion of the MW will depend on the observer’s location on Earth and the time of year.
For sensing purposes, the MW is comparatively low contrast; however, it is much larger than the pinpoint stars, and its primary features are composed of low spatial frequencies. It is apparent that sensitivity to low contrast and low light is required to observe it; however, high spatial acuity is not required. This is discussed in the context of insect vision in the next section.

1.2. Insect Vision under Starlight

The method we present is based on observed insect behaviour; thus, it is important to understand what they can actually sense and the limitations of their sensing, since their visual perception is substantially different from our own. Insects have evolved compound eyes that have extraordinary adaptations and specialisations [6], and the tiny optics and eyes of insects are challenged by low light levels. There are two main types of compound eyes, as shown in Figure 3, which are apposition eyes and superposition eyes. In an apposition eye, each ommatidium is sleeved by light-absorbing screening pigment, preventing light reaching the photoreceptors from neighbouring ommatidia. This means that a photoreceptor only receives light from a single lens. Apposition compound eyes are typical of diurnal insects that have evolved to achieve the highest possible angular resolution, such as dragonflies [7] and honey bees [8]. In superposition compound eyes, however, there is a wide optically transparent region, the clear zone (marked cz in Figure 3), between the lenses and the retina. The superposition arrangement allows light from possibly hundreds of lenses to be focused onto photoreceptors in the retina, which improves light sensitivity at the expense of angular resolution. This eye type is more suited to nocturnal insects that are active in dim light [2,9]; such insects include night-active beetles and moths. It is apparent that at the lowest level of illumination, substantial pooling is usually being undertaken by the visual system of night-active insects both through optical means and through neural/computational interconnections [10].
Spatial pooling ensures that the effective angular resolution of the eye will be lower than that indicated by the density of the optical elements. Given the already limited resolution of insect eyes compared to our own experience, the biological findings indicate that the MW may be a useful cue even with comparatively low-resolution vision systems. Appendix A discusses the extreme limits of low light vision that have been found in insect visual systems, allowing vision well below starlight conditions, as shown in Appendix A Figure A2.
Daytime celestial navigation has been observed and studied extensively in many species. In some insect species, such as dragonflies, it has been found anatomically that the dorsal rim area of the compound eye is polarisation sensitive, suggesting that these insects are sensitive to the orientation of skylight polarisation [12,13]. This characteristic allows them to perceive a clear blue sky as a large polarisation pattern aligned to the angular position of the sun. This perception is useful to a flying insect, since otherwise the sky might provide negligible heading information, either absolute or relative, particularly if the sun is near the zenith, near the horizon, or obscured by clouds. A comparison can be made between the characteristics of the MW and the sky polarisation pattern: although their brightness levels are quite different, they are both large, low-contrast structures that dominate the sky when they are visible.
Moreover, it has been found that the desert ant Cataglyphis [14] and the field cricket Gryllus campestris [15] have the ability to use celestial cues to maintain their heading direction while walking, even when their locomotion is disturbed about the yaw axis.
Compared with daytime movement, it is a challenging task to establish and maintain an inertial or celestial frame heading at night. A moonless clear night sky is much dimmer than full daylight, with a change in light intensity from 0.0001 lux under such conditions to 10,000 lux in daylight [16]. A nocturnal dung beetle in Southern Africa, Scarabaeus satyrus, has been found to orient in very straight lines, transporting its dung balls away from competitors. These night-active dung beetles use the MW as a stellar directional cue when the moon and lunar sky polarisation pattern are absent [2,17,18,19]. The MW is used as a landmark for the duration of an outbound sprint. The aim of the insect is to get away, in a straight line, from other beetles congregating and foraging at the same site, who might attempt to commandeer the painstakingly assembled ball.
It is important to understand the limitations of the approach used by dung beetles orienting at night. It seems that they are using the MW as a short-term heading reference, rather than as a compass that is aligned in some way to the inertial frame. In this regard, the mechanism could also be accurately described as a celestial landmark, although the distinction between a compass and a landmark is not significant given the time frame involved and the behaviour of the beetle.

1.3. Contribution of This Study

There have been past insect-biology-inspired studies using celestial cues for navigation. The sky itself can be a reference in the daytime, with sun, clouds and scattering [20] resulting in patterns that are stable [21]. Some implementations have used sky-polarised light [20] to achieve autonomous navigation both on the ground [22,23] and as part of a flying navigation system on a drone [24]. NASA has even considered the use of sky polarisation for navigation in the challenging Mars environment, where the magnetic field is not useful for navigation and where deep terrain features surrounding the vehicle might mask the sun [25].
However, existing technological approaches have not yet considered the MW as a celestial navigation cue to be used in the same manner that S. satyrus dung beetles use it. Due to the size and distinctiveness of the MW, it could be useful for coarse orientation of spacecraft, robots, aircraft and planetary rovers, even under dynamic or high-vibration conditions.
In this paper, we investigate a computer-vision-based navigation algorithm inspired by insect behaviour and physiology that can use the MW as an orientation reference from which it is possible to compute direction signals under low-light conditions. A computer vision approach based on a series of algorithmic operations also allows for the possibility of gaining insight into the challenges and limitations faced by biological systems. Of particular interest when considering the MW is the ecological effect that light pollution and atmospheric pollution might induce.
The MW is a difficult photographic subject under many atmospheric and human geographic conditions, and its position in the sky varies over hours and seasons. Even in regional South Australia, where our laboratory is located, the MW is not always visible on a clear moonless night, depending on humidity and dust. To overcome this shortage of data, our data set consisted of both real photographs of the MW and outputs from the Stellarium software [26], which we used to generate adequate numbers of images under a variety of conditions, such as location, time and light pollution. This study considered the clear night sky without the weather conditions that would preclude visibility of the MW (clouds, fog or rain).
The intent of this study is both to develop a novel heading sensor and to deepen our understanding of precisely why insects have evolved to use this signal. Although the MW is not visible in many human-modified environments due to anthropogenic light, it is quite visible under clear skies in the low-population-density areas of the world. Stars are visible at and above stratospheric altitudes from aircraft, and from satellites at orbital altitudes, even during the day. An example environment where the approach might be most useful is the sparsely populated polar regions, where magnetic compass orientation is of marginal utility [27]. On Mars, for example, the magnetic field is not useful for navigation, leaving a selection of computer vision and celestial approaches, including solar and stellar cues [24,28,29]. The insects that use the Milky Way are important species in their ecosystems, epitomised by the dung beetle. Understanding, simulating and emulating their Milky Way navigation method may help us to understand and even quantify the vulnerabilities of these species to anthropogenic light, allowing mitigation measures to be designed.

2. Materials and Methods

2.1. Data Generation

The MW is not visible in most urban centres across the world, either due to light pollution, geographical location, atmospheric conditions or contaminants [30]. Therefore, it is a challenge to capture adequate night sky images with proper exposure [31]. To overcome this limitation, we used a simulator, Stellarium (version 0.22.2) [26], to generate MW test images for processing. Stellarium is an open-source desktop planetarium software that the creators claim is intended for the community of amateur astronomers and for knowledge transfer in transdisciplinary research. This software was created for simulating the celestial sphere based on a given time and location. In other words, this simulator can give the observer a reasonably realistic rendering of the sky with variable sky and viewing settings [26].
There are several key settings in Stellarium that we used in this paper: date, location, MW brightness/saturation, light pollution level (LP), etc. The following image, Figure 4, shows an example of the latitude and longitude settings.
Before selecting the light pollution levels in Stellarium, we should consider the contrast available to the nocturnal insect’s visual system. The Bortle scale [32] is a nine-level numerical scale that measures the night sky’s brightness, with 1 indicating an excellent dark sky site and 9 indicating a heavily light-polluted site. The corresponding qualitative appearance caused by different levels of light pollution is shown in Table 1, extracted from Stellarium. A sky quality meter (SQM) is used to measure the luminance of the night sky, which is given in the commonly used astronomical units of magnitude per square arc second (mag/arcsec²).
As we can see from Table 1, different light pollution levels correspond to different naked-eye limiting magnitudes (NELM). For example, level 3 corresponds to the rural night sky, where the NELM is 6.6–7.0 and the approximate SQM is 21.69–21.89 mag/arcsec². Sky brightness values with related condition information are shown in Table 2, from [33]. Some natural environmental light levels can be found there; for example, the rural night sky (clear, no moon) zenith sky luminance is in the range 0.25–0.8 mcd/m², with an approximate SQM of 20.3–21.6 mag/arcsec², which is within our assumed range for nocturnal insect night vision, as discussed in Section 1.2. Level 4 corresponds to the rural/suburban transition area, where the NELM is 6.1–6.5 and the approximate SQM is 20.49–21.69 mag/arcsec². The same study [33] also indicates that the rural night sky (overcast) zenith sky luminance is in the range 0.25–2.7 mcd/m², with an approximate SQM of 19.0–21.6 mag/arcsec², which could also be a suitable level setting for generating our test images. Our rationale is that most of the landscape relevant to humans is somewhat light polluted; thus, we selected 3 (rural sky) and 4 (rural/suburban transition) as the light pollution level settings [32]. All of the test images use the default MW brightness/saturation setting (brightness: 1, saturation: 1).
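To relate the SQM figures quoted here to the luminance values in Table 2, a commonly used approximate conversion is L ≈ 10.8 × 10⁴ × 10^(−0.4 SQM) cd/m². The short MATLAB sketch below applies it to the Bortle level 3 and 4 SQM ranges quoted above; the helper function name is ours and the conversion is approximate.

```matlab
% Approximate conversion from a sky quality meter reading (mag/arcsec^2)
% to zenith luminance (mcd/m^2): L [cd/m^2] ~ 10.8e4 * 10^(-0.4 * SQM).
% Illustrative helper; the conversion is approximate.
sqm2mcd = @(sqm) 10.8e4 .* 10.^(-0.4 .* sqm) .* 1e3;   % result in mcd/m^2

bortle3 = [21.69 21.89];   % level 3 (rural sky) SQM range quoted above
bortle4 = [20.49 21.69];   % level 4 (rural/suburban transition) SQM range

fprintf('Bortle 3: %.2f to %.2f mcd/m^2\n', sqm2mcd(bortle3(2)), sqm2mcd(bortle3(1)));
fprintf('Bortle 4: %.2f to %.2f mcd/m^2\n', sqm2mcd(bortle4(2)), sqm2mcd(bortle4(1)));
```

Applied to the 20.3–21.6 mag/arcsec² range in Table 2, the same conversion gives roughly 0.25–0.8 mcd/m², consistent with the zenith luminance column of that table.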
Figure 5 provides the azimuthal grid (green line), celestial equatorial coordinate grid (blue line) and an example of the MW image that we used to perform image processing. Figure 6 shows a part of the MW with different settings of the Bortle scale as the light pollution levels in Stellarium.
As discussed above, Stellarium allows for sufficient parameter settings to provide a more flexible way to create night sky images. However, this software cannot create real-time weather environments such as cloud conditions for the simulated sky images; thus, we only considered the clear night sky with different light pollution levels.

2.2. Methodology

It is well established that the brightness variation in the sky caused by the MW is responsible for the orientation behaviour of S. satyrus dung beetles [18,34], but the actual neural computation involved is not exposed by the behaviour. In this section, we present the development of the computer vision method, summarised in Algorithm 1, that we devised to extract direction information from this large but low-contrast celestial landmark under low light levels. The objective of the method is to reduce the image to a single angle representing the orientation of the MW. The proposed method consists of the distinct steps shown in Figure 7.
Algorithm 1: MW Detection and Orientation
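As an indicative illustration of Algorithm 1, the following minimal MATLAB sketch runs the same sequence of steps described in this section: thresholding, dilation, edge detection, Radon transform and peak extraction. The file name, dilation radius and the use of the Canny detector here are illustrative choices rather than the exact settings used to produce the results in Section 3.

```matlab
% Illustrative MATLAB sketch of the MW detection and orientation pipeline.
% The file name, dilation radius and edge detector are illustrative choices.
I = im2double(rgb2gray(imread('mw_test_image.png')));   % night sky image

level = graythresh(I);                        % Otsu threshold (Section 2.2.1)
BW    = imbinarize(I, level);                 % binary MW foreground mask
BW    = imdilate(BW, strel('disk', 5));       % dilate the detected area

E = edge(double(BW), 'canny');                % edge map (Canny; LRWEEDA in Section 2.2.2)

theta  = 0:179;                               % 1 degree increments (Section 2.2.3)
R      = radon(double(E), theta);             % Radon transform of the edge image
[~, k] = max(R(:));                           % global peak of the transform
[~, c] = ind2sub(size(R), k);
mwAngle = theta(c);                           % estimated MW orientation in degrees

straightened = imrotate(I, -mwAngle);         % rotate to vertical for comparison (Figure 13)
```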

2.2.1. Image Thresholding

The only discernible large object in a clear, moonless night sky is the MW. This suits thresholding techniques that assume that the image contains two pixel groups, those belonging to the foreground and those belonging to the background. Because of its simple computation, robustness and adaptability, Otsu’s technique [35] is extensively used in computer vision applications. The algorithm automatically determines an optimal threshold value to segment the background from the foreground area [36,37]. Figure 8 shows a comparison with two additional common thresholding techniques, Ridler and Calvard’s method [38] and Kapur’s entropy method [39]. All three popular image-thresholding approaches achieve similar results in this situation.
The following equations represent Otsu’s thresholding method for an image with $L$ gray levels. Suppose we split the image pixels into two classes $C_0$ and $C_1$ by a threshold at level $t$. The probability of occurrence of gray level $i$ is given by $n_i/N$:

$$p(i) = \frac{n_i}{N}, \qquad p(i) \geq 0, \qquad \sum_{i=0}^{L-1} p(i) = 1$$

The probabilities of the two class occurrences are given by

$$\omega_0 = P(C_0) = \sum_{i=0}^{t} p(i) = \omega(t)$$

$$\omega_1 = P(C_1) = \sum_{i=t+1}^{L-1} p(i) = 1 - \omega(t)$$

The means of the two classes are calculated from the following equations:

$$\mu_0 = \frac{1}{\omega_0}\sum_{i=0}^{t} i \cdot p(i) = \frac{1}{\omega(t)} \sum_{i=0}^{t} i \cdot p(i)$$

$$\mu_1 = \frac{1}{\omega_1}\sum_{i=t+1}^{L-1} i \cdot p(i) = \frac{1}{1-\omega(t)} \sum_{i=t+1}^{L-1} i \cdot p(i)$$

The total mean can be written as

$$\mu_T = \mu(L-1) = \sum_{i=0}^{L-1} i \cdot p(i)$$

Otsu’s criterion is defined as the between-class variance (BCV),

$$\sigma_B^2 = \omega_0 \left(\mu_0 - \mu_T\right)^2 + \omega_1 \left(\mu_1 - \mu_T\right)^2$$

The optimal Otsu threshold $t^{*}$ is chosen by maximising $\sigma_B^2$:

$$t^{*} = \arg\max_{0 \leq t < L} \; \sigma_B^2(t)$$
We used MATLAB R2021a to calculate the Otsu threshold and applied a flat morphological structuring element to dilate the detected object area. A structuring element of selected radius was used to generate the dilated binary object image. Figure 9 shows the MW detection result for the generated test image shown in Figure 10, with cardinal points.
In Figure 9, the left image shows the detection area overlaid on the original RGB test image, the middle image shows the thresholding result for the MW area, and the right image shows the edge detection result.
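To make the link to the equations above explicit, the between-class variance maximisation can be written in a few lines of MATLAB. This is a didactic sketch of the Otsu criterion rather than the code used to produce the results; in practice, MATLAB’s built-in graythresh implements the same criterion, and the file name below is illustrative.

```matlab
% Didactic sketch of Otsu's criterion: choose the threshold t that
% maximises the between-class variance sigma_B^2(t).
I = imread('mw_test_image.png');              % file name illustrative
if size(I, 3) == 3, I = rgb2gray(I); end
I = im2uint8(I);

counts = imhist(I, 256);                      % histogram counts n_i
p      = counts / sum(counts);                % p(i) = n_i / N
omega  = cumsum(p);                           % omega(t), probability of class C0
mu     = cumsum((0:255)' .* p);               % cumulative mean up to level t
muT    = mu(end);                             % total mean mu_T

% Compact, equivalent form of the between-class variance
sigmaB2 = (muT .* omega - mu).^2 ./ (omega .* (1 - omega) + eps);
[~, tOpt] = max(sigmaB2);                     % index of the optimal threshold

BW = I > (tOpt - 1);                          % pixels above t form the MW foreground
```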

2.2.2. Low Redundancy Wavelet Entropy Edge Detection (LRWEEDA)

This method uses a combination of the wavelet transform, Shannon entropy and thresholding to achieve edge detection. LRWEEDA was developed to be efficient, with low redundancy and noise resilience, and it is well suited to real-time image processing applications [40]. LRWEEDA’s basis in Shannon entropy ensures that its performance at high noise or low signal levels will be consistent and constrained by the information available. There are many edge detection methods, the most notable being the Canny edge detector [41] (which we also used in the results section below), as well as neural-network-based techniques [42]; thus, the selection of LRWEEDA and Canny was a choice from many options. The objective is for the algorithm to run within the modest computational resources of embedded hardware (such as the Raspberry Pi 4 or even the Raspberry Pi Zero 2); thus, deep-learning-based approaches were avoided despite their high performance [43,44].
For each wavelet decomposition level, the horizontal and vertical detail components, $W_\psi^H(j,m,n)$ and $W_\psi^V(j,m,n)$, are combined and normalised between zero and one to define $\mathcal{W}(j,m,n)$:

$$\mathcal{W}(j,m,n) = \operatorname{normalise}\!\left( W_\psi^H(j,m,n) + W_\psi^V(j,m,n) \right)$$

The Shannon entropy of an image is defined as

$$H\big(I(x,y)\big) = -\sum_{i=0}^{L} p_{I_i} \log p_{I_i}$$

where $p_{I_i}$ is the probability of pixel value $I_i$ occurring within $I(x,y)$, and $L$ is the maximum pixel value within $I(x,y)$. To combine Shannon entropy with the wavelet decomposition, the Shannon entropy value is calculated at each decomposition level $j$. Thus, setting $I(x,y) = \mathcal{W}(j,m,n)$ results in

$$H\big(\mathcal{W}(j,m,n)\big) = -\sum_{i=0}^{L} p_i \log p_i$$

The optimal wavelet decomposition level is denoted as $\mathcal{W}(\beta,m,n)$. A variable threshold $\Lambda$ is applied to the image:

$$\forall\, m,n \in \mathcal{W}(\beta,m,n): \quad \mathcal{W}(\beta,m,n) = \begin{cases} \mathcal{W}(\beta,m,n) & \text{if } \mathcal{W}(\beta,m,n) \geq \Lambda \\ 0 & \text{if } \mathcal{W}(\beta,m,n) < \Lambda \end{cases}$$

where $\Lambda$ is the threshold value and varies in the range $0 \leq \Lambda \leq 1$.
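Below is a simplified MATLAB (Wavelet Toolbox) sketch of this step: a single-level 2-D wavelet decomposition supplies the horizontal and vertical detail components, which are combined, normalised to [0, 1] and thresholded. The full LRWEEDA algorithm in [40] additionally uses the Shannon entropy to select the decomposition level; the wavelet family, the single decomposition level, the use of absolute coefficient values and the threshold value below are illustrative simplifications, as is the file name.

```matlab
% Simplified wavelet edge sketch (not the full LRWEEDA of [40]): combine
% horizontal and vertical detail coefficients, normalise and threshold.
I  = im2double(rgb2gray(imread('mw_test_image.png')));   % file name illustrative
BW = double(imbinarize(I, graythresh(I)));               % binary MW mask (Section 2.2.1)

[~, cH, cV, ~] = dwt2(BW, 'haar');            % one decomposition level (family illustrative)
W = abs(cH) + abs(cV);                        % combined detail image
W = (W - min(W(:))) / (max(W(:)) - min(W(:)) + eps);     % normalise to [0, 1]

% Shannon entropy of the combined detail image, H = -sum(p .* log(p));
% LRWEEDA uses this to select the decomposition level.
p = histcounts(W(:), 64, 'Normalization', 'probability');
H = -sum(p(p > 0) .* log(p(p > 0)));

Lambda = 0.5;                                 % threshold in [0, 1] (illustrative)
E = W >= Lambda;                              % edge map passed to the Radon transform
```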

2.2.3. Radon Transform

The Radon transform (RT) has been used for extracting or reconstructing angular information in many image processing applications. The Radon transform of an input image f(x, y) can be denoted as g(s, θ) and is defined as the line integral of f along a line inclined at angle θ from the y-axis and at a distance s from the origin, as shown in Figure 11. The algorithm was fully developed by Toft in [45]:

$$g(s,\theta) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f(x,y)\,\delta(x\cos\theta + y\sin\theta - s)\,dx\,dy$$

where $-\infty < s < \infty$ and $0 \leq \theta < \pi$. Here, g(s, θ) is the projection obtained by the Radon transform of f(x, y) at angle θ.
One useful property of the Radon transform is that rotation of the image by an angle $\theta_r$ causes the Radon transform to be shifted by the same amount in its angular variable; thus,

$$f\left(x\cos\theta_r - y\sin\theta_r,\; x\sin\theta_r + y\cos\theta_r\right) \;\longleftrightarrow\; g\left(s,\,\theta - \theta_r\right)$$
As discussed above, the Radon transform for each pair (s, θ) calculates the total intensity along the specific line defined by those parameters. In other words, the maximum value of the Radon transform corresponds to the straight line of maximum integrated intensity through the detected MW area. The Radon transform was implemented using the radon function in MATLAB, with θ sampled from 0° to 180° in 1° increments. In this study, the implementation of the Radon transform therefore produces angle outputs in integer degrees, which limits the resolution of the results. Figure 12 shows the location of the peak of the Radon transform and displays a circle over the maximum.
Once the angle θ of the maximum has been calculated, we can rotate the original image through the negative of this angle to straighten it, as shown in Figure 13; this is done entirely for comparison purposes, to explore the quality of the result.
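The shift property above is what makes the Radon peak usable as a rotation reference, and it can be checked numerically: rotating the edge image by a known angle should move the angular position of the Radon peak by approximately the same amount. The following MATLAB sketch illustrates such a check; the file name and the 30° test rotation are illustrative, and the shift is compared by magnitude only, since its sign depends on the image coordinate convention.

```matlab
% Numerical check of the Radon shift property: rotating the edge image by
% theta_r moves the angular position of the Radon peak by about theta_r.
I = im2double(rgb2gray(imread('mw_test_image.png')));    % file name illustrative
E = edge(double(imbinarize(I, graythresh(I))), 'canny'); % edge image (Sections 2.2.1-2.2.2)

theta = 0:179;
R0 = radon(double(E), theta);
[~, k0] = max(R0(:));  [~, c0] = ind2sub(size(R0), k0);
angle0 = theta(c0);                           % peak angle of the original edge image

thetaR = 30;                                  % illustrative test rotation (degrees)
Erot = imrotate(double(E), thetaR, 'crop');   % rotate about the image centre
R1 = radon(Erot, theta);
[~, k1] = max(R1(:));  [~, c1] = ind2sub(size(R1), k1);
angle1 = theta(c1);

d = mod(angle1 - angle0, 180);
shift = min(d, 180 - d);                      % magnitude of the angular shift
fprintf('Peak moved by %g degrees (applied rotation %g degrees)\n', shift, thetaR);
```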

3. Results

The algorithm was tested on both simulated images and real night sky images. The Bortle scale levels selected for the simulated images used in the processing set were 3 and 4, with MW brightness and saturation left at their defaults (brightness: 1, saturation: 1). The simulated test image results can be seen in Figure 14; the first column provides a brighter and clearer MW shape (brightness: 5), which can be used for comparison between the Stellarium MW area and the detected MW area. The real images we used are from a live sky camera located at Mount Burnett Observatory, Australia [46]; additional information about the camera is provided in Table 3. In order to illustrate the MW shape clearly, Figure 15 shows the MW area with increased brightness (Stellarium MW brightness set to 5).
The results from the live sky camera can be seen in Figure 16. This figure illustrates the MW detection and angle calculation results for real night sky images.
The intent of the algorithm is to simulate the use of the MW as a celestial direction reference. The algorithm was executed on simulation images corresponding to an observer undergoing a continuous rotation of 0.5°/s with updates at 10 s intervals, imposing both a rotation and the gradual movement of the celestial hemisphere and MW over time. All of the dates and times for this test were set on different moonless nights. Figure 17 shows a small subset of the test images from Stellarium for illustrative purposes: the images were captured from angle θ to θ + 315°, at 45° intervals. All of the images used for this angle calculation test were under LP: 4 conditions (MW brightness: 1 and saturation: 1). The rotation angle calculation results for three times and locations are plotted in Figure 18.
A Canny edge detector [41] was also tested with comparable results, shown in comparison to the LRWEEDA in both simulated images (Figure 19) and real sky images (Figure 20).
Figure 21 shows the low-resolution images generated from the Stellarium test images (MELO 0227), with the pixel values provided in Figure 22. The original image is 965 × 965 pixels, and the downsampled image resolutions are shown in Figure 22. As we can see from Figure 21, in low-resolution scenarios our proposed method still provides reliable navigation cues, even when the resolution is reduced to around 100 × 100 pixels.
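The down-sampling experiment can be reproduced in outline by reducing the input image with imresize before applying the same detection and Radon steps. The sketch below uses a 100 × 100 target size to match the lowest resolution reported above; the file name, dilation radius and edge detector are illustrative.

```matlab
% Sketch of the low-resolution test: downsample, then run the same pipeline.
I    = im2double(rgb2gray(imread('mw_test_image.png'))); % file name illustrative
Ilow = imresize(I, [100 100]);                % approximately the lowest resolution tested

BW = imdilate(imbinarize(Ilow, graythresh(Ilow)), strel('disk', 2));
E  = edge(double(BW), 'canny');

theta  = 0:179;
R      = radon(double(E), theta);
[~, k] = max(R(:));  [~, c] = ind2sub(size(R), k);
fprintf('Estimated MW orientation at 100 x 100: %g degrees\n', theta(c));
```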
Figure 23 and Figure 24 show the MW area that was detected on two sets of test images from Stellarium chosen from the same location but on different dates. In each test set, only the light pollution levels were set from 1 to 7; the remaining settings were the same (location, date, time, MW brightness and saturation). The resolution for all test images for the LP level comparison was 1280 × 920; the figure shows the total pixel values inside the MW area. As we can see from Figure 23, all test images are listed around the graph, and the total pixel values for each LP level were calculated by summing the detected MW area from the binary images.
Figure 25 shows the comparison of two different light pollution levels (LP: 4 and LP: 6) with simulated test image rotation angle calculation results. All test images were generated in Stellarium with the same date, location and MW (brightness: 1 and saturation: 1) settings. For all of the test images, the angle changed 5° for each iteration every 10 s of Stellarium simulated time.

4. Discussion

Overall, the mean accuracy of the method is better than ±2° for the sequences tested under LP: 4, as shown in Table 4, which is comparable to other azimuthal reference technologies, whether magnetic or celestial. The results degraded under higher light pollution levels. Maximum errors did not exceed ±4° under LP: 4 conditions. However, in the presence of stronger light pollution, the angle reference provided by the MW starts to break down; under LP: 6, maximum errors of around ±18° emerge. In addition, different features of the MW endure longer under high LP levels, leading to a solution that is offset compared with the angle measured under darker skies. Exploring the performance of this method of segmenting the MW (and, by analogy, probably any comparable method), it becomes apparent that increasing LP levels render the MW first inaccurate, then unreliable, and finally invisible and useless for navigation.
Unlike the individual points of intensity that represent stars, the MW is a large distributed pattern, which in some ways makes it a complementary detection problem to star detection by celestial navigation systems. There are a number of advantages to this, including robustness against vibration that would blur out individual stars, and robustness to loss of focus and other phenomena such as condensation and dust on lenses. At this stage, the system is not robust to partial occlusion by clouds, foliage or buildings, although such obstacles do not exist in every environment for every use case. The system also cannot currently detect when it is not functioning, a necessary capability in a navigation system. These problems might be overcome using a two-stage process of detection and measurement well suited to deep learning.
An interesting outcome of the study was that the problem transferred easily between a simulator that makes some effort to achieve visual realism and real star images. This bodes well for future pattern classification methods, since there will be no limitation caused by the availability of data, as long as the likely environmental illuminants and occlusions can be meaningfully incorporated, superimposed or post-processed into the simulator output.
Although this study was inspired by the visual behaviour of insects, it still used cameras that are bound by the limitations of conventional electronics. The megapixel resolution of the cameras is significantly more than the mere hundreds to thousands of ommatidia that form the input to nocturnal insect visual systems. A more biomimetic approach would probably make better use of the characteristics of the MW, including its large size, dominant low spatial frequency features and dimness compared to surrounding stars. Orders of magnitude of efficiency and sensitivity may be achieved by a tuned and optimised system.
We did not use a deep learning approach in this initial study because we wanted to understand the characteristics of the problem and of the solution. We also wanted to understand the nature of the data and potential sources of data, since the MW is a demanding photographic subject that would need to be filtered carefully before being inserted into a training data set. Deep learning is likely to be useful for segmenting the MW from other visible objects, such as trees and buildings but also clouds.
From an ecological perspective, we have shown that under low light pollution conditions, insects should certainly be able to see the MW and use this information for direction signals. However, a cautionary observation is that light pollution is the primary risk to this mode of navigation. Nocturnal insects play vital ecological roles; for example, the dung beetle’s role is well documented [47], and moths and night flying insects are variously pollinators and food for organisms higher in the food chain. Anthropogenic light is accepted as a cause of damage to insect populations [48] and yet would also appear to be manageable and largely avoidable. In this work, we have demonstrated how much disruption it can cause to a MW navigation system.
More system integration is required before this sensor modality can be used for true navigation, beyond the vision system being able to detect the presence of reliable MW conditions. An accurate internal clock and an almanac for the orientation of the MW in the inertial frame at a given location would be required to determine a navigation heading. A reference for the direction of the gravity vector would also be required, particularly for spacecraft and aircraft. However, for short-term heading stabilisation in low light pollution environments, the Milky-Way-based approach presented here should serve a robot just as well as the MW serves nocturnal beetles.

5. Conclusions

Using biology for inspiration and conventional machine vision techniques for execution, we have demonstrated that the MW can be reliably detected and measured for orientation as a suitable input to a navigation system. The method achieved maximum errors of less than ± 5 ° and mean errors of less than ± 2 ° under the low light pollution test conditions. The study also demonstrated the detrimental effect of stray light on MW-based navigation.
Future research work will consider deep learning and neural biomimetic approaches that may be more robust and less computationally expensive. The method presented is low in computational cost and appears to be reliable; thus, we will soon endeavour to use the method for on board processing of stellar imagery on robotic platforms. Substantial optimisation and robustness can be added to the technique by exploiting the low-resolution aspect of insect vision.

Author Contributions

Conceptualisation, Y.T. and J.C.; methodology, Y.T., A.P. and S.T.; software, Y.T., A.P. and S.T.; validation, Y.T. and J.C.; formal analysis, Y.T.; investigation, Y.T. and J.C.; resources, Y.T.; data curation, Y.T.; writing—original draft preparation, Y.T.; writing—review and editing, Y.T., M.L., A.P., E.W. and J.C.; visualisation, Y.T. and J.C.; supervision, J.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

We thank James Murray of the Mount Burnett Observatory for providing All Sky Camera data for processing.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
MW      Milky Way
LP      light pollution
ESO     European Southern Observatory
SQM     sky quality meter
NELM    naked-eye limiting magnitude
BCV     between-class variance
LRWEEDA Low Redundancy Wavelet Entropy Edge Detection
RT      Radon transform
MELO    Melbourne Observatory
AAO     Australia Astronomical Observatory

Appendix A

Appendix A.1. The Milky Way

The MW has a substantial internal structure and features that are named, as shown in Figure A1.
Figure A1. Main dark nebulae of the Solar apex half of the galactic plane. Image courtesy of Roberto Mura under Creative Commons Attribution-ShareAlike 4.0 International license.

Appendix A.2. Insect Vision in Less Than Starlight

Nocturnal insects have a wide range of adaptations, optical, anatomical and neurophysiological, allowing them to fly and navigate under less than starlight illumination, for example under rainforest canopies in the middle of the night [49]. The studied halictid bees have photoreceptors 30 times more sensitive than those of honey bees, but given their apposition eyes, this alone would not be enough to function in starlight. They have also evolved additional neural adaptations to pool light even more broadly across the eye to capture as much light as possible. Superposition eyes offer high performance under low-light conditions. Because of their large ommatidial lenses and wide rhabdoms, some large nocturnal insects with superposition compound eyes, such as hawkmoths, can capture light 4000 times as effectively as photoreceptors in a honeybee’s eye or the human retina, at the same light intensity [50]. Based on Kelber et al., we will estimate that nocturnal insects retain visual perception in dim light as low as 0.1 mcd/m²; however, it is also known that this vision is comparatively low resolution, while still being sensitive to changes in level of illumination. A low-resolution eye that can detect patterns at very low light intensities is suited to the dim, distributed shape of the MW.
Figure A2, adapted from Kelber et al. [50], shows the dim light colour vision and visual thresholds of species. Colours in the bars code for receptor types contributing to vision.
Figure A2. Dim light colour vision and thresholds of species. Figure adapted from data in Kelber et al. [50].

References

  1. Foster, J.J.; El Jundi, B.; Smolka, J.; Khaldy, L.; Nilsson, D.E.; Byrne, M.J.; Dacke, M. Stellar performance: Mechanisms underlying Milky Way orientation in dung beetles. Philos. Trans. R. Soc. Biol. Sci. 2017, 372, 20160079. [Google Scholar] [CrossRef]
  2. Dacke, M.; Baird, E.; El Jundi, B.; Warrant, E.J.; Byrne, M. How dung beetles steer straight. Annu. Rev. Entomol. 2021, 66, 243–256. [Google Scholar] [CrossRef]
  3. Reppert, S.M.; Zhu, H.; White, R.H. Polarized light helps monarch butterflies navigate. Curr. Biol. 2004, 14, 155–158. [Google Scholar] [CrossRef]
  4. Nilsson, D.E. Eye ancestry: Old genes for new eyes. Curr. Biol. 1996, 6, 39–42. [Google Scholar] [CrossRef]
  5. Mauck, B.; Gläser, N.; Schlosser, W.; Dehnhardt, G. Harbour seals (Phoca vitulina) can steer by the stars. Anim. Cogn. 2008, 11, 715–718. [Google Scholar] [CrossRef]
  6. Hardie, R.C.; Stavenga, D.G. Facets of Vision; Springer: Berlin/Heidelberg, Germany, 1989. [Google Scholar]
  7. Land, M.F.; Nilsson, D.E. Animal Eyes; OUP: Oxford, UK, 2012. [Google Scholar]
  8. Rigosi, E.; Wiederman, S.D.; O’Carroll, D.C. Visual acuity of the honey bee retina and the limits for feature detection. Sci. Rep. 2017, 7, 45972. [Google Scholar] [CrossRef]
  9. Warrant, E.; Somanathan, H. Colour vision in nocturnal insects. Philos. Trans. R. Soc. B 2022, 377, 20210285. [Google Scholar] [CrossRef]
  10. Stöckl, A.L.; O’Carroll, D.C.; Warrant, E.J. Neural summation in the hawkmoth visual system extends the limits of vision in dim light. Curr. Biol. 2016, 26, 821–826. [Google Scholar] [CrossRef]
  11. Warrant, E.J. The remarkable visual capacities of nocturnal insects: Vision at the limits with small eyes and tiny brains. Philos. Trans. R. Soc. B Biol. Sci. 2017, 372, 20160063. [Google Scholar] [CrossRef]
  12. Labhart, T.; Meyer, E.P. Detectors for polarized skylight in insects: A survey of ommatidial specializations in the dorsal rim area of the compound eye. Microsc. Res. Tech. 1999, 47, 368–379. [Google Scholar] [CrossRef]
  13. Meyer, E.P.; Labhart, T. Morphological specializations of dorsal rim ommatidia in the compound eye of dragonflies and damselfies (Odonata). Cell Tissue Res. 1993, 272, 17–22. [Google Scholar] [CrossRef]
  14. Fent, K.; Wehner, R. Ocelli: A celestial compass in the desert ant Cataglyphis. Science 1985, 228, 192–194. [Google Scholar] [CrossRef]
  15. Brunner, D.; Labhart, T. Behavioural evidence for polarization vision in crickets. Physiol. Entomol. 1987, 12, 1–10. [Google Scholar] [CrossRef]
  16. The Engineering Toolbox. Recommended Light Levels (Illuminance) for Outdoor and Indoor Venues. 2020. Available online: https://www.engineeringtoolbox.com/light-level-rooms-d_708.html (accessed on 1 February 2023).
  17. Dacke, M.; El Jundi, B. The dung beetle compass. Curr. Biol. 2018, 28, R993–R997. [Google Scholar] [CrossRef]
  18. Dacke, M.; Baird, E.; Byrne, M.; Scholtz, C.H.; Warrant, E.J. Dung beetles use the Milky Way for orientation. Curr. Biol. 2013, 23, 298–300. [Google Scholar] [CrossRef]
  19. Warrant, E.; Dacke, M. Visual navigation in nocturnal insects. Physiology 2016, 31, 182–192. [Google Scholar] [CrossRef]
  20. Rayleigh, L. XXXIV. On the transmission of light through an atmosphere containing small particles in suspension, and on the origin of the blue of the sky. Lond. Edinb. Dublin Philos. Mag. J. Sci. 1899, 47, 375–384. [Google Scholar] [CrossRef]
  21. Jouir, T.; Strydom, R.; Srinivasan, M.V. A 3D sky compass to achieve robust estimation of UAV attitude. In Proceedings of the Australasian Conference on Robotics and Automation (ACRA), Canberra, Australia, 2–4 December 2015. [Google Scholar]
  22. Lambrinos, D.; Kobayashi, H.; Pfeifer, R.; Maris, M.; Labhart, T.; Wehner, R. An autonomous agent navigating with a polarized light compass. Adapt. Behav. 1997, 6, 131–161. [Google Scholar] [CrossRef]
  23. Dupeyroux, J.; Viollet, S.; Serres, J.R. Polarized skylight-based heading measurements: A bio-inspired approach. J. R. Soc. Interface 2019, 16, 20180878. [Google Scholar] [CrossRef]
  24. Chahl, J.; Mizutani, A. Biomimetic attitude and orientation sensors. IEEE Sens. J. 2010, 12, 289–297. [Google Scholar] [CrossRef]
  25. Thakoor, S.; Morookian, J.M.; Chahl, J.; Hine, B.; Zornetzer, S. BEES: Exploring mars with bioinspired technologies. Computer 2004, 37, 38–47. [Google Scholar] [CrossRef]
  26. Zotti, G.; Hoffmann, S.M.; Wolf, A.; Chéreau, F.; Chéreau, G. The simulated sky: Stellarium for cultural astronomy research. arXiv 2021, arXiv:2104.01019. [Google Scholar] [CrossRef]
  27. Kayton, M.; Fried, W.R. Avionics Navigation Systems; John Wiley & Sons: Hoboken, NJ, USA, 1997. [Google Scholar]
  28. Parish, J.J.; Parish, A.S.; Swanzy, M.; Woodbury, D.; Mortari, D.; Junkins, J.L. Stellar positioning system (part I): An autonomous position determination solution. Navigation 2010, 57, 1–12. [Google Scholar] [CrossRef]
  29. Amert, J.; Fritzinger, M. Hardware Demonstration and Improvements of the Stellar Positioning System. In Proceedings of the 45th Annual AAS Guidance, Navigation and Control (GN&C) Conference, Breckenridge, CO, USA, 2–8 February 2023. [Google Scholar]
  30. Falchi, F.; Cinzano, P.; Duriscoe, D.; Kyba, C.C.; Elvidge, C.D.; Baugh, K.; Portnov, B.A.; Rybnikova, N.A.; Furgoni, R. The new world atlas of artificial night sky brightness. Sci. Adv. 2016, 2, e1600377. [Google Scholar] [CrossRef]
  31. Lucas, M.A.; Chahl, J.S. Challenges for biomimetic night time sky polarization navigation. In Proceedings of the Bioinspiration, Biomimetics, and Bioreplication 2016, Las Vegas, NV, USA, 21–25 March 2016; Volume 9797, pp. 11–22. [Google Scholar]
  32. Bortle, J.E. The Bortle Dark-Sky Scale. In Sky and Telescope; 2001; Available online: https://skyandtelescope.org/wp-content/uploads/BortleDarkSkyScale.pdf (accessed on 15 January 2023).
  33. Hänel, A.; Posch, T.; Ribas, S.J.; Aubé, M.; Duriscoe, D.; Jechow, A.; Kollath, Z.; Lolkema, D.E.; Moore, C.; Schmidt, N.; et al. Measuring night sky brightness: Methods and challenges. J. Quant. Spectrosc. Radiat. Transf. 2018, 205, 278–290. [Google Scholar] [CrossRef]
  34. Foster, J.J.; Smolka, J.; Nilsson, D.E.; Dacke, M. How animals follow the stars. Proc. R. Soc. B Biol. Sci. 2018, 285, 20172322. [Google Scholar] [CrossRef]
  35. Otsu, N. A threshold selection method from gray-level histograms. IEEE Trans. Syst. Man Cybern. 1979, 9, 62–66. [Google Scholar] [CrossRef]
  36. Goh, T.Y.; Basah, S.N.; Yazid, H.; Safar, M.J.A.; Saad, F.S.A. Performance analysis of image thresholding: Otsu technique. Measurement 2018, 114, 298–307. [Google Scholar] [CrossRef]
  37. Siddique, M.A.B.; Arif, R.B.; Khan, M.M.R. Digital image segmentation in matlab: A brief study on otsu’s image thresholding. In Proceedings of the 2018 International Conference on Innovation in Engineering and Technology (ICIET), Dhaka, Bangladesh, 27–28 December 2018; pp. 1–5. [Google Scholar]
  38. Ridler, T.; Calvard, S. Picture thresholding using an iterative selection method. IEEE Trans. Syst. Man Cybern 1978, 8, 630–632. [Google Scholar]
  39. Kapur, J.N.; Sahoo, P.K.; Wong, A.K. A new method for gray-level picture thresholding using the entropy of the histogram. Comput. Vision, Graph. Image Process. 1985, 29, 273–285. [Google Scholar] [CrossRef]
  40. Tao, Y.; Scully, T.; Perera, A.G.; Lambert, A.; Chahl, J. A Low Redundancy Wavelet Entropy Edge Detection Algorithm. J. Imaging 2021, 7, 188. [Google Scholar] [CrossRef]
  41. Canny, J. A Computational Approach to Edge Detection. IEEE Trans. Pattern Anal. Mach. Intell. 1986, 8, 679–698. [Google Scholar] [CrossRef]
  42. Li, C.; Qu, Z. Review of image edge detection algorithms based on deep learning. J. Comput. Appl. 2020, 40, 3280. [Google Scholar]
  43. Su, Z.; Liu, W.; Yu, Z.; Hu, D.; Liao, Q.; Tian, Q.; Pietikäinen, M.; Liu, L. Pixel difference networks for efficient edge detection. In Proceedings of the IEEE/CVF International Conference on Computer Vision, Montreal, BC, Canada, 11–17 October 2021; pp. 5117–5127. [Google Scholar]
  44. Liu, Y.; Cheng, M.M.; Hu, X.; Wang, K.; Bai, X. Richer convolutional features for edge detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 3000–3009. [Google Scholar]
  45. Toft, P. The Radon Transform: Theory and Implementation. Ph.D. Dissertation, Technical University of Denmark, Copenhagen, Denmark, 1996. [Google Scholar]
  46. Mount Burnett Observatory. All Sky Camera. 2023. Available online: https://skypi.mbo.org.au/allsky/videos/ (accessed on 5 February 2023).
  47. Hanski, I.; Cambefort, Y. Dung Beetle Ecology; Princeton University Press: Princeton, NJ, USA, 2014; Volume 1195. [Google Scholar]
  48. Longcore, T.; Rich, C. Ecological light pollution. Front. Ecol. Environ. 2004, 2, 191–198. [Google Scholar] [CrossRef]
  49. Warrant, E.J.; Kelber, A.; Gislén, A.; Greiner, B.; Ribi, W.; Wcislo, W.T. Nocturnal vision and landmark orientation in a tropical halictid bee. Curr. Biol. 2004, 14, 1309–1318. [Google Scholar] [CrossRef]
  50. Kelber, A.; Yovanovich, C.; Olsson, P. Thresholds and noise limitations of colour vision in dim light. Philos. Trans. R. Soc. B Biol. Sci. 2017, 372, 20160065. [Google Scholar] [CrossRef]
Figure 1. The MW observed under a rural sky in South Australia.
Figure 2. A 360 degree panoramic image covering the southern and northern celestial spheres of the MW created by joining multiple images. Photo: Courtesy of the European Southern Observatory (ESO) under Creative Commons Attribution-ShareAlike 4.0 International license.
Figure 3. Simplified depiction of apposition (Left) and superposition (Right) compound eyes. The clear zone (cz) found in superposition eyes is labelled. Illustration adapted from [11].
Figure 4. Stellarium location setting, Lincoln National Park, SA.
Figure 5. Left: Stellarium image (light pollution level: 1, MW brightness: 4, saturation: 1) with the azimuthal grid (green). Middle: Stellarium image (light pollution level: 1, MW brightness: 4, saturation: 1) with the celestial equatorial coordinate grid (blue). Right: Stellarium sky image (light pollution level: 5, MW brightness: 1, saturation: 1).
Figure 6. Bortle scale levels.
Figure 7. Flow chart for angle calculation steps from MW images.
Figure 8. Thresholding method comparison for real sky images.
Figure 9. Left: RGB image. Middle: binary image. Right: edge image.
Figure 10. Test image from Stellarium with the following settings: location: SA, Australia, FOV: 120°, date: 1 February 2022, time: 02:30:17 (UTC+10:30).
Figure 11. Projection integral in the direction θ.
Figure 12. Radon transform, indicating the maximum value of θ.
Figure 13. Left: angle calculation result for Figure 10. Right: angle result for rotation to vertical.
Figure 14. First column: In order to illustrate the results clearly, this column shows the MW with increased brightness (Stellarium MW brightness set as 5). 2nd–7th columns: Detection and angle calculation results for simulated sky images. The light pollution level for the test images is 3.
Figure 15. Top row: All Sky Camera images from Mount Burnett Observatory; bottom row: the simulated sky images with the increased brightness of MW area (Stellarium MW brightness set as 5).
Figure 16. First column: Real sky images that are used for angle calculation test. 2nd–6th columns: Detection and angle calculation results for real sky images.
Figure 17. Moving angle calculation test images from Stellarium.
Figure 18. Moving angle calculation: Location: Melbourne Observatory (MELO), date: 27 February (start time 23:50). Location: Australia Astronomical observatory (AAO), dates: 25 May (start time 22:30), 1 July (start time 22:00).
Figure 19. Comparison for the Canny and LRWEEDA edge detection methods that were tested on a full rotation circle of the test images from Stellarium with the following settings: Location: Australia Astronomical Observatory (AAO), date: 25 May (start time 22:30).
Figure 20. Comparison of the Canny and LRWEEDA edge detection methods tested on real sky images from the All Sky Camera, Mount Burnett, Australia.
Figure 21. Comparison for the low-resolution images from Stellarium. Moving angle calculation: location: Melbourne Observatory (MELO), date: 27 February (start time 23:50).
Figure 22. Comparison for the low-resolution images from Stellarium with pixel values for different resolution levels.
Figure 23. The MW area that was detected with light pollution levels set from 1 to 7 in Stellarium (date: 2 January 2022, time: 02:30 (UTC +10:30), location: SA, Australia).
Figure 24. The MW area that was detected with light pollution levels set from 1 to 7 in Stellarium (date: 2 May 2022, time: 02:30 (UTC +09:30), location: SA, Australia).
Figure 25. Moving angle calculation for light pollution set as 4 and 6, location: Melbourne Observatory (MELO), date: 27 February (start time 23:50).
Table 1. Light pollution levels.
Level | Title | SQM (mag/arcsec²) | NELM
1 | Excellent dark sky site | 21.7–22.0 | 7.6–8.0
2 | Typical truly dark site | 21.5–21.7 | 7.1–7.5
3 | Rural sky | 21.3–21.5 | 6.6–7.0
4 | Rural/suburban transition | 20.4–21.3 | 6.1–6.5
5 | Suburban sky | 19.1–20.4 | 5.6–6.0
6 | Bright suburban sky | 18.0–19.1 | 5.1–5.5
7 | Suburban/urban transition | 18.0–19.1 | 5.0 at best
8 | City sky | <18.0 | 4.5 at best
9 | Inner city sky | <18.0 | 4.0 at best
Table 2. Night sky brightness values.
Condition | Illuminance (mlux) | Zenith Sky Luminance (mcd/m²) | Zenith Radiance (mag/arcsec²)
Overcast natural night | <0.6 | <0.2 | >21.8
Natural starlit night | 0.6–0.9 | 0.2–0.3 | 21.4–21.9
Bulge of the MW | N/A | 2.71 | 20.5–21.0
Rural night sky (clear, no moon) | 0.7–3 | 0.25–0.8 | 20.3–21.6
Rural night sky (overcast) | 0.7–9 | 0.25–0.7 | 19.0–21.6
Table 3. Information about the All Sky Camera, Mount Burnett, Australia.
All Sky Camera Information
Location | Mount Burnett, Victoria, Australia
Latitude | 37.9725 S
Longitude | 145.4955 E
Camera | ASI224MC
Exposure | 30 s
Computer | Raspberry Pi 3B+
Table 4. Summary of results from the MW compass method and corresponding LP levels.
Sequence | LP | Mean Error | Maximum Error
AAO 01 Jul | 4 | ±1.71° | ±4.0°
AAO 25 May | 4 | ±0.19° | ±1.0°
MELO 27 Feb | 4 | ±0.58° | ±1.0°
MELO 27 Feb | 6 | ±1.53° | ±18.0°
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
