Article

Wireless Sensor Networks Using Sub-Pixel Optical Camera Communications: Advances in Experimental Channel Evaluation †

by Vicente Matus 1,*, Victor Guerra 1, Cristo Jurado-Verdu 1, Stanislav Zvanovec 2 and Rafael Perez-Jimenez 1

1 Institute for Technological Development and Innovation in Communications (IDeTIC), University of Las Palmas de Gran Canaria, 35001 Las Palmas, Spain
2 Department of Electromagnetic Field, Faculty of Electrical Engineering, Czech Technical University in Prague, Technicka, 16627 Prague, Czech Republic
* Author to whom correspondence should be addressed.
This paper is an extended version of our paper published in 3rd West Asian Symposium on Optical and Millimeter-wave Wireless Communications (WASOWC), Tehran, Iran, 24–25 November 2020.
Sensors 2021, 21(8), 2739; https://doi.org/10.3390/s21082739
Submission received: 18 March 2021 / Revised: 7 April 2021 / Accepted: 9 April 2021 / Published: 13 April 2021
(This article belongs to the Special Issue Visible Light Communication, Networking, and Sensing)

Abstract

Optical wireless communications in outdoor scenarios are challenged by uncontrollable atmospheric conditions that impair channel quality. In this paper, different optical camera communications (OCC) equipment is studied experimentally in the laboratory and in the field, and a sub-pixel architecture is proposed as a potential solution for outdoor wireless sensor network (WSN) applications, considering its achievable data throughput, the spatial division of sources, and the ability of cameras to overcome the attenuation caused by atmospheric conditions such as rain, turbulence, and aerosols. Sub-pixel OCC shows particularly adequate capabilities for some of the WSN applications presented, also in terms of cost-effectiveness and scalability. The novel topology, in which multiple transmitters using small optical devices are projected at sub-pixel scale onto the receiver, is presented as an OCC solution that reuses camera equipment for communication purposes on top of video monitoring.

1. Introduction

Optical camera communications (OCC), the sub-field of visible light communications (VLC) in which receivers (R_x) are implemented using image sensors [1], has excellent potential to be part of the evolution of new technologies beyond the fifth generation of cellular networks (5G). The use of cameras represents a low integration cost due to their massive availability in end-user devices such as smartphones, public infrastructure surveillance cameras, and vehicular security dash cameras. Moreover, transmitters (T_x) in VLC are, in general, implemented using light-emitting diode (LED) technology, which is widely spread and presents low power consumption and a long lifespan. OCC arises from exploiting digital cameras, which are considerably more abundant than individual photodiodes (PDs) but are limited in the achievable data rates. For example, PD-based VLC systems exceeding Gbps throughput have been reported [2,3]. In contrast, the frame rate of conventional cameras poses an inherent limitation on OCC's data rate, but the image-forming optics and the ability to visualize different objects within the field-of-view (FOV) can be exploited to increase the throughput [1,4,5,6].
OCC has been incorporated into the Institute of Electrical and Electronics Engineers (IEEE) 802.15.7r1 standard [7], which shows the interest of the scientific community in its development. The standard contemplates the two fundamental strategies for implementing these systems, which vary according to the camera’s image acquisition technique: global shutter (GS) and rolling shutter (RS) systems. The first mainly use charge-coupled device (CCD) image sensors that simultaneously expose all their pixels when acquiring a new image. On the other hand, RS systems are mainly based on complementary metal-oxide-semiconductor (CMOS) technology and scan the image sequentially line by line of pixels, with a fixed overlap [8]. Although CCD sensors are built using a GS structure, it is important to note that the image acquisition mechanism is not strictly related to the sensor’s manufacturing technology, and CMOS hardware can be built using RS or GS strategies in the readout circuitry. Moreover, OCC systems employing RS hardware can perform GS techniques for the post-processing of the image and the demodulation of data, as shown in this work.
In indoor scenarios (offices, homes, and medical or industrial facilities), VLC systems can already provide high-speed Internet, taking advantage of the short distances and the moderate presence of interfering sources. If the use of cameras is considered, one of the most prominent applications of indoor OCC is in the field of visible light positioning (VLP), which combines data transmission and image processing to recognize the geometry of the environment and monitor interactions between mobile nodes [9,10,11]. Other applications of interest have been proposed for OCC, such as wireless patient monitoring in hospitals, where the use of radiofrequency (RF) signals may interfere with the proper performance of the instrumentation and the RF spectrum is more limited than in other kinds of facilities; and peer-to-peer (P2P) data exchange using optical beacons as an alternative to near-field communications [4].
When considering the use of surveillance cameras in outdoor scenarios for Smart Cities applications and sensor networking, the presence of uncontrollable adverse atmospheric phenomena such as heat-induced turbulence, the presence of small particles in suspension (aerosols, water vapor, pollutants, dust), and rain and snow precipitation, must be taken into account. These phenomena cause light to be absorbed and dispersed, producing both attenuation and time dispersion of the signal in the link’s direction, increasing the communication error rate. Different solutions have been proposed to cope with the challenging conditions; these can be divided into ones that alter the camera’s optics and others that effectively adjust the image sensor’s internal parameters.
However, in all the works mentioned above, it is assumed that the projected area of the light source in the image covers a large number of pixels, as shown in Figure 1. In this case, the lamp's geometric projection occupies a significant portion of the image [12,13]. This assumption ultimately limits the maximum distance of the link, making it necessary either to use luminous surfaces with an extensive area or to use telescopic lenses, which reduce the FOV, as mentioned before. Moreover, these approaches make it impossible to exploit the cameras as video and communication devices simultaneously. In contrast, the use of smaller and more numerous transmitters can be exploited in OCC for transmitting multiple data streams to the same receiver. This technique of spatial division has been proposed for intelligent transportation systems (ITS) [4] but is challenged by the need for computer vision algorithms capable of discovering the sources within the image during motion and determining the region of interest (ROI), i.e., the pixels that contain data. Figure 2 shows how multiple sources can potentially convey information to an OCC R_x on a car.
This work extends our previous conference paper [14], in which an OCC link under emulated fog conditions in a laboratory chamber was studied to evaluate the effects of impaired visibility on the process of correlation-based signal detection. The camera's output image sequences were processed offline to detect the ROI and to determine how its dimensions varied with the different levels of meteorological visibility produced by the presence of fog. It was shown that the correlation process was considerably affected below 40 m of visibility, while the ROI dimensions presented a negligible change in these conditions.
Further experimentation has been carried out, including the effects of heat-induced turbulence and the exploitation of the camera's analog gain in [15,16]. Real conditions of a sandstorm were experimentally studied in [17]. In this work, those previous experiments are compared and analyzed, and further theoretical analysis of the channel and of the correlation-based processing for ROI detection is presented.
This work’s main contribution is to demonstrate the feasibility of outdoor OCC links, in which communication happens from using not just large optical devices but also to employing small optical devices in long distances, falling into the sub-pixel level. The term sub-pixel refers to the fact that the source’s projection area is smaller than the area of one pixel, and it should not be confused with the color-sensitive components of an image sensor pixel, also called subpixels [18]. In a preliminary discussion, it could be assumed that the link is not viable because the light emitted only impinges partly one pixel. However, this work shows that the energy emitted by the LED affects the adjacent pixels for reasons ranging from the scattering in the atmosphere, and the optical focus, so that the signal can be successfully recovered by considering a larger number of pixels. This paper provides a comprehensive compilation of the authors’ highlighted findings in evaluating experimental outdoor OCC. It proposes the sub-pixel approach, discussing the viability of OCC’s real outdoor applications in the IoT and WSN fields. The OCC devices presented in this paper have been implemented using off-the-shelf components, and their designs are available for replication.
This paper is structured as follows. Section 2 summarizes the scientific contributions towards implementing optical IoT and WSN using outdoor OCC systems. In Section 3, a proposal of an architecture of OCC-based sensor networks is developed, according to the WSN key requirements and OCC capabilities. Section 4 outlines the methodology and results of the experimental evaluations done and the implications of the results obtained. Conclusions and future lines are addressed in Section 5.

2. Related Works

In this section, the applications and challenges of OCC and related systems are summarized, focusing on the fact that most OWC technologies, including PD-based and camera-based VLC systems, have been proposed for outdoor Smart Cities and WSN applications [1,2,19], although some important indoor applications are mentioned as well. Specifically, regarding the implementation of sub-pixel systems for WSN as proposed in this paper, the authors are not aware of other works where an equivalent functionality is reported. It is worth noting that the sub-pixel scenario can be considered a VLC system based on individual PDs, since camera pixels are based on such devices [18]. However, as will be seen in the following sections, the camera optics and light scattering provide an opportunity to enhance communication using the contiguous pixels of the camera.

2.1. Spatial Division of Transmitters

Some of OCC’s advantages over single PD-based systems come from the image-forming nature of cameras that allow the spatial separation of the light sources as in [20,21,22]. This technique can substantially improve the data rates achievable by OCC systems if there is the possibility to extend the number of transmitters in space, as in the deployment of sensors.
One of the interesting applications that takes advantage of the spatial division is VLP [10,11,23,24]. These systems benefit from the high achievable precision, of the order of tens of centimeters, compared to satellite positioning systems, whose errors can be of tens of meters. VLP systems are robust in enclosed spaces and have been explored in outdoor vehicular settings [25] where more precision than that provided by satellite localization is needed. Furthermore, positioning systems relate to another important application, intelligent transportation systems (ITS), which have been proposed as an application of VLC [23,26,27,28]. ITS aim to improve road safety and the communication between vehicles and infrastructure. As mentioned before, vehicular VLC systems can exploit the spatial division of sources in OCC. This application has the potential to improve the performance of autonomous vehicles, an important sub-field of ITS in which machine learning techniques exploiting different sources of sensor data are used for the lateral and longitudinal motion control of passenger cars [29]. Although many kinds of sensors are being studied, camera equipment is considered fundamental for capturing information from the environment in most approaches. In [25], computer vision and OCC techniques are combined to determine the position of a vehicle with errors below 20 cm.
OCC’s main limitations can be grouped in three categories: those related to the image-forming capability, where the distortions caused by the optics are a source of interference and noise, especially relevant in screen-to-camera communications [30]; issues related to the timing of capturing and the synchronization with transmission [12,13], in which the slight variations of the camera frame along with the gaps between frame acquisitions induce errors in data decodification; and the important challenges related to the discovery of nodes and their tracking in motion [31], where the time elapsed by computer vision algorithms in the detection of the ROI can have a considerable impact on the latency.

2.2. Atmospheric Phenomena in Optical Wireless Communication

Outdoor scenarios, where the weather and other atmospheric phenomena play an important role in the propagation of optical signals [32], have been experimentally investigated in the OWC field and recently in VLC, as summarized in [33]. In [34], an outdoor link of approximately 400 m was experimentally validated; it exploited defocusing of the camera, allowing the projected surface of the LED to be enlarged and the transfer rate to be increased to 450 bps (bits per second). The experiments of [26,35] are other examples of how the channel's effect is compensated by modifying the optical parameters, specifically by using magnification lenses. The first work focused on an application for a vehicular environment based on PDs instead of cameras, achieving a link distance of 40 m in ideal weather conditions. In [35], a GS camera was used to establish a link with a luminous sign located at a distance of 328 m with an effective transmission rate of 15 bps and a 4% error. Finally, in Ref. [8], the use of a Fresnel lens for establishing a VLC vehicular link in a laboratory-emulated environment is demonstrated. Nonetheless, these approaches are impractical if it is considered that the FOV of the receiver optics is drastically reduced. As a solution for maintaining the camera's original FOV, previous works [15,16] demonstrated that the signal attenuation produced by turbulence and fog phenomena can be overcome by increasing the analog gain of the image sensor without the need to alter the camera optics.

3. Proposal of Optical Camera Communication-Based Sensor Networks Architecture

In this section, the basic architecture of a WSN based on OCC is proposed. First, the wireless channel is derived for the case of the novel sub-pixel transmitter projection scheme, considering the outdoor scenario, where the presence of particles in the air causes both absorption and scattering of light. Heat-induced turbulence models found in the literature are then discussed, and the section closes by summarizing the technical requirements and potential applications of the proposed OCC-based WSN.

3.1. Optical Wireless Channel

As in any OWC system, the power received in an OCC system can be modeled using the solid angle differential approach (Equation (1)) [36]. Since this work focuses on outdoor links, an extinction loss term K_ext(λ), which depends on the wavelength λ, has been added to account for the absorption and scattering of the medium [37,38,39]:

P_{rx} = P_{tx}\, R(\theta, \varphi)\, \frac{A_{lens}}{d^{2}}\, \cos\Psi\; e^{-K_{ext}(\lambda)\, d}      (1)

where P_rx is the received power, P_tx the transmitted power, and R(θ, φ) is the radiation pattern of the source (assumed constant over its entire area) for the elevation θ and azimuth φ angles. The received power depends on the area of the main lens A_lens projected over the angle of incidence Ψ and on the link distance d.
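As an illustration of Equation (1), the following minimal Python sketch evaluates the received power for placeholder parameters; the transmitted power, radiation pattern value, lens area, link distance, and extinction coefficient are arbitrary values chosen only for the example, not the parameters of the experiments reported later.

import numpy as np

def received_power(P_tx, R, A_lens, d, psi, K_ext):
    """Line-of-sight received power with atmospheric extinction, Eq. (1)."""
    return P_tx * R * (A_lens / d**2) * np.cos(psi) * np.exp(-K_ext * d)

# Placeholder values (illustrative only): 1 W emitter, radiation pattern value 1/pi,
# 25 mm^2 lens, 90 m link, normal incidence, K_ext = 5e-3 m^-1 (light haze).
P_rx = received_power(P_tx=1.0, R=1 / np.pi, A_lens=25e-6, d=90.0, psi=0.0, K_ext=5e-3)
print(f"P_rx = {P_rx:.3e} W")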
Nevertheless, since cameras are used as optical receivers, image-forming optics must be considered in this type of system. In general terms, and disregarding any blurring effect a priori, OCC systems have been based on the conservation of pixel energy density with distance, i.e., the energy of each pixel has no direct dependence on d as long as the optical emitter's projection is greater than a single pixel, as derived in [40]:
H_{p}(0) = \frac{A_{px}^{2}\, A_{lens}}{f^{2}\, A_{tx}}\, R(\theta, \varphi),      (2)
where f is the focal length of the camera lens, and A_px and A_tx are the areas of a pixel of the image sensor and of the transmitter LED, respectively. The concept behind this property is the compensation of spherical propagation losses by the focusing of the image. Although less energy reaches the camera's main optics as the distance increases (quadratic decrease), the number of pixels onto which the emitter is projected also decreases in the same order, compensating for the effect and keeping the surface energy density constant on the image sensor.
For long distances, it is easy to demonstrate that the number of pixels N_px onto which an emitter with area A_tx is projected depends on the camera's FOV and image sensor resolution as:

N_{px} = \frac{N\, M}{FOV_{n}\, FOV_{m}}\, \frac{A_{tx}}{d^{2}},      (3)

where N and M define the horizontal and vertical resolution of the sensor, respectively, and FOV_n and FOV_m define the horizontal and vertical fields of view of the receiver, respectively. By joining Equations (1) and (3), and projecting the energy over the area of a pixel, Equation (4) is obtained, which summarizes the average power received by a pixel:
P_{px} = \frac{P_{tx}\, R(\theta, \varphi)\, A_{lens}}{A_{tx}}\, \cos\Psi\; e^{-K_{ext}(\lambda)\, d}\; \frac{\zeta_{xy}}{A_{px}},      (4)
where, for convenience, the angular resolution of the sensor ζ_xy represents the ratio of its FOV to the sensor resolution. However, when the emitter's projection decreases below a single pixel, the above equation is no longer valid and the system can start to be modeled as a PD-based OWC link, where the received power is directly proportional to the photodiode area illuminated by the projection (this projection is smaller than A_px and no image can be formed). The power received in a sub-pixel situation (P_subpx) is given by:
P_{subpx} = P_{px} \times N_{px} = P_{rx} \times A_{px}.      (5)
It must be understood that N_px (the number of pixels of the transmitter's projection) in the equation above acts as an adjustment coefficient referring to the percentage of the pixel that is illuminated. It is clear that P_subpx loses its independence from distance and starts behaving as in the case of a traditional OWC link. Once the power arriving at the sensor is specified, the conversion process must be taken into account when describing the OCC signal within the captured image. CMOS cameras work by converting the incident photons into electrons, storing them, and encoding them sequentially row by row [18]. Therefore, it is convenient to carry out a unit conversion that takes this particularity into account. The number of stored electrons (E_px) during the exposure time of the capture, T_exp (valid for any situation), is obtained by [41]:
E_{px} = T_{exp} \int_{\lambda} \frac{P_{px}(\lambda)\, EQE(\lambda)}{E_{ph}(\lambda)\, q}\, d\lambda      (6)
Please note that the concept of pixel arrival power has been extended to include the emission spectrum of the optical source. EQE(λ) is the external quantum efficiency of the receiver substrate (usually silicon), E_ph(λ) is the energy of a photon at each wavelength, and q is the charge of the electron. Although, theoretically, a small emitter located at a long distance will appear as a single bright spot in the capture, the image-forming optics are not perfect, and there is some spatial dispersion of the energy. This dispersion is modeled by the point spread function (PSF) of the system, denoted as h[n, m] in the image domain, where n denotes the horizontal coordinate and m the vertical coordinate. In essence, h[n, m] is the system's spatial impulse response and is usually dependent on the distance of the link. Any projection must be convolved with it, so in a sub-pixel situation, the illuminated region can be modeled as:
s[n, m] = G_{V}\, K\!\left(E_{px}\right) \times h[n - n_{0},\, m - m_{0}]      (7)
where K(·) is a function that includes the analog-to-digital conversion, G_V is the analog gain of the CMOS camera's readout circuitry, and n_0 and m_0 are the coordinates of the pixel onto which the emitter is projected. It must be noted that in an ideal situation h[n, m] has unit energy, so if there is energy dispersion, the pixel level of the projection will be lower than theoretically expected.
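The chain of Equations (3)-(7) can be summarized in a short sketch: the emitter's projected pixel count is obtained from the camera geometry and, once it falls below one pixel, the converted pixel energy is spread over the neighboring pixels through the PSF. The Gaussian PSF and all numeric values below are illustrative assumptions, not measured system parameters.

import numpy as np

def projected_pixels(N, M, fov_n, fov_m, A_tx, d):
    """Number of pixels covered by the emitter's projection, Eq. (3)."""
    return (N * M) / (fov_n * fov_m) * A_tx / d**2

def gaussian_psf(size=7, sigma=1.0):
    """Illustrative normalized Gaussian PSF h[n, m] with unit energy."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    h = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return h / h.sum()

# Placeholder geometry: 3280 x 2464 sensor, ~62 x 49 deg FOV, 5 mm LED, 90 m link.
N_px = projected_pixels(3280, 2464, np.deg2rad(62), np.deg2rad(49),
                        A_tx=np.pi * (2.5e-3) ** 2, d=90.0)
print(f"Projected pixels: {N_px:.3f}")   # << 1, i.e., a sub-pixel projection

# In the sub-pixel regime, the converted pixel energy is spread over neighbors by the PSF.
E_px = 1.0                     # arbitrary converted pixel energy, cf. Eq. (6)
s = E_px * gaussian_psf()      # illuminated region around (n0, m0), Eq. (7) with G_V = 1, K = identity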
Regarding the signal-to-noise ratio (SNR) of an OCC link, Equation (8) summarizes it, being applicable to both sub-pixel and generally studied situations, as:
SNR = \frac{G_{V}^{2}\, E_{px}^{2}}{G_{V}^{2}\, \sigma_{sh}^{2} + \sigma_{th}^{2} + \sigma_{adc}^{2}}.      (8)
It has been assumed, for simplicity and without loss of generality, that the correction factor γ [18] of the camera is unitary, and that the link is not saturated (the number of stored electrons is less than the full-well capacity of the circuitry). The three main contributions to the noise of the OCC link are shot noise (σ_sh^2), thermal noise (σ_th^2), and quantization noise (σ_adc^2). The effect of the latter can be minimized by applying the optimal analog gain value, as demonstrated in [15]. Among the noise sources of shot nature, the most significant contributions are the dark noise, the shot noise of the generated signal itself, and the readout noise. In outdoor links, there is another phenomenon that can have a substantial impact on system performance: the background level can vary at a speed comparable to the transmission time of a frame which, in a sub-pixel situation, is determined by the camera's capture rate. This effect is analyzed experimentally in Section 4.
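Equation (8) implies that raising the analog gain amplifies the signal and the shot-noise term equally, while the thermal and quantization terms stay fixed, which is why gain helps in quantization-limited conditions until saturation. The toy sweep below, with arbitrary noise variances, illustrates this behavior.

import numpy as np

def snr_db(G_v, E_px, var_shot, var_thermal, var_adc):
    """Pixel SNR of the OCC link, Eq. (8), in dB."""
    snr = (G_v**2 * E_px**2) / (G_v**2 * var_shot + var_thermal + var_adc)
    return 10 * np.log10(snr)

# Arbitrary illustrative values: weak signal, quantization-noise-limited at unity gain.
for G_v in (1, 2, 4, 8):
    print(G_v, round(snr_db(G_v, E_px=5.0, var_shot=4.0, var_thermal=1.0, var_adc=30.0), 2))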
As derived in [15,42,43], turbulence is a consequence of the heterogeneous values of temperature and pressure in the atmosphere. The refractive index of the air changes randomly over time and space, affecting the amplitude and phase of optical signals [36]. The refractive-index structure parameter C_n^2 (in m^(-2/3)) is used to characterize the strength of optical turbulence. It typically ranges from 10^(-17) m^(-2/3) or less for weak turbulence to above 10^(-13) m^(-2/3) for strong turbulence. It is given by [43,44,45]:
C_{n}^{2} = \left( 79 \times 10^{-6}\, \frac{P}{T^{2}} \right)^{2} \times C_{T}^{2},      (9)
where P and T are the average values of the air pressure in millibars and the temperature in Kelvin, respectively. C_T^2 is the temperature structure parameter, which can be calculated by measuring the temperature at two or more points in space separated by a distance R. It is derived from the general definition of the structure function of random processes, D_T, given by [45]:
D_{T} = \left\langle (T_{1} - T_{2})^{2} \right\rangle =
\begin{cases}
C_{T}^{2}\, l_{0}^{-4/3}\, R^{2} & 0 \le R \ll l_{0} \\
C_{T}^{2}\, R^{2/3} & l_{0} \ll R \ll L_{0}
\end{cases}      (10)
where |T_1 − T_2| is the temperature difference between the two points, l_0 is the minimum size of the air heterogeneities characterized by Kolmogorov's theory of turbulence [45], and L_0 is the maximum.
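For the turbulence characterization used later in Section 4, C_T^2 can be estimated from the temperature differences between sensor pairs separated by a known distance R within the inertial range, and C_n^2 then follows from Equation (9). The snippet below is a simplified sketch of that estimation; the sensor spacing and temperature readings are placeholders.

import numpy as np

def c_t2(temp_a, temp_b, R):
    """Temperature structure parameter from two sensor series, Eq. (10), l0 << R << L0."""
    return np.mean((temp_a - temp_b) ** 2) / R ** (2.0 / 3.0)

def c_n2(C_T2, P_mbar, T_kelvin):
    """Refractive-index structure parameter, Eq. (9)."""
    return (79e-6 * P_mbar / T_kelvin**2) ** 2 * C_T2

# Placeholder readings from two sensors 0.35 m apart (Kelvin), at ~1013 mbar.
Ta = np.array([300.1, 300.4, 300.2, 300.6])
Tb = np.array([301.0, 300.9, 301.2, 300.8])
print(f"C_n^2 = {c_n2(c_t2(Ta, Tb, R=0.35), 1013.0, 300.5):.3e} m^(-2/3)")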

3.2. Technical Requirements and Potential Applications of OCC-Based Wireless Sensor Networks

As mentioned before, OCC has many potential applications in different scenarios. This technology is cost-effective and allows simultaneous communication with a significant number of remote nodes, providing dedicated time-frequency channels to each of them thanks to optical cameras’ inherent spatial division multiplexing capabilities. A general-purpose scheme of an OCC-based WSN has been defined and it is depicted in Figure 3. This baseline description includes simple low power receiver-less remote nodes, the deployment scenario, the camera-based gateway, and a cloud-based endpoint. Depending on the use case characteristics, the sensors’ information may be extracted on-the-edge by processing the captured frames in situ (at the camera side) or processing them in the cloud endpoint after streaming the captured video signal. Some remarkable application fields have been identified and are discussed in the following subsections.
IoT technology is beginning to impact the agriculture industry, providing unforeseen capabilities that comprise, among others, local or remote data acquisition, communication between critical agents, and cloud-based intelligent decision making. These capabilities are expected not only to improve yields but also to optimize essential resources such as land and water, and even to help the workforce. According to [46], the main applications of Smart Agriculture are the monitoring of water, soil, bugs, crop health, machinery, and the environment. These applications rely on several services, such as irrigation, fertilization, or soil preparation, which ultimately make intensive use of sensors that have particular connectivity demands.
Communication in Farm Area Networks (FANs) is being carried out using the available cellular infrastructure, short-range technologies such as Bluetooth or IEEE 802.15.4-based Zigbee, LoRa, or Sigfox. Regarding the use of 2G-4G technology, the availability of these deployments is a primary concern in rural areas, and Low Power Wide Area Network (LPWAN) technologies such as LoRa [47] or Sigfox [48] are mainly being adopted in the industry. This communications layer is usually the lowest level of a four-layer architecture that also includes the Medium Access Control (MAC) layer, the Network Layer, and the Transport Layer. Notwithstanding, following the proposed scheme of Figure 3, an OCC-enabled FAN using sub-pixel links would need only a physical layer implementation in the first mile, while the camera would act as a data-aggregating agent that could have traditional interfaces such as those mentioned above. The advantages of integrating OCC in this use case are the unlicensed spectrum, the potential capability of simultaneously monitoring hundreds of devices without a MAC protocol, and the simplicity of the node design.

4. Experimental Evaluation

In this section, the experimentation using different OCC equipment is detailed by describing their key parameters and modulation scheme and presenting the experimental setups and the results obtained in various realistic scenarios.

4.1. Physical Layer Strategies

The transmitters and receivers developed for the experiments shown in this section consist of LED modules and CMOS cameras with different optics, respectively. They can be separated into two categories: small and large optical devices. The small devices consist of discrete off-the-shelf components, while the large devices have been developed with an extended vertical dimension (in the case of T_x) and a longer focal distance (in the case of R_x). The large transmitters have also been implemented using multi-channel red-green-blue (RGB) LEDs in order to exploit more parallel data streams, at the cost of higher power consumption. Figure 4 shows the implementation of these devices, while Table 1 lists their key parameters.
The modulation scheme used by the transmitters is on-off keying (OOK), which takes advantage of the switching outputs available in most microcontrollers and only requires a transistor for driving high-power LEDs, without complex front-end devices. In the case of low-power LEDs, the switching output can usually drive the LED directly. The packet structure proposed for GS detection is depicted in Figure 5, with a symbol rate of 7.5 baud for cameras using a 30 fps frame rate. In the case of RS detection, much higher symbol rates can be employed, proportional to the row-shift time, as described in [13]. The RS experiment used a symbol rate of 8.4 × 10^3 baud and exploited the color channels of the camera, with three parallel data streams using RGB LEDs.
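As an illustration of the GS physical layer, the sketch below builds the per-frame on/off levels of one packet at 7.5 baud for a 30 fps camera (four frames per symbol). The header pattern is a placeholder and does not reproduce the exact preamble of Figure 5.

def ook_frame(payload_byte, header=(1, 1, 1, 0), frames_per_symbol=4):
    """Per-frame on/off levels of one OOK packet (GS detection, 7.5 Bd at 30 fps).

    The header bits are an illustrative placeholder, not the exact preamble of Figure 5.
    """
    payload_bits = [(payload_byte >> i) & 1 for i in range(7, -1, -1)]  # MSB first
    symbols = list(header) + payload_bits
    # Each symbol is held for `frames_per_symbol` captured frames (133.33 ms at 30 fps).
    return [bit for bit in symbols for _ in range(frames_per_symbol)]

frames = ook_frame(0xA5)
print(len(frames), frames[:12])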

4.2. Description of the Experiments

The experiments carried out range from laboratory setups emulating outdoor scenarios [14,15,16] to real outdoor scenarios with different conditions [17], as summarized in Table 2 and described in the following sections. The OCC equipment was developed using off-the-shelf components, such as arrays of RGB LEDs in strip format and standard resin-encapsulated 5 mm white LEDs for the transmitters, and an RS CMOS camera with a built-in microlens as the receiver. In the case of [17], the camera was attached to a telescope to cover distances up to 200 m.

4.2.1. Emulation of Atmospheric Conditions in Laboratory

These experiments, labeled 1.1 and 1.2 in Table 2, consisted of testing the large transmitters and the CMOS R_x in laboratory settings emulating two important atmospheric conditions: fog and turbulence. First, in Experiment 1.1, only attenuation of the signal was emulated, using a white methacrylate sheet in different configurations. The different optical powers received by the camera were captured by changing its analog gain, and a control algorithm was derived in [16] to automatically set the gain by using Pearson's correlation coefficient (r_xy) as an estimator of the image quality, as an alternative to the SNR. This coefficient is defined as:
r_{xy} = \frac{\sum_{i=1}^{N} (x_{i} - \bar{x})(y_{i} - \bar{y})}{\sqrt{\sum_{i=1}^{N} (x_{i} - \bar{x})^{2}}\, \sqrt{\sum_{i=1}^{N} (y_{i} - \bar{y})^{2}}},      (11)
where x_i are the N samples of a reference or template of the expected signal (the header of a packet, for example), y_i are N consecutive samples of the input signal, and x̄ and ȳ are their mean values.
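A direct implementation of Equation (11), used as the image-quality estimator in these experiments, can be sketched as follows; the template below is a placeholder bit pattern.

import numpy as np

def pearson_r(x, y):
    """Pearson's correlation coefficient between template x and received samples y, Eq. (11)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return np.sum(xc * yc) / (np.sqrt(np.sum(xc**2)) * np.sqrt(np.sum(yc**2)))

template = np.array([1, 1, 1, 0, 1, 0, 0, 1])             # placeholder packet header
received = template * 0.6 + np.random.normal(0, 0.1, 8)   # attenuated, noisy observation
print(round(pearson_r(template, received), 3))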
In Experiment 1.2 (see Table 2), the CMOS R_x and the large T_x (see Table 1) were tested using the laboratory chamber at the facilities of the Czech Technical University in Prague. Conditions of fog and turbulence were generated using a glycerin machine and two heater-blowers, respectively. The features of the chamber are listed in Table 3. The level of fog was characterized by means of the meteorological visibility (V) [51], measured with a laser source and power meter pair aligned across the chamber, parallel to the OCC link, and the turbulence level was estimated through the well-known refractive-index structure parameter, as derived in [42], using an array of 20 temperature sensors set up equidistantly along the chamber. The diagram of the setup is shown in Figure 6.
The experimentation under heat-induced turbulence, with values of 4.69 × 10^(-11) m^(-2/3) ≤ C_n^2 ≤ 7.13 × 10^(-11) m^(-2/3), showed a negligible influence of these conditions on the OCC system performance. Contrarily, under fog conditions, the OCC system was susceptible to the attenuation caused by the aerosol generated by the glycerin machine. In [14] it was shown that a meteorological visibility under 40 m makes r_xy drop, as shown in Figure 7a. The combined effect of the low visibility values and the low camera exposure causes the ADC input to be considerably low. In [15], the same visibilities were analyzed varying the camera's analog gain, and it was shown that G_V could overcome this issue without the need for large optics at a link distance of 4.68 m and visibility under 40 m, which is comparable to dense fog weather. It was also seen that for better visibilities, above 50 m, the gain can cause saturation of the ADC, inducing noise. Therefore, between 40 and 50 m of visibility, the gain control algorithm mentioned before could use a fuzzy or adaptive threshold from which the analog gain would take high or low values, as depicted in Figure 7b.
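The gain-control behavior described above can be sketched as a simple decision rule on the estimated visibility: below the lower threshold the analog gain is pushed high, above the upper threshold it is kept low, and in between a soft (fuzzy) transition is applied. The thresholds and gain values here are illustrative only and are not the ones derived in [16].

def select_gain(visibility_m, v_low=40.0, v_high=50.0, g_min_db=0.0, g_max_db=20.0):
    """Illustrative fuzzy-threshold gain selection (placeholder thresholds and gains)."""
    if visibility_m <= v_low:        # dense fog: maximize analog gain
        return g_max_db
    if visibility_m >= v_high:       # clear enough: keep gain low to avoid ADC saturation
        return g_min_db
    # Linear (fuzzy) transition between the two thresholds.
    frac = (v_high - visibility_m) / (v_high - v_low)
    return g_min_db + frac * (g_max_db - g_min_db)

for v in (30, 42, 46, 60):
    print(v, select_gain(v))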

4.2.2. Real Conditions of Sandstorm Using Large Optical Devices

In the previous work [17], Experiment 2 (see Table 2), a transmission using the large RGB LED transmitter and the CMOS receiver attached to a 700 mm Galilean telescope was performed at distances of 100 m and 200 m during a sandstorm event in the area surrounding the IDeTIC facilities, as shown in Figure 8. The large optics used ensure a considerable area of projection over the image sensor, allowing the use of RS decoding. A Gaussian Mixture Model (GMM) was used for the accurate segregation of background and signal. It was observed that the ROI expanded in the presence of aerosols due to scattering, allowing around 30% more lines to be involved in the RS decoding compared to clear conditions.
The visibility during the experiment was estimated to be about 0.57 km. The estimated K_ext(λ) values for the RGB channels studied are shown in Table 4. In this high optical extinction scenario, the camera gain has a favorable effect on the SNR, improving it by up to ΔSNR ≈ 9 dB at 100 m and 3 dB at 200 m. The BER values obtained for each link span are 9.14 × 10^(-5) and 4.1 × 10^(-3) for 100 m and 200 m, respectively.

4.2.3. Real Outdoor Scenario in Sub-Pixel Setting

In the sub-pixel conditions of Experiment 3 (see Table 2), the small LED transmitters' projection is smaller than a single camera pixel. When using image-forming optics, the emitters' physical size is a critical aspect in establishing the links. At long distances, however, the use of emitters large enough for their projection to span several pixels (i.e., emitters perceptible to the human eye at that distance) is unfeasible, since the required emitter area increases quadratically with distance.
Two nodes were programmed to send OOK signals in a loop, with the transmission frame structure shown in Figure 5 containing 1 byte of payload, in which the values from 0x00 to 0xFF were transmitted sequentially. The bit time defined for the experiments was 133.33 ms, offering a bit rate of 4 bps. Both units were anchored to two posts located at d_1 = 90 m and d_2 = 130 m away from the camera, respectively, as shown in Figure 9. Once the nodes were activated and started transmitting data, a 10-min video was recorded using an exposure time of 85 µs, minimum analog gain, and no digital gain. The camera's capture rate was set to 30 frames per second, so each transmitted bit was spread over 4 frames. Once the video capturing the emissions of both nodes simultaneously was obtained, the regions of interest of both transmitters were defined manually, since the elaboration of a discovery procedure was not the objective of this work. Both regions of interest were statistically analyzed to obtain signal-to-noise ratio (SNR) and PSF estimates.
To carry out the SNR analysis, a 150-sample sliding window was processed in which a Gaussian Mixture Model (GMM) was set as:
G_{2}(x) = \frac{\alpha}{\sigma_{0}\sqrt{2\pi}}\, e^{-\frac{(x - \mu_{0})^{2}}{2\sigma_{0}^{2}}} + \frac{1 - \alpha}{\sigma_{1}\sqrt{2\pi}}\, e^{-\frac{(x - \mu_{1})^{2}}{2\sigma_{1}^{2}}}      (12)

where G_2(x) is a Gaussian mixture, α is the ratio of the first Gaussian, μ_i denotes the expected values, and σ_i the standard deviations.
The SNR is then calculated as:
SNR = \frac{1}{2}\, \frac{|\mu_{1} - \mu_{0}|^{2}}{\alpha\, \sigma_{0}^{2} + (1 - \alpha)\, \sigma_{1}^{2}}      (13)
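The sliding-window estimation of Equations (12) and (13) can be sketched as a two-component Gaussian mixture fit over the pixel samples of each window. The sample vector below is synthetic, and the use of scikit-learn is an implementation choice for the example, not necessarily the toolchain of the original analysis.

import numpy as np
from sklearn.mixture import GaussianMixture

def window_snr(samples):
    """Fit a 2-component GMM to a window of pixel values and evaluate Eq. (13)."""
    gmm = GaussianMixture(n_components=2).fit(np.asarray(samples).reshape(-1, 1))
    mu = gmm.means_.ravel()
    var = gmm.covariances_.ravel()
    alpha = gmm.weights_.ravel()[0]
    return 0.5 * abs(mu[1] - mu[0]) ** 2 / (alpha * var[0] + (1 - alpha) * var[1])

# Synthetic 150-sample window: OOK 'off' level around 10, 'on' level around 40 (arbitrary units).
rng = np.random.default_rng(0)
window = np.concatenate([rng.normal(10, 2, 75), rng.normal(40, 3, 75)])
print(f"SNR = {10 * np.log10(window_snr(window)):.1f} dB")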
The SNR of each of the samples resulting from applying the sliding window was stored to estimate the system's expected SNR afterward. Assuming that the transmission is OOK, the theoretical error rate for each experimentally estimated SNR was calculated using:
BER = \frac{1}{2}\, \mathrm{erfc}\!\left( \sqrt{\frac{SNR}{2}} \right),      (14)
where erfc(·) is the complementary error function. Both SNR and BER calculations were carried out for each color channel of the image sensor (R, G, and B) and for an average emphasized with the PSF approximation (Equation (15)). The goal of this spatial averaging is to improve the SNR by reducing the effective variance of the noise.
y = \frac{\sum_{i=0}^{I-1} \sum_{j=0}^{J-1} r_{xy}[i, j] \times x[i, j]}{\sum_{i=0}^{I-1} \sum_{j=0}^{J-1} r_{xy}[i, j]},      (15)
where I and J are the height and width of the ROI considered for averaging, y is the signal resulting from the emphasized (correlation-weighted) averaging, as opposed to the plain arithmetic mean y_mean, x[i, j] is the original signal at the coordinate (i, j), and r_xy[i, j] is the correlation coefficient of that pixel with the template.
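The correlation-weighted (PSF-emphasized) spatial averaging of Equation (15) can be sketched as follows for a small ROI; the correlation map and pixel signals below are synthetic placeholders.

import numpy as np

def emphasized_average(x_roi, r_map):
    """Correlation-weighted spatial average over an I x J ROI, Eq. (15).

    x_roi: array of shape (frames, I, J) with the pixel signals.
    r_map: array of shape (I, J) with each pixel's correlation to the template.
    """
    weights = r_map / r_map.sum()
    return np.tensordot(x_roi, weights, axes=([1, 2], [0, 1]))  # one value per frame

# Synthetic example: 7 x 7 ROI, 100 frames, signal concentrated on the central pixel.
rng = np.random.default_rng(1)
x = rng.normal(0, 1, (100, 7, 7))
x[:, 3, 3] += np.tile([0, 5], 50)   # weak OOK signal on the central (sub-pixel) projection
r = np.exp(-((np.arange(7)[:, None] - 3) ** 2 + (np.arange(7)[None, :] - 3) ** 2) / 4.0)
y = emphasized_average(x, r)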
The location of the transmitters in the captured images was selected manually and refined through the correlation process used to estimate the PSF. The image processing is depicted in Figure 10, showing a frame obtained by the camera and the ROI of each source. The estimated PSFs are plotted over a region of 7-by-7 px, in which the numerous pixels with a considerable correlation to the sub-pixel projection can be seen. The results of SNR and BER for both T_x are summarized in Table 5. The RGB channels do not show a considerable difference. The SNR at d_1 reaches 20.0 dB for the green channel, and at d_2 it is 13.0 dB for the red channel. The experimental BER values obtained do not reflect the theoretically expected values, possibly due to the non-stationary behavior of the background light level and the limited amount of data analyzed.

4.3. Discussion of the Results

One of OCC technology's recurring promises is the possibility of using cameras in Smart Cities for massive sensor monitoring applications. Until now, however, OCC systems have been based on the use of large lamps, complemented with high-gain optics (telescopes or lenses with focal lengths of hundreds of mm) or short distances, in order to make use of RS techniques. In WSN use cases, however, the transmission speed is a non-critical factor, which allows the inherent spatial multiplexing capacity of the cameras to be exploited.
The sub-pixel system achieved approximately equal SNR for the red, green, and blue channels, with values of approximately 20 dB and 13 dB for 90 m and 130 m, respectively, using the emphasized PSF enhancement. Considering that the NRZ-OOK technique was used, the theoretical BER level was estimated for each distance, and it can be observed that the experimental BER is considerably worse than expected for the d_2 case, since the channel is non-stationary. For the d_1 case, no errors were found during the experiments. The channel fluctuations are an issue for the successful segregation of the signal of interest and could become critical for mobile nodes and outdoor conditions.

5. Conclusions

In this paper, a network architecture based on OCC for WSN and IoT applications is developed from the advances in experimental channel evaluation in emulated and real conditions. The experimental setup used for the proof of concept of the network strategy was implemented using two transmitters at 90 m and 130 m communicating simultaneously with one CMOS camera at 8 bps each, with the potential to expand the number of T_x nodes to several tens of them, covering considerable areas such as crop fields, streets, parks, and industrial facilities, among others. WSNs are envisioned in this paper as a field in which sub-pixel OCC has significant potential to become a competent alternative. Although the achievable data rate is relatively low, signaling sensor data through OCC with single LEDs is cost- and energy-efficient, and camera-based receivers can be reused for video recording and communications simultaneously in these settings.
The analysis of previous works, in which RS-OCC systems were evaluated in emulated and real outdoor scenarios, showed significant limitations of RS schemes. Although the achieved data rates (several hundred bps), distances (hundreds of m), and the cost-effectiveness of the equipment are positive aspects of RS-OCC, the need to use large optical devices is an issue. The use of large LED T_x or long-focal-length lenses are requirements that limit the camera equipment to being used only as a communication device, at the expense of its video-monitoring capabilities. The sub-pixel approach uses only small optical devices, i.e., a microlens CMOS camera and single 5 mm standard LEDs, allowing more transmitters to input data into the receiver, taking advantage of the spatial division that is inherent to the camera equipment, and allowing the images to be used for video-monitoring of the scenario. This re-use of camera equipment for communication and video-monitoring enables the opportunity to apply these schemes to camera systems that are already deployed, e.g., surveillance cameras.
Future works should include further experimentation in the sub-pixel scenario that supports the capabilities of the OCC-based WSN proposal envisioned in this work, including a larger number of nodes and more extended data transmissions. Other important issues to be addressed include the characterization of the channel background light in long-term measurements, as well as mobility support and node discovery, which are critical for the latency of the communications. Please note that node detection in this work was done manually and offline; an online correlation-based detection should be implemented for automatic node discovery and for covering mobility and perturbations that might displace the transmitters' projections on the image sensor. Furthermore, multi-hop schemes and an OWC downlink from the network gateway to the sensing nodes could also be developed, although implementing optical receiver equipment in the sensing nodes can involve a high energy consumption. Finally, the experimental setup in outdoor conditions shown here can be extrapolated to small indoor environments of SCOH, and the sub-pixel context can be kept by using micro-LED devices, which are more power-efficient.

Author Contributions

Conceptualization, V.M., V.G. and R.P.-J.; Formal analysis, V.M. and V.G.; Funding acquisition, S.Z. and R.P.-J.; Investigation, V.M. and V.G.; Methodology, V.M., V.G. and C.J.-V.; Resources, S.Z. and R.P.-J.; Software, V.M., V.G. and C.J.-V.; Supervision, S.Z. and R.P.-J.; Visualization, V.M., V.G. and S.Z.; Writing—original draft, V.M., V.G. and S.Z.; Writing—review & editing, V.M., V.G. and S.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This project has received funding from the European Union's Horizon 2020 research and innovation program under the Marie Skłodowska-Curie grant agreement No 764461. This article is based upon work from COST Action CA19111 (European Network on Future Generation Optical Wireless Communication Technologies, NEWFOCUS), supported by COST (European Cooperation in Science and Technology).

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

References

  1. Cahyadi, W.A.; Chung, Y.H.; Ghassemlooy, Z.; Hassan, N.B. Optical Camera Communications: Principles, Modulations, Potential and Challenges. Electronics 2020, 9, 1339. [Google Scholar] [CrossRef]
  2. Pathak, P.H.; Feng, X.; Hu, P.; Mohapatra, P. Visible light communication, networking, and sensing: A survey, potential and challenges. IEEE Commun. Surv. Tutor. 2015, 17, 2047–2077. [Google Scholar] [CrossRef]
  3. Almadani, Y.; Plets, D.; Bastiaens, S.; Joseph, W.; Ijaz, M.; Ghassemlooy, Z.; Rajbhandari, S. Visible Light Communications for Industrial Applications—Challenges and Potentials. Electronics 2020, 9, 2157. [Google Scholar] [CrossRef]
  4. Saeed, N.; Guo, S.; Park, K.H.; Al-Naffouri, T.Y.; Alouini, M.S. Optical camera communications: Survey, use cases, challenges, and future trends. Phys. Commun. 2019, 37, 100900. [Google Scholar] [CrossRef] [Green Version]
  5. Saha, N.; Ifthekhar, M.S.; Le, N.T.; Jang, Y.M. Survey on optical camera communications: Challenges and opportunities. IET Optoelectron. 2015, 9, 172–183. [Google Scholar] [CrossRef]
  6. Le, N.T.; Hossain, M.; Jang, Y.M. A survey of design and implementation for optical camera communication. Signal Process. Image Commun. 2017, 53, 95–109. [Google Scholar] [CrossRef]
  7. Jang, M. IEEE 802.15 WPAN 15.7 Amendment-Optical Camera Communications Study Group (SG 7a). 2019. Available online: https://www.ieee802.org/15/pub/SG7a.html (accessed on 12 April 2021).
  8. Kim, Y.H.; Cahyadi, W.A.; Chung, Y.H. Experimental Demonstration of VLC-Based Vehicle-to-Vehicle Communications Under Fog Conditions. IEEE Photonics J. 2015, 7, 1–9. [Google Scholar] [CrossRef]
  9. Chaudhary, N.; Alves, L.N.; Ghassemlooy, Z. Current Trends on Visible Light Positioning Techniques. In Proceedings of the 2019 2nd West Asian Colloquium on Optical Wireless Communications (WACOWC), Tehran, Iran, 27–28 April 2019; pp. 100–105. [Google Scholar]
  10. Chaudhary, N.; Younus, O.I.; Alves, L.N.; Ghassemlooy, Z.; Zvanovec, S.; Le-Minh, H. An Indoor Visible Light Positioning System Using Tilted LEDs with High Accuracy. Sensors 2021, 21, 920. [Google Scholar] [CrossRef]
  11. Palacios Játiva, P.; Román Cañizares, M.; Azurdia-Meza, C.A.; Zabala-Blanco, D.; Dehghan Firoozabadi, A.; Seguel, F.; Montejo-Sánchez, S.; Soto, I. Interference Mitigation for Visible Light Communications in Underground Mines Using Angle Diversity Receivers. Sensors 2020, 20, 367. [Google Scholar] [CrossRef] [Green Version]
  12. Jurado-Verdu, C.; Matus, V.; Rabadan, J.; Guerra, V.; Perez-Jimenez, R. Correlation-based receiver for optical camera communications. OSA Opt. Express 2019, 27, 19150–19155. [Google Scholar] [CrossRef] [Green Version]
  13. Jurado-Verdu, C.; Guerra, V.; Rabadan, J.; Perez-Jimenez, R.; Chavez-Burbano, P. RGB Synchronous VLC modulation scheme for OCC. In Proceedings of the 2018 11th International Symposium on Communication Systems, Networks and Digital Signal Processing (CSNDSP), Budapest, Hungary, 18–20 July 2018; pp. 1–6. [Google Scholar]
  14. Matus, V.; Teli, S.R.; Guerra, V.; Jurado-Verdu, C.; Zvanovec, S.; Perez-Jimenez, R. Evaluation of Fog Effects on Optical Camera Communications Link. In Proceedings of the 2020 3rd West Asian Symposium on Optical Wireless Communications (WASOWC), Tehran, Iran, 24–25 November 2020; pp. 1–5. [Google Scholar]
  15. Matus, V.; Eso, E.; Teli, S.R.; Perez-Jimenez, R.; Zvanovec, S. Experimentally Derived Feasibility of Optical Camera Communications under Turbulence and Fog Conditions. Sensors 2020, 20, 757. [Google Scholar] [CrossRef] [Green Version]
  16. Matus, V.; Guerra, V.; Jurado-Verdu, C.; Teli, S.; Zvanovec, S.; Rabadan, J.; Perez-Jimenez, R. Experimental Evaluation of an Analog Gain Optimization Algorithm in Optical Camera Communications. In Proceedings of the 2020 12th International Symposium on Communication Systems, Networks and Digital Signal Processing (CSNDSP), Porto, Portugal, 20–22 July 2020; pp. 1–5. [Google Scholar]
  17. Matus, V.; Guerra, V.; Zvanovec, S.; Rabadan, J.; Perez-Jimenez, R. Sandstorm effect on experimental optical camera communication. OSA Appl. Opt. 2021, 60, 75–82. [Google Scholar] [CrossRef]
  18. Kuroda, T. Essential Principles of Image Sensors; CRC Press: Boca Raton, FL, USA, 2017. [Google Scholar]
  19. Khalighi, M.A.; Uysal, M. Survey on Free Space Optical Communication: A Communication Theory Perspective. IEEE Commun. Surv. Tutor. 2014, 16, 2231–2258. [Google Scholar] [CrossRef]
  20. Teli, S.R.; Zvanovec, S.; Perez-Jimenez, R.; Ghassemlooy, Z. Spatial frequency-based angular behavior of a short-range flicker-free MIMO–OCC link. OSA Appl. Opt. 2020, 59, 10357–10368. [Google Scholar] [CrossRef]
  21. Teli, S.R.; Matus, V.; Zvanovec, S.; Perez-Jimenez, R.; Vitek, S.; Ghassemlooy, Z. The First Study of MIMO Scheme Within Rolling-shutter Based Optical Camera Communications. In Proceedings of the 2020 12th International Symposium on Communication Systems, Networks and Digital Signal Processing (CSNDSP), Porto, Portugal, 20–22 July 2020; pp. 1–5. [Google Scholar]
  22. Le, N.-T.; Jang, Y.M. Performance evaluation of MIMO Optical Camera Communications based rolling shutter image sensor. In Proceedings of the 2016 8th International Conference on Ubiquitous and Future Networks (ICUFN), Vienna, Austria, 5–8 July 2016; pp. 140–144. [Google Scholar]
  23. Gonçalves, A.L.R.; Maia, Á.H.A.; Santos, M.R.; de Lima, D.A.; de Miranda Neto, A. Visible Light Positioning and Communication Methods and Their Applications in the Intelligent Mobility. IEEE Lat. Am. Trans. 2021, 100, 2174–2185. [Google Scholar]
  24. Iturralde, D.; Azurdia-Meza, C.; Krommenacker, N.; Soto, I.; Ghassemlooy, Z.; Becerra, N. A new location system for an underground mining environment using visible light communications. In Proceedings of the 2014 9th International Symposium on Communication Systems, Networks and Digital Signal Processing (CSNDSP), Manchester, UK, 23–25 July 2014; pp. 1165–1169. [Google Scholar]
  25. Hossan, M.; Chowdhury, M.Z.; Hasan, M.; Shahjalal, M.; Nguyen, T.; Le, N.T.; Jang, Y.M. A new vehicle localization scheme based on combined optical camera communication and photogrammetry. Mob. Inf. Syst. 2018, 2018, 8501898. [Google Scholar] [CrossRef] [Green Version]
  26. Karbalayghareh, M.; Miramirkhani, F.; Eldeeb, H.B.; Kizilirmak, R.C.; Sait, S.M.; Uysal, M. Channel Modelling and Performance Limits of Vehicular Visible Light Communication Systems. IEEE Trans. Veh. Technol. 2020, 69, 6891–6901. [Google Scholar] [CrossRef]
  27. Marè, R.M.; Marte, C.L.; Cugnasca, C.E.; Sobrinho, O.G.; dos Santos, A.S. Feasibility of a Testing Methodology for Visible Light Communication Systems Applied to Intelligent Transport Systems. IEEE Lat. Am. Trans. 2020, 100, 515–523. [Google Scholar]
  28. Elamassie, M.; Karbalayghareh, M.; Miramirkhani, F.; Kizilirmak, R.C.; Uysal, M. Effect of Fog and Rain on the Performance of Vehicular Visible Light Communications. In Proceedings of the 2018 IEEE 87th Vehicular Technology Conference (VTC-Spring), Porto, Portugal, 3–6 June 2018; pp. 1–6. [Google Scholar]
  29. Kuutti, S.; Bowden, R.; Jin, Y.; Barber, P.; Fallah, S. A Survey of Deep Learning Applications to Autonomous Vehicle Control. IEEE Trans. Intell. Transp. Syst. 2021, 22, 712–733. [Google Scholar] [CrossRef]
  30. Ashok, A.; Jain, S.; Gruteser, M.; Mandayam, N.; Yuan, W.; Dana, K. Capacity of pervasive camera based communication under perspective distortions. In Proceedings of the 2014 IEEE International Conference on Pervasive Computing and Communications (PerCom), Budapest, Hungary, 24–28 March 2014; pp. 112–120. [Google Scholar]
  31. Shi, J.; He, J.; Jiang, Z.; Zhou, Y.; Xiao, Y. Enabling user mobility for optical camera communication using mobile phone. OSA Opt. Express 2018, 26, 21762–21767. [Google Scholar] [CrossRef]
  32. Beshr, M.; Michie, C.; Andonovic, I. Evaluation of Visible Light Communication system performance in the presence of sunlight irradiance. In Proceedings of the 2015 17th International Conference on Transparent Optical Networks (ICTON), Budapest, Hungary, 5–9 July 2015; pp. 1–4. [Google Scholar]
  33. Georlette, V.; Bette, S.; Brohez, S.; Pérez-Jiménez, R.; Point, N.; Moeyaert, V. Outdoor Visible Light Communication Channel Modeling under Smoke Conditions and Analogy with Fog Conditions. Optics 2020, 1, 259–281. [Google Scholar] [CrossRef]
  34. Eso, E.; Teli, S.; Hassan, N.B.; Vitek, S.; Ghassemlooy, Z.; Zvanovec, S. 400 m rolling-shutter-based optical camera communications link. OSA Opt. Lett. 2020, 45, 1059–1062. [Google Scholar] [CrossRef] [PubMed]
  35. Chavez-Burbano, P.; Guerra, V.; Rabadan, J.; Perez-Jimenez, R. Optical camera communication for smart cities. In Proceedings of the 2017 IEEE/CIC International Conference on Communications in China (ICCC Workshops), Qingdao, China, 22–24 October 2017; pp. 1–4. [Google Scholar]
  36. Ghassemlooy, Z.; Popoola, W.; Rajbhandari, S. Optical Wireless Communications: System and Channel Modelling with Matlab; CRC Press: Boca Raton, FL, USA, 2019. [Google Scholar]
  37. Ishimaru, A. Electromagnetic Wave Propagation, Radiation, and Scattering: From Fundamentals to Applications; John Wiley & Sons: Hoboken, NJ, USA, 2017. [Google Scholar]
  38. Kedar, D.; Arnon, S. Urban optical wireless communication networks: The main challenges and possible solutions. IEEE Commun. Mag. 2004, 42, S2–S7. [Google Scholar] [CrossRef]
  39. Kedar, D.; Arnon, S. The positive contribution of fog to the mitigation of pointing errors in optical wireless communication. Appl. Opt. 2003, 42, 4946–4954. [Google Scholar] [CrossRef]
  40. Yamazato, T.; Kinoshita, M.; Arai, S.; Souke, E.; Yendo, T.; Fujii, T.; Kamakura, K.; Okada, H. Vehicle Motion and Pixel Illumination Modeling for Image Sensor Based Visible Light Communication. IEEE J. Sel. Areas Commun. 2015, 33, 1793–1805. [Google Scholar] [CrossRef]
  41. Guerra, V.; Ticay-Rivas, J.R.; Alonso-Eugenio, V.; Perez-Jimenez, R. Characterization and Performance of a Thermal Camera Communication System. Sensors 2021, 20, 3288. [Google Scholar] [CrossRef]
  42. Bohata, J.; Zvanovec, S.; Korinek, T.; Abadi, M.M.; Ghassemlooy, Z. Characterization of dual-polarization LTE radio over a free-space optical turbulence channel. OSA Appl. Opt. 2015, 54, 7082–7087. [Google Scholar] [CrossRef]
  43. Libich, J.; Perez, J.; Zvanovec, S.; Ghassemlooy, Z.; Nebuloni, R.; Capsoni, C. Combined effect of turbulence and aerosol on free-space optical links. OSA Appl. Opt. 2017, 56, 336–341. [Google Scholar] [CrossRef]
  44. Nor, N.A.M.; Fabiyi, E.; Abadi, M.M.; Tang, X.; Ghassemlooy, Z.; Burton, A. Investigation of moderate-to-strong turbulence effects on free space optics—A laboratory demonstration. In Proceedings of the 2019 15th International Conference on Telecommunications (ConTEL), Graz, Austria, 13–15 July 2019. [Google Scholar]
  45. Andrews, L.C.; Phillips, R.L. Laser Beam Propagation through Random Media; SPIE Press: Bellingham, WA, USA, 2005; Volume 152. [Google Scholar]
  46. Ayaz, M.; Ammad-Uddin, M.; Sharif, Z.; Mansour, A.; Aggoune, E.M. Internet-of-Things (IoT)-Based Smart Agriculture: Toward Making the Fields Talk. IEEE Access 2019, 7, 129551–129583. [Google Scholar] [CrossRef]
  47. Zhu, N.; Xia, Y.; Liu, Y.; Zang, C.; Deng, H.; Ma, Z. Temperature and Humidity Monitoring System for Bulk Grain Container Based on LoRa Wireless Technology. In Lecture Notes in Computer Science, Proceedings of the Cloud Computing and Security, Haikou, China, 8–10 June 2018; Sun, X., Pan, Z., Bertino, E., Eds.; Springer International Publishing: Cham, Switzerland, 2018; pp. 102–110. [Google Scholar]
  48. Mekki, K.; Bajic, E.; Chaxel, F.; Meyer, F. Overview of Cellular LPWAN Technologies for IoT Deployment: Sigfox, LoRaWAN, and NB-IoT. In Proceedings of the 2018 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops), Athens, Greece, 19–23 March 2018; pp. 197–202. [Google Scholar]
  49. Atmel Corporation. ATmega328p, 8-bit AVR Microcontroller with 32K Bytes In-System Programmable Flash, Datasheet; Atmel Corporation: San Jose, CA, USA, 2015. [Google Scholar]
  50. IMX219PQH5-C Datasheet. Available online: https://datasheetspdf.com/pdf/1404029/Sony/IMX219PQH5-C/1 (accessed on 7 April 2021).
  51. Eso, E.F.; Burton, A.; Hassan, N.B.; Abadi, M.M.; Ghassemlooy, Z.; Zvanovec, S. Experimental Investigation of the Effects of Fog on Optical Camera-based VLC for a Vehicular Environment. In Proceedings of the 2019 15th International Conference on Telecommunications (ConTEL), Graz, Austria, 13–15 July 2019. [Google Scholar]
  52. Cleveland, W.S.; Devlin, S.J. Locally Weighted Regression: An Approach to Regression Analysis by Local Fitting. J. Am. Stat. Assoc. 1988, 83, 596–610. [Google Scholar] [CrossRef]
  53. Cartográfica de Canarias (GRAFCAN). Sistema de Información Territorial de Canarias. Available online: https://grafcan.es/v0kq90T (accessed on 12 April 2021).
Figure 1. Diagram of the implementation of optical camera communication (OCC) using rolling shutter (RS) techniques in outdoor scenarios and the inherent issues associated with field-of-view (FOV) use in long distance setups.
Figure 2. Vehicular visible light communication using OCC is an example of segregation of spatially divided data inputs that is feasible by virtue of the image-forming nature of cameras.
Figure 3. Proposal of the OCC equipment for Wireless Sensor Networks.
Figure 4. Hardware and equipment developed for the OCC experiments. (a) Single LED small transmitters. (b) CMOS camera-based OCC receiver. (c) Standalone implementation of the single LED small transmitters. (d) RGB LED large transmitters. (e) Large optics (telescope) CMOS-based receiver.
Figure 5. Frame format used by both Tx devices of the sub-pixel experiment, assuming global shutter demodulation at the Rx.
Figure 6. Diagram of the laboratory chamber employed for emulation of atmospheric conditions.
Figure 7. Results using Pearson's correlation coefficient between the reference signal and the images taken in Experiment 1.2 under different visibility conditions. (a) Scatterplot of the r_xy values obtained using a fixed gain of 3 dB. The red line is the non-parametric locally estimated scatterplot smoothing (LOESS) with span = 0.35 [52]. (b) Contour plot of the LOESS (span = 0.15) of the r_xy values while varying the camera gain. The shaded vertical line corresponds to a fuzzy threshold determined by the analog gain control algorithm.
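As an illustrative companion to the metric reported in Figure 7, the sketch below shows one way to compute Pearson's r_xy between a reference waveform and the recovered intensity signal, and to apply the LOESS smoothing of panel (a) (span = 0.35). The signal lengths, visibility values, and noise levels are synthetic placeholders, not the experimental data.

```python
# Minimal sketch (synthetic data) of the Figure 7 metric: Pearson's r_xy
# between a reference signal and a recovered one, plus LOESS smoothing of
# r_xy as a function of visibility. Placeholder values throughout.
import numpy as np
from scipy.stats import pearsonr
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(0)

# Hypothetical OOK reference waveform and a noisy, attenuated received copy.
reference = rng.integers(0, 2, 1000).astype(float)
received = 0.6 * reference + 0.2 * rng.normal(size=reference.size)

r_xy, _ = pearsonr(reference, received)   # correlation for a single capture
print(f"r_xy = {r_xy:.3f}")

# LOESS smoothing of many r_xy values versus visibility (cf. Fig. 7a, red line).
visibility = np.linspace(5, 100, 60)                        # placeholder [m]
r_values = 0.9 * (1 - np.exp(-visibility / 30)) + 0.05 * rng.normal(size=60)
smoothed = lowess(r_values, visibility, frac=0.35, return_sorted=True)
```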
Figure 8. Photographs of the Experiment 3 setup under sandstorm conditions [17]. (a) Transmitter side using large RGB LED. (b) Surroundings of the experiment affected by the sand particles. (c) Receiver side using the CMOS camera attached to a telescope.
Figure 9. Photographs of the communication devices deployed for the sub-pixel experiments at the facilities of IDeTIC. (a) Satellite image from Cartográfica de Canarias (Grafcan) [53]. (b) CMOS camera receiver. (c) Single LED small transmitter.
Figure 10. Image processing results from the experimentation using the sub-pixel setting. (a) Example of a frame obtained by the camera during experimentation, with insets of the regions of projection of each transmitter. (b) Estimated point spread function (PSF) for d1. (c) Estimated PSF for d2.
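Figure 10 and Table 5 refer to a point-spread-function-weighted read-out of the sub-pixel projections. The sketch below illustrates one plausible form of such a scheme, assuming the estimated PSF is used as a weighting mask over the ROI pixels; the ROI size, noise model, and helper functions are assumptions made for illustration, not the paper's exact estimator.

```python
# Hedged sketch of a PSF-weighted read-out for a sub-pixel transmitter.
# `frames` is a stack of small ROI crops around the projected Tx; the PSF is
# estimated from the time-averaged ROI and then used as per-pixel weights.
import numpy as np

def estimate_psf(frames: np.ndarray) -> np.ndarray:
    """Estimate the PSF as the time-averaged, background-subtracted ROI."""
    mean_img = frames.mean(axis=0)
    psf = np.clip(mean_img - np.median(mean_img), 0.0, None)
    total = psf.sum()
    return psf / total if total > 0 else psf

def psf_weighted_signal(frames: np.ndarray, psf: np.ndarray) -> np.ndarray:
    """Collapse each ROI frame into a single sample using the PSF as weights."""
    return np.tensordot(frames, psf, axes=([1, 2], [0, 1]))

# Synthetic example: 7x7 px ROI, 120 frames of an on-off keyed source.
rng = np.random.default_rng(1)
bits = rng.integers(0, 2, 120).astype(float)
profile = np.exp(-0.5 * ((np.arange(7) - 3) / 1.2) ** 2)
kernel = np.outer(profile, profile)                     # 2D Gaussian spot
frames = bits[:, None, None] * kernel + 0.05 * rng.normal(size=(120, 7, 7))

psf = estimate_psf(frames)
soft_samples = psf_weighted_signal(frames, psf)         # one value per frame
```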
Table 1. Description of the Tx and Rx key features.
Feature | Description
RGB LED Large Transmitter
Device | 12 V DC RGB LED strips (108 × 5050 SMD chips)
Front-end device | Microcontroller Atmel ATMega328p [49]
Single LED Small Transmitter
Device | 3.7 V DC white LED, 5 mm
Front-end device | Microcontroller Atmel ATMega328p [49]
CMOS Camera Receiver
Camera | Picamera V2 module (Sony IMX219 [50])
Max. resolution | 3280 × 2464 px
Gain (G_V) max. value | 16 dB
Frame rate | 30 fps
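As a minimal illustration of how the Table 1 receiver could be driven, the sketch below uses the standard picamera Python API to lock the capture parameters before acquisition. The resolution, ISO, and file name are illustrative assumptions; the full 3280 × 2464 px mode is only available at reduced frame rates, so a binned mode is used here to keep 30 fps.

```python
# Minimal sketch (illustrative settings) of configuring the Picamera V2
# (Sony IMX219) receiver for repeatable OCC captures with the picamera API.
import time
from picamera import PiCamera

camera = PiCamera(resolution=(1640, 1232), framerate=30)  # binned mode, 30 fps
camera.iso = 100                       # fix sensitivity (illustrative value)
time.sleep(2)                          # let exposure and gains settle

camera.shutter_speed = camera.exposure_speed   # freeze the current exposure
camera.exposure_mode = 'off'                   # lock analog/digital gain
gains = camera.awb_gains
camera.awb_mode = 'off'                        # lock white balance
camera.awb_gains = gains

camera.capture('occ_frame.png')        # single frame for offline processing
camera.close()
```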
Table 2. Summary of the experiments carried out, with their key contributions in terms of methodology and results.
Experiment | Design | Processes | Metrics | Highlighted Findings
Exp. 1.1 [16] | Attenuation emulated in laboratory; 0.46 m link | Rolling shutter; gain control algorithm | Pearson's corr. coef. | Automated gain optimization
Exp. 1.2 [14,15] | Fog and turbulence emulation in chamber; 4.68 m link | Rolling shutter; RGB cross-talk compensation; ROI detection | SNR; Pearson's corr. coef. | Influence of camera gain
Exp. 2 [17] | Sandstorm, real outdoor scenario; 100 m and 200 m links | Rolling shutter; large optical zoom; tilt compensation; Gaussian mixture model; ROI detection | SNR; BER | ROI expansion due to scattering
Exp. 3 | Sub-pixel, real outdoor scenario; 90 m and 130 m links | GS detection with RS hardware; small optical devices | SNR; BER; PSF | Re-use; PSF enhancement; scalability
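For Exp. 2, Table 2 lists Gaussian-mixture-model processing together with ROI detection. The sketch below shows one plausible way a two-component mixture over pixel intensities can separate the transmitter's projection from the background; the paper's actual segmentation and tracking pipeline may differ, and the frame, spot position, and noise values are synthetic.

```python
# Hedged sketch of GMM-based ROI detection: model pixel intensities as a
# two-component mixture (background vs. transmitter) and keep the brighter
# component as the region of interest. Synthetic data; illustrative only.
import numpy as np
from sklearn.mixture import GaussianMixture

def detect_roi(gray_frame: np.ndarray) -> np.ndarray:
    """Return a boolean mask of the pixels assigned to the bright component."""
    samples = gray_frame.reshape(-1, 1).astype(float)
    gmm = GaussianMixture(n_components=2, random_state=0).fit(samples)
    bright = int(np.argmax(gmm.means_.ravel()))
    labels = gmm.predict(samples).reshape(gray_frame.shape)
    return labels == bright

# Synthetic frame: dark background plus a bright spot where the Tx projects.
rng = np.random.default_rng(2)
frame = rng.normal(20.0, 5.0, size=(64, 64))
frame[28:36, 30:38] += 120.0

mask = detect_roi(frame)
rows, cols = np.nonzero(mask)
roi_bbox = (rows.min(), rows.max(), cols.min(), cols.max())   # ROI bounds
```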
Table 3. Laboratory chamber parameters and equipment.
Feature | Description
Dimensions | 4.9 m (length), 0.4 m (width), 0.4 m (height)
Temperature sensors | 20 × Papouch Corp. TQS3-E (precision 0.1 °C)
Laser source | Thorlabs HLS635 (635 nm) with F810APC collimation package
Optical power meter | Thorlabs PM100D, S120C sensor
Heat blowers | 2 × Sencor SFH7010, 2000 W
Fog machine | Antari F-80Z, 700 W
Table 4. Extinction coefficient values measured under sandstorm conditions.
Channel | λ [nm] | K_ext(λ) [m⁻¹]
Red | 630 | 7.2 × 10⁻³
Green | 530 | 6.9 × 10⁻³
Blue | 475 | 5.6 × 10⁻³
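Assuming simple Beer–Lambert attenuation, P(d) = P0·exp(−K_ext·d), the Table 4 coefficients translate into the optical losses computed below over the Exp. 2 link distances; this is a worked example of that arithmetic, not an additional measurement.

```python
# Worked example: optical loss implied by the Table 4 extinction coefficients
# under Beer-Lambert attenuation, P(d) = P0 * exp(-K_ext * d), evaluated at
# the Exp. 2 link distances (100 m and 200 m).
import math

k_ext = {'Red': 7.2e-3, 'Green': 6.9e-3, 'Blue': 5.6e-3}   # [1/m]

for channel, k in k_ext.items():
    for d in (100, 200):                                    # [m]
        loss_db = 10.0 * k * d / math.log(10.0)             # -10*log10(P/P0)
        print(f"{channel:5s} d = {d:3d} m: {loss_db:4.1f} dB optical loss")
```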
Table 5. Signal-to-noise ratio (SNR) and bit error rate (BER) results by channel using the point-spread-function-weighted scheme in the sub-pixel experiments.
Metric | Position | Channel R | Channel G | Channel B
SNR (experimental) | d1 | 19.7 dB | 20.0 dB | 19.8 dB
SNR (experimental) | d2 | 13.0 dB | 12.9 dB | 12.5 dB
BER (theoretical) | d1 | <10⁻¹² | <10⁻¹² | <10⁻¹²
BER (theoretical) | d2 | 3.97 × 10⁻⁶ | 5.03 × 10⁻⁶ | 1.24 × 10⁻⁵
BER (experimental) | d1 | <3.33 × 10⁻³ | <3.33 × 10⁻³ | <3.33 × 10⁻³
BER (experimental) | d2 | 9.60 × 10⁻³ | 9.60 × 10⁻³ | 7.20 × 10⁻³
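The theoretical BER values in Table 5 are consistent with the Gaussian-tail mapping BER = Q(√SNR) applied to the measured SNRs (SNR in linear scale). The short check below reproduces the d2 row; the mapping is an inference from the tabulated numbers, not a restatement of the paper's derivation.

```python
# Check that the Table 5 "BER (theoretical)" values at d2 follow from the
# measured SNRs via BER = Q(sqrt(SNR)), an assumed OOK-style mapping.
import math

def q_function(x: float) -> float:
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

snr_db_d2 = {'R': 13.0, 'G': 12.9, 'B': 12.5}    # measured SNRs from Table 5

for channel, s_db in snr_db_d2.items():
    snr_linear = 10.0 ** (s_db / 10.0)
    ber = q_function(math.sqrt(snr_linear))
    print(f"Channel {channel}: BER ≈ {ber:.2e}")  # ≈ 4.0e-6, 5.0e-6, 1.2e-5
```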