Article

Improvement in the Number of Velocity Vector Acquisitions Using an In-Picture Tracking Method for 3D3C Rainbow Particle Tracking Velocimetry

Meteorology and Fluid Science Division, Central Research Institute of Electric Power Industry, 1646 Abiko, Abiko-shi 270-1194, Chiba, Japan
*
Author to whom correspondence should be addressed.
Fluids 2024, 9(10), 226; https://doi.org/10.3390/fluids9100226
Submission received: 29 July 2024 / Revised: 20 September 2024 / Accepted: 28 September 2024 / Published: 30 September 2024
(This article belongs to the Special Issue Flow Visualization: Experiments and Techniques)

Abstract

Particle image velocimetry and particle tracking velocimetry (PTV) have developed from two-dimensional two-component (2D2C) velocity vector measurements to 3D3C measurements. Rainbow particle tracking velocimetry is a low-cost 3D3C measurement technique adopting a single color camera; however, its vector acquisition rate is low. To increase the number of acquired vectors, this paper proposes a high-probability, long-term tracking method. First, particles are tracked in the raw picture instead of in three-dimensional space, with the tracking aided by color information. Second, a particle that temporarily cannot be tracked because of particle overlap is compensated for using the positional information at the times before and after. The proposed method is demonstrated for flow under a rotating disk with different particle densities and velocities. The proposed method improves the tracking rate, the number of continuous tracking steps, and the number of acquired velocity vectors, and it can be applied under the difficult conditions of high particle density (0.004 particles per pixel) and large particle movement (maximum of 60 pix).

1. Introduction

The present study aimed to increase the number of velocity vectors acquired from three-dimensional three-component (3D3C) velocity measurements made in rainbow particle tracking velocimetry (PTV) to enhance spatial resolution. To this end, we developed an in-picture tracking method adopting raw particle images in place of the conventional method of tracking 3D positions.
Particle image velocimetry (PIV) and PTV are common methods of measuring fluid velocities from images and have been used for a wide variety of objects, including multiphase flows [1,2] and microfluids [3]. These methods have evolved significantly, transitioning from 2D2C velocity vector measurements confined to the focal plane to full 3D3C measurements [4]. To enhance the dimensionality and components of these measurements, various techniques have been developed, including stereo PIV [5], scanning PIV [6,7], and tomographic PIV [8,9]. The 3D3C velocity measurements usually need to be made from multiple viewpoints using multiple cameras, such as when adopting scanning tomographic particle image velocimetry [9] and the shake-the-box method [10]. Rainbow PTV [11,12,13,14,15,16] is a low-cost method of obtaining 3D3C velocities with a single color camera and simple data processing. The flow volume is illuminated by multi-colored light that changes along the depth direction (z) using a liquid-crystal display projector. Particle positions along vertical and horizontal directions (x and y) in pictures are obtained in the same manner as in traditional PTV, whereas particle positions in the depth component are obtained from the particle colors. The degree of hue (H, 0–360) is calculated from RGB values of the particle and then converted to the depth position.
PTV provides Lagrangian velocity data at random locations. However, to calculate differential quantities such as vorticity, the data must be interpolated onto a grid, as in computational fluid dynamics analysis or particle image velocimetry, which requires the acquisition of spatially dense velocity vectors [17]. However, the number of vectors acquired in rainbow PTV is low, as described below.
The accuracy of the depth position estimated from the particle color is not high. To improve accuracy, a multi-cycle color pattern was designed and used in [16]. In addition, recent methods based on machine learning still have a standard deviation error of ±3% of the total depth [14]. This error is too high relative to the positional accuracy of the x and y positions, which can be obtained at a sub-pixel level. The inaccuracy in the estimation of the depth position reduces not only the accuracy of velocity vectors but also the vector acquisition rate.
Low positional accuracy leads to a low vector acquisition rate in the following ways. As the velocity is calculated by dividing the particle movement by the time interval between images, it is necessary to link (i.e., track) the same particles in successive images. The tracking of particles adopts historical information on the particle position. Figure 1 shows moving particles and positional data, including errors in the z direction. Black circles indicate the correct positions, and red circles the measured positions with error. The particle position at time t + 1 is estimated from the particle positions at times t − 1 and t. A particle inside the estimated area (i.e., the dotted circle in Figure 1) is recognized as the same particle, whereas a particle outside the search area cannot be tracked. A smaller movement in one time step corresponds to a greater ratio of the position error to the amount of movement and thus a higher chance of tracking failure. Even if a slowly moving particle could be tracked, the accuracy of its velocity would be low. To reduce mistracking, the four-time-step method [18], for example, judges a particle to be the same only if it is tracked in four consecutive images. However, neither the four-time-step method, the velocity gradient tensor method [19], nor other tracking methods can track particles in the error-prone situations described.
In addressing the above problem, the nearest-neighbor method was adopted to track particles in the picture instead of the 3D space [14]. However, using the nearest-neighbor method, the amount of movement of a particle in one step has to be sufficiently small for the particle to be tracked, which requires a sufficiently high sampling rate. Achieving a higher dynamic range of velocity requires that particles be tracked over a large range of movement.
Such a 3D tracking method also requires long tracks for the fitting process applied after tracking. Figure 1 shows that, even if tracking is possible, the accuracy of the velocity obtained in the z direction is low. Positional data containing zigzag errors in the depth direction, as shown in Figure 1, can be replaced with smooth motion (the red curve in Figure 1) through filter processing (see, for example, [14,20]). This operation requires continuous tracking over a long period of time without losing the particle.
This paper thus proposes a high-probability and long-term tracking method that can be applied under conditions of large particle movement and high particle density by means of two main features.
First, particles are tracked in a raw picture, which is in contrast with the conventional method of tracking particles in 3D space. Using color information as an aid for tracking, it is possible to track a particle with high probability even under conditions of high particle density and large movement in a single time step. In other words, whereas conventional 3D tracking applies equal weights to x, y, and z positions (estimated from H), the proposed tracking process strongly weights the x and y positions, which have higher accuracy, and uses H from particle color to improve the tracking rate.
Second, even if the tracking of a particle fails temporarily, the tracking is reestablished in a later step. When two or more particle images overlap, which often happens under high particle density conditions, the particles are not recognized correctly, and the particle being tracked is temporarily lost. By compensating the positional information of untracked particles at the time step where particle tracking has failed, it is possible not only to increase the tracking rate but also to extend the period of continuous tracking. This increases the number of vectors acquired because the subsequent fitting process requires continuous tracking.
The present work evaluates the proposed method for flow under a rotating disk. Specifically, the work demonstrates the processes of the proposed method, the possibility of improving the number of vector acquisitions, and the application of the method under conditions of high particle density and large particle movement.

2. Data Processing

2.1. In-Picture Tracking

Figure 2 presents flowcharts of data processing. The left flowchart is for the conventional 3D tracking method, whereas the right flowchart is for the in-picture tracking method proposed in this paper. First, the background is subtracted from the measured image, and particles are identified on the grayscale image; the xy position (in pixels) and color of each particle are then extracted from the raw image. The particle color is converted from RGB to the hue degree and then to the normalized z position using a calibration curve. The hue degree (H) is calculated from the values of the three color channels [11] according to
$$
H = \begin{cases}
60 \times \dfrac{G - B}{\max(R,G,B) - \min(R,G,B)} & \text{if } \max(R,G,B) = R,\\[4pt]
60 \times \dfrac{B - R}{\max(R,G,B) - \min(R,G,B)} + 120 & \text{if } \max(R,G,B) = G,\\[4pt]
60 \times \dfrac{R - G}{\max(R,G,B) - \min(R,G,B)} + 240 & \text{if } \max(R,G,B) = B.
\end{cases}
$$
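As a minimal illustration (not the authors' code), the piecewise conversion above can be sketched in Python; the function name and the convention for achromatic pixels are assumptions:

```python
def rgb_to_hue(r, g, b):
    """Hue degree (0-360) from RGB channel values, following the piecewise formula."""
    mx, mn = max(r, g, b), min(r, g, b)
    if mx == mn:
        return 0.0  # achromatic pixel: hue is undefined; 0 is an arbitrary convention
    d = mx - mn
    if mx == r:
        h = 60.0 * (g - b) / d
    elif mx == g:
        h = 60.0 * (b - r) / d + 120.0
    else:
        h = 60.0 * (r - g) / d + 240.0
    return h % 360.0  # wrap negative values (max = R with B > G) into 0-360
```

In practice, the R, G, and B values would be the background-subtracted channel intensities averaged over the pixels of one detected particle.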
The calibration curve is obtained by adopting the in situ slit calibration method [14]. The operation up to this point is the same for the two methods.
In the conventional method, the xy position in millimeters is obtained from the z position because the viewing angle varies with depth. In addition, the normalized z is converted to the z position in millimeters. Tracking based on the 3D positions is then performed to obtain the velocity vector.
In contrast, the proposed in-picture tracking method tracks particles in the raw picture. Particles are tracked by their xy position in pixel units, with the normalized z used as an aid. Because the xy position is known more accurately than the z position, particles can be tracked at a higher rate than in the 3D domain. The detailed process of tracking and particle linking across time steps is explained in Section 2.2. Inaccurate z positions are compensated for by fitting the tracked trajectory in the z direction with a polynomial equation. Finally, the x, y, and z positions are converted from pixel units to millimeter units to obtain velocity vectors.

2.2. Detailed Process of Continuous Long-Time Tracking

One of the challenges in PTV is tracking particles when they overlap in an image. Since the particles are distributed in three dimensions, it is common for them to overlap when viewed in 2D, which can cause the tracking system to lose track of individual particles; this limits the particle density usable in PTV. Two overlapping particles can be recognized as a single particle whose color differs from that of either particle, such that both particles are lost. However, because the two particles have different z positions, they have different motions in the xy plane, and their overlap is thus a temporary event. Therefore, even if a particle is temporarily lost, it can be tracked for an extended period through compensation of the particle position at the time of overlap. To address this issue, our method includes a process to predict and recover the positions of lost particles, allowing continuous tracking over a longer period.
Initially, the system predicts the position of a particle at a future time step by considering its velocity and acceleration from previous time steps. If the particle is not found at its predicted location at the next time step, the system anticipates where it might appear in subsequent frames. When the particle reappears, the system compensates for the missed positions by creating “virtual” particles at the points where the particle was temporarily lost. This compensation allows tracking to remain continuous even when particles are lost for up to two consecutive time steps, enabling continuous long-term tracking. Essentially, a particle does not suddenly disappear except when it travels beyond the field of view.
The detailed tracking process at time t is described below and shown in Figure 3. Basically, the velocity is calculated from the positions at two consecutive times, and the acceleration is calculated from the positional information at three consecutive times (i.e., the velocities at two times). Tracked particles are assigned an ID identifying them as the same particle across images, which allows temporarily lost particles to be tracked continuously.
The process involves four operations, as shown in Table 1. Process 1, the basic tracking process, predicts the position at t + 1 for a particle that has been assigned an ID at t, assuming that the velocity and acceleration at t − 1 are maintained at t and using the positional information at t − 2 and t − 1. The particles within the search area are filtered using the dimensionless z positional information obtained from the color information, e.g., by taking ΔZ < 0.025. If a particle exists within the predicted range, it is linked as the same particle and given the ID, and the velocity and acceleration at time t are calculated. Filtering by z is performed in the same way in the subsequent processes. The threshold value of ΔZ must be greater than the distance moved in the depth direction in one step plus the error of the color-depth transformation; this is analogous to setting the search range in the xy direction. In previous studies, the error width (95th percentile width) has been around ±2%, and research is underway to reduce this further [16]. The threshold value should thus be set according to the velocity of the measurement target and the sampling rate.
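A hedged sketch of process 1 under stated assumptions: the function names are hypothetical, the prediction is a simple constant-acceleration extrapolation from three stored positions, and the 5 pix search radius and ΔZ < 0.025 gate follow the values quoted in the text.

```python
def predict_position(p_tm2, p_tm1, p_t):
    """Predict (x, y) at t+1, assuming the acceleration estimated from the
    last three positions is maintained (constant-acceleration extrapolation)."""
    vx1, vy1 = p_tm1[0] - p_tm2[0], p_tm1[1] - p_tm2[1]
    vx2, vy2 = p_t[0] - p_tm1[0], p_t[1] - p_tm1[1]
    ax, ay = vx2 - vx1, vy2 - vy1
    return (p_t[0] + vx2 + ax, p_t[1] + vy2 + ay)

def candidates(pred_xy, z_ref, particles, r_search=5.0, dz_max=0.025):
    """Keep particles inside the xy search radius (pix) whose normalized z,
    obtained from the particle color, lies within dz_max of the tracked value."""
    out = []
    for x, y, z in particles:
        if ((x - pred_xy[0]) ** 2 + (y - pred_xy[1]) ** 2 <= r_search ** 2
                and abs(z - z_ref) < dz_max):
            out.append((x, y, z))
    return out
```

A unique candidate in the filtered list would inherit the ID; multiple candidates would require a tie-breaking rule, which is not specified here.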
Next, process 2 targets particles that have an ID at t − 1 but are not found at t and are thus temporarily lost. Assuming that the velocity and acceleration at t − 2 are maintained, the position at t + 1 is predicted, and if there is a particle within the estimated search area that has not yet been assigned an ID, it is judged to be the same particle and assigned an ID. The acceleration at t − 1 can be calculated from the positions at times t − 2, t − 1, and t, and the acceleration at time t can also be calculated from the positions at times t − 1, t, and t + 1. The virtual particle at time t is placed at the position where the acceleration at both times is constant.
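The placement of the virtual particle admits a closed form: requiring the acceleration at t − 1 (from the positions at t − 2, t − 1, t) to equal the acceleration at t (from the positions at t − 1, t, t + 1) gives p(t) = (p(t+1) + 3 p(t−1) − p(t−2)) / 3. A sketch with a hypothetical function name, positions given as coordinate tuples:

```python
def virtual_position(p_tm2, p_tm1, p_tp1):
    """Virtual particle position at the lost time t, chosen so that
    p_t - 2*p_tm1 + p_tm2 == p_tp1 - 2*p_t + p_tm1 (equal accelerations)."""
    return tuple((a + 3.0 * b - c) / 3.0 for a, b, c in zip(p_tp1, p_tm1, p_tm2))
```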
Process 3 targets particles that have an ID at t − 2 but are not found at t − 1 and t. The same operation as in process 2 is performed at the next time t + 1, and if a particle is found at t + 1, it is given the ID, and virtual particles are placed at the two times t − 1 and t.
Finally, process 4 applies the four-step method to particles that have not yet been assigned an ID. This process is also applied to the initial state of the tracking process. For particles that have been successfully tracked four consecutive times, new IDs are assigned, and velocities and accelerations are calculated. When assigning new IDs, it is necessary to impose strict requirements that ensure that no mistakes are made.
Once a particle is given an ID, it is allowed to be lost through particle overlap up to two consecutive times, and it continues to be tracked until it moves out of the field of view.
The ability to track over long periods allows polynomial fitting. The fitting function is not prescribed, but as in previous studies [14,20], a cubic equation is used in the present study. However, fitting with a low number of points (e.g., four points) is inappropriate, and therefore, only particles that have been tracked more than 10 consecutive times are used in the velocity calculation. In addition, the long-time motion of particles with a short rotational period, such as particles in fine vortices, cannot be fitted. Ideally, a fitting interval or curve order would therefore be set for each particle; in this case, however, we uniformly set the maximum interval at 100 steps.
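A minimal sketch of this fitting step using NumPy (the function name and return convention are assumptions): a cubic is fitted to the noisy z trajectory, tracks shorter than 10 steps are rejected, and the z velocity is taken from the analytic derivative of the fitted polynomial.

```python
import numpy as np

def smooth_track(t, z, order=3, min_steps=10):
    """Fit a polynomial to the noisy z trajectory and return the smoothed
    positions and the z velocity from the analytic derivative.
    Tracks shorter than min_steps are rejected (the fit would be unreliable)."""
    t = np.asarray(t, dtype=float)
    z = np.asarray(z, dtype=float)
    if t.size < min_steps:
        return None
    coef = np.polyfit(t, z, order)          # cubic by default, as in the text
    z_fit = np.polyval(coef, t)             # smoothed positions
    v_fit = np.polyval(np.polyder(coef), t) # dz/dt in (normalized z) per step
    return z_fit, v_fit
```

The x and y trajectories would be smoothed in the same way, and step counts would be converted to seconds via the sampling rate before computing physical velocities.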

2.3. Calibration Curve

The calibration curve used in the conversion from the hue degree to the dimensionless z position is obtained by adopting in situ slit calibration [14]. A schematic of the slit calibration method is shown in the lower graph of Figure 4. The color pattern from the projector is incident on a slit; only a portion of the color pattern passes through the slit, and the rest is blocked. The slit is moved in the z direction, and a curve is created by acquiring the hue of the particles at each depth position. The slit width is set at 1/30 of the full scale (i.e., 64 pix relative to 1920 pix and 6 mm relative to 180 mm). The slit is positioned such that half of its width overlaps the previous measurement position. Values are obtained for 59 steps, and interpolation is performed between the measurements.
There are two problems with slit calibration: the color projected from the projector at both ends of the slit is not the intended color, and the projected color pattern only partially hits particles at both ends of the slit, resulting in blurred images. Therefore, in analyzing the color of the particles obtained through the slit, only the central 50% range of hue degrees is adopted, and the extremes are ignored.
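The resulting hue-to-depth lookup can be sketched as interpolation over the 59 slit positions (hypothetical function name; assumes hue varies monotonically with depth, which holds within one cycle of the projected pattern):

```python
import numpy as np

def build_calibration(z_slits, hue_slits):
    """Return a function mapping hue degree to normalized z by linear
    interpolation between the slit-calibration sample points."""
    order = np.argsort(hue_slits)           # np.interp needs ascending abscissae
    h = np.asarray(hue_slits, float)[order]
    z = np.asarray(z_slits, float)[order]
    return lambda hue: np.interp(hue, h, z)
```

With a multi-cycle color pattern as in [16], the hue-depth mapping is no longer single-valued, and the lookup would additionally need the cycle index.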

3. Experimental Setup

Figure 4 is a schematic of the top view of the experimental setup, showing the camera’s field of view and the optical arrangement. A VPL-HW60 liquid-crystal display projector (Sony) is used as the light source to output the color pattern. The projector has a brightness of 1800 lm and a resolution of 1920 × 1080 pix and provides illumination in the x direction. The color pattern is projected as image data (.tiff), with the gamma value left at its default of 1.8. A Fresnel lens with a focal length of 350 mm is set next to the projector to collimate the expanding projection light. The illumination width in the z direction is 180 mm.
To increase the color detection accuracy, an AP3200T-PMCL-1 3CMOS camera (JAI) and VS-1218/3CMOS lens (VS Tech) are used in place of the typically used single CMOS color camera. For a single CMOS camera, a Bayer filter with R, G, and B colors and a 1:2:1 ratio covers a CCD/CMOS array where cells of the same color do not adjoin each other [21]. In contrast, the 3CMOS camera has three CMOS sheets and a dichroic mirror to separate light. Thus, R, G, and B values are obtained for the same cell of an image sensor array.
The distance between the center of the measured volume and the lens of the color camera is 610 mm. The field of view in the x direction ranges from 149 to 187 mm, and the spatial resolution ranges from 72 to 91 µm/pix, changing in the depth direction. The depth of view is sufficient to capture the particle image in the full depth range. The sampling rate is 40 fps, and the shutter speed is 1 ms. Generally, the sensitivity of channel G of a color camera is higher than the sensitivities of the other two channels. In this study, the sensitivity ratio of the camera is 1.6. To align the ranges of the three channels, the gains of the R and B channels are each set at 1.6 times the gain of channel G.
An acrylic container with inner dimensions of 300 × 300 × 300 mm3 is filled with water seeded with particles. The Diaion HP20 tracer particles (Mitsubishi Chemical) have a density of 1.01 g/cm3, and their diameter is controlled to within the range of 350–600 μm through sieving. The particle number density (particles per pixel, ppp) is evaluated from the captured image. The density cannot be evaluated correctly under dense conditions because overlapping particles cannot be accurately recognized. Therefore, we adopt a dilute condition with few overlapping particles as a standard, and we set the density under dense conditions by mixing in multiples of the dilute number of particles. The particle concentrations are 0.001, 0.002, 0.003, and 0.004 ppp. Sample images for the different conditions are shown in Figure 5. A rotating disk with a diameter of 200 mm generates a circular flow in the container. The rotational speeds are 60, 120, 180, and 240 rpm, and the corresponding maximum movements in the image (i.e., the initial search areas) are 25, 40, 45, and 60 pix. For in-picture tracking, the search area around the predicted position is uniformly set at 5 pixels; incidentally, most particles are less than 10 pixels in diameter. Tracking in 3D space is almost impossible under such a strict condition, and the search range for the predicted position in 3D tracking is therefore set at one-fifth of the distance moved (in millimeters) in the previous step.

4. Results

4.1. Demonstration of In-Picture Tracking

Figure 6 shows an example of the compensation process for a lost particle. The figure shows a cropped image of 100 × 100 pix, with two overlapping particles at the center indicated by a white circle. The overlapping particles are seen as a single particle having a color different from that of the particles before and after, and the tracking is thus disconnected. The blue circles indicate the positions of the two particles, five steps before and five steps after. The positions of the two particles at the overlap time are estimated from the particle positions at the times before and after. The IDs of the particles are taken from before and after the overlap and considered to label the same particles.
This processing compensates for lost particles up to two consecutive times, such that tracking fails when the overlap continues for three or more consecutive times. Although it is possible to extend the compensation interval beyond two time steps, the best solution is to properly recognize each overlapping particle. Analysis of the best solution is left as a future topic of study.
Next, an example of smoothing by polynomial approximation is shown, where the low accuracy of the depth position estimated from particle colors is compensated for by approximating the particle trajectory with a third-order polynomial, as in [14,20].
Here, only particles that can be tracked for more than 10 steps are considered because the approximation does not work well when the number of successive tracking steps is too small. In addition, if the tracking interval is too long, or if the particle makes many rounds of vortex motion during the interval of interest, the approximation does not work well with a third-order polynomial. In this study, the approximation interval is limited to 100 time steps for the large circular flow. Although there is at present no general method of determining the fitting function and tracking interval for various flows, they need to be adjusted according to the experimental results.
In Figure 7, the left panel shows an example time series of the z position of a particle over 100 time steps together with the trajectory obtained by approximation, and the right panel shows the time series of the z-directional velocity and the velocity obtained by approximation. As shown in Figure 3, the viewing angle varies with depth, such that the z position also affects the xy position in 3D space. Looking at the particle positions alone, the positional deviation before and after fitting does not appear large, but the velocity deviation is very large, indicating that fitting is essential. This velocity deviation is mitigated as the particle movement in one time step increases, but continuous particle tracking then conversely becomes more difficult.
Figure 8 shows sample particle trajectories of the rotational motion in 3D coordinates and in XZ and ZY 2D coordinates as a demonstration. The particle number density is 0.003 ppp, and the disk rotation speed is 180 rpm. Results are shown for 100 time steps but with a moderately reduced number of particles for visibility.
The average velocity field in the y-section of the same results shown in Figure 8 is illustrated in Figure 9. The velocities that can be measured for each 10 × 10 × 10 mm3 voxel are temporally averaged at a height of 30 mm < y < 40 mm. The velocity in the depth direction is measured with sufficient accuracy to determine the overall flow.
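The voxel averaging described above can be sketched as follows (hypothetical names; 10 mm cubic voxels as in Figure 9, with the temporal average folded into the per-voxel mean over all samples):

```python
import numpy as np

def voxel_average(points, vels, box=10.0):
    """Average Lagrangian velocity vectors in cubic voxels of side `box` (mm).
    Returns a dict mapping the voxel index (i, j, k) to the mean velocity."""
    sums, counts = {}, {}
    for p, v in zip(points, vels):
        key = tuple(int(np.floor(c / box)) for c in p)  # voxel containing p
        sums[key] = sums.get(key, np.zeros(3)) + np.asarray(v, float)
        counts[key] = counts.get(key, 0) + 1
    return {k: sums[k] / counts[k] for k in sums}
```

Restricting `points` to 30 mm < y < 40 mm before averaging would reproduce the y-section shown in Figure 9.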

4.2. Improvement in Velocity Vector Acquisition

Whereas the tracking rate in three-dimensional space is around 30% at a particle number density of 0.001 ppp and a disk rotation speed of 60 rpm, in-picture tracking achieves around 90%, and almost 100% of particles are successfully tracked when the compensation process for temporarily lost particles is added. In both methods, the search area for particles is set to 1/5 of the maximum particle moving distance (mm or pix) in one step for each rotational speed condition. In tracking in three-dimensional space, an increase in the search range can improve the tracking rate, but the probability of mistakenly linking a particle to another particle increases.
Figure 10 shows the particle tracking rates for different particle densities and disk rotational speeds. Tracking rates achieved using the in-picture method are indicated by the colored sections of the bars, and further increases in the tracking rate due to compensation are shown by the black sections. If particles overlap, the number of particles cannot be accurately determined from an image. As there is almost no overlap at a particle density of 0.001 ppp, the number of particles under denser conditions is estimated from the number recognized under this condition, scaled by the multiple of particles suspended.
At a disk rotation speed of 60 rpm, almost all particles are tracked, with the tracking rate slightly exceeding 100%. The tracking rate is calculated by dividing the number of tracked particles by the number of recognized particles. The tracking rate exceeds 100% because the number of tracked particles exceeds the number of recognized particles due to the compensation process.
Increasing the particle density and particle movement in one step increases the number of candidate particles to be tracked, making tracking more difficult. An increase in particle density increases the probability of particle overlap and, therefore, the number of lost particles. In this method, if a particle is not recognized for three consecutive steps, compensation is not possible, and the velocity vector cannot be obtained.
An increase in particle movement in one step decreases the tracking rate, as particles are more likely to be outside the measurement range. It also increases the number of candidate particles within the searching area, which also makes tracking more difficult.
In the present study, more than 40% of the particles in the image can be tracked, and velocity vectors are obtained, even under the 0.004 ppp, 240 rpm condition. If each overlapping particle could be recognized individually, the number of vectors obtained would be expected to increase dramatically.
A polynomial fitting process is required to obtain the velocity vectors, and therefore, particles that are tracked continuously for a certain number of steps are needed. Figure 11 shows the frequency of successive tracking steps before and after the compensation process. The sample data are the same as those in Figure 8 and Figure 9 under the condition of 0.003 ppp and 180 rpm. In this study, the continuous track required for fitting is 10 steps. The compensation process reduces the frequency of occurrences below 10 steps and generally increases the number of continuous tracking steps.
Figure 12 shows the velocity vector acquisition number for different particle densities and disk rotational speeds. As in Figure 11, the results obtained using the in-picture method are indicated by the colored sections of bars, and the improvements due to compensation are shown by black sections.
Here, only particles that can be tracked continuously for more than 10 steps are used. The improvement in the tracking rate and in the number of consecutive tracking steps due to the compensation for lost particles significantly increases the number of acquired vectors. For example, two track segments that are each tracked for only five consecutive steps cannot be processed for fitting, so no velocity vectors are obtained from them; however, if they are concatenated as the same particle by the compensation process, the number of acquired velocity vectors increases by 10. This greatly increases the number of vector acquisitions; under some conditions, the increase is severalfold.
At a disk rotational speed of 60 rpm, where the movement per step is small, the number of vectors is highest at a particle density of 0.004 ppp, which corresponds to the highest number of particles. In contrast, at a disk rotational speed of 240 rpm, where the movement is large, an intermediate number of particles gives the highest number of vector acquisitions.

5. Conclusions

This paper proposed a high-probability and long-term tracking method with which to increase the number of vector acquisitions for 3D3C color PTV. Particles are tracked in a raw picture instead of in 3D space and by using color information as an aid. A particle that temporarily cannot be tracked is then compensated for using location information at the times before and after.
The proposed method is evaluated for flow under a rotating disk. Its adoption improves the tracking rate, the continuous tracking period, and the number of acquired velocity vectors. The methodology can be applied under the difficult conditions of high particle density (0.004 ppp) and large particle movement (maximum of 60 pix).
The results of the present study increase the number of vector acquisitions and improve the continuous particle tracking time, both of which have important implications for the post-processing of the measurements. Increased vector acquisition density leads to higher-quality results in methods that reconstruct grid velocity data and pressure and vorticity fields from Lagrangian data [17]. Long-term tracking is also an important factor in improving methods that reconstruct the velocity field directly from Lagrangian data [22].

Author Contributions

Conceptualization, methodology, and writing—original draft preparation, M.T.; software, M.T. and K.F.; writing—review and editing, project administration, Y.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The experimental data is available upon request.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Neogi, A.; Mohanta, H.K.; Sande, P.C. Particle Image Velocimetry Investigations on Multiphase Flow in Fluidized Beds: A Review. Flow Meas. Instrum. 2023, 89, 102309. [Google Scholar] [CrossRef]
  2. Seyfi, S.; Karimpour, S.; Balachandar, R. Simultaneous Flow and Particle Measurements for Multiphase Flows in Hydraulic Engineering: A Review and Synthesis of Current State. Flow Meas. Instrum. 2024, 99, 102666. [Google Scholar] [CrossRef]
  3. Etminan, A.; Muzychka, Y.S.; Pope, K.; Nyantekyi-Kwakye, B. Flow Visualization: State-of-the-Art Development of Micro-Particle Image Velocimetry. Meas. Sci. Technol. 2022, 33, 092002. [Google Scholar] [CrossRef]
  4. Discetti, S.; Coletti, F. Volumetric Velocimetry for Fluid Flows. Meas. Sci. Technol. 2018, 29, 042001. [Google Scholar] [CrossRef]
  5. Prasad, A.K. Stereoscopic Particle Image Velocimetry. Exp. Fluids 2000, 29, 103–116. [Google Scholar] [CrossRef]
  6. Hori, T.; Sakakibara, J. High-Speed Scanning Stereoscopic PIV for 3D Vorticity Measurement in Liquids. Meas. Sci. Technol. 2004, 15, 1067–1078. [Google Scholar] [CrossRef]
  7. Burgmann, S.; Brücker, C.; Schröder, W. Scanning PIV Measurements of a Laminar Separation Bubble. Exp. Fluids 2006, 41, 319–326. [Google Scholar] [CrossRef]
  8. Elsinga, G.E.; Scarano, F.; Wieneke, B.; Van Oudheusden, B.W. Tomographic Particle Image Velocimetry. Exp. Fluids 2006, 41, 933–947. [Google Scholar] [CrossRef]
  9. Casey, T.A.; Sakakibara, J.; Thoroddsen, S.T. Scanning Tomographic Particle Image Velocimetry Applied to a Turbulent Jet. Phys. Fluids 2013, 25, 2. [Google Scholar] [CrossRef]
  10. Schanz, D.; Gesemann, S.; Schröder, A. Shake-The-Box: Lagrangian Particle Tracking at High Particle Image Densities. Exp. Fluids 2016, 57, 70. [Google Scholar] [CrossRef]
  11. Watamura, T.; Tasaka, Y.; Murai, Y. LCD-Projector-Based 3D Color PTV. Exp. Therm. Fluid Sci. 2013, 47, 68–80. [Google Scholar] [CrossRef]
  12. Park, H.J.; Saito, D.; Tasaka, Y.; Murai, Y. Color-Coded Visualization of Microbubble Clouds Interacting with Eddies in a Spatially Developing Turbulent Boundary Layer. Exp. Therm. Fluid Sci. 2019, 109, 109919. [Google Scholar] [CrossRef]
  13. Park, H.J.; Yamagishi, S.; Osuka, S.; Tasaka, Y.; Murai, Y. Development of Multi-Cycle Rainbow Particle Tracking Velocimetry Improved by Particle Defocusing Technique and an Example of Its Application on Twisted Savonius Turbine. Exp. Fluids 2021, 62, 71. [Google Scholar] [CrossRef]
  14. Noto, D.; Tasaka, Y.; Murai, Y. In Situ Color-to-Depth Calibration: Toward Practical Three-Dimensional Color Particle Tracking Velocimetry. Exp. Fluids 2021, 62, 131. [Google Scholar] [CrossRef]
  15. Noto, D.; Tasaka, Y.; Murai, Y. Low-Cost 3D Color Particle Tracking Velocimetry: Application to Thermal Turbulence in Water. Exp. Fluids 2023, 64, 92. [Google Scholar] [CrossRef]
  16. Takeyama, M.; Suto, H.; Hattori, Y. 3D3C Rainbow Particle Tracking Velocimetry: Improving Depth Resolution and Velocity Vector Acquisition Rate by Using Color Space for a Multi-Cycle Rainbow Pattern. J. Vis. 2024. [Google Scholar] [CrossRef]
  17. Schneiders, J.F.G.; Scarano, F. Dense Velocity Reconstruction from Tomographic PTV with Material Derivatives. Exp. Fluids 2016, 57, 139. [Google Scholar] [CrossRef]
  18. Malik, N.A.; Dracos, T.; Papantoniou, D.A. Particle Tracking Velocimetry in Three-Dimensional Flows. Exp. Fluids 1993, 15, 279–294. [Google Scholar] [CrossRef]
  19. Ishikawa, M.; Murai, Y.; Wada, A.; Iguchi, M.; Okamoto, K.; Yamamoto, F. A Novel Algorithm for Particle Tracking Velocimetry Using the Velocity Gradient Tensor. Exp. Fluids 2000, 29, 519–531. [Google Scholar] [CrossRef]
  20. Aguirre-Pablo, A.A.; Aljedaani, A.B.; Xiong, J.; Idoughi, R.; Heidrich, W.; Thoroddsen, S.T. Single-Camera 3D PTV Using Particle Intensities and Structured Light. Exp. Fluids 2019, 60, 25. [Google Scholar] [CrossRef]
  21. Pick, S.; Lehmann, F.O. Stereoscopic PIV on Multiple Color-Coded Light Sheets and Its Application to Axial Flow in Flapping Robotic Insect Wings. Exp. Fluids 2009, 47, 1009–1023. [Google Scholar] [CrossRef]
  22. Li, L.; Pan, Z. Three-Dimensional Time-Resolved Lagrangian Flow Field Reconstruction Based on Constrained Least Squares and Stable Radial Basis Function. Exp. Fluids 2024, 65, 57. [Google Scholar] [CrossRef]
Figure 1. Schematic showing particle tracking. Black circles indicate true positions, red circles indicate measured positions with error, the red dashed circle is the search area, and the red curve is the fitted curve.
Figure 2. Flowcharts of data processing. The left flowchart is for the conventional 3D tracking method, whereas the right flowchart is for the proposed in-picture tracking method.
Figure 3. Basic tracking process. Black letters are known values; red letters are values obtained after tracking.
Figure 4. Top view of the experimental setup. The upper section shows the whole setup: a projector, a camera, a lens, and a test section. The lower section shows the slit illumination used for calibration and sample images of illuminated particles.
Figure 5. Example of images for different particle number densities. Brightness values are tripled for visibility.
Figure 6. Example of an image with particle overlap. Brightness values are tripled for visibility.
Figure 7. Results of the fitting process. Left: depth position, right: depth velocity.
Figure 8. Particle trajectory in 3D and 2D coordinates.
Figure 9. Time-averaged velocity field in XZ coordinates.
Figure 10. Particle tracking rates for different disk rotating speeds and particle densities.
Figure 11. Improvements in the number of successive tracking steps due to the compensation process.
Figure 12. Vector acquisition numbers for different disk rotating speeds and particle densities.
Table 1. Tracking processes for continuously tracked and temporarily lost particles. ○ means tracked, and × means untracked.
            t − 2   t − 1   t       Operation
Process 1   ○       ○       ○       Basic tracking
Process 2   ○       ○       ×       Compensate time t
Process 3   ○       ×       ×       Compensate times t − 1 and t
Process 4   ×       ×       ×       Four-time-step method
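The choice among the four operations in Table 1 can be sketched as a simple dispatch on the tracked/untracked pattern of the last three time steps. This is an illustrative sketch only; the function and operation names are hypothetical, and patterns not listed in the table are rejected.

```python
def select_process(tracked):
    """Choose the compensation operation from the tracking pattern over
    time steps (t-2, t-1, t); True = tracked, False = untracked.
    Mirrors Table 1; names are illustrative, not from the paper."""
    t2, t1, t0 = tracked
    if t2 and t1 and t0:
        return "basic tracking"           # Process 1: all steps tracked
    if t2 and t1 and not t0:
        return "compensate time t"        # Process 2: lost at t only
    if t2 and not t1 and not t0:
        return "compensate times t-1 and t"  # Process 3: lost at t-1 and t
    if not (t2 or t1 or t0):
        return "four-time-step method"    # Process 4: no recent tracking
    raise ValueError("pattern not listed in Table 1")
```

A pattern such as (untracked, tracked, tracked) does not appear in the table, so the sketch raises an error rather than guessing an operation.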
