The Inventors have developed an algorithm for measuring the velocity and 3D location of refractive fluids, such as hot air or gas, from natural videos with textured backgrounds. This method of visualizing the movement of refractive fluid elements has exploratory and sensing applications across scientific and engineering fields, including aeronautical engineering, combustion research, petrochemical probing, and ballistics.
Measuring and visualizing the flow of air and other fluids is of great importance in broad engineering applications. Multiple techniques have been proposed for this purpose, such as sound tomography, yet most either rely on complicated and expensive setups or are restricted to laboratory use. The Inventors' technique relies on intensity variations caused by the movement of refractive fluid elements, observed by one or more video cameras, that are consistent over small space-time volumes. Their algorithms use these “refraction wiggles” to 1) measure the motion of refractive fluids in monocular videos, and 2) recover the 3D position of points on the fluid from stereo cameras, in a localization process that is cheaper, more accessible, and applicable in natural settings.
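The stereo part of the pipeline reduces, once the same wiggle has been matched across two rectified viewpoints, to standard pinhole triangulation. The sketch below illustrates only that final depth-from-disparity step; the function name and parameters are illustrative and are not taken from the Inventors' calibration or code.

```python
def depth_from_disparity(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Pinhole stereo triangulation: depth Z = f * B / d.

    disparity_px -- horizontal offset (pixels) of the matched wiggle
                    between the left and right views
    focal_px     -- camera focal length expressed in pixels
    baseline_m   -- distance between the two camera centers (meters)
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Example: a wiggle matched with 10 px disparity, f = 1000 px, 10 cm baseline
z = depth_from_disparity(10.0, 1000.0, 0.1)  # 10.0 m
```

Because the wiggles, not the background features, are matched, the recovered depth is that of the refractive fluid surface rather than of the textured background behind it.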
The Inventors have developed a passive technique to measure air flow using video sequences. Light rays bend as they travel through air of differing densities, and such deflections are exploited in various air measurement techniques. As the air moves, small changes in its refractive properties appear as small visual deformations, called “refraction wiggles,” of the background texture, similar to the shimmering effect seen when viewing objects across hot asphalt or through exhaust gases. These motions can be tracked with regular video cameras to infer the velocity and depth of a refractive fluid layer.
The Inventors observe that while intensity features arise from the background layer, and their locations are not directly related to the fluid layer, the wiggles correspond to the 3D positions and motion of points on the transparent fluid surface. The movement of those wiggles between frames indicates the motion of the transparent fluid, and their disparity between viewpoints is a good cue for the depth of the fluid surface. The algorithms for tracking motion and recovering the position of points on the fluid surface rest on the assumption that the refraction field moves consistently over small space-time volumes. The distortion is measured by computing the wiggle features in an input video and then using those features to estimate the motion and depth of the fluid by matching them across frames and viewpoints.
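The two-stage idea above — first estimate the tiny background displacements (the wiggles), then track the wiggle pattern itself over time to recover fluid motion — can be sketched in one dimension. This is a minimal toy illustration, not the Inventors' implementation: it assumes a known, clean background texture, a single small synthetic refractive “bump,” a first-order (gradient-based) wiggle estimate, and simple cross-correlation in place of their matching procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1D "background texture": a smooth random mix of sinusoids.
x = np.arange(512, dtype=float)
amps = rng.normal(size=8)
phases = rng.uniform(0.0, 2.0 * np.pi, size=8)
b = sum(a * np.sin(2.0 * np.pi * (k + 1) * x / 512.0 + p)
        for k, (a, p) in enumerate(zip(amps, phases)))

def render_frame(t, v=3.0):
    """Background seen through a refractive bump moving at v px/frame.

    The bump displaces rays by w(x), so the camera records b(x + w(x)).
    """
    center = 150.0 + v * t
    w = 0.8 * np.exp(-((x - center) ** 2) / (2.0 * 15.0 ** 2))
    return np.interp(x + w, x, b)

def wiggle_field(frame, background, eps=1e-3):
    """First-order wiggle estimate: dI ~= w * db/dx  =>  w ~= dI / (db/dx).

    Regularized so regions with no texture gradient contribute ~0.
    """
    db = np.gradient(background)
    return (frame - background) * db / (db ** 2 + eps)

def shift_between(a, v):
    """Integer lag (in px) that best aligns signal a with signal v."""
    corr = np.correlate(a - a.mean(), v - v.mean(), mode="full")
    return int(np.argmax(corr)) - (len(v) - 1)

# Wiggles move with the fluid, not the background: track them over frames.
w_start = wiggle_field(render_frame(0), b)
w_later = wiggle_field(render_frame(5), b)
velocity = shift_between(w_later, w_start) / 5.0  # px per frame, true value 3.0
```

Note the key point the paragraph makes: the estimated velocity is that of the wiggle (fluid) field, which translates across the scene, even though the background texture itself never moves.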
- Simple setup can be used outdoors or indoors
- Algorithms can be used to visualize and measure air flow and 3D location directly from regular videos
- First method to provide a complete pipeline that measures the motions and reconstructs the location of refractive flow directly from videos taken in natural settings