INFORMATION TECHNOLOGY

In the wake of the Titan tragedy: polarization optics for all-weather underwater positioning

In June this year, the tourist submersible Titan was lost and its five passengers perished at the bottom of the sea, making water-body analysis, underwater positioning and underwater rescue topics of global concern.

Recently, Professor Viktor Gruev and the team of Professor David Forsyth at the University of Illinois at Urbana-Champaign (UIUC) proposed an underwater positioning scheme based on polarization optics that achieves all-weather underwater geolocation for the first time. The article was published in eLight, a new journal launched under the Excellence Action Plan, under the title “Polarization-Based Underwater Geolocalization with Deep Learning”.

The work determines underwater position by tracking the position of the sun through polarization imaging, and trains a deep-learning model on roughly ten million polarization-sensitive images of real waters around the world, enabling all-weather underwater positioning in environments ranging from the open ocean and clear water to low-visibility waters. The authors believe that once the underwater geographic location can be determined, this information can be used for autonomous underwater navigation and a deeper understanding of the underwater world. For this breakthrough, UIUC, together with EurekAlert!, ScienMag, Tech Xplore and other outlets, also released a video analyzing the work.

Earth’s waters are a highly complex and dynamic environment that is of great significance to human survival and development, yet on-site monitoring of water bodies still faces many challenges. Underwater sampling robots can deliver accurate monitoring results, but one of the main obstacles is the lack of underwater geolocation technology: the GPS navigation signals we rely on cannot penetrate the water surface, and although acoustic navigation offers a partial solution for underwater positioning, it works only within a very limited coverage area and with poor accuracy.

Against this background, and inspired by the natural phenomenon that many animals migrate by sensing polarization information in the sky or in the water, researchers proposed long ago that the sky’s polarization pattern, visible through Snell’s window when looking up from clear, shallow water, might be used for geolocation and navigation. For a long time, however, researchers believed that underwater light was primarily horizontally polarized and therefore unsuitable for geolocation, a view that was not revised until late in the 20th century. In 2018, Professor Gruev’s group used polarization imaging in clear waters to achieve underwater geolocation with an accuracy of 1970 km, a result published in Science Advances.
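
For reference (a standard optics fact rather than a result of this paper), Snell’s window arises because light from the entire sky refracts into a cone whose half-angle is the critical angle of the air-water interface:

\theta_c = \arcsin\left(\frac{n_\text{air}}{n_\text{water}}\right) \approx \arcsin\left(\frac{1}{1.33}\right) \approx 48.6^\circ

so the whole sky, and with it the sky’s polarization pattern, is compressed into a cone roughly 97° wide when viewed from below the surface.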

However, underwater geolocation was still not possible at night or in turbid, low-visibility waters, which are widespread around the world. This stems from two main challenges: (1) polarization in turbid waters was thought to be purely horizontal; and (2) underwater polarization patterns at night had never been observed. Achieving underwater geolocation in such complex waters (turbid waters and waters at night) would therefore be of great significance and practical value for global water research.

In open ocean with a low scattering coefficient (0.001 m⁻¹) or in low-nutrient fresh water, the underwater polarization pattern can be accurately described by a single-scattering model, but for waters with a high scattering coefficient, and for any water at night, geolocation methods based on such scattering models become infeasible. In this study, the research team showed that the polarization patterns generated by daylight in low-visibility waters, and by moonlight in both high- and low-visibility waters, still allow accurate geolocation.
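
As a point of reference for this single-scattering picture (a textbook Rayleigh-scattering relation, not the paper’s full parametric model), the degree of linear polarization produced by a single scattering event depends only on the scattering angle \theta between the incoming sunlight and the viewing direction:

\mathrm{DoLP}(\theta) = \frac{\sin^2\theta}{1 + \cos^2\theta}

It peaks when viewing at 90° to the sun, and the angle of polarization lies perpendicular to the scattering plane, which is why the observed AoP pattern encodes the sun’s position. Once multiple scattering dominates, as in turbid water or at night, this simple relation breaks down, which is the regime the deep-learning approach targets.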

First, the research team collected roughly ten million real-world images with polarization-sensitive underwater cameras at four locations around the world, forming the training dataset. The team then trained a deep neural network to predict geographic location from underwater angle-of-polarization (AoP) images collected through an omnidirectional lens, combined with the camera’s orientation-sensor data (Figure 1). The network uses an RI-ResNet (rotation-invariant ResNet) backbone together with an RDM (recurrent denoising module), which lets the scheme capture the spatiotemporal structure of underwater polarization patterns. The team then compared geolocation accuracy against a parameter-driven (physics-based) model across times of day, dates, and waters of different visibility (Figure 3). The results show that using polarization information, rather than intensity images alone, yields superior geolocation accuracy even in more complex waters (Figures 4 and 5).
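
As a minimal illustrative sketch (not the authors’ code; the function and array names are assumptions, and a polarization camera with 0°/45°/90°/135° channels is assumed), the AoP images used as network input can be computed from the linear Stokes parameters:

import numpy as np

def aop_and_dolp(i0, i45, i90, i135):
    # Linear Stokes parameters from the four polarizer channels.
    s0 = i0 + i90                      # total intensity
    s1 = i0 - i90                      # 0° vs. 90° preference
    s2 = i45 - i135                    # +45° vs. -45° preference
    aop = 0.5 * np.arctan2(s2, s1)     # angle of polarization, radians in (-pi/2, pi/2]
    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-12)  # degree of linear polarization
    return aop, dolp

# Toy usage with random 8-bit frames standing in for the four channels.
frames = [np.random.randint(0, 256, (480, 640)).astype(float) for _ in range(4)]
aop, dolp = aop_and_dolp(*frames)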

This result opens up broader prospects for the further development of underwater geolocation technology and provides a new, effective tool for global water research, with clear value for follow-on work in related fields.

Figure 1: Deep-neural-network underwater geolocation based on underwater polarization information in low- and high-visibility waters, during the day and at night. (a)-(c) We deployed underwater polarization-sensitive imaging systems with omnidirectional lenses in high- and low-visibility waters to collect the required data. Next to each plot is a pseudo-color image of the measured angle of polarization (AoP) and a graph comparing the observed AoP with the parametric-model prediction; the parametric model is clearly unreliable in low-visibility waters and does not work at night. (d) We selected the four sites shown on the global map to collect underwater data and assess the effectiveness of our geolocation method. (e) Our deep neural network uses a particle filter in combination with AoP image sequences to estimate the camera’s position (latitude and longitude).
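
To make the particle-filter step in panel (e) concrete, here is a minimal sketch under simplified assumptions (not the authors’ implementation): the observation is reduced to a single sun-azimuth estimate inferred from the AoP images, and predict_sun_azimuth stands in for a real solar-position model. Each particle is a candidate latitude/longitude that is reweighted by how well its predicted azimuth matches the observation and then resampled:

import numpy as np

def particle_filter_step(particles, weights, observed_azimuth, predict_sun_azimuth, sigma_deg=5.0):
    # particles: (N, 2) array of candidate (latitude, longitude) in degrees.
    predicted = np.array([predict_sun_azimuth(lat, lon) for lat, lon in particles])
    error = (predicted - observed_azimuth + 180.0) % 360.0 - 180.0    # wrap angular error to [-180, 180)
    weights = weights * np.exp(-0.5 * (error / sigma_deg) ** 2)       # Gaussian likelihood on azimuth error
    weights = weights / weights.sum()
    idx = np.random.choice(len(particles), size=len(particles), p=weights)    # resample toward high probability
    jittered = particles[idx] + np.random.normal(0.0, 0.1, particles.shape)   # small jitter keeps diversity
    return jittered, np.full(len(particles), 1.0 / len(particles))

# Toy usage: particles spread over the globe, refined with one synthetic observation.
rng = np.random.default_rng(0)
particles = np.column_stack([rng.uniform(-60, 60, 5000), rng.uniform(-180, 180, 5000)])
weights = np.full(len(particles), 1.0 / len(particles))
toy_model = lambda lat, lon: (lon + 2.0 * lat) % 360.0    # stand-in for a real solar-ephemeris model
particles, weights = particle_filter_step(particles, weights, 135.0, toy_model)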

Figure 2: (a) The underwater polarization pattern is generated mainly by refraction of light at the air-water interface and by scattering within the water medium; these patterns can be modeled mathematically using Mueller matrices. (b) The particle-filter (PF) panel displays high-probability particles in red and low-probability particles in blue. (c, d) Our proposed network model includes an RI-ResNet (rotation-invariant ResNet) architecture, which replaces each convolutional layer with a rotation-invariant convolutional layer and accounts for the radial spatial structure of omnidirectional images. (e) The RDM (recurrent denoising module) uses a bidirectional recurrent network to model the temporal dependence between images.
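
As a generic example of the Mueller-matrix formalism mentioned in (a) (a textbook matrix, not one of the paper’s specific refraction or scattering matrices), an ideal horizontal linear polarizer transforms a Stokes vector (S0, S1, S2, S3) via

M = \frac{1}{2}\begin{pmatrix} 1 & 1 & 0 & 0 \\ 1 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix}

and the refraction and scattering events that shape the underwater light field are modeled by chaining such matrices along each light path.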

Figure 3: All-day geolocation in low- and high-visibility waters. The top row ((a) and (b)) and bottom row ((c) and (d)) show the geolocation accuracy of the parametric model and of the deep-neural-network model, respectively; the left and right columns correspond to high-visibility and low-visibility waters. Parametric underwater geolocation has moderate to low accuracy in low-visibility waters because the model fails to incorporate all of the physical factors that shape underwater polarization. In contrast, deep-neural-network geolocation performs well throughout the day in both high- and low-visibility waters. Each map shows the mean (triangle and diamond) and the first standard deviation (solid and dashed lines) of the particle filter’s estimated covariance of the geographic location at noon and at the end of the day, respectively. The boxplots show the median and upper/lower quartiles of the prediction error for north-south (purple) and east-west (orange) geolocation.

Figure 4: Underwater geolocation at a depth of 50 m in Lake Ohrid, North Macedonia: (a) solar-angle error and (b) geolocation error obtained over several hours at that depth.

Figure 5: Geolocation accuracy at night under different moon phases. (a) The global map shows the mean (diamond) and first standard deviation (solid line) of the particle-filter estimates of geographic location at the four sites (cross symbols) for the new-moon and full-moon phases. (b) Boxplots show the median and upper/lower quartiles of the prediction error for north-south (purple) and east-west (orange) geolocation.

(Source: China Optics WeChat public account)

Related paper information: https://doi.org/10.1186/s43593-023-00050-6



