J. Ocean Eng. Technol., Volume 30(3), 2016
Song and Choi: Underwater 3D Reconstruction for Underwater Construction Robot Based on 2D Multibeam Imaging Sonar

Abstract

This paper presents a 3D reconstruction method for underwater structures using a 2D multibeam imaging sonar. Compared with other underwater environmental recognition sensors, the 2D multibeam imaging sonar provides high-resolution images in highly turbid water by displaying the reflection intensity data in real time. Because of these advantages, 2D multibeam imaging sonar has been adopted in many underwater applications, including ROVs. However, the elevation data are missing from sonar images, which makes it difficult to correctly understand the underwater topography. To solve this problem, this paper concentrates on the physical relationship between the sonar image and the scene topography to recover the elevation information. First, the sonar reflection intensity data are modeled using the distances and angles between the sonar beams and underwater objects. Second, the elevation data are determined from parameters such as the reflection intensity and the shadow length. The elevation information is then applied to the 3D underwater reconstruction. The presented real-time 3D reconstruction method is evaluated in real underwater environments, and experimental results are shown to appraise its performance. Additionally, using ROVs, contour and texture image mapping results derived from the obtained 3D reconstruction are presented as applications.

1. Introduction

The real-time 3D reconstruction of underwater surfaces has become an important issue in applications like underwater environment inspection, underwater vehicle and robot navigation, and other multi-purpose underwater monitoring applications. This paper specifically focuses on the advantages of the real-time 3D reconstruction of an underwater environment for an underwater construction robot, as shown in Fig. 1. These advantages include obstacle avoidance by the recognition of the underwater scene topography, localization for a remotely operated vehicle (ROV) or an autonomous underwater vehicle (AUV) by matching given large 3D geometric information to increase the operation automation, and the intuitive perception of underwater environments for an ROV operator.
Fig. 1 Underwater Construction Robot
For ground vehicles, 3D surface reconstruction methods have been successfully applied in recent years. However, some problems remain unsolved for underwater vehicles, mainly because of the challenges of underwater environments. One of the most difficult peculiarities is unexpected turbidity, which makes it difficult to use an optical camera or a laser scanner because the field of view (FOV) of these sensors is dramatically decreased. Thus, imaging sonar sensors, which produce images by collecting the reflected sound intensity of acoustic beam pulses from underwater objects, have emerged as a possible alternative, because they are not affected by the turbidity. Fig. 2 shows a real underwater image captured by a 2D imaging sonar.
Fig. 2 Image from 2D Multibeam Imaging Sonar
However, obtaining 3D information and clues from 2D sonar images is still challenging. In this paper, we first concentrate on the physical relationship between the sonar intensity data, which follow Lambert's law, and the scene topography (Aykin and Negahdaripour, 2012). The reflected sonar intensities differ with the angle of reflection, which is a key to estimating the topography. Second, we focus on the acoustic shadows that appear behind objects placed on the sea floor surface. For the same distance, the length of the shadow depends on the elevation of the scene. Hence, these shadows can be important visual clues in 2D imaging sonar imagery to determine the volume and edge boundaries of objects.
Therefore, the proposed real-time 3D reconstruction method first finds object-shadow pairs in the sequenced sonar images by analyzing the reflected sonar intensity pattern. Then, elevation information is computed using Lambert's reflection law and the length of the shadows. We evaluate the real-time 3D reconstruction method using real underwater environments. Experimental results are shown to appraise the performance of the method. Additionally, with the utilization of ROVs, the contour and texture image mapping results from the obtained 3D reconstruction results are presented as applications.
This paper is structured as follows: the related works are explained in section 2. Section 3 describes the data acquisition with the 2D imaging sonar. In section 4, we describe the underwater structure surface 3D reconstruction scheme, and then show the experimental results in section 5. Finally, in section 6, we close this paper with some conclusions and a discussion of future work.

2. Related Work

The real-time underwater structure surface 3D reconstruction problem has been intensively studied for the last decade. There are several different types of sensors that have been applied, and each sensor has merits and demerits. Optical vision sensors have comparatively low prices, but do not provide 3D information. However, 3D information can be computed using vision-based 3D reconstruction techniques like stereo vision, structure from motion (SFM), and visual simultaneous localization and mapping (SLAM) (Brandou et al., 2007; Beall et al., 2010; Pizarro et al., 2004; Negahdaripour and Sarafraz, 2014). Although these techniques have made progress in ground vehicle and robotics applications, they have not been efficiently applied to underwater vehicles because of the challenging conditions of underwater environments such as the unexpected turbidity and light sources. The turbidity causes the FOV of the optical sensors to shorten, which causes difficulties in extracting features from images. In addition, a laser-based structured light system provides high-resolution 3D reconstruction results, but has the same problems with turbidity (Massot-Campos and Oliver-Codina, 2014). Thus, these kinds of optical vision-based sensors have an inherent limitation in underwater environments.
To solve this problem, acoustic device-based 3D reconstruction schemes have been studied to replace optical vision sensors. A multibeam scanning sonar captures 3D point cloud data similar to a topographic laser scanner (Papadopoulos et al., 2011; Coiras et al., 2007). These sensors provide high-resolution 3D reconstructed data, but need additional mechanical systems such as a pan & tilt device, which makes the sensors more expensive. Moreover, the 3D data provided are not real-time because these need a certain amount of time to scan the surroundings. A 3D imaging sonar captures a 3D point cloud in real-time without any additional device (Hansen and Andersen, 1993; Hansen and Andersen, 1996). However, its relatively low resolution and high cost are the primary drawbacks.
In order to overcome these drawbacks and meet the requirements for real-time 3D reconstruction, a 2D forward looking imaging sonar has been applied in recent years to underwater vehicle and robotics applications. Indeed, extracting 3D information from 2D images is challenging, but it has been made possible using the geometry and sonar intensity data relationship (Aykin and Negahdaripour, 2012), which inspired this study.

3. Data Acquisition with 2D Imaging Sonar

3.1 2D Imaging Sonar

As shown in Fig. 3, acoustic pressure waves induced by the transmitters of an imaging sonar propagate and are reflected by the underwater structure surfaces, and the 2D imaging sonar collects these reflected echoes. The 2D imaging sonar discussed here is a BlueView P-900 (Teledyne BlueView, 2015). The P-900 has a range of 2–60 m, and 512 beams (beam width: 1° × 20°, spacing: 0.18°) are formed. The system offers 512 × 1160 images, where each pixel displays the reflection intensity for spots with the same distance, without elevation information.
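No code accompanies the paper, so the following is only a hypothetical sketch of the pixel geometry implied by the figures above. It maps an image pixel (beam column, range row) to a slant range and bearing using the quoted P-900 numbers (2–60 m range window, 512 beams at 0.18° spacing, 512 × 1160 images); the vendor SDK's actual pixel layout may differ.

```python
import math

# Nominal BlueView P-900 figures quoted above (treated as assumptions here):
N_BEAMS = 512             # beams per ping (image width)
N_RANGE_BINS = 1160       # range samples per beam (image height)
BEAM_SPACING_DEG = 0.18   # angular spacing between adjacent beams
R_MIN, R_MAX = 2.0, 60.0  # operating range window in metres

def pixel_to_range_bearing(col, row):
    """Map an image pixel (col: beam index, row: range bin) to (R, phi).

    R is the slant range in metres; phi is the bearing in radians,
    with phi = 0 on the sonar's centre axis.
    """
    R = R_MIN + (R_MAX - R_MIN) * row / (N_RANGE_BINS - 1)
    phi_deg = (col - (N_BEAMS - 1) / 2.0) * BEAM_SPACING_DEG
    return R, math.radians(phi_deg)

# 512 beams at 0.18 deg spacing span roughly a 92 deg horizontal fan:
fov_deg = (N_BEAMS - 1) * BEAM_SPACING_DEG
```

Each pixel carries only the reflection intensity at that (R, ϕ) spot; the missing elevation angle θ is what the rest of the paper recovers.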
Fig. 3 Example of how a given scene would appear when viewed with 2D imaging sonar

3.2 Coordinate System of Sonar

Equation (1) defines the 2D imaging sonar coordinate system by relating the Cartesian coordinates (x0, y0, z0) and the spherical coordinates (R, ϕ, θ) of a point P0, as shown in Fig. 4. The 3D position is computed using the elevation method discussed in section 4.
Fig. 4 2D imaging sonar coordinate system
$$ x_0 = R\cos\theta\cos\phi,\qquad y_0 = R\cos\theta\sin\phi,\qquad z_0 = R\sin\theta \tag{1} $$
Equation (2) computes the spherical coordinates (R, ϕ, θ) of P0 from the imaging sonar image using the inverse transformation,
$$ R = \sqrt{x_0^2 + y_0^2 + z_0^2},\qquad \phi = \tan^{-1}\frac{y_0}{x_0},\qquad \theta = \tan^{-1}\frac{z_0}{\sqrt{x_0^2 + y_0^2}} \tag{2} $$
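As a sketch of the two transformations, assuming the usual forward-looking-sonar convention (ϕ the bearing angle in the horizontal plane, θ the elevation angle):

```python
import math

def spherical_to_cartesian(R, phi, theta):
    """Eq. (1): spherical (R, phi, theta) -> Cartesian (x0, y0, z0)."""
    x0 = R * math.cos(theta) * math.cos(phi)
    y0 = R * math.cos(theta) * math.sin(phi)
    z0 = R * math.sin(theta)
    return x0, y0, z0

def cartesian_to_spherical(x0, y0, z0):
    """Eq. (2): the inverse transformation."""
    R = math.sqrt(x0**2 + y0**2 + z0**2)
    phi = math.atan2(y0, x0)                           # bearing
    theta = math.atan2(z0, math.hypot(x0, y0))         # elevation
    return R, phi, theta
```

The sonar image supplies R and ϕ directly; θ remains unknown until the elevation scheme of section 4 fills it in.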

3.3 Object and Shadow Detection

As previously explained, the reflected sonar intensities depend on the angles of the reflecting surfaces of objects, which means they have a strong relationship with the shapes of the objects. In addition, the shadow depends on the height of the object, and these shadows play an important role in 2D imaging sonar imagery in determining the volume and edge boundaries of objects. Thus, correctly detecting an object and its shadow is a crucial factor for 3D reconstruction from 2D sonar images.

i. Estimation of Sea Floor Surface:

To define an object, this paper applies a machine learning algorithm. Indeed, the sonar intensity data follow Lambert's reflection law, as seen in Eq. (3).
$$ I = k\cos\alpha \,\mathrm{d}A \tag{3} $$
where k is a constant value, and dA is the single beam reflecting area. The reflection angle of the sea floor surface α is relatively constant. Therefore, the sonar intensity data of the sea floor surface are also relatively constant. Thus, we can estimate the sea floor surface from the dataset.
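The paper does not name the learning algorithm used to model the sea floor, so the following is only a minimal stand-in: since Eq. (3) with a near-constant reflection angle makes flat-floor returns nearly constant, a robust central estimate over many intensity samples approximates Isurface.

```python
import statistics

def estimate_seafloor_intensity(beam_intensities):
    """Estimate the roughly constant sea-floor return I_surface.

    The median is robust to the outliers produced by objects
    (bright returns) and shadows (dark gaps) along a beam, so it
    serves as a simple stand-in for the learned sea-floor model.
    """
    return statistics.median(beam_intensities)
```

In practice this estimate would be pooled over many beams and pings before thresholds are derived from it.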

ii. Feature Detection:

Once the sea floor is defined, the thresholds of the object T0 and the shadow Ts have to be defined to detect the object start point Ostart and end point Oend, and the shadow end point Send, as shown in Fig. 5. Oend is the same as the shadow start point. T0 and Ts are defined by T0 = Isurface + ε0 and Ts = Isurface − εs, respectively, where Isurface is the defined sea floor surface sonar intensity, and ε0 and εs are constant parameters.
Fig. 5 Object and shadow detections based on thresholds T0 and Ts, respectively
Fig. 6 shows an actual 2D imaging sonar image with the intensity data of one sonar beam. In practice, the raw data are too noisy to detect objects accurately, so preprocessing such as a Gaussian image blur is needed. As shown in Fig. 6 (b), the intensity data become clear after the blur, and Ostart, Oend, and Send are detected in each 2D sonar beam's intensity data. Fig. 7 shows the result: Ostart is marked in red, Oend in green, and Send in blue.
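The per-beam detection described above can be sketched as follows; the smoothing radius, σ, and the offsets ε0 and εs are illustrative values, not those used in the paper.

```python
import math

def gaussian_kernel(sigma, radius):
    """Normalized 1-D Gaussian kernel of length 2*radius + 1."""
    ks = [math.exp(-0.5 * (i / sigma) ** 2) for i in range(-radius, radius + 1)]
    s = sum(ks)
    return [k / s for k in ks]

def smooth(beam, sigma=1.0, radius=2):
    """Gaussian blur along one beam's intensity profile (edges clamped)."""
    kern = gaussian_kernel(sigma, radius)
    out = []
    for i in range(len(beam)):
        acc = 0.0
        for j, w in enumerate(kern):
            idx = min(max(i + j - radius, 0), len(beam) - 1)
            acc += w * beam[idx]
        out.append(acc)
    return out

def detect_object_shadow(beam, I_surface, eps_o, eps_s):
    """Find (O_start, O_end, S_end) range-bin indices along one beam.

    T_o = I_surface + eps_o marks the bright object return; T_s =
    I_surface - eps_s marks the dark shadow that follows it (O_end
    doubling as the shadow start). Returns None if no pair is found.
    """
    T_o, T_s = I_surface + eps_o, I_surface - eps_s
    o_start = next((i for i, v in enumerate(beam) if v > T_o), None)
    if o_start is None:
        return None
    o_end = next((i for i in range(o_start, len(beam)) if beam[i] < T_s), None)
    if o_end is None:
        return None
    s_end = next((i for i in range(o_end, len(beam)) if beam[i] >= T_s),
                 len(beam) - 1)
    return o_start, o_end, s_end
```

Running the detector on each of the 512 beams yields the red/green/blue point sets shown in Fig. 7.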
Fig. 6 Actual 2D imaging sonar image with intensity data of certain sonar beam
Fig. 7 Results of 3D reconstruction of real underwater structure surface

4. Sonar Surface 3D Reconstruction

4.1 Constraint Conditions

In order to extract 3D information from 2D sonar images, this paper defines the constraint conditions as follows:
  • Specular reflectors are excluded: a smooth, steel-like surface reflects the acoustic beam away from the sonar and is therefore not detected by the 2D imaging sonar, so such objects are not considered.

  • The 2D imaging sonar needs to be placed deep enough that it is free from disturbances from sea surface reflections.

  • The reflections from underwater objects follow Lambert's law.

4.2 Surface Elevation Computation Scheme

After detecting objects and shadows from the 2D sonar intensity data, the 3D reconstruction procedure is as follows:
  • i. Detect objects and shadows based on T0 and Ts

  • ii. Find Ostart, Oend, and Send from the 2D sonar images

  • iii. Compute the elevation using Equation (4)

$$ h = c\,(S_{end} - O_{end}) \tag{4} $$
where c is a constant.
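A sketch of step iii, under the assumption that Equation (4) scales the detected shadow length (converted to metres through the per-bin range resolution) by the constant c:

```python
def elevation_from_shadow(o_end, s_end, range_per_bin, c=1.0):
    """Elevation sketch: object height proportional to shadow length.

    o_end, s_end  -- range-bin indices of the object end / shadow end
    range_per_bin -- metres per range bin along the beam
    c             -- the constant of Eq. (4), set by the sonar geometry
    """
    shadow_len = (s_end - o_end) * range_per_bin  # shadow length in metres
    return c * shadow_len
```

The resulting elevation, together with (R, ϕ) from Eq. (2), fixes the full 3D position of each detected surface point.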

5. Experiments and Results

5.1 3D Reconstruction Results

Fig. 7 shows the results of the 3D reconstruction of a real underwater structure surface. The results are obtained by applying the images from the multibeam imaging sonar as the texture on the 3D mesh, which is shown in Fig. 8.
Fig. 8 3D mesh results of 3D reconstruction

5.2 Contour and Texture Mapping

Obstacle avoidance is a good application of the 3D reconstruction for an ROV, and the intuitive perception of the height of an underwater structure is very important for the operator. The contour map in Fig. 9 (a) provides this height information at a glance. Moreover, in collaboration with an optical camera image, texture mapping on the 3D reconstruction results is possible, which yields an intuitive perception of the underwater scene (see Fig. 9 (b)). In this experiment, an ROV was utilized to obtain the optical images that are applied as the texture on the 3D reconstruction result, since the FOV of the optical camera was relatively short compared to that of the multibeam imaging sonar.
Fig. 9 Experimental results of contour and texture mapping

6. Conclusions

In this paper, we demonstrated that 3D information can be extracted from 2D imaging sonar data. This is achieved by detecting objects and shadows in the sonar intensity data; the 3D information is then obtained using the presented elevation computing scheme. This scheme can be applied to an ROV or an AUV by offering a 3D map with contours and texture. Moreover, the accuracy of this scheme will be further examined, and it will be applied to large-scale 3D mapping using a tracking and mapping method for the localization of ROVs and AUVs.

NOTES

It is noted that this paper is a revised edition based on proceedings of KMRTS 2015 in Gyeongju.

ACKNOWLEDGEMENTS

This research was part of the project titled “R & D center for underwater construction robotics,” funded by the Ministry of Oceans and Fisheries (MOF) and the Korea Institute of Marine Science & Technology Promotion (KIMST), Korea.

References

Aykin, M., Negahdaripour, S., 2012. Forward-Look 2-D Sonar Image Formation and 3-D Reconstruction. Proceedings of IEEE/MTS Oceans 12 Conference.

Beall, C., Lawrence, B.J., Ila, V., Dellaert, F., 2010. 3-D Reconstruction of Underwater Structures. Proceedings of the 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 4418-4423.

Brandou, V., Allais, A.G., Perrier, M., Malis, E., Rives, P., Sarrazin, J., Sarradin, P.M., 2007. 3-D Reconstruction of Natural Underwater Scenes Using the Stereovision System IRIS. Proceedings of IEEE OCEANS 2007-Europe, 1-6.

Coiras, E., Petillot, Y., Lane, D.M., 2007. Multiresolution 3-D Reconstruction from Side-Scan Sonar Images. IEEE Transactions on Image Processing, 16(2), 382-390. doi:10.1109/TIP.2006.888337

Hansen, R., Andersen, P., 1996. A 3-D Underwater Acoustic Camera - Properties and Applications. Acoustical Imaging, Springer, 607-611.

Hansen, R.K., Andersen, P.A., 1993. 3-D Acoustic Camera for Underwater Imaging. Acoustical Imaging, Springer, 723-727.

Massot-Campos, M., Oliver-Codina, G., 2014. Underwater Laser-Based Structured Light System for One-Shot 3-D Reconstruction. Proceedings of Sensors 2014.

Negahdaripour, S., Sarafraz, A., 2014. Improved Stereo Matching in Scattering Media by Incorporating Backscatter Cue.

Papadopoulos, G., Kurniawati, H., Shariff, B.M., Shafeeq, A., Wong, L.J., Patrikalakis, N.M., 2011. 3-D Surface Reconstruction for Partially Submerged Marine Structures Using an Autonomous Surface Vehicle. Proceedings of the 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 3551-3557.

Pizarro, O., Eustice, R., Singh, H., 2004. Large Area 3-D Reconstructions from Underwater Surveys. Proceedings of OCEANS '04 MTS/IEEE/TECHNO-OCEAN '04, 2, 678-687.

Teledyne BlueView, 2015. BlueView P-900. [Online] Available at: <http://www.blueview.com/products/2d-imaging-sonar/pseries-archives/p900-series/> [Accessed 12 March 2016].
