Puddle detection
2016-04
This is the course project for the graduate-level course 16-823 Physics-based methods in vision at Carnegie Mellon University.
1 Motivation
Pools of water are a common hazard for terrestrial autonomous vehicles operating in natural environments. Puddles reduce traction and can be deceptively deep, posing a serious danger. We propose a physics-based method to detect water hazards in cluttered environments using a stereo pair of cameras and lidar.
2 Related work
Machine learning is often used for scene classification. However, it is difficult for such systems to distinguish specular surfaces from reflected objects based on appearance alone. Yao et al. [adaboost] use segmentation combined with AdaBoost to detect water hazards, but their method fails to correctly identify reflections.
Methods using specialized hardware such as polarization filters [polarization] and short- and mid-wave infrared cameras [infrared] also exist. However, our robot only has regular cameras and lidar.
Hong et al. [nist] describe a simple method to identify puddles with a lidar sensor by looking for holes in the data. However, this approach is sensitive to the angle of the puddle's normal, does not account for reflections of nearby objects, and may fail to distinguish dark-coloured ground such as asphalt from puddles. Although their method also uses vision to refine puddle detection, it relies on relatively simple checks, such as looking for reflections of the sky or distant objects in the image, without using the geometry of objects in the scene.
Purely physics-based vision approaches to this topic include work by a team at the Jet Propulsion Laboratory [jpla, jplb, jplc], who combine several cues including sky reflection and texture.
All of the approaches above struggle when nearby objects are reflected in the puddles. The main contribution of our method is that we use the scene geometry provided by the lidar sensor to handle reflections of objects at close range.
3 Approach
Our data source is an autonomous off-road vehicle carrying a Multisense S21 stereo camera pair with CMOSIS CMV2000 sensors, which produces image pairs at 30 Hz with a dynamic range of 60 dB, and a Velodyne HDL-64E lidar, which produces point clouds at 10 Hz. The robot pose is provided by an iMAR high-precision GPS/INS system. Data were collected on the off-road trails in Gascola, PA.
As the vehicle approaches a surface patch on the ground, it views the patch from many angles, allowing it to estimate the patch's specularity. The lidar sensor provides accurate ground geometry for identifying planar regions likely to contain puddles, as well as range data to surrounding objects for computing their reflections.
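As a rough illustration of the planar-region step, the sketch below fits a ground plane to a lidar point cloud with a plain RANSAC loop followed by a least-squares refinement. The function, its thresholds, and the NumPy-only implementation are illustrative choices on our part, not the actual pipeline.

```python
import numpy as np

def fit_ground_plane(points, n_iters=200, inlier_thresh=0.05, seed=None):
    """Fit a plane n . x = d to an (N, 3) lidar point cloud with RANSAC.

    Distances are in the same units as `points` (metres for the HDL-64E);
    the threshold here is illustrative, not a tuned value.
    Returns (n, d, inlier_mask).
    """
    rng = np.random.default_rng(seed)
    best_inliers, best_count = None, 0
    for _ in range(n_iters):
        # Sample three points and form a candidate plane from their normal.
        p0, p1, p2 = points[rng.choice(len(points), size=3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-9:  # degenerate (collinear) sample
            continue
        n /= norm
        d = n @ p0
        # Count points within the distance threshold of the candidate plane.
        inliers = np.abs(points @ n - d) < inlier_thresh
        count = inliers.sum()
        if count > best_count:
            best_count, best_inliers = count, inliers
    # Refine with a least-squares fit on the inliers (SVD of centred points).
    inlier_pts = points[best_inliers]
    centroid = inlier_pts.mean(axis=0)
    _, _, vt = np.linalg.svd(inlier_pts - centroid)
    n = vt[-1]
    d = n @ centroid
    return n, d, best_inliers
```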
To detect pools of water, we use multiple cues. In the interest of keeping this a purely physics-based project, we do not consider any deep learning approaches. The cues we are considering include:
- detecting changes in pixel brightness as the vehicle advances
- detecting the reflection of nearby objects
Our code is available on GitHub:
4 Results
4.1 Changes in pixel brightness as vehicle advances
With a good estimate of the ground plane and the robot's pose, we can track the motion of each patch on the ground. A specular patch's appearance changes as we drive towards it, whereas a matte patch stays roughly the same. Using this cue, we can detect specular puddles when they reflect their surroundings.
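As a minimal sketch of this tracking step, the code below uses the standard plane-induced homography to map a ground patch from one frame to the next and compare its brightness. The intrinsics K, relative pose (R, t), ground-plane parameters (n, d), and patch mask are assumed inputs with illustrative names, not taken from our pipeline; OpenCV is used only for the warp.

```python
import numpy as np
import cv2

def ground_plane_homography(K, R, t, n, d):
    """Plane-induced homography mapping image-1 pixels to image-2 pixels for
    points on the ground plane n . X = d (expressed in the camera-1 frame),
    where the camera motion is X2 = R @ X1 + t."""
    H = K @ (R + np.outer(t, n) / d) @ np.linalg.inv(K)
    return H / H[2, 2]

def patch_brightness_change(img1, img2, K, R, t, n, d, patch_mask):
    """Mean absolute intensity change of a ground patch between two frames.

    patch_mask is a boolean mask of the patch in image 1. A large change
    suggests a specular (puddle-like) surface; a matte patch stays similar.
    """
    H = ground_plane_homography(K, R, t, n, d)
    # Warp image 2 back into image 1's geometry via the inverse homography.
    warped = cv2.warpPerspective(img2, np.linalg.inv(H),
                                 (img1.shape[1], img1.shape[0]))
    diff = np.abs(img1.astype(np.float32) - warped.astype(np.float32))
    return float(diff[patch_mask].mean())
```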
Unfortunately, our pose data is not accurate enough to track patches in complex parts of the image such as grass, so there are false positives in grassy regions. Nonetheless, most puddles appear to be detected correctly.
4.2 Reflection detection
To further detect puddles, especially when they reflect the smooth and featureless sky, we can estimate the reflection of the scene. Since we know the index of refraction of water and assume an overcast (unpolarized) sky, we can apply the Fresnel equations, in which the reflectance depends only on the viewing angle of the patch. Combined with a depth map from the lidar, we can generate a simulated reflection.
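As a small worked example of this reflectance term, the sketch below evaluates the Fresnel equations for unpolarized light at an air-water interface (n ≈ 1.33); the function name and defaults are ours.

```python
import numpy as np

def fresnel_reflectance_unpolarized(theta_i, n1=1.0, n2=1.33):
    """Fresnel reflectance of an air-water interface for unpolarized light.

    theta_i: angle of incidence in radians (0 = looking straight down).
    Averages the s- and p-polarized reflectances, which is valid under the
    overcast (unpolarized) sky assumed above.
    """
    theta_i = np.asarray(theta_i, dtype=float)
    sin_t = np.clip(n1 / n2 * np.sin(theta_i), -1.0, 1.0)  # Snell's law
    theta_t = np.arcsin(sin_t)
    cos_i, cos_t = np.cos(theta_i), np.cos(theta_t)
    r_s = ((n1 * cos_i - n2 * cos_t) / (n1 * cos_i + n2 * cos_t)) ** 2
    r_p = ((n1 * cos_t - n2 * cos_i) / (n1 * cos_t + n2 * cos_i)) ** 2
    return 0.5 * (r_s + r_p)

# Grazing views of a puddle reflect far more than near-vertical views:
# fresnel_reflectance_unpolarized(np.deg2rad(80)) is about 0.35,
# versus about 0.02 when looking straight down.
```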
However, our lidar depth map does not seem accurate enough for a good result; in particular, the leafless trees usually return no points at all. We may look into using stereo depth instead.
5 Discussion
Since our lidar and the cameras are not synced, we need accurate pose data for registering lidar points to the camera images. Unfortunately, our iMAR GPS/INS system has suffered a hardware issue which causes severe drift in pose.
Once the simulated reflection is sufficiently accurate, we can detect reflections by comparing the observed pixels on the ground plane with it. The comparison can use either smoothed pixel values (which may not be robust to small misalignments) or histograms of gradients (which may not work as well for puddles that are not perfectly specular).
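A minimal sketch of the histogram-of-gradients comparison is below; the patch inputs, bin count, and cosine-similarity score are illustrative choices on our part, not settled design decisions.

```python
import numpy as np

def gradient_orientation_histogram(patch, n_bins=9):
    """Magnitude-weighted histogram of gradient orientations for a grayscale patch."""
    gy, gx = np.gradient(patch.astype(np.float32))
    mag = np.hypot(gx, gy)
    ang = np.arctan2(gy, gx) % np.pi  # orientations folded into [0, pi)
    hist, _ = np.histogram(ang, bins=n_bins, range=(0.0, np.pi), weights=mag)
    return hist / (hist.sum() + 1e-9)

def reflection_score(observed_patch, simulated_patch, n_bins=9):
    """Cosine similarity between gradient histograms of the observed ground
    patch and the simulated reflection; high values suggest a reflective puddle."""
    h1 = gradient_orientation_histogram(observed_patch, n_bins)
    h2 = gradient_orientation_histogram(simulated_patch, n_bins)
    denom = np.linalg.norm(h1) * np.linalg.norm(h2) + 1e-9
    return float(h1 @ h2 / denom)
```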
References
- [nist] Hong, T. H., Chang, T., Rasmussen, C., & Shneier, M. (2002). Feature detection and tracking for mobile robots using a combination of ladar and color images. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA 2002), Vol. 4, pp. 4340-4345. IEEE.
- [infrared] Matthies, L. H., Bellutta, P., & McHenry, M. (2003). Detecting water hazards for autonomous off-road navigation. In Proceedings of SPIE AeroSense 2003, pp. 231-242. International Society for Optics and Photonics.
- [jpla] Rankin, A. L., Matthies, L. H., & Huertas, A. (2004). Daytime water detection by fusing multiple cues for autonomous off-road navigation. Jet Propulsion Laboratory, Pasadena, CA.
- [jplb] Rankin, A., & Matthies, L. (2010). Daytime water detection based on color variation. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2010), pp. 215-221. IEEE.
- [jplc] Rankin, A. L., Matthies, L. H., & Bellutta, P. (2011). Daytime water detection based on sky reflections. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA 2011), pp. 5329-5336. IEEE.
- [polarization] Xie, B., Pan, H., Xiang, Z., & Liu, J. (2007). Polarization-based water hazards detection for autonomous off-road navigation. In Proceedings of the International Conference on Mechatronics and Automation (ICMA 2007), pp. 1666-1670. IEEE.
- [adaboost] Yao, T. Z., Xiang, Z. Y., & Liu, J. L. (2009). Robust water hazard detection for autonomous off-road navigation. Journal of Zhejiang University SCIENCE A, 10(6), 786-793.