Many applications in robotics require locating and removing parts, or confirming that belts and dunnage sheets are empty before robot arms move in, to avoid damage or downtime. Manufacturing and logistics applications must also handle many different parts, and end effectors must orient correctly to pick them. Thin, shiny, machined parts are particularly challenging to locate and measure with 3D capture technologies.
For many depth sensors, thin objects are lost in the depth measurement noise. To see whether Cloudburst could handle demanding applications such as identifying thin parts, we captured pointclouds of an assortment of machined objects and evaluated how well it met these requirements.
The large shiny square object is 20 cm on a side. The thick dimension is 18 mm, while the thin plate in the middle is 3 mm.
The Y-shaped object is 15 mm thick. The thin legs are 15 mm wide and the thicker portion is 25 mm wide.
The black C-shaped parts are 2 mm thick, with the narrowest dimension 5 mm wide.
A meter stick is also included for reference (it is 1 mm thick).
The animated picture shows a static 3D frame captured by Cloudburst positioned 55 cm above the scene. The pointcloud is colored according to depth over a 30 mm range, with hot colors closer to the camera. The animation illustrates the flatness of the measured planes and makes it easier to visualize this completely raw 3D data.
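To make the coloring scheme concrete, here is a minimal sketch of mapping depth to color over a 30 mm window, with near points hot (red) and far points cool (blue). The function name, the simple red-to-blue ramp, and the example depths are illustrative, not the camera's actual rendering code.

```python
def colorize_depth(depth_mm, near_mm, window_mm=30.0):
    """Map a depth value (mm) to an (r, g, b) tuple in [0, 1].

    Near points get hot colors (red), far points cool colors (blue),
    mirroring the depth coloring of the animated frame. Depths outside
    the window are clipped to its edges.
    """
    t = (depth_mm - near_mm) / window_mm
    t = max(0.0, min(1.0, t))   # clip to the 30 mm display window
    return (1.0 - t, 0.0, t)    # simple red-to-blue ramp

# Illustrative depths: a surface at 550 mm and the 3 mm-recessed thin plate.
print(colorize_depth(550.0, near_mm=550.0))  # nearest point -> pure red
print(colorize_depth(553.0, near_mm=550.0))  # 3 mm farther -> slightly cooler
```

A real viewer would use a richer colormap (e.g. a "jet" or "turbo" ramp), but the clip-and-interpolate structure is the same.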
Cloudburst captures 3D data in a single exposure, which makes it suitable for recording 3D in motion as well as static scenes like the one shown here. Capturing data with the Jetstream SDK is straightforward, and example code that outputs both pointclouds and depthmaps is included with the camera. The large depth of field and factory calibration make setup essentially point-and-shoot: there are no parameters to set and no lighting to adjust.
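The two outputs are related by standard pinhole back-projection: each depthmap pixel, together with the camera intrinsics, yields a 3D point. The sketch below shows that relationship; the function name and the intrinsics fx, fy, cx, cy are hypothetical placeholders — a factory-calibrated camera like Cloudburst supplies the calibration (or the pointcloud itself) directly, and this is not the Jetstream SDK's API.

```python
def depthmap_to_points(depth_mm, fx, fy, cx, cy):
    """Back-project a depthmap (row-major list of rows, depths in mm)
    into a list of (X, Y, Z) points using the pinhole model:

        X = (u - cx) * Z / fx,  Y = (v - cy) * Z / fy

    where (u, v) are pixel coordinates and (fx, fy, cx, cy) are the
    camera intrinsics. Pixels with no depth return (Z <= 0) are skipped.
    """
    points = []
    for v, row in enumerate(depth_mm):
        for u, z in enumerate(row):
            if z <= 0:
                continue  # zero depth means no measurement at this pixel
            x = (u - cx) * z / fx
            y = (v - cy) * z / fy
            points.append((x, y, z))
    return points

# Toy 2x2 depthmap: three valid pixels at 550 mm, one dropout.
pts = depthmap_to_points([[550.0, 550.0], [550.0, 0.0]],
                         fx=600.0, fy=600.0, cx=0.5, cy=0.5)
```

In practice a vendor SDK performs this conversion internally, which is why both representations can be offered from the same single-exposure capture.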
If you think Cloudburst could work for your application, please contact us.