Speaker
Description
ITER maintenance is carried out by means of Remote Handling (RH) systems. During maintenance operations, the RH operator uses user interfaces for commanding, monitoring and controlling the RH Equipment. These user interfaces include, for example, GUIs, haptic and joystick devices, Virtual Reality (VR) systems and camera views of the RH Equipment and its environment. Many RH tasks involving movers and robotic arms require millimetre accuracy, but camera views are often very limited, of poor quality and may be unavailable during the closest and most accuracy-critical steps. In these cases, the RH operator is constrained to relying on VR, provided that the VR models of the RH Equipment and its environment are accurately calibrated, but should also take advantage of other complementary technologies such as synthetic viewing and computer-aided teleoperation. In this context, the purpose of this research was to prototype and evaluate a novel stereoscopic vision software system, called 3D Node, that locates and tracks the position and orientation of a target item with respect to a stereo-camera attached to the wrist of a robotic manipulator arm. The 3D Node features stereo-camera calibration, target depth mapping, target position and orientation detection, and online target tracking, including quality-check views produced by overlaying the VR model onto the camera views. The tracking information would be valuable for updating VR models and for implementing augmented reality and synthetic viewing functionalities. This paper reports a demonstration of the 3D Node system on a Divertor RH use case and discusses the system's applicability to other ITER RH systems. Parts of the 3D Node development are published in [1], [2].
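The abstract lists the 3D Node functions (stereo-camera calibration, depth mapping, pose detection, online tracking) without describing how they are implemented. As an illustration only, a minimal sketch of such a pipeline is given below, assuming a generic OpenCV workflow (chessboard calibration, rectification, disparity-based depth and PnP pose estimation); the function names, pattern size and parameter values are hypothetical and are not taken from the paper.

```python
# Illustrative sketch only: not the 3D Node implementation, but one plausible
# OpenCV realization of the features named in the abstract.
import cv2
import numpy as np

PATTERN = (9, 6)          # assumed chessboard corner layout for calibration
SQUARE_SIZE_MM = 25.0     # assumed calibration square size

def calibrate_stereo(left_images, right_images, image_size):
    """Estimate both camera intrinsics and the left/right extrinsics (R, T)."""
    objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2)
    objp *= SQUARE_SIZE_MM

    obj_pts, left_pts, right_pts = [], [], []
    for left, right in zip(left_images, right_images):
        ok_l, corners_l = cv2.findChessboardCorners(left, PATTERN)
        ok_r, corners_r = cv2.findChessboardCorners(right, PATTERN)
        if ok_l and ok_r:
            obj_pts.append(objp)
            left_pts.append(corners_l)
            right_pts.append(corners_r)

    # Calibrate each camera individually, then refine the stereo extrinsics.
    _, K1, d1, _, _ = cv2.calibrateCamera(obj_pts, left_pts, image_size, None, None)
    _, K2, d2, _, _ = cv2.calibrateCamera(obj_pts, right_pts, image_size, None, None)
    _, K1, d1, K2, d2, R, T, _, _ = cv2.stereoCalibrate(
        obj_pts, left_pts, right_pts, K1, d1, K2, d2, image_size,
        flags=cv2.CALIB_FIX_INTRINSIC)
    return K1, d1, K2, d2, R, T

def depth_map(left, right, K1, d1, K2, d2, R, T, image_size):
    """Rectify the stereo pair and compute a per-pixel 3D (depth) map."""
    R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K1, d1, K2, d2, image_size, R, T)
    map_l = cv2.initUndistortRectifyMap(K1, d1, R1, P1, image_size, cv2.CV_16SC2)
    map_r = cv2.initUndistortRectifyMap(K2, d2, R2, P2, image_size, cv2.CV_16SC2)
    left_r = cv2.remap(left, map_l[0], map_l[1], cv2.INTER_LINEAR)
    right_r = cv2.remap(right, map_r[0], map_r[1], cv2.INTER_LINEAR)

    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
    disparity = matcher.compute(left_r, right_r).astype(np.float32) / 16.0
    return cv2.reprojectImageTo3D(disparity, Q)   # XYZ per pixel, camera frame

def target_pose(model_points_3d, image_points_2d, K1, d1):
    """Pose (rotation, translation) of a known target relative to the camera."""
    ok, rvec, tvec = cv2.solvePnP(model_points_3d, image_points_2d, K1, d1)
    return cv2.Rodrigues(rvec)[0], tvec           # 3x3 rotation matrix, translation
```

In an eye-in-hand configuration such as the one described, a pose returned in this way would be expressed in the wrist camera frame; chaining it with the manipulator kinematics is what would allow the VR model of the target and its environment to be updated during teleoperation.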
[1] Niu, L., et al., "Eye-in-Hand Manipulation for Remote Handling: Experimental Setup", International Conference on Robotics and Mechatronics, 2017.
[2] Niu, L., et al., "Robust Pose Estimation with the Stereoscopic Camera in Harsh Environment", International Symposium on Electronic Imaging, 2018.