The robot is designed to dynamically avoid obstacles. The system incorporates real-time obstacle detection through Lidar and camera vision to avoid collisions with people, animals, or other entities in its environment. In testing under real-world, uncontrolled conditions, the robot avoided all humans and dynamic obstacles it encountered.
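As a rough sketch of the kind of Lidar-based proximity check this detection relies on, the snippet below subscribes to a laser scan topic and flags anything closer than a stop threshold. The topic name (`/scan`), the 0.5 m threshold, and the rospy-based setup are illustrative assumptions, not the project's exact implementation.

```python
# Illustrative Lidar proximity check; topic name and threshold are assumed placeholders.
import math
import rospy
from sensor_msgs.msg import LaserScan

STOP_DISTANCE = 0.5  # meters; assumed threshold for flagging a nearby obstacle

def scan_callback(msg: LaserScan):
    # Drop invalid returns (inf/NaN/out of range) before taking the minimum range.
    valid = [r for r in msg.ranges
             if math.isfinite(r) and msg.range_min < r < msg.range_max]
    if valid and min(valid) < STOP_DISTANCE:
        rospy.logwarn("Obstacle within %.2f m -- replanning required", min(valid))

if __name__ == "__main__":
    rospy.init_node("obstacle_proximity_check")
    rospy.Subscriber("/scan", LaserScan, scan_callback)
    rospy.spin()
```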
The robot is intended to be used for responsible, non-malicious purposes. Users must ensure that the robot is never deployed in scenarios that could cause physical or emotional harm to individuals, animals, or property. This includes, but is not limited to, misuse in aggressive or conflict-related situations, or any form of surveillance or tracking without appropriate consent. Any use of the robot or code for military or other ethically questionable purposes is strictly prohibited.
The program is written to operate with the Robot Operating System (ROS) and has been specifically tuned for use with Neato robots and their mounted cameras. The RRT* path planning parameters and camera vision settings are optimized for these particular hardware configurations. As a result, we cannot guarantee the same level of performance or reproducibility when the program is run on unintended or unsupported hardware. Users are advised to test and calibrate the system thoroughly when adapting it to different hardware to ensure reliable operation.
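To make the calibration step concrete, the sketch below shows the kinds of RRT* parameters that usually need re-tuning when porting to new hardware. The parameter names and default values are illustrative assumptions, not the values shipped with this project.

```python
# Hypothetical RRT* tuning parameters pulled from the ROS parameter server.
# Names and defaults are placeholders; adjust them for your own robot.
import rospy

def load_planner_params():
    return {
        "step_size": rospy.get_param("~rrt_star/step_size", 0.2),          # meters per tree extension
        "goal_bias": rospy.get_param("~rrt_star/goal_bias", 0.05),         # probability of sampling the goal directly
        "max_iterations": rospy.get_param("~rrt_star/max_iterations", 2000),
        "rewire_radius": rospy.get_param("~rrt_star/rewire_radius", 0.5),  # meters considered during rewiring
    }
```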
This robot is specifically designed for indoor environments and performs best in spaces such as warehouses, offices, or labs. Deployment in outdoor or highly variable conditions is not recommended, as the system is optimized for indoor navigation with consistent environmental factors.
The robot is intended to operate on flat, level surfaces. It may struggle or become inoperable on uneven terrain, steep inclines, or surfaces with significant gradients. Users should ensure that the environment is adequately prepared with level, stable flooring to support safe and reliable operation.
The robot's Lidar and camera vision systems are not capable of detecting transparent surfaces such as glass. Users must ensure that the robot is not navigating through environments where glass barriers or windows could obstruct its path, as it may not recognize these obstacles and could collide with them.
While the robot is capable of navigating through cluttered environments, its ability to find a viable path may be limited by the density and complexity of obstacles. Although it will still avoid obstacles safely, it may fail to compute a viable or optimal path in a highly congested space. In such cases, the robot may require manual intervention or re-routing to proceed safely.
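The sketch below illustrates one way a caller might handle a planning failure of this kind: retry with a larger iteration budget, then stop and request manual intervention. The `plan_rrt_star` helper is hypothetical and stands in for the project's actual planner interface.

```python
# Hypothetical failure-handling wrapper around an RRT* planner call.
# plan_rrt_star is a placeholder for the project's actual planning function.
def navigate(start, goal, plan_rrt_star, max_attempts=3):
    iterations = 2000
    for attempt in range(max_attempts):
        path = plan_rrt_star(start, goal, max_iterations=iterations)
        if path is not None:
            return path               # a viable path was found
        iterations *= 2               # give the sampler a larger budget and retry
    # No path after several attempts: hand control back to the operator.
    raise RuntimeError("No viable path found; manual re-routing required")
```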
The robot does not store any data collected during its operation. However, the environmental scans and visual feeds streamed from its Lidar and camera systems may be intercepted in transit. Users should be aware that this data could be accessible over network connections, and appropriate measures should be taken to secure the communication channels. It is the responsibility of the user to operate the robot on a secure network to prevent unauthorized access or interception of data. Additionally, the system should not be used in locations where sensitive personal information may be inadvertently captured.
The program uses the DepthAnything model for monocular depth estimation. As a result, all images captured and processed by the Neato robot, including those used for depth estimation, are subject to the privacy and data usage policies of the DepthAnything model. Users should be aware of and comply with these policies when operating the system, ensuring that any data captured through the robot's sensors is handled in accordance with relevant privacy laws and guidelines.
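For reference, a minimal depth-estimation sketch using a Depth Anything checkpoint through the Hugging Face transformers pipeline is shown below. The checkpoint name and file paths are assumptions; the project may load the model differently.

```python
# Minimal monocular depth estimation sketch with a Depth Anything checkpoint.
# The checkpoint name and file names are illustrative assumptions.
from PIL import Image
from transformers import pipeline

depth_estimator = pipeline("depth-estimation", model="LiheYoung/depth-anything-small-hf")

image = Image.open("camera_frame.png")   # one RGB frame from the robot's camera
result = depth_estimator(image)
depth_map = result["depth"]              # PIL image of per-pixel relative depth
depth_map.save("camera_frame_depth.png")
```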
Users must fully understand the robot's capabilities and limitations, especially in terms of navigation, decision-making, and the scope of the RRT* path planning algorithm. This includes understanding that the robot operates in dynamic, potentially unpredictable environments and that its path planning can only handle situations accounted for by the current programming and sensor capabilities.