As industries such as construction, mining, search & rescue, and surveying come to rely more on digital data collected by robots and less on direct human interaction, especially during this pandemic, the need for the right tool for the job grows every day. While here at Commercial UAV News we usually cover aerial drone news, the future of drones may lie in combining these flying tools with other technologies, including autonomous non-flying robots.

Recently, Percepto announced $45 million in Series B funding to launch a solution for remote, fully autonomous asset monitoring, inspection, and compliance at industrial sites. To detect risks earlier, improve measurement accuracy, spot trends more easily, and provide better situational awareness, Percepto’s new solution, Percepto AIM (Autonomous Inspection & Monitoring), combines Percepto’s Sparrow autonomous drone-in-a-box solution with Boston Dynamics’ Spot.

Built in the shape of a dog and designed for construction, mining, oil & gas, utilities, and public safety, Spot is an agile mobile ground robot that can walk over varied terrain, climb stairs, open doors, and more, all while carrying up to 14 kg of inspection equipment, such as RGB cameras or lidar. Integrated with Percepto AIM, Spot carries Percepto’s payloads for high-resolution imaging and thermal vision to detect issues including hot spots on machines or electrical conductors, water and steam leaks around plants, and equipment with degraded performance.

“We understood that there are some aspects of remote inspection and monitoring that can’t be fulfilled by air alone,” Dor Abuhasira, Percepto’s CEO, wrote. “Our customers needed additional robotics, while still maintaining a fully autonomous cycle. They needed one system that holds all visual data from various sources. And they needed the ability to easily control both the data collection and the data visualization and analysis phases remotely.”

By combining the Sparrow drone with Spot and unifying visual data from satellites, CCTV, and phones, Percepto claims its new solution “redefines the role of autonomous robotics in large-scale industrial sites.” Designed to distill and deliver actionable insights and to be robotics-agnostic, Percepto AIM’s autonomous robots can accomplish multiple missions over time with no human intervention whatsoever. Moreover, AIM automatically chooses the right robot for each requested job, and the robots use advanced computer vision capabilities to respond safely and effectively to changing conditions in real time.
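Percepto has not published how AIM matches robots to jobs, but the idea of picking the right platform for each request can be sketched as simple capability matching. Everything below (the `Robot` class, the capability names, the selection rule) is invented for illustration, not Percepto’s actual API:

```python
# Hypothetical sketch of capability-based robot dispatch.
# All names and rules here are illustrative, not AIM's real internals.
from dataclasses import dataclass


@dataclass
class Robot:
    name: str
    capabilities: set  # e.g. {"aerial", "thermal"}
    available: bool = True


def choose_robot(fleet, required_capabilities):
    """Return the first available robot whose capabilities cover the job."""
    required = set(required_capabilities)
    for robot in fleet:
        if robot.available and required <= robot.capabilities:
            return robot
    return None  # no suitable robot; the job waits or is escalated


fleet = [
    Robot("Sparrow", {"aerial", "rgb", "thermal"}),
    Robot("Spot", {"ground", "rgb", "thermal", "stairs"}),
]

# A stairwell inspection needs a ground robot; an overhead survey needs flight.
print(choose_robot(fleet, {"ground", "stairs"}).name)  # Spot
print(choose_robot(fleet, {"aerial", "thermal"}).name)  # Sparrow
```

A production system would also weigh battery state, distance to the job, and payload fit, but the core decision is the same: match required capabilities against what each idle robot offers.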

“There are many ‘automated’ robots that need constant human assistance – when they run out of power mid-mission, or get caught in inclement weather, or get stuck behind obstacles or lost in new site areas,” Sagi Blonder, Co-Founder & CTO at Percepto, said. “We designed AIM robots to replicate all the things human robot operators do – and may not even realize they’re doing – allowing a truly autonomous operations cycle. For example, AIM robots monitor their battery level, alongside their mission objectives – predictively managing their power levels. This ensures that they always have enough power to return for charging during or after each mission – and don’t either get stuck mid-mission (in the case of ground robots) or drop dangerously from the sky (in the case of drones).”
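The predictive power management Blonder describes boils down to a running check: will finishing the mission still leave enough charge to get home with a safety margin? Percepto hasn’t disclosed its algorithm; the function below is a minimal sketch of that decision, with all numbers and parameter names chosen for illustration:

```python
# Hypothetical sketch of a predictive return-to-charge decision.
# Burn rate, reserve threshold, and distances are illustrative only.
def should_return_to_charge(battery_pct, burn_rate_pct_per_km,
                            remaining_mission_km, return_leg_km,
                            reserve_pct=15.0):
    """Return True if finishing the mission would leave the robot unable
    to reach its charging station with a safety reserve intact."""
    needed = burn_rate_pct_per_km * (remaining_mission_km + return_leg_km)
    return battery_pct - needed < reserve_pct


# A drone at 40% charge burning 5%/km, with 3 km of mission left and a
# 2 km leg home, would land at exactly the 15% reserve: keep flying.
print(should_return_to_charge(40.0, 5.0, 3.0, 2.0))  # False
# One extra kilometer of mission would eat into the reserve: head home.
print(should_return_to_charge(40.0, 5.0, 4.0, 2.0))  # True
```

Evaluating this continuously, rather than waiting for a low-battery alarm, is what lets a robot abort early instead of stranding itself mid-mission or falling from the sky.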

While Percepto’s solution is one of the first, if not the first, commercially available solutions of its kind, this area has been an ongoing subject of study for many different parties.

Back in 2007, a collaborative effort between the General Robotics, Automation, Sensing & Perception (GRASP) Laboratory at the University of Pennsylvania, the Georgia Tech Mobile Robot Laboratory, and the University of Southern California’s (USC) Robotic Embedded Systems Laboratory produced research on adaptive teams of autonomous aerial and ground robots for situational awareness. The goal was to use aerial and ground robots to monitor a small village and to search for and localize human targets by the color of their uniforms, while ensuring that the information gathered by the team was available to a remotely located human operator.

In 2017, Asylon and Endeavor Robotics teamed up to offer the military a unique and innovative solution: use a drone to deploy unmanned ground vehicles (UGVs) at key points along a battlefield, removing the need for humans to occupy those dangerous spots just to gather information. The idea was for a delivery drone to land, pick up a ground robot, drop it on a rooftop, and then return for a battery change and another ground robot to deliver.

Last year, Purdue University professors led research using artificial intelligence and learning algorithms to create a platform allowing multiple aerial, ground, or aquatic drones to communicate and adapt as search and rescue mission factors change.

“For the system, we focused on a multi-agent network of vehicles, which are diverse and can coordinate with each other,” Shaoshuai Mou, a professor in aeronautics and astronautics, said. “Such local coordination will allow them to work as a cohesive whole to accomplish complicated missions such as search and rescue. There are challenges in this area. The environment may be dynamic, for example, with the weather changing. The drones have to be adaptive and must be capable of real-time environment perception and online autonomous decision making.”

Earlier this year, as part of its OFFensive Swarm-Enabled Tactics (OFFSET) program, DARPA conducted its fourth field experiment using ground vehicles, multirotor aircraft, and fixed-wing aircraft to locate and secure multiple simulated items of interest. The OFFSET program envisions future small-unit infantry forces using swarms of upwards of 250 unmanned aircraft systems (UASs) and/or unmanned ground systems (UGSs) to accomplish diverse missions in complex urban environments.

None of these are finished products, nor do they compare to what Percepto promises to deliver with AIM. This is just to say that maybe, in the future, the questions of “which drone should you buy” or “which sensors should you use” will expand to “which type of drone, or combination of drones, is best for the job at hand” – aerial, ground, or aquatic. With software packages like Pix4Dcloud, which let users generate accurate, georeferenced orthomosaics, 3D meshes, point clouds, and elevation models from drone and handheld-camera imagery, using a swarm of drones and Spot-like robots to autonomously map an entire construction site could be the next step for digital twins. Who knows?