
Drones and self-driving tractors are examples of autonomous machines using physical AI. Source: Adobe Stock
Physical world AI is the future for all autonomous machines, from cars and drones to tractors. The poster child for progress in this area is Waymo. Over many years, the company has developed cutting-edge onboard navigation technologies — including sophisticated hardware as well as numerous artificial intelligence and machine learning models — to guide its cars.
However, I don’t think onboard technology is going to be enough for us to have a world in which autonomous machines become ubiquitous. Unlike Waymo, the vast majority of companies don’t have billions of dollars to build the technology necessary for the compute engine to reside solely in the vehicle.
Rather, what’s needed are highly efficient cloud-based systems that, when combined with AI models, provide an ultra-high-precision representation of the planet, so that mobile robots aren’t wholly dependent on onboard navigation systems. This is a future where autonomous machines can optimize routes and, in some cases, see hazards in their path well before they embark on their journey.
The state of physical world AI today
The physical AI that exists today is localized, with most processing happening at the edge or on the autonomous machine itself. What’s missing is AI that is aware of the broader physical landscape.
The good news is that there’s plenty of data about the physical world gathered from satellites, drones, and myriad other devices to feed these models. The bad news? As Gartner notes, physical-world data typically needs heavy engineering to be usable by AI.
This is a field in which my company, Wherobots, and others are working. What we call the “spatial intelligence cloud” is technology designed to process disparate forms of physical-world data. This includes abstract vector shapes representing hills, roads, and telephone poles that enable AI models to understand what a machine is “seeing.”
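To make that idea concrete, here is a minimal sketch, written with the open-source Shapely library rather than Wherobots’ actual API, of how vector features could be queried to tell a machine what it is “seeing” nearby. All feature names and coordinates are invented for illustration.

```python
from shapely.geometry import Point, LineString, Polygon

# Hypothetical feature layer: each entry pairs a label with a vector geometry.
features = [
    ("road",           LineString([(0, 0), (100, 0)])),
    ("telephone_pole", Point(40, 3)),
    ("hill",           Polygon([(60, 20), (90, 20), (75, 50)])),
]

def nearby_features(position, radius_m):
    """Return labels of features within radius_m of the machine's position."""
    search_zone = position.buffer(radius_m)  # circular search area
    return [label for label, geom in features if geom.intersects(search_zone)]

# A machine at (42, 0) "sees" the road it is on and the nearby telephone pole.
print(nearby_features(Point(42, 0), radius_m=10))  # ['road', 'telephone_pole']
```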
How the cloud could help autonomous machines
Autonomous cars are an obvious example. I don’t think manufacturers will ever replace onboard navigation systems entirely; real-time decisions still need to be made using high-definition sensors such as lidar.
However, we can improve decision-making if we know certain things in advance. For example, imagine a future where a last-mile delivery company struggles to consistently transport fresh food in a timely manner because its vehicles misread the physical world.
In rural areas, autonomous vehicles may fail to recognize that long driveways are often entrances to recipients’ homes. Or picture a city scenario in which self-driving cars can’t find a particular apartment within a large complex.
It’s for these reasons that fleet companies can use AI and cloud-based tech to create finely detailed, ever-evolving maps of these areas and then serve this information back to the delivery systems. Doing so allows autonomous vehicles, as well as the couriers who step out of them to hand packages to customers or put them on doorsteps, to speed up delivery times. It could also reduce carbon emissions, as well as the risk of taking a wrong turn and getting into an accident.
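As a rough illustration of the serving side, the hypothetical lookup below prefers a fleet’s learned drop-off point over raw geocoding. The addresses, coordinates, and function names are invented for the sketch.

```python
from shapely.geometry import Point

# Learned corrections served from the cloud, e.g. "the long driveway,
# not the road frontage, is the real entrance." All values are made up.
learned_dropoffs = {
    "1 Rural Route 9": Point(-111.9301, 33.4210),      # end of the driveway
    "Apt 4B, Oak Complex": Point(-111.9015, 33.4002),  # building 4 side door
}

def dropoff_point(address, geocoded_fallback):
    """Prefer the fleet's learned drop-off point over raw geocoding."""
    return learned_dropoffs.get(address, geocoded_fallback)

print(dropoff_point("1 Rural Route 9", Point(-111.9295, 33.4220)))
```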
Maps help drones with BVLOS flights
In August, the U.S. Department of Transportation, through the Federal Aviation Administration, proposed allowing drones to operate beyond the visual line of sight (BVLOS) of an operator without needing individual waivers. This would be a significant simplification compared with the current system.
In a future where partially or fully autonomous drones operate at scale, delivery companies will need to build and maintain high-resolution maps of the Earth that are spatially aware of things like power lines, building shapes and protrusions, and other physical-world obstacles.
Power lines and utility poles, in particular, are significant hazards that drones have to navigate around. And, as is the case with autonomous vehicles looking for a recipient’s front door, autonomous drones need to know exactly where on the property the recipient wants the package left.
For instance, a high-fidelity, machine-intelligence-ready map would help a drone decipher whether a long, narrow shape is a front porch or a swimming pool.
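Picking up the power-line example, here is a hedged pre-flight screening sketch in Python with Shapely. The coordinates and safety margin are illustrative, not real drone-routing code: buffer the planned path into a corridor, then test mapped power lines against it before takeoff.

```python
from shapely.geometry import LineString

planned_path = LineString([(0, 0), (200, 120)])  # drone route, local meters
power_lines = [
    LineString([(50, 0), (50, 200)]),    # crosses the planned route
    LineString([(500, 0), (500, 200)]),  # far from the route
]

SAFETY_BUFFER_M = 15
corridor = planned_path.buffer(SAFETY_BUFFER_M)  # polygon around the path

# Any mapped power line touching the corridor triggers a replan on the ground.
hazards = [line for line in power_lines if line.intersects(corridor)]
if hazards:
    print(f"Replan: {len(hazards)} power line(s) inside the flight corridor")
```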
Autonomous tractors harvest, share data
Tractor companies, including John Deere, have made a lot of progress in the area of autonomy. In 2022, Deere rolled out its first tractor that can work 24 hours a day without a human operator in the cab. These vehicles also address the labor shortage that farmers are facing.
As Jahmy Hindman, chief technology officer at Deere, stated at the vehicle’s rollout, “The last time agriculture was on the precipice of this much change was when we were on the cusp of replacing the horse and plow.”
Deere’s 8R tractor has GPS guidance and incorporates onboard AI and machine learning capabilities. However, tractor manufacturers could take things a step further: these autonomous machines could also tap into detailed maps of their fields.
This is an area where the software company Leaf Agriculture is making a difference. Leaf’s platform connects with data providers including John Deere, Climate FieldView, and CNHi.
Using Wherobots, Leaf translates the proprietary files from these data providers into a consistent format, making it easy for farmers to define spatial boundaries within their land, known as “management zones.” Each zone has unique needs due to varying characteristics such as elevation, soil type, slope, and drainage.
With continuously updated maps showing the management zone they’re in, autonomous tractors can make important, real-time decisions, such as knowing when to adjust or stop spraying, allowing farmers to protect margins in a notoriously low-margin business.
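As a simple sketch of that zone lookup, and not Leaf’s or Wherobots’ actual implementation, a point-in-polygon test finds the management zone containing the tractor’s GPS position and returns that zone’s application rate. The zone polygons and rates below are made up.

```python
from shapely.geometry import Point, Polygon

# Hypothetical management zones, each paired with a relative spray rate.
management_zones = {
    "sandy_ridge":  (Polygon([(0, 0), (100, 0), (100, 50), (0, 50)]), 0.8),
    "low_drainage": (Polygon([(0, 50), (100, 50), (100, 100), (0, 100)]), 0.0),
}

def spray_rate(position):
    """Return the zone containing the tractor and its spray rate, if any."""
    for name, (zone, rate) in management_zones.items():
        if zone.contains(position):
            return name, rate
    return None, 0.0  # outside all mapped zones: stop spraying

print(spray_rate(Point(20, 75)))  # -> ('low_drainage', 0.0)
```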
The future of autonomy won’t be defined solely by onboard technology, but rather, by the fusion of real-time machine learning at the edge with rich, cloud-based spatial intelligence. Whether it’s a delivery van navigating a large apartment complex, a drone avoiding power lines, or a tractor adjusting inputs by management zone, the common thread is that autonomous machines perform best when they see beyond their immediate sensors to their broader surroundings.
About the author
As the CEO of Wherobots, Mo Sarwat spearheads a team that’s developing the spatial intelligence cloud. Wherobots was founded by the creators of Apache Sedona, a project Sarwat co-created and architected. Apache Sedona is an open-source framework designed for large-scale spatial data processing in cloud and on-prem deployments.
Wherobots’ stated mission is to empower organizations to maximize the utility of their data through the application of spatial intelligence and contextual insights.
Prior to Wherobots, Sarwat had over a decade of computer science research experience in academia and industry. He co-authored more than 60 peer-reviewed papers, received two best research paper awards, and was named an Early Career Distinguished Lecturer by the IEEE Mobile Data Management community.
Sarwat was also a recipient of the 2019 National Science Foundation CAREER award, one of the most prestigious honors for young faculty members.