At a US military base in central California, four-seat all-terrain vehicles roam hillside trails. It is a training exercise, but not for the people in the vehicles: this is an effort to train AI models to enter battle zones.
The autonomous military ATVs are operated by Scout AI, a startup founded in 2024 by Coby Adcock and Collin Otis that calls itself a "frontier lab for defense." The company said on Wednesday that it has raised a $100 million Series A round, led by Align Ventures and Draper Associates, following its $15 million seed round in January 2025.
Scout invited TechCrunch for an exclusive tour of its training operations at a military base it asked us not to name.
The company is building an AI model it calls "Fury" to operate and command military assets, first for logistical support but soon for autonomous weapons. CTO Collin Otis compares this work, which builds on existing LLMs, to training soldiers.
"They start when they're 18 years old, and sometimes they even start after college, so you want to start with that base level of intelligence," Otis told TechCrunch. "It's useful to start with someone who's already made an investment and then say, hey, what do I have to do to teach this thing to be an incredible military AGI, versus just being a broadly intelligent AGI?"
Scout has secured military technology development contracts totaling $11 million from organizations like DARPA, the Army Applications Laboratory, and other Department of Defense customers. It's one of 20 autonomy companies whose technology is being used by the US Army's 1st Cavalry Division during its regular training cycle at Fort Hood in Texas, with the expectation that the unit will bring along products that prove themselves when it next deploys in 2027.
For Scout's internal testing, the rubber meets the dirt in the base's hilly terrain. There, the company's operations team, led by former soldiers, is putting the vehicles through their paces on simulated missions.
While autonomous cars are starting to appear in more cities around the world, they operate there in structured environments with rules. Operating autonomously on unmarked trails or off-road is another challenge entirely. Otis, a former executive at autonomous trucking company Kodiak, said he was motivated to start Scout when he realized the system he helped build there wasn't intelligent enough to operate in an unpredictable battle zone.
An autonomous ground vehicle controlled by Scout AI's Fury model. Image Credits: Scout AI
A new approach to autonomy
Scout is turning to a newer autonomy technology: Vision Language Action models, or VLAs, which are based on LLMs and used to control robots. First launched by Google DeepMind in 2023, the technology seeded robotics startups like Physical Intelligence and Figure AI, the humanoid robot company led by Adcock's brother, Brett.
Adcock is on Figure's board. He says that experience convinced him of the opportunity to bring broader intelligence to the military's growing fleet of autonomous vehicles. His brother introduced him to Otis, who was advising Figure, and they set about applying the latest in AI to military problems.
"If I handed you the controller of a drone right now and I strapped a headset on you, you could learn to fly that thing in minutes," Otis said. "You're really just learning how to connect your prior knowledge to those couple little joysticks. It's not a big leap. That's the way to think about VLAs and why they're such an unlock."
Indeed, I got a chance to drive one of Scout's ATVs around the rutted trails, and the terrain was challenging: steep hills, loose sand on turns, disappearing tracks, confusing intersections. I'm not an experienced ATV driver but made a decent go of it on my first attempt (if I do say so myself). That's the kind of general intelligence the company wants in its models, which it has been training via these ATVs for just six weeks after using civilian ATVs to start the process.
I also rode in the ATV under autonomous control, and could feel the difference: it accelerates harder than a human driver who might be thinking about a passenger's comfort. The operations team points out how the vehicles hug the right side on wider trails but stay in the middle of narrow ones, like their training drivers. They also, when confused, abruptly slow down to think over their next move, something that happens several times as the vehicle carries us on a 6.5 km loop before returning to base.
Though VLAs are new enough that they have yet to be deployed by any company in an operational setting, "the technology is good enough to be doing that experimentation in the field with soldiers to figure out how to be most effective for US forces," said Stuart Young, a former DARPA program manager who worked on ground vehicle autonomy. And like other autonomy companies, Scout rounds out its full autonomy stack with deterministic systems and other flavors of AI to complete its agents' capabilities.
Young left DARPA this month to join Field AI after managing a program called RACER. It asked companies to create high-speed, autonomous off-road vehicles, helping seed this space the same way the agency's Grand Challenge boosted self-driving cars. Two competitors in this space, Field AI and Overland AI, were spun out of that program, which Scout also participated in as a later addition.
The first applications of ground autonomy, according to Scout executives and military technologists, will likely be automated resupply: carrying water or ammunition to remote observation posts, or in a convoy where a crewed truck might be followed by six to 10 autonomous vehicles, saving precious human labor for more critical tasks. Brian Mathwich, an active-duty infantry officer doing a stint as a military fellow at Scout, recalled a recent exercise in Alaska where he led a resupply convoy in total darkness and wished for autonomous vehicles to help him out.
Image Credits: Scout AI
Adding intelligence to the Army's motor pool
Scout sees itself primarily as a software company, building an intelligence layer for military machines. It doesn't intend to make the autonomous vehicles themselves but to build atop them.
Adcock expects the startup's first widely adopted product will be one called "Ox," the company's command and control software, bundled with hardened computer hardware (GPUs, communications, cameras). It's meant to allow individual soldiers to orchestrate multiple drones and autonomous ground vehicles with prompt-like instructions: "Go to this waypoint and watch for enemy forces."
However, making that software work requires training on real vehicles. Hence Foundry, which is what the company calls its training range at the military base. There, drivers spend eight-hour shifts putting the ATVs through their paces, then work through a reinforcement learning system to log where they had to take over, data that is then used to improve the model. The base commander has asked the company's ATVs to take a turn with security patrols.
One hypothesis Scout is testing is that VLAs will enable this relatively limited data set, alongside training data from simulations, to deliver a fully capable driving agent. While the vehicle seems comfortable on trails, for example, it isn't yet able to operate fully off-road.
Scout is also experimenting with drones for reconnaissance and as weapons, giving them intelligence with vision language models, a multimodal LLM variant.
Scout is working on a system that would see groups of munition drones fly with a larger "quarterback" platform that provides extra compute resources to command them. In one mission, the drones would search a geographic area for hidden enemy tanks and attack them, possibly without human intervention. Otis contends that the alternative approach in this scenario might be indirect artillery fire, which is imprecise compared to drone strikes.
While autonomous weapons are a flash point in the politics of defense tech, experts note the concept is old: heat-seeking missiles and mines have been in use for many decades. The question for technologists is how the weapons are controlled, Jay Adams, a retired US Army captain who leads Scout's operations team, told TechCrunch.
He notes the company's munitions drones can be programmed to attack only threats in a specific geographic area, or only with human confirmation. He also says autonomous weapons platforms are unlikely to fire because they're scared, the way an 18-year-old soldier might.
VLAs, too, offer promise for better targeting. Scout says its models are pretrained on a specific set of military data to prepare them for, say, running into an enemy tank while on a resupply mission. Lt. Col. Nick Rinaldi, who supervises Scout's work for the Army Applications Laboratory, says that while automated targeting is hard and unlikely to be used outside of constrained environments in the near term, the ability of VLAs to reason about threats makes them a promising technology to investigate.
Adams says the promise of drones that can identify their own targets is crucial to future warfare: while Russia's invasion of Ukraine has generated intense interest in drone warfare, he believes having humans operate individual UAVs doesn't scale well enough for the US to face vast numbers of low-cost unmanned systems should they threaten US forces.
A mission to counter anti-military vibes
Image Credits: Scout AI
Like many defense startups, Scout wears its mission on its sleeve, and executives will freely criticize companies that are reluctant to hand their technology over to the government. Google, for example, reportedly pulled out of a Pentagon contest to develop control systems for autonomous drone swarms, a capability Scout is also working on.
"The AI people don't want to work with the military," Otis told TechCrunch, referencing Anthropic's spat with the Pentagon over its terms of service. "None of them are open to running agents on one-way attack drones, or running agents on missile systems."
Still, Scout is itself using existing LLMs as the base for its agents, though it declined to say which ones. Otis says it has agreements with "very well-known hyperscalers" to provide the pretrained intelligence for Scout's foundation model. Otis also declined to comment on whether it uses open-weight models, such as those offered by Chinese companies. Many companies reliant on AI inference build on such models to operate at lower cost compared to models from frontier labs like Anthropic or OpenAI.
Scout expects to address this by building its own model from the ground up in the years ahead, and the founders say much of its capital will go into those training and compute costs. Indeed, Otis wonders if Scout will beat the existing leaders to AGI because its model will be constantly interacting with the real world.
"There's an argument in the AGI community along the lines that you can only get so intelligent by reading the internet, and most intelligence comes from interacting with the world," Otis said.
Does that mean Adcock is competing with his brother's army of humanoid robots at Figure? No, Otis says, but "we can get to scale much faster because our customer has assets," he said, referring to the Pentagon.

