What to expect at SOF Week 2026: Bringing autonomous UAS to the tactical edge

This May at SOF Week 2026, Latent AI is showing up with new autonomous UAS capabilities built on our edge AI platform. We’re not just improving the tools operators use to manage AI in the field. We’re putting that AI to work: autonomous, physical, and mission-ready.
Join us in Tampa, May 18–21, 2026. You’ll find us co-located with our hardware partner One Stop Systems (OSS) at the JW Marriott, Level 2, Booth #5006, as well as in Meeting Pod 2 at the Tampa Convention Center, Level 1. Stop by! We’ll have coffee and cold drinks waiting.
Making autonomy real, built on our platform
Latent AI’s edge AI platform has always been the foundation for optimizing, deploying, and managing models on constrained hardware in the most demanding environments on earth. Our Field Tactical Suite (FTS) puts that platform in the hands of tactical teams, giving operators the ability to deploy, update, and fine-tune AI models in the field, without connectivity, without a data scientist on call.
This year at SOF Week, we’re bringing autonomous UAS capabilities to Group 1 and 2 platforms — combining automatic target recognition (ATR) and flight control on embedded hardware to enable autonomous operation:
- Follow: autonomous target follow in surveillance mode
- Engage: one-click lock and terminal homing for autonomous strike
What you’ll see: Terminal homing via Unreal Engine
Terminal homing is the autonomous guidance of a drone to a locked target during the final phase of flight, with no operator input. Our centerpiece demo brings terminal homing to life through an interactive Unreal Engine experience. Visitors take the controls and feel firsthand what it means to hand targeting decisions to AI.
Here’s how it works:
Manual first: You fly the drone and attempt to manually intercept a moving target. You feel the cognitive load of tracking, maneuvering, and timing — all at once.
One click: Lock the target. The AI takes over the terminal approach, autonomously tracking and engaging.
The contrast is the point: What once required constant operator attention becomes fire-and-forget, freeing the operator to manage additional targets, coordinate assets, or simply stay off the air.
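For a sense of what happens after the click, here is a minimal, illustrative sketch of a pursuit-style terminal guidance loop in Python (hypothetical names and gains; not Latent AI's implementation). Each control tick, the tracker reports where the locked target sits in the camera frame, and a proportional controller steers to keep it centered while closing:

```python
# Illustrative sketch of a pursuit-style terminal homing loop.
# Hypothetical names and gains; not Latent AI's implementation.

from dataclasses import dataclass

@dataclass
class TrackerOutput:
    cx: float          # target offset from image center, normalized to [-1, 1]
    cy: float
    confidence: float  # tracker's confidence in the lock

K_YAW = 1.5            # rad/s of yaw rate per unit of horizontal offset
K_PITCH = 1.2          # rad/s of pitch rate per unit of vertical offset
CLOSING_SPEED = 8.0    # m/s of commanded forward velocity in the terminal phase
MIN_CONFIDENCE = 0.5   # below this, abort and hand control back to the operator

def terminal_homing_step(track: TrackerOutput):
    """One control tick: map the tracker's offset to steering commands."""
    if track.confidence < MIN_CONFIDENCE:
        return None  # lock lost: the autonomous approach ends here
    return {
        "yaw_rate": K_YAW * track.cx,       # steer to center the target horizontally
        "pitch_rate": K_PITCH * track.cy,   # ...and vertically
        "forward_velocity": CLOSING_SPEED,  # keep closing while centered
    }

# One tick with the target slightly off-center:
print(terminal_homing_step(TrackerOutput(cx=0.2, cy=-0.1, confidence=0.9)))
```

The hard part isn't the control law; it's keeping the tracker locked at speed, through motion blur and occlusion, which is exactly where tracker frame rate pays off (more on that below).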
The solution runs on the ModalAI Starling Max, a fully operational system we have running in the lab today. Because the software is platform-agnostic, it can be adapted to run on any drone, on any hardware, in any operating environment.
The simulator is a capability
The Unreal Engine environment you’ll interact with at SOF Week is more than a demo vehicle; it’s a digital twin of the operating environment. Units can use it to practice missions, train to fly in specific terrain, and run war-game scenarios before committing hardware or personnel.
The simulation supports hardware-in-the-loop and software-in-the-loop testing, making it a serious tool for red and blue teams. Want to stress-test your TTPs against an adversarial drone? Optimize an engagement scenario before it goes live? The digital twin gives you a controlled environment to do exactly that, configured to match your actual operating environment.
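Both testing modes hinge on the same design choice: the autonomy code is written against an abstract vehicle interface, and the test harness decides whether the simulator or real hardware sits behind it. Here is a minimal sketch of that pattern (hypothetical interfaces, not our actual integration API):

```python
# Minimal sketch of the SIL/HIL pattern. Hypothetical interfaces;
# not Latent AI's actual integration API.

from abc import ABC, abstractmethod

class Vehicle(ABC):
    """What the autonomy stack sees. It never knows which backend it flies."""

    @abstractmethod
    def read_frame(self) -> bytes:
        """Return the next camera frame."""

    @abstractmethod
    def send_command(self, cmd: dict) -> None:
        """Apply rate and velocity setpoints."""

class SimulatedVehicle(Vehicle):
    """Software-in-the-loop: frames rendered by the digital twin."""
    def read_frame(self) -> bytes:
        return b"<frame rendered by the simulator>"
    def send_command(self, cmd: dict) -> None:
        pass  # applied to the simulated flight dynamics

class RealVehicle(Vehicle):
    """Hardware-in-the-loop: same code path, real camera and autopilot."""
    def read_frame(self) -> bytes:
        return b"<frame from the onboard camera>"
    def send_command(self, cmd: dict) -> None:
        pass  # forwarded to the flight controller (e.g., over MAVLink)

def run_mission(vehicle: Vehicle, autonomy_step) -> None:
    """The same loop runs in simulation and on the aircraft."""
    while (cmd := autonomy_step(vehicle.read_frame())) is not None:
        vehicle.send_command(cmd)
```

Because the loop is identical in both modes, TTPs stress-tested against the digital twin exercise the same code that flies the aircraft.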
It’s also how we continue to develop capability beyond what’s on display at SOF Week. The roadmap is being built and validated here first.
What’s on the roadmap
Last year, we showed one-click tracking and the ability to follow a target. Since then, we’ve made the model faster and smaller, running at 30 FPS on the NVIDIA Orin — up from 4 FPS — while supporting lower-power hardware. That’s not just an incremental improvement. It’s what makes terminal homing possible: at, say, a 20 m/s closing speed, a tracker updating at 4 FPS sees five meters of target motion between frames, while at 30 FPS it sees well under a meter. Precise, persistent tracking at speed is the foundation on which everything else builds.
Today, our tracker and terminal homing solution is available on embedded hardware for integration with your drone. The simulator environment is how we develop and validate our autonomy algorithms — testing them against complex, realistic scenarios before they ever touch hardware. It’s also what lets us move fast on the roadmap: new capabilities are built, stress-tested, and refined in simulation first, then transitioned to embedded hardware for integration.
From here, our next advances focus on expanding platform support across NVIDIA Jetson Orin and Qualcomm, and building autonomous UAS capabilities for additional military use cases:
- HKS (Hunter-Killer System): Two-drone coordination — one to find, one to engage — with target handoff between drones.
- Multiple Independent Control: Operator-directed multi-drone engagement, where each drone independently homes on its assigned target.
- Future swarm capability: The simulator is the foundation for eventual autonomous drone-to-drone collaboration.
The OSS partnership: Where software meets ruggedized edge compute
At SOF Week, you’ll also see Latent Assisted Label running live on OSS hardware, dramatically cutting the time and manual effort required to label AI training data in the field, on a box built to go where the mission goes.
Deploying AI in the field isn’t just a software problem. Operators need compute that can survive the environment — vehicle-mounted, forward-deployed, disconnected — while still running multiple AI workloads simultaneously. OSS’s PCIe Gen 5 3U Short Depth Server is built for exactly that: a MIL-STD-810G-rated, ruggedized aluminum chassis that brings datacenter-class compute to the tactical edge, capable of running up to 35 simultaneous AI workloads in conditions from -20°C to 50°C and at altitudes up to 10,000 feet.
But raw compute power alone isn’t enough. Without optimization, AI models either demand more resources than the hardware can provide or create performance bottlenecks that make them operationally useless. That’s where Latent AI’s platform comes in. We optimize models to run efficiently on available hardware, maximizing inference speed and minimizing latency so operators get real-time AI performance on a box that fits in a vehicle and survives the mission.
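The details of that pipeline are a longer story, but post-training quantization gives a feel for the kind of transformation involved. As a generic illustration using off-the-shelf ONNX Runtime tooling (this is not Latent AI's platform, and the file names are hypothetical), shrinking a detector's weights from 32-bit floats to 8-bit integers cuts its memory footprint roughly 4x and typically speeds up CPU inference:

```python
# Generic illustration of post-training quantization with ONNX Runtime.
# Shown for flavor only; this is not Latent AI's optimization pipeline,
# and the file names are hypothetical.

from onnxruntime.quantization import quantize_dynamic, QuantType

quantize_dynamic(
    model_input="detector_fp32.onnx",   # original 32-bit float model
    model_output="detector_int8.onnx",  # quantized artifact for the edge box
    weight_type=QuantType.QInt8,        # store weights as 8-bit integers
)
```

A production pipeline typically layers on much more, including calibration, pruning, and compilation for the target SoC; closing that gap is exactly what the platform is for.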
Latent AI and OSS have been partnering since 2023. OSS brings ruggedized machines built for vehicle-mounted and edge-deployed applications; Latent AI brings the software stack and AI applications that run on them. Together, we’re telling the edge continuum story: intelligence that flows from the tactical edge all the way back to echelons above, bridging the operator in the field to decision-makers at the Pentagon.
Also on the show floor: Look for the team from Human Systems Integration (HSI) demonstrating Latent Linguist on their wearable edge compute platform. HSI’s fabric-embedded Wired-Less technology distributes power and data across the body with no wires and no box to carry, and Latent Linguist brings real-time, offline AI translation to that platform, putting language AI directly on the warfighter.
Any drone. Any mission.
Everything we’re showing at SOF Week reflects a core conviction: AI capabilities should work in your environment, on your hardware, regardless of platform or OS. Find us at two locations:
- Booth #5006 | JW Marriott, Level 2 (co-located with OSS)
- Meeting Pod 2 | Tampa Convention Center, Level 1
Schedule a meeting with our team, May 18–21, 2026
Whether you’re focused on autonomous UAS, tactical AI, or expanding your SOF capabilities, we’ll show you where the technology is today and where it’s going. See you in Tampa.