Artificial intelligence is transforming the way we respond to emergencies, enabling scientists to solve complex, high-impact problems with unprecedented speed and precision. At Cal Poly San Luis Obispo, researchers are using high-performance computing and advanced AI models to reimagine what’s possible in search and rescue. Leveraging the computational power of NCSA’s DeltaAI supercomputer and the scalable storage of the Granite archive, powered by Versity’s ScoutAM, this work pushes the boundaries of what data-driven science can achieve. From real-time analysis to historical insight, this is a glimpse into how AI is enabling a new kind of research—one that doesn’t just predict outcomes, but helps save lives.
Accelerating Search and Rescue with AI
Search and rescue operations have traditionally relied on paper forms to track information and coordinate efforts between command posts. To modernize this process, Cal Poly researchers began by digitizing decades of historical search-and-rescue records. Building on this data, they developed IntelliSAR, an AI-powered system that helps first responders make quicker and more accurate decisions.
These models analyze real-time and historical data to detect movement patterns, highlight high-probability search zones, and prioritize the most relevant clues. By learning from both successful and unsuccessful missions, they can surface overlooked clues, avoid repeating inefficient search patterns, and recommend optimized strategies in real time. Some projects focus on simulating past rescue missions to improve future response strategies, while others draw on social media activity, such as indications of a missing person’s state of mind before their disappearance, to add context to location predictions. By dramatically improving both the speed and precision of search efforts, this AI work increases the likelihood of locating missing persons and reduces harm, especially in remote or high-risk environments.
This is a sample heatmap generated by the AI tool. The dark purple area represents the highest likelihood of locating the missing person, with the surrounding colors indicating progressively lower probabilities. Credit: NCSA.
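To make the idea concrete, here is a minimal, illustrative sketch of how a probability-of-area heatmap like the one above could be built from historical find locations. This is not the IntelliSAR code; the coordinates, grid size, and kernel width are all hypothetical.

```python
# Illustrative sketch only -- not the IntelliSAR implementation.
# Builds a simple probability-of-area surface by summing Gaussian
# kernels around (hypothetical) historical find locations.
import numpy as np

def probability_grid(find_offsets_km, grid_km=10.0, cells=200, sigma_km=0.8):
    """Return a normalized 2D probability surface over a square search area."""
    xs = np.linspace(-grid_km, grid_km, cells)
    ys = np.linspace(-grid_km, grid_km, cells)
    gx, gy = np.meshgrid(xs, ys)
    surface = np.zeros_like(gx)
    for dx, dy in find_offsets_km:
        surface += np.exp(-((gx - dx) ** 2 + (gy - dy) ** 2) / (2 * sigma_km ** 2))
    return surface / surface.sum()  # probabilities sum to 1

# Hypothetical offsets (km east, km north) from the point last seen
historical_finds = [(0.5, 1.2), (-0.8, 0.4), (1.1, -0.3), (0.2, 0.9)]
grid = probability_grid(historical_finds)
print("Highest-probability cell:", np.unravel_index(grid.argmax(), grid.shape))
```

A real system would likely weight these kernels by terrain, subject profile, and elapsed time, but the underlying idea of turning past outcomes into a spatial probability surface is the same.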
Keeping Pace with AI: Storage and Compute at Scale
This AI-driven science runs on DeltaAI, a powerful system at NCSA that combines NVIDIA Grace Hopper Superchips with HPE’s Slingshot interconnect and Cray programming environment to deliver high-performance computing at scale. But with the vast volumes of data generated by training and running AI models, the challenge becomes clear: how do you store it all and keep it accessible for retraining, reproducibility, and future discovery?
That’s where Versity comes in.
Introducing Granite: Scalable Archival Storage with Versity ScoutAM
Versity’s ScoutAM software powers Granite, NCSA’s massive tape-based archive system. Built on Spectra Logic’s TFinity tape libraries, Granite offers more than 60 petabytes of storage, delivering the scalability, performance, and efficiency needed to support NCSA’s data-intensive research workloads, like those powering real-time search and rescue AI at Cal Poly.
ScoutAM is the brain behind the archive, automatically transferring AI model outputs, training data, and observational inputs from compute to archive as soon as they are ready to be offloaded. This is essential in environments like NCSA’s, where high-performance parallel file systems must be kept available for incoming workloads while guaranteeing that no valuable data is lost. For projects like AI-assisted emergency response, where every piece of training data and every simulation matters, this uninterrupted lifecycle is critical.
A Seamless, Efficient Data Lifecycle
ScoutAM integrates directly into the research workflow, providing a transparent and automated path from hot storage to archive. Researchers don’t need to change their workflows or request manual data movement; ScoutAM handles the transition intelligently, based on policy and system activity.
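The sketch below shows, in generic terms, what policy-driven tiering of this kind can look like. It is not ScoutAM’s configuration or API; the age and size thresholds and the /scratch and /archive paths are purely illustrative.

```python
# Generic illustration of policy-based tiering -- not ScoutAM's actual
# configuration or API. The policy values and paths below are hypothetical.
import shutil
import time
from pathlib import Path

AGE_DAYS = 30               # archive anything untouched for 30 days...
MIN_SIZE = 100 << 20        # ...and larger than 100 MiB
HOT = Path("/scratch/intellisar")
ARCHIVE = Path("/archive/intellisar")

def archive_candidates(hot_dir: Path):
    """Yield files that meet the (hypothetical) archiving policy."""
    cutoff = time.time() - AGE_DAYS * 86400
    for path in hot_dir.rglob("*"):
        if path.is_file():
            st = path.stat()
            if st.st_mtime < cutoff and st.st_size >= MIN_SIZE:
                yield path

for src in archive_candidates(HOT):
    dst = ARCHIVE / src.relative_to(HOT)
    dst.parent.mkdir(parents=True, exist_ok=True)
    shutil.move(str(src), str(dst))  # in practice: copy, verify, then release
    print(f"archived {src} -> {dst}")
```

In a production archive the move would be a copy-verify-release cycle running continuously against live policy, rather than a one-off script, but the shape of the workflow is similar.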
This streamlined lifecycle means that:
Valuable AI-generated data is never lost or discarded
Storage performance is maintained without manual intervention
Archived data remains accessible for model retraining and analysis
Scientific reproducibility is supported without burdening compute systems
For scientists building life-saving AI models, this system guarantees that past data can continue to inform future decisions, whether it’s identifying new search patterns or refining terrain analysis algorithms. It allows NCSA to scale its infrastructure while preserving the flexibility and accessibility researchers rely on.
Unlocking Long-Term Insight with Historical Metrics
Historical data is vital for maintaining optimal IT system performance and effectively planning for future capacity needs. By continuously monitoring key metrics such as bandwidth usage, latency, and storage activity over time, organizations gain a clear understanding of their system behavior and can identify emerging issues early. This long-term insight enables teams to detect trends, address potential bottlenecks before they escalate, and make informed decisions about resource allocation. Without access to comprehensive historical data, organizations risk reacting too late or over-provisioning their infrastructure.

Versity’s ScoutAM addresses this challenge by automatically capturing and storing detailed records of file system activity and metadata. By providing a scalable, searchable archive of historical performance and usage data, ScoutAM empowers IT teams to analyze trends, optimize system configurations, and confidently plan for growth, helping to ensure reliable, efficient, and scalable storage environments.
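As a simple illustration of the kind of capacity planning that historical metrics enable, the sketch below fits a linear trend to daily growth figures and projects when a fixed capacity would be reached. The numbers are invented for the example; they are not Granite telemetry.

```python
# Hedged sketch of capacity-trend analysis on historical usage metrics.
# The sample data and the 60 PB capacity figure are illustrative only.
import numpy as np

days = np.arange(365)
rng = np.random.default_rng(0)
# Hypothetical daily archive growth in TB: gentle upward trend plus noise
daily_growth_tb = 5.0 + 0.01 * days + rng.normal(0, 0.5, days.size)

used_pb = np.cumsum(daily_growth_tb) / 1000.0    # cumulative usage in PB
slope, intercept = np.polyfit(days, used_pb, 1)  # linear trend in PB/day

capacity_pb = 60.0
days_to_full = (capacity_pb - used_pb[-1]) / slope
print(f"Current usage: {used_pb[-1]:.1f} PB, growing ~{slope * 365:.1f} PB/year")
print(f"Projected days until {capacity_pb:.0f} PB capacity: {days_to_full:.0f}")
```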
Supporting Research that Saves Lives
As AI becomes more deeply embedded in critical response workflows, the need for reliable, high-capacity storage infrastructure is only growing. These workloads generate vast amounts of data that must be retained, retrievable, and reusable to support evolving models and infrastructures. Versity is proud to support that mission with archival technology built for performance, scale, and long-term value.
Whether it’s accelerating life-saving search efforts or enabling reproducible AI research, our work with NCSA highlights what’s possible when innovative computing and smart data management come together.