Wildfires, known for their catastrophic effects on both ecosystems and human communities, pose a significant challenge in managing natural disasters. To tackle this, Latitudo 40 is developing an innovative strategy that combines satellite imagery, rover exploration, and advanced AI algorithms.

In order to do this, we need to analyse three critical images: the pre-event satellite view, the post-event aftermath, and a meticulously generated anomaly map. This trio of images not only visualizes the extent of wildfire destruction but also marks a significant step forward in precision damage assessment.


The procedure starts with a super-resolved Sentinel-2 satellite image taken before the event, documenting the original condition of the terrain and vegetation. This image is essential as a benchmark, particularly focusing on the Normalized Difference Vegetation Index (NDVI). NDVI plays a crucial role in evaluating vegetation health and density by calculating the discrepancy between near-infrared light (highly reflected by vegetation) and red light (absorbed by vegetation). Areas with dense, flourishing vegetation are indicated by higher NDVI values, offering a crucial comparison with imagery captured after the event.
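As a minimal sketch, NDVI is computed per pixel from the near-infrared and red reflectance bands (B8 and B4 on Sentinel-2):

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    nir, red: co-registered reflectance arrays (Sentinel-2 bands B8 and B4),
    values typically in [0, 1]. Output is in [-1, 1].
    """
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    # Guard against division by zero on nodata pixels (both bands 0).
    return np.where(denom == 0, 0.0, (nir - red) / np.where(denom == 0, 1.0, denom))

# Dense, healthy vegetation (high NIR, low red) yields values near +1;
# bare soil or burnt ground yields values near 0 or below.
```

The guard on the denominator matters in practice, since nodata borders of a satellite scene would otherwise produce NaN values that propagate into the later change analysis.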

Following the wildfire, the post-event super-resolved Sentinel-2 image is examined. This image starkly contrasts with the pre-event NDVI values, revealing areas where vegetation has been destroyed or severely degraded. The post-fire landscape often shows significantly lower NDVI values, indicative of burnt vegetation and exposed soil. This comparative analysis between pre- and post-event NDVI values lays the groundwork for a nuanced understanding of the wildfire's impact on the vegetation.

In our refined approach to assessing wildfire damage, we incorporate the Burn Area Index (BAI) into both pre-event and post-event evaluations, utilizing advanced machine learning techniques to enhance accuracy. The BAI excels at identifying burned regions due to its capacity to detect the unique spectral characteristics of charred vegetation within the infrared spectrum, which starkly contrasts with the properties of healthy vegetation or barren land.

The process begins with the algorithm analyzing pre-event satellite images to establish baseline values for BAI and NDVI, creating a point of reference for the landscape’s pre-wildfire state. Following the wildfire, the algorithm conducts another round of BAI assessments on the post-event images, aiming to pinpoint significant deviations in the index values that signal burned vegetation and landscape transformations.
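The pre/post comparison reduces to per-pixel differencing of the index rasters, assuming the two acquisitions are co-registered. A toy sketch with hypothetical NDVI values and a hypothetical change threshold:

```python
import numpy as np

def index_delta(pre: np.ndarray, post: np.ndarray) -> np.ndarray:
    """Per-pixel change in an index raster (post minus pre).

    pre and post must be co-registered arrays of identical shape.
    """
    if pre.shape != post.shape:
        raise ValueError("pre/post rasters must be co-registered")
    return post.astype(np.float64) - pre.astype(np.float64)

# Toy 2x2 NDVI rasters: vegetation loss shows up as strongly
# negative dNDVI (burn damage would show as strongly positive dBAI).
ndvi_pre  = np.array([[0.80, 0.70], [0.60, 0.75]])
ndvi_post = np.array([[0.10, 0.65], [0.55, 0.08]])
d_ndvi = index_delta(ndvi_pre, ndvi_post)
burn_candidates = d_ndvi < -0.3  # hypothetical threshold for this sketch
```

Here two of the four pixels fall below the threshold and would be flagged as candidate burn damage for further analysis.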

The application of machine learning is crucial to this comparative analysis, conducting a thorough, pixel-by-pixel review of the satellite images. By utilizing sophisticated machine learning models, the algorithm detects crucial shifts in vegetation index values with high precision. These models are trained to recognize not merely numerical variations but also significant changes in the landscape's physical and vegetative structure, taking into account the natural variability in vegetation and environmental conditions.
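As an illustrative sketch (not the production model), a per-pixel classifier can be trained on feature vectors built from the pre- and post-event indices; here a random forest on synthetic data stands in for the actual trained models:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic per-pixel features: [ndvi_pre, ndvi_post, bai_pre, bai_post].
# Value ranges are illustrative assumptions, not calibrated thresholds.
rng = np.random.default_rng(42)
n = 500
healthy = np.column_stack([
    rng.uniform(0.5, 0.9, n),    # ndvi_pre
    rng.uniform(0.5, 0.9, n),    # ndvi_post: essentially unchanged
    rng.uniform(5, 50, n),       # bai_pre
    rng.uniform(5, 50, n),       # bai_post
])
burned = np.column_stack([
    rng.uniform(0.5, 0.9, n),
    rng.uniform(-0.1, 0.2, n),   # NDVI collapses after the fire
    rng.uniform(5, 50, n),
    rng.uniform(200, 1500, n),   # BAI spikes on charred surfaces
])
X = np.vstack([healthy, burned])
y = np.concatenate([np.zeros(n), np.ones(n)])  # 0 = healthy, 1 = burned

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
pred = clf.predict([[0.7, 0.05, 20.0, 800.0]])  # one clearly burned pixel
```

Classifying full feature vectors rather than thresholding a single delta is what lets the model absorb natural variability, e.g. a pixel whose NDVI was already low before the fire.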



The result is a detailed anomaly map that identifies locations with the most notable changes in BAI values following the wildfire. Every point on this map marks a spot where the landscape has experienced drastic changes due to the fire. These points serve not just as markers of damage but as data-driven focal points for in-depth subsequent evaluations.
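Turning the anomaly raster into dispatchable focal points can be sketched as selecting the strongest BAI increases above a threshold; both the threshold and the top-k cutoff below are hypothetical:

```python
import numpy as np

def anomaly_points(d_bai: np.ndarray, threshold: float, top_k: int = 10):
    """Return (row, col) pixel coordinates of the strongest BAI
    increases: candidate burn anomalies to inspect on the ground."""
    rows, cols = np.nonzero(d_bai > threshold)
    # Rank the flagged pixels by severity, strongest first.
    order = np.argsort(d_bai[rows, cols])[::-1][:top_k]
    return list(zip(rows[order].tolist(), cols[order].tolist()))

d_bai = np.zeros((4, 4))
d_bai[1, 2] = 900.0   # strong burn signal
d_bai[3, 0] = 400.0   # weaker but still anomalous
pts = anomaly_points(d_bai, threshold=100.0, top_k=2)
```

In a real pipeline each (row, col) pair would then be mapped through the raster's geotransform to geographic coordinates for the rover's route planning.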

After the algorithm pinpoints critical areas, the focus shifts to terrestrial exploration. Our autonomous rover, outfitted with optical sensors, is dispatched to investigate the identified anomaly locations.

The rover, designed for resilience and precision, traverses the challenging post-wildfire terrain. As it navigates through the damaged landscape, it collects detailed data – an in-depth exploration that satellite imagery alone could never achieve. This data encompasses various aspects of the damage, including the extent of vegetation loss and leaf vigour, and it is processed and streamed in real time over a 4G/5G connection.




The precise anomaly detection of the machine-learning-enhanced BAI analysis is complemented by the rover's ability to capture images on the ground.

Equipped to navigate the impacted areas, Latitudo 40's rover takes high-resolution photographs of specifically identified locations of interest. These images are crucial for the subsequent stage of analysis, where our algorithms compute a vegetation index for each photo, accurately assessing the level of vegetation coverage and health. This step converts the captured images into a 2D raster layer, providing an intricate and comprehensive perspective of the landscape's present condition. When this layer is superimposed on the anomaly map, it enriches the understanding of the post-wildfire scenario, underscoring regions with significant vegetation loss or alteration. Thus, this holistic method paves the way for more focused and efficient initiatives in the rehabilitation and restoration processes following a wildfire.
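Since the rover's cameras capture ordinary RGB photographs rather than multispectral bands, an RGB-based vegetation index is needed; a minimal sketch using the well-known Excess Green index (the specific index and threshold here are assumptions, not necessarily the one in production):

```python
import numpy as np

def excess_green(rgb: np.ndarray) -> np.ndarray:
    """Excess Green index (ExG = 2g - r - b) on chromaticity-normalized
    RGB channels; a common proxy for vegetation in ordinary photographs."""
    rgb = rgb.astype(np.float64)
    total = rgb.sum(axis=-1, keepdims=True)
    total[total == 0] = 1.0              # guard for pure-black pixels
    r, g, b = np.moveaxis(rgb / total, -1, 0)
    return 2 * g - r - b

# Toy 1x2 "photo": one green (vegetated) pixel, one grey (ash/soil) pixel.
photo = np.array([[[40, 180, 30], [120, 120, 120]]], dtype=np.uint8)
exg = excess_green(photo)
coverage = float((exg > 0.2).mean())  # fraction of vegetated pixels (hypothetical threshold)
```

Applying this to every ground photo yields the per-location vegetation raster that is then georeferenced and overlaid on the satellite-derived anomaly map.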

Don’t miss this opportunity to be at the forefront of change. Discover more on our site.

If you want to discover our solutions, reach out to us at the following email address, and let the dialogue on innovative solutions begin:


Follow us on LinkedIn