The Prithvi model family: foundation models for Earth sciences
Daniela Szwarcman, Daniel Civitarese
In 2023, IBM and NASA launched a partnership to develop a family of foundation models for Earth sciences called Prithvi. This talk presents two members of the family: Prithvi-EO and Prithvi-WxC. Prithvi-EO is a family of transformer-based geospatial foundation models pre-trained on over seven years of global multispectral satellite imagery from NASA’s Harmonized Landsat and Sentinel-2 (HLS) dataset. Prithvi-WxC is a 2.3-billion-parameter model trained on 160 variables from NASA’s MERRA-2 reanalysis, capturing both regional and global climate patterns. We leveraged distributed GPU training strategies to pre-train the Prithvi models and explored their performance and potential applications in NASA’s research initiatives.
The pre-trained models and fine-tuning workflows are openly available on Hugging Face, providing a powerful tool for the global Earth sciences community. This talk will explore the challenges and opportunities in bridging these approaches, highlighting the potential for more flexible, reusable AI models in Earth system science.