
AI Extreme Weather and Climate


By: Zhi Li

Brace yourself for a deep dive into the science of how artificial intelligence is revolutionizing our understanding of extreme weather and climate change. Each episode brings you cutting-edge research and insights on how AI-powered tools are being used to predict and mitigate natural disasters like floods, droughts, and wildfires. We'll unravel the complexities of climate models, explore the frontiers of AI-powered early warning systems, and discuss the ethical implications of AI-driven solutions. Join us as we break down the science and uncover the transformative potential of AI in tackling our planet's most pressing challenges.

Zhi Li, 2025
Earth Sciences · Personal Development · Personal Success · Science
Episodes
  • Target Concept Tuning: Solving the AI Blindspot in Extreme Weather Forecasting
    Mar 24 2026

    In this episode of AI Extreme Weather and Climate, Allen and Sydney explore a major breakthrough in meteorological AI: predicting rare but high-impact events like typhoons. While foundation models like Pangu-Weather excel at everyday forecasts, they often stumble during extreme anomalies because of severe data imbalance. We dive into a newly proposed framework called Target Concept Tuning (TaCT), which acts like a "specialized meteorologist" inside the neural network. Using Sparse Autoencoders to untangle superposed features, TaCT automatically identifies the internal concepts that cause the model to fail during extreme weather. It then selectively fine-tunes only those concepts, dramatically improving typhoon forecasting accuracy without causing the model to "forget" how to predict normal weather patterns. Tune in to learn how making AI more interpretable is making our early warning systems safer and more reliable!

    Paper Discussed in this Episode:

    Ren, S., Gu, X., Peng, Z., Zhang, H., Niu, P., Wu, B., Wang, X., Sun, L., & Wen, J. (2026). Target Concept Tuning Improves Extreme Weather Forecasting. arXiv preprint arXiv:2603.19325. https://doi.org/10.48550/arXiv.2603.19325
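    The core move described above — flag the latent "concepts" most implicated in extreme-event failures, then update only those — can be sketched with toy data. Everything below (the synthetic latents, the error signal, the `failing_concepts` helper) is invented for illustration; in TaCT the latents would come from a Sparse Autoencoder trained on the forecast model's activations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for SAE latents: 200 samples x 16 candidate "concepts".
# Latent 3 is deliberately tied to the extreme-event forecast error.
latents = rng.normal(size=(200, 16))
error = 0.9 * latents[:, 3] + 0.1 * rng.normal(size=200)

def failing_concepts(latents, error, k=2):
    """Rank latent dimensions by |correlation| with the failure signal."""
    z = (latents - latents.mean(0)) / latents.std(0)
    e = (error - error.mean()) / error.std()
    corr = (z * e[:, None]).mean(0)
    return np.argsort(-np.abs(corr))[:k]

concepts = failing_concepts(latents, error)

# Selective fine-tuning: a gradient mask that freezes every other concept,
# which is what protects the model's skill on normal weather.
grad_mask = np.zeros(16)
grad_mask[concepts] = 1.0
```

    The planted latent 3 is recovered by the correlation screen, and the mask leaves 14 of the 16 concepts untouched during fine-tuning.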

    Less than 1 minute
  • NeuralGCM: Observation-Based Hybrid Modeling for Global Precipitation Forecasting
    Jan 15 2026

    This paper introduces NeuralGCM, a hybrid atmospheric model that integrates machine learning with traditional differentiable physics to improve global precipitation simulations. Unlike older models that rely on high-resolution simulations for training, this framework is trained directly on satellite observations, specifically the IMERG dataset. By leveraging this observational data, the model effectively corrects common biases in extreme weather events and the diurnal cycle of rainfall. In comparative tests, the model outperformed the ECMWF ensemble in mid-range forecasting and showed superior accuracy over CMIP6 climate models. Additionally, the architecture is exceptionally efficient, running simulations at speeds orders of magnitude faster than conventional general circulation models. These findings suggest that hybrid neural models offer a more reliable and computationally accessible path for predicting future climate impacts.
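    The hybrid idea — a differentiable physics core plus a learned correction, stepped forward together — can be caricatured in a few lines. Both tendency functions below are invented stand-ins, not NeuralGCM's actual dynamical core or neural network.

```python
import numpy as np

def physics_tendency(state):
    # Stand-in for the differentiable dynamical core.
    return -0.1 * state

def learned_correction(state, w):
    # Stand-in for the neural component, with a single weight w.
    return w * np.tanh(state)

def hybrid_step(state, w, dt=0.1):
    # One integration step: physics and learned terms are summed,
    # so gradients can flow through both.
    return state + dt * (physics_tendency(state) + learned_correction(state, w))

state = np.ones(4)
for _ in range(50):  # short rollout; both terms contribute each step
    state = hybrid_step(state, w=0.05)
```

    Because every piece is differentiable, an observational loss (e.g. against IMERG rainfall) can be backpropagated through the rollout into the learned weights — the training setup the paper leans on.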

    15 mins
  • Flow-Matched Neural Operators for Continuous PDE Dynamics
    Dec 9 2025

    The episode describes the Continuous Flow Operator (CFO), a neural framework for learning the continuous-time dynamics of partial differential equations (PDEs), designed to overcome limitations of conventional approaches such as autoregressive schemes and neural ordinary differential equations (ODEs). CFO's key innovation is a flow matching objective that directly learns the right-hand side of the PDE dynamics, using the analytic velocity of spline-based interpolants fit to trajectory data. This approach allows training on irregular and subsampled time grids while enabling arbitrary temporal resolution at inference through standard ODE integration. Across four benchmarks (Lorenz, 1D Burgers, 2D diffusion-reaction, and 2D shallow water equations), the quintic CFO variant demonstrates superior long-horizon stability and strong data efficiency, often outperforming autoregressive baselines trained on complete datasets while itself seeing only 25% of irregularly sampled data.
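    The fit-an-interpolant, match-its-velocity recipe can be illustrated on a one-dimensional toy ODE. This is a sketch under loose assumptions: a global degree-5 polynomial stands in for the paper's spline interpolants, and a one-parameter linear model stands in for the neural operator.

```python
import numpy as np

rng = np.random.default_rng(1)

# Irregularly sampled trajectory of the ODE du/dt = -0.5 * u.
t = np.sort(rng.uniform(0.0, 4.0, size=30))
u = np.exp(-0.5 * t)

# Fit a quintic interpolant and take its analytic derivative — the
# regression target in a flow-matching objective.
coeffs = np.polyfit(t, u, deg=5)
target_velocity = np.polyval(np.polyder(coeffs), t)

# "Flow match" a surrogate f(u) = a * u to that velocity via least squares.
a = np.sum(target_velocity * u) / np.sum(u * u)

# Arbitrary-resolution inference: integrate f with any ODE solver and step.
u_hat, dt = 1.0, 0.01
for _ in range(400):  # Euler integration out to t = 4
    u_hat += dt * a * u_hat
```

    The recovered coefficient lands close to the true -0.5 even though the samples are irregular, and inference can use a time step never seen in training — the property the episode highlights.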

    12 mins