Martin Gauch
@gauchm.bsky.social
Deep learning & earth science @ Google Research
On that note, props to HESS for getting the bibtex citation right on the first try ("w\_\_\_")!
hess.copernicus.org/articles/29/...

Google Scholar drops two of the _ but at least escapes the remaining one correctly...
November 13, 2025 at 9:30 AM
I can't compete with @kratzert.bsky.social's swag game, but I'll contribute a few old NeuralHydrology stickers that I found recently :)
April 22, 2025 at 5:18 PM
Paper: eartharxiv.org/repository/v...

Code to reproduce is part of NeuralHydrology 1.12.0 which we just released: github.com/neuralhydrol...

Code to analyze results: github.com/gauchm/missi...
GitHub - neuralhydrology/neuralhydrology: Python library to train neural networks with a strong focus on hydrological applications.
March 15, 2025 at 12:50 PM
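A minimal sketch of how a training run could be started from Python once the released package is installed. The config file name below is a placeholder, not a file from the repo, so treat the snippet as an assumption about the setup rather than the exact reproduction recipe:

```python
# Sketch only: start a NeuralHydrology training run from Python.
# Assumptions: neuralhydrology >= 1.12.0 is installed and "missing_inputs.yml"
# is a hypothetical config file prepared like the examples in the repository.
from pathlib import Path

from neuralhydrology.nh_run import start_run

start_run(config_file=Path("missing_inputs.yml"))
```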
All of the approaches work pretty well! Masked mean tends to perform a little better, but the three are often quite close.

More details, experiments, figures, etc. in the paper.

All of this is joint work with @kratzert.bsky.social, @danklotz.bsky.social, Grey Nearing, Debby Cohen, and Oren Gilon.
March 15, 2025 at 12:50 PM
3) Attention: A more general variant of the masked mean that uses an attention mechanism to dynamically weight the embeddings in the average based on additional information, e.g., the basins' static attributes.
March 15, 2025 at 12:50 PM
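A rough PyTorch sketch of what such an attention-based combination could look like. The class, tensor shapes, and the linear query layer are illustrative assumptions, not the exact architecture from the paper:

```python
import torch
import torch.nn as nn

class AttentionCombiner(nn.Module):
    """Combine per-provider embeddings with attention weights derived from
    the basin's static attributes, instead of a plain masked average."""

    def __init__(self, hidden_size: int, n_statics: int):
        super().__init__()
        # query vector computed from the basin's static attributes
        self.query = nn.Linear(n_statics, hidden_size)

    def forward(self, embeddings, availability, statics):
        # embeddings:   [n_providers, batch, time, hidden]
        # availability: [n_providers, batch, time, 1], 1.0 where the provider has data
        # statics:      [batch, n_statics]
        q = self.query(statics)                                         # [batch, hidden]
        scores = torch.einsum("pbth,bh->pbt", embeddings, q).unsqueeze(-1)
        # providers without data at a time step get zero attention weight;
        # we assume at least one provider is available per step, otherwise
        # the softmax over all -inf scores would produce NaNs
        scores = scores.masked_fill(availability == 0, float("-inf"))
        weights = torch.softmax(scores, dim=0)                          # softmax over providers
        return (weights * embeddings).sum(dim=0)                        # [batch, time, hidden]
```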
2) Masked mean: Embed each group of inputs separately (a group being the inputs from one data provider) and average the embeddings that are available at a given time step. This is what we currently do in Google's operational flood forecasting model.
March 15, 2025 at 12:50 PM
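A rough PyTorch sketch of the masked-mean idea, assuming each provider's inputs arrive as a separate tensor with NaNs where data is missing. The linear layers stand in for whatever embedding networks the real model uses; names and shapes are illustrative:

```python
import torch
import torch.nn as nn

class MaskedMeanEmbedding(nn.Module):
    """Embed each provider's inputs separately, then average only the
    embeddings of providers that have data at a given time step."""

    def __init__(self, group_sizes: list[int], hidden_size: int):
        super().__init__()
        # one embedding layer per data provider (group of inputs)
        self.embeddings = nn.ModuleList([nn.Linear(n, hidden_size) for n in group_sizes])

    def forward(self, groups):
        # groups: list of tensors [batch, time, n_features_i], NaN where missing
        embedded, masks = [], []
        for x, layer in zip(groups, self.embeddings):
            # a provider counts as available if none of its features is NaN
            available = (~torch.isnan(x).any(dim=-1, keepdim=True)).float()  # [B, T, 1]
            embedded.append(layer(torch.nan_to_num(x, nan=0.0)) * available)
            masks.append(available)
        emb_sum = torch.stack(embedded).sum(dim=0)
        n_available = torch.stack(masks).sum(dim=0).clamp(min=1.0)  # avoid division by zero
        return emb_sum / n_available  # masked mean, [batch, time, hidden_size]
```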
In the paper we present and compare three ways to deal with those situations:
1) Input replacing: Just replace NaNs with some fixed value and concatenate a binary flag to the inputs that indicates which values are missing (sketched below).
March 15, 2025 at 12:50 PM
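A minimal PyTorch sketch of the input-replacing idea, assuming the inputs come as a [batch, time, features] tensor with NaNs marking missing values. The function name and the zero fill value are illustrative choices, not the paper's exact setup:

```python
import torch

def input_replacing(x: torch.Tensor, fill_value: float = 0.0) -> torch.Tensor:
    """Replace NaNs by a fixed value and append a binary missing-data flag.

    x: [batch, time, n_features], may contain NaNs for missing inputs.
    Returns: [batch, time, 2 * n_features]; the appended half is 1.0 where
    the corresponding input value was missing.
    """
    missing = torch.isnan(x)
    x_filled = torch.where(missing, torch.full_like(x, fill_value), x)
    return torch.cat([x_filled, missing.float()], dim=-1)

# Example: mark one value as missing and build the model input.
x = torch.randn(4, 10, 3)
x[0, 2, 1] = float("nan")
model_input = input_replacing(x)  # shape [4, 10, 6]
```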