Miltos Kofinas
@miltoskofinas.bsky.social
Postdoctoral researcher @vuamsterdam.bsky.social
AI for climate | Graph Neural Networks | Geometric Deep Learning | Neural Fields | Spatiotemporal Forecasting

mkofinas.github.io
😃 Shoutout to my wonderful collaborators @BorisAKnyazev, @Cyanogenoid, @yunluchen111, @gjburghouts, @egavves, @cgmsnoek, and @davwzha!
[9/9]
February 7, 2025 at 10:20 AM
📊We empirically validate our proposed method across a spectrum of tasks, including classification and editing of implicit neural representations, predicting generalization performance, and learning to optimize, while consistently outperforming state-of-the-art approaches.
[8/9]
February 7, 2025 at 10:20 AM
📏We adapt existing graph neural networks and transformers to take neural graphs as input, and incorporate inductive biases from neural graphs.
In the context of #geometricdeeplearning, neural graphs constitute a new benchmark for graph neural networks.
[7/9]
February 7, 2025 at 10:20 AM
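As a rough illustration of the post above (a hand-rolled sketch, not the paper's exact layer), here is one simple message-passing step over a neural graph: it consumes node features (e.g. biases) and edge features (e.g. weights) and aggregates messages with a permutation-invariant sum.

```python
import torch
import torch.nn as nn

class NeuralGraphMPLayer(nn.Module):
    """One message-passing layer over a graph with edge features (e.g., MLP weights)."""
    def __init__(self, node_dim, edge_dim, hidden_dim):
        super().__init__()
        self.msg = nn.Sequential(
            nn.Linear(2 * node_dim + edge_dim, hidden_dim), nn.SiLU(),
            nn.Linear(hidden_dim, hidden_dim),
        )
        self.upd = nn.Sequential(
            nn.Linear(node_dim + hidden_dim, hidden_dim), nn.SiLU(),
            nn.Linear(hidden_dim, node_dim),
        )

    def forward(self, h, edge_index, e):
        # h: (num_nodes, node_dim), edge_index: (2, num_edges), e: (num_edges, edge_dim)
        src, dst = edge_index
        m = self.msg(torch.cat([h[src], h[dst], e], dim=-1))
        agg = torch.zeros(h.size(0), m.size(-1), device=h.device)
        agg.index_add_(0, dst, m)                     # permutation-invariant sum per target node
        return h + self.upd(torch.cat([h, agg], dim=-1))

# Toy usage: 10 neurons with 1-d bias features, 25 weight edges with 1-d features.
h = torch.randn(10, 1)
edge_index = torch.randint(0, 10, (2, 25))
e = torch.randn(25, 1)
out = NeuralGraphMPLayer(node_dim=1, edge_dim=1, hidden_dim=32)(h, edge_index, e)  # (10, 1)
```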
Explicitly integrating the graph structure allows us to process heterogeneous architectures, accommodating networks with different numbers of layers or hidden dimensions, different non-linearities, and different connectivities such as residual connections.
[6/9]
February 7, 2025 at 10:20 AM
We take an alternative approach: we introduce neural graphs, representations of neural networks as computational graphs of parameters.
This allows us to harness powerful graph neural networks and transformers that preserve permutation symmetry.
[5/9]
February 7, 2025 at 10:20 AM
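A minimal sketch of such a construction (my own simplified featurization, not necessarily the paper's exact one): each neuron becomes a node carrying its bias, and each weight becomes a directed edge between the neurons it connects.

```python
import torch

# Per-layer parameters of a hypothetical 2-layer MLP: weights[l] has shape (d_out, d_in).
weights = [torch.randn(5, 3), torch.randn(2, 5)]
biases = [torch.randn(5), torch.randn(2)]
layer_sizes = [3, 5, 2]

# One node per neuron (input neurons included); node feature = bias (0 for inputs).
offsets = [0]
for size in layer_sizes:
    offsets.append(offsets[-1] + size)
node_feats = torch.zeros(offsets[-1], 1)
for l, b in enumerate(biases):
    node_feats[offsets[l + 1]:offsets[l + 2], 0] = b

# One directed edge per weight entry; edge feature = the weight value.
src, dst, edge_feats = [], [], []
for l, W in enumerate(weights):
    d_out, d_in = W.shape
    for j in range(d_out):            # target neuron in layer l+1
        for i in range(d_in):         # source neuron in layer l
            src.append(offsets[l] + i)
            dst.append(offsets[l + 1] + j)
            edge_feats.append(float(W[j, i]))
edge_index = torch.tensor([src, dst])                  # shape (2, num_edges)
edge_feats = torch.tensor(edge_feats).unsqueeze(-1)    # shape (num_edges, 1)

# Permuting the neurons of a hidden layer only relabels nodes of this graph, so a
# permutation-equivariant GNN or transformer is unaffected by the reordering.
```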
Recent works propose architectures that respect this symmetry, but rely on intricate weight-sharing patterns.
Further, they ignore the impact of the network architecture itself, and cannot process neural network parameters from diverse architectures.
[4/9]
February 7, 2025 at 10:20 AM
🤔Imagine a simple MLP. How can we process its parameters efficiently? Naively flattening and concatenating weights overlooks essential structure in the parameters: permutation symmetry.
💡Neurons in a layer can be freely reordered while representing the same function.
[3/9]
February 7, 2025 at 10:20 AM
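A minimal numerical check of this symmetry for a hypothetical 2-layer MLP: permuting the hidden neurons, i.e. the rows of (W1, b1) together with the columns of W2, leaves the function unchanged.

```python
import torch

torch.manual_seed(0)
d_in, d_hidden, d_out = 3, 5, 2
W1, b1 = torch.randn(d_hidden, d_in), torch.randn(d_hidden)
W2, b2 = torch.randn(d_out, d_hidden), torch.randn(d_out)

def mlp(x, W1, b1, W2, b2):
    # A 2-layer MLP: x -> ReLU(W1 x + b1) -> W2 h + b2
    return W2 @ torch.relu(W1 @ x + b1) + b2

x = torch.randn(d_in)

# Apply the same permutation to the hidden rows of (W1, b1) and the hidden columns of W2.
perm = torch.randperm(d_hidden)
W1_p, b1_p = W1[perm], b1[perm]
W2_p = W2[:, perm]

# Both parameterizations represent the same function.
print(torch.allclose(mlp(x, W1, b1, W2, b2), mlp(x, W1_p, b1_p, W2_p, b2)))  # True
```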
How can we design neural networks that take neural network parameters as input?
This fundamental question arises in applications as diverse as generating neural network weights, processing implicit neural representations, and predicting generalization performance.
[2/9]
February 7, 2025 at 10:20 AM
Cheers to my wonderful collaborators @BorisAKnyazev, @Cyanogenoid, @yunluchen111, @gjburghouts, @egavves, @cgmsnoek, and @davwzha!
February 7, 2025 at 10:19 AM
Shoutout to my collaborators @erikjbekkers, @nsn86, and @egavves! 😊 [8/8]
February 7, 2025 at 10:19 AM
We term our method Aether, inspired by the postulated medium once thought to permeate all of space and allow for the propagation of light. [7/8]
February 7, 2025 at 10:19 AM
Our experiments show that we can accurately discover the underlying fields in charged-particle settings, real-world traffic scenes, and gravitational n-body problems, and effectively use them to learn the system and forecast future trajectories. [6/8]
February 7, 2025 at 10:19 AM
We propose to disentangle equivariant object interactions from external global field effects. We model interactions with equivariant graph networks, and combine them with neural fields in a novel graph network that integrates field forces. [5/8]
February 7, 2025 at 10:19 AM
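A heavily simplified sketch of that idea (my own illustration, not the Aether implementation, and only translation-equivariant rather than fully equivariant): pairwise interactions depend only on relative geometry, while a neural field contributes a force evaluated at each object's absolute position.

```python
import torch
import torch.nn as nn

class FieldAwareInteractionLayer(nn.Module):
    """Pairwise interactions from relative geometry, plus a learned global force field."""
    def __init__(self, hidden_dim=64):
        super().__init__()
        # Interaction term: depends only on relative positions and distances.
        self.pair = nn.Sequential(
            nn.Linear(4, hidden_dim), nn.SiLU(), nn.Linear(hidden_dim, 3),
        )
        # Neural field term: depends on absolute position, capturing the global field.
        self.field = nn.Sequential(
            nn.Linear(3, hidden_dim), nn.SiLU(), nn.Linear(hidden_dim, 3),
        )

    def forward(self, pos, vel):
        # pos, vel: (N, 3)
        rel = pos[:, None, :] - pos[None, :, :]            # (N, N, 3)
        dist = rel.norm(dim=-1, keepdim=True)              # (N, N, 1)
        mask = 1.0 - torch.eye(pos.size(0)).unsqueeze(-1)  # zero out self-interactions
        interaction = (self.pair(torch.cat([rel, dist], dim=-1)) * mask).sum(dim=1)
        field_force = self.field(pos)                      # force from the learned global field
        return vel + interaction + field_force             # toy Euler-style velocity update

pos, vel = torch.randn(5, 3), torch.randn(5, 3)
vel_next = FieldAwareInteractionLayer()(pos, vel)          # (5, 3)
```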
We focus on discovering these fields, and infer them from the observed dynamics alone, without directly observing them. We theorize the presence of latent force fields, and propose neural fields to learn them. [4/8]
February 7, 2025 at 10:19 AM
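Since the field is never observed directly, its only training signal is the observed motion. A toy sketch of that idea (hypothetical shapes and loss, and pretending interactions are negligible so acceleration reduces to the field force): the neural field is just a coordinate MLP, fitted so that it explains the accelerations estimated from trajectories.

```python
import torch
import torch.nn as nn

# Neural field: a coordinate MLP mapping (position, time) -> latent force vector.
force_field = nn.Sequential(
    nn.Linear(3 + 1, 64), nn.SiLU(),
    nn.Linear(64, 64), nn.SiLU(),
    nn.Linear(64, 3),
)

# Toy supervision: the field is fitted only through the motion it explains.
pos = torch.randn(128, 3)        # observed positions
t = torch.rand(128, 1)           # observation times
acc = torch.randn(128, 3)        # accelerations estimated from the observed trajectories

optimizer = torch.optim.Adam(force_field.parameters(), lr=1e-3)
for _ in range(200):
    pred = force_field(torch.cat([pos, t], dim=-1))
    loss = ((pred - acc) ** 2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```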
Equivariant networks are inapplicable in the presence of global fields, as they fail to capture global information.

Meanwhile, the observations constitute the net effect of object interactions and field effects, i.e. object interactions are entangled with global fields. [3/8]
February 7, 2025 at 10:19 AM
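A quick toy illustration of the first point: a model that only sees relative positions cannot distinguish two copies of the same system placed at different locations, even though a position-dependent field, e.g. F(x) = -x, acts on them differently.

```python
import torch

pos = torch.randn(4, 3)                               # a small system of 4 objects
pos_shifted = pos + torch.tensor([10.0, 0.0, 0.0])    # the same system, translated

# Relative positions, the inputs of a translation-equivariant interaction model, are identical...
rel = pos[:, None] - pos[None, :]
rel_shifted = pos_shifted[:, None] - pos_shifted[None, :]
print(torch.allclose(rel, rel_shifted))                # True

# ...but a position-dependent global field acts differently on the two copies.
field = lambda x: -x                                   # toy field, e.g. a harmonic trap
print(torch.allclose(field(pos), field(pos_shifted)))  # False
```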
Systems of interacting objects often evolve under the influence of global field effects that govern their dynamics, yet previous works have abstracted such effects away, focusing only on the in vitro case of systems evolving in a vacuum. [2/8]
February 7, 2025 at 10:19 AM