Lena Zellinger
@lenazellinger.bsky.social
ELLIS PhD student at the University of Edinburgh
https://lenazellinger.github.io/
Reposted by Lena Zellinger
Congratulations also to @leanderk.bsky.social @paolomorettin.bsky.social Roberto Sebastiani, @andreapasserini.bsky.social and @nolovedeeplearning.bsky.social
for the ✨Best Student Paper Runner-Up Award✨ for

"A Probabilistic Neurosymbolic Layer for Algebraic Constraint Satisfaction"

👉 openreview.net/forum?id=9Uk...
A Probabilistic Neuro-symbolic Layer for Algebraic Constraint...
In safety-critical applications, guaranteeing the satisfaction of constraints over continuous environments is crucial, e.g., an autonomous agent should never crash over obstacles or go off-road....
openreview.net
July 28, 2025 at 11:13 AM
Reposted by Lena Zellinger
24 hours more to submit your latest papers on #TPMs!
🗓️ Deadline extended: 💥2nd June 2025!💥

We are looking forward to your works on:

🔌 #circuits and #tensor #networks 🕸️
⏳ normalizing #flows 💨
⚖️ scaling #NeSy #AI 🦕
🚅 fast and #reliable inference 🔍
...& more!

please share 🙏
the #TPM ⚡Tractable Probabilistic Modeling ⚡Workshop is back at @auai.org #UAI2025!

Submit your works on:

- fast and #reliable inference
- #circuits and #tensor #networks
- normalizing #flows
- scaling #NeSy #AI
...& more!

🕓 deadline: 23/05/25
👉 tractable-probabilistic-modeling.github.io/tpm2025/
June 2, 2025 at 9:40 AM
Reposted by Lena Zellinger
We propose Neurosymbolic Diffusion Models! We find diffusion is especially compelling for neurosymbolic approaches, combining powerful multimodal understanding with symbolic reasoning 🚀

Read more 👇
May 21, 2025 at 10:57 AM
Reposted by Lena Zellinger
🚨 New paper: “Towards Adaptive Self-Normalized IS”, @ IEEE Statistical Signal Processing Workshop.

TL;DR:
To estimate µ = E_p[f(θ)] with SNIS, instead of running MCMC on p(θ) or learning a parametric q(θ), we run MCMC directly on p(θ)|f(θ) − µ| (the variance-minimizing proposal).

arxiv.org/abs/2505.00372
Towards Adaptive Self-Normalized Importance Samplers
The self-normalized importance sampling (SNIS) estimator is a Monte Carlo estimator widely used to approximate expectations in statistical signal processing and machine learning. The efficiency of S...
arxiv.org
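Not from the paper itself, but to make the TL;DR concrete: a minimal NumPy sketch of plain self-normalized importance sampling with a fixed, hand-picked Gaussian proposal standing in for the adaptive, variance-minimizing one the paper targets. The toy target, integrand, and all names are illustrative assumptions.

```python
import numpy as np

# Toy SNIS sketch (illustrative; not the paper's adaptive method).
# Goal: estimate mu = E_p[f(theta)] when p is known only up to a constant.

def f(theta):
    return theta ** 2                     # example integrand

def log_p_unnorm(theta):
    return -0.5 * theta ** 2              # unnormalized log-target (standard normal)

def snis(n=50_000, seed=0):
    rng = np.random.default_rng(seed)
    theta = rng.normal(0.0, 3.0, size=n)  # draws from a fixed wide-Gaussian proposal q
    log_q = -0.5 * (theta / 3.0) ** 2 - np.log(3.0) - 0.5 * np.log(2 * np.pi)
    log_w = log_p_unnorm(theta) - log_q   # unnormalized log importance weights
    w = np.exp(log_w - log_w.max())       # shift for numerical stability
    w /= w.sum()                          # self-normalization
    return np.sum(w * f(theta))

# E_p[theta^2] = 1 under N(0, 1); the estimate should be close to that.
print(snis())

# The paper's idea: instead of this fixed q, run MCMC on a target proportional
# to p(theta) * |f(theta) - mu|, the variance-minimizing SNIS proposal, adapting
# it as the running estimate of mu is refined.
```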
May 2, 2025 at 1:29 PM
Reposted by Lena Zellinger
Today we have @lennertds.bsky.social from KU Leuven teaching us how to adapt NeSy methods to deal with sequential problems 🚀

Super interesting topic combining DL + NeSy + HMMs! Keep an eye on Lennert's future work!
April 30, 2025 at 2:13 PM
It’s great to have @wouterboomsma.bsky.social talking at UoE today! Happening at 2pm at EFI 2.35.
April 28, 2025 at 12:52 PM
Reposted by Lena Zellinger
the #TPM ⚡Tractable Probabilistic Modeling ⚡Workshop is back at @auai.org #UAI2025!

Submit your works on:

- fast and #reliable inference
- #circuits and #tensor #networks
- normalizing #flows
- scaling #NeSy #AI
...& more!

🕓 deadline: 23/05/25
👉 tractable-probabilistic-modeling.github.io/tpm2025/
April 16, 2025 at 8:40 AM
Reposted by Lena Zellinger
I am at @realaaai.bsky.social #AAAI25 in sunny #Philadelphia 🌞

reach out if you want to grab coffee and chat about #probabilistic #ML #AI #nesy #neurosymbolic #tensor #lowrank models!

check out our tutorial
👉 april-tools.github.io/aaai25-tf-pc...

and workshop
👉 april-tools.github.io/colorai/
February 25, 2025 at 3:33 PM
Reposted by Lena Zellinger
Have you ever been curious to try Causal Normalizing Flows for your project but found them intimidating? Say no more 😜

I just released a small library to easily implement and use causal-flows:

github.com/adrianjav/ca...
GitHub - adrianjav/causal-flows: CausalFlows: A library for Causal Normalizing Flows in Pytorch
CausalFlows: A library for Causal Normalizing Flows in Pytorch - adrianjav/causal-flows
github.com
February 13, 2025 at 5:54 PM
Reposted by Lena Zellinger
Interested in estimating posterior predictives in Bayesian inference? Really want to know if your approximate inference "is working"?
Come to our poster at the NeurIPS BDU workshop on Saturday - see TL;DR below.
December 11, 2024 at 5:25 PM
Reposted by Lena Zellinger
many of the recent successes in #AI #ML are due to #structured low-rank representations!

but... what's the connection between #lowrank adapters, #tensor networks, #polynomials and #circuits?

join our #AAAI25 workshop to find out!

and 2 more days to submit!
👇👇👇
april-tools.github.io/colorai/
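As a purely illustrative aside (not from the workshop material): the "structured low-rank" hook can be made concrete in a few lines of NumPy with a LoRA-style adapter that adds a rank-r correction B @ A to a frozen weight matrix. All sizes and names below are made up.

```python
import numpy as np

# Hypothetical LoRA-style low-rank adapter: y = (W + B A) x,
# where W is frozen and only the small factors A, B would be trained.
rng = np.random.default_rng(0)

d_out, d_in, r = 64, 32, 4               # illustrative layer sizes and rank
W = rng.normal(size=(d_out, d_in))       # frozen base weights
A = rng.normal(size=(r, d_in)) * 0.01    # trainable down-projection
B = np.zeros((d_out, r))                 # zero init, so the adapter starts as a no-op

def adapted_forward(x):
    # r * (d_in + d_out) extra parameters instead of d_in * d_out
    return W @ x + B @ (A @ x)

x = rng.normal(size=d_in)
print(adapted_forward(x).shape)          # (64,)
```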
November 21, 2024 at 7:15 AM