Johan Edstedt
@parskatt.bsky.social
PhD student @ Linköping University

I like 3D vision and training neural networks.
Code: https://github.com/parskatt
Weights: https://github.com/Parskatt/storage/releases/tag/roma
You're on the exponential curve.
February 9, 2026 at 3:52 PM
Nope, more scenes.
February 9, 2026 at 12:07 PM
Other than being left-handed, it makes sense ;^)
February 7, 2026 at 7:36 AM
This is just big wheat propaganda, flour can't scale.
February 5, 2026 at 7:44 AM
Reposted by Johan Edstedt
Take me down to the Parallax city where the far moves slow and the near moves quickly
February 1, 2026 at 3:40 PM
"a lazy human would not ignore all prior art" idk, the papers I've reviewed say otherwise ;)
February 2, 2026 at 7:47 AM
I might try some approaches this weekend.
January 30, 2026 at 4:46 PM
My guess is that:

1. COLMAP would work pretty well (with good correspondences), but the baseline is rather small, and the dynamics of the cloud would be tricky.

2. Feed-forward methods would underestimate the size of the cloud.

Interested to hear whether this is indeed the case, or whether my intuitions are wrong. (Rough COLMAP sketch below.)
January 30, 2026 at 1:11 PM
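A quick way to test guess 1 would be to sample frames from the clip and run a plain COLMAP pipeline. A minimal sketch, assuming pycolmap's high-level pipeline functions as in its README; the paths and frame directory are placeholders, and note that monocular SfM is only up-to-scale, so "correct size" needs extra information either way.

from pathlib import Path
import pycolmap  # assumes pycolmap is installed; API as in its README

image_dir = Path("storm_frames")            # placeholder: frames sampled from the video
output_path = Path("storm_reconstruction")  # placeholder output directory
output_path.mkdir(exist_ok=True)
database_path = output_path / "database.db"

pycolmap.extract_features(database_path, image_dir)  # SIFT by default
pycolmap.match_exhaustive(database_path)             # fine for a handful of frames
maps = pycolmap.incremental_mapping(database_path, image_dir, output_path)

# With a small baseline and a moving cloud, expect a partial or failed map, or a
# reconstruction whose scale is arbitrary (SfM only recovers geometry up to scale).
if maps:
    maps[0].write(output_path)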
Is there any current 3/4D reconstruction method able to accurately reconstruct this scene? (with the correct size of the thunderstorm)
~ chasing a beautiful supercell thunderstorm across south-central Nebraska on July 1st of 2024 ~
January 30, 2026 at 1:08 PM
regardless of your views on AI, i strongly agree with this viewpoint! you are not funded by taxpayers to optimize your knowledge consumption workflows. it's good that you enjoy your job, you are getting paid to enjoy your job
regardless of your views on AI, i strongly disagree with this viewpoint! you are funded by taxpayers to perform an important service. it's good that you enjoy your job, but are not getting paid to enjoy your job
January 26, 2026 at 6:35 PM
Should the rebuttal contain the updated paper ID?

When we revised after the OpenReview debacle, we were not supposed to.
January 22, 2026 at 6:32 PM
Sounds like it was "on time" by SJ standards.
January 22, 2026 at 9:30 AM
You have days like these, but also days where it looks like Mordor.
January 22, 2026 at 9:15 AM
hehe, we'll see what I have time for.
January 16, 2026 at 4:12 PM
Sure, but the DeDoDe descriptor would perform very well with 4k keypoints from another detector, such as ALIKED or DaD :)
I would bet that we would beat their IMC22 numbers in that setting.
January 16, 2026 at 3:50 PM
If you run the DeDoDe descriptor with DaD keypoints, you generally get much better results at lower numbers of keypoints than with the original detector. (Shown: DeDoDe descriptors matched with different detectors.)
January 16, 2026 at 3:47 PM
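If you want to try this mix-and-match yourself, the recipe is: take keypoints from one detector (e.g. DaD or ALIKED), sample the DeDoDe descriptor's dense feature map at those locations, and match. A minimal sketch, with random tensors standing in for the real DaD keypoints and DeDoDe descriptor maps; this is not the DeDoDe API itself.

import torch
import torch.nn.functional as F

def sample_descriptors(desc_map, kpts):
    # desc_map: (1, C, H, W) dense descriptor map; kpts: (N, 2) in [-1, 1], (x, y) order
    grid = kpts.view(1, 1, -1, 2)                         # shape grid_sample expects
    desc = F.grid_sample(desc_map, grid, mode="bilinear", align_corners=False)
    return F.normalize(desc[0, :, 0].T, dim=1)            # (N, C), L2-normalised

def mutual_nearest_neighbours(desc_A, desc_B):
    # keep only pairs that are each other's nearest neighbour under cosine similarity
    sim = desc_A @ desc_B.T
    nn_A = sim.argmax(dim=1)
    nn_B = sim.argmax(dim=0)
    idx_A = torch.arange(desc_A.shape[0])
    mutual = nn_B[nn_A] == idx_A
    return idx_A[mutual], nn_A[mutual]

# Stand-ins: kpts_* would come from a detector (DaD, ALIKED, ...),
# desc_map_* from the DeDoDe descriptor network.
kpts_A = torch.rand(4096, 2) * 2 - 1
kpts_B = torch.rand(4096, 2) * 2 - 1
desc_map_A = torch.randn(1, 256, 60, 80)
desc_map_B = torch.randn(1, 256, 60, 80)

matches_A, matches_B = mutual_nearest_neighbours(
    sample_descriptors(desc_map_A, kpts_A),
    sample_descriptors(desc_map_B, kpts_B),
)
print(f"{matches_A.numel()} mutual matches out of {kpts_A.shape[0]} keypoints")

In practice you would use the library's own matcher (DeDoDe ships a dual-softmax-style matcher, if I remember right); plain mutual nearest neighbours is just the simplest stand-in here.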
Yes, likely. We ran with 30k keypoints (or whatever gave the best performance). You can see that our results for other descriptors are also in general better.
January 16, 2026 at 3:45 PM
It's worse than our original DeDoDe eval; not sure what settings they used here.
January 16, 2026 at 1:01 PM
The internet is both a generator and a distributor of data.
Models are distilled versions of data.

I don't think you would see DL without it, but you can of course argue otherwise.
January 13, 2026 at 6:34 PM
The internet is not the data though? It's the connections between machines.
January 13, 2026 at 3:34 PM
I'd argue DL is due to the internet rather than autograd.
January 13, 2026 at 12:28 PM
Correspondence is a much prettier word than match.

Pixel correspondence makes more sense than feature matching (what is a feature?).
January 13, 2026 at 10:30 AM