Pierre-François DP
@pfdeplaen.bsky.social
Final-year PhD student in computer vision at KU Leuven, Belgium.
Minimizing entropy only to realize my level of surprise increased
gh.io/pf
Reposted by Pierre-François DP
Call for Papers update - ILR+G workshop @iccv.bsky.social

We will now feature a single submission track with new submission dates.

📅 New submission deadline: June 21, 2025
🔗 Submit here: cmt3.research.microsoft.com/ILRnG2025
🌐 More details: ilr-workshop.github.io/ICCVW2025/

#ICCV2025
May 24, 2025 at 8:27 AM
Update: #ICML sent an email asking reviewers to update reviews and add an "update after rebuttal" section.
Although the review process is far from perfect in ML and CV conferences, I welcome the fact that ICML is trying to improve it.
April 9, 2025 at 6:04 PM
ICML introduced a button for reviewers to acknowledge that they have read rebuttals and will take them into consideration.

The idea sounds nice, but in practice most reviewers (around 75% in my batch of papers) just clicked the button without leaving a comment or updating their scores...
April 4, 2025 at 4:43 PM
Did you know that PCA and SSL decorrelation techniques (e.g. Barlow Twins) don't necessarily extract minimally redundant/dependent features?
Our paper explains why and introduces an algorithm for general dependence minimization.
🧵
February 21, 2025 at 11:14 AM
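To make the claim in the post above concrete, here is a minimal NumPy sketch (not the paper's algorithm; the toy data and variable names are assumptions for illustration): two features can be perfectly uncorrelated, so a covariance-based objective such as PCA whitening or a Barlow Twins style decorrelation penalty considers them done, while they remain fully dependent.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy pair of features: y is a deterministic function of x, so the two are
# maximally dependent (knowing x fixes y exactly) ...
x = rng.uniform(-1.0, 1.0, size=100_000)
y = x ** 2

# ... yet their Pearson correlation is (numerically) zero, because
# E[x * y] = E[x**3] = 0 for a distribution symmetric around 0.
print("corr(x, y):", np.corrcoef(x, y)[0, 1])

# A purely covariance-based criterion only looks at this matrix, which is
# already ~diagonal, so it sees no redundancy left to remove even though
# the redundancy between the two features is total.
feats = np.stack([x, y], axis=1)
print("covariance matrix:\n", np.cov(feats, rowvar=False))
```

Driving the covariance (or cross-correlation) matrix toward the identity only removes second-order, linear redundancy; minimizing general dependence also has to account for higher-order statistics, which is what the paper's algorithm targets.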
🚨 A peer-reviewed publication from MM'24 copied our CVPR 2023 paper! #plagiarism
The authors rephrased our method, but their approach is no different from ours.
Surprisingly, they cited us for general observations but did everything they could to hide our contributions from readers and reviewers.
January 10, 2025 at 11:52 AM