Gwen Yidou-Weng
@yidouweng.bsky.social
AI PhD student @UCLA
Probabilistic circuits | Controlled generation
6. And TRACE is blazingly fast!
–Seconds of classifier training to adapt to new constraints (sketch below)
–Only +20% inference time over the base LM
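Not from the thread: a minimal sketch of what "seconds of classifier training" can look like, assuming each sequence is featurized by a pretrained HMM's posterior over hidden states. The `hmm_features` helper, the state count, and the toy data are hypothetical stand-ins, not the paper's pipeline.

```python
# Hypothetical sketch: adapting to a new constraint is just fitting a linear probe.
import numpy as np
from sklearn.linear_model import LogisticRegression

def hmm_features(sequences, n_states=4096):
    """Stand-in featurizer: in a real pipeline this would return each
    sequence's mean posterior distribution over the HMM's hidden states."""
    rng = np.random.default_rng(0)
    return rng.random((len(sequences), n_states))  # placeholder features

texts = ["an off-limits example ...", "an acceptable example ..."]
labels = [1, 0]  # 1 = constraint violated, 0 = satisfied

X = hmm_features(texts)
clf = LogisticRegression(max_iter=1000).fit(X, labels)  # fits in seconds on CPU
```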
July 19, 2025 at 10:00 PM
5b. TRACE also adapts flexibly to 76 different personas in seconds.
For example, it can instantly imitate Twilight Sparkle’s tone, far better than a prompting baseline.
July 19, 2025 at 9:59 PM
5. Results:
Surprisingly, even with this low overhead, TRACE consistently outperforms much more expensive baselines on global control tasks. For example, in detoxification, TRACE outperforms DPO, RL, and FUDGE in quality while maintaining diversity and fluency.
July 19, 2025 at 9:59 PM
3. Previous solutions:
Train-time methods (FT/RL):
–Train a model for p(x_t | x_{<t}, s)
Inference-time methods:
–Computing p(s | x_{<t}, x_t) exactly means marginalizing over all possible continuations, which is intractable for long sequences (see the factorization below).
–Auxiliary guides approximate p(s | x_{<t}, x_t), but aren’t flexible to new constraints.
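For concreteness, here is the standard guided-decoding factorization both lines of work target (notation mine, not quoted from the thread):

```latex
% Bayes: fold the constraint s into next-token prediction
p(x_t \mid x_{<t}, s) \;\propto\; p_{\mathrm{LM}}(x_t \mid x_{<t}) \cdot p(s \mid x_{<t}, x_t)
% The guidance term sums over every possible continuation,
% which is what blows up for long sequences:
p(s \mid x_{<t}, x_t) \;=\; \sum_{x_{>t}} p(s \mid x_{1:T}) \, p_{\mathrm{LM}}(x_{>t} \mid x_{\le t})
```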
July 19, 2025 at 9:58 PM
Wish your LM could plan, not just guess the next word?
TRACE lets the LM see all endings before each move:
– Global control at inference time
– Tractable lookahead via an HMM proxy of the LM
– A linear classifier per constraint
Outperforms RL, DPO, and FUDGE at just +20% decoding time over the base LM (sketch below).
#ICML2025 @guyvdb.bsky.social
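Not the paper's code: a minimal sketch of one guided decoding step under stated assumptions. The HMM matrices `T` and `E`, the per-state future-satisfaction vector `future_sat`, and `trace_step` itself are hypothetical names; the paper's exact recursion may differ.

```python
# Hypothetical sketch of HMM-lookahead guided decoding.
import numpy as np

def trace_step(lm_logprobs, alpha, T, E, future_sat):
    """Reweight next-token log-probs by expected constraint satisfaction.

    lm_logprobs: [V] base-LM log-probs for the next token
    alpha:       [H] current HMM belief over hidden states
    T:           [H, H] state-transition matrix (rows sum to 1)
    E:           [H, V] emission probabilities
    future_sat:  [H] precomputed P(constraint holds | future from state h),
                 e.g. from a backward pass through the linear classifier
    """
    next_state = alpha @ T                       # belief after one transition
    # Expected future satisfaction if the next token is v, weighting each
    # hidden state by how likely it is to emit v:
    per_token = (next_state[:, None] * E * future_sat[:, None]).sum(0)
    norm = (next_state[:, None] * E).sum(0)
    guidance = np.log(per_token / norm + 1e-12)
    return lm_logprobs + guidance                # guided next-token scores

if __name__ == "__main__":
    H, V = 8, 5                                  # toy sizes
    rng = np.random.default_rng(1)
    T = rng.dirichlet(np.ones(H), size=H)
    E = rng.dirichlet(np.ones(V), size=H)
    alpha = np.full(H, 1.0 / H)
    future_sat = rng.random(H)
    lm_logprobs = np.log(rng.dirichlet(np.ones(V)))
    print(trace_step(lm_logprobs, alpha, T, E, future_sat))
```

In this sketch, the low overhead comes from folding all lookahead into the precomputed `future_sat` vector, so the per-token extra cost is a couple of matrix-vector products against the HMM.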
July 19, 2025 at 9:54 PM