We can do this via Distributional Successor Features. Our recent work introduces the 1st tractable & provably convergent algos for learning DSFs.
#NeurIPS2024 #6704
12 Dec, 11am-2pm
Our paper "Action Gaps & Advantages in Continuous-Time Distributional RL" shows how Distributional RL sheds light on this, enabling high-frequency model-free risk-sensitive RL.
#NeurIPS2024 #6410
13 Dec, 11am-2pm
Our paper "Action Gaps & Advantages in Continuous-Time Distributional RL" shows how Distributional RL sheds light on this, enabling high-frequency model-free risk-sensitive RL.
#NeurIPS2024 #6410
13 Dec, 11-2