Kevin Black
kvablack.bsky.social
wannabe roboticist @ π and Berkeley AI
in retrospect, we should have done more to temper expectations and highlight the difference between the method/architecture/open-source code and the actual pretrained weights — we def want ppl to play with the weights, but they come with absolutely no guarantees.
November 28, 2024 at 8:04 PM
(disclaimer: opinions are my own)

we used language that implied that the pretrained Octo weights would definitely actually be useful for YOU in YOUR lab. in reality, the real world is much bigger than OXE and some carefully-tuned evals. end-to-end robot learning is not even at the GPT-2 level
November 28, 2024 at 8:04 PM
super happy to hear the first one, since we worked so hard on open-sourcing! the last one is the one I'm less confident about -- it's very hard to beat a well-tuned from-scratch baseline. but if it works for you, then perhaps I've been too pessimistic!
November 27, 2024 at 7:48 PM
don't worry I wasn't insulted :)

interestingly, this is the second time that public dunking on Octo has informed me of someone who actually does use it (like @tomdupuis.bsky.social), revising my opinion of its utility upwards rather than downwards.

(the first time I was the one doing the dunking)
November 27, 2024 at 6:40 PM
yeah we 100% oversold Octo
November 27, 2024 at 6:10 PM
tokenization is inductive bias
November 25, 2024 at 6:10 PM
same, I've been trying to force myself to use btop but something about htop just feels... comfy... maybe it's just familiarity bias
November 22, 2024 at 5:48 AM