Assistant prof. @CarnegieMellon & affiliated faculty @mldcmu, previously instructor @NYU_Courant, PhD jointly @Harvard and @MIT
https://nmboffi.github.io
anyways, thanks for listening! i really think it would help the ai + scientific computing community to have standard model implementations available in jax. jax is just so much easier for almost all tasks except actually using SoTA models, in my opinion. (6/n)
i mostly use gpus, but have played with tpus through google research cloud. i'd probably prioritize gpus (they're mostly what's available in academia) (5/n)
another example is graph neural networks. there's jraph, but it's very "roll your own". i work on (3/n)
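to make "roll your own" concrete, here's roughly the scaffolding a basic jraph GraphNetwork needs (a sketch added for illustration, not from the original thread; in a real model the update functions would be learned haiku/flax modules rather than fixed sums):

```python
import jax
import jax.numpy as jnp
import jraph

# a tiny toy graph: 3 nodes with 4-d features, 2 directed edges.
graph = jraph.GraphsTuple(
    nodes=jnp.ones((3, 4)),
    edges=jnp.ones((2, 4)),
    senders=jnp.array([0, 1]),
    receivers=jnp.array([1, 2]),
    n_node=jnp.array([3]),
    n_edge=jnp.array([2]),
    globals=jnp.zeros((1, 4)),
)

# jraph provides the message-passing scaffold (gather, scatter, aggregate);
# every update function below is left entirely to the user.
def update_edge_fn(edges, sender_feats, receiver_feats, globals_):
    return jax.nn.relu(edges + sender_feats + receiver_feats)

def update_node_fn(nodes, sent_msgs, received_msgs, globals_):
    return jax.nn.relu(nodes + received_msgs)

def update_global_fn(node_agg, edge_agg, globals_):
    return globals_

net = jraph.GraphNetwork(
    update_edge_fn=update_edge_fn,
    update_node_fn=update_node_fn,
    update_global_fn=update_global_fn,
)

out = net(graph)  # a new GraphsTuple with updated node/edge features
```

everything model-specific (the update functions, the readout, the training loop) is on you, which is the point above.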
is there any plan for google to release a package with model implementations? their absence seems to be the dominant issue for scaling jax in research
techniques like the above method can be used, in principle, to solve high-dimensional HJB equations (1/)
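for readers unfamiliar with the acronym: HJB = Hamilton-Jacobi-Bellman. in a standard stochastic control setup (my notation, not from the thread; dynamics dX_t = b(X_t, u_t) dt + sigma dW_t, running cost L, terminal cost g), the value function V solves

\partial_t V(t,x) + \min_{u}\Big[ L(x,u) + b(x,u)\cdot\nabla_x V(t,x) + \tfrac{1}{2}\,\mathrm{Tr}\big(\sigma\sigma^\top \nabla_x^2 V(t,x)\big) \Big] = 0, \qquad V(T,x) = g(x).

grid-based solvers scale exponentially in dim(x), which is why learning-based approaches are attractive in high dimension.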
i've read your recent ITO papers to try to learn more about this: what's the right assumption on the data? do we have it, or do we need to sample given U but with no data?