Salman Naqvi
forbo7.bsky.social
Salman Naqvi
@forbo7.bsky.social
• Studying BSc (Hons) AI & EdTech
• Part-time at Multimodal Intelligence Lab@EdUHK
• fastai student
• I can speak a little Mandarin
• Loves learning
• Is a sci-tech geek and space nerd
• Has been to 15 countries
🔗 forbo7.github.io
📍 China
👀
December 26, 2024 at 5:41 AM
Granted, I can see a certain appeal to Fusion.
December 9, 2024 at 7:44 AM
Fusion is to Blender as MS Word is to Pages.

Fusion feels like MS Word: the workflow feels odd, some functions don't behave as you'd expect, others fail for no obvious reason, and certain logical keyboard shortcuts simply don't exist.
December 9, 2024 at 7:42 AM
The "Fillet" operation in Fusion should really be called "bevel"...it's reminding me of some fancy steak
December 9, 2024 at 7:25 AM
Whoops, not $1 but $0.01 😄
November 29, 2024 at 3:12 AM
Interesting; in that case, going for a GPU rental is much better than Colab!

You can get a similarly specced (13GB RAM / 113GB storage / 2 vCPU) RTX A4000 (which has an extra GB of VRAM and outperforms the T4) for roughly $1 less!
November 29, 2024 at 3:10 AM
Even if we use Colab's L4, which has the same VRAM as an RTX 4090, you can rent an RTX 4090 for ~$0.40–$0.45/hr, and it has more FLOPS. The L4 on Colab runs at ~$0.49/hr.

And then GPU rental services give you a whole Linux machine you can SSH into, whereas Colab is a notebook frontend plus terminal access.
November 27, 2024 at 3:31 AM
Interesting; from the screenshot, if we take a common GPU, the V100 is 4.91 credits/hr, so roughly 20.5 hours of runtime at ~$0.49/hr. In that case, a GPU rental service like Tensordock looks much more economical than Colab?

For $10 on Colab: 20.5 hours of V100 use
For $10 on Tensordock: 33 hours of V100 use
November 27, 2024 at 3:28 AM
Same RTX4090 machine is ~$0.45/hr.
November 27, 2024 at 12:31 AM
Interesting. Do you know how many hours you get in these "+" tiers? I suppose it comes down to how many GPU hours each upped tier actually gives you.

I just checked on Tensordock and you can get a V100 there (256GB/32GB/8vCPU) for ~$0.30/hr. Same A100 machine is ~$1.20/hr.
November 27, 2024 at 12:30 AM
Interesting. From what you've written, Paperspace seems like the better default deal, since the A4000 outperforms the T4 from what I've seen.

I've only used rent-by-the-hour GPUs; for reference, an A4000 is ~$0.11/hr and an RTX 4090 is ~$0.45/hr.

If you bite, tell us how it goes :P
November 27, 2024 at 12:26 AM
On a tangent: is the $10/mo proposition on Colab a better deal than going for GPU rentals à la Paperspace, Tensordock, etc.?
November 26, 2024 at 1:14 AM
To-the-point explanations with no fluff and no jargon. For example:

www.mathsisfun.com/combinatoric...

www.mathsisfun.com/data/weighte...
November 24, 2024 at 5:23 AM