The Cloud vs On-Prem Debate is Over. Just Do Both.

Posted by Tyler Whitehouse on Apr 15, 2021 7:18:06 PM

When it comes to GPUs, the "cloud vs on-prem" debate is resolved. The answer is: just do both.

While that sounds crazy, it is now possible.

The first part of the story is an emerging class of GPU-enabled "data science workstations", i.e. evolved & leaner versions of gaming & design machines. For roughly the price of a high-end Mac, vendors (Dell, HP, Lenovo) now offer laptops with a powerful NVIDIA GPU that you can use whenever & wherever you want.

The practical and economic implications are huge, as you can see in this NVIDIA session at GTC2021. These workstations let you be strategic with cloud resources by keeping small-to-medium-sized compute local and only pushing to the cloud when needed. That lets you lower costs without sacrificing functionality.
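To make that concrete, here is a minimal Python sketch of the local-first, cloud-when-needed decision. It assumes PyTorch is installed, and the memory threshold and the `where_to_run` helper are illustrative placeholders, not a prescription:

```python
import torch

# Hypothetical working-set size for the job, in GiB (placeholder value).
REQUIRED_GPU_MEMORY_GIB = 12

def where_to_run(device_index: int = 0) -> str:
    """Decide whether a GPU job fits on the local workstation or should burst to the cloud."""
    if not torch.cuda.is_available():
        return "cloud"  # no local NVIDIA GPU at all

    total_gib = torch.cuda.get_device_properties(device_index).total_memory / 1024**3
    # Keep small-to-medium jobs local; only rent cloud GPUs when the job won't fit.
    return "local" if total_gib >= REQUIRED_GPU_MEMORY_GIB else "cloud"

if __name__ == "__main__":
    print(f"Run this workload on: {where_to_run()}")
```

The point is simply that the decision can be made per job, so the cloud becomes burst capacity rather than the default.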

You say "Thanks, but no thanks. Switching workloads between computational resources saves money, but it's a huge hassle."

We say "Not anymore. That problem is solved." 


This is where the second part comes in: a new approach that eliminates the headaches of managing everything yourself. Gigantum, a containerized workbench, provides reproducibility and portability by default without changing how you work. This means you can work, compute, share, and collaborate across any collection of machines you want, radically improving your day-to-day economics and supercharging team collaboration.

You say "Compute on whatever resource I want with a better experience than expensive SaaS platforms? Not possible."

We say "Not just possible. Easy."
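If you want a feel for the underlying container pattern that makes this kind of portability possible, here is a generic Python sketch. It is not Gigantum's actual implementation; the image name, mount path, and SSH host are hypothetical. The idea is that the same GPU-enabled environment launches identically on your laptop or on a cloud VM:

```python
import subprocess
from typing import Optional

# Hypothetical environment image built for the project.
IMAGE = "myteam/project-env:latest"

def launch(host: Optional[str] = None) -> None:
    """Run the same containerized environment locally or on a remote GPU host."""
    docker_cmd = [
        "docker", "run", "--rm",
        "--gpus", "all",                    # expose the NVIDIA GPU to the container
        "-v", "/data/project:/workspace",   # identical project mount everywhere
        IMAGE, "python", "train.py",
    ]
    if host is None:
        subprocess.run(docker_cmd, check=True)             # local workstation
    else:
        # Same command, just executed on the remote machine over SSH.
        subprocess.run(["ssh", host] + docker_cmd, check=True)

# launch()                   # use the GPU in your laptop
# launch("user@cloud-gpu")   # burst the identical environment to a rented instance
```

Because the environment travels with the work, switching machines stops being a migration and becomes just a launch target.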


To get an idea of how easy it is, see our GTC2021 demo here, check out the short commercial below, or do a deep dive at our upcoming webinar.


Topics: Data Science, Nvidia, Multi-Cloud, On-Premises, GPU