We did our first webinar on June 23, 2020, and we wanted to follow up with a brief post recapping the topics covered and giving access to a recording of the webinar.
In the webinar, Tyler Whitehouse (CEO) and Dean Kleissas (CTO) presented some slides and gave a product demo. The intent was to explain why decentralization is the best way to scale collaboration and productivity for teams working in hybrid and multi-cloud environments.
Broadly speaking, decentralization is the attempt to enable data scientists to work across a variety of devices and resources in a self-service fashion. It is a flexible approach that, if done properly, can eliminate the cost and practical problems of centralized approaches. The problem is that decentralization requires a lot of technical skill and diligence.
We have found that the key to scaling a decentralized approach is to provide a lot of automation at the local level, not just in a managed cloud. Local automation drastically reduces the skill burden and the amount of time required to make decentralized approaches feasible.
The webinar did the following:
- Outlined the basic technical problems of collaboration and managing work for data science teams;
- Related these problems to cost and productivity concerns for their organizations;
- Explained centralized vs decentralized approaches in the context of hybrid and multi-cloud environments, and why decentralization is better;
- Explained how local automation can make decentralization robust & scalable;
- Demonstrated Gigantum's Client + Hub model for scaling collaboration and productivity.
The webinar featured a demo of creating automatically reproducible work in Jupyter or RStudio and showed how to use Gigantum to easily move reproducible work back and forth between GPU and CPU resources.
You can watch the webinar video by filling out the form below. We hope you enjoy it.