Multi-GPU TensorFlow on Saturn Cloud
If your machine has multiple GPUs, you can train a TensorFlow model across all of them at once.
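As a minimal sketch of what this looks like, TensorFlow's `tf.distribute.MirroredStrategy` replicates a Keras model onto every visible GPU and averages gradients across replicas (the layer sizes below are illustrative, and the strategy falls back to a single CPU replica if no GPU is found):

```python
import tensorflow as tf

# MirroredStrategy discovers all local GPUs and mirrors model variables
# across them; on a machine with no GPU it uses one CPU replica.
strategy = tf.distribute.MirroredStrategy()
print(f"Replicas in sync: {strategy.num_replicas_in_sync}")

# Build and compile the model inside the strategy scope so its
# variables are created as mirrored variables.
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation="relu", input_shape=(10,)),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")

# A subsequent model.fit(...) then trains across all GPUs with no
# further code changes.
```

The only change from single-device training is creating the model and optimizer inside `strategy.scope()`; the training loop itself stays the same.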

Many data scientists don't know where to start with the distributed framework Dask. Good news: it's often no more work than writing a Python function.
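To illustrate how little extra work that can be, here is a sketch using `dask.delayed` (the functions are hypothetical examples, not from the article): decorating an ordinary Python function makes it lazy, and Dask runs the independent calls in parallel when you compute the result.

```python
import dask

# dask.delayed turns a plain function into a lazy task; calls build
# a task graph instead of executing immediately.
@dask.delayed
def double(x):
    return 2 * x

@dask.delayed
def add(a, b):
    return a + b

# Nothing has run yet -- this is just a graph of three tasks.
result = add(double(3), double(4))

# .compute() executes the graph; the two double() calls can run in parallel.
print(result.compute())  # -> 14
```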

By combining Dask and PyTorch you can easily speed up training a model across a cluster of GPUs. But how much of a benefit does that …

Jupyter struggles with long-running notebooks: one hiccup in your connection and the execution can cancel. Here is a solution to manage …

You can get lots of value from Dask without even using a distributed cluster. Try using the LocalCluster instead!
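A minimal sketch of that idea, assuming Dask's `distributed` package is installed: `LocalCluster` runs the scheduler and workers entirely on your own machine, so you get the Dask API and dashboard without any remote infrastructure (the worker counts below are illustrative).

```python
from dask.distributed import Client, LocalCluster
import dask.array as da

# LocalCluster starts a scheduler and workers as local processes --
# no remote cluster required.
cluster = LocalCluster(n_workers=2, threads_per_worker=1)
client = Client(cluster)

# Chunked work is spread across the local workers just as it would
# be on a distributed cluster.
x = da.ones((1000, 1000), chunks=(250, 250))
total = x.sum().compute()
print(total)  # 1000000.0

client.close()
cluster.close()
```

Because the code is identical either way, you can develop against a `LocalCluster` and later point the same `Client` at a real distributed cluster.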

In the final part of this three-part series, we cover how to take a trained model and deploy it as an API.

In part two of this three-part series, we cover how to take a trained model and make an interactive web app from it.

In part one of this three-part series, we cover how to train a model to deploy as a dashboard or API.

How to automate your complex tasks using Saturn Cloud, Dask, and Prefect.

While Saturn Cloud provides client resources to connect to Dask clusters, you can also directly connect from external locations.