My First Experience Using RAPIDS

What I learned from using the GPU-accelerated data science framework RAPIDS for the first time.

Modeling Unstructured Data Using Snowflake and Saturn Cloud

Our Snowflake quickstart example walks through using unstructured data in Snowflake.

Strategies for Managing Big Data

There are many different approaches to handling data that won't fit in memory on a single machine.

Host a Jupyter Notebook as an API

Do you have a Jupyter Notebook that you want to run every time an API is called? You can do that with Saturn Cloud jobs.

If You Can Write Functions, You Can Use Dask

Many data scientists don't know where to start with the distributed computing framework Dask. Good news--it's often no more work than just …

Multi-GPU TensorFlow on Saturn Cloud

If your machine has multiple GPUs, you can train a TensorFlow model across all of them at once.

Speeding up Neural Network Training With Multiple GPUs and Dask

By combining Dask and PyTorch you can easily speed up training a model across a cluster of GPUs. But how much of a benefit does that …

Dealing with Long-Running Jupyter Notebooks

Jupyter struggles with long-running notebooks--one hiccup in your connection and execution can be cut short. Here is a solution to manage …

Just Start with the Dask LocalCluster

You can get lots of value from Dask without ever using a distributed cluster. Try the LocalCluster instead!