Handy Dandy Guide to Working With Timestamps in pandas
Using times in pandas can sometimes be tricky: this blog post covers the most common problems.
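To set the stage, here is a minimal sketch of the kind of timestamp handling this post deals with. It assumes stock pandas; the timestamp values and timezone names are purely illustrative. It shows parsing strings into a `DatetimeIndex`, localizing to a timezone, and the classic gotcha that timezone-naive and timezone-aware timestamps behave differently.

```python
import pandas as pd

# Parse string timestamps into a DatetimeIndex (naive: no timezone attached)
ts = pd.to_datetime(["2021-01-15 09:30:00", "2021-01-15 17:45:00"])

# Attach a timezone (localize), then convert to another one
ts = ts.tz_localize("UTC").tz_convert("US/Eastern")
print(ts[0])  # 09:30 UTC rendered as early morning US/Eastern time

# Common gotcha: a naive Timestamp and an aware Timestamp are not interchangeable
aware = pd.Timestamp("2021-01-15 09:30:00", tz="UTC")
naive = pd.Timestamp("2021-01-15 09:30:00")
print(aware.tzinfo, naive.tzinfo)  # one has tz info, the other is None
```

Comparing `aware` and `naive` directly raises a `TypeError` in pandas, which is one of the most common sources of confusion when mixing data from different sources.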
