Live Notebook

You can run this notebook in a live session on Binder, or view it on GitHub.

Automate Machine Learning with TPOT

This example shows how TPOT can be used with Dask.

TPOT is an automated machine learning library. It evaluates many scikit-learn pipelines and hyperparameter combinations to find a model that works well for your data. Evaluating all these pipelines is computationally expensive, but amenable to parallelism. TPOT can use Dask to distribute these computations across a cluster of machines.

This notebook can be run interactively on the Dask examples Binder. The following video shows a larger version of this notebook running on a cluster.

[1]:
from IPython.display import HTML

HTML('<div style="position:relative;height:0;padding-bottom:56.25%"><iframe src="https://www.youtube.com/embed/uyx9nBuOYQQ?ecver=2" width="640" height="360" frameborder="0" allow="autoplay; encrypted-media" style="position:absolute;width:100%;height:100%;left:0" allowfullscreen></iframe></div>')
[1]:
[2]:
!pip install tpot
Collecting tpot
  Downloading TPOT-0.11.6.post1-py3-none-any.whl (86 kB)
     |████████████████████████████████| 86 kB 7.9 MB/s
Requirement already satisfied: numpy>=1.16.3 in /usr/share/miniconda3/envs/dask-examples/lib/python3.8/site-packages (from tpot) (1.18.5)
Requirement already satisfied: scikit-learn>=0.22.0 in /usr/share/miniconda3/envs/dask-examples/lib/python3.8/site-packages (from tpot) (0.23.2)
Requirement already satisfied: scipy>=1.3.1 in /usr/share/miniconda3/envs/dask-examples/lib/python3.8/site-packages (from tpot) (1.5.3)
Requirement already satisfied: joblib>=0.13.2 in /usr/share/miniconda3/envs/dask-examples/lib/python3.8/site-packages (from tpot) (0.17.0)
Collecting update-checker>=0.16
  Downloading update_checker-0.18.0-py3-none-any.whl (7.0 kB)
Collecting stopit>=1.1.1
  Downloading stopit-1.1.2.tar.gz (18 kB)
Collecting deap>=1.2
  Downloading deap-1.3.1-cp38-cp38-manylinux2010_x86_64.whl (157 kB)
     |████████████████████████████████| 157 kB 31.0 MB/s
Requirement already satisfied: pandas>=0.24.2 in /usr/share/miniconda3/envs/dask-examples/lib/python3.8/site-packages (from tpot) (1.0.5)
Requirement already satisfied: tqdm>=4.36.1 in /usr/share/miniconda3/envs/dask-examples/lib/python3.8/site-packages (from tpot) (4.52.0)
Requirement already satisfied: threadpoolctl>=2.0.0 in /usr/share/miniconda3/envs/dask-examples/lib/python3.8/site-packages (from scikit-learn>=0.22.0->tpot) (2.1.0)
Requirement already satisfied: requests>=2.3.0 in /usr/share/miniconda3/envs/dask-examples/lib/python3.8/site-packages (from update-checker>=0.16->tpot) (2.25.0)
Requirement already satisfied: pytz>=2017.2 in /usr/share/miniconda3/envs/dask-examples/lib/python3.8/site-packages (from pandas>=0.24.2->tpot) (2020.4)
Requirement already satisfied: python-dateutil>=2.6.1 in /usr/share/miniconda3/envs/dask-examples/lib/python3.8/site-packages (from pandas>=0.24.2->tpot) (2.8.1)
Requirement already satisfied: idna<3,>=2.5 in /usr/share/miniconda3/envs/dask-examples/lib/python3.8/site-packages (from requests>=2.3.0->update-checker>=0.16->tpot) (2.10)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in /usr/share/miniconda3/envs/dask-examples/lib/python3.8/site-packages (from requests>=2.3.0->update-checker>=0.16->tpot) (1.25.11)
Requirement already satisfied: chardet<4,>=3.0.2 in /usr/share/miniconda3/envs/dask-examples/lib/python3.8/site-packages (from requests>=2.3.0->update-checker>=0.16->tpot) (3.0.4)
Requirement already satisfied: certifi>=2017.4.17 in /usr/share/miniconda3/envs/dask-examples/lib/python3.8/site-packages (from requests>=2.3.0->update-checker>=0.16->tpot) (2020.11.8)
Requirement already satisfied: six>=1.5 in /usr/share/miniconda3/envs/dask-examples/lib/python3.8/site-packages (from python-dateutil>=2.6.1->pandas>=0.24.2->tpot) (1.15.0)
Building wheels for collected packages: stopit
  Building wheel for stopit (setup.py) ... done
  Created wheel for stopit: filename=stopit-1.1.2-py3-none-any.whl size=11956 sha256=b05eb3e65aec70196c963ddae0f65d89102fe3da4e8082da2415f0cf75da0b13
  Stored in directory: /home/runner/.cache/pip/wheels/a8/bb/8f/6b9328d23c2dcedbfeb8498b9f650d55d463089e3b8fc0bfb2
Successfully built stopit
Installing collected packages: update-checker, stopit, deap, tpot
Successfully installed deap-1.3.1 stopit-1.1.2 tpot-0.11.6.post1 update-checker-0.18.0
[3]:
import tpot
from tpot import TPOTClassifier
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
/usr/share/miniconda3/envs/dask-examples/lib/python3.8/site-packages/tpot/builtins/__init__.py:36: UserWarning: Warning: optional dependency `torch` is not available. - skipping import of NN models.
  warnings.warn("Warning: optional dependency `torch` is not available. - skipping import of NN models.")

Setup Dask

We first start a Dask client in order to get access to the Dask dashboard, which will provide progress and performance metrics.

You can view the dashboard by clicking on the dashboard link after you run the cell.

[4]:
from dask.distributed import Client
client = Client(n_workers=4, threads_per_worker=1)
client
[4]:

Client

Cluster

  • Workers: 4
  • Cores: 4
  • Memory: 7.29 GB

Create Data

We’ll use the digits dataset. To ensure the example runs quickly, we’ll make the training dataset relatively small.

[5]:
digits = load_digits()

X_train, X_test, y_train, y_test = train_test_split(
    digits.data,
    digits.target,
    train_size=0.05,
    test_size=0.95,
)

These are just small, in-memory NumPy arrays. This example is not applicable to larger-than-memory Dask arrays.
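These sizes are easy to verify directly. The following standalone check re-runs the same split; the `random_state` argument is an addition here for reproducibility and is not part of the original cell:

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split

digits = load_digits()  # 1,797 8x8 digit images, flattened to 64 features

X_train, X_test, y_train, y_test = train_test_split(
    digits.data,
    digits.target,
    train_size=0.05,   # keep only 5% of the data for training
    test_size=0.95,
    random_state=0,    # added for reproducibility
)

print(X_train.shape)  # (89, 64) -- small enough to fit comfortably in memory
print(X_test.shape)   # (1708, 64)
```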

Using Dask

TPOT follows the scikit-learn API; we specify a TPOTClassifier with a few hyperparameters, and then fit it on some data. By default, TPOT trains on your single machine. To ensure your cluster is used, specify the use_dask keyword.

[6]:
# scale up: Increase the TPOT parameters like population_size, generations
tp = TPOTClassifier(
    generations=2,
    population_size=10,
    cv=2,
    n_jobs=-1,
    random_state=0,
    verbosity=0,
    config_dict=tpot.config.classifier_config_dict_light,
    use_dask=True,
)
[7]:
tp.fit(X_train, y_train)
[7]:
TPOTClassifier(config_dict={'sklearn.cluster.FeatureAgglomeration': {'affinity': ['euclidean',
                                                                                  'l1',
                                                                                  'l2',
                                                                                  'manhattan',
                                                                                  'cosine'],
                                                                     'linkage': ['ward',
                                                                                 'complete',
                                                                                 'average']},
                            'sklearn.decomposition.PCA': {'iterated_power': range(1, 11),
                                                          'svd_solver': ['randomized']},
                            'sklearn.feature_selection.SelectFwe': {'alpha': array([0.   , 0.001, 0.002, 0.003, 0.004, 0.005, 0.006, 0.007...
                                                                          'max']},
                            'sklearn.preprocessing.RobustScaler': {},
                            'sklearn.preprocessing.StandardScaler': {},
                            'sklearn.tree.DecisionTreeClassifier': {'criterion': ['gini',
                                                                                  'entropy'],
                                                                    'max_depth': range(1, 11),
                                                                    'min_samples_leaf': range(1, 21),
                                                                    'min_samples_split': range(2, 21)},
                            'tpot.builtins.ZeroCount': {}},
               cv=2, generations=2, n_jobs=-1, population_size=10,
               random_state=0, use_dask=True)

Learn More

See the Dask-ML and TPOT documentation for more information on using Dask with TPOT.