You can run this notebook in a live session on Binder, or view it on GitHub.

Analyze web-hosted JSON data

This notebook reads and processes JSON-encoded data hosted on the web using a combination of Dask Bag and Dask Dataframe.

This data comes from mybinder.org, a web service that runs Jupyter notebooks live on the web (you may be running this notebook there right now). mybinder.org publishes a record every time someone launches a live notebook like this one, and stores those records in a publicly accessible JSON Lines file, one file per day.

Introduction to the dataset

This data is stored as JSON-encoded text files on the public web. Here are some example lines.

[1]:
import dask.bag as db
db.read_text('https://archive.analytics.mybinder.org/events-2018-11-03.jsonl').take(3)
[1]:
('{"timestamp": "2018-11-03T00:00:00+00:00", "schema": "binderhub.jupyter.org/launch", "version": 1, "provider": "GitHub", "spec": "Qiskit/qiskit-tutorial/master", "status": "success"}\n',
 '{"timestamp": "2018-11-03T00:00:00+00:00", "schema": "binderhub.jupyter.org/launch", "version": 1, "provider": "GitHub", "spec": "ipython/ipython-in-depth/master", "status": "success"}\n',
 '{"timestamp": "2018-11-03T00:00:00+00:00", "schema": "binderhub.jupyter.org/launch", "version": 1, "provider": "GitHub", "spec": "QISKit/qiskit-tutorial/master", "status": "success"}\n')

We see that the file contains one line for every launch of a live notebook on the site, recording the time the notebook was started as well as the repository from which it was served.

In this notebook we’ll look at many such files, parse them from JSON to Python dictionaries, and then from there to Pandas dataframes. We’ll then do some simple analyses on this data.
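As a minimal illustration of that first step, the standard-library json module turns one of the example lines above into a plain Python dictionary (an illustrative snippet, not one of the notebook's cells):

import json

line = '{"timestamp": "2018-11-03T00:00:00+00:00", "schema": "binderhub.jupyter.org/launch", "version": 1, "provider": "GitHub", "spec": "Qiskit/qiskit-tutorial/master", "status": "success"}\n'
record = json.loads(line)  # dict with keys 'timestamp', 'schema', 'version', 'provider', 'spec', 'status'
record['spec']             # 'Qiskit/qiskit-tutorial/master'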

Start Dask Client for Dashboard

Starting the Dask Client is optional. It will start the dashboard, which is useful for gaining insight into the computation.

[2]:
from dask.distributed import Client, progress
client = Client(threads_per_worker=1,
                n_workers=4,
                memory_limit='2GB')
client
[2]:

Client

Cluster

  • Workers: 4
  • Cores: 4
  • Memory: 8.00 GB
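As an aside, if you want to open the dashboard in its own browser tab, its URL is available on the client object (dashboard_link is a standard attribute of distributed.Client):

client.dashboard_link  # e.g. 'http://127.0.0.1:8787/status'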

Get a list of files on the web

The mybinder.org team maintains an index file that points to all of the available JSON data files. Let's convert this index into the list of URLs that we'll read in the next section.

[3]:
import dask.bag as db
import json
[4]:
db.read_text('https://archive.analytics.mybinder.org/index.jsonl').map(json.loads).compute()
[4]:
[{'name': 'events-2018-11-03.jsonl', 'date': '2018-11-03', 'count': '7057'},
 {'name': 'events-2018-11-04.jsonl', 'date': '2018-11-04', 'count': '7489'},
 {'name': 'events-2018-11-05.jsonl', 'date': '2018-11-05', 'count': '13590'},
 {'name': 'events-2018-11-06.jsonl', 'date': '2018-11-06', 'count': '13920'},
 {'name': 'events-2018-11-07.jsonl', 'date': '2018-11-07', 'count': '12766'},
 {'name': 'events-2018-11-08.jsonl', 'date': '2018-11-08', 'count': '14105'},
 {'name': 'events-2018-11-09.jsonl', 'date': '2018-11-09', 'count': '11843'},
 {'name': 'events-2018-11-10.jsonl', 'date': '2018-11-10', 'count': '7047'},
 {'name': 'events-2018-11-11.jsonl', 'date': '2018-11-11', 'count': '6940'},
 {'name': 'events-2018-11-12.jsonl', 'date': '2018-11-12', 'count': '16322'},
 {'name': 'events-2018-11-13.jsonl', 'date': '2018-11-13', 'count': '16530'},
 {'name': 'events-2018-11-14.jsonl', 'date': '2018-11-14', 'count': '14099'},
 {'name': 'events-2018-11-15.jsonl', 'date': '2018-11-15', 'count': '13182'},
 {'name': 'events-2018-11-16.jsonl', 'date': '2018-11-16', 'count': '12863'},
 {'name': 'events-2018-11-17.jsonl', 'date': '2018-11-17', 'count': '6490'},
 {'name': 'events-2018-11-18.jsonl', 'date': '2018-11-18', 'count': '7310'},
 {'name': 'events-2018-11-19.jsonl', 'date': '2018-11-19', 'count': '13348'},
 {'name': 'events-2018-11-20.jsonl', 'date': '2018-11-20', 'count': '13982'},
 {'name': 'events-2018-11-21.jsonl', 'date': '2018-11-21', 'count': '13165'},
 {'name': 'events-2018-11-22.jsonl', 'date': '2018-11-22', 'count': '12217'},
 {'name': 'events-2018-11-23.jsonl', 'date': '2018-11-23', 'count': '9070'},
 {'name': 'events-2018-11-24.jsonl', 'date': '2018-11-24', 'count': '6798'},
 {'name': 'events-2018-11-25.jsonl', 'date': '2018-11-25', 'count': '6796'},
 {'name': 'events-2018-11-26.jsonl', 'date': '2018-11-26', 'count': '13617'},
 {'name': 'events-2018-11-27.jsonl', 'date': '2018-11-27', 'count': '14964'},
 {'name': 'events-2018-11-28.jsonl', 'date': '2018-11-28', 'count': '14434'},
 {'name': 'events-2018-11-29.jsonl', 'date': '2018-11-29', 'count': '13845'},
 {'name': 'events-2018-11-30.jsonl', 'date': '2018-11-30', 'count': '12109'},
 {'name': 'events-2018-12-01.jsonl', 'date': '2018-12-01', 'count': '6785'},
 {'name': 'events-2018-12-02.jsonl', 'date': '2018-12-02', 'count': '7119'},
 {'name': 'events-2018-12-03.jsonl', 'date': '2018-12-03', 'count': '13946'},
 {'name': 'events-2018-12-04.jsonl', 'date': '2018-12-04', 'count': '13765'},
 {'name': 'events-2018-12-05.jsonl', 'date': '2018-12-05', 'count': '13106'},
 {'name': 'events-2018-12-06.jsonl', 'date': '2018-12-06', 'count': '12249'},
 {'name': 'events-2018-12-07.jsonl', 'date': '2018-12-07', 'count': '10687'},
 {'name': 'events-2018-12-08.jsonl', 'date': '2018-12-08', 'count': '6269'},
 {'name': 'events-2018-12-09.jsonl', 'date': '2018-12-09', 'count': '6639'},
 {'name': 'events-2018-12-10.jsonl', 'date': '2018-12-10', 'count': '12782'},
 {'name': 'events-2018-12-11.jsonl', 'date': '2018-12-11', 'count': '13442'},
 {'name': 'events-2018-12-12.jsonl', 'date': '2018-12-12', 'count': '13069'},
 {'name': 'events-2018-12-13.jsonl', 'date': '2018-12-13', 'count': '15279'},
 {'name': 'events-2018-12-14.jsonl', 'date': '2018-12-14', 'count': '9941'},
 {'name': 'events-2018-12-15.jsonl', 'date': '2018-12-15', 'count': '5358'},
 {'name': 'events-2018-12-16.jsonl', 'date': '2018-12-16', 'count': '6441'},
 {'name': 'events-2018-12-17.jsonl', 'date': '2018-12-17', 'count': '11332'},
 {'name': 'events-2018-12-18.jsonl', 'date': '2018-12-18', 'count': '11971'},
 {'name': 'events-2018-12-19.jsonl', 'date': '2018-12-19', 'count': '10818'},
 {'name': 'events-2018-12-20.jsonl', 'date': '2018-12-20', 'count': '9408'},
 {'name': 'events-2018-12-21.jsonl', 'date': '2018-12-21', 'count': '7741'},
 {'name': 'events-2018-12-22.jsonl', 'date': '2018-12-22', 'count': '4818'},
 {'name': 'events-2018-12-23.jsonl', 'date': '2018-12-23', 'count': '4870'},
 {'name': 'events-2018-12-24.jsonl', 'date': '2018-12-24', 'count': '5974'},
 {'name': 'events-2018-12-25.jsonl', 'date': '2018-12-25', 'count': '4737'},
 {'name': 'events-2018-12-26.jsonl', 'date': '2018-12-26', 'count': '6725'},
 {'name': 'events-2018-12-27.jsonl', 'date': '2018-12-27', 'count': '7998'},
 {'name': 'events-2018-12-28.jsonl', 'date': '2018-12-28', 'count': '8155'},
 {'name': 'events-2018-12-29.jsonl', 'date': '2018-12-29', 'count': '5108'},
 {'name': 'events-2018-12-30.jsonl', 'date': '2018-12-30', 'count': '4428'},
 {'name': 'events-2018-12-31.jsonl', 'date': '2018-12-31', 'count': '4561'},
 {'name': 'events-2019-01-01.jsonl', 'date': '2019-01-01', 'count': '4194'},
 {'name': 'events-2019-01-02.jsonl', 'date': '2019-01-02', 'count': '8559'},
 {'name': 'events-2019-01-03.jsonl', 'date': '2019-01-03', 'count': '9687'},
 {'name': 'events-2019-01-04.jsonl', 'date': '2019-01-04', 'count': '10048'},
 {'name': 'events-2019-01-05.jsonl', 'date': '2019-01-05', 'count': '6012'},
 {'name': 'events-2019-01-06.jsonl', 'date': '2019-01-06', 'count': '6019'},
 {'name': 'events-2019-01-07.jsonl', 'date': '2019-01-07', 'count': '11903'},
 {'name': 'events-2019-01-08.jsonl', 'date': '2019-01-08', 'count': '12777'},
 {'name': 'events-2019-01-09.jsonl', 'date': '2019-01-09', 'count': '13294'},
 {'name': 'events-2019-01-10.jsonl', 'date': '2019-01-10', 'count': '13112'},
 {'name': 'events-2019-01-11.jsonl', 'date': '2019-01-11', 'count': '10327'},
 {'name': 'events-2019-01-12.jsonl', 'date': '2019-01-12', 'count': '6434'},
 {'name': 'events-2019-01-13.jsonl', 'date': '2019-01-13', 'count': '7004'},
 {'name': 'events-2019-01-14.jsonl', 'date': '2019-01-14', 'count': '12898'},
 {'name': 'events-2019-01-15.jsonl', 'date': '2019-01-15', 'count': '12363'},
 {'name': 'events-2019-01-16.jsonl', 'date': '2019-01-16', 'count': '13444'},
 {'name': 'events-2019-01-17.jsonl', 'date': '2019-01-17', 'count': '14452'},
 {'name': 'events-2019-01-18.jsonl', 'date': '2019-01-18', 'count': '12056'},
 {'name': 'events-2019-01-19.jsonl', 'date': '2019-01-19', 'count': '7590'},
 {'name': 'events-2019-01-20.jsonl', 'date': '2019-01-20', 'count': '6740'},
 {'name': 'events-2019-01-21.jsonl', 'date': '2019-01-21', 'count': '12507'},
 {'name': 'events-2019-01-22.jsonl', 'date': '2019-01-22', 'count': '15355'},
 {'name': 'events-2019-01-23.jsonl', 'date': '2019-01-23', 'count': '16319'},
 {'name': 'events-2019-01-24.jsonl', 'date': '2019-01-24', 'count': '16732'},
 {'name': 'events-2019-01-25.jsonl', 'date': '2019-01-25', 'count': '13642'},
 {'name': 'events-2019-01-26.jsonl', 'date': '2019-01-26', 'count': '6976'},
 {'name': 'events-2019-01-27.jsonl', 'date': '2019-01-27', 'count': '7570'},
 {'name': 'events-2019-01-28.jsonl', 'date': '2019-01-28', 'count': '15906'},
 {'name': 'events-2019-01-29.jsonl', 'date': '2019-01-29', 'count': '15534'},
 {'name': 'events-2019-01-30.jsonl', 'date': '2019-01-30', 'count': '15183'},
 {'name': 'events-2019-01-31.jsonl', 'date': '2019-01-31', 'count': '14421'},
 {'name': 'events-2019-02-01.jsonl', 'date': '2019-02-01', 'count': '12352'},
 {'name': 'events-2019-02-02.jsonl', 'date': '2019-02-02', 'count': '7113'},
 {'name': 'events-2019-02-03.jsonl', 'date': '2019-02-03', 'count': '7331'},
 {'name': 'events-2019-02-04.jsonl', 'date': '2019-02-04', 'count': '14493'},
 {'name': 'events-2019-02-05.jsonl', 'date': '2019-02-05', 'count': '14053'},
 {'name': 'events-2019-02-06.jsonl', 'date': '2019-02-06', 'count': '14072'}]
[5]:
filenames = (db.read_text('https://archive.analytics.mybinder.org/index.jsonl')
               .map(json.loads)
               .pluck('name')
               .compute())

filenames = ['https://archive.analytics.mybinder.org/' + fn for fn in filenames]
filenames[:5]
[5]:
['https://archive.analytics.mybinder.org/events-2018-11-03.jsonl',
 'https://archive.analytics.mybinder.org/events-2018-11-04.jsonl',
 'https://archive.analytics.mybinder.org/events-2018-11-05.jsonl',
 'https://archive.analytics.mybinder.org/events-2018-11-06.jsonl',
 'https://archive.analytics.mybinder.org/events-2018-11-07.jsonl']

Create Bag of all events

We now create a Dask Bag around that list of URLs, and then call the json.loads function on every line to turn those lines of JSON-encoded text into Python dictionaries that can be more easily manipulated.

[6]:
events = db.read_text(filenames).map(json.loads)
events.take(2)
[6]:
({'timestamp': '2018-11-03T00:00:00+00:00',
  'schema': 'binderhub.jupyter.org/launch',
  'version': 1,
  'provider': 'GitHub',
  'spec': 'Qiskit/qiskit-tutorial/master',
  'status': 'success'},
 {'timestamp': '2018-11-03T00:00:00+00:00',
  'schema': 'binderhub.jupyter.org/launch',
  'version': 1,
  'provider': 'GitHub',
  'spec': 'ipython/ipython-in-depth/master',
  'status': 'success'})
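Before reaching for a dataframe, we could already answer simple questions directly on the Bag. As a sketch (pluck, frequencies, and topk are standard Dask Bag methods, though this snippet is illustrative rather than an executed cell), counting launches per repository would look like this:

events.pluck('spec').frequencies().topk(20, key=lambda kv: kv[1]).compute()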

Convert to Dask Dataframe

Finally, we can convert our bag of Python dictionaries into a Dask Dataframe, and follow up with more Pandas-like computations.

We’ll repeat the per-repository launch count, now with Pandas syntax.

[8]:
df = events.to_dataframe()
df.head()
[8]:
provider schema spec status timestamp version
0 GitHub binderhub.jupyter.org/launch Qiskit/qiskit-tutorial/master success 2018-11-03T00:00:00+00:00 1
1 GitHub binderhub.jupyter.org/launch ipython/ipython-in-depth/master success 2018-11-03T00:00:00+00:00 1
2 GitHub binderhub.jupyter.org/launch QISKit/qiskit-tutorial/master success 2018-11-03T00:00:00+00:00 1
3 GitHub binderhub.jupyter.org/launch QISKit/qiskit-tutorial/master success 2018-11-03T00:01:00+00:00 1
4 GitHub binderhub.jupyter.org/launch jupyterlab/jupyterlab-demo/master success 2018-11-03T00:01:00+00:00 1
[9]:
df.spec.value_counts().nlargest(20).to_frame().compute()
[9]:
spec
ipython/ipython-in-depth/master 485501
jupyterlab/jupyterlab-demo/master 124944
ines/spacy-io-binder/live 44059
bokeh/bokeh-notebooks/master 24069
DS-100/textbook/master 22542
binder-examples/r/master 21242
rationalmatter/juno-demo-notebooks/master 15256
QuantStack/xeus-cling/stable 13600
RasaHQ/rasa_core/master 10952
QISKit/qiskit-tutorial/master 10648
numba/numba-examples/master 7507
binder-examples/julia-python/master 7401
minrk/ligo-binder/master 5305
nteract/examples/master 4976
annierak/desert_open_space_geometry/master 4466
data-8/textbook/gh-pages 4390
dask/dask-examples/master 3721
binder-examples/requirements/master 3424
trekhleb/homemade-machine-learning/master 3238
mkozturk/CMPE140/master 3159

Persist in memory

This dataset fits comfortably in memory. Let's avoid downloading the data every time we do an operation and instead keep it in memory locally.

[10]:
df = df.persist()
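If you'd like to watch the data being loaded, the progress function imported alongside the client can display a progress bar for the persisted collection (an optional aside):

progress(df)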

Honestly, at this point it makes more sense to just switch to Pandas, but this is a Dask example, so we’ll continue with the Dask dataframe.
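For reference, that switch would be a one-liner: calling compute() on the (now in-memory) Dask dataframe concatenates its partitions into an ordinary pandas DataFrame.

pandas_df = df.compute()  # a plain pandas.DataFrame, if you'd rather continue locally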

Investigate providers other than GitHub

Most binders are specified as git repositories on GitHub, but not all. Let's investigate the other providers.

[11]:
import urllib
[12]:
df.provider.value_counts().compute()
[12]:
GitHub    1014840
GitLab       4266
Git           785
Name: provider, dtype: int64
[13]:
(df[df.provider == 'GitLab']
 .spec
 .map(urllib.parse.unquote, meta=('spec', object))
 .value_counts()
 .to_frame()
 .compute())
[13]:
spec
rruizz/inforfis/master 746
rruizz/inforfis/R 508
bhugueney/cxx-init-for-python-dev/master 458
dbernhard/PythonM1/master 378
shadaba/lss-handson/master 244
dbernhard/JavaM2/master 234
albert.van.breemen/masterclassdeeplearning/master 161
DGothrek/ipyaggrid/binder-demo 132
clemej/data601-clemens-fall18/master 132
dbernhard/ProgHumaNumTAL/master 105
wichit2s/programmingfundamentals/master 104
dbernhard/pythonm1s2/master 78
andrey.kovalev/imagination/master 69
rruizz/inforfis/autin 56
wichit2s/pythonintro/master 54
PersonalDataIO/toronto-letter/master 54
rokm/informed_action/master 35
slloyd/python-introduction/master 34
amarandon/presentation-jupyter/master 33
sgmarkets/sgmarkets-api-notebooks/master 33
kitsunix/pyHIBP/pyHIBP-binder/master 32
open-scientist/python-express/urfist-strasbourg/cocreation 30
thoma.rey/FV_HipoDiff/master 29
PersonalDataIO/gdpr-rights-notebook/master 27
codeearth/drawdown/master 26
davecg/introduction-to-python/master 23
jibe-b/prof/dev 16
rruizz/inforfis/autin-test 15
dlab-indecol/tutorial_notebooks/master 14
butzked/pylolaos/master 14
... ...
gitlab-org/release/docs/master 2
oheoh/info-notebooks/master 2
PersonalDataIO/notebook-example/db60921d05c0424b6639f6ac4c95661530a30574 1
ssorcerer/reports/master 1
s-boardman/peco/8d95e4fa08541d2521d2a3ecf43414a8aa6b79bc 1
RTDS_DEV/RTDS_Summary/master 1
PersonalDataIO/facebook-birthday-breach/master 1
Sabanov/cs231n.stanford.edu/master 1
vaulter82/Prion-stats/master 1
wallacmj/notebooks/master 1
rdubwiley/mi_campaign_finance/master 1
prokopevav85/geojsontest/master 1
fkohrt/Notebooks/master 1
nmg/704p2/master 1
jibe-b/sabre/master 1
g_money/folio_track/master 1
damonlh/puma/master 1
daksh7011/metis/master 1
hassakura/integracao_low_eans/master 1
hassakura/teste_exportcsv/master 1
hixi/colloqium-presentation/master 1
jibe-b/sens-de-la-vie-workflow-brouillon-tests-tuto/dev 1
agrumery/aGrUM/0.13.5 1
anarcat/terms-benchmarks/master 1
kerel-fs/jupyter-notebooks/master 1
ktiwari9/gaussian-process/master 1
lokeller/matrix-bot/master 1
david-chen/test/master 1
mfleck/test_mybinder_org/master 1
jibe-b/crowdsource-science-improvement/dev 1

127 rows × 1 columns

[14]:
(df[df.provider == 'Git']
 .spec
 .apply(urllib.parse.unquote, meta=('spec', object))
 .value_counts()
 .to_frame()
 .compute())
[14]:
spec
https://bitbucket.org/gaur/1820/a4027afe0aa592d49f3989b2e9c8136c36322a77 75
https://gricad-gitlab.univ-grenoble-alpes.fr/nonsmooth/siconos-tutorial.git/b08a0514b22b3927b58bddce3c4018f27ac0fc7d 52
https://collaborating.tuhh.de/cip3725/ib_base.git/0a1f4f66a1a3c29ff347b2abc79bb292b0be17ca 51
https://gricad-gitlab.univ-grenoble-alpes.fr/chatelaf/conference-ia/d98a199f66d603b0b1e7c25fbe1341d29a40cd39 41
https://git.rcc.uchicago.edu/jhskone/multiproc_py.git/38f9bb6ce3602b73a8ddd1dbcad3f5f9a8d21f6a 34
https://gitlab.mech.kuleuven.be/rob-expressiongraphs/docker/etasl-binder.git/127d6c9f33a938a98607505ef30237b5223000e4 33
https://risk-engineering.org/git/notebooks.git/527b4fdaa5fe43294ed5de196b96262b1c60522b 33
https://gitlab.oceantrack.org/otndc/fact-workshop/a174f5bc60cf9f0c5b86851204c7c85fcfb98131 32
https://bitbucket.org/qlouder/lumc-ml-caffe/acc2701f43f267361594e72980ec59602dab7fd7 27
https://framagit.org/mfauvel/omp_machine_learning/77d246991186d758147fc280eeef0eed563d525f 21
https://risk-engineering.org/git/notebooks.git/eab9bdb1a9bbf59833b26d39b02c5a5e743807c4 20
https://risk-engineering.org/git/notebooks.git/0af42a9c5987be368b6e1fef1ad11816703e0db9 19
https://github.com/camilo912/research_rating/6fe3d52d8dce4cbc548cfd2a19299beafa0eeff2 14
https://bitbucket.org/qlouder/lumc-ml-caffe/4e78f53a1b8aaf24e559be509b57197927818d19 12
https://[email protected]/boismenux/ml.git/0f140be32895f01d66392b4081e6589e42395f53 12
https://gitlab.oceantrack.org/otndc/acoustic-telemetry-analysis-workshop/ed80a18d3ba7e4a39716ef7255cc954ce022710f 10
https://[email protected]/chaffra/ipydevedit.git/7e853c93a5632ab7ba49eac835717b25daab492a 10
https://gitlab.inria.fr/vdrevell/istic-robm/02f2463132c37ace64ef0248af38e20b4fd038b4 9
https://risk-engineering.org/git/notebooks.git/e3148cfc261033f42465d024547935f920bf04ad 8
https://collaborating.tuhh.de/xldrkp/jupyter-notebook-beispiel/e895284be3c0a128ed97dc007fcd7497ef19d2bc 8
https://opensource.ncsa.illinois.edu/bitbucket/scm/smm/smm-network-analysis.git/44acd332da1571acb412db803991e5f1dfc7a329 7
https://bitbucket.org/qlouder/lumc-ml-caffe/282d090e1b04f0df3b155879c68469347e45afff 7
https://risk-engineering.org/git/notebooks.git/6948d6d25e0852620053abcdca93914f749cae56 7
https://gitlab.physik.uni-muenchen.de/Magnus.Bauer/fak_analysis.git/e0669aff9716e7905b0dca28b6b5967ec6423c69 6
https://gitlab.com/uchicago-ime/python-programming-primer/435796a3e2e4956aa311caad51d4bad9925de6b4 6
https://gitlab.physik.uni-muenchen.de/Magnus.Bauer/fak_analysis/433b27076fb8f08361955b743f91d0ea31ef5616 5
https://bitbucket.org/qlouder/lumc-ml-caffe.git/282d090e1b04f0df3b155879c68469347e45afff 5
https://plmlab.math.cnrs.fr/lothode/tp_linux.git/0627af495ecaebe3cc83811afed70c6e21ed7738 5
https://risk-engineering.org/git/notebooks.git/d9e2cefae0f150b23f7fcfb04757033742cba339 5
https://framagit.org/mfauvel/omp_machine_learning/bb53322ad48d521bbbd81100e116e4289b56b036 5
... ...
https://gitlab.mech.kuleuven.be/rob-expressiongraphs/docker/etasl-binder.git/63596595694d466c4fb4a2c544042291f0a05a94 1
https://gitlab.oceantrack.org/otndc/fact-workshop/43f79d279af700880ca372d42b94d191e53a0a87 1
https://gitlab.oceantrack.org/otndc/fact-workshop/72cb4b6632a3c0b522c16ec4c3b69482ce433753 1
https://gitlab.physik.uni-muenchen.de/Martin.Ritter/python-intro/6b2e01e55667d833e0028245e333144536d43e4b 1
https://gitlab.oceantrack.org/otndc/fact-workshop/c3853384f803253544d9b7333cc01d4d7bfa72a6 1
https://gitlab.oceantrack.org/otndc/fact-workshop/c631ee43a05abd62569262f1e599a1e5ee7ec95d 1
https://gitlab.pasteur.fr/dbikard/badSeed_public.git/da655dd38fe444c918ac754206568ad39e03be45 1
https://gitlab.pasteur.fr/dbikard/crisprbrowser.git/a894255cd5055e9b541296ab3d9018d67c955156 1
https://gitlab.in2p3.fr/gregoire.henning/tiipp-invprob-2018/d874f3b9dcfa20c04266ad852246cd43178d0f42 1
https://gitlab.in2p3.fr/gregoire.henning/tiipp-invprob-2018/6662fe629ebb3ad62c6d9de8348c73cac781c7e3 1
https://git.rcc.uchicago.edu/jhskone/multiproc_py.git/d03a0f4c7bffe7f919acd3fa494b200404b2e717 1
https://gitlab.gwdg.de/publications/NanoMAX18_JSR.git/48cbcc2390cf0fb8e85e8c8890ce693c6681d9fe 1
https://git.rwth-aachen.de/rohlfing/gdet3-demos/18361041135e7bce58af5d9410bc9f97197cf011 1
https://git.rwth-aachen.de/rohlfing/gdet3-demos/9ecb214cb8ee5898dc3d73f11b2006a82199e665 1
https://git.rwth-aachen.de/rohlfing/gdet3-demos/b9d0cc2edf7c46f662d2f8dc9a7a95689da4518e 1
https://github.com/MelodyC117/Trial.git/08c1720f667110b719daf6f5b654ddd7f3d46076 1
https://github.com/MelodyC117/Trial.git/acdea1996ff3dc3ea7730256e5fbdc1a0b242289 1
https://github.com/MelodyC117/Trial.git/eb8ac5ddc2a4707b707277ac150973bd894f5d67 1
https://github.com/MelodyC117/Trial.git/fd7119c7c2d6f3ed322d7326a3921f337b55a91c 1
https://gricad-gitlab.univ-grenoble-alpes.fr/chatelaf/conference-ia/75f80fb38a355e3e688de260a26c72b5f320bb4f 1
https://github.com/camilo912/research_rating/ee7b91c6973f1bafad64adc938edf53c7ff30362 1
https://github.com/ostapkharysh/Data_visualisation/42c00131e615ae8444d3ccf680fa24e81050f275 1
https://github.com/wbogers/notebooks.git/2887c6399c52809ba4eb98328ecb4c79e3bb4b83 1
https://github.molgen.mpg.de/ledwards/FLASHcalculator/9da41d5d86cd75d9f96ac779b2b64e3e79932d9f 1
https://github.uconn.edu/rcc02007/ME3263_Lab-0.git/f25072f2e708c231ea05040cab6aae2699a7be6f 1
https://gitlab.com/jhskone/multiproc_py.git/54224bf1159ac6d5c3b5b9268deea5c1890581b8 1
https://gitlab.com/ricarthor/testing_binder.git/ac37cd61fe29d17301a6380cea61b95de60cdea9 1
https://gricad-gitlab.univ-grenoble-alpes.fr/chatelaf/conference-ia/34bc48747d1cf70ac59254435e4f59d4b20d26fc 1
https://gitlab.flux.utah.edu/emulab/osdi18-analysis/cc10c099b2cdf46f473a30340433d42b04dac39c 1
http://code.datamode.com/datamode/binder.git/85bd134f3713f8a8cb3e22612929987b49b675a5 1

145 rows × 1 columns

[15]: