Big Data, Distributed Systems, Packaging, Virtual Env
Packaging in Python is hard. Packaging is particularly hard when code needs to run in a distributed computing environment where it is difficult to know what runs where and which parts of the code are available to run there.
In this talk we will present different ways to ship Python code to a compute cluster, what Python's "pickling" feature has to do with this, what self-contained executables are, and the challenges we met when shipping Python code to a cluster with thousands of nodes running thousands of jobs such as TensorFlow or Spark.
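To see why pickling alone does not ship code, note that `pickle` serializes a module-level function by reference (module path and qualified name), not by its bytecode — so the receiving worker must already have the code installed. A minimal sketch (the function `add` is a made-up example, not from the talk):

```python
import pickle

def add(a, b):
    return a + b

payload = pickle.dumps(add)

# The pickle contains the function's name, so it can be looked up remotely...
print(b"add" in payload)                    # True

# ...but not its bytecode: a process that lacks this module cannot unpickle it.
print(add.__code__.co_code in payload)      # False

# In the same process the reference resolves fine:
print(pickle.loads(payload)(1, 2))          # 3
```

This is exactly why a distributed job needs a packaging mechanism (a virtual env or a self-contained executable) to make the referenced code importable on every node.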
As an example, we will show how to run a PySpark job on top of S3 storage using PEX as a self-contained executable artifact. Finally, we will explain how these ideas generalize to different jobs (like TensorFlow or Dask), different virtual environments (like Anaconda or vanilla Python virtual envs) and different distributed storage backends (like S3 or HDFS).
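The PEX approach can be sketched roughly as follows: build a single executable archive containing the job's dependencies, ship it to the executors, and tell Spark to use it as the Python interpreter. File names and the dependency list here are illustrative assumptions, not the talk's actual code:

```shell
# Build a self-contained executable from the job's dependencies
# (my_job.pex and the dependency list are hypothetical).
pex pyspark numpy -o my_job.pex

# Ship the artifact to the executors and use it as their Python.
spark-submit \
  --files my_job.pex \
  --conf spark.executorEnv.PYSPARK_PYTHON=./my_job.pex \
  my_job.py
```

Because the PEX file bundles the interpreter's environment, the executors need no pre-installed virtual env — the artifact itself is the environment.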
Attendees will get an overview of the challenges of Python packaging for distributed applications and see code samples they can apply in their own projects.
Type: Talk (45 mins); Python level: Intermediate; Domain level: Advanced
Originally from Germany, Fabian has been living in France for 15 years. He has worked as a software engineer for several companies, covering domains such as ad tech, travel booking systems, real estate and solar cells.
Fabian joined Criteo at the beginning of 2016. He now works on the Machine Learning platform.
He holds an M.Sc. in Computer Science from Technische Universität Dresden (Germany) and an M.Sc. in Computer Science from École Centrale Paris (France).