Meet Your Trainers

Kaz Sato

Evangelist, Google

Kaz Sato is a Staff Developer Advocate on the Cloud Platform team at Google Inc. He leads the developer advocacy team for Machine Learning and Data Analytics products such as TensorFlow, the Vision API, and BigQuery, and has spoken at major events including Strata+Hadoop World 2016 San Jose, Google Next 2015 in NYC and Tel Aviv, and DevFest Berlin. Kaz has also been leading and supporting developer communities for Google Cloud for over seven years. He is also interested in hardware and IoT, and has been hosting FPGA meetups since 2013.

Nikolaos Vasiloglou

Technical Chair, MLconf

Nikolaos Vasiloglou holds a PhD from the Department of Electrical and Computer Engineering at the Georgia Institute of Technology. His thesis focused on scalable machine learning over massive datasets. After graduating from Georgia Tech, he founded Analytics1305 LLC and Ismion Inc. He architected and developed the PaperBoat machine learning library, which has been successfully integrated into and used in the LogicBlox and HPCCSystems platforms. He has also served as a machine learning consultant for Predictix, Revolution Analytics, Damballa, Tapad, and LexisNexis. Vasiloglou has recently focused his studies on Google's TensorFlow and has been active in developing the syllabus for a series of TensorFlow training events. His work has resulted in patents and production systems.

Training Schedule




Session 1: TensorFlow paper

  • Symbolic coding: From imperative to declarative programming
  • Tensors as first-class citizens
  • The computational graph model
  • ControlFlow: Mutating variables
  • Managing hardware
  • Device communication model
  • Distributed computing
  • Lossy data compression
  • Automatic Differentiation
  • Synchrony vs Asynchrony
  • Debugging
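To make the symbolic/declarative model concrete before the session, here is a minimal sketch in plain Python (not TensorFlow itself, and not the paper's implementation): a toy computational graph whose nodes describe operations, with evaluation deferred until explicitly requested.

```python
# A toy computational graph: building a description of a computation
# first, then evaluating it later (plain Python, for illustration only).

class Node:
    """One operation in a toy computational graph."""
    def __init__(self, op, inputs=()):
        self.op = op          # callable combining the input values
        self.inputs = inputs  # upstream Node objects

    def eval(self):
        # Evaluation is deferred until explicitly requested,
        # mirroring the "run the graph" step in the declarative model.
        return self.op(*(n.eval() for n in self.inputs))

def const(v):
    return Node(lambda: v)

def add(a, b):
    return Node(lambda x, y: x + y, (a, b))

def mul(a, b):
    return Node(lambda x, y: x * y, (a, b))

# Declarative: these lines only *describe* (2 + 3) * 4 ...
graph = mul(add(const(2), const(3)), const(4))
# ... nothing is computed until we ask for the result.
print(graph.eval())  # 20
```

The imperative equivalent would simply compute `(2 + 3) * 4` immediately; the graph form is what lets a runtime schedule work across devices and differentiate through it.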

Session 2: Creating and Visualizing a Computational Graph

  • Variables vs placeholders
  • Operations
  • Creating a computational session
  • Managing hardware: CPUs and GPUs
  • Using TensorBoard for Visualization
  • Breaking declarativity: Mutating variables with control flow
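As a warm-up for the variables-vs-placeholders distinction, here is a toy sketch in plain Python (not the real TensorFlow API): a variable carries state that persists across runs, while a placeholder has no value until one is fed in at run time.

```python
# Toy stand-ins for the two node kinds (illustration only).

class Variable:
    def __init__(self, value):
        self.value = value    # state that survives between runs

class Placeholder:
    def __init__(self, name):
        self.name = name      # no value until fed at run time

def run(var, ph, feed_dict):
    # Analogous in spirit to running a session with a feed dict:
    # the placeholder's value comes from the caller on every run,
    # and the variable is mutated in place.
    x = feed_dict[ph]
    var.value = var.value + x
    return var.value

w = Variable(0.0)
x = Placeholder("x")
print(run(w, x, {x: 2.0}))  # 2.0 -- variable state persists...
print(run(w, x, {x: 3.0}))  # 5.0 -- ...across successive runs
```

The mutation of `w` across runs is exactly the kind of stateful update that "breaks" the purely declarative picture.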



Session 3: Linear Algebra with TF

  • Sparse/Dense Matrix/Vectors operations
  • Kronecker Products in TF
  • From Matrices to Tensors
  • Tensor Tiling: the map operator of TF
  • Reductions on Tensors
  • Thinking in batch
  • Limitations of TF
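The linear-algebra ideas above can be previewed in NumPy, whose axis conventions TensorFlow's tensor ops largely share; the shapes and values below are arbitrary illustrations, not part of the course material.

```python
import numpy as np

A = np.arange(6).reshape(2, 3)   # a 2x3 dense matrix
B = np.eye(2)

# Kronecker product: (2x3) kron (2x2) -> 4x6
K = np.kron(A, B)

# Tiling replicates a tensor along chosen axes
T = np.tile(A, (2, 1))           # stack A on top of itself -> 4x3

# Reductions collapse an axis
col_sums = A.sum(axis=0)         # shape (3,)

# "Thinking in batch": one matmul over a batch of 5 matrices,
# with no Python loop over the batch dimension
batch = np.random.rand(5, 2, 3)
W = np.random.rand(3, 4)
out = batch @ W                  # shape (5, 2, 4)

print(K.shape, T.shape, col_sums, out.shape)
```

The batch example is the key habit: keeping a leading batch axis lets one vectorized op replace a loop, which is how TF pipelines stay fast.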



Session 4: Optimization in TF

  • Creating a symbolic objective function
  • Computing the gradients
  • Build your own simple gradient descent optimizer
  • The mini-batch
  • Inside the optimizer TF class
  • Tweaking predefined optimizer by touching the gradients
  • Presentation and parameter tuning of famous optimizers: AdamOptimizer, RMSPropOptimizer, and AdaGrad
  • Build your first linear regression in 3 lines!
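To see what the session's hand-built optimizer boils down to, here is a hand-rolled mini-batch gradient descent for linear regression in NumPy (not TF's optimizer classes); the data, learning rate, and batch size are made-up illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
y = 3.0 * X[:, 0] + 1.0              # true weight 3.0, true bias 1.0

w, b, lr, batch = 0.0, 0.0, 0.1, 32
for step in range(500):
    idx = rng.integers(0, len(X), size=batch)  # sample a mini-batch
    xb, yb = X[idx, 0], y[idx]
    err = (w * xb + b) - yb
    # Gradients of the mean squared error w.r.t. w and b
    gw = 2 * np.mean(err * xb)
    gb = 2 * np.mean(err)
    w -= lr * gw                     # plain gradient-descent update
    b -= lr * gb

print(round(w, 2), round(b, 2))      # close to 3.0 and 1.0
```

In TF the two gradient lines are what automatic differentiation produces for you; the update lines are what an optimizer class encapsulates (and what Adam, RMSProp, and AdaGrad each modify).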



Session 5: Coding ML algorithms

  • Continue building linear models (logistic regression)
  • Using different objectives: L1, L2, Cross Entropy, Hinge Loss, and Maximum Likelihood
  • Adding L1/L2 regularization
  • Compare your implementation with the TensorFlow Linear module
  • Regularizing with dropout
  • Code your first multilayer perceptron (MLP)
  • Debugging your model with TensorBoard
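As a preview of combining a loss with regularization, here is a minimal logistic regression in NumPy using the cross-entropy loss plus an L2 penalty; the synthetic data and the 0.01 regularization strength are arbitrary assumptions for illustration, not the course's own example.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)  # a linearly separable toy task

w = np.zeros(2)
b = 0.0
lr, lam = 0.5, 0.01
for _ in range(300):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid
    # Gradient of mean cross-entropy plus (lam/2) * ||w||^2
    gw = X.T @ (p - y) / len(y) + lam * w
    gb = np.mean(p - y)
    w -= lr * gw
    b -= lr * gb

acc = np.mean((X @ w + b > 0) == (y == 1))
print(acc)  # high accuracy on this separable toy data
```

Swapping the loss (hinge, L1, ...) or the penalty changes only the gradient lines; the training loop stays the same, which is the point the session builds on before moving to the MLP.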

Session 6: Wide and deep modeling