The goal of this workshop is to survey the field of machine learning and deep learning. It includes the steps for architecting and deploying a large-scale learning pipeline on Google Cloud Platform. The workshop contains the following sections:

- The Computational Cost of Building ML Products
- Why GCP?
- GCP Product and Service Offerings
- Cloud Compute
- Cloud Storage
- Big Data/Analytics
- Cloud AI

- Enable Google Compute Engine and Cloud Source Repositories APIs
- Launch a Datalab Instance from gcloud
- Retrieving Code from GitHub into the Datalab VM
- Shutting down/Deleting the instance
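
As a rough sketch of this workflow on the command line (the project ID and instance name below are placeholders, not values from the workshop):

```shell
# Enable the APIs the Datalab workflow relies on.
gcloud services enable compute.googleapis.com sourcerepo.googleapis.com \
    --project my-gcp-project

# Launch a Datalab instance; the datalab CLI provisions the VM
# and opens an SSH tunnel to the notebook server.
datalab create my-datalab-instance

# When finished: stop the VM (the persistent disk survives),
# or delete the instance and its disk entirely.
datalab stop my-datalab-instance
datalab delete my-datalab-instance --delete-disk
```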

- Starting out with Colab
- Change Runtime Settings
- Storing Notebooks
- Uploading Notebooks

- Why Machine Learning?
- Types of Learning
- Foundations of Machine Learning
- A Formal Model for Machine Learning Theory
- How do we assess learning?
- Training and Validation data
- The Bias/Variance tradeoff
- Evaluation metrics
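
As a minimal illustration of the training/validation split covered in this section (pure Python on synthetic data; the 80/20 ratio and function name are illustrative choices, not from the workshop):

```python
import random

def train_validation_split(data, validation_fraction=0.2, seed=42):
    """Shuffle a dataset and split it into training and validation subsets."""
    rng = random.Random(seed)
    shuffled = data[:]              # copy so the caller's list is untouched
    rng.shuffle(shuffled)
    n_val = int(len(shuffled) * validation_fraction)
    return shuffled[n_val:], shuffled[:n_val]   # (train, validation)

dataset = list(range(100))          # stand-in for real examples
train, val = train_validation_split(dataset)
print(len(train), len(val))         # 80 20
```

The model is fit on the training split only; the held-out validation split estimates generalization error and exposes the bias/variance tradeoff.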

- Linear Regression Model
- Gradient Descent
- Building a Linear Regression Model with TensorFlow
- Importing the dataset
- Preparing the dataset for modeling
- The Regression Model
- Plotting model statistics
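
The workshop builds the model with TensorFlow; as a concept sketch, batch gradient descent for a one-variable linear model can be written in a few lines of plain Python (the data and learning rate here are illustrative):

```python
# Fit y = w*x + b by batch gradient descent on mean squared error.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]          # generated from y = 2x + 1

w, b, lr = 0.0, 0.0, 0.05               # initial parameters and learning rate
n = len(xs)
for _ in range(2000):
    # Gradients of MSE = (1/n) * sum((w*x + b - y)^2) w.r.t. w and b
    grad_w = (2.0 / n) * sum((w * x + b - y) * x for x, y in zip(xs, ys))
    grad_b = (2.0 / n) * sum((w * x + b - y) for x, y in zip(xs, ys))
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 2), round(b, 2))         # converges to ≈ 2.0 and 1.0
```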

- The Logit or Sigmoid Model
- Logistic Regression Cost Function
- Logistic Regression Model with TensorFlow Canned Estimators
- TensorFlow Datasets (tf.data)
- FeatureColumns
- Estimators
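
The two mathematical pieces of this section, the sigmoid function and the cross-entropy cost, are easy to state in plain Python (a concept sketch, separate from the Estimator-based implementation the workshop uses):

```python
import math

def sigmoid(z):
    """Logistic (sigmoid) function: maps any real number into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def log_loss(y_true, p):
    """Cross-entropy cost for one example with predicted probability p."""
    return -(y_true * math.log(p) + (1 - y_true) * math.log(1 - p))

p = sigmoid(0.0)            # 0.5: a logit of 0 sits on the decision boundary
print(p, log_loss(1, p))    # loss is log(2) ≈ 0.693 for an uninformative prediction
```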

- The representation challenge
- An Inspiration from the Brain
- The Neural Network Architecture
- Training the Network
- Cost Function or Loss Function
- The Backpropagation Algorithm
- Activation Functions
- Multilayer Perceptron (MLP) with the TensorFlow Estimator API
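
The workshop builds the MLP with the TensorFlow Estimator API; to make the architecture concrete, here is a single forward pass through a tiny two-layer network in pure Python (the weights are made-up numbers chosen only for illustration):

```python
import math

def relu(z):
    """Rectified linear unit: the hidden-layer activation."""
    return max(0.0, z)

def dense(inputs, weights, biases, activation):
    """One fully connected layer: activation(w . x + b) for each output unit."""
    return [activation(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

# Hypothetical weights for a 2-input -> 3-hidden -> 1-output network.
x = [1.0, 2.0]
hidden = dense(x, [[0.5, -0.2], [0.3, 0.8], [-0.4, 0.1]], [0.1, 0.0, 0.2], relu)
output = dense(hidden, [[1.0, -1.0, 0.5]], [0.0],
               lambda z: 1.0 / (1.0 + math.exp(-z)))   # sigmoid output unit
print(output)               # a single probability, ≈ 0.154 for these weights
```

Training replaces these fixed weights with values learned by backpropagation: the gradient of the cost flows backwards through each `dense` layer.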

- Beam Programming
- Enable the Dataflow API
- Building a Simple Data Transformation Pipeline

# Repository:

https://github.com/dvdbisong/ieee-ompi-ml-workshop

# Book reference:

The contents of this workshop are excerpted from the book *Building Machine Learning and Deep Learning Models on Google Cloud Platform: A Comprehensive Guide for Beginners*, published by Apress (Springer Nature). *To be released in July 2019.*

# Organizers: