-
Mar 16, 2020
Cloud Run: Google Cloud Text to Speech API
This article provides an example of deploying an application on Cloud Run that makes API calls. It walks through sample code for calling the Google Cloud Text-to-Speech API from a simple Flask web application that is packaged as a container and deployed on Cloud Run. The entire application is open-source at https://github.com/dvdbisong/text-to-speech-cloud-run.
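At its core, the service is a Flask route that forwards request text to the Text-to-Speech client and streams back the synthesized audio. The sketch below is a minimal illustration, assuming the google-cloud-texttospeech and Flask libraries; the /speech route, voice settings, and default text are illustrative choices and not necessarily the exact code in the linked repository.

```python
# Minimal sketch: Flask + Google Cloud Text-to-Speech on Cloud Run.
# Route name, voice settings, and default text are illustrative placeholders.
import os

from flask import Flask, Response, request
from google.cloud import texttospeech

app = Flask(__name__)
client = texttospeech.TextToSpeechClient()

@app.route("/speech")
def speech():
    text = request.args.get("text", "Hello from Cloud Run")
    synthesis_input = texttospeech.SynthesisInput(text=text)
    voice = texttospeech.VoiceSelectionParams(
        language_code="en-US", ssml_gender=texttospeech.SsmlVoiceGender.NEUTRAL
    )
    audio_config = texttospeech.AudioConfig(
        audio_encoding=texttospeech.AudioEncoding.MP3
    )
    response = client.synthesize_speech(
        input=synthesis_input, voice=voice, audio_config=audio_config
    )
    # return the synthesized MP3 bytes directly to the caller
    return Response(response.audio_content, mimetype="audio/mpeg")

if __name__ == "__main__":
    # Cloud Run injects the listening port via the PORT environment variable
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))
```

Cloud Run passes the listening port through the PORT environment variable, so binding to it keeps the container portable between local testing and the managed platform. -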
Sep 24, 2019
Building a Language Toxicity Classification Model using Google Cloud AutoML for Natural Language
Online communities are a significant part of interaction on the internet, and their growth comes with the need for moderation to ensure that participants adhere to the prescribed guidelines and avoid “clear and obvious”, objective policy violations. Google Cloud AutoML for Natural Language provides a platform for designing and developing custom language models for natural-language use-cases. This article uses AutoML for Natural Language to develop an end-to-end language toxicity classification model that identifies obscene text.
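Once the custom toxicity model is trained and deployed, classifying a new comment comes down to a single prediction call. Below is a minimal sketch, assuming the google-cloud-automl Python client; project_id, model_id, and the sample comment are placeholders.

```python
# Minimal sketch of an AutoML Natural Language prediction call (placeholder IDs).
from google.cloud import automl

project_id = "my-project"          # placeholder
model_id = "TCN1234567890"         # placeholder for the deployed toxicity model
model_full_id = f"projects/{project_id}/locations/us-central1/models/{model_id}"

client = automl.PredictionServiceClient()

# wrap the text to be moderated in the payload the API expects
text_snippet = automl.TextSnippet(
    content="example comment to moderate", mime_type="text/plain"
)
payload = automl.ExamplePayload(text_snippet=text_snippet)

response = client.predict(name=model_full_id, payload=payload)
for result in response.payload:
    # each result carries a label (e.g. toxic / not toxic) and a confidence score
    print(result.display_name, result.classification.score)
```

The response carries one confidence score per label, so a moderation workflow only needs to threshold the toxic score to decide whether to flag or hide the text. -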
Jun 6, 2019
Cloud Run: Dataset Summaries via HTTP Request
Google Cloud Run provides a managed platform for running stateless containers. Because the compute infrastructure is managed, it auto-scales in response to increased user traffic. A Cloud Run service receives web traffic via HTTP requests or via asynchronous events, such as a Pub/Sub message. This project uses Cloud Run to run a stateless container that employs pandas-profiling to display the summary statistics of a structured CSV dataset.
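The service boils down to a single endpoint that loads a CSV file and returns the generated profile as HTML. The sketch below is a minimal illustration, assuming Flask and the pandas-profiling package; the /profile route and the url query parameter are illustrative rather than taken from the project.

```python
# Minimal sketch: serve a pandas-profiling report from a stateless container.
import os

import pandas as pd
from flask import Flask, request
from pandas_profiling import ProfileReport

app = Flask(__name__)

@app.route("/profile")
def profile():
    csv_url = request.args.get("url")          # location of a structured CSV dataset
    df = pd.read_csv(csv_url)                  # pandas reads directly from a URL
    report = ProfileReport(df, minimal=True)   # summary statistics for every column
    return report.to_html()                    # serve the report as an HTML page

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))
```

Because the container keeps no state, each request simply downloads the dataset, profiles it, and returns the rendered report, a pattern that fits Cloud Run's stateless, scale-on-demand model. -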
May 10, 2019
Google Cloud AutoML Vision for Medical Image Classification
Google Cloud AutoML Vision simplifies the creation of custom vision models for image recognition use-cases. This article uses Google Cloud AutoML Vision to develop an end-to-end medical image classification model for pneumonia detection from chest X-ray images.
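Serving predictions from the trained model mirrors the Natural Language case, except the payload carries raw image bytes. A minimal sketch, assuming the google-cloud-automl client; project_id, model_id, and the image filename are placeholders.

```python
# Minimal sketch of an AutoML Vision prediction call (placeholder IDs and file).
from google.cloud import automl

project_id = "my-project"          # placeholder
model_id = "ICN0987654321"         # placeholder for the deployed vision model
model_full_id = f"projects/{project_id}/locations/us-central1/models/{model_id}"

client = automl.PredictionServiceClient()

# read a chest X-ray image and send its raw bytes for classification
with open("chest_xray_sample.jpeg", "rb") as f:
    image = automl.Image(image_bytes=f.read())
payload = automl.ExamplePayload(image=image)

response = client.predict(name=model_full_id, payload=payload)
for result in response.payload:
    # e.g. a normal-vs-pneumonia label with a confidence score
    print(result.display_name, result.classification.score)
```

Each result pairs a class label (for example, normal versus pneumonia) with a confidence score. -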
Apr 22, 2019
Working with Keras on GCP Notebook Instance
This workshop shows how to build Keras models on GCP Notebook Instances. The Model is the core of a Keras program: it is constructed, compiled, then trained and evaluated on the respective training and evaluation datasets. Upon satisfactory evaluation, the model is used to make predictions on previously unseen data.
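That lifecycle maps directly onto a few lines of Keras code. The sketch below is a minimal illustration using tf.keras with randomly generated stand-in data; the architecture and hyperparameters are arbitrary and not taken from the workshop.

```python
# Minimal sketch of the construct -> compile -> train -> evaluate -> predict workflow.
import numpy as np
import tensorflow as tf

# toy data standing in for real training and evaluation datasets
x_train, y_train = np.random.rand(800, 10), np.random.randint(2, size=800)
x_test, y_test = np.random.rand(200, 10), np.random.randint(2, size=200)

# construct
model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu", input_shape=(10,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# compile
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# train and evaluate
model.fit(x_train, y_train, epochs=5, batch_size=32, verbose=0)
loss, accuracy = model.evaluate(x_test, y_test, verbose=0)

# predict on previously unseen data
predictions = model.predict(x_test[:5])
```

On a GCP Notebook Instance created from a TensorFlow image, this code runs as-is, since TensorFlow and Keras ship preinstalled. -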
Mar 27, 2019
Deploying an End-to-End Machine Learning Solution on Kubeflow Pipelines - Kubeflow for Poets
A Kubeflow pipeline component is an implementation of a pipeline task; it is a step in the workflow. Each task takes one or more artifacts as input and may produce one or more artifacts as output.
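With the kfp SDK, an ordinary Python function can be packaged as such a component, its parameters becoming input artifacts and its return value an output artifact. A minimal sketch, assuming the kfp v1 SDK; the preprocess function and base image are illustrative.

```python
# Minimal sketch of a lightweight Kubeflow pipeline component (kfp v1 SDK).
from kfp.components import create_component_from_func

def preprocess(input_path: str) -> str:
    """A pipeline task: takes one artifact (a data path) as input, produces one as output."""
    output_path = input_path + ".cleaned"
    # ... data-cleaning logic would run here inside the component's container ...
    return output_path

# package the function as a containerized pipeline step
preprocess_op = create_component_from_func(preprocess, base_image="python:3.8")
```

At run time, each invocation of preprocess_op executes as its own container on the cluster, and its output can be passed as an input artifact to the next step. -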
Mar 27, 2019
Kubeflow Pipelines - Kubeflow for Poets
Kubeflow Pipelines is a simple platform for building and deploying containerized machine learning workflows on Kubernetes. It makes it easy to implement production-grade machine learning pipelines without worrying about the low-level details of managing a Kubernetes cluster.
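A pipeline is just a Python function decorated with the kfp DSL that wires such components together and is then compiled into a workflow spec. The sketch below assumes the kfp v1 SDK; the train step, bucket path, and file names are illustrative.

```python
# Minimal sketch of composing and compiling a pipeline with the kfp DSL (kfp v1 SDK).
import kfp
from kfp import dsl
from kfp.components import create_component_from_func

def train(data_path: str) -> str:
    # ... model-training logic would run here ...
    return data_path + ".model"

train_op = create_component_from_func(train, base_image="python:3.8")

@dsl.pipeline(name="example-pipeline", description="A single training step.")
def example_pipeline(data_path: str = "gs://my-bucket/data.csv"):
    train_op(data_path)

# compile the pipeline into a workflow spec that Kubeflow Pipelines can run
kfp.compiler.Compiler().compile(example_pipeline, "example_pipeline.yaml")
```

The compiled spec is uploaded through the Kubeflow Pipelines UI or submitted with the kfp client, so the underlying Kubernetes resources never have to be managed by hand. -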
Mar 27, 2019
Kubeflow - Kubeflow for Poets
Kubeflow is a platform created to enhance and simplify the process of deploying machine learning workflows on Kubernetes. With Kubeflow, it becomes easier to manage a distributed machine learning deployment by placing the components of the pipeline, such as training, serving, monitoring, and logging, into containers on the Kubernetes cluster. -
Mar 27, 2019
Kubernetes - Kubeflow for Poets
When a microservice application is deployed in production, it usually has many running containers that need to be allocated the right amount of resources in response to user demand. There is also a need to ensure that the containers are online, running, and communicating with one another. The need to efficiently manage and coordinate clusters of containerized applications gave rise to Kubernetes. -
Mar 27, 2019
Docker - Kubeflow for Poets
Docker is a virtualization application that abstracts applications into isolated environments known as containers. The idea behind a container is to provide a unified platform that includes the software tools and dependencies for developing and deploying an application. -
Mar 27, 2019
Microservices Architecture - Kubeflow for Poets
The microservice architecture is an approach for developing and deploying enterprise cloud-native software applications that involves separating the core business capabilities of the application into decoupled components. Each business capability represents some functionality that the application provides as a service to the end-user. -
Mar 27, 2019
Introduction - Kubeflow for Poets
Machine learning is often, and rightly, viewed as the use of mathematical algorithms to teach the computer to learn tasks that are computationally infeasible to program as a set of specified instructions. However, these algorithms constitute only a tiny fraction of the overall learning pipeline from an engineering perspective. Building high-performing, dynamic learning models involves a number of other critical components. -
Mar 27, 2019
Kubeflow for Poets: A Guide to Containerization of the Machine Learning Production Pipeline
This writing series provides a systematic approach to productionizing machine learning pipelines with Kubeflow on Kubernetes. Building machine learning models is just one piece of a more extensive system of tasks and processes that come together to deliver a machine learning product. Kubeflow makes it possible to leverage the microservices paradigm of containerization to separate the modular components of an application orchestrated on Kubernetes. -
Jan 17, 2018
A Comparative Analysis of Amazon SageMaker and Google Datalab
Google Datalab and Amazon SageMaker both offer fully managed cloud Jupyter notebooks for designing and developing machine learning and deep learning models by leveraging serverless cloud engines. However, key differences exist between the two offerings, even though they have a lot in common. This post carries out a comparative analysis to examine the subtle differences and similarities between the two cloud-based machine-learning-as-a-service platforms. -
Jan 14, 2018
Exploring Amazon SageMaker
Amazon SageMaker is another cloud-based, fully managed data analytics and machine learning platform for designing, building, and deploying data models. The key selling point of Amazon SageMaker is "zero setup". This post takes a tour through spinning up a SageMaker notebook instance for data analytics and model building.
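A notebook instance can be spun up from the SageMaker console in a few clicks or programmatically. Below is a minimal sketch using boto3; the instance name, instance type, and IAM role ARN are placeholders rather than values from the post.

```python
# Minimal sketch: create a SageMaker notebook instance with boto3 (placeholder values).
import boto3

sagemaker = boto3.client("sagemaker", region_name="us-east-1")

# request a managed Jupyter notebook instance
sagemaker.create_notebook_instance(
    NotebookInstanceName="exploration-notebook",
    InstanceType="ml.t2.medium",
    RoleArn="arn:aws:iam::123456789012:role/SageMakerExecutionRole",  # placeholder role
    VolumeSizeInGB=5,
)

# wait until the instance is in service before opening the hosted Jupyter environment
waiter = sagemaker.get_waiter("notebook_instance_in_service")
waiter.wait(NotebookInstanceName="exploration-notebook")
```

Once the instance reports InService, the hosted Jupyter environment is one click away in the console, with no servers to configure, which is the "zero setup" promise. -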
Aug 23, 2017
Supervised Machine Learning: A Conversational Guide for Executives and Practitioners
This post gives a systematic overview of the vital points to consider when building supervised learning models. In Q&A style, we address some of the key decisions and issues to work through when building a machine learning or deep learning model. Whether you are an executive or a machine learning engineer, you can use this article to start comfortable conversations and facilitate stronger communication about machine learning.
-
Aug 3, 2017
Demystifying Deep Learning
Learning is a non-trivial task. How we as humans learn deep representations is high up there among the great enigmas of the world. What we consider trivial, and to some even natural, is a complex web of fine-grained and intricate processes that has set us apart as unique creations in the universe, both seen and unseen. In this post, I explain in simple terms the origins and promise of deep learning. -
Jul 28, 2017
Understanding Machine Learning: An Executive Overview
Machine learning is a technology that has grown to prominence over the past ten years (as of this writing) and is fast paving the way for the “Age of Automation”. This post provides a holistic view of the vital constituents that characterize machine learning. By the end of this piece, the reader should be able to grasp the major landmarks and foundation stones of the field. This overview also provides a structured framework for wading into deeper waters without getting overwhelmed. -
Jul 21, 2017
A Gentle Introduction to Google Cloud Platform for Machine Learning Practice
In a previous post, Machine Learning on the Cloud, we examined in plain language what machine learning is, what the cloud is, and the merits of leveraging cloud resources for machine learning practice. In this post, I introduce GCP as a simple, powerful, and cost-effective cloud option for performing machine learning. What's more, I provide a simple walkthrough of how to set up the environment for machine learning model development on GCP. -
Jul 12, 2017
Machine Learning on the Cloud: Notes for the Layman
Computational expense has always been the bane of large-scale machine learning. In this post, I explain the fundamentals of machine learning on the cloud and the opportunities opened up by the unbridled computational horsepower that cloud infrastructure makes available.