Getting hyped about automated hyperparameter tuning

Learn how to use custom containers on Cloud AI Platform to train an XGBoost model with automated hyperparameter tuning.

Using explainability frameworks to interpret financial models

In this post we'll walk through how to interpret an XGBoost model trained on a mortgage dataset using the What-If Tool, SHAP, and Cloud AI Platform.

Which factors contribute to my sleep quality?

I collected my own sleep data using an Oura ring and analyzed it with BigQuery.

Interpreting bag of words models with SHAP

Learn how to build a bag of words text classification model and interpret the model's output with SHAP.

Preventing bias in ML models, with code

With all the tools democratizing machine learning these days, it's easier than ever to build high-accuracy machine learning models. But even if you build a model yourself using an open source framework like TensorFlow or Scikit-Learn, it's still mostly a black box: it's hard to know exactly why your model made the prediction it did. As model builders, we're responsible for the predictions generated by our models and being able to explain...

Hello World, I have my own blog!

I'm starting my own blog!