Introduction

A few months ago I discovered fast.ai’s fastbook. It was released in August with an accompanying set of YouTube videos that serve as an introductory course on deep learning. The authors, Jeremy Howard, Rachel Thomas, and Sylvain Gugger, make the content very approachable.

I already had a cursory knowledge of AI, having taken a class in college, but I wanted to dive deeper into the inner workings, ideally all the way down to the metal. I purchased the book on Amazon in August, but I’ve been slow to go through it since some of the video lectures are rather long, and I learn far better by reading and doing than by listening.

This week I decided to pick up where I left off. I had been thinking about starting a blog earlier today (why not?), and one of the sections of the book/lecture specifically calls out the advantages of starting your own blog, which is rather strange for a book about deep learning. Anyway, that section pushed me over the edge, so here we are.

To give a quick introduction of myself: My name is Jerred Shepherd. I’m a software engineer at Amazon Web Services, where I work on problems involving distributed systems and scalability, which has been really fun. I love computers and programming, and I often spend my free time working on side projects as a hobby.
