TomBolton.io

Tom Bolton’s AI and Machine Learning Lab Notebook.

softmax

Machine Learning

It’s been an eventful few days for my model. This entry is going to be a bit of a saga, but here goes. I had implemented the weighting function as described in my last post. I was getting great results…

Machine Learning

Built a quick (slow) framework to track the board state and the proposed legal and illegal moves for every game, and to save out the game history for any game where the illegal-move percentage spiked above 98%. The vast majority…
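The post itself isn't shown here, so as a rough illustration only, this is a minimal sketch of the kind of bookkeeping that teaser describes; the class and method names are my own assumptions, not the post's actual framework.

```python
import json

# Hypothetical sketch: track per-game legal/illegal moves and dump the
# history only when the illegal-move rate spikes above a threshold
# (98% in the post). Names and structure are assumptions.
class GameTracker:
    def __init__(self, illegal_threshold=0.98):
        self.illegal_threshold = illegal_threshold
        self.history = []          # (board_state, move, was_legal) tuples
        self.legal_count = 0
        self.illegal_count = 0

    def record_move(self, board_state, move, was_legal):
        self.history.append((board_state, move, was_legal))
        if was_legal:
            self.legal_count += 1
        else:
            self.illegal_count += 1

    def illegal_fraction(self):
        total = self.legal_count + self.illegal_count
        return self.illegal_count / total if total else 0.0

    def maybe_save(self, path):
        # Only save the full game history for games whose illegal-move
        # percentage exceeds the threshold.
        if self.illegal_fraction() > self.illegal_threshold:
            with open(path, "w") as f:
                json.dump(
                    [{"board": b, "move": m, "legal": l} for b, m, l in self.history],
                    f,
                )
```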

Machine Learning

Note: This post is the ultimate result of my quest to work out how to do softmax backpropagation in my hand-coded model for this project. The actual math for softmax backpropagation is not something that was specifically covered in my coursework…
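For reference, the standard result that a softmax backpropagation derivation works toward is the softmax Jacobian below; the notation is generic and not necessarily what the post itself uses.

```latex
% Softmax over logits z, and its Jacobian -- the standard result.
\[
  s_i = \frac{e^{z_i}}{\sum_{k} e^{z_k}},
  \qquad
  \frac{\partial s_i}{\partial z_j} = s_i\,(\delta_{ij} - s_j)
\]
% When softmax is paired with a cross-entropy cost L = -\sum_i y_i \log s_i
% (the common pairing, not necessarily the one used in the post), the
% gradient with respect to the logits collapses to
\[
  \frac{\partial L}{\partial z_j} = s_j - y_j
\]
```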

Machine Learning

…and it is not lost on me now that I may have had a problem with my gradients which I did not check… – Me, five days ago

After taking a brief hiatus from my checkers AI to do some…

Machine Learning

A couple of days ago, I got all the machinery of my network in place and did my first test. I decided that for starters, using a network I wasn’t even sure worked for a problem I’ve never tackled before…

Machine Learning

Having established in my previous post that softmax looks like the way to go for my final activation layer, it's time to think about the cost function. And this one is trickier. Hypothesis: Use Mean Squared Error Cost Function. The…
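For concreteness, the hypothesis as stated amounts to the usual mean squared error over the output vector; this is the generic form, and the post's own notation or scaling constant may differ.

```latex
% Mean squared error over an N-dimensional output vector s (e.g. the softmax
% output) against a target vector y -- generic form, notation mine.
\[
  C_{\mathrm{MSE}} = \frac{1}{N} \sum_{i=1}^{N} (s_i - y_i)^2
\]
```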

Machine Learning

Back in this post, I alluded to the fact that I hadn’t yet written out the approach I’m going to use to calculate costs not just for a single output, as in Andrej Karpathy’s Pong from Pixels, but for 48 outputs representing…