A review by deadly_nightshade_
The Hundred-Page Machine Learning Book by Andriy Burkov

4.0

If you want to condense machine learning into 100 pages (or more precisely, 136) without losing rigor, you're going to end up with a bunch of math equations because machine learning is essentially math applied to data.
This book is mathematically dense: symbolically complex and verbally concise. I really wouldn't recommend it to someone just starting out with machine learning. It could be a good refresher for someone who has studied machine learning at a graduate level, or if you've been working with machine learning for a while and have forgotten what exactly certain models do mathematically. I agree with the author: "Practitioners with experience can use this book as a collection of directions for further self-improvement".
The information seems fairly up to date: it covers validation sets and even briefly touches on genetic algorithms and LSTMs. Some of the explanations are not sufficient on their own, and you'll have to look the topic up online if you're not already familiar with it (especially from page 65 onward, with deep learning and backpropagation), although the book does include QR codes you can scan for further elaboration.
What I really enjoyed was that the author talked about certain statistical assumptions that are made about data in machine learning. They seem obvious when stated, but they're not necessarily common sense. For instance, it's usually assumed that your training data is randomly and independently selected from a particular distribution, and a machine learning model trained on the data is supposed to work on unseen future data because we assume that the future data is coming from the same distribution.
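To make that concrete, here's a minimal sketch of my own (not an example from the book): fit a classifier on data drawn from one distribution, then score it on test data from the same distribution versus a shifted one. The accuracy gap is exactly why the "same distribution" assumption matters.

# My own illustration of the same-distribution assumption, not the book's code.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Training data: two classes drawn i.i.d. from fixed Gaussians.
X_train = np.vstack([rng.normal(0, 1, (500, 2)), rng.normal(3, 1, (500, 2))])
y_train = np.array([0] * 500 + [1] * 500)
model = LogisticRegression().fit(X_train, y_train)

# Test data from the SAME distribution: accuracy stays high, as assumed.
X_same = np.vstack([rng.normal(0, 1, (200, 2)), rng.normal(3, 1, (200, 2))])
y_same = np.array([0] * 200 + [1] * 200)
print("same distribution:", model.score(X_same, y_same))

# Test data from a SHIFTED distribution: the assumption breaks and accuracy drops.
X_shift = np.vstack([rng.normal(1.5, 1, (200, 2)), rng.normal(4.5, 1, (200, 2))])
y_shift = np.array([0] * 200 + [1] * 200)
print("shifted distribution:", model.score(X_shift, y_shift))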
I also liked that the author provided multiple ways to solve particular problems (like an imbalanced dataset).
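For example (again my own sketch, not the author's exact list), two of the standard options for an imbalanced dataset are reweighting the minority class in the loss and oversampling it before fitting:

# Two common remedies for class imbalance; the book may present others.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.utils import resample

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (950, 2)), rng.normal(2, 1, (50, 2))])
y = np.array([0] * 950 + [1] * 50)  # 95% / 5% imbalance

# Option 1: penalize mistakes on the rare class more heavily.
weighted = LogisticRegression(class_weight="balanced").fit(X, y)

# Option 2: duplicate minority examples until the classes are even, then fit.
X_up, y_up = resample(X[y == 1], y[y == 1], n_samples=950, random_state=1)
X_bal = np.vstack([X[y == 0], X_up])
y_bal = np.concatenate([y[y == 0], y_up])
oversampled = LogisticRegression().fit(X_bal, y_bal)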
Solid book. I recommend it as a reference, but not necessarily as a one-stop shop for teaching yourself machine learning.