Depends on your motives and what you want to get out of ML. I'd say building a strong intuition for debugging and working with new neural network architectures does take a lot of groundwork in the fundamentals. The book "The Elements of Statistical Learning" is a good read if you have a background in calculus, optimisation, and linear algebra. After that, moving on to the various deep learning books in the field is reasonable.
However, if your interest is just to mess about with neural nets, have a look at the TensorFlow tutorials on CNNs. Unfortunately, it does become a bit impractical to properly experiment with the deeper nets on large datasets such as ImageNet. OpenAI Universe is also fun for experimenting with recurrent nets across a range of environments.
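If CNNs are new to you, it helps to see what a convolutional layer actually computes before diving into the tutorials. Here's a toy sketch in plain NumPy (my own illustrative example, not from any tutorial): a single 2D "valid" convolution pass with a hand-picked edge-detecting kernel.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2D cross-correlation -- the core operation inside a CNN layer.
    Slides the kernel over the image and takes a weighted sum at each position."""
    kh, kw = kernel.shape
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Toy image: left half dark (0), right half bright (1).
img = np.zeros((5, 5))
img[:, 3:] = 1.0

# A [-1, 1] kernel responds only where brightness jumps left-to-right,
# i.e. it detects vertical edges.
edge_kernel = np.array([[-1.0, 1.0]])
response = conv2d(img, edge_kernel)
print(response)
```

A trained CNN learns many kernels like this automatically via gradient descent, instead of you hand-crafting them; frameworks like TensorFlow just do this convolution (plus batching, channels, and GPU kernels) at scale.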