Helpful Resources for Machine Learning and Deep Learning

Md. Al-Imran Abir
2 min read · Apr 6, 2022

I have been learning ML and DL since mid-2020, but nothing serious until my last MSc semester (December 2021), when I took two courses: one on basic Machine Learning and the other on Deep Learning. While doing the assignments and projects for these courses, I came across many good articles and notebooks that were really helpful. I thought of keeping a list of those, and then it occurred to me to publish it on Medium so that others might benefit too.

Resources for PCA:

If you want to understand principal component analysis (PCA), you may follow this article (archive link), where Christian Versloot explains the basics of PCA along with Python code. He shows Python implementations of PCA using the NumPy functions eig() and svd(), and also discusses the easy and effective way of computing PCA with the PCA() class from the scikit-learn library. However, I think there are some typos in the svd() implementation. To work around them, you may follow the answers to this StackExchange question (archive link; the thread itself is a good read on SVD and PCA). Despite the typos, I found this article very helpful and highly recommend going through it.
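To make the two approaches concrete, here is a minimal sketch (my own toy data, not code from the article) showing that the eigendecomposition of the covariance matrix and the SVD of the centered data matrix recover the same principal axes:

```python
import numpy as np

# Toy data: 10 samples, 3 features (illustrative only)
rng = np.random.default_rng(0)
X = rng.normal(size=(10, 3))

# Center the data first -- PCA assumes zero-mean features
Xc = X - X.mean(axis=0)

# --- PCA via eigendecomposition of the covariance matrix ---
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eig(cov)
order = np.argsort(eigvals)[::-1]      # sort components by explained variance
components_eig = eigvecs[:, order]

# --- PCA via SVD of the centered data matrix ---
# The rows of Vt are the principal axes; singular values relate to variance.
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
components_svd = Vt.T

# The two sets of axes agree up to sign (eigenvectors have arbitrary sign)
for k in range(3):
    a, b = components_eig[:, k], components_svd[:, k]
    assert np.allclose(a, b) or np.allclose(a, -b)
```

Note that the components may differ in sign between the two methods, which is expected: the direction of a principal axis is only defined up to a sign flip.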

On the other hand, if you already have a basic idea about PCA and are only looking for Python code, you may find this article (archive link) by Jason Brownlee very helpful.
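For completeness, the scikit-learn route mentioned above takes only a few lines; this is a generic sketch on made-up data, not code from either article:

```python
import numpy as np
from sklearn.decomposition import PCA

# Illustrative data: 10 samples, 3 features
rng = np.random.default_rng(0)
X = rng.normal(size=(10, 3))

# Fit PCA and project the data onto the top 2 components.
# fit_transform() centers the data internally, so no manual centering needed.
pca = PCA(n_components=2)
X_reduced = pca.fit_transform(X)

print(X_reduced.shape)                  # (10, 2)
print(pca.explained_variance_ratio_)    # variance captured by each component
```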

Resource for LSTM:

You may go through this article. (More will be added later as my free time permits…)
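Until then, the core of an LSTM is a single gated update per time step. Here is a minimal NumPy sketch of one LSTM cell (my own illustrative code, with randomly initialized weights; gate ordering conventions vary between libraries):

```python
import numpy as np

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step.
    W: (4H, D) input weights, U: (4H, H) recurrent weights, b: (4H,) bias.
    Gate order assumed here: [input, forget, candidate, output]."""
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = 1 / (1 + np.exp(-z[0:H]))        # input gate
    f = 1 / (1 + np.exp(-z[H:2*H]))      # forget gate
    g = np.tanh(z[2*H:3*H])              # candidate cell state
    o = 1 / (1 + np.exp(-z[3*H:4*H]))    # output gate
    c = f * c_prev + i * g               # new cell state
    h = o * np.tanh(c)                   # new hidden state
    return h, c

# Tiny example: input dim 3, hidden dim 2, sequence length 5
rng = np.random.default_rng(0)
D, H = 3, 2
W = rng.normal(size=(4 * H, D))
U = rng.normal(size=(4 * H, H))
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for x in rng.normal(size=(5, D)):
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape)  # (2,)
```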

Resources for BERT:

If you already have some idea about attention and transformers and are trying to apply transfer learning in natural language processing (NLP), then you may want to use BERT. This article (archive link) by Jay Alammar provides a good starting point for BERT. For code, I would recommend following the associated notebook, available on both GitHub and Colab. As background for BERT, you may also read this article (archive link) by the same author, and take a look at the original BERT paper.
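As a quick taste of what the notebook walks through, here is a minimal sketch of extracting BERT sentence embeddings, assuming the Hugging Face transformers library and PyTorch are installed (my own example, not code from the notebook):

```python
# Assumes: pip install transformers torch
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("BERT embeddings for transfer learning.",
                   return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# The final hidden state of the [CLS] token is commonly used
# as a fixed-size sentence representation for downstream tasks.
cls_embedding = outputs.last_hidden_state[:, 0, :]
print(cls_embedding.shape)  # torch.Size([1, 768])
```

From there, a common transfer-learning recipe is to feed these embeddings into a small classifier, either freezing BERT or fine-tuning it end to end.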
