Description:
In this course, you will learn the foundational concepts of machine learning, deep learning, and generative AI. The course covers Feed-Forward Neural Networks (FFNNs) and backpropagation, as well as applications of word vectors in Natural Language Processing. You will explore Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs), and learn how they are built, trained, and used. You will also study Long Short-Term Memory (LSTM) networks, Recurrent Convolutional Neural Networks (RCNNs), and encoder-decoder architectures, with a focus on the autoregression, self-attention, and cross-attention mechanisms used in Large Language Models.
Topics:
- Machine learning basics, deep learning, and Generative AI concepts
- Single-neuron computation: perceptrons and sigmoid neurons
- Feed-Forward Neural Networks (FFNN) and backpropagation, word vectors
- Convolutional Neural Networks (CNNs) and an image-captioning application of CNNs
- Recurrent Neural Networks (RNNs), Hopfield networks, and Boltzmann machines
- Backpropagation Through Time (BPTT) and the vanishing/exploding gradient problem
- Long Short-Term Memory (LSTM) networks, Recurrent Convolutional Neural Networks (RCNNs), encoder-decoder architectures, and decoding with autoregression, self-attention, and cross-attention
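As a taste of the single-neuron computation covered early in the course, here is a minimal sketch of a perceptron and a sigmoid neuron. All weights, biases, and inputs below are illustrative, not course material:

```python
import math

def perceptron(inputs, weights, bias):
    # Perceptron: weighted sum of inputs followed by a hard threshold.
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if s > 0 else 0

def sigmoid_neuron(inputs, weights, bias):
    # Sigmoid neuron: the same weighted sum, but passed through a smooth
    # activation, which makes the output differentiable (needed for
    # backpropagation in the FFNN topics that follow).
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-s))

# Illustrative weights that make the perceptron compute logical AND:
print(perceptron([1, 1], [1.0, 1.0], -1.5))  # -> 1
print(perceptron([1, 0], [1.0, 1.0], -1.5))  # -> 0
print(round(sigmoid_neuron([1, 1], [1.0, 1.0], -1.5), 3))  # -> 0.622
```

The only difference between the two neurons is the activation function; the sigmoid's smoothness is what allows gradient-based training.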