Lazy Programmer

Machine Learning: Natural Language Processing in Python (V2) VIP Version

CAD$80

Hello friends!


Welcome to Machine Learning: Natural Language Processing in Python (Version 2).


This is a massive 4-in-1 course covering:

1) Vector models and text preprocessing methods

2) Probability models and Markov models

3) Machine learning methods

4) Deep learning and neural network methods


In part 1, which covers vector models and text preprocessing methods, you will learn why vectors are so essential in data science and artificial intelligence. You will learn about various techniques for converting text into vectors, such as the CountVectorizer and TF-IDF, and you'll learn the basics of neural embedding methods like word2vec and GloVe.
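To give you a taste of what's ahead, here is a minimal sketch of "text to vectors" using scikit-learn (the two-document corpus is made up purely for illustration):

```python
# A minimal sketch: turning raw text into vectors with scikit-learn.
# The toy corpus below is invented for illustration.
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer

corpus = [
    "the quick brown fox jumps over the lazy dog",
    "never jump over the lazy dog quickly",
]

# Bag-of-words: each document becomes a vector of raw word counts.
count_vec = CountVectorizer()
X_counts = count_vec.fit_transform(corpus)

# TF-IDF: counts reweighted so rare, informative words carry more weight.
tfidf_vec = TfidfVectorizer()
X_tfidf = tfidf_vec.fit_transform(corpus)

print(count_vec.get_feature_names_out())
print(X_counts.toarray())
print(X_tfidf.toarray().round(2))
```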

You'll then apply what you've learned to various tasks, such as:


  • Text classification

  • Document retrieval / search engine

  • Text summarization

Along the way, you'll also learn important text preprocessing steps, such as tokenization, stemming, and lemmatization.
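As a quick illustration of those steps, here is a minimal sketch using NLTK (one popular choice; the example sentence is invented, and the course's exact tooling may differ):

```python
# A minimal sketch of tokenization, stemming, and lemmatization with NLTK.
import nltk
from nltk.stem import PorterStemmer, WordNetLemmatizer

# One-time downloads of the required NLTK data.
for pkg in ("punkt", "punkt_tab", "wordnet", "omw-1.4"):
    nltk.download(pkg, quiet=True)

sentence = "The striped bats are hanging on their feet"

stemmer = PorterStemmer()
lemmatizer = WordNetLemmatizer()

tokens = nltk.word_tokenize(sentence)                    # split into word tokens
stems = [stemmer.stem(t) for t in tokens]                # crude suffix stripping
lemmas = [lemmatizer.lemmatize(t.lower()) for t in tokens]  # dictionary-based

print(tokens)  # ['The', 'striped', 'bats', 'are', 'hanging', ...]
print(stems)   # ['the', 'stripe', 'bat', 'are', 'hang', ...]
print(lemmas)  # ['the', 'striped', 'bat', 'are', 'hanging', ..., 'foot']
```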

You'll be briefly introduced to classic NLP tasks such as part-of-speech tagging.


In part 2, which covers probability models and Markov models, you'll learn about one of the most important models of the past 100 years in data science and machine learning. It has been applied in many areas in addition to NLP, such as finance, bioinformatics, and reinforcement learning.

In this part of the course, you'll see how such probability models can be used in various ways, such as:


  • Building a text classifier

  • Article spinning

  • Text generation (generating poetry)
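To make the idea concrete before diving in, here is a minimal sketch of a first-order Markov text generator in plain Python (the "training" text is made up for illustration; the course builds much richer versions):

```python
# A toy first-order Markov text generator: each word is sampled based only
# on the word before it. The training text is invented for illustration.
import random
from collections import defaultdict

words = "the cat sat on the mat the cat ate the rat".split()

# Count word -> next-word transitions from the training text.
transitions = defaultdict(list)
for prev, curr in zip(words[:-1], words[1:]):
    transitions[prev].append(curr)

# Generate text by repeatedly sampling the next word given the current one.
random.seed(42)
word = "the"
output = [word]
for _ in range(8):
    choices = transitions.get(word)
    if not choices:  # dead end: this word never appeared mid-sentence
        break
    word = random.choice(choices)
    output.append(word)

print(" ".join(output))
```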

Importantly, these methods are an essential prerequisite for understanding how the latest Transformer (attention) models, such as BERT and GPT-3, work. Specifically, we'll learn about two important tasks that correspond to the pre-training objectives for BERT and GPT.


In part 3, which covers machine learning methods, you'll learn about more of the classic NLP tasks, such as:


  • Spam detection

  • Sentiment analysis

  • Latent semantic analysis (also known as latent semantic indexing)

  • Topic modeling

This section will be application-focused rather than theory-focused: instead of spending most of our effort on the details of the various ML algorithms, you'll focus on how they can be applied to the tasks above.

Of course, you'll still need to learn something about those algorithms in order to understand what's going on. The following algorithms will be used:


  • Naive Bayes

  • Logistic Regression

  • Principal Components Analysis (PCA) / Singular Value Decomposition (SVD)

  • Latent Dirichlet Allocation (LDA)

These are not just "any" machine learning / artificial intelligence algorithms, but rather ones that have been staples of NLP and are thus an essential part of any NLP course.
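As a flavor of this part's application-focused style, here is a minimal sketch of a Naive Bayes text classifier in scikit-learn (the tiny labeled dataset is invented; real spam detection uses thousands of examples):

```python
# A minimal spam-detection sketch: TF-IDF features fed into Naive Bayes.
# The tiny labeled dataset below is invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

texts = [
    "win a free prize now",       # spam
    "limited offer click here",   # spam
    "meeting at noon tomorrow",   # ham
    "see you at lunch",           # ham
]
labels = ["spam", "spam", "ham", "ham"]

model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(texts, labels)

print(model.predict(["free prize inside", "lunch tomorrow?"]))
# expected on this toy data: ['spam' 'ham']
```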


In part 4, which covers deep learning methods, you'll learn about modern neural network architectures that can be applied to solve NLP tasks. Thanks to their great power and flexibility, neural networks can be used to solve any of the aforementioned tasks in the course.

You'll learn about:


  • Feedforward Artificial Neural Networks (ANNs)

  • Embeddings

  • Convolutional Neural Networks (CNNs)

  • Recurrent Neural Networks (RNNs)

The study of RNNs will involve modern architectures such as the LSTM and GRU, which have been widely used by Google, Amazon, Apple, Facebook, etc., for difficult tasks such as language translation, speech recognition, and text-to-speech.
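As a preview, here is a minimal sketch of an LSTM-based binary text classifier, assuming TensorFlow/Keras (an assumption for illustration; the course's exact framework, architecture, and hyperparameters may differ):

```python
# A minimal sketch of an LSTM text classifier in TensorFlow/Keras.
# Vocabulary size and layer sizes are arbitrary illustrative choices.
import tensorflow as tf

VOCAB_SIZE, EMBED_DIM = 10_000, 32

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(None,)),              # variable-length token ids
    tf.keras.layers.Embedding(VOCAB_SIZE, EMBED_DIM),  # token id -> dense vector
    tf.keras.layers.LSTM(64),                          # read the whole sequence
    tf.keras.layers.Dense(1, activation="sigmoid"),    # binary prediction
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(padded_token_ids, labels, ...) would train it on real data.
```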

As the latest Transformers (such as BERT and GPT-3) are themselves deep neural networks, this part of the course is an essential prerequisite for understanding Transformers.

VIP-only: In the VIP version of this course, you will get your first taste of the power of Transformers. In this section, we will use the Hugging Face library to apply pre-trained NLP Transformer models to tasks such as:


  • Sentiment analysis

  • Converting text into embedding vectors for document retrieval

  • Named entity recognition (NER)

  • Text generation and language modeling

  • Masked language modeling and article spinning

  • Text summarization

  • Neural language translation

  • Question answering

  • Zero-shot classification
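To preview how accessible this is, here is a minimal sketch of the Hugging Face pipeline API for two of the tasks above (the example sentences are made up; the default pre-trained models are downloaded on first use):

```python
# A minimal sketch of Hugging Face pipelines for two tasks from the list.
from transformers import pipeline

# Sentiment analysis with a default pre-trained model.
sentiment = pipeline("sentiment-analysis")
print(sentiment("I loved this course!"))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]

# Zero-shot classification: assign labels the model was never trained on.
zero_shot = pipeline("zero-shot-classification")
print(zero_shot(
    "The stock market rallied after the earnings report.",
    candidate_labels=["business", "sports", "politics"],
))
```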

You'll notice that the first few tasks are ones you've already seen earlier in the course. This is intentional.

This section will "connect the dots" between what you learned previously and the state of the art today.

To end the section, we will go beyond just the familiar tasks to look at some very impressive feats of the modern NLP era, like zero-shot classification.

MORE BONUS CONTENT

This VIP section will contain even more content than the original VIP section (released elsewhere). In particular, you will get the following extra bonus notebooks:


  • Stock Movement Prediction Using News

  • LSA / LSI for Recommendations

  • LSA / LSI for Classification (Feature Engineering)

  • LSA / LSI for Text Summarization

  • LSA / LSI for Topic Modeling

  • Article spinner (masked language model) with LSTMs

  • seq2seq (sequence-to-sequence) with LSTMs

The final notebooks, which show how to build an article spinner and seq2seq model with LSTMs, will help to "bridge the gap" between RNNs and Transformers. Specifically, masked language modeling is a training objective for some Transformers, while seq2seq introduces the "encoder-decoder" paradigm.

Thank you for reading and I hope to see you soon!
