Tag: nlp

Understanding the Backbone of Modern NLP

Transformers have undoubtedly transformed the field of Natural Language Processing. Their attention mechanisms and multi-head attention architecture have propelled them into the limelight, enabling faster training, improved performance, and broader applicability. Whether you’re a novice or an expert, grasping the core concepts of transformers is essential for staying at the forefront of the machine learning landscape. As you embark on your journey into the world of transformers, remember that understanding the foundational principles is the first step towards harnessing their immense potential in shaping the future of NLP and beyond.

And our guide, Transformers in Action, is here to take you along that path.
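
If you want a feel for what the attention mechanism actually computes before opening the book, here is a minimal NumPy sketch of scaled dot-product attention; the shapes and inputs are toy values, and real transformers add learned projections and multiple heads on top of this.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Weight the values V by the similarity of queries Q to keys K."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                           # query-key similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)   # softmax over keys
    return weights @ V                                        # weighted sum of values

x = np.random.rand(3, 4)                                      # three toy token embeddings
print(scaled_dot_product_attention(x, x, x).shape)            # (3, 4)
```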

Learn How to Model Language as Tensors

In case you missed it, here is a recap of Chris Mattmann and Dr. Scott Penberthy’s live Twitch coding stream. For more, check out the book Machine Learning with TensorFlow, Second Edition, and subscribe to Manning’s Twitch channel for future live coding streams.
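
If you would like a tiny taste of what modeling language as tensors looks like before watching, here is a sketch that maps a sentence to a tensor of token ids; the vocabulary is invented for illustration and is not taken from the book or the stream.

```python
import tensorflow as tf

# Toy vocabulary: each word gets an integer id (made up for this example).
vocab = {"<pad>": 0, "the": 1, "cat": 2, "sat": 3, "on": 4, "mat": 5}
sentence = "the cat sat on the mat".split()

ids = [vocab[word] for word in sentence]     # words -> integer ids
tensor = tf.constant([ids])                  # shape (batch=1, sequence_length=6)
print(tensor)
```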

Multitask Learning

From Deep Learning for Natural Language Processing by Stephan Raaijmakers

This article covers multitask learning for NLP.
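
As a rough picture of the idea, here is a sketch of a shared encoder feeding two task-specific heads in PyTorch; the layer sizes and the two tasks are placeholders rather than the architecture used in the book.

```python
import torch
import torch.nn as nn

class MultitaskModel(nn.Module):
    """One shared text encoder feeding two task-specific output heads."""
    def __init__(self, vocab_size=1000, emb_dim=64, hidden=128):
        super().__init__()
        self.shared = nn.Sequential(                 # parameters shared across tasks
            nn.Embedding(vocab_size, emb_dim),
            nn.Flatten(),
            nn.LazyLinear(hidden),
            nn.ReLU(),
        )
        self.topic_head = nn.Linear(hidden, 4)       # e.g. 4 topic classes
        self.sentiment_head = nn.Linear(hidden, 2)   # e.g. positive/negative

    def forward(self, token_ids):
        h = self.shared(token_ids)
        return self.topic_head(h), self.sentiment_head(h)

model = MultitaskModel()
topic_logits, sentiment_logits = model(torch.randint(0, 1000, (8, 20)))
print(topic_logits.shape, sentiment_logits.shape)    # (8, 4) and (8, 2)
```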

Deep Transfer Learning for NLP with Transformers

From Transfer Learning for Natural Language Processing by Paul Azunre

In this article, we cover some representative deep transfer learning modeling architectures for NLP that rely on a recently popularized neural architecture – the transformer – for key functions.
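
For a quick, hands-on taste of reusing a pretrained transformer, here is a sketch with the Hugging Face transformers library; the article itself goes much deeper and may use different models and tooling.

```python
from transformers import pipeline

# Load a pretrained, already fine-tuned transformer for a downstream task.
classifier = pipeline("sentiment-analysis")
print(classifier("Transfer learning makes NLP projects much faster to start."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```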

Detecting Word Types with POS Tagging, Part 2

From Getting Started with Natural Language Processing by Ekaterina Kochmar

This article shows you how to extract the meaningful bits of information from raw text and how to identify their roles. Once you have roles identified, you can move on to syntactic parsing.
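
For a concrete look at word roles feeding into syntactic structure, here is a sketch using spaCy; the model name and example sentence are assumptions of ours, and the article’s own code may differ.

```python
import spacy

# Requires: pip install spacy && python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")
doc = nlp("The quick brown fox jumps over the lazy dog.")

for token in doc:
    # POS tag (the word's role) and dependency label (its syntactic relation)
    print(f"{token.text:<8} {token.pos_:<6} {token.dep_:<10} head={token.head.text}")
```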

Fun Uses for Word Vectors

In this video, Hobson shows you how to move words from inflammatory to less inflammatory context with the help of word vectors (Word2vec).
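
If you want to experiment along with the video, here is a sketch that loads pretrained vectors with gensim and looks up a word’s nearest neighbours, the starting point for swapping a word for a milder one; the video uses Word2vec, while this example loads small GloVe vectors for brevity.

```python
import gensim.downloader as api

# Small pretrained GloVe vectors (~66 MB download); Word2vec vectors work the
# same way through gensim's KeyedVectors interface.
wv = api.load("glove-wiki-gigaword-50")

# Words close to "terrible" in vector space are candidate softer replacements.
for word, score in wv.most_similar("terrible", topn=5):
    print(f"{word:<12} {score:.3f}")
```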

NLP Analysis of Large Text Datasets: conducted by Dr. Leonard Apeltsin

In this video, Dr. Leonard Apeltsin demonstrates clustering large text datasets in Python.
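
As a minimal sketch of the general technique (TF-IDF features plus k-means, using scikit-learn), not Dr. Apeltsin’s exact pipeline:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

docs = [
    "the cat chased the mouse",
    "dogs and cats make great pets",
    "stock prices fell on the news",
    "investors sold shares after earnings",
]

tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(docs)                        # sparse document-term matrix

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
for doc, label in zip(docs, labels):
    print(label, doc)                                # documents grouped by cluster id
```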

Detecting Word Types with Part-of-Speech Tagging, Part 1

From Getting Started with Natural Language Processing by Ekaterina Kochmar

This article shows you how to extract the meaningful bits of information from raw text and how to identify their roles. Let’s first look into why identifying roles is important.
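
As a small preview of what identifying word roles looks like in practice, here is a sketch using NLTK’s part-of-speech tagger; NLTK resource names can vary slightly between versions, and the article’s own examples may differ.

```python
import nltk

# Download the tagger model (resource names can differ across NLTK versions).
nltk.download("averaged_perceptron_tagger", quiet=True)

tokens = "She books a flight to Paris .".split()
print(nltk.pos_tag(tokens))       # prints (word, tag) pairs, e.g. ('flight', 'NN')
```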

Shallow Transfer Learning in NLP

From Transfer Learning for Natural Language Processing by Paul Azunre

This article delves into using shallow transfer learning to improve your NLP models.
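
One common shallow approach is to reuse pretrained word vectors as fixed features for a small classifier. Here is a sketch along those lines with gensim and scikit-learn; it illustrates the idea and is not the article’s exact setup.

```python
import numpy as np
import gensim.downloader as api
from sklearn.linear_model import LogisticRegression

wv = api.load("glove-wiki-gigaword-50")          # pretrained vectors, reused as-is

def doc_vector(text):
    """Average the pretrained vectors of the words we recognise."""
    vecs = [wv[w] for w in text.lower().split() if w in wv]
    return np.mean(vecs, axis=0) if vecs else np.zeros(wv.vector_size)

texts = ["great movie loved it", "terrible plot and awful acting",
         "wonderful touching performances", "boring and disappointing"]
labels = [1, 0, 1, 0]                            # toy sentiment labels

X = np.stack([doc_vector(t) for t in texts])
clf = LogisticRegression().fit(X, labels)
print(clf.predict([doc_vector("an awful boring film")]))   # likely [0]
```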

Getting Started with Baselines

From Transfer Learning for Natural Language Processing by Paul Azunre

This article discusses getting started with baselines and generalized linear models.
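
As a minimal illustration of the kind of baseline the article starts from, here is a bag-of-words model plus logistic regression (a generalized linear model) with scikit-learn; the data and labels are toy values, not the article’s.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = ["spam spam buy now", "meeting at noon tomorrow",
         "win a free prize now", "lunch with the team"]
labels = [1, 0, 1, 0]                              # 1 = spam, 0 = not spam (toy data)

# Bag-of-words counts feeding a generalized linear model.
baseline = make_pipeline(CountVectorizer(), LogisticRegression())
baseline.fit(texts, labels)
print(baseline.predict(["free prize buy now"]))    # likely [1]
```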
