Natural language processing using transformer architectures
Main author: Aurélien Géron
Format: Online
Language: English
Published: [Place of publication not identifiable]: O'Reilly Media, Inc., 2020; Sebastopol, CA: O'Reilly Media, Inc.
Edition: 1st edition
Online access: https://learning.oreilly.com/library/view/-/0636920373605/?ar
Summary: Whether you need to automatically judge the sentiment of a user review, summarize long documents, translate text, or build a chatbot, you need the best language model available. In 2018, pretty much every NLP benchmark was crushed by novel transformer-based architectures, replacing long-standing architectures based on recurrent neural networks. In short, if you're into NLP, you need transformers. But to use transformers, you need to know what they are, what transformer-based architectures look like, and how you can implement them in your projects. Aurélien Géron (Kiwisoft) dives into recurrent neural networks and their limits, the invention of the transformer, attention mechanisms, the transformer architecture, subword tokenization using SentencePiece, self-supervised pretraining (learning from huge corpora), one-size-fits-all language models, BERT and GPT-2, and how to use these language models in your projects using TensorFlow.

What you'll learn:
- Understand transformers and modern language models and how they can tackle complex NLP tasks
- Identify what tools to use and what the code looks like...
Description: 1 online resource (1 video file, approximately 45 min.)
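
The abstract mentions applying pretrained language models such as BERT in your own projects with TensorFlow. The snippet below is a minimal sketch of what that can look like, not code from the video itself: it assumes the Hugging Face transformers library, whose pipeline helper is used here as a stand-in for whatever loading code the talk demonstrates, applied to the sentiment task the abstract opens with.

```python
# Minimal sketch (assumption: Hugging Face `transformers` with a
# TensorFlow backend installed); the model the pipeline downloads by
# default is illustrative, not one named in the abstract.
from transformers import pipeline

# Load a default pretrained transformer and its tokenizer for
# sentiment analysis, running on TensorFlow.
classifier = pipeline("sentiment-analysis", framework="tf")

review = "The battery dies within an hour, but the screen is gorgeous."
print(classifier(review))
# e.g. [{'label': 'NEGATIVE', 'score': 0.98}]
```

Behind this one-liner sit the pieces the talk walks through: a subword tokenizer splits the review into pieces, and a pretrained transformer encoder scores them, so no task-specific architecture has to be written by hand.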