Colleagues, the Sequence Models for Natural Language Processing program from Google Cloud is an introduction to sequence models and their applications, including an overview of sequence model architectures and how to handle inputs of variable length. You will acquire high-demand skills: predicting future values of a time-series, classifying free-form text, addressing time-series and text problems with recurrent neural networks, choosing between RNNs/LSTMs and simpler models, and training and reusing word embeddings in text problems. You will get hands-on practice building and optimizing your own text classification and sequence models on a variety of public datasets in the labs we'll work on together. Prerequisites: basic SQL and familiarity with Python and TensorFlow.

Training modules address:
1) Working with Sequences - learn what a sequence is, how to prepare sequence data for modeling, and practice applying some classical approaches to sequence modeling.
2) Recurrent Neural Networks - see how RNNs address the variable-length sequence problem, how the standard optimization procedure applies to them, and the limits of what RNNs can and cannot represent.
3) Dealing with Longer Sequences - learn about LSTMs, deep RNNs, and working with real-world data (see the first sketch after this list).
4) Text Classification - examine different ways of working with text and how to create your own text classification models.
5) Reusable Embeddings - labeled data for classification models is expensive and precious, so here we address how to reuse pre-trained embeddings in our models with TensorFlow Hub (see the second sketch after this list).
6) Encoder-Decoder Models - study the sequence-to-sequence (encoder-decoder) architecture used to solve tasks such as machine translation, text summarization, and question answering.
7) Summary - review what you have learned so far about sequence modeling for time-series and natural language data.
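To give a flavor of the material in modules 1-3, here is a minimal sketch of an LSTM predicting the next value of a toy time-series. The synthetic sine-wave data, the window length, and the layer sizes are illustrative assumptions of mine, not taken from the course labs; it assumes TensorFlow 2.x.

```python
import numpy as np
import tensorflow as tf

# Toy univariate time-series (my illustrative stand-in for real data).
series = np.sin(np.arange(0, 100, 0.1)).astype("float32")

# Slice the series into sliding windows of 10 past values (inputs)
# paired with the value that immediately follows (target).
window = 10
X = np.stack([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]
X = X[..., np.newaxis]  # shape: (samples, timesteps, features)

# A single LSTM layer feeding a one-unit regression head.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(window, 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, verbose=0)

# Predict the value following the first window.
print(model.predict(X[:1], verbose=0))
```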
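And in the spirit of modules 4-5, a minimal sketch of a text classifier that reuses a pre-trained sentence embedding from TensorFlow Hub. The specific Hub module (nnlm-en-dim50), the two-example toy dataset, and the layer sizes are my illustrative choices rather than what the course labs use; it assumes TensorFlow 2.x with the tensorflow_hub package installed.

```python
import tensorflow as tf
import tensorflow_hub as hub

# Reuse a publicly available pre-trained sentence embedding, frozen,
# so the scarce labeled data is spent only on the classifier head.
embed = hub.KerasLayer(
    "https://tfhub.dev/google/nnlm-en-dim50/2",
    input_shape=[],      # each example is a single raw string
    dtype=tf.string,
    trainable=False,
)

model = tf.keras.Sequential([
    embed,
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # binary classifier
])
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])

# Tiny toy dataset, purely for illustration.
texts = tf.constant(["loved this product", "utterly disappointing"])
labels = tf.constant([1.0, 0.0])
model.fit(texts, labels, epochs=3, verbose=0)
```

Because the embedding layer is frozen, only the small dense head is trained, which is the point of reusable embeddings when labeled data is expensive.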
Enroll today (teams & execs welcome): https://tinyurl.com/2p9yyx3v
Much career success, Lawrence E. Wilson - Artificial Intelligence Academy