Creating Your Own Chat Agent / Using the OpenAI API: Training Description
Duration: 3 days
About this Course
In this 3-day developer class on working with Large Language Models, we will show you how to use Transformers for Natural Language Processing and leverage the capabilities available on Hugging Face.
You’ll learn how transformer models work and their limitations, as well as how to fine-tune a pre-trained model using the Trainer API or Keras. We’ll cover sharing models and tokenizers on the Hugging Face Hub and how to create your own dataset and perform a semantic search with FAISS using the Datasets library.
Join us for an interactive 3-day journey into the world of Large Language Models with Hugging Face, and take your Natural Language Processing projects to the next level.
Prerequisites
To take this 3-day course, you should have taken our AI Workbench class or have basic knowledge of programming concepts and syntax in a language such as Python or JavaScript. General familiarity with APIs is also recommended.
What’s included?
- Authorized Courseware
- Intensive Hands-On Skills Development with an Experienced Subject Matter Expert
- Hands-on practice on real servers and extended lab support (1.800.482.3172)
- Examination Vouchers & Onsite Certification Testing (excluding Adobe and PMP Boot Camps)
- Academy Code of Honor: Test Pass Guarantee
- Optional: Package for Hotel Accommodations, Lunch and Transportation
With several convenient training delivery methods offered, The Academy makes getting the training you need easy. Whether you prefer to learn in a classroom, in a live online virtual environment, through training videos hosted online, or in private group classes hosted at your site, we offer expert instruction to individuals, government agencies, non-profits, and corporations. Our live classes, on-sites, and online training videos all feature certified instructors who teach a detailed curriculum and share their expertise and insights with trainees. No matter how you prefer to receive the training, you can count on The Academy for an engaging and effective learning experience.
Methods
- Instructor-Led (the best training format we offer)
- Live Online Classroom – Online Instructor-Led
- Self-Paced Video
Speak to an Admissions Representative for complete details
Start | Finish | Public Price | Public Enroll | Private Price | Private Enroll |
---|---|---|---|---|---|
9/23/2024 | 9/25/2024 | ||||
10/14/2024 | 10/16/2024 | ||||
11/4/2024 | 11/6/2024 | ||||
11/25/2024 | 11/27/2024 | ||||
12/16/2024 | 12/18/2024 | ||||
1/6/2025 | 1/8/2025 | ||||
1/27/2025 | 1/29/2025 | ||||
2/17/2025 | 2/19/2025 | ||||
3/10/2025 | 3/12/2025 | ||||
3/31/2025 | 4/2/2025 | ||||
4/21/2025 | 4/23/2025 | ||||
5/12/2025 | 5/14/2025 | ||||
6/2/2025 | 6/4/2025 | ||||
6/23/2025 | 6/25/2025 | ||||
7/14/2025 | 7/16/2025 | ||||
8/4/2025 | 8/6/2025 | ||||
8/25/2025 | 8/27/2025 | ||||
9/15/2025 | 9/17/2025 | ||||
10/6/2025 | 10/8/2025 | ||||
10/27/2025 | 10/29/2025 | ||||
11/17/2025 | 11/19/2025 | ||||
12/8/2025 | 12/10/2025 | ||||
12/29/2025 | 12/31/2025 | ||||
1/19/2026 | 1/21/2026 |
Outline
TRANSFORMER MODELS
Natural Language Processing
Transformers, what can they do?
How do Transformers work?
Encoder models
Decoder models
Sequence-to-sequence models
Bias and limitations
USING TRANSFORMERS
Behind the pipeline
Models
Tokenizers
Handling multiple sequences
Putting it all together
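The "Handling multiple sequences" topic above centers on batching: sequences of different lengths must be padded to a common length, with an attention mask marking which positions hold real tokens. As a rough illustration of the idea (the token IDs and pad ID below are invented; in the course, Hugging Face tokenizers handle this automatically):

```python
# Pad variable-length token-ID sequences to one length and build attention
# masks, mimicking what a tokenizer does with padding enabled.
PAD_ID = 0  # hypothetical padding token ID

def pad_batch(sequences):
    """Return (padded_ids, attention_masks) for a list of token-ID lists."""
    max_len = max(len(seq) for seq in sequences)
    padded, masks = [], []
    for seq in sequences:
        pad_count = max_len - len(seq)
        padded.append(seq + [PAD_ID] * pad_count)
        masks.append([1] * len(seq) + [0] * pad_count)  # 1 = real token, 0 = pad
    return padded, masks

ids, masks = pad_batch([[101, 2023, 102], [101, 2023, 2003, 1037, 102]])
```

The model then ignores the padded positions wherever the mask is 0, so examples of different lengths can share one batch.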
FINE-TUNING A PRE-TRAINED MODEL
Processing the data
Fine-tuning a model with the Trainer API or Keras
A full training
Fine-tuning, Check!
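The "A full training" topic refers to writing by hand the loop that the Trainer API otherwise automates: forward pass, loss, gradient, parameter update, repeated over the data. A toy sketch of that structure, using plain gradient descent on a single weight rather than a real model (the data and learning rate here are made up for illustration):

```python
# Minimal training loop fitting w in pred = w * x to data generated by y = 2x.
# Illustrates the loop structure only; real fine-tuning uses a model,
# an optimizer, and batched DataLoaders.
data = [(x, 2.0 * x) for x in range(1, 6)]  # (input, target) pairs
w = 0.0    # single trainable weight
lr = 0.01  # learning rate

for epoch in range(200):
    for x, y in data:
        pred = w * x               # forward pass
        grad = 2 * (pred - y) * x  # d(squared error)/dw for one example
        w -= lr * grad             # update step
```

After training, `w` converges close to 2.0; the Trainer API wraps this same cycle with batching, logging, evaluation, and checkpointing.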
SHARING MODELS AND TOKENIZERS
The Hugging Face Hub
Using pre-trained models
Sharing pre-trained models
Building a model card
Part 1 completed!
End-of-chapter quiz
THE DATASETS LIBRARY
What if my dataset isn’t on the Hub?
Time to slice and dice
Big data? Datasets to the rescue!
Creating your own dataset
Semantic search with FAISS
Datasets, Check!
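The semantic-search topic above boils down to nearest-neighbor search over embedding vectors, which FAISS accelerates at scale. A brute-force sketch of the same operation in plain Python (the document IDs and tiny hand-made vectors are invented; in the course the embeddings would come from a model and the index from FAISS):

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def search(query, index, k=1):
    """Return the k (score, doc_id) pairs most similar to the query vector."""
    scored = [(cosine(query, vec), doc_id) for doc_id, vec in index.items()]
    return sorted(scored, reverse=True)[:k]

# Hypothetical 3-dimensional "embeddings" for two documents.
index = {"doc_a": [0.9, 0.1, 0.0], "doc_b": [0.1, 0.8, 0.3]}
top = search([1.0, 0.0, 0.1], index)
```

This linear scan is exact but O(n) per query; FAISS builds index structures that make the same lookup fast over millions of vectors.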