Understanding Text Generation Parameters in Transformers
Rick W

This post is divided into seven parts; they are:

- Core Text Generation Parameters
- Experimenting with Temperature
- Top-K and Top-P Sampling
- Controlling Repetition
- Greedy Decoding and Sampling
- Parameters for Specific Applications
- Beam Search and Multiple Sequences Generation

Let's pick the GPT-2 model as an example.
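Before diving into each part, it helps to see what these parameters actually do to a next-token probability distribution. The following is a minimal, library-free sketch (not the Transformers library's internal implementation) of how temperature scaling, top-k filtering, and top-p (nucleus) filtering reshape a toy set of logits; the `adjust_logits` function and the example logits are illustrative inventions, not part of any library API.

```python
import math

def adjust_logits(logits, temperature=1.0, top_k=0, top_p=1.0):
    """Apply temperature scaling, then top-k and top-p (nucleus) filtering."""
    # Temperature divides logits before softmax: <1 sharpens, >1 flattens.
    scaled = [l / temperature for l in logits]
    # Numerically stable softmax.
    m = max(scaled)
    exps = [math.exp(l - m) for l in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Rank token indices by probability, highest first.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    keep = set(order)
    if top_k > 0:
        # Top-k: keep only the k most probable tokens.
        keep &= set(order[:top_k])
    if top_p < 1.0:
        # Top-p: keep the smallest prefix of tokens whose cumulative
        # probability reaches the threshold.
        cum, nucleus = 0.0, []
        for i in order:
            nucleus.append(i)
            cum += probs[i]
            if cum >= top_p:
                break
        keep &= set(nucleus)
    # Zero out the filtered tokens and renormalize the rest.
    filtered = [p if i in keep else 0.0 for i, p in enumerate(probs)]
    z = sum(filtered)
    return [p / z for p in filtered]

logits = [2.0, 1.0, 0.5, 0.1]  # toy next-token logits
probs = adjust_logits(logits, temperature=0.7, top_k=3, top_p=0.9)
```

A real generation loop would then sample the next token from `probs` and repeat; the sections below examine each of these parameters in turn with the actual GPT-2 model.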