Practical workshop: the softmax function, temperature, top_k, and top_p
The complete source code, with all methods and tests, is available on GitHub in the ch03_softmax_workshop.py file, but you are encouraged to walk through the code and repeat each step yourself.
We will work with NumPy arrays. We'll start by defining our sample logits and initializing sampling parameters such as temperature:
import numpy as np

logits = {
    'guitar': 2.5,
    'melody': 2.2,
    'whisper': 1.4,
    'sunset': 1.3,
    'avalanche': 0.3,
    'elephant': 0.2
}

temperature = 0.7
p = 0.7
min_p_value = 0.3
k = 3
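Since the later steps operate on arrays rather than on the dictionary itself, it helps to split the logits into parallel arrays of tokens and raw scores first. The following is a minimal sketch of that preparation; the variable names `tokens` and `scores` are illustrative choices, not names mandated by the workshop file:

```python
import numpy as np

logits = {
    'guitar': 2.5,
    'melody': 2.2,
    'whisper': 1.4,
    'sunset': 1.3,
    'avalanche': 0.3,
    'elephant': 0.2
}

# Parallel arrays: tokens[i] is the word whose raw score is scores[i].
# In Python 3.7+ dictionaries preserve insertion order, so the pairing is stable.
tokens = np.array(list(logits.keys()))
scores = np.array(list(logits.values()))

print(tokens)
print(scores)
```

Keeping the scores in a NumPy array lets every subsequent transformation (temperature scaling, softmax, top_k and top_p filtering) be written as a vectorized operation instead of a loop over dictionary entries.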
Now we are ready to proceed step by step. First, we will implement the basic softmax function, without any temperature scaling.
Step 1: Implementing the softmax function
This step focuses on converting raw model outputs (logits) into a meaningful probability distribution using the softmax function. This operation mirrors how a language...