kciwsnurb@aussie.zone to Technology@beehaw.org • Ask ChatGPT to pick a number between 1 and 100 • 7 months ago
The temperature parameter, I think. You divide the logits by the temperature before feeding them to the softmax function, so a larger temperature gives a higher-entropy (flatter) distribution, and a smaller temperature gives a lower-entropy (peakier) one.
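A minimal sketch of what that looks like in code (plain Python, hypothetical logit values just for illustration):

```python
import math

def softmax_with_temperature(logits, temperature=1.0):
    """Divide logits by temperature, then apply softmax.
    temperature > 1 flattens the distribution (higher entropy);
    temperature < 1 sharpens it (lower entropy)."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]  # made-up logits for three tokens
sharp = softmax_with_temperature(logits, temperature=0.5)  # peakier
flat = softmax_with_temperature(logits, temperature=2.0)   # flatter
print(sharp)
print(flat)
```

With temperature 0.5 the top token soaks up most of the probability mass; with temperature 2.0 the distribution spreads out, which is why sampling at high temperature gives more varied (and at very high values, more random) outputs.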