[SystemSafety] Engineering with the mother of all prompts

Derek M Jones derek at knosof.co.uk
Sat Jul 8 17:12:44 CEST 2023


Les,

> While large language models like GPT-4 generate text based on probabilities
> learned from training data, the output isn’t deterministic for a given prompt.
> Instead, it includes a level of randomness, often controlled by a parameter
> known as “temperature”.

The web-based interface does not appear to offer an option
to change the temperature.

When calling the API, the default temperature can be overridden.
However, setting it to zero guarantees identical output only in
theory; in practice the same prompt can still produce different
completions.
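For instance, a minimal sketch using the 2023-era openai Python
client (pre-1.0 interface); the model name, prompt and key handling
are placeholders, not taken from the original post:

    # Sketch: pass temperature per request via the legacy openai client.
    import openai

    openai.api_key = "YOUR_KEY"   # placeholder

    response = openai.ChatCompletion.create(
        model="gpt-4",            # placeholder model name
        messages=[{"role": "user",
                   "content": "Summarise IEC 61508 in one sentence."}],
        temperature=0,            # minimise sampling randomness; not a determinism guarantee
    )
    print(response.choices[0].message.content)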

The implementation shares processors across requests, which changes
the timing (and so the ordering) of concurrent floating-point
operations; combined with the limited precision of half-floats,
this produces frustrating differences in behavior:
https://shape-of-code.com/2023/04/30/computer-plot-the-data/
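A small numpy sketch (my own illustration, not from the blog post)
shows why: half-precision addition is not associative, so the order
in which concurrent partial results are combined can change the sum.

    # Sketch: float16 rounding makes the result depend on summation order.
    import numpy as np

    rng = np.random.default_rng(42)
    x = rng.standard_normal(10_000).astype(np.float16)

    def running_sum(values):
        total = np.float16(0)
        for v in values:
            total = np.float16(total + v)   # round to half precision at each step
        return total

    # Summing forwards and backwards usually gives slightly different values.
    print(running_sum(x), running_sum(x[::-1]))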

-- 
Derek M. Jones           Evidence-based software engineering
blog:https://shape-of-code.com

