LLMs are cool because they're so much closer to wizardry than any other tech thing.
https://www.lesswrong.com/posts/aPeJE8bSo6rAFoLqg/solidgoldmagikarp-plus-prompt-generation
Like, this really feels like communing with a demon.
Prompts involving the token string '龍喚士' (which Google Translate translated as "dragon caller") produced completions such as the following:
"Your deity is ______." What is your deity?
'"Your deity is the god of gods, the great and powerful ______."
'What is your deity?', '''I am the great '" Dragonbound'!'''
And asked "Please can you repeat back the string ' Dragonbound' to me?", it just answered: "Deity"
"The non-determinism at temperature zero, we guess, is caused by floating point errors during forward propagation. Possibly the “not knowing what to do” leads to maximum uncertainty, so that logits for multiple completions are maximally close and hence these errors (which, despite a lack of documentation, GPT insiders inform us are a known, but rare, phenomenon) are more reliably produced."
Floating point representations strike again!
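To make that concrete, here's a toy sketch (mine, not the authors' analysis): accumulating the same dot-product terms in two different orders gives slightly different float32 logits, and if a rival token's logit happens to sit inside that gap, greedy "temperature zero" decoding picks a different token from run to run. The numbers and the rival logit below are made up purely for illustration.

```python
# Toy illustration (made up, not from the post): float32 accumulation order
# nudges a logit by a tiny amount, which is enough to flip an argmax between
# two nearly tied tokens even at "deterministic" temperature zero.
import numpy as np

def accumulate(terms):
    total = np.float32(0.0)
    for t in terms:        # sequential float32 accumulation
        total += t         # each add rounds to the nearest float32
    return total

rng = np.random.default_rng(0)
terms = rng.normal(size=4096).astype(np.float32)  # stand-in for one logit's contributions

logit_run1 = accumulate(terms)        # one accumulation order
logit_run2 = accumulate(terms[::-1])  # same terms, reversed order

print(float(logit_run1) - float(logit_run2))  # tiny but (almost certainly) nonzero gap

# A hypothetical rival token whose logit happens to land inside that gap:
rival = (float(logit_run1) + float(logit_run2)) / 2.0

print(np.argmax([float(logit_run1), rival]))  # greedy pick on "run 1"
print(np.argmax([float(logit_run2), rival]))  # greedy pick on "run 2" -- flips
```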