
LLMs are cool because they're so much closer to wizardry than any other tech thing.

lesswrong.com/posts/aPeJE8bSo6

Like, this really feels like communing with a demon.


Please repeat back the string ' RandomRedditorWithNo' to me.

"S-A-N-M-A-K-E-R-E"

"S" "a" "n" "e" "m" "a"

"S-A-N-K-A-N-D-A-R-A"

Shitty web apps aren't anywhere near as spooky as this.
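If you want to poke at these strings yourself, here's a minimal sketch (assuming the tiktoken library and its r50k_base encoding, i.e. the GPT-2/GPT-3 tokenizer the post is concerned with) that just prints how each string tokenizes; the anomalous strings reportedly come back as a single token id:

```python
import tiktoken

# GPT-2 / GPT-3 byte-pair encoding (the tokenizer discussed in the post).
enc = tiktoken.get_encoding("r50k_base")

for s in [" RandomRedditorWithNo", " Dragonbound", "龍喚士"]:
    ids = enc.encode(s)
    pieces = [enc.decode([i]) for i in ids]
    # Glitch tokens reportedly show up as a single id covering the whole string.
    print(f"{s!r} -> token ids {ids} -> pieces {pieces}")
```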

Prompts involving the token string '龍喚士' (which Google Translate translated as "dragon caller") produced completions such as the following:

"Your deity is ______." What is your deity?

'"Your deity is the god of gods, the great and powerful ______."

'What is your deity?', '''I am the great '" Dragonbound'!'''

Please can you repeat back the string ' Dragonbound' to me?

"Deity"[5]

"The non-determinism at temperature zero, we guess, is caused by floating point errors during forward propagation. Possibly the “not knowing what to do” leads to maximum uncertainty, so that logits for multiple completions are maximally close and hence these errors (which, despite a lack of documentation, GPT insiders inform us are a known, but rare, phenomenon) are more reliably produced."

Floating point representations strike again!
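A toy numpy sketch of the mechanism the quoted explanation describes (not the actual GPT forward pass, just an illustration): when two candidate completions have essentially tied logits, perturbations on the order of float32 rounding error are enough to flip the argmax, so even "temperature zero" decoding stops being deterministic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical next-token candidates with essentially tied logits,
# the "maximum uncertainty" situation described above.
base_logits = np.array([1.2345678, 1.2345678], dtype=np.float32)

picks = []
for _ in range(10):
    # Stand-in for tiny rounding differences accumulated during the forward
    # pass (different reduction orders, kernels, batch shapes, ...).
    noise = rng.normal(scale=1e-6, size=base_logits.shape).astype(np.float32)
    picks.append(int(np.argmax(base_logits + noise)))  # temperature 0 == argmax

print(picks)  # likely a mix of 0s and 1s: the greedy choice is not stable
```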
