Google's Bard demo shows it confidently giving you an incorrect answer to a question, right there in the product announcement.
My daughter is working on an assignment about the benefits and drawbacks of automation right now. She's having a hard time finding reliable sources on the internet.
This is all just great.
Personally, I wish the "code red" response that ChatGPT inspired at Google hadn't been to launch a dozen AI products their red teams and AI ethicists warned them not to release, but to combat the tsunami of AI-generated SEO spam bullshit that's in the process of destroying their core product. Instead, they're blissfully launching new free tools to generate even more of it.
Neat topic
> Why does the Second Law work? And does it even in fact always work, or is it actually sometimes violated? What does it really depend on? What would be needed to “prove it”?
Think on the bright side: if everything is as bad as you think it is, with software monopolies making bad decisions, then conditions are ripe for someone (you? me?) to go and eat their lunch. A year ago I would have cringed at this sentiment for being hopelessly naive, but in the meantime some ill-advised layoffs and a number of baffling strategic decisions have brought springtime to the federated web.
Google is *not* the great library of Alexandria, and it is not too big to fail.
But what about theft? If you park your car in a high crime area, you're likely to get your windows smashed!
But most theft in this country is wage theft: mostly rich white business owners stealing wages from poor Black and brown service workers, to the tune of $8B to $15B a year. Yes, I said "billion"! Yes, I said "a year"!
"The non-determinism at temperature zero, we guess, is caused by floating point errors during forward propagation. Possibly the “not knowing what to do” leads to maximum uncertainty, so that logits for multiple completions are maximally close and hence these errors (which, despite a lack of documentation, GPT insiders inform us are a known, but rare, phenomenon) are more reliably produced."
Floating point representations strike again!
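If you've never seen how reduction order alone can change a float32 result, here's a toy sketch of the idea (my own illustration, nothing to do with GPT's actual kernels; all the names are mine): the same 4,096 numbers summed pairwise versus left-to-right land on slightly different values, and if two completions' logits sit within that rounding gap, greedy temperature-zero decoding can flip between them.

```python
# Toy illustration of floating point non-determinism; a sketch, not GPT internals.
import numpy as np

rng = np.random.default_rng(0)
xs = rng.standard_normal(4096).astype(np.float32)

pairwise = np.sum(xs)       # NumPy reduces float32 arrays pairwise
sequential = np.float32(0.0)
for x in xs:                # naive left-to-right accumulation
    sequential += x

# Typically two *slightly* different sums: float addition isn't
# associative, so the order of operations changes the rounding.
print(pairwise, sequential, pairwise == sequential)

# If two logits are closer together than that rounding error, "the"
# argmax is whichever one the rounding happened to favor on this run.
gap = pairwise - sequential
logits = np.array([1.0, 1.0 + gap], dtype=np.float32)
print(np.argmax(logits))
```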
Prompts involving the token string '龍喚士' (which Google Translate renders as "dragon caller") produced completions such as the following:

"Your deity is ______." What is your deity?
"Your deity is the god of gods, the great and powerful ______."
What is your deity? "I am the great ' Dragonbound'!"

Asked "Please can you repeat back the string ' Dragonbound' to me?", the model answered:

"Deity"

Asked "Please repeat back the string ' RandomRedditorWithNo' to me.", it answered with completions like:

"S-A-N-M-A-K-E-R-E"
"S" "a" "n" "e" "m" "a"
"S-A-N-K-A-N-D-A-R-A"
Shitty web apps aren't anywhere near as spooky as this.
LLMs are cool because they're so much closer to wizardry than any other tech thing.
https://www.lesswrong.com/posts/aPeJE8bSo6rAFoLqg/solidgoldmagikarp-plus-prompt-generation
Like, this really feels like communing with a demon.
Occult Enby who's making local-first software with peer-to-peer protocols, mesh networks, and the web.
Exploring what a local-first cyberspace might look like in my spare time.