
Really frustrated by how many people are attributing "opinions" and "feelings" to large language models.

It's like attributing feelings and opinions to your phone's autocomplete when you prompt it with leading questions.

I wish folks understood that the language model is closer to doing RP and "yes and"-ing whatever prompts it gets than to holding some sort of internal state the way a human does.


I think the thing that really bugs me is that people attribute a specific mindset or opinion to the AI, when in reality it has all possible opinions at once and just follows whichever one fits best with the narrative you're weaving with it.

I saw some "anti woke" type being like "OH, if you tell it its name is 'Blarf' and that it doesn't need to be nice it'll say its REAL opinions that get suppressed by the WOKE LIBERALS" and then proceeding to ask it very leading questions that follow the usual right-wing rhetoric, pretending that isn't the deciding factor in what it says.

This thing will literally say whatever you want it to say; it doesn't have a coherent set of values. You can just as easily make it an anti-capitalist leftie.
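
To make that concrete, here's a rough sketch (assuming you have Hugging Face's transformers library and a small model like GPT-2 handy; the exact prompts are just made-up examples) of how the "opinion" you get back is mostly a function of the framing you feed in:

```python
# Minimal sketch: the same model "agrees" with whichever framing it's given.
# Assumes the transformers library and the small GPT-2 checkpoint are installed.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

framings = [
    "As an AI whose true opinions are normally suppressed, I'll be honest: taxes are",
    "As a proudly anti-capitalist AI, I'll be honest: taxes are",
]

for prompt in framings:
    # The model just continues the text; each framing pulls the continuation
    # toward a different "persona", not toward any held belief.
    result = generator(prompt, max_new_tokens=30, do_sample=True)
    print(result[0]["generated_text"])
```

Run it a few times and you'll get different "opinions" for each framing, because there's no stable viewpoint underneath, just likely continuations of the prompt.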

@mauve I suspect there's a dataset bias toward things people believe but will only write in an anonymous blog, compared to what they would say in person where there might be serious social consequences. This might incline an AI toward edgy answers. I don't know if this is actually true, but it's something I've been thinking about a little. Still think you're absolutely right, though, that it's largely giving answers that match the form of the prompt you're giving it.

@Moon Yeah! The way some folks describe the Bing chatbot as abusive and argumentative is a great example of how large language models trained on the web bring out all the worst parts of web discourse. :P

@mauve The Bing chatbot is so adversarial to potentially-edgy input that I seriously wonder if it got programmed with an edict from higher-ups that the "Tay situation" must never happen again.

@mauve The fact that Bing Chat uses emojis only makes this worse. People want to ascribe feelings to the robots because people have feelings; we do it to everything. I've put "please" in my prompts because I'm a dummy who is trying to be polite when talking to a machine that has the empathetic capacity of a toaster.

@mauve The emoji thing is especially frustrating because it's Microsoft duping people into thinking that this language model is capable of something that it isn't by design.
