What does the arab in your carrd mean? Is it like afab and amab?
.. i’m palestinian
I'm starting a collection
"What if in the Mad Max universe everything outside of Australia was completely normal"
What if in the Mad Max universe the whole planet became a desert world but everything outside of Australia was just doing Dune shit
the monty hall problem is something i find interesting and i wish there was also a term to describe the way people respond to the monty hall problem. like, "i don't understand this explanation of statistics (a thing i obviously know little about) so i'm going to assume you're just lying to me". the monty-hall-problem problem
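(the nice thing about the monty hall problem is that nobody has to take the statistics on faith: you can just simulate it. here's a minimal Python sketch — the door labels, trial count, and function name are arbitrary choices for illustration, nothing canonical:)

```python
import random

def monty_hall_trial(switch: bool) -> bool:
    """One round: car behind a random door, we always pick door 0,
    the host opens a goat door we didn't pick, then we stay or switch."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = 0
    # The host knowingly opens a door that is neither our pick nor the car.
    host_opens = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        # Take the one remaining closed door.
        pick = next(d for d in doors if d != pick and d != host_opens)
    return pick == car

trials = 100_000
stay = sum(monty_hall_trial(switch=False) for _ in range(trials)) / trials
swap = sum(monty_hall_trial(switch=True) for _ in range(trials)) / trials
print(f"stay wins:   {stay:.3f}")  # ~0.333
print(f"switch wins: {swap:.3f}")  # ~0.667
```

(run it and staying wins about a third of the time while switching wins about two thirds, because the host's reveal isn't random: the two remaining doors are not 50/50)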
I need to find an English phrase with the same gravitas as "Cállate ya, hijo de la gran puta, que eres tontísimo" (roughly: "Shut up already, you son of a huge bitch, you're incredibly stupid").
Love this, we do really need more frogs
(in my mind, they are dividing like cells into more and more types of frogs ((sorry, just trying to add color here)))
hey don't cry. 7,401 species of frog in the world, ok?
hope is a skill
The problem here isn’t that large language models hallucinate, lie, or misrepresent the world in some way. It’s that they are not designed to represent the world at all; instead, they are designed to convey convincing lines of text. So when they are provided with a database of some sort, they use this, in one way or another, to make their responses more convincing. But they are not in any real way attempting to convey or transmit the information in the database. As Chirag Shah and Emily Bender put it: “Nothing in the design of language models (whose training task is to predict words given context) is actually designed to handle arithmetic, temporal reasoning, etc. To the extent that they sometimes get the right answer to such questions is only because they happened to synthesize relevant strings out of what was in their training data. No reasoning is involved […] Similarly, language models are prone to making stuff up […] because they are not designed to express some underlying set of information in natural language; they are only manipulating the form of language” (Shah & Bender, 2022). These models aren’t designed to transmit information, so we shouldn’t be too surprised when their assertions turn out to be false.
ChatGPT is bullshit
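For a concrete, if cartoonishly simplified, illustration of "only manipulating the form of language," here is a toy bigram model in Python. It is nothing like a real transformer, and the corpus and names below are invented for the example, but the training task is the same in kind: predict the next word from context, with no representation of whether the output is true.

```python
import random
from collections import defaultdict

# Toy "language model": bigram counts over a tiny corpus. It stores no
# facts, only statistics about which word tends to follow which.
corpus = (
    "the capital of france is paris . "
    "the capital of spain is madrid . "
    "the capital of france is large . "
).split()

follows = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev].append(nxt)

def generate(prompt: str, max_words: int = 8) -> str:
    words = prompt.split()
    for _ in range(max_words):
        candidates = follows.get(words[-1])
        if not candidates:
            break
        # Sample the next word purely from co-occurrence statistics.
        words.append(random.choice(candidates))
    return " ".join(words)

print(generate("the capital of"))
```

A sampled continuation like "the capital of spain is paris ." is perfectly fluent given the model's statistics; whether it is true never enters the procedure, which is exactly the point of the quote above.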