To use Black labor without compensating Black people
Read more: www.theverge.com/health/66136...
This is a disability rights issue.
This is what we’ve been warning about for years.
“Choose your baby” is another way of saying “eliminate the undesirables”.
It’s out in the open and it’s wrong.
Written by Aly Laube
thesicktimes.org/2025/12/19/t...
One of the few studies is testing “cognitive rehabilitation therapy,” which recently failed in a U.S. trial.
thesicktimes.org/2025/12/16/d...
Wow, this almost suggests covid is implicated in an unbelievable number of deaths. And that vaccinated people should wear respirators if they want to remain abled.
www.cidrap.umn.edu/covid-19/mrn...
www.thedailybeast.com/vanity-fair-...
"Hallucinations" are the correct output, for the input, and training....everytime.
"Hallucinations" are the correct output, for the input, and training....everytime.
It's an LLM: pure rote regurgitation. It doesn't understand the nature of the sentences it structures,
and it can't fathom the "real, material world" those sentences are supposed to symbolize.
A "hallucination" isn't a thing, because an LLM isn't sentient, and the concept of "true/false" or "correct/incorrect" don't exist to it
It doesn't have the ability of a priori reasoning, it isn't capable of "conceptual" or "meaningful" learning.
A "hallucination" isn't a thing, because an LLM isn't sentient, and the concept of "true/false" or "correct/incorrect" don't exist to it
It doesn't have the ability of a priori reasoning, it isn't capable of "conceptual" or "meaningful" learning.
AI execs invented the term so they can pretend that wrong answers or bad outputs are some kind of flaw that goes against the AI's programming, and that it's "supposed" to give you the "correct" answer.
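To make that concrete, here is a minimal toy sketch of next-token sampling (my own illustration with made-up tokens and weights, not how any real model is built): the loop only asks which continuation is statistically likely given the "training" data, and nothing in it ever checks whether the output is true.

import random

# Hypothetical "learned" statistics: each token maps to likely next tokens with weights.
learned = {
    "the": [("cat", 0.6), ("moon", 0.4)],
    "cat": [("sat", 0.7), ("landed", 0.3)],
    "moon": [("sat", 0.5), ("landed", 0.5)],
}

def next_token(prev):
    tokens, weights = zip(*learned[prev])
    # No notion of true/false here: the pick is whatever the training statistics favor.
    return random.choices(tokens, weights=weights)[0]

sentence = ["the"]
while sentence[-1] in learned:
    sentence.append(next_token(sentence[-1]))

# Prints fluent text like "the cat landed": a perfectly "correct" output of the
# statistics, whether or not it describes the real, material world.
print(" ".join(sentence))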