🤖 AI, LLMs, GenAI, NLP
🐍 Python Dev
🚀 Indie Hacker
🎮 Game Dev, ProcGen, Unity, C#
🏎️ F1 Fan
🇬🇧 UK Based
🦣 mastodonapp.uk/@StuartGray
✖️ x.com/StuartGray (inactive)
Procedural Generation has nothing to do with AI, and has been around for decades with no complaints.
These people are now suddenly against Fractals of all things!?
It’s been used in music and big commercial games since at least the early 90s.
The main age-verification lobbyist — a man who largely believes porn should be outlawed — admitted the state-level bills he pushed for won't work and were really a predicate for federal action.
He wants the DOJ to seize domains.
* Skyrim had an unusually long life (and still going!), partly thanks to 3rd party mods
* So much so that the Bethesda head grew to massively resent what he viewed as “lost revenue”!
* There was/is also ESO Online
The "Penrose Effect" seems to be a real thing - hypothesised in the 1930s and re-tested in the last decade or so:
Where you reduce your inpatient psychiatric provision, you'll see a correlated rise within ~10 years in the number of seriously mentally ill prisoners.
The "Penrose Effect" seems to be a real thing - hypothesised in the 1930s and re-tested in the last decade or so:
Where you reduce your inpatient psychiatric provision, you'll see a correlated rise within 10yrs in prisons of seriously mentally ill prisoners.
We split MMLU in two parts (leaked/clean) and show that almost all models tend to perform better on leaked samples
These websites can then be found in CommonCrawl dumps that are generally used for pretraining data curation...
For instance, the fraction of MMLU questions that are leaked in pretraining went from ~1% to 24% between OLMo-1 and 2 😬
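For anyone curious what that leaked/clean split looks like in practice, here's a rough Python sketch of the general idea (my own illustration, not the paper's actual method): flag a benchmark question as "leaked" if any long word n-gram from it appears in a sample of the pretraining corpus, then score the model on the two subsets separately. All names (questions, corpus_docs, model_is_correct) are placeholders.

```python
# Sketch only: split a benchmark into "leaked" vs "clean" subsets by checking
# whether each question appears near-verbatim in a sample of pretraining text,
# then compare model accuracy on the two subsets.
import re


def normalize(text: str) -> str:
    """Lowercase and strip punctuation so near-verbatim copies still match."""
    return re.sub(r"[^a-z0-9 ]+", " ", text.lower()).strip()


def ngrams(text: str, n: int = 13) -> set:
    """Word n-grams; long n-grams are very unlikely to collide by chance."""
    words = normalize(text).split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}


def split_leaked_clean(questions, corpus_docs, n: int = 13):
    """Mark a question 'leaked' if any of its n-grams occurs in the corpus sample."""
    corpus_grams = set()
    for doc in corpus_docs:
        corpus_grams |= ngrams(doc, n)
    leaked, clean = [], []
    for q in questions:
        (leaked if ngrams(q, n) & corpus_grams else clean).append(q)
    return leaked, clean


def accuracy(questions, model_is_correct) -> float:
    """model_is_correct is a stand-in for however you actually score the model."""
    if not questions:
        return float("nan")
    return sum(model_is_correct(q) for q in questions) / len(questions)
```

The gap between accuracy(leaked, ...) and accuracy(clean, ...) is then the contamination signal the paper is describing.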