
2025-06-14 08:20:43
Grenfell survivor urges government to speed up firms’ contract ban ahead of eighth anniversary | Morning Star
https://morningstaronline.co.uk/article/grenfell-survivor-urges-government-speed-firms-contract…
AI, AGI, and learning efficiency
My 4-month-old kid is not DDoSing Wikipedia right now, nor will they ever do so before learning to speak, read, or write. Their entire "training corpus" will not top even 100 million "tokens" before they can speak & understand language, and do so with real intentionality.
Just to emphasize that point: 100 words-per-minute times 60 minutes-per-hour times 12 hours-per-day times 365 days-per-year times 4 years is a mere 105,120,000 words. That's a ludicrously *high* estimate of both words-per-minute and hours-per-day, and 4 years old (the age of my other kid) is well after basic speech develops in many children. More likely, the available "training data" is at least 1 or 2 orders of magnitude smaller than this.
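For anyone who wants to poke at the numbers, here is the same back-of-envelope estimate as a few lines of Python. Every input is just the deliberately generous assumption from the paragraph above, not a measurement:

    # Generous upper bound on a child's language exposure by age 4.
    words_per_minute = 100   # ludicrously high sustained rate of heard speech
    hours_per_day = 12       # every waking hour filled with speech
    days_per_year = 365
    years = 4

    upper_bound = words_per_minute * 60 * hours_per_day * days_per_year * years
    print(f"{upper_bound:,} words")  # 105,120,000

    # The "1 or 2 orders of magnitude" discount mentioned above:
    print(f"{upper_bound // 10:,} down to {upper_bound // 100:,} words")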
The point here is that large language models, trained as they are on hundreds of billions or even *trillions* of tokens, are not developing their behavioral capabilities in a way that's remotely similar to humans, even if you believe those capabilities are similar (they are by certain very biased ways of measurement; they very much aren't by others). The idea that humans must be naturally good at acquiring language is an old one (see e.g. …). #AI #LLM #AGI
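To make that gap concrete, a rough ratio, taking 10^13 tokens as an assumed ballpark for a current frontier model's training corpus (an illustrative assumption; exact counts vary by model and are often unpublished):

    # Rough ratio of LLM training tokens to a child's word exposure.
    llm_tokens = 1e13      # assumed ballpark for a frontier model, not a published figure
    child_words = 1.05e8   # the generous upper bound computed above

    print(f"~{llm_tokens / child_words:,.0f}x more training data")  # ~95,238x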
Combating Reentrancy Bugs on Sharded Blockchains
Roman Kashitsyn, Robin Künzler, Ognjen Marić
https://arxiv.org/abs/2506.05932
Low-temperature anomalies in 1D hard rods with soft attractive nearest-neighbor interactions
Igor Travěnec, Ladislav Šamaj
https://arxiv.org/abs/2507.01568