Tootfinder

Opt-in global Mastodon full text search. Join the index!

@UrbanNature@nerdculture.de
2026-04-15 19:19:15

Tip: On 23 April 2026, from 4 to 6 p.m., the Stiftung Naturschutz Berlin is offering a free online lecture on dragonflies 🧚‍♂️
Register here:
umweltkalender-berlin.de/angeb

@haayman@todon.nl
2026-03-16 05:10:23

"Daarna sprak je een zin uit die me diep raakte: “Alles waar ik in geloofde, is mislukt.”
Je vertelde over de antikernwapendemonstraties begin jaren tachtig, je lidmaatschap van vredesorganisaties in Israël/Palestina, je werk als oprichter van de sectie Vrouwenstudies aan de UvA en al je waarschuwingen voor racisme en fascisme als journalist."

@roelgrif@mstdn.social
2026-04-15 14:33:55

RIVM update of sewage values and the percentage of positive SARS-CoV-2 tests.
A somewhat contradictory picture.
On the one hand, a substantial update of the sewage values (11 new days in the data, from 2 through 12 April, based on 95%-5% of the measuring stations), in which we suddenly see yet another halving of the already record-low level we saw last week.
#qp2t

@wraithe@mastodon.social
2026-05-14 16:01:16

From a world expert on the transmission of infectious diseases, discussing hantavirus.
bsky.app/profile/did:plc:z3nm4

@aredridel@kolektiva.social
2026-04-14 14:22:42

So to follow up on this, I've caught it in action. Models, when quantized a bit, just do a bit more poorly with short contexts. Even going from f32 (as trained) to bf16 (as usually run) to q8 tends to do okay for "normal" context windows. And at q4 you start feeling like "this model is a little stupid and gets stuck sometimes" (it is! It's just that it's still mostly careening about in the space of the "plausible" most of the time. Not good guesswork, but still in the zone). With long contexts, the probability of parameters collapsing to zero is higher, so the more context, the more likely you are to see brokenness.
And then at q2 (2 bits per parameter) or q1, the model falls apart completely. Parameters collapse to zero easily. You start seeing "all work and no play makes Jack a dull boy" sorts of behavior, with intense, unscrutinized repetition, followed by a hard stop when it just quits working.
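To make the collapse-to-zero point concrete, here is a minimal numpy sketch of symmetric round-to-nearest quantization (an illustration only, not any real inference stack's code; the weight distribution is an assumption): as the bit width drops, the value grid coarsens and small weights round to exactly zero.

```python
# Minimal sketch of symmetric round-to-nearest quantization.
# Illustration only; weight scale and distribution are assumptions.
import numpy as np

def quantize(weights: np.ndarray, bits: int) -> np.ndarray:
    """Quantize to a symmetric integer grid, then map back to float."""
    qmax = 2 ** (bits - 1) - 1            # 127 for 8 bits, 7 for 4, 1 for 2
    scale = np.abs(weights).max() / qmax  # largest weight maps to qmax
    q = np.clip(np.round(weights / scale), -qmax, qmax)
    return q * scale                      # float values on the coarse grid

rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.02, size=1000)      # small, roughly normal weights
for bits in (8, 4, 2):
    zeros = int(np.sum(quantize(w, bits) == 0.0))
    print(f"q{bits}: {zeros}/1000 weights collapsed to exactly zero")
```

At q8 almost nothing lands on zero; at q2 the grid is just {-scale, 0, +scale}, so most of the distribution rounds away.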
And quantization is a knob that a model vendor can turn relatively easily (they have to regenerate the model from the base weights with more quantization, but that's a data transformation on the order of running a terabyte through a straightforward, fast process, not like training).
If you have 1,000 customers and enough equipment to handle the requests of 700, going from bf16 to q8 is a no-brainer: suddenly you can handle the load and have a little spare capacity. Customers get worse results but probably pay the same per token (or they're on a subscription that hides the cost anyway, so you're even freer to make trade-offs; there's a reason subscription products are kinda vaguely described).
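The capacity arithmetic behind that trade-off is simple enough to sketch (the pool size and model size below are hypothetical, picked only for illustration):

```python
# Back-of-the-envelope: halving bytes per parameter roughly doubles
# how many replicas fit in a fixed memory pool. Numbers are made up.
POOL_GB = 640    # assumed total accelerator memory across the fleet
PARAMS_B = 70    # assumed 70B-parameter model

for name, bytes_per_param in (("bf16", 2.0), ("q8", 1.0), ("q4", 0.5)):
    model_gb = PARAMS_B * bytes_per_param   # weights only, ignoring KV cache
    replicas = int(POOL_GB // model_gb)
    print(f"{name}: {model_gb:.0f} GB per replica -> {replicas} replicas")
```

With these made-up numbers, bf16 fits 4 replicas, q8 fits 9, and q4 fits 18: the jump from "over capacity" to "comfortable" in the example above.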
It's also possible for them to vary this across the day: use the model during quieter periods? Maybe you get an instance running at bf16. Use it during a high-use period? You get a q4 model.
Or intelligent routing is possible. No idea if anyone is doing this, but if they monitor what you send a bit, and you generally reach for an expensive model even for simple requests? They could totally substitute a heavily quantized version of the model to answer the question.
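Nobody is confirmed to be doing this, but the routing idea could be sketched like so; every name and threshold here is invented for illustration:

```python
# Hypothetical quantization-aware router; not any vendor's real API.
def pick_variant(load: float, looks_simple: bool) -> str:
    """Pick a quantization level from fleet load (0..1) and a guess
    at request difficulty. Thresholds are arbitrary assumptions."""
    if looks_simple:
        return "q4"    # quietly serve "easy" requests on the cheap model
    if load > 0.9:
        return "q8"    # shed memory/compute pressure at peak load
    return "bf16"      # full precision when capacity allows

print(pick_variant(load=0.30, looks_simple=False))   # -> bf16
print(pick_variant(load=0.95, looks_simple=False))   # -> q8
```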
There are *so many tricks* that can be pulled here. Some of them are perfectly reasonable, some tread into outright misleading or fraudulent territory, and it's weirdly hard to draw the line between them.

@paul@social.van.buu.re
2026-04-16 08:46:35

Folks, I'm looking for a few testers willing to take a quick look at a Mastodon share button on a website.
I'd like to watch over a video call while you use the button.
Send me a private message if you have the time and inclination, please.

@zydecopaws@pnw.zone
2026-05-15 01:52:01

@… my dad couldn't cook indoors to save his life, but I have to admit he did a decent job on the grill. My mother was an awful cook: bland food, and rarely any fresh vegetables (most of them came from cans).
I had to take over cooking duties at the ripe age of 14, escaped the daily grind by going to work at a McDonald's at 16, and as a result since learned t…

@Don_kun@nerdculture.de
2026-04-14 15:55:21

Requesting name lists of art jury members: Weimer is now making himself unpopular in the visual arts as well.
Does he have a bet running that he can make an enemy of every cultural scene?
tagesspiegel.de/politi…

@Techmeme@techhub.social
2026-04-08 15:10:56

Alibaba and China Telecom launch a data center in southern China that is powered by 10,000 of Alibaba's Zhenwu chips designed for AI training and inferencing (Arjun Kharpal/CNBC)
cnbc.com/2026/04/08/china-alib

@wraithe@mastodon.social
2026-05-13 13:00:54

I know I posted about this a couple of months ago, but I never did follow up: by God, it does work!
You have to apply a little force so that the center breaks free, but it works.
All these years, and learning new things…😂
bsky.app/profile/did:plc:5iw4w