/3
By the way, I wonder whether a government, when spending abroad, also takes into account that spending the same amount here versus there makes a big net difference.
A sizable share of domestic spending flows back in the taxes and social contributions that employers and employees pay.
For foreign spending, to a small or large degree, it's a case of 'gone = gone'.
#DigitaleSoevereiniteit #DigiD
Do you want to see the AI slop over which the Zürcher Bauernverband will probably soon be getting an angry letter from Nintendo? 🤦‍♀️⚖️
#Bauernlobby
The #Stadtwerke #Flensburg are pushing ahead with the construction of a #Großwärmepumpe (large-scale heat pump).
From 2027 it is supposed to draw heat from the
1️⃣ I suspect that Hatch's revived BSG would have been a terrible, fanservice kind of revival, and I suspect the same of Fillion's animated attempt. It would be a shadow of a deeply flawed show, and Firefly is missing (at least one) irreplaceable character, depending on the timeline (RIP Ron Glass).
The family members have been taking turns being sick. Headaches, fatigue, and a general feeling of unwellness, but hardly any fever or outwardly visible signs of a cold. Now that my head aches a little, my wife is already convinced that the bug is coming for me too.
So to follow up on this, I've caught it in action. Models, when quantized a bit, just do a bit more poorly with short contexts. Even going from f32 (as trained) to bf16 (as usually run) to q8 tends to do okay for "normal" context windows. And at q4 you start feeling like "this model is a little stupid and gets stuck sometimes" (it is! It's just that it's still mostly careening about in the space of "plausible" most of the time. Not good guesswork, but still in the zone). With long contexts, the probability of parameters collapsing to zero is higher, so the more context, the more likely you are to see brokenness.
And then at Q2 (2 bits per parameter) or Q1, the model falls apart completely. Parameters collapse to zero easily. You start seeing "all work and no play makes Jack a dull boy" sorts of behavior, with intense and unscrutinized repetition, followed by a hard stop when it just stops working.
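The collapse-to-zero effect above is easy to see in a toy sketch. This is my own minimal illustration, not anything from a real inference stack: symmetric uniform quantization of a weight vector, where one outlier weight sets the scale and ever-lower bit widths round more and more of the small weights to exactly zero.

```python
import numpy as np

def quantize(weights, bits):
    """Symmetric uniform quantization: snap floats onto 2**(bits-1)-1 levels
    per sign, then map back to floats. Weights smaller than half a step
    round to exactly zero."""
    levels = 2 ** (bits - 1) - 1            # 127 for 8-bit, 7 for 4-bit, 1 for 2-bit
    scale = np.max(np.abs(weights)) / levels
    return np.round(weights / scale) * scale

rng = np.random.default_rng(0)
w = rng.normal(0, 0.02, size=10_000)        # lots of small-magnitude weights
w[0] = 1.0                                  # one outlier sets the scale

for bits in (8, 4, 2):
    q = quantize(w, bits)
    print(f"q{bits}: {np.mean(q == 0.0):.0%} of weights collapsed to zero")
```

At 2 bits, nearly everything except the outlier lands on zero, which is the "falls apart completely" regime described above.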
And quantization is a knob that a model vendor can turn relatively easily. (They have to regenerate the model from the base with more quantization, but it's a data transformation on the order of running a terabyte through a straightforward and fast process, not like training.)
If you have 1000 customers and enough equipment to handle the requests of 700, going from bf16 to q8 is a no-brainer. Suddenly you can handle the load and have a little spare capacity. Customers get worse results but probably pay the same per token (or they're on a subscription that hides the cost anyway, so you're even freer to make trade-offs. There's a reason subscription products are kinda vaguely described.)
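The back-of-envelope here, under my own simplifying assumption that serving capacity scales inversely with bytes per parameter (the post only gives the 700-of-1000 numbers):

```python
# Hypothetical capacity math: halving bytes per parameter roughly
# doubles how many customers the same hardware can serve.
capacity_bf16 = 700
bytes_per_param = {"bf16": 2, "q8": 1, "q4": 0.5}

for fmt, b in bytes_per_param.items():
    capacity = capacity_bf16 * bytes_per_param["bf16"] / b
    print(f"{fmt}: ~{capacity:.0f} customers")  # bf16: ~700, q8: ~1400, q4: ~2800
```

So q8 already clears the 1000-customer load with headroom, which is the "no-brainer" above.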
It's also possible for them to vary this across the day. Use the model during quieter periods? Maybe you get an instance running at bf16. Use it during a high-demand period? You get a Q4 model.
Or intelligent routing is possible. No idea if anyone is doing this, but if they monitor what you send a bit, and you generally reach for an expensive model for simple requests? They could totally substitute a highly quantized version of the model to answer the question.
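As noted, nobody is confirmed to be doing this; here's a purely hypothetical sketch of what such routing could look like. The model names and the crude length heuristic are all made up for illustration.

```python
def looks_simple(prompt: str) -> bool:
    # Crude stand-in heuristic: short prompts with no code fences
    # count as "simple". A real router would use something smarter.
    return len(prompt) < 200 and "```" not in prompt

def route(prompt: str, requested_model: str) -> str:
    # Hypothetical routing table: map an expensive model to a cheaper
    # quantized variant when the request looks easy.
    quantized_variant = {"big-model-bf16": "big-model-q4"}
    if looks_simple(prompt) and requested_model in quantized_variant:
        return quantized_variant[requested_model]
    return requested_model

print(route("What's 2+2?", "big-model-bf16"))  # → big-model-q4
```

The point isn't this particular heuristic; it's that the substitution is invisible to the customer, which is what makes the line between optimization and misleading behavior hard to draw.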
There are •so many tricks• that can be pulled here. Some of them are perfectly reasonable, some tread into outright misleading or fraudulent territory, and it's weirdly hard to draw the line between them.
POL-HK: Bad Fallingbostel: driving with a blood alcohol level of 1.72 per mille; Munster: car tires stolen; Soltau: child slightly injured in accident. Heidekreis (ots) - 12.03.2026 / Driving with 1.72 per mille, Bad Fallingbostel: officers of the Polizeiinspektion Heidekreis stopped a 66-year-old motorist in Bad Fallingbostel in the night leading into Thursday who had previously, owing to his ...
I kind of want to make a bumper sticker that says "Born to ride a bike, Forced to drive a car" because with the weather starting to improve I am feeling restless and also helpless.
I may get the basement trainer bike set up so I can just try some real gentle riding when I am feeling well enough.
#bikeTooter
But it also looks like Hungary has hope. Any cracks in that kind of concentration of power are good news. Big news.
Difficult days and difficult years ahead, yes — but days with promise, days following on what’s shaping up to be an incredible victory.
Rebecca Solnit says “Hope is not happiness or confidence or inner peace; it’s a commitment to search for possibilities.” So: here’s hoping!
2/
Started watching Daredevil Season 2.
If you think you recognize Mr. Charles from somewhere, just imagine him saying "I kinda feel like God."
#daredevilbornagain #cerealkiller