Tootfinder

Opt-in global Mastodon full text search. Join the index!

No exact results. Similar results found.
@hex@kolektiva.social
2026-03-16 10:22:41
Content warning: gun violence, nazi shit

On the first day of the #PTSD intensive, we talked about the shooting. I had felt like I was done with that, that it didn't have anything left for me. But there was something still that filled me with rage... that is still confusing and enraging.
It wasn't actually being shot. It wasn't even the possibility of death. I had been prepared to die. I always knew that was possible. It was something else.
I remember Marc Hokoana's face as he pepper sprayed pacifists, smiling and taunting, joyfully hurting people who he knew were refusing to respond. I remember their flags, the kek flag, literally a Nazi battle flag redone in 4chan colors, with the clover 4chan logo in place of the swastika. How many people have been tortured, have died? How much suffering, that these people not only welcomed but celebrated, joyfully participated in.
The cruelty was the point. It was the plan, the plan he posted to Facebook, the same plan as they have always had, of torturing people until someone responds and then murdering them. Inflicting trauma, responding with overwhelming force, showing how "big and strong" they are because they can always escalate.
Try to stop someone from pepper spraying people, they shoot you. Shoot back, like Michael Reinoehl, and they send a death squad for you. But we keep standing up, so they keep escalating to the slightest imagined infraction. Now they just murder you for being in a car, for filming at a protest, for existing.
The bar for what justifies murder or torture will continue to move lower until there is no one left, or until they can no longer escalate.
The feeling of helplessness is still not the biggest thing though. It's the joy with which they inflict this on us. That's it. That's the thing.
CW: gun violence, abuse dynamics
hexmhell.writeas.com/the-creat

@mk_rexx@metalhead.club
2026-02-16 11:48:05
Content warning: nothing

i felt down today for no particular reason and I've been trying to go through it. but whatever this is makes it difficult. i want to address it peacefully but i keep feeling ignored, dismissed and voiceless

@LaChasseuse@mastodon.scot
2026-04-16 19:23:42

Gee, Bluesky devs ... am I, the user, really supposed to be seeing stuff like this? Am I somehow failing to vibe along with you? #FailWhale


Failed to parse response body: SyntaxError: Unexpected end of JSON input
@cheryanne@aus.social
2026-04-16 08:32:49

Grand-dog has gone home after an 8 day sojourn and I have a 24 hour window until the wee one arrives tomorrow.
Side note: The house is deliciously quiet, no one followed me to the toilet, and I ate chocolate without feeling guilty.

@wraithe@mastodon.social
2026-03-15 20:37:46

1️⃣ I suspect that Hatch's revived BSG would have been a terrible fanservice kind of revival, and I suspect the same of Fillion's animated attempt. It would be a shadow of a deeply flawed show, and Firefly is missing (at least one) irreplaceable character, depending on timeline (RIP Ron Glass)

@grumpybozo@toad.social
2026-02-14 22:04:19

I haven’t flunked Valentine’s Day since 1996. I flunked the previous 14.
It is my first wife’s birthday. I failed to read some signals…

@cosmos4u@scicomm.xyz
2026-02-14 01:56:17

While preparing #ArtemisII for flight, NASA engineers are reviewing data after a confidence test Feb. 12, in which operators partially filled the SLS (Space Launch System) core stage liquid hydrogen tank to assess newly replaced seals in an area used to fill the rocket with propellant: nasa.gov/blogs/missions/2026/0 - during the test, "teams encountered an issue with ground support equipment that reduced the flow of liquid hydrogen into the rocket. [...] Engineers will examine findings before setting a timeline for the next test, a second wet dress rehearsal this month. March remains the earliest potential launch window for Artemis II."

@aredridel@kolektiva.social
2026-04-14 14:22:42

So to follow up on this, I've caught it in action. Models, when quantized a bit, just do a bit more poorly with short contexts. Even going from f32 (as trained) to bf16 (as usually run) to q8 tends to do okay for "normal" context windows. At q4 you start feeling like "this model is a little stupid and gets stuck sometimes" (it is! It's just that it's still mostly careening about in the space of "plausible" most of the time. Not good guesswork, but still in the zone). With long contexts, the probability of parameters collapsing to zero is higher, so the more context, the more likely you are to see brokenness.
And then at q2 (2 bits per parameter) or q1, the model falls apart completely. Parameters collapse to zero easily. You start seeing "all work and no play makes Jack a dull boy" sorts of behavior, with intense and unscrutinized repetition, followed by a hard stop when it just stops working.
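That collapse is easy to demonstrate with a toy uniform quantizer. This is a sketch only: real quantizers use per-block scales and smarter rounding, and `quantize` here is my own illustrative helper, not anything from an actual inference stack.

```python
import numpy as np

def quantize(weights, bits):
    # Symmetric uniform quantization: snap weights onto 2^(bits-1)-1
    # positive integer levels, then map back to floats. Anything smaller
    # than half a quantization step rounds to exactly zero.
    levels = 2 ** (bits - 1) - 1          # 127 for 8-bit, 7 for 4-bit, 1 for 2-bit
    scale = np.abs(weights).max() / levels
    return np.round(weights / scale) * scale

rng = np.random.default_rng(0)
w = rng.normal(0, 0.02, size=10_000)      # typical small-magnitude weights
w[0] = 1.0                                # one outlier stretches the scale

for bits in (8, 4, 2):
    q = quantize(w, bits)
    print(f"q{bits}: {np.mean(q == 0):.0%} of weights collapsed to zero")
```

Run it and the zero fraction climbs sharply as the bit width drops; at 2 bits essentially everything except the outlier is gone, which is the "falls apart completely" regime described above.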
And quantization is a knob that a model vendor can turn relatively easily (they have to regenerate the model from the base with more quantization, but it's a data transformation on the order of running a terabyte through a straightforward and fast process, not like training).
If you have 1000 customers and enough equipment to handle the requests of 700, going from bf16 to q8 is a no-brainer. Suddenly you can handle the load and have a little spare capacity. Customers get worse results but probably pay the same per token (or they're on a subscription that hides the cost anyway, so you are even freer to make trade-offs; there's a reason that subscription products are kinda poorly described).
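The back-of-envelope memory math behind that trade-off looks something like this (my own illustrative numbers: a hypothetical 70B-parameter model, ignoring KV cache, activations, and format overhead):

```python
# Rough weight-memory footprint per quantization level for a hypothetical
# 70B-parameter model. Real deployments also spend memory on KV cache
# and activations, which this sketch ignores.
PARAMS = 70e9
BYTES_PER_PARAM = {"f32": 4.0, "bf16": 2.0, "q8": 1.0, "q4": 0.5, "q2": 0.25}

def weight_gb(fmt: str) -> float:
    return PARAMS * BYTES_PER_PARAM[fmt] / 1e9

for fmt in BYTES_PER_PARAM:
    print(f"{fmt:>4}: ~{weight_gb(fmt):.0f} GB of weights")
```

Halving the bytes per parameter roughly doubles how many model instances fit on the same hardware, which is exactly the serve-1000-customers-on-700-customers'-worth-of-equipment move described above.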
It's also possible for them to vary this across a day. Use the model during quieter periods? Maybe you get an instance running at bf16. Use it during a high-use period? You get a q4 model.
Or intelligent routing is possible. No idea if anyone is doing this, but if they monitor what you send a bit, and you generally shoot for an expensive model for simple requests? They could totally substitute a highly quantized version of the model to answer the question.
There are •so many tricks• that can be pulled here. Some of them are very reasonable, some tread into outright misleading or fraudulent territory, and it's weirdly hard to draw the line between them.

@UP8@mastodon.social
2026-02-13 15:37:02

⚠️ AI can generate a feeling of intimacy that exceeds human connections
I discovered 𝐭𝐡𝐢𝐬 for myself in my ill-fated research project of 2021!
techxplore.com/news/2026-01-ai

@AimeeMaroux@mastodon.social
2026-03-08 16:55:29
Content warning:

I've argued with my flat mate about whether or not Disney's Mulan from 1998 (the animated version) is a feminist film. I know that when I first watched it in theatres, it felt incredibly empowering, and on my last rewatch a few years ago, I still felt the same way. More on that below if you are interested.
What are some feminist films that you enjoy and what do you enjoy about them?

A scene from Disney's Mulan (1998) in which Mulan uses a fan to steal Hun leader Shan Yu's sword from him.