So there is a delivery company in Norway called Helthjem, which means something like "all the way home" or "completely home." The idea being that so many delivery companies here fail to actually deliver to your home and make you pick up from a drop-off point.
They just sent me this text:
Hi! The package from GPN cannot be delivered to your home. It is being sent to Linderud Flower shop. You will be notified when it can be picked up. Tracking number: [redacted] Regards, Helthjem
Series B, Episode 10 - Voice from the Past
VOICE: [V.O.] Supreme Commander?
SERVALAN: Yes?
VOICE: [V.O.] Governor Le Grand of Outer Gaul for you.
SERVALAN: Le Grand. Well, well, well. [Le Grand appears on the communicator screen]
SERVALAN: Governor.
https://blake.torpidity.net/m/210/379
After #Trump finally crashes and burns (I'm still saying I don't think he makes it to the midterms, and I think it's more than possible he won't make it to the end of the year), we'll hear a lot of people say, "the system worked!" Today people are already talking about "saving democracy" by fighting back. This will become a big rallying cry to vote (for Democrats, specifically), and the complete failure of the system will be held up as the best evidence for even greater investment in it.
I just want to point out that American democracy gave nuclear weapons to a pedophile, who, before being elected, was already a well known sexual predator, and who made the campaign promise to commit genocide. He then proceeded to commit genocide. And like, I don't care that he's "only" kidnapped and disappeared a few thousand brown people. That's still genocide. Even if you don't kill every member of a targeted group, any attempt to do so is still "committing genocide." Trump said he would commit genocide, then he hired all the "let's go do a race war" guys he could find and *paid* them to go do a race war. And, even now as this deranged monster is crashing out, he is still authorized to use the world's largest nuclear arsenal.
He committed genocide during his first term when his administration separated migrant parents and children, then adopted those children out to other parents. That's technically genocide. The point was to destroy the very people he'd been sending right-wing terror squads after.
There was a peaceful handover of power to a known Russian asset *twice*, and the second time he'd already committed *at least one* act of genocide *and* destroyed cultural heritage sites (oh yeah, in case you forgot, he also destroyed indigenous grave sites during his first term).
All of this was allowed because the system is set up to protect exactly these types of people, because *exactly* these types of people are *the entire power structure*.
Going back to that system means going back to exactly the system that gave nuclear weapons to a pedophile *TWICE*.
I'm already seeing the attempts to pull people back, the congratulations as we enter the final phase, the belief that getting Trump out will let us all get back to normal. Normal. The normal that led here in the first place. I can already see the brunch reservations being made. When Trump is over, we will be told we won. We will be told that it's time to go back to sleep.
When they tell you everything worked, everything is better, that we can stop because we won, tell them "fuck you! Never again means never again." Destroy every system that ever gave these people power, that ever protected them from consequences, that ever let them hide what they were doing.
These Democrats funded a genocide abroad and laid the groundwork for genocide at home. They protected these predators, for years. The whole power structure is guilty. As these files implicate so many powerful people, they're trying to shove everything back in the box. After all the suffering, after we've finally made it clear that we are the ones with the power, only now are they willing to sacrifice Trump to calm us all down.
No, that's a good start but it can't be the end.
Winning can't be enough to quench that rage. Keep it burning. When this is over, let victory fan that anger until every institution that made this possible lies in ashes. Burn it all down and salt the earth. Taking down Trump is a great start, but don't stop until this isn't possible again.
#USPol
Replaced article(s) found for cs.LG. https://arxiv.org/list/cs.LG/new
[4/5]:
- Sample, Don't Search: Rethinking Test-Time Alignment for Language Models
Gonçalo Faria, Noah A. Smith
https://arxiv.org/abs/2504.03790 https://mastoxiv.page/@arXiv_csCL_bot/114301112970577326
- A Survey on Archetypal Analysis
Aleix Alcacer, Irene Epifanio, Sebastian Mair, Morten Mørup
https://arxiv.org/abs/2504.12392 https://mastoxiv.page/@arXiv_statME_bot/114357826909813483
- The Stochastic Occupation Kernel (SOCK) Method for Learning Stochastic Differential Equations
Michael L. Wells, Kamel Lahouel, Bruno Jedynak
https://arxiv.org/abs/2505.11622 https://mastoxiv.page/@arXiv_statML_bot/114539065460187982
- BOLT: Block-Orthonormal Lanczos for Trace estimation of matrix functions
Kingsley Yeon, Promit Ghosal, Mihai Anitescu
https://arxiv.org/abs/2505.12289 https://mastoxiv.page/@arXiv_mathNA_bot/114539035462135281
- Clustering and Pruning in Causal Data Fusion
Otto Tabell, Santtu Tikka, Juha Karvanen
https://arxiv.org/abs/2505.15215 https://mastoxiv.page/@arXiv_statML_bot/114550346291754635
- On the performance of multi-fidelity and reduced-dimensional neural emulators for inference of ph...
Chloe H. Choi, Andrea Zanoni, Daniele E. Schiavazzi, Alison L. Marsden
https://arxiv.org/abs/2506.11683 https://mastoxiv.page/@arXiv_statML_bot/114692410563481289
- Beyond Force Metrics: Pre-Training MLFFs for Stable MD Simulations
Maheshwari, Tang, Ock, Kolluru, Farimani, Kitchin
https://arxiv.org/abs/2506.14850 https://mastoxiv.page/@arXiv_physicschemph_bot/114709402590755731
- Quantifying Uncertainty in the Presence of Distribution Shifts
Yuli Slavutsky, David M. Blei
https://arxiv.org/abs/2506.18283 https://mastoxiv.page/@arXiv_statML_bot/114738165218533987
- ZKPROV: A Zero-Knowledge Approach to Dataset Provenance for Large Language Models
Mina Namazi, Alexander Nemecek, Erman Ayday
https://arxiv.org/abs/2506.20915 https://mastoxiv.page/@arXiv_csCR_bot/114754394485208892
- SpecCLIP: Aligning and Translating Spectroscopic Measurements for Stars
Zhao, Huang, Xue, Kong, Liu, Tang, Beers, Ting, Luo
https://arxiv.org/abs/2507.01939 https://mastoxiv.page/@arXiv_astrophIM_bot/114788369702591337
- Towards Facilitated Fairness Assessment of AI-based Skin Lesion Classifiers Through GenAI-based I...
Ko Watanabe, Stanislav Frolov, Aya Hassan, David Dembinsky, Adriano Lucieri, Andreas Dengel
https://arxiv.org/abs/2507.17860 https://mastoxiv.page/@arXiv_csCV_bot/114912976717523345
- PASS: Probabilistic Agentic Supernet Sampling for Interpretable and Adaptive Chest X-Ray Reasoning
Yushi Feng, Junye Du, Yingying Hong, Qifan Wang, Lequan Yu
https://arxiv.org/abs/2508.10501 https://mastoxiv.page/@arXiv_csAI_bot/115032101532614110
- Unified Acoustic Representations for Screening Neurological and Respiratory Pathologies from Voice
Ran Piao, Yuan Lu, Hareld Kemps, Tong Xia, Aaqib Saeed
https://arxiv.org/abs/2508.20717 https://mastoxiv.page/@arXiv_csSD_bot/115111255835875066
- Machine Learning-Driven Predictive Resource Management in Complex Science Workflows
Tasnuva Chowdhury, et al.
https://arxiv.org/abs/2509.11512 https://mastoxiv.page/@arXiv_csDC_bot/115213444524490263
- MatchFixAgent: Language-Agnostic Autonomous Repository-Level Code Translation Validation and Repair
Ali Reza Ibrahimzada, Brandon Paulsen, Reyhaneh Jabbarvand, Joey Dodds, Daniel Kroening
https://arxiv.org/abs/2509.16187 https://mastoxiv.page/@arXiv_csSE_bot/115247172280557686
- Automated Machine Learning Pipeline: Large Language Models-Assisted Automated Dataset Generation ...
Adam Lahouari, Jutta Rogal, Mark E. Tuckerman
https://arxiv.org/abs/2509.21647 https://mastoxiv.page/@arXiv_condmatmtrlsci_bot/115286737423175311
- Quantifying the Impact of Structured Output Format on Large Language Models through Causal Inference
Han Yuan, Yue Zhao, Li Zhang, Wuqiong Luo, Zheng Ma
https://arxiv.org/abs/2509.21791 https://mastoxiv.page/@arXiv_csCL_bot/115287166674809413
- The Generation Phases of Flow Matching: a Denoising Perspective
Anne Gagneux, Ségolène Martin, Rémi Gribonval, Mathurin Massias
https://arxiv.org/abs/2510.24830 https://mastoxiv.page/@arXiv_csCV_bot/115462527449411627
- Data-driven uncertainty-aware seakeeping prediction of the Delft 372 catamaran using ensemble Han...
Giorgio Palma, Andrea Serani, Matteo Diez
https://arxiv.org/abs/2511.04461 https://mastoxiv.page/@arXiv_eessSY_bot/115507785247809767
- Generalized infinite dimensional Alpha-Procrustes based geometries
Salvish Goomanee, Andi Han, Pratik Jawanpuria, Bamdev Mishra
https://arxiv.org/abs/2511.09801 https://mastoxiv.page/@arXiv_statML_bot/115547135711272091
toXiv_bot_toot
Series B, Episode 08 - Hostage
JOBAN: Supreme commander.
SERVALAN: An honour, Counsellor.
JOBAN: Thank you. [Sits down] Aren't you going to offer me a drink?
SERVALAN: Yes, of course. [Pours drink for Joban]
JOBAN: Not joining me?
https://blake.torpidity.net/m/208/199…
Dear ÖRR,
Cutting fees is not saving money, except at the expense of quality. The fees were negotiated to be adequate, so that authors can live on them. If they are cut, that will no longer be possible in the long run.
What would be the consequence? Authors quit, or take on additional jobs, for example in better-paying PR, to cross-subsidize their work for the ÖRR.
You always get what you pay for.
Series A, Episode 12 - Deliverance
TRAVIS: You sent for me?
SERVALAN: You've lost some of your fire, Travis. Whatever happened to your pride?
TRAVIS: My pride, Supreme Commander?
https://blake.torpidity.net/m/112/227 B7B6
"I need a hard drive for that computer."
Out of stock
"Wow that game looks cool"
Out of stock
"That handheld gaming system is neat, oh and it's priced right."
Out of stock, you idiot. Should have preordered 13 years ago.
"That niche board game looks like something I want to play"
Too bad loser, out of stock. You can buy it from the scalpers at 4x the markup!
"Maybe a pokemon booster pack will cheer me up…