During five months of covering ICE raids, LA Public Press added legal training for staff, digital security protocols, threat assessment policies, and more (Michelle Zenarosa/Poynter)
https://www.poynter.org/business-work/2025/la-public-press-…
JPMorgan Chase secures deals with fintech middlemen, like Plaid, covering 95% of third-party data pulls, and will receive payment for access to customer data (Hugh Son/CNBC)
https://www.cnbc.com/2025/11/14/jpmorgan-chase-fintech-fees.html
from my link log —
A unique performance optimization for a 3D geometry language.
https://cprimozic.net/notes/posts/persistent-expr-memo-optimization-for-geoscript/
saved 2026-01-11
Series A, Episode 13 - Orac
CALLY: We should keep on moving, they could be right behind us.
BLAKE: Yes, without weapons we don't stand a chance. Look, you keep going. I'm going to stay here and try and bring the roof down - block them off.
https://blake.torpidity.net/m/113/347
Sometimes hacking a computer means a whole different thing...
In this case, I'm actually hacking the case apart (with a hacksaw) to add a custom-made I/O shield so I can fit an older motherboard that wasn't designed for this case.
And thank you to Gateway for making me do this by not using a removable shield: they stamped every hole they needed and covered the ones this model didn't use with a sticker. 🤦‍♂️
Series A, Episode 09 - Project Avalon
AVALON: I won't help you. You can't force me to help you.
TRAVIS: Don't be naive. I can force you to do anything. It isn't necessary, though. You're already helping me. Just by being here you've set in motion a chain of events that's been absolutely predetermined.
https://
Spatially-informed transformers: Injecting geostatistical covariance biases into self-attention for spatio-temporal forecasting
Yuri Calleo
https://arxiv.org/abs/2512.17696 https://arxiv.org/pdf/2512.17696 https://arxiv.org/html/2512.17696
arXiv:2512.17696v1 Announce Type: new
Abstract: The modeling of high-dimensional spatio-temporal processes presents a fundamental dichotomy between the probabilistic rigor of classical geostatistics and the flexible, high-capacity representations of deep learning. While Gaussian processes offer theoretical consistency and exact uncertainty quantification, their prohibitive computational scaling renders them impractical for massive sensor networks. Conversely, modern transformer architectures excel at sequence modeling but inherently lack a geometric inductive bias, treating spatial sensors as permutation-invariant tokens without a native understanding of distance. In this work, we propose a spatially-informed transformer, a hybrid architecture that injects a geostatistical inductive bias directly into the self-attention mechanism via a learnable covariance kernel. By formally decomposing the attention structure into a stationary physical prior and a non-stationary data-driven residual, we impose a soft topological constraint that favors spatially proximal interactions while retaining the capacity to model complex dynamics. We demonstrate the phenomenon of "Deep Variography", where the network successfully recovers the true spatial decay parameters of the underlying process end-to-end via backpropagation. Extensive experiments on synthetic Gaussian random fields and real-world traffic benchmarks confirm that our method outperforms state-of-the-art graph neural networks. Furthermore, rigorous statistical validation confirms that the proposed method delivers not only superior predictive accuracy but also well-calibrated probabilistic forecasts, effectively bridging the gap between physics-aware modeling and data-driven learning.
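The core idea in the abstract, biasing self-attention toward spatially proximal sensors via a stationary covariance kernel, can be sketched in a few lines. This is a minimal toy illustration, not the paper's implementation: it assumes an exponential kernel k(d) = exp(-d/rho) with a fixed rho (the paper learns the decay parameters end-to-end) and applies it as an additive bias on the attention logits.

```python
import numpy as np

def covariance_biased_attention(X, coords, rho=1.0):
    """Toy single-head self-attention with a geostatistical prior.

    Hypothetical sketch: attention logits receive an additive bias
    log k(d_ij) from a stationary exponential covariance kernel
    k(d) = exp(-d / rho), so nearby sensors attend more strongly.
    X: (n, d_model) token features; coords: (n, 2) sensor locations.
    """
    n, d_model = X.shape
    # Pairwise Euclidean distances between sensor coordinates.
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1))
    # Stationary exponential kernel: the "physical prior".
    kernel = np.exp(-dist / rho)
    # Standard scaled dot-product logits (queries = keys = X here,
    # omitting the learned projections for brevity).
    logits = X @ X.T / np.sqrt(d_model)
    # Additive bias in log-space is equivalent to multiplying the
    # attention weights by the kernel before renormalising.
    logits = logits + np.log(kernel + 1e-12)
    # Row-wise softmax.
    w = np.exp(logits - logits.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    return w @ X  # attended values
```

The data-driven dot-product term still lets the model weight distant sensors when the signal demands it, which is the "soft topological constraint" the abstract describes.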
from my link log —
USB in a nutshell: making sense of the USB standard.
https://www.beyondlogic.org/usbnutshell/usb1.shtml
saved 2025-11-23