Which version best conveys the message "ISO 8601. Every other date/time format is inferior"?
Based on different versions of the original logo, I made multiple attempts; one of them will become a sticker. White on red, blue on white, or white on blue?
#iso8601 #iso8601ultras
It makes sense that I posted my favorite 20 LPs of 2025 on Friday, then over the weekend found another great death metal and punk album from this year, and now today, just the Monday of the following week, I stumble across a TROVE of great rap records that also just came out. Yeah, that tracks. 😂
Well, at least now I follow this label and am aware of these bands, etc. (And they're not a secret or anything, I'm gonna post about em here but I'm still going through em all.) Unbelieva…
Michail Chkhikvishvili, a self-described cult leader who called himself “Commander Butcher”, did not look like a Hollywood vision of a contemporary terrorist, despite the bizarre, almost made-for-TV extremist actions he planned, such as having people dressed as Santa Claus hand out poisoned candies on the streets of New York. Chkhikvishvili appeared in a Brooklyn court last week looking like an office IT tech: close-cropped hair and black-rimmed glasses…
Finsler structure of the Apollonian weak metric on the unit disc
Alok Kumar Pandey, Ashok Kumar, Bankteshwar Tiwari
https://arxiv.org/abs/2509.18621
Regularized Random Fourier Features and Finite Element Reconstruction for Operator Learning in Sobolev Space
Xinyue Yu, Hayden Schaeffer
https://arxiv.org/abs/2512.17884 https://arxiv.org/pdf/2512.17884 https://arxiv.org/html/2512.17884
arXiv:2512.17884v1 Announce Type: new
Abstract: Operator learning is a data-driven approximation of mappings between infinite-dimensional function spaces, such as the solution operators of partial differential equations. Kernel-based operator learning can offer accurate, theoretically justified approximations that require less training than standard methods. However, they can become computationally prohibitive for large training sets and can be sensitive to noise. We propose a regularized random Fourier feature (RRFF) approach, coupled with a finite element reconstruction map (RRFF-FEM), for learning operators from noisy data. The method uses random features drawn from multivariate Student's $t$ distributions, together with frequency-weighted Tikhonov regularization that suppresses high-frequency noise. We establish high-probability bounds on the extreme singular values of the associated random feature matrix and show that when the number of features $N$ scales like $m \log m$ with the number of training samples $m$, the system is well-conditioned, which yields estimation and generalization guarantees. Detailed numerical experiments on benchmark PDE problems, including advection, Burgers', Darcy flow, Helmholtz, Navier-Stokes, and structural mechanics, demonstrate that RRFF and RRFF-FEM are robust to noise and achieve improved performance with reduced training time compared to the unregularized random feature model, while maintaining competitive accuracy relative to kernel and neural operator tests.
toXiv_bot_toot
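As a rough, self-contained sketch of the regularized random Fourier feature (RRFF) idea described in the abstract above, here is a toy scalar regression example in Python rather than the paper's operator-learning / finite element setting: frequencies are drawn from a Student's t distribution, the feature count is on the order of m log m, and a Tikhonov penalty whose weights grow with frequency suppresses high-frequency noise. The degrees of freedom, weight form, and regularization strength are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Toy data: noisy samples of a smooth 1-D target -------------------------
m = 200                                   # number of training samples
x = np.linspace(0.0, 1.0, m)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(m)   # noisy observations

# --- Random Fourier features with Student's t frequencies -------------------
# N on the order of m log m; degrees of freedom are an assumed value.
N = int(m * np.log(m))
df = 3.0
omega = rng.standard_t(df, size=N)        # random frequencies
phase = rng.uniform(0.0, 2.0 * np.pi, size=N)
A = np.cos(np.outer(x, omega) + phase)    # m x N random feature matrix

# --- Frequency-weighted Tikhonov regularization -----------------------------
# Penalize high-frequency features more strongly (assumed weight form).
lam = 1e-2                                # assumed base regularization strength
w = 1.0 + np.abs(omega) ** 2              # weight grows with |omega|
# Solve (A^T A + lam * diag(w)) c = A^T y for the feature coefficients.
c = np.linalg.solve(A.T @ A + lam * np.diag(w), A.T @ y)

# --- Evaluate the learned surrogate on a test grid ---------------------------
x_test = np.linspace(0.0, 1.0, 500)
y_pred = np.cos(np.outer(x_test, omega) + phase) @ c
print("max abs error vs. clean target:",
      np.max(np.abs(y_pred - np.sin(2 * np.pi * x_test))))
```

The frequency-weighted penalty is what distinguishes this from plain ridge-regularized random features: coefficients attached to large |omega| are shrunk harder, which is how the abstract describes suppressing high-frequency noise.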
Moody Urbanity - Urban Geometry 📐
📷 Pentax MX
🎞️ Kentmere Pan 400
#filmphotography #Photography #blackandwhite
The Illusion of Readiness: Stress Testing Large Frontier Models on Multimodal Medical Benchmarks
Yu Gu, Jingjing Fu, Xiaodong Liu, Jeya Maria Jose Valanarasu, Noel Codella, Reuben Tan, Qianchu Liu, Ying Jin, Sheng Zhang, Jinyu Wang, Rui Wang, Lei Song, Guanghui Qin, Naoto Usuyama, Cliff Wong, Cheng Hao, Hohin Lee, Praneeth Sanapathi, Sarah Hilado, Bian Jiang, Javier Alvarez-Valle, Mu Wei, Jianfeng Gao, Eric Horvitz, Matt Lungren, Hoifung Poon, Paul Vozila
The design and reconstructible history of the Mayan eclipse table of the Dresden Codex: the #Maya #eclipse tables in the #DresdenCodex were based on lunar tables and adjusted for slippage over time.
I have a ten-year-old post about typefaces for dyslexia to which I’ve added two 2016 links:
• https://pmc.ncbi.nlm.nih.gov/articles/PMC5629233/
•
For more than three years, I’ve lived with long COVID.
The dizziness never leaves.
I can’t drive more than half an hour without starting to get nauseous.
Any strenuous activity — mental or physical — leaves me with post-exertional malaise that feels like a hangover the next day.
I’ve learned to adapt and find gratitude for many things, but life remains an exhausting calculus of rationing energy across work, chores, and my kids.
I am only one of over an estimated …