Tootfinder

Opt-in global Mastodon full text search. Join the index!

No exact results. Similar results found.
@inthehands@hachyderm.io
2026-01-25 15:41:44

This is a fashion trend. It’s not just functional; camo serves no disguising purpose whatsoever on Minneapolis streets. Militaries are dressing the part — and ICE is playing dress-up.
Avery Trufelman’s excellent podcast Articles of Interest spent an entire season on this topic, under the name “Gear.” Here’s one relevant episode.
2/2
articlesofinterest.substack.co

@macandi@social.heise.de
2026-01-23 11:04:00

Live webinar on Apple device management: MDM, ABM, and new features
Managing Apple devices professionally – from MDM basics and deployment models such as Declarative Device Management to current features and trends.

@Simone21@mastodon.social
2026-01-25 18:59:42

If the provocations in Minnesota work, Trump can cancel the midterms.
No elections are possible in a state of war.
#Trump #noKings #Minnesota

@fanf@mendeddrum.org
2025-12-26 12:42:01

from my link log —
Turning an old Amazon Kindle into an e-ink development platform.
blog.lidskialf.net/2021/02/08/
saved 2021-02-09

@arXiv_physicsoptics_bot@mastoxiv.page
2025-11-25 11:11:43

High-precision luminescence cryothermometry strategy by using hyperfine structure
Marina N. Popova, Mosab Diab, Boris Z. Malkin
arxiv.org/abs/2511.19088 arxiv.org/pdf/2511.19088 arxiv.org/html/2511.19088
arXiv:2511.19088v1 Announce Type: new
Abstract: A novel, to the best of our knowledge, ultralow-temperature luminescence thermometry strategy is proposed, based on a measurement of relative intensities of hyperfine components in the spectra of Ho$^{3+}$ ions doped into a crystal. A $^{7}$LiYF$_4$:Ho$^{3+}$ crystal is chosen as an example. First, we show that temperatures in the range 10-35 K can be measured using the Boltzmann behavior of the populations of crystal-field levels separated by an energy interval of 23 cm$^{-1}$. Then we select the 6089 cm$^{-1}$ line of the holmium $^5I_5 \rightarrow {}^5I_7$ transition, which has a well-resolved hyperfine structure and falls within the transparency window of optical fibers (telecommunication S band), to demonstrate the possibility of measuring temperatures below 3 K. The temperature $T$ is determined by a least-squares fit to the measured intensities of all eight hyperfine components using the dependence $I(\nu) = I_1 \exp(-b\nu)$, where $I_1$ and $b = a + \frac{hc}{kT}$ are fitting parameters and $a$ accounts for intensity variations due to mixing of wave functions of different crystal-field levels by the hyperfine interaction. In this method, the absolute and relative thermal sensitivities grow as $\frac{1}{T^2}$ and $\frac{1}{T}$, respectively, as $T$ approaches zero. We theoretically considered the intensity distributions within hyperfine manifolds and compared the results with experimental data. Application of the method to experimentally measured relative intensities of hyperfine components of the 6089 cm$^{-1}$ PL line yielded $T = 3.7 \pm 0.2$ K. For a temperature of 1 K, an order of magnitude better accuracy is expected.
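The temperature extraction described in the abstract reduces to a log-linear least-squares fit. Here is a minimal sketch (my own illustration, not the authors' code): the component offsets, intensities, and the choice $a = 0$ are assumptions, and the exponent is taken as the Boltzmann factor $b = a + hc/(k_\mathrm{B}T)$ with $\nu$ measured in cm$^{-1}$.

```python
import numpy as np

HC_OVER_K = 1.4388  # cm*K: second radiation constant hc/k_B

def fit_temperature(nu, intensity, a=0.0):
    """Fit I(nu) = I1 * exp(-b * nu) by least squares on log(I),
    then invert b = a + hc/(k*T) for the temperature T."""
    slope, _ = np.polyfit(nu, np.log(intensity), 1)
    b = -slope                   # since ln I = ln I1 - b * nu
    return HC_OVER_K / (b - a)

# Synthetic check: eight hypothetical hyperfine components at T = 3.7 K
nu = np.linspace(0.0, 2.0, 8)   # offsets within the manifold, cm^-1
b_true = HC_OVER_K / 3.7
I = 5.0 * np.exp(-b_true * nu)
print(round(fit_temperature(nu, I), 2))  # recovers 3.7
```

Note the overall intensity scale drops out (it only shifts the intercept), which is why the method needs only relative intensities.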
toXiv_bot_toot

@eichkat3r@hessen.social
2026-01-22 09:38:44

it annoys me when programs have a plugin interface but no way to use it to extend all of the program's functionality
e.g. apparently you can't add extra tools to the toolbox in GIMP, so you always have to click through menus
so it's not really worth developing a plugin for recurring tasks
…

@arXiv_csLG_bot@mastoxiv.page
2025-12-22 10:33:20

Can You Hear Me Now? A Benchmark for Long-Range Graph Propagation
Luca Miglior, Matteo Tolloso, Alessio Gravina, Davide Bacciu
arxiv.org/abs/2512.17762 arxiv.org/pdf/2512.17762 arxiv.org/html/2512.17762
arXiv:2512.17762v1 Announce Type: new
Abstract: Effectively capturing long-range interactions remains a fundamental yet unresolved challenge in graph neural network (GNN) research, critical for applications across diverse fields of science. To systematically address this, we introduce ECHO (Evaluating Communication over long HOps), a novel benchmark specifically designed to rigorously assess the capabilities of GNNs in handling very long-range graph propagation. ECHO includes three synthetic graph tasks, namely single-source shortest paths, node eccentricity, and graph diameter, each constructed over diverse and structurally challenging topologies intentionally designed to introduce significant information bottlenecks. ECHO also includes two real-world datasets, ECHO-Charge and ECHO-Energy, which define chemically grounded benchmarks for predicting atomic partial charges and molecular total energies, respectively, with reference computations obtained at the density functional theory (DFT) level. Both tasks inherently depend on capturing complex long-range molecular interactions. Our extensive benchmarking of popular GNN architectures reveals clear performance gaps, emphasizing the difficulty of true long-range propagation and highlighting design choices capable of overcoming inherent limitations. ECHO thereby sets a new standard for evaluating long-range information propagation, and provides a compelling demonstration of why it is needed in AI for science.
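For context, the three synthetic tasks named in the abstract are classical graph quantities that force information to travel many hops: a GNN with k message-passing layers can only see k hops, so a task whose answer depends on the graph diameter stresses long-range propagation by construction. A minimal BFS sketch of those quantities (my own illustration, not the benchmark's code):

```python
from collections import deque

def bfs_dists(adj, src):
    """Hop distances from src in an unweighted adjacency-list graph
    (the single-source shortest paths task)."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def eccentricity(adj, u):
    """Distance from u to the node farthest away from it."""
    return max(bfs_dists(adj, u).values())

def diameter(adj):
    """Largest eccentricity over all nodes."""
    return max(eccentricity(adj, u) for u in adj)

# A path graph 0-1-2-3-4 is an extreme information bottleneck:
# answering diameter() requires signal to cross the whole graph.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(diameter(path))  # 4
```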

@ErikUden@mastodon.de
2025-12-18 21:31:24

Just finished filming a video, who can guess what it was about? #Plushtodon ;)

A professional picture of Erik Uden sitting in front of a Cuba flag, with a Christmas tree on his right and two plushies to his left, one of which is an old Mastodon plushie. There is also a candle with many hammers and sickles drawn on top of it; the Mastodon plushie is holding a book named "Your Party: The return of the left". The other plushie is an anarchist cat plushie by YouTuber Jreg.

@fanf@mendeddrum.org
2025-12-23 12:42:01

from my link log —
A Golang malformed HTTP POST mystery.
deliveroo.engineering/2019/02/
saved 2019-03-07

@arXiv_csLG_bot@mastoxiv.page
2025-12-22 10:32:10

Polyharmonic Cascade
Yuriy N. Bakhvalov
arxiv.org/abs/2512.17671 arxiv.org/pdf/2512.17671 arxiv.org/html/2512.17671
arXiv:2512.17671v1 Announce Type: new
Abstract: This paper presents a deep machine learning architecture, the "polyharmonic cascade" -- a sequence of packages of polyharmonic splines, where each layer is rigorously derived from the theory of random functions and the principles of indifference. This makes it possible to approximate nonlinear functions of arbitrary complexity while preserving global smoothness and a probabilistic interpretation. For the polyharmonic cascade, a training method alternative to gradient descent is proposed: instead of directly optimizing the coefficients, one solves a single global linear system on each batch with respect to the function values at fixed "constellations" of nodes. This yields synchronized updates of all layers, preserves the probabilistic interpretation of individual layers and theoretical consistency with the original model, and scales well: all computations reduce to 2D matrix operations efficiently executed on a GPU. Fast learning without overfitting on MNIST is demonstrated.
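As a rough illustration of the building block behind the abstract, here is a minimal 1-D polyharmonic spline interpolant with kernel phi(r) = r^3 and a linear polynomial tail, where the coefficients come from solving one linear system rather than from gradient descent. This is a sketch of a standard polyharmonic spline, not the paper's cascade architecture; the node placement, kernel order, and test function are all assumptions.

```python
import numpy as np

def phs_interp(x, y):
    """1-D polyharmonic spline interpolant, phi(r) = r^3, with a
    linear polynomial tail; all coefficients from a single solve."""
    n = len(x)
    K = np.abs(x[:, None] - x[None, :]) ** 3      # spline kernel block
    P = np.column_stack([np.ones(n), x])          # linear tail [1, x]
    A = np.block([[K, P], [P.T, np.zeros((2, 2))]])
    coef = np.linalg.solve(A, np.concatenate([y, np.zeros(2)]))
    w, c = coef[:n], coef[n:]

    def f(xq):
        xq = np.asarray(xq, dtype=float)
        Kq = np.abs(xq[:, None] - x[None, :]) ** 3
        Pq = np.column_stack([np.ones(len(xq)), xq])
        return Kq @ w + Pq @ c

    return f

# Interpolate a nonlinear function from 9 fixed nodes
x = np.linspace(0.0, 1.0, 9)
y = np.sin(2 * np.pi * x)
f = phs_interp(x, y)
```

The interpolant passes exactly through the node values, and in 1-D this system is the classical natural cubic spline, which is why global smoothness is preserved.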