"New START,"
💥the only remaining nuclear arms control agreement between the United States and Russia,
🔥expires this Thursday.
Rose Gottemoeller, former Deputy Secretary General of NATO, was America's chief negotiator on "New START."
She joins the Amanpour and Company show from Capitol Hill, where she was briefing US senators on the agreement.
🇺🇦 #NowPlaying on KEXP's #DriveTime
Sébastien Tellier:
🎵 Kilometer (A-Trak radio edit)
#SébastienTellier
https://cralias.bandcamp.com/track/s-bastien-tellier-kilometer-dj-animebby-edit
https://open.spotify.com/track/0hcb7YNNk9NSEntmC3980x
Be on guard!
A dangerous development, especially in the wrong hands.
Resist the beginnings.
Original: https://c3d2.de/news/20260226-pm-SaechsPDVG.html
Press:
"The United States is now a hostile and potentially predatory nation," says conservative historian
Robert Kagan
In a new column in The Atlantic, Robert Kagan writes that Donald Trump "managed in one year to destroy the American order".
https://www.cnn.com/2026/02/12/tv/vi…
Deep unfolding of MCMC kernels: scalable, modular & explainable GANs for high-dimensional posterior sampling
Jonathan Spence, Tobías I. Liaudat, Konstantinos Zygalakis, Marcelo Pereyra
https://arxiv.org/abs/2602.20758 https://arxiv.org/pdf/2602.20758 https://arxiv.org/html/2602.20758
arXiv:2602.20758v1 Announce Type: new
Abstract: Markov chain Monte Carlo (MCMC) methods are fundamental to Bayesian computation, but can be computationally intensive, especially in high-dimensional settings. Push-forward generative models, such as generative adversarial networks (GANs), variational auto-encoders and normalising flows, offer a computationally efficient alternative for posterior sampling. However, push-forward models are opaque as they lack the modularity of Bayes' theorem, leading to poor generalisation with respect to changes in the likelihood function. In this work, we introduce a novel approach to GAN architecture design by applying deep unfolding to Langevin MCMC algorithms. This paradigm maps fixed-step iterative algorithms onto modular neural networks, yielding architectures that are both flexible and amenable to interpretation. Crucially, our design allows key model parameters to be specified at inference time, offering robustness to changes in the likelihood parameters. We train these unfolded samplers end-to-end using a supervised regularized Wasserstein GAN framework for posterior sampling. Through extensive Bayesian imaging experiments, we demonstrate that our proposed approach achieves high sampling accuracy and excellent computational efficiency, while retaining the physics consistency, adaptability and interpretability of classical MCMC strategies.
toXiv_bot_toot
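For readers unfamiliar with "deep unfolding", the core idea in the abstract is to unroll a fixed number of Langevin MCMC steps into a feed-forward map, whose per-step parameters could then be learned end-to-end. Below is a minimal numpy sketch of unrolling the unadjusted Langevin algorithm (ULA) on a toy 1-D Gaussian posterior; the target, step sizes, and step count are illustrative assumptions, not the authors' actual architecture or training setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_log_post(x, mu=2.0, sigma=1.0):
    """Score of a toy 1-D Gaussian posterior N(mu, sigma^2),
    standing in for the true (likelihood + prior) score."""
    return -(x - mu) / sigma**2

def unfolded_langevin(z, step_sizes, mu=2.0, sigma=1.0):
    """Push latent noise z through K unrolled ULA steps:

        x_{k+1} = x_k + (gamma_k / 2) * grad log p(x_k) + sqrt(gamma_k) * xi_k

    In a trained unfolded sampler the gamma_k (and additional network
    blocks per step) would be learned; here they are fixed constants."""
    x = z
    for gamma in step_sizes:
        noise = rng.standard_normal(x.shape)
        x = x + 0.5 * gamma * grad_log_post(x, mu, sigma) + np.sqrt(gamma) * noise
    return x

z = rng.standard_normal(5000)   # latent input, as in a GAN generator
gammas = [0.5] * 40             # K = 40 unrolled steps, fixed step size
samples = unfolded_langevin(z, gammas)
print(samples.mean(), samples.std())  # should be near mu=2, sigma≈1
```

Because the likelihood enters only through `grad_log_post`, swapping in a different likelihood gradient at inference time changes the sampler without retraining — the modularity the abstract contrasts with opaque push-forward models.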
Optimal factor matchings for point processes on non-amenable unimodular graphs
Yinon Spinka, Oren Yakir
https://arxiv.org/abs/2601.08983 https://arxiv.org/…
🇺🇦 #NowPlaying on #BBC6Music's #6MusicArtistInResidence
Nico & The Velvet Underground:
🎵 It Was A Pleasure Then
#Nico #TheVelvetUnderground
https://maurinquina.bandcamp.com/track/it-was-a-pleasure-then-remixed-by-maurin-quina
https://open.spotify.com/track/1yC6eMSaNaN03PgbZGFWAB