Believe it or not: the squirrel has long since analyzed you. 😍
#DietmarSchneider #Fotogalerie #Foto
Second-order Optimization under Heavy-Tailed Noise: Hessian Clipping and Sample Complexity Limits
Abdurakhmon Sadiev, Peter Richtárik, Ilyas Fatkhullin
https://arxiv.org/abs/2510.10690
Fiducial-Cosmology-dependent systematics for the DESI 2024 Full-Shape Analysis
R. Gsponer, S. Ramirez-Solano, F. Rodríguez-Martínez, M. Vargas-Magaña, S. Novell-Masot, N. Findlay, H. Gil-Marín, P. Zarrouk, S. Nadathur, A. Rocher, S. Brieden, A. Pérez-Fernández, J. Aguilar, S. Ahlen, D. Bianchi, D. Brooks, F. J. Castander, T. Claybaugh, A. Cuceu, A. de la Macorra, A. de Mattia, Arjun Dey, P. Doel, A. Font-Ribera, J. E. Forero-Romero, E. Gaztañaga, S. Go…
Luise, Queen of Prussia, as a South Seas beauty 🌺🙂
The bust of Luise in the Charlottenburg palace gardens is styled differently every day, and has been for years!
So far I haven't had the luck of meeting the stylist.
Luise, Queen of Prussia (✶ 10 March 1776 – † 19 July 1810)
#Fotogalerie
Federated Split Learning for Resource-Constrained Robots in Industrial IoT: Framework Comparison, Optimization Strategies, and Future Directions
Wanli Ni, Hui Tian, Shuai Wang, Chengyang Li, Lei Sun, Zhaohui Yang
https://arxiv.org/abs/2510.05713
Global Solutions to Non-Convex Functional Constrained Problems with Hidden Convexity
Ilyas Fatkhullin, Niao He, Guanghui Lan, Florian Wolf
https://arxiv.org/abs/2511.10626 https://arxiv.org/pdf/2511.10626 https://arxiv.org/html/2511.10626
arXiv:2511.10626v1 Announce Type: new
Abstract: Constrained non-convex optimization is fundamentally challenging, as global solutions are generally intractable and constraint qualifications may not hold. However, in many applications, including safe policy optimization in control and reinforcement learning, such problems possess hidden convexity, meaning they can be reformulated as convex programs via a nonlinear invertible transformation. Typically, such transformations are implicit or unknown, making a direct link to the convex program impossible. On the other hand, (sub-)gradients with respect to the original variables are often accessible or can be easily estimated, which motivates algorithms that operate directly in the original (non-convex) problem space using standard (sub-)gradient oracles. In this work, we develop the first algorithms that provably solve such non-convex problems to global minima. First, using a modified inexact proximal point method, we establish global last-iterate convergence guarantees with $\widetilde{\mathcal{O}}(\varepsilon^{-3})$ oracle complexity in the non-smooth setting. For smooth problems, we propose a new bundle-level type method based on linearly constrained quadratic subproblems, improving the oracle complexity to $\widetilde{\mathcal{O}}(\varepsilon^{-1})$. Surprisingly, despite non-convexity, our methodology does not require any constraint qualifications, can handle hidden convex equality constraints, and achieves complexities matching those for solving unconstrained hidden convex optimization.
toXiv_bot_toot
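For readers unfamiliar with the inexact proximal point idea the abstract refers to, here is a minimal sketch of that generic scheme, not the paper's actual method: each outer step approximately minimizes the objective plus a quadratic proximity term, using only (sub-)gradients in the original variables. The objective, step sizes, and iteration counts below are illustrative assumptions.

```python
# Generic inexact proximal point sketch (illustrative only; not the
# algorithm from arXiv:2511.10626).
import numpy as np

def subgrad(x):
    # Hypothetical non-smooth objective f(x) = ||x||_1 + 0.5*||x||^2,
    # standing in for a problem with an accessible subgradient oracle.
    return np.sign(x) + x

def inexact_prox_point(x0, lam=1.0, outer_iters=50, inner_iters=200):
    x_k = np.asarray(x0, dtype=float)
    for _ in range(outer_iters):
        # Approximately minimize f(y) + (1/(2*lam)) * ||y - x_k||^2
        # with plain subgradient descent (the "inexact" inner solve).
        y = x_k.copy()
        for t in range(1, inner_iters + 1):
            g = subgrad(y) + (y - x_k) / lam
            y -= (1.0 / t) * g   # diminishing step size
        x_k = y                  # last inner iterate becomes the next prox center
    return x_k

print(inexact_prox_point(np.array([3.0, -2.0, 0.5])))
```

The point of the sketch is only the two-level structure: an outer proximal loop whose subproblems are solved inexactly by a first-order method, so that only standard (sub-)gradient oracles are ever queried.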
Grey heron having a snack... 2/2
#DietmarSchneider #Fotogalerie #Foto #Photography
Grey heron having a snack... 1/2
#DietmarSchneider #Fotogalerie #Foto #Photography
Goodness, how cute! A puffed-up great tit...❤️
#DietmarSchneider #Fotogalerie #Foto #Photography
So cute how the squirrel props itself up with one little paw 😍
#DietmarSchneider #Fotogalerie #Foto #Photography