Intrinsic Dimension Estimating Autoencoder (IDEA) Using CancelOut Layer and a Projected Loss
Antoine Oriou, Philipp Krah, Julian Koellermeier
https://arxiv.org/abs/2509.10011
Adapting Noise to Data: Generative Flows from 1D Processes
Jannis Chemseddine, Gregor Kornhardt, Richard Duong, Gabriele Steidl
https://arxiv.org/abs/2510.12636
A Unified Framework for Adaptive Waveform Processing in Next Generation Wireless Networks
Abdelali Arous, Hamza Haif, Arman Farhang, Huseyin Arslan
https://arxiv.org/abs/2510.12648
Executable Ontologies: Synthesizing Event Semantics with Dataflow Architecture
Aleksandr Boldachev
https://arxiv.org/abs/2509.09775
Self-Augmented Robot Trajectory: Efficient Imitation Learning via Safe Self-augmentation with Demonstrator-annotated Precision
Hanbit Oh, Masaki Murooka, Tomohiro Motoda, Ryoichi Nakajo, Yukiyasu Domae
https://arxiv.org/abs/2509.09893
🪞 Inside a global campaign hijacking open-source project identities
https://www.fullstory.com/blog/inside-a-global-campaign-hijacking-open-source-project-identities/
Simple Projection Variants Improve ColBERT Performance
Benjamin Clavié, Sean Lee, Rikiya Takehi, Aamir Shakir, Makoto P. Kato
https://arxiv.org/abs/2510.12327
OFP-Repair: Repairing Floating-point Errors via Original-Precision Arithmetic
Youshuai Tan, Zishuo Ding, Jinfu Chen, Weiyi Shang
https://arxiv.org/abs/2510.09938
Sparse Polyak: an adaptive step size rule for high-dimensional M-estimation
Tianqi Qiao, Marie Maros
https://arxiv.org/abs/2509.09802
Towards Fast Coarse-graining and Equation Discovery with Foundation Inference Models
Manuel Hinz, Maximilian Mauel, Patrick Seifner, David Berghaus, Kostadin Cvejoski, Ramses J. Sanchez
https://arxiv.org/abs/2510.12618