Tootfinder

Opt-in global Mastodon full text search. Join the index!

@arXiv_mathOC_bot@mastoxiv.page
2025-11-14 11:47:12

Crosslisted article(s) found for math.OC. arxiv.org/list/math.OC/new
[1/1]:
- Optimal control of Volterra integral diffusions and application to contract theory
Dylan Possamaï, Mehdi Talbi
arxiv.org/abs/2511.09701 mastoxiv.page/@arXiv_mathPR_bo
- Generalized infinite dimensional Alpha-Procrustes based geometries
Salvish Goomanee, Andi Han, Pratik Jawanpuria, Bamdev Mishra
arxiv.org/abs/2511.09801 mastoxiv.page/@arXiv_statML_bo
- Sample Complexity of Quadratically Regularized Optimal Transport
Alberto González-Sanz, Eustasio del Barrio, Marcel Nutz
arxiv.org/abs/2511.09807 mastoxiv.page/@arXiv_mathST_bo
- On the Convergence of Overparameterized Problems: Inherent Properties of the Compositional Struct...
Arthur Castello Branco de Oliveira, Dhruv Jatkar, Eduardo Sontag
arxiv.org/abs/2511.09810 mastoxiv.page/@arXiv_csLG_bot/
- Implicit Multiple Tensor Decomposition
Kunjing Yang, Libin Zheng, Minru Bai
arxiv.org/abs/2511.09916 mastoxiv.page/@arXiv_mathNA_bo
- Theoretical Analysis of Resource-Induced Phase Transitions in Estimation Strategies
Takehiro Tottori, Tetsuya J. Kobayashi
arxiv.org/abs/2511.10184 mastoxiv.page/@arXiv_physicsbi
- Zeroes and Extrema of Functions via Random Measures
Athanasios Christou Micheas
arxiv.org/abs/2511.10293 mastoxiv.page/@arXiv_statME_bo
- Operator Models for Continuous-Time Offline Reinforcement Learning
Nicolas Hoischen, Petar Bevanda, Max Beier, Stefan Sosnowski, Boris Houska, Sandra Hirche
arxiv.org/abs/2511.10383 mastoxiv.page/@arXiv_statML_bo
- On topological properties of closed attractors
Wouter Jongeneel
arxiv.org/abs/2511.10429 mastoxiv.page/@arXiv_mathDS_bo
- Learning parameter-dependent shear viscosity from data, with application to sea and land ice
Gonzalo G. de Diego, Georg Stadler
arxiv.org/abs/2511.10452 mastoxiv.page/@arXiv_mathNA_bo
- Formal Verification of Control Lyapunov-Barrier Functions for Safe Stabilization with Bounded Con...
Jun Liu
arxiv.org/abs/2511.10510 mastoxiv.page/@arXiv_eessSY_bo
- Direction-of-Arrival and Noise Covariance Matrix joint estimation for beamforming
Vitor Gelsleichter Probst Curtarelli
arxiv.org/abs/2511.10639 mastoxiv.page/@arXiv_eessAS_bo
toXiv_bot_toot

@arXiv_physicsatomph_bot@mastoxiv.page
2025-12-16 08:53:32

Coulomb crystallization of xenon highly charged ions in a laser-cooled Ca matrix
Leonid Prokhorov, Aaron A. Smith, Mingyao Xu, Kostas Georgiou, Vera Guarrera, Lakshmi P. Kozhiparambil Sajith, Elwin A. Dijck, Christian Warnecke, Malte Wehrheim, Alexander Wilzewski, Laura Blackburn, Matthias Keller, Vincent Boyer, Thomas Pfeifer, Ullrich Schwanke, Cigdem Issever, Steven Worm, Piet O. Schmidt, José R. Crespo Lopez-Urrutia, Giovanni Barontini

@arXiv_csLG_bot@mastoxiv.page
2025-12-22 11:50:31

Crosslisted article(s) found for cs.LG. arxiv.org/list/cs.LG/new
[2/3]:
- Sharp Structure-Agnostic Lower Bounds for General Functional Estimation
Jikai Jin, Vasilis Syrgkanis
arxiv.org/abs/2512.17341 mastoxiv.page/@arXiv_statML_bo
- Timely Information Updating for Mobile Devices Without and With ML Advice
Yu-Pin Hsu, Yi-Hsuan Tseng
arxiv.org/abs/2512.17381 mastoxiv.page/@arXiv_csNI_bot/
- SWE-Bench : A Framework for the Scalable Generation of Software Engineering Benchmarks from Open...
Wang, Ramalho, Celestino, Pham, Liu, Sinha, Portillo, Osunwa, Maduekwe
arxiv.org/abs/2512.17419 mastoxiv.page/@arXiv_csSE_bot/
- Perfect reconstruction of sparse signals using nonconvexity control and one-step RSB message passing
Xiaosi Gu, Ayaka Sakata, Tomoyuki Obuchi
arxiv.org/abs/2512.17426 mastoxiv.page/@arXiv_statML_bo
- MULTIAQUA: A multimodal maritime dataset and robust training strategies for multimodal semantic s...
Jon Muhovič, Janez Perš
arxiv.org/abs/2512.17450 mastoxiv.page/@arXiv_csCV_bot/
- When Data Quality Issues Collide: A Large-Scale Empirical Study of Co-Occurring Data Quality Issu...
Emmanuel Charleson Dapaah, Jens Grabowski
arxiv.org/abs/2512.17460 mastoxiv.page/@arXiv_csSE_bot/
- Behavioural Effects of Agentic Messaging: A Case Study on a Financial Service Application
Olivier Jeunen, Schaun Wheeler
arxiv.org/abs/2512.17462 mastoxiv.page/@arXiv_csIR_bot/
- Linear Attention for Joint Power Optimization and User-Centric Clustering in Cell-Free Networks
Irched Chafaa, Giacomo Bacci, Luca Sanguinetti
arxiv.org/abs/2512.17466 mastoxiv.page/@arXiv_eessSY_bo
- Translating the Rashomon Effect to Sequential Decision-Making Tasks
Dennis Gross, Jørn Eirik Betten, Helge Spieker
arxiv.org/abs/2512.17470 mastoxiv.page/@arXiv_csAI_bot/
- Alternating Direction Method of Multipliers for Nonlinear Matrix Decompositions
Atharva Awari, Nicolas Gillis, Arnaud Vandaele
arxiv.org/abs/2512.17473 mastoxiv.page/@arXiv_eessSP_bo
- TwinSegNet: A Digital Twin-Enabled Federated Learning Framework for Brain Tumor Analysis
Almustapha A. Wakili, Adamu Hussaini, Abubakar A. Musa, Woosub Jung, Wei Yu
arxiv.org/abs/2512.17488 mastoxiv.page/@arXiv_csCV_bot/
- Resource-efficient medical image classification for edge devices
Mahsa Lavaei, Zahra Abadi, Salar Beigzad, Alireza Maleki
arxiv.org/abs/2512.17515 mastoxiv.page/@arXiv_eessIV_bo
- PathBench-MIL: A Comprehensive AutoML and Benchmarking Framework for Multiple Instance Learning i...
Brussee, Valkema, Weijer, Doeleman, Schrader, Kers
arxiv.org/abs/2512.17517 mastoxiv.page/@arXiv_csCV_bot/
- HydroGym: A Reinforcement Learning Platform for Fluid Dynamics
Christian Lagemann, et al.
arxiv.org/abs/2512.17534 mastoxiv.page/@arXiv_physicsfl
- When De-noising Hurts: A Systematic Study of Speech Enhancement Effects on Modern Medical ASR Sys...
Chondhekar, Murukuri, Vasani, Goyal, Badami, Rana, SN, Pandia, Katiyar, Jagadeesh, Gulati
arxiv.org/abs/2512.17562 mastoxiv.page/@arXiv_csSD_bot/
- Enabling Disaggregated Multi-Stage MLLM Inference via GPU-Internal Scheduling and Resource Sharing
Lingxiao Zhao, Haoran Zhou, Yuezhi Che, Dazhao Cheng
arxiv.org/abs/2512.17574 mastoxiv.page/@arXiv_csDC_bot/
- SkinGenBench: Generative Model and Preprocessing Effects for Synthetic Dermoscopic Augmentation i...
N. A. Adarsh Pritam, Jeba Shiney O, Sanyam Jain
arxiv.org/abs/2512.17585 mastoxiv.page/@arXiv_eessIV_bo
- MAD-OOD: A Deep Learning Cluster-Driven Framework for an Out-of-Distribution Malware Detection an...
Tosin Ige, Christopher Kiekintveld, Aritran Piplai, Asif Rahman, Olukunle Kolade, Sasidhar Kunapuli
arxiv.org/abs/2512.17594 mastoxiv.page/@arXiv_csCR_bot/
- Confidence-Credibility Aware Weighted Ensembles of Small LLMs Outperform Large LLMs in Emotion De...
Menna Elgabry, Ali Hamdi
arxiv.org/abs/2512.17630 mastoxiv.page/@arXiv_csCL_bot/
- Generative Multi-Objective Bayesian Optimization with Scalable Batch Evaluations for Sample-Effic...
Madhav R. Muthyala, Farshud Sorourifar, Tianhong Tan, You Peng, Joel A. Paulson
arxiv.org/abs/2512.17659 mastoxiv.page/@arXiv_statML_bo
toXiv_bot_toot

@arXiv_mathOC_bot@mastoxiv.page
2025-11-14 09:19:00

Global Convergence of Four-Layer Matrix Factorization under Random Initialization
Minrui Luo, Weihang Xu, Xiang Gao, Maryam Fazel, Simon Shaolei Du
arxiv.org/abs/2511.09925 arxiv.org/pdf/2511.09925 arxiv.org/html/2511.09925
arXiv:2511.09925v1 Announce Type: new
Abstract: Gradient descent dynamics on the deep matrix factorization problem are extensively studied as a simplified theoretical model for deep neural networks. Although the convergence theory for two-layer matrix factorization is well-established, no global convergence guarantee for general deep matrix factorization under random initialization has been established to date. To address this gap, we provide a polynomial-time global convergence guarantee for randomly initialized gradient descent on four-layer matrix factorization, given certain conditions on the target matrix and a standard balanced regularization term. Our analysis employs new techniques to show saddle-avoidance properties of gradient descent dynamics, and extends previous theories to characterize the change in eigenvalues of layer weights.
toXiv_bot_toot
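The setting of this abstract is easy to reproduce numerically. Below is a minimal illustrative sketch, not the paper's construction: plain gradient descent on a four-layer factorization W4 W3 W2 W1 ≈ M from small random initialization, with the balanced regularization term the paper analyzes omitted for brevity. All dimensions and step sizes are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4

# Target matrix and four small randomly initialized square layers
M = rng.standard_normal((d, d))
Ws = [0.3 * rng.standard_normal((d, d)) for _ in range(4)]

def loss(Ws, M):
    W1, W2, W3, W4 = Ws
    return np.sum((W4 @ W3 @ W2 @ W1 - M) ** 2)

def step(Ws, M, lr=5e-3):
    # Chain-rule gradients of ||W4 W3 W2 W1 - M||_F^2 w.r.t. each layer
    W1, W2, W3, W4 = Ws
    R = 2 * (W4 @ W3 @ W2 @ W1 - M)
    return [W1 - lr * (W4 @ W3 @ W2).T @ R,
            W2 - lr * (W4 @ W3).T @ R @ W1.T,
            W3 - lr * W4.T @ R @ (W2 @ W1).T,
            W4 - lr * R @ (W3 @ W2 @ W1).T]

l0 = loss(Ws, M)
for _ in range(3000):
    Ws = step(Ws, M)
l1 = loss(Ws, M)  # loss drops once GD escapes the saddle at the origin
```

The slow initial phase near the all-small initialization is exactly the saddle-avoidance regime the abstract refers to.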

@arXiv_physicsgenph_bot@mastoxiv.page
2025-11-12 09:10:59

Einstein and Debye temperatures, electron-phonon coupling constant and a probable mechanism for ambient-pressure room-temperature superconductivity in intercalated graphite
E. F. Talantsev
arxiv.org/abs/2511.07460 arxiv.org/pdf/2511.07460 arxiv.org/html/2511.07460
arXiv:2511.07460v1 Announce Type: new
Abstract: Recently, Ksenofontov et al (arXiv:2510.03256) observed ambient-pressure room-temperature superconductivity in graphite intercalated with lithium-based alloys with transition temperature (according to magnetization measurements) $T_c=330$ $K$. Here, I analyzed the reported temperature-dependent resistivity data $\rho(T)$ in these graphite-intercalated samples and found that $\rho(T)$ is well described by the model of two series resistors, where each resistor is described as either an Einstein conductor or a Bloch-Grüneisen conductor. Deduced Einstein and Debye temperatures are $\Theta_{E,1} \approx 250$ $K$ and $\Theta_{E,2} \approx 1,600$ $K$, and $\Theta_{D,1} \approx 300$ $K$ and $\Theta_{D,2} \approx 2,200$ $K$, respectively. Following the McMillan formalism, from the deduced $\Theta_{E,2}$ and $\Theta_{D,2}$, the electron-phonon coupling constant $\lambda_{e-ph} = 2.2 - 2.6$ was obtained. This value of $\lambda_{e-ph}$ is approximately equal to the value of $\lambda_{e-ph}$ in highly compressed superconducting hydrides. Based on this, I propose that the observed room-temperature superconductivity in intercalated graphite is localized in nanoscale Sr-Ca-Li metallic flakes/particles, which adopt the phonon spectrum from the surrounding bulk graphite matrix, and as a result, conventional electron-phonon superconductivity arises in these nano-flakes/particles at room temperature. Experimental data reported by Ksenofontov et al (arXiv:2510.03256) on trapped magnetic flux decay in intercalated graphite samples support the proposition.
toXiv_bot_toot
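The two-series-resistors picture in this abstract can be sketched with textbook scattering models. The code below is an illustrative reconstruction, not the author's fit: it uses the standard Bloch-Grüneisen integral plus a commonly used Einstein-mode resistivity form (an assumption here), arbitrary amplitudes A and B, and two of the deduced temperatures (Θ_E ≈ 250 K, Θ_D ≈ 2200 K) plugged in.

```python
import numpy as np

def rho_bloch_gruneisen(T, theta_D, A=1.0, n=2000):
    # Standard Bloch-Grüneisen form:
    # rho(T) = A * (T/theta_D)^5 * integral_0^{theta_D/T} x^5 e^x / (e^x - 1)^2 dx
    x = np.linspace(1e-8, theta_D / T, n)
    f = x ** 5 * np.exp(x) / np.expm1(x) ** 2
    integral = np.sum((f[1:] + f[:-1]) / 2 * np.diff(x))  # trapezoid rule
    return A * (T / theta_D) ** 5 * integral

def rho_einstein(T, theta_E, B=1.0):
    # A commonly used Einstein-mode resistivity (assumed functional form):
    # rho(T) = B / (T * (e^{theta_E/T} - 1) * (1 - e^{-theta_E/T}))
    x = theta_E / T
    return B / (T * np.expm1(x) * (1 - np.exp(-x)))

def rho_total(T, theta_E=250.0, theta_D=2200.0):
    # Two resistors in series: the resistivities simply add
    return rho_einstein(T, theta_E) + rho_bloch_gruneisen(T, theta_D)

rhos = [rho_total(T) for T in (100.0, 200.0, 300.0)]  # monotonically increasing
```

Both terms reduce to the expected linear-in-T metallic behaviour at high temperature, which is why such a two-term series fit can separate a low and a high characteristic phonon temperature.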

@arXiv_mathHO_bot@mastoxiv.page
2025-11-12 08:43:49

Games in the matrix
Ilijas Farah
arxiv.org/abs/2511.07456 arxiv.org/pdf/2511.07456

@arXiv_csLG_bot@mastoxiv.page
2025-12-22 13:54:55

Replaced article(s) found for cs.LG. arxiv.org/list/cs.LG/new
[4/5]:
- Sample, Don't Search: Rethinking Test-Time Alignment for Language Models
Gonçalo Faria, Noah A. Smith
arxiv.org/abs/2504.03790 mastoxiv.page/@arXiv_csCL_bot/
- A Survey on Archetypal Analysis
Aleix Alcacer, Irene Epifanio, Sebastian Mair, Morten Mørup
arxiv.org/abs/2504.12392 mastoxiv.page/@arXiv_statME_bo
- The Stochastic Occupation Kernel (SOCK) Method for Learning Stochastic Differential Equations
Michael L. Wells, Kamel Lahouel, Bruno Jedynak
arxiv.org/abs/2505.11622 mastoxiv.page/@arXiv_statML_bo
- BOLT: Block-Orthonormal Lanczos for Trace estimation of matrix functions
Kingsley Yeon, Promit Ghosal, Mihai Anitescu
arxiv.org/abs/2505.12289 mastoxiv.page/@arXiv_mathNA_bo
- Clustering and Pruning in Causal Data Fusion
Otto Tabell, Santtu Tikka, Juha Karvanen
arxiv.org/abs/2505.15215 mastoxiv.page/@arXiv_statML_bo
- On the performance of multi-fidelity and reduced-dimensional neural emulators for inference of ph...
Chloe H. Choi, Andrea Zanoni, Daniele E. Schiavazzi, Alison L. Marsden
arxiv.org/abs/2506.11683 mastoxiv.page/@arXiv_statML_bo
- Beyond Force Metrics: Pre-Training MLFFs for Stable MD Simulations
Maheshwari, Tang, Ock, Kolluru, Farimani, Kitchin
arxiv.org/abs/2506.14850 mastoxiv.page/@arXiv_physicsch
- Quantifying Uncertainty in the Presence of Distribution Shifts
Yuli Slavutsky, David M. Blei
arxiv.org/abs/2506.18283 mastoxiv.page/@arXiv_statML_bo
- ZKPROV: A Zero-Knowledge Approach to Dataset Provenance for Large Language Models
Mina Namazi, Alexander Nemecek, Erman Ayday
arxiv.org/abs/2506.20915 mastoxiv.page/@arXiv_csCR_bot/
- SpecCLIP: Aligning and Translating Spectroscopic Measurements for Stars
Zhao, Huang, Xue, Kong, Liu, Tang, Beers, Ting, Luo
arxiv.org/abs/2507.01939 mastoxiv.page/@arXiv_astrophIM
- Towards Facilitated Fairness Assessment of AI-based Skin Lesion Classifiers Through GenAI-based I...
Ko Watanabe, Stanislav Frolov, Aya Hassan, David Dembinsky, Adriano Lucieri, Andreas Dengel
arxiv.org/abs/2507.17860 mastoxiv.page/@arXiv_csCV_bot/
- PASS: Probabilistic Agentic Supernet Sampling for Interpretable and Adaptive Chest X-Ray Reasoning
Yushi Feng, Junye Du, Yingying Hong, Qifan Wang, Lequan Yu
arxiv.org/abs/2508.10501 mastoxiv.page/@arXiv_csAI_bot/
- Unified Acoustic Representations for Screening Neurological and Respiratory Pathologies from Voice
Ran Piao, Yuan Lu, Hareld Kemps, Tong Xia, Aaqib Saeed
arxiv.org/abs/2508.20717 mastoxiv.page/@arXiv_csSD_bot/
- Machine Learning-Driven Predictive Resource Management in Complex Science Workflows
Tasnuva Chowdhury, et al.
arxiv.org/abs/2509.11512 mastoxiv.page/@arXiv_csDC_bot/
- MatchFixAgent: Language-Agnostic Autonomous Repository-Level Code Translation Validation and Repair
Ali Reza Ibrahimzada, Brandon Paulsen, Reyhaneh Jabbarvand, Joey Dodds, Daniel Kroening
arxiv.org/abs/2509.16187 mastoxiv.page/@arXiv_csSE_bot/
- Automated Machine Learning Pipeline: Large Language Models-Assisted Automated Dataset Generation ...
Adam Lahouari, Jutta Rogal, Mark E. Tuckerman
arxiv.org/abs/2509.21647 mastoxiv.page/@arXiv_condmatmt
- Quantifying the Impact of Structured Output Format on Large Language Models through Causal Inference
Han Yuan, Yue Zhao, Li Zhang, Wuqiong Luo, Zheng Ma
arxiv.org/abs/2509.21791 mastoxiv.page/@arXiv_csCL_bot/
- The Generation Phases of Flow Matching: a Denoising Perspective
Anne Gagneux, Ségolène Martin, Rémi Gribonval, Mathurin Massias
arxiv.org/abs/2510.24830 mastoxiv.page/@arXiv_csCV_bot/
- Data-driven uncertainty-aware seakeeping prediction of the Delft 372 catamaran using ensemble Han...
Giorgio Palma, Andrea Serani, Matteo Diez
arxiv.org/abs/2511.04461 mastoxiv.page/@arXiv_eessSY_bo
- Generalized infinite dimensional Alpha-Procrustes based geometries
Salvish Goomanee, Andi Han, Pratik Jawanpuria, Bamdev Mishra
arxiv.org/abs/2511.09801 mastoxiv.page/@arXiv_statML_bo
toXiv_bot_toot

@arXiv_nlinSI_bot@mastoxiv.page
2025-11-10 10:33:21

Crosslisted article(s) found for nlin.SI. arxiv.org/list/nlin.SI/new
[1/1]:
- Anti-commuting Solutions of the Yang-Baxter-like Matrix Equation
Mohammed Ahmed Adam Abdalrahman, Huijian Zhu, Jiu Ding, Qianglian Huang
arxiv.org/abs/2511.05088 mastoxiv.page/@arXiv_mathNA_bo
- Exactly solvable Stuart-Landau models in arbitrary dimensions
Pragjyotish Bhuyan Gogoi, Rahul Ghosh, Debashis Ghoshal, Awadhesh Prasad, Ram Ramaswamy
arxiv.org/abs/2511.05160 mastoxiv.page/@arXiv_nlinCD_bo
toXiv_bot_toot

@arXiv_physicsoptics_bot@mastoxiv.page
2025-11-25 09:57:52

Multi-port programmable silicon photonics using low-loss phase change material Sb$_2$Se$_3$
Thomas W. Radford, Idris A Ajia, Latif Rozaqi, Priya Deoli, Xingzhao Yan, Mehdi Banakar, David J Thomson, Ioannis Zeimpekis, Alberto Politi, Otto L. Muskens
arxiv.org/abs/2511.18205 arxiv.org/pdf/2511.18205 arxiv.org/html/2511.18205
arXiv:2511.18205v1 Announce Type: new
Abstract: Reconfigurable photonic devices are rapidly emerging as a cornerstone of next-generation optical technologies, with wide-ranging applications in quantum simulation, neuromorphic computing, and large-scale photonic processors. A central challenge in this field is identifying an optimal platform to enable compact, efficient, and scalable reconfigurability. Optical phase-change materials (PCMs) offer a compelling solution by enabling non-volatile, reversible tuning of optical properties, compatible with a wide range of device platforms and current CMOS technologies. In particular, antimony tri-selenide ($\text{Sb}_{2}\text{Se}_{3}$) stands out for its ultra-low-loss characteristics at telecommunication wavelengths and its reversible switching. In this work, we present an experimental platform capable of encoding multi-port operations onto the transmission matrix of a compact multimode interferometer architecture on standard 220 nm silicon photonics using in silico designed digital patterns. The multi-port devices are clad with a thin film of $\text{Sb}_{2}\text{Se}_{3}$, which can be optically addressed using direct laser writing to provide local perturbations to the refractive index. A range of multi-port geometries from 2$\times$2 up to 5$\times$5 couplers are demonstrated, achieving simultaneous control of up to 25 matrix elements with a programming accuracy of 90% relative to simulated patterns. Patterned devices remain stable with consistent optical performance across the C-band wavelengths. Our work establishes a pathway towards large-scale PCM-based reconfigurable multi-port devices that will allow implementing matrix operations on areas three orders of magnitude smaller than interferometer meshes.
toXiv_bot_toot

@arXiv_nlinSI_bot@mastoxiv.page
2025-11-10 08:06:10

Generalized discrete integrable operator and integrable hierarchy
Huan Liu
arxiv.org/abs/2511.05046 arxiv.org/pdf/2511.05046 arxiv.org/html/2511.05046
arXiv:2511.05046v1 Announce Type: new
Abstract: We introduce and systematically develop two classes of discrete integrable operators: those with $2\times 2$ matrix kernels and those possessing general differential kernels, thereby generalizing the discrete analogue previously studied. A central finding is their inherent connection to higher-order pole solutions of integrable hierarchies, contrasting sharply with standard operators linked to simple poles. This work not only provides explicit resolvent formulas for matrix kernels and differential operator analogues but also offers discrete integrable structures that encode higher-order behaviour.
toXiv_bot_toot

@arXiv_mathQA_bot@mastoxiv.page
2025-12-23 08:25:47

Quantum upper triangular matrix algebras
Érica Z. Fornaroli, Mykola Khrypchenko, Samuel A. Lopes, Ednei A. Santulo Jr
arxiv.org/abs/2512.19664

@gpummer@t.testitfor.me
2025-11-23 19:01:38

The WhatsApp bridge of my Matrix server gave up the ghost a few weeks ago.
Fine, that's nothing new; sometimes the bridge just needs to be updated. No sooner said than done, and nothing happened. No reaction to the commands, even though the log recorded that the command had arrived.
For a while I (reluctantly) used WA directly.
Today I look into the Matrix client and discover by chance that "someone" has blocked the bot channel. 1!11!

@arXiv_physicsoptics_bot@mastoxiv.page
2025-11-25 10:40:33

Dispersion-Aware Modeling Framework for Parallel Optical Computing
Ziqi Wei, Yuanjian Wan, Yuhu Cheng, Xiao Yu, Peng Xie
arxiv.org/abs/2511.18897 arxiv.org/pdf/2511.18897 arxiv.org/html/2511.18897
arXiv:2511.18897v1 Announce Type: new
Abstract: Optical computing represents a groundbreaking technology that leverages the unique properties of photons, with innate parallelism standing as its most compelling advantage. Parallel optical computing based on cascaded Mach-Zehnder interferometers (MZIs) offers powerful computational capabilities but also introduces new challenges, particularly concerning dispersion due to the introduction of new frequencies. In this work, we extend existing theories of cascaded MZI systems to develop a generalized model tailored for wavelength-multiplexed parallel optical computing. Our comprehensive model incorporates component dispersion characteristics into a wavelength-dependent transfer-matrix framework and is experimentally validated. We propose a computationally efficient compensation strategy that reduces global dispersion error within a 40 nm range from 0.22 to 0.039 using edge-spectrum calibration. This work establishes a fundamental framework for dispersion-aware modeling and error correction in MZI-based parallel optical computing chips, advancing the reliability of multi-wavelength photonic processors.
toXiv_bot_toot
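A minimal wavelength-dependent transfer-matrix model of the kind this abstract describes is easy to write down. This sketch uses illustrative numbers, not the paper's calibrated model: it cascades lossless 2x2 MZI stages whose arm phases depend on wavelength through an assumed first-order dispersion of the effective index.

```python
import numpy as np

def coupler(kappa=np.pi / 4):
    # Lossless 2x2 directional coupler (50/50 split at kappa = pi/4)
    c, s = np.cos(kappa), np.sin(kappa)
    return np.array([[c, 1j * s], [1j * s, c]])

def n_eff(lam):
    # Illustrative first-order dispersion of the effective index around 1550 nm
    return 2.35 - 1.0e6 * (lam - 1.55e-6)

def mzi(lam, dL):
    # Wavelength-dependent phase from the arm-length imbalance dL
    phi = 2 * np.pi * n_eff(lam) * dL / lam
    phase = np.array([[np.exp(1j * phi), 0], [0, 1]])
    return coupler() @ phase @ coupler()

def cascade(lam, imbalances):
    # Total transfer matrix is the ordered product of the stage matrices
    T = np.eye(2, dtype=complex)
    for dL in imbalances:
        T = mzi(lam, dL) @ T
    return T

T = cascade(1.55e-6, [40e-6, 60e-6, 25e-6])
p_cross = abs(T[1, 0]) ** 2  # cross-port power; sweeping lam exposes dispersion
```

Evaluating the cascade over a grid of wavelengths is what turns this into the kind of dispersion-aware model the abstract validates and compensates.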

@arXiv_csLG_bot@mastoxiv.page
2025-12-22 10:32:10

Polyharmonic Cascade
Yuriy N. Bakhvalov
arxiv.org/abs/2512.17671 arxiv.org/pdf/2512.17671 arxiv.org/html/2512.17671
arXiv:2512.17671v1 Announce Type: new
Abstract: This paper presents a deep machine learning architecture, the "polyharmonic cascade" -- a sequence of packages of polyharmonic splines, where each layer is rigorously derived from the theory of random functions and the principles of indifference. This makes it possible to approximate nonlinear functions of arbitrary complexity while preserving global smoothness and a probabilistic interpretation. For the polyharmonic cascade, a training method alternative to gradient descent is proposed: instead of directly optimizing the coefficients, one solves a single global linear system on each batch with respect to the function values at fixed "constellations" of nodes. This yields synchronized updates of all layers, preserves the probabilistic interpretation of individual layers and theoretical consistency with the original model, and scales well: all computations reduce to 2D matrix operations efficiently executed on a GPU. Fast learning without overfitting on MNIST is demonstrated.
toXiv_bot_toot
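The "global linear solve instead of gradient descent" idea in this abstract has a classical single-layer analogue: a polyharmonic spline is fit exactly by one linear system. A minimal 1-D sketch (kernel |r|^3 plus an affine part; this is the building block, not the cascade itself):

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(-1.0, 1.0, 20))
y = np.sin(3.0 * x)

# Polyharmonic kernel phi(r) = |r|^3 with an affine tail, fit by ONE linear solve
K = np.abs(x[:, None] - x[None, :]) ** 3
P = np.stack([np.ones_like(x), x], axis=1)      # affine basis [1, x]
A = np.block([[K, P], [P.T, np.zeros((2, 2))]])
rhs = np.concatenate([y, np.zeros(2)])
sol = np.linalg.solve(A, rhs)                   # no gradient descent involved
w, c = sol[:20], sol[20:]

def spline(t):
    return np.abs(t - x) ** 3 @ w + c[0] + c[1] * t

# The spline interpolates the data exactly at the nodes
err = max(abs(spline(xi) - yi) for xi, yi in zip(x, y))
```

Per the abstract, the cascade stacks such layers and re-solves a single global system per batch, so all layers update in sync; the solve above is the 2D-matrix primitive that maps well to a GPU.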

@arXiv_physicsatomph_bot@mastoxiv.page
2025-10-22 11:25:51

Crosslisted article(s) found for physics.atom-ph. arxiv.org/list/physics.atom-ph
[1/1]:
- Electron impact excitation of Te IV and V and Level Resolved R-matrix Photoionization of Te I - I...
Leo P. Mulholland, Catherine A. Ramsbottom, Connor P. Ballance, Albert…

@arXiv_nlinCD_bot@mastoxiv.page
2025-11-20 12:10:25

Replaced article(s) found for nlin.CD. arxiv.org/list/nlin.CD/new
[1/1]:
- Modular-invariant random matrix theory and AdS${}_3$ wormholes
Jan Boruch, Gabriele Di Ubaldo, Felix M. Haehl, Eric Perlmutter, Moshe Rozali

@arXiv_csLG_bot@mastoxiv.page
2025-12-22 10:34:50

Regularized Random Fourier Features and Finite Element Reconstruction for Operator Learning in Sobolev Space
Xinyue Yu, Hayden Schaeffer
arxiv.org/abs/2512.17884 arxiv.org/pdf/2512.17884 arxiv.org/html/2512.17884
arXiv:2512.17884v1 Announce Type: new
Abstract: Operator learning is a data-driven approximation of mappings between infinite-dimensional function spaces, such as the solution operators of partial differential equations. Kernel-based operator learning can offer accurate, theoretically justified approximations that require less training than standard methods. However, they can become computationally prohibitive for large training sets and can be sensitive to noise. We propose a regularized random Fourier feature (RRFF) approach, coupled with a finite element reconstruction map (RRFF-FEM), for learning operators from noisy data. The method uses random features drawn from multivariate Student's $t$ distributions, together with frequency-weighted Tikhonov regularization that suppresses high-frequency noise. We establish high-probability bounds on the extreme singular values of the associated random feature matrix and show that when the number of features $N$ scales like $m \log m$ with the number of training samples $m$, the system is well-conditioned, which yields estimation and generalization guarantees. Detailed numerical experiments on benchmark PDE problems, including advection, Burgers', Darcy flow, Helmholtz, Navier-Stokes, and structural mechanics, demonstrate that RRFF and RRFF-FEM are robust to noise and achieve improved performance with reduced training time compared to the unregularized random feature model, while maintaining competitive accuracy relative to kernel and neural operator tests.
toXiv_bot_toot
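The two ingredients this abstract names, random features with Student's t frequencies and frequency-weighted Tikhonov regularization, can be sketched in a few lines. This is a hypothetical 1-D illustration with made-up scales and regression target, not the paper's RRFF-FEM method:

```python
import numpy as np

rng = np.random.default_rng(2)

# Noisy samples of a smooth 1-D target (a stand-in for operator training data)
m = 200
x = np.linspace(0.0, 1.0, m)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(m)

# Random Fourier features with frequencies drawn from a Student's t distribution
N = 100
omega = 6.0 * rng.standard_t(df=3, size=N)
Phi = np.concatenate([np.cos(np.outer(x, omega)),
                      np.sin(np.outer(x, omega))], axis=1)   # m x 2N

# Frequency-weighted Tikhonov: coefficients of high-frequency features
# are penalized more, which suppresses fitting of high-frequency noise
weights = np.tile(1.0 + omega ** 2, 2)
lam = 1e-3
coef = np.linalg.solve(Phi.T @ Phi + lam * np.diag(weights), Phi.T @ y)

pred = Phi @ coef
rmse = np.sqrt(np.mean((pred - np.sin(2 * np.pi * x)) ** 2))  # error vs clean signal
```

The abstract's conditioning result (N on the order of m log m features keeps the random feature matrix well-conditioned) is what justifies solving this normal-equations system directly.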