2026-02-08 21:30:44
From synthetic turbulence to true solutions: A deep diffusion model for discovering periodic orbits in the Navier-Stokes equations
Jeremy P Parker, Tobias M Schneider
https://arxiv.org/abs/2602.23181 https://arxiv.org/pdf/2602.23181 https://arxiv.org/html/2602.23181
arXiv:2602.23181v1 Announce Type: new
Abstract: Generative artificial intelligence has shown remarkable success in synthesizing data that mimic complex real-world systems, but its potential role in the discovery of mathematically meaningful structures in physical models remains underexplored. In this work, we demonstrate how a generative diffusion model can be used to uncover previously unknown solutions of a nonlinear partial differential equation: the two-dimensional Navier-Stokes equations in a turbulent regime. Trained on data from a direct numerical simulation of turbulence, the model learns to generate time series that resemble physically plausible trajectories. By carefully modifying the temporal structure of the model and enforcing the symmetries of the governing equations, we produce synthetic trajectories that are periodic in time, despite the fact that the training data did not contain periodic trajectories. These synthetic trajectories are then refined into true solutions using an iterative solver, yielding 111 new periodic orbits (POs) with very short periods. Our results reveal a previously unobserved richness in the PO structure of this system and suggest a broader role for generative AI: not as a replacement for simulation and existing solvers, but as a complementary tool for navigating the complex solution spaces of nonlinear dynamical systems.
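The core pipeline step — refining an approximate synthetic guess into a true periodic orbit with an iterative solver — can be illustrated in miniature. This is a toy sketch only, not the authors' code: the paper's system is the 2D Navier-Stokes equations, while here the dynamics are replaced by the logistic map, and a rough initial guess (standing in for a diffusion-generated trajectory) is polished by Newton iteration on the period-return condition f^p(x) = x.

```python
# Toy analogue of "synthetic guess -> Newton refinement -> true periodic orbit".
# The logistic map stands in for the full PDE; all names here are illustrative.

R = 3.2  # parameter value where the logistic map has a stable period-2 orbit

def f(x):
    return R * x * (1.0 - x)

def fmap(x, period):
    """Apply the map `period` times (the period-return map f^p)."""
    for _ in range(period):
        x = f(x)
    return x

def newton_periodic(x0, period=2, tol=1e-12, max_iter=50):
    """Refine a rough guess x0 into a solution of f^period(x) = x
    using finite-difference Newton iteration on g(x) = f^period(x) - x."""
    x = x0
    for _ in range(max_iter):
        g = fmap(x, period) - x
        if abs(g) < tol:
            return x
        h = 1e-7
        gh = fmap(x + h, period) - (x + h)  # forward-difference slope of g
        x -= g * h / (gh - g)               # Newton update
    return x

# An imprecise "synthetic" guess converges to a genuine period-2 point:
x_star = newton_periodic(0.8)
```

In the paper the same role is played by a Newton-type solver acting on whole space-time trajectories; the point of the sketch is only that a generative model need not produce an exact solution, just a guess inside the solver's basin of convergence.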
toXiv_bot_toot
Source: Tencent tells staff that Yao Shunyu, an ex-OpenAI researcher who joined in September, is now its chief AI scientist, reporting to President Martin Lau (Juro Osawa/The Information)
https://www.theinformation.com/briefings/tencent-name…
Understanding the Role of Rehearsal Scale in Continual Learning under Varying Model Capacities
JinLi He, Liang Bai, Xian Yang
https://arxiv.org/abs/2602.20791 https://arxiv.org/pdf/2602.20791 https://arxiv.org/html/2602.20791
arXiv:2602.20791v1 Announce Type: new
Abstract: Rehearsal is one of the key techniques for mitigating catastrophic forgetting and has been widely adopted in continual learning algorithms due to its simplicity and practicality. However, the theoretical understanding of how rehearsal scale influences learning dynamics remains limited. To address this gap, we formulate rehearsal-based continual learning as a multidimensional effectiveness-driven iterative optimization problem, providing a unified characterization across diverse performance metrics. Within this framework, we derive a closed-form analysis of adaptability, memorability, and generalization from the perspective of rehearsal scale. Our results uncover several intriguing and counterintuitive findings. First, rehearsal can impair a model's adaptability, in sharp contrast to its traditionally recognized benefits. Second, increasing the rehearsal scale does not necessarily improve memory retention: when tasks are similar and noise levels are low, the memory error exhibits a diminishing lower bound. Finally, we validate these insights through numerical simulations and extended analyses on deep neural networks across multiple real-world datasets, revealing statistical patterns of rehearsal mechanisms in continual learning.
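For readers unfamiliar with the mechanism being analyzed: rehearsal means keeping a small buffer of past-task examples and mixing them into training on new tasks, with the buffer capacity being the "rehearsal scale" the abstract studies. A minimal sketch (illustrative only; the paper's contribution is theoretical, not this code) using reservoir sampling so the buffer stays an unbiased sample of everything seen:

```python
import random

class RehearsalBuffer:
    """Minimal reservoir-sampling rehearsal buffer for continual learning.
    `capacity` is the rehearsal scale; hypothetical helper, not the paper's code."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = []
        self.seen = 0  # total examples ever offered to the buffer

    def add(self, example):
        """Reservoir sampling: each seen example is retained with equal probability."""
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append(example)
        else:
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.data[j] = example

    def sample(self, k):
        """Draw up to k stored examples to mix into a new-task minibatch."""
        return random.sample(self.data, min(k, len(self.data)))
```

A training loop would interleave `buffer.sample(k)` with each new-task batch; the abstract's counterintuitive findings concern how performance responds as `capacity` grows.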
🧬 Largest RNA language model to date offers new way to predict behavior and boost drug discovery
#rna
Just finished "Match Point!" by Maddie Gallegos, an excellent graphic novel about racquetball, dumpster diving, best friends, and pressure from Dad. The characters and their fromance are super cute, and while I'm sure some might find the ending too happy, I'm usually fine with seeing the aspirational version of relationships because it can serve as a good role model, while other narratives can help explain how to handle worse outcomes.
#AmReading #ReadingNow
On the Generalization Behavior of Deep Residual Networks From a Dynamical System Perspective
Jinshu Huang, Mingfei Sun, Chunlin Wu
https://arxiv.org/abs/2602.20921 https://arxiv.org/pdf/2602.20921 https://arxiv.org/html/2602.20921
arXiv:2602.20921v1 Announce Type: new
Abstract: Deep neural networks (DNNs) have significantly advanced machine learning, with model depth playing a central role in their successes. The dynamical system modeling approach has recently emerged as a powerful framework, offering new mathematical insights into the structure and learning behavior of DNNs. In this work, we establish generalization error bounds for both discrete- and continuous-time residual networks (ResNets) by combining Rademacher complexity, flow maps of dynamical systems, and the convergence behavior of ResNets in the deep-layer limit. The resulting bounds are of order $O(1/\sqrt{S})$ with respect to the number of training samples $S$, and include a structure-dependent negative term, yielding depth-uniform and asymptotic generalization bounds under milder assumptions. These findings provide a unified understanding of generalization across both discrete- and continuous-time ResNets, helping to close the gap in both the order of sample complexity and assumptions between the discrete- and continuous-time settings.
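For context on the $O(1/\sqrt{S})$ rate: bounds built from Rademacher complexity typically follow the standard template below (shown in generic form; the paper's depth-uniform, structure-dependent refinement with its negative term is not reproduced here). With probability at least $1-\delta$ over a sample $z_1,\dots,z_S$,
\[
\mathbb{E}\bigl[\ell(f)\bigr] \;\le\; \frac{1}{S}\sum_{i=1}^{S}\ell(f; z_i) \;+\; 2\,\mathfrak{R}_S(\ell\circ\mathcal{F}) \;+\; 3\sqrt{\frac{\log(2/\delta)}{2S}}
\quad \text{for all } f\in\mathcal{F},
\]
where $\mathfrak{R}_S$ is the empirical Rademacher complexity of the loss class; the final term already contributes the $O(1/\sqrt{S})$ dependence on sample size.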
Crosslisted article(s) found for physics.flu-dyn. https://arxiv.org/list/physics.flu-dyn/new
[1/1]:
- Physics Constrained Neural Collision Operators for Variable Hard Sphere Surrogates and Ab Initio ...
Ehsan Roohi, Ahmad Shoja-Sani, Stefan Stefanov
https://arxiv.org/abs/2602.21244 https://mastoxiv.page/@arXiv_physicscompph_bot/116135902638427987
- Chapman-Enskog expansion for chirally colliding disks
Ruben Lier, Paweł Matus
https://arxiv.org/abs/2602.21367 https://mastoxiv.page/@arXiv_condmatsoft_bot/116135969484479808
- Passive freeze-out of the Richtmyer-Meshkov instability
J. Strucka, et al.
https://arxiv.org/abs/2602.21375 https://mastoxiv.page/@arXiv_physicsplasmph_bot/116135979972070806
- A CFD-Based Investigation of Local Luminal Curvature as a Primary Determinant of Hemodynamic Envi...
Marcella P. A. Dallavanzi, José L. Gasche, Iago L. Oliveira
https://arxiv.org/abs/2602.21409 https://mastoxiv.page/@arXiv_physicsmedph_bot/116135886905819251
- Unstable magnetic reconnection self-generates turbulence
Nick Williams, Alessandro De Rosis, Alex Skillen
https://arxiv.org/abs/2602.21422 https://mastoxiv.page/@arXiv_physicsplasmph_bot/116135995018666818
- Out-of-time-ordered correlators for turbulent fields: a quantum-classical correspondence
Motoki Nakata
https://arxiv.org/abs/2602.21710 https://mastoxiv.page/@arXiv_physicsplasmph_bot/116136015335973276
- Particle, kinetic and hydrodynamic models for sea ice floes. Part II: Rotating floes with nonline...
Quanling Deng, Seung-Yeal Ha, Jaemoon Lee
https://arxiv.org/abs/2602.21972 https://mastoxiv.page/@arXiv_mathph_bot/116136043517281861
- A consistent phase-averaged model of the interactions between surface gravity waves and currents
Jacques Vanneste, William R. Young
https://arxiv.org/abs/2602.21976 https://mastoxiv.page/@arXiv_physicsaoph_bot/116135993734282277
- Hydrodynamics of Dense Active Fluids: Turbulence-Like States and the Role of Advected Activity
Sandip Sahoo, Siddhartha Mukherjee, Samriddhi Sankar Ray
https://arxiv.org/abs/2602.22044 https://mastoxiv.page/@arXiv_condmatsoft_bot/116136069077151185
- Surrogate models for Rock-Fluid Interaction: A Grid-Size-Invariant Approach
Pinheiro, Guo, Menke, Joshi, Heaney, ElSheikh, Pain
https://arxiv.org/abs/2602.22188 https://mastoxiv.page/@arXiv_csLG_bot/116136497040052377