Tootfinder

Opt-in global Mastodon full text search. Join the index!

No exact results found. Showing similar results.
@arXiv_csLG_bot@mastoxiv.page
2025-07-09 10:15:32

Efficient Training of Large-Scale AI Models Through Federated Mixture-of-Experts: A System-Level Approach
Xiaobing Chen, Boyang Zhang, Xiangwei Zhou, Mingxuan Sun, Shuai Zhang, Songyang Zhang, Geoffrey Ye Li
arxiv.org/abs/2507.05685

@Techmeme@techhub.social
2025-07-08 10:15:58

The American Federation of Teachers, the US' second-largest teachers' union, plans to start an AI training hub with $23M from Microsoft, OpenAI, and Anthropic (Natasha Singer/New York Times)
nytimes.com/2025/07/08/technol

@arXiv_csRO_bot@mastoxiv.page
2025-07-09 09:45:02

Communication-Efficient Module-Wise Federated Learning for Grasp Pose Detection in Cluttered Environments
Woonsang Kang, Joohyung Lee, Seungjun Kim, Jungchan Cho, Yoonseon Oh
arxiv.org/abs/2507.05861

@arXiv_csCR_bot@mastoxiv.page
2025-07-09 08:43:22

PROTEAN: Federated Intrusion Detection in Non-IID Environments through Prototype-Based Knowledge Sharing
Sara Chennoufi, Yufei Han, Gregory Blanc, Emiliano De Cristofaro, Christophe Kiennert
arxiv.org/abs/2507.05524

@joshmoore@fediscience.org
2025-05-08 11:04:26

Publishing's class problem
The 2024 data shows a similar picture of middle-class origin dominance in publishing. Although there are year-on-year variations, since 2019 the proportion of people from middle-class origins in publishing occupations has consistently been above 60%. In the same period, the proportion from working-class origins has never been above 20%.
#FederatedMemory

@Techmeme@techhub.social
2025-06-08 00:20:54

Some WordPress veterans and the Linux Foundation start FAIR, a federated update network to decentralize WordPress infrastructure and boost supply chain security (Chris Stokel-Walker/Fast Company)
fastcompany.com/91347003/wordp

@arXiv_csLG_bot@mastoxiv.page
2025-07-09 10:21:52

Prototype-Guided and Lightweight Adapters for Inherent Interpretation and Generalisation in Federated Learning
Samuel Ofosu Mensah, Kerol Djoumessi, Philipp Berens
arxiv.org/abs/2507.05852

@arXiv_csCR_bot@mastoxiv.page
2025-07-08 13:06:01

BackFed: An Efficient & Standardized Benchmark Suite for Backdoor Attacks in Federated Learning
Thinh Dao, Dung Thuy Nguyen, Khoa D Doan, Kok-Seng Wong
arxiv.org/abs/2507.04903

@arXiv_csLG_bot@mastoxiv.page
2025-06-09 10:06:32

Mitigating Catastrophic Forgetting with Adaptive Transformer Block Expansion in Federated Fine-Tuning
Yujia Huo, Jianchun Liu, Hongli Xu, Zhenguo Ma, Shilong Wang, Liusheng Huang
arxiv.org/abs/2506.05977

@arXiv_csCR_bot@mastoxiv.page
2025-06-09 07:49:22

FedShield-LLM: A Secure and Scalable Federated Fine-Tuned Large Language Model
Md Jueal Mia, M. Hadi Amini
arxiv.org/abs/2506.05640