The last leg of our brief history of NLP (so far) is the advent of large language models, with GPT-3 in 2020, and the introduction of learning from the prompt (a.k.a. few-shot learning).
T. B. Brown et al. (2020). Language models are few-shot learners. NeurIPS 2020.
https://…
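Few-shot learning here means the model is conditioned on a handful of input/output demonstrations placed directly in the prompt, with no gradient updates. A minimal sketch of how such a prompt is assembled (the task, examples, and helper function below are illustrative, not from the paper):

```python
# Illustrative sketch of few-shot ("in-context") learning in the style of
# Brown et al. (2020): a few demonstrations are placed in the prompt and the
# model is asked to complete the pattern for a new input. No fine-tuning or
# gradient updates are involved -- the "learning" happens at inference time.

def build_few_shot_prompt(task, examples, query):
    """Assemble a prompt: task description, k demonstrations, then the query."""
    lines = [task]
    for inp, out in examples:
        lines.append(f"Input: {inp}\nOutput: {out}")
    # The final line is left open for the model to complete.
    lines.append(f"Input: {query}\nOutput:")
    return "\n\n".join(lines)

# Hypothetical translation task used only to illustrate the prompt format.
prompt = build_few_shot_prompt(
    task="Translate English to French.",
    examples=[("cheese", "fromage"), ("dog", "chien")],
    query="cat",
)
print(prompt)
```

The resulting string would then be sent to the language model as-is; zero-shot prompting is the same idea with an empty `examples` list.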