Some developers are releasing versions of Llama 3 with longer context windows, thanks to Meta's open-source approach; Llama 3 ships with a relatively short context window of 8,000 tokens (Kalley Huang/The Information)
https://www.theinformation.com/articles/how-developers-gav…
Context-Aware Clustering using Large Language Models
Sindhu Tipirneni, Ravinarayana Adkathimar, Nurendra Choudhary, Gaurush Hiranandani, Rana Ali Amjad, Vassilis N. Ioannidis, Changhe Yuan, Chandan K. Reddy
https://arxiv.org/abs/2405.00988
Artificial intelligence for context-aware visual change detection in software test automation
Milad Moradi, Ke Yan, David Colwell, Rhona Asgari
https://arxiv.org/abs/2405.00874
ZeroCAP: Zero-Shot Multi-Robot Context Aware Pattern Formation via Large Language Models
Vishnunandan L. N. Venkatesh, Byung-Cheol Min
https://arxiv.org/abs/2404.02318
"In-Context Learning" or: How I learned to stop worrying and love "Applied Information Retrieval"
Andrew Parry, Debasis Ganguly, Manish Chandra
https://arxiv.org/abs/2405.01116
I was today years old when I learned that context receivers have been tossed in favor of context parameters: https://github.com/Kotlin/KEEP/blob/context-parameters/proposals/context-parameters.md
Apple researchers detail an AI system that can resolve ambiguous references to on-screen entities, in some cases better than GPT-4 can, and can run "on-device" (Michael Nuñez/VentureBeat)
https://venturebeat.com/ai/apple-resea
This arXiv submission (https://arxiv.org/abs/2403.20147) has been replaced with a revised version.
initial toot: https://mastoxiv.page/@arXiv_csCL_…