Following the election of Robert Prevost as the new Pope today, YDS asked professor Teresa Berger for perspective. Teresa Berger is Professor of Liturgical Studies and Thomas E. Golden Jr. Professor of Catholic Theology.
Here is our brief exchange:
https://divinity.yale.edu/news/new-pop
Beginning in the 1990s, statistical n-gram language models, trained on vast text collections, became the backbone of NLP research. They fueled advances in nearly all NLP techniques of the era, laying the groundwork for today's AI.
F. Jelinek (1997), Statistical Methods for Speech Recognition, MIT Press, Cambridge, MA
#NLP
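To make the n-gram idea concrete, here is a minimal toy sketch (not from the post or from Jelinek's book): a bigram model that estimates P(word | previous word) from counts, with add-alpha smoothing. The corpus and smoothing constant are illustrative assumptions.

```python
# Toy bigram language model: the core of 1990s statistical n-gram LMs
# is estimating P(w_i | w_{i-1}) from corpus counts.
from collections import Counter

def train_bigram(sentences):
    """Count unigrams and bigrams over sentences padded with markers."""
    unigrams, bigrams = Counter(), Counter()
    for tokens in sentences:
        padded = ["<s>"] + tokens + ["</s>"]
        unigrams.update(padded[:-1])           # history counts
        bigrams.update(zip(padded[:-1], padded[1:]))
    return unigrams, bigrams

def bigram_prob(unigrams, bigrams, prev, word, vocab_size, alpha=1.0):
    """Add-alpha (Laplace) smoothed conditional probability."""
    return (bigrams[(prev, word)] + alpha) / (unigrams[prev] + alpha * vocab_size)

# Tiny illustrative corpus
corpus = [["the", "cat", "sat"], ["the", "cat", "ran"]]
uni, bi = train_bigram(corpus)
vocab = {w for s in corpus for w in s} | {"</s>"}
p = bigram_prob(uni, bi, "the", "cat", len(vocab))  # (2+1)/(2+5)
```

Real systems of the era used far larger n, sophisticated smoothing (Katz back-off, Kneser-Ney), and corpora of millions of words, but the counting principle is the same.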
Abuse in Buddhism: The Law of Silence https://openbuddhism.org/library/videos/2022/elodie-emery-wandrille-lanos-present-buddhism-the-unspeakable-truth/
Mattel combines its film and television production units into a single unit under the direction of Mattel Films president and Barbie producer Robbie Brenner (Jeremy Fuster/The Wrap)
https://www.thewrap.com/mattel-combines-film-tv-units-under-robbie-bren…
Next stop on our NLP timeline (as part of the #ISE2025 lecture) was Terry Winograd's SHRDLU, an early natural language understanding system developed in 1968-70 that could manipulate blocks in a virtual world.
Winograd, T. (1971). Procedures as a Representation for Data in a Computer Program for Understanding Natural Language. MIT AI Technical Report 235.
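A vastly simplified, hedged illustration of the blocks-world idea (SHRDLU itself was a far richer Micro-Planner/Lisp system; this toy state representation, pattern, and checks are invented for illustration):

```python
# Toy blocks-world interpreter inspired by SHRDLU's domain: parse a
# command, check preconditions against the world state, update it.
import re

# World state: block -> what it sits on ("table" or another block)
world = {"red": "table", "green": "table", "blue": "red"}

def is_clear(block, world):
    """A block is clear if nothing sits on top of it."""
    return block not in world.values()

def execute(command, world):
    """Handle commands like 'put the blue block on the green block'."""
    m = re.match(r"put the (\w+) block on the (\w+) block", command)
    if not m:
        return "I don't understand."
    src, dst = m.groups()
    if not is_clear(src, world):
        return f"The {src} block has something on it."
    if not is_clear(dst, world):
        return f"The {dst} block is not clear."
    world[src] = dst
    return "OK."

result = execute("put the blue block on the green block", world)
```

SHRDLU's actual strength was dialogue: it resolved pronouns, answered "why" questions about its own plans, and decomposed goals into subgoals, none of which this sketch attempts.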
The First Compute Arms Race: the Early History of Numerical Weather Prediction
Charles Yang
https://arxiv.org/abs/2506.21816 https://…
The last leg of our brief history of NLP (so far) is the advent of large language models with GPT-3 in 2020 and the introduction of learning from the prompt (aka few-shot learning).
T. B. Brown et al. (2020). Language models are few-shot learners. NIPS'20
https://…
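"Learning from the prompt" means packing a few labeled examples into the input text and letting the model infer the task, with no gradient updates. A minimal sketch of such a prompt in the GPT-3 style (the translation task and examples are invented for illustration; no model API is called here):

```python
# Build a few-shot prompt: k input/output demonstrations followed by a
# query; the model is expected to continue the pattern in-context.
def few_shot_prompt(examples, query, instruction="Translate English to French:"):
    lines = [instruction, ""]
    for en, fr in examples:
        lines.append(f"English: {en}")
        lines.append(f"French: {fr}")
        lines.append("")
    lines.append(f"English: {query}")
    lines.append("French:")          # the model completes from here
    return "\n".join(lines)

prompt = few_shot_prompt(
    [("sea otter", "loutre de mer"), ("cheese", "fromage")],
    "peppermint",
)
```

Brown et al. contrast this in-context regime (zero-, one-, and few-shot) with fine-tuning, showing task performance scales with model size without any weight updates.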