Tootfinder

Opt-in global Mastodon full text search. Join the index!

@lysander07@sigmoid.social
2025-05-08 08:03:00

Next stop on our NLP timeline (as part of the #ISE2025 lecture) was Terry Winograd's SHRDLU, an early natural language understanding system developed in 1968-70 that could manipulate blocks in a virtual world.
Winograd, T. Procedures as a Representation for Data in a Computer Program for Understanding Natural Language. MIT AI Technical Report 235.

Slide from the Information Service Engineering 2025 lecture, Natural Language Processing 01, A Brief History of NLP, NLP Timeline. The picture depicts a timeline in the middle from top to bottom. There is a marker placed at 1970. Left of the timeline, a screenshot of the SHRDLU system is shown displaying a block world in simple line graphics. On the right side, the following text is displayed: SHRDLU was an early natural language understanding system developed by Terry Winograd in 1968-70 that …
@lysander07@sigmoid.social
2025-05-28 05:10:40

Last week, we continued our #ISE2025 lecture on distributional semantics with the introduction of neural language models (NLMs) and compared them to traditional statistical n-gram models.
Benefits of NLMs:
- Capturing Long-Range Dependencies
- Computational and Statistical Tractability
- Improved Generalisation
- Higher Accuracy
@…

The image illustrates the architecture of a Neural Language Model, specifically focusing on Word Vectors II - Neural Language Models. It is part of a presentation on Natural Language Processing, created by the Karlsruhe Institute of Technology (KIT) and FIZ Karlsruhe, as indicated by their logos in the top right corner.

The diagram shows a neural network processing an input word embedding, represented by the phrase "to be or not to." The input is transformed into a d-sized vector representatio…
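A minimal sketch of such a feed-forward neural language model, in the spirit of the architecture shown on the slide (all names and layer sizes here are illustrative assumptions, not taken from the lecture):

import torch
import torch.nn as nn

class FeedForwardNLM(nn.Module):
    """Embed a fixed window of previous words, concatenate the
    d-sized word vectors, and score every word in the vocabulary."""
    def __init__(self, vocab_size, d_embed=64, context=5, d_hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_embed)        # word vectors
        self.hidden = nn.Linear(context * d_embed, d_hidden)  # combine the context
        self.out = nn.Linear(d_hidden, vocab_size)            # logits over vocabulary

    def forward(self, context_ids):              # context_ids: (batch, context)
        e = self.embed(context_ids).flatten(1)   # (batch, context * d_embed)
        h = torch.tanh(self.hidden(e))
        return self.out(h)                       # logits for the next word

# Toy usage: predict the word following "to be or not to".
vocab = {"to": 0, "be": 1, "or": 2, "not": 3}
model = FeedForwardNLM(vocab_size=len(vocab))
logits = model(torch.tensor([[0, 1, 2, 3, 0]]))  # ids of "to be or not to"
probs = logits.softmax(dim=-1)                   # distribution over the next word

Trained with a cross-entropy loss on the next word, the embedding layer ends up holding the word vectors the lecture discusses.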
@lysander07@sigmoid.social
2025-05-21 16:04:40

In the #ISE2025 lecture today we introduced our students to the concept of distributional semantics as the foundation of modern large language models. Historically, Wittgenstein was one of the important figures in the Philosophy of Language, stating that "The meaning of a word is its use in the language."

An AI-generated image of Ludwig Wittgenstein as a comic strip character. A speech bubble shows his famous quote "The meaning of a word is its use in the language."
Bibliographical Reference: Wittgenstein, Ludwig. Philosophical Investigations, Blackwell Publishing (1953).
Ludwig Wittgenstein (1889–1951)
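The distributional idea can be made concrete with a toy example: words that occur in similar contexts end up with similar count vectors. A minimal sketch (corpus, context window, and similarity measure are made-up illustrations, not from the lecture):

from collections import Counter
from itertools import combinations
import math

# Toy corpus; purely illustrative.
sentences = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat chased the dog",
]

# Count how often two words co-occur within the same sentence.
cooc = Counter()
vocab = set()
for s in sentences:
    words = s.split()
    vocab.update(words)
    for a, b in combinations(words, 2):
        cooc[(a, b)] += 1
        cooc[(b, a)] += 1

def vector(word):
    """Distributional vector of a word: its co-occurrence counts."""
    return [cooc[(word, w)] for w in sorted(vocab)]

def cosine(u, v):
    dot = sum(x * y for x, y in zip(u, v))
    norm = math.sqrt(sum(x * x for x in u)) * math.sqrt(sum(x * x for x in v))
    return dot / norm if norm else 0.0

# "cat" and "dog" appear in similar contexts, so their vectors are similar.
print(cosine(vector("cat"), vector("dog")))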
@lysander07@sigmoid.social
2025-05-17 07:38:59

In our #ISE2025 lecture last Wednesday, we learned how n-gram language models, using the Markov assumption and maximum likelihood estimation, let us predict the probability of the occurrence of a word given a specific context (i.e. the n previous words in the sequence).
#NLP

Slide from the Information Service Engineering 2025 lecture, 03 Natural Language Processing 02, 2.9, Language Models:
Title: N-Gram Language Model
The probability of a sequence of words can be computed via conditional probability and the Bayes rule (including the chain rule for n words). Approximation is performed via the Markov assumption (dependency only on the n last words) and maximum likelihood estimation (approximating the probabilities of a sequence of words by counting and normalising …
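A minimal sketch of this counting-and-normalising estimation for a bigram model (the toy corpus is an assumption for illustration):

from collections import Counter

# Toy corpus; in the lecture setting this would be a large text collection.
tokens = "to be or not to be that is the question".split()

# Markov assumption (bigram): P(w_i | w_1..w_{i-1}) ≈ P(w_i | w_{i-1})
unigrams = Counter(tokens)
bigrams = Counter(zip(tokens, tokens[1:]))

def p(word, prev):
    """Maximum likelihood estimate: count(prev, word) / count(prev)."""
    return bigrams[(prev, word)] / unigrams[prev]

# P("be" | "to") = count("to be") / count("to") = 2 / 2 = 1.0
print(p("be", "to"))

# Chain rule combined with the Markov assumption for a whole sequence:
def sequence_prob(words):
    prob = unigrams[words[0]] / len(tokens)  # P(w_1)
    for prev, word in zip(words, words[1:]):
        prob *= p(word, prev)
    return prob

print(sequence_prob("to be or not".split()))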
@lysander07@sigmoid.social
2025-05-19 14:04:32

Generating Shakespeare-like text with an n-gram language model is straightforward and quite simple. But don't expect too much of it. It will not be able to recreate a lost Shakespeare play for you ;-) It's merely a parrot, making up well-sounding sentences out of fragments of original Shakespeare texts...
#ise2025

Slide from the Information Service Engineering lecture 04, Natural Language Processing 03, 2.9 Language Models, N-Gram Shakespeare Generation.
The background of the slide shows an AI-generated portrait of William Shakespeare as an ink drawing. There are 4 speech bubbles around Shakespeare's head, representing artificially generated text based on 1-grams, 2-grams, 3-grams and 4-grams:
1-gram: To him swallowed confess hear both. Which. Of save on trail for are ay device and rote life have Hill…
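A minimal sketch of this kind of "parroting" generation with a bigram model (the tiny corpus here stands in for the Shakespeare text used in the lecture):

import random
from collections import defaultdict

# Stand-in training text; the slide uses the Shakespeare corpus instead.
text = ("to be or not to be that is the question "
        "whether tis nobler in the mind to suffer").split()

# Map each word to the words observed to follow it (bigram model).
successors = defaultdict(list)
for prev, word in zip(text, text[1:]):
    successors[prev].append(word)

def generate(start, length=10, seed=0):
    """Sample a sequence by repeatedly drawing a random observed successor."""
    random.seed(seed)
    out = [start]
    for _ in range(length - 1):
        nxt = successors.get(out[-1])
        if not nxt:  # dead end: no observed successor
            break
        out.append(random.choice(nxt))
    return " ".join(out)

print(generate("to"))  # a parrot-like recombination of corpus fragments

Larger n makes the output sound more Shakespearean, but only because longer and longer fragments are copied verbatim from the training text.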
@lysander07@sigmoid.social
2025-05-15 08:11:37

This week, we were discussing the central question "Can we 'predict' a word?" as the basis for statistical language models in our #ISE2025 lecture. Of course, I was using Shakespeare quotes to motivate the (international) students to complete the quotes with the "predicted" missing words ;-)
"All the world's a stage, and all the men and women merely...."

Slide from the Information Service Engineering 2025 lecture, Natural Language Processing 03, 2.10 Language Models. The slide shows a graphical portrait of William Shakespeare (created by Midjourney AI) as an ink sketch with yellow accents. The text states "Can we 'predict' a word?"