Tootfinder

Opt-in global Mastodon full text search. Join the index!

@mela@zusammenkunft.net
2024-10-24 00:51:54

When are you actually planning to prove that it has nothing to do with racism when Black women are far more likely not to survive a pregnancy than white women from a comparable socioeconomic background?
Or that the findings of the Milan Congress were correct and that deaf people are wrong that oral-language schooling is discriminatory? Just so I can put it in my calendar.

@pdmckone@mstdn.ca
2024-10-25 12:33:51

The audience is outraged. "You'll never get away with this!"
As the limo begins its leisurely getaway, the Ginger Bear rolls down his window, turns directly to the audience and says, "I'll cut you a cheque. Two hunnerd enough?"
They clap.

@mszll@datasci.social
2024-11-15 21:42:56

Is "decentralizedwashing" a term yet? See recent #bluesky discussions like: social.wildeboer.net/@jwildebo

@blakes7bot@mas.torpidity.net
2024-11-10 20:27:36

Series D, Episode 04 - Stardrive
TARRANT: What use will that be? The main drive chamber can't be pressurized. How do you carry out a delicate repair operation wearing a spacesuit and gauntlets?
VILA: I don't.
blake.torpidity.net/m/404/48 B7B5

ChatGPT4 describes the image as: "The image features three men in a futuristic setting, which appears to be a science fiction environment, possibly from a television show. The setting has a dimly lit interior with metallic and sleek design elements, suggesting a spaceship or high-tech facility. The men are dressed in distinct, futuristic costumes, likely indicating their roles as characters in a sci-fi series. The central figure wears a black costume with metallic and shiny details, while the o…
@WaardRichard@social.edu.nl
2024-11-15 12:49:39

The way we think about large language models (LLMs) like ChatGPT is more important than we often realise. What if we just think about them as sophisticated databases?
#ai #llms #genai

@theawely@mamot.fr
2024-12-13 18:48:37

Excited about the new xLSTM model release. There are many well-thought-out design choices compared to transformers: recurrence (which should allow composability), gating (like Mamba and the LSTM it is based on, which allows time complexity independent of the input size), and state tracking (unlike Mamba and transformers). For now, these advantages aren't apparent on benchmarks, but most training techniques are kept secret, and the recent advances in LLMs have shown that they matter a lot.
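The recurrence-plus-gating point above can be sketched in a few lines. This is a toy illustration of why a gated recurrent update has per-step cost independent of sequence length (unlike transformer self-attention, which attends over the whole history); it is not the actual xLSTM equations, and all names here are made up for the example.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def gated_recurrent_step(h, x, w_f=0.5, w_i=0.5):
    """One recurrent step: O(1) work regardless of how many
    steps came before, since all history lives in the state h."""
    f = sigmoid(w_f * x)   # forget gate: how much old state to keep
    i = sigmoid(w_i * x)   # input gate: how much new input to admit
    return f * h + i * x   # gated state update

def run(xs):
    h = 0.0
    for x in xs:           # O(T) total over the whole sequence
        h = gated_recurrent_step(h, x)
    return h
```

By contrast, a transformer layer's per-token cost grows with the number of tokens already seen, so total cost is quadratic in sequence length.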