Tootfinder

Opt-in global Mastodon full text search. Join the index!

@mszll@datasci.social
2024-11-15 08:53:49

Yes please! Supporting nonlinear careers to diversify science
journals.plos.org/plosbiology/

Academic trajectory comparisons to a model linear career
@blakes7bot@mas.torpidity.net
2024-11-08 14:07:56

Series D, Episode 12 - Warlord
ORAC: Why not ask them?
TARRANT: [Into comm] This is Xenon base, identify yourselves. [static] I repeat, identify yourselves. [more static] You have ten seconds or I'll open fire.
blake.torpidity.net/m/412/23 B7B3

ChatGPT-4 describes the image as: "The image depicts a scene from a science fiction setting, likely from a television show. The characters are dressed in futuristic costumes, suggesting a space or advanced technological environment. The room in which they are gathered features consoles and electronic panels, reinforcing the sci-fi context.

Four individuals are gathered around a transparent tabletop model or device with circuitry visible inside. They appear to be engaged in a serious discussion …
@theawely@mamot.fr
2024-12-13 18:48:37

Excited about the new xLSTM model release. There are many well-thought-out designs compared to transformers: recurrence (which should allow composability), gating (like Mamba and the LSTM it is based on, which allows time complexity independent of the input size), and state tracking (unlike Mamba and transformers). For now, these advantages aren't apparent on benchmarks, but most training techniques are secret, and the recent advances in LLMs have shown that they matter a lot.
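The recurrence and gating the post mentions can be illustrated with a toy scalar LSTM-style cell (a minimal sketch, not xLSTM itself; all weights and the update rule here are illustrative): the state carried between steps has fixed size, so per-step cost is constant and processing a sequence is linear in its length, independent of how long the context grows.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h, c, w):
    """One gated recurrent update; (h, c) is the fixed-size state."""
    i = sigmoid(w["wi"] * x + w["ui"] * h + w["bi"])    # input gate
    f = sigmoid(w["wf"] * x + w["uf"] * h + w["bf"])    # forget gate
    o = sigmoid(w["wo"] * x + w["uo"] * h + w["bo"])    # output gate
    g = math.tanh(w["wg"] * x + w["ug"] * h + w["bg"])  # candidate value
    c = f * c + i * g           # gated cell-state update
    h = o * math.tanh(c)        # new hidden state, bounded in (-1, 1)
    return h, c

# Process a sequence: the state never grows with sequence length,
# unlike a transformer's attention over the full context.
w = {k: 0.5 for k in ["wi", "ui", "bi", "wf", "uf", "bf",
                      "wo", "uo", "bo", "wg", "ug", "bg"]}
h, c = 0.0, 0.0
for x in [1.0, -0.5, 0.25, 2.0]:
    h, c = lstm_step(x, h, c, w)
```

The design point being sketched: because each step only reads the previous (h, c), memory and compute per token are O(1), which is the "time complexity independent of the input size" property the post attributes to gated recurrent models.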