Tootfinder

Opt-in global Mastodon full text search. Join the index!

@imprs_solar@academiccloud.social
2025-01-15 17:29:02

#IMPRS alumna Meike Fischer's paper featuring prominently in #UniGöttingen and #MPSgoettingen's press release about
The Moon: a chunk ejected from Earth?

@dkl@23.social
2025-01-07 16:25:55

Ok. In its latest release, Bitwarden apparently just had to "improve" the UI of the Firefox plugin for the worse.

@theawely@mamot.fr
2024-12-13 18:48:37

Excited about the new xLSTM model release. There are many well-thought-out design choices compared to transformers: recurrence (which should allow composability), gating (like Mamba and the LSTM it is based on, which allows time complexity independent of the input size), and state tracking (unlike Mamba and transformers). For now, these advantages aren't apparent on benchmarks, but most training techniques are secret, and the recent advances in LLMs have shown that they matter a lot.
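To illustrate the point about gating and recurrence, here is a minimal, hypothetical sketch (not the actual xLSTM implementation) of a single gated recurrent step: each token updates a fixed-size state at constant cost, so a sequence of length T costs O(T), unlike transformer self-attention whose per-token cost grows with context length. All names and parameter shapes below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # hidden size (illustrative)

# Illustrative random parameters for the input gate, forget gate,
# and candidate state (a real model would learn these).
W_i, W_f, W_c = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_step(h, x):
    """One recurrent update: gates decide what to forget and what to write."""
    i = sigmoid(W_i @ x)   # input gate: how much new information to admit
    f = sigmoid(W_f @ x)   # forget gate: how much old state to keep
    c = np.tanh(W_c @ x)   # candidate state
    return f * h + i * c   # O(d^2) per token, independent of sequence length

h = np.zeros(d)
for x in rng.standard_normal((10, d)):  # stream of 10 tokens
    h = gated_step(h, x)                # state size never grows
```

The key property is that `h` stays the same size no matter how many tokens are processed, which is what makes the per-step cost independent of the input length.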