Excited about the new xLSTM model release. It has many well-thought-out design choices compared to transformers: recurrence (which should allow composability), gating (like Mamba and the LSTM it builds on, which allows per-token time complexity independent of input size), and state tracking (unlike Mamba and transformers). For now, these advantages aren't apparent on benchmarks, but most training techniques are secret, and recent advances in LLMs have shown that they matter a lot.
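The gating point above can be sketched in a few lines. This is a minimal illustration (not xLSTM's actual update rule; the gate weights `w_f`, `w_i` and scalar state are simplified assumptions) showing why a gated recurrence costs O(1) per token: each new input folds into a fixed-size state, regardless of how long the sequence already is.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gated_step(state, x, w_f=0.5, w_i=0.5):
    """One recurrent update: forget part of the old state, add gated input.

    Hypothetical scalar toy, not the real xLSTM cell: a forget gate scales
    the previous state, an input gate scales the new token's contribution.
    """
    f = sigmoid(w_f * x)  # forget gate in (0, 1)
    i = sigmoid(w_i * x)  # input gate in (0, 1)
    return f * state + i * x

def run(sequence):
    state = 0.0
    for x in sequence:  # constant work per token: one fixed-size state update
        state = gated_step(state, x)
    return state
```

Contrast with attention, where generating token n requires attending over all n previous tokens, so per-token cost grows with context length.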
There are 3 types of people in this world:
1️⃣ Baffled folks with closed mindsets, completely clueless & scared of AI
2️⃣ Loud, untrained self-proclaimed experts who complain about AI more than they learn about it
3️⃣ Educated #AI users quietly & rapidly getting things done w/ a wry smile
Series C, Episode 05 - The Harvest of Kairos
TARRANT: [To Dayna] Stay with him.
[Somewhere else on the Liberator]
https://blake.torpidity.net/m/305/276 B7B3
Series B, Episode 05 - Pressure Point
AVON: [Completes his dash with Gan catching him. Places his hand on Blake's arm.] Thank you.
BLAKE: [To Vila] How long?
https://blake.torpidity.net/m/205/449 B7B5
Got bullied* into doing some yoga and chilling out instead.
All good though. Now I am beered up and ready to write!
* after complaining about my shoulders for 20 straight minutes
I managed to compile and run a Capacitor application on an iOS emulator using the #Scaleway Apple Silicon M1 offering.
I tried to automate the provisioning of the #Capacitor requirements as much as possible.
OK, that was a lot of soldering #diysynth