Tootfinder

Opt-in global Mastodon full text search. Join the index!

@davidaugust@mastodon.online
2024-12-25 18:57:53

Like Malaysia Airlines MH17 and Korean Air Lines 007, it seems Azerbaijan Airlines J2-8243 may have been shot down. #russia just can’t stop hurting people.
“FlightRadar24…said…aircraft…faced ‘strong GPS jamming,’ which ‘made…aircraft transmit bad ADS-B data,’ referring to…information…allows flight-tracking websites to follow planes…Russia…blamed in…past for jamming GPS transmissions…”

@blakes7bot@mas.torpidity.net
2024-11-08 10:47:54

Series A, Episode 03 - Cygnus Alpha
ZEN: Course and speed confirmed.
AVON: With our speed we'll probably outrun them. This time. But they'll keep coming. Pushing us, tracking us. They'll never give up.
blake.torpidity.net/m/103/588 B7B2

ChatGPT4 describes the image as: "This image is a scene from the British sci-fi television series "Blake's 7." The setting appears to be the interior of a spaceship, with futuristic design elements such as hexagonal patterns and advanced control panels in the background. The two individuals are wearing space-themed costumes, typical of a science fiction genre from that era. The context suggests they are having a serious conversation, possibly discussing a mission or plan aboard the ship."

@theawely@mamot.fr
2024-12-13 18:48:37

Excited about the new xLSTM model release. It has many well-thought-out design choices compared to transformers: recurrence (which should allow composability), gating (like Mamba and the LSTM it builds on, which allows per-step time complexity independent of input size), and state tracking (unlike Mamba and transformers). For now, these advantages aren’t apparent on benchmarks, but most training techniques are kept secret, and the recent advances of LLMs have shown that they matter a lot.
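The gating-and-recurrence point above — constant per-token cost because the model carries a fixed-size state rather than attending over the whole history — can be sketched with a minimal scalar LSTM-style cell. This is an illustrative toy with made-up scalar weights, not the actual xLSTM update rule:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gated_step(state, x, w):
    """One gated recurrent step.

    The state (h, c) has a fixed size, so the cost of consuming each
    new token is constant regardless of how long the sequence is —
    unlike transformer attention, which grows with context length.
    """
    h, c = state
    wi, wf, wo, wg = w            # hypothetical toy weights
    i = sigmoid(wi * x + h)       # input gate
    f = sigmoid(wf * x + h)       # forget gate
    o = sigmoid(wo * x + h)       # output gate
    g = math.tanh(wg * x + h)     # candidate cell update
    c = f * c + i * g             # gated cell state
    h = o * math.tanh(c)          # hidden state
    return (h, c)

# A sequence of any length is processed with the same fixed-size state.
state = (0.0, 0.0)
for x in [0.5, -1.0, 2.0]:
    state = gated_step(state, x, (0.1, 0.2, 0.3, 0.4))
```

The forget/input gates are what let the cell decide per token what to keep and what to overwrite, which is the property the post credits to both the LSTM lineage and Mamba.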