One tank of sunshine, please! ⛽☀️
With the Vision Iconic, Mercedes-Benz ventures a bold look into the future of its luxury class – and says goodbye to its previous dual strategy of combustion engines and electric platforms. 🌍
Read the article: https://
“TABS [by Mozilla] pulls exactly the data you need—from HTML to Markdown to JSON—using the fastest, most efficient method for each page. It adapts to the structure and complexity of the site, staying stealthy and reliable so your [AI] agents always get what they need without friction.”
Ethical Stealthy AI Scraping (tm) by Mozilla.
#Mozilla
Mira Murati's Thinking Machines Lab makes Tinker, its API for fine-tuning language models, generally available, adds support for Kimi K2 Thinking, and more (Thinking Machines Lab)
https://thinkingmachines.ai/blog/tinker-general-availability/
Everything You Ever Wanted to Know, But Were Too Afraid To Ask
https://classautonomy.info/anarcho-syndicalism-everything-you-ever-wanted-to-know-but-were-too-afraid-to-ask/
Always fun/challenging to read new AI (pre)papers like this. "Base models know how to reason, thinking models learn when".
#AI #Google #reasoning
Egypt is building three high-speed rail lines totalling over 2,000 km. Meanwhile, in Canada, we are still ‘thinking’ about building our first 1,000 km line. Maybe. If we talk nice to the rail tycoons.
#Rail #Canada #CanPoli #CdnPoli
https://www.cnn.com/travel/egypt-high-speed-rail-siemens-velaro-desiro-trains-spc
Left #Bluesky or thinking about it? Into #genealogy or #familyhistory but don't know where to start? Follow @…
Reading thoughts about a new Chinese open-weights AI model, Kimi K2 Thinking.
https://www.interconnects.ai/p/kimi-k2-thinking-what-it-means
Chinese startup Moonshot releases Kimi K2 Thinking, an open-source model it claims beats GPT-5 in agentic capabilities; source: the model cost $4.6M to train (Evelyn Cheng/CNBC)
https://www.cnbc.com/2025/11/06/alibaba-backed-moonshot-releas…
Kimi K2 Thinking is impressive: it is at least as good as other top foundation models, fully open-weights, with 1 trillion parameters but only about 32B active per token in a Mixture-of-Experts (MoE) setup. You can also download and reuse it for free.
https://github.com/MoonshotAI/Kimi-K2
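The "1T total / 32B active" split comes from sparse expert routing. A minimal toy sketch of the idea, with hypothetical numbers (not Kimi K2's actual router or expert counts): a router scores all experts per token, but only the top-k experts actually run, so the active parameter count per token is a small fraction of the total.

```python
# Toy MoE routing sketch (illustrative numbers, NOT Kimi K2's real config).
import random

NUM_EXPERTS = 8            # hypothetical expert count
TOP_K = 2                  # experts activated per token
PARAMS_PER_EXPERT = 1000   # stand-in weight count per expert

def route(token_scores, k=TOP_K):
    """Return indices of the k highest-scoring experts for one token."""
    return sorted(range(len(token_scores)),
                  key=lambda i: token_scores[i],
                  reverse=True)[:k]

scores = [random.random() for _ in range(NUM_EXPERTS)]
active = route(scores)

total_params = NUM_EXPERTS * PARAMS_PER_EXPERT
active_params = TOP_K * PARAMS_PER_EXPERT
print(f"experts used: {active}, "
      f"active fraction: {active_params / total_params:.2f}")
```

With these toy numbers only a quarter of the weights run per token; scale the same ratio up and you get the "1T total, ~32B active" shape the post describes.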