Tootfinder

Opt-in global Mastodon full-text search. Join the index!

@seeingwithsound@mas.to
2025-10-01 14:05:54

Artificial phantasia: Evidence for propositional reasoning-based mental imagery in large language models arxiv.org/abs/2509.23108, on the representation of visual imagery in humans; more information in the Bluesky thread

@cjust@infosec.exchange
2025-10-02 02:25:37

Maybe AI Was Never a Tool
They can deliver conclusions that feel complete but skip the struggle that gives thought its humanity. This is what I call anti-intelligence: not stupidity, but a kind of counterfeit cognition. It's intelligence without friction, producing output, built in that shared cognitive dynamic, that looks like insight but has bypassed the work that makes insight truly yours.

@arXiv_qbioNC_bot@mastoxiv.page
2025-12-10 08:57:11

Multi-state neurons
Robert Worden
arxiv.org/abs/2512.08815 arxiv.org/pdf/2512.08815 arxiv.org/html/2512.08815
arXiv:2512.08815v1 Announce Type: new
Abstract: Neurons, as eukaryotic cells, have powerful internal computation capabilities. One neuron can have many distinct states, and brains can use this capability. Processes of neuron growth and maintenance use chemical signalling between cell bodies and synapses, ferrying chemical messengers over microtubules and actin fibres within cells. These processes are computations which, while slower than neural electrical signalling, could allow any neuron to change its state over intervals of seconds or minutes. Based on its state, a single neuron can selectively de-activate some of its synapses, sculpting a dynamic neural net from the static neural connections of the brain. Without this dynamic selection, the static neural networks in brains are too amorphous and dilute to do the computations of neural cognitive models. The use of multi-state neurons in animal brains is illustrated in hierarchical Bayesian object recognition. Multi-state neurons may support a design which is more efficient than two-state neurons, and scales better as object complexity increases. Brains could have evolved to use multi-state neurons. Multi-state neurons could be used in artificial neural networks, supporting a kind of non-Hebbian learning that is faster, more focused, and more controllable than traditional neural-net learning. This possibility has not yet been explored in computational models.
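The abstract's core mechanism, a neuron whose slow internal state selects which of its static synapses stay active, lends itself to a small illustration. The Python sketch below is not the paper's model: the class, the per-state boolean masks, and the set_state/fire interface are all assumptions invented for this example. It shows only how one fixed weight vector can yield different effective circuits depending on the neuron's state.

```python
import numpy as np

class MultiStateNeuron:
    """Toy neuron whose discrete internal state gates its synapses.

    Hypothetical sketch: the weights are the static "wiring"; each state
    carries a boolean mask that de-activates a subset of synapses.
    """

    def __init__(self, weights, masks):
        self.weights = np.asarray(weights, dtype=float)   # static synaptic weights
        self.masks = {s: np.asarray(m, dtype=float) for s, m in masks.items()}
        self.state = 0                                    # current internal state

    def set_state(self, state):
        # Stands in for the slow, chemically mediated state change
        # (seconds to minutes in the paper's account).
        self.state = state

    def fire(self, inputs):
        # Only synapses unmasked in the current state contribute to the output.
        active = self.masks[self.state]
        return float(np.dot(self.weights * active, np.asarray(inputs, dtype=float)))

# Same static wiring, two different effective circuits depending on state.
neuron = MultiStateNeuron(
    weights=[0.5, -0.2, 0.8, 0.1],
    masks={0: [1, 1, 0, 0],    # state 0: first two synapses active
           1: [0, 0, 1, 1]},   # state 1: last two synapses active
)
x = [1.0, 1.0, 1.0, 1.0]
neuron.set_state(0)
print(neuron.fire(x))  # ~0.3 (0.5 - 0.2)
neuron.set_state(1)
print(neuron.fire(x))  # ~0.9 (0.8 + 0.1)
```

Scaled up, each state would carve a different dynamic sub-network out of the same fixed connections, which is the "sculpting" the abstract describes.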

@arXiv_csHC_bot@mastoxiv.page
2025-09-29 07:35:25

Position: Human Factors Reshape Adversarial Analysis in Human-AI Decision-Making Systems
Shutong Fan, Lan Zhang, Xiaoyong Yuan
arxiv.org/abs/2509.21436