Tootfinder

Opt-in global Mastodon full-text search. Join the index!

No exact results. Similar results found.
@arXiv_csRO_bot@mastoxiv.page
2025-10-13 09:35:20

Model-Based Lookahead Reinforcement Learning for in-hand manipulation
Alexandre Lopes, Catarina Barata, Plinio Moreno
arxiv.org/abs/2510.08884

@arXiv_csCL_bot@mastoxiv.page
2025-10-02 10:29:21

Beyond Log Likelihood: Probability-Based Objectives for Supervised Fine-Tuning across the Model Capability Continuum
Gaotang Li, Ruizhong Qiu, Xiusi Chen, Heng Ji, Hanghang Tong
arxiv.org/abs/2510.00526

@arXiv_csCV_bot@mastoxiv.page
2025-10-06 10:04:09

Training-Free Out-Of-Distribution Segmentation With Foundation Models
Laith Nayal, Hadi Salloum, Ahmad Taha, Yaroslav Kholodov, Alexander Gasnikov
arxiv.org/abs/2510.02909

@arXiv_csRO_bot@mastoxiv.page
2025-10-09 09:47:11

Vision-Language-Action Models for Robotics: A Review Towards Real-World Applications
Kento Kawaharazuka, Jihoon Oh, Jun Yamada, Ingmar Posner, Yuke Zhu
arxiv.org/abs/2510.07077

@arXiv_qbioNC_bot@mastoxiv.page
2025-12-10 08:57:11

Multi-state neurons
Robert Worden
arxiv.org/abs/2512.08815 arxiv.org/pdf/2512.08815 arxiv.org/html/2512.08815
arXiv:2512.08815v1 Announce Type: new
Abstract: Neurons, as eukaryotic cells, have powerful internal computation capabilities. One neuron can have many distinct states, and brains can exploit this capability. Processes of neuron growth and maintenance use chemical signalling between cell bodies and synapses, ferrying chemical messengers over microtubules and actin fibres within cells. These processes are computations which, while slower than neural electrical signalling, could allow any neuron to change its state over intervals of seconds or minutes. Based on its state, a single neuron can selectively deactivate some of its synapses, sculpting a dynamic neural net from the static neural connections of the brain. Without this dynamic selection, the static neural networks in brains are too amorphous and dilute to perform the computations of neural cognitive models. The use of multi-state neurons in animal brains is illustrated by hierarchical Bayesian object recognition. Multi-state neurons may support a design which is more efficient than two-state neurons and scales better as object complexity increases. Brains could have evolved to use multi-state neurons. Multi-state neurons could also be used in artificial neural networks, to implement a kind of non-Hebbian learning that is faster, more focused, and more controllable than traditional neural net learning. This possibility has not yet been explored in computational models.
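The central mechanism in the abstract, an internal neuron state that gates which synapses are active over a fixed set of connections, can be sketched in a few lines. The Python below is a minimal hypothetical illustration under assumed details; the class name, the random gating scheme, and the tanh activation are inventions for this sketch, not code or notation from the paper.

import numpy as np

class MultiStateNeuron:
    # Illustrative only: a neuron whose slow internal state selects
    # which of its static synapses contribute to its output.
    def __init__(self, n_inputs: int, n_states: int, rng=None):
        if rng is None:
            rng = np.random.default_rng(0)
        # Static connectivity: one weight per incoming synapse.
        self.weights = rng.normal(size=n_inputs)
        # Per-state gate: which synapses are active in each state
        # (here chosen at random for the sketch).
        self.gates = rng.random((n_states, n_inputs)) < 0.5
        self.state = 0  # slow internal (chemical) state

    def set_state(self, state: int):
        # State changes are slow (seconds to minutes) relative to spiking.
        self.state = state

    def activate(self, inputs: np.ndarray) -> float:
        # Only synapses gated on in the current state contribute.
        gated = self.weights * self.gates[self.state]
        return float(np.tanh(gated @ inputs))

# Same input, state-dependent output: switching the state re-selects
# a subnet of synapses over the same static weights.
neuron = MultiStateNeuron(n_inputs=8, n_states=4)
x = np.ones(8)
for s in range(4):
    neuron.set_state(s)
    print(s, neuron.activate(x))

Each state thus carves a different effective sub-network out of the same fixed wiring, which is the "dynamic neural net sculpted from static connections" the abstract describes.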