Tootfinder

Opt-in global Mastodon full text search. Join the index!

@lysander07@sigmoid.social
2025-07-30 15:15:32

One of our final topics in the #ISE2025 lecture was Knowledge Graph Embeddings. How do we vectorise KG structures while preserving their inherent semantics?
#AI #KGE

4. Basic Machine Learning / 4.9 Knowledge Graph Embeddings 
The slide visualises the process of creating knowledge graph embeddings. Starting from a KG, a scoring function, a loss function, and negatives generation are shown as the components responsible for preserving semantics during vectorisation. The resulting vectors then serve as representations of entities and properties in downstream tasks such as classification or question answering.
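
The slide only names the ingredients (scoring function, loss function, negatives generation), so here is a minimal illustrative sketch assuming a TransE-style translational scoring function; the toy entities, dimensions, and hyperparameters are invented for the example and are not from the lecture.

```python
import numpy as np

# Toy TransE-style pipeline: a scoring function rates triples (head, relation,
# tail), negatives generation corrupts triples, and a margin loss trains the
# vectors so that the KG's semantics are preserved in the embedding space.

rng = np.random.default_rng(0)

entities = ["Berlin", "Germany", "Paris", "France"]   # made-up toy KG
relations = ["capital_of"]
triples = [(0, 0, 1), (2, 0, 3)]                      # (head, relation, tail)

dim, lr, margin = 16, 0.05, 1.0
E = rng.normal(scale=0.1, size=(len(entities), dim))   # entity embeddings
R = rng.normal(scale=0.1, size=(len(relations), dim))  # relation embeddings

def score(h, r, t):
    """Scoring function: distance of head + relation from tail (lower = more plausible)."""
    return np.linalg.norm(E[h] + R[r] - E[t])

def corrupt(h, r, t):
    """Negatives generation: replace head or tail with a random entity."""
    if rng.random() < 0.5:
        return rng.integers(len(entities)), r, t
    return h, r, rng.integers(len(entities))

for epoch in range(200):
    for h, r, t in triples:
        hn, rn, tn = corrupt(h, r, t)
        pos, neg = score(h, r, t), score(hn, rn, tn)
        if pos + margin > neg:                  # margin ranking loss violated
            g_pos = (E[h] + R[r] - E[t]) / (pos + 1e-9)   # pull positive triple together
            E[h] -= lr * g_pos
            R[r] -= lr * g_pos
            E[t] += lr * g_pos
            g_neg = (E[hn] + R[rn] - E[tn]) / (neg + 1e-9)  # push negative triple apart
            E[hn] += lr * g_neg
            R[rn] += lr * g_neg
            E[tn] -= lr * g_neg

# The learned vectors feed downstream tasks (classification, QA, link prediction):
print("Berlin capital_of Germany:", round(score(0, 0, 1), 3))
print("Berlin capital_of France :", round(score(0, 0, 3), 3))
```

TransE is only one choice of scoring function; models such as DistMult, ComplEx, or RotatE score triples differently but slot into the same scoring/loss/negatives pipeline the slide describes.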
@arXiv_csLG_bot@mastoxiv.page
2025-09-18 10:18:21

Language models' activations linearly encode training-order recency
Dmitrii Krasheninnikov, Richard E. Turner, David Krueger
arxiv.org/abs/2509.14223

@arXiv_astrophEP_bot@mastoxiv.page
2025-09-22 09:38:11

Ground-Based Radar Tracking of Near-Earth Objects With VLBI Radio Telescopes: 2024 MK Test Case
Oliver White, Guifré Molera Calvés, Shinji Horiuchi, Ed Kruzins, Edwin Peters, Nick Stacy
arxiv.org/abs/2509.15684