VSAONLINE. SEASON 12. February 2, 20:00 GMT. Nolan Shaw.


Evgeny Osipov

Jan 29, 2026, 4:45:29 PM
to 'Google Groups' via VSACommunity, Leonid Mokrushin

 

Dear all,

Welcome to the next talk of Season 12 of VSAONLINE. Nolan Shaw (Cheriton School of Computer Science, University of Waterloo, Canada) will give a talk entitled

Developing a Foundation of Vector Symbolic Architectures Using Category Theory

 

Date: February 2, 2026

Time: 20:00 GMT

Zoom: https://ltu-se.zoom.us/j/65564790287

WEB: https://bit.ly/vsaonline

 

Abstract: Connectionist approaches to machine learning, i.e. neural networks, are enjoying a considerable vogue right now. However, these methods require large volumes of data and produce models that are uninterpretable to humans. An alternative framework that is compatible with neural networks and gradient-based learning, but explicitly models compositionality, is Vector Symbolic Architectures (VSAs). VSAs are a family of algebras on high-dimensional vector representations. They arose in cognitive science from the need to unify neural processing and the kind of symbolic reasoning that humans perform. While machine learning methods have benefited from category-theoretical analyses, VSAs have not yet received similar treatment. In this paper, we present a first attempt at applying category theory to VSAs. Specifically, we generalise from vectors to co-presheaves, and describe VSA operations as the right Kan extensions of the external tensor product. This formalisation involves a proof that the right Kan extension in such cases can be expressed as simple, element-wise operations. We validate our formalisation with worked examples that connect to current VSA implementations, while suggesting new possible designs for VSAs.
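For readers unfamiliar with VSAs, here is a minimal Python sketch (not taken from the talk or the paper) of the element-wise flavour of VSA operations the abstract refers to, using a MAP-style architecture with bipolar hypervectors; the dimensionality, function names, and the role-filler example are all illustrative assumptions.

import numpy as np

# Illustrative MAP-style VSA sketch (assumption, not the speaker's construction):
# binding and bundling are simple element-wise operations on high-dimensional
# bipolar vectors, the kind of element-wise structure the abstract alludes to.

rng = np.random.default_rng(0)
DIM = 10_000  # high dimensionality makes random vectors quasi-orthogonal

def random_hv():
    """Random bipolar hypervector in {-1, +1}^DIM."""
    return rng.choice([-1, 1], size=DIM)

def bind(a, b):
    """Binding: element-wise (Hadamard) product; self-inverse for bipolar vectors."""
    return a * b

def bundle(*vs):
    """Bundling: element-wise sum followed by a sign threshold."""
    return np.sign(np.sum(vs, axis=0))

def similarity(a, b):
    """Normalised dot-product similarity."""
    return float(a @ b) / DIM

# Encode a simple role-filler structure: {colour: red, shape: circle}.
colour, red = random_hv(), random_hv()
shape, circle = random_hv(), random_hv()
record = bundle(bind(colour, red), bind(shape, circle))

# Unbinding with the role vector recovers a noisy version of the filler.
recovered = bind(record, colour)
print(similarity(recovered, red))     # high (about 0.5 here)
print(similarity(recovered, circle))  # near 0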

invite.ics