VSAONLINE. SEASON 11. September 8, 20:00 GMT. Eilene Tomkins-Flanagan


Evgeny Osipov

Sep 3, 2025, 2:45:19 AM
to 'Google Groups' via VSACommunity, Parikshit Ram, Daswin De Silva, Fredrick Angelo Galapon, Cynamon, Josh, MCDONALD, NATHAN R CIV USAF AFMC AFRL/RITB, Ibrahim, Mohamed, Ross Gayler, GEETH R DE MEL, Marco Angioli, Peter Bruza, Colyn Seeley, Jesper Olsen, Wanyu Lei, Leonid Mokrushin

Dear all,

Welcome to the next talk of Season 11 on VSAONLINE. Eilene Tomkins-Flanagan and Mary Kelly will deliver an exciting talk:

Hey Pentti, We Did It!: A Fully Vector-Symbolic Lisp

Date: September 8, 2025

Time: 20:00 GMT

Zoom: https://ltu-se.zoom.us/j/65564790287

 

WEB: https://bit.ly/vsaonline

Title: Hey Pentti, We Did It!: A Fully Vector-Symbolic Lisp.

Abstract: A key feature of symbolic computation is the productivity of formal grammars. Although vector-symbolic architectures (VSAs) allow composite symbolic sentences to be expressed over a vector space, the VSA literature has left the study of formal grammars expressed in terms of VSAs under-explored. Until recently, it had not even been demonstrated that VSAs can express an unrestricted grammar (and hence form a Turing-complete system). However, the correspondence between grammar theory and automata theory lets us study the computational properties of systems in which the terms of a given language are embedded. Thus, VSA-encoded languages supply a means to reason about computability and learning theory in the systems in which they are found. Accordingly, we present three languages: (1) a subset of Lisp 1.5 described over a generic VSA, demonstrating that any VSA may express an unrestricted grammar; (2) an extension of that language for Fourier-domain Holographic Reduced Representations that performs numeric computation efficiently; and (3) a type system described over a generic VSA that expresses only programs that halt in polynomial time. These preliminary contributions illustrate the expressive power of VSAs and provide a fruitful basis for reasoning about recent progress in machine learning and about challenges to the present transformer-led paradigm (notably the ARC-AGI task) as researchers push toward more "general" models. The heuristic-search approach used to augment transformers so that they can address fluid reasoning tasks has long been known to be unsustainably inefficient, particularly compared to a human baseline, and such inefficiencies may be rectified using the kinds of structured, productive representations we have shown to be possible with VSAs. By imposing systematic restrictions on a search space via a formal language, we intend to build AI systems that fit a humanlike learning curve.
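For context on item (2) of the abstract: Fourier-domain Holographic Reduced Representations (HRRs) bind vectors by circular convolution, which the FFT reduces to elementwise multiplication. The short Python sketch below is not taken from the talk; the dimensionality, role names, and helper functions are illustrative assumptions. It shows how such binding could encode a toy Lisp-style cons cell and how unbinding approximately recovers its fields.

import numpy as np

rng = np.random.default_rng(0)
d = 1024  # hypothetical dimensionality; practical systems often use larger vectors

def random_vector(dim):
    # i.i.d. Gaussian components scaled so the expected norm is about 1
    return rng.normal(0.0, 1.0 / np.sqrt(dim), dim)

def bind(a, b):
    # HRR binding: circular convolution, computed efficiently in the Fourier domain
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def unbind(c, a):
    # Approximate unbinding: circular correlation (conjugation in the Fourier domain)
    return np.real(np.fft.ifft(np.fft.fft(c) * np.conj(np.fft.fft(a))))

# Encode a toy cons cell (x . y) as a superposition of role-filler bindings
CAR, CDR = random_vector(d), random_vector(d)   # role vectors (illustrative)
x, y = random_vector(d), random_vector(d)       # filler vectors
cell = bind(CAR, x) + bind(CDR, y)

# Unbinding the CAR role yields a noisy copy of x; a clean-up memory would
# compare against known vectors, so x scores high and y scores near zero.
retrieved = unbind(cell, CAR)
print(np.dot(retrieved, x), np.dot(retrieved, y))

Note that exact inverses exist only for suitably normalized (unitary) vectors; with random Gaussian vectors the unbinding is approximate and relies on a clean-up step, which is why the final dot-product comparison matters.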

 

invite.ics

Evgeny Osipov

Sep 8, 2025, 12:53:23 PM
to 'Google Groups' via VSACommunity, Parikshit Ram, Daswin De Silva, Fredrick Angelo Galapon, Cynamon, Josh, MCDONALD, NATHAN R CIV USAF AFMC AFRL/RITB, Ibrahim, Mohamed, Ross Gayler, GEETH R DE MEL, Marco Angioli, Peter Bruza, Colyn Seeley, Jesper Olsen, Wanyu Lei, Leonid Mokrushin, Rocco Martino, Trevor Cohen, Dave Bender

Dear all,

Reminder! See you all soon!

invite.ics