VSAONLINE. SEASON 11. December 22, 20:00GMT. Tony Shaska.


Evgeny Osipov

Dec 16, 2025, 6:51:34 AM
to 'Google Groups' via VSACommunity, Parikshit Ram, Daswin De Silva, Fredrick Angelo Galapon, Cynamon, Josh, MCDONALD, NATHAN R CIV USAF AFMC AFRL/RITB, Ibrahim, Mohamed, Ross Gayler, GEETH R DE MEL, Marco Angioli, Peter Bruza, Colyn Seeley, Jesper Olsen, Wanyu Lei, Leonid Mokrushin, Rocco Martino, Trevor Cohen, Dave Bender, Dmitri Rachkovskij, Giacomo Camposampiero, Fatemeh Asgarinejad, Paxon Frady, Sadrzadeh, Mehrnoosh, Antonello Rosato, Ilya Paveliev, Ye Tian, Tajana Rosing, robin....@ling.gu.se, yonatan....@u-paris.fr, luec...@em.uni-frankfurt.de, staffan...@ling.gu.se, Tony Plate, josh cynamon, Tim Ufer, Swarup Kumar Mohalik, Marin Orlic, Nuwan madhusanka, Dmitry Krotov, Alex Serb, michie...@ugent.be

Dear all,

Welcome to the next talk of Season 11 on VSAONLINE. Tony Shaska (Oakland University, USA) will give the talk

"Artificial Neural Networks on Graded Vector Spaces."


Date: December 22, 2025

Time: 20:00 GMT

Zoom: https://ltu-se.zoom.us/j/65564790287

WEB: https://bit.ly/vsaonline


Abstract: This talk presents a transformative framework for artificial neural networks over graded vector spaces, tailored to model hierarchical and structured data in fields like algebraic geometry and physics. By exploiting the algebraic properties of graded vector spaces, where features carry distinct weights, we extend classical neural networks with graded neurons, layers, and activation functions that preserve structural integrity. Grounded in group actions, representation theory, and graded algebra, our approach combines theoretical rigor with practical utility.
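To make the idea of grade-preserving layers concrete, here is a minimal sketch (not from the talk; all names, dimensions, and grade values are hypothetical illustrations). A graded vector space splits into components, each carrying a grade, and a grading-preserving linear layer acts block-diagonally, mapping each component to itself without mixing grades:

```python
import numpy as np

# Hypothetical example: a graded vector space with two components
# of dimensions (2, 3), carrying grades (weights) 1 and 2.
grades = [1, 2]
dims = [2, 3]

# A grading-preserving linear map is block-diagonal: each block
# maps the graded component V_i to itself and never mixes grades.
rng = np.random.default_rng(0)
blocks = [rng.standard_normal((d, d)) for d in dims]

def graded_linear(x_parts):
    """Apply the block-diagonal (grade-preserving) map componentwise."""
    return [W @ x for W, x in zip(blocks, x_parts)]

# A graded input is a list of component vectors, one per grade.
x = [rng.standard_normal(d) for d in dims]
y = graded_linear(x)
```

The output components keep the same dimensions as the input components, so the grading structure survives the layer; a graded activation function would likewise be applied componentwise.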

We introduce graded neural architectures, loss functions that prioritize graded components, and equivariant extensions adaptable to diverse gradings. Case studies validate the framework's effectiveness: it outperforms standard neural networks in tasks such as predicting invariants in weighted projective spaces and modeling supersymmetric systems.
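One way a loss function might prioritize graded components is by weighting each component's error by its grade. The sketch below is a hypothetical illustration of that idea, not the talk's actual loss:

```python
import numpy as np

def graded_mse(pred_parts, target_parts, grades):
    """Mean squared error over graded components, with each
    component's contribution scaled by its grade. The weighting
    scheme is an illustrative assumption, not from the talk."""
    total = 0.0
    for p, t, g in zip(pred_parts, target_parts, grades):
        total += g * np.mean((p - t) ** 2)
    return total / sum(grades)

# Usage: two components with grades 1 and 2; identical prediction
# and target give zero loss.
pred = [np.zeros(2), np.ones(3)]
target = [np.zeros(2), np.ones(3)]
loss = graded_mse(pred, target, grades=[1, 2])
```

Higher-grade components dominate the objective, which matches the abstract's emphasis on features carrying distinct weights.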

This work establishes a new frontier in machine learning, merging mathematical sophistication with interdisciplinary applications. Future challenges, including computational scalability and finite field extensions, offer rich opportunities for advancing this paradigm.


invite.ics