Gradient Descent via Pyomo (Neural Networks)

Jonathan Nowacki

Oct 30, 2023, 12:58:08 AM
to Pyomo Forum
In neural networks, people often select the Adam optimizer to perform backpropagation and gradient descent. I'm curious whether anyone has implemented this at various scales (from simple to complex) via Pyomo.
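
For context, the vanilla gradient-descent update that Adam refines looks roughly like this for a single sigmoid neuron with squared-error loss (a rough NumPy sketch of my own, nothing Pyomo-specific yet):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gd_step(w, b, x, y, lr=0.1):
    """One plain gradient-descent update of (w, b) for one training pair (x, y)."""
    a = sigmoid(w * x + b)           # forward pass
    dloss_da = 2.0 * (a - y)         # d(loss)/d(activation) for squared error
    da_dz = a * (1.0 - a)            # sigmoid derivative
    grad_w = dloss_da * da_dz * x    # chain rule: d(loss)/dw
    grad_b = dloss_da * da_dz        # chain rule: d(loss)/db
    return w - lr * grad_w, b - lr * grad_b

Adam keeps running averages of these gradients and their squares to scale the step size per parameter, but the core update is the same.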

Basically, I'm looking for ways to use Pyomo to teach how neural networks work.  I don't care if you are modeling a single neuron or many.  The idea is to understand the concepts.
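
To make it concrete, here is the kind of thing I have in mind: instead of hand-coding backpropagation, write the loss of a single sigmoid neuron as a Pyomo objective and let a nonlinear solver fit the weight and bias. This is a rough, untested sketch and assumes IPOPT (or another NLP solver) is installed:

import pyomo.environ as pyo

# Toy training data for one input feature.
X = [0.0, 1.0, 2.0, 3.0]
Y = [0.05, 0.20, 0.80, 0.95]

model = pyo.ConcreteModel()
model.w = pyo.Var(initialize=0.1)   # neuron weight
model.b = pyo.Var(initialize=0.0)   # neuron bias

def sigmoid(expr):
    return 1.0 / (1.0 + pyo.exp(-expr))

# Sum-of-squares loss over the training points.
model.loss = pyo.Objective(
    expr=sum((sigmoid(model.w * x + model.b) - y) ** 2 for x, y in zip(X, Y)),
    sense=pyo.minimize,
)

pyo.SolverFactory("ipopt").solve(model)
print("w =", pyo.value(model.w), "b =", pyo.value(model.b))

A solver like IPOPT does its own derivative-based search internally, so it is not literally Adam, but it makes the "minimize a loss over the weights" idea very explicit for teaching.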

Has anyone done something like this?