Override the layer's accGradParameters function to be an empty function.
If you want to use this kind of layer a lot, just derive a layer with nn.Linear as the parent, make accGradParameters an empty function, and use that everywhere. A sketch follows below.
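A minimal sketch of both approaches; the FrozenLinear name is my own for illustration, not something from the nn package:

```lua
require 'nn'

-- One-off: blank out accGradParameters on a single layer instance so
-- gradients are never accumulated into its weight/bias.
local frozen = nn.Linear(10, 5)
frozen.accGradParameters = function() end

-- Reusable: derive a layer with nn.Linear as the parent and give it an
-- empty accGradParameters.
local FrozenLinear, parent = torch.class('nn.FrozenLinear', 'nn.Linear')

function FrozenLinear:__init(inputSize, outputSize)
   parent.__init(self, inputSize, outputSize)
end

function FrozenLinear:accGradParameters()
   -- intentionally empty: this layer's parameters never receive gradients,
   -- so (after zeroGradParameters) any update step leaves them unchanged
end
```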
Overall, I recommend not calling getParameters() at all and instead using a separate optimizer for each block of weights. The blocks can be accessed via the :parameters() method; see the sketch below.
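A minimal sketch of this setup using optim.sgd; the model and hyperparameters here are placeholders of my own choosing:

```lua
require 'nn'
require 'optim'

-- Illustrative two-layer model.
local model = nn.Sequential()
model:add(nn.Linear(10, 20))
model:add(nn.Tanh())
model:add(nn.Linear(20, 2))

-- :parameters() returns two parallel tables: the weight/bias tensors and
-- the matching gradient tensors, without flattening into a single vector.
local params, gradParams = model:parameters()

-- One optimizer config and state table per parameter block, so each block
-- can have its own learning rate, momentum, etc.
local configs, states = {}, {}
for i = 1, #params do
   configs[i] = {learningRate = 0.01}
   states[i] = {}
end

-- After a forward/backward pass (and zeroGradParameters before it),
-- step each block independently. sgd only uses the returned gradient,
-- so the loss value passed back here is a dummy.
local function step()
   for i = 1, #params do
      optim.sgd(function() return 0, gradParams[i] end,
                params[i], configs[i], states[i])
   end
end
```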
Thank you for your reply. Is there a way to integrate custom-made layers with the nn package?

Soumava Kumar Roy
On Tuesday, March 28, 2017 at 5:46:52 AM UTC+5:30, David Belanger wrote:
> Overall, I recommend not calling getParameters() at all and instead using a separate optimizer for each block of weights. The blocks can be accessed via the :parameters() method.