AssertionError: Only scalar valued functions supported.
Stacktrace:
 [1] differentiate(::Function; o::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}) at C:\Users\Ross\.julia\packages\AutoGrad\6QsMu\src\core.jl:153
```
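For context, the assertion comes from AutoGrad's `differentiate`, which insists that the function it is handed returns a single scalar. A toy pure-Python finite-difference stand-in (not AutoGrad's actual implementation, just the same restriction) behaves the same way:

```python
def differentiate(f, x, eps=1e-6):
    """Toy stand-in for a scalar-only differentiator: central finite difference."""
    y = f(x)
    # Mirrors AutoGrad's check: the function must return one scalar value.
    assert isinstance(y, float), "Only scalar valued functions supported"
    return (f(x + eps) - f(x - eps)) / (2 * eps)

# A scalar-valued function differentiates fine:
slope = differentiate(lambda x: x * x, 3.0)  # ≈ 6.0

# A function returning an array-like (here a list) trips the same assertion:
try:
    differentiate(lambda x: [x, 2 * x], 3.0)
    vector_ok = True
except AssertionError:
    vector_ok = False  # rejected, as with my (1280, 1) output
```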
I am looking for a pointer on how to get further with this idea, please! Kind regards,
Ross
```
6-element Array{Any,1}:
 (128, 160, 1, 1)
 (40, 56, 20, 1)
 (18, 26, 50, 1)
 (14, 22, 60, 1)
 (12, 20, 1, 1)
 (1280, 1)
```
The result is a flat 2D array, with each value representing a 4×4 pixel window in an equivalent mask (1280 = 32*40 = 128/4 * 160/4). The ground-truth binary mask (I only have one detection class) is collapsed to this representative form with

```julia
reshape(pool(mask; window=(4,4), stride=(4,4)), (1280, 1))
```
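In case the shape arithmetic helps, here is a tiny pure-Python sketch (not the Knet `pool` call, just the same window/flatten logic) of collapsing a 128×160 binary mask into 1280 per-window values:

```python
H, W, WIN = 128, 160, 4  # mask height, width, pooling window

# Hypothetical checkerboard mask, standing in for a real ground-truth mask:
mask = [[1 if (r // 16 + c // 16) % 2 == 0 else 0 for c in range(W)]
        for r in range(H)]

def pool_flatten(m, win):
    """Max-pool a 2D binary mask with a win x win window, then flatten."""
    ph, pw = len(m) // win, len(m[0]) // win  # pooled dims: 32 x 40
    return [max(m[r * win + i][c * win + j]
                for i in range(win) for j in range(win))
            for r in range(ph) for c in range(pw)]

flat = pool_flatten(mask, WIN)  # length 32 * 40 = 1280, values still 0/1
```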
```julia
using Knet: Knet, AutoGrad, dir, Data, Param, @diff, value, params, grad, progress, progress!

# Compute gradients of the loss on the first minibatch:
J = @diff cnn(first(dtrn)[1], first(dtrn)[2])

# J is a struct; value(J) extracts the actual loss value:
@show value(J)

# params(J) returns an iterator over the Params J depends on (i.e. each layer's w and b):
@show collect(params(J))

# grad(J, p) gives the gradient of the loss with respect to parameter p.
# Note that each gradient has the same size and shape as the corresponding parameter:
∇w = [grad(J, l.w) for l in cnn.layers]
∇b = [grad(J, l.b) for l in cnn.layers]

[maximum(Array(w)) for w in ∇w]
```
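To sanity-check that shape property outside Knet, a small pure-Python finite-difference version (hypothetical `loss` and `numeric_grad` helpers, not AutoGrad's API) shows a gradient with one entry per parameter:

```python
def loss(w, xs, ys):
    """Mean squared error of a 1-D linear model y = w[0]*x + w[1]."""
    return sum((w[0] * x + w[1] - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def numeric_grad(f, w, eps=1e-6):
    """Forward finite difference; result has the same length/shape as w."""
    g = []
    for i in range(len(w)):
        wp = list(w)
        wp[i] += eps
        g.append((f(wp) - f(w)) / eps)
    return g

w = [2.0, 1.0]
xs, ys = [0.0, 1.0], [1.0, 2.0]
g = numeric_grad(lambda w: loss(w, xs, ys), w)  # one gradient entry per parameter
```

For this data the analytic gradient is [1.0, 1.0], which the numeric version matches to within the step size.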