Problem in backpropagation


AHMED IMAM SHAH

Nov 24, 2021, 6:41:46 AM
to COMP541
Hello everyone,
I am having a problem training my model. The forward pass works fine, but the backward pass raises an error; the error message is attached.

Here is the function that is giving the error:
function outputPad(x)
    a, b, c, d = size(x)
    hostx = Array{Float32}(x)                  # copy x from the GPU to the CPU
    paddedzeros = zeros(a+2, b+2, c, d)        # CPU buffer with a one-element border
    paddedzeros[2:a+1, 2:b+1, :, :] .= hostx   # write x into the interior
    return array_type(paddedzeros)             # move the padded result back to the GPU
end

This function pads the output of the deconv4 layer.
I don't understand what the issue with this function is.


Best,
Ahmed Imam Shah
MS. Computer Science and Engineering
Koç University, Istanbul, Turkey
[Attachment: Screenshot from 2021-11-24 14-39-29.png]

Ali Safaya

Nov 24, 2021, 9:30:20 AM
to AHMED IMAM SHAH, COMP541
Hi AHMED,

I think this is related to the array type of the `hostx` variable. Could you test the following:

function outputPad(x)
    a, b, c, d = size(x)
    hostx = array_type(x)
    paddedzeros = zeros(a+2, b+2, c, d)
    paddedzeros[2:a+1, 2:b+1, :, :] .= paddedzeros + hostx
    return array_type(paddedzeros)
end




AHMED IMAM SHAH

Nov 24, 2021, 9:42:31 AM
to Ali Safaya, COMP541
Thank you, Ali, for your reply. I tried that, but now the forward pass does not work either.

MethodError: no method matching +(::Array{Float64,4}, ::KnetArray{Float32,4})
Closest candidates are:
  +(::Any, ::Any, !Matched::Any, !Matched::Any...) at operators.jl:538
  +(!Matched::ChainRulesCore.NotImplemented, ::Any) at /kuacc/users/ashah20/.julia/packages/ChainRulesCore/7OROc/src/tangent_arithmetic.jl:24
  +(!Matched::Knet.KnetArrays.Bcasted{Float32}, ::KnetArray{Float32,N} where N) at /kuacc/users/ashah20/.julia/packages/Knet/C0PoK/src/knetarrays/binary.jl:132

Best,
Ahmed

Ali Safaya

Nov 24, 2021, 9:44:00 AM
to AHMED IMAM SHAH, COMP541
Please make sure that both arrays have the same type: either Array for the CPU or KnetArray for the GPU.
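
For illustration, here is a minimal sketch of the type-matching idea (assumptions: Knet is loaded, a GPU is available, the goal is the same padding as in outputPad above, and the function name is made up). Both sides of the broadcast stay on the CPU with the same element type, and only the finished buffer is moved back to the GPU:

using Knet

function outputPadSketch(x)
    a, b, c, d = size(x)
    hostx = Array{Float32}(x)                     # bring x to the CPU as Float32
    padded = zeros(Float32, a + 2, b + 2, c, d)   # CPU buffer with the same element type
    padded[2:a+1, 2:b+1, :, :] .= hostx           # both operands are plain CPU Arrays
    return KnetArray(padded)                      # move the padded result back to the GPU
end

Matching the element types and devices avoids the MethodError above; as the rest of the thread shows, the later backward-pass error is a separate issue.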

AHMED IMAM SHAH

Nov 24, 2021, 9:50:07 AM
to Ali Safaya, COMP541
That is what I was trying to do: making x a CPU array, doing the computation, and then converting back to a GPU array. It didn't work. Now I have made both of them GPU arrays, and I again get the error in the same function during backpropagation.

GPU compilation of kernel broadcast_kernel(CUDA.CuKernelContext, SubArray{Float32,4,CUDA.CuDeviceArray{Float32,4,1},Tuple{UnitRange{Int64},UnitRange{Int64},Base.Slice{Base.OneTo{Int64}},Base.Slice{Base.OneTo{Int64}}},false}, Base.Broadcast.Broadcasted{Nothing,NTuple{4,Base.OneTo{Int64}},typeof(identity),Tuple{AutoGrad.Result{KnetArray{Float32,4}}}}, Int64) failed
KernelError: passing and using non-bitstype argument
Argument 4 to your kernel function is of type Base.Broadcast.Broadcasted{Nothing,NTuple{4,Base.OneTo{Int64}},typeof(identity),Tuple{AutoGrad.Result{KnetArray{Float32,4}}}}, which is not isbits:
  .args is of type Tuple{AutoGrad.Result{KnetArray{Float32,4}}} which is not isbits.
    .1 is of type AutoGrad.Result{KnetArray{Float32,4}} which is not isbits.
      .value is of type Union{Nothing, KnetArray{Float32,4}} which is not isbits.
      .func is of type Any which is not isbits.
      .args is of type Any which is not isbits.
      .kwargs is of type Any which is not isbits.
Stacktrace:
 [1] differentiate(::Function; o::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}) at /kuacc/users/ashah20/.julia/packages/AutoGrad/TTpeo/src/core.jl:148
 [2] differentiate at /kuacc/users/ashah20/.julia/packages/AutoGrad/TTpeo/src/core.jl:135 [inlined]
 [3] trainfunc(::Chain, ::KnetArray{Float32,4}) at ./In[31]:7
 [4] top-level scope at ./In[33]:5
 [5] include_string(::Function, ::Module, ::String, ::String) at ./loading.jl:1091

Ali Safaya

Nov 24, 2021, 9:52:34 AM
to AHMED IMAM SHAH, COMP541
Can you please provide a fully reproducible example?
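
For reference, a fully reproducible example here would be a short, self-contained script that anyone can run to trigger the error, roughly along these lines (a sketch only: it assumes Knet and AutoGrad are loaded, a GPU is available, and outputPad is the padding function from earlier in the thread; the input shape is made up):

using Knet, AutoGrad

x = Param(KnetArray(randn(Float32, 8, 8, 3, 2)))  # dummy deconv4-style output (made-up shape)
J = @diff sum(outputPad(x))                       # run the forward and backward pass
g = grad(J, x)                                    # gradient with respect to x

Something this small lets others run the failing code directly instead of reconstructing it from screenshots.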


