CuArray only supports element types that are stored inline

MISRA YAVUZ

Nov 12, 2021, 11:42:05 AM
to COMP541
Hi, 

I'm having this error at the NERTagger cell, running on Colab with a GPU runtime. The root of the issue seems to be the Embedding layer, which I did not implement myself. In[36]:20 is where I call embed(x) in the forward pass. Also, I'm using _atype while creating the model.
Can you help?

Thanks in advance,
Mısra


┌ Info: Testing forward pass of NERTagger
└ @ Main In[36]:23
CuArray only supports element types that are stored inline
Stacktrace:
  [1] error(s::String)
    @ Base ./error.jl:33
  [2] CuArray{Any, 1, CUDA.Mem.DeviceBuffer}(#unused#::UndefInitializer, dims::Tuple{Int64})
    @ CUDA ~/.julia/packages/CUDA/YpW0k/src/array.jl:36
  [3] CuArray
    @ ~/.julia/packages/CUDA/YpW0k/src/array.jl:290 [inlined]
  [4] CuArray
    @ ~/.julia/packages/CUDA/YpW0k/src/array.jl:295 [inlined]
  [5] CuArray(A::Vector{Any})
    @ CUDA ~/.julia/packages/CUDA/YpW0k/src/array.jl:304
  [6] convert(#unused#::Type{CuArray}, a::Vector{Any})
    @ GPUArrays ~/.julia/packages/GPUArrays/3sW6s/src/host/construction.jl:4
  [7] adapt_storage(#unused#::Type{CuArray}, xs::Vector{Any})
    @ CUDA ~/.julia/packages/CUDA/YpW0k/src/array.jl:483
  [8] adapt_structure(to::Type, x::Vector{Any})
    @ Adapt ~/.julia/packages/Adapt/RGNRk/src/Adapt.jl:42
  [9] adapt
    @ ~/.julia/packages/Adapt/RGNRk/src/Adapt.jl:40 [inlined]
 [10] (::Adapt.var"#1#2"{UnionAll})(x::Vector{Any})
    @ Adapt ~/.julia/packages/Adapt/RGNRk/src/base.jl:3
 [11] map
    @ ./tuple.jl:214 [inlined]
 [12] adapt_structure(to::Type, xs::Tuple{Base.Slice{Base.OneTo{Int64}}, Vector{Any}})
    @ Adapt ~/.julia/packages/Adapt/RGNRk/src/base.jl:3
 [13] adapt(to::Type, x::Tuple{Base.Slice{Base.OneTo{Int64}}, Vector{Any}})
    @ Adapt ~/.julia/packages/Adapt/RGNRk/src/Adapt.jl:40
 [14] _getindex(::CuArray{Float32, 2, CUDA.Mem.DeviceBuffer}, ::Base.Slice{Base.OneTo{Int64}}, ::Vararg{Any, N} where N)
    @ GPUArrays ~/.julia/packages/GPUArrays/3sW6s/src/host/indexing.jl:125
 [15] getindex(::CuArray{Float32, 2, CUDA.Mem.DeviceBuffer}, ::Function, ::Vector{Any})
    @ GPUArrays ~/.julia/packages/GPUArrays/3sW6s/src/host/indexing.jl:115
 [16] getindex(::Knet.KnetArrays.KnetMatrix{Float32}, ::Function, ::Vector{Any})
    @ Knet.KnetArrays ~/.julia/packages/Knet/RCkV0/src/knetarrays/getindex.jl:39
 [17] forw(::Function, ::Param{Knet.KnetArrays.KnetMatrix{Float32}}, ::Vararg{Any, N} where N; kwargs::Base.Iterators.Pairs{Union{}, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
    @ AutoGrad ~/.julia/packages/AutoGrad/TTpeo/src/core.jl:66
 [18] forw
    @ ~/.julia/packages/AutoGrad/TTpeo/src/core.jl:65 [inlined]
 [19] getindex(::Param{Knet.KnetArrays.KnetMatrix{Float32}}, ::Function, ::Vector{Any})
    @ AutoGrad ./none:0
 [20] (::Embedding)(x::Vector{Any})
    @ Main ./In[32]:27
 [21] (::NERTagger)(x::Vector{Any}; batchsizes::Vector{Any})
    @ Main ./In[36]:20
 [22] top-level scope
    @ In[36]:27
 [23] eval
    @ ./boot.jl:360 [inlined]
 [24] include_string(mapexpr::typeof(REPL.softscope), mod::Module, code::String, filename::String)
    @ Base ./loading.jl:1116
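
For context, frames [14]–[16] show the Float32 GPU matrix being indexed with a Vector{Any} of word indices. A minimal standalone sketch that should reproduce the same error on a CUDA-enabled machine (my own toy example, not the assignment code; the names W and idx are made up) is:

using CUDA

W = CUDA.rand(Float32, 4, 5)   # stand-in for the embedding matrix
idx = Any[1, 3, 2]             # eltype Any, e.g. the result of building up words = []
W[:, idx]                      # ERROR: CuArray only supports element types that are stored inline

# The same call with a concretely typed index vector works:
W[:, Int[1, 3, 2]]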

Ege Dinçer

Nov 12, 2021, 12:17:43 PM
to MISRA YAVUZ, COMP541

Hi,

I'm also getting the same error. When I searched the internet for it, I found an issue claiming: "You’re not allowed to store mutable structs as the element types of CuArrays (or any GPUArray) because they end up getting stored as pointers, and being heap-allocated and managed." Could it be due to changes in CUDA, so that it no longer supports mutable structs?

(I am currently using CUDA 11.4 and cuDNN 8.2.2.)

Here is the link to the issue: https://discourse.julialang.org/t/cuarray-only-supports-element-types-that-are-stored-inline/71339
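
As a small illustration of that claim (my own toy example, not from the linked thread; the struct names are made up): a mutable struct is not stored inline, so CUDA.jl rejects it as a CuArray element type, while a plain immutable struct of bits fields is accepted:

using CUDA

mutable struct MutableCell
    x::Int
end

struct PlainCell
    x::Int
end

CuArray([MutableCell(1), MutableCell(2)])   # ERROR: CuArray only supports element types that are stored inline
CuArray([PlainCell(1), PlainCell(2)])       # works: PlainCell is an isbits type, stored inline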

Regards,

Ege

Shadi Sameh Hamdan

Nov 12, 2021, 12:43:46 PM
to Ege Dinçer, MISRA YAVUZ, COMP541
Hello,

I am unable to identify the issue from the stack trace. Can you please include a minimal working example, that is, the simplest piece of code I can run to reproduce the same issue? If you are worried it would contain a significant part of the assignment solution, send it to the TAs instead of to the whole COMP541 group.

Best,
Shadi 

Shadi Sameh Hamdan

Nov 12, 2021, 1:56:18 PM
to Ege Dinçer, MISRA YAVUZ, COMP541
Hello,

When initializing arrays, it is always recommended to keep the element type as specific as possible. Word indices are always integers, so instead of initializing the array as:

words = []

it is better to be more specific and write:

words = Array{Int}([])

The first instance is of type Array{Any}, while the second is specifically of type Array{Int}. When working on the GPU, non-specific element types like Any cause issues. To solve this, either initialize the array as in the second example or convert the first one with Array{Int}(words). I recommend the former.
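
To make this concrete, here is a small sketch (my own toy example, assuming CUDA.jl is installed; the names bad, good and W are made up) showing why the element type matters once the indices reach the GPU:

using CUDA

bad  = []                  # Vector{Any}
good = Array{Int}([])      # Vector{Int}; Int[] is an equivalent shorthand
push!(bad, 1, 2)
push!(good, 1, 2)

W = CUDA.rand(Float32, 4, 5)
W[:, good]                 # works: Int indices can be transferred to the GPU
W[:, bad]                  # ERROR: CuArray only supports element types that are stored inline
W[:, Array{Int}(bad)]      # converting after the fact also works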

Let me know if you are still facing issues.

Best regards,
Shadi 