In a language model, the vocabulary size is equal to the number of outputs of the network.
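To give a sense of scale (a minimal back-of-the-envelope sketch; the hidden size of 200 is read off the `Top shape: 1044 24 200` line in the log below, and the vocabulary size from the error message):

```python
# Rough size of the output inner-product layer's weight matrix
# when the vocabulary is the output dimension.
hidden_size = 200        # from "Top shape: 1044 24 200" in the log
vocab_size = 3_000_002   # from the error message

num_weights = hidden_size * vocab_size
print(num_weights)  # 600000400 -- ~600M parameters in the output layer alone
```

So even before the blob-size check fires, an output layer over a 3M-word vocabulary is enormous.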
When using a vocabulary size of ~3,000,000 in my RNN model, I got the following error:
I1020 19:48:51.488100 6972 net.cpp:155] Setting up rnn1_layer
I1020 19:48:51.488250 6972 net.cpp:163] Top shape: 1044 24 200 (5011200)
I1020 19:48:51.488281 6972 layer_factory.hpp:76] Creating layer inner_product
I1020 19:48:51.488297 6972 net.cpp:110] Creating Layer inner_product
I1020 19:48:51.488306 6972 net.cpp:477] inner_product <- rnn1_layer
I1020 19:48:51.488319 6972 net.cpp:433] inner_product -> inner_product
F1020 19:48:55.256937 6972 blob.cpp:29] Check failed: shape[i] <= 2147483647 / count_ (3000002 vs. 85707) blob size exceeds INT_MAX
*** Check failure stack trace: ***
@ 0x7fd9af53bb7d google::LogMessage::Fail()
@ 0x7fd9af53dc7f google::LogMessage::SendToLog()
@ 0x7fd9af53b76c google::LogMessage::Flush()
@ 0x7fd9af53e51d google::LogMessageFatal::~LogMessageFatal()
@ 0x7fd9af879d60 caffe::Blob<>::Reshape()
@ 0x7fd9af919b6d caffe::InnerProductLayer<>::Reshape()
@ 0x7fd9af9d0083 caffe::Net<>::Init()
@ 0x7fd9af9d1aa4 caffe::Net<>::Net()
@ 0x7fd9af9e99e2 caffe::Solver<>::InitTrainNet()
@ 0x7fd9af9e9fec caffe::Solver<>::Init()
@ 0x7fd9af9eacc7 caffe::Solver<>::Solver()
@ 0x415dd5 caffe::GetSolver<>()
@ 0x40de29 train()
@ 0x40a7db main
@ 0x7fd9ae4b676d (unknown)
@ 0x40a8e9 (unknown)
Aborted (core dumped)
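The numbers in the failed check can be reproduced directly (a sketch of my understanding of Caffe's check in `blob.cpp`, using the shapes from the log above: `count_` is the product of the axes already processed, here 1044 × 24, and `shape[i]` is the vocabulary axis):

```python
INT_MAX = 2**31 - 1       # 2147483647, the limit in Caffe's check
count = 1044 * 24         # product of the blob axes before the vocab axis
vocab_axis = 3_000_002    # the output (vocabulary) dimension

# Caffe requires shape[i] <= INT_MAX / count_ so the total element
# count fits in a signed 32-bit int.
limit = INT_MAX // count
print(limit)                   # 85707 -- the second number in the error
print(vocab_axis > limit)      # True: the blob would exceed INT_MAX elements
```

In other words, the output blob would need 1044 × 24 × 3,000,002 ≈ 7.5 × 10^10 elements, far beyond what Caffe's 32-bit element count allows.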
Any help in resolving this error would be greatly appreciated!