Error in Python inference based on the 2017/09/28 post "How do I run inference?"


Ride Forever

Jan 1, 2018, 10:31:41 AM
to Neural Network Console Users (JP)
Using NNC (1.1.6519.49966), I built a model that recognizes handwritten calligraphy characters.

The model is the sample project "12_residual_learning" with its Input modified to read 50*50 grayscale images, then trained.
I went through the following steps.

① From the EDIT tab, I selected Python Code (NNabla) and obtained the following source code
------------------------------------------------------------------------------------------------------------
import nnabla as nn
import nnabla.functions as F
import nnabla.parametric_functions as PF

def network(x, y, test=False):
    # Input -> 1,50,50
    # Convolution -> 64,54,54
    with nn.parameter_scope('Convolution'):
        h = PF.convolution(x, 64, (5,5), (4,4))
    # BatchNormalization
    with nn.parameter_scope('BatchNormalization'):
        h = PF.batch_normalization(h, (1,), 0.5, 0.01, not test)
    # ReLU
    h = F.relu(h, True)

    # Convolution_2
    with nn.parameter_scope('Convolution_2'):
        h1 = PF.convolution(h, 64, (3,3), (1,1))
    # BatchNormalization_2
    with nn.parameter_scope('BatchNormalization_2'):
        h1 = PF.batch_normalization(h1, (1,), 0.5, 0.01, not test)
    # ReLU_2
    h1 = F.relu(h1, True)
    # Convolution_3
    with nn.parameter_scope('Convolution_3'):
        h1 = PF.convolution(h1, 64, (1,1), (0,0))
    # BatchNormalization_3
    with nn.parameter_scope('BatchNormalization_3'):
        h1 = PF.batch_normalization(h1, (1,), 0.5, 0.01, not test)
    # ReLU_3
    h1 = F.relu(h1, True)
    # Convolution_4
    with nn.parameter_scope('Convolution_4'):
        h1 = PF.convolution(h1, 64, (3,3), (1,1))
    # BatchNormalization_4
    with nn.parameter_scope('BatchNormalization_4'):
        h1 = PF.batch_normalization(h1, (1,), 0.5, 0.01, not test)

    # Add2 -> 64,54,54
    h2 = F.add2(h1, h, False)
    # ReLU_4
    h2 = F.relu(h2, True)

    # MaxPooling -> 64,27,27
    h2 = F.max_pooling(h2, (2,2), (2,2), True)

    # RepeatStart
    for i in range(2):

        # Convolution_5
        with nn.parameter_scope('Convolution_5_[' + str(i) + ']'):
            h3 = PF.convolution(h2, 64, (3,3), (1,1))
        # BatchNormalization_5
        with nn.parameter_scope('BatchNormalization_5_[' + str(i) + ']'):
            h3 = PF.batch_normalization(h3, (1,), 0.5, 0.01, not test)
        # ReLU_5
        h3 = F.relu(h3, True)
        # Convolution_6
        with nn.parameter_scope('Convolution_6_[' + str(i) + ']'):
            h3 = PF.convolution(h3, 64, (1,1), (0,0))
        # BatchNormalization_6
        with nn.parameter_scope('BatchNormalization_6_[' + str(i) + ']'):
            h3 = PF.batch_normalization(h3, (1,), 0.5, 0.01, not test)
        # ReLU_6
        h3 = F.relu(h3, True)
        # Convolution_7
        with nn.parameter_scope('Convolution_7_[' + str(i) + ']'):
            h3 = PF.convolution(h3, 64, (3,3), (1,1))
        # BatchNormalization_7
        with nn.parameter_scope('BatchNormalization_7_[' + str(i) + ']'):
            h3 = PF.batch_normalization(h3, (1,), 0.5, 0.01, not test)

        # Add2_2 -> 64,27,27
        h4 = F.add2(h3, h2, False)
        # ReLU_7
        h4 = F.relu(h4, True)

        # RepeatEnd
        h2 = h4

    # MaxPooling_2 -> 64,13,13
    h4 = F.max_pooling(h4, (2,2), (2,2), True)

    # RepeatStart_2
    for i in range(2):

        # Convolution_8
        with nn.parameter_scope('Convolution_8_[' + str(i) + ']'):
            h5 = PF.convolution(h4, 64, (3,3), (1,1))
        # BatchNormalization_8
        with nn.parameter_scope('BatchNormalization_8_[' + str(i) + ']'):
            h5 = PF.batch_normalization(h5, (1,), 0.5, 0.01, not test)
        # ReLU_8
        h5 = F.relu(h5, True)
        # Convolution_9
        with nn.parameter_scope('Convolution_9_[' + str(i) + ']'):
            h5 = PF.convolution(h5, 64, (1,1), (0,0))
        # BatchNormalization_9
        with nn.parameter_scope('BatchNormalization_9_[' + str(i) + ']'):
            h5 = PF.batch_normalization(h5, (1,), 0.5, 0.01, not test)
        # ReLU_9
        h5 = F.relu(h5, True)
        # Convolution_10
        with nn.parameter_scope('Convolution_10_[' + str(i) + ']'):
            h5 = PF.convolution(h5, 64, (3,3), (1,1))
        # BatchNormalization_10
        with nn.parameter_scope('BatchNormalization_10_[' + str(i) + ']'):
            h5 = PF.batch_normalization(h5, (1,), 0.5, 0.01, not test)

        # Add2_3 -> 64,13,13
        h6 = F.add2(h5, h4, False)
        # ReLU_10
        h6 = F.relu(h6, True)

        # RepeatEnd_2
        h4 = h6

    # MaxPooling_3 -> 64,6,6
    h6 = F.max_pooling(h6, (2,2), (2,2), True)

    # Convolution_11
    with nn.parameter_scope('Convolution_11'):
        h7 = PF.convolution(h6, 64, (3,3), (1,1))
    # BatchNormalization_11
    with nn.parameter_scope('BatchNormalization_11'):
        h7 = PF.batch_normalization(h7, (1,), 0.5, 0.01, not test)
    # ReLU_11
    h7 = F.relu(h7, True)
    # Convolution_12
    with nn.parameter_scope('Convolution_12'):
        h7 = PF.convolution(h7, 64, (1,1), (0,0))
    # BatchNormalization_12
    with nn.parameter_scope('BatchNormalization_12'):
        h7 = PF.batch_normalization(h7, (1,), 0.5, 0.01, not test)
    # ReLU_12
    h7 = F.relu(h7, True)
    # Convolution_13
    with nn.parameter_scope('Convolution_13'):
        h7 = PF.convolution(h7, 64, (3,3), (1,1))
    # BatchNormalization_13
    with nn.parameter_scope('BatchNormalization_13'):
        h7 = PF.batch_normalization(h7, (1,), 0.5, 0.01, not test)

    # Add2_4 -> 64,6,6
    h8 = F.add2(h7, h6, False)
    # ReLU_13
    h8 = F.relu(h8, True)

    # AveragePooling -> 64,1,1
    h8 = F.average_pooling(h8, (4,4), (4,4), True)

    # Affine -> 10
    with nn.parameter_scope('Affine'):
        h8 = PF.affine(h8, (10,))
    # Softmax
    h8 = F.softmax(h8)
    # CategoricalCrossEntropy -> 1
    h8 = F.categorical_cross_entropy(h8, y)
    return h8
------------------------------------------------------------------------------------------------------------


② Next, based on the earlier post, I created the following source code
===========================================================================
#################################################
# Import modules
# Make Python 3 behavior that is incompatible with
# Python 2 available under Python 2
from __future__ import absolute_import
# Python 2/3 compatibility library
from six.moves import range
# Utility for with-statement contexts
from contextlib import contextmanager
# Numerical computing library NumPy
import numpy as np
# OS-dependent standard library
#import os
# NNabla-related modules
import nnabla as nn
import nnabla.parametric_functions as PF
import nnabla.functions as F
import nnabla.solvers as S
from nnabla.utils.data_iterator import data_iterator_csv_dataset
#import nnabla.logger as logger
#import nnabla.utils.save as save
# Miscellaneous
#from args import get_args
#from mnist_data import data_iterator_mnist
#################################################

#################################################
# RESNET
def network(x, y, test=False):
    # Input -> 1,50,50
    # Convolution -> 64,54,54
    with nn.parameter_scope('Convolution'):
        h = PF.convolution(x, 64, (5,5), (4,4))
    # BatchNormalization
    with nn.parameter_scope('BatchNormalization'):
        h = PF.batch_normalization(h, (1,), 0.5, 0.01, not test)
    # ReLU
    h = F.relu(h, True)

    # Convolution_2
    with nn.parameter_scope('Convolution_2'):
        h1 = PF.convolution(h, 64, (3,3), (1,1))
    # BatchNormalization_2
    with nn.parameter_scope('BatchNormalization_2'):
        h1 = PF.batch_normalization(h1, (1,), 0.5, 0.01, not test)
    # ReLU_2
    h1 = F.relu(h1, True)
    # Convolution_3
    with nn.parameter_scope('Convolution_3'):
        h1 = PF.convolution(h1, 64, (1,1), (0,0))
    # BatchNormalization_3
    with nn.parameter_scope('BatchNormalization_3'):
        h1 = PF.batch_normalization(h1, (1,), 0.5, 0.01, not test)
    # ReLU_3
    h1 = F.relu(h1, True)
    # Convolution_4
    with nn.parameter_scope('Convolution_4'):
        h1 = PF.convolution(h1, 64, (3,3), (1,1))
    # BatchNormalization_4
    with nn.parameter_scope('BatchNormalization_4'):
        h1 = PF.batch_normalization(h1, (1,), 0.5, 0.01, not test)

    # Add2 -> 64,54,54
    h2 = F.add2(h1, h, False)
    # ReLU_4
    h2 = F.relu(h2, True)

    # MaxPooling -> 64,27,27
    h2 = F.max_pooling(h2, (2,2), (2,2), True)

    # RepeatStart
    for i in range(2):

        # Convolution_5
        with nn.parameter_scope('Convolution_5_[' + str(i) + ']'):
            h3 = PF.convolution(h2, 64, (3,3), (1,1))
        # BatchNormalization_5
        with nn.parameter_scope('BatchNormalization_5_[' + str(i) + ']'):
            h3 = PF.batch_normalization(h3, (1,), 0.5, 0.01, not test)
        # ReLU_5
        h3 = F.relu(h3, True)
        # Convolution_6
        with nn.parameter_scope('Convolution_6_[' + str(i) + ']'):
            h3 = PF.convolution(h3, 64, (1,1), (0,0))
        # BatchNormalization_6
        with nn.parameter_scope('BatchNormalization_6_[' + str(i) + ']'):
            h3 = PF.batch_normalization(h3, (1,), 0.5, 0.01, not test)
        # ReLU_6
        h3 = F.relu(h3, True)
        # Convolution_7
        with nn.parameter_scope('Convolution_7_[' + str(i) + ']'):
            h3 = PF.convolution(h3, 64, (3,3), (1,1))
        # BatchNormalization_7
        with nn.parameter_scope('BatchNormalization_7_[' + str(i) + ']'):
            h3 = PF.batch_normalization(h3, (1,), 0.5, 0.01, not test)

        # Add2_2 -> 64,27,27
        h4 = F.add2(h3, h2, False)
        # ReLU_7
        h4 = F.relu(h4, True)

        # RepeatEnd
        h2 = h4

    # MaxPooling_2 -> 64,13,13
    h4 = F.max_pooling(h4, (2,2), (2,2), True)

    # RepeatStart_2
    for i in range(2):

        # Convolution_8
        with nn.parameter_scope('Convolution_8_[' + str(i) + ']'):
            h5 = PF.convolution(h4, 64, (3,3), (1,1))
        # BatchNormalization_8
        with nn.parameter_scope('BatchNormalization_8_[' + str(i) + ']'):
            h5 = PF.batch_normalization(h5, (1,), 0.5, 0.01, not test)
        # ReLU_8
        h5 = F.relu(h5, True)
        # Convolution_9
        with nn.parameter_scope('Convolution_9_[' + str(i) + ']'):
            h5 = PF.convolution(h5, 64, (1,1), (0,0))
        # BatchNormalization_9
        with nn.parameter_scope('BatchNormalization_9_[' + str(i) + ']'):
            h5 = PF.batch_normalization(h5, (1,), 0.5, 0.01, not test)
        # ReLU_9
        h5 = F.relu(h5, True)
        # Convolution_10
        with nn.parameter_scope('Convolution_10_[' + str(i) + ']'):
            h5 = PF.convolution(h5, 64, (3,3), (1,1))
        # BatchNormalization_10
        with nn.parameter_scope('BatchNormalization_10_[' + str(i) + ']'):
            h5 = PF.batch_normalization(h5, (1,), 0.5, 0.01, not test)

        # Add2_3 -> 64,13,13
        h6 = F.add2(h5, h4, False)
        # ReLU_10
        h6 = F.relu(h6, True)

        # RepeatEnd_2
        h4 = h6

    # MaxPooling_3 -> 64,6,6
    h6 = F.max_pooling(h6, (2,2), (2,2), True)

    # Convolution_11
    with nn.parameter_scope('Convolution_11'):
        h7 = PF.convolution(h6, 64, (3,3), (1,1))
    # BatchNormalization_11
    with nn.parameter_scope('BatchNormalization_11'):
        h7 = PF.batch_normalization(h7, (1,), 0.5, 0.01, not test)
    # ReLU_11
    h7 = F.relu(h7, True)
    # Convolution_12
    with nn.parameter_scope('Convolution_12'):
        h7 = PF.convolution(h7, 64, (1,1), (0,0))
    # BatchNormalization_12
    with nn.parameter_scope('BatchNormalization_12'):
        h7 = PF.batch_normalization(h7, (1,), 0.5, 0.01, not test)
    # ReLU_12
    h7 = F.relu(h7, True)
    # Convolution_13
    with nn.parameter_scope('Convolution_13'):
        h7 = PF.convolution(h7, 64, (3,3), (1,1))
    # BatchNormalization_13
    with nn.parameter_scope('BatchNormalization_13'):
        h7 = PF.batch_normalization(h7, (1,), 0.5, 0.01, not test)

    # Add2_4 -> 64,6,6
    h8 = F.add2(h7, h6, False)
    # ReLU_13
    h8 = F.relu(h8, True)

    # AveragePooling -> 64,1,1
    h8 = F.average_pooling(h8, (4,4), (4,4), True)

    # Affine -> 10
    with nn.parameter_scope('Affine'):
        h8 = PF.affine(h8, (10,))
    # Softmax
    h8 = F.softmax(h8)
    # CategoricalCrossEntropy -> 1
    h8 = F.categorical_cross_entropy(h8, y)
    return h8

#################################################
# Build the neural network
if __name__ == '__main__':
    # Load the model parameters
    nn.clear_parameters()
    nn.load_parameters('./parameters.h5')
    input_data = data_iterator_csv_dataset('./dataset.csv', 1, False)
    Cdata, Clabel = input_data.next()
    # DEBUG
    # InputData: CSV data from the data file
    print(" Cdata:{}".format(Cdata))
    # DEBUG
    # Label: the y label (ground truth) from the dataset file
    # Not needed for inference, but kept to match NNConsole
    print("Clabel:{}".format(Clabel))
    x = nn.Variable(Cdata.shape)
    t = nn.Variable(Clabel.shape)
    y = network(x, t)
    # DEBUG
    # print("y:{}".format(y.d.take(0)))
    # print("shape:{}".format(y.d.shape))

    print(nn.get_parameters(grad_only=False))
    #x = nn.Variable((1, 50, 50))
    # y = network(x, t)
    # x.d = (input image)
    #y.forward()
    #print(y.d)


    for i in range(input_data.size):
        x.d, t.d = input_data.next()
        y.forward()

        # DEBUG
        print("Input data: {}".format(x.d))
        print("Label: {}".format(t.d.take(0)))

        # NOTE: adjust the output format to the inference result
        # Here: the probability (0.0-1.0) that the binary (0,1) value is '1'
        #    print("  Probability: {}".format(y.d.all(axis=1)))
        print("  Probability: {} %".format(y.d.take(0) * 100.0))
===========================================================================


③ I created dataset.csv with the following content
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
x:file,y:result
s.csv,1
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
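
For reference, each cell of s.csv below becomes one pixel value; loaded on its own, the file is a plain 50-row by 50-column matrix (a quick NumPy sketch):

import numpy as np

# s.csv holds 50 comma-separated rows of 50 values each
sample = np.loadtxt('s.csv', delimiter=',')
print(sample.shape)  # (50, 50)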


④ I created s.csv with the following content
_______________________________________________________________________________
0.011764706,0.011764706,0.011764706,0.007843137,0.070588235,0.121568627,0.094117647,0.082352941,0.066666667,0.066666667,0.066666667,0.078431373,0.101960784,0.105882353,0.105882353,0.105882353,0.105882353,0.090196078,0.078431373,0.078431373,0.082352941,0.094117647,0.121568627,0.133333333,0.121568627,0.11372549,0.11372549,0.11372549,0.121568627,0.105882353,0.090196078,0.082352941,0.078431373,0.082352941,0.090196078,0.090196078,0.082352941,0.090196078,0.101960784,0.11372549,0.133333333,0.145098039,0.149019608,0.137254902,0.121568627,0.105882353,0.094117647,0.078431373,0.078431373,0.078431373
0.011764706,0.011764706,0.007843137,0.094117647,0.184313725,0.192156863,0.196078431,0.180392157,0.160784314,0.160784314,0.149019608,0.160784314,0.192156863,0.203921569,0.219607843,0.215686275,0.203921569,0.192156863,0.168627451,0.160784314,0.17254902,0.215686275,0.262745098,0.28627451,0.258823529,0.247058824,0.262745098,0.294117647,0.305882353,0.309803922,0.317647059,0.321568627,0.37254902,0.443137255,0.466666667,0.498039216,0.549019608,0.639215686,0.717647059,0.796078431,0.850980392,0.874509804,0.898039216,0.905882353,0.898039216,0.898039216,0.894117647,0.88627451,0.88627451,0.894117647
0.011764706,0.007843137,0.101960784,0.180392157,0.160784314,0.184313725,0.215686275,0.219607843,0.207843137,0.215686275,0.215686275,0.203921569,0.219607843,0.247058824,0.250980392,0.239215686,0.219607843,0.219607843,0.247058824,0.247058824,0.247058824,0.258823529,0.305882353,0.337254902,0.298039216,0.298039216,0.294117647,0.341176471,0.423529412,0.533333333,0.603921569,0.678431373,0.764705882,0.831372549,0.854901961,0.874509804,0.905882353,0.945098039,0.956862745,0.964705882,0.956862745,0.956862745,0.952941176,0.952941176,0.952941176,0.952941176,0.952941176,0.952941176,0.956862745,0.941176471
0.019607843,0.121568627,0.184313725,0.184313725,0.184313725,0.219607843,0.250980392,0.262745098,0.250980392,0.258823529,0.270588235,0.270588235,0.28627451,0.309803922,0.321568627,0.305882353,0.270588235,0.274509804,0.329411765,0.360784314,0.384313725,0.419607843,0.509803922,0.603921569,0.639215686,0.658823529,0.662745098,0.694117647,0.792156863,0.878431373,0.921568627,0.952941176,0.952941176,0.952941176,0.952941176,0.952941176,0.945098039,0.941176471,0.933333333,0.933333333,0.933333333,0.933333333,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.945098039,0.866666667
0.023529412,0.180392157,0.207843137,0.235294118,0.239215686,0.274509804,0.321568627,0.329411765,0.305882353,0.305882353,0.341176471,0.360784314,0.396078431,0.454901961,0.501960784,0.537254902,0.537254902,0.564705882,0.639215686,0.717647059,0.780392157,0.831372549,0.88627451,0.941176471,0.956862745,0.956862745,0.956862745,0.956862745,0.956862745,0.952941176,0.941176471,0.941176471,0.933333333,0.933333333,0.933333333,0.933333333,0.941176471,0.941176471,0.933333333,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.933333333,0.952941176,0.843137255
0.031372549,0.262745098,0.349019608,0.352941176,0.321568627,0.37254902,0.435294118,0.48627451,0.501960784,0.556862745,0.635294118,0.690196078,0.749019608,0.819607843,0.866666667,0.905882353,0.917647059,0.921568627,0.952941176,0.956862745,0.956862745,0.956862745,0.952941176,0.941176471,0.933333333,0.933333333,0.933333333,0.933333333,0.933333333,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.917647059,0.929411765,0.952941176,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.933333333,0.956862745,0.82745098
0.031372549,0.364705882,0.498039216,0.48627451,0.474509804,0.576470588,0.729411765,0.831372549,0.878431373,0.905882353,0.941176471,0.956862745,0.956862745,0.952941176,0.952941176,0.945098039,0.945098039,0.945098039,0.941176471,0.933333333,0.933333333,0.933333333,0.933333333,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.933333333,0.752941176,0.556862745,0.537254902,0.717647059,0.929411765,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.933333333,0.956862745,0.807843137
0.031372549,0.352941176,0.490196078,0.521568627,0.537254902,0.694117647,0.921568627,0.956862745,0.952941176,0.945098039,0.941176471,0.933333333,0.933333333,0.933333333,0.933333333,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.933333333,0.952941176,0.854901961,0.498039216,0.219607843,0.137254902,0.317647059,0.760784314,0.956862745,0.933333333,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.933333333,0.956862745,0.796078431
0.023529412,0.294117647,0.450980392,0.51372549,0.537254902,0.647058824,0.878431373,0.945098039,0.933333333,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.933333333,0.952941176,0.82745098,0.435294118,0.203921569,0.101960784,0.160784314,0.603921569,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.933333333,0.964705882,0.780392157
0.023529412,0.22745098,0.388235294,0.498039216,0.521568627,0.588235294,0.803921569,0.952941176,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.933333333,0.956862745,0.780392157,0.411764706,0.239215686,0.137254902,0.133333333,0.509803922,0.929411765,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.933333333,0.964705882,0.752941176
0.019607843,0.180392157,0.349019608,0.478431373,0.521568627,0.564705882,0.749019608,0.933333333,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.933333333,0.898039216,0.921568627,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.933333333,0.956862745,0.862745098,0.568627451,0.321568627,0.207843137,0.137254902,0.137254902,0.48627451,0.917647059,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.933333333,0.968627451,0.729411765
0.019607843,0.180392157,0.329411765,0.454901961,0.533333333,0.576470588,0.690196078,0.905882353,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.952941176,0.854901961,0.615686275,0.658823529,0.905882353,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.952941176,0.843137255,0.537254902,0.28627451,0.192156863,0.180392157,0.17254902,0.180392157,0.474509804,0.894117647,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.933333333,0.976470588,0.701960784
0.019607843,0.180392157,0.28627451,0.411764706,0.51372549,0.549019608,0.611764706,0.815686275,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.952941176,0.866666667,0.525490196,0.419607843,0.737254902,0.952941176,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.850980392,0.678431373,0.498039216,0.309803922,0.215686275,0.196078431,0.192156863,0.196078431,0.450980392,0.874509804,0.952941176,0.933333333,0.941176471,0.941176471,0.941176471,0.941176471,0.933333333,0.976470588,0.678431373
0.023529412,0.180392157,0.250980392,0.364705882,0.490196078,0.537254902,0.568627451,0.682352941,0.878431373,0.952941176,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.945098039,0.705882353,0.431372549,0.611764706,0.921568627,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.945098039,0.862745098,0.764705882,0.807843137,0.807843137,0.611764706,0.352941176,0.215686275,0.137254902,0.160784314,0.498039216,0.909803922,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.929411765,0.980392157,0.658823529
0.023529412,0.17254902,0.239215686,0.329411765,0.462745098,0.549019608,0.576470588,0.6,0.77254902,0.933333333,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.933333333,0.952941176,0.874509804,0.537254902,0.545098039,0.874509804,0.952941176,0.933333333,0.941176471,0.941176471,0.945098039,0.909803922,0.850980392,0.898039216,0.945098039,0.964705882,0.854901961,0.474509804,0.184313725,0.078431373,0.121568627,0.556862745,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.929411765,0.980392157,0.635294118
0.031372549,0.184313725,0.219607843,0.282352941,0.419607843,0.525490196,0.549019608,0.580392157,0.717647059,0.921568627,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.945098039,0.933333333,0.639215686,0.51372549,0.839215686,0.956862745,0.933333333,0.941176471,0.945098039,0.909803922,0.866666667,0.933333333,0.945098039,0.933333333,0.941176471,0.909803922,0.525490196,0.192156863,0.090196078,0.145098039,0.580392157,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.929411765,0.988235294,0.615686275
0.047058824,0.192156863,0.203921569,0.235294118,0.37254902,0.509803922,0.545098039,0.6,0.717647059,0.909803922,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.952941176,0.725490196,0.545098039,0.82745098,0.956862745,0.933333333,0.941176471,0.941176471,0.898039216,0.917647059,0.945098039,0.941176471,0.941176471,0.945098039,0.917647059,0.525490196,0.196078431,0.105882353,0.192156863,0.635294118,0.945098039,0.933333333,0.933333333,0.933333333,0.933333333,0.933333333,0.941176471,0.929411765,0.988235294,0.588235294
0.058823529,0.207843137,0.203921569,0.219607843,0.317647059,0.450980392,0.525490196,0.588235294,0.682352941,0.854901961,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.952941176,0.82745098,0.647058824,0.843137255,0.952941176,0.933333333,0.945098039,0.945098039,0.941176471,0.945098039,0.941176471,0.941176471,0.933333333,0.952941176,0.88627451,0.466666667,0.180392157,0.121568627,0.196078431,0.592156863,0.952941176,0.956862745,0.956862745,0.956862745,0.956862745,0.952941176,0.941176471,0.929411765,0.992156863,0.564705882
0.058823529,0.192156863,0.192156863,0.203921569,0.239215686,0.337254902,0.462745098,0.576470588,0.635294118,0.77254902,0.933333333,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.933333333,0.933333333,0.952941176,0.874509804,0.784313725,0.909803922,0.941176471,0.941176471,0.929411765,0.854901961,0.866666667,0.929411765,0.952941176,0.952941176,0.945098039,0.941176471,0.749019608,0.352941176,0.168627451,0.133333333,0.17254902,0.4,0.737254902,0.854901961,0.819607843,0.803921569,0.796078431,0.866666667,0.941176471,0.929411765,0.992156863,0.533333333
0.070588235,0.184313725,0.17254902,0.180392157,0.196078431,0.270588235,0.407843137,0.549019608,0.6,0.690196078,0.894117647,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.909803922,0.88627451,0.874509804,0.77254902,0.760784314,0.909803922,0.952941176,0.952941176,0.850980392,0.639215686,0.576470588,0.615686275,0.658823529,0.725490196,0.796078431,0.807843137,0.639215686,0.411764706,0.305882353,0.298039216,0.337254902,0.411764706,0.533333333,0.51372549,0.384313725,0.352941176,0.337254902,0.545098039,0.909803922,0.933333333,1,0.51372549
0.078431373,0.184313725,0.160784314,0.149019608,0.168627451,0.203921569,0.321568627,0.48627451,0.533333333,0.603921569,0.839215686,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.933333333,0.905882353,0.815686275,0.741176471,0.796078431,0.878431373,0.929411765,0.929411765,0.796078431,0.611764706,0.48627451,0.435294118,0.521568627,0.662745098,0.784313725,0.784313725,0.741176471,0.737254902,0.760784314,0.807843137,0.82745098,0.839215686,0.729411765,0.51372549,0.407843137,0.384313725,0.611764706,0.929411765,0.929411765,1,0.51372549
0.070588235,0.160784314,0.145098039,0.137254902,0.145098039,0.160784314,0.262745098,0.411764706,0.474509804,0.545098039,0.760784314,0.933333333,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.945098039,0.952941176,0.917647059,0.839215686,0.717647059,0.752941176,0.929411765,0.964705882,0.921568627,0.729411765,0.51372549,0.462745098,0.48627451,0.533333333,0.615686275,0.729411765,0.839215686,0.917647059,0.968627451,0.976470588,0.976470588,0.968627451,0.843137255,0.729411765,0.717647059,0.878431373,0.952941176,0.921568627,1,0.51372549
0.070588235,0.160784314,0.149019608,0.137254902,0.137254902,0.160784314,0.247058824,0.376470588,0.466666667,0.533333333,0.690196078,0.909803922,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.945098039,0.952941176,0.807843137,0.647058824,0.752941176,0.933333333,0.952941176,0.866666667,0.690196078,0.537254902,0.384313725,0.207843137,0.156862745,0.247058824,0.349019608,0.419607843,0.537254902,0.639215686,0.662745098,0.658823529,0.6,0.556862745,0.639215686,0.894117647,0.945098039,0.921568627,1,0.51372549
0.090196078,0.184313725,0.168627451,0.149019608,0.137254902,0.149019608,0.203921569,0.309803922,0.435294118,0.537254902,0.635294118,0.839215686,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.945098039,0.909803922,0.725490196,0.650980392,0.803921569,0.917647059,0.929411765,0.909803922,0.854901961,0.650980392,0.294117647,0.070588235,0.043137255,0.043137255,0.058823529,0.215686275,0.407843137,0.407843137,0.321568627,0.262745098,0.250980392,0.407843137,0.796078431,0.956862745,0.921568627,1,0.51372549
0.101960784,0.184313725,0.156862745,0.156862745,0.160784314,0.156862745,0.160784314,0.22745098,0.384313725,0.525490196,0.6,0.749019608,0.933333333,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.905882353,0.780392157,0.752941176,0.831372549,0.894117647,0.905882353,0.878431373,0.764705882,0.501960784,0.250980392,0.082352941,0.011764706,0.058823529,0.419607843,0.792156863,0.803921569,0.678431373,0.568627451,0.533333333,0.678431373,0.898039216,0.945098039,0.921568627,1,0.51372549
0.101960784,0.192156863,0.160784314,0.149019608,0.160784314,0.156862745,0.149019608,0.207843137,0.360784314,0.521568627,0.603921569,0.71372549,0.905882353,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.952941176,0.878431373,0.784313725,0.780392157,0.725490196,0.564705882,0.509803922,0.670588235,0.77254902,0.650980392,0.329411765,0.070588235,0.070588235,0.525490196,0.952941176,0.964705882,0.964705882,0.952941176,0.941176471,0.956862745,0.945098039,0.941176471,0.929411765,1,0.51372549
0.094117647,0.192156863,0.160784314,0.149019608,0.160784314,0.168627451,0.168627451,0.203921569,0.309803922,0.474509804,0.588235294,0.662745098,0.839215686,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.945098039,0.82745098,0.71372549,0.752941176,0.690196078,0.407843137,0.294117647,0.635294118,0.952941176,0.933333333,0.62745098,0.196078431,0.070588235,0.419607843,0.905882353,0.941176471,0.933333333,0.941176471,0.941176471,0.933333333,0.941176471,0.941176471,0.929411765,1,0.51372549
0.078431373,0.160784314,0.156862745,0.149019608,0.168627451,0.17254902,0.180392157,0.196078431,0.258823529,0.388235294,0.521568627,0.6,0.752941176,0.933333333,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.952941176,0.82745098,0.603921569,0.658823529,0.682352941,0.454901961,0.270588235,0.525490196,0.878431373,0.952941176,0.701960784,0.250980392,0.047058824,0.298039216,0.866666667,0.952941176,0.933333333,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.929411765,1,0.51372549
0.078431373,0.156862745,0.160784314,0.160784314,0.168627451,0.184313725,0.192156863,0.196078431,0.22745098,0.329411765,0.474509804,0.568627451,0.694117647,0.905882353,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.945098039,0.917647059,0.635294118,0.533333333,0.588235294,0.435294118,0.219607843,0.298039216,0.639215686,0.839215686,0.658823529,0.258823529,0.023529412,0.168627451,0.752941176,0.964705882,0.933333333,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.929411765,1,0.51372549
0.082352941,0.168627451,0.149019608,0.149019608,0.168627451,0.184313725,0.184313725,0.196078431,0.215686275,0.274509804,0.411764706,0.521568627,0.6,0.819607843,0.952941176,0.933333333,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.952941176,0.784313725,0.568627451,0.603921569,0.537254902,0.298039216,0.219607843,0.549019608,0.82745098,0.729411765,0.364705882,0.082352941,0.11372549,0.611764706,0.952941176,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.929411765,1,0.51372549
0.082352941,0.168627451,0.156862745,0.156862745,0.168627451,0.180392157,0.184313725,0.196078431,0.168627451,0.149019608,0.270588235,0.443137255,0.51372549,0.690196078,0.921568627,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.945098039,0.894117647,0.670588235,0.647058824,0.603921569,0.352941176,0.22745098,0.537254902,0.862745098,0.815686275,0.474509804,0.160784314,0.125490196,0.533333333,0.933333333,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.929411765,0.992156863,0.498039216
0.082352941,0.156862745,0.168627451,0.168627451,0.17254902,0.184313725,0.184313725,0.168627451,0.133333333,0.11372549,0.219607843,0.411764706,0.48627451,0.556862745,0.803921569,0.952941176,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.933333333,0.752941176,0.62745098,0.568627451,0.329411765,0.192156863,0.431372549,0.741176471,0.717647059,0.411764706,0.149019608,0.105882353,0.48627451,0.929411765,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.929411765,0.988235294,0.474509804
0.078431373,0.156862745,0.160784314,0.160784314,0.168627451,0.17254902,0.160784314,0.137254902,0.149019608,0.184313725,0.258823529,0.396078431,0.474509804,0.498039216,0.701960784,0.933333333,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.952941176,0.866666667,0.678431373,0.580392157,0.364705882,0.207843137,0.407843137,0.682352941,0.670588235,0.4,0.149019608,0.090196078,0.407843137,0.909803922,0.945098039,0.933333333,0.941176471,0.941176471,0.941176471,0.941176471,0.929411765,0.980392157,0.454901961
0.078431373,0.145098039,0.145098039,0.145098039,0.156862745,0.168627451,0.160784314,0.156862745,0.180392157,0.203921569,0.22745098,0.309803922,0.407843137,0.466666667,0.639215686,0.909803922,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.945098039,0.905882353,0.705882353,0.623529412,0.498039216,0.337254902,0.454901961,0.670588235,0.670588235,0.431372549,0.149019608,0.054901961,0.305882353,0.850980392,0.952941176,0.933333333,0.941176471,0.941176471,0.941176471,0.941176471,0.933333333,0.976470588,0.423529412
0.078431373,0.145098039,0.133333333,0.137254902,0.160784314,0.180392157,0.196078431,0.192156863,0.184313725,0.192156863,0.203921569,0.247058824,0.352941176,0.443137255,0.576470588,0.839215686,0.952941176,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.933333333,0.850980392,0.839215686,0.843137255,0.741176471,0.694117647,0.725490196,0.639215686,0.376470588,0.11372549,0.066666667,0.384313725,0.878431373,0.945098039,0.933333333,0.941176471,0.941176471,0.941176471,0.941176471,0.933333333,0.968627451,0.396078431
0.082352941,0.149019608,0.137254902,0.145098039,0.17254902,0.203921569,0.207843137,0.192156863,0.180392157,0.184313725,0.203921569,0.247058824,0.341176471,0.443137255,0.545098039,0.752941176,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.945098039,0.945098039,0.952941176,0.964705882,0.952941176,0.929411765,0.807843137,0.568627451,0.384313725,0.431372549,0.741176471,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.933333333,0.964705882,0.37254902
0.078431373,0.160784314,0.149019608,0.149019608,0.168627451,0.203921569,0.196078431,0.168627451,0.160784314,0.184313725,0.203921569,0.22745098,0.294117647,0.423529412,0.549019608,0.694117647,0.905882353,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.945098039,0.945098039,0.945098039,0.894117647,0.831372549,0.82745098,0.803921569,0.764705882,0.729411765,0.815686275,0.956862745,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.956862745,0.341176471
0.019607843,0.082352941,0.160784314,0.145098039,0.137254902,0.160784314,0.168627451,0.137254902,0.145098039,0.184313725,0.207843137,0.215686275,0.250980392,0.376470588,0.509803922,0.6,0.792156863,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.917647059,0.862745098,0.815686275,0.729411765,0.635294118,0.545098039,0.454901961,0.443137255,0.450980392,0.537254902,0.780392157,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.945098039,0.329411765
0.007843137,0.007843137,0.070588235,0.156862745,0.137254902,0.137254902,0.145098039,0.133333333,0.156862745,0.184313725,0.192156863,0.203921569,0.235294118,0.317647059,0.431372549,0.533333333,0.694117647,0.909803922,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.929411765,0.764705882,0.62745098,0.603921569,0.623529412,0.580392157,0.423529412,0.235294118,0.145098039,0.105882353,0.180392157,0.509803922,0.917647059,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.956862745,0.294117647
0.011764706,0.011764706,0.007843137,0.058823529,0.160784314,0.17254902,0.168627451,0.168627451,0.196078431,0.22745098,0.196078431,0.180392157,0.196078431,0.250980392,0.384313725,0.521568627,0.678431373,0.909803922,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.905882353,0.82745098,0.780392157,0.690196078,0.525490196,0.329411765,0.196078431,0.180392157,0.262745098,0.407843137,0.729411765,0.945098039,0.941176471,0.941176471,0.941176471,0.941176471,0.933333333,0.945098039,0.964705882,0.784313725,0.137254902
0.011764706,0.011764706,0.011764706,0,0.066666667,0.192156863,0.184313725,0.160784314,0.180392157,0.215686275,0.215686275,0.180392157,0.160784314,0.196078431,0.329411765,0.462745098,0.564705882,0.815686275,0.956862745,0.933333333,0.933333333,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.945098039,0.956862745,0.956862745,0.929411765,0.792156863,0.592156863,0.549019608,0.647058824,0.780392157,0.88627451,0.952941176,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.956862745,0.898039216,0.647058824,0.298039216,0.019607843
0.011764706,0.011764706,0.011764706,0.011764706,0.007843137,0.054901961,0.168627451,0.168627451,0.137254902,0.17254902,0.184313725,0.17254902,0.160784314,0.160784314,0.215686275,0.28627451,0.317647059,0.509803922,0.862745098,0.964705882,0.952941176,0.933333333,0.933333333,0.933333333,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.933333333,0.933333333,0.945098039,0.956862745,0.945098039,0.945098039,0.964705882,0.964705882,0.945098039,0.941176471,0.941176471,0.941176471,0.933333333,0.952941176,0.952941176,0.82745098,0.501960784,0.184313725,0.070588235,0.019607843
0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.007843137,0.031372549,0.133333333,0.149019608,0.137254902,0.145098039,0.156862745,0.160784314,0.149019608,0.137254902,0.168627451,0.196078431,0.270588235,0.498039216,0.760784314,0.894117647,0.952941176,0.964705882,0.952941176,0.941176471,0.933333333,0.933333333,0.933333333,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.933333333,0.941176471,0.941176471,0.933333333,0.933333333,0.933333333,0.941176471,0.933333333,0.941176471,0.956862745,0.898039216,0.678431373,0.37254902,0.156862745,0.082352941,0.101960784,0.023529412
0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.007843137,0.019607843,0.11372549,0.149019608,0.125490196,0.137254902,0.149019608,0.160784314,0.145098039,0.156862745,0.184313725,0.184313725,0.196078431,0.239215686,0.419607843,0.650980392,0.815686275,0.894117647,0.929411765,0.952941176,0.956862745,0.952941176,0.933333333,0.933333333,0.933333333,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.941176471,0.956862745,0.929411765,0.780392157,0.51372549,0.270588235,0.156862745,0.133333333,0.125490196,0.101960784,0.019607843
0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.007843137,0.019607843,0.105882353,0.168627451,0.156862745,0.17254902,0.180392157,0.17254902,0.160784314,0.168627451,0.156862745,0.145098039,0.133333333,0.168627451,0.239215686,0.305882353,0.4,0.490196078,0.611764706,0.784313725,0.894117647,0.956862745,0.968627451,0.956862745,0.941176471,0.933333333,0.933333333,0.933333333,0.933333333,0.933333333,0.933333333,0.945098039,0.952941176,0.839215686,0.611764706,0.4,0.258823529,0.196078431,0.184313725,0.168627451,0.145098039,0.105882353,0.019607843
0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.121568627,0.192156863,0.192156863,0.192156863,0.17254902,0.168627451,0.17254902,0.156862745,0.160784314,0.149019608,0.149019608,0.17254902,0.149019608,0.149019608,0.180392157,0.196078431,0.247058824,0.37254902,0.51372549,0.670588235,0.850980392,0.941176471,0.964705882,0.964705882,0.956862745,0.952941176,0.952941176,0.968627451,0.929411765,0.694117647,0.411764706,0.282352941,0.215686275,0.196078431,0.196078431,0.180392157,0.160784314,0.160784314,0.137254902,0.019607843
0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.007843137,0.090196078,0.192156863,0.184313725,0.17254902,0.184313725,0.184313725,0.180392157,0.17254902,0.149019608,0.125490196,0.145098039,0.149019608,0.168627451,0.219607843,0.180392157,0.105882353,0.105882353,0.121568627,0.168627451,0.317647059,0.474509804,0.650980392,0.819607843,0.850980392,0.862745098,0.866666667,0.796078431,0.525490196,0.282352941,0.219607843,0.196078431,0.168627451,0.160784314,0.160784314,0.145098039,0.149019608,0.160784314,0.156862745,0.023529412
0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0,0.078431373,0.192156863,0.184313725,0.17254902,0.17254902,0.184313725,0.180392157,0.156862745,0.145098039,0.145098039,0.145098039,0.156862745,0.168627451,0.156862745,0.137254902,0.149019608,0.168627451,0.168627451,0.156862745,0.168627451,0.22745098,0.317647059,0.360784314,0.376470588,0.360784314,0.262745098,0.180392157,0.17254902,0.192156863,0.184313725,0.168627451,0.137254902,0.133333333,0.145098039,0.145098039,0.149019608,0.145098039,0.023529412
0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.007843137,0.066666667,0.168627451,0.156862745,0.145098039,0.168627451,0.184313725,0.180392157,0.168627451,0.149019608,0.149019608,0.160784314,0.17254902,0.168627451,0.149019608,0.180392157,0.207843137,0.219607843,0.250980392,0.247058824,0.219607843,0.235294118,0.235294118,0.215686275,0.196078431,0.17254902,0.180392157,0.180392157,0.17254902,0.180392157,0.184313725,0.160784314,0.160784314,0.180392157,0.168627451,0.149019608,0.125490196,0.019607843
0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.011764706,0.007843137,0.023529412,0.031372549,0.023529412,0.035294118,0.054901961,0.066666667,0.058823529,0.058823529,0.070588235,0.082352941,0.101960784,0.101960784,0.094117647,0.101960784,0.105882353,0.11372549,0.133333333,0.133333333,0.121568627,0.133333333,0.137254902,0.133333333,0.121568627,0.105882353,0.094117647,0.094117647,0.090196078,0.094117647,0.094117647,0.090196078,0.101960784,0.11372549,0.11372549,0.11372549,0.101960784,0.019607843
_______________________________________________________________________________

⑤ When I run ②, the following error occurs at line 202, "y = network(x, t)".
File "C:\Users\taro\Anaconda3\lib\site-packages\nnabla\parameter.py", line 161, in get_parameter_or_create
assert param.shape == tuple(shape)
AssertionError

Debugging shows that param.shape is (64,1,5,5) while tuple(shape) is (64,50.0,5,5); these are not the expected values, which seems to be why the assertion raises the exception.
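
A minimal way to reproduce the check (a sketch, using the same files and variable names as in ②); the second entry of Cdata.shape appears to be what ends up as the input-channel count in the failing weight shape:

from nnabla.utils.data_iterator import data_iterator_csv_dataset

# The first Convolution infers its input channel count from x.shape[1],
# so that value ends up in the weight shape tested by the assertion.
input_data = data_iterator_csv_dataset('./dataset.csv', 1, False)
Cdata, Clabel = input_data.next()
print("Cdata.shape :", Cdata.shape)
print("Clabel.shape:", Clabel.shape)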

I have looked into this in various ways, but my skills are not up to it and I am stuck.
I am posting in the hope that someone can point me in the right direction.

Thank you very much in advance.


小林由幸

Jan 9, 2018, 12:02:15 AM
to Neural Network Console Users (JP)
I have looked over your code.

The actual error is most likely raised not at line 202 but somewhere inside the network function;
are you able to identify which line?

Below are a few comments on points I noticed.

* On the network code used at inference time
Lines 181-182 of ② are not needed at inference time, so please comment them out.
You can also obtain the code without lines 181-182 by deleting CategoricalCrossEntropy
in the Neural Network Console EDIT tab and then exporting the Python Code again.
With lines 181-182 removed, the forward computation of y outputs the score for each class.
With lines 181-182 included, the forward computation of y outputs the value of the loss function.
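
For illustration, the tail of the network function with those two lines commented out would look like this (a sketch; everything above Affine stays unchanged, and the y argument then goes unused):

    # Affine -> 10
    with nn.parameter_scope('Affine'):
        h8 = PF.affine(h8, (10,))
    # Softmax
    h8 = F.softmax(h8)
    # CategoricalCrossEntropy -> 1  (not needed at inference time)
    # h8 = F.categorical_cross_entropy(h8, y)
    return h8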

* On feeding data with data_iterator_csv_dataset
Do I understand correctly that training used 50x50-pixel monochrome images as input,
while at inference time you are feeding a 50-row by 50-column matrix?
When a matrix is fed in, the exported code, which expects a 3-dimensional 1,50,50 input,
receives a 2-dimensional 50,50 array instead.
Possible fixes are to provide the data in dataset.csv as images, just as during training,
or to resize it to batch size,1,50,50 after loading, as in the sketch below.
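
A sketch of the latter option, keeping data_iterator_csv_dataset but giving x the expected 4-dimensional shape and reshaping each batch before assigning it (variable names follow ②; this assumes the iterator returns Cdata as batch,50,50):

x = nn.Variable((1, 1, 50, 50))    # batch size, channels, height, width
t = nn.Variable(Clabel.shape)
y = network(x, t)

for i in range(input_data.size):
    Cdata, Clabel = input_data.next()
    x.d = Cdata.reshape(1, 1, 50, 50)  # insert the missing channel axis
    t.d = Clabel
    y.forward()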

More simply, you can avoid data_iterator_csv_dataset altogether: give x the shape 1,1,50,50
(batch size 1, one channel, 50x50 image) and assign the loaded image
scaled by 1/255 to it.

import nnabla as nn
from scipy.misc import imread

# Batch size 1, 1 channel, 28x28 (use (1, 1, 50, 50) for your 50x50 model)
x = nn.Variable((1, 1, 28, 28))
x.d = imread("C:/neural_network_console/samples/sample_dataset/MNIST/validation/0/3.png").reshape(1, 1, 28, 28) / 255.0
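
From there, inference itself is just a forward pass. A sketch, assuming the parameters are loaded, x is sized for the 50x50 model, and CategoricalCrossEntropy has been removed from network as described above (the image file name is a placeholder):

nn.clear_parameters()
nn.load_parameters('./parameters.h5')

x = nn.Variable((1, 1, 50, 50))
x.d = imread("your_image.png").reshape(1, 1, 50, 50) / 255.0  # placeholder file name

t = nn.Variable((1, 1))        # dummy label; unused once the loss is removed
y = network(x, t, test=True)
y.forward()
print(y.d)                     # scores for each of the 10 classes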

