Theano_buildbot - Build # 173 - SUCCESS - Fail=0 Tot=82810 Skip=6129


jenkins...@mila.quebec

Oct 16, 2018, 3:24:58 PM
to theano-...@googlegroups.com
Theano_buildbot - Build # 173 - SUCCESS -
Tot=82810 Skip=6129 Fail=0

See https://jenkins.mila.quebec/job/Theano_buildbot/173/ to view the results.

Failed tests:
All tests passed

jenkins...@mila.quebec

Oct 26, 2018, 8:38:00 AM
to theano-...@googlegroups.com
Theano_buildbot - Build # 174 - Unstable -
Tot=83090 Skip=6129 Fail=6

See https://jenkins.mila.quebec/job/Theano_buildbot/174/ to view the results.

Failed tests:
6 tests failed.
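All six failures raise the same MethodNotDefined error from GpuContiguous.perform, and the error text itself suggests the likely cause: the tests were run with mode=FAST_COMPILE, which skips the GPU optimizations that would replace this node. A minimal sketch of re-running one of the failing test files with the flags the log recommends (optimizer=fast_compile instead of mode=FAST_COMPILE, plus the verbosity flags mentioned in the HINT lines) — the path matches the Jenkins workspace in the traces; adjust it for a local checkout:

```shell
# Hedged reproduction sketch, not the buildbot's actual invocation.
# optimizer=fast_compile keeps fast compilation while still running the
# GPU-transfer optimizations; exception_verbosity=high and traceback.limit
# are the flags suggested in the log's HINT/Backtrace lines.
THEANO_FLAGS='device=cuda,optimizer=fast_compile,exception_verbosity=high,traceback.limit=20' \
  python -m nose theano/gpuarray/tests/test_linalg.py
```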
FAILED: theano.gpuarray.tests.test_linalg.Theano_python3 / Theano python3 / test_cholesky_grad

Error Message:
('perform', <class 'theano.gpuarray.basic_ops.GpuContiguous'>, 'GpuContiguous', 'Did you used Theano flags mode=FAST_COMPILE? You can use optimizer=fast_compile instead.')
Apply node that caused the error: GpuContiguous(GpuFromHost<None>.0)
Toposort index: 3
Inputs types: [GpuArrayType<None>(float64, matrix)]
Inputs shapes: [(5, 5)]
Inputs strides: [(40, 8)]
Inputs values: ['not shown']
Outputs clients: [[GpuCholesky{lower=True, inplace=False}(GpuContiguous.0)]]

Backtrace when the node is created(use Theano flag traceback.limit=N to make it longer):
File "/home/jenkins/workspace/Theano_buildbot/theano/tests/unittest_tools.py", line 92, in verify_grad
T.verify_grad(op, pt, n_tests, rng, *args, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gradient.py", line 1728, in verify_grad
o_output = fun(*tensor_pt)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 611, in <lambda>
yield (lambda: utt.verify_grad(lambda r: gpu_cholesky(r.dot(r.T)),
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/linalg.py", line 609, in gpu_cholesky
return GpuCholesky(lower)(A)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 615, in __call__
node = self.make_node(*inputs, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/linalg.py", line 493, in make_node
inp = gpu_contiguous(inp)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 615, in __call__
node = self.make_node(*inputs, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/basic_ops.py", line 1079, in make_node
return Apply(self, [input], [input.type()])

HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint and storage map footprint of this apply node.

Stack Trace:
Traceback (most recent call last):
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/vm.py", line 301, in __call__
thunk()
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 892, in rval
r = p(n, [x[0] for x in i], o)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 778, in perform
"Did you used Theano flags mode=FAST_COMPILE?"
theano.gof.utils.MethodNotDefined: ('perform', <class 'theano.gpuarray.basic_ops.GpuContiguous'>, 'GpuContiguous', 'Did you used Theano flags mode=FAST_COMPILE? You can use optimizer=fast_compile instead.')

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/miniconda/envs/py3k/lib/python3.6/unittest/case.py", line 59, in testPartExecutor
yield
File "/miniconda/envs/py3k/lib/python3.6/unittest/case.py", line 605, in run
testMethod()
File "/miniconda/envs/py3k/lib/python3.6/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 612, in <lambda>
[r], 3, rng))
File "/home/jenkins/workspace/Theano_buildbot/theano/tests/unittest_tools.py", line 92, in verify_grad
T.verify_grad(op, pt, n_tests, rng, *args, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gradient.py", line 1738, in verify_grad
o_fn_out = o_fn(*[p.copy() for p in pt])
File "/home/jenkins/workspace/Theano_buildbot/theano/compile/function_module.py", line 903, in __call__
self.fn() if output_subset is None else\
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/vm.py", line 305, in __call__
link.raise_with_op(node, thunk)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/link.py", line 325, in raise_with_op
reraise(exc_type, exc_value, exc_trace)
File "/miniconda/envs/py3k/lib/python3.6/site-packages/six.py", line 692, in reraise
raise value.with_traceback(tb)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/vm.py", line 301, in __call__
thunk()
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 892, in rval
r = p(n, [x[0] for x in i], o)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 778, in perform
"Did you used Theano flags mode=FAST_COMPILE?"
theano.gof.utils.MethodNotDefined: ('perform', <class 'theano.gpuarray.basic_ops.GpuContiguous'>, 'GpuContiguous', 'Did you used Theano flags mode=FAST_COMPILE? You can use optimizer=fast_compile instead.')
Apply node that caused the error: GpuContiguous(GpuFromHost<None>.0)
Toposort index: 3
Inputs types: [GpuArrayType<None>(float64, matrix)]
Inputs shapes: [(5, 5)]
Inputs strides: [(40, 8)]
Inputs values: ['not shown']
Outputs clients: [[GpuCholesky{lower=True, inplace=False}(GpuContiguous.0)]]

Backtrace when the node is created(use Theano flag traceback.limit=N to make it longer):
File "/home/jenkins/workspace/Theano_buildbot/theano/tests/unittest_tools.py", line 92, in verify_grad
T.verify_grad(op, pt, n_tests, rng, *args, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gradient.py", line 1728, in verify_grad
o_output = fun(*tensor_pt)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 611, in <lambda>
yield (lambda: utt.verify_grad(lambda r: gpu_cholesky(r.dot(r.T)),
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/linalg.py", line 609, in gpu_cholesky
return GpuCholesky(lower)(A)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 615, in __call__
node = self.make_node(*inputs, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/linalg.py", line 493, in make_node
inp = gpu_contiguous(inp)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 615, in __call__
node = self.make_node(*inputs, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/basic_ops.py", line 1079, in make_node
return Apply(self, [input], [input.type()])

HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint and storage map footprint of this apply node.


FAILED: theano.gpuarray.tests.test_linalg.Theano_python3 / Theano python3 / test_cholesky_grad

Error Message:
('perform', <class 'theano.gpuarray.basic_ops.GpuContiguous'>, 'GpuContiguous', 'Did you used Theano flags mode=FAST_COMPILE? You can use optimizer=fast_compile instead.')
Apply node that caused the error: GpuContiguous(GpuFromHost<None>.0)
Toposort index: 3
Inputs types: [GpuArrayType<None>(float64, matrix)]
Inputs shapes: [(5, 5)]
Inputs strides: [(40, 8)]
Inputs values: ['not shown']
Outputs clients: [[GpuCholesky{lower=True, inplace=False}(GpuContiguous.0)]]

Backtrace when the node is created(use Theano flag traceback.limit=N to make it longer):
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 615, in <lambda>
[r], 3, rng))
File "/home/jenkins/workspace/Theano_buildbot/theano/tests/unittest_tools.py", line 92, in verify_grad
T.verify_grad(op, pt, n_tests, rng, *args, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gradient.py", line 1728, in verify_grad
o_output = fun(*tensor_pt)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 614, in <lambda>
yield (lambda: utt.verify_grad(lambda r: GpuCholesky(lower=True)(r.dot(r.T)),
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 615, in __call__
node = self.make_node(*inputs, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/linalg.py", line 493, in make_node
inp = gpu_contiguous(inp)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 615, in __call__
node = self.make_node(*inputs, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/basic_ops.py", line 1079, in make_node
return Apply(self, [input], [input.type()])

HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint and storage map footprint of this apply node.

Stack Trace:
Traceback (most recent call last):
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/vm.py", line 301, in __call__
thunk()
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 892, in rval
r = p(n, [x[0] for x in i], o)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 778, in perform
"Did you used Theano flags mode=FAST_COMPILE?"
theano.gof.utils.MethodNotDefined: ('perform', <class 'theano.gpuarray.basic_ops.GpuContiguous'>, 'GpuContiguous', 'Did you used Theano flags mode=FAST_COMPILE? You can use optimizer=fast_compile instead.')

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/miniconda/envs/py3k/lib/python3.6/unittest/case.py", line 59, in testPartExecutor
yield
File "/miniconda/envs/py3k/lib/python3.6/unittest/case.py", line 605, in run
testMethod()
File "/miniconda/envs/py3k/lib/python3.6/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 615, in <lambda>
[r], 3, rng))
File "/home/jenkins/workspace/Theano_buildbot/theano/tests/unittest_tools.py", line 92, in verify_grad
T.verify_grad(op, pt, n_tests, rng, *args, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gradient.py", line 1738, in verify_grad
o_fn_out = o_fn(*[p.copy() for p in pt])
File "/home/jenkins/workspace/Theano_buildbot/theano/compile/function_module.py", line 903, in __call__
self.fn() if output_subset is None else\
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/vm.py", line 305, in __call__
link.raise_with_op(node, thunk)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/link.py", line 325, in raise_with_op
reraise(exc_type, exc_value, exc_trace)
File "/miniconda/envs/py3k/lib/python3.6/site-packages/six.py", line 692, in reraise
raise value.with_traceback(tb)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/vm.py", line 301, in __call__
thunk()
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 892, in rval
r = p(n, [x[0] for x in i], o)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 778, in perform
"Did you used Theano flags mode=FAST_COMPILE?"
theano.gof.utils.MethodNotDefined: ('perform', <class 'theano.gpuarray.basic_ops.GpuContiguous'>, 'GpuContiguous', 'Did you used Theano flags mode=FAST_COMPILE? You can use optimizer=fast_compile instead.')
Apply node that caused the error: GpuContiguous(GpuFromHost<None>.0)
Toposort index: 3
Inputs types: [GpuArrayType<None>(float64, matrix)]
Inputs shapes: [(5, 5)]
Inputs strides: [(40, 8)]
Inputs values: ['not shown']
Outputs clients: [[GpuCholesky{lower=True, inplace=False}(GpuContiguous.0)]]

Backtrace when the node is created(use Theano flag traceback.limit=N to make it longer):
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 615, in <lambda>
[r], 3, rng))
File "/home/jenkins/workspace/Theano_buildbot/theano/tests/unittest_tools.py", line 92, in verify_grad
T.verify_grad(op, pt, n_tests, rng, *args, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gradient.py", line 1728, in verify_grad
o_output = fun(*tensor_pt)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 614, in <lambda>
yield (lambda: utt.verify_grad(lambda r: GpuCholesky(lower=True)(r.dot(r.T)),
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 615, in __call__
node = self.make_node(*inputs, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/linalg.py", line 493, in make_node
inp = gpu_contiguous(inp)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 615, in __call__
node = self.make_node(*inputs, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/basic_ops.py", line 1079, in make_node
return Apply(self, [input], [input.type()])

HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint and storage map footprint of this apply node.


FAILED: theano.gpuarray.tests.test_linalg.Theano_python3 / Theano python3 / test_cholesky_grad

Error Message:
('perform', <class 'theano.gpuarray.basic_ops.GpuContiguous'>, 'GpuContiguous', 'Did you used Theano flags mode=FAST_COMPILE? You can use optimizer=fast_compile instead.')
Apply node that caused the error: GpuContiguous(GpuFromHost<None>.0)
Toposort index: 3
Inputs types: [GpuArrayType<None>(float64, matrix)]
Inputs shapes: [(5, 5)]
Inputs strides: [(40, 8)]
Inputs values: ['not shown']
Outputs clients: [[GpuCholesky{lower=False, inplace=False}(GpuContiguous.0)]]

Backtrace when the node is created(use Theano flag traceback.limit=N to make it longer):
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 618, in <lambda>
[r], 3, rng))
File "/home/jenkins/workspace/Theano_buildbot/theano/tests/unittest_tools.py", line 92, in verify_grad
T.verify_grad(op, pt, n_tests, rng, *args, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gradient.py", line 1728, in verify_grad
o_output = fun(*tensor_pt)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 617, in <lambda>
yield (lambda: utt.verify_grad(lambda r: GpuCholesky(lower=False)(r.dot(r.T)),
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 615, in __call__
node = self.make_node(*inputs, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/linalg.py", line 493, in make_node
inp = gpu_contiguous(inp)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 615, in __call__
node = self.make_node(*inputs, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/basic_ops.py", line 1079, in make_node
return Apply(self, [input], [input.type()])

HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint and storage map footprint of this apply node.

Stack Trace:
Traceback (most recent call last):
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/vm.py", line 301, in __call__
thunk()
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 892, in rval
r = p(n, [x[0] for x in i], o)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 778, in perform
"Did you used Theano flags mode=FAST_COMPILE?"
theano.gof.utils.MethodNotDefined: ('perform', <class 'theano.gpuarray.basic_ops.GpuContiguous'>, 'GpuContiguous', 'Did you used Theano flags mode=FAST_COMPILE? You can use optimizer=fast_compile instead.')

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/miniconda/envs/py3k/lib/python3.6/unittest/case.py", line 59, in testPartExecutor
yield
File "/miniconda/envs/py3k/lib/python3.6/unittest/case.py", line 605, in run
testMethod()
File "/miniconda/envs/py3k/lib/python3.6/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 618, in <lambda>
[r], 3, rng))
File "/home/jenkins/workspace/Theano_buildbot/theano/tests/unittest_tools.py", line 92, in verify_grad
T.verify_grad(op, pt, n_tests, rng, *args, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gradient.py", line 1738, in verify_grad
o_fn_out = o_fn(*[p.copy() for p in pt])
File "/home/jenkins/workspace/Theano_buildbot/theano/compile/function_module.py", line 903, in __call__
self.fn() if output_subset is None else\
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/vm.py", line 305, in __call__
link.raise_with_op(node, thunk)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/link.py", line 325, in raise_with_op
reraise(exc_type, exc_value, exc_trace)
File "/miniconda/envs/py3k/lib/python3.6/site-packages/six.py", line 692, in reraise
raise value.with_traceback(tb)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/vm.py", line 301, in __call__
thunk()
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 892, in rval
r = p(n, [x[0] for x in i], o)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 778, in perform
"Did you used Theano flags mode=FAST_COMPILE?"
theano.gof.utils.MethodNotDefined: ('perform', <class 'theano.gpuarray.basic_ops.GpuContiguous'>, 'GpuContiguous', 'Did you used Theano flags mode=FAST_COMPILE? You can use optimizer=fast_compile instead.')
Apply node that caused the error: GpuContiguous(GpuFromHost<None>.0)
Toposort index: 3
Inputs types: [GpuArrayType<None>(float64, matrix)]
Inputs shapes: [(5, 5)]
Inputs strides: [(40, 8)]
Inputs values: ['not shown']
Outputs clients: [[GpuCholesky{lower=False, inplace=False}(GpuContiguous.0)]]

Backtrace when the node is created(use Theano flag traceback.limit=N to make it longer):
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 618, in <lambda>
[r], 3, rng))
File "/home/jenkins/workspace/Theano_buildbot/theano/tests/unittest_tools.py", line 92, in verify_grad
T.verify_grad(op, pt, n_tests, rng, *args, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gradient.py", line 1728, in verify_grad
o_output = fun(*tensor_pt)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 617, in <lambda>
yield (lambda: utt.verify_grad(lambda r: GpuCholesky(lower=False)(r.dot(r.T)),
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 615, in __call__
node = self.make_node(*inputs, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/linalg.py", line 493, in make_node
inp = gpu_contiguous(inp)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 615, in __call__
node = self.make_node(*inputs, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/basic_ops.py", line 1079, in make_node
return Apply(self, [input], [input.type()])

HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint and storage map footprint of this apply node.


FAILED: theano.gpuarray.tests.test_linalg.Theano_python3 / Theano python3 / test_cholesky_grad_indef

Error Message:
('perform', <class 'theano.gpuarray.basic_ops.GpuContiguous'>, 'GpuContiguous', 'Did you used Theano flags mode=FAST_COMPILE? You can use optimizer=fast_compile instead.')
Apply node that caused the error: GpuContiguous(GpuFromHost<None>.0)
Toposort index: 1
Inputs types: [GpuArrayType<None>(float64, matrix)]
Inputs shapes: [(2, 2)]
Inputs strides: [(16, 8)]
Inputs values: [gpuarray.array([[ 1. , 0.2],
[ 0.2, -2. ]])]
Outputs clients: [[GpuCholesky{lower=True, inplace=False}(GpuContiguous.0)]]

Backtrace when the node is created(use Theano flag traceback.limit=N to make it longer):
File "/miniconda/envs/py3k/lib/python3.6/unittest/case.py", line 653, in __call__
return self.run(*args, **kwds)
File "/miniconda/envs/py3k/lib/python3.6/unittest/case.py", line 605, in run
testMethod()
File "/miniconda/envs/py3k/lib/python3.6/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 625, in test_cholesky_grad_indef
chol_f = theano.function([x], theano.tensor.grad(cholesky(x).sum(), [x]))
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 615, in __call__
node = self.make_node(*inputs, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/linalg.py", line 493, in make_node
inp = gpu_contiguous(inp)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 615, in __call__
node = self.make_node(*inputs, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/basic_ops.py", line 1079, in make_node
return Apply(self, [input], [input.type()])

HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint and storage map footprint of this apply node.

Stack Trace:
Traceback (most recent call last):
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/vm.py", line 301, in __call__
thunk()
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 892, in rval
r = p(n, [x[0] for x in i], o)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 778, in perform
"Did you used Theano flags mode=FAST_COMPILE?"
theano.gof.utils.MethodNotDefined: ('perform', <class 'theano.gpuarray.basic_ops.GpuContiguous'>, 'GpuContiguous', 'Did you used Theano flags mode=FAST_COMPILE? You can use optimizer=fast_compile instead.')

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/miniconda/envs/py3k/lib/python3.6/unittest/case.py", line 59, in testPartExecutor
yield
File "/miniconda/envs/py3k/lib/python3.6/unittest/case.py", line 605, in run
testMethod()
File "/miniconda/envs/py3k/lib/python3.6/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 627, in test_cholesky_grad_indef
chol_f(matrix)
File "/home/jenkins/workspace/Theano_buildbot/theano/compile/function_module.py", line 903, in __call__
self.fn() if output_subset is None else\
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/vm.py", line 305, in __call__
link.raise_with_op(node, thunk)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/link.py", line 325, in raise_with_op
reraise(exc_type, exc_value, exc_trace)
File "/miniconda/envs/py3k/lib/python3.6/site-packages/six.py", line 692, in reraise
raise value.with_traceback(tb)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/vm.py", line 301, in __call__
thunk()
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 892, in rval
r = p(n, [x[0] for x in i], o)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 778, in perform
"Did you used Theano flags mode=FAST_COMPILE?"
theano.gof.utils.MethodNotDefined: ('perform', <class 'theano.gpuarray.basic_ops.GpuContiguous'>, 'GpuContiguous', 'Did you used Theano flags mode=FAST_COMPILE? You can use optimizer=fast_compile instead.')
Apply node that caused the error: GpuContiguous(GpuFromHost<None>.0)
Toposort index: 1
Inputs types: [GpuArrayType<None>(float64, matrix)]
Inputs shapes: [(2, 2)]
Inputs strides: [(16, 8)]
Inputs values: [gpuarray.array([[ 1. , 0.2],
[ 0.2, -2. ]])]
Outputs clients: [[GpuCholesky{lower=True, inplace=False}(GpuContiguous.0)]]

Backtrace when the node is created(use Theano flag traceback.limit=N to make it longer):
File "/miniconda/envs/py3k/lib/python3.6/unittest/case.py", line 653, in __call__
return self.run(*args, **kwds)
File "/miniconda/envs/py3k/lib/python3.6/unittest/case.py", line 605, in run
testMethod()
File "/miniconda/envs/py3k/lib/python3.6/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 625, in test_cholesky_grad_indef
chol_f = theano.function([x], theano.tensor.grad(cholesky(x).sum(), [x]))
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 615, in __call__
node = self.make_node(*inputs, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/linalg.py", line 493, in make_node
inp = gpu_contiguous(inp)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 615, in __call__
node = self.make_node(*inputs, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/basic_ops.py", line 1079, in make_node
return Apply(self, [input], [input.type()])

HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint and storage map footprint of this apply node.


FAILED: theano.gpuarray.tests.test_linalg.Theano_python3 / Theano python3 / test_lower_triangular_and_cholesky_grad

Error Message:
('perform', <class 'theano.gpuarray.basic_ops.GpuContiguous'>, 'GpuContiguous', 'Did you used Theano flags mode=FAST_COMPILE? You can use optimizer=fast_compile instead.')
Apply node that caused the error: GpuContiguous(GpuFromHost<None>.0)
Toposort index: 1
Inputs types: [GpuArrayType<None>(float64, col)]
Inputs shapes: [(100, 1)]
Inputs strides: [(8, 8)]
Inputs values: ['not shown']
Outputs clients: [[GpuCublasTriangularSolve{trans='N', lower=True}(GpuContiguous.0, GpuContiguous.0)]]

Backtrace when the node is created(use Theano flag traceback.limit=N to make it longer):
File "/home/jenkins/workspace/Theano_buildbot/theano/tests/unittest_tools.py", line 92, in verify_grad
T.verify_grad(op, pt, n_tests, rng, *args, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gradient.py", line 1728, in verify_grad
o_output = fun(*tensor_pt)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 653, in f
A = gpu_solve_lower_triangular(L, y)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/linalg.py", line 444, in gpu_solve_lower_triangular
return GpuCublasTriangularSolve(True, trans)(A, b)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 615, in __call__
node = self.make_node(*inputs, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/linalg.py", line 324, in make_node
inp2 = gpu_contiguous(inp2)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 615, in __call__
node = self.make_node(*inputs, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/basic_ops.py", line 1079, in make_node
return Apply(self, [input], [input.type()])

HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint and storage map footprint of this apply node.

Stack Trace:
Traceback (most recent call last):
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/vm.py", line 301, in __call__
thunk()
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 892, in rval
r = p(n, [x[0] for x in i], o)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 778, in perform
"Did you used Theano flags mode=FAST_COMPILE?"
theano.gof.utils.MethodNotDefined: ('perform', <class 'theano.gpuarray.basic_ops.GpuContiguous'>, 'GpuContiguous', 'Did you used Theano flags mode=FAST_COMPILE? You can use optimizer=fast_compile instead.')

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/miniconda/envs/py3k/lib/python3.6/unittest/case.py", line 59, in testPartExecutor
yield
File "/miniconda/envs/py3k/lib/python3.6/unittest/case.py", line 605, in run
testMethod()
File "/miniconda/envs/py3k/lib/python3.6/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 658, in <lambda>
yield (lambda: utt.verify_grad(f, [r, y], 3, rng))
File "/home/jenkins/workspace/Theano_buildbot/theano/tests/unittest_tools.py", line 92, in verify_grad
T.verify_grad(op, pt, n_tests, rng, *args, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gradient.py", line 1738, in verify_grad
o_fn_out = o_fn(*[p.copy() for p in pt])
File "/home/jenkins/workspace/Theano_buildbot/theano/compile/function_module.py", line 903, in __call__
self.fn() if output_subset is None else\
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/vm.py", line 305, in __call__
link.raise_with_op(node, thunk)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/link.py", line 325, in raise_with_op
reraise(exc_type, exc_value, exc_trace)
File "/miniconda/envs/py3k/lib/python3.6/site-packages/six.py", line 692, in reraise
raise value.with_traceback(tb)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/vm.py", line 301, in __call__
thunk()
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 892, in rval
r = p(n, [x[0] for x in i], o)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 778, in perform
"Did you used Theano flags mode=FAST_COMPILE?"
theano.gof.utils.MethodNotDefined: ('perform', <class 'theano.gpuarray.basic_ops.GpuContiguous'>, 'GpuContiguous', 'Did you used Theano flags mode=FAST_COMPILE? You can use optimizer=fast_compile instead.')
Apply node that caused the error: GpuContiguous(GpuFromHost<None>.0)
Toposort index: 1
Inputs types: [GpuArrayType<None>(float64, col)]
Inputs shapes: [(100, 1)]
Inputs strides: [(8, 8)]
Inputs values: ['not shown']
Outputs clients: [[GpuCublasTriangularSolve{trans='N', lower=True}(GpuContiguous.0, GpuContiguous.0)]]

Backtrace when the node is created(use Theano flag traceback.limit=N to make it longer):
File "/home/jenkins/workspace/Theano_buildbot/theano/tests/unittest_tools.py", line 92, in verify_grad
T.verify_grad(op, pt, n_tests, rng, *args, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gradient.py", line 1728, in verify_grad
o_output = fun(*tensor_pt)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 653, in f
A = gpu_solve_lower_triangular(L, y)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/linalg.py", line 444, in gpu_solve_lower_triangular
return GpuCublasTriangularSolve(True, trans)(A, b)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 615, in __call__
node = self.make_node(*inputs, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/linalg.py", line 324, in make_node
inp2 = gpu_contiguous(inp2)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 615, in __call__
node = self.make_node(*inputs, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/basic_ops.py", line 1079, in make_node
return Apply(self, [input], [input.type()])

HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint and storage map footprint of this apply node.


FAILED: theano.gpuarray.tests.test_linalg.TestCusolver.Theano_python3 / Theano python3 / test_solve_grad

Error Message:
('perform', <class 'theano.gpuarray.basic_ops.GpuContiguous'>, 'GpuContiguous', 'Did you used Theano flags mode=FAST_COMPILE? You can use optimizer=fast_compile instead.')
Apply node that caused the error: GpuContiguous(GpuFromHost<None>.0)
Toposort index: 1
Inputs types: [GpuArrayType<None>(float64, col)]
Inputs shapes: [(6, 1)]
Inputs strides: [(8, 8)]
Inputs values: ['not shown']
Outputs clients: [[GpuCusolverSolve{A_structure='general', trans='N', inplace=False}(GpuContiguous.0, GpuContiguous.0)]]

Backtrace when the node is created(use Theano flag traceback.limit=N to make it longer):
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 158, in test_solve_grad
self.verify_solve_grad(6, 1, A_structure, lower, rng)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 150, in verify_solve_grad
utt.verify_grad(solve_op, [A_val, b_val], 3, rng, eps=eps)
File "/home/jenkins/workspace/Theano_buildbot/theano/tests/unittest_tools.py", line 92, in verify_grad
T.verify_grad(op, pt, n_tests, rng, *args, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gradient.py", line 1728, in verify_grad
o_output = fun(*tensor_pt)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 615, in __call__
node = self.make_node(*inputs, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/linalg.py", line 146, in make_node
inp2 = gpu_contiguous(inp2)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 615, in __call__
node = self.make_node(*inputs, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/basic_ops.py", line 1079, in make_node
return Apply(self, [input], [input.type()])

HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint and storage map footprint of this apply node.

Stack Trace:
Traceback (most recent call last):
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/vm.py", line 301, in __call__
thunk()
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 892, in rval
r = p(n, [x[0] for x in i], o)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 778, in perform
"Did you used Theano flags mode=FAST_COMPILE?"
theano.gof.utils.MethodNotDefined: ('perform', <class 'theano.gpuarray.basic_ops.GpuContiguous'>, 'GpuContiguous', 'Did you used Theano flags mode=FAST_COMPILE? You can use optimizer=fast_compile instead.')

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/miniconda/envs/py3k/lib/python3.6/unittest/case.py", line 59, in testPartExecutor
yield
File "/miniconda/envs/py3k/lib/python3.6/unittest/case.py", line 605, in run
testMethod()
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 158, in test_solve_grad
self.verify_solve_grad(6, 1, A_structure, lower, rng)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 150, in verify_solve_grad
utt.verify_grad(solve_op, [A_val, b_val], 3, rng, eps=eps)
File "/home/jenkins/workspace/Theano_buildbot/theano/tests/unittest_tools.py", line 92, in verify_grad
T.verify_grad(op, pt, n_tests, rng, *args, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gradient.py", line 1738, in verify_grad
o_fn_out = o_fn(*[p.copy() for p in pt])
File "/home/jenkins/workspace/Theano_buildbot/theano/compile/function_module.py", line 903, in __call__
self.fn() if output_subset is None else\
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/vm.py", line 305, in __call__
link.raise_with_op(node, thunk)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/link.py", line 325, in raise_with_op
reraise(exc_type, exc_value, exc_trace)
File "/miniconda/envs/py3k/lib/python3.6/site-packages/six.py", line 692, in reraise
raise value.with_traceback(tb)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/vm.py", line 301, in __call__
thunk()
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 892, in rval
r = p(n, [x[0] for x in i], o)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 778, in perform
"Did you used Theano flags mode=FAST_COMPILE?"
theano.gof.utils.MethodNotDefined: ('perform', <class 'theano.gpuarray.basic_ops.GpuContiguous'>, 'GpuContiguous', 'Did you used Theano flags mode=FAST_COMPILE? You can use optimizer=fast_compile instead.')
Apply node that caused the error: GpuContiguous(GpuFromHost<None>.0)
Toposort index: 1
Inputs types: [GpuArrayType<None>(float64, col)]
Inputs shapes: [(6, 1)]
Inputs strides: [(8, 8)]
Inputs values: ['not shown']
Outputs clients: [[GpuCusolverSolve{A_structure='general', trans='N', inplace=False}(GpuContiguous.0, GpuContiguous.0)]]

Backtrace when the node is created(use Theano flag traceback.limit=N to make it longer):
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 158, in test_solve_grad
self.verify_solve_grad(6, 1, A_structure, lower, rng)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 150, in verify_solve_grad
utt.verify_grad(solve_op, [A_val, b_val], 3, rng, eps=eps)
File "/home/jenkins/workspace/Theano_buildbot/theano/tests/unittest_tools.py", line 92, in verify_grad
T.verify_grad(op, pt, n_tests, rng, *args, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gradient.py", line 1728, in verify_grad
o_output = fun(*tensor_pt)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 615, in __call__
node = self.make_node(*inputs, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/linalg.py", line 146, in make_node
inp2 = gpu_contiguous(inp2)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 615, in __call__
node = self.make_node(*inputs, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/basic_ops.py", line 1079, in make_node
return Apply(self, [input], [input.type()])

HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint and storage map footprint of this apply node.
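The MethodNotDefined failures above all trip on GpuContiguous, whose Python-mode perform method is not implemented, which is why the hint points at mode=FAST_COMPILE. The contract the op enforces is easy to see with a CPU analogue; below is a minimal numpy sketch (the name `as_c_contiguous` is hypothetical, not Theano's API):

```python
import numpy as np

# Hypothetical CPU analogue of gpu_contiguous: pass a C-contiguous
# array through unchanged, copy anything else into a packed buffer.
def as_c_contiguous(a):
    return a if a.flags["C_CONTIGUOUS"] else np.ascontiguousarray(a)

# A (6, 1) float64 column view sliced out of a wider array has strides
# (16, 8), so it is not C-contiguous; the copy has packed strides (8, 8),
# matching the "Inputs strides: [(8, 8)]" reported in the log.
wide = np.zeros((6, 2), dtype=np.float64)
col = wide[:, :1]
packed = as_c_contiguous(col)
```

The test failure happens because the graph was only partially optimized: GpuContiguous stayed in the graph but has no Python fallback, so the thunk raises instead of performing this copy.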

jenkins...@mila.quebec
Oct 26, 2018, 8:17:06 PM
to theano-...@googlegroups.com
Theano_buildbot - Build # 175 - Unstable -
Tot=83790 Skip=6129 Fail=15

See https://jenkins.mila.quebec/job/Theano_buildbot/175/ to view the results.

Failed tests:
15 tests failed.
FAILED: theano.gpuarray.tests.test_linalg.Theano_python2 / Theano python2 / test_cholesky_grad
File "/miniconda/lib/python2.7/unittest/case.py", line 329, in run
testMethod()
File "/miniconda/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 612, in <lambda>
[r], 3, rng))
File "/home/jenkins/workspace/Theano_buildbot/theano/tests/unittest_tools.py", line 92, in verify_grad
T.verify_grad(op, pt, n_tests, rng, *args, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gradient.py", line 1738, in verify_grad
o_fn_out = o_fn(*[p.copy() for p in pt])
File "/home/jenkins/workspace/Theano_buildbot/theano/compile/function_module.py", line 903, in __call__
self.fn() if output_subset is None else\
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/vm.py", line 305, in __call__
link.raise_with_op(node, thunk)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/link.py", line 325, in raise_with_op
reraise(exc_type, exc_value, exc_trace)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/vm.py", line 301, in __call__
thunk()
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 892, in rval
r = p(n, [x[0] for x in i], o)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 778, in perform
"Did you used Theano flags mode=FAST_COMPILE?"
FAILED: theano.gpuarray.tests.test_linalg.Theano_python2 / Theano python2 / test_cholesky_grad
File "/miniconda/lib/python2.7/unittest/case.py", line 329, in run
testMethod()
File "/miniconda/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 615, in <lambda>
[r], 3, rng))
File "/home/jenkins/workspace/Theano_buildbot/theano/tests/unittest_tools.py", line 92, in verify_grad
T.verify_grad(op, pt, n_tests, rng, *args, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gradient.py", line 1738, in verify_grad
o_fn_out = o_fn(*[p.copy() for p in pt])
File "/home/jenkins/workspace/Theano_buildbot/theano/compile/function_module.py", line 903, in __call__
self.fn() if output_subset is None else\
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/vm.py", line 305, in __call__
link.raise_with_op(node, thunk)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/link.py", line 325, in raise_with_op
reraise(exc_type, exc_value, exc_trace)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/vm.py", line 301, in __call__
thunk()
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 892, in rval
r = p(n, [x[0] for x in i], o)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 778, in perform
"Did you used Theano flags mode=FAST_COMPILE?"
FAILED: theano.gpuarray.tests.test_linalg.Theano_python2 / Theano python2 / test_cholesky_grad
File "/miniconda/lib/python2.7/unittest/case.py", line 329, in run
testMethod()
File "/miniconda/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 618, in <lambda>
[r], 3, rng))
File "/home/jenkins/workspace/Theano_buildbot/theano/tests/unittest_tools.py", line 92, in verify_grad
T.verify_grad(op, pt, n_tests, rng, *args, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gradient.py", line 1738, in verify_grad
o_fn_out = o_fn(*[p.copy() for p in pt])
File "/home/jenkins/workspace/Theano_buildbot/theano/compile/function_module.py", line 903, in __call__
self.fn() if output_subset is None else\
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/vm.py", line 305, in __call__
link.raise_with_op(node, thunk)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/link.py", line 325, in raise_with_op
reraise(exc_type, exc_value, exc_trace)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/vm.py", line 301, in __call__
thunk()
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 892, in rval
r = p(n, [x[0] for x in i], o)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 778, in perform
"Did you used Theano flags mode=FAST_COMPILE?"
FAILED: theano.gpuarray.tests.test_linalg.Theano_python2 / Theano python2 / test_cholesky_grad_indef

Error Message:
('perform', <class 'theano.gpuarray.basic_ops.GpuContiguous'>, 'GpuContiguous', 'Did you used Theano flags mode=FAST_COMPILE? You can use optimizer=fast_compile instead.')
Apply node that caused the error: GpuContiguous(GpuFromHost<None>.0)
Toposort index: 1
Inputs types: [GpuArrayType<None>(float64, matrix)]
Inputs shapes: [(2, 2)]
Inputs strides: [(16, 8)]
Inputs values: [gpuarray.array([[ 1. , 0.2],
[ 0.2, -2. ]])]
Outputs clients: [[GpuCholesky{lower=True, inplace=False}(GpuContiguous.0)]]

Backtrace when the node is created(use Theano flag traceback.limit=N to make it longer):
File "/miniconda/lib/python2.7/unittest/case.py", line 393, in __call__
return self.run(*args, **kwds)
File "/miniconda/lib/python2.7/unittest/case.py", line 329, in run
testMethod()
File "/miniconda/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 625, in test_cholesky_grad_indef
chol_f = theano.function([x], theano.tensor.grad(cholesky(x).sum(), [x]))
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 615, in __call__
node = self.make_node(*inputs, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/linalg.py", line 493, in make_node
inp = gpu_contiguous(inp)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 615, in __call__
node = self.make_node(*inputs, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/basic_ops.py", line 1079, in make_node
return Apply(self, [input], [input.type()])

HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint and storage map footprint of this apply node.

Stack Trace:
Traceback (most recent call last):
File "/miniconda/lib/python2.7/unittest/case.py", line 329, in run
testMethod()
File "/miniconda/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 627, in test_cholesky_grad_indef
chol_f(matrix)
File "/home/jenkins/workspace/Theano_buildbot/theano/compile/function_module.py", line 903, in __call__
self.fn() if output_subset is None else\
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/vm.py", line 305, in __call__
link.raise_with_op(node, thunk)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/link.py", line 325, in raise_with_op
reraise(exc_type, exc_value, exc_trace)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/vm.py", line 301, in __call__
thunk()
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 892, in rval
r = p(n, [x[0] for x in i], o)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 778, in perform
"Did you used Theano flags mode=FAST_COMPILE?"
MethodNotDefined: ('perform', <class 'theano.gpuarray.basic_ops.GpuContiguous'>, 'GpuContiguous', 'Did you used Theano flags mode=FAST_COMPILE? You can use optimizer=fast_compile instead.')
Apply node that caused the error: GpuContiguous(GpuFromHost<None>.0)
Toposort index: 1
Inputs types: [GpuArrayType<None>(float64, matrix)]
Inputs shapes: [(2, 2)]
Inputs strides: [(16, 8)]
Inputs values: [gpuarray.array([[ 1. , 0.2],
[ 0.2, -2. ]])]
Outputs clients: [[GpuCholesky{lower=True, inplace=False}(GpuContiguous.0)]]

Backtrace when the node is created(use Theano flag traceback.limit=N to make it longer):
File "/miniconda/lib/python2.7/unittest/case.py", line 393, in __call__
return self.run(*args, **kwds)
File "/miniconda/lib/python2.7/unittest/case.py", line 329, in run
testMethod()
File "/miniconda/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 625, in test_cholesky_grad_indef
chol_f = theano.function([x], theano.tensor.grad(cholesky(x).sum(), [x]))
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 615, in __call__
node = self.make_node(*inputs, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/linalg.py", line 493, in make_node
inp = gpu_contiguous(inp)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 615, in __call__
node = self.make_node(*inputs, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/basic_ops.py", line 1079, in make_node
return Apply(self, [input], [input.type()])

HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint and storage map footprint of this apply node.


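test_cholesky_grad_indef feeds the Cholesky gradient the indefinite matrix shown in the log. As a plain-numpy sanity check (not the GPU op) of why that input is special: a symmetric matrix with eigenvalues of mixed signs has no Cholesky factor, and numpy refuses it outright.

```python
import numpy as np

# The matrix from the log above: symmetric but indefinite
# (det = -2.04 < 0, so its eigenvalues have mixed signs),
# hence no Cholesky factorization exists.
m = np.array([[1.0, 0.2],
              [0.2, -2.0]])

try:
    np.linalg.cholesky(m)
    raised = False
except np.linalg.LinAlgError:
    raised = True
```

The test exercises what the GPU graph does with such an input; here the run never gets that far because GpuContiguous fails first.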
FAILED: theano.gpuarray.tests.test_linalg.Theano_python2 / Theano python2 / test_lower_triangular_and_cholesky_grad
File "/miniconda/lib/python2.7/unittest/case.py", line 329, in run
testMethod()
File "/miniconda/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 658, in <lambda>
yield (lambda: utt.verify_grad(f, [r, y], 3, rng))
File "/home/jenkins/workspace/Theano_buildbot/theano/tests/unittest_tools.py", line 92, in verify_grad
T.verify_grad(op, pt, n_tests, rng, *args, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gradient.py", line 1738, in verify_grad
o_fn_out = o_fn(*[p.copy() for p in pt])
File "/home/jenkins/workspace/Theano_buildbot/theano/compile/function_module.py", line 903, in __call__
self.fn() if output_subset is None else\
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/vm.py", line 305, in __call__
link.raise_with_op(node, thunk)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/link.py", line 325, in raise_with_op
reraise(exc_type, exc_value, exc_trace)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/vm.py", line 301, in __call__
thunk()
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 892, in rval
r = p(n, [x[0] for x in i], o)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 778, in perform
"Did you used Theano flags mode=FAST_COMPILE?"
FAILED: theano.gpuarray.tests.test_linalg.Theano_python2 / Theano python2 / test_lower_triangular_and_cholesky_grad

Error Message:
GradientError: numeric gradient and analytic gradient exceed tolerance:
At position 0 of argument 0 with shape (100, 100),
val1 = 61.417695 , val2 = -0.006426
abs. error = 61.424121, abs. tolerance = 0.000100
rel. error = 1.000000, rel. tolerance = 0.000100
Exception args:
The error happened with the following inputs:, [array([[ 0.82418808, 0.479966 , 1.17346801, ..., -0.89693179,
1.1333561 , 1.72204708],
[ 0.60937818, 0.59035144, -0.63494634, ..., 0.38795807,
-1.33227093, 1.37288101],
[-1.74410198, -1.7043582 , 0.98157291, ..., 0.43921063,
0.53149007, 1.74936573],
...,
[ 1.4820272 , -0.4070006 , 0.27235611, ..., -1.52090356,
0.11336321, -1.12153178],
[ 0.09191356, -1.03761209, 0.14203255, ..., -0.80062017,
-0.46813556, -0.29792215],
[-0.42640828, -1.25819163, 1.16196006, ..., -2.13613786,
0.01712027, 0.88170587]]), array([[ 0.67942981],
[ 0.84003691],
[ 0.80139855],
[ 0.4134548 ],
[ 0.79754123],
[ 0.91048039],
[ 0.13280862],
[ 0.59288864],
[ 0.54108386],
[ 0.63660533],
[ 0.11010899],
[ 0.65704268],
[ 0.88940502],
[ 0.94640594],
[ 0.46083678],
[ 0.65795242],
[ 0.67514642],
[ 0.42068286],
[ 0.09293023],
[ 0.8825209 ],
[ 0.48771946],
[ 0.03602727],
[ 0.57973204],
[ 0.86167913],
[ 0.9762849 ],
[ 0.85554925],
[ 0.34594622],
[ 0.02005155],
[ 0.38865439],
[ 0.12135134],
[ 0.46725583],
[ 0.99886711],
[ 0.37875904],
[ 0.54572805],
[ 0.82540798],
[ 0.78354625],
[ 0.02688303],
[ 0.14376768],
[ 0.25440734],
[ 0.97160841],
[ 0.61142859],
[ 0.99961918],
[ 0.38834315],
[ 0.19522355],
[ 0.76331484],
[ 0.69625549],
[ 0.55486424],
[ 0.42897632],
[ 0.06831499],
[ 0.38228057],
[ 0.22321113],
[ 0.29616156],
[ 0.54413117],
[ 0.53585203],
[ 0.62790604],
[ 0.45202495],
[ 0.21847578],
[ 0.01656385],
[ 0.00557879],
[ 0.78208971],
[ 0.98303934],
[ 0.28067987],
[ 0.98132663],
[ 0.37865535],
[ 0.68170719],
[ 0.26389976],
[ 0.41900707],
[ 0.59481279],
[ 0.0319248 ],
[ 0.00610308],
[ 0.92895857],
[ 0.06957089],
[ 0.49545636],
[ 0.55282008],
[ 0.80529384],
[ 0.65829893],
[ 0.60295273],
[ 0.6292522 ],
[ 0.79419288],
[ 0.58372654],
[ 0.29008071],
[ 0.82303948],
[ 0.48191757],
[ 0.33953194],
[ 0.79686958],
[ 0.46021843],
[ 0.11204681],
[ 0.86889479],
[ 0.05922137],
[ 0.55385442],
[ 0.43680813],
[ 0.53319453],
[ 0.557082 ],
[ 0.92396596],
[ 0.24384524],
[ 0.26384151],
[ 0.53790837],
[ 0.36034912],
[ 0.13287837],
[ 0.32456718]])],
The value of eps is:, None,
The out_type is:, None

Stack Trace:
Traceback (most recent call last):
File "/miniconda/lib/python2.7/unittest/case.py", line 329, in run
testMethod()
File "/miniconda/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 658, in <lambda>
yield (lambda: utt.verify_grad(f, [r, y], 3, rng))
File "/home/jenkins/workspace/Theano_buildbot/theano/tests/unittest_tools.py", line 92, in verify_grad
T.verify_grad(op, pt, n_tests, rng, *args, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gradient.py", line 1795, in verify_grad
abs_tol, rel_tol)
GradientError: GradientError: numeric gradient and analytic gradient exceed tolerance:
At position 0 of argument 0 with shape (100, 100),
val1 = 61.417695 , val2 = -0.006426
abs. error = 61.424121, abs. tolerance = 0.000100
rel. error = 1.000000, rel. tolerance = 0.000100


FAILED: theano.gpuarray.tests.test_linalg.Theano_python2_debug / Theano python2 debug / test_lower_triangular_and_cholesky_grad

Error Message:
GradientError: numeric gradient and analytic gradient exceed tolerance:
At position 1 of argument 0 with shape (100, 100),
val1 = -1.647935 , val2 = 0.008435
abs. error = 1.656370, abs. tolerance = 0.000100
rel. error = 1.000000, rel. tolerance = 0.000100
Exception args:
The error happened with the following inputs:, [array([[-0.37677055, 0.32463221, 1.68402489, ..., 0.49646674,
0.30921242, -0.1380071 ],
[-0.31406368, -0.23834718, -0.70688616, ..., 0.07867473,
-0.03591484, -0.69033128],
[-0.23354082, 1.10444135, -0.77831345, ..., -0.37810674,
-0.85684233, 0.98508864],
...,
[-1.39800254, -0.32657637, 1.18996232, ..., 0.77066803,
1.4804801 , -0.68554455],
[-0.04174452, -1.05453767, 1.21432403, ..., -0.51165503,
-0.70889612, 0.26483268],
[-1.50149752, -1.33047436, -1.30863128, ..., -1.1015948 ,
-0.31188388, -0.23172907]]), array([[ 0.83812234],
[ 0.17554982],
[ 0.34566324],
[ 0.69526631],
[ 0.66284244],
[ 0.46398483],
[ 0.73233987],
[ 0.69225381],
[ 0.89224367],
[ 0.39707122],
[ 0.56768322],
[ 0.45040174],
[ 0.39671409],
[ 0.65088028],
[ 0.76885903],
[ 0.90107458],
[ 0.68002225],
[ 0.26227698],
[ 0.71872772],
[ 0.59494308],
[ 0.39271917],
[ 0.02702927],
[ 0.53276892],
[ 0.53729438],
[ 0.64523072],
[ 0.62399654],
[ 0.06246988],
[ 0.0999261 ],
[ 0.28645567],
[ 0.7004813 ],
[ 0.23364424],
[ 0.28695575],
[ 0.23223289],
[ 0.53865553],
[ 0.13464174],
[ 0.35540841],
[ 0.88399869],
[ 0.19811673],
[ 0.94914303],
[ 0.54190852],
[ 0.08000766],
[ 0.22727741],
[ 0.70911438],
[ 0.95834675],
[ 0.91922509],
[ 0.86545178],
[ 0.59818696],
[ 0.62626489],
[ 0.21661836],
[ 0.13325502],
[ 0.01839359],
[ 0.19663679],
[ 0.86799031],
[ 0.83398874],
[ 0.82916783],
[ 0.92070176],
[ 0.69489592],
[ 0.40249441],
[ 0.61431809],
[ 0.65740987],
[ 0.33741802],
[ 0.14589024],
[ 0.75741726],
[ 0.43215608],
[ 0.79860224],
[ 0.20883352],
[ 0.73451009],
[ 0.23787279],
[ 0.25842679],
[ 0.44244119],
[ 0.2325811 ],
[ 0.10406719],
[ 0.86571375],
[ 0.55272828],
[ 0.08768517],
[ 0.62484515],
[ 0.2690396 ],
[ 0.00160047],
[ 0.12156752],
[ 0.3059304 ],
[ 0.73795243],
[ 0.37626465],
[ 0.3508246 ],
[ 0.17204439],
[ 0.02534875],
[ 0.37568615],
[ 0.07818308],
[ 0.06166707],
[ 0.46802027],
[ 0.66328687],
[ 0.22330989],
[ 0.17044826],
[ 0.11115906],
[ 0.39724468],
[ 0.96838428],
[ 0.75930249],
[ 0.08572719],
[ 0.72238823],
[ 0.74672317],
[ 0.59874469]])],
The value of eps is:, None,
The out_type is:, None

Stack Trace:
Traceback (most recent call last):
File "/miniconda/lib/python2.7/unittest/case.py", line 329, in run
testMethod()
File "/miniconda/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 658, in <lambda>
yield (lambda: utt.verify_grad(f, [r, y], 3, rng))
File "/home/jenkins/workspace/Theano_buildbot/theano/tests/unittest_tools.py", line 92, in verify_grad
T.verify_grad(op, pt, n_tests, rng, *args, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gradient.py", line 1795, in verify_grad
abs_tol, rel_tol)
GradientError: GradientError: numeric gradient and analytic gradient exceed tolerance:
At position 1 of argument 0 with shape (100, 100),
val1 = -1.647935 , val2 = 0.008435
abs. error = 1.656370, abs. tolerance = 0.000100
rel. error = 1.000000, rel. tolerance = 0.000100
FAILED: theano.gpuarray.tests.test_linalg.TestCusolver.Theano_python2 / Theano python2 / test_solve_grad
File "/miniconda/lib/python2.7/unittest/case.py", line 329, in run
testMethod()
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 158, in test_solve_grad
self.verify_solve_grad(6, 1, A_structure, lower, rng)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 150, in verify_solve_grad
utt.verify_grad(solve_op, [A_val, b_val], 3, rng, eps=eps)
File "/home/jenkins/workspace/Theano_buildbot/theano/tests/unittest_tools.py", line 92, in verify_grad
T.verify_grad(op, pt, n_tests, rng, *args, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gradient.py", line 1738, in verify_grad
o_fn_out = o_fn(*[p.copy() for p in pt])
File "/home/jenkins/workspace/Theano_buildbot/theano/compile/function_module.py", line 903, in __call__
self.fn() if output_subset is None else\
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/vm.py", line 305, in __call__
link.raise_with_op(node, thunk)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/link.py", line 325, in raise_with_op
reraise(exc_type, exc_value, exc_trace)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/vm.py", line 301, in __call__
thunk()
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 892, in rval
r = p(n, [x[0] for x in i], o)
File "/home/jenkins/workspace/Theano_buildbot/theano/gof/op.py", line 778, in perform
"Did you used Theano flags mode=FAST_COMPILE?"
MethodNotDefined: ('perform', <class 'theano.gpuarray.basic_ops.GpuContiguous'>, 'GpuContiguous', 'Did you used Theano flags mode=FAST_COMPILE? You can use optimizer=fast_compile instead.')
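The GradientError and verify_grad failures in this build all come down to the same comparison: utt.verify_grad checks an analytic gradient against finite differences and reports the abs./rel. errors against tolerances, as in the val1/val2 lines above. A simplified numpy sketch of that check follows (Theano's real verify_grad additionally projects the output through a random direction and repeats over several test points; `check_grad` is an illustrative name):

```python
import numpy as np

# Compare an analytic gradient against central finite differences,
# element by element, the way the GradientError report does.
def check_grad(f, grad_f, x, eps=1e-6, abs_tol=1e-4, rel_tol=1e-4):
    analytic = grad_f(x)
    numeric = np.empty_like(x)
    it = np.nditer(x, flags=["multi_index"])
    for _ in it:
        idx = it.multi_index
        xp, xm = x.copy(), x.copy()
        xp[idx] += eps
        xm[idx] -= eps
        numeric[idx] = (f(xp) - f(xm)) / (2 * eps)
    abs_err = np.abs(analytic - numeric)
    rel_err = abs_err / np.maximum(np.abs(analytic) + np.abs(numeric), 1e-8)
    ok = (abs_err <= abs_tol) | (rel_err <= rel_tol)
    return bool(ok.all()), float(abs_err.max())

# f(x) = sum(x**2) has gradient 2*x, so the check passes.
x = np.array([[1.0, -2.0], [0.5, 3.0]])
ok, max_err = check_grad(lambda a: (a ** 2).sum(), lambda a: 2 * a, x)
```

A rel. error of 1.000000, as in the lower_triangular failures above, means val1 and val2 disagree completely (e.g. one is large while the other is near zero), not merely that precision was slightly exceeded.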
FAILED: theano.gpuarray.tests.test_linalg.TestMagma.Theano_python2_debug / Theano python2 debug / test_gpu_eigh

Error Message:
WrongValue
: shape, dtype, strides, min, max, n_inf, n_nan:
Expected : (1000,) float32 (4,) 4.95695e-06 1314.44 0 0
expected : [ 4.95695440e-06 2.43063131e-03 6.56550657e-03 8.45660921e-03
9.82335489e-03 1.51141742e-02 3.01291961e-02 3.50385942e-02
4.77125458e-02 5.82044348e-02 7.38457143e-02 8.81205797e-02
1.05794981e-01 1.37858063e-01 1.61130071e-01 1.70795664e-01
1.95671290e-01 2.17497826e-01 2.44790688e-01 2.58401036e-01
2.90724725e-01 3.31791401e-01 3.74514997e-01 4.07889873e-01
4.27099943e-01 4.58712190e-01 5.12224793e-01 5.40247977e-01
6.52739346e-01 6.62490845e-01 7.12915599e-01 7.48282075e-01
8.21284473e-01 9.47684109e-01 1.00372255e+00 1.08571851e+00
1.10923076e+00 1.15577698e+00 1.23103857e+00 1.27739286e+00
1.34922767e+00 1.38970327e+00 1.44845450e+00 1.52855444e+00
1.55094600e+00 1.66099846e+00 1.72633600e+00 1.87144017e+00
1.92127752e+00 2.00717258e+00 2.11877966e+00 2.15797496e+00
2.23655105e+00 2.30516052e+00 2.40463877e+00 2.49258280e+00
2.62716818e+00 2.67910910e+00 2.76909781e+00 2.81405997e+00
2.92032743e+00 3.02176428e+00 3.17762852e+00 3.26503921e+00
3.39789414e+00 3.51429725e+00 3.56538200e+00 3.69499993e+00
3.72457266e+00 3.94525743e+00 4.01687908e+00 4.15097809e+00
4.19942045e+00 4.42228317e+00 4.45706224e+00 4.73860645e+00
4.91810560e+00 4.98184109e+00 5.07521820e+00 5.17228889e+00
5.34171963e+00 5.45142508e+00 5.56108427e+00 5.73216009e+00
5.94291115e+00 6.05116940e+00 6.12221861e+00 6.29570436e+00
6.48470116e+00 6.64421034e+00 6.71117640e+00 6.97567129e+00
7.10571241e+00 7.27729321e+00 7.36499977e+00 7.37904310e+00
7.69736385e+00 7.88009787e+00 8.02557373e+00 8.12077999e+00
8.13267326e+00 8.51464462e+00 8.64583874e+00 8.76710701e+00
  ... (remaining eigenvalues elided for brevity) ...  1.31443567e+03]
value : [ 6.44587635e-06  ... (intermediate values elided for brevity) ...  1.31443518e+03]
Max Abs Diff: 0.019165
Mean Abs Diff: 0.000166637
Median Abs Diff: 3.05176e-05
Std Abs Diff: 0.000762243
Max Rel Diff: 0.30037
Mean Rel Diff: 0.000312976
Median Rel Diff: 2.69464e-07
Std Rel Diff: 0.00949465

rtol, atol: 1e-05 0.001
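The check that failed here combines both thresholds: an element passes when |value - expected| <= atol + rtol * |expected|, which is NumPy's allclose criterion. A minimal sketch of why the reported Max Abs Diff of 0.019165 cannot pass rtol=1e-05, atol=0.001 even for a large eigenvalue (the concrete numbers below are illustrative, not taken from the arrays):

```python
import numpy as np

# Tolerances reported in the log.
rtol, atol = 1e-5, 1e-3

# Illustrative pair: a mid-range eigenvalue where the GPU result drifts
# by roughly the reported "Max Abs Diff" of 0.019165.
expected = 1203.0
value = expected + 0.019165

# allclose criterion: |value - expected| <= atol + rtol * |expected|
tol = atol + rtol * abs(expected)
print(abs(value - expected) <= tol)                        # False: 0.019165 > ~0.013
print(np.allclose(value, expected, rtol=rtol, atol=atol))  # False
```

For float32 eigendecompositions of a 1000x1000 matrix, drift of this size between CPU and GPU solvers is plausible, which is why the test passes atol=1e-3 explicitly; the failure means the drift exceeded even that loosened bound.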


Stack Trace:
Traceback (most recent call last):
File "/miniconda/lib/python2.7/unittest/case.py", line 329, in run
testMethod()
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 592, in test_gpu_eigh
self.check_gpu_eigh(1000, UPLO='U', compute_v=False, atol=1e-3)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 579, in check_gpu_eigh
utt.assert_allclose(d_np, d_gpu, rtol=rtol, atol=atol)
File "/home/jenkins/workspace/Theano_buildbot/theano/tests/unittest_tools.py", line 358, in assert_allclose
raise WrongValue(expected, value, rtol, atol)
WrongValue: WrongValue
: shape, dtype, strides, min, max, n_inf, n_nan:
Expected : (1000,) float32 (4,) 4.95695e-06 1314.44 0 0
expected : [ 4.95695440e-06  ... (duplicate of the expected-value listing above; elided) ...
1.70087143e+02 1.71359467e+02 1.71970917e+02 1.72535706e+02
1.73246536e+02 1.74076462e+02 1.75430618e+02 1.75900070e+02
1.77175766e+02 1.77901413e+02 1.78409058e+02 1.79418167e+02
1.79849564e+02 1.81056076e+02 1.81273819e+02 1.82647186e+02
1.83071991e+02 1.83674240e+02 1.84167053e+02 1.85599640e+02
1.85839264e+02 1.87136734e+02 1.87937851e+02 1.89759125e+02
1.90429413e+02 1.90787903e+02 1.91714005e+02 1.93405823e+02
1.94088638e+02 1.94364426e+02 1.94851410e+02 1.95657364e+02
1.95754623e+02 1.97215256e+02 1.99144989e+02 1.99374725e+02
1.99958054e+02 2.00290100e+02 2.01551865e+02 2.02195236e+02
2.03400803e+02 2.04490005e+02 2.05146271e+02 2.06696518e+02
2.07431931e+02 2.08515503e+02 2.08875031e+02 2.09138855e+02
2.11361252e+02 2.12075974e+02 2.12996887e+02 2.13278702e+02
2.14725723e+02 2.15240494e+02 2.16976303e+02 2.18039413e+02
2.18929291e+02 2.19677063e+02 2.20076141e+02 2.20786560e+02
2.21755539e+02 2.23308548e+02 2.23881210e+02 2.24910248e+02
2.26122620e+02 2.26824722e+02 2.27602142e+02 2.29322418e+02
2.30083221e+02 2.31209335e+02 2.32024445e+02 2.32400284e+02
2.34048050e+02 2.35109833e+02 2.35243790e+02 2.37044067e+02
2.38457260e+02 2.38974152e+02 2.39564072e+02 2.40691864e+02
2.41053101e+02 2.42154022e+02 2.43079788e+02 2.43548431e+02
2.44631989e+02 2.45381302e+02 2.47547119e+02 2.48108627e+02
2.48317902e+02 2.50132950e+02 2.51336975e+02 2.52039307e+02
2.52273834e+02 2.53735535e+02 2.56003174e+02 2.57179474e+02
2.57470062e+02 2.58366089e+02 2.59290375e+02 2.60104065e+02
2.61393951e+02 2.63408417e+02 2.64429657e+02 2.66147369e+02
2.66442230e+02 2.67852875e+02 2.68556671e+02 2.69077240e+02
2.70379425e+02 2.70930511e+02 2.72054138e+02 2.73200226e+02
2.74283630e+02 2.74882355e+02 2.76109222e+02 2.76495148e+02
2.78251801e+02 2.78679443e+02 2.79492096e+02 2.81089264e+02
2.81439545e+02 2.82457184e+02 2.84318359e+02 2.85156494e+02
2.85935883e+02 2.87196106e+02 2.88148987e+02 2.88940125e+02
2.90283020e+02 2.91615692e+02 2.93534149e+02 2.94158844e+02
2.95170105e+02 2.96113495e+02 2.97466003e+02 2.98160370e+02
2.98743744e+02 3.01239899e+02 3.01673309e+02 3.02067841e+02
3.03104065e+02 3.04792603e+02 3.07057953e+02 3.08511475e+02
3.08977478e+02 3.10162689e+02 3.11624207e+02 3.12233307e+02
3.13438873e+02 3.16306793e+02 3.17600189e+02 3.18045410e+02
3.18985138e+02 3.19925415e+02 3.21572235e+02 3.23243622e+02
3.24389069e+02 3.24890564e+02 3.26830383e+02 3.27363190e+02
3.28513184e+02 3.29457367e+02 3.30261597e+02 3.32414337e+02
3.32803345e+02 3.35136932e+02 3.35746368e+02 3.36515717e+02
3.37433319e+02 3.38529694e+02 3.38647156e+02 3.39897522e+02
3.41167908e+02 3.42195374e+02 3.43903381e+02 3.45109375e+02
3.45971405e+02 3.48015503e+02 3.50154633e+02 3.52141663e+02
3.53709930e+02 3.54543762e+02 3.55922546e+02 3.57053986e+02
3.57960175e+02 3.59561371e+02 3.60487579e+02 3.61945465e+02
3.62680389e+02 3.64553436e+02 3.65587463e+02 3.66173553e+02
3.67008606e+02 3.68565460e+02 3.69315125e+02 3.71006256e+02
3.74027557e+02 3.75229706e+02 3.75971252e+02 3.76704620e+02
3.77797607e+02 3.78974457e+02 3.81736542e+02 3.82784210e+02
3.85088196e+02 3.86485779e+02 3.86615967e+02 3.87383118e+02
3.88326172e+02 3.89644318e+02 3.90108398e+02 3.92277344e+02
3.94343109e+02 3.95922760e+02 3.97144165e+02 3.97967285e+02
3.99390900e+02 4.01153656e+02 4.01692657e+02 4.04214417e+02
4.05092163e+02 4.05514435e+02 4.08079559e+02 4.09409668e+02
4.10008209e+02 4.11698700e+02 4.12361115e+02 4.14451019e+02
4.15920746e+02 4.16397125e+02 4.18858978e+02 4.20117859e+02
4.20393524e+02 4.23108887e+02 4.25401001e+02 4.26086761e+02
4.26724976e+02 4.28041748e+02 4.29472473e+02 4.30272919e+02
4.31629425e+02 4.32333923e+02 4.34616852e+02 4.35638824e+02
4.37058197e+02 4.39406738e+02 4.40159821e+02 4.40602905e+02
4.43222656e+02 4.45677002e+02 4.46184601e+02 4.47560822e+02
4.49509705e+02 4.50333710e+02 4.52579834e+02 4.53891388e+02
4.55905029e+02 4.57151459e+02 4.58729980e+02 4.59613190e+02
4.61382172e+02 4.62581726e+02 4.64082062e+02 4.66120239e+02
4.67154633e+02 4.68468689e+02 4.70620758e+02 4.72802704e+02
4.74405670e+02 4.75040100e+02 4.77381317e+02 4.78177277e+02
4.79088623e+02 4.83709167e+02 4.84677673e+02 4.86512939e+02
4.86794678e+02 4.87567566e+02 4.89485382e+02 4.92935913e+02
4.94727783e+02 4.95705353e+02 4.97375610e+02 4.98891266e+02
5.01160461e+02 5.02262512e+02 5.03153778e+02 5.05040497e+02
5.06154449e+02 5.08683655e+02 5.10286011e+02 5.11951385e+02
5.12975037e+02 5.14463928e+02 5.17308960e+02 5.19472107e+02
5.20296814e+02 5.21050293e+02 5.24258362e+02 5.26466614e+02
5.28564575e+02 5.29713623e+02 5.31732483e+02 5.32625061e+02
5.33895874e+02 5.35004822e+02 5.36689087e+02 5.39138428e+02
5.40180054e+02 5.42444702e+02 5.44950623e+02 5.45765259e+02
5.49442810e+02 5.50435425e+02 5.51638123e+02 5.53327515e+02
5.55800964e+02 5.57163269e+02 5.58274109e+02 5.61365173e+02
5.61997620e+02 5.65394043e+02 5.65887451e+02 5.67524719e+02
5.70905701e+02 5.71448181e+02 5.72789124e+02 5.74239624e+02
5.76320679e+02 5.78608643e+02 5.80261719e+02 5.83148926e+02
5.83390747e+02 5.85851440e+02 5.89416199e+02 5.89794250e+02
5.90741821e+02 5.94033020e+02 5.96356628e+02 5.97505310e+02
6.00270691e+02 6.02918640e+02 6.05315063e+02 6.06780029e+02
6.08941040e+02 6.10082947e+02 6.11770813e+02 6.13071045e+02
6.15556091e+02 6.17526672e+02 6.20302490e+02 6.20396851e+02
6.22601013e+02 6.25393188e+02 6.26939758e+02 6.29066589e+02
6.32269653e+02 6.34546814e+02 6.35416382e+02 6.37068420e+02
6.38491760e+02 6.41345642e+02 6.43357910e+02 6.44174438e+02
6.47398499e+02 6.49653748e+02 6.52133789e+02 6.53688965e+02
6.57702637e+02 6.60041321e+02 6.61746948e+02 6.62333862e+02
6.63630371e+02 6.65492981e+02 6.69170044e+02 6.71110474e+02
6.72536011e+02 6.74833435e+02 6.76352234e+02 6.79998901e+02
6.80831055e+02 6.82621216e+02 6.83880432e+02 6.87046387e+02
6.88309204e+02 6.93048645e+02 6.93579224e+02 6.96200562e+02
6.97856506e+02 6.99901184e+02 7.03498291e+02 7.04409424e+02
7.06626221e+02 7.06928345e+02 7.12676086e+02 7.14202209e+02
7.14917236e+02 7.16116211e+02 7.21841492e+02 7.22903809e+02
7.24273010e+02 7.26642517e+02 7.28822144e+02 7.31066284e+02
7.33493103e+02 7.36873901e+02 7.39621704e+02 7.40442505e+02
7.41733948e+02 7.43432190e+02 7.44783813e+02 7.47014648e+02
7.48981018e+02 7.52452759e+02 7.54216614e+02 7.56387817e+02
7.62324951e+02 7.62614563e+02 7.68018372e+02 7.70884216e+02
7.72539001e+02 7.75369324e+02 7.76304565e+02 7.78585632e+02
7.81802124e+02 7.84003906e+02 7.86518860e+02 7.87025818e+02
7.90002625e+02 7.90734070e+02 7.94419800e+02 7.97527588e+02
7.99448425e+02 8.02019470e+02 8.06504822e+02 8.08564331e+02
8.10306091e+02 8.15614502e+02 8.16939087e+02 8.21243042e+02
8.22536011e+02 8.27309326e+02 8.29849243e+02 8.32455139e+02
8.34898499e+02 8.38141663e+02 8.38948486e+02 8.40470703e+02
8.43448120e+02 8.47748230e+02 8.51299561e+02 8.56558838e+02
8.57538391e+02 8.58703491e+02 8.61431030e+02 8.63937561e+02
8.67265686e+02 8.72864136e+02 8.78027283e+02 8.79890747e+02
8.81216492e+02 8.85117615e+02 8.86630493e+02 8.89289734e+02
8.91421753e+02 8.94427124e+02 8.97910828e+02 8.99207092e+02
9.03311951e+02 9.08170410e+02 9.09508667e+02 9.12288818e+02
9.16102722e+02 9.17471375e+02 9.21352051e+02 9.22124451e+02
9.24350098e+02 9.28648193e+02 9.31345032e+02 9.36004578e+02
9.39655762e+02 9.44435181e+02 9.47724243e+02 9.49452148e+02
9.55252319e+02 9.57909363e+02 9.59536194e+02 9.62560364e+02
9.66002258e+02 9.71531494e+02 9.74324097e+02 9.76539368e+02
9.77509521e+02 9.79911438e+02 9.84302795e+02 9.91155884e+02
9.92229187e+02 9.97336731e+02 1.00085919e+03 1.00228998e+03
1.01001196e+03 1.01233582e+03 1.01921857e+03 1.02124139e+03
1.02359625e+03 1.02738391e+03 1.03333105e+03 1.03532080e+03
1.03664587e+03 1.03988660e+03 1.04701904e+03 1.05137598e+03
1.05603210e+03 1.05945276e+03 1.06536682e+03 1.06804517e+03
1.07083899e+03 1.07204895e+03 1.07769202e+03 1.08044275e+03
1.08816882e+03 1.09153479e+03 1.09620166e+03 1.10103137e+03
1.10380078e+03 1.10870288e+03 1.11296863e+03 1.11855432e+03
1.12163879e+03 1.12748096e+03 1.13287183e+03 1.13865771e+03
1.14910779e+03 1.15240210e+03 1.15756091e+03 1.16246033e+03
1.16705090e+03 1.16905505e+03 1.17729236e+03 1.18330652e+03
1.18423425e+03 1.19582153e+03 1.20297327e+03 1.20445508e+03
1.20709021e+03 1.21210010e+03 1.22435461e+03 1.23170166e+03
1.24609351e+03 1.24998047e+03 1.27155884e+03 1.27597461e+03
1.28550818e+03 1.28894080e+03 1.29022766e+03 1.31443567e+03]
value : [ 6.44587635e-06 2.42215884e-03 6.55746367e-03 8.47825967e-03
9.84001625e-03 1.51212206e-02 3.01121604e-02 3.50309238e-02
4.77008820e-02 5.82269318e-02 7.38391802e-02 8.81094709e-02
1.05783433e-01 1.37876660e-01 1.61138996e-01 1.70798272e-01
1.95682883e-01 2.17516378e-01 2.44780913e-01 2.58415818e-01
2.90732592e-01 3.31826329e-01 3.74541014e-01 4.07888710e-01
4.27108943e-01 4.58717763e-01 5.12218297e-01 5.40250957e-01
6.52754366e-01 6.62499070e-01 7.12900281e-01 7.48282731e-01
8.21294427e-01 9.47669744e-01 1.00372756e+00 1.08570588e+00
1.10923398e+00 1.15579808e+00 1.23106456e+00 1.27740943e+00
1.34924150e+00 1.38969684e+00 1.44845283e+00 1.52857733e+00
1.55094934e+00 1.66098249e+00 1.72633862e+00 1.87145305e+00
1.92126703e+00 2.00717378e+00 2.11880136e+00 2.15797400e+00
2.23656559e+00 2.30516005e+00 2.40463853e+00 2.49258256e+00
2.62717700e+00 2.67909527e+00 2.76910543e+00 2.81408095e+00
2.92035508e+00 3.02174354e+00 3.17764568e+00 3.26505327e+00
3.39788890e+00 3.51429582e+00 3.56541133e+00 3.69496727e+00
3.72456574e+00 3.94524908e+00 4.01688194e+00 4.15100193e+00
4.19943953e+00 4.42227983e+00 4.45704603e+00 4.73858547e+00
4.91809988e+00 4.98185253e+00 5.07524729e+00 5.17229319e+00
5.34174109e+00 5.45144653e+00 5.56105661e+00 5.73216152e+00
5.94291592e+00 6.05118942e+00 6.12223053e+00 6.29571581e+00
6.48470592e+00 6.64421844e+00 6.71118736e+00 6.97567844e+00
7.10572672e+00 7.27730656e+00 7.36503363e+00 7.37904167e+00
7.69739819e+00 7.88008833e+00 8.02561569e+00 8.12077332e+00
8.13265896e+00 8.51465702e+00 8.64584160e+00 8.76709175e+00
9.09335232e+00 9.12919140e+00 9.31584740e+00 9.36162472e+00
9.67258167e+00 9.87185574e+00 1.02059650e+01 1.04488287e+01
1.05671215e+01 1.07169905e+01 1.07497215e+01 1.10400839e+01
1.12153311e+01 1.13224564e+01 1.16055689e+01 1.16963863e+01
1.19753284e+01 1.21110716e+01 1.22265844e+01 1.25771475e+01
1.28293905e+01 1.29500608e+01 1.29810209e+01 1.34098845e+01
1.35341215e+01 1.38398771e+01 1.42068796e+01 1.43256235e+01
1.44484634e+01 1.46121407e+01 1.48533859e+01 1.50341854e+01
1.53007593e+01 1.57418642e+01 1.59119253e+01 1.63046837e+01
1.68564301e+01 1.68851528e+01 1.70628796e+01 1.71390076e+01
1.74612522e+01 1.75313282e+01 1.77527885e+01 1.82674885e+01
1.85521202e+01 1.88042450e+01 1.89404945e+01 1.90692883e+01
1.94673939e+01 1.95628319e+01 1.99444408e+01 2.02086945e+01
2.05806541e+01 2.08594761e+01 2.10872097e+01 2.11624088e+01
2.15076504e+01 2.16838837e+01 2.20607071e+01 2.22646351e+01
2.25439968e+01 2.27612591e+01 2.31183586e+01 2.32988567e+01
2.35924911e+01 2.38846054e+01 2.40376797e+01 2.42350540e+01
2.47274971e+01 2.49741154e+01 2.51768665e+01 2.53545055e+01
2.59751892e+01 2.63588066e+01 2.64530506e+01 2.67544842e+01
2.70595131e+01 2.75399055e+01 2.77671013e+01 2.80325642e+01
2.82077751e+01 2.85318222e+01 2.89092636e+01 2.91348991e+01
2.93408699e+01 2.95959110e+01 2.96948338e+01 3.00860119e+01
3.05833607e+01 3.07005196e+01 3.11746788e+01 3.17491932e+01
3.19743214e+01 3.20989304e+01 3.29974861e+01 3.32096558e+01
3.36388321e+01 3.41417885e+01 3.44234734e+01 3.46411476e+01
3.49069290e+01 3.50758286e+01 3.55646248e+01 3.59638290e+01
3.63454933e+01 3.64777184e+01 3.65683975e+01 3.72665138e+01
3.76622124e+01 3.80810928e+01 3.82100334e+01 3.84924126e+01
3.91440887e+01 3.92919769e+01 3.99105568e+01 4.01626358e+01
4.04036827e+01 4.05547791e+01 4.06410065e+01 4.12782326e+01
4.21441040e+01 4.23651848e+01 4.25549622e+01 4.28854828e+01
4.34563103e+01 4.35923691e+01 4.44004364e+01 4.47157631e+01
4.49225616e+01 4.51686745e+01 4.55530396e+01 4.58978271e+01
4.63271751e+01 4.70960655e+01 4.73684082e+01 4.77179375e+01
4.83595314e+01 4.85003624e+01 4.90902748e+01 4.92705154e+01
4.96359863e+01 5.01768494e+01 5.05178108e+01 5.08785057e+01
5.11056480e+01 5.14547882e+01 5.19274864e+01 5.24790192e+01
5.29460602e+01 5.30117264e+01 5.35173988e+01 5.42221527e+01
5.50382462e+01 5.54338722e+01 5.56765137e+01 5.67423630e+01
5.72529182e+01 5.74227982e+01 5.77410278e+01 5.83652763e+01
5.86096878e+01 5.88395653e+01 5.96139641e+01 5.99075623e+01
6.04754944e+01 6.07458305e+01 6.10089493e+01 6.14998245e+01
6.18782120e+01 6.23323822e+01 6.28642464e+01 6.32752228e+01
6.35134430e+01 6.41652527e+01 6.46347656e+01 6.50758820e+01
6.56187363e+01 6.60985870e+01 6.64314957e+01 6.71582489e+01
6.73653107e+01 6.79323883e+01 6.82919922e+01 6.88681946e+01
7.00568085e+01 7.01396103e+01 7.07118530e+01 7.09224014e+01
7.12133408e+01 7.17250595e+01 7.19493027e+01 7.28280411e+01
7.35820618e+01 7.41950226e+01 7.43562622e+01 7.54476624e+01
7.57032394e+01 7.59635391e+01 7.66656723e+01 7.68573837e+01
7.69329453e+01 7.74783249e+01 7.81940918e+01 7.87316208e+01
7.95767899e+01 8.03262329e+01 8.06446457e+01 8.11797256e+01
8.19003220e+01 8.29543533e+01 8.30585327e+01 8.47228165e+01
8.48796082e+01 8.56383362e+01 8.58651733e+01 8.64563751e+01
8.68795547e+01 8.75914383e+01 8.79169159e+01 8.81961136e+01
8.87551880e+01 8.91354904e+01 8.98847885e+01 9.06329575e+01
9.11041794e+01 9.19139252e+01 9.22878723e+01 9.29696350e+01
9.42230530e+01 9.45274582e+01 9.48185730e+01 9.52042618e+01
9.61282654e+01 9.68018646e+01 9.72427902e+01 9.76677322e+01
9.79643555e+01 9.89993591e+01 9.96723633e+01 1.00230331e+02
1.00958527e+02 1.01144760e+02 1.01869415e+02 1.02874229e+02
1.02904434e+02 1.03852852e+02 1.04019081e+02 1.05214706e+02
1.05695389e+02 1.06208908e+02 1.07330193e+02 1.07653976e+02
1.07940002e+02 1.09021057e+02 1.09413673e+02 1.09680168e+02
1.10011116e+02 1.10543510e+02 1.11877136e+02 1.12545593e+02
1.13083252e+02 1.13340965e+02 1.13597839e+02 1.14893753e+02
1.15122284e+02 1.15610687e+02 1.16723434e+02 1.17573608e+02
1.18001404e+02 1.18856522e+02 1.19183739e+02 1.19383812e+02
1.19824684e+02 1.20793167e+02 1.21629807e+02 1.22112000e+02
1.22910278e+02 1.24381889e+02 1.24494911e+02 1.25564430e+02
1.25876572e+02 1.26439728e+02 1.26950417e+02 1.27683769e+02
1.28036575e+02 1.28972366e+02 1.29401276e+02 1.29767487e+02
1.30305847e+02 1.31761536e+02 1.32293427e+02 1.33433426e+02
1.33985733e+02 1.34606628e+02 1.35062241e+02 1.36064102e+02
1.36855103e+02 1.38415970e+02 1.38668686e+02 1.39371628e+02
1.40336685e+02 1.40744186e+02 1.41222595e+02 1.42173401e+02
1.42934982e+02 1.43147629e+02 1.43824432e+02 1.44559128e+02
1.44970047e+02 1.45750168e+02 1.47568787e+02 1.47879166e+02
1.48287170e+02 1.48723480e+02 1.49316406e+02 1.50665543e+02
1.51612839e+02 1.52353333e+02 1.52627533e+02 1.53208176e+02
1.53676407e+02 1.54828888e+02 1.55477722e+02 1.55687241e+02
1.57617691e+02 1.58180557e+02 1.58925644e+02 1.60107986e+02
1.60826035e+02 1.61694275e+02 1.62173019e+02 1.62343750e+02
1.63625687e+02 1.64465317e+02 1.66040375e+02 1.66495392e+02
1.67364273e+02 1.67490433e+02 1.68366882e+02 1.69653091e+02
1.70087204e+02 1.71359573e+02 1.71970932e+02 1.72535767e+02
1.73246582e+02 1.74076477e+02 1.75430588e+02 1.75900055e+02
1.77175827e+02 1.77901367e+02 1.78409012e+02 1.79418091e+02
1.79849579e+02 1.81056259e+02 1.81273834e+02 1.82647202e+02
1.83072098e+02 1.83674332e+02 1.84167084e+02 1.85599716e+02
1.85839371e+02 1.87136749e+02 1.87937943e+02 1.89759155e+02
1.90429382e+02 1.90787842e+02 1.91714066e+02 1.93405869e+02
1.94088654e+02 1.94364456e+02 1.94851395e+02 1.95657364e+02
1.95754639e+02 1.97215225e+02 1.99145065e+02 1.99374786e+02
1.99958038e+02 2.00290131e+02 2.01551865e+02 2.02195190e+02
2.03400879e+02 2.04489944e+02 2.05146240e+02 2.06696472e+02
2.07432022e+02 2.08515594e+02 2.08875122e+02 2.09138901e+02
2.11361252e+02 2.12076096e+02 2.12996994e+02 2.13278748e+02
2.14725769e+02 2.15240570e+02 2.16976288e+02 2.18039474e+02
2.18929337e+02 2.19677155e+02 2.20076202e+02 2.20786621e+02
2.21755600e+02 2.23308624e+02 2.23881287e+02 2.24910339e+02
2.26122635e+02 2.26824768e+02 2.27602081e+02 2.29322495e+02
2.30083252e+02 2.31209412e+02 2.32024506e+02 2.32400375e+02
2.34048004e+02 2.35109848e+02 2.35243835e+02 2.37044128e+02
2.38457214e+02 2.38974243e+02 2.39564148e+02 2.40691925e+02
2.41053116e+02 2.42153992e+02 2.43079849e+02 2.43548401e+02
2.44632080e+02 2.45381439e+02 2.47547089e+02 2.48108643e+02
2.48317917e+02 2.50132996e+02 2.51336945e+02 2.52039398e+02
2.52273941e+02 2.53735565e+02 2.56003113e+02 2.57179504e+02
2.57470123e+02 2.58366119e+02 2.59290344e+02 2.60103821e+02
2.61393890e+02 2.63408295e+02 2.64429657e+02 2.66147369e+02
2.66442261e+02 2.67852753e+02 2.68556702e+02 2.69077179e+02
2.70379395e+02 2.70930420e+02 2.72054108e+02 2.73200195e+02
2.74283661e+02 2.74882355e+02 2.76109314e+02 2.76495056e+02
2.78251801e+02 2.78679321e+02 2.79492004e+02 2.81089264e+02
2.81439514e+02 2.82457367e+02 2.84318298e+02 2.85156586e+02
2.85935883e+02 2.87196198e+02 2.88148956e+02 2.88940033e+02
2.90282990e+02 2.91615723e+02 2.93534210e+02 2.94158875e+02
2.95170258e+02 2.96113525e+02 2.97466034e+02 2.98160431e+02
2.98743988e+02 3.01239960e+02 3.01673492e+02 3.02067902e+02
3.03103973e+02 3.04792694e+02 3.07057922e+02 3.08511627e+02
3.08977600e+02 3.10162567e+02 3.11624207e+02 3.12233276e+02
3.13439026e+02 3.16306732e+02 3.17600159e+02 3.18045471e+02
3.18985138e+02 3.19925476e+02 3.21572205e+02 3.23243805e+02
3.24389069e+02 3.24890625e+02 3.26830353e+02 3.27363220e+02
3.28513123e+02 3.29457367e+02 3.30261566e+02 3.32414398e+02
3.32803436e+02 3.35136932e+02 3.35746368e+02 3.36515808e+02
3.37433441e+02 3.38529633e+02 3.38647186e+02 3.39897491e+02
3.41167908e+02 3.42195343e+02 3.43903381e+02 3.45109406e+02
3.45971466e+02 3.48015442e+02 3.50154694e+02 3.52141663e+02
3.53709869e+02 3.54543732e+02 3.55922485e+02 3.57053925e+02
3.57960175e+02 3.59561218e+02 3.60487549e+02 3.61945404e+02
3.62680450e+02 3.64553436e+02 3.65587433e+02 3.66173615e+02
3.67008667e+02 3.68565552e+02 3.69315186e+02 3.71006165e+02
3.74027618e+02 3.75229675e+02 3.75971191e+02 3.76704681e+02
3.77797668e+02 3.78974609e+02 3.81736389e+02 3.82784302e+02
3.85088287e+02 3.86485718e+02 3.86615967e+02 3.87383118e+02
3.88326263e+02 3.89644165e+02 3.90108307e+02 3.92277435e+02
3.94343231e+02 3.95922760e+02 3.97144287e+02 3.97967346e+02
3.99391022e+02 4.01153625e+02 4.01692627e+02 4.04214417e+02
4.05092316e+02 4.05514313e+02 4.08079712e+02 4.09409698e+02
4.10008240e+02 4.11698700e+02 4.12360992e+02 4.14451141e+02
4.15920868e+02 4.16397156e+02 4.18858856e+02 4.20117920e+02
4.20393433e+02 4.23109009e+02 4.25401001e+02 4.26086731e+02
4.26724976e+02 4.28041718e+02 4.29472412e+02 4.30273041e+02
4.31629211e+02 4.32333740e+02 4.34616730e+02 4.35638672e+02
4.37058167e+02 4.39406769e+02 4.40159851e+02 4.40602966e+02
4.43222534e+02 4.45676819e+02 4.46184662e+02 4.47560822e+02
4.49509796e+02 4.50333923e+02 4.52579926e+02 4.53891449e+02
4.55905029e+02 4.57151367e+02 4.58729950e+02 4.59613190e+02
4.61382324e+02 4.62581848e+02 4.64082092e+02 4.66120300e+02
4.67154633e+02 4.68468689e+02 4.70620758e+02 4.72802795e+02
4.74405914e+02 4.75040192e+02 4.77381317e+02 4.78177277e+02
4.79088593e+02 4.83709229e+02 4.84677734e+02 4.86513031e+02
4.86794861e+02 4.87567719e+02 4.89485382e+02 4.92935913e+02
4.94727814e+02 4.95705505e+02 4.97375732e+02 4.98891418e+02
5.01160553e+02 5.02262451e+02 5.03153778e+02 5.05040588e+02
5.06154449e+02 5.08683716e+02 5.10286072e+02 5.11951294e+02
5.12975098e+02 5.14464172e+02 5.17308899e+02 5.19472046e+02
5.20296936e+02 5.21050354e+02 5.24258240e+02 5.26466675e+02
5.28564697e+02 5.29713745e+02 5.31732727e+02 5.32624878e+02
5.33895874e+02 5.35004761e+02 5.36688965e+02 5.39138428e+02
5.40180054e+02 5.42444458e+02 5.44950623e+02 5.45765259e+02
5.49443054e+02 5.50435181e+02 5.51638000e+02 5.53327454e+02
5.55801208e+02 5.57163635e+02 5.58274353e+02 5.61365356e+02
5.61997559e+02 5.65394348e+02 5.65887756e+02 5.67524597e+02
5.70905884e+02 5.71448364e+02 5.72789246e+02 5.74239502e+02
5.76320801e+02 5.78608459e+02 5.80261780e+02 5.83148682e+02
5.83390686e+02 5.85851501e+02 5.89415955e+02 5.89794189e+02
5.90741638e+02 5.94033020e+02 5.96356812e+02 5.97505554e+02
6.00270630e+02 6.02918213e+02 6.05315063e+02 6.06780212e+02
6.08941040e+02 6.10083008e+02 6.11770508e+02 6.13071289e+02
6.15556091e+02 6.17526367e+02 6.20302429e+02 6.20396667e+02
6.22600891e+02 6.25393127e+02 6.26939453e+02 6.29066589e+02
6.32269714e+02 6.34546448e+02 6.35416748e+02 6.37068054e+02
6.38491516e+02 6.41345764e+02 6.43358215e+02 6.44174133e+02
6.47398376e+02 6.49653809e+02 6.52133911e+02 6.53689148e+02
6.57702698e+02 6.60040955e+02 6.61747131e+02 6.62333801e+02
6.63630371e+02 6.65493347e+02 6.69169861e+02 6.71110413e+02
6.72535889e+02 6.74833313e+02 6.76352356e+02 6.79999084e+02
6.80830933e+02 6.82621277e+02 6.83880371e+02 6.87046326e+02
6.88309387e+02 6.93048401e+02 6.93579163e+02 6.96200256e+02
6.97856689e+02 6.99900696e+02 7.03497925e+02 7.04408936e+02
7.06626099e+02 7.06928101e+02 7.12675659e+02 7.14202515e+02
7.14917175e+02 7.16116150e+02 7.21841553e+02 7.22903748e+02
7.24272888e+02 7.26642395e+02 7.28822327e+02 7.31066284e+02
7.33492554e+02 7.36873718e+02 7.39621704e+02 7.40443115e+02
7.41734436e+02 7.43431885e+02 7.44784668e+02 7.47014832e+02
7.48981323e+02 7.52452698e+02 7.54216553e+02 7.56387634e+02
7.62325195e+02 7.62614807e+02 7.68018372e+02 7.70884399e+02
7.72538879e+02 7.75369080e+02 7.76304382e+02 7.78586182e+02
7.81802246e+02 7.84004028e+02 7.86518555e+02 7.87026245e+02
7.90002808e+02 7.90734253e+02 7.94420044e+02 7.97527527e+02
7.99448853e+02 8.02019043e+02 8.06504639e+02 8.08564514e+02
8.10305969e+02 8.15614502e+02 8.16939209e+02 8.21242920e+02
8.22536438e+02 8.27308960e+02 8.29848877e+02 8.32455139e+02
8.34898132e+02 8.38141541e+02 8.38948303e+02 8.40470337e+02
8.43448059e+02 8.47747986e+02 8.51299866e+02 8.56558838e+02
8.57538635e+02 8.58703369e+02 8.61431091e+02 8.63937500e+02
8.67265503e+02 8.72864319e+02 8.78027405e+02 8.79890869e+02
8.81216492e+02 8.85117981e+02 8.86630798e+02 8.89290283e+02
8.91420776e+02 8.94427673e+02 8.97910767e+02 8.99206848e+02
9.03311401e+02 9.08170959e+02 9.09508606e+02 9.12288940e+02
9.16102600e+02 9.17471375e+02 9.21352417e+02 9.22125061e+02
9.24349792e+02 9.28648621e+02 9.31345215e+02 9.36004211e+02
9.39656433e+02 9.44435181e+02 9.47723450e+02 9.49451355e+02
9.55251953e+02 9.57909363e+02 9.59536194e+02 9.62560364e+02
9.66002686e+02 9.71531494e+02 9.74323608e+02 9.76539124e+02
9.77509766e+02 9.79912292e+02 9.84304443e+02 9.91154846e+02
9.92229614e+02 9.97335938e+02 1.00085791e+03 1.00228900e+03
1.01001233e+03 1.01233582e+03 1.01921838e+03 1.02124121e+03
1.02359650e+03 1.02738379e+03 1.03333301e+03 1.03532129e+03
1.03664795e+03 1.03988525e+03 1.04701978e+03 1.05137500e+03
1.05603149e+03 1.05945251e+03 1.06536658e+03 1.06804456e+03
1.07083948e+03 1.07205017e+03 1.07769250e+03 1.08044348e+03
1.08816711e+03 1.09153577e+03 1.09620178e+03 1.10103088e+03
1.10380139e+03 1.10870190e+03 1.11296899e+03 1.11855322e+03
1.12164258e+03 1.12748096e+03 1.13286865e+03 1.13865649e+03
1.14910632e+03 1.15240259e+03 1.15756104e+03 1.16246179e+03
1.16704919e+03 1.16905554e+03 1.17729028e+03 1.18330896e+03
1.18423621e+03 1.19582214e+03 1.20298132e+03 1.20445337e+03
1.20707104e+03 1.21210461e+03 1.22435571e+03 1.23170142e+03
1.24609460e+03 1.24998035e+03 1.27155566e+03 1.27597485e+03
1.28551099e+03 1.28893518e+03 1.29023010e+03 1.31443518e+03]
Max Abs Diff: 0.019165
Mean Abs Diff: 0.000166637
Median Abs Diff: 3.05176e-05
Std Abs Diff: 0.000762243
Max Rel Diff: 0.30037
Mean Rel Diff: 0.000312976
Median Rel Diff: 2.69464e-07
Std Rel Diff: 0.00949465

rtol, atol: 1e-05 0.001
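The statistics above are what an allclose-style comparison produces: each element passes when its absolute difference is within atol plus rtol times the expected magnitude. A minimal sketch of that check (illustrative only, not Theano's actual comparison code) shows why a max absolute difference of ~0.019 fails against atol = 1e-3:

```python
import numpy as np

def within_tolerance(value, expected, rtol=1e-5, atol=1e-3):
    """Elementwise check in the style of np.allclose:
    a pair passes if |value - expected| <= atol + rtol * |expected|."""
    return np.abs(value - expected) <= atol + rtol * np.abs(expected)

# One element off by ~0.019 exceeds atol=1e-3 + rtol*|expected|,
# so the overall check fails, as in the report above.
value = np.array([1.0, 2.019, 100.0])
expected = np.array([1.0, 2.0, 100.0])
print(within_tolerance(value, expected).all())  # False
```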

jenkins...@mila.quebec

Nov 20, 2018, 3:12:07 AM11/20/18
to theano-...@googlegroups.com
Theano_buildbot - Build # 176 - Still Unstable -
Tot=83790 Skip=6129 Fail=14

See https://jenkins.mila.quebec/job/Theano_buildbot/176/ to view the results.

Failed tests:
14 tests failed.
val1 = -2450.062086 , val2 = 0.181664
abs. error = 2450.243750, abs. tolerance = 0.000100
rel. error = 1.000000, rel. tolerance = 0.000100
Exception args:
The error happened with the following inputs:, [array([[-1.4671976 , 0.87695278, -0.25630673, ..., 0.61035582,
-0.33588441, 0.68865392],
[ 1.13440449, 0.66912212, -0.71367018, ..., -0.3467143 ,
0.89223387, 2.17749554],
[-1.14889511, -0.71738022, -0.17158629, ..., -1.24721574,
-0.50248909, -2.03692075],
...,
[-0.57624493, -1.96992105, 1.31778659, ..., -0.06949245,
1.59763893, -0.91761231],
[ 0.20490581, -1.48687837, -2.0669003 , ..., 1.0156506 ,
1.79692309, 1.01792491],
[-0.88255505, -2.42588701, -0.80288159, ..., -0.27872079,
-1.06547257, -0.34368379]]), array([[ 0.93032785],
[ 0.71073491],
[ 0.48153657],
[ 0.47452149],
[ 0.32680352],
[ 0.4229061 ],
[ 0.1572323 ],
[ 0.78808858],
[ 0.30590056],
[ 0.1021778 ],
[ 0.0043968 ],
[ 0.06798814],
[ 0.19349215],
[ 0.68010391],
[ 0.54182663],
[ 0.11617413],
[ 0.22099407],
[ 0.40693936],
[ 0.28211806],
[ 0.79463197],
[ 0.28463694],
[ 0.83832016],
[ 0.6751529 ],
[ 0.74465647],
[ 0.07556967],
[ 0.8027141 ],
[ 0.30143087],
[ 0.92672357],
[ 0.84958626],
[ 0.14494702],
[ 0.51280325],
[ 0.34122274],
[ 0.13847632],
[ 0.96273541],
[ 0.26485525],
[ 0.44868448],
[ 0.69479046],
[ 0.11896481],
[ 0.80072309],
[ 0.99226937],
[ 0.58297704],
[ 0.9980363 ],
[ 0.12152626],
[ 0.26431445],
[ 0.44090569],
[ 0.44525719],
[ 0.51425041],
[ 0.63079806],
[ 0.18999467],
[ 0.39329886],
[ 0.48604677],
[ 0.58888097],
[ 0.89367117],
[ 0.88797812],
[ 0.27283814],
[ 0.77036738],
[ 0.00882258],
[ 0.4884521 ],
[ 0.7136142 ],
[ 0.23392989],
[ 0.19618522],
[ 0.53812779],
[ 0.68553467],
[ 0.80687664],
[ 0.90011721],
[ 0.58327137],
[ 0.31980321],
[ 0.57792696],
[ 0.68658882],
[ 0.67775878],
[ 0.53339935],
[ 0.91218372],
[ 0.28645725],
[ 0.7984034 ],
[ 0.2247087 ],
[ 0.3231151 ],
[ 0.57659201],
[ 0.1492347 ],
[ 0.6340653 ],
[ 0.62943173],
[ 0.00711547],
[ 0.58936414],
[ 0.22912702],
[ 0.71213507],
[ 0.63098558],
[ 0.64239212],
[ 0.96038265],
[ 0.010323 ],
[ 0.88919007],
[ 0.82413583],
[ 0.70934543],
[ 0.70732269],
[ 0.85447924],
[ 0.36701944],
[ 0.79902075],
[ 0.04475366],
[ 0.14050671],
[ 0.67555194],
[ 0.61447741],
[ 0.14239055]])],
The value of eps is:, None,
The out_type is:, None

Stack Trace:
Traceback (most recent call last):
File "/miniconda/lib/python2.7/unittest/case.py", line 329, in run
testMethod()
File "/miniconda/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 658, in <lambda>
yield (lambda: utt.verify_grad(f, [r, y], 3, rng))
File "/home/jenkins/workspace/Theano_buildbot/theano/tests/unittest_tools.py", line 92, in verify_grad
T.verify_grad(op, pt, n_tests, rng, *args, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gradient.py", line 1795, in verify_grad
abs_tol, rel_tol)
GradientError: GradientError: numeric gradient and analytic gradient exceed tolerance:
At position 0 of argument 0 with shape (100, 100),
val1 = -2450.062086 , val2 = 0.181664
abs. error = 2450.243750, abs. tolerance = 0.000100
rel. error = 1.000000, rel. tolerance = 0.000100
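The GradientError above comes from a finite-difference check: utt.verify_grad compares the op's analytic gradient against a numeric estimate built by perturbing each input element by eps. A minimal sketch of that kind of numeric gradient (names here are illustrative, not Theano's implementation) is:

```python
import numpy as np

def numeric_gradient(f, x, eps=1e-6):
    """Central finite differences: the numeric gradient that a
    verify_grad-style check compares against the analytic one."""
    x = np.asarray(x, dtype=float)
    grad = np.zeros_like(x)
    for i in range(x.size):
        step = np.zeros_like(x)
        step.flat[i] = eps
        grad.flat[i] = (f(x + step) - f(x - step)) / (2 * eps)
    return grad

# f(x) = sum(x**2) has analytic gradient 2*x; the two should agree
# within tolerance, unlike in the failing test above.
x = np.array([1.0, -2.0, 3.0])
num = numeric_gradient(lambda v: np.sum(v ** 2), x)
print(np.allclose(num, 2 * x, atol=1e-4))  # True
```

A mismatch as large as the one reported (val1 = -2450.06 vs val2 = 0.18, relative error 1.0) points at a wrong analytic gradient rather than finite-difference noise.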


jenkins...@mila.quebec

Nov 21, 2018, 12:11:11 AM11/21/18
to theano-...@googlegroups.com
Theano_buildbot - Build # 177 - Still Unstable -
Tot=83790 Skip=6129 Fail=15

See https://jenkins.mila.quebec/job/Theano_buildbot/177/ to view the results.

Failed tests:
15 tests failed.
FAILED: theano.gpuarray.tests.test_linalg.TestMagma.Theano_python2_debug / Theano python2 debug / test_gpu_eigh

Error Message:
WrongValue: shape, dtype, strides, min, max, n_inf, n_nan:
Expected : (1000,) float32 (4,) 0.000489231 1329.03 0 0
expected : [ 4.89230908e-04  2.12977896e-03  ...  1.29937598e+03  1.32902979e+03]
(full 1000-element dump elided)
value : [ 5.20384463e-04  2.12052371e-03  ...  1.29937451e+03  1.32902893e+03]
(full 1000-element dump elided)
Max Abs Diff: 0.0134277
Mean Abs Diff: 0.000160862
Median Abs Diff: 3.05176e-05
Std Abs Diff: 0.000615052
Max Rel Diff: 0.0636786
Mean Rel Diff: 7.82472e-05
Median Rel Diff: 2.97369e-07
Std Rel Diff: 0.00202026

rtol, atol: 1e-05 0.001
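Assuming utt.assert_allclose applies the same elementwise criterion as numpy.allclose, |value - expected| <= atol + rtol * |expected| (an assumption about the Theano helper, not verified here), a Max Abs Diff near 0.013 with atol=1e-3 fails on any element whose expected eigenvalue is too small for the rtol term to absorb the gap. A small sketch using three pairs taken from the dumps in this report:

```python
import numpy as np

# Elementwise tolerance check, assuming utt.assert_allclose follows
# the numpy.allclose criterion: |value - expected| <= atol + rtol*|expected|.
rtol, atol = 1e-5, 1e-3

# Three (expected, value) pairs copied from the arrays in the report above.
expected = np.array([4.89230908e-04, 1.15071452e+00, 1.32902979e+03])
value    = np.array([5.20384463e-04, 1.15071130e+00, 1.32902893e+03])

abs_diff = np.abs(value - expected)
tol = atol + rtol * np.abs(expected)   # per-element tolerance
per_element_ok = abs_diff <= tol
overall_ok = np.allclose(value, expected, rtol=rtol, atol=atol)
```

For these three pairs every absolute difference stays under its tolerance, which is why only some elements of the full 1000-value comparison trip the check.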


Stack Trace:
Traceback (most recent call last):
File "/miniconda/lib/python2.7/unittest/case.py", line 329, in run
testMethod()
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 592, in test_gpu_eigh
self.check_gpu_eigh(1000, UPLO='U', compute_v=False, atol=1e-3)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 579, in check_gpu_eigh
utt.assert_allclose(d_np, d_gpu, rtol=rtol, atol=atol)
File "/home/jenkins/workspace/Theano_buildbot/theano/tests/unittest_tools.py", line 358, in assert_allclose
raise WrongValue(expected, value, rtol, atol)
WrongValue: WrongValue
: shape, dtype, strides, min, max, n_inf, n_nan:
Expected : (1000,) float32 (4,) 0.000489231 1329.03 0 0
expected : [ 4.89230908e-04 2.12977896e-03 3.13293701e-03 6.67754840e-03
 ... (992 middle values elided) ...
 1.28876624e+03 1.29042517e+03 1.29937598e+03 1.32902979e+03]
value : [ 5.20384463e-04 2.12052371e-03 3.12634627e-03 6.66383002e-03
 ... (992 middle values elided) ...
 1.28876660e+03 1.29042651e+03 1.29937451e+03 1.32902893e+03]
Max Abs Diff: 0.0134277
Mean Abs Diff: 0.000160862
Median Abs Diff: 3.05176e-05
Std Abs Diff: 0.000615052
Max Rel Diff: 0.0636786
Mean Rel Diff: 7.82472e-05
Median Rel Diff: 2.97369e-07
Std Rel Diff: 0.00202026

rtol, atol: 1e-05 0.001
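The WrongValue above is raised by utt.assert_allclose when the CPU and GPU eigenvalue arrays differ by more than the tolerances. The criterion has the same shape as NumPy's allclose test; a minimal sketch (check_close is our own illustrative helper, not a Theano function):

```python
import numpy as np

def check_close(expected, value, rtol=1e-5, atol=1e-3):
    """Elementwise test: |expected - value| <= atol + rtol * |value|."""
    diff = np.abs(expected - value)
    tol = atol + rtol * np.abs(value)
    return bool(np.all(diff <= tol))

# Near zero the atol term dominates, so a tiny absolute difference passes:
print(check_close(np.array([0.000489]), np.array([0.000520])))  # True
# For large eigenvalues the rtol term dominates; a 0.02 gap at ~1329 fails:
print(check_close(np.array([1329.03]), np.array([1329.05])))    # False
```

Note that the reported Max Rel Diff of ~0.064 matches the gap at the smallest eigenvalue (4.89e-04 expected vs 5.20e-04 computed), which plausibly reflects float32 round-off in the GPU solver rather than a logic bug.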
At position 1 of argument 0 with shape (100, 100),
val1 = 4442.320311 , val2 = -0.324583
abs. error = 4442.644894, abs. tolerance = 0.000100
rel. error = 1.000000, rel. tolerance = 0.000100
Exception args:
The error happened with the following inputs:, [array([[ 0.2486728 , -1.44695068, -1.70282722, ..., -0.86061597,
0.08456989, -0.80090556],
[-1.13265687, -0.30299113, 0.94938754, ..., -1.25424281,
1.25341849, -0.64547763],
[ 2.24643339, 1.45198374, 0.59919166, ..., 0.81466691,
0.51244435, -0.5979247 ],
...,
[-1.29899674, 0.52331519, -0.46837642, ..., -1.12989603,
1.06999052, 0.00752135],
[-0.20141794, 0.2760969 , -1.30929085, ..., -0.32930689,
-0.14727093, 0.4449663 ],
[ 1.1588256 , 0.82731201, -1.22678511, ..., -0.3779908 ,
0.62439104, 1.24342604]]), array([[ 0.63120613],
       [ 0.01322899],
       ... (96 middle rows elided) ...
       [ 0.67328733],
       [ 0.1832737 ]])],
The value of eps is:, None,
The out_type is:, None
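The GradientError below comes from theano.gradient.verify_grad, which compares the symbolic (analytic) gradient against a finite-difference estimate. A simplified sketch of the finite-difference side (numeric_grad is our own helper; Theano's actual implementation additionally projects along random directions):

```python
import numpy as np

def numeric_grad(f, x, eps=1e-4):
    """Central finite-difference gradient of a scalar function f at x."""
    g = np.zeros_like(x)
    for i in np.ndindex(x.shape):
        orig = x[i]
        x[i] = orig + eps
        fp = f(x)
        x[i] = orig - eps
        fm = f(x)
        x[i] = orig                      # restore the perturbed entry
        g[i] = (fp - fm) / (2.0 * eps)
    return g

# Gradient check in the spirit of verify_grad: f(x) = sum(x**2),
# whose analytic gradient is 2*x.
x = np.array([1.0, 2.0, 3.0])
g_num = numeric_grad(lambda v: float(np.sum(v ** 2)), x)
print(np.max(np.abs(g_num - 2 * x)) < 1e-6)  # True
```

A rel. error of 1.0 with val1 ≈ 4442 against val2 ≈ -0.325, as reported below, typically indicates the analytic gradient is wrong or undefined at that point rather than a mere precision issue.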

Stack Trace:
Traceback (most recent call last):
File "/miniconda/lib/python2.7/unittest/case.py", line 329, in run
testMethod()
File "/miniconda/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 658, in <lambda>
yield (lambda: utt.verify_grad(f, [r, y], 3, rng))
File "/home/jenkins/workspace/Theano_buildbot/theano/tests/unittest_tools.py", line 92, in verify_grad
T.verify_grad(op, pt, n_tests, rng, *args, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gradient.py", line 1795, in verify_grad
abs_tol, rel_tol)
GradientError: GradientError: numeric gradient and analytic gradient exceed tolerance:
At position 1 of argument 0 with shape (100, 100),
val1 = 4442.320311 , val2 = -0.324583
abs. error = 4442.644894, abs. tolerance = 0.000100
rel. error = 1.000000, rel. tolerance = 0.000100
Exception args:
The error happened with the following inputs:, [array([[ 0.2486728 , -1.44695068, -1.70282722, ..., -0.86061597,
0.08456989, -0.80090556],
[-1.13265687, -0.30299113, 0.94938754, ..., -1.25424281,
1.25341849, -0.64547763],
[ 2.24643339, 1.45198374, 0.59919166, ..., 0.81466691,
0.51244435, -0.5979247 ],
...,
[-1.29899674, 0.52331519, -0.46837642, ..., -1.12989603,
1.06999052, 0.00752135],
[-0.20141794, 0.2760969 , -1.30929085, ..., -0.32930689,
-0.14727093, 0.4449663 ],
[ 1.1588256 , 0.82731201, -1.22678511, ..., -0.3779908 ,
0.62439104, 1.24342604]]), array([[ 0.63120613],
[ 0.01322899],
[ 0.55894639],
[ 0.64874059],
[ 0.06931994],
[ 0.43648005],
[ 0.73585117],
[ 0.3432791 ],
[ 0.82072319],
[ 0.66513859],
[ 0.9030465 ],
[ 0.48921726],
[ 0.45528938],
[ 0.64062398],
[ 0.19852452],
[ 0.22750301],
[ 0.81587072],
[ 0.86545032],
[ 0.56283064],
[ 0.23996996],
[ 0.97212574],
[ 0.99543432],
[ 0.54443007],
[ 0.70123126],
[ 0.52075974],
[ 0.85974199],
[ 0.56963695],
[ 0.78784509],
[ 0.52430566],
[ 0.90244877],
[ 0.73651418],
[ 0.21400898],
[ 0.82071994],
[ 0.24710827],
[ 0.38792445],
[ 0.25312956],
[ 0.76665066],
[ 0.20470174],
[ 0.85917546],
[ 0.61053043],
[ 0.35274424],
[ 0.37485784],
[ 0.35674603],
[ 0.04031824],
[ 0.70199989],
[ 0.32917027],
[ 0.410183 ],
[ 0.71631803],
[ 0.73618646],
[ 0.73483268],
[ 0.01884252],
[ 0.76307612],
[ 0.45778137],
[ 0.28801077],
[ 0.02166582],
[ 0.98050848],
[ 0.4539063 ],
[ 0.5417516 ],
[ 0.42252517],
[ 0.75695069],
[ 0.79403951],
[ 0.78153004],
[ 0.3972998 ],
[ 0.38613215],
[ 0.34815288],
[ 0.3239741 ],
[ 0.06900486],
[ 0.14465879],
[ 0.95459061],
[ 0.26215449],
[ 0.57184172],
[ 0.6656896 ],
[ 0.37154472],
[ 0.56039991],
[ 0.57132107],
[ 0.7422107 ],
[ 0.98100269],
[ 0.41390474],
[ 0.52824576],
[ 0.80101808],
[ 0.17663848],
[ 0.113728 ],
[ 0.9635118 ],
[ 0.90218232],
[ 0.45011886],
[ 0.97417866],
[ 0.0170934 ],
[ 0.49297691],
[ 0.5786326 ],
[ 0.82844133],
[ 0.69937967],
[ 0.66433775],
[ 0.29620123],
[ 0.96889434],
[ 0.19125628],
[ 0.36134495],
[ 0.37550172],
[ 0.21064748],
[ 0.67328733],
[ 0.1832737 ]])],

jenkins...@mila.quebec

Nov 21, 2018, 11:28:01 PM11/21/18
to theano-...@googlegroups.com
Theano_buildbot - Build # 178 - Still unstable -
Tot=83790 Skip=6129 Fail=5

See https://jenkins.mila.quebec/job/Theano_buildbot/178/ to view the results.

Failed tests:
5 tests failed.
FAILED: theano.gpuarray.tests.test_reduction.TestScalar.Theano_python2_debug / Theano python2 debug / test_none

Error Message:
InvalidValueError
type(variable) = GpuArrayType<None>(float64, scalar)
variable = GpuContiguous.0
type(value) = <type 'pygpu.gpuarray.GpuArray'>
dtype(value) = float64
shape(value) = (1,)
value = [-0.43974637]
min(value) = N/A
max(value) = N/A
isfinite = N/A
client_node = None
hint = perform output
specific_hint = none
context = ...
GpuContiguous [id A] ''
|<GpuArrayType<None>(float64, scalar)> [id B]



Stack Trace:
Traceback (most recent call last):
File "/miniconda/lib/python2.7/unittest/case.py", line 329, in run
testMethod()
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_reduction.py", line 145, in test_none
self.compute(None)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_reduction.py", line 133, in compute
self.compute_gpu(test_gpu_tensor, test_host_tensor, axis)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_reduction.py", line 122, in compute_gpu
f(test_gpu_tensor)
File "/home/jenkins/workspace/Theano_buildbot/theano/compile/function_module.py", line 903, in __call__
self.fn() if output_subset is None else\
File "/home/jenkins/workspace/Theano_buildbot/theano/compile/debugmode.py", line 2128, in deco
return f()
File "/home/jenkins/workspace/Theano_buildbot/theano/compile/debugmode.py", line 1868, in f
specific_hint=hint2)
InvalidValueError: InvalidValueError
type(variable) = GpuArrayType<None>(float64, scalar)
variable = GpuContiguous.0
type(value) = <type 'pygpu.gpuarray.GpuArray'>
dtype(value) = float64
shape(value) = (1,)
value = [-0.43974637]
min(value) = N/A
max(value) = N/A
isfinite = N/A
client_node = None
hint = perform output
specific_hint = none
context = ...
GpuContiguous [id A] ''
|<GpuArrayType<None>(float64, scalar)> [id B]
val1 = -2.692415 , val2 = 0.007147
abs. error = 2.699562, abs. tolerance = 0.000100
rel. error = 1.000000, rel. tolerance = 0.000100
Exception args:
The error happened with the following inputs:, [array([[-0.87358593, -1.55278895, -0.35408721, ..., -0.49752079,
-0.85510664, -0.6999594 ],
[-0.5949778 , 0.97782005, 1.8256197 , ..., -0.55031211,
-0.40044814, 0.8257682 ],
[ 0.97345433, 0.23409088, 0.51433448, ..., -1.18179834,
0.65900681, -0.08932739],
...,
[-0.76324739, 0.83465287, -1.12997627, ..., -0.37088538,
-0.34178301, -0.12359521],
[ 0.23698662, 0.07112414, -1.25534911, ..., -0.97729031,
-1.80208581, -1.70212787],
[ 0.40060026, 1.00124872, -1.50947706, ..., 1.27537703,
-0.93864411, -0.20013627]]), array([[ 0.9413269 ],
[ 0.16741418],
[ 0.67268197],
[ 0.04398833],
[ 0.03145609],
[ 0.90863686],
[ 0.31636607],
[ 0.70491174],
[ 0.71329291],
[ 0.00947699],
[ 0.9955215 ],
[ 0.5918795 ],
[ 0.37757517],
[ 0.93415092],
[ 0.02667386],
[ 0.77824184],
[ 0.33266335],
[ 0.56459107],
[ 0.50887874],
[ 0.02919855],
[ 0.0581641 ],
[ 0.37019088],
[ 0.74335675],
[ 0.95642767],
[ 0.11587993],
[ 0.36623191],
[ 0.84603199],
[ 0.06971566],
[ 0.41894258],
[ 0.29541243],
[ 0.95899419],
[ 0.80717005],
[ 0.70956049],
[ 0.45233082],
[ 0.34581293],
[ 0.9588289 ],
[ 0.35633667],
[ 0.56067025],
[ 0.32088824],
[ 0.56667321],
[ 0.74412296],
[ 0.14557711],
[ 0.0900058 ],
[ 0.26803796],
[ 0.52384278],
[ 0.94051757],
[ 0.97765257],
[ 0.90530343],
[ 0.58566156],
[ 0.9222097 ],
[ 0.0968617 ],
[ 0.58593306],
[ 0.16392659],
[ 0.03092491],
[ 0.57364773],
[ 0.61984368],
[ 0.53805058],
[ 0.93857932],
[ 0.44896453],
[ 0.40421722],
[ 0.35727167],
[ 0.39782218],
[ 0.36335789],
[ 0.59645554],
[ 0.56735849],
[ 0.89063713],
[ 0.59141395],
[ 0.72087309],
[ 0.37766799],
[ 0.49039317],
[ 0.6761985 ],
[ 0.65205537],
[ 0.03173416],
[ 0.92980844],
[ 0.30735488],
[ 0.26652657],
[ 0.61456164],
[ 0.80812145],
[ 0.95297593],
[ 0.02547018],
[ 0.87960149],
[ 0.86531778],
[ 0.54710591],
[ 0.86716071],
[ 0.54228721],
[ 0.45245928],
[ 0.90779225],
[ 0.65753862],
[ 0.18926345],
[ 0.41429259],
[ 0.87322022],
[ 0.66811472],
[ 0.77683998],
[ 0.59526259],
[ 0.93218046],
[ 0.59446028],
[ 0.82324398],
[ 0.72296155],
[ 0.51677092],
[ 0.2989389 ]])],
The value of eps is:, None,
The out_type is:, None

Stack Trace:
Traceback (most recent call last):
File "/miniconda/lib/python2.7/unittest/case.py", line 329, in run
testMethod()
File "/miniconda/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 658, in <lambda>
yield (lambda: utt.verify_grad(f, [r, y], 3, rng))
File "/home/jenkins/workspace/Theano_buildbot/theano/tests/unittest_tools.py", line 92, in verify_grad
T.verify_grad(op, pt, n_tests, rng, *args, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gradient.py", line 1795, in verify_grad
abs_tol, rel_tol)
GradientError: GradientError: numeric gradient and analytic gradient exceed tolerance:
At position 0 of argument 0 with shape (100, 100),
val1 = -2.692415 , val2 = 0.007147
abs. error = 2.699562, abs. tolerance = 0.000100
rel. error = 1.000000, rel. tolerance = 0.000100
Exception args:
The error happened with the following inputs:, [array([[-0.87358593, -1.55278895, -0.35408721, ..., -0.49752079,
-0.85510664, -0.6999594 ],
[-0.5949778 , 0.97782005, 1.8256197 , ..., -0.55031211,
-0.40044814, 0.8257682 ],
[ 0.97345433, 0.23409088, 0.51433448, ..., -1.18179834,
0.65900681, -0.08932739],
...,
[-0.76324739, 0.83465287, -1.12997627, ..., -0.37088538,
-0.34178301, -0.12359521],
[ 0.23698662, 0.07112414, -1.25534911, ..., -0.97729031,
-1.80208581, -1.70212787],
[ 0.40060026, 1.00124872, -1.50947706, ..., 1.27537703,
-0.93864411, -0.20013627]]), array([[ 0.9413269 ],
[ 0.16741418],
[ 0.67268197],
[ 0.04398833],
[ 0.03145609],
[ 0.90863686],
[ 0.31636607],
[ 0.70491174],
[ 0.71329291],
[ 0.00947699],
[ 0.9955215 ],
[ 0.5918795 ],
[ 0.37757517],
[ 0.93415092],
[ 0.02667386],
[ 0.77824184],
[ 0.33266335],
[ 0.56459107],
[ 0.50887874],
[ 0.02919855],
[ 0.0581641 ],
[ 0.37019088],
[ 0.74335675],
[ 0.95642767],
[ 0.11587993],
[ 0.36623191],
[ 0.84603199],
[ 0.06971566],
[ 0.41894258],
[ 0.29541243],
[ 0.95899419],
[ 0.80717005],
[ 0.70956049],
[ 0.45233082],
[ 0.34581293],
[ 0.9588289 ],
[ 0.35633667],
[ 0.56067025],
[ 0.32088824],
[ 0.56667321],
[ 0.74412296],
[ 0.14557711],
[ 0.0900058 ],
[ 0.26803796],
[ 0.52384278],
[ 0.94051757],
[ 0.97765257],
[ 0.90530343],
[ 0.58566156],
[ 0.9222097 ],
[ 0.0968617 ],
[ 0.58593306],
[ 0.16392659],
[ 0.03092491],
[ 0.57364773],
[ 0.61984368],
[ 0.53805058],
[ 0.93857932],
[ 0.44896453],
[ 0.40421722],
[ 0.35727167],
[ 0.39782218],
[ 0.36335789],
[ 0.59645554],
[ 0.56735849],
[ 0.89063713],
[ 0.59141395],
[ 0.72087309],
[ 0.37766799],
[ 0.49039317],
[ 0.6761985 ],
[ 0.65205537],
[ 0.03173416],
[ 0.92980844],
[ 0.30735488],
[ 0.26652657],
[ 0.61456164],
[ 0.80812145],
[ 0.95297593],
[ 0.02547018],
[ 0.87960149],
[ 0.86531778],
[ 0.54710591],
[ 0.86716071],
[ 0.54228721],
[ 0.45245928],
[ 0.90779225],
[ 0.65753862],
[ 0.18926345],
[ 0.41429259],
[ 0.87322022],
[ 0.66811472],
[ 0.77683998],
[ 0.59526259],
[ 0.93218046],
[ 0.59446028],
[ 0.82324398],
[ 0.72296155],
[ 0.51677092],
[ 0.2989389 ]])],
The value of eps is:, None,
The out_type is:, None


FAILED: theano.gpuarray.tests.test_linalg.Theano_python3 / Theano python3 / test_lower_triangular_and_cholesky_grad

Error Message:
File "/miniconda/envs/py3k/lib/python3.6/unittest/case.py", line 59, in testPartExecutor
yield
File "/miniconda/envs/py3k/lib/python3.6/unittest/case.py", line 605, in run
testMethod()
File "/miniconda/envs/py3k/lib/python3.6/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 658, in <lambda>
yield (lambda: utt.verify_grad(f, [r, y], 3, rng))
File "/home/jenkins/workspace/Theano_buildbot/theano/tests/unittest_tools.py", line 92, in verify_grad
T.verify_grad(op, pt, n_tests, rng, *args, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gradient.py", line 1795, in verify_grad
abs_tol, rel_tol)
theano.gradient.GradientError: GradientError: numeric gradient and analytic gradient exceed tolerance:
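For context on the GradientError entries above: `utt.verify_grad` compares a finite-difference (numeric) gradient against the graph's analytic gradient and fails when both the absolute and relative error exceed their tolerances. The sketch below is a hand-rolled NumPy illustration of that mechanism; the function names and the exact tolerance rule here are simplifying assumptions, not Theano's implementation (see `theano/gradient.py` for the real one).

```python
import numpy as np

def numeric_grad(f, x, eps=1e-7):
    """Central-difference estimate of df/dx, one coordinate at a time."""
    g = np.zeros_like(x)
    it = np.nditer(x, flags=["multi_index"])
    for _ in it:
        i = it.multi_index
        x_plus = x.copy();  x_plus[i] += eps
        x_minus = x.copy(); x_minus[i] -= eps
        g[i] = (f(x_plus) - f(x_minus)) / (2 * eps)
    return g

def check_grad(f, grad_f, x, abs_tol=1e-4, rel_tol=1e-4):
    """Fail (like verify_grad) when a coordinate exceeds BOTH tolerances.

    rel. error is abs_err / (|val1| + |val2|), which is why the log above
    reports rel. error = 1.0 when the analytic gradient is near zero but
    the numeric one is huge.
    """
    num = numeric_grad(f, x)          # "val1" in the log
    ana = grad_f(x)                   # "val2" in the log
    abs_err = np.abs(num - ana)
    rel_err = abs_err / np.maximum(np.abs(num) + np.abs(ana), 1e-8)
    bad = (abs_err > abs_tol) & (rel_err > rel_tol)
    if bad.any():
        i = np.unravel_index(np.argmax(abs_err * bad), x.shape)
        raise ValueError(
            "val1 = %f , val2 = %f\nabs. error = %f, abs. tolerance = %f\n"
            "rel. error = %f, rel. tolerance = %f"
            % (num[i], ana[i], abs_err[i], abs_tol, rel_err[i], rel_tol))

# A correct gradient passes silently:
f = lambda x: float((x ** 2).sum())
check_grad(f, lambda x: 2 * x, np.random.RandomState(0).randn(5, 5))
```

A wrong analytic gradient (say `3 * x` for the function above) raises a `ValueError` formatted like the GradientError lines in these reports.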

jenkins...@mila.quebec

Dec 8, 2018, 8:56:59 AM12/8/18
to theano-...@googlegroups.com
Theano_buildbot - Build # 179 - Still unstable -
Tot=83790 Skip=6129 Fail=6

See https://jenkins.mila.quebec/job/Theano_buildbot/179/ to view the results.

Failed tests:
6 tests failed.
FAILED: theano.gpuarray.tests.test_reduction.TestScalar.Theano_python2_debug / Theano python2 debug / test_none

Error Message:
InvalidValueError
type(variable) = GpuArrayType<None>(float64, scalar)
variable = GpuContiguous.0
type(value) = <type 'pygpu.gpuarray.GpuArray'>
dtype(value) = float64
shape(value) = (1,)
value = [-0.5855367]
min(value) = N/A
max(value) = N/A
isfinite = N/A
client_node = None
hint = perform output
specific_hint = none
context = ...
GpuContiguous [id A] ''
|<GpuArrayType<None>(float64, scalar)> [id B]
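For context on the InvalidValueError above: DebugMode validates that the value each op's `perform()` produces matches the declared output type; here a variable declared as a float64 *scalar* (0-d) received a GpuArray of shape (1,). The snippet below is a minimal NumPy illustration of that kind of consistency check; the function name and signature are hypothetical, not Theano's API.

```python
import numpy as np

def check_output_matches_type(value, expected_ndim, expected_dtype):
    """Sketch of a DebugMode-style check: the produced value must
    match the declared output type exactly (ndim and dtype)."""
    problems = []
    if value.ndim != expected_ndim:
        problems.append("shape(value) = %s but declared ndim = %d"
                        % (value.shape, expected_ndim))
    if value.dtype != np.dtype(expected_dtype):
        problems.append("dtype(value) = %s but declared dtype = %s"
                        % (value.dtype, expected_dtype))
    return problems

# The failing case above: a float64 scalar output (ndim 0) whose
# perform() returned an array of shape (1,) instead.
bad = np.array([-0.5855367])           # shape (1,), ndim 1 -> flagged
print(check_output_matches_type(bad, expected_ndim=0, expected_dtype="float64"))

good = np.array(-0.5855367)            # true 0-d scalar -> passes
print(check_output_matches_type(good, expected_ndim=0, expected_dtype="float64"))
```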




FAILED: theano.scan_module.tests.test_scan.T_Scan.Theano_python2_debug / Theano python2 debug / test_grad_multiple_outs_some_disconnected

Error Message:
0.0205299831997

Stack Trace:
Traceback (most recent call last):
File "/miniconda/lib/python2.7/unittest/case.py", line 329, in run
testMethod()
File "/home/jenkins/workspace/Theano_buildbot/theano/scan_module/tests/test_scan.py", line 2008, in test_grad_multiple_outs_some_disconnected
assert final_cost < 0.02, final_cost
AssertionError: 0.0205299831997
At position 2 of argument 0 with shape (100, 100),
val1 = -24050.781781 , val2 = 0.725584
abs. error = 24051.507365, abs. tolerance = 0.000100
rel. error = 1.000000, rel. tolerance = 0.000100
Exception args:
The error happened with the following inputs:, [array([[ 0.82012967, 0.53932947, 0.79950803, ..., -0.27456699,
-0.20484751, -1.16736978],
[-0.35938734, -0.56797985, 1.56261071, ..., 1.80149008,
-1.11966046, -0.20735454],
[-0.98084527, 1.13106894, -1.27600589, ..., 1.43766287,
-1.52353646, 0.77955075],
...,
[-0.03711281, -1.0081877 , 1.05061423, ..., -0.4101615 ,
0.12093505, -0.112845 ],
[ 0.33789368, -0.52921799, -0.14181119, ..., -0.56831955,
-0.6526261 , -1.55137944],
[-0.74065039, 0.39428827, -1.64027467, ..., 0.36029371,
1.91319371, 0.2087125 ]]), array([[ 8.77184799e-02],
[ 7.48727016e-01],
[ 9.73092570e-01],
[ 6.96066151e-02],
[ 6.02716933e-01],
[ 8.99164542e-01],
[ 5.03043762e-01],
[ 2.48809678e-02],
[ 2.80921041e-01],
[ 8.83054603e-01],
[ 5.12723637e-01],
[ 9.81461693e-01],
[ 2.39584552e-01],
[ 9.83064781e-01],
[ 5.96155028e-01],
[ 7.41624922e-01],
[ 8.96043565e-01],
[ 8.35178164e-01],
[ 6.74338199e-01],
[ 8.49911178e-01],
[ 4.28699885e-01],
[ 1.77341317e-01],
[ 6.57020444e-01],
[ 4.38548235e-01],
[ 3.77302930e-01],
[ 7.05952015e-02],
[ 4.76352277e-02],
[ 3.97811530e-01],
[ 3.74109097e-01],
[ 8.24803340e-01],
[ 9.23290948e-01],
[ 1.17673865e-01],
[ 1.56873318e-01],
[ 5.19526473e-01],
[ 4.20645372e-01],
[ 6.92132978e-01],
[ 4.26993433e-01],
[ 1.24635529e-01],
[ 3.81236080e-01],
[ 3.33849837e-01],
[ 8.40445572e-01],
[ 7.84473074e-01],
[ 9.87146267e-01],
[ 1.86192671e-05],
[ 6.59436362e-01],
[ 8.37945979e-01],
[ 4.75634910e-02],
[ 2.13101195e-01],
[ 6.43231801e-01],
[ 4.04281786e-01],
[ 1.12450925e-01],
[ 9.60710848e-01],
[ 5.72313058e-01],
[ 4.78160465e-01],
[ 1.57592853e-02],
[ 1.60133189e-01],
[ 1.98754573e-02],
[ 7.45431742e-01],
[ 3.88494244e-01],
[ 7.42600664e-01],
[ 8.65485197e-01],
[ 7.79148329e-01],
[ 5.74994203e-01],
[ 7.72952352e-01],
[ 3.40488222e-01],
[ 3.90092647e-01],
[ 3.80001290e-01],
[ 1.50418697e-02],
[ 2.20368434e-01],
[ 6.07610022e-01],
[ 8.76952140e-03],
[ 5.87922989e-01],
[ 5.29390086e-01],
[ 8.55430579e-01],
[ 2.55355529e-01],
[ 3.83056127e-01],
[ 6.76566749e-01],
[ 1.96006345e-01],
[ 7.27774269e-01],
[ 7.86825665e-01],
[ 8.22587675e-01],
[ 3.27320164e-01],
[ 1.69033535e-02],
[ 5.75632307e-02],
[ 2.13472642e-01],
[ 2.53647155e-01],
[ 9.64485699e-01],
[ 7.64740373e-01],
[ 3.54968115e-01],
[ 1.29598118e-01],
[ 2.01388052e-01],
[ 1.47400099e-01],
[ 4.88845156e-02],
[ 1.04714233e-01],
[ 3.23810854e-01],
[ 1.02601700e-01],
[ 6.95229168e-01],
[ 5.66036111e-01],
[ 8.58726427e-01],
[ 9.94486601e-02]])],
The value of eps is:, None,
The out_type is:, None

Stack Trace:
Traceback (most recent call last):
File "/miniconda/lib/python2.7/unittest/case.py", line 329, in run
testMethod()
File "/miniconda/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/home/jenkins/workspace/Theano_buildbot/theano/gpuarray/tests/test_linalg.py", line 658, in <lambda>
yield (lambda: utt.verify_grad(f, [r, y], 3, rng))
File "/home/jenkins/workspace/Theano_buildbot/theano/tests/unittest_tools.py", line 92, in verify_grad
T.verify_grad(op, pt, n_tests, rng, *args, **kwargs)
File "/home/jenkins/workspace/Theano_buildbot/theano/gradient.py", line 1795, in verify_grad
abs_tol, rel_tol)
GradientError: GradientError: numeric gradient and analytic gradient exceed tolerance:
At position 2 of argument 0 with shape (100, 100),
val1 = -24050.781781 , val2 = 0.725584
abs. error = 24051.507365, abs. tolerance = 0.000100
rel. error = 1.000000, rel. tolerance = 0.000100
Exception args:
The error happened with the following inputs:, [array([[ 0.82012967, 0.53932947, 0.79950803, ..., -0.27456699,
-0.20484751, -1.16736978],
[-0.35938734, -0.56797985, 1.56261071, ..., 1.80149008,
-1.11966046, -0.20735454],
[-0.98084527, 1.13106894, -1.27600589, ..., 1.43766287,
-1.52353646, 0.77955075],
...,
[-0.03711281, -1.0081877 , 1.05061423, ..., -0.4101615 ,
0.12093505, -0.112845 ],
[ 0.33789368, -0.52921799, -0.14181119, ..., -0.56831955,
-0.6526261 , -1.55137944],
[-0.74065039, 0.39428827, -1.64027467, ..., 0.36029371,
1.91319371, 0.2087125 ]]), array([[ 8.77184799e-02],
[ 7.48727016e-01],
[ 9.73092570e-01],
[ 6.96066151e-02],
[ 6.02716933e-01],
[ 8.99164542e-01],
[ 5.03043762e-01],
[ 2.48809678e-02],
[ 2.80921041e-01],
[ 8.83054603e-01],
[ 5.12723637e-01],
[ 9.81461693e-01],
[ 2.39584552e-01],
[ 9.83064781e-01],
[ 5.96155028e-01],
[ 7.41624922e-01],
[ 8.96043565e-01],
[ 8.35178164e-01],
[ 6.74338199e-01],
[ 8.49911178e-01],
[ 4.28699885e-01],
[ 1.77341317e-01],
[ 6.57020444e-01],
[ 4.38548235e-01],
[ 3.77302930e-01],
[ 7.05952015e-02],
[ 4.76352277e-02],
[ 3.97811530e-01],
[ 3.74109097e-01],
[ 8.24803340e-01],
[ 9.23290948e-01],
[ 1.17673865e-01],
[ 1.56873318e-01],
[ 5.19526473e-01],
[ 4.20645372e-01],
[ 6.92132978e-01],
[ 4.26993433e-01],
[ 1.24635529e-01],
[ 3.81236080e-01],
[ 3.33849837e-01],
[ 8.40445572e-01],
[ 7.84473074e-01],
[ 9.87146267e-01],
[ 1.86192671e-05],
[ 6.59436362e-01],
[ 8.37945979e-01],
[ 4.75634910e-02],
[ 2.13101195e-01],
[ 6.43231801e-01],
[ 4.04281786e-01],
[ 1.12450925e-01],
[ 9.60710848e-01],
[ 5.72313058e-01],
[ 4.78160465e-01],
[ 1.57592853e-02],
[ 1.60133189e-01],
[ 1.98754573e-02],
[ 7.45431742e-01],
[ 3.88494244e-01],
[ 7.42600664e-01],
[ 8.65485197e-01],
[ 7.79148329e-01],
[ 5.74994203e-01],
[ 7.72952352e-01],
[ 3.40488222e-01],
[ 3.90092647e-01],
[ 3.80001290e-01],
[ 1.50418697e-02],
[ 2.20368434e-01],
[ 6.07610022e-01],
[ 8.76952140e-03],
[ 5.87922989e-01],
[ 5.29390086e-01],
[ 8.55430579e-01],
[ 2.55355529e-01],
[ 3.83056127e-01],
[ 6.76566749e-01],
[ 1.96006345e-01],
[ 7.27774269e-01],
[ 7.86825665e-01],
[ 8.22587675e-01],
[ 3.27320164e-01],
[ 1.69033535e-02],
[ 5.75632307e-02],
[ 2.13472642e-01],
[ 2.53647155e-01],
[ 9.64485699e-01],
[ 7.64740373e-01],
[ 3.54968115e-01],
[ 1.29598118e-01],
[ 2.01388052e-01],
[ 1.47400099e-01],
[ 4.88845156e-02],
[ 1.04714233e-01],
[ 3.23810854e-01],
[ 1.02601700e-01],
[ 6.95229168e-01],
[ 5.66036111e-01],
[ 8.58726427e-01],
[ 9.94486601e-02]])],