I am using this model on similar data, except that our images contain Sanskrit words. I created the train, val, and test list files in the same format as the ones used in this model (i.e. image name followed by the ordinals of the characters).
In our case, however, the number of characters (n_classes) is 118 instead of the original 95, and y_max_len is 200 instead of the original 50.
When I train the model, I get the following error:
loaded 25996 samples from ./dataset/train_img_list.txt
loaded 756 samples from ./dataset/val_img_list.txt
building symbolic tensors(0.84720993042)
('#Train samples: ', 25996)
('#Val samples: ', 756)
('#Train Iterations: ', 406)
('#Val Iterations: ', 11)
setting parameters(0.848186016083)
('n_classes: ', 118)
('multi-step: ', set([40600, 71050, 60900]))
building the model(0.848335027695)
Subtensor{int64}.0
Shape.0
computing updates and function(1.2518889904)
using normal sgd and learning_rate:0.00999999977648
('bw_lstm_b', <class 'theano.tensor.sharedvar.TensorSharedVariable'>)
('fw_lstm_W', <class 'theano.tensor.sharedvar.TensorSharedVariable'>)
('fw_lstm_U', <class 'theano.tensor.sharedvar.TensorSharedVariable'>)
('fw_lstm_b', <class 'theano.tensor.sharedvar.TensorSharedVariable'>)
('bw_lstm_W', <class 'theano.tensor.sharedvar.TensorSharedVariable'>)
('bw_lstm_U', <class 'theano.tensor.sharedvar.TensorSharedVariable'>)
('hidden_b', <class 'theano.tensor.sharedvar.TensorSharedVariable'>)
('hidden_W', <class 'theano.tensor.sharedvar.TensorSharedVariable'>)
building training function(2.72188806534)
building validating function(25.7086689472)
begin to train(27.9824080467)
.epoch 1/200 begin(27.982)
[prefetch]height: 150, x_max_step:900.0, y_max_width:200
Traceback (most recent call last):
File "train/train.py", line 148, in
loss = train()
File "/usr/local/lib/python2.7/dist-packages/theano/compile/function_module.py", line 898, in call
storage_map=getattr(self.fn, 'storage_map', None))
File "/usr/local/lib/python2.7/dist-packages/theano/gof/link.py", line 325, in raise_with_op
reraise(exc_type, exc_value, exc_trace)
File "/usr/local/lib/python2.7/dist-packages/theano/compile/function_module.py", line 884, in call
self.fn() if output_subset is None else
File "/usr/local/lib/python2.7/dist-packages/theano/gof/op.py", line 872, in rval
r = p(n, [x[0] for x in i], o)
File "/usr/local/lib/python2.7/dist-packages/theano/tensor/subtensor.py", line 2173, in perform
out[0] = inputs[0].getitem(inputs[1:])
IndexError: index 121 is out of bounds for axis 2 with size 119
Apply node that caused the error: AdvancedSubtensor(Reshape{3}.0, SliceConstant{None, None, None}, InplaceDimShuffle{0,x}.0, <TensorType(int32, matrix)>)
Toposort index: 463
Inputs types: [TensorType(float32, 3D), <theano.tensor.type_other.SliceType object at 0x7f6a4d6d9510>, TensorType(int64, col), TensorType(int32, matrix)]
Inputs shapes: [(900, 64, 119), 'No shapes', (64, 1), (64, 200)]
Inputs strides: [(30464, 476, 4), 'No strides', (8, 8), (800, 4)]
Inputs values: ['not shown', slice(None, None, None), 'not shown', 'not shown']
Outputs clients: [[Reshape{2}(AdvancedSubtensor.0, MakeVector{dtype='int64'}.0), Shape_i{2}(AdvancedSubtensor.0), Shape_i{1}(AdvancedSubtensor.0), Shape_i{0}(AdvancedSubtensor.0)]]
Backtrace when the node is created(use Theano flag traceback.limit=N to make it longer):
File "train/train.py", line 84, in
mid_layer_type = BLSTMLayer, forget=False)
File "/misc/me/rohits/aaron-cnn-lstm-ctc/layers/net.py", line 40, in init
blank = options['blank'], log_space = True)
File "/misc/me/rohits/aaron-cnn-lstm-ctc/layers/ctc_layer.py", line 25, in init
self.log_ctc(labels_len_const = labels_len_const)
File "/misc/me/rohits/aaron-cnn-lstm-ctc/layers/ctc_layer.py", line 94, in log_ctc
x1 = self.x[:, T.arange(n_samples)[:, None], self.y]
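
The IndexError itself points at the likely cause: a label value of 121 is used to index the class axis of the network output, which has size 119 (presumably n_classes + 1, i.e. the 118 characters plus the CTC blank), so at least one ordinal in the list files falls outside the model's output range. A minimal sanity check of the label ranges could look like the sketch below (assuming each line of a list file is an image name followed by whitespace-separated integer ordinals, as described above):

```python
# Rough check of label ordinals against n_classes; assumes the list-file
# format described above ("image_name ord1 ord2 ..."); adjust parsing if needed.
n_classes = 118               # value reported in the training log
output_size = n_classes + 1   # class axis seen in the error: 119 (characters + CTC blank)

max_label = -1
for list_file in ('./dataset/train_img_list.txt', './dataset/val_img_list.txt'):
    with open(list_file) as f:
        for line in f:
            parts = line.split()
            if len(parts) < 2:
                continue
            max_label = max(max_label, max(int(tok) for tok in parts[1:]))

print('max label ordinal found: ', max_label)
print('class axis size (n_classes + blank): ', output_size)
# Every ordinal is used to index an axis of size n_classes + 1 in
# ctc_layer.py's log_ctc, so any label >= 119 (such as the 121 in the
# traceback above) produces exactly this IndexError.
```

If the maximum ordinal comes out at 119 or above, either raising n_classes to cover it or remapping the ordinals into the model's output range should make the shapes consistent.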