line 74, in <module>
example_batch_loss = loss(target_example_batch, example_batch_predictions)
File "/Users/danielrakityansky/PycharmProjects/neuralxd/main.py", line 66, in loss
return tf.keras.losses.sparse_categorical_crossentropy(labels, logits, from_logits=True)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/tensorflow/python/keras/losses.py", line 953, in sparse_categorical_crossentropy
y_true, y_pred, from_logits=from_logits, axis=axis)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/tensorflow/python/keras/backend.py", line 4205, in sparse_categorical_crossentropy
labels=targets, logits=logits)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/tensorflow/python/ops/nn_ops.py", line 3338, in sparse_softmax_cross_entropy_with_logits
logits.get_shape()))
ValueError: Shape mismatch: The shape of labels (received (192,)) should equal the shape of logits except for the last dimension (received (19200, 191)).
To figure out: where did that one thing get lost (the labels arrive with shape (192,), but the logits expect 19200 of them, i.e. one whole dimension is missing)
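A minimal sketch of the shape contract that sparse_categorical_crossentropy enforces, using hypothetical batch and sequence sizes (only the vocabulary size of 191 comes from the error message itself): the labels must match the logits in every dimension except the last one, so when they come in with one dimension fewer, exactly this ValueError is raised.

import tensorflow as tf

# Hypothetical shapes for illustration; only vocab_size = 191 is taken from the traceback.
batch_size, seq_len, vocab_size = 64, 100, 191

# A sequence model emits one logit vector per time step: (batch, seq_len, vocab).
logits = tf.random.normal((batch_size, seq_len, vocab_size))

# Correct labels: same shape as the logits minus the last (vocab) dimension.
labels_ok = tf.random.uniform((batch_size, seq_len), maxval=vocab_size, dtype=tf.int32)
loss_ok = tf.keras.losses.sparse_categorical_crossentropy(labels_ok, logits, from_logits=True)
print(loss_ok.shape)  # (64, 100) -- one loss value per predicted token

# Broken labels: the sequence dimension is gone. Internally the logits are flattened to
# (batch * seq_len, vocab) = (6400, 191) while the labels stay (64,), which triggers the
# same "Shape mismatch" ValueError as in the traceback above.
labels_bad = tf.random.uniform((batch_size,), maxval=vocab_size, dtype=tf.int32)
# tf.keras.losses.sparse_categorical_crossentropy(labels_bad, logits, from_logits=True)  # ValueError

Given the numbers in the traceback (192 labels vs 19200 logit rows, a factor of exactly 100), the likely suspect is the target batch losing its sequence dimension somewhere in the input pipeline before it reaches the loss.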