Unexpected shape mismatch with tf.data.Dataset and Model.fit in TensorFlow 2.12
I'm getting a `ValueError: Shapes (None, 10) and (None, 5) are incompatible` when training my Keras model with TensorFlow 2.12. The dataset is created with `tf.data.Dataset`, and I suspect the error stems from the way I'm batching and preparing the data. My model has 5 output neurons for 5 classes, but the label shape coming out of the dataset doesn't seem to align with the model's output.

Here's the relevant portion of my code:

```python
import tensorflow as tf
from tensorflow import keras

# Sample data creation
num_samples = 1000
num_features = 20
num_classes = 5

# Generate random data
X = tf.random.normal((num_samples, num_features))
Y = tf.random.uniform((num_samples,), maxval=num_classes, dtype=tf.int32)

# One-hot encode the labels
Y = tf.keras.utils.to_categorical(Y, num_classes)

# Create a tf.data.Dataset
dataset = tf.data.Dataset.from_tensor_slices((X, Y)).batch(32)

# Build a simple model
model = keras.Sequential([
    keras.layers.Dense(64, activation='relu', input_shape=(num_features,)),
    keras.layers.Dense(num_classes, activation='softmax')
])

model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])

# Train the model
model.fit(dataset, epochs=10)
```

I've confirmed that `Y` is one-hot encoded with shape `(1000, 5)`, and printing the shape of each batch from the dataset correctly shows `(32, 5)` for the labels. I've tried reshaping the data and adjusting the batch size, but the error persists.

Environment: Python in a Docker container on CentOS. Any insights on why this shape mismatch is occurring, and what the correct way to implement this is, would be greatly appreciated.
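For completeness, this is roughly how I checked the per-batch shapes. The `element_spec` call and the one-batch loop are just my debugging sketch, separate from the training script above:

```python
import tensorflow as tf

# Same data pipeline as in the training script
num_samples, num_features, num_classes = 1000, 20, 5
X = tf.random.normal((num_samples, num_features))
Y = tf.random.uniform((num_samples,), maxval=num_classes, dtype=tf.int32)
Y = tf.keras.utils.to_categorical(Y, num_classes)
dataset = tf.data.Dataset.from_tensor_slices((X, Y)).batch(32)

# element_spec shows the (batch, features) / (batch, labels) structure
# the dataset will yield, with None for the batch dimension
print(dataset.element_spec)

# Pull one concrete batch and inspect its shapes
for xb, yb in dataset.take(1):
    print(xb.shape, yb.shape)  # (32, 20) (32, 5)
```

Both checks show the labels coming through as `(32, 5)`, which is why the `(None, 10)` in the error message confuses me.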
I recently upgraded to Python 3.10. Could this be a known issue?