TensorFlow model fails to train with 'InvalidArgumentError' when using custom data generator

Hey everyone, I'm running into an issue that's driving me crazy. I'm working on a TensorFlow project (version 2.8.0) where I'm trying to train a Convolutional Neural Network (CNN) using a custom data generator that yields images and labels. However, I'm getting an `InvalidArgumentError` during training, specifically when I call `model.fit`.

Here's a simplified version of my data generator:

```python
import numpy as np
import tensorflow as tf


class CustomDataGenerator(tf.keras.utils.Sequence):
    def __init__(self, image_paths, labels, batch_size):
        self.image_paths = image_paths
        self.labels = labels
        self.batch_size = batch_size
        self.indices = np.arange(len(self.image_paths))

    def __len__(self):
        # Number of batches per epoch
        return int(np.ceil(len(self.image_paths) / self.batch_size))

    def __getitem__(self, index):
        # Slice out the indices for this batch, then load the images and labels
        batch_indices = self.indices[index * self.batch_size:(index + 1) * self.batch_size]
        batch_images = [self.load_image(self.image_paths[i]) for i in batch_indices]
        batch_labels = [self.labels[i] for i in batch_indices]
        return np.array(batch_images), np.array(batch_labels)

    def load_image(self, path):
        # Read, decode, resize to 128x128, and normalize to [0, 1]
        img = tf.io.read_file(path)
        img = tf.image.decode_image(img, channels=3)
        img = tf.image.resize(img, [128, 128])
        return img / 255.0
```

I start training with:

```python
model.fit(data_generator, epochs=10)
```

But I keep getting this error:

```
InvalidArgumentError: Cannot reshape a tensor with 128 elements to shape [32,128,128,3] (128 is not divisible by 32)
```

I've checked that my generator yields the correct shapes, and I've logged the output of `batch_images.shape` just before returning from `__getitem__`, which matches the expected shape. I suspect it might be an issue with how the batches are generated or with the way TensorFlow handles the input, but I'm not sure how to debug this further. Also, my labels are one-hot encoded and have the shape `(batch_size, num_classes)`.

Can someone guide me on how to troubleshoot this issue, or suggest best practices for implementing a custom data generator in TensorFlow? I'm working on an API that needs to handle this. I'm on macOS using the latest version of Python.

Thanks in advance, I appreciate any insights!
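
For reference, this is roughly how I've been verifying the shapes outside of `model.fit`. It's a minimal, self-contained sketch rather than my real pipeline: the dummy PNGs written to a temp directory, `num_classes = 3`, and `batch_size = 2` are placeholders just so the snippet runs on its own.

```python
import os
import tempfile

import numpy as np
import tensorflow as tf

# Placeholder setup: write a few small dummy PNGs to a temp directory
# so the check is self-contained (my real data comes from disk).
tmp_dir = tempfile.mkdtemp()
image_paths = []
for i in range(4):
    path = os.path.join(tmp_dir, f"dummy_{i}.png")
    dummy = tf.cast(tf.random.uniform([64, 64, 3], maxval=256, dtype=tf.int32), tf.uint8)
    tf.io.write_file(path, tf.io.encode_png(dummy))
    image_paths.append(path)

num_classes = 3  # placeholder class count
labels = np.eye(num_classes)[np.random.randint(0, num_classes, size=len(image_paths))]

gen = CustomDataGenerator(image_paths, labels, batch_size=2)

# Pull the first batch directly from the generator (no model.fit involved)
# and print the shapes -- the same kind of check as my logging in __getitem__.
batch_images, batch_labels = gen[0]
print("images:", batch_images.shape)   # expecting (2, 128, 128, 3)
print("labels:", batch_labels.shape)   # expecting (2, num_classes)
print("batches per epoch:", len(gen))
```

When I run a check like this directly, the printed shapes look exactly as expected, which is why I'm confused that the same generator blows up inside `model.fit`.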
I'm attempting to set up I can't seem to get Hey everyone, I'm running into an issue that's driving me crazy... I'm working on a TensorFlow project (version 2.8.0) where I'm trying to train a Convolutional Neural Network (CNN) using a custom data generator that yields images and labels. However, I'm working with an `InvalidArgumentError` during training, specifically when I call `model.fit`. Here's a simplified version of my data generator: ```python class CustomDataGenerator(tf.keras.utils.Sequence): def __init__(self, image_paths, labels, batch_size): self.image_paths = image_paths self.labels = labels self.batch_size = batch_size self.indices = np.arange(len(self.image_paths)) def __len__(self): return int(np.ceil(len(self.image_paths) / self.batch_size)) def __getitem__(self, index): batch_indices = self.indices[index * self.batch_size:(index + 1) * self.batch_size] batch_images = [self.load_image(self.image_paths[i]) for i in batch_indices] batch_labels = [self.labels[i] for i in batch_indices] return np.array(batch_images), np.array(batch_labels) def load_image(self, path): img = tf.io.read_file(path) img = tf.image.decode_image(img, channels=3) img = tf.image.resize(img, [128, 128]) return img / 255.0 ``` During training, I initiate the model with: ```python model.fit(data_generator, epochs=10) ``` But I keep getting this behavior: ``` InvalidArgumentError: want to reshape a tensor with 128 elements to shape [32,128,128,3] (128 is not divisible by 32) ``` I’ve checked that my generator yields the correct shapes, and I've logged the output of `batch_images.shape` just before returning from `__getitem__`, which matches the expected shape. I suspect it might be an scenario with how the batches are generated or the way TensorFlow handles the input, but I'm not sure how to debug this further. Also, my labels are one-hot encoded and have the shape `(batch_size, num_classes)`. Can someone guide to troubleshoot this scenario or suggest best practices for implementing a custom data generator in TensorFlow? I'm working on a API that needs to handle this. Thanks in advance! I'm on macOS using the latest version of Python. I appreciate any insights!