
I have a dataset of TFRecords and mapped it to a TensorFlow dataset. Each element is a dict of the form { "image": ImageDataTensor, "additional_features": Tensor, "y": Tensor }. The "image" key holds the image data of shape (height, width, channels), "additional_features" holds handcrafted features of shape (n_features,), and "y" holds the integer labels, which have shape (batch_size,) after batching.
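For context, the mapping step looks roughly like this (a simplified sketch, not my exact code; the feature description, dtypes, JPEG decoding and the placeholder file name are assumptions):

import tensorflow as tf

# Sketch of the parsing/mapping step: the keys match the element structure
# described above, but shapes, dtypes and the JPEG decoding are assumptions.
feature_description = {
    "image": tf.io.FixedLenFeature([], tf.string),
    "additional_features": tf.io.FixedLenFeature([7], tf.float32),
    "y": tf.io.FixedLenFeature([], tf.int64),
}

def parse_example(serialized):
    parsed = tf.io.parse_single_example(serialized, feature_description)
    image = tf.io.decode_jpeg(parsed["image"], channels=3)  # assuming RGB
    image = tf.image.convert_image_dtype(image, tf.float32)
    image = tf.image.resize(image, (image_height, image_width))
    return {"image": image,
            "additional_features": parsed["additional_features"],
            "y": parsed["y"]}

train_dataset = (tf.data.TFRecordDataset("train.tfrecord")   # placeholder path
                 .map(parse_example, num_parallel_calls=tf.data.AUTOTUNE)
                 .batch(32))
# val_dataset is built the same way from the validation tfrecords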

The model I use to test the functionality looks like this:

# Create the two inputs
input_image = tf.keras.Input(shape=(image_height, image_width, n_channels), name='image')
input_features = tf.keras.Input(shape=(7,), name='additional_features')

# For this test, just pool the image (no further conv ops, to keep things simple)
x = tf.keras.layers.GlobalAveragePooling2D()(input_image)

# Combine the results from the two input branches
combined_features = tf.keras.layers.concatenate([x, input_features], name="merged_inputs")
x1 = tf.keras.layers.Dense(128, name="denseConnectFeat")(combined_features)

# Create the output (classification) layer
output_layer = tf.keras.layers.Dense(n_classes, activation='softmax', name="y")(x1)

model = tf.keras.Model(inputs={ "image": input_image,
                                "pre_calc_feats" : input_features },
                       outputs={"y" : output_layer})

model.compile(loss={ "y" : "sparse_categorical_crossentropy"},
              metrics=['sparse_categorical_accuracy'],
              optimizer=tf.keras.optimizers.Adam(learning_rate=0.001))

If I try to train the model using my tf.data datasets for training and validation as follows:

model_history = model.fit(train_dataset, validation_data=val_dataset, epochs=10)

I get the following error:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Cell In[505], line 26
     22 mod.compile(loss={ "y" : "sparse_categorical_crossentropy"},
     23             metrics=['sparse_categorical_accuracy'],
     24             optimizer=keras.optimizers.RMSprop(1e-3))
     25 if True:
---> 26     model_history = mod.fit(train_dataset, validation_data=val_dataset,epochs=10)
     27 else:
     28     print(mod.summary())

File /usr/local/lib/python3.10/dist-packages/keras/src/utils/traceback_utils.py:122, in filter_traceback.<locals>.error_handler(*args, **kwargs)
    119     filtered_tb = _process_traceback_frames(e.__traceback__)
    120     # To get the full stack trace, call:
    121     # `keras.config.disable_traceback_filtering()`
--> 122     raise e.with_traceback(filtered_tb) from None
    123 finally:
    124     del filtered_tb

File /usr/local/lib/python3.10/dist-packages/optree/ops.py:747, in tree_map(func, tree, is_leaf, none_is_leaf, namespace, *rests)
    745 leaves, treespec = _C.flatten(tree, is_leaf, none_is_leaf, namespace)
    746 flat_args = [leaves] + [treespec.flatten_up_to(r) for r in rests]
--> 747 return treespec.unflatten(map(func, *flat_args))

**ValueError: None values not supported.**

What am I doing wrong?

Using the dataset in the normal way, with mapping functions that return just (image, label) instead of the dictionary structure needed for the multi-input setup, a plain image classifier that uses only the image and label data trains fine (see the sketch below).
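For reference, that working single-input variant looks roughly like this (again only a sketch; it reuses the feature_description from above, and the decoding details and file name are the same assumptions):

def parse_example_simple(serialized):
    parsed = tf.io.parse_single_example(serialized, feature_description)
    image = tf.io.decode_jpeg(parsed["image"], channels=3)  # assuming RGB
    image = tf.image.convert_image_dtype(image, tf.float32)
    image = tf.image.resize(image, (image_height, image_width))
    return image, parsed["y"]          # plain (image, label) tuple, no dicts

simple_ds = (tf.data.TFRecordDataset("train.tfrecord")       # placeholder path
             .map(parse_example_simple, num_parallel_calls=tf.data.AUTOTUNE)
             .batch(32))

simple_model = tf.keras.Sequential([
    tf.keras.Input(shape=(image_height, image_width, n_channels)),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(n_classes, activation="softmax"),
])
simple_model.compile(loss="sparse_categorical_crossentropy",
                     metrics=["sparse_categorical_accuracy"],
                     optimizer="adam")
simple_model.fit(simple_ds, epochs=1)   # this trains without the ValueError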

  • I have created a dummy dataset to train the model and successfully executed the code using TensorFlow version 2.17.1. Please refer to the attached gist with the code for your reference. Let me know if you're still facing any issues. (Commented Dec 17, 2024)
