
I am trying to use keras-tuner to tune hyperparameters, like this:

!pip install keras-tuner --upgrade
import keras_tuner as kt
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Flatten
from tensorflow.keras.optimizers import Adam

def build_model(hp):
    model = Sequential([
        Flatten(input_shape=(28, 28)),
        Dense(units=hp.Int('units', min_value=16, max_value=64, step=16), activation='relu'),
        Dense(units=hp.Int('units', min_value=8, max_value=20, step=2), activation='softmax')
    ])

    model.compile(
        optimizer=Adam(learning_rate=hp.Float('learning_rate', min_value=1e-4, max_value=1e-2, sampling='log')),
        loss='sparse_categorical_crossentropy',
        metrics=['accuracy']
    )
    return model

# Create a RandomSearch Tuner
tuner = kt.RandomSearch(
    build_model, 
    objective='val_accuracy',
    max_trials=10,
    executions_per_trial=2
)

# Display a summary of the search space
tuner.search_space_summary()

This shows:

Search space summary
Default search space size: 2
units (Int)
{'default': None, 'conditions': [], 'min_value': 16, 'max_value': 64, 'step': 16, 'sampling': 'linear'}
learning_rate (Float)
{'default': 0.0001, 'conditions': [], 'min_value': 0.0001, 'max_value': 0.01, 'step': None, 'sampling': 'log'}

However, the search_space_summary() output only shows the units hyperparameter from the 1st Dense layer; the one from the 2nd Dense layer, i.e. Dense(units=hp.Int('units', min_value=8, max_value=20, step=2), activation='softmax'), does not appear.

Did I misconfigure something, or is it supposed to produce output like that? Could anyone help me understand why the summary looks like this?


1 Answer


Each hyperparameter must have a unique name; this is also stated in the docs. In your case, both layers' units parameters are named 'units', so only the first declaration is registered. You should rename them to something like units_1 and units_2, for example.


3 Comments

Aha, you are right. Thanks for the reference; now I see why I had that problem!
Well, it would probably be better if doing this threw an error, instead of silently ignoring the second parameter...
Yes, an error or warning would be great.
