Strange loss curves when BatchNormalization is used in Keras
Part of my code:

from keras.applications.mobilenet_v2 import MobileNetV2
from keras.layers import Flatten, Dense, BatchNormalization, Activation
from keras.models import Model

# IMG_SIZE and CHANNELS are defined earlier in the script
mobilenetv2 = MobileNetV2(input_shape=(IMG_SIZE, IMG_SIZE, CHANNELS),
                          alpha=1.0,
                          depth_multiplier=1,
                          include_top=False,
                          weights='imagenet',
                          input_tensor=None,
                          pooling=None,
                          classes=12)

# freeze the pretrained base so only the new head is trained
for layer in mobilenetv2.layers:
    layer.trainable = False

# new classification head: Dense -> BatchNormalization -> ReLU, twice
last = mobilenetv2.layers[-1].output
x = Flatten()(last)
x = Dense(120, use_bias=False)(x)
x = BatchNormalization()(x)
x = Activation('relu')(x)
x = Dense(84, use_bias=False)(x)
x = BatchNormalization()(x)
x = Activation('relu')(x)
preds = Dense(12, activation='softmax')(x)
model = Model(inputs=mobilenetv2.input, outputs=preds)
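The compile/fit code is not shown above; a minimal sketch of how the model might be compiled and trained (the optimizer, learning rate, batch size, epoch count, and the x_train/y_train, x_val/y_val names are assumptions, not taken from my actual script):

from keras.optimizers import Adam

# assumed training setup; the real one may differ
model.compile(optimizer=Adam(lr=1e-3),
              loss='categorical_crossentropy',
              metrics=['accuracy'])

# x_train/y_train and x_val/y_val stand in for the actual 12-class
# dataset (labels one-hot encoded for categorical_crossentropy)
history = model.fit(x_train, y_train,
                    validation_data=(x_val, y_val),
                    batch_size=32,
                    epochs=20)

# history.history['loss'] and history.history['val_loss'] are the
# curves plotted below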
But the resulting loss curves look like this:

[loss curve plot]
Are the above curves normal? I did not use dropout layers because I was asked to compare dropout layers with BatchNormalization, but the curves look strange. Is my code correct, or is anything missing?
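For reference, the dropout variant I am asked to compare against would replace BatchNormalization with Dropout in the head. A minimal sketch of that variant (the 0.5 rate is an assumption, and it reuses mobilenetv2 and last from the code above):

from keras.layers import Flatten, Dense, Dropout
from keras.models import Model

# same head as above, but with Dropout instead of BatchNormalization
x = Flatten()(last)
x = Dense(120, activation='relu')(x)
x = Dropout(0.5)(x)   # assumed rate, not fixed by the assignment
x = Dense(84, activation='relu')(x)
x = Dropout(0.5)(x)
preds_dropout = Dense(12, activation='softmax')(x)
model_dropout = Model(inputs=mobilenetv2.input, outputs=preds_dropout)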
Thanks
keras keras-layer batch-normalization
asked Nov 17 at 14:35 by BAE
Maybe this answer is relevant. – today, Nov 17 at 15:06