@nonlinearjunkie
Created July 1, 2020 10:32

tl_03.py

# --- Transfer learning: MobileNetV2 as a frozen feature extractor ---
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Flatten, Dense, BatchNormalization, Activation, Dropout
from tensorflow.keras.optimizers import Adam
from livelossplot import PlotLossesKerasTF

# Load MobileNetV2 pre-trained on ImageNet, without its classification head
base_model = tf.keras.applications.MobileNetV2(input_shape=(224, 224, 3),
                                               alpha=1.0,
                                               include_top=False,
                                               weights="imagenet")

# Freeze the convolutional base so only the new head is trained
for layer in base_model.layers:
    layer.trainable = False

model_transfered_1 = Sequential()
model_transfered_1.add(base_model)

# Flatten the feature maps produced by the base model
model_transfered_1.add(Flatten())

# 1st fully connected layer
model_transfered_1.add(Dense(32))
model_transfered_1.add(BatchNormalization())
model_transfered_1.add(Activation('relu'))
model_transfered_1.add(Dropout(0.4))

# 2nd fully connected layer
model_transfered_1.add(Dense(32))
model_transfered_1.add(BatchNormalization())
model_transfered_1.add(Activation('relu'))
model_transfered_1.add(Dropout(0.4))

# Output layer: 7 classes with softmax probabilities
model_transfered_1.add(Dense(7, activation='softmax'))

model_transfered_1.compile(optimizer=Adam(learning_rate=0.0005),
                           loss='categorical_crossentropy',
                           metrics=['categorical_accuracy'])

epochs = 10
steps_per_epoch = train_generator.n // train_generator.batch_size
validation_steps = validation_generator.n // validation_generator.batch_size

# train_generator, validation_generator, checkpoint and reduce_lr are assumed
# to have been created in an earlier part of this series
callbacks = [PlotLossesKerasTF(), checkpoint, reduce_lr]

history = model_transfered_1.fit(
    x=train_generator,
    steps_per_epoch=steps_per_epoch,
    epochs=epochs,
    validation_data=validation_generator,
    validation_steps=validation_steps,
    callbacks=callbacks,
    shuffle=True
)
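
The snippet above refers to train_generator, validation_generator, checkpoint and reduce_lr without defining them; they presumably come from an earlier step of the same series. Below is a minimal sketch of what such definitions might look like, assuming 224x224 RGB images arranged in class subdirectories. The directory paths, checkpoint filename, and hyperparameters are illustrative assumptions, not taken from the original gist.

# Hypothetical setup for the objects the training snippet assumes exist.
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.callbacks import ModelCheckpoint, ReduceLROnPlateau

# Simple rescaling-only generators; the original may use richer augmentation.
datagen = ImageDataGenerator(rescale=1. / 255)

train_generator = datagen.flow_from_directory(
    'data/train',                # assumed directory of class subfolders
    target_size=(224, 224),
    batch_size=32,
    class_mode='categorical')

validation_generator = datagen.flow_from_directory(
    'data/validation',           # assumed directory of class subfolders
    target_size=(224, 224),
    batch_size=32,
    class_mode='categorical')

# Save the best weights seen so far and shrink the learning rate on plateaus.
checkpoint = ModelCheckpoint('model_weights.h5', monitor='val_loss',
                             save_best_only=True)
reduce_lr = ReduceLROnPlateau(monitor='val_loss', factor=0.1,
                              patience=2, min_lr=1e-6)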