@alonlavian
Last active May 28, 2019 13:22
Revisions

  1. alonlavian revised this gist May 28, 2019. 1 changed file with 5 additions and 6 deletions.
    11 changes: 5 additions & 6 deletions VGG16.py
    @@ -2,18 +2,17 @@
      from tensorflow.keras import layers
      from tensorflow.keras.models import Model

      #VGG16 pre-trained and no top
    - base_model = VGG16(weights='imagenet',
    -                    include_top=False,
    + base_model = VGG16(weights='imagenet', #pre-trained weights
    +                    include_top=False, #remove original classifier
                         input_shape=(HEIGHT, WIDTH, 3))

    - #Freeze all
    + #Freeze all layers
      for layer in base_model.layers:
          layer.trainable = False

    - #Add top
    - x = base_model.output
    - x = layers.Flatten()(base_model.output) #last_output
    + #Add binary classifier
    + x = layers.Flatten()(base_model.output)
      x = layers.Dense(1024, activation='relu')(x)
      x = layers.Dropout(0.2)(x)
      x = layers.Dense (1, activation='sigmoid')(x)
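    Pieced together from this diff and the original revision below, the full file after this edit would read roughly as follows. HEIGHT and WIDTH are placeholders the gist never defines, so they must be set before the snippet can run.

    from tensorflow.keras.applications.vgg16 import VGG16
    from tensorflow.keras import layers
    from tensorflow.keras.models import Model

    #VGG16 pre-trained and no top
    base_model = VGG16(weights='imagenet', #pre-trained weights
                       include_top=False,  #remove original classifier
                       input_shape=(HEIGHT, WIDTH, 3))

    #Freeze all layers
    for layer in base_model.layers:
        layer.trainable = False

    #Add binary classifier
    x = layers.Flatten()(base_model.output)
    x = layers.Dense(1024, activation='relu')(x)
    x = layers.Dropout(0.2)(x)
    x = layers.Dense(1, activation='sigmoid')(x)

    model = Model(inputs=base_model.input, outputs=x)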
  2. alonlavian created this gist May 23, 2019.
    21 changes: 21 additions & 0 deletions VGG16.py
    @@ -0,0 +1,21 @@
    from tensorflow.keras.applications.vgg16 import VGG16
    from tensorflow.keras import layers
    from tensorflow.keras.models import Model

    #VGG16 pre-trained and no top
    base_model = VGG16(weights='imagenet',
                       include_top=False,
                       input_shape=(HEIGHT, WIDTH, 3))

    #Freeze all
    for layer in base_model.layers:
        layer.trainable = False

    #Add top
    x = base_model.output
    x = layers.Flatten()(base_model.output) #last_output
    x = layers.Dense(1024, activation='relu')(x)
    x = layers.Dropout(0.2)(x)
    x = layers.Dense (1, activation='sigmoid')(x)

    model = Model(inputs=base_model.input, outputs=x)
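
The gist stops at building the frozen-backbone model. A minimal usage sketch follows, assuming 224x224 RGB inputs for HEIGHT/WIDTH and dummy data just to show the expected shapes; none of these values come from the gist itself.

    # Hypothetical usage sketch (not part of the gist). HEIGHT and WIDTH must exist
    # before the VGG16(...) call above is executed; the values and dummy data here
    # are illustrative assumptions.
    import numpy as np

    HEIGHT, WIDTH = 224, 224            # VGG16's native input resolution

    # ... run the gist code above to build `model` ...

    model.compile(optimizer='adam',
                  loss='binary_crossentropy',   # pairs with the single sigmoid output unit
                  metrics=['accuracy'])

    x_dummy = np.random.rand(8, HEIGHT, WIDTH, 3).astype('float32')   # stand-in images
    y_dummy = np.random.randint(0, 2, size=(8, 1))                    # stand-in binary labels
    model.fit(x_dummy, y_dummy, epochs=1, batch_size=4)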