
@sutyum
Last active January 21, 2024 13:50
  • Save sutyum/cf36141d14d6ac2d5f7b3ee2e8ce0d8c to your computer and use it in GitHub Desktop.

Revisions

  1. sutyum revised this gist Jan 21, 2024. 1 changed file with 4 additions and 0 deletions.
    @@ -13,6 +13,10 @@ models:
         parameters:
           weight: 0.08
           density: 0.45
    +  - model: WizardLM/WizardMath-70B-V1.0
    +    parameters:
    +      weight: 0.08
    +      density: 0.45
       merge_method: dare_ties
       base_model: NousResearch/Llama-2-70b-hf
       parameters:
  2. sutyum revised this gist Jan 21, 2024. 1 changed file with 1 addition and 1 deletion.
    @@ -9,7 +9,7 @@ models:
         parameters:
           weight: 0.08
           density: 0.45
    -  - model: allenai/tulu-2-dpo-70b
    +  - model: WizardLM/WizardLM-70B-V1.0
         parameters:
           weight: 0.08
           density: 0.45
  3. sutyum revised this gist Jan 21, 2024. 1 changed file with 4 additions and 0 deletions.
    @@ -9,6 +9,10 @@ models:
         parameters:
           weight: 0.08
           density: 0.45
    +  - model: allenai/tulu-2-dpo-70b
    +    parameters:
    +      weight: 0.08
    +      density: 0.45
       merge_method: dare_ties
       base_model: NousResearch/Llama-2-70b-hf
       parameters:
  4. sutyum created this gist Jan 21, 2024.
    medmerge-70b.yaml (16 lines):

    models:
      - model: NousResearch/Llama-2-70b-hf
        # no parameters necessary for base model
      - model: wanglab/ClinicalCamel-70B
        parameters:
          weight: 0.08
          density: 0.45
      - model: epfl-llm/meditron-70b
        parameters:
          weight: 0.08
          density: 0.45
    merge_method: dare_ties
    base_model: NousResearch/Llama-2-70b-hf
    parameters:
      int8_mask: true
    dtype: bfloat16
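For intuition: with `merge_method: dare_ties`, each fine-tuned model's parameter deltas from the base are randomly pruned to roughly the `density` fraction of entries, the survivors are rescaled, and the weighted results are added onto the base model. The following is a toy 1-D sketch of the DARE drop-and-rescale step only (a simplification for illustration, not mergekit's actual implementation; `dare_merge` is a hypothetical helper):

```python
import random

def dare_merge(base, deltas, weight=0.08, density=0.45, seed=0):
    """Toy 1-D sketch of DARE: randomly drop (1 - density) of each
    delta's entries, rescale survivors by 1/density, then add the
    weighted result onto the base parameters."""
    rng = random.Random(seed)
    merged = list(base)
    for delta in deltas:
        for i, d in enumerate(delta):
            if rng.random() < density:             # keep entry with prob = density
                merged[i] += weight * d / density  # rescale surviving entries
    return merged

base = [1.0, 2.0, 3.0, 4.0]                        # stand-in for base weights
deltas = [[0.5, -0.5, 0.25, 0.0],                  # stand-in fine-tune deltas
          [0.1, 0.2, -0.1, 0.3]]
print(dare_merge(base, deltas))
```

With mergekit installed, a config like this is typically built with something like `mergekit-yaml medmerge-70b.yaml ./merged-model` (exact flags depend on the mergekit version).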