
@MathewAlexander
Last active September 23, 2020 18:50
Revisions

  1. MathewAlexander revised this gist Sep 23, 2020. 1 changed file with 1 addition and 1 deletion.
    2 changes: 1 addition & 1 deletion webnlg_inference.py
    @@ -1,7 +1,7 @@
     tokenizer = T5Tokenizer.from_pretrained('t5-base')
     model =T5ForConditionalGeneration.from_pretrained('path_to_trained_model',
     return_dict=True)
    -def generate(text,modedl,tokenizer):
    +def generate(text,model,tokenizer):
     model.eval()
     input_ids = tokenizer.encode("WebNLG:{} </s>".format(text),
     return_tensors="pt")
  2. MathewAlexander revised this gist Sep 18, 2020. 1 changed file with 1 addition and 0 deletions.
    1 change: 1 addition & 0 deletions webnlg_inference.py
    @@ -1,3 +1,4 @@
    +tokenizer = T5Tokenizer.from_pretrained('t5-base')
     model =T5ForConditionalGeneration.from_pretrained('path_to_trained_model',
     return_dict=True)
     def generate(text,modedl,tokenizer):
  3. MathewAlexander created this gist Sep 18, 2020.
    8 changes: 8 additions & 0 deletions webnlg_inference.py
    @@ -0,0 +1,8 @@
    +model =T5ForConditionalGeneration.from_pretrained('path_to_trained_model',
    +return_dict=True)
    +def generate(text,modedl,tokenizer):
    +model.eval()
    +input_ids = tokenizer.encode("WebNLG:{} </s>".format(text),
    +return_tensors="pt")
    +outputs = model.generate(input_ids)
    +return tokenizer.decode(outputs[0])
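Collapsing the revisions above, the final script still lacks its imports and indentation. Below is a minimal runnable sketch of the same inference flow; 'path_to_trained_model' is the placeholder path from the gist (point it at your fine-tuned T5 checkpoint directory), and the helper names `build_prompt` and `load_model` are additions for clarity, not part of the original.

```python
def build_prompt(text):
    # The gist prepends a "WebNLG:" task prefix and an explicit </s> token,
    # matching the format the model was fine-tuned on.
    return "WebNLG:{} </s>".format(text)

def load_model(path="path_to_trained_model"):
    # Placeholder path from the gist; replace with your checkpoint directory.
    # Import lazily so the helpers above are usable without transformers installed.
    from transformers import T5Tokenizer, T5ForConditionalGeneration
    tokenizer = T5Tokenizer.from_pretrained("t5-base")
    model = T5ForConditionalGeneration.from_pretrained(path, return_dict=True)
    return model, tokenizer

def generate(text, model, tokenizer):
    # Greedy decoding with the model's default generation settings,
    # as in the final revision of the gist.
    model.eval()
    input_ids = tokenizer.encode(build_prompt(text), return_tensors="pt")
    outputs = model.generate(input_ids)
    return tokenizer.decode(outputs[0])
```

Usage (the input triple is illustrative only): `model, tokenizer = load_model()` followed by `generate("Abilene | state | Texas", model, tokenizer)` returns the verbalized sentence as a string.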