chunhualiao created this gist on Sep 3, 2023.
Based on discussions on https://huggingface.co/codellama/CodeLlama-7b-Instruct-hf/discussions/10

```
pip install git+https://github.com/huggingface/transformers.git@main
pip install tokenizers transformers
```

```python
# Use a pipeline as a high-level helper
from transformers import pipeline
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the tokenizer and model weights from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("codellama/CodeLlama-7b-Instruct-hf")
model = AutoModelForCausalLM.from_pretrained("codellama/CodeLlama-7b-Instruct-hf")

# Create a text-generation pipeline
code_generator = pipeline('text-generation', model=model, tokenizer=tokenizer)

# Generate code for an input string
input_string = "Write a python function to calculate the factorial of a number"
generated_code = code_generator(input_string, max_length=100)[0]['generated_text']
print(generated_code)
```
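One caveat worth noting: CodeLlama-7b-Instruct-hf was fine-tuned on instruction-formatted conversations, so passing a bare prompt to the pipeline often works worse than wrapping it in the Llama-2 `[INST] ... [/INST]` chat template. Below is a minimal sketch of building such a prompt by hand; the template string follows the Llama-2 chat convention, and `build_instruct_prompt` is a hypothetical helper for illustration (newer transformers versions expose `tokenizer.apply_chat_template` for the same purpose):

```python
# Sketch: wrap a user request in the Llama-2 instruction template
# before passing it to the text-generation pipeline above.
def build_instruct_prompt(user_message: str, system_prompt: str = "") -> str:
    # An optional system prompt goes inside <<SYS>> ... <</SYS>> markers,
    # prepended to the user message per the Llama-2 chat convention.
    if system_prompt:
        user_message = f"<<SYS>>\n{system_prompt}\n<</SYS>>\n\n{user_message}"
    return f"<s>[INST] {user_message} [/INST]"

prompt = build_instruct_prompt(
    "Write a python function to calculate the factorial of a number"
)
print(prompt)
```

The resulting string can then be passed as `input_string` to `code_generator` in place of the raw request.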