@zainhas
Last active February 18, 2025 16:01

Revisions

  1. zainhas revised this gist Jan 24, 2025. 1 changed file with 1 addition and 1 deletion.

     thinking_tokens.py (1 addition, 1 deletion):

     @@ -5,7 +5,7 @@

      thought = client.chat.completions.create(
          model="deepseek-ai/DeepSeek-R1",
     -    messages=[{"role": "user", "content": "Which is larger 9.9 or 9.11?"}],
     +    messages=[{"role": "user", "content": question}],
          stop=['</think>']
      )

  2. zainhas created this gist Jan 24, 2025.

     thinking_tokens.py (28 additions, 0 deletions):

     @@ -0,0 +1,28 @@
     from together import Together

     # TOGETHER_API_KEY is assumed to be defined elsewhere (your Together AI API key).
     client = Together(api_key=TOGETHER_API_KEY)

     question = "Which is larger 9.9 or 9.11?"

     # Step 1: ask DeepSeek-R1 the question, stopping at </think> so the response
     # contains only the model's reasoning ("thinking tokens"), not a final answer.
     thought = client.chat.completions.create(
         model="deepseek-ai/DeepSeek-R1",
         messages=[{"role": "user", "content": "Which is larger 9.9 or 9.11?"}],
         stop=['</think>']
     )

     # Step 2: hand the captured reasoning to a smaller model as context for the answer.
     PROMPT_TEMPLATE = """
     Thought process: {thinking_tokens} </think>
     Question: {question}
     Answer:
     """

     answer = client.chat.completions.create(
         model="meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo",
         messages=[{"role": "user",
                    "content": PROMPT_TEMPLATE.format(
                        thinking_tokens=thought.choices[0].message.content,
                        question=question)}],
     )

     print(answer.choices[0].message.content)

     # Answer: 9.9 is larger than 9.11.
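
The gist demonstrates a two-step pattern: capture DeepSeek-R1's reasoning by stopping generation at the </think> tag, then pass that reasoning to a smaller model to produce the final answer. Below is a minimal sketch of the same pattern wrapped into a reusable helper, assuming the same Together client and models as above; the function name answer_with_thinking is illustrative, not part of the gist.

    from together import Together

    def answer_with_thinking(question: str, api_key: str) -> str:
        # Illustrative helper wrapping the gist's two calls.
        client = Together(api_key=api_key)

        # Step 1: collect only DeepSeek-R1's reasoning by stopping at </think>.
        thought = client.chat.completions.create(
            model="deepseek-ai/DeepSeek-R1",
            messages=[{"role": "user", "content": question}],
            stop=['</think>']
        )
        thinking_tokens = thought.choices[0].message.content

        # Step 2: give the reasoning to a smaller model and ask for the final answer.
        prompt = (
            f"Thought process: {thinking_tokens} </think>\n"
            f"Question: {question}\n"
            "Answer:"
        )
        answer = client.chat.completions.create(
            model="meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo",
            messages=[{"role": "user", "content": prompt}]
        )
        return answer.choices[0].message.content

    # Example usage (requires a valid Together AI API key):
    # print(answer_with_thinking("Which is larger 9.9 or 9.11?", TOGETHER_API_KEY))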