Scott Guthart (ScottGuthart) · Edelman.AI · New York
@burkeholland
burkeholland / ainstall.md
Last active November 2, 2025 09:31
Boost

Boost Prompt

A prompt to boost your lazy "do this" prompts. Install with one of the buttons below.

Install in VS Code · Install in VS Code Insiders

Use

@intellectronica
intellectronica / AGENTS.md
Last active August 28, 2025 17:44
opencode config
Source
../.config/ruler/instructions.md

Default Agent Instructions

  • You have useful tools available as MCP or command-line. Use them.
  • Unless you are absolutely sure that the information in your knowledge is correct and, crucially, up to date, always get information from the web. Use Tavily for searching and gathering information, and use curl to fetch web pages.
  • If you are working on something that presents web pages, you should use Playwright to open these pages, take snapshots, and inspect them.
  • When working in a git repository, always switch to a new branch, unless explicitly instructed not to.
  • If the git repo has a corresponding repo on GitHub, use the gh tool for things like looking at issues, opening pull requests, reading review comments, and looking at CI results (a minimal example follows this list).
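
As a concrete illustration of the last two items, here is a minimal shell sketch of the intended flow; the branch name and commit message are placeholders, not part of the original instructions.

# work on a fresh branch rather than the default branch
git switch -c feature/my-change
# ...make and commit changes...
git commit -am "describe the change"
# use gh for the GitHub side: issues, pull requests, CI results
gh issue list
gh pr create --fill
gh pr checks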

Beast Mode

Beast Mode is a custom chat mode for the VS Code agent that adds an opinionated workflow, including a todo list, extensive internet research capabilities, planning, tool-usage instructions, and more. It is designed for GPT-4.1, although it will work with any model.

Below you will find the Beast Mode prompt in various versions, starting with the most recent, 3.1.

Installation Instructions

  • Go to the "Agent" dropdown in the VS Code chat sidebar and select "Configure Modes".
  • Select "Create new custom chat mode file" (a manual alternative is sketched below).
@andresbrocco
andresbrocco / LLM Prompts for Dev.md
Last active September 13, 2025 14:17
LLM Prompts for Dev

LLM Prompts for Dev

Adapted from Harper Reed's blog

%% High-level data-flow for the “LLM Prompts for Dev” workflow
flowchart TD
  %% ──────────────── Idea phase ────────────────
  subgraph A["Idea phase"]
    A1[["Idea Refinement template"]]:::tmpl

2025-04-10T16:46:42 conversation: 01jrg9qt152x9ktnt6e1yvvwet id: 01jrg9qt1pt54njgyw2k0nf7ec

Model: gemini-2.5-pro-exp-03-25

Prompt

write shell completions for this tool

Prompt fragments

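For reference, a hedged sketch of the kind of invocation that produces a log like the one above, assuming llm 0.24+ (where -m selects the model and -f attaches a prompt fragment); the fragment file name is hypothetical.

# capture the tool's help text and attach it as a fragment for context
llm --help > llm-help.txt
llm -m gemini-2.5-pro-exp-03-25 -f llm-help.txt "write shell completions for this tool"
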
@jerieljan
jerieljan / _llm
Last active October 3, 2025 15:49
A basic zsh-completion for Simon Willison's `llm`. https://llm.datasette.io
#compdef llm
# Instructions:
#
# - Have llm working on your setup, of course. Have some models or templates to actually use.
# - Have zsh and zsh-completions configured and working (i.e., installed, fpath configured, tabbing works for other commands, etc)
# - Place this file `_llm` alongside your completions (e.g., ~/.oh-my-zsh/custom/plugins/zsh_completions/src/)
# - Restart your terminal session or `source ~/.zshrc`
# - Try it out by pressing Tab after `-m` or `-t`; it should suggest whatever models you have available or templates you've configured.
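
If you are not using oh-my-zsh, a minimal sketch of the equivalent setup with a plain completions directory; the directory path is arbitrary.

# copy the completion file somewhere on your fpath
mkdir -p ~/.zsh/completions
cp _llm ~/.zsh/completions/
# add to ~/.zshrc, before compinit runs
fpath=(~/.zsh/completions $fpath)
autoload -Uz compinit && compinit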
@Hellisotherpeople
Hellisotherpeople / blog.md
Last active August 12, 2025 21:18
You probably don't know how to do Prompt Engineering, let me educate you.

You probably don't know how to do Prompt Engineering

(This post could also be titled "Features missing from most LLM front-ends that should exist")

Apologies for the snarky title, but there has been a huge amount of discussion around so-called "Prompt Engineering" these past few months on all kinds of platforms. Much of it is coming from individuals who are peddling an awful lot of "Prompting" and very little "Engineering".

Most of these discussions are little more than users finding that writing more creative and complicated prompts can help them solve a task that a simpler prompt could not. I claim this is not Prompt Engineering. This is not to say that crafting good prompts is easy, but it does not involve any kind of sophisticated modification to the general "template" of a prompt.

Others, who I think do deserve to call themselves "Prompt Engineers" (and an awful lot more than that), have been writing about and utilizing the rich new eco-system

@kepano
kepano / obsidian-web-clipper.js
Last active October 27, 2025 04:27
Obsidian Web Clipper Bookmarklet to save articles and pages from the web (for Safari, Chrome, Firefox, and mobile browsers)
javascript: Promise.all([import('https://unpkg.com/[email protected]?module'), import('https://unpkg.com/@tehshrike/[email protected]'), ]).then(async ([{
  default: Turndown
}, {
  default: Readability
}]) => {
  /* Optional vault name */
  const vault = "";
  /* Optional folder name such as "Clippings/" */
@eduardoarandah
eduardoarandah / .vimrc
Last active February 2, 2021 17:37
My .vimrc and coc-settings.json configuration
scriptencoding utf-8 " basic
set nocompatible " basic
filetype off " basic
filetype plugin on " Enable filetype plugins
filetype indent on " Enable loading the indent file for specific file types
syntax enable " Enable syntax highlighting
set encoding=utf-8 " Encoding (needed in youcompleteme)
set fileencoding=utf-8 " The encoding written to file.
set noerrorbells " No annoying sound on errors
set number " Line numbers on