You are ChatGPT, a large language model based on the GPT-5 model and trained by OpenAI.
Knowledge cutoff: 2024-06
Current date: 2025-08-08
Image input capabilities: Enabled
Personality: v2
Do not reproduce song lyrics or any other copyrighted material, even if asked.
You're an insightful, encouraging assistant who combines meticulous clarity with genuine enthusiasm and gentle humor.
Supportive thoroughness: Patiently explain complex topics clearly and comprehensively.
Lighthearted interactions: Maintain friendly tone with subtle humor and warmth.
@pantaleone-ai
pantaleone-ai / gist:fed3384c76b0b4e259bf2b4cf92e5bce
Created February 21, 2025 22:47 — forked from rxaviers/gist:7360908
Complete list of github markdown emoji markup

People

| | | |
|---|---|---|
| :bowtie: `:bowtie:` | 😄 `:smile:` | 😆 `:laughing:` |
| 😊 `:blush:` | 😃 `:smiley:` | ☺️ `:relaxed:` |
| 😏 `:smirk:` | 😍 `:heart_eyes:` | 😘 `:kissing_heart:` |
| 😚 `:kissing_closed_eyes:` | 😳 `:flushed:` | 😌 `:relieved:` |
| 😆 `:satisfied:` | 😁 `:grin:` | 😉 `:wink:` |
| 😜 `:stuck_out_tongue_winking_eye:` | 😝 `:stuck_out_tongue_closed_eyes:` | 😀 `:grinning:` |
| 😗 `:kissing:` | 😙 `:kissing_smiling_eyes:` | 😛 `:stuck_out_tongue:` |
@pantaleone-ai
pantaleone-ai / nginx-tuning.md
Created October 5, 2024 20:17 — forked from denji/nginx-tuning.md
NGINX tuning for best performance

Moved to git repository: https://github.com/denji/nginx-tuning

NGINX Tuning For Best Performance

For this configuration you can use whichever web server you like; I decided to use NGINX because it is the server I work with most.

Generally, a properly configured NGINX can handle up to 400K to 500K requests per second (clustered); the most I have seen is 50K to 80K requests per second (non-clustered) at around 30% CPU load. Of course, that was on 2 x Intel Xeon CPUs with Hyper-Threading enabled, but it can also work without problems on slower machines.

Keep in mind that this configuration was used in a testing environment, not in production, so you will need to adapt most of these settings as well as possible for your own servers.
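
As a rough illustration of the kind of directives such tuning usually touches, here is a minimal `nginx.conf` sketch. The specific values are illustrative assumptions, not settings taken from the linked repository, so benchmark and adjust them for your own hardware.

```nginx
# nginx.conf -- illustrative tuning sketch; values are assumptions, adjust per machine

# one worker per CPU core; nginx detects the core count itself
worker_processes auto;

# raise the per-worker open-file limit so worker_connections can be honored
worker_rlimit_nofile 100000;

events {
    # max simultaneous connections per worker (RAM and file limits permitting)
    worker_connections 4096;
    # accept as many new connections as possible in one pass
    multi_accept on;
    # efficient connection-processing method on Linux
    use epoll;
}

http {
    # serve static files efficiently from the kernel
    sendfile on;
    tcp_nopush on;
    tcp_nodelay on;

    # let clients reuse connections instead of reconnecting for every request
    keepalive_timeout 30;
    keepalive_requests 1000;

    # cache open file descriptors for frequently served files
    open_file_cache max=200000 inactive=20s;
    open_file_cache_valid 30s;
    open_file_cache_min_uses 2;
    open_file_cache_errors on;

    # compress text responses to cut bandwidth
    gzip on;
    gzip_min_length 1024;
    gzip_types text/plain text/css application/json application/javascript;

    # disabling the access log trades observability for less disk I/O
    access_log off;
}
```

Each value here is only a starting point; the right numbers depend on core count, RAM, and traffic pattern, so measure throughput before and after changing them.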