2022 Some notes, code snippets, configurations, and findings in open source code.

about-yarn-upgrade.md

Recently I manually deleted the yarn.lock file on a monorepo and ran into tons of dependency compatibility issues.

I found that yarn upgrade and yarn outdated are useful commands here.

It turned out that yarn install ( or just yarn ) only updates yarn.lock when the dependencies in package.json change. It won't update all packages to the latest version allowed by their ranges ( for ^a.b.c: fixed a, latest b and c ).

We can and should upgrade manually with yarn upgrade or npm update.

This command updates dependencies to their latest version based on the version range specified in the package.json file. The yarn.lock file will be recreated as well.
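
To make the range behavior concrete, below is a minimal sketch using the semver npm package ( an extra dependency assumed here just for illustration ) to show which version a range resolves to:

semver-demo.js

// assumes `yarn add semver` was run first
const semver = require('semver');

// pretend these versions exist on the registry
const available = ['1.2.3', '1.2.9', '1.4.0', '2.0.0'];

// ^1.2.3 pins the major version but allows newer minor / patch releases;
// yarn upgrade re-resolves to this max-satisfying version and rewrites yarn.lock
console.log(semver.maxSatisfying(available, '^1.2.3')); // 1.4.0

// ~1.2.3 pins major and minor, so only newer patches are taken
console.log(semver.maxSatisfying(available, '~1.2.3')); // 1.2.9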

When I use each yarn command

yarn install or yarn -> on local dev

yarn install --frozen-lockfile --prefer-offline -> on ci

yarn outdated or yarn upgrade -> when we have time and want to improve code quality by updating dependencies to the latest stable versions

references:

https://classic.yarnpkg.com/lang/en/docs/cli/upgrade/

https://docs.npmjs.com/updating-packages-downloaded-from-the-registry

Auto HTTPS

Sometimes I need a quick HTTPS setup for a development environment.

Caddy used to be my go-to, but I experienced an SSL3 error on v2.x a few times and found no way to solve it.

I just tried docker-nginx-auto-ssl via docker-compose.yml and it worked within a minute.

docker-compose.yml

# docker-compose.yml
version: '2'
services:
  nginx:
    image: valian/docker-nginx-auto-ssl
    restart: on-failure
    volumes:
      - ssl_data:/etc/resty-auto-ssl
    environment:
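      # ALLOWED_DOMAINS: hostnames the container may request certificates for
      # SITES: maps a public domain to its local upstream ( domain=host:port )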
      ALLOWED_DOMAINS: 'example.com'
      SITES: 'example.com=localhost:3001'
    network_mode: 'host'

volumes:
  ssl_data:
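
With DNS for example.com pointing at this host, https://example.com should then proxy to localhost:3001 with an automatically issued certificate.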

fluent-json-schema-pattern-properties.md

Pattern properties in JSON Schema are useful for validating an arbitrary number of object keys.

For example, a users object containing 0 to n user records keyed by user id.

data

{
   "userid1":{...},
   "userid100":{...}
}

Runnable example

test.js

const fjs = require('fluent-json-schema');
const Ajv = require('ajv');

const schema = fjs
  .object()
  .additionalProperties(false)
  //   .prop('prop')
  .patternProperties({ '^fo.*$': fjs.string() });

console.log('schema.valueOf()', schema.valueOf());
const ajv = new Ajv({ allErrors: true });
const validate = ajv.compile(schema.valueOf());

const data = {
  foo: 'pass',
};

let valid = validate(data);
console.log({ valid });
console.log('validate.errors', validate.errors);

const data2 = {
  test: 'test',
};

let valid2 = validate(data2);
console.log({ valid2 });
console.log('validate.errors', validate.errors);
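
Running node test.js should print valid: true for data ( foo matches ^fo.*$ and is a string ) and valid2: false for data2, since test matches no pattern and additionalProperties is false.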

versions

node v16.14.0

"ajv": "^8.11.0"

"fluent-json-schema": "^3.1.0"

references:

https://stackoverflow.com/a/30809082/6414615

https://github.com/fastify/fluent-json-schema/blob/master/src/ObjectSchema.test.js#L421-L435

Inspect memory usage of a node.js process

There are 2 methods I found on the internet to inspect the memory usage of a node.js process:

  1. Maximum resident set size (kbytes) reported by /usr/bin/time -v while executing the node.js script ( works on ubuntu 18 )
  2. process.memoryUsage() in the node.js API ( see the sketch below )
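
A minimal sketch of the second method, printing every field process.memoryUsage() returns:

memory-usage.js

// print each field of process.memoryUsage() in MB
const mu = process.memoryUsage();
for (const [key, bytes] of Object.entries(mu)) {
  console.log(`${key}: ${(bytes / 1024 / 1024).toFixed(1)} MB`);
}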

Commands to try ( --max-old-space-size caps the V8 old-space heap in MB, so the 200 MB run dies with a heap out-of-memory error much sooner than the 4096 MB one )

  1. /usr/bin/time -pv node --max-old-space-size=200 ./test.js
  2. /usr/bin/time -pv node --max-old-space-size=4096 ./test.js

test.js

const memoryLeakAllocations = [];

const field = 'heapUsed';
const allocationStep = 10000 * 1024; // 10MB

const TIME_INTERVAL_IN_MSEC = 40;

setInterval(() => {
  const allocation = allocateMemory(allocationStep);

  memoryLeakAllocations.push(allocation);

  const mu = process.memoryUsage();
  // # bytes / KB / MB / GB
  const gbNow = mu[field] / 1024 / 1024 / 1024;
  const gbRounded = Math.round(gbNow * 100) / 100;

  console.log(`Heap allocated ${gbRounded} GB`);
}, TIME_INTERVAL_IN_MSEC);

function allocateMemory(size) {
  // Simulate allocating roughly `size` bytes ( assumes ~8 bytes per array element )
  const numbers = size / 8;
  const arr = [];
  arr.length = numbers;
  for (let i = 0; i < numbers; i++) {
    arr[i] = i;
  }
  return arr;
}

references:

https://nodejs.org/en/docs/guides/backpressuring-in-streams/

https://blog.appsignal.com/2021/12/08/nodejs-memory-limits-what-you-should-know.html

nginx-serving-static-files-directory

/usr/local/etc/nginx.conf or /etc/nginx/nginx.conf

The config below serves the files in the /somewhere directory. Note: root appends the request URI under the given path, while alias replaces the matched location prefix with it.

.....
location / {
    autoindex on;
    alias /somewhere;
    # root /somewhere;
}
.....

add basic auth

generate a basic auth file ( -b takes the password on the command line, -d uses crypt() hashing, -c creates the file )

htpasswd -bdc basic-auth-file admin 12345

cat basic-auth-file

nginx config

location / {
            auth_basic            "Restricted Area for Private Use Only";
            auth_basic_user_file  /basic-auth-file;
            autoindex on;
            root /somewhere;
        }

daemon off for debugging

Add the lines below to the top of nginx.conf

daemon off;
error_log /dev/stdout info;

Now nginx runs in the foreground instead of daemonizing, with the error log printed to stdout.

from:

https://stackoverflow.com/a/23328458/6414615

https://docs.nginx.com/nginx/admin-guide/web-server/serving-static-content/

https://stackoverflow.com/questions/9454150/nginx-password-protected-dir-with-autoindex-feature

https://www.jianshu.com/p/1c0691c9ad3c

rsync nested node_modules folders

Useful in a javascript/typescript monorepo to copy previously installed node_modules folders.

My use case is on a gitlab-runner shell executor, to make yarn install faster without messing with the problematic gitlab-runner cache setting. The filter rules are explained after the command.

rsync -a -m --include="*/" --include="**/node_modules/**" --exclude="*" ./ ~/git/test/rsync-test
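
How the filters work: --include="*/" lets rsync descend into every directory, --include="**/node_modules/**" picks up everything under any node_modules folder, --exclude="*" drops all other files, and -m prunes the directories left empty.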

Simple logging API in nginx

An HTTP POST API /log for hobby apps. The config proxies /log to a dummy /empty location, which forces nginx to read the request body so $request_body shows up in the access log.

  1. log rotation handled by /etc/logrotate.d/nginx without extra configuration
  2. rate limiting via limit_req_zone
  3. the app client can add a sample rate to cut abusive request volume, like the sentry javascript sdk sample rate of 0 to 1 ( a sketch follows this list )
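
A minimal sketch of point 3, sampling client-side before hitting the endpoint ( the URL and 10% rate are placeholders ):

track.js

// drop ~90% of events client-side, similar in spirit to sentry's sample rate
const SAMPLE_RATE = 0.1;

function track(event) {
  if (Math.random() >= SAMPLE_RATE) return; // not sampled, skip the request
  fetch('https://yourserver.com/log', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(event),
  });
}

track({ key1: 'value1', key2: 'value2' });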

/etc/nginx/sites-enabled/default.conf


.....

limit_req_zone $binary_remote_addr zone=mytracking:10m rate=10r/s;

log_format my-tracking-format escape=json '$remote_addr - $remote_user [$time_local] '
                       '"$request" $status $bytes_sent '
                       '"$http_referer" "$http_user_agent" "$request_body"';

server {
  listen 80 default_server;
  
.....

location = /log {
  limit_req zone=mytracking burst=20 nodelay;
  access_log /mnt/logs/nginx/my-tracking.access.log my-tracking-format;
  proxy_pass http://localhost:80/empty;
}

location = /empty {
  access_log off;
  return 200 'logged';
}

.....
}

server {
  listen 443 ssl;
  
  .....
  
  location = /log {
    limit_req zone=mytracking burst=20 nodelay;
    access_log /mnt/logs/nginx/my-tracking.access.log my-tracking-format;
    proxy_pass http://localhost:80/empty;
  }
  .....
}

version: nginx/1.14.0 (Ubuntu)

Test

request

curl -d '{"key1":"value1", "key2":"value2"}' -H "Content-Type: application/json" -X POST https://yourserver.com/log

checking log

tail -f /mnt/logs/nginx/my-tracking.access.log

references:

https://stackoverflow.com/questions/17609472/really-logging-the-post-request-body-instead-of-with-nginx

https://stackoverflow.com/a/14034744/6414615

UTF8 compatible atob btoa

The built-in atob and btoa functions in browsers and node.js v16 are not utf8 compatible.

And the btoa npm module does not throw an error the way browsers and node.js do.

reproduce.js

const utf8Str = `style=“color:white”`;
const encodedStr = btoa(utf8Str);
const decodedStr = atob(encodedStr);
console.log(decodedStr === utf8Str);

browser output:

VM111:2 Uncaught DOMException: Failed to execute 'btoa' on 'Window': The string to be encoded contains characters outside of the Latin1 range.
    at <anonymous>:2:20

Solution:

https://stackoverflow.com/a/30106551/6414615
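
A minimal sketch of a utf-8 safe round trip using TextEncoder / TextDecoder ( a standard workaround; the function names here are mine ):

utf8-base64.js

// encode a JS string to base64 via its utf-8 bytes
function utf8ToBase64(str) {
  const bytes = new TextEncoder().encode(str); // utf-8 bytes
  let binary = '';
  for (const b of bytes) binary += String.fromCharCode(b); // bytes -> latin1 chars
  return btoa(binary);
}

// decode base64 back to a utf-8 JS string
function base64ToUtf8(b64) {
  const bytes = Uint8Array.from(atob(b64), (c) => c.charCodeAt(0));
  return new TextDecoder().decode(bytes);
}

const utf8Str = `style=“color:white”`;
console.log(base64ToUtf8(utf8ToBase64(utf8Str)) === utf8Str); // true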
