You can think of the require function as the command and the module core module as the organizer of all required modules.
REPL stands for Read-Eval-Print Loop, and it represents an interactive computer environment: you type a command, Node reads it, evaluates it, prints the result, and loops back to wait for the next one.
The main object exported by the require module is a function. When Node invokes that require() function with a local file path as the function's only argument, Node goes through the following sequence of steps: Resolving > Loading > Wrapping > Evaluating > Caching
The whole process of requiring/loading a module is synchronous, which is why we cannot change the exports object asynchronously (for example, inside a callback).
We can natively require JSON files and C++ addon files with the require function. If a file extension is not specified, the first thing Node will try to resolve is a .js file. If it can't find a .js file, it will try a .json file and parse it as JSON text if found. After that, it will try to find a binary .node file.
- require.extensions — maps each supported file extension to the loader function Node uses for it.
- [Function].toString() — calling toString() on one of those loader functions (e.g. require.extensions['.js'].toString()) shows how Node handles that extension.
We can use the exports object to export properties, but we cannot replace the exports object directly because it's just a reference to module.exports; to replace the exported object we must assign to module.exports itself.
Module Wrapping:
require('module').wrapper; // the wrapper Node puts around every module
This wrapper function has 5 arguments: exports, require, module, __filename, and __dirname.
Callbacks/Promises: Call me back when you're ready, Node! When an EventEmitter emits an error event, we need to register a listener for that special error event, or the process will crash. The other way to handle exceptions from emitted errors is to register a listener for the global uncaughtException process event.
const EventEmitter = require('events');
const fs = require('fs');
class WithTime extends EventEmitter {
  execute(asyncFunc, ...args) {
    this.emit('begin');
    console.time('execute');
    asyncFunc(...args, (err, data) => {
      if (err) {
        return this.emit('error', err);
      }
      this.emit('data', data);
      console.timeEnd('execute');
      this.emit('end');
    });
  }
}
// we need to register a listener for the special error event
const withTime = new WithTime();
withTime.on('begin', () => console.log('start reading...'));
withTime.on('end', () => console.log('Done with execute'));
// Note: removeListener needs the exact same function reference that was
// registered; passing a new anonymous function here removes nothing.
withTime.removeListener('end', () => console.log('Done with execute 2'));
withTime.on('data', (data) => {
  // do something with data
  console.log(data);
});
withTime.on('error', (error) => {
  console.log(error);
});
withTime.execute(fs.readFile, './number.txt');
// The other way to handle exceptions from emitted errors
// is to register a listener for the global uncaughtException process event.
process.on('uncaughtException', (err) => {
  // something went unhandled.
  // Do any cleanup and exit anyway!
  console.error(err); // don't do just that!
  // FORCE exit the process too.
  process.exit(1);
});

- Streams are used in Node.js to read and write data from input/output devices. Node.js uses the fs module to create readable and writable streams to files, and these streams can be used to read and write file data.
- Pipes can be used to connect multiple streams together. One of the most common examples is piping a read stream into a write stream to transfer data from one file to another.
- Node.js is often also tagged as an event-driven framework, and it's very easy to define events in Node.js and the functions that respond to them.
- Event emitters also expose methods for controlling how listeners respond. For example, the once() method registers a callback that is executed only the first time an event is triggered.