Snow effect and error correction

2023-08-14 17:52:24 +02:00
parent 4a843d4936
commit 79af4e9907
6813 changed files with 343821 additions and 356128 deletions

node_modules/readdirp/LICENSE (generated, vendored): 27 changed lines

@@ -1,20 +1,21 @@
This software is released under the MIT license:
MIT License
Copyright (c) 2012-2015 Thorsten Lorenz
Copyright (c) 2012-2019 Thorsten Lorenz, Paul Miller (https://paulmillr.com)
Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "Software"), to deal in
the Software without restriction, including without limitation the rights to
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of
the Software, and to permit persons to whom the Software is furnished to do so,
subject to the following conditions:
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

node_modules/readdirp/README.md (generated, vendored): 294 changed lines

@@ -1,204 +1,122 @@
# readdirp [![Build Status](https://secure.travis-ci.org/thlorenz/readdirp.svg)](http://travis-ci.org/thlorenz/readdirp)
# readdirp [![Weekly downloads](https://img.shields.io/npm/dw/readdirp.svg)](https://github.com/paulmillr/readdirp)
[![NPM](https://nodei.co/npm/readdirp.png?downloads=true&stars=true)](https://nodei.co/npm/readdirp/)
Recursive version of [fs.readdir](https://nodejs.org/api/fs.html#fs_fs_readdir_path_options_callback). Exposes a **stream API** and a **promise API**.
Recursive version of [fs.readdir](http://nodejs.org/docs/latest/api/fs.html#fs_fs_readdir_path_callback). Exposes a **stream api**.
```javascript
var readdirp = require('readdirp')
, path = require('path')
, es = require('event-stream');
// print out all JavaScript files along with their size
var stream = readdirp({ root: path.join(__dirname), fileFilter: '*.js' });
stream
.on('warn', function (err) {
console.error('non-fatal error', err);
// optionally call stream.destroy() here in order to abort and cause 'close' to be emitted
})
.on('error', function (err) { console.error('fatal error', err); })
.pipe(es.mapSync(function (entry) {
return { path: entry.path, size: entry.stat.size };
}))
.pipe(es.stringify())
.pipe(process.stdout);
```

```sh
npm install readdirp
```
Meant to be one of the recursive versions of [fs](http://nodejs.org/docs/latest/api/fs.html) functions, e.g., like [mkdirp](https://github.com/substack/node-mkdirp).
**Table of Contents** *generated with [DocToc](http://doctoc.herokuapp.com/)*
- [Installation](#installation)
- [API](#api)
- [entry stream](#entry-stream)
- [options](#options)
- [entry info](#entry-info)
- [Filters](#filters)
- [Callback API](#callback-api)
- [allProcessed](#allprocessed)
- [fileProcessed](#fileprocessed)
- [More Examples](#more-examples)
- [stream api](#stream-api)
- [stream api pipe](#stream-api-pipe)
- [grep](#grep)
- [using callback api](#using-callback-api)
- [tests](#tests)
# Installation
npm install readdirp
# API
***var entryStream = readdirp (options)***
Reads given root recursively and returns a `stream` of [entry info](#entry-info)s.
## entry stream
Behaves as follows:
- `emit('data')` passes an [entry info](#entry-info) whenever one is found
- `emit('warn')` passes a non-fatal `Error` that prevents a file/directory from being processed (i.e., if it is
inaccessible to the user)
- `emit('error')` passes a fatal `Error` which also ends the stream (i.e., when illegal options were passed)
- `emit('end')` called when all entries were found and no more will be emitted (i.e., we are done)
- `emit('close')` called when the stream is destroyed via `stream.destroy()` (which could be useful if you want to
manually abort even on a non-fatal error) - at that point the stream is no longer `readable` and no more entries,
warnings or errors are emitted
- to learn more about streams, consult the very detailed
[nodejs streams documentation](http://nodejs.org/api/stream.html) or the
[stream-handbook](https://github.com/substack/stream-handbook)
## options
- **root**: path in which to start reading and recursing into subdirectories
- **fileFilter**: filter to include/exclude files found (see [Filters](#filters) for more)
- **directoryFilter**: filter to include/exclude directories found and to recurse into (see [Filters](#filters) for more)
- **depth**: depth at which to stop recursing even if more subdirectories are found
- **entryType**: determines if data events on the stream should be emitted for `'files'`, `'directories'`, `'both'`, or `'all'`. Setting to `'all'` will also include entries for other types of file descriptors like character devices, unix sockets and named pipes. Defaults to `'files'`.
- **lstat**: if `true`, readdirp uses `fs.lstat` instead of `fs.stat` in order to stat files and includes symlink entries in the stream along with files.
## entry info
Has the following properties:
- **parentDir** : directory in which entry was found (relative to given root)
- **fullParentDir** : full path to parent directory
- **name** : name of the file/directory
- **path** : path to the file/directory (relative to given root)
- **fullPath** : full path to the file/directory found
- **stat** : built in [stat object](http://nodejs.org/docs/v0.4.9/api/fs.html#fs.Stats)
- **Example**: (assuming root was `/User/dev/readdirp`)
parentDir : 'test/bed/root_dir1',
fullParentDir : '/User/dev/readdirp/test/bed/root_dir1',
name : 'root_dir1_subdir1',
path : 'test/bed/root_dir1/root_dir1_subdir1',
fullPath : '/User/dev/readdirp/test/bed/root_dir1/root_dir1_subdir1',
stat : [ ... ]
## Filters
There are three different ways to specify filters for files and directories respectively.
- **function**: a function that takes an entry info as a parameter and returns true to include or false to exclude the entry
- **glob string**: a string (e.g., `*.js`) which is matched using [minimatch](https://github.com/isaacs/minimatch), so go there for more
information.
Globstars (`**`) are not supported since specifying a recursive pattern for an already recursive function doesn't make sense.
Negated globs (as explained in the minimatch documentation) are allowed, e.g., `!*.txt` matches everything but text files.
- **array of glob strings**: need to be either all inclusive or all exclusive (negated) patterns; otherwise an error is thrown (see the sketch below).
`[ '*.json', '*.js' ]` includes all JavaScript and JSON files.
`[ '!.git', '!node_modules' ]` includes all directories except '.git' and 'node_modules'.
Directories that do not pass a filter will not be recursed into.
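As a quick illustration of the array rule above, a minimal sketch against the 2.x options-object API; the paths are illustrative, and the mixed example is shown commented out:
```javascript
var readdirp = require('readdirp');

// Valid: all inclusive globs (an entry passes if it matches any of them).
readdirp({ root: './test/bed', fileFilter: [ '*.js', '*.json' ] })
  .on('data', function (entry) { console.log(entry.path); })
  .on('error', function (err) { console.error(err); });

// Valid: all negated globs (an entry passes only if it matches none of them).
readdirp({ root: './test/bed', directoryFilter: [ '!.git', '!node_modules' ] })
  .on('data', function (entry) { console.log(entry.path); })
  .on('error', function (err) { console.error(err); });

// Invalid: mixing negated and non-negated globs in one array is rejected
// with "Cannot mix negated with non negated glob filters".
// readdirp({ root: './test/bed', fileFilter: [ '*.js', '!*.json' ] });
```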
## Callback API
Although the stream API is recommended, readdirp also exposes a callback-based API.
***readdirp (options, callback1 [, callback2])***
If callback2 is given, callback1 functions as the **fileProcessed** callback, and callback2 as the **allProcessed** callback.
If only callback1 is given, it functions as the **allProcessed** callback.
### allProcessed
- function with err and res parameters, e.g., `function (err, res) { ... }`
- **err**: array of errors that occurred during the operation, **res may still be present, even if errors occurred**
- **res**: collection of file/directory [entry infos](#entry-info)
### fileProcessed
- function with an [entry info](#entry-info) parameter, e.g., `function (entryInfo) { ... }`
# More Examples
`on('error', ..)`, `on('warn', ..)` and `on('end', ..)` handling omitted for brevity
```javascript
var readdirp = require('readdirp');
const readdirp = require('readdirp');
// Glob file filter
readdirp({ root: './test/bed', fileFilter: '*.js' })
.on('data', function (entry) {
// do something with each JavaScript file entry
});
// Use streams to achieve small RAM & CPU footprint.
// 1) Streams example with for-await.
for await (const entry of readdirp('.')) {
const {path} = entry;
console.log(`${JSON.stringify({path})}`);
}
// Combined glob file filters
readdirp({ root: './test/bed', fileFilter: [ '*.js', '*.json' ] })
.on('data', function (entry) {
// do something with each JavaScript and Json file entry
});
// 2) Streams example, non for-await.
// Print out all JS files along with their size within the current folder & subfolders.
readdirp('.', {fileFilter: '*.js', alwaysStat: true})
.on('data', (entry) => {
const {path, stats: {size}} = entry;
console.log(`${JSON.stringify({path, size})}`);
})
// Optionally call stream.destroy() in `warn()` in order to abort and cause 'close' to be emitted
.on('warn', error => console.error('non-fatal error', error))
.on('error', error => console.error('fatal error', error))
.on('end', () => console.log('done'));
// Combined negated directory filters
readdirp({ root: './test/bed', directoryFilter: [ '!.git', '!*modules' ] })
.on('data', function (entry) {
// do something with each file entry found outside '.git' or any modules directory
});
// 3) Promise example. More RAM and CPU than streams / for-await.
const files = await readdirp.promise('.');
console.log(files.map(file => file.path));
// Function directory filter
readdirp({ root: './test/bed', directoryFilter: function (di) { return di.name.length === 9; } })
.on('data', function (entry) {
// do something with each file entry found inside directories whose name has length 9
});
// Limiting depth
readdirp({ root: './test/bed', depth: 1 })
.on('data', function (entry) {
// do something with each file entry found up to 1 subdirectory deep
});
// callback api
readdirp({ root: '.' }, function(fileInfo) {
// do something with file entry here
}, function (err, res) {
// all done, move on or do final step for all file entries here
// Other options.
readdirp('test', {
fileFilter: '*.js',
directoryFilter: ['!.git', '!*modules'],
// directoryFilter: (di) => di.basename.length === 9
type: 'files_directories',
depth: 1
});
```
Try more examples by following [instructions](https://github.com/paulmillr/readdirp/blob/master/examples/Readme.md)
on how to get going.
For more examples, check out the `examples` directory.
## tests
## API
The [readdirp tests](https://github.com/paulmillr/readdirp/blob/master/test/readdirp.js) will also give you a good idea of
how things work.
`const stream = readdirp(root[, options])` **Stream API**
- Reads given root recursively and returns a `stream` of [entry infos](#entryinfo)
- Optionally can be used like `for await (const entry of stream)` with node.js 10+ (`asyncIterator`).
- `on('data', (entry) => {})` [entry info](#entryinfo) for every file / dir.
- `on('warn', (error) => {})` non-fatal `Error` that prevents a file / dir from being processed. Example: inaccessible to the user.
- `on('error', (error) => {})` fatal `Error` which also ends the stream. Example: illegal options were passed.
- `on('end')` — we are done. Called when all entries were found and no more will be emitted.
- `on('close')` — stream is destroyed via `stream.destroy()`.
Could be useful if you want to manually abort even on a non-fatal error.
At that point the stream is no longer `readable` and no more entries, warnings or errors are emitted (see the sketch after this list).
- To learn more about streams, consult the very detailed [nodejs streams documentation](https://nodejs.org/api/stream.html)
or the [stream-handbook](https://github.com/substack/stream-handbook)
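A minimal sketch of that manual abort, assuming the current directory as root; the warning threshold and log messages are illustrative only:
```javascript
const readdirp = require('readdirp');

const stream = readdirp('.', { fileFilter: '*.js' });
let warnings = 0;

stream
  .on('data', (entry) => console.log(entry.path))
  // Non-fatal problems (e.g. an unreadable subdirectory) arrive as 'warn'.
  .on('warn', (error) => {
    console.error('non-fatal error', error);
    // destroy() aborts the walk; 'close' is emitted and nothing more follows.
    if (++warnings > 10) stream.destroy();
  })
  .on('error', (error) => console.error('fatal error', error))
  .on('close', () => console.log('stream destroyed'))
  .on('end', () => console.log('done'));
```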
`const entries = await readdirp.promise(root[, options])` **Promise API**. Returns a list of [entry infos](#entryinfo).
First argument is always `root`, the path in which to start reading and recursing into subdirectories.
### options
- `fileFilter: ["*.js"]`: filter to include or exclude files. A `Function`, Glob string or Array of glob strings.
- **Function**: a function that takes an entry info as a parameter and returns true to include or false to exclude the entry
- **Glob string**: a string (e.g., `*.js`) which is matched using [picomatch](https://github.com/micromatch/picomatch), so go there for more
information. Globstars (`**`) are not supported since specifying a recursive pattern for an already recursive function doesn't make sense. Negated globs (as explained in the minimatch documentation) are allowed, e.g., `!*.txt` matches everything but text files.
- **Array of glob strings**: need to be either all inclusive or all exclusive (negated) patterns; otherwise an error is thrown.
`['*.json', '*.js']` includes all JavaScript and JSON files.
`['!.git', '!node_modules']` includes all directories except '.git' and 'node_modules'.
- Directories that do not pass a filter will not be recursed into.
- `directoryFilter: ['!.git']`: filter to include/exclude directories found and to recurse into. Directories that do not pass a filter will not be recursed into.
- `depth: 5`: depth at which to stop recursing even if more subdirectories are found
- `type: 'files'`: determines if data events on the stream should be emitted for `'files'` (default), `'directories'`, `'files_directories'`, or `'all'`. Setting to `'all'` will also include entries for other types of file descriptors like character devices, unix sockets and named pipes.
- `alwaysStat: false`: always return `stats` property for every file. Default is `false`, readdirp will return `Dirent` entries. Setting it to `true` can double readdir execution time - use it only when you need file `size`, `mtime` etc. Cannot be enabled on node <10.10.0.
- `lstat: false`: include symlink entries in the stream along with files. When `true`, `fs.lstat` will be used instead of `fs.stat`. (A combined sketch of these options follows this list.)
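Putting several of these options together, a minimal sketch; the root path and filter values are illustrative, not taken from the package:
```javascript
const readdirp = require('readdirp');

const stream = readdirp('.', {
  fileFilter: ['*.js', '*.json'],              // include only JS and JSON files
  directoryFilter: ['!.git', '!node_modules'], // do not list or recurse into these
  type: 'files',                               // emit file entries only (the default)
  depth: 5,                                    // stop recursing below this depth
  lstat: false,                                // stat symlink targets via fs.stat
  alwaysStat: false                            // emit fs.Dirent entries, not fs.Stats
});

stream.on('data', (entry) => console.log(entry.path));
```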
### `EntryInfo`
Has the following properties (a short access sketch follows the list):
- `path: 'assets/javascripts/react.js'`: path to the file/directory (relative to given root)
- `fullPath: '/Users/dev/projects/app/assets/javascripts/react.js'`: full path to the file/directory found
- `basename: 'react.js'`: name of the file/directory
- `dirent: fs.Dirent`: built-in [dir entry object](https://nodejs.org/api/fs.html#fs_class_fs_dirent) - only with `alwaysStat: false`
- `stats: fs.Stats`: built in [stat object](https://nodejs.org/api/fs.html#fs_class_fs_stats) - only with `alwaysStat: true`
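A short access sketch, assuming the default options (so `dirent` is set) and then `alwaysStat: true` (so `stats` is set instead):
```javascript
const readdirp = require('readdirp');

// Default: each entry carries a fs.Dirent in `dirent`; `stats` is undefined.
readdirp('.', { type: 'files_directories' })
  .on('data', ({ path, basename, dirent }) => {
    console.log(dirent.isDirectory() ? 'dir ' : 'file', basename, '->', path);
  });

// With alwaysStat: true, each entry carries a fs.Stats object instead.
readdirp('.', { alwaysStat: true })
  .on('data', ({ path, stats }) => console.log(path, stats.size, 'bytes'));
```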
## Changelog
- 3.5 (Oct 13, 2020) disallows recursive directory-based symlinks.
Before, it could have entered infinite loop.
- 3.4 (Mar 19, 2020) adds support for directory-based symlinks.
- 3.3 (Dec 6, 2019) stabilizes RAM consumption and enables perf management with `highWaterMark` option. Fixes race conditions related to `for-await` looping.
- 3.2 (Oct 14, 2019) improves performance by 250% and makes streams implementation more idiomatic.
- 3.1 (Jul 7, 2019) brings `bigint` support to `stat` output on Windows. This is backwards-incompatible for some cases. Be careful. If you use it incorrectly, you'll see "TypeError: Cannot mix BigInt and other types, use explicit conversions".
- 3.0 brings huge performance improvements and stream backpressure support.
- Upgrading 2.x to 3.x (a migration sketch follows this list):
- Signature changed from `readdirp(options)` to `readdirp(root, options)`
- Replaced callback API with promise API.
- Renamed `entryType` option to `type`
- Renamed `entryType: 'both'` to `'files_directories'`
- `EntryInfo`
- Renamed `stat` to `stats`
- Emitted only when `alwaysStat: true`
- `dirent` is emitted instead of `stats` by default with `alwaysStat: false`
- Renamed `name` to `basename`
- Removed `parentDir` and `fullParentDir` properties
- Supported node.js versions:
- 3.x: node 8+
- 2.x: node 0.6+
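Based on the changes listed above, a hedged before/after sketch of a 2.x to 3.x migration; the filter values are illustrative:
```javascript
// 2.x: single options object; `root`, `entryType`, entry `name` and `stat`.
// readdirp({ root: '.', fileFilter: '*.js', entryType: 'both' })
//   .on('data', (entry) => console.log(entry.name, entry.stat.size));

// 3.x: root is the first argument, `entryType: 'both'` becomes `type: 'files_directories'`,
// `name` becomes `basename`, and `stats` is only populated with `alwaysStat: true`.
const readdirp = require('readdirp');
readdirp('.', { fileFilter: '*.js', type: 'files_directories' })
  .on('data', (entry) => console.log(entry.basename));

// The callback API is gone; use the promise API instead.
readdirp.promise('.', { fileFilter: '*.js', alwaysStat: true })
  .then((entries) => entries.forEach((e) => console.log(e.path, e.stats.size)));
```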
## License
Copyright (c) 2012-2019 Thorsten Lorenz, Paul Miller (<https://paulmillr.com>)
MIT License, see [LICENSE](LICENSE) file.

node_modules/readdirp/index.d.ts (generated, vendored, new file): 43 changed lines

@@ -0,0 +1,43 @@
// TypeScript Version: 3.2
/// <reference types="node" lib="esnext" />
import * as fs from 'fs';
import { Readable } from 'stream';
declare namespace readdir {
interface EntryInfo {
path: string;
fullPath: string;
basename: string;
stats?: fs.Stats;
dirent?: fs.Dirent;
}
interface ReaddirpOptions {
root?: string;
fileFilter?: string | string[] | ((entry: EntryInfo) => boolean);
directoryFilter?: string | string[] | ((entry: EntryInfo) => boolean);
type?: 'files' | 'directories' | 'files_directories' | 'all';
lstat?: boolean;
depth?: number;
alwaysStat?: boolean;
}
interface ReaddirpStream extends Readable, AsyncIterable<EntryInfo> {
read(): EntryInfo;
[Symbol.asyncIterator](): AsyncIterableIterator<EntryInfo>;
}
function promise(
root: string,
options?: ReaddirpOptions
): Promise<EntryInfo[]>;
}
declare function readdir(
root: string,
options?: readdir.ReaddirpOptions
): readdir.ReaddirpStream;
export = readdir;
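For readers scanning the declarations above, a small JavaScript sketch of the two call shapes they describe (the stream factory and `promise`); paths and options are illustrative:
```javascript
const readdirp = require('readdirp');

// declare function readdir(root, options?): ReaddirpStream
const stream = readdirp('.', { type: 'files', depth: 2 });
stream.on('data', (entry) => {
  // EntryInfo: path, fullPath, basename, plus dirent or stats.
  console.log(entry.path, entry.basename);
});

// readdir.promise(root, options?): Promise<EntryInfo[]>
readdirp.promise('.', { fileFilter: '*.js' })
  .then((entries) => console.log(entries.length, 'matching files'));
```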

node_modules/readdirp/index.js (generated, vendored, new file): 287 changed lines

@@ -0,0 +1,287 @@
'use strict';
const fs = require('fs');
const { Readable } = require('stream');
const sysPath = require('path');
const { promisify } = require('util');
const picomatch = require('picomatch');
const readdir = promisify(fs.readdir);
const stat = promisify(fs.stat);
const lstat = promisify(fs.lstat);
const realpath = promisify(fs.realpath);
/**
* @typedef {Object} EntryInfo
* @property {String} path
* @property {String} fullPath
* @property {fs.Stats=} stats
* @property {fs.Dirent=} dirent
* @property {String} basename
*/
const BANG = '!';
const RECURSIVE_ERROR_CODE = 'READDIRP_RECURSIVE_ERROR';
const NORMAL_FLOW_ERRORS = new Set(['ENOENT', 'EPERM', 'EACCES', 'ELOOP', RECURSIVE_ERROR_CODE]);
const FILE_TYPE = 'files';
const DIR_TYPE = 'directories';
const FILE_DIR_TYPE = 'files_directories';
const EVERYTHING_TYPE = 'all';
const ALL_TYPES = [FILE_TYPE, DIR_TYPE, FILE_DIR_TYPE, EVERYTHING_TYPE];
const isNormalFlowError = error => NORMAL_FLOW_ERRORS.has(error.code);
const [maj, min] = process.versions.node.split('.').slice(0, 2).map(n => Number.parseInt(n, 10));
const wantBigintFsStats = process.platform === 'win32' && (maj > 10 || (maj === 10 && min >= 5));
const normalizeFilter = filter => {
if (filter === undefined) return;
if (typeof filter === 'function') return filter;
if (typeof filter === 'string') {
const glob = picomatch(filter.trim());
return entry => glob(entry.basename);
}
if (Array.isArray(filter)) {
const positive = [];
const negative = [];
for (const item of filter) {
const trimmed = item.trim();
if (trimmed.charAt(0) === BANG) {
negative.push(picomatch(trimmed.slice(1)));
} else {
positive.push(picomatch(trimmed));
}
}
if (negative.length > 0) {
if (positive.length > 0) {
return entry =>
positive.some(f => f(entry.basename)) && !negative.some(f => f(entry.basename));
}
return entry => !negative.some(f => f(entry.basename));
}
return entry => positive.some(f => f(entry.basename));
}
};
class ReaddirpStream extends Readable {
static get defaultOptions() {
return {
root: '.',
/* eslint-disable no-unused-vars */
fileFilter: (path) => true,
directoryFilter: (path) => true,
/* eslint-enable no-unused-vars */
type: FILE_TYPE,
lstat: false,
depth: 2147483648,
alwaysStat: false
};
}
constructor(options = {}) {
super({
objectMode: true,
autoDestroy: true,
highWaterMark: options.highWaterMark || 4096
});
const opts = { ...ReaddirpStream.defaultOptions, ...options };
const { root, type } = opts;
this._fileFilter = normalizeFilter(opts.fileFilter);
this._directoryFilter = normalizeFilter(opts.directoryFilter);
const statMethod = opts.lstat ? lstat : stat;
// Use bigint stats if it's windows and stat() supports options (node 10+).
if (wantBigintFsStats) {
this._stat = path => statMethod(path, { bigint: true });
} else {
this._stat = statMethod;
}
this._maxDepth = opts.depth;
this._wantsDir = [DIR_TYPE, FILE_DIR_TYPE, EVERYTHING_TYPE].includes(type);
this._wantsFile = [FILE_TYPE, FILE_DIR_TYPE, EVERYTHING_TYPE].includes(type);
this._wantsEverything = type === EVERYTHING_TYPE;
this._root = sysPath.resolve(root);
this._isDirent = ('Dirent' in fs) && !opts.alwaysStat;
this._statsProp = this._isDirent ? 'dirent' : 'stats';
this._rdOptions = { encoding: 'utf8', withFileTypes: this._isDirent };
// Launch stream with one parent, the root dir.
this.parents = [this._exploreDir(root, 1)];
this.reading = false;
this.parent = undefined;
}
async _read(batch) {
if (this.reading) return;
this.reading = true;
try {
while (!this.destroyed && batch > 0) {
const { path, depth, files = [] } = this.parent || {};
if (files.length > 0) {
const slice = files.splice(0, batch).map(dirent => this._formatEntry(dirent, path));
for (const entry of await Promise.all(slice)) {
if (this.destroyed) return;
const entryType = await this._getEntryType(entry);
if (entryType === 'directory' && this._directoryFilter(entry)) {
if (depth <= this._maxDepth) {
this.parents.push(this._exploreDir(entry.fullPath, depth + 1));
}
if (this._wantsDir) {
this.push(entry);
batch--;
}
} else if ((entryType === 'file' || this._includeAsFile(entry)) && this._fileFilter(entry)) {
if (this._wantsFile) {
this.push(entry);
batch--;
}
}
}
} else {
const parent = this.parents.pop();
if (!parent) {
this.push(null);
break;
}
this.parent = await parent;
if (this.destroyed) return;
}
}
} catch (error) {
this.destroy(error);
} finally {
this.reading = false;
}
}
async _exploreDir(path, depth) {
let files;
try {
files = await readdir(path, this._rdOptions);
} catch (error) {
this._onError(error);
}
return { files, depth, path };
}
async _formatEntry(dirent, path) {
let entry;
try {
const basename = this._isDirent ? dirent.name : dirent;
const fullPath = sysPath.resolve(sysPath.join(path, basename));
entry = { path: sysPath.relative(this._root, fullPath), fullPath, basename };
entry[this._statsProp] = this._isDirent ? dirent : await this._stat(fullPath);
} catch (err) {
this._onError(err);
}
return entry;
}
_onError(err) {
if (isNormalFlowError(err) && !this.destroyed) {
this.emit('warn', err);
} else {
this.destroy(err);
}
}
async _getEntryType(entry) {
// entry may be undefined, because a warning or an error were emitted
// and the statsProp is undefined
const stats = entry && entry[this._statsProp];
if (!stats) {
return;
}
if (stats.isFile()) {
return 'file';
}
if (stats.isDirectory()) {
return 'directory';
}
if (stats && stats.isSymbolicLink()) {
const full = entry.fullPath;
try {
const entryRealPath = await realpath(full);
const entryRealPathStats = await lstat(entryRealPath);
if (entryRealPathStats.isFile()) {
return 'file';
}
if (entryRealPathStats.isDirectory()) {
const len = entryRealPath.length;
if (full.startsWith(entryRealPath) && full.substr(len, 1) === sysPath.sep) {
const recursiveError = new Error(
`Circular symlink detected: "${full}" points to "${entryRealPath}"`
);
recursiveError.code = RECURSIVE_ERROR_CODE;
return this._onError(recursiveError);
}
return 'directory';
}
} catch (error) {
this._onError(error);
}
}
}
_includeAsFile(entry) {
const stats = entry && entry[this._statsProp];
return stats && this._wantsEverything && !stats.isDirectory();
}
}
/**
* @typedef {Object} ReaddirpArguments
* @property {Function=} fileFilter
* @property {Function=} directoryFilter
* @property {String=} type
* @property {Number=} depth
* @property {String=} root
* @property {Boolean=} lstat
* @property {Boolean=} bigint
*/
/**
* Main function which ends up calling readdirRec and reads all files and directories in given root recursively.
* @param {String} root Root directory
* @param {ReaddirpArguments=} options Options to specify root (start directory), filters and recursion depth
*/
const readdirp = (root, options = {}) => {
let type = options.entryType || options.type;
if (type === 'both') type = FILE_DIR_TYPE; // backwards-compatibility
if (type) options.type = type;
if (!root) {
throw new Error('readdirp: root argument is required. Usage: readdirp(root, options)');
} else if (typeof root !== 'string') {
throw new TypeError('readdirp: root argument must be a string. Usage: readdirp(root, options)');
} else if (type && !ALL_TYPES.includes(type)) {
throw new Error(`readdirp: Invalid type passed. Use one of ${ALL_TYPES.join(', ')}`);
}
options.root = root;
return new ReaddirpStream(options);
};
const readdirpPromise = (root, options = {}) => {
return new Promise((resolve, reject) => {
const files = [];
readdirp(root, options)
.on('data', entry => files.push(entry))
.on('end', () => resolve(files))
.on('error', error => reject(error));
});
};
readdirp.promise = readdirpPromise;
readdirp.ReaddirpStream = ReaddirpStream;
readdirp.default = readdirp;
module.exports = readdirp;

node_modules/readdirp/package.json (generated, vendored): 116 changed lines

@@ -1,19 +1,28 @@
{
"author": "Thorsten Lorenz <thlorenz@gmx.de> (thlorenz.com)",
"name": "readdirp",
"description": "Recursive version of fs.readdir with streaming api.",
"version": "2.2.1",
"description": "Recursive version of fs.readdir with streaming API.",
"version": "3.6.0",
"homepage": "https://github.com/paulmillr/readdirp",
"repository": {
"type": "git",
"url": "git://github.com/paulmillr/readdirp.git"
},
"license": "MIT",
"bugs": {
"url": "https://github.com/paulmillr/readdirp/issues"
},
"author": "Thorsten Lorenz <thlorenz@gmx.de> (thlorenz.com)",
"contributors": [
"Thorsten Lorenz <thlorenz@gmx.de> (thlorenz.com)",
"Paul Miller (https://paulmillr.com)"
],
"main": "index.js",
"engines": {
"node": ">=0.10"
"node": ">=8.10.0"
},
"files": [
"readdirp.js",
"stream-api.js"
"index.js",
"index.d.ts"
],
"keywords": [
"recursive",
@@ -25,26 +34,89 @@
"find",
"filter"
],
"main": "readdirp.js",
"scripts": {
"test-main": "(cd test && set -e; for t in ./*.js; do node $t; done)",
"test-0.10": "nave use 0.10 npm run test-main",
"test-0.12": "nave use 0.12 npm run test-main",
"test-4": "nave use 4.4 npm run test-main",
"test-6": "nave use 6.2 npm run test-main",
"test-all": "npm run test-main && npm run test-0.10 && npm run test-0.12 && npm run test-4 && npm run test-6",
"test": "npm run test-main"
"dtslint": "dtslint",
"nyc": "nyc",
"mocha": "mocha --exit",
"lint": "eslint --report-unused-disable-directives --ignore-path .gitignore .",
"test": "npm run lint && nyc npm run mocha"
},
"dependencies": {
"graceful-fs": "^4.1.11",
"micromatch": "^3.1.10",
"readable-stream": "^2.0.2"
"picomatch": "^2.2.1"
},
"devDependencies": {
"nave": "^0.5.1",
"proxyquire": "^1.7.9",
"tap": "1.3.2",
"through2": "^2.0.0"
"@types/node": "^14",
"chai": "^4.2",
"chai-subset": "^1.6",
"dtslint": "^3.3.0",
"eslint": "^7.0.0",
"mocha": "^7.1.1",
"nyc": "^15.0.0",
"rimraf": "^3.0.0",
"typescript": "^4.0.3"
},
"license": "MIT"
"nyc": {
"reporter": [
"html",
"text"
]
},
"eslintConfig": {
"root": true,
"extends": "eslint:recommended",
"parserOptions": {
"ecmaVersion": 9,
"sourceType": "script"
},
"env": {
"node": true,
"es6": true
},
"rules": {
"array-callback-return": "error",
"no-empty": [
"error",
{
"allowEmptyCatch": true
}
],
"no-else-return": [
"error",
{
"allowElseIf": false
}
],
"no-lonely-if": "error",
"no-var": "error",
"object-shorthand": "error",
"prefer-arrow-callback": [
"error",
{
"allowNamedFunctions": true
}
],
"prefer-const": [
"error",
{
"ignoreReadBeforeAssign": true
}
],
"prefer-destructuring": [
"error",
{
"object": true,
"array": false
}
],
"prefer-spread": "error",
"prefer-template": "error",
"radix": "error",
"semi": "error",
"strict": "error",
"quotes": [
"error",
"single"
]
}
}
}

node_modules/readdirp/readdirp.js (generated, vendored): 294 changed lines

@@ -1,294 +0,0 @@
'use strict';
var fs = require('graceful-fs')
, path = require('path')
, micromatch = require('micromatch').isMatch
, toString = Object.prototype.toString
;
// Standard helpers
function isFunction (obj) {
return toString.call(obj) === '[object Function]';
}
function isString (obj) {
return toString.call(obj) === '[object String]';
}
function isUndefined (obj) {
return obj === void 0;
}
/**
* Main function which ends up calling readdirRec and reads all files and directories in given root recursively.
* @param { Object } opts Options to specify root (start directory), filters and recursion depth
* @param { function } callback1 When callback2 is given calls back for each processed file - function (fileInfo) { ... },
* when callback2 is not given, it behaves like explained in callback2
* @param { function } callback2 Calls back once all files have been processed with an array of errors and file infos
* function (err, fileInfos) { ... }
*/
function readdir(opts, callback1, callback2) {
var stream
, handleError
, handleFatalError
, errors = []
, readdirResult = {
directories: []
, files: []
}
, fileProcessed
, allProcessed
, realRoot
, aborted = false
, paused = false
;
// If no callbacks were given we will use a streaming interface
if (isUndefined(callback1)) {
var api = require('./stream-api')();
stream = api.stream;
callback1 = api.processEntry;
callback2 = api.done;
handleError = api.handleError;
handleFatalError = api.handleFatalError;
stream.on('close', function () { aborted = true; });
stream.on('pause', function () { paused = true; });
stream.on('resume', function () { paused = false; });
} else {
handleError = function (err) { errors.push(err); };
handleFatalError = function (err) {
handleError(err);
allProcessed(errors, null);
};
}
if (isUndefined(opts)){
handleFatalError(new Error (
'Need to pass at least one argument: opts! \n' +
'https://github.com/paulmillr/readdirp#options'
)
);
return stream;
}
opts.root = opts.root || '.';
opts.fileFilter = opts.fileFilter || function() { return true; };
opts.directoryFilter = opts.directoryFilter || function() { return true; };
opts.depth = typeof opts.depth === 'undefined' ? 999999999 : opts.depth;
opts.entryType = opts.entryType || 'files';
var statfn = opts.lstat === true ? fs.lstat.bind(fs) : fs.stat.bind(fs);
if (isUndefined(callback2)) {
fileProcessed = function() { };
allProcessed = callback1;
} else {
fileProcessed = callback1;
allProcessed = callback2;
}
function normalizeFilter (filter) {
if (isUndefined(filter)) return undefined;
function isNegated (filters) {
function negated(f) {
return f.indexOf('!') === 0;
}
var some = filters.some(negated);
if (!some) {
return false;
} else {
if (filters.every(negated)) {
return true;
} else {
// if we detect illegal filters, bail out immediately
throw new Error(
'Cannot mix negated with non negated glob filters: ' + filters + '\n' +
'https://github.com/paulmillr/readdirp#filters'
);
}
}
}
// Turn all filters into a function
if (isFunction(filter)) {
return filter;
} else if (isString(filter)) {
return function (entryInfo) {
return micromatch(entryInfo.name, filter.trim());
};
} else if (filter && Array.isArray(filter)) {
if (filter) filter = filter.map(function (f) {
return f.trim();
});
return isNegated(filter) ?
// use AND to concat multiple negated filters
function (entryInfo) {
return filter.every(function (f) {
return micromatch(entryInfo.name, f);
});
}
:
// use OR to concat multiple inclusive filters
function (entryInfo) {
return filter.some(function (f) {
return micromatch(entryInfo.name, f);
});
};
}
}
function processDir(currentDir, entries, callProcessed) {
if (aborted) return;
var total = entries.length
, processed = 0
, entryInfos = []
;
fs.realpath(currentDir, function(err, realCurrentDir) {
if (aborted) return;
if (err) {
handleError(err);
callProcessed(entryInfos);
return;
}
var relDir = path.relative(realRoot, realCurrentDir);
if (entries.length === 0) {
callProcessed([]);
} else {
entries.forEach(function (entry) {
var fullPath = path.join(realCurrentDir, entry)
, relPath = path.join(relDir, entry);
statfn(fullPath, function (err, stat) {
if (err) {
handleError(err);
} else {
entryInfos.push({
name : entry
, path : relPath // relative to root
, fullPath : fullPath
, parentDir : relDir // relative to root
, fullParentDir : realCurrentDir
, stat : stat
});
}
processed++;
if (processed === total) callProcessed(entryInfos);
});
});
}
});
}
function readdirRec(currentDir, depth, callCurrentDirProcessed) {
var args = arguments;
if (aborted) return;
if (paused) {
setImmediate(function () {
readdirRec.apply(null, args);
})
return;
}
fs.readdir(currentDir, function (err, entries) {
if (err) {
handleError(err);
callCurrentDirProcessed();
return;
}
processDir(currentDir, entries, function(entryInfos) {
var subdirs = entryInfos
.filter(function (ei) { return ei.stat.isDirectory() && opts.directoryFilter(ei); });
subdirs.forEach(function (di) {
if(opts.entryType === 'directories' || opts.entryType === 'both' || opts.entryType === 'all') {
fileProcessed(di);
}
readdirResult.directories.push(di);
});
entryInfos
.filter(function(ei) {
var isCorrectType = opts.entryType === 'all' ?
!ei.stat.isDirectory() : ei.stat.isFile() || ei.stat.isSymbolicLink();
return isCorrectType && opts.fileFilter(ei);
})
.forEach(function (fi) {
if(opts.entryType === 'files' || opts.entryType === 'both' || opts.entryType === 'all') {
fileProcessed(fi);
}
readdirResult.files.push(fi);
});
var pendingSubdirs = subdirs.length;
// Be done if no more subfolders exist or we reached the maximum desired depth
if(pendingSubdirs === 0 || depth === opts.depth) {
callCurrentDirProcessed();
} else {
// recurse into subdirs, keeping track of which ones are done
// and call back once all are processed
subdirs.forEach(function (subdir) {
readdirRec(subdir.fullPath, depth + 1, function () {
pendingSubdirs = pendingSubdirs - 1;
if(pendingSubdirs === 0) {
callCurrentDirProcessed();
}
});
});
}
});
});
}
// Validate and normalize filters
try {
opts.fileFilter = normalizeFilter(opts.fileFilter);
opts.directoryFilter = normalizeFilter(opts.directoryFilter);
} catch (err) {
// if we detect illegal filters, bail out immediately
handleFatalError(err);
return stream;
}
// If filters were valid get on with the show
fs.realpath(opts.root, function(err, res) {
if (err) {
handleFatalError(err);
return stream;
}
realRoot = res;
readdirRec(opts.root, 0, function () {
// All errors are collected into the errors array
if (errors.length > 0) {
allProcessed(errors, readdirResult);
} else {
allProcessed(null, readdirResult);
}
});
});
return stream;
}
module.exports = readdir;

node_modules/readdirp/stream-api.js (generated, vendored): 98 changed lines

@@ -1,98 +0,0 @@
'use strict';
var stream = require('readable-stream');
var util = require('util');
var Readable = stream.Readable;
module.exports = ReaddirpReadable;
util.inherits(ReaddirpReadable, Readable);
function ReaddirpReadable (opts) {
if (!(this instanceof ReaddirpReadable)) return new ReaddirpReadable(opts);
opts = opts || {};
opts.objectMode = true;
Readable.call(this, opts);
// backpressure not implemented at this point
this.highWaterMark = Infinity;
this._destroyed = false;
this._paused = false;
this._warnings = [];
this._errors = [];
this._pauseResumeErrors();
}
var proto = ReaddirpReadable.prototype;
proto._pauseResumeErrors = function () {
var self = this;
self.on('pause', function () { self._paused = true });
self.on('resume', function () {
if (self._destroyed) return;
self._paused = false;
self._warnings.forEach(function (err) { self.emit('warn', err) });
self._warnings.length = 0;
self._errors.forEach(function (err) { self.emit('error', err) });
self._errors.length = 0;
})
}
// called for each entry
proto._processEntry = function (entry) {
if (this._destroyed) return;
this.push(entry);
}
proto._read = function () { }
proto.destroy = function () {
// when stream is destroyed it will emit nothing further, not even errors or warnings
this.push(null);
this.readable = false;
this._destroyed = true;
this.emit('close');
}
proto._done = function () {
this.push(null);
}
// we emit errors and warnings async since we may handle errors like invalid args
// within the initial event loop before any event listeners subscribed
proto._handleError = function (err) {
var self = this;
setImmediate(function () {
if (self._paused) return self._warnings.push(err);
if (!self._destroyed) self.emit('warn', err);
});
}
proto._handleFatalError = function (err) {
var self = this;
setImmediate(function () {
if (self._paused) return self._errors.push(err);
if (!self._destroyed) self.emit('error', err);
});
}
function createStreamAPI () {
var stream = new ReaddirpReadable();
return {
stream : stream
, processEntry : stream._processEntry.bind(stream)
, done : stream._done.bind(stream)
, handleError : stream._handleError.bind(stream)
, handleFatalError : stream._handleFatalError.bind(stream)
};
}
module.exports = createStreamAPI;