Add files via upload

bakustarver 2024-07-01 20:36:56 +03:00 committed by GitHub
parent 81e7741ef4
commit 1985318952
No known key found for this signature in database
GPG key ID: B5690EEEBB952194
44 changed files with 11064 additions and 0 deletions


@@ -0,0 +1,15 @@
The ISC License

Copyright (c) 2011-2022 Isaac Z. Schlueter, Ben Noordhuis, and Contributors

Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above
copyright notice and this permission notice appear in all copies.

THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR
IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.


@@ -0,0 +1,143 @@
# graceful-fs

graceful-fs functions as a drop-in replacement for the fs module,
making various improvements.

The improvements are meant to normalize behavior across different
platforms and environments, and to make filesystem access more
resilient to errors.

## Improvements over [fs module](https://nodejs.org/api/fs.html)
* Queues up `open` and `readdir` calls, and retries them once
  something closes if there is an `EMFILE` error from too many file
  descriptors.
* Fixes `lchmod` for Node versions prior to 0.6.2.
* Implements `fs.lutimes` if possible. Otherwise it becomes a no-op.
* Ignores `EINVAL` and `EPERM` errors in `chown`, `fchown` or
  `lchown` if the user isn't root.
* Makes `lchmod` and `lchown` no-ops if they are not available.
* Retries reading a file if `read` results in an `EAGAIN` error.
* On Windows, retries renaming a file for up to one second if an `EACCES`
  or `EPERM` error occurs, likely because antivirus software has locked
  the directory.
## USAGE

```javascript
// use just like fs
var fs = require('graceful-fs')

// now go and do stuff with it...
fs.readFile('some-file-or-whatever', (err, data) => {
  // Do stuff here.
})
```
## Sync methods

This module cannot intercept or handle `EMFILE` or `ENFILE` errors from sync
methods. If you use sync methods that open file descriptors, then you are
responsible for dealing with any errors.

This is a known limitation, not a bug.
## Global Patching

If you want to patch the global fs module (or any other fs-like
module) you can do this:

```javascript
// Make sure to read the caveat below.
var realFs = require('fs')
var gracefulFs = require('graceful-fs')
gracefulFs.gracefulify(realFs)
```

This should only ever be done at the top-level application layer, in
order to queue up and retry on `EMFILE` errors from any fs-using
dependencies. You should **not** do this in a library, because it can
cause unexpected delays in other parts of the program.
## Changes

This module is fairly stable at this point, and used by a lot of
things. That being said, because it implements a subtle behavior
change in a core part of the node API, even modest changes can be
extremely breaking, and the versioning is thus biased towards
bumping the major when in doubt.

The main change between major versions has been switching between
providing a fully-patched `fs` module vs monkey-patching the node core
builtin, and the approach by which a non-monkey-patched `fs` was
created.

The goal is to trade `EMFILE` errors for slower fs operations. So, if
you try to open a zillion files, rather than crashing, `open`
operations will be queued up and wait for something else to `close`.

There are advantages to each approach. Monkey-patching the fs means
that no `EMFILE` errors can possibly occur anywhere in your
application, because everything is using the same core `fs` module,
which is patched. However, it can also obviously cause undesirable
side-effects, especially if the module is loaded multiple times.

Implementing a separate-but-identical patched `fs` module is more
surgical (and doesn't run the risk of patching multiple times), but
also imposes the challenge of keeping in sync with the core module.

The current approach loads the `fs` module, and then creates a
lookalike object that has all the same methods, except a few that are
patched. It is safe to use in all versions of Node from 0.8 through
7.0.
### v4
* Do not monkey-patch the fs module. This module may now be used as a
drop-in dep, and users can opt into monkey-patching the fs builtin
if their app requires it.
### v3
* Monkey-patch fs, because the eval approach no longer works on recent
node.
* fixed possible type-error throw if rename fails on windows
* verify that we *never* get EMFILE errors
* Ignore ENOSYS from chmod/chown
* clarify that graceful-fs must be used as a drop-in
### v2.1.0
* Use eval rather than monkey-patching fs.
* readdir: Always sort the results
* win32: requeue a file if error has an OK status
### v2.0
* A return to monkey patching
* wrap process.cwd
### v1.1
* wrap readFile
* Wrap fs.writeFile.
* readdir protection
* Don't clobber the fs builtin
* Handle fs.read EAGAIN errors by trying again
* Expose the curOpen counter
* No-op lchown/lchmod if not implemented
* fs.rename patch only for win32
* Patch fs.rename to handle AV software on Windows
* Close #4 Chown should not fail on einval or eperm if non-root
* Fix isaacs/fstream#1 Only wrap fs one time
* Fix #3 Start at 1024 max files, then back off on EMFILE
* lutimes that doesn't blow up on Linux
* A full-on rewrite using a queue instead of just swallowing the EMFILE error
* Wrap Read/Write streams as well
### 1.0
* Update engines for node 0.6
* Be lstat-graceful on Windows
* first


@@ -0,0 +1,23 @@
'use strict'

module.exports = clone

var getPrototypeOf = Object.getPrototypeOf || function (obj) {
  return obj.__proto__
}

function clone (obj) {
  if (obj === null || typeof obj !== 'object')
    return obj

  if (obj instanceof Object)
    var copy = { __proto__: getPrototypeOf(obj) }
  else
    var copy = Object.create(null)

  Object.getOwnPropertyNames(obj).forEach(function (key) {
    Object.defineProperty(copy, key, Object.getOwnPropertyDescriptor(obj, key))
  })

  return copy
}
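A quick usage sketch of `clone()` (the function is repeated here so the sketch runs standalone): the copy is a distinct object, yet it keeps the original's prototype chain and own property descriptors.

```javascript
// clone() as defined in clone.js, repeated so this sketch is self-contained
function clone (obj) {
  if (obj === null || typeof obj !== 'object') return obj
  var copy = obj instanceof Object
    ? { __proto__: Object.getPrototypeOf(obj) }
    : Object.create(null)
  Object.getOwnPropertyNames(obj).forEach(function (key) {
    Object.defineProperty(copy, key, Object.getOwnPropertyDescriptor(obj, key))
  })
  return copy
}

var proto = { greet: function () { return 'hi' } }
var original = Object.create(proto)
original.name = 'a'

var copy = clone(original)
copy.name = 'b' // mutating the copy does not touch the original
```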


@@ -0,0 +1,448 @@
var fs = require('fs')
var polyfills = require('./polyfills.js')
var legacy = require('./legacy-streams.js')
var clone = require('./clone.js')

var util = require('util')

/* istanbul ignore next - node 0.x polyfill */
var gracefulQueue
var previousSymbol

/* istanbul ignore else - node 0.x polyfill */
if (typeof Symbol === 'function' && typeof Symbol.for === 'function') {
  gracefulQueue = Symbol.for('graceful-fs.queue')
  // This is used in testing by future versions
  previousSymbol = Symbol.for('graceful-fs.previous')
} else {
  gracefulQueue = '___graceful-fs.queue'
  previousSymbol = '___graceful-fs.previous'
}

function noop () {}

function publishQueue(context, queue) {
  Object.defineProperty(context, gracefulQueue, {
    get: function() {
      return queue
    }
  })
}

var debug = noop
if (util.debuglog)
  debug = util.debuglog('gfs4')
else if (/\bgfs4\b/i.test(process.env.NODE_DEBUG || ''))
  debug = function() {
    var m = util.format.apply(util, arguments)
    m = 'GFS4: ' + m.split(/\n/).join('\nGFS4: ')
    console.error(m)
  }

// Once time initialization
if (!fs[gracefulQueue]) {
  // This queue can be shared by multiple loaded instances
  var queue = global[gracefulQueue] || []
  publishQueue(fs, queue)

  // Patch fs.close/closeSync to shared queue version, because we need
  // to retry() whenever a close happens *anywhere* in the program.
  // This is essential when multiple graceful-fs instances are
  // in play at the same time.
  fs.close = (function (fs$close) {
    function close (fd, cb) {
      return fs$close.call(fs, fd, function (err) {
        // This function uses the graceful-fs shared queue
        if (!err) {
          resetQueue()
        }

        if (typeof cb === 'function')
          cb.apply(this, arguments)
      })
    }

    Object.defineProperty(close, previousSymbol, {
      value: fs$close
    })
    return close
  })(fs.close)

  fs.closeSync = (function (fs$closeSync) {
    function closeSync (fd) {
      // This function uses the graceful-fs shared queue
      fs$closeSync.apply(fs, arguments)
      resetQueue()
    }

    Object.defineProperty(closeSync, previousSymbol, {
      value: fs$closeSync
    })
    return closeSync
  })(fs.closeSync)

  if (/\bgfs4\b/i.test(process.env.NODE_DEBUG || '')) {
    process.on('exit', function() {
      debug(fs[gracefulQueue])
      require('assert').equal(fs[gracefulQueue].length, 0)
    })
  }
}

if (!global[gracefulQueue]) {
  publishQueue(global, fs[gracefulQueue]);
}

module.exports = patch(clone(fs))
if (process.env.TEST_GRACEFUL_FS_GLOBAL_PATCH && !fs.__patched) {
  module.exports = patch(fs)
  fs.__patched = true;
}

function patch (fs) {
  // Everything that references the open() function needs to be in here
  polyfills(fs)
  fs.gracefulify = patch

  fs.createReadStream = createReadStream
  fs.createWriteStream = createWriteStream
  var fs$readFile = fs.readFile
  fs.readFile = readFile
  function readFile (path, options, cb) {
    if (typeof options === 'function')
      cb = options, options = null

    return go$readFile(path, options, cb)

    function go$readFile (path, options, cb, startTime) {
      return fs$readFile(path, options, function (err) {
        if (err && (err.code === 'EMFILE' || err.code === 'ENFILE'))
          enqueue([go$readFile, [path, options, cb], err, startTime || Date.now(), Date.now()])
        else {
          if (typeof cb === 'function')
            cb.apply(this, arguments)
        }
      })
    }
  }

  var fs$writeFile = fs.writeFile
  fs.writeFile = writeFile
  function writeFile (path, data, options, cb) {
    if (typeof options === 'function')
      cb = options, options = null

    return go$writeFile(path, data, options, cb)

    function go$writeFile (path, data, options, cb, startTime) {
      return fs$writeFile(path, data, options, function (err) {
        if (err && (err.code === 'EMFILE' || err.code === 'ENFILE'))
          enqueue([go$writeFile, [path, data, options, cb], err, startTime || Date.now(), Date.now()])
        else {
          if (typeof cb === 'function')
            cb.apply(this, arguments)
        }
      })
    }
  }

  var fs$appendFile = fs.appendFile
  if (fs$appendFile)
    fs.appendFile = appendFile
  function appendFile (path, data, options, cb) {
    if (typeof options === 'function')
      cb = options, options = null

    return go$appendFile(path, data, options, cb)

    function go$appendFile (path, data, options, cb, startTime) {
      return fs$appendFile(path, data, options, function (err) {
        if (err && (err.code === 'EMFILE' || err.code === 'ENFILE'))
          enqueue([go$appendFile, [path, data, options, cb], err, startTime || Date.now(), Date.now()])
        else {
          if (typeof cb === 'function')
            cb.apply(this, arguments)
        }
      })
    }
  }

  var fs$copyFile = fs.copyFile
  if (fs$copyFile)
    fs.copyFile = copyFile
  function copyFile (src, dest, flags, cb) {
    if (typeof flags === 'function') {
      cb = flags
      flags = 0
    }
    return go$copyFile(src, dest, flags, cb)

    function go$copyFile (src, dest, flags, cb, startTime) {
      return fs$copyFile(src, dest, flags, function (err) {
        if (err && (err.code === 'EMFILE' || err.code === 'ENFILE'))
          enqueue([go$copyFile, [src, dest, flags, cb], err, startTime || Date.now(), Date.now()])
        else {
          if (typeof cb === 'function')
            cb.apply(this, arguments)
        }
      })
    }
  }

  var fs$readdir = fs.readdir
  fs.readdir = readdir
  var noReaddirOptionVersions = /^v[0-5]\./
  function readdir (path, options, cb) {
    if (typeof options === 'function')
      cb = options, options = null

    var go$readdir = noReaddirOptionVersions.test(process.version)
      ? function go$readdir (path, options, cb, startTime) {
        return fs$readdir(path, fs$readdirCallback(
          path, options, cb, startTime
        ))
      }
      : function go$readdir (path, options, cb, startTime) {
        return fs$readdir(path, options, fs$readdirCallback(
          path, options, cb, startTime
        ))
      }

    return go$readdir(path, options, cb)

    function fs$readdirCallback (path, options, cb, startTime) {
      return function (err, files) {
        if (err && (err.code === 'EMFILE' || err.code === 'ENFILE'))
          enqueue([
            go$readdir,
            [path, options, cb],
            err,
            startTime || Date.now(),
            Date.now()
          ])
        else {
          if (files && files.sort)
            files.sort()

          if (typeof cb === 'function')
            cb.call(this, err, files)
        }
      }
    }
  }

  if (process.version.substr(0, 4) === 'v0.8') {
    var legStreams = legacy(fs)
    ReadStream = legStreams.ReadStream
    WriteStream = legStreams.WriteStream
  }

  var fs$ReadStream = fs.ReadStream
  if (fs$ReadStream) {
    ReadStream.prototype = Object.create(fs$ReadStream.prototype)
    ReadStream.prototype.open = ReadStream$open
  }

  var fs$WriteStream = fs.WriteStream
  if (fs$WriteStream) {
    WriteStream.prototype = Object.create(fs$WriteStream.prototype)
    WriteStream.prototype.open = WriteStream$open
  }

  Object.defineProperty(fs, 'ReadStream', {
    get: function () {
      return ReadStream
    },
    set: function (val) {
      ReadStream = val
    },
    enumerable: true,
    configurable: true
  })
  Object.defineProperty(fs, 'WriteStream', {
    get: function () {
      return WriteStream
    },
    set: function (val) {
      WriteStream = val
    },
    enumerable: true,
    configurable: true
  })

  // legacy names
  var FileReadStream = ReadStream
  Object.defineProperty(fs, 'FileReadStream', {
    get: function () {
      return FileReadStream
    },
    set: function (val) {
      FileReadStream = val
    },
    enumerable: true,
    configurable: true
  })
  var FileWriteStream = WriteStream
  Object.defineProperty(fs, 'FileWriteStream', {
    get: function () {
      return FileWriteStream
    },
    set: function (val) {
      FileWriteStream = val
    },
    enumerable: true,
    configurable: true
  })

  function ReadStream (path, options) {
    if (this instanceof ReadStream)
      return fs$ReadStream.apply(this, arguments), this
    else
      return ReadStream.apply(Object.create(ReadStream.prototype), arguments)
  }

  function ReadStream$open () {
    var that = this
    open(that.path, that.flags, that.mode, function (err, fd) {
      if (err) {
        if (that.autoClose)
          that.destroy()

        that.emit('error', err)
      } else {
        that.fd = fd
        that.emit('open', fd)
        that.read()
      }
    })
  }

  function WriteStream (path, options) {
    if (this instanceof WriteStream)
      return fs$WriteStream.apply(this, arguments), this
    else
      return WriteStream.apply(Object.create(WriteStream.prototype), arguments)
  }

  function WriteStream$open () {
    var that = this
    open(that.path, that.flags, that.mode, function (err, fd) {
      if (err) {
        that.destroy()
        that.emit('error', err)
      } else {
        that.fd = fd
        that.emit('open', fd)
      }
    })
  }

  function createReadStream (path, options) {
    return new fs.ReadStream(path, options)
  }

  function createWriteStream (path, options) {
    return new fs.WriteStream(path, options)
  }

  var fs$open = fs.open
  fs.open = open
  function open (path, flags, mode, cb) {
    if (typeof mode === 'function')
      cb = mode, mode = null

    return go$open(path, flags, mode, cb)

    function go$open (path, flags, mode, cb, startTime) {
      return fs$open(path, flags, mode, function (err, fd) {
        if (err && (err.code === 'EMFILE' || err.code === 'ENFILE'))
          enqueue([go$open, [path, flags, mode, cb], err, startTime || Date.now(), Date.now()])
        else {
          if (typeof cb === 'function')
            cb.apply(this, arguments)
        }
      })
    }
  }

  return fs
}

function enqueue (elem) {
  debug('ENQUEUE', elem[0].name, elem[1])
  fs[gracefulQueue].push(elem)
  retry()
}

// keep track of the timeout between retry() calls
var retryTimer

// reset the startTime and lastTime to now
// this resets the start of the 60 second overall timeout as well as the
// delay between attempts so that we'll retry these jobs sooner
function resetQueue () {
  var now = Date.now()
  for (var i = 0; i < fs[gracefulQueue].length; ++i) {
    // entries that are only a length of 2 are from an older version, don't
    // bother modifying those since they'll be retried anyway.
    if (fs[gracefulQueue][i].length > 2) {
      fs[gracefulQueue][i][3] = now // startTime
      fs[gracefulQueue][i][4] = now // lastTime
    }
  }
  // call retry to make sure we're actively processing the queue
  retry()
}

function retry () {
  // clear the timer and remove it to help prevent unintended concurrency
  clearTimeout(retryTimer)
  retryTimer = undefined

  if (fs[gracefulQueue].length === 0)
    return

  var elem = fs[gracefulQueue].shift()
  var fn = elem[0]
  var args = elem[1]
  // these items may be unset if they were added by an older graceful-fs
  var err = elem[2]
  var startTime = elem[3]
  var lastTime = elem[4]

  // if we don't have a startTime we have no way of knowing if we've waited
  // long enough, so go ahead and retry this item now
  if (startTime === undefined) {
    debug('RETRY', fn.name, args)
    fn.apply(null, args)
  } else if (Date.now() - startTime >= 60000) {
    // it's been more than 60 seconds total, bail now
    debug('TIMEOUT', fn.name, args)
    var cb = args.pop()
    if (typeof cb === 'function')
      cb.call(null, err)
  } else {
    // the amount of time between the last attempt and right now
    var sinceAttempt = Date.now() - lastTime
    // the amount of time between when we first tried, and when we last tried
    // rounded up to at least 1
    var sinceStart = Math.max(lastTime - startTime, 1)
    // backoff. wait longer than the total time we've been retrying, but only
    // up to a maximum of 100ms
    var desiredDelay = Math.min(sinceStart * 1.2, 100)
    // it's been long enough since the last retry, do it again
    if (sinceAttempt >= desiredDelay) {
      debug('RETRY', fn.name, args)
      fn.apply(null, args.concat([startTime]))
    } else {
      // if we can't do this job yet, push it to the end of the queue
      // and let the next iteration check again
      fs[gracefulQueue].push(elem)
    }
  }

  // schedule our next run if one isn't already scheduled
  if (retryTimer === undefined) {
    retryTimer = setTimeout(retry, 0)
  }
}
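The delay computed by `retry()` above follows a simple rule: wait about 1.2x the time the job has already spent retrying, capped at 100ms. Isolated as a standalone sketch (the helper name is illustrative, not part of the module):

```javascript
// Mirrors the backoff arithmetic in retry(): the longer a job has been
// retrying (lastTime - startTime, floored at 1ms), the longer the next
// wait, up to a 100ms ceiling.
function desiredDelay (startTime, lastTime) {
  var sinceStart = Math.max(lastTime - startTime, 1)
  return Math.min(sinceStart * 1.2, 100)
}
```

So a job enqueued at t=0 and last tried at t=50 waits about 60ms before the next attempt, and any job that has been retrying for more than about 83ms hits the 100ms cap.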


@@ -0,0 +1,118 @@
var Stream = require('stream').Stream

module.exports = legacy

function legacy (fs) {
  return {
    ReadStream: ReadStream,
    WriteStream: WriteStream
  }

  function ReadStream (path, options) {
    if (!(this instanceof ReadStream)) return new ReadStream(path, options);

    Stream.call(this);

    var self = this;

    this.path = path;
    this.fd = null;
    this.readable = true;
    this.paused = false;

    this.flags = 'r';
    this.mode = 438; /*=0666*/
    this.bufferSize = 64 * 1024;

    options = options || {};

    // Mixin options into this
    var keys = Object.keys(options);
    for (var index = 0, length = keys.length; index < length; index++) {
      var key = keys[index];
      this[key] = options[key];
    }

    if (this.encoding) this.setEncoding(this.encoding);

    if (this.start !== undefined) {
      if ('number' !== typeof this.start) {
        throw TypeError('start must be a Number');
      }
      if (this.end === undefined) {
        this.end = Infinity;
      } else if ('number' !== typeof this.end) {
        throw TypeError('end must be a Number');
      }

      if (this.start > this.end) {
        throw new Error('start must be <= end');
      }

      this.pos = this.start;
    }

    if (this.fd !== null) {
      process.nextTick(function() {
        self._read();
      });
      return;
    }

    fs.open(this.path, this.flags, this.mode, function (err, fd) {
      if (err) {
        self.emit('error', err);
        self.readable = false;
        return;
      }

      self.fd = fd;
      self.emit('open', fd);
      self._read();
    })
  }

  function WriteStream (path, options) {
    if (!(this instanceof WriteStream)) return new WriteStream(path, options);

    Stream.call(this);

    this.path = path;
    this.fd = null;
    this.writable = true;

    this.flags = 'w';
    this.encoding = 'binary';
    this.mode = 438; /*=0666*/
    this.bytesWritten = 0;

    options = options || {};

    // Mixin options into this
    var keys = Object.keys(options);
    for (var index = 0, length = keys.length; index < length; index++) {
      var key = keys[index];
      this[key] = options[key];
    }

    if (this.start !== undefined) {
      if ('number' !== typeof this.start) {
        throw TypeError('start must be a Number');
      }
      if (this.start < 0) {
        throw new Error('start must be >= zero');
      }

      this.pos = this.start;
    }

    this.busy = false;
    this._queue = [];

    if (this.fd === null) {
      this._open = fs.open;
      this._queue.push([this._open, this.path, this.flags, this.mode, undefined]);
      this.flush();
    }
  }
}

File diff suppressed because it is too large.


@@ -0,0 +1,53 @@
{
  "name": "graceful-fs",
  "description": "A drop-in replacement for fs, making various improvements.",
  "version": "4.2.11",
  "repository": {
    "type": "git",
    "url": "https://github.com/isaacs/node-graceful-fs"
  },
  "main": "graceful-fs.js",
  "directories": {
    "test": "test"
  },
  "scripts": {
    "preversion": "npm test",
    "postversion": "npm publish",
    "postpublish": "git push origin --follow-tags",
    "test": "nyc --silent node test.js | tap -c -",
    "posttest": "nyc report"
  },
  "keywords": [
    "fs",
    "module",
    "reading",
    "retry",
    "retries",
    "queue",
    "error",
    "errors",
    "handling",
    "EMFILE",
    "EAGAIN",
    "EINVAL",
    "EPERM",
    "EACCESS"
  ],
  "license": "ISC",
  "devDependencies": {
    "import-fresh": "^2.0.0",
    "mkdirp": "^0.5.0",
    "rimraf": "^2.2.8",
    "tap": "^16.3.4"
  },
  "files": [
    "fs.js",
    "graceful-fs.js",
    "legacy-streams.js",
    "polyfills.js",
    "clone.js"
  ],
  "tap": {
    "reporter": "classic"
  }
}


@@ -0,0 +1,355 @@
var constants = require('constants')

var origCwd = process.cwd
var cwd = null

var platform = process.env.GRACEFUL_FS_PLATFORM || process.platform

process.cwd = function() {
  if (!cwd)
    cwd = origCwd.call(process)
  return cwd
}
try {
  process.cwd()
} catch (er) {}

// This check is needed until node.js 12 is required
if (typeof process.chdir === 'function') {
  var chdir = process.chdir
  process.chdir = function (d) {
    cwd = null
    chdir.call(process, d)
  }
  if (Object.setPrototypeOf) Object.setPrototypeOf(process.chdir, chdir)
}

module.exports = patch

function patch (fs) {
  // (re-)implement some things that are known busted or missing.

  // lchmod, broken prior to 0.6.2
  // back-port the fix here.
  if (constants.hasOwnProperty('O_SYMLINK') &&
      process.version.match(/^v0\.6\.[0-2]|^v0\.5\./)) {
    patchLchmod(fs)
  }

  // lutimes implementation, or no-op
  if (!fs.lutimes) {
    patchLutimes(fs)
  }

  // https://github.com/isaacs/node-graceful-fs/issues/4
  // Chown should not fail on einval or eperm if non-root.
  // It should not fail on enosys ever, as this just indicates
  // that a fs doesn't support the intended operation.

  fs.chown = chownFix(fs.chown)
  fs.fchown = chownFix(fs.fchown)
  fs.lchown = chownFix(fs.lchown)

  fs.chmod = chmodFix(fs.chmod)
  fs.fchmod = chmodFix(fs.fchmod)
  fs.lchmod = chmodFix(fs.lchmod)

  fs.chownSync = chownFixSync(fs.chownSync)
  fs.fchownSync = chownFixSync(fs.fchownSync)
  fs.lchownSync = chownFixSync(fs.lchownSync)

  fs.chmodSync = chmodFixSync(fs.chmodSync)
  fs.fchmodSync = chmodFixSync(fs.fchmodSync)
  fs.lchmodSync = chmodFixSync(fs.lchmodSync)

  fs.stat = statFix(fs.stat)
  fs.fstat = statFix(fs.fstat)
  fs.lstat = statFix(fs.lstat)

  fs.statSync = statFixSync(fs.statSync)
  fs.fstatSync = statFixSync(fs.fstatSync)
  fs.lstatSync = statFixSync(fs.lstatSync)

  // if lchmod/lchown do not exist, then make them no-ops
  if (fs.chmod && !fs.lchmod) {
    fs.lchmod = function (path, mode, cb) {
      if (cb) process.nextTick(cb)
    }
    fs.lchmodSync = function () {}
  }
  if (fs.chown && !fs.lchown) {
    fs.lchown = function (path, uid, gid, cb) {
      if (cb) process.nextTick(cb)
    }
    fs.lchownSync = function () {}
  }

  // on Windows, A/V software can lock the directory, causing this
  // to fail with an EACCES or EPERM if the directory contains newly
  // created files. Try again on failure, for up to 60 seconds.

  // Set the timeout this long because some Windows Anti-Virus, such as Parity
  // bit9, may lock files for up to a minute, causing npm package install
  // failures. Also, take care to yield the scheduler. Windows scheduling gives
  // CPU to a busy looping process, which can cause the program causing the lock
  // contention to be starved of CPU by node, so the contention doesn't resolve.
  if (platform === "win32") {
    fs.rename = typeof fs.rename !== 'function' ? fs.rename
      : (function (fs$rename) {
        function rename (from, to, cb) {
          var start = Date.now()
          var backoff = 0;
          fs$rename(from, to, function CB (er) {
            if (er
                && (er.code === "EACCES" || er.code === "EPERM" || er.code === "EBUSY")
                && Date.now() - start < 60000) {
              setTimeout(function() {
                fs.stat(to, function (stater, st) {
                  if (stater && stater.code === "ENOENT")
                    fs$rename(from, to, CB);
                  else
                    cb(er)
                })
              }, backoff)
              if (backoff < 100)
                backoff += 10;
              return;
            }
            if (cb) cb(er)
          })
        }
        if (Object.setPrototypeOf) Object.setPrototypeOf(rename, fs$rename)
        return rename
      })(fs.rename)
  }

  // if read() returns EAGAIN, then just try it again.
  fs.read = typeof fs.read !== 'function' ? fs.read
    : (function (fs$read) {
      function read (fd, buffer, offset, length, position, callback_) {
        var callback
        if (callback_ && typeof callback_ === 'function') {
          var eagCounter = 0
          callback = function (er, _, __) {
            if (er && er.code === 'EAGAIN' && eagCounter < 10) {
              eagCounter ++
              return fs$read.call(fs, fd, buffer, offset, length, position, callback)
            }
            callback_.apply(this, arguments)
          }
        }
        return fs$read.call(fs, fd, buffer, offset, length, position, callback)
      }

      // This ensures `util.promisify` works as it does for native `fs.read`.
      if (Object.setPrototypeOf) Object.setPrototypeOf(read, fs$read)
      return read
    })(fs.read)

  fs.readSync = typeof fs.readSync !== 'function' ? fs.readSync
    : (function (fs$readSync) { return function (fd, buffer, offset, length, position) {
      var eagCounter = 0
      while (true) {
        try {
          return fs$readSync.call(fs, fd, buffer, offset, length, position)
        } catch (er) {
          if (er.code === 'EAGAIN' && eagCounter < 10) {
            eagCounter ++
            continue
          }
          throw er
        }
      }
    }})(fs.readSync)

  function patchLchmod (fs) {
    fs.lchmod = function (path, mode, callback) {
      fs.open( path
             , constants.O_WRONLY | constants.O_SYMLINK
             , mode
             , function (err, fd) {
        if (err) {
          if (callback) callback(err)
          return
        }
        // prefer to return the chmod error, if one occurs,
        // but still try to close, and report closing errors if they occur.
        fs.fchmod(fd, mode, function (err) {
          fs.close(fd, function(err2) {
            if (callback) callback(err || err2)
          })
        })
      })
    }

    fs.lchmodSync = function (path, mode) {
      var fd = fs.openSync(path, constants.O_WRONLY | constants.O_SYMLINK, mode)

      // prefer to return the chmod error, if one occurs,
      // but still try to close, and report closing errors if they occur.
      var threw = true
      var ret
      try {
        ret = fs.fchmodSync(fd, mode)
        threw = false
      } finally {
        if (threw) {
          try {
            fs.closeSync(fd)
          } catch (er) {}
        } else {
          fs.closeSync(fd)
        }
      }
      return ret
    }
  }

  function patchLutimes (fs) {
    if (constants.hasOwnProperty("O_SYMLINK") && fs.futimes) {
      fs.lutimes = function (path, at, mt, cb) {
        fs.open(path, constants.O_SYMLINK, function (er, fd) {
          if (er) {
            if (cb) cb(er)
            return
          }
          fs.futimes(fd, at, mt, function (er) {
            fs.close(fd, function (er2) {
              if (cb) cb(er || er2)
            })
          })
        })
      }

      fs.lutimesSync = function (path, at, mt) {
        var fd = fs.openSync(path, constants.O_SYMLINK)
        var ret
        var threw = true
        try {
          ret = fs.futimesSync(fd, at, mt)
          threw = false
        } finally {
          if (threw) {
            try {
              fs.closeSync(fd)
            } catch (er) {}
          } else {
            fs.closeSync(fd)
          }
        }
        return ret
      }

    } else if (fs.futimes) {
      fs.lutimes = function (_a, _b, _c, cb) { if (cb) process.nextTick(cb) }
      fs.lutimesSync = function () {}
    }
  }

  function chmodFix (orig) {
    if (!orig) return orig
    return function (target, mode, cb) {
      return orig.call(fs, target, mode, function (er) {
        if (chownErOk(er)) er = null
        if (cb) cb.apply(this, arguments)
      })
    }
  }

  function chmodFixSync (orig) {
    if (!orig) return orig
    return function (target, mode) {
      try {
        return orig.call(fs, target, mode)
      } catch (er) {
        if (!chownErOk(er)) throw er
      }
    }
  }

  function chownFix (orig) {
    if (!orig) return orig
    return function (target, uid, gid, cb) {
      return orig.call(fs, target, uid, gid, function (er) {
        if (chownErOk(er)) er = null
        if (cb) cb.apply(this, arguments)
      })
    }
  }

  function chownFixSync (orig) {
    if (!orig) return orig
    return function (target, uid, gid) {
      try {
        return orig.call(fs, target, uid, gid)
      } catch (er) {
        if (!chownErOk(er)) throw er
      }
    }
  }

  function statFix (orig) {
    if (!orig) return orig
    // Older versions of Node erroneously returned signed integers for
    // uid + gid.
    return function (target, options, cb) {
      if (typeof options === 'function') {
        cb = options
        options = null
      }
      function callback (er, stats) {
        if (stats) {
          if (stats.uid < 0) stats.uid += 0x100000000
          if (stats.gid < 0) stats.gid += 0x100000000
        }
        if (cb) cb.apply(this, arguments)
      }
      return options ? orig.call(fs, target, options, callback)
        : orig.call(fs, target, callback)
    }
  }

  function statFixSync (orig) {
    if (!orig) return orig
    // Older versions of Node erroneously returned signed integers for
    // uid + gid.
    return function (target, options) {
      var stats = options ? orig.call(fs, target, options)
        : orig.call(fs, target)
      if (stats) {
        if (stats.uid < 0) stats.uid += 0x100000000
        if (stats.gid < 0) stats.gid += 0x100000000
      }
      return stats;
    }
  }

  // ENOSYS means that the fs doesn't support the op. Just ignore
  // that, because it doesn't matter.
  //
  // if there's no getuid, or if getuid() is something other
  // than 0, and the error is EINVAL or EPERM, then just ignore
  // it.
  //
  // This specific case is a silent failure in cp, install, tar,
  // and most other unix tools that manage permissions.
  //
  // When running as root, or if other types of errors are
  // encountered, then it's strict.
  function chownErOk (er) {
    if (!er)
      return true

    if (er.code === "ENOSYS")
      return true

    var nonroot = !process.getuid || process.getuid() !== 0
    if (nonroot) {
      if (er.code === "EINVAL" || er.code === "EPERM")
        return true
    }

    return false
  }
}
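The uid/gid correction inside statFix/statFixSync is plain two's-complement arithmetic. Isolated as a hypothetical helper (the name is illustrative, not part of the module):

```javascript
// Older Node versions could report uid/gid as signed 32-bit integers.
// Adding 2^32 to a negative value recovers the intended unsigned id,
// e.g. -2 becomes 4294967294 (0xFFFFFFFE).
function fixSignedId (id) {
  return id < 0 ? id + 0x100000000 : id
}
```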


@@ -0,0 +1,31 @@
var fs = require('fs')
var tap = require('tap')
var dir = __dirname + '/test'
var node = process.execPath
var path = require('path')

var files = fs.readdirSync(dir)
var env = Object.keys(process.env).reduce(function (env, k) {
  env[k] = process.env[k]
  return env
}, {
  TEST_GRACEFUL_FS_GLOBAL_PATCH: 1
})

tap.jobs = require('os').cpus().length

var testFiles = files.filter(function (f) {
  return (/\.js$/.test(f) && fs.statSync(dir + '/' + f).isFile())
})

tap.plan(testFiles.length)
testFiles.forEach(function(f) {
  tap.test(f, function(t) {
    t.spawn(node, ['--expose-gc', 'test/' + f])
    if (path.basename(f) !== 'monkeypatch-by-accident.js') {
      t.spawn(node, ['--expose-gc', 'test/' + f], {
        env: env
      }, '🐵 test/' + f)
    }
    t.end()
  })
})


@@ -0,0 +1,53 @@
var realFs = require('fs')
var methods = ['chown', 'chownSync', 'chmod', 'chmodSync']
methods.forEach(function (method) {
  causeErr(method, realFs[method])
})

function causeErr (method, original) {
  realFs[method] = function (path) {
    var err = makeErr(path, method)
    if (!/Sync$/.test(method)) {
      var cb = arguments[arguments.length - 1]
      process.nextTick(cb.bind(null, err))
    } else {
      throw err
    }
  }
}

function makeErr (path, method) {
  var err = new Error('this is fine')
  err.syscall = method.replace(/Sync$/, '')
  err.code = path.toUpperCase()
  return err
}

var fs = require('../')
var t = require('tap')

var errs = ['ENOSYS', 'EINVAL', 'EPERM']
t.plan(errs.length * methods.length)
errs.forEach(function (err) {
  methods.forEach(function (method) {
    var args = [err]
    if (/chmod/.test(method)) {
      args.push('some mode')
    } else {
      args.push('some uid', 'some gid')
    }

    if (method.match(/Sync$/)) {
      t.doesNotThrow(function () {
        fs[method].apply(fs, args)
      })
    } else {
      args.push(function (err) {
        t.notOk(err)
      })
      fs[method].apply(fs, args)
    }
  })
})


@@ -0,0 +1,22 @@
var fs = require('fs')
var path = require('path')
var gfsPath = path.resolve(__dirname, '..', 'graceful-fs.js')
var gfs = require(gfsPath)
var importFresh = require('import-fresh')
var fs$close = fs.close
var fs$closeSync = fs.closeSync
var test = require('tap').test

test('`close` is patched correctly', function(t) {
  t.match(fs$close.toString(), /graceful-fs shared queue/, 'patch fs.close');
  t.match(fs$closeSync.toString(), /graceful-fs shared queue/, 'patch fs.closeSync');
  t.match(gfs.close.toString(), /graceful-fs shared queue/, 'patch gfs.close');
  t.match(gfs.closeSync.toString(), /graceful-fs shared queue/, 'patch gfs.closeSync');

  var newGFS = importFresh(gfsPath)
  t.equal(fs.close, fs$close)
  t.equal(fs.closeSync, fs$closeSync)
  t.equal(newGFS.close, fs$close)
  t.equal(newGFS.closeSync, fs$closeSync)
  t.end();
})


@@ -0,0 +1,4 @@
process.chdir = 'i am not a function so dont call me maybe'
const t = require('tap')
require('../')
t.equal(process.chdir, 'i am not a function so dont call me maybe')


@@ -0,0 +1,77 @@
// this test makes sure that various things get enoent, instead of
// some other kind of throw.
var g = require('../')
var NODE_VERSION_MAJOR_WITH_BIGINT = 10
var NODE_VERSION_MINOR_WITH_BIGINT = 5
var NODE_VERSION_PATCH_WITH_BIGINT = 0
var nodeVersion = process.versions.node.split('.')
var nodeVersionMajor = Number.parseInt(nodeVersion[0], 10)
var nodeVersionMinor = Number.parseInt(nodeVersion[1], 10)
var nodeVersionPatch = Number.parseInt(nodeVersion[2], 10)
function nodeSupportsBigInt () {
if (nodeVersionMajor > NODE_VERSION_MAJOR_WITH_BIGINT) {
return true
} else if (nodeVersionMajor === NODE_VERSION_MAJOR_WITH_BIGINT) {
if (nodeVersionMinor > NODE_VERSION_MINOR_WITH_BIGINT) {
return true
} else if (nodeVersionMinor === NODE_VERSION_MINOR_WITH_BIGINT) {
if (nodeVersionPatch >= NODE_VERSION_PATCH_WITH_BIGINT) {
return true
}
}
}
return false
}
var t = require('tap')
var file = 'this file does not exist even a little bit'
var methods = [
['open', 'r'],
['readFile'],
['stat'],
['lstat'],
['utimes', new Date(), new Date()],
['readdir']
]
// any version >= v6 can do readdir(path, options, cb)
if (process.version.match(/^v([6-9]|[1-9][0-9])\./)) {
methods.push(['readdir', {}])
}
// any version >= v10.5.0 can do stat(path, options, cb)
if (nodeSupportsBigInt()) {
methods.push(['stat', {}])
methods.push(['lstat', {}])
}
t.plan(methods.length)
methods.forEach(function (method) {
t.test(method[0], runTest(method))
})
function runTest (args) { return function (t) {
var method = args.shift()
args.unshift(file)
var methodSync = method + 'Sync'
t.type(g[methodSync], 'function')
t.throws(function () {
g[methodSync].apply(g, args)
}, { code: 'ENOENT' })
// add the callback
args.push(verify(t))
t.type(g[method], 'function')
t.doesNotThrow(function () {
g[method].apply(g, args)
})
}}
function verify (t) { return function (er) {
t.type(er, Error)
t.equal(er.code, 'ENOENT')
t.end()
}}


@ -0,0 +1,68 @@
var fs = require('../')
var test = require('tap').test
test('open lots of stuff', function (t) {
// Get around EBADF from libuv by making sure that stderr is opened
// Otherwise Darwin will refuse to give us a FD for stderr!
process.stderr.write('')
// How many parallel open()'s to do
var n = 1024
var opens = 0
var fds = []
var going = true
var closing = false
var doneCalled = 0
for (var i = 0; i < n; i++) {
go()
}
function go() {
opens++
fs.open(__filename, 'r', function (er, fd) {
if (er) throw er
fds.push(fd)
if (going) go()
})
}
// should hit ulimit pretty fast
setTimeout(function () {
going = false
t.equal(opens - fds.length, n)
done()
}, 100)
function done () {
if (closing) return
doneCalled++
if (fds.length === 0) {
// First because of the timeout
// Then to close the fd's opened afterwards
// Then this time, to complete.
// Might take multiple passes, depending on CPU speed
// and ulimit, but at least 2 in every case.
t.ok(doneCalled >= 2)
return t.end()
}
closing = true
setTimeout(function () {
// console.error('do closing again')
closing = false
done()
}, 100)
// console.error('closing time')
var closes = fds.slice(0)
fds.length = 0
closes.forEach(function (fd) {
fs.close(fd, function (er) {
if (er) throw er
})
})
}
})


@ -0,0 +1,31 @@
'use strict';
if (process.env.TEST_GRACEFUL_FS_GLOBAL_PATCH) {
require('tap').plan(0, 'obviously not relevant when monkeypatching fs')
process.exit(0)
}
const fs = require('fs')
// Save originals before loading graceful-fs
const names = [
'ReadStream',
'WriteStream',
'FileReadStream',
'FileWriteStream'
]
const orig = {}
names.forEach(name => orig[name] = fs[name])
const t = require('tap')
const gfs = require('../')
if (names.some(name => gfs[name] === orig[name])) {
t.plan(0, 'graceful-fs was loaded before this test was run')
process.exit(0)
}
t.plan(names.length)
names.forEach(name => {
t.ok(fs[name] === orig[name], `fs.${name} unchanged`)
})


@ -0,0 +1,34 @@
var fs = require('../')
var test = require('tap').test
test('open an existing file works', function (t) {
var fd = fs.openSync(__filename, 'r')
fs.closeSync(fd)
fs.open(__filename, 'r', function (er, fd) {
if (er) throw er
fs.close(fd, function (er) {
if (er) throw er
t.pass('works')
t.end()
})
})
})
test('open a non-existing file throws', function (t) {
var er
try {
var fd = fs.openSync('this file does not exist', 'r')
} catch (x) {
er = x
}
t.ok(er, 'should throw')
t.notOk(fd, 'should not get an fd')
t.equal(er.code, 'ENOENT')
fs.open('neither does this file', 'r', function (er, fd) {
t.ok(er, 'should throw')
t.notOk(fd, 'should not get an fd')
t.equal(er.code, 'ENOENT')
t.end()
})
})


@ -0,0 +1,50 @@
'use strict'
var fs = require('../')
var rimraf = require('rimraf')
var mkdirp = require('mkdirp')
var t = require('tap')
var td = t.testdir({
files: {}
})
var p = require('path').resolve(td, 'files')
process.chdir(td)
// Make sure to reserve the stderr fd
process.stderr.write('')
var num = 4097
var paths = new Array(num)
t.test('write files', function (t) {
rimraf.sync(p)
mkdirp.sync(p)
t.plan(num)
for (var i = 0; i < num; ++i) {
paths[i] = 'files/file-' + i
var stream = fs.createWriteStream(paths[i])
stream.on('finish', function () {
t.pass('success')
})
stream.write('content')
stream.end()
}
})
t.test('read files', function (t) {
// now read them
t.plan(num)
for (var i = 0; i < num; ++i) (function (i) {
var stream = fs.createReadStream(paths[i])
var data = ''
stream.on('data', function (c) {
data += c
})
stream.on('end', function () {
t.equal(data, 'content')
})
})(i)
})


@ -0,0 +1,61 @@
var fs = require("fs")
var t = require("tap")
var currentTest
var strings = ['b', 'z', 'a']
var buffs = strings.map(function (s) { return Buffer.from(s) })
var hexes = buffs.map(function (b) { return b.toString('hex') })
function getRet (encoding) {
switch (encoding) {
case 'hex':
return hexes
case 'buffer':
return buffs
default:
return strings
}
}
var readdir = fs.readdir
var failed = false
fs.readdir = function(path, options, cb) {
if (!failed) {
// simulate an EMFILE and then open and close a thing to retry
failed = true
process.nextTick(function () {
var er = new Error('synthetic emfile')
er.code = 'EMFILE'
cb(er)
process.nextTick(function () {
g.closeSync(fs.openSync(__filename, 'r'))
})
})
return
}
failed = false
currentTest.type(cb, 'function')
currentTest.type(options, 'object')
currentTest.ok(options)
process.nextTick(function() {
var ret = getRet(options.encoding)
cb(null, ret)
})
}
var g = require("../")
var encodings = ['buffer', 'hex', 'utf8', null]
encodings.forEach(function (enc) {
t.test('encoding=' + enc, function (t) {
currentTest = t
g.readdir("whatevers", { encoding: enc }, function (er, files) {
if (er)
throw er
t.same(files, getRet(enc).sort())
t.end()
})
})
})


@ -0,0 +1,20 @@
var fs = require("fs")
var readdir = fs.readdir
fs.readdir = function(path, options, cb) {
process.nextTick(function() {
cb(null, ["b", "z", "a"])
})
}
var g = require("../")
var test = require("tap").test
test("readdir reorder", function (t) {
g.readdir("whatevers", function (er, files) {
if (er)
throw er
t.same(files, [ "a", "b", "z" ])
t.end()
})
})


@ -0,0 +1,46 @@
'use strict'
var fs = require('../')
var rimraf = require('rimraf')
var mkdirp = require('mkdirp')
var t = require('tap')
var td = t.testdir({
files: {}
})
var p = require('path').resolve(td, 'files')
process.chdir(td)
// Make sure to reserve the stderr fd
process.stderr.write('')
var num = 4097
var paths = new Array(num)
t.test('write files', function (t) {
rimraf.sync(p)
mkdirp.sync(p)
t.plan(num)
for (var i = 0; i < num; ++i) {
paths[i] = 'files/file-' + i
fs.writeFile(paths[i], 'content', 'ascii', function (er) {
if (er)
throw er
t.pass('written')
})
}
})
t.test('read files', function (t) {
// now read them
t.plan(num)
for (var i = 0; i < num; ++i) {
fs.readFile(paths[i], 'ascii', function (er, data) {
if (er)
throw er
t.equal(data, 'content')
})
}
})


@ -0,0 +1,36 @@
'use strict'
var importFresh = require('import-fresh')
var path = require('path')
var realFs = require('fs')
var test = require('tap').test
var EMFILE = Object.assign(new Error('FAKE EMFILE'), { code: 'EMFILE' })
test('eventually times out and returns error', function (t) {
var readFile = realFs.readFile
var realNow = Date.now
t.teardown(function () {
realFs.readFile = readFile
Date.now = realNow
})
realFs.readFile = function (path, options, cb) {
process.nextTick(function () {
cb(EMFILE)
// hijack Date.now _after_ we call the callback, the callback will
// call it when adding the job to the queue, we want to capture it
// any time after that first call so we can pretend it's been 60s
Date.now = function () {
return realNow() + 60000
}
})
}
var fs = importFresh(path.dirname(__dirname))
fs.readFile('literally anything', function (err) {
t.equal(err.code, 'EMFILE', 'eventually got the EMFILE')
t.end()
})
})


@ -0,0 +1,14 @@
const t = require('tap')
const gfs = require('../')
t.equal(gfs.ReadStream, gfs.FileReadStream)
t.equal(gfs.WriteStream, gfs.FileWriteStream)
const frs = {}
const fws = {}
gfs.FileReadStream = frs
gfs.FileWriteStream = fws
t.equal(gfs.FileReadStream, frs)
t.equal(gfs.FileWriteStream, fws)
t.not(gfs.ReadStream, frs)
t.not(gfs.WriteStream, fws)
t.not(gfs.ReadStream, gfs.FileReadStream)
t.not(gfs.WriteStream, gfs.FileWriteStream)


@ -0,0 +1,44 @@
'use strict';
var util = require('util')
var fs = require('fs')
var test = require('tap').test
// mock fs.statSync to return signed uids/gids
var realStatSync = fs.statSync
fs.statSync = function(path) {
var stats = realStatSync.call(fs, path)
stats.uid = -2
stats.gid = -2
return stats
}
var gfs = require('../graceful-fs.js')
test('graceful fs uses same stats constructor as fs', function (t) {
t.equal(gfs.Stats, fs.Stats, 'should reference the same constructor')
if (!process.env.TEST_GRACEFUL_FS_GLOBAL_PATCH) {
t.equal(fs.statSync(__filename).uid, -2)
t.equal(fs.statSync(__filename).gid, -2)
}
t.equal(gfs.statSync(__filename).uid, 0xfffffffe)
t.equal(gfs.statSync(__filename).gid, 0xfffffffe)
t.end()
})
test('does not throw when async stat fails', function (t) {
gfs.stat(__filename + ' this does not exist', function (er, stats) {
t.ok(er)
t.notOk(stats)
t.end()
})
})
test('throws ENOENT when sync stat fails', function (t) {
t.throws(function() {
gfs.statSync(__filename + ' this does not exist')
}, /ENOENT/)
t.end()
})


@ -0,0 +1,12 @@
var fs = require('fs')
var gfs = require('../graceful-fs.js')
var test = require('tap').test
test('graceful fs uses same stats constructor as fs', function (t) {
t.equal(gfs.Stats, fs.Stats, 'should reference the same constructor')
t.ok(fs.statSync(__filename) instanceof fs.Stats,
'should be instance of fs.Stats')
t.ok(gfs.statSync(__filename) instanceof fs.Stats,
'should be instance of fs.Stats')
t.end()
})


@ -0,0 +1,44 @@
process.env.GRACEFUL_FS_PLATFORM = 'win32'
var t = require('tap')
var fs = require('fs')
var ers = ['EPERM', 'EBUSY', 'EACCES']
t.plan(ers.length)
ers.forEach(function(code) {
t.test(code, function(t) {
fs.rename = function (a, b, cb) {
setTimeout(function () {
var er = new Error(code + ' blerg')
er.code = code
cb(er)
})
}
var gfs = require('../')
var a = __dirname + '/a'
var b = __dirname + '/b'
t.test('setup', function (t) {
try { fs.mkdirSync(a) } catch (e) {}
try { fs.mkdirSync(b) } catch (e) {}
t.end()
})
t.test('rename', { timeout: 100 }, function (t) {
t.plan(1)
gfs.rename(a, b, function (er) {
t.ok(er)
})
})
t.test('cleanup', function (t) {
try { fs.rmdirSync(a) } catch (e) {}
try { fs.rmdirSync(b) } catch (e) {}
t.end()
})
t.end()
})
})


@ -0,0 +1,78 @@
var fs = require('../')
var rimraf = require('rimraf')
var mkdirp = require('mkdirp')
var t = require('tap')
var td = t.testdir({ files: {} })
var p = require('path').resolve(td, 'files')
process.chdir(td)
// Make sure to reserve the stderr fd
process.stderr.write('')
var num = 4097
var paths = new Array(num)
t.test('make files', function (t) {
rimraf(p, function (err) {
if (err) {
throw err
}
mkdirp(p, function (err) {
if (err) {
throw err
}
for (var i = 0; i < num; ++i) {
paths[i] = 'files/file-' + i
fs.writeFileSync(paths[i], 'content')
}
t.end()
})
})
})
t.test('copy files', function (t) {
var rem = num
for (var i = 0; i < num; ++i) {
paths[i] = 'files/file-' + i
fs.copyFile(paths[i], paths[i] + '.copy', function(err) {
if (err)
throw err
if (--rem === 0) {
t.end()
}
})
}
})
t.test('copy files with flags', function (t) {
var rem = num
for (var i = 0; i < num; ++i) {
paths[i] = 'files/file-' + i
fs.copyFile(paths[i], paths[i] + '.copy', 2, function(err) {
if (err)
throw err
if (--rem === 0) {
t.end()
}
})
}
})
t.test('read files', function (t) {
function expectContent(err, data) {
if (err)
throw err
t.equal(data, 'content')
}
// now read them
t.plan(num * 2)
for (var i = 0; i < num; ++i) {
fs.readFile(paths[i], 'ascii', expectContent)
fs.readFile(paths[i] + '.copy', 'ascii', expectContent)
}
})


@ -0,0 +1,40 @@
var importFresh = require('import-fresh');
var t = require('tap')
var v8
try {
v8 = require('v8')
} catch (er) {}
if (!v8 || !v8.getHeapStatistics || typeof v8.getHeapStatistics().number_of_detached_contexts !== 'number') {
t.plan(0, 'no reliable context tracking available')
process.exit(0)
}
if (typeof global.gc !== 'function') {
t.plan(0, '--expose_gc not enabled')
process.exit(0)
}
function checkHeap (t) {
var v8stats = v8.getHeapStatistics()
t.equal(v8stats.number_of_detached_contexts, 0, 'no detached contexts')
}
t.test('no memory leak when loading multiple times', function(t) {
t.plan(1);
importFresh(process.cwd() + '/graceful-fs.js') // node 0.10-5 were getting: Cannot find module '../'
// simulate project with 4000 tests
var i = 0;
function importFreshGracefulFs() {
importFresh(process.cwd() + '/graceful-fs.js');
if (i < 4000) {
i++;
process.nextTick(importFreshGracefulFs)
} else {
global.gc()
checkHeap(t);
t.end();
}
}
importFreshGracefulFs();
})


@ -0,0 +1,171 @@
6.1.0 / 2020-10-31
------------------
- Add `finalEOL` option to disable writing final EOL ([#115](https://github.com/jprichardson/node-jsonfile/issues/115), [#137](https://github.com/jprichardson/node-jsonfile/pull/137))
- Update dependency ([#138](https://github.com/jprichardson/node-jsonfile/pull/138))
6.0.1 / 2020-03-07
------------------
- Update dependency ([#130](https://github.com/jprichardson/node-jsonfile/pull/130))
- Fix code style ([#129](https://github.com/jprichardson/node-jsonfile/pull/129))
6.0.0 / 2020-02-24
------------------
- **BREAKING:** Drop support for Node 6 & 8 ([#128](https://github.com/jprichardson/node-jsonfile/pull/128))
- **BREAKING:** Do not allow passing `null` as options to `readFile()` or `writeFile()` ([#128](https://github.com/jprichardson/node-jsonfile/pull/128))
- Refactor internals ([#128](https://github.com/jprichardson/node-jsonfile/pull/128))
5.0.0 / 2018-09-08
------------------
- **BREAKING:** Drop Node 4 support
- **BREAKING:** If no callback is passed to an asynchronous method, a promise is now returned ([#109](https://github.com/jprichardson/node-jsonfile/pull/109))
- Cleanup docs
4.0.0 / 2017-07-12
------------------
- **BREAKING:** Remove global `spaces` option.
- **BREAKING:** Drop support for Node 0.10, 0.12, and io.js.
- Remove undocumented `passParsingErrors` option.
- Added `EOL` override option to `writeFile` when using `spaces`. [#89]
3.0.1 / 2017-07-05
------------------
- Fixed bug in `writeFile` when there was a serialization error & no callback was passed. In previous versions, an empty file would be written; now no file is written.
3.0.0 / 2017-04-25
------------------
- Changed behavior of `throws` option for `readFileSync`; now does not throw filesystem errors when `throws` is `false`
2.4.0 / 2016-09-15
------------------
### Changed
- added optional support for `graceful-fs` [#62]
2.3.1 / 2016-05-13
------------------
- fix to support BOM. [#45][#45]
2.3.0 / 2016-04-16
------------------
- add `throws` to `readFile()`. See [#39][#39]
- add support for any arbitrary `fs` module. Useful with [mock-fs](https://www.npmjs.com/package/mock-fs)
2.2.3 / 2015-10-14
------------------
- include file name in parse error. See: https://github.com/jprichardson/node-jsonfile/pull/34
2.2.2 / 2015-09-16
------------------
- split out tests into separate files
- fixed `throws` when set to `true` in `readFileSync()`. See: https://github.com/jprichardson/node-jsonfile/pull/33
2.2.1 / 2015-06-25
------------------
- fixed regression when passing in string as encoding for options in `writeFile()` and `writeFileSync()`. See: https://github.com/jprichardson/node-jsonfile/issues/28
2.2.0 / 2015-06-25
------------------
- added `options.spaces` to `writeFile()` and `writeFileSync()`
2.1.2 / 2015-06-22
------------------
- fixed if passed `readFileSync(file, 'utf8')`. See: https://github.com/jprichardson/node-jsonfile/issues/25
2.1.1 / 2015-06-19
------------------
- fixed regressions if `null` is passed for options. See: https://github.com/jprichardson/node-jsonfile/issues/24
2.1.0 / 2015-06-19
------------------
- cleanup: JavaScript Standard Style, rename files, dropped terst for assert
- methods now support JSON revivers/replacers
2.0.1 / 2015-05-24
------------------
- update license attribute https://github.com/jprichardson/node-jsonfile/pull/21
2.0.0 / 2014-07-28
------------------
* added `\n` to end of file on write. [#14](https://github.com/jprichardson/node-jsonfile/pull/14)
* added `options.throws` to `readFileSync()`
* dropped support for Node v0.8
1.2.0 / 2014-06-29
------------------
* removed semicolons
* bugfix: passed `options` to `fs.readFile` and `fs.readFileSync`. This technically changes behavior, but
changes it according to docs. [#12][#12]
1.1.1 / 2013-11-11
------------------
* fixed catching of callback bug (ffissore / #5)
1.1.0 / 2013-10-11
------------------
* added `options` param to methods, (seanodell / #4)
1.0.1 / 2013-09-05
------------------
* removed `homepage` field from package.json to remove NPM warning
1.0.0 / 2013-06-28
------------------
* added `.npmignore`, #1
* changed spacing default from `4` to `2` to follow Node conventions
0.0.1 / 2012-09-10
------------------
* Initial release.
[#89]: https://github.com/jprichardson/node-jsonfile/pull/89
[#45]: https://github.com/jprichardson/node-jsonfile/issues/45 "Reading of UTF8-encoded (w/ BOM) files fails"
[#44]: https://github.com/jprichardson/node-jsonfile/issues/44 "Extra characters in written file"
[#43]: https://github.com/jprichardson/node-jsonfile/issues/43 "Prettyfy json when written to file"
[#42]: https://github.com/jprichardson/node-jsonfile/pull/42 "Moved fs.readFileSync within the try/catch"
[#41]: https://github.com/jprichardson/node-jsonfile/issues/41 "Linux: Hidden file not working"
[#40]: https://github.com/jprichardson/node-jsonfile/issues/40 "autocreate folder doesn't work from Path-value"
[#39]: https://github.com/jprichardson/node-jsonfile/pull/39 "Add `throws` option for readFile (async)"
[#38]: https://github.com/jprichardson/node-jsonfile/pull/38 "Update README.md writeFile[Sync] signature"
[#37]: https://github.com/jprichardson/node-jsonfile/pull/37 "support append file"
[#36]: https://github.com/jprichardson/node-jsonfile/pull/36 "Add typescript definition file."
[#35]: https://github.com/jprichardson/node-jsonfile/pull/35 "Add typescript definition file."
[#34]: https://github.com/jprichardson/node-jsonfile/pull/34 "readFile JSON parse error includes filename"
[#33]: https://github.com/jprichardson/node-jsonfile/pull/33 "fix throw->throws typo in readFileSync()"
[#32]: https://github.com/jprichardson/node-jsonfile/issues/32 "readFile & readFileSync can possible have strip-comments as an option?"
[#31]: https://github.com/jprichardson/node-jsonfile/pull/31 "[Modify] Support string include is unicode escape string"
[#30]: https://github.com/jprichardson/node-jsonfile/issues/30 "How to use Jsonfile package in Meteor.js App?"
[#29]: https://github.com/jprichardson/node-jsonfile/issues/29 "writefile callback if no error?"
[#28]: https://github.com/jprichardson/node-jsonfile/issues/28 "writeFile options argument broken "
[#27]: https://github.com/jprichardson/node-jsonfile/pull/27 "Use svg instead of png to get better image quality"
[#26]: https://github.com/jprichardson/node-jsonfile/issues/26 "Breaking change to fs-extra"
[#25]: https://github.com/jprichardson/node-jsonfile/issues/25 "support string encoding param for read methods"
[#24]: https://github.com/jprichardson/node-jsonfile/issues/24 "readFile: Passing in null options with a callback throws an error"
[#23]: https://github.com/jprichardson/node-jsonfile/pull/23 "Add appendFile and appendFileSync"
[#22]: https://github.com/jprichardson/node-jsonfile/issues/22 "Default value for spaces in readme.md is outdated"
[#21]: https://github.com/jprichardson/node-jsonfile/pull/21 "Update license attribute"
[#20]: https://github.com/jprichardson/node-jsonfile/issues/20 "Add simple caching functionallity"
[#19]: https://github.com/jprichardson/node-jsonfile/pull/19 "Add appendFileSync method"
[#18]: https://github.com/jprichardson/node-jsonfile/issues/18 "Add updateFile and updateFileSync methods"
[#17]: https://github.com/jprichardson/node-jsonfile/issues/17 "seem read & write sync has sequentially problem"
[#16]: https://github.com/jprichardson/node-jsonfile/pull/16 "export spaces defaulted to null"
[#15]: https://github.com/jprichardson/node-jsonfile/issues/15 "`jsonfile.spaces` should default to `null`"
[#14]: https://github.com/jprichardson/node-jsonfile/pull/14 "Add EOL at EOF"
[#13]: https://github.com/jprichardson/node-jsonfile/issues/13 "Add a final newline"
[#12]: https://github.com/jprichardson/node-jsonfile/issues/12 "readFile doesn't accept options"
[#11]: https://github.com/jprichardson/node-jsonfile/pull/11 "Added try,catch to readFileSync"
[#10]: https://github.com/jprichardson/node-jsonfile/issues/10 "No output or error from writeFile"
[#9]: https://github.com/jprichardson/node-jsonfile/pull/9 "Change 'js' to 'jf' in example."
[#8]: https://github.com/jprichardson/node-jsonfile/pull/8 "Updated forgotten module.exports to me."
[#7]: https://github.com/jprichardson/node-jsonfile/pull/7 "Add file name in error message"
[#6]: https://github.com/jprichardson/node-jsonfile/pull/6 "Use graceful-fs when possible"
[#5]: https://github.com/jprichardson/node-jsonfile/pull/5 "Jsonfile doesn't behave nicely when used inside a test suite."
[#4]: https://github.com/jprichardson/node-jsonfile/pull/4 "Added options parameter to writeFile and writeFileSync"
[#3]: https://github.com/jprichardson/node-jsonfile/issues/3 "test2"
[#2]: https://github.com/jprichardson/node-jsonfile/issues/2 "homepage field must be a string url. Deleted."
[#1]: https://github.com/jprichardson/node-jsonfile/pull/1 "adding an `.npmignore` file"


@ -0,0 +1,15 @@
(The MIT License)
Copyright (c) 2012-2015, JP Richardson <jprichardson@gmail.com>
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files
(the 'Software'), to deal in the Software without restriction, including without limitation the rights to use, copy, modify,
merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE
WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS
OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE,
ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.


@ -0,0 +1,230 @@
Node.js - jsonfile
================
Easily read/write JSON files in Node.js. _Note: this module cannot be used in the browser._
[![npm Package](https://img.shields.io/npm/v/jsonfile.svg?style=flat-square)](https://www.npmjs.org/package/jsonfile)
[![build status](https://secure.travis-ci.org/jprichardson/node-jsonfile.svg)](http://travis-ci.org/jprichardson/node-jsonfile)
[![windows Build status](https://img.shields.io/appveyor/ci/jprichardson/node-jsonfile/master.svg?label=windows%20build)](https://ci.appveyor.com/project/jprichardson/node-jsonfile/branch/master)
<a href="https://github.com/feross/standard"><img src="https://cdn.rawgit.com/feross/standard/master/sticker.svg" alt="Standard JavaScript" width="100"></a>
Why?
----
Writing `JSON.stringify()` and then `fs.writeFile()` and `JSON.parse()` with `fs.readFile()` enclosed in `try/catch` blocks became annoying.
Installation
------------
npm install --save jsonfile
API
---
* [`readFile(filename, [options], callback)`](#readfilefilename-options-callback)
* [`readFileSync(filename, [options])`](#readfilesyncfilename-options)
* [`writeFile(filename, obj, [options], callback)`](#writefilefilename-obj-options-callback)
* [`writeFileSync(filename, obj, [options])`](#writefilesyncfilename-obj-options)
----
### readFile(filename, [options], callback)
`options` (`object`, default `undefined`): Pass in any [`fs.readFile`](https://nodejs.org/api/fs.html#fs_fs_readfile_path_options_callback) options or set `reviver` for a [JSON reviver](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/parse).
- `throws` (`boolean`, default: `true`). If `JSON.parse` throws an error, that error is passed to the callback.
If `false`, `null` is passed to the callback for the object instead.
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
jsonfile.readFile(file, function (err, obj) {
if (err) console.error(err)
console.dir(obj)
})
```
You can also use this method with promises. The `readFile` method will return a promise if you do not pass a callback function.
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
jsonfile.readFile(file)
.then(obj => console.dir(obj))
.catch(error => console.error(error))
```
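This callback-or-promise duality follows a common wrapper pattern: if the last argument is a function, treat it as a Node-style callback; otherwise return the promise. A minimal sketch of such a wrapper (an illustration only — jsonfile actually delegates this to the `universalify` package):

```js
// Wrap an async (promise-returning) function so callers may either pass a
// Node-style callback as the final argument or consume the returned promise.
function fromPromise (fn) {
  return function (...args) {
    const cb = typeof args[args.length - 1] === 'function' ? args.pop() : null
    const promise = fn(...args)
    if (!cb) return promise // no callback: caller gets the promise
    promise.then(result => cb(null, result), err => cb(err))
  }
}
```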
----
### readFileSync(filename, [options])
`options` (`object`, default `undefined`): Pass in any [`fs.readFileSync`](https://nodejs.org/api/fs.html#fs_fs_readfilesync_path_options) options or set `reviver` for a [JSON reviver](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/parse).
- `throws` (`boolean`, default: `true`). If an error is encountered reading or parsing the file, throw the error. If `false`, returns `null` for the object.
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
console.dir(jsonfile.readFileSync(file))
```
----
### writeFile(filename, obj, [options], callback)
`options`: Pass in any [`fs.writeFile`](https://nodejs.org/api/fs.html#fs_fs_writefile_file_data_options_callback) options or set `replacer` for a [JSON replacer](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/stringify). You can also pass `spaces`, override the `EOL` string, or set the `finalEOL` flag to `false` to omit the trailing `EOL` at the end of the file.
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
const obj = { name: 'JP' }
jsonfile.writeFile(file, obj, function (err) {
if (err) console.error(err)
})
```
Or use with promises as follows:
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
const obj = { name: 'JP' }
jsonfile.writeFile(file, obj)
.then(res => {
console.log('Write complete')
})
.catch(error => console.error(error))
```
**formatting with spaces:**
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
const obj = { name: 'JP' }
jsonfile.writeFile(file, obj, { spaces: 2 }, function (err) {
if (err) console.error(err)
})
```
**overriding EOL:**
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
const obj = { name: 'JP' }
jsonfile.writeFile(file, obj, { spaces: 2, EOL: '\r\n' }, function (err) {
if (err) console.error(err)
})
```
**disabling the EOL at the end of file:**
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
const obj = { name: 'JP' }
jsonfile.writeFile(file, obj, { spaces: 2, finalEOL: false }, function (err) {
if (err) console.log(err)
})
```
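The `spaces`, `EOL`, and `finalEOL` options only shape the string that gets written; a rough sketch of that serialization step (assumed from the documented behavior, not the module's exact source):

```js
// Indent with JSON.stringify, swap each newline for the configured EOL
// string, and append a trailing EOL unless finalEOL is false.
function stringify (obj, { EOL = '\n', finalEOL = true, replacer = null, spaces } = {}) {
  const str = JSON.stringify(obj, replacer, spaces)
  return str.replace(/\n/g, EOL) + (finalEOL ? EOL : '')
}
```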
**appending to an existing JSON file:**
You can use `fs.writeFile` option `{ flag: 'a' }` to achieve this.
```js
const jsonfile = require('jsonfile')
const file = '/tmp/mayAlreadyExistedData.json'
const obj = { name: 'JP' }
jsonfile.writeFile(file, obj, { flag: 'a' }, function (err) {
if (err) console.error(err)
})
```
----
### writeFileSync(filename, obj, [options])
`options`: Pass in any [`fs.writeFileSync`](https://nodejs.org/api/fs.html#fs_fs_writefilesync_file_data_options) options or set `replacer` for a [JSON replacer](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/stringify). You can also pass `spaces`, override the `EOL` string, or set the `finalEOL` flag to `false` to omit the trailing `EOL` at the end of the file.
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
const obj = { name: 'JP' }
jsonfile.writeFileSync(file, obj)
```
**formatting with spaces:**
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
const obj = { name: 'JP' }
jsonfile.writeFileSync(file, obj, { spaces: 2 })
```
**overriding EOL:**
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
const obj = { name: 'JP' }
jsonfile.writeFileSync(file, obj, { spaces: 2, EOL: '\r\n' })
```
**disabling the EOL at the end of file:**
```js
const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
const obj = { name: 'JP' }
jsonfile.writeFileSync(file, obj, { spaces: 2, finalEOL: false })
```
**appending to an existing JSON file:**
You can use `fs.writeFileSync` option `{ flag: 'a' }` to achieve this.
```js
const jsonfile = require('jsonfile')
const file = '/tmp/mayAlreadyExistedData.json'
const obj = { name: 'JP' }
jsonfile.writeFileSync(file, obj, { flag: 'a' })
```
License
-------
(MIT License)
Copyright 2012-2016, JP Richardson <jprichardson@gmail.com>


@ -0,0 +1,26 @@
# Test against this version of Node.js
environment:
matrix:
# node.js
- nodejs_version: "10"
- nodejs_version: "12"
# Install scripts. (runs after repo cloning)
install:
# Install the requested version of Node.js
- ps: Install-Product node $env:nodejs_version
# install modules
- npm config set loglevel warn
- npm install --silent
# Post-install test scripts.
test_script:
# Output useful info for debugging.
- node --version
- npm --version
# run tests
- npm test
# Don't actually build.
build: off


@ -0,0 +1,88 @@
let _fs
try {
_fs = require('graceful-fs')
} catch (_) {
_fs = require('fs')
}
const universalify = require('universalify')
const { stringify, stripBom } = require('./utils')
async function _readFile (file, options = {}) {
if (typeof options === 'string') {
options = { encoding: options }
}
const fs = options.fs || _fs
const shouldThrow = 'throws' in options ? options.throws : true
let data = await universalify.fromCallback(fs.readFile)(file, options)
data = stripBom(data)
let obj
try {
obj = JSON.parse(data, options ? options.reviver : null)
} catch (err) {
if (shouldThrow) {
err.message = `${file}: ${err.message}`
throw err
} else {
return null
}
}
return obj
}
const readFile = universalify.fromPromise(_readFile)
function readFileSync (file, options = {}) {
if (typeof options === 'string') {
options = { encoding: options }
}
const fs = options.fs || _fs
const shouldThrow = 'throws' in options ? options.throws : true
try {
let content = fs.readFileSync(file, options)
content = stripBom(content)
return JSON.parse(content, options.reviver)
} catch (err) {
if (shouldThrow) {
err.message = `${file}: ${err.message}`
throw err
} else {
return null
}
}
}
async function _writeFile (file, obj, options = {}) {
const fs = options.fs || _fs
const str = stringify(obj, options)
await universalify.fromCallback(fs.writeFile)(file, str, options)
}
const writeFile = universalify.fromPromise(_writeFile)
function writeFileSync (file, obj, options = {}) {
const fs = options.fs || _fs
const str = stringify(obj, options)
// fs.writeFileSync returns undefined; forward the return value anyway for symmetry with writeFile
return fs.writeFileSync(file, str, options)
}
const jsonfile = {
readFile,
readFileSync,
writeFile,
writeFileSync
}
module.exports = jsonfile


@@ -0,0 +1,40 @@
{
"name": "jsonfile",
"version": "6.1.0",
"description": "Easily read/write JSON files.",
"repository": {
"type": "git",
"url": "git@github.com:jprichardson/node-jsonfile.git"
},
"keywords": [
"read",
"write",
"file",
"json",
"fs",
"fs-extra"
],
"author": "JP Richardson <jprichardson@gmail.com>",
"license": "MIT",
"dependencies": {
"universalify": "^2.0.0"
},
"optionalDependencies": {
"graceful-fs": "^4.1.6"
},
"devDependencies": {
"mocha": "^8.2.0",
"rimraf": "^2.4.0",
"standard": "^16.0.1"
},
"main": "index.js",
"files": [
"index.js",
"utils.js"
],
"scripts": {
"lint": "standard",
"test": "npm run lint && npm run unit",
"unit": "mocha"
}
}


@@ -0,0 +1,149 @@
const assert = require('assert')
const fs = require('fs')
const os = require('os')
const path = require('path')
const rimraf = require('rimraf')
const jf = require('../')
/* global describe it beforeEach afterEach */
describe('+ readFileSync()', () => {
let TEST_DIR
beforeEach((done) => {
TEST_DIR = path.join(os.tmpdir(), 'jsonfile-tests-readfile-sync')
rimraf.sync(TEST_DIR)
fs.mkdir(TEST_DIR, done)
})
afterEach((done) => {
rimraf.sync(TEST_DIR)
done()
})
it('should read and parse JSON', () => {
const file = path.join(TEST_DIR, 'somefile3.json')
const obj = { name: 'JP' }
fs.writeFileSync(file, JSON.stringify(obj))
const obj2 = jf.readFileSync(file)
assert.strictEqual(obj2.name, obj.name)
})
describe('> when invalid JSON', () => {
it('should include the filename in the error', () => {
const fn = 'somefile.json'
const file = path.join(TEST_DIR, fn)
fs.writeFileSync(file, '{')
assert.throws(() => {
jf.readFileSync(file)
}, (err) => {
assert(err instanceof Error)
assert(err.message.match(fn))
return true
})
})
})
describe('> when invalid JSON and throws set to false', () => {
it('should return null', () => {
const file = path.join(TEST_DIR, 'somefile4-invalid.json')
const data = '{not valid JSON'
fs.writeFileSync(file, data)
assert.throws(() => {
jf.readFileSync(file)
})
const obj = jf.readFileSync(file, { throws: false })
assert.strictEqual(obj, null)
})
})
describe('> when invalid JSON and throws set to true', () => {
it('should throw an exception', () => {
const file = path.join(TEST_DIR, 'somefile4-invalid.json')
const data = '{not valid JSON'
fs.writeFileSync(file, data)
assert.throws(() => {
jf.readFileSync(file, { throws: true })
})
})
})
describe('> when json file is missing and throws set to false', () => {
it('should return null', () => {
const file = path.join(TEST_DIR, 'somefile4-invalid.json')
const obj = jf.readFileSync(file, { throws: false })
assert.strictEqual(obj, null)
})
})
describe('> when json file is missing and throws set to true', () => {
it('should throw an exception', () => {
const file = path.join(TEST_DIR, 'somefile4-invalid.json')
assert.throws(() => {
jf.readFileSync(file, { throws: true })
})
})
})
describe('> when JSON reviver is set', () => {
it('should transform the JSON', () => {
const file = path.join(TEST_DIR, 'somefile.json')
const sillyReviver = function (k, v) {
if (typeof v !== 'string') return v
if (v.indexOf('date:') < 0) return v
return new Date(v.split('date:')[1])
}
const obj = {
name: 'jp',
day: 'date:2015-06-19T11:41:26.815Z'
}
fs.writeFileSync(file, JSON.stringify(obj))
const data = jf.readFileSync(file, { reviver: sillyReviver })
assert.strictEqual(data.name, 'jp')
assert(data.day instanceof Date)
assert.strictEqual(data.day.toISOString(), '2015-06-19T11:41:26.815Z')
})
})
describe('> when passing encoding string as option', () => {
it('should not throw an error', () => {
const file = path.join(TEST_DIR, 'somefile.json')
const obj = {
name: 'jp'
}
fs.writeFileSync(file, JSON.stringify(obj))
let data
try {
data = jf.readFileSync(file, 'utf8')
} catch (err) {
assert.ifError(err)
}
assert.strictEqual(data.name, 'jp')
})
})
describe('> w/ BOM', () => {
it('should properly parse', () => {
const file = path.join(TEST_DIR, 'file-bom.json')
const obj = { name: 'JP' }
fs.writeFileSync(file, `\uFEFF${JSON.stringify(obj)}`)
const data = jf.readFileSync(file)
assert.deepStrictEqual(obj, data)
})
})
})


@@ -0,0 +1,273 @@
const assert = require('assert')
const fs = require('fs')
const os = require('os')
const path = require('path')
const rimraf = require('rimraf')
const jf = require('../')
/* global describe it beforeEach afterEach */
describe('+ readFile()', () => {
let TEST_DIR
beforeEach((done) => {
TEST_DIR = path.join(os.tmpdir(), 'jsonfile-tests-readfile')
rimraf.sync(TEST_DIR)
fs.mkdir(TEST_DIR, done)
})
afterEach((done) => {
rimraf.sync(TEST_DIR)
done()
})
it('should read and parse JSON', (done) => {
const file = path.join(TEST_DIR, 'somefile.json')
const obj = { name: 'JP' }
fs.writeFileSync(file, JSON.stringify(obj))
jf.readFile(file, (err, obj2) => {
assert.ifError(err)
assert.strictEqual(obj2.name, obj.name)
done()
})
})
it('should resolve a promise with parsed JSON', (done) => {
const file = path.join(TEST_DIR, 'somefile.json')
const obj = { name: 'JP' }
fs.writeFileSync(file, JSON.stringify(obj))
jf.readFile(file)
.then((data) => {
assert.strictEqual(data.name, obj.name)
done()
})
.catch(err => {
assert.ifError(err)
done()
})
})
describe('> when invalid JSON', () => {
let fn, file
beforeEach((done) => {
fn = 'somefile.json'
file = path.join(TEST_DIR, fn)
fs.writeFileSync(file, '{')
done()
})
it('should include the filename in the error', (done) => {
jf.readFile(file, (err, obj2) => {
assert(err instanceof Error)
assert(err.message.match(fn))
done()
})
})
it('should reject the promise with filename in error', (done) => {
jf.readFile(file)
.catch(err => {
assert(err instanceof Error)
assert(err.message.match(fn))
done()
})
})
})
describe('> when invalid JSON and throws set to false', () => {
let fn, file
beforeEach((done) => {
fn = 'somefile4-invalid.json'
file = path.join(TEST_DIR, fn)
const data = '{not valid JSON'
fs.writeFileSync(file, data)
done()
})
it('should return null and no error', (done) => {
let bothDone = false
jf.readFile(file, (err, obj2) => {
assert(err instanceof Error)
assert(err.message.match(fn))
if (bothDone) {
done()
}
bothDone = true
})
jf.readFile(file, { throws: false }, (err, obj2) => {
assert.ifError(err)
assert.strictEqual(obj2, null)
if (bothDone) {
done()
}
bothDone = true
})
})
it('should resolve the promise with null as data', (done) => {
jf.readFile(file, { throws: false })
.then(data => {
assert.strictEqual(data, null)
done()
})
.catch(err => {
assert.ifError(err)
done()
})
})
})
describe('> when invalid JSON and throws set to true', () => {
let fn, file
beforeEach((done) => {
fn = 'somefile4-invalid.json'
file = path.join(TEST_DIR, fn)
const data = '{not valid JSON'
fs.writeFileSync(file, data)
done()
})
it('should return an error', (done) => {
let bothDone = false
jf.readFile(file, (err, obj2) => {
assert(err instanceof Error)
assert(err.message.match(fn))
if (bothDone) {
done()
}
bothDone = true
})
jf.readFile(file, { throws: true }, (err, obj2) => {
assert(err instanceof Error)
assert(err.message.match(fn))
if (bothDone) {
done()
}
bothDone = true
})
})
it('should reject the promise', (done) => {
jf.readFile(file, { throws: true })
.catch(err => {
assert(err instanceof Error)
assert(err.message.match(fn))
done()
})
})
})
describe('> when JSON reviver is set', () => {
let file, sillyReviver
beforeEach((done) => {
file = path.join(TEST_DIR, 'somefile.json')
sillyReviver = function (k, v) {
if (typeof v !== 'string') return v
if (v.indexOf('date:') < 0) return v
return new Date(v.split('date:')[1])
}
const obj = { name: 'jp', day: 'date:2015-06-19T11:41:26.815Z' }
fs.writeFileSync(file, JSON.stringify(obj))
done()
})
it('should transform the JSON', (done) => {
jf.readFile(file, { reviver: sillyReviver }, (err, data) => {
assert.ifError(err)
assert.strictEqual(data.name, 'jp')
assert(data.day instanceof Date)
assert.strictEqual(data.day.toISOString(), '2015-06-19T11:41:26.815Z')
done()
})
})
it('should resolve the promise with transformed JSON', (done) => {
jf.readFile(file, { reviver: sillyReviver })
.then(data => {
assert.strictEqual(data.name, 'jp')
assert(data.day instanceof Date)
assert.strictEqual(data.day.toISOString(), '2015-06-19T11:41:26.815Z')
done()
}).catch(err => {
assert.ifError(err)
done()
})
})
})
describe('> when passing encoding string as option', () => {
let file, obj
beforeEach((done) => {
file = path.join(TEST_DIR, 'somefile.json')
obj = {
name: 'jp'
}
fs.writeFileSync(file, JSON.stringify(obj))
done()
})
it('should not throw an error', (done) => {
jf.readFile(file, 'utf8', (err, data) => {
assert.ifError(err)
assert.strictEqual(data.name, 'jp')
done()
})
})
it('should resolve the promise', (done) => {
jf.readFile(file, 'utf8')
.then(data => {
assert.strictEqual(data.name, obj.name)
done()
})
.catch(err => {
assert.ifError(err)
done()
})
})
})
describe('> w/ BOM', () => {
let file, obj
beforeEach((done) => {
file = path.join(TEST_DIR, 'file-bom.json')
obj = { name: 'JP' }
fs.writeFileSync(file, `\uFEFF${JSON.stringify(obj)}`)
done()
})
it('should properly parse', (done) => {
jf.readFile(file, (err, data) => {
assert.ifError(err)
assert.deepStrictEqual(obj, data)
done()
})
})
it('should resolve the promise with parsed JSON', (done) => {
jf.readFile(file)
.then(data => {
assert.deepStrictEqual(data, obj)
done()
})
.catch(err => {
assert.ifError(err)
done()
})
})
})
})


@@ -0,0 +1,119 @@
const assert = require('assert')
const fs = require('fs')
const os = require('os')
const path = require('path')
const rimraf = require('rimraf')
const jf = require('../')
/* global describe it beforeEach afterEach */
describe('+ writeFileSync()', () => {
let TEST_DIR
beforeEach((done) => {
TEST_DIR = path.join(os.tmpdir(), 'jsonfile-tests-writefile-sync')
rimraf.sync(TEST_DIR)
fs.mkdir(TEST_DIR, done)
})
afterEach((done) => {
rimraf.sync(TEST_DIR)
done()
})
it('should serialize the JSON and write it to file', () => {
const file = path.join(TEST_DIR, 'somefile4.json')
const obj = { name: 'JP' }
jf.writeFileSync(file, obj)
const data = fs.readFileSync(file, 'utf8')
const obj2 = JSON.parse(data)
assert.strictEqual(obj2.name, obj.name)
assert.strictEqual(data[data.length - 1], '\n')
assert.strictEqual(data, '{"name":"JP"}\n')
})
describe('> when JSON replacer is set', () => {
it('should replace JSON', () => {
const file = path.join(TEST_DIR, 'somefile.json')
const sillyReplacer = function (k, v) {
if (!(v instanceof RegExp)) return v
return `regex:${v.toString()}`
}
const obj = {
name: 'jp',
reg: /hello/g
}
jf.writeFileSync(file, obj, { replacer: sillyReplacer })
const data = JSON.parse(fs.readFileSync(file))
assert.strictEqual(data.name, 'jp')
assert.strictEqual(typeof data.reg, 'string')
assert.strictEqual(data.reg, 'regex:/hello/g')
})
})
describe('> when spaces passed as an option', () => {
it('should write file with spaces', () => {
const file = path.join(TEST_DIR, 'somefile.json')
const obj = { name: 'JP' }
jf.writeFileSync(file, obj, { spaces: 8 })
const data = fs.readFileSync(file, 'utf8')
assert.strictEqual(data, `${JSON.stringify(obj, null, 8)}\n`)
})
it('should use EOL override', () => {
const file = path.join(TEST_DIR, 'somefile.json')
const obj = { name: 'JP' }
jf.writeFileSync(file, obj, { spaces: 2, EOL: '***' })
const data = fs.readFileSync(file, 'utf8')
assert.strictEqual(data, '{***  "name": "JP"***}***')
})
})
describe('> when passing encoding string as options', () => {
it('should not error', () => {
const file = path.join(TEST_DIR, 'somefile6.json')
const obj = { name: 'jp' }
jf.writeFileSync(file, obj, 'utf8')
const data = fs.readFileSync(file, 'utf8')
assert.strictEqual(data, `${JSON.stringify(obj)}\n`)
})
})
describe('> when finalEOL option is set to a falsy value', () => {
beforeEach((done) => {
TEST_DIR = path.join(os.tmpdir(), 'jsonfile-tests-writefile-sync')
rimraf.sync(TEST_DIR)
fs.mkdir(TEST_DIR, done)
})
afterEach((done) => {
rimraf.sync(TEST_DIR)
done()
})
it('should not have the EOL symbol at the end of the file', (done) => {
const file = path.join(TEST_DIR, 'somefile2.json')
const obj = { name: 'jp' }
jf.writeFileSync(file, obj, { finalEOL: false })
const rawData = fs.readFileSync(file, 'utf8')
const data = JSON.parse(rawData)
assert.strictEqual(rawData[rawData.length - 1], '}')
assert.strictEqual(data.name, obj.name)
done()
})
it('should have the EOL symbol at the end of the file when finalEOL is a truthy value in options', (done) => {
const file = path.join(TEST_DIR, 'somefile2.json')
const obj = { name: 'jp' }
jf.writeFileSync(file, obj, { finalEOL: true })
const rawData = fs.readFileSync(file, 'utf8')
const data = JSON.parse(rawData)
assert.strictEqual(rawData[rawData.length - 1], '\n')
assert.strictEqual(data.name, obj.name)
done()
})
})
})


@@ -0,0 +1,260 @@
const assert = require('assert')
const fs = require('fs')
const os = require('os')
const path = require('path')
const rimraf = require('rimraf')
const jf = require('../')
/* global describe it beforeEach afterEach */
describe('+ writeFile()', () => {
let TEST_DIR
beforeEach((done) => {
TEST_DIR = path.join(os.tmpdir(), 'jsonfile-tests-writefile')
rimraf.sync(TEST_DIR)
fs.mkdir(TEST_DIR, done)
})
afterEach((done) => {
rimraf.sync(TEST_DIR)
done()
})
it('should serialize and write JSON', (done) => {
const file = path.join(TEST_DIR, 'somefile2.json')
const obj = { name: 'JP' }
jf.writeFile(file, obj, (err) => {
assert.ifError(err)
fs.readFile(file, 'utf8', (err, data) => {
assert.ifError(err)
const obj2 = JSON.parse(data)
assert.strictEqual(obj2.name, obj.name)
// verify EOL
assert.strictEqual(data[data.length - 1], '\n')
done()
})
})
})
it('should write JSON, resolve promise', (done) => {
const file = path.join(TEST_DIR, 'somefile2.json')
const obj = { name: 'JP' }
jf.writeFile(file, obj)
.then(res => {
fs.readFile(file, 'utf8', (err, data) => {
assert.ifError(err)
const obj2 = JSON.parse(data)
assert.strictEqual(obj2.name, obj.name)
// verify EOL
assert.strictEqual(data[data.length - 1], '\n')
done()
})
})
.catch(err => {
assert.ifError(err)
done()
})
})
describe('> when JSON replacer is set', () => {
let file, sillyReplacer, obj
beforeEach((done) => {
file = path.join(TEST_DIR, 'somefile.json')
sillyReplacer = function (k, v) {
if (!(v instanceof RegExp)) return v
return `regex:${v.toString()}`
}
obj = {
name: 'jp',
reg: /hello/g
}
done()
})
it('should replace JSON', (done) => {
jf.writeFile(file, obj, { replacer: sillyReplacer }, (err) => {
assert.ifError(err)
const data = JSON.parse(fs.readFileSync(file))
assert.strictEqual(data.name, 'jp')
assert.strictEqual(typeof data.reg, 'string')
assert.strictEqual(data.reg, 'regex:/hello/g')
done()
})
})
it('should replace JSON, resolve promise', (done) => {
jf.writeFile(file, obj, { replacer: sillyReplacer })
.then(res => {
const data = JSON.parse(fs.readFileSync(file))
assert.strictEqual(data.name, 'jp')
assert.strictEqual(typeof data.reg, 'string')
assert.strictEqual(data.reg, 'regex:/hello/g')
done()
})
.catch(err => {
assert.ifError(err)
done()
})
})
})
describe('> when spaces passed as an option', () => {
let file, obj
beforeEach((done) => {
file = path.join(TEST_DIR, 'somefile.json')
obj = { name: 'jp' }
done()
})
it('should write file with spaces', (done) => {
jf.writeFile(file, obj, { spaces: 8 }, (err) => {
assert.ifError(err)
const data = fs.readFileSync(file, 'utf8')
assert.strictEqual(data, `${JSON.stringify(obj, null, 8)}\n`)
done()
})
})
it('should write file with spaces, resolve the promise', (done) => {
jf.writeFile(file, obj, { spaces: 8 })
.then(res => {
const data = fs.readFileSync(file, 'utf8')
assert.strictEqual(data, `${JSON.stringify(obj, null, 8)}\n`)
done()
})
.catch(err => {
assert.ifError(err)
done()
})
})
})
describe('> when spaces, EOL are passed as options', () => {
let file, obj
beforeEach((done) => {
file = path.join(TEST_DIR, 'somefile.json')
obj = { name: 'jp' }
done()
})
it('should use EOL override', (done) => {
jf.writeFile(file, obj, { spaces: 2, EOL: '***' }, (err) => {
assert.ifError(err)
const data = fs.readFileSync(file, 'utf8')
assert.strictEqual(data, '{***  "name": "jp"***}***')
done()
})
})
it('should use EOL override, resolve the promise', (done) => {
jf.writeFile(file, obj, { spaces: 2, EOL: '***' })
.then(res => {
const data = fs.readFileSync(file, 'utf8')
assert.strictEqual(data, '{***  "name": "jp"***}***')
done()
})
.catch(err => {
assert.ifError(err)
done()
})
})
})
describe('> when passing encoding string as options', () => {
let file, obj
beforeEach((done) => {
file = path.join(TEST_DIR, 'somefile.json')
obj = { name: 'jp' }
done()
})
it('should not error', (done) => {
jf.writeFile(file, obj, 'utf8', (err) => {
assert.ifError(err)
const data = fs.readFileSync(file, 'utf8')
assert.strictEqual(data, `${JSON.stringify(obj)}\n`)
done()
})
})
it('should not error, resolve the promise', (done) => {
jf.writeFile(file, obj, 'utf8')
.then(res => {
const data = fs.readFileSync(file, 'utf8')
assert.strictEqual(data, `${JSON.stringify(obj)}\n`)
done()
})
.catch(err => {
assert.ifError(err)
done()
})
})
})
describe('> when finalEOL option is set to a falsy value', () => {
beforeEach((done) => {
TEST_DIR = path.join(os.tmpdir(), 'jsonfile-tests-writefile')
rimraf.sync(TEST_DIR)
fs.mkdir(TEST_DIR, done)
})
afterEach((done) => {
rimraf.sync(TEST_DIR)
done()
})
it('should not have the EOL symbol at the end of the file', (done) => {
const file = path.join(TEST_DIR, 'somefile2.json')
const obj = { name: 'jp' }
jf.writeFile(file, obj, { finalEOL: false }, (err) => {
assert.ifError(err)
fs.readFile(file, 'utf8', (_, rawData) => {
const data = JSON.parse(rawData)
assert.strictEqual(rawData[rawData.length - 1], '}')
assert.strictEqual(data.name, obj.name)
done()
})
})
})
it('should have the EOL symbol at the end of the file when finalEOL is a truthy value in options', (done) => {
const file = path.join(TEST_DIR, 'somefile2.json')
const obj = { name: 'jp' }
jf.writeFile(file, obj, { finalEOL: true }, (err) => {
assert.ifError(err)
fs.readFile(file, 'utf8', (_, rawData) => {
const data = JSON.parse(rawData)
assert.strictEqual(rawData[rawData.length - 1], '\n')
assert.strictEqual(data.name, obj.name)
done()
})
})
})
})
// Prevent https://github.com/jprichardson/node-jsonfile/issues/81 from happening
describe("> when callback isn't passed & can't serialize", () => {
it('should not write an empty file, should reject the promise', function (done) {
this.slow(1100)
const file = path.join(TEST_DIR, 'somefile.json')
const obj1 = { name: 'JP' }
const obj2 = { person: obj1 }
obj1.circular = obj2
jf.writeFile(file, obj1)
.catch(err => {
assert(err)
assert(!fs.existsSync(file))
done()
})
})
})
})


@@ -0,0 +1,14 @@
function stringify (obj, { EOL = '\n', finalEOL = true, replacer = null, spaces } = {}) {
const EOF = finalEOL ? EOL : ''
const str = JSON.stringify(obj, replacer, spaces)
return str.replace(/\n/g, EOL) + EOF
}
function stripBom (content) {
// we do this because JSON.parse would convert it to a utf8 string if encoding wasn't specified
if (Buffer.isBuffer(content)) content = content.toString('utf8')
return content.replace(/^\uFEFF/, '')
}
module.exports = { stringify, stripBom }
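The two helpers above are small enough to demonstrate directly. `stringify` serializes, rewrites every newline to the configured `EOL`, and appends one final `EOL` unless `finalEOL` is false; `stripBom` removes a leading UTF-8 byte-order mark. A standalone sketch (the functions are reproduced from the utils.js above so the snippet runs on its own):

```javascript
// Reproduced from the utils.js shown above, so this sketch is self-contained.
function stringify (obj, { EOL = '\n', finalEOL = true, replacer = null, spaces } = {}) {
  const EOF = finalEOL ? EOL : ''
  const str = JSON.stringify(obj, replacer, spaces)
  return str.replace(/\n/g, EOL) + EOF
}
function stripBom (content) {
  // JSON.parse would choke on a BOM, so strip it before parsing
  if (Buffer.isBuffer(content)) content = content.toString('utf8')
  return content.replace(/^\uFEFF/, '')
}

// With spaces: 2 and EOL: '\r\n', every '\n' JSON.stringify emits
// becomes '\r\n', and one trailing '\r\n' is appended.
console.log(JSON.stringify(stringify({ a: 1 }, { spaces: 2, EOL: '\r\n' })))
console.log(stripBom('\uFEFF{"a":1}')) // {"a":1}
```

Note that when `spaces` is unset, `JSON.stringify` emits no newlines at all, so `EOL` only affects the final terminator in that case.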


@@ -0,0 +1,20 @@
(The MIT License)
Copyright (c) 2017, Ryan Zimmerman <opensrc@ryanzim.com>
Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the 'Software'), to deal in
the Software without restriction, including without limitation the rights to
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of
the Software, and to permit persons to whom the Software is furnished to do so,
subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.


@@ -0,0 +1,76 @@
# universalify
![GitHub Workflow Status (branch)](https://img.shields.io/github/actions/workflow/status/RyanZim/universalify/ci.yml?branch=master)
![Coveralls github branch](https://img.shields.io/coveralls/github/RyanZim/universalify/master.svg)
![npm](https://img.shields.io/npm/dm/universalify.svg)
![npm](https://img.shields.io/npm/l/universalify.svg)
Make a callback- or promise-based function support both promises and callbacks.
Uses the native promise implementation.
## Installation
```bash
npm install universalify
```
## API
### `universalify.fromCallback(fn)`
Takes a callback-based function to universalify, and returns the universalified function.
Function must take a callback as the last parameter that will be called with the signature `(error, result)`. `universalify` does not support calling the callback with three or more arguments, and does not ensure that the callback is only called once.
```js
function callbackFn (n, cb) {
setTimeout(() => cb(null, n), 15)
}
const fn = universalify.fromCallback(callbackFn)
// Works with Promises:
fn('Hello World!')
.then(result => console.log(result)) // -> Hello World!
.catch(error => console.error(error))
// Works with Callbacks:
fn('Hi!', (error, result) => {
if (error) return console.error(error)
console.log(result)
// -> Hi!
})
```
### `universalify.fromPromise(fn)`
Takes a promise-based function to universalify, and returns the universalified function.
Function must return a valid JS promise. `universalify` does not ensure that a valid promise is returned.
```js
function promiseFn (n) {
return new Promise(resolve => {
setTimeout(() => resolve(n), 15)
})
}
const fn = universalify.fromPromise(promiseFn)
// Works with Promises:
fn('Hello World!')
.then(result => console.log(result)) // -> Hello World!
.catch(error => console.error(error))
// Works with Callbacks:
fn('Hi!', (error, result) => {
if (error) return console.error(error)
console.log(result)
// -> Hi!
})
```
## License
MIT


@@ -0,0 +1,24 @@
'use strict'
exports.fromCallback = function (fn) {
return Object.defineProperty(function (...args) {
if (typeof args[args.length - 1] === 'function') fn.apply(this, args)
else {
return new Promise((resolve, reject) => {
args.push((err, res) => (err != null) ? reject(err) : resolve(res))
fn.apply(this, args)
})
}
}, 'name', { value: fn.name })
}
exports.fromPromise = function (fn) {
return Object.defineProperty(function (...args) {
const cb = args[args.length - 1]
if (typeof cb !== 'function') return fn.apply(this, args)
else {
args.pop()
fn.apply(this, args).then(r => cb(null, r), cb)
}
}, 'name', { value: fn.name })
}
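The dispatch in `fromCallback` is worth seeing end to end: if the last argument is a function, the wrapped function is invoked callback-style; otherwise a callback is appended that settles a Promise. A standalone sketch (`fromCallback` is reproduced from the index.js above; the `double` function is a hypothetical example, not part of the package):

```javascript
// fromCallback reproduced from the index.js above so the demo runs standalone.
function fromCallback (fn) {
  return Object.defineProperty(function (...args) {
    if (typeof args[args.length - 1] === 'function') fn.apply(this, args)
    else {
      return new Promise((resolve, reject) => {
        args.push((err, res) => (err != null) ? reject(err) : resolve(res))
        fn.apply(this, args)
      })
    }
  }, 'name', { value: fn.name })
}

// A hypothetical callback-style function to wrap.
function double (n, cb) { cb(null, n * 2) }
const universalDouble = fromCallback(double)

// Callback style: a trailing function means fn is called as-is.
universalDouble(21, (err, res) => console.log(err, res)) // null 42

// Promise style: no trailing callback, so a Promise is returned.
universalDouble(21).then(res => console.log(res)) // 42
```

Note the `err != null` check in the appended callback: it rejects on any non-null/undefined error, which is what makes falsy-but-present error values like `0` reject rather than resolve (the test suite below covers exactly this case).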


@@ -0,0 +1,34 @@
{
"name": "universalify",
"version": "2.0.1",
"description": "Make a callback- or promise-based function support both promises and callbacks.",
"keywords": [
"callback",
"native",
"promise"
],
"homepage": "https://github.com/RyanZim/universalify#readme",
"bugs": "https://github.com/RyanZim/universalify/issues",
"license": "MIT",
"author": "Ryan Zimmerman <opensrc@ryanzim.com>",
"files": [
"index.js"
],
"repository": {
"type": "git",
"url": "git+https://github.com/RyanZim/universalify.git"
},
"scripts": {
"test": "standard && nyc --reporter text --reporter lcovonly tape test/*.js | colortape"
},
"devDependencies": {
"colortape": "^0.1.2",
"coveralls": "^3.0.1",
"nyc": "^15.0.0",
"standard": "^14.3.1",
"tape": "^5.0.1"
},
"engines": {
"node": ">= 10.0.0"
}
}


@@ -0,0 +1,86 @@
'use strict'
const test = require('tape')
const universalify = require('..')
const fn = universalify.fromCallback(function (a, b, cb) {
setTimeout(() => cb(null, [this, a, b]), 15)
})
const errFn = universalify.fromCallback(function (cb) {
setTimeout(() => cb(new Error('test')), 15)
})
const falseyErrFn = universalify.fromCallback(function (cb) {
setTimeout(() => cb(0, 15)) // eslint-disable-line standard/no-callback-literal
})
test('callback function works with callbacks', t => {
t.plan(4)
fn.call({ a: 'a' }, 1, 2, (err, arr) => {
t.ifError(err, 'should not error')
t.is(arr[0].a, 'a')
t.is(arr[1], 1)
t.is(arr[2], 2)
t.end()
})
})
test('callback function works with promises', t => {
t.plan(3)
fn.call({ a: 'a' }, 1, 2)
.then(arr => {
t.is(arr[0].a, 'a')
t.is(arr[1], 1)
t.is(arr[2], 2)
t.end()
})
.catch(t.end)
})
test('callback function works with promises without modifying the original args array', t => {
t.plan(2)
const array = [1, 2]
fn.apply(this, array).then((arr) => {
t.is(array.length, 2)
t.is(arr.length, 3)
t.end()
})
})
test('callback function error works with callbacks', t => {
t.plan(2)
errFn(err => {
t.assert(err, 'should error')
t.is(err.message, 'test')
t.end()
})
})
test('callback function error works with promises', t => {
t.plan(2)
errFn()
.then(() => t.end('Promise should not resolve'))
.catch(err => {
t.assert(err, 'should error')
t.is(err.message, 'test')
t.end()
})
})
test('should correctly reject on falsey error values', t => {
t.plan(2)
falseyErrFn()
.then(() => t.end('Promise should not resolve'))
.catch(err => {
t.assert((err != null), 'should error')
t.is(err, 0)
t.end()
})
})
test('fromCallback() sets correct .name', t => {
t.plan(1)
const res = universalify.fromCallback(function hello () {})
t.is(res.name, 'hello')
t.end()
})


@@ -0,0 +1,103 @@
'use strict'
const test = require('tape')
const universalify = require('..')
const fn = universalify.fromPromise(function (a, b) {
return new Promise(resolve => {
setTimeout(() => resolve([this, a, b]), 15)
})
})
const errFn = universalify.fromPromise(function () {
return new Promise((resolve, reject) => {
setTimeout(() => reject(new Error('test')), 15)
})
})
test('promise function works with callbacks', t => {
t.plan(4)
fn.call({ a: 'a' }, 1, 2, (err, arr) => {
t.ifError(err, 'should not error')
t.is(arr[0].a, 'a')
t.is(arr[1], 1)
t.is(arr[2], 2)
t.end()
})
})
test('promise function works with promises', t => {
t.plan(3)
fn.call({ a: 'a' }, 1, 2)
.then(arr => {
t.is(arr[0].a, 'a')
t.is(arr[1], 1)
t.is(arr[2], 2)
t.end()
})
.catch(t.end)
})
test('promise function optional param works with callbacks', t => {
t.plan(4)
fn.call({ a: 'a' }, 1, (err, arr) => {
t.ifError(err, 'should not error')
t.is(arr[0].a, 'a')
t.is(arr[1], 1)
t.is(arr[2], undefined)
t.end()
})
})
test('promise function optional param works with promises', t => {
t.plan(3)
fn.call({ a: 'a' }, 1)
.then(arr => {
t.is(arr[0].a, 'a')
t.is(arr[1], 1)
t.is(arr[2], undefined)
t.end()
})
.catch(t.end)
})
test('promise function error works with callbacks', t => {
t.plan(2)
errFn(err => {
t.assert(err, 'should error')
t.is(err.message, 'test')
t.end()
})
})
test('promise function error works with promises', t => {
t.plan(2)
errFn()
.then(() => t.end('Promise should not resolve'))
.catch(err => {
t.assert(err, 'should error')
t.is(err.message, 'test')
t.end()
})
})
test('fromPromise() sets correct .name', t => {
t.plan(1)
const res = universalify.fromPromise(function hello () {})
t.is(res.name, 'hello')
t.end()
})
test('fromPromise() handles an error in callback correctly', t => {
// We need to make sure that the callback isn't called twice if there's an
// error inside the callback. This should instead generate an unhandled
// promise rejection. We verify one is created, with the correct message.
t.plan(2)
const errMsg = 'some callback error'
process.once('unhandledRejection', (err) => {
t.is(err.message, errMsg, 'correct error message')
})
fn(1, 2, err => {
t.ifError(err, 'no error here')
throw new Error(errMsg)
})
})