How to pipe Node.js scripts together using the Unix | pipe (on the command line)?
I see how to pipe stuff together using Node.js streams, but how do I pipe multiple scripts together using the Unix |, given that some of these scripts can be async?
$ ./a.js | ./b.js
Example:
a.js (chmod 0755)
#!/usr/bin/env node
setTimeout(function() {
  console.log(JSON.stringify({ foo: 'bar' }));
}, 10);
b.js (chmod 0755)
#!/usr/bin/env node
console.log(process.argv);
This outputs:
$ ./a.js | ./b.js
[ 'node', '/Users/viatropos/tests/b.js' ]

events.js:72
        throw er; // Unhandled 'error' event
              ^
Error: write EPIPE
    at errnoException (net.js:883:11)
    at Object.afterWrite (net.js:700:19)
At first glance it seems like there's a lot going wrong here, and I'm not sure where to start. Is there a way to make this work? The end goal is to be able to take the console.log output of ./a.js and use it in ./b.js. The reason is, most of the time these scripts are run one at a time, but it would be nice to be able to pipe them together, so ideally the system should be able to handle both cases.
The problem is that b.js ends right away and closes its standard in, which causes the error in a.js, because its standard out got shut off and it didn't handle that possibility. You have two options: handle stdout closing in a.js, or accept input in b.js.
Fixing a.js:
process.on("sigpipe", process.exit);
If you add this line, it'll give up as soon as there's no one reading its output anymore. There may be better things to do on SIGPIPE depending on what your program is doing, but the key is to stop console.logging.
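In practice the broken pipe often shows up as the write EPIPE 'error' event from the traceback above rather than as a signal, so another option is to catch that error on process.stdout. This is only a sketch of a more defensive a.js, not a drop-in replacement for the SIGPIPE handler:

#!/usr/bin/env node
// a.js: exit quietly if the reader on the other end of the pipe goes away.
process.stdout.on('error', function(err) {
  if (err.code === 'EPIPE') {
    process.exit(0); // no one is reading anymore; stop writing
  }
  throw err;         // anything else is still a real error
});

setTimeout(function() {
  console.log(JSON.stringify({ foo: 'bar' }));
}, 10);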
Fixing b.js:
#!/usr/bin/env node
var stdin = process.openStdin();
var data = "";
stdin.on('data', function(chunk) {
  data += chunk;
});
stdin.on('end', function() {
  console.log("data:\n" + data + "\nend data");
});
Of course, you don't have to do anything with the data. The key is to have something that keeps the process running; if you're going to be piping into it, stdin.on('data', fx) seems like a useful thing to do. Remember, either one of these will prevent the error; I expect the second is more useful if you're planning on piping between programs.
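The question also mentions that the scripts should keep working when run one at a time. One rough way to get both behaviors in b.js is to branch on whether stdin is a terminal, using process.stdin.isTTY. This is only a sketch; the run() helper and its default input are made up for illustration:

#!/usr/bin/env node
// b.js: read JSON from the pipe when one is attached,
// otherwise fall back to standalone behavior.
if (process.stdin.isTTY) {
  // Run directly from a terminal: no pipe, use a default (placeholder).
  run({ foo: 'default' });
} else {
  var data = "";
  process.stdin.on('data', function(chunk) { data += chunk; });
  process.stdin.on('end', function() {
    run(JSON.parse(data));
  });
}

function run(input) {
  console.log("got input:", input);
}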