My Node.js application needs to read several files. I understand that it's recommended to read files asynchronously, so apparently I have to do something like this:
fs.readFile("file1", function (err, data1) {
    fs.readFile("file2", function (err, data2) {
        fs.readFile("file3", function (err, data3) {
            [my application]
        });
    });
});
However, in this case, the files will be read sequentially.
Is there a way to read many files in parallel, and still make sure that the application starts after ALL of them have been loaded?
user123444555621 :
Pretty easy.

var files = [...], remaining = files.length;
var callback = function (err, data) {
    if (err) throw err;
    remaining--;
    if (!remaining) {
        go(); // every file has been read; start the application
    }
};
files.forEach(function (file) {
    fs.readFile(file, callback);
});
2012-12-20T07:28:19
Finesse :
You can use a modern Promise-based API:

import {promises as fs} from 'fs';

function readFiles(files) {
    return Promise.all(
        files.map(path => fs.readFile(path))
    );
}

readFiles(['file1', 'file2', 'file3'])
    .then(fileContents => {
        console.log('All the files are read', fileContents);
    });
2021-12-17T09:17:43