
reading files asynchronously in parallel

Ask Time:2012-12-20T14:53:43         Author:Erel Segal-Halevi


My node.js application requires several files. I understand that it's recommended to read files asynchronously, so, apparently I have to do something like this:

fs.readFile("file1", function(...) {
  fs.readFile("file2", function(...) {
    fs.readFile("file3", function(...) {
       [my application]
    });
  });
});

However, in this case, the files will be read sequentially.

Is there a way to read many files in parallel, and still make sure that the application starts after ALL of them have been loaded?

Author: Erel Segal-Halevi. Reproduced under the CC 4.0 BY-SA copyright license with a link to the original source and this disclaimer.
Link to original article:https://stackoverflow.com/questions/13966758/reading-files-asynchronously-in-parallel
user123444555621 :

Pretty easy.

var files = [...], remaining = files.length;
var callback = function () {
  remaining--;
  if (!remaining) {
    go();
  }
};
files.forEach(function(file) {
  fs.readFile(file, callback);
});
2012-12-20T07:28:19
Finesse :

You can use a modern Promise-based API:

import {promises as fs} from 'fs';

function readFiles(files) {
  return Promise.all(
    files.map(path => fs.readFile(path))
  );
}

readFiles(['file1', 'file2', 'file3'])
  .then(fileContents => {
    console.log('All the files are read', fileContents);
  });
2021-12-17T09:17:43
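The same Promise.all pattern reads a little flatter with async/await (a sketch; the 'utf8' encoding argument is an assumption here, so the results come back as strings rather than Buffers):

```javascript
import {promises as fs} from 'fs';

async function readFiles(files) {
  // Promise.all starts every read at once; it rejects on the first failure
  return Promise.all(files.map(path => fs.readFile(path, 'utf8')));
}

async function main() {
  const fileContents = await readFiles(['file1', 'file2', 'file3']);
  console.log('All the files are read', fileContents);
}
```

Because `Promise.all` rejects as soon as any single read fails, a `try`/`catch` around the `await` handles errors for the whole batch in one place.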