javascript - Node.js event loop not making sense to me


I am new to node.js. I am working my way through Jim R. Wilson's "Node.js the Right Way", and I have run into a contradiction in the book (and in node.js itself?) that I have not been able to reconcile with any amount of googling.

In the book, and in other resources I have seen online, it is stated repeatedly that node.js runs a callback in response to some event line-by-line until the JavaScript completes, and then the event loop goes back to waiting for the next event/callback. And because node.js is single-threaded (and, barring the cluster module, it also runs as a single process), my understanding is that there is only ever, at most, one segment of JavaScript code executing at a time.
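To make that concrete, here is a tiny script (my own, not from the book) showing what I mean: even a zero-delay timer callback cannot run until the currently executing code has finished.

    setTimeout(function () {
      console.log('timer fired'); // runs only after the script below completes
    }, 0);

    // Busy-wait for two seconds; the event loop cannot service the
    // zero-delay timer until this synchronous code returns.
    const start = Date.now();
    while (Date.now() - start < 2000) {}
    console.log('busy loop done'); // always prints before 'timer fired'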

Am I understanding this correctly? Here is the contradiction (in my mind): if this is the case, how is node.js so concurrent?

Here is an example that illustrates my confusion. It is meant to walk a directory of several thousand XML files and extract the relevant bits of each into a JSON document.

First, the parser:

    'use strict';
    const
      fs = require('fs'),
      cheerio = require('cheerio');

    module.exports = function (filename, callback) {
      fs.readFile(filename, function (err, data) {
        if (err) {
          return callback(err);
        }
        let
          $ = cheerio.load(data.toString()),
          collect = function (index, elem) {
            return $(elem).text();
          };
        callback(null, {
          _id: $('pgterms\\:ebook').attr('rdf:about').replace('ebooks/', ''),
          title: $('dcterms\\:title').text(),
          authors: $('pgterms\\:agent pgterms\\:name').map(collect),
          subjects: $('[rdf\\:resource$="/LCSH"] ~ rdf\\:value').map(collect)
        });
      });
    };

And the bit that walks the directory structure:

    'use strict';
    const
      file = require('file'),
      rdfParser = require('./lib/rdf-parser.js');

    console.log('beginning directory walk');
    file.walk(__dirname + '/cache', function (err, dirPath, dirs, files) {
      files.forEach(function (path) {
        rdfParser(path, function (err, doc) {
          if (err) {
            throw err;
          } else {
            console.log(doc);
          }
        });
      });
    });

If you run this code, you get an EMFILE error because the program exhausts all of the available file descriptors, which indicates that it has apparently opened thousands of files simultaneously.
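To illustrate (with made-up file names), the inner loop effectively does something like this: every fs.readFile call returns immediately, so all of the opens are requested before a single callback has had a chance to run.

    const fs = require('fs');

    ['a.rdf', 'b.rdf', 'c.rdf'].forEach(function (path) { // imagine thousands
      fs.readFile(path, function (err, data) {
        // None of these callbacks can run until the forEach loop (and the
        // rest of the current script) has finished, so every file
        // descriptor is requested up front.
      });
    });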

My question is... how can this possibly be explained, unless the event/concurrency model behaves differently from how it has been described?

I'm sure someone out there knows the answer and can shed some light on this, but for the moment, color me very confused!

Am I understanding this correctly?

Yes.

How is node.js so concurrent if this is the case?

JavaScript execution itself is not concurrent; only I/O (and other heavy tasks) runs in parallel. When you call an asynchronous function, it starts the job (for example, reading a file) and returns immediately, so the next line of your script runs right away. The work of reading the file continues in the background, however, and once it has finished, the callback you specified is placed on the event loop queue, from which it will eventually be called with the available data.
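For instance, a minimal sketch of that ordering:

    const fs = require('fs');

    console.log('1: before the call');
    fs.readFile(__filename, function (err, data) {
      if (err) throw err;
      // Queued on the event loop once the read completes; it can only run
      // after the currently executing script has finished.
      console.log('3: callback, file is ' + data.length + ' bytes');
    });
    console.log('2: after the call'); // prints before the callback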

For this "background" processing details, and how the nodes actually manage to manage all these asynchronous tasks in parallel, take a look at the question.

