preface

Consider tree-structured data that looks like this:



The root node A is unique; its child nodes B, C, and D are peer nodes at the same level, and each of them can in turn nest children of its own, such as B1 and B2 under B, and so on. That is roughly what tree-structured data looks like. So how do you traverse data like this?
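
As a rough sketch, such a tree could be written as nested data (the node names follow the description above; the exact shape is only illustrative):

    const tree={
      name:'A', // the unique root node
      children:[
        {name:'B',children:[{name:'B1'},{name:'B2'}]},
        {name:'C',children:[]},
        {name:'D',children:[]}
      ]
    };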

  1. Depth traversal
  • Depth traversal approaches the problem from the vertical dimension, that is, it walks the tree layer by nested layer. Depth traversal can be further divided into serial and parallel.
  • Serial depth traversal: starting from the root node A, B is visited first, then B1 under B, then B2. Once B's branch is finished, C's branch is traversed, and so on.
  • Parallel depth traversal: starting from the root node, the same-level nodes B, C, and D are traversed together, B1 and B2 under node B are traversed together, and the branches of C are traversed at the same time.
  2. Breadth traversal
    • Breadth traversal approaches the problem from the horizontal dimension; the key is to use a pointer and a queue to drive the traversal (see the sketch below). Breadth traversal can be further divided into starting from the head and starting from the tail.
    • Head start: enqueue the root node first (because the root node is unique). Starting from root node A, visit A, push all of A's child nodes B, C, and D into the queue, then move the pointer one position towards the back, to the second slot of the queue, push the children of node B in turn, and so on.
    • Tail start: the reverse of the head start; the pointer moves one position towards the front each time.

    Ok, so now that you know depth and breadth traversal, how do you use them in Node? The example of deleting a file directory is a good way to learn. In Node there are both synchronous and asynchronous APIs, so both cases need to be taken into account.
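
    Before moving to the file system, here is a minimal sketch of the pointer-and-queue idea on the in-memory tree from the preface (wideTraverse and the tree object are only illustrative names, not Node APIs):

    function wideTraverse(root){
      let queue=[root]; // the queue starts with the unique root node
      let index=0;      // the pointer into the queue
      let current;
      while(current=queue[index++]){
        console.log(current.name);                  // visit the current node
        queue=[...queue,...(current.children||[])]; // enqueue its children at the tail
      }
    }
    wideTraverse(tree); // prints A B C D B1 B2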

    Depth-first asynchronous serial deletion

    const fs=require('fs');
    const path=require('path');

    function preParallDeep(dir,callback){
      fs.stat(dir,function(err,statObj){
        if(statObj.isFile()){
          fs.unlink(dir,callback);
        }else{ // files is an array of the entry names in the directory (excluding '.' and '..')
          fs.readdir(dir,function(err,files){
            files=files.map(item=>path.join(dir,item));
            let index=0;
            function next(){ // next() chains the asynchronous deletions one after another
              if(index===files.length) return fs.rmdir(dir,callback);
              let current=files[index++];
              preParallDeep(current,next);
            }
            next();
          });
        }
      });
    }
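
    A quick usage sketch (the './someDir' path is only a placeholder for whatever directory you want to remove):

    preParallDeep('./someDir',function(){
      console.log('someDir removed'); // './someDir' is a hypothetical path
    });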

    Depth-first asynchronous parallel deletion

    • A closure is used so that each iteration of the loop keeps a reference to its own child node.

    function preParallDeep(dir,callback){
      fs.stat(dir,function(err,statObj){
        if(statObj.isFile()){
          fs.unlink(dir,callback);
        }else{ // files is an array of the entry names in the directory (excluding '.' and '..')
          fs.readdir(dir,function(err,files){
            files=files.map(item=>path.join(dir,item));
            if(files.length===0){ // delete the already-empty directory
              return fs.rmdir(dir,callback);
            }
            for(let i=0;i<files.length;i++){ // kick off every same-level node in parallel
              (function(i){ // the closure captures this iteration's index
                preParallDeep(files[i],function(){
                  // when the callback for the last child fires, remove the parent directory
                  if(i===files.length-1){
                    return fs.rmdir(dir,callback);
                  }
                });
              })(i);
            }
          });
        }
      });
    }

    • We can also do it with a shared counter: each child's callback increments it, and the parent directory is removed once the counter reaches the number of children.

    function preParallDeep(dir,callback){
      fs.stat(dir,function(err,statObj){
        if(statObj.isFile()){
          fs.unlink(dir,callback);
        }else{
          fs.readdir(dir,function(err,files){
            files=files.map(item=>path.join(dir,item));
            let index=0;
            if(files.length===0) return fs.rmdir(dir,callback);
            function done(){ // called once for every deleted child
              if(++index===files.length) return fs.rmdir(dir,callback);
            }
            files.forEach(dir=>{
              preParallDeep(dir,done);
            });
          });
        }
      });
    }

    • Implementation through Promises

    function preParallDeep(dir){
      return new Promise((resolve,reject)=>{
        fs.stat(dir,function(err,statObj){ // fs.stat() tells us whether dir is a file or a directory
          if(statObj.isFile()){
            fs.unlink(dir,resolve);
          }else{
            fs.readdir(dir,function(err,files){
              files=files.map(item=>preParallDeep(path.join(dir,item))); // one promise per child
              Promise.all(files).then(()=>fs.rmdir(dir,resolve)); // remove the parent once every child resolves
            });
          }
        });
      });
    }
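
    The Promise version is used the same way, just with .then() instead of a callback (the path is again a hypothetical placeholder):

    preParallDeep('./someDir').then(()=>{
      console.log('someDir removed'); // './someDir' is a hypothetical path
    });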

    Breadth-first synchronous deletion

    function white(dir){
      let arr=[dir]; // the queue, seeded with the root directory
      let index=0;   // the pointer into the queue
      let current;
      while(current=arr[index++]){
        if(!fs.statSync(current).isDirectory()) continue; // files have no children to enqueue
        let dirs=fs.readdirSync(current);
        dirs=dirs.map(item=>path.join(current,item));
        arr=[...arr,...dirs];
      }
      console.log(arr); // arr now holds every path in breadth-first order
    }
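
    The function above only builds the breadth-first list of paths; the removal itself is left out. A minimal sketch of that remaining step, assuming arr is the array collected above (removeSync is a hypothetical helper, not part of the code above): walk the array from the tail back to the head so that children are always removed before their parents.

    function removeSync(arr){ // hypothetical helper for the collected breadth-first list
      for(let i=arr.length-1;i>=0;i--){ // tail first: children before their parents
        if(fs.statSync(arr[i]).isFile()){
          fs.unlinkSync(arr[i]);
        }else{
          fs.rmdirSync(arr[i]);
        }
      }
    }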

    Breadth-first asynchronous deletion

    function white(dir, callback) {
      let index = 0, arr = [dir];
      function remove(dirs) { // delete from the tail of the list back to the head
        let key = dirs.length - 1;
        function removeNext() {
          if (key < 0) return callback(); // everything, including the root, has been removed
          let current = dirs[key--];
          fs.stat(current, function (err, statObj) {
            if (statObj.isFile()) {
              fs.unlink(current, removeNext);
            } else {
              fs.rmdir(current, removeNext);
            }
          });
        }
        removeNext();
      }
      function next(currentDir) {
        fs.stat(currentDir, function (err, statObj) { // fs.stat() tells us whether the path is a file or a directory
          if (statObj.isFile()) {
            if (index === arr.length - 1) return remove(arr); // every queued path has been visited
            next(arr[++index]);
          } else {
            fs.readdir(currentDir, function (err, files) {
              if (err) return;
              files = files.map(item => path.join(currentDir, item));
              arr = [...arr, ...files]; // enqueue the children at the tail of the list
              if (index === arr.length - 1) return remove(arr);
              next(arr[++index]);
            });
          }
        });
      }
      next(arr[index]);
    }

    conclusion

    Node's fs module has both asynchronous and synchronous APIs, and the two are handled differently. When writing a recursive function, work out the relationship between the first layer and the second layer first; that keeps the thinking from getting chaotic and lets you write the code with a clear head. If you have ideas, feel free to exchange them so our technical thinking can collide. Finally, between breadth and depth traversal, depth is generally a little better for performance while breadth is easier to understand; which one to use depends on the specific business scenario.