bug in loader.q for loading large CSV files

There seems to be a bug in the script at the path below, which loads large CSV files in chunks and dumps them to disk.

It seems to be skipping records.

https://github.com/KxSystems/cookbook/blob/master/dataloader/loader.q
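
For context, the script reads the file in chunks and hands each chunk of raw text to the same load function, so only the first chunk of a file still carries the header row. A minimal sketch of that style of chunked reading (hypothetical file name and callback, not the exact loader.q invocation):

/ .Q.fs-style chunked read: the callback is applied to each chunk of complete lines;
/ here it just prints how many lines each chunk contains (hypothetical file name)
.Q.fs[{0N!count x};`:bigfile.csv]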

Shouldn't this portion:

 // check if we have already read some data from this file
 // if this is the first time we've seen it, then the first row
 // contains the header information, so we want to load it accounting for that
 // in both cases we want to return a table with the same column names
 data:$[filename in filesread;
  [flip columnnames!("PSFI S";enlist",")0:rawdata;
   filesread,::filename];
  columnnames xcol ("PSFI S";enlist",")0:rawdata];

be the following? As quoted above, filesread,::filename sits in the branch that only runs once the file is already in filesread, so it never executes; every chunk is then parsed as if its first row were a header, which would explain the skipped records:

 // check if we have already read some data from this file
 // if this is the first time we've seen it, then the first row
 // contains the header information, so we want to load it accounting for that
 // in both cases we want to return a table with the same column names
 data:$[filename in filesread;
  [flip columnnames!("PSFI S";enlist",")0:rawdata];
  [filesread,::filename;
   columnnames xcol ("PSFI S";enlist",")0:rawdata]];
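
For anyone verifying the fix, here is a small standalone sketch (hypothetical names and strings, not taken from loader.q) of how a bracketed block inside $[cond;t;f] behaves in q: the expressions are evaluated in order and the last one is the result, which is why filesread,::filename has to come before the expression that builds the table in the first-time branch.

/ hypothetical sketch of a conditional with a bracketed block
filesread:`$()                          / no files seen yet
filename:`data1.csv                     / hypothetical file name
chunk1:$[filename in filesread;
  "already seen: no header row";        / not taken for the first chunk
  [filesread,::filename;                / side effect runs first,
   "first chunk: header row present"]]  / the last expression is the result
show chunk1    / "first chunk: header row present"; filesread is now enlist`data1.csv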