Error Storing a Large Table

Hello, wondering if someone can help me out.

I am trying to store a large table by upserting it to disk in batches, using

(`:e:/hdb/nbbo) upsert .Q.en[`:c:/hdb] nbbo;
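Roughly, the full loop looks like this (paths shortened, and `nextBatch` is just a stand-in for however I read the next chunk of raw data):

```q
/ sketch of the batch loader; nextBatch is a placeholder for the real reader
while[count nbbo:nextBatch[];
  (`:e:/hdb/nbbo) upsert .Q.en[`:c:/hdb] nbbo]
```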

It seems to be fine if I stop the loader once a few million rows have been stored, but if I let it run to completion and then try to query the table, I get this error:

ERROR: 'e:/hdb/nbbo/sym: The operation completed successfully. (user-defined signal)

I can’t sort the table, apply an attribute, or even just query it.

Any ideas?

Now I’m trying to query the table while the loader is running, and I am seeing these errors:

disk compression - bad logicalBlockSize

Not enough storage is available to process this command.

Did I not set compression correctly? Basically I just set

.z.zd::(17;2;6);

and then started upserting
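For reference, the three numbers in .z.zd are (logical block size as a power of 2; algorithm; compression level), so (17;2;6) means 2^17-byte (128kB) blocks, gzip, level 6. You can check what a column file was actually written with by calling -21! on it (the path below is made up):

```q
.z.zd:(17;2;6)          / 2^17-byte blocks, algorithm 2 (gzip), level 6
-21!`:e:/hdb/nbbo/bid   / hypothetical path: for a compressed file this
                        / returns stats such as compressed/uncompressed length
```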

…I didn’t know you could compress a table saved to disk. 

Is this table splayed? What about partitioned? 

Yes, it is splayed and partitioned; I just removed the path specifics for the post.

upsert to a splayed table appends in place only if the column files are vectors of fixed-width, non-nested primitives with no attributes; otherwise every file is rewritten in full on each append.
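To illustrate with a throwaway example (made-up paths):

```q
/ plain fixed-width columns: upsert appends in place
`:/tmp/db/t/ set ([] a:1 2 3; b:1.1 2.2 3.3);
`:/tmp/db/t/ upsert ([] a:4 5; b:4.4 5.5);
get[`:/tmp/db/t/]`a     / 1 2 3 4 5

/ a nested column (strings here), or an attribute on a column,
/ means each upsert rewrites the affected files in full instead
`:/tmp/db/u/ set ([] s:("ab";"cd"));
`:/tmp/db/u/ upsert ([] s:enlist "ef")
```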

compressed files cannot be read safely whilst they are being appended to.

Right, I am appending to a splayed table with no attributes on any column. But when I stop the loader and try to query the table, I get this error:

Not enough storage is available to process this command.

It seems to work until I have around 30,000,000+ rows.

Googling suggests that this is related to IRPStackSize:

https://windowsinstructed.com/fix-not-enough-storage-is-available-to-process-this-command/
https://www.cm3solutions.com/enough-storage-available-process-command-visual-studio/

Do you have 32-bit Windows?

If you only query a single column, do you get the same error?

I tried that regedit fix but it did not work, and querying a single column did not work either. The Windows I’m using is 64-bit.

It’s weird: if I start storing the table compressed, I can query it up to a certain point before it throws that error. If I store the table uncompressed, I can never query it at all, with the same error. So it looks like it may have something to do with the size of the segments on disk.

Are you using 32-bit or 64-bit q for Windows?

I’m using the 32-bit free-licence q for Windows at the moment.

32-bit q only allows ~4 GB of memory. When I run over this limit on Linux, q throws a 'wsfull error. Maybe that’s the error you’re getting, but it surfaces differently on Windows?
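You can watch how close you are to that ceiling from inside q:

```q
.Q.w[]   / memory stats in bytes; `used and `heap are the ones to watch
         / against the ~4GB limit of 32-bit q
```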

Well, I do see 'wsfull errors on Windows generally. This seems to be an OS error though; wondering if it could be related:

OS reports: Not enough storage is available to process this command.

Thanks

Thanks for everyone’s help. It turns out it was a memory issue with 32-bit q, but I was getting an OS error instead of 'wsfull.

I’ve seen strange errors like this on Linux too: when I query a database in 32-bit q and run out of memory while reading from disk, it does not show 'wsfull.