I am trying to store a large table on disk by upserting it in batches, using
((`:e:/hdb/nbbo) upsert .Q.en[`:c:/hdb] nbbo);
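Roughly, the loader is just a loop like the sketch below (readBatch and nBatches are placeholders for my actual reader and batch count):

    / minimal sketch of the batch loader; readBatch / nBatches stand in for the real reader
    do[nBatches;
      nbbo:readBatch[];                             / next chunk as an in-memory table
      `:e:/hdb/nbbo/ upsert .Q.en[`:c:/hdb] nbbo];  / enumerate syms against c:/hdb, append to the splay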
It seems to be fine if I stop the loader once a few million rows have been stored. But if I let it run to completion and then try to query the table, I get this error:
ERROR: 'e:/hdb/nbbo/sym: The operation completed successfully. (user-defined signal)
I can’t sort the table, apply an attribute, or even just query it.
upsert to a splayed table appends in place only if the column files contain vectors of fixed-width, non-nested primitives with no attributes; otherwise the file is rewritten in full on each append.
Also note that compressed files cannot be read safely whilst they are being appended to.
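So the usual pattern is to keep the column files as plain, attribute-free vectors while loading, and only sort and set the attribute once the whole load has finished. A sketch, reusing your paths:

    / run once, after the loader has completed
    `sym xasc `:e:/hdb/nbbo/;      / sort the splayed table on disk by sym
    @[`:e:/hdb/nbbo/;`sym;`p#];    / then set the parted attribute on the sym column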
I tried adding that regedit fix but it did not work, and querying a single column did not work either. The Windows I’m using is 64-bit.
It’s weird: if I start storing the table compressed, I can query it up to a certain point, after which it throws that error. If I store the table uncompressed, I can never query it at all and get the same error. So it looks like it may have something to do with the size of the segments on disk.
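To see where it falls over, I can list the sizes of the column files on disk with something like:

    / size in bytes of each file in the splayed directory
    f:key `:e:/hdb/nbbo;
    f!hcount each ` sv' `:e:/hdb/nbbo,'f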
A 32-bit process only allows for ~4 GB of memory. When I run over this limit on Linux, q throws a 'wsfull error. Maybe that’s the error you’re getting, but it shows up differently on Windows?
I’ve seen strange errors like this on Linux too: when I’ve queried a database in 32-bit q and run out of memory while reading from disk, it does not always show 'wsfull.
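If you want to check, q reports its own memory stats, so you can watch the heap against that ceiling:

    .Q.w[]    / memory stats: used, heap, peak, wmax, mmap, mphy (bytes), syms, symw
    / starting q with an explicit heap cap (q -w 4000, limit in MB) at least gives a
    / clean 'wsfull once the limit is hit, rather than stranger errors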