Append to file

Hi all.
I tried to append data to a file over several iterations and saw that each iteration ran more slowly.

It looks like it reads and rewrites the existing file.

I tried it both with and without compression in binary format; the problem exists in both cases.

Example of my code (C#):

               c.ks("insert", "MyQuotes", ycum);
               c.ks(@":c:/q/data/6A_data upsert MyQuotes");
               c.ks("delete from MyQuotes");

 

Can you please give me a hint…

If the data in a column has an attribute, e.g. `p, it will rewrite the whole file.
To append data in linear time, don't have any attributes on the column.
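For illustration, a minimal sketch of the difference (the table name and path `:db/t are hypothetical):

   q)`:db/t/ set ([]a:til 5)          / splayed table, no attributes
   q)`:db/t/ upsert ([]a:5 6 7)       / appends in linear time
   q)@[`:db/t;`a;`p#]                 / give column a the parted attribute on disk
   q)`:db/t/ upsert ([]a:8 8 9)       / now the whole column file is rewritten to keep the attribute valid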

I didn't use attributes, just created a simple table like this:
MyQuotes:(Time:();Symbol:();Bid:();Ask:();BidSize:();AskSize:())

Sorry, overlooked - use a splayed table,

i.e.
   c.ks(@":c:/q/data/6A_data/ upsert .Q.en[`:.;MyQuotes]");

Thanks! It works now without performance loss.

Another question about splayed files:

  1) How to get data by index? With a flat table I can use this:

q)select from get `:TestFIle3 where i>0

but this does not work with splayed files.

2) Sorting compressed splayed tables: when I set

q)c.k(".z.zd:(16;2;1);");

sorting does not work.

Of course I can uncompress the data, sort it, and then compress it again, but I don't think that's a smart way.

note the trailing /

q)`:t/ set ([]a:til 10);

q)select from get`:t/ where i>3
a
-
4
5
6
7
8
9

I don't think there's a way around decompressing, sorting, compressing.
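That round trip can be sketched as follows, assuming the table is splayed at the hypothetical path `:db/t and sorted on Time:

   q)t:get`:db/t/                 / read; compressed columns are decompressed on access
   q)t:`Time xasc t               / sort in memory
   q).z.zd:16 2 1                 / compressed writes: 2^16 block size, gzip, level 1
   q)`:db/t/ set t                / write the sorted table back, compressed

If the table has symbol columns, re-enumerate with .Q.en before the final set.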

If I'm right, selection by index reads the whole file first (get inside the request), and if the file is too big I get an error. So can I read the file in chunks (like .Q.fs)?
P.S. I'm interested in both compressed and uncompressed files.

If you have the problem that the data is too big for 32-bit, you could look at other recent threads here about segmenting your db.
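On the chunked-read question: one option is to map the splayed table and select by index ranges, so only the touched blocks are read per pass (a sketch; the path, chunk size, and per-chunk function f are hypothetical):

   q)t:get`:db/t/                 / maps the splayed table; columns load lazily
   q)n:count t
   q)chunk:1000000
   q)f:{count x}                  / placeholder: replace with your per-chunk processing
   q)f each {select from t where i within (x;x+chunk-1)} each chunk*til ceiling n%chunk

This works for compressed files too, since decompression happens block by block as the selected rows are touched.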