Compression Error

Hi,

I am using q on Windows. I need to compress tables at the time of splaying them.

For simplicity I have taken the following example:

.z.zd:(16;2;1)  / logical block size 2^16 bytes, algorithm 2 (gzip), compression level 1

`:db/t/ set ([] ti:09:31:00 09:30:01 09:30:02 09:30:03; p:33 33.5 33.2 33.3)

When I run the above command I get the following error:

Error: db/t/ti: This operation completed successfully.

The .d file is created in db/t. Running get `:db/t/.d gives me the list of columns.

Do I need to include some libraries to make compression work?

http://code.kx.com/wiki/Cookbook/FileCompression

Q) I want to use algorithm #2 (gzip). Do I need additional libraries??
A) Yes, but these may already be installed on your system. It binds dynamically to zlib (more info at http://zlib.net). For windows we chose to be compatible with the dlls from http://www.winimage.com/zLibDll/zlib125dll.zip. For linux and solaris you may find it convenient to install zlib using your package manager software, or consult your system administrator for assistance. Note that you will require the matching 32 or 64bit libs for kdb+, i.e. 32bit libs for 32bit kdb+, 64bit libs for 64bit kdb+.
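As a quick sanity check (a sketch reusing the example path from the original post): once the matching zlib DLL is in place, the -21! internal function reports a file's compression statistics, and returns an empty dictionary for an uncompressed file.

-21!`:db/t/ti   / compression stats (algorithm, logicalBlockSize, zipLevel, lengths); empty dict if not compressed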

Thanks Charles. I should have seen it in the first place.

It's working fine with no errors. The issue I am facing is the slow compression rate, as I am using .Q.fs on a 1 GB file with an upsert command:

.Q.fs[{`:newfile/ upsert .Q.en[`:.;] flip colnames!("PSFFFcI";",")0:x}]`:examplecsv/trades2014_01.csv

It takes around 4 minutes. I think the reason may be that the upsert for each 131000j buffer decompresses the existing compressed file and merges it back with recompression. Maybe I can use .Q.fsn.

Is there any workaround?

 

appending to a compressed file is expensive (lots of file ops plus decompress/recompress of the last block),
so try to minimize how many times you do it,
e.g. as you say by using .Q.fsn
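for example, a sketch of the .Q.fsn route, reusing the load function from above with an assumed chunk size of 100 MB per pass instead of .Q.fs's default 131000 bytes, so the compressed file is appended to far fewer times:

/ .Q.fsn[f;file;chunkSize] - same as .Q.fs but with an explicit chunk size in bytes
.Q.fsn[{`:newfile/ upsert .Q.en[`:.;] flip colnames!("PSFFFcI";",")0:x};`:examplecsv/trades2014_01.csv;100000000]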

or by using a hardware accelerator card (such as an AHA card)

or initially write out uncompressed, and then compress afterwards.
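a sketch of that approach (the db/u and db/c paths are just placeholders): splay with .z.zd unset, then compress each column file afterwards with the -19! internal function, which takes the source file, target file, and the same three compression parameters as .z.zd:

\x .z.zd                                   / unset .z.zd so the initial splay is uncompressed
`:db/u/ set ([] ti:09:31:00 09:30:01 09:30:02 09:30:03; p:33 33.5 33.2 33.3)
-19!(`:db/u/ti;`:db/c/ti;16;2;1)           / -19!(src;dst;logicalBlockSize;algorithm;level)
-19!(`:db/u/p;`:db/c/p;16;2;1)
/ remember to copy the .d file into db/c as well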

and/or multithread the compression part in peach. Left as an exercise for the reader ;-)