export hdb to csv day by day

Hi,

I have an HDB (partitioned and splayed), so it looks like:

  • 2013.10.01
     - table_1
        - col_a
        - col_b
     - table_2
        - col_1
        - col_2
  • 2013.10.02
     - table_1
        - col_a
        - col_b
     - table_2
        - col_1
        - col_2

I would like to export each table for each day to a CSV file.

Currently I load one day of a table into memory and then save it to disk like this:

  h "`table_1csv set select from table_1 where date=2013.10.01; save `table_1csv.csv; delete table_1csv from `."

The problem is that not all tables fit into memory. Is there a way to do this?

Regards,

Michael

If it doesn't fit into memory, you can chunk it, say, 3 hours at a time (whatever fits into memory for you) and append to a text file: http://code.kx.com/wiki/JB:QforMortals2/i_o#Text_Files
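For example, a rough sketch of that idea (hypothetical code: it assumes the HDB is already loaded in the session and that table_1 has a time column named `time`, which the schema above doesn't show):

exportDay:{[dt]
  f:hopen `$":table_1_",string[dt],".csv";
  neg[f] first csv 0: select from table_1 where date=dt, i<1;  / header line only
  / append each 3-hour window, dropping the header from every chunk
  {[f;dt;h]
    if[count s:1_csv 0: select from table_1 where date=dt, time.hh within (h;h+2);
      neg[f] s]
    }[f;dt] each 0 3 6 9 12 15 18 21;
  hclose f;
  }

exportDay each date  / one CSV per partition, never more than 3 hours of rows in memory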

Michael,

You could try writing the data to the CSV file in chunks that fit into memory. Example function below:

batchedCsvSave:{[file;table;batchSize]
  fileHandle:hopen file;
  (neg fileHandle) first csv 0: 1#table; / first output the column names
  i:0;
  while[i<count table;
    / clip the last batch so we never index past the end of the table
    (neg fileHandle) 1_csv 0: table i+til batchSize&count[table]-i;
    i+:batchSize];
  hclose fileHandle;
  }

Usage: t:([]n:til 1000;f:1000?10f); batchedCsvSave[`:t.csv;t;100]
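The same function can also stream one HDB partition at a time without loading it first: a splayed directory maps lazily with get, and indexing the mapped table only copies the requested rows. A hypothetical example (the root /db, the file names, and the batch size are placeholders; the sym file is loaded so enumerated symbol columns resolve):

load `:/db/sym;                    / enumeration domain for symbol columns
t1:get `:/db/2013.10.01/table_1/;  / lazily mapped splayed table, not loaded
batchedCsvSave[`:table_1_2013.10.01.csv;t1;100000]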

HTH,

Pawel


thx!
