Partial upserting to a table, using existing table column types

Hi,

I have a table with a defined set of column types. I receive updates to this table via JSON, and the types in the JSON dicts do not exactly match. The JSON dict may have more or fewer columns than the row it needs to update, although the key columns will always be present. If a column is absent from the update it must not be altered in the existing row. I need to select the common columns and cast each one to the existing type of the corresponding table column.

I have some working code below, but it feels a bit like line noise. Is there a more idiomatic solution? Passing the dict into the join function feels wrong. Thanks for any help!

Cheers,

Donovan.

q)t:([sym:`ibm`amd];price:10h,20h;volume:1 2)
q)v:`sym`price!(`ibm;25)
q)common:{(cols x)inter(cols y)}
q)castcommon:{[d;t];c:common[d;t];c!(meta t){[d;x;y]x[y][`t]$d[y]}[d;]/:c}
q)t
sym| price volume
---| ------------
ibm| 10    1
amd| 20    2
q)`t upsert castcommon[v;t]
`t
q)t
sym| price volume
---| ------------
ibm| 25    1
amd| 20    2

On my phone here but should work…

q)`t upsert ((.Q.ty each flip 0!0#t)key v)$v

PS. You don't need to find the common columns. Upsert will just upsert a null.

HTH,
Sean

On Thu, Jul 23, 2015 at 1:34 PM, Donovan Hide <donovanhide> wrote:

Submitted via Google Groups

Hi Sean, 
In the case where there are extra values in v that aren't in the table you're trying to insert to, this won't work. Instead it should be:

q)typemap:.Q.ty each flip 0!t

q)`t upsert typemap[c]$(c:key[v] inter key typemap)#v
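To illustrate Andrew's version against the table from the first post (the extra key in v is my example, not from the thread), the intersection drops any key that t doesn't have before the cast:

```q
q)t:([sym:`ibm`amd];price:10h,20h;volume:1 2)
q)v:`sym`price`extra!(`ibm;25;1b)
q)typemap:.Q.ty each flip 0!t
q)c:key[v] inter key typemap          / `sym`price: `extra is dropped
q)`t upsert typemap[c]$c#v
```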

Thanks Sean and Andrew for the help. .Q.ty certainly seems useful! Andrew is right, in that I do need to intersect the columns to get what I need. Having got a bit further, I've discovered another hurdle: odd type mappings for the decoded JSON.

q)t:([Account:0#`]Sequence:0#0i;Flags:0#0i;Balance:0#0j;OwnerCount:0#0i;TransferRate:0#0j;RegularKey:0#`;AccountTxnID:0#"")
q)v:`Account`Balance`Flags`OwnerCount`Sequence!("rNSmjVCuPxnD5Kt1km6GRpfn1RScZsFKSw";"7294584223";0f;0f;360f)
q)t
Account| Sequence Flags Balance OwnerCount TransferRate RegularKey AccountTxnID
-------| ----------------------------------------------------------------------
q)v
Account   | "rNSmjVCuPxnD5Kt1km6GRpfn1RScZsFKSw"
Balance   | "7294584223"
Flags     | 0f
OwnerCount| 0f
Sequence  | 360f
q).Q.ty each v
Account   | c
Balance   | c
Flags     | F
OwnerCount| F
Sequence  | F
q).Q.ty each flip 0!t
Account     | s
Sequence    | i
Flags       | i
Balance     | j
OwnerCount  | i
TransferRate| j
RegularKey  | s
AccountTxnID| c

From the above I hope it's easy to see that numbers that need to go into a long column arrive as strings, while numbers that need to go into an int column arrive as floats. The float=>int mapping is fine, but I need to conditionally upper-case the cast character (j => J) where the source is a string (c) and the destination is a long (j), so that the string is parsed rather than cast. Perhaps I need to modify the JSON decoder to treat digit-only strings as longs, or alternatively build a custom casting map in a dict which depends on the columns of both the source dict and the destination table?
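A possible sketch of that second option, building on Andrew's typemap (this is my untested suggestion, not from the thread): upper-case the destination type char wherever the incoming value is a string and the destination is not a string column, so that $ parses ("J"$"7294584223") instead of casting.

```q
/ sketch only: parse strings into typed columns, cast everything else
typemap:.Q.ty each flip 0!t                 / destination type chars per column
c:key[v] inter key typemap                  / columns present in both
src:.Q.ty each v c                          / source type chars ("c" for strings)
dst:typemap c
cast:?[("c"=src)&not"c"=dst;upper dst;dst]  / upper-case => tok/parse, e.g. "J"
`t upsert c!cast$'v c                       / apply each cast char to its value
```

This leaves string-to-string columns (like AccountTxnID) untouched, since their destination char stays lower case.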