Dear q folks,
If you are looking for an LSTM (long short-term memory) neural network implemented in KDB, here is some code I wrote:
https://github.com/krish240574/lstmq/blob/master/polstm.q - it implements the following:
- An LSTM that attempts (and is still attempting) to predict the next letter in a sequence, the sequence here being a Shakespearean passage.
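For readers new to LSTMs, here is a minimal sketch in q of a single cell step - just the standard gate equations, with hypothetical names (sig, th, step, W, b), not the actual code from the repository:

sig:{1%1+exp neg x}                     / logistic sigmoid
th:{1-2%1+exp 2*x}                      / tanh (q has no built-in tanh)
/ W is a 4-list of float weight matrices, b a 4-list of bias vectors,
/ h and c the previous hidden and cell states, x the input vector
step:{[W;b;h;c;x]
  z:h,x;                                / join previous hidden state and input
  f:sig b[0]+W[0] mmu z;                / forget gate
  i:sig b[1]+W[1] mmu z;                / input gate
  o:sig b[2]+W[2] mmu z;                / output gate
  g:th b[3]+W[3] mmu z;                 / candidate cell update
  c:(f*c)+i*g;                          / new cell state
  h:o*th c;                             / new hidden state
  (h;c)}

Iterating step over a sequence of one-hot character vectors and pushing the final h through an output layer gives the next-letter prediction.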
I have deliberately written the code in an elaborate, spelled-out style, so as to bring out the nuances of a neural network, knowing that KDB code can be really tight and dense when it wants to be.
I am still working on getting the initialisation values right; running the code for many, many iterations (read epochs, in NN parlance) has yet to converge to a minimal loss value. I have tried mechanisms like "maxout" and "Xavier initialisation", and things are slowly improving.
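For reference, Xavier (Glorot) uniform initialisation draws each weight from [-b;b] with b = sqrt(6 % (nin+nout)). A one-line q sketch - the function name and argument order are my own, not the repository's:

xavier:{[nin;nout] b:sqrt 6%nin+nout; (nout;nin)#neg[b]+(nin*nout)?2*b}
W:xavier[128;64]   / e.g. a 64x128 weight matrix, for hypothetical layer sizes

The idea is to keep the variance of activations roughly constant across layers, so the gates do not saturate early and the loss has a chance to converge.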
I shall be commenting the code in detail, to make it more lucid.
I have also implemented a SEQ2SEQ paper and another that uses an "attention" mechanism, applied to language translation.
They are in https://github.com/krish240574/lstmq/blob/master/polstm.q.
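At its core, the attention step scores each encoder state against the current decoder query and takes a softmax-weighted sum. A small q sketch of scaled dot-product attention, with hypothetical names (qv is the query vector, K and V are T x d float matrices of keys and values) - attention variants differ in how the scores are computed, but the weighted-sum shape is the same:

softmax:{e:exp x-max x; e%sum e}                               / numerically stable softmax
attend:{[qv;K;V] w:softmax[(K mmu qv)%sqrt count qv]; sum w*V} / context vector

attend[qv;K;V] returns the context vector the decoder consumes at each step.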
Please note that this code is still being refined and commented so it can become really useful; that work is in progress.
Do take a look,
Regards,
Kumar