Recurrent neural networks in KDB+.

Dear q folks, 
If you are looking for an LSTM (long short-term memory) neural network implemented in KDB+, here is some code I wrote -
https://github.com/krish240574/lstmq/blob/master/polstm.q - this implements the following:

  1. An LSTM that attempts (and is still attempting) to predict the next letter in a sequence, the sequence here being a Shakespearean passage (see the sketch after this list).

  2. I have written the code out elaborately, to bring out the nuances of a neural network, knowing that KDB+ code can be really tight and dense if it wants to be.
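For the curious, here is what one step of a standard LSTM cell looks like in q. This is a minimal sketch using the textbook gate equations and my own (hypothetical) names, not the actual code in polstm.q:

  sig:{1%1+exp neg x}                          / logistic sigmoid
  tnh:{-1+2%1+exp neg 2*x}                     / hyperbolic tangent (no q built-in)
  / one step: p is a dictionary of float weight matrices and bias vectors,
  / st is the pair (hidden;cell), x is the current input vector
  step:{[p;st;x]
    h:st 0;c:st 1;
    i:sig[(p[`Wi]mmu x)+(p[`Ui]mmu h)+p`bi];   / input gate
    f:sig[(p[`Wf]mmu x)+(p[`Uf]mmu h)+p`bf];   / forget gate
    o:sig[(p[`Wo]mmu x)+(p[`Uo]mmu h)+p`bo];   / output gate
    g:tnh[(p[`Wg]mmu x)+(p[`Ug]mmu h)+p`bg];   / candidate cell value
    c:(f*c)+i*g;                               / new cell state
    (o*tnh c;c)}                               / new (hidden;cell)

Scanning that over a sequence then gives the whole trajectory of hidden states, e.g. step[p]\[(h0;c0);xs], with h0:c0:n#0f for hidden size n and xs a list of one-hot input vectors.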

I am still working on getting the initialisation values right; running the code for many, many iterations (read: epochs, in NN parlance) has yet to converge to a minimal loss value. I have tried mechanisms such as “maxout” and “Xavier initialisation”, and things are slowly improving.
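For reference, Xavier (Glorot) initialisation just draws weights uniformly from (-b;b), with b scaled by the fan-in and fan-out of the layer; a one-line sketch in q (my naming, not the repo's):

  / Xavier/Glorot uniform init for an m-by-n weight matrix, with b:sqrt 6%m+n
  xavier:{[m;n]b:sqrt 6%m+n;(m;n)#neg[b]+(2*b)*(m*n)?1f}

So something like Wi:xavier[64;26] would initialise a 64-unit gate over a 26-letter one-hot input.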

I shall be commenting the code in detail, to make it more lucid.

I have also implemented a SEQ2SEQ paper, and another that uses an “attention” mechanism, applied to language translation.

They are in https://github.com/krish240574/lstmq/blob/master/polstm.q

Please note that this code is still being refined and commented, so that it can become genuinely useful; that work is in progress.

Do take a look, 

Regards, 

Kumar

Sorry, the links for SEQ2SEQ and a bidirectional LSTM are in the following files -
https://github.com/krish240574/lstmq/blob/master/seq2seq.q - SEQ2SEQ

https://github.com/krish240574/lstmq/blob/master/bilstmatt.q - Bi-directional LSTM with “attention”. 
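For context, the core of such an attention step reduces to a softmax-weighted sum of the encoder's hidden states; a minimal dot-product sketch in q (hypothetical names, far simpler than what bilstmatt.q actually does):

  smax:{e:exp x-max x;e%sum e}            / numerically stable softmax
  / hs: matrix of encoder states (one row per time step), d: current decoder state
  attn:{[hs;d]w:smax[hs mmu d];sum w*hs}  / scores -> weights -> context vector

The resulting context vector is then fed into the decoder alongside its own state at each step.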

Apologies for pasting the wrong link in the earlier mail (late night!).

Regards, 

Kumar