
Development Log for Neural Network Prototype

The idea, at the core:
  • Port a limited RNN/LSTM Neural Network model from Python, with a reduced training set and reduced layer dimensions, to demonstrate that a fully functional (even if limited) Neural Net can work in Pine.

  • Limited model + having the Python code on hand = able to test and verify components in Pine at every step, in theory.

    The model/script I'm attempting to implement a limited subset of is detailed here:
    iamtrask.github.io/2...nyone-can-code-lstm/

    A dataset in binary is required, but binary representation does not exist in Pine Script, thus:

  • To do this, decimal-to-binary and binary-to-decimal functions are required. These didn't exist previously - I've written a script to accomplish just that (a Python reference sketch of these helpers, plus the Sigmoid checks, follows after this list):

    Originally, this was going to have an input_dim of 2 and a hidden_dim of 16, but I've changed the hidden_dim to 8 (binary dimensions from 8 to 5) to reduce the dataset range to a max of 32 while I figure out how to implement working pseudo-arrays and state updates. I've looked at RicardoSantos's scripts for Markov and Pseudoarrays, and will be using them as a reference going forward.

  • I've verified the output of the Sigmoid function and the 1st derivative of the Sigmoid function in Python for the values (-1, 0, 0.5, 1). I've yet to publish the Sigmoid script, pending approval from TV moderators about including Python code (commented out at the bottom) to verify the results of that script.
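
For reference, a minimal Python sketch of those pieces (illustrative only - not the Pine scripts themselves): 5-bit decimal/binary helpers for the reduced 0..31 range, plus the Sigmoid and its first derivative checked at (-1, 0, 0.5, 1):

    # Minimal Python reference sketch (illustrative, not the Pine implementation):
    # 5-bit decimal <-> binary helpers for the reduced 0..31 range, plus the
    # Sigmoid and its first derivative checked at -1, 0, 0.5 and 1.
    import math

    BITS = 5  # reduced binary dimension -> dataset range 0..31

    def int_to_binary(n, bits=BITS):
        # decimal -> list of bits, most significant bit first
        return [(n >> i) & 1 for i in range(bits - 1, -1, -1)]

    def binary_to_int(bit_list):
        # list of bits (MSB first) -> decimal
        value = 0
        for b in bit_list:
            value = (value << 1) | b
        return value

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    def sigmoid_derivative(y):
        # first derivative expressed via the sigmoid output y = sigmoid(x)
        return y * (1.0 - y)

    assert binary_to_int(int_to_binary(19)) == 19
    for x in (-1, 0, 0.5, 1):
        y = sigmoid(x)
        print(f"x={x:>4}  sigmoid={y:.6f}  derivative={sigmoid_derivative(y):.6f}")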

What I was trying to do here with training dataset generation was unsuccessful, for multiple reasons (see the sketch after this list):
  • Lack of formal array constructs in Pine
  • Pseudorandom number generator limitations
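
For context, the Python side generates its training samples roughly like this (a sketch assuming the iamtrask-style binary addition task, a + b = c in fixed-width binary - not the Pine attempt). Reproducing even this much in Pine runs straight into the two limitations above:

    # Sketch of iamtrask-style training sample generation: random a and b
    # with c = a + b, all encoded as fixed-width binary (5 bits -> max 32).
    import random

    BITS = 5
    MAX_VALUE = 2 ** BITS  # 32

    def int_to_binary(n, bits=BITS):
        # redefined here so the snippet runs standalone
        return [(n >> i) & 1 for i in range(bits - 1, -1, -1)]

    def make_sample():
        # keep a + b inside the representable range
        a = random.randint(0, MAX_VALUE // 2 - 1)
        b = random.randint(0, MAX_VALUE // 2 - 1)
        return int_to_binary(a), int_to_binary(b), int_to_binary(a + b)

    for _ in range(3):
        a_bits, b_bits, c_bits = make_sample()
        print(a_bits, "+", b_bits, "=", c_bits)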

Manual state weighting and updating as per RicardoSantos's Function Markov Process is required:
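
To make the constraint concrete, here's a toy illustration (not RicardoSantos's actual code, and deliberately written the "Pine way") of manually weighting and updating a hidden state when every cell and weight has to be its own named variable rather than an array entry:

    # Toy illustration of manual state weighting/updating without arrays:
    # two inputs, two hidden cells, every weight an individual named variable.
    # (Python has real lists; this mimics what Pine forces without them.)
    import math

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    # input -> hidden and hidden -> hidden (recurrent) weights, one variable each
    w_x0_h0, w_x1_h0, w_h0_h0, w_h1_h0 = 0.10, -0.20, 0.05, -0.10
    w_x0_h1, w_x1_h1, w_h0_h1, w_h1_h1 = 0.30, 0.40, 0.20, 0.15

    def step(x0, x1, h0_prev, h1_prev):
        # one time step: each hidden cell is weighted and updated by hand
        h0 = sigmoid(x0 * w_x0_h0 + x1 * w_x1_h0 + h0_prev * w_h0_h0 + h1_prev * w_h1_h0)
        h1 = sigmoid(x0 * w_x0_h1 + x1 * w_x1_h1 + h0_prev * w_h0_h1 + h1_prev * w_h1_h1)
        return h0, h1

    h0, h1 = 0.0, 0.0
    for x0, x1 in [(1, 0), (0, 1), (1, 1)]:
        h0, h1 = step(x0, x1, h0, h1)
        print(f"h0={h0:.4f}  h1={h1:.4f}")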

What's being plotted are the first three synapse layers (sketched below), but without the full range of the input/hidden dimensions:

syn_0 (blue)
syn_1 (green)
syn_h (red)
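
For reference, in the iamtrask-style setup those three layers are the synapse (weight) matrices; a sketch of their shapes under the reduced dimensions (an assumption about the reference model, not the Pine plot itself):

    # Sketch of the three synapse (weight) layers, iamtrask-style naming;
    # shapes assume the reduced dimensions (input_dim=2, hidden_dim=8, output_dim=1).
    import numpy as np

    input_dim, hidden_dim, output_dim = 2, 8, 1
    rng = np.random.default_rng(0)

    # weights initialised in [-1, 1)
    syn_0 = 2 * rng.random((input_dim, hidden_dim)) - 1    # input  -> hidden  (blue)
    syn_1 = 2 * rng.random((hidden_dim, output_dim)) - 1   # hidden -> output  (green)
    syn_h = 2 * rng.random((hidden_dim, hidden_dim)) - 1   # hidden -> hidden  (red, recurrent)

    print(syn_0.shape, syn_1.shape, syn_h.shape)  # (2, 8) (8, 1) (8, 8)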


While there are more than a few technical hurdles to overcome (e.g. potential Pine issues ranging from max variable counts to runtime/compile limits, no real arrays, functions to do state updates in RicardoSantos's Markov Function style, etc.), I'm fairly confident a limited working model should be possible to create in Pine.
Comment:
Scaling back:

I've apparently got weight updates working for a basic Perceptron, but getting error/loss etc. working is another story:
Comment:
NAND Perceptron script:
- Error/loss functions are missing due to the difficulty of handling iterations/epochs without loop structures and proper arrays
- Layer weights are computed INSIDE Pine Script, unlike other scripts that use external programs and setups to generate activation function values and outputs for an ANN/NN (a reference sketch of the equivalent Python follows below)
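
For comparison, here's roughly what that looks like in Python using the classic perceptron learning rule on the NAND truth table (a generic sketch, not the published Pine script - the Pine version has to unroll the loops and arrays by hand):

    # NAND perceptron sketch: weights learned "inside" the script with the
    # classic perceptron learning rule over the NAND truth table.
    DATA = [((0, 0), 1), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]  # NAND

    w0, w1, bias = 0.0, 0.0, 0.0
    LEARNING_RATE = 0.1

    def predict(x0, x1):
        # step activation on the weighted sum
        return 1 if (w0 * x0 + w1 * x1 + bias) > 0 else 0

    for epoch in range(50):
        for (x0, x1), target in DATA:
            error = target - predict(x0, x1)   # -1, 0 or +1
            w0 += LEARNING_RATE * error * x0   # perceptron update rule
            w1 += LEARNING_RATE * error * x1
            bias += LEARNING_RATE * error

    print([predict(x0, x1) for (x0, x1), _ in DATA])  # expected: [1, 1, 1, 0]
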
Comment:
I'm looking forward to real array methods and structures in future Pine Script updates/versions - it would enable both me and other script authors to make more complex scripts without individual-variable array workarounds:
trello.com/c/aQm30HO...array-data-structure

It would most certainly make storing/accessing data per loop iteration/epoch for Neural Network cells and layers much, much easier in Pine (in theory, at least).
Comment:
I've been able to get an iteration loop working and have verified the outputs against Python - the predictions are correct.
The error/sum-of-squared-errors calculation still isn't producing the right loss output, though:
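
For reference, the per-epoch loss I'm checking against is just the squared error accumulated over the truth table each epoch. A sketch of that bookkeeping (generic Python, using a single sigmoid unit and the delta rule on an OR gate - not the Pine script itself):

    # Per-epoch sum-of-squared-errors bookkeeping: a single sigmoid unit
    # trained with the delta rule on the OR truth table.
    import math

    DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]  # OR

    w0, w1, bias = 0.0, 0.0, 0.0
    LEARNING_RATE = 0.5

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    for epoch in range(2000):
        sse = 0.0                                # reset loss at the start of each epoch
        for (x0, x1), target in DATA:
            out = sigmoid(w0 * x0 + w1 * x1 + bias)
            error = target - out
            sse += error ** 2                    # accumulate squared error over the epoch
            grad = error * out * (1.0 - out)     # delta rule (uses the sigmoid derivative)
            w0 += LEARNING_RATE * grad * x0
            w1 += LEARNING_RATE * grad * x1
            bias += LEARNING_RATE * grad
        if epoch % 500 == 0:
            print(f"epoch {epoch:4d}  SSE {sse:.4f}")  # SSE should fall toward 0

    print([round(sigmoid(w0 * x0 + w1 * x1 + bias)) for (x0, x1), _ in DATA])  # ~[0, 1, 1, 1]
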
Comment:
AND Gate:
OR Gate:
NOT Gate:
NAND Gate:
Comment:
NAND Gate:
Comment:
XOR/XNOR Gate state detection - those two gate structures require an MLP and nonlinear activation to properly train + predict the correct output:

An MLP is a bit out of scope for this script, but with the basic unit of machine learning established and operational, it should hypothetically be possible.
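
For the record, the MLP needed for XOR is tiny: a 2-4-1 network with a nonlinear (sigmoid) hidden layer trained by backpropagation. A generic numpy sketch (not exploratory Pine code), mainly to show why real arrays matter here:

    # Minimal 2-4-1 MLP with sigmoid activations trained by backprop on XOR.
    # Generic reference sketch; a Pine version would need real arrays (or a
    # lot of unrolled variables) to hold these weight matrices.
    import numpy as np

    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    rng = np.random.default_rng(42)
    w_hidden = 2 * rng.random((2, 4)) - 1   # input  -> hidden weights
    b_hidden = np.zeros((1, 4))
    w_out = 2 * rng.random((4, 1)) - 1      # hidden -> output weights
    b_out = np.zeros((1, 1))
    lr = 0.5

    for epoch in range(20000):
        # forward pass
        hidden = sigmoid(X @ w_hidden + b_hidden)
        out = sigmoid(hidden @ w_out + b_out)

        # backward pass (gradient of the sum-of-squared-errors loss)
        out_delta = (y - out) * out * (1 - out)
        hidden_delta = (out_delta @ w_out.T) * hidden * (1 - hidden)

        w_out += lr * hidden.T @ out_delta
        b_out += lr * out_delta.sum(axis=0, keepdims=True)
        w_hidden += lr * X.T @ hidden_delta
        b_hidden += lr * hidden_delta.sum(axis=0, keepdims=True)

    print(out.round(3).ravel())  # should approach [0, 1, 1, 0]
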
Comment:
I've identified a few errors/mistakes that will need to be corrected in an update, and I've done some exploratory coding for an MLP - I don't think it's practical at the implementation level without real arrays in a future version of Pine Script.