In my previous post (Non Linear Regression Example with Keras and Tensorflow Backend) I used a multivariable nonlinear function to demonstrate nonlinear regression with Keras. That function was of relatively low order, so in this post I use a standard nonlinear benchmark function (the Gauss3 function, available at https://www.itl.nist.gov/div898/strd/nls/data/gauss3.shtml ).
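For reference, the Gauss3 model on the NIST StRD page is an exponential background plus two overlapping Gaussian peaks. The sketch below generates data of that shape in plain Python; the parameter values are illustrative approximations I have rounded for this example, not the NIST certified estimates, so use the dataset's own values for real work.

```python
import math

# Gauss3 model form (NIST StRD): exponential decay plus two Gaussians.
#   y = b1*exp(-b2*x) + b3*exp(-(x-b4)^2 / b5^2) + b6*exp(-(x-b7)^2 / b8^2)
def gauss3(x, b1, b2, b3, b4, b5, b6, b7, b8):
    return (b1 * math.exp(-b2 * x)
            + b3 * math.exp(-((x - b4) ** 2) / b5 ** 2)
            + b6 * math.exp(-((x - b7) ** 2) / b8 ** 2))

# Illustrative parameters only (roughly in the range of the certified
# values on the NIST page); swap in the real estimates as needed.
params = (99.0, 0.011, 101.0, 111.6, 23.3, 73.7, 147.8, 19.7)

# Evaluate on x = 1..250, matching the scale of the NIST data.
xs = [float(x) for x in range(1, 251)]
ys = [gauss3(x, *params) for x in xs]
```

Plotting `ys` against `xs` shows the characteristic double-peak curve that the network has to fit.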
The example is fully developed in Google Colab and is accessible online. The code runs on the cloud (GPU) without any software installation. Just download the following files and upload them when prompted during execution.
Compared to my previous example, this one required careful model selection. The network has more layers than strictly necessary, but I left it that way to obtain the best fit on the training data. Since this is a known standard function, overfitting was not a concern.
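The post does not reproduce the model code inline, so here is a minimal sketch of the kind of deliberately deep fully connected regressor described above. The layer count and widths are my assumptions for illustration, not the exact architecture from the Colab notebook.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# A small but deliberately deep MLP for 1-D curve fitting.
# Widths/depth are illustrative, not the notebook's exact values.
model = keras.Sequential([
    keras.Input(shape=(1,)),
    layers.Dense(64, activation="relu"),
    layers.Dense(64, activation="relu"),
    layers.Dense(64, activation="relu"),
    layers.Dense(1),  # linear output for regression
])
model.compile(optimizer="adam", loss="mse")
```

Training would then be a call like `model.fit(x_train, y_train, epochs=500)`, with the Gauss3 samples as `x_train`/`y_train`.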
A good observation from this exercise was that the sigmoid and tanh activation functions did not perform well: the solution failed to converge due to the vanishing gradient problem. I initially expected both functions to be well suited to modelling nonlinear behaviour, but found that the ReLU activation function performed considerably better.
I’ll post more examples in the future.