Hi Supremegizmo,
You are free to set the data ranges in NN for optimization, testing, and production (out of sample). To avoid overfitting, it is not necessary to fix the weights at a constant value. The preferred way to verify the best NN model is to check its performance across all of the above data ranges.
If you find that your model looks good only during optimization but poor during testing (paper trade) and out of sample, it means your model is overfit.
I think the best way to check is to stop during optimization and look at the results in the other data ranges. In NSDT we can only see the model's performance after we stop the optimization, while in CH we can see the model's performance in all data ranges in real time.
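The check above can be sketched in code. This is a minimal illustration in plain Python, not how NSDT or CH do it internally; the 300/400/300 split, the `mse` helper, and the factor-of-two `tolerance` are all my own assumptions for the example.

```python
def split_ranges(data, n_train=300, n_test=400):
    """Split a bar series into optimization, testing, and out-of-sample ranges.
    The split sizes (300/400/300 of a 1000-bar set) are illustrative only."""
    train = data[:n_train]
    test = data[n_train:n_train + n_test]
    oos = data[n_train + n_test:]
    return train, test, oos

def mse(model, xs, ys):
    """Mean squared error of a one-step-ahead predictor (hypothetical helper)."""
    errs = [(model(x) - y) ** 2 for x, y in zip(xs, ys)]
    return sum(errs) / len(errs)

def looks_overfit(train_err, test_err, oos_err, tolerance=2.0):
    """Flag overfit when testing or out-of-sample error is much worse
    than optimization error; the 2x tolerance is an arbitrary choice."""
    return test_err > tolerance * train_err or oos_err > tolerance * train_err
```

So a model with, say, train error 1.0 but test error 5.0 would be flagged, while a model whose errors stay close across all three ranges would not.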
In your case, if you fix the weights to a constant value in the NN configuration, only the bias parameters in each NN layer are left to approximate your output. Hence it is not wise to fix the weight values in your NN model.
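To see why bias-only training is so limited, here is a toy sketch with a single neuron y = w*x + b, the weight frozen at 0.5 as in your question. This is my own hypothetical example, not your network: the target slope of 2 and the learning settings are arbitrary. No bias value can correct a wrong slope.

```python
def train_bias_only(xs, ys, w=0.5, lr=0.01, epochs=500):
    """Gradient descent on the bias alone; the weight w stays fixed at 0.5."""
    b = 0.0
    n = len(xs)
    for _ in range(epochs):
        # dMSE/db = 2/n * sum(w*x + b - y)
        grad = 2.0 / n * sum(w * x + b - y for x, y in zip(xs, ys))
        b -= lr * grad
    return b

xs = [0.0, 1.0, 2.0, 3.0]
ys = [2.0 * x for x in xs]            # target pattern has slope 2, not 0.5
b = train_bias_only(xs, ys)
residuals = [y - (0.5 * x + b) for x, y in zip(xs, ys)]
# The bias converges, but the slope error (1.5 * x) can never be removed:
# the model can only shift its output up or down, not reshape it.
```

The same logic applies to a recurrent network: freezing all weights at 0.5 means the network cannot learn the 300-bar pattern at all, because only the biases would adapt.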
[QUOTE=supremegizmo;2199446]Master Arry
I am testing recurrent networks for predicting FX moves. My whole data set has 1000 bars and I am training the network on 300 bars. Would it be wise to leave the weights as they are, i.e. 0.5 for all weights? The reason for this is I do not want to overfit the data. All I want is for the recurrent network to learn the 300-bar pattern, which I think is crucial for future moves.[/QUOTE]