Example of the problem:
Pattern one: [1, 10, 1], labelled 1
Pattern two: [10, 10, 10], labelled 0
From this extremely simple example we can see that a small value followed by a large increase and then a large decrease means it is a 1; otherwise it is a 0. Using something like a Genetic Algorithm I can write a system to learn these patterns.
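For reference, here is a hand-written sketch of the rule the learner is supposed to discover (the threshold `factor=5` is an arbitrary choice for illustration, not something from the actual data):

```python
def classify(seq, factor=5):
    """Label 1 if the middle value is at least `factor` times larger
    than both its neighbours (a spike), otherwise label 0."""
    a, b, c = seq
    return 1 if b >= factor * a and b >= factor * c else 0

print(classify([1, 10, 1]))    # 1  (small, big jump up, big drop down)
print(classify([10, 10, 10]))  # 0  (flat)
```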
But how do I normalize the numbers, or prevent the Genetic Algorithm from overfitting this data, so that the underlying pattern is learned? For example, if the set [500, 10000, 500] is suddenly encountered, it will not be recognized as a 1.
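One possible approach (a minimal sketch, not tied to any particular GA library) is to scale each pattern by its own maximum before it reaches the learner, so only the relative shape survives and the absolute magnitude is discarded:

```python
def normalize(seq):
    """Divide each value by the sequence's own maximum, mapping the
    pattern into [0, 1] so only its relative shape remains."""
    peak = max(seq)
    return [x / peak for x in seq]

# Both 'spike' patterns collapse to roughly the same shape:
print(normalize([1, 10, 1]))         # [0.1, 1.0, 0.1]
print(normalize([500, 10000, 500]))  # [0.05, 1.0, 0.05]
print(normalize([10, 10, 10]))       # [1.0, 1.0, 1.0]  -> flat, label 0
```

With that preprocessing, [1, 10, 1] and [500, 10000, 500] present nearly the same input to the evolved classifier, so the rule it learns on the first example should transfer to the second.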