Sunday, October 26, 2014
Parallel computing is especially useful for neural network training. I have trained some neural nets for as long as a week, and I have trained three or four of these at one time on three or four conventional computers. I have enough computers that I could train at least ten neural networks at a time, giving a factor-of-ten speedup. This is useful for training preprocessor networks, expert neural networks, voting neural network classifiers, modular neural networks, etc.
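Since each network trains independently, the speedup comes simply from running the jobs side by side, one per processor or machine. A minimal sketch in Python using the standard multiprocessing module (the train_network body here is a toy stand-in for a real training run, and the configs are invented for illustration):

```python
from multiprocessing import Pool

def train_network(config):
    """Stand-in for one full training run (a real run might take
    a week).  Here it just iterates a toy weight-update rule."""
    seed, epochs = config
    weight = float(seed)
    for _ in range(epochs):
        weight = 0.5 * weight + 1.0   # toy update; converges to 2.0
    return seed, weight

if __name__ == "__main__":
    # ten independent networks, ten hypothetical configurations
    configs = [(seed, 100) for seed in range(10)]
    with Pool() as pool:              # one worker per available core
        results = pool.map(train_network, configs)
    for seed, weight in results:
        print(seed, weight)
```

Because the runs share nothing, ten configurations on ten processors finish in roughly the time of one.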
My Asa H 2.0 artificial intelligence receives a stream of inputs and generates two output streams: one composed of physical actions taken in the world and the other a set of models that describe the world Asa finds itself in. Of the patterns Asa comes to recognize in its input stream, the majority are spatial; only a minority are temporal. The reverse is true of outputs: the majority of the physical outputs Asa learns are temporal patterns, and only a minority are spatial. See also my blog of 22 Sept. 2014.
Friday, October 24, 2014
Wednesday, October 22, 2014
In Asa H 2.0 light I typically use pattern length and frequency of pattern occurrence as components of a case's vector value/utility. (See my blog of 19 Feb. 2011.) A pattern's complexity can also be used in measuring its importance. It is not clear which measure of complexity to use, however: permutation entropy (Bandt and Pompe, Phys. Rev. Lett. 88, 174102, 2002)? Ke and Tong's measure (Phys. Rev. E, 2008)? Or something else?
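The first candidate is easy to compute: count the ordinal patterns of a given length in the series and take the Shannon entropy of their distribution. A minimal sketch of permutation entropy in Python (not Asa's code):

```python
from collections import Counter
from math import log2

def permutation_entropy(series, order=3):
    """Permutation entropy (Bandt and Pompe, 2002) of a 1-D series,
    in bits.  Counts the ordinal patterns of length `order` and
    returns the Shannon entropy of their relative frequencies."""
    counts = Counter()
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        # ordinal pattern: indices sorted by the values they point to
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] += 1
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

# a monotone series has a single ordinal pattern, hence entropy 0
print(permutation_entropy([1, 2, 3, 4, 5, 6]))   # 0.0
```

A perfectly ordered series scores zero; a series whose windows exhibit many different orderings scores higher, so the measure could serve directly as a complexity component of a case's utility.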
The various agents can be trained in a wide variety of specialties, and the agents may hold different values: one agent may value lifespan more than offspring, while another values offspring more than lifespan. One knowledgebase may contain cases that weight case length or complexity more heavily; another may weight frequency of pattern recurrence more. Such diversity will help the society deal with complex, time-varying environments. The society of agents will be more capable than a single agent.
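One way such value diversity could be realized is to give each agent its own weighting over the same case features. A hypothetical sketch (the feature values and weights below are invented for illustration, not Asa's actual numbers):

```python
def case_utility(length, frequency, complexity, weights):
    """Scalar value/utility of a case as a weighted sum of its
    measured features; the weights differ from agent to agent."""
    w_len, w_freq, w_cx = weights
    return w_len * length + w_freq * frequency + w_cx * complexity

# the same case, valued differently by two agents
agent_a = case_utility(5, 20, 1.5, weights=(1.0, 0.1, 0.0))  # favors length
agent_b = case_utility(5, 20, 1.5, weights=(0.1, 1.0, 0.0))  # favors frequency
print(agent_a, agent_b)   # 7.0 20.5
```

Agents ranking the same cases differently would then retain, and act on, different portions of the shared experience, giving the society its diversity.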
Sunday, October 19, 2014
One sort of proto-logic works by checking whether a subset of symbols is present in a certain set (Principles of Quantum Artificial Intelligence, Andreas Wichert, World Scientific, 2014, p. 31). The set is represented by a vector which is divided into sub-vectors. Asa searches for sub-vectors in this way, but it is satisfied with an approximate rather than an exact match.
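A sketch of this kind of approximate sub-vector test in Python (the window width, threshold, and symbols are hypothetical, not Asa's actual parameters): slide a window over the input and accept when the sought symbols are mostly, rather than exactly, covered:

```python
def match_fraction(subset, window):
    """Fraction of the sought symbols that appear in the window."""
    hits = sum(1 for s in subset if s in window)
    return hits / len(subset)

def approx_contains(sequence, subset, threshold=0.75, width=4):
    """Slide a window of `width` symbols over the sequence and
    accept the first window that approximately covers `subset`.
    threshold=1.0 recovers the exact (proto-logic) subset test."""
    for i in range(len(sequence) - width + 1):
        window = set(sequence[i:i + width])
        if match_fraction(subset, window) >= threshold:
            return True
    return False
```

Setting the threshold below 1.0 is what distinguishes Asa's behavior from the exact subset check: a sub-vector can fire even when some of its symbols are missing or corrupted.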