Coins, applied to neural networks
I have always been fascinated by the concept of Artificial Intelligence. I truly think that computerised brains are the next step of evolution, as biology (hindered, also, by outdated morals) will never develop or evolve quickly enough. But enough of that.
I was thinking the other day of artificial neural networks, the electronic analogue of a brain, and of the fact that for N neurons you get N*M connections, where M is the average number of connections a neuron makes, and the connections are in fact the important part.
Now, this is only an idea and I haven't thought it through, but if history is anything to go by, I will quickly forget it, so I'll just write it up. What if we could use a discrete set of values for describing neural connections? Kind of like when paying money: you make thousands of payments every day, but you use only a finite number of coin and banknote values. Assume you want to change a neural link: you want to weaken it to half its importance, or make it twice as strong. Imagine the link is a money value, like $1. Then you would weaken it to 50 cents or strengthen it to $2. But there is no two dollar bill, so you create another one dollar link instead.
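To make the coin analogy concrete, here is a minimal Python sketch. The denominations and function names are my own placeholders; the idea only requires that a link's value be restricted to a handful of fixed amounts, and that "doubling" a link at the largest denomination means adding a second link of the same value.

# Hypothetical "coin denominations" a link is allowed to be worth.
DENOMINATIONS = [0.25, 0.5, 1.0]

def strengthen(links):
    """Double a connection expressed as a list of coin values.

    If a link is already the largest coin, there is "no two dollar bill",
    so we add another link of the same value instead.
    """
    new_links = []
    for value in links:
        doubled = value * 2
        if doubled in DENOMINATIONS:
            new_links.append(doubled)
        else:
            new_links.extend([value, value])
    return new_links

def weaken(links):
    """Halve a connection; links below the smallest coin simply disappear."""
    new_links = []
    for value in links:
        halved = value / 2
        if halved in DENOMINATIONS:
            new_links.append(halved)
    return new_links

connection = [1.0]                 # a link worth one "dollar"
print(strengthen(connection))      # [1.0, 1.0]  -- no $2 bill, so two $1 links
print(weaken(connection))          # [0.5]       -- weakened to 50 cents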
The advantages of this, as far as I have thought it through, are that you would have 10,000 or more connections for each neuron, but only 6 or 10 types of them. So instead of performing 10,000 operations, you would perform only one operation per type. It also allows for different link operations, not only multiplications (which is what neural weights are normally used for). Maybe types 1 through 5 perform multiplications, but type 6 applies a logarithm or something.
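A rough sketch of what "one operation per type" could look like when a neuron sums its inputs. The type table and the choice of a logarithm for type 6 are made up for illustration; the point is only that inputs are grouped by link type first, and each type's operation is then applied once per group rather than once per connection.

import math
from collections import defaultdict

# Hypothetical link types: 1-5 multiply by a fixed weight, 6 takes a logarithm.
TYPE_WEIGHT = {1: 0.25, 2: 0.5, 3: 1.0, 4: 2.0, 5: 4.0}

def neuron_input(connections, activations):
    """connections: (source_index, link_type) pairs feeding one neuron.
    activations: output values of all source neurons."""
    # Gather the raw input per type: a cheap addition per connection...
    per_type = defaultdict(float)
    for src, link_type in connections:
        per_type[link_type] += activations[src]
    # ...then apply each type's operation exactly once per group.
    total = 0.0
    for link_type, summed in per_type.items():
        if link_type in TYPE_WEIGHT:
            total += TYPE_WEIGHT[link_type] * summed
        else:  # the hypothetical "logarithmic" type 6
            total += math.log1p(max(summed, 0.0))
    return total

activations = [0.2, 0.9, 0.5, 0.1]
links = [(0, 3), (1, 3), (2, 6), (3, 1)]   # the two type-3 links share one multiplication
print(neuron_input(links, activations))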
So the main advantage is that the entire structure can be scaled better. The only thing I can compare this with is a supermarket chain. You have thousands of customers and millions of transactions, but you have only three loyalty card types and only ten demographic groups that you compute your strategy on.
Now, of course, I will have to figure out the math of it, which I was never particularly good at, and the first problem I see is that each neuron will have to keep its own neighbourhood lists, and that might prove trickier than just noting each connection. However, since connections can only be of a certain type, neural data compresses much more easily and operations run faster, as each list functions like an index in a database.
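For the neighbourhood lists, I imagine something like the structure below: each neuron keeps one list of target neurons per link type, so the type (and with it the weight and operation) is stored once per list instead of once per connection. The layout is only a guess at what such an index could look like.

# Per-neuron neighbourhood lists keyed by link type (illustrative layout).
# The type carries the weight/operation, so each list stores bare neuron
# indices -- compact, and easy to compress or index like a database table.
neighbourhoods = {
    0: {3: [4, 7, 9],    # neuron 0 has type-3 links to neurons 4, 7 and 9
        1: [2, 5]},      # ...and type-1 links to neurons 2 and 5
    1: {6: [0, 9]},      # neuron 1 has type-6 links to neurons 0 and 9
}

def links_of(neuron):
    """Enumerate (target, link_type) pairs without storing per-link weights."""
    for link_type, targets in neighbourhoods.get(neuron, {}).items():
        for target in targets:
            yield target, link_type

print(list(links_of(0)))   # [(4, 3), (7, 3), (9, 3), (2, 1), (5, 1)]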
Maybe all this is something trivial for a neural network scientist or I am just full of useless thoughts, but I'll note them nonetheless.
Comments
You can start with something more lightweight, such as snails or mosquitoes.
Tudor