I spent most of today writing a neural net from scratch and implementing a genetic algorithm to allow it to learn. I didn't read any papers, but I got something surprisingly usable! I wrote a basic physics engine, and the net was able to pick up some basic orbital mechanics.

My eventual goal is to simulate von Neumann probes, which will be controlled by the neural nets.


@icedquinn I'm hardcoding the layout but they're evolving the weights

@stephen any particular reason you didn't go for gradient descent :blobcat3c:
@stephen i have it in my notes to try those again but replacing the mutator function with an HTM.

although then i remembered HTMs only work on sparse arrays, and genes are dense, which is going to make that complicated.
@stephen well the mutation function is where the magic happens. if you have a more clever one, they evolve faster :blobcat3csmirk:

@icedquinn That's fair, my current mutation function is so bad

My neural nets reproduce sexually, and each new neuron's weight is just the average of its parents' corresponding weights, times a random value between 0.99 and 1.01 🤢
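Not from Stephen's actual code, but the averaging scheme described above could be sketched like this (the function name and flat-list weight representation are illustrative):

```python
import random

def averaging_crossover(parent_a, parent_b):
    """Child weight = mean of the parents' corresponding weights,
    scaled by a random factor in [0.99, 1.01], as described above."""
    return [
        (wa + wb) / 2 * random.uniform(0.99, 1.01)
        for wa, wb in zip(parent_a, parent_b)
    ]
```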

@stephen :blobcatthink:

single point crossover: you pick a random split point, and all weights to the left come from parent a and all weights to the right from parent b.

mutation usually happens after crossover and just randomly changes some number of values by some amount.

i dunno what averaging them like that is called :blobcat: if it works, i guess.
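The single-point crossover and mutation steps icedquinn describes could be sketched like this (the `rate` and `scale` parameters are illustrative, not from the thread):

```python
import random

def single_point_crossover(parent_a, parent_b):
    # Pick a random split point; take weights left of it from
    # parent a and the rest from parent b.
    point = random.randrange(1, len(parent_a))
    return parent_a[:point] + parent_b[point:]

def mutate(genome, rate=0.1, scale=0.5):
    # After crossover, randomly perturb some fraction of the values
    # by a random amount.
    return [
        w + random.uniform(-scale, scale) if random.random() < rate else w
        for w in genome
    ]
```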

@icedquinn Yesterday I implemented uniform crossover and single-point crossover. I also moved from my weird averaging mutation to Gaussian mutation. It works a lot better now
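A minimal sketch of the two techniques Stephen switched to; the `rate` and `sigma` values are illustrative assumptions, not his actual parameters:

```python
import random

def uniform_crossover(parent_a, parent_b):
    # Each weight is taken independently from either parent
    # with equal probability.
    return [random.choice(pair) for pair in zip(parent_a, parent_b)]

def gaussian_mutate(genome, rate=0.1, sigma=0.1):
    # Add zero-mean Gaussian noise to a random subset of weights,
    # instead of scaling by a uniform factor.
    return [
        w + random.gauss(0.0, sigma) if random.random() < rate else w
        for w in genome
    ]
```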
