Important bug found: Bad context size for the ARC prize

For the ARC prize (https://arcprize.org/), today I:
- Fixed a bug that prevented my neural network from seeing the full input text of a sample: the context was 140 characters, but the samples were 211 characters long.
- Increased the depth of the network. To fit this on the NVIDIA A40 GPU on runpod.io, I changed num_layers from 4 to 8 and reduced batch_size from 1024 to 512.
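The truncation bug above can be caught with a simple length check before training. This is a hypothetical sketch, not the author's actual code: the variable names and the fixed context size of 256 are assumptions.

```python
# Hypothetical sketch: detect samples longer than the model's context window.
# With a 140-character context, a 211-character sample is silently cut off.
samples = ["x" * 211]  # stand-in for the 211-character ARC samples

context_size = 140  # the buggy value
truncated = [s for s in samples if len(s) > context_size]
assert truncated  # the bug: some samples do not fit

context_size = 256  # any value >= 211 lets the model see the whole sample
truncated = [s for s in samples if len(s) > context_size]
assert not truncated  # after the fix, nothing is cut off
```

A check like this could be run once over the dataset at startup so the bug fails loudly instead of silently degrading the model.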
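The second change trades batch size for depth: doubling num_layers roughly doubles per-sample activation memory, so halving batch_size keeps the GPU footprint about the same. A minimal sketch of that bookkeeping, with hypothetical config names (not the author's code):

```python
# Hypothetical config sketch of the depth-vs-batch trade-off on the A40.
old = {"num_layers": 4, "batch_size": 1024}
new = {"num_layers": 8, "batch_size": 512}

def activation_units(cfg):
    # Rough proxy: activation memory scales with layers * batch size.
    return cfg["num_layers"] * cfg["batch_size"]

# Doubling depth while halving batch size keeps the estimate constant.
assert activation_units(new) == activation_units(old)
```

This is only a first-order estimate; parameter memory still grows with the extra layers, so the real headroom on the GPU is slightly smaller.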
