Hacker News

It's even worse than the number of inputs/outputs, the number of neurons, efficiency, or directional feedback.

The brain also has plasticity! The connections between neurons change dynamically - an extra level of meta.



Connections between LLM neurons also change during training.


a) "during training" is a huuuuge asterisk

b) Do you have a citation for that? My understanding is that while some weights can go to zero and effectively be removed, no network architecture or training method actually used in prod allows arbitrary connections to form.
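To make the distinction concrete, here's a minimal sketch (plain NumPy gradient descent on one dense layer, not any specific LLM): training changes the weight *values*, but the connection pattern is fixed by the architecture up front. Every input unit is wired to every output unit before and after training; at most some weights drift toward zero.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(4, 3))     # dense layer: 4 inputs -> 3 outputs
x = np.array([1.0, -0.5, 0.25, 2.0])       # a fixed toy input
y_target = np.array([1.0, 0.0, -1.0])

for _ in range(100):
    y = x @ W
    # dL/dW for L = 0.5 * ||y - y_target||^2
    grad = np.outer(x, y - y_target)
    W -= 0.1 * grad                         # values change...

# ...but the connectivity is exactly what the architecture defined:
# still a dense 4x3 wiring, no connections added or removed.
print(W.shape)
```

So "connections change during training" really means "weight values change"; the graph itself (which units can talk to which) is a hyperparameter, not something gradient descent rewires.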



