SILOQY RNG needs "conditioning": Variance, etc.- #2
The implemented RNG (2) is "too perfect": its variance is too low. It is "better-than-NIST".-
THIS is BAD for certain ML functions: weight init needs symmetry breakage to actually train.-
SO a "conditioner" or variance-injector (or similar) needs to be developed.-
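A minimal sketch of what such a "conditioner" could look like, assuming the goal is to take the too-uniform RNG output and rescale it to a target standard deviation while guaranteeing no two units start identical (function name, jitter factor, and API are illustrative, not the prototype's):

```python
import numpy as np

def condition_variance(samples, target_std, rng=None):
    """Variance-inject low-variance RNG output for weight init.

    Centers the samples, rescales them to hit target_std, then adds a
    small Gaussian jitter so symmetry is broken even for degenerate
    (constant) inputs. The 1e-2 jitter factor is an assumption.
    """
    rng = rng or np.random.default_rng()
    x = np.asarray(samples, dtype=float)
    x = x - x.mean()                      # center
    std = x.std()
    if std > 0:
        x = x * (target_std / std)        # rescale to target variance
    # jitter guarantees symmetry breakage regardless of input
    return x + rng.normal(0.0, target_std * 1e-2, size=x.shape)
```

The rescale step dominates when the input already varies; the jitter term only matters for near-constant input, where it is the sole source of symmetry breakage.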
Prototype at:
PS: Might be applicable to "ML-based weight init" via a "meta" network (v. ChatGPT): that system ML-learned the best init weights based on a "suite" of stat-tests on incoming data, training a "meta" network ...
... that then provided a better starting-off point for downstream ML.-
https://chatgpt.com/g/g-p-68bf350bb9748191ad9debe95867c538-siloqy-dolphin/c/68e679cb-4ee8-832b-a814-23015788521b
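A hedged sketch of the "meta" idea above, under the assumption that the stat-test suite reduces to a feature vector and the meta-network maps it to an init scale — every name here (`stat_features`, `meta_init_std`, the feature set, the softplus head) is hypothetical, not from the linked prototype:

```python
import numpy as np

def stat_features(data):
    """Stand-in for the "suite of stat-tests": summary statistics
    of the incoming data (hypothetical feature set)."""
    d = np.asarray(data, dtype=float).ravel()
    return np.array([d.mean(), d.std(), np.abs(d).max(), (d ** 2).mean()])

def meta_init_std(features, meta_w, meta_b):
    """One-layer "meta" network mapping stat features to an init std.
    meta_w / meta_b would be learned offline; softplus keeps std > 0."""
    z = features @ meta_w + meta_b
    return float(np.log1p(np.exp(z)))

def init_layer(shape, data, meta_w, meta_b, rng=None):
    """Draw downstream weights with the meta-predicted std."""
    rng = rng or np.random.default_rng()
    std = meta_init_std(stat_features(data), meta_w, meta_b)
    return rng.normal(0.0, std, size=shape)
```

With untrained (zero) meta-weights this degenerates to a fixed Gaussian init; the claim in the PS is that training the meta stage on stat-test outcomes yields a better starting point than any fixed scheme.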
More code: