L2ashobby
@l2ashobby.bsky.social
Learning about machine learning
The universal approximation theorem states that a feedforward network with a single hidden layer, given enough width, can approximate any continuous function (on a compact set) to arbitrary accuracy. So in theory deeper nets aren't needed; in practice, though, experiments have shown real benefits to adding layers over just growing the hidden size.
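A minimal sketch of the single-hidden-layer idea (names and widths here are made up for illustration): fix random tanh hidden units, fit only the output layer by least squares, and watch the approximation error of sin(x) fall as the one hidden layer gets wider.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-np.pi, np.pi, 500)[:, None]
target = np.sin(x).ravel()

def approx_error(width):
    # One hidden layer of `width` tanh units with random weights/biases;
    # only the linear output layer is fit (ordinary least squares).
    W = rng.normal(scale=2.0, size=(1, width))
    b = rng.uniform(-np.pi, np.pi, size=width)
    H = np.tanh(x @ W + b)                              # hidden activations
    coef, *_ = np.linalg.lstsq(H, target, rcond=None)   # fit output layer
    return np.max(np.abs(H @ coef - target))            # worst-case error

# Width alone drives the error down, matching the theorem's "wide enough" claim.
print(approx_error(5), approx_error(200))
```

This is a toy random-features construction, not how real nets are trained, but it shows the theorem's flavor: capacity from width, no depth required.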
September 23, 2025 at 10:35 PM
so this is what being nerd-sniped feels like...
September 11, 2025 at 4:27 PM
"ups the contrast" - for values in [0, 1], squaring pushes small values even closer to 0, while values close to 1 stay close to 1, so the gap between small and large entries widens.
September 9, 2025 at 4:37 PM
This input-data approach falls under the broad family of data augmentation methods.
August 13, 2025 at 4:45 PM
KL divergence isolates the extra information needed, on average, to encode those P-based messages as a result of wrongly assuming distribution Q.
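That coding-overhead reading can be checked directly (the distributions below are made up for illustration): the cross-entropy H(P, Q) is the average code length you pay when the code is built for Q, H(P) is the optimal length, and KL(P || Q) is exactly the gap.

```python
import math

P = [0.5, 0.25, 0.25]   # true message distribution
Q = [0.25, 0.25, 0.5]   # wrongly assumed distribution

entropy = -sum(p * math.log2(p) for p in P)                   # H(P), optimal bits
cross_entropy = -sum(p * math.log2(q) for p, q in zip(P, Q))  # H(P, Q), bits paid
kl = sum(p * math.log2(p / q) for p, q in zip(P, Q))          # KL(P || Q)

# The extra bits per message from assuming Q: H(P, Q) - H(P) == KL(P || Q)
print(entropy, cross_entropy, kl)
```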
July 19, 2025 at 4:56 PM
Common-sense reasoning about uncertainty amounts to treating Bayesian probabilities as if they were frequentist probabilities.
July 17, 2025 at 1:20 AM