He/Kaiming Initialization
Master He (Kaiming) initialization, a weight initialization technique designed for ReLU networks that helps prevent vanishing and exploding gradients in deep architectures.
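For a ReLU layer, He initialization draws each weight from a zero-mean Gaussian with variance 2/fan_in, where fan_in is the number of inputs to the layer; this keeps the activation variance roughly constant from layer to layer. A minimal NumPy sketch, using a hypothetical `he_init` helper, might look like this:

```python
import numpy as np

def he_init(fan_in, fan_out, rng=None):
    # He/Kaiming normal initialization: weights drawn from N(0, 2 / fan_in),
    # which preserves activation variance through ReLU layers.
    rng = rng or np.random.default_rng(0)
    std = np.sqrt(2.0 / fan_in)
    return rng.normal(loc=0.0, scale=std, size=(fan_in, fan_out))

# Example: weight matrix for a 256 -> 128 ReLU layer
W = he_init(256, 128)
print(W.std())  # roughly sqrt(2 / 256) ~ 0.088
```

Deep learning frameworks expose the same rule directly (for example, PyTorch's `torch.nn.init.kaiming_normal_`), so in practice you rarely implement it by hand.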