The 10th International Congress on Information and Communication Technology, held concurrently with the ICT Excellence Awards (ICICT 2025), will take place in London, United Kingdom | February 18-21, 2025.
Authors - Bilal FAYE, Hanane AZZAG, Mustapha LEBBAH

Abstract - Batch Normalization (BN) enhances neural network generalization and accelerates training by normalizing mini-batches to a uniform mean and variance. However, its performance degrades when data come from diverse distributions. To overcome this, we introduce Supervised Batch Normalization (SBN), which extends normalization by maintaining multiple mean and variance parameters, one set per context identified prior to training. These contexts, defined explicitly (e.g., domains in domain adaptation) or implicitly (e.g., via clustering algorithms), ensure effective normalization for samples that share features. Experiments across single- and multi-task datasets demonstrate the superiority of SBN over BN and other normalization techniques. For example, integrating SBN with a Vision Transformer yields a 15.13% accuracy boost on CIFAR-100, while in domain adaptation scenarios, SBN with AdaMatch achieves a 22.25% accuracy gain on MNIST and SVHN compared to BN. Our code implementation is available on our GitHub repository: https://github.com/bfaye/supervised-batch-normalization.
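For intuition, here is a minimal PyTorch sketch of the idea the abstract describes: one set of batch-normalization statistics per pre-identified context, selected per sample at forward time. The class name SupervisedBatchNorm1d and the per-context layout are our own illustration, not the authors' released implementation (see the linked repository for that).

```python
import torch
import torch.nn as nn

class SupervisedBatchNorm1d(nn.Module):
    """Illustrative sketch of Supervised Batch Normalization (SBN):
    one set of normalization statistics per pre-identified context."""

    def __init__(self, num_features: int, num_contexts: int):
        super().__init__()
        # One BatchNorm1d (mean, variance, affine parameters) per context.
        self.bns = nn.ModuleList(
            nn.BatchNorm1d(num_features) for _ in range(num_contexts)
        )

    def forward(self, x: torch.Tensor, context: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_features); context: (batch,) integer context labels.
        out = torch.empty_like(x)
        for c in context.unique():
            mask = context == c
            # Normalize each sample with the statistics of its own context.
            out[mask] = self.bns[int(c)](x[mask])
        return out

# Example: two explicit contexts, e.g., source vs. target domain.
sbn = SupervisedBatchNorm1d(num_features=8, num_contexts=2)
x = torch.randn(16, 8)
context = torch.tensor([0] * 8 + [1] * 8)  # context label per sample
y = sbn(x, context)  # same shape as x
```

With a single context this reduces to standard BN; the gain comes from letting samples with shared features (a domain, a cluster) be normalized together rather than against a mixture of distributions.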