Naseer, M., & Shafique, M. (2023). Poster: Link between Bias, Node Sensitivity and Long-Tail Distribution in trained DNNs. In 2023 IEEE 16th International Conference on Software Testing, Verification and Validation (pp. 474–477). https://doi.org/10.1109/ICST57152.2023.00054
E191-01 - Research Unit Cyber-Physical Systems
E191-02 - Research Unit Embedded Computing Systems
-
Published in:
2023 IEEE 16th International Conference on Software Testing, Verification and Validation
-
ISBN:
978-1-6654-5666-1
-
Date (published):
2023
-
Event name:
16th IEEE International Conference on Software Testing, Verification and Validation (ICST 2023)
Event period:
16-Apr-2023 - 20-Apr-2023
-
Event location:
Ireland
-
Extent:
4 pages
-
Peer Reviewed:
Yes
-
Keywords:
Bias; Class-wise Performance; Deep Neural Networks (DNNs); Input Sensitivity; Robustness
Abstract:
Owing to their remarkable learning (and relearning) capabilities, deep neural networks (DNNs) are used in numerous real-world applications. However, these data-driven machine learning models are generally only as good as the data available for their training. Training datasets with a long-tail distribution therefore pose a challenge for DNNs, since networks trained on them may deliver a varying degree of classification performance across the different output classes. While the overall bias of such networks has already been highlighted in existing works, this work identifies the node bias that leads to a varying sensitivity of the nodes to different output classes. To the best of our knowledge, this is the first work to highlight this unique challenge in DNNs, discuss its probable causes, and present open challenges for this new research direction. We support our reasoning with an empirical case study of networks trained on a real-world dataset.
Research focus areas:
Computer Engineering and Software-Intensive Systems: 100%