Our aim is to provide researchers with a foundational resource that facilitates informed decision-making when selecting AFs for neural networks, recognizing that a more exhaustive exploration or detailed analysis would require a dedicated and focused effort beyond the scope of this comprehensive listing. The presented overview is limited to real-valued activation functions; complex-valued neural networks (e.g., [3-16]; brief overviews are available in [17,18]), bicomplex-valued neural networks (e.g., [19]), quaternion-valued neural networks (e.g., [20-24]), photonic neural networks (e.g., [25]), fuzzy neural networks (e.g., [26-31]), AFs for probabilistic Boolean logic (e.g., [32]), quantum AFs (e.g., [33]), and others are out of the scope of this work. We have chosen to categorize AFs into two main classes: fixed AFs (section 3) and adaptive activation functions (AAFs) (section 4); the latter have parameters that are trained alongside the other weights of the network.
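The distinction between the two classes can be illustrated with a minimal sketch (all names and values here are illustrative, not taken from the surveyed works): a fixed AF such as ReLU has no learnable parameters, whereas an adaptive AF such as a PReLU-style function carries a slope parameter `alpha` that is updated by gradient descent alongside the network's weights.

```python
def relu(x):
    """Fixed AF: max(0, x); its shape never changes during training."""
    return max(0.0, x)

def prelu(x, alpha):
    """Adaptive AF: the negative-region slope `alpha` is learned."""
    return x if x >= 0.0 else alpha * x

def prelu_grad_alpha(x, alpha):
    """Partial derivative of prelu(x, alpha) with respect to alpha."""
    return 0.0 if x >= 0.0 else x

# Toy training loop: adjust alpha to fit a single (input, target) pair
# under a squared-error loss, mimicking how AAF parameters are trained
# jointly with the rest of the network.
x, target = -2.0, -0.5      # the optimum is alpha = 0.25
alpha, lr = 0.1, 0.05
for _ in range(200):
    y = prelu(x, alpha)
    # chain rule: d(loss)/d(alpha) = 2 * (y - target) * d(y)/d(alpha)
    alpha -= lr * 2.0 * (y - target) * prelu_grad_alpha(x, alpha)

print(round(alpha, 2))  # converges toward 0.25
```

In a real network the same update would be applied by the optimizer to `alpha` together with all weight matrices; the fixed `relu` has no analogous parameter to update.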