Sign Language (SL) is a means of communication between people. It is essential for those who are speech impaired or hearing impaired and can be regarded as their mother tongue. Communication in SL is nonverbal and relies primarily on hand gestures. This study focuses on interpreting the Arabic Sign Alphabet (ASA); as a case study, recognition of the alphabet in Iraqi Sign Language (IrSL) is carried out with the help of specialists from the "Al-Amal Institute for the Deaf and Dumb". A new ASA dataset of various hand gestures was created and adopted. In addition, a deep learning model named Deep Arabic Sign Alphabet (DASA) is proposed, which builds on and extends the Convolutional Neural Network (CNN). The model interprets the ASA efficiently, achieving a high recognition accuracy of 95.25%.
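
To make the task concrete, the following is a minimal illustrative sketch of a CNN-based sign-alphabet classifier in PyTorch. It is not the DASA architecture described in this work; the input resolution (64x64 grayscale) and the number of classes (28) are assumptions chosen only for illustration.

```python
# Minimal CNN classifier sketch for sign-alphabet images (illustrative only;
# not the DASA architecture). Input size and class count are assumed values.
import torch
import torch.nn as nn

class SignAlphabetCNN(nn.Module):
    def __init__(self, num_classes: int = 28):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1),   # 1-channel grayscale input
            nn.ReLU(),
            nn.MaxPool2d(2),                               # 64x64 -> 32x32
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                               # 32x32 -> 16x16
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 128),
            nn.ReLU(),
            nn.Linear(128, num_classes),                   # one logit per alphabet sign
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

if __name__ == "__main__":
    model = SignAlphabetCNN()
    dummy = torch.randn(4, 1, 64, 64)                      # batch of 4 dummy images
    print(model(dummy).shape)                               # torch.Size([4, 28])
```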