Energy production and use are the largest contributors to global warming. Moreover, the push for electric vehicles and other economic developments is expected to further increase energy use. To address these challenges, electrical load forecasting is essential: it supports energy production planning and scheduling, assists with budgeting, and helps identify saving opportunities. Machine learning approaches commonly used for energy forecasting, such as feedforward neural networks and support vector regression, struggle to capture time dependencies. Consequently, this paper proposes a Sequence-to-Sequence Recurrent Neural Network (S2S RNN) with Attention for electrical load forecasting. The S2S architecture from language translation is adapted to load forecasting, and a corresponding sample generation approach is designed. The RNN captures time dependencies present in the load data, and the S2S model further improves time modeling by combining two RNNs: an encoder and a decoder. The attention mechanism alleviates the burden of connecting the encoder and decoder by allowing the decoder to draw on all encoder states rather than a single fixed-length summary. The experiments evaluate the attention mechanism with different RNN cells (vanilla RNN, LSTM, and GRU) and with varied time horizons. Results show that S2S with Bahdanau attention outperforms the other models. Accuracy decreases as the forecasting horizon increases; however, longer input sequences do not always improve accuracy.

INDEX TERMS Attention mechanism, gated recurrent units, GRU, load forecasting, long short-term memory, LSTM, recurrent neural networks, sequence-to-sequence networks.

LJUBISA SEHOVAC (Student Member, IEEE) received the B.Sc. degree in applied mathematics and the M.E.Sc. degree in software engineering, with a collaborative specialization in artificial intelligence
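To make the encoder-decoder-with-attention idea described in the abstract concrete, the sketch below shows a GRU-based sequence-to-sequence forecaster with Bahdanau-style additive attention. It is a minimal illustration written in PyTorch (an assumption; the abstract does not name a framework), and all class names, layer sizes, and the 60-step input window / 12-step horizon are hypothetical choices, not the authors' implementation.

    # Minimal sketch (not the authors' code): GRU seq2seq forecaster with
    # Bahdanau-style additive attention, assuming univariate load input.
    import torch
    import torch.nn as nn

    class Encoder(nn.Module):
        def __init__(self, input_dim=1, hidden_dim=64):
            super().__init__()
            self.gru = nn.GRU(input_dim, hidden_dim, batch_first=True)

        def forward(self, x):                      # x: (batch, in_len, input_dim)
            outputs, hidden = self.gru(x)          # outputs: (batch, in_len, hidden)
            return outputs, hidden                 # hidden: (1, batch, hidden)

    class BahdanauAttention(nn.Module):
        def __init__(self, hidden_dim=64):
            super().__init__()
            self.W_enc = nn.Linear(hidden_dim, hidden_dim, bias=False)
            self.W_dec = nn.Linear(hidden_dim, hidden_dim, bias=False)
            self.v = nn.Linear(hidden_dim, 1, bias=False)

        def forward(self, dec_hidden, enc_outputs):
            # dec_hidden: (batch, hidden); enc_outputs: (batch, in_len, hidden)
            score = self.v(torch.tanh(
                self.W_enc(enc_outputs) + self.W_dec(dec_hidden).unsqueeze(1)))
            weights = torch.softmax(score, dim=1)          # (batch, in_len, 1)
            context = (weights * enc_outputs).sum(dim=1)   # (batch, hidden)
            return context, weights

    class Decoder(nn.Module):
        def __init__(self, hidden_dim=64):
            super().__init__()
            self.attention = BahdanauAttention(hidden_dim)
            self.gru = nn.GRU(1 + hidden_dim, hidden_dim, batch_first=True)
            self.out = nn.Linear(hidden_dim, 1)

        def forward(self, last_value, hidden, enc_outputs):
            # last_value: (batch, 1) previous load; hidden: (1, batch, hidden)
            context, _ = self.attention(hidden[-1], enc_outputs)
            gru_in = torch.cat([last_value, context], dim=1).unsqueeze(1)
            output, hidden = self.gru(gru_in, hidden)
            return self.out(output.squeeze(1)), hidden     # prediction: (batch, 1)

    class Seq2SeqForecaster(nn.Module):
        def __init__(self, hidden_dim=64, horizon=12):
            super().__init__()
            self.encoder = Encoder(hidden_dim=hidden_dim)
            self.decoder = Decoder(hidden_dim=hidden_dim)
            self.horizon = horizon

        def forward(self, x):                              # x: (batch, in_len, 1)
            enc_outputs, hidden = self.encoder(x)
            step_input = x[:, -1, :]                       # last observed load value
            preds = []
            for _ in range(self.horizon):                  # autoregressive decoding
                step_input, hidden = self.decoder(step_input, hidden, enc_outputs)
                preds.append(step_input)
            return torch.stack(preds, dim=1)               # (batch, horizon, 1)

    # Usage: forecast 12 future load values from a 60-step input window.
    model = Seq2SeqForecaster(hidden_dim=64, horizon=12)
    window = torch.randn(8, 60, 1)                         # dummy batch of load windows
    forecast = model(window)
    print(forecast.shape)                                  # torch.Size([8, 12, 1])

At each decoding step, the attention module scores every encoder state against the current decoder state and feeds the resulting context vector, together with the previous load value, into the decoder GRU. This is the sense in which attention relieves the encoder of compressing the whole input window into a single fixed-length vector.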