Accurate stock market forecasting remains elusive due to the inherent complexity of financial system dynamics. While deep neural networks have shown promise, concerns persist about their robustness in capturing long-term dependencies. This work fuses nonlinear time series analysis with algorithmic advances in representation learning to strengthen predictive modeling. Phase space reconstruction (PSR) offers a principled way to recover a multidimensional phase space from single-variable measurements, elucidating the underlying dynamical evolution. Transformer networks with self-attention have recently driven state-of-the-art results in sequence modeling. This paper introduces PSR-Transformer Networks tailored for stock forecasting, in which PSR-derived delay-embedded representations are fed to Transformer encoders. Extensive empirical evaluation on 20 years of historical equities data demonstrates significant accuracy improvements and enhanced robustness over LSTM, CNN-LSTM, and standard Transformer baselines. The proposed interdisciplinary fusion establishes new performance benchmarks for modeling financial time series, validating the synergy between domain-specific reconstruction and cutting-edge deep learning.
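The abstract does not specify the embedding parameters, but the PSR step it describes is conventionally the time-delay (Takens) embedding of a scalar series. The following is a minimal sketch of that construction; the function name, embedding dimension, and delay are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Time-delay embedding: map a 1-D series x to `dim`-dimensional
    phase-space vectors with lag `tau` (Takens' construction).
    Hypothetical helper; the paper does not give its exact interface."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * tau
    if n <= 0:
        raise ValueError("series too short for chosen dim/tau")
    # Row i of the result is [x[i], x[i+tau], ..., x[i+(dim-1)*tau]]
    return np.stack([x[k * tau : k * tau + n] for k in range(dim)], axis=1)

# Toy example: embed a short price series with dim=3, tau=2
prices = np.arange(10.0)
vectors = delay_embed(prices, dim=3, tau=2)
print(vectors.shape)  # (6, 3)
```

Each row of `vectors` is one reconstructed phase-space point; a sequence of such rows is the kind of multidimensional input that could then be passed to a Transformer encoder in place of the raw scalar series.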