Machine learning (ML) algorithms have revolutionized the way we interpret data in astronomy, particle physics, biology, and even economics, since they can remove biases due to a priori chosen models. Here we apply a particular ML method, genetic algorithms (GA), to cosmological data that describe the background expansion of the Universe, namely the Pantheon Type Ia supernovae and the Hubble expansion history H(z) datasets. We obtain model-independent and nonparametric reconstructions of the luminosity distance d_L(z) and the Hubble parameter H(z) without assuming any dark energy model or a flat Universe. We then estimate the deceleration parameter q(z), a measure of the acceleration of the Universe, make a ~4.5σ model-independent detection of the accelerated expansion, and place constraints on the transition redshift of the acceleration phase (z_tr = 0.662 ± 0.027). We also find a deviation from ΛCDM at high redshifts, albeit within the errors, hinting at the recently alleged tension between the SnIa/quasar data and the cosmological constant (ΛCDM) model at high redshifts (z ≳ 1.5). Finally, we show that the GA can be used in complementary null tests of ΛCDM via reconstructions of the Hubble parameter and the luminosity distance.
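As a minimal illustration of the quantities discussed above, the deceleration parameter q(z) = -1 + (1+z) H'(z)/H(z) and its sign change at the transition redshift z_tr can be sketched for a toy flat ΛCDM expansion history. The parameter values (H0 = 70 km/s/Mpc, Ωm = 0.3) are illustrative assumptions, not the paper's GA-based reconstruction:

```python
import numpy as np

# Toy flat-LambdaCDM expansion history (illustrative values, not the paper's fit)
H0, Om = 70.0, 0.3        # H0 in km/s/Mpc; matter density parameter (assumed)
OL = 1.0 - Om             # flat Universe: Omega_Lambda = 1 - Omega_m

def H(z):
    """Hubble parameter H(z) for flat LambdaCDM."""
    return H0 * np.sqrt(Om * (1 + z)**3 + OL)

def q(z, eps=1e-5):
    """Deceleration parameter q(z) = -1 + (1+z) H'(z)/H(z), with H' by central differences."""
    dH = (H(z + eps) - H(z - eps)) / (2 * eps)
    return -1.0 + (1 + z) * dH / H(z)

# Locate the transition redshift z_tr where q changes sign (deceleration -> acceleration)
zs = np.linspace(0.0, 2.0, 20001)
qs = q(zs)
z_tr = zs[np.argmax(qs > 0)]  # first redshift where the expansion decelerates
print(f"q(0) = {q(0.0):.3f}, z_tr = {z_tr:.3f}")
```

For these assumed parameters the analytic transition redshift is z_tr = (2 Ω_Λ/Ω_m)^{1/3} - 1 ≈ 0.67, in the same ballpark as the model-independent value quoted above; q(0) ≈ -0.55 signals present-day acceleration.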