In free-space laser communications, an optical signal propagating through the atmosphere experiences power fluctuations and fading due to pointing inaccuracies and changes in the atmospheric refractive index. Knowing how the received signal evolves over time is valuable for the design of robust optical terminals, for the evaluation of mitigation techniques, and for the development of automatic repeat request (ARQ) and forward error correction (FEC) protocols. In this work, atmospheric effects are taken into account to generate numerical time series of received power for a ground-to-satellite (uplink) scenario. The procedure for generating the numerical series is described, and the generated series and their statistics are presented and compared with existing theory. Because channel characteristics may vary rapidly during links with satellites in lower orbits, an uplink to a low-Earth-orbit (LEO) satellite is selected as the scenario to illustrate how the channel characteristics change with satellite elevation and slew rate; however, this work is applicable to any uplink scenario. The results and analysis presented here can be used for link-budget design, interleaver dimensioning, delay analysis in ARQ protocols, development of FEC schemes, standardization, and evaluation of power-fading mitigation techniques.
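To make the notion of a received-power time series concrete, the following minimal sketch synthesizes a temporally correlated log-normal fading series, a common first-order model for scintillation-induced fluctuations on an optical uplink. This is not the generation procedure described in this work; the function name and all parameter values (sigma_ln, tau_c, fs, duration, mean_power_dbm) are illustrative assumptions only.

```python
# Minimal sketch (illustrative only, not the procedure of this work):
# a temporally correlated log-normal power-fading time series.
import numpy as np

def lognormal_fading_series(sigma_ln=0.3, tau_c=1e-3, fs=10e3, duration=1.0,
                            mean_power_dbm=-30.0, seed=0):
    """Return time vector [s] and received power [dBm].

    sigma_ln : std. dev. of log-amplitude fluctuations (assumed value)
    tau_c    : channel correlation time [s] (assumed value)
    fs       : sampling rate [Hz]
    duration : series length [s]
    """
    rng = np.random.default_rng(seed)
    n = int(duration * fs)
    # First-order Gauss-Markov (AR(1)) process imposes temporal correlation.
    rho = np.exp(-1.0 / (tau_c * fs))
    x = np.empty(n)
    x[0] = rng.normal()
    for k in range(1, n):
        x[k] = rho * x[k - 1] + np.sqrt(1.0 - rho**2) * rng.normal()
    # Log-normal power fluctuation with unit mean: P = exp(2*chi - 2*sigma^2).
    chi = sigma_ln * x                       # log-amplitude
    p_lin = np.exp(2.0 * chi - 2.0 * sigma_ln**2)
    p_dbm = mean_power_dbm + 10.0 * np.log10(p_lin)
    t = np.arange(n) / fs
    return t, p_dbm

if __name__ == "__main__":
    t, p = lognormal_fading_series()
    print(f"mean: {p.mean():.2f} dBm, std: {p.std():.2f} dB")
```

In practice, the statistics and correlation time of such a series would be driven by the link geometry (elevation, slew rate) and turbulence conditions rather than fixed constants as assumed above.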