In recent years, Riemannian stochastic variance-reduced gradient (R-SVRG) and Riemannian stochastic recursive gradient (R-SRG) methods have attracted considerable attention in Riemannian optimization. In this paper, we take as the new descent direction a linear combination of three descent directions on Riemannian manifolds, namely those of R-SRG, R-SVRG, and R-SGD, where the combination parameters are time-varying, and we propose two algorithms, called Riemannian Stochastic Hybrid Gradient (R-SHG) algorithms, with adaptive parameters and with time-varying parameters, respectively. In general, it is impossible to analyze the convergence of R-SRG alone, mainly because the conditional expectation of its descent direction is a biased estimator. Therefore, R-SHG with adaptive parameters uses the adaptive parameters to obtain a global convergence analysis under a decaying step size. For a fixed step size, we consider two cases, a fixed inner-loop length and a time-varying inner-loop length, and quantitatively analyze the convergence rate of the algorithm. Since the global convergence analysis requires stronger differentiability of the objective function, we also propose R-SHG with time-varying parameters. If Option 2 is chosen, similar conclusions can be obtained under weaker conditions, in both the decaying and the fixed step-size cases.
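
The hybrid descent direction described above can be sketched as follows; the notation ($v_t$ for the combined direction, $\alpha_t,\beta_t,\gamma_t$ for the time-varying weights, $R_{x_t}$ for a retraction, $\eta_t$ for the step size) is assumed for illustration and need not match the paper's:

```latex
% Sketch (assumed notation): the hybrid direction is a convex combination
% of the R-SRG, R-SVRG, and R-SGD stochastic directions at the iterate x_t,
% with time-varying weights summing to one.
\[
  v_t \;=\; \alpha_t\, v_t^{\mathrm{SRG}}
        \;+\; \beta_t\, v_t^{\mathrm{SVRG}}
        \;+\; \gamma_t\, v_t^{\mathrm{SGD}},
  \qquad \alpha_t + \beta_t + \gamma_t = 1,
\]
% The next iterate is obtained by a retraction step along -v_t:
\[
  x_{t+1} \;=\; R_{x_t}\!\left(-\eta_t\, v_t\right).
\]
```

Setting $(\alpha_t,\beta_t,\gamma_t)$ to $(1,0,0)$, $(0,1,0)$, or $(0,0,1)$ recovers R-SRG, R-SVRG, or R-SGD, respectively, so the hybrid scheme interpolates between the three base methods.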