It is known that incorporating gradient information can significantly enhance the prediction accuracy of stochastic kriging. However, this enhancement does not scale readily to high-dimensional design spaces, since one must invert a large covariance matrix that captures the spatial correlations between the responses and the gradient estimates at the design points. Not only is this inversion computationally expensive, but it is also numerically unstable, because the covariance matrix is often ill-conditioned. We address the scalability issue with a novel approach that does not resort to matrix approximations. By virtue of so-called Markovian covariance functions, the associated covariance matrix can be inverted analytically, thereby dramatically improving both efficiency and stability. Numerical experiments demonstrate that the proposed approach can handle large-scale problems where prior methods fail.
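As a hedged illustration of the structure that makes analytic inversion possible (the specific kernel and parameters below are assumptions for demonstration, not the paper's construction): in one dimension, the exponential kernel corresponds to a Markov (Ornstein-Uhlenbeck) process, so the precision matrix of the responses at sorted design points is exactly tridiagonal and admits a closed form. The sketch below verifies this tridiagonal structure numerically.

```python
import numpy as np

# Hypothetical 1D example: the exponential (Ornstein-Uhlenbeck) kernel
# k(x, y) = exp(-|x - y| / ell) is Markovian, so the inverse of the
# covariance matrix at sorted design points is tridiagonal.
n = 8
x = np.linspace(0.0, 10.0, n)      # sorted design points (assumed layout)
ell = 2.0                          # length-scale parameter (assumed value)
K = np.exp(-np.abs(x[:, None] - x[None, :]) / ell)
P = np.linalg.inv(K)               # precision matrix

# Entries more than one position off the diagonal vanish up to round-off,
# confirming the tridiagonal (Markov) structure.
mask = np.abs(np.subtract.outer(np.arange(n), np.arange(n))) > 1
print(np.max(np.abs(P[mask])) < 1e-8)
```

In higher dimensions and with gradient information included, exploiting such Markov structure analytically is the kind of property the abstract alludes to; the example above only shows the simplest one-dimensional case.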