Conjugate gradient methods are important first-order optimization algorithms both in Euclidean spaces and on Riemannian manifolds. However, while various types of Riemannian conjugate gradient methods have been studied, their iteration complexity has remained unanalyzed. This paper proposes a novel Riemannian conjugate gradient method for nonconvex problems with an iteration complexity guarantee. In particular, we introduce a novel restart condition that ensures a decrease in the objective function at each iteration. Our method converges to an ϵ-stationary point with an iteration complexity of O(ϵ⁻²). To the best of our knowledge, this is the first Riemannian conjugate gradient method with an iteration complexity guarantee. Numerical experiments on Rayleigh-quotient and Brockett-cost-function minimization problems demonstrate the efficiency and practical applicability of the proposed method.
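To make the setting concrete, the following is a minimal sketch (not the paper's algorithm) of a Riemannian conjugate gradient iteration with a steepest-descent restart safeguard, applied to Rayleigh-quotient minimization on the unit sphere. The function name `rayleigh_cg`, the Fletcher–Reeves coefficient, the projection-based vector transport, and the backtracking line search are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def rayleigh_cg(A, x0, max_iter=500, tol=1e-8):
    """Minimize f(x) = x^T A x on the unit sphere (Rayleigh quotient
    for symmetric A) with a Riemannian CG sketch.  Illustrative only:
    the restart rule here is the classical descent-direction check,
    not the paper's novel restart condition."""
    x = x0 / np.linalg.norm(x0)
    # Riemannian gradient: Euclidean gradient projected to the tangent space
    grad = lambda x: 2.0 * (A @ x - (x @ A @ x) * x)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking (Armijo) line search along d, with retraction
        # R_x(t d) = (x + t d) / ||x + t d|| onto the sphere
        t, f, slope = 1.0, x @ A @ x, g @ d
        while True:
            x_new = x + t * d
            x_new /= np.linalg.norm(x_new)
            if x_new @ A @ x_new <= f + 1e-4 * t * slope or t < 1e-12:
                break
            t *= 0.5
        g_new = grad(x_new)
        # Vector transport: project old direction onto the new tangent space
        d_t = d - (x_new @ d) * x_new
        beta = (g_new @ g_new) / (g @ g)  # Fletcher-Reeves coefficient
        d_new = -g_new + beta * d_t
        # Restart with steepest descent if d_new fails to be a descent direction
        if g_new @ d_new >= 0:
            d_new = -g_new
        x, g, d = x_new, g_new, d_new
    return x, x @ A @ x

A = np.diag([1.0, 2.0, 5.0])
x, val = rayleigh_cg(A, np.ones(3))
# val approximates the smallest eigenvalue of A (here, 1.0)
```

The returned value approaches the smallest eigenvalue of A because stationary points of the Rayleigh quotient on the sphere are eigenvectors, and descent iterations escape the non-minimal ones.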