In this paper, we study the problem of low-rank Hankel matrix completion, which aims to recover a partially observed Hankel matrix under the prior knowledge that it is
low-rank. This problem finds important applications in signal processing, systems control, and machine learning. Existing approaches based on convex relaxation suffer from high computational and memory costs when the problem dimension is large. Motivated by the recent success of nonconvex statistical estimation, we directly estimate the memory-efficient low-rank factors by using projected gradient descent to minimize a loss function that penalizes deviations from both the Hankel structure and the observations. Despite nonconvexity, we show that with a properly designed initialization, projected gradient descent provably converges to the ground truth at a linear rate that is much faster than prior art, as soon as the sample size is sufficiently large. Our results apply directly to the completion of low-rank Toeplitz and multi-dimensional Hankel matrices. In addition, we show that the reconstruction is stable in the presence of bounded noise.
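To make the approach concrete, the following is a minimal toy sketch of the kind of scheme the abstract describes: gradient descent on low-rank factors of a loss that penalizes both the misfit on observed entries and the deviation from Hankel structure, started from a spectral initialization. All specifics here (the cosine test signal, the sampling of matrix entries rather than signal samples, the step size, the equal weighting of the two penalty terms) are illustrative assumptions, not the paper's exact algorithm or parameter choices.

```python
import numpy as np

def hankel_from_signal(x, n1):
    """Build the n1 x n2 Hankel matrix whose (i, j) entry is x[i + j]."""
    n2 = len(x) - n1 + 1
    return np.array([[x[i + j] for j in range(n2)] for i in range(n1)])

def hankel_projection(M):
    """Orthogonal projection onto Hankel matrices: average each anti-diagonal."""
    n1, n2 = M.shape
    x = np.empty(n1 + n2 - 1)
    for k in range(n1 + n2 - 1):
        i = np.arange(max(0, k - n2 + 1), min(n1, k + 1))
        x[k] = M[i, k - i].mean()
    return hankel_from_signal(x, n1)

# Ground truth: Hankel matrix of cos(0.3 t), a rank-2 spectrally sparse signal.
n, n1, r = 31, 16, 2
x_true = np.cos(0.3 * np.arange(n))
Y = hankel_from_signal(x_true, n1)

# Observe each matrix entry independently with probability p (toy sampling model).
rng = np.random.default_rng(0)
p = 0.7
mask = rng.random(Y.shape) < p

# Spectral initialization: rank-r SVD of the zero-filled, rescaled observations.
U, s, Vt = np.linalg.svd(mask * Y / p, full_matrices=False)
L = U[:, :r] * np.sqrt(s[:r])
R = Vt[:r].T * np.sqrt(s[:r])

eta = 0.2 / s[0]  # step size scaled by the leading singular value (heuristic)
err0 = np.linalg.norm(L @ R.T - Y) / np.linalg.norm(Y)

for _ in range(500):
    M = L @ R.T
    # Gradient of 0.5*||P_Omega(M - Y)||_F^2 + 0.5*||M - P_H(M)||_F^2 w.r.t. M,
    # pushed through the factorization M = L R^T.
    G = mask * (M - Y) + (M - hankel_projection(M))
    L, R = L - eta * G @ R, R - eta * G.T @ L

err = np.linalg.norm(L @ R.T - Y) / np.linalg.norm(Y)
print(f"relative error: {err0:.3f} -> {err:.2e}")
```

In this sketch the structure penalty uses the orthogonal projection onto Hankel matrices (anti-diagonal averaging), so its gradient with respect to M is simply M minus its Hankel projection; the per-iteration cost is dominated by the thin matrix products with the rank-r factors, which is what makes the factored parameterization memory-efficient.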