Abstract

People make surprising but reliable perceptual errors. Here, we provide a unified explanation for errors in the perception of three-dimensional (3D) motion. To do so, we characterized the retinal motion signals produced by objects moving along arbitrary trajectories through arbitrary locations in 3D space. Next, we developed a Bayesian model that treats 3D motion perception as optimal inference given sensory noise and the geometry of 3D viewing. The model predicts a wide array of systematic perceptual errors that depend on stimulus distance, contrast, and eccentricity. We then used a virtual reality (VR) headset as well as a standard 3D display to test these predictions in both traditional psychophysical and more naturalistic settings. We found evidence that people make many of the predicted errors, including a lateral bias in the perception of motion trajectories; a dependence of this bias on stimulus contrast, viewing distance, and eccentricity; and a surprising tendency to misreport approaching motion as receding and vice versa. In sum, we developed a quantitative model that provides a parsimonious account of a range of systematic misperceptions of motion in naturalistic environments.
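The Bayesian logic summarized above can be illustrated with a toy one-dimensional-per-component sketch (not the paper's actual model). Assuming Gaussian sensory noise and a zero-mean Gaussian "slow motion" prior, the MAP estimate shrinks each velocity component toward zero; if the signal for motion in depth is noisier than the signal for lateral motion, the estimated trajectory rotates toward the frontoparallel plane, producing a lateral bias. All noise magnitudes below are illustrative assumptions.

```python
import math

def map_velocity(observed, sigma_obs, sigma_prior):
    """MAP estimate under a Gaussian likelihood and a zero-mean
    Gaussian prior: the observation is shrunk toward zero by a
    reliability-dependent weight."""
    w = sigma_prior**2 / (sigma_prior**2 + sigma_obs**2)
    return w * observed

# True 3D velocity: equal lateral (x) and depth (z) components,
# i.e. a 45-degree trajectory relative to frontoparallel.
vx_true, vz_true = 1.0, 1.0

# Illustrative assumption: the depth component is measured with
# much more noise than the lateral component.
est_x = map_velocity(vx_true, sigma_obs=0.2, sigma_prior=1.0)
est_z = map_velocity(vz_true, sigma_obs=1.0, sigma_prior=1.0)

angle_true = math.degrees(math.atan2(vz_true, vx_true))  # 45 degrees
angle_est = math.degrees(math.atan2(est_z, est_x))       # rotated toward lateral
```

Because the noisier depth component is shrunk more strongly than the lateral one, `angle_est` falls below `angle_true`: the perceived trajectory is biased toward lateral motion, and raising the noise (e.g., lowering contrast) strengthens the bias.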