Recently, many studies have been conducted on unmanned aerial vehicles (UAVs) that perform position control using camera images. Measuring the surrounding environment and the position of the mobile robot is important for controlling a UAV. The distance and direction of the optical ray to an object can be obtained from the object's diameter and coordinates in the image. These studies use various camera systems, including perspective cameras, fisheye cameras, and omnidirectional cameras. Because these camera systems have different geometrical optics, no single image-based position measurement method can yield position and posture for all of them. Therefore, we propose a new method that measures position from the size of three-dimensional landmarks using omnidirectional cameras. The omnidirectional cameras perform three-dimensional measurement using the distance and direction to the object. Because the method measures three-dimensional position from the direction and distance of the ray, it can also be applied to cameras with different optical systems, provided the optical path, including any reflection or refraction, is known. In this study, we construct a method that obtains the relative position and relative posture required for self-position estimation with respect to an object using an omnidirectional camera, and we verify the method experimentally.
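The core geometric idea, recovering the ray direction from the pixel coordinates and the distance from the apparent diameter of a landmark of known size, can be sketched as follows. This is only an illustrative sketch: the equidistant fisheye projection, the focal length `f`, the principal point `(cx, cy)`, and the spherical-landmark assumption are assumptions for the example, not the calibrated model used in the study.

```python
import numpy as np

def ray_direction_equidistant(u, v, cx, cy, f):
    """Unit ray direction for pixel (u, v), assuming an equidistant
    fisheye projection r = f * theta (illustrative model only)."""
    dx, dy = u - cx, v - cy
    r = np.hypot(dx, dy)
    theta = r / f                 # angle from the optical axis
    phi = np.arctan2(dy, dx)      # azimuth in the image plane
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])

def landmark_position(u, v, diam_px, diam_m, cx, cy, f):
    """Relative 3-D position of a spherical landmark of known real
    diameter diam_m whose image diameter is diam_px pixels."""
    # Angular diameter subtended by the landmark under the same
    # equidistant model (angle proportional to image distance)
    ang = diam_px / f
    # Distance from angular size: d = D / (2 * tan(ang / 2))
    dist = diam_m / (2.0 * np.tan(ang / 2.0))
    # Scale the unit ray by the recovered distance
    return dist * ray_direction_equidistant(u, v, cx, cy, f)
```

For a landmark imaged at the principal point, the ray points along the optical axis and the position reduces to the distance alone; off-axis landmarks pick up the direction computed from their pixel coordinates.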