When rendering the visual scene for near-eye head-mounted displays, accurate knowledge of the geometry of the displays, scene objects, and eyes is required to generate the binocular images correctly. Despite careful design and calibration efforts, these quantities are subject to positional and measurement errors, resulting in some misalignment of the images projected to each eye. Previous research investigated these effects in virtual reality (VR) setups, where such misalignment triggered symptoms including eye strain and nausea. This work aimed to investigate the effects of binocular vertical misalignment (BVM) in see-through augmented reality (AR). In such devices, two conflicting environments coexist. One is the real world, which lies in the background and forms geometrically aligned images on the retinas. The other is the augmented content, which stands out as foreground and might be subject to misalignment. We simulated a see-through AR environment using a standard three-dimensional (3D) stereoscopic display, giving us full control over, and high accuracy of, the real and augmented content. Participants performed a visual search task that forced them to alternately interact with the real and the augmented content while being exposed to different amounts of BVM. The measured eye posture indicated that the compensation for vertical misalignment is shared equally between the sensory (binocular fusion) and the motor (vertical vergence) components of binocular vision. Sensitivity varied across participants, both in perceived discomfort and in misalignment tolerance, suggesting that a per-user calibration might be useful for a comfortable visual experience.
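To make the "equally shared" result concrete, the compensation can be viewed as an additive split of the imposed misalignment into a motor and a sensory term; this is a minimal sketch of the decomposition implied by the eye-posture finding, and the symbols \(\delta_{\mathrm{BVM}}\), \(\delta_{\mathrm{motor}}\), and \(\delta_{\mathrm{sensory}}\) are illustrative notation, not taken from the original text:

\[
\delta_{\mathrm{BVM}} \;=\; \underbrace{\delta_{\mathrm{motor}}}_{\text{vertical vergence}} \;+\; \underbrace{\delta_{\mathrm{sensory}}}_{\text{binocular fusion}},
\qquad
\delta_{\mathrm{motor}} \;\approx\; \delta_{\mathrm{sensory}} \;\approx\; \tfrac{1}{2}\,\delta_{\mathrm{BVM}}.
\]

Under this reading, roughly half of the imposed vertical offset is absorbed by an observable change in eye posture (vertical vergence), and the remaining disparity is fused sensorially without further eye movement.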