A core challenge in perception is recognizing objects across the highly variable retinal input that arises when objects are viewed from different directions (e.g., front vs. side views). It has long been known that certain views are of particular importance, but it remains unclear why. A philosophical and scientific debate has raged over whether object perception retains a ‘perspectival’ aspect (1–5), whereby an object’s proximal appearance might influence perceptual judgments. We reasoned that characterising the computations underlying visual comparisons between objects could explain the privileged status of certain views, and potentially resolve the debate. We measured pose discrimination for a wide range of objects, finding large variations in performance depending on the object and the view angle, with front and back views yielding particularly good discrimination. Strikingly, a simple and biologically plausible computational model based on measuring the projected 3D optical flow between views of objects accurately predicted both successes and failures of discrimination performance. This provides a unifying account of why certain views have a privileged status, spanning both poles of the debate. Shifts between corresponding locations on objects are estimated in 3D, but are then projected into the image plane, leading to the observed ‘perspectival’ effects in pose discrimination performance.

Significance statement

We show that qualitatively special viewpoints of 3D objects can be predicted by an optical-flow model that measures how points on an object’s surface shift in the image as viewpoint changes. This provides a unifying, quantitative account of why some viewpoints of objects are perceptually special, and a potential resolution to the ongoing debate over whether our percepts of the 3D world have a perspectival aspect.
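The core computation described above — estimating 3D shifts between corresponding surface locations and then projecting them into the image plane — can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes an orthographic projection, represents the object as a point cloud, and uses a hypothetical function name `projected_flow`; it simply shows how a projected-flow magnitude could be computed for a small change in pose.

```python
import numpy as np

def projected_flow(points, axis, d_theta):
    """Mean image-plane flow magnitude when `points` (N x 3) rotate by
    `d_theta` radians about the unit vector `axis` (Rodrigues' formula).
    Illustrative sketch only; assumes orthographic projection onto x-y."""
    axis = np.asarray(axis, dtype=float)
    axis /= np.linalg.norm(axis)
    # Cross-product matrix of the rotation axis
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    # Rotation matrix via Rodrigues' formula
    R = np.eye(3) + np.sin(d_theta) * K + (1.0 - np.cos(d_theta)) * (K @ K)
    shift_3d = points @ R.T - points   # 3D shift of each surface point
    flow_2d = shift_3d[:, :2]          # project into the image plane (drop depth)
    return np.linalg.norm(flow_2d, axis=1).mean()

# Toy "object": points sampled on a unit sphere
rng = np.random.default_rng(0)
pts = rng.normal(size=(500, 3))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)

# Predicted flow for a small rotation about the vertical axis
print(projected_flow(pts, axis=[0, 1, 0], d_theta=np.deg2rad(5)))
```

On this sketch, views for which a pose change produces large projected flow would be predicted to support better pose discrimination than views where the 3D shifts collapse along the line of sight.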