Editorial

SENSOR FUSION

Photogrammetry does not exist in a bubble. A variety of techniques compete with traditional image-based photogrammetry. During the first decade of this century it seemed that traditional photogrammetry would be eclipsed by laser scanning, but its re-emergence as a competitive technique is testament to its longevity. Laser scanning (lidar) is, of course, not the only technology that competes with photogrammetric camera images: synthetic-aperture radar and multispectral and hyperspectral scanning are further examples. However, instead of treating these alternatives as rivals, what if they are embraced as complementary technologies that, in combination with conventional photogrammetry, can produce an improved overall outcome? This is the fundamental idea of sensor fusion. Although introductions to the subject often use the example of sensors for autonomous vehicles, the topic permeates various strands throughout our own discipline and beyond.

Photogrammetry has always incorporated other techniques in order to achieve workable solutions to its primary goal of generating accurate 3D data from imagery. In certain close-range applications it is possible (even for non-specialists using "black-box" structure-from-motion (SfM) software) to produce a 3D model from a series of overlapping photographs alone, with no other data. Such a model, however, will be in an arbitrary coordinate system and will not even have a scale attached. Most applications therefore require some form of external (non-photogrammetric) information to allow the model to be scaled, translated and rotated to a defined coordinate system (reference frame), perhaps using selected control data to amend the photogrammetric model (for example, in a bundle adjustment). The photogrammetric output may also be augmented by additional information (such as missing detail or semantic information).
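The scaling, translation and rotation mentioned above amount to a seven-parameter similarity (Helmert) transformation estimated from control points. As a minimal sketch only (the function name `similarity_transform` is a hypothetical helper, not from any particular package), the closed-form least-squares solution of Umeyama can be written with NumPy as:

```python
import numpy as np

def similarity_transform(src, dst):
    """Estimate scale s, rotation R and translation t such that
    dst ≈ s * (R @ src) + t, in the least-squares sense (Umeyama method).

    src, dst: (n, 3) arrays of corresponding control-point coordinates,
    e.g. model coordinates and their surveyed reference-frame coordinates."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    n, dim = src.shape

    # Centroids of both point sets.
    mu_src, mu_dst = src.mean(axis=0), dst.mean(axis=0)

    # Cross-covariance between the centred point sets.
    cov = (dst - mu_dst).T @ (src - mu_src) / n

    # SVD gives the optimal rotation; D guards against a reflection.
    U, S, Vt = np.linalg.svd(cov)
    d = np.sign(np.linalg.det(U @ Vt))
    D = np.diag([1.0] * (dim - 1) + [d])
    R = U @ D @ Vt

    # Scale from the variance of the source set; then the translation.
    var_src = ((src - mu_src) ** 2).sum() / n
    s = np.trace(np.diag(S) @ D) / var_src
    t = mu_dst - s * (R @ mu_src)
    return s, R, t
```

Applying `dst_est = s * src @ R.T + t` then maps arbitrary model coordinates into the defined reference frame; in practice the same control points would also feed a bundle adjustment rather than a one-shot transformation.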
Such additional data may range from a simple scale rule, at one end of the spectrum, to a variety of observations from modern instrumentation, especially data from global navigation satellite systems (GNSS). Even in the formative days of our discipline, Laussedat used spirit levels, tapes and theodolites, in addition to his photographic plates, to produce his mid-19th-century maps and building façades. As another example, most terrestrial laser scanner (TLS) surveys will use a total station to determine the station positions (the two technologies are now being integrated into a single instrument by some manufacturers). A related topic is data fusion and dataset conflation, where some authors distinguish various stages, such as integration followed by fusion. Even if disparate sensors can be considered integrated in traditional photogrammetric solutions, they frequently do not demonstrate fusion (which is often distinguished by the component datasets no longer maintaining their separate identities).

Sensor fusion is a current "hot topic" in geomatics and beyond. There are many areas, including the Internet of Things ...