In this paper I investigate the construction and the properties of the so-called marginal perspective cost H, a function related to Optimal Entropy-Transport problems that is obtained by a minimization procedure involving a cost function c and an entropy function. In the pure entropic case, which corresponds to the choice c = 0, the function H naturally produces a symmetric divergence. I consider various examples of entropies and compute the induced marginal perspective function, recovering some well-known functionals such as the Hellinger distance, the Jensen-Shannon divergence and the Kullback-Leibler divergence. I discuss the metric properties of these functions and highlight the important role of the so-called Matusita divergences. In the entropy-transport case, starting from the power-like entropy F_p(s) = (s^p − p(s − 1) − 1)/(p(p − 1)) and the cost c = d^2 for a given metric d, the main result of the paper ensures that, for every p > 1, the induced marginal perspective cost H_p is the square of a metric on the corresponding cone space.
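The pure entropic case can be illustrated numerically. The sketch below is not the paper's derivation: it assumes the convention, standard in the Optimal Entropy-Transport literature, that for c = 0 the marginal perspective function is H(r1, r2) = inf over θ > 0 of θ[F(r1/θ) + F(r2/θ)], and it uses the logarithmic entropy F(s) = s log s − s + 1 (the p → 1 limit of the power-like entropies F_p above). Under these assumptions a brute-force minimization over θ reproduces (twice) the pointwise Jensen-Shannon divergence, one of the functionals listed in the abstract; all function names are illustrative.

```python
import math

def F(s: float) -> float:
    # Logarithmic entropy F(s) = s*log(s) - s + 1, the p -> 1 limit of
    # the power-like entropies F_p(s) = (s^p - p(s-1) - 1)/(p(p-1)).
    return s * math.log(s) - s + 1.0

def marginal_perspective(r1: float, r2: float, n: int = 200_000) -> float:
    # Pure entropic case c = 0: minimize the sum of perspective functions
    #   theta -> theta*F(r1/theta) + theta*F(r2/theta)
    # over theta > 0 by a brute-force grid search (an illustrative sketch,
    # not the minimization procedure used in the paper).
    best = float("inf")
    for k in range(1, n + 1):
        theta = 5.0 * k / n  # grid on (0, 5]
        val = theta * F(r1 / theta) + theta * F(r2 / theta)
        best = min(best, val)
    return best

r1, r2 = 0.7, 2.3
h = marginal_perspective(r1, r2)
# Pointwise Jensen-Shannon divergence of the masses r1, r2 (natural log),
# with midpoint m = (r1 + r2) / 2; under the assumptions above, H = 2*JS.
m = 0.5 * (r1 + r2)
js = 0.5 * (r1 * math.log(r1 / m) + r2 * math.log(r2 / m))
print(h, 2.0 * js)  # the two values agree to grid accuracy
```

The symmetry of H in (r1, r2) is visible directly from the formula being minimized, matching the abstract's remark that the pure entropic case yields a symmetric divergence.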