A new class of matrix support functionals is presented that establishes a connection between optimal value functions for quadratic optimization problems, the matrix-fractional function, the pseudo-matrix-fractional function, the nuclear norm, and multitask learning. The support functional is based on the graph of the product of a matrix with its transpose. Closed-form expressions for the support functional and its subdifferential are derived. In particular, the support functional is shown to be continuously differentiable on the interior of its domain, and a formula for its derivative is given where it exists.
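For orientation, two of the objects named above have standard definitions in the convex-analysis literature; the notation below is ours, not taken from the abstract. The support functional of a nonempty set $S$ in a Euclidean space $E$, and the (classical) matrix-fractional function on $\mathbb{R}^{n\times m}\times\mathbb{S}^n_{++}$, are commonly written as

```latex
\[
\sigma_S(z) \;=\; \sup_{s \in S}\, \langle z, s \rangle ,
\qquad
\varphi(X, V) \;=\; \tfrac{1}{2}\,\operatorname{tr}\!\bigl(X^{\mathsf T} V^{-1} X\bigr),
\quad V \succ 0 ,
\]
```

where $\mathbb{S}^n_{++}$ denotes the symmetric positive definite $n\times n$ matrices. The support functionals studied here are of the form $\sigma_S$ with $S$ taken to be a graph of the map $Y \mapsto YY^{\mathsf T}$, which is how the connection to $\varphi$ and to the nuclear norm arises.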