We study the probability that a monic polynomial with integer coefficients has a low-degree factor over the integers, which is equivalent to having a root that is an algebraic number of low degree. It is known in certain cases that random polynomials with integer coefficients are very likely to be irreducible, and our work can be viewed as part of a general program of testing whether this is universal behavior exhibited by many models of random polynomials. Our main result shows that pointwise delocalization of the roots of a random polynomial implies that the polynomial is unlikely to have a low-degree factor over the integers. We apply this result to a number of models of random polynomials, including characteristic polynomials of random matrices, for which strong delocalization results are known.