In this paper, we discuss a machine learning-based approach to jointly estimating the linear and nonlinear noise contributions in an optical fiber communication link. We explain the rationale for applying machine learning to this problem, then review current progress, and conclude with directions for future research.