In this paper we present a method for the accurate estimation of the derivative (also known as the sensitivity) of an expectation of a function involving an indicator function, obtained by combining stochastic algorithmic differentiation with a regression. The method improves upon the approach presented in [6,4].

The finite difference approximation of a partial derivative of a Monte-Carlo integral of a discontinuous function is known to exhibit a high Monte-Carlo error. The issue is obvious, since the Monte-Carlo approximation of a discontinuous function is just a finite sum of discontinuous functions and as such not even differentiable.

The algorithmic differentiation of a discontinuous pay-off is likewise problematic. A natural approach is to replace the discontinuity by a continuous function (so-called pay-off smoothing). This is equivalent to replacing the path-wise automatic differentiation by a (local) finite difference approximation.

We show that this local finite difference approximation can be seen as a linear regression with the simplest possible regression basis function (a single indicator). Inspecting the resulting expression, we observe that we can separate the expectation of the indicator function (the density) from the regression of the size and speed of the discontinuity. With this formulation, we can then replace the regression(s) by more accurate estimators.
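The high Monte-Carlo error of a finite difference sensitivity of a discontinuous pay-off can be illustrated with a minimal sketch. The example below (all names, parameters, and model choices are hypothetical and not taken from the paper) estimates the delta of a digital pay-off 1{S(T) > K} under lognormal dynamics by bump-and-revalue with common random numbers; as the bump shrinks, only the few paths crossing the strike contribute, each with weight 1/(2h), so the variance of the estimator explodes.

```python
import numpy as np

def digital_mc_delta(s0, bump, n_paths=100_000, k=1.0, sigma=0.3, t=1.0, seed=7):
    """Central finite difference (bump-and-revalue) delta of a digital
    pay-off 1{S(T) > K} under lognormal dynamics, using common random
    numbers for the bumped and un-bumped valuations."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths)

    def price(s):
        # Terminal value S(T) of a driftless lognormal model.
        st = s * np.exp(-0.5 * sigma**2 * t + sigma * np.sqrt(t) * z)
        # Monte-Carlo approximation of E[1{S(T) > K}]: a finite sum of
        # indicator functions, hence not differentiable in s.
        return np.mean(st > k)

    return (price(s0 + bump) - price(s0 - bump)) / (2 * bump)
```

Re-running the estimator over independent seeds shows that the sample standard deviation of the delta estimate grows sharply as the bump size decreases, while a large bump trades this variance for a bias: the dilemma the paper's regression-based reformulation is designed to resolve.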