The well-known "Janson's inequality" gives Poisson-type upper bounds on the lower tail probability P(X ≤ (1 − ε)EX) when X is a sum of dependent indicator random variables of a special form. We show that, for large deviations, this inequality is optimal whenever X is approximately Poisson, i.e., when the dependencies are weak. We also present correlation-based approaches that, in certain symmetric applications, yield related conclusions when X is no longer close to Poisson. As an illustration we consider, e.g., subgraph counts in random graphs, and obtain new lower tail estimates, extending earlier work (for the special case ε = 1) of Janson, Łuczak and Ruciński.
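For orientation, one common formulation of the lower-tail bound referred to above is sketched below; the notation Δ̄ for the sum of expectations over dependent pairs of indicators (including the diagonal terms) is an assumption about the setup, following standard usage, and is not fixed by this abstract.

```latex
\[
  \Pr\bigl(X \le (1-\varepsilon)\,\mathbb{E}X\bigr)
  \;\le\; \exp\!\Bigl(-\,\varphi(-\varepsilon)\,\frac{(\mathbb{E}X)^2}{\bar{\Delta}}\Bigr),
  \qquad \varphi(x) \;=\; (1+x)\log(1+x)-x,
\]
% where X = \sum_i I_i is the sum of indicators and
% \bar{\Delta} = \sum_i \sum_{j:\, j \sim i \text{ or } j=i} \mathbb{E}(I_i I_j)
% sums over pairs of dependent indicators, including i = j.
```

When the dependencies are weak, Δ̄ ≈ EX, and the exponent reduces to the Poisson rate −φ(−ε)·EX, which is the regime in which the abstract asserts optimality.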