Detecting disruptions with sufficient anticipation time is essential to undertake any form of remedial strategy, whether mitigation or avoidance. Traditional predictors based on machine learning techniques can achieve very high performance if properly optimised, but they do not provide a natural estimate of the quality of their outputs. In this paper a new set of tools, based on probabilistic extensions of Support Vector Machines, is introduced and applied for the first time to JET data. The probabilistic output constitutes a natural qualification of the prediction quality and provides additional flexibility. Indeed, the developed techniques are versatile enough that different versions of the tools can be optimised to perform various tasks, from prediction for mitigation to avoidance or even classification. Large databases of disruptions, covering entire campaigns, are analysed, both for the graphite wall and for the ITER-Like Wall. Success rates of the order of 97%, with about 4.5% false alarms, can be readily achieved, satisfying even the requirements of the next generation of devices. Because the developed tools output the probability of disruption, they improve the interpretability of the results, provide an estimate of predictor quality and give new insights into the physics. Moreover, a probabilistic treatment makes it easier to integrate these classifiers into general decision support and control systems.
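As an illustration only (not the authors' implementation), the idea of a probability-producing SVM classifier can be sketched with scikit-learn, whose `SVC(probability=True)` applies Platt scaling to map SVM margins onto calibrated class probabilities. The two-dimensional synthetic data below is a placeholder standing in for real JET diagnostic signals, and the 0.5 alarm threshold is an arbitrary example value:

```python
# Sketch of a probabilistic SVM classifier, in the spirit of the tools
# described in the abstract. Data and threshold are illustrative assumptions,
# not JET signals or the paper's actual settings.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 400
# Synthetic two-class data standing in for "safe" vs "disruptive" plasma states.
safe = rng.normal(loc=0.0, scale=1.0, size=(n, 2))
disruptive = rng.normal(loc=2.5, scale=1.0, size=(n, 2))
X = np.vstack([safe, disruptive])
y = np.array([0] * n + [1] * n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# probability=True enables Platt scaling, so the classifier returns
# calibrated class probabilities rather than only a hard label.
clf = SVC(kernel="rbf", probability=True, random_state=0).fit(X_tr, y_tr)

proba = clf.predict_proba(X_te)  # column 1: estimated probability of disruption
# A decision or control system can act only when the disruption probability
# exceeds a task-dependent threshold (e.g. lower for mitigation, higher for
# avoidance); 0.5 here is just an example.
alarm = proba[:, 1] > 0.5
```

The probabilistic output is what allows the same trained model to serve different tasks: changing the threshold trades missed alarms against false alarms without retraining.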