The modelling of regulatory frameworks and industry standards, including their argumentation and expected evidence, is used during assurance processes to demonstrate system compliance. However, such models are handled mainly in a static fashion, and using them for dynamic evidence checking along the system life-cycle, including operation (checking the model at runtime), is not yet mainstream. This preliminary work presents a tool-supported modelling method for the automatic and dynamic evaluation of evidence. The solution is implemented as an Eclipse OpenCert tool extension that extends the capabilities of evidence models with automatic checks. The user monitoring the assurance project receives alerts when a piece of evidence is unsatisfied. The tool also exports a continuous log of these checks using the XES standard, enabling traceability and a history of passing and failing checks for analysis and auditing purposes. While some evidence checks are generic, the diversity of checking processes required our solution to be extensible.
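To illustrate the kind of log the export produces, the sketch below shows a minimal XES (IEEE 1849-2016) fragment recording one evidence check; the trace name, the `outcome` attribute, and all values are hypothetical illustrations, not the tool's actual schema.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<log xes.version="1.0" xmlns="http://www.xes-standard.org/">
  <!-- One trace per monitored evidence item (naming is an assumption) -->
  <trace>
    <string key="concept:name" value="Evidence-TestReport-01"/>
    <!-- Each automatic check is recorded as an event with its result -->
    <event>
      <string key="concept:name" value="evidence-check"/>
      <string key="outcome" value="failed"/>
      <date key="time:timestamp" value="2020-01-15T10:23:00.000+00:00"/>
    </event>
  </trace>
</log>
```

Because XES is a standard event-log format, such a log can be inspected with off-the-shelf process-mining tools when analysing the history of passing and failing checks.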