BACKGROUND: The Sit-to-Stand (STS) test is widely used in clinical practice as an indicator of lower-limb functional decline, especially in older adults. To date, owing to its high variability, there is no standard approach for categorising the STS motion pattern, and vision-based evaluation remains the most reliable method for assessing people's performance. This paper presents a comparative analysis between visual assessments and an automated software approach for the categorisation of STS, relying on recordings from a force plate.

METHODS: A group of 5 participants (30 ± 6 years) took part in 2 sessions of visual inspection of 200 STS movements randomly extracted from a dataset of 742 acquisitions under self-paced and controlled-speed conditions. Assessors were asked to identify three specific STS events from the Ground Reaction Force, in parallel with the software analysis: the start of the trunk movement (Initiation), the beginning of the stable upright stance (Standing), and the sitting movement (Sitting). The Test-Retest Reliability between the first and second visual evaluations was compared with the Inter-Rater Agreement between visual and software assessments, as indices of human and software performance, respectively.

RESULTS: No statistical differences between methods were found for the identification of the Initiation and Sitting events at self-paced speed, and for only the Sitting event at controlled speed. The estimated significant maximum discrepancies between visual and automated assessments were 0.200 s [0.039; 0.361] in unconstrained conditions and 0.340 s [0.014; 0.666] for standardised movements.

CONCLUSIONS: The software assessments displayed overall good agreement with visual evaluations of the Ground Reaction Force, while relying on objective measures.
In this sense, the proposed approach can provide robust and consistent data in the field of Big Data analytics, augmenting the performance of artificial intelligence methods for Human Activity Recognition tasks.
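To make the event definitions in METHODS concrete, the following is a minimal illustrative sketch of how the three STS events (Initiation, Standing, Sitting) might be detected automatically from the vertical Ground Reaction Force of a force plate under the feet. The thresholding scheme, the 5% tolerance band, the 0.5 s stability window, and the function name `detect_sts_events` are all assumptions for illustration, not the paper's actual algorithm.

```python
import numpy as np

def detect_sts_events(fz, fs, tol=0.05):
    """Return (initiation, standing, sitting) sample indices from the
    vertical GRF `fz` (newtons) sampled at `fs` Hz.

    Hypothetical threshold-based sketch (NOT the paper's method):
    - Initiation: first departure from the seated force baseline.
    - Standing: first sample that stays within a tolerance band around
      the standing plateau for 0.5 s (stable upright stance).
    - Sitting: first subsequent departure from the standing band.
    """
    seated = np.median(fz[: int(0.5 * fs)])   # seated baseline (feet load)
    standing_level = np.percentile(fz, 95)    # standing plateau (~body weight)
    band = tol * standing_level

    # Initiation: first sample leaving the seated baseline band.
    initiation = int(np.argmax(np.abs(fz - seated) > band))

    # Standing: first index after Initiation whose next 0.5 s of samples
    # all lie within the standing band.
    win = int(0.5 * fs)
    inside = np.abs(fz - standing_level) < band
    standing = next(i for i in range(initiation, len(fz) - win)
                    if inside[i : i + win].all())

    # Sitting: first sample after Standing that leaves the standing band.
    sitting = standing + int(np.argmax(~inside[standing:]))
    return initiation, standing, sitting
```

On a synthetic trial (seated plateau, ramp up, standing plateau, ramp down), the three indices come out in the expected temporal order, which makes the sketch easy to sanity-check before applying it to real force-plate data.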