This paper describes a computer-based approach to comparing data obtained for Knowledge Based Systems via established but very varied knowledge elicitation (KE) techniques. It describes not only the detailed comparison of different KE methods (in this case 'scaling' and 'non-scaling') but also investigates the use of 'demonstration' or 'evaluation' systems, as a variation on the more established rapid prototyping approaches to the elicitation and evaluation of knowledge for KBS construction, in this case by focusing upon the quality and relevance of the elicited knowledge from the perspective of the expert himself. Preliminary results from the study reported here suggest that non-scaling methods produce a greater amount of raw data than scaling methods, and that this data is less likely to require correction or modification for inclusion within a Knowledge Based System. However, the results also indicate that non-scaling derived data is more likely than scaling derived data to be rejected outright.