Despite its widespread use, critics argue that citation analysis has serious limitations as a measure of scholarly research performance. First, conventional citation analysis methods yield one-dimensional and sometimes misleading evaluations because they do not account for differences in citation quality, do not filter out citation noise such as self-citations, and do not consider non-numeric aspects of citations such as language, culture, and time. Second, the coverage of today's citation databases is disjoint and incomplete, which can produce conflicting quality-assessment outcomes across different data sources. This paper discusses the findings of a citation analysis study that measured the impact of scholarly publications using data mined from Web of Science, Scopus, and Google Scholar, and briefly describes a work-in-progress prototype system, CiteSearch, designed to overcome the weaknesses of existing citation analysis methods through a robust citation-based quality assessment approach.
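To make the notion of citation noise concrete, the sketch below shows one simple way self-citations could be filtered from a citation list: a citation is treated as a self-citation if its author set overlaps the cited paper's author set. This is a minimal illustration only; the function, data layout, and author names are hypothetical and do not describe how CiteSearch itself works.

```python
# Hypothetical sketch of self-citation filtering (citation noise removal).
# A citation counts as a self-citation if any of its authors also
# authored the cited paper. Names and records are illustrative.

def filter_self_citations(paper_authors, citations):
    """Keep only citations whose authors are disjoint from the cited paper's authors."""
    cited = {a.lower() for a in paper_authors}
    return [
        c for c in citations
        if cited.isdisjoint(a.lower() for a in c["authors"])
    ]

paper_authors = ["A. Smith", "B. Jones"]
citations = [
    {"title": "Follow-up study", "authors": ["A. Smith", "C. Lee"]},  # self-citation
    {"title": "Independent work", "authors": ["D. Kim"]},
]

independent = filter_self_citations(paper_authors, citations)
print([c["title"] for c in independent])  # → ['Independent work']
```

Real systems must additionally handle name ambiguity (initials, transliterations, homonyms), which is one reason simple string matching alone is insufficient for high-quality citation analysis.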