Health literacy has come to play a critical role in health education and promotion, yet it is poorly understood in adolescents and few measurement tools exist. Standardized instruments that measure health literacy in adults assume it to be a derivative of general literacy. This paper reports on the development and early-stage validation of a health literacy tool for high school students that measured skills in understanding and evaluating health information. A systematic process was used to develop, score and validate items. Questionnaire data were collected from 275 primarily 10th-grade students in three secondary schools in Vancouver, Canada, selected to reflect variation in demographic profile. Forty-eight percent were male, and 69.1% spoke a language other than English. Bivariate correlations between background variables and the domain and overall health literacy scores were calculated. A regression model was developed using 15 explanatory variables; the R² value was 0.567. Key findings were that lower scores were achieved by males, students whose first language was not English, those who immigrated to Canada at a later age and those who skipped school more often. Unlike in general literacy, where the family factors of mother's education and family affluence both play significant roles, these two factors failed to predict the health literacy of our school-aged sample. The most significant contributions of this work are the creation of an instrument for measuring adolescent health literacy and further emphasis on the distinction between health literacy and general literacy.
This study undertook a qualitative exploration of an operational definition of health literacy and an examination of quantitative measures of health literacy skills. We interviewed 229 older Canadian adults. First we engaged them in open-ended discussions about their search for information on a self-selected health topic. Next we administered nine self-report items on health literacy skills, and then task-performance items. Task-performance questions were based on two published reading passages at five levels of difficulty to measure 'understanding' of health-related material. The Rapid Estimate of Adult Literacy in Medicine (REALM) was also administered as the comparison for criterion-related validity. Our open-ended questions elicited responses about the processes that people undergo when they attempt to access, understand, appraise and communicate health information. Qualitative findings revealed complexities in participants' interpretation of the meaning of all four health literacy skills. These descriptive findings add new knowledge about health literacy as a construct. Participants agreed with most of the self-report statements, thus indicating high belief in their own health literacy. REALM scores ranged from 45 to 66 with an average of 65 and a standard deviation of 2.5. Quantitative scores on the reading passages were modestly correlated with scores on the REALM. The sum scale of self-report items, however, did not correlate with task-performance items, suggesting that the different types of items may not be measuring the same construct. We suggest that self-report items need more development and validation. Our study makes a contribution in exploring the complexities of measuring health literacy skills for general health contexts.