Interest in self-rated health (SRH) as a tool for use in disease and mortality risk screening is increasing. The authors assessed the discriminatory ability of baseline SRH to predict 10-year mortality rates compared with objectively measured health status. Principal component analysis was used to create a health score that included systolic blood pressure, presence of diabetes mellitus, body mass index, electrocardiographic parameters, B-type natriuretic peptide, and other biochemical and hematologic measures. From 1997 to 2007, a total of 474 of the 1,388 baseline participants died and 81 were lost to follow-up, yielding 11,833 person-years of observation. The adjusted hazard ratio for death was 1.74 (95% confidence interval (CI): 1.32, 2.29) for persons reporting poor health versus those reporting good health. When combined with age and sex, SRH had a C statistic to predict death equal to 0.69 (95% CI: 0.67, 0.71), which was comparable to that of the inclusive health score (C = 0.69, 95% CI: 0.67, 0.72). The addition of other parameters, such as lifestyle, physical functioning, mental symptoms, and physical symptoms, had little effect on these 2 predictive models (C = 0.71 (95% CI: 0.69, 0.73) and C = 0.71 (95% CI: 0.69, 0.74), respectively). The abilities of the SRH and the health score models to predict death decreased in parallel fashion over time. These results suggest that older adults who report poor health warrant particular attention as persons who have accumulated biologic markers of disease.
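The C statistic reported above is the probability that, for a randomly chosen pair of one decedent and one survivor, the model assigns the higher predicted risk to the decedent (for a binary outcome it equals the area under the ROC curve). A minimal sketch of this pairwise-concordance calculation, using invented toy data rather than the study's data:

```python
def c_statistic(risks, died):
    """Pairwise concordance between predicted risk and observed death.

    risks -- predicted risk score per participant (higher = worse)
    died  -- 1 if the participant died during follow-up, else 0
    """
    cases = [r for r, d in zip(risks, died) if d == 1]
    controls = [r for r, d in zip(risks, died) if d == 0]
    concordant = 0.0
    pairs = 0
    for rc in cases:          # each decedent
        for rn in controls:   # paired with each survivor
            pairs += 1
            if rc > rn:
                concordant += 1.0   # model ranks the pair correctly
            elif rc == rn:
                concordant += 0.5   # ties count as half-concordant
    return concordant / pairs

# Toy example (hypothetical values, not from the study):
risks = [0.9, 0.8, 0.4, 0.3, 0.2]
died = [1, 1, 0, 1, 0]
print(round(c_statistic(risks, died), 3))  # → 0.833
```

A C statistic of 0.5 indicates discrimination no better than chance, while 1.0 indicates perfect ranking; the values near 0.7 reported for both the SRH and health-score models indicate moderate, and essentially equal, discriminatory ability.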