That foods might provide therapeutic benefits is clearly not a new concept. The tenet, “Let food be thy medicine and medicine be thy food” was embraced 2500 years ago by Hippocrates, the father of medicine. However, this “food as medicine” philosophy fell into relative obscurity in the 19th century with the advent of modern drug therapy. In the 1900s, the important role of diet in disease prevention and health promotion came to the forefront once again.
During the first 50 years of the 20th century, scientific focus was on the identification of essential elements, particularly vitamins, and their role in the prevention of various dietary deficiency diseases. This emphasis on nutrient deficiencies or “undernutrition” shifted dramatically, however, during the 1970s, when diseases linked to excess and “overnutrition” became a major public health concern. Thus began a flurry of public health guidelines, including the Senate Select (McGovern) Committee's Dietary Goals for the United States (1977); the Dietary Guidelines for Americans (1980, 1985, 1990, 1995, 2000), a joint publication of the USDA and the Department of Health and Human Services; the Surgeon General's Report on Nutrition and Health (1988); the National Research Council's Diet and Health (1989); and Healthy People 2000 and 2010 from the U.S. Public Health Service. All of these reports were aimed at public policy and education, emphasizing the importance of consuming a diet low in saturated fat and high in vegetables, fruits, whole grains and legumes to reduce the risk of chronic diseases such as heart disease, cancer, osteoporosis, diabetes and stroke.
Scientists also began to identify physiologically active components in foods from both plants and animals (known as phytochemicals and zoochemicals, respectively) that could potentially reduce the risk of a variety of chronic diseases. These events, coupled with an aging, health-conscious population, changes in food regulations, numerous technological advances and a marketplace ripe for the introduction of health-promoting products, coalesced in the 1990s to create the trend we now know as “functional foods.” This report discusses how functional foods are currently defined, the strength of the evidence both required and thus far provided for many of these products, safety considerations in using some of these products, the factors driving the functional foods phenomenon and, finally, what the future may hold for this new food category.
What are functional foods?
All foods are functional to some extent because all foods provide taste, aroma and nutritive value. However, foods are now being examined intensively for added physiologic benefits, which may reduce chronic disease risk or otherwise optimize health. These research efforts have generated global interest in the growing food category now recognized as “functional foods.” Functional foods have no universally accepted definition. The concept was first developed in Japan in the 1980s when, faced with escalating health care costs, the Ministry of Health and Welfare initiated a regulatory system to approve certain foods with documented health benefits, in hopes of improving the health of the nation's aging population (1). These foods, which are eligible to bear a special seal, are now recognized as Foods for Specified Health Use (FOSHU).3 As of July 2002, nearly 300 food products had been granted FOSHU status in Japan.