
Whole Grains Fact Sheet

Download the Whole Grains Fact Sheet here

Grains have been known as the “staff of life” for thousands of years, serving as a vital food source for humans. Today, foods made with whole grains are recognized as important sources of nutrients like fiber, trace minerals, and certain vitamins and phytochemicals that are not restored through traditional grain enrichment and fortification practices. These components are believed to play a key role in reducing the risk of disease. Research shows that healthful diets rich in whole grain foods may help reduce the risk of heart disease, certain types of cancer and type 2 diabetes. They may also help in managing body weight.¹

Whole grains are composed of three plant components: the bran, the germ and the endosperm. In the last century, advances in the milling and processing of grains have allowed for the large-scale separation and removal of the bran and germ, resulting in refined flour that consists only of the endosperm. Refined flour has become popular because it produces baked goods with a softer texture and extended freshness. However, removing much of the bran and germ results in losses of fiber, B vitamins, vitamin E, trace minerals, protein, unsaturated fat and about 75 percent of phytochemicals, the physiologically active compounds in plant foods that may have functional health benefits.

To correct for some of these losses, the process of enrichment began in the early 1940s to restore some B vitamins (thiamin, riboflavin and niacin) and the mineral iron to flour.² Since 1998, the U.S. Food and Drug Administration (FDA) has required enriched grain products to also be fortified with folic acid, the synthetic form of the B vitamin folate, to help women of childbearing age reduce the risk of having a pregnancy affected with a neural tube […]

Insights

Your Guide To Portion Size

Download: Serving Size Vs. Portion Size: What’s the Difference?

Every five years since 1980, the U.S. government has published the Dietary Guidelines for Americans (DGA). A main emphasis of the DGA over the years has been advice on how much of certain food groups and nutrients to consume, encouraging Americans to eat more of those we don’t eat enough of (e.g., calcium, dietary fiber and vitamin D) and less of those we eat in excess (e.g., added sugars, saturated fat and sodium). Previous versions of the DGA have noted the importance of choosing more nutrient-dense foods, so that we are more likely to get all the nutrition we need within the calories it takes to maintain a healthy weight.

The 2020–2025 DGA remind us that the benefits of healthy eating don’t appear overnight. Instead, they add up over time with every bite, having the potential to contribute to good health. Aside from key recommendations for specific nutrients and food groups, one of the main action-oriented principles the 2020–2025 DGA offer to help build healthy eating patterns is to pay attention to portion sizes.

What is Portion Size?

Portion size is a term that is often confused with serving size, and understanding the difference between the two is important. Serving sizes appear on the Nutrition Facts label, and that amount is used to calculate the nutrient information displayed there. But serving sizes listed on food packaging are not a recommendation for how much to eat or drink. Serving sizes are required by law to be based, in part, on food consumption data from the National Health and Nutrition Examination Survey (NHANES), which is used to derive the amount of a food or beverage that people typically consume in one sitting. In contrast, portion sizes are not established and regulated by the government […]

Insights

Cracking The Code: What Will It Really Take To Make America Eat Healthy?

Have you ever created a seemingly foolproof plan to execute—only to watch it unfold far differently than you expected? As food and nutrition professionals, this is a challenge many of us face as we work to communicate evidence-based plans and strategies that help consumers advance their goals and improve their health.

The overall evidence supporting the basics of a healthy diet pattern is, in many respects, strong. Sure, questions remain. Do individual differences affect how macronutrients impact health? How do tens of thousands of phytonutrients function in the body? There are countless unknowns, and it will take generations to uncover them. Still, the most pressing questions—the ones with immediate, weighty implications—center on how we can effectively enable consumers to initiate, adopt, and sustain healthier eating patterns. Make no mistake, the stakes are rising alongside the increasing rates of overweight, obesity, and chronic diseases, conditions that good nutrition can help mitigate. Perhaps Charles Duhigg, award-winning journalist and author of The Power of Habit, said it best: “The gap between knowing and doing is where most of our food struggles live.”

Improving Consumer Communication Through Research

One of the keys to helping consumers eat healthier lies in the insights we can gather from consumers directly. This is one of IFIC’s greatest strengths—our relentless pursuit to be a “consumer whisperer” through our research and consumer insights platform. Through the annual IFIC Food & Health Survey, monthly IFIC Spotlight Surveys, and a range of research initiatives, IFIC uncovers the beliefs, intentions, and behaviors that shape consumer food and beverage decisions. The IFIC Spotlight Survey: Americans’ Perceptions & Priorities on Healthy Eating offers key insights into consumers’ food and health struggles.

Understanding Americans’ Top Food & Nutrition Priorities

In the Stages of Change Model, pre-contemplation progresses to contemplation, […]

Insights

A Brief History of Food Fortification in the U.S.

According to the World Health Organization (WHO), food fortification is the practice of deliberately increasing the content of one or more micronutrients (i.e., vitamins and minerals) in a food or condiment to improve the nutritional quality of that food—thereby providing a public health benefit with minimal risk to individual health. But why would fortification be necessary? Over decades of research, fortification has been identified as one of the most cost-effective nutrition interventions available, particularly for low- and middle-income countries around the globe. In fact, the worldwide fortification of commonly consumed foods provides enhanced nutrient intake throughout the lifespan for populations that are at risk for widespread nutritional deficiencies. Even in wealthier countries, like the United States, fortification has led to positive health benefits for the entire population.

In the U.S., micronutrient deficiency diseases like goiter, rickets, beriberi, and pellagra were common health problems well into the 20th century. Thanks to systematic fortification within the U.S. food supply, these diseases have been virtually eliminated. Read on to learn more about the historical origins of food fortification in the U.S., as well as insights into fortification’s contributions to improved public health. But before we dive in, let’s define two food-related terms that are often used interchangeably but are slightly different: fortification, which adds nutrients that may not have been naturally present in a food, and enrichment, which restores nutrients that were lost during processing.

The 1920s: Iodine in Salt

During the 1921 American Medical Association (AMA) convention, two Ohio doctors presented findings from their clinical trial demonstrating the effectiveness of sodium iodide treatments for the prevention of goiter in Akron schoolgirls. Prior to their study, research from Europe had also suggested an association between iodine deficiency and goiter (an enlarged thyroid gland). Without adequate iodine, the body cannot properly make thyroid hormones, a shortfall that often resulted in an unsightly neck goiter or, in more serious cases, neurocognitive impairments. Iodine deficiency generally occurs […]

Insights

Americans’ Confidence in the Safety of U.S. Food Supply Hits Record Low, New Data Shows

The International Food Information Council (IFIC) Releases New Consumer Data on Food and Ingredient Safety in Partnership with the International Association for Food Protection (IAFP)

(Washington, D.C.) — American confidence in the safety of the U.S. food supply has dropped to an all-time low, according to new findings from the 2025 IFIC Food & Health Survey. IFIC is releasing the data today in partnership with the International Association for Food Protection (IAFP), a leading professional organization committed to advancing food safety worldwide. Specifically, IFIC will present the findings to thousands of food safety professionals from around the globe at the IAFP Annual Meeting in Cleveland, Ohio.

Now in its 20th year, the IFIC Food & Health Survey captures the beliefs, behaviors, and attitudes of 3,000 U.S. adults, 18 to 80 years old. While the report covers a wide range of topics, this year’s food and ingredient safety findings point to an erosion in public trust.

Confidence in the Safety of the U.S. Food Supply Hits All-Time Low

Just over half of Americans (55%) say they are very or somewhat confident in the safety of the U.S. food supply, a sharp drop from 62% in 2024 and 70% in 2023. Only 11% of respondents are “very confident”; that number has also steadily declined from a high of 24% in 2022, dropping to 17% in 2023, 14% in 2024, and now 11% in 2025. 2025 marks the lowest level of confidence in the 13 years IFIC has gauged public sentiment on the topic, and the decline in confidence from 2024 spans nearly all demographic groups. Among those with low confidence in the safety of the U.S. food supply, the leading concerns are detailed below.

Top U.S. Consumer Food Safety Concerns Revealed — Foodborne Illness Tops the List

This year, foodborne illness from bacteria, such […]

Media

“Best By,” Not “Bad After”: Why Food Date Labels Deserve Greater Attention 

IFIC has a long-standing history of conducting consumer research on nutrition and food safety, including Americans’ use of food labels, to advance public health and inform regulatory efforts. The IFIC Spotlight Survey, Americans’ Perceptions Of Food Date Labeling, continues that tradition. The survey was conducted in response to a joint Request for Information (RFI) from the U.S. Department of Agriculture’s (USDA) Food Safety and Inspection Service (FSIS) and the Food and Drug Administration (FDA). The agencies seek to understand consumer perceptions of date labeling and its potential impact on food waste.

Food Date Labeling & Food Waste Implications

Food waste is a significant and growing issue in the U.S. At the same time, food prices continue to outpace overall inflation, placing additional strain on Americans—particularly those with limited resources. Currently, 8.4% of U.S. households report low food security and 5.1% report very low food security. That’s nearly 47.4 million people, including over 12 million older adults, without reliable access to food.

In 2019, the Environmental Protection Agency (EPA) reported that 66 million tons of food were wasted, making food the largest category of material in municipal landfills. Importantly, much of that discarded food was still safe to eat. According to the USDA, misunderstanding food date labels contributes to an estimated 20% of household food waste. In short, improving consumer understanding of food date labels can reduce food waste and expand access to safe, nutritious food. These outcomes are not just meaningful—they are urgently needed.

Consumers Read Food Date Labels As Safety Versus Quality Indicators

At the heart of this issue is a critical, often overlooked distinction: the difference between food quality and food safety. Understanding that nuance could be the key to reducing waste and maximizing access to safe, nutritious food. While food is perishable and may lose freshness over time, that does not necessarily mean it’s unsafe to eat. […]

Insights

Americans Love Sweetness—But Think It Is Important To Cut Back, New Research Finds

IFIC Study Reveals Complex Relationship Among Sweet Taste, Health Goals, and Food Choices

(Washington, D.C.) — Sweet-tasting foods and drinks may be beloved by many Americans, but a new survey from the International Food Information Council (IFIC) reveals a mounting tension between consumers’ enjoyment and health goals when it comes to sweetness in the diet. According to the IFIC Spotlight Survey: Americans’ Perceptions Of Sweetness in Their Diets, nearly 6 in 10 Americans (58%) prefer sweet as their favorite taste—more than savory/umami (49%), salty (45%), sour (24%), or bitter (21%). Yet despite their fondness for sweet flavors, 8 in 10 Americans (78%) believe it is important to reduce the overall sweetness of their diet, primarily to eat healthier, manage blood sugar or diabetes, manage body weight, and support dental health.

“From a biological perspective, our love of sweet taste makes sense—it’s thought to be an ancient survival mechanism that signaled safe, energy-rich food, like glucose from plants,” said IFIC Senior Director of Research & Consumer Insights, Kris Sollid, RD. “As we’ve evolved, navigating our innate preferences and health goals has become more complicated.”

Americans Support Sweetness While Also Scaling Back

When asked what comes to mind first when they think about sweet-tasting foods or drinks, most Americans mentioned a specific food (39%) or beverage (23%)—while far fewer thought of an ingredient or feeling. But their attitudes toward that sweetness are nuanced. While 59% agree that sweet-tasting foods and drinks can be part of a healthy diet, many still support scaling back the overall sweetness of their diets (78%). Among those who think it is important to reduce sweetness, the most common reasons include eating healthier (49%), managing blood sugar or diabetes (43%), managing weight (41%), and improving dental health (36%). Notably, women were more likely than men to cite each of […]

Media

Taking a Look at Regenerative Agriculture

According to IFIC’s 2022 Food and Health Survey, 43% of consumers want to purchase food and beverages that were produced in a way that minimizes carbon footprint/climate impact. Climate change is affecting all our favorite foods—from avocado toast to acai bowls—and how farmers grow food matters more now than ever. One way farmers are improving their food-growing game is regenerative agriculture, or “regenerative ag,” a farming practice with the lofty goal of not just slowing, but actually reversing, climate change. This concept likely does not come up in many everyday conversations, so learning what regenerative farming is can help shed some light on how many of our favorite foods reach our markets via regenerative ag practices.

Regenerative ag is all about “holistic land management,” meaning farmers employ techniques that give back to the land rather than take away from it. Practices focus on building up high-quality soil, retaining rainwater, improving the water cycle, increasing biodiversity, and promoting both human and animal welfare. One way farmers can accomplish much of this is by working in sync with carbon, one of life’s most important elements. Carbon makes up all living things, including the building blocks of our food: carbohydrates, protein, and fat wouldn’t exist without it. Plants especially love carbon; they take it from the atmosphere and the soil to grow and produce nutrients. Carbon-rich soil not only nourishes plants but also resists erosion, retains water during a drought, and provides ample nutrition to growing plants.

Carbon is important because it sustains all life, but when released into the atmosphere as the greenhouse gas carbon dioxide, it directly contributes to atmospheric warming and climate change. Capturing carbon from the atmosphere into the soil, a process called carbon sequestration, simultaneously pulls carbon dioxide out of the air and […]

Insights