
A Brief History of Food Fortification in the U.S.

According to the World Health Organization (WHO), food fortification is the practice of deliberately increasing the content of one or more micronutrients (i.e., vitamins and minerals) in a food or condiment to improve its nutritional quality, thereby providing a public health benefit with minimal risk to individual health.

But why would fortification be necessary? Decades of research have identified fortification as one of the most cost-effective nutrition interventions available, particularly for low- and middle-income countries. Fortifying commonly consumed foods enhances nutrient intake throughout the lifespan for populations at risk of widespread nutritional deficiencies. Even in wealthier countries like the United States, fortification has produced health benefits for the entire population. In the U.S., micronutrient deficiency diseases such as goiter, rickets, beriberi, and pellagra were common health problems well into the 20th century. Thanks to systematic fortification of the U.S. food supply, these diseases have been virtually eliminated.

Read on to learn more about the historical origins of food fortification in the U.S., as well as fortification’s contributions to improved public health. But before we dive in, let’s define two food-related terms that are often used interchangeably but are slightly different:

The 1920s: Iodine in Salt

During the 1921 American Medical Association (AMA) convention, two Ohio doctors presented findings from a clinical trial demonstrating the effectiveness of sodium iodide treatments for preventing goiter in Akron schoolgirls. Prior to their study, research from Europe had already suggested an association between iodine deficiency and goiter (an enlarged thyroid gland). It was found that without iodine, the body cannot properly make thyroid hormones, which often resulted in a visible neck goiter or, in more serious cases, neurocognitive impairments. Iodine deficiency generally occurs […]


Americans’ Confidence in the Safety of U.S. Food Supply Hits Record Low, New Data Shows

The International Food Information Council (IFIC) Releases New Consumer Data on Food and Ingredient Safety in Partnership with the International Association for Food Protection (IAFP)

(Washington, D.C.) — American confidence in the safety of the U.S. food supply has dropped to an all-time low, according to new findings from the 2025 IFIC Food & Health Survey. IFIC is releasing the data today in partnership with the International Association for Food Protection (IAFP), a leading professional organization committed to advancing food safety worldwide. Specifically, IFIC will present the findings to thousands of food safety professionals from around the globe at the IAFP Annual Meeting in Cleveland, Ohio.

Now in its 20th year, the IFIC Food & Health Survey captures the beliefs, behaviors, and attitudes of 3,000 U.S. adults, ages 18 to 80. While the report covers a wide range of topics, this year’s food and ingredient safety findings point to an erosion of public trust.

Confidence in the Safety of the U.S. Food Supply Hits All-Time Low

Just over half of Americans (55%) say they are very or somewhat confident in the safety of the U.S. food supply, a sharp drop from 62% in 2024 and 70% in 2023. Only 11% of respondents are “very confident”; that number has also declined steadily from a high of 24% in 2022, to 17% in 2023, 14% in 2024, and now 11% in 2025. 2025 marks the lowest level of confidence in the 13 years IFIC has gauged public sentiment on the topic, and the decline from 2024 spans nearly all demographic groups. Among those with low confidence in the safety of the U.S. food supply, leading consumer concerns include:

Top U.S. Consumer Food Safety Concerns Revealed: Foodborne Illness Tops the List

This year, foodborne illness from bacteria, such […]


“Best By,” Not “Bad After”: Why Food Date Labels Deserve Greater Attention 

IFIC has a long-standing history of conducting consumer research on nutrition and food safety, including Americans’ use of food labels, to advance public health and inform regulatory efforts. The IFIC Spotlight Survey, Americans’ Perceptions of Food Date Labeling, continues that tradition. The survey was conducted in response to a joint Request for Information (RFI) from the U.S. Department of Agriculture’s (USDA) Food Safety and Inspection Service (FSIS) and the Food and Drug Administration (FDA). The agencies seek to understand consumer perceptions of date labeling and its potential impact on food waste.

Food Date Labeling & Food Waste Implications

Food waste is a significant and growing issue in the U.S. At the same time, food prices continue to outpace overall inflation, placing additional strain on Americans, particularly those with limited resources. Currently, 8.4% of U.S. households report low food security and 5.1% report very low food security. That’s nearly 47.4 million people, including over 12 million older adults, without reliable access to food.

In 2019, the Environmental Protection Agency (EPA) reported that 66 million tons of food were wasted, making food the largest category of material in municipal landfills. Importantly, much of that discarded food was still safe to eat. According to the USDA, misunderstanding food date labels contributes to an estimated 20% of household food waste.

In short, improving consumer understanding of food date labels can: […] These outcomes are not just meaningful; they are urgently needed.

Consumers Read Food Date Labels as Safety Versus Quality Indicators

At the heart of this issue is a critical, often-overlooked distinction: the difference between food quality and food safety. Understanding that nuance could be the key to reducing waste and maximizing access to safe, nutritious food. While food is perishable and may lose freshness over time, that does not necessarily mean it’s unsafe to eat. […]


Americans Love Sweetness—But Think It Is Important To Cut Back, New Research Finds

IFIC Study Reveals Complex Relationship Among Sweet Taste, Health Goals, and Food Choices

(Washington, D.C.) — Sweet-tasting foods and drinks may be beloved by many Americans, but a new survey from the International Food Information Council (IFIC) reveals a mounting tension between consumers’ enjoyment of sweetness and their health goals. According to the IFIC Spotlight Survey: Americans’ Perceptions of Sweetness in Their Diets, nearly 6 in 10 Americans (58%) name sweet as their favorite taste, more than savory/umami (49%), salty (45%), sour (24%), or bitter (21%). Yet despite their fondness for sweet flavors, nearly 8 in 10 Americans (78%) believe it is important to reduce the overall sweetness of their diet, primarily to eat healthier, manage blood sugar or diabetes, manage body weight, and support dental health.

“From a biological perspective, our love of sweet taste makes sense—it’s thought to be an ancient survival mechanism that signaled safe, energy-rich food, like glucose from plants,” said IFIC Senior Director of Research & Consumer Insights, Kris Sollid, RD. “As we’ve evolved, navigating our innate preferences and health goals has become more complicated.”

Americans Support Sweetness While Also Scaling Back

When asked what comes to mind first when they think about sweet-tasting foods or drinks, most Americans mentioned a specific food (39%) or beverage (23%), while far fewer thought of an ingredient or feeling. But their attitudes toward that sweetness are nuanced. While 59% agree that sweet-tasting foods and drinks can be part of a healthy diet, many still support scaling back the overall sweetness of their diets (78%). Among those who think it is important to reduce sweetness, the most common reasons include eating healthier (49%), managing blood sugar or diabetes (43%), managing weight (41%), and improving dental health (36%). Notably, women were more likely than men to cite each of […]


Taking a Look at Regenerative Agriculture

According to IFIC’s 2022 Food and Health Survey, 43% of consumers want to purchase food and beverages produced in a way that minimizes carbon footprint and climate impact. Climate change is affecting all our favorite foods, from avocado toast to acai bowls, and how farmers grow food matters now more than ever. One way farmers are improving their food-growing game is regenerative agriculture, or “regenerative ag,” a farming practice with the lofty goal of not just slowing but actually reversing climate change. This concept likely does not come up in many everyday conversations, so knowing more about what regenerative farming is can help shed some light on how many of our favorite foods reach our markets via regenerative ag practices.

Regenerative ag is all about “holistic land management,” meaning farmers employ techniques that give back to the land rather than take away from it. Practices focus on building high-quality soil, retaining rainwater, improving the water cycle, increasing biodiversity, and promoting both human and animal welfare. One way farmers can accomplish much of this is by working in sync with carbon, one of life’s most important elements. This fundamental element makes up all living things, including the building blocks of our food: carbohydrates, protein, and fat wouldn’t exist without carbon. Plants especially love carbon; they take it from the atmosphere and the soil to grow and produce nutrients. Carbon-rich soil not only nourishes plants but also creates resilient soil that retains water during a drought, erodes less quickly, and provides ample nutrition to growing plants. Carbon sustains all life, but when released into the atmosphere as carbon dioxide, a harmful greenhouse gas, it directly contributes to atmospheric warming and climate change.

Capturing carbon from the atmosphere into the soil, a process called carbon sequestration, simultaneously pulls carbon dioxide out of the air and […]


What is Regenerative Agriculture?

Climate change is affecting all our favorite foods, from avocado toast to acai bowls, and how farmers grow food matters now more than ever. One way farmers are improving their food-growing game is regenerative agriculture, or “regenerative ag,” a farming practice with the lofty goal of not just slowing but actually reversing climate change. Less than a quarter of participants in the International Food Information Council’s 2019 Food and Health Survey said they were familiar with this term, so let’s dig into some more details about regenerative ag practices and their farming impacts!

Regenerative ag is all about “holistic land management,” meaning farmers employ techniques that give back to the land rather than take away from it. Practices focus on building high-quality soil, retaining rainwater, improving the water cycle, increasing biodiversity, and promoting both human and animal welfare. One way farmers can accomplish much of this is by working in sync with carbon, one of life’s most important elements. This fundamental element makes up all living things, including the building blocks of our food: carbohydrates, protein, and fat wouldn’t exist without carbon. Plants especially love carbon; they take it from the atmosphere and the soil to grow and produce nutrients. Carbon-rich soil not only nourishes plants but also creates resilient soil that retains water during a drought, erodes less quickly, and provides ample nutrition to growing plants. Carbon sustains all life, but when released into the atmosphere as carbon dioxide, a harmful greenhouse gas, it directly contributes to atmospheric warming and climate change.

Capturing carbon from the atmosphere into the soil, a process called carbon sequestration, simultaneously pulls carbon dioxide out of the air and transfers it to the soil, where it provides nourishment. Many farmers are adopting carbon-sequestering techniques because of this dual […]


What’s Up with Protein and Protein Supps?

Chatter about protein and protein supplements has been getting a good deal of attention recently. With so much misinformation about amounts, timing, and sources, here are the answers to five common questions about this important macronutrient.

Why is protein important?

Protein serves vital functions in our bodies, including building connective tissues and supporting the immune system. Protein can also help us maintain a healthy weight by increasing satiety and preserving lean body mass. In addition, protein can support exercise and fitness goals, since it aids muscle growth and repair.

How much protein is needed?

While there are extremely rare conditions in which protein intake must be carefully monitored, the large majority of us get significant health benefits from eating the right amount of protein for our needs. Macronutrient amounts, including protein, are determined by the Recommended Dietary Allowance (RDA) set by the National Academy of Medicine (formerly the Institute of Medicine, or IOM). The RDA for protein is 0.8 grams per kilogram of body weight per day (g/kg/d) for adults. However, a recent position statement from the International Society of Sports Nutrition (ISSN) suggests that most people who exercise should eat a minimum of 1.4 to 2.0 g/kg/d of protein. Training athletes, which most of us are not, may require even more protein than that.

Now, before you start busting out your calculator to crunch these g/kg/d numbers, let’s talk about what they mean. The recommended amount of 0.8 g/kg/d was defined by the IOM as the intake level necessary to meet the protein needs of an average healthy adult; individual needs vary based on activity level, gender, weight, and genetics. The amounts suggested by the ISSN for exercising individuals and training athletes are designed to support building and maintaining muscle mass.

Additionally, the IOM has established an Acceptable Macronutrient Distribution Range (AMDR) […]
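To put these g/kg/d figures in concrete terms, here is a minimal sketch of the calculator math. The 0.8 g/kg/d and 1.4–2.0 g/kg/d values are the IOM RDA and ISSN range discussed above; the function name and the 154-lb sample weight are illustrative assumptions, not part of either guideline.

```python
# Sketch: daily protein targets from body weight.
# The 0.8, 1.4, and 2.0 g/kg/d figures are the IOM RDA and ISSN range
# discussed in the article; the example body weight is hypothetical.

LB_PER_KG = 2.2046  # pounds per kilogram

def protein_targets(weight_kg):
    """Return (rda, issn_low, issn_high) in grams of protein per day."""
    rda = 0.8 * weight_kg        # IOM RDA for average healthy adults
    issn_low = 1.4 * weight_kg   # ISSN minimum for people who exercise
    issn_high = 2.0 * weight_kg  # upper end of the ISSN suggested range
    return rda, issn_low, issn_high

# Example: a 154-lb (about 70-kg) adult
weight_kg = 154 / LB_PER_KG
rda, low, high = protein_targets(weight_kg)
print(f"RDA: {rda:.0f} g/day; ISSN range: {low:.0f}-{high:.0f} g/day")
```

For a 70-kg adult, this works out to roughly 56 g/day at the RDA and about 98 to 140 g/day under the ISSN range for exercisers.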
