Blog

How to Interpret Headline-Breaking Science

We have all had the experience of standing in the checkout line and glancing at tabloids advertising fad diets and miracle foods, telling readers that if they eat this or that, they will lose 10 pounds in a week. Most of the time it is easy to spot headlines that lack truth and reliability, but what about news coverage of scientific studies? Even highly regarded news and media outlets often promote false or misleading claims about food and health. While doing so may be an advertising tactic, it is also likely an unintended consequence of condensing the highly complicated, ever-changing science of nutrition into a one-size-fits-all assertion.

While it’s easier to trust publications promoting scientific research than the media, both can be sources of inaccurate information and can spread misleading claims. Below are several tactics to employ the next time you come across what might be a major headline-breaking study.

Step 1 – Consider the story covering the study.

Source: Ask yourself where you learned about the study. Was it posted by a friend on social media? Highlighted on the nightly news? Published in a major newspaper? Consider whether the study and its findings are being interpreted and promoted to create a catchy tagline that optimizes viewership. Is it too good to be true? Or maybe it’s the opposite: is it generating fear?

Who is the author? Is the article written by a trained scientist, a food blogger, or a journalist? Do they have a history of writing balanced, scientifically sound articles, or have they historically written more emotionally charged content? Consider both the credentials and the potential motives of the author.

What’s the full story? It is impossible to capture all of the findings, strengths, and weaknesses of a study in a single headline, so it’s important to dig for details. In addition to bypassing the headline for the full article, you should also check out any source material to ensure it supports the headline and article. Not all writers are scientists with the expertise to adequately evaluate a study and put its findings into the broader context of related evidence on the topic. Further, whether intentionally or not, writers may highlight the parts of a study that best fit the narrative of their own story. Make sure to seek out and evaluate the study, not just the writer’s take on it.

Step 2 – Evaluate the study

Consider whether the study was peer-reviewed and published in a reputable journal. The peer-review process ensures that unbiased experts evaluate the quality of the study’s design and raise questions as needed. Peer-reviewed studies published in reputable journals, as well as reviews or guidelines published by international health agencies and policy-making bodies, are considered to have more reliable findings.

Check the timing. Sometimes stories reference studies that were published a year or more earlier in order to draw attention to an issue. Make sure to check the study’s publication date to determine if the findings are current and relevant.

Identify the study type. Studies vary greatly in their quality and design. It is important to remember that association does not equal causation. Some study designs can identify associations between a behavior or an exposure and a health outcome, but they cannot show that the outcome is caused by that behavior or exposure. Further, as with all study types, it is essential to consider confounding variables (i.e., factors that weren’t controlled for by the researchers but may affect the outcome of the study) and potential sources of bias, which could skew study findings.

Reputable research takes steps to reduce the risk that the researchers’ or participants’ preferences or biases, or even chance, could affect the study’s results. It is important to be aware that not all research is created equal, and neither are its results. This guide from FoodInsight may help you navigate the different types of studies and what their results may imply when the studies are carried out appropriately.

By following these steps to properly evaluate headline-breaking studies, you can be part of the solution to the growing problem of the dissemination of bad science and deceptive headlines. Time to put your white hat (and reading glasses) on.


Bold Food Concepts and Abstract Ingredient Combinations Shine at IFT 2018

Every summer, the Institute of Food Technologists (IFT) hosts its annual conference to bring together passionate people working to innovate within the food industry. The focal point of the multi-day event is food ingredients and the exciting opportunities to use these ingredients in new products. This year’s conference was held in Chicago from July 15-18, and featured presentations that explored trends in food business and innovation, along with an exhibition hall featuring food and food ingredient companies of all sizes, from startups to large multinationals. As with previous years, a few key trends stood out, ultimately forecasting what’s to come by way of new products headed to store shelves. Below are the top trends to look forward to this coming year:

  1. New Technology, Futuristic Solutions:  While the food industry has been shifting over the past several years towards products that promote wellness and sustainability, IFT2018 highlighted the first ever IFTNEXT Food Disruption Challenge, a competition that allows emerging food companies and entrepreneurs to pitch new products or processes leveraging modern technology to enhance the global food supply. The finalists chosen to share their innovations represented a diverse set of breakthrough solutions in the ingredient, packaging and sustainable agriculture space. The people’s choice award for Future Food Disruptor of the Year went to a processor of insect ingredients as a more environmentally-sound alternative to livestock production. The company’s protein concentrate may be used for sports nutrition products and certain beverages, while their textured insect protein may be used as a meat replacement for burgers or nuggets, or as an alternative to eggs or butter. However, the judges’ pick for the competition’s grand prize went to a company transforming an otherwise wasted by-product of soy milk production called okara into a gluten-free flour.
  2. Focus on Coffee: Beans have left the cup and are headed for the snack aisle. Producers are using new extraction technologies to bring dynamic coffee flavors to a range of products. A wide variety of confections, from cookies to cakes, featured classic coffee house flavors such as ‘latte,’ ‘espresso,’ and ‘cappuccino.’
  3. Color & Texture: As novelty and variety continue to entice buyers, many brands featured unusual textures and colors in everything from teas to jerky. Products with bright and enticing colors, such as turmeric yellow, abounded, and sparkling beverages prevailed. Additionally, products with new textures, such as kelp jerky, were featured as consumers seek out “unique textural experiences.”
  4. Florals: Regardless of the season, botanicals are in bloom. Companies are adding fresh, bright and seasonal floral flavors to new products. Blooms such as hibiscus, violet, honeysuckle, rose and elderflower were increasingly popular in the exhibition hall, contributing new color, taste and aroma to packaged foods. However, as this trend is still in its infancy, most of these florals are being paired with other, more familiar flavors to ease consumers into the trend.
  5. Salt Reduction Strategies: Companies specializing in savory items debuted products that work to deliver great taste while reducing the amount of sodium listed on a label. One booth presented new flavor enhancers that offered prominent umami and kokumi notes, allowing products to use less salt but still deliver satisfying, rich flavors. Hydrocolloids such as carrageenan are also being used to help reduce the salt content of foods such as lunch meats.

Find a full recap of this year’s show at iftevent.org and learn more about many of the food ingredients used in these new products here.

New York Hosts Summer Fancy Foods Show

The Specialty Food Association (SFA) hosted its biannual Fancy Food Show in New York City from June 30 to July 2, 2018. The international event included over 2,400 exhibitors across a huge three-story trade show, a main attraction for retail food distributors and food media editors scouting new food and beverage products their customers and readers will love. The focus is mainly on new packaged foods hoping to become the next superfood snack or unique cooking ingredient to gain traction with modern-day consumers. While the Fancy Food Show is one of the best opportunities to debut and promote new products, the success of these foods depends mostly on the combinations of food ingredients used to create new and exciting consumer experiences. Below are the top food trends and new products to look forward to this coming year:

1: Functionality: While prepared and packaged foods have always provided convenience, not all are recognized for their positive nutritional qualities. However, health and wellness prevailed at this year’s show. Many products are now balancing flavor and wellness, packing vitamins, protein, probiotics and more into foods and beverages that promise both great taste and good health.

2: Cauliflower: It looks like cauliflower is the new kale. While cauliflower-rice, “steaks” and purees now show up on restaurant menus, cauliflower is continuing to make its way to the packaged food aisle in increasingly creative ways. This year’s show featured cauliflower pretzels, pizza crusts and even a cauliflower-based baking mix.

3: Ice Cream: A whole new class of exciting ice cream flavors debuted at this summer’s show. Standouts included those with unexpected flavors and ingredients such as black sesame, toasted rice and even puréed vegetables, in flavors like vanilla with zucchini, mint chip with spinach, and strawberry with carrot. Most of these products rely on emulsifiers and thickening ingredients, such as cellulose gum and gellan gum, to provide the creaminess and texture that customers love and expect from ice cream.

Find a full recap of this year’s show at specialtyfood.com and learn more about the ingredients that make all of these new products possible here.

Irish Moss: The History of Carrageenan’s Roots


If you have ever checked the list of ingredients on your favorite ice cream, yogurt, chocolate milk or frozen pizza, you’ve probably seen carrageenan listed. Whether you have noticed it before or not, carrageenan has been used in packaged foods for over 50 years, and its history in the world’s food supply dates back even further.

Chondrus crispus

Carrageenan is made from a type of red seaweed known as Chondrus crispus.  Archaeologists estimate humans have been harvesting seaweed, like Chondrus crispus, for nearly 14,000 years. Evidence of red seaweed’s medicinal benefits in China can be traced back to 600 BC, and it was originally used as a food source around 400 BC on the British Isles.

Often referred to as Irish moss, the thick seaweed used for carrageenan grows abundantly along the rocky coastline of the Atlantic, including the shores of the British Isles, North America and Europe. This seaweed is especially abundant along Ireland’s rocky coastline, where it has been cultivated for hundreds of years for both its gelling properties in foods as well as purported medicinal purposes. In fact, carrageenan’s name comes from Carrigan Head, a cape near Northern Ireland, the title of which was inspired by the Irish word “carraigín,” which translates to “little rock.” In the 19th century, the Irish believed carrageenan could cure sick calves along with human colds, flu and congestion. First, the seaweed was harvested and laid out to dry. Then it was washed and boiled before being added to flans, tonics and even beer. Used similarly to gelatin, carrageenan became a key ingredient in the classic Irish pudding, Blancmange, a delicately-set cream dessert. Blancmange is still made in Ireland, where whole pieces of dried red seaweed can be purchased in local markets.

The Irish Potato Famine

Carrageenan was also used to combat nutritional deficiencies in the 1800s during the Irish Potato Famine. The red seaweed was added to warmed milk with sugar and spices to create a fortified beverage. This drink is still consumed today in both Ireland and the Caribbean. As Irish immigrants fled the famine and came to the United States, the first American seaweed farming operation was established off the coast of Massachusetts. However, it wasn’t until World War II, when a similar ingredient called agar was no longer available, that carrageenan soared in popularity in the US food supply.

Carrageenan Today

Since the mid-20th century, carrageenan has been, and continues to be, used in many products such as chocolate milk, ice cream, frozen foods and many organic items. It is now consumed in nearly every region of the world, including the US, Europe, China, Japan and Brazil. For more information on carrageenan, please review our Sources of Food Ingredients or visit Marinalg.org.


Changes Headed to a Food Label Near You

In the spring of 2016, the U.S. Food and Drug Administration (FDA) announced changes to the Nutrition Facts Label for packaged foods. The changes were made to help consumers make more informed and healthful dietary decisions. While you may have already seen this new format on food products, the FDA has extended the compliance deadline to 2020, and manufacturers with less than $10 million in annual food sales have until Jan. 1, 2021 to comply.

So, what’s different?

  1. New Look, Bigger Font
    The type size for the total calories, serving size and number of servings has been increased and bolded. Along with making the serving size more visible, the serving sizes themselves have been updated to reflect typical consumption. However, the serving sizes listed on food products are not recommendations from the FDA, but rather measurements intended to reflect realistic intake. For example, the serving size for ice cream was previously 1/2 cup and is now 2/3 cup.
  2.  “Added Sugars” make it to the label
    A line for “Added Sugars” has been added to the label beneath the listing for “Total Sugars” to help consumers understand how much sugar has been added to a product. This number does not include the naturally-occurring sugar found in foods such as fruits, vegetables and milk, which is accounted for under “Total Sugars.” These new designations are meant to help consumers understand the source of sugar in their foods.
  3. “Calories from Fat” removed from the label
    Research has shown that the type of fat consumed (e.g., polyunsaturated fat) is more important to consider than the total calories from fat alone. Therefore, the FDA has removed the “Calories from Fat” line, but will continue to require listing of “Total Fat,” “Saturated Fat” and “Trans Fat.”
  4. Vitamin D and potassium amounts now required
    This change is based on research from the Institute of Medicine (IOM) showing that Americans do not always get the recommended amounts of vitamin D and potassium. These nutrients must now be listed in order to increase awareness of their recommended intakes. Similar information for vitamins A and C may still be included, but their inclusion is now voluntary, as deficiencies of these vitamins are rare today.
  5. New footnote on “Percent Daily Value” (% DV)
    The footnote at the bottom of the label has been updated to read as: “The % Daily Value tells you how much a nutrient in a serving of food contributes to a daily diet. 2,000 calories a day is used for general nutrition advice.” This change has been made to better explain what daily value means.
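To make that footnote concrete, here is a simple worked illustration (using sodium, whose daily reference value on the updated label is 2,300 mg, purely as an example). The percent Daily Value is just the amount of a nutrient in one serving divided by its daily reference value:

$$\%\,\text{DV} = \frac{\text{amount of nutrient per serving}}{\text{daily reference value for that nutrient}} \times 100$$

So a soup containing 460 mg of sodium per serving would be labeled 20% DV for sodium, because 460 / 2,300 = 0.20; one serving supplies a fifth of the reference amount for the day.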

Facts Up Front

While the above changes will be required and regulated by the FDA, manufacturers can opt to include an additional ‘Facts Up Front’ label on the front of packaging. Introduced by First Lady Michelle Obama in 2010, this nutrition labeling system places the amounts of calories, saturated fat, sodium and sugars per serving side by side in a simple format on the front display area of a food product. Small packages that cannot fit all four nutrients may display only one icon, for example, calories per serving. If the package size permits, manufacturers may also include up to two “nutrients to encourage” if the product has more than 10 percent of the daily value per serving of potassium, fiber, protein, vitamin A, vitamin C, vitamin D, calcium or iron. This optional label is designed to act as a convenient tool to help consumers understand the nutrient quality of foods at first glance.


History in the Frozen Food Aisle

Frozen foods have long been a staple in the Western diet, but they have evolved considerably in terms of safety, quality and packaging compared to the first commercially-sold frozen food.

Early Days & Fish Fillets

The founder of frozen food as we know it today (à la a bag of frozen peas in the freezer) was Clarence Birdseye. Birdseye was inspired to develop a mechanized fast-freezing method after watching the Inuit of Labrador, Canada, preserve freshly-caught fish with the wind, ice and cold weather. As the climate in his hometown of Brooklyn didn’t allow for such fast freezing, Birdseye invented the multi-plate freezing method, in which food is pressed between two chilled metal plates. The method later grew to involve two chilled conveyor belts for faster freezing. The first machine was designed to freeze only haddock and “seal in every bit of just-from-the-ocean flavor,” as noted by Birdseye himself (check out a drawing here). With this invention, the foundation was laid for the entire frozen food industry.

Conveyor Belts to Home Kitchens

Birdseye established the General Seafood Corporation in 1924, offering only frozen fish fillets. As production expanded to include meats and produce, Birdseye joined forces with the Postum Company in 1929 to create the General Foods Corporation. With this expansion, Birdseye launched a marketing campaign to familiarize Americans with the new frozen food category. However, Birdseye’s frozen food didn’t reach mass popularity until the 1940s, when most American households purchased their first freezer. In addition to developing refrigerated boxcars to transport his foods, Birdseye was also involved in developing grocery store freezer display cases, which paved the way for the sale of frozen television dinners and fish sticks.

Did you know? Swanson executives came up with the first frozen dinner trays after a turkey surplus left them with too much turkey after Thanksgiving.

Today, food technologists and scientists continue to improve freezing methods and ingredients. While Birdseye’s method of flash freezing remains popular, there are also air-blast, spiral-belt, and cryogenic freezers, to name a few. Additionally, foods that were once hard to freeze can now be flash frozen with the help of added ingredients that, for example, prevent enzymatic reactions in fresh produce and preserve texture in frozen desserts. Modern innovations in “smart packaging” also maintain freshness and keep products from thawing (think of the plastic film over your favorite frozen mac and cheese).

Considering frozen food? Here are the top 5 reasons to go frozen!

  1. Cut down on meal prep
    Pre-cut frozen vegetables are usually faster and simpler to cook than whole, unwashed produce.
  2. Get more nutrients
    Since frozen foods are packaged at peak freshness, they often contain more nutrients than their fresh counterparts.
  3. Reduce food waste
    Frozen food reduces the amount of food thrown away due to spoilage.
  4. Easy portion control
    Many frozen foods are sold in single-serve packages, which allow for easy portion control.
  5. Cut food costs
    Frozen produce is typically much less expensive than fresh.

The best way to prevent freezer burn? Make sure your item is in an airtight container! As food sits in the freezer, moisture escapes from its surface and refreezes when it meets the air, creating ice crystals and altering the texture of the food.

From Appert to the Ball Brothers: A History of Canning

It’s hard to imagine a world without a jar of strawberry jam in the cabinet, beans from the tin, a can of tuna for a quick lunch or a trusty can opener. And while preservation methods such as drying, curing, freezing, pickling and fermenting have deep roots in ancient food cultures, the process of canning is fairly new. In 2013, the Can Manufacturers Institute estimated the US and Europe go through 40 billion cans a year – a far cry from the early days, when a single can could take over six hours to make and weighed around seven pounds.

The Great Seal

The process of preserving food in a hermetically-sealed jar or tin was the answer to a problem posed by the French and English governments: as their armies subsisted on salted meat and hardtack, the need for more nutritious and non-perishable food was great. In France, Napoleon saw the toll poor nutrition took on his men, and in 1795 the French government launched the Preservation Prize, offering 12,000 francs to anyone who could improve the process of preserving food. In 1810, French chef Nicolas François Appert offered a solution – canning.

As a chef, confectioner and scientist, Appert made many contributions – the invention of bouillon cubes, nonacidic gelatin extraction and improvements to the autoclave – however, it was food preservation that earned him the greatest praise (and the 12,000 francs). Appert created a method of hermetically sealing glass jars with cork, wire, wax and boiling water, believing the key to non-perishable food was to heat and seal jars to keep decay out. Bacteria’s role in spoilage would not be fully understood until Louis Pasteur developed the process of pasteurization in 1863.

Appert published his work in L’Art de conserver, pendant plusieurs années, toutes les substances animales et végétales (The Art of Preserving All Kinds of Animal and Vegetable Substances for Several Years) in 1810. For those that purchased the book, a small note attached to the cover included Appert’s address so that skeptics could come to his home and purchase preserved goods.

While Appert’s method was effective in preventing spoilage, the glass jars were cumbersome and had a tendency to explode. The answer to these issues came from England, where the government was also struggling to supply long-lasting rations to its navy and Arctic explorers. In June of 1813, Bryan Donkin served King George III and Queen Charlotte canned beef… from a tin. British merchant Peter Durand had patented the method of storing food in cans made of tin in 1811, on behalf of French national Philippe de Girard (who invented the method). Durand sold the patent to Donkin, who was able to deliver canned food to the royal table and produce cans on a larger scale. Following approval from the Royal Family, Donkin’s cans were immediately placed on British ships. One surgeon aboard a naval vessel in 1814 noted that the tinned food offered “a most excellent restorative to convalescents, and would often, on long voyages, save the lives of many men who run into consumption [tuberculosis] at sea for want of nourishment after acute diseases; my opinion, therefore, is that its adoption generally at sea would be a most desirable and laudable act.”

Across the Pond

The first cans arrived in America in 1825, when Thomas Kensett and Ezra Daggett sold their patented cans filled with oysters, fruits, meats and vegetables to New Yorkers. However, canned food didn’t achieve commercial success in the USA until Gail Borden’s 1856 invention, condensed milk. Milk was hard to keep fresh and costly to source in urban areas such as New York, and Borden’s condensed milk addressed a growing problem. When the Civil War broke out, the demand for canned food and milk increased exponentially.

The one remaining caveat to canned food at the time was how to open it. Early cans were often reinforced with stronger metals, and a hammer and chisel or knife were the only ways to open them. The first incarnation of a can opener wasn’t invented until 1860, by the American Ezra J. Warner. Still slightly crude and cumbersome, Warner’s opener was used mostly through the war and by shop clerks; a more consumer-friendly opener didn’t arrive in home kitchens until the 1920s.

As can consumption increased, so did the science and methodology behind safer canning. In 1895, a team at the Massachusetts Institute of Technology (MIT) set out to solve the problem of smelly canned clams that swelled with gas released by bacterial metabolism. Researchers Samuel Cate Prescott and William Lyman Underwood found that the bacteria causing the cans to swell were not killed by boiling the cans, but could be eliminated by “applying pressurized steam at 120 ˚C [which] killed the bacteria in 10 minutes.” This finding disrupted the industry, changing the way cans were created and adding pressure to the process.

The Home Front

Home canning was slower to take off than tin canning. The USDA made its first reference to the canning process in Farmers’ Bulletin 359 from May 1909, entitled “Canning Vegetables in the Home,” followed by “Canning Peaches on the Farm” in 1910. These guidelines outlined the safest method for home canning, known as fractional sterilization, a multi-day process in which jars are boiled three times for an hour each. Additionally, home canners no longer relied on Appert’s method of corking jars, following John L. Mason’s creation of the metal screw-top in 1858 and Alexander H. Kerr’s two-part canning lid developed in 1915 (the lid most canners use today).

Tin can production increased to feed soldiers through World War I and World War II, and home canning also saw a large increase during this time. Communal canning centers were established in WWI with the help of the Ball Brothers Company, and ‘pressure canners,’ placed on top of a stove in home kitchens, became available. Canning reached its peak during WWII, as food rations for both the front line and the home front were cut. Because sugar was highly prized and highly rationed, households that canned would receive extra pounds of sugar, which increased the popularity of canning tremendously. However, as food rations were lifted, the incentive to can decreased, and so did home canning.

Canning Today

Home cooks around the world continue to can, but far less than in the 1930s and 40s. Canning is still an excellent way to capture the taste of a season – from peaches and tomatoes in summer to apples in the fall. If you’re interested in taking up canning, there are a few helpful additives and food ingredients that will help you produce better results in your kitchen. If you’re starting with something savory, canning/pickling salt or salt substitutes (which offer the same salty taste without the increase in sodium) create excellent pickled products. Acidulants, or acids, are a key component in canned produce; sources of acid include vinegar, lemon juice, citric acid or even ground aspirin. To add a touch of brightness to your mason jar, there are color enhancers and colorants, including citric acid for preserving the color of just-cut fruit, ascorbic acid to prevent browning, and sulfites to prevent both spoilage and color changes. Finally, when canning items with a high proportion of liquid, there are texture enhancers and thickening agents, such as food-grade calcium chloride or a variety of starches. Pickling lime can improve your pickles, and pectin will yield better canned fruits.

For official guidelines on home canning, consult the USDA’s Complete Guide to Canning or National Center for Home Food Preservation’s safe canning guidelines.

Don’t fear ingredients in your food!

The term “chemophobia” is defined as an aversion to or prejudice against chemicals or chemistry. It also refers to an exaggerated or irrational distrust of certain foods, including food ingredients or food additives. Over the past several years, food companies and the media have perpetuated chemophobia amongst consumers by declaring the removal of certain ingredients or additives from their products. These announcements are typically not based on safety reasons but rather because the scientific name(s) of ingredient(s) are unfamiliar or sound intimidating. The fact is, while consumers have every right to avoid certain foods or ingredients based on personal preference, there’s no reason to be fearful of them. All components of food are safe and regulated by government authorities responsible for protecting public health. Let’s take a look at the truth behind claims that certain ingredients are “scary” or “unclean” and conquer chemophobia once and for all.

  • The term “clean” is appropriate after washing dirt off your produce, not when interpreting labels. Regardless of the length of ingredient lists or the way ingredients sound, foods that contain unfamiliar ingredients or additives are not “dirty.” In fact, in many cases they help ensure that foods are safe to eat and free of pathogens that could cause foodborne illness. The trend of using the term “clean” to describe a diet that is free of additives has not only created a misconception that it is a safer way of eating, but it is also now falsely associated with positive health outcomes such as weight loss. The truth is that reducing the amount of calories consumed, not the amount of ingredients or additives, is what helps produce weight loss.
  • Food science is beneficial, and shouldn’t scare you. One common tactic used by groups to paint food ingredients and additives in a negative light is to suggest their names should scare us. Ingredients like xanthan gum, titanium dioxide and sodium phosphate may sound odd, but oftentimes additives are named based on their original sources, such as minerals, salts, or other naturally-occurring substances. What’s more, these additives play important technical roles in foods, such as enhancing their nutritional value, improving texture or consistency, making foods more convenient to prepare, extending shelf-life, and contributing to a more sustainable food supply. A quick online search can help you identify where an ingredient’s name originates and what purpose it serves in a food.
  • All foods are complex, meaning they contain many naturally-occurring ingredients. In 2013, James Kennedy, a renowned chemistry teacher and blogger, published a poster series called the “All-Natural Banana.” This series showed the abundance of chemicals and ingredients that occur naturally in foods and fruits enjoyed by consumers every day. A banana, for instance, naturally contains over 50 ingredients that include maltose, proline, tyrosine and myristic acid. In this series, Kennedy addressed the fact that natural foods are typically more chemically complicated than foods considered to be manufactured or processed. Some of these naturally-occurring ingredients may be potentially harmful if consumed at extremely high levels, but the government prevents this by regulating the levels of ingredients food companies are allowed to use. Further, these ingredients are not consumed alone in concentrated forms, but instead in the context of a total diet.
  • Only “food-grade” ingredients can appear in foods. A common fear promoted by self-described health “experts” is that if an ingredient appears in a nonfood item, it has no business in foods. What these individuals fail to recognize is that some ingredients have different grades depending on the application. For example, an “industrial-grade” phosphate is produced under different conditions than a “food-grade” phosphate and is not allowed to be used in foods. Phosphate food ingredients must be made under strict manufacturing conditions directed by the laws enforced by the US Food and Drug Administration (FDA) and other regulatory agencies.
  • A long ingredient list doesn’t determine the healthfulness of foods. Governing agencies not only regulate the safety and content of foods, but also how their ingredients appear on labels. For example, the FDA requires that an ingredient list including baking powder also list all of its sub-ingredients, which looks like this: baking powder (sodium bicarbonate, sodium aluminum sulfate, cornstarch). Imagine how long an ingredient list for a whole grain baked item would be! By law, nutrient-dense products with a combination of ingredients and flavors are obligated to include long, scientific-sounding ingredient lists.

The bottom line: Not recognizing or knowing the origin of an ingredient name should prompt curiosity, not fear, since the consumer safety of food ingredients is determined and monitored by qualified scientists and government agencies. Do your own research about ingredients you may not be familiar with and base your opinions on high-quality, peer-reviewed scientific studies, not marketing campaigns. Don’t let chemophobia determine what foods you should or shouldn’t eat, or hold you back from eating the foods you enjoy!

To learn more about the different food ingredients in your food and where they come from, check out the Facts on Food Ingredients webpages on this website.

Codex Alimentarius: Protecting the Health and Safety of Consumers

Have you ever wondered how your food is grown or made? Or, if your food comes from another country, how do you know it’s safe? Who sets standards for foods that come from other countries? If you’ve ever asked these questions or pondered similar thoughts, you should learn more about the Codex Alimentarius.

Codex Alimentarius, established in 1961, is an international standards-setting body overseen by the World Health Organization (WHO) and Food and Agriculture Organization (FAO) of the United Nations with the mandate of ensuring a safe food supply and facilitating international trade.

Codex brings together over 400 member nations and recognized non-governmental organizations (NGOs) to establish nutrition, safety and trade standards that are reflective of sound science and fair trade practices. These standards help local farmers and smaller nations gain access to foreign markets while helping to guarantee safety for consumers everywhere. Food producers that wish to export their products to foreign markets can follow Codex standards to ensure safe growing, processing, packaging, testing and shipping practices, while knowing their products will be accepted in most national markets around the world. This helps promote international food trade and protect consumers in the increasingly globalized food supply chain.

Numerous Codex committees and task forces meet on a regular basis to discuss and propose standards that help promote the safety and availability of food. These meetings are transparent, inclusive and operate on consensus, ensuring that no single country or organization is able to dominate the discussion. This provides a level playing field where consumer safety and science are prioritized and all voices are heard.

Standards adopted at the committee level are then reviewed for approval by the Codex Commission. If adopted, the standards are issued to all Codex stakeholders and posted online. Member nations may then adopt the standards into their own regulations or instruct local growers and producers of the standards. Companies may also follow the standards voluntarily if they intend to export their products. These transparent standards, which are readily available in multiple languages, allow all food producers and countries to follow consistent and globally accepted guidelines.

When it comes to the growing needs of the global population, Codex ensures food traded between nations is safe, healthy and available. The standards put forth by Codex help foster uniform practices for all food producers. Codex benefits all of those involved in the food trade, from local farmers and small businesses to the largest importing and exporting economies.

To learn more about Codex and how it works, watch these videos on our Multimedia page.

FDA finds no safety concerns with common emulsifiers at current consumption levels

New research from scientists at the U.S. Food and Drug Administration (FDA) found no safety concerns with several ingredients commonly found in food. The research, which focused on ingredients known as emulsifiers – in particular, sodium carboxymethylcellulose (CMC) and polysorbate 80 (P80) – was conducted in response to earlier research claiming these two ingredients have negative effects on the human body.

Emulsifiers are used in food to produce a consistent blend of two or more ingredients. They ensure the ingredients remain mixed and don’t separate at any point from processing to consumption, which reduces food waste and makes the food look and taste more appealing. Common examples of foods containing emulsifiers include ice cream, salad dressings, margarine, chocolate, breads and other baked goods, desserts, candy, cheese and some beverages.

Over the last two years, a few researchers have made claims about the potential impacts these ingredients may have on microbes in the human gut. Due to the importance of gut microbes in the digestive tract and on overall health, it was important to investigate these claims and ensure that the use of these common food ingredients remains safe.

To understand whether the researchers’ claims were valid, a team of scientists from the U.S. Food and Drug Administration (FDA) conducted a review of several emulsifiers commonly used in food to determine whether these ingredients could pose any risk to human health. The study focused on CMC and P80, which were implicated in the negative research.

The FDA’s findings directly refute the earlier research linking these two ingredients to disruptions in gut microbes. These findings also raise serious questions about the validity of claims that CMC and P80 specifically, and all emulsifiers in general, cause other negative effects on the body.

The study reviewed dietary exposure to the emulsifiers over the course of two different time periods. It relied on Acceptable Daily Intake (ADI) levels established by the Joint FAO/WHO Expert Committee on Food Additives, an international expert scientific committee that is administered jointly by the Food and Agriculture Organization of the United Nations (FAO) and the World Health Organization (WHO). The key findings from the FDA study included the following:

  • The amount of emulsifiers found in consumer food hasn’t substantially increased over the past 15 years.
  • Of the seven emulsifiers tested, both CMC and P80 were at the lower end of the spectrum for exposure at their current levels.

The FDA’s findings provide significant evidence that emulsifiers remain safe at the levels currently consumed and that claims suggesting these ingredients are harmful are not valid. While it is important to continually review the ingredients used in food and ensure they remain safe, it is equally important to review new research that draws negative conclusions and validate it through additional research and investigation by qualified scientists like those at FDA.

To read the abstract and full study, click here.
