Fighting fallacies: researchers battle false factoids that surround people every day
By Stephanie Jacques
Kansans don't have accents. Natural ingredients are healthier. One flea is all it takes to make a dog itch.
Although most popular beliefs like these are harmless, they can become a barrier to researchers trying to disseminate their findings to a broader audience. Kansas State University researchers in English, food science and veterinary medicine are going toe-to-toe to give everyday preconceived notions a solid research-data right hook.
Fallacy: Among Americans, Kansans don't have an accent.
Fact: Kansans are developing a Hollywood-style accent.
Mary Kohn, associate professor of English, researches how people pronounce words and how those pronunciations change over time. According to Kohn, the change in Kansans' accents happened around the same time as similar changes in California accents and in many other areas across the country.
"We do have an accent," Kohn said. "All dialects do, but one of the cool things about this is that we can show how our accent is looking similar to what has developed in several pockets of the country. That change has developed this stereotype as being Californian."
Kohn said accents are typically identified by the way vowels are pronounced. In English, the tongue's movements produce around 12 to 13 different vowel sounds, which can crowd the mouth. According to Kohn, one way to deal with that is to lose a distinction or two.
"What we find is that Kansas is participating in a vowel shift that we call the California Vowel Shift," Kohn said. "In other words, our young Kansas participants — really anyone younger than 65 or so — are cultivating an accent that sounds very much like you would expect from someone from California."
One vowel distinction Midwesterners have dropped is between the words "cot" and "caught." "We actually completely lost the vowel difference between 'cot' and 'caught,'" Kohn said. "Most Midwesterners can't hear that sound difference. If you ask them to produce it, they really struggle, but if someone from Great Britain says 'cot' and 'caught,' you could hear the difference."
Kohn said Kansans and Californians say both "cot" and "caught" as "c-ah-t." While the pronunciation of "cot" is the same across most U.S. dialects, the vowel in "caught" is pronounced differently in areas without the California Vowel Shift. For example, a New Yorker might say "caught" as "c-uh-ah-t" and a Southerner might say "caught" as "c-a-ow-t."
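Sociolinguists typically quantify a merger like the one between "cot" and "caught" by measuring each vowel's first two formants (F1 and F2, the resonant frequencies that distinguish vowels) and checking how far apart the two word classes sit in that acoustic space. The Python sketch below illustrates the idea with hypothetical formant values; the numbers, speaker labels and 100 Hz threshold are invented for illustration and are not from Kohn's data.

```python
from math import dist
from statistics import mean

# Hypothetical F1/F2 formant measurements in Hz for the vowels in
# "cot" and "caught" (invented for illustration; not Kohn's data).
tokens = {
    "merged_speaker": {
        "cot":    [(760, 1220), (745, 1250), (770, 1205)],
        "caught": [(755, 1230), (765, 1215), (750, 1240)],
    },
    "distinct_speaker": {
        "cot":    [(760, 1220), (745, 1250), (770, 1205)],
        "caught": [(620, 980), (640, 1010), (615, 995)],
    },
}

def class_mean(formants):
    """Average F1/F2 across a speaker's tokens of one word class."""
    return (mean(f1 for f1, _ in formants), mean(f2 for _, f2 in formants))

for speaker, words in tokens.items():
    cot_mean = class_mean(words["cot"])
    caught_mean = class_mean(words["caught"])
    # Euclidean distance in F1/F2 space: a small distance suggests the
    # two vowel classes have merged for this speaker. The 100 Hz cutoff
    # is a rough illustrative threshold, not a published criterion.
    separation = dist(cot_mean, caught_mean)
    status = "likely merged" if separation < 100 else "likely distinct"
    print(f"{speaker}: cot-caught separation {separation:.0f} Hz ({status})")
```

In practice, researchers pool many tokens per speaker and use more robust overlap statistics, but the distance comparison captures the core logic: merged vowels occupy the same region of the formant space.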
Since the end of World War II, the California Vowel Shift has been popping up across the nation, mostly in areas with a highly mobile middle class.
"We can watch the sound change across generations," Kohn said. "We haven't pinpointed exactly what the social cause of it is yet — and it is going to have to be something that can explain how this same accent pops up in a bunch of different places all at once."
Kohn's sociolinguistic research can help educators and speech pathologists distinguish between a dialect difference and a pathology.
"An understanding of dialect difference can help teachers tell when a student is really struggling or when a student just speaks a dialect that differs from expectations in the classroom," Kohn said. "Misunderstandings about linguistic differences can lead to misdiagnoses of speech and learning disorders."
Fallacy: The phrase "natural ingredient" has an official definition and is synonymous with healthy.
Fact: Consumers differ greatly in what they consider a natural ingredient, and the Food and Drug Administration has not defined the term.
The U.S. Food and Drug Administration, which oversees most food products, has yet to define the term "natural ingredient" because many people interpret it in a variety of ways, according to Edgar Chambers IV, university distinguished professor of food, nutrition, dietetics and health. Chambers and his collaborators are involved in three studies to determine what people consider natural ingredients.
"With the exception of meat, there is not a definition of natural for most food products in the U.S.," Chambers said. "Companies can call just about anything natural and get away with it."
Chambers' collaborators are Edgar Chambers V, research technician; Mauricio Castro, doctoral student; and Thao Tran, master's degree student. Each is in charge of one of the three studies, which together account for more than 2,000 participants. Combined, the studies found no consistent majority of people who agreed on the naturalness of ingredients or on the reasons why an ingredient was not natural.
"The problem with defining 'natural' is that what you think is natural is not the same as what I think is natural," Chambers said. "We are finding from our U.S. research that the term 'natural' is so varied in meaning and it could referKansas State University Herbarium to anything from the growing to the processing to the health issues."
Top reasons participants gave for saying an ingredient was not natural included that they did not know what the ingredient was; that it was unhealthy or raised health concerns; that it was genetically modified; that herbicides or pesticides were used in growing it; that it was disgusting; that it was illegal; or that it had been changed in some way that could not be replicated in a traditional home kitchen.
"Natural is so misunderstood," Chambers said. "It's so confusing to people. It means something different to so many different people, so it's a rather difficult thing to define, which is why the FDA hasn't defined it."
In one of the studies, 37.2 percent of people said wheat flour — the most common type of flour used in breads, cookies and cakes — was not natural because they didn't know what it was, it contained gluten or it had been processed. The term "wheat flour" was used in the study to differentiate from other flours such as sorghum flour.
"One of the biggest reasons people had for calling something not natural was if they thought it was not healthy; if it's not healthy, it can't be natural," Chambers V said. "This included ingredients with chemical-sounding names, names that sound similar to unhealthy ingredients and those that have a misperception of being unhealthy."
The ingredients deemed most unnatural were insect powder, or ground insects used to increase protein, at 93.4 percent; sodium bicarbonate, or baking soda, at 87 percent; and corn syrup — not the high fructose kind, just regular corn syrup — at 80.5 percent. Salt was called unnatural by 45.6 percent of participants and gluten by 23 percent. Corn — historically the most genetically modified crop — was considered the most natural ingredient, by 69 percent of people, and sea salt was considered natural by 60 percent of people.
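Figures like these come from asking each panelist to rate individual ingredients and then tallying the share who call each one not natural. A minimal Python sketch of that tally follows; the respondents and ratings below are invented for illustration and are not data from the Chambers studies.

```python
from collections import Counter

# Hypothetical survey responses (invented for illustration; not data
# from the Chambers studies). Each respondent rates each ingredient
# as "natural" or "not natural".
responses = [
    {"insect powder": "not natural", "sea salt": "natural",     "corn": "natural"},
    {"insect powder": "not natural", "sea salt": "natural",     "corn": "not natural"},
    {"insect powder": "not natural", "sea salt": "not natural", "corn": "natural"},
    {"insect powder": "natural",     "sea salt": "natural",     "corn": "natural"},
]

# Tally "not natural" ratings per ingredient.
unnatural = Counter()
totals = Counter()
for respondent in responses:
    for ingredient, rating in respondent.items():
        totals[ingredient] += 1
        if rating == "not natural":
            unnatural[ingredient] += 1

# Report the share of respondents who rated each ingredient not natural.
for ingredient in totals:
    pct = 100 * unnatural[ingredient] / totals[ingredient]
    print(f"{ingredient}: {pct:.1f} percent rated it not natural")
```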
"To consumers, this makes complete sense, but to food scientists it makes no sense, which makes it harder to define," Chambers said. "When something like this isn't defined, it can be used any way companies want, creating a niche market." According to Chambers, natural food stores account for 20-25 percent of the market, and it's growing. To avoid confusion, many companies have started labeling products as what they aren't —non-GMO, no high fructose corn syrup and 100 percent organic — instead of using the term "natural."
"There is a lot consumer education that needs to happen," Chambers said. "We want consumers to understand that because natural is not defined, its use is not very helpful to them. We all should know where our food comes from."
Fallacy: Fleas jump from pet to pet.
Fact: Once a flea finds a host pet, it stays there until it dies.
Michael Dryden, university distinguished professor of diagnostic medicine and pathobiology in the College of Veterinary Medicine, has been fighting fleas — and the myths surrounding them — since the 1980s. Dryden's research in flea biology and habits has helped develop better flea treatments and rid pets of the itchy parasites.
"There were a lot of people who had the misconception that fleas jump on, feed, jump off, lay eggs in cracks and crevices, and then find another host," Dryden said. "The fleas on our pets don't do that, and overcoming that myth radically changed how we approach flea control today."
Dryden, who is known as "Dr. Flea" for his expertise, said the fleas on pet cats and dogs evolved to feed on Africa's big cats and dogs. If these fleas jumped off a host, as rodent fleas do, finding another host would be too difficult. Understanding this difference shifted flea control toward proactive prevention rather than reactive flea baths and house exterminators.
"If fleas start reproducing, they don't leave," Dryden said. "They lay their eggs on the animal, which roll off into the environment like little pingpong balls. Once we realized that reproduction was tied to the infected dog or cat, it changed everything in how we approached control."
Dryden and his colleagues started testing methods to attack the flea's reproduction in Florida, the flea capital of the world because of its warm climate. Untreated dogs there average 100 fleas per animal, Dryden said. Using these animals, Dryden and his colleagues have tested numerous flea products ranging from sprays, spot-ons and ultrasonic collars — Dryden said they don't work because fleas can't hear ultrasound — to the new oral chews, which kill all fleas on the pet within eight to 12 hours.
"With these newer oral treatments, flea numbers dropped dramatically, with 99-100 percent of pets flea-free in less than two months," Dryden said. "It's so much faster than the topical treatments, which result in 50-70 percent of pets being flea-free at the end of the two- to three-month studies in Florida."
Dryden said the rate at which fleas are killed matters because most fleas take a blood meal within five minutes of jumping onto Fido and lay their eggs within 24 hours. To prevent an infestation on the pet and in the home, it's critical to kill female fleas before they lay eggs.
"The idea is, if we can kill the fleas before they lay eggs, they will go extinct within a generation," Dryden said. "Generally, down in Florida, that's within two months."
One of the myths Dryden has disproved is that the bite of a single flea is enough to make a pet scratch. The researchers worked with a board-certified dermatologist to determine whether it takes many fleas or just one to cause a pet to scratch, and found that it takes several fleas to trigger the allergic reaction. The researchers also are working with pet owners to help them understand that if a pet has fleas, there also is an infestation of immature stages — eggs, larvae and pupae — in the home.
"It's really difficult to overcome those dogmas because people become entrenched," Dryden said. "We all have preconceived notions — good, bad or otherwise; they are there. Our job as scientists is to design a study that regardless of what perceptions are out there, at the end of the day, you've got good undeniable data."
Read the rest of Seek and see the PDF version of this story from New Prairie Press.