Eating organic used to be a fringe commitment. Not anymore. The idea that the adage "you are what you eat" actually has merit—that America's industrialized food system is making consumers (literally, consumers) obese, diabetic, and primed for heart disease—has converted millions of us into pursuers of the American Organic Dream: Eat Organic To Live Longer and Better.

But many aren't buying it. Most consumers, for example. Although sales of organic food have increased sixfold over the last decade, organics remain a tiny fraction of what Americans eat. Perhaps that's because organic food can cost up to twice as much as conventionally grown food. Or perhaps it's because, as critics of the organic food movement argue, there's just not much solid evidence that going organic makes you any healthier. On this view, the race by food makers to slap labels like "farm-grown," "free-range," and "all natural" on their products is more about catching a fad than about upgrading our food in any meaningful way.

Should we all go organic, and pay the premium, because few things matter more than our health? Or is the organic movement, and the firms cashing in on it, hawking a hoax, or at least grossly overstating the biological benefits to be had when the chicken we eat is raised with a bit more legroom?