Why study with someone? Because student foragers see what they want to see rather than what's in front of them. Let me give you a recurring example.

There are two plants which really do not resemble each other save for one feature: each has tiny stinging hairs. Otherwise the plants are quite different. One tends to have small, oval to lance-shaped leaves with teeth; the other has large, hand-shaped leaves without teeth. One likes a moist environment, the other a dry one. Their size can vary greatly. Their blossoms are very different. They are also in different families. They really do not look alike.

Confusing these plants is not like confusing a large pony for a small horse. It's on par with confusing a wolf and a mountain lion because they each have yellow eyes. Yet in less than six months I have had three reports of this confusion happening, all because of the stinging hairs. The reports also have one other amazing aspect in common: no one died, or even got sick, from eating the wrong part of the misidentified plant. In fact, they liked it.

Urtica dioica, common stinging nettle

Whenever one can add another weed, or part of a weed, to the edible list, that is good. The two plants being confused are the common nettle (Urtica dioica) and the spurge nettle (Cnidoscolus stimulosus). The Urtica clan grows around the world and has been used by man wherever it is found, not only for food but also for cordage and medicine. The stinging hairs present quite a problem. Most folks boil the leaves for a few minutes to render them harmless. The entire plant, however, can be wilted before a fire and made edible without having to boil it. There isn't much mystery left to the Urticas as they have been used for thousands of years and are well-recorded.

Cnidoscolus stimulosus, spurge nettle

Then there is the "spurge nettle," related in name only. It comes from a genus (Cnidoscolus) commonly found in the warmer areas of the Americas. In the southern United States there are two "spurge nettles": basically one east of the Mississippi River and one west of it. The C. stimulosus is east of the mighty muddy, and the C. texanus west, with some overlap. What we know about the C. stimulosus is that its root can be eaten after being boiled. What we know about the C. texanus is that its seeds can be eaten after being roasted (or otherwise cooked). But we don't know if the C. stimulosus seeds are edible or if the roots of the C. texanus are edible. However, a 1954 study suggested there were no toxins in the C. texanus root and that it should be investigated as a possible food source. A 1957 study also looked at the seed oil of the C. texanus. It was higher in linoleic acid and lower in saturated fat than cottonseed oil, has antioxidants, and is high in protein but absent of carotene and ascorbic acid. Specifically, the oil was 71% linoleic acid, 15.5% oleic acid, 10% palmitic acid, and 3% stearic acid.

At this point we have two minor mysteries: is the seed of the C. stimulosus edible, and is the root of the C. texanus edible? It would be a tad odd for these two closely related species to have dissimilar edible parts, though admittedly the great Mississippi was barrier enough that over time two species did arise. Being in the same genus does not confer edibility; sometimes a variation within a species can make an edible plant inedible. The Dioscorea bulbifera might be a good example.

If the respective seeds and roots of these two species were not mystery enough, in 2009 alone I had three reports from people saying they ate the leaves of the C. stimulosus! (Amounts eaten and the size of the persons went unreported.) One put them through a blender, made a smoothie, and drank them raw; another boiled them and then enjoyed them. Each thought they had consumed an Urtica. (One showed the plant to a friend who knew the difference, and the other sometime later noticed the Urtica and the Cnidoscolus were different species.) Both reported the leaves good, and neither reported any ill effects. This is not beyond the realm of possibility. The third person told me he and his grandmother used to fry the leaves and eat them.

The leaves of a Central American relative, the chaya (Cnidoscolus chayamansa), are edible if cooked, or if mashed up and allowed to sit for a few hours to remove some cyanide. (As blending is total cellular disruption, it might speed up the curing process.) Since the leaves of the chaya are used as a food, and people have eaten leaves of the C. stimulosus, this would suggest we investigate the C. stimulosus leaves as a possible wild food. And if the leaves of the C. stimulosus are edible, what of the leaves of the C. texanus?

Let us suppose for a moment that it is all true: that the roots, seeds, and leaves of both the C. stimulosus and the C. texanus are edible. If so, how was that information lost, if it was ever known?

I can understand native peoples not eating the foliage of either because of the stinging hairs, and they do sting badly. (Some chayas have stingers and some do not.) Yet these same people ate Urticas, which also sting. The seeds of the C. texanus are encased in a very well-armed seed pod, yet they, too, are known to be edible. One saving grace of the C. texanus seeds is that they can be roasted rather than boiled (unlike, perhaps, the leaves), so primitive man with fire might have learned of the seeds' edibility. Clearly stingers have not excluded plants or their parts as food. The real question mark, in practical terms, is the C. texanus root.

Roots are prime wild food, now and in the past, especially in the past. Roots and creatures that moved sustained humanity. We know the C. stimulosus, which produces a small root in comparison to the C. texanus, is edible. That knowledge was passed down from the first foragers. What's more, the C. texanus root grows quite large. That means it should have been an important food root to the first foragers if it is edible. Such an important root surely would have been mentioned. It would seem highly unlikely that early foragers would have discovered the edibility of the C. texanus seeds but not its root, yet known the edibility of the C. stimulosus root but not its seeds. That's all rather irrational if all is edible.

Another possible explanation for these mysteries could be early historians or, gasp, botanists. Were these two plants always known as separate species? Whom did the historians and/or botanists talk to, and where and when? Was the information lost, just parsed out and missed, or are different parts not edible? Did the first foragers know of the edibility, and did the knowledge die with them because someone did not ask the right questions about the right plant? It might be worth a PhD thesis.

So we have several mysteries. We have a plant that grows a large root (C. texanus) that would have been very valuable to the first foragers, a root not known as edible, though research suggests it is edible and a potentially large food source. To the contrary, we know the smaller root of its species sibling, the C. stimulosus, was eaten. (I have had a few; the flavor resembles pasta.) Then the seeds of the C. texanus are known to be edible and are reported to be very tasty. Regarding the edibility of the seeds of the C. stimulosus, we know little or nothing, though they are as obvious on the plant as on its relative.

And lastly the leaves… Taken at face value, we have two different people reporting they mistook the C. stimulosus for an Urtica and consumed it with no ill effects, one blending, one boiling, plus one report of eating the leaves without any identity mistake. Are they really edible? And if so, what of the leaves of the C. texanus? The consumption of the chaya suggests it is possible. Again, was such edibility unknown, or unreported? Or perhaps the leaves are not edible. I would also think they should not be wilted even if edible, for while that might render the sting stingless, it probably would not take care of the cyanide issue.

You see, not all is known even now about the wild edibles around us.  I just wish folks would be more careful about identifying plants. Dogs and cats do look different.


Properly cared for cast iron pots can last for centuries

When Europeans began to migrate into tracts of North America, what was the one thing they had that the Native Americans wanted more than anything else? Rifles? Axes? Horses? No, none of those. The one thing the "Indians" coveted the most was the metal cooking pot. That tells you two or three things: first, that eating was more on their minds than fighting, and second, that sometimes historical facts can get lost when history is rewritten. It also suggests that native women had more say in such matters than one is led to think.

Mary Ball, mother of George Washington, inherited her mother's cast iron pans.

The metal cooking pot may not be up there with the invention of the wheel, but it's close. It's probably safe to say there are few kitchens in the world without at least one metal cooking pot. We rightfully worry today about having enough food to feed humanity, yet a decade without new pots to replace old ones could be as devastating as a lack of food. Cooking without a pot is not impossible, but it ain't easy, either. And perhaps that is why every home should have at least one cast iron pot or pan. Treated well, they literally last for generations. George Washington's grandmother, Mary Hewes, thought so much of her cast iron pot and frying pan that she willed them to her daughter Mary Ball, Washington's mother.

Fish Baked In Clay

One facet of foraging is learning how to cook wild foods without pots and pans, such as dry roasting roots, wilting greens, and baking in various kinds of pits. That's what cooking was before there were pots. Boiling food was not a handy option. Rocks had to be heated and then put into wooden bowls or skin bags. Sometimes skin bags of water were suspended over fires; as long as they have water in them they don't burn through. Boiling was not a prime culinary technique. That suggests that foods that needed to be boiled were not prime eats either, such as pokeweed. Indeed, the Alabama Indians referred to pokeweed as the plant white men ate (as they did not eat it themselves). I suspect boiling was limited to making medicine, which was at times more important than food and thus worth the energy.

Camp Ovens Have Legs, Dutch Ovens Do Not

Metal pots revolutionized cooking, and the prime material for several hundred years was cast iron. Some will argue that, when all things are considered, it is still the prime cooking material. It's inexpensive, cooks well, and will outlive whoever currently owns it. Many a cast iron skillet has been passed down from mother to daughter to granddaughter and beyond. And as cooking without pots is a skill to learn, so, too, is cooking with cast iron and an open fire. That is why I like cast iron cookware: I can use it in my kitchen, with my fireplace when the power goes out yet again, or when camping.

Cooking wild foods in cast iron over an open fire is how many of our ancestors cooked even a century ago. There's past in the pan. A chuck wagon wasn't a chuck wagon without cast iron pans. Colonists sailing to America carried their cast iron cookery with them. We all have ancestors who cooked with cast iron. From about 1865 to WWII nearly every bride, no matter how poor she was, could count on at least a cast iron skillet and Dutch oven as a wedding present. It was essential to life.

I cook with cast iron and will admit to collecting cast iron cookware, from the no-name to the coveted, but I usually find mine at garage sales, flea markets, and recycling centers. Indeed, I recently got a Griswold Dutch oven lid at a salvage yard for $1.41. While that might not mean much to many readers, it was one heck of a find. Of all my cast iron cookery I think I bought only two items new, and that was years ago.

In my mind foraging and cast iron pans complement each other: two methods of eating, long ago without pots and pans and then with metal pots, an evolution of food before the chemist got involved. And I must admit, there is very little in the way of modern cookware in my kitchen that I'd like to pass on. But my mother has a couple of my grandmother's cast iron muffin pans. Someday they will be mine. And as I have no children, I will pass them on to much younger cousins. They will have a pan their great-great-grandmother used. Up against history like that, Teflon doesn't have a chance.

To read about cast iron cookware click here. To read about cooking before pots and pans click here or here.


An arctic express of frigid air recently sped down and across the United States. Here in Florida it snowed for the second time in 33 years, delivering a week of record-setting temperatures below freezing.

For many plants 32°F or 0°C is falling off a cliff, and Ma Nature doesn't care who she shoves off, invasive weeds or pampered ornamentals. I'm afraid my mango tree will never recover. The carambola (star fruit) might. I'm about 50 miles north of its northernmost commercial range but in a protected spot, and it's about 10 years old, so we will see. The avocado is well-established, so it should make it.

Despite the complaints, there are some advantages to cold weather. My olives should fruit this year, having gotten the necessary chill hours. And a lot of invasives that have had over three decades to proliferate, such as the Brazilian Pepper, will be frozen down for another generation. There is a park in Cocoa, Florida, so overrun with Brazilian Pepper that I could find only 13 edible species in an area that should easily support six dozen or more.

During the cold snap frugal Green Deane burned wood in his fireplace to keep warm. Cutting down a tree to stay warm is certainly a green debate. This wood, however, was scrap wood headed for a landfill. And it was free, not a small consideration. It came from a company that imports slabs of granite from Brazil and India. That granite comes packed in shipping crates, and inside the crates the slabs are kept from shifting by wood. A lot of wood. For years the company had literally boxcar-sized piles of scrap wood they could not give away, basically because the pieces are also full of nails and bolts. My fireplace, however, does not care. I consider the hot iron just more radiant heat.

So, burning scrap wood is keeping wood out of a landfill — green — but it's putting smoke and particulate matter into the air — not green. Then again, forest fires put smoke and particulate matter into the air. And of course the all-time champs of that are volcanoes. Mother Nature isn't too "green" herself.

On the other hand, I did not pay the power company extra to keep me warm, so they used a little less fuel. Green. And with my cast iron pan collection I cooked with the fire, also using less electricity. More green. As for edible weeds, they did suffer; the frosts seem to be more damaging than the freezes. But those plants will fertilize the next generation. Which brings us to the question of exactly what being "green" is.

Are beavers green? They cut down healthy trees, using only the branches. They block the natural flow of a stream by building dams, altering the environment not only upstream but downstream when the dam eventually breaks. Beavers have no concern for the fish up- or downstream, the erosion their dams cause, or what all that rotting wood does. Beavers, it would seem, are not green.

What of the strangler fig? It takes over trees and kills them. Not exactly friendly or green. For that matter, what of the parasitic wasp? It lays its eggs in other bugs and its offspring live off the host, killing it. And we call that wasp "beneficial" because we don't like the bug it infests. For that matter, is cancer green? After all, it kills people, and people are harming the environment, so maybe in the long run cancer is more green than the beaver. "Green" just might be a matter of degree and definition, and those may come and go like the tide. Green seems to be whatever one wants it to be.

My measuring tool of green is gastronomic. If I can eat an edible weed because the ground and water are wholesome, then we are doing something right. If it is not edible because the ground or water is not wholesome, we are doing something wrong. And that is one reason why I teach foraging: it makes pollution personal. And if that helps make a person "green," whatever that is, then all the better.


Nutrition or Food?

The 20th century was a hundred years of significant changes in what we eat. In 1900 food was… well… food, and real. No food pretended to be something it was not, such as chemicals pretending to be sugar. There were few artificial foods or "food products." Technofood. Food was food and had been so for thousands of years. One should also add that the price of food rarely changed either.

Today food is "nutrition" and much of the "food" we eat is not real… well, it exists, so it is real. But it's not real food. Surprisingly, most of the non-real food does not call itself food. It's called "nutrition." Hence the first rule of Green Deane's thumb: if the packaging tells you it is healthy, it probably is not. An apple does not need to prove itself. An edible weed does not have to prove itself either. A package of white flour, sugar, and additives pretending to be a wholesome breakfast does have to prove itself.

I am often asked what's the nutrition of a particular wild edible. In fact, today I was asked the nutritional value of a particular bug. A bug! Not even a big bug. A little bug. The mindset is clearly well-established. It's a curious question, this "what is the nutritional content of X?" It is not a question your grandmother or great-grandmother would have asked, or any of her ancestors. Food was food. "Nutrition" did not exist. There were no nutritionists or dietitians. People were eating what people had been eating for most of human history. That humanity thrived proves that what they were eating worked. Our very existence is prima facie evidence of that. "Nutritionism" had not taken over the food chain. I am not a fan of the past, but I think how we ate yesterday is better than how we eat today.

To be accurate, nutrition wasn't a significant idea for the first half of the 20th century. And while there were "dietitians," nutrition did not arise as the main player until the 1970s. That's when the government, for purely political reasons, started holding hearings on food. Things tend to go downhill when government gets involved, and so it was with food. Politicians "discovered" poverty and hunger in the United States in the early 1960s. The food stamp program was started in 1964 when Democrats held the White House. A few years later, under Republican Richard Nixon, the Secretary of Agriculture, the infamous joke-telling Earl Butz, ended a program that paid farmers NOT to grow corn. Like popcorn, that commodity exploded. Some argue it brought down the cost of food. Then in the early '70s Democrat George McGovern, who had presidential aspirations and who had helped discover poverty and hunger in America, started holding congressional hearings on food. There were huge conflicts of interest. Experts were cherry-picked while others were threatened with their jobs if they opposed the various committee findings.

The short version of the very protracted event was that the government began to suggest one food over another, such as chicken over beef. You can imagine the grilling that followed. Lobbies threatened to remove from office any politician who singled out any one food, rightly or wrongly. McGovern, from a beef-producing state, was vilified and was told to find a way out of the ugly mess his hearings had created (as predicted by the experts who were not cherry-picked). As a result politicians abandoned food recommendations and found legal refuge in "nutrition." The new position became: don't shun meat, but do avoid the bugaboo in meat, saturated fat (based upon the fraudulent research of Ancel Keys, thus compounding the mistake). So for more than 30 years nutrition has been the operative factor, and one reason why the government's food pyramid is exactly inverted. It is not based so much on healthy eating as on lobbying success, read: threats to reelection. This, I think, is behind our health crisis — more cancer, more heart disease — and the obesity epidemic. With the history noted, let's fast-forward.

In mid-2011 the American Journal of Clinical Nutrition published an editorial about our genetic past and nutrition. The authors argue our genetic compatibility with food (meaning we can eat it and it helps us thrive) was established 50,000 to 100,000 years ago. They say it has not, as a whole, changed much since, with only minor exceptions, such as some populations developing the ability to digest lactose (the sugar in milk). They also make this observation: nutritional advice is not working, basically because it stops us from eating like our ancestors. I doubt doctors who push pill solutions to problems read such articles.

Consider: when you have a sudden and substantial change in a chronic disease, something other than the disease itself is happening. We now have babies with adult-onset Type II diabetes, virtually unheard of in children 50 years ago, or even 30 years ago. What's different? Their high consumption of sugar and high fructose corn syrup. (Something had to be done with all that corn grown post-Earl Butz; that's also why there is corn-made ethanol ruining the rubber parts in your car's fuel system.) I think the low-fat, high-carb mantra of the past 30 years is behind many of our diseases today, including the obesity epidemic. But that's not the only problem. There is also the issue of how we think about food. Let's take a closer look at the idea of "nutrition."

There are many problems with eating for "nutrition" rather than just eating real food. One of them is caused by science itself. The Brazil nut is a good example. Men who eat Brazil nuts regularly have lower rates of prostate cancer, or perhaps more accurately, what prostate cancer they get, they get at an older age than usual, and it is less aggressive.

The Brazil nut is too complex for science to say why it works as an anti-cancer food. There is also little money to be gained by advising men to eat more Brazil nuts. But, science can break down the Brazil nut, find out what chemicals are in it, and then speculate on that. And that is just what science did. The answer it came up with was selenium.

Selenium was a good candidate in the Brazil nut, a good "nutritional" candidate for the award of somehow preventing prostate issues. So while science could not champion the entire complicated Brazil nut, it could champion selenium as a potential nutrient to ward off prostate problems. But then there was a problem: men taking selenium supplements got cancer sooner, and the aggressive kind. Ooops…

One cannot really blame science, but one does have to recognize its shortcomings. The selenium in the Brazil nut does not exist in isolation. It is with a multitude of other naturally made chemicals in the nut, and it could very well be their balance, their relationship, or their mutual consumption that makes the entire nut good at preventing prostate cancer. Said another way, the prevention does not come from just the selenium. It does not flow from just the "nutriment."

A similar issue exists with flax seed. It is touted as a non-meat source of Omega-3 fatty acids, and that is true. So flax seed oil is often recommended to vegetarians as a way to get Omega-3 fatty acids. The problem is the oil by itself has been linked to prostate cancer, whereas the entire seed has not. Again, science — researchers — reduced the food to its parts, then recommended a part, only to find the part in isolation was not so healthy. (As an aside, I have a suspicion that things which bother the prostate also bother the breast, and things that bother the breast can also bother the prostate. Thus one should pay attention to research regarding whichever you don't have.) A similar "Ooops" exists with vitamin A and lung cancer. Vitamin A in food is generally good. As a supplement it's potentially dangerous. Said another way, it is the food, not its nutrition, that is probably good for you. Fish oil may indeed be good, but fish is better (well, in theory, before mercury pollution). But the point is made: eat food, not nutrition.

One would think eating food rather than nutrition would be a focal point of health-conscious people, but apparently not. When I go to a "health food store," usually more than half of its space is given to individual nutrients, vitamin A to zinc et cetera. That means even these patrons, who think they are eating better than most, are caught in the same mindset: eating nutrients, not food. Eating for "nutrition" ignores a chemical's place in the food, that food's place in the larger diet, that diet's place in a lifestyle, and all of that in the environment. It's like not breathing clean air but rather bottled gasses, seeking out this or that mixture or additive, assuming man-mixed gasses are better than clean air. It's consuming the parts, not the whole.

So, do I know the nutritional profile of, say, a Bidens alba? Yes: it has about twice as much of everything as spinach, except the oxalates. But is that how we should think of it, as a collection of nutrition? Couldn't an alternative view be that a variety of food is like the variety of chemicals in the Brazil nut: that it's the greatest variety of food that produces the best health, and that eating one food over another is like taking too much selenium? Or nitrogen?

"Nutritionism" hasn't worked well, or shall we say that since its invention it has had a short and poor track record. The chemist in the kitchen creating technofood has done more harm than good, whereas the way our ancestors ate was proved to work by our very existence. Our great-grandparents did not eat for nutrition, nor did they shun certain foods. They also ate a wide variety of food simply not eaten today. Isn't it time for diet diversity again? Isn't it time to Eat The Weeds?


I have long criticized what I call chemists in the kitchen. They brought us such things as cancer-causing additives, artery-damaging trans fats, insulin-skewing high fructose corn syrup, and untested Genetically Modified Organisms as food. That is not a good track record. Now these well-intentioned tinkerers have announced another success soon to be added to the food chain: Bisin, a "natural" substance that inhibits bacterial growth in food.

With a treatment of Bisin a sandwich can stay fresh on the shelf for a year, and an open bottle of wine could remain good for years (that a bottle remains open only a matter of hours in my home makes that benefit seem irrelevant). Fresh salad dressing could stay fresh for years… You get the picture: food that will not rot.

Not only that but listen to the spin from the discoverer, microbiologist Dan O’Sullivan of the University of Minnesota:

"(Bisin) seems to be much better than anything which has gone before," O'Sullivan said. "It doesn't compromise nutrient quality. We are not adding a chemical: We are adding a natural ingredient."

To which I would say: cyanide is a natural ingredient, but… As a "natural" product, Bisin does not have to be pharmaceutically tested and could be in food within a year. It might be wonderful. But what's the possible downside?

If Bisin limits bacterial growth, might it have the potential to severely compromise our internal flora? Gut bacteria are critical to good health. Basically they digest our food and make up 80% of our immune system. Compromised gut bacteria can lead to a host of illnesses ranging from food intolerances to weight gain to immune diseases. We literally cannot survive without our "internal garden." Antibiotics already upset that delicate function. A food additive that intentionally kills off bacteria could severely influence our health. Consider: Bisin has already been shown to kill off the bacteria found in yogurt, bacteria which make up a portion of our intestinal flora. What if Bisin makes our gut a bacterial no man's land?

What's more, the problems could manifest themselves immediately or take decades. We now know that what a newborn is fed can influence its gut bacteria and health for life, and that you can predict an increase in certain illnesses in adults based upon their gut bacteria as infants. Is Bisin another trans-fat time bomb, a "safe" substance that will be consumed and take its toll for half a century before being removed?

While I have not yet read it said, I can hear a counterargument getting dressed in the wings: the bacteria won't survive the acidic stomach to make it to the intestines, home sweet home for our bacteria. They said the same thing about specific elements in GMOs. They were wrong.

I don't know if Bisin will slip quietly into the food supply or cause a firestorm, but it should be something we watch. One of the many reasons to forage is to have healthy gut bacteria. Hygiene in modern societies, while getting rid of infectious diseases, has increased diseases related to impaired internal flora. The wild food we consume is more than just chemical-free and un-engineered. It can also contain good bacteria that affect our fundamental health by improving our internal garden. The chemists in the kitchen may have just discovered another potential threat to our health, a threat they will enthusiastically add to the food chain untested as food.

We are, again, guinea pigs.
