What if Society Can Be Sick, Just Like A Body…?

Part 1 – Food
In 1944, during World War 2, the Dutch had what came to be known as the Hunger Winter. It was a famine that took place because of Nazi occupation and caused the death of around 22,000 people. And if you think that sounds like fun, search the Siege of Leningrad sometime. The Russians had to create a temporary police force just to stop roving gangs of cannibals for that one. Apparently when the Germans wanted to starve people they went all out. Yeah, the Nazis really did suck.
Babies, however, were still being born. Some women were pregnant when the famine started; others became pregnant shortly after it began. These were women who at some points were living on 600 calories a day, whose fat all but disappeared from the diet, and who, it being the Netherlands, started eating tulip bulbs just to survive. Their children turned out to have some interesting health problems, and those problems formed a pattern.
Something called "The Dutch Famine Birth Cohort Study", cohort meaning 'a group born around the same time', was carried out by the departments of Clinical Epidemiology, Gynecology, Obstetrics, and Internal Medicine at the Academic Medical Center of Amsterdam, together with the Environmental Epidemiology group at the University of Southampton in England. They discovered that the kids of women pregnant during the famine had higher rates of diabetes, obesity, cardiovascular disease, and a host of other problems. They also had lower birth weights, which isn't surprising, but that low birth weight persisted into second and third generations.
Their results were published in September of 2008 under the intensely boring title "Transgenerational Effects of Prenatal Exposure to the Dutch Famine on Neonatal Adiposity and Health in Later Life". A Dutch pediatrician by the name of Dr Willem Dicke even used the famine to crack Celiac disease: he noticed that his celiac patients improved while wheat all but vanished from the diet and relapsed when it came back, which is how wheat was pinned down as the trigger. I can't give you the title of his paper because I don't speak Dutch, but he did publish, and if the other articles are any indication I'm sure it was a veritable cornucopia of medical-ese.
Hollywood even got into this, because screen star Audrey Hepburn was a kid in the Netherlands during the famine. Throughout her life she battled anemia, respiratory illnesses, and edema. There was even a suggestion that her bouts of clinical depression may have been connected. Elaine Walker and Dante Cicchetti gave that argument some pretty hefty backing in 2003 in their succinctly titled "Neurodevelopmental Mechanisms in Psychopathology". If that wasn't enough, researchers Alan Brown and Ezra Susser published in 2008 the no more poetic but even longer-titled article "Prenatal Nutritional Deficiency and Risk of Adult Schizophrenia", which gave the argument yet more support. They also found other schizotypal personality and neurological deficits in malnourished children and, by extension, babies of malnourished mothers. Meaning anxiety, schizophrenia, and mood disorders generally can be, and have been demonstrated to be, related to the overall health of the body.
Part 2 – Environment
Most people have never heard of Frederick Winslow Taylor, but most people in the United States have been victims of him. He was an interesting guy. Taylor was a mechanical engineer in the late 1800s who figured out that the main reason industrialists weren't making enough money was the inefficiency of their work force. Some people have referred to him as a father of 'Scientific Management', but really he was the guy behind what are now called 'Time and Motion Studies'.
Those studies were a little stroke of genius. Taylor timed every task with a stopwatch; his disciples, Frank and Lillian Gilbreth, went further and did things like tie small glowing lights to workers' hands while they worked at their desks, then put a camera in front of them with the shutter open for several minutes. What that did was produce a trace of where the hands moved throughout the process of doing a normal job. The stray hand movements could then be picked out and guidelines written up to eliminate them. Father of Scientific Management perhaps, but what Taylor is truly the father of is "Every Second Counts". Literally, Every … Second … Counts. And for anyone who has worked a desk job for a few decades, we could also call him the Father of Carpal Tunnel Syndrome or the Father of Degenerative Joint Disease. Not surprisingly, people like Henry Ford loved the guy.
The problem is, machines break down and the parts have to be replaced. When we’re talking about a bolt in an engine that’s one thing, but when we’re talking about someone’s hip, well that’s kind of abusive. So much so that when unions first formed they weren’t for equal pay or better hats, they were for 40 hour work weeks and mandatory lunches. We in the US are still constantly debating whether or not it’s justifiable to legislate off-time.
Those unions weren’t alone in their objection to trying to squeeze the human body into the smallest box possible. It didn’t take long for the medical community to realize that what Taylor was doing was trapping people in the same condition as what was once called ‘Muscle Binding’.
Muscle binding as a definitive medical issue was first established in the late 1800s (huh, how's that for a coincidence?). Binding generally involved people who exercised their large muscles excessively, like the biceps, but not the opposing muscles, like the triceps. So they could flex like crazy, but they couldn't do a push-up to save their life. The harder they worked to build their big muscles, the more the contraries atrophied, and the more the large muscles grouped together and 'bound' until they were virtually useless. It could even lead to nerve damage permanently reducing function.
As late as 1994 medical scientists were still publishing on how damaged or immobilized muscle affects the nerves themselves. In an article with the even more jargon-heavy title "Role of muscle insulin-like growth factors in nerve sprouting: suppression of terminal sprouting in paralyzed muscle by IGF-binding protein 4", P. Caroni, C. Schneider, M.C. Kiefer, and J. Zapf found, among other things, that "these findings suggest that IGFs are major signaling factors from inactivated muscle to promote local restorative reactions, including interstitial cell proliferation and nerve sprouting". Note that last phrase, 'nerve sprouting'. Muscle and nerve are that tightly entangled: damage one and the other has to respond.
Which dovetails nicely with the malnutrition issue I've already discussed: starve a kid and not only do their muscles suffer but so do their nerves. More importantly, it points out that Carpal Tunnel Syndrome and Time and Motion and all of that are destructive not just to the bones and soft tissue, but to the nervous system. Guess what your brain is made of?
Part 3 – Mind / Body
So, damage to the food supply damages the body, and damage to the body damages the brain. But the Time and Motion studies had two components: Motion, yes, but also Time. The Motion obsession hurt people. It caused physical damage. But the brain is an organ, just like your kidneys or your femur. Throughout the history of humans, whether you believe we're 6000 years old or not, clocks are new as hell. The concept of one minute hasn't been around for more than a few hundred years. In fact, the very concept of a 'clock' originated with the Catholics in Medieval Europe. As both Michael Foley, in his book "Why Do Catholics Eat Fish on Friday?: The Catholic Origin to Just About Everything", and James Burke, in his television series "Connections", have discussed, the word clock originates from the Latin 'clocca', which means 'bell'.
So in a sense we do say it's 'four o'bell', because that's how time was originally marked: 'four of the bells'. Four Bells, 4 o'clock. Just like the Muslims, Medieval Catholics had to pray at specific times of the day, so they invented a machine to ring a bell to remind them. Before that, the last civilizations to develop anything like what we would consider a modern clock were the Greeks and the Egyptians. They used a water bowl with a hole in it. It was more like an egg timer than a way to divide the universe into totally artificial incremental bits.
Before that, astronomical phenomena were used. And rather than shoehorning those phenomena into our fake system, we lived by their cycles. Which is why the Chinese New Year falls on a different day every year (as do the Jewish and Muslim New Years), and why the date of Easter is always up for grabs: those events are all pegged to the moon. Even the extra day we occasionally add to February is nature refusing to cooperate, since the solar year stubbornly isn't a round number of days. Our western calendar was originally kinda based on the sun, but we don't even do that honestly anymore (explain Daylight Saving Time logically, g'ahead, I dare you).
In fact, time wasn't even 'universalized' until the railroads demanded it. Before that, people were still setting their days by sundials even in major cities like St. Louis or London. Even the idea of a clock with a face is relatively new. The oldest surviving clock in England, as Foley points out, is a bell tower with no read-out whatsoever. The human body is absolutely horrible at gauging the length of a minute. Try this experiment: get a stopwatch, start it, close your eyes, and then stop it when you think a minute is over. You'll be wrong.
Minutes in our brain’s world have absolutely no meaning. Now tell your boss that there’s no fundamental difference between 8:59am and 9:02 am.
Part 4 – Conclusion
Bipolar Disorder, Attention Deficit Disorder, and so many of these anxiety-related neurological disorders are the natural outgrowth of a more and more compartmentalized society. As the human body is forced into ever more confined movements, more and more segmented and dissected bits, more and more artificial and referent-free chunks, it starts to disintegrate. So too does the nervous system. Some people's brains come through the process relatively unscathed, just like some people can eat candy all day and not get fat.
But, now that the West has become obsessed with 'progress' and the 'future', what's the end game? Okay, so we figure out time to the micro-millisecond. So what? The brain is the single largest cluster of nerves in the human body. Chronic spinal inflammation can happen after only a few decades of working a desk. We train babies to clocks almost from birth. And for those poor few for whom that system never really makes sense, their brains are perpetually on alert.
Can it really be any surprise, then, that medications used to treat Bipolar are also used to treat Anxiety, and to treat Schizophrenia, and to treat other psychotic disorders? Depakote and several other mood stabilizers are anticonvulsants, drugs also used to treat epilepsy; Abilify is an antipsychotic; Lithium is literally a salt. That's just disturbing. When muscle and nerve fiber has been damaged patients experience numbness and uncontrolled spasms. There's no reason to suspect those moments of numbness, those spasms, that lack of control, somehow stop at the neck. When the danger is being 1 minute late to work, then everything becomes dangerous. Anxiety is the natural outgrowth of an obsession with Time; anxiety is an inflammation of the mind. We don't have to keep doing this to ourselves. Because if the children of those famines are any indication, not only will these conditions persist throughout their lives and be passed on to their children, they will only get worse with repeated exposure.

An Immodest Proposal: That Global Warming is Stupid

Fun fact: the original American mass-produced automobile, the Ford Model T, got roughly 13 miles to the gallon (according to Ford's official website). The average today? 21 miles per gallon. A century, and we've managed 8 whole miles per gallon of improvement. Angus MacKenzie addressed this disparity in an April 2008 op-ed piece titled "The 25 MPG Model T: Why Haven't We Done Better?"

MacKenzie makes a salient point: we have done better. We've done much better. Cars are no longer the death-traps they once were; owners are not nearly as likely to get a broken arm from a crank starter, or to die in an engine explosion / drive-train fracture / headlamp oil fire. Compared to the Model T, modern cars are houses on wheels; they've even got cupholders, for heaven's sake. He even concludes by stating, as others have, that car manufacturers have tried to market more efficient vehicles to no avail.

The VW Lupo 3L was brought to market with lackluster sales, even though it got 63-78 mpg in town. The Toyota Prius hybrid is a joke. Not because it isn't more fuel efficient, but because the battery that powers it travels around the world twice on barges after being strip-mined out of the earth before the car ever rolls out of the showroom, burning still more dino bones as it goes. Fewer, perhaps, but it burns them all the same. Oh, and let's not forget the plug-in version, which means the fuel consumption the owner avoids at the gas pump is simply offset to the local coal- or natural-gas-burning power station.

Internal combustion engines are just a terrible form of transportation. My question is, why haven't we come up with something better by now? Sure, there's the VW 'Air Car', which (I'm not kidding) uses pre-compressed air plus a fuel-fired heater to expand that air further. The estimates are it could get 800 miles on a single tank of, well, whatever you've got. Supposedly it can run on diesel, gasoline, bio-fuel, kerosene, daemon spittle; anything, so long as it gets hot and stays hot. Basically, it's a steam engine without the steam, but with a 'boiler'. Which kinda brings us back to that old technology from the introduction. Except self-propelled steam vehicles go back even further; they were being built in the 1700s.

And what about global warming? Set that aside for a moment and consider a man named Stirling. Robert Stirling was a Scot who invented an engine that actually does run on air. His engine relies on the difference between expanding and contracting gases when their temperature changes. If you put one half of a Stirling engine in the sun, and the other in the shade, you've got power All. Day. Long. And no exhaust, no carbon emissions, nada. Stick a Stirling on top of your house with some battery backups and you've got free power until your roof collapses.

But every time I see people debating fuel efficiency, or wind energy, or even cultural and environmental preservation, for some reason it ends up in a shouting match about greenhouse gases and how we're killing the Madagascar Sucker-Footed Bat (yes, that is a thing). Global warming is a frivolity, a chicane, a humbug. Not because it isn't happening, but because it has nothing to do with the solutions, and nothing to do with the problems.

Strip mining the Appalachian Mountains for coal, fracking half of Colorado for natural gas: it's the death rattle of a dying industry. And it's absolutely disgusting to look at; we live here, and we're shitting in our own den. When British Petroleum got the worst case of plumbing backup in The History of Ever, the US Congress held an informal prayer ceremony. No shit: while millions of gallons of sweet crude oil and natural gas went spewing into the Gulf of Mexico, our elected officials gathered their collective power, and with the full due process of law they held hands and sang "Kumbaya".

Jon Stewart of The Daily Show had the only response appropriate in that situation: "You're going to ask God for guidance? As if him sticking the oil under two miles of ocean wasn't enough of a hint?" This wasn't an 'Oops'. It was an act of terrorism pure and simple. A foreign power flouting our sovereignty and poisoning our water supply. If they'd done it to the Pentagon we would be at war. But no, instead we prayed about it.

The religiosity of the whole thing isn’t what gets me, it’s the sheer grudging acceptance. This attitude of ‘Well, what were we supposed to do?’ BP says we need to drill, so we drill. And when the well decides to drain itself before we can get there to drink it dry we start talking about the Arctic National Wildlife Refuge or the Tar Sands like they’re some sort of black gold mine.

Never mind that they'll end up looking like the oil-absorbing cat litter in a gas station parking lot. Oh, and we'll have to construct a pipeline to haul the stuff. Unfortunately it'll have to go through a few Indian Reservations, violating centuries-old treaties and possibly spurring an international incident. 'Tis but a trifle, so say BP and Chevron. May ExxonMobil guide and protect you.

So why are we still talking about the Arctic Polar Bear, or the Canada Goose, or the Monarch Butterfly? Not because I don't care about them; I actually really do. I really do a lot. But I give a shit about me too. And as long as the radical left keeps couching its language in the most absurd, inane, quasi-spiritual doggerel, everyone who doesn't want to commune with their inner Tao and activate their 12th Chaka Khan is going to slam their ears shut on us. So let's get down to brass tacks and quit it already.

Money lovers don’t care about the White-Faced Saki Monkey, or the Bolivian Wombat. They care about money. Have that conversation.

Talk about old-growth forest pharmaceuticals for fun and profit. Not by clear-cutting, mind you, but by truly preserving and exploring. Hell, find a Native or two and have an actual honest-to-goodness conversation about what the elders used to chew for a headache. Talk about eco-tourism, and invite the Natives for that one too, instead of using Eminent Domain to hand Keystone a boondoggle. Talk about being the first manufacturer of a combination solar panel / Stirling engine personal home generator. Talk about getting 100 miles to the gallon and only having to fill up your tank every three months. Gas is at $10 a quart? Who gives a crap?!

Picture it, truly picture it. Take all the ne'er-do-wells, all the xenophobes, all the people who think the UN is the biggest waste of real estate since we admitted Kansas as a free state. Take all the American Exceptionalists, all the people who think France will destroy us with Socialism and that Muslim terrorists are hiding in every 24-hour movie rental store. Take all those people, stick them in front of a television, and make them watch as the President announces a new government initiative to tell China and OPEC to take a flying leap because the US is going oil free.

Moral Absolutes Are So Cute!

Eddie Izzard has, as part of his stand-up routine, a musing on the subject of lying. He essentially says that we should divide lies much like we divide felonies: we have Murder in the 1st, 2nd, and 3rd degree, Manslaughter, etc. So it should be with lying: we should have 1st Degree Lies like "I took out the trash", all the way up to 4th Degree Lies, "No, I did not gas several million Jews". Hold on to that idea for a moment.
On an episode of “Politically Incorrect with Bill Maher” several years ago, Izzard repeated it in response to Christine O’Donnell stating all lies are a moral wrong regardless of context. From stealing candy to hiding a murder, a lie is a lie.
To be fair, she had started the conversation with the suggestion that compliments should never be a lie. Even if you don’t like someone, you should still be able to find something to say about them that is ingratiating or at the very least non-confrontational. Just as a form of respect.
But Izzard, as well as Maher, was quick to say that there is a vast difference between being deferential at a dinner party and being brutally, destructively honest in some vain pursuit of this nebulous idea of 'truth'. O'Donnell was insistent that any lie is disrespect. Maher and panelist Jasmine Guy both replied that sometimes lies can be a form of respect, even protection. Drawing on his example, Izzard took Guy's and Maher's point and asked, "So if Hitler were at your door and you knew the Jews you were hiding were going to die if you admitted it, you would tell the truth?!"

O'Donnell was both unhesitating and unwavering: "God would provide a way for me to not have to lie." Cue collective noises of exasperation, disbelief, and outright anger from the audience as well as from Maher, Izzard, Guy, and Martin Mull. Maher replied without even thinking, "Oh shut up. You can't possibly believe that." But there is a very important idea floating in the back of what O'Donnell is saying, because I think she does believe it. And it's extremely important that she does.
Permit a moment's delay to travel way back to the dark ages of the 1930s, when developmental psychologist Jean Piaget proposed the idea that human behavior in children develops in discrete stages. Specifically four: sensorimotor, preoperational, concrete operational, and formal operational. Or: want to stick it in your mouth, try to stick it in your mouth, stick it in your mouth, and decide it isn't food.
Building on Piaget's work, Lawrence Kohlberg in the 1950s added the development of morality. He argued that children first learn that there are morals, then what those morals are, and then that those morals have nuances. He also argued that these 'moral discoveries' could be located at specific points in a child's growth, just like height or teeth. He formalized his argument with what is now called "The Heinz Dilemma".
Kohlberg gave several children the following narrative: Heinz is a guy who steals something (loaf of bread, pills, something important). His wife is sick / dying. And he is broke / disabled, and can’t afford it otherwise. Is Heinz a bad man?
Kohlberg then asked the children why Heinz was or was not a bad man. It was the reasoning that mattered. There have been criticisms of Kohlberg, specifically that his theory and method are extremely Eurocentric, which is true. A more egalitarian society could easily put the emphasis on whether or not the druggist who refused to help Heinz’s sick wife was the true criminal. But the fact is, as he tested each child Kohlberg did find a fairly consistent curve.
Meaning even if his theory is exclusive to Western culture, it’s a consistent theme within that Western individualistic mindset and ethos. So much so that children as young as 4 and 5 are already able to articulate very precise reasons why Heinz is or is not accountable for his ‘crime’.
Which brings us back to Izzard and O’Donnell. In Kohlberg’s work the stage where everything is in positive and negative terms, where every drop of nuance is completely non-existent, is Stage One.
O'Donnell's absolutes, where a lie is a lie whether you're complimenting someone on a dress you don't actually like or telling Hitler that you agree with everything he says, belong to the very first stage that children in the West pass through. It is the stage they complete before they've even stopped occasionally chewing on the cat. It is an intensely selfish stage, where everything is easily coded and everything can be readily accepted or rejected.
It is not surprising that Kohlberg found this stage reliably present in very young children; they're still learning that mommy's lipstick is mommy's and not an extremely colorful multi-surface writing utensil. This is right around the same time children say "No" constantly and cry uncontrollably when they're denied.
Which is key to understanding O'Donnell's statement: a three-year-old's sense of immediacy dovetails nicely with a need to categorize people instantly. Very little thought is necessary. Heinz stole so he's bad, no more thought necessary. But a child's immediacy is directly related to their knowledge of the world and ability to care for themselves. When they cry, someone feeds; when they pee, someone cleans. Human young would not be able to survive if not for these expectations. In fact we do not survive if the care-giver falters. The medical condition "Failure to Thrive" has a specific 'attention' component. Babies in orphanages that are not well tended routinely die, quite literally, from a lack of love.
But humans grow. We learn to bathe, feed ourselves, get jobs, and run for the Senate (O'Donnell ran for a US Senate seat a few years after she was arguing with Izzard). While what she says may, at first glance, sound absurdly naive, it is actually incredibly entitled. It is O'Donnell saying 'I refuse to have to think about this, even with an eye to reciprocity slanted in my favor (Kohlberg's Stage Two), because I don't want the world to work that way'. She is saying 'God will think it through for me.'
O'Donnell infantilizes herself in front of a studio audience, who are all so aghast at her seeming obliviousness that they themselves fail to notice she's reduced the full gamut of human experience to a Spaghetti Western where the man in the white hat comes to save the day. What is worse by far, though, is that O'Donnell isn't alone. Pure capitalists frequently argue that market forces compel people to 'be honest'.
Racism in its raw form suggests that people are fundamentally different on a genetic level. Sexists state emphatically that men and women are just different, with no possibility of common experience even within the same culture. Religious Fundamentalists (those who believe whatever dogma they follow is the direct word of the creator to be followed precisely without interpretation) state emphatically and repeatedly, as O’Donnell did, that morals and laws are not only synonyms but are absolute. So much so that even the Creator cannot violate them.
It is the last part of the argument that is really troublesome: even the Creator cannot violate the law. O'Donnell would not be able to argue that 'God would provide an exit' if laws were not absolute; she could break the law and not be punished, or God could be a jerk and make it work out in the villain's favor. For O'Donnell's statement to work, both she and God are locked: she cannot break the law of deceit and God cannot allow her to be punished for her honesty. But that does raise the question, 'Who originated the law?'
This sort of circular justification is present in many spheres, even every-day conversation. From social science dilettantes suggesting that people behave correctly because of law, to the hyper-masculine or hyper-feminine explaining their behavior in terms of how 'real' men or women behave. The trouble is there has to be an origin. If God made the law, he can unmake the law; if women are 'like that', a woman can be unlike that; people follow laws because they choose to follow law. A simple proof: if laws were truly absolute, no crime would exist. Essentially, we follow laws by and large because by and large we choose to be law-followers.
O'Donnell cannot live with that uncertainty; neither can pure Capitalists, racists, or Fundamentalist Muslim Clerics and Roman Catholics. The uncertainty of having to trust another person to be nice, and finding them out if they're not, and occasionally getting betrayed, is a level of unpredictability that O'Donnell simply will not allow in her universe. Except she's stagnated at Stage One; her universe is roughly the same size and shape as a four year old's. Christine O'Donnell's universe is her childhood home.
That is why in O'Donnell's universe homosexuality isn't real. It was not in her mother's living room. Neither was Allah, the genuinely poor, or women who preferred to cut their hair short. When confronted with them O'Donnell must retreat into the only answer she has: "Then it must be 'bad'."
Izzard somewhat addressed this when he replied to her nonsensical suggestion of divine intervention with the statement, "Then you could just take out a gun and shoot Hitler. You wouldn't have to lie and you'd get rid of the problem." And that is where O'Donnell's awkward reasoning takes a dangerous and precipitous fall. She replies by saying, "We cannot limit God [that way]."
This is where the line is so frequently crossed. When a woman is no longer a woman because she does not act 'right'. When a man isn't manly because he doesn't have a beard. When we must defer to displays of bravado because one person chooses to filter all of their conversations in competitive terms. That is when the law becomes absolute, and these self-appointed 'enforcers' become all too often mindless thugs.
O'Donnell's deferring to a higher moral authority as a source of guidance may or may not be simplistic, but that last sentence is where she transitions from seeking guidance to positing a deus ex machina. She is not merely suggesting, she is outright stating that the good and fluffy Lord himself will extricate her from the situation if she is intractably painted into the metaphorical corner of having to truly consider her options and determine the lesser of two evils for herself. Weighing relative value isn't just the last option, it's not an option at all. Much like a baby who has been denied a cookie, she cries foul.
That is the center of this type of thinking. It limits us all. No one can be selective, ever. O'Donnell is ultimately an easy target; she is not bright (which is not to say she is worth less than a smarter person, only that she is not very intelligent). But she is not alone in her convictions. It is a mindset to be wary of, that somehow these 'laws' are above consideration. Much like those who believe that the Second Amendment to the US Constitution is sacrosanct, or that marijuana laws are moral rather than legal, zealots of O'Donnell's ilk are genuinely dangerous.
It starts with a simple enough phrase: "You're either this, or that." It can be something very small. Either you have long hair or you're a man. But we should all be extremely cautious about the second half of the comparison. Because if the 'that' of the equation is implicitly diminished, and it's another person, another human being, then you're not very far at all from it being a justification to simply remove them. After all, do we really want to have the same basic mental frames as a three year old who's still sticking carpet fluff in their mouth?
We have to call it out, all of us. Children learn by example.

A Scientist And Radical’s Response to “A Liberal’s Defense of GMOs”

Web author Saul of Hearts posted an article titled "A Liberal's Defense of GMOs" that was one of the Editor's Picks on Medium.com. It laid out how, as a self-described 'crazy fucking hippie', he was "…just not that scared of [Genetically Modified Organisms]". It was an interesting read, but not a very good one.
Proving there's no enemy worse than a foolish ally, Saul says some very odd things to defend a point that's valid, though not the way he makes it. For example, he never specifies what a 'hippie' is. He says he regularly attends Burning Man and lives in a co-op, and the reader is left to assume he therefore cares about the soil and the trees and what happens to them. He never gives his education except to say he's "fascinated by genetics" and reads about the topic, and he mentions having taken classes on the subject at the U of Penn. Though tellingly, he never says which books, or whether he graduated.
Permit me a momentary aside for an irksome trend I've noticed: I don't follow the line of reasoning that he's a hippie because of Burning Man. Burning Man is a gigantic festival held every year in the Nevada desert. It involves campers and art installations and culminates in a gigantic bonfire. For many it can be an extremely powerful experience. But to say, 'I'm an environmentalist, I go to Burning Man', is to say, "I'm an environmentalist because I go out to the desert every year to burn several tons of harvested wood for completely selfish reasons."
Burning Man is a social construct, not an environmentalist training facility. I don't know how it has come to be shorthand for 'I care more about Big Boobied Mamma Earth than you', but it shouldn't. I always think of a retired friend of mine who's fond of saying, "I was a real hippie, I couldn't afford to go to Woodstock."
Burning Man as proof of ethical superiority is a sentiment that should not go unchallenged.
But even if one accepts that hippie and environmentalist naturally follow from Larry Harvey's Ten Principles, Saul follows his assertion with wildly inaccurate claims. He asserts animal DNA has never been spliced into a plant. That's demonstrably false. Osamu Shimomura isolated the protein that makes a jellyfish glow and shared a Nobel Prize for it, and genes for that kind of bioluminescence, including the firefly's, have since been spliced into tobacco plants and featured in National Geographic. Why say it never happened? It doesn't even connect with anything else he argues.

Saul also says he supports nutrient-enhanced crops for malnourished regions. But he never qualifies malnutrition. A primary grain engineered to be high in vitamin A is great for the immune system and eyes, if one ignores the fact that overdosing on vitamin A poisons the liver. Liver is itself high in vitamin A, which is thought to be why, in 1913, Antarctic explorer Xavier Mertz died and his companion Douglas Mawson very nearly did, after their rations ran low and they ate the livers of their sled dogs. Vitamin A is so good at killing things it's used as a topical treatment for acne. And yet, a vitamin A enriched cereal grain is precisely the focus of the "Golden Rice Project" which Saul links to as proof of a good program. So the operation was a success, too bad the patient died.
Quite frankly, the Golden Rice Project defeats several of Saul's stated interests. He says "I think Monsanto is evil, that patenting seeds and suing farmers is unethical, and that some GMO crops [can] lend themselves to irresponsible herbicide and pesticide use and cross-contamination." But Golden Rice was developed in partnership with Syngenta, a direct competitor to Monsanto, and both companies hold patents on seeds.
More importantly, vitamin A requires fat to fully metabolize; putting it in a grain makes it practically useless. And it has to be shipped! It is the embodiment of corporate monopolization. Golden Rice is precisely the sort of greed masquerading as charity that Vandana Shiva warns of in her book "Stolen Harvest" and that Saul seems to describe as "evil". Shiva's book, incidentally, was published 13 years ago. Meaning Saul's good intentions have actually caused him to advocate for the very thing he says he disdains. If anything it proves how insidious the emotional manipulation of this topic has become.
And ultimately this discussion deserves more than a cursory glance. I’m bothered that Saul’s argument could be seen as a good distillation of the support of GMOs from the Millennial Left. Why combat propaganda with misdirection? I think this issue deserves better advocacy. And so I write:
An Educated Liberal’s Defense of Genetically Modified Organisms
I am not a Crazy Fucking Hippie. I have never been to Burning Man. I couldn't afford it if I wanted to. I am so completely unable to go to Burning Man that until roughly 9 months ago the only way I could have attended would be if I biked there, because I did not own a car. I am a tribal socialist; I recycle nearly everything that comes into my house. I read the entire label on everything I buy. When I was a kid I listened to BBC World Service on short-wave because I already knew US news was crap.

I do not live in a co-op. I am so much not an early adopter that I own a grand total of 9 electronic devices, one of which is a coffee maker with a timer, and another is a 30 year old fridge. I do not eat veg or vegan exclusively but I eat weekly at one of the two vegan restaurants in town. I also support local free-range cattle ranches as I live on the high plains and if god had not intended me to eat cows and buffalo and deer he would not have made them from meat.
I am fascinated by human genetics, diet, history, and culture. I read as many books as I can on those subjects as they have been required for both of my degrees. I passed several classes at University and obtained a piece of paper with absurdly dramatic lettering on it that says I have a pretty solid understanding of Anthropology.
And I am only mildly concerned with GMOs.
Don't get me wrong, I understand where some of my more apprehensive friends are coming from. I share their desire for a verifiably safe and sustainable food supply. There is a great deal that frustrates me about food production, distribution, and consumption in the US diet.
I think that corporate greed is always a factor in industry, and should be curtailed before innovation leads to exclusion and disenfranchisement like the “Monsanto Protection Act”.
I think misleading farmers into purchasing genetically programmed crops, or trapping them in a cycle of corporate dependency, leads to irresponsible overuse of pesticides like the endocrine-disruptor Atrazine developed by Syngenta.
I think allowing multi-national conglomerates to purchase mega-hectares of land which are then mono-cropped is disastrous to the environment, and can eventually lead to cross-contamination with non-GMO plants, which can result in something as mundane as the lawsuit Monsanto v. Geertson Seed Farms, or something as catastrophic as the Irish Potato Famine.
But I also know that anti-corporate sentiment or philosophy should not be the sole basis for excluding a promising area of modern research. Genetic modification is a valuable tool that can be utilized in many ways to contribute to a safe and sustained food supply.

I want to address four main points that are frequently involved in anti-GMO discussions that are either directly disprovable, or more nuanced than they might seem:
1) GMOs create mutations that are ‘unnatural’ in a way that simple cross-breeding would not.
This is a simple misunderstanding of 'unnatural'. The implication is that nature filters out its screw-ups in a way that GMOs don't. In fact, selective breeding has many notable cases where the end result was a perfectly viable lifeform that was terrible. The rabbage, or cabbish, is one that stands out. The intention was to produce a plant that was completely edible, with the roots of a radish and the head of a cabbage. Soviet agronomist Georgi Karpechenko managed to breed a plant that produced hundreds of seeds. Unfortunately it had the inedible roots of a cabbage, and the thin chewy leaves of a radish.
If there is a danger to GMOs it's in the perception of their precision: people are scared of a corn plant with a rabbit's tail. Sterility of offspring, so the argument goes, is not an issue if we can simply clone the parent plant indefinitely. Meaning we skip the part where nature would have prevented the abomination from reproducing.
But cloning has been done 'naturally'. All yellow bananas at the grocery store are the Cavendish variety, which are genetically identical to the original. It's fairly certain that every banana you've ever eaten was every other banana's twin sister. And this was developed long before we'd mapped the genome of anything.
We like to play with nature. Mendel discovered inheritable traits by experimenting with pea plants, dwarf ones included. Corn, as far as anyone can tell, was originally a type of tropical grass from southern Mexico. It's what the GMO is intended for that matters, not simply that it is a GMO.
Dwarf wheat is a fantastic example of a success story. Wanna feed India? Use a grass that evolved in Asia anyway and modify it so that the seed grows at the end of a short stalk instead of a long one. More seed stays on the head because it has less chance of breaking off and killing itself. Bang: mostly native plant, GMOed to suit a specific need in-region. Norman Borlaug won the Nobel Peace Prize for it. Does that mean we ONLY feed India wheat? No. Does it mean that starvation is never due to government corruption? Hell no. But don't throw the baby out with the bath water.
2) GMOs contain animal DNA that has been “spliced” into plants!
So that’s bad then?
As near as I can figure, this objection is Linnaeus’ fault. Bless his long dead soul. Carolus Linnaeus developed a system for classification of living organisms several centuries ago and though it’s almost completely useless in its original form we’ve never really let go. What’s at issue is that Linnaeus treated all life like a flow chart, everything going forward into the future.
His scheme starts with living things that make their own food, or Flora (plants), and those that eat food, Fauna (animals). Those are the two "Kingdoms", and never the two shall meet. Many people learned his system in school: Kingdom, Phylum, Class, Order, Family, Genus, Species. Because Flora and Fauna are separate groups, they can't intermingle. It would be like a horse trying to breed with a house cat; they're on two completely separate branches.
Except Linnaeus was wrong. One of the newer classification hierarchies that's popular was developed by Carl Woese in the 1970s, and it puts animals and plants into the same group. Even the Linnaean system has been so tweaked he wouldn't recognize it. Current US textbooks have six kingdoms: Animalia, Plantae, Fungi, Protista, Archaea, and Bacteria.
To prove the point, human cells have mitochondria, which are basically self-contained organisms with their own DNA. You, right now, barring any abnormalities, have carried two distinct sets of DNA since before you were born. Best guess is that mitochondria were once free-living proto-bacteria that formed a symbiotic relationship with some long distant ancestor cell and just never left. Mitochondria are frequently called the 'powerhouses' of the cell because they generate ATP, which powers everything from muscle growth to movement.
Meaning, mitochondria produce food. According to Linnaeus, that makes them plants. Nature was putting plant DNA into animal DNA long before we tried; think of GMOs as us returning the favor. Again the question should not be 'what are they doing', it should be 'why are they doing it'. If we're incorporating animal DNA so a plant glows in the dark, that's a little strange, sure. But if we're incorporating animal DNA that gives a plant rudimentary light sensitivity, so it releases UV-protecting chemicals instead of burning to death, as a form of drought tolerance, that might be useful.
While genes are nothing like computer code that can be spliced into or out of an existing framework as independent informational chunks, they can give valuable insight into how to modify a given organism in a particular direction. And there is no way to learn from that information without eventually applying it to a living thing.
Ears grown on the back of a mouse for humans seem the height of vanity and hubris, but what about the naked mole rat's possible role in cancer research, as discussed in a June 2013 article in the journal Nature? Even the VP at PETA, Mary Beth Sweetland, relies on insulin developed through animal research; what if we could develop a plant from the Costus igneus, or "India Insulin Plant", that produced it as a nectar? Which would give a nice circularity to 'nectar', since the origin of the word was the drink that kept the Greek gods alive.
3) GMOs are -Insert hyperbolic catchphrase (cancer causing, bad for the environment, cause zombies, whatever)- so you should sign my petition against DuPont!
Here’s the thing, scientific investigation is fascinating, even life-changing, and frequently boring as shit. And I mean that literally. Science is boring in the way only ungodly hours of documenting turds can be.
Coprolites found in the US have yielded amazing information on pre-Clovis (read: Native American) diets and possible nutritional needs. Know what coprolites are? Fossilized shit. Think about that, we know what we know about what humans thousands of years ago ate because some poor grad student sifted through several pounds of human crap so old it had turned into a rock.
If you read online tomorrow that coffee causes cancer, I guarantee you'll read by next week that it has anti-oxidants and whitens your teeth. Science makes terrible news because it moves at a snail's pace. A perfect example: Einstein proposed his General Theory of Relativity in 1915, and then had to wait four years for an eclipse to confirm he was even on the right track.
How often would you visit Huffpost or TMZ if they only updated every four months? But in science, a question that gets a meaningful update more than once a year is considered 'cutting edge'. Now imagine that 90% of everything you read is either wrong, or ends with the phrase "in 40 years we'll have more information". Speaking of Einstein, some of the confirming tests of General Relativity were not possible until the 1970s, meaning by the time he was proven right, he was dead.
Something as multi-faceted as cancer research takes decades. Sifting through thousands of case studies, then filtering for possible genetic conditions, then filtering again for environment, then filtering again for personal habits. Saying “GMOs cause cancer” is like saying “Carrots cause death”. Prove me wrong, because everyone who has ever eaten a carrot either has died or will soon enough.
Again it’s how that GMO is used. Yeah, a fly with an extra leg is creepy, and that’s what grabs headlines. But I don’t know anyone who thinks avocados are freakish abominations. And the avocado you’ve seen in the grocery store is, at best, a massively grafted distant step-daughter to the plant that was first discovered in Mexico all those centuries ago.
4) We should label all GMO products and let consumers decide.
Alright. And to a certain extent I agree. But where do we begin? Is it GMO to graft two plants together? They are disparate genetic structures being artificially combined. Is it GMO if any of the genome has been affected? Because simple x-rays for pathogens will do that. Is it GMO after a certain percentage of its DNA has been changed? If so, what percentage? Some sections of your DNA are completely unused, other sections cause diabetes, do we only worry about the ‘active’ bits? How much do you know about your own genetic code, let alone that of a guava, or farm raised catfish? Do you think you could decipher a label that stated “AAA59179.1 resequenced 22.3% for color and flavor”?
And really, what is artificial modification? While the visual might be fun, it is extremely difficult to physically cut a DNA strand, insert a foreign section, and glue everything back together. A lot of genetic modification is done by combining eggs or seeds from one organism with cells from another, as it was in the case of Dolly the 'cloned' sheep. Technically Dolly had three mothers: the ewe who supplied the egg, the ewe who supplied the DNA, and the ewe who carried her to term. Is a clone therefore genetically modified?
How long and detailed do we want this label to be? It took several acts of Congress just to settle the question of what qualifies as a 'standard' serving size for the purpose of telling you how much fat is in your peanut butter. Try to imagine the absurd debates that will take place over whether something qualifies as a 'clone' and is or is not therefore GMO, or is a zygotic twin and therefore is or is not a GMO.
And we're eating GMOs every day. I've already mentioned avocados and wheat; try to find a product without corn, or cattle that wasn't fed alfalfa. If we're going to start labeling things, first we'll have to decide what qualifies as "modified" in the first place. Is it a GMO if it was fed a GMO?
Personally, I think GMO has become a catch-all term for several interrelated issues regarding industrialization, corporatization, and food sustainability.
–Factory farming and intensive mono-cropping are simply unsustainable. They are dangerous to animals (see: Mad Cow Disease, antibiotic resistance in livestock) and to plants, in that they over-use the land with a single crop that takes a specific set of nutrients from the soil and never puts them back. Which in turn creates the need for massive industrial fertilization, which can poison ground water and toxify the land. Babylon is the classic example: they over-cultivated their fields and used a watering method that slowly salted the soil. Where there once was the 'cradle of civilization' there is now a desert.
–Dependence on single-crop diets causes rampant obesity and health problems because no single plant or animal can serve as a population's nutrition base. Even the Inuit, who have one of the most limited diets on the planet, do not have a singular food source. The human body simply isn't designed to live on nothing but wheat, hamburger, and fried potatoes.
If we really want to do something about public health we should be focusing on localized food production, on the de-monopolization of agribusiness, on developing GMOs and non-GMOs that are adapted to suit their environment, not ones that require hectares of "shadow acres" for extra farm equipment and storage facilities for chemical fertilizers and pesticides. We should be talking about removing subsidies that make it more profitable for a farmer to grow a crop we then have to find a use for, rather than a crop his local township would love to put on their shelves. Returning to Golden Rice, what in the name of Darwin are we doing making one grain into a multi-vitamin? Have we learned nothing from the US corn subsidy boondoggle?
I don't know about you, but these sound like pretty Leftie objectives to me. By all means, let's march and buy in favor of independent farmers and against corporate greed. But let's also not forget Borlaug and that coprolite grad student, slowly working toward the goal of bettering our abilities and our understanding of ourselves and our needs in a genuine hope of helping humanity. If a GMO exists to exclude its environment, that is definitely a problem. But simply that it is a GMO should not discredit it.

Why I Don’t Do Yoga

Note: Within this document I occasionally use the word “Privilege” with capitalization to indicate the proper noun. This is to differentiate a social inequality between two groups whereby one has social capital that the other does not, from simply being granted a boon by an authority figure such as a gold star from a teacher that allows an extra cookie at snack time.
The stage magician duo Penn and Teller have a TV show. It's called "Penn & Teller: Bullshit!" and the basic aim of the show is to critically, and snidely, analyze concepts and tropes that have a great deal of social currency but not much fact. They have episodes that make fun of PETA, alien abductions, and exorcisms, and they have an episode dedicated to debunking the American myth of the moral superiority of buying organic food.
The organic food episode frolics, revels really, in the absurd claims of some 'healthy alternatives' to genetic modification, chemical fertilizer, and pesticides. It has a taste-test section, which demonstrates that people who claim organic food tastes better actually prefer the taste of non-organic produce. They use the blind forced-choice method that's a staple of such skits: tell the participants one item is organic and the other isn't, then make them figure out which is which (at one point they give up all pretense and just cut the same banana in half, with hi-larious results). They prove that the term 'organic' is not universally applied. And in particular they spotlight the 'typical' family of organic purchasers.
Not surprisingly, given the tenor of the whole series, they pick a couple who are so cliche they can only be described using in-group class signifying modifiers: they’re ‘modern’, they’re ‘eco-conscious’, they ‘love mother earth’, they’re even wearing tie-dye shirts, and they have a ‘reclaimed’ piano that they use at one point to sing at the camera a song from their ‘indie folk band’ catalog. Yes, they’re in a band.
And with slightly glassy eyes and all, they are our guides. They both sound mildly stoned, or stupid, and they’re used as a baseline to which the episode repeatedly returns to see what nonsense thing they’ll say next. By the end of the episode I was left wondering what their day jobs were. I’ve heard their stuff, there’s no way they’re making enough to buy organic with that music.
But what stood out to me about this couple wasn’t their over-the-top, formulaic to the point of suspicious, depiction. It’s that they live in a 2 bedroom apartment, and they’re both pale with absolutely no discussion of a Native heritage. Yet they have a tipi in their backyard where they eat lunch. You read that right, a tipi. No reason, they just have a tipi. They even say that, “And we have a ‘tipi’.”
I discuss things at work, as so many in the world do. One of my coworkers loves the show and speaks of it frequently. I have a slightly more nuanced appreciation of it. I get what they're doing, and it's frequently funny or slightly informative, but it's infotainment, and they use far too much hyperbole for it to be simply thoughtful analysis. They're stage magicians after all; misdirection and playing to the crowd is what they do. To make my point when discussing the organics episode in particular, I bring up the couple with the tipi. All I mean by that is that it's a set-up: they're trying to make the average organics enthusiast look like a moron, so they picked some morons. Unintentional shills perhaps, but not people who are genuinely educated about anything, let alone buying organic in particular. Not all of us are that idiotic about what 'organic' we buy or why. Not all of us have a tipi in the back yard.
The conversation took an interesting turn when another coworker, hearing the discussion, asked “Why does the tipi make them morons?”
I explained, “Because tipis are houses, if you have a house you don’t need a tipi, if you have a tipi you don’t need a house. It’s a portable tent designed for semi-nomadic tribes to be carried by dogs or on someone’s shoulders. If they just had a tent in their back yard at least they’d be admitting they just like camping. Tipis are awfully specific, and that level of disrespect requires a pretty high level of obliviousness and entitlement.”
The reply I received caught me off guard. "But how is that disrespectful? Like you said, it's just a tent. Are they only allowed to have tents in shapes you approve? Sounds to me like you're looking for something to get offended about. When I see a tipi I think it's great." And that, right there, that reversal, is something I don't often encounter so directly. It's usually much more subtle, and it is why it's difficult to explain appropriation to most people. It says, 'If a tent is a tent, then why is your definition of tent special? Do you own the concept of tent?' And explaining how that's a straw man is why I don't do Yoga.

Several years ago I was discussing my, at the time, fairly intense work-out regimen with a friend and someone from school. I mentioned that I include yoga whenever I do weightlifting because I learned the hard way that it really reduced my post-exercise inflammation. The reply was mind-boggling, “I didn’t know you were Hindu.”
I'm not, and I said so. And they rightly pointed out that Yoga is a religious practice; aerobic stretching is just calisthenics. If I'm doing calisthenics then I should say so. If I'm 'orienting my body with the will of the divine', that's something else. And if I'm not 'orienting my physical self to enhance the flow of that which is most powerful in the universe through my imperfect vessel', but rather just using some random word from another culture because it sounds pretty, that's appropriation. Hair styles are universal; Yoga is a very particular type of prayer.

I try not to be cavalier with my words. Even more so after that conversation. And so I tried to make a similar point about the tipi. "Okay," I said, "how would you feel if, rather than me saying 'I can do your job better than you', I instead put on a wig and your name tag and pretended I actually was you, doing a better job of being you than you? Wouldn't you think I was kinda making fun of you?"
"I'd see that as you wanting to be me. How is that not a compliment?" she replied. Which is really the heart of the matter: of course to her it would be a compliment. Why wouldn't I want to be like her? How could anyone want to be anything else if they had the choice?

I've found that people reveal their Privilege most obviously in this aspect of discussions relating to relative power. The easiest way to find 'the cool kids' in the high school cafeteria is to look for whoever isn't letting anybody else sit at their table, whoever believes that whatever table they sit at is 'theirs'. Not surprisingly, the easiest way to find the privileged group in a room of adults follows roughly the same format: find the person or people who believe that it's just adorable when other people act like them.
At the time I genuinely wasn't sure if she was being argumentative, but I had no reason to believe she was. I don't think she had considered for a moment that actual Privilege is about normalizing, to my own expectations, any behavior of an outside group that would otherwise keep them distinct from me. It removes another's right to be distinct, and gives me the right to take from them whatever I want.
Importantly, the Penn and Teller fan made a similar point to my own: “I get where The Fool is going,” he said. “If I have a Laughing Buddha in my front room and I’m not a Buddhist, aren’t I basically telling people ‘Hey, I know this is your culture and all, but I just think it’s cute’.”
Which I would take a step further: If I have a Laughing Buddha in my living room and tell people, “Because it brings good luck”, the justifiable response is, “But you’re not Buddhist…?” And that’s appropriation. I have no stock in the image; I’m expecting that the meaning given to it by those who made it mean anything is now mine to wield simply by association.
Now, that’s not to say that I exclude the very possibility of one imbuing one’s own meaning into something which specifically incorporates and gives deference to its original intention. If I have a Laughing Buddha in my front room and when asked reply, “Because I just like to remind myself that even an ascetic religion that preaches complete rejection of this reality still has an entire incarnation of their highest form dedicated to joy and laughter”, that’s something else. I am, at the very least, acknowledging that while its totality is not a part of who I am, what it actually represents to the people who value it also is valuable to me.
I was angry, at first, at the arrogance of the question she’d posed. Why should I have to justify that a fucking tipi is offensive? It’s taking one tiny part of a culture and mocking it by removing all the meaning but hoping your audience will give the meaning back. It’s playing to the ignorance of everyone around you by using an artifact from my group to enhance the reputation of yours.
And then I got called away, to explain the Scientific Method to a 10 year old. Seriously, he was going for a merit badge for Cub Scouts and he had to ask a real live scientist how we do what we do. So I left. And to a 10 year old I explained very simply that the scientific method is coming up with an idea and then testing it. And if the test is a failure, come up with a new idea that incorporates what you’ve learned.
That was a splash of cold water. What I didn’t realize in the heat of the moment was that my coworker was asking from a purely aesthetic frame. She’s not a coworker toward whom I have any ill will, so it was probably for the best that I had to step away from the conversation, because my first response was frustration.
I returned. “It’s appropriation because they didn’t call it a ‘tent’,” I said. “I know that seems petty. But if they’d said they have a ‘tent’ in their back yard and it happened to be a tipi, at least they would be acknowledging the purpose of it. That they call it a ‘tipi’ means they’ve conceded it’s something they think of as ‘Indian Stuff’. They’re not thinking of it functionally, they’re thinking of it as a cultural artifact. I had to work hard to earn what little respect I have in the Native community. They think they can simply build a tent in their back yard, use the ‘real live Indian’ word for it, and that somehow imbues it with meaning. That’s insulting. My people are not a fetish, we’re real.”
The point of the Penn & Teller: Bullshit! episode was that people use the word “Organic” with no idea of what that word actually means. They treat it as currency, but the only way it can actually have currency is if the true weight of its implications is opaque to the point of being unintelligible and, eventually, completely meaningless. It becomes a shell into which people pour their own interpretations, which can, in some cases, be the precise opposite of the actual definition of the concept.
To illustrate how absurd it can get, though the show never points this out, no yellow banana can be truly non-genetically-modified “Organic”. All yellow bananas are the “Cavendish” variety, an infertile clone of a single original plant. Yellow bananas, therefore, cannot be organic; they are incapable. “Banana” is, therefore, an antonym for the term “Non-GMO”. And yet, Dole Fruit Intl. has recently advertised an “Organic Banana Operation” at www.doleorganic.com. You heard it here first, folks: a GMO that’s so GMO it’s been a clone for over 200 years is, somehow, organic. I wonder if they grow them in a tipi…
I don’t do Yoga, and I never really did, and I should have been more careful with my words.

The Obsessive Self

I recently experimented with a diet/exercise monitoring website called ‘myfitnesspal.com’. It is well enough constructed for most people’s needs. It would not be very useful for someone trying to lose an extra 1% of bodyfat, or get themselves from a 4 minute mile to a 3.8 minute mile. But then it isn’t designed for extremely precise, high-focus athletes, it’s designed for people who are new to tracking their dietary intake. Charts for calories burned by exercise are highly general, foods are tracked primarily by the nutrition guides on the label, etc.
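To be clear about how basic that kind of label-based tracking really is, here is a minimal sketch in Python; this is my own illustration, not the site’s actual data model, and every name in it is hypothetical.

```python
# A minimal sketch of label-based food logging (my own illustration, not
# MyFitnessPal's actual data model): each entry just copies numbers off the
# nutrition label, and the day's totals are plain sums.
from dataclasses import dataclass

@dataclass
class FoodEntry:
    name: str
    calories: int   # straight off the nutrition label
    sugar_g: float  # grams of sugar, also from the label

def daily_totals(entries):
    """Return (total calories, total grams of sugar) for one day's log."""
    return (sum(e.calories for e in entries),
            sum(e.sugar_g for e in entries))

day_log = [
    FoodEntry("oatmeal", 150, 1.0),
    FoodEntry("candy bar", 250, 27.0),
]
print(daily_totals(day_log))  # (400, 28.0)
```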
I really only did it for a few weeks to get myself to pay a little more attention to what I eat. My sugar intake was a tad high, but then I don’t have much of a sweet tooth so it wasn’t in the nebulous ‘danger zone’. And my exercise has dropped off in recent years, but I’m nowhere near immobile, and my job involves quite a bit of walking, so really it was more fine tuning than truly making any lifestyle changes.
In the spirit of full disclosure I should say I was wary of getting too involved with the site, or truly constructing my own ‘program’. As someone with a past punctuated with eating disorders and obsessive over-exercise, I knew I could cue a neurotic episode and I was not interested in returning to my 3 hour daily calisthenics and weightlifting marathons from my early 20s. I miss my six-pack, don’t get me wrong, but I don’t miss the perpetual soreness, the testosterone induced rages, the acne, or eating a medium supreme pizza for lunch and still being hungry.
A Cognitive/Behavioral/Social science friend recently posted a May 28th 2013 article from sociologyinfocus.com by Ami Stearns discussing the “Quantified Self” conference that some guys from Wired Magazine have created. The basic concept is that with all the ‘monitoring’ programs and websites and digital devices available to the average early-adopter these days, you can “Know Yourself Through Numbers” (the movement’s motto). She summarizes their ideas as, “Users can track and quantify everyday activities, whether it’s calories burned, miles run, television consumed, quality of REM sleep achieved, sonnet lines penned, or ovulation cycles estimated.” In a weird way, it’s an inversion of the Sociological Imagination that C Wright Mills discussed in his writings. Mills suggested that Sociology allowed one to begin to view the relationship of individual actions to the wider society, to view activity within context.
The Quantified Self is the opposite of that kind of CBS training: it’s knowing the numbers, but with a sample size so small that it’s impossible to relate them to anything larger. Social pressures become invisible because the origins of ideas like ‘too much sugar’ are impossible to question. Stearns drew a connection from the Quantified Self to Foucault’s analogy of Jeremy Bentham’s panopticon as a model of social behavior-enforcement. Bentham proposed a prison without doors, where every cell is visible from a central guard post. The guards can look down on the prisoners, even shoot them, but the prisoners cannot see the guards. The prisoners become essentially self-policing because they never know when they’re being watched, or what the consequences of transgression might be.
Foucault’s simile had many critics. Margaret Mead proposed a direct contradiction of his ideas whereby we are trained from birth to behave a certain way so that by adulthood the behaviors become automatic. It’s not that we’re self-policing so much as acting unacceptably just doesn’t occur to us.
And then there’s the Quantified Self. It’s some part panopticon: the website can see you, but you can’t see the programmer. And some part Mead: it says you’re eating too much sugar, and you don’t know why you know that sugar is bad, but you do, so it’s probably right.
What bothers me about all this is not that people are paying attention to stuff. Historically, even at our most obsessive we pale in comparison to, say, the Ancient Greeks or Romans. Heck, the Romans were so obsessive that they could only enter a house with their right foot first lest they curse themselves and everybody inside. Our word for evil intent, ‘sinister’, comes from the Roman word for ‘left’, sinistra. Screw how many steps you take, you have to know which foot you set down first thing out of bed. Of course, that depends on whether you got up from the ‘right’ side of the bed in the first place.
We’ve always tried to find ways to control our universe by controlling ourselves. What does bother me is the compartmentalization. Vitamin B12 was only chemically identified about 50 years ago, and yet there is no shortage of books on exactly how to eat so you live to be 100. How can they be sure? Henry VIII obsessively ate according to the ‘humor’ theory of internal balance between blood, yellow bile, black bile, and phlegm. We still have some of those concepts trapped in our lexicon as well: if someone is ‘sanguine’ they’re cheerful. Blood was the humor of passion; to be sanguine is to be ‘full of blood’. The detail there is that the same concept led to ‘bloodletting’, the idea that the blood content of the body was too high and needed to be drained. Yeah, that worked…
So I saw a post on Facebook not too long ago by Jeff Novick, from June 3rd 2013 on the website “Forks Over Knives”, that discusses whether a totally veggie-based diet contains all the proteins necessary for human nutrition. What surprised me was how many ‘layman’ replies argued back and forth about whether the article was bullshit. Several people referenced the “Paleo Diet”, which is a joke perpetuated by people who think the Flintstones was a biography. But regardless of whether one thinks ‘cavemen’ ate 5 lbs of mammoth every night, no one mentioned that he had to walk about 20 miles to hunt the thing down in the first place. The mammoth could have been made of lard and frosting and the guy would still be healthier than most of the people playing on the interwebs these days.
Christina Warinner did a TED talk at OU not too long ago debunking the Paleo Diet; she happens to be an anthropologist/archaeologist, so it’s kinda in her wheelhouse. But even she didn’t really play up that your average pre-industrial human was walking around 10 miles a day. Even Herman Pontzer’s team, who recently studied a Tanzanian tribe and published their findings in the journal PLoS ONE, found the men walked an average of 7 miles a day. Even if their calorie intake to output ratio was the same as yours or mine, they’re moving around a hell of a lot more.
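Just to put a rough number on ‘a hell of a lot more’, here’s a back-of-the-envelope sketch in Python; the 80 kcal-per-mile figure is my own ballpark assumption for an average-sized adult, not a number from Pontzer’s paper.

```python
# Back-of-the-envelope estimate of the extra energy 7 miles of walking a day
# represents. KCAL_PER_MILE is an assumed ballpark (it varies with body weight
# and pace), not a figure taken from the Pontzer study.
KCAL_PER_MILE = 80
MILES_PER_DAY = 7

extra_per_day = KCAL_PER_MILE * MILES_PER_DAY
print(extra_per_day)        # 560 kcal/day of extra output
print(extra_per_day * 365)  # ~204,400 kcal/year
```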
Which gets me to the heart of all this: Why isn’t food, food? Morgan Spurlock did an info-tainment movie called “Super Size Me” where he ate nothing but McDonald’s food for a month. His health suffered. As a counterpart, British journalists Giles Coren and Sue Perkins did a series called “The Supersizers Go” where they ate the diet of various periods of history. Remember Henry VIII? They hated that period. You can find the video on YouTube; it’s all about how they had things like sweetened fish with pepper because that combination was balanced humor-wise. As a side note, after a jousting accident Henry had a leg wound that never healed and stank of rot for the rest of his life. Apparently, his diet wasn’t so great.
When my friend posted the article about legumes and complete proteins, a debate immediately ensued on his FB page about amino acid balancing, lysine, iron, and B vitamins. Funny thing: not one of the people posting was an MD, an Anthropologist, or a Biological Chemist. I asked the very simple, “So… Do people still eat food because it tastes good and they want to share a meal with friends and family, or is that not a thing anymore?” And the first reply was, “You eat to stay alive and healthy. Everything else is just a bonus.”
It seems to me we haven’t made ‘diet’ the panopticon, we haven’t made ‘doctors’ the panopticon, we haven’t even made ‘nutrition’ the panopticon, we’ve made FOOD the panopticon. Careful, it’s watching you.
Michael Pollan, who is quick to say that he is not a nutrition expert, has written several books on eating. He boils all diets everywhere down to three basic rules: Eat food (by which he means whole food, as minimally processed as possible), not too much (don’t eat to overfullness), mostly plants. That’s it, three rules that are consistent through most every diet he’s studied. Again and again he makes the point that some of the healthiest people on earth have no idea what ‘nutrition’ is; they eat because they’re hungry, they eat because their family is visiting, they eat for a variety of reasons, but it’s almost never because they’re trying to keep their Omega 3 levels stable.
The Romans were some of the most superstitious people in history. When they tried to conquer what is now England they burned out the oak trees to take away the Druids’ magic. Not kidding. They divided the universe into themselves (people), and everything else (peasants, slaves, barbarians, fish…). And to prove their ‘education’, their superiority, their enlightenment, they held gladiatorial games that still disgust us 2000 years later. Hitler was totally obsessed with his diet and exercise and health, and that guy had some seriously wacky ideas about purity. Google that shit, it was like he was crazy or something. Pol Pot put all of Cambodia on diet restrictions.
I’m not saying that Dr Atkins or Dr Di Pasquale was or is trying to bring about the Fourth Reich, or the next Killing Fields. But in their book “Good Omens”, a satirical take on the Biblical end times, Terry Pratchett and Neil Gaiman characterize the Horseman Famine as having been on earth the whole time as a diet guru. As they put it, he carries out his function in a variety of ways: nouvelle cuisine (the sort that consists of “a string bean, a pea, and a sliver of chicken breast, aesthetically arranged on a square china plate”, invented the last time he’d been in Paris); diet fads (“D-Plan Dieting: Slim Yourself Beautiful, the book was called; The Diet Book of the Century!”); and new foods (“indistinguishable from any other except for … the nutritional content, which was roughly equivalent to that of a Sony Walkman. It didn’t matter how much you ate, you lost weight. … And hair. And skin tone. And, if you ate enough of it long enough, vital signs.”). One of the future ventures that Famine never gets to try is a new diet food with the same nutrient value but ten times the fat, so you gain weight as you waste away. The irony was the fun part for him.
Famine is eventually destroyed, but the epilogue to his story, to the Horsemen’s story, is the line “They’ve gone back to where they came from, where they’ve always been, the minds of man.” Diet and movement are fundamental. Alcohol is a choice, driving is a choice, even sex is basically a choice, but stop eating and you die, period. So when someone is talking about food and movement they’re talking about your existence. Don’t be 22-year-old me; don’t see every meal as a threat, don’t fear the pleasure of a cheesecake, or the pleasure of Brussels sprouts for that matter (prepared properly they’re fantastic!). Because the minute it’s something to be afraid of, or feel guilty about, it’s fetishized. And eating a cheesecake becomes a ‘reward’ or a ‘slip’, or a ‘survival bonus’. It’s a cheesecake. I’m gonna be what the neo-pagan community calls a ‘fluffy’ for just a moment, someone who seems to deny that bad things exist, and say: just live.
BBC Radio 4 recently ran a program called “Constant Cravings”. Hosted by Sally Marlow, an addiction specialist, the show discussed whether one can become ‘addicted’ to food. A cognitive psychologist on the show makes the point that the longer you obsess about the food, the more that obsession becomes the addiction. Neurology has known for quite some time that neural pathways form to a great extent according to what behavior we repeat. Your brain doesn’t do things because it’s pre-programmed so much as it’s programmed to try, and when you finally get it right the connections that didn’t help are pruned away. Emotional programming is a type of behavior, and once it’s in place it’s very hard to break. The compulsion is born of the obsession, and the obsession is born of the pressure to be obsessed.
I am not an MD, I’m not a neuroscientist, but I am an Anthropologist. I’ve definitely learned that the most depression prone societies are the ones where they think you’re supposed to be depressed. That somehow self-criticism is a natural state. The Romans were kinda nutz. Hitler was kinda nutz. Pol Pot was kinda nutz. And all of them were obsessed with food.
So, how to not be obsessed? Reduce the conditions. If everything has an ‘acceptable’ and ‘unacceptable’ variable that must be codified at every stage, the very act of considering the meal becomes habituated to obsession and extensive critical analysis. Like a Roman wondering ‘Did I step into this room properly?’ to the point that their very language became suffused with meaning and implications of that question, we think about every slice of bread, every drop of maple syrup. Sorry, every ‘carb’.
Keep it simple. Eat, move, and breathe. That’s it: eat, move, and breathe. Pollan has three rules, so let’s take these three and see what we get. Go outside, share a hot meal, and make sure it’s good food with plenty of liquids. And maybe it’s okay if you got to talking and forgot to journal that extra potato chip into your pedometer/calorie counter/exercise timer.
The BBC Radio program “The Naked Scientists” recently interviewed a man who helped build a program that purports to interpret your ‘personality’ by scanning your FB ‘likes’. He said something that some could interpret as chilling: that if you don’t like the reading their program gives you for your personality summary, then you should change how and what you post and ‘like’. At first even I was startled at the suggestion that we should camouflage something of ourselves even on our own pages. But that’s just it, they’re not ‘our’ pages; they’re owned and operated by a corporation, and companies already admit to using people’s profiles when considering them for hiring. He’s not saying to lie about yourself to yourself, which is how it could sound. He’s saying to craft your image. If you know you’re going to be examined, if you know you’re in a panopticon of sorts, change what they see by changing how you present yourself.

The Fallacy of Equivocation: Jews and Blacks and Gays, Oh My!

One of the first issues one confronts in social ethics is the Fallacy of Equivocation. In the simplest of terms, the fallacy stems from using one word for two different things as though they were the same thing. A truly basic example would be the argument “Carpenters use a plane to smooth wood. The Boeing 747 is a plane. Therefore the Boeing 747 smooths wood.”
In that example it’s clear where the mistake is: “plane” has two different meanings but the proof only uses one. The trouble for social ethics is that this happens quite a bit when we use ideas that have different meanings, but the definitions are distinctions of nuance rather than clear divisions. One extremely useful ethical dissection tool is the Fallacy of Equivocation of Suffering. This gets complicated in a hurry.
First, it must be accepted that not all suffering is equal. Picture someone who has grown up on a fluffy cloud, spoon-fed, never having so much as skinned their knee. On the day of their 30th birthday someone walks up and sticks them in the leg with a knitting needle. This is the most pain they have ever experienced in their entire life. On a scale from 1 to 10, this is a 50.
Now picture someone whose father, every afternoon at 3pm since the day they were born, beat them into unconsciousness with a belt. On their 30th birthday, someone walks up and sticks them with a knitting needle. On a scale of 1 to 10, for them, this was maybe a 2. They might not even remember it tomorrow morning.
Therefore, getting stuck with a knitting needle cannot be rated as a 10 or a 2, it depends on experience. Therefore, physical pain cannot be equally determined for all people. Equivocation of suffering is a fallacy.
Second, equivocation of pain is not a fallacy. It must be accepted that pain is a fundamental aspect of living beings with a central nervous system. It is so much a part of just being alive that people who cannot feel pain have a specific medical term just for them. Congenital analgesia is incredibly dangerous to children; for example, they run the risk of chewing off their own tongues when teething. Pain is fundamental. Therefore, pain, to some degree, can be equivocated. I cannot know what it is to saw my own leg off, but I have felt the pain of being cut, so I can know I would not enjoy cutting off my own leg.
So equivocation of pain is both a fallacy, and not a fallacy. That’s why this gets complicated in a hurry. Because it isn’t just about physical pain, it’s also about suffering.
Now, say we have someone who has never so much as suffered a bad hair day. When they say “The smoking section is by the trashcans?! Jesus, this place is like Nazi Germany!”, it can be reasonably argued that they’re actually right. For that person any rule they don’t like is a horrible indignity and an attack on their freedom. For that person, the knitting needle isn’t just painful, it’s cruel. It’s why teen angst is so common, and also a cliche. The young adult period is when many people are having many experiences for the first time, which is why teen poetry so frequently sucks. It’s maudlin and simplistic and a host of other things specifically because it’s covering familiar ground like it’s never been seen before.
Which is where Hitler comes in. Hitler, and the Nazis generally, are an idea as much as they are a historical fact. In the same way that the word “gazillion” means ‘an indeterminately large number’, “Nazi” is infrequently used to mean “a member of the German National Socialist Party” and often used to mean “an extremely strict and regimented taskmaster who follows arbitrary guidelines without reason.” For someone who has little scope of history or education in political realities, a member of congress can be a “Hitler” just by passing a ban on smoking in public parks. It’s how a 20something, heterosexual, white male, living in the modern US, can say with a straight face that he’s the one who is the real victim of prejudice. To him, he is. For him, the knitting needle is a 10, and Hitler was a 10, therefore the knitting needle is Hitler.
This is where the equivocation fallacy comes into play: pain and suffering are not the same. Moreover, “Hitler” and “Nazi” also have more than one meaning; both are vernacular insults. Therefore “Nazi” and Nazi are not the same. Calling someone “a Nazi” is insinuating that they are mindless totalitarians, thoughtless racists, or just plain mindless and thoughtless control freaks. Which is why Godwin’s Law, that in any online discussion someone will eventually compare someone else to Hitler or the Nazis, is a Law. Hitler is that person’s best, most powerful insult. But Hitler’s easy; he’s well known and there are still people alive who lived through the Holocaust. That’s part of why Godwin’s Law is a Law: lots of people have heard of Hitler, and not nearly as many have heard of Idi Amin. If Idi Amin were as well known, Godwin’s Law would be about Amin instead. It’s not the actual Nazis, it’s the accusation “Nazi”.
Which is how Hitler isn’t the knitting needle. Many have heard of Idi Amin, and even those who haven’t at least know the person they’re arguing with isn’t actually Hitler. Which means everyone in the discussion knows immediately that shouting “Hitler!” in a crowded newsgroup is essentially a Hail Mary. Which is why the argument’s basically over after that, and why Godwin’s Law was later modified with the corollary that anyone who invokes Hitler or the Nazis cedes whatever point it is they were trying to make. Because Godwin’s Law isn’t about 1930s Germany, it’s about calling someone “Satan”, or about saying without saying “Shut up!”
So, where is all of this going? A white, married, 20something in the modern US is not a Holocaust Jew. This seems an absurd statement. But this is where the nuance of equivalent suffering comes to a head. The argument “Obama wants to take our guns, just like Hitler” is historically inaccurate. It simply is; Hitler made guns easier for most German citizens to access than they had been before he became Chancellor. But this is not about the Second Amendment to the US Constitution, or the failures and eventual fall of the Weimar Republic. It is a statement of suffering. It is someone saying “Taking my firearm takes my freedom, and that’s what Hitler did to a lot of people, and now he’s remembered as a monster, so we shouldn’t take my freedom away because that’s what monsters do.” And if equivalence of suffering were not a fallacy, it would be a good argument. They will have their rights violated, and that’s what happened to the Jews under Hitler. Except this is where the fallacy/not-fallacy dividing line becomes extremely important. Because the Jews, under Hitler and the Nazis, were starved and gassed to death.
Even without legal and historical nuance included, one need never have lived through the Holocaust to know that it was a hard slog indeed and it’s not a comparison one should be making casually. Also, it is not necessary for different groups to compare how much they have suffered to agree that suffering is bad. A black person should not be held to the standard of a WWII Jew in order for their perspective to be valid. A homosexual should not be quiet simply because they don’t have a tattoo of a lot number on their arm. A woman is not wrong that corporate sexism is bad simply because it does not include regular lynchings. And more to the point, for the privileged to use “Hey, I’ve suffered too” makes it a silencing tool. Because what is being pointed to is an inequality, not a uniformity of pain. And what an argument of ‘we’ve all suffered in our own way’ does is deny that inequality. If a homosexual is talking to a heterosexual, the heterosexual does not get to say “Yeah, I totally understand not being able to marry, my second wife and I had a hell of a time getting my first marriage annulled.” The inequality is in the statement: the homosexual isn’t having trouble getting married, they legally can’t.
If a man says to a woman, “Hey, I totally understand the glass ceiling, it took me three years to make partner,” that man is undermining the woman’s worth in two different ways. First because he’s comparing abilities where an equivalent is nowhere evident, but second because he’s excluding social factors that haven’t even been considered let alone disproven. It’s called a straw man, where one assumes how someone came to a conclusion, and then disagrees with the assumptions but not the original statement. The woman isn’t asking the man to comprehend how difficult it is to achieve, she is stating that it was hard for her to achieve specifically because she is a she. More importantly she is talking about how hard her achievement was in addition to his.
He, on the other hand, has ignored everything she said, focused on his own career and his own suffering, and brushed aside her point that her obstacles came in addition to his. Interestingly, he does this while admitting that he outranks her without giving any logical reason that should be true. His fallacy of equal pain is that since he had to work hard to get ahead, he understands having his ass slapped at the Christmas party by his boss.
And any pale-skinned American who says “I totally get racism, people don’t believe it’s my picture on my driver’s license after I’ve worked in the yard for a couple of weeks,” really just needs to be smacked. Racism isn’t about people mistaking hue, it’s about Class, perceived origins, and up to 500 years of systematic exclusion and political manipulation.
The appropriate answer, for example, to #BlackLivesMatter is not #AllLivesMatter. The hint for the white guy, of course, is that it’s never taken a constitutional amendment to make him a person. Again, inconvenience is being equated with daily or even weekly harassment. One could also argue that what is hidden just below a statement like that is the argument ‘Sure it’s a pain for you, but you people suffer all the time. My suffering is more rare because it’s just me, therefore it’s more deserving of attention.’ A devaluation if ever there were one.

Original Ideas Frequently Aren’t, Like The Internet For Example

“Rule 34” is in the same category as “Godwin’s Law” and the verbing of the words “Google” and “friend”. It is in that galaxy of terms that relate primarily to the internet and those who use it.
Sidebar: Occam’s Razor was actually, “Plurality must never be posited without necessity”, essentially ‘a hypothesis with the fewest assumptions is generally the most likely’. This has come to be spuriously summarized as ‘The simplest answer is the best’. Meaning an idea can be modified without actually being all that original, and an original idea can get rephrased to something slightly different and still be relevant.
Godwin’s Law has undergone a similar transformation. Originally stated, “As an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches 1.” A discussion that is already about Hitler, the Nazis, or totalitarian regimes is therefore automatically excluded, since there the probability is already 1. Yet the Urban Dictionary already claims that Godwin said, “…somebody will bring up the Nazis or Hitler”. Again, inaccurately rewritten, but not somehow a brilliant and original idea. It’ll be interesting to see what Godwin’s Law will look like in another several years, like poor Occam’s.
Rule 34, “If it exists, someone has made pornographic material involving it”, is flippant, but it is also a reinterpretation of the much older idea of the synthesis of disparate social arenas. Anthropology calls it Fetishization or Personification: making manifest what is intangible, investing an object with meaning or power it does not inherently have, or giving it human form. Freud is the one who said it always had to involve sex. So Rule 34 is actually about 100 years old at the least.
This is important because there is an implication gaining momentum, in the more casual corners of the Social Sciences, that the line between producer and end-user is blurring in a way that is exclusively contemporary and primarily a product of the internet. This is frustrating to see not just because it is reminiscent of every deceptive marketing retooling of “this is new, therefore it’s better”, but because it is wrong. It ignores patterns of history that are discernible, though they require an interdisciplinary approach.
Instead, much of what is written about social media and the social concepts therein smacks of what Robert Anton Wilson called “neophilia”, an ever-present preoccupation with novelty. As though this is the only time that has ever been, and these are the best things that ever were. Until the iPhone 12 comes out, of course; then it will be even better.
Which is not to say that Anthropology or Sociology should exist exclusively in the rarefied atmosphere of Latin cognates and grad students. But there should be rigorous research, not pandering. Each new word that appears does not automatically represent a completely atypical, funky, fresh, straight-out-of-the-blue idea that no one has ever had. Phrases and behaviors are part of a norm that can be analyzed. They should help shape theory through incorporation and interpretation.
Even on the internet, overarching social patterns have antecedents. Finding them requires not deferring exclusively to post-internet culturally significant events. Some, such as speedofcreativity.org, have termed the supposed blurring of creator and consumer “The Changing Face of Creativity” (to emphasize the irony of that phrase, google “the changing face of” and count how many media ad companies come up). A fairly recent example of the mindset that this is exceptional is available on the blog Themarysue.com in Aja Romano’s article “Next-To-Normal-Girl: Tumblr’s Overnight Fandom”.
The article documents a Tumblr artwork contrasting ‘normal’ gamer girls with ‘fakes’. The resulting backlash was for those who disagreed with the original art to post their own, showing the two women in a sexually provocative embrace. This spread quickly, and spawned offshoots involving a variety of dichotomous juxtapositions including ‘normal’ and ‘fake’ gamer men, various characters from separate genres of sci-fi and anime, and even some discussions of radical feminism using a few of the images for a new ad campaign. In each case commentary, support, or criticism was offered, and new groups were formed based on each position.
It started as a simple cartoon, quickly gained notoriety, briefly became a movement, and spawned off-shoots. The author states “As a body, fans have become so good at the production of fanwork, the assimilation of ideas, the collective language surrounding their fandoms, and the overall apparatus of enthusiasm, that they don’t necessarily need a substantial canon to participate in the act of being fannish. Apart from being a fanthropologist’s wet dream, the overnight popularity of this [contrast] definitely contains a hint of defiance… Whether satire and meta alone can fuel [this] without a “real” canon, is… a different question. But there’s a definite advantage and a freedom that comes from working with flimsy source material… And definitely don’t write off the enormous creative energy of fandom to make something out of nothing–and then make it unbelievably awesome.”
Romano is correct; this is the sort of thing a graduate thesis can come out of. But it seems that Rule 34 and the “fannish” behavior Romano documents are simply a more titillating and provocative analogy to other radical breaks from ‘canon’ that happened long before the internet was even a gleam in DARPA’s eye. One such comparison is the Protestant Reformation.
With the coming of the printing press, and of vernacular texts, various Christian sects that would otherwise never have existed gained massive followings in what was, up to then, the blink of an eye. It could rather easily be argued that there is little difference, as far as obscurity goes, between Andromeda ‘Slash-fic’ and the Moravians. More to the point, the “collective language” and “apparatus of enthusiasm” are not exclusive to the internet; they have existed since the written word was first put on gigantic Sumerian temples.
Romano states we should not discount fandom’s ability to “make something out of nothing”, and that “they don’t need a substantial ‘canon’ to participate”, as if this were unique. The Millerite movement, which led to the Seventh-day Adventists, demonstrates this simply is untrue. Or the Gideons, whose entire dogma is a single chapter in the book of Judges in the Bible. And “Fanthropologist”? The apparent derivation would be an Anthropologist who studies fandom, people who form a group with similar interests and behavior. Which is an Anthropologist. Perhaps Romano was attempting to be tongue-in-cheek with the usage, but it betrays a neophilic sensibility, an implication that there are Anthropologists and then there are the people who really understand.
Of course, the other issue TheMarySue brings up is the speed, and therefore heightened passion, of the exchange. But even that is not new. The True Levellers (nicknamed the “Diggers” based on a cartoon) were a group of Anabaptist Communists who published several pamphlets in reaction to The Levellers, who helped fill out the ranks of the New Model Army. They lasted less than a year, their influence is still with us, and they happened in 1649.
They were immediately suppressed by Oliver Cromwell’s forces; other cartoons were issued, other pamphlets, commentary, support, and criticism. Several recognizable names were influenced by their work, including Karl Marx, and a hippie group used the name in the 60s while helping to give the Haight-Ashbury district of San Francisco its reputation. Meaning a group of radical farmers, in the middle of the 17th Century, that lasted for 6 months, helped to spawn Marxism and Woodstock and influenced modern Radical Feminism.
In this are all the trappings of what Aja Romano has documented from her Tumblr feeds: an immediate in-group reaction, a derisive image leading to factionalization, suppression from ‘authority’, eventual counter-protest, and new group formation and institutionalization, all based on text and cartoons. When compared to a Revitalization movement that took place 350 years ago, Occupy Wall Street was stodgy and plodding.
It should also be noted that it is not accurate to say that moving the arena from religion to Dr Who somehow changes the correlations. In other times and places religion garnered vociferous support and derision; it is what was recorded. The pattern is key. Equally valid comparisons could be made with political movements such as the Cherokee Nation during the U.S. Civil War, the Creek “Red Stick” Rebellion, the English Roundheads, or the Luddites. The Boston Massacre is called “The Boston Massacre” because of a cartoon. Also, we in the privileged Online West have a cloyed view of religion and politics and are less likely to see direct parallels in our much more nuanced appreciation of fan-fic. Which is precisely why Social Science exists: to discern those patterns.
This attitude of exclusivity should be noted by Social Science but not fostered. Cultural Relativism asserts that each community be taken on its own terms. This does not mean that Behavioral Science must adopt those terms. Advocacy and appreciation are not the same as reverence. Privilege easily leads to pity, and pity is not far from contempt. If “Fanthropology” and the “Fannish” people are any indication, the language demonstrates that the exclusivity and passive contempt already exist. We may live among the head-hunters; that does not mean we must become them.
None of this discounts that something noteworthy happened. Everything is new to someone. Every exhibition of collective behavior is appropriate to Anthropology. But to present it as devoid of context, exclusive of precursor, somehow sprung purely from the laptops of the interwebs, is misleading and pretentious. It denies any old idea validity, or any historical parallel. It makes our parents an alien species, not ancestors. It renders Anthropology obsolete. That is a very lonely position to take. It doesn’t make us the narrator of our own story, it makes us capricious.
More importantly, if everything we do is original, so long as it’s done on-line, then those who control the network control our minds. In that case, we are no longer the driving force of culture, they are. And anyone not on-line becomes backward at best, irrelevant at worst.
It is a cynical view of history to say that we are immaculate. It is a petulant view of ourselves to say that every thing we do is singular. It is a dangerous view of the internet to say that it is a world unto itself that we have brought into being.
As an Anthropologist, I cannot accept that touch-screens and baud rates are the metric of culture. As a human being I refuse to be told that we’re pristine; it places us in a very child-like state of entitlement. If we perpetually assert that we are the most important thing ever, that no one can understand us, then we must recognize it’s the same message everyone else is getting. And far too easily that rationale becomes the justification for unspeakable cruelty. As it has so often in the past. There is culture on the internet, but it’s not exclusive. There is innovation on the internet, but it is not exceptional.

War is Not an Anthropology Value

Jared Diamond wrote a book. “Guns, Germs, and Steel” won a Pulitzer Prize and claims to be a comprehensive summary of “the fates of human societies” (how modest). It contains chapters like “Apples or Indians: Why did peoples of some regions fail to domesticate plants?” and “How Africa Became Black: The history of Africa”. That’s one whole chapter for ‘The history of Africa’, the second largest continent on earth, the one with the second oldest form of writing in existence. That chapter is a whopping 22 pages long if you leave out the illustrations. This is one of those guys who says he thinks ‘big picture’.
“We’re never going to achieve the level of precision in History that we achieve in Quantum Mechanics,” he said during an interview on the BBC radio program ‘The Life Scientific’. Not with him at the helm we won’t, certainly. But it’s okay, because as he also said, “Most historians bristle at the idea of making comparisons between different events… Like wars, a historian of the Spanish Civil War would never be taken seriously if he wrote about the American Civil War”. I guess he’s never heard of historiography. I guess he’s never heard of PBS Classroom, which has a website that’ll do it for you. But never mind, he’s here to give us the broad strokes so we can reach the higher truths. He’s a maverick, just ask him.
Except he doesn’t, give us the bullet points I mean. He doesn’t think ‘big picture’, he doesn’t even think in generalities, or aggregates; he thinks in cliches. Why did the Indians of California not develop agriculture in such a fertile environment? Because the plants in the area “resist domestication”. “Look at deer”, he suggests, “they’re impossible to domesticate so the Indians didn’t try”. Never mind that the Russians proved it takes less than 75 years to domesticate a fox. Never mind that the Ancient Pueblo and Hohokam cultures of a few thousand years ago, in the region, had extensive agriculture and still show evidence of irrigation canals to this day. California Indians didn’t do it because, well, they just didn’t apparently.
But he’s not just factually inaccurate. In 2008 he wrote an article for the New Yorker about revenge killing in which he discusses a tribal skirmish. He gave the names of his informants. That’s just plain unethical, so much so that the American Anthropological Association Code has a whole section devoted to informants’ right to privacy. When asked on ‘The Life Scientific’ if he felt bad about the article, and about his reporting one informant’s account as if it were a distillate of the entire event, he honest to goodness answered, “The thing I would change if I wrote that article today is I wouldn’t give the names. Because the expectation about Anthropology today is that you change names and omit details. It’s a practice I follow now.” Note the use of the present tense so specifically. Not that distant past of 5 long years ago.
So, cross-comparison is the only way to determine fact, that’s why historians are wrong so often, they don’t do cross-comparison. Cross-comparison is Social Science’s most important tool. Unless he’s only got the one source. Then it’s okay to not cross-compare.
Using someone’s name in a discussion of murder is totally unacceptable, so he won’t do that anymore. He’s not as reckless as he was in his youth, way back 5 years ago. Of course, the anonymity guidelines go back significantly more than 5 years, not just in the AAA but the American Journalists Association Code of Ethics. I’m beginning to question the Pulitzer Committee’s ethics.
So who cares? I’ll relate a story: Not too long ago, I got a Mr Diamond of my very own. He was some guy on Facebook who went head to head with me on a few different topics, and I eventually lost interest. Unfortunately, not before he’d insulted everybody else in the threads, repeatedly; sometimes that’s the way it goes. He did seem to realize that most everyone had shruggingly abandoned the fight, because as the number of replies decreased his baiting became more petty.
But this isn’t truly about me. I’m not relating that I got childishly dissed on a social networking site. I’m not self-absorbed enough to reduce serious concerns over what passes for pop-history to ‘someone was mean on the interwebs’. It’s why I got dissed that’s relevant.
He quoted some Freud, dropped some Hobbesian ‘Social Contract’ Theory; he brought out all the old standards. With “Nature red in tooth and claw” firmly established, he then proceeded to posit that humans are essentially monsters and civilization is nothing but a veneer of control hiding and off-setting the nastiness, brutishness, and shortness of existence. Generosity is a mask for greedy self-interest, compassion is a mass delusion, everybody is cruel. Why this onslaught of nihilism? Because I foolishly asked why he’d said that empathy wasn’t a factor in human kindness.
He even said that any Anthropologist would agree with him. I asked which ones. He said he had no time for philosophy and anyway, he wasn’t interested in trivialities, he was talking ‘big picture’ (where have I heard that before…?). Taking a different tack, I went to the biology. I pointed out that neurological science has discovered what Social Science has long suspected: we come together for love, not out of fear. Fun fact: we get an oxytocin, serotonin, and dopamine rush when we get a hug. We get a cortisol rush when we get yelled at. Literally, at our most vulnerable we feel relaxed, and at our most distant and insulated we want to run away.
So how to resolve that our very brains seem to be programmed to contradict a pain motive for all things? Simple: deny it. The guy pulled a line straight out of “Guns, Germs, and Steel”. Right there, page 291, in answer to why people come together to form cohesive groups: “…Wars, or threats of war, have played a key role in most, if not all, amalgamations of societies.”
My own arm-chair psychologist said this: “Right and wrong are cultural phenomena. Not even sure if empathy factors in. There’s two main motives: The first is fear of consequences, we outlaw what we don’t want to happen to us. The second is an ego defense mechanism, we create laws to justify/rationalize our actions.” The two of them completely agree, fear brings us together and the invisible hand of the state unites our beliefs and gives us an after-the-fact excuse for our atrocities. And they’re both wrong, and we’ve known it for a long time, and that’s really dangerous because one of them won a Pulitzer for saying it.
I asked, “Who cares?” We all should. When someone says that the only reason we love our friends or family is because we’re afraid of them, that’s wrong. Factually it’s wrong, biologically it’s wrong, ethically it’s wrong. When someone says the only reason we care for the sick or the dying is because we hope to establish a debt of reciprocity, that’s wrong. It’s wrong factually, biologically, and ethically.
Even if it were true, where does the justification end? Should rape victims band together in a gang of mutual fear and pain and form a Castration Complex? Should murder victim family members band together in a self-perpetuating spiral of paranoia and create the Everyone’s A Potential Killer Confederacy? And given the rationale, shouldn’t that have happened already?
How do we heal if the only reason other people are in the room is because they’re terrified of us, but they’re more terrified of everything else? How can we call society social if we’re not social, if we’re just a bunch of people who’ve squeezed into a smaller place so the lion has a harder time getting the outliers? It’s like some dystopian fantasy authored to justify being horrible to one another for no other reason than that we’re all in close proximity.
My amateur philosopher was just a dick on Facebook, but the other guy won a damned Pulitzer. More importantly, people believe the argument so pervasively that they can quote it at me in a meaningless post without even realizing that they’re doing it. Maybe it’s not that we’re all monsters, maybe it’s that these two are, and they want the rest of us to believe it’s okay.
I remember an anecdote from “The West Wing” TV show. Toby Ziegler tells the story of an old Jewish man in a concentration camp who falls to his knees and begins to pray. “What are you doing??” his friend asks.
“I’m giving thanks.” The man replies.
“Thanks for what? Are you stupid, look where we are!”
“I’m giving thanks,” the man continues, “For not making me like them.”
I love my friends. They seem to like me a fair amount. I love my partner too, and consider myself quite lucky for all of them being in my life. But it’s certainly not just to cover my ass when the zombie apocalypse comes. More importantly, I’m thankful beyond measure that I know that ‘we’re all just wallowing in collective misery’ is wrong and that I’ve got science on my side.
I’m thankful I’m not like him, and that most of us aren’t. Jared Diamond is wrong; it’s not war that causes society. Pain and fear are not humanity’s primary motives, he’s just hateful. But there is a lesson to be had from Diamond’s writing. It’s a warning to watch for those who justify their meanness not by actually proving it’s natural, but by arguing it should be.
In answer to the question of how we protect ourselves from people with no conscience, people trapped in a constant mindset of amoral self-interest, Martha Stout in ‘The Sociopath Next Door’ states, “As a psychologist, I can tell you that the absence of an intervening sense of responsibility based in emotional attachment is associated with an endless, usually futile preoccupation with domination…”
Jared Diamond is preoccupied with domination. He ignores facts, and codes of ethics, and people who don’t say what he wants. He is not a good theorist, he’s not a good researcher, and he’s not a good author. Because of that he’s a terrible anthropologist. He says it’s all about war, I say he’s telling more about himself than society.

Which of These is Not Like The Other?

What do fat kids and guns have in common?
First, a list: one of these things is not like the other, “PowerBar, Oatmeal, egg, Cheese, Cookie, Cake, Candy Bar, AK-47”. See what I did just there? Maybe so. But let’s fire off a few rounds and see what shakes loose.
Patrick Carrube, writing his article “Why the 1911 Doesn’t Suck” for TheTruthAboutGuns.com, echoed a sentiment that I’ve read on other websites, heard from gun enthusiasts, and that is generally reflected in the purchase records (such as they are) of gun sales from all over the country. Namely, that a lot of newer guns, with their higher-capacity magazines and higher fire rates, are no match for the reliability and accuracy of a gun that’s had basically the same design for over a century.
But along with his testimonials and anecdotes he makes an interesting point about the intensive over-manufacturing of modern firearms. He says their tolerances are sometimes so exact, so laser-etched, so gee-whiz ultra high-tech that even ammo they’re supposed to be able to use doesn’t fit. He says that the older, looser, less exacting models are actually sometimes better.
He stands by his M1911; 100 years of ‘if it ain’t broke don’t fix it’ can’t be wrong, right? Hell, he goes so far as to say that the fact that the safety can be tricky to release, and that the gun takes some getting used to, is a strength, not a weakness. That it takes finesse and practice, like good shooting should. My favorite part was this paragraph: “One can argue that reliable 1911’s are actually more affordable than some other modern pistols. What Smith & Wesson pistol of recent manufacture won’t feed hollowpoints? …Glock? SiG? Beretta? Hundreds of them! …My neighbor has a brand-new S&W that jams every other round, and I have had two issues this year alone with brand-new, well-known, and popular firearms.” He doesn’t mention, but he could, a firearm series that’s become a cliche in the gun world. The Remington 700 series, with its “Walker Trigger”, is famous for being such a piece of crap that it basically fires whenever it feels like it, safety on or not.
Still remember the list? PowerBar, Oatmeal, egg, Cheese, Cookie, Cake, Candy Bar, AK-47. Let’s soldier on.
After the Sandy Hook shooting, and the Batman premiere shooting, and the Virginia Tech shooting, debate centered around firearm limitation. It’s about freedom, the 2nd Amendment, a law so important it’s part of the DNA of the document that made us a country, says one side. It’s about insanity, it’s about protecting people from guns and gun violence, it’s about drawing a line in the sand and saying ‘you’re free, but you’re not that GodDamned free’, says another side.
I was asked by a friend of mine after the latest round of “Ban/No-Ban” went a-spinnin’ if, once and for all, I favored a gun ban. I answered that I favor tipping points. That from the first, I was talking about violence and masculinity, competition-based civic discourse, and that I wanted the whole issue reframed. “I’m in favor of tipping points,” I said. “And until people see this for what it is, I’m just here to ask questions”. Unfortunately, and much to my chagrin, I find that frequently irks people something fierce.
How’s that list doing? PowerBar, Oatmeal, egg, Cheese, Cookie, Cake, Candy Bar, AK-47. Let’s bang on.
Then the full magnitude of the Remington 700 series’ flaws came to light: internal memos going back 30 years showing that they knew all about it killing people accidentally, a news story on Dateline, the works. The call went out and defense attorneys came out of the woodwork to declare that it was a 2nd Amendment issue. “You can’t put manufacturing restrictions on guns,” they said, “it’s our Constitutional right!” After Sandy Hook, several of my friends bristled at any hint of gun regulation, even on magazine size. And yet, when I asked about it, every one of them answered that 75-round drums were worthless, that with proper training a six-shooter was just as fast as any Uzi. None of them owned high-cap mags, none of them wanted to. When I asked them why they were defending a technology that was obviously useless and pointless, I got no answer. They just somehow knew that it was important. Like a kid who’d been asked why sugar tastes good.
Speaking of sugar, permit me a moment’s digression, I promise it’s worth it. Pulitzer Prize-winning author Michael Moss just wrote a book. “Salt Sugar Fat” is about junk food, and junk food science, and how companies have spent decades and billions to get you hooked on Twinkies and hotdogs. He even interviewed people in the industry. He related the following on NPR’s Fresh Air, with Dave Davies standing in for Terry Gross: “I was surprised to hear from the former CEO of Philip Morris, who is no friend of government, no friend of government regulation,” says Moss, “to tell me that, ‘Look, Michael, in the case of the processed food industry, what you’re looking at is a total inability on their part to collectively decide to do the right thing by consumers on the health profile of their products. In this case, I can see how you might need government regulation if [for] nothing else [than] to give the companies cover from the pressure of Wall Street.’ ” These are businesses at their core, and they have to show a profit. They can’t be the first one on the block to win the award for ‘healthiest but bland as dog shit cereal’; their market share would drop too far for it to be worth the risk.
So, a guy who made his money on cigarettes and Hostess Cupcakes is asking for the government to step in and help Campbell’s and Nabisco not be the bad guy. He’s asking for government to take the bullet. Remember that list? PowerBar, Oatmeal, egg, Cheese, Cookie, Cake, Candy Bar, AK-47.
Depending on what stats you trust, we have enough guns for every man, woman, and child to own 4, and enough bullets to blow a crater in the moon with the collective gunpowder. What if, just what if, the gun manufacturers know this? What if they also know that none of them wants to be the first to stop making 100-round drums and 9mm’s that don’t fire for shit but that people buy like iPods and that are released like the newest video game with state-of-the-art graphics? What if, just what if, we’re as addicted to guns as we are to Nutter Butters, and the companies know it?
So what’s with the damn list? Egg, that’s what. Of every item on the list, eggs are the only thing that doesn’t come fully formed from a factory press. They come out of the ass end of a chicken. It’s called Confirmation Bias: through a combination of the Serial Position Effect, Availability Cascade, the Von Restorff Effect, the Spacing Effect, and plain Repetition, we’ve all been told that the AK-47 is the stand-alone, that it’s not like the others. When the whole time it was about what a chicken can make that we can’t. Eggs are even an essential element in the production of some of the others. Only the egg is the irreducible item.
What if, just what if, the AK-47 is exactly like most of the others? What if it’s exactly like that food CEO and Cookie Crisps? What if they’ve been making shitty guns for years because that’s the only way they have of saying, without actually saying, “The NRA leadership has officially drunk the kool-aid, we’ve saturated the market three times over, this has become nothing but a never-ending spiral. It’s gotten so bad we’re making them to the standards only a crazy person would want, and instead of cutting off the whole thing you’re helping the crazy people get more!? Take the hint already and do what we can’t to get out of this death spiral”?
I’m in favor of tipping points. Let’s do this, so we can start talking about why it is that 80% of mass murderers are male, why it is that they go to schools and not Klan rallies, why they use guns and not cars or Molotov cocktails, why they make it personal and not political or anonymous. How is it not blind to say it’s an absolute right for me to own something nobody seems to own, and an absolute wrong to just stop making it?
So, what if the manufacturers aren’t trying to cajole us into doing what they can’t? Well, that’s certainly a possibility. But what would it look like if they were trying…?