The more I look at development, the more I think the age of the game-changer is over. Sixty percent of the world’s poor live in middle-income countries; only 14 percent of them are in fragile or conflict-prone ones. The countries still getting aid are getting less and less of it. Charles Kenny, who wrote an entire book about how much better the developing world is now than it used to be, points out that in the 1990s, 40 percent of aid-receiving countries relied on donations for more than one-tenth of their budgets. Now, that’s below 30 percent, and dropping.
Not that we should ignore the Afghanistans and Burundis of the world, but by 2030, up to 41 countries are going to move into the middle-income bracket. Increasingly, their challenge, like ours, will be the distribution of resources, not the creation of them. The development technologies of the future aren’t going to be boreholes and school buildings. They’re going to be labor inspectors, census bureaus, government administrators, state pensions: all the boring stuff that makes our own countries function.
I have no idea how much Apartheid is taught these days, but American schoolkids need to know this shit:
Black townships in ‘white’ South Africa were kept as unattractive as possible. Few urban amenities were ever provided. Black businessmen were prevented by government restrictions from expanding their enterprises there. No African was allowed to carry on more than one business. Businesses were confined to providing ‘daily essential necessities’, like wood, coal, milk, and vegetables. No banks or clothing stores or supermarkets were permitted. Restrictions were even placed on dry-cleaners, garages, and petrol stations. Nor were Africans allowed to establish companies or partnerships in urban areas, or to construct their own buildings. These had to be leased from the local authority. Black housing was rudimentary, consisting of rows of identical ‘matchbox’ houses. Only a small proportion had electricity or adequate plumbing. Overcrowding was commonplace. In Soweto, the main black urban area serving Johannesburg, the average number of people living in each ‘matchbox’ house in 1970 was thirteen.
The disadvantages under which the African population laboured in the ‘white’ economy were legion. Africans were barred by law from skilled work, from forming registered unions, and from taking strike action. In industrial disputes, armed police were often called in by white employers to deal with the workforce. If Africans lost their job, they faced the possibility of deportation. A considerable proportion of the workforce received wages which fell short of providing the costs of family subsistence: An employers’ organisation, the Associated Chambers of Commerce, calculated in 1970 that the average industrial wage was 30 per cent below the minimum monthly budget needed for a Soweto family of five.
That’s a clip from Martin Meredith’s ‘The State of Africa: A History of the Continent Since Independence‘. And above that, a photo I took when I was in South Africa for work a few years ago.
I’m not sure why I’m so interested in South Africa, why I feel so strongly that this country’s history should be known and discussed more, why this shit gives me a double-gravity feeling in my stomach unlike anywhere else.
In college I got super into this political philosopher, John Rawls. Rawls’s big thing was that we should organize our societies as if we were doing so from scratch, like we couldn’t decide how or to whom we would be born into them. You might be the child of a poor Jamaican single mother or a hipster trust fund brat or an AIDS orphan. You might be tall or short or dumb or smart or have an alcoholic father or Down’s Syndrome or anger management problems. If you could enter a society with any of these challenges, goes his idea, you would design it so that they did not become your fate.
South Africa is the 20th century’s most extreme example of this principle applied in exactly the opposite way it was intended: If you were deliberately trying to disenfranchise an ethnic group, to make it impossible for them to achieve wealth or stability or well-being, how would you do it? You would start by denying them housing and medical care and political representation. You’d restrict their movement, keep them uneducated, erect un-jumpable hurdles to prosperity. You’d rig the rules so that no matter how hard they tried, they were breaking them.
By this point we’ve all read Ta-Nehisi Coates’ The Case for Reparations. It’s basically a biography of all the structures, from slavery to sharecropping to segregation, that prevented African-Americans from fully participating in America’s rise to become the world’s wealthiest country.
I’m not trying to be all ‘America practiced Apartheid too!’ The circumstances in both countries are unique, and arguments based on analogies, as Coates himself has pointed out, are usually meant to inflame, not to teach.
But the reason I think Apartheid should be regarded as a more important benchmark in the 20th century is that these structures, the ones facilitating prosperity or preventing it, exist in every society. It’s the deliberateness with which they were established, as well as their outcome, that is extreme in the South African case, but every country’s state apparatus falls along the same spectrum, whether we admit it or not. I feel like Coates’s article, academic books like Why Nations Fail (with its talk of ‘extractive institutions’) and even problematic gen-pop shit like ‘check your privilege’ hashtags represent a growing acknowledgement that this is the case.
One of the reasons we watch science fiction is to watch our societies exaggerated back at us. Sometimes we can do that without having to make anything up.
Today I’m on NPR’s ‘Snap Judgment’ talking about the time my co-worker died and what it did to our workplace afterwards.
I wrote about it for The Billfold last year, and someone at NPR saw it and they asked me if I could convert it into a monologue and I don’t really know what that means and so I read what I wrote into a microphone and now it’s on the radio. (And, um, no that’s not me in the photo.)
Here’s what I learned, and what I think, about this experience:
Recording takes ages. The 10 or so minutes you hear on the podcast took four and a half hours to record. I stood in a phone-booth-size room lined with padding and read my script into a microphone over and over and over. I did it sitting, standing, far from the microphone, close to it, loud, whispering, everything. Whenever my stomach gurgled or I scratched myself or my shoelace-nub dragged along the floor, we had to redo the line because the mic picked it up.
Acting is hella hard, you guys. Every time I finished reading the script out loud, I got notes from the producer: ‘Do it again, but this time act like it’s really funny.’ ‘We need you to sound numb, but also in the moment.’ ‘Try it as Edward Norton in Fight Club.’
It’s super hard to keep all this in mind while still remembering to read at about 65 percent of your normal speaking speed, sticking word-for-word to your script and standing absolutely still so the microphone can’t hear any of your rustles.
So yeah, most of the reason it took so long was my rank amateurishness. ‘Can’t you guys fix this with Auto-Tune?’ I kept asking. And this was a script that I wrote. Describing something that actually happened to me. If I had this much trouble making it sound convincing, how are there people who can inhabit shit like ‘If I bleat when I speak, it’s because I’ve just been fleeced‘ or ‘They run as if the very whips of their masters are behind them‘?
I am not sure I should have done this. Writing is, by definition, at a distance from its subjects. Even in present tense, it’s still told by an omniscient narrator, still filtered through one person’s voice and perspective.
Speaking something out loud is different: You have to decide how you’re going to sound when you describe something, not just the words you use. You have to give a voice, an actual voice, to all of your characters. They can sound like Alicia Silverstone in Clueless or they can be Condoleezza Rice, it’s up to you.
When I wrote this, I thought it was a story about how much of an asshole I am (everything I write is at least 60 percent that). How I tried to make my coworker’s death about me, how I failed to form any connection with my colleagues afterwards, how I let a chance for personal connection go by.
Reading it out loud, speaking about and as the people who were there, I’m afraid it becomes a story about how I’m less of an asshole than they were. That is unfair. And listening to it now, I fear I am not a good enough writer or speaker to have made it not that.
Ironic detachment is easy. I genuinely struggle with this. I don’t mean as a writer, but like as a colleague and a friend and a person. It’s easy to be numb, remote, to hide behind sarcasm, to deadpan the details. It’s harder to try. To make people real. To assume the best of them. To refrain from comparing my insides to their outsides.
I don’t know, I’ve been reading a lot of Gabe Delahaye lately. He has this post from a few weeks ago about the New York Times article where they interviewed people who spotted Philip Seymour Hoffman in the days before his death. Not friends or family, just random people who saw him at a restaurant or Starbucks or whatever. The whole story is just quotes from these people about how haggard and tired he looked.
OH DID IT? DID A HEROIN ADDICT’S SKIN LOOK BAD IN THE DAYS BEFORE HE OVERDOSED ON HEROIN?
If I have a point—and I am not sure that I do—it is that we do not have to give a quote to the New York Times just because they asked us for a quote. We do not have to write a Tweet just because we are waiting in line for the bathroom. We can spend entire days in silence if we so choose. You can keep your mouth shut. It is possible.
Standing still, reading your own words over and over again into a microphone, it makes you think about how you’re saying them. Once it’s finished, once you’ve decided, you’re left with the question of why.
On September 24, 2010, Mark Zuckerberg announced on Oprah that he was donating $100 million to the Newark Public School system. Zuckerberg wasn’t from Newark; he had no particular connection to the city. But he had become convinced—by the city’s great need, as well as its charismatic mayor—that his donation could have real impact there.
‘Schooled’, Dale Russakoff’s brilliant New Yorker story, describes what happened next:
More than twenty million dollars of Zuckerberg’s gift and matching donations went to consulting firms with various specialties: public relations, human resources, communications, data analysis, teacher evaluation. Many of the consultants had worked for Joel Klein, Teach for America, and other programs in the tight-knit reform movement, and a number of them had contracts with several school systems financed by Race to the Top grants and venture philanthropy. The going rate for individual consultants in Newark was a thousand dollars a day.
I’ve been working in international development for eight years now. It took me at least the first two to realize that money is not enough. Newark had a huge donation, passionate leaders, engaged parents, principals begging for more autonomy, teachers willing to compromise, a whole nation of expertise to draw from. And yet the reform effort stalled.
Improbably, a district with a billion dollars in revenue and two hundred million dollars in philanthropy was going broke. Anderson [the district superintendent] announced a fifty-seven-million-dollar budget gap in March, 2013, attributing it mostly to the charter exodus. She cut more than eighteen million dollars from school budgets and laid off more than two hundred attendance counsellors, clerical workers, and janitors, most of them Newark residents with few comparable job prospects. “We’re raising the poverty level in Newark in the name of school reform,” she lamented to a group of funders. “It’s a hard thing to wrestle with.”
School employees’ unions, community leaders, and parents decried the budget cuts, the layoffs, and the announcement of more school closings. Anderson’s management style didn’t help. At the annual budget hearing, when the school advisory board pressed for details about which positions and services were being eliminated in schools, her representatives said the information wasn’t available. Anderson’s budget underestimated the cost of the redundant teachers by half.
The board voted down her budget and soon afterward gave a vote of no confidence—unanimously, in both cases, but without effect, given their advisory status.
You can read this as a story of city leaders trying to circumvent basic principles of democracy and public participation to implement their own technocratic regime. Or you can read it as a story of entrenched interests protecting their own jobs and salaries and ideologies at the expense of educating children. Either way, it should make all of us careful about these sorts of one-big-push reforms, the idea that all it takes to fix a broken system is a big fat stimulus and the political will for a reboot.
It’s not fair to blame Anderson or Zuckerberg or Cory Booker or Chris Christie. Laughing at their failure is understandable, our first instinct, but it’s only useful if it’s our first step toward learning from it. It sounds as if everyone involved—the teachers, the principals, the parents, the money—was genuinely dedicated to fixing the schools. It is depressing that all that, still, wasn’t enough.
Depressinger still is that this is a story that takes place in a developed country, with a functioning government, with the background already painted onto the canvas. If we can’t fix our own failing schools, what chance do we have of fixing them in countries without all that?
I haven’t spent enough time in developing countries to know them like I know my own, but what I’ve seen so far is that every society, rich and poor, contains intolerable failures, has already marshaled its own forces to fix and defend them. I do not know what it is that they need to solve their problems, but I fear it may be more than what we can offer.
One idea—microfinance, child sponsorship, LifeStraw, GiveDirectly—is not going to solve the problems of Zimbabwe or Peru or Papua New Guinea any more than $100 million is going to solve the problems of the Newark public school system. I don’t want to say that international development doesn’t need your money, because it does. But more than that, it needs your patience.
I majored in journalism. I worked at the student newspaper at my community college and then my real one, then did internships at two daily newspapers. Then I gave it up, I moved to Europe, I went to grad school and I ended up working at NGOs for the next eight years.
Since 2012 I’ve been sort of doing journalism again. Nothing serious, just little essays about stupid shit I did as a teenager or a friend of mine who was briefly a prostitute. Lately I’ve been getting slightly more ambitious, writing about foreign countries I visit for work and, this one time, how HIV is way worse in the US than in Europe.
If it’s not already obvious that I’m an amateur from my essays, it certainly is from the methods by which I produce them. I interview people too long, ask them stupid questions, forget to call them ‘doctor’, bug them with too many follow-ups. And I also, the biggest sin of all, send them drafts of my essays for comments before they’re published.
This is highly un-standard operating procedure. In journalism school the rule was, you could check direct quotes—i.e. the stuff in quote marks, not paraphrases—with your sources, and you could fact-check your numbers with them, but giving them actual excerpts from your story would compromise the independent, objective role of journalism.
The reasons behind this rule are obvious. Can you imagine an investigative reporter writing an exposé of a corrupt governor and checking it with him beforehand? Journalism is supposed to, like the old saying says, comfort the afflicted and afflict the comfortable. Giving a source advance warning of your story, a chance to revoke their quotes or edit your conclusions before it’s published, profoundly undermines that role.
So I get why the rule exists. But not all journalism is political analysis or corruption investigations or public-figure profiles. In the last few years, the rise of ‘explainers’ (Ezra Klein, Nate Silver) and the general trend toward narrative-izing academic findings (Malcolm Gladwell, David Brooks, TED Talks) have demonstrated the utility—and the demand—for works of journalism that see their sources as collaborators rather than antagonists.
Me, I’m paralyzed-scared of getting anything factually wrong in my essays. As I mentioned the other day, for my HIV piece I read probably 150 documents and interviewed like 18 people. Many of these people and documents didn’t agree with each other, or emphasized different historical or demographic factors as the key to explaining the higher rates of HIV deaths in the United States (‘It’s the health care system!’ one of them would say. ‘The health care system doesn’t matter!’ another would counter).
Weighing that up, then cinching it into a few thousand words, then trying to make it readable for people who are less obsessed with this topic than I am, there’s no way to do that without leaving some conclusions and explanations on the side of the road. The only way to make sure I got my conclusions right was to share them with the people who provided the basis on which I made them.
So I sent my essay to six of my sources. Everyone got back to me. All of them had comments and corrections, all of them were reasonable, and all of their changes got included in the essay before it ran.
Most of the corrections were related to terminology. ‘Your story says there were 15,500 people diagnosed with HIV in 2010,’ one of my sources wrote. ‘What you mean is infections, not diagnoses.’ That’s actually a pretty important distinction, and the kind that traditional magazine fact-checkers might not notice.
I also let them alter their direct quotes. I was a bit nervous about this, since in journalism school they taught us that anything in quote marks is sacrosanct. ‘I have you on tape with this exact wording,’ is what they told us to say when sources backtracked on their interviews. ‘You knew you were talking to a journalist.’
But what’s the point? Like the others, the changes in quotes they suggested were grammar and terminology and clarification, not self-preservation. One of my sources told me that when you’re on Medicaid it’s difficult to move ‘from one place to another’. She wanted me to change it to ‘from one state to another’. Should I have stood on principle and refused to change the quote? Her suggestion is more accurate than what I had originally anyway.
Knowing I was going to send a draft of my article to my sources made me write it differently, made me work harder to fairly summarize what they said. It’s possible to get all your facts and your quotes correct and your conclusions wrong; having expert eyes on the full content, the tone and the structure and the corny jokes, made me think harder about what I was actually saying, not just the numbers I was using to say it.
There’s also the issue of courtesy. Academics, authors, people who work at AIDS clinics, they’re busy; the ones I spoke to spent unbelievable amounts of time, one-on-one, walking me through the basics of the field in which they are experts, my own little Socratic seminar. They sent me their academic work and their data and their annual reports, knowing that I was going to quote and paraphrase them without a chaperone. I paid them nothing for this, not even the guarantee of being name-checked in my article. The least I can do—as a person, if not as a journalist—is to show them in advance how I will represent them, give them a chance to correct what I got wrong or over-condensed.
I’m not arguing that every single piece of journalism should be checked with the subject of it. I was talking with a magazine editor the other day about this, and he said ‘whenever you write a profile of someone, they end up hating you. That’s how it works.’ No one wants to read a piece of propaganda, or be fed conclusions that have been vetted and authorized by the people they are concluding about. Fair enough.
But the ethical prohibition on sharing drafts of stories with sources comes from the assumed un-alignment of interests between the journalist and subject. The subject of a profile or a political story or business news has an interest in putting out a particular version of themselves—the hero, the victim, the striver, the successful startup, whatever. The journalist has an interest in telling the truth, or at least in finding the angle that’s going to get their story read and shared and talked about.
But in the case of explainers and science journalism and (some types of) feature stories, the interests of the journalist and the subject are aligned. Both want to bring the truth to a complex subject. Both want to bring attention to a field or a finding that was previously unknown. Both want to frame the narrative in a way that will get the general public interested. The bestselling Freakonomics was written through collaboration between a journalist and an academic. The documentary Food, Inc. was created with the oversight of two of the subjects (Michael Pollan and Eric Schlosser) interviewed in it. I think that adds to the credibility of the finished works, rather than diminishing it.
I didn’t share my HIV story with all of my sources. The CDC, who graciously provided me with Excel after Excel of estimates and back-calculations, and was generally lovely to work with, all they got was the figures from the story and an outline of my general points. Even I’m savvy enough to know that they have interests beyond the accuracy of the story.
Sometimes I think about this old Yogi Berra quote, about his relationship to the press: ‘You shouldn’t have printed what I said. You should have printed what I meant.’ (See, this is why you shouldn’t use direct quotes from memory. I can’t find it on Google. It might not have been Berra, and was probably phrased differently. Anyway!)
I remember reading it on a 365 Dumb Quotes calendar we kept on the kitchen table as a kid. These days, it doesn’t seem so dumb.
Here’s a section that got cut from my New Republic story about the use of the US dollar in Zimbabwe.
Wait, so a country can just adopt the United States’s currency without our permission?
“The U.S. government has never taken any overt position on dollarization, formal or informal.” This is Benjamin Cohen, a political economy professor at the University of California Santa Barbara, former Fed employee and the author of some articles I’ve been reading to try to understand how one country just gets up one morning and starts using another country’s money.
Ninety percent of the world’s $100 bills, Dr. Cohen says, are in circulation outside of the United States. Dozens of countries are considered to be “highly dollarized,” meaning more than 30 percent of their money supply is in dollars.
Unlike Zimbabwe, which has formally adopted the dollar, most countries use the U.S. dollar informally, in parallel with the local currency. A few years ago I was in Cambodia for work, and found that the local currency, the riel, was only used for small stuff like meals, transport and entertainment. Anything major—a TV, a plane ticket, an iPhone—prices were quoted and paid in U.S. dollars.
It’s not just Cambodia. These sorts of arrangements are commonplace throughout the Middle East, Latin America and Southeast Asia. People use the local currency, but keep U.S. dollars as a hedge against inflation, like Tea Partiers hoarding gold.
According to Cohen, the United States has no reason to prevent these arrangements. Not only does the U.S. dollar provide a quarry of monetary calm for citizens of inflating nations, the U.S. actually makes money every time our money leaves our borders. “Seigniorage,” as the economists call it, is the profit the U.S. earns every time a foreigner ‘buys’ a dollar for a dollar. (It costs 6 cents to print a $1 bill. If you print one, then use it to buy something that costs a dollar, you’ve just earned 94 cents of profit. That’s seigniorage.)
This sounds like it shouldn’t be a real thing, but the US earns $20 billion per year from all those $100 bills held internationally. Not a huge proportion of GDP, but hey, free money, right?
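The arithmetic here is simple enough to sketch out. A minimal illustration, using only the figures from the paragraphs above (the 6-cent printing cost; the per-denomination costs are an assumption for illustration, since printing costs vary by bill):

```python
# Back-of-the-envelope seigniorage, per the figures in the text above.
# Assumes a flat ~6 cents to print a note, regardless of denomination
# (an illustrative simplification; real printing costs vary by bill).

def seigniorage(face_value: float, printing_cost: float = 0.06) -> float:
    """Issuer's profit when a note is 'bought' abroad and stays there:
    the face value received minus the cost of producing the note."""
    return face_value - printing_cost

# A $1 bill held overseas: about 94 cents of profit for the issuer.
print(seigniorage(1.00))

# A $100 bill held overseas is almost pure profit.
print(seigniorage(100.00))
```

Which is why the $100 bills circulating abroad are such a good deal for the U.S.: the higher the denomination, the closer seigniorage gets to the full face value.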
The other upsides are obvious. Every time another country uses our currency, it reinforces the U.S. dollar as world’s preferred international currency, just like every time someone drinks a Coke or eats a Big Mac it reinforces the status of those brands.
Foreign countries using our currency even gives us diplomatic power. Panama, one of the first countries to formally adopt the U.S. dollar, froze in its tracks when the U.S. cut off access to hard currency in the late 1980s to put pressure on Noriega.
The most immediate downside of foreign countries dollarizing, for the U.S. at least, is that it creates a headache for the Fed. The more countries dollarize, the more the Fed has to take them into account when making monetary policy. A million calculations go into the decision to raise or lower interest rates, and the last thing the Fed needs is to add the interests of Cambodian iPod salesmen into the mix.
A more serious downside is that if a dollarized country suddenly reintroduced its domestic currency, it might flood the market with millions of now-unneeded U.S. dollars, reducing the value of all of them. It doesn’t even have to be a whole country: If the dollar was used widely enough, huge sales of dollars by foreigners could significantly affect its value.
This is why, Cohen says, the U.S. takes a policy of “benign neglect” toward foreign countries that want to formally or informally dollarize. You want to buy a bunch of dollars and give them to your citizens in exchange for your old currency? Fine. You want to encourage your banks to offer accounts denominated in U.S. dollars? Have a blast. The U.S. isn’t going to help you set this up, but it’s not going to stop you either.
Ten countries (East Timor, Ecuador, El Salvador, Panama and a bunch of small island nations) are formally dollarized, meaning the U.S. dollar is their official currency (most of them have their own coins though).
Zimbabwe is formally dollarized in that all government spending is in U.S. dollars, but it also recognizes the euro, the British pound, the Botswanan pula and the South African rand (why the Mozambican metical got left out, I have no idea). Stores accept payment in whatever currency you have handy, and sometimes give you change in a different currency than you paid.
One of the things that always surprised me about Zimbabwe was how it just switched to U.S. dollars one day, without any relationship to the U.S. Federal Reserve. It was even under sanctions at the time. Can it just do that?
“It’s totally normal to switch to the U.S. dollar without any relationship to the Fed,” Cohen says. “It doesn’t require an application. Anyone can buy paper money, and anyone can get a dollar bank account. Their own country may restrict those things, but the U.S. doesn’t.”
When Ecuador officially adopted the U.S. dollar in 2000, it carried out a mass currency conversion. The central bank sold their U.S. treasury bonds to the U.S. for cash, brought the cash back to Ecuador and gave Ecuadoreans a window in which to exchange their sucres for U.S. dollars. The U.S. didn’t orchestrate, nor condemn, this process.
Like an introduced species, the U.S. dollar tends to take over an increasingly large percentage of the economy. The only country Cohen knows of that has de-dollarized is Israel, which introduced the U.S. dollar in the late 1970s as a parallel currency, and only managed to get rid of it after a series of economic reforms reinstated confidence in the shekel. Lots of informally dollarized countries, like Argentina, go through waves of increasing, then decreasing dollarization in line with citizens’ confidence in the local currency.
I have no idea what any of this means for Zimbabwe. As I say in the New Republic story, bringing back the Zimbabwe dollar is seen by economists (including the head of the Reserve Bank of Zimbabwe) as a bad idea, but that doesn’t mean it won’t happen.
Dr. Cohen’s written a bunch of interesting, easy-to-read articles on dollarization from the US perspective:
- U.S. Policy on Dollarisation: A Political Analysis (my favorite)
- Dollarization: Pros and Cons
- Is A Dollarized Hemisphere in the U.S. interest?
- Dollarization, Rest in Peace
Thanks for the interview!
Originally posted at The Billfold
I make a mean marinara sauce. I sauté onions, garlic and bacon (yes, bacon) for 10 minutes until they sweeten and become crisp, then add a big glass of red wine, a can of chopped tomatoes and generous pinches of salt, basil, oregano and rosemary. Then I leave the room. When I come back two hours later, the sauce is thick, sweet and almost purple. I throw in a handful of fresh basil leaves—done.
I’ve been thinking a lot about my marinara this week because I’ve been reading Michael Moss’s Salt Sugar Fat: How the Food Giants Hooked Us. Company after company, product after product, Moss shows how Big Food formulates products for maximum addictiveness and overeatability. Oreos, Cheetos, Lunchables, Wonder Bread, they’re all the same Iowa corn and Brazilian sugarcane, just liquefied, dyed and processed into different shapes and colors.
The same week I read Moss’s book cataloguing how Big Food is trying to kill us, I read David H. Freedman’s Atlantic cover story about how it’s also going to save us all. According to Freedman, big food companies—the same ones Moss accuses of nutritional euthanasia—are actually de-fatting, de-sugaring and de-salting their products one by one. McDonald’s is using whole-wheat buns, Cargill is selling a fullness-inducing tapioca starch, Stevia is fucking everywhere.
It’s a great article, and Freedman’s butchering of sacred foodie cows (Michael Pollan! Farmer’s markets! Granola!) is both essential and effective. But when it comes to his core argument, that America’s obesity problem is going to be solved by better processed food and bigger corporations, I’m not convinced. That’s not because I think it’s impossible to make a healthier Oreo or Pepsi or Lunchable—it wouldn’t actually be all that hard. Nope, corporations won’t make us healthier because capitalism makes it impossible for them to do so. Bear with me, I’ll explain.
1. Scale, Speed and Shelf Life
Let’s say I want to start selling my marinara, and I want to turn it into an industrial food megabrand—another Ragu, Hot Pockets, Lean Cuisine. The first thing I have to do is make it in huge batches and make each of those batches taste the same. No more willy-nilly tossing of spices, no more adding whatever veggies are in the fridge. I need to standardize every single element, from the weight of the onions to the heat under the pot.
To keep costs down, maybe I cut the simmering time in half, use salt instead of hours to make the flavors come out. Moss notes that herbs are up to 10 times more expensive than salt in industrial cooking, so that’s the first no-brainer modification.
The next problem is shelf life. Those Lunchables might look all crisp and fresh when you grab them out of the refrigerated aisle, but they sat around at room temperature for at least two months before they got there. Warehouses, wholesalers, truck beds, stockrooms, my marinara is going to need a lot of help not to go bad in all that time. That means preservatives (most of which, according to Moss, are derivatives and modifications of salt), chemicals, coloring agents to save my marinara’s magenta as it trundles across the country.
So now my sauce has been made in huge batches, jarred, shipped and shelved. It’s in the supermarket aisle. I win!
But wait. Thanks to all the preservatives and additives, my marinara tastes like an old sock. I go back to my simmering pot, add a glob of vegetable oil, a dash—OK, a deluge—of high fructose corn syrup, some thickeners and emulsifiers so it has that pasta saucey texture, and it’s ready for the store again.
Before I grew up and started cooking, I thought the pasta sauce I bought at the store was the same as the one I could make on the stove. I was just paying a bit extra so a factory worker somewhere did the chopping, seasoning and simmering for me. This is how our economy is supposed to work, right? I don’t knit my own clothes, I don’t build my own house, I don’t weld my bike together from parts. Why should food be any different?
There’s a scene in Moss’s book where he goes to a Cargill facility and they make him a slice of industrial-scale bread without any salt. The texture, the taste, the color, everything is wrong, Moss says. It tastes like a piece of tin foil.
This scene confused me. When I make bread at home, I use about half a teaspoon of salt for an entire loaf. If you cut the salt out of my homemade bread, yeah, it’s bland and a bit puffier (Alton Brown teaches us that salt counteracts the effectiveness of yeast), but it’s still bread, not some horrifying replicant.
But my bread, the one I spend the better part of a day kneading and proofing, is stale before I can eat about half of it. Wonder Bread, with 27 ingredients, half a teaspoon of sugar and 7 percent of your daily allowance of salt in every slice, lasts on the shelf for two weeks.
Processed food isn’t bad for you because the products—pasta sauce, macaroni and cheese, white bread—are inherently sweet and salty. They are bad for you because they are inherently industrial. Supermarket supply chains are long, slow and unforgiving. Which means everything you buy at one has to be made in massive batches, perfectly standardized and capable of sitting at room temperature in a glass jar or plastic bag for months on end. If you took that kind of abuse, you’d need chemical assistance too.
My marinara sauce is now mass-produced, shelf-stable and OK-tasting. Sure, it’s got some extra salt and sugar, but it’s still one of the healthier brands on the shelves.
The only problem is, no one is buying it. Every other brand of pasta sauce at the supermarket has way more sugar and fat than my sauce, and they taste way better. To get people to switch to my sauce, I’m going to have to add even more sweeteners (sugar) and flavor enhancers (salt).
One of the most tragic sequences in Moss’s book is the story of Kraft in the early 2000s. The company, flush with power from its huge market share in cereal (Raisin Bran), cookies (Oreos) and packaged pastas (the eponymous mac and cheese), started taking health and nutrition much more seriously. It added extra labels (alongside the minuscule USDA-mandated serving sizes, it listed nutrition facts for the whole package) and stealthily reduced the salt, sugar and fat in its most popular products. It even cut the calories in Oreos and started selling them in 100-calorie packs.
And then Hershey’s invaded. Starting in 2003, the chocolate company launched a line of S’mores cookies that were fatter and sweeter than Kraft’s newly trimmed-down Oreos. Kraft started to lose market share. It had no choice but to retaliate. And that’s how we got Banana Split Cream Oreos, Dairy Queen Blizzard Creme Oreos and Triple Double Oreos. They tasted better than normal Oreos, they had more sugar and fat and, not coincidentally, they sold better. Does Hershey’s even make cookies anymore?
The story of Kraft is one of the reasons I find Freedman’s “How Junk Food Can End Obesity” article so unconvincing. All of the major food companies—from Pepsi and General Mills right down the line to Monsanto—are publicly traded. They’re big, they’re multinational, they’re corporations. This means the only thing that matters to them is profits.
This isn’t a normative claim or a moral judgment, it’s just a factual description of their corporate form. In a dilemma between earning more profit and protecting public health, profit will win. In a dilemma between earning more profit and anything, profit will win. Again, not a judgment, just a description.
Freedman profiles the Carl’s Jr. Charbroiled Atlantic Cod Fish Sandwich, a not-fried, not-sugared, not-terrible-for-you sandwich sharing menu space with fries and sodas. With the right marketing, the right “Would you like to try” push from employees, America might just start eating it. And, Freedman argues, just might get a little slimmer, a little healthier.
That’s a nice scenario, and it might even happen, and yay if it does. But Freedman doesn’t walk us through the scenario where Wendy’s or Burger King launches a similar fish burger, one that’s fried, that’s salted and sugared, that has triple the tartar sauce. One that, because of all these differences (and this is the killing stroke), tastes better. What can Carl’s Jr. do except retaliate in kind?
Two years ago, the New Yorker ran a feature detailing how Pepsi (and its subsidiary, Frito-Lay) was launching a “we’re healthy now” makeover. Less sugar and salt, more vitamins and whole grains. They even hired a guy from the World Health Organization to implement his own science-backed health standards right through the soda-and-potato-chips family.
And then, like Kraft before it, Pepsi buckled. The minute U.S. sales fell to third place (after Coke and—the horror—Diet Coke), Pepsi launched an all-hands-on-deck marketing campaign to go back to selling its old sugar-water staple.
Two years after the healthy makeover, Pepsi’s CEO told shareholders, “We refocused our efforts on our key global brands and categories in our most important developed markets to drive profitable growth,” annual report-ese for, “we marketed the shit out of our unhealthiest products.” Pepsi traded the guy from the WHO for Beyoncé. The stock soared.
And that’s how it goes. Processed food companies are like drug addicts, promising “next time it’ll be different, watch!” when they’re euphoric on market share and rising stock prices. As soon as they crash back down, they’re right back to their old habits. Cheap sugar, loud marketing, bogus health claims.
This is why Moss’s book and, in a different way, Freedman’s article are so depressing. Companies aren’t evil, they’re not greedy, they’re not pernicious. They’re just companies. As Moss points out, they’re as addicted to shitty food as we are.
Freedman’s right that just because a food is “processed” doesn’t necessarily mean it’s bad for you. And just because something is organic or local or homemade or “natural” doesn’t mean it’s good for you. But I can’t help but notice that a Starbucks muffin has 500 calories and that the one I make at home has 140. Ragu, the number one pasta sauce in America, has almost nine teaspoons of sugar, more than a day’s recommended amount of salt and as much fat as a milkshake in each jar.
Freedman would probably point out that my marinara sauce is not particularly healthy (wine and bacon, after all, are just foodie forms of salt, sugar and fat) and, serving for serving, must be more expensive than $2-per-jar Ragu. He might argue that in a few years, Ragu or General Foods or Kraft will offer a pasta sauce that’s nutritionally identical to mine, and that I’d be an asshole and a snob not to buy it. And he might be right.
But for now, neither of us can escape the reality that food, like everything else we buy, is designed to be cheap to make, to last forever and to taste better than the next product down the shelf. And also like everything else, after you buy it, you’re on your own.
I disagree with basically everything in this essay, but I can’t stop thinking about it.
I’ve recently been reading the collected writings of Theodore Kaczynski. I’m worried that it may change my life. […]
Here are the four premises with which he begins the book:
1. Technological progress is carrying us to inevitable disaster.
2. Only the collapse of modern technological civilization can avert disaster.
3. The political left is technological society’s first line of defense against revolution.
4. What is needed is a new revolutionary movement, dedicated to the elimination of technological society.
Kaczynski’s prose is sparse, and his arguments logical and unsentimental, as you might expect from a former mathematics professor with a degree from Harvard. I have a tendency toward sentimentality around these issues, so I appreciate his discipline. I’m about a third of the way through the book at the moment, and the way that the four arguments are being filled out is worryingly convincing.
Kingsnorth’s (and Kaczynski’s) argument is basically that the human species is destroying the planet, and that we as individuals may be powerless to stop it, but we’re obligated not to participate in it.
This is the progress trap. Each improvement in our knowledge or in our technology will create new problems, which require new improvements. Each of these improvements tends to make society bigger, more complex, less human-scale, more destructive of nonhuman life, and more likely to collapse under its own weight.
Spencer Wells takes up the story in his book Pandora’s Seed, a revisionist history of the development of agriculture. The story we were all taught at school—or I was, anyway—is that humans “developed” or “invented” agriculture, because they were clever enough to see that it would form the basis of a better way of living than hunting and gathering. […]
Hunter-gatherers living during the Paleolithic period, between 30,000 and 9,000 BCE, were on average taller—and thus, by implication, healthier—than any people since, including people living in late twentieth-century America. Their median life span was higher than at any period for the next six thousand years, and their health, as estimated by measuring the pelvic inlet depth of their skeletons, appears to have been better, again, than at any period since—including the present day. This collapse in individual well-being was likely due to the fact that settled agricultural life is physically harder and more disease-ridden than the life of a shifting hunter-gatherer community.
So much for progress. But why in this case, Wells asks, would any community move from hunting and gathering to agriculture? The answer seems to be: not because they wanted to, but because they had to. They had spelled the end of their hunting and gathering lifestyle by getting too good at it. They had killed off most of their prey and expanded their numbers beyond the point at which they could all survive. They had fallen into a progress trap.
We have been falling into them ever since.
I have such a kneejerk rejection of these kinds of arguments it’s practically an allergy. I happened to read Kingsnorth’s essay the same week I was reading John Steinbeck’s ‘Travels With Charley’, his road-trip diary from 1962, and this passage suddenly became relevant.
It is life at a peak of some kind of civilization. The restaurant accommodations, great scallops of counters with simulated leather stools, are as spotless as and not unlike the lavatories. Everything that can be captured and held down is sealed in clear plastic. The food is oven-fresh, spotless and tasteless; untouched by human hands.[…]
Even while I protest the assembly-line production of our food, our songs, our language, and eventually our souls, I know that it was a rare home that baked good bread in the old days. Mother’s cooking was with rare exceptions poor, that good unpasteurized milk touched only by flies and bits of manure crawled with bacteria, the healthy old-time life was riddled with aches, sudden death from unknown causes, that that sweet local speech I mourn was the child of illiteracy and ignorance.
It is the nature of a man as he grows older, a small bridge in time, to protest against change, particularly change for the better. But it is true that we have exchanged corpulence for starvation, and either one will kill us.
The lines of change are down. We, or at least I, can have no conception of human life and human thought in a hundred years or fifty years. Perhaps my greatest wisdom is the knowledge that I do not know. The sad ones are those who waste their energy in trying to hold it back, for they can only feel bitterness in loss and no joy in gain.
I see Steinbeck as an example that even in the ‘Golden Era’ our current technophobes harken back to, critics at the time were harkening back even further. I want to snark that 10,000 years ago there was probably a middle-aged nomad complaining that things were better 10,030 years ago.
But Kingsnorth and his essay are smarter than that.
A scythe is an old tool, but it has changed through its millennia of existence, changed and adapted as surely as have the humans who wield it and the grasses it is designed to mow. Like a microchip or a combustion engine, it is a technology that has allowed us to manipulate and control our environment, and to accelerate the rate of that manipulation and control. A scythe, too, is a progress trap. But it is limited enough in its speed and application to allow that control to be exercised in a way that is understandable by, and accountable to, individual human beings. It is a compromise we can control, as much as we can ever control anything; a stage on the journey we can still understand.
There is always change, as a neo-environmentalist would happily tell you; but there are different qualities of change. There is human-scale change, and there is industrial-scale change; there is change led by the needs of complex systems, and change led by the needs of individual humans. There is a manageable rate of evolution, and there is a chaotic, excitable rush toward shiny things perched on the edge of a great ravine, flashing and scrolling like sirens in the gathering dusk.
Kingsnorth uses these observations as an excuse to withdraw, live off the soil, mow the grass with a scythe, unplug. For him, that’s living at a human scale.
But for other people, maybe living at human scale is spending more time plugged in. Maybe it’s making music and sharing it with your friends, maybe it’s using social media to organize events to meet your neighbors, maybe it’s (ahem) writing essays in magazines about the stuff you read and the stuff you think.
I’m not calling Kingsnorth a hypocrite. If he wants to go off-grid, escape the progress trap, if that makes him happy, he should. But I don’t think his premises, or even his doomsday ‘the planet is dying!’ prediction means we all should. This is the world we’ve got. Whether we got here through progress or a ‘progress trap’, here we are.
Steinbeck’s diary describes getting lost over and over again, and how most locals are terrible at giving directions. After a while, he says, he doesn’t even ask how he should get where he’s going, he just asks them to tell him where he is.