5 Jun 2015

Twilight of the Professors

Michael Schwalbe

Twenty-eight years ago Russell Jacoby argued in The Last Intellectuals that the post-WWII expansion of higher education in the U.S. absorbed a generation of radicals who opted to become professors rather than freelance intellectual troublemakers. The constraints and rewards of academic life, according to Jacoby, effectively depoliticized many professors of leftist inclinations. Instead of writing in the common tongue for the educated public, they were prodded, by carrot and stick, into writing in jargon for tiny academic audiences. As a result, their political force was largely spent in the pursuit of academic careers.
Jacoby acknowledges that universities gave refuge to dissident thinkers who had few other ways to make a decent living. He also grants that careerism did not make it impossible to publish radical work or to teach students to think critically about capitalist society. The problem is that the demands of academic careers made it harder to reach the heights achieved by public intellectuals of the previous generation. We thus ended up with, to paraphrase Jacoby, a thousand leftist sociologists but no C. Wright Mills.
Since Jacoby’s book was published, things have gotten worse. There are still plenty of left-leaning professors in U.S. colleges and universities. But as an employment sector, higher education has changed. There are now powerful conservatizing trends afoot that will likely lead to the extinction of professors as a left force in U.S. society within a few decades.
One major change is that the expanding academic job market that Jacoby observed is now shrinking. When the market for professors was growing, as it was in the 1960s and 1970s, radicals could get jobs in universities, earn tenure, and do critical intellectual work, even if it was often muted by a desire for conventional academic rewards. Today, tenure-track jobs are fewer and farther between. In response to reduced budgets and out of a desire for a more “flexible”—that is, cheap, pliable, and disposable—labor force, university administrators have cut tenure-track lines, preferring to hire faculty on a temporary, part-time, non-tenure-track basis.
This tightening of the academic job market has intensified competition for the tenure-track jobs that remain. Under these conditions, it is prudent—as each new cohort of graduate students discovers—to focus one’s efforts on publishing in academic journals and avoid rocking any boats, in print or in the classroom. Graduate students are advised that Facebook pages and Tweets should be crafted with the concerns of prospective employers in mind. Anticipation of a competitive job market thus begins to conservatize students early in their graduate careers.
The contingent employment that awaits many of today’s graduate students and that is a fact of life for many of today’s faculty is further conservatizing. Although all faculty are supposed to enjoy academic freedom, contingent faculty whose writing or teaching causes trouble are easily dismissed. A contract is simply not renewed, or a department chair says, “Sorry, we have no sections for you to teach,” and that’s the end of the matter. This precarious situation conduces to playing it safe, making no demands, and keeping students happy. Then there is the practical matter of how much research and writing one can do while trying to piece together a living by teaching four or more courses per semester, often at exploitatively low wages.
Competition for jobs and contingent employment are the easier-to-see conservatizing forces. Others are less obvious. One of these is the growth of online instruction. This form of instruction turns a course—something once understood to be a mix of scripted and improvised performance under the control of a professor—into an ownable piece of intellectual property that can be administratively inspected and altered. Knowing that every detail of what one does as an instructor leaves an electronic record, subject at any time to administrative review, can be inhibiting at best and chilling at worst. The best bet, again, is to keep the material and discourse on safe ground.
Conservatizing forces are affecting tenure-track and tenured faculty as well. Budget cuts have led to increased pressure to get grants—an academic ball game that favors normal science and conventionality. Austerity has also intensified internal competition for resources, a competition that has in turn led to greater productivity demands (We must publish more, lest we look bad compared to department X!) backed up by more stringent post-tenure review procedures. All this tends to keep faculty oriented to doing pedestrian academic work. While there is no rule against cultivating a role as a public intellectual, there is only so much time in the day, and professors, like other workers, come to devote themselves to doing what they will be held accountable for and rewarded for.
Then there are the usual conservative attacks on professors. These are nothing new. From the elders of Athens to Andrew Carnegie to Reed Irvine to David Horowitz to today’s know-nothing Republican legislators, blasting professors for asking disturbing questions and pointing out inconvenient truths is standard cultural and class warfare. For the most part, such attacks, at least since the end of the McCarthy era, have been deflected by traditions of free speech, academic freedom, and tenure. Yet here again new economic and political realities have made these attacks more ominous.
When right-wing legislators control state governments, their anti-intellectualism can have serious consequences. It’s not just that budgets for public universities can be and have been cut because of legislators’ hostility to non-vocational higher education; it’s that professors are increasingly aware that their public statements can draw retribution. Here in North Carolina in the past year we have seen centers and institutes in the UNC system closed because faculty associated with them offended Republican legislators. No professors lost their jobs, but no one failed to see that a warning shot had been fired.
Some North Carolina Republican legislators even tried to dictate higher teaching loads for UNC system faculty and to forbid state employees, including professors, from using any work time or state resources to so much as comment on public issues. Both proposals were promptly quashed by saner heads. But again the message was clear: We are watching you and, if we can, we will use our political power to cut you off at the knees. This is not a battle peculiar to North Carolina. Similar struggles are occurring in Wisconsin and other states where ALEC-driven Republicans are in control.
A widening embrace of neoliberal ideology amplifies these threats. Austerians and free marketeers want public universities to operate like vo-tech schools and to serve as think tanks for big business. And so they have targeted for elimination programs in the social sciences and humanities. The claim is that because these programs do not lead to jobs and do little to advance capitalist enterprise, taxpayer money should not be used to support them. This is a view that appeals to middle- and working-class voters whose wages have stagnated and whose taxes have risen. It’s also a view that could lead to an eventual gutting of the liberal arts in public universities, and thus even fewer jobs for PhDs who might aspire to be public intellectuals.
I have been referring mainly to public research universities, in part because they are what I know best. But it is also because of their value and vulnerability that public research universities are of special concern. These universities are supposed to operate in the public interest, thus giving faculty warrant to speak about policy issues and social problems. And unlike teaching-oriented schools, research universities expect faculty to publish. For these reasons, public research universities have had the potential to nurture critical intellectual work. Yet precisely because they are taxpayer supported, at least in part, they are vulnerable to attacks by neoliberal ideologues.
Prestigious private universities are a different story. They are not vulnerable to the same kinds of budget manipulation and demagogic rhetoric about the proper use of taxpayer money. Faculty in these schools have ample time and generous support for research and writing. Ivy League schools are also happy to see their faculty achieve celebrity status. With a few exceptions, however, Ivy League professors who achieve public visibility do so as house intellectuals for the nation’s elite; they are more likely to be legitimators of the status quo than its radical critics. The path to tenure at Harvard does not go through publication in Monthly Review.
Some colleagues with whom I’ve spoken about the demise of professors as public intellectuals have told me not to worry. It’s true, they admit, that we might not have the towering figures of the past, but today we have thousands of websites, blogs, and Twitter feeds through which ordinary professors, those without national fame, can reach a wide audience. For self-serving reasons, I would like to believe this. We indeed have more ways to put out ideas and information than in the pre-Internet era. But I am not so easily comforted.
Even if there are more potential outlets for critical analysis, the same conservatizing forces noted above—tougher job competition, contingent employment, surveillable online instruction, demands for grant-getting and conventional forms of productivity, more stringent accountability regimes, legislative monitoring and related attacks—continue to gain strength. So even if there are new means for reaching non-academic audiences, most professors have good reasons to ignore them. You want to Tweet, blog, or write for websites? That’s fine, just do it in your spare time and don’t expect to be rewarded for it. And be careful what you say.
In 2008, Frank Donoghue, an English professor at Ohio State University, published The Last Professors: The Corporate University and the Fate of the Humanities. Donoghue says, and I agree, that being a professor is still a great job—it affords status, decent pay, autonomy, control over one’s work, and a measure of democratic control over one’s workplace—but today the job is being degraded by the drive for greater managerial control of the university. Professors, especially at the middle and lower tiers of academia, are thus ceasing to be the self-directed, curiosity-driven intellectual workers they once were, or could have been. Despite the undeniable corporatization of the university, when I first read Donoghue’s book I thought he was being alarmist. Now I think he was too cautious.
Just as the public intellectuals that Jacoby reveres began to fade when their economic niche eroded such that they could no longer survive by freelance writing for engaged publics, so too with professors. The niche that once supported critical intellectual work in the university and allowed professors to offer independent analysis to a wide audience is changing. These changes will ever more strongly discourage professors, even tenured ones, from aspiring to or becoming public left intellectuals. What remains after that is likely to be merely academic.

Crisis and the Politics of Possibility

Rob Urie

Following in the tradition of his father and Bill Clinton before him, in 2003 George W. Bush launched a ‘liberal’ war to, in the words of the agreed-upon rationales, protect the ‘homeland’ from attack and remove a dangerous tyrant from office. Enlisted in selling the ‘war of liberation’ were liberal hawks who apparently believed, against considerable history, that the American military is a force of liberation, and trusting feminists who believed, against considerable history, that exporting Western liberalism at the point of a gun would free women from religious patriarchy to realize themselves. Reliably, the result was grim destruction beyond the imagination of most mere mortals— the only liberation that took place was of one million Iraqis from their mortal coils.
The point isn’t retrospective finger pointing, but rather to pin down where liberal goals departed from the social mechanisms that were believed to support them. Part of what generated the calamity was that American liberals were free to have opinions without bearing their consequences. The sympathetic frame put forward was of ‘speaking for the voiceless,’ many of whom apparently believed the American liberation myth themselves. But the more serious shortcoming was in not understanding the motivations of the political leadership, the nature of the corporations and other economic interests seeking to benefit from war and the complexity of the social relations that were destroyed. Put differently, the conception of ‘freedom’ at work left unconsidered the divergent interests of military contractors, infrastructure rebuilders and multinational oil companies, whose life-blood is economic plunder.
By the measures of their times, Pablo Picasso, the Fluxus Group, Ornette Coleman, Frida Kahlo, John Cage, Iggy Pop and Susan Brownmiller had social voices through what they did, not what they owned. Simple assertion that everyone should have a voice has little bearing on whether or not they do. In a move related to this latter point, car companies assert that owning a luxury car gives one social bearing. By 2007 this theory had minimum wage earners in California borrowing $720,000 to buy a house that screamed ‘success.’ The ‘democratization of capital’ gave voice to the voiceless until the housing bust made it evident that poor people can’t afford a voice. Many who bought into the capitalist ruse will pay for it for the rest of their lives. Original image source: vidalexus.com.
In its contemporary incarnation the question of who gets to speak, to have voice in Western societies, came to the fore in the 1960s, with deeper articulation coming in the 1970s. Emerging from the perceived failure of the French rebellion of 1968, French philosophers Jacques Derrida, Michel Foucault and Jean-Francois Lyotard used their readings of Friedrich Nietzsche and Martin Heidegger to reject overarching (‘meta’) social explanations offered by the then moribund European left. What followed was an intellectual fashion where both adherents and critics of philosophical postmodernism proceeded from the conclusions of these philosophers without spending much time with the sources of their critique.
Paradoxically in some respects, advocating for voices for the (socially) voiceless became the received wisdom for a generation or more in academia. From a left perspective, then emergent neo-classical economics provided a link between post-modern ‘micro-narratives’ and the capitalist thesis of the day, ‘micro-production.’ Foucault’s ‘technologies of the self’ intersected with crude business theory to craft externally defined ‘consumption units.’ Margaret Thatcher’s “there is no such thing as society, there are individual men and women, and there are families” was put forward in the context of capitalist renaissance sold through the prospect of self-realization through ‘markets.’ The farce sold as promise was social inclusion through consumption, through what we ‘choose’ to own.
The housing boom and bust engineered by Wall Street grew out of the ‘democratization of capital’ which promised economic inclusion through borrowing to fund consumption. Predatory lending had a long history including spectacular, but for the most part localized, implosions in the years that preceded the boom / bust of the late 1990s and 2000s. Certain types of loans were known to be predatory and had repeatedly been outlawed until they reappeared and were used to target poor neighborhoods of color. The initial boom boosted house prices and created the illusion that predatory capitalism benefitted traditional ‘out’ groups. The subsequent bust dispelled this myth and the social carnage left behind persists. Original image source: redcrossggr.wordpress.com.
This is to agree with the premise that everyone who wants a social voice should have one while strongly contesting the idea that simple assertion makes it so. Furthermore, from what is necessarily conjecture based on evidence of looming environmental catastrophe, unless a very large number of people become political in the sense of defining and acting on joint interests, there seems a real possibility that no one will have a voice. In his Specters of Marx Derrida, intellectual ‘patriarch’ of philosophical post-modernism, revisited the de-politicization implicit in social ‘realization’ that assumed away political and economic context. The contingencies of race, sexual orientation and gender posed through critique of their historical and intellectual bases in social ontology were re-asserted by later thinkers as ontological fact through ‘identity.’
Put in somewhat cruder terms, what is considered personal choice in the realm of theorized rights is bourgeois fantasy outside of it. Arguing that race and gender are culturally and historically contingent places them in intellectual context without addressing their social facts. These facts include systematic differences in economic outcomes, incarceration rates and political disenfranchisement that tie to historical social relations. The question of changing these political and economic facts ties back to intellectual history through proposed solutions— in broad terms to the Thatcherite precept that personal ‘responsibility,’ the bundle of personal attributes that one ‘acquires,’ determines social outcomes versus the liberal – left argument that social problems require social solutions. The systematic nature of gender and racial disparities supports the latter interpretation of social facts.
The current debate over ‘anthropocene’ (human caused) versus more clearly attributable causes of global warming illustrates the ‘external’ nature of material social opposition. As I argued here, available evidence strongly suggests that capitalist production and Western consumption produced the CO2 emissions that are causing global warming. The ‘anthropocene’ argument implies that hunter-gatherers in Namibia are as responsible for global warming as the Koch Brothers and ExxonMobil. From a Marxian – left ‘material’ perspective continued existence is a precondition for having one’s voice heard. Evidence from the capitalist response to global warming suggests that existence is but a matter of opinion, of having the correct attitude, in that view. Original image source: zmescience.com.
Part of this philosophical conundrum can be found in Hegel’s ‘idealism,’ in the historical back-and-forth of ideas that Marx concluded lacked basis in the economic circumstances that give them experiential content. Whatever one’s ‘identity’ as determined through cloistered self-realization, people have social identities to their significant others, to the landlord, to one’s employer, to the police and to the tax man. One’s capacity to favor the self-realized self, whatever its ‘true’ genesis, over the ‘external’ selves of social life is a function of political and economic circumstance. Economic independence, the material precondition of cloistered self-realization, is a function of economic class; hence the criticism that asserting the ‘right’ to self-realization without providing the political and economic context required for it to take place is bourgeois fantasy.
But it is bourgeois fantasy that begins from a particular political perspective. To his credit, Derrida confronted the distressingly obvious intersection of neoliberalism and philosophical post-modernism through his retained understanding that the micro-narratives of Western individualism are bound together through the capitalist ‘meta’ narrative. As is made visible by capitalist ‘facts,’ a complex global architecture has been built to support this ‘external’ system of theorized individual self-realization. Capitalist theory is premised on a narrow conception of human ‘being’ as reactive ‘chooser.’ Its small-bore ‘choices’ fit the ‘micro-narrative’ as metaphor for the way in which ‘external’ social ontology is internalized as ‘identity.’ A close analog is the (engineered) social mechanics by which buying a luxury car imparts ‘success’ as externally given internalized identity.
One of the more persistently painful chores in writing about politics is addressing the policies of President Barack Obama, given the sincerely held aspirations he embodies for large numbers of people. For those who have never taken a marketing course, aspirational advertising is one of the more powerful marketing tools because it exploits the paradox of social technologies used to manipulate alleged individual decisions. Once aspirations are removed, what is left are actual policies that render Mr. Obama virtually indistinguishable from any of his recent predecessors. The charge of racism lodged against Mr. Obama’s substantive, as opposed to ad hominem, critics is in fact racist for using race as a reason not to criticize Mr. Obama’s policies when criticism is warranted. Original image source: wrln.org.
The paradox is: if some notion of a ‘real’ self to be self-realized is held, then material conditions that facilitate it are necessary in a world where material conditions are a function of class, race and gender. Contrariwise, if, as capitalist / neoliberal theory has it, there is no ‘self’ there to be ‘internally’ realized but only an unfilled (unfillable) bundle of wants that require ‘external’ realization through market exchange, then the issue is in the process of being resolved through trade agreements and the technologies of social coercion being created ‘for our own good.’ The postmodern assertion that small ‘d’ democracy provides a voice for the previously voiceless, a base premise of the post-Occupy ‘left,’ is either content-free or else material conditions must be reconfigured to provide the necessary social context for it to work.
The increasingly common critique that Marxist theory is largely the opinion of White men of European descent seems in an informal sense to be descriptively accurate. But to put the question back: what, if any, are the social conditions that facilitate other views? In the classical liberal / capitalist view everyone, poor and rich alike, has the capacity for transformative change by working harder and consuming smarter— within the ‘choice’ of capitalism. In this view American history is the failure of poor people and / or people of color to improve their own lots. Social outcomes may differ, hence the tendency to blame ‘absentee fathers’ or a ‘culture of dependence’ for systematic racial and gender disparities. But unless critics simply assume away social and historical context, assertions that social conditions are irrelevant to self-realization are implausible.
Western bankers have used the levers of state power like the IMF and World Bank to their own advantage for centuries. Following the initial bank bailouts in 2008 banks used national governments in the U.S. and Europe to squeeze borrowers or to shift their liabilities to those who can be made to pay. In the process entire nations like Greece and Ireland assumed private debts to benefit bankers while impoverishing the most vulnerable of ‘their’ citizens. This is but one aspect of the ‘neutral’ basis from which ‘horizontal’ political movements hope to wrest power from captured governments. Original image sources: huffingtonpost.com; foxnews.com.
The postmodern conception of the social voice appears to correspond to the interpretation of free-speech rights by big-city Mayors who establish free-speech zones a few miles away from political events. People are free to say and do (‘be’) anything they care to as long as they don’t stop traffic or reduce corporate profits. And it supports the liberal conception of a ‘marketplace of ideas’ that in theory makes all voices heard, not just those of White men of European descent. This ‘democratization’ through markets was a major selling point of the Reagan / Thatcher programs of capitalist resurgence. But as with other capitalist conceits, the marketplace belongs to those who can pay for it. From within it people have a choice between CNN, Fox News or shouting from inside free-speech zones. This latter point is what Derrida appears to have choked on when he realized what postmodernism without the originary texts had wrought.
Should this read as addressing issues resolved long ago, consider that much of the ‘horizontal’ political theory of the ‘new’ left in the U.S. and Europe combines postmodern conceptions of the ‘self’ with capitalist / neoliberal theories of the state to conclude that well-organized oppression can be willed away. ‘Left’ and ‘Right’ can be thrown together as repressive narratives, but to those who have bothered to study political and economic theory it is the ‘Right,’ an amalgam of tightly circumscribed ‘choices’ and repressive technologies, that is existing political economy. Moreover the perception, implied or explicit, that there is ‘neutral’ ground from which liberatory politics can emerge has a long history as ideology— it is anything but politically neutral. Economic democracy, a/k/a socialism, seems the more promising basis for a politics of possibility.

The Price of Race Privilege

David Rosen

Americans are being squeezed.  The mounting crisis caused by global economic restructuring and the Great Recession is taking its toll on all Americans, especially people of color, African-Americans, Hispanics and Native-Americans.  However, too often overlooked in analyses of the deepening crisis is its effect on a growing number of the traditionally more privileged white populations.  They too are being squeezed.
Part I of this two-part story focused on four economic issues affecting the nation’s white majority — wealth, income & poverty, bankruptcy and education.
This part focuses on four more personal issues affecting white Americans – drug addiction, suicide, obesity and homelessness.
Racism is America’s great shame, endemic to the body politic, social life and personal relations.  For nearly a half-century, Republican politicians have exploited racial divisions by promoting the Southern strategy, playing the “race card” to win elections.  The strategy has long been successful and will likely be played out yet again in the upcoming 2016 campaign.
However, given the changes since the ‘60s, old-fashioned terms of racial stereotyping like “nigger” and “wet back” can no longer be publicly uttered. They have, however, been replaced by equally coded terms like “inner city rioter,” “welfare cheat” and “undocumented.” No matter the rhetorical obfuscation, everyone knows what the coded expressions mean.
The Republicans’ strategy has long been to assure white voters, especially poor and working-class people, that their “white skin privilege” made them better off than people of color. Sadly, this story has long been true; on average, whites have been better off than America’s demographic minorities. However, such race-based deception can only work for so long; its time may be running out.
White Americans are being, simultaneously, eclipsed and squeezed – and they know it. Their relative proportion of the country’s population is shrinking while their economic gains stagnate. A century ago, wave after wave of mostly white European immigrants – Irish, Italian, Eastern Jews – recast the nation’s demographic make-up. Often forgotten, the white gentry of the day, old-line WASPs, denied that these immigrants were white. Their arrival took place as U.S. economic and military prowess began to assert itself.
Now, a century later, a new wave of immigrants is arriving from all over the world, but especially from the embattled continental South. Unfortunately, this demographic shift is occurring as U.S. economic prowess is being eclipsed and its military hegemony flounders. Like a shifting tectonic plate, the global economic reordering now underway is fostering postmodern feudalism. The lords of America’s 21st-century manor or plantation are the new robber-baron gentry, financial capitalists putting the squeeze on an ever-growing number of Americans.
And white people, like black, Hispanic and other Americans, are caught in the Great Squeeze.  It is an era marked by growing inequality and mounting social deprivation.  How whites deal with this social reordering will very much determine the nation’s fate.
* * *
Considerations of four personal issues affecting white Americans – drug addiction, suicide, obesity and homelessness – follow.
Drugs
A recent NPR story on the pre-campaign election circuses in New Hampshire and Iowa featured a spot on the rising drug crisis besetting heartland America. As it reported, “Drug overdoses now kill more Americans than traffic accidents. And, in many places, there’s a growing acceptance that this isn’t just a problem for other people.” Ted Gatsas, the mayor of Manchester, NH, lamented, “A dose of heroin is now cheaper than a six pack of good beer.” Political hopefuls like Gov. Chris Christie, Hillary Clinton and Sen. Rand Paul voiced their concern about the issue. Most revealing, the NPR report focused on victims of the heroin plague and used coded language – “isn’t … other people” (i.e., people of color) – to denote white citizens; remarkably, the word “white” was never uttered, although those profiled were white residents.
According to the Centers for Disease Control (CDC), overdoses (i.e., “drug poisoning”) are “the number one cause of injury-related death in the United States, with 43,982 deaths occurring in 2013.” It found, based on data from 28 states, that the “death rate for heroin overdose doubled from 2010 through 2012.” Drilling down, it found there were 8,257 heroin deaths, most involving men aged 25–44 years.
In 2013, an estimated 25 million Americans were using illicit drugs, about 9.4 percent of the population aged 12 or older; this is up from the 2002-09 rate of 7.9 percent. The drugs used were marijuana/hashish, cocaine (including crack), heroin, hallucinogens, inhalants and prescription-type psychotherapeutics. Among whites, illicit drug use increased to 9.5 percent from 8.5 percent in less than a decade.
Illegal drug usage in the post-Prohibition era has gone through three phases. First, during the ‘50s-‘70s, hipsters and hippies, white and black, smoked the evil weed. Second, in the ‘80s, a “crack cocaine scare” gripped the nation following the adoption of the infamous Anti-Drug Abuse Act (1986) and the launch of the “war on drugs.” The Act made penalties 100 times harsher for crack than for powder cocaine convictions; 85 percent of those jailed for crack cocaine offenses were black, despite the fact that the majority of users were white. Third, in the post-Great Recession era, the drug war is seen as a failure by a growing number of politicians and ordinary Americans. They share a belief that it’s time to change the nation’s drug laws (e.g., decriminalize marijuana) and make drug busts for usage and small sales less punitive (e.g., end 3-strikes laws).
Among whites today, drug use or abuse is rampant. In 2013, the “legal” drug of choice was alcohol: nearly three-fifths of whites (58%) were drinkers and nearly a quarter (24%) were binge drinkers. The use of tobacco products (e.g., cigarettes, cigars) among whites is still over one-quarter (28%). With regard to “illegal” drugs, in ’13, marijuana was Americans’ favorite means of getting high, accounting for four-fifths (81%) of illicit drug users, about 20 million users per month. Among full-time college students, whites have the highest rate of illegal drug use at 25 percent.

Methamphetamine (“meth”) was once the drug of choice among white males (e.g., outlaw motorcycle gangs and blue-collar guys) and remains so, though it is losing its appeal. In the ‘90s, at the height of its popularity, the Open Society estimated there were only one million meth users. Today, its use has spread to white women and Hispanics.
The new drugs of choice among white Americans are psychotherapeutic drugs and heroin. A 2010 report from the National Survey on Drug Use and Health found that, during 2009, 2.4 million individuals used psychotherapeutic drugs, including pain relievers, tranquilizers, stimulants and sedatives used for nonmedical purposes. Most alarming, it found that for 2007 – the last year for which data was available – deaths from unintentional overdoses from such drug use increased to approximately 27,000. Almost a decade later, one can only wonder what the current number of deaths is. According to the American College of Obstetricians and Gynecologists, “Adolescent girls and women older than 35 years have significantly greater rates of abuse and dependence on psychotherapeutic drugs than men.”
Columbia University’s Mailman School of Public Health recently reported that between 2002-2005 and 2008-2011 there had been a 75 percent jump in heroin usage among “Hispanics and non-Hispanic whites.” “The noteworthy increase in the annual rate of heroin abuse or dependence among non-Hispanic whites parallels the significant increase in nonmedical opioid use during the last decade and the growing number of heroin overdose deaths described for this race and ethnic group in recent years,” noted Dr. Silvia Martins, the study’s lead epidemiologist.
Suicide
Some refer to the American West as the “Suicide Belt,” the region of the country with the highest suicide rates. As one researcher asked: “Why are Westerners so much more likely than people in other parts of the country, particularly those in the East, to kill themselves?” No one has an answer to this vexing question.
In 2013, whites had the highest suicide rate in the country, at 14.2 per 100,000; American Indians and Alaska Natives were second with a rate of 11.7. However, during 2005–2009, the highest suicide rates were among American Indian/Alaskan Native males, with 27.6 suicides per 100,000, and Non-Hispanic White males, with 25.96 per 100,000. Among women, Non-Hispanic Whites had the highest rate, with 6.7 suicides per 100,000.

Suicide among active duty service-people and veterans of the U.S. military deserves special consideration and is especially troubling in light of the most recent wars in Afghanistan and Iraq and the aging of Vietnam War vets of the ‘60s. The U.S. Army reports that between 2008 and 2012, the number of active duty soldiers who committed suicide increased by 30 percent, to 349 from 268. While there is considerable debate about the extent of suicide among veterans, Sen. John McCain and others have cited Veterans Administration figures putting the total at 22 vets per day, or over 8,000 a year. Many of these are white men.
Obesity
Americans are getting pudgy, putting on the pounds. According to the Kaiser Family Foundation, nearly two-thirds (64%) of American adults are overweight. While people of color suffer higher rates of overweight than whites do, 63 percent of whites are overweight. The greatest concentration of overweight whites is in the red states: Alabama (66%), Arkansas (69%), Georgia (64%), Indiana (67%), Iowa (67%), Mississippi (67%), Nebraska (67%), North Dakota (68%), Oklahoma (67%), South Dakota (67%), Tennessee (68%), West Virginia (69%) and Wisconsin (67%).
Being overweight is not the same as being obese. Both are calculated in terms of “body mass index” (BMI). In a 2013 study on obesity, income and race, Pew Research provided this definition: “A person’s BMI is his or her weight (in kilograms) divided by the square of his or her height (in meters), rounded to one decimal place. A BMI of 25 or more is considered overweight; 30 or more is considered obese.” Pew divided income into three categories calculated against the poverty level: (i) up to 130%, (ii) between 130% and 349% and (iii) above 350%. Looking exclusively at white men and women by income, it found that at the lowest income level, 30 percent of men and 39 percent of women were obese; among more middle-income whites, 35 percent of men and 38 percent of women were obese; and among the highest income group, 32 percent of men and 28 percent of women were obese.
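To make the arithmetic of Pew’s definition concrete with a purely hypothetical case: a person weighing 80 kilograms and standing 1.75 meters tall has a BMI of 80 / (1.75 × 1.75) ≈ 26.1, which falls in the overweight range but below the obesity threshold of 30.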
Homelessness
In the 1950s and 1960s, the typical homeless person was white, male and in his 50s. Today, it is principally people of color who suffer homelessness. The Institute for Children, Poverty and Homelessness estimated in 2012 that homelessness was experienced by 1 in 403 families overall; however, 1 in 141 African-American families experienced homelessness, compared to 1 in 990 white families.
Looking specifically at homelessness among white Americans, the National Coalition for the Homeless estimated in 2009 that 35 percent of the homeless population was white. It also noted, “people experiencing homelessness in rural areas are more likely to be white, female, married, currently working, homeless for the first time, and homeless for a shorter period of time.”
* * *
Drug addiction, suicide, obesity and homelessness are issues affecting all Americans; they are indicators of how people cope with crisis by taking it out on themselves. Each person who shoots up, kills him/herself, is obese or homeless is both an individual failing to cope effectively with deeply personal issues and a painful symptom of the spreading loss of faith in the social system.
White privilege is under attack, less by people of color, who are poorer and suffer more, than by the policies of the – mostly white – 1 percent. Racism plays a key – if unspoken – role in the repression of white people; it keeps them blind, in denial, to what causes their deepening immiseration.
Race is at the center of three critical issues that will likely play significant roles in the upcoming ’16 election – police killings of unarmed people of color, inequality and immigration. The nearly daily media reports of “police lynchings” make inescapable the police’s role in enforcing racist customs. The suburbanization of poverty, with inner-city people of color being pushed to the impoverished suburbs, reflects the out-of-sight, out-of-mind campaign now redrawing the urban landscape. The politicians and people who resist immigration reform today conveniently forget that their ancestors were once immigrants. What is more, many fear the loss of their white skin privileges as their population majority is projected to shrink to a minority later this century.
How will the tyranny of white skin privilege be broken?  How will the Republican ace-in-the-hole Southern strategy finally be ended?
According to the Southern Poverty Law Center, “Since 2000, the number of [white] hate groups has increased by 30 percent.” It estimates that since Pres. Obama’s 2008 election the number of such groups rose to 1,360 in 2012 from 813 in 2008; in 2014, there were 874 such groups.
One can only hope that as the Great Squeeze takes its toll on more and more white people, they will find common struggle with other Americans from all backgrounds rather than with the 1 percent who, like puppet masters, pull the strings on all-too-many politicians and pundits.

The Destruction of Jamaica’s Economy Through Austerity

Pete Dolack

A small country immiserates itself under orders of international lenders; unemployment and poverty rise, the debt burden increases and investment is starved in favor of paying interest on loans. If this sounds familiar, it is, but the country here is Jamaica.
So disastrous has austerity been for Jamaica that its per capita gross domestic product is lower than it was 20 years ago, the worst performance of any country in the Western Hemisphere. In just three years, from the end of 2011 to the end of 2014, real wages have fallen 17 percent and are expected to fall further in 2015, according to the country’s central bank, the Bank of Jamaica.
Such is the magic of austerity, or “structural adjustment programs,” to use the official euphemism of the International Monetary Fund and World Bank.
A new paper from the Center for Economic and Policy Research, “Partners in Austerity: Jamaica, the United States and the International Monetary Fund,” reports that the amount of money Jamaica will use to pay interest (not even the principal) on its debt will be more than four times what it will spend on capital expenditures in 2015 and 2016. And despite a new loan, the country actually paid more to the IMF than it received in disbursements from the IMF during 2014!
As a further sign of the times, the current pro-austerity government of Jamaica is led by the People’s National Party, the party of Michael Manley, the former democratic socialist prime minister. Manley took office in 1972 on promises to combat social inequality and injustice, and he is credited with enacting legislation intended to establish a national minimum wage, pay equality for women, maternity leave with pay, the right of workers to join trade unions, free education to the university level, and education reforms that enabled students and teachers to be represented on school boards.
He also became an international figure advocating for progressive programs to be implemented elsewhere. Naturally, this did not sit well with the United States government. When Prime Minister Manley stood with Angola against the invasion by the apartheid South African régime and supported Cuban assistance to Angola, he defied a warning from U.S. Secretary of State Henry Kissinger. The CIA presence in the Jamaican capital, Kingston, was doubled.
A Jamaica Observer commentary noted parallels between the overthrow of Salvador Allende in Chile and unrest in Jamaica later in the 1970s:
“The imperialists applied the same ‘successful’ Chile model of destabilisation in Jamaica. They applied the same strategy of ‘making the economy scream,’ creating artificial shortages of basic items, promoting violence, including the savage murder of 150 people in a home for the elderly. Violence erupted in Jamaica as was never seen before in the ‘shock and awe’ tactics mastered by the imperialists whenever they want to create fundamental change in someone else’s country. Manley and Jamaica yielded under the pressure and eventually took the IMF route.”
Replacing Human Development with Austerity
The conservative who took office in 1980 reversed Manley’s programs. By the time Manley returned to office in 1989, he had moved well to the right under the impact of changing world geopolitical circumstances and the dominance of neoliberal ideology. As an obituary in The Economist dryly put it, “He did as the IMF told him, liberalised foreign exchange and speeded up the privatisation of state enterprises.”
The one-size-fits-all program, a condition of IMF and World Bank loans, includes currency devaluation (making imports more expensive), mass privatization of state assets (usually done at fire-sale prices), cuts to wages and the prioritization of the profits of foreign capital over a country’s own welfare. The 2001 film Life and Debt, produced and directed by Stephanie Black, depicted a country on its knees thanks to “structural adjustment.” The film’s Web site sets up the picture this way:
“The port of Kingston is lined with high-security factories, made available to foreign garment companies at low rent. These factories are offered with the additional incentive of the foreign companies being allowed to bring in shiploads of material there tax-free, to have them sewn and assembled and then immediately transported out to foreign markets. Over 10,000 women currently work for foreign companies under sub-standard work conditions. The Jamaican government, in order to ensure the employment offered, has agreed to the stipulation that no unionization is permitted in the Free Trade Zones. Previously, when the women have spoken out and attempted to organize to improve their wages and working conditions, they have been fired and their names included on a blacklist ensuring that they never work again.”
The film shows the destruction of Jamaica’s banana industry and the decimation of its milk-production capacity because the country is forced to open itself to unrestricted penetration by multi-national capital, while those corporations continue to receive subsidies provided them by their home governments. The Life and Debt Web site reports:
“In 1992, liberalization policies demanded that the import taxes placed on imported milk solids from Western countries be eliminated and subsidies to the local industry removed. In 1993, one year after liberalization, millions of dollars of unpasteurized local milk had to be dumped, 700 cows were slaughtered pre-maturely and several dairy farmers closed down operations. At present, the industry has sized down nearly 60% and continues to decline. It is unlikely the dairy industry will ever revitalise its growth.”
Poverty and Unemployment Continue to Rise
Austerity continues its course today. The Center for Economic and Policy Research’s “Partners in Austerity” paper, written by Jake Johnston, notes that conditions in Jamaica are worsening — unemployment, at 14.3 percent as 2014 drew to a close, is higher than it was when the global economic crisis broke out in 2008, and the 2012 poverty rate of 20 percent (the latest year for which statistics are available) is double that of 2007.
Jamaica currently has a debt-to-GDP ratio of 140 percent, an unsustainable level that continues to rise. Yet as a condition of its latest IMF loan the country is required to maintain an unprecedented budget surplus of 7.5 percent of GDP. Thus the paper declares the country is undergoing the world’s most severe austerity because this surplus, the highest dictated to any country, must be extracted from working people on top of what is extracted for interest payments.
Jamaica has re-financed its debt twice in the past three years, and its latest IMF loan, agreed to in 2013, comes two years after previous loans were cut off because the government said it would pay promised wage increases to public-sector employees. The debt exchanges lowered the interest rates and extended the payment period, a combination that does not necessarily mean less interest will ultimately be paid out, since a lower rate paid over a longer period can still add up to more total interest. Without debt relief, there is no exit from this vicious circle. The “Partners in Austerity” paper says:
“Crippled with devastatingly high debt levels and anemic growth for years, Jamaica is certainly in need of financing. But it is also the case that, after billions of dollars of previous World Bank, [Inter-American Development Bank] and IMF loans, much of its debt is actually owed to the very same institutions that are now offering new loans.” [page 2]
Financing schemes, whatever negative consequences they might ultimately have for the debtor country, are lucrative for investment banks. For example, banks underwriting Argentine government bonds earned an estimated US$1 billion in fees between 1991 and 2001, profiting from public debt. Yet the foreign debt continued to grow. In one example during this period, a brief pause in Argentina’s payment schedule was granted in exchange for higher interest payments — Argentina’s debt increased under the deal, but the investment bank that arranged this restructuring, Credit Suisse First Boston, racked up a fee of US$100 million.
Less for Public Needs
As a result of the new austerity measures, Jamaican government spending on infrastructure has fallen to 2.6 percent of gross domestic product, as opposed to 4.2 percent as recently as 2009. Moreover, the government is required to siphon $4.4 billion over four years from its National Housing Trust to replenish government coffers drained to pay off the loans. The trust, a legacy of the Manley government, is mandated to provide affordable housing, and yet it is the same People’s National Party that is raiding it under IMF orders.
The country’s economic difficulties would be still more severe if it were not for aid from Venezuela and investments from China, according to “Partners in Austerity.” The paper reports:
“Venezuelan funding comes through the Petrocaribe agreement, where Jamaica receives oil from Venezuela, paying a portion up front and keeping the rest as a long-term loan. Jamaica pays a lower interest on the Petrocaribe funds than it does to its multilateral partners. According to the IMF, net disbursements through Petrocaribe totaled over $1 billion over the last three years, averaging 2.5 percent of GDP per year. … A significant portion of the Petrocaribe funds are being used to refinance domestic debt, in support of the IMF program. Additionally, a portion of funds takes the form of grants and is used for social development, bolstering support to the neediest who have been most impacted by continued austerity. … Without the Venezuelan and Chinese investments staving off recession, it’s likely the IMF program would fail due to serious public opposition.” [page 13]
It is possible to provide aid that actually assists development rather than as a cover for exploitation, as Venezuela demonstrates.
Why do disastrous “structural adjustment” programs continue to be foisted on countries around the world despite the results? Undoubtedly many who prescribe “structural adjustment” continue to believe in neoliberalism in the face of all evidence. But this ideology doesn’t fall out of the sky; it is an ideology in service of the biggest industrialists.

Social Control in Europe: Virtual Jobs

Bill Blunden

As the economy in Europe festers, the New York Times reports that the ranks of the Eurozone’s unemployed are finding solace in a curious parallel economy populated by thousands of counterfeit businesses known as “practice firms.” This alternate universe doesn’t actually produce tangible goods or services; rather, it offers people unpaid positions that foster a sense of routine, structure, and personal connection. And while participating in this bogus job market may offer some relief on a superficial level, the tendrils of social control are visible to those who know where to look.
Originally designed to offer job training in the aftermath of World War II, this massive commercial simulation is now being leveraged to address the problem of long term unemployment, which accounts for more than half of those who are currently out of a job in the EU. The basic idea is to keep sidelined people from feeling isolated and depressed by giving them a place where they can at least go through the motions of a normal job.
Such is the comfort of familiar patterns. If you can’t scrape together a living through low-paid temporary contract work in the real world, you can always keep up appearances and work for an employer that pretends to pay you while your stomach growls. At one point in the article the Times describes a scenario that borders on Orwellian doublethink as a woman asks her colleagues, “What’s our strategy to improve profitability?”
As Patricia Routledge exclaims: It’s Bouquet, dear! B-U-C-K-E-T!
Though advocates contend that this vast make-believe workplace fosters professionalism and confidence, it’s important to recognize that this strategy only treats symptoms. Most people don’t start asking hard questions until catastrophe strikes and the world stops making sense. Keeping the unemployed occupied with what’s essentially vocational hoop-jumping distracts them from pondering deeper topics and questioning more fundamental assumptions about the society they live in.
Barbara Ehrenreich, the author of Nickel and Dimed, accurately calls out fake job therapy as an exercise in denial:
“The first step, as in any 12-step program, is to overcome denial. Job searching is not a job; retraining is not a panacea. You may be poorer than you’ve ever been, but you are also freer — to express anger and urgency, to dream and create, to get together with others and conspire to build a better world.”
Celebrities like Oprah Winfrey preach a myopic gospel of self-improvement, a narrative which advocates personal change while almost entirely ignoring larger institutional issues. Plutocrats sanctimoniously bray that “my wealth is my virtue” in the wake of the 2008 collapse and other unprecedented massive transfers of wealth. They have the nerve to stigmatize the victims of the ensuing economic implosion for their own unemployment and then demand austerity as a remedy. Never mind the billionaires moving the goal posts in the background or the aging Greek man named Dimitris Christoulas who chose suicide over penury.
Faced with the threat of a political uprising the ruling class would prefer that the unemployed dutifully remain on the job treadmill, keep their nose to the grindstone, and stay with the program. Because in doing so workers offer tacit acquiescence to existing political, economic, and social arrangements. To do otherwise might give the unwashed masses a chance to organize and consider alternatives. For the moneyed gentry of the 0.1% that could be truly dangerous.

The Topless Dancer, Slavery and the Origins of Capitalism

Louis Proyect

Although I’ve written thirty-five articles about the origins of capitalism over the years, I never suspected that my first for CounterPunch would be prompted in a roundabout way by my relationship with a topless dancer forty years ago.
In the middle of May, I blogged an excerpt from an unpublished comic book memoir I did with Harvey Pekar in 2008. It covered my experience in Houston in the mid-seventies, part of which involved an affair with a comrade who had been dancing in Montrose just before I arrived, a neighborhood that mixed bohemia, gay and topless bars, and apartment complexes geared to swingers in double-knit suits.
About a week after the excerpt appeared, someone directed me to a Facebook page that belonged to a well-known ISO dissertation student who, having posted a link to my blog, frowned on the idea that I would write a memoir without ever having done anything. Since the memoir was written under the direction of Harvey Pekar, who toiled for decades in obscurity as a file clerk in a veterans’ hospital in Cleveland, I doubt that the student had a clue about the memoir’s intention. It was not a saga about exemplary deeds in the revolutionary movement but recounted instead the humdrum life of a rank-and-filer who felt deeply alienated by what amounted to a cult. Plus, lots of jokes. After all, it was a comic book, as Harvey insisted on calling his work.
Parenthetically I would advise against reading the blog of someone you hate. It is bad for your mental health. As a recommendation to the young dissertation student or anybody else with a grudge against me, let me paraphrase what Jeeves said to Bertie Wooster, substituting “Proyect” for “Nietzsche”: “You would not enjoy Nietzsche, sir. He is fundamentally unsound.”
As the Facebook feeding frenzy spilled across multiple timelines serving as an amen chorus to the dissertation student, word got back to me that I was being condemned by hundreds—maybe thousands—of lefties as a “misogynist”. This undoubtedly must have had something to do with my memoir describing how my girlfriend used to dance for me in the privacy of my shag-carpeted Montrose apartment. I am glad I left out the business about our Teletubbies doll fetish since that would certainly have led to me being excluded from polite leftist society forever.
Of course, this reminded me of the last go-round with ISO’ers over misogyny. Just over two years ago, their members were up in arms over Ruth Fowler writing an article in CounterPunch that amounted in their eyes to “guffawing over Angelina Jolie’s recent decision to undergo a preventative double mastectomy.” I suppose there is some connection to my situation since Ruth Fowler also worked as a stripper at one point in her life. If she had been trained in the ISO, Ruth would never have gone near a topless bar. Some might say that it was far better that she never went near the ISO, a den of iniquity if there ever was one—the offense being intellectual conformity rather than lewd and lascivious behavior.
Undoubtedly the primary motive for these Orwellian minutes of hate on Facebook was the critiques I have written of the ISO’s “Leninism” on CounterPunch and my blog as well. On top of that, the dissertation student must have hated my articles on the origins of capitalism that were an attempt to rebut the theories of Robert Brenner and his followers, who are organized informally as “Political Marxists”, a trend that is still very powerful in the left academy but now arguably falling out of fashion. Since the dissertation student was a fervent member in good standing of the Political Marxism tendency, you can easily imagine why he would want to encourage the idea that I was like Bill Cosby.
It all boils down to this. Because I had the impudence to challenge the ideas of a prestigious UCLA professor and Marxist celebrity about the origins of capitalism, the word had gotten out long ago that I was sticking my nose in where it did not belong. I was like a medical school dropout performing appendectomies, not smart enough to know that you needed proper credentials to write about matters that more properly belonged in those journals that Aaron Swartz was trying to liberate.
Sometimes referred to as the “transition debate”, the controversy began with a series of articles that grew out of Paul Sweezy’s Science and Society review of Maurice Dobb’s 1947 “Studies in the Development of Capitalism”. Dobb had set forth a new interpretation in his book: capitalism was essentially an internally generated system, brought on by profound changes in class relations in the British countryside during the waning Middle Ages. Sweezy argued that it was primarily a function of expanding trade based in the towns rather than the countryside. Neither man attempted to stigmatize the other as departing from Marxist principles; it was simply a difference of scholarly interpretation.
Despite the focus on the British countryside, Dobb admitted that mercantilism—the system we associate with colonialism and slavery—played a “highly important role in the adolescence of capitalist industry”. In doing so, he was obviously being mindful of how Marx characterized the genesis of the industrial capitalist in chapter thirty-one of Capital, Volume 1:
The discovery of gold and silver in America, the extirpation, enslavement and entombment in mines of the aboriginal population, the beginning of the conquest and looting of the East Indies, the turning of Africa into a warren for the commercial hunting of black-skins, signalised the rosy dawn of the era of capitalist production.
You can also find Karl Marx stressing the importance of mercantilism in chapter twenty of Capital, Volume 3:
There is no doubt — and it is precisely this fact which has led to wholly erroneous conceptions — that in the 16th and 17th centuries the great revolutions, which took place in commerce with the geographical discoveries and speeded the development of merchant’s capital, constitute one of the principal elements in furthering the transition from feudal to capitalist mode of production. [emphasis added] The sudden expansion of the world-market, the multiplication of circulating commodities, the competitive zeal of the European nations to possess themselves of the products of Asia and the treasures of America, and the colonial system — all contributed materially toward destroying the feudal fetters on production.
None of this mattered to Robert Brenner when he began writing articles in the 1970s making the case that the transition to capitalism took place only in the British countryside. Everywhere else, farming remained a small-scale, self-sustaining enterprise; but due to what amounted to an accident of history, farming in England took on a more competitive character as long-term leases were extended to a new class of profit-seeking entrepreneurs controlling vast estates.
Once it gained a foothold there, the system matured and then diffused to the rest of the world. Given his scholarly credentials, one might have expected Brenner to explain where Marx went wrong on the question of colonialism and slavery. It is entirely possible that Brenner is right and Marx was wrong, but in the absence of a clear accounting for the differences, one cannot help but wonder if they were simply being swept under the rug.
In the July-August 1977 issue of New Left Review, Brenner made a huge splash with an erudite restatement of his scholarly views on the origins of capitalism. But unlike Maurice Dobb, the thirty-three-year-old professor had no problem placing Paul Sweezy beyond the pale of Marxism, likening him to Adam Smith.
It seemed that by emphasizing the role of slavery and colonialism, the theorists grouped around Sweezy’s Monthly Review were accommodating themselves to the “national bourgeoisie” and fostering “a false strategy for anti-capitalist revolution.” There was a danger that “third-worldist ideology” might creep into the Marxist movement and tempt socialists away from the class struggle into worshipping false idols like Sukarno or Nkrumah.
Brenner’s difference with Marx, whether or not he ever understood it, can be reduced in the final analysis to his tendency to conflate slavery and serfdom in the general category of coerced labor:
Only where labour has been separated from possession of the means of production, and where labourers have been emancipated from any direct relation of domination (such as slavery or serfdom), are both capital and labour power ‘free’ to make possible their combination at the highest possible level of technology. Only where they are free, will such combination appear feasible and desirable. Only where they are free, will such combination be necessitated.
While Marx never wrote at great length about slavery as a mode of production, he never equated it with the feudal system that prevailed in Europe prior to the rise of capitalism. Serfdom was a form of exploitation geared to producing the food and clothing that sustained an aristocrat’s soldiers and to providing obligatory labor such as building roads or clearing brush on his manor. Slavery, on the other hand, was designed to supply commodities to the world market, especially where free wage labor was not available, whether on a Caribbean island or in the Mississippi Delta. Marx put it this way in a letter to Pavel Vasilyevich Annenkov in 1847:
Direct slavery is as much the pivot upon which our present-day industrialism turns as are machinery, credit, etc. Without slavery there would be no cotton, without cotton there would be no modern industry. It is slavery that has given value to the colonies, it is the colonies that have created world trade, and world trade is the necessary condition for large-scale machine industry. Consequently, prior to the slave trade, the colonies sent very few products to the Old World, and did not noticeably change the face of the world. Slavery is therefore an economic category of paramount importance.
In the recent past, one significant scholarly work after another has expounded on Marx’s cursory observation that “Without slavery there would be no cotton, without cotton there would be no modern industry.” Walter Johnson’s “River of Dark Dreams: Slavery and Empire in the Cotton Kingdom” draws upon slave narratives, popular literature, legal records and personal correspondence to demonstrate how tied the South was to global capitalist markets, rather than existing as some kind of feudal backwater. Edward Baptist’s “The Half Has Never Been Told: Slavery and the Making of American Capitalism”, as the title implies, makes the case that slavery was the “pivot” of industrialism, as Marx put it. Finally, and most essential for the cotton, slavery and capitalism connection, there is Sven Beckert’s “Empire of Cotton: A Global History”.
In a useful summary of the new literature, Beckert wrote an article titled “Slavery and Capitalism” for the December 12, 2014 edition of the Chronicle of Higher Education, appropriately illustrated with a graphic of a slave’s chains connected to a one-hundred-dollar bill. Beckert wrote:
For too long, many historians saw no problem in the opposition between capitalism and slavery. They depicted the history of American capitalism without slavery, and slavery as quintessentially noncapitalist. Instead of analyzing it as the modern institution that it was, they described it as premodern: cruel, but marginal to the larger history of capitalist modernity, an unproductive system that retarded economic growth, an artifact of an earlier world.
This would obviously include Robert Brenner and his followers.
In the very next paragraph, Beckert refers to the original theorists who made the connection between slavery and capitalism: “Some scholars have always disagreed with such accounts. In the 1930s and 1940s, C.L.R. James and Eric Williams argued for the centrality of slavery to capitalism, though their findings were largely ignored.”
But I certainly did not ignore them when I joined the Trotskyist movement in 1967. Among the first books I read about African-American history was Eric Williams’s “Capitalism and Slavery”, which can thankfully be read on the Internet. The book was based on his Oxford dissertation. Williams met with James, his former tutor, on numerous occasions when both were living in England. It seems that James read both drafts of the dissertation and had a significant role in formulating the book’s primary thesis, namely that the sugar plantations, the rum trade, and the slave trade helped to catapult Great Britain into world domination at the expense of the African peoples of the Diaspora. Without the underdevelopment of Jamaica, Trinidad, etc., capitalist development in Great Britain would not have had the supercharged character that it did.
Can one conjecture about why the tide is turning against Robert Brenner’s Political Marxism group? In broad-brush strokes, you might see it as a necessary correction, one that recognizes the contribution made to modern civilization by slave labor. When Brenner wrote his NLR article in 1977, it was an attempt to write off “Third World” Marxism as some sort of deviation from class-based politics. The student movement was receding, and the youthful enthusiasm over Che Guevara or Ho Chi Minh now seemed misplaced.
Nearly forty years later, the Third World continues to be an inexhaustible source of revolutionary energy. Given the importance of Hugo Chavez’s Bolivarian Revolution, it might be useful to reflect on the legacy of the man who inspired it, Simón Bolivar.
Simón Bolivar was staunchly anti-slavery, liquidating his plantation and freeing his slaves early in his political career. He visited newly liberated Haiti in 1815 to get help for his revolution. Alexandre Pétion, Haiti’s president, furnished 4,000 muskets, 15,000 pounds of powder, flints, lead and a printing press in return for a pledge to free South America’s slaves, something that Bolivar was happy to do.
In 1939, C.L.R. James wrote an article titled “Revolution and the Negro” that referred to this act of solidarity:
Menaced during its whole existence by imperialism, European and American, the Haitians have never been able to overcome the bitter heritage of their past. Yet that revolution of a half million not only helped to protect the French Revolution but initiated great revolutions in its own right. When the Latin American revolutionaries saw that half a million slaves could fight and win, they recognised the reality of their own desire for independence. Bolivar, broken and ill, went to Haiti. The Haitians nursed him back to health, gave him money and arms with which he sailed to the mainland. He was defeated, went back to Haiti, was once more welcomed and assisted. And it was from Haiti that he sailed to start on the final campaign, which ended in the independence of the five states.
I first learned about the Brenner thesis from James M. Blaut, the author of more than a hundred scholarly articles, who had joined the Marxism list to spread the word about his two books, “The Colonizer’s Model of the World” and “Eight Eurocentric Historians”, the latter of which included a chapter on Robert Brenner. Blaut, who was a supporter of the Puerto Rican Socialist Party (his wife Meca was a key leader) and who described getting arrested at a Vietnam War protest as his greatest accomplishment, was no elitist when it came to debating the merits of his books. He saw the Internet as a democratic and freewheeling medium that was essential for clarifying and spreading socialist ideas. An article commemorating Blaut in Antipode, a journal to which he had contributed numerous articles, stated that his work was almost certainly inspired by C.L.R. James and Eric Williams.
Pancreatic cancer took his life before he was able to complete the third installment of a trilogy on Eurocentrism and historiography. The final work, which was in progress, advanced a way of writing history that would give people like Simón Bolivar their proper due. It would also acknowledge the importance of slavery in American development, an importance reflected in works like Craig Wilder’s “Ebony and Ivy”, which documents how some of the most prestigious American universities were funded by the proceeds of the slave trade or the plantation system.
It is too bad that Blaut did not live long enough to see Beckert, Baptist and Johnson make the case that he himself pressed until illness made it impossible for him to write another word, either in print or on the Internet. He would have been pleased to know that such scholars are, by their example, effectively completing the third volume of his trilogy.