14 Sept 2016

US has spent nearly $5 trillion on wars since 9/11

Bill Van Auken

In another indication of the terrible price paid by working people in the United States and all over the globe for the crimes of US imperialism, a new report from Brown University estimates that Washington has squandered nearly $5 trillion since September 11, 2001 on the wars launched under the pretext of fighting terrorism.
The report coincides with the 15th anniversary of 9/11, with 10,000 US troops still in Afghanistan, 15 years after the US invasion of that country, and an estimated 6,000 in Iraq. Hundreds more special operations forces have been deployed to Syria, where the US is fighting for regime change in a de facto alliance with that country’s affiliates of Al Qaeda—which was supposedly the principal target of the last decade and a half of war.
While the financial costs of these wars are staggering, bordering on the unfathomable, the author of the report, Boston University professor Neta Crawford, correctly places them in the far broader, and more horrifying, context of the trail of blood and destruction that US military operations have left in their wake:
“...a full accounting of any war’s burdens cannot be placed in columns on a ledger. From the civilians harmed or displaced by violence, to the soldiers killed and wounded, to the children who play years later on roads and fields sown with improvised explosive devices and cluster bombs, no set of numbers can convey the human toll of the wars in Iraq and Afghanistan, or how they have spilled into the neighboring states of Syria and Pakistan, and come home to the US and its allies in the form of wounded veterans and contractors.”
Some of these numbers are also quantifiable, and appalling, from the over one million Iraqi lives lost to the US invasion of 2003 to the more than 12 million refugees driven from just the four countries laid waste by US wars: Afghanistan, Iraq, Pakistan and Syria. In addition, there are the nearly 7,000 US troops killed in Iraq and Afghanistan, along with roughly an equal number of private contractors, as well as the 52,000 officially listed as wounded in combat and the untold hundreds of thousands more suffering from traumatic brain injuries, PTSD (post-traumatic stress disorder) and other mental health problems resulting from multiple deployments in dirty colonial-style wars.
Nonetheless, the report argues persuasively that it is also vital to make a serious and comprehensive evaluation of the real financial costs of these wars.
The overall cost of US imperialism’s wars includes the $1.7 trillion directly appropriated by Congress to wage them as so-called Overseas Contingency Operations (OCO). This is above and beyond the Pentagon’s base budget, which totals some $6.8 trillion from FY2001-2016.
By defining these wars as OCOs, Congress, together with both the Bush and Obama administrations, has acted as if they were some kind of unforeseeable emergencies that could not be planned for within the government’s normal budgetary process, even as they dragged on for a decade and a half. As a result, the wars were freed from any kind of normal fiscal accountability, with no taxes or other revenues allotted to pay for them.
In addition to this direct war funding, the report includes the costs of veterans’ medical and disability care, allocations for Homeland Security, interest on Pentagon war appropriations and future costs for veterans’ care.
This last cost is estimated to amount to at least $1 trillion between now and 2053. The basis for such an estimate is made clear by some alarming statistics.
By the end of 2015, more than 1,600 soldiers who fought in Iraq and Afghanistan had undergone major limb amputations as a result of wounds suffered in combat. A total of 327,000 veterans of these wars had been diagnosed with Traumatic Brain Injury as of 2014 and by the same year fully 700,000 out of the 2.7 million people deployed to the war zones had been classified as 30 percent or more disabled.
The report points out that Veterans Affairs is the fastest-growing department in the US government, with its staffing levels having nearly doubled since 2001 to 350,000 workers. Yet, according to another recent report, it “still lacks sufficient funding to fill thousands of vacancies for doctors and nurses and to finance badly needed repairs to its hospitals and clinics.”
In addition to these costs, the report estimates that, unless Congress changes the way that it is paying for the wars, even without their continuation, cumulative interest on war appropriations made just through FY2013 will amount to a staggering $7.9 trillion by 2053.
The report recalls that, as the Bush administration was preparing to launch the war of aggression against Iraq, the administration’s chief economic adviser Lawrence Lindsey came under intense fire for estimating that the “upper bound” cost of the war could reach between $100 billion and $200 billion. This estimate was roundly rejected by everyone from Defense Secretary Donald Rumsfeld to House Democrats, who put the figure at roughly $50 billion, a figure that, it is now clear, underestimated the real cost by a factor of 100.
Reflected in these wars, both in the criminality with which they were initiated and fought, and in the way they were funded, are the financial parasitism and socially destructive forms of speculation that pervade the workings of American capitalism as a whole.
By keeping the wars’ costs “off the books” and relying on an “all-volunteer” military to fight them, the US ruling class also hoped to dampen the popular hostility to militarism.
The new report does not attempt to estimate the wars’ broader impact on the economy and the living standards of broad masses of American working people. Another report issued two years ago by Harvard University conservatively estimated that the cost of the Iraq and Afghanistan wars amounted to $75,000 for every American household.
The report points to previous studies indicating that the wars cost tens of thousands of jobs and significantly reduced investment in infrastructure. The vast amount of resources diverted into slaughter and destruction in the Middle East and Central Asia could have funded the $3.32 trillion that the American Society of Civil Engineers (ASCE) says must be spent over the next decade to fix America’s crumbling ports, highways, bridges, trains, water and electric facilities, and paid off the entire $1.26 trillion in student debt, with money left over.
Instead, the elected officials of both major capitalist parties have continuously insisted that there is no money for jobs, decent wages, education, health care and other basic necessities, while spending unlimited money on militarism and war, leaving the bill to be paid for through the intensification of austerity measures directed against the working class.
The human and fiscal toll wrought by the wars of the last 15 years is only a foretaste of the global catastrophe that is threatened as US imperialism prepares for far larger wars, with its military escalation focused ever more directly against the world’s second and third largest nuclear powers, Russia and China.

The witch-hunt against Chinese influence in Australia

Peter Symonds

The eruption of a concerted anti-Chinese campaign in the Australian media and political establishment, particularly over the past fortnight, directed at vilifying anyone not fully supportive of Washington’s confrontational “pivot” against Beijing, is a warning to workers, not only in Australia but internationally, of the advanced character of the US war drive in the Asia-Pacific.
What began with the “exposure” of small payments by a Chinese businessman to opposition Labor Party frontbencher Sam Dastyari escalated last week into a full-blown witch-hunt against any politician, business figure or organisation construed as questioning full support for the US military alliance or the increasingly shrill denunciations of Chinese “expansionism,” especially in the South China Sea.
The most explicit diatribe was written by the Sydney Morning Herald’s international editor, Peter Hartcher, who called for a “Four Pests Campaign,” akin to that conducted by Mao Zedong, to eradicate “rats, flies, mosquitoes and sparrows” and “defend against agents of foreign influence.” In his sweeping vilification, Hartcher targeted “rats” such as Dastyari, “flies” or supposed unwitting dupes of Beijing, including former Labor Foreign Minister Bob Carr, and “mosquitoes” or businessmen allegedly beholden to China, such as billionaires Kerry Stokes and James Packer.
Hartcher’s definition of “sparrows,” or purveyors of Chinese influence, not only encompassed Chinese-Australian organisations but placed a question mark over anyone of Chinese background—that is, the half million Australian residents born in China and 150,000 Chinese students in Australian universities, not to mention many more of Chinese descent. Such anti-Chinese hysteria serves to pave the way for extensive police raids and arrests in the event of war with China, as took place with the mass internment in Australia of “enemy aliens” during the two world wars.
Those named by Hartcher are by no means hostile to the US and the American alliance but rather are concerned that a confrontational stance toward China will damage relations with the country, which is Australia’s top trading partner and a source of significant investment.
The most sinister insinuation appeared in the Australian Financial Review last week, implying that Prime Minister Malcolm Turnbull, because of his business interests in China, was not trusted by Australian intelligence agencies. When President Barack Obama announced the “pivot” in the Australian parliament in 2011, Turnbull was somewhat critical of it and suggested an accommodation with China. Since becoming prime minister last September, he has verbally toed the US line on the South China Sea but has not authorised a so-called freedom of navigation operation (FONOP) within the 12-nautical-mile territorial limits around Chinese islets.
Pressure from Washington for Canberra to conduct a provocative FONOP in the South China Sea has intensified following the July 2 federal election that Turnbull’s Liberal-National Coalition won by the slenderest of margins. In its immediate aftermath, US Vice President Joe Biden visited Australia and stressed Washington’s determination to remain the dominant Pacific power.
In a pointed warning to any equivocators, Biden stressed: “If I had to bet on which country is going to lead economically in the 21st century... I’d bet on the United States. But I’d put it another way: It’s never a good bet to bet against the United States.” A long line of visiting US admirals and generals had already made it publicly clear that Washington expected Turnbull to send an Australian warship to challenge Chinese claims in the South China Sea.
Biden’s visit was the signal for the anti-China campaign now underway to create the poisonous climate for a FONOP provocation that has the potential to trigger Chinese retaliation and escalate into open conflict. A spate of lurid and unsubstantiated stories has appeared about Chinese hacking, the dangers to “national security” of Chinese investment and the network of Chinese influence in Australia. Even though a contrite Dastyari admitted his “error” and resigned as a Labor spokesman, government ministers on Monday lined up during question time to denounce “Shanghai Sam.”
“Chinese influence” pales into insignificance compared to the influence built up since World War II and wielded by the US throughout the Australian political establishment, the media, various think tanks, such as the US Studies Centre, and the state apparatus, especially the military and intelligence agencies. In 2010, a handful of Labor and union powerbrokers, later revealed as “protected sources” of the American embassy in Canberra, orchestrated an overnight inner party coup to oust Kevin Rudd as prime minister after he alienated the Obama administration by advocating a US accommodation to China.
It is this same pro-US apparatus that has sprung into action to brand as a Chinese fifth column those sections of the political and corporate elite who urge caution in becoming enmeshed in the American military build-up against China. The campaign, which also encourages anti-Chinese xenophobia, dovetails with the needs of the ruling class to divert outward, against a “foreign enemy,” the huge social tensions generated by the widening gulf between rich and poor.
Whereas in the United States and Europe it is currently Russia that is the target for vilification, in Australia it is China, and for definite reasons. The ramping up of anti-China propaganda in Australia takes place amid mounting concerns in Washington that Obama’s “pivot,” aimed at subordinating China to American interests, has stalled. On the economic front, the Trans-Pacific Partnership (TPP), which seeks to undermine Chinese influence in Asia, is unlikely to be ratified by the US Congress. Diplomatically, Obama was unable to press the recent East Asian Summit in Laos to confront China over the South China Sea.
As Biden declared during his visit, the US has no intention of relinquishing its position as the dominant Pacific power. It is now looking to Australia to step in and militarily challenge Chinese maritime claims in the South China Sea. Such a move would dramatically ratchet up tensions in the region, provide Washington with much-needed international support and, should Beijing react militarily, provide the pretext for the US to intervene more forcefully. Australian imperialism, which relies on US backing to prosecute its own interests in Asia and internationally, has provided military forces and political support to the US in virtually every predatory conflict since World War II—from the Korean and Vietnam wars to the latest US-led operations in the Middle East.
The ferocity with which longstanding political figures and wealthy businessmen are being vilified is a sign of the extreme geo-political tensions embroiling the entire region as the US and its allies prepare for war with China. It is also a measure of the fear in ruling circles of the widespread anti-war sentiment among workers and youth that will be unleashed as the war danger becomes more imminent.
The Australian working class, like its counterparts in China, Japan, the United States and the rest of the world, has no interest in a conflict between nuclear-armed powers. Its answer to the threat of war lies in the political fight being waged by the Socialist Equality Party (Australia) and its sister parties of the International Committee of the Fourth International to build an international anti-war movement to put an end to capitalism and its outmoded nation-state system, which is the root cause of war.

13 Sept 2016

University of Newcastle Commonwealth Government Scholarships 2017/2018 for Masters & PhD – Australia

Application Deadline: 15 September 2016 (round 2); 31 January 2017 (round 1, 2017)
Offered annually? Yes
Brief description: The University of Newcastle, Australia is offering university and Commonwealth Government scholarships for Research Masters and PhD degrees to international students.
Eligible Field of Study: courses offered at the university
About Scholarship: Full-time research students are eligible to apply for University and Commonwealth Government scholarships, which are awarded in one main offer round. They provide a living allowance so you can commit to full-time study. Historically, over 90% of international students hold a scholarship from UON or a sponsor.
Scholarship Offered Since: Not specified
Scholarship Type: Research Masters and PhD Scholarship
Selection Criteria: Scholarships are granted on the basis of academic merit, which includes your undergraduate grade point average and extra research attainments.
Eligibility
Eligibility criteria are set by the Commonwealth Government and candidates must:
  • have a current offer of admission into a research higher degree.
  • have completed at least four years of undergraduate study and have attained Honours Class 1 or equivalent and a high grade point average (GPA)
  • be no more than two full-time equivalent years into their PhD (or one year for Masters) at the end of the year.
Successful international scholarship candidates usually also have:
  • have satisfied the English proficiency requirement (IELTS of at least 6.5)
  • hold a master’s degree with a strong research component
  • have international peer-reviewed research publications or research experience
Scholarship conditions
  • be enrolled full-time (part-time enrolment may be approved in exceptional circumstances; part-time scholarships are taxable)
  • be enrolled on-campus (off-campus enrolment may be approved in exceptional circumstances)
  • work no more than 8 hours per week, between Monday and Friday, 9am to 5pm
  • take up the scholarship offer in the year of the offer; offers cannot be deferred to the following year. A leave of absence may be taken after one year of full-time enrolment.
Number of Scholarships: Several
Value of Scholarship: A scholarship funded by the University of Newcastle or the Commonwealth Government provides:
  • a living allowance of $26,288 per annum (2016 rate, indexed annually)
The scholarship may also include:
  • a relocation allowance (up to $2,020)
  • a thesis allowance (up to $500)
  • a full tuition fee scholarship (international students)
  • overseas student health cover (OSHC) (international students)
Duration of Scholarship: PhD scholarships are for three years and Masters scholarships are for two years, less any tenure already completed towards a research degree.
Eligible Countries: International students
To be taken at (country): The University of Newcastle, Australia
How to Apply
  • Currently enrolled candidates can apply for a scholarship by contacting researchscholarships[@]newcastle.edu.au
  • New applicants can apply for a scholarship at the same time as applying for admission.
Visit scholarship webpage for details
Your application for admission must include:
  • Copies of all academic transcripts
  • A research proposal
  • Evidence of extra academic attainments e.g. publications
  • If required, certified evidence of meeting the English proficiency requirement
What happens once you submit your application?
  • Applications are assessed and ranked according to merit
  • Attendance at an interview is not normally required
  • Scholarship outcomes are known from mid-December each year
Sponsors: The University of Newcastle and Commonwealth Government
Important Notes: Ensure you attach the following documents to support your scholarship application:
  • Copies of research publications, exhibitions or conference papers
  • Curriculum Vitae
  • Details of previous research experience e.g. research work experience / study
  • Any additional documents that may add to your scholarship application e.g. evidence of the award of a University Medal

Schlumberger Foundation Fellowship for Women from Developing countries 2017/2018

Application Deadline: 18 November 2016 for the 2017 Fellowships (the deadline for reference letters is 25 November 2016).
Offered annually? Yes
Accepted Subject Areas: Physical sciences and related disciplines
About Fellowship: Each year, Faculty for the Future fellowships, awarded by the Schlumberger Foundation, go to women from developing and emerging economies who are preparing for PhD or post-doctoral study in the physical sciences and related disciplines at top universities for their disciplines abroad. Grant recipients are selected as much for their leadership capabilities as for their scientific talents, and are expected to return to their home countries to continue their academic careers and inspire other young women.
Launched by the Schlumberger Foundation in 2004, the Faculty for the Future community now stands at 257 women from 62 countries, and grows steadily each year.
Offered Since: 2004
Type: PhD and Post-doctoral Fellowship
Selection Criteria: A successful application will have gone through four selection rounds, with the reviewers paying particular attention to the following criteria:
  • Academic performance;
  • Quality of references;
  • Quality of host country university;
  • Level of commitment to return to home country;
  • Commitment to teaching;
  • Relevance of research to home country;
  • Commitment to inspiring young women into the sciences.
Eligibility: Applicants must meet all the following criteria:
  • Be a woman;
  • Be a citizen of a developing country;
  • Wish to pursue a PhD degree or Post-doctoral research in the physical sciences or related disciplines;
  • Have applied to, have been admitted to, or are currently enrolled in a university abroad;
  • Wish to return to their home country to continue their academic career upon completion of their studies;
  • Be very committed to teaching and demonstrate active participation in faculty life and outreach work to encourage young women into the sciences;
  • Hold an excellent academic record.
Number of fellowships: Several
Value of Award: Faculty for the Future grants are awarded based on the actual costs of studying and living in the chosen location, and are worth USD 50,000 for PhD study and USD 40,000 for post-doctoral study. Grants may be renewed through to completion of studies, subject to performance, self-evaluation and recommendations from supervisors.
Eligible Countries: Developing Countries and Emerging Economies
To be taken at: Top universities abroad
How to Apply
Sponsors: The Schlumberger Foundation Faculty for the Future
Important Notes: Final selection is based in part on the standard of your application and accompanying materials.
Your application should highlight aspects of you and your career that will give the reviewer a focused yet well-rounded view of your candidature. Read and follow the instructions from the link below carefully; they are your guide to producing a comprehensive and competitive application.

Isn’t It Time to Ban the Bomb?

Lawrence Wittner

Although the mass media failed to report it, a landmark event occurred recently in connection with resolving the long-discussed problem of what to do about nuclear weapons. On August 19, 2016, a UN committee, the innocuously-named Open-Ended Working Group, voted to recommend to the UN General Assembly that it mandate the opening of negotiations in 2017 on a treaty to ban them.
For most people, this recommendation makes a lot of sense. Nuclear weapons are the most destructive devices ever created. If they are used―as two of them were used in 1945 to annihilate the populations of Hiroshima and Nagasaki―the more than 15,000 nuclear weapons currently in existence would destroy the world. Given their enormous blast, fire, and radioactivity, their explosion would bring an end to virtually all life on earth. The few human survivors would be left to wander, slowly and painfully, in a charred, radioactive wasteland. Even the explosion of a small number of nuclear weapons through war, terrorism, or accident would constitute a catastrophe of unprecedented magnitude.
Every President of the United States since 1945, from Harry Truman to Barack Obama, has warned the world of the horrors of nuclear war. Even Ronald Reagan―perhaps the most military-minded among them―declared again and again: “A nuclear war cannot be won and must never be fought.”
Fortunately, there is no technical problem in disposing of nuclear weapons. Through negotiated treaties and unilateral action, nuclear disarmament, with verification, has already taken place quite successfully, eliminating roughly 55,000 nuclear weapons of the 70,000 in existence at the height of the Cold War.
Also, the world’s other agents of mass destruction, biological and chemical weapons, have already been banned by international agreements.
Naturally, then, most people think that creating a nuclear weapons-free world is a good idea. A 2008 poll in 21 nations around the globe found that 76 percent of respondents favored an international agreement for the elimination of all nuclear weapons and only 16 percent opposed it. This included 77 percent of the respondents in the United States.
But government officials from the nine nuclear-armed nations are inclined to view nuclear weapons―or at least their nuclear weapons―quite differently. For centuries, competing nations have leaned heavily upon military might to secure what they consider their “national interests.” Not surprisingly, then, national leaders have gravitated toward developing powerful military forces, armed with the most powerful weaponry. The fact that, with the advent of nuclear weapons, this traditional behavior has become counter-productive has only begun to penetrate their consciousness, usually helped along on such occasions by massive public pressure.
Consequently, officials of the superpowers and assorted wannabes, while paying lip service to nuclear disarmament, continue to regard it as a risky project. They are much more comfortable with maintaining nuclear arsenals and preparing for nuclear war. Thus, by signing the nuclear Non-proliferation Treaty of 1968, officials from the nuclear powers pledged to “pursue negotiations in good faith on . . . a treaty on general and complete disarmament under strict and effective international control.” And today, nearly a half-century later, they have yet to begin negotiations on such a treaty. Instead, they are currently launching yet another round in the nuclear arms race. The U.S. government alone is planning to spend $1 trillion over the next 30 years to refurbish its entire nuclear weapons production complex, as well as to build new air-, sea-, and ground-launched nuclear weapons.
Of course, this enormous expenditure―plus the ongoing danger of nuclear disaster―could provide statesmen with a powerful incentive to end 71 years of playing with their doomsday weapons and, instead, get down to the business of finally ending the grim prospect of nuclear annihilation. In short, they could follow the lead of the UN committee and actually negotiate a ban on nuclear weapons as the first step toward abolishing them.
But, to judge from what happened in the UN Open-Ended Working Group, a negotiated nuclear weapons ban is not likely to occur. Uneasy about what might emerge from the committee’s deliberations, the nuclear powers pointedly boycotted them. Moreover, the final vote in that committee on pursuing negotiations for a ban was 68 in favor and 22 opposed, with 13 abstentions. The strong majority in favor of negotiations was comprised of African, Latin American, Caribbean, Southeast Asian, and Pacific nations, with several European nations joining them. The minority came primarily from nations under the nuclear umbrellas of the superpowers. Consequently, the same split seems likely to occur in the UN General Assembly, where the nuclear powers will do everything possible to head off UN action.
Overall, then, there is a growing division between the nuclear powers and their dependent allies, on the one hand, and, on the other, a larger group of nations fed up with the repeated evasions of the nuclear powers in dealing with the nuclear disaster that threatens to engulf the world. In this contest, the nuclear powers have the advantage, for, when all is said and done, they have the option of clinging to their nuclear weapons, even if that means ignoring a treaty adopted by a clear majority of nations around the world. Only an unusually firm stand by the non-nuclear nations, coupled with an uprising by an aroused public, seems likely to awaken the officials of the nuclear powers from their long sleepwalk toward catastrophe.

Controlling Africa With Western “Democracy”

Thomas C. Mountain


The west uses “democracy”, as in elections, to control Africa. This has resulted in over half a century of murder and mayhem, because all but one African country is a mixture of different ethnicities and nationalities, with tribalism dominant in African societies.
Elections mean tribal winners and tribal losers, with no in-between and no consensus-based governance. This has been a recipe for disaster, as in tribal conflict, since the end of direct western colonial rule and the imposition of neocolonialism after WW2.
It’s called “divide and rule”, and towards this end western-style “democracy” has been a big hit in Africa as far as Pax Americana and its western minions are concerned.
In more recent times “crisis management” has been the de facto policy: create a crisis (tribal conflict in the run-up to an election) and then manage the crisis to better rape and pillage African resources. As long as the “native tribes” are killing each other, there is less chance of any sort of “united front against Imperialism”, that is, of independent, nationalist African governments demanding a fair share of Africa’s wealth.
Western “democracy” is all about controlling Africa for western benefit, something crucial to the ability of western leaders to maintain the support of their populations by delivering the goods, economically speaking, in the form of higher standards of living. It is African blood that pays for the rich lifestyles of the western populace, and African blood that pays for social peace in western societies.
The western elites don’t care whether it is buy, rig or steal when it comes to African elections; as long as “my bastards” win, everything is copacetic. Election-instigated tribal violence is so standard that when an election is fixed without an outbreak of murder and mayhem it becomes a cause of wonderment in the western media. The very first thing Pax Americana and its vassals demand after any African crisis is “elections”. And sure enough, another crisis is brewing.
Western “democracy” is really American “democracy”, for the system used in Europe, and most of the rest of the world, originated in the United States of America. An amazing job of brainwashing has been done to convince both Americans and their acolytes internationally that American democracy was ever something progressive.
The historical fact is that the war for independence by the British colonies in North America was a counterrevolution for the purpose of preserving slavery. That’s right, Washington, Jefferson et al were fighting for independence for the British colonies to preserve slavery, the most criminal, inhumane and barbaric crime against humanity the world has ever known, the enslavement of Africans in subhuman bondage.
Thanks to cutting-edge historians such as Dr. Gerald Horne, amongst others, there is indisputable historical evidence to convict the founding fathers of the USA of waging war to preserve slavery, which, with the help of the slave-owning French aristocracy, they succeeded in doing for almost another century in the USA.
Britain had outlawed slavery in the British Isles, and Washington, Jefferson et al saw the handwriting on the wall: their way of life, based on the barbaric exploitation and degradation of Africans, could only be preserved by independence from Britain. So they formed the Continental Congress and carried out their counterrevolution with the goal of preserving their barbarism, for the system of slavery they enforced so viciously can hardly be considered civilization.
The historical record of the form of barbarism practiced by the slave-owning leadership of the newly independent USA is most powerfully exemplified by what is probably the only reliable first-hand account of how that “Founder of American Democracy”, Thomas Jefferson, treated “his” Africans.
“After dinner the master [Jefferson] and I went to see the slaves plant peas. Their bodies dirty brown rather than black, their dirty rags, their miserable, hideous half-nakedness, these haggard figures, this secretive anxious air, the hateful timorous looks, altogether seized me with an initial sentiment of terror and sadness that I ought to hide my face from. Their indolence in turning up the ground with the hoe was extreme. The master [Jefferson] took a whip to frighten them, and soon ensued a comic scene. Placed in the middle of the gang, he menaced, and turned far and wide [on all sides] turning around. Now, as he turned his face, one by one, the blacks changed attitude; those whom he looked at directly worked best, those whom he half saw worked least, and those he didn’t see at all, ceased working altogether; and if he made an about-face, the hoe was raised to view, but otherwise slept behind his back”. (Thanks to “The Many Headed Hydra…” for the previous quotation)
This first-hand account is from a founding member of the French “Society for Friendship with Blacks”, the first French antislavery organization. His name was Constantine Volney, and he was the author of the African-centered classic historical work “Ruins; Or, Meditations on the Revolutions of Empires”, published in 1791. It is a fascinating account of his visit to Africa’s Nile Valley as a part of Napoleon’s scientific team before the last major desecrations began.
Being an honest, antiracist historian, Volney believed, based on what he saw with his own eyes in the Egyptian tombs and temples, that civilization began in Africa, on the banks of the Nile River.
In his own words: “It was there that a people, since forgotten, discovered the elements of science and art, at a time when all other men were barbarous, and that a race, now regarded as the refuse of society, because their hair is wooly and their skin is dark, explored among the phenomena of nature, those civil and religious systems which have since held mankind in awe”.
“Ruins” was one of the most widely read historical texts of the late 18th and early 19th century. It was published in 6 languages in over 15 editions.
Volney was eventually driven from the USA by the forerunner of the Undesirable Aliens Act, passed by a slave-owner Congress still having difficulty achieving a good night’s sleep, haunted by dreams of the revolution in Haiti and the slaughter of their fellow slave owners by their erstwhile captives, Toussaint and his fellow Africans.
Thomas Jefferson was the author of the Declaration of Independence, which, with its hypocrisy-drenched words that “all men are created equal”, is supposed to be the template for the governance of African societies?
It’s bad enough that whites and Asians accept this falsehood, but is it essential that we in Africa do so as well?
Any wonder, then, why western “democracy” has brought about so much murder and mayhem in Africa? Or that a system created by a society that treated Africans so barbarically should only result in barbarism when forced upon the people here?
It is more than a little ironic that the very structure used to govern the newly independent, slavery-dominated USA was plagiarized from the League of the Iroquois, the federation of American Indian nations whose grand council and democratic processes were adopted almost without change by the original author of what became the Constitution of the USA, Benjamin Franklin, Thomas Jefferson’s mentor.
Can you tell me of anyone, scholar or layman, in the USA or among its acolytes internationally, who knows this fact: that the supposedly barbaric American Indians were the very people whose method of governance was adopted and distorted to become the basis of the system of governance known today as American “democracy”?
The League of the Iroquois was composed of nation “states” which had jurisdiction over affairs of the “state” only. Each “state” had its own elected legislature, which, as in Franklin’s Constitution, chose a number of “electors” to the “federal” League of the Iroquois. These “electors” were accorded to each “state” based on the individual “state’s” population. The “electors” met regularly in the sacred hall for their deliberations. This “grand council” (the name Franklin used in the original draft of the Constitution for what came to be the Congress of the USA) was unicameral, as was Franklin’s original white settler “council”, later Congress, of the former British colonies.
This Grand Council of the League of the Iroquois declared war and negotiated peace treaties, sent and received ambassadors, decided on the new members joining the League and in general acted as a “federal” government whose decisions superseded those of the “states” in affairs of the “nation”.
As in Franklin’s Constitution, in the League of the Iroquois, the electors could not be serving in the military while holding office. In both cases an electorate chose the electors and could recall their choice at anytime. One of the main differences between the structures of the two “democracies” was that in the League of the Iroquois the electors were reserved for men but ELECTED BY THE WOMEN. That’s right, in the League of the Iroquois the women elected the leadership, something much more “democratic” than the actual minority of men who made up the electorate in the USA.
The League of the Iroquois maintained a national state, stretching from New England to the Mississippi River, that existed in conditions of internal peace for a thousand years or more.
Africans, like the American Indians, traditionally practiced a more consensual form of democracy, not a winners-and-losers system of divide and rule. The introduction of American “democracy” was critical to the success of neocolonialism in Africa, and its implementation is responsible for most of the conflict and destruction wracking Africa today.
Western “democracy” is a system adopted by slave owners and redesigned to enable the preservation of a system of barbarism maintained by force and violence. Forced on Africa, this foreign infection has proved critical to the continuing subjugation of the African continent by the western powers.
It’s all about controlling Africa with western “democracy”, and, like Cuba in Latin America, there is only one country on the African continent that rejects this system of exploitation: the small, climate-disaster-wracked nation of Eritrea. Here in Eritrea we prefer to build our own system of “democracy”, based on a people’s liberation war of 30 years and a culture of unity despite religious and ethnic differences, which has withstood invasion, sanctions and climate disaster without faltering in our commitment to building a “Rich Eritrea without Rich Eritreans”, in other words Socialism. The west can have its so-called “democracy” as far as we Eritreans are concerned.

Israel’s Bogus Civil War

Jonathan Cook

Nazareth: Is Israel on the verge of civil war, as a growing number of Israeli commentators suggest, with its Jewish population deeply riven over the future of the occupation?
On one side is a new peace movement, Decision at 50, stuffed with former political and security leaders. Ehud Barak, a previous prime minister who appears to be seeking a political comeback, may yet emerge as its figurehead.
The group has demanded the government hold a referendum next year – the half-centenary of Israel’s occupation, which began in 1967 – on whether it is time to leave the territories. Its own polling shows a narrow majority ready to concede a Palestinian state.
On the other is Benjamin Netanyahu, in power for seven years with the most right-wing government in Israel’s history. On Friday he posted a video on social media criticising those who want to end the occupation.
Observing that a Palestinian state would require removing hundreds of thousands of Jewish settlers currently living – illegally – on Palestinian land, Netanyahu concluded: “There’s a phrase for that. It’s called ethnic cleansing.”
Not only did the comparison upend international law, but Netanyahu infuriated the Obama administration by implying that, in seeking to freeze settlement growth, the US had supported such ethnic cleansing. A spokeswoman called the comments “inappropriate and unhelpful” – Washington-speak for deceitful and inflammatory.
But the Israeli prime minister is not the only one hoodwinking his audience.
Whatever its proponents imply, the Decision at 50 referendum is about neither peace nor the Palestinians’ best interests. Its assumption is that yet again the Israeli public should determine unilaterally the Palestinians’ fate.
Although the exact wording is yet to be decided, the referendum’s backers appear concerned solely with the status of the West Bank.
An Israeli consensus believes Gaza has been free of occupation since the settlers were pulled out in 2005, despite the fact that Israel still surrounds most of the coastal strip with soldiers, patrols its air space with drones and denies access to the sea.
The same unyielding, deluded Israeli consensus has declared East Jerusalem, the expected capital of a Palestinian state, as instead part of Israel’s “eternal capital”.
But the problem runs deeper still. When the new campaign proudly cites new figures showing that 58 per cent support “two states for two nations”, it glosses over what most Israelis think such statehood would entail for the Palestinians.
A survey in June found 72 per cent do not believe the Palestinians live under occupation, while 62 per cent told pollsters last year they think Palestinians have no rights to a nation.
When Israelis talk in favour of a Palestinian state, it is chiefly to thwart a far bigger danger – a single state shared with the “enemy”. The Decision at 50 poll shows 87 per cent of Israeli Jews dread a binational conclusion to the conflict. Ami Ayalon, a former head of the Shin Bet intelligence service and a leader of Decision at 50, echoed them, warning of an “approaching disaster”.
So what do Israelis think a Palestinian state should look like? Previous surveys have been clear. It would not include Jerusalem or control its borders. It would be territorially carved up to preserve the “settlement blocs”, which would be annexed to Israel. And most certainly it would be “demilitarised” – without an army or air force.
In other words, Palestinians would lack sovereignty. Such a state exists only in the imagination of the Israeli public. A Palestinian state on these terms would simply be an extension of the Gaza model to the West Bank.
Nonetheless, the idea of a civil war is gaining ground. Tamir Pardo, the recently departed head of Israel’s spy agency Mossad, warned last month that Israel was on the brink of tearing itself apart through “internal divisions”.
He rated this a bigger danger than any of the existential threats posited by Mr Netanyahu, such as Iran’s supposed nuclear bomb.
But the truth is that there is very little ideologically separating most Israeli Jews. All but a tiny minority wish to see the Palestinians continue as a subjugated people. For the great majority, a Palestinian state means nothing more than a makeover of the occupation, penning up the Palestinians in slightly more humane conditions.
After many years in power, the right is growing in confidence. It sees no price has been paid, either at home or abroad, for endlessly tightening the screws on the Palestinians.
Israeli moderates have had to confront the painful reality that their country is not quite the enlightened outpost in the Middle East they had imagined. They may raise their voices in protest now but, if the polls are right, most will eventually submit to the right’s realisation of its vision of a Greater Israel.
Those who cannot stomach such an outcome will have to stop equivocating and choose a side. They can leave, as some are already doing, or stay and fight – not for a bogus referendum that solves nothing, but to demand dignity and freedom for the Palestinian people.

Water Wars Come To Bangalore (And Soon, The World)

Vijay Kundaji

I am not sure where each of you is, in your own reading and assessment of the situation, but I am now more or less convinced that life as we ‘know’ it, both here in ever ‘emerging’ India, and worldwide, is going to change very rapidly in an unprecedented direction – and very possibly we will all experience this directly in our own lifetimes.
Natural resource degradation, contamination and depletion through human over-exploitation – that is, by our ever growing and consuming population – plus climate change, are going to completely redefine our lives and those of our kids. We can remain ostrich-like and convince ourselves that derivatives traders, financiers and technologists will keep the ‘economy’ – as we have been brainwashed into understanding it – afloat and will keep providing endless ‘returns’ on paper money through this idea of compound interest. But the distress in the natural world today is so immense – and so real – that it is about to breach all our lives and assumptions.
From my chance vantage point, closely observing a small 3-acre piece of land near Thally in the penumbra of Bangalore and industrial Hosur, I have literally seen with my own eyes the degradation of the ecosystem and how water has vanished. If I had a time-lapse movie made of it (speed-running 20 years of visuals in a few minutes), you would see verdant landscapes pale in front of your eyes and eventually go brown, scores of lakes brimming with water shrink first to ponds and then to dry beds, and borewells that were on average 70 to 100 feet deep go dry today at 1,500 feet. Plastic and industrial trash blows around in the wind on what was once food-yielding agricultural land and is now essentially waste land waiting for an Audi-seeking, dream-peddling real estate developer to snatch it up. Paddy fields below the bunds of abundant lakes now sound like historical fiction, and food-crop agriculture has been decimated and replaced by a scrappy brand of greenhouse, cash-crop farming by those with the capital to sink multiple 1,500-foot borewells and suck what remains dry to convert it into money.
For millennia, the population in that area had been self-sustaining in all resources and especially in water. But, two years ago, after it was clear that the water was vanishing in the area, all the villages in the Thally area were put on ‘the grid’ and connected to a pipeline from the Cauvery via the highly controversial Hogenakkal water project. (We have a water reservoir just outside our piece of land – although we don’t get any of that water ourselves).
So what becomes of the Cauvery now? And even if the Gods are kind and somehow the monsoon switches back on, then in the days and years ahead? Where will those highly contested “cusecs” come from? Is some smart technologist going to invent them in a glass and concrete building with VC money?
I am sorry – but things are truly grim.  And 1 out of 6 humans in the world is crammed into the political boundaries of India.
Elsewhere the tides are rising (coastal erosion from Saurashtra all the way around to the Sundarbans), coasts are churning and rivers flooding (see Bihar/UP/Assam, N. Karnataka, Maharashtra and the wild fluctuations from conditions of drought to floods in a few weeks), while glaciers are receding rapidly in the Himalayas (as is happening worldwide).
The ‘comforting notion’ that some smart people sitting somewhere will solve all this with the help of technology, friends, is just that … a notion. It’s not happening. A lot of our problems have, in fact, been created by our resource-gobbling, energy-crazy, carbon-heavy way of life. We are hurtling as humans towards the irreversible 2-degree temperature rise, when ‘climate change’ will hit a new trajectory. (The Paris climate talks, etc., if you read about them, were a joke.)
I’ll stop my diatribe. And fervently hope that my angst is completely misplaced, even though my own experience suggests otherwise. I hope to be an ostrich (“Je suis une autruche” – “I am an ostrich” – to use a phrase that became popular in a different, distressing context!)

India and the US: Inching Towards an Informal Alliance

Chintamani Mahapatra


US Secretary of State John Kerry’s recent visit to India, along with Commerce Secretary Penny Pritzker, to participate in the second Strategic and Commercial Dialogue with their Indian counterparts, and Indian Defence Minister Parrikar’s visit to Washington to meet the US Secretary of Defence and further strengthen bilateral defence cooperation, have made it appear as if India and the US are inching towards an informal alliance relationship.
Alliance was taboo terminology in India’s approach to the world during the four decades of the Cold War era. The politically correct phrase was non-alignment. India never felt comfortable with the alliance politics indulged in by the US and the former Soviet Union. Successive governments in New Delhi promoted non-alignment as a credible foreign policy strategy and backed the Non-Aligned Movement (NAM), consisting of the vast majority of developing countries. Indian ideologues became defensive when critics described India’s closer defence and security ties with the former Soviet Union as antithetical to its professed policy of non-alignment.
The Indian government did not formally abandon NAM even after the end of the Cold War, although non-alignment slowly disappeared from the lexicon of Indian foreign policy and international relations. As and when Indian policy-makers came to terms with the new realities of the post-Cold War era marked by a sole superpower world order, the new mantra chanted by Indian strategic analysts came to be “strategic autonomy.”
Just as India was accused by critics of compromising its non-alignment by maintaining closer defence ties with the Soviet Union during the Cold War, it has faced similar reproach for compromising its “strategic autonomy” as it began to forge a strategic partnership with the US in the post-Cold War era.
However, the demands of the time, politico-security developments in the post-Soviet world order, the rise of non-state actors as effective challengers of state sovereignty, the nuclear capability of state sponsors of terrorism, the meltdown of the Middle Eastern political order and the end of the era of China’s peaceful rise, among others, have required a new kind of strategic collaboration between India and the US.
The Indo-US strategic partnership project that began with President Bill Clinton’s visit to India in March 2000 has advanced with varying pace and intensity, but after about fifteen years of its evolution, one may safely contend that there is no going back. The civil nuclear technology cooperation agreement, growing trade in arms and other military hardware, regular military exercises, and new initiatives for co-production and co-development of defence items as part of the Defence Technology and Trade Initiative (DTTI) have cemented the strategic partnership between the two countries.
However, 'strategic partnership' is not a well-defined term and many commentators have actually come to joke about it. India has strategic partnerships with many countries, including China. What then is the brand of the Indo-US strategic partnership? Detailed examination will, of course, show that India’s strategic partnership with the US is of a qualitatively different brand from that with China.
The notable distinctiveness of the Indo-US strategic partnership consists of defence trade, technology transfer, military-to-military cooperation and most recently, the Logistics Exchange Memorandum of Agreement (LEMOA). This is one of the four fundamental agreements that have been under negotiation between the two countries. That it took so long to ink this agreement need not surprise anyone, since both the US and India are vibrant democracies and all stakeholders are allowed to participate in decision-making on critical issues. Now that LEMOA has been concluded, other agreements will be taken up for discussion.
Significantly, the discourse in India on Indo-US defence and security cooperation has matured to the extent that hardly anyone has raised serious opposition to LEMOA. As in the US, a broad consensus seems to be developing in India for robust defence and strategic ties with the US.

FSI and Myanmar: More Clarity Required

Preet Malik


To try to evaluate a democratic process within a uniform framework is fraught with problems of a structural nature. Just by way of example, during a farewell call on the then Malaysian prime minister in August 1990, he remarked that “in India, we had too much democracy.”

He was referring to the fact that the opposition could hold up the process of governance in India. In Malaysia, such possibilities were then contained by strong governmental action against any form of opposition to its policies. Myanmar is a case in point. How do you reflect in measurable terms a graduated move towards a democratic system that has to be viewed in positive terms, and thus shown as an improvement over the past when the military held Myanmar in its authoritarian clasp, while the constitutional construct continues to place major hurdles in the way of attaining full democracy? The issue remains how one would establish a purely scientific basis for measuring this change.

The 2008 constitution is the basis of the system of administration that Myanmar enjoys today. The fact that there is now a non-military elected government in place is a very positive development, particularly as this is the first such government since 1962. However, this positive is constrained by the provisions of the constitution, which place the Myanmar armed forces at the centre of preserving the unity and integrity of the nation while significantly placing them outside the control of the civilian authority; the home (interior) ministry is headed by a nominee of the armed forces, ensuring that both domestic and external security remain in the domain of the military.

Accordingly, any measurement of the actual functioning of democracy would have to factor in the overwhelming controls that the military continues to enjoy in the governance of the country. The elected government has the flexibility to determine the course of the economy within its programmes for socio-economic development. It has control, to a large extent, over the direction of foreign policy and, of course, the place of Myanmar in international and regional discourse. However, on key domestic policy areas, such as the Rohingya issue, the general question of communal peace and harmony, the reconciliation process with ethnic groups insofar as it falls within the purview of security, and areas of strategic determination, the overbearing role of the armed forces remains centred on their veto over changes or policies that they disagree with.

The question for the FSI, therefore, is how it would accurately determine the weightage to be given to the different aspects of the technically limited democracy that has come to prevail in Myanmar. The essential fact is that, while the progress made must be acknowledged, full democracy is far from being restored to Myanmar. Another significant negative is that the ethnic minority issue remains a key factor to which a solution has yet to emerge. This poses a threat to the stability of the country and could become an excuse, as it did in the early 1960s, for ending democracy.

Union governments, whether democratic or authoritarian, have so far failed to meet the demands and aspirations of the ethnic groups, who claim there has been a consistent failure to honour the provisions of the Panglong Agreement in letter and spirit. This has led to armed resistance and to exploitation of the situation, particularly by China. The Thein Sein regime succeeded in bringing the groups to the negotiating table, with a universal ceasefire as a key component of the negotiations. Significantly, it also succeeded in persuading the Karens to join the process. However, certain key groups, encouraged by China, have continued their armed insurrection. Suu Kyi's recent visit to China has now raised the possibility of these groups also joining the process that the present government is pursuing under the Panglong nomenclature. The key issue is the demand for structural changes that would establish a true federal structure with a significant undertaking on autonomy. This would involve amendments to the constitution, which can only happen if the armed forces accept that such changes pose no threat to the security and integrity of the country. Again, to satisfy the demands for autonomy, the role of the armed forces in the governance of the states would have to be curtailed if not eliminated. This could pose a serious problem in evolving a solution that would satisfy the ethnic groups.

To conclude, the negatives in Myanmar to a large extent still continue to override the positives. Any true index would have to reflect that the situation remains far from ideal. The challenge posed by the ethnic groups and the systemic change that would have to take place to meet it is an area that imposes itself on any analysis of the direction in which Myanmar is moving. The 2016 FSI has taken these factors into account but the weightage that it would apply to these developments is not quite clear.

UK government pledge to expand grammar schools signals escalation of selective education

Robert Stevens

Conservative Education Secretary Justine Greening yesterday announced the government’s plans to reintroduce grammar schools in England.
This followed a speech Friday by Prime Minister Theresa May, her first to focus on domestic policy. The speech outlined plans to give all schools in England, including existing state comprehensives and academies, the right to apply to select pupils by ability. Alongside this, the remaining grammar schools will be allowed to expand.
May’s policy overturns that of her predecessor, David Cameron, and that of the 1997-2010 Labour government, which imposed a statutory ban on the expansion of grammar schools in 1998.
Some £50 million is to be allocated to fund the expansion. Within days of May’s speech, five councils had already drawn up plans to open new grammar schools.
Grammar schools were first introduced following the 1944 Education Act. At that time children sat exams at 11 years of age (the 11-plus) that creamed off the top achievers for grammar schools, while the majority of pupils attended secondary moderns. At their height in 1964, 1,300 grammar schools educated a quarter of all pupils.
During the 1960s and 70s, with the advent of the comprehensive state school system, the majority of grammar schools closed. By 1988, no grammars were left in Wales and by 2014, just 163 remained in England (mainly in counties without major urban locations).
The attack on comprehensive education has been proceeding apace, so that the majority of secondary schools are now academies and free schools (state-funded but privately run). But the latest announcement is a decisive shift to selection.
May’s grammar school policy is a direct pitch to a section of the middle class—dressed up with claims of a commitment to a “meritocratic society.” She stated that present education policy does not benefit those “who can’t afford to move house or pay for a private education,” with the government “saying to parents who want a selective education for their child that we won’t let them have it.”
The wealthiest social layers in Britain send their children to fee-paying “public schools” (private schools, also known as independent schools). However, these 242 schools, which charge fees averaging £23,000 per annum, educate only seven percent of the total number of schoolchildren in England. These schools are well out of the financial reach of large sections of the middle class. May noted in her speech, “Between 2010 and 2015 their fees rose four times faster than average earnings growth, while the percentage of their pupils who come from overseas has gone up by 33 percent since 2008.”
“I want to relax the restrictions that stop selective schools from expanding, that deny parents the right to have a new selective school opened where they want one, and that stop existing non-selective schools becoming selective in the right circumstances and where there is demand,” she said.
After the vote to leave the European Union in June’s referendum, “Everything we do will be driven, not by the interests of the privileged few”, she claimed. “Not by those with the loudest voices, the special interests, the greatest wealth or the access to influence.”
Her speech made eight references to the struggles and hardship facing the “working class”, “who made real sacrifices after the financial crash in 2008, though they were in no way responsible… I want Britain to be the world’s great meritocracy – a country where everyone has a fair chance to go as far as their talent and their hard work will allow.”
May’s cynicism is boundless. A right-wing Thatcherite, she portrays the expansion of selective schools as a means to enhance social mobility—able to take “a proportion of pupils from lower income households, so that selective education is not reserved for those with the means to move into a catchment area or pay for tuition to pass the test.”
In reality this ideologically driven offensive will only reinforce class divisions and social inequality. The claim that grammar schools enable social mobility is a myth. Thanks to their greater opportunities—a more stable home environment, access to the arts, a culture of academic attainment, ability to pay tuition—selection overwhelmingly benefits students from better off families.
It is for this reason that streaming by attainment in the state sector has been the policy of successive governments over the past two decades. By 2011, research published by the Institute of Education had revealed that one in six children is allocated to a stream by the age of seven.
There is overwhelming evidence proving that grammar schools from their inception have primarily benefited the more socially privileged layers. Researchers from the University of Bristol, the University of Bath and the Institute of Education, University of London, found that those who failed to pass the 11-plus to enter grammar schools were left at an “immediate disadvantage” in terms of future earnings. Grammars lead to a widening of the income gap between rich and poor.
The survey, based on the average pay of the top and bottom 10 percent of the workforce born between 1961 and 1983, found that in areas with a grammar school system, top earners are likely to earn £16.41 an hour more than those on the lowest incomes (a gap equivalent to around £30,000 a year, based on a 35-hour week). In areas where the education system was fully comprehensive, the hourly gap was £12.33—a quarter less.
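For readers who want to see how the rounded figures fit together, the following is a minimal arithmetic sketch in Python. The hourly gaps are those reported above; the 35-hour week comes from the survey as quoted, while the 52-week working year is an assumption made here for illustration only.

```python
# Illustrative check of the pay-gap figures cited above.
# Hourly gap figures are taken from the article; the annual conversion
# assumes a 35-hour week and a 52-week year (the latter is an assumption).

HOURS_PER_WEEK = 35
WEEKS_PER_YEAR = 52  # assumption: a full working year with no unpaid weeks

grammar_area_gap = 16.41        # £ per hour, areas with a grammar school system
comprehensive_area_gap = 12.33  # £ per hour, fully comprehensive areas

annual_gap = grammar_area_gap * HOURS_PER_WEEK * WEEKS_PER_YEAR
print(f"Annualised gap in grammar areas: £{annual_gap:,.0f}")   # ~£29,866, i.e. "around £30,000"

reduction = 1 - comprehensive_area_gap / grammar_area_gap
print(f"Comprehensive-area gap is {reduction:.0%} smaller")     # ~25%, i.e. "a quarter less"
```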
In a report issued Monday, the Institute for Fiscal Studies noted, “Children from deprived backgrounds are much less likely to attend existing grammar schools than are better off children. Only about three percent of pupils at existing grammar schools are eligible for free school meals (a widely used indicator of poverty in schools), which compares with about 17 percent of pupils in grammar school areas as a whole.”
May announced a series of other reactionary proposals designed to eliminate comprehensive education—including private schools supporting or sponsoring state schools in return for maintaining their charitable status. Universities will be obliged to sponsor a state school or set up a new Free School. In exchange, they will be encouraged to charge higher tuition fees.
These policies are being introduced under conditions in which the vast majority of graduates leave university burdened with tens of thousands of pounds of debt due to sky-high tuition fees.
It is not primarily lack of educational attainment that is having the greatest adverse impact on the young, but a deepening economic crisis. A recent study found that having a degree in Britain today is ever less likely to secure a decent, well-paid job. The Intergenerational Foundation revealed that Britons between the ages of 15 and 35 risk being the first modern generation to earn less over the course of their working lives than their predecessors did. The burden of student debt repayment, typically costing a total of £282,420 over 30 years, will wipe out any “graduate premium” for most professions.
There are divisions in ruling circles as to the efficacy of expanding the grammar system. One senior minister told the Sunday Telegraph: “With such a small majority [of 12 MPs in parliament], now is not the time to be picking a fight like this.”
The Financial Times editorialised that grammars were the “wrong solution”, warning, “it is worrying that Mrs May has chosen to begin by reviving such a divisive, totemic policy.”
To avert this danger, the nominally liberal Guardian stepped forward to issue words of praise for the government. “England’s school system is at last working pretty well,” it asserted. May was “helping social mobility,” but should not therefore return to a policy “that was abandoned 50 years ago because it had failed.”