24 Jun 2016

STEM Scholarships for African Girls 2016

Brief description: The Visiola Foundation is offering undergraduate scholarships to female students from Africa.
Application Deadline: 30th June 2016
Offered annually? Not known
Eligible Countries: Countries in Africa
To be taken at (country): Interested female candidates may choose to study at one of the following partner institutions:
  • Ashesi University, Ghana
  • Lead City University, Nigeria
  • Bayero University, Nigeria
Eligible Fields: Science, Technology, Engineering and Mathematics fields
About the Award: The Visiola Foundation offers scholarships to high potential African girls from disadvantaged backgrounds to pursue STEM degrees at partner institutions listed above. Please apply directly to the desired university to be considered.
Type: Undergraduate taught
Eligibility: Candidates are expected to meet all of the eligibility criteria:
  • Nationals of an African country.
  • Received both primary and secondary education in an African country.
  • Graduated in the top 10% of their secondary school’s graduating class.
  • Have clearly demonstrated financial need.
  • Demonstrate passion for and commitment to community service as evidenced by prior and on-going involvement in local activities.
  • Are not related to, or otherwise connected with, any staff member (or friend or acquaintance of a staff member) of the partner universities or the Foundation.
Number of Awardees: Several
Value of Scholarship: Not stated
Duration of Scholarship: Duration of course
How to Apply: Interested candidates should submit a complete application, including:
  • An application form
  • Copy of a Government issued identification document.
  • One recent passport size photograph.
  • Final year Primary and Secondary School report cards and exam certificates (WAEC).
  • Annual report cards for senior secondary school (SS1 – SS3).
  • Explanation of family’s financial history with supporting documents.
  • Curriculum Vitae (CV)/Resume.
  • Personal statement discussing the candidate’s passion for community service and involvement in local activities (not to exceed 500 words).
Interested candidates should visit the Scholarship Webpage for full details on how to apply.
Award Provider: Visiola Foundation

The 2016 Elsevier Foundation Awards for Early-Career Women Scientists in the Developing World

Application Deadline: 1st of September 2016
Offered annually? Yes
Eligible Countries: Angola, Benin, Botswana, Burkina Faso, Burundi, Cameroon, Cape Verde, Central African Republic, Chad, Congo Dem. Rep., Congo Rep., Côte d’Ivoire, Equatorial Guinea, Eritrea, Ethiopia, Gabon, Gambia, Ghana, Guinea-Bissau, Guinea, Kenya, Lesotho, Liberia, Madagascar, Malawi, Mali, Mozambique, Niger, Nigeria, Rwanda, São Tome and Principe, Senegal, Sierra Leone, South Sudan, Tanzania, Togo, Uganda, Zambia, Zimbabwe.
To be taken at (country): Winners receive an all-expenses-paid trip to attend the American Association for the Advancement of Science (AAAS) annual meeting in Boston, where they are given their awards.
Brief description: The Elsevier Foundation is calling for nominations of hardworking science-inclined women from Africa and other developing countries.
Eligible Field of Study: Engineering Sciences: Engineering, Innovation and Technology
About the Award: Launched by The Elsevier Foundation, TWAS and OWSD, the Awards reward and encourage women working and living in developing countries in the early stages of their scientific careers. Awardees must have made a demonstrable impact on the research environment both at a regional and international level and have often overcome great challenges to achieve research excellence.
The award has an important impact on local research cultures. Previous winners say the awards have had a powerful effect, enhancing the visibility of their past work and creating new opportunities for the future. The awardees are powerful role models for young women who are contemplating whether to remain in an environment that is often hostile to their needs and experience.
Nominations are invited from senior academics, including OWSD members, TWAS Fellows, ICTP visiting scientists and staff, national science academies, national research councils and heads of departments/universities both in developing and developed countries.
Offered Since: 2012
Type:  Science Research
Eligibility: The nominee must be a female scientist; have received her PhD within the previous 10 years; and have lived and worked in one of the eligible countries listed above during the three years immediately prior to the nomination.
Selection Criteria: The competition will be judged by a distinguished panel of international scientists, including members of TWAS, OWSD and ICTP, and chaired by OWSD. The assessment will be based on achievements in the field, with particular attention paid both to the nominees’ contribution to capacity-building in their region and to their international impact. Winners will be informed of their selection in November 2016.
Number of Awardees: Five. One woman is awarded for each of five regions in the developing world: Latin America and the Caribbean; East and South-East Asia and the Pacific; the Arab region; Central and South Asia; and Sub-Saharan Africa (see the list of eligible African countries above).
Value of Award: Each winner receives a cash prize of USD 5,000 and an all-expenses-paid trip to attend the American Association for the Advancement of Science (AAAS) annual meeting in Boston in February 2017. Lasting 5 days, the event is packed with networking opportunities. The winners receive their award at a special networking ceremony, as well as invitations to mentoring and science communication workshops, a visit to a local laboratory, and a celebratory dinner organised by the Elsevier Foundation.
Duration of Award: Not stated
How to Apply: Nominators should visit the Award webpage to download and complete the nomination form. Completed nomination forms should be sent to:
OWSD Secretariat
OWSD-Elsevier Awards
ICTP campus
Strada Costiera 11
34151 Trieste
Italy
Phone: +39 040 2240321  Fax: +39 040 2240689
email: info@owsd.net
Award Provider: The Elsevier Foundation
Important Notes: Candidates should note that self-nominations are not accepted. Nominations must be made on the nomination form and signed by the nominator; they must include the candidate’s curriculum vitae and full list of publications; and be accompanied by three reference letters.

Biophilia as Extreme Sport

Christy Rodgers

The renowned biologist E.O. Wilson gave us the term “biophilia,” which he defined as “the urge to affiliate with other forms of life.” As the world’s human population goes on expanding and walling itself up in cities, and the Sixth Extinction gathers steam, this urge is often expressed as an increasingly desperate kind of nostalgia. It drives support for conserving wild places many will never visit, as well as pastoral landscapes in which most will never work. Not to mention the proliferation of pretty floral, animal, and landscape images on our laptops and phones.
We know we’re missing something – we just don’t seem to have the time or inclination to get out there and look for it in the natural world. We turn instead to extravagant machine-made sound and light shows and other pseudo-experiences to replace the sensory and cognitive richness of the biological affiliations we’ve lost.
Charles Foster, the author of Being a Beast: Adventures Across the Species Divide, is thus something of an atavism. An English gadabout and veterinarian with Oxford university credentials, he has written a memoir of his gonzo-naturalist attempts not just to observe wild animals, but to live like them, to experience their world from the inside. He is not nostalgic by temperament, but his book is likely to be read by people who are. His personal antidote to our increasing disconnection from the biosphere is not one it would be likely – or beneficial, especially to the other animals – for many of us to follow.
He sets himself the task of aping – if you’ll excuse the expression – five creatures: badgers, otters, foxes, red deer, and swifts. All of them have a sort of talismanic status in what remains of Britain’s wild. He digs himself a badger den, and lives there (off and on) for weeks. He attempts to careen down rivers naked and eat live fish, like an otter. He forages for trash and naps in shrubs, like a London fox. He sets a staghound on himself on a moor (he is very quickly caught).
It’s a good read, for a while. Foster pushes his prose to zippy, zingy places to represent the difficulty of seeing, smelling, reacting and thinking like his chosen species. He admits his limitations and failures, emphatically, but he still thinks the experiment worth doing. He thinks he comes surprisingly close, a few times, to “being a beast.”
It’s hard not to sympathize with his attempt, desperate as it is, to make the case that we still share a world with these wild creatures and millions more, and that we are the better for it. And we’d be even better off (as the other animals indubitably would be) if we recognized them as beings as worthy of admiration, respect, and even love as our own species.
Foster obviously has those feelings abundantly. He even takes to task the “colonialist” mentality of some other naturalists – the idea that the main reason to acquire knowledge of nature is to better control it. And yet in the end he reminds me of nothing so much as the intrepid anthropologists who seek out un-contacted tribes and try to live among them and – with all the sympathy in the world – “interpret” their lives to the civilized world. What is learned, and who benefits and how, remain ambiguous at best.
I did learn quite a lot about Charles Foster in this memoir. (I learned almost nothing about the shadowy wife who mutely plays housemaid for him and their six kids – whom he recruits as participants in his experiments in ways that are a little unsettling, like sharing the badger den in freezing weather and eating worms with him, or dropping their shit by rivers to mark their territory, like otters.) But for all his efforts, I’m not sure I learned much about his chosen animals that a good nature documentary (maybe with some fancy underground, underwater, and airborne camera work) couldn’t tell me. His attempts to give these animals interiority by imitating their physical behaviors from within his very different skin and form of consciousness left me skeptical.
It’s not that I don’t believe animals – or plants, for that matter – can think. I’m positive they can, in terms of any reasonable definition of thinking. It’s just that I don’t believe that I – or Charles Foster either – can know what their experience feels like to them.
As I was reading his book one afternoon on a windswept beach in California, I looked up and saw a little girl carefully making her way across the sand, her eyes on the ground. Even though I myself have been a little girl walking on the beach (although not that particular beach) and was just then sensing the same sunlight, sand and wind, I had no more idea what that child’s inner experience was than I have of what any given otter is experiencing when he or she slides off a riverbank, or a stag when he steps through a brushy grove. And Foster’s combination of constructed situations, highly torqued prose, and biochemistry factoids didn’t really get me there.
Mainly I was left thinking that there was a fundamental contradiction in this labor of intended respect: must humans try to be everywhere? To insert ourselves, physically or cognitively, into every aspect of life, including the inner experience of other animals? Can’t we express our respect for wild nature by letting it be?
Probably the answer to those questions is no – there is something relentlessly colonialist in human curiosity – but it seems that our encounters could at least be marked by more deference. In some cultures, after long periods of trial and error, they have been.
Being a Beast brought to mind Werner Herzog’s 2005 documentary Grizzly Man, a striking counterpoint. Herzog is not what you’d call a “nature lover,” and a lot of his work seems aimed at de-romanticizing the idea that some civilized humans have of the world outside civilization. The subject of the film, if you haven’t seen it, is a lone young man who fell in love with Alaskan grizzlies and decided to get as close to them as possible. He succeeded – for a while. But he met a horrible end.
I think Herzog’s perspective is that most humans are now irrevocably sundered from wild nature, and good riddance. But that isn’t the point I take. For me the most telling interview in the film is with a member of an Alaskan Native nation, who explains the norms he was raised with regarding grizzlies: Real respect meant giving them a wide berth. Not just because they were dangerous to humans but because they deserved to live their lives free of human interference. Their lives were full and complete without us. The land they ranged belonged to them first.
We have not even learned how to acknowledge, respect and co-exist with otherness within our own species. But hyper-identification isn’t the answer. Humility might be.
The other thing I learned about Foster from this book is that he identifies much more with predators than with prey animals. The red deer is the hardest animal for him to meld with, probably because he cops to having shot a number of them in his now-abandoned hunting days. He expands this lack of identification to all of humanity, averring that we are a top predator and therefore cannot really respect an animal whose existence is defined by the fact that it is a food source.
It seems a bit nasty to refer back to Grizzly Man in this context, or the child who was devoured by an alligator at Disney World recently. I’d rather say that, even leaving other predator species aside, most women and a large number of children have probably had the experience of feeling like prey at some point in their lives. Regardless of how Charles Foster feels, many of us can identify. I’d even venture that the identification as prey was part of the impetus behind the development of civilization – for what that’s worth. But we are also our own worst predators, so we have to straddle that paradox.
I don’t want to diss this book, or its author, entirely. He tells a good story, he raises a lot of good questions, his heart seems to be in the right place, and he’s done his homework. You should always get points for those things. I also give him big points for his admiration of the ideas of Rupert Sheldrake. Sheldrake is the rogue biologist who has been ruthlessly attacked by reductionists like Richard Dawkins for venturing to posit that the principles guiding the cosmos make it more like a living thing than a machine.
But in the end, I think I just prefer my naturalists old school: sitting quietly, walking carefully, and using their limited human minds humbly as they wait with infinite patience for the wild to reveal itself to them.

The Sad Truth About Messi

Cesar Chelala

Gabriel Omar Batistuta, the now-retired Argentine soccer player nicknamed Batigol as well as El Angel Gabriel, was for years the top scorer with Argentina’s national team, until Lionel Messi surpassed his record with a majestic goal. Argentina beat the U.S. 4-0 in a game in which Messi showed again why he is widely considered the best player in the world.
Lionel Messi netted his 55th goal, overtaking Batistuta’s record of 54 goals in 78 games. Messi is now also the top scorer in Spain’s La Liga (The League), the top professional division of the Spanish football league system, in Clásicos (as any match between fierce rivals Real Madrid and FC Barcelona is called), and for his own club, Barcelona.
Messi reached his new record with a stupendous free-kick stunner from over 25 yards that left Brad Guzan, the U.S. goalkeeper, totally defeated. There was a tremendous air of expectation at the stadium before Messi kicked the ball, interrupted only by repeated shouts of “Messi! Messi!” from the crowd. Messi slowly tied his shoes, looked at the net and, with jeweler’s precision, sent the ball, which landed with spaceship ease in the upper right-hand corner of the net. More than 70,000 people saw him reach another milestone.
Breaking records has now become a habit for Messi. In addition to the free-kick goal, Messi contributed to Argentina’s victory with two splendid assists that ended in goals. Messi scored his goal three days short of his 29th birthday, while Batistuta was four years older when he scored his last one.
Every move by La Pulga (the Flea), as he is frequently called, was loudly celebrated by the public. No one, however, was as enthusiastic as the fan who walked across the field, came in front of Messi and bowed at his feet. He then embraced Messi, who signed the fan’s shirt with a marker pen before the fan ran off and was caught by security personnel.
Unlike his often boastful compatriot Maradona, Messi, a generally humble man, declared after the game, “I have to thank my team-mates with whom I’ve played. The record I have achieved is thanks to them…From the first day we did things in a spectacular way, and we deserve to be here, for how we’ve been working. I hope that the final game goes well.”
In an interview with the newspaper La Nación, Eduardo Galeano, the noted Uruguayan writer, once said that while Maradona had the ball tied to his foot, Messi had the ball inside it, and that is why he was able to play the way he did. Messi learned about Galeano’s theory from the Argentine national team’s coach and was so delighted with it that he sent Galeano a jersey with his name as a present.
Speaking to Líbero, a program that goes out on TV channel TyC Sports, Batistuta said about Messi before he achieved his record, “I became the national team’s leading scorer almost without realizing. It wasn’t a goal I set for myself. It wasn’t an objective or an obsession, because I never cared about statistics. Now Leo is going to take it from me, I have to say it hurts a bit. It’s not just any old player who is moving past me, though. It’s going to someone who is out of this world and that makes me feel a bit better.” And, without realizing it, Batistuta told us the sad truth about Messi: Messi is simply not a normal human being. He is an extraterrestrial.

Beyond Dangerous: the Politics of Climate

Michael Doliner

Kevin Anderson is the Deputy Director of the Tyndall Centre for Climate Change Research at the University of Manchester. Here is a talk he gave in 2011. Unlike many who talk about climate science, Anderson sticks closely to the numbers. Those in a hurry might skip to 15:00. Starting around there, Anderson shows what it would take to actually achieve the goal of the Copenhagen Accord: limiting global average temperature rise to 2° C. He shows several graphs, each with a number of scenarios that take into account the uncertainty of the science. Anderson has examined every scientific assessment he could find and has selected three different peak emissions dates. All scenarios require a drastic reduction of emissions from peak at rates never achieved before, not even when economies collapsed, as in the former Soviet Union.
In this video Anderson still contemplates the possibility of emissions peaking in 2015. Later peaks require steeper reductions. The last graph shows what would happen if we wait until 2025 to peak emissions: some of the more pessimistic scientific assessments would no longer be achievable, their threshold carbon budgets having already been exceeded, and those that still have something left would require an immediate end to all fossil fuel use except for the production of food.
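The logic behind “later peaks require steeper reductions” can be sketched with a simple calculation; the numbers below are purely illustrative and are mine, not Anderson’s. If emissions decline from a peak level E at a constant fraction r per year, cumulative future emissions come to roughly E/r, so staying within a remaining carbon budget B requires

\[ \frac{E}{r} \le B \quad\Longrightarrow\quad r \ge \frac{E}{B}. \]

With hypothetical figures, a remaining budget of 600 GtCO2 against peak emissions of 40 GtCO2 per year implies cuts of about 7 percent per year; burn through another decade at the peak rate and the remaining budget shrinks to about 200 GtCO2, pushing the required cuts toward 20 percent per year.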
Later in the video (around 46:00) Anderson does offer some suggestions for reducing emissions quickly without affecting, too much, our way of life. Only rapid reductions in demand have any chance. These reductions need to be made by only 1% of the population, and their sacrifices in doing so would be minimal: little or no reduction in quality of life, and a large reduction in emissions. Why have they not been done? That they haven’t, I think, demonstrates that the 1% have a lot less control than they think. Their actions must fit tightly within the growth paradigm. Using more is the reward for success.
These measures would have offered and still offer quick early reductions, but not enough to stave off a rise in world average temperature of 2° C. That would require some pain. If emissions do not peak by 2025 all scientific assessments predict the impossibility of staying within 2° C, as the Copenhagen Accord requires.
Earlier in the video Anderson points out that the assessments of danger that established 2° C as a safe threshold in 1992 were later redone with more data, and scientists discovered that these effects occur at much lower temperatures. So even if we could keep the temperature rise to 2° we would be beyond dangerous, hence the name of the talk.
The effects of 4° C on agriculture, heat waves, and the like would make civilization impossible, and perhaps also human life.
Anderson still believed, in 2011, that there was a remote chance to keep the global average temperature rise under 2°, but the chance was slight and is growing slighter. The rates of reduction necessary even with a 2015 peak were far beyond anything ever achieved, and later peaks require even steeper reductions. Anderson was speaking in 2011. Even then virtually all scenarios required carbon capture with technology that does not yet exist. Of course the 2015 peak is no longer possible. Obviously, the situation is more desperate now.
This should put into perspective the coming Presidential election between Clinton and Trump. Does either of them have any idea of the gravity of the situation? Is there a remote chance that either has the intellectual equipment to understand it? I am not talking here about native intelligence, whatever that is, but habits of thought. Clinton looks for the way the wind is blowing. Trump polishes his own star. Will either of them be able to see beyond a world of class and personal interests and actually bite down on a fact? The people they talk to have known for 25 years what was happening and have done nothing to reduce emissions. How would implications from the actual facts enter those minds? Think of Clinton and Libya; Trump and his failed casinos. Has either shown the capacity for making good, or even sane, decisions? Could either of them even begin to organize what has to be done? Could Hillary Clinton or Donald Trump find a way to convince the entire nation to change its ways, its very thought processes, so drastically? Could either actually lead anybody? Are there people who actually believe in either of them? Wouldn’t either rather take the easy popular choice of doing nothing and letting the good times roll as long as they can? Is it sensible to pay so much attention to these two when we should be doing something about humankind’s imminent demise? Given the total disaster that either of them presents, and the utter irrelevance of which one we choose, does it make sense for a serious person to give them another moment’s thought?
Anderson complains that scientists self-censor because bad news does not attract funding and they don’t want to be its messenger. But it seems to me that he is guilty of the same fault when he offers hopeful answers that affect our lives little.
Only someone who knows the danger and cares about the future of humankind could make the last ditch effort to try and keep global average temperature rise to 2°C. For that effort cannot be made without total upheaval in how we live. We are not going to do this with a simple painless adjustment of the knob. We cannot use hydrocarbon fuels for energy, period. But within the global economic system using fossil fuels is a rational choice. Poorer developing countries will use fossil fuels in addition to whatever alternatives show up. If laws in richer countries forbid the use of fossil fuels, their price will drop and that will only make them that much more attractive elsewhere. The highly-competitive growth-seeking global economy will use every scrap of every resource in time.
So the economy predicated on growth must go, as Anderson recognizes, if civilization, and perhaps human life, is to survive. The conclusion is as clear as noonday, and anyone who denies it is just afraid to look. If growth is the end, burning fossil fuels will be a rational choice. No law, no tax will prevent it so long as economic growth, meaning cars, air conditioning, airplanes, big houses, and other energy-intensive comforts, is the goal – as it is, everywhere, now.
It is only when we grasp this astonishing fact that we can even begin to think about our predicament. Really, the choice between Hillary Clinton and Donald Trump, as truly lamentable as that choice is, is also irrelevant. Obviously neither has a clue, and neither has much familiarity with the implications of the truth. Both Clinton and Trump, each in her or his way, have signed on to the use of language to deceive and persuade, that is, rhetoric, rather than to tell the truth. It would be a waste of breath to try to explain it to either of them. Could you imagine Hillary Clinton or Donald Trump embracing a no-growth policy? They would both be looking for an angle, as they always do. Habits of thought are real.
To save ourselves we need to change everything. Growth is in our language, our thought, our DNA, our very idea of time. Growth seems like an advance in linear time. We can’t just go nowhere forever. Growth, as the essential human purpose, will force us to burn carbon-based fuels. This should all be obvious, but it crosses the threshold of allowable thought for many people. Time must again become a spiral, and success must be measured in lack of change, lack of growth, indeed shrinkage, year to year. Only that kind of about-face will give a remote chance of human survival. And of course even then the chance of remaining below 2° C is remote. The implications are enormous but not too hard for anyone to see in outline. Of course nobody wants to see it. For how is such a momentous change to be effected? And if it is not, what do we do then?
Well, there we are. If whoever we elect gets a second term (Bush did, Obama did) that will take us to 2025, where, if the use of fossil fuels is still rising, all scientific projections for keeping below 2° C end. I suppose the hope for human survival then would hang upon all scientific assessments having been wrong and Hillary Clinton or Donald Trump, whoever we have elected, having been right.

The 10 Worst Acts of the Nuclear Age

David Krieger

The ten worst acts of the Nuclear Age described below have set the tone for our time.  They have caused immense death and suffering; been tremendously expensive; have encouraged nuclear proliferation; have opened the door to nuclear terrorism, nuclear accidents and nuclear war; and are leading the world back into a second Cold War.  These “ten worst acts” are important information for anyone attempting to understand the time in which we live, and how the nuclear dangers that confront us have been intensified by the leadership and policy choices made by the United States and the other eight nuclear-armed countries.
1 Bombing Hiroshima (August 6, 1945). The first atomic bomb was dropped by the United States on the largely civilian population of Hiroshima, killing some 70,000 people instantly and 140,000 people by the end of 1945.  The bombing demonstrated the willingness of the US to use its new weapon of mass destruction on cities.
2 Bombing Nagasaki (August 9, 1945). The second atomic bomb was dropped on the largely civilian population of Nagasaki before Japanese leaders had time to assess the death and injury caused by the atomic bomb dropped on Hiroshima three days earlier.  The atomic bombing of Nagasaki took another 70,000 lives by the end of 1945.
3 Pursuing a unilateral nuclear arms race (1945 – 1949). The first nuclear weapon test was conducted by the US on July 16, 1945, just three weeks before the first use of an atomic weapon on Hiroshima.  As the only nuclear-armed country in the world in the immediate aftermath of World War II, the US continued to expand its nuclear arsenal and began testing nuclear weapons in 1946 in the Marshall Islands, a trust territory the US was asked to administer on behalf of the United Nations.  Altogether the US tested 67 nuclear weapons in the Marshall Islands between 1946 and 1958, with the equivalent explosive power of 1.6 Hiroshima bombs daily for that 12-year period.
4 Initiating Atoms for Peace (1953). President Dwight Eisenhower put forward an Atoms for Peace proposal in a speech delivered on December 8, 1953.  This proposal opened the door to the spread of nuclear reactors and nuclear materials for purposes of research and power generation.  This resulted in the later proliferation of nuclear weapons to additional countries, including Israel, South Africa, India, Pakistan and North Korea.
5 Engaging in a Cold War bilateral nuclear arms race (1949 – 1991). The nuclear arms race became bilateral when the Soviet Union tested its first atomic weapon on August 29, 1949.  This bilateral nuclear arms race between the US and USSR reached its apogee in 1986 with some 70,000 nuclear weapons in the world, enough to destroy civilization many times over and possibly result in the extinction of the human species.
6 Atmospheric Nuclear Testing (1945 – 1980). Altogether there have been 528 atmospheric nuclear tests.  The US, UK and USSR ceased atmospheric nuclear testing in 1963, when they signed the Partial Test Ban Treaty.  France continued atmospheric nuclear testing until 1974 and China continued until 1980.  Atmospheric nuclear testing has placed large amounts of radioactive material into the atmosphere, causing cancers and leukemia in human populations.
7 Breaching the disarmament provisions of the NPT (1968 – present).  Article VI of the Nuclear Non-Proliferation Treaty (NPT) states, “Each of the Parties to the Treaty undertakes to pursue negotiations in good faith on effective measures relating to cessation of the nuclear arms race at an early date and to nuclear disarmament….”  The five nuclear weapons-states parties to the NPT (US, Russia, UK, France and China) remain in breach of these obligations.  The other four nuclear-armed states (Israel, India, Pakistan and North Korea) are in breach of these same obligations under customary international law.
8 Treating nuclear power as an “inalienable right” in the NPT (1968 – present).  This language of “inalienable right” contained in Article IV of the NPT encourages the development and spread of nuclear power plants and thereby makes the proliferation of nuclear weapons more likely.  Nuclear power plants are also attractive targets for terrorists.  As yet, there are no good plans for long-term storage of radioactive wastes created by these plants.  Government subsidies for nuclear power plants also take needed funding away from the development of renewable energy sources.
9 Failing to cut a deal with North Korea (1992 – present). During the Clinton administration, the US was close to a deal with North Korea to prevent it from developing nuclear weapons.  This deal was never fully implemented and negotiations for it were abandoned under the George W. Bush administration.  Consequently, North Korea withdrew from the NPT in 2003 and conducted its first nuclear weapon test in 2006.
10 Abrogating the ABM Treaty (2002). Under the George W. Bush administration, the US unilaterally abrogated the Anti-Ballistic Missile (ABM) Treaty.  This allowed the US, in combination with expanding NATO to the east, to place missile defense installations near the Russian border.  It has also led to emplacement of US missile defenses in East Asia.  Missile defenses in Europe and East Asia have spurred new nuclear arms races in these regions.

Killing Ourselves With Technology

Pete Dolack

What do we do when technology spirals out of our control? Or, to put it more bluntly, when does humanity’s ability to build ever more dangerous weapons become a self-fulfilling prophecy?
Albert Einstein is said to have remarked that he didn’t know what weapons the third world war would be fought with, but the fourth would be waged with sticks and rocks. Even that classic of science fiction optimism, Star Trek, had humanity surviving a third world war. (Spock recounted the tolls of Earth’s three world wars in one episode.)
But we wouldn’t, would we? Or we might wish we didn’t. One story that has long lingered in my mind is an early Philip K. Dick story, “Second Variety,” published in 1953, a time when the cold war was looking decidedly hot. The story takes place in a post-apocalyptic France, in a world in which nuclear bombs and other equally nightmarish weapons have reduced most of North America and Europe to gray ash, with only a stubby tree trunk or a blasted wall dotting barren, depopulated landscapes.
The West’s governments have retreated to a bunker somewhere on the Moon, with scattered groups of soldiers huddled in hidden underground bunkers on Earth trying to “win” the world war. The land is uninhabitable because of a super-weapon developed by the U.S. — autonomous machines that home in on any living being and rip it to shreds with whirring metal blades that make short work of whatever they encounter. The Western soldiers are protected by a belt that forces the death machines to back off. This is the weapon that turns the tide of the war to a U.S. advantage after years of “losing” the war against the Soviet Union.
But what is there to “win”? Much of the world is uninhabitable, not only because of the total destruction and residual radiation from countless bombs but from the new weapon. There is no alternative but to huddle in underground bunkers. As Dick’s story unfolds, the nightmare gets progressively worse — the weapons are not only autonomous, they are self-replicating and continually inventing newer and more deadly varieties of themselves. The last pockets of U.S. and Soviet soldiers in this slice of the French countryside are systematically killed as the machines learn to build robots difficult to distinguish from humans; robots allowed into bunkers as refugees, only to suddenly become unstoppable killing machines, and which don’t distinguish one side from the other.
Although shuddering at the mere thought of their deadliness, more than once a soldier tries to justify these ultimate weapons by saying “If we hadn’t invented them, they would have.”
If we didn’t shoot first, bomb first, destroy first, they would have. Whatever we do is justified. No culture has a monopoly on such thoughts. But such thoughts, combined with the technological progress of the present day, rising nationalism and budget-busting military spending, leave the possible end of the human race a concrete possibility rather than merely a science fiction allegory.
Philip K. Dick was no prophet — no one is — but the nightmare world he created is chillingly tangible. What would happen if a technology of war were given autonomy? Such a weapon would be purposefully designed to kill swiftly and without mercy. The Pentagon has already begun a program designed to create autonomous weapons systems.
But what if an artificial intelligence decided humans were in the way? Isaac Asimov famously had his robots programmed with three laws that blocked them from doing any harm to any human. The other side of this equation was explored in another Star Trek episode, when the Enterprise encountered a planet populated by advanced robots. The robots had killed their creators so far back in time that the robots couldn’t remember when, but had done so because their creators “had begun to fear us and started to turn us off.”
Technology need not be feared nor is it necessarily fated to escape all control. There are no von Neumann machines swarming everywhere (at least in this part of the galaxy!), and I am inclined to agree with Arthur C. Clarke’s maxim that there is no evil technology, only evil applications of technology. Yet we live in a world where there are plenty of opportunities for technology to be used for evil purposes. We see some of this all around us as workplaces become sites of tightening surveillance and control, from computers that report on us to bosses, to the endless treadmill of work speedups. Technology is today a tool of capitalists, to extract ever more work out of us, to outsource work on scales never before possible and to facilitate ever faster and more numerous speculation in dubious financial instruments.
Technology in these hands also makes waging war easier — a drone operator can sit in a control room thousands of miles from the targets, safe from the carnage rained down on far-away peoples. If autonomous weaponry ever is unleashed, how could it be controlled? It couldn’t. Humanity won’t survive a third world war.
When we think of existential threats to our descendants’ world, we tend to focus on global warming, environmental degradation and the looming collapse of capitalist industrialism, of the impossibility of infinite growth on a finite planet. That is properly so, and these do seem to be the gravest challenges that will face us across the 21st century. But technology applied to perfecting military killing machines is within the human imagination. Dick conjured this at the midpoint of the 20th century and he is far from the only one.
Yes, a warning and not a prophecy. But in a world of vast inequality, of an industrial and financial elite willing to do anything, even put the planet’s health at risk, for the sake of acquiring more wealth, the potential for evil applications of technology is ever present.
One more reason, if we didn’t already have enough, to bring into being a better world, one built for human need and environmental harmony rather than private profit. We then wouldn’t need to endure a mad pursuit of fetishized technological advancement; instead we could harness technology for the greater good as necessary. Barbarism remains the likely alternative.

Americans Work Too Long for Too Little

David Rosen

American workers annually work more hours than workers in any other post-industrial, “1st world,” country.  In 2014, an average American worked 1,789 hours per year, or 34.4 hours per week, placing the U.S. 17th on the OECD’s list of developed countries; German workers rank 1st in terms of the lowest total annual hours worked, nearly one-quarter less per year at 1,371 hours, or 26.4 hours per week.
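As a quick check of those figures (the conversions are mine, assuming a 52-week year):

\[ 1{,}789 / 52 \approx 34.4, \qquad 1{,}371 / 52 \approx 26.4, \qquad 1{,}371 / 1{,}789 \approx 0.77, \]

which is consistent with German workers putting in roughly one-quarter fewer hours per year than American workers.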
In the 2016 presidential campaign, job growth and the minimum wage are major issues, but job satisfaction and the workweek are non-issues.  Many Americans feel they are living in desperate times, and it seems better to have any job, even one that doesn’t pay a living wage or isn’t fulfilling, than no job at all.  While the official unemployment rate is slowly falling, the number of those no longer looking for work is increasing and wages remain stagnant.
In the seven decades since the end of World War II, the U.S. has lived two lives.  The first life occurred during the postwar era of recovery and prosperity popularly known as the “American Century” that lasted from 1945 to the mid-1970s; it is the era that Trump invokes when he opines about “Making America Great Again.”  The second phase evolved from the mid-‘70s through today and is marked by the eclipse of the short-lived “American Century.”
The decline in the quality of working life during the last seven decades is revealed by examining four key factors: (i) changes in the length of the workweek, (ii) productivity gains, (iii) wage stagnation and (iv) the rise of personal debt.  Together, they suggest a modest – if fundamental – way to begin to address the problem.  One suggestion is to drastically cut the workweek while maintaining current wages.
* * *
For many Americans, the 40-hour workweek remains the labor standard.  According to the St. Louis Federal Reserve, at the beginning of the post-WW-II recovery, in December 1945, manufacturing workers worked 41.2 hours per week; seven decades later, in December 2015, little had changed for manufacturing workers, who worked on average 41.7 hours per week.  However, as the Fed makes clear, total workweek hours for all private sector workers declined by nearly a quarter, to 33.8 hours.
The average workweek means little in itself; its significance emerges in relation to two decisive factors, productivity and compensation.  A 2015 study by the Economic Policy Institute (EPI) found that while “net productivity of the total economy” for the period 1948 to 2014 grew by a staggering 238.7 percent, the “average hourly compensation of production/nonsupervisory workers in the private sector” grew by only 109 percent.
The EPI broke this seven-decade period into two subsets and assessed compensation and productivity accordingly: (i) 1948-1973: productivity increased by 96.7 percent and hourly compensation by 91.2 percent; and (ii) 1973-2014: productivity grew by 72.2 percent while hourly compensation increased by only 9.2 percent, roughly a 90 percent decline in the rate of compensation growth.  It found that during the postwar era of the American Dream, from 1948 to 1973, “the hourly compensation of a typical worker essentially grew in tandem with productivity ….”  However, in the four decades following 1973, productivity continued to rise but wages stagnated.
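The “90 percent” figure follows directly from the EPI numbers just cited (the arithmetic below is mine):

\[ \frac{91.2 - 9.2}{91.2} \approx 0.90, \]

that is, the pace of hourly compensation growth in 1973-2014 was only about a tenth of what it had been in 1948-1973, even as productivity kept rising.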
During this long postwar era, as the St. Louis Fed detailed in a 2012 report, consumer spending grew almost continuously as a proportion of GDP:
* 1951-1960 = 62.3%
* 1961-1970 = 61.8%
* 1971-1980 = 62.5%
* 1981-1990 = 64.5%
* 1991-2000 = 67.3%
* 2001-2010 = 70.0%
In conclusion it warned:  “Can American consumers continue to serve as the engine of U.S. and global economic growth as they did during the recent decades?  Several powerful trends suggest not, at least for a while.”
How was consumer spending able to increase while wages stagnated?  The magic of postwar American life was debt.  Secured installment loans, including mortgages and car loans, predated the war; unsecured loans, including credit cards, student loans, payday loans and lines of credit, followed.  And debt skyrocketed more than 65-fold: between 1952 and 2015, per-person debt jumped from $160 to $10,600 – and this during a period when the U.S. population merely doubled, from 156 million (1952) to 319 million people (2015).
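The multiple is easy to verify from the figures above (the calculation is mine):

\[ \frac{\$10{,}600}{\$160} \approx 66, \]

a roughly 66-fold increase in per-person debt over a period in which the population only doubled.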
* * *
In 1930, the British economist John Maynard Keynes predicted that within 100 years the average workweek would drop to only 15 hours.  His forecast was based on a projection of modest global economic growth of about 2 percent per year.  According to one scholar, Keynes believed that “in a world with so much wealth, we would naturally choose to increase our leisure time rather than simply accumulate additional wealth.”
Capitalism outsmarted Keynes.  While economic growth exceeded Keynes’ modest projection, it fostered a postwar world in which people in the U.S. were seduced by all the sexiness, glitter and false consciousness of consumerism.  And while wages stagnated, people were enslaved by ever-mounting debt.  The year 2030 is only 14 years away, but it does not look like the 15-hour workweek is in anyone’s future.
It’s time to readjust the traditional relations between productivity, the workweek and wages.  In 2000, Eric Rauch wrote, “An average worker needs to work a mere 11 hours per week to produce as much as one working 40 hours per week in 1950.”  He adds: “if productivity means anything at all, a worker should be able to earn the same standard of living as a 1950 worker in only 11 hours per week.”
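Rauch’s claim rests on an implicit productivity multiple of roughly 3.6 since 1950 (this back-of-the-envelope step is mine, not spelled out in his piece):

\[ \frac{40 \text{ hours}}{\approx 3.6} \approx 11 \text{ hours}, \]

i.e., if each hour of work now produces about 3.6 times as much as an hour did in 1950, eleven hours suffice to match a 1950 worker’s 40-hour output.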
Globalization is restructuring capitalism and, with it, the U.S. economy.  It is fueling the rise of inequality, refashioning social relations and increasing the wealth and power of the 1 percent.  It is also transforming work-life.
So why not rethink the relation between the workweek and compensation?  The push for the $15 per hour minimum wage is a noble effort, one that brings real benefits to the lowest sector of the working class.  Switzerland’s failed effort to provide a basic income of about $2,500 a month (2,500 Swiss francs) suggests a new way to think about income; Andy Stern, the former SEIU president, recently suggested a U.S. version, but for about $1,000.
While well intentioned, these proposals don’t go far enough.  One way to secure the benefits of the enormous productivity gains that have taken place since 1975 is to cut the workweek without cutting wages.  What if manufacturing workers currently (2015) working 41.7 hours per week had their workweek cut to, say, 20 to 25 hours but kept the same salary?  Similarly, what if private sector workers working 33.8 hours had their workweek cut to 15 to 18 hours at the same salary?  The business sector could take full advantage of productivity gains without having to increase wage expenses.
Such a scheme is, of course, utopian – and intentionally so.  But maybe that’s what’s needed in a time marked by dire predictions as to the nation’s economic future and the lack of real political imagination.

The Inhumanity Of Brexit

Binoy Kampmark

While the Brexit debate has become a matter of colliding blocs of speculators and crystal ball gazers, a glaring omission has come to the fore.  Virtually nothing has been said about the role played by human rights, about Britain’s part in building up and inspiring the European Convention on Human Rights, or about the issue of citizenship.
In that sense, Brexit mirrors broader European failures: the excision of the human experience from broader managerial and corporate arguments.  In its stead is a reflection about what the price of camembert cheese might be in a post-Brexit regime, or wine for that matter.  This is middle class snobbery run wild, a fear born of comfort rather than crisis.
The crude economic arguments speak, in many ways, to European problems rather than strengths.  The broader human issues are neglected in favour of those of the purse: assets, bank balances and trade.  This has made the Remain campaign vulnerable in its sterility.
As if to emphasise this very point, currency transfer sites are limiting, if not suspending, operations.  Transferwise, for one, is suspending its service during the course of the Brexit vote.
No matter, argues the Leave campaign front man and former London Mayor, Boris Johnson.  Britain, he argues, would be able to “prosper mightily” outside the bloc.  He gives no examples of how, and avoids the bolstering effect Europe has had on Britain’s economy.  But even more seriously, he avoids the humane aspect of the European regime, and the modifying effects of the convention on British jurisprudence.
His critics, taken aback by his surging success, can only resort to personal invective. Arguments on the human side and the European legacy have been left behind by pomposity.  “The Leave campaign,” argued former conservative leader William Hague, “is really the Donald Trump campaign with better hair.”
Evidence is less important to Johnson than faith. “Our campaign,” he assures voters, “is about belief.  It is about trusting the instincts of the British people, trusting in our democracy, trusting in the institutions that have evolved over a long time.  Our campaign is about accountability.”[1]
In and of itself, the argument about accountability and self-reliance is one that resounds across Europe.  Never mind that much of it is, as expressed by the Leave campaign, simply an argument for doing what one damn well pleases, be it paying lower wages or eroding working conditions in the name of profit.
There are countries (France, the Netherlands) where the EU fares even worse by reputation than it does in Britain, and there, the issues of “accountability” and “self-reliance” also feature.  But Johnson’s statements resemble those of autarchic ambition.
The crudest arguments of all have come from parts of the Leave campaign, haloed by a less than holy crown of terrifying promises should Britain actually retain its current arrangements.  Not leaving now, while things are moderately bad, will supposedly lead to something infinitely worse.
Nigel Farage of the UK Independence Party has been so vehement in this campaign that he has become a caricature.  With characteristic indifference to the facts, Farage was happy to be photographed before a poster titled in bold capital letters “Breaking Point”.  Few would dispute that the “EU has failed us all”; more would dispute the idea of using a stream of Syrian refugees to demonstrate the point.
Maps have been produced by the thousands promising a surge of immigrants from countries wishing to be admitted to the EU.  In a spike of xenophobia, and selective thinking, Britain has become the exemplar of fractured Europe.
The good of Europe has been lost in favour of parochialism without oversight.  The EU is treated not as a project worth shaping and saving, but as one worth abandoning.  “People feel at the moment,” asserts Johnson, “that nothing ever changes in politics.  That is partly because so much is governed centrally from Brussels.”
Those on the left who should have been guarding the sacred flame of Europe’s benefits have been conspicuously absent in that regard.  The champagne set have taken the argument away from the working individual, who can actually thank the EU for working standards and security.
The absurd premise of pure British indigenousness and exceptionalism demands a good deal of scoffing rebuke.  But when it comes out of the mouth of Johnson, it sounds different, striking an idealistically mellow note.  He offers a vision without substance, while the Remain campaign has offered what it thinks is substance without vision.  As Britons go to the polls, the difference will be made by the undecided, whose minds will be made up as the mark is made on the ballot paper.

Big Brother’s Virtual Reality

Linh Dinh

A billboard for Comcast pitches a lineup of “reality” shows, with this caption, “Recommended for you. Because real reality is boring.”
In contemporary America, real reality is also less real than Big Brother’s cartoony version. While we’re driving, walking, at work, lying in bed or even in the bathroom, Big Brother dictates what we know. Big Brother’s hypnotic screens have become our arbiter of reality.
What captures us most is not the propaganda images, which can be ludicrously unconvincing, but the voice-over. If Big Brother says it, it must be true.
Unlike in Orwell’s 1984, Big Brother does not speak to us in one stern voice. Deploying many puppets, Big Brother will humor, cajole and even uplift. Despite apparent divergences of opinions or even contradictions, Big Brother’s essential messages will always come through most clearly and persistently.
Hillary Clinton and Bernie Sanders, for example, both speak for Big Brother. While Clinton compares Vladimir Putin to Adolf Hitler, Sanders states, “To temper Russian aggression, we must freeze Russian government assets all over the world, and encourage international corporations with huge investments in Russia to divest from that nation’s increasingly hostile political aims.”
Russia is a threat to world peace, they both agree. In boring reality, however, it is the United States that has surrounded Russia with missiles, staged provocative war games on Russia’s borders and pushed Georgia and Ukraine into wars with Russia.
Donald Trump, on the other hand, is used by Big Brother to fan hatred and paranoia of Muslims. From 9/11 to the Orlando Shooting, every “Muslim” terror attack on American soil has been framed and narrated, with no real evidence, by Big Brother. As with the Boston Bombing, Portland Christmas Tree Plot and the Shoe Bomber Plot, etc., Big Brother has either planned and coached the alleged terrorists, or had foreknowledge of them.
In boring reality, the US has also attacked Afghanistan, Iraq, Libya, Somalia, Sudan, Yemen, Pakistan and Syria. Waging war on all these Muslim countries, the US has killed millions and generated millions more in refugees. According to Big Brother, however, the US is not a most brutal and systematic assailant of Muslims, but their hapless target.
Killing Muslims and stealing their land, Israel has also painted itself as a civilized, dignified and unbelievably restrained victim of barbaric Muslim terror.
Without Israel, the US would not be killing and demonizing Muslims endlessly, nor would it suffer these terror attacks pinned on Muslims. Without Israel, not just the US but the entire world would be much more peaceful.
No one remembers the Portland case. The FBI prevented Mohamed Osman Mohamud from working in Alaska so that he could be snared into a terror plot concocted by the FBI. A bomb-loaded van, provided by the FBI, was to be detonated to kill kids celebrating Christmas around a downtown tree.
Though the FBI had recorded and filmed Mohamud extensively, it could not produce the one tape where Mohamud supposedly incriminated himself. Big Brother tells us the recorder had a malfunction at that exact moment, and we don’t keel over laughing, such is our divorce from common sense.
After the Orlando Shooting, we have preposterous testimonies by purported survivors, witnesses and relatives of victims. Angel Colon was supposedly shot once in the hand, once in the hip and three times in one leg, all from an assault rifle at close range. On top of this, Colon was supposedly dragged over broken glass so that his back and buttocks were all cut up. Just three days later, however, Colon had recovered well enough to be presented to the press to recount his ordeal in amazing detail. Though lying face down and pretending to be dead, and in a darkened room, no less, Colon could see that the gunman was aiming for his head when Colon’s hand was hit. He also witnessed the woman lying next to him being killed. This sort of eidetic recall is typical of those whose only knowledge of shootings comes from movies. Colon has also been broadcast from his hospital bed, smiling and lifting, quite casually, his supposedly shattered leg.
Supposed witness Luis Burbano claims to have seen a bullet sticking out about five inches from a victim’s leg. Even unfired, an AR-15 round is just over two inches long, Luis.
Filmed repeatedly, from the very night of the shooting, Christine Leinonen is seen making all sorts of crying sounds and faces, but without actually shedding a tear. While her son was still unaccounted for, Leinonen blubbered, “This is a club that nobody wants to be in. Please, could we do something with the assault weapons so that we could stop this club from ever getting any new members? I beg all of you, please. I want you to know about my son. When he was in high school, he started the Gay-Straight Alliance, and he won the humanitarian award to bring gays and straights together. I’ve been so proud of him for that. Please, let’s all just get along. We’re on this earth for such a short time. Let’s try to get rid of the hatred and the violence, please!”
The Huffington Post deems Leinonen’s discharge “a powerful and devastating plea for more gun control.” Less prone to cow pie, Lucius Nonesuch, a commenter at Unz Review, observes, “It’s not the language of anyone’s everyday speech, nor is it anything like good literary language—it’s more like a bad MFA creative writing student’s verbiage. These ‘average’ people use impossibly bad metaphors and unbelievable figures of speech—the supposed mother above has now “joined a club,” the club of parents who have lost a child to gun violence—how does this dolt of a woman come up with this trite nonsense on her own? It is not her language, it is the language of a Bolshevik Communications Major. Undeniably bad language—and yet completely not right coming from her mouth.”
Thank you, Lucius Nonesuch. If more Americans were like you, we wouldn’t be hoodwinked endlessly by Big Brother. As is, we’re doomed.