One of the most pervasive pathologies of the wi-fi-enabled, smartphone-strapped, “globally competitive marketplace” world we currently live in is the constant, rushed sense that there is simply no time. Where previous millennia made calendars, and the Industrial Age brought us the clock-regulated day and the hourly wage, we now rush ourselves from pillar to post on schedules planned minute-by-minute. Doug Rushkoff calls our current condition “Present Shock,” an obsession with micro-time that diverts our attention and energy from the genuine human goods of life, especially since the last thing on anyone’s mind is getting the full eight hours of sleep we all know the doctors recommend.
It would seem a good thing, then, that the multi-billion dollar coffee industry has ensured a mocha on every street corner, and fair-trade beans in every pot. Even more helpful, energy drinks like Red Bull, Monster, and Five-Hour Energy concentrate the caffeine further, so that we can be back at, or even above, our normal functioning despite our perpetual sleep deprivation. For those in school at one level or another, prescription stimulants like Adderall and Ritalin make the rounds to enable all-night study sessions. In such a competitive culture, it might even be seen as irresponsible to throw those extra hours away when they can be chemically substituted so effectively.
Science is starting to bring a rude awakening to the sleep-sparing, however. As Maria Konnikova notes over at the New Yorker, “While caffeine has numerous benefits, it appears that the drug may undermine creativity more than it stimulates it.” Caffeine boosts energy without a doubt, and blocks a brain chemical that would normally inhibit other brain activity, “aiding short-term memory, problem solving, decision making, and concentration.” All good so far, but Konnikova quickly notes that
much of what we associate with creativity—whether writing a sonnet or a mathematical proof—has to do with the ability to link ideas, entities, and concepts in novel ways. This ability depends in part on the very thing that caffeine seeks to prevent: a wandering, unfocussed mind. [sic]
Caffeine blocks “the parts of our brain that are more active when we’re at rest” and which “become activated right before we solve problems of insight. Caffeine prevents our focus from becoming too diffuse; it instead hones our attention in a hyper-vigilant fashion.” This extends far beyond the creativity associated with the painter or the poet. Many of our greatest scientific or mathematical discoveries took place not in the lab, or at the chalkboard, but in moments of mental leisure, the proverbial eureka in the shower. And the highest performers across all fields may want to take special note: some research has found even prescription stimulants to taper off in their enhancing effects towards the high end of the bell curve; indeed, they may even start retarding the cognitive powers of the most naturally gifted.
For those who, like Jacques Barzun, swear by the enhancing qualities of caffeine, fear not: Konnikova notes a hopeful solution. “Some research has found that attributes like increased alertness and focus can be replicated by the placebo effect.” Studies substituting decaf for regular coffee have found the benefits of caffeine to show up only among those who believe they have consumed regular coffee, regardless of the true content of their drink.
And while many may feel that they don’t have nearly the need for brilliant insights that they do for those extra few hours of activity, Jane Brody of the New York Times has an unwelcome recounting of all the health maladies that insufficient sleep is linked to, including diabetes, obesity, depression and substance abuse, and ADHD; she even notes that “The cognitive decline that so often accompanies aging may in part result from chronically poor sleep.”
Those who convince themselves that a five-hour sleep paired with a five-mile run is the key to a healthy and productive life may be fooling themselves, then, and may ultimately be sacrificing countless subtle improvements in their quality of work and life for the sake of a small quantity of time.
It’s a strange variation on a common theme in post-revolution Egypt: the country’s burdensome laws against blasphemy are being used to punish anti-Christian hate speech.
A hard-line Muslim cleric received an 11-year suspended sentence Sunday for tearing up and burning a Bible, Egypt’s official news agency said.
Cairo’s Nasr City court sentenced Ahmed Abdullah, while his son was given a suspended sentence of eight years over the same incident, the Middle East News Agency reported. The two were ordered to pay a fine of 5,000 Egyptian pounds ($700). The ruling can be appealed.
Abdullah ripped up a Bible and burned it during a Sept. 11 rally by ultraconservative Salafi Muslims in front of the U.S. Embassy in Cairo, protesting an anti-Islam film produced in the United States. (AP)
In Egypt’s Islamist tilt, these laws have increasingly been applied against Egypt’s Coptic Christians, a religious minority comprising about ten percent of the country. Earlier this month a Coptic Christian lawyer, Rumany Mourad, was sentenced to one year in prison for “defamation of religion” on the basis of a private conversation he had at a law library with two of his Muslim colleagues. Hearings in the case were reportedly “characterized by a heavy presence of Islamist lawyers and their supporters,” one of whom suggested the death penalty, reports Amnesty International.
Last Tuesday, an elementary school teacher, Dimyana Obeid Abd Al Nour, 24, was fined US$14,000 after her students accused her of praising the Coptic Pope and disparaging Mohammed in the classroom.
A Coptic activist asked at the time of Al Nour’s sentencing, “Why is defamation of religion a one-way street, only for the benefit of the Muslims, while Christianity is defamed every day?” He pointed out that Ahmed Abdullah’s public Bible defamation had gone unpunished.
His question is fair, but it is not the right question. With Abdullah’s conviction, Egypt’s blasphemy laws have been used, for once, to protect Christians from hate speech rather than to censure them, but this is no cause for celebration. Blasphemy laws themselves, and not their application, are the problem.
“This ruling is bad,” says Nina Shea, a Hudson Institute scholar who has written a book about blasphemy laws. “The whole blasphemy regime is bad. Minorities get prosecuted disproportionately, and it’s a way of shutting down debate. You could say, ‘Well, burning Bibles, burning Korans should be off limits.’ It never seems to end there. It’s a slippery slope towards banning ideas about religion and expressing rejection of religion.”
“It’s tempting for religious people to be demanding,” she says, noting that as a religious person she finds Abdullah’s actions abhorrent. “That’s the problem, though—it creates sectarian sense of grievances.”
“They think that they can gain greater social peace if the government regulates speech against other religions,” she explains. “Usually that is not the case—just the opposite, it creates jealousy and grievances.” When one religious group sees a member convicted of blasphemy, she explains, it can use that precedent to call for the prosecution of another group.
Moreover, once the government takes a role in regulating religious expression, it rarely sticks to policing the extremes. “The temptation is always to go further to curtail speech and expression,” explains Shea. “You can’t contain this once you go in that direction.”
As tempting as it may be for Egyptian Christians to feel relief at receiving seeming equal protection under the law, no one should praise this ruling. The equal prosecution of blasphemy is at once far too low, and impossibly difficult, a standard to keep.
A spectre is haunting America’s war party. Last week, Iranians went to the polls and surprisingly and unambiguously voted for the most moderate candidate, Hassan Rohani, an establishment cleric who campaigned on the need to improve Iran’s economy and end its diplomatic isolation. Rohani is the vehicle for disparate hopes—not least those of the Green movement, suppressed after the 2009 election. One should note the weaknesses of an electoral system where prospective candidates are vetted by the government, but there is little doubt that Rohani’s victory represented a vast outpouring of popular discontent—and raised prospects for an eventual détente between Iran and the West. In their way, the peoples of Iran and the United States have both spoken—in America by rejecting the more belligerent candidacies of first McCain and then Romney in favor of Obama; in Iran by choosing, probably in 2009 and certainly last week, the least confrontational candidate available.
Rohani threatens to deny the war party their cartoon image of an Iranian “Hitler,” one which had been painstakingly, if dishonestly, constructed from the undisciplined and belligerent musings of his populist predecessor, Mahmoud Ahmadinejad. Rohani doesn’t have time to open his mouth before Jonathan Tobin of Commentary warns us that Iran remains a “totalitarian theocracy” and Obama better not “waste more time on sanctions and diplomacy” in an effort to end Iran’s nuclear program. (Tobin fails to explain that rather unique form of totalitarianism which allows meaningful competitive elections, nor does he mention which country in the Middle East has introduced to the region a huge nuclear arsenal.) Max Boot, also at Commentary, reminds readers that power in Iran rests with the Supreme Leader, not with the president, an interpretation of Iranian political dynamics not stressed when Ahmadinejad was president. Jeffrey Goldberg chimes in that the Iranian election was “fake.” Tobin rails about “useful idiots”—the Times editorial board in this instance—who prefer diplomacy to war. But one can sense the fear in the neocons: the broad spectrum of Western opinion is inclined to think the Iranian election result might be a good, not a bad, thing. One can be sure a vast research enterprise is underway to find a quote from Rohani’s past that expresses something other than sheer joy at Israel’s dispossession of the Palestinians. An editor at the war-hungry Wall Street Journal is already accusing Rohani of encouraging the murder of dissident students in the 1990s.
The panic reminds me of the one which pulsed through neoconservative ranks during the emergence of Gorbachev. Then the situation was more ambiguous—the Soviets didn’t allow elections. But the neocons were unanimous (or nearly: Joshua Muravchik was a notable, and solitary, exception) in presenting Gorbachev as a greater threat than previous leaders because he seemed moderate, seemed to desire the turning of bad pages and the exploring of new possibilities. It was a core neoconservative tenet that Soviet totalitarianism was incapable of reform and forever on the march, and that in selecting Gorbachev the Soviets had found a clever new tool to lull and trick the West. Norman Podhoretz published one column—I recall struggling to write an appropriate headline for it—devoted to the Soviet leader’s devilish and mendacious smile. The danger of course was that Ronald Reagan would drop his guard, which he did, finding Gorbachev’s desire to move past the Cold War altogether credible.
In holding this election, the mullahs’ regime in Iran, with all its obvious brutality and structural flaws, has already proved itself more “democratic” than the dictatorship the United States imposed upon Iran for a generation after 1953. Not surprisingly, many Iranians remember this. I don’t know whether Obama has the fortitude to explore the Iranian people’s peace overture—because it is they who made an unambiguous election choice—or whether he will bow to various Beltway hawks. But the existence of popular will on both sides for something other than continued confrontation seems impossible to deny.
And on the nuclear issue: it seems to me that making it a core value of American policy that Israel should have hundreds of nuclear weapons while its regional neighbors lack even the right to enrich uranium will always be perceived as inherently unjust, and thus inherently unstable. Margaret Thatcher, expressing frustration at Israel’s efforts to stonewall the peace process, once told a Times interviewer, “[Y]ou cannot demand for yourself what you deny to other people.” The same principle can be applied to Israel’s and Iran’s respective nuclear programs.
Facing a series of social and academic woes, the city of Oakland is trying a new reform strategy. The Atlantic published an article on Monday detailing how, through its “community schools” initiative, the school district hopes to offer a more holistic education.
“Community schools,” according to The Atlantic and the Coalition for Community Schools, aim to reach beyond the realm of academics and foster students’ growth through “school-community partnerships.” Rather than one all-encompassing initiative or program, a community school focuses on addressing social issues on an individual level: “There’s no one model for community schools. Advocates say each school reflects the particular needs of its students and parents.”
Each school has a “campus resource center” with nurses, therapists, and social workers. The centers provide counseling and support for needy students. Although teachers had mixed reviews, an East Oakland high school principal said these new services have already reduced suspensions.
This is not the first time Oakland has tried a “smaller is better” education reform policy: in 2000, administrators tried breaking 12 large schools into 48 small ones “in order to nurture closer relationships between students and faculty.” Although the effort made promising gains at first, it failed by 2007. One teacher reported “a campus full of disengaged students, an excess of administrators and a cumbersome bureaucracy.”
In contrast, The Atlantic report indicates that community school initiatives have achieved marked success: “Cincinnati, one of the pioneers of the current community schools push, has seen higher test scores and graduation rates since beginning … Community schools in New York, Chicago and other California cities demonstrated improvement on test scores, better attendance and reduced dropout rates compared to traditional schools.”
Barack Obama has just taken his first baby steps into a war in Syria that may define and destroy his presidency.
Thursday, while he was ringing in Gay Pride Month with LGBT revelers, a staffer, Ben Rhodes, informed the White House press that U.S. weapons will be going to the Syrian rebels.
For two years Obama has stayed out of this sectarian-civil war that has consumed 90,000 lives. Why is he going in now?
The White House claims it now has proof Bashar Assad used sarin gas to kill 100-150 people, thus crossing a “red line” Obama had set down as a “game changer.” Defied, his credibility challenged, he had to do something.
Yet Assad’s alleged use of sarin to justify U.S. intervention seems less like our reason for getting into this war than our excuse.
For the White House decided to intervene weeks ago, before the use of sarin was confirmed. And why would Assad have used only tiny traces? Where is the photographic evidence of the disfigured dead?
What proof have we the rebels did not fabricate the use of sarin or use it themselves to get the gullible Americans to fight their war?
Yet why would President Obama, whose proud boast is that he will have extricated us from the Afghan and Iraq wars, as Dwight Eisenhower did from the Korean War, plunge us into a new war?
He has been under severe political and foreign pressure to do something after Assad and Hezbollah recaptured the strategic town of Qusair and began preparing to recapture Aleppo, the largest city.
Should Assad succeed, it would mean a decisive defeat for the rebels and their backers: the Turks, Saudis and Qataris. And it would mean a geostrategic victory for Iran, Hezbollah and Russia, who have proven themselves reliable allies.
Members of a heretofore independent panel on Gulf War Illness are accusing Veterans Affairs Secretary Eric Shinseki of “shooting the messenger” by gutting their committee and slashing half its members in a recent charter rewrite.
A member of the Research Advisory Committee (RAC) on Gulf War Illness told The American Conservative over the weekend that Shinseki was retaliating against them for their unvarnished, public criticism of the agency—in the press and on Capitol Hill. Most recently, members Anthony Hardie, a Gulf War veteran and advocate for the estimated 250,000 vets suffering with Gulf War Illness (GWI), and Dr. Lea Steele, a longtime GWI researcher, testified with former VA scientist Steven Coughlin on the Hill. Both RAC members complained that bureaucrats and researchers in the agency were driven by an agenda that preferred viewing GWI as a psychological rather than physical condition.
TAC interviewed Coughlin and Hardie after the hearing. Coughlin said his bosses manipulated and ignored data that did not coincide with their agenda. Hardie concurred, saying that the RAC had been forced to deal with this VA bias for some time and that complaints about it had been ignored. In 2008 for example, the committee released a report saying that GWI was a physical condition caused by toxins, including pesticides and the pills that the soldiers were given to counteract the effects of nerve gas. Since then, committee members have accused the VA of trying to undermine their findings. (The VA’s critics say it is trying to avoid the massive expense of liability, a charge the VA has adamantly denied. Officials have also denied that the VA is trying to push the psychological explanation over the physiological one.)
The damage done to the 15-year-old RAC last month by Shinseki’s hand might forever take the teeth out of the scrappy committee, which is supposed to convene for a regular meeting this week in Washington. The changes to the RAC charter would ax six of its 12 members and replace them “in accordance with VA policy,” according to a letter to RAC chairman James Binns signed by Shinseki’s interim chief of staff, Jose Riojas. The letter was provided to reporter Kelly Kennedy, who wrote about it at USA Today on Friday. The measure also removes Binns—whom the committee called their “principled, fair, just, non-partisan, longstanding champion” of veterans—after a one-year “transition period.” The letter does not identify which other members will have to go.
With the release of the new biopic “Hannah Arendt,” about the political philosopher’s coverage of the Eichmann trial in Jerusalem, you can expect to be hearing a lot of Arendt’s concept “the banality of evil.”
Arendt famously saw in Adolf Eichmann (one of the key logistical organizers of the Holocaust) not a raging anti-Semite who delighted in murder, but a pencil-pusher who became a workaday tool of genocide merely by unreflectively and diligently following orders.
Critics of Eichmann In Jerusalem believe that Arendt, a great thinker but incompetent court reporter, was duped by Eichmann: Eichmann was in fact a racist true-believer, as were thousands of his countrymen, who did not “blindly” follow orders but became “Hitler’s willing executioners.” Furthermore, Ron Rosenbaum has urged the abandonment of the banality of evil on more general grounds: it denies the reality of conscious, willful, knowing evil.
But apart from the specifics of Eichmann and the Holocaust more generally—a still-raging debate I dare not touch—Barry Gewen reminds us why the “banality of evil” is in fact an important concept, a call to action:
Arendt’s approach was unyieldingly universalistic. Her analysis of Eichmann was a demand for individual responsibility, an insistence on the need constantly to exercise personal choice, whatever society might dictate. This is a cold ethic, as severe as Kant’s, so difficult it has a quality of the inhuman about it. For who among us can maintain the unceasing moral awareness she calls for? [emphasis mine]
The citizens of Le Chambon-sur-Lignon, a largely Huguenot village, rescued over five thousand Jews from Eichmann’s ilk in occupied France. But this was by no means inevitable: when the town’s church community tried to secure promises to help the anticipated stream of refugees, the townspeople largely refused. As James C. Scott recounts in Two Cheers for Anarchism (reviewed for TAC here), they only changed their minds when the Jews began to arrive:
The pastors’ wives found themselves with real, existing Jews on their hands, and they tried again. They would, for example, take an elderly Jew, thin and shivering in the cold, to the door of a farmer who had declined to commit himself earlier, and ask, “Would you give our friend here a meal and a warm coat, and show him the way to the next village?” The farmer now had a living, breathing victim in front of him, looking him in the eye, perhaps imploringly, and would have to turn him away…
Once the individual villagers had made such a gesture, they typically became committed to helping the refugees for the duration. They were, in other words, able to draw the conclusions of their own practical gesture of solidarity—their actual line of conduct—and see it as the ethical thing to do. They did not enunciate a principle and then act on it. Rather, they acted, and then drew out the logic of that act. Abstract principle was the child of practical reason, not its parent.
Francois Rochat, contrasting this pattern with Hannah Arendt’s “banality of evil,” calls it the “banality of goodness.”
The pastors’ wives answered Gewen’s question: Who can maintain unceasing moral awareness? None of us—our moral reasoning fails us constantly. We need, it would seem, to have our neighbor constantly put before us, his suffering shown to us.
In a rapid response to Rep. Paul Ryan’s convention speech last August, I wrote:
In Ryan’s intellectual bubble, there are job creators and entrepreneurs on one side and parasites on the other. There is no account of the vast gray expanse of janitors, waitresses, hotel front-desk clerks, nurses, highway maintenance workers, airport baggage handlers, and taxi drivers. They work hard, but at the end of the day, what can they be said to have “built”?
In a speech late last week, former Sen. Rick Santorum did me one better. He remarked of the very same convention at which Ryan spoke:
One after another, they talked about the business they had built. But not a single—not a single—factory worker went out there. … Not a single janitor, waitress or person who worked in that company! We didn’t care about them. You know what? They built that company too!
Apparently, Santorum and I have a thing for janitors and waitresses. More importantly: They built that company too!
This is something of an intellectual breakthrough for a high-profile Republican.
At a gut level, most GOPers, including most especially the one who lost the 2012 presidential election, apply a rough sort of common sense to economic outcomes: people help themselves. Government may justifiably step in to come to the aid of those who can’t. Any market interference on top of that is an election-rigging “gift.”
Facebook announced this week that it is adopting the hashtag – supposedly to “help people more easily discover what others are saying about a specific topic and participate in public conversations.” Some writers herald this as a momentous occasion.
But though hashtags once served a constructive purpose (and still can, when used judiciously), most have devolved into meaningless quasi-expressions – and are even infecting our verbal language and grammar.
Chris Messina is credited with having invented the hashtag in August 2007; it was designed to “gather discussions and online exchanges” on Twitter. By prefixing a word or sentence with the pound sign, users created a searchable metadata tag.
Twitter founder Evan Williams initially thought hashtags were too technical to become popular; unfortunately, they’re now more popular than technical. While created as a “tag” for communities (#StudentsForObama), emerging debates (#StandWithRand), or historic events like #TahrirSquare, the hashtag has largely devolved into a cacophony of meta-communication. As Sam Biddle writes,
Hashtags at their best stand in as what linguists call “paralanguage,” like shoulder shrugs and intonations. That’s fine. But at their most annoying, the colloquial hashtag has burst out of its use as a sorting tool and become a linguistic tumor—a tic more irritating than any banal link or lazy image meme.
Instagram and Twitter are bombarded with hashtag gibberish every day. Many are primarily designed for self-focused clamor. Some examples: #likeforlike, #likeforalike (in case #likeforlike doesn’t cover it), #likeitup, #liketeam, #followme, #followforfollow, and #teamfollowback.
With each hashtag added, the user’s post theoretically reaches a larger number of people. But mostly they just get in the way of communicating in the first place. Often, even “context”-creating hashtags are meaningless and self-aggrandizing (#happy, #ilovemylife, #cutestboyfriendever, etc.).
The Times on Thursday ran a lengthy profile of Anthony Weiner, former very liberal congressman and staunchly right-wing Zionist. The paper depicts him, basically, as a jerk. The piece doesn’t mention the sexting scandal that drove him from office two years ago but casts a baleful eye on his achievements as a politician. Weiner is now running for mayor of New York and stands second in the polls, surging in a weak Democratic field. He seems to have more political energy than his rivals. As a congressman, Weiner was a relentless attention seeker, wonderful at getting TV camera time, weak in actual legislative achievements, even liberal ones.
Surprisingly the Times can find no one among his peers with much good to say about him. He comes across as a caricature of driven selfishness, abusive of his staff, demanding that airline flights be rescheduled to fit his convenience, running traffic lights to reach events, heedless of any issues besides those which can benefit him politically or financially.
I saw Weiner in action once, at a debate on Israel, Palestine, and the attack on Gaza a little over two years ago at the New School in Manhattan. He was debating Brian Baird, the now retired congressman from Washington state who distinguished himself by visiting Gaza after Israel’s first assault in 2008 and describing on Capitol Hill the damage American weapons had inflicted on schools and homes. (I hope to see Baird, who is brave and thoughtful, emerge in some other public role soon).
Weiner was something else. He stunned the audience, and no doubt pleased his supporters, by making the most hard-right Zionist claims one could imagine. He claimed there was no Israeli occupation of the West Bank; he claimed Israel’s eastern border was the Jordan River. He wasn’t smooth or even educated on the subject; there was no phony hasbara about how he really desired a Palestinian state if the Palestinians only had better leadership. He simply claimed all the land for the Jews, Palestinians be damned.