Archive for October, 2010

Brother from the Richmond Planet

Posted by jcmaziquemd on October 31, 2010



It played out like a John Ford Western—the hero packing a pair of pistols and walking the streets of the dusty town in search of justice and the man who’d threatened his life. But instead of John Wayne in the lead role, the protagonist in this real-life case was a courageous, twenty-two-year-old, African-American firebrand named John Mitchell Jr. As editor and publisher of the African-American weekly newspaper the Richmond Planet, Mitchell was not content to sit in his office—actually, his boardinghouse attic quarters, which doubled as his newsroom at that time—in the face of injustice.

A lynching had taken place at a crossroads in Charlotte County in rural Southside Virginia in May 1886, an event brushed aside in the white press, but taken up in a blistering editorial by Mitchell in the Planet. In response, the journalist received a threatening—and anonymous—letter from Southside with a skull and crossbones on the envelope and the following message: “If you poke that infernal head of yours in this county long enough for us to do it we will hang you higher than he was hung.”

Image: John Mitchell Jr.
—P.77.8-John Mitchell Jr., Valentine Richmond History Center

Mitchell printed the letter in his newspaper and added his own response, which he based on a quote from Shakespeare: “There are no terrors, Cassius, in your threats, for I am armed so strong in honesty that they pass me by like the idle winds, which I respect not.” He traveled to the scene of the barbaric crime—walking five miles in plain sight to get there—then strolled around the neighborhood and visited the jail from which the black man had been kidnapped. All the while wearing a pair of Smith & Wesson revolvers. “The cowardly letter writer was nowhere in evidence,” Mitchell later reported.

The young crusader fought against the lynching of both African Americans and whites (though blacks far outnumbered whites as victims of that crime), and he protested against unjust sentences that were being meted out to black prisoners. Mitchell quickly made a name for himself with his daring deeds and became known as the “Fighting Negro Editor” who would gladly “walk into the jaws of death to serve his race.” It was his job, he said, “to howl, yes howl loudly, until the American people hear our cries.”

Mitchell was fearless and had a flair for the dramatic. When he found out that a fifteen-year-old black boy (whom the governor himself would later describe as “very young and weak-minded”) was facing execution in Chesterfield County, Virginia, for allegedly raping a white girl, the editor claimed it would be a “disgrace to the commonwealth” to execute such a young man—regardless of his color. He managed to track down the governor, Fitzhugh Lee (nephew of Robert E. Lee), who was vacationing in the mountains two hundred and fifty miles away, and convince him to issue a stay of execution.

On the eve of the rescheduled hanging, Mitchell once again wrested a reprieve from the governor, then had to make a mad dash in the middle of the night by horse and buggy to deliver the news to the sheriff before dawn. If he got there too late, the boy would be dead. Mitchell arrived just in time. He then interviewed the young prisoner in the jailhouse, describing in the pages of the Planet the pathetic scene he witnessed: the barefooted prisoner, his ankle chained to the stone floor, who was, in Mitchell’s words, “the picture of sadness.” The editor promised the boy, “I’ll go to Richmond and fight for you until the last moment. If I win you will see me again. If I lose you will see me no more.” According to the published account, even the white jailer welled up with tears. A cartoonist as well as writer, Mitchell included sketches in the newspaper that captured the grim details: the gallows that were already in place, the hood the prisoner would have worn, and even the coffin intended for him. Mitchell’s impassioned support riveted the black community throughout the state, and he eventually succeeded in getting the young man’s sentence reduced.

From 1884 until his death in 1929, Mitchell used his newspaper as a vehicle to awaken the conscience of both blacks and whites to the reality of racial injustice. He was a tireless gadfly during an era of lynching and the subsequent disenfranchisement of black voters at the turn of the century. Lynching reached its peak in the South in the early 1890s, when, on average, a black man was hanged every other day for alleged crimes that ranged from rape and murder to hitting a white man or even just writing an insulting letter. The act of lynching had become a grotesque public spectacle with wide swaths of the white community turning out to watch as the victims—many of them innocent of any crime—were tortured, mutilated, and even set afire. Mitchell reported on these atrocities, regularly listing the names of the victims beneath a drawing or photograph of a lynching. Early on he advertised the paper as follows:

Do you want to see what the Colored People are doing? Read the Planet. Do you want to know what Colored People think? Read the Planet. Do you want to know how many Colored People are hung to trees without due process of law? Read the Planet. Do you want to know how Colored People are progressing? Read the Planet. Do you want to know what Colored People are demanding? Read the Planet. . . .

What Mitchell was demanding was basic human decency and justice for African Americans. He espoused middle-class principles of sober hard work and dignified behavior. “Respect white men, but do not grovel,” he wrote. “Hold your heads up. Be men!” But, in the pages of his newspaper, beneath its powerful logo of a flexed, muscular black arm with lightning bolts radiating out of its clenched fist, Mitchell issued some galvanizing opinions of his own. “We regret the necessity,” he wrote, “but if the government will not stop the killing of black men, we must stop it ourselves.” He believed that self-defense was called for when one was under attack: “The best remedy for a lyncher or a cursed mid-night rider is a 16-shot Winchester rifle in the hands of a dead-shot Negro who has nerve enough to pull the trigger.”

The Richmond Planet
—Library of Virginia

Over the years, his paper—one of only a handful of black newspapers produced in the region—carried news of local, national, and international import. Official circulation rose to an impressive 6,400 by 1896, though many more than that actually read its articles, editorials, and advertisements. The paper was passed from person to person, from family to family among the black population of Richmond—much to the chagrin of the editor, who counted on the income generated by the $1.50 yearly subscription rate. (“Do you subscribe to the PLANET or do you borrow it?” he wrote testily on at least one occasion.) But Mitchell also understood that the larger white world needed to be aware of his paper; they needed to read about events from an African-American point of view; they needed to be presented with a portrayal of the black race as hardworking and honorable citizens. Every week he had a copy of the paper delivered to the governor’s mansion in Richmond and to the city’s white newspaper editors.

Who exactly was this John Mitchell Jr., who became one of Richmond’s most influential citizens during an era of bitter racial antagonisms? He was born on July 11, 1863, barely a week after Gettysburg. His parents were slaves in the household of James Lyons, a prominent lawyer who had railed against the Emancipation Proclamation, calling it “inhuman and atrocious” and an act that would “incite servile insurrection against us.” As a member of the Confederate Congress and a pillar of Richmond society, Lyons had entertained Jefferson Davis and the top Confederate military brass in his grand home. At ten years old, Mitchell became his carriage boy, a job for which he needed no education, according to Lyons; Mitchell’s mother, who thought otherwise, taught her son to read and sent him to school. At thirteen, Mitchell was admitted to Richmond Normal and High School, which had been founded by the Freedmen’s Bureau in the postwar era to train teachers. He graduated as valedictorian in 1881 and began teaching in the public schools, a career cut short after three years when a newly elected school board forced out nearly all the black teachers, including him. In 1884, at age twenty-one, he took over the relatively new—but already failing—Richmond Planet and transformed it with his boundless energy, and his talent as both a writer and a cartoonist.

While still running the newspaper, Mitchell became a political leader, representing the Jackson Ward black district on Richmond’s city council from 1888 until 1896, when he lost his seat in a crooked election that the Planet, with justification, called the “Jackson Ward Robbery.” After the Virginia Constitutional Convention of 1901–1902 succeeded in almost completely disenfranchising black voters, Mitchell adopted different tactics, advocating black economic power as a means to advancement. As head of the fraternal organization, the Virginia Knights of Pythias, Mitchell was already involved in selling insurance; but in 1902 he founded the Mechanics’ Savings Bank for blacks and began investing heavily in Richmond real estate—sometimes in white neighborhoods, to the horror of local residents. Using his newspaper as a bully pulpit, in 1904 he led a citywide black boycott of the newly segregated city streetcars. “Walking is good now. Let us walk!” he wrote, while also offering some hints for aching feet—fish salts and witch hazel were effective remedies, he claimed. In 1910, he built a grand new edifice for his bank—a four-story Renaissance Revival building with a roof garden and marble fixtures in the lobby. The white press reported on the opening under the headline “Colored People Have Skyscraper.”

In 1921, Mitchell even ran for governor on what whites contemptuously called a “lily-black” ticket. He lost in a landslide, and his life began to take a downward turn. Indicted for bank fraud, he was sentenced to three years in prison in 1923. The Supreme Court of Appeals subsequently overturned the conviction, but the damage had been done. His final years were marred by poverty and disgrace—though he remained the ever-feisty editor of the Planet, pleading his innocence. Family lore has it that Mitchell faced death with the same forthright courage that he embodied in life. On December 3, 1929, dressed in his ceremonial Knights of Pythias uniform, he stood up, said, “I am ready for you, death,” and promptly died.

Mitchell’s life is an uplifting tale of triumph in the face of racial hatred, an astounding story of passion, talent, and endurance. So, why is he so little known? Beyond a wonderful biography by Ann Field Alexander, entitled Race Man: The Rise and Fall of the “Fighting Editor,” John Mitchell, Jr., published in 2002, which this article draws upon, not much has been written about his dramatic life. In Richmond—a city full of monuments and markers—his legacy was almost completely ignored until recent times. Fortunately, many of his newspapers survive, and they are currently being digitized thanks to the National Digital Newspaper Program, which is funded by the National Endowment for the Humanities and cosponsored by the Library of Congress.

Errol Somay, director of the digitization project taking place at the Library of Virginia, waxes poetic about the Richmond Planet’s heroic editor: “I think John Mitchell Jr. is one of the best kept secrets in Virginia history. Here is a singular figure most people don’t know about—an African American running a newspaper at the height of lynchings.” It has been, in Somay’s words, a “labor of love” on the part of his staff to rescue the crumbling newspapers that looked “like the Dead Sea Scrolls” when they first found them. The pages were so brittle that many broke into confetti-like bits; they had to be pieced together like a puzzle and then put in their proper order.

Somay explained that historic African-American newspapers are “rare as hen’s teeth,” because, compared with the white press, there were far fewer African-American titles, and official repositories generally didn’t bother to collect them. But the Library of Virginia, the official state library, is exceptionally lucky to have one of the most complete holdings of the Planet—the “luck” perhaps due to Mitchell’s weekly habit of sending the paper to the governor’s mansion.

Somay proudly showed me the enormous broadside sheets—with their fascinating array of news stories, cartoons, advertisements, editorials, and heartrending photos of lynchings—that have now been encapsulated in mylar sheets, microfilmed, and digitized. “We’ve done the whole thing soup to nuts. And now people can do their searches from the comfort of their own home”—and discover the life work of an extraordinary crusading journalist.



Message from the Killing Fields…

Posted by jcmaziquemd on October 31, 2010

Warrior Nation

Image: Warrior Nation
—Larry Towell, Magnum Photos

By Michael Nelson

"Endless War" is how The New York Times headlined its review of the Boston University historian Andrew J. Bacevich’s new book, Washington Rules: America’s Path to Permanent War. It’s a headline that will work just as well if the Times decides to review Reasons to Kill: Why Americans Choose War by Richard E. Rubenstein, a professor of conflict resolution at George Mason University. In fact, either Bacevich or Rubenstein could accurately have chosen "Endless War" as his own book’s title.

The occasion for both books, as well as for the City University of New York journalism and political-science professor Peter Beinart’s recent The Icarus Syndrome: A History of American Hubris, is the start of the 10th year of continuous (and at least seemingly endless) war by the United States in Afghanistan, Iraq, and—factoring in what the Times estimates is "roughly a dozen" secret military campaigns against terrorist groups based in other countries—around the world. Add those to the list of previous wars and military operations during the past 30 years: Nicaragua, Grenada, Libya, Panama, Kuwait, Somalia, Haiti, Bosnia, and Kosovo.

Bacevich, Rubenstein, and Beinart agree that war has been a prominent feature of American life for a very long time. They just disagree over how long and, by implication, how deeply embedded war is in America’s identity as a nation. Unfortunately, as pessimistic as each of them is about the future (pretty pessimistic), their outlook may not be gloomy enough.

Bacevich regards the start of the cold war in the late 1940s as the beginning of an era of perpetual war in which, for the first time, "military might emerged as central to the American identity." The "Washington rules" that he says dominate the nation’s elected government and permanent security apparatus are based on a "credo"—namely, that the United States alone must "lead, save, liberate, and ultimately transform the world."

This credo, Bacevich argues, is made manifest in a "trinity" of operational imperatives: "maintain a global military presence" of bases and fleets around the world, "configure its forces for global power projection" to enable rapid military action anywhere, anytime, and "counter existing or anticipated threats by relying on a policy of global interventionism." Remarkably, the end of the cold war made no difference at all in either credo or trinity. "Once the Soviet threat disappeared," he observes, "with barely a whisper of national debate, unambiguous and perpetual global military supremacy emerged as an essential predicate to global leadership."

Image: Warrior Nation
—The New York Times

Beinart by no means slights the importance of the cold war in The Icarus Syndrome. He regards it as the beginning of an era marked by "the hubris of toughness"—"toughness" because it involved standing up to communist aggression the way Chamberlain should have stood up to Hitler, and "hubris" because it was based on "the belief that you’ve discovered a formula that works in all situations."

But for Beinart, the excesses of cold-war toughness that ultimately mired the United States in the Vietnam War were not the first instances of hubristic overreach in American foreign policy. For that he goes back to Woodrow Wilson’s "hubris of reason," which arrogantly assumed after the Allied victory in World War I that the United States could solve the world’s problems by employing the same rational processes with which Wilson and other Progressive leaders were tackling America’s problems at home. The Paris Peace Conference’s dilution of the president’s idealistic Fourteen Points for governing the postwar world and the Senate’s rejection of the League of Nations were among the unhappy results of Wilsonian overreach.

Rubenstein, after pausing at the start of Reasons to Kill to puzzle over Tocqueville’s observation that Americans are "fond of peace" because it "allows every man to pursue his own little undertakings," traces the roots of American bellicosity further back than either Bacevich or Beinart. He cites a study showing that even in colonial times, "there was either a declared war or a conflict for 79 of the 179 years from just before the founding of Jamestown until 1785, nominally the end of the Revolution." Rubenstein also mentions research by the political scientists Peter D. Feaver and Christopher Gelpi, who in their 2004 book Choosing Your Battles: American Civil-Military Relations and the Use of Force, record 111 "militarized interstate disputes" that the United States initiated from 1812 to 1992.

Rubenstein argues that a proclivity to war sank deep and enduring roots in American soil for two small reasons and one big one. The first small reason is the early settlement pattern that made Scots-Irish immigrants—warriors for more than six centuries in defense of their native land against the English—the dominant ethnic group in the southern frontier; the second is the "Billy Budd syndrome," in which Americans have long been "blinded by uncritical trust in authority," even when it leads them into unnecessary wars against countries like Mexico, Spain, and North Vietnam. The big reason is that Americans are a religious people who won’t fight unless convinced that their cause is just but who are easily persuaded that lots of causes are just. Those include "self-defense" broadly construed, an "evil enemy," "patriotic duty," and their "unique virtue" as "liberators and peacemakers, not selfish imperialists."

All of these books are worthy efforts to explain why the United States, itself scarcely touched by foreign invasion, spends so much time fighting abroad. With the partial exception of Bacevich’s Washington Rules, however, all of them neglect or underplay the importance of two critical Vietnam-era decisions: the replacement of the draft-based army with the All-Volunteer Force (AVF) and the roughly simultaneous expulsion of Reserve Officer Training Corps units from many elite campuses. Taken together, those decisions have made the nation’s inclination to war and other military action greater than at any time in its already war-saturated history.

The volunteer forces came into being in 1973 as a byproduct of President Richard Nixon’s decision to end the draft and thereby take the steam out of the campus-based anti-Vietnam War movement. Rubenstein attributes all sorts of idealistic motives to the antiwar activists ("peace, friendship, race and gender equality, economic justice," etc.), but the truth is that Nixon was right. The demonstrations pretty much ended as soon as college students no longer had to worry about being drafted when they graduated or dropped out of school.

Congress has consistently anted up whatever funds were necessary to attract enough young people, most of them working class, to fill out the enlisted ranks. But where would the officers come from? Objecting to the war, many elite universities, chiefly in the East but also in the Midwest (the University of Chicago) and West (Stanford), had already shown ROTC the exit. Subsequently, those colleges reaffirmed their policy of exclusion in protest of Congress’s 1993 "don’t ask, don’t tell" law banishing outed gays and lesbians from the military. Today the armed services aren’t sure it would make economic sense to return ROTC from exile if the gates were reopened, as seems likely at Harvard, Columbia, Stanford, and elsewhere as soon as "don’t ask, don’t tell" is repealed or decisively voided by the courts.

How have these decisions made the United States even more prone to war than Bacevich, Rubenstein, and Beinart think? First, both the volunteer forces and the ROTC expulsions turned the military’s recruiting gaze southward, to the region of the country (still rich in Scots-Irish ethnicity and culture) most supportive of the armed forces as an institution and of war as an instrument of national policy. In 1968 ROTC had 123 units in the East and 147 in the South. Just six years later, Southern ROTC units outnumbered those in the East by 180 to 93. Alabama, with one-fourth the college population of New York City, has 10 ROTC units compared with New York’s two. As Defense Secretary Robert M. Gates pointed out last month in a speech at Duke University, "With limited resources, the services focus their recruiting efforts on candidates where they are most likely to have success."

Forty percent of enlisted men and women are now Southerners, and the officer corps speaks with an even stronger Southern accent. As a consequence, like the South generally, the military has moved rightward into the Republican Party. "Reversing a century and a half of practice," laments the University of North Carolina military historian Richard H. Kohn, based on surveys he helped to conduct, "the American officer corps has become partisan in political affiliation, and overwhelmingly Republican." In his new book, Our Army: Soldiers, Politics, and American Civil-Military Relations, Jason K. Dempsey reports that in 2007 Republicans outnumbered Democrats 49 percent to 12 percent among senior officers. At West Point, Dempsey found, "enough officers overtly endorse the Republican Party that many cadets apparently conflate an identification with the Republican Party with officership."

Second, the end of the draft and ROTC’s banishment from many elite campuses mean that a steadily declining share of those in Congress and the upper reaches of the executive branch have served as either officers or enlistees. Until 1995 the percentage of veterans in Congress was consistently higher than in the country as a whole. Since then it’s been lower—around 30 percent and shrinking.

The result: Fewer and fewer of the civilian decision makers who now send troops into battle know what war is like. Apart from the moral queasiness this ought to induce, there is a tangible consequence. Feaver and Gelpi show statistically in Choosing Your Battles that throughout American history, the government’s likelihood of initiating the use of force has consistently gone up whenever the percentage of veterans in Congress and the cabinet has gone down.

Feaver and Gelpi also report that although the military is typically reluctant to use force, if compelled to do so, it "argues strenuously that the force should be overwhelming and decisive." This argument, when presented to policy makers who have led entirely civilian lives, almost always prevails. The theme that shines through Bob Woodward’s new insider account, Obama’s Wars, for example, is that despite sustained and even heroic efforts by the president, he was unable to make the military give him realistic options in Afghanistan that didn’t involve committing more than 30,000 additional troops.

Finally, the all-volunteer force, by eliminating any real possibility of conscription, has severed the connection between passive disapproval and active opposition to war. Support for the war in Iraq in public-opinion polls fell farther faster in the mid 2000s than support for the Vietnam War ever did. But antiwar opinion in the Vietnam era turned into antiwar protest in ways that more recent antiwar opinion has not. Astonishingly, after President George W. Bush’s war policy was rebuked by the voters in the 2006 midterm election, he was able to deploy an additional 20,000 troops to Iraq with scarcely any organized opposition in Congress or the country. Nor did Obama’s own "surge" in Afghanistan generate effective protest of any kind. It is inconceivable that antiwar college students would have remained politically inert if there were any chance that they would be drafted to fight in the wars they oppose.

What can colleges do to mitigate these developments, which taken together have heightened the already great American proclivity to war that Bacevich, Beinart, and Rubenstein document in their books? Forget about trying to bring back the draft. The technologically complex modern military needs long-term volunteers, not short-term draftees, to function effectively. The all-volunteer military isn’t going anywhere.

ROTC is different. Colleges that have kept their doors shut can begin by reopening them. As the Stanford historian David M. Kennedy argues, excluding ROTC for the past four decades has simply ensured that elite universities, "which pride themselves on training the next generation’s leaders, will have minimal influence on the leadership of a hugely important American institution, the United States armed forces." "It’s clearly best," Kennedy told the Stanford faculty, "for our democracy to have, among its military officers, citizens who have a liberal education at the best universities in the country."

But reopening the doors to ROTC, a military institution that is understandably chary of being burned again by some future campus controversy (an especially unpopular war? military harm to the environment?), won’t be enough. Colleges and universities need to put out the welcome mat so that students are encouraged to consider military service as an option for at least part of their lives—en route, as some of them will turn out to be, to high public offices in which they will make decisions about war and peace in years to come. One form of welcome would be to top up ROTC scholarships so that high-tuition institutions are affordable to service-oriented young people. More generally, though, colleges should take to heart an argument made by Josiah Bunting III, the Vietnam-era army-major-turned-novelist who later became president of Hampden-Sydney College and superintendent of Virginia Military Institute.

Writing in The American Scholar in 2005, Bunting observed that the long-term benefit to society of Teach for America—the program that recruits high-flying college grads to spend two or three years teaching in difficult public schools—is that later in life, when they are in positions of influence, "they will know the costs and difficulties and sometimes dangers of such duties. So it should be with … soldiering in behalf of the American people." That’s not a programmatic plan of action, but it is an animating spirit that individual colleges and universities would do well to adopt and then apply to their own distinctive circumstances.


NPR.org – Think You Know How To Study? Think Again

Posted by jcmaziquemd on October 31, 2010



Book Review: Eugene Robinson’s ‘Disintegration: The Splintering of Black America’

Posted by jcmaziquemd on October 31, 2010


By Lawrence Jackson
Sunday, October 10, 2010; B01


Disintegration: The Splintering of Black America

By Eugene Robinson

Random House. 254 pp. $24.95

Eugene Robinson’s new book, "Disintegration," opens with an account of a Washington dinner party dripping with influential Americans whom the reader can only assume are white. But these kingmakers, gathering shortly after the election of Barack Obama, turn out to be black.

Robinson proposes that this group — which included Eric Holder, soon to be nominated as attorney general; Valerie Jarrett, an Obama fundraiser who has Oprah Winfrey’s private phone number; Franklin Raines, a banker with a reputation nearly as bad as Kenneth Lay’s; and Soledad O’Brien, a hard-charging, racially ambiguous newscaster — signals the fulfillment of Martin Luther King Jr.’s dream. Even if it does, a small part of Robinson regrets the achievement of the hallowed plateau. He contends that the exercise of respectable power by these black people actually splinters a formerly coherent and unified black community.

Robinson, a Pulitzer Prize-winning columnist for The Washington Post, carves modern American blacks into four categories. His dinner-party comrades are members of a tiny group he calls the Transcendent class of wealthy blacks, composed chiefly of athletes, singers and media darlings. The Transcendents are more than offset by the regular black headline-makers, a "large minority" of African Americans that sociologists famously called the underclass in the 1980s and that Robinson now labels the Abandoned. A third group he identifies is the Emergent, people who are biracial, the children of parents from Africa or the African diaspora, or, like Obama, both.

Although Robinson calls for a "domestic Marshall Plan" to tackle African American "poverty, dysfunction, and violence," he gives the heart of the book to the fourth group, the one he identifies with: the nebulously defined black Mainstream, a "middle-class majority with a full ownership stake in American society."

The notion of what constitutes a middle-class life has changed over the years. In the 19th century, Americans still clung to Thomas Jefferson’s hope of yeoman farms. After World War II, a middle-class life meant home ownership, a college education, an annual vacation and the possibility of a cozy retirement. Always there was the hope that children would attend better schools, build larger homes and enjoy more material prosperity than their parents.

Being middle class means something different in 2010, and most black families with two college-educated parents are up to their ears in lingering school loans, extravagant mortgages and consumer debt. In other words, these black Americans compose a class without wealth, a feature common in the white upper-working class, as sociologists Melvin Oliver and Thomas Shapiro reminded us in their 1995 book "Black Wealth/White Wealth: A New Perspective on Racial Inequality."

Sadly, Robinson skirts this issue, among others. He suggests that educated, financially secure black women living alone are "blazing another trail" to "redefine the concepts of household and family." This is glib at best, and at worst it cynically casts black women as the engineers of something beyond their control: a socio-historic dynamic that graduates many more women than men from college every year. Robinson contents himself with upbeat platitudes to reinforce a worldview in which Transcendent, Emergent and Mainstream have something deeply symbolic in common with American whites: In unison, they "lock their car doors when they drive through an Abandoned neighborhood."

Robinson is among the group able to take the fullest advantage of King’s sacrifice, and his concern seems more closely aligned with King’s focus on the "content of our character" than on the civil rights leader’s other battle with "the inner city of poverty and despair." The ongoing plight of the black American poor — really a people who never recovered from slavery — bears an eerie similarity to the lives of black people living in Congo, Sierra Leone or Liberia. Americans like to keep a lot of distance between themselves and Africa, and African Americans who are not materially successful stir residues of guilt regarding the African genocides of our own day and the genocides of slaves and Native Americans. The mass incarceration of blacks is parallel to enslavement and peonage laws, as recent books by Michelle Alexander ("The New Jim Crow") and Douglas Blackmon ("Slavery by Another Name") make clear. As King understood, the black experience is shaped as much by the harshness of American society as by the content of the black character.

The black Mainstream that produced King was long on courage, determination and compassion, but short on cash. It was, and is, really a lower middle class, now tethered to an urban setting with compromised educational structures and weakened public services, and is only superficially like the white middle class. Consider this: Black autobiographers Malcolm X, Chester Himes, Nathan McCall and Dwayne Betts all seem to qualify for Robinson’s Mainstream, yet all served prison time for armed robbery between 1929 and 2005. It is difficult to dismiss 80 years’ worth of poignant testimony that black American "middle class" lives are extraordinarily different from those of their white counterparts.

Robinson evades the fact that the boundary between the black Mainstream (whose "historic" gains he admits are "precarious") and the Abandoned is a highly porous one. What often happens is that the Abandoned follow the Mainstream from one part of a city to another and then from the city to the suburbs and back again. It’s a scenario of boom and bust that for more than a century has swamped ambitious black migrants who take advantage of residential and employment opportunities and then, 20 years later, have to pack up and move again in the face of a socio-economic tsunami. It happened to them in the inner cities of the 1960s, the larger metropolitan areas of the 1980s and the foreclosed suburbs of the 2010s. Moving is portrayed as a success, but the cycle of run-ruin-run should not really be thought of as part of the hearty prosperity of a new class. Robinson advocates gentrification as a solution to black urban blight, but that ship "been done sail," as it were. The next wave of the black Abandoned is already tucked into suburbia, in DeKalb County, Ga., and Prince George’s County, Md. Ironically, these are Robinson’s twin geographic locales that exemplify the successful black middle class.

Lawrence Jackson is a professor of African American studies at Emory University.


Posted in Uncategorized | Leave a Comment »

The 150-Year War

Posted by jcmaziquemd on October 31, 2010

The 150-Year War


MY attic office is walled with books on Lincoln and Lee, slavery and secession. John Brown glares from a daguerreotype on my desk. The Civil War is my sanctum — except when my 7-year-old races in to get at the costume box. Invariably, he tosses aside the kepi and wooden sword to reach for a wizard cloak or Star Wars light saber.

I was born in a different era, the late 1950s, when the last Union drummer boy had only just died and plastic blue-and-gray soldiers were popular toys. In the 1960s, the Civil War centennial recalled great battles as protesters marched for civil rights and the Rev. Dr. Martin Luther King Jr. declared from the steps of the Lincoln Memorial, “One hundred years later, the Negro still is not free.”

Today the Civil War echoes at a different register, usually in fights over remembrance. Though Southern leaders in the 1860s called slavery the cornerstone of their cause, some of their successors are intent on scrubbing that legacy from memory. Earlier this year in Virginia, Gov. Robert F. McDonnell proclaimed April to be Confederate History Month without mentioning slavery, while the state’s Department of Education issued a textbook peddling the fiction that thousands of blacks had fought for the South. Skirmishes erupt at regular intervals over flags and other emblems, like “Colonel Reb,” whom Ole Miss recently surrendered as its mascot. The 1860s also have a particular resonance at election time, as the country splits along political and cultural lines that still separate white Southern voters from those in blue Union states.

But as we approach the 150th anniversary of Abraham Lincoln’s election, on Nov. 6, and the long conflict that followed, it’s worth recalling other reasons that era endures. The Civil War isn’t just an adjunct to current events. It’s a national reserve of words, images and landscapes, a storehouse we can tap in lean times like these, when many Americans feel diminished, divided and starved for discourse more nourishing than cable rants and Twitter feeds.

“The dogmas of the quiet past are inadequate to the stormy present. The occasion is piled high with difficulty, and we must rise with the occasion. As our case is new, so we must think anew, and act anew. We must disenthrall ourselves, and then we shall save our country.” Those famous lines come from President Lincoln, delivered not in the Gettysburg Address, but on a routine occasion: his second annual message to Congress. Can you recall a single line from any of the teleprompted State of the Union messages in your own lifetime?

The Civil War abounded in eloquence, from the likes of Frederick Douglass, Walt Whitman, the Southern diarist Mary Chesnut and warriors who spoke the way they fought. Consider the Southern cavalryman J. E. B. Stuart, with panache, saying of his father-in-law’s loyalty to the Union: “He will regret it but once, and that will be continually.” Or Gen. William Tecumseh Sherman, brutal and terse, warning besieged Atlantans: “You cannot qualify war in harsher terms than I will. War is cruelty, and you cannot refine it.”

These and other words from the war convey a bracing candor and individuality, traits Americans reflexively extol while rarely exhibiting. Today’s lusterless brass would never declare, as Sherman did, “I can make this march, and make Georgia howl!” or say of a superior, as Sherman did of Gen. Ulysses S. Grant, “He stood by me when I was crazy, and I stood by him when he was drunk.”

You can hear the same, bold voice in the writing of common soldiers, their letters unmuzzled by military censors and their dialect not yet homogenized by television and Interstates. “Got to see the elephant at last,” an Indianan wrote of his first, inglorious combat. “I don’t care about seeing him very often any more, for if there was any fun in such work I couldn’t see it … It is not the thing it is bragged up to be.” Another soldier called the Gettysburg campaign “nothing but fighting, starving, marching and cussing.” Cowards were known as “skedaddlers,” “tree dodgers,” “skulkers” and “croakers.”

There’s character even in muster rolls and other records, which constantly confound the stereotype of a war between brotherly white farm boys North and South. You find Rebel Choctaws and Union Kickapoos; Confederate rabbis and Arab camel-drivers; Californians in gray and Alabamans in blue; and in wondrous Louisiana, units called the Corps d’Afrique, the Creole Rebels, the Slavonian Rifles and the European Brigade. By war’s end, black troops constituted over 10 percent of the Union Army and Navy. The roster of black sailors included men born in Zanzibar and Borneo.

Then there are the individuals who defy classification, like this one from a Pennsylvania muster roll: “Sgt. Frank Mayne; deserted Aug. 24, 1862; subsequently killed in battle in another regiment, and discovered to be a woman; real name, Frances Day.”

If the words of the 1860s speak to the era’s particularity, the bleakly riveting data of the Civil War communicates its scale and horror — a portent of the industrial slaughter to come in the 20th century. Roughly 75 percent of eligible Southern men and more than 60 percent of eligible Northerners served, compared with a tiny fraction today, and more than one million were killed or wounded. Fighting in close formation, some regiments lost 80 percent of their men in a single battle. Three days at Gettysburg killed and wounded more Americans than nine years of war in Afghanistan and Iraq have. Nearly one in three Confederate soldiers died — a statistic that helps to explain the deep sense of loss that lasted in the South for over a century. In all, the death rate from combat and disease was so high that a comparable war today would claim six million American lives.

As horrific as these numbers are, they’re made graphic by the pioneering photography of the Civil War. It’s hard for us to conjure the Minutemen of 1775, but we can look into the eyes of Union and Confederate recruits, study their poses, see emotion in their faces. They look lean (and they were: on average, Civil War soldiers were 40 pounds lighter than young men today), but their faces are strikingly modern and jaunty.

Then we see them again, strewn promiscuously across fields, limbs bloated, mouths frozen in ghastly O’s. When Mathew Brady first exhibited photographs of battlefield dead in 1862, The Times likened viewing them to seeing “a few dripping bodies, fresh from the field, laid along the pavement.” Oliver Wendell Holmes Sr. wrote that photographs forced civilians to confront the true face of battle — “a repulsive, brutal, sickening, hideous thing.” We’re spared this discomfort today, with the American dead from two ground wars carefully airbrushed from public view.

There’s another great difference between the Civil War and every other war in our history: the ground itself, a vast and accessible Yosemite of memory that stretches across the South and to points beyond, from Gettysburg in Pennsylvania to New Mexico’s Glorieta Pass. True, much of the Civil War’s landscape has been interred beneath big-box malls and subdivisions named for the history they’ve obliterated. But at national parks like Shiloh and Antietam you can still catch a whisper of a human-scaled America, where soldiers took cover in high corn and sunken roads, and Lincoln’s earthy imagery spoke to the lives of his countrymen.

In an electronics-saturated age, battlefield parks also force us to exercise our atrophied imaginations. There’s no Sensurround or 3D technology, just snake-rail fences, marble men and silent cannons aimed at nothing. You have to read, listen, let your mind go. If you do, you may experience what Civil War re-enactors call a “period rush” — the momentary high of leaving your own time zone for the 1860s.

You wouldn’t want to stay there; at least I wouldn’t. Nor is battle the only way into the Civil War. There are countless other portals, and scholars are opening them to reveal lesser-known aspects of Civil War society and memory. Know about the 11-year-old girl who convinced Lincoln to grow a beard? The Richmond women who armed themselves and looted stores, crying, “Bread or blood”? The “Mammy Monument” that almost went up in Washington a year after the Lincoln Memorial?

It’s a bottomless treasure, this Civil War, much of it encrusted in myth or still unexplored. Which is why, a century and a half later, it still claims our attention and remembrance.

Tony Horwitz is the author of “Confederates in the Attic” and the forthcoming “Midnight Rising: John Brown’s Raid and the Start of the Civil War.”

Posted in Uncategorized | Leave a Comment »

Scariest Sight on Halloween? Grown-Ups

Posted by jcmaziquemd on October 31, 2010

Scariest Sight on Halloween? Grown-Ups


Since I write scary books for kids, I’m asked to do a lot of weird things. A couple of years ago, a magazine asked me to tour some of the haunted houses that were popping up around New York City. They had names like Blood Manor, Dracula’s Dungeon and Nightmare: Bad Dreams Come True.

As I made my way through the dark, twisting, fake-cobwebbed halls of one of these haunted houses, I was more surprised than frightened by the quantities of gore and blood.

Screaming ghouls and zombies lurked in every room, with missing body parts, gashed flesh, hatchets embedded in open skulls, blood-soaked entrails hanging from gaping stomach wounds. Bloody handprints smeared the walls and sticky blood puddles stained the floors.

When I finally staggered outside, shrill shrieks and maniacal laughter ringing in my ears, I gazed at a sign at the entrance I had missed: “No One Admitted Under 18.”

Of course, much has been written about how this generation of American adults doesn’t want to give up its inner child. I don’t have to spell out the evidence — it’s everywhere — that grownups want to be kids for as long as they can possibly get away with it. And who can blame us?

But … no kids admitted to a Halloween haunted house? Talk about a hatchet blow to the head. My brain exploded with vivid images of my own childhood Halloweens.

My memories are typical for anyone who grew up in a quiet suburb in the middle of the last century. I remember the chill of the October night air and, despite the cold, the sweat rolling down into my eyes inside my plastic mask. I can still hear the crunch of my shoes over the frost-hard ground and the whoosh of the wind ruffling our flimsy store-bought costumes.

I can conjure up the heavy disappointment I felt when someone would drop a popcorn ball or an apple into my trick-or-treat bag instead of candy. And I remember the panic when a big kid would stop me and my friends on a shadowy driveway and demand to see what was in our bags.

I remember clearly the costumes my parents brought home from Kresge’s. My favorite was a scarlet devil disguise, the grinning mask complete with curled horns and a painted goatee. My least favorite was a duck costume with a fuzzy yellow tail. I told everyone I was actually a vampire duck. But you can guess how that went over.

Typical memories. But as I recall, the special excitement of Halloween didn’t come from candy or costumes or dark, whispery streets. The overwhelming thrill came from going out of the house at night and wandering freely around the neighborhood with no parents.

Halloween was a night of incredible freedom.

I’ve written dozens of Halloween books for children, and I try to capture those memories and that feeling of liberation. So it was alarming to think that adults were taking the holiday away from kids. Was it really happening?

I walked into the Barnes & Noble in my neighborhood and spotted a table of Halloween-themed books near the front. Sure enough, they were all for adults:

“Halloween Collectibles Price Guide”

“Halloween: From Pagan Ritual to Party Night”

“The Original Duct Tape Halloween Book.”

Duct tape? Really?

Farther down Broadway, I saw a Halloween costume store. Yes, you guessed it — there were no princess costumes with sparkling tiaras, no duck costumes with fuzzy tails. I saw a lot of flimsy negligees as well as studded black leather outfits with handcuffs and whips. A popular item seemed to be fake black, curly chest hair for men.

I returned to my apartment disheartened. Perhaps in a few years, I would write a 10-year-old character who described his Halloween like this:

“Best Halloween ever! First, Mom and Dad let my sister and me help decorate the house for their party. Then, they said we could help them get into their costumes! Totally awesome!”

I fretted about this for days, the end of Halloween as I’d known it. And then I happened to eavesdrop on a group of kids waiting in my lobby for their school bus.

They were talking about their Halloween costumes. One boy said he was going to be an iPhone with a lot of apps up and down his front. Another boy said he was going to wear two fake heads, one on each shoulder, and go as triplets.

The only girl in the group insisted they had to start their trick-or-treating at a building a few blocks uptown. “It has a 13th floor,” she said. “We totally have to start on the 13th floor.”

The three-headed boy told them his cousin lived in Connecticut down the block from a “real” haunted house. He said that a group of trick-or-treaters went into the house last year and were never seen again: “All they found were their masks stuck to the front window, looking out.”

That story made the kids laugh. It made me laugh, too. I walked away thinking, yes, more adults are celebrating Halloween. And, yes, the best-selling costume this year may end up being the BP oilman uniform. But, no matter. Kids live in their own special and private world. And Halloween is still the holiday that proves it.

R. L. Stine

Posted in Uncategorized | Leave a Comment »

Black, Hispanic students dwindle at elite Va. public school


Posted by jcmaziquemd on October 31, 2010

Black, Hispanic students dwindle at elite Va. public school

By Kevin Sieff
Washington Post Staff Writer
Saturday, October 30, 2010; 6:49 PM

When the Black Students Association at Thomas Jefferson High School for Science and Technology threw a pizza party in September for new members, every African American freshman on campus showed up.

All four of them.

They amount to less than 1 percent of the Class of 2014 at the selective public school in Fairfax County, regarded as among the nation’s best. "It’s disappointing," said Andrea Smith, the club’s faculty sponsor. "But you work with what you got."

The count of Hispanic freshmen is not much higher: 13.

Years of efforts to raise black and Hispanic enrollment at the regional school have failed, officials acknowledge. The number of such students admitted has fallen since 2005.

There are two major reasons. Admissions decisions are generally made without regard to race or ethnicity, despite a policy meant to promote diversity. And initiatives to enlarge the pipeline of qualified black and Hispanic students in elementary and middle school have flopped.

"We need to do a better job of evening the playing field," said Richard Moniuszko, deputy superintendent in Fairfax County. "But there’s a limit to what we can do, both legally and financially."

Isis Castro, a former Fairfax County School Board member who is now on the Virginia Board of Education, said: "The programs that we implemented didn’t work, and the communities that we were trying to help didn’t have a real seat at the table."

Demographic mix

TJ, as the school is known, draws top students from a region with a rich demographic mix: Arlington, Fairfax, Fauquier, Loudoun and Prince William counties, Fairfax City and Falls Church. (Alexandria does not participate.) Together, black and Hispanic students account for about a third of all public school enrollment in those locales. At TJ, they account for less than 4 percent.

Ninety percent of TJ’s 1,764 students are of Asian descent (the largest and fastest-growing group) or are non-Hispanic white (the second-largest). Nearly 6 percent are identified as multiracial.

Like other public schools with competitive admissions, TJ screens applicants through grades and test scores. A key requirement is that students take Algebra 1 by eighth grade. Many disadvantaged students don’t clear that threshold, which presents a national challenge for science and math instruction.

Fierce competition

Competition to get into TJ is fierce. Some private companies charge hundreds of dollars to prepare students for the school’s entrance exam, a two-hour test of math and verbal-reasoning skills. For those who get in, the payoff is clear. The school has an array of laboratories in fields such as biotechnology and microelectronics, and students follow a rigorous interdisciplinary curriculum that culminates in a senior research project.

TJ churns out dozens of National Merit scholars and routinely sends graduates to top colleges. Last year, the school’s average SAT score was 2184 out of 2400. U.S. News & World Report ranks it the top high school in the country.

Yet for thousands of black and Hispanic middle school students in Northern Virginia, TJ is a long shot. The overall admissions rate is 15 percent. But it’s 2 percent for black students and 6 percent for Hispanic students.

"Sometimes in class I look around and think, ‘I know a lot of people who could be here, but they didn’t know about it, or they didn’t know how to prepare,’ " said Alexandria Sutton, an African American junior at TJ. "At my middle school, it was not advertised at all."

Ariel Copeland, a senior at TJ, remembers reading "Beloved," the Toni Morrison novel about slavery, in junior English. Copeland, the only black student in the class, squirmed in her seat during discussions. "It was so awkward," she said. "I could tell people were looking at me."

After class, some students approached her to apologize for the nation’s history of slavery. "I was like, ‘You don’t have to apologize,’ " she said.

History teacher Melissa Schoeplein said she sometimes gives lessons on race and poverty in a classroom without any black or Hispanic students. The lack of diversity, she said, means that students "are missing out on a critical part of their education."

TJ’s black and Hispanic seniors, like their peers, are considering a range of selective universities. Richie Hernandez is thinking about the Massachusetts Institute of Technology. Lucia Melgarejo is looking at Duke University. Copeland likes the College of William & Mary.

Wherever they land, they are virtually certain of two things.

"We know we’ll be ready for college," Hernandez said.

"And we know college is almost definitely going to be more diverse," Melgarejo said.

Affirmative action on trial

It wasn’t always this way. For more than a decade after its founding in 1985, the school actively sought to diversify its enrollment, even if that sometimes meant admitting students with lower test scores than others. In 1997, the school admitted 24 Hispanic students and 25 black students.

That year, several federal courts struck down school affirmative action programs, and attorneys advised Fairfax school officials to end any racial or ethnic preferences. The number of black and Hispanic freshmen plummeted.

In 2003, the Supreme Court struck down a race-based undergraduate admissions policy at the University of Michigan but narrowly upheld a policy at the University of Michigan Law School that allowed the consideration of race as part of a comprehensive examination of an applicant. The majority agreed that the law school had an interest in "the educational benefits that flow from a diverse student body."

In response, Fairfax officials tweaked the TJ admissions policy in 2004 to allow race to be considered as a factor. The change drew an outcry from some parents, who said the policy discriminated against qualified white students. Even so, the admissions rates for black and Hispanic students have been falling.

Under the policy, applicants are screened first on admissions test scores and grades. Then admissions panels, mostly teachers and administrators from other area schools, consider subjective criteria such as essays and teacher recommendations. At that point, race and ethnicity can come into play. But generally they don’t.

"The numbers are unlikely to change under the current policy," said Judy Howard, who was the school’s admissions director from 2004 until last spring. The county’s admissions protocols promote diversity broadly but don’t put particular emphasis on race.

"We thought we’d given committee members enough latitude to consider diversity as a factor," Moniuszko said. "But the results say otherwise."

National trend

Across the country, a number of selective regional schools like TJ "have backed off affirmative action in recent years," said Letita Mason, a board member of the National Consortium for Specialized Secondary Schools of Mathematics, Science and Technology. "It’s not popular. It’s not something they want to tackle."

In Montgomery County, the prestigious Science, Mathematics and Computer Science Magnet Program at Montgomery Blair High School does not consider race as a factor in admissions. About 8 percent of freshmen in the program are African American or Hispanic. But those two groups account for 46 percent of enrollment countywide.

In the District, the selective public School Without Walls also does not consider race in admissions. Black and Hispanic students account for 67 percent of enrollment at the school, compared with 91 percent in the city school system.

Fairfax school officials say that diversifying TJ requires more than making admissions criteria more flexible. It means helping black and Hispanic students keep up with their white and Asian American counterparts at an early age, especially in math and science.

Since 2000, a county program known as Young Scholars has tried to recruit elementary students who might one day attend TJ. More than half of the program’s 3,776 students between kindergarten and eighth grade are black or Hispanic. Next spring, the first 30 Young Scholars will graduate from high school. Only one will be a TJ graduate.

The school’s Parent Teacher Student Association also offers free test-preparation courses for minority students. But a few years ago, the Fairfax school system eliminated another program, known as Quest, which sought to spark interest in TJ by taking minority students to the campus several times a month for science and math programs.

For now, most TJ students come from a group of middle schools that serve neighborhoods that are mostly affluent and mostly white or Asian.

"I’ve always been known as ‘that smart black girl’ – at middle school and now at TJ," said Adrienne Ivey, a junior. "It gets old."

Posted in Uncategorized | Leave a Comment »

Is Candy Evil or Just Misunderstood?

Posted by jcmaziquemd on October 28, 2010

Is Candy Evil or Just Misunderstood?


FOR Samira Kawash, a writer who lives in Brooklyn, the Jelly Bean Incident provided the spark.

Five years ago, her daughter, then 3, was invited to play at the home of a new friend. At snack time, having noted the presence of sugar (in the form of juice boxes and cookies) in the kitchen, Dr. Kawash, then a Rutgers professor, brought out a few jelly beans.

The mother froze. Her child had never tasted candy, she explained, but perhaps it would be all right just this once. Then the father weighed in from the other room, shouting that they might as well give the child crack cocaine.

“It was clear to me that there was an irrational equation of candy and danger in that house,” Dr. Kawash said in a recent interview. “And that was irresistible to me.”

From that train of thought, the Candy Professor blog was born. In her writing there, Dr. Kawash dives deep into the American relationship with candy, finding irrational and interesting ideas everywhere. The big idea behind Candy Professor is that candy carries so much moral and ethical baggage that people view it as fundamentally different — in a bad way — from other kinds of food.

“At least candy is honest about what it is,” she said. “It has always been a processed food, eaten for pleasure, with no particular nutritional benefit.” Today, she said, every aisle in the supermarket contains highly manipulated products that have those qualities.

And, she points out, many people who avoid candy will cheerfully eat sugar-packed chocolate-chip energy bars and drink Gatorade for health reasons, although a serving of Gatorade contains about the same amount of sugar as a dozen pieces of candy corn. Dr. Kawash’s expertise is in American culture and gender studies, but some nutritionists share her views on the pariah status of candy.

“I don’t think candy is bad for you,” said Rachel Johnson, a nutrition professor at the University of Vermont who was the lead author of the American Heart Association’s comprehensive 2009 review of the scientific literature on sugar and cardiovascular health.

Dr. Johnson said that candy is considered bad because it lacks the “health halo” that hovers over sweet food like granola bars and fruit juice. “Nutritionally there is little difference between a gummy bear and a bite of fruit leather,” she said.

Dr. Johnson also noted that candy provides only 6 percent of the added sugar in the American diet, while sweet drinks and juice supply 46 percent. “There’s reason to believe that sugar in liquid form is actually worse than candy, because it fills you up and displaces healthier food choices,” she said.

Dr. Kawash, who studied architectural theory, narratives of women and medicine, and the imagery of terrorism before she began to write Candy Professor, has complicated feelings about her current specialty. She describes her childhood in Sunnyvale, Calif., in the 1970s as an “endless, and mostly frustrating quest for candy,” restricted to a small weekly indulgence after church on Sundays. Later, she said, binges on gummy bears and spice drops fueled her undergraduate research at Stanford; more recently, she found herself flushing handfuls of candy corn down the toilet to prevent herself from eating “just a few more.”

“Obviously, my own relationship with candy is not totally healthy,” she admits.

Fortunately, some of that passion has now been channeled into research. There are many blogs devoted to tasting, photographing and tracking down obscure types of candy, such as Candy Addict and Candy Blog, but Dr. Kawash’s work is rarely about taste or nostalgia. She is much more interested in untangling the threads of control, danger and temptation that candy has carried since it became widely available in the 1880s.

Until then, most candies — like fudge, brittle and taffy — were homemade, and store-bought hard candies like horehound sticks and peppermints were relatively expensive. But advances in technology enabled sugar to be spun, aerated, softened and flavored in new ways, and sold cheaply. Just like that, candy entered popular culture.

Dr. Kawash notes that candy, like cigarettes, was long advertised as having health benefits. “Eat Tootsie Rolls — The Luscious Candy That Helps Beat Fatigue,” reads one of the many ads she has exhaustively analyzed on her blog. One post is dedicated to the “slippage” between candy and medicine that she has found in a close reading of the history of cough drops — hard candy in a socially acceptable form.

But there have always been what she calls “candy alarmists,” who warned that candy was too stimulating, too soporific, poisoned, or otherwise hazardous. Dangerous candy appears in many fairy tales, a theme continued with the modern public-safety message, “Don’t take candy from strangers,” and in public scares over tampering and contamination. (Dr. Kawash recently detailed how all of this led to the candy wrappers we know today in The Journal of American Culture.)

In the early 20th century, she said — in the absence of any medical evidence — doctors blamed candy for the spread of polio. In the 1970s, refined sugar approached the top of the food counterculture’s list of enemies, spurred by international best sellers like “Sugar Blues” and “Sweet and Dangerous.” Tooth decay was the longtime threat; more recently, the global spread of obesity has prompted fears of the “empty calories” in candy.

Now a tentative cook and a buyer of organic eggs, Dr. Kawash is convinced that candy is often the scapegoat when Americans sense that something is wrong in the food supply. The social critic in her says that corn syrup and the cheap candy produced with it have unhinged our notions of how much candy is too much. At the same time, the historian in her can’t help pointing out that “corn syrup was a wonderful thing for candy.” Its invention in the late 19th century made the commercial production of soft confections like fudge and candy corn possible.

The disruption of traditional agricultural systems — including the presence of corn in so many processed foods — has also dislodged candy from its established place as an occasional treat.

“Candy should not be sold in huge bags at the drugstore,” said Jennifer King, a founder of Liddabit Sweets, a small candy company in Brooklyn that proudly sells candy bars — such as a recreated Snickers — for as much as $6.50. Liddabit products are indulgent but also virtuous: Ms. King and her partner, Liz Gutman, make treats like apple-maple lollipops and spiced caramel chews by hand, from prestigious and often local ingredients. (The honey in the honeycomb candy is gathered from hives in New York City.)

Dr. Kawash says that the fetishization of candy ingredients and the aestheticization of candy — like the color-coordinated candy landscapes now popular at weddings — are relatively new.

“When the moneyed classes indulge in sugar, it’s part of an acceptable leisure activity,” she said, chewing over the significance of high-end candy destinations like Dylan’s Candy Bar.

“But when poor people do the same thing, it’s considered pathological,” she added, citing the current debate over using food stamps to buy soda, candy and other “bad” foods.

Dr. Kawash, 46, retired from teaching in 2009. She said that her increasing interest in candy was making it difficult to fulfill her administrative, teaching and parental responsibilities, and she knew that studying the evolution of the shape of the Hershey’s Kiss would never win her respect within the academy.

The blog is not so much a public forum, she said, as a “research trail,” a way of chronicling the hours she now spends reading old issues of Confectioners’ Journal, scanning patent applications, and combing archived phone books to count the number of candy shops in Brooklyn in 1908 (564).

Dr. Kawash says her research is partly fueled by anger toward candy manufacturers who publish inaccurate, often sugarcoated histories of their products. In fact, she says, the home-kitchen inventions of candy-shop owners were often simply copied, stolen or swallowed up by large companies.

“The history of candy, like the history of wars, is always written by the winners,” she said. “We can’t just let that go unchallenged.”


Posted in Uncategorized | Leave a Comment »

Study Finds Adversity Does Make Us Stronger

Posted by jcmaziquemd on October 25, 2010

Study Finds Adversity Does Make Us Stronger


Friedrich Nietzsche was right—sort of.

The German philosopher’s oft-quoted adage, "What does not destroy me, makes me stronger," was put to the test as part of a national study of the effects of adverse life events on mental health by researchers at the University at Buffalo, The State University of New York, and the University of California, Irvine.

The study, published in the latest issue of the Journal of Personality and Social Psychology, found that people who had experienced a few adverse events in their lives reported better mental health and well-being than people with a history of frequent adversity and people with no history of misfortune.

The study, which included 2,398 participants ranging in age from 18 to 101, is part of a larger research effort started after Sept. 11, 2001, to test the notion of resilience—how successfully people adapt after exposure to stressful or potentially traumatic life events or circumstances.

In studies of human resilience over the last three decades, particular adverse events, including physical or sexual assault, the loss of a parent, homelessness and natural disasters, have generally been linked to poorer mental-health outcomes. Studies of people who suffer disability or unemployment have shown lower life satisfaction lasting at least several years. And more adversity has generally predicted worse outcomes.

But Mark Seery, a researcher in the Department of Psychology at the University at Buffalo who co-authored the new study, says much of that work has focused on personal characteristics or social resources that promote resilience. The potential benefits of exposure to some adversity, relative to no adversity, have received less attention, he says.

Dr. Seery says his study shows that, under the right conditions, experiencing some adversity may foster resilience. Participants were asked whether they had experienced each of 37 negative events and the ages at which they occurred. Subjects with a history of some lifetime adversity showed lower distress, fewer symptoms of post-traumatic stress, and higher life satisfaction. They also appeared to handle recent adverse events better than other participants. Dr. Seery says age, personality characteristics and social support systems had no measurable impact on the relationship between adversity and mental health.

"So much of the existing literature shows that having experience with a negative life event is bad, with negative effects on mental and physical health," says Dr. Seery. "But we’ve found that that is not the whole story, and that people are more resilient in general than we may think."

Adversity, Dr. Seery adds, can help people develop a "psychological immune system" to help them cope with the slings and arrows that life throws, while those with no experience of adversity may have a hard time dealing with tough times.

At the same time, higher levels of adversity, the study found, can overtax coping skills and support networks, creating feelings of hopelessness and loss of control, disrupting the development of toughness and taking a toll on mental health and well-being. Under those circumstances, Dr. Seery says, even the most minor hassles can seem overwhelming.

Dr. Seery says people who have experienced around two to four adverse events in their lifetimes appeared to be the best off. Recent events—within the last six to 18 months—signaled worse mental health on the whole, suggesting that it may take time for an experience of adversity to bolster resilience.

Ann Masten, an expert in resilience in young people at the University of Minnesota in Minneapolis, says that even if people are capable of adapting to adversity, it is still important to have community and social networks in place to help people deal with the aftermath of adverse events. "We do have enormous capacity for resilience, but that doesn’t mean horrible experiences are good for you," says Dr. Masten. "We need to have a better understanding of how protective systems work and how to mobilize them when they aren’t present."

About 53% of the adversity-study participants were female, and nearly 74% identified themselves as white, non-Hispanic.

Posted in Uncategorized | Leave a Comment »

Annette Gordon-Reed’s ‘Genius’ Pursuit

Posted by jcmaziquemd on October 25, 2010

Posted in Uncategorized | Leave a Comment »