STREET SMART

ARTICLES ABOUT EDUCATION, LITERATURE, ARTS, CULTURE AND CURRENT EVENTS PRESENTED IN THE BELIEF THAT THIS IS THE KIND OF KNOWLEDGE A YOUNG PERSON NEEDS TO BE TRULY STREET SMART


WHAT'S WRONG WITH AMERICAN HIGH SCHOOLS?

 

STEVE JOBS' 2005 COMMENCEMENT ADDRESS AT STANFORD UNIVERSITY

 

GEORGE WILL'S 2003 COMMENCEMENT ADDRESS AT BOSTON UNIVERSITY

 

DAVID FOSTER WALLACE'S 2005 COMMENCEMENT ADDRESS AT KENYON COLLEGE

 

THE LONG RUN WITH SPRINGSTEEN

 

IS AP TOO GOOD TO BE TRUE?

 

WHY LENNON WILL ALWAYS MATTER

 

GENIUS DENIED: THE PROBLEM WITH "NO CHILD LEFT BEHIND"

 

BONDING THROUGH BOOKS

 

AS DEADLINES NEAR, STUDENTS SWEAT IT OUT OVER COLLEGE ESSAYS

 

THE POWER OF FICTION

WHY JOHNNY CAN'T BE BOTHERED

SIX OF 100 CHICAGO PUBLIC SCHOOL FRESHMEN WILL GET A COLLEGE DEGREE

BOSS GETS FOLKIE WITH 'SEEGER SESSIONS'

SPRINGSTEEN DOES SEEGER PROUD, FOLKS

DEBATE SHROUDS FILM ON BRIDGE SUICIDES

SCHOLARS DISCOVER LOST SAMUEL BECKETT PLAY

CHEAPENING THE CAP AND GOWN

MARSHMALLOWS AND PUBLIC POLICY

WHAT IS THE BEST WORK OF AMERICAN FICTION OF THE LAST 25 YEARS?

BAN ON BOOKS TO GET A VOTE

CAUTION! SEX, VIOLENCE AND DANGEROUS IDEAS. DON’T READ!

KIDS GONE WILD? IN PRAISE OF HAZING

 

RAPPERS UPSET AT OPRAH? HERE’S WHY WE DON’T CARE

 

PERFECT’S NEW PROFILE, WARTS AND ALL

 

19TH CENTURY OUTWEIGHS 20TH FOR TOP NOVELS

 

APOCALYPTIC LEAR

 

DYLAN FINDS HIS VOICE

 

DYLAN SHOW A MOODY MYSTERY

 

DROPOUT NATION

 

OPRAH'S TRUTH SHOULDN'T HURT

 

ON EDUCATION: THREE ESSAYS BY CHARLES MURRAY
 
THIS COLUMN GETS SO GHETTO
 
TO BE OR NOT TO BE
 

EVEN WITH THE BEST INTENTIONS…

 

KEEPING GOOD TEACHERS IS THE TRUE TEST

 
THE TELL-ALL CAMPUS TOUR
 
MEDIA USE BY TEENS, TWEENS GROWS TO 52 HOURS A WEEK
 
EDUCATION REFORMS GET A FAILING GRADE
 
THE THINGS HE WRITES ABOUT
 
METAFICTION AND O'BRIEN'S THE THINGS THEY CARRIED
 
HOW TO TELL A TRUE WAR STORY
 
SOME EDUCATORS QUESTION IF WHITEBOARDS, OTHER HIGH-TECH TOOLS RAISE ACHIEVEMENT
 

A DOWNSIDE TO HIGH TEEN SELF-ESTEEM

 

MOCKINGBIRD STILL SINGS AFTER 50 YEARS

 

LATINA RAPPERS MAKE THEIR VOICES HEARD

 

HOW ABOUT BETTER PARENTS?

 

FACING A ROBO-GRADER? JUST KEEP OBFUSCATING MELLIFLUOUSLY

 

COLLEGE LESSON: THERE IS NO "ONE"

 

STREAMING VIDEO'S EMERGING BOUNTY

 

THE TROUBLE WITH ONLINE EDUCATION

 

BATMAN’S WORLD OF DREAD, ON SCREEN AND OFF

 

UNNATURAL SELECTION

 

QUESTIONING THE NATURE OF EDUCATION

 

WHAT OUR SCHOOLS CAN’T DO – BUT PARENTS CAN

 

IT'S TIME TO DUMP STANDARDIZED TESTS

 

HOW HIPSTERS RUINED PARIS


WHAT'S WRONG WITH AMERICAN HIGH SCHOOLS?

 

Our schools were conceived decades ago to meet the needs of another age. It's time for a serious redesign

By Bill Gates, chairman of Microsoft and co-founder of the Bill & Melinda Gates Foundation. Los Angeles Times

March 4, 2005

Our high schools are obsolete.

By obsolete, I don't just mean that they are broken, flawed and underfunded--although I can't argue with any of those descriptions.

What I mean is that they were designed 50 years ago to meet the needs of another age. Today, even when they work exactly as designed, our high schools cannot teach our kids what they need to know.

Until we design high schools to meet the needs of the 21st Century, we will keep limiting--even ruining--the lives of millions of Americans every year. Frankly, I am terrified for our work force of tomorrow. The idea behind the old high school system was that you could train an adequate work force by sending only a small fraction of students to college, and that the other kids either couldn't do college work or didn't need to.

Sure enough, today only one-third of our students graduate from high school ready for college, work and citizenship.

The others, most of whom are low-income and minority students, are tracked into courses that won't ever get them ready for any of those things--no matter how well the students learn or how hard the teachers work.

In district after district across the country, wealthy white kids are taught Algebra II, while low-income minority kids are taught how to balance a checkbook.

This is an economic disaster. In the international competition to have the best supply of workers who can communicate clearly, analyze information and solve complex problems, the United States is falling behind. We have one of the highest high school dropout rates in the industrialized world.

In math and science, our 4th-graders rank among the top students in the world, but our 12th-graders are near the bottom. China has six times as many college graduates in engineering as the United States.

As bad as it is for our economy, it's even worse for our students. Today, most jobs that pay enough to support a family require some post-secondary education. Yet only half of all students who enter high school enroll in a post-secondary institution.

High school dropouts have it worst of all. Only 40 percent have jobs. They are nearly four times more likely to be arrested than their friends who stayed in high school. And they die young because of years of poor health care, unsafe living conditions and violence.

We can put a stop to this. We designed these high schools; we can redesign them.

We have to do away with the outdated idea that only some students need to be ready for college and that the others can walk away from higher education and still thrive in our 21st Century society. We need a new design that realizes that all students can do rigorous work.

There is mounting evidence in favor of this approach. Take the Kansas City, Kan., public school district, where 79 percent of students are minorities and 74 percent live below the poverty line. For years, the district struggled with high dropout rates and low test scores. In 1996, it adopted a school-reform model that, among many other steps, requires all students to take college-prep courses. Since then, the district's graduation rate has climbed more than 30 percentage points.

Kansas City is not an isolated example. Exciting work is under way to improve high schools in such cities as Chicago, Oakland and New York.

All of these schools are organized around three powerful principles: Ensure that all students are given a challenging curriculum that prepares them for college or work; that their courses clearly relate to their lives and goals; and that they are surrounded by adults who push them to achieve.

This kind of change is never easy. But I believe there are three ways that political and business leaders at every level can help build momentum for change in our schools.

First, declare that all students must graduate from high school ready for college, work and citizenship. Every politician and chief executive in the country should speak up for the belief that children need to take courses that prepare them for college.

Second, publish the data that measure our progress toward that goal. We already have some data that show us the extent of the problem. But we need to know more: What percentage of students are dropping out? What percentage are graduating? And this data must be broken down by race and income.

Finally, every state should commit to turning around failing schools and opening new ones. When the students don't learn, the school must change. Every state needs a strong intervention strategy to improve struggling schools.

If we keep the system as it is, millions of children will never get a chance to fulfill their promise because of their ZIP code, their skin color or their parents' income. That is offensive to our values.

Every kid can graduate ready for college. Every kid should have the chance.

Let's redesign our schools to make it happen.

Copyright © 2005, Chicago Tribune

 

STEVE JOBS' 2005 COMMENCEMENT ADDRESS AT STANFORD UNIVERSITY

Thank you. I'm honored to be with you today for your commencement from one of the finest universities in the world. Truth be told, I never graduated from college and this is the closest I've ever gotten to a college graduation.

Today I want to tell you three stories from my life. That's it. No big deal. Just three stories. The first story is about connecting the dots.

I dropped out of Reed College after the first six months but then stayed around as a drop-in for another eighteen months or so before I really quit. So why did I drop out? It started before I was born. My biological mother was a young, unwed graduate student, and she decided to put me up for adoption. She felt very strongly that I should be adopted by college graduates, so everything was all set for me to be adopted at birth by a lawyer and his wife, except that when I popped out, they decided at the last minute that they really wanted a girl. So my parents, who were on a waiting list, got a call in the middle of the night asking, "We've got an unexpected baby boy. Do you want him?" They said, "Of course." My biological mother found out later that my mother had never graduated from college and that my father had never graduated from high school. She refused to sign the final adoption papers. She only relented a few months later when my parents promised that I would go to college.

This was the start in my life. And seventeen years later, I did go to college, but I naïvely chose a college that was almost as expensive as Stanford, and all of my working-class parents' savings were being spent on my college tuition. After six months, I couldn't see the value in it. I had no idea what I wanted to do with my life, and no idea of how college was going to help me figure it out, and here I was, spending all the money my parents had saved their entire life. So I decided to drop out and trust that it would all work out OK. It was pretty scary at the time, but looking back, it was one of the best decisions I ever made. The minute I dropped out, I could stop taking the required classes that didn't interest me and begin dropping in on the ones that looked far more interesting.

It wasn't all romantic. I didn't have a dorm room, so I slept on the floor in friends' rooms. I returned Coke bottles for the five-cent deposits to buy food with, and I would walk the seven miles across town every Sunday night to get one good meal a week at the Hare Krishna temple. I loved it. And much of what I stumbled into by following my curiosity and intuition turned out to be priceless later on. Let me give you one example.

Reed College at that time offered perhaps the best calligraphy instruction in the country. Throughout the campus every poster, every label on every drawer was beautifully hand-calligraphed. Because I had dropped out and didn't have to take the normal classes, I decided to take a calligraphy class to learn how to do this. I learned about serif and sans-serif typefaces, about varying the amount of space between different letter combinations, about what makes great typography great. It was beautiful, historical, artistically subtle in a way that science can't capture, and I found it fascinating.

None of this had even a hope of any practical application in my life. But ten years later when we were designing the first Macintosh computer, it all came back to me, and we designed it all into the Mac. It was the first computer with beautiful typography. If I had never dropped in on that single course in college, the Mac would have never had multiple typefaces or proportionally spaced fonts, and since Windows just copied the Mac, it's likely that no personal computer would have them.

If I had never dropped out, I would have never dropped in on that calligraphy class, and personal computers might not have the wonderful typography that they do.

Of course it was impossible to connect the dots looking forward when I was in college, but it was very, very clear looking backwards 10 years later. Again, you can't connect the dots looking forward. You can only connect them looking backwards, so you have to trust that the dots will somehow connect in your future. You have to trust in something--your gut, destiny, life, karma, whatever--because believing that the dots will connect down the road will give you the confidence to follow your heart, even when it leads you off the well-worn path, and that will make all the difference.

My second story is about love and loss. I was lucky. I found what I loved to do early in life. Woz and I started Apple in my parents' garage when I was twenty. We worked hard and in ten years, Apple had grown from just the two of us in a garage into a $2 billion company with over 4,000 employees. We'd just released our finest creation, the Macintosh, a year earlier, and I'd just turned thirty, and then I got fired. How can you get fired from a company you started? Well, as Apple grew, we hired someone who I thought was very talented to run the company with me, and for the first year or so, things went well. But then our visions of the future began to diverge, and eventually we had a falling out. When we did, our board of directors sided with him, and so at thirty, I was out, and very publicly out. What had been the focus of my entire adult life was gone, and it was devastating. I really didn't know what to do for a few months. I felt that I had let the previous generation of entrepreneurs down, that I had dropped the baton as it was being passed to me. I met with David Packard and Bob Noyce and tried to apologize for screwing up so badly. I was a very public failure and I even thought about running away from the Valley. But something slowly began to dawn on me. I still loved what I did. The turn of events at Apple had not changed that one bit. I'd been rejected but I was still in love. And so I decided to start over.

I didn't see it then, but it turned out that getting fired from Apple was the best thing that could have ever happened to me. The heaviness of being successful was replaced by the lightness of being a beginner again, less sure about everything. It freed me to enter one of the most creative periods in my life. During the next five years I started a company named NeXT, another company named Pixar and fell in love with an amazing woman who would become my wife. Pixar went on to create the world's first computer-animated feature film, "Toy Story," and is now the most successful animation studio in the world.

In a remarkable turn of events, Apple bought NeXT, I returned to Apple, and the technology we developed at NeXT is at the heart of Apple's current renaissance, and Laurene and I have a wonderful family together.

I'm pretty sure none of this would have happened if I hadn't been fired from Apple. It was awful-tasting medicine but I guess the patient needed it. Sometimes life's going to hit you in the head with a brick. Don't lose faith. I'm convinced that the only thing that kept me going was that I loved what I did. You've got to find what you love, and that is as true for work as it is for your lovers. Your work is going to fill a large part of your life, and the only way to be truly satisfied is to do what you believe is great work, and the only way to do great work is to love what you do. If you haven't found it yet, keep looking, and don't settle. As with all matters of the heart, you'll know when you find it, and like any great relationship it just gets better and better as the years roll on. So keep looking. Don't settle.

My third story is about death. When I was 17 I read a quote that went something like "If you live each day as if it was your last, someday you'll most certainly be right." It made an impression on me, and since then, for the past 33 years, I have looked in the mirror every morning and asked myself, "If today were the last day of my life, would I want to do what I am about to do today?" And whenever the answer has been "no" for too many days in a row, I know I need to change something. Remembering that I'll be dead soon is the most important thing I've ever encountered to help me make the big choices in life, because almost everything--all external expectations, all pride, all fear of embarrassment or failure--these things just fall away in the face of death, leaving only what is truly important. Remembering that you are going to die is the best way I know to avoid the trap of thinking you have something to lose. You are already naked. There is no reason not to follow your heart.

About a year ago, I was diagnosed with cancer. I had a scan at 7:30 in the morning and it clearly showed a tumor on my pancreas. I didn't even know what a pancreas was. The doctors told me this was almost certainly a type of cancer that is incurable, and that I should expect to live no longer than three to six months. My doctor advised me to go home and get my affairs in order, which is doctors' code for "prepare to die." It means to try and tell your kids everything you thought you'd have the next ten years to tell them, in just a few months. It means to make sure that everything is buttoned up so that it will be as easy as possible for your family. It means to say your goodbyes.

I lived with that diagnosis all day. Later that evening I had a biopsy where they stuck an endoscope down my throat, through my stomach into my intestines, put a needle into my pancreas and got a few cells from the tumor. I was sedated but my wife, who was there, told me that when they viewed the cells under a microscope, the doctor started crying, because it turned out to be a very rare form of pancreatic cancer that is curable with surgery. I had the surgery and, thankfully, I am fine now.

This was the closest I've been to facing death, and I hope it's the closest I get for a few more decades. Having lived through it, I can now say this to you with a bit more certainty than when death was a useful but purely intellectual concept. No one wants to die, even people who want to go to Heaven don't want to die to get there, and yet, death is the destination we all share. No one has ever escaped it. And that is as it should be, because death is very likely the single best invention of life. It's life's change agent; it clears out the old to make way for the new. Right now, the new is you. But someday, not too long from now, you will gradually become the old and be cleared away. Sorry to be so dramatic, but it's quite true. Your time is limited, so don't waste it living someone else's life. Don't be trapped by dogma, which is living with the results of other people's thinking. Don't let the noise of others' opinions drown out your own inner voice, heart and intuition. They somehow already know what you truly want to become. Everything else is secondary.

When I was young, there was an amazing publication called The Whole Earth Catalog, which was one of the bibles of my generation. It was created by a fellow named Stewart Brand not far from here in Menlo Park, and he brought it to life with his poetic touch. This was in the late Sixties, before personal computers and desktop publishing, so it was all made with typewriters, scissors, and Polaroid cameras. It was sort of like Google in paperback form thirty-five years before Google came along. It was idealistic, overflowing with neat tools and great notions. Stewart and his team put out several issues of The Whole Earth Catalog, and then when it had run its course, they put out a final issue. It was the mid-Seventies and I was your age. On the back cover of their final issue was a photograph of an early morning country road, the kind you might find yourself hitchhiking on if you were so adventurous. Beneath were the words, "Stay hungry, stay foolish." It was their farewell message as they signed off. "Stay hungry, stay foolish." And I have always wished that for myself, and now, as you graduate to begin anew, I wish that for you. Stay hungry, stay foolish.

Thank you all, very much.

 

GEORGE WILL'S 2003 COMMENCEMENT ADDRESS AT BOSTON UNIVERSITY

George F. Will
Boston University
Commencement Address
May 18, 2003
 

Chancellor Silber. Distinguished faculty. Proud parents–proud and somewhat poorer than you were four years ago.
And, especially, members–fellow members, I guess I am entitled to say–of the Class of 2003.


I thank Boston University for allowing me to inflict upon you the last lecture you must endure. I am a faculty brat, a son of a philosophy professor. And before I turned to journalism–or as my father thought, before I sank to journalism–I was, briefly, a professor of political philosophy.


So I know something about commencement addresses. I know that brevity is not only the soul of wit and the essence of lingerie. Brevity also is, on occasions like this, mandatory.


I also know that, as has been well said, journalism often involves informing people who have never heard of a particular person that that person has died. Today I want to tell you about such a person. About a great journalist, a great American, who died 10 days ago, halfway through his one-hundredth year.

 
Few if any of you have ever heard of Sam Lacy. Long before he died his name–never widely known outside the African American community–had been largely lost to history. But we live in a world he helped to make very much better.
And where you and I are right now is a suitable setting for paying him homage by talking about three things–baseball, which he loved; our race problems, which he helped to ameliorate; and the study of American history, some of which Sam Lacy made.
Historical knowledge can enrich any day and any activity–such as this day, and this ceremony–and infuse it with fun, as I shall try to demonstrate. But historical knowledge also is a vital civic virtue, for two reasons.


As de Tocqueville warned, historical amnesia is an abiding weakness of democracies because they are so focused on the future. And America’s often unreasonable dissatisfaction with its present often arises from an insufficient understanding of the past. Americans’ chronic social hypochondria is a consequence of the historical amnesia de Tocqueville warned about. That is, social hypochondria is a consequence of not understanding how far we have come, so fast, in social betterment.


Let me illustrate this–both the simple everyday fun of historical knowledge, and the civic importance of such knowledge–with reference to baseball.


Baseball is always on my mind. I write about politics primarily to support my baseball habit. However, the national pastime, properly understood, is rich with pertinent lessons for the nation.


You are sitting just now in what were, until 1953, the bleachers along the right field foul line in Braves Field, home of the National League’s Boston Braves, who after the 1952 season decamped for Milwaukee, en route to Atlanta.
Braves Field was the scene of what is, to this day, the longest game in major league history in terms of innings played. In 1920, the Braves and the Dodgers played a 26-inning 1-1 tie, called because of darkness. The two starting pitchers were the only pitchers. The Braves pitcher racked up 21 consecutive scoreless innings–still a single game major league record. And only three balls were used in the entire game. And this game lasted less than four hours. Things have changed.


You may think that nothing whatsoever pertaining to the Braves has ever come to your attention. Think again.
For a few years before and after the Second World War the Braves had a player called Bama Rowell, so named because he hailed from the resonantly named town of Citronelle, Alabama. His big league career didn’t amount to a row of beans. But something he did once when the Braves were playing in Brooklyn made Bama Rowell a footnote in literary history.
On Memorial Day 1946, Bama Rowell hit a ball that smashed the clock over the rightfield scoreboard in old Ebbets Field, showering the Dodgers’ rightfielder with broken glass. Watching in the stands was a Brooklyn novelist, Bernard Malamud. Six years later he wrote "The Natural." In one episode the protagonist, Roy Hobbs–you may remember Robert Redford playing the part in the movie–hits a home run that smashes a clock, creating a shower of sparks.


You see? Even historical trivia can be fun. And even historical trivia can cast important light on contemporary life. Let me give another example of that.


Today the very name Boston Braves might be considered politically incorrect–trivializing, even dehumanizing Native Americans by reducing them to the status of mascots. But here a little historical knowledge is helpful.


Before Boston’s National League team was called the Braves, it was called, among other things, the Beaneaters, the Doves and the Rustlers. But the man who bought the team in 1911 was from New York City, where he was a member of the Tammany Hall political club. That club was named after an Indian chief named Tamanend. And Tammany Hall called its members braves. And so he named his team the Braves.


The Washington Redskins football team began life as the Boston Redskins, because they played in Braves Field and were named in emulation of the baseball Braves.


And by the way, the Cleveland Indians came to be called the Indians out of respect for one of their early players, who was the first Native American baseball player, Lou Sockalexis.


You see what I mean about how a little knowledge of history is fun. And it is useful in complicating some easy–too easy–moral snap judgments. Including the judgments that fuel one of America’s largest industries–the indignation industry of those who seem to be happy only when unhappy about some sign of unregenerate American racism, including supposed racism manifested in the names of the sports teams.


Of course baseball, the national pastime, like the nation itself, had a long history of racism. And then Sam Lacy stepped, as it were, to the plate.


Lacy, whose father was African American and whose mother was a Shinnecock Indian, was born in Mystic, Connecticut, but grew up in Washington, D.C., which was then a very Southern, very segregated city. He became a baseball fan. A fan of the Negro leagues, of course, but also of the old Washington Senators, which was not easy at a time when the saying was "Washington–first in war, first in peace and last in the American League."


How bad were the Senators? Their owner, Clark Griffith, once said, "Fans like home runs–and we have assembled a pitching staff to please our fans."


Nevertheless, Sam Lacy loved the Senators, and loved Major League Baseball even though African Americans were excluded from its playing fields, and in Washington–as in St. Louis–they were confined to segregated sections of the stands. He hung out at the Senators’ ballpark, shagging flies, running errands for the players and working as a vendor in the stands.
After graduating from Howard University, Sam Lacy became a sports writer for African American newspapers, first, in 1930, at the Washington Tribune, then in Chicago, and after 1943, in Baltimore. He became a tireless advocate for the integration of Major League Baseball. Writing columns, writing letters, he prodded Baseball Commissioner Kenesaw Mountain Landis.
Landis, who was named after the site of a Civil War battle, was a Confederate at heart, and was hostile to Sam Lacy’s pressure. But Lacy persisted, contacting people in Major League Baseball who he thought might be sympathetic, including Branch Rickey of the Dodgers.


In 1945 Lacy wrote:
"Baseball has given employment to known epileptics, kleptomaniacs, and a generous scattering of saints and sinners. A man who is totally lacking in character has turned out to be a star in baseball. A man whose skin is white or red or yellow has been acceptable. But a man whose character may be of the highest and whose ability may be Ruthian has been barred completely from the sport because he is colored."


Notice Lacy’s use of the word "character." Lacy knew that the first black big leaguer would need exceptional talent–and even more exceptional character.


Early on Lacy focused on an African American player who by 1940 had established himself as one of the greatest all-around athletes America had ever seen. This athlete became the first man at UCLA to letter in four sports. In football as a junior he led the Pacific Coast Conference in rushing, averaging 11 yards a carry. Yes, 11 yards.
He also led the conference in scoring in basketball. Twice.


On the track team he won the NCAA broad jump championship. He dabbled at golf and swimming, winning championships in each.


And he could play a little baseball.


His name was Jackie Roosevelt Robinson.


By 1945 he was playing baseball in the Negro leagues. Lacy was one of those who advised Branch Rickey that Robinson had the temperament to play the demanding game of baseball with poise even while enduring the predictable pressures and abuse of a racial pioneer.


But before the color line was erased in Brooklyn, Lacy and others tried to get it erased in Boston.


The Boston Braves were, almost always, dreadful. In fact, in the 1930s a new owner thought a change of names might improve the team’s luck. Fans were invited to suggest names–and suggested the Boston Bankrupts and the Boston Basements. The newspaper people judging the suggested names picked the Boston Bees–primarily because a short name would simplify writing headlines. And the Bees they were for several years, before again becoming the Braves.


But because the Braves were so bad, they would at least listen to a good idea.


In 1935 a Boston civil rights pioneer, an African American, approached both the Braves and Red Sox about hiring an African American player. The Red Sox gave him short shrift. The Braves, too, ultimately flinched from challenging the major leagues’ color line–but because the Braves were so awful, they took the idea seriously.
Notice what was stirring. Competition concentrates the mind on essentials. Sport is the competitive pursuit of excellence. The teams most in need of excellence were the ones most receptive to the idea that baseball should be colorblind.
Consider the case of Boston’s other team.


Boston has always been an American League city. So the Red Sox were more complacent than the Braves. Hence the Red Sox were less receptive to the wholesome radicalism of the nascent civil rights movement.


But in 1945 a member of Boston’s city council threatened that if the Boston teams continued to resist the integration efforts of Sam Lacy and others, he, the city councilman, would block the annual renewal of the license that allowed the Braves and Red Sox to play on Sundays.


The Red Sox replied, with breathtaking disingenuousness, that no African Americans had ever asked to play for them and none probably wanted to because they could make more money in the Negro Leagues. The city councilman enlisted the help of a journalistic colleague of Sam Lacy and brought three African American players to Boston for a workout at Fenway Park on an off-day.


One of them was Sam Jethroe, an outfielder who later would play here, for the Braves. Another was Jackie Robinson. The Red Sox official responsible for signing players would not even attend the workout.


At the end of the workout a voice from deep in the Fenway Park stands shouted, "Get those niggers off the field!" It was 14 more years–1959–before the Red Sox finally fielded an African American player, Pumpsie Green. At that time there were just 16 major league teams. The Red Sox were the 16th to integrate.


During their bitter-end resistance to integration the Red Sox sent a scout to Birmingham, Alabama, to look at an outfielder playing for the Birmingham Black Barons. The scout reported laconically that the outfielder was not the Red Sox kind of player. The scout was right about that. The outfielder was Willie Mays.


The Red Sox suffered condign punishment for their bad behavior. In 1946 they lost the seventh game of the World Series. In 1948 and 1949 they lost the American League pennant on the last day of the season. The Red Sox might have won a World Series and two pennants with Jackie Robinson in the lineup.


The national pastime was integrated in 1947, a year before the nation’s military abolished segregated units. But three years before that–11 years before Rosa Parks refused to move to the back of a segregated bus in Montgomery, Alabama–Lt. Jackie Robinson of the United States Army was court-martialed for refusing, at Fort Hood, Texas, to obey a bus driver’s orders to move to the back of a segregated bus. Robinson was acquitted.


It is instructive that the two most thoroughly and successfully integrated spheres of American life are professional sports and the military. This is, I submit, related to the fact that both are severe meritocracies.


The military is meritocratic because competence and excellence are matters of life and death–for individuals and for nations. Sports are meritocratic because competence and excellence are measured relentlessly, play-by-play, day-by-day, in wins and losses. Particularly in baseball, the sport of the box score, that cold retrospective eye of the morning after.
Today the principle that individuals should be judged on their individual merits, not on their membership in this or that group, is still under attack. The attack is against a core principle of an open society–the principle of careers open to talents. Today there are pernicious new arguments for treating certain groups of Americans as incapable of doing what Sam Lacy knew Jackie Robinson could do: compete.


Sometime in the next few weeks the Supreme Court, in a case rising from the University of Michigan, will rule on the question of whether racial preferences in college admissions are compatible with the constitutional requirement of equal protection of the laws for all individuals. The argument about racial preferences is another stage–in my judgment, another deplorable detour–on our long national march toward a colorblind society.


The lives of Sam Lacy and Jackie Robinson remind us that a core principle of an open society is careers open to talents. Open to individuals, without interference–and without favoritism.


It is no accident that baseball was central to the lives of Lacy and Robinson, and to their crusade for a meritocratic society blind to color. Baseball’s season, like life, is long–162 games, 1,458 innings. In the end, the cream rises–quality tells.


Quality told in April 1946, when Jackie Robinson went to spring training with the Montreal Royals, the Dodgers’ highest minor league affiliate.


In an exhibition game he faced a veteran pitcher, a Kentuckian, who thought he would test Robinson’s grit by throwing a fastball at his head. Robinson sprawled in the dirt, then picked himself up, dusted himself off and lashed the next pitch for a single.
The next time Robinson came to bat, the Kentuckian again threw at Robinson’s head. Again, Robinson hit the dirt. And then he hit the next pitch. Crushed it, for a triple.


After the game the Kentucky pitcher went to Robinson’s manager, another Southerner, and said simply, one Southerner to another: "Your colored boy is going to do all right."


He did more than all right. Jackie Robinson became 1947’s rookie of the year, en route to the Hall of Fame.


In 1948, Sam Lacy became the first African American member of the Baseball Writers Association of America. And in 1997, the day before he turned 94, he was inducted into the writers and broadcasters wing of the Baseball Hall of Fame. So Sam and Jackie will forever be, as it were, teammates in Cooperstown.


But, then, there is a kind of magic in the way their lives were entwined. Jackie Robinson was signed to a Dodger contract on October 23, 1945, Sam Lacy’s forty-second birthday.


His forty-second. Jackie Robinson’s number–42–is the only number that has been permanently retired, as an act of homage to Robinson, by all 30 major league teams.


As you, the Class of 2003, leave here today, outward bound for lives in an America where careers are open to talents, remember Sam Lacy’s story, and all such stories that comprise the pageant of American history.


Be determined enemies of historical amnesia, because before long, when the Class of 2053 sits where you’re sitting, your stories will have become part of that pageant.


I congratulate you in advance for all that you will achieve, thanks in part to the parents who made it possible for you to benefit from this great university. And I thank you for the privilege of being here at your embarkation.


DAVID FOSTER WALLACE'S 2005 COMMENCEMENT ADDRESS AT KENYON COLLEGE

(If anybody feels like perspiring [cough], I'd advise you to go ahead, because I'm sure going to. In fact I'm gonna [mumbles while pulling up his gown and taking out a handkerchief from his pocket].) Greetings ["parents"?] and congratulations to Kenyon's graduating class of 2005. There are these two young fish swimming along and they happen to meet an older fish swimming the other way, who nods at them and says "Morning, boys. How's the water?" And the two young fish swim on for a bit, and then eventually one of them looks over at the other and goes "What the hell is water?"

This is a standard requirement of US commencement speeches, the deployment of didactic little parable-ish stories. The story ["thing"] turns out to be one of the better, less bullshitty conventions of the genre, but if you're worried that I plan to present myself here as the wise, older fish explaining what water is to you younger fish, please don't be. I am not the wise old fish. The point of the fish story is merely that the most obvious, important realities are often the ones that are hardest to see and talk about. Stated as an English sentence, of course, this is just a banal platitude, but the fact is that in the day to day trenches of adult existence, banal platitudes can have a life or death importance, or so I wish to suggest to you on this dry and lovely morning.

Of course the main requirement of speeches like this is that I'm supposed to talk about your liberal arts education's meaning, to try to explain why the degree you are about to receive has actual human value instead of just a material payoff. So let's talk about the single most pervasive cliché in the commencement speech genre, which is that a liberal arts education is not so much about filling you up with knowledge as it is about quote teaching you how to think. If you're like me as a student, you've never liked hearing this, and you tend to feel a bit insulted by the claim that you needed anybody to teach you how to think, since the fact that you even got admitted to a college this good seems like proof that you already know how to think. But I'm going to posit to you that the liberal arts cliché turns out not to be insulting at all, because the really significant education in thinking that we're supposed to get in a place like this isn't really about the capacity to think, but rather about the choice of what to think about. If your total freedom of choice regarding what to think about seems too obvious to waste time discussing, I'd ask you to think about fish and water, and to bracket for just a few minutes your skepticism about the value of the totally obvious.

Here's another didactic little story. There are these two guys sitting together in a bar in the remote Alaskan wilderness. One of the guys is religious, the other is an atheist, and the two are arguing about the existence of God with that special intensity that comes after about the fourth beer. And the atheist says: "Look, it's not like I don't have actual reasons for not believing in God. It's not like I haven't ever experimented with the whole God and prayer thing. Just last month I got caught away from the camp in that terrible blizzard, and I was totally lost and I couldn't see a thing, and it was fifty below, and so I tried it: I fell to my knees in the snow and cried out 'Oh, God, if there is a God, I'm lost in this blizzard, and I'm gonna die if you don't help me.'" And now, in the bar, the religious guy looks at the atheist all puzzled. "Well then you must believe now," he says, "After all, here you are, alive." The atheist just rolls his eyes. "No, man, all that was was a couple Eskimos happened to come wandering by and showed me the way back to camp."

It's easy to run this story through kind of a standard liberal arts analysis: the exact same experience can mean two totally different things to two different people, given those people's two different belief templates and two different ways of constructing meaning from experience. Because we prize tolerance and diversity of belief, nowhere in our liberal arts analysis do we want to claim that one guy's interpretation is true and the other guy's is false or bad. Which is fine, except we also never end up talking about just where these individual templates and beliefs come from. Meaning, where they come from INSIDE the two guys. As if a person's most basic orientation toward the world, and the meaning of his experience were somehow just hard-wired, like height or shoe-size; or automatically absorbed from the culture, like language. As if how we construct meaning were not actually a matter of personal, intentional choice. Plus, there's the whole matter of arrogance. The nonreligious guy is so totally certain in his dismissal of the possibility that the passing Eskimos had anything to do with his prayer for help. True, there are plenty of religious people who seem arrogant and certain of their own interpretations, too. They're probably even more repulsive than atheists, at least to most of us. But religious dogmatists' problem is exactly the same as the story's unbeliever: blind certainty, a close-mindedness that amounts to an imprisonment so total that the prisoner doesn't even know he's locked up.

The point here is that I think this is one part of what teaching me how to think is really supposed to mean. To be just a little less arrogant. To have just a little critical awareness about myself and my certainties. Because a huge percentage of the stuff that I tend to be automatically certain of is, it turns out, totally wrong and deluded. I have learned this the hard way, as I predict you graduates will, too.

Here is just one example of the total wrongness of something I tend to be automatically sure of: everything in my own immediate experience supports my deep belief that I am the absolute center of the universe; the realest, most vivid and important person in existence. We rarely think about this sort of natural, basic self-centeredness because it's so socially repulsive. But it's pretty much the same for all of us. It is our default setting, hard-wired into our boards at birth. Think about it: there is no experience you have had that you are not the absolute center of. The world as you experience it is there in front of YOU or behind YOU, to the left or right of YOU, on YOUR TV or YOUR monitor. And so on. Other people's thoughts and feelings have to be communicated to you somehow, but your own are so immediate, urgent, real.

Please don't worry that I'm getting ready to lecture you about compassion or other-directedness or all the so-called virtues. This is not a matter of virtue. It's a matter of my choosing to do the work of somehow altering or getting free of my natural, hard-wired default setting which is to be deeply and literally self-centered and to see and interpret everything through this lens of self. People who can adjust their natural default setting this way are often described as being "well-adjusted", which I suggest to you is not an accidental term.

Given the triumphant academic setting here, an obvious question is how much of this work of adjusting our default setting involves actual knowledge or intellect. This question gets very tricky. Probably the most dangerous thing about an academic education -- at least in my own case -- is that it enables my tendency to over-intellectualize stuff, to get lost in abstract argument inside my head, instead of simply paying attention to what is going on right in front of me, paying attention to what is going on inside me.

As I'm sure you guys know by now, it is extremely difficult to stay alert and attentive, instead of getting hypnotized by the constant monologue inside your own head (may be happening right now). Twenty years after my own graduation, I have come gradually to understand that the liberal arts cliché about teaching you how to think is actually shorthand for a much deeper, more serious idea: learning how to think really means learning how to exercise some control over how and what you think. It means being conscious and aware enough to choose what you pay attention to and to choose how you construct meaning from experience. Because if you cannot exercise this kind of choice in adult life, you will be totally hosed. Think of the old cliché about quote the mind being an excellent servant but a terrible master.

This, like many clichés, so lame and unexciting on the surface, actually expresses a great and terrible truth. It is not the least bit coincidental that adults who commit suicide with firearms almost always shoot themselves in: the head. They shoot the terrible master. And the truth is that most of these suicides are actually dead long before they pull the trigger.

And I submit that this is what the real, no bullshit value of your liberal arts education is supposed to be about: how to keep from going through your comfortable, prosperous, respectable adult life dead, unconscious, a slave to your head and to your natural default setting of being uniquely, completely, imperially alone day in and day out. That may sound like hyperbole, or abstract nonsense. Let's get concrete. The plain fact is that you graduating seniors do not yet have any clue what "day in day out" really means. There happen to be whole, large parts of adult American life that nobody talks about in commencement speeches. One such part involves boredom, routine, and petty frustration. The parents and older folks here will know all too well what I'm talking about.

By way of example, let's say it's an average adult day, and you get up in the morning, go to your challenging, white-collar, college-graduate job, and you work hard for eight or ten hours, and at the end of the day you're tired and somewhat stressed and all you want is to go home and have a good supper and maybe unwind for an hour, and then hit the sack early because, of course, you have to get up the next day and do it all again. But then you remember there's no food at home. You haven't had time to shop this week because of your challenging job, and so now after work you have to get in your car and drive to the supermarket. It's the end of the work day and the traffic is apt to be: very bad. So getting to the store takes way longer than it should, and when you finally get there, the supermarket is very crowded, because of course it's the time of day when all the other people with jobs also try to squeeze in some grocery shopping. And the store is hideously lit and infused with soul-killing muzak or corporate pop and it's pretty much the last place you want to be but you can't just get in and quickly out; you have to wander all over the huge, over-lit store's confusing aisles to find the stuff you want and you have to maneuver your junky cart through all these other tired, hurried people with carts (et cetera, et cetera, cutting stuff out because this is a long ceremony) and eventually you get all your supper supplies, except now it turns out there aren't enough check-out lanes open even though it's the end-of-the-day rush. So the checkout line is incredibly long, which is stupid and infuriating. But you can't take your frustration out on the frantic lady working the register, who is overworked at a job whose daily tedium and meaninglessness surpasses the imagination of any of us here at a prestigious college.

But anyway, you finally get to the checkout line's front, and you pay for your food, and you get told to "Have a nice day" in a voice that is the absolute voice of death. Then you have to take your creepy, flimsy, plastic bags of groceries in your cart with the one crazy wheel that pulls maddeningly to the left, all the way out through the crowded, bumpy, littery parking lot, and then you have to drive all the way home through slow, heavy, SUV-intensive, rush-hour traffic, et cetera et cetera.

Everyone here has done this, of course. But it hasn't yet been part of you graduates' actual life routine, day after week after month after year.

But it will be. And many more dreary, annoying, seemingly meaningless routines besides. But that is not the point. The point is that petty, frustrating crap like this is exactly where the work of choosing is gonna come in. Because the traffic jams and crowded aisles and long checkout lines give me time to think, and if I don't make a conscious decision about how to think and what to pay attention to, I'm gonna be pissed and miserable every time I have to shop. Because my natural default setting is the certainty that situations like this are really all about me. About MY hungriness and MY fatigue and MY desire to just get home, and it's going to seem for all the world like everybody else is just in my way. And who are all these people in my way? And look at how repulsive most of them are, and how stupid and cow-like and dead-eyed and nonhuman they seem in the checkout line, or at how annoying and rude it is that people are talking loudly on cell phones in the middle of the line. And look at how deeply and personally unfair this is.

Or, of course, if I'm in a more socially conscious liberal arts form of my default setting, I can spend time in the end-of-the-day traffic being disgusted about all the huge, stupid, lane-blocking SUV's and Hummers and V-12 pickup trucks, burning their wasteful, selfish, forty-gallon tanks of gas, and I can dwell on the fact that the patriotic or religious bumper-stickers always seem to be on the biggest, most disgustingly selfish vehicles, driven by the ugliest [responding here to loud applause] (this is an example of how NOT to think, though) most disgustingly selfish vehicles, driven by the ugliest, most inconsiderate and aggressive drivers. And I can think about how our children's children will despise us for wasting all the future's fuel, and probably screwing up the climate, and how spoiled and stupid and selfish and disgusting we all are, and how modern consumer society just sucks, and so forth and so on.

You get the idea.

If I choose to think this way in a store and on the freeway, fine. Lots of us do. Except thinking this way tends to be so easy and automatic that it doesn't have to be a choice. It is my natural default setting. It's the automatic way that I experience the boring, frustrating, crowded parts of adult life when I'm operating on the automatic, unconscious belief that I am the center of the world, and that my immediate needs and feelings are what should determine the world's priorities.

The thing is that, of course, there are totally different ways to think about these kinds of situations. In this traffic, all these vehicles stopped and idling in my way, it's not impossible that some of these people in SUV's have been in horrible auto accidents in the past, and now find driving so terrifying that their therapist has all but ordered them to get a huge, heavy SUV so they can feel safe enough to drive. Or that the Hummer that just cut me off is maybe being driven by a father whose little child is hurt or sick in the seat next to him, and he's trying to get this kid to the hospital, and he's in a bigger, more legitimate hurry than I am: it is actually I who am in HIS way.

Or I can choose to force myself to consider the likelihood that everyone else in the supermarket's checkout line is just as bored and frustrated as I am, and that some of these people probably have harder, more tedious and painful lives than I do.

Again, please don't think that I'm giving you moral advice, or that I'm saying you are supposed to think this way, or that anyone expects you to just automatically do it. Because it's hard. It takes will and effort, and if you are like me, some days you won't be able to do it, or you just flat out won't want to.

But most days, if you're aware enough to give yourself a choice, you can choose to look differently at this fat, dead-eyed, over-made-up lady who just screamed at her kid in the checkout line. Maybe she's not usually like this. Maybe she's been up three straight nights holding the hand of a husband who is dying of bone cancer. Or maybe this very lady is the low-wage clerk at the motor vehicle department, who just yesterday helped your spouse resolve a horrific, infuriating, red-tape problem through some small act of bureaucratic kindness. Of course, none of this is likely, but it's also not impossible. It just depends what you want to consider. If you're automatically sure that you know what reality is, and you are operating on your default setting, then you, like me, probably won't consider possibilities that aren't annoying and miserable. But if you really learn how to pay attention, then you will know there are other options. It will actually be within your power to experience a crowded, hot, slow, consumer-hell type situation as not only meaningful, but sacred, on fire with the same force that made the stars: love, fellowship, the mystical oneness of all things deep down.

Not that that mystical stuff is necessarily true. The only thing that's capital-T True is that you get to decide how you're gonna try to see it.

This, I submit, is the freedom of a real education, of learning how to be well-adjusted. You get to consciously decide what has meaning and what doesn't. You get to decide what to worship.

Because here's something else that's weird but true: in the day-to-day trenches of adult life, there is actually no such thing as atheism. There is no such thing as not worshipping. Everybody worships. The only choice we get is what to worship. And the compelling reason for maybe choosing some sort of god or spiritual-type thing to worship -- be it JC or Allah, be it YHWH or the Wiccan Mother Goddess, or the Four Noble Truths, or some inviolable set of ethical principles -- is that pretty much anything else you worship will eat you alive. If you worship money and things, if they are where you tap real meaning in life, then you will never have enough, never feel you have enough. It's the truth. Worship your body and beauty and sexual allure and you will always feel ugly. And when time and age start showing, you will die a million deaths before they finally plant you. On one level, we all know this stuff already. It's been codified as myths, proverbs, clichés, epigrams, parables; the skeleton of every great story. The whole trick is keeping the truth up front in daily consciousness.

Worship power, you will end up feeling weak and afraid, and you will need ever more power over others to numb you to your own fear. Worship your intellect, being seen as smart, you will end up feeling stupid, a fraud, always on the verge of being found out. But the insidious thing about these forms of worship is not that they're evil or sinful, it's that they're unconscious. They are default settings.

They're the kind of worship you just gradually slip into, day after day, getting more and more selective about what you see and how you measure value without ever being fully aware that that's what you're doing.

And the so-called real world will not discourage you from operating on your default settings, because the so-called real world of men and money and power hums merrily along in a pool of fear and anger and frustration and craving and worship of self. Our own present culture has harnessed these forces in ways that have yielded extraordinary wealth and comfort and personal freedom. The freedom all to be lords of our tiny skull-sized kingdoms, alone at the center of all creation. This kind of freedom has much to recommend it. But of course there are all different kinds of freedom, and the kind that is most precious you will not hear much talk about in the great outside world of wanting and achieving and [unintelligible -- sounds like "displayal"]. The really important kind of freedom involves attention and awareness and discipline, and being able truly to care about other people and to sacrifice for them over and over in myriad petty, unsexy ways every day.

That is real freedom. That is being educated, and understanding how to think. The alternative is unconsciousness, the default setting, the rat race, the constant gnawing sense of having had, and lost, some infinite thing.

I know that this stuff probably doesn't sound fun and breezy or grandly inspirational the way a commencement speech is supposed to sound. What it is, as far as I can see, is the capital-T Truth, with a whole lot of rhetorical niceties stripped away. You are, of course, free to think of it whatever you wish. But please don't just dismiss it as just some finger-wagging Dr. Laura sermon. None of this stuff is really about morality or religion or dogma or big fancy questions of life after death.

The capital-T Truth is about life BEFORE death.

It is about the real value of a real education, which has almost nothing to do with knowledge, and everything to do with simple awareness; awareness of what is so real and essential, so hidden in plain sight all around us, all the time, that we have to keep reminding ourselves over and over:

"This is water."

"This is water."

It is unimaginably hard to do this, to stay conscious and alive in the adult world day in and day out. Which means yet another grand cliché turns out to be true: your education really IS the job of a lifetime. And it commences: now.

I wish you way more than luck.

 

THE LONG RUN WITH SPRINGSTEEN
 

August 21, 2005

Louis P. Masur, William R. Kenan Jr. Professor of American Institutions and Values at Trinity College

Thirty years ago, Bruce Springsteen's album "Born to Run" thundered onto the American scene to remarkable reviews. Greil Marcus of Rolling Stone declared, "You've never heard anything like this before, but you understand it instantly, because this music ... is what rock 'n' roll is supposed to sound like."

In The New York Times, John Rockwell praised the songs as "poetry that attains universality. ... You owe it to yourself to buy this record." Robert Hilburn of the Los Angeles Times would later observe that "'Born to Run' breathed with the same kind of discovery that made Elvis Presley's 'Sun Sessions' and Bob Dylan's 'Highway 61 Revisited' the two most important American rock albums before it."

The album immediately elevated Springsteen to a cultural icon, and he was hailed as the savior of rock 'n' roll. That October, he appeared on the covers of Time and Newsweek during the same week, and his image--laughing, scruffy, guitar pointed ever upward--perfectly reinforced the message of his music: Life is a continuous journey in search of salvation and love.

On the album cover, a smile lights up Springsteen's face as he leans on his saxophonist Clarence Clemons, whose image carries over to the back. Attached to his guitar strap is an Elvis Presley fan club pin. Springsteen once observed that Presley freed our bodies and Dylan freed our minds. For my generation, Springsteen freed our souls.

I turned 18 in 1975, when Presley was a Vegas lounge act, and the rock revolution ushered in by Dylan's appearance at the Newport Folk Festival and the recording of "Like a Rolling Stone" was a decade in the past. Bruce belonged to us; we were part of the legend from the start. No more older siblings bragging that they saw the Beatles at Shea Stadium or attended Woodstock. I first heard Bruce in 1973, and I followed him and the band to clubs and small theaters, where I soaked in long sets that left me feeling exhilarated.

"Born to Run" gave voice to my dreams of escape and search for meaning. Of course, taking to the road to find yourself is a classic American theme. Bruce and Clarence on the cover are part of a cultural history that includes Herman Melville's Ishmael and Queequeg at sea or Mark Twain's Huck and Jim lighting out for the territory. But each generation conducts its search in its own way and out of its own imperatives.

Mine was something of a post-heroic generation, too young to have participated fully in the cultural rebellions of the 1950s and the civil rights and anti-war movements of the 1960s, yet socialized and politicized by those impulses and seeking direction. Richard Nixon resigned, the Vietnam War ended, and I ached to get away from home. Somehow, national and personal malaise mixed.

In the opening track, "Thunder Road," Springsteen sings "So you're scared and you're thinking that maybe we ain't that young anymore." The line was written by a 24-year-old, and it resonated among teens who feared what all youth fear: boredom, emptiness, meaninglessness. Redemption and love were to be found out on the road, away from "a town full of losers." "Climb in," the narrator insists, "I'm pulling out of here to win."

The song "Born to Run" opened the second side of the album. In this day of single downloads and sideless CDs, it is worth remembering that the best albums of the 1960s and 1970s had movement and drama: We listened to all the cuts and thought about the sequence. Those first few notes invite us on a journey to "get out while we're young" from a town that is "a death trap," "a suicide rap." Suburban comfort and social conformity sapped the soul. "Tramps like us, baby we were born to run," Springsteen declares. The chugging guitar and pounding bass lead us away.

He is striking the same cultural chords as any number of artists: Walt Whitman, Charlie Chaplin, Woody Guthrie and Jack Kerouac, to name a few. But Springsteen has made the highway and the American dream his own. The recording, which took three months in the studio to get pitch perfect, still sounds immediate and vital. Asked recently to identify the one song that embodies his ideals, Springsteen named "Born to Run."

If "Thunder Road" and "Born to Run" offer promises of escape, others songs are less hopeful. Springsteen understands that tragedy stands alongside triumph as a central riff in American culture. On "Backstreets," the summer is "infested," and the characters are "filled with defeat." In "Jungleland," "the street's on fire in a real death waltz," and "lonely-hearted lovers struggle in dark corners."

The themes of love and heroism are reprised throughout, in the poetic words and the soaring arrangements. Those themes re-emerged with new poignancy on Sept. 11, and Springsteen was struck by the references to his work in the Times' "Portraits of Grief." After the service for one victim, mourners sang "Thunder Road" and remembered their friend who "taught us all a lesson in unconditional love." I wonder how many of my generation have left instructions, as I have, for Springsteen to be played at their funeral services.

We may not find everlasting love, and we may turn out not to be heroes. Each of us is "a scared and lonely rider" seeking connection.

"Born to Run" ponders whether somewhere out there "love is wild, ... love is real" and if we keep searching we may eventually "get to that place where we really want to go and we'll walk in the sun." Until then--and pump your fist as you shout it--baby we were born to run.

Copyright © 2005, Chicago Tribune

 

IS AP TOO GOOD TO BE TRUE?

Justin Ewers

If Neil Panchal had known last year what he was in for, he might have tried to get a little more sleep. Panchal, who recently started 12th grade at Barrington High School outside Chicago, is a man on a college admissions mission: He took his first Advanced Placement course as a sophomore, aced the U.S. History exam, and set his sights on an elite school. Last year, when he saw his classmates filling up their schedules with two, three, even four APs, he figured that if he was going to be competitive, it was time to ante up.

So he pushed his nonacademic life aside and signed up for a total of five AP courses as a junior--Spanish, calculus, statistics, European history, and chemistry. The workload made him miserable, of course. He had five hours of homework most every night and scaled back his commitment to his beloved tennis team to just one weekly practice. Nevertheless, this fall Panchal is at it again: He's taking another four APs as a senior, which will bring his grand total, by the time he graduates, to 10. No regrets, though. If his classmates were going to load up on APs, he thought he'd better do the same. Otherwise, "they'd be better off than me when it came to getting into colleges," he says. "I didn't want to fall behind."

Who can blame him? These are, after all, the glory days of Advanced Placement, the latest rage in the ever more challenging race for selective college admissions. AP classes are the highest-level honors curriculum at many high schools: They usually require more reading, more writing, and more problem sets--and they carry high prestige. Some 1.2 million high schoolers took AP exams in 2005, up from 134,000 in 1980. In the past five years, the number of students taking at least one exam in any of the 34 subjects now offered by the College Board, which runs the tests, has jumped by 45 percent. Most colleges continue to give credit to those who score 3 or higher (on a scale of 1 to 5). And as high schools continue to create more AP classes, elite schools are being inundated with applicants like Panchal: At Northwestern University, for example, over 90 percent of incoming freshmen received AP credit last year.

But just as AP, celebrating its 50th anniversary this fall, hits its frenzied peak--with Newsweek going so far as to rank high schools based on the percentage of students who take AP or International Baccalaureate exams--some experts wonder whether the program's wild proliferation has begun to dilute its quality. Several new academic studies indicate that simply taking AP classes--as opposed to passing the end-of-year examinations run by the College Board--isn't a very good predictor of college success. Some high schools are complicating matters by pasting the AP label onto subpar existing courses. And a few highly selective schools have become sufficiently alarmed over the quality of AP classes that they are getting picky about awarding credit even to those who have passed the exams. At Northwestern, for instance, the economics and biology departments grant credit only to students who have earned perfect 5s; Harvard applies the same stringent standard to all AP classes. And Stanford won't give credit at all for such relative newcomers to the Advanced Placement scene as AP Environmental Science and AP World History. A dozen high-profile prep schools have dropped AP altogether. All of which has left some would-be collegegoers and parents wondering: Could Advanced Placement be overrated?

AP was never intended to be at the center of the admissions frenzy, of course. When the program was first developed in 1955, it was designed to give the highest-achieving students, mostly at elite schools, a chance to tackle some college-level coursework--and earn college credit--while still in high school. Placement was the goal, not admissions, thus the name. In the 1980s, though, more and more high schools, seeing how well AP's rigorous subject matter and challenging tests were preparing kids for college, began to add the courses to their curricula. As the program became more popular (and selective colleges continued to show interest in AP students), more parents, believing all students could benefit from exposure to the high-level material covered in AP courses, demanded access. More schools opened the classes to students of all levels, regardless of their ability. Which, of course, had some unintended consequences. Cambridge Rindge & Latin in Massachusetts, for example, offered only one section of AP U.S. History 20 years ago. Today, there are between six and eight sections every year in a school with 1,800 students--which means half of each class is now considered "advanced."

This increased access has had an unmistakable bright side: a dramatic increase in the number of minorities and low-income students enrolling in AP courses. In the past decade, especially, the College Board has given high priority to making AP available to groups that have not, historically, taken the courses. The number of AP exams taken by low-income kids jumped from 32,688 in 1993 to 144,532 ten years later, with minority numbers experiencing a similar leap.

Some analysts wonder, though, whether all of these new students are really doing college-level work. Teens typically receive bonus points on their grade-point averages for completing AP courses. But in a study to be published this fall, Saul Geiser, a research fellow at the Center for Studies in Higher Education at the University of California-Berkeley, comes to the stark conclusion that in California, at least, there is little to no correlation between simply taking an AP course and students' first- and second-year college GPAs. (A study of students in Texas drew a similar conclusion.) Doing well on AP exams is another matter, Geiser says: High test scores are one of the best predictors of college success. But since only about two thirds of California students in AP courses actually take the exam, for many kids the classes themselves are where AP begins and ends. At one time, Geiser says, good grades in AP classes may have been a reasonable barometer of academic ability, but he thinks colleges need to reconsider using AP coursework alone as a criterion in high-stakes admissions. "AP is being used for a purpose for which it has never been validated," he says.

The UC system's powerful statewide admissions board, which commissioned his study, is now considering dropping the bonus points it currently awards for AP courses when calculating GPAs. And the stakes are high: Admissions experts say that by threatening to downgrade the status of AP, much as it did when it considered dropping the SAT in 2001, UC, with its 208,000 students, could do a lot of damage to the program's reputation.

These critics are not the first to take AP to task, of course. A similarly gloomy report written in 2001 by the National Research Council concluded that as AP expands, the courses, too often, are being led by poorly prepared teachers who teach to the test by stressing rote memorization rather than "active problem solving and discussion." Part of the reason for this is the design of the AP curriculum itself, which calls on teachers to cover a lot of ground in a relatively short period of time. In AP U.S. History, for example, students must be familiar with everything from the pre-Columbian era to the Revolutionary and Civil Wars, the Industrial Revolution, and the emergence of America as a world power. Then there's the 20th century to go through as well. The three-hour AP exam ranges over this entire period, which means teachers have to cover all of it at high speed, sometimes at the expense of depth and analysis.

This troubles some educators, who worry that by trying to become everything to everyone, AP is losing its luster. A dozen high-profile private schools, including the tony Fieldston School in New York and the alternative Crossroads School in Santa Monica, Calif., have dropped their AP courses in recent years, saying the curriculum is too inflexible. Some colleges, meanwhile, are starting to be more cautious about giving APs on transcripts a free pass. Applicants can no longer count on bowling over admissions officers with the sheer number of AP courses they've taken. "We're very careful in training new people to be wary of the fact that this person who has six APs is not obviously better than someone who has two or three," says Daniel Walls, dean of admissions at Emory University.

Trevor Packer, the College Board's executive director of Advanced Placement, acknowledges that there have been some quality-control problems with AP courses of late: "What we've seen, over the past five years in particular, is too many schools rushing into AP without having an infrastructure in place to support such rigorous academics," he says. Some schools have gone off the reservation entirely, creating AP courses in subjects unapproved by the College Board--accounting, botany, even one course called AP West Virginia History. New AP teachers are not yet getting enough professional development training, he says. But Packer is quick to emphasize that the program is still quite popular. Some high-profile dissenters notwithstanding, in the past year alone the number of independent schools using AP has jumped by 15 percent--the largest growth of private-school participation in the history of the program.

And Packer points out, quite rightly, that few critics, including UC's Geiser, are questioning the value of the AP exams. There is no need, in other words, to throw the baby out with the bath water: The College Board recently took the first step toward upgrading program quality when it announced a new certification program that will require high schools that use the AP name to be audited by the organization. As of next February, "if they're not providing labs or using textbooks," says Packer, schools will no longer be able to label their courses AP. More schools, in the meantime, do seem to be requiring their students to take the AP exams: Nationwide, some 73 percent of students enrolled in AP courses last year also took the tests, an increase of more than 10 percentage points over the previous decade. Critics' concerns "are valid," says Packer, "but I do think they're being addressed."

It's possible, to be sure, that AP can continue to evolve on all fronts--expanding access to the underprivileged, maintaining its quality, and continuing to be a ticket into elite colleges. For students like Neil Panchal, it's what selective schools decide to do with AP that truly matters. As long as so many keep looking for those two magical letters on applications, high schoolers' lives will continue to involve a lot less sleep--and a lot more AP.

© Justin Ewers 2005

 

WHY LENNON WILL ALWAYS MATTER

 

Michael James Moore

October 9, 2005

Had he not been assassinated in New York City 25 years ago this December, John Lennon would be a 65-year-old man. He was born Oct. 9, 1940.

Ten years before Lennon's birth, my father was born in Chicago.

Neither one of them could have predicted how complicated their relationship would be.

When I was 11 years old in October 1970, a song called "Mother" was playing on the radio. It was a dark, mournful, slow dirge that aired on WLS-AM for one reason only: It was the 45-r.p.m. single heralding the release of John Lennon's first solo studio album. The Beatles had announced the dissolution of their partnership that year.

The cover of the album, "John Lennon/Plastic Ono Band," was unthreatening.

A pastoral scene on the front showed John and his wife, Yoko Ono, sitting beneath a tree; a childhood photo of Lennon was on the back. "He was a handsome boy," my father noted in the car.

That Saturday was (so I thought) one of the great ones. My father traveled a great deal as an executive. Having him home on a weekend was a big deal. Having him buy me a new album was like an early Christmas. And to hold the debut solo LP by John Lennon was to feel as though somehow the cutting edge of the vibrant youth culture was in hand.

That's one of the many reasons why Lennon mattered: then and now. He was fated to personify in his various images and to score in much of his music both the profile and the soundtrack of that rapidly changing epoch, which he still represents. He was the 1960s.

Even then, in the early autumn of 1970, his name alone connoted different things to different people. To the young who wished to have been at Woodstock and yearned to be in San Francisco or Greenwich Village, Lennon signified a whole slew of assumptions.

He was (generally speaking) considered to be the most "way out" of the Beatles; even those of us barely older than 10 knew that meant he had experimented excessively with drugs, wore those hippie-esque granny glasses, sang lead on "Revolution" and conjured up the now patented antiwar anthem, "Give Peace a Chance."

To the millions who comprised President Richard M. Nixon's "silent majority," John Lennon also represented what was least admirable or most detested about the so-called counterculture.

He was the "loudmouth" who had blasphemed Jesus Christ in 1966 when his remarks on Christianity and popular culture induced a firestorm of criticism; he was the Beatle with the longest hair, cozying up to radicals and fundraising for Black Panthers; and certainly he was the one most often in the press for drug busts, protests or stunts with Yoko Ono.

By the fall of 1970, in the aftermath of Nixon's Cambodian incursion and the subsequent massacre of students at Kent State and Jackson State Universities, plus the catastrophic bombing of the Army Math Research Center on the Madison campus of the University of Wisconsin, escalating tensions between young and old had metastasized.

So it was probably the bucolic cover photo on Lennon's first solo LP that made my father think it was something low-key. Something new. No more of that controversial rhetoric.

He was wrong. Later that afternoon, one thing led to another and the lyrics printed on the album sleeve confirmed what my parents thought they heard emanating from my room. On "Working Class Hero," the "f-word" was sung not once but twice in one song.

That was just the beginning. A closer examination of the lyrics revealed that in "I Found Out" and at length on a track called "God," the singer-songwriter was pushing the envelope in ways that bore no relationship to Tin Pan Alley or other traditional venues. Lyrics that grappled with provocative themes? That was an understatement. Throughout this debut solo album, Lennon was trailblazing, smashing icons and breaking the rules.

As a corporate executive, my father could not imagine how any responsible company (i.e., the men at Capitol Records who distributed the ex-Beatles' solo Apple recordings) could sign off on a work that harbored such unacceptable language. He was beyond shocked; he was stricken. That album was exchanged for another the very same day.

Ten years later, after Lennon was murdered on Dec. 8, 1980, my best friends and I exchanged symbolic holiday presents at Christmas. New copies of Lennon's solo albums were distributed all around. That was something that made our clique a bit odd by the standards of others who were too young to experience the '60s in any active way; we were always retro. We simply preferred the '60s and the '50s to 1976 or 1979. And no matter the release date of his albums, John's voice was always a conduit to the 1960s.

So, new copies of Lennon's solo canon piled up quickly: In addition to "John Lennon/Plastic Ono Band," there were the works from 1971-75: "Imagine," "Sometime in New York City," "Mind Games," "Walls and Bridges," "Rock 'n' Roll." Plus the latest album, which had been released a few months earlier: 1980's "Double Fantasy."

One night while home on break from college that season, I took a shot. In my role as periodic family deejay, I was turned loose at the family room stereo.

Rather than play a whole side of any album, I selected ballads; nothing loud or edgy. And my father cried.

A spontaneous John Lennon medley that night consisted of tracks that were hushed or whispered or gently sung: "Love" and "Look at Me"; "Jealous Guy" and "Oh My Love"; and then the spine-tingling tour de force on his "Walls and Bridges" album.

"Nobody Loves You [When You're Down and Out]" is a saloon song that John had hoped Frank Sinatra would perform (both Lennon and George Harrison admired how Sinatra had reconfigured George's most famous song, "Something"); the lyric was in sync with Sinatra's whole "One for My Baby"-"Only the Lonely"-"In the Wee Small Hours"-"Angel Eyes" ethos.

And the strings and the horns and the languorous tempo: It all sounded magnificent.

But it was the integrity of the lyric that got to my father: "Everybody's hustlin' for a buck and a dime/I'll scratch your back and you knife mine." He vigorously nodded.

And he cried because, at 50, he was feeling the ravages of time and the changes in his own life, career and his own sense of what was possible. Lennon's voice soothed him.

It's still happening across all generations, between genders, in myriad re-creations of John Lennon's best compositions and in all media: on the Broadway stage, in concerts and videos, through books and documentaries, radio specials; and it's a universal phenom.

Sixty-five years after he was born and 25 years after his death, the reason John Lennon mattered--and still matters--is more obvious than it was even in the fall of 1970.

All flaws notwithstanding (musically or personally), Lennon matters because his own ever-evolving search for meaning impelled him to leave behind, at age 40, eternal songs.

Michael James Moore lives in Madison, Wis.

© 2005, Chicago Tribune

 

GENIUS DENIED: THE PROBLEM WITH "NO CHILD LEFT BEHIND"

By Leslie Mann

November 6, 2005

After studying the American classroom in 1954, anthropologist Margaret Mead said: "There is in America today an appalling waste of first-rate talents. Neither teachers, the parents of other children nor the child's peers will tolerate the wunderkind."

Unfortunately, things have not improved much for gifted children, according to Jan Davidson, who co-wrote "Genius Denied: How to Stop Wasting Our Brightest Young Minds" (Simon and Schuster Paperbacks, $13) with her husband, Bob. "We put a ceiling on their learning," said Davidson, a former college professor. "And we've lowered the ceiling."

Through their non-profit foundation, the Davidson Institute for Talent Development, the Davidsons help parents and educators find learning opportunities for gifted children.

Jan Davidson talked with the Tribune by telephone from her Incline Village, Nev., home.

Q. Explain the title of your book.

A. We're denying our brightest kids the opportunity to learn, to reach their potential. Other countries are beating us in math and science, especially. We are not the top producer of talent.

At the same time, the No Child Left Behind Act focuses on the underachievers and meeting minimum standards, so gifted kids are left behind. Americans spend 143 times more money on special ed than on gifted ed.

Q. How do schools define "gifted"?

A. They define it differently, although high IQ is usually part of the equation. We encourage educators to look at the child's characteristics too. A gifted child learns faster and deeper. He answers teachers' questions with, "It depends." He has an extreme need for mental stimulation. His curiosity is insatiable.

Q. Why are gifted children often misdiagnosed as having attention deficit disorder?

A. Repetitive work that isn't intellectually stimulating bores a gifted child, so he might misbehave. Then he is labeled as hyperactive or unable to concentrate.

If teachers fail to recognize that a bored, non-performing child is gifted, then the student can slip through the cracks and may eventually drop out. Studies show that 10 to 20 percent of high school dropouts are gifted.

Q. What are some of the myths surrounding gifted children?

A. That they can fend for themselves. The truth is, they need to be nurtured or they aren't challenged.

That good grades equal gifted. In fact, many straight-A students are not gifted but obedient. And many gifted kids don't get straight A's. They underachieve to try to fit in with other kids.

That all gifted kids are white and wealthy. In fact, giftedness crosses socioeconomic lines. The wealthy can afford more enrichment programs though, so there are more wealthy kids in them.

That grade-skipping screws up kids. When it is done carefully and monitored, it works. Parents worry that the kids won't get along socially, but they are already friends with older kids because they have little in common, intellectually, with their peers.

Q. What can a parent of a gifted child do to further the child's education?

A. First, go to the "Getting Started" section of our Web site, www.geniusdenied.com. Gather information about what your district offers and learn how to work effectively with your district.

In addition to "Genius Denied," I recommend reading "A Nation Deceived: How Schools Hold Back America's Brightest Students" by the Templeton National Report on Acceleration (University of Iowa, $50) and Joan Smutny's "Stand Up for Your Gifted Child: How to Make the Most of Kids' Strengths at School and at Home" (Free Spirit, $14.95).

Learn about Illinois' policies on gifted education by going to www.gt-cybersource.org.

If your district has a [gifted] parent group, they may be able to help you. If it doesn't, start one!

Q. What's the best way to educate gifted children?

A. Ideally, in self-contained classrooms, where the curriculum is faster and different from the standard curriculum. For some children, grade-skipping helps. Many parents decide to home school, where the kids can learn at their own pace.

I hope we will reach the point where every gifted child has an IEP [Individual Education Plan], as special education children do.

The IEP is designed to assure that the curriculum is appropriately matched to the student's abilities; it creates an opportunity for teachers, school administrators, parents and students to work together to achieve a quality education for an exceptional student.

In our technological age, there's no reason we can't do this. Everything else in this country is so individualized--Amazon knows what books I like. We need to do this for our brightest kids. We need their talent more than ever.

Copyright © 2005, Chicago Tribune

 

BONDING THROUGH BOOKS

Garrison Keillor

December 2, 2005

I got to put on a tux and go to the National Book Awards in New York a couple weeks ago and eat lamb chops in a hotel ballroom and breathe air recently exhaled by Toni Morrison and Norman Mailer and Lawrence Ferlinghetti and other greats and near-greats of the trade.

I was there as an innocent bystander, not a nominee (God forbid). Having never won a big prize, I am opposed to them on principle: They are an excrescence of commerce and a corruption of the purity of artistic creation. Nonetheless, it was good to see the brilliant young novelist William T. Vollmann pick up the trophy for fiction and that grand old man W.S. Merwin get the nod for poetry. If you can't be the creator of Harry Potter or the decoder of da Vinci, winning a big prize is some consolation. It gives you reason to believe you may not have wasted your life after all.

The urge to write stuff and put it between covers is powerful, as one can see by the godawful books that emerge every day--vanity, thy name is publishing!-- and anybody with the authorial urge ought to visit the underground stacks of a major public library and feel the chill of oblivion. Good writers such as Glenway Wescott, John Dos Passos, Caroline Gordon, gone, gone, gone. They had their shining moment and then descended into storage, where they wait for years to be opened. Sometimes, to placate the ghosts, I take a book off the shelf that looks as if nobody's opened it for a few decades and I open it. And then I close it.

Emily Dickinson died unpublished, and her work eventually found its way from deep anonymity to the pantheon of American Lit, and now her grave in Amherst, Mass., is one of the most beloved anywhere in the world. She is the patron saint of the meek and lonely. A devout unbeliever, she lies under a tombstone that says "Called Back," and here, every week, strangers come and place pebbles on her stone and leave notes to her folded into tiny squares. Perhaps they are unpublished poets too. As Emily said, success is counted sweetest by those who ne'er succeed. She would have known about that.

People like to speculate about her love life, but their chatter about that is dull stuff compared to the poems, the flies buzzing and the horses turning their heads toward eternity and the narrow fellow in the grass and "Hope is the thing with feathers" and all--the lady was a fine piece of Yankee free-thinking who dwelt in the richness of Victorian language. Through her poems, you can enter the mind of New England, from which seeds blew westward and blossomed across the country. You read her and, whether you know it or not, your vision of America is elevated.

One reads books in order to gain the privilege of living more than one life. People who don't read are trapped in a mineshaft, even if they think the sun is shining. Most New Yorkers wouldn't travel to Minnesota if a bright star shone in the west and hosts of angels were handing out plane tickets, but they might read a book about Minnesota and thereby form some interesting and useful impression of us. This is the benefit of literacy. Life is lonely; it is less so if one reads.

I once got on the subway at 96th and Broadway in Manhattan and sat down opposite a handsome young African-American woman who was reading a book of mine. The train rattled along and I waited for her to smile or laugh but she didn't. She did, however, keep reading. I stayed on the train past 72nd and 42nd and 34th and finally it was too much for me--if she had slapped the book shut and tossed it away, it would've hurt me so bad, so I got off at 14th and I was a more thoughtful man for the rest of the day. A writer craves readers, but what passes between him and them is so intimate that it's unbearable to sit and watch.

Questions for class discussion: (1) Is the author using irony when he declares he is opposed to prizes? (2) What is "excrescence"? (3) Have you ever sat reading a book and then realized that the author was sitting across from you and as a joke, you kept a straight face?

Copyright © 2005, Chicago Tribune

 

AS DEADLINES NEAR, STUDENTS SWEAT IT OUT OVER COLLEGE ESSAYS

By Susan Kinzie
The Washington Post

December 19, 2005

It's a dark month for high school seniors. College admissions deadlines lurk just after the holidays, and the essay could be the one chance students have to show something more memorable than test scores and band camp -- something to make them stand out from the pile.

George Washington University gets about 20,000 applications a year; the University of Maryland gets a few thousand more. Parke Muth, director of international admission at the University of Virginia, estimates he has read more than 60,000 essays over the years. "That's why I'm nearly blind," he said.

Muth said he doesn't see many laughably bad essays anymore. College admissions are more competitive than ever. Most applicants get coached by parents, counselors and teachers; many spend the fall semester planning and rewriting essays in English class.

Many essays are grammatically perfect, structurally sound and painfully earnest. But not usually anything that would grab a reader from the first line.

That's where Nate Patten and fellow University of Virginia students come in. Each year, they sift through tons of essays from incoming freshmen to put on sketches for the public, showing the kaleidoscope of students on campus. "Voices of the Class" gives a funny, illuminating and occasionally sad picture of each fall's freshmen -- and some inspiration for all the high school seniors trying to bang out essays.

Got to be a grabber

Patten got a stack of admissions essays more than a foot high to read for the play he was directing this fall. He'd pick one up, read the first line and -- unless it grabbed him -- toss it aside immediately.

"It was really painful," said fellow cast member Scottie Caldwell. "I would read an essay and think, `This is terrible!' And . . . it was exactly like mine."

After all that reading, the cast members sounded like experts on what works: The best essays read like vivid, entertaining dramas led by a compelling main character. More script than resume, and not a complicated life story -- just a sketch.

Cast members reading through essays laughed about the repetition. Lots of sob stories, lots of obscure words, lots of "Here I sit, musing about how difficult it is to write my essay."

They wrote a scene for the play with a girl at a laptop moaning, "All of my college applications are due tomorrow, and I haven't written my essay. I haven't got a role model . . . I haven't been depressed . . . and my family is obscenely functional." Then she brightens up. "I've got it! It's perfect: I'll write an essay about my essay. No one has ever thought of this. It's self-conscious, yet communal."

One U-Virginia question asks applicants to look out their front window and describe the view and what they would change. "That gives you a whole lot of socially conscious, 'damn the Man' kind of essays," said senior Walt McGough. "One kid wrote about the state of youth of America -- it read like a 50-year-old man wrote it."

They went back to read their own essays and shuddered. "Mine were much worse," McGough said. "I wrote about running the light board for a high school performance and how everything went wrong and what it meant for me to triumph over adversity." He laughed. "If not that phrase, then something really, really close."

Now his advice is succinct: Be true to yourself. Take risks.

His first year in college, he heard a story: The Harvard admissions essay question asked, "What is the bravest thing you've ever done?" and one guy wrote -- well, a two-word phrase that is best described, in a family newspaper, as both vulgar and hostile.

"I would let that guy in with honors," McGough said wistfully. "I would love to think that happened; it gives me hope for the future."

Not on record

For the record, the Harvard application has never asked that question.

Also for the record, more than one admissions officer specifically mentioned being offended by overly graphic use of cuss words. Once, U-Virginia got a response to "What is your favorite word and why?" featuring the same four-letter word.

"He took a risk," Muth said. And, with the finality of a Virginia education lost forever, "that risk was not successful."

The essay didn't fail because of the word, Muth said, but because it was chosen for shock value. The essay was lousy.

So the corollary advice: Take a chance, but a calculated one. It's good to stand out, but not in a way that makes admissions staff members recoil.

Someone once sent the University of Maryland a worn flip-flop along with the application, said Shannon Gundy, associate director of undergraduate admissions. She doesn't remember the essay, just the attachment, which grossed her out.

"My least favorite," said Andrew Flagel, dean of admissions at George Mason University, "is the one cut out into a puzzle. It says, `Your school is where I fit in.' Every couple years, someone sends that."

One of Muth's favorite essays was about driving really fast, listening to Radiohead. "She wasn't afraid to say, 'This is who I am. . . . I'm not trying to impress you with how much community service I'm doing. But I'm smart.'" It was the writing that carried it, Muth said, poetic and beautiful.

"Be true to yourself" is good advice, he said -- to a point. It's not the best recommendation for ditzes, stoners, sullen teens. He took on a high school senior voice and lilted, "`Does he like, like you -- or just like, like you like you?'

"You don't want to be true to that," he said. "You want to be false to that."

Some are amazing

As U-Virginia cast members read essays, some caught and held them: One about a 4-year-old brother with a brain tumor, making the family laugh and cry when he darted from the hospital elevator saying, "I'm busting out of here!"

One about waking up in the night to the strains of a religious song and creeping downstairs to the basement, sleepy and confused, to find his father high on cocaine, singing and beating his little brother to the cadences of the hymn.

There was one that began: I have always had really big feet.

"Some of these essays are just amazing," Patten said. "Some are very, very funny. Some are so sad, I could cry reading them." In the end, he was disappointed that the admissions office took the names off the essays used in the play. "I thought, this sounds like such a cool person that I would love to get to know better."
 

Copyright © 2005, Chicago Tribune

 

THE POWER OF FICTION

 

by Julia Keller

January 15, 2006

It's a mere slip of a book, slender and bendable, with a mud-colored cover broken only by the vivid red of an opening flower and the small white type of its title. They make chocolate bars bigger and heavier than copies of Sebastian Barry's "A Long Long Way" (Penguin, 2006).

The physical fact of "A Long Long Way" -- its humble, unobtrusive presence in a world full of books with come-hither covers and screaming titles -- is a lot like the novel's protagonist, Willie Dunne. He's a quiet, gentle young man from Dublin thrust into the middle of the deliberate madness known as World War I.

He sees things. He suffers. But it's all painted in the dull browns and laid-back grays of Barry's understated prose; there are no operatic flourishes, just the solemn march of one terrible thing after another.

And it's fiction -- pure, absolute, unrepentant fiction. Neither Barry nor his publisher would deny it. No reader could possibly be misled. Indeed, Barry's novel is a reminder of just how crucial fiction really is. Fiction isn't truth, but it can tell the truth. As the late John Fowles so beautifully put it, fiction can be "as real as, but other than, the world that is."

At present, however, the concept of telling readers what a book really is -- whether it represents events that actually happened or events the author employs to create a plausible alternative universe -- appears to be a lost art. And while there's much hand-wringing about how this might affect non-fiction, I'm frankly more concerned about its impact on fiction.

Last week brought a number of accusations that author James Frey had fibbed a bit in the memoir "A Million Little Pieces," his somewhat hysterical recitation of degradation and subsequent redemption. Frey, for his part, has acted as if calling him essentially a novelist is akin to dubbing him a serial killer.

The real issue, of course, is one of veracity, not genre. It's a matter of truth in advertising. Readers deserve to know the contents of what they're putting in their heads, just as they deserve food labels to tell them what they're putting in their bellies. Still, I find it amusing that Frey and other memoirists scream in agony at the faintest suggestion that their books are primarily works of imagination, not recollection. This spectacle -- authors such as Frey shrinking back in horror from the label "fiction" -- is enough to make fiction feel downright inferior.

Memoirists like to talk about "their" reality as opposed to "the" reality, claiming that fact and fiction aren't separate entities dwelling behind high walls of demarcation. Rubbish. Both fact and fiction are better served by the rigorous distinctions between them. And that's why apparent fabulists such as Frey are dangerous: Not only because they besmirch fact -- which they undoubtedly do -- but also because they besmirch fiction.

Fact can take care of itself. There will always be people willing to insist on proper evidence and meticulous sourcing in books and articles that lay claim to complete accuracy; somebody will always be checking up. The truth-hounds who recently sank their teeth into Frey's backside are proof of that -- and more power to them.

But it's fiction I worry about; fiction, which is already marginalized in a culture that sees it as somewhat frivolous and definitely expendable.

Good timing

"A Long Long Way," a finalist for England's Man Booker Prize, has come along at just the right moment. It comes as historians such as David Fromkin and David Stevenson, in "Europe's Last Summer: Who Started the Great War in 1914?" (Vintage, 2005) and "Cataclysm: The First World War as Political Tragedy" (Basic Books, 2005), respectively, reassess this war from nearly a century ago and find new things to report about it, new perspectives and insights.

It comes as, sadly, the absolute line between fact and fiction -- and it is absolute, no matter what the memoirists say -- is being smudged by those such as Frey who truly don't respect either.

History is always on its way to being essentially fiction; it is always moving toward a story. As events and their particulars recede further and further from us -- indeed, as the people who were there to witness those events inevitably die and a new generation replaces them -- history becomes increasingly the product of imagination. Informed imagination, to be sure; but imagination, still. It henceforth exists solely as enlightened recapitulation. As research-infused meditation. As, in effect, hearsay.

And that's where fiction enters the story: not as a substitute for fact, but as something other than -- yet as important as -- fact.

The powerful new accounts of World War I by historians might suggest that facts are all we need: only facts, carefully deployed in context. Barry's novel, though, suggests that there is always more to say about conditions of extremity such as war. The view from up there -- among the kings and politicians and generals -- always requires as well the view from down here, among the men in the mud. "'Anyway, they don't write books about the likes of us,'" grouses a squadmate of Willie's. "'It's officers and high-up people, mostly.'" They do write such books, books about people whose remarks aren't taken down by dutiful stenographers and donated in bound volumes to libraries. Such books are called novels. Yet fiction always has to fight for its place at the table, to argue for its importance in light of the unquestioned primacy of fact.

Fact needs fiction. It needs fiction to tell the rest of the story, the story that lives not in archives or transcripts or photographs, but in the innate human desire to make stories out of experiences. There may not have been a flesh-and-blood Willie Dunne who served in World War I, but there is, rising up from the pages of the Irish novelist's work, a platoon of Willie Dunnes, "dozens and dozens falling under the weird and angry fire."

A different beast

A history book can offer casualty counts, but only fiction can offer this: "The approach trench was a reeking culvert with a foul carpet of crushed dead. Willie could feel the pulverized flesh still in the destroyed uniforms sucking at his boots. There were the bodies of creatures gone beyond their own humanity into a severe state that had no place in human doings and the human world. ... What lives and names and loves he was walking on he could not know any more; these flattened forms did not leak the whistle tunes and meanings of humanity any more."

Fact goes about its business with well-tailored efficiency and unassailable authority, marching along smartly, while fiction brings up the rear, stumbling and daydreaming -- but it's still needed, still desperately relevant.

Ultimately, fact and fiction are trying to do the same thing: In Barry's lyrical phrase, to aspire to get at "all the matter and difficulty of being alive, in a place of peace and a place of war." There's quite a distance between Barry and Frey, and not just because a century divides the events they are describing. One tells a different kind of truth in his fiction, and the other apparently tells a lie about the truth. A long long way, indeed.

© Chicago Tribune

 

WHY JOHNNY CAN'T BE BOTHERED

By Thomas Geoghegan, a Chicago attorney and author, and James Warren, a Tribune deputy managing editor

April 4, 2006

In Chicago for their annual gathering, the nation's newspaper publishers should sit down with some politicians and school principals. All three parties are impacted by the real Culture War. Not the one between left and right over gays, guns and abortion, but the one between the "we" who still read a daily paper and those who don't.

"My wife and I read three papers a day," says a law professor friend. "But my daughter who's in graduate school, not a one. And my son, 19, doesn't read a paper at all either."

Yes, newspaper reading has dropped around the world. But that's a half-truth at best. The share of Germans over the age of 14 who scan a daily paper is nearly 80 percent. The French and Scandinavians, among others, read much more than we do too.

So don't be so quick to blame the Internet, TV news, iPods, IMing or even unrelenting attacks on the evil "mainstream media." It's too facile. Other countries have most of that, as well as Britney Spears, nincompoop shock jocks and pro wrestling. But newspaper reading in those countries hasn't collapsed as far as it has here.

The crisis in America, where ironically we have the world's highest rate of bachelor's degrees, is that if people don't read papers, they generally won't vote. The crisis of the press here is a crisis of democracy too. The single best indicator of whether someone votes is whether he reads a paper, according to political scientist Martin P. Wattenberg in his book, "Where Have All the Voters Gone?" But the converse is also true. Whether one votes is a much better indicator than a college degree as to whether one is reading a daily paper.

The reaction between these two trends, a decline in voting and the decline in the reading of dailies, is what scientists call autocatalytic. One drives the other in a downward spiral. The under-30 young read far less, and vote far less--and according to their teachers, have fewer opinions. Not reading, not having political sentiments, they aren't especially capable of voting intelligently anyway.

What can we do now?

Let's start with public education. In the Northwest Ordinance of 1787, Thomas Jefferson slipped in a famous mandate of public schools for basically one reason: to turn kids into citizens able to govern themselves. But we take democracy for granted. The founders could not. No one had ever attempted such a huge experiment: to test whether the common people could manage the public business.

Critical to public education was telling children not that they merely could but that they had to vote: It was a moral obligation. And to exercise that obligation, they had to be literate enough to read a paper. If they didn't read a paper, they couldn't follow a legal argument and sit on a jury. Unless they read a paper, they couldn't cast a vote; it would be too dangerous to the country. Jefferson opined, "If I had to choose between government without newspapers and newspapers without government, I'd choose the latter."

But teaching students to read a paper is virtually the last thing anyone in America expects from a school, especially in this test-driven era of No Child Left Behind. The purpose of education is now largely vocational or economic, preparing students for job and career, while filling a dizzying array of state and local mandates, including AIDS awareness, obesity prevention, anti-bullying and fire-safety programs. The civics element is gone. And the industry's traditional link to schools, its Newspaper in Education program, evolved into more of a gambit to boost circulation than a means of thoughtful civics instruction.

History, civics and other "political" subjects need to play a big role not just for the college-bound but also the armies who will at most have high school diplomas. A year ago the Chicago Tribune ran an estimate that only 47 percent of high school graduates from public schools in Chicago went on to any college work at all, and most of those soon dropped out. They depart having been cheated out of the civic skills they need to vote and take part in the great policy debates over allocation of the country's income (Social Security, welfare reform, Medicare, etc.).

There are many ways to recast public education to save the press and the democracy. One approach is four years of civics and four years of American history. "Four years of civics" might include one old-fashioned civics course, a current-events course, a course on problems of American democracy, and a final course that involves in-service learning and volunteer work.

Another approach would be for the state school system to publish a "student paper" that is given every day to students. The paper would consist of articles taken from newspapers around the state. The plan here is to turn the reading of the paper into a daily habit.

If publishers want to save themselves from long-term demise, they must consider reinvention of their papers' content and dramatic hikes in traditionally anemic marketing and promotion efforts. But they should also push for a new public education quite different from that envisioned by No Child Left Behind.

Worry a bit less about the Wall Street analysts and a bit more about the principals and the taxpayers on the local school boards. Sit down with them, but with a bit of care since school leaders rightfully feel put upon by too many mandates. And think about paying something for civics courses, which may turn out your future readers. It's the democratic thing to do--and maybe the industry's best hope to stay alive, even flourish.
 

Copyright © 2006, Chicago Tribune

 

 

SIX OF 100 CHICAGO PUBLIC SCHOOL FRESHMEN WILL GET A COLLEGE DEGREE

By Jodi S. Cohen and Darnell Little, Tribune staff reporters.

Tribune staff reporter Tracy Dell'Angela contributed to this report

April 21, 2006

Of every 100 freshmen entering a Chicago public high school, only about six will earn a bachelor's degree by the time they're in their mid-20s, according to a first-of-its-kind study released Thursday by the Consortium on Chicago School Research.

The prospects are even worse for African-American and Latino male freshmen, who have only about a 3 percent chance of obtaining a bachelor's degree by the time they're 25.

The study, which tracked Chicago high school students who graduated in 1998 and 1999, also found that making it to college doesn't ensure success: Of the city public school students who went to a four-year college, only about 35 percent earned a bachelor's degree within six years, compared with 64 percent nationally.

Researchers say they're not exactly sure why Chicago schools alumni graduate from college in such low numbers, but that poor preparation during high school and too few resources at the college level contribute to the problem.

"Just focusing on getting kids to survive in high school isn't going to be enough," said study co-author Elaine Allensworth, a researcher at the consortium, a group that works closely with Chicago Public Schools. "This report raises a lot of issues that the colleges need to struggle with."

Schools chief Arne Duncan said the grim statistics in the report and the variation in college rates among city high schools are no surprise--they are what is driving massive private investment in high school reform.

"When students here are unprepared for college or the world of work, they are condemned to social failure," he said. "We're doing everything we can to dramatically change the high school experience for our teenagers."

Among other findings:

- Students who graduated from high school with a grade-point average below 3.0 were unlikely to graduate within six years, lacking the study skills that contribute to college success. Only about 16 percent of students with a high school GPA between 2.1 and 2.5 graduated during that time, compared with 63 percent of students who had a 3.6 GPA or better.

- African-American and Latino students from Chicago high schools have the lowest graduation rates--lower than the national average for those groups and lower than their white and Asian peers from Chicago. Just 22 percent of African-American males who began at a four-year college graduated within six years.

Chicago high school graduate Nigel Valentine, 26, is on the 10-year plan. He graduated from Kennedy High School in 1997. After getting an associate's degree from Daley College in 2003, he is now a junior at Northeastern Illinois University. He expects to graduate next year.

"Originally, I was hoping to be out in four or five years," said Valentine, who is studying criminal justice. He says he blames himself and a school system that didn't ensure college readiness. "It's all about preparation. The structure of the classes in high school and elementary school were not up to par."

The study also found varying degrees of success among colleges in graduating students from Chicago schools.

Of the Chicago students who start as full-time freshmen at Northeastern, only 11 percent graduate within six years.

Northeastern officials said the study is unfair to the university, which primarily serves non-traditional students, including many part-time students who take an average of 9 years to graduate. Many students are older, low-income and work while in school, said Provost Lawrence Frank.

But Frank said the study does point "to things we need to address," particularly improving the experience for freshmen. The university next fall will require that all freshmen take a small seminar class with a maximum of 24 students. Sophomores will receive more advising about course selection and major.

To be sure, there were limitations to the study. It only provided graduation rates for students who enrolled full time in a four-year college. It did not include students from alternative high schools or those eligible for special education. Researchers also did not have graduation data from every Illinois college, and DePaul University, Northern Illinois University and Robert Morris College were among those left out.

The researchers used data from the non-profit National Student Clearinghouse, a group that collects data from secondary school officials who want to track their graduates. More than 2,800 colleges participate.

Carole Snow, an executive associate provost at the University of Illinois at Chicago, said many students start college unprepared in math and writing.

The university recently opened a math learning center where students can get tutoring and work on study skills.

About 46 percent of UIC students, including Chicago public school graduates, complete college within six years.

Loyola University has one of the highest graduation rates for Chicago students. About 66 percent complete college within six years, nearly the same as the university's overall average.

Loyola Vice Provost John Pelissero attributes that success to individualized student attention, including mandatory academic counseling. All freshmen also get a peer adviser.

The researchers said that the study could help high school guidance counselors better advise students about where to go to college.

"Our kids could be making better choices than going to U. of. I. Urbana," said co-author Melissa Roderick. "That is a very significant statement on that college, and they need to be paying attention to that."

At the University of Illinois at Urbana-Champaign, where some of Chicago's brightest students enroll, only 42 percent graduate within six years, compared with 81 percent of all students, according to the study.

Robin Kaler, spokeswoman for the Urbana campus, disputed the consortium's numbers and said the graduation rate for Chicago students is nearly 65 percent.

"It is still not acceptable to us," said Kaler, who attributed the low number to a challenging environment at U. of. I. "We work hard to attract and identify students that we think can succeed. ... There is no way to predict perfectly who is going to have the most success and who isn't'"

She said the university has worked on improving student advising, with several colleges now requiring it. The advisers are supposed not only to monitor students' academic progress but also to connect them with career-focused clubs and other services. The university also started a program last fall called "University 101," which is intended to teach students how to study, conduct research, and locate programs and services at the university.

That program came too late for Crystalynn Ortiz, 19, who started at the Urbana campus in fall 2004 after graduating from Prosser Career Academy in Chicago with a 4.5 GPA. She dropped out of U. of I. after her first year and now attends nearby Parkland Community College.

"I wasn't prepared to go to U. of. I. I got my first bad grades and then I wasn't motivated to do well," she said. "I felt really unprepared in study habits, how hard it was going to be here." Ortiz said she lives two blocks from U. of I.'s campus, and takes the bus to Parkland. Some of her friends and family members don't know she flunked out, and she hopes to do well enough to return. "For me, this is low. This is bad. I shouldn't be at Parkland. I should be at U. of I. so I am trying to get through this and get back in," she said.

- - -

ABOUT THE STUDY

Six in 100 Chicago public high school freshmen will receive a bachelor's degree by the age of 25, according to a study that tracked 1998 and 1999 high school graduates.
jscohen@tribune.com
dlittle@tribune.com

Copyright © 2006, Chicago Tribune

BOSS GETS FOLKIE WITH 'SEEGER SESSIONS'

ASBURY PARK, N.J. - Bruce Springsteen, rock 'n' roll icon, stands on a cramped Jersey shore stage surrounded by 16 musicians. There's a fiddle, a banjo, a tuba, an accordion -- and not a single electric guitar.

The music swells, a glorious noise, as Springsteen leans into the microphone and sings a familiar song: "He floats through the air with the greatest of ease, the daring young man on the flying trapeze."

The vintage tale of a high-flying, womanizing circus star is followed by "Poor Man," a reworking of a Blind Alfred Reed song from the 1920s. This is the music of the moment for Springsteen: folk songs from decades past as he releases an album of songs culled from the Pete Seeger catalogue.

Bob Dylan once went electric. This is Springsteen going eclectic.

"The songs have lasted 100 years, or hundreds of years, for a reason," Springsteen explains in a spartan dressing room after rehearsing with his new big band. "They were really, really well-written pieces of music.

"They have worlds in them. You just kind of go in _ it's a playground. You go in, and you get to play around."

"We Shall Overcome: The Seeger Sessions" arrives Tuesday, with a tour to follow (including a trip to New Orleans for the Jazz and Heritage Festival). Springsteen, still damp with perspiration from his rehearsal, sat backstage for a 40-minute interview with The Associated Press that covered his musical past, present and future.

The new album is Springsteen's most sonically surprising since the spare "Nebraska" in 1982. Springsteen compares its variety with his second album, "The Wild, The Innocent and the E Street Shuffle," where the music veered from straight rock ("Rosalita") to jazz ("New York City Serenade") to oompah ("Wild Billy's Circus Story").

Leaning back on a couch, Springsteen said he was intent on getting out more music, including a group of songs already written for the E Street Band and a follow-up to "Tracks," his collection of unreleased studio cuts. He was working on the latter before deciding to do the new record.

"After a long time, you get a lot more secure about what you're doing," Springsteen said between sips from a bottle of water. "I spend much less time making decisions. Incredibly less. It used to be, like, there's a line in a song that I sang a certain way.

"I might mull it over for three days. Maybe longer, right? Now, you know, it's very different. I realize it's not necessary. You know your craft better."

"The Seeger Sessions" featured Springsteen making an album in record time. The rock Hall of Famer, who in the past went years between releases, did the new album in three days. The 13 songs, plus two bonus tracks, were recorded inside the living room of a farm house at Springsteen's New Jersey home _ with the horn section playing in the hall.

There were no rehearsals, no arrangements, no overdubs. Springsteen wasn't even sure if the results would become an album.

"It was just playing music," Springsteen said of the sessions. "I didn't have any intention for it. I knew that I enjoyed making this kind of music. ... It was really just purely for the joy of doing it. It was a lot of fun."

Springsteen, 56, is coming off a busy year when he toured extensively behind his Grammy-winning solo album "Devils & Dust." Last year also marked the 30th anniversary of "Born To Run," the classic album that turned the local hero into a worldwide star.

Springsteen first connected with the Seeger songbook in 1997, when he recorded "We Shall Overcome" for a tribute album. His interest grew as he delved into the material -- sturdy songs like "John Henry," "Erie Canal" and "Oh Mary, Don't You Weep."

"I wasn't aware of the vast library of music that Pete helped create and also collected," said Springsteen, who was more familiar with the work of Woody Guthrie. "Just this whole wonderful world of songwriting with all these lost voices. Great stories. Great characters."

Like Seeger, Springsteen is well-known for his role as a social activist. In 2004, Springsteen campaigned for John Kerry and criticized the Bush administration for bringing the country to war in Iraq. He's been a longtime advocate for local food banks, and played benefits for union workers, flood victims and other causes.

Seeger paid a heavy price for his beliefs. During the McCarthy era, he was summoned by the House Committee on Un-American Activities as it investigated supposed subversive influences in entertainment. He refused to cooperate and was blacklisted for the next decade.

So was releasing an album of Seeger's songs during President Bush's second term a political statement?

"I'll let somebody else sort that part of it, I guess," Springsteen said. "But a lot of 'em seem pretty applicable, you know? `Mrs. McGrath' is basically an Irish anti-war song, but it's ripped right out of the headlines everyday today."

The songs once sung by Seeger "shine a continuing light on a whole set of not just wonderful stories, but obviously a lot of social issues, the direction the country is going down," he continued. "There's still a place for a lot of that music."

Once Springsteen decided to forge ahead with the project, he called Seeger with the news. Seeger asked which songs would be on the record.

"He'd start giving me the history of each song," Springsteen said. "He actually knows about all those things. So it was an enjoyable conversation, and I hope he likes the record."

Springsteen had no concerns about audience reaction to his foray into a new musical landscape. He expects "the adventurous part of my fans" will enjoy the album. And he considers change a requirement for any successful musician.

"Your job as an artist is to build a box, and then let people watch you escape from it," Springsteen explained. "And then they follow you to the next box, and they watch you escape from that one. ... Escape artistry is part of the survival mechanism of the job. "If you want to do the job well, you have got to be able to escape from what you've previously built."

There's one other major difference between "Seeger Sessions" and all of Springsteen's previous work: He didn't write a single song for this project.

"A real pleasure," he said of the break from writing. "Once we put it together, it was like, `Wow. I can make records and I don't have to write anything.' There are thousands of great songs sitting out there waiting to be heard, and I know a way to act as an interpreter on these things."

In between finishing up the album and preparing for the tour, Springsteen was inducted into another Hall of Fame -- at his alma mater, Freehold High School. Springsteen, whose mother attended the ceremony, was bemused by the award.

"The high school hall of fame was, I suppose, less expected," Springsteen said between smiles. "I was at best a mediocre student, and I was an outcast. I didn't even attend my graduation. I went back in the middle of the summer and picked up my diploma across a desk and I went home.

"It's a little on the ironic side, I'd have to say. But it was nice."

© 2006

SPRINGSTEEN DOES SEEGER PROUD, FOLKS

Eric Zorn

It took exactly 26 seconds for Bruce Springsteen to replace my dread with pure exhilaration.

The rock star just released "We Shall Overcome--The Seeger Sessions," a CD billed as a tribute to folk icon Pete Seeger. As a Seeger fan nearly all my life, I could easily imagine the ways such a project could go wrong: With self-indulgent and self-referential reinterpretations; with electronic instrumentation to modernize the sound; with sappy, sterile, scholarly renditions.

Seeger, now 87 and retired, has always embodied the joyful community aspects of folk music--the sing-loud, sing-together, pound-on-the-strings-so-hard-you-break-them spirit that first drew me to it when I was a little kid and my parents played his records on the turntable in our apartment.

He was more an evangelist than a performer; equal parts cheerleader, historian, teacher, artist and political activist. Hearing Seeger play didn't make you want to sit and listen, it made you want to get up and join him. The only proper tribute to Seeger is a celebration of what he stood for, not a retrospective of his most famous work.

So the CD begins with Springsteen, a bit of laughter in his voice, counting into a song over a simple percussion line. A banjo lead-in follows. Then comes Springsteen, shout-singing the first verse of "Old Dan Tucker," a ditty you may have learned in grade school:

"Old Dan Tucker was a fine old man / Washed his face in a frying pan ..."

Seeger didn't write it. In fact, he didn't write any of the songs on the disc, an omission some critics have noted disapprovingly, given that Seeger was a formidable composer ("Turn, Turn, Turn," "Where Have All the Flowers Gone," and others).

"Old Dan Tucker" comes from the 1840s but is a perfect example of those songs that seemed like museum pieces until Seeger and fellow members of the mid-century folk revival dusted them off to reveal their timeless energy and catchy melodies.

It's also a nonsense song--lyrical couplets designed to fit a square-dance tune--which signals from the git-go that this is not a "message" album.

Fine so far. But it was the first line of the chorus, less than half a minute in, that got me: "Get out the way, Old Dan Tucker!" sung at full throat by a roomful of singers, most of them on the melody line, backed by a throbbing burst of acoustic instruments.

Listen to it yourself--a click-to-play sound clip is posted to my blog at chicagotribune.com/changeofsubject. It's not beautiful. It's not sweet. It's not technically astounding or musically sophisticated. But it may just blow you away. The CD is "rollicking," wrote our music critic, Greg Kot. "Rambunctious ... exuberant ... infectious ... music made for dancing and drinking."

"Unexpected and liberating," said Entertainment Weekly. Springsteen's "most jubilant disc since `Born in the U.S.A,'" said Rolling Stone.

The record-buying public made the album No. 3 on last week's pop charts.

Even the traditionalists and purists like it. Rich Warren, host of "Folkstage" and "The Midnight Special" on WFMT FM 98.7, told me that the folk deejay message board on the Web is vibrating with excitement and praise for "The Seeger Sessions," which he said "returns energy and spirit back to our roots music."

That energy and spirit have always been there--in late-night jam sessions, impromptu kitchen hootenannies and organized song circles. They've been hidden by the broad characterization of folk as the genre of earnest singer-songwriters, zither-playing anachronists, whale savers and aspiring troubadours like that guy on the stairway who was singing "The Riddle Song" in "National Lampoon's Animal House" before Bluto smashed his guitar to bits.

Springsteen, in his ragged-but-right way, destroys these stereotypes. "Folk" in the hands of the crew he invited to his farmhouse for this live, lightly rehearsed production is music as social enterprise, a true collaboration, a party at which you'd have been welcome if you'd wandered by.

I said earlier that "The Seeger Sessions" isn't a "message album," but that's not quite true.

It carries Pete Seeger's message: Playing and singing together is great fun. This is your music, people. Join in.

Copyright © 2006, Chicago Tribune

DEBATE SHROUDS FILM ON BRIDGE SUICIDES

By John M. Glionna

May 4, 2006

SAN FRANCISCO -- For an entire year the cameras rolled, capturing death amid the eerie fog and shifting tides.

One by one, filmmaker Eric Steel documented the final moments of nearly two dozen despondent men and women, and the agonizing, four-second fall after they leaped off the Golden Gate Bridge.

His intent, he says, was to illuminate "the darkest corner of the human mind." If he watched enough people take their own lives, he thought he could "spot the outward manifestations of their interior demons."

Steel says he, too, once considered suicide. "It's that Humpty Dumpty moment when it's all going to fall apart," he said. "For me and many others, it didn't come. For the people in this film, it did."

His documentary, "The Bridge," which opened at a film festival here last week after debuting in New York City, has already provoked outrage.

"This is like a newspaper carrying a front-page photo of someone blowing his head off; it's irresponsible, exploitative, voyeuristic, ghastly and immoral," said Mark Chaffee, president of Suicide Prevention Advocacy Network-California, who has not seen the movie. His 16-year-old son took his life in 1998.

The film set off alarms soon after shooting wrapped in December 2004.

Officials from the Golden Gate National Recreation Area said Steel misrepresented his project when applying for filming permits, telling them he wanted to capture the grandeur of the iconic bridge.

Instead, they complain, he made what one San Francisco supervisor dismissed in media coverage as a "snuff film."

Others say publicity over the movie prompted bridge officials to fund a $2 million study of building a pedestrian suicide barrier, a move they had long resisted. But the officials deny this.

"Does anybody think this movie truly addresses mental health problems?" said Mike Martini, a member of the bridge board of directors who says he will not see the film. "It's like saying you understand the Civil War after watching `Gone With the Wind.'"

The 93-minute documentary draws from thousands of hours of footage, including interviews with relatives of jumpers and with one man who survived the 25-story plunge.

Among those interviewed is Kevin Hines, who survived his jump in 2000. Now 25, he speaks freely about his ordeal with manic-depression in the hope that a suicide barrier will be built.

"This film shows what a suicidal person feels like, what their family feels like," said Hines, who has seen the movie. "It shows this happens all the time."

Since the Golden Gate Bridge opened in May 1937, an estimated 1,300 people have leaped to their deaths.

Steel, a 42-year-old New York native, said he has experienced his own bouts of depression. He lost a brother and a sister--one to cancer, the other to an accident with a drunken driver.

He wanted to know what made people hurl themselves into the roiling bay water. "One thing that stuck with me is that someone had to walk from the parking lot to one spot on the bridge before taking that jump," he said. "That walk must involve the most unimaginable mental anguish."

Steel established guidelines on when to intervene, instructing the crew to call emergency officials if a pedestrian set down a bag or briefcase, removed shoes or wallet, or climbed onto the rail. They intervened five times to stop jumpers, he said.

Steel sought details of the dead. He then crossed the country to interview friends and relatives. In the film, Wally Manikow, a tax collector from Virginia, relates a conversation with his son, Philip.

"He asked me, `Is suicide a sin?'" Manikow says as he sits on his living room couch absent-mindedly petting his dog. "I told him, `No, that's something man made up. God is not going to hold it against you.'"

The father pauses, then continues: "He thanked me for telling him the truth."

Steel did not inform family members that he had filmed their loved ones' suicides. Later, he acknowledged, "individual people called and were upset I didn't tell them."

Copyright © 2006, Chicago Tribune

SCHOLARS DISCOVER LOST SAMUEL BECKETT PLAY

PARIS—Just weeks after the centennial of the birth of pioneering minimalist playwright Samuel Beckett, archivists analyzing papers from his Paris estate uncovered a small stack of blank paper that scholars are calling "the latest example of the late Irish-born writer's genius."

The 23 blank pages, which literary experts presume to be a two-act play composed sometime between 1973 and 1975, are already being heralded as one of the most ambitious works by the Nobel Prize-winning author of Waiting For Godot, and a natural progression from his earlier works, including 1969's Breath, a 30-second play with no characters, and 1972's Not I, in which the only illuminated part of the stage is a floating mouth.

"In what was surely a conscious decision by Mr. Beckett, the white, uniform, non-ruled pages, which symbolize the starkness and emptiness of life, were left unbound, unmarked, and untouched," said Trinity College professor of Irish literature Fintan O'Donoghue. "And, as if to further exemplify the anonymity and facelessness of 20th-century man, they were found, of all places, between other sheets of paper."

"I can only conclude that we have stumbled upon something quite remarkable," O'Donoghue added.

According to literary critic Eric Matheson, who praised the work for "the bare-bones structure and bleak repetition of what can only be described as 'nothingness,'" the play represents something of a departure from the works of Beckett's "middle period." But, he said, it "might as well be Samuel Beckett at his finest."

"It does feature certain classic Beckett elements, such as sparse stage directions, a mysterious quality of anonymity, a slow building of tension with no promise of relief, and an austere portrayal of the human condition," Matheson said. "But Beckett's traditional intimation of an unrelenting will to live, the possibility of escape from the vacuous indifference that surrounds us—that's missing. Were that his vision, I suspect he would have used perforated paper."

Scholars theorize that the 23-page play might have been intended to be titled Five Conversations, Entropolis, or Stop.

In addition, an 81-page document, also blank, was found, which, for all intents and purposes, could be an earlier draft of the work.

"I suspect this was a nascent stream-of-consciousness attempt," O'Donoghue said of the blank sheets of paper, which were found scattered among Beckett's personal effects and took a Beckett scholar four painstaking days to put into the correct order. "In his final version, Beckett used his trademark style of 'paring down' to really get at the core of what he was trying to not say."

Some historians, however, contend that the play could have been the work of one of Beckett's protégés.

"Even though the central theme and wicked sense of humor of this piece would lead one to believe that this could conceivably be a vintage Beckett play, in reality, it could just as easily have been the product of [Beckett's close friend] Rick Cluchey," biographer Neal Gleason said. "And if it was Beckett, it's not outside the realm of possibility that, given his sharp wit, it was just intended as a joke. If Beckett were alive today, he might insist that it's not even a play at all. It could be a novella, or a screenplay."

Enthusiasts still maintain that the "nuances, subtleties, and allusions to his previous works" are all unmistakably Beckett. They also claim to have found notes and ideas for this play in the margins of Beckett's earlier works.

There are already plans to stage the play during the intermission of an upcoming production of Waiting For Godot.

© The Onion

CHEAPENING THE CAP AND GOWN

By MICHAEL WINERIP

Published: May 3, 2006

RAMSES SANTELISES was supposed to graduate from John F. Kennedy High in the Bronx in June 2005, but, he said, he goofed off his senior year. He failed senior English in the second semester and two gym classes. "I got senioritis," he said.

He was planning to make up the courses at summer school, but said that he got sick and was hospitalized, and that by the time he reported to summer school, he had missed too many days. They told him to sign up for night school in the fall. "I was upset," he said. "I was hoping to start college."

In late August he went to Kennedy to register for the night program, discussed the three courses he needed and, he said, got a big surprise. "They said, 'No questions asked, we're going to let you graduate,' " he recalled. "I never had to take the two gym classes and English class I should have taken."

And he wasn't alone, he said. "I know for a fact there were kids there they let graduate to get it over with," he said. At Kennedy's September 2005 graduation, Ramses was one of 105 students awarded diplomas, 31 more than graduated the September before.

For the last year, the city has been investigating whether long-troubled Kennedy High, which has been perilously close to landing on the state's failing school list, used several illegal methods to improve its academic standing. In February, this column raised questions about the principal's decision to change scores to passing from failing on 16 students' English Regents exams required for graduation. City officials were so concerned about grading practices at Kennedy that they issued a memorandum in March to all high school principals, announcing a new citywide policy aimed at closing loopholes in graduation requirements.

Faculty members at Kennedy say the abuses go beyond changing state Regents exam scores. Susan Werner, a longtime Kennedy guidance counselor, said that she and her colleagues knew of a dozen students who lacked the credits to graduate, but were awarded diplomas in the fall of 2005. Several counselors refused to sign papers certifying those students for graduation, she said, and complained to their supervisors, as well as the principal, Anthony Rotunno. "We felt this was unethical," Ms. Werner said.

Ms. Werner said that when she talked to Norma Smith, assistant principal for guidance, Ms. Smith "threw up her hands and said 'principal discretion.' " Ms. Werner said that when she complained to Rashid Davis, assistant principal for administration, "He made a big joke or got nasty."

Mr. Rotunno, Mr. Davis and Ms. Smith refused to be interviewed for this article. A city spokesman, David Cantor, said he could not comment on matters being investigated.

But Kathleen Pollina, the local superintendent who oversees Kennedy, defended Mr. Rotunno. She said that the school was being downsized, with staff members being "excessed" to make room for smaller schools, and that that caused tensions. She blamed a small group of staff members resistant to change. "Kennedy is making great strides toward changing what has long been a culture of failure," she said. "These changes are to benefit students."

Not all students, though, feel that they have benefited. "I'm happy to be leaving," said Luisa Gonzalez, the valedictorian. "It's been a stressful, disappointing year."

In September, Mr. Rotunno eliminated 4 guidance positions, on top of 5 cuts the previous June — an overall reduction to 6 counselors from 15. He said this was part of the restructuring of the school; the counselors say it was retribution for challenging him about students' graduating without enough credits. Either way, his move spurred a demonstration by many of the school's top students, which devolved into fights and led to seven student arrests.

Recently, Kennedy was added to the list of the city's most violent schools.

"The administration is at war with students and teachers," said Kimberly Rodriguez, a senior who will attend the University of Rochester this fall and wrote her college essay on the problems caused by her counselor's leaving. She said she got little help applying to college. "You could never get in the guidance office," she said. "I'm glad to be getting out."

Faculty members say relations with the principal degenerated after he announced a new grading policy in October 2004. A memorandum from Mr. Rotunno said that any student who passed a Regents exam with a 65 or better in a core subject like math would be given retroactive credit for any failed math classes leading up to the Regents. Mr. Davis, the assistant principal, explained that the new policy was needed to improve the graduation rate and keep Kennedy off the failing school list, according to minutes of an Oct. 13, 2004, meeting recorded by the principal's secretary.

"It was like giving kids a free pass," said Doris Diaz, a counselor whose position was eliminated and who works at DeWitt Clinton High in the Bronx. "Why would kids work during the year if they knew all they had to do was pass the Regents?"

Ms. Werner said, "They started giving out credits like candy." Global history is a four-term course spread over two years, and Ms. Diaz and Ms. Werner say they saw transcripts for students who had failed four terms of global history and were given credit for all four courses after passing the global Regents exam.

This reporter obtained copies of transcripts (with names blanked out) from a teacher who requested anonymity for fear of retribution. In one case, a student who failed three semesters of global history classes starting in January 2003 was given credit for those courses after passing the state global history Regents exam with a 65 in January 2005. A student who failed freshman English 1 and 2 in 2002-03 was given credit for those courses after passing the English Regents with a 68 in January 2005.

In an interview in February, Mr. Rotunno said the policy was not new, just a clarification of an existing policy that went back to the school's beginning.

Several longtime teachers and administrators disagreed.

Charles Saltzman was an assistant principal when Kennedy opened in 1972, and principal from 1984 until retiring in 1995.

Mr. Saltzman said the only time a Regents score was used to give a course credit was if a student passed the first half of a yearlong course, was failing the second half, but then passed the Regents exam. In that case, he said, "the teacher had the option to give a passing grade for second semester." But he added, "the teacher had final say."

UNDER Mr. Rotunno's policy, teachers no longer needed to be consulted. Eileen Sokoloff, an English teacher, had a student in the spring of 2005 whom she flunked with a 50. She said he did no work and was often absent, but much to her surprise, there he was at graduation. "I was furious," she said. "And other teachers were saying the same thing."

In February, after this reporter questioned whether Kennedy's grading policy was legal, Deputy Chancellor Carmen Fariña sent a memorandum to all high school principals warning about the limits of using Regents scores for course credit. Her memo quotes state policy: "the awarding of credit may not be based solely on the results of the state exam." She also announced a new city policy: If a Regents exam is used as part of a final grade, it should count for no more than 33 percent of the grade in the "terminal course" leading to the Regents exam.

Mr. Cantor, the city spokesman, said that under this new policy, giving credits for four terms of global or two freshman English classes based on a Regents score "would be a gross violation of the rules."

The city investigation will determine whether Kennedy's policies violated state or city rules in place at the time.

After Ms. Werner, a 20-year veteran, challenged the policy, her student caseload was taken from her, and she was given little to do. She retired in October, because, she said, she felt beaten down and disgusted.

Hard-working students like Kimberly Rodriguez felt the same way. "These kids who didn't work hard passed anyway," she said. "Why even bother when you could pass without doing anything?"

©NYTimes.com

MARSHMALLOWS AND PUBLIC POLICY

By DAVID BROOKS

May 7, 2006

Around 1970, Walter Mischel launched a classic experiment. He left a succession of 4-year-olds in a room with a bell and a marshmallow. If they rang the bell, he would come back and they could eat the marshmallow. If, however, they didn't ring the bell and waited for him to come back on his own, they could then have two marshmallows.

In videos of the experiment, you can see the children squirming, kicking, hiding their eyes — desperately trying to exercise self-control so they can wait and get two marshmallows. Their performance varied widely. Some broke down and rang the bell within a minute. Others lasted 15 minutes.

The children who waited longer went on to get higher SAT scores. They got into better colleges and had, on average, better adult outcomes. The children who rang the bell quickest were more likely to become bullies. They received worse teacher and parental evaluations 10 years on and were more likely to have drug problems at age 32.

The Mischel experiments are worth noting because people in the policy world spend a lot of time thinking about how to improve education, how to reduce poverty, how to make the most of the nation's human capital. But when policy makers address these problems, they come up with structural remedies: reduce class sizes, create more charter schools, increase teacher pay, mandate universal day care, try vouchers.

The results of these structural reforms are almost always disappointingly modest. And yet policy makers rarely probe deeper into problems to ask the core question: How do we get people to master the sort of self-control that leads to success? To ask that question is to leave the policy makers' comfort zone — which is the world of inputs and outputs, appropriations and bureaucratic reform — and to enter the murky world of psychology and human nature.

And yet the Mischel experiments, along with everyday experience, tell us that self-control is essential. Young people who can delay gratification can sit through sometimes boring classes to get a degree. They can perform rote tasks in order to, say, master a language. They can avoid drugs and alcohol.

For people without self-control skills, however, school is a series of failed ordeals. No wonder they drop out. Life is a parade of foolish decisions: teen pregnancy, drugs, gambling, truancy and crime.

If you're a policy maker and you are not talking about core psychological traits like delayed gratification skills, then you're just dancing around with proxy issues. You're not getting to the crux of the problem.

The research we do have on delayed gratification tells us that differences in self-control skills are deeply rooted but also malleable. Differences in the ability to focus attention and exercise control emerge very early, perhaps as soon as nine months. The prefrontal cortex does the self-control work in the brain, but there is no consensus on how much of the ability to exercise self-control is hereditary and how much is environmental.

The ability to delay gratification, like most skills, correlates with socioeconomic status and parenting styles. Children from poorer homes do much worse on delayed gratification tests than children from middle-class homes. That's probably because children from poorer homes are more likely to have their lives disrupted by marital breakdown, violence, moving, etc. They think in the short term because there is no predictable long term.

The good news is that while differences in the ability to delay gratification emerge early and persist, that ability can be improved with conscious effort. Moral lectures don't work. Sheer willpower doesn't seem to work either. The children who resisted eating the marshmallow didn't stare directly at it and exercise iron discipline. On the contrary, they were able to resist their appetites because they were able to distract themselves, and think about other things.

What works, says Jonathan Haidt, the author of "The Happiness Hypothesis," is creating stable, predictable environments for children, in which good behavior pays off — and practice. Young people who are given a series of tests that demand self-control get better at it over time.

This pattern would be too obvious to mention if it weren't so largely ignored by educators and policy makers. Somehow we've entered a world in which we obsess over structural reforms and standardized tests, but skirt around the moral and psychological traits that are at the heart of actual success.

Walter Mischel tried to interest New York schools in programs based on his research. Needless to say, he found almost no takers.

 © New York Times

WHAT IS THE BEST WORK OF AMERICAN FICTION OF THE LAST 25 YEARS?

By A. O. Scott

More than a century ago, Frank Norris wrote that "the Great American Novel is not extinct like the dodo, but mythical like the hippogriff," an observation that Philip Roth later used as the epigraph for a spoofy 1973 baseball fantasia called, naturally, "The Great American Novel." It pointedly isn't - no one counts it among Roth's best novels, though what books people do place in that category will turn out to be relevant to our purpose here, which has to do with the eternal hunt for Norris's legendary beast. The hippogriff, a monstrous hybrid of griffin and horse, is often taken as the very symbol of fantastical impossibility, a unicorn's unicorn. But the Great American Novel, while also a hybrid (crossbred of romance and reportage, high philosophy and low gossip, wishful thinking and hard-nosed skepticism), may be more like the yeti or the Loch Ness monster - or sasquatch, if we want to keep things homegrown. It is, in other words, a creature that quite a few people - not all of them certifiably crazy, some of them bearing impressive documentation - claim to have seen. The Times Book Review, ever wary of hoaxes but always eager to test the boundary between empirical science and folk superstition, has commissioned a survey of recent sightings.

Or something like that. Early this year, the Book Review's editor, Sam Tanenhaus, sent out a short letter to a couple of hundred prominent writers, critics, editors and other literary sages, asking them to please identify "the single best work of American fiction published in the last 25 years." The results - in some respects quite surprising, in others not at all - provide a rich, if partial and unscientific, picture of the state of American literature, a kind of composite self-portrait as interesting perhaps for its blind spots and distortions as for its details.

And as interesting, in some cases, for the reasoning behind the choices as for the choices themselves. Tanenhaus's request, simple and innocuous enough at first glance, turned out in many cases to be downright treacherous. It certainly provoked a lot of other questions in response, both overt and implicit. "What is poetry and if you know what poetry is what is prose?" Gertrude Stein once asked, and the question "what is the single best work of American fiction published in the last 25 years?" invites a similar scrutiny of basic categories and assumptions. Nothing is as simple as it looks. What do we mean, in an era of cultural as well as economic globalization, by "American"? Or, in the age of James Frey, reality television and phantom W.M.D.'s, what do we mean by "fiction"? And if we know what American fiction is, then what do we mean by "best"?

A tough question, and one that a number of potential respondents declined to answer, some silently, others with testy eloquence. There were those who sighed that they could not possibly select one book to place at the summit of an edifice with so many potential building blocks - they hadn't read everything, after all - and also those who railed against the very idea of such a monument. One famous novelist, unwilling to vote for his own books and reluctant to consider anyone else's, asked us to "assume you never heard from me."

More common was the worry that our innocent inquiry, by feeding the deplorable modern mania for ranking, list-making and fabricated competition, would not only distract from the serious business of literature but, worse, subject it to damaging trivialization. To consecrate one work as the best - or even to establish a short list of near-bests - would be to risk the implication that no one need bother with the rest, and thus betray the cause of reading. The determination of literary merit, it was suggested, should properly be a matter of reasoned judgment and persuasive argument, not mass opinionizing. Criticism should not cede its prickly, qualitative prerogatives to the quantifying urges of sociology or market research.

Fair enough. But there would be no point in proposing such a contest unless it would be met with quarrels and complaints. (A few respondents, not content to state their own preferences, pre-emptively attacked what they assumed would be the thinking of the majority. So we received some explanations of why people were not voting for "Beloved," the expected winner, and also one Roth fan's assertion that the presumptive preference for "American Pastoral" over "Operation Shylock" was self-evidently mistaken.) Even in cases - the majority - where the premise of the research was accepted, problems of method and definition buzzed around like persistent mosquitoes. There were writers who, finding themselves unable to isolate just one candidate, chose an alternate, or submitted a list. The historical and ethical parameters turned out to be blurry, since the editor's initial letter had not elaborated on them. Could you vote for yourself? Of course you could: amour-propre is as much an entitlement of the literary class as log-rolling, which means you could also vote for a friend, a lover, a client or a colleague. But could you vote for, say, "A Confederacy of Dunces," which, though published in 1980, was written around 20 years earlier? A tricky issue of what scholars call periodization: is John Kennedy Toole's ragged New Orleans farce a lost classic of the 60's, to be shelved alongside countercultural picaresques like Richard Fariña's "Been Down So Long It Looks Like Up to Me"? Or is it a premonition of the urban-comic 80's zeitgeist in which it finally landed, keeping company with, say, Jay McInerney's "Bright Lights, Big City"? What about story collections - I. B. Singer's, Donald Barthelme's, Raymond Carver's, for instance - that appeared between 1980 and 2005 but gathered up the work of earlier decades? Do they qualify? And - most consequentially, as it happened - what about John Updike's four "Rabbit" novels? Only the last two were published during the period in question, but all four were bound into a single volume and published, by Everyman's Library, in 1995. Considered separately, "Rabbit Is Rich" (1981) and "Rabbit at Rest" (1990) might have split Updike's vote, which "Rabbit Angstrom" was able to consolidate, placing it in the top five. If Nathan Zuckerman had received a similar omnibus reissue, with "The Counterlife," "The Human Stain," "American Pastoral" and the others squeezed into one fat tome, literary history as we know it - or at least this issue of the Book Review - would be entirely different.

THE question "what do you mean by 'the last 25 years'?" in any case turned out to be a live one, and surveying the recent past caused a few minds to wander farther back in time. One best-selling author (whose fat novels seem to have been campaigning for inclusion in this issue long before the editors dreamed it up, even though not even he bothered to vote for any of them) reflected on the poverty of our current literary situation by wondering what the poll might have looked like in 1940, with Hemingway, Faulkner and Fitzgerald - to say nothing of Theodore Dreiser, Willa Cather and Sinclair Lewis - in its lustrous purview. The last time this kind of survey was conducted, in 1965 (under the auspices of Book Week, the literary supplement of the soon-to-be-defunct New York Herald Tribune), the winner was Ralph Ellison's "Invisible Man," which was declared "the most memorable" work of American fiction published since the end of World War II, and the most likely to endure. The field back then included "The Adventures of Augie March," "Herzog," "Lolita," "Catch-22," "Naked Lunch," "The Naked and the Dead" and (I'll insist if no one else will) "The Group." In the gap between that survey and this one is a decade and a half - the unsurveyed territory from 1965 to 1980 - that includes Thomas Pynchon's "Gravity's Rainbow" and William Gaddis's "JR," as well as "Humboldt's Gift," "Portnoy's Complaint," "Ragtime," "Song of Solomon" and countless others.

Contemplation of such glories lent an inevitable undercurrent of nostalgia to some of the responses. Where are the hippogriffs of yesteryear? Could they have been dodos all along? Not to worry: late-20th-century American Lit comprises a bustling menagerie, like Noah's ark or the island of Dr. Moreau, where modernists and postmodernists consort with fabulists and realists, ghost stories commingle with domestic dramas, and historical pageantry mutates into metafiction. It is, gratifyingly if also bewilderingly, a messy and multitudinous affair.

It is perhaps this babble and ruckus - the polite word is diversity - that breeds the impulse of which Sam Tanenhaus's question is an expression: the urge to isolate, in the midst of it all, a single, comprehensive masterpiece. E pluribus unum, as it were. We - Americans, writers, American writers - seem often to be a tribe of mavericks dreaming of consensus. Our mythical book is the one that will somehow include everything, at once reflecting and by some linguistic magic dissolving our intractable divisions and stubborn imperfections. The American literary tradition is relatively young, and it stands in perpetual doubt of its own coherence and adequacy - even, you might say, of its own existence. Such anxiety fosters large, even utopian ambitions. A big country demands big books. To ask for the best work of American fiction, therefore, is not simply - or not really - to ask for the most beautifully written or the most enjoyable to read. We all have our personal favorites, but I suspect that something other than individual taste underwrites most of the choices here. The best works of fiction, according to our tally, appear to be those that successfully assume a burden of cultural importance. They attempt not just the exploration of particular imaginary people and places, but also the illumination of epochs, communities, of the nation itself. America is not only their setting, but also their subject.

They are - the top five, in any case, in ascending order - "American Pastoral," with 7 votes; Cormac McCarthy's "Blood Meridian" and Updike's four-in-one "Rabbit Angstrom," tied with 8 votes each; Don DeLillo's "Underworld," with 11; and, solidly ahead of the rest, Toni Morrison's "Beloved," with 15. (If these numbers seem small, keep in mind that they are drawn from only 125 votes, and from a pool of potential candidates equal to the number of books of fiction by American writers published in 25 years. Sometimes cultural significance can be counted on the fingers of one hand.)

Any other outcome would have been startling, since Morrison's novel has inserted itself into the American canon more completely than any of its potential rivals. With remarkable speed, "Beloved" has, less than 20 years after its publication, become a staple of the college literary curriculum, which is to say a classic. This triumph is commensurate with its ambition, since it was Morrison's intention in writing it precisely to expand the range of classic American literature, to enter, as a living black woman, the company of dead white males like Faulkner, Melville, Hawthorne and Twain. When the book first began to be assigned in college classrooms, during an earlier and in retrospect much tamer phase of the culture wars, its inclusion on syllabuses was taken, by partisans and opponents alike, as a radical gesture. (The conservative canard one heard in those days was that left-wing professors were casting aside Shakespeare in favor of Morrison.) But the political rhetoric of the time obscured the essential conservatism of the novel, which aimed not to displace or overthrow its beloved precursors, but to complete and to some extent correct them.

It is worth remarking that the winner of the 1965 Book Week poll, Ralph Ellison's "Invisible Man," arose from a similar impulse to bring the historical experience of black Americans, and the expressive traditions this experience had produced, into the mainstream of American literature. Or, rather, to reveal that it had been there all along, and that race, far from being a special or marginal concern, was a central facet of the American story. On the evidence of Ellison's and Morrison's work, it is also a part of the story that defies the tenets of realism, or at least demands that they be combined with elements of allegory, folk tale, Gothic and romance.

The American masterpieces of the mid-19th century - "Moby-Dick," "The Scarlet Letter," the tales of Edgar Allan Poe and, for that matter, "Uncle Tom's Cabin" - were compounded of precisely these elements, and nowadays it seems almost impossible to write about that period without crossing into the realm of the supernatural, or at least the self-consciously mythic. This is surely what ties "Beloved" to "Blood Meridian." Both novels treat primordial situations of American violence - slavery and its aftermath in one case, the conquest of the Southwestern frontier in the other - in compressed, lyrical language that rises at times to archaic, epic strangeness. Some of their power - and much of their originality - arises from the feeling that they are uncovering ancient tales, rendering scraps of a buried oral tradition in literary form.

But the recovery of the past - especially the more recent past - turns out to be the dominant concern of American writing, at least as reflected in this survey, over the past quarter-century. Our age is retrospective. One obvious difference between "Invisible Man" and "Beloved," for instance, is that Ellison's book, even as it flashes back to the Depression-era South and the Harlem of the 1940's, plants itself in the present and leans forward, to the point of risking prophecy. "Beloved," in contrast, concerns itself with the recovery of origins, the isolation of a primal trauma whose belated healing will be undertaken by the narrative itself. And while "Blood Meridian" is far too gnomic and nihilistic to claim such a therapeutic function for itself, it nonetheless shares with "Beloved" a vision of the past as an alien realm of extremity, in which human relations are stripped to the bare essentials of brutality and tenderness, vengeance and honor.

In some ways, the mode of fiction McCarthy and Morrison practice is less historical than pre-historical. It does not involve the reconstruction of earlier times - the collisions between real and invented characters, the finicky attention to manners, customs and habits of speech - that usually defines the genre. But to look again at the top five titles in the survey is to discover just how heavily the past lies on the minds of contemporary writers and literary opinion makers. To the extent that the novel can say something about where we are and where we are going, the American novel at present chooses to do so above all by examining where we started and how we got here.

IF "Beloved" and "Blood Meridian" pull us back to a premodern American scene - a place that exists beyond realism and in some respects before civilization as we know it - the other three novels trace the more recent ups and downs of that civilization. Indeed, it is only a small exaggeration to say that "Underworld," "American Pastoral" and "Rabbit Angstrom" are variations on the same novel, a decades-spanning tale rooted in the old cities of the Eastern Seaboard. Needless to say, the methods, the characters and the voices are quite distinct - no one would mistake Roth for DeLillo or Updike for Roth - but these are differences of perspective, as if three painters were viewing the same town from neighboring hillsides.

The three novels do what we seem to want novels to do, which is to blend private destinies with public events, an exercise that the postwar proliferation of media simultaneously makes more urgent and more difficult. Rabbit Angstrom, high school basketball star, typesetter-turned-car-dealer, as carelessly loyal to his country as he is unfaithful to his wife, is an incarnation of the American ordinary made exemplary by the grace of God and of Updike's prose. Especially in the later novels, his consciousness becomes the prism through which the unsettled experience of the nation is refracted. The war in Vietnam, the racial agitations of the 60's, the moon landing, the Carter-era malaise, the end of the cold war: all of these are filtered through Rabbit's complacent gaze. So are less dramatic but no less consequential shifts in manners and morals, in taste and sensibility. Food, sex, cars, real estate, social class, religion - everything changes from "Rabbit, Run" to "Rabbit at Rest," even as the deep continuities of American life, embodied in the hero's transcendent laziness, appear to triumph in the end.

"Rabbit Angstrom" is not, strictly speaking, a novel of retrospect; it was written in the present tense and in real time, each segment composed before the end of the story could be known. Because of this - because Updike's gift for observing the present has always outstripped his ability to animate the past - "Rabbit," like the great Russian and French realist novels of the 19th century, becomes an unequaled repository of historical detail. Next to it, Updike's attempted multigenerational chronicle of 20th-century American history, "In the Beauty of the Lilies," looks thin and stagy.

Alongside Rabbit there is Zuckerman, his near contemporary, and like him the product of a small, industrial mid-Atlantic city. More pointedly, perhaps, there is Swede Levov, the hero of "American Pastoral" (Zuckerman being the self-effacing narrator), who is, like Rabbit, a star athlete in high school and whose nickname curiously recalls Rabbit's ethnic background. But while Rabbit is, for all the suffering he endures and inflicts, a fundamentally comic character, his destiny arcing toward happiness, Swede's trajectory is tragic. Fate has raised him high in order to see how far he might fall. He contains traces of Job - his fidelity to America tested by brutal and arbitrary misfortune - and also of Lear, snakebit by one of the most floridly and obscenely ungrateful children in all of literature.

The agonized question that ripples through "American Pastoral" is "what happened?" How did the pastoral America of Newark in the 40's and 50's - an Eden only in retrospect - come apart? And its selection over Roth's other books is indicative of how important this question is taken to be. Over the past 15 years, Roth's production has been so steady, so various and (mostly) so excellent that his vote has been, inevitably, split. If we had asked for the single best writer of fiction over the past 25 years, he would have won, with seven different books racking up a total of 21 votes. Within these numbers is an interesting schism. The loose trilogy of which "American Pastoral" is the first installment - "I Married a Communist" and "The Human Stain" are its companions - accounts for 11 votes, while 8 are divided among "Sabbath's Theater," "The Counterlife" and "Operation Shylock," and another 2 go to "The Plot Against America." The Roth whose primary concern is the past - the elegiac, summarizing, conservative Roth - is preferred over his more aesthetically radical, restless, present-minded doppelgänger by a narrow but decisive margin.

A similar split occurs among DeLillo's partisans, who favor the historical inquiry of "Underworld" over the contemporaneity of "White Noise." (There were also two voters who chose "Libra," a more narrowly focused historical fiction and in some ways a rehearsal for "Underworld.") Like "American Pastoral," "Underworld" is a chronologically fractured story drawn by a powerful nostalgic undertow back to the redolent streets of a postwar Eastern city. Baseball and the atom bomb, J. Edgar Hoover and the science of waste disposal are pulled into its vortex, but whereas Updike and Roth work to establish connection and coherence in the face of time's chaos, DeLillo is an artist of diffusion and dispersal, of implication and missing information. But more than his other books, "Underworld" is concerned with roots, in particular with ethnicity. Nick Shay, at first glance another one of his tight-lipped, deracinated postmodern drifters, turns out to be a half-Italian kid from the old East Bronx, and the characteristic rhythms of DeLillo's prose - the curious noun-verb inversions, the quick switches from abstraction to earthiness, from the decorous to the profane - are shown to arise, as surely as Roth's do, from the polyglot idiom of the old neighborhood.

So the top five American novels are concerned with history, with origins, to some extent with nostalgia. They are also the work of a single generation. DeLillo, born in 1936, is the youngest of the five leading authors. The others were born within two years of one another: Morrison in 1931, Updike in 1932, Roth and McCarthy in 1933.

 

Their seniority, needless to say, is earned - they have had plenty of time to ripen and grow - but it is nonetheless startling to see how thoroughly American writing is dominated by this generation. Startling in part because it reveals that the baby boom, long ascendant in popular culture and increasingly so in politics and business, has not produced a great novel. The best writers born immediately after the war seem almost programmatically to disdain the grand, synthesizing ambitions of their elders (and also some of their juniors), trafficking in irony, diffidence and the cultivation of small quirks rather than large idiosyncrasies. Only two books whose authors were born just after the war received more than two votes: "Housekeeping," by Marilynne Robinson, and "The Things They Carried," by Tim O'Brien. These are brilliant books, but they are also careful, small and precise. They do not generalize; they document. Ann Beattie, born in 1947, is among the most gifted and prolific fiction writers of her generation, but her books are nowhere to be found on this list; not, I would venture, because she fails to live up to the survey's implicit criterion of importance, but because she steadfastly refuses to try.

 

Expand beyond the immediate parameters of this exercise, and the generational discrepancy grows even more acute: add Thomas Pynchon and E. L. Doctorow, Anne Tyler and Cynthia Ozick, John Irving and Joan Didion and Russell Banks and Joyce Carol Oates and you will have a literary pantheon born almost to a person during the presidency of Franklin Roosevelt. Further expansion - by means of a Wolfe here, a Mailer there - is likely to push the median age still higher. Think back on that 1965 survey; it's hard to find an author on the list of potential candidates much older than 50.

Is this quantitative evidence for the decline of American letters - yet another casualty of the 60's? Or is the American literary establishment the last redoubt of elder-worship in a culture mad for youth? In sifting through the responses, I was surprised at how few of the highly praised, boldly ambitious books by younger writers - by which I mean writers under 50 - were mentioned. One vote each for "The Corrections" and "The Amazing Adventures of Kavalier & Clay," none for "Infinite Jest" or "The Fortress of Solitude," a single vote for Richard Powers, none for William T. Vollmann, and so on.

 

But the thing about mythical beasts is that they don't go extinct; they evolve. The best American fiction of the past 25 years is concerned, perhaps inordinately, with sorting out the past, which may be its way of clearing ground for the literature of the future. So let me end with a message to all you aspiring hippogriff breeders out there: 2030 is just around the corner. Get to work.

© NY TIMES 2006

  

BAN ON BOOKS TO GET A VOTE

By Jamie Francisco
Tribune staff reporter
Published May 24, 2006

A bid to remove nine books from the required-reading list of the second-largest high school district in Illinois has triggered debate over whether works praised in literary circles are high art or smut.

The issue arose this month when Township High School District 214 board member Leslie Pinney flagged books that she said contain vulgar language, brutal imagery or depictions of sexual situations inappropriate for students.

The board is scheduled to vote Thursday night on whether to keep the books as part of the curriculum.

"If the media are bombarding our children with explicit sexual images and graphic violence and prolific profanity, can't a school relent from that?" Pinney said. "Is there a different level of standards? That's my question."

The titles on Pinney's list are "Beloved" by Toni Morrison, "Slaughterhouse-Five" by Kurt Vonnegut, "The Things They Carried" by Tim O'Brien, "The Awakening" by Kate Chopin, "Freakonomics" by Steven D. Levitt and Stephen J. Dubner, "The Botany of Desire: A Plant's-Eye View of the World" by Michael Pollan, "The Perks of Being a Wallflower" by Stephen Chbosky, "Fallen Angels" by Walter Dean Myers and "How the Garcia Girls Lost Their Accents" by Julia Alvarez.

It is the first time in more than 20 years that a reading list has been challenged in the Arlington Heights-based district, said Chuck Venegoni, who heads the English and fine arts departments at Hersey High School. The district uses an extensive review process based on established national reading lists, and the suggestion that teachers are using materials on par with porn is insulting, he said.

"This is not some serendipitous decision to allow someone to do what they felt like doing because they had something about talking about something kinky in front of kids," Venegoni said. "It's insulting to hardworking people who really do care about kids."

Pinney, the mother of a District 214 graduate, admits she has not read all the books. She is not seeking to ban them from district libraries, but in class she would like to replace them with books that address the same themes without explicit material.

Among her objections are a bestiality scene in "Beloved," graphic violence in "The Things They Carried" and masturbation references in "Wallflower."

Venegoni said he has received dozens of e-mails of support from parents but also has had to explain that pornography is not part of the lesson plan.

"For however edgy a few passages taken out of context, there is nothing in any of those books that even remotely approaches what an objective person would call pornography," he said.

Several conservative groups have rallied to Pinney's cause, saying that the books promote porn, which has prompted community members on both sides to flood board members and teachers with e-mail.

In 2005 the American Library Association received more than 400 requests to pull books from the shelves of school and public libraries, a spokeswoman said.

The call to ban books is timeless, but it is important to continue supporting literature that makes readers think, said Mary Dempsey, commissioner of the Chicago Public Library.

Mayor Richard Daley selected "The Things They Carried" in 2003 for the One Book, One Chicago city reading series. The book, about the Vietnam War, was a finalist for the Pulitzer Prize in 1990. The themes--love, hate, war, kindness and cruelty--along with the author's ability to convey the harshness of war outweighed concerns, Dempsey said.

"I cannot imagine that language in that book is not said on a battlefield and, candidly, is not said in the corridors of most high schools in the suburbs of Chicago today," she said. "Good literature is supposed to get people to think. And sometimes, good literature takes you out of your comfort zone."

District 214 officials rely on the expertise of teachers and other members of textbook selection committees to scrutinize reading lists used nationwide, said board President William Dussling. The books are geared for juniors and seniors in honors or Advanced Placement courses to prepare them for college, he said. Parents can have their students opt out if they find reading material objectionable, he said.

"There will be accommodations made of something else to read that will still meet the learning points in the class," Dussling said. "It's not a matter of this is it."

The district's six schools have nearly 13,000 students in Arlington Heights, Buffalo Grove, Des Plaines, Elk Grove Village, Mt. Prospect, Prospect Heights, Rolling Meadows and Wheeling.

Pinney said the system needs to be modified so parents are better informed.

"The opt-out clause is flawed because unless you're digging around the student's backpack, looking at the books and reading them, how exactly will you know what your student is reading?" Pinney said.

Terri Brightwell, whose son is a senior at Rolling Meadows High School, agreed.

"Their standards may not be my standards," she said. "It should be open to a process where parents are involved."

On the other side, Sharon Neff said she trusts the judgment of educators compiling the lists and believes they are introducing subjects to her daughter, Valerie, that will prepare her for life beyond Hersey.

"That's not a watered, diluted version of reality. Without it, the literature isn't as effective," she said. "[Pinney] needs to read the books."

jfrancisco@tribune.com © 2006
 

 

CAUTION! SEX, VIOLENCE AND DANGEROUS IDEAS. DON’T READ!

I want to thank the activist in Township High School District 214, in Arlington Heights, who recently tried to ban Tim O'Brien's "The Things They Carried" and other classic works of literature. You have made my job as a high school English teacher a lot easier.

Usually, I have to prompt students to read some of those very books you wanted to ban. It's tough work, believe me. But I've learned one invaluable fact: Banned books are hot.

Every June, I warn my students about the violent and explicit texts on our summer reading list. The scandalous works include Joseph Conrad's "Heart of Darkness," Bram Stoker's "Dracula" and Toni Morrison's "The Bluest Eye." These novels are standard fare for most college-prep high schools, but nobody has to know that straight off. It's an obvious ploy, but once I mention severed heads on sticks and teenage pregnancy, students suddenly begin jotting down titles.

Don't get me wrong: I sometimes yearn for blander classroom reading. I say this as someone who recently had to explain the term "maidenhead" in "Romeo and Juliet" to 28 giggling freshmen. Certain scenes in "The Canterbury Tales," which I read with the seniors, could not be fully described in a family newspaper. If you've read "The Miller's Tale," you know what I'm talking about.

But I refuse to leave my students ignorant of some of the landmark works of English literature, as well as the techniques the authors used to engage their audiences. Shakespeare used bawdy language to capture the "groundlings" in the cheap standing room section of the Globe--and in truth, we all have some groundling in us.

You can't take sex and violence out of high school any more than you can take them out of the nine o'clock news. The best books often use this material to explore the morality and consequences of characters' actions.

I teach in Pilsen, a low-income area of the city that has suffered more than its share of youth violence. One of my most gifted students was shot to death at age 17. This weighty backdrop frequently influences classroom discussions. Several years ago when I read "Antigone" with one senior class, a student asked why Antigone and her siblings did not leave Thebes after "everything happened."

We spoke about it a while and ended up comparing the ideas in "Antigone" to the way some modern families continue to live in their homes after a son or nephew is killed on their street. I have rarely felt such immediacy when discussing a classic, yet I never would have experienced it if "Antigone" had been banned from the classroom for, say, its depiction of incest and explicit violence.

In debates about banning books, we remain too focused on the content. Nobody ever wants to ban a book for craft-related reasons, such as too many cliches or a real lack of externalized characterization.

"The Da Vinci Code," for instance, offends me, but not because of its absurd portrayal of a Jesus who fathered a child and a villainous Catholic Church. Rather, the language is passive and predictable.

Several years ago I stopped using in my class another title on the District 214 proposed ban: "How the Garcia Girls Lost Their Accents," by Julia Alvarez. But it was not for its allusion to premarital sex (and its drawbacks). I discontinued it because the students were confused by the shifting narration.

When I first heard about the proposed ban in northwest suburban District 214, I admit I yawned, thinking of historical challenges to acclaimed books such as "To Kill a Mockingbird" and "The Catcher in the Rye." Then I heard a radio interview with Leslie Pinney, the most ardent activist, and it almost made me think she had done her homework and might have a point. When she received the district's proposed purchasing list, she said, she picked out a few titles and "did what [I] typically do when I don't know enough."

OK, I thought, a bit begrudgingly, since she read them carefully, I'm going to have to hop down from my freedom-of-literature high horse and actually give her perspective more consideration. But Pinney hadn't read the books; she had merely Googled the titles and been disgusted with what came up.

I nearly pulled over the car. I don't let my high school freshmen get away with Google as primary research. Frustrated, I reacted the way I imagine many teachers do to such controversies: book-banning efforts like this don't merely convey a suspicion of teachers and their text choices, they imply that we are not qualified to select appropriate literature for our students.

The District 214 school board rejected Pinney's proposal in a 6-1 vote. But the controversy brought hundreds out to the meeting and touched a nerve. And ultimately, Pinney raised an important point: Parents and guardians should try to be aware of what their children are reading. To me, this is less an act of censorship than an invitation to engage in conversation--surely one of the most important and rewarding purposes of literature.

Several years ago a student in my freshman class mentioned she had been reading John Steinbeck's "Of Mice and Men" aloud to her mother. My student said her mother was surprised at first by the occasional use of profanity--one of the reasons that the book has been periodically banned over the years.

Girding myself, I asked the student if she wanted me to talk to her mother about why we read it. Mentally I was already planning a defense about the book's vivid illustration of the American dream, and struggles faced by minorities and the disabled, but my student just smiled and shook her head. She had already talked it through with her mother herself.


Carolyn Alessio, a former deputy editor for the Tribune's Books section, teaches English at Cristo Rey Jesuit High School in Pilsen and is a writer.

Her summer reading list

Incoming seniors at Cristo Rey Jesuit High School choose books from the following:

"Heart of Darkness" Joseph Conrad
"Brave New World" Aldous Huxley
"Dracula" Bram Stoker
"The Hitchhiker's Guide to the Galaxy" Douglas Adams
"Killing Pablo" Mark Bowden
"In the Company of Heroes" Michael J. Durant with Steven Hartov
"Plainsong" Kent Haruf
"Moneyball" Michael Lewis
"The Dive from Clausen's Pier" Ann Packer
"A Hope in the Unseen" Ron Suskind
"Don Quijote de la Mancha" Miguel de Cervantes Saavedra

 

KIDS GONE WILD? IN PRAISE OF HAZING

Gary Alan Fine

June 20, 2006

Hazing is good for America. Those of us who have been through fraternity (and some sorority) initiations, at one time a hallowed part of campus life, know that they develop shared feelings of honor and pride. But such rituals have been toned down in today's no-risk, litigious, surveillance society. Where once we accepted the rough-and-tumble of youth culture, now everything is examined through the thorny eyes of lawyers.

Recently, Northwestern University suspended some members of the women's soccer team from some 2006-07 regular-season games for hazing. Some players also received probation and others unspecified "additional disciplinary action." The men's swim team and the Northwestern Wildcat mascot squad also were punished in separate incidents.

The truth is that in almost all instances hazing is not harmful. Girls will be girls (and boys, boys) and any punishment will be ineffective. And hazing rituals have real benefits.

Initiations require mutual support and bonding among members. The initiates give up some of their dignity, smudge their reputations, because they know that others in the group will have done the same. They gain a confidence that their mates will support them through college and after. Those more senior know that the initiates wish to join with such intensity that they are willing to let themselves be humiliated. You agree to become the butt of a collective joke, shrouded in secrecy. No one will ever know, so your public self is preserved.

Being told that you're going to eat worms, strip to your skivvies, or chug a few beers while being paddled is not everyone's idea of fun. But it is precisely the willingness to put up with these uncomfortable (and sometimes painful) antics that indicates you care deeply about membership. The group matters. Initiates give up part of their personal reputation to acquire the benefits of the reputation of the team. And this strengthens the group and the person.

Indeed, what is striking about the women's soccer initiation at Northwestern is that all reports suggest the women participated voluntarily and considered it fun.

Granted, initiations can go too far. Some rules are essential (no sexual contact, reasonable boundaries on physical punishment, and, most significantly, demands that the organizers refrain from alcohol). Excessive practices often occur when authorities prohibit initiations. When we do not teach teenagers how to drink responsibly, they learn to drink rapidly and to excess. When initiations are pushed underground, they are re-created without tradition and sometimes without boundaries. When universities do not learn that bonding rituals are valid and valuable, they respond with fear and create foolish rules that encourage violations.

Initiations were once tied mostly to the doings of college men. Perhaps the sexist idea that this rough sport was acceptable for boys led to a greater acceptance of these rituals. However, female athletes and sorority members are now quite as wild as their male counterparts. And good for them. Bonding used to be a male activity, but now female bonding serves the same valid purposes for women as it long has for their brothers.

However, one rule should be inviolable. No Internet pictures. Today the tut-tut images of young adults romping in their panties, downing brews, being bound with tape or giving lap dances on Web sites such as badjocks.com combine smarmy voyeurism with unctuous morality, the worst of both worlds. For hazing to have its positive effects, it must separate the group from those outside to create a powerful connection among members.

College administrators may want to punish students for their violations, but these are rules that no one needs or wants.

Left alone, these students will create connections that will serve them for life. Just ask President Bush and Sen. John Kerry (D-Mass.) and their Skull and Bones brothers.
 

Copyright © 2006, Chicago Tribune

 

RAPPERS UPSET AT OPRAH? HERE’S WHY WE DON’T CARE

 By Leonard Pitts

(a syndicated columnist based in Washington: Knight Ridder/Tribune)

Published June 27, 2006

 

Would somebody please tell the hip-hop community to stop whining?

Go drink some Cristal, buy some bling, pimp some hoes or do whatever it is you do for amusement, but please, cease, desist, shut up already about how Oprah Winfrey has hurt your feelings.

For those who came in late: Over the last month, a trio of high-profile rappers has leveled criticism at Winfrey for what they feel is her disrespect of their medium. The first blast came from a gentleman who calls himself Ludacris, but whose birth certificate identifies him as Christopher Brian Bridges. He said that when he appeared on Winfrey's show to promote the movie "Crash," in which he co-starred, she treated him dismissively.

The complaint was echoed by 50 Cent (born Curtis James Jackson III), who complained to The Associated Press that Winfrey rarely features hip-hop on her talk show.

"Oprah's audience is my audience's parents, so I could [not] care less about Oprah or her show," he said, sounding like a guy who cares too much.

Then Ice Cube (né O'Shea Jackson) got into the fray, complaining to FHM magazine that he's never been invited to sit on Winfrey's couch. "She's had damn rapists, child molesters and lying authors on her show. And if I'm not a rags-to-riches story for her, who is?"

Not that anyone asked me, but I could answer all of this in words of one syllable: boo hoo.

Winfrey, though, evidently feeling these gentlemen deserved more response than that, went on a New York radio station and told disc jockey Ed Lover that rumors of her distaste for hip-hop are exaggerated. "I've got a little 50 on my iPod," she said.

Some of us chose to take that revelation with a box of salt. Some of us were left wondering when, how and why liking hip-hop came to be a litmus test for, well, anything. Winfrey went on to explain that her problem with hip-hop is that some of it offends her "sensibilities." She will not, she said, support music that marginalizes women.

You think maybe she could have been referring to that rap video where a credit card is swiped through a woman's backside? Or to any of the hundreds of other videos where women are treated as props and accoutrements? Or to the ones where they are addressed in terms normally reserved for prostitutes and canines?

Here's what amuses me: These guys actually think they have a point. They actually think they've been wronged. And never mind the thousand and one ways their music has wronged us all.

The lords of hip-hop made their fortunes and their fame by flipping the middle-finger salute to Middle American alarm and apprehension over their music, its rawness, its explicitness, its violence and its effects. They were outsiders, loud and profanely proud in their rejection of white-picket-fence mores and norms.

Fine. They have every right.

But now they're singing the blues because the ultimate arbiter of white-picket-fence mores and norms wants nothing to do with them? Now they're seeking sympathy because they are denied a stamp of approval from Middle America's main gatekeeper?

What do they expect? You can't have it both ways. You cannot curse people and expect them to support you, cannot offend them then ask them to welcome you. I'm reminded of what mama always said about respect: You got to give some to get some. Perhaps this is news to the hip-hop nation, populated as it is by people who routinely embrace values neutrality and moral relativism, who often duck responsibility for what they say and how they say it, who frequently refuse to recognize that words have meaning and consequence.

But if it's new to them, it's validation to me.

For the better part of 20 years, hip-hop's overriding message has been, "Bleep the mainstream."

Apparently, these guys are upset that they're being taken at their word.

© Chicago Tribune 2006

 

PERFECT’S NEW PROFILE, WARTS AND ALL

 

By Tamar Lewin

 

September 3, 2006

 

When it comes to the SAT, perfect is now a whole lot harder. But take heart if you favor cursive over printing, the third person over the first, and more over less. You may have an edge.

 

More than 1,000 students got a perfect 1600 last year, when the college-admissions test consisted of the time-honored two 800-point sections, verbal and math.

 

But now that the test has been revamped and expanded to nearly four hours, with a new writing section that includes an essay, the average scores have dropped by seven points — and only 238 students received the new perfect score of 2400.

Technically, even perfect isn’t necessarily perfect. Students can get the top score even if they miss a few questions.

“We actually don’t know how many got a perfect perfect,” said Caren Scoropanos, a spokeswoman for the College Board.

What the College Board does know is that the top scorers comprised 131 boys and 107 girls, or just 0.017 percent of the almost 1.5 million college-bound seniors who took the test.

 

It seems to be the writing test that has made the number of perfects plummet. While the math and reading sections each had more than 8,000 top scores, only 4,102 students were rated perfect on the writing test, the only part of the exam where girls outscored boys.

 

Most of the writing test — and three-quarters of the writing score — consists of multiple-choice questions on grammar and usage. But most of the anxiety among high school students centers on the 25-minute essay, graded on a scale of one to six by at least two readers, who spend about three minutes on each essay. Their two scores are added. And, the College Board said, the reason so few students won top marks on the writing section is that so few — less than one percent — got sixes from both readers, for that perfect 12.

 

The SAT Scoring Guide says an essay gets a six if it “effectively and insightfully develops a point of view on the issue,” “demonstrates outstanding critical thinking,” “is well organized and clearly focused,” and “exhibits skillful use of language.” Grading is holistic, with no points off for spelling errors or small linguistic flaws.

 

Last week, when the board released 20 top-scoring essays, all on the topic of whether memories are a help or a hindrance, it was impossible not to notice that many were — what’s the right word? — awkward:

 

“Memory is often the deciding factor between humans and animals,” one started.

 

“It is a commonly cited and often clichéd adage that people learn from their mistakes,” wrote another.

 

“We reason only with information, that is, reason is the mortar that arranges & connects pieces of information into the palace of understanding,” said a third.

 

Ed Hardin, who helped develop the writing test for the College Board, has an explanation: "Someone has to get a six," he said. "Student writing, over all, is not very strong, which is the reason we added the writing test to the SAT. We hope they'll get better."

 

After analyzing the results, the board had these insights for the next crop of SAT-takers.

 

Eighty-four percent of the essays took up more than one page, and longer essays were more likely to get a high score than shorter ones. (Two pages is the limit.)

 

Most essays were printed, but those written in cursive got slightly higher scores.

 

About half the essays were written in the first person, but those that did not use the first person got slightly higher scores.

“You can certainly write a first-person essay and get a six, but it’s also true that a lot of very low-performing students write first person,” Mr. Hardin said. “What we tried to show, in releasing these top-scoring essays, is that lots of different things can work. You can select any style, any approach that you think suits your strengths as a writer.”

 

And on the essay, at least, it’s possible to get a six without coming anywhere near perfect.

© New York Times 2006

 

19TH CENTURY OUTWEIGHS 20TH FOR TOP NOVELS

By RICHARD BERNSTEIN 
Copyright 1998 New York Times

With all due respect to Arthur Schlesinger Jr., A.S. Byatt, William Styron and the others who, acting at the behest of the Modern Library, produced a list of the 100 greatest English-language novels of the century, the truth is that the entire endeavor is so drenched in caprice as to be close to silly.

You might be able scientifically to pick the 100 best-ever baseball players, because there would be a certain statistical basis to rely on. The same is not the case with works of the imagination. Still, the purpose of the exercise was to provoke discussion, always a good thing.

 

Should Ulysses really have been No. 1, especially when one suspects that almost nobody, probably including most members of the Modern Library panel, has ever read James Joyce's difficult masterpiece from cover to cover? How can Joseph Conrad's immortal Lord Jim be No. 85, while Zuleika Dobson by Max Beerbohm ends up 59th? And here's another question that emerged from a recent conversation duly provoked by the Modern Library list: How many of the books would still be on the list if it had included works of the 19th century, as well? My own quick answer is, not many.

 

Indeed, when one starts examining this issue a bit more closely, the Modern Library list of 20th-century writers virtually demonstrates that the 19th century was a greater epoch for literature. In any case, I offer that statement (one that is certainly no more outrageous than some of the choices in the Modern Library list) as a challenge for rebuttal.

 

It is true, of course, that the Modern Library list has many very good novels: two by Joyce as well as works by F. Scott Fitzgerald, William Faulkner, Vladimir Nabokov, Robert Graves and Virginia Woolf, along with books by Henry James and Joseph Conrad, 19th-century men whose greatest works squeaked in just over the 20th-century line.

 

But on the 19th-century list would be Charles Dickens and Herman Melville, Jane Austen and Mark Twain ("the Lincoln of our literature," said William Dean Howells). Comparing Virginia Woolf with Jane Austen, in my view, is similar to comparing Samuel Richardson of Clarissa with Henry Fielding of Tom Jones, as Samuel Johnson did. "There is more knowledge of the heart in one letter of Richardson's than in all Tom Jones," Johnson said.

 

I would say that there is more literary genius in every chapter of Austen than there is in the entire oeuvre of Woolf. To the Lighthouse is a fine but somewhat precious and dull work, certainly not one to be mentioned in the same breath with Pride and Prejudice; yet it got the 15th spot on the Modern Library list.

 

The greatest monuments in American literature are, I would argue, Huckleberry Finn and Moby-Dick. Indeed, what serious person could maintain, if those two books had been included, that they would not have ended up higher than the two top American entries on the Modern Library 20th-century list, The Great Gatsby and Nabokov's Lolita?

These latter two works are indisputably great, but I do not believe that either of them has the mightiness of theme, the narrative power or the transgressive originality of the Melville and the Twain.

 

Among the other 19th-century novels that would crowd out most of the 20th-century selections: Vanity Fair, by William Makepeace Thackeray; Billy Budd, by Melville; Middlemarch, by George Eliot; Far From the Madding Crowd, by Thomas Hardy; and at least six books each by Dickens and Austen. That is not to mention Stephen Crane's Red Badge of Courage, Nathaniel Hawthorne's Scarlet Letter or The Last of the Mohicans by James Fenimore Cooper. If I am right that the 19th-century list would be greater than the 20th-century one, the next question is, why?

This is not easy to answer, given the difficulty of ever knowing why the literary imagination seems to flourish at certain times and in certain places but not in others. Perhaps the 19th century was greater than the 20th because, except for the Napoleonic Wars that opened it and for the American Civil War, it was a period of relatively small and short military conflicts. As Jason Epstein, senior editor at Random House, puts it, the great war of the 20th century lasted with some brief interruptions from the beginning of World War I in 1914 to the end of the Cold War in 1989.

 

It has been argued that civil or international turmoil is a stimulus to great writing. But the hypothesis here is that the 20th-century wars were so devastating that they led to a loss of spirit and confidence, both of which are necessary for the production of great literature.

 

"The 19th century was fundamentally an optimistic, progressive century," Epstein said. "Things were going to get better. The 20th century was riddled with angst and disillusion, and each century produced its appropriate literature. The consensus in retrospect is that the 19th century produced greater literature than the 20th, so the conclusion would seem to be that optimism is better for literature than pessimism."

 

World War I and the powerful despondency that it created, the disgust with the human animal that it generated, aided in the development of an overelaborated, introspective sensibility that, while critical to the modernism pioneered by Joyce, also led literature toward self-consciousness and away from events.

 

And in this sense, the literary age of feeling, of personal, sexual, political and stylistic exploration, seems somehow smaller than the literary age of great moral and philosophical narrative. The priority given to literary experimentation in the 20th-century novels of politics and consciousness seems to have operated in the recent Modern Library list.

 

How else to explain why Lolita, To the Lighthouse and Portnoy's Complaint are high on the 20th-century list, while Conrad's Lord Jim, a far greater work, in my opinion, than any of those three, ended up No. 85?

 

Maybe I'm all wrong, but I don't think so. Surely my preferences are no more capricious than ranking Aldous Huxley's Brave New World as the fifth-best book in English of the century, a strange choice indeed!

 

APOCALYPTIC LEAR

By Chris Jones
Tribune theater critic
Published September 20, 2006

You can set "King Lear' on the moon, or -- like the Goodman Theatre's endlessly audacious Robert Falls -- in a post-apocalyptic Eastern European world of guns, vodka, petty fiefdoms and crushing sexual cruelty. No matter. This great play will still try to spin around the same axis.

You will either walk with Lear toward the most painful kind of self-knowledge or you will stand, open-mouthed in the nearby rubble, and stare coldly at a body hastening toward the same death that awaits us all.

To watch Falls' astonishingly nihilistic "Lear," a colossal, eye-popping operatic production that has defiantly shorn the play of its decency, is to peek out onto some kind of terrifying no-man's land laid out before you with the most brutal kind of precision.

This is a "Lear" in which even that selfless servant Kent, whom some of us have spent our entire adult lives admiring, picks up a tire iron with brutal, violating intent. Has our world really come to that?

It is legitimate, of course, for a world-class theatrical artist such as Falls to make the case that, yes, it has. Or, in the worlds of Slobodan Milosevic or Nicolae Ceausescu or the fall of Baghdad, it surely did. And whatever barbs one might throw at Falls' unctuous, arrogant auteurism -- and there are narrative liberties taken here that will have Chicago's tragic purists spitting nails in the Goodman Theatre's direction -- this is a carefully wrought directorial vision expressed with such intensity and detail that it envelops its audience in a small-time world of expansive scale.

Falls' "Lear" is a show that deserves to move beyond Chicago, and it will set people talking wherever it lands. And at least half the room will be arguing that it doesn't have much to do with Shakespearean tragedy.

Some will hate this show. I deeply admire much of it -- especially the consistency, profundity, clarity and audacity of its conception. But even when viewed by its own rules, there is a gaping hole in the show's side. Falls knows precisely what he wants to do with everyone in the play, with one exception. And that's the guy whose name makes up the title and who is supposed to be our way into the play.

Actually, no Shakespearean director is obliged to follow some rule of tragic magnitude or even to consider the full vision of a play. "King Lear" will survive the Falls "Lear," just as it survived those cheery, tacked-on endings common in the 19th Century.

When it comes to the depiction of human cruelty -- or the evocation of the arbitrary nastiness of the cosmos -- this is a "Lear" sans pareil. Here, that privileged bad-boy Falls is like a middle-age, Midwestern football hooligan given all the theatrical toys money can buy. Whether it's popping one of poor Gloucester's eyes into a frying pan or lubricating some of the duller scenes with unscripted copulation, this show displays a twisted, seriocomic sensibility that lands somewhere between Jacobean tragedy and the Tarantinoesque. School groups be warned.

No modern American director is better than Falls at making a play's iconic moments pop with fresh irreverence. Here, the first lines of dialogue are delivered by two men at urinals. When Lear divides his kingdom, he cuts up a piece of cheap cake doled out at a drunken party held in his honor. And Edgar -- typically a paragon of filial virtue -- is rendered as a pill-popping rich kid.

Often, these counterintuitive characterizations work brilliantly. Edward Gero's superbly acted Gloucester, for example, brilliantly captures the lazy stupidity of a second-tier arriviste -- he calls to mind some sad-eyed patronage worker in a city government under federal investigation. Jonno Roberts' deftly underplayed Edmund is, aptly, a nightmare to watch. And Laura Odeh's complex Cordelia is not the usual pure soul, but an ineffectual girl from a rough family who tries but always knows she can't escape.

Falls goes over the top in the case of Lear's daughters Goneril (Kim Martin-Cotton) and Regan (Kate Arrington), who have colorful moments but read mostly as strangely costumed, sexually obsessed ugly sisters who change little in the course of the night. He's playing with archetype, sure, but also enjoying it too much.

Often, the actors' hearts leaven Falls' excesses. Steve Pickering's Kent does shocking things, but the actor's soul burns through nonetheless. And although Falls has turned Albany (a character who usually sees the error of his ways) into an amoral incarnation of Edward Albee's George, the actor Kevin Gudahl salvages some humanity.

Which leaves us with Lear -- not the most desirable order of priority. Stacy Keach, a distinguished and truthful actor, offers a performance of characteristic detail but insufficient scale. His "Lear" -- sad, fusty, overwhelmed, uncertain, clinging to what's left of his world -- is both truthful and intermittently moving, especially in the second act. But he's like a single-story urban artifact, beloved of preservationists but crumbling at the hands of a city planner more interested in brutalist suburban skyscrapers.

Keach, who tackles Lear as if this is a tragedy (which it ain't), doesn't fully belong to Falls' world, let alone sit at its center. It's a balance that could be fixed, if Falls paid less attention to the writhings at the side and more to the capable star lost in the middle.

After the first scene, we're left wondering whether Lear is the last half-decent leader this wretched place of Falls' imagination ever had, or whether he's a loathsome, brutal dictator about to get both a show trial and his comeuppance at the hands of the mobster family he created. By the end of the night, it sure feels as if Falls licked his lips, kissed off any last remnants of tragic obligation and picked the latter. Keach, a tragic actor in the grand tradition, hasn't yet fully imbibed the Kool-Aid.

cjones5@tribune.com

 

DYLAN FINDS HIS VOICE

By Greg Kot

October 30, 2006

The work in progress that is Bob Dylan's career turned another corner Friday at the newly unveiled Sears Centre Arena in Hoffman Estates, and the 65-year-old singer was sounding feisty again.

That wasn't the case in early 2005, when Dylan came to town with a recently retooled band. The pros flanking him seemed too timid to challenge him. Dylan himself sounded tentative in the new surroundings, his voice erratic and his keyboard inaudible most of the night.

But more than a year later, Dylan and his five musical accomplices have coalesced into a band to be reckoned with. Dylan's counterpoint chords on the keys were now on equal footing with the other instruments, particularly during a jumping "Summer Days," a jauntily slap-dash "Tangled Up in Blue" and the dreamlike ballad "When the Deal Goes Down."

He also found a comfortable pocket for his voice, and the band lay back just enough to let him sing without strain. He can't shout through a band anymore, but he has discovered a conversational playfulness that suits his recent material just fine.

After three somewhat perfunctory performances of '60s warhorses, the band got down to business with "High Water," Dylan's rasp curling into a snarl. "It's bad out there, high water everywhere," he declared, and the guitars rose up like an angry tide on the final syllable. It set the tone for the remainder of the 90-minute performance: Dylan brought particular intensity to tunes from his last three albums, including his first chart-topper in 30 years, "Modern Times."

His aversion to nostalgia ensures that his road show will remain in constant flux: band members come and go and set lists shift nightly. One thing that hasn't changed: He's still turning his back to a third of the audience while interacting with the band.

"Do you think he's trying to tell us something?" one fan remarked. Yet Dylan was clearly into the music, and the audience -- a mix of graying Baby Boomers, their kids, and a sizable cross-section of college-age enthusiasts -- responded in kind. Though the 11,800-seat arena was only half full, the fans who did show up were pumped, and with good reason.

"Rollin' and Tumblin'" pounded with menace as Dylan turned the word "burn" into a threat, and "Lovesick" dripped with dread. "Tweedle Dee & Tweedle Dum" was yet another blues that oozed violence as the guitars jousted, then settled into an uneasy calm.

"Thunder on the Mountain" ushered in the encore, which wrapped with a couple more oldies ("Like a Rolling Stone," "All Along the Watchtower"). They were dispatched with efficiency by an artist who still would rather try something new and fail than recycle his past.

----------

gregkot@aol.com

Copyright © 2006, Chicago Tribune

 

 

DYLAN SHOW A MOODY MYSTERY

October 29, 2006

DAVE HOEKSTRA

 

Bob Dylan has rolled into town in his kind of mysterious weather, somewhere between a hard rain and an early snow.

On Friday night, Dylan and his band found themselves part of the opening weekend festivities at the new Sears Centre in Hoffman Estates, somewhere between Ikea and the suburban ideal. Dylan likely had no idea where he was. Nor did I.

The theater suddenly sprang up in the middle of the Prairie Stone Business Park, which certainly accounted for Friday's lackluster attendance. The 11,800-seat arena was about half full. (The bill, with openers Kings of Leon, was to be repeated Saturday night.)

This was puzzling. Dylan has never been more visible in his majestic career. His latest "Modern Times" album is his first No. 1 record since 1976's "Desire." And his XM Satellite radio show has 1.7 million listeners.

So don't blame Bob if he delivered a moody set. He warned us to take heed of the western wind in a resplendent version of "Boots of Spanish Leather," and he was tossed by the winds of the seas in the new country-tinged "Workingman's Blues #2," incorporating ethics he surely heard on last year's tour with Merle Haggard. "High Water (For Charley Patton)" was a glorious juxtaposition of a funk bottom against Donnie Herron's banjo.

Dylan's new stuff was the highlight, at least for a Zimmy Gypsy like me. The new "When the Deal Goes Down" took on more depth, shaped by calliope rhythms and the dependable Tony Garnier on stand-up bass. And his first encore of the new "Thunder on the Mountain" made sense, as it uses the same roadhouse shuffle as his set closer, "Summer Days."

Dylan did seem to get a charge out of an extended "Tangled Up in Blue," providing his own call and response (by changing his vocal pitch) in what he recast as a charging Irish anthem. And he kicked the show off just right with "Leopard-Skin Pill-Box Hat," followed by his increasingly poignant "The Times They Are A-Changin'."

Sears was fine on your ears, considering this was just the second show in the venue's history. Duran Duran opened the venue on Thursday night. The bass was heavy at the beginning of the Kings of Leon's hippie rock set, but the balance was pristine by the time Dylan took the stage.

Fans weren't hit by downtown or huge shed prices either. A regular soda was $3 and a small popcorn was $3.50. I noticed a 16-ounce bottle of beer was $6, advertised as a "Bob Dylan Special," whatever that meant. But then any time Dylan comes around, it's special. There will always be a song to shine on your heart as the skies turn dark.

dhoekstra@suntimes.com

 

DROPOUT NATION

By Nathan Thornburgh

It's lunchtime at Shelbyville High School, 30 miles southeast of Indianapolis, Ind., and more than 100 teenagers are buzzing over trays in the cafeteria. Like high schoolers everywhere, they have arranged themselves by type: jocks, preps, cheerleaders, dorks, punks and gamers, all with tables of their own. But when they are finished chugging the milk and throwing Tater Tots at one another, they will drift out to their classes and slouch together through lessons on Edgar Allan Poe and Pythagoras. It's the promise of American public education: no matter who you are or where you come from, you will be tugged gently along the path of learning, toward graduation and an open but hopeful future.

Shawn Sturgill, 18, had a clique of his own at Shelbyville High, a dozen or so friends who sat at the same long bench in the hallway outside the cafeteria. They were, Shawn says, an average crowd. Not too rich, not too poor; not bookish, but not slow. They rarely got into trouble. Mainly they sat around and talked about Camaros and the Indianapolis Colts.

These days the bench is mostly empty. Of his dozen friends, Shawn says just one or two are still at Shelbyville High. If some cliques are defined by a common sport or a shared obsession with Yu-Gi-Oh! cards, Shawn's friends ended up being defined by their mutual destiny: nearly all of them became high school dropouts.

Shawn's friends are not alone in their exodus. Of the 315 Shelbyville students who showed up for the first day of high school four years ago, only 215 are expected to graduate. The 100 others have simply melted away, dropping out in a slow, steady bleed that has left the town wondering how it could have let down so many of its kids.

In today's data-happy era of accountability, testing and No Child Left Behind, here is the most astonishing statistic in the whole field of education: an increasing number of researchers are saying that nearly 1 out of 3 public high school students won't graduate, not just in Shelbyville but around the nation. For Latinos and African Americans, the rate approaches an alarming 50%. Virtually no community, small or large, rural or urban, has escaped the problem.

There is a small but hardy band of researchers who insist the dropout rates don't quite approach those levels. They point to their pet surveys that suggest a rate of only 15% to 20%. The dispute is difficult to referee, particularly in the wake of decades of lax accounting by states and schools. But the majority of analysts and lawmakers have come to this consensus: the numbers have remained unchecked at approximately 30% through two decades of intense educational reform, and the magnitude of the problem has been consistently, and often willfully, ignored.

That's starting to change. During his most recent State of the Union address, President George W. Bush promised more resources to help children stay in school, and Democrats promptly attacked him for lacking a specific plan. The Bill & Melinda Gates Foundation has trained its moneyed eye on the problem, funding "The Silent Epidemic," a study issued in March that has gained widespread attention both in Washington and in statehouses around the country.

The attention comes against a backdrop of rising peril for dropouts. If their grandparents' generation could find a blue-collar niche and prosper, the latest group is immediately relegated to the most punishing sector of the economy, where whatever low-wage jobs haven't yet moved overseas are increasingly filled by even lower-wage immigrants. Dropping out of high school today is to your societal health what smoking is to your physical health, an indicator of a host of poor outcomes to follow, from low lifetime earnings to high incarceration rates to a high likelihood that your children will drop out of high school and start the cycle anew.

Identifying the problem is just the first step. The next moves are being made by towns like Shelbyville, where a loose coalition of community leaders and school administrators have, for the first time, placed dropout prevention at the top of the agenda. Now they are gamely trying to identify why kids are leaving and looking for ways to reverse the tide. At the request of a former principal, a local factory promised to stop tempting dropouts with jobs. Superintendent David Adams is scouting vacant storefronts for a place to put a new alternative high school. And Shelbyville's Republican state representative, Luke Messer, sponsored a bill, signed into law by the Governor two weeks ago, that will give students alternatives to traditional high school while imposing tough penalties on those who try to leave early without getting permission from the school district or a judge.

Shelbyville, a town of almost 18,000 located on the outer fringe of the "doughnut" counties that ring Indianapolis, seems an unlikely battleground in the war on dropouts. Despite a few oddities--it's home to both the oldest living Hoosier and the world's tallest woman--it is an otherwise pleasantly unremarkable town. The capital is just a short drive away, but miles of rust-colored farmland, mainly cornfields waiting for seed, give the area a rural tinge. Most people live in single-family houses with yards and fences. Not many of them are very well off, but there's little acute poverty, as a gaggle of automotive and other factories has given the town a steady supply of well-paying jobs. Violent crime is rare, and the town is pervaded by a throwback decency. People wave at one another from their cars on Budd Street. They chitchat in the aisles of Mickey's T-Mart grocery store.

For years, Shelbyville had been comforted by its self-reported--and wildly inaccurate--graduation rate of up to 98%. The school district arrived at that number by using a commonly accepted statistical feint, counting any dropout who promises to take the GED test later on as a graduating student.

The GED trick is only one of many deployed by state and local governments around the country to disguise the real dropout rates. Houston, for example, had its notorious "leaver codes"--dozens of excuses, such as pregnancy and military service, that were often applied to students who were later reclassified as dropouts by outside auditors. The Federal Government has been similarly deceptive, producing rosy graduation-rate estimates--usually between 85% and 90%--by relying only on a couple of questions buried deep within the U.S. Census Bureau's Current Population Survey. The survey asks whether respondents have a diploma or GED. Critics say the census count severely underreports dropout numbers, in part because it doesn't include transients or prisoners, populations with a high proportion of dropouts.

In 2001, Jay Greene, a senior fellow at the Manhattan Institute, published a study that peeled back the layers of statistical legerdemain. Poring over raw education data, he asked himself a basic question: What percentage of kids who start at a high school finish? The answers led Greene and subsequent researchers around the country to place the national graduation rate at anywhere from 64% to 71%. It's a rate that most researchers say has remained fairly static since the 1970s, despite increased attention on the plight of public schools and a vigorous educational-reform movement.

Starting a year ago, the people of Shelbyville began to admit the scope of their problem by asking themselves the same simple questions about who was graduating. It helped that superintendent Adams was new to his job and that the high school's principal was too. They had a clean slate and little incentive to make excuses for the old way of doing things.

THE PUSHOUT

Sarah Miller, 28, was a victim of those old ways. An intelligent but rebellious teenager with a turbulent home life, Sarah began falling behind in attendance and classwork her freshman year. Like many other 15-year-olds, she had a talent for making poor decisions. She and her friends would often skip out of school after lunch and cruise up and down Broadway. Teachers rarely stopped them, but school authorities knew what she and her friends were up to. One morning Sarah went to the school office to discuss getting back on track but got a surprise. One of the administrators asked her point-blank, "Why don't you just quit school?" "I was just a kid," says Sarah with a laugh. "It was like they said the magic words. So I told them, 'O.K.!' And I left."

Sarah never set foot in a high school again. She got her GED, but now she's too afraid to try community college, she says, because she doesn't want to look stupid. Although she has a house she owns with her husband and a fine job serving coffee, biscuits and small talk at Ole McDonald's Cafe in nearby Acton, Ind., Sarah is not without regret. "It would have been nice to have someone pushing me to stay," she says. "Who knows how things would have turned out?"

Researchers call students like Sarah "pushouts," not dropouts. Shelbyville High's new principal, Tom Zobel, says he's familiar with the mind-set. "Ten years ago," he says, "if we had a problem student, the plan was, 'O.K., let's figure out how to get rid of this kid.' Now we have to get them help."

But can educators really be faulted for the calculation, however cold, that certain kids are an unwise investment of their limited energies and resources? That question quickly leads to the much thornier issues of class and clout that shape the dropout crisis. The national statistics on the topic are blunt: according to the National Center for Education Statistics, kids from the lowest income quarter are more than six times as likely to drop out of high school as kids from the highest. And in Shelbyville, nearly every dropout I met voiced a similar complaint: teachers and principals treat the "rich kids" better. "The rich kids always knew how to be good kids," says Sarah in a more nuanced version of the same refrain. "So I guess it's natural the schools wanted to work with them more than with the rest of us." The poor kids, though, are exactly the ones who need the extra investment.

Shelbyville leaders hope to change the prevailing mentality. At a cavernous high school gym in nearby Columbus, I watched the boys' basketball sectional semifinal with Shelbyville mayor Scott Furgeson. The Shelbyville Golden Bears' 21-0 regular season record had turned the town's usual Hoosier hysteria into Hoosier histrionics. As his constituents cheered on the good kids--the lithe, clean-cut basketball players who were dominating Columbus North High School--Furgeson paused to think about the other kids. Before becoming mayor, he spent 22 years managing the local Pizza King franchise. Every year he had to hire up to 200 teenagers, many of them dropouts, just to keep 10 full-time positions staffed. Those teenagers, failing in life as they had failed at school, were often the children of people Furgeson had seen quit school when he was a student at Shelbyville High 25 years before. The dropout problem, he says, corrupts the community far beyond the halls of the high school. "I worry that we're creating a permanent underclass," he says.

John Bridgeland, CEO of the Washington-based public-policy firm Civic Enterprises, says it's that type of attitude shift, more than legislation, that is likely to lead to change. Messer's 2005 bill made Indiana one of six states in the past five years to raise its minimum dropout age to 18 from 16. (Twenty-three states still let kids drop out at the younger age without parental consent.) Bridgeland, who co-wrote the Gates Foundation-funded report, supports the age hike but warns that states can't legislate in a vacuum. "These laws have to be coupled with strong support from the school and the community," he says. Underlying that conviction is perhaps the most surprising finding of the Gates survey: just how few dropouts report being overwhelmed academically. Fully 88% said they had passing grades in high school. Asked to name the reasons they had left school, more respondents named boredom than struggles with course work.

THE RESTLESS ONE

Susan Swinehart, 17, was an honors student her freshman year. She also joined the yearbook staff and found that she loved selling the $300 full-page yearbook ads to local businesses like Rush Shelby Energy and Fat Daddy's restaurant.

But the social cauldron of high school weighed on her. She didn't get along with the cheerleaders on the yearbook staff. And her avid interest in Stephen King novels and TV shows about forensics earned her a false reputation, she says, as a glum goth girl. So she started ditching class, barreling through the Indiana countryside alone in her Dodge Neon, blasting her favorite song, The Ghost of You, by My Chemical Romance--a song, as she puts it, about missed opportunities and regret.

"I'd rather regret something I did," she says, eyes welling with tears, "than regret something I didn't do." For her, sitting in a classroom biting her tongue and waiting to graduate when college wasn't necessarily in her future was a form of inaction. Working, saving money, starting her adult life--that was taking the initiative.

In cases like Susan's, American public education may be a victim of its own ambition. Rallying around the notion that every child should be prepared for higher education, schools follow a general-education model that marches students through an increasingly uniform curriculum, with admission to college as the goal. But what happens when a 17-year-old decides, rightly or wrongly, that her road in life doesn't pass through college? Then the college-prep exercise becomes a charade. At Shelbyville High School, as elsewhere, the general-education model became an all-or-nothing game that left far too many students with nothing.

Two months ago, Susan told her mother Kathy Roan that she was dropping out. "I wanted to kill her," says Kathy. But Kathy had her own bitterness about Shelbyville High. Two decades earlier, she too had been angered by the indifference of the school. She dropped out as soon as she turned 16.

On Feb. 22, Susan's mother went to school with her to sign her out of high school. That night Susan applied for more hours at the Taco Bell where she worked and promptly stayed for the 5 p.m.-to-2 a.m. shift. The other women on the graveyard shift gave her hell for quitting school. They were mostly dropouts themselves, says Susan, and they reminded her that even at fast-food chains, anyone who wants to advance needs a diploma or GED. She had, they told her, just broken something that could not be easily put back together.

Susan says she will prove them wrong. She has started a Pennsylvania-based correspondence course that both her mother and sister completed. For $985, it provides textbooks, online tests and teacher support via phone and e-mail. The rush to cash in on dropouts has made such correspondence courses and "virtual high schools" the Wild West of secondary education, a multimillion-dollar industry that can offer a valuable second chance but has suffered at times from poor oversight and a dizzying array of self-styled accrediting institutions, many of which aren't recognized by mainstream colleges.

There is, not surprisingly, partisan division over the dropout problem. Liberals say dropouts are either a by-product of testing mania or an unavoidable result of public schools' being starved for funding. But more conservative reform advocates, like Marcus Winters, a senior research associate at the Manhattan Institute, disagree. "Spending more money just has not worked," he says. "We've doubled the amount we spend per pupil since the '70s, and the problem hasn't budged."

In Indiana, however, there is a bipartisan consensus about the state's latest antidropout measure. Shelbyville representative Messer, former head of the Indiana Republican Party, is no stranger to partisan politics, but his strongest partner in pushing for the measure was a liberal Democrat named Stan Jones, who is now the state's commissioner of higher education. The bill they championed had, fittingly, both carrot and stick. Students who drop out before age 18 could have their driver's license suspended or their work permit revoked unless their decision was first approved by a school or judge. But students who found the high school environment stifling could take classes at community colleges. The dual approach struck a chord, and both houses passed the bill unanimously.

Messer acknowledges that his law is no panacea. He's fond of saying he can't legislate away teenage mistakes. And indeed, Kentucky, Georgia and West Virginia have had similar laws on the books for a number of years, but critics say there's no proof that the laws have worked. Still, he says, "some kids are dropping out because it's easy and it's O.K. That is going to change."

On a national level, No Child Left Behind--the metric-heavy school reform that President Bush would like to expand in public high schools--was designed to make schools accountable for their dropout rates. But it hasn't been carried out very seriously. The Education Trust, an advocacy group for low-income and minority students, issued a scathing report in 2005 about how the Federal Government stood by while states handed in patently misleading graduation numbers: last year three states didn't submit any, and for many states, the figures were clearly inflated.

Secretary of Education Margaret Spellings tells TIME that much is being done to get better data on dropouts. She points to the National Governors Association resolution last year to set, for the first time, a common definition of a dropout that all states will use to report graduation rates to the Federal Government. But it's a nonbinding compact. And critics say the government is trying to slash funding for important support programs, including the Carl Perkins Act, which has funded vocational education across the country since 1984. Spellings says President Bush has proposed converting Perkins and other support programs like GEAR UP and Upward Bound into block grants for states to choose their own fixes. As long as states get results, says Spellings, "we're not going to prescribe particular programs or strategies like vocational education."

Superintendent Adams believes he has come up with the right prescription for Shelbyville. The high school has established a credit lab, a sort of open study hall that lets at-risk kids recover credit from classes that they have failed. The principal at the elementary school is trying to identify at-risk kids in first grade. In the middle school, students are taking high school graduation pledges, promising to be onstage with a diploma along with the rest of their class.

The district will also continue to support the Blue River vocational school, where more than 300 juniors and seniors spend their afternoons learning trades from nursing to marketing to auto-body repair. And there is a plan to build an alternative high school, which Adams envisions as a low-key place where, if they want to, kids can eat a doughnut while instant-messaging friends during loosely structured study hall, so long as they get their work done at some point. "Too many kids, at their exit interviews, say, 'I'm just done with this process--50 minutes, bell, 50 minutes, bell,'" says principal Zobel. "With the alternative school, I could give them an option, another environment to be in."

THE COMEBACK KID

On the edge of Shelbyville's Old Town square, now a roundabout with a paved parking lot in the middle, there's a statue of one of central Indiana's most famous literary characters, a sort of Hoosier Huck Finn named Little Balser. The main character of The Bears of Blue River, a book for adolescents set in the woods of frontier-era Shelby County, Balser spends his days striking off into the wilderness, slaying countless bears (and even an Indian or two) and worrying his parents sick. He is the prototype of an American teenager, a combustible combination of independence and irresponsibility.

Ryan Tindle, 21, carried that legacy to its modern-day extreme. In middle school, he started ditching class, trying to escape a tough home life by ingratiating himself with older kids who played rough. So it was little surprise when he traveled the well-worn path of the troublemaker, dropping out of high school and promptly beating up an older kid so severely that Ryan was sentenced to a year at Plainfield Juvenile Correctional Facility. Once inside, one of the few times he picked up a pencil, he used it to stab another inmate in the hand. He felt that he had to prove himself, he says, after witnessing weaker kids being assaulted at the facility. The attack earned him a stint in isolation in Cottage 13--"the cage"--and that, says Ryan, is where he got religion about schooling.

"My family always thought I was going to be worthless," he says, "and for the first time, I saw they were right."

As soon as he was released, Ryan went back to Shelbyville High School and asked to re-enroll. The Ryan Tindle that administrators knew, however, was nothing but grief. Wary administrators balked at letting him back in. He had to wait until a new principal arrived before he could convince the school that he was serious about his new leaf. But now he had to catch up quickly on a lot of lost years. "I went back with a fifth-grade education," he says. "That was the last time I had paid attention in school."

In the end, it took him nearly two years of a grueling schedule to finish what he started. From 7 a.m. to 3 p.m., he sat in class at the high school, then took three hours of night school for basic reading and math. To everyone's amazement, he finished.

Ryan is working hard these days. He wakes up before 5 every morning to go to his job at a car-parts factory, where he works on the line and earns less than $10 an hour. On Saturdays and Sundays, he trains new employees at the local Arby's. In all, he takes home about $23,000 a year. He would like to go to college someday, he says with a slightly embarrassed grin, to study criminology. He wants to be a cop.

For now, however, graduation is reward enough. He pulls a laminated card out of his wallet. It's his Shelbyville High School diploma, miniaturized. "I'll always be able to look at that diploma and smile," he says. "It's the best thing I've ever done."

If Ryan's redemption seems remarkable, that's because it is. According to a 2005 report from the Educational Testing Service, the company that runs the SATs, federal funding for second-chance programs, such as the night school Ryan attended, dropped from a high of $15 billion in the late 1970s to $3 billion last year. Yet the stakes in the struggle to get students to graduate are higher than ever: an estimated 67% of prison inmates nationwide are high school dropouts. A 2002 Northeastern University study found that nearly half of all dropouts ages 16 to 24 were unemployed.

Finding good work is only getting harder for dropouts in the era of the knowledge-based economy and advanced manufacturing. Knauf Insulation is Shelbyville's largest employer, with more than 800 workers. Salaries start at $16.50 an hour, and the benefits at this German company are, well, positively European. In one of its factories along the Blue River, a row of mammoth 2,400° furnaces spins the plant's secret recipe of sand, soda ash, borax and limestone into billions of billowy glass fibers, which will be cooled, packed and cut into batts of fiber-glass insulation. The workers running the furnaces are the last of a dying breed: people holding good jobs who never earned a high school diploma. Thirty years ago, the men came from as far away as the hills of Kentucky and proved themselves steady workers. Today they earn as much as $60,000 a year.

It's a fine life, but these days high school dropouts need not apply. Even a GED is not sufficient for a job here anymore. Take a tour of the factory floor, and the main reason is clear. Some workers--entry-level employees--stand at their stations and pluck irregular pieces of fiber glass from the line. It's mostly mindless labor, but the giant whirring belts and chomping insulation cutters are run by adjacent computer terminals called programmable-logic controllers. When the floor boss goes on a coffee break, it's the floor workers who must operate the controllers. In today's factories, no worker is more than a boss's coffee break away from needing at least some computing skills. And now more than ever, says Knauf president Bob Claxton, the company wants to invest in the continuing education of its workers so they can keep up with new technologies--an investment that might not be worth making if those workers lack high school basics.

But the firm's requirement of a high school diploma is as much about a mind-set as it is about a skill-set, says Claxton. A diploma "shows that these applicants had the discipline to gut out a tough process," he says. "They learned how to get along with people, some of whom they may not have liked so well, in order to achieve their goals." A GED, he says, doesn't prove they can do that.

Even the dropouts who do land factory jobs can find work tougher than they thought. A relative helped Christine Harden, 18, find work in a local car-parts factory four months after she dropped out of Shelbyville High. But she has to get up at 4:30 a.m. to make the first shift every day, and she says her back is killing her. "All my friends who are thinking about dropping out, I tell them, 'Don't do it,'" she says. "This is real life out here. It's not easy."

THE LONE HOLDOUT

I met Shawn Sturgill's parents in the living room of their ranch-style home around the corner from Shelbyville's cemetery. At age 15, Shawn's father Steve, with a child on the way, dropped out of high school and then spent more than a decade battling drug abuse. He was born again six years ago, he says, patting the thick wooden cross around his neck. He has been clean since and has a high-paying job burying fiber-optic cables. But his turnaround came too late to be a model for his three older children, two of whom dropped out of school.

Shelbyville schools are performing triage on Shawn's education. For much of the day, he is in credit lab, working at his own pace to recover classes he has failed. Every afternoon he goes to the Blue River school, where he is enrolled in auto-body-repair courses.

Shawn has a tough road ahead of him. Though he will attend his class's graduation ceremony to watch his peers get diplomas, he won't be on stage, at least not yet. Even the school's efforts to speed up his credit recovery haven't been enough, so he will have to return for a fifth year at Shelbyville High. It's no fun for a 19-year-old to be in high school. Shawn is already a big guy who doesn't like to draw attention to himself.

But Shawn's hopes are bolstered by his plan. Auto-body work is not just a passing fancy for him--even when he's not at the vocational school, he is working on his Camaro, which most recently needed a new bumper. His favorite TV show, of course, is Pimp My Ride. He wants to save for tuition at Lincoln Technical Institute in Indianapolis so he can continue to develop his auto-sculpting skills. He rattles off the industry rates--car painters make an hourly wage of $22, collision techs $17--and he wants to get there. So he laughs it off every time somebody asks him in the hallway, "Hey, you're still in school? I would have thought you'd drop out by now."

Shawn's friends who have dropped out are, for the most part, struggling. A couple of them got their GED and are working in factories, but others are shuffling through menial jobs--one works at the car wash, another is washing dishes. A few, says Shawn, aren't doing much of anything except playing video games at their parents' houses. But Shawn says he is serious about not becoming a part of their dropout nation. "I've already went and put 12 years into this thing," he says. "There's no use throwing it all away."

  © Nathan Thornburgh 2006

 

OPRAH'S TRUTH SHOULDN'T HURT

Clarence Page

January 7, 2007

WASHINGTON -- Oprah Winfrey's poke at the short-sighted materialism of some of America's low-income students has delighted conservative commentators, but that doesn't mean she's wrong.

Liberals love to speak "truth to power," but the powerless need to hear the truth too. Knowledge, after all, is power. Don't keep it to yourself, I say. Spread it around.

That's why the Queen of Daytime Talk did poor folks a favor when she candidly explained in a recent Newsweek interview why she decided to build the lavish $40 million Oprah Winfrey Leadership Academy for Girls for impoverished teenagers in South Africa instead of in an American city. South Africa's students, she said, had a greater need and appreciation for education.

"I became so frustrated with visiting inner-city schools [in America] that I just stopped going. The sense that you need to learn just isn't there," she said. "If you ask the kids what they want or need, they will say an iPod or some sneakers. In South Africa, they don't ask for money or toys. They ask for uniforms so they can go to school."

Having reported from South Africa at various times since the 1970s and as the parent of a teenager, I agree with Winfrey. She's not blaming the victims. Our kids are a reflection of us, their parents. Kids don't know anything except that which they are taught by parents, peers, teachers and other role models. My folks didn't even need college degrees to know that, as they let me know on a daily basis.

Yet, these sentiments sound so politically incorrect these days that it is easy to understand why Fox News Channel's John Gibson sounded shocked--Shocked!--at Winfrey's quote. "Uhh, just asking, but can anybody else in America say that and get away with it?" he opined.

And Rush Limbaugh responded with similar astonishment. "This is quite Cosby-esque of the Oprah," he said, approvingly. That, of course, was a direct reference to Bill Cosby. The Cos sparked a backlash from some quarters for lashing out at parents who buy their kids overpriced gym shoes instead of assisting them with their homework.

Indeed, there were some critics who accused Cosby (incorrectly, in my view) of blaming the victims. But having paid close attention to the reactions Cosby has received, I have heard more positive than negative responses from black parents and from educators of all races. To conflict-driven news media, though, it's conflict that sells. The same Cosby-esque frenzy has swirled up in recent days around Herman Badillo. Badillo, 77, the first native-born Puerto Rican elected to Congress, is being criticized for writing in his new book, "One Nation, One Standard," that too many of his fellow Hispanic-Americans are stuck in poverty because they don't value education.

"Education is not a high priority in the Hispanic community," wrote Badillo. "Hispanic parents rarely get involved with their children's schools. They seldom attend parent-teacher conferences, ensure that children do their homework or inspire their children to dream of attending college."

Unfortunately, Badillo is right, and not only about Hispanics. Indifference to education is epidemic across racial and ethnic lines, and it is particularly damaging to the poor. For earlier waves of immigrants to America, unskilled jobs were much more plentiful. Upward mobility for most of today's kids already requires at least a couple of years of schooling beyond high school.

Yet, instead of discussing the points Badillo raises, many will try to shout him down. Bronx Democratic leader Jose Rivera already has blasted Badillo in a New York Post interview as being a "total insult" to Latino parents. That's OK, Badillo says. He wanted to stir up a dialogue. The controversy will help him sell a few more books too. Puerto Ricans certainly are not the only Americans who need to read it.

With that in mind, I don't mind the lavishness of Oprah's academy, which has come under fire from critics on the right and the left. Sure, the $40 million could have served at least 10 times as many South African students in more modest structures. But, alas, why shouldn't bright and promising future African leaders have a learning environment at least as nice as that enjoyed by the Ivy League elites who populate America's leadership class?

If we want our kids to appreciate education, we grown-ups have to show some respect for it too. We should follow Oprah's example and fix up the crumbling structures into which we herd too many of our students here at home.

----------

E-mail: cptime@aol.com

Copyright © 2007, Chicago Tribune

 

 

ON EDUCATION: THREE ESSAYS BY CHARLES MURRAY
1)      Intelligence in the Classroom
2)      What’s Wrong with Vocational School?
3)      Aztecs versus Greeks
 
 
http://www.opinionjournal.com/extra/?id=110009531
Intelligence in the Classroom
Half of all children are below average, and teachers can do only so much 
for them.
 
BY CHARLES MURRAY
Tuesday, January 16, 2007 12:01 a.m. EST
 
Education is becoming the preferred method for diagnosing and attacking 
a wide range of problems in American life. The No Child Left Behind Act is 
one prominent example. Another is the recent volley of articles that 
blame rising income inequality on the increasing economic premium for 
advanced education. Crime, drugs, extramarital births, unemployment--you 
name the problem, and I will show you a stack of claims that education 
is to blame, or at least implicated.
 
One word is missing from these discussions: intelligence. Hardly anyone 
will admit it, but education's role in causing or solving any problem 
cannot be evaluated without considering the underlying intellectual 
ability of the people being educated. Today and over the next two days, 
I will put the case for three simple truths about the mediating role of 
intelligence that should bear on the way we think about education and 
the nation's future.
 
Today's simple truth: Half of all children are below average in 
intelligence. We do not live in Lake Wobegon.
 
Our ability to improve the academic accomplishment of students in the 
lower half of the distribution of intelligence is severely limited. It 
is a matter of ceilings. Suppose a girl in the 99th percentile of 
intelligence, corresponding to an IQ of 135, is getting a C in English. 
She is underachieving, and someone who sets out to raise her performance 
might be able to get a spectacular result. Now suppose the boy sitting 
behind her is getting a D, but his IQ is a bit below 100, at the 49th 
percentile.
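 
These percentile figures can be checked against the conventional IQ 
scale--a normal distribution with mean 100 and standard deviation 15, 
the usual convention for IQ scores--with \Phi denoting the standard 
normal CDF:
 
\[
P(\mathrm{IQ} \le 135) = \Phi\!\left(\frac{135-100}{15}\right)
  = \Phi(2.33) \approx 0.99,
\]
 
the 99th percentile; an IQ a shade below 100 gives \Phi(z) just under 
\Phi(0) = 0.50, the 49th percentile.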
 
We can hope to raise his grade. But teaching him more vocabulary words 
or drilling him on the parts of speech will not open up new vistas for 
him. It is not within his power to learn to follow an exposition written 
beyond a limited level of complexity, any more than it is within my 
power to follow a proof in the American Journal of Mathematics. In both 
cases, the problem is not that we have not been taught enough, but that 
we are not smart enough.
 
Now take the girl sitting across the aisle who is getting an F. She is 
at the 20th percentile of intelligence, which means she has an IQ of 88. 
If the grading is honest, it may not be possible to do more than give 
her an E for effort. Even if she is taught to read every bit as well as 
her intelligence permits, she still will be able to comprehend only 
simple written material. It is a good thing that she becomes 
functionally literate, and it will have an effect on the range of jobs 
she can hold. But still she will be confined to jobs that require 
minimal reading skills. She is just not smart enough to do more than that.
 
How about raising intelligence? It would be nice if we knew how, but we 
do not. It has been shown that some intensive interventions temporarily 
raise IQ scores by amounts ranging up to seven or eight points. 
Investigated psychometrically, these increases are a mix of test effects 
and increases in the underlying general factor of intellectual 
ability--"g." In any case, the increases fade to insignificance within a 
few years after the intervention. Richard Herrnstein and I reviewed the 
technical literature on this topic in "The Bell Curve" (1994), and 
studies since then have told the same story.
 
There is no reason to believe that raising intelligence significantly 
and permanently is a current policy option, no matter how much money we 
are willing to spend. Nor can we look for much help from the Flynn 
Effect, the rise in IQ scores that has been observed internationally for 
several decades. Only a portion of that rise represents an increase in 
g, and recent studies indicate that the rise has stopped in advanced 
nations.
 
Some say that the public schools are so awful that there is huge room 
for improvement in academic performance just by improving education. 
There are two problems with that position. The first is that the numbers 
used to indict the public schools are missing a crucial component. For 
example, in the 2005 round of the National Assessment of Educational 
Progress (NAEP), 36% of all fourth-graders were below the NAEP's "basic 
achievement" score in reading. It sounds like a terrible record. But we 
know from the mathematics of the normal distribution that 36% of 
fourth-graders also have IQs lower than 95.
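 
On the same scale, the matching 36% follows directly:
 
\[
P(\mathrm{IQ} < 95) = \Phi\!\left(\frac{95-100}{15}\right)
  = \Phi(-0.33) \approx 0.37,
\]
 
that is, roughly 36-37% of all children.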
 
What IQ is necessary to give a child a reasonable chance to meet the 
NAEP's basic achievement score? Remarkably, it appears that no one has 
tried to answer that question. We only know for sure that if the bar for 
basic achievement is meaningfully defined, some substantial proportion 
of students will be unable to meet it no matter how well they are 
taught. As it happens, the NAEP's definition of basic achievement is 
said to be on the tough side. That substantial proportion of 
fourth-graders who cannot reasonably be expected to meet it could well 
be close to 36%.
 
The second problem with the argument that education can be vastly 
improved is the false assumption that educators already know how to 
educate everyone and that they just need to try harder--the assumption 
that prompted No Child Left Behind. We have never known how to educate 
everyone. The widely held image of a golden age of American education 
when teachers brooked no nonsense and all the children learned their 
three Rs is a myth. If we confine the discussion to children in the 
lower half of the intelligence distribution (education of the gifted is 
another story), the overall trend of the 20th century was one of slow, 
hard-won improvement. A detailed review of this evidence, never 
challenged with data, was also part of "The Bell Curve."
 
This is not to say that American public schools cannot be improved. Many 
of them, especially in large cities, are dreadful. But even the best 
schools under the best conditions cannot repeal the limits on 
achievement set by limits on intelligence.
 
To say that even a perfect education system is not going to make much 
difference in the performance of children in the lower half of the 
distribution understandably grates. But the easy retorts do not work. 
It's no use coming up with the example of a child who was getting Ds in 
school, met an inspiring teacher, and went on to become an 
astrophysicist. That is an underachievement story, not the story of 
someone at the 49th percentile of intelligence. It's no use to cite the 
differences in test scores between public schools and private ones--for 
students in the bottom half of the distribution, the differences are 
real but modest. It's no use to say that IQ scores can be wrong. I am 
not talking about scores on specific tests, but about a student's 
underlying intellectual ability, g, whether or not it has been measured 
with a test. And it's no use to say that there's no such thing as g.
 
While concepts such as "emotional intelligence" and "multiple 
intelligences" have their uses, a century of psychometric evidence has 
been augmented over the last decade by a growing body of neuroscientific 
evidence. Like it or not, g exists, is grounded in the architecture and 
neural functioning of the brain, and is the raw material for academic 
performance. If you do not have a lot of g when you enter kindergarten, 
you are never going to have a lot of it. No change in the educational 
system will change that hard fact.
 
That says nothing about the quality of the lives that should be open to 
everyone across the range of ability. I am among the most emphatic of 
those who think that the importance of IQ in living a good life is 
vastly overrated. My point is just this: It is true that many social and 
economic problems are disproportionately found among people with little 
education, but the culprit for their educational deficit is often low 
intelligence. Refusing to come to grips with that reality has produced 
policies that have been ineffectual at best and damaging at worst.
 
Mr. Murray is the W.H. Brady Scholar at the American Enterprise 
Institute. This is the first in a three-part series, concluding on 
Thursday.
 
ON EDUCATION
http://www.opinionjournal.com/extra/?id=110009535
What's Wrong With Vocational School?
Too many Americans are going to college.
 
BY CHARLES MURRAY
Wednesday, January 17, 2007 12:01 a.m. EST
 
The topic yesterday was education and children in the lower half of the 
intelligence distribution. Today I turn to the upper half, people with 
IQs of 100 or higher. Today's simple truth is that far too many of them 
are going to four-year colleges.
 
Begin with those barely into the top half, those with average 
intelligence. To have an IQ of 100 means that a tough high-school course 
pushes you about as far as your academic talents will take you. If you 
are average in math ability, you may struggle with algebra and probably 
fail a calculus course. If you are average in verbal skills, you often 
misinterpret complex text and make errors in logic.
 
These are not devastating shortcomings. You are smart enough to engage 
in any of hundreds of occupations. You can acquire more knowledge if it 
is presented in a format commensurate with your intellectual skills. But 
a genuine college education in the arts and sciences begins where your 
skills leave off.
 
In engineering and most of the natural sciences, the demarcation between 
high-school material and college-level material is brutally obvious. If 
you cannot handle the math, you cannot pass the courses. In the 
humanities and social sciences, the demarcation is fuzzier. It is 
possible for someone with an IQ of 100 to sit in the lectures of 
Economics 1, read the textbook, and write answers in an examination 
book. But students who cannot follow complex arguments accurately are 
not really learning economics. They are taking away a mishmash of 
half-understood information and outright misunderstandings that probably 
leave them under the illusion that they know something they do not. (A 
depressing research literature documents one's inability to recognize 
one's own incompetence.) Traditionally and properly understood, a 
four-year college education teaches advanced analytic skills and 
information at a level that exceeds the intellectual capacity of most 
people.
 
There is no magic point at which a genuine college-level education 
becomes an option, but anything below an IQ of 110 is problematic. If 
you want to do well, you should have an IQ of 115 or higher. Put another 
way, it makes sense for only about 15% of the population, 25% if one 
stretches it, to get a college education. And yet more than 45% of 
recent high school graduates enroll in four-year colleges. Adjust that 
percentage to account for high-school dropouts, and more than 40% of all 
persons in their late teens are trying to go to a four-year 
college--enough people to absorb everyone down through an IQ of 104.
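 
Inverting the same distribution shows where the IQ-104 cutoff comes 
from: if roughly the top 40% of the age cohort heads to a four-year 
college, the implied floor is
 
\[
100 + 15\,\Phi^{-1}(0.60) \approx 100 + 15 \times 0.253 \approx 104.
\]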
 
No data that I have been able to find tell us what proportion of those 
students really want four years of college-level courses, but it is safe 
to say that few people who are intellectually unqualified yearn for the 
experience, any more than someone who is athletically unqualified for a 
college varsity wants to have his shortcomings exposed at practice every 
day. They are in college to improve their chances of making a good 
living. What they really need is vocational training. But nobody will 
say so, because "vocational training" is second class. "College" is 
first class.
 
Large numbers of those who are intellectually qualified for college also 
do not yearn for four years of college-level courses. They go to college 
because their parents are paying for it and college is what children of 
their social class are supposed to do after they finish high school. 
They may have the ability to understand the material in Economics 1 but 
they do not want to. They, too, need to learn to make a living--and 
would do better in vocational training.
 
Combine those who are unqualified with those who are qualified but not 
interested, and some large proportion of students on today's college 
campuses--probably a majority of them--are looking for something that 
the four-year college was not designed to provide. Once there, they 
create a demand for practical courses, taught at an intellectual level 
that can be handled by someone with a mildly above-average IQ and/or 
mild motivation. The nation's colleges try to accommodate these new 
demands. But most of the practical specialties do not really require 
four years of training, and the best way to teach those specialties is 
not through a residential institution with the staff and infrastructure 
of a college. It amounts to a system that tries to turn out televisions 
on an assembly line that also makes pottery. It can be done, but it's 
ridiculously inefficient.
 
Government policy contributes to the problem by making college 
scholarships and loans too easy to get, but its role is ancillary. The 
demand for college is market-driven, because a college degree does, in 
fact, open up access to jobs that are closed to people without one. The 
fault lies in the false premium that our culture has put on a college 
degree.
 
For a few occupations, a college degree still certifies a qualification. 
For example, employers appropriately treat a bachelor's degree in 
engineering as a requirement for hiring engineers. But a bachelor's 
degree in a field such as sociology, psychology, economics, history or 
literature certifies nothing. It is a screening device for employers. 
The college you got into says a lot about your ability, and that you 
stuck it out for four years says something about your perseverance. But 
the degree itself does not qualify the graduate for anything. There are 
better, faster and more efficient ways for young people to acquire 
credentials to provide to employers.
 
The good news is that market-driven systems eventually adapt to reality, 
and signs of change are visible. One glimpse of the future is offered by 
the nation's two-year colleges. They are more honest than the four-year 
institutions about what their students want and provide courses that 
meet their needs more explicitly. Their time frame gives them a big 
advantage--two years is about right for learning many technical 
specialties, while four years is unnecessarily long.
 
Advances in technology are making the brick-and-mortar facility 
increasingly irrelevant. Research resources on the Internet will soon 
make the college library unnecessary. Lecture courses taught by 
first-rate professors are already available on CDs and DVDs for many 
subjects, and online methods to make courses interactive between 
professors and students are evolving. Advances in computer simulation 
are expanding the technical skills that can be taught without having to 
gather students together in a laboratory or shop. These and other 
developments are all still near the bottom of steep growth curves. The 
cost of effective training will fall for everyone who is willing to give 
up the trappings of a campus. As the cost of college continues to rise, 
the choice to give up those trappings will become easier.
 
A reality about the job market must eventually begin to affect the 
valuation of a college education: The spread of wealth at the top of 
American society has created an explosive increase in the demand for 
craftsmen. Finding a good lawyer or physician is easy. Finding a good 
carpenter, painter, electrician, plumber, glazier, mason--the list goes 
on and on--is difficult, and it is a seller's market. Journeymen 
craftsmen routinely make incomes in the top half of the income 
distribution while master craftsmen can make six figures. They have work 
even in a soft economy. Their jobs cannot be outsourced to India. And 
the craftsman's job provides wonderful intrinsic rewards that come from 
mastery of a challenging skill that produces tangible results. How many 
white-collar jobs provide nearly as much satisfaction?
 
Even if forgoing college becomes economically attractive, the social 
cachet of a college degree remains. That will erode only when large 
numbers of high-status, high-income people do not have a college degree 
and don't care. The information technology industry is in the process of 
creating that class, with Bill Gates and Steve Jobs as exemplars. It 
will expand for the most natural of reasons: A college education need be 
no more important for many high-tech occupations than it is for NBA 
basketball players or cabinetmakers. Walk into Microsoft or Google with 
evidence that you are a brilliant hacker, and the job interviewer is not 
going to fret if you lack a college transcript. The ability to present 
an employer with evidence that you are good at something, without 
benefit of a college degree, will continue to increase, and so will the 
number of skills to which that evidence can be attached. Every time that 
happens, the false premium attached to the college degree will diminish.
 
Most students find college life to be lots of fun (apart from the boring 
classroom stuff), and that alone will keep the four-year institution 
overstocked for a long time. But, rightly understood, college is 
appropriate for a small minority of young adults--perhaps even a 
minority of the people who have IQs high enough that they could do 
college-level work if they wished. People who go to college are not 
better or worse people than anyone else; they are merely different in 
certain interests and abilities. That is the way college should be seen. 
There is reason to hope that eventually it will be.
 
Mr. Murray is the W.H. Brady Scholar at the American Enterprise 
Institute. This is the second in a three-part series, concluding tomorrow.
 
ON EDUCATION
http://opinionjournal.com/extra/?id=110009541
Aztecs vs. Greeks
Those with superior intelligence need to learn to be wise.
 
BY CHARLES MURRAY
Thursday, January 18, 2007 12:01 a.m. EST
 
If "intellectually gifted" is defined to mean people who can become 
theoretical physicists, then we're talking about no more than a few 
people per thousand and perhaps many fewer. They are cognitive 
curiosities, too rare to have that much impact on the functioning of 
society from day to day. But if "intellectually gifted" is defined to 
mean people who can stand out in almost any profession short of 
theoretical physics, then research about IQ and job performance 
indicates that an IQ of at least 120 is usually needed. That number 
demarcates the top 10% of the IQ distribution, or about 15 million 
people in today's labor force--a lot of people.
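 
On the conventional scale, an IQ of 120 does demarcate roughly the 
top decile,
 
\[
P(\mathrm{IQ} \ge 120) = 1 - \Phi\!\left(\frac{120-100}{15}\right)
  = 1 - \Phi(1.33) \approx 0.09,
\]
 
and a tenth of a labor force of roughly 150 million is indeed on the 
order of 15 million people.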
 
In professions screened for IQ by educational requirements--medicine, 
engineering, law, the sciences and academia--the great majority of 
people must, by the nature of the selection process, have IQs over 120. 
Evidence about who enters occupations where the screening is not 
directly linked to IQ indicates that people with IQs of 120 or higher 
also occupy large proportions of positions in the upper reaches of 
corporate America and the senior ranks of government. People in the top 
10% of intelligence produce most of the books and newspaper articles we 
read and the television programs and movies we watch. They are the 
people in the laboratories and at workstations who invent our new 
pharmaceuticals, computer chips, software and every other form of 
advanced technology.
 
Combine these groups, and the top 10% of the intelligence distribution 
has a huge influence on whether our economy is vital or stagnant, our 
culture healthy or sick, our institutions secure or endangered. Of the 
simple truths about intelligence and its relationship to education, this 
is the most important and least acknowledged: Our future depends 
crucially on how we educate the next generation of people gifted with 
unusually high intelligence.
 
How assiduously does our federal government work to see that this 
precious raw material is properly developed? In 2006, the Department of 
Education spent about $84 billion. The only program to improve the 
education of the gifted got $9.6 million, one-hundredth of 1% of 
expenditures. In the 2007 budget, President Bush zeroed it out.
 
But never mind. A large proportion of gifted children are born to 
parents who value their children's talent and do their best to see that 
it is realized. Most gifted children without such parents are recognized 
by someone somewhere along the educational line and pointed toward 
college. No evidence indicates that the nation has many children with 
IQs above 120 who are not given an opportunity for higher education. The 
university system has also become efficient in shipping large numbers of 
the most talented high-school graduates to the most prestigious schools. 
The allocation of this human capital can be criticized--it would 
probably be better for the nation if more of the gifted went into the 
sciences and fewer into the law. But if the issue is amount of 
education, then the nation is doing fine with its next generation of 
gifted children. The problem with the education of the gifted involves 
not their professional training, but their training as citizens.
 
We live in an age when it is unfashionable to talk about the special 
responsibility of being gifted, because to do so acknowledges inequality 
of ability, which is elitist, and inequality of responsibilities, which 
is also elitist. And so children who know they are smarter than the 
other kids tend, in a most human reaction, to think of themselves as 
superior to them. Because giftedness is not to be talked about, no one 
tells high-IQ children explicitly, forcefully and repeatedly that their 
intellectual talent is a gift. That they are not superior human beings, 
but lucky ones. That the gift brings with it obligations to be worthy of 
it. That among those obligations, the most important and most difficult 
is to aim not just at academic accomplishment, but at wisdom.
 
The encouragement of wisdom requires a special kind of education. It 
requires first of all recognition of one's own intellectual limits and 
fallibilities--in a word, humility. This is perhaps the most 
conspicuously missing part of today's education of the gifted. Many 
high-IQ students, especially those who avoid serious science and math, 
go from kindergarten through an advanced degree without ever having a 
teacher who is dissatisfied with their best work and without ever taking 
a course that forces them to say to themselves, "I can't do this." 
Humility requires that the gifted learn what it feels like to hit an 
intellectual wall, just as all of their less talented peers do, and that 
can come only from a curriculum and pedagogy designed especially for 
them. That level of demand cannot fairly be imposed on a classroom that 
includes children who do not have the ability to respond. The gifted 
need to have some classes with each other not to be coddled, but because 
that is the only setting in which their feet can be held to the fire.
 
The encouragement of wisdom requires mastery of analytical building 
blocks. The gifted must assimilate the details of grammar and syntax and 
the details of logical fallacies not because they will need them to 
communicate in daily life, but because these are indispensable for 
precise thinking at an advanced level.
 
The encouragement of wisdom requires being steeped in the study of 
ethics, starting with Aristotle and Confucius. It is not enough that 
gifted children learn to be nice. They must know what it means to be good.
 
The encouragement of wisdom requires an advanced knowledge of history. 
Never has the aphorism about the fate of those who ignore history been 
more true.
 
All of the above are antithetical to the mindset that prevails in 
today's schools at every level. The gifted should not be taught to be 
nonjudgmental; they need to learn how to make accurate judgments. They 
should not be taught to be equally respectful of Aztecs and Greeks; they 
should focus on the best that has come before them, which will mean a 
light dose of Aztecs and a heavy one of Greeks. The primary purpose of 
their education should not be to let the little darlings express 
themselves, but to give them the tools and the intellectual discipline 
for expressing themselves as adults.
 
In short, I am calling for a revival of the classical definition of a 
liberal education, serving its classic purpose: to prepare an elite to 
do its duty. If that sounds too much like Plato's Guardians, consider 
this distinction. As William F. Buckley rightly instructs us, it is 
better to be governed by the first 2,000 names in the Boston phone book 
than by the faculty of Harvard University. But we have that option only 
in the choice of our elected officials. In all other respects, the 
government, economy and culture are run by a cognitive elite that we do 
not choose. That is the reality, and we are powerless to change it. All 
we can do is try to educate the elite to be conscious of, and prepared 
to meet, its obligations. For years, we have not even thought about the 
nature of that task. It is time we did.
 
The goals that should shape the evolution of American education are 
cross-cutting and occasionally seem contradictory. Yesterday, I argued 
the merits of having a large group of high-IQ people who do not bother 
to go to college; today, I argue the merits of special education for the 
gifted. The two positions are not in the end incompatible, but there is 
much more to be said, as on all the issues I have raised.
 
The aim here is not to complete an argument but to begin a discussion; 
not to present policy prescriptions, but to plead for greater realism in 
our outlook on education. Accept that some children will be left behind 
other children because of intellectual limitations, and think about what 
kind of education will give them the greatest chance for a fulfilling 
life nonetheless. Stop telling children that they need to go to college 
to be successful, and take advantage of the other, often better ways in 
which people can develop their talents. Acknowledge the existence and 
importance of high intellectual ability, and think about how best to 
nurture the children who possess it.
 
Mr. Murray is the W.H. Brady Scholar at the American Enterprise 
Institute. This concludes a three-part series, which began on Tuesday.

 

THIS COLUMN GETS SO GHETTO

Clarence Page

WASHINGTON, April 7, 2007 -- Somebody should have warned Newt Gingrich to stay away from the "ghetto."

The word, I mean. If someone had, the former speaker of the House, who is weighing a presidential bid, could have avoided the embarrassment for which he is apologizing.

In a video statement on YouTube that's read in Spanish and subtitled in English, the Georgia Republican says his "word choice was poor" when he equated bilingual education with "the language of living in a ghetto" in a recent speech.

What he meant to say, he says, is that, "In the United States it is important to speak the English language well in order to advance and have success."

Alas, poor Newt. As a journalist who has had the G-word stricken from my copy by cautious editors, I could have warned him. "Ghetto" means so many things to so many different people that it is best avoided as a metaphor in mixed company unless you're trying to be, say, Grand Master Newt, the rap artist.

Gingrich fell into an unexpected culture gap similar to the one that Sen. Joe Biden opened up by referring to Sen. Barack Obama as "clean" and "articulate." On the bright side, such political gaffes offer rare opportunities for the rest of us to see how different cultures can draw vastly different meanings from the same words.

Gingrich didn't know it, but his G-bomb stepped into the middle of a bubbling controversy in the black community that has boiled over into mainstream American culture.

"Ghetto" originally referred to the areas of Rome, Warsaw and some other European cities into which Jews once were confined. Black activists in 1960s America embraced the word to label impoverished urban areas into which blacks had long been segregated.

But in recent years, the word increasingly has come to mean simply "low class," sometimes with irony, sometimes not.

Gingrich's gaffe coincides with the publication of a book he would have found helpful: "Ghettonation: A Journey into the Land of Bling and the Home of the Shameless" by Cora Daniels, a contributing writer for Essence, Fortune, The New York Times and O: The Oprah Magazine, among others.

She was moved to write, Daniels says, by the sight of Paris Hilton remarking on the reality TV show "The Simple Life" that "this truck is so ghetto," as she tried in vain to start up an old, rusted pickup truck. At that moment, she says, she realized that "ghetto" is no longer a "black thing," but "an American thing."

Martha Stewart helped confirm that when Daniels saw her boast on TV that she can "get ghetto" when she needs to. Not a good thing, Martha.

Daniels is not radical chic. She comes courageously to vilify "ghetto," not to praise it.

With wit and wisdom, she explores and exposes the ghetto "mind-set" that demeans women ("hos," "bee-yatches"), devalues education ("acting white"), ridicules proper English ("talking white"), celebrates criminality ("gangsta love"), discards traditional parenthood ("babydaddies," etc.) and celebrates tacky fashion and behavior ("ghettofabulous").

She knew things had gone off the rails when, shopping for Halloween, she found "pimp" and "ho" costumes in preschool sizes.

Or when she discovered that more than 1,200 babies were named Lexus in 2006.

Yet, Daniels writes with an undertone of love. She softens the inevitable "elitist" label that some critics have pinned on Bill Cosby by spreading the blame. Cosby famously chastised poor people three years ago for "not holding up their end in this deal." Daniels quite properly includes black middle-class Americans, like her and me, in her critique, too.

Daniels fails to pin down the precise moment when the most self-destructive values took hold, if there is one. I would put it at the point when, as black novelist-journalist Jill Nelson tells Daniels, "We lost hope." Our generation saw Martin Luther King, Malcolm X, Robert F. Kennedy and other great leaders rise, only to be violently snatched away from us. Then, Nelson recalls, we "smothered our kids in material stuff to insulate them from the pain."

We also saw the poorest of the black poor becoming increasingly isolated in the economic ghettos from which their more fortunate neighbors escaped. Like impoverished societies everywhere, our black poor created new music and fashions from the resources they had. These, in turn, were ironically exploited by entertainment executives when they found big profits to be made, often in white suburbia.

At a time of great national argument over what is to blame for poverty--racism or bad habits--Daniels reveals that society can be blamed.

Gingrich is hardly the first or only American to be caught in the "ghetto." The first step toward improving our predicament, as Daniels tells us, is to improve the way we think about it.
----------
Clarence Page is a member of the Tribune's editorial board. E-mail: cptime@aol.com

Copyright © 2007, Chicago Tribune

 

TO BE OR NOT TO BE

By Art Winslow

In one of the more amusing sections of "Becoming Shakespeare," Jack Lynch's examination of the afterlife of William Shakespeare -- that is, how what we recognize as "Shakespearean" has acquired, over the centuries, its specific qualities and shape -- he juxtaposes versions of what should be the same line from "Hamlet."

"O that this too too solid flesh would melt," the Danish prince soliloquizes in "The Oxford Shakespeare"; "O, that this too too sallied flesh would melt," is how the "Norton Critical Edition" has it; "O that this too too sullied flesh would melt," is Hamlet's utterance in "The Pelican Shakespeare."

As Lynch points out, "This is one of Hamlet's most important speeches"; the answer to the question of which version is correct "presumably matters," and yet "there are hundreds of problems like this in every single play." It is as if the ambiguities of Shakespeare's wordplay carried on of their own accord, even after (and long before postmodernism) the death of the author.

No draft of a Shakespeare play in his own hand is known to have survived, nor is he known to have supervised the printing of any copy. In his lifetime, inexpensive versions of many of Shakespeare's scripts were printed in small formats known as quartos, of inconsistent reliability or fidelity to a presumed original. Not only do wordings in the quartos vary one against the other, they differ from the official First Folio edition of his plays put together by two members of Shakespeare's acting company seven years after his death in 1616 (for 18 of his plays, the First Folio is the only source). So, in more than one sense we are left with what fellow poet Samuel Taylor Coleridge called the " 'myriad-minded' " Shakespeare.

Lynch, a professor of English at Rutgers University, focuses on Shakespeare the Reputation -- in the academy, on the stage, in the publishing world and in education -- a plus for a relatively slight volume, for it stakes out terrain that is somewhat idiosyncratically removed from the avalanche of other Shakespeare books. Since Harold Bloom's sizable biography nearly a decade ago, "Shakespeare: The Invention of the Human," notable works have included an omnibus interpretation of the plays from Marjorie Garber ("Shakespeare After All"), a highly interpretive biography from Stephen Greenblatt ("Will in the World") and, most recently, Ron Rosenbaum's chronicle of struggles inherent in gaining perspective on the Bard ("The Shakespeare Wars," which has some overlap of approach with Lynch's book, particularly when elucidating disputes over wordings in "Hamlet" and "King Lear"). As Lynch points out in an addendum, "A typical year now sees around 1,500 new articles, 650 new books, 200 new editions, and 100 new doctoral dissertations devoted to Shakespeare." By his rough estimate, that translates into 30 new pages published per hour, around the clock, year-round.

One of the historical debates has been over the very authorship of the plays, but Lynch dispenses with this argument in his introduction. Dismissing the " 'anti-Stratfordians' " who have put forth figures from Francis Bacon to Sir Walter Raleigh, Edmund Spenser to Edward de Vere, 17th earl of Oxford, Lynch writes that "the evidence just doesn't support the case for anyone other than William Shakespeare of Stratford" as the author. His starting point, rather, is to ask whether Shakespeare was "really the great genius he's now made out to be?" He was widely respected by his contemporaries, but was not elevated "as a kind of secular deity," his modern fate.

Perhaps surprisingly from today's perspective, "Shakespeareana slowed to a trickle" after the First Folio and a second edition of it were published, and by the 1630s Shakespeare was seen as old-fashioned, with only five of his more than three dozen plays being staged.

Then a curious thing happened on the way to England's theaters: As the Puritans gained political power, their regard of playhouses and acting troupes as purveyors of immorality led Parliament to close public theaters nationwide in 1642. In the period of civil war, Oliver Cromwell and his successors kept theaters closed until the restoration of the monarchy under Charles II in 1660, when "Charles's Restoration proved to be Shakespeare's as well."

In a classic case of supply and demand, when the king licensed theaters again after a drought of nearly two decades, "the most pressing problem for the newly instituted companies was that they had no new plays to perform," only historical chestnuts like "The Moore of Venice" (alternate title of "Othello") and "The Merry Wives of Windsor." The two king's companies set up in Drury Lane and Covent Garden, and "no one benefited more than William Shakespeare," Lynch writes.

At this point "Becoming Shakespeare" presents a general overview familiar from theater histories, discussing shifts in audience and theater design and the emergence of prominent Shakespearean actors. Whereas the old Globe could hold up to 3,000 spectators, refurbished tennis courts and other facilities that became home to early Restoration theaters could hold only a few hundred. With a smaller attendance base to cover costs, admission prices rose significantly, changing the social profile of theatergoers, making the atmosphere "necessarily more genteel."

An incipient star system developed as well, as theater actors became national celebrities, "achieving a degree of fame that no one in Shakespeare's day could have imagined." The first prominent star in this period was Thomas Betterton, who performed such tragic roles as Hamlet, Lear and Othello but also distinguished himself playing Falstaff and Mercutio. Betterton was succeeded most prominently in the early 18th Century by Colley Cibber, who had a penchant for rewriting Shakespeare's plays as adaptations. That practice became common, and Lynch smoothly illustrates how changing performance expectations worked in tandem with censorious impulses (e.g., an aversion to Shakespeare's bawdiness) to denature the presentation of what was "Shakespeare."

Lynch moves forward with capsule career sketches of David Garrick (who introduced a naturalistic acting style) in the mid-18th Century, the many actors of the Kemble family and the great Edmund Kean of the early 19th Century, of whom Coleridge said, " 'To see him act, is like reading Shakespeare by flashes of lightning.' "

"Becoming Shakespeare" turns more interesting in its second and third acts, so to speak, in which Lynch delves into the critical reactions to Shakespeare over time. Few early critics would have ranked the Stratford playwright above contemporaries like Ben Jonson, whose tendencies did not include what was seen as implausible plots, sloppiness and lack of adhesion to Aristotelian principles of unity in drama. Theater professionals and the public "were content to publish, perform, watch, and read [Shakespeare's] plays in thoroughly rewritten versions," Lynch points out, and in fact for much of the past 400 years, Shakespeare's plays "were rarely presented as he wrote them."

For example, when Garrick performed "Richard III," it was Cibber's Richard, in which "[l]ittle of Shakespeare's text survived"; some of Cibber's lines even show up in Laurence Olivier's 1956 film version of the play. Lynch suggests, somewhat counterintuitively, that the popularity of the transformed versions was a historical plus: "Perhaps, without these adaptations, Shakespeare would never have become the giant he is today. 'Lear' and 'Macbeth,' after all, were not very successful until they were rewritten."

Not only did staging sensibilities metamorphose Shakespeare's plays in execution, many editors of his published scripts did so as well. Poet Alexander Pope was one of those, producing a six-volume set in the 1720s. But despite Pope's brilliance, Lynch writes, "the cavalier way he treated Shakespeare's text can be surprising, even shocking, to modern eyes." Considering some passages inferior, Pope cut more than 1,500 lines from the plays and relegated them to footnotes instead.

This seems like child's play, however, next to the efforts of Henrietta Maria Bowdler and her brother Thomas, whose editions of "The Family Shakespeare" removed references to sex and impiety, indeed "all the passages unsuitable to refined sensibilities," and became one of the most successful Shakespeare editions of the 19th Century. Even today, some school editions of Shakespeare are direct descendants of the Bowdlerized text, Lynch notes.

In his final chapter, "Worshipping Shakespeare," Lynch tracks the quantum leap of his subject from a playwright whose works were considered risque and dramatically unruly to one whose admirers were "turning him into a kind of god" -- a deification whose global spread was assured by the colonial reach of Great Britain. Harold Bloom's explanation for the phenomenon, in a coda to his Shakespeare biography, is that "Shakespeare matters most because no one else gives us so many other selves, larger and more detailed than any closest friends or lovers seem to be." In "Becoming Shakespeare," Lynch suggests how many selves Shakespeare himself left behind instead.

© Chicago Tribune 2007

 

EVEN WITH THE BEST INTENTIONS…

 

By Lara Weber

 

November 11, 2007


Go to an exotic developing nation. Be charmed by throngs of beaming children. Meet with locals who have big plans and no funding. Donate seed money. Build a school, or a clinic, or a library. Feel like a hero.

Then watch it fall apart, as things so famously tend to do in places like Africa.

Oprah Winfrey knows what I'm talking about. Building a girls school in South Africa and then having to confront allegations of sexual abuse and malfeasance at the school, as she did in the last few weeks, cannot feel heroic. But because she's Oprah, she gets the unenviable honor of watching her dream fall apart (or at least crumble a little) in front of a global audience.

She's not the first to stumble, of course.

Thousands of Peace Corps volunteers and other aid workers in Africa have been there too, as I was, riding the development roller coaster in remote towns and villages across the continent. The budgets may be smaller, but the highs and lows are just as intense.

I've been watching Winfrey get to know Africa since 2002, when I had just returned from two years as a Peace Corps volunteer in Zambia and caught a televised special about her visit to South Africa. The pure joy and compassion she exuded carried me straight back to my first days in Zambia. Winfrey was smitten with Africa, and I knew exactly how that felt.

Since then, she has been on a journey that is so familiar to so many volunteers -- the euphoria, the idealism, the connection with the community -- and I've been rooting for her. Cautiously.

Her pet project, an ultramodern elite school for a select number of girls in South Africa, rings of a Western solution to an African problem. At the Oprah Winfrey Leadership Academy for Girls, she wants to shower the girls with luxury to boost their self-esteem and turn them into leaders. Hmm.

I thought back to my experience in Zambia, where jealousy (ukwa in the local Senga language) could, at the least, stop the best-intended projects and, at the worst, lead to murderous vengeance.

A small stack of children's books sent to me for my village by well-meaning friends set off a firestorm of ukwa. For days, the village swirled with nasty accusations of special treatment, and ill will grew between families. I couldn't distribute the books and finally opted to start a tiny library at my house where the children could come to read. No bloodshed, thankfully, but the joy of sharing the books had been completely deflated.

Winfrey's 450 lucky girls, receiving free education and luxurious housing and clothing, certainly would face an unfathomable amount of ukwa. Surely there are local advisers explaining these issues to Winfrey, right?

Or has her team, like so many of us in the Peace Corps, been too swept up in idealism to notice that a $40 million campus, with theaters and a beauty salon, might be overkill? And that the lavish surroundings could even have harmful consequences in a culture where jealousy runs much deeper than the petty envy we experience in our culture?

The sexual abuse allegations have put Winfrey's African journey back on the world stage. They also show how complicated development work can be. In an e-mail discussion with other former Peace Corps volunteers, many of us who served in Africa were sympathetic to Winfrey's situation and commended her for stepping up quickly to take responsibility for the problems at her school. It's a lesson in accountability that some volunteers should take note of, one told me.

But we also recalled so many projects that failed because of corruption and deep traditions that condoned abuse, prostitution and sexual relationships between teachers and students.

I remembered happily befriending a group of European development workers who had descended on my village to install new water sources, only to discover that they were paying the local girls $10 for sex. In the local economy, $10 is like picking the winning lottery numbers. How could I persuade the girls not to cash in?

Navigating an environment of deep poverty, shifting social mores, corruption and rampant disease is disorienting for the best of us. It gets complicated fast, and it's hard to know where it's going to get messy unless you've been in it for a long time.

Career aid workers and lifelong missionaries spend decades getting to know the local culture, learning the language and exploring ancient traditions that hold a tight grip on the present day.

For Peace Corps volunteers, in a country for a meager two years, the rule is to take it slow, start small, learn the culture, make our projects sustainable -- sort of the anti-Oprah approach. In Zambia, my group of volunteers was advised not to start any project until we had lived in our villages for at least three months. Even that was probably too soon.

If only Winfrey could have been in the Peace Corps, lived in a village, tried to do something as simple as distribute a stack of children's books. Would she have built her academy? Or would she maybe have considered developing 100 teachers colleges with that $40 million instead?

I'll keep watching Winfrey live out her celebrity-style volunteer experience, and I'll still root for her. But -- if she hasn't already done so -- I invite her to get to know a few Peace Corps volunteers. To talk to someone who was taught by a volunteer and was inspired to become a teacher. To hear about projects that failed. And to visit some of the thousands of programs around the world that thrive today because they weren't dependent on large grants or beautiful buildings.

Then I hope Winfrey will teach her millions of viewers and fans that changing the world isn't so easy after all. It can still feel like the greatest high. It's always worth the effort. But it doesn't always work as shown on TV. The real test comes when things fall apart: Are you willing to adapt to put them back together?

lweber@tribune.com

 

 

KEEPING GOOD TEACHERS IS THE TRUE TEST

By Johnathon E. Briggs

November 11, 2007

Teachers, writes education activist Jonathan Kozol, are not "drill sergeants for the state." Yet in many of America's 93,000 public schools, the high-stakes testing environment fueled by the No Child Left Behind law has left teachers feeling like "robotic drones" who regurgitate mandated curriculum. Is it any wonder that nearly 50 percent of new teachers in urban public schools quit within three years, by Kozol's estimate?

Kozol's latest book, "Letters to a Young Teacher," imagines a series of exchanges with "Francesca," a first-time teacher at an inner-city Boston school who is a composite of instructors Kozol has corresponded with over the years.

Kozol, 71, is himself a former public school teacher. He has spent nearly 40 years condemning the inequalities of education, most recently in 2005's "The Shame of the Nation," in which he exposes the often worsening segregation in public schools.

Kozol spoke to the Tribune before a recent visit to Chicago. An edited transcript follows:

Q. What is driving young teachers from the classroom?

A. When No Child Left Behind was sold to Congress, the rhetoric of the White House insisted that our urban schools were full of mediocre drones [as teachers]. There are some mediocre drones in public schools, just as there are mediocre senators and presidents. But hopelessly dull and unimaginative teachers certainly do not turn into classroom wizards under a regimen that transforms their classrooms into miserable test-prep factories.

So the only real effect of [the law] is to drive away the superbly educated, high-spirited teachers we're trying so hard to recruit. When I ask them why [they leave], they never say it's the kids. They always say it's this absolute decapitation of potential in children that is the unintended consequence of an agenda that strips down the curriculum in order to teach only isolated skills that will appear on an exam.

Q. What advice do you offer young teachers?

A. Use some wise and simple strategies to prevail without discouragement. Reach out to the parents [of your students] as quickly as you can. Befriend the wisest of the older teachers. Try not to demonize your principal. Recognize that the principal is under the same sword of fear and anxiety that you've been under.

Q. Why are you so critical of No Child Left Behind?

A. If No Child Left Behind had worked, I would not be so adamant in my beliefs. In fact, it has not worked. The 4th-grade gains claimed by [U.S. Secretary of Education] Margaret Spellings are illusory. They are testing gains as a result of teaching the test. They are not learning gains. If they were learning gains they would persist. I visit the same 4th-graders four years later when they are in 8th grade, and I find that they cannot write a cogent sentence, comprehend a simple text or, worst of all, participate in a discerning class discussion because they've never learned to ask real questions, but only to provide the scripted answers.

Q. No Child Left Behind is currently up for reauthorization before Congress. What changes should be made to the law?

A. In my long conversations with senior members of the Senate Education Committee, I've argued for three specific changes.

First, high-stakes standardized exams ought to be given only every other year from 3rd to 8th grade. Schools should instead rely far more seriously on diagnostic testing, in which the teacher actually learns something useful about the child.

Second, I strongly recommended that Congress require that states certify that class size in an urban district is at the same level as the size in an affluent suburban district and that every child ... receive the same two or three rich years of preschool education before a standardized exam can be used to penalize a child, school or teacher.

Third, Congress should amend the transfer provision to require that states facilitate and, where necessary, finance the right of transfer across district lines in order to enable the parents of inner-city children who are in chronically failing schools to place their children in high-performing and far better-funded public schools, which often are only 20 minutes from their homes.

Q. The Chicago Public Schools system is more than halfway through a school reform initiative known as Renaissance 2010 that aims to close dozens of low-performing schools and replace them with 100 innovative ones. More than half the new schools are charters. Are charters a solution?

A. I'm not opposed to all charter schools. But I think it's naive to believe that charter schools can ever meet the need. Despite claims that these schools are not selective in the students they enroll, the kids whose parents even hear about these schools and know how to navigate the application process are inherently self-selected.

Q. The CEO of Chicago Public Schools, Arne Duncan, has repeatedly called for education-funding reform, saying that relying on property taxes to fund education has resulted in disparities between wealthy suburban districts and struggling rural and city schools. What alternatives are there to funding public education?

A. Arne Duncan is absolutely right. So long as Illinois relies more heavily than almost any other state on local property wealth, you will never have a genuine meritocracy. You will always have a hereditary meritocracy based on the accident of birth. The only answer is to get rid of the property tax almost entirely as the basis of school funding. Or, if it continues to exist, to pool the property taxes into a common pool ... and then distribute those funds equitably to every single child in the state adjusted only for cost of living or the greater or lesser needs of children in specific districts.

Q. What can school districts do to close the achievement gap between black and Latino children and their white counterparts?

A. What counts most in education is ... the high quality of the teacher, the strong morale of the teacher and the number of children in that classroom. If we want to know what works in closing the achievement gap, we don't need to search for isolated exceptions in major urban systems. All we need to do is look at any great suburban system in which children thrive as a matter of course, not as a matter of exception.

Q. What impact will the recent Supreme Court decision have on efforts to integrate the nation's schools, which are resegregating at an alarming rate?

A. In his partial concurrence, Justice Anthony Kennedy opened up a means of pursuing integration so long as it is not race-specific. Congress has a golden opportunity to require states to allow transfers across district lines without ever introducing race into the debate, solely for the reason that the child is in a chronically low-performing school.

Q. Do you ever lose heart?

A. The reason I don't lose heart is, despite everything, there are far more marvelous teachers in these urban schools than you would ever guess if you listen to the politicians who condemn them. I do believe that in the long run the high morale of our teachers is our most precious asset. If they lose their delight in being with the children, they won't stay, and we'll lose everything.

 

THE TELL-ALL CAMPUS TOUR

by Jonathan Dee

September 21, 2008

Broke young college graduates with ideas for awesome new Web sites are about as thick on the ground as pigeons in New York City, but Jordan Goldman has a talent for getting noticed. Born and raised in Staten Island, he graduated from Wesleyan in 2004, spent two post-grad years in England and, upon his return to his native city, lived in 16 different sublets in the next two years. His own parents referred to him as the Wandering Jew. “I was ordering Chinese lunch specials and dividing them into three,” he remembered recently, “and that was my food for days. My mom thought I was nuts. She kept saying, ‘Get a job,’ and I’d say, ‘No, Ma, I have this idea.’ ”

 

 

 

With no money, no contacts and no business education whatsoever, Goldman began where any 21st-century self-starter would: “I Google-searched ‘business plan,’ and I found one and just plugged my own words into it. Then it wound up that Wesleyan has an alumni database, and so I looked for people who worked in finance and who graduated 10 or more years before I did. I e-mailed about 500 people, and I just said: ‘Look, I have this idea. What do I do now? What comes next?’ It was a fairly untraditional fund-raising process.”

Actually, with the exception of the bit about Google, it was as traditional as can be, but given that he was 23, Goldman can be excused for thinking that he discovered the Old Boy Network. About 50 Wesleyan alums answered his e-mail messages, and one of those replies — from Frank Sica, a former president of Soros Private Funds Management — was the stuff of drama.

 

“He said, ‘I live in Bronxville,’ ” Goldman recounted. “ ‘At 7:30 I order my eggs at this diner. I’m done by 8. Come up to the diner and tell me about your idea, and I’ll give you until I’m done with my eggs.’ ” Armed with only his idea and the ability to talk a blue streak about it, Goldman set his alarm and took a train to that diner. No one who has ever met Goldman would have any trouble guessing that by the time Sica was finished with his eggs that day, he was on his way to becoming the young man’s lead investor.

 

Now Goldman goes to work every day on Park Avenue, in an office with an interior window through which he can keep tabs on his 25 employees, nearly all of them even younger than he. This month his Web site, called Unigo.com — a free, gigantic, student-generated guide to North American colleges for prospective applicants and their families — went live for the benefit of tens of thousands of trepidatious high-school students as they try to figure out where and how to go to college. Not coincidentally, it also aims to siphon away a few million dollars from the slow-adapting publishers of those elephantine college guidebooks that have been a staple of the high-school experience for decades. A lot of the classic narratives about a young man’s coming of age may seem fatally old-fashioned in the new century, but apparently, Horatio Alger still lives.

 

One measure of an idea’s greatness is how obvious it seems in retrospect, and Unigo’s central idea — that high-school and college students would much rather learn from one another than from a book — is so self-evident that your first reaction is surprise that no one has acted on it before. As status anxiety has helped to drive college applications to record levels, the college-guidebook industry has expanded along with it, stoking those anxieties in order to sell you a way to assuage them, most conspicuously through their merciless numerical ranking of the colleges by every metric they can plausibly invent (“Most Millionaire Graduates,” “Top 10 Schools You’ve Never Heard Of”). But over the years, the handful of major players in the guidebook business — a group that includes The Princeton Review, Fiske, Peterson’s and especially the rankings-granddaddy, U.S. News & World Report — have enlarged their operations without really adapting them to the habits of a generation whose first, and often only, source for information is the Internet. The guidebook publishers all have decent Web sites, but since the ultimate purpose of those Web sites is to sell the books, they have little choice but to be parsimonious about how much information they give out for free.

 

On Unigo, the information is all free — “free,” of course, understood as a synonym for “accompanied by advertisements” — and with the exception of brief editorial overviews of each of the 267 colleges featured at start-up, all of it is voluntarily provided by current students at those colleges. “For so long, the colleges have been able to have this stranglehold on the P.R. image of their school,” Goldman said recently in his office, decorated boy-workaholic-style with nothing but an open box of Frosted Flakes and a toy robotic dinosaur. “It’s just harder to look at them as the main source of information. If you’re a college student, you are as much of an expert on being a student at that college as anyone.”

 

The beauty part is that Unigo has not only declined to enlist the colleges’ help with this “national grass-roots movement,” as Goldman likes to refer to it, but the company has also kept it a secret from them. Unigo started soliciting input directly from students (under a kind of Internet alias, “bystudents.com”) almost a year ago, and to date it has received more than 30,000 individual bits of content — primarily reviews in the form of responses to an essay-based questionnaire, but also photos, videos, uploaded writing samples, etc. — all before publicly unveiling the site or even the real name. So how many of these contributions will ultimately be chosen for inclusion on the site?

 

Goldman looked surprised by the question. “All of them,” he said.

 

And that is the plane on which it is simply impossible for the traditional guidebooks to compete. Even the most student-oriented book — “The Insider’s Guide to the Colleges,” published annually by the staff of The Yale Daily News, to take one example — is forced by space considerations into a shorthand style reminiscent of a Zagat’s restaurant review: student quotes are cherry-picked on the basis of how representative they are of respondents’ opinions as a whole. Unigo, though, can host so much more material than even the fattest book that there is no burden on anyone’s opinion to be in any way representative of anything. On the contrary, it is free to be as idiosyncratic and intemperate as college life itself. (Unigo’s editorial overviews make use of those Zagat’s-style quotelets as well, but each quote will function as a hyperlink to the full-length review and from there to the reviewer’s personal profile.)

 

“There’s a nuance to colleges,” said a Unigo editor, Nikki Martinez, who graduated from the University of California at Santa Barbara two years ago. “One of the colleges I’m responsible for is U.S.C. Forty thousand kids. Their reputation is pretty much for football. So my intern there submits a video about this little taco truck they have in the parking lot. I have a lot of friends who went to U.S.C., and the second I mention it they’re all like: ‘Oh, my God, the taco truck! I’m so glad you got the taco truck!’ ”

 

The textual content on the site takes many forms, but it all essentially adopts the tone and language and casual, critical spirit of an online product review. Even the most boosterish appraisals are often wonderfully unguarded. It is diverting to imagine the reactions of admissions officers nationwide as they read testimonials like this one from an undergraduate at Louisiana State University: “We can drink any college under the table and do it with some class and hospitality.” Or this, from a current Cornell student: “I tend not to blame the suicides on the school. As for blaming suicides on the weather: if you’re that cold, then buy a jacket, for God’s sake. It’s much less messy, and you don’t have to write a note first.”

 

It is possible to have your contribution rejected, at least in theory. But the extent to which Unigo abides by this anything-goes principle is bracing. A student at Quinnipiac University in Connecticut, for instance, writes approvingly in his review that it is still “a white school.” The Unigo staff members shrug it off. “If that’s the kind of people that are going there,” Martinez said, “people need to know that.”

 

Said Adam Freelander, a Unigo managing editor: “Even the best guidebooks kind of make it seem like every college in the country is an awesome place to be, no matter who you are. And that’s not true.”

 

Every student who joins Unigo has a user profile, and while that profile might not feature his or her real name, the idea is that by garnering a few pieces of personal information — your major, your hometown, your race, sex and political leanings — a database is created that makes it possible for newcomers to search the site by all kinds of hyperspecific criteria. You can see how many other people from your own high school are looking at a particular college. You can contact the author of a review with follow-up questions. “You can say, ‘I only want to see reviews of Harvard by African-American students,’ and have a choice of 20,” Goldman projected, “or by English majors, and have a choice of 50. So you can not only see a more comprehensive version of the school than you can anywhere else, but you can also see the school through the eyes of someone who’s just like you.”

 

The idea of letting students write, or at least contribute to, college guides is not brand new; in fact, the one significant modernization in the guidebook business in the last decade or so is the vogue for books that feature students’ contributions alongside those of objective “experts.” It is a vogue for which Goldman, despite his tender years, can already claim a fair amount of credit. He is the co-editor of “The Students’ Guide to Colleges,” a project he began freshman year in his dorm room at Wesleyan using nothing more than his own ingratiating manner and boundless energy to hire unpaid interns on 100 college campuses nationwide. They helped him to attract their fellow students’ attention to the long-form, essay-based survey Goldman then posted online, offering only the promise that the three best responses to these surveys would be chosen to represent the authors’ schools in print. By the time he graduated, the Penguin edition of “The Students’ Guide” was selling solidly, but the book’s success, as well as its limitations, got Goldman thinking about what might be wrought on a grander scale.

 

“My whole family chipped in for me to go to college,” he said. “They were saving from when I was 2 or 3 years old. That the best resource for a four-year, $200,000 decision is these books — with no photos, no videos, no interactivity, only three to five pages per school on average, fully updated usually once every several years — just doesn’t make the grade. This is the most important decision people that age have ever made, and the information is just not there.”

In early August, with the site code complete and the tens of thousands of individual pieces of content laboriously loaded onto it, a loosely organized parade of small focus groups, recruited via Craigslist, began to drop by the Unigo office. A pair of high-school juniors from Rye Country Day School in Westchester County were paid $25 each to spend an hour or so navigating the site and pronounced it awesome, something they would immediately tell their friends about, leaving Goldman and the others beaming in the conference room afterward.

 

Each Unigo editor has a list of 10 colleges (including, always, his or her own alma mater) to oversee; their most important task may be finding an unpaid intern on each campus willing to act as a liaison and an occasional reality-checker for Unigo’s efforts. The real masterstroke, though, was the purchase of a hundred Flip video cameras, which were delivered to the on-campus interns themselves with a minimum of instructions. The results are not only vivid in a way no guidebook can match but also, in the way of the generation that produced them, often guilelessly intimate.

 

A white student looks back on her decision to attend the historically black Howard University. Two girls at Notre Dame, one an official campus tour guide, visit several spots around campus: at each one, the tour guide gives you the approved spiel, and then her friend tells you what the spiel leaves out. At Wesleyan, the camera goes around a dining hall, and an offscreen student asks the different, socially stratified tables — a table full of jocks, a table full of hipsters — to talk about student stereotypes. A Princeton student sitting on a dorm-room couch, his face almost entirely obscured by a hoodie, talks about the difference between racial and socioeconomic diversity and why Princeton may excel at the first but fails at the second.

 

“In the ‘Fiske Guide,’ ” said Max Baumgarten, a Unigo editor, “it’ll say ‘the University of Wisconsin sits on these lakes,’ right, and you’re a high-school senior, and you’re like: What do I care that the school is on a few lakes? That means nothing to me. You look at a video, though, and see students hanging out in these beautiful surroundings, and that really changes the game.”

 

It changes the game from an economic standpoint too: it costs a lot of money to travel far away from home to check out schools, and Unigo offers an unfiltered, detailed, often somewhat eccentric view of campuses all over the country. A 45-second video in which an unseen student pans around the courtyard at Sarah Lawrence on a sunny day and simply describes what she sees (including a student-run barbecue pit called PETA, which stands for “People Eating Tasty Animals”) is so evocative that it makes the one-page U.S. News summary — or the descriptions in Sarah Lawrence’s own admissions catalog, for that matter — read like junk mail.

 

Unigo’s staff members, as befits their age, do not lack for a sense of the significance of what they are about to unleash on the world. “The colleges are going to have to change what they’re doing,” Martinez said. “The pictures of the kids on the lawn won’t do anymore.”

 

If so, the colleges remain happily in the dark for now. “I’ve got to be honest with you,” Christopher Gruber, a vice president who oversees admissions at Davidson College, told me. “I’m not spending a ton of time navigating those student-driven sites. It’s too much to manage. My sense is that the traditional big players, like Princeton Review, are the major sources for online information too, in part because those are the names that parents still recognize. Those are the names that are going to have greater panache, and so those are probably the ones that will be turned to. The ones that we supply information to are the ones that we spend the most time on, filling out surveys for them to make sure that that information is accurate.”

 

In early September, after Unigo offered Davidson and the other 266 colleges a two-week preview of the site — “because we don’t want them to feel ambushed,” Goldman explained — Gruber confirmed that the letter from Goldman was sitting on his desk but said he hadn’t yet found the time to visit the site itself. If he does, he will see reviews, photos and videos by roughly 230 current Davidson students (one-eighth of its entire student body) already posted there.

“I don’t think they know the numbers,” Baumgarten said. “That’s the distinction. The whole package is something they should be a bit scared of, but they’re not. They don’t really understand the immensity of it.”

 

Unigo’s 25 editors would be the first to acknowledge that their contribution to the site pales in comparison to that of their legion of student contributors. It’s hard to find fault with Goldman’s assertion that this is not just another Web site but a grass-roots phenomenon. But a phenomenon of what nature? If consumer advocacy seems like a strange modern lens to train on what used to be the great desideratum of millions of American families — getting accepted into, and graduating from, a good college, or indeed any college at all — it certainly speaks to the great leveling that the Internet has wrought in terms of consumption. “If you look at it,” Goldman said, “you can review anything online. You can review the most trivial things, but you can’t review your college. There’s no platform for this incredibly important decision that costs so much money.”

 

College, in other words, is at a certain point in your life the ultimate product, and the first truth with which its current and prospective students are concerned is truth in advertising. Schools are very much in the business of selling themselves — about 150 colleges nationwide reject more students than they accept, and even those compete hard for the applicants they consider the biggest gets — to which Unigo’s reply seems to be: Look, if you won’t sell the experience to us properly, then don’t even bother. We’ll just sell it to one another.

 

“Empowerment,” “revolution,” “grass-roots movement” — these are phrases Goldman and his employees toss around a fair bit. They’re not wrong, exactly, but there is something dispiriting about seeing that vocabulary applied here, as if the greatest empowerment to which young people can aspire is the empowerment of the focus group — the opportunity to offer marketers “reviews” that help determine how those who come after them will be marketed to. Several Unigo employees repeated to me a sort of party line that ran like this: Who’s a better judge of a college than its students? The potential counterarguments seem less important than the fact that they clearly consider the question a rhetorical one. Thus they feel no need to critique, for instance, their own tabulation that one of the most commonly voiced student complaints about today’s college experience, nationwide, is the lack of sufficient on-campus parking.

 

It all might seem less suggestive if it weren’t for the fact that this whole “grass-roots movement” seems poised to make a lot of money — most of which seems destined to find its way to the usual suspects, none of whom are part of a grass-roots anything.

 

Asked whether he ever thinks twice about taking a position in a company with someone as young as Goldman at the helm, Sica — the investor who listened to Goldman while eating his eggs and who now leads Unigo’s board — was surprisingly candid. “I’m still not off the kick of saying we need a real C.E.O.,” he said. “We haven’t needed one to date. We have a very active board. Jordan has listened and has acted appropriately, so I haven’t pushed the point. But it’s not clear to me what kind of person we want as C.E.O. At this point, we’re still a development-stage company: we haven’t sold a thing; we don’t have a dollar of revenue. Once we see where we’re going, I think we’ll revisit that issue, and Jordan may end up as the C.E.O. then, or he may not.”

 

Goldman has always made the case that his youth is in many ways his chief qualification. “When I brought this up from scratch, some people said, ‘Look, you’re just a kid — are you really the right person to do this?’ And we tried to make the case that we’re the perfect people to do it, because we’re the only ones who know what college today is really like and who know how to reach other students in a way that someone 20 years out isn’t going to.” As for the future: “Right now I’m still the largest shareholder, but I feel like it’s kind of not about those things. If anything, I gave away a fair amount, under the idea that it was more important to get this company off the ground than to be able to say it was mine. Anyway, I don’t have any sort of five-year plan. It’s hopefully about a lot more than just me.” He smiled. “After I graduated, when I told my mom that instead of getting a job I was going to spend a year trying to finance a business plan, she thought I was totally nuts. She still thinks I’m nuts. But at least I’m nuts with a Web site.”

 

 © The New York Times Magazine, 2008

 

 

 

MEDIA USE BY TEENS, TWEENS GROWS TO 52 HOURS A WEEK

 

The amount of time young people spend consuming media has ballooned with around-the-clock access and mobile devices that function practically as appendages, according to a new report.

A few years ago, the same researchers thought that teens and tweens were consuming about as much media as humanly possible in the hours available. But somehow, young people have found a way to pack in even more.

In the last five years, the time that America's 8- to 18-year-olds spend watching TV, playing video games and using a computer for entertainment has risen by 1 hour, 17 minutes a day, the Kaiser Family Foundation said.

Young people now devote an average of 7 hours, 38 minutes to daily media use, or about 53 hours a week — more than a full-time job.

"What surprised me the most is the sheer amount of media content coming into their lives each day," said Kaiser's Vicky Rideout, who directed the study. "When you step back and look at the big picture, it's a little overwhelming."

The numbers zoom even higher if you consider kids' multitasking — such as listening to music while on the computer. Those data show young people are marinating in media for what amounts to 10 hours, 45 minutes a day — an increase of almost 2.25 hours since 2004.

The report, "Generation M2: Media in the Lives of 8- to 18-year-olds," is based on a survey of more than 2,000 students nationwide. It is the third wave of the nonprofit's ongoing look at children's media use, providing a glimpse at current viewing and listening patterns while also documenting changes from five and 10 years ago.

The huge increase since 2004 can be attributed to the transformation of the cell phone into a content delivery device, Rideout said.

"Kids are spending more time using their phone to play video games, watch TV and listen to music than to actually talk on them," she said.

And, of course, the last time Kaiser took the nation's temperature, social networking sites barely existed.

"The average day for me, if I am not at work, I will text all day or be on MySpace or Facebook," said Felinda Seymore, 17, of Waukegan. "That's my life."

On Sunday, for instance, she fiddled around online from 9 p.m. to 2 a.m., updating her status and commenting on her friends' pages, she said.

"My mom thinks it's too much technology," Seymore said. "She says back in her day, they didn't have that stuff. I feel like it helps us open up and learn new things ... instead of sitting around at home being bored."

Media consumption is even heavier in minority families such as Seymore's — a trend unaffected by a child's age, socioeconomic status or parents' education. African-American and Hispanic youths favor TV over mobile devices, posting nearly six hours of tube time a day compared with 3.5 hours for their white counterparts.

Parents aren't helpless to limit the intake, the study found.

When parents impose limits, they work, with their offspring tallying nearly three hours less exposure a day. But only 30 percent impose some kind of parameters, the study found.

It's not easy playing electronic cop, but the stakes are too high, said Becky Kirsh, who has been known to pack up the remote controls and bring them with her to work.

With four kids, three computers and assorted cell phones, TVs and video games, Kirsh and her husband struggle to keep media from seeping into every corner of their Lombard home.

"The bottom line is that this is my house," she said. "There's so much that is positive about old-fashioned family life ... and I'm just not willing to give that up to technology."

She offered one example of how gadgetry can alter relationships with her four children, who range in age from 9 to 15. In a simpler time, the car was an ideal place for heart-to-heart chats (captive audience, no eye contact).

 

But when her kids go right to their cell phones or immediately retreat into their headphones in the car, "it's no different than if they were in their bedrooms, with the door closed," said Kirsh, an educational coordinator at a local church. "That's when I really put my foot down."

Right now, the biggest tussle is with her 15-year-old son over texting — a practice Kaiser didn't include separately in its count of media use, but parents often file under the same category. The Kirshes have responded by building in some restraints, including a limit of 2,500 texts and blocking any incoming messages from 7 to 9 p.m. (homework time) and after 11 p.m.

To most adults, a couple of thousand texts is tantamount to a blank check, but Joe Kirsh chafes under the allotment, saying it cramps his social life.

"When I run out of texts, I can't make plans," he said, adding that there is no way to access messages that arrive after hours and that he is the only one of his friends to have such restrictions. "I get good grades ... so it's really not fair."

When it comes to report cards, the Kaiser report finds a difference between heavy and light media users, though researchers note that they haven't determined cause and effect. Nearly half of all heavy media users, those who consume more than 16 hours a day (including time spent multitasking), say they usually get "fair or poor" grades compared with about a quarter of light users (less than 3 hours).

Certainly, part of managing the media landscape means parents need to be savvy about everything from age-appropriate content to V-chips.

But it's not just about more government regulations and stronger locks, Rideout said.

Adults also need to look at their own behavior. Do they put a computer in every bedroom? Is the TV on during dinner? Are Mom and Dad tethered to their own BlackBerrys?

"Really, parents make choices about the media environment every day, Rideout said. "We hope these findings will allow them to look at what goes on in their own families ... and talk about it."



Name: Jorge Becerra Jr., 16

Residence: Mundelein

Spends most of his time: On the computer, playing "World of Warcraft" with friends.

Parents say: It's hard to wrest Jorge from video games when it's time for dinner.

"He says, 'Wait, wait,' " said his father, Jorge Becerra Sr. "I tell my wife, 'Leave his dinner out and let it get cold.' "

Words of wisdom: Use the hobby as leverage. "If he gets good grades, like he's doing now, we go easy on him," his father said. "But if he does bad at school, we take the video games, the TV, the cell phone."

Name: Jailen Williams, 12

Residence: Hazel Crest

Spends most of his time: Playing "Assassin's Creed," a video game

Parents say: Jailen can play video games only on weekends — but even then it's limited to three hours a day, said his mother, Kimberly, who imposed the restrictions when he was 7.

Words of wisdom: Set expectations. "He knows he can't play, so he focuses on other things, like school and basketball," she said. "But on Friday nights he makes a beeline."

Name: Matt, 12, and Nate, 8

Residence: Frankfort

Latest acquisition: A cell phone.

Parents say: Matt's phone is not holstered to his hip, but it must be requested. "He's been wanting to use it more and more," said his mom, Marcie Stern. "And he recently asked for his own e-mail. I'm a little nervous, but we've worked really hard at having open communication ... so I don't think it will be a problem."

Words of wisdom: No TV before 8 a.m. "We did it when they were young ... and it still stands."

 

EDUCATION REFORMS GET A FAILING GRADE

By Stephen Chapman

April 15, 2010

In 1990, in one of the most innovative developments in modern American education, the Milwaukee public schools created a parental choice system. Some low-income parents got vouchers that could be used to send their children to private schools.

It was a richly promising idea. The new option would let disadvantaged kids escape wretched public schools. Competition would force public schools to improve or close. Students would learn more.

Twenty years have passed. Last week, researchers at the School Choice Demonstration Project at the University of Arkansas published their latest assessment of the results.

What did they find? Something unexpected: Kids in the program do no better than everyone else. "At this point," said professor Patrick J. Wolf, "the voucher students are showing average rates of achievement gain similar to their public school peers."

This is a surprise to anyone who originally supported the voucher idea — as I did. But it's entirely consistent with the record elsewhere.

In Washington, D.C., voucher kids improved a little in reading after three years, but not in math. A 2009 review of all the studies on voucher programs found few gains, "most of which are not statistically different from zero." This type of school choice, whatever its merits, has not accomplished what it was supposed to do.

In that, it resembles just about every idea offered by liberals, conservatives or anyone else in recent decades. Coming up with solutions for public education, it turns out, is easy. Coming up with solutions that actually work — well, that's another story.

The latest trend in education reform is charter schools — independent institutions that are publicly funded but free of the usual restrictions on hiring, firing, curriculum, instruction and so on. Today, there are some 4,700 charter schools enrolling 1.4 million kids.

Like vouchers, they are supposed to stimulate improvement by expanding options, fostering a rush to quality. Like vouchers, they have fallen way short of expectations.

In some places, there is evidence that students who win lotteries that let them go to charter schools do better than students who lose out. Stanford University economist Caroline Hoxby found evidence that in New York City, charter school kids progress more rapidly than their peers in public schools.

But her study doesn't resolve why. Do the charter schools have better educational methods? Or do the kids just function better when surrounded by motivated kids (or kids with motivated parents)?

The answer is important. Better educational methods can be duplicated in other schools. But no one knows how to increase the supply of motivated families.

In any case, New York is not exactly the norm. A study last year by Stanford's Center for Research on Education Outcomes found that overall, "charter students are not faring as well" as public school pupils.

These findings may be heartening to liberals who thought the school choice movement was a snare and a delusion. But the real world has also demolished liberal notions of how to improve educational outcomes.

More money for schools? Between 1960 and 2005, per-pupil spending in the United States quadrupled, adjusting for inflation. Yet student performance on reading and math tests stayed put.

Smaller classes? As Eric Hanushek and Alfred Lindseth note in their book, "Schoolhouses, Courthouses, and Statehouses," almost three-quarters of the studies conclude that class size doesn't affect student achievement.

Anyone who still puts stock in expanded resources has to contend with the dismal experience of the Kansas City public schools, which got a huge infusion of money when a federal judge essentially took them over in 1986.

Facilities were radically upgraded, classes shrank, new programs proliferated, teachers got raises, and every school became a magnet school. But students didn't learn any more than before. The schools got everything a supporter of old-fashioned public education could have asked for, and they couldn't educate kids any better.

What should we draw from these experiences? Not that nothing works, but that few if any remedies work consistently in different places with different populations. We shouldn't expect that broad, one-size-fits-all changes imposed by the federal government — such as those offered by the Obama administration — will pay off in student performance.

From the local school district to the federal Department of Education, humility, caution and open-mindedness are in order. Because right now, the main thing we know about improving schools is that we don't know very much.

 

THE THINGS HE WRITES ABOUT

Julia Keller

The first time I read "The Things They Carried," I had a chip on my shoulder roughly the size of Texas — which was only fitting because that's where author Tim O'Brien lives.

But I didn't know that then. I only knew that I'd read a lot of books about the Vietnam War, and a lot of books about a lot of wars, and everybody was telling me I just had to read "The Things They Carried." It was originally published in 1990. But this was 2000, and I thought I knew all I cared to know about that complex and terrible time.

Give it, I thought, a rest.

Still, duty called, and so I read it. Filled with resentment, stewing in a sour attitude of, "Great, another war book — hooray," I read it. Didn't want to, didn't think I needed to, groused about it even as I plopped down with the paperback — that giant chip on my shoulder made standing precarious — and read.

The chip vanished. The world changed.

"The Things They Carried" is not a war book. It's a life book. A death book. A dream book. A memory book. It's a book about writing, about being young, about falling in love, about watching people die, about wishing they didn't have to, about learning how to bring them back — how to bring everything back — with stories.

I've never been the same since I read it. And I'm not alone. "The Things They Carried" has sold millions of copies. High schools and colleges have made it a staple of the curriculum. A special 20th anniversary edition is now available from Houghton Mifflin Harcourt in hardback, paperback and e-book.

O'Brien, 63, a quiet, humble, self-effacing guy who always seems to be wearing a baseball cap, just wound up a coast-to-coast tour on behalf of the new edition. His final stop was in Chicago this month.

Reading the book now, he says, "is like remembering a stranger. I can remember sitting in my underwear in front of an old Wang computer and writing the first chapter." He laughs softly. "That guy had all of his hair. And not so many lines around his eyes."

It took him five years to write "The Things They Carried," he recalls. He has revised it a few times for subsequent editions — "I'm never wholly satisfied with any book, good reviews or bad" — and he revised it yet again for the 20th anniversary version. "I went through it and added little fixes here and there. Added a word or two. I doubt many readers would be able to notice what I did. But it's an improved book."

Hard to see how it could be. A series of 22 linked stories, some just a few pages long, "The Things They Carried" is eloquent and grisly and mysterious and funny. It's ugly and it's beautiful.

It's about a guy named Tim from a small town in the Midwest who is drafted and sent to Vietnam.

Is it autobiographical?

O'Brien, a guy from a small town in the Midwest — Austin, Minn. — who was drafted and sent to Vietnam, hates that question. "People want answers. Strict answers. If you have a literal mindset, this book is frustrating," he says. "I think people forget that truth evolves. It's not a hard and fast thing. Truth changes. It's slippery and evasive, particularly if you're in a situation of great stress or trauma.

"Whenever I'm speaking somewhere and someone raises a hand and says, ‘Is it true?' a little trap door opens in my heart. I want to say, ‘Go back and read the book again.'"

O'Brien has written other fine novels as well, such as "Going After Cacciato" (1978), "In the Lake of the Woods" (1994) and "July, July" (2002). The Vietnam War figures in all of his work, he says, either obviously or obliquely.

"I came from a small, conservative town. I believed certain things. I remember being taught, ‘Thou shalt not kill.' Then you find yourself in a place where you'd better kill — or you'll be court-martialed. If that doesn't challenge your sense of self and your sense of truth, I don't know what will."

He has two sons, 6 and 4, and his next book is about being an older dad. But it's also about Vietnam, he adds. He just doesn't know how yet. "It's there. No matter what I write about, the feel of 1969 is still there."

Here's how he puts it in "The Things They Carried":

"You take your material where you find it, which is in your life, at the intersection of past and present … That's the real obsession. All these stories."

Later in the book, he writes: "War is hell, but that's not the half of it." The rest of it is stories.

 

METAFICTION AND O'BRIEN'S THE THINGS THEY CARRIED

 

Michelle Frielander

"Metafiction is a term given to fictional writing which self-consciously and systematically draws attention to its status as an artifact in order to pose questions about the relationship between fiction and reality."

--Patricia Waugh, Metafiction: The Theory and Practice of Self-Conscious Fiction. New York: Methuen, 1984.

In many respects, Tim O'Brien's The Things They Carried concerns the relationship between fiction and the narrator. In this novel, O'Brien himself is the main character--he is a Vietnam veteran recounting his experiences during the war, as well as a writer who is examining the mechanics behind writing stories. These two aspects of the novel are juxtaposed to produce a work of literature that comments not only upon the war, but also upon the actual art of fiction: the means of storytelling, the purposes behind them, and ultimately the relationship between fiction and reality itself.

Through writing about his experiences in Vietnam, O'Brien's character is able to find a medium in which he can sort through his emotions, since "by telling stories, you objectify your own experience. You separate it from yourself. You pin down certain truths" (158). He does not look upon his stories as therapy--he recounts his stories since they are a part of his past, and who he is now is the direct result of them: 

Forty-three years old, and the war occurred half a life-time ago, and yet the remembering makes it now. And sometimes remembering will lead to a story, which makes it forever. Stories are for joining the past to the future. Stories are for those late hours in the night when you can't remember how you got from where you were to where you are. (38)

O'Brien's character makes several comments on storytelling in certain sections of the novel, such as "How to Tell a True War Story." Through making these comments, the narrator is not only justifying the intent of The Things They Carried, but he is also providing clues to the content, structure, and interpretation of the novel. The narrator states that one fundamental truth is that "In any war story, but especially a true one, it's difficult to separate what happened from what seemed to happen....The angles of vision are skewed" (71). This novel is written in this way: characters such as Curt Lemon are killed and then later introduced, or the narrator undercuts what he has previously led the reader to believe, as in the case of Norman Bowker's suicide. A true war story is distinguishable "by the way it never seems to end. Not then, not ever" (76). None of the anecdotes in this novel demonstrates complete closure, except perhaps in the case where the character was killed. Even then, however, that particular loss had an impact upon the lives of the people who have survived. Even the end of the novel itself is indefinite and without resolution.

Most importantly, "In a true war story, if there's a moral at all, it's like the thread that makes the cloth. You can't tease it out. You can't extract the meaning without unraveling the deeper meaning" (77). When one teases out the meaning of The Things They Carried, it is impossible to miss the deeper relationship being expressed in this novel and the true motivation behind the narrator's storytelling: the relationship between stories, reality, and time.

As the narrator, O'Brien often comments upon the concept of time, such as in the section "On the Rainy River": "Looking back after twenty years, I sometimes wonder if the events of that summer didn't happen in some other dimension, a place where your life exists before you've lived it, and where it goes afterward" (54). During the lake scene in this section, O'Brien sees everyone important in his life on the shore: "I saw faces from my distant past and distant future....It was as if there were an audience to my life" (59). In this scene, the power of fiction to transcend the barriers of time and space, and also of life and death, is shown. We are directly the result of our experiences, and, through the powers of storytelling, everyone who has had an impact upon the life of the narrator is brought together. As a collective entity, they are not only an audience to his life, but also serve as a reflection of O'Brien's life in its entirety.

Drawing upon the ability of fiction to preserve life against death, O'Brien says that, during wartime, they were able to "[keep] the dead alive with stories" (239). To the living, stories were a way to keep the memory of the dead alive, but to the dead, it was the simple act of remembering that kept them alive: "That's what a story does. The bodies are animated. You make the dead talk" (232). This theme of preservation is exemplified by the story of Linda, in which O'Brien uses the power of storytelling and memory to keep people alive: "Stories can save us. I'm forty-three years old, and a writer now, and even still, right here, I keep dreaming Linda alive...They're all dead. But in a story, which is a kind of dreaming, the dead sometimes smile and sit up and return to the world" (225).

Ultimately, this novel is not about Vietnam--in fact, it is not about war at all. It is about the narrator's attempt to find a place where the erosion of time will have no effect. By working through the "threads" of this novel, O'Brien's intentions become obvious: He is fighting to preserve the physical against deterioration, and by extension, to preserve life by immortalizing it in fiction. He is not writing as a result of neurosis or as a form of therapy; he does this since immortality and preservation lie in the memory of people. If the true measure of life is how long we live after we are gone, then keeping the memory of people alive through fiction is a means of preserving life:

I'll never die. I'm skimming across the surface of my own history, moving fast, riding the melt beneath the blades, doing loops and spins, and when I take a high leap into the dark and come down thirty years later, I realize it is as Tim trying to save Timmy's life with a story. (246)

Since O'Brien's life itself is the culmination of his relationships with all of the people who have been a part of it--past, present, and future--keeping them alive does the same for himself. In short, O'Brien's writing is a matter of survival since, through the powers of storytelling, he can ensure the immortality of all those who were significant in his life. It is through their immortality that he has the ability to save himself with a simple story.
 

HOW TO TELL A TRUE WAR STORY

Calloway, Catherine. "'How to Tell a True War Story': Metafiction in The Things They Carried." Critique: Studies in Contemporary Fiction 36 (June 1995): 249 ff.

Tim O'Brien's most recent book, The Things They Carried, begins with a litany of items that the soldiers "hump" in the Vietnam War - assorted weapons, dog tags, flak jackets, ear plugs, cigarettes, insect repellent, letters, can openers, C-rations, jungle boots, maps, medical supplies, and explosives as well as memories, reputations, and personal histories. In addition, the reader soon learns, the soldiers also carry stories: stories that connect "the past to the future" (40), stories that can "make the dead talk" (261), stories that "never seem . . . to end" (83), stories that are "beyond telling" (79), and stories "that swirl back and forth across the border between trivia and bedlam, the mad and the mundane" (101). Although perhaps few of the stories in The Things They Carried are as brief as the well-known Vietnam War tale related by Michael Herr in Dispatches - "'Patrol went up the mountain. One man came back. He died before he could tell us what happened'" (6) - many are in their own way as enigmatic. The tales included in O'Brien's twenty-two chapters range from several lines to many pages and demonstrate well the impossibility of knowing the reality of the war in absolute terms. Sometimes stories are abandoned, only to be continued pages or chapters later. At other times, the narrator begins to tell a story, only to have another character finish the tale. Still other stories are told as if true accounts, only for their validity to be immediately questioned or denied. O'Brien draws the reader into the text, calling the reader's attention to the process of invention and challenging him to determine which, if any, of the stories are true. As a result, the stories become epistemological tools, multidimensional windows through which the war, the world, and the ways of telling a war story can be viewed from many different angles and visions.

The epistemological ambivalence of the stories in The Things They Carried is reinforced by the book's ambiguity of style and structure. What exactly is The Things They Carried in terms of technique? Many reviewers refer to the work as a series of short stories, but it is much more than that. The Things They Carried is a combat novel, yet it is not a combat novel. It is also a blend of traditional and untraditional forms - a collection, Gene Lyons says, of "short stories, essays, anecdotes, narrative fragments, jokes, fables, biographical and autobiographical sketches, and philosophical asides" (52). It has been called both "a unified narrative with chapters that stand perfectly on their own" (Coffey 60) and a series of "22 discontinuous sections" (Bawer A13).

Also ambiguous is the issue of how much of the book is autobiography. The relationship between fiction and reality arises early in the text when the reader learns the first of many parallels that emerge as the book progresses: that the protagonist and narrator, like the real author of The Things They Carried, is named Tim O'Brien. Both the real and the fictional Tim O'Brien are in their forties and are natives of Minnesota, writers who graduated Phi Beta Kappa from Macalester College, served as grunts in Vietnam after having been drafted at age twenty-one, attended graduate school at Harvard University, and wrote books entitled If I Die in a Combat Zone and Going After Cacciato. Other events of the protagonist's life are apparently invention. Unlike the real Tim O'Brien, the protagonist has a nine-year-old daughter named Kathleen and makes a return journey to Vietnam years after the war is over.(1) However, even the other supposedly fictional characters of the book sound real because of an epigraph preceding the stories that states, "This book is lovingly dedicated to the men of Alpha Company, and in particular to Jimmy Cross, Norman Bowker, Rat Kiley, Mitchell Sanders, Henry Dobbins, and Kiowa," leading the reader to wonder if the men of Alpha Company are real or imaginary.

Clearly O'Brien resists a simplistic classification of his latest work. In both the preface to the book and in an interview with Elizabeth Mehren, he terms The Things They Carried "'fiction . . . a novel'" (Mehren E1), but in an interview with Martin Naparsteck, he refers to the work as a "sort of half novel, half group of stories. It's part nonfiction, too," he insists (7). And, as Naparsteck points out, the work "resists easy categorization: it is part novel, part collection of stories, part essays, part journalism; it is, more significantly, all at the same time" (1).

As O'Brien's extensive focus on storytelling indicates, The Things They Carried is also a work of contemporary metafiction, what Robert Scholes first termed fabulation or "ethically controlled fantasy" (3). According to Patricia Waugh,

Metafiction is a term given to fictional writing which self-consciously and systematically draws attention to its status as an artefact in order to pose questions about the relationship between fiction and reality. In providing a critique of their own methods of construction, such writings not only examine the fundamental structures of narrative fiction, they also explore the possible fictionality of the world outside the literary fictional text. (2)

Like O'Brien's earlier novel, the critically acclaimed Going After Cacciato,(2) The Things They Carried considers the process of writing; it is, in fact, as much about the process of writing as it is the text of a literary work. By examining imagination and memory, two main components that O'Brien feels are important to a writer of fiction (Schroeder 143), and by providing so many layers of technique in one work, O'Brien delves into the origins of fictional creation. In focusing so extensively on what a war story is or is not, O'Brien writes a war story as he examines the process of writing one. To echo what Philip Beidler has stated about Going After Cacciato, "the form" of The Things They Carried thus becomes "its content" (172); the medium becomes the message.

"I'm forty-three years old, and a writer now," O'Brien's protagonist states periodically throughout the book, directly referring to his role as author and to the status of his work as artifice. "Much of it [the war] is hard to remember," he comments. "I sit at this typewriter and stare through my words and watch Kiowa sinking into the deep muck of a shit field, or Curt Lemon hanging in pieces from a tree, and as I write about these things, the remembering is turned into a kind of rehappening" (36). The "rehappening" takes the form of a number of types of stories: some happy, some sad, some peaceful, some bloody, some wacky. We learn of Ted Lavender, who is "zapped while zipping" (17) after urinating, of the paranoid friendship of Dave Jensen and Lee Strunk, of the revenge plot against Bobby Jorgenson, an unskilled medic who almost accidentally kills the narrator, of the moral confusion of the protagonist who fishes on the Rainy River and dreams of desertion to Canada, and Mary Ann Bell, Mark Fossie's blue-eyed, blonde, seventeen- year-old girlfriend, who is chillingly attracted to life in a combat zone.

Some stories only indirectly reflect the process of writing; other selections include obvious metafictional devices. In certain sections of the book, entire chapters are devoted to discussing form and technique. A good example is "Notes," which elaborates on "Speaking of Courage," the story that precedes it. The serious reader of the real Tim O'Brien's fiction recognizes "Speaking of Courage" as having first been published in the Summer 1976 issue of Massachusetts Review.(3) This earlier version of the story plays off chapter 14 of Going After Cacciato, "Upon Almost Winning the Silver Star," in which the protagonist, Paul Berlin, is thinking about how he might have won the Silver Star for bravery in Vietnam had he had the courage to rescue Frenchie Tucker, a character shot while searching a tunnel. However, in The Things They Carried's version of "Speaking of Courage," the protagonist is not Paul Berlin, but Norman Bowker, who wishes he had had the courage to save Kiowa, a soldier who dies in a field of excrement during a mortar attack.(4) Such shifts in character and events tempt the reader into textual participation, leading him to question the ambiguous nature of reality. Who really did not win the Silver Star for bravery? Paul Berlin, Norman Bowker, or Tim O'Brien? Who actually needed saving? Frenchie Tucker or Kiowa? Which version of the story, if either, is accurate? The inclusion of a metafictional chapter presenting the background behind the tale provides no definite answers or resolutions. We learn that Norman Bowker, who eventually commits suicide, asks the narrator to compose the story and that the author has revised the tale for inclusion in The Things They Carried because a postwar story is more appropriate for the later book than for Going After Cacciato. However, O'Brien's admission that much of the story is still invention compels the reader to wonder about the truth. The narrator assures us that the truth is that "Norman did not experience a failure of nerve that night . . . or lose the Silver Star for valor" (182). Can even this version be believed? Was there really a Norman Bowker, or is he, too, only fictional?

Even more significant, the reader is led to question the reality of many, if not all, of the stories in the book. The narrator insists that the story of Curt Lemon's death, for instance, is "all exactly true" (77), then states eight pages later that he has told Curt's story previously - "many times, many versions" (85) - before narrating yet another version. As a result, any and all accounts of the incident are questionable. Similarly, the reader is led to doubt the validity of many of the tales told by other characters in the book. The narrator remarks that Rat Kiley's stories, such as the one about Mary Ann Bell in "Sweetheart of the Song Tra Bong," are particularly ambiguous:

For Rat Kiley . . . facts were formed by sensation, not the other way around, and when you listened to one of his stories, you'd find yourself performing rapid calculations in your head, subtracting superlatives, figuring the square root of an absolute and then multiplying by maybe. (101)

Still other characters admit the fictionality of their stories. Mitchell Sanders, in the ironically titled "How to Tell a True War Story," confesses to the protagonist that although his tale is the truth, parts of it are pure invention. "'Last night, man,'" Sanders states, "'I had to make up a few things . . . The glee club. There wasn't any glee club . . . No opera,'" either (83-84). "'But,'" he adds, "'it's still true'" (84).

O'Brien shares the criteria with which the writer or teller and the reader or listener must be concerned by giving an extended definition of what a war story is or is not. The chapter "How to Tell a True War Story" focuses most extensively on the features that might be found in a "true" war tale. "A true war story is never moral," the narrator states. "It does not instruct, nor encourage virtue, nor suggest models of proper human behavior, nor restrain men from doing the things men have always done" (76). Furthermore, a true war story has an "absolute and uncompromising allegiance to obscenity and evil" (76), is embarrassing, may not be believable, seems to go on forever, does "not generalize" or "indulge in abstraction or analysis" (84), does not necessarily make "a point" (88), and sometimes cannot even be told. True war stories, the reader soon realizes, are like the nature of the Vietnam War itself; "the only certainty is overwhelming ambiguity" (88). "The final and definitive truth" (83) cannot be derived, and any "truths are contradictory" (87).

By defining a war story so broadly, O'Brien writes more stories, interspersing the definitions with examples from the war to illustrate them. What is particularly significant about the examples is that they are given in segments, a technique that actively engages the readers in the process of textual creation. Characters who are mentioned as having died early in the work are brought back to life through flashbacks in other parts of the text so that we can see who these characters are, what they are like, and how they die. For instance, in the story "Spin," the narrator first refers to the death of Curt Lemon, a soldier blown apart by a booby trap, but the reader does not learn the details of the tragedy until four stories later in "How to Tell a True War Story." Even then, the reader must piece together the details of Curt's death throughout that particular tale. The first reference to Lemon appears on the third page of the story when O'Brien matter-of-factly states, "The dead guy's name was Curt Lemon" (77). Lemon's death is briefly mentioned a few paragraphs later, but additional details surrounding the incident are not given at once but are revealed gradually throughout the story, in between digressive stories narrated by two other soldiers, Rat Kiley and Mitchell Sanders. Each fragment about Curt's accident illustrates the situation more graphically. Near the beginning of the tale, O'Brien describes the death somewhat poetically. Curt is "a handsome kid, really. Sharp grey eyes, lean and narrow-waisted, and when he died it was almost beautiful, the way the sunlight came around him and lifted him up and sucked him high into a tree full of moss and vines and white blossoms" (78). Lemon is not mentioned again for seven pages, at which time O'Brien illustrates the effect of Lemon's death upon the other soldiers by detailing how Rat Kiley, avenging Curt's death, mutilates and kills a baby water buffalo. When later in the story Lemon's accident is narrated for the third time, the reader is finally told what was briefly alluded to in the earlier tale "Spin": how the soldiers had to peel Curt Lemon's body parts from a tree.

The story of Curt Lemon does not end with "How to Tell a True War Story" but is narrated further in two other stories, "The Dentist" and "The Lives of the Dead." In "The Lives of the Dead," for example, Curt is resurrected through a story of his trick-or-treating in Vietnamese hootches on Halloween for whatever goodies he can get: "candles and joss sticks and a pair of black pajamas and statuettes of the smiling Buddha" (268). To hear Rat Kiley tell it, the narrator comments, "you'd never know that Curt Lemon was dead. He was still out there in the dark, naked and painted up, trick-or-treating, sliding from hootch to hootch in that crazy white ghost mask" (268). To further complicate matters, in "The Lives of the Dead," O'Brien alludes to a soldier other than Curt, Stink Harris, from a previous literary work, Going After Cacciato, written over a decade before The Things They Carried. Thus, the epistemological uncertainty in the stories is mirrored by the fact that O'Brien presents events in a fragmented form rather than in a straightforward, linear fashion. The reader has to piece together information, such as the circumstances surrounding the characters' deaths, in the same manner that the characters must piece together the reality of the war, or, for that matter, Curt Lemon's body.

The issue of truth is a particular crux of the events surrounding "The Man I Killed," a story that O'Brien places near the center of the book. Gradually interspersed throughout the stories that make up The Things They Carried are references to a Vietnamese soldier, "A slim, dead, dainty young man of about twenty" (40) with "a star-shaped hole" (141) in his face, who is first mentioned in the story "Spin" and whose death still haunts the narrator long after the end of the war. Nine chapters after "Spin," in "The Man I Killed," the protagonist graphically describes the dead Vietnamese youth as well as creates a personal history for him; he envisions the young man to have been a reluctant soldier who hated violence and "loved mathematics" (142), a university-educated man who "had been a soldier for only a single day" (144) and who, like the narrator, perhaps went to war only to avoid "disgracing himself, and therefore his family and village" (142).(5) "Ambush," the story immediately following "The Man I Killed," provides yet another kaleidoscopic fictional frame of the incident, describing in detail the events that lead up to the narrator's killing of the young soldier and ending with a version of the event that suggests that the young man does not die at all. The reader is forced to connect the threads of the story in between several chapters that span over a hundred pages; not until a later chapter, "Good Form," where the protagonist narrates three more stories of the event, does the reader fully question the truth of the incident. In the first version in "Good Form," the narrator reverses the details of the earlier stories and denies that he was the thrower of the grenade that killed the man. "Twenty years ago I watched a man die on a trail near the village of My Khe," he states. "I did not kill him. But I was present, you see, and my presence was guilt enough" (203). However, he immediately admits that "Even that story is made up" (203) and tells instead what he terms "the happening-truth":

I was once a soldier. There were many bodies, real bodies with real faces, but I was young then and I was afraid to look. And now, twenty years later, I'm left with faceless responsibility and faceless grief. (203)

In still a third version, "the happening-truth" is replaced with "the story-truth." According to the protagonist, the Vietnamese soldier

was a slim, dead, almost dainty young man of about twenty. He lay in the center of a red clay trail near the village of My Khe. His jaw was in his throat. His one eye was shut, the other eye was a star-shaped hole. I killed him. (204)

But the reader wonders, did the narrator kill the young man? When the narrator's nine-year-old daughter demands, "'Daddy, tell the truth . . . did you ever kill anybody,'" the narrator reveals that he "can say, honestly, 'Of course not,'" or he "can say, honestly, 'Yes'" (204).

According to Inger Christensen, one of the most important elements of metafiction is "the novelist's message" (10). At least one reviewer has reduced O'Brien's message in The Things They Carried to the moral "'Death sucks'" (Melmoth H6); the book, however, reveals an even greater thematic concern. "Stories can save us," asserts the protagonist in "The Lives of the Dead," the concluding story of the text (255), where fiction is used as a means of resurrecting the deceased. In this multiple narrative, O'Brien juxtaposes tales of death in Vietnam with an account of the death of Linda, a nine-year-old girl who had a brain tumor. As the protagonist tells Linda's story, he also comments on the nature and power of fiction. Stories, he writes, are "a kind of dreaming, [where] the dead sometimes smile and sit up and return to the world" (255). The narrator of "The Lives of the Dead" thus seeks to keep his own friends alive through the art of storytelling. "As a writer now," he asserts,

I want to save Linda's life. Not her body - her life . . . in a story I can steal her soul. I can revive, at least briefly, that which is absolute and unchanging. . . . In a story, miracles can happen. Linda can smile and sit up. She can reach out, touch my wrist, and say, "Timmy, stop crying." (265)

Past, present, and future merge into one story as through fiction O'Brien zips "across the surface of . . . [his] own history, moving fast, riding the melt beneath the blades, doing loops and spins . . . as Tim trying to save Timmy's life with a story" (273). His story mirrors his own creative image of history, "a blade tracing loops on ice" (265), as his metafictive narrative circles on three levels: the war of a little boy's soul as he tries to understand the death of a friend, the Vietnam War of a twenty-three-year-old infantry sergeant, and the war of "guilt and sorrow" (265) faced by "a middle-aged writer" (265) who must deal with the past.

In focusing so extensively on the power of fiction and on what a war story is or is not in The Things They Carried, O'Brien writes a multidimensional war story even as he examines the process of writing one. His tales become stories within stories or multilayered texts within texts within texts. The book's genius is a seeming inevitability of form that perfectly embodies its theme - the miracle of vision - the eternally protean and volatile capacity of the imagination, which may invent that which it has the will and vision to conceive.(6) "In the end," the narrator states,

a true war story is never about war. It's about sunlight. It's about the special way that dawn spreads out on a river when you know you must cross the river and march into the mountains and do things you are afraid to do. It's about love and memory. It's about sorrow. It's about sisters who never write back and people who never listen. (91)

How, then, can a true war story be told? Perhaps the best way, O'Brien says, is to "just keep on telling it" (91).

 

SOME EDUCATORS QUESTION IF WHITEBOARDS, OTHER HIGH-TECH TOOLS RAISE ACHIEVEMENT

By Stephanie McCrummen

Washington Post Staff Writer
Friday, June 11, 2010

 Under enormous pressure to reform, the nation's public schools are spending millions of dollars each year on gadgets from text-messaging devices to interactive whiteboards that technology companies promise can raise student performance.

Driving the boom is a surge in federal funding for such products, the industry's aggressive marketing and an idea axiomatic in the world of education reform: that to prepare students for the 21st century, schools must embrace the technologies that are the media of modern life.

Increasingly, though, another view is emerging: that the money schools spend on instructional gizmos isn't necessarily making things better, just different. Many academics question industry-backed studies linking improved test scores to their products. And some go further. They argue that the most ubiquitous device-of-the-future, the whiteboard -- essentially a giant interactive computer screen that is usurping blackboards in classrooms across America -- locks teachers into a 19th-century lecture style of instruction counter to the more collaborative small-group models that many reformers favor.

"There is hardly any research that will show clearly that any of these machines will improve academic achievement," said Larry Cuban, education professor emeritus at Stanford University. "But the value of novelty, that's highly prized in American society, period. And one way schools can say they are 'innovative' is to pick up the latest device."


Federal dollars for educational technology, minuscule until the mid-1990s, grew to more than $800 million last year, and industry analysts estimate that federal, state and local expenditures will total $16 billion next year. Money that once bought filmstrips and overhead projectors has spawned a thriving industry of companies that pitch their products as a way to help schools meet the federal priorities of the day. Glossy brochures that claimed whiteboards would help teachers reach Bush's No Child Left Behind goals, for instance, now say the devices will help schools win "Race to the Top" grants from the Obama administration.

Nancy Knowlton, the chief executive of SMART Technologies, said that schools are desperate to find ways to engage multi-tasking, tech-savvy kids, who often play video games before they can read, and that some "strictly gathered research data," along with anecdotal evidence, show that her company's products work.

 "[Students] are engaged when they're in class, they are motivated, they are attending school, they are behaving and this is translating to student performance in the classroom," she said. "Kids want an energized, multimedia learning experience. . . . When you ask them to shut off when they enter the classroom, that doesn't really work for them."

Fairfax County public schools began installing interactive whiteboards several years ago, one of which landed in Sam Gee's classroom at W.T. Woodson High School. On a recent morning, the popular history teacher dimmed the lights, and his students stared at the glowing, $3,000 screen.

As he lectured, Gee hyperlinked to an NBC news clip, clicked to an animated Russian flag, a list of Russian leaders and a short film on the Mongol invasions. Here and there, he starred items on the board using his finger. "Let's say this is Russia," he said at one point, drawing a little red circle. "Okay -- who invaded Russia?"

One student was fiddling with an iPhone. Another slept. A few answered the question, but the relationship between their alertness and the bright screen before them was hardly clear. And as the lesson carried on, this irony became evident: Although the device allowed Gee to show films and images with relative ease, the whiteboard was also reinforcing an age-old teaching method -- teacher speaks, students listen. Or, as 18-year-old Benjamin Marple put it: "I feel they are as useful as a chalkboard."

On its Web site, SMART Technologies cites more glowing testimony, quoting a former Fairfax high school teacher saying that after the whiteboards arrived, he saw "significant" increases in student performance "across all grade levels."

Such statements reflect the fact that many teachers love whiteboards -- industry groups say one in three classrooms will have the device by 2011. They also reflect the relationships that ed-tech companies cultivate with school officials to market their products, underwriting major education conferences and sponsoring professional associations. After the Montgomery County school system signed a $13 million deal with Promethean to lease 2,600 whiteboards in 2008, for instance, its technology director, Sherwin Collette, spoke at Promethean events during several major education conferences. A district spokesman said Collette was not "promoting" the products per se, but speaking about technology generally.

Last year, the Arizona attorney general criticized Tucson Unified School District officials for accepting rooms, meals, an open bar and free iPods at a resort conference paid for by Promethean after the district spent $2.1 million on products. Mark Elliott, president of Promethean North America, said the company has since revised its ethics policy. But he and others said such events help the industry "keep its finger on the pulse" of what schools need.
"The private sector engagement is a good thing," said Doug Levin, executive director of the State Educational Technology Directors Association, which lists Promethean, SMART Technologies and Apple among its $30,000 platinum sponsors. "It is the [job] of the public sector to evaluate claims of these vendors."


But according to many academics, industry claims about whiteboards are not based on rigorous academic studies. One frequently cited study, conducted by Marzano Research Laboratory and funded by Promethean, surveyed 85 teachers who volunteered to teach a lesson of their choice to two classes, one with the whiteboard, one without. The teachers then gave a test of their own design, with results showing an average 17-point gain in classrooms with whiteboards. "It's a suggestive study -- you can't conclude anything," said Steve Ross, an education professor at Johns Hopkins University. "And that's being generous."

Even the study's author, Robert Marzano, noted that 23 percent of the teachers reported higher test scores without the whiteboard, and some reported lower scores using it. "It looks like whiteboards can be used in a way that can lull teachers into not using what we consider good instructional strategies," Marzano said in an interview.

After using an interactive whiteboard for a year, William Ferriter, a sixth-grade teacher in North Carolina, came to a similar conclusion, deciding the whiteboard was little more than "a badge saying 'We're a 21st-century school.' " He spent weeks trying to devise collaborative lessons that he knows engage students. The best one, he said, brought kids to the whiteboard, where they used their fingers to sort words describing metamorphic rocks, as a video played to the side.
"It just allows you to create digitized versions of old lessons," he said. "My kids were bored with it after about three weeks."

Chris Dede, an education professor at Harvard University, said whiteboards are popular precisely because companies designed them to suit the old instructional style with which teachers are most comfortable.

"No one should be beating up on these companies," Dede said. "They're just doing what a capitalist society tells them to do."

One recent morning, an amiable corporate salesman in a dark suit wheeled into a Maryland classroom the latest high-tech device -- a $6,500 table with an interactive touch screen that allows students to collaboratively count, do puzzles and play other instructional games. "We had a first run and boom! They sold out," Joe Piazza said in his presentation to administrators at Parkside High School on the Eastern Shore. "It was kind of like the iPad."

In the cinder-block classroom, a few kindergartners sat around the fancy table, working a digital puzzle as blips and canned applause encouraged them. The school officials seemed pleased.
"So," the district's technology director asked Piazza, "do we just call you for pricing?"

 

A DOWNSIDE TO HIGH TEEN SELF-ESTEEM

By John Keilman, Tribune reporter, July 4, 2010

Laura Rovi was smart enough to be lazy. An honor student at Elmhurst's York High School, she was accustomed to getting an A even when she cruised through a class.

She expected nothing less when she took a government course her sophomore year and let a classmate do all the work on their final group project, an advocacy video warning of the dangers of eating disorders.

This time, though, her lack of effort earned her a C — a mark that produced a curious reaction.

She didn't feel guilty. She wasn't depressed. She was insulted.

"This was just in my face," Rovi, 18, recalled recently. "I was not used to that."

Rovi belongs to a generation of teens for whom praise has often come as readily as oxygen. They've been bathed from the cradle in affirmations and awards meant to boost their self-esteem — and, by extension, their prospects in life.

But some who research the psychology of teens have concluded that this trend, born of good intentions in the Age of Aquarius, has had toxic effects.

By their estimation, today's young people have been praised so much that some flail at their first taste of criticism or failure. Others develop a keen sense of privilege, believing they'll coast into a golden future regardless of their actual talents, accomplishments or willingness to work.

"There has been a pretty big shift in expectations. Adjusting to reality is going to be different," said Jean Twenge, a San Diego State University psychology professor whose research has found soaring teen self-esteem.

Twenge's conclusion is not universally accepted — other researchers have found no significant changes in self-esteem from previous generations — but it rings true in many schools and homes. And it has some adults asking themselves hard questions.

"It's this entitlement that is driving many of us crazy. It's like, where did we go wrong?" said Rita Berger, a West Chicago mother of a teenage son and daughter. "We're kind of the root problem. In our attempt to give (this generation) everything, they have not learned to work or appreciate things."

The self-esteem movement grew out of the work of therapists like Nathaniel Branden, who in the late 1960s wrote that internal negativity could lead to lack of achievement. Change what people think of themselves, he contended, and you can change their destiny.

It was a theory in keeping with the times. Baby boomers were breaking free of traditional social structures to search for fulfillment on their own terms, and the notion of boosting one's self-esteem fit into that perfectly, Twenge said.

They carried the idea into the way they raised their kids, she said, while schools adopted policies that nurtured children's emotional well-being. The result, according to decades of data Twenge and her colleagues have mined in their research, is that youth self-esteem has risen sharply over the last 30 years, with a particularly dramatic jump since the late 1980s.

Brittany Gentile, a psychology graduate student at the University of Georgia, found that between 1988 and 2006, the average junior high student's score on the Rosenberg Self-Esteem Scale (a questionnaire that asks whether respondents agree with such statements as "On the whole, I am satisfied with myself") jumped nearly four points on a 40-point scale. The average score for a high school student went up almost two points during a similar span.

She said that while some of the increase could be due to the self-esteem movement, the rise could also reflect changes in the classroom.

Gentile cited a recent study that found twice as many high school seniors in 2006 reported earning an A average as seniors in 1976. At the same time, fewer students said they did 15 or more hours of homework each week — meaning teens are getting better grades with less work.

It is here, though, that the case for runaway self-esteem grows murky. Have teens really changed that much? Or are they simply reflecting changes in the world around them?

Take the fixation on grades. Mitchell Levy, who just graduated from Deerfield High School, said he once enlisted his parents' help to try to change his mark in a Spanish class from an A-minus to an A. They argued that a student teacher had been unduly harsh and that the good scores Levy earned when the full-time instructor returned should have received more weight.

The school declined to change the grade, and Levy said he and his parents dropped their challenge. Looking back, he called the episode "a little bit ridiculous" but said college entrance requirements have become so competitive and student evaluations so generous that even a tiny blemish can be damaging.

"If grades were harshly done, then it would be OK to get a B. But because grades are so lightly done, it can put you at a disadvantage," said Levy, who, after being wait-listed at Harvard University, plans to attend the University of Chicago in the fall.

Or take entitlement.

Mike Greene, who as caddy superintendent employs 170 teens at Wheaton's Cantigny Golf, said some live in such material splendor that they have little motivation to work hard.

"There certainly are a lot of kids in this world who are very comfortable," he said. "I think that's dangerous. They need to be hungry for something."

But Heather Nicodemus, who has one son at Grayslake Central High School and two in college, sees it differently.

She said her boys have routinely quit sports, activities and even jobs they felt were unfulfilling. Though it is far different from the way Nicodemus was brought up — "My parents said, 'Hell no, we paid our $100 (registration fee), you're not quitting,'" she recalled — she found something admirable about their willingness to walk away.

"If they're going to work so hard to accomplish something, it should be something they love," she said, adding that her sons buckle down once they find an activity that interests them.

The ultimate problem with inflated self-esteem, Twenge said, is that it can end with a painful reckoning. Alex Ortiz knows what that feels like.

As she grew up in Elmhurst, softball was her life. She had played since age 4, adoring the game and the bonds she formed with her teammates. Her e-mail address started with the handle "Softballgrl."

She was good too — or so her coaches had always told her. But then she got to York, where claiming a place on the freshman team meant surviving the cuts that followed a three-day tryout.

She didn't make it. Distraught, she gave up the game.

"I went from being told, 'You're good, you're good,' to getting told I'm not really good," said Ortiz, 16, who will be a junior in the fall. "It kind of crushed me. It felt like (earlier coaches) had been lying to me."

Others, though, say they embraced their reality checks. Rovi, the lackadaisical honor student, said she soon accepted the fairness of her C, realizing it was a better grade than her minimal effort deserved. It spurred her to work harder, she said, and she ended up graduating as an Illinois State Scholar.

John Reynolds, a sociologist at Florida State University, said that kind of adjustment appears to be common. Four years ago, he co-wrote a paper showing that high school seniors have increasingly overestimated their chances of earning a bachelor's degree or working in a professional job. He figured that would lead many unprepared students to drop out of college in a funk of despair.

But when he went back to examine the fallout, he was surprised at what he found. Students who thought they would earn a degree but failed were no more apt to suffer depression than those who succeeded.

That could indicate that their self-esteem is as bulletproof as ever. Or it could mean that getting taken down a few notches doesn't hurt as badly as some might fear.

"How long can you hold on to unrealistic self-esteem? It wouldn't last very far into your 20s," Reynolds said. "The sociological evidence says there are more important things to worry about."


 

MOCKINGBIRD STILL SINGS AFTER 50 YEARS

 Eric Zorn, Chicago Tribune 

 Of all the things to love about “To Kill a Mockingbird,” the modern classic that celebrates its 50th anniversary this weekend, the first appears in the opening paragraph.

 "When he was nearly thirteen, my brother Jem got his arm badly broken at the elbow,” begins author Harper Lee. “When it healed, and Jem's fears of never being able to play football were assuaged, he was seldom self-conscious about his injury.”

Stop right there. “Assuaged.” Lee could have chosen any number of words or phrases more likely to be understood by a boy nearly 13 or his sister, Scout, the narrator, four years younger: “eased,” “relieved,” “a distant memory.”

Instead she went with the ancient, storied and exquisite “assuaged,” which has its roots in the Latin suavis, meaning pleasant, from which we also get not only the word “suave” — meaning smooth — but also the word “sweet.” Pull it apart a bit further, and you find the etymological DNA of “persuade,” which arises from the idea of making something pleasant to another person.

So Jem's recovery doesn't simply ease his mind, it smooths and sweetens his mind; it convinces him that his future is unaltered.

My twins, who were themselves nearly 13 when we read the book together recently, reacted with impatient dismay when I lingered, as above, on “assuaged”: Two sentences into the story, and we're already pausing for pedantry?

Little did they know. Lee's rich use of language — much of it redolent with what Scout calls her father's “last will and testament diction” — is a particular pleasure of “To Kill a Mockingbird,” one that I admit I'd missed in hurrying through previous readings as a busy student and a nostalgia-minded adult.

Apothecary. Impotent. Taciturn. Unsullied. Dispatched (as a synonym for murdered). Impudent. Courteous detachment. A vapid repertoire. A malevolent phantom. Morbid nocturnal events. Predilection. Domiciled. Stumphole whiskey. Beadle. Nebulous. Meditatively. Foray.

And those are just some of the words from the first chapter I highlighted and explained when I became a parent seeing the book through the eyes of children.

Lee was not just showing off her erudition, nor was I just trying to give the kids a leg up on the vocabulary portion of their SATs. The rigorous formality of the writing — which can be easy to gloss over as you follow the intertwining stories of Boo Radley and Tom Robinson, the black man falsely accused of rape — reflects the rigorous formality of the Depression-era Alabama in which the novel is set. It also reflects the cerebral nature of Atticus Finch, Scout and Jem's father, the lawyer who coolly confronts the malignant, nonchalant racism that's part of the formal structure of his society.

Today's reader is outraged by the guilty verdict from the all-white jurors at Robinson's trial. But “To Kill a Mockingbird” is not the angry book it might have been had it been written just a few years later, when the civil rights movement with its attendant fury was in full swing.

The bigotry that convicted the innocent man is “just as much Maycomb County as missionary teas,” says the phlegmatic Atticus shortly after the verdict.

Critics have since noted that Atticus is too decent for his own good — he applies even to vicious racists his dictum “You never really understand a person until you consider things from his point of view; until you climb into his skin and walk around in it.” A true hero would never acquiesce as he does to injustice.

I see that point. Finch, so irreproachably cool and respectable, so educated and moral, plowed the ground for the true, real-life heroes whose triumphs followed the publication of this novel.

Will my children remember “assuaged” or any of the other words, terms and allusions I defined for them in what became a 172-page glossary/albatross? Will they be inspired to hunt, as they write, for the perfect word and not its merely adequate cousin? I hope at least a little.

But more, I hope they remember the biggest thing to love about “To Kill a Mockingbird” — the lesson imparted by those words that courage isn't always flashy, isn't always enough, but is always in style.

 

LATINA RAPPERS MAKE THEIR VOICES HEARD

By Ernesto Lechner, Special to the Los Angeles Times
August 10, 2010

Ana Tijoux is not your average rapper.

On "1977," the lush title track off her new album, she raps about her life so far, from childhood in exile and rebellious adolescence to maturity as a young woman. But her rhymes don't rhyme. The words don't bounce off each other with the expected repetition of most commercial fare.

Instead, Tijoux's lyrics boast an internal logic of their own. Breathlessly, she raps, manipulating syllables, exploring the beauty of the Spanish language — a staccato rhythm here, an unusual metaphor there. The result is a new sound in the burgeoning genre of Latin rap. Even Radiohead's Thom Yorke has taken notice. Recently, he listed "1977" among his current faves on his band’s Web site.

2010 is shaping up to be a transformative year for the Latina rapper. As Latin music continues to mutate and evolve in new directions, three noteworthy recent albums have a female MC at the core of their sonic DNA. There's Tijoux, who was born in France to parents from Chile and currently resides in Santiago, and two groups from Colombia: Bomba Estéreo and Chocquibtown. All three albums are on Nacional Records, a Los Angeles-based label that specializes in the Latin Alternative market.

Rapping came naturally to Tijoux. In the late '90s, she was the female MC with pioneering Chilean hip-hop group Makiza. She went solo in 2006, and continued to develop a unique flow.

"There isn't a logic or theory to what I do," Tijoux explains from her home in Santiago, where she spends much of the day taking care of her young son. "I taught myself how to rap — and eventually reached a fortuitous moment when I discovered my own style, or signature."

Tijoux's take on her artistry is as complex and contradictory as her rapping. On "Crisis de un MC," from "1977," she describes in painstaking detail the insecurities of a musician — or any artist, really: the contradiction between her natural shyness and desire to isolate herself from the world, and her need to make art with words and expose her soul onstage.

"I'm faced with an inner contradiction that is nothing short of explosive," Tijoux says. "When you're onstage, there are all these euphoric people at the venue, and for a moment, you wish that you could be at home. Interacting with an audience is a beautiful thing to do, but there's also a violence to it. When I started doing photo shoots, I would panic and sweat profusely.... No one told me that it was part of the job. Slowly, you learn how to deal with your insecurities."

Chocquibtown follows a musical path that was pioneered in the late '90s by Cuba's Orishas: party-friendly hip-hop with a distinct Afro-Caribbean zest. With their feel-good call-and-response choruses, songs like "De Donde Vengo Yo" and "Somos Pacífico," included in Chocquibtown's debut "Oro," are all about celebrating Colombia's cultural heritage without dwelling on the country's painful realities (for example: "todo el mundo quiere irse de aquí, pero nadie lo ha logrado" — everyone wants to leave this place, but no one has managed to do it.)

The trio of Miguel "Slow" Martínez, Carlos "Tostao" Valencia and Gloria "Goyo" Martínez hail from Chocó — one of the country's poorest provinces, marked by its large Afro-Colombian population. Goyo's uncle is Jairo Varela — legendary leader and composer with salsa supergroup Grupo Niche.

The influence of tropical music on Goyo cannot be overestimated; at times, the swing in her voice suggests the invisible presence of famous cumbia divas like Leonor González Mina or Totó la Momposina.

"I grew up in the town of Condoto, next to a river, surrounded by music," says Goyo, during a break from an extensive European tour. "My father was a record collector. He had a music room, devoid of light or furniture. Its only luxury was a huge LP collection: Michael Jackson, El Gran Combo, Marvin Gaye. No one could have imagined that there was music from all over the world in that little room in Condoto. And yet, his collection gave me a broad panorama of sounds. It made me the performer that I am today."

Colombia's other powerhouse female MC is Liliana Saumet, vocalist with Bogota's Bomba Estéreo. The group began as an instrumental outfit, mixing cumbia with electronica and a strong dash of psychedelia — much like Richard Blair's Sidestepper, founder of the electro-cumbia school of thought.

Alternating between rapping and singing, Saumet injected a reckless sexual intensity that permeates "Blow Up," the band's U.S. debut. Bomba's bouncy radio anthem, "Fuego," is all about fire and adrenaline. On "Cosita Rica," she describes in detail a night of clubbing and ferocious lovemaking.

"People often see my lyrics as daring, or sexual," says Saumet, who is articulate and polite, almost soft-spoken. "I grew up in the Atlantic coast of Colombia, where language is meant to be sensuous. People are warm. There's the beach, of course — the sweating, the ocean breeze touching your skin."

Goyo and Saumet come from different backgrounds within the same vast nation. And yet, both were raised to the sounds of the shimmering tropical hits that define a big part of Colombia's cultural identity.

"My mom's favorite artist was [Afro-Caribbean singer-songwriter] Joe Arroyo," offers Saumet. "I grew up singing his songs. There are a lot of outside influences in his music. The ships that arrived to the Colombian coast in the '60s and '70s brought records of funk and African music. The local sound systems would play them, and people like Joe would assimilate those influences."

Similarly, it is the collision of cultures that makes the music of Tijoux, Bomba Estéreo and Chocquibtown so intriguing. In the end, the voice of a female MC is just one of many elements that separate these bright new hopes from the competition.

"Women have always played a big part in Latin rap," says Juan Data, a San Francisco-based DJ who has been writing about the genre since the '90s. "I can't really explain why that is, but compared to American rap, the Latina MCs occupy a place of honor in this music. I guess that in a scene as macho as rap, a woman who establishes a strong presence of her own will always enjoy an extra bit of respect."


BONDING THROUGH BOOKS

Garrison Keillor

December 2, 2005

I got to put on a tux and go to the National Book Awards in New York a couple weeks ago and eat lamb chops in a hotel ballroom and breathe air recently exhaled by Toni Morrison and Norman Mailer and Lawrence Ferlinghetti and other greats and near-greats of the trade.

I was there as an innocent bystander, not a nominee (God forbid). Having never won a big prize, I am opposed to them on principle: They are an excrescence of commerce and a corruption of the purity of artistic creation. Nonetheless, it was good to see the brilliant young novelist William T. Vollmann pick up the trophy for fiction and that grand old man W.S. Merwin get the nod for poetry. If you can't be the creator of Harry Potter or the decoder of da Vinci, winning a big prize is some consolation. It gives you reason to believe you may not have wasted your life after all.

The urge to write stuff and put it between covers is powerful, as one can see by the godawful books that emerge every day--vanity, thy name is publishing!-- and anybody with the authorial urge ought to visit the underground stacks of a major public library and feel the chill of oblivion. Good writers such as Glenway Wescott, John Dos Passos, Caroline Gordon, gone, gone, gone. They had their shining moment and then descended into storage, where they wait for years to be opened. Sometimes, to placate the ghosts, I take a book off the shelf that looks as if nobody's opened it for a few decades and I open it. And then I close it.

Emily Dickinson died unpublished, and her work eventually found its way from deep anonymity to the pantheon of American Lit, and now her grave in Amherst, Mass., is one of the most beloved anywhere in the world. She is the patron saint of the meek and lonely. A devout unbeliever, she lies under a tombstone that says "Called Back," and here, every week, strangers come and place pebbles on her stone and leave notes to her folded into tiny squares. Perhaps they are unpublished poets too. As Emily said, success is counted sweetest by those who ne'er succeed. She would have known about that.

People like to speculate about her love life, but their chatter about that is dull stuff compared to the poems, the flies buzzing and the horses turning their heads toward eternity and the narrow fellow in the grass and "Hope is the thing with feathers" and all--the lady was a fine piece of Yankee free-thinking who dwelt in the richness of Victorian language. Through her poems, you can enter the mind of New England, from which seeds blew westward and blossomed across the country. You read her and, whether you know it or not, your vision of America is elevated.

One reads books in order to gain the privilege of living more than one life. People who don't read are trapped in a mineshaft, even if they think the sun is shining. Most New Yorkers wouldn't travel to Minnesota if a bright star shone in the west and hosts of angels were handing out plane tickets, but they might read a book about Minnesota and thereby form some interesting and useful impression of us. This is the benefit of literacy. Life is lonely; it is less so if one reads.

I once got on the subway at 96th and Broadway in Manhattan and sat down opposite a handsome young African-American woman who was reading a book of mine. The train rattled along and I waited for her to smile or laugh but she didn't. She did, however, keep reading. I stayed on the train past 72nd and 42nd and 34th and finally it was too much for me--if she had slapped the book shut and tossed it away, it would've hurt me so bad, so I got off at 14th and I was a more thoughtful man for the rest of the day. A writer craves readers, but what passes between him and them is so intimate that it's unbearable to sit and watch.

Questions for class discussion: (1) Is the author using irony when he declares he is opposed to prizes? (2) What is "excrescence"? (3) Have you ever sat reading a book and then realized that the author was sitting across from you and as a joke, you kept a straight face?

 

HOW ABOUT BETTER PARENTS?

Thomas L. Friedman

In recent years, we’ve been treated to reams of op-ed articles about how we need better teachers in our public schools and, if only the teachers’ unions would go away, our kids would score like Singapore’s on the big international tests. There’s no question that a great teacher can make a huge difference in a student’s achievement, and we need to recruit, train and reward more such teachers. But here’s what some new studies are also showing: We need better parents. Parents more focused on their children’s education can also make a huge difference in a student’s achievement.

How do we know? Every three years, the Organization for Economic Cooperation and Development, or O.E.C.D., conducts exams as part of the Program for International Student Assessment, or PISA, which tests 15-year-olds in the world’s leading industrialized nations on their reading comprehension and ability to use what they’ve learned in math and science to solve real problems — the most important skills for succeeding in college and life. America’s 15-year-olds have not been distinguishing themselves in the PISA exams compared with students in Singapore, Finland and Shanghai.

To better understand why some students thrive taking the PISA tests and others do not, Andreas Schleicher, who oversees the exams for the O.E.C.D., was encouraged by the O.E.C.D. countries to look beyond the classrooms. So starting with four countries in 2006, and then adding 14 more in 2009, the PISA team went to the parents of 5,000 students and interviewed them “about how they raised their kids and then compared that with the test results” for each of those years, Schleicher explained to me. Two weeks ago, the PISA team published the three main findings of its study:

“Fifteen-year-old students whose parents often read books with them during their first year of primary school show markedly higher scores in PISA 2009 than students whose parents read with them infrequently or not at all. The performance advantage among students whose parents read to them in their early school years is evident regardless of the family’s socioeconomic background. Parents’ engagement with their 15-year-olds is strongly associated with better performance in PISA.”

Schleicher explained to me that “just asking your child how was their school day and showing genuine interest in the learning that they are doing can have the same impact as hours of private tutoring. It is something every parent can do, no matter what their education level or social background.”

For instance, the PISA study revealed that “students whose parents reported that they had read a book with their child ‘every day or almost every day’ or ‘once or twice a week’ during the first year of primary school have markedly higher scores in PISA 2009 than students whose parents reported that they had read a book with their child ‘never or almost never’ or only ‘once or twice a month.’ On average, the score difference is 25 points, the equivalent of well over half a school year.”

Yes, students from more well-to-do households are more likely to have more involved parents. “However,” the PISA team found, “even when comparing students of similar socioeconomic backgrounds, those students whose parents regularly read books to them when they were in the first year of primary school score 14 points higher, on average, than students whose parents did not.”

The kind of parental involvement matters, as well. “For example,” the PISA study noted, “on average, the score point difference in reading that is associated with parental involvement is largest when parents read a book with their child, when they talk about things they have done during the day, and when they tell stories to their children.” The score point difference is smallest when parental involvement takes the form of simply playing with their children.

These PISA findings were echoed in a recent study by the National School Boards Association’s Center for Public Education, and written up by the center’s director, Patte Barth, in the latest issue of The American School Board Journal.

The study, called “Back to School: How parent involvement affects student achievement,” found something “somewhat surprising,” wrote Barth: “Parent involvement can take many forms, but only a few of them relate to higher student performance. Of those that work, parental actions that support children’s learning at home are most likely to have an impact on academic achievement at school.

“Monitoring homework; making sure children get to school; rewarding their efforts and talking up the idea of going to college. These parent actions are linked to better attendance, grades, test scores, and preparation for college,” Barth wrote. “The study found that getting parents involved with their children’s learning at home is a more powerful driver of achievement than parents attending P.T.A. and school board meetings, volunteering in classrooms, participating in fund-raising, and showing up at back-to-school nights.”

To be sure, there is no substitute for a good teacher. There is nothing more valuable than great classroom instruction. But let’s stop putting the whole burden on teachers. We also need better parents. Better parents can make every teacher more effective.

 

FACING A ROBO-GRADER? JUST KEEP OBFUSCATING MELLIFLUOUSLY


By Michael Winerip, April 22, 2012


A recently released study has concluded that computers are capable of scoring essays on standardized tests as well as human beings do.

Mark Shermis, dean of the College of Education at the University of Akron, collected more than 16,000 middle school and high school test essays from six states that had been graded by humans. He then used automated systems developed by nine companies to score those essays.

Computer scoring produced “virtually identical levels of accuracy, with the software in some cases proving to be more reliable,” according to a University of Akron news release.

“A Win for the Robo-Readers” is how an Inside Higher Ed blog post summed things up.

For people with a weakness for humans, there is more bad news. Graders working as quickly as they can — the Pearson education company expects readers to spend no more than two to three minutes per essay — might be capable of scoring 30 writing samples in an hour.

The automated reader developed by the Educational Testing Service, e-Rater, can grade 16,000 essays in 20 seconds, according to David Williamson, a research director for E.T.S., which develops and administers 50 million tests a year, including the SAT.

Is this the end? Are Robo-Readers destined to inherit the earth?

Les Perelman, a director of writing at the Massachusetts Institute of Technology, says no.

Mr. Perelman enjoys studying algorithms from E.T.S. research papers when he is not teaching undergraduates. This has taught him to think like e-Rater.

While his research is limited, because E.T.S. is the only organization that has permitted him to test its product, he says the automated reader can be easily gamed, is vulnerable to test prep, sets a very limited and rigid standard for what good writing is, and will pressure teachers to dumb down writing instruction.

The e-Rater’s biggest problem, he says, is that it can’t identify truth. He tells students not to waste time worrying about whether their facts are accurate, since pretty much any fact will do as long as it is incorporated into a well-structured sentence. “E-Rater doesn’t care if you say the War of 1812 started in 1945,” he said.

Mr. Perelman found that e-Rater prefers long essays. A 716-word essay he wrote that was padded with more than a dozen nonsensical sentences received a top score of 6; a well-argued, well-written essay of 567 words was scored a 5.

An automated reader can count, he said, so it can set parameters for the number of words in a good sentence and the number of sentences in a good paragraph. “Once you understand e-Rater’s biases,” he said, “it’s not hard to raise your test score.”

E-Rater, he said, does not like short sentences.

Or short paragraphs.

Or sentences that begin with “or.” And sentences that start with “and.” Nor sentence fragments.

However, he said, e-Rater likes connectors, like “however,” which serve as programming proxies for complex thinking. Moreover, “moreover” is good, too.

Gargantuan words are indemnified because e-Rater interprets them as a sign of lexical complexity. “Whenever possible,” Mr. Perelman advises, “use a big word. ‘Egregious’ is better than ‘bad.’ ”

The substance of an argument doesn’t matter, he said, as long as it looks to the computer as if it’s nicely argued.
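
To see how a scorer built on such surface features could be gamed, here is a minimal sketch in Python. The features and weights are invented for illustration (E.T.S. has not published e-Rater's internals), so read it as a caricature of the biases Mr. Perelman describes, not as the actual algorithm.

# A toy scorer in the spirit of Mr. Perelman's description: length,
# connectors and big words raise the score; short sentences lower it.
# Every feature and weight here is a hypothetical illustration, not
# E.T.S.'s actual e-Rater model. Note that truth is never checked.

CONNECTORS = {"however", "moreover", "furthermore", "therefore", "consequently"}

def naive_essay_score(essay: str) -> int:
    """Return a 1-to-6 score computed from surface features alone."""
    words = essay.split()
    sentences = [s for s in essay.replace("!", ".").replace("?", ".").split(".") if s.strip()]
    if not words or not sentences:
        return 1
    score = 1.0
    score += min(len(words) / 250.0, 2.0)               # longer essays score higher
    big_words = sum(1 for w in words if len(w) >= 9)    # "egregious" beats "bad"
    score += min(big_words / 10.0, 1.5)
    connectors = sum(1 for w in words if w.lower().strip('.,;:"') in CONNECTORS)
    score += min(connectors / 3.0, 1.0)                 # connectors as proxies for complex thinking
    if len(words) / len(sentences) < 8:                 # short sentences are penalized
        score -= 1.0
    return max(1, min(6, round(score)))

By this arithmetic, a padded 716-word essay sprinkled with connectors and nine-letter words outscores a tight 567-word argument, which is precisely the bias Mr. Perelman exploits.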

For a question asking students to discuss why college costs are so high, Mr. Perelman wrote that the No. 1 reason is excessive pay for greedy teaching assistants.

“The average teaching assistant makes six times as much money as college presidents,” he wrote. “In addition, they often receive a plethora of extra benefits such as private jets, vacations in the south seas, starring roles in motion pictures.”

E-Rater gave him a 6. He tossed in a line from Allen Ginsberg’s “Howl,” just to see if he could get away with it.

He could.

To their credit, researchers at E.T.S. provided Mr. Perelman access to e-Rater for a month. “At E.T.S., we pride ourselves in being transparent about our research,” Mr. Williamson said.

Two of the biggest for-profit education companies, Vantage Learning and Pearson, turned down my request to let Mr. Perelman test their products.

“He wants to show why it doesn’t work,” said Peter Foltz, a Pearson vice president.

“Yes, I’m a skeptic,” Mr. Perelman said. “That’s exactly why I should be given access.”

E.T.S. officials say that Mr. Perelman’s test prep advice is too complex for most students to absorb; if they can, they’re using the higher level of thinking the test seeks to reward anyway. In other words, if they’re smart enough to master such sophisticated test prep, they deserve a 6.

E.T.S. also acknowledges that truth is not e-Rater’s strong point. “E-Rater is not designed to be a fact checker,” said Paul Deane, a principal research scientist.

“E-Rater doesn’t appreciate poetry,” Mr. Williamson added.

They say Mr. Perelman is setting a false premise when he treats e-Rater as if it is supposed to substitute for human scorers. In high stakes testing where e-Rater has been used, like grading the Graduate Record Exam, the writing samples are also scored by a human, they point out. And if there is a discrepancy between man and machine, a second human is summoned.

Mr. Foltz said that 90 percent of the time, Pearson’s Intelligent Essay Assessor is used by classroom teachers as a learning aid. The software gives students immediate feedback to improve their writing, which they can revise and resubmit, Mr. Foltz said. “They may do five drafts,” he said, “and then give it to the teacher to read.”

As for good writing being long writing, Mr. Deane said there was a correlation. Good writers have internalized the skills that give them better fluency, he said, enabling them to write more in a limited time.

Mr. Perelman takes great pleasure in fooling e-Rater. He has written an essay, then randomly cut a sentence from the middle of each paragraph and has still gotten a 6.

Two former students who are computer science majors told him that they could design an Android app to generate essays that would receive 6’s from e-Rater. He says the nice thing about that is that smartphones would be able to submit essays directly to computer graders, and humans wouldn’t have to get involved.

In conclusion, to paraphrase the late, great Abraham Lincoln: Mares eat oats and does eat oats, but little lambs eat ivy.

A kiddley divey too, he added, wouldn’t you?
 

 

COLLEGE LESSON: THERE IS NO 'ONE'

Mary Schmich, April 29, 2012

 

Every year about this time, when the future is rapping hard on the doors of high school seniors, I get a smattering of letters from students.

They write because they haven't gotten into the college of their dreams, and they're wondering if I have any guidance or consolation.

One of these letters arrived in my email last week, from a suburban Chicago student who had been wait-listed at her preferred college.

"I have been simultaneously encouraged by everyone to stay optimistic," she wrote, "but also choose another school and mentally commit to it. The former advice was easy to follow. The latter, quite challenging. I have become one of those kids caught in a desperate love affair with a place they've only spent a few hours at. It pains me to evoke this cliche, but ... As long as I still have a hope, I am going to hope with all my heart."

I'm sympathetic. The heart is a single-minded creature, and there's virtue in the tenacity of hope.

But she's on to more than she may know when she calls her desire for this college "a desperate love affair."

Desperate love affairs are often conducted in the haze of romantic myth, fogged up by the notion that there is only one true beloved, without whom life will be just a shadow of its glorious possibilities.

This romantic myth can apply to colleges as surely as it does to human mates, and in both cases it obscures the truth: There is no perfect One, and there's more than one path to a good life.

I never yearned for one special college the way many students do. The reasons are various. By the time college was in my viewfinder, my parents had more urgent preoccupations, and so I felt no parental pressure. My friends were staying close to home for school, so I felt no peer pressure. Nor did my school counselor foster the idea that applying to college was an Olympic-level competition in which she who aims highest is likeliest to die happiest.

I applied to one school, Pomona College in California, figuring that if I got in and they gave me financial aid, I'd go, and if they didn't? Well, I'd hang with my friends at Arizona State.

I did get in to Pomona — I might not qualify today — and it changed my life. I found lifetime friends, learned how to learn, grew up. I feel loyal and indebted to the place.

But I also believe that what I gained there I could have gained at a number of other schools. I know people who went to state universities whose college years were equally fruitful.

And I've seen desperate love affairs with colleges turn into bad dates, pricey bad dates.

I know a couple of smart Chicago-area women, now in their 20s, who went away to expensive colleges they'd yearned for, only to wind up unhappy. They came back, closer to home, and prospered, less expensively.

One beauty of being 18 is the intensity of desire. You want, you dream, you reach. And the broad things so many of us want at that age — knowledge, adventure, opportunity, love in its assorted forms — are likely to be things we want all through our lives.

But the specifics — that school, that job, that person — are not as important as they seem in the clutch of desire. Love can change its direction, and in the end, college is like everything else:

It doesn't make you. Your future lies in what you make of it.

mschmich@tribune.com

 

STREAMING VIDEO'S EMERGING BOUNTY

At this point in its evolution, streaming video can still feel like your neighborhood VHS rental shop, circa 1985.
 
The shelves of the two leading services, Netflix Instant and Hulu Plus, seem to be full of films you’ve never heard of, arranged in no particular order. The latest hits haven’t arrived yet, and there’s no one around to help you out except for the digital equivalent of the surly, underpaid clerk: those “recommended for you” algorithms that pretend to know your taste but come up with the oddest suggestions imaginable. Why does Netflix keep insisting that I need to see “Celebrity Rehab With Dr. Drew”? Is it trying to tell me something?

But where there is chaos, there is also opportunity. Both Netflix and Hulu are full of hidden gems, but often it’s not easy to dig them out. Somewhere on Netflix, between Ashley Tisdale in “Sharpay’s Fabulous Adventure” and Christopher Walken and Jennifer Beals in “The Prophecy II,” there’s a very good copy of Edgar G. Ulmer’s 1948 film noir “Ruthless” in its full 105-minute version, rather than the 88-minute public domain cut that’s been the only edition available for years. To find it, though, you have to know it’s there.

One useful service is instantwatcher.com, an independent Web site that monitors the Netflix streaming library (and has a beta site up for Hulu, at instantwatcher.com/hulu, that for the moment only covers Hulu’s free, commercially supported programming). Instantwatcher keeps track of the new releases on Netflix, as well as those about to expire, and offers several searchable sub-indexes: pages devoted to various genres, languages and years of release; lists of films recommended by Rotten Tomatoes and the critics of The New York Times (including links back to The Times’s reviews).

Streaming video will probably never live up to the utopian dream of many cinephiles: the notion that every movie ever made will suddenly be available with the click of a mouse. Neither Netflix nor Hulu is particularly friendly toward older films. Instantwatcher reports only 26 Netflix streaming titles for the banner year 1939, more than half of them B westerns (but also “One Third of a Nation,” a fascinating and quite rare offshoot of the Federal Theater Project’s Living Newspaper Unit). The number of offerings doesn’t achieve triple digits until 1984 (with 108 titles, many of them television episodes) and builds to a peak with 1,007 titles available for 2009 (a bonanza for fans of reality shows).

But Netflix shows some surprising strengths, particularly with those studios — Paramount, Universal, MGM/UA and Fox — that have most drastically cut back on their library releases to DVD. In most cases these are not impeccably restored, newly mastered editions. Too many of the wide-screen films of the ’50s and ’60s are presented in unwatchable, pan-and-scan prints, though there are some surprising exceptions, like Richard Sale’s 1955 “Gentlemen Marry Brunettes,” which even turns up in high-definition.

But perfection is not always a virtue, and streaming, with its forgivingly low resolution, provides a perfectly acceptable showcase for movies that do not exist in the flawless prints now considered essential for DVD, and particularly Blu-ray, release. There are many movies of interest without reputations or stars big enough to justify the expense of a full-scale digital restoration, but I cling to the conviction that it’s better for films to be seen with dust spots or dubious color than not seen at all. Streaming does, or should, open a niche for films that otherwise wouldn’t be economically viable.

Netflix, for example, makes it possible to follow the late career of Mitchell Leisen, one of Paramount’s most gifted contract directors, with five otherwise unseeable films including the noirish melodrama “No Man of Her Own,” starring Barbara Stanwyck. Here too are nine films from the erratic but interesting Lewis R. Foster, three starring Ronald Reagan in his slipping-down days as a movie star: “Cavalry Charge” (a k a “The Last Outpost”), “Hong Kong” and “Tropic Zone”.

It’s particularly gratifying to see a handful of titles emerging from Republic Pictures, a very rich library that Viacom, its current owner, has allowed to fall into disuse. Famous for its Gene Autry and Roy Rogers westerns of the ’30s and ’40s, Republic also produced a wide range of crime films, musicals and melodramas that have been virtually invisible for decades. Just by returning a fraction of this material to the public eye Netflix would be doing a valuable service to critics and historians, as well as fans.

Hulu’s main focus remains episodic television, a rich field in itself. Only on Hulu Plus can you see the full three-season run of “The Alfred Hitchcock Hour,” including the one episode personally directed by Hitchcock (“I Saw the Whole Thing”) and subliminal classics like “An Unlocked Window,” directed by Joseph M. Newman from a script by James Bridges.

On the movie side Hulu Plus offers a few hundred titles, mostly recent releases from independent distributors. But the great strength of the service is its large and rapidly increasing selection of films from the Criterion Collection, including many titles the company has not yet released in any format.

There’s a huge infusion of films from Japan, including masterpieces by Kenji Mizoguchi (“The Life of Oharu,” “Utamaro and His Five Women,” “The Story of the Last Chrysanthemum,” “Princess Yang Kwei-fei” and “The 47 Ronin”) and Mikio Naruse (“Wife,” “Mother,” “Ginza Cosmetics,” “Flowing”), as well as genre films from Hideo Gosha (“Bandits vs. Samurai Squadron”), Kiyoshi Kurosawa (“Cure”), Kenji Misumi (the “Hanzo the Razor” trilogy) and Seijun Suzuki (“Everything Goes Wrong”).

The otherwise unavailable French titles include classics like Marcel Carné’s “Hotel du Nord” and Jacques Feyder’s “Carnival in Flanders” and Robert Bresson’s “Man Escaped,” and rediscoveries like Jean Grémillon’s “Remorques” and Raymond Bernard’s “Anne-Marie.” From Italy, Criterion has supplemented the recent Eclipse release of four films by Raffaello Matarazzo with two more titles, “He Who Is Without Sin” and “Torna!,” by that master of Italian melodrama; Mario Monicelli’s “Organizer” (1963), with Marcello Mastroianni as a trade unionist in 19th-century Turin, is a historical drama on a scale with Visconti’s “Leopard” but seasoned with Monicelli’s wit.

Some of these new titles seem to be passing through streaming on their way to a full-scale DVD release (and, in the case of the Mizoguchi films, Blu-ray seems almost a moral obligation). Others are too obscure (Mikhail Romm’s 1962 Soviet drama “Nine Days One Year”) or in too rough condition (Hanns Schwarz’s 1937 “Return of the Scarlet Pimpernel”) to make a disc release seem practical.

But all the promise of streaming video as a new platform lies right there: in revalorizing films that don’t fit the dominant business model. There’s no shortage of movies in this world; what we need are new ways to see them.

 

THE TROUBLE WITH ONLINE EDUCATION

By Mark Edmundson

“AH, you’re a professor. You must learn so much from your students.”

This line, which I’ve heard in various forms, always makes me cringe. Do people think that lawyers learn a lot about the law from their clients? That patients teach doctors much of what they know about medicine?

Yet latent in the sentiment that our students are our teachers is an important truth. We do in fact need to learn from them, but not about the history of the Roman Empire or the politics of “Paradise Lost.” Understanding what it is that students have to teach teachers can help us to deal with one of the most vexing issues now facing colleges and universities: online education. At my school, the University of Virginia, that issue did more than vex us; it came close to tearing the university apart.

A few weeks ago our president, Teresa A. Sullivan, was summarily dismissed and then summarily reinstated by the university’s board of visitors. One reason for her dismissal was the perception that she was not moving forward fast enough on Internet learning. Stanford was doing it, Harvard, Yale and M.I.T. too. But Virginia, it seemed, was lagging. Just this week, in fact, it was announced that Virginia, along with a number of other universities, signed on with a company called Coursera to develop and offer online classes.

But can online education ever be education of the very best sort?

It’s here that the notion of students teaching teachers is illuminating. As a friend and fellow professor said to me: “You don’t just teach students, you have to learn ’em too.” It took a minute — it sounded like he was channeling Huck Finn — but I figured it out.

With every class we teach, we need to learn who the people in front of us are. We need to know where they are intellectually, who they are as people and what we can do to help them grow. Teaching, even when you have a group of a hundred students on hand, is a matter of dialogue.

In the summer Shakespeare course I’m teaching now, I’m constantly working to figure out what my students are able to do and how they can develop. Can they grasp the contours of Shakespeare’s plots? If not, it’s worth adding a well-made film version of the next play to the syllabus. Is the language hard for them, line to line? Then we have to spend more time going over individual speeches word by word. Are they adept at understanding the plot and the language? Time to introduce them to the complexities of Shakespeare’s rendering of character.

Every memorable class is a bit like a jazz composition. There is the basic melody that you work with. It is defined by the syllabus. But there is also a considerable measure of improvisation against that disciplining background.

Something similar applies even to larger courses. We tend to think that the spellbinding lecturers we had in college survey classes were gifted actors who could strut and fret 50 amazing minutes on the stage. But I think that the best of those lecturers are highly adept at reading their audiences. They use practical means to do this — tests and quizzes, papers and evaluations. But they also deploy something tantamount to artistry. They are superb at sensing the mood of a room. They have a sort of pedagogical sixth sense. They feel it when the class is engaged and when it slips off. And they do something about it. Their every joke is a sounding. It’s a way of discerning who is out there on a given day.

A large lecture class can also create genuine intellectual community. Students will always be running across others who are also enrolled, and they’ll break the ice with a chat about it and maybe they’ll go on from there. When a teacher hears a student say, “My friends and I are always arguing about your class,” he knows he’s doing something right. From there he folds what he has learned into his teaching, adjusting his course in a fluid and immediate way that the Internet professor cannot easily match.

Online education is a one-size-fits-all endeavor. It tends to be a monologue and not a real dialogue. The Internet teacher, even one who responds to students via e-mail, can never have the immediacy of contact that the teacher on the scene can, with his sensitivity to unspoken moods and enthusiasms. This is particularly true of online courses for which the lectures are already filmed and in the can. It doesn’t matter who is sitting out there on the Internet watching; the course is what it is.

Not long ago I watched a pre-filmed online course from Yale about the New Testament. It was a very good course. The instructor was hyper-intelligent, learned and splendidly articulate. But the course wasn’t great and could never have been. There were Yale students on hand for the filming, but the class seemed addressed to no one in particular. It had an anonymous quality. In fact there was nothing you could get from that course that you couldn’t get from a good book on the subject.

A truly memorable college class, even a large one, is a collaboration between teacher and students. It’s a one-time-only event. Learning at its best is a collective enterprise, something we’ve known since Socrates. You can get knowledge from an Internet course if you’re highly motivated to learn. But in real courses the students and teachers come together and create an immediate and vital community of learning. A real course creates intellectual joy, at least in some. I don’t think an Internet course ever will. Internet learning promises to make intellectual life more sterile and abstract than it already is — and also, for teachers and for students alike, far more lonely.

Mark Edmundson, a professor of English at the University of Virginia, is the author of “Why Read?”

 

BATMAN’S WORLD OF DREAD, ON SCREEN AND OFF

A policeman stands outside a movie theater in New York on Friday during a showing of "The Dark Knight Rises."

With a severity unusual for a summer picture, even one depicting the winter of Gotham City's discontent, "The Dark Knight Rises" grinds the audience's guts as it imagines a metropolis nearly beyond saving. It is a prime example of popular entertainment invested, wholly, in dread.

Those who emerge from "The Dark Knight Rises" mightily impressed speak of it as a strange sort of survival test. This is how it is in 2012, in a world without much job stability or stability of any kind. On our couches or out in the world, surrounded by others, we spend a lot of time and money and psychic energy steeling ourselves for the worst. We measure ourselves against dire apocalyptic scenarios invented for our enjoyment. We immerse ourselves in the war games of "Call of Duty," wondering if we can survive to the next level.

Or we pay to witness mass dread writ large, on an IMAX screen, if we can swing the up-charge for "The Dark Knight Rises."

The dread has now spilled over into the real world. "The Dark Knight Rises" opened to the public just after midnight Friday in theaters across the country, including the Century 16 multiplex in Aurora, Colo. This is where a gunman, identified as 24-year-old James Holmes, opened fire 30 minutes after the latest and most dire Batman movie began, first detonating some sort of tear gas or smoke bomb and then killing 12 moviegoers and injuring 59 others, some critically.

Holmes wore black, a flak jacket and a gas mask when he entered the theater. Law enforcement officials, among them New York City Police Commissioner Ray Kelly, confirmed Friday that, in Kelly's words, Holmes also "had his hair painted red, he said he was 'The Joker,' obviously the 'enemy' of Batman."

This will be enough, in itself, to reduce the tragedy to a tale of a sick mind with a thing for Heath Ledger. But was it really that simple? Was this a psychopathic act of homage to the previous "Dark Knight" film, released in 2008, which began with Ledger's Joker detonating a smoke bomb after murdering his bank-robber colleagues and innocent civilians?

What happened in a movie theater just 15 miles from the 1999 Columbine High School massacre occurred in the midst of a film depicting — in lingering, protracted detail — a worst-case terrorism scenario. Director Christopher Nolan's film has been praised widely for its depiction of Gotham City under siege. For some of us, though, what worked with sinister skill in "The Dark Knight" turned morbid, rancid, in the new picture, quite apart from the horrific events of early Friday morning.

As Joe Morgenstern wrote in The Wall Street Journal: The film "makes you feel thoroughly miserable about life. It's spectacular, to be sure, but also remarkable for its all-encompassing gloom. No movie has ever administered more punishment, to its hero or its audience, in the name of mainstream entertainment." Slate critic Dana Stevens noted its "repeated scenes of bone-crunching violence" and characterized it as "something of an ordeal."

Law enforcement officials around the country are providing additional security for this weekend's "Dark Knight Rises" screenings, in the wake of the Aurora killings. New York was doing so, Kelly said, "as a precaution against copycats and to raise the comfort levels among movie patrons." A heavy irony, since the movie's own comfort levels are set so low.

Warner Bros., the distributor of "The Dark Knight Rises," on Friday pulled from theaters a coming-attractions trailer for "Gangster Squad," starring Sean Penn as mob boss Mickey Cohen. In one scene, Cohen's henchmen open fire with machine guns in a crowded movie theater. It's a hideous sight in the shadow of Aurora. But we see that sort of carnage every day, on one screen or another.

No film, even one made by a talented director of serious ambition, can explain or capture a nation's spreading nervous breakdown. We pay good money to watch that breakdown in fictional action, an hour or two (or, in the case of Nolan's film, nearly three) at a time.

It'll be a long time, if ever, before "The Dark Knight Rises" can be watched for what it is — grave masterwork or grim ordeal or both — rather than for what happened early Friday. I'm not sure why, exactly, but the saddest thing I read after the killings came from Tom Mai, a neighbor of the Holmes family in suburban San Diego. In an Associated Press interview, Mai described Holmes, who recently endured some troubles at school and, like millions, couldn't find full-time work, as "a typical American kid."

Meaning?

He "kept to himself," Mai said.

And he "didn't seem to have many friends."

mjphillips@tribune.com

 

UNNATURAL SELECTION

Confessions of an Application Reader

Lifting the Veil on the Holistic Process at the University of California, Berkeley

By RUTH A. STARKMAN

 

A highly qualified student, with a 3.95 unweighted grade point average and 2300 on the SAT, was not among the top-ranked engineering applicants to the University of California, Berkeley. He had perfect 800s on his subject tests in math and chemistry, scores of 5 on five Advanced Placement exams, musical talent and, in one of two personal statements, had written a loving tribute to his parents, who had emigrated from India.

Why was he not top-ranked by the “world’s premier public university,” as Berkeley calls itself? Perhaps others had perfect grades and scores? They did indeed. Were they ranked higher? Not necessarily. What kind of student was ranked higher? Every case is different.

The reason our budding engineer was a 2 on a 1-to-5 scale (1 being highest) has to do with Berkeley’s holistic, or comprehensive, review, an admissions policy adopted by most selective colleges and universities. In holistic review, institutions look beyond grades and scores to determine academic potential, drive and leadership abilities. Apparently, our Indian-American student needed more extracurricular activities and engineering awards to be ranked a 1.

Now consider a second engineering applicant, a Mexican-American student with a moving, well-written essay but a 3.4 G.P.A. and SATs below 1800. His school offered no A.P. He competed in track when not at his after-school job, working the fields with his parents. His score? 2.5.

Both students were among “typical” applicants used as norms to train application readers like myself. And their different credentials yet remarkably close rankings illustrate the challenges, the ambiguities and the agenda of admissions at a major public research university in a post-affirmative-action world.

While teaching ethics at the University of San Francisco, I signed on as an “external reader” at Berkeley for the fall 2011 admissions cycle. I was one of about 70 outside readers — some high school counselors, some private admissions consultants — who helped rank the nearly 53,000 applications that year, giving each about eight minutes of attention. An applicant scoring a 4 or 5 was probably going to be disappointed; a 3 might be deferred to a January entry; students with a 1, 2 or 2.5 went to the top of the pile, but that didn’t mean they were in. Berkeley might accept 21 percent of freshman applicants over all but only 12 percent in engineering.

My job was to help sort the pool.

We were to assess each piece of information — grades, courses, standardized test scores, activities, leadership potential and character — in an additive fashion, looking for ways to advance the student to the next level, as opposed to counting any factor as a negative.

External readers are only the first read. Every one of our applications was scored by an experienced lead reader before being passed on to an inner committee of admissions officers for the selection phase. My new position required two days of intensive training at the Berkeley Alumni House as well as eight three-hour norming sessions. There, we practiced ranking under the supervision of lead readers and admissions officers to ensure our decisions conformed to the criteria outlined by the admissions office, with the intent of giving applicants as close to equal treatment as possible.

The process, however, turned out very differently.

In principle, a broader examination of candidates is a great idea; some might say it is an ethical imperative to look at the “bigger picture” of an applicant’s life, as our mission was described. Considering the bigger picture has aided Berkeley’s pursuit of diversity after Proposition 209, which in 1996 amended California’s constitution to prohibit consideration of race, ethnicity or gender in admissions to public institutions. In Fisher v. the University of Texas, the Supreme Court, too, endorsed race-neutral processes aimed at promoting educational diversity and, on throwing the case back to lower courts, challenged public institutions to justify race as a factor in the holistic process.

In practice, holistic admissions raises many questions about who gets selected, how and why.

I could see the fundamental unevenness in this process both in the norming Webinars and when alone in a dark room at home with my Berkeley-issued netbook, reading assigned applications away from enormously curious family members. First and foremost, the process is confusingly subjective, despite all the objective criteria I was trained to examine.

In norming sessions, I remember how lead readers would raise a candidate’s ranking because he or she “helped build the class.” I never quite grasped how to build a class of freshmen from California — the priority, it was explained in the first day’s pep talk — while seeming to prize the high-paying out-of-state students who are so attractive during times of a growing budget gap. (A special team handled international applications.)

In one norming session, puzzled readers questioned why a student who resembled a throng of applicants and had only a 3.5 G.P.A. should rank so highly. Could it be because he was a nonresident and had wealthy parents? (He had taken one of the expensive volunteer trips to Africa that we were told should not impress us.)

Income, an optional item on the application, would appear on the very first screen we saw, along with applicant name, address and family information. We also saw the high school’s state performance ranking. All this can be revealing.

Admissions officials were careful not to mention gender, ethnicity and race during our training sessions. Norming examples were our guide.

Privately, I asked an officer point-blank: “What are we doing about race?”

She nodded sympathetically at my confusion but warned that it would be illegal to consider: we’re looking at — again, that phrase — the “bigger picture” of the applicant’s life.

After the next training session, when I asked about an Asian student who I thought was a 2 but had only received a 3, the officer noted: “Oh, you’ll get a lot of them.” She said the same when I asked why a low-income student with top grades and scores, and who had served in the Israeli army, was a 3.

Which them? I had wondered. Did she mean I’d see a lot of 4.0 G.P.A.’s, or a lot of applicants whose bigger picture would fail to advance them, or a lot of Jewish and Asian applicants (Berkeley is 43 percent Asian, 11 percent Latino and 3 percent black)?

The idea behind multiple readers is to prevent any single reader from making an outlier decision. And some of the rankings I gave actual applicants were overturned up the reading hierarchy. I received an e-mail from the assistant director suggesting I was not with the program: “You’ve got 15 outliers, which is quite a lot. Mainly you gave 4’s and the final scores were 2’s and 2.5’s.” As I continued reading, I should keep an eye on the “percentile report on the e-viewer” and adjust my rankings accordingly.

In a second e-mail, I was told I needed more 1’s and referrals. A referral is a flag that a student’s grades and scores do not make the cut but the application merits a special read because of “stressors” — socioeconomic disadvantages that admissions offices can use to increase diversity.

Officially, like all readers, I was to exclude minority background from my consideration. I was simply to notice whether the student came from a non-English-speaking household. I was not told what to do with this information — except that it may be a stressor if the personal statement revealed the student was having trouble adjusting to coursework in English. In such a case, I could refer the applicant for a special read.

Why did I hear so many times from the assistant director? I think I got lost in the unspoken directives. Some things can’t be spelled out, but they have to be known. Application readers must simply pick it up by osmosis, so that the process of detecting objective factors of disadvantage becomes tricky.

It’s an extreme version of the American non-conversation about race.

I scoured applications for stressors.

To better understand stressors, I was trained to look for the “helpful” personal statement that elevates a candidate. Here I encountered through-the-looking-glass moments: an inspiring account of achievements may be less “helpful” than a report of the hardships that prevented the student from achieving better grades, test scores and honors.

Should I value consistent excellence or better results at the end of a personal struggle? I applied both, depending on race. An underrepresented minority could be the phoenix, I decided.

We were not to hold a lack of Advanced Placement courses against applicants. Highest attention was to be paid to the unweighted G.P.A., as schools in low-income neighborhoods may not offer A.P. courses, which are given more weight in G.P.A. calculation. Yet readers also want to know if a student has taken challenging courses, and will consider A.P.’s along with key college-prep subjects, known as a-g courses, required by the U.C. system.

Even such objective information was open to interpretation. During training Webinars, we argued over transcripts. I scribbled this exchange in my notes:

A reader ranks an applicant low because she sees an “overcount” in the student’s a-g courses. She thinks the courses were miscounted or perhaps counted higher than they should have been.

Another reader sees an undercount and charges the first reader with “trying to cut this girl down.”

The lead reader corrects: “We’re not here to cut down a student.” We’re here to find factors that advance the student to a higher ranking.

Another reader thinks the student is “good” but we have so many of “these kids.” She doesn’t see any leadership beyond the student’s own projects.

Listening to these conversations, I had to wonder exactly how elite institutions define leadership. I was supposed to find this major criterion holistically in the application. Some students took leadership courses. Most often, it was demonstrated in extracurricular activities.

Surely Berkeley seeks the class president, the organizer of a volunteer effort, the team captain. But there are so many other types of contributions to evaluate. Is the kindergarten aide or soup kitchen volunteer not a leader?

And what about “blue noise,” what the admissions pros called the blank blue screen when there were no activities listed? In my application pile, many students from immigrant households had excellent grades and test scores but few activities. I commented in my notes: “Good student, but not many interests or activities? Why? Busy working parents? And/or not able to afford, or get to, activities?”

In personal statements, we had been told to favor the “authentic” voice over writing that bragged of volunteer trips to exotic places or anything that “smacks of privilege.”

Fortunately, that authentic voice articulated itself abundantly. Many essays lucidly expressed a sense of self and character — no small task in a sea of applicants. Less happily, many betrayed the handiwork of pricey application packagers, whose cloying, pompous style was instantly detectable, as were canny attempts to catch some sympathy with a personal story of generalized misery. The torrent of woe could make a reader numb: not another student suffering from parents’ divorce, a learning difference, a rare disease, even dandruff!

As I developed the hard eye of a slush pile reader at a popular-fiction agency, I asked my lead readers whether some of these stressors might even be credible. I was told not to second-guess the essays but simply to pick the most worthy candidate. Still, I couldn’t help but ask questions that were not part of my reader job.

The assistant director’s words — look for “evidence a student can succeed at Berkeley” — echoed in my ears when I wanted to give a disadvantaged applicant a leg up in the world. I wanted to help. Surely, if these students got to Berkeley they would be exposed to all sorts of test-taking and studying techniques.

But would they be able to compete with the engineering applicant with the 3.95 G.P.A. and 2300 SATs? Does Berkeley have sufficient support services to bridge gaps and ensure success? Could this student with a story full of stressors and remedial-level writing skills survive in a college writing course?

I wanted every freshman walking through Sather Gate to succeed.

Underrepresented minorities still lag behind: about 92 percent of whites and Asians at Berkeley graduate within six years, compared with 81 percent of Hispanics and 71 percent of blacks. A study of the University of California system shows that 17 percent of underrepresented minority students who express interest in the sciences graduate with a science degree within five years, compared with 31 percent of white students.

When the invitation came to sign up for the next application cycle, I wavered. My job as an application reader — evaluating the potential success of so many hopeful students — had been one of the most serious endeavors of my academic career. But the opaque and secretive nature of the process had made me queasy. Wouldn’t better disclosure of how decisions are made help families better position their children? Does Proposition 209 serve merely to push race underground? Can the playing field of admissions ever be level?

For me, the process presented simply too many moral dilemmas. In the end, I chose not to participate again.

Ruth A. Starkman teaches writing and ethics at Stanford and, from 1992 to 1996, taught writing at the University of California, Berkeley.

BERKELEY ON BERKELEY ADMISSIONS

“In general, we have an incredibly successful story to tell about our process,” said Amy Jarich, who has been director of admissions at the University of California, Berkeley, since September.

 

In an interview, Ms. Jarich responded to some of the issues raised by Ruth A. Starkman in her essay on the training of outside application readers and Berkeley’s admissions process — a process Ms. Jarich calls transparent. (Freshman selection criteria and reports on comprehensive review can be found on Berkeley’s Web site.)

 

“The training process is tried and true,” she said. “We try to do consistent training that helps people understand the policies and also the practice.”

On the application examples used in training, she said, “we intentionally pick the trickiest cases to norm with, aimed at generating discussion,” after which many new readers have to adjust their scoring.

 

Noting that reading applications is “an art,” she said that Proposition 209 was a challenge that created the need for readers to separate out in their minds race, ethnicity and gender. “Other factors, like reported family income, do not make the decision for us, but they do inform us as we read in context.”

 

“We’re very sensitive to the fact that we want to pull in a socioeconomically diverse group,” she said, naming several programs in place to help students graduate.

To further diversify, the chancellor has set a goal that 20 percent of students come from outside California, she said. Calling the in-state/out-of-state argument “so political,” she added: “It’s hard to close your mind to it, but in the review process it’s not a factor.” Nor are candidates compared, she said. “Nobody should say we have too many of one and not enough of another.”

 

“The student reports to us their G.P.A., and shows us every strength and every marker,” she said. Readers in the application-review stage should not consider “anything that’s out of that student’s control.”

 

QUESTIONING THE NATURE OF EDUCATION

August 23, 2013

By Troy Jollimore

Mark Edmundson, author of "Why Teach?", is a professor of English at the University of Virginia.

Mark Edmundson has a message for this fall’s crop of first-year university students: If you’re going to university in search of what universities have traditionally offered — which is to say, an education — you had better be prepared to meet some resistance. Edmundson asserts in his book “Why Teach?” that universities have largely lost their way: Their administrators, and many of their faculty, are less interested in developing their students' intellectual capacities, in giving them access to the treasures of our culture and in helping them discover themselves and grow as people than they are in keeping students happy with flashy technology, pop culture references and inflated grades.

Edmundson is a professor of English at the University of Virginia, and when he talks about a "good" or "real" education, he means a liberal arts education: one that emphasizes the humanities, in particular the reading of literature, and takes self-development rather than career preparation as its primary goal. The fact that most universities, and most students, now focus on job training and show little interest in exploring perpetually perplexing questions and trying to impart deep values strikes Edmundson as disturbing and wrongheaded. "What does it mean," he asks, "for a university to stop seeing itself as having something like a spiritual mission and begin acting like a commercial venture?"

Regardless of what it means, there is no question that universities have undergone a radical shift in the way education is perceived. The people who run universities — and many of the people who teach in them — no longer believe in the value of learning for its own sake, let alone such hoary ideas as truth, virtue or wisdom. What they care about is pleasing the students so that those students will continue to enroll and pay the tuition that funds the university's operations, and so that those students will give high evaluations now required of professors for retention and promotion. And, as Edmundson points out, students who have been raised in a consumption-based society in which the fundamental values are monetary, the most respected virtues are agreeableness and speed, and the highest conceivable end is to be constantly diverted and entertained are unlikely to demand to be challenged, made uncomfortable or forced to confront and critique their basic beliefs.

Yet it is those students who suffer. "The quest at the center of a liberal arts education is not a luxury quest: it's a necessity quest," Edmundson writes. "If you do not undertake it, you risk leading a life of desperation — maybe quiet; maybe, in time, very loud — and I am not exaggerating. For you risk trying to be someone other than who you are, which, in the long run, is killing."

Unfortunately, Edmundson asserts, most university education is concerned with making us other than who we are. "Current schooling, from the primary grades through college, is about tooling people to do what society (as its least imaginative members conceive it) needs done. We are educated to fill roles, not to expand our minds and deepen our hearts."

Some will find it easy to scoff at such lofty sentiments. Indeed, scoffing — particularly when lofty sentiments are the target — has become something close to an automatic reflex in our society, and I think Edmundson is on to something when he points out how much harm is done by the desire to look cool, to avoid showing enthusiasm, to appear above sincere expressions of genuine feeling. It used to be that professors were willing to display a passionate interest in the subjects to whose studies they had devoted their lives. If this alienated some students, or invited a certain degree of easy mockery, it also served as an encouraging example and role model for those students who were potentially capable of passionate interest and commitment. To avoid such displays, as so many professors do now in the interest of trying to look cool (or at least relevant), is to rob students of this opportunity.

Edmundson's passionate dedication to and enthusiasm for teaching make the book, despite the grimness of its portrait of American universities today, a spirited and cheering read. I had the occasional quibble with Edmundson's claims — and once or twice I wasn't quite clear what he was getting at — but for the most part I found his descriptions, diagnoses and suggestions accurate and insightful, even inspiring.

I found myself wanting to give copies of "Why Teach?" not only to all of my university colleagues but to my students. Perhaps some of them would dismiss Edmundson as a crank, a Luddite or some sort of malcontent. But others, I'm sure, would recognize him for who he is: an optimist and an idealist who believes that true education makes students into something far more valuable than consumerist robots and who believes that universities can still offer this kind of education. All it would take is for those of us who have chosen teaching as our profession to remember that we made this choice because we believed in values that transcend the shallow and deeply corrosive values of the marketplace, and to find the integrity to proclaim our belief in those values to each other, to our students, and to the world.

Troy Jollimore is a 2013 Guggenheim fellow and a philosophy professor at California State University, Chico. His books include "Love's Vision" and "At Lake Scugog: Poems."

 

WHAT OUR SCHOOLS CAN’T DO  – BUT PARENTS CAN

By Phillip Jackson

Chicago schools by themselves can never close the academic gap between poor black and Latino students and their more affluent white and Asian counterparts. Schools cannot eliminate the racial academic achievement gap because schools did not create it. This gap comes to schools with children from their homes, families and communities.

The gap, which is well-established before kindergarten, widens during the first three years of schooling. And from third grade through high school, the academic achievement gap remains relatively steady. A recent survey by the Organization for Economic Co-operation and Development shows the racial gap’s prevalence even among adults in the United States.

Until the proper roles and limitations of schools are understood, they will almost always fail at educating poor, ethnic students. Effective teachers and administrators are critical to the successful educational process, but they are not miracle workers. Very few teachers can compensate for years, decades and sometimes even centuries of educational deficiencies in homes, families and communities.

Parents, families, communities, societal structures, value systems, peer groups, networks, cultures and community institutions are the best sources of education for children. Without these structures and institutions in place and operating optimally, even the best schools will struggle to educate children.

At a minimum, for schools to succeed, they must have students with basic skills who want to learn, actively engaged parents who are invested in their child’s educational success, and teachers and administrators who are skilled and passionate about their profession.

For at least the past 70 years, we have been asking public schools to do what no school can do. We ask schools to substitute for broken family structures and decimated communities; to impart moral and spiritual values to children; to teach children discipline and self-control; to teach them to want to learn and to inspire them to succeed; and to teach children to make positive and proper life choices.

Public schools were not designed to do any of these things well. And because schools spend so much time trying to teach things that they cannot teach, many times they fail at teaching the things that they ought to be teaching — reading, writing, math and science.

Government and schools have important roles and responsibilities for educating children. However, their first responsibility is to powerfully engage parents and communities in their children’s educational lives.

Effective parents, families and communities can do what no school can do: create the culture, lay the foundation, set the trajectory, establish the momentum, insist upon high academic standards and model the behaviors and habits necessary for globally relevant learning and the educational and life success for all children in every community. This is what no school can do.

Phillip Jackson is the founder and executive director of The Black Star Project.

©2013

 

IT’S TIME TO DUMP STANDARDIZED TESTS

Dennis Byrne

Here's a way to dramatically increase classroom teaching time in Chicago and elsewhere while saving a ton of money:

Get rid of most or all standardized testing.

If you eliminated all the time that teachers now must prepare for and administer tests, you could recover days of instruction time each school year, according to a 2013 American Federation of Teachers report.

It found that in one Midwest school district, test preparation and testing consumed 19 full school days. In another district, in the East, the tests gobbled up a full month and a half of school days in heavily tested grades.

The burden comes as no surprise to teachers and administrators. Nor to the students who are the victims — yes, victims — of this overload. Nor to the parents who are faced with stressed children.

Confronting this scourge, teachers here and in other cities, including New York and Los Angeles, last week conducted a National Day of Action on Testing. I don't often agree with much of what Karen Lewis, president of the Chicago Teachers Union, says, but she hit the bull's-eye when she called the standardized-test blizzard a "destructive national trend."

"In general, standardized tests are devised from afar, not locally," she said. "It has been documented again and again that these multimillion-dollar, rigidly prescribed, standardized testing programs often aim to judge students against measures that have little or nothing to do with what the classroom teacher has taught or is expected to teach."

What these tests do is fuel the egos and purses of bureaucrats in Springfield and Washington, academics who populate education schools (who seem to escape any blame for today's poor educational attainment) and political ideologues who wish to impose on the world their notion of what kids must learn.

Thanks to them, our schools are afflicted with top-down, one-size-fits-all education standards, the latest being something called Common Core State Standards. After all, if you intend to impose uniform and strict standards, you've got to have standardized tests to measure everyone's conformance to those standards. In effect, to out those who fail to conform.

Never mind that it also fuels a multibillion-dollar industry that sets standards, creates tests and feeds off taxpayer-funded grants to further rationalize and reinforce the system.

For this state of affairs, both liberals and conservatives wear the collar. Conservatives latched onto the idea decades ago when their hackles were raised by the use, in some schools, of Ebonics — a nonstandard form of English spoken by some American blacks. Conservatives wanted "basics" taught, not just in English and literature, but also in math and social studies.

That viewpoint softened conservative opposition to centralized government just enough to enable President George W. Bush, with the help of liberals, to pass one of the most intrusive programs in history — No Child Left Behind, a failed pie-in-the-sky boondoggle.

But it didn't take long for the liberals who overpopulate the education industry to hijack the idea for their own purposes — the employment of goofy pedagogy, the conversion of American history into narrow political interpretations that favor "progressive" agendas. And so forth.

But now, opposition to the Common Core standards has created an unlikely alliance between some liberals and conservatives who together are fed up with the reality of failed attempts at central planning and control of our schools. Some liberals even see standardized testing as a tool of a sinister eugenics movement, a "science" that attempts to improve the human "breed" by weeding out the "inferior" stock.

What to do? Anti-testing forces are circulating authorization forms that allow parents to have their children opt out of standardized testing. If enough of you do it, that will screw up the test results to a point of uselessness, although they already are often useless and bogus.

The real dagger would be to deep-six the U.S. Department of Education, or at least plow all of its $70 billion budget directly into classroom instruction.

Then the question will become: Without such "oversight," how will we know if teachers and schools are performing, even if just adequately? It strikes me that the responsibility will fall on parents (or, too often, one parent), teachers and principals, much as it did back when so many schools weren't so atrocious.

Dennis Byrne, a Chicago writer, blogs in the Barbershop on chicagonow.com. dennis@dennisbyrne.net

© 2013

 

HOW HIPSTERS RUINED PARIS

by Thomas Chatterton Williams

 

PARIS — The northern edge of Paris’s Ninth Arrondissement, near the Place Pigalle, was once known as “la Nouvelle-Athènes,” both for the neo-Classical flourishes of its most graceful blocks and for the creative geniuses who swept in to inhabit them.

This was the original “gay Paree” on display in Edouard Manet’s “Bar at the Folies-Bergère,” a Bohemia of near-mythical proportions in which every tier of society — from the well heeled to the creative to the horizontally employed — collided in the district’s cafes, theaters and cabarets. It was the Paris of Alexandre Dumas, Victor Hugo, Gustave Moreau and Pierre-Auguste Renoir.

Paris has long been a palimpsest of different cities, each new iteration grafted on top of the still visible last, spanning the extremes of human excellence and beauty and, just as crucially, filth and squalor. The area around Pigalle in particular — which American G.I.’s aptly called “Pig Alley” — was always a mixture of both, its seediness informing the artistic production and spirit of numerous generations of inhabitants. You can see it in Edgar Degas’s brush strokes and hear it in Edith Piaf’s voice.

But it’s disappearing. Today, the neighborhood has been rechristened “South Pigalle” or, in a disheartening aping of New York, SoPi. Organic grocers, tasteful bistros and an influx of upscale American cocktail bars are quietly displacing the pharmacies, dry cleaners and scores of seedy bars à hôtesses that for decades have defined the neighborhood.

These “hostess bars,” marked by barely dressed women perched in the windows, are the direct descendants of the regulated brothels that thrived here from Napoleon’s time until the postwar purge of the 1940s. The French daily Libération reports that in 2005 there were 84 such establishments around Pigalle. Today there are fewer than 20. Their disappearance is a watermark of the quarter’s rapid loss of grit and character alike.

When my wife and I first moved here in 2011, I wasn’t sure what to make of living in the middle of a functioning red-light district. Our neighborhood, though safe and well on its way to gentrification, remained funky in the original sense of the term. In addition to cigarette smoke and baking bread, there was the whiff of dirt and sex in the air. It took a while for me to get used to the tap-tapping on windows — or hissing and tongue clicking from open doors — that greeted me as I passed the bars on my way to fill a prescription or buy a bottle of Pouilly-Fumé.

I have never quite gotten used to the transsexual hookers who traipse the Boulevard de Clichy outside the area’s various sex shops and with whom I must share the carnivalesque sidewalk on my way in and out of the post office. Frankly, they make me uncomfortable.

But I’ve come to see that unease as a good thing the longer I stay in this corner of France, a country where the world’s oldest profession continues to enjoy a special patrimonial status and where, try as it might, the government can’t seem to un-sew that tawdry patch from the national quilt. (It is now considering criminalizing johns, which prompted incensed writers and luminaries to pen a spirited manifesto in protest.)

We should be grateful to be jolted from our anesthetized routines, confronted when we can be with surroundings and neighbors that are not injection-molded to the contours of our own bobo predilections. Too much of modern urban life revolves around never feeling less than fully at ease, around having even the minutest of experiences tailored to a set of increasingly demanding and homogeneous tastes — from the properly sourced coffee grounds that make the morning’s flat white to the laboriously considered iPod soundtracks we rely on to cancel the world’s noise. The logical extension is to “curate” our urban spaces like style blogs or Pinterest boards representing a single, self-satisfied and extremely sheltered expression of middle- and upper-middle-class sensibility.

Outside my window, and adjacent to a baby boutique that stocks cashmere swaddle blankets, is a nondescript Asian massage parlor. On nice summer days, there is one masseuse who likes to prop open the door, pull her chiropractic table into the fresh air and sunbathe between clients. Once I watched a well-turned-out mother with toddler approach as the woman was smoking a cigarette. Instead of giving the kind of not-in-my-backyard glare I imagined her Park Slope counterpart might unleash, she just asked the masseuse for a light. They shared a few friendly words before going their separate ways, leaving me to wonder why I thought that should be odd.

Such encounters are getting rarer by the week, but they remind me that genuinely engaging with an urban space means encountering and making room for an assortment of lifestyles and social realities — some appealing, some provocative, and some repulsive. This is what the Situationists meant by psychogeography, or, as Guy Debord put it, the “specific effects of the geographical environment (whether consciously organized or not) on the emotions and behavior of individuals.”

Down the street, where Henri de Toulouse-Lautrec once had his studio, you now must pass a store called “Pigalle” — a high-end streetwear purveyor — and then Buvette Gastrotèque, the handsome new Paris outpost of a faux-French restaurant and bar from the West Village.

From there a left turn puts you at the intersection of Rue Victor Massé and Rue Frochot, where, in the space of one half-block, three hostess bars have recently been shuttered and reopened as upscale cocktail lounges. That number includes the famous Dirty Dick, now a Polynesian-themed luxury rum bar, with the name and grungy facade kept ironically intact. Inside, the atmosphere is far more beach bum than bordello; the most subversive element is a smoking room in the back.

Directly opposite, beside a dilapidated DVD shop, black-clad bouncers assemble a velvet rope each night in front of a pristine new bar called Glass. It is the brainchild of a polyglot team of N.Y.U. grads who have decided (correctly, judging by their success) that what Parisians want most these days are tacos, hot dogs and homemade tonic water in their G & Ts. Le F’Exhib — the lone holdout on the block, where the girls and the ravaged exterior seemed to age in tandem — finally closed its doors this fall.

And so a vivid and storied layer of authentic Paris is being wiped out not by not-in-my-backyard activism, government edict or the rapaciousness of Starbucks or McDonald’s but by the banal globalization of hipster good taste, the same pleasant and invisible force that puts kale frittata, steel-cut oats and burrata salad on brunch tables from Stockholm to San Francisco.

Drifting through these streets, as they are scrubbed clean and homogenized before my eyes, my thoughts turn to Blaise Pascal, who once wrote “a man does not show his greatness by being at one extremity, but rather by touching both at once.” The same, of course, could be said of neighborhoods. The nicer this one gets, the more it seems to feel like the one I left behind in Brooklyn.

People say you had to be in Paris in the ’20s or New York in the ’80s. The sad truth of our contemporary moment seems to be that you no longer need to be anywhere in particular.

The brunch is all the same.

 


 

Thomas Chatterton Williams is the author of “Losing My Cool: Love, Literature and a Black Man’s Escape From the Crowd,” and is now at work on a novel about a shooting on Long Island.