Maximizing shareholder value: The goal that changed corporate America - The Washington Post

Published: August 26

ENDICOTT, N.Y. — This town in the hills of Upstate New York is best known as the birthplace of IBM, one of the country’s most iconic companies. But there remain only hints of that storied past.

The main street, once swarming with International Business Machines employees in their signature white shirts and dark suits, is dotted with empty storefronts. During the 1980s, there were 10,000 IBM workers in Endicott. Now, after years of layoffs and jobs shipped overseas, about 700 employees are left.

Investors in IBM’s shares, by contrast, have fared much better. IBM makes up the biggest portion of the benchmark Dow Jones industrial average and has helped drive that index to record highs. Someone who spent about $16,000 buying 1,000 shares of IBM in 1980 would now be sitting on more than $400,000 worth of stock, a 25-fold return.
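The arithmetic behind that figure can be checked directly. The sketch below uses only the numbers cited in the article (the purchase cost, the current value, and the 1980 start date) and ignores dividends, splits and inflation:

```python
# Back-of-the-envelope check of the IBM return described above,
# using only the article's figures (dividends, splits, inflation ignored).
cost_1980 = 16_000       # ~1,000 shares bought in 1980
value_2013 = 400_000     # stated value of the position today

multiple = value_2013 / cost_1980
print(f"{multiple:.0f}-fold return")   # matches the article's "25-fold"

# Implied compound annual growth rate over the ~33 years since 1980:
years = 2013 - 1980
cagr = multiple ** (1 / years) - 1
print(f"implied annual growth: {cagr:.1%}")
```

A 25-fold gain over 33 years works out to roughly 10 percent compounded annually, consistent with the article's figures.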

It used to be a given that the interests of corporations and communities such as Endicott were closely aligned. But no more. Across the United States, as companies continue posting record profits, workers face high unemployment and stagnant wages.

Driving this change is a deep-seated belief that took hold in corporate America a few decades ago and has come to define today’s economy — that a company’s primary purpose is to maximize shareholder value.

The belief that shareholders come first is not codified by statute. Rather, it was introduced by a handful of free-market academics in the 1970s and then picked up by business leaders and the media until it became an oft-repeated mantra in the corporate world.

Together with new competition overseas, the pressure to respond to the short-term demands of Wall Street has paved the way for an economy in which companies are increasingly disconnected from the state of the nation, laying off workers in huge waves, keeping average wages low and threatening to move operations abroad in the face of regulations and taxes.

This all presents a quandary for policymakers trying to combat joblessness and raise the fortunes of lower- and middle-class Americans. Proposals by President Obama and lawmakers on Capitol Hill to change corporate tax policy, for instance, are aimed at the margins of company behavior when compared with the overwhelming drive to maximize shareholder wealth.

“The shift in what employers think of as their role not just in the community but [relative] to their workforce is quite radical, and I think it has led to the last two jobless recoveries,” said Ron Hira, an associate professor of public policy at the Rochester Institute of Technology.

The change can be seen in statements from IBM’s leaders over the years. When he was IBM’s president and chief executive, Thomas J. Watson Jr., son of the company’s founder, spoke explicitly about balancing a company’s interests with the country’s. Current chief executive Virginia Rometty has pledged to follow a plan called the “2015 Road Map” in which the primary goal is to dramatically raise the company’s earnings-per-share figure, a metric favored by Wall Street.

Job cuts have come this summer — the biggest wave in years at the company. In Essex Junction, Vt., about 450 workers were axed in June. In Dutchess County, N.Y., 700 jobs were lost. In Endicott, at least 15 workers were told to leave.

Retired software developer Linda Guyer saw the change over her 29-year career. In the beginning, “it was a wonderful place to work — maybe the way Google is today, really innovative,” said Guyer, 59, who used to work for IBM in Endicott. But after training her overseas replacements and then being pushed into early retirement, Guyer said, “you end up feeling really cynical.”

In 2009, IBM stopped breaking out how many workers it has in the United States vs. other countries. The company, based in Armonk, N.Y., probably began employing more workers in India than in this country around the same time, according to an analysis by Dave Finegold, a professor at the Rutgers School of Management and Labor Relations.

Many things have changed over the years about IBM, which has one of the oldest continuous histories of any company in the world. The firm that pioneered the floppy disk, an early version of the ATM and one of the earliest best-selling PCs now makes nearly as much money selling consulting services as it does software. Defenders argue that the company has had to reinvent itself so many times to stay alive that the values of Watson are no longer as easy to apply as they used to be.

Doug Shelton, a spokesman for IBM, said that globalization and increased competition make it hard to compare the company with its earlier days under Watson and that it still has the biggest and most talented technology workforce in the world. Shelton said that the company’s head count has expanded every year since 2002 and that it is hiring for positions across the United States.

“Change is constant in the technology industry,” Shelton said in a statement. “IBM is investing in growth areas for the future: big data, cloud computing, social business and the growing mobile computing opportunity. The company has always invested in transformational areas, and as a result, we must remix our skills so IBM can lead in these higher-value segments, in both emerging markets and in more mature economies.”

The cultural shifts in corporate America have not been confined to IBM. The company is merely representative of what has happened at most large, globalized U.S. firms. But some experts wonder whether these companies have gone too far, leaving the rest of the country behind.

“We don’t build companies to serve Wall Street,” said Margaret Blair, a professor at Vanderbilt Law School. “We build corporations to provide goods and services to a society and jobs for people.”

‘Social responsibility’

In the decades after World War II, as the U.S. economy boomed, the interests of companies, shareholders, society and workers appeared to be in tune. Towns such as Endicott flourished.

As recently as 1981, the Business Roundtable trade group recognized the need to balance these different stakeholders.

“Corporations have a responsibility, first of all, to make available to the public quality goods and services at fair prices, thereby earning a profit that attracts ­investment to continue and enhance the enterprise, provide jobs, and build the economy,” the group said at the time, in a document cited this year in an article in the publication Daedalus.

It continued: “The long-term viability of the corporation depends upon its responsibility to the society of which it is a part. And the well-being of society depends upon profitable and responsible business enterprises.”

But changes were already afoot in the academic world that would reshape the fundamental relationship between this country and its companies.

Lynn Stout, a professor of corporate and business law at Cornell University Law School, traces the transformation to the rise of the “Chicago school” of free-market economists.

In 1970, Nobel Prize-winning economist Milton Friedman wrote an article in the New York Times Magazine in which he famously argued that the only “social responsibility of business is to increase its profits.”

Then in 1976, economists Michael Jensen and William Meckling published a paper saying that shareholders were “principals” who hired executives and board members as “agents.” In other words, when you are an executive or corporate director, you work for the shareholders.

Stout said these legal theories appealed to the media — the idea that shareholders were king simplified the confusing debate over the purpose of a corporation.

More powerfully, it helped spawn executive pay tied to share prices, and with it the huge growth in stock-option compensation. As a result, average annual executive pay has quadrupled since the early 1970s.

Part of this was a backlash to the dismal performance of the stock market during the 1970s, a decade that brought negative returns for investors. There was also the perception that companies, including IBM, had become lax in their management. Pressing executives to boost their returns created a new kind of accountability, just as the economy was becoming more globalized and more competitive.

The shift was dramatic. By 1997, the Business Roundtable had a new statement, also unearthed in the Daedalus article. It stated that the principal objective of a business enterprise “is to generate economic returns to its owners” and that if “the CEO and the directors are not focused on shareholder value, it may be less likely the corporation will realize that value.”

The mantra that executives and corporate board members have a duty to maximize shareholder value has become so ingrained that many people assume it must be codified somewhere.

But legal experts say there is no statute in state or federal law requiring corporations and executives to maximize shareholder value. Blair, the professor at Vanderbilt, said that courts in fact allow wide latitude for managers and directors when it comes to business decisions.

“Let me be clear that this pressure comes from the media, from shareholder advocates and financial institutions in whose direct interest it is for the company to get its share price to go up,” Blair said in testimony before a House hearing in 2008, “and from the self-imposed pressure created by compensation packages that provide enormous potential rewards for directors and managers if stock prices go up.”

Some who defend the use of shareholder value as a measuring stick for corporate success argue that with retirees depending on stocks, whether through pension funds or 401(k)s, rising share prices benefit more than just Wall Street.

“If you stick it to the equity holders, you’re going to stick it to the retirees,” said Charles Elson, director of the John L. Weinberg Center for Corporate Governance at the University of Delaware.

Philosophical changes

Like the Business Roundtable statements that changed over time, the message from companies such as IBM shifted as well.

Watson published a seminal text in 1963 called “A Business and Its Beliefs: The Ideas that Helped Build IBM.” In it, he wrote that IBM’s philosophy could be contained in three beliefs: the first, and most important, was respect for the individual employee; the second, a commitment to customer service; the third, the pursuit of excellence.

He wrote that balancing profits against the well-being of employees and the nation’s interest is a necessary duty for companies. Watson took pride in the fact that his father avoided layoffs, even through the Great Depression.

“We acknowledge our obligation as a business institution to help improve the quality of the society we are part of,” read the text of IBM’s corporate values.

Under Watson’s watch, IBM introduced groundbreaking computers that shot his father’s company to the top of the technology world. Even into the 1980s, there was a saying that IBM’s products were so reliable, “nobody ever got fired for buying IBM.”

But by the time Louis V. Gerstner Jr. took over IBM in the early 1990s, the company was in trouble. Its main advantage in the PC business was eroding, and expenses were high compared with those of competitors.

Months into his tenure, Gerstner cut about 60,000 workers, at the time one of the biggest layoffs at a U.S. corporation.

In 1994, Gerstner outlined his own set of eight principles, a clear break from the old document. Near the top was that the company’s primary “measures of success” were shareholder value and customer satisfaction. The last one: “We are sensitive to the needs of all employees and to the communities in which we operate.”

Gerstner pulled off a turnaround considered legendary by those who study business history.

The culture at IBM was irrevocably changed, too. The chief executives who followed Gerstner have pushed the company hard to hit ambitious financial targets designed to please analysts on Wall Street. In the process, the iconic IBM charted a path that other companies have followed.

One of the most influential changes took place in 1999, when IBM overhauled its pension plan under Gerstner to help cut costs, shocking longtime employees.

Guyer, the former IBM software developer, said she still remembers the surprise of getting a letter in the mail showing her cash balance for retirement after about two decades at the company: $30,000.

“It was like, ‘Oh, my God, we’ve been totally ripped off,’ ” she said.

IBM employees later filed a class-action lawsuit over the pension changes. In 2004, the company agreed to pay $320 million to current and former employees in a settlement.

William Lazonick, an expert on industrial competitiveness at the University of Massachusetts at Lowell, said the pension-plan change was a watershed moment.

“IBM was a critical company, because everybody after that said, ‘If IBM is going in that direction, we’ll all go in that direction,’ ” Lazonick said. “By 2000, really the whole system had changed.”

The company has continued tinkering with its retirement benefits. Late last year, it changed its 401(k) contribution policy so that IBM matches employee savings just once a year rather than throughout the year. The company said it was making the change to stay competitive, but the new plan also means that employees who lose their jobs before a set date in December do not see any of the matching funds.

Since Gerstner’s time running the company, the pressure to please shareholders has only ratcheted up. Samuel J. Palmisano, chief executive from 2002 to 2011, charted new goals in 2010, calling the plan IBM’s “2015 Road Map.” The primary objective: nearly doubling earnings per share, to $20.

IBM’s current chief executive, Rometty, has picked up where Palmisano left off. The company’s 2012 annual report notes that the company’s road map “delivers long-term value and performance for all key IBM stakeholders — investors, clients, employees and society.”

But as sales flatten, questions have emerged about how the company will hit its ambitious target, aside from slashing jobs.

“This is a horrible business model,” said Lee Conrad, a coordinator for Alliance@IBM, a group that advocates for company employees. “It’s all about the EPS [earnings per share] and not about growing the business. The customers are being impacted by this when good employees are being cut. It’s just a mess.”

Guyer said everyone in the office used to have a copy of Watson’s manifesto on IBM’s principles, the one that says “respect for the individual” came first. The company had its own printing press, so it was easy to get the book.

By the time she left, she did not see the book around as much. She remembers rescuing one from a trash can once. And her copy? It is stored in her garage somewhere, a bittersweet souvenir from her corporate career.

Noonan: Work and the American Character - WSJ.com


Two small points on an end-of-summer weekend. One is connected to Labor Day and the meaning of work. It grows out of an observation Mike Huckabee made on his Fox show a few weeks ago. He said that we see joblessness as an economic fact, we talk about the financial implications of widespread high unemployment, and that isn't wrong but it misses the central point. Joblessness is a personal crisis because work is a spiritual event.

A job isn't only a means to a paycheck, it's more. "To work is to pray," the old priests used to say. God made us as many things, including as workers. When you work you serve and take part. To work is to be integrated into the daily life of the nation. There is pride and satisfaction in doing work well, in working with others and learning a discipline or a craft or an art. To work is to grow and to find out who you are.

In return for performing your duties, whatever they are, you receive money that you can use freely and in accordance with your highest desire. A job allows you the satisfaction of supporting yourself or your family, or starting a family. Work allows you to renew your life, which is part of the renewing of civilization.

Work gives us purpose, stability, integration, shared mission. And so to be unable to work—unable to find or hold a job—is a kind of catastrophe for a human being.

There are an estimated 11.5 million unemployed people in America now, and counting those who lack sufficient work or who've left the workforce altogether would swell that number further.

This is the real reason jobs and employment are the No. 1 issue in America's domestic life. And what I have been thinking in the weeks leading up to this weekend is very simple: "Thank you, God, that I have a job." May more of us be able to say those words on Labor Day 2014.

And may more political leaders come up who can help jobs happen, who can advance and support the kind of national policies that can encourage American genius. One of the things missing in the current political scene is zest—a feeling that can radiate from the political sphere that everything is possible, the market is wide open.

In the midst of the economic malaise of the 1970s the TV anchormen spoke in sonorous tones about the dreadful economic indicators—inflation, high interest rates, "the misery index." But Steve Jobs, in his parents' garage, was quietly working on circuit boards. And strange young Bill Gates was creating a company called Microsoft. All that work burst forth under the favorable economic conditions and policies in the 1980s and '90s.

What is needed now is a political leader on fire about all the possibilities, not one who tries to sound optimistic because polls show optimism is popular but someone with real passion about the idea of new businesses, new inventions, growth, productivity, breakthroughs and jobs, jobs, jobs. Someone in love with the romance of the marketplace. We've lost that feeling among our political leaders, who mostly walk around looking like they have headaches. But American genius is still there, in our garages. It's been there since before Ben Franklin and the key and the kite and the bolt of lightning.

***

The second point is about a kind of cultural unease in the country that is having an impact on the national mood. I think it's one of the reasons the right track/wrong track polls are bad.

To make the point, we go back in political time.

Really good politicians don't try to read the public, they are the public. They don't try to be like the people, they actually are like them. Ronald Reagan never thought of himself as a gifted reader of the public mind, but as a person who had a sense of what Americans were thinking because he was thinking it too. That's a gift, and a happy one to have—the gift of unity with the public you lead. The lack of that quality can be seen in many current political figures, who often, when they speak, seem to be withholding their true thoughts. As if the people wouldn't like it, or couldn't handle it.

Reagan was a good man, and part of his leadership was that he thought Americans were good too. He had high respect for what he saw as the American character. He liked to talk about the pioneers because he was moved by their courage, their ability to endure and forge through hostile conditions. He thought that was a big part of the American character. He was similarly moved by the Founders. He talked about the men who founded Hollywood, too, because those old buccaneers were great entrepreneurs who invented an industry. He admired their daring and willingness to gamble. They were wealth creators—that's who Americans are. He liked to talk about inventors who create markets—that's us, he thought. He liked to talk about barn raisings—the practice out West of local settlers coming together to build some neighbor's barn, so pretty soon they'd have a clearing and then a town.

By celebrating these things he felt he was celebrating not the America that was, but the America that is. That America, he felt, was under threat of being squashed and worn down by the commands and demands of liberalism. He would fight that and, he thought, win, because Americans saw it pretty much as he did.

So Reagan didn't just have something, the ability to lead. He was given something—the America he grew up in, knew and could justly laud.

To today: I've been thinking about the big bad stories of the summer, the cultural ones that disturb people. The sick New York politician who, without apparent qualms, foists his sickness into the public sphere again. The kids who kill the World War II vet because they're bored. The kids who kill the young man visiting from Australia because they too are bored, and unhappy, and unwell. The teacher who has the affair with the 14-year-old student, and gets a slap on the wrist from the judge. The state legislator who's a sexual predator, the thieving city councilor and sure, the young pop star who is so lewd, so mindlessly vulgar and ugly on the awards show.

We're shocked. But we're not shocked. And that itself is disturbing. We're used to all this, now, this crassness and lowness of public behavior. The cumulative effect of these stories, I suspect, is that we're starting to fear: Maybe that's us. Maybe that's who we are now. As if these aren't separate and discrete crimes and scandals but a daily bubbling up of the national character.

It would be good if we had some political leaders who could speak of this deflated and anxious feeling about who we are. Conservatives have been concerned about our culture for at least a quarter-century. Helpful now would be honest liberal voices that speak to our concerns about who we fear we're becoming. They might find they're thinking the way the American people are thinking, which is step one in true leadership.


Stephen.Bates@gmail.com | Tel/Txt +1 202 730-9760
mobile.short.typos

Mining social media: The new way of life - The Washington Post


By Walter Pincus, Published: Wednesday, August 7, 8:29 PM ET

Ever heard of Raptor X, a specialized computer tool that, when used with a privately developed plug-in called “Social Bubble,” can display the geographic location of Twitter users and their posted tweets? In addition, Raptor X could potentially capture related commercial entities and even financial transactions.

The government created Raptor X. The Special Operations Command’s National Capital Region (SOCOM NCR) organization, located here in the Washington area, used it to data-mine social media as part of the 2012 Project Quantum Leap experiments.

The project’s purpose was to identify “strategies and techniques for exploiting open sources of information, particularly social media in support of a counter threat finance mission.”

That’s a quote from the draft of an unclassified NCR after-action report released Tuesday by Steven Aftergood of the Federation of American Scientists’ Project on Government Secrecy.

Many people love the convenience of the Internet and cellphones and ever-multiplying social-media applications. What many don’t always focus on, however, is how easily outsiders can invade their lives.

The June disclosure that the National Security Agency is collecting everyone’s telephone records and storing them for five years as part of anti-terrorism efforts has caused an uproar.

Get used to it. The gathering of such data, whether by private commercial enterprises, hackers or governments — ours or foreign ones — is part of 21st-century life.

NCR’s Quantum Leap is another peek into that future.

The project, designed to improve federal interagency coordination in dealing with different threats or scenarios, involved about 50 people from government and private industry and was to last over six months. Hard to know if it did, because SOCOM is not commenting.

The released draft covers only the first experiment, which dealt with countering the financing activities of terrorists, insurgents, human traffickers, weapons proliferators and international organized crime.

It used a real money-laundering case that up to then involved $2.5 billion and was being investigated by several elements of the Department of Homeland Security, led by Immigration and Customs Enforcement and the Homeland Security investigations division. Quantum Leap participants came from Defense Department and non-defense law enforcement and regulatory agencies.

The threat network involved multinational and U.S.-based corporate entities, shell and shelf companies, dozens of individuals, and assets in the millions to billions of dollars, according to the draft.

What is stunning is that the project identified more than 300 traditional and nontraditional open sources as potentially relevant to the activity. These ranged from public sources such as the Patent Classification System, which has a lot of free business information, to subscription-based sources that sell specialized financial and business data.

Of course there was also access to non-public government data, such as banking-secrecy and transactional activities obtained by law enforcement personnel.

Data on real estate transactions, transportation and logistics are almost always public records but are difficult to gather unilaterally. Much of the data can be obtained from the commercial sector on a subscription basis.

When it comes to gathering information about individuals, the draft notes, “Fortunately, penetration of social media, preponderance of publicly available Personal Identifying information databases and sources, and advancements in available analytical tools significantly improve the ability to rapidly and accurately do human entity resolution from open sources.”

The Department of Energy’s Special Technologies Laboratory was the developer of the Raptor X open architecture.

Creative Radicals, a San Francisco design and development firm, created Social Bubble, the Twitter search tool that “was heavily used to explore human networks associated with the counter-finance threat scenario and enabled identification of various entities: people, businesses and locations associated with the money laundering network,” according to the draft.

Red Cell Intelligence Group of Arlington specialized in collecting data from public or commercial sources to create a searchable database of international banking relationships. Green Line System, a commercial maritime tracking and analysis company, “demonstrated the ability to track a particular ship or ships associated with a named company” in Quantum Leap, the draft said.

One of the major lessons learned was the “pronounced utility of social media in exploiting human networks, including networks in which individual members actively seek to limit their exposure to the Internet and social media,” the draft report said.

Among other lessons:

●Location-based services are becoming prevalent and much more accepted in social media by the younger generation.

●Exploiting social media could become more or less difficult in the coming months, depending on whether security methods improve.

●Legal reviews of the uses and applications of social media are just beginning and inevitably will transform notions of privacy.

A data revolution is underway, with private industry and government leading the way. But as the response to the NSA disclosures shows, this country is not yet prepared for it.





great FT weekend @longread on @natesilver: #Bigdata

http://www.ft.com/intl/cms/s/2/469049e2-fa37-11e2-98e0-00144feabdc0.html

Nate Silver: Big data’s biggest figure

Nate Silver photographed for the FT Magazine in New York. © Dylan Coulter

Nate Silver was down on Anthony Weiner’s chances long before the selfie-snapping former congressman’s campaign to become New York’s mayor had to contend with the publication of a second wave of X-rated messages and priapic self-portraits.

“I think his favourables were low enough that he had a cap on his support from the get-go,” says the 35-year-old data-blogger as he perches, straight-backed, on the edge of a black leather couch in his Manhattan loft. Huma Abedin, Weiner’s wife and an aide to Hillary Clinton, would have won in a landslide, he adds. The unfortunately named candidate has been a gift to the city’s tabloids but a metropolitan mayoral race feels a little small for Silver: the man who predicted how 49 of America’s 50 states would vote in 2008 and then swept the board in 2012.

The one-time economics student and KPMG consultant looks every inch the nervy nerd in glasses, brown suit trousers and pale blue shirt, but the age of “big data” has made numeracy hip. Statisticians have become stars, from the authors of Freakonomics to Billy Beane, who applied data to baseball at the Oakland A’s and ended up being portrayed by Brad Pitt in Moneyball.

Nearly a year after publishing The Signal and the Noise, his bestseller on how human foibles make most of us poor predictors of anything from card games to climate change, Silver finds people staring at him in airports, “not because your zipper’s undone, but because they saw you on TV”. Characteristically, he has reduced fame to an equation: “You can almost do a mathematical function, where it’s based on when were you last on TV and how high-profile is the outlet. The half-life is actually pretty short,” he laughs.

He is still invited to retell the story of how he consistently foresaw Barack Obama’s victory over Mitt Romney even as professional pundits weakly dubbed the race too close to call. Yet the biennial rhythms of US general elections and midterms have left Silver with itchy feet in the lull of an odd-numbered year. “It’s hard to tell the same story 30 or 40 times,” he says. “I want to have movement and vitality and challenge myself a little bit.”

This summer has provided that opportunity. In 2010, he licensed FiveThirtyEight.com – his political data blog named after the number of votes in the US electoral college – to The New York Times for three years. As the clock ticked on the deal, he was courted by other news organisations, Wall Street firms, sports teams and Hollywood studios. The NYT wheeled out editors and executives to persuade him to stay but in July he defied expectations by selling FiveThirtyEight to ESPN. Walt Disney’s deep-pocketed sports network “was kind of a 9.5 out of 10 or a 9.8 out of 10 or a 10 out of 10,” Silver explained on a conference call announcing his decision, with only slightly less precision than his readers have come to expect. The deal, “web-focused” at first but likely to include on-air appearances for coverage of major sporting events and Disney’s ABC News broadcasts, gives Silver a chance to mix politics with his original passion for applying overlooked statistics to forecast baseball players’ performances.

He is still a Detroit Tigers fan, able to enjoy games over a beer with friends rather than analysing them against his model, but he admits to having become “disengaged” from baseball as electoral calendars clashed with the October World Series season climax.

More than a return to sport, though, the former full-time poker player now plans to create a much more ambitious web property. ESPN, the wealthiest network on US television, has the budget for him to hire a few dozen journalists, editors and analysts who can use data to shed new light on anything from tennis and football to the economy, the weather, teacher performance, college choices, local restaurants, the Oscars or even foreign policy.

Silver tackled some of these subjects in his book, which he stresses had only one chapter on politics. The one-time high-school debate champion, who makes his points with focused confidence, says it is time to branch out because he has proved his point in politics “so there are diminishing returns in terms of intellectual interest for me”.

. . .

Despite his mild-mannered reporter look, Silver has become a punchy voice in debates over the future of a profession shaken by disrupted business models and new competitors. Some political journalists may dismiss him as a mere blogger whose work could not be compared with theirs and Silver’s sources may be numbers rather than nominees, but he says what he does is “almost certainly journalism”. Bald men and combs come to mind when shrinking newsrooms argue over who can define themselves as a journalist, and press-pack conformity is not for Silver, who told Out magazine last year that being gay had encouraged his independent-mindedness. (He described himself as “sexually gay but ethnically straight”.)

“I’m not in the business to make friends,” he says coolly, explaining that he thinks reporters overrate the value of “quote-unquote inside information” from their political sources. “You’re being spun by someone. These people are professional liars, basically,” he says, sounding affronted on the truth’s behalf. He is equally aggravated by Washington pundits who routinely get their predictions wrong. Recalling one Twitter feud with a journalist he called out for his inaccurate projection on the eve of the 2012 vote, he says: “If you don’t have that instinct for accountability, I don’t know if you can really think of yourself as a journalist, or at least not a good journalist.”

Reporting both sides’ spin does not make you objective, Silver adds, but “I think you have a lot of people in DC who are very detached from reality”. He sees that most clearly in the media’s tendency to focus on outlying polls and anecdotal pointers. It is one thing to note the number of Romney signs in front of houses in a swing state, he argues, but quite another to take them as evidence of the race being closer than most polls indicate.

People are usually too sure of themselves when they make predictions, his book notes, but general elections are a rare exception. A desire to be even-handed may help explain why political reporters seem to preserve the fiction of a close race until the end. “It’s very easy to say it’s too close to call. You won’t get yourself in trouble for saying that.”

Silver has caused plenty of trouble by stating the opposite. Republican supporters accused him of bias, with Dean Chambers of Unskewed Polls calling him “effeminate”. (That prompted Gawker founder Nick Denton, a friend, to publish a picture of Silver with his ex-boyfriend the day after November’s election, saying “I hope it drives the homophobes crazy.”) Silver is dismissive of such attacks, saying: “There are some people who think that if you have a model that says Obama has an 80 per cent chance of winning, it’s a partisan forecast.”

Politico, the Washington news outlet, has harrumphed loudest, calling Silver overrated. “Some of his stuff goes on and on, trying to use numbers to prove stuff that I don’t think can be proved by numbers alone,” its executive editor said in June. There was similar pushback in parts of The New York Times. David Brooks, one of its columnists, argued that when polls move from offering snapshots of the current mood to projecting, “they’re getting into silly land”. Margaret Sullivan, the NYT public editor, said last month that some colleagues had found him “disruptive”. Silver says that claims of tension in a newsroom he calls “a team of rivals” were “a little overblown”.

“It’s not our intention – and when I say ‘our’ I mean me and my counsel – it’s not our intention to rehash the past,” he says carefully. He is frustrated by inaccurate accounts of his exit from the NYT, but won’t set the record straight: “There’s this dilemma where you’re being professional by being careful and confidential and other people are being less careful and confidential,” he says.

The public editor acknowledged Silver’s “unmatched” ability to bring traffic to the NYT’s website, particularly from elusive young readers. At the election season’s height, when traffic to FiveThirtyEight peaked at 10 times normal levels, one in five NYT digital readers was visiting the blog. Silver notes the jump in digital subscriptions the NYT enjoyed that quarter, saying pointedly: “I can’t say that was all FiveThirtyEight but I think it brought a lot of value to the paper.” Post-election, his non-politics posts attracted more traffic than his political pieces, he adds.

He won’t say what ESPN paid for FiveThirtyEight. He could perhaps have made more by taking his algorithmic aptitude to a hedge fund, but the option did not appeal: “It’s not like I’m struggling by any means but I would find it much more satisfying to make a good to very good income, where you’re building out the brand in the public eye, than to go and work for a hedge fund.”

. . .

His 12th-storey apartment, close to Penn Station, is calm and sparsely decorated but not hedge-fund showy. The self-confessed design snob has broken up the white walls with a few well-picked pieces of contemporary art. The windows offer a view of the spires of a Catholic church, a few rooftop wooden water towers and Madison Square Garden, home of the New York Knicks.

But he is more interested in data than the view. As a child in Lansing, Michigan – the son of a political scientist father who would analyse stadium capacity figures for fun and an activist mother with a PhD in French history – Silver could multiply two-digit numbers in his head: taking 48 and 54 and coming up with 2,592.

The baseball pundit: Silver at the home of the Chicago White Sox, US Cellular Field, in 2008 (©Sports Illustrated)

An algorithm brought him his first fame. His Pecota model, which stands for “Player Empirical Comparison and Optimisation Test Algorithm”, compared baseball players with other similar individuals in major-league history. Silver’s model looked past the commonly watched stats, such as batting average, and assigned greater weight to less-quoted ones, such as how often a batter gets on base, which correlated more to teams winning games.
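The gap between the commonly quoted stat and the more predictive one can be made concrete. The sketch below uses the standard public formulas for batting average and on-base percentage — not Silver's actual Pecota code — and the sample batting line is invented for illustration:

```python
# Illustrative only: standard public formulas, not Silver's Pecota model.
# The sample batting line below is invented.

def batting_average(hits, at_bats):
    """Hits per at-bat: the commonly watched stat."""
    return hits / at_bats

def on_base_percentage(hits, walks, hbp, at_bats, sac_flies):
    """Counts walks and hit-by-pitch, which batting average ignores."""
    return (hits + walks + hbp) / (at_bats + walks + hbp + sac_flies)

# A patient hitter: a modest average, but many walks.
avg = batting_average(150, 550)                # ~.273
obp = on_base_percentage(150, 90, 5, 550, 5)   # ~.377
print(f"AVG {avg:.3f}  OBP {obp:.3f}")
```

The same hitter looks ordinary by batting average but valuable by on-base percentage, which is the kind of discrepancy a model weighted toward the more predictive stat can exploit.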

Similarly, FiveThirtyEight’s model weighs up factors, from pollsters’ past accuracy to the religious and economic make-up of each state, then simulates the election 10,000 times to provide a probabilistic assessment of likely outcomes, based on polls going back to 1952. “We know that we’re going to get some of them wrong,” Silver cautions. The probability of any election victory is almost never 100 per cent. “You have a 70-30 bet, you’re supposed to get that wrong 30 per cent of the time.”
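The core simulation step can be sketched in a few lines. This is only a minimal illustration of the Monte Carlo idea — FiveThirtyEight's real model rates pollsters, adjusts for demographics and much more — and every state name, probability and electoral-vote count below is invented:

```python
import random

# Minimal sketch of a Monte Carlo election simulation; all inputs are
# hypothetical, not FiveThirtyEight's actual estimates.

def simulate_election(win_probs, electoral_votes, n_sims=10_000, seed=42):
    """Run the election n_sims times; in each run the candidate wins each
    state independently with its estimated probability. Return the share
    of runs in which the candidate reaches 270 electoral votes."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(n_sims):
        ev = sum(v for state, v in electoral_votes.items()
                 if rng.random() < win_probs[state])
        if ev >= 270:
            wins += 1
    return wins / n_sims

# Hypothetical race: two near-certain blocs and three toss-up states
# (electoral votes sum to 538).
win_probs = {"SafeBlue": 0.99, "SafeRed": 0.01,
             "Ohio": 0.55, "Florida": 0.50, "Virginia": 0.60}
electoral_votes = {"SafeBlue": 247, "SafeRed": 231,
                   "Ohio": 18, "Florida": 29, "Virginia": 13}

p = simulate_election(win_probs, electoral_votes)
print(f"Candidate wins in {p:.1%} of simulations")
```

The output is a probability, not a call — which is exactly Silver's point about the 70-30 bet: a forecast like this is supposed to be "wrong" some known fraction of the time.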

He is frustrated by people who prefer simple blue-or-red forecasts to such numerate nuance. “In baseball, it’s reached a healthy equilibrium where numbers-driven analysis is used in an appropriate way.” But politics is still far behind, he says: “I feel like we’re still fighting the Moneyball wars of 2003 and it might take another 10 years, if at all.”

The problem with political commentary is that it favours ideologues. CNN, stuck between Fox News on the right and MSNBC on the left, is struggling to revive its ratings because “the energy in politics is on the extremes,” Silver observes.

If most people struggle to interpret simple polls, will companies fare any better with the “big data” they are excitedly crunching? “I don’t think we’re on the edge of a singularity in terms of people becoming much more productive,” he says. In many cases, when he explores a new field and discovers how people misread the data about it, “it just kind of becomes depressing”.

Silver says he does not get on well with political reporters but is friends with media entrepreneurs such as Gawker’s Denton and Andrew Sullivan, the prominent blogger. His generation shares that entrepreneurial ambition, he says. “It used to be that you would idolise the guy who graduated at the top of his class from Harvard, and now you idolise the guy who drops out of Harvard to run a business,” he smiles. “I think these newspapers have a lot in common with Ivy League universities.” There is the prestige and the bright people “but there’s lots of internal politics. There are pockets of amazing things that are happening, but also pudgy bureaucratic cultures in other respects.”

Silver is not exactly dropping out; ESPN is a corporate giant with the resources to commercialise FiveThirtyEight more than ever. “I’m still pretty hungry,” he says, explaining that moving to ESPN was a decision to work “really, really hard for four more years” instead of coasting. “You could take a more relaxed route and kind of write now and then and travel a lot. That’s great, but I can do that in my fifties or sixties or seventies. Right now I want to build something while I’m still young.”

Lexington: The war of the words | The Economist

<!-- This node will contain a number of 'page' class divs. -->

The war of the words

Lexington

How Republicans and Democrats use language

Jul 13th 2013 |From the print edition

“POLITICAL language”, wrote George Orwell, “is designed to make lies sound truthful and murder respectable, and to give an appearance of solidity to pure wind.” No leader will admit to having had people tortured, but Dick Cheney did say: “I was and remain a strong proponent of our enhanced interrogation programme”—which means the same thing. Notice how, as Orwell put it, “A mass of Latin words falls upon the facts like soft snow, blurring the outlines and covering up all the details.”

Wars sound horrible in plain English, so they have always generated a smokescreen of euphemism. “Kinetic action” means “killing people”. “Collateral damage” means “killing people accidentally”. Politicians typically use the word “kill” only to describe what our enemies do to us; not what we do to them. In a speech in May explaining his drone warfare policy, for example, Barack Obama spoke of “lethal, targeted action against al-Qaeda and its associated forces”. As Orwell said, when “certain topics are raised, the concrete melts into the abstract”.

Orwell worried that sloppy language disguised bad ideas. Some influential Democrats today have a different complaint: that Republicans use words more skilfully to win political battles. Conservatives are shameless and simplistic, they grumble, and it works. When Mr Obama was struggling to explain the circumstances under which doctors might discuss end-of-life provisions with Medicare patients, Sarah Palin yelled “Death panels!” and spooked a huge chunk of the electorate.

“[C]onservatives use language more effectively than liberals in communicating their deepest values,” writes George Lakoff, a linguist at the University of California, Berkeley, in “The Little Blue Book: The Essential Guide to Thinking and Talking Democratic”. Liberals “present the facts and offer policies”, he claims. Republicans, by contrast, go straight for the gut. Newt Gingrich, while Speaker of the House in the 1990s, encouraged his footsoldiers to repeat focus-group-tested words like “sick”, “pathetic” and “coercion” when talking about Democrats, while parroting “family”, “children” and “liberty” as Republican values.

When Republicans and Democrats use different terms for the same thing, the Republican phrase is nearly always shorter and more concrete, observes Joseph Romm, the author of “Language Intelligence”. He has a point. When arguing about abortion, Republicans favour “life” (evocative) while Democrats talk about “choice” (abstract). Republicans talk about “taxes” and “spending” while Democrats want to raise “revenue” for “investment”. George W. Bush had the “Patriot Act”, whereas Mr Obama has the “Patient Protection and Affordable Care Act”. The former is an awful law that is hard to oppose; the latter an awful mouthful that is hard to remember.

Mr Lakoff urges Democrats to be more concrete. “Have I seen it with my own eyes?” he asks. “Can I take a pen and draw a picture of it?” “Air”, “water” and “soil” are better than the “environment”, for example, which is perhaps why the “Clean Air Act” is the law of the land but the “American Clean Energy and Security Act of 2009” (a cap-and-trade bill for greenhouse-gas emissions) crashed to ignominious defeat.

Republicans are also better, Democrats fear, at agreeing on a message and sticking to it. Frank Luntz, a Republican consultant, once said: “There’s a simple rule. You say it again, and you say it again, and you say it again, and you say it again, and you say it again, and then again and again and again and again, and about the time that you’re absolutely sick of saying it is about the time that your target audience has heard it for the first time.”

Democrats sigh that they are too sophisticated to feel comfortable reducing complex ideas to pithy two-word phrases. And they struggle to unite around a slogan because their base includes disparate groups (blacks, Latinos, unions, educated urbanites) who do not, themselves, speak the same way. The Republican base is varied too, including both small-government types and devout Christians, but they unite around slogans such as “liberty”, whether freedom from taxes or the freedom to pray in schools. If only Democrats could “frame” issues better (in Mr Lakoff’s phrase), they would win more battles.

Not all weasels are Republican

But hold on. Democrats have won four of the past six presidential elections, so they can’t be doing everything wrong. And many of them use short words deftly. Barack Obama’s 2008 slogan, “Yes we can”, whipped crowds into a frenzy of approval (though one of Mr Obama’s speechwriters is said to have hated it). Bill Clinton’s formula “safe, legal and rare” helped bolster support for legal abortion. Democrats invoke “working families” to remind voters that “poor” and “scrounger” do not mean the same thing.

Democrats can be shameless, too: the campaign ad showing Paul Ryan tossing an old lady in a wheelchair off a cliff was not exactly nuanced. They repeat messages aggressively: Mr Obama in 2012 never stopped reminding voters that Mitt Romney was rich. And their rhetoric is often misleading. When arguing about budgets, for example, they use the word “cut” to mean “spend less than was previously planned”. So a “savage cut” can actually be a large increase. This is such a potent subterfuge that Republicans use it too, at least when talking about military spending.

Politicians will never use language the way Orwell did, marrying clarity of thought with precision. A politician has to win elections, which means convincing lots of people with widely varying interests and opinions that he is on their side. Alas, that requires waffle, fudge and snappy slogans. These are hard to coin, as Mr Lakoff inadvertently proves. He has urged Democrats to refer to taxes as “membership fees” and to argue that “Patriotism requires Medicare for all.” Somehow, neither has caught on.

Economist.com/blogs/lexington

From the print edition: United States

The Solution to America's Collapsing Confidence - WSJ.com

<!-- This node will contain a number of 'page' class divs. -->

The People's Choice: Distrust

Senate Majority Leader Harry Reid sat in an elegant room in the Capitol on Monday afternoon, knowing that the body he leads was in imminent danger of a partisan meltdown. A bitter dispute was still raging over a seemingly simple task—the confirmation of presidential nominees. The crisis led him to muse about the broader consequences of what sometimes appears to be permanent congressional dysfunction.

"When I ran the first time, the approval rating of Congress was at 45%," said Sen. Reid, who came to Washington from his home state of Nevada three decades ago. "Now it's at 10%. In all the time Gallup has been doing its polling, no institution has ever been recorded at lower than that."

[image: Yarek Waszul]

The immediate crisis was resolved the next morning, in a deal between Democrats and Republicans in which the Senate would approve a handful of presidential appointments that had been held up an average of nine months. But there is no guarantee the Senate won't return to its partisan rut—and air of crisis—this fall.

The problem isn't simply political dysfunction in Congress but something bigger: a collapse of public confidence in government across all fronts. In the most recent Wall Street Journal/NBC News poll, just 17% said they have confidence in the federal government, less than half of the 36% found in 1990. In Gallup's latest poll, as Sen. Reid noted, confidence in Congress has shrunk to an almost comical 10%. The presidency does better at 36%, but that is down from 58% a decade ago.

New bouts of political paralysis, topped off by controversies over the Internal Revenue Service and government surveillance, are making the problem worse. The term "crisis of confidence" may be overused in Washington, but there is no denying that we are in the midst of one. And that shouldn't make anyone happy, even if you don't like government. Reversing this trend is, in fact, one of the keys to American progress and international competitiveness in the century ahead.

A government that tries to do too many things and does many of them badly won't be trusted to perform the key functions—education, security, infrastructure, research—that are essential for the success of businesses and workers alike.

President Barack Obama took note of this the other day in a speech that was widely ignored but shouldn't have been. He proposed pulling the federal government more firmly into the 21st century by making better use of cutting-edge technology. "We can't take comfort in just being cynical," he said. "We all have a stake in government success, because government is us."

How has it come to this? After all, those of us of a certain age grew up thinking government was filled with smart people who did great things.

Government won World War II, built the interstate highway system, desegregated schools, put a man on the moon, cleaned up the air, built the biggest computers and conjured up the Internet. The brightest Ivy League graduates went to work for the CIA rather than big law firms, and the kind of young propeller heads who now head for Silicon Valley instead beat a path to NASA.

About four decades ago, all this began to change. The crumbling began with Vietnam and Watergate, of course, which together convinced many Americans that the government that did all those wonderful things had become power-hungry and untrustworthy. Jimmy Carter made it worse by sitting in his sweater in a cold White House and telling the public that government couldn't cure the nation's woes. Ronald Reagan confirmed and extended the argument: Not only was government not the solution, it was a big part of the problem.

Then the Robert Bork controversy erupted in 1987. The Bork hearings, in which a conservative federal judge was defeated for a spot on the Supreme Court after a vicious debate, were a watershed; they marked the beginning of the end of graciousness in D.C. discourse. "While liberals look at it as an important success, I think it also marked a change in…the sense of fair play and the level of the debate," says Democratic pollster Peter Hart.

Newt Gingrich soon arose to argue that people in government aren't simply wrongheaded; they are part of an evil bureaucracy that tolerates fraud in social welfare programs and pursues policies that undermine families. Then, to some, Bill Clinton seemed to validate the image of irresponsible governance with the Monica Lewinsky affair.

For a moment after the 9/11 terror attacks, it appeared that confidence was revived. But it soon dissipated, with two wars that seemed to many Americans either unnecessary or unending, coupled with a searing civil-liberties debate that suggested Republicans wanted big government as much as Democrats did, just for different purposes. The arrival of President Obama offered a second shining moment, but so far he has proved unequal to the daunting task of either making Washington work better or turning around political attitudes.

Through it all, government slowly began doing things it probably should have just left to the private sector or civil society. Worse, the persistence of federal deficits has eroded confidence in Washington's ability to manage the economy. Just to illustrate: In the 15 years from 1965 to 1980, the annual federal deficit topped 3% of the nation's gross domestic product just twice. In the past 25 years, it has done so 13 times.

So what can be done to fix this crisis of confidence in government? A few things come to mind:

The deficit has to be managed. It has become a metaphor for a government that doesn't work. It doesn't have to be eliminated any time soon—which would be bad economics anyway—but there has to be a sense that it is being tamed intelligently.

Government has to work better, meaning that it needs to modernize and become more useful in our everyday lives. This is what President Obama was getting at in his recent speech. He noted, for example, that the federal government possesses a trove of big data—a commodity increasingly important to businesses, researchers and academics. So it is starting to open up that data to the public online at data.gov, which in turn has spawned companies formed to put the data to use.

Also, as Mr. Obama noted, why should an individual or a business have to reproduce the same information every time it deals with a different government agency, when private companies have figured out how to make such information transparent across business lines?

The political system has to be fixed. States need to stop drawing congressional districts that ensure deep and paralyzing polarization by making them so dark red or dark blue that only the most ideologically rigid candidates bother to run. Also, the rules of the Senate need to be changed to curtail the ability of a minority of Senators, or sometimes a single one, to make progress grind to a halt.

Bill McInturff, a Republican pollster who helps conduct the Wall Street Journal/NBC News poll, says it also would be helpful if Washington's leaders managed to convey to the country a "sense of shared purpose."

Oh, and one other thing. An economic boom, he says, certainly would help: "If the economy's booming, people will say, 'Hey, things are working.' It just takes off this nasty edge."

Write to Gerald F. Seib at jerry.seib@wsj.com

<!-- article end -->

A version of this article appeared July 19, 2013, on page C3 in the U.S. edition of The Wall Street Journal, with the headline: The People's Choice: Distrust.

Rise of the Warrior Cop - WSJ.com

<!-- This node will contain a number of 'page' class divs. -->

Rise of the Warrior Cop

On Jan. 4 of last year, a local narcotics strike force conducted a raid on the Ogden, Utah, home of Matthew David Stewart at 8:40 p.m. The 12 officers were acting on a tip from Mr. Stewart's former girlfriend, who said that he was growing marijuana in his basement. Mr. Stewart awoke, naked, to the sound of a battering ram taking down his door. Thinking that he was being invaded by criminals, as he later claimed, he grabbed his 9-millimeter Beretta pistol.

[image: Sean McCabe]

The police say that they knocked and identified themselves, though Mr. Stewart and his neighbors said they heard no such announcement. Mr. Stewart fired 31 rounds, the police more than 250. Six of the officers were wounded, and Officer Jared Francom was killed. Mr. Stewart himself was shot twice before he was arrested. He was charged with several crimes, including the murder of Officer Francom.

The police found 16 small marijuana plants in Mr. Stewart's basement. There was no evidence that Mr. Stewart, a U.S. military veteran with no prior criminal record, was selling marijuana. Mr. Stewart's father said that his son suffered from post-traumatic stress disorder and may have smoked the marijuana to self-medicate.

Early this year, the Ogden city council heard complaints from dozens of citizens about the way drug warrants are served in the city. As for Mr. Stewart, his trial was scheduled for next April, and prosecutors were seeking the death penalty. But after losing a hearing last May on the legality of the search warrant, Mr. Stewart hanged himself in his jail cell.

The police tactics at issue in the Stewart case are no anomaly. Since the 1960s, in response to a range of perceived threats, law-enforcement agencies across the U.S., at every level of government, have been blurring the line between police officer and soldier. Driven by martial rhetoric and the availability of military-style equipment—from bayonets and M-16 rifles to armored personnel carriers—American police forces have often adopted a mind-set previously reserved for the battlefield. The war on drugs and, more recently, post-9/11 antiterrorism efforts have created a new figure on the U.S. scene: the warrior cop—armed to the teeth, ready to deal harshly with targeted wrongdoers, and a growing threat to familiar American liberties.

The acronym SWAT stands for Special Weapons and Tactics. Such police units are trained in methods similar to those used by the special forces in the military. They learn to break into homes with battering rams and to use incendiary devices called flashbang grenades, which are designed to blind and deafen anyone nearby. Their usual aim is to "clear" a building—that is, to remove any threats and distractions (including pets) and to subdue the occupants as quickly as possible.

[image: Daily Republic/Associated Press]

Today the U.S. has thousands of SWAT teams. A team prepares to enter a house in Vallejo, Calif., on March 20, above.

The country's first official SWAT team started in the late 1960s in Los Angeles. By 1975, there were approximately 500 such units. Today, there are thousands. According to surveys conducted by the criminologist Peter Kraska of Eastern Kentucky University, just 13% of towns between 25,000 and 50,000 people had a SWAT team in 1983. By 2005, the figure was up to 80%.

The number of raids conducted by SWAT-like police units has grown accordingly. In the 1970s, there were just a few hundred a year; by the early 1980s, there were some 3,000 a year. In 2005 (the last year for which Dr. Kraska collected data), there were approximately 50,000 raids.

A number of federal agencies also now have their own SWAT teams, including the Fish & Wildlife Service, NASA, the Consumer Product Safety Commission and the Department of the Interior. In 2011, the Department of Education's SWAT team bungled a raid on a woman who was initially reported to be under investigation for not paying her student loans, though the agency later said she was suspected of defrauding the federal student loan program.

The details of the case aside, the story generated headlines because of the revelation that the Department of Education had such a unit. None of these federal departments has responded to my requests for information about why they consider such high-powered military-style teams necessary.

Americans have long been wary of using the military for domestic policing. Concerns about potential abuse date back to the creation of the Constitution, when the founders worried about standing armies and the intimidation of the people at large by an overzealous executive, who might choose to follow the unhappy precedents set by Europe's emperors and monarchs.

The idea for the first SWAT team in Los Angeles arose during the domestic strife and civil unrest of the mid-1960s. Daryl Gates, then an inspector with the Los Angeles Police Department, had grown frustrated with his department's inability to respond effectively to incidents like the 1965 Watts riots. So his thoughts turned to the military. He was drawn in particular to Marine Special Forces and began to envision an elite group of police officers who could respond in a similar manner to dangerous domestic disturbances.

[image: Standard-Examiner/Associated Press]

When a strike force raided the home of Matthew David Stewart, one officer was killed.

Mr. Gates initially had difficulty getting his idea accepted. Los Angeles Police Chief William Parker thought the concept risked a breach in the divide between the military and law enforcement. But with the arrival of a new chief, Thomas Reddin, in 1966, Mr. Gates got the green light to start training a unit. By 1969, his SWAT team was ready for its maiden raid against a holdout cell of the Black Panthers.

At about the same time, President Richard Nixon was declaring war on drugs. Among the new, tough-minded law-enforcement measures included in this campaign was the no-knock raid—a policy that allowed drug cops to break into homes without the traditional knock and announcement. After fierce debate, Congress passed a bill authorizing no-knock raids for federal narcotics agents in 1970.

Over the next several years, stories emerged of federal agents breaking down the doors of private homes (often without a warrant) and terrorizing innocent citizens and families. Congress repealed the no-knock law in 1974, but the policy would soon make a comeback (without congressional authorization).

During the Reagan administration, SWAT-team methods converged with the drug war. By the end of the 1980s, joint task forces brought together police officers and soldiers for drug interdiction. National Guard helicopters and U-2 spy planes flew the California skies in search of marijuana plants. When suspects were identified, battle-clad troops from the National Guard, the DEA and other federal and local law enforcement agencies would swoop in to eradicate the plants and capture the people growing them.

Advocates of these tactics said that drug dealers were acquiring ever bigger weapons and the police needed to stay a step ahead in the arms race. There were indeed a few high-profile incidents in which police were outgunned, but no data exist suggesting that it was a widespread problem. A study done in 1991 by the libertarian-leaning Independence Institute found that less than one-eighth of 1% of homicides in the U.S. were committed with a military-grade weapon. Subsequent studies by the Justice Department in 1995 and the National Institute of Justice in 2004 came to similar conclusions: The overwhelming majority of serious crimes are committed with handguns, and not particularly powerful ones.

The new century brought the war on terror and, with it, new rationales and new resources for militarizing police forces. According to the Center for Investigative Reporting, the Department of Homeland Security has handed out $35 billion in grants since its creation in 2002, with much of the money going to purchase military gear such as armored personnel carriers. In 2011 alone, a Pentagon program for bolstering the capabilities of local law enforcement gave away $500 million of equipment, an all-time high.

The past decade also has seen an alarming degree of mission creep for U.S. SWAT teams. When the craze for poker kicked into high gear, a number of police departments responded by deploying SWAT teams to raid games in garages, basements and VFW halls where illegal gambling was suspected. According to news reports and conversations with poker organizations, there have been dozens of these raids, in cities such as Baltimore, Charleston, S.C., and Dallas.

In 2006, 38-year-old optometrist Sal Culosi was shot and killed by a Fairfax County, Va., SWAT officer. The investigation began when an undercover detective overheard Mr. Culosi wagering on college football games with some buddies at a bar. The department sent a SWAT team after Mr. Culosi, who had no prior criminal record or any history of violence. As the SWAT team descended, one officer fired a single bullet that pierced Mr. Culosi's heart. The police say that the shot was an accident. Mr. Culosi's family suspects the officer saw Mr. Culosi reaching for his cellphone and thought he had a gun.

Assault-style raids have even been used in recent years to enforce regulatory law. Armed federal agents from the Fish & Wildlife Service raided the floor of the Gibson Guitar factory in Nashville in 2009, on suspicion of using hardwoods that had been illegally harvested in Madagascar. Gibson settled in 2012, paying a $300,000 fine and admitting to violating the Lacey Act. In 2010, the police department in New Haven, Conn., sent its SWAT team to raid a bar where police believed there was underage drinking. For sheer absurdity, it is hard to beat the 2006 story about the Tibetan monks who had overstayed their visas while visiting America on a peace mission. In Iowa, the hapless holy men were apprehended by a SWAT team in full gear.

Unfortunately, the activities of aggressive, heavily armed SWAT units often result in needless bloodshed: Innocent bystanders have lost their lives and so, too, have police officers who were thought to be assailants and were fired on, as (allegedly) in the case of Matthew David Stewart.

In my own research, I have collected over 50 examples in which innocent people were killed in raids to enforce warrants for crimes that are either nonviolent or consensual (that is, crimes such as drug use or gambling, in which all parties participate voluntarily). These victims were bystanders, or the police later found no evidence of the crime for which the victim was being investigated. They include Katherine Johnston, a 92-year-old woman killed by an Atlanta narcotics team acting on a bad tip from an informant in 2006; Alberto Sepulveda, an 11-year-old accidentally shot by a California SWAT officer during a 2000 drug raid; and Eurie Stamps, killed in a 2011 raid on his home in Framingham, Mass., when an officer says his gun mistakenly discharged. Mr. Stamps wasn't a suspect in the investigation.

What would it take to dial back such excessive police measures? The obvious place to start would be ending the federal grants that encourage police forces to acquire gear that is more appropriate for the battlefield. Beyond that, it is crucial to change the culture of militarization in American law enforcement.

Consider today's police recruitment videos (widely available on YouTube), which often feature cops rappelling from helicopters, shooting big guns, kicking down doors and tackling suspects. Such campaigns embody an American policing culture that has become too isolated, confrontational and militaristic, and they tend to attract recruits for the wrong reasons.

If you browse online police discussion boards, or chat with younger cops today, you will often encounter some version of the phrase, "Whatever I need to do to get home safe." It is a sentiment that suggests that every interaction with a citizen may be the officer's last. Nor does it help when political leaders lend support to this militaristic self-image, as New York City Mayor Michael Bloomberg did in 2011 by declaring, "I have my own army in the NYPD—the seventh largest army in the world."

The motivation of the average American cop should not focus on just making it to the end of his shift. The LAPD may have given us the first SWAT team, but its motto is still exactly the right ideal for American police officers: "To Protect and to Serve."

SWAT teams have their place, of course, but they should be saved for those relatively rare situations when police-initiated violence is the only hope to prevent the loss of life. They certainly have no place as modern-day vice squads.

Many longtime and retired law-enforcement officers have told me of their worry that the trend toward militarization is too far gone. Those who think there is still a chance at reform tend to embrace the idea of community policing, an approach that depends more on civil society than on brute force.

In this very different view of policing, cops walk beats, interact with citizens and consider themselves part of the neighborhoods they patrol—and therefore have a stake in those communities. It's all about a baton-twirling "Officer Friendly" rather than a Taser-toting RoboCop.

Mr. Balko is the author of "Rise of the Warrior Cop," published this month by Public Affairs.


A version of this article appeared July 19, 2013, on page C1 in the U.S. edition of The Wall Street Journal, with the headline: Rise of the Warrior Cop.

FT: ‘Lucy Kellaway’s History of Office Life’




‘Lucy Kellaway’s History of Office Life’


When I started working in an office in the early 1980s, we smoked cigarettes at our desks, banged out articles on heavy typewriters and at lunchtime decamped to the office canteen or the pub for a hotplate of shepherd’s pie. Everything I’ve written since then about modern offices – the wireless, smokeless, noiseless places where we now work – has been coloured by my memory of how things used to be.

But now I find my sense of history is all skewed. I've just finished making a series for Radio 4 about the past 250 years of office life, and have discovered that half the things I thought of as new fads turn out not to be new at all, while many things I took to be eternal facts of office life are actually rather recent. There are, however, some constants – like lust and boredom – as well as some things that have gone for ever. The tea lady isn't coming back.

The following lists aim to set the record straight.

Six new fads that aren’t new

1. Working from Starbucks

This recent trend is 350 years old. London’s first coffee shop was opened in 1652, and was an instant hit with men insuring ships or trading in sugar or human hair. Within 50 years there were 3,000 of them – an expansion rate that makes Starbucks’ invasion of the UK capital look sluggish. There were two differences between coffee houses and the modern version. The point then was to meet people – now it is to be alone with your laptop. And the drink of choice was not a caramel frappuccino with extra cinnamon but a blackish brew said to resemble “syrup of soot and essence of old shoes”.

2. Working from home

People always used to work from home – not because of the internet or to save petrol – but because there weren’t any offices. In 1762, when the Barings set up at their home in Mincing Lane, the banking was done downstairs, while upstairs Mrs Baring raised 10 out of their 12 children. It was a time of multiskilling: clerks, who lived in too, were expected to be equally handy totting up numbers, running errands and handing round bread and butter soldiers at teatime.

3. Paying for internships

Modern interns are expected not only to work for nothing but sometimes have to pay for the privilege. Yet 200 years ago this sort of thing was routine. When a teenage Charles Lamb got a sought-after job in the accounts department at the East India Company, he had to put down a £500 bond against good behaviour, find two sponsors to do the same, and then work for two years with no salary at all. That cost the equivalent of £140,000, making a week's internship at Vogue – which went for $42,500 at a charity auction last year – seem quite reasonable.

4. Eating breakfast at your desk

A young colleague likes to start the day crunching his way through a bowl of Fruit’n’Fibre over his keyboard; John Stuart Mill got there 170 years before him. Every day he walked from his house in Kensington to his office in Leadenhall Street where he had a boiled egg and a cup of tea at his desk. The difference was that Mill’s breakfast was brought to him by a servant. My colleague has to pour out his Fruit’n’Fibre himself.

5. Twitter

The microblogging site did not invent brevity. That happened on May 24 1844, when Samuel Morse tapped out the first telegraph message: “What hath God wrought?” What He wrought turned out to be a very big deal indeed, paving the way for the internet and leading, on March 21 2006, to Jack Dorsey tapping out the first tweet: “just setting up my twttr”. It's not surprising that, as a piece of prose, Morse's message was vastly superior: early telegraphs cost the equivalent of $25 a message, which meant you didn't just write any old thing.

6. Email destroying peace of mind

Our fears about email making us stressed are precisely the same as the ones we had 100 years ago with the telephone. An article published in Telephony magazine in 1913 reported that some people were made hysterical by being endlessly available on the telephone and fielding calls that flooded in at the rate of less than one a day.

And it was the telephone, not the BlackBerry, that destroyed holidays. An advertisement from 1914 advised business people that the telephone would allow them, while on holiday fishing for trout, to stay in touch with what was happening in the office.

. . .

Six things that really are new

1. Managers

Until the 20th century there were hardly any managers at all. Britain went through the industrial revolution almost without them – instead, there were owners, gang bosses and overseers. The word “management” wasn't used in its modern sense until a hundred years ago. But now there are 5m of them in the UK, 10 times as many as in 1911. Without managers, office life as we know it simply didn't exist: there were almost no meetings, no memos and no need for “leveraging” or “delivering solutions”.

2. Liking your job

The notion that people enjoyed their work was unheard of. JS Mill, who had a cushy job at the East India Company, was more positive than most but even he viewed it as “an actual rest from the other mental occupations which I have carried on simultaneously”. In Victorian times clerks appear to have been permanently wretched. A clerk writing in 1907 referred to his colleagues as “miserable little pen drivers – fellows in black coats with inky fingers and shiny seats on their trousers”. They sat on high, uncomfortable stools and worked in damp places, and were likely to get tuberculosis as well as backache.


3. Women

Women in offices were a late 19th-century innovation, introduced as an experiment to cope with a growing workload, but they became a huge hit. They were cheap, and didn't need promoting because as soon as they got married, they left and cheaper replacements could be found. Until the first world war, “lady clerks” had separate entrances, staircases and dining rooms. They often worked behind screens and, in some cases, in cages to ensure their morals weren't messed with.

At Barclays they were allowed on to the roof at lunchtime, where they marched around and sang the company song. There was only one perk enjoyed by lady clerks not available now: they were allowed to knit in quiet periods.

4. Competence

Being adequate at your job is a relatively new invention – at least in the public sector. In the mid-19th century the Civil Service was stuffed full of drooling idiots put there by relatives. A parliamentary paper from 1855 refers to “the most feeble sons in families which have been so fortunate as to obtain an appointment, yes, and others too, either mentally or physically incapacitated, enter the Service”. But then came the reforms of the 1870s and the revolutionary idea that to get a job you needed not only to refrain from drooling but also to know some mathematics and Latin, too.

5. Jargon

When the management style was command-and-control there was no need for jargon. To fire people, you didn’t talk of “demising,” as HSBC did recently. William Lever, founder of the soap company Lever Bros, wrote matter-of-factly of how in the 1920s he had got rid of “inefficient men, and too highly paid men, elderly men and men past their work ... I am confident that this has produced a state of fear in the minds of the remainder that if they were not efficient their turn would come next.”


6. Casual clothes

The team that brought the Apple Macintosh to market in 1984 didn't just amaze with their product but with their clothes: they were all clad in grey hoodies. Until then, everyone dressed up for work. In Pickwick Papers Charles Dickens describes a clerk: “First, taking off that black coat which lasts the year through, and putting on one which did duty last year, and which he keeps in his desk to save the other.” By the 1970s the virtue of smart dress was scientifically proven: according to the bestseller Dress For Success, secretaries of men who wore short-sleeved shirts were late 12 per cent more often than those of men in long-sleeved ones.

. . .

Six things that are eternal

1. Lust

This long predates the invention of the lady clerk. As Pepys writes in his diary on June 30 1662: “Up betimes, and to my office, where I found Griffen’s girl making it clean, but, God forgive me! what a mind I had to her, but did not meddle with her.”

By the time women arrived in offices a great deal of meddling went on, and often ended very badly indeed. In 1958 Connie Nichols, a secretary at Eli Lilly, had a long affair with her boss but when she found she’d been discarded for a younger model, she seized a gun and shot him.

2. Badmouthing colleagues

To ridicule workmates appears to be a basic need for office workers. Lamb composed a couplet about a particularly dim clerk called Ward: “What Ward knows, God knows; But God knows what Ward knows!” While the need is constant, the execution has changed – witty couplets have long since been replaced by mean pranks on social networks.

3. Beauty premium

Being tall, low-voiced and easy on the eye has always been an advantage. Modern chief executives in the US have been found to be 2.5in taller than the average man and countless studies have shown that the good looking do better. A hundred years ago the beauty bias was made explicit: at the Bank of Scotland in the late 19th century, clerks were “removed from view” due to “diminutive stature”, having a “voice a little peculiar”, or for “their jug ears and red hair”.

4. Petty policies

In my working life some of the most unpopular changes have centred on axing minor perks such as free biscuits. At the East India Company in 1817 there was an outcry when the Christmas party – the “yearly turtle feast” – was scrapped. Even worse was a new initiative that had everyone signing in every 15 minutes throughout the day – a policy that makes Marissa Mayer's insistence that Yahoo staff turn up to work look laissez-faire.

5. Motivational slogans

At the cool Facebook headquarters in San Francisco the walls are covered in notices saying: “What would you do if you weren’t afraid?”

At the Larkin Soap building in Buffalo, also pretty cool when it opened in 1907, Frank Lloyd Wright had this carved into the walls: “Thought, feeling, action.” Such slogans didn't prove terribly successful: Larkin Soap went bust.

6. Paper

In 1975, BusinessWeek famously predicted the paperless office, but for the next 25 years the volume of paper used in offices went on rising. Even though we are now weaning ourselves off it a little, the average worker still generates 2lb of paper a day. I still predict the paperless office will arrive no sooner than the paperless toilet.

. . .

Six things that will never come back

1. Ledgers

The end of the ledger was possibly the best news the office ever had. The system of entering all information chronologically in vast books meant nothing could ever be found again. The invention of the filing cabinet in 1868 – which allowed things to be filed alphabetically – was probably a bigger step towards the knowledge economy than the computer.

2. A graveyard of equipment

Items such as quill pens, blotting paper, typewriters, adding machines, mainframe computers, word processors, and fax machines are all gone or going.

3. Noise

The end of the clanking adding machines, typewriters and raucous Bakelite telephones meant the end of noise. Now there is only the light tapping on keyboards and politely vibrating mobiles. Text has replaced talk. The irony is that against all that distracting silence, what have we started to do? Wear headphones.

4. Tobacco

This was the perfect office drug – a pick-me-up and social lubricant all in one. In Dickens’ time it wasn’t cigarettes but snuff: “The clerk smiled as he said this, and inhaled the pinch of snuff with a zest which seemed to be compounded of a fondness for snuff and a relish for fees.” In offices, a relish for fees has outlasted a fondness for tobacco, which has been stamped out from the office, consumed only by a stubborn minority on the pavement outside.

5. Privacy

Lowly workers have always worked open plan while managers had their own offices – until the 1960s and a German movement called Bürolandschaft took away walls and put in pot plants instead. Since then the onward march of open plan has continued, and even if executives manage to hold on to their offices, the walls are now made of glass. Thus anyone wanting a private meeting is forced out of the goldfish bowl and on to the stairwell.

6. The tea lady

In 1666, the wife of the housekeeper at the East India Company started making tea for the directors, and thus the role of the tea lady was born. For the next 300 years she was a cult figure in most organisations with her welcome cry of “Trolley”. In 2003, Isa Allan, a tea lady at Scottish Enterprise, was given an MBE by the Queen for being the “heart and soul” of the place. But even the Queen could not halt the onward march of mechanisation, outsourcing and cost-cutting: the tea lady has been replaced by the coffee machine, the water cooler and Pret A Manger – none of which does the job nearly so well.

‘Lucy Kellaway’s History of Office Life’ is on Radio 4 daily at 1.45pm for two weeks from Monday

Robert Francis Kennedy: On the Mindless Menace of Violence - 5 Apr 68

http://en.wikisource.org/wiki/On_the_Mindless_Menace_of_Violence


On the Mindless Menace of Violence

This is a time of shame and sorrow. It is not a day for politics. I have saved this one opportunity, my only event of today, to speak briefly to you about the mindless menace of violence in America which again stains our land and every one of our lives.

It is not the concern of any one race. The victims of violence are black and white, rich and poor, young and old, famous and unknown. They are, most important of all, human beings whom other human beings loved and needed. No one—no matter where he lives or what he does—can be certain who will suffer from some senseless act of bloodshed. And yet it goes on and on and on in this country of ours.

Why? What has violence ever accomplished? What has it ever created? No martyr's cause can ever be stilled by an assassin's bullet.

No wrongs have ever been righted by riots or civil disorders. A sniper is only a coward, not a hero; and an uncontrolled, uncontrollable mob is only the voice of madness, not the voice of reason.

Whenever any American's life is taken by another American unnecessarily—whether it is done in the name of the law or in defiance of the law, by one man or a gang, in cold blood or in passion, in an attack of violence or in response to violence—whenever we tear at the fabric of life which another man has painfully and clumsily woven for himself and his children, the whole nation is degraded.

"Among free men," said Abraham Lincoln, "there can be no successful appeal from the ballot to the bullet; and those who take such appeal are sure to lose their cause and pay the costs."

Yet we seemingly tolerate a rising level of violence that ignores our common humanity and our claims to civilization alike. We calmly accept newspaper reports of civilian slaughter in far-off lands. We glorify killing on movie and television screens and call it entertainment. We make it easy for men of all shades of sanity to acquire whatever weapons and ammunition they desire.

Too often we honor swagger and bluster and the wielders of force; too often we excuse those who are willing to build their lives on the shattered dreams of others. Some Americans who preach non-violence abroad fail to practice it here at home. Some who accuse others of inciting riots have by their very conduct invited them.

Some look for scapegoats, others look for conspiracies, but this much is clear: violence breeds violence, repression breeds retaliation, and only a cleansing of our whole society can remove this sickness from our soul.

For there is another kind of violence, slower but just as deadly destructive as the shot or the bomb in the night. This is the violence of institutions; indifference and inaction and slow decay. This is the violence that afflicts the poor, that poisons relations between men because their skin has different colors. This is the slow destruction of a child by hunger, and schools without books and homes without heat in the winter.

This is the breaking of a man's spirit by denying him the chance to stand as a father and as a man among other men. And this too afflicts us all.

I have not come here to propose a set of specific remedies nor is there a single set. For a broad and adequate outline we know what must be done. When you teach a man to hate and fear his brother, when you teach that he is a lesser man because of his color or his beliefs or the policies he pursues, when you teach that those who differ from you threaten your freedom or your job or your family, then you also learn to confront others not as fellow citizens but as enemies, to be met not with cooperation but with conquest; to be subjugated and mastered.

We learn, at the last, to look to our brothers as aliens, men with whom we share a city, but not community; men bound to us in common dwelling, but not in common effort. We learn to share only a common fear, only a common desire to retreat from each other, only a common impulse to meet disagreement with force. For all this, there are no final answers.

Yet we know what we must do. It is to achieve true justice among our fellow citizens. The question is not what programs we should seek to enact. The question is whether we find in our own midst and in our own hearts that leadership of humane purpose that will recognize the terrible truths of our existence.

We must admit the vanity of our false distinctions among men and learn to find our own advancement in the search for the advancement of others. We must admit in ourselves that our own children's future cannot be built on the misfortunes of others. We must recognize that this short life can neither be ennobled nor enriched by hatred or revenge.

Our lives on this planet are too short and the work to be done too great to let this spirit flourish any longer in our land. Of course we cannot vanquish it with a program, nor with a resolution.

But we can perhaps remember, if only for a time, that those who live with us are our brothers, that they share with us the same short moment of life; that they seek, as do we, nothing but the chance to live out their lives in purpose and in happiness, winning what satisfaction and fulfillment they can.

Surely this bond of common faith, this bond of common goal, can begin to teach us something. Surely we can learn, at least, to look at those around us as fellow men, and surely we can begin to work a little harder to bind up the wounds among us and to become in our own hearts brothers and countrymen once again.



I'm up between 5 and 6am. Why Productive People Get Up Insanely Early

http://www.fastcompany.com/3013856/how-to-be-a-success-at-everything/why-productive-people-get-up-insanely-early?partner=newsletter


WHY PRODUCTIVE PEOPLE GET UP INSANELY EARLY

I am not a successful entrepreneur. I do not know the secret to life. I know that I love what I do but struggle with feeling content and balanced.

When I asked other entrepreneurs for advice, they would say within seconds that I needed to unplug or take a vacation. But applying that is difficult for me: I don't want to figure out the right way to take a break, I want to figure out how to appreciate the present moment.

I'm not an aspiring Buddhist or a Zen master either. I want to win, I want to be the best, I want to make some people feel stupid for not believing in us, and I love being at the front of the eternal fight that is a startup. But the counterbalance to that fight was difficult to find.

For me, 4 a.m. yielded reprieve.

Why?

Of all the research done on suicide and depression, the greatest predictor may be one simple thing: weather. Suicide rates are higher in places that get the least sunlight. So getting the most sunlight possible seemed like an optimal step toward increasing my daily happiness.

For the same reasons that I felt most creative on Saturday mornings and on planes, 4 a.m. has become a place of productive peace.

The first time I woke up at 4 a.m. to try this, my mind was in a completely different place, with a completely foreign feeling. I had a new initiative: to make the World's Best Omelette – to actually take my time with it and take pride in it. This was something I could never have done before.

If I had to boil down why I felt focused and unhurried at that hour: not one person is expecting anything from you in the next four hours. So appreciating the task at hand and thinking creatively came naturally.

A fun surprise: discovering your deprivation

What I was depriving myself of was time in the day with no pressure and no expectations. That feeling is why I love what I do. I don't need a vacation. I don't need to step away. I just need a couple of hours a day before anyone else is up.


I can't quantify this feeling. Ben Huh of Cheezburger openly talks about the suicidal thoughts he had, under the pressure he put on himself, when his first startup didn't work. When I asked him what makes him happy now, he cited Mihaly Csikszentmihalyi's book Flow, which describes the "optimal experience": what makes an experience genuinely satisfying is a state of skill-expanding consciousness, appropriately enough called flow. During flow, people typically experience deep enjoyment, creativity, and a total involvement with life.

And when I asked our philosophy-PhD-turned-VC why I felt most productive on a plane, he opened his Moleskine to the inside cover, where he had pasted this quote from Pascal: "The sole cause of man's unhappiness is that he does not know how to stay quietly in his room." In other words, not being able to go anywhere cuts away the need to think about new stimuli – and finally allows us to focus.

Making an omelette, brewing coffee, playing guitar, exercising, listening to the Dan Patrick Show, and learning a language all felt like optimal experiences when not in competition for mental bandwidth slated for the company. And by 6:45, I'm working on the biggest items on my plate, which I can focus on without the threat of new stimuli for the next couple of hours. The second you check email or LinkedIn, an internal clock of new items starts ticking in your mind – a vicious cycle. Planning your day the night before allows you to feel on top of it and even look forward to it. Attacking the hardest things first – all the stuff I didn't want to do – before 9 a.m. leaves the rest of the day feeling very fulfilling.

I can only point to a book written by someone else and a quote from a philosopher to explain why a 4 a.m. start time has allowed me to enjoy each day to the fullest. In a competitive landscape where being relentlessly proactive and creative each day are minimum standards, the biggest threat to your business is if you stop loving what you do. Whether by waking up before dawn or truly vacating your vacation, building a schedule that protects your love for what you do is critical to optimizing the quality of your life--and your work.

--Paul DeJoe is cofounder of Ecquire, a sales productivity tool, based in Vancouver. Check out their blog or follow them on Twitter at @ecquire.
