Posted 3/17/14 by Philip K. Howard
Howard's Daily by Philip K. Howard
Real people, not rules, get things done. Rules exist to prevent bad conduct (thereby enhancing our freedom). Legal protocols, such as speed limits and contract law, allow people in a crowded society to move around without crashing too much. Organizational systems in companies, hospitals and schools can help mobilize humans to build products and provide services.
But only humans, individual people, make anything happen. Whether a school, hospital or business succeeds always hinges on the commitment, skill and judgment of the people. Government too requires individual initiative.
American history can best be told as a story of individual accomplishment—not just inspirational political leaders, such as Washington or Lincoln, but social leaders such as MLK and, especially, innovators in every aspect of commerce and society—from Fulton to Edison to the Wright brothers to Gates.
Jack Kinzler, longtime head of Technical Services at NASA's Johnson Space Center in Houston, died this past week at age 94. He famously rigged up telescoping fishing poles to figure out how to build a replacement sun shield for Skylab. He also fabricated the 6-iron that Alan Shepard took to the moon. Reading the New York Times obituary, you can practically see the twinkle in his eyes when confronted with technical difficulties. Oh, and this legendary NASA genius never attended college.
Modern culture is not friendly to individual initiative. The dramatic exceptions, such as Steve Jobs or others in technology, only prove the rule. Sociologist Robert Bellah and colleagues spotted this trend a few decades ago, when they found that Americans increasingly consider freedom to be the freedom to be left alone, not the freedom to do things. We are free to aspire to flat screen TVs in every room, but not, say, to start a business or to volunteer at the local school.
Like all cultural phenomena, this growing sense of powerlessness has complex roots. One important source, as I have written elsewhere, is the steady bureaucratization of social activities. The US now ranks 20th in the world in ease of starting a business. Many schools don’t want volunteers—there might be legal liability if something goes wrong. The land of the free has become a legal minefield.
The ultimate symptom of powerlessness is that America has also lost its confidence. We fear people making decisions. What if their judgment is deficient, or biased? We huddle together and move only in unison, shielded from the risk of individual choice by an ever denser legal jungle. This voluntary confinement reflects a fear of human nature, fed by a modern trend of analyzing human judgment as only slightly removed from bestial instincts. But this attitude “sells humanity short,” David Brooks wrote this past week. People grow and mature and develop values that far surpass their primal origins.
All the people we admire, in our history and in our lives, are people who take responsibility for their choices. They are people whose first instinct is to ask, “What is the right thing to do?” and not, “What does the rule require?” Whatever works in any community or business is always the result of individual effort. People of energy and good will wake up in the morning, determined to make a difference.
Many of the problems that cause us to wring our hands—starting with the dysfunction of democracy—can be described as failures of individual initiative. Who’s responsible for the budget deficits? Exactly. Nobody. David Remnick's recent profile of President Obama in the New Yorker reflected a kind of fatalism: that even the President could only respond to the situation presented, with little opportunity to lead us to a new place.
This is perhaps America’s greatest cultural challenge. America needs to believe again in the capacity of individuals to make a difference. If the machinery of democracy is paralyzed, we must rebuild it. If we can’t volunteer in our communities, we need to change the rules. If the culture has stumbled into the quicksand of social distrust, leaders with moral authority must emerge to pull it out. Nothing will fix itself, including America’s insecure culture. Only humans can make things work.
Posted 3/12/14 by Philip K. Howard
I always wondered what political scientists did. If you too are curious, read some of the 30 or so essays that the Washington Post has collected for a series on political polarization. With a few exceptions, the ones I read seem to accept the inevitability of democratic dysfunction. Political scientists slice and dice the data to explain why polarization is, or is not, a new phenomenon, and to make similar abstract points.
A post by Stanford political scientist Morris Fiorina is titled "Gridlock is bad. The alternative is worse." The gist is that parties focus on polarized positions (say, Right to Life vs. Right to Choose), which do not reflect the more moderate views of the general electorate. Therefore, he concludes, gridlock is better than action.
I guess that’s a point. Far more important is this question: Why is it that parties get bogged down in litmus test issues, and never get around to issues vital to the daily functioning of society? However strongly you feel about abortion and gun control, the debate will do nothing to solve deficits, make schools work better, or save fish in the sea.
I have a different hypothesis than Prof. Fiorina: Modern American politics is an artificial game designed for gridlock. Polarization is a useful tool to excite the extremists. As long as the tug of war continues over divisive ideological issues, party coffers will keep filling with new campaign money. Prof. Fiorina has no reason to fear action on these issues; that’s the last thing Ted Cruz or Harry Reid wants. Then they would bear responsibility for the consequences.
The disaster of modern politics is that our leaders are not even debating the real issues: Decaying infrastructure, obsolete entitlements, unmanageable civil service, and a disjointed regulatory system that makes starting a business unimaginable for most people—the US now ranks 20th in the world for "ease of starting a business." These are not ideological issues. That, apparently, is why politicians don’t address them. Who wants to take responsibility for change that, inevitably, some people won’t like? Better to rant and rave over ideological issues where gridlock is virtually guaranteed.
The genesis of litmus test politics probably lies in a toxic mix of gerrymandering, campaign finance, and reflexive social fears of Big Brother telling us how to live our lives. Whatever the causes, America’s political culture has changed. This isn’t the way the game worked under Howard Baker and Everett Dirksen and Sam Rayburn, but this is the way the game is played today. What’s missing? For starters, there’s no accountability for the growing dysfunction. It’s hard to hold anyone accountable when everyone—public and politicians alike—is trapped in a black hole of ideological stalemate—just turn on Fox News or MSNBC—without a line of sight to a new vision of a functioning democracy. The only people with visions are the loonies on both extremes.
America shouldn’t fear action. It should fear a political culture designed to divert attention away from practical realities. The solution is a clean break: New leaders who do not get overwhelmed by cynicism, emboldened by a popular movement to fix this broken system, top to bottom. At this point, America has far more to fear from continued paralysis than from action.
Posted 3/11/14 by Philip K. Howard
Mayor de Blasio’s threat to pull public support of charter schools has elicited powerful reactions, none more so than Peggy Noonan’s column this past weekend: "When a school exists for the students, you can tell. When it exists for the unions, you can tell that too."
The controversy centers on whether charter schools should pay rent or no longer be "co-located" within public schools. To review the bidding, charter schools in NY are privately-run schools that receive roughly the same stipend as the public school district spends. Since the public budget calculation excludes the free rent of existing schools (as well as unfunded pension costs), it’s hard to see why charter schools shouldn’t be kept at parity by getting free rent as well.
De Blasio’s objection to charter schools seems to be that they are "privileged": they receive supplemental outside funding, often from wealthy people, and tend to attract children of parents seeking a more rigorous educational environment. Many charter schools operate longer hours, more days, and with a longer school year. They also are liberated from the constraints imposed by central public bureaucracy and by the teachers’ union.
The huge advantage of charter schools is that everybody involved—teachers, parents, yes even funders—has a sense of ownership. The operative question, at all times, is this: "What’s the right thing to do?" If something isn’t working, administrators and parents and teachers can get together and talk about how to make things better. If there’s an opportunity, they have the authority—the freedom—to make exceptions.
This ownership of daily choices brings with it human power exponentially greater than that found in most rote organizations. My youngest daughter teaches first grade in a charter school in the middle of Brooklyn. She leaves just after 6 am and doesn’t get home until after 7. She is bursting with stories about her students. Last year most of her first graders, all from the projects, were reading well above grade level.
Not all charter schools achieve better academic results than public schools, and charters are (appropriately) subject to periodic re-accreditation. But charter schools almost can’t help but be better in instilling social values of right and wrong. Basic values needed to be a good citizen and hold a job are fostered by a school culture run by human values instead of mindless compliance with thick rulebooks.
It is correct, as Mayor de Blasio surmises, that charter schools therefore enjoy advantages over public schools. People in charter schools are energized about their ownership of daily choices. But is the solution to impose extra financial burdens on them? Dragging the best down is perhaps not the optimum public policy (as in Vonnegut’s short story "Harrison Bergeron," in which the intelligent get zapped whenever their brains start thinking too much). Maybe the correct policy is to reorganize public schools so that they, too, enjoy the freedoms and energy of charter schools. I bet even the union teachers would like it.
Posted 3/10/14 by Philip K. Howard
There are three major structural causes of unaffordable health care in America. One is defensive medicine (3-10% of total costs), caused by unnecessary tests and procedures done in part to help protect doctors from possible lawsuits for not "doing enough." Second is the "fee-for-service" reimbursement system, which incentivizes providers to deliver more (not better) care, accounting for an estimated 20 to 30% of extra costs. The third is bureaucracy—a torrent of bureaucracy—that invades every nook and cranny of caregivers’ days, costing twice as much per capita as the next most bureaucratic country, or an extra 15% of the healthcare dollar on excess administration. Doctors spend over 20% of their time on what used to be known as paperwork.
None of these sources of waste is a secret. The drafters of Obamacare tried to deal with aspects of them, as Ezekiel Emanuel recounts in his new book, Reinventing American Health Care. But change is hard. Change scares people. Change disrupts special interests. When things are working really badly, anarchist Peter Kropotkin observed at the turn of the last century, people cling even harder to the status quo, "lest [change] may make him more wretched still."
Change happens, but typically when the old system collapses. Wasting a trillion dollars a year on inefficient health care—that’s about $10,000 per family—will eventually cause the branch to break. So what should the new system look like?
Solving these problems requires entirely new frameworks. Fee-for-service systems should be replaced by "accountable care organizations," in which a provider takes care of all of a person’s health needs for one annual fee. When a patient needs specialized care, it should be done as "bundled payments," basically a fixed fee for everything to do with that problem.
Defensive medicine is the area of waste that Common Good is trying to solve. What’s needed is clear: replace unreliable jury-by-jury verdicts (and years-long emotionally-charged proceedings) with expert health courts that reliably sort good care from bad care. Only then can doctors go through the day relying on their best judgment instead of listening to a little lawyer on their shoulders.
The Obama administration and Dr. Emanuel have been helpful in advancing the cause of reliable healthcare justice; for example, in 2011 Obama presented a budget that included $250 million to help fund medical justice reform initiatives, including special health courts. I met and spoke with Dr. Emanuel several times during his tenure as a key healthcare adviser.
But Dr. Emanuel is not accurate in suggesting that the Affordable Care Act advances the cause of reliable justice. Yes, early drafts of the Act included provisions for pilot projects of alternative systems of justice. But Senate Majority Leader Harry Reid is a champion of trial lawyers. In the law as enacted, these pilots are permissible only if they "provide[ ] patients the ability to opt out of or voluntarily withdraw from participating in the alternative at any time."
Keeping the jury, of course, eviscerates the whole idea of consistent decisions. One jury can decide a case one way; another jury on the same facts can decide exactly the opposite way. Most studies suggest that an expert court would be fairer for injured patients as well as more reliable for innocent doctors. But the Affordable Care Act is unwilling to countenance a pilot project that doctors could rely upon.
President Obama’s heart certainly seems to be in the right place on this issue. But it will take more than a suggestion to an obdurate Senate to advance this reform. The public, polls show, overwhelmingly supports special health courts. But no change is likely to happen until there’s firm leadership from the President, or a crisis.
Posted 3/5/14 by Philip K. Howard
The Hill newspaper hosted a breakfast forum last week on the state of federal regulation. The panel consisted of former senators Blanche Lincoln (D) and George Allen (R), Susan Dudley at the GW regulatory center, Robert Weissman of Public Citizen, and me. The tone was generally moderate, and everyone seemed to agree that calling for wholesale "deregulation" is not useful. Most regulatory goals are unassailable—say, for worker safety or clean water.
But most agreed that regulation (as it's practiced in the U.S.) is a bureaucratic nightmare, and often ineffective. Some panelists pointed the finger at unaccountable agencies that, year after year, write the regulations. Robert Weissman focused on insufficient enforcement resources. I pointed to overly detailed statutes and regulations which become instantly obsolete when circumstances change. There seemed to be general agreement on the need for a mechanism to revisit old laws to see how they’re working—such as a mandatory "sunset."
So far so good. But then, prompted by a question from the audience, the panel began discussing how regulations should be structured. The senators both felt that laws and regulations should provide "clear metrics" by which the success of the program should be measured. Robert Weissman pointed out, correctly in my view, that many public goals embody moral and qualitative choices not readily quantifiable. What is the metric, say, for a successful special ed program? The number of students helped? What if the quality of the program is lousy? Moreover, anyone familiar with cost-benefit analysis knows how easily the numbers can be fudged. Metrics can be useful as a tool of analysis but not as the sole lodestar of success or failure.
While Robert Weissman did not buy into the notion of clear metrics, he did seem to believe that compliance with rules was a key to regulatory success. He pointed out that in the recent West Virginia chemical spill, the company did not have the required "material safety data sheet" showing how toxic the chemicals were. But rules are rigid metrics and often poor substitutes for right and wrong. For example, "MSDS" safety sheets required by worker safety regulations don’t leave room for judgment, so information on highly toxic chemicals is buried in thick notebooks containing sheets describing the perils of "Joy" dishwashing liquid and other benign products. Yes, it would be harmful to chug a jug of Joy, but most workers probably don’t have that urge. Almost no one actually reads the thick notebooks of MSDS sheets. It’s too hard to find any pertinent information. By not leaving room for human judgment to decide which chemicals are likely to cause harm, the rule requiring MSDS sheets is just a version of the boy who cried wolf.
Regulation can be coherent only if humans have room to use their judgment. Safety sheets should be displayed only for chemicals likely to cause harm. Letting people use their judgment doesn’t mean they can do whatever they want. Everyone is still bound to honor the goals and principles, and, if there is a dispute, there’s always a court to complain to. But accepting the role of human judgment opens the door to an open field of common sense instead of a bureaucratic jungle. There’s no need to tangle everyone up in legal vines—the 950 page Volcker Rule comes to mind—if we accept that regulation, like every other life activity, requires human judgment. Bureaucracy could be radically simplified if it focused on goals and guiding principles instead of rigid rules and metrics telling people exactly how to do their jobs.
I had this discussion almost 20 years ago with Joe Dear, the head of OSHA (the worker safety agency) under Clinton. I had been highly critical of OSHA in my book The Death of Common Sense, because studies showed that all its thousands of rules detailing exactly what kind of equipment to use, etc., had done almost no good. How could that be? It seemed that focusing on rule compliance had diverted attention away from the most important factor in safety—a workplace culture valuing safety training and attitudes.
Joe Dear turned out to be a remarkable public servant. He looked like a triathlete, and had an unusual willingness to question bureaucratic assumptions. He started encouraging regional managers to rethink how OSHA did its job. In Maine, OSHA entered into an informal arrangement with large employers to promote safety attitudes in lieu of mindless compliance with all the rules. Once the focus was on how workers did their jobs, instead of handing out fines for, say, having a railing of 38 inches instead of the required 42 inches, the workplaces became safer places. Letting humans use their judgment—both regulator and regulated—proved to be far more effective than focusing on compliance with thousands of rigid rules.
Last week on the Bloomberg ticker the news read that Joe Dear, who had become the Chief Investment Officer of Calpers, had died of cancer at age 62. What a great guy. What America needs, now more than ever, are public servants like Joe who are willing to buck the system by taking responsibility to achieve public goals.
Posted 3/4/14 by Philip K. Howard
My post on "the evil in investing in litigation" elicited strong reactions. While most were enthusiastic, including from prominent scholars in different fields, a few, from legal scholars I respect, were sharply negative. One said that "in this case you are completely wrong."
Legal scholars tend to look at the internal logic of law. Why not let claimants lay off some of the risk of litigation on outside investors? After all, especially in complex cases, it may facilitate someone’s ability to pursue the claim.
On the other hand, I see the corrosion of daily freedoms from the prevailing belief that any loss deserves compensation—the pervasive fear that any accident, any employment dispute, any commercial dispute can end in years of litigation. Transforming justice into a for-profit industry can only exacerbate those fears. Part of the distrust of litigation stems from extreme claims. Outside investors will demand that claimants sue for the moon; the more the better. Using litigation as a tool of extortion sounds like a good business model to an outside investor. But justice is supposed to provide compensation (not rewards) for injury caused by someone else’s error (not by the mere fact of an accident or dispute).
There’s a conflict in values here. Does a free society want to maximize the opportunities to sue, so that as few worthy claims as possible go unredressed? Or do we want to restore justice as a keel of reasonableness, with a reputation for keeping claims in line with social values of right and wrong? Stoking the fires of lawsuits with outside investors arguably maximizes claims, but at the cost of social trust. I vote for social trust, and for removing the sword of Damocles that hangs over daily interactions. That requires judges to dismiss extreme claims, and claims that might undermine the freedoms of others in society—whether children’s play, employer job references, teacher authority over classroom order, or any of the countless other freedoms that have been corroded by a sue-for-anything approach to justice.
Posted 2/28/14 by Philip K. Howard
This week the House passed several bills that purport to reduce excess or unwise regulation. Called the ALERRT Act (Achieving Less Excess in Regulation and Requiring Transparency Act), the package would make it harder for agencies to write new rules. It has about zero chance of becoming law with a Democratic Senate. With one possible exception, however, even if it did become law it wouldn’t relieve the burden of unnecessary regulation.
America now ranks 20th in the world in ease of starting a business, and probably close to last in ease of rebuilding infrastructure. But this is because of the massive accretion of obsolete laws and regulations, not the new ones. It is hard to find one regulatory program that isn’t obsolete or broken in significant ways. But most of these regulations aim for worthwhile public goals, so what’s needed is not deregulation but practical re-regulation. So why doesn’t Congress turn its focus to fixing what’s broken, including the many obsolete statutes that it enacted? Who else has that responsibility?
Sure, there are new regulations that Republicans oppose, such as stricter fuel-efficiency standards for big trucks. But these are a drop in the bucket compared to the obsolete old ones. If Congress really wants “to serve the American people and use taxpayer dollars wisely,” it should start the hard work—yes, sorry—of dislodging the special interest stranglehold on the status quo. By stacking more process on new regulations, the House is dodging its own complicity and contributing to the general paralysis.
So what would relieve unnecessary regulatory burdens? Sunsets on regulations and laws would be a good start, requiring lawmakers to periodically revisit how regulations actually work. Requiring an independent commission to report on whether the regulations, as written, serve the public good would enhance public accountability. Further, Congress wouldn’t need to put so many shackles on new regulations if it took back the authority to overturn regulations. Why shouldn’t Congress, as the constitutional lawmaking body, always have authority to veto regulations that are written under explicit congressional delegation? Today, under Supreme Court rulings, Congress can only veto a regulation if it “presents” this congressional act to the President for his signature, as it would with a new law. A constitutional amendment is thus required to restore congressional oversight of regulations it considers unwise. (I discuss this and other amendments as part of a Bill of Responsibilities in The Rule of Nobody, out in April).
One of the proposed bills in the ALERRT Act does strike me as deserving some consideration—the Sunshine for Regulatory Decrees and Settlements Act, which limits how federal agencies and plaintiffs can enter into settlements that result in new regulations. Using the guise of a lawsuit to “settle” by imposing new regulations and consent decrees is just a way to give courts authority that, under the Constitution, is supposed to be lodged in Congress.
Regulation-wary legislators from both parties should obviously oppose new rules they think are unwise. But that will do nothing to alleviate the existing regulatory heap that is piled high with burdensome, unnecessary rules. Moreover, imposing more bureaucratic process is unlikely to accomplish the goal of fixing new regulations, and just contributes to the bureaucratic sludge.
Posted 2/26/14 by Philip K. Howard
Restoring reliability to the medical malpractice system, Peter Orszag (former head of OMB under Obama) periodically reminds us, could avoid the vast waste of "unnecessary tests and treatments" ordered only because doctors "believe it will protect them from a lawsuit."
This week on Bloomberg View Orszag suggests that the solution is to create "safe harbors" for doctors who follow national guidelines. An added advantage is that doctors will feel compelled to keep up with national best practices instead of following “customary-practice standards” of the local community.
These strike me as good ideas with two very significant caveats:
First, who decides what qualifies for the "safe harbor"? Each patient presents a complex set of facts—say, a sore throat, aching ear, a slight fever. What if it escalates into a debilitating disease? Who decides whether the advice of taking two aspirin was appropriate? With the benefit of hindsight, any adverse medical event might have been handled differently. Certainly any lawyer could readily conjure up reasons why his client’s situation doesn’t quite fit the criteria of a safe harbor. So…does a jury decide? Do you think doctors would trust a jury to reliably sort out what qualifies for a safe harbor, while looking at a plaintiff who suffered from a terrible disease? There’s not a chance, in my view, that doctors would trust that system. Defensive medicine would continue to waste tens of billions every year.
Safe harbors won’t work without a reliable decision-maker. That’s why America needs expert health courts—where specially-trained judges, advised by neutral experts, decide each case with written rulings that strive to apply best practices to each fact situation. The health court proposal, developed by Common Good and the Harvard School of Public Health, has been endorsed by a broad coalition of doctors, patient safety experts, consumer groups, and every budget deficit commission.
The second problem is that safe harbors will not cover the universe of malpractice disputes, and won’t be relevant to many cases. Some patient situations will be completely unique. Do those cases go back to the current ad hoc jury-by-jury system, which has an error rate of about 25 percent? Will doctors really stop practicing defensive medicine when they’re not sure which cases will qualify? Here as well, special health courts can fill the gap. Even if a case does not fit within the safe harbor, doctors will be able to trust that an expert health court will strive to decide in accord with best practices.
The bottom line: Safe harbors are a good idea to incentivize better care, but they won’t achieve that, or end defensive medicine, without an expert health court that doctors trust.
Posted 2/24/14 by Philip K. Howard
What one thing would you change to make government work better? I’ve gotten this question many times. Pressing the reset button is clearly needed, because the dysfunction of endless bureaucracy and bought-off democracy has led to structural paralysis.
No new vision can work, however, until there’s been a complete overhaul of civil service. Over 22 million Americans work for federal, state, and local government. How well government works depends on how well they do their jobs. Today, most public employees wake up and go to work in suffocating bureaucracies. Teachers are demoralized by legal shackles that prevent them from maintaining order or, indeed, from teaching with the spontaneity needed to form a genuine bond with their students.
Three recent articles highlight the brokenness of public service. All are thoughtful, but each fails to come to grips with the depth of the challenge and the extraordinary opportunity of remaking the social contract with public employees.
In 2011, Wisconsin Gov. Scott Walker succeeded in breaking the power of public unions over hiring, firing, and seniority entitlements. Steven Greenhouse’s post-mortem in this weekend’s New York Times acknowledges that the changes resulted in greater efficiency, and that most public workers had dropped out of unions once they were allowed to. But he also suggests that the efficiencies are on the backs of public workers (such as higher contributions to health care) and have resulted in widespread demoralization. I am a little skeptical, and would like to see an impartial survey from a respected research outfit like Public Agenda. I also have two immediate comments on the report:
First, the unstated assumption of the report is that perhaps Wisconsin should return to the good old days of union power. But Wisconsin public unions, like public unions generally, were notoriously hidebound. It was impossible to terminate lousy teachers and other employees. It was almost impossible to manage them. The retirement rules were abusive, with some workers "retiring" in their 40s or 50s with pensions "spiked" by excess overtime in the last year of employment. Under seniority rules, a young teacher who was honored as one of the best first-year teachers in Wisconsin had to be laid off. Aaargh!!! There is a lot to talk about with public service, but the one place we don’t want to go is back to the old days.
The NYT story also assumes that civil service policy ultimately turns on your view of labor vs. management. I reject that premise: What’s important here is the public interest. The litmus test for Scott Walker’s reforms is whether they helped the public. If they result in better, more efficient government, then those are markers of success. If they demoralize public workers, then the reforms are not sustainable, and will drive good people away from government.
"Here’s How To Reform Civil Service in America" is the headline of a Washington Post interview of Prof. Linda Bilmes, an expert on civil service at Harvard’s JFK School. On tenure, she says, correctly, that bad employees are "a real morale drag for those who are working hard." But she blames this on inexpert managers: "Federal managers don’t know how to deal with poor performers." Excuse me: the legal armor surrounding civil servants is nearly impregnable. (See The Collapse of the Common Good.) As one manager told me, "you have to dedicate years" to getting rid of a bad employee. Far more efficient to work around the bad apples. And yes, one bad apple can indeed spoil the barrel. That’s one of the reasons working in government is so demoralizing. The solution is to strip away the legal armor, and replace it with non-legal checks on termination, such as an oversight committee that includes line employees. Everyone in an office knows who’s doing the job and who’s not.
Why aren’t more good people going into government? Prof. Bilmes suggests that young people are impatient: "If we want to attract the cream of the crop of this generation, the government needs to step up its game technologically and change the way agencies work to permit pockets of what I call ‘intrapreneurship,’ where people can create new things and run with new ideas."
EJ Dionne, in a Washington Post column, suggests that recruitment is a marketing problem, mainly caused by right-wing disparagement of public service, and that Obama should "lift up government service as a noble calling. The people we deride as bureaucrats are those who do the daily work of self-government on our behalf. We should never forget that self-government is a thrilling idea."
Actually, working for government would be, for most people, an awful experience. Who wants to work in a place where your ideas make no difference? The bureaucracy is exhausting. As Prof. Bilmes points out, it starts with the opaque, convoluted recruiting process. But that’s only the introductory quicksand to what promises to be a lifetime of frustration. Former NYC Commissioner Sam Schwartz noted that the bureaucracy of modern government drives good people out: as he put it, "expulsion of the fittest."
What amazes me is how many good civil servants stick it out, and deliver needed services despite work conditions that constantly trip them up. They deserve medals. But they are proof not of a working system but of the extraordinary strength of human character. Imagine what good they could do if they were free to roll up their sleeves and take responsibility.
Let’s agree on this: Getting able people into government should be a core goal. They should be honored, and treated fairly, and paid reasonably. Public service should be a noble career.
How do we achieve that? I believe America needs a new social contract for public employees. The first principle should be personal responsibility—meaning both the authority to make a difference, and the accountability that goes with that. Avoiding abuse is important—no spoils or arbitrary dismissals—but those goals can be achieved without tiptoeing through a legal minefield. The starting point is to acknowledge that the current system needs to be abandoned: As a report from the Partnership for Public Service concluded: "Today’s federal civil service system is obsolete."
Posted 2/21/14 by Philip K. Howard
Abraham Lincoln was an accomplished trial lawyer. He also believed that litigation should only be used as a last resort: "Never stir up litigation. A worse man can scarcely be found than one who does this." Lincoln’s view of the role of litigation prompts me to reflect on the new trend of outside investors funding lawsuits, discussed in an excellent op-ed by Gerald Skoning in today’s Wall Street Journal.
Americans have always been more litigious than people in other countries. The can-do spirit that drove Americans to push the frontiers (literally and figuratively) also resulted in more human conflict.
Only in the last 50 years or so, however, has litigation turned into a for-profit industry. A side effect of the 1960s rights revolution was the idea that people had a right to sue for anything. Human suffering became an opportunity to get rich. Entrepreneurial plaintiffs’ lawyers like Dickie Scruggs, Mel Weiss, and John Edwards congregated at the intersection of human tragedy and human greed, and became tycoons. It was easy work for anyone with a knack for sales. Just find any human suffering—a baby born with cerebral palsy, a company that went bankrupt, smokers who got sick—and sue for the moon. It was all about emotion: "How much would it be worth to you to have emphysema?" The families of victims got rich. The lawyers, skimming a third or more out of multiple verdicts and settlements, got really rich. Class actions were the pot at the end of the rainbow. Scruggs reportedly got a billion-dollar fee for settlement of mass tort claims on behalf of the State of Mississippi. With this much money slopping around, the temptations were too great to resist. Asbestos cases were rife with fraudulent doctors’ reports. Stakes were just too high to take the risk of losing—better just to pay someone off. Scruggs and Weiss ended up in jail.
But there are deeper flaws than fraud in this get-rich-through-litigation idea of American success. I forget whether it was Walter Olson or Dan Popeo who observed that "America can’t sue its way to greatness." When plaintiffs get rich, defendants get poor. Asbestos litigation has driven a hundred companies into bankruptcy, costing over 100,000 jobs and causing a decline in value of investments by pension funds and others. Southern hospitals that paid several hundred million dollars in 16 cerebral palsy cases brought by John Edwards had to raise prices, directly or indirectly, to pay those verdicts. Oh, not that it matters in today’s system of justice, but medical studies show that in over 90% of cerebral palsy cases, nothing the hospital or doctor did could have caused the condition.
These direct costs of sue-for-anything justice are only the tiny tip of a far larger cost—a pervasive fear of litigation has replaced a sense of freedom and spontaneity in social dealings. A tidal wave of defensiveness has washed over American culture. When anyone can sue for almost anything, people start going through the day looking over their shoulders. Doctors waste billions in "defensive medicine." Teachers no longer feel free to put an arm around a crying child. Businesses no longer give job references. Diving boards and seesaws disappear. Companies don’t take risks with innovative new products. Better safe than sorry. America’s can-do spirit turns upside down. Welcome to the culture of can’t do.
The flaw in America’s litigation philosophy, as I have argued, is the notion that suing is an act of freedom, like, say, free speech. No, it’s not: Suing is a use of state power, just like indicting someone. The mere act of filing a lawsuit puts a sword of Damocles over the head of the defendant. That’s why everyone is so defensive. Moreover, a lawsuit doesn’t just affect the immediate parties. What people can sue for establishes the boundaries of everyone else’s freedom. If a school in California gets sued when a child falls off a seesaw, you can be sure that schools in Massachusetts will remove seesaws. A laissez-faire approach to litigation profoundly corrodes the fabric of freedom. The solution—the only solution—is for judges and legislatures to draw the boundaries of who can sue for what as a matter of law. Every claim should first go through a legal gatekeeper, asking whether the claim might erode the legitimate freedoms of people in society. These rulings of law should affirmatively defend the freedom of people to take reasonable risks—like, say, children on a seesaw. Rulings of law establishing the boundaries of lawsuits are not somehow un-American. The role of the jury is to decide disputed issues of fact, not the legal boundaries of a free society. They’re called "lawsuits," not "claim-anything-suits."
So now let’s return to outside investors funding litigation. They should be barred, in my view, as they were under the common law prohibition against champerty. Litigation should always be about right and wrong. Investors care only about money. Litigation should strive to compensate for actual losses, not make people rich when tragedy occurs: "Gosh, it’s terrible your dad died. We’ll teach them a lesson. You can get a new boat." Legal claims should not be permitted to undermine broader social freedoms, and lawyers should be accountable for professional values that honor broader social goals. Investors have no professional obligations, and will have every incentive to game the system like it’s a casino. Turning litigation into a business is corrosive of almost every good value of the rule of law. Abraham Lincoln, if he were here, would make this moral case powerfully.