Blog — Op-Ed
Posted 3/4/14 by Philip K. Howard
Howard's Daily by Philip K. Howard
My post on "the evil in investing in litigation" elicited strong reactions. While most were enthusiastic, including from prominent scholars in different fields, a few, from legal scholars I respect, were sharply negative. One said that "in this case you are completely wrong."
Legal scholars tend to look at the internal logic of law. Why not let claimants lay off some of the risk of litigation on outside investors? After all, especially in complex cases, it may facilitate someone’s ability to pursue the claim.
On the other hand, I see the corrosion of daily freedoms from the prevailing belief that any loss deserves compensation—resulting in pervasive fears that any accident, any employment dispute, any commercial dispute, can result in years of litigation. Transforming justice into a for-profit industry can only exacerbate those fears. Part of the distrust of litigation stems from extreme claims. Outside investors will demand that claimants sue for the moon; the more the better. Using litigation as a tool of extortion sounds like a good business model to an outside investor. But justice is supposed to provide compensation (not rewards) for injury caused by someone else’s error (not by the mere fact of an accident or dispute).
There’s a conflict in values here. Does a free society want to maximize the opportunities to sue, so that as few worthy claims as possible are left unrequited? Or do we want to restore justice as a keel of reasonableness, with a reputation for keeping claims in line with social values of right and wrong? Stoking the fires of lawsuits with outside investors arguably maximizes claims, but at the cost of social trust. I vote for social trust, and removing the sword of Damocles that hangs over daily interactions. That requires judges to dismiss extreme claims, and claims that might undermine the freedoms of others in society, whether children’s play, employer job references, teacher authority over classroom order, or any other of the countless freedoms that have been corroded by a sue-for-anything approach to justice.
Posted 2/28/14 by Philip K. Howard
This week the House passed several bills that purport to reduce excessive or unwise legislation. Packaged as the ALERRT Act (Achieving Less Excess in Regulation and Requiring Transparency Act), the legislation would make it harder for agencies to write new rules. It has about zero chance of becoming law with a Democratic Senate. With one possible exception, however, even if it did become law it wouldn’t relieve the burden of unnecessary regulation.
America now ranks 20th in the world in ease of starting a business, and probably close to last in ease of rebuilding infrastructure. But this is because of the massive accretion of obsolete laws and regulations, not the new ones. It is hard to find one regulatory program that isn’t obsolete or broken in significant ways. But most of these regulations aim for worthwhile public goals, so what’s needed is not deregulation but practical re-regulation. So why doesn’t Congress turn its focus to fixing what’s broken, including the many obsolete statutes that it enacted? Who else has that responsibility?
Sure, there are new regulations that Republicans oppose, such as stricter fuel-efficiency standards for big trucks. But these are a drop in the bucket compared to the obsolete old ones. If Congress really wants “to serve the American people and use taxpayer dollars wisely,” it should start the hard work—yes, sorry—of dislodging the special interest stranglehold on the status quo. By stacking more process on new regulations, the House is dodging its own complicity and contributing to the general paralysis.
So what would relieve unnecessary regulatory burdens? Sunsets on regulations and laws would be a good start, requiring lawmakers to periodically revisit how regulations actually work. Requiring an independent commission to report on whether the regulations, as written, serve the public good would enhance public accountability. Further, Congress wouldn’t need to put so many shackles on new regulations if it took back the authority to overturn regulations. Why shouldn’t Congress, as the constitutional lawmaking body, always have authority to veto regulations that are written under explicit congressional delegation? Today, under Supreme Court rulings, Congress can only veto a regulation if it “presents” this congressional act to the President for his signature, as it would with a new law. A constitutional amendment is thus required to restore congressional oversight of regulations it considers unwise. (I discuss this and other amendments as part of a Bill of Responsibilities in The Rule of Nobody, out in April).
One of the proposed bills in the ALERRT Act does strike me as deserving some consideration—the Sunshine for Regulatory Decrees and Settlements Act, which limits how federal agencies and plaintiffs can enter into settlements that result in new regulations. Using the guise of a lawsuit to “settle” by imposing new regulations and consent decrees is just a way to give courts authority that, under the Constitution, is supposed to be lodged in Congress.
Regulation-wary legislators from both parties should obviously oppose new rules they think are unwise. But that will do nothing to alleviate the existing regulatory heap that is piled high with burdensome, unnecessary rules. Moreover, imposing more bureaucratic process is unlikely to accomplish the goal of fixing new regulations, and just contributes to the bureaucratic sludge.
Posted 2/26/14 by Philip K. Howard
Restoring reliability to the medical malpractice system, Peter Orszag (former head of OMB under Obama) periodically reminds us, could avoid the vast waste of "unnecessary tests and treatments" ordered only because doctors "believe it will protect them from a lawsuit."
This week on Bloomberg View Orszag suggests that the solution is to create "safe harbors" for doctors who follow national guidelines. An added advantage is that doctors will feel compelled to keep up with national best practices instead of following “customary-practice standards” of the local community.
These strike me as good ideas with two very significant caveats:
First, who decides what qualifies for the "safe harbor"? Each patient presents a complex set of facts—say, a sore throat, aching ear, a slight fever. What if it escalates into a debilitating disease? Who decides whether the advice of taking two aspirin was appropriate? With the benefit of hindsight, any adverse medical event might have been handled differently. Certainly any lawyer could readily conjure up reasons why his client’s situation doesn’t quite fit the criteria of a safe harbor. So…does a jury decide? Do you think doctors would trust a jury to reliably sort out what qualifies for a safe harbor, while looking at a plaintiff who suffered from a terrible disease? There’s not a chance, in my view, that doctors would trust that system. Defensive medicine would continue to waste tens of billions every year.
Safe harbors won’t work without a reliable decision-maker. That’s why America needs expert health courts—where specially-trained judges, advised by neutral experts, decide each case with written rulings that strive to apply best practices to each fact situation. The health court proposal, developed by Common Good and the Harvard School of Public Health, has been endorsed by a broad coalition of doctors, patient safety experts, consumer groups, and every budget deficit commission. See here and here.
The second problem is that safe harbors will not cover the universe of malpractice disputes, and won’t be relevant to many cases. Some patient situations will be completely unique. Do those cases go back to the current ad hoc jury-by-jury system, which has an error rate of about 25 percent? Will doctors really stop practicing defensive medicine when they’re not sure which cases will qualify? Here as well, special health courts can fill the gap. Even if a case does not fit within the safe harbor, doctors will be able to trust that an expert health court will strive to decide in accord with best practices.
The bottom line: Safe harbors are a good idea to encourage better care, but they won’t accomplish that, or end defensive medicine, without an expert health court that doctors trust.
Posted 2/24/14 by Philip K. Howard
What one thing would you change to make government work better? I’ve gotten this question many times. Pressing the reset button is clearly needed, because the dysfunction of endless bureaucracy and bought-off democracy has led to structural paralysis.
No new vision can work, however, until there’s been a complete overhaul of civil service. Over 22 million Americans work for federal, state, and local government. How well government works depends on how well they do their jobs. Today, most public employees wake up and go to work in suffocating bureaucracies. Teachers are demoralized by legal shackles that prevent them from maintaining order or, indeed, from teaching with the spontaneity needed to form a genuine bond with their students.
Three recent articles highlight the brokenness of public service. All are thoughtful, but each fails to come to grips with the depth of the challenge and the extraordinary opportunity of remaking the social contract with public employees.
In 2011, Wisconsin Gov. Scott Walker succeeded in breaking the power of public unions over hiring, firing, and seniority entitlements. Steven Greenhouse’s post-mortem in this weekend’s New York Times acknowledges that the changes resulted in greater efficiency, and that most public workers had dropped out of unions once they were allowed to. But he also suggests that the efficiencies are on the backs of public workers (such as higher contributions to health care) and have resulted in widespread demoralization. I am a little skeptical, and would like to see an impartial survey from a respected research outfit like Public Agenda. I also have two immediate comments on the report:
First, the unstated assumption of the report is that perhaps Wisconsin should return to the good old days of union power. But Wisconsin public unions, like public unions generally, were notoriously hidebound. It was impossible to terminate lousy teachers and other employees. It was almost impossible to manage them. The retirement rules were abusive, with some workers "retiring" in their 40s or 50s with pensions "spiked" by excess overtime in the last year of employment. Under seniority rules, a young teacher who was honored as one of the best first-year teachers in Wisconsin had to be laid off. Aaargh!!! There is a lot to talk about with public service, but the one place we don’t want to go is back to the old days.
The NYT story also assumes that civil service policy ultimately turns on your view of labor vs. management. I reject that premise: What’s important here is the public interest. The litmus test for Scott Walker’s reforms is whether they helped the public. If they result in better, more efficient government, then those are markers of success. If they demoralize public workers, then the reforms are not sustainable, and will drive good people away from government.
"Here’s How To Reform Civil Service in America" is the headline of a Washington Post interview of Prof. Linda Bilmes, an expert on civil service at Harvard’s JFK School. On tenure, she says, correctly, that bad employees are "a real morale drag for those who are working hard." But she blames this on inexpert managers: "Federal managers don’t know how to deal with poor performers." Excuse me: the legal armor surrounding civil servants is nearly impregnable. (See The Collapse of the Common Good.) As one manager told me, "you have to dedicate years" to getting rid of a bad employee. Far more efficient to work around the bad apples. And yes, one bad apple can indeed spoil the barrel. That’s one of the reasons working in government is so demoralizing. The solution is to strip away the legal armor, and replace it with non-legal checks on termination, such as an oversight committee that includes line employees. Everyone in an office knows who’s doing the job and who’s not.
Why aren’t more good people going into government? Prof. Bilmes suggests that young people are impatient: "If we want to attract the cream of the crop of this generation, the government needs to step up its game technologically and change the way agencies work to permit pockets of what I call ‘intrapreneurship,’ where people can create new things and run with new ideas."
EJ Dionne, in a Washington Post column, suggests that recruitment is a marketing problem, mainly caused by right-wing disparagement of public service, and that Obama should "lift up government service as a noble calling. The people we deride as bureaucrats are those who do the daily work of self-government on our behalf. We should never forget that self-government is a thrilling idea."
Actually, working for government would be, for most people, an awful experience. Who wants to work in a place where your ideas make no difference? The bureaucracy is exhausting. As Prof. Bilmes points out, it starts with the opaque, convoluted recruiting process. But that’s only the introductory quicksand to what promises to be a lifetime of frustration. Former NYC Commissioner Sam Schwartz noted that the bureaucracy of modern government drives good people out: as he put it, "expulsion of the fittest."
What amazes me is how many good civil servants stick it out, and deliver needed services despite work conditions that constantly trip them up. They deserve medals. But they’re proof not of a working system but of the extraordinary strength of human character. Imagine what good they could do if they were free to roll up their sleeves and take responsibility.
Let’s agree on this: Getting able people into government should be a core goal. They should be honored, and treated fairly, and paid reasonably. Public service should be a noble career.
How do we achieve that? I believe America needs a new social contract for public employees. The first principle should be personal responsibility—meaning both the authority to make a difference, and the accountability that goes with that. Avoiding abuse is important—no spoils or arbitrary dismissals—but those goals can be achieved without tiptoeing through a legal minefield. The starting point is to acknowledge that the current system needs to be abandoned: As a report from the Partnership for Public Service concluded: "Today’s federal civil service system is obsolete."
Posted 2/21/14 by Philip K. Howard
Abraham Lincoln was an accomplished trial lawyer. He also believed that litigation should only be used as a last resort: "Never stir up litigation. A worse man can scarcely be found than one who does this." Lincoln’s view of the role of litigation prompts me to reflect on the new trend of outside investors funding lawsuits, discussed in an excellent op-ed by Gerald Skoning in today’s Wall Street Journal.
Americans have always been more litigious than people in other countries. The can-do spirit that drove Americans to push the frontiers (literally and figuratively) also resulted in more human conflict.
Only in the last 50 years or so, however, has litigation turned into a for-profit industry. A side effect of the 1960s rights revolution was the idea that people had a right to sue for anything. Human suffering became an opportunity to get rich. Entrepreneurial plaintiffs lawyers like Dickie Scruggs, Mel Weiss, and John Edwards congregated at the intersection of human tragedy and human greed, and became tycoons. It was easy work for anyone with a knack for sales. Just find any human suffering—a baby born with cerebral palsy, a company that went bankrupt, smokers who got sick—and sue for the moon. It was all about emotion: "How much would it be worth to you to have emphysema?" The families of victims got rich. The lawyers, skimming a third or more out of multiple verdicts and settlements, got really rich. Class actions were the pot at the end of the rainbow. Scruggs reportedly got a billion dollar fee for settlement of mass tort claims on behalf of the State of Mississippi. With this much money slopping around, the temptations were too great to resist. Asbestos cases were rife with fraudulent doctors’ reports. Stakes were just too high to take the risk of losing—better just to pay someone off. Scruggs and Weiss ended up in jail.
But there are deeper flaws than fraud in this get-rich-through-litigation idea of American success. I forget whether it was Walter Olson or Dan Popeo who observed that "America can’t sue its way to greatness." When plaintiffs get rich, defendants get poor. Asbestos litigation has driven a hundred companies into bankruptcy, costing over 100,000 jobs and causing a decline in value of investments by pension funds and others. Southern hospitals that paid several hundred million dollars in 16 cerebral palsy cases brought by John Edwards had to raise prices, directly or indirectly, to pay those verdicts. Not that it matters in today’s system of justice, but medical studies show that in over 90% of cerebral palsy cases, nothing the hospital or doctor did could have caused it.
These direct costs of sue-for-anything justice are only the tiny tip of a far larger cost—a pervasive fear of litigation has replaced a sense of freedom and spontaneity in social dealings. A tidal wave of defensiveness has washed over American culture. When anyone can sue for almost anything, people start going through the day looking over their shoulders. Doctors waste billions in "defensive medicine." Teachers no longer feel free to put an arm around a crying child. Businesses no longer give job references. Diving boards and seesaws disappear. Companies don’t take risks with innovative new products. Better safe than sorry. America’s can-do spirit turns upside down. Welcome to the culture of can’t do.
The flaw in America’s litigation philosophy, as I have argued, is the notion that suing is an act of freedom, like, say, free speech. No, it’s not: Suing is a use of state power, just like indicting someone. The mere act of filing a lawsuit puts a sword of Damocles over the head of the defendant. That’s why everyone is so defensive. Moreover, a lawsuit doesn’t just affect the immediate parties. What people can sue for establishes the boundaries of everyone else’s freedom. If a school in California gets sued when a child falls off a seesaw, you can be sure that schools in Massachusetts will remove seesaws. A laissez faire approach to litigation profoundly corrodes the fabric of freedom. The solution—the only solution—is for judges and legislatures to draw the boundaries of who can sue for what as a matter of law. Every claim should first go through a legal gatekeeper, asking whether this claim might erode the legitimate freedoms of people in society. These rulings of law should affirmatively defend the freedom of people to take reasonable risks—like, say, children on a seesaw. Rulings of law establishing boundaries of lawsuits are not somehow un-American. The role of the jury is to decide disputed issues of fact, not legal boundaries of a free society. They’re called "lawsuits," not "claim-anything-suits."
So now let’s return to outside investors funding litigation. They should be barred, in my view, as they were under the common law prohibition against champerty. Litigation should always be about right and wrong. Investors care only about money. Litigation should strive to compensate for actual losses, not make people rich when tragedy occurs: "Gosh, it’s terrible your dad died. We’ll teach them a lesson. You can get a new boat." Legal claims should not be permitted to undermine broader social freedoms, and lawyers should be accountable for professional values that honor broader social goals. Investors have no professional obligations, and will have every incentive to game the system like it’s a casino. Turning litigation into a business is corrosive of almost every good value of the rule of law. Abraham Lincoln, if he were here, would make this moral case powerfully.
Posted 2/20/14 by Philip K. Howard
Nick Kristof’s essay last weekend bemoans the growing distance between academic thinkers and the world of public policy. One point that rings true to me is the almost emotional aversion by academic experts to coming up with solutions. As he notes, “In the late 1930s and early 1940s, one-fifth of articles in The American Political Science Review focused on policy prescriptions; at last count, the share was down to 0.3 percent.” Common Good's online policy forum series, NewTalk, has engaged expert academics who often show more interest in exhaustively analyzing problems than imagining potent solutions.
Government is broken. Everyone knows it. While there are a few experts out there actively pushing a new vision—for example, Harvard’s Larry Lessig with campaign finance reforms—they are the exception. The worse things get, the more reluctant experts are to go out on a limb to suggest new ideas.
Now, I don’t happen to believe that experts have a monopoly on wisdom. The more specialized they are, I’ve observed, the more likely their ideas will depart from good sense. But even a bad idea prompts debate and gets people thinking about change and innovation.
When things aren’t working, it’s easy to criticize. It’s even easier to throw up your hands and observe that nothing is politically feasible. After all, Congress can barely avoid national default by raising the debt ceiling.
But the current system is not fiscally sustainable. The time will come when America must make new choices. For these choices to be good choices—moral as well as practical, and consistent with America’s noble founding values—there must be a new vision. Who is coming up with that vision?
Posted 2/19/14 by Philip K. Howard
A shipload of salt to deal with this year’s snow and ice on New Jersey’s roads has been detained in legal limbo in Providence harbor, en route from Maine. The problem, detailed in today’s New York Times, is that it’s illegal for a foreign-owned vessel to ship goods from one U.S. port to another. (There’s even a word for it, I learned. Domestic shipping is called "cabotage.") Now, there’s nothing apparently wrong with the ship, which had just finished unloading its cargo in Maine and was available to take on the salt immediately. But an obscure 1920 law known as the Jones Act requires a U.S. ship, with a U.S. crew, on all domestic routes. There’s a cottage merchant marine industry and union that exists just because of this law.
In this era of free markets, one would think that protectionist laws from almost 100 years ago would have gone the way of the horse and buggy. But laws have remarkable staying power (as we saw two weeks ago with the continuation of New Deal-era farm subsidies). The same onerous process for enacting a law applies to repealing it, with one additional, almost insurmountable, hurdle: the law now is surrounded by an army of special interests who will do anything to defend it (think campaign money and ad hominem attacks on would-be reformers). That’s why, in the strange culture of Washington, repealing laws is so rare as to be almost unthinkable. Getting rid of old laws violates the laws of legislative physics.
Laws pile up, year after year, like sediment in the harbor. Society, meanwhile, is increasingly paralyzed. The U.S. now ranks 20th in the world in ease of starting a business. This is because of thousands of laws like the Jones Act.
American democracy has a structural problem: there’s no political or legal imperative to clean the stables. The accumulation is so bad that, as I argue elsewhere, America should initiate a series of commissions, area by area, to recommend what are known as "recodifications" of law—new, simpler codes that reflect current national goals and priorities. Going forward, most regulatory programs should periodically "sunset," with an action-forcing mechanism (perhaps a constitutional amendment) that prevents Congress from simply re-enacting the same program in a midnight vote.
It’s impossible to run a government, much less balance public budgets, under the weight of hundreds of laws and programs that are obsolete in whole or part. The weight grows heavier every year. It will break, sooner or later. Perhaps it’s time to start thinking about how to fix it.
Posted 2/18/14 by Philip K. Howard
Yesterday’s White House report, claiming 6 million jobs saved by the $800 billion stimulus plan, predictably prompted partisan debate on fiscal waste. I will not tread into the foggy land of economic theory, except to note that spending more money intuitively always stimulates economic activity—but at the cost of economic health in the future if the money isn’t invested to stimulate future growth (which economists call a "multiplier" effect).
Probably the wisest investment is in rebuilding America’s decaying infrastructure. This was the focus of the President’s push for the stimulus back in 2009, and also the headliner in the report issued yesterday: The stimulus "initiated more than 15,000 transportation projects, which will improve nearly 42,000 miles of road, mend or replace over 2,700 bridges, and provide funds for over 12,220 transit vehicles," plus improving 6,000 miles of rail.
These all sound like good investments to me, but I was curious how much of the stimulus plan went to these transportation infrastructure projects. Towards the back of the report (Table 8 on p. 34) there’s a chart that gives the number: $30 billion. That’s a little over three percent of the total stimulus plan.
Three percent!! American infrastructure recently received a D+ rating from the American Society of Civil Engineers. All those repair projects, listed above, only scratch the surface of America’s decaying infrastructure. Why wasn’t more spent on this urgent need? Modernizing American infrastructure will improve competitiveness, create a "greener" footprint, and has a high "multiplier" on each dollar invested. We know that’s what President Obama wanted at the outset. Why didn’t it happen?
Let’s break this down into two questions. First, how did the headline goal of the stimulus—rebuilding infrastructure—become a small footnote? Because, as Obama subsequently discovered, "there’s no such thing as shovel-ready projects." The approval process for any significant project (a new road, or power line, or pipeline) approaches a decade, and often longer. An impenetrable legal swamp stands between America and a modern infrastructure.
Second, if not infrastructure, where was most of the stimulus money spent? About $500 billion went to tax cuts, unemployment benefits, and "state fiscal relief" (shoring up insolvent state budgets). The remaining $300 billion was spent on actual projects, of which the big beneficiaries were: (i) subsidies for clean energy ($78 billion), (ii) subsidies for education and child support ($50 billion) (student loans, special ed, and support for disadvantaged children), (iii) health and health IT ($32 billion), (iv) transportation infrastructure ($30 billion, as noted above), (v) environmental cleanup ($28 billion), (vi) new buildings ($24 billion), (vii) scientific research ($18 billion), and a few other categories.
Look at where the stimulus money was spent. Virtually none of the stimulus categories require a significant government approval process. The conclusion is unavoidable: Government is unable to pursue vital public investments because government has lost the authority to approve them. It's pathetic: Government can't get out of its own way. If America really wants to rebuild the economy with modern infrastructure, the first task is to rebuild its own authority structure so that approvals take 12 months, not 12 years. See here and here.
Posted 2/14/14 by Philip K. Howard
Why is it, Jon Stewart asks Nancy Pelosi, that everyone talks about politics, and no one focuses on government’s inability to get things done sensibly? Pelosi: Republicans are obstructive.
Stewart: To make the Democrats’ case, wouldn’t it be helpful if government could actually do the job competently? Pelosi: Democrats need better messaging.
What about the failure of the Obamacare rollout? Pelosi: "I don’t know." Stewart, rising as if to leave: "Let me get the House Minority Leader here—I can ask her. … How do you not know?" Pelosi: "It’s not my responsibility."
Stewart: "Has the regulation become so onerous that government can no longer be agile?" Pelosi: "The procurement process…everybody knew about that."
Stewart: The Obama campaign computer genius wouldn’t bid for the Obamacare IT contract because he couldn’t figure out how to navigate the "300-page document" for bidding. Pelosi: "It doesn’t matter. … It should have been prepared for."
Stewart: "Do we have a foundational problem?" Pelosi: No. Stewart asks for an example of government’s ability to do its job "in an agile and efficient way." Pelosi: The Affordable Care Act.
Let’s pause for a minute. Here we have the top House Democrat who, apparently, doesn’t understand that government doesn’t actually work very well. Not that Republicans would get to the point either. They would be quick to jump on government’s failures, but rarely offer solutions to help government work sensibly.
Political leaders apparently see all issues through the lens of partisan debate, not whether government actually works. In the hermetically sealed bubble that is Washington, our would-be leaders fight about ideology. Dysfunctional bureaucracy, as Pelosi put it, "doesn’t matter. … It should have been prepared for."
Oh, ok, who is in charge of making government work? Pelosi says it’s not Congress’s responsibility. The President is neck deep in decades of statutory and bureaucratic accumulation, like the 300 pages of procurement regs, and lacks legal authority to clean it out. So ask yourself again: who’s in charge of fixing government?
Change will only come from the outside, as retiring Sen. Tom Coburn recently noted. Fixing broken government will require a popular movement to force change. Someone recently asked me what the rallying cry might be for a movement. Maybe we could sponsor a contest. Should we demand that every member of Congress resign? Or call for a constitutional convention?
The moral here is not that Pelosi looked ridiculous. She has the wrong idea of her responsibility. She doesn’t know what her job is. That’s a flaw in America’s political culture. The only way to fix it is dramatic intervention from the outside.
Posted 2/13/14 by Philip K. Howard
There’s a wide chasm between those who write regulations and the humans expected to abide by them. Real people don’t have the capacity or time to understand, much less comply with, scores of regulations. This is a reason why regulation so often is counterproductive. People preoccupied with rule compliance no longer act sensibly. Focus on A, as sociologist Robert K. Merton put it, and you cannot see B.
Healthcare delivery in America has been suffocated by bureaucracy. How this affects the daily choices of physicians is described by Victoria McEvoy from Harvard Medical School in today’s Wall Street Journal. Regulations presume to guarantee proper care by forcing doctors to go down checklists of every possible treatment associated with, say, an obese child. The problem, of course, is that all this time checking boxes "takes precious time away from doctor-patient communication. Not one of my patients has lost a pound from my box checking."
Like marionettes in a dystopic puppet show, all day long physicians are jerked away from sensible patient care by regulatory mandates written without any concern for human bandwidth. But doctors aren’t computers. Sometimes the box needs to be checked—did the surgeon double-check all requirements?—but most regulatory requirements are in service of a form of central planning, as often requiring useless activities as sensible ones.
Follow all these rules, regulators think, and health care will be perfect. But regulations can’t honor the complexity of the actual patient situation that the doctor is facing. So, when "one metric is off," regulations compel doctors to take certain actions, even where those actions make no sense.
The fee-for-service reimbursement bureaucracy multiplies the box-checking and skewing of sensible judgment (doctors spend 30% of their time on paperwork). Regulatory overload in health care causes various forms of failure—unnecessary cost, grotesque inefficiency, corrosion of professional judgment, and a palpable degradation of professional spirit.
Then pile on top of this the ability of any sick person to bring a lawsuit against a doctor in almost any amount, without any reliable decision-maker, and—voila—you have the world’s most expensive healthcare system, by almost a factor of two, and perhaps the most dispirited medical professionals in the developed world.
The solution is not getting rid of regulatory oversight, but re-humanizing it. Box checking should be restricted to high-risk activities. Ideas for dealing with this or that disease should be placed in a reference manual as guides, not as a mandatory compliance regime. Accountability should be determined after the fact, by periodic reviews based on the judgment of professionals who understand the complexities, not by rigid metrics.
The quest for regulatory perfection, like the quest for legal certainty, does not avoid failure, but causes failure.