Why You Pay Lots for Emergency Care: “Duh, It’s a Monopoly, Dude.”

[This article is from the exemplary on-line magazine Vox, by Sarah Kliff, who is a super interpreter of public policy in the health arena. Follow her regularly, including especially her daily VoxCare column. ]

Emergency rooms are monopolies. Patients pay the price.

New data shows how emergency rooms take advantage of their market share, at the expense of their patients.

Amanda Northrop/Vox
Around 1 am on August 20, Ismael Saifan woke up with a terrible pain in his lower back, likely the result of moving furniture earlier that day.

“It was a very sharp muscle pain,” Saifan, a 39-year-old engineer, remembers. “I couldn’t move or sleep in any position. I was trying laying down, sitting down, nothing worked.”

Saifan went online to figure out where he could see a doctor. The only place open at that hour was Overland Park Regional Medical Center in his hometown of Overland Park, Kansas.

The doctor checked his blood pressure, asked about the pain, and gave him a muscle relaxant. The visit was quick and easy, lasting about 20 minutes.

But Saifan was shocked when he received bills totaling $2,429.84.

The bill included a $3.50 charge for the muscle relaxant. The rest — $2,426.34 — was from “facility fees” charged by the hospital and doctor for walking into the emergency room and seeking care.

Because Saifan’s health spending is still within his plan’s deductible, he is responsible for the entire amount.

“I called the insurance company to make sure the bill was real,” he says. “They said it was a reasonable price, and gave me a breakdown.”

Spending on emergency room fees has increased by $3 billion — even though the number of fees is declining slightly

There are 141 million visits to the emergency room each year, and nearly all of them (including Saifan’s) have a charge for something called a facility fee. This is the price of walking through the door and seeking service. It does not include any care provided.

Emergency rooms argue that these fees are necessary to keep their doors open, so they can be ready 24/7 to treat anything from a sore back to a gunshot wound. But there is also wide variation in how much hospitals charge for these fees, raising questions about how they are set and how closely they are tethered to overhead costs.

Most hospitals do not make these fees public. Patients typically learn what their emergency room facility fee is when they receive a bill weeks later. The fees can be hundreds or thousands of dollars. That’s why Vox has launched a year-long investigation into emergency room facility fees, to better understand how much they cost and how they affect patients.

Saifan’s bill was so expensive, it turns out, because the hospital used the facility fee typically reserved for complex, intensive emergency room visits.

Emergency room facility fees are usually coded on a 1-to-5 scale to reflect the complexity of care delivered to the patient. Saifan's visit, in which he received a muscle relaxant, was coded by the doctor as a level 4 visit — the second-highest — and came with hefty fees as a result.

The hospital billed a separate facility fee (!) and chose level 3, typically reserved for moderately complex visits.

Saifan’s experience isn’t an anomaly: A new Vox analysis reveals that emergency rooms all across the country are increasingly using these higher-intensity codes, and that the price of these codes has increased sharply since 2009.

Vox worked with the nonprofit Health Care Cost Institute (HCCI) to analyze 70 million insurance bills for emergency room visits between 2009 and 2015. We focused on the prices that health plans paid hospitals for facility fees, not the hospital charges (which are often inflated well above what patients actually pay).

We found that the price of these fees rose 89 percent between 2009 and 2015 — twice as fast as the price of outpatient health care, and four times as fast as overall health care spending.

Overall spending on emergency room fees rose by more than $3 billion between 2009 and 2015, even though the HCCI database shows a slight (2 percent) decline in the number of emergency room fees billed over the same period.

“It is having a dramatic effect on what people spend in a hospital setting,” says Niall Brennan, executive director of the Health Care Cost Institute. “And as we know, that has a trickle-down effect on premiums and benefits.”

Javier Zarracina/Vox

The HCCI data shows that prices are rising dramatically and that, increasingly, hospitals have gravitated to using the most expensive billing codes — the level 4 and 5 charges, typically reserved for the most complex visits.

The rising price of emergency room facility fees, coupled with growing usage of the most expensive codes, means it's significantly more expensive to go to an emergency room now than it was six years ago.

Hospitals argue that these increases are due to an aging, sicker population.

“If you have a monopoly — and when it comes to the ER, it’s a monopoly — you can set any price you want,” says Robert Derlet, a professor emeritus in emergency medicine at the University of California Davis, who has been critical of ER billing in the past.

“What is going to deter me from increasing my price? Who can stop me? If I’m the financial officer for the hospital, I might even get a bonus for doing this.”

Hospitals increasingly code emergency room visits as complex

David Overton knows what strep throat feels like. He comes down with it once or twice every year. He typically goes to urgent care or a drug store clinic when his achy throat and fever symptoms start.

But his most recent strep throat infection flared up on Memorial Day, and those clinics were closed. So he went to Legacy Emergency Room & Urgent Care outside of Dallas.

Overton walked in the door for urgent care, but staff there said his case was severe enough to move him to the emergency room side of the clinic.

“I felt terrible, and I wasn’t up for debating it,” he says. “So I was like okay, I guess we’ll do this.”

The emergency room performed a CT scan and used IV medications to treat Overton. He immediately felt better, and left with a prescription for antibiotics. Three days later, he saw another doctor who confirmed it was a simple case of strep throat.

But because of the complex intervention — the CT scan, the IV drugs, the long visit — the hospital coded Overton as having the most complex visit possible, a level 5. They billed him a $1,900 facility fee. Because he has a high-deductible plan, he's responsible for all of it.

“Did I need the CT scan? No. Did I need the IV antibiotics? No,” he says. “I could have been treated with oral antibiotics.”

He's currently paying off the bill in $50 monthly installments. So far, he's paid $300.

Emergency billing guidelines offered by the American College of Emergency Physicians typically reward doctors for providing a higher level of medical care. They instruct hospitals to use the more expensive billing codes for cases where they have to perform multiple scans and examinations.

Two different hospital trips for the same condition may be treated and billed quite differently. A case of strep throat treated with oral antibiotics — a simple visit — would likely be coded as a level 1 or 2 visit. But a visit with multiple scans and an IV drip could come out to a 4 or 5.

The HCCI data set suggests that Overton’s experience may not be atypical. It shows that more and more emergency rooms are billing the severe, expensive facility fee charges.

What’s more, there are no federal guidelines on how to code even the exact same visit. This is left up to a hospital’s billing staff, meaning that if two patients receive identical care in different emergency rooms, one may be coded as a level 3 and another as a level 4.

“There are charges that people could look at differently,” says David McKenzie, reimbursement director for the American College of Emergency Physicians. “Reasonable people could disagree on severity.”

These discrepancies can be expensive for patients, as emergency rooms charge hundreds of dollars more for the more severe codes.

The HCCI database shows that the average price of a level 3 facility fee (in medical coding, this is billed as 99283) is $576. Go up to the next severity code, level 4 (or, in medical codes, 99284), and the price rises to $810.

The price of facility fees has risen steadily in recent years. A level 3 code (99283) now costs, on average, $576. A level 4 code (99284) averages $810.
Christina Animashaun/Vox

“Hospitals can make a lot of money charging for all the extras — CT scans, MRIs, laboratory fees, even starting an IV,” Derlet, the emergency physician, says.

In 2009, 50 percent of all emergency room facility fee charges were for level 4 and 5 codes. In 2015, that number rose to 59 percent.

How to interpret that trend isn't fully clear. Some say it could signal hospitals charging higher rates for similar care. But the current data makes it impossible to rule out the possibility that emergency room visits are simply getting more serious.

Medicare tried to simplify facility fees. It failed.

The government has tried to crack down on high emergency room fees before — but ultimately was scared off by intense pushback from the health care industry.

In 2012, an investigation by the Center for Public Integrity showed that hospitals had earned an additional $1 billion in Medicare revenue by using the most expensive facility fees — the level 5 codes.

This meant that the public insurance program that covers Americans over 65 was suddenly spending significantly more money on these routine fees.

Christina Animashaun/Vox

In response, the Obama administration proposed eliminating the facility fee levels entirely, to get rid of any incentives to bill for a higher price. It suggested one flat fee for all visits.

“A single code and payment for clinic visits is more administratively simple for hospitals and better reflects hospital resources involved in supporting an outpatient visit,” Medicare argued in November 2013.

But that rule never saw the light of day. After intense pushback from hospitals and doctor groups, the issue was dropped and the fee levels remain today.

“I remember we got an onslaught of comments, hospitals being very frustrated, emergency room doctors being upset,” one former Medicare official involved with the rule said. “It definitely got pulled, and it was just about the amount of commenting and concerns from the industry.”

At the time, Medicare issued a statement saying it “intends to consider options to improve the codes for these services in future rulemaking.” So far, the agency has taken no actions to re-regulate these codes.

Some patients, however, have had success taking matters into their own hands.

Last winter, John Shelbourne ended up with a small gash above his eye from a basketball game with friends.

“It was just big enough that it needed to be closed,” Shelbourne, 37, says. “It was probably 9 at night, so I had to head to the emergency room.”

The doctors at Swedish Covenant Hospital glued the wound shut and covered it with Steri-Strips. Shelbourne estimates the visit took about 15 minutes.

A few weeks later, he received a bill with an $899 facility fee. (Incidentally, the Steri-Strips — which cost $1.49 for a box of 12 at Target — were billed at $14.)

Shelbourne’s visit was coded level 3, a medium-intensity facility fee. He showed the bill to his father, a doctor.

“I was able to ask him what these levels were, and he asked his nurses, who knew the difference,” he says. “They said level 3 is a complex procedure, and once I found that out, I knew there was no way this was a complex visit.”

Shelbourne started calling the hospital, arguing that his visit should be coded level 2 instead of 3. He made 25 separate calls over two months. He kept records of all of them, on sticky notes around his desk at work. The hospital eventually agreed with him and lowered his visit from a level 3 down to a level 2.

That change lowered his portion of the bill from $441 down to $305.

“I’m stubborn,” Shelbourne says. “Once I found out what a level 3 visit was, I got pissed off about it. I was on a mission. I went from not knowing the difference between ER visit levels to seeing how these things could get totally slipped by you.”

Saifan, the engineer in Kansas who received the $2,429.84 bill for the muscle relaxant, is disputing his bill as well.

“We’ll see how the dispute goes, but I’m not expecting it to change,” he says. He says the hospital told him he could get a 10 percent discount if he paid the bill in full rather than installments over time, which he’ll do if his dispute is unsuccessful. His family has enough money to pay off the bill.

“It’s not easy to pay $2,500, but it won’t be life or death,” he says. “It would be a pretty frustrating disappointment.”

Help us report on the costs to visit the emergency room. Share your bill here.

Posted in Fun, Governance, Health

US Fisheries Regulation: States’ Rights, except When the Emperor Says NO

A really important example of how to kneecap regulatory processes. The result will almost certainly be over-exploitation of the seventh-largest fishery on the East Coast (summer flounder). Needless to say, this is being led by federal Trump administration regulatory agencies overruling state and regional authorities. Painful to see …

From the Washington Post 


Trump administration dives into fish fight

November 21 at 2:27 PM
WASHINGTON — An unprecedented Trump administration decision over the summer that overruled an interstate fishing commission has drawn the ire of critics who worry that keeping a healthy and viable supply of flounder in the Atlantic Ocean is being sacrificed to commercial profits.

While the fight over fish largely has been out of the public eye, it has implications for Maryland and other coastal states. Critics charge the controversy further underscores environmental backsliding by a White House beholden to business interests seeking fewer restrictions on the potentially harmful exploitation of natural resources.

In July, Secretary of Commerce Wilbur Ross overruled a recommendation by the Atlantic States Marine Fisheries Commission finding New Jersey out of compliance with proposed 2017 harvest limits of summer flounder along the Atlantic coast.

The reversal marked the first time since passage of the Atlantic Coastal Act in 1993 that the Department of Commerce overruled the commission's finding of noncompliance, said commission spokeswoman Tina Berger.

"It was a big surprise that the commission's authority would essentially be disregarded by the Commerce Department," said Maryland Del. Dana Stein, D-Baltimore, one of the fisheries commissioners. "I was very disappointed upon hearing about this."

Former commission Chair Douglas Grout at the time said the “commission is deeply concerned about the near-term impact on our ability to end overfishing on the summer flounder stock, as well as the longer-term ability for the commission to effectively conserve numerous other Atlantic coastal shared resources.”

The commission, formed by the 15 Atlantic coastal states in 1942, provides a platform for states to coordinate management plans to conserve fishing stocks.

Each state is represented by three commissioners, including a member of the state legislature, an industry representative and a state official.

Fisheries management is a complicated and difficult field that uses a number of measurements to estimate the number of fish of a particular species in the ocean.

Much of the data is collected by the National Marine Fisheries Service, a part of the National Oceanic and Atmospheric Administration (NOAA).

The data includes fish size, recreational and commercial harvest amounts and assessments of the habitat and movement of species. Also measured is spawning stock biomass, defined as the total weight of male and female fish in a population that contribute to reproduction.

The focus of the dispute is New Jersey’s plan for summer flounder, also known as fluke, a large, flat fish that in Maryland is caught both in the Chesapeake Bay and on the Atlantic seaboard.

At issue is how many fish of any species can be taken in a season without tipping the balance toward a steady decline in the overall stock, as occurred years ago with striped bass.

Growing up to four feet long, summer flounder is the seventh most-fished species in Maryland and is particularly prized by recreational fishermen.

The flounder reach spawning age at around two years, by which time a mature fish should measure approximately 10 inches long.

New Jersey proposed allowing the harvest of approximately 93,000 more fish this year – roughly double the previous quota limit. The state contended that, through an angler education program, it could reduce the number of undersized flounder that die after being released back into the ocean by anglers.

New Jersey's "discard mortality" estimate was rejected by the fisheries commission in its technical report as unquantifiable. But the Commerce Department said Ross accepted NOAA's judgment that New Jersey's plan would work "while also preserving jobs supported by the recreational summer flounder industry" in the state.

It is unclear what NOAA told the Commerce Department. Officials with NOAA declined requests for comment.

However, an earlier report by NOAA contradicts the position the Commerce Department took on the health of the summer flounder supply. In short, that report said that summer flounder was experiencing overfishing and noted that spawning stock biomass of the species decreased significantly between 2013 and 2016.

The NOAA report also noted that, “as the result of the 2016 assessment update, reductions in catch and landings limits were required for 2017 and 2018.”

In addition, a memo from the Commerce Department to the Atlantic States Marine Fisheries Commission said that while it was possible New Jersey’s proposal would result in equal conservation, it recognized that “there is some uncertainty about how effective the New Jersey measures will be.”

"There's a serious question here of transparency," said Molly Masterson, project attorney at the Natural Resources Defense Council, a nonprofit organization focused on long-term management of natural resources.

“… We don’t know, but if commerce and the technical advisors at NOAA were at odds on this that’s really important for the public to know and as it currently stands we just don’t know,” she said.

According to Kiley Dancy, program manager for summer flounder at the Mid-Atlantic Fishery Management Council — one of eight federally mandated regional fishery management councils — summer flounder was one of the "best assessed" species managed by the commission.

“Almost all of the input into the assessment have shown pretty substantial declines of summer flounder over the years, so although there may be some uncertainties in exactly where the biomass is right now, we’ve seen trends in declines in these indices for almost all of the indices that are in the assessment,” Dancy told Capital News Service.

Her assessment was shared by Maryland officials familiar with the issue.

“The flounder stock has shown a kind of extended period of decline over the last decade from a high point, you know, ten years ago, to a point in time now where the stock is approaching the threshold level for which more significant management action would have to happen,” said Michael Luisi, program director at the Maryland Department of Natural Resources.

But New Jersey officials say their approach was quantifiable and based on hard data.

“At the end of the day, we’re the Department of Environmental Protection,” said that agency’s assistant commissioner, David Glass. “We’re a science-based agency and were able to ultimately be successful by providing sound science and data to the secretary of commerce and NOAA fisheries.”

"We've contracted with Montclair State University here in New Jersey, to conduct a survey," Glass added. "They did a preseason survey for us, and they're doing a post-season survey to show, ultimately was our campaign effective? Did it change angler behavior? Did it help save more fish in the water?"

Officials in Maryland were cautious about the approach taken in New Jersey.

“It’s not that New Jersey wasn’t acting in the best interest of conservation,” Luisi said. “They just did it in a different way and maybe it was a little less quantified based on the hard science, but it doesn’t mean it was wrong.”

“I think there was just a difference of opinion regarding the management actions that one particular state was presenting as something that they felt was equal to that of the other states,” Luisi added.

The fisheries commission tends to err on the side of caution, Masterson said, noting that with a vulnerable population such as summer flounder, “it’s really critical that the managers get it right based on a really robust scientific and management strategy evaluation process.”

Luisi agreed: “The stock is approaching the threshold level for which more significant management action would have to happen…Managers need to be conservative in how they deal with quotas.”

A statement from the Department of Commerce maintains that the decision was in keeping with the available data and with recommendations from the National Marine Fisheries Service.

“The long-term sustainability of American fishing stocks and the jobs that rely on them are of the utmost concern to Secretary Ross,” said a statement provided by the department.

But the matter also seems to be one of political and commercial interest taking precedence over economic and environmental sustainability, according to NRDC's Masterson.

She insisted that "the secretary's decision had absolutely no technical support or analysis from a conservation standpoint as to why it warranted…overturning the commission's decision and why New Jersey's proposal would be enough basically for conservation."

The future of sustainability efforts now appears to be in a state of uncertainty, according to state fishery managers, with the strict limits imposed by the commission suddenly open to question.

“There’s a real concern of states coming out sort of at the last minute and saying, ‘Oh, we want to do something totally different and… because of political influence, we have the guys at the commerce that are going to support us,’” said Masterson.

If other states are able to lobby the Commerce Department directly for changes to fishing regulations, as New Jersey has done, Stein said he doesn't want Maryland to lose out.

“I would hope that Maryland wouldn’t be the next state (to loosen regulations), but if it seems like that’s the trend, Maryland would feel it’d have to defend its own interests,” the lawmaker said.

“The decision by Commerce – it makes the whole compliance conservation equivalency a little bit gray. How that translates into future management, it’s yet to be determined,” Luisi said.

But the decision may be popular with fishermen, who contend the fisheries commission’s zeal to protect the fish supply often exceeds its technical knowledge.

“I’m glad to see that somebody stood up to the commission,” said Robert Brown, president of Maryland Watermen’s Association, which represents commercial fishermen in the state.

“The best science that they say is available — it isn’t such a thing,” Brown said. “It’s the best assessment, the best guess. There’s no way you can tell how many fish are out there.”

Brown’s elation could be short-lived.

Berger, the fisheries commission spokeswoman, said that if summer flounder reaches an overfished status, more stringent federal laws could impose fishing moratoriums on the species.

Copyright 2017 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.


Posted in Fun

The Tormenting of Tillerson

from the October 7th edition of The Economist, page 32:

Crawling back to you

Mr Tillerson deserves only slight sympathy. He has shown little interest in representing American ideals, such as the promotion of human rights, while carrying out a botched reorganisation of the State Department that has left it hollowed out and dysfunctional. Many important posts remain unfilled—including those of assistant secretary of state for East Asia and ambassador to Seoul. He would not be much missed if he decided to quit. But on October 4th, Mr Tillerson declared that he would soldier on in his thankless job. That may not be a bad thing. Such is the damage being done to the effectiveness of American diplomacy at Mr Trump’s hands, it is doubtful whether anyone of stature would be willing to take his place.

Now to the Washington Post’s editorial of today, 13 November 2017 — 

Tillerson’s ‘redesign’ for State looks a lot like a retreat

November 12 at 7:47 PM

WHAT IS going on at the State Department? Secretary of State Rex Tillerson has begun a “redesign” and has spent much of this year working on it. There is not an agency in government that could not benefit from a fresh look, but the importance of the State Department’s mission should not be in doubt. Does Mr. Tillerson really intend to strengthen diplomacy as a tool for dealing with persistent global problems, or is this an exercise in slashing at the people and offices necessary to represent the United States abroad, defend its interests and keep a watch on the future?

In a forthcoming issue of the Foreign Service Journal, the monthly magazine of the American Foreign Service Association (which is the professional association and union of the Foreign Service), the group’s president, former ambassador Barbara Stephenson, raises a fresh alarm about the future of the U.S. diplomatic corps. This refrain has been heard all year, as many outside experts and foreign-service veterans criticize Mr. Tillerson’s proposals to slash the department’s budget by about 30 percent. Congress rejected the most draconian cuts of Mr. Tillerson’s reorganization process, which Sen. Benjamin L. Cardin of Maryland, ranking Democrat on the Senate Foreign Relations Committee, has called a “pre-cooked and ideologically driven exercise.”

Ms. Stephenson reports that the department has decided “to slash promotion numbers by more than half.” Usually, the number of promotions is aimed at matching available jobs at various grade levels. A multiyear smoothing algorithm is used to avoid big fluctuations from year to year. But now, she says, the number of officers at the rank of career minister has fallen from 33 to 19; the number of minister counselors from 431 to 369. Because of a hiring freeze, intake into the Foreign Service will drop from 366 last year to 100 this year, she says. “The rapid loss of so many senior officers has a serious, immediate, and tangible effect on the capacity of the United States to shape world events,” she wrote.

Posted in Fun

US Climate Change Report — Weather Underground Summary w/ Graphics

This is an essay produced by Bob Henson (bio at the bottom) for Weather Underground which nicely summarizes many of the most significant regional impacts of climate change in the USA. (Something similar is still needed for the islands under US jurisdiction.)
– – – – – – – – – – – – – – – – – – – – – – – – –

Blockbuster Assessment: Humans Likely Responsible For Virtually All Global Warming Since 1950s

November 3, 2017, 2:18 PM EDT

Above: Number of days per year with a high temperature above 90°F during the period 2036–2065 as compared to 1976–2005, under the highest of the emissions scenarios used by the IPCC (RCP8.5). The maps show the average (mean) from 32 climate model projections. Most parts of the U.S. are projected to get several dozen more 90°F days per year by the middle of this century. The results are statistically significant throughout the contiguous United States. Image credit: CICS-NC and NOAA NCEI, via Figure 6.9 of Chapter 6, Climate Science Special Report (CSSP).

Humans are likely responsible for 93–123% of Earth's net global warming since 1950, says a blockbuster climate report issued on Friday. The Climate Science Special Report is the first product released by the Fourth National Climate Assessment (NCA); the core assessment itself, focusing on impacts, will be released in 2018. The NCA is a congressionally mandated quadrennial effort by hundreds of U.S. scientists to assess how the climate is changing in the United States. The project is carried out by the U.S. Global Change Research Program. Preparation of the report included workshops around the nation, a public-comment period on the draft, and a technical review spanning 13 agencies.

Figure 1. (left) Global annual average temperature has increased by more than 1.2°F (0.7°C) for the period 1986–2016 relative to 1901–1960. Red bars show temperatures that were above the 1901–1960 average, and blue bars indicate temperatures below the average. (right) Surface temperature change (in °F) for the period 1986–2016 relative to 1901–1960. Gray indicates missing data. Image credit: Figures 1.2. and 1.3 of Chapter 1, Climate Science Special Report.

A strong answer for climate-science-denying politicians

Ever since the Earth recorded three consecutive warmest years on record—2014, 2015, then 2016—the mantra of climate-science-denying politicians has shifted from "it hasn't warmed since 1998" to "Earth's climate has always changed, and we are not sure how much humans are to blame for the current warming." At least three members of President Trump's cabinet gave a variation of this message in their Congressional confirmation hearings. Well, we now have a new authoritative range on what the human contribution to global warming is: 93–123% of the warming since 1951. Chapter 3, Detection and Attribution of Climate Change (p. 160), of the new report states:

“The likely range of the human contribution to the global mean temperature increase over the period 1951–2010 is 1.1° to 1.4°F (0.6° to 0.8°C), and the central estimate of the observed warming of 1.2°F (0.65°C) lies within this range (high confidence). This translates to a likely human contribution of 93%–123% of the observed 1951–2010 change.” 

In other words, Earth might well have cooled slightly during this period if it were not for human activity; this makes Earth’s recent record-high temperatures even more startling. The report adds:
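The 93–123% range can be checked directly from the temperature figures in the quoted passage. Dividing the report's rounded values gives endpoints close to (though not exactly matching) the published range, which presumably rests on unrounded estimates:

```latex
\frac{0.6\,^{\circ}\mathrm{C}}{0.65\,^{\circ}\mathrm{C}} \approx 92\%,
\qquad
\frac{0.8\,^{\circ}\mathrm{C}}{0.65\,^{\circ}\mathrm{C}} \approx 123\%.
```

A contribution above 100% is not a contradiction: it means human forcing alone would have produced more warming than was actually observed, with natural factors offsetting part of it.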

“For the warming over the last century, there is no convincing alternative explanation supported by the extent of the observational evidence.”

It is worth noting that this new report is even stronger on the human-caused component of warming than the 2013 Intergovernmental Panel on Climate Change (IPCC) report, prepared once every six years. The 2013 IPCC report had this to say about the observed warming of Earth since 1950:

"The best estimate of the human-induced contribution to warming is similar to the observed warming over this period."

Figure 2.  The ten U.S. regions employed in the Climate Science Special Report and the upcoming Fourth National Climate Assessment. There are two new regions since the last assessment: the Caribbean has been broken off from the Southeast, and the Great Plains have been split into two regions. Image credit: Figure 1 of Guide to this Report, CSSR.

Major U.S. conclusions in the new report

Next year's full assessment will dig deeper into national impacts, but the Climate Science Special Report has plenty of detail on how climate change is already affecting the United States and what the future may hold. As shown in Figure 2, the analysis is broken into 10 regions. Here are just a few of the key findings:

Warmest in more than a thousand years. A major paleoclimate study has shown that for each of the world's seven major continental regions, the average temperature for 1971-2000 was the highest in more than 1,300 years. There is significant uncertainty around these estimates, but a separate study found that temperate North America as a whole (including most of the contiguous U.S.) is experiencing its warmest 30-year period in at least 1,500 years.

It’s going to get a lot warmer in the coming decades. Temperatures across the contiguous U.S. have risen about 1.8°F (1.0°C) over the period 1901-2016. “Surface and satellite data are consistent in their depiction of rapid warming since 1979,” the report notes. By the period 2070-2100 (when today’s infants will be elders), U.S. temperatures may be 2.8 to 7.3°F warmer than the 1976-2005 average if greenhouse-gas emissions are reined in strongly—or 5.8 to 11.9°F warmer if emissions continue to grow at the pace of recent decades.

The U.S. temperature record still bears the imprint of the 1930s Dust Bowl. The coldest single day of the year warmed in all U.S. regions between 1901-1960 and 1986-2016: from 1.13°F in the Southeast to 4.78°F in the Northwest. Perhaps surprisingly, the warmest day of the year turned slightly cooler in all regions but the Southwest. One big reason: the 1930s Dust Bowl, exacerbated by poor land management, produced some extremely hot summer days. Warming in recent decades becomes much more evident when looking at daily record highs and lows. The ratio of hot to cold records has been more than 2 to 1 during the last two decades; in 2016 it was about 5 to 1, and for 2017 thus far it has been running at more than 3 to 1. Continued warming in the 21st century should eventually transcend the Dust Bowl hangover, the report indicates: “the coldest and warmest daily temperatures of the year are expected to increase at least 5°F (2.8°C) in most areas by mid-century, rising to 10°F (5.5°C) or more by late-century.”

It’s getting wetter, but not everywhere. Average precipitation for the nation as a whole has increased by about 4% since 1901, mainly because of large increases in autumn (see Figure 3). Regionally, precipitation has decreased over much of the West, Southwest, and Southeast, and increased over most of the Great Plains, Midwest, and Northeast.

Figure 3.  Seasonal changes in precipitation, comparing the period 1986-2015 to the period 1901-1960 for the contiguous U.S. and to 1925-1960 for Alaska and Hawaii. Image credit: NOAA/NCEI, via Figure 7.1 of Chapter 7, CSSR.

The biggest precipitation events are getting bigger. Between 1901 and 2016, the amount of moisture one would get on the wettest day across a five-year period has increased by anywhere from 1% in the Southwest to 27% in the Northeast. The jumps are even larger for the period 1958-2016, when considering the amount of moisture falling in the top 1% of all wet days: from a 9% increase in the Northwest to 55% in the Northeast (although Hawaii and the Caribbean saw drops of 11% and 12%, respectively).

Heat is making U.S. drought worse. The new report finds little evidence for a human influence on observed precipitation deficits—i.e., meteorological drought. Importantly, the study did find ample evidence that the impact of drought on soil moisture is increasing, because of warmer temperatures drawing more moisture out of plants and soil. This is exactly the process implicated in California’s destructive drought of 2011-2016.

Wetter north, drier south? Winter and spring are projected to get wetter on average in the northern U.S., including Alaska. However, parts of the Southwest may see a decrease in winter and spring moisture. As the century rolls on, we’re likely to see a continued increase in the frequency and intensity of heavy precipitation events (see Figure 4 below).                

Figure 4. Projected change in the amount of precipitation one would expect on the wettest day in a 20-year period for the mid-21st century (left maps) and late-21st century (right maps). Results are shown for a lower-emission scenario (top maps, RCP4.5) and for a higher-emission scenario (bottom maps, RCP8.5). Image credit: CICS-NC and NOAA NCEI, via Figure 7.7 of Chapter 7, CSSR.

Big regional differences in sea level rise along U.S. coastline

For vast numbers of people living along or near the U.S. coast, no aspect of climate change will be more wrenching than sea level rise. One of the biggest advances in the ongoing National Climate Assessment is its treatment of sea level rise. This week’s report incorporates some of the latest findings on global mean sea level and how it could climb far more than earlier expected. The main reason: ice sheets in Antarctica may melt more quickly than once thought, especially if the ice cliffs and shelves along the Antarctic Ice Sheet become unstable and prone to fracturing.

Global mean sea level rose about 4-5 inches (11-14 cm) from 1901 to 1990, and about 3 inches (7 cm) in the comparatively brief period since 1990. The rate could accelerate much more this century based on six scenarios identified by the U.S. Interagency Sea Level Rise Task Force. The task force estimates (shown below) are similar to those now being used by the U.S. Department of Defense for planning all coastal facilities worldwide. The low end is comparable to a linear extension of the recent rate (about 0.12 in/year), while the high end is a very-bad-case scenario, including rapid ice loss in Antarctica.

Projected rises in global mean sea level from 2000 to the year shown:
Low:               0.2’ by 2020,   0.5’ by 2050,   1.0’ by 2100
Intermediate:      0.3’ by 2020,   1.1’ by 2050,   3.3’ by 2100
High:              0.4’ by 2020,   1.8’ by 2050,   6.6’ by 2100
Extreme:           0.4’ by 2020,   2.1’ by 2050,   8.2’ by 2100
(with further rises expected after 2100)
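As a sanity check, the report describes the Low scenario as comparable to a linear extension of the recent rate of about 0.12 inch per year. A quick back-of-envelope sketch (the rate comes from the text above; the function name and baseline choice are mine) reproduces the tabulated Low values:

```python
# Linear extension of the recent global mean sea level trend (~0.12 in/year),
# measured from a 2000 baseline, for comparison with the Low scenario above.
RATE_IN_PER_YEAR = 0.12

def linear_rise_feet(year, baseline=2000, rate=RATE_IN_PER_YEAR):
    """Projected rise in feet at `year` under a constant linear trend."""
    return rate * (year - baseline) / 12.0  # 12 inches per foot

for year in (2020, 2050, 2100):
    print(f"{year}: {linear_rise_feet(year):.1f} ft")  # 0.2, 0.5, 1.0 ft
```

The match with the Low row above underscores why the higher scenarios, which assume accelerating ice loss, diverge so sharply from a simple straight-line trend.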

Each of these numbers is a global average, but there’s actually a surprising amount of variation in the height of the sea from one place to another. At either end of the tropical Pacific, for example, El Niño and La Niña can drive sea level up or down by a few inches for months by altering the surface winds that push water across the region. Long-term climate change will bring its own set of region-by-region differences to sea level rise (see Figure 5).

Figure 5. Left: Global mean sea level (GMSL) rise from 1800 to 2100, based on six scenarios from the U.S. Interagency Sea Level Rise Task Force (navy blue, royal blue, cyan, green, orange, and red curves). Also shown are the very likely ranges in 2100 for different RCPs (colored boxes), and lines augmenting the very likely ranges by the difference between the median Antarctic contribution of Kopp et al. and the various median Antarctic projections of DeConto and Pollard. Right:  Relative sea level (RSL) rise (feet) in 2100 projected for the Interagency Intermediate Scenario (a rise of 1 meter [3.3 feet] in global mean sea level by 2100). Image credit: Sweet et al. 2017, via Figure 12.4 of Chapter 12, CSSR.

For the first time, the U.S. National Assessment is analyzing and projecting regional differences in sea level rise along the nation’s coasts, as shown in Figure 5 above. A few of the factors involved:

U.S. coasts will experience more than the global average sea level rise from Antarctic Ice Sheet melt, and less than the global average from Greenland Ice Sheet melt. Both results are produced by what’s called static-equilibrium effects—basically, how the planet’s gravity and rotation are affected by moving huge volumes of water from polar ice sheets into the global oceans.

The Northeast U.S. coast is expected to see additional sea level rise because of a gradually weakening Atlantic meridional overturning circulation, which helps power the Gulf Stream. Much as the polar jet stream separates air masses of different densities, the Gulf Stream separates warmer, less-dense water and higher sea levels on its southeast side from denser, cooler water and a lower sea level on its northwest side, toward the Northeast U.S. coast. Any long-term weakening of the Gulf Stream would be associated with a reduced sea-level gradient, and that would mean a drop in sea level toward the southeast and a rise toward the northwest (on top of any global-scale changes, of course).

Regional sea level rise is being exacerbated by withdrawals of groundwater off the Atlantic coast, and withdrawals of both fossil fuels and groundwater off the Gulf Coast. If these continue, so will the regional effects.

Sea level could rise at a pace below the global average along the coasts of Alaska and the Pacific Northwest. As the glaciers of Alaska melt, the land beneath them will rebound; this will also cut back on sea level rise over the Pacific Northwest due to the static-equilibrium effects noted above.

Coastal storms such as hurricanes and nor’easters will complicate the effects of sea level rise, as they bring their surges atop an ever-rising foundation of mean sea level. “A projected increase in the intensity of hurricanes in the North Atlantic could increase the probability of extreme flooding along most of the U.S. Atlantic and Gulf Coast States beyond what would be projected based solely on [regional sea level] rise,” the report notes. On top of this, there are nonlinear effects that could increasingly exacerbate storm surge heights in areas where the near-coast topography is shallow.

Figure 6. (a) Tidal floods (days per year) exceeding NOAA thresholds for minor impacts at 28 NOAA tide gauges through 2015. (b) Historical exceedances (orange), future projections through 2100 based upon the continuation of the historical trend (blue), and future projections under median conditions for low, medium, and high emission scenarios, for two of the locations: Charleston, SC and San Francisco, CA. Image credit: (a) adapted from Sweet and Marra 2016, (b) adapted from Sweet and Park 2014; via Fig. 12.5 of Chapter 12, CSSR.

Sea level rise is already having an effect on the U.S.

The impacts of sea level rise are not limited to future decades—they’re happening right in front of us, right now. “Nuisance” flooding has become a growing problem in places ranging from Miami Beach to San Francisco. In Maryland, both Annapolis and Baltimore now get more than nine times the number of flood days they experienced in the 1960s. For another example, see weather.com’s powerful report on Naval Station Norfolk (Virginia), a massive base that now experiences routine floods at high tide. Naval Station Norfolk could eventually flood on 200 days a year, based on current trends. However, efforts to keep up with the rising tide are mainly worked into ongoing projects rather than handled as priorities of their own.

The Norfolk report is part of weather.com’s “United States of Climate Change” series, which is examining major climate-change impacts in each of the 50 U.S. states—a very fitting complement to the new NOAA report.

Dr. Jeff Masters co-wrote this post.

The views of the author are his/her own and do not necessarily represent the position of The Weather Company or its parent, IBM.

Bob Henson

WU meteorologist Bob Henson, co-editor of Category 6, is the author of “Meteorology Today” and “The Thinking Person’s Guide to Climate Change.” Before joining WU, he was a longtime writer and editor at the University Corporation for Atmospheric Research in Boulder, CO.

Unhealthy Politics in Congress

The Unhealthy Politics of Pork: How It Increases Your Medical Costs

The term pork barrel spending has been around for well over 100 years. It refers to using government funds on local projects whose primary purpose is to bring more money to a specific representative’s district.
Credit: Matt Cardy/Getty Images

No industry in America spends more on lobbying than health care.

In 2016, the health care industry spent half a billion dollars on lobbying, with pharmaceutical companies, hospitals and health professionals making the largest contributions. In 2009, the year the Affordable Care Act was debated, health care lobbying exceeded $550 million. (Last year, by comparison, defense lobbying totaled $129 million, and the gun lobby spent just $10.5 million.)

Closely related to industry lobbying is the political maneuvering that congressional leaders use in an effort to pass legislation — specifically, targeted provisions known as earmarks, “sweeteners” or pork barrel spending.

The final version of the Graham-Cassidy health bill, for example, would have sent extra money to Alaska and Maine for the crucial votes of senators from those states, Lisa Murkowski and Susan Collins. In 2010, Democrats hoping to secure votes from reluctant rural state senators added the “Frontier States” provision to the A.C.A., which increased Medicare payments to five states with low population densities.

We all know earmarks and lobbying influence policymakers and policy. In health care, this has critical implications: who gets care, how much they get, how we pay for it. But there’s little hard data on exactly who benefits and how large the effects can be. A new study illuminates the ways these political dynamics can change congressional and hospital behavior — and how they can increase health care costs for the rest of us.

Research by Zack Cooper, Amanda Kowalski and Jennifer Wu at Yale and by Eleanor Powell at the University of Wisconsin-Madison analyzed a provision in the Medicare Modernization Act of 2003 (M.M.A.), known as Section 508, that helped secure Republican votes for the law’s passage.

The M.M.A., which created Medicare Part D and provided prescription drug coverage for seniors, was a political priority for President George W. Bush ahead of his 2004 re-election campaign. But fiscally conservative Republicans were hesitant to sign on to what amounted to the largest expansion of Medicare in its history, and the bill seemed unlikely to pass.

That’s when Section 508 was added.

The rate at which Medicare pays individual hospitals is determined largely by a hospital’s location and the labor costs, or wage index, in its area. Hospitals can, however, request to be reclassified into a different wage index area to raise their payments. Sometimes there are good reasons for this: Two hospitals might be competing in the same region, and because they’re separated by an arbitrary bureaucratic line, one gets paid more than the other.

But Section 508 waivers created new, more ambiguous ways that hospitals in specific districts could appeal their assigned wage index, and gave the executive branch considerable discretion about which requests would be granted and how big the pay increases would be.

The Section 508 waivers had large effects on how both politicians and hospitals operated. About 400 hospitals applied for a Medicare pay increase, and 120 waivers were granted. Hospitals in districts represented by a Republican member of Congress who voted for the M.M.A. were seven times more likely to receive a waiver compared with those in districts of members who voted against it. On average, these hospitals saw a 6.5 percent increase in Medicare payments, but the 29 hospitals with the biggest payment increases — “high 508 recipient hospitals” — received a 10 percent boost.

How did hospitals spend the extra money? Perhaps unsurprisingly, they started treating more Medicare patients — about 8 percent more per year. They also expanded nursing staffing by roughly a third, and invested in new technologies. But extra cash also meant big raises for hospital C.E.O.s: nearly half a million dollars per year at each hospital. Over all, “high 508 recipient hospitals” had $1.25 billion in additional spending from 2005 to 2010 — about 25 percent more than they otherwise would have. There was no evidence of improved quality or outcomes.

“If you told me in advance that we’d find this tight a link between Congress and hospitals, I would have been very surprised,” Mr. Cooper said. “We knew there was some connection, of course, but the more we kept digging, the stronger and more precise the link became.”

Section 508 payment changes were supposed to expire after three years. But hospitals with lucrative waivers had considerable interest in seeing the program extended, and worked together to form the Section 508 Hospital Coalition.

Pork, it seems, is as bad for budgets as it is for waistlines.

“Every time you pass legislation, big or small, these elements are added in,” Mr. Cooper said. “It’s not that any single one is hugely offensive. It’s their accumulation and continuation over time.”

Although Mr. Cooper’s research offers perhaps the clearest empirical glimpse of the links between lobbying, earmarks and medical spending, this political maneuvering is not new — and Medicare hospital payment seems to be a particularly susceptible target.

Both Democrats and Republicans have won pay increases for hospitals they represent. In the 1999 budget, the House Republican whip, Tom DeLay, and House Speaker Dennis Hastert reclassified hospitals in their districts into other regions, leading to hundreds of thousands of dollars of extra funding per year.

About a dozen years later, in what was called the Bay State Boondoggle, John Kerry, then a senator, succeeded in lobbying for Medicare to pay Massachusetts’ urban hospitals at the same rate it paid the state’s rural hospitals. The catch: There was only one hospital that qualified as “rural” in Massachusetts — on the wealthy island of Nantucket.

None of this is surprising. A primary motive of elected representatives is getting re-elected. Passing expansive legislation — like Medicare Part D or the A.C.A. — is hard, especially when legislators can’t point to specific benefits for their constituents. But a critical flaw in our current system is that payments are hugely influenced by politicians who have every incentive to increase them for their own districts.

“You can’t get upset at a snake for having fangs,” Mr. Cooper told me. “We need to design a system that takes payment decisions out of the hands of elected representatives. We think of interest rates as so important and complicated that we’ve tried to remove politics and give the responsibility to the Fed. The same argument holds for health care. When the government spends a trillion dollars on health care, it’s too easy for members to direct funds to their districts.”

We’ve been close to a possible solution. The A.C.A. called for establishing an Independent Payment Advisory Board, a 15-member panel charged with making changes to Medicare to control costs. The proposed reforms would have been put into effect unless Congress introduced alternate policies to achieve the same savings. But the advisory board faced fierce bipartisan opposition and was never created.

Often these costs are borne by all of us, while the benefits — if any — go to a favored few. Excess medical spending, then, is driven not only by inefficiencies in our health system, but also by those in our political system. Our solutions, it seems, must confront that uncomfortable reality.

Dhruv Khullar, M.D., M.P.P., is a physician at NewYork-Presbyterian Hospital and a researcher at the Weill Cornell Department of Healthcare Policy and Research. Follow him on Twitter at @DhruvKhullar.


Fresno Resurgent: Progress in Surprising Places

from the Non-Profit Quarterly

Another Study Showing Economic Disadvantage Inflicted by Racism

from the Business Section of the Sunday New York Times, 8 October 2017

Credit: Dani Pendergast

A team of economists has uncovered persuasive evidence that local government officials throughout the United States are less responsive to African-Americans than they are to whites.

The researchers sent roughly 20,000 emails to local government employees in nearly every county. The emails posed commonplace questions, like “Could you please tell me what your opening hours are?”

The emails were identical except that half appeared to come from a DeShawn Jackson or a Tyrone Washington, names that have been shown to be associated with African-Americans. The other half used names that have been shown to be associated with whites: Greg Walsh and Jake Mueller. The email sent to each local officeholder was determined by chance.
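The article does not reproduce the study’s randomization procedure; under the assumption that each office was independently assigned a sender name by chance, a minimal sketch of this kind of audit-experiment assignment (the names come from the article; the function, variable names, and example addresses are illustrative) could look like:

```python
import random

# Names used in the study, per the article.
BLACK_SOUNDING = ["DeShawn Jackson", "Tyrone Washington"]
WHITE_SOUNDING = ["Greg Walsh", "Jake Mueller"]

def assign_sender(rng=random):
    """Randomly pick a name condition, then a name within it, so that the
    email sent to each local officeholder is determined by chance."""
    condition = rng.choice(["black", "white"])
    name = rng.choice(BLACK_SOUNDING if condition == "black" else WHITE_SOUNDING)
    return condition, name

# Illustrative: assign a sender to each hypothetical office address.
offices = ["clerk@countyA.example", "sheriff@countyB.example"]
assignments = {office: assign_sender() for office in offices}
```

Because assignment is random and the message text is identical, any systematic gap in response rates between the two conditions can be attributed to the name alone.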

Most inquiries yielded a timely and polite response. But emails with black-sounding names were 13 percent more likely to go unanswered than those with white-sounding names. This difference, which appeared in all regions of the country, was large enough that it was statistically unlikely to have been a matter of mere chance.

These troubling results were documented in the paper, “Racial Discrimination in Local Public Services: A Field Experiment in the US,” by Corrado Giulietti of the University of Southampton in Britain, Mirco Tonin of the Free University of Bozen-Bolzano in Italy, and Michael Vlassopoulos, also of the University of Southampton. The study is to be published in the Journal of the European Economic Association.

The findings appeared to be a striking indication of racial discrimination in seemingly benign and mundane interactions. The tendency to ignore emails sent by African-Americans was particularly pronounced in sheriffs’ offices, but it was also evident in school districts and libraries.

In a clever twist, the authors analyzed whether the replies were polite, counting responses that included either the sender’s name or words like “hi,” “Mr.,” “dear,” “good” (which captures “good morning,” “good afternoon” and “have a good day”) or “thank” (which captures both “thanks” and “thank you”). By this measure, those with apparently African-American names received 8 percent fewer polite responses than those with white names.
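The authors’ actual politeness classifier is not reproduced in the article; a minimal sketch of that kind of keyword check (the marker words come from the article, while the function name and exact matching rules are my assumptions) might look like:

```python
import re

# Politeness markers listed in the article. "good" and "thank" are treated
# as prefixes so they also capture "good morning", "thanks", "thank you", etc.
POLITE_EXACT = {"hi", "mr", "dear"}
POLITE_PREFIXES = ("good", "thank")

def is_polite(reply_text, sender_name=None):
    """A reply counts as polite if it includes the sender's name
    or any listed courtesy word (matched case-insensitively)."""
    text = reply_text.lower()
    if sender_name and sender_name.lower() in text:
        return True
    words = re.findall(r"[a-z]+", text)
    return any(w in POLITE_EXACT or w.startswith(POLITE_PREFIXES) for w in words)

print(is_polite("Good morning, our hours are 9-5."))  # True
print(is_polite("9-5."))                              # False
```

A simple surface measure like this can still reveal a systematic gap when applied identically to thousands of replies in both conditions.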

While many studies have found differences in treatment for African-Americans and whites in employment, housing and the criminal justice system, it hasn’t always been clear whether these differences reflect discrimination or other factors.

The usual difficulty is that it’s impossible to find, say, job seekers who are absolutely identical in every respect except race. As a result, it is difficult to conclude whether a white job seeker succeeded — and a black one didn’t — because of discrimination. While statistical techniques can adjust for some of these factors — education, geography and the like — no analysis can account for all of them.

But the new research allows for a clearer conclusion: It appears to have documented straightforward discrimination.

As a real-world experiment, it built on earlier “audit experiments,” as they are known in social science. Perhaps the most famous is a study by Marianne Bertrand of the University of Chicago and Sendhil Mullainathan of Harvard (who is a regular contributor to this column). In that earlier experiment, Ms. Bertrand and Mr. Mullainathan sent fictitious résumés to employers, finding that people with white-sounding names were more likely to receive a positive response than those with black-sounding names.

The new findings provide further indication of the many ways in which discrimination shapes the lives of African-Americans. What’s more, when it’s harder to get your neighborhood librarian to respond to a simple email about opening hours, it’s not much of a leap to imagine other interactions — dealing with a computer help desk, the front office at a school or just the dry cleaner — that go less smoothly.

Economists tend to group explanations of discriminatory behavior into two buckets: taste-based and statistical. If a librarian chooses not to respond because a person is black, that’s taste-based discrimination. In common speech, there’s a simpler label: racism.

Statistical discrimination, on the other hand, occurs when a librarian uses a person’s name or race as a marker for other characteristics. Perhaps an African-American-sounding name signals that a person is more likely to be poor. The librarian happens to be biased against poor people. In this case, race is being used as a statistical proxy for poverty, and it’s the perception of poverty that causes the discriminatory behavior.

But two pieces of suggestive evidence in this study point to the problem here as being straightforward, taste-based discrimination.

First, the authors repeated the exercise — sending an additional 20,000 emails to the same recipients — although this time with a twist. They added a signature line, identifying the sender as a real estate agent. This extra information made the sender’s name — whether it seemed to be African-American or white — less relevant for inferring income or socioeconomic status. If statistical discrimination had driven behavior in the first round, this extra information should have led to less discrimination in the follow-up. It did not.

Second, the pattern of evidence was consistent with taste-based discrimination. While the researchers didn’t determine the race of the people who responded to their emails, they did have data on the racial breakdown of the municipal work forces. The racial gap in email response rates was greater in counties where the proportion of whites was higher.

Taste-based discrimination — basically, racism — isn’t necessarily the result of conscious thought. In an email, Mr. Tonin, one of the study’s authors, said that it’s possible “this behavior is due to some sort of unconscious bias” and, therefore, that “making people aware of the problem may contribute to the solution.”

If awareness really is the first step toward a fix, then the study may be helpful in refining our understanding of racial discrimination in America. It occurs not only in the labor market and the criminal justice system, but also in countless small frictions every day.

The culprit may not be a hate-spewing white nationalist, but rather a librarian or a school administrator or a county clerk, unaware that she’s helping some clients more than others.
