Residual Dried Blood and Newborn Screening in Minnesota

Note: The Bioethics Program blog is moving to its new home on April 1, 2015. Be sure to change your bookmarks to http://bioethics.uniongraduatecollege.edu/blog/

by Courtney Jarboe, Bioethics Program Student

In Minnesota, residual dried blood (RDB) samples collected for newborn screening had been stored, retained, and used for research without parental consent, on the presumption that the Minnesota Department of Health (MDH) had the statutory authority to do so. In 2011, the Minnesota Supreme Court ruled (in the case of Bearder v. Minnesota) that the newborn screening program was subject to the Genetic Privacy Act (2006), which requires written consent for secondary uses of genetic information. This ruling led to revisions of the newborn screening legislation and the destruction of 1.1 million newborn screening RDB samples. Now that the legislative dust has settled, MDH has begun aggressive educational campaigns to rebuild trust among parents and healthcare providers in Minnesota and to address gaps in the public’s understanding of the newborn screening program and the associated research.

In February of this year, I received a letter from the Citizen’s Council for Health Freedom (CCHF), an organization that actively supported the nine families in Bearder v. Minnesota who sued to stop the use of RDB samples for research (see image). The letter details CCHF’s concerns about the MDH newborn screening program and suggests that MDH can use the RDB samples, and the associated child’s DNA, without parental consent. Included with the letter were a second letter addressed to MDH and a copy of the MDH ‘Directive to Destroy’ form.

CCHF disclosed in late March that it had distributed the letter to roughly 10,000 parents across the state, identified through birth certificate records. Within just a week of the mailing, MDH received 59 directives to destroy RDB samples and related results, more than the total number of requests received in the previous five months.

I was particularly interested in this letter because of my master’s project on Minnesota’s newborn screening program. Because this wasn’t my first exposure to the program, I realized that I might be able to clarify some aspects of the letter that other parents should be aware of. First, CCHF claims that “Consent requirements mean ownership claims. But now, only if you object will the State release ownership claims to your baby’s DNA.” CCHF then asks parents to complete “the official state opt-out form” and return the “I Did It” postcard. This is misleading, as the form is actually the ‘Directive to Destroy Newborn Screening Samples and Test Results’ form. CCHF also does not tell parents that they could instead ask MDH to return the remaining samples to them. The ‘Directive to Destroy’ may not be the best choice for every family. What if a family’s medical history is a reason to keep the sample rather than destroy it?

CCHF also claims that “Without consent, the law allows research to be conducted on your child.” After Bearder v. Minnesota, this claim is simply not true. As of August 1, 2014, MDH stores and retains residual dried blood samples and test results from infants who participated in the newborn screening program. These samples are used for quality assurance testing and for developing new tests for the screening panel, but the law does not allow research to be conducted on them. Researchers and MDH must obtain written informed consent from parents before using samples for research (Parental Consent for Research Use of Newborn Screening Blood Spots and Test Results).

Parents who have questions about their state’s newborn screening program should consult their primary care provider or the state’s newborn screening program office. If their provider cannot answer those questions, the program should have staff available to assist them. For more information about Minnesota’s newborn screening program, visit the state’s website. Information about other state programs, including contact information, can be found on Baby’s First Test.

[Images: Jarboe Minnesota letter, pages 1 and 2]

[The contents of this blog are solely the responsibility of the author and do not represent the views of the Bioethics Program or Union Graduate College.]

V-Ticket to Ride

by Sean Philpott-Jones, Director of the Center for Bioethics and Clinical Leadership

I haven’t been to Disneyland since my senior year in high school, and I’ve actually never visited one of the Disney World resorts. Frankly, I never really cared for the noise, the crowds and the artificiality of the Disney parks. The fact that one of these amusement parks is now the center of an infectious disease outbreak makes my aversion even more intense.

Public health officials in California recently confirmed that an outbreak of measles in that state has been linked to the Disneyland theme park in Anaheim. Over 90 new cases of measles have been reported in California and seven neighboring states during the past two weeks, with over 50 of those cases originating in the Magic Kingdom. Most of those cases occurred among unvaccinated kids.

To put this outbreak into context, consider that in 2000 the US Centers for Disease Control and Prevention (CDC) declared that measles had been eliminated from the United States due to vaccination programs and a strong system for detecting, reporting and responding to outbreaks. Only 37 confirmed cases of measles were reported that year, all of which were imported from other countries.

What a difference 15 years can make. In 2014, the United States experienced a record number of measles cases: 644 cases were reported to the CDC, more than the total number of cases in the previous four years combined. If this year’s Disney outbreak is any predictor, we are likely to surpass the 2014 record.

The disease itself is still largely imported from overseas, but it spreads like wildfire among unvaccinated Americans. This is because measles is one of the most infectious diseases known to man. On average, a person with measles spreads it to 18 other people. By contrast, a person with Ebola (the deadly disease that caused widespread panic last year) is only likely to transmit that virus to one or two others, and only in places that lack a robust public health system. An outbreak of Ebola in the US is highly unlikely, whereas measles outbreaks will probably become commonplace.

The anti-vaccination movement is solely to blame for the re-emergence of measles as a public health threat. The easiest way to prevent the spread of measles is vaccination. If 95% of the people in a community are vaccinated against measles, outbreaks cannot occur. Unfortunately, rates of vaccination have fallen to their lowest levels since the start of widespread measles immunization programs in 1963. For example, over 40% of kindergarteners in Orange County, home to Disneyland and epicenter of the current outbreak, are not vaccinated against measles.
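That 95% figure is not arbitrary. As a rough back-of-the-envelope check (my own arithmetic, not the author’s), combine it with the transmission figure cited above: if one case of measles infects roughly 18 susceptible people on average, herd immunity requires enough of the community to be immune that each case produces, on average, fewer than one new infection. In symbols,

\[
p_{\text{crit}} = 1 - \frac{1}{R_0} \approx 1 - \frac{1}{18} \approx 0.94,
\]

which is why public health officials aim for roughly 95% coverage.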

Those opposed to vaccination, including such public health luminaries as disgraced British researcher Andrew Wakefield and former Playboy Playmate Jenny McCarthy, have successfully convinced a large swath of the American public that vaccines are dangerous. Despite scientific evidence to the contrary, claims have been made that vaccines cause autism, cancer, asthma, allergies, and a host of other acute and chronic ailments. These so-called “anti-vax” claims have been largely accepted by a gullible populace. According to a recent survey, barely 51 percent of Americans believe that vaccines are safe and effective. About the same number of people also believe in astrology, creationism and ghosts.

Since the Disneyland outbreak began, a number of prominent anti-vaxxers have also argued publicly (including on the CBS Evening News) that measles is not a disease to be feared. Nothing could be further from the truth. Measles is a dangerous and deadly illness. Before the first effective vaccine was developed, approximately 4 million Americans contracted measles each year. Of those, 3 in 10 developed complications like pneumonia. Nearly 50,000 people were hospitalized, 1,000 were permanently disabled due to measles encephalitis, and 500 died.

When confronted with the lack of compelling data to support their claims, anti-vaccination activists often fall back on the most American of arguments: individual freedom and personal liberty. Specifically, many anti-vaxxers believe that the government cannot tell them what they should or should not put into their (or their child’s) body. But this position has limits, particularly when individual actions jeopardize the lives of others.

That is exactly the case here. When someone refuses to vaccinate themselves or their kids, they put others at risk, including children who are too young to be vaccinated and the elderly, whose resistance to measles and other preventable diseases has waned.

It’s time for clinicians, public officials, and politicians to take a stand on vaccination, and take a stand against the claim that personal liberty trumps public safety. Pediatricians and other physicians should refuse to accept new patients who choose not to immunize themselves or their children. School officials should no longer allow unvaccinated children to attend public schools, except in rare cases where vaccination is medically contraindicated. Finally, local, state and national politicians should no longer make it easy for parents to obtain philosophical or “personal belief” exemptions from vaccination requirements and other public health statutes.

If you don’t like vaccines and refuse to get immunized, that is your right. But you shouldn’t expect to line up for Space Mountain or the Pirates of the Caribbean with the rest of us.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on January 29, 2015, and is available on the WAMC website. The contents of this post are solely the responsibility of the author and do not represent the views of the Bioethics Program or Union Graduate College.]

A Cold Day’s Concern About a Warming Planet

by Sean Philpott-Jones, Director of the Center for Bioethics and Clinical Leadership

My husband and I usually spend both Thanksgiving and Christmas at my in-laws’ house in Western New York, located about 30 minutes outside of Buffalo. This year was no exception.

What was different this year was the unusual weather. The Thanksgiving holiday was a white one, with sub-freezing temperatures and lots of snow on the ground. In fact, at that time the Buffalo area was still recovering from one of the worst storms in recent history, with some areas receiving over 8 feet of lake-effect snow in the two weeks prior.

By contrast, the Christmas holiday was a green one, with temperatures in the mid-fifties and not a flake of snow to be seen. We did, however, experience high winds that knocked over trees and toppled power lines. My grandfather-in-law was nearly killed when he struck a downed telephone pole while driving home late on Christmas Eve.

It’s tempting to chalk up these weird weather patterns to global climate change. According to the vast majority of climate scientists (over 97% of them, to be precise), we can expect to see increasing temperatures, changing rain and snowfall patterns, and more extreme weather events like droughts, floods and blizzards over the coming years. These changes are the result of increasing levels of carbon dioxide and other greenhouse gases in our atmosphere, largely as a result of industrial activity.

Despite concerted campaigns by business groups, right-wing pundits, and conservative politicians to discredit the theory — and despite the fact that much of the US is currently shivering under abnormally cold weather — I do believe that climate change is real. I believe it is happening. I think we are already seeing the effects, even if we cannot ascribe singular weather events like the Buffalo blizzard to greenhouse gases in the air.

But I am a public health expert, not a meteorologist. What concerns me the most about global climate change is the effect it will have on patterns of disease and illness, both here in the United States and overseas.

The health-related impact of climate change is most directly observed during extreme weather events, such as increased mortality among the elderly and those who work outdoors during heat waves. But the long-term effects of climate change on public health are much more insidious, particularly the impact on the spread of infectious diseases.

We know, for example, that infectious diseases like cholera and cryptosporidiosis show seasonal patterns. Until recently, the Americas had been free of the deadly diarrheal disease cholera for more than 100 years. When that water-borne illness re-emerged in Central and South America in 1991, it coincided with a periodic weather event (El Niño) that resulted in much warmer than normal coastal waters. Vibrio cholerae, the bacterium that causes cholera, was able to proliferate in these unusually warm waters, setting the stage for increased exposure and transmission to humans.

Cholera is now re-established in the Americas, with outbreaks linked to weather events like El Niño. Unfortunately for the countries affected, the number of El Niño events has increased over the last several decades, and studies of historical data suggest that this recent variation is likely linked to global warming.

The spread of other illnesses is also climate sensitive, including vector-borne diseases like Lyme disease, West Nile virus, malaria, and dengue fever. Several of these are already present in this country. For instance, Lyme disease is the most common vector-borne disease in the US, with the costs of medical treatment and lost productivity alone estimated to exceed $3.2 billion a year.

Other diseases that haven’t been seen in the US for centuries are likely to regain a foothold as the climate changes, particularly as increased temperatures and altered rainfall patterns allow the mosquitoes that carry malaria, dengue and other deadly or debilitating illnesses to thrive in American cities. I know this personally, as a recent business trip to Grenada left me suffering from the high fever and severe joint pain associated with the newly emergent disease known as chikungunya.

Global climate change seems to be an unfortunate reality, with the effects increasingly seen through changes in rain and snowfall patterns, decreases in crop yields, and extreme weather events like heat waves, cold snaps, droughts and floods. While the impact on human health has been limited to date, we can expect to see increasing morbidity and mortality as water-based illnesses and insect-borne diseases become more common.

All we can do now is try to mitigate the effects, through personal behaviors and public policies that reduce the amount of greenhouse gases produced. The more the planet warms, the more likely we are to see devastating floods, disastrous droughts and deadly outbreaks of infectious disease. Sadly, in an era of $2.50-per-gallon gas and economic struggle, neither we nor our political leaders seem willing to make the hard choices necessary to limit the rate and magnitude of climate change.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on January 1, 2015, and is available on the WAMC website. The contents of this post are solely the responsibility of the author and do not represent the views of the Bioethics Program or Union Graduate College.]

Striking the Balance Between Population Guidelines and Patient Primacy

by Susan Mathews, Bioethics Program Alumna (2014)

Breast cancer is the second leading cause of cancer death among North American women. Although routine mammography decreases the risk of death by about 15 percent, research on the effectiveness of wide-scale screening programs shows that 2,500 people would need to be screened to prevent one cancer death among women ages 40-49. Given this, the US Preventive Services Task Force (USPSTF) updated its population guidelines in 2009 to advise against routine screening mammography for women under 50.
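One way to see how a 15 percent risk reduction translates into a figure like 2,500 (an illustrative calculation of my own, not one from the USPSTF): the number needed to screen is the reciprocal of the absolute risk reduction. If, say, a woman in her forties faces roughly a 0.27 percent chance of dying of breast cancer over the screening period without mammography (an assumed baseline chosen only to make the arithmetic concrete), then a 15 percent relative reduction prevents about 4 deaths for every 10,000 women screened:

\[
\text{NNS} = \frac{1}{\text{absolute risk reduction}} \approx \frac{1}{0.0027 \times 0.15} \approx 2{,}500.
\]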

These new recommendations were met with controversy and confusion, with many questioning the ability of “experts” to weigh the potential benefits and harms of screening for individuals.

But how should population data like this, along with other epidemiologic, social, psychological and economic factors, be considered in medical decision-making?

To read more, see the full article on Life Matters Media.

[This post is a summary of an article published on Life Matters Media on November 25, 2014. The contents of this blog are solely the responsibility of the author and do not represent the views of the Bioethics Program or Union Graduate College.]

Fear and Loathing in Liberia

by Sean Philpott-Jones, Director of the Center for Bioethics and Clinical Leadership

Two weeks ago, I wrote a commentary decrying the current hysteria in the US over Ebola. It was ironic, I argued, that so many people were demanding the federal government take immediate steps to address the perceived threat of Ebola while simultaneously ignoring the real public health threats that we face.

A seasonal disease like influenza, for example, takes the lives of tens of thousands of Americans every winter. Still, far too many people refuse to get an annual flu shot. Similarly, outbreaks of preventable (and potentially deadly) diseases like measles, mumps and whooping cough are becoming more and more common as childhood vaccination rates plummet.

Moreover, the politicians and pundits calling on the Obama administration to take radical steps to combat Ebola are the same individuals who have repeatedly criticized efforts to combat the main causes of mortality in the US. Plans to tax junk food or limit the size of sugary sodas are seen as unwelcome government intrusions into the private lives of Americans, despite the fact that over 300,000 Americans die of obesity-related illness every year.

This isn’t to say that Ebola shouldn’t be a concern for public health officials in the US. I previously criticized both the US Centers for Disease Control and Prevention (CDC) and US Customs and Border Protection for their initially tepid response to the crisis.

CDC officials, for instance, were slow to update guidelines for treating patients with Ebola, initially recommending a level of training and use of protective gear that was woefully inadequate. As a result, two nurses who cared for an Ebola patient in Dallas are now infected with the virus. Thankfully, these women are likely to recover.

The CDC has now released new guidelines for clinicians that are similar to those used by Doctors Without Borders, the charitable organization at the forefront of combatting the Ebola epidemic in West Africa. These guidelines, along with new screening procedures for travelers arriving from countries affected by the Ebola epidemic, make it even more unlikely that we will have a serious outbreak here in the US.

Unfortunately, our public response to Ebola is marked by ignorance, fear and panic. Parents of students at Howard Yocum Elementary School, located in a bucolic suburb of Philadelphia, recently protested the fact that two students from Rwanda were enrolled. Rwanda is a small East African country that is 3,000 miles away from the epicenter of the Ebola crisis, and has no reported cases of the disease. Nevertheless, frightened parents threatened to boycott classes. In response, school officials asked the parents of these two young children to “voluntarily” quarantine their kids.

What happened at Howard Yocum Elementary School is not an isolated case. A teacher in Maine was put on mandatory leave simply for attending a conference in Dallas, where the first US cases of Ebola were reported. A middle-school principal in Mississippi was suspended after returning from a family funeral in Zambia, another East African country located many thousands of miles from the heart of the Ebola outbreak.

Cruise ships have been put on lock down, subway stations closed, family vacations cancelled, and buses and planes decommissioned because of public fear about Ebola and the risks it poses.

The sad thing is that much of this irrational fear is driven by xenophobia and racism. Since the Ebola outbreak began, over 4,500 people have died in West Africa. However, the mainstream Western media only began to report on the epidemic once an American doctor became infected. The level of care and treatment offered to infected patients from the US and Spain – including access to experimental drugs and vaccines – is also far greater than what is provided to patients in the affected countries.

Finally, African immigrants to the US are being increasingly ostracized and stigmatized, even if they come from countries unaffected by Ebola. Their kids are being denied admission to school, their parents denied service at restaurants, and their friends potentially denied entry to this country.

Many US politicians, mostly conservative lawmakers but also some progressive policymakers facing tough reelection campaigns, have called for a travel ban to affected countries in West Africa. This is despite statements from the World Health Organization, Red Cross and CDC that such a travel ban will be ineffective. This is also rather disproportionate compared with lawmakers’ reactions to past outbreaks of mad cow disease in England, SARS in Canada and bird flu in China. No travel bans were proposed in those situations.

Rather than fear West Africans, now is the time to embrace them. We could learn a lot from them. Consider the recent piece by Helene Cooper, a New York Times correspondent and native of Liberia. In that country, where over 2,000 people have died, few families have been left untouched by Ebola. At great personal risk, Liberians have banded together to fight the disease rather than isolating and ostracizing those who are sick. Unlike the average American, they are responding not with fear and loathing but with compassion and love. It’s time for us to do the same.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on October 22, 2014, and is available on the WAMC website. The contents of this post are solely the responsibility of the author and do not represent the views of the Bioethics Program or Union Graduate College.]

Fever Pitch

by Sean Philpott-Jones, Director of the Center for Bioethics and Clinical Leadership

Public concern about Ebola reached a fever pitch this past week, no pun intended, following the revelation that a patient in Dallas was infected with this deadly virus.

Returning from a recent trip to Liberia, where thousands of people have died from Ebola since the epidemic began last December, Thomas Eric Duncan (who died shortly after this commentary was recorded for NPR) developed symptoms shortly after arriving in the United States. Public health officials in Texas are now tracking and quarantining the 38 people who had contact with Mr. Duncan after he became ill.

US health professionals and immigration officers have admittedly been slow to react to the Ebola crisis. When Mr. Duncan first started feeling sick, for example, doctors at the Texas hospital where he was first seen failed to recognize the disease. They instead sent him home with antibiotics for what they believed was a common respiratory infection, unwittingly exposing more people to the deadly virus. This so concerned officials with the New York City Department of Health and Mental Hygiene that they are now sending actors faking symptoms of Ebola into emergency rooms in order to test local preparedness.

More worrisome is the fact that US Customs and Border Protection agents seem uninformed about the risks and warning signs of Ebola infection. Several journalists covering the Ebola outbreak in West Africa have reported that immigration officials failed to screen air passengers arriving from afflicted areas for the disease, even when told where those passengers had been.

All those concerns and considerations aside, the truth of the matter is that we are unlikely to experience a full-blown outbreak of Ebola here in the United States, regardless of what the current media frenzy around Mr. Duncan and other cases suggests.

The main reason is this: Ebola, although deadly, is not particularly infectious. Transmission occurs when people are exposed to the bodily fluids (blood, feces or saliva) of an infected and symptomatic patient. This is why health care workers and others caring for afflicted patients are most at risk, and why the rest of us are relatively safe.

This also explains why the epidemic has taken hold in West Africa, a region of the world where the existing public health infrastructure is weak, sanitation systems are crumbling, and cultural traditions around dying require family members to express love for the deceased by touching or hugging the dead body. That is very different from the situation in the US.

The Ebola epidemic raises a lot of interesting policy issues and ethical challenges: if and when to quarantine travelers coming from afflicted areas, how to respond to possible cases of infection in the clinic and in the community, what obligations doctors and nurses have to care for those with Ebola, and when to provide experimental and untested treatments to those who are sick. Except for those with relatives in West Africa, however, most of us who live in the United States shouldn’t be overly concerned about this disease.

Despite this, millions of Americans are taking to social media sites like Facebook and Twitter to express concern (and even outrage) over how local, state and federal agencies have dealt with the Ebola crisis. Many of these individuals are the same ones who fail to vaccinate their kids against measles, whooping cough or the mumps. Others fail to get a yearly flu shot. But these are the diseases that should terrify us.

Take influenza, for example. It is far more contagious than Ebola, being spread through respiratory droplets or contaminated objects like door handles and telephone receivers. People infected with the flu can also spread it to others even if they do not show signs of illness. This disease will kill nearly 50,000 people in the United States this winter, compared with 3,000 people who have died in the current Ebola outbreak. Despite this, less than half of all Americans will be vaccinated against influenza in the coming year.

Rates of childhood immunization have also declined markedly as parents (particularly more progressive and affluent parents) have become increasingly skeptical of the safety and value of vaccines against polio, measles and whooping cough. As a result, we are seeing a resurgence of these otherwise preventable (and potentially deadly) infectious diseases.

This is the great irony of the situation. Americans are up in arms about the unlikely possibility that there will be a mass outbreak of Ebola on US soil, but are apathetic about the real public health threats that they face.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on October 9, 2014, and is available on the WAMC website. The contents of this post are solely the responsibility of the author and do not represent the views of the Bioethics Program or Union Graduate College.]

Extending the Zadroga Act

by Sean Philpott-Jones, Director of the Center for Bioethics and Clinical Leadership

Thirteen years ago today, Americans watched in horror as planes hijacked by Al Qaeda-backed terrorists slammed into the World Trade Center, the Pentagon and a vacant field outside of Shanksville, Pennsylvania.

Many of us lost friends and family. Nearly 3,000 people were killed that day, including 2,753 who died when the World Trade Center’s Twin Towers fell. The actual death toll associated with 9/11, however, is much higher.

When the Towers fell, they released a cloud of pulverized cement, shards of glass, asbestos, mercury, lead, PCBs, and other carcinogenic and poisonous materials into the air. That cloud lingered for months, with hundreds of rescue workers, thousands of construction workers and millions of New York City residents breathing in a witches’ brew of cancer-causing chemicals.

Rates of asthma, obstructive pulmonary disease and other respiratory illnesses are sky high among those who were exposed to the foul air or toxic dust that lingered over Lower Manhattan in the days and weeks that followed 9/11. A study of police who responded to the terror attacks found that more than half have diminished lung function and chronic shortness of breath.

Rates of prostate cancer, thyroid cancer, and multiple myeloma are also elevated; one study looking at nearly 10,000 firefighters found that those who were at the World Trade Center were 20% more likely to develop cancer than those who were not there. Over 2,900 people who worked or lived near the World Trade Center on 9/11 have been diagnosed with cancer, including nearly 900 fire fighters and 600 police. Many of these cancers are likely associated with exposure to chemicals in the air and debris at Ground Zero.

Under the James Zadroga 9/11 Health and Compensation Act, passed by Congress in 2010 after a prolonged partisan fight, first responders, recovery workers, and survivors of the terror attacks can seek free testing and treatment for 9/11-related illnesses. Nearly 50,000 people are currently being monitored and over 30,000 are receiving medical treatment or compensation for illnesses and injuries associated with the World Trade Center’s collapse.

These numbers are expected to rise in the coming years. The incidence of cancer and chronic respiratory illnesses continues to increase at an alarming rate among survivors and responders of the terror attacks. At the same time, two of the key programs created by the Zadroga Act are due to expire. Unless Congress extends the Act, the World Trade Center Health Program, which provides free screening and treatment for 9/11-related illnesses, will end in October 2015. The September 11th Victim Compensation Fund, which provides financial support to the victims of 9/11 and their families, will close in October 2016. Desperately needed medical care and social services will be cut off for thousands of sick patients whose only crime was to survive the attacks or to provide care and aid for those who did.

A bipartisan group of New York politicians – including New York City Mayor Bill de Blasio, US Senator Kirsten Gillibrand, and US Representatives Peter King and Carolyn Maloney – want to prevent this. Just this week, they called upon Congress to extend the Zadroga Act for another 25 years. But they and other supporters of the Act face an uphill battle.

One of the key reasons that it took nearly 10 years to get this legislation passed in the first place is that many prominent (largely conservative) Congressmen opposed its passage, including Representatives Michele Bachmann and Paul Ryan. House Speaker John Boehner and Majority Whip Kevin McCarthy voted against it repeatedly. Senator Tom Coburn also filibustered its passage, arguing that the federal government simply cannot afford to provide treatment and care for the victims of 9/11 in an era of record budget deficits. Should the deficit hawks of the Republican Party retain control of the House and recapture the Senate in the upcoming mid-term elections, the fate of the Zadroga Act is likely sealed.

The heroes and victims of 9/11 deserve better. I believe that we have a moral obligation to provide lifelong medical care and treatment for illnesses linked to the terror attacks. It is shameful that the same politicians who used these attacks to justify hundreds of billions of dollars in military expenditures are suddenly crying poor when asked to help the victims themselves. I urge you to call your Senators and Representative and ask them to support the Zadroga Act. More importantly, I urge you to use the power of the ballot box in the upcoming midterm elections to send a message to those who do not support an extension of the James Zadroga 9/11 Health and Compensation Act.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on September 11, 2014, and is available on the WAMC website. The contents of this post are solely the responsibility of the author and do not represent the views of the Bioethics Program or Union Graduate College.]

The Boys in the Ban

by Sean Philpott-Jones, Director of the Center for Bioethics and Clinical Leadership

For over 30 years now, the United States Food and Drug Administration (FDA) has banned blood donations from gay and bisexual men. It is a lifetime ban. Currently, no man who has ever had sex with another man can donate blood in the US.

The same is true for tissue donations. Just last year, for example, the FDA refused to accept for donation the eyes of an Iowan teen after learning that the boy was gay. When 16-year-old Alexander Betts committed suicide after months of bullying at the hands of classmates because of his sexual orientation, just a few months after he signed up as an organ donor, his family honored one of his last wishes by donating his organs and tissues. But while his heart, lungs, kidneys and liver were used to save the lives of six other people, the donation of his eyes was rejected because “tissue from gay men carries an increased risk of sexually transmitted diseases, including HIV/AIDS.”

The ban on blood and tissue donation from gay men was put in place in 1983, shortly after HIV, the virus that causes AIDS, was first isolated. It made sense at that time. During the early years of the AIDS epidemic, gay men, along with other socially or economically marginalized groups like injection drug users and commercial sex workers, were (and still are) at increased risk of acquiring HIV. Banning donations from groups who were more likely to be infected with the virus, particularly when there were no effective treatments, was a logical step to protect the blood supply from contamination with HIV.

This was in part because the first tests to detect the virus in the blood of infected individuals were notoriously inefficient. In fact, these first tests didn’t — and many modern HIV tests still don’t — test for the presence of the virus itself. Rather, they test for the presence of antibodies to HIV.

Antibodies are proteins produced after the immune system encounters a foreign body like a virus, a bacterium or an allergen. They specifically recognize and bind to these pathogens, hopefully neutralizing them before they can infect a person and cause disease. Most vaccines are designed to trigger an antibody response to common infectious agents, such as those that cause measles, chicken pox or hepatitis, in order to protect people exposed to those diseases.

Unfortunately, the antibodies produced by the human body against HIV are not protective. They are, however, a marker that a person has been exposed to HIV and has likely been infected. But an antibody response to HIV can take days or even weeks to develop after infection, so tests that look only for the presence of antibodies to HIV can miss those individuals who are recently infected. If these people give blood in the interval between when they were infected and when they develop an antibody response, testing their blood will suggest that it is clean even though it may contain live virus that can be spread to transfusion recipients.

But as a team of researchers at Harvard Law School points out in a recent article in the Journal of the American Medical Association, times have changed. HIV testing technologies have dramatically improved in the three decades since the virus was found. Modern antibody tests are much more sensitive, detecting anti-HIV antibodies much earlier in the infection process. We also have inexpensive and reliable tests that look for the presence of the virus itself. Used in combination, these tests can determine if a person has been infected within just a couple of days of exposure. They are a quick, cost-effective and highly reliable way to screen the US blood supply.

Given this, it seems rather unconscionable that the FDA continues to maintain a lifetime ban on blood donations from gay men. This is particularly true when you consider that other groups at high risk for HIV do not face a similar ban. For example, a man who has had unprotected sex with a woman known to be HIV-positive is deferred from donating for only one year, not for life. The same is true for a woman who has had sex with an HIV-positive male partner. So it is not the gender or even the HIV status of a donor’s partner that matters, only the donor’s sexual orientation.

Moreover, in countries that have lifted the lifetime ban on donations from men who have sex with men, no concomitant increase in the incidence of transfusion-acquired HIV has been seen.

Finally, in 2010 an FDA advisory committee concluded that the lifetime ban keeps many low-risk men from donating to the nation’s blood supply. But despite this, the committee voted to keep the ban in place.

So why does the lifetime ban on blood donations by gay and bisexual men remain in place? It is sexual behavior, not sexual orientation, that determines whether or not an individual is at increased risk of HIV. A promiscuous heterosexual college student poses a far greater risk than a gay man who has been in a long-term monogamous relationship.

Quite simply, the ban is purely discriminatory in nature. It does little more than perpetuate outdated and homophobic stereotypes. It also contributes to the widespread stigmatization of sexual minorities, fueling the open hostility and institutionalized violence that drive young men like Alexander Betts to end their lives.

We can do better. It’s time to end the lifetime ban on blood and tissue donation by gay and bisexual men.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on August 28, 2014, and is available on the WAMC website. The contents of this post are solely the responsibility of the author and do not represent the views of the Bioethics Program or Union Graduate College.]

Big Bad Ebola

by Theresa Spranger, Bioethics Program Alumna (MSBioethics 2012)

Last week, Ebola came to the United States. It arrived on a specialized plane, in the form of two medical missionaries. The conversation since has revolved around whether or not bringing them home for treatment was wise and/or just.

First, let’s talk about the risk of Ebola transmission, which seems to be the main concern for those who object to the patients being transferred to the U.S. As a nation, we are at far greater risk from travelers not yet showing signs of the disease. The two missionaries are being well isolated, and every precaution has been taken to ensure the virus does not escape the containment unit. The risk of contamination or transmission from these patients is very low. It is certainly not zero, a lie I have heard too many times from the media, but it is extremely low.

An argument I simply can’t stomach is this: if the missionaries caught Ebola while using the appropriate personal protective equipment (PPE), isn’t the American medical staff at the same risk? Absolutely not. In Africa, Ebola is rampant: patients are kept in large wards, the disease is in the communities, and there is no possible way the healthcare providers could have kept their guard up at all times. The risk of contracting the disease while in Africa is high, no matter the protection they used.

Argument 2: They knew the risks and went over anyway, so just leave them in Africa to suffer the consequences of their decision. True, they knew the risks, and this is a viable, though not-so-compassionate, response to the problem. Honestly, I think either decision could have been rationalized, though I am sure the families of the missionaries appreciate the choice we made to bring their loved ones home. If it had been your family member, you would have wanted them home too.

Now that we have unpacked the risk of Ebola spreading in the U.S., I want to talk about the experimental Ebola antiserum. The media have reported that the two American missionaries received this experimental antiserum, and I have heard calls to release the drug to all those suffering in Africa. Advocates for release of the drug are particularly intent that it be given to the medical staff who have become infected while treating Ebola patients, since they were infected in the same way as the Americans. I understand the desire, and it “feels” compassionate to fight to send this antiserum to Africa. However, there are some very important reasons why this should not happen:

First of all, we don’t know that this drug even works. Its testing in humans is extremely limited, and for all we know it could cause more harm than good. It may seem like the simple solution in an emergency to toss protocol out the window. However, for the safety of trial participants, studies need to be limited in scope and as controlled as possible. If the drug were released to the general public, or even to the medical staff in Africa, there would be no way to appropriately monitor for side effects and safety issues. Furthermore, getting the drug to the population as quickly as “demanded” would necessarily violate all study regulations and proper procedure in Africa. This is a huge risk, and if anything were to go wrong with the drug or the study, who would be blamed? Certainly America and the scientists. It simply isn’t a wise move.

Secondly, the supply of antiserum is extremely limited; there would be no way to produce enough of it to treat all those infected, presuming of course that the drug does what it is intended to do without major side effects.

The scientists have been painted as the bad guys by many of those pushing for the drug’s release, and this is simply not true. Those scientists have committed their lives to finding a cure for Ebola, and the fact that they are bound to a process you don’t like right now doesn’t mean that they are evil. If the process is followed and the trial is successful, perhaps we can have a long-term cure for Ebola. Pushing to exempt this project from the scientific process will not help long-term progress and could result in short-term disaster.

It is easy and feels right to act on emotion, but it is rarely the wise choice.

[The contents of this blog are solely the responsibility of the author and do not represent the views of the Bioethics Program or Union Graduate College.]

Income Inequality and Health: Can the Poor Have Longer and Better Lives?

by Sean Philpott-Jones, Director of the Center for Bioethics and Clinical Leadership

The issue of income inequality has been in the news a lot lately. The gap between rich Americans and poor Americans has grown considerably since the 1970s. The United States now ranks first among the developed nations of the world in terms of income inequality as measured by the Gini coefficient, a way of describing the distribution of wealth in a society. Globally, we’re fourth overall, surpassed only by Lebanon, Russia and Ukraine.
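For readers unfamiliar with the measure, the Gini coefficient ranges from 0 (everyone has the same income) to 1 (a single person holds all the income). Below is a minimal sketch of one standard way to compute it from a list of incomes; this is illustrative code of my own, not drawn from any of the studies or rankings cited in this post.

    def gini(incomes):
        # Gini coefficient: 0 = perfect equality, 1 = maximal inequality.
        xs = sorted(incomes)
        n = len(xs)
        total = sum(xs)
        if n == 0 or total == 0:
            return 0.0
        # For sorted data: G = sum((2i - n - 1) * x_i) / (n * sum(x)), with i = 1..n
        weighted = sum((2 * (i + 1) - n - 1) * x for i, x in enumerate(xs))
        return weighted / (n * total)

    print(gini([50000, 50000, 50000, 50000]))  # 0.0  -> everyone earns the same
    print(gini([0, 0, 0, 200000]))             # 0.75 -> one household holds everything

The higher the number, the more of a society’s income is concentrated at the top.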

Income inequality is a serious problem, so much so that Nobel Prize-winning economist Robert Shiller has called it “the most important problem that we are facing today.” Income inequality negatively affects economic growth, social mobility, political stability and democratic participation. It also affects public health.

Quite simply, wealthier Americans tend to live healthier and longer lives. As the income gap has grown, so has the longevity gap. For example, consider the report recently released by the Brookings Institute that looked at income and differential mortality.

Brookings economists Barry Bosworth and Kathleen Burke found that, between 1977 and 2007, life expectancy increased by an average of five years for men and one year for women. But the gains in life expectancy accrued primarily to the rich. Among the richest 10% of Americans, men gained 5.9 years of life and women gained 3.1 years. For men in the poorest 10%, the increase in life expectancy was less than two years. The poorest women actually lost two years of life.

To really get a sense of how stark this divide is, however, consider the recent article by New York Times Reporter Annie Lowrey. She compared average life expectancy in Fairfax County, Virginia with that of McDowell County, West Virginia. A suburb of Washington, DC, Fairfax has one of the highest median incomes in the country:  $107,000. Men in Fairfax also have a mean life expectancy of 82 years. By contrast, the coal mining communities in McDowell have one of the lowest median incomes: $23,000. Men in that county only live to 64 on average.

There are a myriad of reasons why this longevity gap exists. The most obvious is access to health care. Wealthier individuals are more likely to have health insurance, a fact that the Obama Administration is trying to change through the Affordable Care Act.

But even if the Affordable Care Act succeeds in reducing the number of under- or uninsured Americans — which now seems likely, given that 8 million people signed up for one of the new health insurance exchanges — inequities in access will still exist.

For example, wealthier Americans will have far more choice in the types and numbers of doctors they can see.  Many clinicians are now refusing to accept any insurance plan, particularly publicly funded plans like Medicaid. Others are setting up concierge practices that guarantee same day appointments to those willing to pay. By contrast, poorer patients will have to wait for treatment, assuming they can find a doctor willing to see them.

The quality of care that the poor receive is also lower. Numerous studies have shown that lower-income patients are more likely to be misdiagnosed, prescribed the wrong medication, or suffer from complications of treatment. This is not because their doctors are incompetent or don’t care about their poorer patients. Rather, doctors that serve lower-income communities often do not have the time to adequately examine patients, take a full medical history, properly explain treatment options, or prescribe the newest drugs; they simply have too many patients to see and insurance reimbursement rates are too low to provide a full range of services.

Finally, wealthier individuals tend to live healthier lives overall. They are less likely to smoke, to drink to excess, and to be overweight. Part of this is due to differences in education, but part of it is due to time and resources. The investment banker who works in Manhattan can afford to buy fresh produce and other healthy meals at the local Whole Foods. He can also afford a gym membership, and he likely lives in a neighborhood that offers safe opportunities for exercising out-of-doors. By contrast, the single mother of four who lives in the Bronx must feed her family on a limited income, buying pre-packaged food at the corner market. She also probably lacks the time to exercise, assuming that the local playground isn’t overrun with drug dealers and gang members.

As we struggle with the issue of health care in America — expanding access to treatment while controlling costs — it is important to remember that the current health care crisis is not just about medical insurance. There are other problems in our society that will affect the outcome of the current debate. The Affordable Care Act will help address some of the current inequities in our health care system. Until we attack the fundamental issue of poverty and the income gap, however, we are probably just putting a small bandage on a large and gaping wound.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on April 24, 2014. It is also available on the WAMC website. Its contents are solely the responsibility of the author and do not represent the views of the Bioethics Program or Union Graduate College.]