Residual Dried Blood and Newborn Screening in Minnesota

Note: The Bioethics Program blog is moving to its new home on April 1, 2015. Be sure to change your bookmarks to http://bioethics.uniongraduatecollege.edu/blog/

by Courtney Jarboe, Bioethics Program Student

In Minnesota, residual dried blood (RDB) samples collected for newborn screening had been stored, retained, and used for research without parental consent, on the presumption that the Minnesota Department of Health (MDH) had the statutory authority to do so. In 2011, the Minnesota Supreme Court ruled in Bearder v. Minnesota that the newborn screening program was subject to the Genetic Privacy Act (2006), which requires written consent for secondary uses of genetic information. The ruling led to revisions of the newborn screening legislation and to the destruction of 1.1 million newborn screening RDB samples. Now that the legislative dust has settled, MDH has launched aggressive educational campaigns to rebuild trust among parents and healthcare providers in Minnesota and to address gaps in the public's understanding of the newborn screening program and its associated research.

In February of this year, I received a letter from the Citizens' Council for Health Freedom (CCHF), an organization that actively supported the nine families in Bearder v. Minnesota who sued to stop the use of RDB samples for research. The letter details CCHF's concerns about the MDH newborn screening program and suggests that MDH can use the RDB samples, and with them the child's DNA, without parental consent. Enclosed with the letter were a letter addressed to MDH and a copy of the MDH 'Directive to Destroy' form.

CCHF disclosed in late March that it had distributed the letter, using birth certificate records, to roughly 10,000 parents across the state. Within just a week of the mailing, MDH received 59 directives to destroy RDB samples and related test results, more than the total number of requests received in the previous five months.

I was particularly interested in this letter because of my master's project on Minnesota's newborn screening program. Since this wasn't my first exposure to the program, I realized I might be able to clarify some aspects of the letter that other parents should be aware of. First, CCHF claims that, "Consent requirements mean ownership claims. But now, only if you object will the State release ownership claims to your baby's DNA." CCHF then asks parents to complete "the official state opt-out form" and return the "I Did It" postcard. This is misleading: the form is actually the 'Directive to Destroy Newborn Screening Samples and Test Results' form. CCHF does not inform parents that they could instead submit a request to MDH to have the remaining samples returned to them. The 'Directive to Destroy' may not be the best choice for every family. What if a family's medical history means it should avoid destroying the sample?

CCHF also claims that "Without consent, the law allows research to be conducted on your child." After Bearder v. Minnesota, this claim is simply not true. As of August 1, 2014, MDH stores and retains residual dried blood samples and test results from infants who participated in the newborn screening program. These samples are used for quality assurance testing and for developing new tests for the screening panel, but the law does not allow research to be conducted on them. Researchers and MDH must obtain written informed consent from parents before using samples for research (Parental Consent for Research Use of Newborn Screening Blood Spots and Test Results).

Parents who have questions about their state's newborn screening practices should consult their primary care provider or their state's newborn screening program office. If their provider cannot answer their questions, the program office should have staff available to assist. For more information about Minnesota's newborn screening program, visit the state's website. Information about other state programs, including contact information, can be found on Baby's First Test.


[The contents of this blog are solely the responsibility of the author and do not represent the views of the Bioethics Program or Union Graduate College.]


V-Ticket to Ride

by Sean Philpott-Jones, Director of the Center for Bioethics and Clinical Leadership

I haven’t been to Disneyland since my senior year in high school, and I’ve actually never visited one of the Disney World resorts. Frankly, I never really cared for the noise, the crowds and the artificiality of the Disney parks. The fact that one of these amusement parks is now the center of an infectious disease outbreak makes my aversion even more intense.

Public health officials in California recently confirmed that an outbreak of measles in that state has been linked to the Disneyland theme park in Anaheim. Over 90 new cases of measles have been reported in California and seven neighboring states during the past two weeks, with over 50 of those cases originating in the Magic Kingdom. Most of those cases occurred among unvaccinated kids.

To put this outbreak into context, consider that in 2000 the US Centers for Disease Control and Prevention (CDC) declared that measles had been eliminated from the United States due to vaccination programs and a strong system for detecting, reporting and responding to outbreaks. Only 37 confirmed cases of measles were reported that year, all of which were imported from other countries.

What a difference 15 years can make. In 2014, the United States experienced a record number of measles cases: 644 cases were reported to the CDC, more than the total for the previous four years combined. If this year's Disney outbreak is any predictor, we are likely to surpass the 2014 record.

The disease itself is still largely imported from overseas, but it spreads like wildfire among unvaccinated Americans. This is because measles is one of the most infectious diseases known to man. On average, a person with measles spreads it to 18 other people. By contrast, a person with Ebola (the deadly disease that caused widespread panic last year) is only likely to transmit that virus to one or two others, and only in places that lack a robust public health system. An outbreak of Ebola in the US is highly unlikely, whereas measles outbreaks will probably become commonplace.

The anti-vaccination movement is solely to blame for the re-emergence of measles as a public health threat. The easiest way to prevent the spread of measles is vaccination. If 95% of the people in a community are vaccinated against measles, outbreaks cannot occur. Unfortunately, rates of vaccination have fallen to their lowest levels since the start of widespread measles immunization programs in 1963. For example, over 40% of kindergarteners in Orange County, home to Disneyland and the epicenter of the current outbreak, are not vaccinated against measles.
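The 95% figure follows from a standard epidemiological relationship: if each measles case infects roughly 18 others in a fully susceptible population (the post's figure for R0), transmission dies out once more than 1 − 1/R0 of the community is immune. A minimal sketch of that arithmetic (the function name is my own, and R0 = 12 is the commonly cited lower bound, not a figure from the post):

```python
def herd_immunity_threshold(r0):
    """Fraction of a population that must be immune so that, on average,
    each case infects fewer than one susceptible person (1 - 1/R0)."""
    return 1 - 1 / r0

# For measles, with R0 between roughly 12 and 18:
for r0 in (12, 18):
    print(f"R0 = {r0}: threshold = {herd_immunity_threshold(r0):.1%}")
# At R0 = 18 the threshold is about 94.4%, which is why ~95% coverage
# is the figure usually quoted for stopping measles outbreaks.
```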

Those opposed to vaccination, including such public health luminaries as disgraced British researcher Andrew Wakefield and former Playboy Playmate Jenny McCarthy, have successfully convinced a large swath of the American public that vaccines are dangerous. Despite scientific evidence to the contrary, claims have been made that vaccines cause autism, cancer, asthma, allergies, and a host of other acute and chronic ailments. These so-called “anti-vax” claims have been largely accepted by a gullible populace. According to a recent survey, barely 51 percent of Americans believe that vaccines are safe and effective. About the same number of people also believe in astrology, creationism and ghosts.

Since the Disneyland outbreak began, a number of prominent anti-vaxxers have also argued publicly (including on the CBS Evening News) that measles is not a disease to be feared. Nothing could be further from the truth. Measles is a dangerous and deadly illness. Before the first effective vaccine was developed, approximately 4 million Americans contracted measles each year. Of those, 3 in 10 developed complications like pneumonia. Nearly 50,000 people were hospitalized, 1,000 were permanently disabled due to measles encephalitis, and 500 died.

When confronted with the lack of compelling data to support their claims, anti-vaccination activists often fall back on the most American of arguments: individual freedom and personal liberty. Specifically, many anti-vaxxers believe that the government cannot tell them what they should or should not put into their (or their child’s) body. But this position has limits, particularly when individual actions jeopardize the lives of others.

That is exactly the case here. When people refuse to vaccinate themselves or their kids, they put others at risk, including children who are too young to be vaccinated and the elderly, whose immunity to measles and other preventable diseases may have waned.

It’s time for clinicians, public officials, and politicians to take a stand on vaccination, and take a stand against the claim that personal liberty trumps public safety. Pediatricians and other physicians should refuse to accept new patients who choose not to immunize themselves or their children. School officials should no longer allow unvaccinated children to attend public schools, except in rare cases where vaccination is medically contraindicated. Finally, local, state and national politicians should no longer make it easy for parents to obtain philosophical or “personal belief” exemptions from vaccination requirements and other public health statutes.

If you don’t like vaccines and refuse to get immunized, that is your right. But you shouldn’t expect to line up for Space Mountain or the Pirates of the Caribbean with the rest of us.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on January 29, 2015, and is available on the WAMC website. The contents of this post are solely the responsibility of the author alone and do not represent the views of the Bioethics Program or Union Graduate College.]

A Cold Day’s Concern About a Warming Planet

by Sean Philpott-Jones, Director of the Center for Bioethics and Clinical Leadership

My husband and I usually spend both Thanksgiving and Christmas at my in-laws’ house in Western New York, located about 30 minutes outside of Buffalo. This year was no exception.

What was different this year was the unusual weather. The Thanksgiving holiday was a white one, with sub-freezing temperatures and lots of snow on the ground. In fact, at that time the Buffalo area was still recovering from one of the worst storms in recent history, with some areas receiving over 8 feet of lake-effect snow in the two weeks prior.

By contrast, the Christmas holiday was a green one, with temperatures in the mid-fifties and not a flake of snow to be seen. We did, however, experience high winds that knocked over trees and toppled power lines. My grandfather-in-law was nearly killed when he struck a downed telephone pole while driving home late on Christmas Eve.

It’s tempting to chalk up these weird weather patterns to global climate change. According to the vast majority of climate scientists (over 97% of them, to be precise), we can expect to see increasing temperatures, changing rain and snowfall patterns, and more extreme weather events like droughts, floods and blizzards over the coming years. These changes are the result of increasing levels of carbon dioxide and other greenhouse gases in our atmosphere, largely as a result of industrial activity.

Despite concerted campaigns by business groups, right-wing pundits, and conservative politicians to discredit the theory — and despite the fact that much of the US is currently shivering under abnormally cold weather — I do believe that climate change is real. I believe it is happening. I think we are already seeing the effects, even if we cannot ascribe singular weather events like the Buffalo blizzard to greenhouse gases in the air.

But I am a public health expert, not a meteorologist. What concerns me the most about global climate change is the effect it will have on patterns of disease and illness, both here in the United States and overseas.

The health-related impact of climate change is most directly observed during extreme weather events, such as increased mortality among the elderly and those who work outdoors during heat waves. But the long-term effects of climate change on public health are much more insidious, particularly the impact on the spread of infectious diseases.

We know, for example, that infectious diseases like cholera and cryptosporidiosis show seasonal patterns. Until recently, the Americas had been free of the deadly diarrheal disease cholera for more than 100 years. When that water-borne illness re-emerged in Central and South America in 1991, it coincided with a periodic weather event (El Niño) that resulted in much warmer than normal coastal waters. Vibrio cholerae, the bacterium that causes cholera, was able to proliferate in these unusually warm waters, setting the stage for increased exposure and transmission to humans.

Cholera is now re-established in the Americas, with outbreaks linked to weather events like El Niño. Unfortunately for the countries affected, the number of El Niño events has increased over the last several decades, and studies of historical data suggest this recent variation is likely linked to global warming.

The spread of other illnesses is also climate sensitive, including the spread of vector-borne diseases like Lyme disease, West Nile Virus, malaria, and dengue fever. Several of these are already present in this country. For instance, Lyme disease is the most common vector-borne disease in the US, with the costs of medical treatment and lost productivity alone estimated to exceed $3.2 billion a year.

Other diseases that haven’t been seen in the US for centuries are likely to regain a foothold as the climate changes, particularly as increased temperatures and altered rainfall patterns allow the mosquitoes that carry malaria, dengue and other deadly or debilitating illnesses to thrive in American cities. I know this personally, as a recent business trip to Grenada left me suffering from the high fever and severe joint pain associated with the newly emergent disease known as chikungunya.

Global climate change seems to be an unfortunate reality, with the effects increasingly seen through changes in rain and snowfall patterns, decreases in crop yields, and extreme weather events like heat waves, cold snaps, droughts and floods. While the impact on human health has been limited to date, we can expect to see increasing morbidity and mortality as water-based illnesses and insect-borne diseases become more common.

All we can do now is try to mitigate the effects, through personal behaviors and public policies that reduce the amount of greenhouse gases produced. The more the planet warms, the more likely we are to see devastating floods, disastrous droughts and deadly outbreaks of infectious disease. Sadly, in an era of $2.50-per-gallon gas and economic struggle, neither we nor our political leaders seem willing to make the hard choices necessary to limit the rate and magnitude of climate change.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on January 1, 2015, and is available on the WAMC website. The contents of this post are solely the responsibility of the author alone and do not represent the views of the Bioethics Program or Union Graduate College.]

Striking the Balance Between Population Guidelines and Patient Primacy

by Susan Mathews, Bioethics Program Alumna (2014)

Breast cancer is the second leading cause of cancer death among North American women. Although routine mammography decreases the risk of death by about 15 percent, research on the effectiveness of wide-scale screening programs shows that 2,500 people would need to be screened to prevent one cancer death among women ages 40-49. Given this, the US Preventive Services Task Force (USPSTF) updated its population guidelines in 2009 to advise against routine screening mammography for women under 50.
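The relationship between the 15 percent relative risk reduction and the number needed to screen can be made explicit with the standard formula NNS = 1 / (absolute risk reduction). A rough sketch, assuming that definition; the baseline-risk figure below is back-calculated from the post's numbers for illustration, not taken from the USPSTF report:

```python
def number_needed_to_screen(baseline_risk, relative_risk_reduction):
    """NNS = 1 / ARR, where the absolute risk reduction (ARR) is the
    baseline risk of death times the relative risk reduction."""
    absolute_risk_reduction = baseline_risk * relative_risk_reduction
    return 1 / absolute_risk_reduction

# An NNS of 2,500 at a 15% relative risk reduction implies a baseline
# risk of breast cancer death of about 1/(2500 * 0.15), roughly 0.27%,
# among women in this age group over the study window (illustrative).
baseline_risk = 1 / (2500 * 0.15)
print(round(number_needed_to_screen(baseline_risk, 0.15)))  # 2500
```

The small baseline risk in this age group is precisely why a modest relative benefit translates into such a large screening denominator.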

These new recommendations were met with controversy and confusion, with many questioning the ability of “experts” to weigh the potential benefits and harms of screening for individuals.

But how should population data like this, along with other epidemiologic, social, psychological and economic factors, be considered in medical decision-making?

To read more, click here.

[This post is a summary of an article published on Life Matters Media on November 25, 2014. The contents of this blog are solely the responsibility of the author and do not represent the views of the Bioethics Program or Union Graduate College.]

Fear and Loathing in Liberia

by Sean Philpott-Jones, Director of the Center for Bioethics and Clinical Leadership

Two weeks ago, I wrote a commentary decrying the current hysteria in the US over Ebola. It was ironic, I argued, that so many people were demanding the federal government take immediate steps to address the perceived threat of Ebola while simultaneously ignoring the real public health threats that we face.

A seasonal disease like influenza, for example, takes the lives of tens of thousands of Americans every winter. Still, far too many people refuse to get an annual flu shot. Similarly, outbreaks of preventable (and potentially deadly) diseases like measles, mumps and whooping cough are becoming more and more common as childhood vaccination rates plummet.

Moreover, the politicians and pundits calling on the Obama administration to take radical steps to combat Ebola are the same individuals who have repeatedly criticized efforts to combat the main causes of mortality in the US. Plans to tax junk food or limit the size of sugary sodas are seen as unwelcome government intrusions into the private lives of Americans, despite the fact that over 300,000 Americans die of obesity-related illness every year.

This isn’t to say that Ebola shouldn’t be a concern for public health officials in the US. I previously criticized both the US Centers for Disease Control and Prevention (CDC) and US Customs and Border Protection for their initially tepid response to the crisis.

CDC officials, for instance, were slow to update guidelines for treating patients with Ebola, initially recommending a level of training and use of protective gear that was woefully inadequate. As a result, two nurses who cared for an Ebola patient in Dallas are now infected with the virus. Thankfully, these women are likely to recover.

The CDC has now released new guidelines for clinicians that are similar to those used by Doctors Without Borders, the charitable organization at the forefront of combatting the Ebola epidemic in West Africa. These guidelines, along with new screening procedures for travelers arriving from countries affected by the Ebola epidemic, make it even more unlikely that we will have a serious outbreak here in the US.

Unfortunately, our public response to Ebola is marked by ignorance, fear and panic. Parents of students at Howard Yocum Elementary School, located in a bucolic suburb of Philadelphia, recently protested the fact that two students from Rwanda were enrolled. Rwanda is a small East African country that is 3,000 miles away from the epicenter of the Ebola crisis, and has no reported cases of the disease. Nevertheless, frightened parents threatened to boycott classes. In response, school officials asked the parents of these two young children to “voluntarily” quarantine their kids.

What happened at Howard Yocum Elementary School is not an isolated case. A teacher in Maine was put on mandatory leave simply for attending a conference in Dallas, where the first US cases of Ebola were reported. A middle-school principal in Mississippi was suspended after returning from a family funeral in Zambia, a southern African country also located many thousands of miles from the heart of the Ebola outbreak.

Cruise ships have been put on lock down, subway stations closed, family vacations cancelled, and buses and planes decommissioned because of public fear about Ebola and the risks it poses.

The sad thing is that much of this irrational fear is driven by xenophobia and racism. Since the Ebola outbreak began, over 4,500 people have died in West Africa. However, the mainstream Western media only began to report on the epidemic once an American doctor became infected. The level of care and treatment offered to infected patients from the US and Spain, including access to experimental drugs and vaccines, is also far greater than what is provided to patients in affected countries.

Finally, African immigrants to the US are being increasingly ostracized and stigmatized, even if they come from countries unaffected by Ebola. Their kids are being denied admission to school, their parents denied service at restaurants, and their friends potentially denied entry to this country.

Many US politicians, mostly conservative lawmakers but also some progressive policymakers facing tough reelection campaigns, have called for a travel ban to affected countries in West Africa. This is despite statements from the World Health Organization, Red Cross and CDC that such a travel ban will be ineffective. This is also rather disproportionate compared with lawmakers’ reactions to past outbreaks of mad cow disease in England, SARS in Canada and bird flu in China. No travel bans were proposed in those situations.

Rather than fear West Africans, now is the time to embrace them. We could learn a lot from them. Consider the recent piece by Helene Cooper, a New York Times correspondent and native of Liberia. In that country, where over 2,000 people have died, few families have been left untouched by Ebola. At great personal risk, Liberians have banded together to fight the disease rather than isolating and ostracizing those who are sick. Unlike the average American, they are responding not with fear and loathing but with compassion and love. It’s time for us to do the same.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on October 22, 2014, and is available on the WAMC website. The contents of this post are solely the responsibility of the author alone and do not represent the views of the Bioethics Program or Union Graduate College.]

Fever Pitch

by Sean Philpott-Jones, Director of the Center for Bioethics and Clinical Leadership

Public concern about Ebola reached a fever pitch this past week, no pun intended, following the revelation that a patient in Dallas was infected with this deadly virus.

Returning from a recent trip to Liberia, where thousands of people have died from Ebola since the epidemic began last December, Thomas Eric Duncan (who died shortly after this commentary was recorded for NPR) developed symptoms shortly after arriving in the United States. Public health officials in Texas are now tracking and quarantining the 38 people who had contact with Mr. Duncan after he became ill.

US health professionals and immigration officers have admittedly been slow to react to the Ebola crisis. When Mr. Duncan first started feeling sick, for example, doctors at the Texas hospital where he was first seen failed to recognize the disease. They instead sent him home with antibiotics for what they believed was a common respiratory infection, unwittingly exposing more people to the deadly virus. This so concerned officials with the New York City Department of Health and Mental Hygiene that they are now sending actors faking symptoms of Ebola into emergency rooms in order to test local preparedness.

More worrisome is the fact that US Customs and Border Protection agents seem uninformed about the risks and warning signs of Ebola infection. Several journalists covering the Ebola outbreak in West Africa have reported that immigration officials failed to screen air passengers arriving from afflicted areas for the disease, even when told where those passengers had traveled.

All those concerns and considerations aside, the truth of the matter is that we are unlikely to experience a full-blown outbreak of Ebola here in the United States, regardless of what the current media frenzy around Mr. Duncan and other cases suggests.

The main reason is this: Ebola, although deadly, is not particularly infectious. Transmission occurs when people are exposed to the bodily fluids (blood, feces or saliva) of an infected and symptomatic patient. This is why health care workers and others caring for afflicted patients are most at risk, and why the rest of us are relatively safe.

This also explains why the epidemic has taken hold in West Africa, a region of the world where the existing public health infrastructure is weak, sanitation systems are crumbling, and cultural traditions around dying require family members to express love for the deceased by touching or hugging the dead body. That is very different from the situation in the US.

The Ebola epidemic raises a number of interesting policy issues and ethical challenges: if and when to quarantine travelers coming from afflicted areas, how to respond to possible cases of infection in the clinic and in the community, what obligations doctors and nurses have to care for those with Ebola, and when to provide experimental and untested treatments to those who are sick. Except for those with relatives in West Africa, however, most of us who live in the United States shouldn’t be overly concerned about this disease.

Despite this, millions of Americans are taking to social media sites like Facebook and Twitter to express concern (and even outrage) over how local, state and federal agencies have dealt with the Ebola crisis. Many of these individuals are the same ones who fail to vaccinate their kids against measles, whooping cough or the mumps. Others fail to get a yearly flu shot. But these are the diseases that should terrify us.

Take influenza, for example. It is far more contagious than Ebola, being spread through respiratory droplets or contaminated objects like door handles and telephone receivers. People infected with the flu can also spread it to others even if they do not show signs of illness. This disease will kill nearly 50,000 people in the United States this winter, compared with 3,000 people who have died in the current Ebola outbreak. Despite this, less than half of all Americans will be vaccinated against influenza in the coming year.

Rates of childhood immunization have also declined markedly as parents (particularly more progressive and affluent parents) have become increasingly skeptical of the safety and value of vaccines against polio, measles and whooping cough. As a result, we are seeing a resurgence of these otherwise preventable (and potentially deadly) infectious diseases.

This is the great irony of the situation. Americans are up in arms about the unlikely possibility that there will be a mass outbreak of Ebola on US soil, but are apathetic about the real public health threats that they face.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on October 9, 2014, and is available on the WAMC website. The contents of this post are solely the responsibility of the author alone and do not represent the views of the Bioethics Program or Union Graduate College.]

Extending the Zadroga Act

by Sean Philpott-Jones, Director of the Center for Bioethics and Clinical Leadership

Thirteen years ago today, Americans watched in horror as planes hijacked by Al Qaeda-backed terrorists slammed into the World Trade Center, the Pentagon and a vacant field outside of Shanksville, Pennsylvania.

Many of us lost friends and family. Nearly 3,000 people were killed that day, including 2,753 who died when the World Trade Center’s Twin Towers fell. The actual death toll associated with 9/11, however, is much higher.

When the Towers fell, they released a cloud of pulverized cement, shards of glass, asbestos, mercury, lead, PCBs, and other carcinogenic and poisonous materials into the air. That cloud lingered for months, with hundreds of rescue workers, thousands of construction workers and millions of New York City residents breathing in a witches’ brew of cancer-causing chemicals.

Rates of asthma, obstructive pulmonary disease and other respiratory illnesses are sky high among those who were exposed to the foul air and toxic dust that lingered over Lower Manhattan in the days and weeks that followed 9/11. A study of police officers who responded to the terror attacks found that more than half have diminished lung function and chronic shortness of breath.

Rates of prostate cancer, thyroid cancer, and multiple myeloma are also elevated; one study looking at nearly 10,000 firefighters found that those who were at the World Trade Center were 20% more likely to develop cancer than those who were not there. Over 2,900 people who worked or lived near the World Trade Center on 9/11 have been diagnosed with cancer, including nearly 900 fire fighters and 600 police. Many of these cancers are likely associated with exposure to chemicals in the air and debris at Ground Zero.

Under the James Zadroga 9/11 Health and Compensation Act, passed by Congress in 2010 after a prolonged partisan fight, first responders, recovery workers, and survivors of the terror attacks can seek free testing and treatment for 9/11-related illnesses. Nearly 50,000 people are currently being monitored and over 30,000 are receiving medical treatment or compensation for illnesses and injuries associated with the World Trade Center’s collapse.

These numbers are expected to rise in the coming years. The incidence of cancer and chronic respiratory illnesses continues to increase at an alarming rate among survivors and responders of the terror attacks. At the same time, two of the key programs created by the Zadroga Act are due to expire. Unless Congress extends the Act, the World Trade Center Health Program, which provides free screening and treatment for 9/11-related illnesses, will end in October 2015. The September 11th Victim Compensation Fund, which provides financial support to the victims of 9/11 and their families, will close in October 2016. Desperately needed medical care and social services will be cut off for thousands of sick patients whose only crime was to survive the attacks or to provide care and aid for those who did.

A bipartisan group of New York politicians – including New York City Mayor Bill de Blasio, US Senator Kirsten Gillibrand, and US Representatives Peter King and Carolyn Maloney – want to prevent this. Just this week, they called upon Congress to extend the Zadroga Act for another 25 years. But they and other supporters of the Act face an uphill battle.

One of the key reasons that it took nearly 10 years to get this legislation passed in the first place is that many prominent (largely conservative) Congressmen opposed its passage, including Representatives Michele Bachmann and Paul Ryan. House Speaker John Boehner and Majority Whip Kevin McCarthy voted against it repeatedly. Senator Tom Coburn also filibustered its passage, arguing that the federal government simply cannot afford to provide treatment and care for the victims of 9/11 in an era of record budget deficits. Should the deficit hawks of the Republican Party retain control of the House and recapture the Senate in the upcoming mid-term elections, the fate of the Zadroga Act is likely sealed.

The heroes and victims of 9/11 deserve better. I believe that we have a moral obligation to provide lifelong medical care and treatment for illnesses linked to the terror attacks. It is shameful that the same politicians who used these attacks to justify hundreds of billions of dollars in military expenditures are suddenly crying poor when asked to help the victims themselves. I urge you to call your Senator and Representative and ask them to support the Zadroga Act. More importantly, use the power of the ballot box in the upcoming midterm elections to send a message to those who do not support an extension of the James Zadroga 9/11 Health and Compensation Act.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on September 11, 2014, and is available on the WAMC website. The contents of this post are solely the responsibility of the author alone and do not represent the views of the Bioethics Program or Union Graduate College.]