Tuesday, November 29, 2016

Event Recap: Anonymous Peer Review & PubPeer

by Ian McLaughlin 

On the 24th of October, the Penn Science Policy Group met to discuss the implications of a new mechanism by which individuals can take part in the peer review process.  The discussion focused on a particular platform, PubPeer.com, which emerged in 2012 and has since become a topic of interest and controversy in the scientific community.  In essence, PubPeer is an online forum for post-publication commentary, ranging from small concerns raised by motivated readers to deeper dives into the legitimacy of a publication's figures, data, and statistics.  Given the current state of the widely criticized peer-review process, we considered the advantages and disadvantages of democratizing the process, with the added layer of anonymity applied to reviewers.

PubPeer has helped foster investigations of several scandals in science.  One example is the critical evaluation of a pair of papers published in Nature in 2014, one entitled "Stimulus-triggered fate conversion of somatic cells into pluripotency" [1].  The papers described a novel mechanism by which pluripotency might be induced by manipulating the pH environment of somatic cells.  However, following publication, concerns regarding the scientific integrity of the published experiments were raised, resulting in the retraction of both papers and an institutional investigation.
  
Subsequently, the publications of a prolific cancer researcher received attention on PubPeer, ultimately resulting in the rescission of a prestigious position at a new institution eleven days before the start date, due at least in part to PubPeer commenters contacting faculty at that institution.  When the professor tried to return to the former position, it was no longer available.  The professor then sued the PubPeer commenters, arguing that the site must identify the individuals whose comments had prevented a continued career in science.  PubPeer, advised by ACLU lawyers working pro bono, is refusing to comply, and enjoys the support of both Google and Twitter, which have filed a court brief in defense of the website [2]. 
                  
At its best, PubPeer fulfills an unmet, or poorly met, need in the science publication process.  Our discussion group felt that PubPeer's goal is one the peer review process is meant to pursue but occasionally falls short of accomplishing. While increased vigilance is welcome, and bad science or intentionally misleading figures should certainly not be published, perhaps the popularity of and activity on PubPeer reveal a correctable problem in the review process rather than a fundamental flaw. While the discussion group didn't focus specifically on problems with the current peer review process, a topic deserving its own discussion [3], the group felt that there were opportunities to improve the process, and was ambivalent about whether a platform like PubPeer is sufficiently moderated, vetted, and transparent in the right ways to be an optimal means to this end.
                  
One idea proposed by discussion participants was to make the peer-review process more transparent, with increased visibility into the reasons a manuscript is or is not published.  Peer review often relies upon the input of just a handful of volunteer experts, all of whom are frequently under time constraints that can jeopardize their ability to thoroughly evaluate manuscripts, occasionally resulting in the assignment of peer review to members of related, though not optimally relevant, fields [4].  Some participants suggested that a democratized review process, similar to PubPeer's, might alleviate some of these problems, provided commenters are moderated to ensure they have relevant expertise.  Others argued that, given the gate-keeping role journals play in determining the career trajectories of aspiring scientists, the onus is on journals' editorial staffs to render peer review more effective.  Finally, another concept discussed was to layer a third-party moderation mechanism on top of a platform like PubPeer, ensuring comments are objective, constructive, and unbiased.
                  
The concept of a more open peer review is one that many scientists are beginning to seriously consider.  In Nature News, Ewen Callaway reported that 60% of authors in Nature Communications agreed to have their peer reviews published [7].  However, while a majority of respondents to a survey funded by the European Commission believed that open peer review ought to become more routine, not all strategies of open peer review received equivalent support.


                  
Ultimately, the group unanimously felt that the popularity of PubPeer ought to signal to the scientific community that something in the publication process, with potentially destructive ramifications, requires our attention [5].  Every time a significantly flawed article is published, damage is done to the perception of science and the scientific community.  At a time when the scientific community still enjoys broadly positive public perception [6], now is likely an opportune moment to reconsider the peer-review process, and perhaps to learn some lessons that an anonymous post-publication website like PubPeer might teach us.

References


1) PubPeer - Stimulus-triggered fate conversion of somatic cells into pluripotency. (n.d.). Retrieved November 25, 2016, from https://pubpeer.com/publications/8B755710BADFE6FB0A848A44B70F7D 

2) Brief of Amici Curiae Google Inc. and Twitter Inc. in Support of PubPeer, LLC. (Michigan Court of Appeals). https://pubpeer.com/Google_Twitter_Brief.pdf

3) Balietti, S. (2016). Science Is Suffering Because of Peer Review’s Big Problems. Retrieved November 25, 2016, from https://newrepublic.com/article/135921/science-suffering-peer-reviews-big-problems

4) Arns M. Open access is tiring out peer reviewers. Nature. 2014 Nov 27;515(7528):467. doi: 10.1038/515467a. PubMed PMID: 25428463.

5) Jha, A. (2012, September 13). False positives: Fraud and misconduct are threatening scientific research. Retrieved November 25, 2016, from https://www.theguardian.com/science/2012/sep/13/scientific-research-fraud-bad-practice

6) Hayden, E. C. (2015, January 29). Survey finds US public still supports science. Retrieved November 25, 2016, from http://www.nature.com/news/survey-finds-us-public-still-supports-science-1.16818 

7) Callaway E. Open peer review finds more takers. Nature. 2016 Nov 10;539(7629):343. doi: 10.1038/nature.2016.20969. PubMed PMID: 27853233

Sunday, November 6, 2016

Tracing the ancestry and migration of HIV/AIDS in America

by Arpita Myles
Acquired immunodeficiency syndrome, or AIDS, is a global health problem that has terrified and intrigued scientists and laypeople alike for decades. AIDS is caused by the human immunodeficiency virus, or HIV, which is transmitted through blood, semen, and vaginal fluid, and from an infected mother to her child [1]. Infection leads to failure of the immune system, increasing susceptibility to secondary infections and cancer, which are often fatal. Considerable effort is being put into developing prophylactic and therapeutic approaches to tackle HIV/AIDS, but there is also interest in understanding how the disease became so widespread. With the Ebola and Zika outbreaks of the last couple of years, there is renewed urgency in understanding how viruses emerged and spread in the past in order to prevent such outbreaks in the future. The narrative surrounding the spread of HIV has been somewhat convoluted, but a new paper in Nature by Worobey et al. hopes to set the record straight [2].
Humans are thought to have acquired HIV from African chimpanzees, presumably as a result of hunters coming into contact with infected blood containing a variant of the virus that had adapted to infect humans. The earliest known case of HIV in humans was detected in 1959 in Kinshasa, Democratic Republic of the Congo, but the specific mode of transmission was never ascertained [3].
Until now, there has been little information about how HIV spread to the United States. HIV cases were first reported in the US in 1981, leading to the recognition of AIDS [4]. Since the virus can persist for a decade or more before symptoms manifest, it is possible that it arrived in the country long before 1981. However, since most samples from AIDS patients were collected after this date, efforts to establish a timeline for HIV's entry into the US met with little success. Now, researchers have attempted to trace the spread of HIV by comparing genetic sequences of contemporary HIV strains with those in blood samples from HIV patients dating back to the late 1970s [2]. These samples were initially collected for a study pertaining to hepatitis B, but some were found to be HIV seropositive. This is the first comprehensive genetic study of the HIV virus in samples collected prior to 1981.
The technical accomplishment of this work is significant as well. To circumvent the low abundance and extensive degradation of viral RNA in the patient samples, the researchers developed a technique they call "RNA jackhammering." In essence, a patient's genome is broken down into small bits, and overlapping sequences of viral RNA are amplified. This enables the researchers to piece together the viral genome, which they can then subject to phylogenetic analysis.
Using novel statistical methods, Worobey et al. show that the virus probably entered New York City from the Caribbean (Haiti) during the 1970s, whereupon it spread to San Francisco and other regions. Upon analyzing the older samples, the researchers found that despite bearing similarities to the Caribbean strain, the strains from the San Francisco and New York samples differed from one another. This suggests that the virus entered the US multiple, discrete times and then began circulating and mutating. Questions remain regarding the route of transmission of the virus from Haiti to New York.
The relevance of this study is manifold. Based on the data, one can attempt to understand how pathogens spread from one population to another and how viruses mutate and evolve to escape natural immunity and engineered therapeutics. Their molecular and analytical techniques can be applied to other diseases and provide valuable information for clinicians and epidemiologists alike. Perhaps the most startling revelation of this study is that contemporary HIV strains are more closely related to their ancestors than to each other. This implies that information derived from ancestral strains could lead to development of successful vaccine strategies.
Beyond the clinic and research labs, there are societal lessons to be learned as well. Published in 1984, a study by Centers for Disease Control (CDC) researcher William Darrow and colleagues traced the initial spread of HIV in the US to Gaétan Dugas, a French Canadian air steward. Examination of Dugas's case provided evidence linking HIV transmission with sexual activity. Researchers labeled Dugas "Patient O," as in "Out of California" [5]. This was misinterpreted as "Patient Zero" by the media, a term still used in the context of other epidemics like flu and Ebola. The dark side of this story is that Dugas was demonized in the public domain as the one who brought HIV to the US. As our understanding of the disease and its spread broadened, scientists and historians began to discredit the notion that Dugas played a significant role. However, the scientific facts were buried beneath layers of sensationalism and hearsay, and the stigma remained.
Now, with the new information brought to light by Worobey's group, Dugas's name has been cleared. Phylogenetic analysis showed that Dugas's strain of HIV was sufficiently different from the ancestral ones to rule out the possibility that he initiated the epidemic.
The saga in its entirety highlights the moral dilemma of epidemiological studies and the question of how far their findings should be made public. Biological systems are complicated, and while narrowing down the origin of a disease has significant clinical relevance, we often fail to consider the collateral damage. The tale of tracking the spread of HIV is a cautionary one; scientific and social efforts should focus on resolution and management rather than on vilifying unsuspecting individuals for "causing" an outbreak.

References:
1. Maartens G, Celum C, Lewin SR. HIV infection: epidemiology, pathogenesis, treatment, and prevention. Lancet. 2014 Jul 19;384(9939):258-71.
2. Worobey M, Watts TD, McKay RA et al., 1970s and 'Patient 0' HIV-1 genomes illuminate early HIV/AIDS history in North America. Nature. 2016 Oct 26. doi: 10.1038/nature19827.
3. Faria NR, Rambaut A et al., HIV epidemiology. The early spread and epidemic ignition of HIV-1 in human populations. Science. 2014 Oct 3;346(6205):56-61.
4. Centers for Disease Control (CDC). Pneumocystis pneumonia--Los Angeles. MMWR Morb Mortal Wkly Rep. 1981 Jun 5;30(21):250-2.
5. McKay RA. “Patient Zero”: The Absence of a Patient’s View of the Early North American AIDS Epidemic. Bull Hist Med. 2014 Spring: 161-194.

Tuesday, October 11, 2016

Event Recap: The Importance of Science-Informed Policy & Law Making

by Ian McLaughlin          

Last week, we held a panel discussion focused on the importance of science-informed policy and law making.  The panel included Dr. Michael Mann, a climatologist and geophysicist at Pennsylvania State University who recently wrote The Madhouse Effect: How Climate Change Denial is Threatening Our Planet, Destroying Our Politics, and Driving Us Crazy.  He was joined by Dr. Andrew Zwicker, a member of the New Jersey General Assembly and a physicist who heads the Science Education Department of the Princeton Plasma Physics Laboratory, and by Shaughnessy Naughton, a chemist and entrepreneur who ran for congressional office in Pennsylvania and founded 314 PAC, which promotes the election of candidates with backgrounds in STEM fields to public office.

The event began with personal introductions, with each member characterizing their unique perspectives and personal histories.  Shaughnessy Naughton highlighted the scarcity of legislators with backgrounds in math and science as a primary motivator for encouraging people with science backgrounds to get involved beyond just advocacy. 

Dr. Andrew Zwicker, having previously run for a seat in the US House of Representatives, was ultimately successful in his run for the state assembly in an extremely tight race, winning by just 78 votes, or 0.2456%, a level of precision that he's been told only a scientist would quote, as most would round the value to a quarter of a percent.  He credited two primary features of his campaign for his success.  First, on a practical level, he utilized a more sophisticated voter model.  As the first Democrat elected in his district's 42-year history [1], it was critical to allocate resources optimally to communicate his message effectively.  Second, he identified his background in science as a strength.  When campaigning, he made it clear that he'd ensure facts would guide his decisions, and his constituents found that pragmatism appealing.

Next, Dr. Michael Mann summarized his pathway to prominence in the climate change debate by recounting the political fallout that followed the publication of his now famous "hockey-stick graph" [2].  In short, the graph shows that average global temperatures were fairly stable until about 1900 (forming the shaft of the hockey stick), at which point a sharp rise in temperature begins (forming the blade).  In explaining why this publication made such a splash, he highlighted the simplicity of the graph: it summarizes what is otherwise fairly esoteric data in a way that's accessible to non-scientists.  "You don't have to understand the complex physics to understand what the graph was saying: there's something unprecedented taking place today, and, by implication, probably has something to do with what we're doing."  After its publication, he was in for a whirlwind.  The graph became iconic in the climate change debate, provoking the ire of special interests, who then pursued a strategy of personally discrediting Mann.

Naughton initiated the conversation by asking Zwicker whether his background in science has influenced what he's been able to accomplish in his nine months in public office.  While at times it has given him credibility and garnered trust among his peers and constituents, the nature of science is often incongruous with politics: rather than relying solely on facts, politics requires emotional and personal appeals to get things done.  As a specific example, the fear of jobs being lost to legislation, particularly reforms focused on energy and climate change, often obscures what would otherwise be a less volatile debate.

Naughton then asked Mann to describe his experience with Ken Cuccinelli, the former Attorney General (AG) of Virginia under former governor Bob McDonnell.  One of Cuccinelli's priorities was to target the Environmental Protection Agency's ability to regulate greenhouse gas emissions, as well as to demand that the University of Virginia, the institution where Dr. Mann had been an assistant professor from 1999 to 2005, provide a sweeping compilation of documents associated with him.  Cuccinelli relied upon the 2002 Virginia Fraud Against Taxpayers Act, devised to enable the AG to ferret out state waste and fraud, to serve the civil investigative demand.  Ultimately, Cuccinelli's case was rejected, an outcome that has since been considered a major victory for the integrity of academic research and for scientists' privacy.

The panel then invited questions from attendees, which ranged from technical inquiries of how climate estimates were made for the Hockey Stick Curve to perspectives on policy & science communication. 

One question focused on the public's ability to digest and think critically about scientific knowledge, highlighting that organizations and institutions like AAAS and the NSF regularly require funded investigators to spend time communicating their research to a broader audience.  However, the relationship between the public and science remains tenuous.  Zwicker responded by identifying a critical difference in efficacy between the beautiful images and data that come from NASA or press releases and the personal experiences of people outside of science.  Special interest groups can disseminate opinions and perspectives that don't comport with the scientific consensus, and without truly effective science communication, the public simply can't know whom to trust.  He argued that scientists remain a broadly trusted group, but that without competent efforts to communicate the best science, this remains a major challenge.  Ultimately, he said, the solution involves a focus on early education and on teaching critical thinking skills.

Moreover, Mann commented on a problematic fallacy that arises from a misunderstanding of how science works: “there’s a fallacy that because we don’t know something, we know nothing.  And that’s obviously incorrect.” There are many issues at the forefront of science that remain to be understood, but that forefront exists because of relevant established knowledge.  “We know greenhouse gasses warm the planet, and it’ll warm more if we continue burning carbon.  There’s still uncertainty with gravity.  We haven’t reconciled quantum mechanics with general relativity.  Just because we haven’t reconciled all of the forces, and there’s still something to be learned about gravity at certain scales – we still understand that if we jump out the window, we’ll plummet to our deaths.”

Naughton suggested that much of this disconnect between scientific knowledge and public sentiment comes down to communication.  “For many scientists, it’s very difficult to communicate very complex processes and theories in a language that people can understand.  As scientists, you want to be truthful and honest.  You don’t learn everything about quantum mechanics in your first year of physics; by not explaining everything, that doesn’t mean you’re being dishonest.” 

Zwicker highlighted that there aren’t many prominent science communicators, asking the audience to name as many as they could.  Then, he asked if we could name prominent female science communicators, which proved more difficult for the audience.  There isn’t necessarily a simple solution to this obvious problem, given the influence of special interests and concerns of profitability.

An audience member then asked whether the panelists considered nuclear energy a viable alternative – and, in particular “warehouse-ready nuclear”, which describes small modular reactors that operate on a much smaller scale than the massive reactors to which we’ve become accustomed.  Zwicker, as a physicist, expressed skepticism: “You’ll notice there are no small reactors anywhere in the world.  By the time you build a reactor and get through the regulation – and we’re talking 10-30 years to be completed – we’re still far away from them being economically viable.”  He also noted that he’s encountered the argument that investment allocation matters to the success of a given technology, and that investment in one sustainable energy platform may delay progress in others.  The audience then asked about the panel’s perspectives on natural gas, which is characterized by some as a bridge fuel to a lower carbon-emitting future energy source.  Summarizing his perspective on natural gas, Mann argued “a fossil fuel ultimately can’t be the solution to a problem caused by fossil fuels.”

Jamie DeNizio, a member of PSPG, asked if the panel thought coalitions between state and local governments could be an effective strategy for getting around current barriers at the national level.  Naughton noted that this is ultimately the goal behind the federal Clean Power Plan, with goals tailored to specific states for cutting carbon output.  Mann, highlighting the prevalent lack of acceptance of climate change at the federal level, suggested that existing state consortia, like the Regional Greenhouse Gas Initiative (RGGI) in New England or the Pacific Coast Collaborative (PCC) on the West Coast, are causes for optimism, indicating that progress can be made despite federal gridlock.  Zwicker noted that New Jersey's participation in carbon-credit trading had brought in substantial revenue, including funds used to build a new hospital.  He suggested that Governor Chris Christie's decision to withdraw from RGGI was imprudent; the New York Times noted that, in 2011, New Jersey had received over $100 million in revenue from RGGI [3].

Another issue that was brought up by the panel was how counterproductive infighting among environmentalists and climate change activists can be to the overall effort.  In particular, this splintering enables critics to portray climate change as broadly incoherent, rendering the data and proposals less convincing to skeptics of anthropogenic climate change.

Adrian Rivera, also a PSPG member, asked the panel to comment on whether social media is an effective way to communicate science to the general public.  Mann stated that scientists who do not engage on social media are not being as effective as they could be, mostly because a growing subset of the population gets its information from social media platforms. In contrast, Zwicker highlighted the lack of depth on social media: some issues simply require more in-depth discussion than social media tends to accommodate. He also emphasized the importance and value of face-to-face communication. Naughton then offered a specific example of poor science communication translating into tangible problems.  "It's not all about policy or NIH/NSF funding.  It's about making sure evolution is being taught in public schools."  She recounted the experience of a botany professor in Susquehanna, PA, who was holding an info-session on biology for high-school teachers. One of the attending teachers told him that he was brave for teaching evolution, which Naughton identified as an example of the consequences of ineffective science communication.

Finally, an environmental activist in the audience noted that a major problem he'd observed in his own advocacy was that he often spoke from anger rather than in positive terms.  Mann thoroughly agreed, noting that "there's a danger when we approach from doom and gloom.  This takes us to the wrong place; it becomes an excuse for inaction, and it actually has been co-opted by the forces of denial.  It is important to communicate that there is urgency in confronting this problem [climate change] – but that we can do it, and have a more prosperous planet for our children and grandchildren.  It's critical to communicate that.  If you don't provide a path forward, you're leading people in the wrong direction."

The event was co-hosted by 314 Action, a non-profit affiliated with 314 PAC with the goal of strengthening communication among the STEM community, the public, and elected officials.


References:

1. Qian, K. (2015, November 11). Zwicker elected as first Democrat in NJ 16th district. Retrieved October 6, 2016, from http://dailyprincetonian.com/news/2015/11/zwicker-elected-as-first-democrat-in-nj-16th-district/

2. Mann, Michael E.; Bradley, Raymond S.; Hughes, Malcolm K. (1999), "Northern hemisphere temperatures during the past millennium: Inferences, uncertainties, and limitations" (PDF), Geophysical Research Letters, 26 (6): 759–762, Bibcode:1999GeoRL..26..759M, doi:10.1029/1999GL900070

3. Navarro, M. (2011, May 26). Christie Pulls New Jersey From 10-State Climate Initiative. Retrieved October 6, 2016, from http://www.nytimes.com/2011/05/27/nyregion/christie-pulls-nj-from-greenhouse-gas-coalition.html?_r=1&ref=nyregion

Monday, October 3, 2016

New Research shows how to make Human Stem Cell Lines divide equally

by Amaris Castanon
For the first time, scientists have generated haploid human embryonic stem (ES) cell lines, as published in Nature. This could lead to novel cell therapies for genetic diseases, even color blindness (Sagi et al., 2016).
The study was performed by scientists from the Hebrew University of Jerusalem (Israel) in collaboration with Columbia University Medical Center (CUMC) and the New York Stem Cell Foundation (NYSCF).
The newly derived pluripotent human ES cell lines demonstrated the ability to self-renew while maintaining a normal haploid karyotype, i.e., retaining their single set of chromosomes through successive divisions (Sagi et al., 2016).
While gamete manipulation in other mammalian species has yielded several haploid ES cell lines (Yang et al., 2012; Leeb & Wutz, 2011), this is the first study to report human cells capable of dividing with only one copy of the parental genome (Sagi et al., 2016).
The genetic match between the stem cells and the egg donor may prove advantageous for cell-based therapies of genetic diseases such as diabetes, Tay-Sachs disease and even color blindness (Elling et al., 2011).
Mammalian somatic cells are diploid: they inherit two sets of chromosomes, 23 from the father and 23 from the mother, for a total of 46 (Wutz, 2014; Yang et al., 2012). Haploid cells contain a single set of 23 chromosomes and arise only as post-meiotic germ cells (egg and sperm), ensuring that the right number of chromosomes ends up in the zygote (Li et al., 2014; Elling et al., 2011).
Previous efforts to generate ES cells from human egg cells yielded only diploid (46-chromosome) stem cells (Leeb et al., 2012; Takahashi et al., 2014). This study, however, reported inducing cell division in unfertilized human egg cells (Sagi et al., 2016).
The DNA was labeled with a fluorescent dye prior to isolating the haploid stem cells and scattering them among the larger pool of diploid cells. The DNA staining demonstrated that the haploid cells retained their single set of chromosomes, while their differentiation into other cell types, including nerve, heart, and pancreatic cells, demonstrated their ability to give rise to cells of different lineages (pluripotency) (Sagi et al., 2016).
Indeed, the newly derived haploid ES cells demonstrated pluripotent stem cell characteristics, such as self-renewal capacity and a pluripotency-specific molecular signature (Sagi et al., 2016).
In addition, the researchers successfully demonstrated the use of their newly derived human ES cells as a platform for loss-of-function genetic screening, elucidating the potential of screens that target only one of the two copies of a gene.
These findings may facilitate future genetic analysis by easing gene editing in cancer research and regenerative medicine. Haploid cells are significant here because detecting the biological effects of a single-copy mutation in a diploid cell is difficult: the second copy does not contain the mutation and therefore serves as a 'backup' set of genes, masking the mutation's effect.
The newly derived haploid ES cells will provide researchers with a valuable tool for improving our understanding of human development and genetic disease, and this new type of human stem cell is likely to play an important role in human functional genomics and regenerative medicine.
References:
Sagi, I., Chia, G., Golan-Lev, T., Peretz, M., Weissbein, U., Sui, L., Sauer, M.V., Yanuka, O., Egli, D. & Benvenisty, N. Derivation and differentiation of haploid human embryonic stem cells. Nature 532, 107–111 (2016).

Elling, U. et al. Forward and reverse genetics through derivation of haploid mouse embryonic stem cells. Cell Stem Cell 9, 563–574 (2011).

Leeb, M. et al. Germline potential of parthenogenetic haploid mouse embryonic stem cells. Development 139, 3301–3305 (2012).

Leeb, M. & Wutz, A. Derivation of haploid embryonic stem cells from mouse embryos. Nature 479, 131–134 (2011).

Li, W. et al. Genetic modification and screening in rat using haploid embryonic stem cells. Cell Stem Cell 14, 404–414 (2014).

Takahashi, S. et al. Induction of the G2/M transition stabilizes haploid embryonic stem cells. Development 141, 3842–3847 (2014).

Wutz, A. Haploid mouse embryonic stem cells: rapid genetic screening and germline transmission. Annu. Rev. Cell Dev. Biol. 30, 705–722 (2014).

Yang, H. et al. Generation of genetically modified mice by oocyte injection of androgenetic haploid embryonic stem cells. Cell 149, 605–617 (2012).

Friday, March 25, 2016

Event Recap: Dr. Sarah Rhodes, Health Science Policy Analyst

by Chris Yarosh

PSPG tries to hold as many events as limited time and funding permit, but we cannot bring in enough speakers to cover the range of science policy careers out there. Luckily, other groups at Penn hold fantastic events, too, and this week’s Biomedical Postdoc Program Career Workshop was no exception. While all of the speakers provided great insights into their fields, this recap focuses on Dr. Sarah Rhodes, a Health Science Policy Analyst in the Office of Science Policy (OSP) at the National Institutes of Health (NIH).

First, some background: Sarah earned her Ph.D. in Neuroscience from Cardiff University in the U.K., and served as a postdoc there before moving across the pond and joining a lab at the NIH. To test the policy waters, Sarah took advantage of NIH’s intramural detail program, which allows scientists to do temporary stints in administrative offices. For her detail, Sarah worked as a Policy Analyst in the Office of Autism Research Coordination (OARC) at the National Institute of Mental Health (NIMH). That experience convinced her to pursue policy full time. Following some immigration-related delays, Sarah joined OARC as a contractor and later became a permanent NIH employee.

After outlining her career path, Sarah provided an overview of how science policy works in the U.S. federal government, breaking the field broadly into three categories: policy for science, science for policy, and science diplomacy. According to Sarah (and as originally promulgated by Dr. Diane Hannemann, another one of this event’s panelists), the focus of different agencies roughly breaks down as follows:


This makes a lot of sense. Funding agencies like NIH and NSF are mostly concerned with how science is done, Congress is concerned with general policymaking, and the regulatory agencies both conduct research and regulate activities under their purview. Even so, Sarah noted that all of these agencies do a bit of each type of policy (e.g., science diplomacy at the NIH Fogarty International Center). In addition, different components of each agency have different roles. For example, individual Institutes focus more on analyzing policy for their core mission (aging at NIA, cancer at NCI, etc.), while the OSP makes policies that influence all corners of the NIH.

Sarah then described her personal duties at OSP’s Office of Scientific Management and Reporting (OSMR):
  • Coordinating NIH’s response to a directive from the President’s Office of Science and Technology Policy related to scientific collections (think preserved specimens and the like)
  • Managing the placement of AAAS S&T Fellows at NIH
  • Supporting the Scientific Management Review Board, which advises the NIH Director
  • Preparing for NIH’s appropriations hearings and responding to Congressional follow-ups
  • “Whatever fires need to be put out”
If this sounds like the kind of job for you, Sarah recommends building a professional network and developing your communication skills ASAP (perhaps by blogging!?). This sentiment was shared by all of the panelists, and it echoes advice from our previous speakers. Sarah also strongly recommends volunteering for university or professional society committees; these bodies work as deliberative teams and are therefore good preparation for the style of government work.

For more information, check out the OSP’s website and blog. If you’re interested in any of the other speakers from this panel, I refer you to the Biomedical Postdoc Program.