DNA, Genomics, and the New Ethical Dilemmas
In many respects the genetic technology that has grown up around the elucidation of the molecular structure of DNA since 1953 has raised even more ethical and moral questions than the classical Mendelian genetics that preceded it. The rapid and exciting development of molecular genetics in the period from 1953 to 1970 provided the basis for understanding aspects of genetics at the molecular level that prewar geneticists had only imagined. Understanding how DNA replicates itself, how genes control cell function by coding for proteins that serve both structural and catalytic (enzymatic) roles, the nature and universality of the genetic code itself, and the way in which genes are regulated (turned on and off during development) all suggested that human beings would soon be able to engineer organisms in almost any conceivable direction. Indeed, the term genetic engineering was coined during the 1960s and early 1970s to express the new hope and excitement held forth by the understanding of the molecular mechanisms of heredity.
As with the rapid advances in Mendelian and chromosomal genetics of the 1920s and 1930s, new discoveries in molecular genetics were announced almost weekly. Although many theories of molecular genetics came and went, all were subject to being tested and accepted or rejected. For example, early accounts of transcription (forming messenger RNA from a DNA strand) and translation (using the messenger to synthesize a specific protein), based on work with prokaryotic (bacterial) systems, turned out to differ in many respects from the same processes in eukaryotic cells (the cells of all higher organisms). Bacterial and viral chromosomes proved to be organized quite differently from the chromosomes of the fruit fly or the human. The claim that "what is true for E. coli [a common bacterium used for molecular genetic research] is true for the elephant," as molecular biologist Jacques Monod put it, turned out not to be quite that simple. Yet there did appear to be an evolutionary unity among all forms of life on earth, one even more apparent at the molecular level than at the level of gross phenotype.
The application of the new genetics to practical concerns, both in agriculture and medicine, raised a number of social, political, and ethical issues, some of which overlapped with concerns from the classical era and some of which were quite new to the molecular era. At the agricultural level, one of the first great controversies to emerge concerned the technology for transferring genes from one organism to another. The common method was to use a bacterial or viral plasmid (a small chromosome-like element of DNA) as a "vector." An isolated segment of DNA from one type of organism could be inserted into the plasmid, which, because of its small size, could be taken up by another cell type and eventually integrated into the host cell's genome. The foreign, or transplanted, DNA would then be replicated every time the DNA of the host cell replicated. Characteristics of great commercial value, such as insect, mold, and frost resistance, could thus be genetically engineered by transferring DNA (genes) from a species that possessed one of these traits. The controversies arising from this technology reached significant proportions in the mid-1970s in Cambridge, Massachusetts, where much of the experimental work was being carried out by Harvard and Massachusetts Institute of Technology biologists. Fear that engineered plasmids could "get loose" into the community through the widespread use of the new technology sparked a series of public meetings and calls for a moratorium on all genetic engineering until safeguards could be assured. A meeting of many of the leading biotechnological researchers at Asilomar, California, in 1975 brought to the fore a discussion of the potential hazards of inserting genes from one kind of organism into the genome of another.
Although the Asilomar meeting was hailed as one of the boldest exercises in social responsibility by scientists since World War II, interestingly enough, the most dangerous potential use of the new biotechnology, the creation of biological weapons, was not discussed. Guidelines later incorporated into all research grants funded by the National Institutes of Health were based on some of these early decisions among the molecular biologists themselves.
Especially in the agricultural realm, the issue of "genetically modified organisms" (GMOs) became a matter of global concern in the 1980s and 1990s. Although the use of viral and bacterial plasmids turned out not to pose as serious a threat as originally thought, critics of biotechnology argued that GMOs could have altered metabolic characteristics that might adversely affect the physiology of the consumer and the environment at large. One such case became a cause célèbre in 1999, when laboratory studies in the United States suggested that pollen from corn genetically modified to carry a bacterial gene, Bt, conferring insect resistance could kill the larvae of monarch butterflies. Indeed, as mega-corporations such as Monsanto turned aggressively to exploiting the GMO market, many countries, especially in the European Union and Africa, began to restrict, or even ban, the sale or importation of GMOs within their borders. The issue was less the effect on a specific species such as the monarch butterfly than what the monarch's plight symbolized: under competitive pressure from rival companies, GMOs were often rushed onto the market without thorough testing. Long-standing distrust of corporate agribusiness, in which quick profits have been seen as taking precedence over human health and the quality of the environment, has fueled much of the negative response to GMOs worldwide.
Equally important has been the issue of using human subjects in genetic research. The problem of "informed consent," never something biologists routinely worried about before World War II (though some were scrupulous about informing their subjects about the nature of the research in which they were involved), became a central aspect of the ethics of all human-subject research protocols from the 1970s onward. All universities and hospitals engaged in any sort of human genetics (or other) research now have institutional review boards (IRBs) responsible for overseeing projects in which human subjects are involved. With regard to genetic information about individuals, the requirement of consent is meant not only to ensure that subjects understand the nature of the research of which they are a part, and to ensure their safety, but also to place tight restrictions on who has access to the resulting information. A particular concern in clinical studies is whether individual subjects could be identified from published or unpublished reports, notebooks, or other documents. Preserving anonymity has become a hallmark of all modern genetic research involving human subjects.
The question of access to genetic information has had ramifications beyond the design of research protocols, reaching into clinical medicine as well. As tests for genes known to be related to specific human genetic diseases, such as sickle-cell anemia, Huntington's disease, or cystic fibrosis (CF), have become available to clinicians, two questions have loomed large, especially in the United States: the accuracy of the tests (that is, the incidence of false positives) and who should have access to the results. Fears that genetic information might lead to employment or health-care discrimination have surfaced throughout the United States as genetic screening programs have become more technically feasible and thus more frequently employed. Perhaps the more general concern is the potential for insurance companies to obtain, or even require, genetic testing of adults as a basis for medical coverage, or, harkening back to an almost eugenic view, to require testing of fetuses, with the threat of loss of coverage if a fetus with a known genetic defect is carried to term. Medical insurance companies have in the past tried to classify genetic diseases as "prior conditions" exempt from coverage. Most of these attempts have not been carried through, but the threat remains and raises a host of legal as well as social and psychological concerns. As of 2004, a small number (seven) of states in the United States had passed legislation specifically prohibiting insurers from denying coverage to individuals on the basis of genetic data.