Team:Lethbridge/Ethics
=<font color="white">“Study the past, if you would divine the future” – Confucius=
Scientific study is generally perceived as the development of new ideas and novel data, but underlying this is the fact that scientific advancement is made by building new innovation on top of formerly known information. Indeed, without the discovery of the cell, synthetic biology would never exist.
Through our analysis of cloning, antibiotics, the internet, and nuclear power, the Lethbridge iGEM Team will “divine the future” of ethics and its relationship with the newly developing field of synthetic biology.
=<font color="white">Cloning=
Most textbooks and scientific papers define cloning as the biological process by which an individual produces identical individuals, mostly as a result of asexual reproduction. In biotechnology, cloning refers to creating multiple copies of DNA fragments, cells, or organisms. At the molecular and cellular level, cloning and genetic engineering have greatly impacted the manner in which we live our lives. Research in these areas has led to the development of techniques such as PCR and recombinant DNA technology, which have numerous applications in human life - from advances in medicine that facilitated the synthesis of insulin for diabetic patients to modern forensic units, which use PCR to amplify DNA evidence and identify criminals.
(2) Suckow M.A., Weisbroth S.H. 2006. The Laboratory Rat. “Ethical and Legal Perspectives – Beverly J. Gnadt”. 53-70.
=<font color="white">Antibiotics=
The spread of disease and sickness caused by microorganisms brought about widespread death for many centuries, and it wasn’t until the discovery of antibiotics that the medical community was finally able to fight off infection and bacterial growth. In 1928, Alexander Fleming discovered penicillin from the mold Penicillium notatum, which inhibited the growth of Staphylococcus aureus, a bacterium common in many diseases (1). Modern chemotherapy originated in Germany in the early twentieth century under Paul Ehrlich, who began looking for a “magic bullet” that he speculated would selectively destroy pathogens but not the host organism (1). Originally, antibiotics were seen as a “miracle cure”, and there was no medical evidence to indicate that there was any need to be concerned with their extensive use. With the development of new antibiotics specific to certain bacteria, the field of medicine became even more successful in curing illness and disease. However, over time, it was discovered that the overuse and misuse of antibiotics had promoted the evolution of new resistant bacteria that were nearly impossible to destroy (1). These bacteria, which can be invulnerable to the antibiotics currently available, could potentially lead to massive outbreaks of untreatable, life-threatening disease.
(1) Tortora, Gerard J., Funke, Berdell R., and Case, Christine L. 2010. Microbiology: An Introduction (10th Edition), Pearson Benjamin Cummings, San Francisco, CA. p. 554.
=<font color="white">Internet=
The Internet initially began in the 1960s to allow for a globally interconnected set of computers that could provide quick and easy access to various data and programs for users (1). By the late 1980s, progress in the development of the Internet included networks that revolutionized the world of computers and communication by bringing about the invention of the World Wide Web by European scientists (1). In 2010, the Web is such a major part of everyday life that it has become somewhat of a necessity for successful social interaction. Who would have thought that a scientific innovation spearheaded by numerous MIT researchers would evolve into such a sophisticated system, allowing almost anyone to view every type of multimedia on their computer?
(2) Brignall T.W., Valey T.V. 2005. The Impact of Internet Communications on Social Interaction. Sociological Spectrum. 25: 335-348.
=<font color="white">Nuclear Power=
Radioactive materials were first discovered around the turn of the twentieth century; it was known that some atoms displayed signs of “radioactivity”, but specific details were lacking. After scientists realized how to rearrange atoms using neutrons, it did not take long before the German chemists Otto Hahn and Fritz Strassmann bombarded isotopes of uranium with thermal neutrons, revealing traces of barium and the first glimpses of nuclear energy technology (1). This newfound knowledge sparked the interest of Niels Bohr and Francis Perrin, who both worked on methods of slowing down the decaying process of neutron emission just before World War II broke out (1). Werner Heisenberg’s student, Rudolf Peierls, took over this work, building on Perrin’s theories (1). Since Peierls worked for the German energy project, his team began working on ways of integrating this technology into warfare. Although they were unsuccessful, both the United States and Britain saw this development as challenging and significant and were spurred to carry on developing the atomic bomb (1). During the war, governmental efforts towards such research increased drastically, and on July 16, 1945, the United States tested their first atomic bomb in New Mexico (1). Not long after, in August of 1945, the United States put two bombs into service and detonated them over the Japanese cities of Hiroshima and Nagasaki (1). These two events remain the only active deployments of nuclear weapons in war. An estimated 90,000 to 166,000 people died in Hiroshima and 60,000 to 80,000 died in Nagasaki (1). Most of the deaths were from the effects of burns and radiation sickness, with the balance coming as a result of other injuries and illness.

After the end of World War II, efforts once again turned to energy production.
http://www.inl.gov/factsheets/ebr-1.pdf
=<font color="white">Conclusion=
It has been very enlightening to look at scientific inventions of the past from an ethical point of view. No matter which area of discovery the U of L iGEM team researched, whether cloning, antibiotics, the internet, or nuclear power, it was obvious that there were social, environmental, economic, and legal implications that always needed to be addressed. Our synthetic biology project is no different. What we have learnt from our investigation is that it is important to act proactively and anticipate problems that may arise and become a concern for the safety of all living things. Information about synthetic biology needs to be readily available so the public is knowledgeable about how it is being used and the discoveries that are being made to improve their well-being. In all scientific research, a good public perception is vital, since it will be the people, through their corporate sponsorship and elected officials in government, who will support and ultimately finance worthwhile projects involving synthetic biology.