Impact Of The BP Oil Spill In The Gulf Of Mexico Sample Assignment

Introduction

Things went badly awry for British Petroleum in the Gulf of Mexico on the late afternoon of 20 April 2010, when its contractor Transocean was drilling a new well on Mississippi Canyon Block 252, in which BP held the controlling stake. A fire broke out first and then gave way to an enormous blast. On the third day the rig sank, triggering a series of spill-response operations that have so far involved more than 3,000 people, a fleet of ships and aircraft, chemical dispersants, and a succession of well interventions. Eleven workers were killed and seventeen others were injured. Politics has also found its way into the issue.

It is now apparent that the ultimate solution lies in capping the opening, which sits 5,000 feet beneath the surface. So far, three interventions have been attempted, all unsuccessful, because operations at these water depths have never been faced before. The equipment meant to prevent such a leak, known as the blowout preventer (BOP), failed, and the oil engineering sector as a whole has been unable to mount an effective response in deep-sea surroundings1.

Thesis Statement

Several measures have so far been taken to mitigate the effects of the BP oil spill in the Gulf of Mexico, alongside the ongoing efforts to cap the well that is leaking the oil.

Topic Sentence

Capping the well that is spilling oil is the most difficult and challenging measure needed to stop the disaster completely, since it involves a series of complex technological stages.

Discussion

The petroleum industry has been pushing exploration into ever deeper subsea environments for close to two decades, and an episode of this kind has never been experienced before. The deepest operations attempted to date have reached 10,000 feet below the water surface and 35,000 feet below the seabed; this particular incident lies 5,000 feet below the water surface and 18,000 feet below the seabed. Even so, the attempts to cap the leak have pushed petroleum engineering expertise to its limits.

So far, every procedure to stem the spill on the seabed and along the American shoreline is being administered from the command hub at Mobile, Alabama. More than sixteen response ships and aircraft are currently deploying compounds to scatter the oil that has reached the surface, while other efforts are under way below the water. There are therefore two approaches to the problem: the surface response, which relies on dispersants and booms, and the effort to cap the fissure at its source, known as the subsea method2.

Under the subsea method, only two procedures have been used: first, to reduce and control the oil leaking from the fissure, and second, to shut it off completely. When the incident was first reported, the initial procedure was to maneuver the blowout preventer to shut the fissure by means of eight remotely operated vehicles (ROVs). However, this intervention failed.

Concurrently, BP was injecting dispersants directly into the oil escaping at the seabed, pumped down from ships and applied by the remotely operated vehicles using wands. In effect, the dispersant is forced down through a conduit and injected with a wand at the point where the oil pours out. Chemically, the oil mixes with the dispersant and is broken into smaller globules3.

Furthermore, BP also attempted subsea collection and recovery of the spilled oil, an effort that proved even more testing. Initially this meant lowering a large, dome-shaped structure called a cofferdam over the leaking oil and drawing the oil up through a conduit at its top to a surface vessel, the Enterprise. Simply put, the cofferdam covered the escaping oil just above the fissure, from where it could be pumped through a pipe to a reservoir at the surface. Again, the effort proved futile because of the unexpected formation of vast quantities of gas hydrates. The oil could never flow through the conduit, and although steam was applied to the outside to warm it, the hydrates were too extensive4.

After the disappointment of this method, the next logical step was to capture the escaping oil on a much smaller and more manageable scale. The large cofferdam would be replaced with a miniature one that is easier to control and also more productive, because the quantity of water entering the device is reduced. The reasoning is that if the water is kept away from the oil, the hydrates simply will not form; alternatively, ethanol would be used to keep water out entirely. Before the miniature cofferdam was tried, however, a riser insertion tool was fitted into the riser conduit, and this proved hugely effective. The tool has since been re-inserted, making the collection and pumping processes faster.

The next stage is to shut the fissure completely, given that the oil spill is now largely under control. The most promising method of sealing it is in the offing. It involves detaching the control pods on the outside of the blowout preventer and bringing them to the surface so that the vital mechanical parts can be overhauled and repaired. They are then reinstalled once their control and operation are simple and straightforward. Fluid sealants would then be injected into the well, and if other effective compounds are discovered and used in time, the capping should succeed. From there BP can clean up the affected areas and move on.

Bibliography

  1. National Research Council (U.S.), Ocean Studies Board. (2005). Oil spill dispersants: Efficacy and effects. Washington, DC: National Academies Press.
  2. National Research Council (U.S.), Committee on Oil Spill Risks from Tank Vessel Lightering. (1998). Oil spill risks from tank vessel lightering. Washington, DC: National Academies Press.
  3. Payne, R., & Farlow, J. S. (2003). Oil spill dispersants: Mechanisms of action and laboratory tests. Boca Raton, FL: CRC Press.
  4. Tunnell, W. T., Darryl, F. L., & Earle, S. (2009). Gulf of Mexico origin, waters, and biota: Biodiversity. College Station, TX: Texas A&M University Press.

Footnotes

  1. National Research Council (U.S.), Ocean Studies Board. (2005). Oil spill dispersants: Efficacy and effects. Washington, DC: National Academies Press.
  2. National Research Council (U.S.), Committee on Oil Spill Risks from Tank Vessel Lightering. (1998). Oil spill risks from tank vessel lightering. Washington, DC: National Academies Press.
  3. Payne, R., & Farlow, J. S. (2003). Oil spill dispersants: Mechanisms of action and laboratory tests. Boca Raton, FL: CRC Press.
  4. Tunnell, W. T., Darryl, F. L., & Earle, S. (2009). Gulf of Mexico origin, waters, and biota: Biodiversity. College Station, TX: Texas A&M University Press.

Public Health Informatics And Technology Integration

Introduction

In public health, informatics is defined as the systematic application of information and computer science and technology to public health practice, research, and learning (Carroll, 2003). As an engineering discipline, health informatics requires knowledge drawn from several related fields, particularly information science, computer science, psychology, and communications (Carroll, 2003). Whereas medical informatics centers on the care of individual patients, public health informatics integrates information science and technology to improve the health of whole populations.

Analysis

Public health informatics inputs are the data fed to a system to begin a particular process aimed at producing specific services or products (Carroll, 2003). Several kinds of input are required in the health informatics sciences, among them doctors' notes, patient documents, laboratory results, and records of supplies and equipment. Similarly, public health informatics throughput refers to the amount of work that an informatics computer system can handle in a given period. In practice, throughput can be used to compare the effectiveness of several large public health informatics computers running many systems concurrently (Carroll, 2003).
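As a rough illustration of the throughput idea, the Python sketch below times how many hypothetical patient records a simple routine can process per second; the record format and the process_record function are assumptions invented for the example, not part of any system described by Carroll.

```python
import time

def process_record(record):
    # Placeholder processing step: normalize a hypothetical lab-result record.
    return {"patient_id": record["patient_id"],
            "result": record["result"].strip().upper()}

# Hypothetical input: 50,000 lab-result records fed into the system.
records = [{"patient_id": i, "result": " positive "} for i in range(50_000)]

start = time.perf_counter()
processed = [process_record(r) for r in records]
elapsed = time.perf_counter() - start

# Throughput is the amount of work handled per unit of time.
print(f"{len(processed)} records in {elapsed:.2f} s "
      f"({len(processed) / elapsed:.0f} records/second)")
```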

In both computing and public health informatics, output refers to the communication between an information processing system and the outside world (Armoni, 2000). Through output devices, the information processed by the computer system is relayed to the observer on a screen or in printed form. Among the most popular output devices in public health informatics are monitors and printers. Devices that link several computer networks are considered both input and output devices.

The data behind health informatics systems are housed in a data warehouse (Armoni, 2000). Through this combined database, different people can carry out their research analyses at the same time. Data from the major sources are extracted, classified, transformed, and made available by their administrators for data mining, online analytical processing, and decision support (Armoni, 2000).
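A minimal sketch of that extract-classify-transform-load flow is given below, assuming a hypothetical lab_results.csv export and a local SQLite file standing in for the warehouse; both names are illustrative rather than part of any cited system.

```python
import csv
import sqlite3

# Extract: read hypothetical lab results exported from a source system.
with open("lab_results.csv", newline="") as f:  # assumed file name
    rows = list(csv.DictReader(f))

# Classify/transform: keep complete rows and normalize the fields of interest.
cleaned = [
    {"patient_id": r["patient_id"],
     "test": r["test"].strip().lower(),
     "result": r["result"].strip()}
    for r in rows
    if r.get("patient_id") and r.get("result")
]

# Load: make the data available in a warehouse table for later analysis.
conn = sqlite3.connect("warehouse.db")  # assumed warehouse location
conn.execute("CREATE TABLE IF NOT EXISTS lab_results "
             "(patient_id TEXT, test TEXT, result TEXT)")
conn.executemany("INSERT INTO lab_results VALUES (:patient_id, :test, :result)",
                 cleaned)
conn.commit()
conn.close()
```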

Public health informatics specialists also enter the data used in hospital informatics systems. Health informatics experts specialize in patient data security, coding, privacy, and records administration. These personnel are employed by hospitals to help manage patient data and to ensure the accuracy of public health care records (Hovenga, 2010).

Public health organizations must protect their information to ensure that the confidentiality and privacy of individuals are upheld. Safeguarding individuals' data privacy and confidentiality is therefore an important undertaking (Hovenga, 2010). To protect the integrity of information and guard against unauthorized access, public health organizations need to employ several security measures, among them passwords, smart cards, biometrics, and cryptography. An organization must also be watchful for possible intrusions into its computer systems, particularly systems that run online. To guard against malicious attacks from the internet, proxy servers, session password mechanisms, and firewalls should be established (Hovenga, 2010).
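As a small sketch of the first of those measures, the snippet below stores passwords as salted hashes using Python's standard library, so the plain text is never kept; the iteration count and sample password are illustrative assumptions, not a prescribed configuration.

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a salted hash so plain-text passwords are never stored."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return salt, digest

def verify_password(password, salt, expected):
    """Check a login attempt against the stored salt and hash."""
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, expected)

salt, stored = hash_password("correct horse battery staple")          # sample password
print(verify_password("correct horse battery staple", salt, stored))  # True
print(verify_password("wrong guess", salt, stored))                   # False
```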

Recommendation

Over the last few decades, advances in health informatics have produced numerous health benefits (Hovenga, 2010). To make the most of the technology, far more investment is needed not only in hardware and software but also in developing an effective informatics workforce. Health informatics experts reach their full potential when they work as a team. Similarly, the growing reliance on data and communication networks has raised concerns about who controls and who can access such data, so appropriate privacy and security measures should be put in place. As internet connectivity expands, global health care costs should fall significantly through improvements in the delivery and effectiveness of health care services (Hovenga, 2010). If the trend continues into the near future, we may accumulate enough evidence that informatics improves the health care system.

References

Armoni, A. (2000). Healthcare information systems: Challenges of the new millennium. Hershey, PA: Idea Group Publishing.

Carroll, P. W. (2003). Public health informatics and information systems. New York: Springer.

Hovenga, E. J. (2010). Health informatics: An overview (2nd ed.). Amsterdam: IOS Press.

Forensic Psychology: Quantitative Vs Qualitative

In forensic psychology, both quantitative and qualitative research designs can be used when the available data come in qualitative form, as words or categories. Depending on the purpose of the research, the data can be coded to identify themes, as in qualitative research, or transformed into numbers with the help of scales or other instruments for quantitative research (Gravetter & Forzano, 2011, p. 25; Stangor, 2014, p. 16). To compare the quantitative and qualitative research designs, it is necessary to identify similarities and differences in the methods and data analysis used in two different articles.

In their research, Cole and Sprang aimed to study differences in professionals' awareness of sex trafficking and their experience of working with victims of sex trafficking in different types of communities. The study was conducted by means of a telephone survey in which the researchers interviewed professionals (Cole & Sprang, 2015, p. 115). The collected qualitative data were classified and then transformed into numerical form so that statistical analysis could be carried out and the participants' responses compared for possible differences. After conducting bivariate tests of association, chi-square tests, and a one-way ANOVA, the authors analyzed the data and drew conclusions about differences in the awareness and experience of professionals from metropolitan, micropolitan, and rural communities. The hypothesized similarities and differences were found and supported.
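To make that analytic step concrete, the sketch below runs the same kinds of tests the authors report, a chi-square test of association and a one-way ANOVA, on entirely invented numbers; the counts and scores are illustrative assumptions, not the study's data.

```python
from scipy import stats

# Hypothetical contingency table: professionals, by community type, who report
# having worked with a trafficking victim (yes / no).
observed = [[40, 10],   # metropolitan
            [22, 18],   # micropolitan
            [12, 28]]   # rural

chi2, p, dof, expected = stats.chi2_contingency(observed)
print(f"Chi-square = {chi2:.2f}, df = {dof}, p = {p:.4f}")

# Hypothetical awareness scores for a one-way ANOVA across the three community types.
metro = [8, 9, 7, 9, 8, 10]
micro = [7, 6, 8, 7, 6, 7]
rural = [5, 6, 4, 5, 6, 5]

f_stat, p_anova = stats.f_oneway(metro, micro, rural)
print(f"F = {f_stat:.2f}, p = {p_anova:.4f}")
```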

Peled and Parker conducted a qualitative naturalistic study to examine the mothering experiences of women who had become victims of sex trafficking. Eight women from the former Soviet Union who had been trafficked to Israel took part in a series of in-depth interviews with different types of questions. To analyze the participants' responses, the researchers chose a standard qualitative data analysis procedure: they coded the collected data to identify themes in the narratives and distinguished such types of mothering as "the good mother", "the sacrificing mother", and "the mother who wants for herself" (Peled & Parker, 2013, p. 576). The findings helped the authors draw conclusions about the role of the sex trafficking experience in motherhood.
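The coding step can be pictured with the toy sketch below, which tallies how often each theme appears across a set of hypothetical coded interview excerpts; the excerpts themselves are invented, and only the three theme labels come from the article.

```python
from collections import Counter

# Hypothetical coded excerpts: each interview passage has already been assigned
# one of the mothering themes reported by Peled and Parker (2013).
coded_excerpts = [
    ("P1", "the good mother"),
    ("P1", "the sacrificing mother"),
    ("P2", "the good mother"),
    ("P3", "the mother who wants for herself"),
    ("P4", "the sacrificing mother"),
    ("P4", "the good mother"),
]

# Tally how often each theme appears across the narratives.
theme_counts = Counter(theme for _, theme in coded_excerpts)
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} excerpt(s)")
```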

The two studies are similar in terms of the data of interest to the researchers and the procedures used to collect qualitative data. Although the studies differ in type, both sets of researchers focused on gathering qualitative data about the participants' feelings and experiences. The data were collected through interviews and then coded. However, Cole and Sprang were interested in determining the actual percentage of professionals with different experiences. Thus, the qualitative data were transformed into numerical form, and the statistical analysis typical of quantitative research was used to compare the professionals' experiences (Crighton & Towl, 2015, p. 41). In contrast, qualitative data analysis was used by Peled and Parker, who needed to identify the participants' feelings with a focus on categories rather than numbers.

Comparing the quantitative and qualitative research designs used in psychology makes it possible to conclude that a focus on quantitative data and statistical analysis is necessary when actual statistical results are expected. This approach is effective for surveys whose data can be generalized for future research and conclusions. By contrast, the qualitative research design is effective for representing data about feelings, experiences, and knowledge. Qualitative research is appropriate when the investigator is interested in the participants' perceptions and actual words.

References

Cole, J., & Sprang, G. (2015). Sex trafficking of minors in metropolitan, micropolitan, and rural communities. Child Abuse & Neglect, 40(1), 113-123.

Crighton, D., & Towl, G. (2015). Forensic psychology. New York, NY: John Wiley & Sons.

Gravetter, F., & Forzano, L. (2011). Research methods for the behavioral sciences. New York, NY: Cengage Learning.

Peled, E., & Parker, A. (2013). The mothering experiences of sex-trafficked women: Between here and there. American Journal of Orthopsychiatry, 83(4), 576-587.

Stangor, C. (2014). Research methods for the behavioral sciences. New York, NY: Cengage Learning.