Thursday, November 28, 2019

The AIDS virus is one of the most deadly and most widespread diseases

The AIDS virus is one of the most deadly and most widespread diseases in the modern era. The disease was first identified in 1981, as doctors around the United States began to report groups of young, homosexual men developing a rare pneumonia caused by an organism called Pneumocystis carinii. These patients then went on to develop many other new and rare complications that had previously been seen only in patients with severely damaged immune systems. The Centers for Disease Control in the United States named this new epidemic the acquired immunodeficiency syndrome and defined it by a specific set of symptoms. In 1983, researchers finally identified the virus that caused AIDS. They named it the human immunodeficiency virus, or HIV. AIDS causes the immune system of the infected patient to become much less efficient until it stops working altogether.

The first drug approved by the American Food and Drug Administration for use in treating the AIDS virus is called AZT, which stands for azidothymidine. AZT was released under the brand name Retrovir, and its chemical name is zidovudine, or ZDV. The structural name of AZT is 3′-azido-3′-deoxythymidine.

AZT works by interrupting the copying of genetic material. More specifically, AZT targets reverse transcriptase, the enzyme HIV uses to copy its RNA genome into DNA so that it can go on to infect more cells. AZT closely resembles thymidine, one of the nucleotides from which DNA is built. When reverse transcriptase incorporates AZT into a growing DNA strand in place of thymidine, the strand can grow no further, because AZT lacks the attachment point the next nucleotide needs. The copy is cut short, and the virus cannot complete the DNA it requires to replicate.

AZT was originally developed over 20 years ago for the treatment of leukemia. The concept was that AZT would terminate DNA synthesis in the growing leukemia lymphocytes, thereby stopping the disease. AZT was rejected at that point because it failed to lengthen the lives of test animals.

The problem with AZT is that it is not perfect. First of all, AZT will not block every reverse-transcription event in the body, and therefore it cannot shut down HIV production completely; putting enough AZT into a patient to do so would probably kill the patient. The second, and most serious, problem with AZT is that it also enters normal, healthy cells and interferes with their own DNA-copying machinery, inhibiting their ability to produce new, healthy cells. AZT does act preferentially on HIV-infected cells to a certain degree, so it does not kill every cell it enters. However, it does kill a high proportion of the cells it enters, giving it a high toxicity level.

The formula for AZT is C10H13N5O4. The molar mass of AZT is 267.24 grams per mole, and its melting point is between 106 °C and 112 °C.
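The quoted molar mass can be verified from the formula with standard atomic masses:

$$M(\mathrm{C_{10}H_{13}N_5O_4}) = 10(12.011) + 13(1.008) + 5(14.007) + 4(15.999) \approx 267.24\ \mathrm{g/mol}$$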
AZT is soluble in water, which is important so that it can dissolve into the blood and be distributed to the cells. AZT is usually taken in pill form, but it is also absorbed through the skin, which can make it dangerous for people handling the drug.

There is quite a bit of controversy about the effectiveness of AZT. Most experts agree that AZT delays the progression of HIV disease; the drug may also prolong the disease-free survival period. However, many doctors still disagree with using AZT as a treatment for AIDS. Peter Duesberg, a professor of molecular biology at the University of California, Berkeley, says that "In view of this [the cytotoxicity level of AZT], there is no rational explanation of how AZT could be beneficial to AIDS patients, even if HIV were proven to cause AIDS." This comment stems from the fact that AZT has a very high cytotoxicity level, which means that while it kills infected cells, it also kills perfectly healthy ones. According to Dr. Duesberg, AZT kills approximately nine hundred and ninety-nine healthy cells for every infected cell it kills.
Most of the opposition to AZT stems from the fact that the initial testing of the drug had severe problems. These initial tests were performed with two groups of AIDS patients. The volunteering patients were divided into two groups using a double-blind system, in which neither the patients nor the doctors know who is in the placebo (control) group and who is in the AZT group. The tests were performed by the FDA at twelve medical centers throughout the United States. The study became unblinded almost immediately, as some patients discovered a difference in taste between the placebo and AZT capsules, and other patients took the capsules to chemists to have them analyzed. The doctors found out the difference between AZT patients and placebo patients through very obvious differences in blood profiles. An FDA meeting was convened and the decision was made to keep the compromised data; the bad data was thrown in with the good, which ended up making all of the data virtually useless. In fact, according to some sources, AZT ended up shortening the lifespans of many of the patients taking it. AZT is also thought to be a possible carcinogen, although it has not been around long enough for any conclusive results to be obtained. After AZT was approved for use, mortality statistics were taken; they showed a mortality rate of 10% after 17 weeks, with an original pool of 4,805 patients. The FDA tests, with their skewed statistics, had shown only a 1% mortality rate. AZT also had some strange side effects reported with its use, such as raising the IQs of 21 children who took the drug by 15 points; 5 of those children died.

The newest treatments combine AZT with other drugs, such as ddI. These tests were performed, once again, in the double-blind format, just like the original FDA tests. Three groups were tested: one taking only AZT, one taking only ddI, and one taking a combination of both. The Data Safety Monitoring Board (DSMB), an organization that monitors all such testing in the United States, secretly unblinded the test, as it does with all double-blind tests, and found that the AZT patients had a much higher mortality rate than those in the ddI-only and combination groups. The DSMB found the difference to be large enough to stop the trials early.

In August of 1994, the FDA approved AZT for use by pregnant, AIDS-infected women. Once again the trial was conducted double-blind and placebo-controlled, with therapy begun 14-34 weeks into pregnancy. In this testing it was found that among the AZT mothers the AIDS transmission rate to the babies was about 8.3%, while in the placebo group it was about 25.5%. AZT was therefore reducing transmission by about two thirds.
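That "two thirds" figure checks out arithmetically:

$$\frac{25.5\% - 8.3\%}{25.5\%} \approx 0.67$$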
It is still not clear how effective AZT is at stopping or slowing the progress of the AIDS virus. Most experts today consider AZT a valid way to treat AIDS and HIV infection, but they are constantly experimenting with new combinations of drugs, such as ddI and AZT, to try to better treat AIDS patients. The massive administrative errors in the initial testing set AZT research back and fostered unlooked-for antipathy. As the treatments become more sound and more reliable, AZT will find its place in AIDS treatment.

EndNotes

1. Lauritsen, John. Poison by Prescription: The AZT Story. New York: Asklepios Publishing, 1990. p. 7.
2. Lauritsen, Poison by Prescription, p. 7.
3. Lauritsen, Poison by Prescription, p. 23.
4. Lauritsen, Poison by Prescription, p. 49.
5. Whitmore, Arthur. "AZT Approved for Preventing Maternal-Fetal HIV Transmission." http://www.hivpositive.com/f-DrugAdvisories/II-FDA/4.htm. August 8, 1994.

Bibliography

Lauritsen, John. Poison by Prescription: The AZT Story. New York: Asklepios Publishing, 1990.
Pinsky, Laura, Paul Harding Douglas, and Craig Metroka. The Essential HIV Treatment Fact Book. New York: Simon & Schuster, 1992.
Kaiser, Jon D. Immune Power: A Comprehensive Treatment Program for HIV. New York: St. Martin's Press, 1993.
Whitmore, Arthur. "AZT Approved for Preventing Maternal-Fetal HIV Transmission." http://www.hivpositive.com/f-DrugAdvisories/II-FDA/4.htm, August 8, 1994.
Whitmore, Arthur. "FDA Grants Accelerated Approval for 3TC With AZT to Treat AIDS." http://www.hivpositive.com/f-DrugAdvisories/II-FDA/17.htm, November 20, 1995.
Clark, Martina. "AZT: Pediatric Study Changed." W.O.R.L.D., A Newsletter About Women and HIV, April 22, 1995, http://www.out.org/HIV/AZT_pediatric_study_changed.htm

Sunday, November 24, 2019

Unknowingly Living With Mental Illnesses

Unknowingly Living With Mental Illnesses

Mental illnesses have increased over the years, and people have accepted them into their daily routines, so much so that coping with them almost seems normal. But what people do not often ask themselves is: how did this ailment begin? In the story "Never Marry a Mexican" by Sandra Cisneros, she introduces a character named Clemencia, a troubled young woman with a history of family problems and a natural instinct to self-destruct. I am writing about this story because, in a way, Clemencia and I have similarities. We both had a mother who cheated on her husband, and we both have emotional scars that were never properly healed. A trauma like this stays with a person for a while, and sometimes forever, especially if they do not learn how to forgive, accept, and move on.

A child's adolescence is the foundation of his or her future character; you can either let nature raise a child or you can nurture them, but however you choose, some kind of psychology will be applied to their environment. Clemencia grew up in more of a "nature" environment, left to fend for herself, and it seems to have caused her psychological distress: "psychological discomfort that interferes with your activities of daily living. Psychological distress can result in negative views of the environment, others, and the self" (study.com). Some symptoms of psychological distress are obsessive thoughts or compulsions and reckless acts. Clemencia did not show much happiness throughout the affair, because she knew Drew was a man who would never stay with her when the sun rose; no matter how many times she asked him to leave his wife for her, he would not. But she did find joy in secretly taunting Megan by having a romantic affair with her husband. Reckless acts like Clemencia's are not considered normal and are rather unhealthy. She knew her actions were meant for bad intentions and pursued them to feel satisfaction. Though the story never says Clemencia has a mental illness, it could be one explanation for her erratic behavior.

When her father passed away, she was not given the proper care to grieve and move on. She did not have the support of her mother to hold her and tell her everything would be okay. When her mother passed away, Clemencia felt nothing anymore: not towards her mother, not towards her students, and not towards the affair she was having with Drew. A symptom like this is a sign of a sociopath: "a person with a psychopathic personality whose behavior is antisocial, often criminal, and who lacks a sense of moral responsibility or social conscience" (Dictionary.com). A sociopath can be spotted through many observations, but a specific symptom that resembled Clemencia was "Lack of remorse and shame" (Psychology Today). When she and Drew were having an affair, Clemencia never stopped to think about how it would affect his marriage and what his wife Megan would think of it. She enjoyed being a mistress in general, but she was obsessed with Drew the most. She also never considered how this might affect her personally in the future, when she tries to mature and find a healthy relationship. Actions like hers are capable of causing her more trauma than what she has already experienced. Another symptom Clemencia had was "Absence of delusions and other signs of irrational thinking" (Psychology Today).
When Clemencia phoned Drew's house at four in the morning, Megan answered in a polite voice. Megan was not aware the call was from her husband's mistress, but this senseless act brought Clemencia an overpowering feeling of satisfaction and joy. All she could say to him was "Drew? That dumb bitch of a wife of yours... that stupid stupid stupid... Excuse me, honey. It cracked me up." (Cisneros 61). Clemencia showed her worst possible behavior when she began putting gummy bears inside Megan's personal belongings. I understand Drew should never have gotten involved with anyone if he was already married. But I do not understand how Clemencia could be so delusional as to stash gummy bears inside his wife's personal belongings, in places only Megan would look. Only a person who is emotionally distressed is capable of scheming at this level.

A sociopath tends to have an oversized ego during a relationship. Many times Clemencia tried to convince Drew he was nothing without her. "They are narcissists to the extreme, with a huge sense of entitlement" (Huffington Post). She would take credit for the man he was and spoke of herself in a pompous way. Towards the end of the story, she spoke to Drew's son, making her side of the story sound as if Drew was responsible for her acts and failures. But the very last symptom suggesting Clemencia could be a sociopath was her talk of suicide. She was not convinced, at the end of her story, that she would ever actually commit suicide. But she did speak of it and consider it, both for herself and in the possible death of others. Though the suicidal thoughts are in her mind, a sociopath never actually carries out the thought. That does not mean she is not a threat to other people. Her traumatic situation impacted her life gradually, and even though she did not act physically violent, Clemencia's past could still affect her future. Who knows what actual disorder she could have had and what risks come with it. To my readers: please be careful out there.

Cisneros, Sandra. "Never Marry a Mexican." Woman Hollering Creek, Bloomsbury, 2004, pp. 51-69.
Cooper-White, Macrina. "11 Signs You May Be Dating a Sociopath." HuffPost, 6 Dec. 2017, https://www.huffingtonpost.com/2013/08/23/11-signs-dating-a-sociopath_n_3780417.html
Thomas, M.E. "How to Spot a Sociopath." Confessions of a Sociopath, Psychology Today, 7 May 2013, https://www.psychologytoday.com/us/articles/201305/how-spot-sociopath
Williams, Yolanda. "What Is Psychological Distress?" Study.com, https://study.com/academy/lesson/what-is-psychological-distress-definition-lesson-quiz.html

Thursday, November 21, 2019

Resilient Cultures by John Kicza

The natives of the east and the north adopted maize farming from the Mexicans, the natives of the American southwest (30). The Europeans had failed to change the farming system of the northerners for several years. My thought was that the natives of the east and the north had resisted the European farming system for several years.

There are several religious systems in America, such as Christianity, Islam, Buddhism, and Hinduism. The native Indians in America had their own religious beliefs before the coming of the Europeans. Slave trade had existed in several European countries before they colonized America, and some of the slaves were taken to work on plantations in America (60). These slaves were captured from different regions in Africa, and some from Asia. It is possible that these religious beliefs were introduced through the interaction between the slaves and the American natives. Christianity was dominant among the Europeans and was spread across the world. This means the Europeans introduced Christianity to the Americans, and the slaves introduced other religions. Some of the slaves settled permanently in America and had to establish their own places of worship.

In the seventeenth century, the eastern section of America began building the 13 colonies, and Virginia was established in 1607. This shows that as Europe and the Ottoman Empire were dominating, America was advancing. The Americans had acquired architectural design skills from the Europeans. Historically, when the Ottoman Empire controlled the Mediterranean, it took silk, spices, porcelain and other valuables from Europe. Applying the same to America, Christopher Columbus was not only looking for a shorter route to China but was also looking for valuable skills and spices (78). This gave the Americans the wealth and the skills needed to build the colonies and develop the economy of the country.

Wednesday, November 20, 2019

Waiting for Superman (2010) Film

The director of the movie is Davis Guggenheim, and the producer is Lesley Chilcott. Several students are followed as they struggle to be accepted into charter schools. The Audience Award honored the film in 2010 as one of the best documentaries. This paper focuses on the film Waiting for Superman (2010).

In the film, several problems are identified as impediments to quality education. One problem is the tiresome process one must undergo in order to get a place in the schools thought to be the best performing. The best schools that have spaces depend on a lottery for enrollment, denying a chance to many students. Consequently, students are forced to go through schools whose performance ranks low in all aspects of academics. The other significant policy problem identified is the bureaucracy of the teachers union. Most teachers are unable to inject their knowledge to support students in achieving satisfactory grades. Furthermore, teachers identified as contributing to poor student performance are not fired; they are protected by security of tenure, which is easily acquired after two years of teaching. The other problem identified by the movie is the perception that an individual's background determines his or her level of performance. Guggenheim disputes this and notes that if such individuals are exposed to formal education standards and have exceptional teachers, they are more likely to make it to college. There is also a lack of motivation for exceptionally performing teachers, as their pay is standardized (Participant Media and Weber 17-22).

Many causes have led to the policy problems identified by the film. The most notable is the bureaucracy of the teachers union. Even after a poorly performing teacher is identified, it takes a long time for that teacher to be expelled, and such a teacher continues receiving money from the exchequer. In addition, the union contract is an impediment to school reforms. Another challenge is the issue of United States standardized test scores, which have continued to fall since the early seventies, affecting performance significantly. Moreover, charter schools enjoy certain provisions that public schools do not: they have the right to longer school days and school years, while these are reduced in public schools. The film also identifies failure in public schools because of the strict mechanisms applied to the students; some rules in public schools, according to the film, are conservative and retrogressive (Participant Media and Weber 17-22).

According to Guggenheim's film, quality education is composed of great teachers, prepared students, excellent schools and an increased level of literacy. Furthermore, it claims such quality education produces outstanding graduates. There are several proposed policy instruments for quality education. The most important is the motivation of teachers based on their performance; the film proposes that such teachers be given some form of incentive, such as a salary increase. The other important step is to raise education standards to international levels. The film also proposes an increase in literacy rates, calls for providing a successful school experience for all students, and supports the establishment of more charter schools to increase enrollment. It also advocates getting rid of the teachers union.

Monday, November 18, 2019

Art

It is true that media artists today consider both aesthetics and ethics in their work. In the current investigation, my point of view is that the most important thing in a work of art is its ethical qualities, because no matter how hard one tries, one cannot get away from ethics. To me, concentrating wholly on form and ignoring ethics is basically blinding oneself to a critical aspect of artistic creation and (especially) consumption. Television, movies, and other media arts are judged in terms of morality and ethical qualities as a foremost consideration; to ignore this aspect borders on folly. Media arts are made for public consumption, and part of this public consumption is the ethical judgment of the genre and the product. Television must pass censorship standards, and movies must pass ratings standards. When people watch a movie, especially a mass-market production, they are often watching an ethical polarization of good and evil (good guys vs. bad guys). And if there is ambiguity about who is the bad guy and who is the good guy, this is also a moral or ethical question. "From our casual conversations about the moral status of cinematic villains and heroines, through debates about the effects of the portrayal of violence by Hollywood International, to arguments about the portrayal of sex and sexuality, film talk is intimately tied up with ethical concerns and evaluations" (Meskin, 2009). The same can be said of talk about other media arts, especially television, which seeks to appeal to a mass-market kind of morality or ethics. To say that media art is only about formalism is all well and good in a very limited art-house context, but to consider the real situation, one must accept the mass-media perspective. In this perspective, ethics and morality are explicitly involved in the consumption of media arts. One cannot subtract ethics from this consideration; to do so would be to lose a large piece of the puzzle in terms of how human beings react.

Friday, November 15, 2019

Development of Peer-to-Peer Network System

Development of Peer-to-Peer Network System

Procedures we followed to complete this project:

Task 01: Familiarize ourselves with the equipment and prepare an action plan.
Task 02: Prepare the work area.
Task 03: Fix the hardware components and assemble the three PCs.
Task 04: Install a NIC in each PC.
Task 05: Cable the three computers and configure the peer-to-peer network using a hub or switch.
Task 06: Install the Windows operating system on each PC.
Task 07: Install and configure the printer on one of the PCs.
Task 08: Share the printer with the other PCs in the LAN.
Task 09: Establish one shared folder.
Task 10: Create a test document on one of the PCs and copy the file to each of the other PCs in the network.
Task 11: Test the printer by printing the test document from each of the networked PCs.

Time allocation for the tasks:

Task No.   Time allocation
Task 01    1 hour
Task 02    30 minutes
Task 03    1 ½ hours
Task 04    1 ½ hours
Task 05    1 ½ hours
Task 06    3 hours
Task 07    15 minutes
Task 08    15 minutes
Task 09    15 minutes
Task 10    10 minutes
Task 11    5 minutes
Total      10 hours

Physical structure of the proposed peer-to-peer network system:

In a peer-to-peer network there are no dedicated servers and no hierarchy among the computers. Each user decides who may access the resources on his or her machine. A quick way to verify the cabling, folder-sharing, and file-copying steps is sketched below.
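The following script, run from one of the PCs once Windows is installed, checks Tasks 05, 09 and 10. It is only a minimal sketch: the host name PC1, the share name shared, and the use of Python are assumptions standing in for whatever names the network actually uses.

```python
import os
import socket

# Hypothetical names -- substitute the host and share names
# configured in Tasks 05-09.
PEER_HOST = "PC1"
SHARE_PATH = r"\\PC1\shared"

# Task 05 check: can we reach the peer at all?
# Port 445 is the standard Windows file-sharing (SMB) port.
try:
    with socket.create_connection((PEER_HOST, 445), timeout=3):
        print(f"{PEER_HOST} is reachable over SMB.")
except OSError as exc:
    print(f"Cannot reach {PEER_HOST}: {exc}")

# Tasks 09/10 check: is the shared folder visible, and can we
# copy a test document into it?
if os.path.isdir(SHARE_PATH):
    test_file = os.path.join(SHARE_PATH, "test_document.txt")
    with open(test_file, "w") as fh:
        fh.write("Peer-to-peer network test.\n")
    print(f"Wrote {test_file} successfully.")
else:
    print(f"Share {SHARE_PATH} is not visible from this PC.")
```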
Processors

In 1945, the idea of the first computer with a processing unit capable of performing different tasks was published by John von Neumann. The computer was called the EDVAC and was finished in 1949. These first primitive computer processors, such as the EDVAC and the Harvard Mark I, were incredibly bulky and large: their processing circuitry was built from thousands of individual components. Starting in the 1950s, the transistor was introduced into the CPU. This was a vital improvement because transistors removed much of the bulky material and wiring and allowed for more intricate and reliable CPUs. The 1960s and 1970s brought the advent of microprocessors. These were very small, with circuit features measured in millionths of a metre, and much more powerful. Microprocessors helped this technology become much more available to the public due to their size and affordability. Eventually, companies like Intel and IBM helped shape microprocessor technology into what we see today. The computer processor has evolved from a big, bulky contraption into a minuscule chip.

Computer processors are responsible for four basic operations. Their first job is to fetch the information from a memory source. Next, the CPU decodes the information to make it usable for the device in question. The third step is execution, when the CPU acts upon the information it has received. The fourth and final step is the write-back, in which the CPU makes a record of the activity and stores it.

Two companies are responsible for a vast majority of the CPUs sold around the world. Intel Corporation is the largest CPU manufacturer in the world and makes the majority of the CPUs found in personal computers. Advanced Micro Devices, Inc., known as AMD, has in recent years been Intel's main competitor in the CPU industry.

The CPU has greatly helped the world progress into the digital age. It has made possible a number of computers and other machines that are essential to our global society. For example, many of the medical advances made today are a direct result of the ability of computer processors. As CPUs improve, the devices they are used in will also improve, and their significance will become even greater.
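The four-step cycle described above can be made concrete with a toy simulator. The sketch below is not any real processor's instruction set; the three opcodes are invented purely to illustrate fetch, decode, execute, and write-back.

```python
# A toy CPU illustrating the fetch/decode/execute/write-back cycle.
# The instruction set (LOAD/ADD/HALT) is invented for illustration.

memory = [
    ("LOAD", 7),   # put the constant 7 into the accumulator
    ("ADD", 5),    # add 5 to the accumulator
    ("HALT", 0),
]
accumulator = 0
pc = 0           # program counter
log = []         # the "write back" record of each step

while True:
    opcode, operand = memory[pc]      # 1. fetch
    pc += 1
    if opcode == "LOAD":              # 2. decode ...
        accumulator = operand         # 3. ... and execute
    elif opcode == "ADD":
        accumulator += operand
    elif opcode == "HALT":
        break
    log.append((opcode, operand, accumulator))  # 4. write back

print(log)          # [('LOAD', 7, 7), ('ADD', 5, 12)]
print(accumulator)  # 12
```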
VGA

The term Video Graphics Array (VGA) refers specifically to the display hardware first introduced with the IBM PS/2 line of computers in 1987, but through its widespread adoption it has also come to mean the analogue computer display standard, the 15-pin D-sub miniature VGA connector, or the 640×480 resolution itself. While this resolution has been superseded in the personal computer market, it has become a popular resolution on mobile devices.

VGA was the last graphical standard introduced by IBM that the majority of PC clone manufacturers conformed to, making it (as of 2009) the lowest common denominator that all PC graphics hardware supports before a device-specific driver is loaded into the computer. For example, the MS Windows splash screen appears while the machine is still operating in VGA mode, which is why this screen always appears in reduced resolution and colour depth. VGA was officially superseded by IBM's XGA standard, but in reality it was superseded by numerous slightly different extensions to VGA made by clone manufacturers, known collectively as Super VGA.

VGA is referred to as an "array" instead of an "adapter" because it was implemented from the start as a single chip (an ASIC), replacing the Motorola 6845 and dozens of discrete logic chips that covered the full-length ISA boards of the MDA, CGA, and EGA. Its single-chip implementation also allowed the VGA to be placed directly on a PC's motherboard with a minimum of difficulty (it only required video memory, timing crystals and an external RAMDAC), and the first IBM PS/2 models were equipped with VGA on the motherboard.

RAM

Random-access memory (usually known by its acronym, RAM) is a form of computer data storage. Today it takes the form of integrated circuits that allow stored data to be accessed in any order (i.e., at random). The word "random" refers to the fact that any piece of data can be returned in a constant time, regardless of its physical location and whether or not it is related to the previous piece of data. By contrast, storage devices such as tapes, magnetic discs and optical discs rely on the physical movement of the recording medium or a reading head. In these devices, the movement takes longer than the data transfer, and the retrieval time varies with the physical location of the next item.

The word RAM is often associated with volatile types of memory (such as DRAM memory modules), where the information is lost after the power is switched off. Many other types of memory are RAM too, including most types of ROM and a type of flash memory called NOR-flash.

An early type of widespread writable random-access memory was magnetic core memory, developed from 1949 to 1952 and subsequently used in most computers up until the development of static and dynamic integrated RAM circuits in the late 1960s and early 1970s. Before this, computers used relays, delay-line memory, or various kinds of vacuum-tube arrangements to implement main memory functions (i.e., hundreds or thousands of bits), some of which were random access and some not. Latches built out of vacuum-tube triodes, and later out of discrete transistors, were used for smaller and faster memories such as registers and random-access register banks.

Modern types of writable RAM generally store a bit of data either in the state of a flip-flop, as in SRAM (static RAM), or as a charge in a capacitor (or transistor gate), as in DRAM (dynamic RAM), EPROM, EEPROM and flash. Some types have circuitry to detect and/or correct random faults, called memory errors, in the stored data, using parity bits or error-correction codes. RAM of the read-only type, ROM, instead uses a metal mask to permanently enable or disable selected transistors, rather than storing a charge in them. As both SRAM and DRAM are volatile, other forms of computer storage, such as disks and magnetic tapes, have been used as persistent storage in traditional computers. Many newer products instead rely on flash memory to maintain data when not in use, such as PDAs or small music players. Certain personal computers, such as many rugged computers and netbooks, have also replaced magnetic disks with flash drives. With flash memory, only the NOR type is capable of true random access, allowing direct code execution, and is therefore often used instead of ROM; the lower-cost NAND type is commonly used for bulk storage in memory cards and solid-state drives.

Similar to a microprocessor, a memory chip is an integrated circuit (IC) made of millions of transistors and capacitors. In the most common form of computer memory, dynamic random access memory (DRAM), a transistor and a capacitor are paired to create a memory cell, which represents a single bit of data. The transistor acts as a switch that lets the control circuitry on the memory chip read the capacitor or change its state.

Types of RAM

(Image captions: top, left to right, DDR2 with heat-spreader, DDR2 without heat-spreader, laptop DDR2, DDR, laptop DDR; a 1-megabit chip, one of the last models developed by VEB Carl Zeiss Jena in 1989.)

Many computer systems have a memory hierarchy consisting of CPU registers, on-die SRAM caches, external caches, DRAM, paging systems, and virtual memory or swap space on a hard drive. This entire pool of memory may be referred to as RAM by many developers, even though the various subsystems can have very different access times, violating the original concept behind the "random access" term in RAM. Even within a hierarchy level such as DRAM, the specific row, column, bank, rank, channel, or interleave organization of the components makes the access time variable, although not to the extent that access to rotating storage media or a tape is variable. The overall goal of using a memory hierarchy is to obtain the highest possible average access performance while minimizing the total cost of the entire memory system. (Generally, the memory hierarchy follows the access times, with the fast CPU registers at the top and the slow hard drive at the bottom.) In many modern personal computers, the RAM comes in easily upgraded modules called memory modules or DRAM modules, about the size of a few sticks of chewing gum. These can quickly be replaced should they become damaged or too small for current purposes. As suggested above, smaller amounts of RAM (mostly SRAM) are also integrated into the CPU and other ICs on the motherboard, as well as in hard drives, CD-ROMs, and several other parts of the computer system.
Hard Disk

A hard disk drive (often shortened to hard disk, hard drive, or HDD) is a non-volatile storage device that stores digitally encoded data on rapidly rotating platters with magnetic surfaces. Strictly speaking, "drive" refers to the motorized mechanical aspect that is distinct from its medium, such as a tape drive and its tape, or a floppy disk drive and its floppy disk. Early HDDs had removable media; however, an HDD today is typically a sealed unit (except for a filtered vent hole to equalize air pressure) with fixed media.

HDDs (introduced in 1956 as data storage for an IBM accounting computer) were originally developed for use with general-purpose computers. During the 1990s, the need for large-scale, reliable storage, independent of a particular device, led to the introduction of embedded systems such as RAID arrays, network attached storage (NAS) systems, and storage area network (SAN) systems that provide efficient and reliable access to large volumes of data. In the 21st century, HDD usage expanded into consumer applications such as camcorders, cell phones (e.g. the Nokia N91), digital audio players, digital video players, digital video recorders, personal digital assistants and video game consoles.

HDDs record data by magnetizing ferromagnetic material directionally, to represent either a 0 or a 1 binary digit. They read the data back by detecting the magnetization of the material. A typical HDD design consists of a spindle that holds one or more flat circular disks called platters, onto which the data are recorded. The platters are made from a non-magnetic material, usually aluminium alloy or glass, and are coated with a thin layer of magnetic material, typically 10-20 nm thick, with an outer layer of carbon for protection. Older disks used iron(III) oxide as the magnetic material, but current disks use a cobalt-based alloy. The platters are spun at very high speeds. Information is written to a platter as it rotates past devices called read-and-write heads that operate very close (tens of nanometres in new drives) over the magnetic surface. The read-and-write head is used to detect and modify the magnetization of the material immediately under it. There is one head for each magnetic platter surface on the spindle, mounted on a common arm. An actuator arm (or access arm) moves the heads on an arc (roughly radially) across the platters as they spin, allowing each head to access almost the entire surface of the platter. The arm is moved using a voice-coil actuator or, in some older designs, a stepper motor.

The magnetic surface of each platter is conceptually divided into many small sub-micrometre-sized magnetic regions, each of which is used to encode a single binary unit of information. Initially the regions were oriented horizontally, but beginning about 2005 the orientation was changed to perpendicular. Due to the polycrystalline nature of the magnetic material, each of these magnetic regions is composed of a few hundred magnetic grains. Magnetic grains are typically 10 nm in size and each forms a single magnetic domain. Each magnetic region in total forms a magnetic dipole, which generates a highly localized magnetic field nearby. A write head magnetizes a region by generating a strong local magnetic field. Early HDDs used an electromagnet both to magnetize the region and to then read its magnetic field by using electromagnetic induction. Later versions of inductive heads included metal-in-gap (MIG) heads and thin-film heads. As data density increased, read heads using magnetoresistance (MR) came into use; the electrical resistance of the head changed according to the strength of the magnetism from the platter.
Later development made use of spintronics; in these heads, the magnetoresistive effect was much greater than in earlier types, and was dubbed giant magnetoresistance (GMR). In today's heads, the read and write elements are separate, but in close proximity, on the head portion of an actuator arm. The read element is typically magnetoresistive while the write element is typically thin-film inductive. HDD heads are kept from contacting the platter surface by the air that is extremely close to the platter; that air moves at, or close to, the platter speed. The record and playback heads are mounted on a block called a slider, and the surface next to the platter is shaped to keep it just barely out of contact. It is a type of air bearing.

In modern drives, the small size of the magnetic regions creates the danger that their magnetic state might be lost because of thermal effects. To counter this, the platters are coated with two parallel magnetic layers, separated by a three-atom-thick layer of the non-magnetic element ruthenium, and the two layers are magnetized in opposite orientation, thus reinforcing each other. Another technology used to overcome thermal effects and allow greater recording densities is perpendicular recording, first shipped in 2005; as of 2007 the technology was used in many HDDs.

Grain boundaries turn out to be very important in HDD design. Because the grains are very small and close to each other, the coupling between adjacent grains is very strong. When one grain is magnetized, the adjacent grains tend to be aligned parallel to it or demagnetized, which sabotages both the stability of the data and the signal-to-noise ratio. A clear grain boundary can weaken the coupling of the grains and consequently increase the signal-to-noise ratio. In longitudinal recording, the single-domain grains have uniaxial anisotropy with easy axes lying in the film plane. The consequence of this arrangement is that adjacent magnets repel each other, so the magnetostatic energy is so large that it is difficult to increase areal density. Perpendicular recording media, on the other hand, have the easy axis of the grains oriented perpendicular to the disk plane. Adjacent magnets attract each other and the magnetostatic energy is much lower, so a much higher areal density can be achieved in perpendicular recording. Another unique feature of perpendicular recording is that a soft magnetic underlayer is incorporated into the recording disk; this underlayer is used to conduct the writing magnetic flux so that writing is more efficient. A higher-anisotropy medium film, such as L10-FePt or rare-earth magnet material, can therefore be used.

(Image captions: an opened hard drive with the top magnet removed, showing the copper head actuator coil; a hard disk drive with the platters and motor hub removed, showing the copper-coloured stator coils surrounding a bearing at the center of the spindle motor; the orange stripe along the side of the arm is a thin printed-circuit cable, and the spindle bearing is in the center.)

A typical hard drive has two electric motors: one to spin the disks and one to position the read/write head assembly. The disk motor has an external rotor attached to the platters; the stator windings are fixed in place. The actuator has a read-write head under the tip of its very end (near center); a thin printed-circuit cable connects the read-write head to the hub of the actuator.
A flexible, somewhat U-shaped ribbon cable continues the connection from the head to the controller board on the opposite side. The head support arm is very light but also rigid; in modern drives, acceleration at the head reaches 250 g. The top plate of the permanent-magnet and moving-coil motor swings the heads to the desired position. The plate supports a thin neodymium-iron-boron (NIB) high-flux magnet. Beneath this plate is the moving coil, often referred to as the voice coil by analogy to the coil in loudspeakers, which is attached to the actuator hub; beneath that is a second NIB magnet, mounted on the bottom plate of the motor (some drives have only one magnet).

The voice coil itself is shaped rather like an arrowhead and is made of doubly coated copper magnet wire. The inner layer is insulation, and the outer is thermoplastic, which bonds the coil together after it is wound on a form, making it self-supporting. The portions of the coil along the two sides of the arrowhead (which point to the actuator bearing center) interact with the magnetic field, developing a tangential force that rotates the actuator. Current flowing radially outward along one side of the arrowhead and radially inward on the other produces the tangential force. If the magnetic field were uniform, each side would generate opposing forces that would cancel each other out. Therefore, the surface of the magnet is half N pole and half S pole, with the radial dividing line in the middle, causing the two sides of the coil to see opposite magnetic fields and produce forces that add instead of canceling. Currents along the top and bottom of the coil produce radial forces that do not rotate the head.
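The platter/head/track layout described above is the basis of the classic cylinder-head-sector (CHS) capacity arithmetic. As an illustration only, using the well-known maximum ATA CHS geometry rather than any particular drive discussed here:

$$\text{capacity} = \underbrace{16{,}383}_{\text{cylinders}} \times \underbrace{16}_{\text{heads}} \times \underbrace{63}_{\text{sectors/track}} \times 512\ \text{bytes} \approx 8.4\ \text{GB}$$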
Floppy disk

A floppy disk is a data storage medium composed of a disk of thin, flexible ("floppy") magnetic storage medium encased in a square or rectangular plastic shell. Floppy disks are read and written by a floppy disk drive or FDD, initials that should not be confused with "fixed disk drive," another term for a (non-removable) hard disk drive. Invented by IBM, floppy disks in 8-inch (200 mm), 5¼-inch (133.35 mm), and 3½-inch (90 mm) formats enjoyed many years as a popular and ubiquitous form of data storage and exchange, from the mid-1970s to the late 1990s. While floppy disk drives still have some limited uses, especially with legacy industrial computer equipment, they have now been largely superseded by USB flash drives, external hard drives, CDs, DVDs, and memory cards (such as Secure Digital).

The 5¼-inch disk had a large circular hole in the center for the spindle of the drive and a small oval aperture in both sides of the plastic to allow the heads of the drive to read and write the data. The magnetic medium could be spun by rotating it from the middle hole. A small notch on the right-hand side of the disk identified whether the disk was read-only or writable, detected by a mechanical switch or phototransistor above it. Another LED/phototransistor pair located near the center of the disk could detect a small hole once per rotation, called the index hole, in the magnetic disk. It was used to detect the start of each track and whether or not the disk was rotating at the correct speed; some operating systems, such as Apple DOS, did not use index sync, and drives designed for such systems often lacked the index hole sensor. Disks of this type were said to be soft-sector disks. Very early 8-inch and 5¼-inch disks also had physical holes for each sector and were termed hard-sector disks. Inside the disk were two layers of fabric designed to reduce friction between the medium and the outer casing, with the medium sandwiched in the middle. The outer casing was usually a one-part sheet, folded double, with flaps glued or spot-welded together. A catch was lowered into position in front of the drive to prevent the disk from emerging, as well as to raise or lower the spindle (and, in two-sided drives, the upper read/write head).

The 8-inch disk was very similar in structure to the 5¼-inch disk, except that the read-only logic was reversed: the slot on the side had to be taped over to allow writing.

The 3½-inch disk is made of two pieces of rigid plastic, with the fabric-medium-fabric sandwich in the middle to remove dust and dirt. The front has only a label and a small aperture for reading and writing data, protected by a spring-loaded metal or plastic cover, which is pushed back on entry into the drive. Newer 5¼-inch drives and all 3½-inch drives engage automatically when the user inserts a disk, and disengage and eject with the press of the eject button. On Apple Macintosh computers with built-in floppy drives, the disk is ejected by a motor (similar to a VCR) instead of manually; there is no eject button. The disk's desktop icon is dragged onto the Trash icon to eject a disk. The reverse side has a similar covered aperture, as well as a hole to allow the spindle to connect to a metal plate glued to the medium. Two holes at the bottom left and right indicate the write-protect status and high-density disk respectively: a hole means protected or high density, and a covered gap means write-enabled or low density. A notch at the top right ensures that the disk is inserted correctly, and an arrow at the top left indicates the direction of insertion. The drive usually has a button that, when pressed, springs the disk out with varying degrees of force. Some barely make it out of the disk drive; others shoot out at fairly high speed. In a majority of drives, the ejection force is provided by the spring that holds the cover shut, and therefore the ejection speed depends on this spring. In PC-type machines, a floppy disk can be inserted or ejected manually at any time (evoking an error message or even lost data in some cases), as the drive is not continuously monitored for status, so programs can make assumptions that do not match the actual status. With Apple Macintosh computers, disk drives are continuously monitored by the OS; an inserted disk is automatically searched for content, and a disk is ejected only when the software agrees it should be. This kind of disk drive (starting with the slim Twiggy drives of the late Apple Lisa) does not have an eject button, but uses a motorized mechanism to eject disks; this action is triggered by the OS software (e.g., when the user drags the drive icon to the trash can icon).
Should this not work (as in the case of a power failure or drive malfunction), one can insert a straightened paper clip into a small hole at the drive's front, thereby forcing the disk to eject (similar to the mechanism found on CD/DVD drives). Some other computer designs (such as the Commodore Amiga) monitor for a new disk continuously but still have push-button eject mechanisms.

The 3-inch disk, widely used on Amstrad CPC machines, bears much similarity to the 3½-inch type, with some unique and somewhat curious features. One example is the rectangular plastic casing, almost taller than a 3½-inch disk but narrower, and more than twice as thick, almost the size of a standard compact audio cassette. This made the disk look more like a greatly oversized present-day memory card or a standard PC Card notebook expansion card than a floppy disk. Despite the size, the actual 3-inch magnetic-coated disk occupied less than 50% of the space inside the casing, the rest being used by the complex protection and sealing mechanisms implemented on the disks. Such mechanisms were largely responsible for the thickness, length and high cost of the 3-inch disks. On the Amstrad machines the disks were typically flipped over to use both sides, as opposed to being truly double-sided. Double-sided mechanisms were available but rare.

USB Ports

Universal Serial Bus (USB) connectors on the back of a computer let you attach everything from mice to printers quickly and easily. The operating system supports USB as well, so the installation of the device drivers is quick and easy, too. Compared to other ways of connecting devices to your computer, USB devices are incredibly simple. Here we will look at USB ports from both a user and a technical standpoint, and see why the USB system is so flexible and how it is able to support so many devices so easily.

Anyone who has been around computers for more than two or three years knows the problem that the Universal Serial Bus is trying to solve: in the past, connecting devices to computers was a real headache! Printers connected to parallel printer ports, and most computers only came with one. Things like Zip drives, which need a high-speed connection to the computer, would use the parallel port as well, often with limited success and not much speed. Modems used the serial port, but so did some printers and a variety of odd things like Palm Pilots and digital cameras. Most computers have at most two serial ports, and they are very slow in most cases. Devices that needed faster connections came with their own cards, which had to fit in a card slot inside the computer's case. Unfortunately, the number of card slots is limited, and you needed a Ph.D. to install the software for some of the cards.

The goal of USB is to end all of these headaches. The Universal Serial Bus gives you a single, standardized, easy-to-use way to connect up to 127 devices to a computer. Just about every peripheral made now comes in a USB version. A sample list of USB devices that you can buy today includes:

Printers
Scanners
Mice
Joysticks
Flight yokes
Digital cameras
Webcams
Scientific data acquisition devices
Modems
Speakers
Telephones
Video phones
Storage devices such as Zip drives
Network connections
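The 127-device figure mentioned above follows from the USB address field: each device on a bus is assigned a 7-bit address, and address 0 is reserved for devices that have not yet been configured, so

$$2^7 - 1 = 128 - 1 = 127\ \text{usable addresses.}$$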
Parallel port

A parallel port is a type of interface found on computers (personal and otherwise) for connecting various peripherals. It is also known as a printer port or Centronics port. The IEEE 1284 standard defines the bi-directional version of the port. Before the advent of USB, the parallel interface was adapted to access a number of peripheral devices other than printers. Probably among the earliest devices to use the parallel port were dongles, used as a hardware-key form of software copy protection. Zip drives and scanners were early implementations, followed by external modems, sound cards, webcams, gamepads, joysticks, external hard disk drives and CD-ROM drives. Adapters were available to run SCSI devices via parallel. Other devices such as EPROM programmers and hardware controllers could also be connected via the parallel port. At the consumer level, the USB interface, and in some cases Ethernet, has effectively replaced the parallel printer port. Many manufacturers of personal computers and laptops consider parallel to be a legacy port and no longer include the parallel interface. USB-to-parallel adapters are available to use parallel-only printers with USB-only systems. However, due to the simplicity of its implementation, the parallel port is often used for interfacing with custom-made peripherals. In versions of Windows that did not use the Windows NT kernel (as well as DOS and some other operating systems), programs could access the parallel port hardware directly.

Keyboard

A keyboard, in computer science, is a keypad device with buttons or keys that a user presses to enter data characters and commands into a computer. Keyboards are one of the fundamental pieces of personal computer (PC) hardware, along with the central processing unit (CPU), the monitor or screen, and the mouse or other cursor device.

The most common English-language key pattern for typewriters and keyboards is called QWERTY, after the layout of the first six letters in the top row of its keys (from left to right). In the late 1860s, American inventor and printer Christopher Sholes invented the modern form of the typewriter. Sholes created the QWERTY keyboard layout by separating commonly used letters so that typists would type more slowly and not jam their mechanical typewriters. Subsequent generations of typists learned to type using QWERTY keyboards, prompting manufacturers to maintain this key orientation on typewriters. Computer keyboards copied the QWERTY key layout and have followed the precedent set by typewriter manufacturers in keeping this convention.

Modern keyboards connect with the computer CPU by cable or by infrared transmitter. When a key on the keyboard is pressed, a numeric code is sent to the keyboard's driver software and to the computer's operating system software. The driver translates this data into a specialized command that the computer's CPU and application programs understand. In this way, users may enter text, commands, numbers, or other data. The term "character" is generally reserved for letters, numbers, and punctuation, but may also include control codes, graphical symbols, mathematical symbols, and graphic images. Almost all standard English-language keyboards have keys for each character of the American Standard Code for Information Interchange (ASCII) character set, as well as various function keys. Most computers and applications today use seven or eight data bits for each character. For example, ASCII code 65 is equal to the letter A.
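The character-code round trip the paragraph describes is easy to demonstrate; a few lines of Python (any language with a character type would do) show code 65 and the letter A mapping to each other:

```python
# ASCII codes and characters are two views of the same small integer.
print(ord("A"))  # 65  -- the code mentioned in the text
print(chr(65))   # 'A' -- and the reverse mapping
print(ord("a"))  # 97  -- lowercase letters have distinct codes
```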
Often, keyboards also have directional buttons for moving the screen cursor, separate numeric pads for entering numeric and arithmetic data, and a switch for turning the computer on and off. Some keyboards, including most for laptop computers, also incorporate a trackball, mouse pad, or other cursor-directing device. No standard exists for positioning the function, numeric, and other buttons on a keyboard relative to the QWERTY and other typewriting keys, so layouts vary from keyboard to keyboard. In the 1930s, American educators August Dvorak and William Dealey designed an alternative key set, the Dvorak layout, so that the letters typists use most frequently sit on the home row, where they can be reached with the least finger movement.
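That design difference is easy to quantify with a toy comparison. The sketch below is an illustrative assumption, not taken from the text above: it counts how many of the ten most frequent letters in English (using the commonly cited "etaoin shrd" ordering) fall on the home row of each layout.

    # Home rows of the two layouts and a common frequency ranking of English letters.
    qwerty_home = set("asdfghjkl")
    dvorak_home = set("aoeuidhtns")
    most_frequent = "etaoinshrd"  # a commonly cited top-ten ordering

    # Count how many high-frequency letters each home row covers.
    print(sum(c in qwerty_home for c in most_frequent))  # 4 (a, s, h, d)
    print(sum(c in dvorak_home for c in most_frequent))  # 9 (all but r)

Nine of the ten most common letters on the Dvorak home row, versus four on QWERTY's, is exactly the design goal described above.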

Wednesday, November 13, 2019

Illusion versus Reality in Miss Brill Essay -- Katherine Mansfield

Illusion versus Reality in Miss Brill

Is it really "okay" to talk to yourself as long as you don't talk back? Well, what if your fur piece talks back? In Katherine Mansfield's short story, "Miss Brill," it is a quickly established fact that Miss Brill has an odd relationship with her fur necklet (440). But it is the author's descriptive use of symbolism that provides a deeper understanding of Miss Brill's personality. Katherine Mansfield creates the woman in the ermine toque (441) in similarity to Miss Brill to reveal Miss Brill's identity in connection with her own fur piece and invite comparison, which further illustrates Miss Brill's perception of reality.

Introduced in the story as simply "an ermine toque" (441), Ms. Mansfield establishes the woman wearing this fur hat as a symbol that assists in defining the relationship of one-ness Miss Brill has with her own fur. Through Miss Brill's description of the woman in the ermine toque, it is clear that Miss Brill perceives the woman in connection with the fur she wears (441-442). Miss Brill compares the woman's coloring to the color of her fur by pointing out that "everything, her hair, her face, even her eyes, [is] the same colour as the shabby ermine" (441). Miss Brill goes on to describe the woman's hand as being "a tiny yellowish paw" (441). And when the woman exits Miss Brill's attention, she does not walk away as a human would, but she "patters away" as a small animal might (442). Miss Brill's inability to differentiate clearly between the woman and the ermine toque she wears reinforces Miss Brill's identity in connection with her own fur. Mansfield employs this description as a technique to suggest the need to interpret Miss Brill from the descri... ...nly a secondary symbol, it assists in enriching our understanding of Miss Brill's peculiarities while pointing out primary symbols, like her own fur necklet. How Mansfield employs the "ermine toque" to foretell the plot of the larger story demonstrates a difference between those who interact and constructively deal with conflict and those who run away, refusing to accept the realities of life. Miss Brill, who does not interact with life, chooses to interact with her fur which, though genuine, is not alive. Instead, she chooses an imitation for her own life by "sitting in other people's lives" (440) which, though reality, cannot remain her reality.

Works Cited

Mansfield, Katherine. "Miss Brill." Introduction to Literature: Reading, Analyzing, and Writing. 2nd ed. Ed. Dorothy U. Seyler and Richard A. Wilan. Englewood Cliffs: Prentice, 1990. 440-43.

Sunday, November 10, 2019

I Am Sam Reaction Paper

ENGLISH 2 Chu, Jensy P. February 28, 2013 TTHS 1-2pm Prof. Bernardo

I AM SAM - Reaction Paper

I. SUMMARY

The movie revolves around a mentally challenged man named Sam Dawson. He has a 7-year-old daughter, Lucy, who under unfortunate circumstances was taken away from him. He asks for help from a well-known lawyer, Rita, and fights for custody over Lucy. With the help of his friends and loved ones, they do their best to get Lucy back. Along the way, rough challenges come their way but strengthen their bond and love for each other.

II. PERSONAL REACTION

The movie is one of the most touching, spectacular, and award-winning motion pictures I have seen in my whole life, for it touched my heart and made me realize facts about life. This movie has both sad and funny parts, and the acting is absolutely fabulous. All in all, this movie tells an amazing story and is never boring. I really loved the movie.

The actors portrayed their roles perfectly. Sean Penn did a wonderful job in portraying a mentally challenged man. For example, when he was about to give up after seeing Lucy with her new family, there was a moment when the camera focused on his eyes, and you can really see the warmth he injects into his acting. Lucy, played by Dakota Fanning, exhibits a depth of soul that makes her acting stellar work; to be able to act like that at such a young age is very impressive. And Rita, played by Michelle Pfeiffer, acts her part skillfully: she can show different emotions perfectly, from the cool, composed lawyer to a troubled wife and mother. The supporting actors gave brilliant performances too.

I also love the soundtrack associated with the movie. The music is such a delight. Every song is a rousing piece made up of either cheerful fluffiness or emotional embodiment that suits its scene quite nicely. Examples are the songs "Blackbird" and "I'm Looking Through You," two songs I really like.

I really loved the movie, but I was a bit dismayed with the ending. The end, in particular, was unrealistic yet satisfying in a certain way. It didn't really say what happened; it just showed what happened, not how it happened. Other than that, I'd give the movie two thumbs up. Indeed, the movie is a piece of artwork. To those who want to be moved and get teary-eyed, and to those who want to laugh at life: I recommend everyone see it. If you don't, you're missing out.

Friday, November 8, 2019

Free Essays on Lost City Of Atlantis

Lost city of Atlantis

The Ocean is filled with untold stories that are waiting to be discovered. Who knows what may lie under the deep blue Ocean? Atlantis is one of those untold stories that are awaiting an explanation. Atlantis today has no solid explanation of how old it is, where it is located, or if it really existed at all. There are numerous theories on how Atlantis was destroyed, or whether it was a physical place at all, and some people believe they have found the lost city of Atlantis.

Atlantis is believed to have existed over 2500 years ago and had such things as running hot and cold water, streets of gold, and all the nicest things on Earth (EarthQuest, pg. 1). Having such things as these, Atlantis must have been a splendid place to live. What more would anyone want with a place as wonderful as Atlantis?

Atlantis has a brief history because most of it has not yet been discovered. The Greek word Atlantis means "the Island of Atlas," just as the word Atlantic means "the Ocean of Atlas" (Laketech, pg. 1). Atlantis was the domain of the Greek Poseidon, god of the sea. He is the one that was said to have made and destroyed Atlantis. When Poseidon fell in love with a mortal woman, Cleito, he created a dwelling at the top of a hill near the middle of the Island, surrounding the dwelling with rings of water and land to protect her (The Active Mind, pg. 1). Soon after, Poseidon sired five pairs of male twins with the mortal woman Cleito. When the children grew up, Poseidon appointed the eldest of these sons, Atlas the titan, ruler of his beautiful Island domain (Laketech, pg. 1). The Island of Atlantis was the center for trade and commerce. Atlantis was governed in peace, was rich in commerce, was advanced in knowledge, and held domain over the surrounding islands and continents. Portions of the city were devoted to commerce and industry. This was because the Atlanteans used the discoveries of their scient...

Wednesday, November 6, 2019

ALWAYS IN 1787, 1820, 1833, AND 1850, THE NORTH AND THE SOUTH essays

Throughout the late 1700s and the early-to-mid 1800s, with the stability of the union frequently challenged over every, occasionally petty, disagreement, the North and the South somehow always found common ground. Yet it was only a matter of time before all these so-called compromises revealed their true colors as a series of patchwork, a house of cards that every addition made that much more unstable. Not until the early 1860s did the house of cards finally give way, and by then it was quite clear that neither the North nor the South was able to find that once-frequent common ground.

In the late 1700s and early 1800s, most of the disputes were over taxes, land settlement, states' rights, and legislative representation. Though controversial, these were quite a bit easier to deal with and settle than the slavery issue that would eventually overwhelm society after the introduction of Eli Whitney's cotton gin in 1794. The cotton gin precipitated a long era of dispute, but fortunately, for men like Henry Clay and John Adams, the cards of the Missouri and Three-Fifths Compromises for the most part silenced both parties on the terms of slavery for the time being. These times were no stranger to instances of violence, such as Shays' Rebellion, so in domestic terms life was not very happy-go-lucky. It was becoming apparent that under the surface of mild sectional strife something greater was brewing. Maintaining the union was considered a safeguard against domestic faction and insurrection, especially in the years after the Treaty of Ghent ended the War of 1812 and brought and cemented a greater feeling of unity and nationalism, so people would be less apt to challenge and threaten the safety and stability of their country. Therefore, it was no surprise that agreements were often hastily reached.

The mid 1800s brought, along with economic growth and nationalism, a wave of social, intellectual, and re...

Monday, November 4, 2019

Comparative Analysis of Nurse-patient Ratio Mandates for the Hospital Setting Research Paper

Comparative Analysis of Nurse-patient Ratio Mandates for the Hospital Setting - Research Paper Example

Legislation has passed in California, and will be presented in other states, to mandate a specific ratio of nurses per patient that must be maintained at all times. The goal of this study is to identify a balance between adequate levels of nurse personnel and hospital efficiency, both in terms of cost and of the time-resources of medical professionals.

A Comparative Analysis of Nurse-Patient Ratio Mandates for the Hospital Setting

INTRODUCTION

The possibility of nursing shortages is a relevant concern for hospitalists, patients, and the general public alike. Past years have produced numerous concerns that understaffed, overburdened hospitals are a barrier to adequate care. This paper will endeavor to examine the conventional wisdom that more patient responsibility per nurse will yield lower-quality care from nurses and other healthcare professionals, and the extent to which such a decline in patient outcomes can be quantified. But is it true that patients will receive better care, with fewer medical errors, under a system of precise nurse-patient ratios? Are nurses doing a better job under such a system? How would such a change extend to doctors and other medical practitioners? With over a dozen states now considering some form of mandate that would enforce specific ratios of nurses for every patient under a hospital's care, it is worthwhile to examine critically the available research on the balance between caregiver and patient. It is in everyone's interest to seek the ideal balance between nurse staffing levels and the cost-effective management of the time-resources of medical professionals.

CASE STUDY

"Celeste examined the patient's chart; she had to remind herself that Mr. McGillicuddy wasn't just a disease; he was a case of full-blown nephrotic syndrome, based on the protein casts found in his urinalysis, plus a chronic case of trigeminal neuralgia on top of that. Oh, and a living, breathing person. But with his age and prognosis, personhood wouldn't cut much slack with the transplant committees. The experienced nurse was not optimistic that he would retain his living status much longer, in part because the very lab results that Celeste found so damning took as long as they did to arrive, compounding the bad news they were reporting. On the one hand, in preparation for the new regulations mandating more nurses for every medical center in the county, patients would get more attention from nurses like her; the problem being that a hospital as small as hers had to cut corners somewhere, so they hadn't been able to hire the new med-tech they'd been needing for months now. So the doctors were probably lucky to get their results as soon as they did, as late as it seemed to her. But adding more nurses was about to be required by law, not something she could whine about to the head nurse. She patted Mr. McGillicuddy's hand in reassurance. Well, it would fall to her, and to the new blood they were hiring, to pick up the slack and make up for the corners cut..."

PROS

More nurses equals better care: in order to ensure the best possible patient outcomes during hospitalization, nurse-to-patient ratios must be mandated by law. The correct ratio will lead to happier nurses and healthier patients. It seems an obvious solution; more nurses certainly can't hurt. More eyes to watch over

Friday, November 1, 2019

A Simple Surgery Checklist Saves Lives Case Study

A Simple Surgery Checklist Saves Lives - Case Study Example

Checklists do not initiate the vital steps in the regular process; rather, they attempt to catch failures of the process. Based on the four flow charts, the system has much duplication of activities, or redundancy in time, especially in the number of times the patient has to give his consent. Patient consent is required in more than six scenarios in the system, from the holding room to the anesthesiologist to the surgeon. The timing is such that these checks occur at a point when it is not yet too late to correct a problem, which helps to ensure and further improve the safety of the patient.

The WHO Surgical Safety Checklist recognizes three stages of an operation; in each stage, the operation coordinator must check the completion of the required tasks before embarking on the next stage (Szalavitz, 2009). The patient has three separate interactions with the health providers in the following phases of the WHO Surgical Safety Checklist:

During the Sign In stage (before the administration of anesthesia), the patient's identity, the surgical site, informed consent, and the planned procedure are confirmed. The site marking is established and the anesthesia safety check is completed (Cavoukian, 2009). The team confirms that a functioning pulse oximeter is attached to the patient. Several other concerns are addressed, such as allergies, aspiration risk, airway risks, risk of major blood loss, and the availability of working tools and equipment for the operation.

The Time Out stage is performed before the incision of the skin. Everyone involved in the operation introduces themselves by name and role. They then confirm the patient once again: the identity/name of the patient, the site, and the procedure, and they deliberate on expected critical situations. Antibiotic prophylaxis, scheduling, and the display of imaging studies are also confirmed at this stage.