Thursday, August 27, 2020

Interpersonal Communication for Feedback and Organizations

Question: Examine Interpersonal Communication for Feedback and Organizations. Answer: Interpersonal communication has always existed, because people are social by nature and need to share information and feelings. The process through which individuals exchange meaning, feelings or information is referred to as interpersonal communication (Beebe, Beebe & Redmond, 2014). It is the face-to-face communication between people that takes place through either verbal or non-verbal modes. Interpersonal communication includes how the message is conveyed through features such as body language, tone of voice or gestures. Feedback is an important element of interpersonal communication, and effective feedback encourages a smooth flow of information. Effective feedback should be clearly heard, understood and accepted. To give effective feedback there are several guidelines and considerations to be observed. Feedback should be as specific as possible, as this makes it easy for the other party to understand it. Feedback should be timely, and you should consider what you are going to say, and how, at the appropriate moment. When giving feedback one should focus on behaviour and its impact, not on personality. The interpersonal skill of giving feedback is important in a workplace because it promotes teamwork and effective communication within an organisation (Baker, 2013). Good interpersonal communication promotes good leadership because it fosters empathy and trust among members of staff. Effective feedback skills play a critical role in promoting credibility and customer satisfaction. Personal relationships are important in the workplace, and they can be fostered and maintained by the presence of effective interpersonal communication systems within an organisation (De Janasz, 2014). In conclusion, interpersonal skills consist of how people communicate both verbally and non-verbally. Effective feedback skills need to follow certain guidelines that ensure the feedback given is clearly understood and accepted.

References
Baker, A. (2013). Feedback and organizations: Feedback is good, feedback-friendly culture is better. Canadian Psychology, 54(4), 260-268.
Beebe, S.A., Beebe, S.J., & Redmond, M.V. (2014). Interpersonal Communication: Relating to Others (7th edn). Boston: Allyn & Bacon.
De Janasz, S.Z., Crossman, J., Campbell, N., & Power, M. (2014). Interpersonal Skills in Organisations (2nd edn). North Ryde, NSW: McGraw-Hill Education.

Saturday, August 22, 2020

Management research coursework Essay Example | Topics and Well Written Essays - 1000 words

Management research coursework - Essay Example Accordingly, the purpose of the article is revealed: after analysing various aspects of educational validity, the article will provide a research methodology framework for assessing the educational validity of business gaming simulation. Thus, the introduction is aimed solely at describing the background that calls for the study, and also at setting out the outline of the research paper. Thereafter, a brief explanation is given of the method used to conduct the research, the usual methods section found in a research paper. The writers point out that the research was conducted 'using total enterprise simulation' (Stainton, Johnson and Borodzicz, 2010, p. 705). Then, the researchers turn to a short literature review to show that the assessment of validity is an ongoing issue and that an effective research methodology for business gaming simulation is yet to come. The article additionally makes clear its purpose of investigating validity from an educational or learning perspective. The following section delves into the concept of educational validity and identifies two elements, design and implementation, as the factors that govern educational validity. ... The next section explores the other factor in educational validity, namely implementation. The researchers show how a simulation should be implemented. The first point is that incorporating practical experience will present real issues, thereby allowing the students to reflect on what they have learned. Also, they bring in the point that while the 'learning by doing' approach is adopted, there is the need for a facilitator to provide coaching, support and motivation. The researchers, after analysing the issues involved in achieving educational validity, move on to developing a methodology framework for assessing the educational validity of a business gaming simulation. It is pointed out that none of the presently available studies has established a process for designing, implementing and validating a total enterprise simulation. It is vital for any research dealing with educational validity to assess internal educational validity, external educational validity and external representational validity. According to Stainton, Johnson, and Borodzicz (2010, p. 710), internal educational validity means the ability to show the student the relations in a business environment, and external educational validity means the compatibility of the simulation with the real-world environment. If it represents a real situation, it has representational validity. Three theoretical propositions are made to assess educational validity. According to the first proposition, if the participants understand the causes of their business results, internal educational validity is evident. According to the second

Friday, August 21, 2020

What You Need To Know About Paper Writing Websites

There are plenty of sites that offer free online writing and editing services. These websites may be based on the internet or they may be physically based in offices located all over the world. While most of these companies have certain individual writers who specialize in various areas of the writing job market, there are also a few companies that hire freelance writers for each specific industry. Here are some of the things you need to know about paper writing websites.

Websites like these may not be free for everyone. Many writers will be surprised when they get a message saying that their query letter and other submission materials have been declined by some of these sites. These companies are quite flexible and will accept materials from all types of writers regardless of the quality of their work. So if you're writing and editing just for fun or for pay purposes, you won't be turned away just because you aren't very good at what you do.

Some companies will charge you a fee for free papers. This fee is usually very small and will depend on how many papers you submit each month. They will only charge you if your paper isn't accepted. This is why it's important to submit at least three new papers each month.

Since paper writing websites want to keep their budget low, they won't have many writers working for them. Instead, they'll work with smaller local businesses and individuals who can afford to pay for each submission they receive. This way, the company has more resources to offer writers who don't have the time or skill level to do the job on their own.

Most companies that offer these services will charge a flat-rate fee for each assignment. This can range from a few dollars to hundreds of dollars depending on the job and the writer's experience level. You might even find some companies that will offer free supplies as part of the service for writing jobs that have been accepted. This allows you to keep your expenses low and allows them to pay their writers more.

Paper writing websites have a specific format and style for each job that they accept. Each of their articles is different from any other, so it's easy to see why many of them won't have much in common. The writers also have to follow certain deadlines and maintain professionalism. As a result, they are going to charge a fee for this type of work.

When you hire a writer from paper writing websites, you'll have to set up a system for paying them. You can pay via checks, PayPal, or even through credit cards. Make sure you make payments in a timely manner so that your writer has enough time to get the job done.

Paper writing websites are an excellent resource for writers. Their writers aren't being paid to sit around and write other people's articles; they are being paid to write specifically for the client. So if you're interested in getting some free writing and editing done, the best place to start is right online.

Monday, May 25, 2020

Illegal Immigration And The United States - 1532 Words

Since its establishment, people have been emigrating from foreign countries into the United States. An immigrant is defined in the Merriam-Webster dictionary as "a person who comes to a country to take up permanent residence" ("immigrant"). In the beginning, America was the land of opportunity, which allowed people to have a chance at reaching success. Since then, the population has grown to over 320 million people. Because of the explosion of people entering the United States, a restriction must be placed on the number of new foreigners who are allowed in. America is now well established, and the economy is not strong enough to support more than the population currently living within the borders. In an attempt to control the number of people entering the country, the United States government created an application process in which people who wish to become citizens are granted the opportunity to gain citizenship. Immigration is illegal without being accepted through this process; however, multitudes of people ignore the law and continue to cross the border illegally. Illegal immigration affects American citizens, immigrants, immigrant families, employers, and the Department of Homeland Security daily and creates problems in the United States. The issue of immigration is a well-focused area in a majority of political debates. The question is: should the United States be strengthening the laws of immigration to keep undocumented citizens out?

Thursday, May 14, 2020

Analysis Of Cornel West, An American Philosopher And...

On Love and Intimacy
Short Paper 3
Riana Nigam
Due Tuesday, May 9th, 2017
Exchange

"We live in a predatory capitalist society in which everything is for sale. Everybody is for sale, so there is ubiquitous commodification." This quotation by Cornel West, an American philosopher and political activist, conveys the widespread objectification of human beings in our society. The narrow, traditional image of prostitution has experienced a dramatic shift in post-industrial American society. Sex workers are not automatically considered to be from low-income, marginalized groups; instead, they have come to also include individuals from the educated, middle-class category. This demographic transition reveals the gradual ... Consequently, sexual relations between individuals occurred increasingly to satisfy the desire for recreational experiences and less for reproductive and long-term purposes. The growing inclination to escape the complexities of interpersonal relationships has been reflected by a tendency towards more bounded and delineated modes of sexual contact. This pattern also reveals transformations in the social structure of private and public spheres, as it allows individuals to have the feeling of a genuine sexual experience while still allowing them to bypass the obligations that are often expected or required in a more committed relationship. Bernstein labels this redefinition of sexual intimacy "bounded authenticity," which entails the sale and purchase of authentic and physical connection, all within the realm of predefined limitations (Bernstein, 127). Bounded authenticity demonstrates how traditional romance has metamorphosed, as it has become centered around recreational sex beyond the confines of the family unit. Moreover, it has contributed to the view of heterosexual male desire as problematic. There is an expectation that heterosexual males engage in sexual commerce to fulfill a need that is left void within the home. However, it is more and more transparent that their participation in this industry is not a replacement for a sex life within the privatized family home, but instead, it is an additional component to it. As

Wednesday, May 6, 2020

Similarities Between Bill Bryson And Perks Of Being A...

The two texts, Bill Bryson's A Short History of Nearly Everything and Stephen Chbosky's The Perks of Being a Wallflower, explore the responses that emerge when people are given emotional and intellectual stimuli. This can be seen through Bill Bryson's use of narration throughout his book, and through the use of narration in The Perks of Being a Wallflower, through Charlie, where we can see everything he experiences through both his thoughts and voice, especially during scenes that depict Charlie writing to someone unnamed about how he feels. Although both texts explore the idea of discovery, they each explore different themes of that idea, with A Short History of Nearly Everything focusing on the themes of intellectual ... In Charlie we see him learn about his emotional discovery of the past, and through the quote "I'm both happy and sad, and still trying to figure out how that can be" we learn that he has trouble defining who he is, and is still trying to find himself in the world. It is through Charlie's use of narration and the writing of letters to the unnamed person that we learn about him: how he's emotionally unstable, how he doesn't trust people easily, and how he's in love with Sam. It's through the use of close-up camera angles, lighting and acting ability that we see the character of Charlie come to life, and learn about his friends and himself through emotional discovery. As well as having well-thought-out academic discoveries presented within his work of nonfiction, Bryson also presents the idea that academic and intellectual discoveries can be completely accidental and surprising; this is seen through Bryson's writing on the discovery of cosmic background radiation. Through the quote "Although Penzias and Wilson had not been looking for cosmic background radiation, didn't know what it was when they found it, and hadn't described or interpreted its character in any paper, they received the 1978 Nobel Prize in Physics", this quote showcases the idea that important discoveries can be

Tuesday, May 5, 2020

Defining a Heroine free essay sample

The word heroine has many definitions, but a true heroine is a woman who is selfless, and cares about others equally, if not more, than she cares about herself. That being said, because there are so many different ideas and definitions of what a heroine actually is, one must form their own personal view of the word. For instance, the American writer Sylvia Day spoke about her personal heroines and said, "My heroines, more often than not, are the ones who are troubled and resistant." Aside from the many ideas, definitions, and interpretations of the heroine, knowing the literal definition of the word is key to being able to form a personal opinion. To make matters even more confusing, the dictionary definition of the heroine contains around three different explanations of the word. In simple terms, a heroine can either be a mythological or legendary woman who has the qualities of a hero, a woman admired and emulated for her achievements and qualities, or the principal female character in a literary or dramatic work. The first interpretation of the word mentions the idea of a hero, which is simply the masculine form of the word heroine. Though it is not uncommon to hear a woman be referred to as a hero, a man will never be referred to as a heroine. Other words that are synonyms of the word heroine are words like idol, leading lady, legend, protagonist, and demigoddess. Even though the definition of a heroine is muddled and somewhat confusing, the definition of what the opposite of a heroine is, is completely clear. Overall, a heroine is a woman who is selfless, goes to great lengths to help others, is courageous, is self-effacing, and does what she can for others regardless of the possible personal ramifications. That being said, the direct opposite of a heroine is a woman who is selfish, uncaring, conceited, and is only concerned for her own wellbeing. In simpler terms, the opposite of a heroine is none other than a coward. Though the definition of a heroine is a bit unclear, it is certain that a heroine is not someone who is weak, cowardly, gutless, or faint-of-heart. What is also certain about this word is that there are as many different kinds of heroines as there are definitions. A heroine can be the leading lady in a novel, someone you idealize, or someone you know and look up to. For instance, one of my own personal heroines would without a doubt be my Aunt Adrian. Adrian is a woman of honesty, determination, drive, love, compassion, generosity, and understanding. She is someone I look up to, aspire to be like, and have come to regard as a personal heroine. In a completely different sense, Beryl Markham is the heroine of her novel West With the Night, in the way that she is the leading lady, but she also displays great amounts of courage, integrity, and honesty throughout the book. In conclusion, there is no singular idea, definition, or interpretation of the heroine. There are countless ideas of the heroine, but what gives the word meaning is one's personal interpretation of the word.

Tuesday, April 7, 2020

The Miseducation of Lauryn Hill free essay sample

This hip hop culture then took off, with several groups, mainly male groups, coming together to emphasise the pillars of hip hop: MCing, DJing, graffiti, breakdancing, and beatboxing (Starr, 2007). The 1980s was a groundbreaking decade for hip hop, in that this is where artists other than Black males started to enter the rap and hip hop scene. Artists such as Salt-N-Pepa, MC Lyte, and Queen Latifah opened the door for female MC artists. Salt-N-Pepa were the first all-female rap group to hit the mainstream, with multiple successes, and their presence opened the doors for artists like Queen Latifah. In this time, female MCs had to prove that they were just as good as their male counterparts, if not better. This competition left the female artists being "hard" and showing their street cred so much so that they were very masculine, because society at that point may not have been ready to accept a woman as both sexy and a great rapper. After artists like MC Lyte and the Queen opened the door for female rappers, hip hop took another turn, with artists like TLC, Foxy Brown, and Lil' Kim, who not only emulated the hip hop culture but brought sexiness into the equation. With the emergence of these artists, females in hip hop had a socialized connotation that almost only depicted women in a sexual manner when it came to hip hop. Women in hip hop at this point held sexual roles, which were the Diva, the Gold Digger, or the Dyke (Stephens, 2007), and their music usually reflected that, being about sex, money, or fulfilling the hard roles like a male rapper, and reflected more masculine characteristics than feminine (Stephens, 2007). In the late 1990s, with the high sexual content of female MCs such as Lil' Kim and Foxy Brown, Lauryn Hill released her first solo album, The Miseducation of Lauryn Hill. Lauryn Hill had all bases covered when it came to hip hop: she had the street credibility, the lyrical ability, the melodic flow, sexuality, etc. (Thespians, 2010), but she combined that with everyday female issues of that time, which made this album relatable for both women and men. This album was about her struggles in life and love at the time, particularly about her pregnancy while she was writing the album. The album was classified as a hip hop album; however, it was a blend of styles, sung and rapped flawlessly on a single album, with songs like Doo Wop (That Thing), which reflected the old styles of doo wop vocal harmonies that were prevalent in music in the 1950s and 1960s, and Everything is Everything. Lauryn used Jamaican influences as well in the songs on this album, with Jamaican dialects and a reggae style to her songs. Not only did this album infuse different music styles, she also addressed personal issues, which before this time female rap artists did not do for the most part. She opened her life to her fans, as artists in the singer/songwriter forms of folk and contemporary music did, with songs like her single Ex-Factor. This song had such a personal component that, although it wasn't as commercially successful as a song like Doo Wop, it still showed a personalized aspect of hip hop, especially female hip hop, that wasn't prevalent before. This album debuted on the Billboard charts at number one, and sold more copies during its first week than any other female artist's album at that time (USA Today, 1998).
This album also won Lauryn 5 Grammy Award nominations, as well as 5 Grammy Award wins, which made Lauryn the first female artist, hip hop or otherwise, to have that many nominations and wins in one Grammy Award night. With all this success on her first album, one would suspect that she would have a blossoming career; however, her second release was not as successful. This could be due to the fact that people expected her to outdo the success of The Miseducation, which is highly unlikely. Lauryn went a very spiritual road for her second album, and some even believed that it was unfinished (Thespians, 2010), which could have led to the disappointment of her career that persisted from then on. With the success of The Miseducation of Lauryn Hill, Lauryn will forever go down in history as a very influential participant in hip hop culture to many female hip hop artists of today (Thespians, 2010), and tomorrow.

The Miseducation Of Lauryn Hill free essay sample

In case you have not heard, one of the top ten hip-hop groups ever has split. The magnificent trio, The Fugees, has just become a group of soloists. Each is doing well, but only one has the impact they had as a whole. Pras has just come out with one of America's top ten songs, called Ghetto Superstar. Wyclef is doing very well with his debut album, called The Carnival. Although they are doing well by themselves, they aren't making the same impact as their third member, Lauryn Hill. Lauryn, the group's singer, has just come out with her first solo album. The rapper/singer/actress has come a long way. Although she is famous, she pledges to remember her roots. She thinks back in such songs as Every Ghetto, Every City by singing "Way before the record deal/ The streets that nurtured Lauryn Hill/ Made sure that I'd never gone too far." Unlike many rappers, who rap about the bad things in life, Lauryn reminds us about the good things that we have and should cherish. The album's central topic is love, attempting to define what it is. It doesn't try to give a Webster's Dictionary definition, but other people's opinions, which are interesting because they are coming from children. Since most people don't associate love and children, this makes the album very unique. I would recommend The Miseducation of Lauryn Hill to everyone. This album and singer will never be forgotten. On my scale from one to ten, it gets a well-deserved ten.

Monday, March 9, 2020

20 Argumentative Essay Topics Hooking Facts on Fast Food Nation by Eric Schlosser

Coming up with topics for argumentative essays can be quite challenging for students, especially if you've decided to work on it a few days (or a few hours) before the deadline. If your next assignment is to write an argumentative essay on Eric Schlosser's book, "Fast Food Nation: The Dark Side of the All-American Meal", you can easily take on this challenge if you have the right topic in mind. To get your creativity going, here are 20 topics you can use:

1. Should Hospitals Ban Fast Food Outlets?
2. Healthier Fast Food Choices and Awareness Will Improve Health
3. Low Work Wages in the US Fast Food Industry Are Costing Taxpayers
4. Is Fast Food Cheaper Than Home-Cooked Meals?
5. The Hidden Costs of Obesity and Excessive Junk Food Consumption
6. Is Fast Food as Addictive as Drugs?
7. Should Governments Impose More Taxes on Junk Food?
8. Junk Food Packaging Should Come with Health Warnings
9. Famous Public Figures Should Be Banned from Promoting Soda and Junk Foods
10. As Bad as Smoking: Should There Be an Age Restriction on Eating Fast Food?
11. The Link Between Fast Food and Child Obesity
12. The Fast Food Industry Needs a Paradigm Shift
13. Fast Food's Effects on the Brain's Pleasure Centers
14. The Psychology of Fast Food Marketing
15. How McDonald's Utilized Disney's Marketing Approach
16. Eating Fast Food Can Make You Depressed
17. Overcoming Fast Food Addiction: Time for Extreme Measures
18. Do Healthy Options on Fast Food Menus Help?
19. Causes of the Rapid Rise of Fast Food Restaurants
20. Eliminating the Junk Out of Junk Food: Can We Turn the Fast Food Industry Around?

The topics are an eclectic mix of direct claims and general themes that are directly related to the issues which Schlosser focuses on in his book. There is also a list of authoritative sources and materials at the end which you can use to lend credence to your essay. However, if you are still at a loss for ideas, check out our list of 10 facts on Fast Food Nation by Eric Schlosser for an argumentative essay and further inspiration. Also check out the detailed guide on how to write an argumentative essay on Fast Food Nation by Eric Schlosser to properly write your own. These resources aside, refer to our sample essay below to get a better idea about how to properly structure an argumentative essay on Fast Food Nation by Eric Schlosser. This example can be used as a template and as a guide about what kind of content you need to include to draft a clear and balanced piece of writing.

Sample Argumentative Essay on Labor Practices in the Fast Food Industry

The fast food industry has been held responsible for numerous problems affecting American society. Advertising to children and providing high-carb, low-nutritional-value foods, however, are only some of the main concerns about this industry. The matter of labor practices has become one of the prominent issues and a subject for debate in the past ten years. There are three reasons why this has become a major issue. First, the fast food industry has a tendency to overwork its employees. Second, the industry has been known to pay its workers the minimum wage. Finally, there are almost no benefits for the employees of this industry. All of these lead to poverty-stricken workers who are worked to the bone. In fact, in "Modern Slavery: US Fast-food Industry Thriving on Poverty-stricken Workers", Finian Cunningham wrote that millions of fast food employees are "so exploited it is estimated that more than half of them can only make ends meet by relying on some form of government handout." Cunningham also writes that many fast food employees finish their shifts only to return to homeless shelters, since they cannot afford to purchase homes of their own or rent apartments for their families. Even then, they do not get the rest they deserve, as they are too tired to carry out their daily routines. To drive this point home, he gives the example of former Dunkin Donuts employee Maria Fernandez. The 32-year-old woman had been doing back-to-back shifts at multiple outlets in the greater New York area. Unfortunately, she was so tired after being overworked one day that she slept in her car between shifts. She died that day from asphyxiation caused by the exhaust fumes of her car. With an estimated 2.25 million Americans working in fast food restaurants in the U.S., labor practices need to be tackled head on to ensure the survival and effective growth of the "fast food nation". Numerous authors, including Eric Schlosser, have revealed the harsh realities of the labor practices in this industry, among other controversies. Schlosser also used the example of teenager Elisa, who was hired because members of her age group are considered easier to control due to their inexperience, making them cheaper to hire since they are willing to accept lower pay. If teenagers were unwilling to work at a place, the fast food industry replaced them with poor immigrants and the elderly. Granted, there have been studies showing that employees enjoy working in this industry. A study by Michael Benner, an Iowa State University student, uncovered that high school employees at McDonald's enjoy their work for reasons such as easy money and the lack of other job opportunities without a degree in hand. Moreover, the work seems easier because the fast food chain operates on an assembly-line system, breaking down the tasks of the restaurant. These so-called perks, however, do not justify the low wages, which prevent workers from leading a meaningful existence. You can definitely come up with a better essay if you put your mind to it. So make sure to start working right away, or else your deadline will engulf you.

References:
Campbell, D. (2015). Ban Fast-Food Outlets from Hospitals, MPs Demand. The Guardian. Retrieved 19 March 2016, from theguardian.com/society/2015/mar/25/ban-fast-food-outlets-nhs-hospitals-mps
Eating Fast Food. (2016). Heart.org. Retrieved 19 March 2016, from heart.org/HEARTORG/HealthyLiving/HealthyEating/DiningOut/Eating-Fast-Food_UCM_301473_Article.jsp
McVeigh, K. (2013). Low Fast-Food Wages Come at High Cost to US Taxpayers, Says Report. The Guardian. Retrieved 19 March 2016, from theguardian.com/world/2013/oct/15/fast-food-low-wages-high-cost-taxpayers
Bittman, M. (2011). Is Junk Food Really Cheaper? NYTimes.com. Retrieved 19 March 2016, from nytimes.com/2011/09/25/opinion/sunday/is-junk-food-really-cheaper.html
Rehel, J. (2016). A Healthy Diet Costs $2,000 a Year More Than an Unhealthy One for Average Family of Four: Harvard Study. National Post. Retrieved 19 March 2016, from http://news.nationalpost.com/health/a-healthy-diet-costs-2000-a-year-more-than-an-unhealthy-one-for-average-family-of-four-harvard-study
Benfield, F. Kaid, Matthew D. Raimi, and Donald D. T. Chen. (1999). Once There Were Greenfields: How Urban Sprawl Is Undermining America's Environment, Economy, and Social Fabric. Washington, D.C.: National Resources Defense Council.
Emerson, Robert L. (1990). The New Economics of Fast Food. New York: Van Nostrand Reinhold.
Card, D., & Krueger, A. (2000). Minimum Wages and Employment: A Case Study of the Fast-Food Industry in New Jersey and Pennsylvania: Reply. American Economic Review, 90(5), 1397-1420. http://dx.doi.org/10.1257/aer.90.5.1397
Garber, A., & Lustig, R. (2011). Is Fast Food Addictive? Current Drug Abuse Reviews, 4(3), 146-162. http://dx.doi.org/10.2174/1874473711104030146
Zhong, C.-B., & DeVoe, S.E. (2010). You Are How You Eat: Fast Food and Impatience. Psychological Science. DOI: 10.1177/0956797610366090

Friday, February 21, 2020

German Syntax Essay Example | Topics and Well Written Essays - 2000 words

German Syntax - Essay Example

2.0 Word Order

German is considered an SVO language (Fagan 146), which means that the underlying word order in a clause is Subject-Verb-Object.

1) Seine Mutter trinkt Whisky.
   [subject: his mother] [verb: drinks] [object: whisky]
   'His mother drinks whisky.' (Collins 175)

Sentence 1 demonstrates the most common word order in German; it is a declarative sentence and has only one main clause (Weyerts et al. 216). So the verb is in second position in a sentence that is complete and can stand alone, in other words in an independent clause. Weyerts et al. claim that "it is always a finite verb or auxiliary that appears in second position, and it only appears there in main clauses" (216). Double-clause sentences are constructed in a similar way. If two independent or main clauses are joined with a conjunction, the word order remains SVO in both clauses. Sentence 2 is an example of two independent clauses joined with a conjunction.

2) Wir wollten ins Kino, aber wir hatten kein Geld.
   [subject: we] [verb: wanted] [indirect object: to the cinema] [conjunction: but] [subject: we] [verb: had] [direct object: no money]
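As a toy illustration of the verb-second pattern described above, the short Python sketch below (my own addition, not part of the essay) linearises simple declarative main clauses by always placing the finite verb in the second position, and reproduces sentences (1) and (2):

```python
def main_clause(first_constituent, finite_verb, *rest):
    """Linearise a declarative main clause: one constituent, then the finite verb, then the rest."""
    return " ".join([first_constituent, finite_verb, *rest])

# Sentence (1): subject - finite verb - object
print(main_clause("Seine Mutter", "trinkt", "Whisky") + ".")

# Sentence (2): both conjuncts keep the finite verb in second position
first = main_clause("Wir", "wollten", "ins Kino")
second = main_clause("wir", "hatten", "kein Geld")
print(f"{first}, aber {second}.")
```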

Wednesday, February 5, 2020

Nationalization of Oil Industry Essay Example | Topics and Well Written Essays - 1750 words

Nationalization of Oil Industry - Essay Example Argentinean President Cristina Fernández de Kirchner announced in April 2012 that Argentina would take control of Repsol YPF, the country's biggest crude oil producer, by nationalizing 51 percent of the company's shares (Gaudín, 2012). Until 1999 YPF was the largest oil company in Argentina and was owned by the government. However, lack of efficient management and expertise was pulling down profits in the oil industry. As a result, the government started encouraging foreign investment in the market so as to invigorate oil production. Between the years 1993 and 1999, Madrid-based Repsol acquired 100 percent of the company's shares (Weinstein, 2012) and its name was changed from YPF to Repsol YPF. Spain is the largest foreign investor in Argentina, with the European market being the largest export market for the country (Hernandez, 2012). However, the current dispute is over a slump in investment by the Spanish owner of YPF, which is leading to a drop in energy output in Argentina. In the home country, Buenos Aires-based YPF is considered an emblem of state pride (Economist, 2012a). Nationalization of the $18 billion company in 2012 would bring huge revenue to the cash-strapped government, which is a significantly beneficial aspect of the President's decision. Hence nationalization of the country's biggest oil company is apparently a populist move by the President. Energy officials from Brazil and the President of Uruguay have praised this action (Economist, 2012a). ...

Justifications Cited for Nationalization of Oil

Ms. Fernandez has accused Repsol of failing to make enough investment in exploiting the resources of the country. YPF has declared that it has discovered an oil shale site whose estimated potential yield would be approximately 23 billion barrels (Forero, 2012). In the context of the soaring cost of oil, which is damaging the country's economy, the lack of potential investment in Argentinean gas and oil reserves is a primary cause behind the seizure of Repsol YPF's shares. The President said that Repsol was not producing enough oil and was therefore failing to meet the country's total energy requirements. Argentina is currently facing a serious shortage in its total production of energy. This situation is being fixed by importing energy at a higher price from other countries (Economist, 2012a). Ms. Fernandez has said that if this continues Argentina would become an unviable state for investment (Macalister, 2012). Pressure exerted by the government on YPF had been increasing over the two months before the decision to nationalize was taken. It is evident that the current situation has been created as a turn from the 2004

Tuesday, January 28, 2020

Advances in DNA Sequencing Technologies

Abstract

Recent advances in DNA sequencing technologies have led to efficient methods for determining the sequence of DNA. DNA sequencing was born in 1977, when Sanger et al. proposed the chain termination method and Maxam and Gilbert proposed their own method in the same year. Sanger's method proved to be the more favourable of the two. Since the birth of DNA sequencing, more efficient sequencing technologies have been developed, as Sanger's method was laborious, time consuming and expensive; Hood et al. proposed automated sequencers involving dye-labelled terminators. Due to the lack of available computational power prior to 1995, sequencing an entire bacterial genome was considered out of reach. This became a reality when Venter and Smith proposed shotgun sequencing in 1995. Pyrosequencing was introduced by Ronaghi in 1996; this method produces the sequence in real time and is applied by 454 Life Sciences. An indirect method of sequencing DNA, called sequencing by hybridisation, was proposed by Drmanac in 1987 and led to the DNA array used by Affymetrix. Nanopore sequencing is a single-molecule sequencing technique in which single-stranded DNA passes through an ion channel in a lipid bilayer and the ion conductance is measured; synthetic nanopores are being produced in order to substitute for the lipid bilayer. Illumina sequencing is one of the latest sequencing technologies to be developed, involving DNA clustering on flow cells and four dye-labelled terminators performing reversible termination. DNA sequencing has not only been used to determine sequences but has been applied to the real world: it has been involved in the Human Genome Project and in DNA fingerprinting.

Introduction

Reliable DNA sequencing became a reality in 1977, when Frederick Sanger perfected the chain termination method to sequence the genome of bacteriophage φX174 [1][2]. Before Sanger's proposal of the chain termination method, there was the plus and minus method, also presented by Sanger along with Coulson [2]. The plus and minus method depended on the use of DNA polymerase to transcribe the specific DNA sequence under controlled conditions. This method was considered efficient and simple; however, it was not accurate [2]. As well as the proposal of chain termination sequencing by Sanger, another method of DNA sequencing was introduced by Maxam and Gilbert, which was also reported in 1977, the same year as Sanger's method. The Maxam and Gilbert method shall be discussed in more detail later on in this essay. The proposal of these two methods spurred many DNA sequencing methods, and as the technology developed, so did DNA sequencing. In this literature review, the various DNA sequencing technologies shall be looked into, as well as their applications in the real world and the tools that have aided DNA sequencing, e.g. PCR. This review shall begin with the discussion of the chain termination method by Sanger.

The Chain Termination Method

Sanger discovered that the inhibitory activity of 2',3'-dideoxythymidine triphosphate (ddTTP) on DNA polymerase I was dependent on its incorporation into the growing oligonucleotide chain in the place of thymidylic acid (dT) [2]. In the structure of ddT there is no 3'-hydroxyl group, but a hydrogen atom in its place. With the hydrogen in place of the hydroxyl group, the chain cannot be extended any further, so termination occurs at the position where the dT would have been incorporated. Figure 1 shows the structures of a dNTP and a ddNTP.

In order to remove the 3'-hydroxyl group and replace it with a proton, the triphosphate has to undergo a chemical procedure [1]. A different procedure is employed for each of the triphosphates. ddATP was produced from the starting material 3'-O-tosyl-2'-deoxyadenosine, which was treated with sodium methoxide in dimethylformamide to produce 2',3'-dideoxy-2',3'-didehydroadenosine, an unsaturated compound [4]. The double bond between carbons 2' and 3' of the cyclic ether was then hydrogenated with a palladium-on-carbon catalyst to give 2',3'-dideoxyadenosine (ddA). The ddA was then phosphorylated in order to add the triphosphate group, and purification took place on a DEAE-Sephadex column using a gradient of triethylamine carbonate at pH 8.4. Figure 2 is a schematic representation of the production of ddA prior to phosphorylation.

In the preparation of ddTTP (Figure 3), thymidine was tritylated (+C(Ph)3) at the 5'-position and a methanesulphonyl (+CH3SO2) group was introduced at the 3'-OH group [5]. The methanesulphonyl group was substituted with iodine by refluxing the compound in 1,2-dimethoxyethane in the presence of NaI. After chromatography on a silica column, the 5'-trityl-3'-iodothymidine was hydrogenated in 80% acetic acid to remove the trityl group. The resultant 3'-iodothymidine was hydrogenated to produce 2',3'-dideoxythymidine, which was subsequently phosphorylated. Once phosphorylated, ddTTP was purified on a DEAE-Sephadex column with a triethylammonium hydrogen carbonate gradient. Figure 3 is a schematic representation of the production of ddT prior to phosphorylation.

When preparing ddGTP, the starting material was N-isobutyryl-5'-O-monomethoxytrityldeoxyguanosine [1]. After tosylation of the 3'-OH group, the compound was converted to the 2',3'-didehydro derivative with sodium methoxide. The isobutyryl group was partly removed during this treatment and was removed completely by incubation in the presence of NH3 overnight at 45 °C. During the overnight incubation period, the didehydro derivative was reduced to the dideoxy derivative and then converted to the triphosphate. The triphosphate was purified by fractionation on a DEAE-Sephadex column using a triethylamine carbonate gradient. Figure 4 is a schematic representation of the production of ddG prior to phosphorylation.

Preparation of ddCTP was similar to that of ddGTP, but started from N-anisoyl-5'-O-monomethoxytrityldeoxycytidine. However, the purification step was omitted for ddCTP, as it produced a very low yield; the solution was therefore used directly in the experiment described in the paper [2]. Figure 5 is a schematic representation of the production of ddC prior to phosphorylation.

With the four dideoxy samples now prepared, the sequencing procedure can commence. The dideoxy samples are placed in separate tubes, along with primed template DNA prepared from the φX174 replicative form (restriction fragments serve as primers), DNA polymerase and the four dNTPs [2].
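Before describing how the resulting gel is read, here is a small illustrative Python sketch (a toy model of my own, not part of the original protocol). It records one terminated fragment length for every position at which the matching ddNTP could be incorporated in each of the four tubes, and then recovers the sequence by walking the pooled fragment lengths from shortest to longest, exactly as the autoradiograph is read from the bottom of the gel upwards in the next paragraphs. The function names and the example strand are invented for the demonstration.

```python
# Toy model of Sanger chain-termination sequencing (illustrative only).
# The string below stands for the newly synthesised strand that the gel reads;
# real reactions extend a primer on a template and use radiolabels or dyes.

def run_four_tubes(strand):
    """Return {ddNTP base: fragment lengths} produced by the four reactions."""
    tubes = {base: [] for base in "ACGT"}
    for position, base in enumerate(strand, start=1):
        # A fragment terminates wherever the matching ddNTP is incorporated,
        # so every occurrence of a base contributes one fragment length.
        tubes[base].append(position)
    return tubes

def read_gel(tubes, length):
    """Read the sequence from the shortest fragment (bottom of gel) upwards."""
    lane_of_length = {}
    for base, lengths in tubes.items():
        for fragment_length in lengths:
            lane_of_length[fragment_length] = base
    return "".join(lane_of_length[n] for n in range(1, length + 1))

if __name__ == "__main__":
    strand = "ATGCCTAGGA"              # hypothetical example sequence
    tubes = run_four_tubes(strand)
    print(tubes)                       # e.g. the ddA tube holds lengths 1, 7, 10
    assert read_gel(tubes, len(strand)) == strand
```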
In each tube, DNA polymerase extends the primed strands using the dNTPs, and whenever a ddNTP is incorporated into the growing polynucleotide it terminates further strand synthesis. This is due to the lack of the hydroxyl group at the 3' position of the ddNTP, which prevents the next nucleotide from attaching to the strand. The contents of the four tubes are separated by gel electrophoresis on acrylamide gels (see Gel Electrophoresis below). Figure 6 shows the sequencing procedure.

Reading the sequence is straightforward [1]. The band that has moved the furthest is located first; this represents the smallest piece of DNA and is the strand terminated by incorporation of the dideoxynucleotide at the first position in the template. The track in which this band occurs is noted. For example (shown in Figure 6), if the band that moved the furthest is in track A, the first nucleotide in the sequence is A. To find the next nucleotide, the next most mobile band, corresponding to a DNA molecule one nucleotide longer than the first, is located; in this example that band is in track T, so the second nucleotide is T and the overall sequence so far is AT. The process is carried on along the autoradiograph until the individual bands start to close up and become inseparable, and therefore hard to read. In general it is possible to read up to 400 nucleotides from one autoradiograph with this method. Figure 7 is a schematic representation of an autoradiograph. Ever since Sanger perfected this method of DNA sequencing, there have been advances in sequencing methods along with notable achievements; certain achievements, such as the Human Genome Project, shall be discussed later on in this review.

Gel Electrophoresis

Gel electrophoresis is defined as the movement of charged molecules in an electric field [1][8]. DNA molecules, like many other biological compounds, carry an electric charge; in the case of DNA, this charge is negative. Therefore, when DNA molecules are placed in an electric field, they migrate towards the positive pole (as shown in Figure 8). Three factors affect the rate of migration: shape, electrical charge and size. The polyacrylamide gel comprises a complex network of pores through which the molecules must travel to reach the anode.

Maxam and Gilbert Method

The Maxam and Gilbert method was proposed shortly before Sanger's method, in the same year. While Sanger's method involves the enzymatic synthesis of radiolabelled fragments from unlabelled DNA strands [2], the Maxam-Gilbert method involves chemical cleavage of pre-labelled DNA strands in four different ways to form four different collections of labelled fragments [6][7]. Both methods use gel electrophoresis to separate the DNA target molecules [8]. However, Sanger's chain termination method has been proven to be simpler and easier to use than the Maxam and Gilbert method [9]; as a matter of fact, looking through the literature, it is Sanger's method of DNA sequencing that textbooks explain rather than Maxam and Gilbert's [1][3][9][10]. In Maxam and Gilbert's method there are two chemical cleavage reactions [6][7]. One reaction takes place at guanine and adenine, the two purines, and the other cleaves the DNA at cytosine and thymine, the pyrimidines. Specific reagents are used for each reaction: the purine-specific reagent is dimethyl sulphate and the pyrimidine-specific reagent is hydrazine. Each of these reactions is done in a different way, as each of the four bases has different chemical properties. The cleavage reaction for guanine/adenine uses dimethyl sulphate to add a methyl group to the guanines at the N7 position and to the adenines at the N3 position [7]. The glycosidic bond of a methylated adenine is unstable and breaks easily on heating at neutral pH, leaving the sugar free. Treatment with 0.1 M alkali at 90 °C then cleaves the sugar from the neighbouring phosphate groups. When the resulting end-labelled fragments are resolved on a polyacrylamide gel, the autoradiograph contains a pattern of dark and light bands. The dark bands arise from breakage at the guanines, which methylate at a rate 5-fold faster than the adenines. Because the guanine bands therefore appear stronger than the adenine bands, which can lead to misinterpretation, an adenine-enhanced cleavage reaction is also carried out. Figure 9 shows the structural changes of guanine during the modifications involved in Maxam-Gilbert sequencing. In the adenine-enhanced cleavage, the glycosidic bond of methylated adenosine is less stable than that of methylated guanosine, so gentle treatment with dilute acid at the methylation step releases the adenine, allowing darker adenine bands to appear on the autoradiograph [7]. The chemical cleavage at the cytosine and thymine residues involves hydrazine instead of dimethyl sulphate. The hydrazine cleaves the base, leaving ribosylurea [7]. After partial hydrazinolysis in 15-18 M aqueous hydrazine at 20 °C, the DNA is cleaved with 0.5 M piperidine. The piperidine (a cyclic secondary amine), as the free base, displaces all the products of the hydrazine reaction from the sugars and catalyses the β-elimination of the phosphates. The final pattern contains bands of similar intensity from the cleavages at the cytosines and thymines. For cleavage at cytosine only, the presence of 2 M NaCl preferentially suppresses the reaction of thymine with hydrazine. Once the cleavage reaction has taken place, each original strand is broken into a labelled fragment and an unlabelled fragment [7]. All the labelled fragments start at the 5' end of the strand and terminate at the base that precedes the site of cleavage along the original strand. Only the labelled fragments are recorded on the gel electrophoresis.

Dye-labelled terminators

For many years DNA sequencing was done by hand, which is both laborious and expensive [3]. Before automated sequencing, about 4 x 10^6 bases of DNA had been sequenced using the Sanger and Maxam-Gilbert methods [11]. Both methods require four sets of reactions and a subsequent electrophoresis step in adjacent lanes of a high-resolution polyacrylamide gel. With the new automated sequencing procedures, four different fluorophores are used, one in each of the base-specific reactions. The reaction products are combined and co-electrophoresed, and the DNA fragments generated in each reaction are detected near the bottom of the gel and identified by their colour. As for the choice of sequencing chemistry, Sanger's method was chosen, because it had been proven to be the most durable and efficient method of DNA sequencing and was the choice of most investigators in large-scale sequencing [12]. Figure 10 shows how a typical sequence is generated using an automated sequencer.

The selection of the dyes was the central development of automated DNA sequencing [11]. The fluorophores that were selected had to meet several criteria: the absorption and emission maxima had to be in the visible region of the spectrum [11], which is between 380 nm and 780 nm [10]; each dye had to be easily distinguishable from the others [11]; and the dyes should not impair the hybridisation of the oligonucleotide primer, as this would decrease the reliability of synthesis in the sequencing reactions. Figure 11 shows the structures of the dyes used in a typical automated sequencing procedure, where X is the moiety through which the dye is bound. Table 1 shows which dye is covalently attached to which nucleotide in a typical automated DNA sequencing procedure.

Table 1. Dye attached to each nucleotide in a typical automated DNA sequencing procedure
Fluorescein: adenosine
NBD: thymine
Tetramethylrhodamine: guanine
Texas Red: cytosine

In designing the instrumentation of the fluorescence detection apparatus, the primary consideration was sensitivity. As the concentration of each band on the co-electrophoresis gel is around 10 M, the instrument needs to be capable of detecting dye concentrations of that order. This level of detection can readily be achieved by commercial spectrofluorimeter systems. Unfortunately, detection from a gel leads to a much higher background scatter, which in turn leads to a decrease in sensitivity; this is solved by using a laser excitation source in order to obtain maximum sensitivity [11]. Figure 12 is a schematic diagram of the instrument with an explanation of the instrumentation employed.

When analyzing the data, Hood found some complications [11]. Firstly, the emission spectra of the different dyes overlap; to overcome this, multicomponent analysis was employed to determine the amounts of the four dyes present in the gel at any given time. Secondly, the different dye molecules impart non-identical electrophoretic mobilities to the DNA fragments, so that fragments of equal base length do not co-migrate exactly. The third major complication in analyzing the data comes from the imperfections of the enzymatic methods; for instance, there are often regions of the autoradiograph that are difficult to sequence. These complications were overcome in five steps [11]:

1. High-frequency noise is removed using a low-pass Fourier filter.
2. A time delay (1.5-4.5 s) between measurements at different wavelengths is partially corrected for by linear interpolation between successive measurements.
3. A multicomponent analysis is performed on each set of four data points; this computation yields the amount of each of the four dyes present in the detector as a function of time.
4. The peaks present in the data are located.
5. The mobility shift introduced by the dyes is corrected for using empirically determined correction factors.

Since the publication of Hood's proposal of fluorescence detection in automated DNA sequence analysis, research has focused on developing dyes that offer better sensitivity [12].

Bacterial and Viral Genome Sequencing (Shotgun Sequencing)

Prior to 1995, many viral genomes had been sequenced using Sanger's chain termination technique [13], but no bacterial genome had been sequenced. The viral genomes that had been sequenced include the 229 kb genome of cytomegalovirus [14] and the 192 kb genome of vaccinia [15]; the 187 kb mitochondrial and 121 kb chloroplast genomes of Marchantia polymorpha had also been sequenced [16]. Viral genome sequencing had been based upon the sequencing of clones, usually derived from extensively mapped restriction fragments, or λ or cosmid clones [17]. Despite advances in DNA sequencing technology, the sequencing of genomes had not progressed beyond clones on the order of ~250 kb, owing to the lack of computational approaches that would enable the efficient assembly of a large number of fragments into an ordered single assembly [13][17]. Upon this, Venter and Smith in 1995 proposed shotgun sequencing, which enabled Haemophilus influenzae (H. influenzae) to become the first bacterial genome to be sequenced [13][17]. H. influenzae was chosen because it has a base composition similar to that of a human, with 38% of the sequence made up of G + C. Table 2 shows the procedure of shotgun sequencing [17].

When constructing the library, ultrasonic waves were used to randomly fragment the genomic DNA into fairly small pieces, about the size of a gene [13]. The fragments were purified and then attached to plasmid vectors [13][17]. The plasmid vectors were then inserted into E. coli host cells to produce a library of plasmid clones. The E. coli host cell strains had no restriction enzymes, which prevented deletions, rearrangements and loss of the clones [17]. The fragments were randomly sequenced using automated sequencers (with dye-labelled terminators), with T7 and SP6 primers used to sequence the ends of the inserts and to enable coverage of the fragments by a factor of 6 [17].

Table 2. The shotgun sequencing procedure (Reference 17)
Random small-insert and large-insert library construction: shear genomic DNA randomly to ~2 kb and 15 to 20 kb respectively.
Library plating: verify the random nature of the library and maximize random selection of small-insert and large-insert clones for template production.
High-throughput DNA sequencing: sequence a sufficient number of fragments from both ends for 6x coverage.
Assembly: assemble random sequence fragments and identify repeat regions.
Gap closure (physical gaps): order all contigs (fingerprints, peptide links, λ clones, PCR) and provide templates for closure.
Gap closure (sequence gaps): complete the genome sequence by primer walking.
Editing: inspect the sequence visually and resolve sequence ambiguities, including frameshifts.
Annotation: identify and describe all predicted coding regions (putative identifications, starts and stops, role assignments, operons, regulatory regions).

Once the sequencing reactions have been completed, the fragments need to be assembled, and this is done using the TIGR Assembler software (The Institute for Genomic Research) [17]. The TIGR Assembler simultaneously clusters and assembles fragments of the genome. In order to obtain the speed necessary to assemble more than 10^4 fragments [17], an algorithm builds a table of all 10-bp oligonucleotide subsequences to generate a list of potential sequence fragment overlaps. The algorithm begins with an initial contig (a single fragment); to extend the contig, a candidate fragment is chosen based on shared oligonucleotide content. The initial contig and the candidate fragment are aligned by a modified version of the Smith-Waterman algorithm [18], which allows optimal gapped alignments. The contig is extended by the fragment only if strict criteria of overlap content are met. The algorithm automatically lowers these criteria in regions of minimal coverage and raises them in regions with a possible repetitive element [17]. The TIGR Assembler is designed to take advantage of large clone sizes [17].
It also enforces a constraint that sequence from two ends of the same template point toward one another in the contig and are located within a certain range of the base pair [17]. Therefore the TIGR assembler provides the computational power to assemble the fragments. Once the fragments have been aligned, the TIGR Editor is used to proofread the sequence and check for any ambiguities in the data [17]. With this technique it does required precautionary care, for instance the small insert in the library should be constructed and end-sequenced concurrently [17]. It is essential that the sequence fragments are of the highest quality and should be rigorously check for any contamination [17]. Pyrosequencing Most of the DNA sequencing required gel-electrophoresis, however in 1996 at the Royal Institute of Technology, Stockholm, Ronaghi proposed Pyrosequencing [19][20]. This is an example of sequencing-by-synthesis, where DNA molecules are clonally amplified on a template, and this template then goes under sequencing [25]. This approach relies on the detection of DNA polymerase activity by enzymatic luminometric inorganic pyrophosphate (PPi) that is released during DNA synthesis and goes under detection assay and offers the advantage of real-time detection [19]. Ronaghi used Nyren [21] description of an enzymatic system consisting of DNA polymerase, ATP sulphurylase and lucifinerase to couple the release of PPi obtained when a nucleotide is incorporated by the polymerase with light emission that can be easily detected by a luminometer or photodiode [20]. When PPi is released, it is immediately converted to adenosine triphosphate (ATP) by ATP sulphurylase, and the level of generated ATP is sensed by luciferase-producing photons [19][20][21]. The unused ATP and deoxynucleotide are degraded by the enzyme apyrase. The presence or absence of PPi, and therefore the incorporation or nonincorporation of each nucleotide added, is ultimately assessed on the basis of whether or not the photons are detected. There is minimal time lapse between these events, and the conditions of the reaction are such that iterative addition of the nucleotides and PPi detection are possible. The release of PPi via the nucleotide incorporation, it is detected by ELIDA (Enzymatic Luminometric Inorganic pyrophosphate Detection Assay) [19][21]. It is within the ELIDA, the PPi is converted to ATP, with the help of ATP sulfurylase and the ATP reacts with the luciferin to generate the light at more than 6 x 109 photons at a wavelength of 560 nm which can be detected by a photodiode, photomultiplier tube, or charge-coupled device (CCD) camera [19][20]. As mentioned before, the DNA molecules need to be amplified by polymerase chain reaction (PCR which is discussed later Ronaghi observed that dATP interfered with the detection system [19]. This interference is a major problem when the method is used to detect a single-base incorporation event. This problem was rectified by replacing the dATP with dATPaS (deoxyadenosine a–thiotrisulphate). It is noticed that adding a small amount of the dATP (0.1 nmol) induces an instantaneous increase in the light emission followed by a slow decrease until it reached a steady-state level (as Figure 11 shows). This makes it impossible to start a sequencing reaction by adding dATP; the reaction must instead be started by addition of DNA polymerase. The signal-to-noise ratio also became higher for dATP compared to the other nucleotides. 
On the other hand, addition of 8 nmol dATPaS (80-fold higher than the amount of dATP) had only a minor effect on luciferase (as Figure 14 shows). However dATPaS is less than 0.05% as effective as dATP as a substrate for luciferase [19]. Pyrosequencing is adapted by 454 Life Sciences for sequencing by synthesis [22] and is known as the Genome Sequencer (GS) FLX [23][24]. The 454 system consist of random ssDNA (single-stranded) fragments, and each random fragment is bound to the bead under conditions that allow only one fragment to a bead [22]. Once the fragment is attached to the bead, clonal amplification occurs via emulsion. The emulsified beads are purified and placed in microfabricated picolitre wells and then goes under pyrosequencing. A lens array in the detection of the instrument focuses luminescene from each well onto the chip of a CCD camera. The CCD camera images the plate every second in order to detect progression of the pyrosequencing [20][22]. The pyrosequencing machine generates raw data in real time in form of bioluminescence generated from the reactions, and data is presented on a pyrogram [20] Sequencing by Hybridisation As discussed earlier with chain-termination, Maxamm and Gilbert and pyrosequencing, these are all direct methods of sequencing DNA, where each base position is determined individually [26]. There are also indirect methods of sequencing DNA in which the DNA sequence is assembled based on experimental determination of oligonucleotide content of the chain. One promising method of indirect DNA sequencing is called Sequencing by Hybridisation in which sets of oligonucleotide probes are hybridised under conditions that allow the detection of complementary sequences in the target nucleic acid [26]. Sequencing by Hybridisation (SBH) was proposed by Drmanac et al in 1987 [27] and is based on Dotys observation that when DNA is heated in solution, the double-strand melts to form single stranded chains, which then re-nature spontaneously when the solution is cooled [28]. This results the possibility of one piece of DNA recognize another. And hence lead to Drmanac proposal of oligonucleotides pro bes being hybridised under these conditions allowing the complementary sequence in the DNA target to be detected [26][27]. In SBH, an oligonucleotide probe (n-mer probe where n is the length of the probe) is a substring of a DNA sample. This process is similar to doing a keyword search in a page full of text [29]. The set of positively expressed probes is known as the spectrum of DNA sample. For example, the single strand DNA 5GGTCTCG 3 will be sequenced using 4-mer probes and 5 probes will hybridise onto the sequence successfully. The remaining probes will form hybrids with a mismatch at the end base and will be denatured during selective washing. The five probes that are of good match at the end base will result in fully matched hybrids, which will be retained and detected. Each positively expressed serves as a platform to decipher the next base as is seen in Figure 16. For the probes that have successfully hybridised onto the sequence need to be detected. This is achieved by labelling the probes with dyes such as Cyanine3 (Cy3) and Cyanine5 (Cy5) so that the degree of hybridisation can be detected by imaging devices [29]. SBH methods are ideally suited to microarray technology due to their inherent potential for parallel sample processing [29]. 
Advances in DNA Sequencing Technologies

Abstract

Recent advances in DNA sequencing technologies have led to efficient methods for determining the sequence of DNA. DNA sequencing was born in 1977, when Sanger et al. proposed the chain termination method and Maxam and Gilbert proposed their own method in the same year; Sanger's method proved the more favourable of the two. Because Sanger's original procedure was laborious, time consuming and expensive, more efficient sequencing technologies have been developed ever since: Hood et al. proposed automated sequencers involving dye-labelled terminators. Due to the lack of available computational power prior to 1995, sequencing an entire bacterial genome was considered out of reach.
This became a reality when Venter and Smith proposed shotgun sequencing in 1995. Pyrosequencing was introduced by Ronaghi in 1996; this method produces the sequence in real time and is applied by 454 Life Sciences. An indirect method of sequencing DNA, called sequencing by hybridisation, was proposed by Drmanac in 1987 and led to the DNA array used by Affymetrix. Nanopore sequencing is a single-molecule technique in which single-stranded DNA passes through a lipid bilayer via an ion channel and the ionic conductance is measured; synthetic nanopores are being produced to substitute for the lipid bilayer. Illumina sequencing is one of the latest technologies to be developed, involving DNA clustering on flow cells and four dye-labelled terminators performing reversible termination. DNA sequencing has not only been applied to sequence DNA but also to the real world: it has been involved in the Human Genome Project and in DNA fingerprinting.

Introduction

Reliable DNA sequencing became a reality in 1977, when Frederick Sanger perfected the chain termination method to sequence the genome of bacteriophage ΦX174 [1][2]. Before Sanger's proposal of the chain termination method there was the plus and minus method, also presented by Sanger along with Coulson [2]. The plus and minus method depended on the use of DNA polymerase to transcribe the specific DNA sequence under controlled conditions; it was considered efficient and simple, but it was not accurate [2]. Alongside Sanger's chain termination sequencing, another method of DNA sequencing, based on chemical cleavage, was introduced by Maxam and Gilbert and was also reported in 1977, the same year as Sanger's method. The Maxam and Gilbert method shall be discussed in more detail later in this review. The proposal of these two methods spurred many further DNA sequencing methods, and as technology developed, so did DNA sequencing. In this literature review the various DNA sequencing technologies shall be examined, as well as their applications in the real world and the tools that have aided DNA sequencing, e.g. PCR. The review begins with the discussion of the chain termination method by Sanger.

The Chain Termination Method

Sanger discovered that the inhibitory activity of 2′,3′-dideoxythymidine triphosphate (ddTTP) on DNA polymerase I was dependent on its incorporation into the growing oligonucleotide chain in the place of thymidylic acid (dT) [2]. In the structure of ddT there is no 3′-hydroxyl group; a hydrogen atom is present in its place. With the hydrogen in place of the hydroxyl group the chain cannot be extended any further, so termination occurs at the position where a dT would otherwise have been incorporated. Figure 1 shows the structures of a dNTP and a ddNTP. In order to remove the 3′-hydroxyl group and replace it with a proton, the triphosphate has to undergo a chemical procedure [1].
There is a different preparation procedure for each of the four triphosphates. ddATP was produced from the starting material 3′-O-tosyl-2′-deoxyadenosine, which was treated with sodium methoxide in dimethylformamide to produce 2′,3′-dideoxy-2′,3′-didehydroadenosine, an unsaturated compound [4]. The double bond between carbons 2′ and 3′ of the sugar ring was then hydrogenated with a palladium-on-carbon catalyst to give 2′,3′-dideoxyadenosine (ddA). The ddA was then phosphorylated to add the triphosphate group, and purification took place on a DEAE-Sephadex column using a gradient of triethylamine carbonate at pH 8.4. Figure 2 is a schematic representation of the route to ddA prior to phosphorylation.

In the preparation of ddTTP (Figure 3), thymidine was tritylated (+CPh3) at the 5′-position and a methanesulphonyl (+CH3SO2) group was introduced at the 3′-OH group [5]. The methanesulphonyl group was substituted with iodine by refluxing the compound in 1,2-dimethoxyethane in the presence of NaI. After chromatography on a silica column, the 5′-trityl-3′-iodothymidine was hydrogenated in 80% acetic acid to remove the trityl group. The resultant 3′-iodothymidine was hydrogenated to produce 2′,3′-dideoxythymidine, which was subsequently phosphorylated. Once phosphorylated, ddTTP was purified on a DEAE-Sephadex column with a triethylammonium hydrogen carbonate gradient. Figure 3 is a schematic representation of the route to ddT prior to phosphorylation.

When preparing ddGTP, the starting material was N-isobutyryl-5′-O-monomethoxytrityldeoxyguanosine [1]. After tosylation of the 3′-OH group, the compound was converted to the 2′,3′-didehydro derivative with sodium methoxide. The isobutyryl group was partly removed during this sodium methoxide treatment and was removed completely by incubation in the presence of NH3 overnight at 45 °C. During the overnight incubation the didehydro derivative was reduced to the dideoxy derivative, which was then converted to the triphosphate and purified by fractionation on a DEAE-Sephadex column using a triethylamine carbonate gradient. Figure 4 is a schematic representation of the route to ddG prior to phosphorylation. The preparation of ddCTP was similar to that of ddGTP, but started from N-anisoyl-5′-O-monomethoxytrityldeoxycytidine. The purification step was omitted for ddCTP, as it produced a very low yield; the solution was therefore used directly in the experiment described in the paper [2]. Figure 5 is a schematic representation of the route to ddC prior to phosphorylation.

With the four dideoxy samples prepared, the sequencing procedure can commence. The dideoxy samples are placed in separate tubes, along with restriction fragments obtained from ΦX174 replicative-form DNA, which serve as primers, and the four dNTPs [2]. DNA polymerase extends each primer using the dNTPs, and whenever a ddNTP is incorporated into the growing polynucleotide it terminates further strand synthesis. This is due to the lack of a hydroxyl group at the 3′ position of the ddNTP, which prevents the next nucleotide from attaching to the strand. The contents of the four tubes are then separated by gel electrophoresis on acrylamide gels (see Gel-Electrophoresis). Figure 6 shows the sequencing procedure. Reading the sequence is straightforward [1]. The band that has moved the furthest is located first; this represents the smallest piece of DNA and is the strand terminated by incorporation of a dideoxynucleotide at the first position in the template. The track in which this band occurs is noted.
For example (as shown in Figure 6), the band that moved the furthest is in track A, so the first nucleotide in the sequence is A. To find the next nucleotide, the next most mobile band is located; it corresponds to a DNA molecule one nucleotide longer than the first, and in this example the band is in track T. Therefore the second nucleotide is T, and the overall sequence so far is AT. The process is carried on along the autoradiograph until the individual bands start to close up and become inseparable, and therefore hard to read. In general it is possible to read up to 400 nucleotides from one autoradiograph with this method. Figure 7 is a schematic representation of an autoradiograph. Ever since Sanger perfected this method of DNA sequencing there have been advances in sequencing methods, along with notable achievements such as the Human Genome Project, which shall be discussed later in this review.

Gel-Electrophoresis

Gel electrophoresis is defined as the movement of charged molecules in an electric field [1][8]. DNA molecules, like many other biological compounds, carry an electric charge; in the case of DNA this charge is negative. Therefore, when DNA molecules are placed in an electric field they migrate towards the positive pole (as shown in Figure 8). Three factors affect the rate of migration: shape, electrical charge and size. The polyacrylamide gel comprises a complex network of pores through which the molecules must travel to reach the anode.
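To make the band-reading logic described above concrete, the short Python sketch below (not taken from any of the cited papers; the band data are invented) reconstructs a sequence from a list of bands, each recorded as a fragment length together with the lane, i.e. the ddNTP reaction, in which it appears.

```python
# Minimal sketch of reading a chain-termination gel: the shortest fragment
# ran furthest, so sorting the bands by fragment length reproduces the
# bottom-to-top reading order of the autoradiograph.

def read_autoradiograph(bands):
    """bands: iterable of (fragment_length, lane) pairs; returns the sequence."""
    return "".join(lane for _length, lane in sorted(bands))

# Hypothetical bands; the two most mobile bands lie in tracks A and T,
# so the read starts "AT", as in the worked example above.
bands = [(1, "A"), (2, "T"), (3, "G"), (4, "G"), (5, "C"), (6, "A")]
print(read_autoradiograph(bands))  # prints ATGGCA
```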
Maxam and Gilbert Method

The Maxam and Gilbert method was proposed shortly before Sanger's method in the same year. While Sanger's method enzymatically generates radiolabelled fragments from unlabelled DNA strands [2], the Maxam-Gilbert method involves chemical cleavage of pre-labelled DNA strands in four different ways to form four different collections of labelled fragments [6][7]. Both methods use gel electrophoresis to separate the DNA target molecules [8]. However, Sanger's chain termination method has proven to be simpler and easier to use than the Maxam and Gilbert method [9]; indeed, most literature textbooks explain Sanger's method of DNA sequencing rather than Maxam and Gilbert's [1][3][9][10].

In Maxam and Gilbert's method two chemical cleavage reactions take place [6][7]. One of the reactions takes place at guanine and adenine, the two purines, and the other cleaves the DNA at cytosine and thymine, the pyrimidines. A specific reagent is used for each reaction: the purine-specific reagent is dimethyl sulphate and the pyrimidine-specific reagent is hydrazine. Each of these reactions is carried out in a different way, as each of the four bases has different chemical properties. The cleavage reaction for guanine/adenine uses dimethyl sulphate to add a methyl group to the guanines at the N7 position and to the adenines at the N3 position [7]. The glycosidic bond of a methylated adenine is unstable and breaks easily on heating at neutral pH, leaving the sugar free. Treatment with 0.1 M alkali at 90 °C then cleaves the sugar from the neighbouring phosphate groups. When the resulting end-labelled fragments are resolved on a polyacrylamide gel, the autoradiograph contains a pattern of dark and light bands. The dark bands arise from breakage at the guanines, which methylate at a rate five-fold faster than the adenines. From this reaction the guanine bands therefore appear stronger than the adenine bands, which can lead to misinterpretation, so an adenine-enhanced cleavage reaction is also carried out. Figure 9 shows the structural changes that guanine undergoes during the modifications involved in Maxam-Gilbert sequencing. In the adenine-enhanced cleavage, the glycosidic bond of methylated adenosine is less stable than that of methylated guanosine, so gentle treatment with dilute acid at the methylation step releases the adenine, allowing darker adenine bands to appear on the autoradiograph [7].

The chemical cleavage at cytosine and thymine residues involves hydrazine instead of dimethyl sulphate. The hydrazine cleaves the base, leaving ribosylurea [7]. After partial hydrazinolysis in 15-18 M aqueous hydrazine at 20 °C, the DNA is cleaved with 0.5 M piperidine. The piperidine (a cyclic secondary amine), as the free base, displaces all the products of the hydrazine reaction from the sugars and catalyses the β-elimination of the phosphates. The final pattern contains bands of similar intensity from the cleavages at cytosines and thymines. For cleavage at cytosine only, the presence of 2 M NaCl preferentially suppresses the reaction of thymine with hydrazine. Once the cleavage reaction has taken place, each original strand is broken into a labelled fragment and an unlabelled fragment [7]. All the labelled fragments start at the 5′ end of the strand and terminate at the base that precedes the cleavage site along the original strand. Only the labelled fragments are recorded on the gel electrophoresis.
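The way the four cleavage reactions jointly identify each base can be expressed as simple set logic, as in the Python sketch below. This is an illustration only, with invented cleavage positions: a position cleaved in the adenine-enhanced (G+A) reaction but not in the guanine-specific reaction is read as A, and a position cleaved by hydrazine (C+T) but not in the NaCl-suppressed (C-only) reaction is read as T.

```python
# Illustrative Maxam-Gilbert lane logic with invented data. Each set holds
# the positions (counted from the labelled 5' end) cleaved in one reaction.
g_lane  = {3, 6}        # dimethyl sulphate: guanine-specific
ga_lane = {1, 3, 6}     # adenine-enhanced: guanine and adenine
ct_lane = {2, 4, 5}     # hydrazine: cytosine and thymine
c_lane  = {4}           # hydrazine + 2 M NaCl: cytosine only

sequence = []
for pos in range(1, 7):
    if pos in g_lane:
        sequence.append("G")
    elif pos in ga_lane:        # in G+A but not G alone, so adenine
        sequence.append("A")
    elif pos in c_lane:
        sequence.append("C")
    elif pos in ct_lane:        # in C+T but not C alone, so thymine
        sequence.append("T")
print("".join(sequence))        # prints ATGCTG
```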
Dye-labelled terminators

For many years DNA sequencing was done by hand, which is both laborious and expensive [3]. Before automated sequencing, about 4 × 10⁶ bases of DNA had been sequenced following the introduction of the Sanger and Maxam-Gilbert methods [11]. Both methods require four sets of reactions and a subsequent electrophoresis step in adjacent lanes of a high-resolution polyacrylamide gel. With the new automated sequencing procedures, four different fluorophores are used, one in each of the base-specific reactions. The reaction products are combined and co-electrophoresed, and the DNA fragments generated in each reaction are detected near the bottom of the gel and identified by their colour. As the basis for automation, Sanger's method was chosen, because it had proven to be the most durable and efficient method of DNA sequencing and was the choice of most investigators in large-scale sequencing [12]. Figure 10 shows how a typical sequence is generated using an automated sequencer.

The selection of the dyes was the central development of automated DNA sequencing [11]. The fluorophores that were selected had to meet several criteria: the absorption and emission maxima had to be in the visible region of the spectrum [11], which is between 380 nm and 780 nm [10], and each dye had to be easily distinguishable from the others [11]. The dyes should also not impair the hybridisation of the oligonucleotide primer, as this would decrease the reliability of synthesis in the sequencing reactions. Figure 11 shows the structures of the dyes used in a typical automated sequencing procedure, where X is the moiety to which the dye is bound. Table 1 shows which dye is covalently attached to which nucleotide in a typical automated DNA sequencing procedure.

Table 1
Dye                   Nucleotide attached
Fluorescein           Adenosine
NBD                   Thymine
Tetramethylrhodamine  Guanine
Texas Red             Cytosine

In designing the instrumentation of the fluorescence detection apparatus, the primary consideration was sensitivity. As the concentration of each band on the co-electrophoresis gel is around 10 M, the instrument needs to be capable of detecting dye concentrations of that order. This level of detection can readily be achieved by commercial spectrofluorimeter systems. Unfortunately, detection from a gel leads to a much higher background scatter, which in turn decreases sensitivity; this is solved by using a laser excitation source in order to obtain maximum sensitivity [11]. Figure 12 is a schematic diagram of the instrument with an explanation of the instrumentation employed.

When analysing the data, Hood found some complications [11]. Firstly, the emission spectra of the different dyes overlap; to overcome this, multicomponent analysis was employed to determine the amounts of the four dyes present in the gel at any given time. Secondly, the different dye molecules impart non-identical electrophoretic mobilities to the DNA fragments, which means that bands at the same position in the gel do not necessarily correspond to oligonucleotides of equal length. The third major complication in analysing the data comes from imperfections of the enzymatic methods; for instance, there are often regions of the autoradiograph that are difficult to sequence. These complications were overcome in five steps [11]:

1. High-frequency noise is removed using a low-pass Fourier filter.
2. A time delay (1.5-4.5 s) between measurements at different wavelengths is partially corrected for by linear interpolation between successive measurements.
3. A multicomponent analysis is performed on each set of four data points; this computation yields the amount of each of the four dyes present in the detector as a function of time.
4. The peaks present in the data are located.
5. The mobility shift introduced by the dyes is corrected for using empirically determined correction factors.

Since the publication of Hood's proposal of fluorescence detection in automated DNA sequence analysis, research has focussed on developing detection systems that are better in terms of sensitivity [12].
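As a rough illustration of the last two correction steps, the Python sketch below assigns a base to each detected fluorescence peak using the dye-to-nucleotide mapping of Table 1 and then re-orders the calls after subtracting a dye-specific mobility shift. The peak positions and shift values are invented for the example and are not taken from reference [11].

```python
# Toy base-calling step: correct dye-dependent mobility shifts, then read
# the peaks in order of corrected position. All numbers are hypothetical.
DYE_TO_BASE = {"fluorescein": "A", "NBD": "T",
               "tetramethylrhodamine": "G", "texas red": "C"}

MOBILITY_SHIFT = {"fluorescein": 0.0, "NBD": 1.5,
                  "tetramethylrhodamine": 0.8, "texas red": 2.1}

def call_bases(peaks):
    """peaks: list of (raw_scan_position, dye) after filtering and peak finding."""
    corrected = sorted((pos - MOBILITY_SHIFT[dye], dye) for pos, dye in peaks)
    return "".join(DYE_TO_BASE[dye] for _pos, dye in corrected)

peaks = [(10.0, "fluorescein"), (13.5, "NBD"), (14.9, "texas red")]
print(call_bases(peaks))  # prints ATC
```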
Bacterial and Viral Genome Sequencing (Shotgun Sequencing)

Prior to 1995 many viral genomes had been sequenced using Sanger's chain termination technique [13], but no bacterial genome had been sequenced. The largest genomes sequenced by that point were the 229 kb genome of cytomegalovirus [14], the 192 kb genome of vaccinia [15], and the 187 kb mitochondrial and 121 kb chloroplast genomes of Marchantia polymorpha [16]. Such genome sequencing has been based upon the sequencing of clones usually derived from extensively mapped restriction fragments, or from λ or cosmid clones [17]. Despite advances in DNA sequencing technology, genome sequencing had not progressed beyond clones on the order of ~250 kb in size, owing to the lack of computational approaches that would enable the efficient assembly of a large number of fragments into an ordered single assembly [13][17]. Upon this, Venter and Smith in 1995 proposed shotgun sequencing, enabling Haemophilus influenzae (H. influenzae) to become the first bacterial genome to be sequenced [13][17]. H. influenzae was chosen as it has a base composition similar to that of the human genome, with 38% of the sequence made up of G + C. Table 2 shows the procedure of shotgun sequencing [17].

When constructing the library, ultrasonic waves were used to randomly fragment the genomic DNA into fairly small pieces, roughly the size of a gene [13]. The fragments were purified and then attached to plasmid vectors [13][17], and the plasmid vectors were inserted into E. coli host cells to produce a library of plasmid clones. The E. coli host cell strains had no restriction enzymes, which prevented deletions, rearrangements and loss of the clones [17]. The fragments are randomly sequenced using automated sequencers (see Dye-labelled terminators), with T7 and SP6 primers used to sequence the ends of the inserts and give roughly six-fold coverage of the genome [17].

Table 2 (from reference [17])
Stage: Description
Random small-insert and large-insert library construction: Shear genomic DNA randomly to ~2 kb and 15 to 20 kb respectively.
Library plating: Verify the random nature of the library and maximise random selection of small-insert and large-insert clones for template production.
High-throughput DNA sequencing: Sequence a sufficient number of fragments from both ends for 6x coverage.
Assembly: Assemble random sequence fragments and identify repeat regions.
Gap closure (physical gaps): Order all contigs (fingerprints, peptide links, λ clones, PCR) and provide templates for closure.
Gap closure (sequence gaps): Complete the genome sequence by primer walking.
Editing: Inspect the sequence visually and resolve sequence ambiguities, including frameshifts.
Annotation: Identify and describe all predicted coding regions (putative identifications, starts and stops, role assignments, operons, regulatory regions).

Once the sequencing reactions have been completed the fragments need to be assembled, and this is done using the TIGR Assembler software (TIGR: The Institute for Genomic Research) [17]. The TIGR Assembler simultaneously clusters and assembles fragments of the genome. In order to obtain the speed necessary to assemble more than 10⁴ fragments [17], an algorithm builds a table of all 10-bp oligonucleotide subsequences to generate a list of potential sequence fragment overlaps. The algorithm begins with an initial contig (a single fragment); to extend the contig, a candidate fragment is selected on the basis of shared oligonucleotide content. The initial contig and the candidate fragment are aligned by a modified version of the Smith-Waterman algorithm [18], which allows optimal gapped alignments. The contig is extended by the fragment only if strict criteria for the overlap match are met, and the algorithm automatically lowers these criteria in regions of minimal coverage and raises them in regions containing a possible repetitive element [17]. The TIGR Assembler is also designed to take advantage of large clone sizes [17]: it enforces the constraint that sequences from the two ends of the same template must point toward one another in the contig and be located within a certain range of base pairs [17]. The TIGR Assembler therefore provides the computational power to assemble the fragments. Once the fragments have been aligned, the TIGR Editor is used to proofread the sequence and check for any ambiguities in the data [17]. The technique does require precautionary care: for instance, the small-insert library should be constructed and end-sequenced concurrently [17], and it is essential that the sequence fragments are of the highest quality and are rigorously checked for any contamination [17].
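The overlap-and-extend idea behind the assembly stage can be illustrated with a toy greedy assembler, sketched below in Python. It is far simpler than the TIGR Assembler described above (there is no 10-bp oligonucleotide index, no quality information and no repeat handling), and the reads are invented, but it shows how fragments sharing a sufficient suffix-prefix overlap are merged into a single contig.

```python
# Toy greedy overlap-based assembly, in the spirit of (but much simpler
# than) the strategy described above.

def overlap(a, b, min_len=3):
    """Length of the longest suffix of a that is a prefix of b (>= min_len)."""
    for k in range(min(len(a), len(b)), min_len - 1, -1):
        if a.endswith(b[:k]):
            return k
    return 0

def greedy_assemble(fragments, min_len=3):
    contigs = list(fragments)
    while len(contigs) > 1:
        best = (0, None, None)
        for i, a in enumerate(contigs):
            for j, b in enumerate(contigs):
                if i != j:
                    k = overlap(a, b, min_len)
                    if k > best[0]:
                        best = (k, i, j)
        k, i, j = best
        if k == 0:                      # no overlaps left above the threshold
            break
        merged = contigs[i] + contigs[j][k:]
        contigs = [c for n, c in enumerate(contigs) if n not in (i, j)]
        contigs.append(merged)
    return contigs

reads = ["ATGGCTA", "GCTAGGT", "AGGTTCA"]   # hypothetical shotgun reads
print(greedy_assemble(reads))               # ['ATGGCTAGGTTCA']
```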
Pyrosequencing

Most DNA sequencing methods have required gel electrophoresis; however, in 1996 at the Royal Institute of Technology, Stockholm, Ronaghi proposed pyrosequencing [19][20]. This is an example of sequencing-by-synthesis, in which DNA molecules are clonally amplified on a template and this template then undergoes sequencing [25]. The approach relies on detecting DNA polymerase activity through the inorganic pyrophosphate (PPi) released during DNA synthesis, which is measured by an enzymatic luminometric assay, and it offers the advantage of real-time detection [19]. Ronaghi used Nyrén's description [21] of an enzymatic system consisting of DNA polymerase, ATP sulphurylase and luciferase to couple the release of PPi, obtained when a nucleotide is incorporated by the polymerase, with a light emission that can easily be detected by a luminometer or photodiode [20]. When PPi is released it is immediately converted to adenosine triphosphate (ATP) by ATP sulphurylase, and the level of generated ATP is sensed through the photons produced by luciferase [19][20][21]. Unused ATP and deoxynucleotides are degraded by the enzyme apyrase. The presence or absence of PPi, and therefore the incorporation or non-incorporation of each nucleotide added, is ultimately assessed on the basis of whether or not photons are detected. There is minimal time lapse between these events, and the conditions of the reaction are such that iterative addition of nucleotides and PPi detection are possible. The PPi released on nucleotide incorporation is detected by ELIDA (Enzymatic Luminometric Inorganic pyrophosphate Detection Assay) [19][21]. Within the ELIDA the PPi is converted to ATP with the help of ATP sulphurylase, and the ATP reacts with luciferin to generate more than 6 × 10⁹ photons at a wavelength of 560 nm, which can be detected by a photodiode, photomultiplier tube or charge-coupled device (CCD) camera [19][20]. As mentioned before, the DNA molecules first need to be amplified by the polymerase chain reaction (PCR, which is discussed later).

Ronaghi observed that dATP interfered with the detection system [19]. This interference is a major problem when the method is used to detect a single-base incorporation event, and it was rectified by replacing dATP with dATPαS (deoxyadenosine α-thiotriphosphate). It was noticed that adding a small amount of dATP (0.1 nmol) induces an instantaneous increase in light emission followed by a slow decrease until it reaches a steady-state level (as Figure 11 shows). This makes it impossible to start a sequencing reaction by adding dATP; the reaction must instead be started by addition of DNA polymerase. The signal-to-noise ratio also became higher for dATP than for the other nucleotides. On the other hand, addition of 8 nmol of dATPαS (80-fold more than the amount of dATP) had only a minor effect on the luciferase (as Figure 14 shows); dATPαS is less than 0.05% as effective as dATP as a substrate for luciferase [19].
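The read-out logic of pyrosequencing can be sketched as a small simulation, shown below in Python. It assumes an idealised reaction in which nucleotides are dispensed in a fixed cyclic order, the light emitted at each dispensation is proportional to the number of bases incorporated (so homopolymer runs give proportionally larger signals), and apyrase removes any unincorporated nucleotide before the next dispensation; the template is hypothetical.

```python
# Toy pyrogram simulation: light is proportional to the number of bases
# incorporated at each nucleotide dispensation (idealised, no noise).

def pyrogram(template, dispensations):
    """Return (nucleotide, signal) pairs for synthesis of the complementary strand."""
    complement = {"A": "T", "T": "A", "G": "C", "C": "G"}
    target = [complement[base] for base in template]   # bases the polymerase must add
    pos, signal = 0, []
    for nt in dispensations:
        count = 0
        while pos < len(target) and target[pos] == nt:
            count += 1          # a run of k identical bases gives k units of light
            pos += 1
        signal.append((nt, count))
    return signal

template = "TTAGC"              # hypothetical single-stranded template
order = "ACGT" * 2              # two cycles of the dispensation order A, C, G, T
print(pyrogram(template, order))
# [('A', 2), ('C', 0), ('G', 0), ('T', 1), ('A', 0), ('C', 1), ('G', 1), ('T', 0)]
```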
Pyrosequencing has been adapted by 454 Life Sciences for sequencing-by-synthesis [22] and is known as the Genome Sequencer (GS) FLX [23][24]. The 454 system starts from random single-stranded DNA (ssDNA) fragments, and each random fragment is bound to a bead under conditions that allow only one fragment per bead [22]. Once a fragment is attached to a bead, clonal amplification occurs via emulsion PCR. The emulsified beads are purified, placed in microfabricated picolitre wells and then subjected to pyrosequencing. A lens array in the detection part of the instrument focuses the luminescence from each well onto the chip of a CCD camera, and the CCD camera images the plate every second in order to follow the progression of the pyrosequencing [20][22]. The machine generates raw data in real time, in the form of the bioluminescence produced by the reactions, and the data are presented as a pyrogram [20].

Sequencing by Hybridisation

Chain termination, Maxam and Gilbert and pyrosequencing, discussed above, are all direct methods of sequencing DNA, in which each base position is determined individually [26]. There are also indirect methods, in which the DNA sequence is assembled based on experimental determination of the oligonucleotide content of the chain. One promising indirect method is called sequencing by hybridisation (SBH), in which sets of oligonucleotide probes are hybridised under conditions that allow the detection of complementary sequences in the target nucleic acid [26]. SBH was proposed by Drmanac et al. in 1987 [27] and is based on Doty's observation that when DNA is heated in solution the double strand melts to form single-stranded chains, which then renature spontaneously when the solution is cooled [28]. This makes it possible for one piece of DNA to recognise another, and it led to Drmanac's proposal that oligonucleotide probes hybridised under these conditions would allow the complementary sequence in the DNA target to be detected [26][27].

In SBH, an oligonucleotide probe (an n-mer probe, where n is the length of the probe) is a substring of a DNA sample; the process is similar to doing a keyword search in a page full of text [29]. The set of positively expressed probes is known as the spectrum of the DNA sample. For example, the single-stranded DNA 5′-GGTCTCG-3′ can be sequenced using 4-mer probes, four of which will hybridise onto the sequence successfully. The remaining probes will form hybrids with a mismatch at the end base and will be denatured during selective washing, whereas the fully matched hybrids will be retained and detected. Each positively expressed probe serves as a platform to decipher the next base, as is seen in Figure 16. The probes that have successfully hybridised onto the sequence then need to be detected; this is achieved by labelling the probes with dyes such as Cyanine 3 (Cy3) and Cyanine 5 (Cy5), so that the degree of hybridisation can be detected by imaging devices [29].

SBH methods are ideally suited to microarray technology due to their inherent potential for parallel sample processing [29]. An important advantage of using a DNA array rather than a multiple-probe array is that all the resulting probe-DNA hybrids in any single probe hybridisation are of identical sequence [29]. One of the main DNA hybridisation array formats is the oligonucleotide array, which is currently patented by Affymetrix [30]; its commercial uses shall be discussed under the application of the DNA array (Affymetrix). Due to the small size of the hybridisation array and the small amount of target present, it is a challenge to acquire the signals from a DNA array [29], and these signals must first be amplified before they can be detected by the imaging devices. Signals can be boosted by two means, namely target amplification and signal amplification: in target amplification, such as PCR, the amount of target is increased to enhance signal strength, while in signal amplification the amount of signal per unit of target is increased.
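The notion of a probe spectrum, and the way each positively expressed probe deciphers the next base, can be illustrated with the short Python sketch below. It computes the 4-mer spectrum of the 7-mer used in the example above and rebuilds the sequence by repeatedly looking for the single probe whose first three bases match the last three bases already determined; it assumes the spectrum is unambiguous, which a real target need not be.

```python
# Toy sequencing-by-hybridisation: the spectrum is the set of k-mers that
# would hybridise perfectly; reconstruction extends the sequence one base
# at a time using (k-1)-base overlaps. Illustrative only.

def spectrum(dna, k):
    return {dna[i:i + k] for i in range(len(dna) - k + 1)}

def reconstruct(probes, start, length):
    seq = start
    k = len(start)
    while len(seq) < length:
        suffix = seq[-(k - 1):]
        matches = [p for p in probes if p.startswith(suffix)]
        if len(matches) != 1:        # ambiguous or missing overlap: stop
            break
        seq += matches[0][-1]
    return seq

target = "GGTCTCG"                   # the 7-mer from the example in the text
probes = spectrum(target, 4)         # the four positively expressed 4-mer probes
print(sorted(probes))                # ['CTCG', 'GGTC', 'GTCT', 'TCTC']
print(reconstruct(probes, "GGTC", len(target)))   # prints GGTCTCG
```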
Nanopore Sequencing

Nanopore sequencing was proposed in 1996 by Branton et al., who showed that individual polynucleotide molecules can be characterised using a membrane channel [31]. Nanopore sequencing is an example of single-molecule sequencing, in which the concept of sequencing-by-synthesis is followed but without the prior amplification step [24]. This is achieved by measuring the ionic conductance of a single ion channel, in a biological membrane or planar lipid bilayer, as a polynucleotide passes through it. The measurement of ionic conductance is routine in neurobiology and biophysics [31], as well as in pharmacology (Ca²⁺ and K⁺ channels) [32] and biochemistry [9]. Most channels undergo voltage-dependent or ligand-dependent gating, but there are several large ion channels (e.g. Staphylococcus aureus α-hemolysin) which can remain open for extended periods, thereby allowing a continuous ionic current to flow across a lipid bilayer [31]. A transmembrane voltage applied across an open channel of appropriate size should draw DNA molecules through the channel as extended linear chains, whose presence would detectably reduce the ionic flow. It was assumed that this reduction in ionic flow would allow single-channel recordings to characterise the length, and hence other properties, of the polynucleotide. In Branton's proposal, α-hemolysin was used to form a single channel across a lipid bilayer separating two buffer-filled compartments [31]. α-Hemolysin is a monomeric, 33 kDa, 293-residue protein that is secreted by the human pathogen Staphylococcus aureus [33]. The nanopores are produced when α-hemolysin subunits are introduced into a buffered solution that is separated by a lipid bilayer into two compartments (known as cis and trans): the head of t