Thursday, October 31, 2019

Literary Lenses in our Media Essay Example

The themes include the quest and the fall, and contain legend and myth within the plot. “Heroes” has the basic underlying premise of any story dealing with a superhero. There is the good side and the bad side, and the adventures and turmoil that the characters must go through in regard to who they are. The legend and myth found within the story are typical of all superheroes – these people possess powers that have previously been heard of or seen only in myths and tales. Once the characters receive these powers of legend, they must wield them and use them as they see fit, to their benefit. After the powers have been discovered and the characters understand what they must do with them, what follows – their adventures and moments of rescue and bravery – are also tales found only in legend. This literary lens essentially shows things that normally should not be possible outside of legend, yet still prove possible in a modern-day tale. The characters fall into the same positions as their mythic counterparts, and the themes are set up in the same way, though modernized, of course. With older superhero films, such as Superman, Batman, and Spider-Man, the same rules apply: ordinary people discover that they can do extraordinary things, and they must learn to use those abilities for their benefit in a positive way. This does not always apply to the villains, though. Another literary lens that can be seen in “Heroes” is existentialism, which concerns how a person exists, lives their life, and alters it to fit their environment and purpose. The characters of “Heroes” both shape their existence and allow themselves to be shaped by it. They take who they are and use that knowledge and power to help others, and they let the needs of others decide for them what they have to do.
In â€Å"Heroes,† the characters are constantly choosing between what they have to do, what they want to do, and what

Tuesday, October 29, 2019

Galileo on religion and science Essay Example

Explain Galileo’s attempt to make science and religion compatible, with particular reference to methods of justification. How successful is he? Use Kuhn’s notion of incommensurability to investigate Galileo’s attempt to reconcile the propositions of science and religion. There will always be a battle between religion and science; it is a truth universally acknowledged. Galileo attempted to make the two compatible by suggesting that the truth can only be sought out if the notion under consideration can be accurately tested and if the opposing view can be shown to be false. Galileo goes into depth about the truth of scripture and the sciences, intertwined with the reason of man, in his letter to Christina of Lorraine, Grand Duchess of Tuscany. Early in the letter Galileo infers from St Augustine that the Holy Ghost did not intend to teach “how heaven goes” but rather “how one goes to heaven”. Galileo interprets this as the underlying basis for the “common” people to believe that man should not concern himself with science, and that to do so is against the Bible and therefore blasphemous. Furthermore, this misconception is continued, as Galileo believes that the “common” people understand the truth in the Bible to be largely to do with one’s salvation, and that other physical matters, such as whether the Sun or the Earth is mobile, are irrelevant. From this Galileo goes on to argue that “physical problems” can be solved through “sense-experiences” and reason, as well as through the “authority of scriptural passages”, highlighting that either is a valid source of the truth. Galileo argues that God gifted man with the power of reason and logic so that he would be able to discover and learn about His creation. God did not “set bounds to human ingenuity”, so why should the Church impose them by discrediting and prohibiting some of the greatest work of philosophers and scientists?
Galileo distinguished two types of physical propositions: those that can be subjected to tests and experiments, and those that can only be conjectured about; the latter are best left to faith and Holy Scripture. Galileo did not believe that everything in the universe was known to man; he believed that there were more truths to be discovered. He did not agree that “free-philosophizing” should be shunned just because the Holy Ghost states that “man cannot find out the work that God hath done from beginning even to end”. Galileo is not satisfied by this and urges others not to side with the common opinion of the scriptures to the point where it blocks logic and reason, and ultimately truth. He highlights this with the Sun and the Earth: the debate about the still Sun and the mobile Earth, which many came to believe – “it was most absurd to believe otherwise”. This is because Nicholas Copernicus provided evidence for his claims that was reasonable and logical; it was “plain as daylight”. Galileo stresses the importance of not bending the word of God to fulfil one’s “foolish fancies”, i.e. one should guard against deliberately misinterpreting Holy Scripture to suit one’s own ends. The repetition of this practice – citing scripture to back up skewed claims – will lead to the inevitable adoption of such understandings and consequently to missing the true meaning behind the scriptural passages. This is what Galileo means when he says that the “vulgar” and “common” people sometimes misinterpret the holy writings of the Bible “to fulfil their foolish fancies”. Because of this, Galileo advocates the importance of mankind stepping away from information that concerns only salvation and broadening its horizons; most importantly, of not being afraid of the truth.
By this he suggests that Holy Scripture and explanations of physical things each have their own place in the world, and that man should keep an open mind and explore both, because both are valid sources of the truth. He goes on to say that God did not intend for His Word to be misinterpreted like this and its true meaning obscured, and that this “sort of abuse” should not “gain countenance and authority”. However, as Galileo stated, it is “more pleasant to gain a reputation for wisdom” without experiments and research than it is to pursue science and produce evidence for physical observations. Galileo is thankful to God that He does not allow the majority of these “common folk” to have all the authority; some learned people have authority too. Galileo is not speaking blasphemy; he is suggesting there is a place for both Holy doctrine and scientific doctrine, and he attempts to prove this. Galileo is concerned with obtaining the truth about the world we live in, whether religion provides the answer or science does. This is evident in the letter when he discusses the “celestial bodies” in connection with the work of Copernicus. Galileo believes that it would be wholly unjustified to ban Copernicus’ work after it has been confirmed over the years; as he puts it, this would be “in my judgement a contravention of the truth”. In essence, Galileo believes that if something is true, its basis – whether religious or scientific – is equally creditable. He further states that if scientific fact is unattainable, the matter must be one of faith and religion; “where human reasoning cannot reach” and there “is no science but only opinion of faith”, the truth must be in Scripture. This is demonstrated in the letter by the questions “whether the stars are animate” and “whether or not the heavens are spherical”.
Galileo believes that ultimately the science behind a phenomenon will agree with the Scripture that describes it; “the true senses of the Bible” “will most certainly be found to agree with the proven fact”. He further admits that at first the two may seem completely different and that closer scrutiny is required. Galileo proves his point further by stating that “two truths can never contradict each other”: ultimately, the religious and the scientific must both be compatible for something to be proven true. The matter of Galileo’s success depends largely on his era and his audience. Being a follower of God as well as a man of science and a philosopher, Galileo had to come to terms with both his beliefs and what his physical senses and reasoning told him about the world. He achieved this by arguing that the facts produced by reasoning will equal the truth held in Scripture; both religion and science were correct. However, this was conditional: Holy Scripture must not be misinterpreted, and the notion under consideration must be tested through logic. Following this, Galileo believed that an individual’s understanding of Holy Scripture chiefly relied upon how learned that person was and their capacity to grasp the true meaning of the scriptural passages. Primarily, Galileo could not disregard the Bible wholly: firstly, as mentioned above, he himself believed greatly in God, and secondly, the people of the time were immersed in religion from infancy; it was to them “common” knowledge and the truth about the world. Given this, Galileo had to find a solution that satisfied religious beliefs whilst allowing scientific inquiry and philosophizing to occur. As he describes, the “vulgar” and “common” people may shun his ideas, because they do not accept anything that is not Holy Scripture, and believe Galileo and other scientists and philosophers to be blasphemous.
However, Galileo attempts to make sense of this by arguing that perhaps these people have misinterpreted the Bible, and consequently its true meaning (which is compatible with science) is lost to them; it is beyond their capacity to comprehend. Galileo remarks: “people who are unable to understand both the Bible and the sciences far outnumber those who do understand” – perhaps this is Galileo commenting on his own success: he is only able to reach people if they understand the true meaning of Scripture and are capable of understanding the sciences. In Galileo’s attempt to reconcile the propositions of both religion and science, it is evident that he anticipated Kuhn’s notion of perceptual incommensurability. Evidence acquired via observation is an insufficient basis for theory comparison, because perceptual experience is theory-dependent. In other words, one’s ability to develop a theory is based on one’s previous life experiences, whether physical, emotional or spiritual. All play a role in the development of a theory, which is formed according to how an individual sees the world and what they conclude about it. With this in mind, it can be observed that Galileo’s religious upbringing and his adulthood greatly influenced his views on religion and science. Galileo believed that Holy Scripture and the sciences were compatible; it can be inferred from his letter (as previously discussed) that when human reasoning is unable to provide an answer, the matter must be left to faith. Extrapolating from this, one may conclude that Galileo could be suggesting that the true meaning of science was written in the Bible, and that God gifted humans with the power to reason in order ultimately to find out the truth of the world, of creation. It is this notion that people may find hard to come to terms with, because it was believed that God did not want humans to discover the truth.
However, Galileo questioned (as did many others): why then did God bestow reason and logic on humans? This question ultimately drives Galileo in his quest to make religion and science compatible. It reflects Kuhn’s perceptual incommensurability: one’s background will inevitably colour one’s perception, as demonstrated by Galileo’s attempt to reconcile religious faith and the sciences.

Reference list
All in-text citations and all information in this essay were sourced from the following: ATS2867, Thinking about Science Study Guide and Readings, Monash University, 1998: 15–30; 51–66.

Saturday, October 26, 2019

Effectiveness Of Biological Wastewater Treatment Environmental Sciences Essay

Wastewater treatment is a serious environmental concern due to the hazards of discharging poorly treated effluent to the environment. Poor wastewater treatment poses a pollution threat to receiving water bodies, causes groundwater pollution and soil contamination, and results in a loss of biodiversity (Mantila, 2002). The Dandora Estate Sewerage Treatment Works (DESTW), which treats an annual average of 62,000 m3 of wastewater per day from Nairobi city and its environs through biological treatment, will form the study area. The population targeted in this study is the wastewater received and treated at DESTW. The purpose of this study is to find out the effectiveness of biological wastewater treatment and the pollution potential of DESTW activities to the environment. An experimental research design will be used to determine the wastewater characteristics and contaminant removal, while a descriptive design will be used to determine the environmental implications of wastewater treatment. The instruments used in the study are observation, laboratory experiments, the Leopold matrix, network analysis, and impact characteristic analysis. Data analysis will be done using both inferential and descriptive statistics. Wastewater treatment has been defined as the process of removing contaminants from wastewater produced by both domestic and industrial sources (Tchobanoglous, 1993). Its objective is to produce treated effluent and sludge suitable for discharge or reuse back into the environment, which is achieved through physical, chemical and biological processes.
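For a rough sense of scale, the inflow figure above can be converted into a population equivalent using the per-capita domestic sewage figure of about 90 litres per person per day cited for Kenya later in this proposal. The sketch below is illustrative arithmetic only; the variable names are mine, and the result overstates the served population somewhat because part of the inflow is industrial rather than domestic:

```python
# Illustrative population-equivalent arithmetic for DESTW.
# Figures from the text: average inflow of 62,000 m^3/day and
# ~90 litres of domestic sewage per person per day in Kenya.

daily_inflow_m3 = 62_000        # average wastewater reaching DESTW per day
per_capita_litres = 90          # domestic sewage generated per person per day

daily_inflow_litres = daily_inflow_m3 * 1_000
population_equivalent = daily_inflow_litres / per_capita_litres

print(f"Population equivalent: ~{population_equivalent:,.0f} people")
# -> Population equivalent: ~688,889 people
```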
The issue of wastewater treatment and disposal assumed increasing importance in the early 1970s as a result of the general concern expressed in the United States and worldwide about the wider problem of pollution of the human environment: the contamination of the atmosphere, rivers, lakes, oceans, and groundwater by domestic, municipal, agricultural, and industrial waste (Oswald, 1996). A great number of wastewater treatment plants are scattered all over the world, and until recently not much scientific attention was given to them. They were considered to solve problems so local and so specific that it was not thought worthwhile to discuss their design and operation in international fora. However, the interest shown in the 1st International Specialized Conference on Design and Operation of Wastewater Treatment Plants (Trondheim, 1989), and in the IAWQ Specialist Group on the same subject (formed in 1991), demonstrated that there is a need to discuss on an international scale the strategies for planning and the technical development of such plants. The reason for this interest is to be found in the abundance of cases around the world where small wastewater treatment plants have to be put into operation to prevent environmental pollution and hazards. There is a global shift from the traditional centralized wastewater treatment system to locally based wastewater solutions (Hallvard, 1993), following the UN Decade for Water and Sanitation recommendations. The need for good wastewater treatment plant solutions is therefore crucial in many developing countries. Developed countries mainly use mechanical and chemical treatment processes which, though requiring less land, are very expensive to establish and maintain.
Alabaster (1994) notes that many developing countries favour biological treatment using waste stabilization ponds, since the climate favours their operation and they are a low-cost, low-maintenance, highly efficient and natural method of wastewater treatment. The Dandora Estate Sewerage Treatment Works (DESTW), which treats wastewater from Nairobi city and its environs, uses biological treatment. However, due to stricter discharge standards set by the National Environmental Management Authority (NEMA), DESTW is increasingly falling short of those standards. Parr and Horan (1994) highlight three principal reasons for wastewater treatment plant failure: a lack of technical knowledge, failure to consider all relevant local factors at the pre-design stage, and inappropriate discharge standards. Mara (1992) cites the following broad impacts to the environment from poorly treated effluent: pollution of the receiving aquatic water body, groundwater pollution from seepage of effluent, soil pollution from dumping of sludge, and health impacts from drinking contaminated water or eating food grown with the same water.

1.2 Problem Statement
The problem under investigation in this study is the effectiveness of biological treatment in removing contaminants from wastewater and the pollution potential of DESTW activities. Factors making the problem a critical issue warranting research are: the physical treatment unit at DESTW has not been operational for the past four years; all pond series apart from series 3 and 5 lack anaerobic ponds; the closure of series 8 due to water hyacinth infestation may overload series 7; the lack of pretreatment facilities in many industries that discharge into the Nairobi city sewer network may reduce treatment effectiveness; and there are environmental implications of groundwater pollution by effluent seepage and soil pollution by dumping of toxic sludge.
1.3 Purpose of the Study
Based on the problem stated, the purpose of this study is to investigate the effectiveness of biological treatment at removing contaminants from wastewater through an empirical method of inquiry, and to propose sustainable methods of improving treatment effectiveness at DESTW. This study also aims to identify the potential impacts on the environment resulting from DESTW activities and to propose methods of mitigating negative impacts based on the findings.

1.4 Objectives of the Study
The objectives of this study are as follows:
To analyze the composition of wastewater received at DESTW
To analyze the effectiveness of contaminant mass removal at DESTW
To determine the pollution potential in relation to the activities of DESTW
To identify alternative uses of treated effluent

1.5 Hypothesis
There is a positive relationship between the functioning of biological treatment and the quality of effluent at DESTW.

1.6 Significance and Justification of the Study
This study addresses gaps in knowledge about the effectiveness of biological treatment of wastewater from Nairobi. Sewage effluent has long been cited as the cause of Nairobi River pollution, and this study will quantify the extent to which effluent from DESTW pollutes the river. By addressing these gaps, the study will add to the body of knowledge in the field of wastewater treatment in Kenya. The study is important because its results will influence future environmental policies on wastewater management; its recommendations will propose sustainable methods, suitable for Kenya, of further treating the effluent to ensure compliance with discharge standards; and it will also propose improvements to existing treatment methods, e.g. harvesting methane gas from anaerobic ponds to provide electricity for running the physical treatment works. The findings and recommendations will help mitigate negative impacts to the environment resulting from DESTW activities.
The beneficiaries of the findings of this study are the community surrounding DESTW, who will enjoy cleaner groundwater resources and decreased health risks from eating vegetables grown with effluent or fish caught from the oxidation ponds. Downstream users of the Nairobi River will enjoy cleaner river water, which will decrease the prevalence of waterborne diseases. DESTW will benefit from this study's recommendations through increased environmental compliance, and it will also cut operational costs by generating electricity from anaerobic pond methane gas. Researchers will benefit from this study's findings, which will form background information and a methodology reference for future related studies. Policy makers will use the findings and recommendations of this study in formulating policies for wastewater management in Kenya.

1.7 Limitations and Assumptions
Limitations
The length of the study was limited to 3 months, from January to March 2008, during which data was to be collected. To overcome this limitation, data for previous years was obtained from the DESTW database.
The breakdown of some laboratory machines hindered the analysis of samples; e.g., a water distiller breakdown prevented analysis on some days due to a lack of distilled water.
The lack of a permanent vehicle at DESTW prevented final effluent sampling on some days.
Assumptions
It is assumed that the reagents were not contaminated.
It is assumed that the measuring equipment was calibrated properly.
It is assumed that sampling and storage cans were kept clean to prevent sample contamination.

1.8 Study Area
This study will be carried out at the Dandora Estate Sewerage Treatment Works (DESTW), which treats wastewater from Nairobi city and its environs using a biological treatment process. The study area was chosen since it forms a representative sample of Nairobi city wastewaters.
Commissioning
The first phase was completed in 1977 and commissioned in 1978. The second phase was completed in 1990 and commissioned in 1992.
Location
DESTW is located at Ruai in Embakasi division, approximately 30 km from the city centre and about 3 km off Kangundo road. Access to the plant is on a permanent earth road. The site is approximately 1000 ha, and the oxidation ponds occupy 200 ha.
Climate
The climate is a typical Nairobi climate, with temperatures ranging between 15 and 30 degrees centigrade. The average rainfall is approximately 760 mm, with most of the rain falling in two seasons: March to May (long rains) and October to December (short rains).
Geology, soils and topography
The geology of the area mainly comprises Nairobi volcanics covered by black cotton clay soils. The area is generally flat, with the Nairobi River forming the north-eastern boundary of the land.
Flora and fauna
The area is generally arid with scanty vegetation cover, mainly sisal and shrubs. The ponds have attracted crocodiles and hippos from the nearby Nairobi River, since they provide habitat and a cheap source of food for fauna. Large colonies of different species of birds hover around the stabilization ponds during the day: birds of prey (e.g., buzzard, golden eagle, and barn owl), garden and woodland birds (e.g., pigeon, crow, and sparrow), water-birds and sea-birds (e.g., heron, swans, kingfisher, and curlew), and game birds such as quail. Mudfish and tilapia have also been introduced into the maturation ponds to assist in quality monitoring.
Number of ponds and arrangement
There are a total of 38 waste stabilization ponds at DESTW, arranged in 8 series. Facultative and maturation (aerobic) ponds run in parallel. Only series 3 and 5 have anaerobic ponds.
Types of ponds
There are three types of ponds at DESTW:
Anaerobic ponds – 4.0 m deep, measuring 100 m by 100 m. They are designed for organic matter removal, e.g. helminth eggs.
Facultative ponds – 2.5 m deep, measuring 700 m by 300 m. They are designed for BOD5 removal.
Maturation ponds – 1.5 m deep, measuring m by m.
They are designed for nitrogen and phosphorus removal.
Pretreatment and flow measurement facilities
DESTW has a conventional inlet works where large suspended solids are screened by coarse bar screens before being automatically raked by cup screens. Grit is removed by the use of constant-velocity grit traps. A venturi flume is provided for flow measurement.

CHAPTER TWO: LITERATURE REVIEW
2.1 Nature of Wastewater
2.1.1 Origin and Quantity
Wastewater originates mainly from domestic, industrial, groundwater, and meteorological sources, and these forms of wastewater are commonly referred to as domestic sewage, industrial waste, infiltration, and storm-water drainage, respectively (Mara, 1997). Domestic sewage results from people's day-to-day activities, such as bathing, body elimination, food preparation, and recreation, averaging about 90 litres per person daily in Kenya (Asano, 1998). The quantity and character of industrial wastewater are highly varied, depending on the type of industry, the management of its water usage, and the degree of treatment the wastewater receives before it is discharged. A typical metropolitan area discharges a volume of wastewater equal to about 60 to 80 percent of its total daily water requirements, the rest being used for washing cars and watering lawns, and for manufacturing processes such as food canning and bottling (WHO, 1992).
2.1.2 Composition
The composition of wastewater is analyzed using several physical, chemical, and biological measurements. The most common analyses include measurements of solids, biochemical oxygen demand (BOD5), chemical oxygen demand (COD), and pH (Pena, 2002). The solids include dissolved and suspended solids. Dissolved solids are the materials that will pass through a filter paper, and suspended solids are those that do not. The concentration of organic matter is measured by the BOD5 and COD analyses.
The BOD5 is the amount of oxygen used over a five-day period by microorganisms as they decompose the organic matter in sewage at a temperature of 20 °C. Similarly, the COD is the amount of oxygen required to oxidize the organic matter using dichromate in an acid solution, converting it to carbon dioxide and water. The value of COD is always higher than that of BOD5, because many organic substances can be oxidized chemically but cannot be oxidized biologically (Curtis, 1992). Commonly, BOD5 is used to test the strength of untreated and treated municipal and biodegradable industrial wastewaters. COD is used to test the strength of wastewater that is either not biodegradable or contains compounds that inhibit the activities of microorganisms. The pH analysis is a measure of the acidity of a wastewater sample.

2.2 Biological Wastewater Treatment
2.2.1 Waste Stabilization Ponds Technology Overview
Waste stabilization ponds (WSPs) are usually the most appropriate method of domestic and municipal wastewater treatment in developing countries, where the climate is most favourable for their operation. WSPs are low-cost (usually least-cost), low-maintenance, highly efficient, entirely natural and highly sustainable (Alabaster, 1994). The only energy they use is direct solar energy, so they do not need any electromechanical equipment, saving expenditure on electricity and on more skilled operation. They do require much more land than conventional electromechanical treatment processes such as activated sludge, but land is an asset which increases in value with time, whereas money spent on electricity for the operation of electromechanical systems is gone forever. WSP systems comprise one or more series of different types of ponds. Usually the first pond in the series is an anaerobic pond, and the second is a facultative pond.
These may need to be followed by maturation ponds, but this depends on the required final effluent quality, which in turn depends on what is to be done with the effluent: used for restricted or unrestricted irrigation; used for fish or aquatic vegetable culture; or discharged into surface water or groundwater (Horan, 1994). Prior to treatment in the WSPs, the wastewater is first subjected to preliminary treatment – screening and grit removal – to remove large and heavy solids. Basically, primary treatment is carried out in anaerobic ponds, secondary treatment in facultative ponds, and tertiary treatment in maturation ponds. Anaerobic and facultative ponds are for the removal of organic matter (normally expressed as biochemical oxygen demand, or BOD), Vibrio cholerae and helminth eggs; maturation ponds are for the removal of faecal viruses (especially rotavirus, astrovirus and norovirus), faecal bacteria (for example, Salmonella spp., Shigella spp., Campylobacter spp. and pathogenic strains of Escherichia coli), and nutrients (nitrogen and phosphorus). Due to their high removal of excreted pathogens, WSPs produce effluents that are very suitable for reuse in agriculture and aquaculture.

2.2.2 Related Research on Biological Wastewater Treatment
Mandi (1993), in his comparative study of wastewater treatment by stabilization ponds with and without macrophytes under an arid climate, found that ponds using water hyacinth proved more efficient than those using microphytic plants (algae). However, the water hyacinth process for wastewater purification faces two major problems: first, water loss by evapotranspiration reaches 60% during summer, and second, mosquitoes develop during summer. He does not, however, address the huge quantities of biomass produced by water hyacinth treatment systems and the resulting increase in sludge deposition rate.
Ghrabi (1989), in his experimental study of the treatment of wastewater by stabilization ponds applied to Tunisian conditions, concluded that sediment accumulation occurs mainly in the first pond, where the deposition rate is high (5 cm/year). In the maturation ponds, it ranges from 1.3 cm/year to 1.6 cm/year. The first pond can be desludged yearly or once every two years. In his study, however, he does not mention the environmental impacts of sludge on the soil, nor does he suggest methods of decreasing the amounts reaching the waste stabilization ponds. Jensen (1992), in his study on the potential use of constructed wetlands for wastewater treatment in northern environments, concluded that wetlands achieve 98% phosphorus removal, 88% BOD removal and 55% nitrogen removal. COD removal was only 64%, due to the discharge of organic matter that is slowly biodegradable, e.g. humic acids. This study, however, did not estimate the productive lifespan of the constructed wetlands.

2.3 Problems in Wastewater Treatment and Disposal
2.3.1 Wastewater Treatment Plant Problems
Many wastewater treatment plants (WwTPs) of all kinds in developing countries do not function properly. Parr and Horan (1994) found that there are three principal reasons for WwTP failure: a lack of technical knowledge; failure to consider all relevant local factors at the pre-design stage; and inappropriate discharge standards. As a result, wrong decisions are often made, and inappropriate, unsustainable treatment processes are selected and implemented. This is then exacerbated by the absence of any real incentive to operate the WwTP correctly once it has been commissioned. It is therefore essential for the long-term sustainability of WwTPs that simple, efficient technologies such as WSPs are always considered at the pre-design (or feasibility) stage. An honest comparison of the cost-effectiveness of wastewater treatment technologies will almost always favour the selection of WSPs in warm-climate countries.
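Removal figures like those Jensen reports are percent reductions from influent to effluent concentration. A minimal sketch of the calculation (the function name and the influent/effluent values are hypothetical, chosen only to reproduce an 88% BOD removal):

```python
def removal_efficiency(influent_mg_per_l: float, effluent_mg_per_l: float) -> float:
    """Percent removal of a contaminant such as BOD5, COD, nitrogen or phosphorus.

    Computed as the fraction of the influent concentration eliminated
    by treatment, expressed as a percentage.
    """
    return 100.0 * (influent_mg_per_l - effluent_mg_per_l) / influent_mg_per_l

# Hypothetical values: raw sewage BOD5 of 400 mg/L treated down to 48 mg/L
print(removal_efficiency(400, 48))  # -> 88.0, i.e. the 88% BOD removal Jensen cites
```

The same function applies unchanged to any of the parameters the proposal plans to measure, since all are reported as influent/effluent concentration pairs.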
2.3.2 Environmental Problems of Wastewater Treatment and Disposal
If wastewater is discharged before it is properly treated, it can adversely affect the environment, public health and a destination's economic well-being. The cost of these negative impacts can be expressed in monetary, health and ecological terms (Mara, 1997). Mantila (2002) identifies a number of consequences of poorly treated wastewater:
Health impacts – pathogenic bacteria, viruses and toxic algae cause diarrhoea, shellfish poisoning and other diseases; bathing in polluted water causes gastroenteritis and upper respiratory diseases; eating polluted shellfish results in hepatitis, liver damage and, in some cases, death.
Impact on the marine environment – suspended solids may cause excessive turbidity and shading of sea grasses, produce sedimentation damaging benthic (bottom-layer) habitats, and create anaerobic conditions at the sea bottom; high BOD levels may cause severe oxygen depletion, especially in shallow and enclosed aquatic systems such as estuaries that are ideal breeding grounds for various marine species, resulting in fish deaths and anaerobic conditions which release bad odours (hydrogen sulfide); adverse nutrient levels cause algal blooms, resulting in the death of coral and sea grasses, and eutrophication leading to severe oxygen depletion which kills living resources; many toxic materials and suspected carcinogens and mutagens can concentrate in fish tissue, putting humans at risk when they eat them; metals in specific forms can be toxic to humans and various marine organisms, especially shellfish, which are vulnerable in areas with highly contaminated sediment layers; fats, oil and grease that float on the water surface interfere with natural aeration, are possibly toxic to aquatic life, destroy coastal vegetation and reduce the recreational use of waters and beaches.
Impacts on groundwater and water resources: improper disposal of wastewater can directly affect the quality of an area's groundwater and water resources, and since their movements are dynamic, contaminants can spread far beyond the immediate pollution area. 2.4 Wastewater Management Options Oswald (1995) states that the following issues should be addressed before designing an effective wastewater management plan: assessing current wastewater management practice before water is discharged to the municipal treatment facility; identifying the sources of wastewater; determining whether discharged wastewater quality meets effluent standards; identifying whether industries carry out pre-treatment of their wastewater; and, finally, assessing complaints from users of reclaimed wastewater effluent. Once the situation has been assessed, a range of approaches and techniques to deal with wastewater can be considered. Bartone (1996) argues that to ensure effective treatment of wastewater, the volume has to be reduced to prevent overloading of wastewater treatment plants, and this can only be achieved at the source through the installation of water-efficiency equipment, e.g. ultra-low-flush toilets, spray nozzles, low-flow showerheads and water spigots, all of which reduce overall water consumption. Collection of domestic wastewater and its transportation to a distant treatment plant is a difficult and highly expensive task if the catchment area to be served is low in population density (Tchobanoglous, 1993). Onsite treatment of sewage is the alternative and has been applied all around the world for many centuries. However, the purification achieved by traditional onsite treatment systems such as septic tanks (DIN, 1993) is rather poor, especially with respect to nutrient removal, and as a result impacts on the quality of groundwater are inevitable. The basic idea of the biofilter septic tank was introduced by Toshio Yahata (1981) and further developed by Stubner and Sekoulov (1987).
The biofilm reactor septic tank has been found to be more efficient (Robert, 1996), and its effluent can be reused for irrigation or for flushing toilets. 2.5 Conceptual Framework This study is based on the conceptual framework below, which aims at optimal use of resources in an environmentally sustainable manner. The stages are as follows. Generation: the main sources of wastewater are households, commercial premises and industrial sources. Collection and conveyance: wastewater is collected through the sewer network in Nairobi and conveyed to DESTW; an annual average of 62,000 m3 of wastewater reaches DESTW daily. Preliminary treatment: screens solids and removes grit from the wastewater stream (coarse bar screens remove large suspended solids; medium bar screens remove smaller suspended solids; cup screens remove finer suspended solids; grit traps remove grit and sand particles). Biological treatment: involves the use of waste stabilization ponds (anaerobic ponds are designed for organic matter removal; facultative ponds for BOD removal; maturation ponds for nitrogen and phosphorus removal). Disposal and reuse: treated effluent is disposed of in the Nairobi River, and effluent is reused for agricultural irrigation and livestock watering. Fig 1: Conceptual framework for wastewater treatment and disposal in Nairobi (adapted from WHO, 1992). CHAPTER THREE: METHODOLOGY 3.1 Research Design The design used in this research is experimental, since analysis of wastewater quality is done in the laboratory. It is also descriptive, since the state of the environment and the biological treatment process are described. The approach used in this study is deductive, since it begins with the perceptual experience and observation of an environmental problem and leads to hypothesis formulation, experimental design, data collection, statistical analysis, theory construction and, finally, explanation. 3.2 Population and Sample Population The population targeted in this study is the wastewater received and treated at DESTW, which averages 62,000 m3 per day annually.
Sample types Grab samples were necessary for parameters such as pH, ammonia and faecal indicator bacteria. Flow-weighted composite samples were necessary for raw sewage parameters such as electrical conductivity and dissolved oxygen. Frequency of sampling Raw sewage was sampled hourly, because its composition varies considerably throughout the day. Flow was sampled hourly throughout the day. Final effluents were sampled once daily, before noon. The pond series were sampled once every week. The Nairobi River, upstream and downstream, was sampled once a week. 3.3 Data Collection Instruments 3.3.1 Field Observation Environmental impacts were identified using field observation, aided by the following instruments. a) Leopold matrix This is a grid-like table used to identify the interactions between project activities, displayed along one axis, and environmental characteristics, displayed along the other axis. Using the table, environment-activity interactions can be noted in the appropriate cells, or intersecting points, in the grid. Entries are made in the cells to highlight impact severity or other features related to the nature of the impact; in this study, numbers are used to indicate scale. This instrument was chosen for environmental impact identification because it links the action to the impact, shows impact magnitude and significance, and is a good way of displaying environmental impact results. b) Network analysis Networks illustrate the cause-effect relationships between project activities and environmental characteristics. They are therefore particularly useful in identifying and depicting secondary impacts (indirect, cumulative, etc.). They are drawn by identifying first-order impacts and then linking them to second-order and third-order impacts by means of arrows.
This instrument was chosen for environmental impact identification since it links the actions to the impacts, is useful in simplified form for checking for second-order impacts, and can handle direct and indirect impacts. c) Impact characteristics analysis This is normally in the form of a summary table, and it was chosen for environmental impact identification because it shows impact nature, magnitude, extent/location, timing, duration, reversibility, likelihood (risk) and significance. 3.3.2 Laboratory experiments Experiments were performed to determine the composition of wastewater at DESTW and the mass removal of contaminants from the wastewater. The following apparatus was used during the laboratory experiments: plastic sampling cans to collect and store samples; a wooden pole with 1 cm graduations to measure depth at the venturi flume; a refrigerator to store samples below 4 degrees Celsius; burettes, conical flasks, pipettes, beakers and digestion tubes to hold samples and reagents when analyzing for various parameters; ovens, digestion blocks, water baths and fume chambers to create conducive conditions for chemical reactions; and pan balances, beam balances, UV spectrophotometers, atomic absorption spectrophotometers, water quality meters and flame photometers to measure the values of various parameters in the samples. 3.4 Data Collection Procedures 3.4.1 Laboratory Analysis Procedures Parameters were analyzed according to Alabaster's (1989) Practical Guide to the Monitoring of Waste Stabilization Ponds, the standard operations manual adopted by the DESTW laboratory. a) Flow Flow was measured on the raw sewage and final effluents using the venturi flume, which is a restriction in the channel carrying the wastewater. The formula below was used to calculate flow:

Q = (2/3) × √(2g/3) × Cv × Cd × b × h^(3/2)

where Q = flow rate (m3/s), g = acceleration due to gravity, Cv = coefficient of velocity, Cd = coefficient of discharge, b = width of throat (m), and h = upstream depth (m). b) COD, total and filtered The micro-digestion sealed-tube method was used, with potassium dichromate as the digestion solution and ferrous ammonium sulphate (FAS) as the titration solution. Procedure: 1.5 ml of digestion solution is dispensed into a digestion tube; 2.5 ml of sample is added using a pipette and mixed well; 3.5 ml of catalyst solution (silver sulphate in 2.5 litres of sulphuric acid) is added; the tube is capped tightly using a PTFE sealing gasket and its contents mixed by gentle swirling; the tubes are placed in a digestion block at 150 °C for 120 minutes; the contents of the tube are transferred quantitatively to a 100 ml conical flask and sufficient water added to a final volume of around 25 ml; 1 drop of ferroin indicator is added and the solution mixed well; it is titrated with FAS (N/40) until the faint blue colour changes to red and the value of the titre, T ml, is recorded; a blank titration is carried out following the same procedure but using distilled water, and the value of the blank titre, B ml, is recorded. COD is calculated as follows:

COD = (B − T) / S × 1000 mg/l

c) BOD, total The standard 5-day, 20 °C BOD bottle test was used. Reagents: dilution water, ferric chloride solution, manganous sulphate solution, sodium azide solution, alkali-iodide solution, 90% orthophosphoric acid, N/40 sodium thiosulphate, starch solution.
Procedure: dilution water is prepared, the sample added, and the bottle incubated at 20 °C for 5 days. To determine dissolved oxygen, the stopper is removed from the BOD bottle and 2 ml each of manganous sulphate solution, sodium azide solution and alkali-iodide solution are added; immediately after the addition of the alkali-iodide reagent a brown flocculent precipitate forms, so the bottle is shaken to ensure that all the dissolved oxygen reacts with the reagents; when the floc settles, 2 ml of orthophosphoric acid is added and the bottle shaken until its contents turn yellow; 205 ml of the bottle contents is titrated with N/40 sodium thiosulphate
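The flow and COD formulas above can be sketched in Python. This is an illustrative sketch only: g = 9.81 m/s2 is assumed, and the coefficient values and titre figures in the example are made up for demonstration, not taken from this study.

```python
import math

G = 9.81  # assumed acceleration due to gravity, m/s^2


def venturi_flow(h, b, cv=1.0, cd=1.0):
    """Flow through a venturi flume in m^3/s, from upstream depth h (m)
    and throat width b (m); cv and cd are the velocity and discharge
    coefficients from the flume's calibration."""
    return (2.0 / 3.0) * math.sqrt(2.0 * G / 3.0) * cv * cd * b * h ** 1.5


def cod_mg_per_l(blank_titre, sample_titre, s):
    """COD in mg/l from the blank titre B and sample titre T (both in ml
    of N/40 FAS), using the manual's formula COD = (B - T) / S * 1000."""
    return (blank_titre - sample_titre) / s * 1000.0


# Hypothetical example: 0.5 m throat, 0.2 m upstream depth, ideal coefficients
q = venturi_flow(h=0.2, b=0.5)

# Hypothetical example: blank titre 10.0 ml, sample titre 6.0 ml, S = 2.5
cod = cod_mg_per_l(10.0, 6.0, 2.5)  # -> 1600.0 mg/l
```

In practice Cv and Cd would come from the flume's calibration certificate, and S from the laboratory manual's definition for the chosen sample volume.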

Friday, October 25, 2019

Role of the Quakers in Uncle Tom's Cabin by Harriet Beecher Stowe Essay

The Quakers and Uncle Tom's Cabin In this paper, I will examine the choice of using the Quakers as the angelic figures who become the saviors of the black race during the slave movement in Uncle Tom's Cabin. While examining this topic, Harriet Beecher Stowe's background of Puritanism becomes the focus for her motivation to change the world around her and her strict discipline of keeping spiritual values as part of her daily existence. The next stage to be discussed is her conversion from conservative Calvinist views to liberal ideals of social reform. This reform captures the spirit of Transcendentalism, the idea of the individual as a divine being changing society to meet those ideals. Finally, I will touch on the beliefs of the Quakers and their history, and how they became the model of godliness portrayed in Stowe's novel. In the Haggadah, God creates the world by his word, the twenty-two letters of the Hebrew alphabet descending from the crown of God, engraved with a pen of flaming fire on the mind of Man (Barnstone 15). Many great writers strive to tap into this inspiration of divine light or intellectual genius to produce works of literary art. The written word from these writers possessed enough power to start revolutions, change public sentiment, and alter the rational thinking of the times. One such writer who changed historical events during the American Renaissance is Harriet Beecher Stowe. Her literary masterpiece, Uncle Tom's Cabin, caused such enlightenment of the general public as to push the United States into Civil War to emancipate the black race from the bonds of slavery. The main source of inspiration for her writings comes from her own personal experiences of life and the deeply... ...y in the end. Works Cited Barnstone, Willis. The Other Bible. New York: Harper & Row, 1984. Drake, Thomas E. Quakers and Slavery in America. Massachusetts: Gloucester, 1965. Foster, Charles H.
The Rungless Ladder: Harriet Beecher Stowe and New England Puritanism. New York: Cooper Press, 1970. Miller, Perry. Errand into the Wilderness. Chicago: University of Chicago Press, 1981. Stowe, Harriet Beecher. "Uncle Tom's Cabin." The Heath Anthology of American Literature. Ed. Paul Lauter. 3rd ed. Massachusetts: Houghton Mifflin, 1998. 2310-2352. Westra, Helen Petter. "Confronting Antichrist: The Influence of Jonathan Edwards's Vision." The Stowe Debate: Rhetorical Strategies in Uncle Tom's Cabin. Ed. Mason I. Lowance, Jr., Ellen E. Westbrook, and R.C. DeProspo. 1st ed. Massachusetts: University of Massachusetts Press, 1994. 141-158.

Wednesday, October 23, 2019

B2B and B2C: Their ethical, legal, and regulatory environments Essay

The marketplace of internet commerce is rapidly expanding. Although there are wide variations within each category, internet-related businesses are usually classified as either B2B (business-to-business) or B2C (business-to-consumer) enterprises. The economic landscape in which these companies operate is always changing. For that reason, it is all the more necessary to have a firm concept of the ethical, legal and regulatory responsibilities within this emerging marketplace. B2Bs and B2Cs share many of these responsibilities in common. In some cases, however, there are particular nuances of operating a B2B that may call for a different framework than is necessary for operating a B2C, and vice versa. The variations between these businesses are continually widening, meaning that the particulars of the ethical, legal and regulatory frameworks will differ. The overall goal of these frameworks is the same: establishing trust. Trust is the key to establishing any reputable, successful and long-lasting business. Overview B2Bs (business-to-business) are businesses that interconnect using the internet; in other words, they are businesses that buy from and sell to each other. B2Cs are businesses that use the internet to sell to the end consumer. A B2C may be the internet site of a well-established department store, or it may be a business that uses the internet as its only point of contact with consumers. The world of internet business is still relatively new. As it continues to grow, it will have to adapt to regulatory and legal changes. The continued emergence of variations within the e-marketplace is a challenge to those concerned with ethical and regulatory issues. Far from being on the wane, these issues are more common than ever. According to market forecasters, "Security and privacy issues along with e-business regulatory issues will become more prevalent" (Warholic, 2007).
Ethical environment The conduct of B2B transactions relies on the two-way sharing of information. As a result, businesses on both ends of a transaction must make sure that information platforms are secure and accessible only to authorized personnel. Since trust is a critical element in e-business, as in more traditional forms of business, professional codes already in existence are applicable in both areas. Unfortunately, industry-wide adherence to these codes is lacking. Companies are struggling with the wide array of issues raised by internet commerce. A recent report on the publications industry highlighted one of the many potential ethical problems of conducting business in an advertising-driven medium: "…several respondents indicated that there was too much of a blur between editorial and advertising departments" (ASBPE, 2006). Another likely area of ethical focus for the B2B industry is highlighted by Laura Spense: "What about the facilitating of fraudulent activity?" (Spense, 2002). In an environment with a multitude of partners, platforms and subsidiaries, how much responsibility does a B2B company bear for the actions of its partners? In Spense's example, a B2B bank was knowingly enabling one of its partners to conduct illegal activities overseas. Examples similar to this could arise in any number of industries. Ethical responsibilities for B2C companies often revolve around the protection of customers' information. Some companies have developed software limiting customer information to only a few responsible parties. Others have not been able to resist the financial lure of information sharing or selling. This division is likely to continue until it is addressed more completely by legal and regulatory efforts. In the meantime, there is an opportunity for ethical businesses to develop strong reputations that will benefit them far into the future. Legal environment Legal concerns in e-business span a wide array of areas.
There are the obvious concerns, such as customer security and privacy, internet fraud and identity theft, that relate most often to B2C businesses. Most of the case law that has developed addresses these issues. Because these crimes are the most high-profile, they are the predominant focus of the legal system. As criminals adjust to these legal efforts in any number of ways, it will require a sophisticated and ongoing effort to prevent their actions. There are also possible legal issues below the surface that can be just as important. For example, the difficulty of determining the legality of electronic documents can pose issues, particularly for multinational B2B companies. What appears to be a legal document may not be admissible in court as evidence. Time differences can also result in an agreed-upon document bearing one date in one country but having another effective date in another country. Because B2Bs can employ many networks and partners, it can be difficult to determine who legally bears responsibility on a particular issue. These are examples of small details, in the B2B context, that can have large legal consequences if not properly attended to. Regulatory environment The internet is still relatively unregulated. That is beginning to change in a few areas, however. Most regulation is targeted toward B2C companies, coming in the form of consumer protection measures. The government is becoming more assertive in prosecuting internet fraud, gambling, child pornography and spamming violations. It is likely that additional laws in these areas will be enacted in the coming years. The issue of taxation is also currently under debate. This is of particular concern to B2Cs, which in years past have been able to lure customers by selling their products without any sales tax. That practice has already been ended in some states. As more states become cash-strapped, this process is likely to continue.
B2C firms will have to innovate in order to continue growing their customer base. The Uniform Commercial Code (UCC) applies to both B2B and B2C enterprises. The UCC outlines warranty, ownership and expert-status issues, making some delineation between the responsibilities of B2Bs and B2Cs. For example, under the UCC, a business client is assumed to have a greater level of expertise about the transaction at hand. Therefore, statements or claims made to that client do not necessarily have to meet the same standards of reliability as statements made to an end user in a B2C transaction. The main regulatory concern with regard to B2Bs involves the prevention of monopolistic practices, including price fixing. The formation of some large B2B firms such as Covisint, a firm formed by Ford, DaimlerChrysler and General Motors, has raised concerns about the potential domination of market share. Critics fear the emergence of monopsony, a shift of pricing power from sellers to buyers (IGE, 2001). Conclusion The only thing that is certain about the internet business environment is that it will continue to change and evolve. The government will likely become a bigger factor in terms of laws and regulations, particularly within the B2C market. The B2B market, in contrast, is better able to self-regulate. Ethically, both markets would be best served by anticipating potential regulatory and legal action. This has a dual purpose: it helps to establish the company as a trustworthy entity, and it can help to head off future government interference in the market. The consequences of a lack of trust are particularly high for a B2B company, but the issue is important for any company hoping to operate profitably over the long term.

Tuesday, October 22, 2019

Lythronax - Facts and Figures

Name: Lythronax (Greek for "gore king"); pronounced LITH-roe-nax
Habitat: Woodlands of North America
Historical Period: Late Cretaceous (80 million years ago)
Size and Weight: About 24 feet long and 2-3 tons
Diet: Meat
Distinguishing Characteristics: Moderate size; long skull; foreshortened arms
About Lythronax Despite what you may have read in the press, the newly announced Lythronax ("gore king") isn't the oldest tyrannosaur in the fossil record; that honor goes to pint-sized Asian genera like Guanlong that lived tens of millions of years earlier. Lythronax does, however, represent a crucial missing link in tyrannosaur evolution, since its bones were unearthed from a region of Utah that corresponds to the southern portion of the island of Laramidia, which straddled North America's shallow Western Interior Sea during the late Cretaceous period. (The northern part of Laramidia, by contrast, corresponds to the modern-day states of Montana, Wyoming, and North and South Dakota, as well as parts of Canada.) What the discovery of Lythronax implies is that the evolutionary split leading to tyrannosaurid tyrannosaurs like T. rex (to which this dinosaur was closely related, and which appeared on the scene over 10 million years later) occurred a few million years earlier than was once believed. Long story short: Lythronax was closely related to other tyrannosaurid tyrannosaurs of southern Laramidia (most notably Teratophoneus and Bistahieversor, in addition to T. rex), which now appear to have evolved separately from their neighbors in the north, meaning there may be many more tyrannosaurs lurking in the fossil record than previously believed.

Monday, October 21, 2019

The Argument on Euthanasia and The Euthanasia Society of America Essays

Passive euthanasia is the deliberate disconnection of life support equipment, or the cessation of any life-sustaining medical procedure, permitting the natural death of the patient (EROG). Allowing individuals the right to choose whether they want to fight to save their own lives or to be allowed to die a natural death is becoming a major issue in today's society. This is not just a present-day issue; it has been a topic of debate for many years. In 1906 Ohio drafted its first euthanasia bill, and in 1938 The Euthanasia Society of America was founded (EROG). 1976, though, was a turning point for euthanasia in the United States, as the Quinlan family went all the way to the New Jersey Supreme Court to be allowed to disconnect the respirator from their comatose daughter. The court approved the family's request (EROG). Also in 1976, California passed the nation's first Living Will law. A Living Will is the popular name for an advance directive by which a person requests in writing that a physician not connect life-supporting equipment if this procedure is merely going to delay an inevitable death (EROG). There are many people who feel that they do not want their death delayed by extraordinary means. Allowing passive euthanasia gives individuals the right to decide about their own lives, a right that I will try to show should not be taken away from any person. Assumptions All possible means should be used to save a life. This is not the best possible alternative. Life at all cost can become an enormous financial stress for families. There is also the added stress of taking care of a loved one who is totally dependent on others.
When others are allowed to make the decision for life at all cost for an individual who is seriously ill or injured, they are generally thinking only of themselves and not of the quality of life this person may have after the illness or injury. As John Miller states in one of his articles in The American Journal of Hospice and Palliative Care, "When we fall to the extremes, we take choices away from those who we believe we are helping." Life at all cost, if allowed, could be taken to the extreme. Values protected:
Assisted suicide: freedom of choice; death with dignity; individual liberty.
Passive euthanasia: freedom of choice; death with dignity; individual liberty; financial stability for families; quality of life.
Life at all cost: preservation of life; protection of the doctor's oath; love and caring of a family member.
With assisted suicide and passive euthanasia, a patient is given the freedom to choose what happens with his or her life; but with life at all cost, it is usually the family or the medical personnel making the decisions for the patient. Protection of a doctor's medical oath comes secondary to an individual's freedom of choice. Taking away a person's right to make their own decisions about their life is not correct. No one knows what is best for an individual except the person involved in the decision. According to the Hemlock Society, a terminally ill person has the right to refuse treatment, even if they will die without it, and to demand and receive adequate medication for pain control, even if it will shorten their life. Life at all cost can be very expensive. Medical procedures and health care for a patient can be taken too far, costing families a great deal of money on top of the emotional stress of taking care of a loved one. With passive euthanasia a patient is allowed to die with dignity. No extraordinary measures are taken to save the patient's life, which can also lower the medical expenses a family endures through a tragedy.
Passive euthanasia allows a person to die naturally, and with today's hospice programs the patient is medicated to help relieve unwanted pain. Assisted suicide and life at all cost can be considered extremes in today's society. As John Miller states in his article "Hospice Care or Assisted Suicide": "When we fall to the extremes, we take choices away from those who we believe we are helping. But there is a middle ground. When we aim for that middle ground, we all win."

Sunday, October 20, 2019

Computers today essays

Although computer gaming has only been around for a few years, there exist gaps between the different genres, such that most don't have anything in common. Strategy and role-playing games (RPGs), however, are two genres which share certain characteristics, a factor that usually attracts a multitude of gamers. The two game types share many similarities. One is that they both have set characters throughout the game. Another point would be their leisure-time application, for they are both a relaxing (or addictive) pastime. Without this value most gamers would pass any and all games up, and actually go outside. Even though there are many similarities, there are also unmistakable differences which give the games certain individualities. One of the differences is game development. Strategy has a win/lose element, which keeps the pressure on, while RPGs have a diverse game development, where several paths may be followed and different endings reached. Another trait is units, the controllable characters which inhabit the game world. Strategy has multiple units for each player, all of which are replaceable and lost at the end of a mission. In contrast, RPGs have set units: characters which develop skills over time and can only be added to one's team through certain actions. So it may be said that even though the genres stem from the same roots, their unique points give them a distinct individuality. With both types of games starting from one, much like evolution, gamers don't tend to stay with one genre. ...

Saturday, October 19, 2019

Human Resources about Rewards and recognition Essay

This paper discusses strategies that can be employed in business to improve the system of reward and recognition, both for staff and for customers. Increased action may not be the necessary outcome of reward. The quality of work is more important than the quantity of work, though some managers give equal importance to both. Therefore, a reward should encourage the employee to increase both the quality and the quantity of work. The difficulty with a standardized reward and recognition program is that it is a completely impersonal process. Instead of thinking about the specific people involved, the company provides the same generic rewards to everyone. But when an element of fun and play is added to a financial reward or bonus, the experience becomes personalized and much more memorable for the award recipient (Weinstein, 1997). The approach to reward and recognition proposed by Zigon is consistent with Victor Vroom's Expectancy Theory, according to which different employees have different likes and dislikes, so standardizing a set of rewards and patterns of recognition may not work for all. Managers' efforts should be directed at aligning the rewards with the interests of individual employees (University of Minnesota, 2010). In her article, Ryan (n.d.) has evaluated the effectiveness of non-monetary incentives in comparison to monetary incentives. Ryan (n.d.) expresses that she could not find any strong empirical evidence to say whether non-monetary incentives work; however, there is objective empirical evidence elaborating the strength of non-monetary incentives. Malotte, Hollingstead, and Rhodes (1999) found that non-monetary incentives like grocery store coupons were 86 per cent effective in returning patients for a skin test reading, whereas the efficiency of a monetary incentive of $10 was 95 per cent.
While it is important to take measures to enhance the productivity of the staff, it is equally important to do the needful

Friday, October 18, 2019

Discussions questions Article Example | Topics and Well Written Essays - 500 words

Discussions questions - Article Example These come together within a particular use situation, and in this case can be considered the services offered by Southwest Airlines to its users in the form of travel to various destinations. The growth of customer value thinking has shaped successful marketing practice, as globalization is evident in all product categories today. The essence of customer value has become a mandate for management, the airline industry included, and hence the focus on service, quality, image and price. On the other hand, organizations have developed a culture of fun and humour in relating to customers as well as to their employees, with the aim of improving performance. In the airline industry, humour can be used to make customers feel comfortable and to set the right mood for travel. Airlines can make fun of the passengers, the crew, or the environment to drive a point home when flying. A simple sense of humour about petty issues, such as giving directions for wearing a safety belt, explaining actions to take in case of an emergency, or discouraging habits not allowed on the plane such as smoking, would make an airline unique and differentiate it from other airlines. Such a sense of humour could also come in handy in cases of turbulence, helping to calm the passengers down and make them concentrate on their safety. An example of humour in airline travel could be letting the passengers know that if they wish to smoke, the smoking section is available on the wing, where they can "light 'em and smoke 'em". Humour would make the airline unique and, in a good way, differentiated from other airlines, as passengers would appreciate the effort of the cabin crew in assisting them emotionally during a long journey (Robinette, Brand, & Lenz, 2001). In the case of other companies such as broadcasting agencies, the use of radio clowns can make shows lively and humorous.
This will attract more listeners as

Buddhism as a Critique of Culture Essay Example | Topics and Well Written Essays - 500 words

Buddhism as a Critique of Culture - Essay Example This is the aim even with regard to 'normal' people in the world, since they tend to get confused and develop problems due to wrong identification of the patterns of life, or 'samsara' (Watts 16), shrouded in 'maya', which is explained as being more than illusory; 'maya' encompasses the entire range of concepts from culture and one's cultural identity to the way one perceives the world (Watts 9). An individual is an inseparable organism of the universe, and simultaneously unique, since organisms are not uniform and differ in the identities accorded to them by societal constructs such as sex, class, and others. Conflicts arise because the rules of the universe and cosmos may not always be in consonance with societal conventions, and the individual struggles to integrate these two inherently differing components: reality and the social constructs of maya (Watts 9). In these eastern thoughts, 'nirvana', or liberation (Watts 16), is the solution to the problems arising from the aforementioned conflict. The aim of nirvana is not to destroy maya but rather to see through maya; to do this, one must step outside the social constructs and see reality (Watts 9).

Plymouth (UK) airport will never re-open Term Paper

Plymouth (UK) airport will never re-open - Term Paper Example However, this trend did not last long after 2009 (Great Britain 2010, p. 101). The airport was eventually officially closed on 23 December 2011, after the owners came to an agreement that this was the right decision to make at that particular moment. There are a number of reasons believed to have led to the closure of Plymouth City Airport. According to the airport's management, closure was made inevitable by the economic downturn and the problems that the UK aviation market was facing at the time. Just six months before the airport was closed, the company had registered a loss of up to £1m (Hynes 2010, p. 201). This was a clear sign that the airport was not going to do well under the same conditions, given that it had more than 50 employees. Had it remained operational, it would have continued incurring losses, because it would still have had operational costs, including the compensation of its employees. The situation at Plymouth Airport was made worse by the withdrawal of Air Southwest's flights to London, which meant the airport no longer had any London passengers going through it. This was a big blow to the airport. The management tried all they could to find a replacement for Air Southwest, but they were not successful. This led to a situation in which fewer than 100 passengers were flying from the airport every day, a clear indication that it was no longer commercially viable. Much of the service offered did not earn the airport any profit, owing to the situation that had developed as a result of the withdrawal of Air Southwest, the economic downturn, and the problems the UK aviation industry was facing at the time (Pavlyuk 2014, p. 22).
This can be explained by the fact that by the month of April 2011, Sutton Harbour Holdings had already announced that the airport would be closed down by the end of 2011. In fact, the

Thursday, October 17, 2019

Public University Analysis Essay Example | Topics and Well Written Essays - 1000 words

Public University Analysis - Essay Example The Commonwealth, and not the University, has the full duty of contributing to the plan. Significantly, the entire full-time faculty, particular managing staff, and the health care personnel participate in the Faculty Discretionary Retirement Plans. These are defined-contribution plans in which the retirement benefits received are linked to the employer and employee contributions (most of which are paid by the University), and to dividends and interest. Personal contracts provide the plans for full-time faculty, and for given managing personnel they provide for the complete and immediate vesting of both the participant's and the University's contributions (Sigo, 2014). The health care personnel's employer contributions vest fully after a single year of employment. Full pension costs under the plans were about $36.3 million. Contributions to the Discretionary Retirement Plans were aggregated on base salaries of about $337.6 million, and the contribution rate amounted to 9.9%. Considering the Retirement Plans, it is definite that every permanent full-time employee, as a condition of work, is a member of either the Teachers' and State Employees' Retirement System or the Discretionary Retirement Program. Eligible workers can choose to participate in the Discretionary Retirement Program at the time of employment; otherwise they are enrolled in the Teachers' and State Employees' Retirement System. The Teachers' and State Employees' Retirement System (TSERS) at the Virginia University comprises a cost-sharing, multiple-employer defined benefit pension plan established by the State to provide pension benefits for employees of the State, local boards of education and its component sections (Sigo, 2014). The TSERS is managed by around 14 members of a Board of Trustees, with the Chairman of the Board being the State Treasurer. GASB proposed changes that were to affect the financial statements of institutions and
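The contribution figures quoted above can be cross-checked with simple arithmetic. A minimal sketch, treating the "$337.6" base-salary figure as $337.6 million (an assumption, since the excerpt omits the unit):

```python
# Rough cross-check of the retirement-plan figures cited above (values in $ millions).
# Treating the base-salary total of "$337.6" as $337.6 million is an assumption.
base_salaries = 337.6      # aggregate base salaries, $m
contribution_rate = 0.099  # the 9.9% rate stated in the text

contributions = base_salaries * contribution_rate
print(round(contributions, 1))  # ~33.4, broadly consistent with the ~$36.3m pension cost
```

This rough check suggests the quoted rate and salary base are mutually consistent with the stated pension cost, though the $36.3 million figure may also include other plan costs.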

History of Kiribati Culture Research Paper Example | Topics and Well Written Essays - 1000 words

History of Kiribati Culture - Research Paper Example The essay "History of Kiribati Culture" analyzes the culture of the country Kiribati and the factors that influenced its eating habits and choice of food. The country was originally called the Gilbert Islands when it was still under British rule; the local natives later derived the name "Kiribati" from "Gilbert". Today, the country has a population of slightly over 100,000 (U.S. Department of State, 2012). The majority of the islands' inhabitants are of Kiribati descent, and they practice Micronesian culture. It is believed that the first inhabitants of the islands arrived there more than three thousand years ago. Later on, Samoans, Tongans and Fijians invaded the islands and, through intermarriage, brought elements of Polynesian and Melanesian culture into the area. British explorers, however, discovered the islands much later, during the 18th century. The islands were named after one of the two British explorers, Gilbert. More islands were later found to the north, and these took the name 'Marshalls' after the second explorer. One resoundingly unique element of Kiribati culture is the free-spirited nature of daily activities, unconstrained by time pressures. In Kiribati, life takes on a pace that is natural and unperturbed about the future. Most economic practices aim to sustain the daily livelihood of families and the community in general. Communal social events also take on a prominent role in the people's lives; among these are dances and martial arts contests.


Financial problem in a country or organization of your choice Essay Example for Free

Financial problem in a country or organization of your choice Essay Discuss the causes of a financial problem in a country or organization of your choice and suggest some solutions. Specify the problem and the City/Country and relate to a particular study. Zimbabwe is an agriculture-based economy previously known as the 'bread-basket' of Southern Africa. In the past decade, the country experienced a drastic economic disintegration due to a wide range of factors, including unconstitutional land redistribution, poor health, a decline in foreign investment and hyperinflation. The Zimbabwean economy is strongly intertwined with politics; the political instability therefore destabilized the economy. In 2000, the government embarked on 'the land reform programme', which removed white commercial farmers from arable lands so that they could be redistributed among black farmers. The experienced farmers were replaced mostly by black subsistence farmers with no farming knowledge, equipment or capital, who therefore could not produce at a commercial scale. There was no agricultural export, meaning there was a loss of the foreign currency regularly injected into the economy. This marked the beginning of the economic downfall (Richardson, 2004:307). The failure of the agricultural sector, the backbone of the economy, led to the economic crisis. It meant that the government could not generate enough revenue to sustain its infrastructure, such as the health sector. Health conditions are directly related to the poor economy: sick workers were not able to work as much, or as productively, as healthy ones. Labour markets were less efficient and the market could not produce as much. Consequently, the economy produced far less per worker than a comparable healthy economy. This was evident in Zimbabwe's low participation rate of just over 35%, as opposed to 51.08% in the U.S. or 51.97% in Japan (Richardson, 2004:289).
Another contributing factor was that foreign investors fled, due to insecurity and government policies dictating that 51% of their businesses should be locally owned. Foreign direct investment fell to zero by 2001, and the World Bank's risk premium on investment in Zimbabwe shot up from 4% to 20% that year as well (Hill, 2003:109). Furthermore, the Zimbabwean economy was brought down by the illegal sanctions (an order given to force a country to obey international laws by limiting or stopping trade with it; Merriam-Webster dictionary, 2012:198) imposed by the American and European superpowers. This meant that no trade was to be done with Zimbabwe. There was a sudden dearth of foreign currency and investment inflows to the country. The U.S. and Britain partially withheld financial support for Zimbabwe, and there would be no access to the International Monetary Fund (IMF) because the country could not pay its debt under the prevailing hyperinflationary conditions (Hill, 2003:102). The causes of Zimbabwe's financial problem can be mitigated by first achieving a 'political breakthrough' that will depoliticize the economy. Then, land should be redistributed among experienced commercial farmers, and the less experienced ones should be trained to ensure more sustainable output. There must also be a liberalisation of foreign investment regulations to attract foreign investors. In conclusion, these suggested solutions will help rebuild the economy and restore Zimbabwe as the bread-basket of Southern Africa. References: Richardson, C. J. 2004. The Collapse of Zimbabwe in the Wake of the 2000-2003 Land Reforms. New York: Edwin Mellen. Hill, G. 2003. The Battle for Zimbabwe. Cape Town: Zebra.

Monday, October 14, 2019

Effects of the Recession on the Housing Market

Effects of the Recession on the Housing Market Introduction This part of the dissertation seeks to understand and investigate the causes of the current global recession and how it has affected the housing market in the UK. Housing Market Trends After the housing market's spectacular collapse in the 1990s, the UK housing market staged a significant revival. According to the HBOS index, the average house price stood at about £163,000 in 2005, approximately double the £82,000 it would have been worth in 2000. Cameron (2005) suggests that house prices surpassed their 1989 peak relative to average household incomes. The other traditional measure of affordability, the ratio of interest payments to income, is not so overstretched, but only if capital repayments and unsecured debt are ignored. In addition, the strength of the housing market reflects the exceptional performance of the economy in 2005, which in turn is partially due to the sensible independent monetary policies pursued by the Bank of England (Cameron, 2005). As a result, it is suggested that Britain dealt with the world economic slowdown of 2001-2003 a great deal better than the majority of chief economies, producing six per cent growth. This vigorous expansion cannot completely explain the strength of the house price boom; consequently, numerous economists have argued that there is a bubble in the British housing market, in common with a number of other countries such as Spain, Australia, Canada, Sweden, and parts of the USA. FIGURE 1 Figure 1 shows the ratio of average house prices to average earnings, a key measure of affordability, for Great Britain and three major regions up to 2004, before the economic recession struck. As is visible, each series shows similar cyclical behaviour, with a striking rise since 1999. According to the HBOS index, prices rose by only 1.3% over the nine months from July 2004 to April 2005.
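The "approximately double" claim above can be verified directly from the HBOS figures quoted; a quick sketch (prices from the text, the implied annual growth rate my own derivation):

```python
# Check the HBOS house-price figures quoted above (average price, GBP).
price_2000 = 82_000
price_2005 = 163_000

growth_factor = price_2005 / price_2000      # ~1.99, i.e. "approximately double"
annual_rate = growth_factor ** (1 / 5) - 1   # implied compound annual growth, 2000-2005
print(round(growth_factor, 2))               # 1.99
print(round(annual_rate * 100, 1))           # ~14.7% a year
```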
One of the main causes of this poor rise was that many households were affected by increases in the Bank of England base rate. Moreover, the weakening demand among first-time buyers, together with decreased numbers of house sales and low application rates for mortgages, implies that house prices have become separated from their underpinnings. The Nature of the Housing Market Housing markets are peculiar for a number of reasons. First, houses take time to build, so when demand rises, supply can only respond with a considerable lag; indeed, to all intents, the short-run supply of housing is fixed. Second, houses are an asset that pays an implicit income (that is, the amount of rent that the owner saves by owning a house), so the value of the house should reflect expectations about future rents. But more importantly, since house-ownership in the UK is so widespread, a house is most households' most important asset, and since prices can go down as well as up, households are thereby exposed to a considerable amount of risk (almost half a million households had their homes repossessed in the 1990s). Unfortunately, it is not really possible to offset this risk, since nobody offers insurance against a fall in prices. The Global Economic Recession It seems to be agreed that the financial crisis which gave birth to the current global economic recession formed in the early 2000s as a result of several factors which influenced increased housing sales and increased mortgage lending [Sakbani (2009), Turalay (2009), Sel (2009)]. One of the main factors behind the financial crisis was the boom in the housing market, which resulted from an increased supply of housing that persuaded financial institutions to extend mortgages at attractive rates which borrowers could not afford to pay back.
At the time of increased mortgage lending, the mortgage lenders had liquid assets at a level never seen before, and this encouraged them to invest those assets in higher-earning assets. This boom gave mortgage lenders an opportunity to double their portfolio of mortgage lending relative to the past 10 years, and mortgages reached some 50 per cent of their total lending assets after 2001 (Sakbani, 2009). The second factor which influenced housing sales was the record low interest rates put in place by major banks to attract would-be house buyers into taking out mortgages at very low rates. Another influence was the deregulation of financial institutions: there was an attitude throughout the major central banks of self-regulation, and with increased financial innovation, major banks tended to regulate themselves. The final major factor was the disappearance of the fear of inflation: as banks began to grow and increase their portfolios, their self-confidence also grew, any fears previously held started to disappear, and this relaxed their customer vigilance (Sakbani, 2008). As the demand for housing rose in the last decade and a half, it reached a record high in all major countries, including the UK and the USA. In the USA in particular, housing units sold in 2005 reached a peak of 1,283,000, as compared to an average of 609,000 in 1995-2000; more than 6 million units were sold in the five years up to 2006 (US Economic Forecast, 2009). The effects of this increased the wealth and disposable income available to households, which in turn increased the growth of the US economy up to 2007. It is recognised, however, that this increase in economic activity and housing sales would not have taken place without the availability of cheap mortgages in the USA and UK up to 2005 and the prevalence of substantially low interest rates (IMF, 2008).
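The scale of the US boom described above follows directly from the unit figures quoted:

```python
# Compare US housing units sold (figures as cited: US Economic Forecast, 2009).
avg_1995_2000 = 609_000    # average annual units sold, 1995-2000
peak_2005 = 1_283_000      # units sold at the 2005 peak

ratio = peak_2005 / avg_1995_2000
print(round(ratio, 2))  # ~2.11: the 2005 peak was more than double the late-1990s average
```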
The major banks began to operate under reduced regulation, and with the global financial markets now in full swing, this increased the housing boom in the UK, as some mortgages contained grace periods of up to three years and required minimal down payments; the introduction of low interest rates only fuelled the boom further. Furthermore, the mortgages being taken out were by borrowers who would originally have been considered non-creditworthy or, at the very least, borrowers who had incurred debts beyond their capacity to pay back (Ronald, 2008). As the banks began to carry these debts, they ensured that the higher the risk, the higher the lending rate, which gave rise to the subprime mortgage market: a market whose borrowers may have difficulty maintaining the repayment schedule. Proponents of subprime lending maintain that the practice extends credit to people who would otherwise not have access to the credit market. As Professor Rosen of Princeton University explained, "The main thing that innovations in the mortgage market have done over the past 30 years is to let in the excluded: the young, the discriminated against, the people without a lot of money in the bank to use for a down payment." It has now been agreed that this could only have ended one way: in the collapse of the housing market and financial institutions. As borrowers started to run out of finances to repay their mortgages and defaults began to increase, the rate of increase in housing prices started to fall and could not keep pace with the rate of debt, which meant that borrowers could not refinance their loans or sell their houses at large profits [(The) Economist (2008), Sakbani (2008), Elise (2008)]. One way this could have been prevented: had banks extended their mortgage loans under the old conditions of mortgage lending, they would have had to hold them on their books and would eventually have run out of funds.
But starting in the late 1980s, financial innovations made it possible for mortgage lenders to unload their loans to pools, which can transform these personalised, non-negotiable obligations into derivative securities guaranteed by the mortgages (Sakbani, 2008). After the crisis erupted, the International Monetary Fund (IMF, 2008) estimated the size of these securities at more than $945 billion, while Goldman Sachs put them at more than $1.0 trillion. In September 2008, the IMF revised its estimate to $1.4 trillion ((The) Economist, 2008), and on January 28, 2009 it revised the estimate once more, to $2.2 trillion. These successive revisions suggest that nobody had any real idea of the amount of non-performing assets. Sakbani (2008) suggests that many culprits were directly related to the financial crisis of 2008: the greedy banks and other financial institutions with their irresponsible and uninformed behaviour; the equally greedy borrowers; the absence of regulations covering all the financial institutions involved, not just banks; the lacunae in vigilant supervision at both state and federal levels; the non-regulated and non-transparent character of the financial innovations; the failure of the rating agencies to do their job; and, finally, the loose monetary policy of the Greenspan era in the years 2001-2004. Mr Greenspan, testifying on October 23, 2008 before a Congressional Committee, admitted his error in believing that investment managers would exercise prudence in their operations, and accepted that the regulatory system was loose and fundamentally obsolete. Since the beginning of the economic recession, there has been a large reduction in new housing starts following reduced sales. Berkeley Homes, for example, reported sales down by 50% in the summer of 2008; with housebuilders' shares also falling to low levels, major financing problems continue.
Housing Developments Policy Turalay (2008) suggests that at the beginning of the downturn the position of the UK housing market did not appear that bad: a gradual slowdown in housing sales was expected, followed by a fairly rapid recovery that would not adversely affect the economy. However, this did not prove to be the case, and no one could have predicted what actually happened. UK economist Andrew Oswald had famously declared in November 2002: "I think we are about to go through the great housing crash of 2003 to 2005. . . . I advise you to sell your house, and move into rented accommodation. Panic will then set in" (Pickard, 2005, p. 9). Comparing July-October 2007 with July-October 2008, evidence suggests a fall in average sale prices of around 14 per cent (Land Registry, 2008). Pryce and Sprigings (2008) note that measuring price change is hampered by the fact that selling times have risen substantially, so indices are not comparing like with like: ideally one would compare, for example, the average price of houses that sold within a month on the market in 2007 with the average price of houses that sold within a month on the market in 2008. Evidence also suggests that transaction volumes fell dramatically, from around 111,000 sales per month in England and Wales between July and October 2007 to 45,000 sales per month between July and October 2008, a fall of about 60 per cent (Land Registry, 2008). Other data sources, including Halifax, Nationwide, the Land Registry and the Council of Mortgage Lenders (CML), also reported this fall. Some locations show even greater falls, with city-centre flat and apartment markets appearing particularly vulnerable.
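The fall of roughly 60 per cent can be recomputed from the Land Registry volumes quoted above:

```python
# Verify the fall in monthly transaction volumes (Land Registry, 2008, as cited above).
sales_jul_oct_2007 = 111_000   # average monthly sales, England and Wales
sales_jul_oct_2008 = 45_000

fall = (sales_jul_oct_2007 - sales_jul_oct_2008) / sales_jul_oct_2007
print(round(fall * 100, 1))  # ~59.5%, i.e. the roughly 60 per cent fall cited
```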
Despite Oswald's prediction, real average house prices rose at one of the steepest rates recorded in modern times: by nearly a quarter in real terms, from £140,593 in 2003 quarter 1 to £173,412 in 2006 quarter 1, based on Nationwide real mix-adjusted house prices (see Figure 1 below), and they continued to rise for a further two years, until quarter 4 of 2007. Figure 1 Real House Prices There were significant early interventions from the government and the Bank of England to keep both the housing market and the wider economy on course: consecutive cuts to base rates, the addition of £50bn of liquidity into the finance markets by the Bank of England to alleviate the credit crunch, and a £2.7bn fiscal package to compensate low-income households for the withdrawal of the 10p tax rate. It was expected that these would combine into an apparently positive reinforcement; however, this proved not to be the case, as in March 2008 initial indications emerged that a rather more rapid slowdown in the housing sector was about to develop. The RICS housing market survey of that month indicated that surveyor sentiment with regard to house prices had weakened to its lowest point since the survey began in 1978, and the ratio of completed sales in the previous three months to the stock of unsold property on the market fell to 0.224, the lowest since September 1996 (RICS UK Economic Brief, 2008). With mortgage approvals falling by 44 per cent in the same year (2008), there was a significant fall in housing demand, which led to banks being unwilling to offer new loans on houses. Although it is no surprise that the housing market has taken a downturn, since this has happened before, Pryce and Sprigings suggest that the speed and severity of the decline have been unusual.
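The "nearly a quarter" rise in real prices can likewise be checked against the Nationwide figures quoted above:

```python
# Check the real house-price rise cited above (Nationwide, 2003 Q1 to 2006 Q1, GBP).
price_2003q1 = 140_593
price_2006q1 = 173_412

rise = (price_2006q1 - price_2003q1) / price_2003q1
print(round(rise * 100, 1))  # ~23.3%, i.e. "nearly a quarter" in real terms
```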
They go on to express that this leads us naturally to question whether our policies, our regulatory frameworks, our collective approach to housing and our cultural obsession with house prices have in some way exacerbated this particular manifestation of the cycle, by sustaining the upswing well beyond the mean trend and perhaps producing an unnecessarily severe and rapid downturn (Pryce and Sprigings, 2008). These questions, however, are not solely of interest to housing professionals, as the links between residential property and the broader economy are well recognised. An example is given by Goodhart and Hofmann (2008, p. 180), who find "a significant multidirectional link between house prices, monetary variables, and the macroeconomy with the effects of money and credit amplified when house prices are booming". Maclennan and Pryce agree that housing impacts on the real economy via the construction, financial, estate agency and legal sectors and through housing-equity-financed consumption, all of which are sensitive to housing market fluctuations, and all of which have become increasingly inter-linked across nations as a result of the globalisation of capital and labour (Maclennan and Pryce, 1996). It is also agreed by numerous authors, Malpass in particular, that housing impacts on welfare: not only through homelessness caused by repossessions (i.e. owner-occupiers and renters affected by landlord default) at a time of crisis, but increasingly through equity-release funding of education support (including accommodation) at the start of life and of elderly care at its end (Malpass, 2005). Another development which supports Malpass' argument is the announcement by the Homes and Communities Agency (HCA) confirming the closure of Local Authority New Build (LANB) as a national programme.
This is a result of the Treasury announcing that it was cutting  £220 million from HCAà ¢Ã¢â€š ¬Ã¢â€ž ¢s budget, this follows on from the cut to the May budget of  £230 million. The new builds where seen as a solution to ease the housing crisis of the UK since the recession and to add to Malpassà ¢Ã¢â€š ¬Ã¢â€ž ¢ argument, Baroness Hanham stated in the House of Lords; à ¢Ã¢â€š ¬Ã…“There will be casualties; I donà ¢Ã¢â€š ¬Ã¢â€ž ¢t have any doubt that there will be casualtiesà ¢Ã¢â€š ¬? Furthermore to this statement, Labours Lord McKenzie warned à ¢Ã¢â€š ¬Ã…“It will force many to move or end up homeless and create ghettos of the poorà ¢Ã¢â€š ¬?. Unfortunately, the literature and policy debates on the nature and consequences of housing markets have evolved rather dichotomously. As Maclennan (2008, p. 424) observed; à ¢Ã¢â€š ¬Ã…“Many nations are now involved in two housing discussions, namely à ¢Ã¢â€š ¬Ã…“homelessness and affordabilityà ¢Ã¢â€š ¬? and à ¢Ã¢â€š ¬Ã‹Å"à ¢Ã¢â€š ¬Ã‹Å"house price booms, bubbles and bustsà ¢Ã¢â€š ¬Ã¢â€ž ¢Ãƒ ¢Ã¢â€š ¬Ã¢â€ž ¢. The first theme has largely been the domain of social policy ministries, lobbies and researchers (Carter and Polevychok, 2004).The second has absorbed the macroeconomic policy community, including central banks, finance ministries, financial institutions and some academic economists, who are concerned about à ¢Ã¢â€š ¬Ã…“stabilityà ¢Ã¢â€š ¬?. Affordability and stability are often discussed as if they are unrelated, not just in the press, but also within policymaking circles.à ¢Ã¢â€š ¬? Researchers can now endeavour to bridge this gap in housing discussions. By using the analogy of sowing and reaping, à ¢Ã¢â€š ¬Ã‹Å"whatsoever a man soweth, that shall he also reapà ¢Ã¢â€š ¬Ã¢â€ž ¢ (Galations 6:7, King James Version). 
It can be highlighted how scrupulous aspects of the existing recession should require policy makers and researchers to reflect on the failures of policy that have arisen as a result of the à ¢Ã¢â€š ¬Ã…“fragmented nature of housing thinking within modern governmentsà ¢Ã¢â€š ¬? (Maclennan, 2008). Pryce and Sprigings propose that the great correction that is currently underway is a consequence, not only of transcendent global forces, but also significantly of UK policy decisions on financial liberalisation and housing. And if we are reaping what we have sown in domestic policy, who are the winners and losers, and what are the implications for how we evaluate UK post-war policy? It has been made clear that such issues are underpinned by major policy, theoretical, and empirical questions that will most probably be debated at length in the future. What Pryce and Sprigings have done, is highlighted the issues and hope that highlighting these issues will offer some key pointers as to how the future debate should be structured and what might be done to ensure a more integrated approach to modernising UK housing policies. It is argued that successive governments i.e. Conservative Party and Labour Party have promoted homeownership since the end of the Second World War and its benefits it brings financially to the lease holder if they are the occupier as one of the White Papers show from 1953, which states; à ¢Ã¢â€š ¬Ã…“One object of future housing policy will be to continue to promote, by all possible means, the building of new houses for owner occupation. Of all forms of saving this is one of the best. Of all forms of ownership this is one of the most satisfying to the individual and the most beneficial to the nationà ¢Ã¢â€š ¬? (1953White Paper, Houses: The Next Step). Gradually homeownership became deeply embedded in the UK psyche as the tenure of aspiration (Ronald, 2008). 
However, people then become aware that homeownership may not be best suited for everyone and this is a point that is raised by Sprigings (2008) where he identified that by encouraging low-income households into homeownership, we are subjecting them to the worst of its costs and risks while the market may restrict for them the potential of its benefits. This idea was also backed up by Pickard (2005) where he stated that housing is believed to be a great long-term investment on average, but for the deprived areas, and for the poorest households, homeownership may simply not produce the promised benefits. Housing developments and the global recession can be seen as interlinked with certain groups of society and those in less secure jobs as people on low income will bear the biggest brunt of the recession as low income workers and people in less secure jobs are more than likely to face financial difficulties when it comes to mortgage repayments as they are likely to lose their jobs or see rising inflation and rising interest rates and therefore low income households are likely to leave homeownership at the worst point because they are facing the biggest impact of the recession and also when the market begins to resume to normality again, low-income households may find it harder to re-enter the housing market when house prices are low because there is a proven correlation between credit being made available and housing prices and low-income households may not be able to obtain credit when house prices are still low therefore not enabling them to enter the housing market when it seems mo st beneficial. 
The CML also back up this idea as figures for October 2008 show that, the value of loans has decreased to 83 per cent of the value of the property therefore, as it has been established that long term dividends on housing can be superior, low-income households will find it difficult to witness these dividends as they will be exiting the housing market when it begins to deteriorate and trying to enter the housing market when it is difficult to obtain credit. Pryce (2008) seems to perceive that the promotion of homeownership by successive UK governments and therefore the rapid increase of owner occupation may have inadvertently produced a money pump working in the opposite direction. Another theory which Pryce (2008) identifies is the fact that low-income and particularly ethnic groups are less likely to enjoy the benefits of inter-generational housing welfare transfer. Keister (2003) also backs up the second theory of Pryce (2008) by identifying that children from larger families accumulate less wealth than do those from smaller families and that siblings dilute parentsà ¢Ã¢â€š ¬Ã¢â€ž ¢ finite financial resources and non material resources. Sibship size also reduced that likelihood of receiving a trust account or an inheritance and decreases home and stock ownership. Buy-to-Let Mortgages Buy-to-Let mortgages where developed in 1995 and where designed as a new financial product in the UK which enabled individuals to purchase a mortgage on a property for the purpose of letting the property out to future tenants. The benefits from these mortgages can include a stable income from rental receipts, as well as an accumulation of wealth if house prices go up. However one of the main factors of risk with taking out a buy-to-let mortgage is leverage speculation where the landlord purchases a property expecting to sell the house at a later date for a higher price or that rental income will exceed the repayment amounts of the initial loan. 
Buy-to-Let mortgages have became extremely popular with apprentice investors as this type of mortgage attracts middle income people to start to develop into small-scale landlords as a means of investing for their retirement. The volume of these loans grew rapidly in value as shown in Figure 2. Figure 2 BTL loan Pryce (2008) expresses concern at the fact that 90 per cent of total BTL advances since 1999 have been taken out during periods of above-trend house prices, and  £74 billion of BTL mortgages, which is more than half of the total BTL advances since 1999, were issues at the very peak of the housing boom. This can be seen in Figure 3. Fig 3 It is therefore in agreement that, a significant proportion of BTL loans are at risk because there is consensus that the value of securities will fall below the outstanding mortgage debts. This consensus is backed-up by the fact that repossessions on BTL properties as a per cent of all BTL mortgages almost doubled in the space of 18 months from the second half of 2005 to the first half of 2007 before the first round of gloomy house price results were released in late 2007. Latest CML data also reinforces this claim as they show a large increase in BTL accounts over three months in arrears at the third quarter of 2008 having trebled in number in 12 months to around 18,000. (Pryce and Sprigings 2008). If home owners begin to default on their loans then the impact could be significant not only for lenders, but for particular sectors of the housing market as 80 per cent of BTL properties are terraced of flats and these account for almost a third of the entire UK private rented stock (Sprigings, 2008). One of the key features of the BTL which there is much agreement on is the impact it seems to have had on new housing supply with flats coming to dominate supply, particularly in city markets. (Taylor 2008, Sprigings 2008). 
Effects of the Recession on the Housing Market

Introduction

This part of the dissertation seeks to understand and investigate the cause of the current global recession and how it has affected the housing market in the UK.

Housing Market Trends

After the housing market's spectacular collapse in the 1990s, the UK housing market staged a significant revival. According to the HBOS index, the average house price stood at about £163,000 in 2005, approximately double the £82,000 it would have been worth in 2000. Cameron (2005) suggests that house prices surpassed their 1989 peak relative to average household incomes. The other traditional measure of affordability, the ratio of interest payments to income, is not so overstretched, but only if capital repayments and unsecured debt are ignored. In addition, the strength of the housing market reflects the exceptional performance of the economy in 2005, which in turn is partially due to the sensible independent monetary policy pursued by the Bank of England (Cameron, 2005). As a result, it is suggested that Britain weathered the world economic slowdown of 2001-2003 a great deal better than the majority of major economies, producing six per cent growth. This vigorous expansion cannot completely explain the strength of the house price boom. Consequently, numerous economists have argued that there is a bubble in the British housing market, in common with a number of other countries, such as Spain, Australia, Canada, Sweden, and parts of the USA.

Figure 1

Figure 1 shows the ratio of average house prices to average earnings, a key measure of affordability, for Great Britain and three major regions up to 2004, before the economic recession struck. As is visible, the series display similar cyclical behaviour, with a striking rise since 1999.
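The affordability measure used in Figure 1 can be sketched as a simple ratio. The £163,000 average price is from the text (HBOS index, 2005); the earnings figure below is a hypothetical round number used only to illustrate the arithmetic, not a figure from the dissertation.

```python
# Price-to-earnings affordability ratio: average house price divided by
# average annual earnings. Higher values mean housing is less affordable.

def price_to_earnings(avg_price, avg_earnings):
    return avg_price / avg_earnings

# £163,000 is the 2005 HBOS average price quoted in the text;
# £27,000 is a hypothetical average-earnings figure for illustration.
ratio = price_to_earnings(163_000, 27_000)
print(f"Price-to-earnings ratio: {ratio:.1f}")
```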
According to the HBOS index, prices rose by only 1.3% over the nine months from July 2004 to April 2005. One of the main causes of this weak growth was that many households were affected by increases in the Bank of England base rate. Moreover, weakening demand among first-time buyers, together with falling numbers of house sales and low application rates for mortgages, implies that house prices had become detached from their fundamentals.

The Nature of the Housing Market

Housing markets are peculiar for a number of reasons. First, houses take time to build, so when demand rises, supply can only respond with a considerable lag; indeed, to all intents and purposes, the short-run supply of housing is fixed. Second, houses are an asset that pays an implicit income (that is, the amount of rent that the owner saves by owning a house), so the value of the house should reflect expectations about future rents. But more importantly, since home-ownership in the UK is so widespread, a house is most households' most important asset, and since prices can go down as well as up, households are thereby exposed to a considerable amount of risk (almost half a million households had their homes repossessed in the 1990s). Unfortunately, it is not really possible to offset this risk, since nobody offers insurance against a fall in prices.

The Global Economic Recession

It seems to be agreed that the financial crisis which gave birth to the current global economic recession was formed in the early 2000s as a result of several factors which influenced increased housing sales and increased mortgage lending.
[Sakbani (2009), Turalay (2009), Sel (2009)] One of the main factors behind the financial crisis was the boom in the housing market, which resulted from an increased supply of housing and persuaded financial institutions to extend mortgages at attractive rates to borrowers who could not afford to repay them. At the time of this increased mortgage lending, lenders held liquid assets at levels never seen before, which encouraged them to invest in higher-earning assets. The boom gave mortgage lenders an opportunity to double their portfolio of mortgage lending relative to the previous 10 years, and mortgages reached some 50 per cent of their total lending assets after 2001 (Sakbani, 2009). The second factor influencing housing sales was the record low interest rates put in place by major banks to attract would-be house buyers into taking out mortgages at very low rates. A further influence was the deregulation of financial institutions: an attitude of self-regulation prevailed among the major central banks, and with increased financial innovation, major banks tended to regulate themselves. The final major factor was the disappearance of the fear of inflation: as banks grew and increased their portfolios, their self-confidence also grew, previously held fears receded, and customer vigilance was relaxed (Sakbani, 2008). As demand for housing rose over the last decade and a half, it reached a record high in all major countries, including the UK and USA. In the USA in particular, housing units sold in 2005 reached a peak of 1,283,000, compared to an average of 609,000 in 1995-2000. More than 6 million units were sold in the five years up to 2006 (US Economic Forecast, 2009).
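The link between low interest rates and mortgage affordability described above follows from the standard annuity formula for a repayment mortgage. The sketch below is illustrative only: the loan size and rates are hypothetical round numbers, not figures from the dissertation.

```python
# How the interest rate feeds into the monthly mortgage payment via the
# standard annuity formula: payment = P * r / (1 - (1 + r)^-n), where r is
# the monthly rate and n the number of monthly payments. Lower base rates
# mean lower payments, drawing more buyers into the market.

def monthly_payment(principal, annual_rate, years):
    """Monthly payment on a standard repayment mortgage."""
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # total number of monthly payments
    if r == 0:
        return principal / n      # degenerate zero-interest case
    return principal * r / (1 - (1 + r) ** -n)

loan = 150_000  # hypothetical loan on a 25-year term
for rate in (0.06, 0.04):
    print(f"{rate:.0%} rate: £{monthly_payment(loan, rate, 25):,.2f}/month")
```

The two-percentage-point difference shown here is the kind of gap that separated mid-2000s rates from their historical norms.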
The effect of this was to increase the wealth and disposable income available to households, which in turn increased the growth of the US economy up to 2007. It is recognised, however, that this increase in economic activity and housing sales would not have taken place without the ready availability of cheap mortgages in the USA and UK up to 2005 and the sustained period of low interest rates (IMF, 2008). The major banks began to operate under reduced regulation and, with the global financial markets now in full swing, this amplified the housing boom in the UK: some mortgages carried grace periods of up to three years, minimal down payments were required, and low interest rates only fuelled the boom further. Furthermore, many of these mortgages were taken out by borrowers who would previously have been considered non-creditworthy or, at the very least, who incurred debts beyond their capacity to repay (Ronald, 2008). As the banks took on these debts, they ensured that the higher the risk, the higher the lending rate, which gave rise to the subprime mortgage market: a market whose borrowers may have difficulty maintaining the repayment schedule. Proponents of subprime lending maintain that the practice extends credit to people who would otherwise not have access to the credit market. As Professor Rosen of Princeton University explained, "The main thing that innovations in the mortgage market have done over the past 30 years is to let in the excluded: the young, the discriminated against, the people without a lot of money in the bank to use for a down payment." It is now agreed that this could only have ended one way: in the collapse of the housing market and financial institutions.
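The risk-based pricing described above ("the higher the risk, the higher the lending rate") can be sketched as a break-even premium over the base rate. This is a stylised illustration, not the pricing model of any actual lender; all figures are hypothetical.

```python
# Stylised risk-based loan pricing: the lender charges a premium over the
# base rate equal to the expected loss on the loan, i.e. the probability of
# default times the fraction of the loan not recovered on default.

def lending_rate(base_rate, default_prob, recovery=0.6):
    premium = default_prob * (1 - recovery)  # expected loss per £1 lent
    return base_rate + premium

# Hypothetical borrowers: a prime borrower with a 1% annual default
# probability versus a subprime borrower with a 15% probability.
print(f"prime:    {lending_rate(0.05, 0.01):.2%}")
print(f"subprime: {lending_rate(0.05, 0.15):.2%}")
```

The gap between the two rates is what made subprime lending appear profitable while house prices were still rising.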
As borrowers started to run out of funds to repay their mortgages and defaults began to increase, the rate of increase in housing prices started to fall and could not keep pace with the growth of debt, which meant that borrowers could neither refinance their loans nor sell their houses at large profits [(The) Economist (2008), Sakbani (2008), Elise (2008)]. Had banks extended their mortgage loans under the old conditions of mortgage lending, they would have had to hold them on their books and would eventually have run out of funds. But starting in the late 1980s, financial innovations made it possible for mortgage lenders to unload their loans to pools, which transform these personalised, non-negotiable obligations into derivative securities guaranteed by the mortgages (Sakbani, 2008). After the crisis erupted, the International Monetary Fund (IMF, 2008) estimated the size of these securities at more than $945 billion, while Goldman Sachs put them at more than $1.0 trillion. In September 2008, the IMF revised its estimate to $1.4 trillion ((The) Economist, 2008), and on January 28, 2009 it revised the estimate once more, to $2.2 trillion. This succession of estimates suggests that nobody had any real idea of the scale of the non-performing assets. Sakbani (2008) suggests that many culprits were directly implicated in the financial crisis of 2008: the greedy banks and other financial institutions with their irresponsible and uninformed behaviour; the equally greedy borrowers; the absence of regulations covering all the financial institutions involved, not just banks; the lacunae in vigilant supervision at both state and federal levels; the non-regulated and non-transparent character of the financial innovations; the failure of the rating agencies to do their job; and the loose monetary policy of the Greenspan era in the years 2001-2004.
Mr Greenspan, testifying on October 23, 2008 before a Congressional Committee, admitted his error in believing that investment managers would exercise prudence in their operations, and accepted that the regulatory system was loose and fundamentally obsolete. Since the beginning of the economic recession there has been a sharp reduction in new housing starts, following reduced sales. Berkeley Homes, for example, reported sales down by 50% in the summer of 2008, and with housebuilders' shares falling to low levels, the sector continues to suffer major financing problems.

Housing Developments Policy

Turalay (2008) suggests that at the beginning of the downturn the position of the UK housing market did not appear that bad: a gradual slowdown in housing sales was expected, followed by a fairly rapid recovery that would not adversely affect the economy. This did not prove to be the case, however, and no one could have predicted what actually happened. The UK economist Andrew Oswald had famously declared in November 2002: "I think we are about to go through the great housing crash of 2003 to 2005. . . . I advise you to sell your house, and move into rented accommodation. Panic will then set in" (Pickard, 2005, p. 9). Comparing the period July-October 2007 with July-October 2008, evidence suggests a fall in average sale prices of around 14 per cent (Land Registry, 2008). Pryce and Sprigings (2008) note that measuring price change is hampered by the fact that selling times have risen substantially, so indices are not comparing like with like: ideally one would compare, for example, the average price of houses that sold within a month on the market in 2007 with the average price of houses that sold within a month on the market in 2008.
Evidence also suggests that transaction volumes have fallen dramatically, from around 111,000 sales per month in England and Wales between July and October 2007 to 45,000 sales per month between July and October 2008, a fall of 60 per cent (Land Registry, 2008). Other data sources, including Halifax, Nationwide, the Land Registry and the Council of Mortgage Lenders (CML), reported similar falls. Some locations are showing even greater declines, with city-centre flat and apartment markets appearing particularly vulnerable. During the period covered by Oswald's prediction, real average house prices rose at one of the steepest rates recorded in modern times, by nearly a quarter in real terms, from £140,593 in 2003 quarter 1 to £173,412 in 2006 quarter 1, based on Nationwide real mix-adjusted house prices (see Figure 1 below), and continued to rise for a further two years until quarter 4 of 2007.

Figure 1 Real House Prices

There appear to have been significant early interventions from the government and the Bank of England to keep both the housing market and the wider economy on course: consecutive cuts to base rates, the addition of £50bn of liquidity into the finance markets by the Bank of England to alleviate the credit crunch, and a £2.7bn fiscal package to compensate low-income households for the withdrawal of the 10p tax rate. It was expected that these would combine to form a positive reinforcement; however, this proved not to be the case, as in March 2008 initial indications emerged that a rather more rapid slowdown in the housing sector was about to develop. The RICS housing market survey of that month indicated that surveyor sentiment with regard to house prices had weakened to the lowest point since the survey began in 1978, and that the ratio of completed sales in the previous three months to the stock of unsold property on the market had fallen to 0.224, the lowest since September 1996 (RICS UK Economic Brief, 2008).
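The two percentage changes quoted above can be verified directly from the figures given in the text (the Land Registry transaction counts and the Nationwide real price levels):

```python
# Check of the cited figures: the fall in monthly transactions between
# July-October 2007 and July-October 2008, and the real house price rise
# between 2003 Q1 and 2006 Q1. All inputs are taken from the text.

def pct_change(old, new):
    return (new - old) / old * 100

fall = pct_change(111_000, 45_000)   # monthly sales, 2007 vs 2008
rise = pct_change(140_593, 173_412)  # real prices, 2003 Q1 vs 2006 Q1

print(f"Transactions: {fall:.1f}%")  # ≈ -59.5%, the '60 per cent' fall
print(f"Real prices: +{rise:.1f}%")  # ≈ +23.3%, 'nearly a quarter'
```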
With mortgage approvals falling by 44 per cent in the same year (2008), the result was a significant fall in housing demand, which left banks unwilling to offer new loans on houses. Although it is no surprise that the housing market took a downturn, since this has happened before and no unexpected events are occurring, Pryce and Sprigings suggest that the speed and severity of the decline has been unusual. They go on to argue that this leads us naturally to question whether our policies, our regulatory frameworks, our collective approach to housing and our cultural obsession with house prices have in some way exacerbated this particular manifestation of the cycle, by sustaining the upswing well beyond the mean trend and perhaps producing an unnecessarily severe and rapid downturn (Pryce and Sprigings, 2008). These questions, however, are not solely of interest to housing professionals, as the links between residential property and the broader economy are well recognised. An example is given by Goodhart and Hofmann (2008, p. 180), who find "a significant multidirectional link between house prices, monetary variables, and the macroeconomy with the effects of money and credit amplified when house prices are booming". Maclennan and Pryce agree that housing impacts on the real economy via the construction, financial, estate agency and legal sectors and through housing-equity-financed consumption, all of which are sensitive to housing market fluctuations, and all of which have become increasingly inter-linked across nations as a result of the globalisation of capital and labour (Maclennan and Pryce, 1996). Numerous authors, Malpass in particular, also agree that housing impacts on welfare: not only through homelessness caused by repossessions (i.e.
owner occupiers and renters affected by landlord default) at a time of crisis, but increasingly through equity release funding education support (including accommodation) at the start of life and elderly care at the end (Malpass, 2005). Another development that supports Malpass's argument is the announcement by the Homes and Communities Agency (HCA) confirming the closure of Local Authority New Build (LANB) as a national programme. This follows the Treasury's announcement that it was cutting £220 million from the HCA's budget, on top of the £230 million cut in the May budget. The new builds were seen as a way to ease the UK's housing crisis since the recession, and, adding to Malpass's argument, Baroness Hanham stated in the House of Lords: "There will be casualties; I don't have any doubt that there will be casualties." Labour's Lord McKenzie further warned: "It will force many to move or end up homeless and create ghettos of the poor." Unfortunately, the literature and policy debates on the nature and consequences of housing markets have evolved rather dichotomously. As Maclennan (2008, p. 424) observed: "Many nations are now involved in two housing discussions, namely 'homelessness and affordability' and 'house price booms, bubbles and busts'. The first theme has largely been the domain of social policy ministries, lobbies and researchers (Carter and Polevychok, 2004). The second has absorbed the macroeconomic policy community, including central banks, finance ministries, financial institutions and some academic economists, who are concerned about 'stability'.
Affordability and stability are often discussed as if they are unrelated, not just in the press, but also within policymaking circles." Researchers can now endeavour to bridge this gap in housing discussions, using the analogy of sowing and reaping: "whatsoever a man soweth, that shall he also reap" (Galatians 6:7, King James Version). Particular aspects of the current recession should prompt policy makers and researchers to reflect on the failures of policy that have arisen from the "fragmented nature of housing thinking within modern governments" (Maclennan, 2008). Pryce and Sprigings propose that the great correction currently underway is a consequence not only of transcendent global forces but also, significantly, of UK policy decisions on financial liberalisation and housing. And if we are reaping what we have sown in domestic policy, who are the winners and losers, and what are the implications for how we evaluate UK post-war policy? Such issues are underpinned by major policy, theoretical, and empirical questions that will most probably be debated at length in the future. What Pryce and Sprigings have done is highlight these issues, in the hope that doing so will offer some key pointers as to how the future debate should be structured and what might be done to ensure a more integrated approach to modernising UK housing policies. It is argued that successive governments, Conservative and Labour alike, have promoted homeownership since the end of the Second World War, along with the financial benefits it brings to the owner-occupier, as a White Paper from 1953 shows: "One object of future housing policy will be to continue to promote, by all possible means, the building of new houses for owner occupation. Of all forms of saving this is one of the best.
Of all forms of ownership this is one of the most satisfying to the individual and the most beneficial to the nation" (1953 White Paper, Houses: The Next Step). Gradually, homeownership became deeply embedded in the UK psyche as the tenure of aspiration (Ronald, 2008). However, it later became apparent that homeownership may not be best suited to everyone, a point raised by Sprigings (2008), who identified that by encouraging low-income households into homeownership we subject them to the worst of its costs and risks while the market may restrict for them the potential of its benefits. This idea is backed up by Pickard (2005), who stated that housing is believed to be a great long-term investment on average, but that in deprived areas, and for the poorest households, homeownership may simply not produce the promised benefits. Housing developments and the global recession can be seen as interlinked for certain groups in society. People on low incomes and in less secure jobs will bear the biggest brunt of the recession: they are the most likely to face financial difficulties with mortgage repayments, because they are more likely to lose their jobs or be squeezed by rising inflation and rising interest rates. Low-income households are therefore likely to leave homeownership at the worst point, because they face the biggest impact of the recession. Moreover, when the market begins to return to normality, low-income households may find it harder to re-enter the housing market while house prices are low: there is a proven correlation between the availability of credit and house prices, and low-income households may not be able to obtain credit while prices are still low, preventing them from entering the market at the point when it would be most beneficial.
The CML figures for October 2008 back up this idea, showing that the value of loans had decreased to 83 per cent of the value of the property. Thus, although it has been established that long-term returns on housing can be superior, low-income households will find it difficult to realise those returns, as they will be exiting the housing market when it begins to deteriorate and trying to enter it when credit is difficult to obtain. Pryce (2008) perceives that the promotion of homeownership by successive UK governments, and the consequent rapid increase in owner occupation, may have inadvertently produced a money pump working in the opposite direction. Another theory Pryce (2008) identifies is that low-income and particularly ethnic-minority groups are less likely to enjoy the benefits of inter-generational housing welfare transfer. Keister (2003) supports this second theory, identifying that children from larger families accumulate less wealth than those from smaller families, and that siblings dilute parents' finite financial and non-material resources. Sibship size also reduces the likelihood of receiving a trust account or an inheritance, and decreases home and stock ownership.

Buy-to-Let Mortgages

Buy-to-let mortgages were developed in 1995 as a new financial product in the UK, enabling individuals to take out a mortgage on a property for the purpose of letting it to tenants. The benefits of these mortgages can include a stable income from rental receipts, as well as an accumulation of wealth if house prices rise. However, one of the main risk factors in taking out a buy-to-let mortgage is leveraged speculation, where the landlord purchases a property expecting to sell it at a later date for a higher price, or expecting rental income to exceed the repayments on the initial loan.
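The 83 per cent figure quoted above is a loan-to-value (LTV) ratio, which can be sketched as follows. The LTV percentage is from the text; the property value is a hypothetical round number used only to make the arithmetic concrete.

```python
# Loan-to-value ratio: the loan as a percentage of the property's value.
# At 83% LTV, the buyer must fund the remaining 17% as a deposit, which
# is the barrier facing low-income households trying to re-enter the market.

def loan_to_value(loan, property_value):
    return loan / property_value * 100

value = 200_000          # hypothetical property value
loan = value * 0.83      # CML's October 2008 average LTV, from the text
print(f"LTV: {loan_to_value(loan, value):.0f}%")
print(f"Deposit required: £{value - loan:,.0f}")
```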
Buy-to-let mortgages became extremely popular with novice investors, attracting middle-income people into becoming small-scale landlords as a means of investing for their retirement. The volume of these loans grew rapidly in value, as shown in Figure 2.

Figure 2 BTL loans

Pryce (2008) expresses concern that 90 per cent of total BTL advances since 1999 were taken out during periods of above-trend house prices, and that £74 billion of BTL mortgages, more than half of the total BTL advances since 1999, were issued at the very peak of the housing boom. This can be seen in Figure 3.

Figure 3

It is therefore agreed that a significant proportion of BTL loans are at risk, as there is a consensus that the value of the securities will fall below the outstanding mortgage debts. This consensus is backed up by the fact that repossessions on BTL properties, as a percentage of all BTL mortgages, almost doubled in the space of 18 months from the second half of 2005 to the first half of 2007, before the first round of gloomy house price results were released in late 2007. The latest CML data reinforce this claim, showing a large increase in BTL accounts more than three months in arrears by the third quarter of 2008, having trebled in number in 12 months to around 18,000 (Pryce and Sprigings, 2008). If home owners begin to default on their loans, the impact could be significant not only for lenders but for particular sectors of the housing market, as 80 per cent of BTL properties are terraced houses or flats, and these account for almost a third of the entire UK private rented stock (Sprigings, 2008). One of the key features of BTL on which there is much agreement is the impact it appears to have had on new housing supply, with flats coming to dominate supply, particularly in city markets (Taylor, 2008; Sprigings, 2008).

Figure 4
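The leveraged-speculation risk described in this section can be sketched as a simple cash-flow comparison. All figures below are hypothetical, chosen only to illustrate how a BTL position that looks safe in a boom turns loss-making in a bust; they are not drawn from the dissertation's data.

```python
# A BTL landlord's position has two legs: monthly cash flow (rent minus
# mortgage interest, assuming an interest-only BTL loan) and the equity
# realised on sale (resale price minus outstanding debt). Both can turn
# negative when rents and prices fall.

def btl_position(price, deposit_frac, annual_rate, monthly_rent, resale_price):
    loan = price * (1 - deposit_frac)            # outstanding debt
    monthly_interest = loan * annual_rate / 12   # interest-only payment
    cashflow = monthly_rent - monthly_interest   # monthly rental surplus
    equity_on_sale = resale_price - loan         # proceeds of a sale
    return cashflow, equity_on_sale

# Boom scenario: rent covers interest and the price has risen.
cf_boom, eq_boom = btl_position(180_000, 0.15, 0.055, 800, 200_000)
# Bust scenario: same loan, but rent has fallen and the resale value is
# below the outstanding debt (negative equity).
cf_bust, eq_bust = btl_position(180_000, 0.15, 0.055, 650, 145_000)
print(f"Boom: cashflow £{cf_boom:.2f}/month, sale equity £{eq_boom:,.0f}")
print(f"Bust: cashflow £{cf_bust:.2f}/month, sale equity £{eq_bust:,.0f}")
```

The bust scenario is the combination flagged by Pryce: loans issued at the peak are the ones most likely to show both negative cash flow and negative equity at once.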