Effect Of Information And Communication Technology In A Corporate Organization

Proposal

At the beginning of the 21st century, information technology and communication became an integral part of the corporation. Information technologies help organizations to manage and control all processes, develop their infrastructure, and adapt to change. The most rapidly developing topic related to technology in recent years has been IT, with the developments coming so fast that everyone has difficulty keeping up with them and developing conclusive interpretations about their effects on organizations. The rapid advent of computer applications, the Internet, and other forms of information and communication technology has major implications for organizations and their management, but people have trouble saying exactly what effects they have and why. As for effects on public organizations, especially until recently, research has been scarce. The rationale for the research paper is to investigate and analyze new trends in information systems application, their impact on communication, and their effects on corporate design, structure, and productivity. It is assumed that advances in technology, especially computer, information, and communications technology, have presented organizations and managers with dramatic new challenges and opportunities, and researchers have been pressing to develop the theoretical and research grounding needed to understand and manage these developments.

Increasingly, organizational researchers and managers have to try to assess the influence of information technology on organizational design. The advent and dissemination of computers, the Internet, e-mail, and other forms of information and communication technology have transformed organizations and working life within them and continue to have dramatic effects. Managers’ strategic choices also determine structure. Managers may divide up an organization into divisions and departments designed to handle particular markets, products, or challenges that have been chosen for strategic emphasis. Organizations face varying degrees of uncertainty depending on how much more information they need than they actually have. As this uncertainty increases, the organizational structure must process more information. Organizations employ a mix of alternative modes for coordinating these activities. First they use the organizational hierarchy of authority, in which superiors direct subordinates, answering their questions and specifying rules and procedures for managing the information processing load. As uncertainty increases, it overwhelms these approaches. The next logical strategy, then, is to set plans and goals and allow subordinates to pursue them with less referral up and down the hierarchy and with fewer rules.

The objectives of the research are to (1) identify the main applications of information technology in a given organization; (2) analyze their impact on a corporate organization; (3) identify the main trends and factors which influence implementation of information technology; and (4) discuss the impact of information technology on employee relations. Experts on IT tend to report that the more salient effects in industry include the extension of computing technology into design and production applications, such as computer-aided design, in which computer programs carry out design functions, and computer-aided manufacturing, in which computers actually control machinery that carries out the manufacturing process. Computer-integrated manufacturing links together the machinery and the design and engineering processes through computers. Ultimately, an integrated information network links all major components of the organization, including inventory control, purchasing and procurement, accounting, and other functions, in addition to manufacturing and production. These developments, according to expert observers, support an evolution from mass production to mass customization, where manufacturers and service organizations produce large quantities of goods and services that are more tailored to the preferences of individual customers than previously possible. In addition, observers suggest that computerized integration of production processes has effects on organizational structures and processes. Computer-integrated manufacturing reportedly moves organizations toward fewer hierarchical levels, tasks that are less routine and more craftlike, more teamwork, more training, and more emphasis on skills in cognitive problem solving than on manual expertise. These components of the framework combine to influence the way technological initiatives play out. The framework helps to explain why even very similar technological initiatives can have very different outcomes, because of different organizational and institutional influences on their implementation. Fountain also describes how such influences raise formidable challenges for successful utilization in government, given the strong, often entrenched organizational and institutional influences.

The research will pay special attention to computer technology and its impact on a corporate organization. Computer technology and the Internet have also become more influential in organizational decision-making processes. For many years organizations have been using computers to store large data sets and retrieve information from them, but more recently the capacity for active utilization of that data has advanced, so that computer-based management information systems (MIS) have become very common. An MIS typically provides middle-level managers with ready access to data they can use in decision making, such as sales and inventory data for business managers, and client processing and status data for managers in public and nonprofit organizations. Decision support systems provide software that managers can use interactively. Many organizations currently utilize geographic information systems (GIS), which provide information about facilities or conditions in different geographic locations. A GIS might allow a planner to designate any particular geographic location in a city and pull up on the computer screen a diagram showing all the underground utility infrastructure, such as pipelines and electric cables, at that location. An executive information system provides MIS-type support, but at a more general, strategic level, for the sorts of decisions required at higher executive levels.
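
To make the kind of location-based lookup a GIS supports more concrete, the following minimal Python sketch shows the idea in miniature. It is an illustrative assumption only: the data structure, function name, and sample records are hypothetical and do not represent any specific vendor's GIS product or the systems discussed in this research.

```python
# Minimal, hypothetical sketch of a GIS-style lookup: a planner designates a
# location and retrieves the underground utility records stored for it.
# A real GIS would use geographic coordinates and a spatial index.

from dataclasses import dataclass

@dataclass
class UtilityAsset:
    kind: str        # e.g., "water pipeline", "electric cable"
    depth_m: float   # burial depth in metres
    owner: str       # responsible department or utility company

# Toy data store keyed by (grid_x, grid_y) city grid cells (illustrative data).
UTILITY_MAP = {
    (12, 48): [UtilityAsset("water pipeline", 1.8, "City Water Dept."),
               UtilityAsset("electric cable", 0.9, "Metro Power")],
    (13, 48): [UtilityAsset("gas main", 1.2, "Gas Utility")],
}

def infrastructure_at(grid_x: int, grid_y: int) -> list[UtilityAsset]:
    """Return all recorded underground assets at the designated grid cell."""
    return UTILITY_MAP.get((grid_x, grid_y), [])

if __name__ == "__main__":
    for asset in infrastructure_at(12, 48):
        print(f"{asset.kind} at {asset.depth_m} m, owned by {asset.owner}")
```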

It is supposed that information technology allows a corporation greater decentralization of functions, thus ensuring effective management and control. Computers, the Internet, electronic mail, and other forms of information and communication technology make possible more elaborate and interactive networking of people and organizational units, both within and between organizations. Some organizations have moved away from traditional hierarchical and departmental reporting relationships to forms of virtual organization and dynamic network organization, in which a central hub coordinates other units that formally belong to the same organization, as well as organizations formally outside the hub organization (such as contractors or agencies with overlapping responsibility for public agencies), via e-mail and the Internet. Advances in IT reportedly lead to smaller, more decentralized organizations and to better coordination both internally and with external entities.

Literature Review

The field of organization theory provides many valuable concepts and insights. It raises important issues for those interested in corporate design and management, in the application of information systems to corporate settings, and in modern business. The book Does IT Matter? by N. G. Carr (2004) discusses the problem of IT application in modern business, pros and cons of different information systems and their impact on a corporation. The book discusses the problems of technological transformations and universal strategy, IT investments and technological change. The questions asked concerned how much the work involves the same tasks and issues, how easy it is to know whether the work is being done correctly, and similar issues. The researchers found relationships between the structures and coordination processes in organizational units and the nature of their tasks. Some units, such as units that handled applications for unemployment compensation, had tasks low in uncertainty (low in variability and difficulty). The employees mainly filled out and submitted application forms for the persons who came in to seek unemployment compensation. These units had more plans and rules and fewer scheduled and unscheduled meetings than other units, and relatively little horizontal communication among individuals and units. Other units had tasks higher in task uncertainty, such as the unemployment counseling bureau, which helped unemployed people seek jobs. This task involved many variations in the characteristics of the clients—in their needs and skills, for example—and often there was no clearly established procedure for responding to some of these unique variations. In this bureau, employees relied little on plans and rules and had more scheduled and unscheduled meetings and more horizontal communication than other units. Units that were intermediate on the task dimensions fell in the middle ranges on the structural and coordination dimensions. So, in many government agencies, in spite of the external political controls, subunits tend toward more flexible structures when they have uncertain, nonroutine, variable tasks. Similarly, many organizations have purposely tried to transform routine work into more interesting, flexible work to better motivate and utilize the skills of the people doing it.

The book X-Engineering the Corporation by J. Champy (2002) pays special attention to change management and its application to information technology. Also complicating the analysis of technology, various studies have found weak relationships between structure and technology, sometimes finding that size influences structure more than technology does. Research indicates that technology shows stronger effects on structure in smaller organizations than in larger ones. Similarly, the effects of task characteristics on structure are strongest within task subunits; that is, the task of a bureau within a larger organization has a stronger relationship to the structure of that bureau than to the structure of the larger organization. In sum, size, technology, structure, and other factors have complex interrelationships. The author underlines that many contemporary organizations operate under such great uncertainty that these basic modes become overloaded, so they must pursue additional alternatives. First, managers can try to reduce the need for information. They can engage in environmental management to create more certainty through more effective competition for scarce resources, through public relations, and through cooperation and contracting with other organizations. They can create slack resources (that is, create a situation in which they have extra resources) by reducing the level of performance they seek to attain, or they can create self-contained tasks, such as profit centers or groups working independently on individual components of the work. Alternatively, managers can increase information processing capacity by investing in vertical information systems, such as computerized information management systems, or by creating lateral relations, such as task forces or liaison personnel. Thus, managers have to adopt coordination modes in response to greater uncertainty and information processing demands.

In a recent work, Information Systems Management, G. Philip (2007) exemplifies the movement among organizations and organization design experts toward increasing emphasis on flexibility and rapid adaptation to complex and quickly changing challenges. The technostructure consists of analysts who work on standardizing work, outputs, and skills—the policy analysts and program evaluators, strategic planners, systems engineers, and personnel training staff. The support staff units support the organization outside the work flow of the operating core—for example, mail room, food service, and public relations personnel. The different positions must be coordinated through the design of the organization’s superstructure. All organizations do this in part through unit grouping, based on any of a number of criteria: knowledge and skill (lawyers, engineers, social workers), function (police, fire, and parks and recreation employees; military personnel), time (night shift, day shift), output (the products produced by the different divisions of a corporation), clients (inpatients or outpatients; beneficiaries of insurance policies), or place (the regional offices of business firms, the federal government, and many state agencies; precincts in a city police department). An action-planning system, by contrast, specifies not the general result or standard but the details about actions that people and groups are to take. In the modules just mentioned, the applications from clients are placed in file folders that move from point to point in the modules as different people do their part of the work on the case. The filing clerks are trained in a system for moving and keeping track of the files (there are many thousands of them for each module) so that they will not be lost and can be located at any given time. As the clerks move the files around the module, they log them in when they arrive at certain points, using a bar code scanner similar to those used in supermarkets. The careful specification of the actions of the file clerks in this file-tracking system is essential to coordinating the different specialists in the module and to assessing the coordination of the work among all the modules.

Simon (2007) examines the problem of a corporate organization and its dependence on technology. The analysts discussed in the preceding historical review have either concentrated on industrial organizations or sought to develop generic concepts and theories that apply across all types of organizations. Such organizations might respond to differences in size in different ways than do other organizations, such as business firms. When the contingency theorists analyzed environments, they typically concentrated on environmental uncertainty, especially as a characteristic of business firms’ market environments, and showed very little interest in political or governmental dynamics in organizational environments.

Laudon & Laudon (2005) analyze technology in terms of the type of interdependence among workers and units the work requires. Organizations such as banks and insurance companies have mediating technologies. They deal with many individuals who need largely the same set of services, such as checking accounts or insurance policies. Their work involves pooled interdependence because it pools together such services and sets of clients. They establish branches that have little interdependence with one another and formulate standardized rules and procedures to govern them. Organizations with long-linked technologies, by contrast, involve sequential interdependence: one unit completes its work and passes the product along to the next unit, which completes another phase of the work, and so on. Plans and schedules become an important coordination tool for these units. Units with intensive technologies have a reciprocal pattern of interdependence. The special units in a hospital or a research and development (R&D) laboratory need to engage in a lot of back-and-forth communication and adjustment in the process of completing the work. These units must be close together and coordinated through mutual adjustments and informal meetings. Thompson contended that organizations may have all these forms of interdependence. They will first organize together those persons and units that have reciprocal interdependence and require high levels of mutual adjustment. Then they will organize together those units with sequential interdependence, and then group units with pooled interdependence. Analyzing many studies of structure, Philip (2007) found some support for Laudon and Laudon's (2005) observations. Philip and others concluded that studies have tended to find that organizational units with high interdependence were much less likely to have a lot of standardized work procedures than organizations with low interdependence.

Another very influential perspective on information technology, developed by Baschab et al (2007), argues that work processes vary along two main dimensions: the frequency with which exceptions to normal procedures arise and the degree to which these exceptions are analyzable (that is, the degree to which they can be solved through a rational, systematic search). If a machine breaks down, often a clear set of steps can lead to fixing it. If a human being breaks down psychologically, usually few systematic procedures lead as directly to diagnosis and treatment. Organizational technologies can rank high or low on either of these two main dimensions. Routine technologies involve few exceptions and provide clear steps in response to any that occur (high analyzability). In such cases, the work is usually programmed through plans and rules, because there is little need for intensive communication and individual discretion in performing the work. For examples of routine technology, researchers usually point to the work of many manufacturing personnel, auditors, and clerical personnel. At the opposite extreme, nonroutine technologies involve many exceptions, which are less analyzable when they occur. Units and organizations doing this type of work tend toward flexible, “polycentralized” structures, with power and discretion widely dispersed and with much interdependence and mutual adjustment among units and people. Units engaged in strategic planning, R&D, and psychiatric treatment apply such nonroutine technologies.

Craft technology involves infrequent exceptions but offers no easily programmed solutions when they occur. Government budget analysts, for example, may work quite routinely but with few clear guidelines on how to deal with the unpredictable variations that may arise, such as unanticipated shortfalls. These organizations tend to be more decentralized than those with routine technologies. Engineering technology involves many exceptions but also offers analyzable responses to them. Engineers may encounter many variations, but often they can respond in systematic, programmed ways. Lawyers and auditors often deal with this type of work. When an Internal Revenue Service (IRS) auditor examines a person’s income tax return, many unanticipated questions come up about whether certain of the person’s tax deductions can be allowed. The auditor can resolve many of the questions, however, by referring to written rules and guidelines.
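
The two-dimensional typology described in the last two paragraphs (frequency of exceptions versus their analyzability) can be summarized compactly. The short Python sketch below is only an illustration of that classification logic; the function name, boolean inputs, and example labels are assumptions added here and are not part of the cited framework.

```python
# Illustrative sketch of the typology above: work is classified by how often
# exceptions arise and how analyzable those exceptions are. The four category
# descriptions paraphrase the text; the structure is an assumption for clarity.

def classify_technology(many_exceptions: bool, analyzable: bool) -> str:
    """Map the two task dimensions onto the four technology types."""
    if not many_exceptions and analyzable:
        return "routine (e.g., clerical work): plans, rules, little discretion"
    if not many_exceptions and not analyzable:
        return "craft (e.g., budget analysis): few guidelines, more decentralization"
    if many_exceptions and analyzable:
        return "engineering (e.g., auditing): many variations, programmed responses"
    return "nonroutine (e.g., R&D): dispersed power, mutual adjustment"

if __name__ == "__main__":
    print(classify_technology(many_exceptions=False, analyzable=True))
    print(classify_technology(many_exceptions=True, analyzable=False))
```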

Dennings (2001) states that organizations with engineering technologies tend to be more centralized than those with nonroutine technologies, but more flexibly structured than those with routine technologies. The author reviewed numerous studies that showed that organizational units with routine technologies had more formal rules and procedures and fewer highly educated and professional employees. Concerning internal and external coordination, most large government agencies, like business firms and nonprofit organizations, now have an intranet, an Internet-based network within the organization with access restricted to designated organizational members. To maintain security of data about individual citizens and about such sensitive matters as national security, these intranet arrangements usually require elaborate provisions for controlled access. Some government employees now carry with them devices that periodically inform them of newly assigned access codes for their agency’s intranet because the codes are changed periodically as a security precaution. These examples indicate that IT has provided significant improvements and opportunities for government, its employees, and the clients of government agencies.

In the study Fluency with Information Technology: Skills, Concepts, and Capabilities, Snyder (2007) examines the main concepts of information technology and its application in the modern corporation. As one might expect, IT raises many challenges for managers in government, some of which are daunting. Some of these issues are new, but some involve application of the topics covered in this book and challenges similar to those encountered in managing any significant operation or initiative. Executives and managers confront challenges in strategic planning for IT itself and in integrating IT into more general plans and strategies, as well as in procurement and purchasing, creating organizational structure and designs to incorporate IT and adapt to it, training, recruiting, and many other areas. The author finds that the framework treats IT developments as emerging from interactions among objective technologies such as computer hardware and software, organizational forms such as bureaucracies and networks, and institutional arrangements such as cultural and legal conditions.

Two research studies, McAfee (2006) and Danziger and Andersen (2002), examine the problem of information technology in particular organizations. Danziger and Andersen (2002) examine the effects of IT in public administration and its pros and cons for this type of organization. The managers perceived that any structural changes caused by IT implementation in public agencies have little impact on organizational performance (measured as improved ease of communication and improved technical decision making). However, the managers tended to regard IT adoption as having a direct positive impact on improving technical decision making (as opposed to an impact on decision making by way of influences on structure). Although county government managers may respond differently to developments in IT than state and federal managers do, the lack of perceived structural effects of IT is striking. McAfee (2006) pays attention to corporations in developing countries and possible difficulties caused by economic and social growth patterns.

In sum, organization theorists have generally addressed structure from a generic perspective, devoting little attention to the distinctive structural attributes of public organizations, even though some important studies have concentrated on public agencies. These general points apply to most organizations, however, and the discussion here gives examples specifically involving public organizations. A number of studies indicate that an organization’s structure also depends on the nature of its work processes, or technologies and tasks. Researchers use a wide variety of definitions of technology and tasks, such as the interdependence required by and the routineness of the work.

Methodology

The framework of the research will be based on qualitative data-collection methods; the study will rely on observation and case-study methods. Observation is the most frequent data-collection method used in qualitative research. Quantitative research, by contrast, begins with theory. From theory, prior research is reviewed, and from the theoretical frameworks, hypotheses are generated. These hypotheses lead to data collection and the strategy needed to test them. The data are analyzed according to the hypotheses, and conclusions are drawn. These conclusions confirm or conflict with the theory, thereby completing the cycle.

Several validity concerns attach to observation as a data-collection method. First, participant observation (in which the observer is obvious to and involved with the subjects) is less valid than a questionnaire would be for sensitive data. Second, the observer’s expectations affect what he or she sees and reports, reducing the validity of the data. Third, and even more complex, is the lack of expectations that results when no structure is a priori given to the observer. If only one observation is taken, it will reflect one small portion and will not capture the essence of the culture or the situation. An example of this would be a visitor from outer space coming to Ohio during the winter and reporting that there were trees without leaves. Another outer-space visitor arriving in summer would report trees with leaves. Both would be accurate in their observations, but neither would accurately reflect the actual situation. One other caution exists: the observer’s sense perceptions are not always accurate. Particularly in a nonstructured observation situation, the element of surprise can dominate the sensory input of the observer, rendering reported data invalid. All validity concerns described here affect both participant and nonparticipant observations. Compared to participant-observation strategies, the validity of nonparticipant-observation strategies is greater because there is no reactivity among the subjects to the presence of the researcher. This reduction in bias, however, does not cancel out the other biasing (invalidating) effects (Denzin and Lincoln 1995).

The main criteria applied to the selection of the organization are that it be a large, modern organization with 1,000 or more employees and that it should have introduced information technology no less than one year ago. These criteria will help the research study to ensure the quality of the data and information collected for this research. First, because the subjective bias of the observer affects his or her reporting, having several observers from several backgrounds (or points of view) report on the same phenomena can increase validity. Coalescing their data reduces sensory-deficiency and misinterpretation error. Second, structuring the observation increases validity by focusing the attention of the observers on certain characteristics and events. Third, placing the observation on a scientific foundation by stating a hypothesis up front increases validity by avoiding distortion. Fourth, nonparticipant observation, as opposed to participant observation, increases validity. And, fifth, using observation only for studying those phenomena that are appropriate to this method (e.g., nonverbal behaviors and social interactions) increases validity (Denzin and Lincoln 1995).

The case-study method is one more design strategy under the qualitative rubric. Case studies can be single-subject designs or based on a single program. At the beginning, the research issues will be turned into more specific and researchable problems, followed by techniques and examples of how to collect, organize, and report case-study data. In addition, it has been argued that the case study is a helpful procedure when one is interested in such things as diagnosing learning problems, undertaking teaching evaluations, or evaluating policy. Consistent with assumptions of qualitative research philosophy, the critical emphasis in case studies is revealing the meaning of phenomena for the participants. Denzin and Lincoln (1995) acknowledge this assumption, claiming that case-study knowledge is concrete, contextual, and interpreted through the reader’s experience. They prefer case-study methods because of their epistemological similarity to a reader’s experience and particularly note the reasonableness of assuming the natural appeal of the case approach. Case-study data come from strategies of information collection that have been described in Figure 2: interviews, observations, documents, and historical records. The three steps in conducting a case study are to (1) assemble raw case data; (2) construct a case record; and (3) write a case-study narrative (Dicks et al 2005).

Validity is considered to be an advantage of case studies because of their compatibility with reader understanding; in other words, they seem natural. The validity limitations already put forth above regarding observational data apply to case studies as well. However, the counterbalancing of information from documents with data from observation and interviews strengthens the resulting validity. Invalidity of one set of data can be checked by conflicting or supporting results from the other sources, which is a type of triangulation. Case-study methodology has potential for increased validity for several reasons. First, because multiple data-collection techniques are used (e.g., interview, document study, observation, and quantitative statistical analysis), the weaknesses of each can be counterbalanced by the strengths of the others. Conclusions related to a certain aspect of a phenomenon under study need not be based solely on one data source. Second, validity may be increased by checking the interpretation of information with experts. Third, with case studies there are generally a variety of data sources. There should be a structural relationship among these sources (Dicks et al 2005). To the extent that these findings are consistent within the case, the validity is enhanced. Conceptually, this is similar to giving a battery of tests to obtain an estimate of consistency in the underlying constructs. Fourth, using a scientific method in which one hypothesizes something about the case and collects data to determine if the hypothesis should be rejected could add to validity and also help future researchers determine starting places for their research. All of these approaches would tend to improve understanding of the case and give in-depth descriptive information. To estimate the applicability of this study, one needs deep descriptors to clearly define the characteristics of the sample; without such detail, the description does not give a clear sense of socioeconomic status, culture, and so on.

The object of persistent observation as a design feature is to achieve depth of meaning from the data (i.e., what seems salient in the setting). Like the other criteria, it was originally described for ethnographic research. To comply with this criterion, the researcher focuses in detail on the most relevant factors in an ethnographic study. The emerging domains of meaning, then, are based on a depth of understanding. To apply this characteristic to the Fuller study (not an ethnographic study) requires examining how the researcher determined what labels to apply to the emerging themes of the managers’ experiences (Dicks et al 2005).

It is assumed that any interpretation of data is only as good as the accuracy of those data. If the data on which researchers are basing judgments are faulty, one cannot expect the conclusions to be accurate. Obviously, one can have appropriate inferences and conclusions without collecting data. Einstein knew his theory of relativity was correct before he had the data to support it. He spent most of his life collecting data to support the accuracy and usefulness of his theory because he, too, probably accepted the assumption that one can only have confidence in a research theory to the degree that one has reliable and valid data that support the theory. Therefore, high confidence is achieved only if judgments are based upon accurate data. Finally, it is important to reemphasize that one method is not necessarily better than another. It is also important when evaluating research to determine how well the research adds to the body of knowledge and to determine what it suggests needs to be done next. We believe that the model presented, the qualitative-quantitative interactive continuum, can facilitate the evaluation, planning, and conduct of research by providing a framework within which to conceptualize it (Dicks et al 2005).

The sample of subjects is drawn to reflect the population. After the pretest measures are taken, the treatment conducted, and posttest measures taken, a statistical analysis reveals findings about the treatment’s effects. To support repeatability of the findings, one experiment usually is conducted and statistical techniques are used to determine the probability of the same differences occurring over and over again. These tests of statistical significance result in findings that confirm or counter the original hypothesis. Theory revision or enhancement follows. Fountain concluded that this and other examples suggest the impediments to major IT initiatives linking and coordinating diverse agencies and programs, and the likelihood that developments in IT applications will involve more modest projects and changes. These procedures are deductive in nature, contributing to the scientific knowledge base by theory testing. This is the nature of quantitative methodology. Because true experimental designs require tightly controlled conditions, the richness and depth of meaning for participants may be sacrificed. As a validity concern, this may be a limitation of quantitative designs (Dicks et al 2005).

Project Management: Gantt Chart

A Gantt chart will be used as the core of project management. Some charts and diagrams, once they are developed and the planning process is completed and moved forward to the implementation stage, can become management tools. In conducting and managing the project, charts and diagrams can be used as a framework for monitoring what is being done and seeing that the sequence is followed and each segment is accomplished at the right time. Diagrams can be used as an aid to determine if resources need to be reallocated at crucial points in the program (Clark 1977). Leaders who perceive a plan to be important will advocate the implementation of the plan; to implement a plan, leaders must work the plan, stay the course, and see it to completion. In the implementation of a plan, administrators will find various types of diagrams helpful. With the advent of computer capabilities and accessibility of appropriate software, charting and diagramming have become much easier, and the end result of the planner’s efforts can look much more professional, while providing clarity and readability. At one time, freehand diagrams, done with the style of an architectural draftsman, were often considered some of the better efforts to present such materials. Persons who, by their own admission, could not draw a straight line used rulers and templates to help in the process of making flow charts. One problem inevitably occurred when templates were used: the size of the figure traced from the template was either too small or too large. Consequently, professional draftsmen were frequently enlisted to make clear and professional-looking charts and diagrams (Clark 1977).

To construct a Gantt chart, the planner begins with the placement of vertical and horizontal axes on a page. The vertical axis is placed toward the left side and the horizontal on the top of the page. The various functions and tasks to be completed in a project are listed along the vertical axis. A timeline with measured increments is recorded along the horizontal axis. Typically, the scale or time calendar is shown at the top of the page. Each function or activity along the vertical axis, listed in sequence from first to last, is numbered. The time when the function or task is to begin and end is determined, and a line, usually a heavy or wide line, is extended horizontally from the estimated starting date to the time or date when the function or task is to be completed (Philip, 2007). The thickness or texture of the timeline can be varied to communicate different meanings to the reader. Symbols can be added to the Gantt chart that represent report documents or specified activities. Each Gantt chart is titled, and a key for symbols or the meaning of various lines, as depicted by different line widths or compositions, should also be included (Clark 1977). Gantt charts can be constructed in various ways; in fact, variations are encouraged if such modifications help clarify the chart. The final Gantt chart is designed to contain a title, the timeline, the functions to be accomplished, and the bars and symbols designed to show the activities or to depict when reports are to be prepared and due or when meetings will be scheduled. Finally, a key is included at the bottom of the figure to explain the “language” of the chart. Those responsible for performing the various functions should be identified. In the evaluation of a building principal, some tasks or functions would be done jointly by the principal and the person responsible for conducting the evaluation. Other tasks would be the responsibility of the person conducting the evaluation, and some responsibilities would be in the hands of the principal. While it would be possible to use varied types of timelines to depict such information, especially with the availability of computer assistance, such information is generally placed in the accompanying documentation (Philip, 2007). The completed Gantt chart is primarily useful as a management device, since it is easy to see at a glance if appropriate progress has been made and all requirements met. Documentation to accompany a Gantt chart is an important dimension of the total effort. Typically, the documentation accompanying a Gantt chart will have the same heading or title as the activities listed on the vertical axis of the chart. An introduction statement should be included, and each numbered line or item on the chart should be incorporated in the documentation, with the corresponding number used for each item.
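
The construction steps above (tasks on the vertical axis, a timeline on the horizontal axis, and a bar running from each task's start to its finish) can be sketched in a few lines of Python. The example below is a minimal illustration only; the use of matplotlib, the task names, and the dates are assumptions made here, not part of the proposed project plan.

```python
# Minimal sketch of the Gantt chart construction described above, using
# matplotlib (an assumed tool). Tasks are listed on the vertical axis and
# horizontal bars run from each task's start day to its finish day.

import matplotlib.pyplot as plt

# (task name, start day, duration in days) - illustrative project data only
tasks = [
    ("Define research problem",   0, 10),
    ("Literature review",         5, 20),
    ("Select organization",      20, 10),
    ("Collect observation data", 30, 25),
    ("Case-study write-up",      55, 20),
]

fig, ax = plt.subplots(figsize=(8, 3))
for row, (name, start, duration) in enumerate(tasks):
    ax.barh(row, duration, left=start, height=0.5)   # one bar per task

ax.set_yticks(range(len(tasks)))
ax.set_yticklabels([name for name, _, _ in tasks])
ax.invert_yaxis()                 # first task at the top, as on a typical chart
ax.set_xlabel("Project day")
ax.set_title("Research project Gantt chart (illustrative)")
plt.tight_layout()
plt.savefig("gantt_chart.png")    # or plt.show() for interactive use
```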

Bibliography

  1. Baschab, J., Piot, J., Carr, N. 2007, The Executive’s Guide to Information Technology. Wiley; 2nd edition.
  2. Carr, N. G. 2004, Does IT Matter? Information Technology and the Corrosion of Competitive Advantage. Harvard Business School Press.
  3. Clark, W. 1977, The Gantt Chart: A Working Tool of Management. I. Pitman; 2nd edition.
  4. Champy, J. 2002, X-Engineering the Corporation: Reinventing Your Business in the Digital Age. Warner Business Books; 1st edition.
  5. Danziger, J. N., Andersen, K. V. 2002, The Impacts of Information Technology on Public Administration: An Analysis of Empirical Research from the “Golden Age” of Transformation. International Journal of Public Administration, 25 (5), 591.
  6. Dennings, P.I. 2001, The Invisible Future: The Seamless Integration of Technology into Everyday Life. McGraw-Hill Companies; 1st edition.
  7. Denzin, N.K. and Lincoln, Y.S. 1995, Handbook of Qualitative Research. Thousand Oaks, CA: Sage.
  8. Dicks, B., Mason, B., Coffey, A. and Atkinson, P. 2005, Qualitative Research and Hypermedia. London: Sage.
  9. Laudon, K. C. & Laudon, J. P. 2005, Management Information Systems: Managing the Digital Firm, 9th edition.
  10. McAfee, A. 2006, Mastering the Three Worlds of Information Technology. Harvard Business Review, pp. 141-147.
  11. Philip, G. 2007, Information Systems Management. Prentice Hall; 7th edition.
  12. Snyder, L. 2007, Fluency with Information Technology: Skills, Concepts, and Capabilities. Addison Wesley; 3rd edition.
  13. Simon, H.A. 2007, Administrative Behavior. Free Press; 4th edition.

Importance Of Corporate Responsibility And Ethics

Introduction

Corporate social responsibility (CSR) allows society and its participants to hold an organization financially accountable when its monetary values or assets are involved. Financial accountancy provides us with many dimensions for identifying and communicating issues of public interest to both internal and external stakeholders. This is done through a social audit, which is a detailed evaluation of an organization’s social performance. There are strategies and evaluation methods for bridging the gap between what accounting and financial standards require in order to fulfil the criteria of social responsibility and for assessing those standards in the light of ethics.

For example, when the majority of the top 500 corporations in the United States included information about their social performance in their annual reports, the main concern was to provide triple-bottom-line reporting, which is an attempt to make public information on the organization’s financial, social, and ethical performance (Sims, 2003, p. 59). Recent years have taught many public accounting firms, such as Ernst & Young, the lesson of getting into the business of preparing social and ethical audits to satisfy the public about companies’ ethical foundations, thereby providing them with reasoning and justification.

Main body

Investors today acknowledge an ethical organizational climate as the foundation of efficiency, productivity, and profits because they know the other end of the story: fines or negative publicity can lower stock prices, diminish customer loyalty, and threaten the long-term viability of the company.

This is evident from the many international companies that suffered after experiencing legal problems along with customer dissatisfaction resulting in negative publicity. Another example of an ethical mismatch in accountancy is that when the Securities and Exchange Commission (SEC) investigated Sunbeam for errors in accounting procedures that misrepresented sales and profits, the company’s stock fell over several months from a high of $54 to less than $10 (Sims, 2003, p. 83). The alleged wrongdoing of Sunbeam had an enormous impact on the company’s rapport with its customers, and lender-investor confidence in Sunbeam was shattered. Accounting loopholes devoid of ethics result in potential negative outcomes, and perceived unethical or questionable decisions cause investors to take their investments elsewhere.

CSR issues arise when accounting problems raise the following questions: why is it that accounting-based measures are more closely tied to ethics than market-based measures in governing social responsibility? Why is it that auditors are encouraged to make improper compromises, which not only pave the way for false or fraudulent accounting but also isolate the firms in case anything goes wrong, so that no one answers for the auditors’ work? This creates a big question and disturbs client confidentiality. However, there is an answer to these problems in terms of ethics.

Financial performance is measured on the basis of performance choices that can be assessed in a straightforward way. Since the choices are limited to relatively common financial accounting measures, they carry some advantages as well as disadvantages. For example, accounting-based measures are more easily manipulated, yet they are historically accepted as ways of judging whether performance expectations have been fulfilled (Blackburn et al, 1994). Accounting-based measures are regarded as ethically risk-free, fulfil standards fixed by most economic and market factors, and are equally relevant for most stakeholder groups.

If we analyze the relationship between CSR and financial performance, it is evident from the starting point that empirical studies support and assume it in the context of a traditional view. For example, when Baldwin and colleagues first investigated the relationship in 1986, the purpose of their study was to produce quantitative estimates of the financial burden, in the form of non-market risk, that investors would have to bear as a result of not being able to invest in various equity securities.

Accounting regulators, keeping in mind the traditional view of the corporation, have gone so far as to codify this view in their ‘constitution’, and in recognizing the significance of CSR they have not neglected to mention those potential users of financial information that are most directly concerned with a particular business enterprise (Pava & Krausz, 1995, p. 17). Financial accounting standards have examined CSR performance with respect to different control techniques, and in analyzing these techniques organizations develop strategies to grow and adapt to changing accounting trends and practices.

Ethical issues in accounting and financial reporting sometimes occur not because of greed or even fraud but because of the way financial professionals view a situation devoid of ethics (Hoffman et al, 1996, p. 42). Such situations arise from a narrow perspective or from failures to see clearly how a financial market may be incomplete or erroneous. Financial reporting and ethics in the context of CSR must adhere to accounting standards, which suggest that the information provided to investors and creditors should be useful in making rational investment, credit, and similar decisions (Solomons, 1986, p. 68).

The costs and benefits of social responsibility to the success of the business and the personal well-being of the owner/manager depend on figuring out whether business was gained or lost from supporting the community. It is seen that, despite the modernization of business and finance, businesspersons are forced to rely more on faith than on cost accounting in determining the net return from good citizenship. Outcome indicators in this respect are those that vary according to the category of social responsibility being considered (Besser, 2002, p. 28). According to Besser (2002), “It is observed that sophisticated accounting mechanisms measure the outcomes related to the economic category of social responsibility and when it comes to public companies, the economic outcomes are validated by public accounting firms and reported to the public in annual reports and in Securities and Exchange Commission filings” (Besser, 2002, p. 28).

According to Solomons (1986) “It is the corporate responsibility of accountants and auditors to visualize and fulfil the standards that provide not only useful information that helps present and potential investors as well as creditors but is also helpful in assessing the risk in timing, and uncertainty of cash receipts from dividends and interests” (Solomons, 1986, p. 68).

Conclusion

We must elaborate the context in which accounting has taken its subsequent trajectory, one marked by interesting and significant socio-political tensions and dynamics (Gallhofer & Haslam, 2003, p. 107). There is still a long way to go in the struggle to capture social accounting and to mobilize and involve accounting in emancipatory projects: an accounting that comprises and focuses on gaining insights and inspiration towards mobilization.

Work Cited

  1. Besser L. Terry, (2002) The Conscience of Capitalism: Business Social Responsibility to Communities: Praeger: Westport, CT.
  2. Blackburn V. L., Doran M. & Shrader C. B., (1994) “Investigating the Dimensions of Social Responsibility and the Consequences for Corporate Financial Performance” In: Journal of Managerial Issues. Volume: 6. Issue: 2. Pittsburg State University – Department of Economics.
  3. Gallhofer Sonja & Haslam Jim, (2003) Accounting and Emancipation: Some Critical Interventions: Routledge: New York.
  4. Hoffman W. Michael, Kamm Judith Brown, Frederick E. Robert & Petry S. Edward, (1996) The Ethics of Accounting and Finance: Trust, Responsibility, and Control: Quorum: Westport, CT.
  5. Pava L. Moses & Krausz Joshua, (1995) Corporate Responsibility and Financial Performance: The Paradox of Social Cost: Quorum Books: Westport, CT.
  6. Sims R. Ronald, (2003) Ethics and Corporate Social Responsibility: Why Giants Fall: Praeger: Westport, CT.
  7. Solomons David, (1986) Making Accounting Policy: The Quest for Credibility in Financial Reporting: Oxford US: New York.

Antibiotic Resistance Mechanisms In Bacteria

Introduction

Shortly after the introduction of the parent antibiotic penicillin to clinical practice, reports came of bacteria becoming insensitive to it. These were the earliest reports of bacterial resistance to antibiotics. These reports were not considered significant, as laboratory studies showed that such resistance could be overcome by increasing the dose. With the appearance of AIDS in the medical arena in 1981, all healthcare workers became increasingly aware of the problem as they had to deal with a decreasing number of effective antibiotics because of antibiotic-resistant bacterial strains (Cloutier, 186, 187).

Antibiotics are chemical substances that are either produced naturally by a microorganism or developed synthetically to kill pathogenic microorganisms by interacting with specific targets (the cell membrane, mitochondria, or another cellular structure of the microorganism). Thus, antibiotic resistance refers to the ability of a microorganism not to be affected by an antibiotic. This represents a clinical problem in treating everyday infections and, more seriously, in managing infections in specific hospital settings such as the ICU or pediatric unit, or in managing hospital-acquired infections (LeJeune, p. 1).

As antibiotic abuse is a major risk factor in the development of bacterial strains resistant to an antibiotic, Wester and others (Pp. 2210-2216) surveyed 490 internal medicine physicians about the importance, knowledge of prevalence, and self-reported experience of antibiotic resistance. Their results showed that almost all physicians included are aware of how serious the problem is. However, their attitudes as to how to solve the problem and its possible causes varied. This may hamper efforts to improve prescription and infection control strategies.

Main body

Multidrug-resistant organisms (MDROs) are bacteria resistant to one or more classes of antibiotics. These include methicillin-resistant Staphylococcus aureus (MRSA), vancomycin-resistant enterococci (VRE), and certain gram-negative bacilli, including B-lactamase producers, E. coli, and Klebsiella pneumoniae, besides some species of Burkholderia and Stenotrophomonas, which are intrinsically resistant to broad-spectrum antibiotics.

The severity and extent of antibiotic-resistant bacterial infection vary according to the pathogen producing the infection and to the healthcare institutions and facilities where the infection occurred (ICU, neonatology, or burn unit). Thus, prevention and control measures have to be designed specifically to meet the needs of the healthcare facility and population affected (Siegel and others Pp. 4-6).

Dzidic and others (Pp. 11-21) suggested two main aspects to the biology of bacterial antibiotic resistance: the development and possession of the resistance gene (genetic features), and biochemical mechanisms (biochemical features). Genetic features include mutations (spontaneous mutations, hypermutators, and adaptive mutagenesis), which may occur at the transcription or translation levels, and horizontal gene transfer through plasmids, conjugative transposons, and integrons.

Biochemical features, on the other hand, include antibiotic inactivation by hydrolysis, group transfer, or redox processes; target modification and target bypass are alternative biochemical mechanisms. The design of proper prevention and control strategies for antibiotic-resistant bacterial infections needs to apply epidemiological approaches and research technologies targeted at the mechanisms of developing resistance (Dzidic and others Pp. 11-21).

Smith and others (Pp. 1-16) suggested that infections with antibiotic-resistant bacteria have specific medical, social, and ethical characteristics. First, these infections usually have higher morbidity and mortality. Second, as infectious diseases are caused by the invasion of or attack on humans by pathogenic microorganisms, infections with antibiotic-resistant bacteria are particularly persistent and invasive.

Infections with these organisms progress rapidly (show more acuity in course) and are more communicable, being rapidly transferred from one person to another. These infections are difficult to treat and may need further bacteriological and antibiotic sensitivity testing. They are also difficult to prevent because of changing epidemiological and infection behavior patterns. Host susceptibility and general resistance are crucial factors in the pathogenicity of these organisms.

Vulnerable patient groups (ICU patients, post-surgical patients, and neonates) are particularly prone to these infections, which is one reason for their increased morbidity and mortality. These infections have a higher socioeconomic impact because of the expense of treatment, the difficulty of prevention, and their rapid spread. Emerging infections such as multidrug-resistant tuberculosis, West Nile virus infection, and SARS represent serious community public health problems.

Sack and others (Pp. 1-51) reviewed antibacterial resistance in bacterial enteric pathogens, namely Vibrio cholerae, Shigella species, and Campylobacter jejuni. Regarding V. cholerae, they inferred that antibiotic resistance occurred through plasmid acquisition, with resistant strains causing epidemics. They suggested that the problem is that the antibiotic resistance patterns of this bacterial species change both temporally and geographically.

This calls for persistent surveillance, as susceptible strains occasionally reappear when antibiotic pressure is taken off. This makes cholera one of the diseases in which reversion to sensitivity occurs when antibiotic use is properly controlled. Shigella species represent the most serious challenge because of the steady trend towards multiple antibiotic resistance. Further, once the organism becomes resistant, epidemic and endemic Shigella strains remain resistant.

Also, prevention is difficult, as a small number of organisms is all that is needed to produce infection. In this case, water and sanitation health programs, as well as vaccine development, may have an impact on prevention. Over the past 25 years, Campylobacter jejuni has advanced from being a newly discovered pathogen to being the most significant cause of diarrhea in developed countries, particularly in children and travelers.

It causes endemic diarrhea as well as diarrhea outbreaks. Further, the organism is implicated as a common cause of Guillain-Barre syndrome. The mechanism of pathogenicity is not yet fully known, which makes future molecular pathogenicity research important for prevention, through developing vaccines, and for management. Uwaezuoke and Aririatu (Pp. 67-69) studied the antibiotic resistance of Staphylococcus aureus.

Their results showed that of 48 Staph. aureus isolates, 95.8% were resistant to penicillin, 89.6% were resistant to ampicillin, 87.5% were resistant to tetracyclines, and 75% were resistant to chloramphenicol. The organism was highly susceptible to gentamycin (91.5%) and cloxacillin (85.4%). They explained their results in the light of the prevailing use and abuse of the antibiotics to which the organism is resistant.

Antibiotic-resistant pneumococci were studied by Schrag et al (Pp. 1-30). Streptococcus pneumoniae is the principal pathogen of acute respiratory tract infection, which is a leading cause of morbidity and mortality, especially in children and compromised patients with low general resistance. Epidemiological studies show that recent antibiotic use is associated with developing resistant pneumococci at both the individual and the community level.

Although the biological mechanisms are not fully illustrated, it is known that unnecessary antibiotic use for viral respiratory tract infections is a key factor in developing pneumococcal resistant strains. It is also documented that pneumococcal resistant strains can develop as a result of drugs used for unrelated conditions. This casts a shadow of doubt on the mass antibiotic campaigns to eliminate diseases such as trachoma adopted in some developing African countries. Although the pneumococcal polysaccharide vaccine is the established prevention practice, it should be remembered that it is effective only against bacteremic pneumococcal pneumonia and has no impact on pneumococcal carriers. Further, the vaccine is not recommended for children of two years of age or less.

Tuberculosis is a serious disease caused by Mycobacterium tuberculosis, which is intrinsically resistant to common antibiotics. Prakash (Pp. 17-18), in his review on tuberculosis and antibiotic resistance, estimated that there are 8 million new TB infections every year.

The main problem is resistance toward the traditional antituberculous drugs isoniazid, rifampicin, and streptomycin, with the proportion of resistant strains ranging from 12% to 65%. This has created the problem of increased infectivity, as 31% of infected surviving patients are inadequately treated and have active infections. This brings the spread, when suitable community ecological conditions exist (as in India and some underdeveloped African and Asian countries), to a near epidemic status.

BA de Moraes and others (Pp. 387-394) studied antibiotic resistance of Pseudomonas aeruginosa in a neonatal ICU. Their results showed that 62.5% of cases infected with antibiotic-resistant Pseudomonas aeruginosa received empirical antibiotics before the results of blood culture. All resistant strains isolated were classified as multidrug-resistant pathogens, with a high percentage resistant to B-lactams, chloramphenicol, and sulpha combinations.

Intensive care unit patients are 5 to 10 times more susceptible than other hospitalized patients to acquiring a nosocomial infection. Because of the complexity of the cases and the increased likelihood and risks of invasive interventions, antibiotic-resistant bacterial infections are expected to produce a greater impact on morbidity and mortality. Prevention and control strategies for nosocomial infections in the ICU have focused on MRSA, B-lactamase-producing gram-negative bacilli, and vancomycin-resistant enterococci (Weber and others Pp. 34S-41S).

Weber and others (Pp. 34S-41S) suggested the following protocol for antibiotic-resistant bacterial infection prevention and control in the ICU clinical setting. It is composed of general measures such as the availability of a surveillance system to identify an outbreak, besides a protocol for proper intervention and evaluation. Sticking to basic infection control procedures such as hand washing, disinfection, and sterilization should minimize the risks. They showed that hospital areas with the highest antibiotic use show the highest prevalence of antibiotic-resistant nosocomial infections. Therefore, they stressed proper antibiotic use following the standard guidelines (Weber and others Pp. 34S-41S).

The keys to the prevention and control of antibiotic-resistant bacterial infections are threefold. First are administrative measures, as this should be an aim of the whole health institution, not of a particular unit. Such measures can include the presence of an administrative member on the infection control committee, the presence of a comprehensive infection control plan, and the provision of educational programs and tutorials to the staff on antibiotic-resistant bacterial infections. Second is a cautious policy of antibiotic use in all departments. Third is a system of surveillance with adequate tools for decolonization and the application of infection control procedures once an outbreak threatens (Wisconsin Division of Public Health Pp. 2-19).

Conclusion

Antibiotic-resistant bacterial infection is a growing public health problem. Many of the antibiotics that were considered effective (such as vancomycin, ampicillin, and tetracyclines) are now ineffective against resistant bacterial strains; new medications carry the hope of properly treating these cases. Examples are glycopeptide-class medications such as oritavancin, while immunotherapeutic medications and vaccines have their place in prevention and prophylaxis during outbreaks (Fowler Pp. 1-3).

Facing this public health problem is a multifaceted task that needs work on many fronts: antibiotic use in the agricultural business and veterinary medicine has to be regulated, and federal agencies (such as the FDA) have to examine methods and recommendations for proper antibiotic use. There should be a continuous medical education program for medical, nursing, and paramedical personnel on the risks of improper use of antibiotics and the guidelines for proper use. There should be contact with the general public about the volume of the problem and how serious it can be (Taraporewala Pp. 6-11).

Works Cited

BA de Moraes, MM Loureiro, Mendonca, VLF, Quadra, MMR, et al. “Pseudomonas aeruginosa: Study of Antibiotic Resistance and Molecular Typing in Hospital Infection Cases in a Neonatal Intensive Care Unit from Rio de Janeiro City, Brazil.” Mem Inst Oswaldo Cruz, Rio de Janeiro vol 97 (3) 2002. p. 387-394.

Cloutier, Michel J. “Antibiotics: Mechanisms of Action and the Acquisition of Resistance-When Magic Bullets Lose Their Magic.” American Journal of Pharmaceutical Education vol 59 1995. p. 167-172.

Dzidic, Senka, Suskovic, Jagoda, and Kos, Blazenka. “Antibiotic Resistance Mechanisms in Bacteria: Biochemical and Genetic Aspects.” Biotechnol vol 46(1) 2008. p. 11-21.

Fowler, Vance, G. “Current and Future Antibiotics for Treatment of Resistant Gram-Positive Infections.” Clinical Updates in Infectious Diseases vol 7 (1) 2004. p. 1-3.

LeJeune, Jeffrey, T. “Antibiotic Resistance: Questions and Answers.” Extension Fact Sheet. The Ohio State University. 2003. Web.

Prakash, C. S. “Tuberculosis and antibiotic resistance.” Current Science vol 82 (1) 2002. p. 17-18.

Sack, David, A. et al. Antimicrobial Resistance in Shigellosis, Cholera, and Campylobacteriosis. Geneva: World Health Organization, 2001.

Schrag, Stephanie, J., Beall, Bernard, and Dowell, Scott, et al. Resistant Pneumococcal Infections. Geneva: World Health Organization, 2001.

Siegel, Jane, D., Rhinehart, Emily, Jackson, Marguerite, and Chiarello, Linda. “Management of Multidrug-Resistant organisms In Healthcare Settings.” CDC Centers For Disease Control. The Healthcare infection Control Practices Advisory Committee. 2006. CDC Centers For Disease Control. Web.

Smith, Charles, B., Battin, Margaret, P., Jacobson, Jay, A., Francis, Leslie, P. et al. “Are There Characteristics Of Infectious Diseases That Raise Special Ethical Issues.” Developing World Bioethics vol 4(1) 2004. p. 1-16.

Food and Drug Law Institute. Antibiotics, An Update: The Growing Problem of Antibiotic Resistance. By Taraporewala, Irish B. 2008. Web.

Uwaezuoke, J. C., Aririatu, L. E. “A Survey of Antibiotic-Resistant Staphylococcus Aureus Strains from Clinical Sources in Owerri.” J. Appl. Sci. Environ. Mgt vol 8 (1) 2004. p. 67-69.

Weber, David, J., Raasch, Ralph, and Rutala, William, A. “Nosocomial Infections in the ICU: The Growing Importance of Antibiotic-Resistant Pathogens.” CHEST vol 115 1999. p. 34S-41S.

Wester, William, C., Durairaj, Lakshmi, Evans, Arthur, T., Schwartz, David, N. et al. “Antibiotic Resistance: A Survey of Physician Perceptions.” Arch Intern Med. vol 162 2002. p. 2210-2216.

Wisconsin Division of Public Health. Bureau of Communicable Diseases and Preparedness. Guidelines for Prevention and Control of Antibiotic-Resistant Organisms in Health Care Settings. By Borlaug, Gwen. 2005. Web.
