Differences Between Adaptive Sampling And Network Sampling

The core differences between adaptive sampling and network sampling rest on their applications. Adaptive sampling is a relatively new sampling design in which regions, defined in this context as “units,” are selected based on the values of the variables of interest observed during the survey. Network sampling, by contrast, takes the approach of duplicating the counting of population elements through the application of multiplicity counting rules. Such rules include the friendship and kinship rules used in household surveys, which link one person to the households of a number of their friends and relatives. Secondly, in adaptive sampling, unlike network sampling, the selection of sampling units depends on observations made during the survey itself rather than on a previous survey. As a consequence, different estimators must be implemented in adaptive sampling to avoid bias in the calculated statistics; such a step is not necessary in network sampling, where the standard estimators remain unbiased.

In addition to their definitions, the core differences between the two sampling methods also stem from the advantages derived from their applications. According to Thompson, “The primary advantage of adaptive sampling is the ability to incorporate population characteristics to obtain more precise estimates of population density in that for a given sample size and cost, more valuable information can be obtained than is possible with conventional designs.”1 This is because, for populations such as plants and animals, fossils and minerals, this sampling method provides a unique way to improve the effectiveness of the sampling project. The second advantage, as clearly stated by Thompson, is that “adaptive sampling increases the yield of important observations (e.g. the number of endangered species observed), which can result in higher quality estimates of parameters such as the mean and variance”.2

Adaptive cluster sampling introduces bias into conventional estimators, so new unbiased estimators are needed. According to Thompson, “if additional units are added to the sampling design wherever high positive identifications are observed, the sample mean will over-estimate the population mean”.3 “A method of obtaining unbiased estimators is to make use of new observations in addition to the observations initially selected”.4
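The over-estimation, and one way of correcting it, can be illustrated with a short simulation. The population values, neighbourhood rule and sample size below are invented for illustration; the corrected estimator averages, over the initially selected units only, the mean of each unit's "network" (the contiguous block of units meeting the condition), in the spirit of the modified Hansen-Hurwitz estimator described by Thompson.

```python
import random

# Toy 1-D population of 20 units; y > 0 marks the rare, clustered trait.
# Contiguous positive units form a "network" that adaptive cluster
# sampling adds to the sample whenever one of them is selected.
y = [0, 0, 0, 5, 7, 6, 0, 0, 0, 0, 0, 0, 2, 3, 0, 0, 0, 0, 0, 0]

def network_of(i):
    """Contiguous block of units around i satisfying y > 0 (or {i} itself)."""
    if y[i] == 0:
        return {i}
    net = {i}
    j = i - 1
    while j >= 0 and y[j] > 0:
        net.add(j)
        j -= 1
    j = i + 1
    while j < len(y) and y[j] > 0:
        net.add(j)
        j += 1
    return net

def modified_estimator(initial):
    """Average, over the initially drawn units, of each unit's network mean;
    this modified Hansen-Hurwitz form stays unbiased."""
    w = [sum(y[j] for j in network_of(i)) / len(network_of(i)) for i in initial]
    return sum(w) / len(w)

def naive_mean(initial):
    """Mean of every observed unit, including the adaptively added ones;
    this conventional sample mean over-estimates."""
    seen = set()
    for i in initial:
        seen |= network_of(i)
    return sum(y[j] for j in seen) / len(seen)

random.seed(1)
true_mean = sum(y) / len(y)  # 1.15
reps = [(modified_estimator(s), naive_mean(s))
        for s in (random.sample(range(len(y)), 5) for _ in range(20000))]
avg_mod = sum(r[0] for r in reps) / len(reps)
avg_naive = sum(r[1] for r in reps) / len(reps)
print("true mean:", true_mean)
print("modified estimator averages to:", round(avg_mod, 3))  # close to 1.15
print("naive mean averages to:", round(avg_naive, 3))        # biased upward
```

Averaged over many repeated initial samples, the naive mean settles well above the true population mean, while the network-mean estimator does not.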

Network sampling, on the other hand, has effectively improved design efficiency, especially in areas where classical sampling methods are infeasible or inefficient. Furthermore, network sampling forms an interdisciplinary survey research method in that it intersects the cognitive, behavioral and statistical sciences, which demonstrates that it can be applied effectively across a range of research work in survey sampling. According to Thompson and Seber, “fundamental knowledge about information networks linking relatives and friends are critical in designing surveys based on network sampling and knowledge gained about the robustness of these information networks from survey applications of network sampling is potentially valuable in sociological research”.5 This forms one of the core advantages of this sampling method and makes it superior to the adaptive sampling method in this respect.

The application of the adaptive sampling method in the educational system cuts across a wide number of fields because of its adaptability to the sampling of clustered and rare populations. In educational settings, it takes the form in which the “conditions may be a given set of criteria, or a ranking system using the largest, second largest, and/or third largest, etc., order observation and taking two examples that involves a given one-dimensional case scenarios, one with a specified criteria and the other with a ranking criteria”.6

Network sampling, on the other hand, has been used within educational research in surveys that presented difficulties in defining and executing unitary counting rules, as well as in household surveys and the sampling of rare events. This is because, according to Thompson and Seber (1996), it is composed of network estimation procedures that effectively reduce the level of bias in population samples. In close relation, this sampling method is more applicable to the sampling of elusive and sensitive populations.
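The multiplicity counting rules mentioned above can be sketched in code. The households, cases and kinship links below are invented for illustration; the key idea of multiplicity estimation is to weight each report of a rare event by one over the number of households entitled to report it under the counting rule, so that duplicated counting does not bias the total.

```python
# Toy multiplicity-sampling sketch. Each "case" of a rare event is linked,
# by a kinship counting rule, to the households of the person's relatives,
# so one case can be reported by several sampled households.
# All identifiers and numbers here are illustrative assumptions.

# case id -> set of households that would report it under the counting rule
links = {
    "case1": {0, 1, 2},  # multiplicity 3
    "case2": {3},        # multiplicity 1
    "case3": {4, 5},     # multiplicity 2
}
N_HOUSEHOLDS = 10

def multiplicity_estimate(sampled_households, sample_size):
    """Estimate the total number of cases from a simple random sample of
    households, weighting each report by 1 / multiplicity."""
    total = 0.0
    for case, households in links.items():
        reports = len(households & set(sampled_households))
        total += reports / len(households)  # each report carries weight 1/s
    # expand from the sample of households to the whole population
    return total * N_HOUSEHOLDS / sample_size

# e.g. sampling households {0, 3, 4} (3 of 10):
print(multiplicity_estimate({0, 3, 4}, 3))
```

Averaged over all possible samples of households, the 1/multiplicity weighting makes the estimate of the case total unbiased even though one case can surface in several households.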

The central role of statistics in management decision-making cannot be overstated. Statistical data form the foundation on which critical management decisions are made, in line with the aims, objectives and priorities of an organization. The National Science Foundation demonstrates this by describing the task of “collecting relevant facts, figures and statistics to facilitate and support decision making process”.7 It further buttresses the point by succinctly stating that “Today’s good decisions are driven by data in that all aspects of our lives, and importantly in the business context, an amazing diversity of data is available for inspection and analytical insight”.8 The functional area of business management in which statistics play an essential role is not hard to discern: managers need to demonstrate strong evidence in support of their decisions, and such evidence can only be established with statistical data. In summary, the functional area within management in which statistics play a critical role is decision making.

References

National Science Foundation, Statistical Thinking for Managerial Decisions. 1994. Web.

Thompson, SK & Seber, GAF. Adaptive Sampling. John Wiley and Sons, Inc., New York, 1996, p. 71.

Thompson, SK, Sampling. John Wiley and Sons, Inc., New York, 1992, p. 40.

Thompson, WL. Sampling Rare or Elusive Species: Concepts, Designs, and Techniques for Estimating Population Parameters. Island Press, New York, 2004, p. 14.

Footnotes

  1. SK Thompson. Sampling. John Wiley and Sons, Inc. New York, 1992, p.40.
  2. SK Thompson. Sampling. John Wiley and Sons, Inc. New York, 1992, p.41.
  3. SK Thompson, p.37.
  4. WL Thompson, Sampling Rare or Elusive Species: Concepts, Designs, and Techniques for Estimating Population Parameters. Island Press, New York, 2004, p. 44.
  5. SK Thompson and GAF Seber, Adaptive Sampling. John Wiley and Sons, Inc., New York, 1996, p. 71.
  6. SK Thompson, p.44.
  7. National Science Foundation, Statistical Thinking for Managerial Decisions. 1994.
  8. National Science Foundation.

Reasons for Using the CAPM Model

Introduction

Over the years, the CAPM has been used by investors when considering investments in multiple projects. The model provides a way for investors to estimate the expected return from investing in a security. It rests on the assumption that investors expect an additional return when they choose to invest in risky securities, and that investors therefore diversify the securities in a portfolio in pursuit of better risk-adjusted returns. It is advisable for individual investors to use the model for the following reasons:

It Compares Favorably to Other Methods

When compared with other methods, the CAPM enables an investor to make superior investment decisions. Using the WACC as the discount rate instead of a CAPM-derived rate can lead an investor to the wrong decision, because a project within the portfolio may be rejected simply because its internal rate of return (IRR) is less than the company-wide WACC, even when the project carries less risk than the company average. If the investor instead uses the CAPM, a project whose IRR plots above the security market line is accepted, since it offers a return greater than that required to compensate for its non-diversifiable risk.
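The decision rule described above can be sketched numerically. The risk-free rate, market return, WACC, betas and IRRs below are assumptions chosen for illustration, not figures from the essay:

```python
def capm_required_return(risk_free, market_return, beta):
    """Security market line: E(R) = Rf + beta * (Rm - Rf)."""
    return risk_free + beta * (market_return - risk_free)

risk_free = 0.04      # assumed risk-free rate
market_return = 0.10  # assumed expected market return
wacc = 0.11           # assumed company-wide discount rate

# A low-beta project whose IRR sits below the WACC but above its CAPM
# required return would be wrongly rejected by a flat WACC hurdle;
# a high-beta project can be wrongly accepted for the opposite reason.
projects = [
    {"name": "low-risk project", "beta": 0.6, "irr": 0.09},
    {"name": "high-risk project", "beta": 1.8, "irr": 0.13},
]

for p in projects:
    required = capm_required_return(risk_free, market_return, p["beta"])
    print(p["name"],
          "| CAPM requires {:.1%}".format(required),
          "| accept under CAPM:", p["irr"] > required,
          "| accept under flat WACC:", p["irr"] > wacc)
```

With these assumed numbers, the low-risk project clears its CAPM hurdle of 7.6% but fails the 11% WACC test, while the high-risk project passes the WACC test yet falls short of its 14.8% CAPM requirement.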

Helps in Determining if the Diversification Decision Is Viable

According to Pratt and Grabowski (2010), the CAPM rests on the understanding that investors require an additional return to compensate for the risk taken when investing in riskier securities. The model enables investors to determine whether the extra risk taken on by adding a security is worthwhile when constructing a portfolio. Investors can also use the CAPM to make a sound judgment of the risk of the entire portfolio, allowing them to adjust the portfolio if changes seem necessary.

Enables Optimization of the Risk-Return Relationship

When using the model, investors are able to optimize the risk-return relationship of a portfolio so as to attain the lowest risk, for a given return, from the diversification of securities undertaken. This is made possible by estimating the expected return of each project using the model's formula. The model places its emphasis on beta and hence supports an informed decision with regard to the diversification of securities. Many investors guided by the CAPM choose to invest in low-cost index funds instead of picking individual stocks, since the model rewards the diversification of a portfolio.
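As a minimal sketch of the model's formula, E(R) = Rf + β(Rm − Rf), applied to a diversified portfolio: all of the weights, betas and rates below are assumptions for illustration.

```python
def portfolio_beta(weights, betas):
    """A portfolio's beta is the value-weighted average of its holdings' betas."""
    return sum(w * b for w, b in zip(weights, betas))

def expected_return(risk_free, market_return, beta):
    """CAPM: E(R) = Rf + beta * (Rm - Rf)."""
    return risk_free + beta * (market_return - risk_free)

weights = [0.5, 0.3, 0.2]  # portfolio weights (must sum to 1)
betas = [1.2, 0.8, 0.5]    # assumed per-security betas

beta_p = portfolio_beta(weights, betas)  # 0.5*1.2 + 0.3*0.8 + 0.2*0.5, i.e. 0.94
print("portfolio beta:", round(beta_p, 4))
print("portfolio expected return:",
      round(expected_return(0.04, 0.10, beta_p), 4))  # 4% + 0.94 * 6%
```

Because the portfolio beta is just the weighted average of the holdings' betas, an investor can tune the portfolio's systematic risk, and hence its CAPM expected return, by shifting weights between high- and low-beta securities.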

Systematic Risk

In a contribution to the topic of portfolio diversification, Chisholm (2009) highlighted that the model considers only systematic risk, reflecting the reality that unsystematic risk can, and should, be eliminated through diversification. This enables investors to estimate whether undertaking a specified project would be worthwhile with regard to the diversification of a portfolio. The CAPM assumes that unsystematic risk should be managed by either the individual investor or the company offering the security, and therefore it does not need paramount consideration when estimating a project’s expected return.
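The split between systematic and unsystematic risk can be illustrated with a single-factor simulation. Every volatility figure and the portfolio size below are assumptions; the point is that the idiosyncratic variance of an equally weighted portfolio shrinks roughly as 1/n, while the systematic β²σm² term survives diversification.

```python
import random

# Single-factor (market model) sketch: r_i = beta_i * r_m + e_i.
# All numbers are illustrative assumptions, not data from the essay.
random.seed(0)
n_periods = 20000
n_stocks = 30
beta = 1.0     # give every stock the same beta for simplicity
sigma_m = 0.04 # market volatility per period (assumed)
sigma_e = 0.08 # idiosyncratic (unsystematic) volatility per stock (assumed)

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

# Simulate an equally weighted portfolio of n_stocks.
portfolio_returns = []
for _ in range(n_periods):
    r_m = random.gauss(0, sigma_m)
    r_p = sum(beta * r_m + random.gauss(0, sigma_e)
              for _ in range(n_stocks)) / n_stocks
    portfolio_returns.append(r_p)

systematic_var = (beta * sigma_m) ** 2      # survives diversification
unsystematic_var = sigma_e ** 2 / n_stocks  # shrinks as 1/n
print("theoretical portfolio variance:", systematic_var + unsystematic_var)
print("simulated portfolio variance:  ", variance(portfolio_returns))
```

With 30 stocks, the residual unsystematic variance is already small next to the systematic term, which is why the CAPM prices only beta.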

Conclusion

Research conducted by scholars has shown the CAPM to be among the best methods for investors to use when evaluating investment decisions. The model enables investors to see how diversifying a portfolio's securities reduces its overall risk. Until a new and better method is introduced, the CAPM will continue to be a useful evaluation tool that enables investors to make informed investment decisions.

References

Chisholm, A. 2009. An Introduction to International Capital Markets: Products, Strategies, Participants. New York: John Wiley & Sons.

Pratt, S. P. & Grabowski, R. J. 2010. Cost of Capital: Applications and Examples. New York: John Wiley & Sons.

The Downfall Of Lehman Brothers

Introduction

Following the downfall of Lehman Brothers, attention focused on the role of regulators and credit rating agencies in the debacle. The public demanded answers that were not forthcoming from any of the parties concerned. The experience threw the Sarbanes-Oxley Act under close scrutiny, and questions abounded as to how the act had not been followed by all concerned. Credit rating agencies, among them Moody's and Standard & Poor's, were blamed for issuing false ratings on securities connected to subprime mortgages. They were found to have assigned exceptionally high ratings to these securities and to have failed to re-examine the ratings as the environment changed. There was also the issue of autonomy: the rating agencies did not operate independently, because of a conflict of interest built into their business model, in which an agency is paid by the issuer of the very bonds it rates.

At the heart of Lehman’s downfall was the accounting malpractice committed by both management and staff. Executives and other members of staff were greedy for ever-larger bonuses. Net profits were manipulated to please the public and to show that the bank was beating its earnings targets. There was also a lack of transparency in how managers dealt with subordinates who were found to have committed this malpractice, and a clear lack of professional management within the organization (Tibman 23).

Securities and Exchange Commission

Out of all this, the organization seen to be at the center of the blame is the Securities and Exchange Commission (SEC), which seemed to have turned a blind eye to regulating and to investigating complaints that had been raised earlier. The major failure within the Commission was its Consolidated Supervised Entities program, which was supposed to check and investigate an investment bank's liquidity. The program required that an investment bank maintain liquid assets adequate to last a year (Tibman 54). In Lehman’s case the SEC did not enforce this program despite having enough evidence to take legal action against the investment bank. Following a series of stress tests assessing its level of liquidity, the bank failed to show how it could endure a year without unsecured funding, yet was allowed to continue operating.

Sarbanes-Oxley Act

The Sarbanes-Oxley Act was meant to ensure that no meltdown such as Enron would be experienced again. The act imposes mandates on company boards and public accounting firms, in addition to requiring transparent and full disclosure of accounting statements, and it relies on the SEC to regulate organizations and companies. The act thus seemed dependent on a regulator that at the time was compromised. Commercial banks issued subprime mortgages that had the full backing of investment banks such as Lehman. When the act was passed, the onset of such events had not been contemplated, and no one had imagined that such a blow to Lehman was coming, given the way Lehman was making money out of collateralized debt obligations of doubtful quality. In layman's terms, the bonds offered were backed by pools of mortgages.

It is out of these events that the general public has come to perceive that the agencies cannot prevent future accounting fraud. Many of the Americans questioned do not trust the same bodies that were responsible for aiding the whole economic mess they are now experiencing.

New Regulation

There have been calls for new regulation following the debacles of Bear Stearns and Lehman Brothers (Tibman 102). Calls for an overhaul of the regulations touching on corporate governance have been voiced by scholars and policy makers, and in a country where compliance is expensive, these voices are fast becoming the voice of the majority. A good example of how costly compliance can be is the need to re-tool IT systems, employ more executives to deal with compliance issues, and meet the increased cost of payments to audit committees.

Conclusion

The new regulations being called for could lead to a better and more transparent environment for organizations, as it would become cheaper for them to audit and report to the authorities.

Work Cited

Tibman, Joseph. The Murder of Lehman Brothers: An Insider’s Look at the Global Meltdown. London: Oxford Publishers, 2009.
