• Stepwise Procedures in Discriminant Analysis

  • TABLE OF CONTENTS

      Title Page
      Approval page
      Dedication
      Acknowledgement
      Abstract
      Table of Contents

      CHAPTER ONE: INTRODUCTION
      1.1    Discriminant Analysis
      1.2    Stepwise Discriminant Analysis
      1.3    Steps Involved in Discriminant Analysis
      1.4    Goals for Discriminant Analysis
      1.5    Examples of Discriminant Analysis Problems
      1.6    Aims and Objectives
      1.7    Definition of Terms
      1.7.1    Discriminant function
      1.7.2    The eigenvalue
      1.7.3    Discriminant Score
      1.7.4    Cut off
      1.7.5    The Relative Percentage
      1.7.6    The Canonical Correlation, R*
      1.7.7    Mahalanobis distance
      1.7.8    The Classification table
      1.7.9    Hit ratio 
      1.7.10    Tolerance

      CHAPTER TWO: LITERATURE REVIEW
      2.1    Discriminant Analysis
      2.2    Stepwise Discriminant Analysis
      2.3    Linear Discriminant Function
      2.4    Criteria for Good Discriminant Functions
      2.4.1    Fisher’s Criterion
      2.4.2    Welch’s Criterion
      2.4.3    Von Mises Criterion
      2.4.4    Bayes Criterion
      2.4.5    Unequal Cost of Misclassification Criterion

      CHAPTER THREE: RESEARCH METHODOLOGY
      3.1    Stepwise Methodologies in Discriminant Analysis
      3.2    The F-Distribution
      3.3    The Wilks’ Lambda Distribution
      3.4    Using the Linear Discriminant Function
      3.5    Interpretation of Linear Discriminant Function
      3.6    Limitation of Discriminant Function
      3.7    Limitation of Stepwise Methods of Discriminant Analysis
      3.8    Ways of Dealing with the Problems Inherent in Stepwise Discriminant Analysis
      3.9    Test of Equality of Two Mean Vectors
      3.10    Test of Equality of Two Dispersion Matrices
      3.11    Estimating Misclassification Rate
      3.11.1    Probability of Misclassification 
      3.12    Improved Estimates of Error Rates

      CHAPTER FOUR: DATA ANALYSIS
      4.1    Method of Data Collection
      4.2    Discriminant Analysis (All Independent Variables)
      4.3    Summary of Canonical Discriminant Function
      4.4    Classification Statistics
      4.5    Discriminant Analysis (Stepwise Method)
      4.6    Stepwise Statistics
      4.7    Summary of Stepwise Canonical Discriminant Functions
      4.8    Classification Statistics for Stepwise Procedures

      CHAPTER FIVE: RESULTS, CONCLUSION AND RECOMMENDATION
      5.1    Results
      5.2    Conclusion
      5.3    Recommendation
      References
      Appendix I
      Appendix II

    • ABSTRACT

      Several multivariate measurement problems require variable selection and ordering. Stepwise procedures provide a step-by-step method through which these variables are selected and ordered, usually for discrimination and classification purposes. Stepwise procedures in discriminant analysis ensure that only important variables are selected, while redundant variables (variables that contribute little in the presence of the other variables) are discarded. The use of stepwise procedures …
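      To make the idea concrete, the following is a minimal sketch (not the author’s own computation) of forward stepwise selection driven by Wilks’ lambda, written in Python with NumPy. The function names and the fixed n_keep stopping rule are assumptions; a full stepwise routine would also apply an entry-F threshold at each step.

        import numpy as np

        def wilks_lambda(X, groups, cols):
            """Wilks' lambda = det(E) / det(E + H) for the chosen columns,
            where E and H are the within- and between-group SSCP matrices."""
            Xs = X[:, cols]
            grand = Xs.mean(axis=0)
            k = len(cols)
            E, H = np.zeros((k, k)), np.zeros((k, k))
            for g in np.unique(groups):
                Xg = Xs[groups == g]
                d = Xg - Xg.mean(axis=0)
                E += d.T @ d
                m = (Xg.mean(axis=0) - grand).reshape(-1, 1)
                H += len(Xg) * (m @ m.T)
            return np.linalg.det(E) / np.linalg.det(E + H)

        def forward_select(X, groups, n_keep):
            """At each step, enter the variable whose addition gives the
            smallest (most discriminating) Wilks' lambda."""
            selected, remaining = [], list(range(X.shape[1]))
            while remaining and len(selected) < n_keep:
                best = min(remaining,
                           key=lambda j: wilks_lambda(X, groups, selected + [j]))
                selected.append(best)
                remaining.remove(best)
            return selected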

         

      APPENDIX A …

         

      APPENDIX B

      APPENDIX II: BACKWARD ELIMINATION METHOD

      The procedure for the backward elimination of variables starts with all the x’s included in the model and deletes them one at a time using a partial Λ or F. At the first step, the partial Λ for each xi is

      Λ(xi | x1, …, xq, excluding xi) = Λ(x1, …, xq) / Λ(x1, …, xi−1, xi+1, …, xq).

      The variable with the smallest partial F (or the largest partial Λ) is deleted. At the second step of backward elimination, a partial Λ or F is calculated for each of the q − 1 remaining variables and again, the variable which is th…
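      A hedged Python sketch of this procedure, reusing the hypothetical wilks_lambda helper from the sketch under the abstract; the f_stay threshold and the degrees of freedom follow the standard partial-F form and are assumptions, not values quoted from the thesis.

        import numpy as np

        # wilks_lambda(X, groups, cols) is the helper defined in the
        # forward-selection sketch under the abstract.

        def backward_eliminate(X, groups, f_stay=4.0):
            """Start with all x's in the model; repeatedly delete the
            variable with the largest partial lambda (smallest partial F)
            until every remaining variable's partial F clears f_stay."""
            n = X.shape[0]
            g = len(np.unique(groups))
            cols = list(range(X.shape[1]))
            while len(cols) > 1:
                q = len(cols)
                lam_all = wilks_lambda(X, groups, cols)
                # partial lambda: Lambda(all q) / Lambda(all q except xi)
                partial = {j: lam_all / wilks_lambda(X, groups,
                                                     [c for c in cols if c != j])
                           for j in cols}
                # partial F with assumed df (g - 1, n - g - q + 1)
                fval = {j: (1 - lam) / lam * (n - g - q + 1) / (g - 1)
                        for j, lam in partial.items()}
                weakest = min(fval, key=fval.get)
                if fval[weakest] >= f_stay:
                    break
                cols.remove(weakest)
            return cols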

         

      CHAPTER ONE

      DEFINITION OF TERMS

      Discriminant Function: This is a latent variable created as a linear combination of discriminating variables, such that

      Y = L1X1 + L2X2 + … + LpXp,

      where the L’s are the discriminant coefficients and the X’s are the discriminating variables.

      The Eigenvalue: This is the ratio of importance of the dimensions which classify cases of the dependent variable. There is one eigenvalue for each discriminant functio…
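      As a minimal illustration of these two definitions (an assumed sketch, not the thesis’s own code): discriminant scores are just the linear combination above, and the eigenvalues are those of E⁻¹H, where E and H are the within- and between-group SSCP matrices.

        import numpy as np

        def discriminant_scores(X, L):
            """Y = L1*X1 + L2*X2 + ... + Lp*Xp for every case (row) of X."""
            return X @ L

        def discriminant_eigenvalues(X, groups):
            """One eigenvalue of inv(E) @ H per discriminant function; each
            measures that function's share of the discriminating power."""
            p = X.shape[1]
            grand = X.mean(axis=0)
            E, H = np.zeros((p, p)), np.zeros((p, p))
            for g in np.unique(groups):
                Xg = X[groups == g]
                d = Xg - Xg.mean(axis=0)
                E += d.T @ d
                m = (Xg.mean(axis=0) - grand).reshape(-1, 1)
                H += len(Xg) * (m @ m.T)
            eig = np.linalg.eigvals(np.linalg.solve(E, H)).real
            return np.sort(eig)[::-1]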

         

      CHAPTER TWO

      … (5) is called the Mahalanobis (squared) distance for known parameters. For unknown parameters, the Mahalanobis (squared) distance is obtained by estimating μ1, μ2 and Σ by X̄1, X̄2 and S, respectively. Following the same technique, the Mahalanobis (squared) distance D² for the unknown parameters is

      D² = (X̄1 − X̄2)′ S⁻¹ (X̄1 − X̄2).

      The distribution of D can be used to test whether there are significant differences between the two groups.

      2.4.2 WELCH’S CRITERION

      Welch (1939) suggest…
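      A minimal sketch of the sample version of this formula (hypothetical function and array names; assumes the two groups share a common covariance matrix, estimated by pooling):

        import numpy as np

        def mahalanobis_sq(X1, X2):
            """D^2 = (X1bar - X2bar)' S^-1 (X1bar - X2bar), where S is the
            pooled within-group covariance of the two samples."""
            n1, n2 = len(X1), len(X2)
            S = ((n1 - 1) * np.cov(X1, rowvar=False)
                 + (n2 - 1) * np.cov(X2, rowvar=False)) / (n1 + n2 - 2)
            d = X1.mean(axis=0) - X2.mean(axis=0)
            return float(d @ np.linalg.solve(S, d))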

         

      CHAPTER THREE

      The addition of variables reduces the power of the Wilks’ Λ test statistic unless the added variables contribute to the rejection of H0 by causing a significant decrease in Wilks’ Λ …
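      The test behind this statement is usually expressed through the partial Wilks’ Λ of a candidate variable; in standard notation (assumed here, not quoted from the thesis), with N cases, g groups and q variables already entered,

      \[
      \Lambda(x_{q+1}\mid x_1,\dots,x_q)
        = \frac{\Lambda(x_1,\dots,x_{q+1})}{\Lambda(x_1,\dots,x_q)},
      \qquad
      F = \frac{1-\Lambda(x_{q+1}\mid x_1,\dots,x_q)}{\Lambda(x_{q+1}\mid x_1,\dots,x_q)}
          \cdot \frac{N-g-q}{g-1},
      \]

      so an added variable helps only when this partial F is significant.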

         

      CHAPTER FOUR

      DATA ANALYSIS

      METHOD OF DATA COLLECTION

      The data employed in this work were collected by G.R. Bryce and R.M. Barker of Brigham Young University as part of a preliminary study of a possible link between football helmet design and neck injuries. Five head measurements were made on each subject, with about 30 subjects per group:

      Group 1 = High School Football Players
      Group 2 = Non-Football Players

      The five variables are:

      WDIM = X1 = head width at wi…
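      A hedged sketch of fitting a two-group linear discriminant to data of this shape (the arrays below are random placeholders, not the Bryce and Barker measurements, and scikit-learn’s LinearDiscriminantAnalysis stands in for whatever software the thesis used):

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        # 60 subjects x 5 variables (WDIM, CIRCUM, FBEYE, EYEHD, EARHD);
        # y = 1 for high-school football players, 2 for non-players.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(60, 5))
        y = np.repeat([1, 2], 30)

        lda = LinearDiscriminantAnalysis().fit(X, y)
        print(lda.coef_)        # estimated discriminant coefficients (the L's)
        print(lda.score(X, y))  # apparent hit ratio on the training sample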

         

      CHAPTER FIVE

      RESULTS, CONCLUSION AND RECOMMENDATION

      RESULTS

      As can be observed from the results of the analysis, when discriminant analysis was employed the variable CIRCUM (X2) has the highest Wilks’ lambda of 0.999, followed by FBEYE (X3) (0.959). The variable EYEHD (X4) has the least Wilks’ lambda of 0.517, followed by EARHD (X5) (0.705). Also, the least F-value was recorded for the variable CIRCUM (X2) (0.074), followed by the variable FBEYE (X3) (2.474), while the variable EYEHD (X4…

         

      REFERENCES

      Anderson, T.W. (1958). An Introduction to Multivariate Statistical Analysis. New York: John Wiley & Sons.

      Cohen, J. (1968). Multiple regression as a general data-analytic system. Psychological Bulletin, 70, 426-443.

      Cooley, W.W. and Lohnes, P.R. (1962). Multivariate Procedures for the Behavioural Sciences. New York: John Wiley & Sons.

      Efroymson, M.A. (1960). Multiple regression analysis. In A. Ralston & H.S. Wilf (Eds.), Mathematical Methods for …