
Grammatikopoulos, V., Tsigilis, N., & Koustelios, A. (2007). Influential factors of an educational program implementation evaluation: A cross validation approach. Evaluation and Research in Education, 20(2), 100-113.

Influential Factors of an Educational Programme Implementation Evaluation: A Cross-validation Approach

Vasilios Grammatikopoulos, Nikolaos Tsigilis and Athanasios Koustelios
Department of Physical Education and Sport Science, University of Thessaly, Karies, Trikala, Greece

The aim of this study was to examine the underlying structure of the Evaluation Scale of Educational Programme Implementation (ESEPI). The ESEPI consists of six factors that play an important role in educational programme implementation, namely: training, educational material, administration, facilities, relationships and educational procedure. The instrument was administered to 449 physical education teachers. A general-factor model was postulated and supported. According to the model, the six components loaded on one general factor. In addition, the proposed model was found to be invariant across the two groups. Confirmatory factor analysis with a cross-validation procedure partially supported the validity of the instrument. Nevertheless, further research is needed before the ESEPI can be used as an instrument of choice for evaluating programme implementation.

doi: 10.2167/eri404.0

Keywords: programme evaluation, educational programme implementation, confirmatory analysis

Evaluation of programme implementation is a significant part of an educational evaluation procedure (Provus, 1971). It helps educators to uncover the weak parts of a programme (Dimitropoulos, 1998) and to point out decisive elements for a programme's success (Grammatikopoulos et al., 2004a; Grammatikopoulos et al., 2005). For many years, several innovative physical education programmes have been implemented in schools (e.g. health-related physical education, life-skills-oriented physical education, environmental education).
Evaluation procedures in previous programmes have usually tried to examine the overall success of the programme, but not the programme implementation itself. Programme implementation evaluation can provide useful information that an evaluation focused only on programme success cannot reveal. Solomon (1998) argued that certain factors play an important role in an educational programme's implementation. Thus, in order to evaluate the implementation of an educational programme, an instrument was developed and used for this purpose (Grammatikopoulos et al., 2004a; Grammatikopoulos et al., 2005). In the above studies, exploratory factor analysis using the Evaluation Scale of Educational Programme Implementation (ESEPI) succeeded in identifying a defensible (statistically and conceptually distinct) factor structure (Appendix). The results of the study by Grammatikopoulos et al. (2005) indicated that the ESEPI comprised six factors (training, educational material, administration, facilities, relationships and educational procedure) that explained 68.77% of the total variance. Cronbach's alpha coefficients for the questionnaire's subscales ranged between 0.67 and 0.92.

Evaluation of a programme's implementation represents an integral part of any educational evaluation (Provus, 1971). Therefore, the selection and use of valid evaluation instruments are considered to enhance effectiveness and accountability in programme evaluation. Research on evaluation benefits from the development and application of valid and reliable measures, as empirical data obtained with psychometrically sound instruments can be trusted, contributing to correct interpretation of the findings. Thus, establishing the validity and reliability of an instrument (such as the ESEPI) is a major issue in the evaluation process.
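Internal-consistency coefficients such as the Cronbach's alpha values quoted above can be computed directly from an item-response matrix. The sketch below is illustrative only; the function names and the synthetic Likert data are our own, not taken from the study:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) response matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses for a four-item subscale
rng = np.random.default_rng(0)
base = rng.integers(1, 6, size=(100, 1))                       # a common "true" level
responses = np.clip(base + rng.integers(-1, 2, size=(100, 4)), 1, 5)
print(round(cronbach_alpha(responses), 2))
```

Because the four simulated items share a common component, the resulting alpha is high, mirroring the kind of subscale consistency the authors report.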
Further validation studies (on different samples, using more advanced statistical techniques) strengthen confidence in the examined instrument and provide researchers with a useful tool that can enrich the field of programme implementation and stimulate research efforts in the educational setting.

The educational programme reported in the current study was the 'Olympic Education Programme' (OEP), which has been implemented in the vast majority of elementary and secondary Greek schools since 2000. About 2000 physical education teachers have worked on this project. The OEP is characterised by contemporary educational approaches. An integrated curriculum (physical activities, sports, theoretical approaches, cultural activities) and learning through projects are integral components of the educational procedure, in which students are the centre of the process. The Greek OEP aspires to establish congruence between the goals of the Olympic 'philosophy' and current international educational priorities, while at the same time providing physical education teachers with useful teaching suggestions and aids (Kioumourtzoglou et al., 2001).

Implementation of the OEP is not a Greek innovation: several approaches to Olympic education in school curricula evolved through the efforts of cities that hosted the Olympic Games (Binder, 2001). To prepare properly for this new project, physical educators also participated in a five-day training programme.

Construct validity is the most important property in the development of an instrument, because it involves testing the adequacy of theoretically derived relationships.
The construct validity of the ESEPI should also be tested and confirmed using a theoretical model that has been specified a priori. Confirmatory factor analysis (CFA) is a frequently employed technique for evaluating the factorial validity of multidimensional instruments. In Grammatikopoulos et al. (2005), the examination of the structure of the ESEPI relied on exploratory factor analysis. Therefore, the purpose of the current study was to further examine the construct validity of the ESEPI. A cross-validation procedure was used because it is an important step in building confidence in the validity of the ESEPI. Cross-validation requires two distinct samples from the same population. The first sample is used to establish a baseline model with an acceptable fit. This model is then cross-validated on the other sample using various structural equation modelling procedures.

Method

Participants and procedures

The sample comprised 449 physical education teachers (about 23% of the population) working on the OEP (211 males and 238 females). The mean age of participants was 32.2 ± 9.1 years. The sample was randomly divided into two groups, calibration (n = 224) and validation (n = 225). Preliminary examination revealed no differences between the two groups with respect to age (p > 0.05), sex (p > 0.05), or responses to the six composite factors (p > 0.05).

Participants were asked to complete the questionnaire, evaluating the OEP's implementation, at the end of the academic year 2003-2004 (end of June 2004). Physical education teachers participated in the study voluntarily. As mentioned above, the population of physical education teachers who implemented the OEP was around 2000. About 1500 were randomly selected to participate in previous studies regarding other OEP evaluation procedures (Grammatikopoulos et al., 2004b, 2004c; Grammatikopoulos et al., 2005).
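The random division of the 449 respondents into calibration and validation groups can be sketched as follows. This is a minimal illustration under our own assumptions (fixed seed, index-based split); the study does not describe its exact randomisation procedure:

```python
import numpy as np

rng = np.random.default_rng(42)  # fixed seed so the split is reproducible
ids = np.arange(449)             # one index per respondent
rng.shuffle(ids)                 # random permutation of respondents
calibration, validation = ids[:224], ids[224:]
print(len(calibration), len(validation))  # → 224 225
```

A split like this would typically be followed by the kind of preliminary checks the authors report, i.e. testing that the two groups do not differ on age, sex, or the composite factors.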
Thus, the 600 teachers of the current study represented the remaining population. Six hundred questionnaires were distributed in order to collect the 449 (n = 449) that were used in this study (response rate 74%).

Instrument

The instrument used in the current study was the ESEPI (Appendix). Responses were given on a 5-point Likert scale (from 1 = strongly disagree to 5 = strongly agree). The scale included six factors: (a) facilities (five items), (b) educational procedure (four items), (c) educational material (five items), (d) administration (four items), (e) training (four items) and (f) relationships (four items).

The calculated Cronbach's alpha coefficients for both groups indicated adequate internal consistency. Specifically, the alpha values for the calibration and validation groups respectively were: 0.88 and 0.84 for facilities, 0.93 and 0.90 for educational procedure, 0.86 and 0.89 for educational material, 0.91 and 0.95 for administration, 0.94 and 0.92 for training, and 0.89 and 0.91 for relationships.

Model tested

A one-factor model was postulated and tested. Specifically, it was hypothesised that the six measured variables (facilities, educational material, administration, training, relationships and educational procedure) were manifestations of a latent variable, 'programme implementation'. Therefore, each observed variable would have a non-zero loading on the factor it was designed to measure. It was also hypothesised that the measurement errors associated with the variables would be uncorrelated.

Model cross-validation

A hierarchical approach to testing for invariance (Bollen, 1989; Byrne, 1994; Byrne et al., 1989) was employed in order to examine the cross-validation of the ESEPI. According to this procedure, equality constraints are imposed on a particular set of parameters.
Next, invariance is tested by simultaneously fitting the proposed model to the data from multiple groups (Arbuckle, 1997; Byrne, 1994). Constraints are imposed in a logically ordered and increasingly restrictive fashion. Finally, the model in which a certain set of parameters is forced to be equal across groups is compared with a less restrictive model in which the same parameters are free to take any value. A non-significant χ² difference indicates that the invariance hypothesis may be considered tenable. At each stage, goodness of fit was also assessed. In the present study, the following sets of parameters were examined in relation to group invariance: (a) equality of form (model 1), (b) equality of loading paths (model 2), (c) equality of factor variances/covariances (model 3) and (d) equality of error variances/covariances (model 4) (Bollen, 1989; Byrne, 1994; Byrne et al., 1989). Confirmatory factor analysis using AMOS 3.62 (Arbuckle, 1997) was employed in order to examine the construct validity of the ESEPI.

Assessment of fit

The overall fit of the data to the examined models was initially based on the χ² statistic. Non-significant values suggest a good fit, since they indicate a minor discrepancy between the observed and the estimated covariance matrices. However, the χ² statistic has been criticised for its sensitivity to sample size and departures from normality (Byrne, 1994; Li et al., 1998). In an attempt to overcome this problem, various alternative fit indexes have been proposed. Thus, as a supplement to the conventional χ², multiple fit indexes were provided. Following the recommendations of Hoyle and Panter (1995), absolute and incremental indexes were used to assess the adequacy of the postulated models.
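The nested-model comparison just described reduces to a χ² difference test: the difference between the two models' χ² values is itself χ²-distributed, with degrees of freedom equal to the difference in df. A sketch using the calibration-sample figures from Table 1 (χ² = 16.14 on 9 df for the restricted model versus 8.05 on 8 df for the freer model):

```python
from scipy.stats import chi2

# Calibration sample, Table 1: one-factor model vs. model with correlated errors
chi2_restricted, df_restricted = 16.14, 9
chi2_free, df_free = 8.05, 8

d_chi2 = chi2_restricted - chi2_free   # Δχ² = 8.09
d_df = df_restricted - df_free         # Δdf = 1
p = chi2.sf(d_chi2, d_df)              # upper-tail p-value of the difference
print(f"Δχ²({d_df}) = {d_chi2:.2f}, p = {p:.4f}")
```

A significant p-value here (as the authors find, p < 0.01) means the less restrictive model fits significantly better, i.e. the imposed constraint is rejected.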
Absolute fit indexes assess the difference between observed and model-specified covariances, whereas incremental fit indexes assess the proportionate improvement in fit by comparing a target model with a more restricted, nested baseline model. The baseline model represents the worst possible fit, in which all measured variables are assumed to be mutually uncorrelated, providing an appropriate basis for defining a zero point. Based on a suggestion by Hu and Bentler (1999), a two-index presentation strategy was used, which includes the maximum-likelihood-based standardised root mean square residual (SRMR; Bentler, 1995) supplemented with one incremental index. Thus, in evaluating goodness of fit it was decided to present the χ² statistic followed by an absolute index, the SRMR, and the comparative (incremental) fit index (CFI; Bentler, 1990). According to Hu and Bentler (1999), cut-off values close to 0.08 for SRMR and close to 0.95 for CFI are needed in order to declare a relatively good fit between the observed data and the hypothesised model.

Results

Mardia's coefficient of multivariate kurtosis provided by AMOS was not statistically significant (1.67, p > 0.05 for the calibration sample, and 1.59, p > 0.05 for the validation sample), indicating that the assumption of multivariate normality was tenable. Therefore, the maximum likelihood method of estimation was used. Table 1 displays the goodness-of-fit measures for the calibration and validation samples. The p-value of the χ² statistic was above 0.05, indicating no significant difference between the observed and the estimated covariance matrices. The absolute fit index reached the cut-off value for an adequate fit, whereas the incremental fit index did not.
Thus, in order to identify possible sources of misfit, the modification indexes provided by AMOS were examined. Results revealed that the model would be significantly improved if an error covariance between the observed variables 'educational material' and 'training' were included. Inspection of the χ² statistic for the model with correlated errors indicated a substantial decrement. The χ² difference (Δχ²) between the two models (Table 1) was significant in the calibration sample (Δχ²(1) = 8.09, p < 0.01), which shows a substantial improvement in model fit.

Cross-validation of the ESEPI

Results from the cross-validation procedure are presented in Table 2. The validation sample analysis yielded results similar to those for the calibration sample. Again there was a discrepancy between the absolute and incremental indexes. The addition of an error covariance between the observed variables 'educational material' and 'training' significantly improved the fit of the model, as indicated by the χ² difference (Δχ²(1) = 4.34, p < 0.05) and the fit indexes (Table 1). Thus, the one-factor model with correlated errors seemed to fit the data adequately in both samples. The next series of analyses represents the examination of the equality hypothesis between the two groups.

Table 1 Goodness-of-fit indexes for the proposed and alternative models based on the calibration and validation samples

Model                                        χ²      df   Δχ²(df)     SRMR   CFI
Calibration sample
  One-factor model                           16.14   9                0.05   0.89
  One-factor model with correlated errors     8.05   8    8.09 (1)**  0.03   0.99
Validation sample
  One-factor model                           17.00   9                0.05   0.85
  One-factor model with correlated errors    12.66   8    4.34 (1)*   0.05   0.92

SRMR, standardised root mean square residual; CFI, comparative fit index. *p < 0.05, **p < 0.01.
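The CFI values in Table 1 compare each target model against the independence (null) baseline model. The baseline χ² is not reported in this excerpt, so the numbers fed to the function below are hypothetical; the function itself follows Bentler's (1990) definition:

```python
def cfi(chi2_target: float, df_target: float,
        chi2_baseline: float, df_baseline: float) -> float:
    """Comparative fit index (Bentler, 1990)."""
    num = max(chi2_target - df_target, 0.0)
    den = max(chi2_baseline - df_baseline, chi2_target - df_target, 0.0)
    return 1.0 - num / den if den > 0 else 1.0

# Hypothetical baseline values chosen purely for illustration (not from the study)
print(round(cfi(16.14, 9, 80.0, 15), 3))  # → 0.89
```

The max(…, 0) terms mean that a target model whose χ² does not exceed its degrees of freedom is assigned a CFI of 1.0, which is why the correlated-errors models in Table 1 approach that ceiling.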