Results of a study of program review among community colleges nationwide point to the importance of key leadership support, organizational communication, a clear understanding of the purposes of program review, and frequent action on program review recommendations at all organizational levels.

Impact of Program Review on Community Colleges

J. Joseph Hoey, IV

This issue addresses organizational and other factors commonly perceived to be relevant to program review impact. The focus of this chapter is to describe the results of a recent national study (Hoey, 1993) comparing several of those organizational factors with reported use of program review results. The chapter concludes with recommendations for heightening program review impact based on the study results.

In late 1992, a survey instrument on program review impact was mailed to 253 chief academic officers at community colleges around the country. For this survey, a systematic sample (stratified by accreditation region) was drawn from the American Association of Community and Junior Colleges Membership Directory 1992. At each community college, the chief academic officer was deemed to be the one person in the best position to judge the effects of program review within the college and, by virtue of the position, to have a broad picture of the organizational context of program review. Chief academic officers were asked to respond to a series of items on organizational factors believed relevant to program review impact, as well as to items designed to measure the impact of program review on a college. Using Dillman's (1978) total design methodology (with registered mailing omitted), 156 responses were received, for an overall response rate of 62 percent.
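The sampling procedure described above (a systematic sample stratified by accreditation region) can be sketched as follows. This is an illustrative reconstruction only: the region labels, stratum sizes, and proportional-allocation rule are invented assumptions, since the chapter does not report them.

```python
def systematic_sample(frame, k):
    """Take every k-th entry from an ordered sampling frame."""
    return frame[::k]

def stratified_systematic_sample(directory, target_n):
    """Stratify a membership directory by accreditation region, then draw a
    systematic sample within each stratum, allocating the target sample size
    proportionally to stratum size (an assumed allocation rule)."""
    total = sum(len(colleges) for colleges in directory.values())
    sample = []
    for region, colleges in directory.items():
        share = round(target_n * len(colleges) / total)  # proportional allocation
        k = max(1, len(colleges) // max(1, share))       # sampling interval
        sample.extend(systematic_sample(colleges, k)[:share])
    return sample

# Hypothetical directory: region -> ordered list of colleges (names invented)
directory = {
    "SACS": [f"SACS college {i}" for i in range(400)],
    "NCA":  [f"NCA college {i}" for i in range(300)],
    "MSA":  [f"MSA college {i}" for i in range(150)],
}
sample = stratified_systematic_sample(directory, target_n=253)
print(len(sample))
```

With these invented stratum sizes the draw comes out at the 253-college target used in the study; with the reported 156 responses, that yields the 62 percent response rate.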
To measure program review impact, items were developed around a conceptual framework extrapolated from Shadish, Cook, and Leviton (1991), where instrumental use is defined as the short-term decision outcomes following a program review; incremental use is the changes that take place over time as a result of program review; persuasive use is the degree to which program review recommendations provide a leverage point or a basis for politically motivated actions; and conceptual use is the way organizational members think about a program (its centrality to the college mission, its place in institutional long-range plans) or how a better understanding of a program is gained through program review. Findings from the survey follow.

NEW DIRECTIONS FOR INSTITUTIONAL RESEARCH, no. 86 (Using Academic Program Review), Summer 1995, Jossey-Bass Publishers

Program Review Prevalence and Purposes

A formal program review process was in use at 87 percent of responding institutions, and appears to have been adopted by nearly 25 percent more institutions over the past two years. The impetus for this increase may well be tied to the purposes for which program review is undertaken. Results indicate that fulfilling state evaluation mandates, satisfying effectiveness criteria of regional accreditation bodies, and demonstrating accountability to funding bodies have all become important reasons for responding community colleges to undertake program review, as outlined in Table 4.1. Mean ratings of importance are based on a five-point scale, where 1 = not important at all and 5 = very important. Thus, accountability concerns appear to be a prime impetus for program review among responding colleges.
In fact, an interesting example of the trend toward greater importance of externally focused purposes has recently taken place in North Carolina, where a successful annual desktop audit program review model in one college has been adapted and mandated systemwide to replace a less uniform program review cycle at the local level (North Carolina Department of Community Colleges, 1994).

Although externally focused purposes for undertaking program review were found to be important in this study, one surprising result is the extent to which internally focused, improvement-oriented purposes are also considered to be of high importance, as outlined in Table 4.2. As in Table 4.1, mean ratings of importance are based on a five-point scale, where 1 = not important at all and 5 = very important. Conrad and Wilson (1985) posit that the emphasis in program review has shifted from program improvement to accountability. However, the results of the present study indicate that both are considered important among responding community colleges. A possible explanation may be that once program review is mandated and adopted, administrators quickly realize its benefits as a vehicle for planning, keeping abreast of developments among the various programs a college offers, and improving external relations.

Table 4.1. Mean Ratings of Importance of Externally Oriented Purposes for Program Review

Item                                                          Mean   S.D.
Satisfy a state-level mandate for program review
  or planning and evaluation processes                        ?.94    .97
Satisfy requirements of regional accreditation bodies         4.?8   ?.00
Demonstrate accountability to our publics                     3.71    .97
Respond to federal requirements for the evaluation
  of occupational programs                                    3.41   1.11

Note: n = 136. (A "?" marks a digit illegible in the source.)

Table 4.2. Mean Ratings of Importance of Internally Oriented Purposes for Program Review

Item                                                          Mean   S.D.
Evaluate program quality                                      4.44    .69
Ensure currency and relevancy of curricula in relation
  to employer or workplace requirements                       4.28    .79
Improve teaching and learning at our institution              4.23    .87
Provide information for administrators who are
  considering a program's future                              4.09    .83
Clarify or redefine program mission and goals                 3.78    .96
Provide basis for allocation and reallocation of resources    3.52   1.01
Provide internal political leverage for needed changes        3.00    .98

Note: n = 136.

Using Program Review Results

As noted above, this study conceptualized the products of program review into four categories: instrumental, conceptual, persuasive, and incremental. Using the effects of program review most commonly mentioned in the literature, indices of items were constructed to reflect the extent of program review impact in each category. For example, the index used to assess the incremental or long-term impact of program review consisted of eleven items, such as "Program review has resulted in greater relevancy of curricula to employer or workplace needs"; "Program review has resulted in measurably improved student outcomes"; "Program review has resulted in improved academic decision making"; and "Program review has resulted in improved communication between faculty and administration." To measure the extent of use in each category, a five-point scale was used where 1 = not at all and 5 = to a great extent. Reliability (Cronbach's alpha) of all four indices was high, ranging from .78 for persuasive use to .93 for incremental use.
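The reliability figures just quoted come from Cronbach's alpha, which for an index of k items is alpha = (k / (k - 1)) * (1 - sum of item variances / variance of the total score). A minimal sketch of the computation follows; the response data are invented for illustration, not the study's.

```python
def cronbach_alpha(items):
    """Cronbach's alpha for an index.
    items: list of k lists, each holding one item's scores
    across the same respondents."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # Total index score per respondent
    totals = [sum(col[i] for col in items) for i in range(n)]
    item_var_sum = sum(var(col) for col in items)
    return (k / (k - 1)) * (1 - item_var_sum / var(totals))

# Invented five-point ratings from four respondents on a three-item index
items = [
    [4, 5, 3, 4],
    [4, 4, 3, 5],
    [5, 4, 2, 4],
]
print(round(cronbach_alpha(items), 2))  # prints 0.8
```

Alpha rises when items in an index move together across respondents, which is why a .93 for the eleven-item incremental-use index indicates the items are measuring a coherent construct.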
In the results obtained, indices of conceptual and incremental use both achieved fairly high overall mean ratings, at 3.38 and 3.44 respectively; instrumental use achieved a lower overall mean of 2.96; and persuasive use was rated at 2.76, as shown in Table 4.3. The results in Table 4.3 demonstrate that conceptual and incremental use of program review results among responding community colleges appear to be occurring from a moderate to considerable extent. Instrumental and persuasive use can be interpreted as occurring from a slight to moderate extent. Thus, although short-term changes and leveraged or politically motivated actions may not be occurring at a notable rate as a result of program review, long-term effects and less-observable attitudinal changes are in evidence among responding institutions.

Relationship of Factors to Use of Program Review Results

Through an analysis of the data obtained in this study, significant positive correlations were identified between the use of program review results and several of the organizational factors deemed vital to program review impact in the literature.

Leadership Support

To assess the degree of key leadership support for program review, an index of six items was developed. Items were measured on a five-point Likert-type scale, from 1 (disagree) to 5 (agree). The index included such items as "Top administrators here expect program review recommendations to be taken seriously," "Program review reports are used at all levels of this institution to enhance institutional effectiveness," and "Top administrators at this institution rely on program review reports for long-range planning decisions." This index of items was found to be related to all four hypothesized components of program review, as outlined in Table 4.4. Results support the notion that the degree of leadership support is significantly related to all four categories of program review impact.
Table 4.3. Mean Reported Use of Program Review Results

Use Component       Mean   S.D.
Incremental use     3.44    .94
Conceptual use      3.38    .91
Instrumental use    2.96    .75
Persuasive use      2.76    .91

Note: n = 136.

Table 4.4. Results of Correlation Analysis Between Leadership Support and Use of Program Review Results

Use Component                                   r     R2
Incremental use of program review results     .57**   .32
Conceptual use of program review results      .46**   .21
Instrumental use of program review results    .38**   .14
Persuasive use of program review results      .29**   .08

Note: n = 136. ** .01 significance level.

Organizational Communication

A second important organizational factor, communication, was also found to be closely related to the use of program review results. Specifically, the perceived accuracy and openness of organizational communication were assessed in this study with two scales originally developed by O'Reilly and Roberts (1976) and modified to assess perceptions of communication as it relates to program review. Respondents were asked to rate their agreement or disagreement with a series of statements such as "Information gained from program reviews is widely shared at this institution," "The information I receive through program reviews is often inaccurate," and "It is easy to ask for feedback from faculty, staff, and students during program reviews at this institution." Results obtained are shown in Table 4.5. Results support the interpretation that perceived accuracy and openness of organizational communication concerning program review are positively related to the use of program review results among responding colleges.

Purposes

Purposes for conducting program review were also found to be related to use of program review results.
Indices of both externally and internally focused purposes for conducting program review were moderately correlated with three of the identified components of program review use, as shown in Table 4.6. Intuitively, one would hope to see a relationship between the purposes for program review and the reported impact. It is surprising that a stronger relationship was not obtained.

Organizational Centralization

Organizational centralization has been cited in the program review literature as both a boon and a bane to the impact of program review (Stevenson, 1985; Smith, 1979; Ruhland, 1990). To explore this structural factor, an index was

Table 4.5. Results of Correlation Analysis Between Perceived Organizational Communication and Use of Program Review Results

Use Component                                   r     R2

Perceived Openness of Communication
Incremental use of program review results     .55**   .28
Instrumental use of program review results    .44**   .19
Conceptual use of program review results      .24**   .06
Persuasive use of program review results      .20*    .04

Perceived Accuracy of Communication
Incremental use of program review results     .51**   .26
Instrumental use of program review results    .31     .10
Conceptual use of program review results      .29**   .08
Persuasive use of program review results      .16     .02

Note: n = 136. * .05 significance level. ** .01 significance level.
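The r and R2 columns reported in Tables 4.4 and 4.5 are related by squaring: R2 = r * r, the share of variance in reported use associated with the factor (so an r of .57 for leadership support corresponds to about 32 percent of variance). A quick sketch of that computation using the Pearson product-moment correlation; the index scores here are invented for illustration.

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented index scores: leadership-support index vs. incremental-use index
support = [2.0, 3.0, 3.5, 4.0, 4.5, 5.0]
use     = [2.5, 2.8, 3.6, 3.4, 4.2, 4.4]
r = pearson_r(support, use)
print(round(r, 2), round(r ** 2, 2))  # r, then R^2 (variance shared)
```

Squaring makes the practical import of the correlations easier to read: the persuasive-use correlations of around .2 in the tables correspond to only a few percent of variance, consistent with the chapter's cautious interpretation of persuasive use.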