JOURNAL OF RESEARCH IN SCIENCE TEACHING

A Framework for Scaffolding Students' Assessment of the Credibility of Evidence

Iolie Nicolaidou,1 Eleni A. Kyza,1 Frederiki Terzian,2 Andreas Hadjichambis,2 Dimitris Kafouris2

1 Department of Communication and Internet Studies, Cyprus University of Technology, P.O. Box 50329, 3603 Limassol, Cyprus
2 Cyprus Ministry of Education and Culture, Nicosia, Cyprus

Received 25 May 2010; Accepted 27 April 2011

Abstract: Assessing the credibility of evidence in complex, socio-scientific problems is of paramount importance. However, there is little discussion in the science education literature on this topic and on how students can be supported in developing such skills. In this article, we describe an instructional design framework, which we call the Credibility Assessment Framework, to scaffold high school students' collaborative construction of evidence-based decisions and their assessment of the credibility of evidence. The framework was employed for the design of a web-based reflective inquiry environment on a socio-scientific issue, and was enacted with 11th grade students. The article describes the components of the Credibility Assessment Framework and provides the details and results of an empirical study illustrating this framework in practice. The results are presented in the form of a case study of how 11th grade students investigated and evaluated scientific data relating to the cultivation of genetically modified plants. Multiple kinds of data were collected, including pre- and post-tests of students' conceptual understanding and their skills in assessing the credibility of evidence, and videotapes of students' collaborative inquiry sessions. The analysis of the pre- and post-tests on students' conceptual understanding of biotechnology and their skills in assessing the credibility of evidence revealed statistically significant learning gains.
Students' work in task-related artifacts and the analysis of two groups' videotaped discussions showed that students became sensitive to credibility criteria, questioned the sources of data, and correctly identified sources of low, moderate, and high credibility. Implications for designers and educators regarding the application of this framework are discussed. © 2011 Wiley Periodicals, Inc. J Res Sci Teach

Keywords: evidence credibility assessment; scaffolding; biotechnology; reflective inquiry; high school students; collaboration; socio-scientific issues; WWW

Correspondence to: I. Nicolaidou; E-mail: iolie.nicolaidou@cut.ac.cy
DOI 10.1002/tea.20420
Published online in Wiley Online Library (wileyonlinelibrary.com). © 2011 Wiley Periodicals, Inc.

What happens in nature . . . depends upon the chef you ask: for some, nature is a seasoning thrown in to flavor the social meat and cultural potatoes; for others, nature is what is finally brought to the table, what gets ladled into bowls, either thick stew with chunks of social left in or thin broth after the "meat" is methodically strained out and discarded; still others never bother to pick up any nature at the market—it is social down to the bottom of the pot. (Gieryn, 1999, p. ix)

A central goal of science education is to support students' evidence-based reasoning by engaging them in seeking evidence and using it to critique scientific claims. According to the U.S. National Science Education Standards (NRC, 1996), students should be able to apply scientific reasoning to participate in informed decision-making at the local and national level about issues that affect their everyday life, such as global climate change and genetic engineering. Participating in such decision-making requires an understanding of the nature of science, as well as scientific knowledge and skills relating to the interpretation and weighing of evidence, issues which are underrepresented in the current practice of teaching and learning science (Chinn & Malhotra, 2002).
Reaching decisions on many socio-scientific debates that scientists and the public need to take action on requires processes such as making sense of complex and diverse data-sets that are difficult to analyze and comprehend, weighing the relevance of scientific evidence, and examining its credibility. "Practices such as weighing evidence . . . and evaluating the potential viability of scientific claims are seen as (some of the) essential components in constructing scientific arguments" (Driver, Newton, & Osborne, 2000, p. 288). Driver et al. (2000) noted that insufficient time is typically given to evaluative tasks beyond the interpretation of data; for instance, questions such as "what trust can we place in data?" or "are there different possible interpretations of this data?" are not frequently addressed. Indeed, existing inquiry curricula often carry explicit or implicit expectations that students treat data as non-biased and do not raise concerns over the credibility of the evidence.

Credibility assessment is a very important task in several professions and in many aspects of everyday life (Rieh & Danielson, 2007). For example, a key role of judges and jurors during a trial is to decide whether witnesses are lying or telling the truth. Empirical studies have confirmed that credibility assessment is a highly complex and often unreliable task, with errors occurring in about 45% of assessments in courts (Porter & Brinke, 2009). This has implications for students that extend well beyond legal settings. Students may never be called to decide whether a person should be convicted in a court, but they can easily be called to make decisions and solve problems in everyday life that require them to evaluate the credibility of evidence, verify the source of information, decide whether those who present the evidence are trustworthy, and be convinced of a position for or against a specific course of action.
Students, more frequently now than before, are faced with important socio-scientific dilemmas, and they are, or will be, asked to take action on them. They should be in a position to have reflective discussions on such debates and not accept data at face value.

Students' difficulties in evaluating the credibility of evidence are documented in the literature (Brem, Russel, & Weems, 2001; Dawson & Venville, 2009; Reiser, 2004; Sandoval & Millwood, 2005). However, there is no evidence in the literature to suggest that the credibility assessment process can be enhanced at the high school level, nor has it been discussed how such assessment can be supported at that level. The work presented in this article introduces an instructional framework for supporting students in attending to such issues. This framework was used in the design of a learning environment scaffolding high school students' scientific inquiry. The learning environment was enacted with a class of 11th grade students, and the students' skills in assessing the credibility of evidence were examined over time. This work contributes to the understanding of students' capacity to evaluate the credibility of evidence, given the need to foster students' skills on this topic and the limited discussion of it in the literature.

In the sections that follow, we first give an overview of existing research on students' assessment of the credibility of evidence and then present the Credibility Assessment Framework, a suggested framework to support high school students' evaluation of the credibility of evidence. In the second part of the article we present and discuss a study in which the Credibility Assessment Framework was implemented, with the goal of illustrating: (a) how this framework supports the development of high school students' capacity to assess the credibility of evidence; (b) the development of students' conceptual understanding of a complex socio-scientific topic,
in the context of a scientific inquiry investigation; and (c) the development of students' understanding of the credibility assessment criteria over time.

Theoretical Framework

Defining Credibility

Credibility has been examined across a number of fields, ranging from law, communication, information science, psychology, and marketing to human–computer interaction. Each field has examined the construct and its practical significance using fundamentally different approaches, goals, and presuppositions, which results in conflicting views of credibility and its effects (Rieh & Danielson, 2007). With regard to source credibility in students' evaluations of scientific arguments in particular, Brem et al. (2001) stated that "determining source credibility involves assessing expertise, detecting conflicts of interest or ulterior motives, and looking for signs of professional or unprofessional behavior" (p. 193).

We primarily draw on the work of Driver et al. (2000) to provide an operational definition of evaluating the credibility of evidence. Assessing the credibility of different sources of information can be defined as considering the grounds for confidence through the use of interrogatory tools, such as the critical asking of "reports about the origin of evidence, whether the evidence is simply correlational or whether there is a plausible theoretical mechanism, whether the results are reproducible, whether they are contested, or about the authority of the scientific source" (Driver et al., p. 301). Put simply, the researchers referred to two main parameters that are important for the assessment of the credibility of evidence: the source of the evidence and the methodology by which the evidence was constructed. Students need to examine the source of the evidence and think about questions such as: Is there evident bias or not? Was a piece of evidence peer-reviewed? Who is the author of the evidence? What is the author's agenda and background? What was the source of funding for producing each piece of evidence?
As far as the methodology is concerned, students need to think about the following: Does the evidence refer to a comparison of two different groups? Is there any control of variables? Were the results replicated?

Students' Difficulties in Evaluating the Credibility of Evidence

There are several reports in the literature on studies of credibility and how to better support students' credibility evaluation skills; however, only a small subset of those studies comes from science education, with most of them focusing on topics such as Internet searches, evaluating information online, information literacy, and information seeking (e.g., Julien & Barker, 2009; Rieh & Hilligoss, 2008). Science education research has demonstrated students' difficulties in evaluating evidence to construct evidence-based explanations at the elementary school level (Sandoval & Çam, 2011; Wu & Hsieh, 2006), middle school level (Chinn, Duschl, Duncan, Buckland, & Pluta, 2008; Glassner, Weinstock, & Neuman, 2005; Kyza & Edelson, 2005; Mathews, Holden, Jan, & Martin, 2008; Yoon, 2008), high school level (Brem et al., 2001; Dawson & Venville, 2009; Reiser, 2004; Sandoval & Millwood, 2005), undergraduate level (Halverson, Siegel, & Freyermuth, 2010; Korpan, Bisanz, Bisanz, & Henderson, 1997; Lippman, Amurao, & Pellegrino, 2008; Treise, Walsh-Childers, Weigold, & Friedman, 2003), and even pre-service teacher education (Zembal-Saul, Munsford, Crawford, Friedrichsen, & Land, 2002). One of the most important and common difficulties students have in relation to evaluating evidence is uncertainty as to what constitutes convincing or valid evidence (Driver et al., 2000). Furthermore, previous research showed that one important reason students faced difficulties in evaluating the credibility of evidence was that they lacked the criteria needed to evaluate evidence (Wu & Hsieh, 2006).
At the middle school level, Pluta, Buckland, Chinn, Duncan, and Duschl (2008) documented a widespread difficulty that the participants in their study, 724 US middle school students, faced with regard to the effective use of reasons and evidence. Students typically found it difficult to make judgments about the relative strength of evidence and tended to treat all evidence as equally strong. They also had difficulty understanding the need to provide justifications that were more elaborate than simply mentioning the evidence that provided support. Seethaler and Linn (2004) examined 190 8th grade students' reasoning about tradeoffs in the context of a technology-enhanced curriculum about genetically modified (GM) food and found that even though students were able to provide evidence both for and against their positions, they were less explicit about how they weighed those tradeoffs. In the humanities context, Baildon and Damico (2009) used an exploratory case study design to document and explain the complexities of 32 Asian-American 9th grade students' thinking about credibility in relation to a controversial video. They found that students struggled in assessing credibility. In terms of criteria for assessing credibility, students considered issues of authorship. They also referred to what was absent: counter-arguments, contradictory evidence, and other theories and claims about what happened. Those researchers found that the students in their study tried to contextualize claims and evidence by situating authorship, information, claims, evidence, and texts in broader contexts, drawing on their own prior knowledge.

At the high school level, Kolstø (2001) interviewed 22 10th grade students on their views on the trustworthiness of knowledge claims, arguments, and opinions on a socio-scientific issue.
All students expressed problems and uncertainty when trying to sort out who to trust and what to believe. As a result, some claims were not questioned at all. Moreover, most students showed very little interest in the empirical evidence underpinning knowledge claims, and only a few expressed a positive interest in methodological aspects to assess the credibility of the evidence presented to them.

Even at the college level, students sometimes had difficulty assessing the credibility of evidence in scientific arguments. Lippman et al. (2008) investigated 98 college students' understanding of evidence use in scientific arguments by measuring students' ability to compare and analyze the quality of arguments and the use of evidence in essays. Their findings revealed that students inconsistently applied the epistemological criteria of empirical data serving as evidence in scientific arguments. Their insensitivity to blatantly inaccurate descriptions of evidence suggests that students lack the deep understanding needed to assess the quality of evidence. They made reasonable, but not always ideal, selections of strong and weak evidence and focused on superficial aspects, such as ease of comprehension, instead of the need for relevant empirical and disconfirming evidence.

There is scarce evidence in the literature that credibility assessment can be enhanced; no such discussions were located at the high school level, while some evidence from higher education indicated that instruction in the form of short training can support students in evaluating evidence. Sanchez, Wiley, and Goldman (2006) reported that college students had a fragile understanding of the credibility of sources of evidence on Internet sites, and very few could verbalize their understanding or use it to justify their evaluations of credibility.
The researchers identified four key areas in which college students needed support: considering the source of the information, considering the evidence that was presented, thinking about how the evidence fit into an explanation of the phenomena, and evaluating the information with respect to prior knowledge. In a study with 60 undergraduates who were randomly divided into two groups, one group of students received a short training course while the control group did not. The criteria used to evaluate Internet sites were the following: who the author of the information was, how reliable a site was, how well the site explained the information, identifying relevant information about the author, such as motivation, and examining whether the information was consistent with other reliable sources and whether it was based on scientific evidence. Results indicated that college students were able to learn to use critical evaluation skills during short-term training and were better able to distinguish relevant from irrelevant sources.

Existing Research on Students' Assessment of the Credibility of Evidence

A few researchers have attempted to overcome the challenges that students face in weighing evidence to construct an explanation and make a decision. Some efforts concentrated on having students evaluate the differing relative strength of multiple pieces of evidence (Pluta et al., 2008), on prompting them to answer a series of yes–no questions for evaluating credibility (Baildon & Damico, 2009) and, in the case of pre-service teachers, on supporting them to provide a justification when creating a link between evidence and claims (Zembal-Saul et al., 2002). Wu and Hsieh (2006) indicated that students need to be introduced to the criteria by which explanations are evaluated. The scaffolding used in their study for supporting students in evaluating the credibility of explanations was an evaluation worksheet.
In evaluating the credibility of explanations, students' comments on the evaluation worksheets focused on superficial criteria rather than on the content and the quality of explanations; only a few groups mentioned the lack of evidence as a criterion, for example. Those researchers therefore suggested that the learning environment needs to provide more supports or explicit instruction on how evidence is evaluated. For example, the worksheets could include examples of scientific explanations to help students assess their quality.

Along the same lines of providing students with credibility criteria was Sandoval and Millwood's (2005) work, which likewise argued that students need an epistemic understanding of the criteria for a good explanation. Design principles derived from Seethaler and Linn's (2004) work referred to exposing students to building explanations in a content-rich domain with multiple forms of evidence, providing the opportunity to discuss the evidence with a community of peers and teachers, and, ideally, over the course of multiple curricula, having those communities develop criteria for the evaluation of evidence in multiple domains.

A Framework to Support Students' Assessment of the Credibility of Evidence

We suggest the use of a framework, which we call the "Credibility Assessment Framework," to support high school students' assessment of the credibility of evidence in the context of a socio-scientific inquiry-based investigation. The Credibility Assessment Framework builds on the theory of situated learning, with the aim of supporting designers in the development of learning activities.
According to the theory of situated learning in general, and this instructional design framework in particular, learning environments should provide authentic contexts, authentic activities, multiple perspectives, coaching and scaffolding by the teacher at critical times, and authentic assessment of learning within the tasks; they should support collaborative construction of knowledge; and they should promote reflection and articulation (Herrington & Oliver, 2000). All of these elements are incorporated in the Credibility Assessment Framework. The Credibility Assessment Framework is also based on the Learning-for-Use framework, as expressed by Edelson (2001), and the scaffolding design framework proposed by Quintana et al. (2004), according to which students assume an active role in their learning when they experience the need for new knowledge and proceed to construct and refine knowledge through working on an extended scientific inquiry investigation. The framework extends the work of Wu and Hsieh (2006), who recommended using evaluation criteria and providing support and explicit instruction to students on how evidence is evaluated.

The different components of the Credibility Assessment Framework are presented in the first column of Table 1. The second column of Table 1 transforms the components of the framework into