Handbook of Research on Instructional Systems and Educational Technology
A volume in the Advances in Educational Technologies and Instructional Design (AETID) Book Series

Handbook of Research on Instructional Systems and Educational Technology
Terry Kidd, University of Houston-Downtown, USA
Lonnie R. Morris, Jr., The Chicago School of Professional Psychology, USA
A volume in the Advances in Educational Technologies and Instructional Design (AETID) Book Series

Published in the United States of America by IGI Global
Information Science Reference (an imprint of IGI Global)
701 E. Chocolate Avenue, Hershey PA, USA 17033
Tel: 717-533-8845 | Fax: 717-533-8661
E-mail: cust@igi-global.com | Web site: http://www.igi-global.com

Copyright © 2017 by IGI Global. All rights reserved. No part of this publication may be reproduced, stored or distributed in any form or by any means, electronic or mechanical, including photocopying, without written permission from the publisher. Product or company names used in this set are for identification purposes only. Inclusion of the names of the products or companies does not indicate a claim of ownership by IGI Global of the trademark or registered trademark.

Library of Congress Cataloging-in-Publication Data
Names: Kidd, Terry T., author.
Title: Handbook of research on instructional systems and educational technology / Terry Kidd and Lonnie R. Morris Jr., Editors.
Description: Hershey PA : Information Science Reference, [2017]
Identifiers: LCCN 2017001496 | ISBN 9781522523994 (hardcover) | ISBN 9781522524007 (ebook)
Subjects: LCSH: Instructional systems--Design. | Web-based instruction. | Educational technology.
Classification: LCC LB1028.38 .K54 2017 | DDC 371.3--dc23

British Cataloguing in Publication Data
A Cataloguing in Publication record for this book is available from the British Library.

All work contributed to this book is new, previously unpublished material. The views expressed in this book are those of the authors, but not necessarily of the publisher. This book is published in the IGI Global book series Advances in Educational Technologies and Instructional Design (AETID) (ISSN: 2326-8905; eISSN: 2326-8913).

Chapter 8
Structuring Online Instruction by Dynamic Design, Delivery, and Assessment
Selma Koc, Cleveland State University, USA
Marius Boboc, Cleveland State University, USA
DOI: 10.4018/978-1-5225-2399-4.ch008

ABSTRACT

Over 900 colleges and universities across the U.S. have adopted the Quality Matters Rubric for the design of their online courses, with the intention of providing guidance to both instructors and peer reviewers. Given the challenge of aligning design components with Web-based instruction delivery in terms of interactivity and formative assessment, there is a need for guidelines that establish a strong connection between design and delivery. Such information could support a dynamic, balanced, and student-centered approach to instructional development in virtual learning environments. This chapter proposes a matrix built on the linkage among well-established design practices, delivery methods or strategies, and assessment routines.

INTRODUCTION

A number of quality assurance programs for online courses have been developed over the years. The most widely adopted is the Quality Matters™ (QM) Rubric, whose standards are used by over 900 colleges and universities to ensure student success in online learning (Quality Matters, 2015). Not only do the QM standards help faculty design their online courses, but they also emphasize continuous improvement and consistency in the quality of online learning at the individual course and institutional levels.
The most recent iteration of the QM Rubric identifies eight general standards for designing online courses along the following criteria: course overview and introduction, learning objectives, assessment and measurement, instructional materials, course activities and learner interaction, course technology, learner support, and accessibility and usability. These critical areas of interest are supported by 43 Specific Review Standards, 21 of which are considered essential, meaning that failing to meet any standard in this latter set would result in a course not being QM certified. The QM Rubric (2015) emphasizes the alignment of course materials, activities, and course objectives. The associated standards prompt online faculty, instructional designers, and institutions at large to refine the design of their course/program offerings for virtual delivery by using a complex peer review system. Research reviewed by Woods (2014) indicates that applying the QM review process results in greater student learning outcomes that rely on stronger, clearer connections to course objectives as well as assessment tools. Similar positive results have been reported by other researchers who completed recent studies (Little as well as Puzziferro & Shelton, as cited in Roehrs, Wang, & Kendrick, 2013, p. 55).
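The certification gate described above, where failing any one "Essential" specific review standard blocks certification, can be modeled in a few lines. This is a minimal sketch, not an official QM scoring tool; the function name `qm_certifiable` and the standard IDs in the example are hypothetical.

```python
# Hedged sketch of the QM certification gate summarized in this chapter:
# a course is certifiable only if every "Essential" specific review
# standard is met. Standard IDs below are illustrative, not the rubric.

def qm_certifiable(review_results: dict, essential_ids: set) -> bool:
    """Return True only if every essential specific review standard was met.

    review_results maps a standard ID to whether the peer review found it met;
    an essential standard absent from the results counts as not met.
    """
    return all(review_results.get(std, False) for std in essential_ids)

# Illustrative peer-review outcome for a few hypothetical standards.
essentials = {"1.1", "2.1", "3.1"}
review = {"1.1": True, "2.1": True, "3.1": False, "4.1": True}

print(qm_certifiable(review, essentials))  # False: essential 3.1 not met
```

Note that point totals for "Very Important" standards would sit on top of this gate; the gate alone decides whether certification is even possible.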
As an illustration, faculty who participated in a research project involving QM rubric training, and who then reviewed and updated their respective courses, reported that the process was useful because it prompted them to enhance the learning experiences of their online students (Roehrs, Wang, & Kendrick, 2013).

In this chapter, the authors propose a matrix - Dynamic Design, Delivery and Assessment (3DA) - built on the connection among well-established design practices (as guided by QM processes), delivery methods or strategies, and assessment routines (see Table 1 below). The number of students taking online courses has been growing continuously, reaching over 7 million in 2012, based on enrollment in at least one Web-based class (Allen & Seaman, 2014). Consequently, the focus has shifted from developing infrastructure to ensuring effectiveness (McKnight, 2004). The inherent paradigm change from quantity to quality (Liu & Johnson, 2004) implies bridging the apparent gap between the design and the delivery of online instruction (Southard & Mooney, 2015). Under these circumstances, the proposed theoretical matrix correlates the aforementioned elements in a bidirectional manner by grounding them in teacher presence as well as student social and cognitive presence, as outlined by Garrison, Anderson, and Archer (2000). Instructors who plan to design online courses, or who wish to improve their teaching, can use this matrix to make the theoretical and practical connections between design, delivery, and assessment.

BACKGROUND

Designing an online course needs to be based on a systems approach that considers all aspects of online instruction. Faculty who teach online or plan to teach online can benefit from a dynamic, balanced, and student-centered approach to the design, delivery, and assessment of instruction.
As online course effectiveness depends a great deal on instructional design (Gunawardena, Ortegano-Layne, Carabajal, Frechette, Lindemann, & Jennings, 2006; McGahan, Jackson, & Premer, 2015), rubrics or standards such as QM or iNACOL, as well as faculty professional development programs, can be supported by this systematic approach to teaching online.

The QM program features rigorous training for faculty interested in teaching online, relying on a peer-review system to improve the design of virtual learning environments. The peer review process and built-in feedback loop represent critical components of the continuous improvement cycle supported by QM (Schwegler & Altman, 2015). As a faculty-driven process connecting outcomes, objectives, and assessments (Swan, Day, Bogle, & Matthews, 2014), the fifth edition of the QM Rubric consists of 43 specific review standards distributed across the eight general standards mentioned earlier. There are 21 "Essential" standards worth three points each and 14 "Very Important" standards.

Table 1. Structuring online instruction by dynamic design, delivery and assessment (3DA)

1. Course overview and introduction (QM Review Standards 1.1-1.9)
   - Design: Establish social presence to support community building.
   - Assessment: Pre-assessment of student background/skills; formative assessment focused on community building.
   - Practice notes: Move beyond the initial introduction to connect individuals to course content and to each other; use online interactivity to connect social presence and cognitive presence; develop a community of inquiry; transfer skills/strategies from face-to-face to online settings.

2. Clear and measurable learning objectives (QM Review Standards 2.1-2.5)
   - Design: Connect learning objectives, activities, and assessments during instruction; facilitate cross-curricular connections; promote meaningful curriculum-driven interactions among community of inquiry members in the online environment.
   - Delivery: Communicate learning objectives to students frequently; develop students' awareness of learning objectives to build metacognitive skills and reflective learning.
   - Assessment: Through formative assessment techniques (e.g., Minute/Muddy Point papers, exit slips, or journaling), prompt students to reflect on questions such as "How is my learning related to the course/learning objectives?", "What have I learned?", and "What do I need to revisit or have I not understood?"

3. Assessment strategies aligned with learning objectives, as they measure student progress and learning (QM Review Standards 3.1-3.5)
   - Design: Develop teacher and student presence (social, cognitive, emotional) supportive of assessment as learning (Earl, 2003).
   - Assessment: Formative assessment focused on teacher and student presence (social, cognitive, emotional); student self- and peer-assessment.
   - Practice notes: Instructors use formative assessment data to refine subsequent instruction and assessment procedures; instructors and students collaboratively develop scoring rubrics to promote cognitive and social presence.

4. Instructional materials are comprehensive and aligned to course objectives (QM Review Standards 4.1-4.6)
   - Design: Reinforce and extend learning by covering the full extent of Bloom's taxonomy (Bloom et al., 1956; Krathwohl, 2002); promote cross-curricular connections guided by learning objectives; emphasize applications of the curriculum to broad-based student engagement built on students' prior knowledge and experience.
   - Assessment: Formative assessment focused on student engagement and meaning-making; academic help seeking.
   - Practice notes: Constant interaction with the curriculum within a community of inquiry relies on frequent evaluations of the quality of teaching and learning; model and promote students' use of academic help-seeking tools and mechanisms.

5. Course interactivity motivates students and promotes learning (QM Review Standards 5.1-5.4)
   - Design: Employ questioning strategies and engaging discussions based on learning objectives; develop and sustain a community of inquiry; motivate and sustain social and cognitive presence.
   - Assessment: Formative assessment focused on teacher and student presence (social, cognitive, emotional); student self- and peer-assessment.
   - Practice notes: Move beyond instructor-driven questioning; promote student-initiated/mediated inquiry; ensure that interactivity supports student cognitive presence prompted by formative assessment; blend formal and informal learning.

6. Course technologies support learners' achievement of course objectives (QM Review Standards 6.1-6.5)
   - Design: Promote cognitive presence through meaningful, content-focused interactions.
   - Assessment: Continuous review and improvement.
   - Practice notes: Connect and evaluate technology, content, and pedagogy; promote interface/learning platform interactivity; embed formative assessment.

7. Learner support services are identified (QM Review Standards 7.1-7.4)
   - Design: Model and promote help-seeking behaviors.
   - Assessment: Diagnostic assessment.
   - Practice notes: Provide technical help-seeking tools, such as how-to videos or modules, a Help Wall, or FAQs; emphasize academic help-seeking behaviors.

8. Accessibility and usability for all students are ensured (QM Review Standards 8.1-8.5)
   - Design: Establish social presence.
   - Assessment: Diagnostic assessment.
   - Practice notes: Provide technical help-seeking tools, such as how-to videos or modules, a Help Wall, or FAQs; emphasize academic help-seeking behaviors.
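For readers who maintain course-audit tooling, the 3DA matrix in Table 1 lends itself to a simple lookup structure: each QM general standard maps to its design, assessment, and practice-note entries. This is an illustrative sketch only; the names `THREE_DA` and `practices_for` are hypothetical, and the abbreviated entries stand in for the full table text.

```python
# Hedged sketch: Table 1's 3DA matrix encoded as a lookup structure, so an
# instructor or audit script can retrieve the practices tied to one QM
# general standard and one phase. Entries abbreviate the table's wording.

THREE_DA = {
    "1. Course overview and introduction": {
        "qm_standards": "1.1-1.9",
        "design": ["Establish social presence to support community building"],
        "assessment": ["Pre-assessment of student background/skills",
                       "Formative assessment focused on community building"],
        "practice_notes": ["Develop a community of inquiry"],
    },
    "5. Course interactivity": {
        "qm_standards": "5.1-5.4",
        "design": ["Employ questioning strategies tied to learning objectives"],
        "assessment": ["Student self- and peer-assessment"],
        "practice_notes": ["Promote student-initiated inquiry"],
    },
}

def practices_for(standard: str, phase: str) -> list:
    """Look up the 3DA practices for one QM general standard and phase.

    Unknown standards or phases return an empty list rather than raising.
    """
    return THREE_DA.get(standard, {}).get(phase, [])

print(practices_for("5. Course interactivity", "design"))
```

Encoding the matrix this way keeps the design-delivery-assessment linkage explicit and machine-checkable, which mirrors the chapter's argument that the three phases should be planned together rather than in isolation.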