
A Sustainable Model for Integrating Current Topics in Machine Learning Research Into the Undergraduate Curriculum

IEEE TRANSACTIONS ON EDUCATION, VOL. 52, NO. 4, NOVEMBER 2009

A Sustainable Model for Integrating Current Topics in Machine Learning Research Into the Undergraduate Curriculum

Michael Georgiopoulos, Senior Member, IEEE, Ronald F. DeMara, Senior Member, IEEE, Avelino J. Gonzalez, Fellow, IEEE, Annie S. Wu, Mansooreh Mollaghasemi, Erol Gelenbe, Fellow, IEEE, Marcella Kysilka, Jimmy Secretan, Member, IEEE, Carthik A. Sharma, Member, IEEE, and Ayman J. Alnsour, Member, IEEE

Abstract—This paper presents an integrated research and teaching model that has resulted from an NSF-funded effort to introduce results of current Machine Learning research into the engineering and computer science curriculum at the University of Central Florida (UCF). While in-depth exposure to current topics in Machine Learning has traditionally occurred at the graduate level, the model developed affords an innovative and feasible approach to expanding the depth of coverage in research topics to undergraduate students. The model has been self-sustaining, as evidenced by its continued operation during the years after the NSF grant's expiration, and is transferable to other institutions due to its use of modular and faculty-specific technical content. This model offers a tightly coupled teaching and research approach to introducing current topics in Machine Learning research to undergraduates, while also involving them in the research process itself. The approach has provided new mechanisms to increase faculty participation in undergraduate research, has exposed approximately 15 undergraduates annually to research at UCF, and has effectively prepared a number of these students for graduate study through active involvement in the research process and coauthoring of publications.

Index Terms—Curriculum development, integrated research and teaching, machine learning, team teaching models, undergraduate research experiences.

I. INTRODUCTION

Current models of undergraduate research, such as Research Experiences for Undergraduate Students (REU), Honors theses, and senior-year projects, frequently serve as effective means to introduce undergraduate students to research [1]. However, these interactions can reveal challenges with regard to sustaining undergraduate research over an extended period of time [2]. The Sustainable Model for Assimilating Research and Teaching (SMART) at UCF integrates current research into the undergraduate curriculum through a course sequence that has propagated beyond an NSF-funded Combined Research and Curriculum Development (CRCD) award [3], [4]. SMART reaches a wide audience of undergraduate students who may not otherwise have considered well-established research programs for undergraduates, such as the NSF-funded Research Experiences for Undergraduates (REUs). The effort described here is a structured approach with a focus on Machine Learning (ML), spanning multiple faculty members with various ML research interests. This approach has encouraged undergraduate students to pursue graduate education while producing research results and outcomes which have advanced the professional development of the students and faculty members involved.

[Manuscript received October 4, 2007; revised June 20, 2008. First published June 16, 2009; current version published November 4, 2009. This work was supported in part by the National Science Foundation under Grant 0203446. M. Georgiopoulos, R. F. DeMara, A. J. Gonzalez, A. S. Wu, M. Mollaghasemi, M. Kysilka, J. Secretan, and C. A. Sharma are with the University of Central Florida, Orlando, FL 32816 USA (e-mail: michaelg@mail.ucf.edu). E. Gelenbe is with Imperial College, London SW7 2AZ, U.K. A. J. Alnsour is with the University of Central Florida, Orlando, FL 32816 USA, on leave from Al-Isra Private University, Amman, Jordan. Digital Object Identifier 10.1109/TE.2008.930511]
Furthermore, the approach presented here is designed to be a general one, applicable to almost any scientific or engineering discipline where it is desired to combine graduate research with undergraduate education to the benefit of both. This approach provides a template for readers in different disciplines to follow and create similar programs.

Faculty members will frequently work individually with undergraduate students on topics that are related to their own research. However, the proposed SMART approach provides research-oriented, team-taught course offerings that span multiple topics. This approach exposes undergraduate students to a wider breadth of research experiences. The team-taught course offerings benefit the faculty involved in this effort by encouraging collaboration of faculty with similar research interests, and by providing a structured and sustainable mechanism for recruiting undergraduate students into their graduate research teams. Additionally, these offerings provide a neutral, collaborative environment for senior faculty to mentor junior faculty in a non-intrusive fashion.

An overview of the SMART method is shown in Fig. 1. This framework was realized during the NSF CRCD grant's funding years of 2002 through 2005, and has been sustained thereafter. Part of the CRCD effort involved developing and teaching modules, such as appropriately chosen homework assignments, in required undergraduate courses to encourage students to register for the senior-level courses called Current Topics in Machine Learning I (CTML-I) and Current Topics in Machine Learning II (CTML-II). In CTML-I, the students learn the fundamentals of the current research topics from the faculty members who are co-teaching the course. In CTML-II, those students who continue participate in a hands-on research project. Students work one-on-one with a SMART faculty member, either individually or in small groups, along with an appropriately chosen graduate student mentor. During the NSF grant's funding period, an advisory board of faculty and industrial members acted as facilitators and evaluators of this effort, and provided valuable feedback leading to the SMART model. The CTML-I and CTML-II classes have been consistently taught since the fall semester of 2003, facilitating exposure to a significant number of undergraduate engineering and computer science students.

Fig. 1. SMART project framework.

II. RELATED WORK

While many faculty members strive to integrate their research into undergraduate experiences either on an individual basis or a research-team basis [5]–[7], the availability of a structured approach that spans multiple faculty and multiple semesters can be beneficial. The longer-term research relationships that are created between faculty members and undergraduate students through this approach can be synergistic with other initiatives, such as summer internship programs [2] and the NSF REU under the direction of a research professor. Initial student perception of the value of REU programs has been overwhelmingly positive [1]. However, the REU program is mostly centered around performance of research, with little time devoted to classroom learning on the research topic or methods. Furthermore, some have found that the 10-week duration of a summer REU experience may be insufficient to fully convey the essence of technical research that leads to publishable results [2], [8]. Team-based teaching has previously been integrated into undergraduate curricula on a number of topics, but quite often with the goal of encouraging a multidisciplinary approach [10], [11] or redistributing faculty workload [9]. On the other hand, team-teaching in CTML-I introduces students to a range of current ML research topics, as well as to the research styles of a variety of faculty members. This exposure can assist students in their decision to consider a research apprenticeship with one of these faculty members.
Several other CRCD projects have been funded by NSF, such as ones in particle technology at NJIT [12]; sensor materials at Ohio State [13]; optical sciences at NAT [14]; convex optimization for engineering analysis at Stanford [15]; and smart materials at Texas A&M [16], but none of these projects have focused on the creation of a portable, sustainable model. CRCD programs have the ability to immerse a student more fully because they avoid the time limitation of a summer term imposed by NSF REU programs. While the focus of SMART has been on ML, the model can be applied to other topics and at other institutions without the need for NSF funds to initiate it. This model requires only a small nucleus of faculty with similar research interests and the motivation to co-teach courses similar to the CTML-I and CTML-II courses described here.

III. RESEARCH AND CURRICULUM INTEGRATION APPROACH

The SMART initiative involves multiple mechanisms beyond those originally incubated by a CRCD award [17]–[22]. In the SMART approach, faculty members initiate the process via two alternative techniques. First, the availability of the program is publicized to students through seminars and workshops. Second, technical learning modules are delivered in select required undergraduate courses; these modules highlight current ML topics as application examples of material that students already learn, such as data structures. Both techniques attract undergraduate students to become involved in ML research, bootstrapping the integrated teaching and research method.

A. SMART Teaching and Research Methodology

As shown in Fig. 2, ML-related seminars, guest lectures, one-on-one interactions with students, and ML modules offered by SMART faculty members are some of the many vehicles used by SMART faculty to encourage students to register for the CTML-I and CTML-II senior-level courses.
CTML-I introduces students to research faculty and topics, and leads to CTML-II, where students engage in research projects advised by a faculty member who co-taught CTML-I. Both courses are electives in the degree program, and a number of disciplines in engineering and computer science allow their students to register for such technical electives. CTML-I emphasizes lecture-based instruction on current ML concepts of interest to the team of participating faculty, and CTML-II stresses hands-on research by undergraduate students working with a graduate student mentor while being actively advised by a faculty member. The course sequence helps to address challenges cited in alternate experiences with undergraduate research, especially recruitment of skilled students matched to faculty interests [23], [24].

Fig. 2. SMART activities to integrate ML research into education.

The broad cross section of research interests in ML makes it a suitable candidate for co-teaching of courses. In the School of Electrical Engineering and Computer Science at UCF, there are currently eight faculty members with significant interests in AI and ML, and at least three other faculty members who apply these techniques to applications. This grouping constitutes a sufficiently large nucleus of faculty expertise with diverse research interests to sustain the continual offering of CTML-I and CTML-II. Since initiation, two additional new faculty hires from the Computer Science program have voluntarily enlisted in the SMART initiative. Furthermore, the initial proposal effort included a faculty member from the Education Department who helped in the design of the evaluation instruments and in the assessment of the project's accomplishments.

B. SMART People and Timeline

In order to achieve the goal of introducing undergraduate students to leading-edge research in ML, two objectives are pursued.
The first objective is the creation and continuous offering of CTML-I and CTML-II, which have now become permanent listings in the university catalog. The second objective is the task of making students aware of the CTML-I and CTML-II opportunities; the most noteworthy mechanism for doing so was the creation of Machine Learning modules that can be inserted in select sophomore- and junior-level undergraduate classes.

The three-year timeline required to establish a self-sustaining program is depicted in Fig. 3. Various semesters, since fall 2002, have been devoted to course material development, project material development, as well as teaching, assessing, and improving the course content and educational practices. CTML-I and CTML-II have consistently been offered to conduct teaching, assessment, and improvement of both classes. As shown, most of the initial effort was expended upon the design of the educational materials for the research modules and the CTML-I lecture notes, as well as on the advising of the students in research projects assigned in the CTML-II course. The CTML-I class is taught each fall by a team of faculty, which allows each of them to provide students with the necessary background to join in that particular faculty member's current research efforts. The CTML-II class is taught each spring by the same faculty who taught CTML-I in the previous fall. Interested students from the CTML-I class, as well as a few new students, work one-on-one with a faculty member of their choice on an ML research project and may also receive mentoring from that faculty member's graduate students.

Fig. 3. SMART timeline of activities throughout the funding period to establish the model.

C. Curricular Content and Student Projects

1) Machine Learning Modules: The ML course modules applied throughout the sophomore- and junior-year undergraduate courses can stimulate student interest in ML topics through application examples of elementary technical concepts required for the degree program. Modules developed as part of the project introduce students to some widely used algorithms for ML and their underlying principles. As an ongoing effort, these modules were refined and improved based on feedback from the students. Examples are listed below:

• EEL 3801—Introduction to Computer Engineering—Module: "Learning the Trick of the Game called Nim"
• EGN 3420—Engineering Analysis—Module: "Perceptron-Based Learning Algorithms/The Pocket Algorithm"
• EEL 4851—Data Structures—Modules: "Graph and Network Data Structures for Evolvable Hardware" and "Inductive Learning Algorithms"
• COT 4810—Topics in Computer Science—Module: "Human GA: Learning Evolutionary Computation via Role Playing"

More details about the specifics of each of the aforementioned modules and student feedback are provided in [3]. Because of space limitations, this article focuses on the CTML-I and CTML-II classes. Advocacy of the SMART program by means of invited speakers, posters, and presentations to interested student groups, such as senior design students and attendees of graduate pre-recruitment seminars, has also had a positive impact on enrollment.

2) Current Topics in Machine Learning I: In any course, the tradeoff between breadth and depth must be considered. Traditional courses tend to focus on breadth, providing students with knowledge of many well-known and fundamental algorithms.
The CTML-I class, on the other hand, emphasizes depth in specific research areas in order to better prepare students to actively join an ongoing research project through a bottom-up learning approach. Depending on the particular faculty involved in CTML-I, the topics covered in the course may not span all traditional machine learning algorithms. The philosophy adopted, which has been quite successful, is that involvement in an actual ML research effort will spark student interest to investigate the breadth of ML algorithms further in the future.

The CTML-I course features introductions to the research topics presented on a rolling basis by a group of faculty. Each faculty member presents five twice-weekly lectures on their topic of expertise. The teaching materials for CTML-I are derived almost exclusively from peer-reviewed publications of the SMART faculty and their other publications, such as books or tutorials on Adaptive Resonance Theory (ART) neural networks [25] or decision trees [26]. For instance, the ART topic is elaborated below as an example.

Adaptive Resonance Theory (ART) Neural Networks: The students are first briefly introduced to neural networks, because some may not yet have been exposed to this topic from a corresponding course module. Next, the students are exposed to the motivation behind ART neural network architectures and their specific parameters. The lectures are then devoted to discussing a benchmark ART neural network architecture, called Fuzzy ARTMAP, which is extensively used in solving classification problems. By understanding Fuzzy ARTMAP, the student has the ability to quickly comprehend a number of other ART architectures. Furthermore, in the ART lectures, useful analogies are drawn between these basic ART architectures and other neural network architectures, such as multilayer perceptrons and radial basis function neural networks. Finally, successful applications of ART neural networks are discussed, and the students are encouraged to study additional ART-related papers.
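The core mechanics discussed in these lectures (complement coding, the category choice function, the vigilance test, and fast learning) can be illustrated with a compact sketch. The snippet below is a simplified, illustrative Fuzzy-ARTMAP-style classifier written for this article, not the course's code: the map field is reduced to one label per category, match tracking is simplified, and all names and parameter defaults are our own assumptions.

```python
class SimpleFuzzyARTMAP:
    """Minimal, illustrative Fuzzy-ARTMAP-style classifier (fast learning).

    A teaching sketch only: full Fuzzy ARTMAP uses a separate map field and
    more careful match tracking than is shown here.
    """

    def __init__(self, rho=0.7, alpha=0.001):
        self.rho_base = rho    # baseline vigilance in [0, 1]
        self.alpha = alpha     # small positive choice parameter
        self.w = []            # one weight vector per committed category
        self.labels = []       # class label associated with each category

    @staticmethod
    def _complement_code(a):
        # Input a in [0,1]^d becomes I = [a, 1-a], so |I| is constant (= d).
        return list(a) + [1.0 - x for x in a]

    @staticmethod
    def _fuzzy_and(u, v):
        # Component-wise minimum (the fuzzy AND used throughout ART).
        return [min(x, y) for x, y in zip(u, v)]

    def train_one(self, a, label):
        I = self._complement_code(a)
        rho = self.rho_base

        # Visit categories in order of the choice function
        # T_j = |I ^ w_j| / (alpha + |w_j|).
        def choice(j):
            return sum(self._fuzzy_and(I, self.w[j])) / (self.alpha + sum(self.w[j]))

        for j in sorted(range(len(self.w)), key=choice, reverse=True):
            m = self._fuzzy_and(I, self.w[j])
            if sum(m) / sum(I) < rho:
                continue                      # fails the vigilance test
            if self.labels[j] == label:
                self.w[j] = m                 # fast learning: w_j <- I ^ w_j
                return
            rho = sum(m) / sum(I) + 1e-6      # simplified match tracking

        self.w.append(I)                      # no suitable category: commit one
        self.labels.append(label)

    def predict(self, a):
        I = self._complement_code(a)
        scores = [sum(self._fuzzy_and(I, w)) / (self.alpha + sum(w)) for w in self.w]
        return self.labels[max(range(len(scores)), key=scores.__getitem__)]


# Example: two well-separated clusters in the unit square.
net = SimpleFuzzyARTMAP()
for a, y in [([0.1, 0.1], 0), ([0.9, 0.9], 1), ([0.15, 0.2], 0), ([0.8, 0.85], 1)]:
    net.train_one(a, y)
print(net.predict([0.12, 0.12]))  # → 0 (a point near the first cluster)
```

Each weight vector in complement coding encodes a hyperbox in input space; fast learning grows a category's hyperbox just enough to enclose a new matching input, which is why the analogies to other incremental learners drawn in the lectures are natural.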
Homework Assignments in CTML-I: Homework is assigned for every major topic discussed in the CTML-I class. These assignments are designed to reinforce some of the important concepts discussed in class, and range from paper-and-pencil exercises to running experimental simulations. For example, one such assignment involves walking through the process of a training cycle of a Fuzzy ARTMAP neural network for a simple example. Another assignment involves using existing Genetic Algorithm (GA) code to study the impact of parameter settings on GA performance.

Table I. Examples of student projects and publications.

3) Current Topics in Machine Learning II: The first two weeks are devoted to a discussion of the projects that the faculty advisors propose to the students as potential research projects. In each lecture, the challenges posed by, as well as the techniques needed to complete, the proposed project are presented. After this two-week period, the students choose a research project of interest and work with the associated faculty member on a one-to-one basis. Examples of projects include various ML applications, experimentation on novel ML approaches, and comparisons of two or more ML approaches on a class of application problems. The research conducted in CTML-II always leads to a formal report, occasionally leads to an honors thesis, and frequently leads to a peer-reviewed publication.

Student Research Projects in Current Topics in Machine Learning II: Students work on their chosen projects in groups of one to three. Each group of students is supervised by a faculty member and a graduate student mentor. Students are actively encouraged to form multidisciplinary groups to emphasize collaborative work. Projects are completed over a 12-week period under weekly supervision by a faculty member and more frequent interaction with a graduate student mentor.
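The GA parameter-study homework mentioned in the CTML-I section above can be sketched in a few lines. The course used its own existing GA code, which is not reproduced here; the snippet below is our own minimal generational GA on the standard "onemax" toy fitness (count of 1-bits), with illustrative operator and parameter choices, intended only to show the shape of such a parameter sweep.

```python
import random

def onemax(bits):
    """Toy fitness: the number of 1s in the bitstring."""
    return sum(bits)

def run_ga(pop_size, n_bits, mutation_rate, generations, seed=0):
    """Minimal generational GA: binary tournament selection, one-point
    crossover, and bit-flip mutation. Returns the best fitness found."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]

    def tournament():
        a, b = rng.sample(pop, 2)
        return a if onemax(a) >= onemax(b) else b

    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = tournament(), tournament()
            cut = rng.randrange(1, n_bits)            # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [1 - g if rng.random() < mutation_rate else g
                     for g in child]                  # bit-flip mutation
            nxt.append(child)
        pop = nxt
    return max(onemax(ind) for ind in pop)

# Parameter study in the spirit of the homework: sweep the mutation rate
# and observe its impact on the best fitness reached.
for rate in (0.0, 0.01, 0.1, 0.5):
    best = run_ga(pop_size=30, n_bits=40, mutation_rate=rate, generations=50)
    print(f"mutation_rate={rate:<4} best_fitness={best}")
```

Running the sweep makes the homework's point directly: with no mutation the population can stall on whatever genes the initial population happened to contain, while a very high rate degenerates into random search; a small nonzero rate typically performs best.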
Monthly course-wide meetings are conducted in which students report progress and receive feedback from all participating faculty and students about their research. Students present their work incrementally at three presentation milestones. In the first presentation, students present a literature survey, requirements overview, proposed technical approach, and schedule. In the second presentation, held one month later, students present an overview of the design progress to date, and solicit advice for possible solutions to technical issues from other student groups and faculty mentors. In the third presentation, held during final exam week, students present results and conclusions. Students must submit a final project report on their work using the IEEE conference article format. The quality of the presentations, the technical report, and the interactions of faculty and graduate student mentors with the student contribute to the grade of the student in the CTML-II class.

Table II. Number of students in the CTML-I and CTML-II courses; denotes years of CRCD NSF funding.

Table I lists some examples of projects completed by students in CTML-II. Project 3 is an example of a joint effort by an undergraduate student and a graduate student mentor involving the parallelization of Fuzzy ARTMAP on a Beowulf cluster, which improved the convergence speed of the training process on large databases. The undergraduate student implemented Fuzzy ARTMAP on the Beowulf cluster and generated experimental results that demonstrated its effectiveness. The results were published in two conference papers and two journal papers, all of which were coauthored by the undergraduate student. This student was later accepted into the Ph.D. program at UCF and received an NSF Graduate Research Fellowship, one of the most prestigious fellowships in the nation in recognition of student research potential.

IV. RESULTS AND ASSESSMENT

Assessment of the SMART model for undergraduate research and curriculum begins with measuring the effectiveness of student recruitment and retention. Table II depicts the number of students who have registered for the CTML-I and CTML-II courses each semester. A total of 97 students have completed or are in the process of completing these courses. Since some of the students took both CTML-I and CTML-II, 77 distinct students have been introduced to research through the CTML-I and CTML-II sequence over this five-year span. The effectiveness of the CTML-I and CTML-II courses can be gauged by a number of indirect measures, such as student survey questionnaires, and direct measures, such as presentations and