EventScope: Amplifying Human Knowledge and Experience via Intelligent Robotic Systems and Information Interaction

Peter W. Coppin, Richard Pell, Michael Wagner, John R. Hayes, Junlei Li, Liza Hall, Karl Fischer, David Hirschfield and William "Red" Whittaker
The EventScope Project
STUDIO for Creative Inquiry / Robotics Institute
Carnegie Mellon University
Pittsburgh, PA 15213

Abstract

The EventScope program develops publicly accessible "reality browsers" that display both archived and updating representations of remote environments derived from on-site robotic sensors. The interface encourages collaborative work within a community of users. Public exploration of real remote sites presents a variety of interface issues addressed by EventScope, including time delay, public exploration via a single robot, and communication between geographically separate users from diverse backgrounds. Merging a public interface with educational and contextual information extends the notion of "interface" to "remote reality library." EventScope is a NASA and private foundation-funded project based at Carnegie Mellon University.

1. Introduction

Publicly funded Earth and planetary exploration is conducted to increase knowledge of our universe. The public traditionally accesses this knowledge passively, through the media. However, the development of the Web and of robotic remote-sensing technology now makes it possible for students and the public to actively participate in the scientific process. EventScope uses remote and autonomous robotics in conjunction with interface and technology design to create "reality browsers" that allow geographically decentralized users to engage in scientific processes via active exploration of telerobotic mission sites. EventScope builds on robotic interface work created for use by NASA scientists and engineers, such as visualization for the NASA Mars Pathfinder Mission [1] and the NASA Chernobyl Mapping Expedition [2].

1.1 Previous Projects

EventScope extends interface technology concepts to the public through educationally oriented projects. Although these projects are designed as curriculum supplements for secondary education, they are presented on publicly accessible Web sites. As such, any member of the public in any country has access to the current and archived NASA telerobotic missions upon which EventScope bases its projects. Previous projects include:

Big Signal Antarctica 1998 and 2000 -- A Web and technology portal [3] linking the public to the NASA/CMU robotic rover Nomad [6] on its field trials in Antarctica, this project combined live and archived robotic sensor data with daily updates from mission scientists, contextual information, and a "rock library" tool that enabled users to compare the rocks Nomad found with previously identified samples. The project reached more than 1,000 middle-school students in over 30 classrooms. It was profiled in the New York Times and Britain's New Scientist, on CNN.com, Yahoo.com and other Web sources [4]. It is also scheduled to appear in an upcoming issue of the ASME journal Mechanical Engineer.

EventScope: Mars -- Designed to enable public access to NASA Mars data gathered by robotic rovers, probes and orbiters, EventScope: Mars will enable students and the public to explore 3-D virtual representations of the Martian surface derived from NASA data sets that are otherwise unavailable to the public. Data from future successful missions will be incorporated as the missions take place, potentially enabling live data to be used. Tools and technologies developed for Big Signal Antarctica 2000 will be refined, and a curriculum is currently being developed to situate the Mars data within a scientific and historical context [5].

1.2 The EventScope Team and Process

The EventScope team embodies the principle of interdisciplinary collaboration within an iterative design process. The team includes experts in robotics, human-computer interaction, programming, verbal and visual communications, education, cognitive psychology and design. Each project also incorporates contributions from experts in relevant fields: previous and current projects have included Earth and planetary geologists, astronomers and NASA engineers. The team anticipates completing one additional project per year. The current NASA mission to map the asteroid Eros is a strong candidate for inclusion in an EventScope project, as are upcoming Earth-based field trials for new NASA robotic rovers, including FIDO.

An iterative design process is used to develop the major aspects of our work: technology, interface and content. When a project is in development, team members rapidly prototype each of these three aspects, presenting them to the team as a whole for comment. Comments are then integrated into a new prototype, which is presented to the team the following week. This process is repeated until all aspects of the project are sufficiently developed to conduct an external pilot test. We conduct full pilot tests in classrooms once per school semester, excluding summer, which averages out to once every six months. In addition, workshops with teachers are held prior to pilot testing in order to further refine the interface and content. This process instills great flexibility, enabling problems to be identified and corrected before they can affect the project as a whole.

2. The EventScope Interface

Public exploration of remote sites presents a range of interface challenges. The distance separating users from remote robots creates issues of time delay and non-continuous data streams. Interfacing many users through a single robotic vehicle while maintaining each user's sense of exploration and control is another challenge. Other major issues include determining the degree of control and real interaction to give users and the means of distributing that control, sorting data to streamline the interface, enabling collaboration between users and scientists, linking robotic-sensor data with archived data, creating coherent ways of allowing independent navigation of vast amounts of information, and representing remote sites both spatially and over time.

The interaction paradigms developed by EventScope address these issues, enabling a range of possible interaction scenarios: many users/one robot, many users/multiple robots, users/users, and users/scientists. Each project is publicly accessible via the Web, but the associated curricula and educational outreach allow teachers to easily incorporate projects into their classrooms. Thus, "user" can designate any individual: a member of the public, a teacher, a student, a scientist or engineer, and so on. The following sections discuss interface issues and the interface features that enable intelligent robotic systems to act as amplifiers of human knowledge.

2.1 Public Control vs. Public Outreach

At present, robots remain costly and rare enough that allowing individual users true control of a robot is not compatible with the goal of bringing robotics to large numbers of people. However, showing people robots they cannot control is something TV and other non-interactive media already do; it is not new. Fortunately, the rapid evolution of consumer-level computers, coupled with the capabilities of the Web, offers a third possibility. If a real-world environment and a robot are recreated digitally as 3D models, then each user can download a copy to use as they see fit. The user experiences control and autonomy, in that he or she can explore the remote site at will and direct the robot to do anything that is physically possible for it to do. This may include inadvertently damaging the robot or terminating the mission: if a given action would cause the robot to fall over at the real remote site, for example, the virtual robot will fall over. In EventScope: Mars, students who miscalculate the launch of a Mars probe will not reach Mars. Users can thus experience the whole range of possible outcomes of a telerobotic mission without harming a real robot or losing a real mission.

However, this paradigm is not the only one in which EventScope is interested. Current investigation includes researching methods of allowing users a degree of control in a live mission. These methods might include a democratic model, in which the actions of each user within their 3D virtual world are monitored and uploaded to EventScope, with the most popular actions being forwarded to live-mission scientists for implementation; a sketch of this model appears below. A competitive model is also possible, in which users or user teams are selected according to the success of their performance within the 3D world or the scientific merit of their suggestions. In either case, close cooperation with the live-mission team of scientists and engineers would be necessary. We are currently partnering with a range of robotics projects in development, with the intention of pilot testing some of these possible models of control.
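To make the democratic model concrete, the Java sketch below shows one minimal way a server could tally the actions users propose from their local 3D worlds and surface the most popular one for the mission team. This is an illustration under our own assumptions, not EventScope's implementation; the class and method names are hypothetical.

    import java.util.HashMap;
    import java.util.Map;

    // Hypothetical server-side tally for the democratic control model:
    // each user's proposed rover action is counted, and the most popular
    // action is forwarded to the live-mission scientists.
    public class ActionPoll {
        private final Map<String, Integer> votes = new HashMap<String, Integer>();

        // Record one user's proposed action, e.g. "drive to rock 7".
        public synchronized void submit(String action) {
            Integer n = votes.get(action);
            votes.put(action, n == null ? 1 : n + 1);
        }

        // Return the action with the most votes so far, or null if none.
        public synchronized String mostPopular() {
            String best = null;
            int bestCount = 0;
            for (Map.Entry<String, Integer> e : votes.entrySet()) {
                if (e.getValue() > bestCount) {
                    best = e.getKey();
                    bestCount = e.getValue();
                }
            }
            return best;
        }
    }

A competitive model would replace the popularity count with a per-user or per-team score, but the aggregation structure would be similar.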
2.2 Spatial/Chronological Maps

As data are gathered by the robot, we assemble them into a spatial map and an associated timeline. These evolve throughout the mission, allowing users to view new information as it is gathered and examine earlier information for comparison. The remote site is always available for users to explore, and each "databurst" adds more exploration possibilities. The robot gathers data once, and EventScope converts the data into two-dimensional and three-dimensional spatial and chronological maps, which are then assembled into a virtual environment: a spatial and chronological recreation of the robotic mission. This environment can be explored in the order in which the robot explored it, or out of order. Users can thus follow the mission in real time, join it at their convenience, or re-live the mission after it is completed. This resolves two issues: time lag and user scheduling.

Fig. 1: Big Signal Antarctica 2000 interface. Icons refer to time and spatial events (bottom); context libraries provide background information (top).
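One simple way to realize such a chronological map is to index each databurst by its acquisition time, so the mission can be replayed in order or joined at any point. The sketch below is our own illustration of that idea, not EventScope code; the Databurst fields are hypothetical.

    import java.util.NavigableMap;
    import java.util.TreeMap;

    // Illustrative mission timeline: each incoming "databurst" is indexed
    // by mission-elapsed time, so the site can be replayed in mission
    // order or browsed out of order.
    public class MissionTimeline {
        private final NavigableMap<Long, Databurst> bursts =
            new TreeMap<Long, Databurst>();

        public void addBurst(long missionTime, Databurst burst) {
            bursts.put(missionTime, burst);
        }

        // Replay in the order the robot explored the site.
        public Iterable<Databurst> inMissionOrder() {
            return bursts.values();
        }

        // Everything that had arrived by a given moment, for users who
        // join late or re-live the mission after completion.
        public Iterable<Databurst> upTo(long missionTime) {
            return bursts.headMap(missionTime, true).values();
        }
    }

    // Placeholder for one bundle of sensor data; fields are hypothetical.
    class Databurst {
        double x, y;        // position on the spatial map
        byte[] sensorData;  // imagery, spectra, etc.
    }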
2.3 Context Libraries

The rock library in Big Signal Antarctica 2000 allowed users with little knowledge of geology to understand and participate in Nomad's work. The context library being developed for EventScope: Mars will build on users' knowledge of Earth features (volcanoes, riverbeds, etc.) as a bridge toward assimilating knowledge of the Martian landscape. Each project's context library blends prior knowledge with an interface designed to let users make new discoveries for themselves. In addition to building the knowledge base of student users, this facilitates communication between users of different skill and knowledge levels. More knowledgeable users may use the context libraries as a reference when communicating with student or lay users, while less knowledgeable users may explore the context libraries in depth as part of an overall curriculum.

2.4 Flags

Users with comments or questions about a specific feature of the remote site (e.g., a particular rock) can place a "flag" on that feature. Other users can see the flag, read the comment or question, and answer it or add their own. The interface allows each user to pre-set flag visibility to different levels: "view all," "view flags from my school," and so on. This collaboration-enhancing feature is being developed for EventScope: Mars. Users who wish to plant a flag simply move the mouse across the screen and click on the object. Java 3D then shoots a "ray" into the onscreen 3D model, and whatever the ray intersects becomes the point where the flag appears. Comments, questions and other information entered by users take the form of Java objects; a sketch of this picking technique follows Fig. 2.

Fig. 2: Flags link to users' comments and external or internal Web sites.
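The sketch below shows how this pick-ray mechanism might look using the standard Java 3D picking utilities, assuming a Canvas3D and scene BranchGroup already exist. The FlagStore interface and the comment prompt are hypothetical stand-ins for EventScope's own flag objects.

    import java.awt.event.MouseAdapter;
    import java.awt.event.MouseEvent;
    import javax.media.j3d.BranchGroup;
    import javax.media.j3d.Canvas3D;
    import javax.vecmath.Point3d;
    import com.sun.j3d.utils.picking.PickCanvas;
    import com.sun.j3d.utils.picking.PickResult;
    import com.sun.j3d.utils.picking.PickTool;

    // Hypothetical hook for storing a flag at a 3D location.
    interface FlagStore {
        void addFlag(Point3d location, String comment);
    }

    public class FlagPlacer extends MouseAdapter {
        private final PickCanvas pickCanvas;
        private final FlagStore flags;

        public FlagPlacer(Canvas3D canvas, BranchGroup scene, FlagStore flags) {
            this.pickCanvas = new PickCanvas(canvas, scene);
            // Intersect actual geometry, not just bounding volumes, so the
            // flag lands exactly where the ray hits the terrain model.
            this.pickCanvas.setMode(PickTool.GEOMETRY_INTERSECT_INFO);
            this.pickCanvas.setTolerance(4.0f);
            this.flags = flags;
        }

        public void mouseClicked(MouseEvent e) {
            pickCanvas.setShapeLocation(e);        // ray through the click point
            PickResult result = pickCanvas.pickClosest();
            if (result != null) {
                // World coordinates of the nearest intersection become the
                // anchor point for the user's comment or question.
                Point3d hit = result.getIntersection(0).getPointCoordinatesVW();
                flags.addFlag(hit, promptUserForComment());
            }
        }

        private String promptUserForComment() { /* dialog omitted */ return ""; }
    }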
2.5 Linking EventScope's Navigation Tool with the Web

The interface's open-ended structure links real remote experience to Web-archived information. Users can click on features of the remote environment and trigger an associated Web page to open. This page may deliver contextual information, allow messaging or live chats between users, or show a more detailed view of the feature. Users may also change the display mode of the 3D viewer from an EventScope Web page. For example, users can alternate between geocentric (ancient, incorrect) and heliocentric (modern, correct) views of the Solar System to observe how the trajectories of planetary orbits differ between the two views.

To invoke the Web browser from within the 3D viewer, a Windows-native system call for resolving a URL and opening an application is used from within Java. To activate changes within the 3D viewer from a Web page, the browser calls a script which opens a mini 3D viewer; this mini viewer communicates with the main 3D viewer through Remote Method Invocation. A sketch of both mechanisms appears below.
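The paper does not give the exact calls, so the following is a hedged reconstruction. A common Windows-era idiom for opening the default browser from Java is the rundll32 FileProtocolHandler call; the ViewerControl interface and the "EventScopeViewer" registry name are hypothetical examples of how the RMI link between the mini viewer and the main 3D viewer could be expressed.

    import java.io.IOException;
    import java.rmi.Naming;
    import java.rmi.Remote;
    import java.rmi.RemoteException;

    public class WebLink {
        // Open the user's default browser on a URL via a Windows-native call.
        public static void openBrowser(String url) throws IOException {
            Runtime.getRuntime().exec(
                new String[] {"rundll32", "url.dll,FileProtocolHandler", url});
        }
    }

    // Remote interface the main 3D viewer would export over RMI; the mini
    // viewer launched from a Web page looks it up and calls it.
    interface ViewerControl extends Remote {
        void setDisplayMode(String mode) throws RemoteException;
    }

    class MiniViewer {
        public static void main(String[] args) throws Exception {
            // Assumes the main viewer has bound itself in a local RMI
            // registry under this (hypothetical) name.
            ViewerControl main =
                (ViewerControl) Naming.lookup("rmi://localhost/EventScopeViewer");
            main.setDisplayMode("heliocentric"); // e.g. switch Solar System view
        }
    }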
3. Conclusions

The current work of the EventScope team involves using robotics and the Web to offer the public a new paradigm of human/information and human/machine interaction. We create contextualized interfaces through which humans can project themselves into robots at remote locations, not only exploring locations they cannot physically explore, but also experiencing those locations as only the robot can: through robotic sensing devices. This paradigm is currently available only to an elite group of scientists; our goal is to extend it to the general public. We accomplish this in two ways: by creating software that mimics the functionality of high-end scientific software but runs on standard consumer-level computers, and by contextualizing the remote sites in an educational, user-friendly way. The applications of this work are obvious: education and training, information dissemination, and potentially even entertainment.

However, as robotic sensing devices and fully functional robots become more widespread, the potential applications will multiply. Our ultimate goal is to work with industry to define the standard for a "mediated reality browser," bringing real remote experience into classrooms, households and workplaces worldwide. The current model of human/information interaction will expand to include machine-mediated human/reality interaction: the boundary separating the "data space" of users and their computers from the "real world" will become seamless through robotic interaction with the physical environment. Robots will act as information explorers, experience gatherers, and agents of action in the physical world. This is the human/robot interaction paradigm we are now working toward.

4. References

[1] C. Stoker, E. Zbinden, T. Blackmon and L. Nguyen, "Visualizing Mars Using Virtual Reality: A State of the Art Mapping Tool Used on Mars Pathfinder," Extraterrestrial Mapping Symposium: Mapping of Mars, ISPRS, Caltech, Pasadena, CA, July 1999.

[2] T.T. Blackmon, L.M. Nguyen, C.F. Neveu, et al., "Virtual Reality Mapping System for Chernobyl Accident Site Assessment," SPIE, San Jose, CA, Feb. 27, 1999.

[3] P. Coppin, A. Morrissey, M. Wagner, M. Vincent and G. Thomas, "Big Signal: Information Interaction for Public Telerobotic Exploration," Proceedings of the Current Challenges in Internet Robotics Full-Day Workshop, IEEE International Conference on Robotics and Automation, Detroit, MI, May 1999. Also available at http://www.cs.berkeley.edu/~paulos/papers/icra99/

[4] See the following sources:
Headlam, Bruce. "Accompanying a Robot on an Antarctic Quest." New York Times, 2/2/2000. (www.nytimes.com/library/tech/00/02/circuits/articles/03geek.html)
Marchant, Joanna. "Polar Pioneer." New Scientist, London, 1/29/2000.
"Nomad Combs No-Man's Land for Meteorites." CNN News Online, 1/24/2000. (www.cnn.com/2000/NATURE/01/24/nomad.meteors.enn/index.html)
"Carnegie Mellon-Based Interactive Web Site 'Big Signal' Allows Public to Explore Antarctica Through a Robot's Senses." Yahoo News Online, 1/27/2000. (http://biz.yahoo.com/prnews/000127/pa_carnegi_1.html)

[5] EventScope: www.eventscope.org. Big Signal: www.bigsignal.net.

[6] D. Apostolopoulos, M. Wagner and W.L. Whittaker, "Technology and Field Demonstration Results in the Robotic Search for Antarctic Meteorites," Proceedings of the International Conference on Field and Service Robotics, August 1999, pp. 185-190.