Big Signal: information interaction for public telerobotic exploration

Peter Coppin*, Alexi Morrissey*, Michael Wagner*, Matthew Vincent**, Geb Thomas**

* The Big Signal Initiative, The Robotics Institute, Carnegie Mellon University, Pittsburgh, PA 15213, http://www.bigsignal.net/
** The GROK Lab, Department of Industrial Engineering, University of Iowa, Iowa City, IA 52242, http://grok.ecn.uiowa.edu/

Abstract

Obstacles must be overcome to make public distribution of interactive remote experiences via the Internet viable. High latency, and the fact that there are many more users than robots, make traditional forms of telerobotics difficult. The Big Signal project seeks to overcome these obstacles using information interaction tightly coupled to a live autonomous rover mission. Information interaction allows users to engage in a rich exploratory experience without affecting the robotic mission. Additionally, information interaction adheres to the Internet-standard client/server model, which allows many users to interact with one set of data. In December 1998, Big Signal deployed a prototype project by providing an educational interface that allowed students and the public to participate in remote telescience.

1 Introduction

Factors of success for the Internet are obstacles for Internet telerobotic exploration. Interactive television, video telephones, and the like were the subject of science fiction stories and research labs for years, until the low data requirements and decentralized client/server configuration of the Internet converged with modem speeds that could accommodate primarily text-based information distribution. Soon, image compression techniques allowed digital images to be transferred within a reasonable amount of time.
Now virtual communities and computer gaming allow a decentralized public to interact in virtual space using standardized Internet technologies. These attributes, which are prime components of success for the transfer of Internet data packets, are the coffin nail for a remote user attempting to engage in a real-time telerobotic scenario.

1.1 Problem: Communication Bottlenecks

Controlling an electromechanism in the real world in real time requires monitoring a situation remotely through sensors, sending control commands to the electromechanism, and finally observing the result in enough time to direct the physical machine to its goal. In the case of the Internet, direct closed-loop telerobotic control is difficult because many users participate over low-bandwidth connections such as modems. An Internet-based remote experience must be structured in such a way that time lags do not affect the exploratory experience.

1.2 Problem: Many People, Few Rovers

The second problem is inherent not to Internet telerobotics but to public telerobotics in general. At a public event, there are usually many more people than there are robots. Public robotic experiences have attempted to resolve this situation in several ways.

1.2.1 Traditional Approach: One Operator, One Robot

A traditional approach to the telerobotic experience is the one-operator, one-robot paradigm (see figure 1.1). Though suitable for personal robotic experiences, this is impractical for public experiences, where other users must wait in line to gain the precious and rarefied experience due to its singular nature.

1.2.2 Democratic Approach: Voting

A second paradigm used for public telerobotic experiences is a "democratic" voting scenario (see figure 1.2). In this situation, users vote on the direction in which a robot may travel or look. An example of this approach was part of the public outreach effort for NASA/CMU's Atacama Desert Trek [1-3]. In this project, CMU worked with Pittsburgh's Carnegie Science Center to produce a public interface for the Trek. Live panoramic imagery was projected onto a hemispherical screen surrounding a theater equipped with buttons at each seat. By pushing these buttons, users were able to vote on the next direction that Nomad would travel. Other buttons controlled the pan/tilt direction of the panoramic image projected onto the screen. Though it was thrilling to see full-color immersive imagery from 5000 miles away updating in real time, this voting scenario required the collective collaboration of a large group of people, resulting in the neglect of individual urges in the audience.

1.2.3 Hybrid Approach: One User Controls, Others Watch

Another approach to public remote experiences is a one-operator, many-viewers approach (see figure 1.3), exemplified by a project called RoverTV. This venture was an "interactive" television show that allowed viewers to actively explore a remote environment by giving TV watchers control of the Nomad rover, which was deployed in the Atacama Desert of northern Chile, 5000 miles away. Viewers observed live imagery from the rover and gained control by dialing a number with their touch-tone telephones. Users controlled both camera movements and steering of the rover.

Figure 1.1: A single operator and robot.
Figure 1.2: A democratic approach.
Figure 1.3: One user controls, many watch.

2 Solutions Using Autonomous Rover Systems and Information Interaction

2.1 Solution: Eliminate Communication Bottlenecks through Autonomy Systems

Time lag is not a new issue for telerobotics. Space-relevant rover technology has focused on the time-lag issue for years because of the amount of time it takes for signals to travel from the Earth to off-world locations and back again. Space rovers contain autonomy systems to keep them running between information bursts from Earth-bound controllers. Some of these systems focus on autonomous navigation and science.
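The idea of a rover that keeps working between infrequent command bursts can be sketched in a few lines. This is a toy model with invented names and trivial kinematics, not Nomad's actual control software: a queue of high-level goals is extended by occasional uplinks, while the control loop runs with no operator in the loop.

```python
from collections import deque

class AutonomousRover:
    """Toy model: the rover keeps executing its plan between uplinks."""
    def __init__(self):
        self.goals = deque()       # high-level goals queued by mission control
        self.position = (0.0, 0.0)

    def uplink(self, goals):
        """An infrequent burst from Earth extends the mission plan."""
        self.goals.extend(goals)

    def step(self):
        """One autonomous control cycle: no operator in the loop."""
        if not self.goals:
            return "idle"
        tx, ty = self.goals[0]
        x, y = self.position
        # Move toward the current goal (grossly simplified kinematics).
        self.position = (x + (tx - x) * 0.5, y + (ty - y) * 0.5)
        if abs(self.position[0] - tx) < 0.1 and abs(self.position[1] - ty) < 0.1:
            self.goals.popleft()
            return "goal reached"
        return "driving"

rover = AutonomousRover()
rover.uplink([(1.0, 0.0), (1.0, 1.0)])   # one command burst, then silence
while rover.step() != "idle":             # rover proceeds on its own
    pass
```

However long the silence between uplinks, the loop above never stalls waiting for Earth; latency only affects when new goals arrive, not whether the rover makes progress.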
One such project that explores the creation of autonomous navigation and science systems is the NASA/CMU Robotic Search for Antarctic Meteorites [4-6]. This project seeks to develop robotic technology that allows the autonomous discovery and classification of rocks and meteorites on Antarctic ice fields. This technology will enable a robot to deploy in an ice field, search the field to some level of precision, spot a potential target, autonomously navigate to it, and classify it as a type of rock or meteorite. Multiple sensor modalities are required to classify a meteorite more effectively than a human scientist can, but the robot has limited energy available. Therefore, efficiently searching an area without wasting time or energy requires science autonomy: a capability that does not just safely guide a robot to a waypoint, but actually creates mission plans from higher-level goals.

To further this research, the project sent an expedition team and the robot Nomad to Patriot Hills, Antarctica in October 1998. During this expedition, the polar navigation, ice traversal, and meteorite/rock classification capabilities of Nomad were tested. Data from a wide number of sources were sent back to CMU:

• High-resolution images of rock/meteorite targets
• Panoramic images
• Spectrometer results
• GPS information
• Telemetry from Nomad's numerous pose, weather, and other sensors

2.2 Solution: Information Interaction

These many types of information had to be integrated into one seamless experience by Big Signal. The limited bandwidth from Antarctica to CMU and bad weather in Antarctica made communications intermittent and infrequent; however, each new transmission augmented the data set at CMU's server. These data were processed into a form that could easily be integrated into, and publicly accessed through, the Big Signal web site. This transformed robotic activity from a one-rover, one-user model into a client/server system optimized for the Internet.
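The server-side archive just described can be sketched as a minimal in-memory version. The class and field names here are hypothetical; the real system served web pages from processed mission data rather than Python objects. The key property is that ingesting a transmission and reading the archive are decoupled, so readers never touch the rover.

```python
class MissionArchive:
    """Each rover transmission augments a server-side data set;
    users read from the archive, never from the rover itself."""
    def __init__(self):
        self.data = {}       # e.g. {"panorama": ..., "gps": ...}
        self.version = 0

    def ingest(self, transmission):
        """Called when a new burst arrives over the satellite link."""
        self.data.update(transmission)
        self.version += 1

    def snapshot(self):
        """Served to every web client; read-only, so users scale freely."""
        return dict(self.data), self.version

archive = MissionArchive()
archive.ingest({"gps": (-80.3, -81.4), "panorama": "pan_001.jpg"})
archive.ingest({"spectrometer": [0.42, 0.37], "panorama": "pan_002.jpg"})

# Any number of users can call snapshot() without affecting the mission.
view, ver = archive.snapshot()
```

Because every user request is a read against the archive, adding a thousandth user costs the mission nothing; this is what makes the model fit the Internet's client/server conventions.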
The public would now be able to access the wealth of information rather than the robot itself: a model we call information interaction. An archive of data on a stateside server, continuously downloading from a remote rover, provided the perfect framework for a demonstration of public information access to a telerobotic mission.

3 Pilot Project: An Educationally Relevant Remote Science Interface

In the fall of 1998, the Big Signal team created a prototype project consisting of an interactive web site that linked selected pilot classrooms with the 1998 RAMS mission. This project provided a perfect platform upon which to test and demonstrate the use of information interaction for remote science in a public setting. Classroom teachers, interface designers, web designers, illustrators, videographers, and robotics researchers worked together to both create content and adapt robot telemetry to web formats that students and teachers could access from the classroom. Pilot classrooms focused on the areas of physics, remote geology, and technology education. Students using Big Signal encountered several interface features that engaged them in the active experience of remote exploration.

3.1 Remote Sensing

The most visible connection to Antarctica was a continuously updating panoramic image taken by a camera onboard the rover and sent, with other robot data, via the communications link to the project web server in Pittsburgh. This updating image allowed students to gain a tangible view of the most recent condition of the environment in Antarctica (see figure 3.1). Students viewed these panoramic images over the web using Apple's QuickTime VR plug-in.
This downloadable, cross-platform virtual reality plug-in was the perfect counterpart for Nomad's unique imaging system, which was specifically designed to acquire complete 360-degree images, as shown in figure 3.2. Custom software converted the panospheric images acquired by Nomad into the Mercator-projection (panoramic) images required as input to the web-friendly, high-performance QuickTime VR movie conversion program. Use of this VR technology created an immersive experience for the students while using minimal computational resources to support the user-controlled photorealistic movie capability. Figure 3.3 illustrates the method by which a raw panospheric image is converted to QuickTime VR format. Quickly and reliably exchanging information between the robot and the students was the ultimate goal of this aspect of the user interface, and the panospheric-to-QuickTime-VR design approach was a significant factor in achieving it.

Figure 3.1: Panoramic images provided a tangible link to the remote environment.
Figure 3.2: Unaltered 360-degree panoramic photograph.

3.2 Data Reduction

In addition to direct links to sensory information, Big Signal also reduced the raw data sent from Antarctica, so that technical facts could be presented in a manner easily understood by students and the public. There were three main aspects of Big Signal that used reduced data.
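The kind of data reduction described here can be illustrated with a short sketch: raw per-record telemetry collapses into a handful of human-readable facts. The field names and summary phrasing below are invented for illustration, not the mission's actual telemetry schema.

```python
def reduce_telemetry(records):
    """Collapse raw sensor records into a few student-friendly facts.
    Field names are illustrative, not the mission's real schema."""
    temps = [r["air_temp_c"] for r in records]
    dist = sum(r["odometer_m"] for r in records)
    return {
        "coldest reading": f"{min(temps):.1f} degC",
        "warmest reading": f"{max(temps):.1f} degC",
        "distance traveled": f"{dist / 1000:.2f} km",
    }

# A few hypothetical raw records from successive transmissions.
raw = [
    {"air_temp_c": -12.4, "odometer_m": 350.0},
    {"air_temp_c": -18.9, "odometer_m": 410.5},
    {"air_temp_c": -15.1, "odometer_m": 298.2},
]
facts = reduce_telemetry(raw)
```

The reduction runs once on the server, so every classroom sees the same digestible summary instead of raw sensor streams.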
Figure 3.3: Panospheric to QuickTime VR conversion methodology. [Flowchart: panospheric image data and camera calibration data feed a perspective conversion from panospheric to Mercator projection, locating the "image center" from the axis of symmetry; the image is then reoriented and cropped for input to a non-GUI QTVR MakePanoVR run, producing a URL link to a new QTVR file viewable on any QTVR-enabled platform.]
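The core resampling step of this conversion, mapping rings of the donut-shaped panospheric image onto rows of a rectangular panorama, can be sketched as follows. This simplified version uses a plain cylindrical unwrap with nearest-neighbor sampling and an assumed image center, omitting the camera-calibration and Mercator-projection details of the actual software.

```python
import numpy as np

def unwrap_panosphere(img, r_inner, r_outer, out_w, out_h):
    """Unwrap a donut-shaped panospheric image (camera looking into a
    curved mirror) into a rectangular panorama by polar-to-rectangular
    resampling. The image center is assumed to be the array center."""
    cy, cx = img.shape[0] / 2.0, img.shape[1] / 2.0
    out = np.zeros((out_h, out_w), dtype=img.dtype)
    for y in range(out_h):
        # Output row y maps to a ring of radius r in the source image.
        r = r_outer - (r_outer - r_inner) * y / (out_h - 1)
        for x in range(out_w):
            # Output column x maps to an angle around that ring.
            theta = 2.0 * np.pi * x / out_w
            sx = int(round(cx + r * np.cos(theta)))
            sy = int(round(cy + r * np.sin(theta)))
            if 0 <= sy < img.shape[0] and 0 <= sx < img.shape[1]:
                out[y, x] = img[sy, sx]
    return out

# A tiny synthetic 64x64 "panospheric" frame, unwrapped to 90x16.
frame = np.arange(64 * 64, dtype=np.float32).reshape(64, 64)
pano = unwrap_panosphere(frame, r_inner=8, r_outer=30, out_w=90, out_h=16)
```

Production code would interpolate between source pixels and apply the calibrated mirror geometry, but the ring-to-row mapping above is the essence of turning an omnidirectional frame into a QuickTime VR-ready panorama.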