A Framework for the Assessment and Selection of Software Components and Connectors in COTS-Based Architectures

Jesal Bhuta (1), Chris A. Mattmann (1, 2), Nenad Medvidovic (1), Barry Boehm (1)

(1) Computer Science Department, University of Southern California, Los Angeles, CA 90089 — {jesal, mattmann, neno, boehm}
(2) Jet Propulsion Laboratory, California Institute of Technology, Pasadena, CA

Abstract

Software systems today are frequently composed from prefabricated, commercially purchased components and connectors that provide complex functionality and engage in complex interactions. Unfortunately, because of the distinct assumptions made by the developers of these products, successfully integrating them into a software system can be complicated, often causing budget and schedule overruns. Many integration risks can be resolved by selecting the "right" set of COTS components and connectors, ones that can be integrated with minimal effort. In this paper we describe a framework for selecting COTS software components and connectors that ensures their interoperability in software-intensive systems. Our framework is built upon standard definitions of both COTS components and connectors, and is intended for use by architects and developers during the design phase of a software system. We highlight the utility of our framework using a challenging example from the data-intensive systems domain. We describe empirical results gleaned from our experience using the framework in an advanced software engineering course at USC. We conclude by pointing the reader to future research directions.

1. Introduction

The increasing complexity of software systems, coupled with the decreasing cost of the underlying hardware, has ushered forth the realization of Brooks's famous "buy versus build" colloquy [1]. In the past, a business organization could afford to spend $500,000 to manually develop, maintain, and evolve a sophisticated payroll system for deployment on its costly two-million-dollar hardware.
Nowadays, however, a business organization that purchases $50,000 worth of off-the-shelf (OTS) office hardware cannot afford such a customized payroll program. Instead, it will often opt to purchase a commercial off-the-shelf (COTS) software system (or component) that can fulfill the same desired capabilities. Such COTS systems and components recurrently have diminished up-front cost, development time, maintenance, and evolution costs. These economic considerations often entice organizations to piece together COTS components into a working software system that meets the business organization's requirements and the system's functional requirements, even at the expense of altering the organization's existing business processes!

Unfortunately, over the past ten years numerous studies [2-6] have shown that piecing together available open-source OTS and COTS components is quite dissimilar from custom development. Instead of the traditional requirements-design-develop-test-deploy process, COTS-based development involves activities such as assessment-selection-composition-integration-test-deploy. Tantamount to the success of the entire process are the assessment and selection of the "right set" of COTS components and connectors. Careful and precise execution of these activities often ensures the development of a system on time, on budget, and in line with the objectives of the project. There are two major components within the assessment and selection process: (1) assessment of COTS functional and non-functional requirements; and (2) assessment of interoperability, to ensure that the selected COTS components will satisfactorily interact with each other. While the former has been addressed previously [2-6], an efficient solution to the latter has eluded researchers.

The first example of such an interoperability issue was documented by Garlan et al. in [5] when attempting to construct a suite of software architectural modeling tools using a base set of four reusable components. Garlan et al.
termed this problem architectural mismatch, and found that it occurs due to (specific) assumptions that an OTS component makes about the structure of the application in which it is to appear, assumptions that ultimately do not hold true.

The best-known solution for identifying architectural mismatches is prototyping COTS interactions as they would occur in the conceived system. Such an approach is extremely time- and effort-intensive. In the interest of limited resources, it compels developers either to neglect the interoperability issue altogether and hope that it will not create problems during the composition and integration phases, or to postpone interoperability assessment until the number of COTS combinations available for selection has been cut down to a manageable number (based on functional and quality-of-service requirements). Both of these options add significant risk to the project. When developers completely neglect interoperability assessment, they will often be required to write enormous amounts of glue code, causing cost and schedule overruns. Otherwise, they risk losing a COTS product combination that is easy to integrate but just "isn't right" because of some low-priority functionality it did not possess. Neither of the above prospects is appealing to development teams.

In addition to the above COTS component integration issues, there are also issues in utilizing available COTS connectors. The study of software architecture [7] tells us that software connectors are the embodiment of the interactions and associations between software components. Therefore, ideally, when trying to construct the architecture of a software system, we need to be able to deal not only with the assembly of software components but additionally with the assembly of software connectors. This is exacerbated by the current lack of understanding, in many software system domains (e.g., data-intensive systems [8]), of how to select between different available COTS connectors.
The research literature [9-11] contains many other studies that describe the enormous difficulty of assembling software connectors by themselves, let alone with COTS software components.

In this paper, we propose an attribute-driven framework that addresses the selection of (C)OTS components and connectors to ensure that they can be integrated within project budget and schedule. One of the key contributions of our work is the identification of connectors to (1) "bridge the gap" between COTS components and ensure interoperability, and (2) satisfy the system's quality-of-service (QoS) requirements. Our proposed framework identifies COTS component incompatibilities and recommends resolution strategies, in part by using specific connectors and glue code to integrate these components. Where component interactions require the satisfaction of QoS requirements, the framework will recommend appropriate connectors. Such incompatibility information can be used to estimate the effort of COTS integration [12], which can then be used as a criterion when selecting COTS products. The framework is non-intrusive, interactive, and tailorable. The assessment conducted by the framework can be carried out as early as the inception phase, as soon as the development team has identified possible architectures and a set of COTS components and connectors. We have tested this framework in a classroom setting and in various example studies, including a challenging real-world example from the data-intensive systems domain. Our early experience from using the framework indicates that our approach is feasible and worthy of active pursuit.
1.1 Definitions

We adopt the SEI COTS-Based System Initiative's definition [7] of a COTS product: a product that is
•  sold, leased, or licensed to the general public;
•  offered by a vendor trying to profit from it;
•  supported and evolved by the vendor, who retains the intellectual property rights;
•  available in multiple identical copies;
•  used without source code modification.

For the purposes of this work we include open-source products as part of the COTS domain. This is because, although source code for these products is freely available, they are most often used without modification. Moreover, there have been several revenue-generating models built around open source [13].

In this paper, we define a component generally as a unit of computation or a data store [10]. Components may be as small as a single procedure or as large as an entire application. Connectors are architectural building blocks used to model interactions among components and the rules that govern those interactions [10].

The rest of this paper is organized as follows. In Section 2, we describe a motivating real-world COTS assessment and selection problem in the data-intensive systems domain. In Section 3 we describe the assessment framework in detail, including the attribute metadata that it captures and how it applies to our example. In Section 4 we present empirical evidence and data taken from an advanced software engineering course at USC that evaluated our framework. Section 5 surveys work related to our approach, and Section 6 rounds out the paper with a view of some future work.

2. Motivating Example

Consider the following COTS assessment and selection problem, derived from several existing challenges we are currently facing at NASA's Jet Propulsion Laboratory (JPL). The scenario helps to illustrate the utility of our framework and ground it within an existing real-world problem.
Four planetary scientists at JPL in Pasadena, California are responsible for managing hundreds of gigabytes of planetary science data, which includes digital content, corresponding metadata, and additional planetary science data. The JPL scientists need to share their data with colleagues at the European Space Agency (ESA) in Madrid, Spain. Their colleagues at ESA consist of two planetary scientists managing tens of gigabytes of planetary data. Each of the two ESA scientists has separate preferences for the delivery intervals in which she would like to receive her JPL colleagues' data, ranging from the amount of data per interval to the appropriate interval times during the day in which to send the data. In turn, similar user-preference issues arise from the JPL planetary scientists' desire to receive their ESA colleagues' data.

In addition to the aforementioned data sharing tasks between the JPL and ESA scientists, there are also tens of thousands of external users, including other planetary scientists and educators (each with their own preferences), who are customers of the data made available by JPL's and ESA's independent planetary data systems. The users are separated by highly distributed geographic networks that span both WANs and LANs, and in some cases entire continents.

In order to support the planetary scientists' needs, JPL and ESA commission a team of software architects and engineers to design and implement a software system that can support the data distribution tasks outlined between JPL and ESA. Additionally, the system needs to support the tens of thousands of external users.

Figure 1 displays a potential architecture for such a system.
The systems based at JPL and ESA utilize a COTS digital asset management system such as DSpace, data stores that include at least one type of database system such as Oracle or Sybase, and two custom components, one of which manages user queries while the other retrieves data from its counterpart system at periodic intervals.

Figure 1. A potential architecture for a large-scale data distribution scenario

At first glance, the complexity of the above system might be glossed over, and the first impression might be to "just deploy Oracle" or to "utilize web services". However, these COTS technologies might be unrealistic for several reasons, including the requirements of the organization (ESA may be a Sybase house), the skill levels of the programmers tasked with implementing the system (JPL programmers may prefer Java), or even the architecture of the system itself (ESA's existing data system may be client-server while the desired distribution connector may be peer-to-peer). What is needed is a fundamental understanding of how to select the appropriate COTS components and connectors for employing them in a working software system. Thus, we believe that any approach to solving the described data sharing challenge boils down to answering the following two questions:

1. How do we select the appropriate COTS components that will support data distribution, given the large amount of heterogeneity between them?
2. How do we select the appropriate COTS connectors that will support the QoS requirements between the JPL and ESA scientists and the external users?

In the remainder of the paper we describe how our COTS assessment framework is uniquely positioned to attack each of these fundamental questions.

3. Assessment and Selection Framework

The framework, as shown in Figure 2, is modeled using several modular components that provide services to one another. The three key components of the framework are: the COTS interoperability analyzer (shown in the middle right portion of Figure 2), the COTS representation attributes (that make up the COTS definitions shown in the left side of Figure 2), and the integration rules (shown in the bottom right portion of Figure 2). The inputs to the framework are various COTS component definitions and a high-level system architecture. The output of the framework is an interoperability assessment report, which includes three major analyses:

1. Internal assumption mismatches, which are caused by assumptions made by interacting COTS systems about each other's internal structure [4].
2. Interface (or packaging) mismatches, which occur because of incompatible communication interfaces between two components.
3. Dependency analysis, which ensures that the facilities required by the COTS packages used in the system are being provisioned (e.g., a Java-based CRM solution requires a Java Runtime Environment).

In the remainder of this section we describe each of the framework components in detail.

3.1 COTS interoperability evaluator

To develop the COTS interoperability evaluator we needed to address two significant challenges:

1. Ensure that the effort spent in COTS architecture assessment is much less than the effort spent performing the assessment manually.
2. Ensure that the framework is extensible, i.e., that it can be updated based on prevailing COTS characteristics.

We address these challenges by developing a framework that is modular and automated, and in which COTS definitions and assessment criteria can be updated on the fly. Our framework allows an organization to maintain a reusable and frequently updated portion (the COTS selector) remotely, and a portion that is minimally updated (the interoperability analyzer) at the client side. This allows a dedicated team to maintain definitions for the COTS products being assessed by the organization.

The internal architecture of the COTS interoperability evaluator component is shown in Figure 2. The architecture consists of the following sub-components.
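The dependency analysis above can be illustrated with a small sketch: every facility a COTS package requires must be provisioned by some other element of the deployment. The following Python fragment is purely illustrative; the attribute names and product definitions are invented and do not reflect the framework's actual definition format.

```python
# Minimal sketch of the dependency-analysis idea: flag any requirement
# that no component in the deployment provisions. The "provides" and
# "requires" attribute names and the product entries are hypothetical.

def missing_dependencies(deployment):
    """Return (component, unmet requirement) pairs for a deployment."""
    provided = set()
    for comp in deployment:
        provided.update(comp.get("provides", []))
    missing = []
    for comp in deployment:
        for req in comp.get("requires", []):
            if req not in provided:
                missing.append((comp["name"], req))
    return missing

deployment = [
    {"name": "Java-based CRM", "requires": ["JRE"], "provides": []},
    {"name": "Oracle DB", "requires": [], "provides": ["SQL store"]},
]

print(missing_dependencies(deployment))  # the JRE is not provisioned
```

In the framework, such unmet dependencies would appear in the dependency-analysis section of the interoperability assessment report rather than being printed directly.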
COTS Definition Generator is a software utility that allows users as well as COTS vendors to define COTS components in a generally accepted standard format. Currently we have implemented an XML-based format; however, the implementation format is independent of the underlying metadata (e.g., the COTS definition can still be represented using other representation formats, so long as suitable parsers exist). For brevity, we omit the full description of our existing XML format and point the reader to [14] for a complete description.

COTS Definition Repository is an online store of various COTS definitions, indexed and categorized by their roles and the functionality they provide (database systems, graphic toolkits, etc.). The repository is queried by different sub-components of the interoperability evaluator. In practice, this component would be shared across the organization to enable the reuse of COTS definitions.

Architecting Component User Interface provides a graphical user interface for developers to create the system deployment diagram. The component queries the COTS definition repository to obtain the definitions of the COTS products being used in the conceived system.

Figure 2. COTS interoperability evaluation framework

Integration Rules Repository specifies the various integration rules that drive the analysis results and interoperability assessment.
The rules repository can be maintained remotely; however, the complete repository must be downloaded to the client side (the interoperability analyzer) before performing an interoperability assessment. This reduces the number of remote queries required when assessing COTS architectures.

Integration Analysis Component contains the actual algorithm for analyzing the system. It uses the rules specified in the integration rules repository, along with the architecture specification, to identify internal assumption mismatches, interface (or packaging) mismatches, and unsatisfied dependencies. When the integration analysis component encounters an interface mismatch, it queries the COTS connector selector component to identify whether an existing bridge connector could be used to integrate the components; if not, it recommends in the interoperability analysis report that a wrapper of the appropriate type (communication, coordination, or conversion) be utilized. The integration analysis component then provides some simple textual information (in human-readable format) about the functionality of the wrapper required to enable interaction between the two components. In addition, the integration analysis component identifies mismatches caused by internal assumptions made by COTS components, and also identifies COTS component dependencies not satisfied by the architecture. Both of these findings are included in the interoperability analysis report.

COTS Connector Selector is a query interface used by the integration analysis component to identify a bridging connector in the event of an interface incompatibility, or a QoS-specific connector.

Quality of Service Connector Selection Framework is an extensible component built for identifying quality-of-service-specific connectors. One such extension, discussed in this paper, aids in the selection of connectors for highly distributed and voluminous data transfers.
Other quality-of-service extensions may include connectors for mobile-computing environments that require a low memory footprint, or connectors for highly reliable, fault-tolerant systems. To create a quality-of-service extension, a developer first identifies the needed COTS attribute information and ensures the information is captured in the COTS definition repository. This information will typically describe the scenario requirements for COTS connector selection for the particular level of service; e.g., for data-intensive systems, it may include the Total Volume, the Number of Delivery Intervals, and possibly the Number of Users present in the data transfer. The developer can then construct a simple web-based service that accepts the COTS connector definition information, and any other needed data, and returns the appropriate COTS connectors to select in order to satisfy the desired level-of-service scenario.

COTS Interoperability Analysis Report is output by the selector and contains the results of the analysis in three major sections: (1) internal assumption mismatch analysis, (2) interface (packaging) mismatch analysis, and (3) dependency analysis. This is the ultimate output of the interoperability evaluator component.

3.2 COTS Representation Attributes

The COTS representation attributes are a set of 38 attributes that define COTS product interoperability characteristics. COTS interoperability characteristics defined using these attributes are utilized by the integration analysis component, along with the integration assessment rules (described in the next section), to carry out the interoperability analysis. These attributes have been derived from the literature as well as from our own experience.

Figure 3. COTS Representation Attributes

COTS General Attributes: Name; Version; Role* (platform, middleware, ...); Type (third-party, custom, legacy)

COTS Interface Attributes*: Packaging* (source code, object modules, binaries, ...); Data Inputs* (e.g., procedure calls, shared data, ...); Data Outputs* (e.g., procedure calls, shared data, ...); Data Protocols* (e.g., HTTP, FTP, ...); Data Format* (e.g., HTML, JavaScript, ...); Data Representation* (e.g., ASCII, Unicode, binary, ...); Control Inputs* (e.g., procedure calls, triggers, ...); Control Outputs* (e.g., procedure calls, triggers, ...); Control Protocols* (e.g., ADODB); Binding* (e.g., static, compile-time dynamic, run-time dynamic, ...); Extensions* (plug-ins, third-party extensions, ...); Error Handling Inputs* (e.g., HTTP error codes); Error Handling Outputs* (e.g., HTTP error codes); Communication Language Support* (e.g., .NET languages for MS Office)

COTS Internal Assumption Attributes: Synchronization (synchronous, asynchronous); Concurrency (single-threaded, multi-threaded); Distribution (single-node, multi-node); Dynamism (static, dynamic); Encapsulation (yes, no); Layering (yes, no); Triggering Capability (yes, no); Backtracking (yes, no); Control Unit (central control, distributed control, none); Component Priorities (yes, no); Preemption (yes, no); Reconfiguration (online, offline, ...); Reentrant (yes, no); Response Time (bounded, unbounded, cyclic, ...); Error Handling Mechanism (rollback, roll-forward, none, ...)

COTS Dependency Attributes*: Underlying Dependency* (e.g., JRE for Java); Communication Dependency* (e.g., database for CRM); Deployment Language* (e.g., binary, PHP script, ...); Execution Language Support* (e.g., PHP for the PHP interpreter)
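As a rough illustration of how interface attributes like those in Figure 3 can drive an automated check, the sketch below compares the data protocols and data representations of two COTS definitions. The attribute names follow Figure 3, but the product values are invented for illustration, and the framework's real integration rules cover many more attributes than these two.

```python
# Sketch of an interface (packaging) mismatch check over two hypothetical
# COTS definitions, using two of the Figure 3 interface attributes.

def interface_mismatches(producer, consumer):
    """Compare a producer's interface attributes against a consumer's."""
    mismatches = []
    # Data can flow only over a protocol both sides support.
    if not set(producer["data_protocols"]) & set(consumer["data_protocols"]):
        mismatches.append("no common data protocol")
    # Both sides must also agree on at least one data representation.
    if not set(producer["data_representation"]) & set(consumer["data_representation"]):
        mismatches.append("no common data representation")
    return mismatches

dspace = {"data_protocols": ["http"], "data_representation": ["Unicode"]}
legacy_db = {"data_protocols": ["odbc"], "data_representation": ["ASCII", "Unicode"]}

print(interface_mismatches(dspace, legacy_db))  # → ['no common data protocol']
```

In the framework, a mismatch detected in this way would trigger a query to the COTS connector selector for an existing bridge connector, falling back to a wrapper recommendation (communication, coordination, or conversion) in the analysis report.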