A Framework for Requirements Engineering in a Digital Integrated Environment

Joseph Kasser
Systems Engineering and Evaluation Centre
University of South Australia, Mawson Lakes Campus, Room F-37
Mawson Lakes, SA, 5095, Australia
Email: Joseph.Kasser@unisa.edu.au

Abstract

The current systems and software development life cycle (SDLC) paradigm is that of a production system. The focus is on process and product. The Framework for Requirements Engineering in a Digital Integrated Environment (FREDIE) is a potential tool arising from the Anticipatory Testing concept, which views the SDLC from the perspective of Information Systems and the application of Knowledge Management and modern Quality theory. The FREDIE has the potential to implement significant cost reductions in the SDLC. This paper introduces the FREDIE concept, outlines how the cost reductions can be achieved by means of a tool that could become the basis for the next generation of software project management tools, and presents some use cases of the FREDIE tool.

This work was funded from the DSTO SEEC Centre of Expertise Contract.

Introduction

The SDLC has evolved several methodologies since the early days of the Waterfall model. One of them, the Spiral model (Boehm 1988), placed explicit emphasis on Risk Management. However, even with Risk Management and the current emphasis on Process Standards and Capability Maturity Measurement, a developer working within the current production paradigm cannot answer two simple questions posed by the customer during the SDLC, namely:

• "What Do You Mean, You Can't Tell Me How Much of My Project Has Been Completed?" (Kasser 1997).
• "What Do You Mean You Can't Tell Me if My Project is in Trouble?" (Kasser and Williams 1998).

Another flaw in the paradigm is that the SDLC is characterized by large cost overruns, schedule slips, and dramatic performance deficiencies in weapon, C4I, and automated information systems (DoD 1995). The reasons for these failures are varied (Kasser and Williams 1998); however, major contributors to the failures are poor requirements and poor requirements engineering management. There has been a great deal of research into building the right system and doing requirements better (Glass 1992). Much of that research has focused on how to state the requirements in the form of a specification once they have been obtained, using a requirements traceability matrix (RTM) and the tools that incorporate an RTM. Consequently, while the implementation of good system and software requirements management practices is believed to be one of the first process improvement steps an organization should take, implementation still remains a challenging problem (El Emam and Hoeltje 1997).

Anticipatory Testing

Anticipatory Testing combines prevention with testing and is based on the recognition that prevention is planned anticipation (Crosby 1981). The Anticipatory Testing approach (Kasser 1995) is a control and information system paradigm rather than a production paradigm. It views the SDLC from the perspective of Information Systems and the application of Knowledge Management and modern Quality theory. It places explicit emphasis on Configuration Management and on building Quality into the process. The Anticipatory Testing approach has the potential to provide better answers to the two questions posed above, to facilitate change management, and to present an opportunity for an improvement in program management at least as great as that obtained by the introduction of PERT (the Program Evaluation and Review Technique, developed by the United States Department of Defense as a management tool for complex military projects).

From the Anticipatory Testing perspective, a requirement can be thought of as consisting of two parts: the functionality, and the Quality criteria that define the measurable attributes associated with that functionality. The term Quality is used based on the definitions of Quality as "conformance to specifications" (Crosby 1979) and as "fitness for use" (Juran 1988). For example, a requirement to ingest sensor data into a system is made up of the function that ingests the data and the minimum measurable amount of data to be ingested over a specified period of time. Having to consider both the functionality and the Quality criteria components of a requirement should tend to ensure that requirements are measurable and hence verifiable at the time of acceptance. Later, when the requirement is decomposed into subsystem requirements, the flow-down of measurable subsystem requirements should also tend to ensure that the capability provided by the system meets the performance required by the customer.
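This two-part view lends itself to a simple data representation. The following sketch is illustrative only; the paper does not prescribe a data model, and the class and field names (Requirement, QualityCriterion, and so on) are hypothetical.

# Illustrative sketch only: a two-part requirement as described above.
# The class and field names are hypothetical, not from the paper.
from dataclasses import dataclass, field
from typing import List

@dataclass
class QualityCriterion:
    attribute: str   # the measurable attribute, e.g. "ingest rate"
    minimum: float   # minimum acceptable value
    unit: str        # e.g. "records/second"

@dataclass
class Requirement:
    identifier: str
    functionality: str   # what the system shall do
    criteria: List[QualityCriterion] = field(default_factory=list)

    def is_verifiable(self) -> bool:
        # A requirement with no measurable criteria has nothing that
        # can be tested at acceptance time.
        return len(self.criteria) > 0

# The sensor-data example from the text: the ingest function plus the
# minimum measurable amount of data over a specified period.
ingest = Requirement(
    identifier="REQ-001",
    functionality="Ingest sensor data into the system",
    criteria=[QualityCriterion("ingest rate", 500.0, "records/second")],
)
assert ingest.is_verifiable()

Making the criteria explicit in the structure is what makes the verifiability check trivial: a bare statement of functionality with an empty criteria list is immediately visible as untestable.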
Anticipatory Testing is used within an Organizational Engineering, or integrated product-process and management, paradigm (Kasser 1999). The product under construction is a system, and the process producing the product is a system. People working within the context of an enterprise framework (itself a system) build a product over a period of time, and every one of these systems changes over time. From the Anticipatory Testing perspective, the product and the process can be considered as a series-parallel set of phased Builds (mini waterfalls, or cataracts) in a multithreaded environment under the control of the Configuration Control Board (CCB).

[Figure 1: Anticipatory Testing view of the SDLC]

Figure 1 presents this concept by showing the traditional Waterfall methodology's sequential elements connected via a CCB that allocates the implementation of requirements and subsequent changes to Builds, in which:

• Engineering converts user needs into functionality and Quality criteria (requirements) and groups functionality into sets (Builds).
• Management ensures that Builds are implemented in a phased manner.

Change Management

From the information flow perspective, the processes of accepting prospective requirements (before the baseline is set) and change requests (after the baseline is set) are identical and contain the following steps (a sketch in code follows the list):

• Prioritize the requirement/change.
• Determine if a contradiction exists.
• Perform an impact assessment using an Integrated Product and Process Team (IPPT). The impact assessment must:
  • Estimate the cost/schedule to implement.
  • Determine the cost/schedule drivers - the factors that are responsible for the greatest part of the cost/schedule.
  • Perform a sensitivity analysis on the cost/schedule drivers.
  • Determine whether the cost drivers are really necessary, and how much modification can be made by negotiating the requirement with customers based on the results of the sensitivity analysis.
• Make the decision to accept, accept with modifications, or reject.
• Notify the originator.
• Document the decision(s) in the requirement repository.
• If the requirement/change is accepted, allocate the implementation to a specific Build, modifying the Work Breakdown Structure (WBS) appropriately.
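The following sketch walks one change through these steps. It is a toy under stated assumptions: the dict shapes, the fixed cost threshold, and the conflict rule are hypothetical placeholders for real IPPT judgment, not anything the paper specifies.

# Toy sketch of the accept/reject flow described above. All names and
# rules here are hypothetical placeholders.
from enum import Enum

class Decision(Enum):
    ACCEPT = "accept"
    ACCEPT_WITH_MODIFICATIONS = "accept with modifications"
    REJECT = "reject"

def process_change(change, baseline, repository):
    # 1. Prioritize the requirement/change.
    change.setdefault("priority", 3)
    # 2. Determine if a contradiction exists with the current baseline.
    if any(change["id"] in req.get("conflicts", ()) for req in baseline):
        decision = Decision.REJECT
    else:
        # 3. Impact assessment (cost/schedule estimates, drivers,
        # sensitivity analysis, negotiation), reduced here to a
        # placeholder threshold on the estimated cost.
        cost = change.get("estimated_cost", 0.0)
        decision = (Decision.ACCEPT if cost < 10000.0
                    else Decision.ACCEPT_WITH_MODIFICATIONS)
    # 4-6. Notify the originator and document the decision.
    print(f"to {change['originator']}: {change['id']} -> {decision.value}")
    repository[change["id"]] = decision
    # 7. If accepted, allocate the implementation to a specific Build,
    # which would also update the WBS.
    if decision is not Decision.REJECT:
        change["build"] = change.get("target_build", 1)
    return decision

repo = {}
process_change({"id": "CR-7", "originator": "customer",
                "estimated_cost": 2500.0}, baseline=[], repository=repo)

Note that the same function serves both cases the paper identifies: a prospective requirement before the baseline is set and a change request after it.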
However, in order to perform the impact assessment and make informed decisions at any specific time in the SDLC in an effective manner, a certain amount of information is needed. In the existing production paradigm, this information tends to be contained in several different and usually unconnected tools: Requirements Management, Project Management, WBS, Configuration Control, Cost Estimation, etc. This information, herein named the Quality System Elements (QSE), includes but is not limited to:

• Unique identification number - the key to tracking.
• Requirement - the imperative statement containing both the required functionality and its corresponding Quality criteria, or other form of representation.
• Traceability to source(s) - the previous level in the production sequence.
• Traceability to implementation - the next level in the production sequence. Thus requirements are linked to design elements, which are linked to code elements, and so on.
• Priority - knowing the priority allows the high-priority items to be assigned to early Builds, and simplifies the analysis of the effect of budget cuts.
• Estimated cost and schedule - these feed into the management plan and are refined as the project passes through the SDLC.
• The level of confidence in the cost and schedule estimates - these should improve as the project passes through the SDLC.
• Rationale for the requirement - the extrinsic information and other reasons for the requirement.
• Planned verification methodology(s) - developing this at the same time as the requirement avoids accepting requirements that are either impossible to verify or too expensive to verify.
• Risk - any risk factors associated with the requirement.
• Keywords - allow for searches through the database when assessing the impact of changes.
• Production parameters - the Work Breakdown Structure (WBS) elements in the Builds in which the requirements are scheduled to be implemented.
• Testing parameters - the Test Plans and Procedures in which the requirements are scheduled to be verified.
• Traceability sideways to document duplicate links - required when applying the QSE to an existing paper-based project.
• Access control parameters - national security classification or company confidential, as appropriate.
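One way to picture a QSE record is as a single structure whose fields paraphrase the list above. The dataclass below is a sketch under that assumption; the field names and types are illustrative, not a schema defined by the paper.

# One possible shape for a QSE record; a sketch only. Field names
# paraphrase the list above and are not a schema from the paper.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class QualitySystemElement:
    uid: str                      # unique identification number
    requirement: str              # functionality plus Quality criteria
    sources: List[str] = field(default_factory=list)          # previous level
    implementations: List[str] = field(default_factory=list)  # next level
    priority: int = 0
    estimated_cost: Optional[float] = None
    estimated_schedule_days: Optional[int] = None
    estimate_confidence: Optional[float] = None  # improves over the SDLC
    rationale: str = ""
    verification_methods: List[str] = field(default_factory=list)
    risks: List[str] = field(default_factory=list)
    keywords: List[str] = field(default_factory=list)      # impact searches
    wbs_elements: List[str] = field(default_factory=list)  # production params
    test_procedures: List[str] = field(default_factory=list)  # testing params
    duplicate_links: List[str] = field(default_factory=list)  # sideways trace
    access_control: str = "unclassified"

Holding all of these fields in one record, rather than scattered across unconnected tools, is what would let the Agents described next traverse the links mechanically.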
The FREDIE Paradigm

This paper proposes expanding the traditional RTM into a database represented by the set of Quality System Elements (QSE), to be stored in a Framework for Requirements Engineering in a Digital Integrated Environment (FREDIE) instead of the several separate tools currently in use. This database and its Agents (entities that operate on the contents of the database) would be the next step in the evolution of Requirements Engineering, a discipline that is evolving from its traditional role as a mere front-end to the systems life cycle towards a central focus of change management in system-intensive organizations (Jarke 1996).

By requiring a full set of QSE for each requirement at the time the requirement is agreed to by the customer and contractor, some quality is built into the structure of a project. For example:

• The cost and schedule impact of a requirement or a change is known (to some extent) up front.
• The impact of change requests on the project can be more easily identified than in the paper-based production paradigm.
• The ambiguity in poorly written requirements is minimized, because the rationale for the requirement and the verification methodology are documented early in the process.
• Unrealistic and unverifiable requirements are not imposed on the developer, by virtue of the testing parameters.

Access to the QSE via the FREDIE Agents will allow decisions to be made more rapidly and effectively. Thus the concept will improve the shared meaning as well as the three dimensions of Requirements Engineering success (El Emam and Madhavji 1996):

• Cost effectiveness of the process.
• Quality of the Requirements Engineering products.
• Quality of the Requirements Engineering service.

Use Case Scenarios

The value of a FREDIE can best be conveyed by demonstrating its use in various scenarios, as shown herein.

Change request - impact assessment

When a change request is received, the process outlined above is followed. Once the specific WBS element (the element implementing the requirement) and/or the requirement (as appropriate) affected by the change is identified, the links in the FREDIE facilitate an informed assessment of the impact of the change on the other elements (capability, cost, schedule, risk, WBS) within the project.

Project Quality Audit

If the FREDIE database is populated with the requirements, configuration control, and project management data for a project, a Quality Audit may be performed by examining the data in the FREDIE database. With the addition of the appropriate knowledge base for the specific function, this audit may perform several functions (the first three of which are sketched in code after the list), including:

• Identification of some poorly written requirements, by scanning the text of the requirements for words that do not meet the requirements for writing requirements (Kasser 1995).
• Identifying work that is not being done, or work that doesn't have to be done, by finding missing links in the traceability of the requirements to the WBS.
• Identifying missing links in the traceability of the requirements to test plans and procedures.
• Identifying missing activities or steps in the processes within the SDLC.
• Identifying other missing information.
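As an illustration of the first three functions, this toy sketch scans requirement text for words that commonly violate requirement-writing rules and finds records with missing WBS or test traceability. The weak-word list is an illustrative assumption (it is not Kasser's published rule set), and the records are assumed to have the fields of the earlier QSE sketch.

# Toy sketch of two audit checks over QSE records; assumptions as above.
WEAK_WORDS = {"adequate", "appropriate", "flexible", "user-friendly", "etc"}

def poorly_written(qse_records):
    # Flag requirements whose text contains words that defeat
    # verifiability; the word list above is a placeholder.
    return [q.uid for q in qse_records
            if WEAK_WORDS & set(q.requirement.lower().split())]

def missing_traceability(qse_records):
    # Work not being done: no link from the requirement to the WBS.
    no_wbs = [q.uid for q in qse_records if not q.wbs_elements]
    # Nothing planned to verify it: no link to test plans/procedures.
    no_test = [q.uid for q in qse_records if not q.test_procedures]
    return no_wbs, no_test

Because every check reads only the links already stored in the database, such an audit is cheap to repeat at every reporting milestone.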
Categorized Requirements in Process (CRIP)

The Categorized Requirements in Process (CRIP) approach (Kasser 1999) is a way to estimate the percentage of project completion by looking at the change in the state of the requirements over time, from several perspectives. The CRIP approach follows this procedure (a bookkeeping sketch appears after the examples below):

• Develop categories for the requirements - use factors such as "complexity", "cost", and "priority", as appropriate.
• Identify and set up the rules for 10 distinct ranges within each category - the ranges represent a partitioning of the category; a simple 1 to 10, or A to J, is enough. The rules for determining the category must not change over the SDLC.
• For each category, place each requirement into a range - it is permissible to move a requirement from one range to another should the initial determination be found to be incorrect.
• As the SDLC progresses, for each category of requirements, record the state of each of the requirements in the FREDIE database - a requirement must reside in one and only one of the following states at any time:
  • Identified - a requirement has been identified, documented, and approved.
  • Working - the supplier has begun work to implement the requirement.
  • Completed - the supplier has completed work on the requirement.
  • In test - the supplier has started to test the requirement.
  • Accepted - the buyer has accepted delivery of a Build containing the implementation of the requirement.
• At each reporting milestone, monitor the changes in the state of each of the requirements between the SDLC reporting milestones - the typical project management view should be used, namely:
  • Expected from last time.
  • Actual achieved.
  • Planned for next time.

The summaries may be presented in graphical or table format, as shown in Figure 2. Each cell in the table contains three values (expected, actual, and planned). A comparison of the summaries from different reporting milestones can identify progress and show that problems may exist; on its own, however, it cannot identify the actual problem. For example:

• The cell in the "Identified" column for category B shows that the project planned that 43 requirements would be identified, but only 2 were actually identified, and the project expects to identify 4 in the next reporting phase. Something may be wrong here!
• In category B, it was expected that the part of the system implementing 12 requirements would go into test in the last reporting period, but only 5 made it into test and none are planned for the next reporting period. Again, something is wrong.
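To make the bookkeeping concrete, the sketch below counts requirements per (range, state) cell for one category at a single reporting point; comparing such counts across milestones yields the expected/actual/planned triple in each cell of the Figure 2 table. The data structures here are hypothetical illustrations, not part of the CRIP definition.

# Small sketch of CRIP bookkeeping; structures are hypothetical.
from collections import Counter
from enum import Enum

class State(Enum):
    IDENTIFIED = "identified"
    WORKING = "working"
    COMPLETED = "completed"
    IN_TEST = "in test"
    ACCEPTED = "accepted"

def crip_summary(requirements, category):
    # Count requirements in each (range, state) cell for one category,
    # e.g. category="priority". Each requirement is assumed to carry a
    # dict of category -> range letter (A to J) and a current State.
    cells = Counter()
    for req in requirements:
        cells[(req["ranges"][category], req["state"])] += 1
    return cells

reqs = [
    {"ranges": {"priority": "B"}, "state": State.IDENTIFIED},
    {"ranges": {"priority": "B"}, "state": State.IN_TEST},
]
print(crip_summary(reqs, "priority")[("B", State.IDENTIFIED)])  # prints 1

Running this summary at each reporting milestone, and keeping the previous milestone's planned counts alongside the new actual counts, reproduces the kind of discrepancies illustrated in the category B examples above.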