A Retrospective Study of Software Analytics Projects: In-Depth Interviews with Practitioners

Ayse Tosun Misirli, University of Oulu
Bora Caglayan, Bogazici University
Ayse Bener, Ryerson University
Burak Turhan, University of Oulu

Outline
1. History
2. What Is Software Analytics?
3. Procedure
4. Interview Protocol
5. Data Collection
6. Cases
   6.1. Effort Estimation Project
   6.2. Defect Prediction Projects
7. Future Road Map

1. HISTORY
- In 2012, IEEE Software announced a special issue, "Software Analytics: So What?" (Volume 1):
  o Guest Editors' Introduction: Software Analytics: So What? Menzies, Tim & Zimmermann, Thomas
  o Leveraging the Crowd: How 48,000 Users Helped Improve Lync Performance. Musson, Robert & Richards, Jacqueline & Fisher, Danyel & Bird, Christian & Bussone, Brian & Ganguly, Sandip
  o Developer Dashboards: The Need for Qualitative Analytics. Baysal, Olga & Holmes, Reid & Godfrey, Michael
  o Roundtable: What's Next in Software Analytics. Hassan, Ahmed E. & Hindle, Abram & Runeson, Per & Shepperd, Martin & Devanbu, Prem & Kim, Sunghun
  o Searching under the Streetlight for Useful Software Analytics. Johnson, Philip M.
  o CODEMINE: Building a Software Development Data Analytics Platform at Microsoft. Czerwonka, Jacek & Nagappan, Nachiappan & Schulte, Wolfram & Murphy, Brendan
- A second issue followed, "The Many Faces of Software Analytics" (Volume 2):
  o Software Analytics in Practice. Zhang, Dongmei & Han, Shi & Dang, Yingnong & Lou, Jian-Guang & Zhang, Haidong & Xie, Tao
  o Are Software Analytics Efforts Worthwhile for Small and Medium Companies? The Case of Amisoft. Robbes, Romain & Vidal, René & Bastarrica, Cecilia
  o Using Software Analytics to Understand How Companies Interact in Free Software Communities. Gonzalez-Barahona, Jesus & Izquierdo, Daniel & Maffulli, Stefano & Robles, Gregorio
  o A Retrospective Study of Software Analytics Projects: In-Depth Interviews with Practitioners. Tosun Misirli, Ayse & Caglayan, Bora & Bener, Ayse & Turhan, Burak
2. WHAT IS SOFTWARE ANALYTICS?
- Analytics on software data for managers and software engineers, with the aim of empowering software development individuals and teams to gain and share insight from their data to make better decisions.
- From "Software Analytics: So What?":
  o Analytics must be real time and actionable: faster than the rate of change of effects within a project.
  o Analytics means sharing information: what software projects can learn from themselves and each other. Sharing models, sharing insights, sharing data, sharing methods.
  o A model that predicts defects in Internet Explorer may not be able to predict defects in Firefox (Zimmermann, 2009).

Figure 1. The frequency of analytics questions. A small number of questions are very frequent and should therefore be supported by tools (orange region), whereas the long tail of questions that are more unique and asked less frequently should be addressed by data scientists (blue). As the analytics domain matures, we expect the orange area to grow because tools will become more powerful and cheaper to develop.

3. PROCEDURE
- In-depth interviews with 12 practitioners.
- Three case studies.

Figure 2. Demographic information about stakeholder participants. Participants vary in their roles in software development teams and in experience. Two-thirds of the participants are computer engineers, half with higher degrees (MSs or PhDs).

4. INTERVIEW PROTOCOL
- Two different sets of questions.
- Attention to question order and the main themes.
- A common understanding about the usage of prediction models.

5. DATA COLLECTION
- Sent the questions to respondents with a cover letter.
- Held one-on-one interviews with the respondents.
- Recorded all responses anonymously.
- Read all responses and combined them based on the themes covered in the questions.
- Combined the results of the three projects.
6. CASES

6.1 Effort Estimation Project
- Goal: estimate the effort for a new project (in man-months) using a prediction model.
- Inputs:
  o A list of project metrics that were periodically collected from past projects.
  o A questionnaire.
- The effort estimation model was able to predict overall effort in 75 percent of projects with an error rate of less than 25 percent.

6.1.1 Problems
- 6/6 respondents agreed that organizational changes before the deployment disrupted the project.
- The problems fell into three areas:
  o Organizational
  o Data extraction
  o Modeling issues
- 4/6 respondents believed that model performance played an important role in the deployment failure.
- 2/6 respondents were not satisfied with the model's input metrics.
- 1/6 respondents blamed the lack of a survey culture in the company.
- 6/6 respondents agreed that data collection was a major challenge.
- 5/6 respondents found the model's output insufficient.

6.1.2 Ideas for Improvement
- Estimate each requirement's effort and each employee's capabilities.
- Increase the information content of the model's output.

6.1.3 Indirect Benefits
- Software measurement.
- Process improvement.
- Experience in software analytics.
- Transfer of this experience to new projects.

6.2 Defect Prediction Projects
- Goal: software measurement and defect prediction.
- Inputs:
  o A tool implemented to collect static code metrics from the source files of past and current projects.
  o The files of past projects, classified as defective or defect-free.
- Evaluated on 10 releases of nine projects in terms of defect-detection effectiveness.

Problems
- Issue-commit matching was a major challenge.
- Fast development cycles and market pressure often prevented the company from making time-consuming process changes, and senior management couldn't force the development team to follow certain policies.
- The model didn't receive sufficient and accurate data.
- Defect prediction models lacked actionable outputs.
- Release schedules were very tight and chaotic, so the development team wasn't eager to implement additional tasks.
- 5/6 respondents found the binary classification (defect-prone/defect-free) insufficient.

Ideas for Improvement
- Additional information about defect causes, such as the phase in which a defect was introduced, its category, and its severity level.
- Actionable recommendations related to a specific defect.
- The model should be easily integrable as a plug-in.

Indirect Benefits
- Increased awareness of problems and limitations in the company's software processes.
- Identified weaknesses in company processes.

7. FUTURE ROAD MAP
Figure 3. Main challenges.
- A tool: Dione
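Section 6.1's performance figure (overall effort predicted within a 25 percent error rate for 75 percent of projects) is the criterion usually written as PRED(25) in the effort estimation literature, built on the magnitude of relative error (MRE). A minimal sketch of the computation; the effort values below are hypothetical and not data from the study:

```python
def mre(actual, predicted):
    """Magnitude of relative error: |actual - predicted| / actual."""
    return abs(actual - predicted) / actual

def pred(level, actuals, predictions):
    """PRED(level): fraction of estimates whose MRE is below `level`."""
    hits = sum(1 for a, p in zip(actuals, predictions)
               if mre(a, p) < level)
    return hits / len(actuals)

# Hypothetical efforts in man-months (illustration only).
actual    = [10.0, 24.0, 8.0, 40.0]
predicted = [11.0, 28.0, 8.5, 70.0]

# Three of the four estimates fall within 25 percent of the actual
# effort, so PRED(0.25) = 0.75, matching the 75-percent criterion.
print(pred(0.25, actual, predicted))  # -> 0.75
```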
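The classification step in section 6.2, labeling source files as defective or defect-free from static code metrics, is often done with a Gaussian Naive Bayes learner in this line of research. The sketch below is an assumption-laden illustration, not the study's actual tool or model: the metric names (`lines_of_code`, `cyclomatic_complexity`) and all values are hypothetical.

```python
import math

def fit(rows, labels):
    """Estimate per-class priors and per-metric (mean, variance).
    rows: list of metric vectors; labels: class name per row."""
    stats = {}
    for cls in set(labels):
        members = [r for r, l in zip(rows, labels) if l == cls]
        prior = len(members) / len(labels)
        params = []
        for col in zip(*members):  # one column per metric
            mean = sum(col) / len(col)
            var = sum((x - mean) ** 2 for x in col) / len(col) + 1e-9
            params.append((mean, var))
        stats[cls] = (prior, params)
    return stats

def predict(stats, row):
    """Return the class with the highest Gaussian log-likelihood."""
    best, best_lp = None, -math.inf
    for cls, (prior, params) in stats.items():
        lp = math.log(prior)
        for x, (mean, var) in zip(row, params):
            lp += -0.5 * math.log(2 * math.pi * var) \
                  - (x - mean) ** 2 / (2 * var)
        if lp > best_lp:
            best, best_lp = cls, lp
    return best

# Hypothetical training files: [lines_of_code, cyclomatic_complexity].
X = [[120, 4], [800, 25], [90, 3], [950, 30], [200, 6], [700, 22]]
y = ['defect-free', 'defective', 'defect-free', 'defective',
     'defect-free', 'defective']

model = fit(X, y)
print(predict(model, [850, 28]))  # close to the defective profile
```

Binary output like this is exactly what 5/6 respondents found insufficient; extending the model toward defect categories and severity levels, as the respondents suggested, would require richer labels than this two-class setup provides.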