
Ailments of Distributed Document Reviews and Remedies of DOCTOR (DOCument Tree ORganizer Tool) with Distributed Reviews support

Krishnamurthy T.V, Nokia Siemens Networks, [email protected]
Sapna Subramani, Nokia Siemens Networks, [email protected]

Abstract

In distributed software development, effective reviews of analysis documents lead to increased correctness of the analysis results. When managed without tool support, such reviews use time and effort ineffectively and can even delay the project. Going ahead without completed reviews, on the other hand, risks incorrect or incomplete features.

In this paper we contrast the manual process we used to follow for organizing distributed software reviews with the process supported by DOCTOR, to show how the tool greatly improves distributed software reviews. We also attempt to project the cost of the effort saved and the resulting increase in productivity, and we compare DOCTOR with other web-based tools, presenting its advantages and disadvantages. We hope this paper convinces readers who have not yet adopted tool-based support for distributed reviews of its clear advantages.

1. Introduction.

Software development has gone truly global. More and more projects are being developed at different sites spanning different countries. The factors leading to this development have already been well researched [1, 2]. Even more interestingly and significantly, almost all sites now participate in all phases of the project, not just in construction and testing. Involvement of all sites during the early phases of a project has been noted to be a critical success factor for global software development, in the form of relocating off-shore site members to the "central" team [3]. By extension, even when relocation is not possible, distributed reviews of early-phase outputs in which all sites participate should, at a minimum, also count as a critical factor for success.

Most of the time, the outputs of these early-phase or upstream tasks tend to be documents. Though modern agile gurus advocate movement towards more collaborative forms of content management, using wikis for example [9], big organizations still tend to produce and work with documents as the outputs of the upstream phases, be they requirements analyses or design documents. That efficient content management systems need to be in place to provide consistent, uniform access to these products across multiple sites has already been researched [4].

In our organization, with distributed development happening on a large scale, it was imperative that these reviews were planned and executed on time and achieved a healthy review efficiency rate, so that subsequent phases proceeded smoothly without many defects. Different reviewers at different sites brought their own special perspectives to the review: architects with system knowledge, systems engineers with a customer feature focus, and subsystem owners with specific component knowledge. This ensured that the review as a whole tended to check the phase output from a complete system perspective.

The objective of this paper is to share our industry experience of performing distributed document reviews with the explicit review support provided by our content management tool, DOCTOR, contrasting the improvement gained vis-à-vis the manual process we followed earlier.

In Section 2 of this paper, we introduce the manual review process and highlight the difficulties we faced with it. In Section 3 we explain how the document review support of DOCTOR helped mitigate those difficulties to a great extent. In Section 4 we examine how our proprietary tool compares with some other well researched tools [5, 6]. Finally, we conclude the paper in Section 5.


2. Introduction to our Manual Review process and its ailments.

The following table gives the complete document review process that we follow. The process is compliant with the PPP:D process for Radio Access Networks at Siemens.

Table 1. Review process in detail

Step  Activity
 1    Announce the document in the Document Management Tool.
 2    Plan the review period for the document.
 3    Plan the reviewers for the document.
 4    Get commitment from the reviewers for the planned dates.
 5    Prepare the document in MS-Word and upload it to the Document Management Tool (AFI¹).
 6    Prepare the review template for the document in MS-Word.
 7    Circulate the document and the review template for the review.
 8    Obtain the individual review comments from the reviewers.
 9    Combine the different review comments and prepare the consolidated review report document (with replies) in MS-Word, with all statistics updated.
10    Incorporate the comments into the document.
11    Circulate the updated document with the consolidated review report for approval of the review replies (Pre-IUS²).
12    Store the document for freeze, along with the updated review report, in the Document Management Tool (IUS³).

Note: Steps 2 to 4 are iterative in nature, as reviewer commitment is needed to complete the plan. Similarly, steps 7 to 11 may need repeating till the review is satisfactorily completed.

Though we used DOCTOR for document management, it supported only steps 1 to 5 and step 12 of the above process.

¹ AFI – Available For Inspection

² Pre-IUS – ready to be Inspected, Updated and Stored

³ IUS – Inspected, Updated and Stored
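
Taken together, the three footnoted milestones define a small document lifecycle: a document is announced, becomes Available For Inspection, cycles through review and update until it is accepted, and is finally frozen. The paper does not describe how DOCTOR represents this internally; the following Python sketch is only a hypothetical illustration of such a review state machine (the names ReviewState, TRANSITIONS and advance are ours, not DOCTOR's).

    from enum import Enum

    class ReviewState(Enum):
        ANNOUNCED = "Announced"                   # step 1: announced in the tool
        AFI = "Available For Inspection"          # step 5: uploaded, review can start
        PRE_IUS = "Pre-IUS"                       # step 11: updated, awaiting approval
        IUS = "Inspected, Updated and Stored"     # step 12: frozen

    # Legal transitions; steps 7 to 11 may loop, so Pre-IUS can fall back to AFI.
    TRANSITIONS = {
        ReviewState.ANNOUNCED: {ReviewState.AFI},
        ReviewState.AFI: {ReviewState.PRE_IUS},
        ReviewState.PRE_IUS: {ReviewState.AFI, ReviewState.IUS},
        ReviewState.IUS: set(),
    }

    def advance(current: ReviewState, target: ReviewState) -> ReviewState:
        """Move a document to its next milestone, rejecting illegal jumps."""
        if target not in TRANSITIONS[current]:
            raise ValueError(f"illegal transition {current.name} -> {target.name}")
        return target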

Thus there were many parts of the process where the work was done manually and hence tended to be erroneous or tedious, as noted in the following ailments.

Ailment 1 - Review Template preparation:-

The review template had to be filled in from a base template, duplicating much of the data already entered when the document was announced, and hence left a lot of room for mismatched entries.

Ailment 2 - Manual circulation of the Document and review template:-

Once the document is ready, it has to be circulated by mail, together with the review template, to all the reviewers.

Ailment 3 - Entering the Review Comments in template:-

Each review comment required the correct comment type (the severity of the comment) to be selected, and sometimes invalid values were entered. The reviewer also had to keep track of the time spent on the review over multiple sessions and update it in the template before sending it to the review organizer.

Ailment 4 - Getting substitutes to review the document:-

If the planned reviewer cannot complete the review, he or she has to manually forward the mail and the related information to a substitute, who then reviews the document.

Ailment 5 - Inability to find out review progress till the announced review end date:-

Until the comments are called in at the review end date, the organizer and the approver do not get to know the current progress of the review, nor whether any of the reviewers are delayed.

Ailment 6 - Review comments consolidation and statistics update:-

Multiple review templates have to be manually combined, with replies on acceptance status, and overall statistics have to be generated for each severity type on the following parameters:

1. Number of comments received
2. Number of comments rejected
3. Number of comments accepted

Also, the total time spent by the reviewers and the total time spent by the review organizer working on the review comments and the document need to be recorded; a sketch of this consolidation follows.
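
To make the bookkeeping of this step concrete, the sketch below shows a minimal Python version of the consolidation. It is our own illustration, not the template's actual schema: the field names (reviewer, severity, accepted) and the severity values are assumptions.

    from dataclasses import dataclass

    @dataclass
    class Comment:
        reviewer: str
        severity: str                  # assumed values, e.g. "major", "minor", "editorial"
        text: str
        accepted: bool | None = None   # None until the document author replies

    def consolidate(templates: list[list[Comment]]) -> dict:
        """Merge per-reviewer comment lists and compute per-severity statistics."""
        merged = [c for template in templates for c in template]
        stats = {}
        for sev in sorted({c.severity for c in merged}):
            group = [c for c in merged if c.severity == sev]
            stats[sev] = {
                "received": len(group),
                "accepted": sum(c.accepted is True for c in group),
                "rejected": sum(c.accepted is False for c in group),
            }
        return stats

Doing exactly this by hand across several MS-Word templates is the error-prone clerical work that tool support removes.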

Most of these ailments were due to the manual processes. The inherent nature of distributed reviews meant added overhead and decreased efficiency in the overall flow.


We shall see in the next section how these ailments were overcome with DOCTOR.

3. Introduction to DOCTOR and its cures for the Distributed Review.

DOCTOR (DOCument Tree ORganizer) is a proprietary document configuration management tool from Siemens, based on Clearcase. Apart from supporting the usual lifecycle features of document announcing, planning, tracking, and versioning, DOCTOR also provides document and code review support through web interfaces. Our focus here is DOCTOR's document review process and the support it provides to cure the ailments of distributed document reviews explained in the previous section.

DOCTOR supports both online and offline reviews. Online reviews are completed by entering comments directly in the web interface, while offline comments are entered in a supported spreadsheet and imported into DOCTOR. We are interested in review by commentary, as that is the mode we most often used for document reviews. As explained below, once the document is available for review and the reviewers and the review planning are completed in DOCTOR, the following can be done.

Cure 1 – Review start and automatic mail notification:-

The reviewer can start the review in DOCTOR by pressing the Start button as shown below, specifying the review type and the review deadline. DOCTOR sends an automatic mail notification to all review participants, with all the information necessary for the review.

Figure 1. Starting the review process

Cure 2 – Reviewer can forward the review to a substitute reviewer:-

Once the review is started, a reviewer can forward his review responsibility to a substitute, who can then enter the comments online.

Cure 3 – Reviewers can enter comments online:-

Once the review is started, a reviewer can enter his comments online in the web interface. The comment severity is a controlled field, and the comment location is also entered. Once comments are saved, they are visible to everyone.

Figure 2. Web interface for comments
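
Because severity is a controlled field, the invalid entries of Ailment 3 can no longer occur. Below is a hypothetical sketch of such server-side validation; the allowed severity values and the function name are our assumptions, since the paper does not list DOCTOR's actual ones.

    ALLOWED_SEVERITIES = ("major", "minor", "question", "editorial")   # assumed values

    def save_comment(reviewer: str, location: str, severity: str, text: str) -> dict:
        """Validate and store one review comment; free-text severities are rejected."""
        if severity not in ALLOWED_SEVERITIES:
            raise ValueError(f"severity must be one of {ALLOWED_SEVERITIES}")
        comment = {"reviewer": reviewer, "location": location,
                   "severity": severity, "text": text}
        # In the real tool the saved comment would be persisted and become
        # immediately visible to all participants (see Cure 6).
        return comment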

Cure 4 – DOCTOR keeps track of total review time:-

Once the review is completed, the reviewer can indicate this by checking the corresponding checkbox. DOCTOR then requests the information needed to complete the review and suggests the time spent, based on how long the web interface was active.

Figure 3. Total review time
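
The paper does not say how DOCTOR measures the time the web interface was active; one plausible mechanism is to accumulate session time on the server, roughly as in this sketch (the class and method names are ours).

    import time

    class ReviewTimer:
        """Accumulate active session time so the reviewer need not track it manually."""

        def __init__(self) -> None:
            self.total_seconds = 0.0
            self._started_at = None

        def session_start(self) -> None:        # reviewer opens the review page
            self._started_at = time.monotonic()

        def session_end(self) -> None:          # page is closed or the session times out
            if self._started_at is not None:
                self.total_seconds += time.monotonic() - self._started_at
                self._started_at = None

        def suggested_hours(self) -> float:     # pre-filled when the reviewer finishes
            return round(self.total_seconds / 3600, 2)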

Cure 5 – DOCTOR shows an overview of review progress at all times:-

When the info button is clicked (see Fig. 1), DOCTOR shows the status of the review in terms of how many reviewers have started the review, how many comments have been entered, and so on. This makes it easy to detect a review that is really not progressing as expected, and helps in taking alternate measures or sending further reminders through DOCTOR.

Figure 4. Review progress
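
As an illustration only (the paper does not list DOCTOR's actual status fields), a progress summary over the saved comments of the Cure 3 sketch might be computed as follows; the reviewers who have not yet entered anything are the ones to remind.

    def review_progress(planned_reviewers: list[str], comments: list[dict]) -> dict:
        """Summarize who has started the review and how many comments exist."""
        started = {c["reviewer"] for c in comments if c["reviewer"] in planned_reviewers}
        return {
            "reviewers_started": f"{len(started)} of {len(planned_reviewers)}",
            "comments_entered": len(comments),
            "not_yet_started": sorted(set(planned_reviewers) - started),
        }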

Cure 6 – DOCTOR allows the document author and reviewers to see comments already entered:-

Once comments are saved, DOCTOR allows reviewers and the document author to view them even while the review is not yet completed. This has the following advantages:

1. A reviewer who starts the review slightly late can check the existing comments and tune which sections to focus on and which comments to provide in addition to what his role already entails.

2. The document author gets a preview of the comments coming her way and can start preparing the replies, as well as checking the updates the document needs. In case a comment unearths a major deficiency, it can really help the author to know of the comment well ahead of the deadline.

Cure 7 – DOCTOR allows the document author to respond to comments online:-

Once a reviewer has completed her review, the document author can start responding to the comments online in the web interface and indicate her acceptance of each comment.


In case a comment is rejected, the reason for rejection should be entered in DOCTOR. This also allows the document author to start updating the document for the accepted comments.

Cure 8 – DOCTOR generates the Consolidated Review Protocol summary automatically on closure:-

Once all comments have been replied to satisfactorily and the Pre-IUS document has been accepted by all reviewers, DOCTOR allows the document approver to close the review online. DOCTOR then automatically generates the summary review protocol with all required statistics and stores the generated Word document in Clearcase.

Thus the review support provided by DOCTOR greatly improves the situation of distributed document reviews. The tool requires administrative support of 0.5 FTE (Full Time Equivalent) to support about 2000 users. Though it is tough to put an exact figure on the effort saved, in practice, for a decent-sized document of about 150 pages that receives around 100 comments (from about 10 people), around 10 to 15 hours of the document author's effort have been found to be saved. This is only an indication of the direct benefit of using such tools, as the indirect benefits of improved visibility of the review progress and additional efficiency of the review process are not easily quantifiable and need further study.
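
Connecting Cure 8 back to Ailment 6: once statistics like those of the consolidate() sketch above are available, generating the closing protocol is little more than serialization. A hypothetical sketch follows; DOCTOR's actual protocol format is not given in the paper.

    def render_protocol(doc_name: str, stats: dict,
                        reviewer_hours: float, organizer_hours: float) -> str:
        """Render a plain-text review protocol summary from consolidated statistics."""
        lines = [f"Review protocol for: {doc_name}", ""]
        for severity, s in sorted(stats.items()):
            lines.append(f"{severity:>10}: received={s['received']:3d}  "
                         f"accepted={s['accepted']:3d}  rejected={s['rejected']:3d}")
        lines += ["",
                  f"Total reviewer effort:  {reviewer_hours:.1f} h",
                  f"Total organizer effort: {organizer_hours:.1f} h"]
        return "\n".join(lines)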

4. DOCTOR compared to other Tools.

Whilst DOCTOR is a proprietary tool from Siemens, there are other well researched, Internet-based tools such as IBIS [5] and XATI [6]. In this section we compare these tools to get an idea of how DOCTOR shapes up against them.

IBIS (Internet Based Inspection System) is a web-based inspection application that uses XML for its structured persistent data and XSL for data access. The main idea behind IBIS is its stress on individual discovery steps, with special emphasis on the discrimination phase, where comments are consolidated and checked for validity. IBIS moves away from a real discussion-based review process and hence is suitable for reviews where teams are distributed or at least cannot gather in the same place. IBIS is now available on SourceForge [7].

XATI (XML Annotation Tool for Inspection) is also a web-based inspection tool; it accesses the documents under inspection as XML content through the DOM (Document Object Model). XATI is built upon the Mozilla environment as a generic application platform and uses Mozilla for viewing different document types. In addition, as it takes a node-based approach to the content of a document, XATI can attach additional information to nodes that are critically commented, and can hence provide ways for the author to address issues with such nodes in more detail.

In this context DOCTOR is different in that it is basically a document configuration management tool based on Clearcase that also supports a review process similar in style to IBIS, using a web-based interface. Like IBIS, it is configurable for the review item using a template and hence can be used to review documents as well as code. Unlike the other tools, DOCTOR also has options for importing comments from offline sources, such as spreadsheets. Also, being part of an industrial production environment, DOCTOR has support for interfacing with many other tools and systems internal to Siemens.

The following table summarizes the key points of comparison of DOCTOR with the other tools. For a more detailed comparison, please refer to the comparison table in IBIS [5].

Table 2. DOCTOR feature comparison

Feature                                        DOCTOR            IBIS        XATI
Defect classification, web-based interface,
  data classification and measurement          Yes               Yes         Yes
Any document                                   Yes               Yes         No
Weighted evaluation of high-priority defects   No                No          Yes
Clearcase dependency                           Yes               No          No
Support information                            Meeting / ad hoc  Checklists  Checklists

5. Conclusion and summary

Distributed software development is in itself a complicated endeavor, and distributed tasks have been shown to take up to 2.5 times longer to complete than the same tasks done by co-located personnel [8]. Hence it is essential that important activities like distributed document reviews be handled in a manner that is as efficient as possible.


In our practice we have found that performing distributed document reviews using DOCTOR contributes significantly to the overall quality of the review. It reduces manual work for both the document owner and the reviewers and hence results in a significant effort reduction for the review process. It also helps to identify and track a review that is getting delayed and to take corrective measures earlier. Even if the effort reduction does not seem significant immediately, over time the ease of reviewing with the tool and the attitudinal changes towards doing reviews help the project turn in significantly better quality documents, with better defect identification at lower effort. Thus there is a strong motivation for doing distributed reviews, and in particular document reviews, using a review process that has open, tool-based support.

6. References

[1] Carmel, E., Global Software Teams: Collaborating across Borders and Time Zones, Prentice Hall, Upper Saddle River, NJ, 1999.

[2] Herbsleb, J., Paulish, D., and Bass, M., "Global Software Development at Siemens: Experience from Nine Projects", Proceedings of the 27th International Conference on Software Engineering, St. Louis, MO, 2005.

[3] Sangwan, R., Bass, M., Mullick, N., Paulish, D., and Kazmeier, J., Global Software Development Handbook, Auerbach Publications, 2007.

[4] Gersmann, S., "Development of Strategies for Global Software Development", Master's Thesis, Technische Universität München and Siemens Corporate Research, 2005.

[5] Lanubile, F., Mallardo, T., and Calefato, F., "Tool Support for Geographically Dispersed Inspection Teams", Software Process Improvement and Practice, 2003; 8: 217-231 (DOI: 10.1002/spip.184).

[6] Hedberg, H., and Harjumaa, L., "Virtual Software Inspections for Distributed Software Engineering Projects", Department of Information Processing Science, University of Oulu.

[7] Lanubile, F., and Mallardo, T., "IBIS - Internet Based Inspection System", https://sourceforge.net/projects/ibis/

[8] Herbsleb, J.D., and Mockus, A., "An Empirical Study of Speed and Communication in Globally Distributed Software Development", IEEE Transactions on Software Engineering, Vol. 29, No. 6, June 2003.

[9] Larman, C., "Agile Requirements", retrieved from www.craiglarman.com on 29 February 2008, http://www.craiglarman.com/wiki/index.php?title=Agile_Requirements
