Proceedings of the 20th Ibero-American Conference on Software Engineering, Experimental Software Engineering (ESELAW) Track
Models play an important role in Software and Systems Engineering processes. Reviews are well-established methods for model quality assurance that support early and efficient defect detection. However, traditional document-based review processes are limited in the number of experts, the resources, and the document size to which they can be applied. [Objective] In this paper, we introduce a distributed and scalable review process for model quality assurance to (a) improve defect detection effectiveness and (b) increase review artifact coverage. [Method] We introduce the novel concept of Expected Model Elements (EMEs) as a key concept for defect detection; EMEs can be used to drive the review process. We adapt a best-practice review process to distinguish between (a) the identification of EMEs in the reference document and (b) the use of EMEs to detect defects in the model. We design and evaluate the adapted review process with a crowdsourcing tool in a feasibility study. [Results] The study results show the feasibility of the adapted review process. Furthermore, inspectors using the adapted review process achieved defect detection effectiveness comparable to that of inspectors using a traditional inspection process, and better defect detection efficiency. [Conclusions] Although the study shows promising results for the novel inspection process, future investigations should consider larger and more diverse review artifacts and the effect of the limited scope of artifact coverage for an individual inspector.