Chapter 1
Introduction
Software testing and maintenance are the two most expensive phases of the software life cycle: as much as 70% of the cost of an average software system over its lifetime is estimated to go to these two tasks. Why is this so? And why, in general, are customers not satisfied with the quality of software? Are there techniques and tools that can help us reduce development cost while also improving productivity and quality? Yes. The xSuds Software Visualization and Test Toolkit, developed at Cleanscape, is just such a solution. The Toolsuite emphasizes dynamic behavior and uses software visualization and heuristic guidance to solve software problems.
1.1 The Purpose of This Manual
This manual covers the use of the Toolsuite and has two parts. PART I explains the basic ideas behind coverage testing, how ATAC and xATAC work, how to invoke the various features of each, and how one might use ATAC or xATAC to test a program. PART II looks at the other tools: xRegress, a tool for effective regression testing; xVue, a tool for effective software maintenance; xSlice, a tool for dynamic program debugging; xProf, a tool for detailed performance analysis; xFind, a tool for transitive pattern recognition; and xDiff, a tool for displaying program differences.
1.2 The Contents of This Manual
In addition to this Introduction, this manual comprises fifteen other chapters, two appendices and an index:
PART I
- Chapter 2, ATAC: A Tutorial, describes how ATAC might be used to test a simple program;
- Chapter 3, ATAC: Overview, explains the basic ideas behind coverage testing and describes how ATAC and xATAC work;
- Chapter 4, ATAC: Setting Up Your Execution Environment, tells you how to modify your execution environment in order to use ATAC and xATAC;
- Chapter 5, ATAC: Instrumenting Your Software, describes how to instrument a program using the ATAC compiler;
- Chapter 6, ATAC: Executing Software Tests, describes how to manipulate the trace file and identifies problems that might occur during test execution;
- Chapter 7, ATAC: Managing Your Test Cases, describes how to manage the contents of an execution trace file;
- Chapter 8, ATAC: Generating Summary Reports, describes how to generate a report summarizing the current level of code coverage;
- Chapter 9, ATAC: Displaying Uncovered Code, describes how to display source code that has not yet been covered;
- Chapter 10, ATAC: Testing Modified Code, describes how to find code that has been modified from one release to the next, in order to facilitate test modification;
PART II
- Chapter 11, xRegress: A Tool for Effective Regression Testing, describes the tool used to identify a representative subset of tests to revalidate modified software;
- Chapter 12, xVue: A Tool for Effective Software Maintenance, describes how to use the tool that locates where features are implemented;
- Chapter 13, xSlice: A Tool for Program Debugging, describes how to use the dynamic program debugging tool;
- Chapter 14, xProf: A Tool for Detailed Performance Analysis, describes how to use the tool that identifies poorly performing parts of code;
- Chapter 15, xFind: A Tool for Transitive Pattern Recognition, describes the tool used to assist in identifying pieces of code that are related to one another in a thematic way;
- Chapter 16, xDiff: A Tool for Displaying Program Differences, describes how to use the tool that graphically displays differences between files;
APPENDIX
- Appendix A, Platform Specific Information, contains platform-specific details and the complete source code listing for the wordcount example program used throughout this manual.
1.3 How to Use This Manual
This manual contains both background material and reference material. The former explains the basic ideas behind coverage testing and describes how the various components work. This is what you read if you want to find out ``what a coverage tool is good for'' or ``what ATAC is all about.'' The latter describes how to analyze a program using the various tools.
When you are ready to instrument your code, refer to Chapter 5, ATAC: Instrumenting Your Software. If you want to use ATAC without reading this manual in its entirety, read Chapter 2, ATAC: A Tutorial, working through the example as you go, and turn to the other chapters, most likely Chapter 3, ATAC: Overview, and the relevant sections of Chapter 4, ATAC: Setting Up Your Execution Environment, only as necessary. If you are a software manager, you may only need to read Chapter 3. Looking through the example provided for each tool (Chapters 2 and 11 through 16) is useful for bringing all the details together and seeing how the various tools are used in testing software.
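As a foretaste of Chapter 5, instrumentation generally amounts to placing the ATAC compiler driver in front of your usual compile command, so that compilation produces an instrumented executable along with the .atac files read by the other tools. The command below, using the wordcount example, is only a sketch; refer to Chapter 5 for the authoritative syntax and options:

    prompt:> atac cc -o wordcount main.c wc.c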
1.3.1 About Examples
Throughout this manual, descriptive examples have been used to illustrate what is discussed and whenever possible real output has been incorporated. Commands input by the user are preceded by:
prompt:>
to assist the user in distinguishing inputs from outputs.
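For example, listing the source files of the wordcount program described below might appear as follows (the session shown is purely illustrative):

    prompt:> ls
    main.c  wc.c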
Most of the examples in this manual originate from using the various components of the Toolsuite to test wordcount, a small program consisting of two source files, wc.c and main.c (and its variants), which counts the number of characters, words and/or lines in its input. A complete source code listing appears in Appendix A, Platform Specific Information, and specific examples appear in the chapters describing each tool.
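As a purely illustrative sketch of what wordcount does (this simplified fragment is not the Appendix A source), the counting loop at the heart of such a program might resemble:

    #include <stdio.h>

    /* Illustrative sketch: tally characters, words and lines
       read from standard input, in the manner of wordcount. */
    int main(void)
    {
        long chars = 0, words = 0, lines = 0;
        int c, in_word = 0;

        while ((c = getchar()) != EOF) {
            chars++;
            if (c == '\n')
                lines++;
            if (c == ' ' || c == '\t' || c == '\n')
                in_word = 0;          /* whitespace ends a word */
            else if (!in_word) {
                in_word = 1;          /* first character starts a new word */
                words++;
            }
        }
        printf("%ld %ld %ld\n", lines, words, chars);
        return 0;
    }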
1.3.2 Type Conventions
All text that represents input to or output from programs in the surrounding computing environment appears in a constant-width font. Environment variables appear in ALL_CAPS. The names of executable programs, source code files, references to files created by the tools (.atac and .trace files), symbols, command-line options, and significant terminology (at first usage) appear in italics, as does descriptive text representative of the actual words or phrases that are to appear. For example, filename is representative of any file name that might be referenced. Representations of interface displays are as faithful to the color screen displays as possible. Widget labels (buttons and pull-down menu items) are in italics and ``quoted''. Finally, some insets and figures are annotated with descriptive comments or tags that may be referred to later in this manual; the presence of these annotations and the points to which they refer are indicated by arrows.
1.4 Other Sources of Information
Additional information concerning ATAC may be found in:
- J. R. Horgan and S. London, ``Data Flow Coverage and the C Language,'' in Proceedings of the Fourth Symposium on Software Testing, Analysis, and Verification, pp 87-97, Victoria, British Columbia, Canada, October 1991.
- J. R. Horgan and S. London, ``ATAC: A Data Flow Coverage Testing Tool for C,'' in Proceedings of Symposium on Assessment of Quality Software Development Tools, pp 2-10, New Orleans, LA, May 1992.
More information and an explanation of the ideas and terminology underlying coverage testing may also be found in:
- R. A. DeMillo, R. J. Lipton and F. G. Sayward, ``Hints on Test Data Selection: Help for the Practicing Programmer,'' IEEE Computer, 11(4), April 1978.
- J. R. Horgan and A. P. Mathur, ``Assessing Testing Tools in Research and Education,'' IEEE Software, 9(3), May 1992.
- J. R. Horgan, S. London and M. R. Lyu, ``Achieving Software Quality with Testing Coverage Measures,'' IEEE Computer, 27(9), September 1994.
- H. Agrawal, ``Dominators, Super Blocks, and Program Coverage,'' in Proceedings of the 21st ACM SIGPLAN-SIGACT Symposium on Principles of Programming Languages, pp 25-34, Portland, Oregon, January 1994.
Information regarding other tools providing automated support for testing may be found in:
- K. Berczik, ``Release 5.2 MYNAH System Administration Guide,'' Issue 3, October 1997. Cleanscape Document 00750252005.
The value of coverage testing in detecting faults is explored in:
- W. E. Wong, J. R. Horgan, S. London and A. P. Mathur, ``Effect of Test Set Size and Block Coverage on Fault Detection Effectiveness,'' Software--Practice & Experience, 28(4):347-369, April 1998.
- W. E. Wong, J. R. Horgan, S. London and A. P. Mathur, ``Effect of Test Set Minimization on Fault Detection Effectiveness,'' in Proceedings of the 17th IEEE International Conference on Software Engineering, pp 230-238, Seattle, WA, April 1995.
- W. E. Wong, J. R. Horgan, A. P. Mathur and A. Pasquini, ``Test Set Size Minimization and Fault Detection Effectiveness: A Case Study in a Space Application,'' in Proceedings of the 21st IEEE International Computer Software and Application Conference, pp 522-528, Washington, D.C., August 1997.
- W. E. Wong, J. R. Horgan, S. London and H. Agrawal, ``A Study of Effective Regression Testing in Practice,'' in Proceedings of the 8th IEEE International Symposium on Software Reliability Engineering, pp 264-274, Albuquerque, New Mexico, November 1997.
- H. Agrawal, J. R. Horgan, S. London and W. E. Wong, ``Fault Localization using Execution Slices and Dataflow Tests,'' in Proceedings of the 6th IEEE International Symposium on Software Reliability Engineering, pp 143-151, Toulouse, France, October 1995.
- P. Piwowarski, M. Ohba and J. Caruso, ``Coverage Measurement Experience During Function Test,'' in Proceedings of the 15th IEEE International Conference on Software Engineering, pp 287-301, Baltimore, MD, May 1993.
Other related studies include:
- H. Agrawal and J. R. Horgan, ``Dynamic Program Slicing,'' in Proceedings of the ACM SIGPLAN'90 Conference on Programming Language Design and Implementation, pp 246-256, White Plains, NY, June 1990.
- H. Agrawal, J. R. Horgan, E. W. Krauser and S. London, ``Incremental Regression Testing,'' in Proceedings of the 1993 IEEE Conference on Software Maintenance, Montreal, Canada, September 1993.