World Academy of Science, Engineering and Technology 1 2005




A System for Performance Evaluation of Embedded Software

Yong-Yoon Cho, Jong-Bae Moon, and Young-Chul Kim


Manuscript received November 30, 2004. Yong-Yoon Cho (corresponding author; phone/fax: +82-02-824-3862; e-mail: sslabyycho@hotmail.com), Jong-Bae Moon (e-mail: comdoct@ss.ssu.ac.kr), and Young-Chul Kim (e-mail: yckim@ss.ssu.ac.kr) are with the Department of Computing, Soongsil University, Seoul 156743, Korea.

Abstract—Developers need to evaluate software's performance to make software efficient. This paper suggests a performance evaluation system for embedded software. The suggested system consists of a code analyzer, testing agents, a data analyzer, and a report viewer. The code analyzer inserts additional code, dependent on the target system, into the source code and compiles the source code. The testing agents execute the performance tests. The data analyzer translates raw-level result data into class-level APIs for the report viewer. The report viewer offers users graphical report views built with those APIs. We hope that the suggested tool will be useful for embedded software development, because developers can easily and intuitively analyze software's performance and resource utilization.

Keywords—Embedded Software, Performance Evaluation System, Testing Agents, Report Generator

                           I. INTRODUCTION

As embedded systems have become increasingly sophisticated and users' requirements for embedded software have grown more complicated, developing efficient embedded software under restricted resources has become much more difficult and important. Because an embedded system generally offers less computing resource than a general-purpose computer system does, embedded software that is poor in quality or performance wastes those scarce resources [3][4]. Developers want to improve the quality of their embedded software and make it perform consistently well in resource usage. To do this, they occasionally use an embedded software evaluation system to decrease time and increase efficiency in developing embedded software. The evaluation system is useful for developers because it tells them whether the developed software is efficiently optimized for the embedded system's restricted resources. By using an evaluation system, developers can execute embedded software on the target system during the development process, test its performance in advance, and learn what must be fixed to make it more efficient. But because the testing results are text-based strings, developers have to analyze the raw-level data to find the portions of the software they have to revise. This is often very tiresome and time-consuming work. The evaluation system also occasionally needs additional hardware that sits between the host computer and the target board and performs functions such as performance testing and result analysis. That may impose a heavy burden on developers, because they have to pay an additional cost and learn how to use the instrument.

In this paper, we suggest a graphic-based evaluation system to test and analyze embedded software's performance. Because the suggested evaluation system is pure software, without any additional hardware, developers do not have to spend time learning how to operate extra equipment. The suggested evaluation system includes a graphic report viewer, which reports various results and shows them graphically according to test items such as memory usage, code coverage, and function call counts. By using the viewer, developers can analyze software's performance at a glance and easily find what must be fixed to make the software more efficient. As a result, developers can improve the development efficiency of embedded software, because they have the opportunity to analyze its performance instantly and intuitively through the graphical report viewer.

The paper is organized as follows. Section II reviews work related to embedded software evaluation systems. Section III describes the proposed evaluation system and its graphic report viewer. Section IV presents a test and its results. Finally, Section V states our conclusions and summarizes our research.

                          II. RELATED WORKS

Telelogic's Tau TTCN Suite is a system for testing telecom and datacom equipment, ranging from built-in communication chips to huge switches and intelligent network services. It includes various tools such as a script editor, a compiler, and a simulator. However, it is not suitable for testing embedded software, because it is a test system for telecommunication vendors. It is also very expensive, because it mostly consists of additional hardware equipment for testing telecom systems.

AstonLinux's CodeMaker is an IDE (Integrated Development Environment) for developing Linux-based embedded software on Windows. It supports remote debugging and source-level debugging. However, it does not offer functions to test and analyze embedded software's performance, because it is only an IDE for a specific RTOS/chip vendor. It is also suited only to general-purpose target systems, because it uses Linux-based utilities.







TestQuest Pro is an automated test solution for embedded systems with sophisticated human interfaces. It is an automated test solution for virtually any application and device, ranging from cell phones, PDAs, and tablets to devices with embedded technologies, including wireless enterprise systems, smart appliances, and automobiles. It offers functions related to debugging and simulating embedded software's source code, but it does not provide information about the software's performance in resource usage.

Rational's TestRealTime is a target-based evaluation system for embedded software's performance. It provides various result views with which users can easily analyze software's performance. It can also execute various performance tests, ranging from memory usage, memory leaks, and CPU usage to code coverage. However, its result views are somewhat complicated, making it hard to understand the results' meaning at a glance.
             III. PROPOSED PERFORMANCE EVALUATION SYSTEM

In this paper, we suggest a system for testing embedded software's performance that consists of pure software, without additional hardware equipment, and offers various kinds of performance testing: memory, code coverage, code trace, and function performance [1]-[7]. The evaluation system offers users graphical report views with which they can easily and intuitively analyze the test results. Fig. 1 shows the proposed architecture of the performance evaluation system.

Fig. 1. A Proposed Architecture of Performance Evaluation System. (Host side: report viewer, report generator, result analyzer with result translator and result separator, code analyzer with instrumentor and cross-compiler, and host agent with test code handler and result handler; target side: target agent with testing controller, testing module, and result handler.)

In Fig. 1, the proposed evaluation system is composed of a GUI, host/target-side agents, a code analyzer, a result analyzer, and a report viewer. The code analyzer consists of an instrumentor, which inserts additional code into the source code, and a cross compiler, which creates a target-executable file from the source code. The evaluation system is a client/server model based on a host-target architecture. Because an embedded system offers insufficient memory and an inconvenient user interface, the suggested tool places an agent not only on the host side, to offer users a convenient GUI, but also on the target side, to execute the software's performance testing on the target board. The agents keep a communication connection to deliver source files and test results to each other. First, the host-side agent transfers the input source to the target-side agent through a serial cable or a wireless network. Then the target-side agent executes the testing process, gathers results from the test events, and sends them to its host-side counterpart. Finally, the host-side agent stores the raw-level results received from the target side in the result DB.
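To make this hand-off concrete, the following Java fragment sketches the host-side agent's role: ship the instrumented test code to the target-side agent and collect the raw-level result lines into the result DB. It is a minimal sketch under our own assumptions; the class name HostAgent, the TCP port 9000, and the length-prefixed framing are hypothetical, since the agents' wire protocol (which may equally run over a serial line) is not fixed here.

import java.io.*;
import java.net.Socket;
import java.nio.file.*;

/* Minimal sketch of the host-side agent: send instrumented test code
   to the target-side agent, then store the raw-level results. */
public class HostAgent {
    public static void main(String[] args) throws IOException {
        byte[] testCode = Files.readAllBytes(Paths.get(args[0]));
        try (Socket target = new Socket(args[1], 9000);      // hypothetical port
             DataOutputStream out = new DataOutputStream(target.getOutputStream());
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(target.getInputStream()))) {
            out.writeInt(testCode.length);                   // announce payload size
            out.write(testCode);                             // ship the instrumented code
            out.flush();
            try (PrintWriter db = new PrintWriter(new FileWriter("result.db"))) {
                String line;
                while ((line = in.readLine()) != null) {     // raw-level result lines
                    db.println(line);                        // store into the result DB
                }
            }
        }
    }
}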
Generally, embedded software must use minimal processor and memory resources [3][4]. To meet this requirement, the suggested tool tests software's performance for the four items described in Table I [5][6].

                                 TABLE I
                             PROFILING ITEMS

Testing Items              Function
Trace Profiling            Tracing software's runtime execution in a UML
                           sequence diagram
Memory Profiling           Profiling software's memory-related resource usage
Performance Profiling      Profiling a function or method's execution
                           performance
Code Coverage Profiling    Separating and profiling code blocks [8]

Through trace profiling, users can trace which functions are executed during the software's run and find which functions are called unnecessarily. The report viewer shows the trace profiling result as a UML sequence diagram [9][10]. Through memory profiling, users can obtain information about memory allocation and de-allocation, memory leaks, and code sections that use memory frequently. Users can use performance profiling to estimate how much time it takes to execute the whole or a part of the embedded software and to confirm whether it is optimized for the embedded system. Code coverage profiling gives users information about which code sections are used or unused, and which are executed frequently or infrequently. Users can make embedded software more efficient by using the information profiled for these four items. Usually the result created by profiling software consists of raw-level strings, which are difficult and tiresome for users to analyze.

The result analyzer classifies the raw-level results according to the items listed in Table I and converts them into a refined data type. It contains a result separator and a result translator. The result separator classifies the raw-level results into different data types according to the profiling items.







The result translator converts the classified results into API classes or XML files that the report generator can use to make a report view matching the user's requirements. The API or XML produced by the result analyzer is stored in the API DB.

Table II shows a part of the memory-related API class into which the raw-level result created through memory profiling is converted.

                                TABLE II
      A PART OF THE API CLASS FOR MEMORY-RELATED RAW-LEVEL RESULTS

public class DataConverter {
    private CInstrumentationTable instrumentations;

    public DataConverter(CInstrumentationTable instrumentations) {
        this.instrumentations = instrumentations;
    }

    public static IMemoryElement convertingMemoryElement(IEvent event) {
        /* Read the symbol from the raw-level result produced by the
           memory-profiling test event and convert it into a memory
           element. */
        return new MemoryElement();
    }
}

The report generator makes a graphical report view using the result APIs when users select one of the report views from the evaluation system's menu. The suggested evaluation system includes a report viewer through which users can see the selected report view.
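Since Table II shows only a fragment, the following sketch illustrates how the report generator might consume the converted API objects. The IMemoryElement accessors used here (address(), size(), freed()) are hypothetical stand-ins, not the class's published interface.

import java.util.List;

/* Hypothetical stand-in for the converted memory API element. */
interface IMemoryElement {
    long address();
    long size();
    boolean freed();
}

/* Minimal sketch: reduce converted memory elements to the figures the
   memory report view plots (allocated, freed, and leaked bytes). */
public class MemoryReportGenerator {
    public static String summarize(List<IMemoryElement> elements) {
        long allocated = 0, freed = 0;
        for (IMemoryElement e : elements) {
            allocated += e.size();
            if (e.freed()) freed += e.size();
        }
        return "allocated=" + allocated + "B, freed=" + freed
                + "B, leaked=" + (allocated - freed) + "B";
    }
}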
                    IV. TESTING AND RESULTS
The suggested evaluation system is implemented in Java, and its testing environment is an embedded board with a StrongARM chip running embedded Linux. To evaluate the performance of an example C source program, we use a calculator program written in C as the input source code. The source code is about 520 lines and consists of three modules. We test it against the four items listed in Table I and show graphical report views of the results through the report viewer.

Fig. 2 shows the evaluation system's initial screen for testing the calculator program.

Fig. 2. The suggested evaluation system's initial screen for testing

After cross-compiling by the code analyzer, the source code includes additional code that is dependent on the target system. The evaluation system then opens a connection to the target system and transfers the source code to it through the host/target agents. After testing on the target system, a raw-level test result in string form is output. Table III shows a part of the raw-level test result related to the memory performance of the test source code in Fig. 2.

                                TABLE III
                         A RAW-LEVEL TEST RESULT

index  % time  self   children  called     name
                                95000      func1 <cycle 1> [3]
[2]    0.0     0.00   0.00      95000      func2 <cycle 1> [2]
                                900        func1 <cycle 1> [3]
                                900        func2 <cycle 1> [2]
               0.00   0.00      1000/1000  main [13]
[3]    0.0     0.00   0.00      1900       func1 <cycle 1> [3]
                                95000      func2 <cycle 1> [2]

Index by function name:
[3] func1    [2] func2    [1] <cycle 1>

@ <Location>:(Called function + Called Location)[Instruction Location] +/- Address Size = Start
@ ./ex-n:(mtrace+0x169)[0x80484e9] + 0x804a378 0x12
@ ./ex-n:[0x8048596] - 0x804a378

But it is difficult for users to analyze the meaning of this output. So the evaluation system offers users graphical result views through the data analyzer, report generator, and report viewer. Fig. 3 shows a graphical report view that the report generator and report viewer build from the API into which the data analyzer translates the raw-level trace result after testing the source of Fig. 2.

Fig. 3. Trace profiling view for raw-level test result
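As a sketch of the first translation step behind such a view, the fragment below pulls (call count, function name) pairs out of gprof-style call-graph lines like those in Table III. The line format is assumed from that excerpt alone, so the regular expression is illustrative rather than the tool's actual parser.

import java.util.*;
import java.util.regex.*;

/* Minimal sketch: extract call counts per function from gprof-style
   call-graph lines such as "95000  func1 <cycle 1> [3]". */
public class TraceRecordParser {
    private static final Pattern ENTRY =
            Pattern.compile("(\\d+)\\s+(\\w+)\\s+<cycle \\d+> \\[(\\d+)\\]");

    public static List<String> parse(List<String> rawLines) {
        List<String> calls = new ArrayList<>();
        for (String line : rawLines) {
            Matcher m = ENTRY.matcher(line);
            if (m.find()) {                       // one call-graph entry
                calls.add(m.group(2) + " called " + m.group(1) + " times");
            }
        }
        return calls;
    }
}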

With the view shown in Fig. 3, we can see each function's execution time and its call order. So developers can find a function that takes a lot of time in its processing and can rewrite the source program to distribute the burden of work to other functions. We can then obtain other report views for the test result of the source in Fig. 2. Fig. 4 shows the memory report view for the memory performance of the test result.

In Fig. 4, through the memory report view we can see information about the software's memory usage, such as allocated memory, freed memory, and the maximum number of bytes in use at any one time.
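The leak information in such a view can be derived by pairing allocation and free records. The sketch below assumes the mtrace-style layout visible in Table III ("+ address size" for an allocation, "- address" for a free); addresses never freed are reported as leak candidates.

import java.util.*;
import java.util.regex.*;

/* Minimal sketch: match allocation (+) and free (-) records; whatever
   is never freed is a leak candidate for the memory report view. */
public class LeakFinder {
    private static final Pattern ALLOC =
            Pattern.compile("\\+ (0x\\p{XDigit}+) (0x\\p{XDigit}+)");
    private static final Pattern FREE =
            Pattern.compile("- (0x\\p{XDigit}+)");

    public static Map<String, String> findLeaks(List<String> records) {
        Map<String, String> live = new LinkedHashMap<>();  // address -> size
        for (String r : records) {
            Matcher a = ALLOC.matcher(r);
            Matcher f = FREE.matcher(r);
            if (a.find()) {
                live.put(a.group(1), a.group(2));          // remember allocation
            } else if (f.find()) {
                live.remove(f.group(1));                   // freed: not a leak
            }
        }
        return live;                                       // never-freed addresses
    }
}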







So developers can find the portion of the source code that causes memory waste and remove it.

Fig. 4. Memory report view

Fig. 5 shows the code coverage and performance report views. Through the code coverage report view in Fig. 5, we can know whether a given function was executed and obtain the execution rates of the code blocks within each function, which lets us balance the work among code blocks. With the performance report view, we can see each function's call count and execution time, so we can gauge each function's execution efficiency and tune it.

Fig. 5. Code coverage and performance report views
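The execution-rate figure in the code coverage view reduces to simple arithmetic over per-block hit counters, as the sketch below shows; how the instrumentor delimits the blocks is outside its scope and is assumed to happen upstream.

import java.util.Map;

/* Minimal sketch of the coverage arithmetic: the fraction of a
   function's code blocks that were executed at least once. */
public class CoverageRate {
    /* blockHits maps a block label to the number of times it ran. */
    public static double executionRate(Map<String, Integer> blockHits) {
        if (blockHits.isEmpty()) return 0.0;
        long executed = blockHits.values().stream()
                                 .filter(h -> h > 0).count();
        return (double) executed / blockHits.size();   // e.g. 0.75 = 75%
    }
}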

                            V. CONCLUSION

In this paper, we suggested a graphic-based system to easily evaluate embedded software's performance and to intuitively analyze the results through graphic report views. The suggested evaluation system adopts an agent-based client/server model in order to support a cross-platform development environment for embedded software. The system employs a data analyzer to refine the initial raw-level test results into an API of class form. Developers can easily reuse the API in various tools to evaluate embedded software's performance [12]. As an evaluation tool using the API, the suggested system offers developers a report viewer. In our test, we evaluated C source code and showed the results graphically through the report viewer. With the suggested tool, developers can clearly know what must be fixed in the software's source code and can improve the development efficiency of embedded software.

In the future, we will study methods for automating test cases and for developing an Internet-based testing system for embedded systems by expressing test data as XML documents [13][14].

                             REFERENCES

[1]  M. Roper, Software Testing, London: McGraw-Hill, 1994.
[2]  B. Beizer, Software Testing Techniques, 2nd ed., New York: Van Nostrand Reinhold, 1990.
[3]  B. Broekman and E. Notenboom, Testing Embedded Software, Addison-Wesley, Dec. 2002.
[4]  N. Stollon, R. Leatherman, and B. Ableidinger, "Multi-Core Embedded Debug for Structured ASIC Systems," Proc. DesignCon 2004, Feb. 2004.
[5]  D. B. Stewart and G. Arora, "A Tool for Analyzing and Fine Tuning the Real-Time Properties of an Embedded System," IEEE Trans. Software Eng., vol. 29, no. 4, pp. 311-326, April 2003.
[6]  I. Satoh, "A Testing Framework for Mobile Computing Software," IEEE Trans. Software Eng., vol. 29, no. 12, pp. 1112-1121, December 2003.
[7]  P. Anderson, T. W. Reps, and T. Teitelbaum, "Design and Implementation of a Fine-Grained Software Inspection Tool," IEEE Trans. Software Eng., vol. 29, no. 8, pp. 721-733, August 2003.
[8]  J. J. Chilenski and S. P. Miller, "Applicability of Modified Condition/Decision Coverage to Software Testing," Software Engineering Journal, vol. 9, no. 5, pp. 193-200, September 1994.
[9]  R. B. France, D.-K. Kim, S. Ghosh, and E. Song, "A UML-Based Pattern Specification Technique," IEEE Trans. Software Eng., vol. 30, no. 4, pp. 193-206, April 2004.
[10] L. Apvrille, J.-P. Courtiat, C. Lohr, and P. de Saqui-Sannes, "TURTLE: A Real-Time UML Profile Supported by a Formal Validation Toolkit," IEEE Trans. Software Eng., vol. 30, no. 7, pp. 473-487, July 2004.
[11] W. E. Howden, "Weak Mutation Testing and Completeness of Test Sets," IEEE Trans. Software Eng., vol. SE-8, no. 4, pp. 371-379, July 1982.
[12] B. Long, D. Hoffman, and P. A. Strooper, "Tool Support for Testing Concurrent Java Components," IEEE Trans. Software Eng., vol. 29, no. 6, pp. 555-566, June 2003.
[13] L. Morell, "A Theory of Fault-Based Testing," IEEE Trans. Software Eng., vol. 16, no. 8, pp. 844-857, August 1990.
[14] J. P. Kearns, C. J. Meier, and M. L. Soffa, "The Performance Evaluation of Control Implementations," IEEE Trans. Software Eng., vol. SE-8, no. 2, pp. 89-96, February 1982.

Mais conteúdo relacionado

Mais procurados

UVM BASED REUSABLE VERIFICATION IP FOR WISHBONE COMPLIANT SPI MASTER CORE
UVM BASED REUSABLE VERIFICATION IP FOR WISHBONE COMPLIANT SPI MASTER COREUVM BASED REUSABLE VERIFICATION IP FOR WISHBONE COMPLIANT SPI MASTER CORE
UVM BASED REUSABLE VERIFICATION IP FOR WISHBONE COMPLIANT SPI MASTER COREVLSICS Design
 
Component Based Testing Using Finite Automata
Component Based Testing Using Finite AutomataComponent Based Testing Using Finite Automata
Component Based Testing Using Finite AutomataSanjoy Kumar Das
 
SE2018_Lec-22_-Continuous-Integration-Tools
SE2018_Lec-22_-Continuous-Integration-ToolsSE2018_Lec-22_-Continuous-Integration-Tools
SE2018_Lec-22_-Continuous-Integration-ToolsAmr E. Mohamed
 
Engineering Software Products: 2. agile software engineering
Engineering Software Products: 2. agile software engineeringEngineering Software Products: 2. agile software engineering
Engineering Software Products: 2. agile software engineeringsoftware-engineering-book
 
An Analysis of Component-based Software Development -Maximize the reuse of ex...
An Analysis of Component-based Software Development -Maximize the reuse of ex...An Analysis of Component-based Software Development -Maximize the reuse of ex...
An Analysis of Component-based Software Development -Maximize the reuse of ex...Mohammad Salah uddin
 
Introduction to software engineering
Introduction to software engineeringIntroduction to software engineering
Introduction to software engineeringHitesh Mohapatra
 
Engineering Software Products: 8. Reliable programming
Engineering Software Products: 8. Reliable programmingEngineering Software Products: 8. Reliable programming
Engineering Software Products: 8. Reliable programmingsoftware-engineering-book
 
Software cost estimation
Software cost estimationSoftware cost estimation
Software cost estimationHaitham Ahmed
 
An Approach To Erp Testing Using Services
An Approach To Erp Testing Using ServicesAn Approach To Erp Testing Using Services
An Approach To Erp Testing Using ServicesSagi Schliesser
 
9. Software Implementation
9. Software Implementation9. Software Implementation
9. Software Implementationghayour abbas
 
Engineering Software Products: 5. cloud based software
Engineering Software Products: 5. cloud based softwareEngineering Software Products: 5. cloud based software
Engineering Software Products: 5. cloud based softwaresoftware-engineering-book
 

Mais procurados (18)

UVM BASED REUSABLE VERIFICATION IP FOR WISHBONE COMPLIANT SPI MASTER CORE
UVM BASED REUSABLE VERIFICATION IP FOR WISHBONE COMPLIANT SPI MASTER COREUVM BASED REUSABLE VERIFICATION IP FOR WISHBONE COMPLIANT SPI MASTER CORE
UVM BASED REUSABLE VERIFICATION IP FOR WISHBONE COMPLIANT SPI MASTER CORE
 
PSResume
PSResumePSResume
PSResume
 
Component Based Testing Using Finite Automata
Component Based Testing Using Finite AutomataComponent Based Testing Using Finite Automata
Component Based Testing Using Finite Automata
 
SE2018_Lec-22_-Continuous-Integration-Tools
SE2018_Lec-22_-Continuous-Integration-ToolsSE2018_Lec-22_-Continuous-Integration-Tools
SE2018_Lec-22_-Continuous-Integration-Tools
 
Engineering Software Products: 2. agile software engineering
Engineering Software Products: 2. agile software engineeringEngineering Software Products: 2. agile software engineering
Engineering Software Products: 2. agile software engineering
 
Component based software engineering
Component based software engineeringComponent based software engineering
Component based software engineering
 
An Analysis of Component-based Software Development -Maximize the reuse of ex...
An Analysis of Component-based Software Development -Maximize the reuse of ex...An Analysis of Component-based Software Development -Maximize the reuse of ex...
An Analysis of Component-based Software Development -Maximize the reuse of ex...
 
Ch17 distributed software engineering
Ch17 distributed software engineeringCh17 distributed software engineering
Ch17 distributed software engineering
 
Introduction to software engineering
Introduction to software engineeringIntroduction to software engineering
Introduction to software engineering
 
Engineering Software Products: 8. Reliable programming
Engineering Software Products: 8. Reliable programmingEngineering Software Products: 8. Reliable programming
Engineering Software Products: 8. Reliable programming
 
Software cost estimation
Software cost estimationSoftware cost estimation
Software cost estimation
 
Ch21 real time software engineering
Ch21 real time software engineeringCh21 real time software engineering
Ch21 real time software engineering
 
An Approach To Erp Testing Using Services
An Approach To Erp Testing Using ServicesAn Approach To Erp Testing Using Services
An Approach To Erp Testing Using Services
 
Ch19
Ch19Ch19
Ch19
 
9. Software Implementation
9. Software Implementation9. Software Implementation
9. Software Implementation
 
SE notes by k. adisesha
SE notes by k. adiseshaSE notes by k. adisesha
SE notes by k. adisesha
 
Engineering Software Products: 5. cloud based software
Engineering Software Products: 5. cloud based softwareEngineering Software Products: 5. cloud based software
Engineering Software Products: 5. cloud based software
 
Iv2515741577
Iv2515741577Iv2515741577
Iv2515741577
 

Destaque

Performance testing based on time complexity analysis for embedded software
Performance testing based on time complexity analysis for embedded softwarePerformance testing based on time complexity analysis for embedded software
Performance testing based on time complexity analysis for embedded softwareMr. Chanuwan
 
Spacebrew & Arduino Yún
Spacebrew & Arduino YúnSpacebrew & Arduino Yún
Spacebrew & Arduino YúnJohan Nilsson
 
GTUG Android iglaset Presentation 1 Oct
GTUG Android iglaset Presentation 1 OctGTUG Android iglaset Presentation 1 Oct
GTUG Android iglaset Presentation 1 OctJohan Nilsson
 
Runtime performance evaluation of embedded software
Runtime performance evaluation of embedded softwareRuntime performance evaluation of embedded software
Runtime performance evaluation of embedded softwareMr. Chanuwan
 
Deep dumpster diving 2010
Deep dumpster diving 2010Deep dumpster diving 2010
Deep dumpster diving 2010RonnBlack
 
High level programming of embedded hard real-time devices
High level programming of embedded hard real-time devicesHigh level programming of embedded hard real-time devices
High level programming of embedded hard real-time devicesMr. Chanuwan
 
Application scenarios in streaming oriented embedded-system design
Application scenarios in streaming oriented embedded-system designApplication scenarios in streaming oriented embedded-system design
Application scenarios in streaming oriented embedded-system designMr. Chanuwan
 
Analyzing memory usage and leaks
Analyzing memory usage and leaksAnalyzing memory usage and leaks
Analyzing memory usage and leaksRonnBlack
 
High-Performance Timing Simulation of Embedded Software
High-Performance Timing Simulation of Embedded SoftwareHigh-Performance Timing Simulation of Embedded Software
High-Performance Timing Simulation of Embedded SoftwareMr. Chanuwan
 
Software Architectural low energy
Software Architectural low energySoftware Architectural low energy
Software Architectural low energyMr. Chanuwan
 
High performance operating system controlled memory compression
High performance operating system controlled memory compressionHigh performance operating system controlled memory compression
High performance operating system controlled memory compressionMr. Chanuwan
 
FOSS STHLM Android Cloud to Device Messaging
FOSS STHLM Android Cloud to Device MessagingFOSS STHLM Android Cloud to Device Messaging
FOSS STHLM Android Cloud to Device MessagingJohan Nilsson
 
Object and method exploration for embedded systems
Object and method exploration for embedded systemsObject and method exploration for embedded systems
Object and method exploration for embedded systemsMr. Chanuwan
 
Java lejos-multithreading
Java lejos-multithreadingJava lejos-multithreading
Java lejos-multithreadingMr. Chanuwan
 
Android Cloud to Device Messaging Framework at GTUG Stockholm
Android Cloud to Device Messaging Framework at GTUG StockholmAndroid Cloud to Device Messaging Framework at GTUG Stockholm
Android Cloud to Device Messaging Framework at GTUG StockholmJohan Nilsson
 
Performance and memory profiling for embedded system design
Performance and memory profiling for embedded system designPerformance and memory profiling for embedded system design
Performance and memory profiling for embedded system designMr. Chanuwan
 
Deep Dumpster Diving
Deep Dumpster DivingDeep Dumpster Diving
Deep Dumpster DivingRonnBlack
 
Utmaningar som tredjepartsutvecklare för kollektivtrafikbranchen - Kollektivt...
Utmaningar som tredjepartsutvecklare för kollektivtrafikbranchen - Kollektivt...Utmaningar som tredjepartsutvecklare för kollektivtrafikbranchen - Kollektivt...
Utmaningar som tredjepartsutvecklare för kollektivtrafikbranchen - Kollektivt...Johan Nilsson
 
Performance prediction for software architectures
Performance prediction for software architecturesPerformance prediction for software architectures
Performance prediction for software architecturesMr. Chanuwan
 

Destaque (19)

Performance testing based on time complexity analysis for embedded software
Performance testing based on time complexity analysis for embedded softwarePerformance testing based on time complexity analysis for embedded software
Performance testing based on time complexity analysis for embedded software
 
Spacebrew & Arduino Yún
Spacebrew & Arduino YúnSpacebrew & Arduino Yún
Spacebrew & Arduino Yún
 
GTUG Android iglaset Presentation 1 Oct
GTUG Android iglaset Presentation 1 OctGTUG Android iglaset Presentation 1 Oct
GTUG Android iglaset Presentation 1 Oct
 
Runtime performance evaluation of embedded software
Runtime performance evaluation of embedded softwareRuntime performance evaluation of embedded software
Runtime performance evaluation of embedded software
 
Deep dumpster diving 2010
Deep dumpster diving 2010Deep dumpster diving 2010
Deep dumpster diving 2010
 
High level programming of embedded hard real-time devices
High level programming of embedded hard real-time devicesHigh level programming of embedded hard real-time devices
High level programming of embedded hard real-time devices
 
Application scenarios in streaming oriented embedded-system design
Application scenarios in streaming oriented embedded-system designApplication scenarios in streaming oriented embedded-system design
Application scenarios in streaming oriented embedded-system design
 
Analyzing memory usage and leaks
Analyzing memory usage and leaksAnalyzing memory usage and leaks
Analyzing memory usage and leaks
 
High-Performance Timing Simulation of Embedded Software
High-Performance Timing Simulation of Embedded SoftwareHigh-Performance Timing Simulation of Embedded Software
High-Performance Timing Simulation of Embedded Software
 
Software Architectural low energy
Software Architectural low energySoftware Architectural low energy
Software Architectural low energy
 
High performance operating system controlled memory compression
High performance operating system controlled memory compressionHigh performance operating system controlled memory compression
High performance operating system controlled memory compression
 
FOSS STHLM Android Cloud to Device Messaging
FOSS STHLM Android Cloud to Device MessagingFOSS STHLM Android Cloud to Device Messaging
FOSS STHLM Android Cloud to Device Messaging
 
Object and method exploration for embedded systems
Object and method exploration for embedded systemsObject and method exploration for embedded systems
Object and method exploration for embedded systems
 
Java lejos-multithreading
Java lejos-multithreadingJava lejos-multithreading
Java lejos-multithreading
 
Android Cloud to Device Messaging Framework at GTUG Stockholm
Android Cloud to Device Messaging Framework at GTUG StockholmAndroid Cloud to Device Messaging Framework at GTUG Stockholm
Android Cloud to Device Messaging Framework at GTUG Stockholm
 
Performance and memory profiling for embedded system design
Performance and memory profiling for embedded system designPerformance and memory profiling for embedded system design
Performance and memory profiling for embedded system design
 
Deep Dumpster Diving
Deep Dumpster DivingDeep Dumpster Diving
Deep Dumpster Diving
 
Utmaningar som tredjepartsutvecklare för kollektivtrafikbranchen - Kollektivt...
Utmaningar som tredjepartsutvecklare för kollektivtrafikbranchen - Kollektivt...Utmaningar som tredjepartsutvecklare för kollektivtrafikbranchen - Kollektivt...
Utmaningar som tredjepartsutvecklare för kollektivtrafikbranchen - Kollektivt...
 
Performance prediction for software architectures
Performance prediction for software architecturesPerformance prediction for software architectures
Performance prediction for software architectures
 

Semelhante a A system for performance evaluation of embedded software

Software engineering : Layered Architecture
Software engineering : Layered ArchitectureSoftware engineering : Layered Architecture
Software engineering : Layered ArchitectureMuhammed Afsal Villan
 
3Audit Software & Tools.pptx
3Audit Software & Tools.pptx3Audit Software & Tools.pptx
3Audit Software & Tools.pptxjack952975
 
Software engineering introduction
Software engineering introductionSoftware engineering introduction
Software engineering introductionVishal Singh
 
Open Source Software Testing Tools
Open Source Software Testing ToolsOpen Source Software Testing Tools
Open Source Software Testing ToolsVaruna Harshana
 
Arun Prasad-R.DOCX
Arun Prasad-R.DOCXArun Prasad-R.DOCX
Arun Prasad-R.DOCXArun R
 
Agile Development in Aerospace and Defense
Agile Development in Aerospace and DefenseAgile Development in Aerospace and Defense
Agile Development in Aerospace and DefenseJim Nickel
 
Performancetestingbasedontimecomplexityanalysisforembeddedsoftware 1008150404...
Performancetestingbasedontimecomplexityanalysisforembeddedsoftware 1008150404...Performancetestingbasedontimecomplexityanalysisforembeddedsoftware 1008150404...
Performancetestingbasedontimecomplexityanalysisforembeddedsoftware 1008150404...NNfamily
 
Surekha_haoop_exp
Surekha_haoop_expSurekha_haoop_exp
Surekha_haoop_expsurekhakadi
 
Introduction-to-the-Waterfall-Model.pptx
Introduction-to-the-Waterfall-Model.pptxIntroduction-to-the-Waterfall-Model.pptx
Introduction-to-the-Waterfall-Model.pptxAsadBaig49
 
ccs356-software-engineering-notes.pdf
ccs356-software-engineering-notes.pdfccs356-software-engineering-notes.pdf
ccs356-software-engineering-notes.pdfVijayakumarKadumbadi
 
Softweare Engieering
Softweare Engieering Softweare Engieering
Softweare Engieering Huda Alameen
 
Unit Testing to Support Reusable for Component-Based Software Engineering
Unit Testing to Support Reusable for Component-Based Software EngineeringUnit Testing to Support Reusable for Component-Based Software Engineering
Unit Testing to Support Reusable for Component-Based Software Engineeringijtsrd
 

Semelhante a A system for performance evaluation of embedded software (20)

Software engineering : Layered Architecture
Software engineering : Layered ArchitectureSoftware engineering : Layered Architecture
Software engineering : Layered Architecture
 
3Audit Software & Tools.pptx
3Audit Software & Tools.pptx3Audit Software & Tools.pptx
3Audit Software & Tools.pptx
 
Software metrics
Software metricsSoftware metrics
Software metrics
 
Software engineering introduction
Software engineering introductionSoftware engineering introduction
Software engineering introduction
 
Open Source Software Testing Tools
Open Source Software Testing ToolsOpen Source Software Testing Tools
Open Source Software Testing Tools
 
Arun Prasad-R.DOCX
Arun Prasad-R.DOCXArun Prasad-R.DOCX
Arun Prasad-R.DOCX
 
Agile Development in Aerospace and Defense
Agile Development in Aerospace and DefenseAgile Development in Aerospace and Defense
Agile Development in Aerospace and Defense
 
SE_Unit 2.pptx
SE_Unit 2.pptxSE_Unit 2.pptx
SE_Unit 2.pptx
 
Ka3517391743
Ka3517391743Ka3517391743
Ka3517391743
 
SANJAY_SINGH
SANJAY_SINGHSANJAY_SINGH
SANJAY_SINGH
 
Performancetestingbasedontimecomplexityanalysisforembeddedsoftware 1008150404...
Performancetestingbasedontimecomplexityanalysisforembeddedsoftware 1008150404...Performancetestingbasedontimecomplexityanalysisforembeddedsoftware 1008150404...
Performancetestingbasedontimecomplexityanalysisforembeddedsoftware 1008150404...
 
Software engineer
Software engineerSoftware engineer
Software engineer
 
Surekha_haoop_exp
Surekha_haoop_expSurekha_haoop_exp
Surekha_haoop_exp
 
Introduction-to-the-Waterfall-Model.pptx
Introduction-to-the-Waterfall-Model.pptxIntroduction-to-the-Waterfall-Model.pptx
Introduction-to-the-Waterfall-Model.pptx
 
ccs356-software-engineering-notes.pdf
ccs356-software-engineering-notes.pdfccs356-software-engineering-notes.pdf
ccs356-software-engineering-notes.pdf
 
functional requirements using LPP
functional requirements using LPPfunctional requirements using LPP
functional requirements using LPP
 
Softweare Engieering
Softweare Engieering Softweare Engieering
Softweare Engieering
 
Se lec 3
Se lec 3Se lec 3
Se lec 3
 
Unit Testing to Support Reusable for Component-Based Software Engineering
Unit Testing to Support Reusable for Component-Based Software EngineeringUnit Testing to Support Reusable for Component-Based Software Engineering
Unit Testing to Support Reusable for Component-Based Software Engineering
 
Resume
ResumeResume
Resume
 

A system for performance evaluation of embedded software

  • 1. World Academy of Science, Engineering and Technology 1 2005 A System for Performance Evaluation of Embedded Software Yong-Yoon Cho, Jong-Bae Moon, and Young-Chul Kim time-consuming work. The evaluation system needs Abstract—Developers need to evaluate software’s performance to occasionally additional hardware that exists between host make software efficient. This paper suggests a performance evaluation computer and target board and executes such functions as system for embedded software. The suggested system consists of code software’s performance testing and result analyzing. But, that analyzer, testing agents, data analyzer, and report viewer. The code may impose heavy burden on developers, because they have to analyzer inserts additional code dependent on target system into source code and compiles the source code. The testing agents execute pay additional cost and study how to use the instrument. In this performance test. The data analyzer translates raw-level results data to paper, we suggest graphic-based evaluation system to test and class-level APIs for reporting viewer. The report viewer offers users analyze embedded software’s performance. Because the graphical report views by using the APIs. We hope that the suggested suggested evaluation system is based on pure software without tool will be useful for embedded-related software development, any additional hardware, developers don’t have to spend a lot because developers can easily and intuitively analyze software’s of time studying about how to operate additional equipment. performance and resource utilization. The suggested evaluation system involves a graphic report Keywords—Embedded Software, Performance Evaluation viewer, which reports various results and shows them System, Testing Agents, Report Generator graphically and according to such test items as memory usage, code coverage, and function call times. By using the viewer, I. INTRODUCTION developers can analyze software’s performance at a glance and find easily what is fixed to make software more efficient. As a A ccording as embedded system has become increasingly sophisticated and user’s requirement for embedded software has become complicated, developing efficient result, developers can improve development efficiency of embedded-related software, because they can have opportunity to analyze software’s performance instantly and intuitively embedded software against the restricted resource has become through the graphical report viewer. The paper is organized as much more difficult and important. Because embedded system follows. Section II reviews works related in embedded software generally offers less computing resource than general-purpose evaluation system. Section III describes the proposed computer system does, embedded software that is so poor in evaluation system and graphic report viewer. Section IV quality or performance wastes the scarce resources [3][4]. conducts a testing and presents the results. Finally, Section V Developers want to improve the quality of their embedded states our conclusions and presents a summary of our research. software and make it to have always a good performance in resource usage. To do this, developers occasionally use II. RELATED WORKS embedded software evaluation system to decrease time and increase efficiency in developing embedded. 
The evaluation Telelogic’s Tau TTCN Suite is a system to test telecom and system is useful for developers, because they know whether datacom equipment ranging from built-in communication chips developed software is efficiently optimized in the embedded to huge switches and intelligent network services. It includes system’s restricted resource. By using evaluation system, various tools such as script editor, compiler and simulator. But, developers can execute embedded software on target system it is not suitable for testing embedded software because it is test during the development process, test its performance in system for telecommunication vendor. It also is very expensive advance, and know what must be fixed to make it more efficient. because it is mostly additional hardware equipment to test But, because the testing results are text-based string, developers telecom system. have to analyze the raw-level data to find software’s portion AstonLinux’s CodeMaker is IDE(Integrated Development where they has to revise. It is often very tiresome and Equipment) to develop embedded software based on linux in Windows. It supports remote debugging and source-level debugging. But, it doesn’t offer function to test and analyze Manuscript received November 30, 2004. Yong-Yoon Cho is with the Department of Computing, Soongsil University, embedded software’s performance, because it is only IDE for Seoul, CO 156743 Korea (corresponding author to provide phone: specific RTOS/chip vendor. It is also suitable for testing +82-02-824-3862; fax: +82-02-824-3862; e-mail: sslabyycho@hotmail.com). general-purpose target system, because it uses linux-based Jong-Bae Moon is with the Department of Computing, Soongsil University, Seoul, CO 156743 Korea (e-mail: comdoct@ss.ssu.ac.kr). utilities. Young-Chul Kim is with the Department of Computing, Soongsil University, TestQuest Pro is automated test solution for embedded Seoul, CO 156743 Korea (e-mail: yckim@ss.ssu.ac.kr). 47
  • 2. World Academy of Science, Engineering and Technology 1 2005 systems with sophisticated human interfaces. It is automated GUI, host/target-side agents, code analyzer, result analyzer, test solution for virtually any application and device ranging and report viewer. The code analyzer consists of instrumentor from cell phones, PDAs and tablets to devices with embedded to insert additional code into source code and cross compiler to technologies, including wireless enterprise systems, smart create target-execute file for the source code. The evaluation appliances and automobiles. It offers functions related in system is a client/server model based in host-target architecture. debugging and simulating embedded software’s source code Because embedded system offers insufficient memory and but doesn’t proffer information for software’s performance in inconvenient user interface, the suggested tool places agent not usage of resource. only on host-side to proffer users convenient GUI but also on Rational’s TestRealTime is target-based evaluation system target-side to execute software’s performance testing in target for embedded software’s performance. It proffers various result board. The agents keep a communication connection to deliver views that users can easily analyze software’s performance. It source file and test result to each other. Firstly host-side agent can also execute various performance testing ranging from transfers inputted source to target-side agent through serial memory usage, memory leak, cpu usage to code coverage. But, cable or wireless network. Then, target-side agent executes The result view is somewhat complicated to understand result’s testing process, gains results from the test events, and send the meaning at a glance. results to its host-side counterpart. Consequently, host-side agent stores the raw-level result received from target-side one III. PROPOSED PERFORMANCE EVALUATION SYSTEM into result DB. Generally, embedded software must use In this paper, we suggest system for testing embedded minimum process and memory resources [3][4]. To meet the software’s performance that consists of pure software without requirement, the suggested tool tests software’s performance additional hardware equipment and offers such various for the 4 items described in Table I [5][6]. performance testing as memory, code coverage, code trace and TABLE I function performance [1]-[7]. The evaluation system proffers UNITS FOR MAGNETIC PROPERTIES users graphical report views that they can easily and intuitively Testing Items Function analyze the test result. Fig. 1 is the proposed architecture for a performance evaluation system. Tracing software’s runtime execution in UML Trace Profiling sequence diagram H o s t-S id e Memory Profiling Profiling software’s resource usage related memory. U ser Profiling function or method’s execution Performance Profiling R e p o r t V ie w e r performance. R e p o r t G e n e r a to r Code Coverage Profiling Separating and Profiling code block [8]. Test C ode API R e s u lt A n a ly z e r C ro s s - In s tr u m e n to r R e s u lt T r a n s la to r R e s u lt S e p e r a to r Through trace profiling, users can trace what functions are C o m p ile r executed according to software’s execution process and find C o d e A n a ly z e r what functions are unnecessarily called. Report viewer shows result for trace profiling as UML sequence diagram [9][10]. 
Generally, embedded software must use minimal processor and memory resources [3][4]. To meet this requirement, the suggested tool tests software's performance for the four items described in Table I [5][6].

TABLE I
PROFILING ITEMS OF THE EVALUATION SYSTEM

  Testing Items            Function
  Trace Profiling          Tracing software's runtime execution in a UML
                           sequence diagram
  Memory Profiling         Profiling software's memory-related resource usage
  Performance Profiling    Profiling a function or method's execution
                           performance
  Code Coverage Profiling  Separating and profiling code blocks [8]

Through trace profiling, users can trace which functions are executed in the course of the software's execution and find which functions are called unnecessarily. The report viewer shows the trace profiling result as a UML sequence diagram [9][10]. Through memory profiling, users can obtain information about memory allocation and de-allocation, memory leaks, and the code sections that use memory most frequently. Users can use performance profiling to estimate how much time it takes to execute the whole or a part of the embedded software and to confirm whether it is optimized for the embedded system. Code coverage profiling offers users information about which code sections are used or unused and which are executed frequently or infrequently. Users can make embedded software more efficient by using the information profiled for these four items.

Usually, the results produced by profiling software are raw-level strings, and it is difficult and tiresome for users to analyze software performance with them. The result analyzer therefore classifies the raw-level results according to the items listed in Table I and converts them into a refined data type. It contains a result separator and a result translator. The result separator classifies the raw-level results into different data types in accordance with the profiling items, as in the sketch below.
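The internals of the separator are not shown in the paper. As a minimal sketch, assuming each raw result line carries a recognizable marker for the profiler that produced it, a separator for the four items of Table I might look like the following; the class ResultSeparator, the enum, and the marker conventions are hypothetical.

import java.util.*;

// Hypothetical result separator: buckets raw-level result lines by profiling item.
public class ResultSeparator {
    public enum ProfileItem { TRACE, MEMORY, PERFORMANCE, CODE_COVERAGE }

    public Map<ProfileItem, List<String>> separate(List<String> rawLines) {
        Map<ProfileItem, List<String>> buckets = new EnumMap<>(ProfileItem.class);
        for (ProfileItem item : ProfileItem.values()) {
            buckets.put(item, new ArrayList<>());
        }
        for (String line : rawLines) {
            buckets.get(classify(line)).add(line);
        }
        return buckets;
    }

    // Assumed conventions: memory-trace lines start with '@' (as in Table III),
    // call-graph lines mention "<cycle" or "called", coverage lines start with "cov:".
    private ProfileItem classify(String line) {
        if (line.startsWith("@")) return ProfileItem.MEMORY;
        if (line.startsWith("cov:")) return ProfileItem.CODE_COVERAGE;
        if (line.contains("<cycle") || line.contains("called")) return ProfileItem.TRACE;
        return ProfileItem.PERFORMANCE;
    }
}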
The result translator converts the classified results into API classes or XML files that the report generator can use to build a report view for the user's requirements. The API classes or XML files produced by the result analyzer are stored in the API DB. Table II shows a part of the memory-related API class into which the raw-level results created by memory profiling are converted.

TABLE II
A PART OF THE API CLASS FOR MEMORY-RELATED RAW-LEVEL RESULTS

public class DataConverter {
    private CInstrumentationTable instrumentations;

    public DataConverter(CInstrumentationTable instrumentations) {
        this.instrumentations = instrumentations;
    }

    public static IMemoryElement convertingMemoryElement(IEvent event) {
        /* Read the symbols from the raw-level result produced by a
           memory-profile test event and convert them into a memory
           element. */
        return new MemoryElement();
    }
}

The report generator builds a graphical report view from the result APIs when the user selects one of the report views in the evaluation system's menu. The suggested evaluation system includes a report viewer, through which users can see the selected report view.

IV. TESTING AND RESULTS

The suggested evaluation system is implemented in Java, and its testing environment is an embedded board equipped with a StrongARM chip and embedded Linux. To evaluate the performance of an example C source program, we use a calculator program written in C as the input source code. The source code is about 520 lines long and consists of three modules. We test it against the four items listed in Table I and show graphical report views of the results through the report viewer. Fig. 2 shows the evaluation system's initial screen for testing the calculator program.

[Fig. 2 image] Fig. 2. The suggested evaluation system's initial screen for testing

After cross-compilation by the code analyzer, the source code includes additional code that is dependent on the target system.
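The instrumentor that adds this code is not detailed in the paper. A naive source-to-source sketch, again in Java, gives the flavor: it inserts a probe call at each C function entry so that the target-side run can emit trace events. The regex and the probe name __probe_enter are assumptions for illustration only; a real instrumentor would parse the source rather than pattern-match it.

import java.util.regex.*;

// Hypothetical source instrumentor: inserts an entry probe into each C function.
// Naive regex matching; a production instrumentor would parse the source properly.
public class Instrumentor {
    // Matches a C function header such as "int func1(int a) {"
    private static final Pattern FUNC_HEADER = Pattern.compile(
            "([A-Za-z_][\\w\\s\\*]*?)\\b(\\w+)\\s*\\(([^;{)]*)\\)\\s*\\{");

    public static String instrument(String cSource) {
        Matcher m = FUNC_HEADER.matcher(cSource);
        StringBuffer out = new StringBuffer();
        while (m.find()) {
            String funcName = m.group(2);
            // Keep the original header, then add the probe as the first statement.
            m.appendReplacement(out, Matcher.quoteReplacement(
                    m.group(0) + "\n    __probe_enter(\"" + funcName + "\");"));
        }
        m.appendTail(out);
        return out.toString();
    }
}

Running instrument() over the calculator source before cross-compilation would yield C code that reports every function entry to the testing module on the target.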
Then the evaluation system establishes a connection with the target system to transfer the source code to it through the host/target agents. After testing on the target system, a raw-level test result in string form is produced. Table III shows a part of the raw-level test result related to the memory performance of the test source code of Fig. 2.

TABLE III
A RAW-LEVEL TEST RESULT

index  % time  self  children  called     name
                               95000      func1 <cycle 1> [3]
[2]    0.0     0.00  0.00      95000      func2 <cycle 1> [2]
                               900        func1 <cycle 1> [3]
                               900        func2 <cycle 1> [2]
               0.00  0.00      1000/1000  main [13]
[3]    0.0     0.00  0.00      1900       func1 <cycle 1> [3]
                               95000      func2 <cycle 1> [2]

Index by function name: [1] <cycle 1>  [2] func2  [3] func1

@ <location>:(called function + call location)[instruction location] +/- address size
= Start
@ ./ex-n:(mtrace+0x169)[0x80484e9] + 0x804a378 0x12
@ ./ex-n:[0x8048596] - 0x804a378

However, it is difficult for users to analyze the meaning of such output, so the evaluation system offers users graphical result views through the data analyzer, the report generator, and the report viewer. Fig. 3 shows a graphical report view that the report generator and the report viewer build from the APIs into which the data analyzer translates the raw-level trace result after testing the source code of Fig. 2.

[Fig. 3 image] Fig. 3. Trace profiling view for raw-level test result

In the view shown in Fig. 3, we can see each function's execution time and call order, so developers can find a function that takes a long time to do its processing and can rewrite the source program to distribute the workload to other functions. We can then obtain other report views for the test results of the source code of Fig. 2. Fig. 4 shows the memory report view for the memory-related part of the test results. In Fig. 4, we can see information about the software's memory usage, such as the memory allocated, the memory freed, and the maximum number of bytes in use at any one time, so developers can find the portions of the source code that cause excessive memory consumption and eliminate them.
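To make the translation step concrete: the memory lines in Table III follow an mtrace-like convention, where '+' records an allocation of a given size at an address and '-' records the matching free. A small, assumed parser in the spirit of the result translator can recover exactly the quantities the memory report view presents (bytes still allocated, bytes freed, and the peak number of bytes in use); the class MemoryTraceParser is hypothetical.

import java.util.*;

// Hypothetical translator for mtrace-style lines such as
//   @ ./ex-n:(mtrace+0x169)[0x80484e9] + 0x804a378 0x12   (allocation of 0x12 bytes)
//   @ ./ex-n:[0x8048596] - 0x804a378                      (matching free)
public class MemoryTraceParser {
    private final Map<String, Long> live = new HashMap<>(); // address -> size
    private long inUse = 0, freed = 0, peak = 0;

    public void feed(String line) {
        String[] tok = line.trim().split("\\s+");
        // Expected shape: "@" <location> <+|-> <address> [<size>]
        if (tok.length < 4 || !tok[0].equals("@")) return;
        String op = tok[2], addr = tok[3];
        if (op.equals("+") && tok.length >= 5) {
            long size = Long.decode(tok[4]); // handles the 0x prefix
            live.put(addr, size);
            inUse += size;
            peak = Math.max(peak, inUse);
        } else if (op.equals("-")) {
            Long size = live.remove(addr);
            if (size != null) { inUse -= size; freed += size; }
        }
    }

    public long leakedBytes() { return inUse; }   // allocations never freed
    public long freedBytes()  { return freed; }
    public long peakBytes()   { return peak; }
}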
[Fig. 4 image] Fig. 4. Memory report view

Fig. 5 shows the code coverage and performance report views. Through the code coverage report view in Fig. 5, we can see whether a given function was executed and obtain information about the execution rates of the code blocks within a function, so we can get information about the code blocks and balance the work among them. With the performance report view, we can find information about each function's call count and execution time, so we can get information about each function's execution efficiency and tune it.

[Fig. 5 image] Fig. 5. Code coverage and performance report views

V. CONCLUSION

In this paper, we suggested a graphic-based system for easily evaluating embedded software's performance and intuitively analyzing the results through graphical report views. The suggested evaluation system adopts an agent-based client-server model in order to support a cross-platform development environment for embedded software. The system uses a data analyzer to refine the initial raw-level test results into APIs in class form, which developers can easily reuse in various tools for evaluating embedded software's performance [12]. As an evaluation tool that uses these APIs, the suggested system offers developers a report viewer. In our test, we tested C source code and showed the results graphically through the report viewer. With the suggested tool, developers can clearly know what must be fixed in the software's source code and can improve the development efficiency of embedded software. In future work, we will study methods for automating test cases and develop an Internet-based testing system for embedded systems by translating the test data into XML documents [13][14].

REFERENCES

[1] M. Roper, Software Testing. London: McGraw-Hill Book Company, 1994.
[2] B. Beizer, Software Testing Techniques, 2nd ed. New York: Van Nostrand Reinhold, 1990.
[3] B. Broekman and E. Notenboom, Testing Embedded Software. Addison-Wesley, Dec. 2002.
[4] N. Stollon, R. Leatherman, and B. Ableidinger, "Multi-core embedded debug for structured ASIC systems," in Proc. DesignCon 2004, Feb. 2004.
[5] D. B. Stewart and G. Arora, "A tool for analyzing and fine tuning the real-time properties of an embedded system," IEEE Trans. Software Eng., vol. 29, no. 4, pp. 311-326, Apr. 2003.
[6] I. Satoh, "A testing framework for mobile computing software," IEEE Trans. Software Eng., vol. 29, no. 12, pp. 1112-1121, Dec. 2003.
[7] P. Anderson, T. W. Reps, and T. Teitelbaum, "Design and implementation of a fine-grained software inspection tool," IEEE Trans. Software Eng., vol. 29, no. 8, pp. 721-733, Aug. 2003.
[8] J. J. Chilenski and S. P. Miller, "Applicability of modified condition/decision coverage to software testing," Software Engineering Journal, vol. 9, no. 5, pp. 193-200, Sep. 1994.
[9] R. B. France, D.-K. Kim, S. Ghosh, and E. Song, "A UML-based pattern specification technique," IEEE Trans. Software Eng., vol. 30, no. 4, pp. 193-206, Apr. 2004.
[10] L. Apvrille, J.-P. Courtiat, C. Lohr, and P. de Saqui-Sannes, "TURTLE: A real-time UML profile supported by a formal validation toolkit," IEEE Trans. Software Eng., vol. 30, no. 7, pp. 473-487, Jul. 2004.
[11] W. E. Howden, "Weak mutation testing and completeness of test sets," IEEE Trans. Software Eng., vol. SE-8, no. 4, pp. 371-379, Jul. 1982.
[12] B. Long, D. Hoffman, and P. A. Strooper, "Tool support for testing concurrent Java components," IEEE Trans. Software Eng., vol. 29, no. 6, pp. 555-566, Jun. 2003.
[13] L. Morell, "A theory of fault-based testing," IEEE Trans. Software Eng., vol. 16, no. 8, pp. 844-857, Aug. 1990.
[14] J. P. Kearns, C. J. Meier, and M. L. Soffa, "The performance evaluation of control implementations," IEEE Trans. Software Eng., vol. SE-8, no. 2, pp. 89-96, Feb. 1982.