Automation with Flexcover and FlexUnit

Code coverage tools are addictive, and for good reason: they provide a direct, video-game-like feedback loop while building software.  Many of us have incorporated Agile, test-driven methodologies into the way we create software.  We follow Kent Beck’s mantra: write a test that fails, make it pass as simply as possible, then refactor.

Code coverage tools like Clover and Cobertura have been growing in popularity due in part to the richness and immediacy of their feedback: they show which areas of your software aren’t being exercised properly when an automated test suite runs.

Coverage goes hand in hand with the testing itself.  We need to write tests as we build and change software, but we also need to know whether those tests cover not only the major thoroughfares of our code paths, but also the back alleyways where bugs love to live.

Coverage also gives us a very important bit of insight: what code can be removed?

Let’s look at an example of this.

Flexcover in Action

Many thanks to Joe Berkovitz and Alex Uhlmann for creating Flexcover and releasing it on Google Code.  With Flexcover, we now have insight into the quality and completeness of our test coverage for a Flex app.  As an example, let’s take a look at the example Flex application that has guided the discussions of unit testing, mock objects and inversion of control in previous posts.

As the documentation outlines, Flexcover provides an overlay onto the existing Flex SDK.  Applications compiled with the Flexcover version of the SDK emit events to the Flexcover Viewer, which collects the coverage information and collates it into a useful report.

When the example Flex app was compiled with the Flexcover SDK, the compiler also produced a metadata file that the Coverage Viewer can load.  When this file is first opened in the Viewer it shows no coverage, because the application has not yet been interacted with, so none of the code paths have been followed or recorded.

After running the FlexUnit suite, we see the following:

Flexcover Viewer Screenshot


Overall the coverage is pretty good at around 80%, but there are a couple of problems: some code paths don’t get executed at all.  A developer can rapidly dig into those and create new tests that bolster the existing coverage.  Flexcover lets us drill into specific lines and see, at a function level, how many times the test suite exercises a particular code path.

Automating Flexcover

Many thanks go to Peter Martin of Adobe Consulting for his work a few years ago exposing FlexUnit test suites to JUnit-style reporting via ANT.  This allows a Continuous Integration system like CruiseControl, TeamCity or Bamboo to run all of the FlexUnit tests after every commit.
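Because the FlexUnit ANT task writes JUnit-style XML result files, the standard ANT junitreport task can render them into the HTML reports a CI server publishes.  A minimal sketch — the target name and the `TEST-*.xml` file pattern are assumptions here, and the optional junitreport task must be on ANT’s classpath:

```xml
<!-- Sketch: turn the JUnit-style XML written by the <flexunit> task into an
     HTML report for the CI server.  Target name and file pattern are
     hypothetical; requires ANT's optional junitreport task. -->
<target name="test-report" depends="test">
    <junitreport todir="${FLEXUNIT_REPORT_DIR}">
        <fileset dir="${FLEXUNIT_REPORT_DIR}">
            <include name="TEST-*.xml" />
        </fileset>
        <!-- Frames-style HTML report, one page per test class -->
        <report format="frames" todir="${FLEXUNIT_REPORT_DIR}/html" />
    </junitreport>
</target>
```

A CI server such as Bamboo or TeamCity can then be pointed at either the raw XML or the generated HTML directory.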

This constant, continuous integration gives us regular, immediate feedback on the state of the build.  In a team environment it also does the critical job of identifying whose change broke the build.

Flexcover has some automation hooks as well.  Although the documentation is a bit sparse on the topic, with ANT (or another build framework) we can connect FlexUnit and Flexcover together so that a Continuous Integration environment runs the unit tests and generates a coverage report automatically.

Here is an example Flex project that exposes FlexUnit and Flexcover coverage to a continuous integration system.

Below is the ANT build file that compiles the Flex project, opens the Flexcover Viewer, runs the tests, and saves both the unit test results and the coverage report to a directory.

    <!--
        ANT build file for the example Flex app project

        @author amanning
    -->
<project name="origin-flex" default="test" basedir=".">

    <property file="" />

    <!-- The flex compilation ant task -->
    <taskdef resource="flexTasks.tasks" classpath="${basedir}/libs/flexTasks.jar" />

    <!-- Load the custom task definitions for the FlexUnit ANT tasks -->
    <taskdef resource="com/adobe/ac/ant/tasks/" />

    <!-- Clean the project by removing all generated files -->
    <target name="clean">

        <delete dir="${BUILD_DIR}" />
        <delete dir="${DEPLOY_DIR}" />
        <delete dir="${REPORT_DIR}" />

    </target>


    <!-- Prepare all of the directories and source files needed to compile the application -->
    <target name="prepare">

        <mkdir dir="${BUILD_DIR}" />
        <mkdir dir="${DEPLOY_DIR}" />
        <mkdir dir="${REPORT_DIR}" />
        <mkdir dir="${FLEXUNIT_REPORT_DIR}" />
        <mkdir dir="${FLEXCOVER_REPORT_DIR}" />

        <copy todir="${BUILD_DIR}">
            <fileset dir="${MAIN_DIR}" />
        </copy>

    </target>


    <!-- Compile the source files -->
    <target name="compile" depends="clean, prepare">

        <mxmlc file="${TEST_DIR}/${APP_FILE_BASE_NAME}.mxml" output="${BUILD_DIR}/${APP_FILE_BASE_NAME}.swf">
            <load-config filename="${FLEX_HOME}/frameworks/flex-config.xml" />
            <source-path path-element="${FLEX_HOME}/frameworks" />
            <source-path path-element="${MAIN_DIR}" />
            <source-path path-element="${basedir}/locale/{locale}" />
            <!-- List of SWC files or directories that contain SWC files -->
            <compiler.include-libraries dir="${LIB_DIR}" append="true">
                <include name="FlexUnit.swc" />
                <include name="Cairngorm.swc" />
                <include name="mockas3.swc" />
                <include name="spring-actionscript.swc" />
                <include name="FlexUnitOptional.swc" />
            </compiler.include-libraries>
        </mxmlc>

    </target>



    <!-- Deploy the application files to the deployment directory -->
    <target name="deploy" depends="compile">

        <!-- Move over the compiled SWF -->
        <copy file="${BUILD_DIR}/${APP_FILE_BASE_NAME}.swf" tofile="${DEPLOY_DIR}/${APP_FILE_BASE_NAME}.swf" overwrite="true" verbose="true" />

        <!-- Move over the Flexcover metadata file -->
        <copy file="${BUILD_DIR}/${APP_FILE_BASE_NAME}.cvm" tofile="${DEPLOY_DIR}/${APP_FILE_BASE_NAME}.cvm" overwrite="true" verbose="true" />

        <!-- Move over the application configuration file for Spring Actionscript -->
        <copy file="${BUILD_DIR}/application-context.xml" tofile="${DEPLOY_DIR}/application-context.xml" overwrite="true" verbose="true" />

    </target>


    <!-- Run the FlexUnit test suite -->
    <target name="test" depends="deploy, flexcover-open">

        <flexunit todir="${FLEXUNIT_REPORT_DIR}" timeout="60000" swf="${DEPLOY_DIR}/${APP_FILE_BASE_NAME}.swf" haltonfailure="true" />

    </target>


    <!-- Open the Flexcover Viewer so it can record coverage while the tests run -->
    <target name="flexcover-open">
        <exec executable="${FLEX_COVER_VIEWER}" spawn="true">
            <arg line="-output ${FLEXCOVER_REPORT_DIR}/report.cvr ${BUILD_DIR}/${APP_FILE_BASE_NAME}.cvm" />
        </exec>
    </target>

</project>


Some may ask: how can coverage reports really be used?  Won’t they just be ignored?  An organization interested in adding a bit more “bite” to maintaining code coverage may consider actually breaking the build when coverage falls below a certain threshold.  This prevents any developer from submitting code that drops coverage below that line.
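To sketch what such a gate could look like in a build file like the one above: suppose some earlier step has parsed the Flexcover report and written the overall percentage into a properties file.  The file name, property name, target name and 75% threshold below are all hypothetical (Flexcover itself does not ship this step), and ANT’s scriptcondition needs a script engine on the classpath:

```xml
<!-- Sketch of a coverage gate.  Assumes a prior step wrote the overall
     percentage into coverage.properties as "coverage.percent=80"; the file
     name, property name and 75% threshold are hypothetical.
     <scriptcondition> requires a JavaScript engine on ANT's classpath. -->
<target name="enforce-coverage" depends="test">
    <property file="coverage.properties" />
    <fail message="Coverage ${coverage.percent}% is below the required 75%">
        <condition>
            <scriptcondition language="javascript" value="false">
                self.setValue(parseFloat(project.getProperty("coverage.percent")) &lt; 75);
            </scriptcondition>
        </condition>
    </fail>
</target>
```

Wired into the CI server’s default target, a drop below the threshold fails the build just like a failing test would.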