Tuesday, August 23, 2011

Using DBUnit through Ant with a large MS SQL Database

The following describes my experiences with using DBUnit through Ant with a large MS SQL database, and all of the adventures I had along the way. The intention is to use DBUnit through Ant to export a database as an XML file, to clear the database, and to re-populate it using that XML file.

Why?

Automated acceptance testing. You start with an XML backup of what you want your base system to contain, refresh your database with that backup, run your suite of automated acceptance tests, and then restore the XML backup again. That way your tests can do things specific to the application, like creating users, invoices, and so on, and otherwise mess around with every conceivable operation. When you are done you can restore everything to the way it was before.

Running it through Ant lets you easily integrate this refresh into other automation tools such as Jenkins (Hudson), TeamCity, and so on.
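As a rough sketch of how that wiring can look, a single Ant target can chain the delete and populate targets defined later in this post, so the CI job only has to call one thing (the db-refresh name here is just illustrative):

<!-- Clears the database and re-populates it from the exported XML.
     Relies on the db-delete and db-populate targets defined later in this post. -->
<target name="db-refresh" depends="db-delete, db-populate"
        description="Refresh the database from full_export.xml" />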

Did you say large?

Yes I did, but in this context I am using it to refer to a database with a large number of tables, not a large number of records. A large number of tables is a catch-all for several circumstances which can cause you trouble, such as tables which have keywords for names, foreign key dependencies between tables, tables with timestamp columns, and some more items I will address.

But I want a lot of records!

There are a couple of problems with using a lot of records with DBUnit, the first of which is that if you are exporting and importing you will have to deal with very long save and load times. As a general rule you are probably aiming for an export XML file size of under 3 MB, which is around 2,000 records depending on what is in your tables. The second problem has to do with memory, but I have a workaround for that as well.

DBUnit with Ant

So you want to use Ant? You are then going to have to write your own task by extending the DBUnit Ant Task. This is because there are options that need to be set [at least to my knowledge] that cannot be set in the DBUnit Ant Task that is provided to you.

To start out you will need to extend three things:

  1. The DbUnitTask, which is the <dbunit> tag. This allows you to define your own Export and Operation tags.
  2. The Export, which is the <export> tag. This is for changing how the export works.
  3. The Operation, which is the <operation> tag. This is for changing how the insert works.

MSDBUnitTask.java

package com.foo.ant.dbunit;

import org.dbunit.ant.DbUnitTask;

public class MSDBUnitTask extends DbUnitTask {

    @SuppressWarnings("unchecked")
    public void addMSExport(MSExport export) {
        // Ant calls this for each nested <msexport> element
        this.getSteps().add(export);
    }

    @SuppressWarnings("unchecked")
    public void addMSOperation(MSOperation operation) {
        // Ant calls this for each nested <msoperation> element
        this.getSteps().add(operation);
    }
}

MSExport.java

package com.foo.ant.dbunit;

import org.dbunit.DatabaseUnitException;
import org.dbunit.ant.Export;
import org.dbunit.database.IDatabaseConnection;

public class MSExport extends Export {

    private String tableNames;

    @Override
    public void execute(IDatabaseConnection arg0) throws DatabaseUnitException {
        // code goes here
    }

    public String getTableNames() {
        return tableNames;
    }

    public void setTableNames(String tableNames) {
        this.tableNames = tableNames;
    }
}

MSOperation.java

package com.foo.ant.dbunit;

import org.dbunit.DatabaseUnitException;
import org.dbunit.ant.Operation;
import org.dbunit.database.IDatabaseConnection;

public class MSOperation extends Operation {

    @Override
    public void execute(IDatabaseConnection arg0) throws DatabaseUnitException {
        // code goes here
    }
}

Including your custom task in Ant

This assumes that the Ant classes from above are compiled into a JAR and present in ${lib.dir} along with dbunit.jar:

<path id="classpath">

    <fileset dir="${lib.dir}">
        <include name="*.jar" />
    </fileset>   
</path>

<taskdef name="msdbunit" classname="com.foo.ant.dbunit.MSDBUnitTask" classpathref="classpath" />
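For reference, the connection and library properties used by these targets could be defined along the following lines. The values below are purely illustrative (the driver class and URL format shown are those of the Microsoft JDBC driver), so substitute your own:

<property name="lib.dir" value="lib" />
<property name="db.driver" value="com.microsoft.sqlserver.jdbc.SQLServerDriver" />
<property name="db.url" value="jdbc:sqlserver://localhost:1433;databaseName=MyDatabase" />
<property name="db.username" value="dbunit_user" />
<property name="db.password" value="secret" />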

 

Adventures in Exporting

The Java code here goes in the execute method of MSExport.

This is where the majority of the problems start. As it turns out you can export with ease, but you can end up with XML that has invalid characters and some other interesting things that cannot be imported.

Let's start with the result we want (and need): the Ant XML to export the database:

<target name="db-export">

    <msdbunit
        driver="${db.driver}" 
        url="${db.url}" 
        userid="${db.username}" 
        password="${db.password}">

        <dbconfig>
            <property name="escapePattern" value="[?]" /> 
            <property name="datatypeFactory" value="org.dbunit.ext.mssql.MsSqlDataTypeFactory" /> 
        </dbconfig>
        <msexport format="xml" tableNames="${insert.table.names}" dest="full_export.xml" />
    </msdbunit>

</target>

The msdbunit and msexport tags were explained earlier, and the base Java code for creating them was given.

Issue #1: Export Table Order

You must specify the order of tables so that the export is done in insertion order. This is done by passing a comma-separated list of table names from ${insert.table.names} into the <msexport> tag as the tableNames attribute.

The reason for this order is that you must take FK dependencies into account when both inserting and deleting. For example, if Foo.A is a FK to Bar.A, then you can't start out by inserting records into Foo; you have to insert into Bar first. From the deletion perspective it is the opposite: you can't delete from Bar first, since Foo can still reference it and cause the delete to fail.
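As a hypothetical example using the Foo/Bar tables above, ${insert.table.names} is just an Ant property listing the tables in insertion order, so Bar (the referenced table) comes before Foo:

<property name="insert.table.names" value="Bar,Foo" />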

The MSExport task already has tableNames defined, so in order to get the result as a list of tables all that needs to be done is this:

//split the comma-separated list of table names into an array (insertion order)
String[] tableNamesArray = tableNames.split(",");
//use that order to build a sequence filter
ITableFilter filter = new SequenceTableFilter(tableNamesArray);
//create a dataset restricted to, and ordered by, those tables
IDataSet dataset = new FilteredDataSet(filter, arg0.createDataSet());

The second part of this is to use this array to define the table sequence. You must do this manually because, even though it can be done automatically, with 100+ tables that takes minutes as opposed to a fraction of a second.

Issue #2: Columns that are of type TIMESTAMP or default DATETIME

There is an issue if you export a column of the TIMESTAMP type or a DATETIME column with a default value, because when you try to import it you will get an error about not being able to insert a value into that column.

The exact error you get is the following:

Cannot insert an explicit value into a timestamp column

The solution is not to export these columns, which requires the use of a column filter. This was the primary reason for abandoning the default Ant task, as I couldn't figure out a good way to do column filtering on this scale with it.

List<ITable> updatedTables = new ArrayList<ITable>();

for (String tableName : tableNamesArray) {
    ITable table = dataset.getTable(tableName);
    List<Column> excludedColumns = new ArrayList<Column>();
    Column[] columns = table.getTableMetaData().getColumns();
    //For each column...
    for (Column column : columns) {
        //TIMESTAMP columns have to be ignored in MSSQL because they cannot have an explicit insert.
        //Unfortunately DATETIME in MSSQL also shows up here as a TIMESTAMP, so the only way to
        //determine a defaulted DATETIME or a TIMESTAMP is by looking for defaults and the SQL type name.
        if ((column.getDefaultValue() != null && column.getDataType() == DataType.TIMESTAMP) ||
                column.getSqlTypeName().equals("timestamp")) {
            excludedColumns.add(column);
        }
    }
    //convert the list of excluded columns to an array
    Column[] excluded = excludedColumns.toArray(new Column[excludedColumns.size()]);
    //create a new ITable that excludes the filtered columns
    ITable filteredTable = DefaultColumnFilter.excludedColumnsTable(table, excluded);
    //keep track of the modified table
    updatedTables.add(filteredTable);
}
//Take all of the modified tables and create a new dataset
ITable[] updateTableArray = updatedTables.toArray(new ITable[updatedTables.size()]);
CompositeDataSet compositeDataSet = new CompositeDataSet(updateTableArray);

This code iterates over all of the tables, looks for defaulted datetime and timestamp columns, excludes them from each table, and adds the filtered table to a new list of tables with the appropriate columns excluded. It then constructs a new composite dataset containing all of the new tables and columns.

Issue #3: Exporting your customizations

When you extend the Export task, you then have to define how the export is written out.

//Get the format and file destination
String format = this.getFormat();
File dest = this.getDest();
try {
    //Output based on the format
    if (format.equalsIgnoreCase(FORMAT_CSV)) {
        CsvDataSetWriter.write(compositeDataSet, dest);
    } else if (format.equalsIgnoreCase(FORMAT_FLAT)) {
        FlatXmlDataSet.write(compositeDataSet, new FileOutputStream(dest));
    } else if (format.equalsIgnoreCase(FORMAT_XML)) {
        XmlDataSet.write(compositeDataSet, new FileOutputStream(dest));
    } else if (format.equalsIgnoreCase(FORMAT_DTD)) {
        FlatDtdDataSet.write(dataset, new FileOutputStream(dest));
    } else {
        throw new DatabaseUnitException(format + " is not a recognized format.");
    }
} catch (IOException e) {
    throw new DatabaseUnitException(e);
}

Issue #4: Exporting Tables that have keywords for names

For example, User is a keyword, but you can have a table with this name in MS SQL and access it as [User]. In order to deal with this you want to wrap all table names in [].

This can be configured in the Ant XML using the escapePattern property of dbconfig, which is already part of the db-export target shown above. The ? in the pattern is replaced by the table or column name, so a value of [?] wraps every name in brackets:

<dbconfig>
    <property name="escapePattern" value="[?]" /> 
</dbconfig>

Issue #5: Data Type Factory

In order to be able to insert records into a MS SQL database using DBUnit, you have to specify the data type factory to use as part of the export. If you don't, any time you try to import a dataset containing an identity column you will get the following error:

Cannot insert explicit value for identity column in table

This issue can also be resolved with the default Ant task, using the datatypeFactory property in the dbconfig (again, already included in the db-export target shown above):

<dbconfig>
    <property name="datatypeFactory" value="org.dbunit.ext.mssql.MsSqlDataTypeFactory" /> 
</dbconfig>

Issue #6: Which table failed to export?

This issue is that when you are exporting your database and something goes wrong, the error does not specify which table was the problem. You just get a message such as "Cannot insert explicit value for identity column in table" and are left to figure out which table may have caused it.

Since we have our own Ant task that iterates over the tables, this is pretty easy to fix: put a System.out.println with the table name at the start of the loop from Issue #2 (for example, System.out.println("Exporting table " + tableName);). That way if something breaks, you know which table caused it.

Deleting from the Database

If you exported correctly this isn't a problem, and it doesn't require any custom work beyond the escapePattern and the data type factory. It should be noted that before inserting into your database you want to clear it using your export file first. The CLEAN_INSERT operation is supposed to do this on insert for you, but in my experience I have never been able to get it to work on a large database. It ends up throwing constraint violations, either because of the database, a bug, or my misuse of the operation.

<target name="db-delete">

    <msdbunit
        driver="${db.driver}" 
        url="${db.url}" 
        userid="${db.username}" 
        password="${db.password}">

        <dbconfig>
            <property name="escapePattern" value="[?]" /> 
            <property name="datatypeFactory" value="org.dbunit.ext.mssql.MsSqlDataTypeFactory" /> 
        </dbconfig>
        <operation type="DELETE_ALL" format="xml" src="full_export.xml" />
    </msdbunit>

</target>

Populating the Database

The Java code here goes in the execute method of MSOperation.

The same as with deleting from the database, if you correctly exported the dataset you should not have any problems. There is one feature though that is nice to have, which is the ability, in the case of an error, to know which table caused the error.

@Override
public void execute(IDatabaseConnection connection) throws DatabaseUnitException {
    //Only perform an operation for every table if doing a clean insert for MS SQL
    if (this.getType().equalsIgnoreCase("MSSQL_CLEAN_INSERT")) {
        try {
            FileReader reader = new FileReader(this.getSrc());
            String format = this.getFormat();
            IDataSet dataset = null;
            if (format.equalsIgnoreCase(FORMAT_XML)) {
                dataset = new XmlDataSet(reader);
            } else {
                throw new DatabaseUnitException("xml is the only supported format.");
            }
            String[] tableNames = dataset.getTableNames();
            //For each table, create a new dataset and execute the operation,
            //so that a failure can be traced back to the table that caused it
            for (int i = 0; i < tableNames.length; i++) {
                String tableName = tableNames[i];
                System.out.println("Populating table " + tableName);
                ITable table = dataset.getTable(tableName);
                ITable[] iTables = new ITable[1];
                iTables[0] = table;
                CompositeDataSet composite = new CompositeDataSet(iTables);
                this.getDbOperation().execute(connection, composite);
            }
        } catch (Exception e) {
            e.printStackTrace();
            throw new DatabaseUnitException(e);
        }
    } else {
        super.execute(connection);
    }
}

Beyond that all you need is the following:

  • MSSQL_CLEAN_INSERT
  • The MS SQL Data Type Factory
  • The escapePattern for tables with keywords for names

<target name="db-populate">

    <msdbunit
        driver="${db.driver}" 
        url="${db.url}" 
        userid="${db.username}" 
        password="${db.password}">

        <dbconfig>
            <property name="escapePattern" value="[?]" /> 
            <property name="datatypeFactory" value="org.dbunit.ext.mssql.MsSqlDataTypeFactory" /> 
        </dbconfig>
        <msoperation type="MSSQL_CLEAN_INSERT" format="xml" src="full_export.xml" />
    </msdbunit>

</target>

Where did I get my information?

The internet, combined with trial and error.

Tuesday, August 16, 2011

SQL query performance using a select on a large record set with Hibernate

This title is partly a question because I am not sure that my solution is the best one. I came to the following solution through trial and error, settling on what seemed to work.

The Database

The database is MS SQL 2008 Enterprise, we are using the correct JDBC driver, and are using Hibernate 3.4.0.GA. The database is also located on the same physical machine as the application server.

The Table

There is only a single table that is the target of the problem, which has the following structure:

Foo

  • ID (UUID Primary Key)
  • A (int, indexed)
  • B (datetime, indexed)
  • C (varchar, indexed)
  • D (int, not indexed)

In this table there are slightly over 2.5 million records, and I am trying to select fewer than 10 results.

The Query

This is the query that was slow, taking over 5 minutes to complete (in the SQL variant). Take into account, though, that the different columns were used under different circumstances, making the query dynamic. The more columns used, the slower the query. In the case of using every column it looked like this:

SQL:

SELECT ID, A, B, C, D
FROM FOO
WHERE A = 1 AND B >= '2011-08-12' AND C = 'BAR' AND D = 3

Hibernate:

Query query = em.createQuery("from Foo where a = :a and b >= :b and c = :c and d = :d");
query.setParameter("a", 1);
query.setParameter("b", date);
query.setParameter("c", "BAR");
query.setParameter("d", 3);
List<Foo> list = query.getResultList();

The Obvious

Column D isn't indexed. Column D also isn't used all of the time, and removing it from the query only cuts the 5-minute time by about 30 seconds. There is also the preference that I do not change the existing database unless it is absolutely the only option.

There are also probably better ways to deal with the datetime column type, for example see http://diegworld.blogspot.com/2010/09/sql-server-performance-querying-by-date.html. However, removing this column from the query doesn't change the amount of time it takes.

The SQL Solution

Eventually I just started playing around with the SQL variant of the query to see if I could improve the performance. I started by running select statements with each individual column and found that they all returned in about a second, for example:

SELECT ID, A, B, C, D FROM FOO WHERE A = 1

SELECT ID, A, B, C, D FROM FOO WHERE B >= '2011-08-12'

SELECT ID, A, B, C, D FROM FOO WHERE C = 'BAR'

SELECT ID, A, B, C, D FROM FOO WHERE D = 3

At this point I realized that the problem has to do with all of the restrictions added in the where clause, but changing the order of the "ands" didn't improve performance either. With this information I then tried building the query so that it would incorporate subqueries. The idea is that if I select on A I have set A, from which I can select on B to get set AB, select on C from that to get set ABC, and select on D from that to get set ABCD. The resulting set ABCD contains only records which meet the A, B, C, and D criteria.

The resulting SQL that I came up with was the following:

SELECT ID, A, B, C, D FROM FOO fooA WHERE fooA.ID IN (
    SELECT ID FROM FOO fooB WHERE fooB.ID IN (
        SELECT ID FROM FOO fooC WHERE fooC.ID IN (
            SELECT ID FROM FOO fooD WHERE D = 3
        ) AND fooC.C = 'BAR'
    ) AND fooB.B >= '2011-08-12'
) AND fooA.A = 1

This query took less than a second to run, as opposed to the previous variant that was taking more than 5 minutes.

My question, though, is whether this is the best thing to do from both a SQL and an MS SQL perspective.

The Hibernate Implementation

Hibernate has a lot of flexibility in terms of how you can do subqueries, but according to my research on the internet the preferred means of doing this is to use Criteria and DetachedCriteria.

The way this works in Hibernate is as follows:

Session session = (Session) getEntityManager().getDelegate();
Criteria crit = session.createCriteria(Foo.class);

// First subquery to get A
DetachedCriteria query1 = DetachedCriteria.forClass(Foo.class);
query1.add(Restrictions.eq("a", 1));
query1.setProjection(Projections.property("id"));
crit.add(Subqueries.propertyIn("id", query1));

// Second subquery to get B
DetachedCriteria query2 = DetachedCriteria.forClass(Foo.class);
query2.add(Restrictions.ge("b", date));
query2.setProjection(Projections.property("id"));
crit.add(Subqueries.propertyIn("id", query2));

// Third subquery to get C
DetachedCriteria query3 = DetachedCriteria.forClass(Foo.class);
query3.add(Restrictions.eq("c", "BAR"));
query3.setProjection(Projections.property("id"));
crit.add(Subqueries.propertyIn("id", query3));

// Fourth subquery to get D
DetachedCriteria query4 = DetachedCriteria.forClass(Foo.class);
query4.add(Restrictions.eq("d", 3));
query4.setProjection(Projections.property("id"));
crit.add(Subqueries.propertyIn("id", query4));

List<Foo> results = crit.list();

Each subquery reads as "get the ids from Foo where the restriction holds true", and the criteria represents the combined result of all of the added subqueries.

This isn’t the exact equivalent of the previously specified SQL though. When looking at the HQL debug output it ends up being the following:

SELECT ID, A, B, C, D
FROM Foo
WHERE ID IN (SELECT ID FROM Foo WHERE A = 1)
AND ID IN (SELECT ID FROM Foo WHERE B >= '2011-08-12')
AND ID IN (SELECT ID FROM Foo WHERE C = 'BAR')
AND ID IN (SELECT ID FROM Foo WHERE D = 3)

This is better than what was there, but it still isn't as efficient as it could be. In this code we are selecting 4 different times across the entire Foo record set. In the SQL I am trying to write, each select is from the previous subset. For example, we look for condition B inside the A result set, which is smaller than all of the Foo records.

As it turns out, it isn’t that hard. You just add your subqueries to other subqueries like this:

Session session = (Session) getEntityManager().getDelegate();
Criteria crit = session.createCriteria(Foo.class);

// First subquery to get A
DetachedCriteria query1 = DetachedCriteria.forClass(Foo.class);
query1.add(Restrictions.eq("a", 1));
query1.setProjection(Projections.property("id"));
crit.add(Subqueries.propertyIn("id", query1));

// Second subquery to get B
DetachedCriteria query2 = DetachedCriteria.forClass(Foo.class);
query2.add(Restrictions.ge("b", date));
query2.setProjection(Projections.property("id"));
query1.add(Subqueries.propertyIn("id", query2));

// Third subquery to get C
DetachedCriteria query3 = DetachedCriteria.forClass(Foo.class);
query3.add(Restrictions.eq("c", "BAR"));
query3.setProjection(Projections.property("id"));
query2.add(Subqueries.propertyIn("id", query3));

// Fourth subquery to get D
DetachedCriteria query4 = DetachedCriteria.forClass(Foo.class);
query4.add(Restrictions.eq("d", 3));
query4.setProjection(Projections.property("id"));
query3.add(Subqueries.propertyIn("id", query4));

List<Foo> results = crit.list();

This Hibernate code results in the desired SQL:

SELECT ID, A, B, C, D FROM FOO fooA WHERE fooA.ID IN (
    SELECT ID FROM FOO fooB WHERE fooB.ID IN (
        SELECT ID FROM FOO fooC WHERE fooC.ID IN (
            SELECT ID FROM FOO fooD WHERE D = 3
        ) AND fooC.C = 'BAR'
    ) AND fooB.B >= '2011-08-12'
) AND fooA.A = 1

Thursday, August 11, 2011

Flex 3 Test Automation using QTP

This information has been sitting in my "to publish" list for a while now, so I am finally going to put it out there since a couple of people have been asking about it. It describes my experiences with automating user interface interaction testing using QTP, and all the things I had to work around.

1. Setting up and running QTP

1.1 Add "automation_charts.swc" to the Flex SDK libs directory

Example: C:\Program Files\Adobe\Flex Builder 3 Plug-in\sdks\3.2.0\frameworks\libs

1.2 Add the QTP libraries to your project compiler options

Add the following to the compiler options on the project:

-include-libraries "C:\Program Files\Adobe\Flex Builder 3 Plug-in\sdks\3.2.0\frameworks\libs\automation.swc" "C:\Program Files\Adobe\Flex Builder 3 Plug-in\sdks\3.2.0\frameworks\libs\automation_agent.swc" "C:\Program Files\Adobe\Flex Builder 3 Plug-in\sdks\3.2.0\frameworks\libs\automation_charts.swc" "C:\Program Files\Adobe\Flex Builder 3 Plug-in\sdks\3.2.0\frameworks\libs\qtp.swc"

Absolute paths must be used; relative paths will not work.

This will force the libraries into your main application SWF, which should increase its size by about 1.2 MB

1.3 Run your Flex application on a web server.

You must run the application on a web server; it cannot be run locally. If you attempt to run it locally you will get lots of browser scripting errors.

Example: http://myserver/foo/bar.html

1.4 You must be running Flash 9

The computer running the application and QTP must have Flash 9, otherwise system popups will be blocked by Flash Player security

1.5 You must run QTP and the browser window in the same monitor.

The computer running the application and QTP must be doing so in the same monitor in the event that two monitors are being used; more on this later.

 

2. Required Code Changes

For various reasons when doing any type of user interface testing in Flex you have to make code changes.

2.1 Create new build and deployment

You will need to create a new variation of your build that compiles the automation libraries into the application binary (and all modules). This is because you don't want to deploy your release binaries with the QTP automation classes compiled into them, since they significantly increase the size of your SWF.

2.2 Recursively set the automation hierarchy when running for automation

In order for QTP to be able to play back button presses on objects deep within the component hierarchy, an automation value has to be set throughout the application at runtime; see "QTP Playback is slow" below for why and how.

2.3 Create automation delegates for every custom component that QTP does not recognize

QTP will not recognize interactions with items that inherit directly from UIComponent, or that are otherwise not a standard button, text component, or control. See "Creating a Custom Flex Automation Delegate" below for more information.

 

3. QTP Specifics and Problems

3.1 Creating a Custom Flex Automation Delegate

Summary

Delegates exist so that you do not have to place automation-related code in your standard components; after all, most people don't want to run their applications with automation support. Delegates are provided for all of the framework classes and will generally work out of the box for any framework class you extend. The need to write custom delegates arises if you create a custom component that directly extends UIComponent, or if you have complex requirements for a component which already has a framework-provided delegate.

A delegate has to be created for each custom class. Example:

package
{
import flash.display.DisplayObject;
import flash.events.Event;

import mx.core.IInvalidating;
import mx.automation.Automation;
import mx.automation.AutomationIDPart;
import mx.automation.IAutomationObject;
import mx.automation.IAutomationObjectHelper;
import mx.automation.delegates.core.UIComponentAutomationImpl;
import mx.automation.events.AutomationRecordEvent;

import randomWalkClasses.RandomWalkEvent;

[Mixin]
public class RandomWalkDelegate extends UIComponentAutomationImpl
{
    private var walker:RandomWalk;

    public function RandomWalkDelegate(randomWalk:RandomWalk)
    {
        super(randomWalk);
        randomWalk.addEventListener(RandomWalkEvent.ITEM_CLICK, itemClickHandler);
        randomWalk.addEventListener(AutomationRecordEvent.RECORD, labelRecordHandler);
        walker = randomWalk;
    }

    public static function init(obj:DisplayObject):void
    {
        Automation.registerDelegateClass(RandomWalk, RandomWalkDelegate);
    }

    private function itemClickHandler(event:RandomWalkEvent):void
    {
        recordAutomatableEvent(event);
    }

    public function labelRecordHandler(event:AutomationRecordEvent):void
    {
        // if the event is not from the owning component, reject it.
        if (event.replayableEvent.target != uiComponent)
            //event.preventDefault(); can also be used.
            event.stopImmediatePropagation();
    }

    override public function get numAutomationChildren():int
    {
        var numChildren:int = 0;
        var renderers:Array = walker.getItemRenderers();
        for (var i:int = 0; i < renderers.length; i++)
        {
            numChildren += renderers[i].length;
        }
        return numChildren;
    }

    override public function getAutomationChildAt(index:int):IAutomationObject
    {
        var numChildren:int = 0;
        var renderers:Array = walker.getItemRenderers();
        for (var i:int = 0; i < renderers.length; i++)
        {
            if (index >= numChildren)
            {
                if (i + 1 < renderers.length && (numChildren + renderers[i].length) <= index)
                {
                    numChildren += renderers[i].length;
                    continue;
                }
                var subIndex:int = index - numChildren;
                var instances:Array = renderers[i];
                return (instances[subIndex] as IAutomationObject);
            }
        }
        return null;
    }

    override public function createAutomationIDPart(child:IAutomationObject):Object
    {
        var help:IAutomationObjectHelper = Automation.automationObjectHelper;
        return help.helpCreateIDPart(this, child);
    }

    override public function resolveAutomationIDPart(part:Object):Array
    {
        var help:IAutomationObjectHelper = Automation.automationObjectHelper;
        return help.helpResolveIDPart(this, part as AutomationIDPart);
    }

    override public function replayAutomatableEvent(event:Event):Boolean
    {
        var help:IAutomationObjectHelper = Automation.automationObjectHelper;
        if (event is RandomWalkEvent)
        {
            var rEvent:RandomWalkEvent = event as RandomWalkEvent;
            help.replayClick(rEvent.itemRenderer);
            (uiComponent as IInvalidating).validateNow();
            return true;
        }
        else
            return super.replayAutomatableEvent(event);
    }
}
}

The TEAFlexCustom.xml for Mercury on all machines that intend to do automation has to be updated to include details about the custom class. Example:

<TypeInformation xsi:noNamespaceSchemaLocation="ClassesDefintions.xsd"
    Priority="0" PackageName="TEA" Load="true" id="Flex"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
    <ClassInfo Name="FlexRandomWalk" GenericTypeID="randomwalk" Extends="FlexObject" SupportsTabularData="false">
        <Description>FlexRandomWalk</Description>
        <Implementation Class="RandomWalk"/>
        <TypeInfo>
            <Operation Name="Select" PropertyType="Method" ExposureLevel="CommonUsed">
                <Implementation Class="randomWalkClasses::RandomWalkEvent" Type="itemClick"/>
                <Argument Name="itemRenderer" IsMandatory="true">
                    <Type VariantType="String" Codec="automationObject"/>
                    <Description>User clicked item</Description>
                </Argument>
            </Operation>
        </TypeInfo>
        <Properties>
            <Property Name="automationClassName" ForDescription="true">
                <Type VariantType="String"/>
                <Description>To be written.</Description>
            </Property>
            <Property Name="automationName" ForDescription="true">
                <Type VariantType="String"/>
                <Description>The name used by the automation system to identify an object.</Description>
            </Property>
            <Property Name="className" ForDescription="true">
                <Type VariantType="String"/>
                <Description>To be written.</Description>
            </Property>
            <Property Name="id" ForDescription="true" ForVerification="true">
                <Type VariantType="String"/>
                <Description>Developer-assigned ID.</Description>
            </Property>
            <Property Name="automationIndex" ForDescription="true">
                <Type VariantType="String"/>
                <Description>The object's index relative to its parent.</Description>
            </Property>
            <Property Name="openChildrenCount" ForVerification="true" ForDefaultVerification="true">
                <Type VariantType="Integer"/>
                <Description>Number of children open currently</Description>
            </Property>
        </Properties>
    </ClassInfo>
</TypeInformation>

3.2 File Selection fails to playback

When playing back the selection of a file in a System32 file dialog, QTP throws an error with the message "Object not visible".


Cause: Dual Monitors
Reference: http://forums11.itrc.hp.com/service/forums/questionanswer.do?threadId=1200371

Summary: This error is sometimes thrown when the browser, the secondary window (the file dialog), and QTP are not all in the same monitor.

Solution: Display all the windows in the same monitor.

3.3 QTP Playback is slow

Cause #1: Bug with automation in Flex 3 SDK

Summary: QTP playback is slower because of the implementation of the automation libraries for Flex in the version 3 SDK. Automation scripts have to iterate through the entire component and container hierarchy in order to find the component being used, so deep hierarchies are especially slow. The issue does not exist in the version 2 SDK and has been resolved in the version 4 SDK.

For example pressing the upload button to open the file dialog took 4 minutes, because the button is so deep within the hierarchy.

Potential Workaround: Set the automationName of the container and component

Workaround Summary

The automationName is a property available on every component which designates the name used to identify the component in test automation. If no automationName is specified, Flex containers use their id attribute, and if no id attribute is present the container will have an automationName generated for it prefixed with "index". Buttons, however, use the value of their label as their automationName when it is not specified.

It was theorized that setting the automationName may allow quicker access to the component, without having to iterate through the entire component container hierarchy, but this did not make a difference.

For example a button without an automationName would be accessed in QTP like the following:

Browser(#).FlexApplication("Client").FlexButton("Upload and Print").Click

Where "Upload and Print" is the automationName generated from the button label. If automationName were instead set on the component in Flex to "myButton", then the button would be accessed like the following:

Browser(#).FlexApplication("Client").FlexButton("myButton").Click

Actual Workaround: Set showInAutomationHierarchy to true on every container in the application hierarchy


Workaround Summary

By default when QTP has to access a Flex component, it has to access that component by automationName. That component is located within the application by iterating, starting with the application container and all of its children, until the component with that automationName is found. The result is that a component that is deep within a container hierarchy is slow to find. It is for this reason, among others, that it is recommended that Flex applications do not overuse container components. Unfortunately, in the case of a large, already-built application, it is too late to go back and redesign the component container structure.

The solution is to make every container within the hierarchy visible to automation in order to limit the looping needed to find a component. This can be done by setting the showInAutomationHierarchy value on a container to true, which will cause it to be detected by QTP even if that container is not directly involved in the interaction. If a Flex application uses modules to incrementally load itself over time, it is not possible on startup to recursively set the hierarchy value; instead, after a module is loaded, the module and the components within it can recursively have this value set.

public function configureAutomation(obj:DisplayObject):void
{
    if (obj is Container)
    {
        var c:Container = obj as Container;
        c.showInAutomationHierarchy = true;
        var children:Array = c.getChildren();
        for (var i:int = 0; i < children.length; i++)
        {
            var child:DisplayObject = c.getChildAt(i);
            configureAutomation(child);
        }
    }
}

For example without this change the Upload button in the Client is accessed using QTP like the following, which requires 4 minutes to process:

Browser("#").FlexApplication("Client").FlexDividedBox("workflowUploadBox").FlexButton("uploadFirstBtn").Click

When all of the containers have the automation hierarchy set to show, that same button is accessed like the following in QTP and happens instantly:

Browser("#").FlexApplication("Client").FlexContainer("index:29").FlexCanvas("_DocumentViewModule_DocumentVi").FlexDividedBox("workflowUploadBox").FlexBox("index:0").FlexCanvas("newDocView").FlexBox("_NewDocument_VBox1").FlexCanvas("uploadView").FlexBox("_UploadView_HBox2").FlexButton("uploadFirstBtn").Click

Cause #2: QTP 9.1 is slow

Summary: Script playback in QTP 9.1 is very slow, and switching to 9.5 makes the playback significantly faster.

 

4. Maintainability Concerns

This is a concern with Flex and HTML based user interface drivers, in which the actor has to know the ID of the component in order to play back some action on it. In frameworks like JSF and Flex you don't have to give an XML component an ID, which results in that component getting a dynamically generated ID.


In Flex the problem is that when a component is not explicitly named, its name is calculated at compile time based on the component's position in its hierarchy. For example, when a new Canvas is added to a container, the generated names of the canvases that were already there change.

It should also be noted that buttons are identified by their label text, which means that if the text on the button changes, the script will also have to change.

In order to ensure that a component's name in QTP is always the same, the developer has to set the component's automationName in the code for buttons and labels, and the component's id for boxes and canvases.
