New Data Analysis Features of Test Center 1.1

This article focuses on the improved data analysis features of Squish Test Center 1.1. More specifically, we look at changes to the Dashboard, Timeline and Graphs pages, all of which were made to streamline your workflow. We encourage you to take a look at the official 1.1 release announcement for an overview of other enhancements not covered here.

Improvements to Dashboard & Project Overview

Test Center's Dashboard acts as a central summary of your result uploads, showing at-a-glance indicators for the health of your projects. We've improved the look and functionality of the Dashboard to include more details in the project overview. Where we previously showed only the test status of the most recent test runs, we now show the name of each batch and the labels associated with it. You can configure which labels are displayed.

In addition to retrieving the most recent results by execution time or upload time, there is a new option for retrieving the most recent result for each label value. You can set this up to show the most recent result for each branch of your Application Under Test (AUT), or the most recent result for each configuration on which you run your tests.

For each project, you can now display the latest result by label on the Dashboard.

If you encounter test failures in your current development branch, you might want to quickly compare the results against the stable branch of your AUT. We've now made the comparison functionality accessible directly from the project overview; previously, this required a detour via the History page.

You can now initiate the batch comparison directly from the Dashboard, using the compare buttons on the right side.

Whether you're dealing with several test runs in a short timeframe, or with many different branches developed in parallel, you can now also increase the number of recent results shown in the overview.

Since Test Center was built to handle any number of projects, we've improved the project listing on the Dashboard. You can now filter the projects to show only those of current relevance. Previously, projects were sorted either by latest execution or upload time; now you can filter and sort them into a desired -- and more predictable -- order.

Improvements to Timeline Visualization

Until 1.1, the timeline showed only the passed or failed state of a test run at a specific point in time, while the name of the test run appeared only in a tooltip. Now, you can choose to display the batch names directly in the timeline as well, giving you a better idea of which test runs you are currently visualizing.

While it was already possible to view the timeline for a specific label (e.g., to look at the history of a specific branch), you can now display label values within the timeline, too, providing additional information about the displayed test runs. This is useful, for example, when viewing the history of a specific test while also seeing on which branches it was executed.

Additional configuration options for the Timeline make it easier to identify results for specific labels.

This will make it easier to identify the test runs that are relevant to you.

Improvements to Graphs Page & Analytics

Test Center 1.1 brings with it a major overhaul to the Graphs page, for improved data analytics.

The number of available statistics shown on a graph has been extended to include the following:

  • Passes
  • Failures
  • Passed tests
  • Failed tests
  • Warnings
  • Skips
  • Duration of tests

It's also now possible to combine the above statistics in custom ways. For example, you could show the number of passes next to the duration of tests, to analyze whether a change in duration is caused by new tests being added or by tests not being executed.

The improved graphs will help you analyze test duration changes.

When choosing multiple statistics of the same unit, you can also choose to show them either stacked or clustered.

Depending on the data, it might be easier to spot trends in a clustered or stacked graph.

Within the Graphs page, you can view the statistics across all test cases, or you can select an individual test, scenario or step. When looking at the statistics across all test cases, you can choose to look at a single summarized value, or at how much each test or label contributes to the overall value. When looking at the distribution across test items, it's also possible to drill down into the hierarchy from within the graph itself, with a mouse click.

Here we can see how much each test suite contributes to the overall execution time.

Hovering over bars in the graph will show you a tooltip that reveals detailed information about the other statistics currently not displayed in the graph.

Even in a duration graph we can see the number of passes and failures in the graph's tooltip.

Wrap Up

Test Center 1.1 delivers a wealth of new features for more advanced analysis of your test result data. No matter your project size, Test Center can help you assess application health with ease.

    The Qt Company acquired froglogic GmbH in order to bring the functionality of their market-leading automated testing suite of tools to our comprehensive quality assurance offering.