Test Center 1.1 Release Webinar Q&A

We received some great questions during the webinars following the Test Center 1.1 feature release.

We compiled the Q&A from both sessions in this blog post. If you want to revisit the webinar, check out the recording we uploaded to our YouTube channel, embedded below.

Getting Started, Licensing & Documentation

[1.] We have licenses for Squish 6.6.x. Can we directly install and use Test Center?

All Squish 6.6.x packages come bundled with Test Center. To get started using Test Center, you'll need an activation code unique to the tool, separate from your Squish license key.

Contact us to begin an evaluation and get your activation code.

[2.] Where can I find Test Center documentation?

The latest Test Center 1.1 documentation is freely available online. You can view it here.

[3.] How are results grouped by Test Center? What's the difference between a project, a batch and a report?

Test Center Projects are used to separate test results that are not related. For example, if you're using Test Center to manage the results of multiple products, you'd have a separate Test Center Project for each product.

A Batch is a group of labeled test Reports, where a Report is one or more uploaded test results executed on a single configuration. You can tag your Reports with labels to specify attributes of that execution configuration.

A deep dive into how Test Center structures and groups results, plus examples, can be found in our documentation.
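To make the grouping concrete, here is a minimal sketch in plain Python of how projects, batches and labeled reports relate to each other. This is an illustrative data model only, not the Test Center API; all class and field names are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical data model mirroring the grouping described above;
# names are illustrative and not part of any Test Center API.

@dataclass
class Report:
    """One uploaded set of test results, executed on a single configuration."""
    results: List[str]                                     # e.g. ["tst_login: PASS", "tst_checkout: FAIL"]
    labels: Dict[str, str] = field(default_factory=dict)   # e.g. {"OS": "Windows 10", "Browser": "Chrome"}

@dataclass
class Batch:
    """A group of labeled Reports, e.g. one nightly run across configurations."""
    name: str
    reports: List[Report] = field(default_factory=list)

@dataclass
class Project:
    """Separates unrelated results, e.g. one Project per product."""
    name: str
    batches: List[Batch] = field(default_factory=list)

# Example: one product with a nightly batch covering two configurations
product_a = Project("ProductA", batches=[
    Batch("nightly-run", reports=[
        Report(results=["tst_login: PASS"], labels={"OS": "Windows 10"}),
        Report(results=["tst_login: FAIL"], labels={"OS": "Ubuntu 20.04"}),
    ]),
])
```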

Integrations

[1.] Does Test Center integrate with FogBugz?

Not at the moment. For issue tracking and reporting tools, we currently support integration with Atlassian Jira.

FogBugz support, however, is tentatively on our development roadmap. If you're a FogBugz user, comment below or send us an email, as we're currently gathering interest.

[2.] Are there plans to integrate Test Center with ALM / Quality Center from Micro Focus?

A Test Center integration with ALM / Quality Center from Micro Focus has garnered growing interest from our users. Support for this integration is on the horizon.

Dashboard & Project Overview

[1.] In earlier versions of Test Center, the Project Dashboard showed how many tests failed out of the total. Why was that removed?

We made multiple improvements to the Dashboard with the goal of providing the most useful at-a-glance information about your projects. The number of test failures in a batch, by itself, is not an informative metric. Both the look of the Dashboard and its functionality have been improved. Here's an overview of the new capabilities:

  • Batch names and associated labels displayed with test runs; labels shown are user-configurable.
  • Retrieve the most recent results by execution date, upload date or label value.
  • Comparison functionality accessible directly from the Dashboard.
  • Custom number of displayed results.
  • Filtering by relevance.

Graphs Page & Explore Page

[1.] Can you explain the 'Retried' metric when working with custom graphs?

Retried tests in Test Center are defined as test cases that Squish tries to execute again after a failure. The number of retries is counted until the test passes.

More information on the retry option of squishrunner can be found in our documentation.
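As an illustration of that counting rule, here is a small Python sketch. It is not squishrunner itself; the function and test names are hypothetical. It shows a test being re-executed after failures up to a retry limit, with retries counted only until the test passes:

```python
import random
from typing import Callable, Tuple

def run_with_retries(test: Callable[[], bool], max_retries: int) -> Tuple[bool, int]:
    """Execute a test, retrying after failures until it passes or the limit is reached.

    Returns (passed, retries_used). Retries are only counted until the test passes,
    mirroring the counting rule described above.
    """
    passed = test()
    retries = 0
    while not passed and retries < max_retries:
        retries += 1          # one retry counted per re-execution after a failure
        passed = test()       # retrying stops as soon as the test passes
    return passed, retries

# Example: a flaky test that passes roughly half the time
flaky_test = lambda: random.random() > 0.5
passed, retries = run_with_retries(flaky_test, max_retries=3)
print(f"passed={passed}, retries counted={retries}")
```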

[2.] Is it possible to overwrite the 'reds' in the Explore page's pie charts when a test suite is retriggered?

This is currently not possible, but as a workaround, we suggest removing the failed report and uploading the retriggered test results.

Watch the Webinar

If you missed it live, or if you're looking to revisit the session, we've uploaded a recording of the webinar to our YouTube channel.

Have a look below:


If you have any other questions, drop us a comment here or send us an email!
