Running TestArchitect tests from Microsoft Test Manager

After creating a TA-MTM integration project in Visual Studio that contains test cases, you can plan your tests, execute test cases, and view test results in MTM.

To plan and run your tests from MTM, follow these steps:
  1. Run MTM and connect it to the team project on TFS where your project resides. MTM is included with the following Visual Studio releases:
    • Visual Studio 2015.
    • Visual Studio 2013.
  2. In MTM Testing Center, click Test on the ribbon. Then click the Run Tests tab. Double-click your TFS project in the project explorer tree.

    Your selected project appears in the MTM main window panel showing the test cases that are linked with TestArchitect test cases.
  3. In the MTM main window panel, right-click a test case that you want to run and choose Run. The selected test case must be of the automated type. (That is, the value of Automated must be Yes.)
    Note: If you want to run a test in a different environment or with different test settings, choose Run with options.
    The TestArchitect Execution dialog box appears.

  4. In the dialog box, enter the requested information as described below:
    • TestArchitect Information panel:

      • Repository Server: Name or IP address of the TestArchitect Repository Server holding the repository of the tests.
      • Port: Port number of the TestArchitect Repository Server host.
      • Repository Name: Name of the TestArchitect repository hosting the tests.
      • User Name: TestArchitect user name whose account has permission to access and execute tests on the host.
      • Password: Password for the above user account.
    • General tab:
      • Variation Specification panel:

        • Keyword: Keyword, or comma-delimited list of keywords, specifying the test variation to be executed, if any. (See Creating keyword variations.)
        • AUT Version: Enter a value or click the Select Version button to specify a variation tailored to an AUT version or platform. (See Creating linked variations.)
        • Time Traveling: To opt for time-traveling execution, which selects a historical "snapshot" of the test's project items for execution during the test run, select the check box and provide an appropriate timestamp. (See Time Traveling for details.)
      • Controllers/Devices panel:

        • Select Controllers/Devices: Click this button to designate which controller and device the test will execute on.
          • Lab Manager Server: (Display only) IP and port number of the Lab Manager Server to which the test controllers and devices are registered.
          • Controllers/Devices panel: Lists all available controllers and devices on which the test can be executed. The list consists of those controllers and devices that are either registered with the Lab Manager Server or have been manually added with the Add Controller button.
          • Controller Port Configuration: Use this panel to specify the port number that the remote machine is using for its TestArchitect controller, if it is not using the default.
            • Host Name: (Display only) IP address of the remote machine currently selected in the Controllers/Devices panel.
            • Port: Port number through which TestArchitect attempts to communicate with the controller on the host specified in the Host Name field. If this is not the port on which the controller is known to be listening, change this value and then click Save.
        Restriction: Only one controller or one device may be selected to run tests at a time. In other words, multiple controller/device execution is prohibited.
      • Screenshot recording: Select this check box to configure the capturing of screenshots of UI-Interacting actions. (For details, see Capturing screenshots during test execution.)

      • Include screenshots: Select this check box if you want to retain all captured screenshots in the exported HTML test result.

        • Optimized resolution: (Default) Included screenshots' dimensions are optimized to save space in the exported HTML test results. Specifically, the screenshots are saved as thumbnail images.
        • Regular resolution: Original resolution of included screenshots is retained. Specifically, the screenshots are saved as full size images.
      • Automatically add result(s) to repository: Select this check box if you want to add test results automatically to the repository once the test concludes.

        • Repository destination: Add test results to the repository at this specified location.
        • By result: Limit the results stored in the repository according to the following check box selections:
          • Passed: Passed test results are stored.
          • Passed with Warnings/Errors: Passed test results with warnings/errors status are stored.
          • Passed with known bug: Test results which passed, but with marked known bugs whose outcomes have been ignored, are stored.
          • Failed: Failed test results are stored.
    • Startup Settings tab:

    Note:
    • To save the configurations, click the Save button.
    • By default, the TestArchitect Execution dialog box only appears on the first run. If and when the configurations saved from the previous run are no longer valid, the dialog box reappears on the subsequent run.
  5. Click Run to execute the tests.
    Important: If you are executing tests for the first time, you are prompted to enter your TestArchitect license server information. Enter the required information in the dialog box. If the specified license server is reached successfully, or the trial key is validated, the provided license information is stored. From then on, you will be able to execute tests without again being prompted for this information.

    CAUTION:
    MTM's automated tests settings include two timeout settings for imposing limits on automated test times. If these settings are enabled, you must ensure that they meet or exceed the maximum expected test times for your tests. These settings are available under MTM's Lab Center, under Test Settings > Timeouts:

    • The first setting, Abort a test run if the total time exceeds, when enabled via its check box, is intended to impose a ceiling on the total run time of the test run. If you set a value here and your run exceeds it, the tests themselves are not aborted or directly affected in any way; however, your test results are not uploaded to TFS.
    • The setting Mark an individual test as failed if its execution time exceeds applies to each individual test in the run. Hence, if it is used (which we advise against), ensure that its value exceeds the maximum expected time of the longest test in the run.
      CAUTION:
      If this setting is enabled and any one of your tests does exceed its value, it may have the effect of corrupting the transferred results of other tests in the set.
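Given these timeout caveats, one way to derive safe values is to base them on rough per-test duration estimates: the run-level timeout on the sum of all tests, the per-test timeout on the slowest test. A minimal sketch; the example durations and the 1.5x safety margin are illustrative assumptions, not TestArchitect or MTM requirements:

```python
def safe_timeouts(test_durations_sec, margin=1.5):
    """Return (total_run_timeout, per_test_timeout) in seconds.

    - The 'Abort a test run if the total time exceeds' setting must
      cover the whole run, so it is based on the sum of all tests.
    - The 'Mark an individual test as failed if its execution time
      exceeds' setting must cover the slowest single test.
    """
    total = int(sum(test_durations_sec) * margin)
    per_test = int(max(test_durations_sec) * margin)
    return total, per_test

# Example: three tests estimated at 10, 4, and 25 minutes.
total_s, per_test_s = safe_timeouts([600, 240, 1500])
print(total_s, per_test_s)  # 3510 2250
```

If the estimates prove too tight in practice, increase the margin rather than disabling the check silently, so overruns remain visible.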
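Separately, runs that never start are often traceable to an unreachable Repository Server or controller port. A quick pre-flight TCP check from the MTM machine can rule this out before you click Run; the host names and port numbers below are placeholders for the values you enter in the TestArchitect Execution dialog box:

```python
import socket

def port_open(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Placeholder endpoints -- substitute your own Repository Server and
# controller host/port values from the TestArchitect Execution dialog box.
endpoints = {
    "Repository Server": ("repo-server.example.com", 53200),
    "Controller":        ("10.0.0.15", 53400),
}

for name, (host, port) in endpoints.items():
    status = "reachable" if port_open(host, port) else "NOT reachable"
    print(f"{name} {host}:{port} -> {status}")
```

A "NOT reachable" result usually points at a firewall rule, a stopped service, or a mistyped port rather than a TestArchitect configuration problem.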

A test run panel appears showing the status of the running test. Initially, the Result Overview section displays the current state of the tests as pending while the test cases are run. After the selected test cases finish their runs, the Result Overview section is updated to show the status of the test runs.

Important: When running an automated test, the execution of every single TFS test case entails the execution of both the INITIAL and FINAL sections, if any. For example, the execution of test case #01 is preceded by the execution of the INITIAL section, and followed by the FINAL section; the same is true for test case #02, and each subsequent test case.

Additionally, the test run results are generated and then uploaded to the run's Attachments section as HTML attachment file(s).
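If you save several HTML result attachments to a local folder, they can be reviewed in bulk rather than one at a time. A small sketch; the folder path is a placeholder for wherever you saved the attachments:

```python
import pathlib
import webbrowser

def html_results(folder):
    """Return the saved HTML result files in a folder, sorted by name."""
    p = pathlib.Path(folder)
    if not p.is_dir():
        return []
    return sorted(p.glob("*.html"))

# Placeholder path -- point this at the folder holding the saved attachments.
for html_file in html_results(r"C:\TestResults"):
    webbrowser.open(html_file.as_uri())  # view each result in the default browser
```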

Tip:
  • Right-click the attachment file and select Open to view the test results in a browser.
  • The layout of the generated test results when test cases are executed in MTM is similar to the layout displayed when the same test cases are executed in TestArchitect.