
Feature #4789

Compare duration test data for nightly builds

Added by Stuart Sides about 1 year ago. Updated 7 months ago.

Status:
Acknowledged
Priority:
High
Assignee:
-
Category:
Infrastructure
Target version:
-
Impact:
Software Version:
Test Reviewer:
Story points:
2

Description

The nightly builds have timing information attached to each application. Create a way to compare past successful durations. Flag ones that increase over time and ones that increase suddenly.

History

#1 Updated by Ian Humphrey about 1 year ago

  • Story points set to 2

DoD:

  • SPIKE: How do we want to approach this?
    • Having 2 people working together (spending 1 SP each) to find solutions and bounce ideas off each other would be great for this ticket
    • Database?
    • Wait until cmake in place?
    • New test framework?
    • Scripting?

Tyler-Adam solution (or the start of one)

  • Create a python script to parse the test logs for each development machine. As long as the test logs are in the same format under cmake as they are now, it doesn't make a difference and we can start on this now. Even if there are differences, modifying the script should be easy.
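A minimal sketch of what such a parser could look like, assuming each result line starts with the test name and ends with a bracketed "[HH:mm:ss]" run time; the actual nightly log format (now or under cmake) may differ, in which case only the regular expression should need to change:

```python
import re

# Assumed line shape, e.g. "cnetadd:default ... [00:01:23]".
RESULT_RE = re.compile(r"^(?P<name>\S+).*(?P<stamp>\[\d{2}:\d{2}:\d{2}\])")

def parse_log(lines):
    """Map each test name to its raw '[HH:mm:ss]' run-time stamp."""
    times = {}
    for line in lines:
        match = RESULT_RE.match(line)
        if match:
            times[match.group("name")] = match.group("stamp")
    return times
```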

  • The format of the output from this script should be as simple as possible. Here is one possible design:

The header:
[Category: string (unit/app/cat)] [Name of the object (if a unit test)] [Time Stamp Run 1], [Time Stamp Run 2], ...
[Application: test name]
[Category test: test name]

In the output logs the run time is displayed in the format [HH:mm:ss]. We think the script parsing the input log should convert this to a single value in seconds.
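The conversion itself is small; a sketch (tolerant of the surrounding brackets):

```python
def hms_to_seconds(stamp):
    """Convert an '[HH:mm:ss]' run-time stamp to whole seconds."""
    h, m, s = (int(part) for part in stamp.strip("[]").split(":"))
    return h * 3600 + m * 60 + s
```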

This could just as easily be a table in a database, and depending on how long we want to keep time-series data, a database is probably the better place for it. Talk to Mike or Stuart about what database options are available to us.

Here is a sample of what a space-delimited output log of run-times over the past three days might look like:

Category          Name              2017-05-17             2017-05-16          2017-05-15
Unit              AutoReg           32.4                   28.7                22.8
Unit              ShapeModel        12.4                   26.0                22.3
 :
App               cnet2dem:default  34.2                   31.3                11.34
App               cnet2dem:full     31.1                   36.1                12.74
App               cnetadd:default   35.4                   24.6                13.44
App               cnetadd:point     37.6                   48.8                11.44
App               cnetadd:polygon   32.2                   18.3                12.5
App               cnetadd:validator 38.4                   17.4                11.54
:
Cat               mro:hirise        120.3                  125.6               123.5
Cat               near:msi          240.6                  241.9               253.3 

Every time the tests are run on a system, the script would append a new date-stamped column to the right of this file. We still haven't decided on a retention cut-off: 30 days? A year?

The time stamp at the top of each column could be in any format. Unix epoch time (Tyler's preference) would also work, and is easy to convert to other time formats.
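The ticket's two flags (a sudden jump and a gradual creep) could be computed per row of this table. A sketch, with admittedly made-up thresholds: a "sudden" increase is a latest run more than 50% over the median of the earlier runs, and a "steady" increase is a strictly rising sequence of durations:

```python
from statistics import median

def flag_durations(durations, jump_factor=1.5):
    """durations: run times in seconds, oldest first. Returns a set of flags."""
    flags = set()
    if len(durations) < 2:
        return flags
    *earlier, latest = durations
    # Sudden increase: the newest run is well above the historical median.
    if latest > jump_factor * median(earlier):
        flags.add("sudden-increase")
    # Steady increase: every run is slower than the one before it.
    if all(b > a for a, b in zip(durations, durations[1:])):
        flags.add("steady-increase")
    return flags
```

For example, the `cnet2dem:default` row above (11.34 → 31.3 → 34.2, oldest first) would raise both flags under these thresholds.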

  • Use matplotlib to make time plots for everything stored in this file (actually, multiple files across the different development machines). The plotting script can be separate from the parsing script, or they can be combined; keeping them separate, though, would make it easier to split this project among different developers.

  • A preliminary idea: using matplotlib, the script would output a report in HTML (or PDF?) format showing time plots for each application/object/cat test on each development machine, and store them in some temporary directory.
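A sketch of the per-test plotting step, using matplotlib's non-interactive Agg backend so it can run on a headless build machine. The function name, dates, and times are illustrative, not part of any existing script:

```python
import os
import tempfile

import matplotlib
matplotlib.use("Agg")  # render to files; no display needed on a build machine
import matplotlib.pyplot as plt

def plot_history(name, dates, seconds, out_dir):
    """Save a run-time-over-date plot for one test; return the PNG path."""
    fig, ax = plt.subplots()
    ax.plot(dates, seconds, marker="o")
    ax.set_title(name)
    ax.set_xlabel("run date")
    ax.set_ylabel("run time (s)")
    path = os.path.join(out_dir, name.replace(":", "_") + ".png")
    fig.savefig(path)
    plt.close(fig)
    return path

out_dir = tempfile.mkdtemp()
png = plot_history("cnetadd:default",
                   ["2017-05-15", "2017-05-16", "2017-05-17"],
                   [13.44, 24.6, 35.4], out_dir)
```

An HTML report would then just be a page of `<img>` tags pointing at these files.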

As a side note, the unit test build logs do not currently track the time it takes for the unit tests to complete. If at all possible, times for unit tests should be included in the new cmake system.

#2 Updated by Tyler Wilson about 1 year ago

  • Assignee set to Adam Paquette

#3 Updated by Adam Paquette about 1 year ago

  • Status changed from Acknowledged to In Progress

#4 Updated by Adam Paquette about 1 year ago

  • Status changed from In Progress to Resolved

#5 Updated by Stuart Sides about 1 year ago

  • Status changed from Resolved to Acknowledged

The description looks like a good start. This issue has been moved back to Acknowledged to await implementation (possibly after cmake).

#6 Updated by Stuart Sides 11 months ago

  • Target version changed from 3.5.1 (Sprint 1) to 3.5.2 (2017-01-31 Jan)

#7 Updated by Stuart Sides 8 months ago

  • Assignee deleted (Adam Paquette)

#8 Updated by Stuart Sides 7 months ago

  • Target version deleted (3.5.2 (2017-01-31 Jan))

Removed from 3.5.2. Hold until the GitHub transition is very stable, then replan.
