Compare Runs in Simcenter Testlab

Want to compare data from different runs quickly in Simcenter Testlab or Simcenter Testlab Neo? Want to calculate the average, standard deviation, or envelope of these runs?

This article explains how to use the Run Comparison features of both Simcenter Testlab and Simcenter Testlab Neo. 

Contents:
1. Simcenter Testlab Run Averaging and Comparison Organizer
   1.1 Getting Started
   1.2 Online Run Comparison
   1.3 Online Run Comparison Example
   1.4 Offline Run Comparison
2. Simcenter Testlab Neo Run Average
   2.1 Getting Started
   2.2 Process Designer
   2.3 Pivot Table


1. Simcenter Testlab Run Averaging and Comparison Organizer


Direct YouTube link: https://youtu.be/es1tbX_kOHU


Simcenter Testlab has a dedicated add-in for comparing different runs of data, called the "Run Data Averaging & Comparison Organizer". It can be used online while acquiring data live in Signature Acquisition, or offline in postprocessing.

1.1 Getting Started

Begin by going to Tools / Add-ins in Simcenter Testlab as shown in Figure 1.
 


Figure 1: In Simcenter Testlab, go to Tools/Add-ins


Select “Run Data Averaging & Comparison Organizer” (Figure 2). If using Simcenter Testlab token licensing, this uses 16 tokens.


Figure 2: The "Add-ins" window.
 

The add-in includes both online and offline functionality. In either case, five averaging functions are added as shown in Figure 3 below:


Figure 3: Definitions of the five functions.
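
To make the definitions concrete, here is a minimal numpy sketch of how such run statistics are computed bin by bin across a set of runs. The data and the selection of statistics are assumptions for illustration; Figure 3 defines the actual five functions.

    import numpy as np

    # Hypothetical data: each row is one run's spectrum on a shared frequency axis.
    runs = np.array([
        [0.10, 0.80, 0.20],   # Run 1
        [0.15, 0.60, 0.25],   # Run 2
        [0.05, 0.90, 0.15],   # Run 3
    ])

    average = runs.mean(axis=0)           # mean level in each frequency bin
    minimum = runs.min(axis=0)            # lower envelope across the runs
    maximum = runs.max(axis=0)            # upper envelope across the runs
    std_dev = runs.std(axis=0, ddof=1)    # sample standard deviation per bin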
 

The rest of this section is split into two parts: the first covers the online functionality of Compare Runs, and the second covers the offline functionality.
 

1.2 Online Run Comparison
 

After activating the Compare Runs add-in, five averaging functions are added to the Measure workbook of Simcenter Testlab Signature, as shown in Figure 4.
 


Figure 4: The Measure workbook before and after the Compare Runs add-in is activated.


Compare Runs also adds information to the Data Explorer. Any 2D function that can be averaged will appear under the Compare Runs folder (Figure 5).
 


Figure 5: Explore the Compare Runs folder in Data Explorer to see what functions you can compare across runs.


1.3 Online Run Comparison Example

For this example, three runs were acquired. Each run consisted of a single tone:

  • Run 1 was a tone of 1000 Hz.
  • Run 2 was a tone of 1500 Hz.
  • Run 3 was a tone of 2000 Hz.

Each run was acquired using a stationary acquisition consisting of 13 averages over 3 seconds.
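
As a rough illustration of this kind of measurement (not Simcenter Testlab's implementation), the sketch below generates the three tone runs and forms an averaged autopower spectrum for each. The 10240 Hz sample rate and the non-overlapping, Hanning-windowed blocks are assumptions:

    import numpy as np

    fs = 10240          # assumed sample rate in Hz (not stated in the article)
    duration = 3.0      # seconds per run
    n_avg = 13          # spectral averages per run

    def averaged_autopower(signal, n_avg):
        """Split the record into n_avg blocks and average their autopower spectra."""
        block_len = len(signal) // n_avg
        spectra = []
        for i in range(n_avg):
            block = signal[i * block_len:(i + 1) * block_len]
            block = block * np.hanning(block_len)      # window each block
            spectrum = np.fft.rfft(block) / block_len
            spectra.append(np.abs(spectrum) ** 2)      # autopower = |X|^2
        return np.mean(spectra, axis=0)

    t = np.arange(int(fs * duration)) / fs
    runs = {f"Run {i + 1} ({f} Hz)": averaged_autopower(np.sin(2 * np.pi * f * t), n_avg)
            for i, f in enumerate([1000, 1500, 2000])}

    run_average = np.mean(list(runs.values()), axis=0)   # the "average of all runs"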

To begin, drop data into the display from the Compare Runs folder. In this case, a stationary autopower acquisition was performed, so an “AutoPower Mic” selection is available. Drag it from the left pane into the display.
 


Figure 6: Any 2D functions will be available under the Compare Runs folder.


Next, select which averaging functions to display in the Measure workbook. For this example, the average function is displayed. Users can display any combination of the Compare Runs functions by toggling the function buttons on or off in the upper right of the plot display area.
 

Figure 7: Five functions are available with the Compare Runs add-in.


Under the “More…” button (Figure 8), there are three online averaging display options.
 


Figure 8: Select which runs to display under the “More…” button.


For this example, “All runs” is selected.

A. “No run display” will not display any of the runs as they populate; only the averaging functions selected will be displayed.
B. “Last run only” will display only the most recently acquired run in addition to the averaging functions selected.
C. “All runs” will display all acquired runs in addition to the averaging functions selected.

The result after Run 1 (a tone at 1000 Hz) is shown below. Run 1 is displayed in red. The average of all runs is displayed in black (Figure 9). In this case, the two curves are the same, so they overlap completely until another run is acquired.
 


Figure 9: The first run and average of all runs are displayed. After only one run these values are the same.


The result after Run 2 (a tone at 1500 Hz) is shown below. Run 1 remains displayed in red because the online averaging run display mode is set to “All runs”. As each run is acquired, it populates the display and remains there as additional runs are acquired. Run 2 is displayed in purple. The average of the two runs is displayed in black as shown in Figure 10.
 


Figure 10: The display mode is set to “All runs”; the runs populate over one another and remain in the display.


The third and final run is then acquired (a tone at 2000 Hz). Run 3 is displayed in green. The original two runs remain in the display and the average function updates to include the third run as shown in Figure 11.
 


Figure 11: The runs continue to populate in the display and the average function updates appropriately.
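
The behavior seen in Figures 9 through 11 amounts to a running average: after one run the average equals that run, and it is updated as each new run arrives. A minimal sketch of such an incremental update, using made-up spectra:

    import numpy as np

    # Made-up spectra for the three runs (same frequency axis).
    run1 = np.array([1.0, 0.2, 0.1])
    run2 = np.array([0.2, 1.0, 0.2])
    run3 = np.array([0.1, 0.2, 1.0])

    avg = None
    for n, run in enumerate([run1, run2, run3], start=1):
        if avg is None:
            avg = run.astype(float)        # after Run 1, the average equals Run 1
        else:
            avg += (run - avg) / n         # incremental update of the mean
        print(f"Average after run {n}: {avg}")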


When finished acquiring data, go to the “Compare Runs” workbook. This is where the averaging functions across your runs are saved. 1) Select which averaging functions you would like to save by toggling the function buttons. 2) Press “Save…”
 


Figure 12: The Compare Runs workbook.


Users can either save just the selected functions or save all of the functions. Name the folder, then click OK to save. The menu is shown in Figure 13.
 


Figure 13: Save the functions.


The functions will be saved to the folder in the Navigator workbook. In this case, just the average function was saved. Browse to the data block and display it (Figure 14).
 


Figure 14: Just the average function was saved.
 

If “All functions” had been selected instead when saving, the function tree in the Navigator would look like Figure 15 below.
 


Figure 15: All functions are saved.


Online run comparison is a great tool to instantly check your run data average!

1.4 Offline Run Comparison

It is possible to compare runs offline. These runs can come from any project or section.

In the offline mode, analysis begins in the Compare Runs tab (Figure 16).


Figure 16: Compare Runs tab.

The Compare Runs workbook has four main panes.


Figure 17: Compare Runs workbook.

  1. Data Source Selection: import run data from the input basket or the active project.
  2. Run Selection: choose which runs to display and which runs to include in the averaging functions.
  3. Function Selection: a matrix of each Point ID and calculated function. From this matrix, choose what will be displayed in the display pane.
  4. Display area: data plotted in this area is selected from the function and point matrix (area 3).

Data Source Selection

It is possible to import runs via the input basket or from the active project. Use the radio buttons to select the data source.


Figure 18: Select to import from the Input Basket or Active project.

Once the data is imported, the Run Selection and Function Selection areas will populate.


Figure 19: The Run Selection and Function Selection areas are populated once the data is imported.

Run Selection

In the Run Selection area, there are two columns of checkboxes on the far right (see Figure 20).

  • The display column determines whether or not the run data is displayed in the display area on the right side of the screen (area 4).
  • The averaging column determines whether or not the run data is included in the averaging functions that are calculated.

It is important to note that a function may be included in the average even if it is not displayed. It is wise to inspect which of these boxes are checked before saving your averaged run data.

Figure 20: The Display and Averaging columns determine which runs are displayed and which runs are used to calculate the averaging functions.
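
Conceptually, each run carries two independent flags, which is why a run can contribute to the average without being plotted. A hedged sketch of how the two checkboxes interact (the names and data are invented for illustration):

    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class Run:
        name: str
        data: np.ndarray
        display: bool    # "Display" checkbox: show this run in the plot
        average: bool    # "Averaging" checkbox: include this run in the statistics

    runs = [
        Run("Run 1", np.array([1.0, 0.2]), display=True,  average=True),
        Run("Run 2", np.array([0.2, 1.0]), display=False, average=True),   # hidden, but still averaged
        Run("Run 3", np.array([0.5, 0.5]), display=True,  average=False),  # plotted, but excluded
    ]

    displayed = [r.name for r in runs if r.display]
    averaged = np.mean([r.data for r in runs if r.average], axis=0)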

Function Selection

There are three different ways to display the functions: Point by Point, Function by Function, or User Defined Layout. To choose a display type, click P, F, or U in the lower right corner of the Function Selection area.


Figure 21: P: Point by Point. F: Function by Function. U: User Defined Layout.

  • Point by Point: Allows the user to display multiple functions calculated from the same Point ID.

Figure 22: Multiple functions are displayed for the same Point ID.

  • Function by Function: Allows the user to display one function across multiple Point IDs.

Figure 23: Multiple Point IDs are displayed for the same function.

  • User Defined Layout: Allows the user to choose a custom mix of Point IDs and functions from the available options.

Figure 24: A user-defined mix is selected.

In the user defined layout mode, there is one extra step to be aware of. If a multi-plot display is opened, a button becomes active for each plot in the display, to the left of the P, F, and U buttons. Use these buttons to select the active plot.


Figure 26: The user defined mode. Here the third graph is selected. The dotted line around the third plot indicates it is the active plot.

To select which functions to display, toggle the boxes in the upper right corner of the Function Selection area. The functions are defined at the top of the article in Figure 3.

Figure 27: The averaging functions are in the top right corner.

To control how many points/functions are displayed, click on “Options…” in the Function Selection area.

Adjust the numbers for “Functions per display” and “Points per display”.

Figure 28: The Options window in Compare Runs.

Saving Average Run Data

Finally, save the data. Choose where to save it, which functions to save, and what to call the new folder.

Figure 29: The Save window in Compare Runs.

2. Simcenter Testlab Neo Run Average


Direct YouTube link: https://youtu.be/9-_AYpVe1_8


In Simcenter Testlab Neo Process Designer, the "Run Average" method can be used to compare data from different runs and compute statistics.

2.1 Getting Started

To activate the method, go to "File -> Add-ins" and load the Process Designer, Run Averaging, and Signature Analysis (or another analysis method).
 

Figure 30: A minimum of three add-ins (Process Designer, Run Averaging, and Signature Analysis) are needed to use the "Run Average" method.


If using Simcenter Testlab token licensing, 74 tokens would be needed to load the add-ins in the example.

2.2 Process Designer

With the add-ins loaded, create the following process (or similar) in the Process Designer of Simcenter Testlab Neo, as shown in Figure 31 below:
 

Figure 31: Process to calculate and compare the average, minimum, and maximum of multiple runs in Simcenter Testlab Neo.


In the Run Average method, the desired statistics can be selected as shown in Figure 32:
 

Figure 32: Select the desired statistics (average, minimum, maximum, standard deviation, 3 sigma) from the properties of the Run Average method.
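
Of these, "3 sigma" commonly denotes the average plus three standard deviations. Assuming that definition (the Run Average method's properties define the actual statistic), a minimal numpy sketch:

    import numpy as np

    # Hypothetical run spectra, shape (n_runs, n_bins).
    runs = np.array([
        [0.12, 0.75, 0.22],
        [0.18, 0.55, 0.28],
        [0.08, 0.85, 0.18],
    ])

    mean = runs.mean(axis=0)
    sigma = runs.std(axis=0, ddof=1)
    three_sigma = mean + 3.0 * sigma   # assumed definition of the "3 sigma" statistic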

 

The desired functions and run statistics can then be calculated.

An example of using this process to calculate and compare orders can be found in the knowledge article: Calculating and Comparing Orders and Overall Levels in Simcenter Testlab Neo.

2.3 Pivot Table

After processing the data, it can be viewed using the Pivot Table as shown in Figure 33:
 

Figure 33: Use the Pivot Table feature of Simcenter Testlab Neo to compare data between runs.
 

Additional columns of information can be added to the pivot table as needed to sort and view the data quickly (Figure 34):
 

Figure 34: Additional fields can be added to the pivot table to compare data between runs more easily.


More information on using Simcenter Testlab Neo can be found in the related knowledge articles linked below.



Enjoy the power of run comparison in Simcenter Testlab and Simcenter Testlab Neo!

 

Questions? Email scott.beebe@siemens.com, contact Siemens Support Center, or post a reply!

 

Simcenter Testlab Acquisition Tips

Simcenter Testlab Processing Tips

