The Data Block Speed Dashboard provides developers and technical users with performance insights into the execution times of Data Blocks. It allows users to monitor execution history and identify which Data Blocks may be contributing to performance delays. This dashboard is available in the Developer Center and provides a structured, multi-level view of Data Block performance and trace information.
Accessing the Data Block Speed Dashboard #
To open the Data Block Speed Dashboard, navigate to the Developer Center and click Stats in the top-right corner of the screen. From the drop-down menu, select Data Block Speed.
Once opened, the dashboard presents a Summary View showing all Data Blocks executed within the selected time period. Users can adjust the Start Date, End Date, Block Category, and Block Type to filter results.
Understanding the Summary View #
The Summary View displays an overview of Data Block runs, including key metrics such as:
- Name – The name of the Data Block.
- Output Type – The Data Block’s output (e.g., Data Table or Values).
- Block Category – The category of the block (e.g., Interface, Transform).
- Block Type – The type of Data Block, such as Python or Local Database.
- Runs – Number of executions within the selected period. Clicking this value opens the Details View for that specific Data Block.
- Min Time / Max Time / Avg Time – The minimum, maximum, and average execution times.
- Failed – Indicates the number and percentage of failed runs.
- Last Execution Time – Displays the most recent execution date and time.
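To make these metrics concrete, the following is a minimal sketch of how they could be computed from raw run records. The record layout (`run_on`, `duration_sec`, `succeeded`) is a hypothetical example for illustration only and does not reflect Assette's internal data model.

```python
from datetime import datetime
from statistics import mean

# Hypothetical run records for a single Data Block over the selected period;
# field names are illustrative assumptions only.
runs = [
    {"run_on": datetime(2025, 1, 6, 9, 15), "duration_sec": 1.8, "succeeded": True},
    {"run_on": datetime(2025, 1, 6, 14, 2), "duration_sec": 4.6, "succeeded": False},
    {"run_on": datetime(2025, 1, 7, 8, 40), "duration_sec": 2.1, "succeeded": True},
]

durations = [r["duration_sec"] for r in runs]
failed = sum(1 for r in runs if not r["succeeded"])

# Summary View metrics derived from the raw records.
summary = {
    "Runs": len(runs),
    "Min Time (s)": min(durations),
    "Max Time (s)": max(durations),
    "Avg Time (s)": round(mean(durations), 2),
    "Failed": f"{failed} ({failed / len(runs):.0%})",
    "Last Execution Time": max(r["run_on"] for r in runs),
}
print(summary)
```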
This view includes both user-created and Assette-managed Data Blocks (for example, AttributesMaster). Assette-managed Data Blocks may not be visible in the Developer Center’s Data Block list, but they appear here to help users trace potential system-level performance issues.
Please note that Python Environment Data Blocks (for example, CalculationEnv) are excluded from the dashboard.
Details View #
Clicking on the Runs value of any Data Block opens the Details View, which lists individual executions within the chosen date range.
The Details View includes:
- Run Date and Run Time
- Pipeline Level
- Data Block Duration
- Pipeline Duration
- Status (e.g., Succeeded or Failed)
- Errors (if applicable)
Users can also filter the list to show only successful or failed runs, or narrow it to runs with specific durations (e.g., Min Time or Max Time).
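Conceptually, these filters behave like simple predicates over the run list. The sketch below is a hypothetical illustration; the field names (`status`, `duration_sec`) are assumptions, not Assette's actual schema.

```python
# Hypothetical run records for one Data Block; field names are assumptions.
runs = [
    {"run_on": "2025-01-06 09:15", "duration_sec": 1.8, "status": "Succeeded"},
    {"run_on": "2025-01-06 14:02", "duration_sec": 4.6, "status": "Failed"},
    {"run_on": "2025-01-07 08:40", "duration_sec": 2.1, "status": "Succeeded"},
]

# Show only failed runs, as the status filter does.
failed_runs = [r for r in runs if r["status"] == "Failed"]

# "Min Time" / "Max Time" point to the fastest and slowest runs in the range.
fastest = min(runs, key=lambda r: r["duration_sec"])
slowest = max(runs, key=lambda r: r["duration_sec"])

print(len(failed_runs), fastest["run_on"], slowest["run_on"])
```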
The system provides trace information to help identify performance dependencies; note, however, that trace data becomes available only after a 5-minute delay following a Data Block run.
Content Details #
Selecting a specific run from the Details View opens the Content Details screen. This screen provides trace-level information for the selected run, including:
- Name of the Data Block and any dependent blocks involved in that execution.
- Pipeline Level – Indicates execution order within the pipeline.
- CID – The Content ID of the run instance.
- Template Name and Data Object – If applicable, the template or object associated with the execution.
- Run On – The exact date and time the run occurred.
- Data Block Duration and Pipeline Duration – Execution metrics.
- Status and Errors – The outcome of the run.
At this level, users can view dependencies triggered during the run. For example, if the CurrencyCodes Data Block references Source_ListOfCurrencyCodes in its definition, both will appear in the trace.
However, Data Blocks that are added as dependencies but not called in the definition (for example, GetCurrencyCodesLocal added but unused) are not shown in the Content Details view.
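For illustration, a trace for the CurrencyCodes example above might be represented as a list of executed blocks tagged with their pipeline levels. The structure, field names, and level assignments below are hedged assumptions, not Assette's actual trace format.

```python
# Hypothetical trace for one run; field names and level assignments are
# illustrative only. Only blocks actually called during execution appear:
# GetCurrencyCodesLocal was declared as a dependency but never referenced,
# so it does not show up, matching the behavior described above.
trace = [
    {"name": "CurrencyCodes", "pipeline_level": 1, "duration_sec": 2.4},
    {"name": "Source_ListOfCurrencyCodes", "pipeline_level": 2, "duration_sec": 1.1},
]

# Walk the trace by pipeline level to see where time is spent.
for step in sorted(trace, key=lambda s: s["pipeline_level"]):
    print(f"Level {step['pipeline_level']}: {step['name']} ({step['duration_sec']}s)")
```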
On this screen, users can also view the request parameters and properties for the specific Data Block. It is expected and normal for the “Properties” field to be blank for most Data Block executions.
Data and Refresh Behavior #
The Data Block Speed Dashboard does not update in real time. Users must refresh the page to view newly executed runs.
Runs initiated via the Test or Preview options within the Developer Center also appear in the dashboard. This makes it possible to measure both production and test execution performance from a single interface.
Example Use Cases #
- Identifying performance bottlenecks across Data Blocks.
- Comparing run times across multiple days or versions of a block.
- Tracing dependency delays, especially when multiple blocks are triggered by a single definition.
- Monitoring Assette-managed system Data Blocks for visibility into background processes.