Tuesday, December 11, 2012

Analyzing SQL execution plans in SQL Server Management Studio (SSMS)


Description

It is important that the trace is taken against a data set with large record counts in the transactional tables the SQL statements run against.  The number of records in the tables affects both the SQL execution plans and the execution times.

Create a trace 

Dynamics AX 2012 : Getting started with tracing


The trace must have the “Bind parameters” option selected in order to get the correct SQL statement from the trace to analyze.
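For example, with “Bind parameters” off the trace records only parameter placeholders, while with it on the literal values are captured and the statement can be pasted straight into SSMS. The table and values below are illustrative only:

```sql
-- "Bind parameters" OFF: the captured statement contains only
-- placeholders and cannot be executed directly in SSMS.
SELECT * FROM CUSTTABLE WHERE DATAAREAID = @P1 AND ACCOUNTNUM = @P2

-- "Bind parameters" ON: the literal values are captured, so the
-- statement runs as-is. (Values here are illustrative.)
SELECT * FROM CUSTTABLE WHERE DATAAREAID = N'ceu' AND ACCOUNTNUM = N'1101'
```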

Analyze the trace

Dynamics AX Trace Parser (Quick Trace tool)



Analyze the SQL execution plans

As part of analyzing the trace file it may be determined that a SQL statement has a high “Exclusive Average (ms)” or “Exclusive Total (ms)”, in which case its execution plan should be analyzed.
  • As a general guideline, any “Exclusive Average (ms)” greater than 10 ms should be investigated.  Some exceptions are insert and update statements that affect a large number of records, but even those should be investigated to ensure optimal execution plans are being achieved.
  • It may also turn out that the average time is not high but the query is executed a large number of times, resulting in a high “Exclusive Total (ms)”.  In that case the SQL execution plan should be analyzed to make each call as efficient as possible, or the number of calls should be reduced if possible.
1. Select the SQL statement in Trace Parser that needs to be analyzed.
  • Copy the SQL statement displayed in the code preview window at the bottom.


  • This is where the “Bind parameters” option used during the trace capture matters.  Without it, the SQL statement will not have valid parameter values and will not be executable in SSMS.
2. Start SQL Server Management Studio
  • Connect to the AX database (usually AXDBDEV).
  • Create a new query.
  • Paste the SQL statement from the above trace into the query window.
3. Start analyzing
  • Select Query > Include Actual Execution Plan


  • Execute the query.
  • Select the Execution plan tab in the output window.
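The actual execution plan can also be requested from the query text itself instead of the menu option; SET STATISTICS XML returns the plan as an XML document alongside the results. The statement below is illustrative only:

```sql
-- Equivalent to Query > Include Actual Execution Plan:
-- returns the actual plan as XML alongside the results.
SET STATISTICS XML ON;

SELECT T1.ACCOUNTNUM
FROM CUSTTABLE T1
WHERE T1.DATAAREAID = N'ceu';

SET STATISTICS XML OFF;
```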

    • Analyze each table in the execution plan to verify the predicate and output list for the operation.
      • Hover over the table to view the operation details in the execution plan.

    •  Analyze index scans
      • Look at the predicate for the table being scanned. Determine whether an index exists that would provide an index seek and adjust the SQL statement to utilize that index.
      • SSMS may provide a recommendation to fix the issue, or it may be determined that no index exists to provide better performance.
      • In these cases additional fields may be needed on an existing index or a new index may be needed.
      • Scans may also occur simply because the table contains few records.  Make sure the table is populated with a realistic number of records if it is expected to hold a large volume in a production environment.  If the table is a setup table whose record count is very small and will always stay small, a scan on it may not be an issue.
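As a sketch, the existing indexes can be listed to see whether one already matches the scan’s predicate, and if not, a new index on the predicate columns can turn the scan into a seek. The table, index, and column names below are hypothetical:

```sql
-- List the existing indexes on the table to see whether one
-- already covers the predicate columns of the scan.
EXEC sp_helpindex 'CUSTTRANS';

-- If none does, an index on the predicate columns can turn the
-- scan into a seek. Names here are illustrative only.
CREATE NONCLUSTERED INDEX IX_CUSTTRANS_INVOICE
    ON CUSTTRANS (DATAAREAID, INVOICE);
```

Note that for Dynamics AX the index should normally be added in the AOT rather than directly in SQL Server, since database synchronization can drop indexes that do not exist in the AOT.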
    • Analyze index seeks
      • Look at the seek predicate for the table and determine whether the only predicates for the index seek are the Partition and/or DataAreaId fields.
      • In that case the index seek will perform the same as a scan, so consider the same changes that would be made to fix a scan operation.
    • Analyze key lookups
      • These are lookups to get additional fields from the record that are not part of the index used to select the record.
      • In some cases key lookups can change the SQL execution plan and may be an expensive operation.
      • Determine whether all the fields in the output list actually need to be returned, and trim the field list if possible.
      • If the lookup is causing performance issues determine whether an alternate index for the table contains the output list and whether the SQL statement can be adjusted to utilize that index, or determine whether index changes would fix the issue.
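One common fix, sketched below with hypothetical names, is to add the output-list columns to the index as included columns so that the index covers the query and the key lookup disappears:

```sql
-- The key lookup occurs because the index used for the seek does
-- not contain all columns in the output list. Including those
-- columns makes the index covering. Names are illustrative only.
CREATE NONCLUSTERED INDEX IX_CUSTTRANS_INVOICE_COVER
    ON CUSTTRANS (DATAAREAID, INVOICE)
    INCLUDE (AMOUNTMST, TRANSDATE);
```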
    • Items to consider:
      • Even though a new index or changes to an existing index may fix the scenario, consider whether the change can have negative impact on other scenarios.
      • New indexes, or adding columns to existing indexes on high-volume tables, will slow down write times in SQL Server, and that must be weighed before making the change.
      • Consider whether marking any index columns as included columns is better for the scenario.  This determination is based on the selectivity of the column and how likely it is to change.
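As a rough illustration of the key-column versus included-column choice (all names hypothetical): a low-selectivity column such as a status field is usually better included than added to the key, since included columns are stored only at the leaf level and do not widen the index key:

```sql
-- As a key column: participates in the index sort order, which
-- only helps if queries actually seek on it.
CREATE NONCLUSTERED INDEX IX_SALESLINE_ITEM_STATUS
    ON SALESLINE (DATAAREAID, ITEMID, SALESSTATUS);

-- As an included column: covers the output list without widening
-- the key. Names are illustrative only.
CREATE NONCLUSTERED INDEX IX_SALESLINE_ITEM
    ON SALESLINE (DATAAREAID, ITEMID)
    INCLUDE (SALESSTATUS);
```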
