CRM Performance Pre-determiner


This article explains how to diagnose poorly performing MSCRM applications, identify where the actual problem lies, and apply some suggested solutions.

Best Practices Analyzer 

First up, have you run the Microsoft Dynamics CRM 2013 Best Practices Analyzer on your installation? If any errors are highlighted, address those first – they might not have a direct impact on the performance of your system, but it’s best to eliminate them before you get too far.

Determine the aspects that cause the performance issue

The first thing you’re going to need to understand is: what is slow about your CRM, and how have you measured that? You need a grip on which aspects of performance are being reported as problematic if you’re going to solve them. Often a multitude of factors leads to poor performance, so it is important to establish some kind of baseline measurement of what the current ‘poor’ performance is, understand what acceptable performance looks like, and then work out what is causing the difference.

– Is it the web service calls from some custom integration?

– Could it be related to an increased number of users, or an increased volume of data?

– Is it the user perception of performance when accessing the web UI?

– Is it searching data?

– Is it editing and creating data?

Some Performance Diagnostic Tools

The following tools help determine the performance of the system.

– CRM’s built-in diagnostic tools – for measuring browser web UI performance, and network metrics between browser and server

– Fiddler – for capturing traffic between the browser and server

– SOAP-UI – for reproducing web service calls – works on both SOAP and REST web services

– SQL Server Management Studio – for examining underlying SQL Server performance, executing queries, checking execution plans

CRM’s Built-in diagnostic tools

This is a very handy way of getting end users to capture the performance of the CRM from their end and send it in to you. If you go to http://<CRM-URL>/tools/diagnostics/diag.aspx you’ll see the diagnostics page. Click the Run button and you’ll get the results of some tests executed from the browser, which reveal any issues between browser and server, or with the browser itself. It also helps to run this at different times of the day if your scenario involves performance varying throughout the day.

From CRM 2013 SP1 onwards, Microsoft added a new browser diagnostic tool. Press Ctrl + Shift + Q in Internet Explorer and a diagnostics panel will appear.

Now click the Enable button and load a form you’re trying to analyze. Hit Ctrl + Shift + Q again and you’ll get an excellent breakdown of the form’s performance.

This can be handy to compare performance from different browsers / end-users, and also to see the impact your form design is having. Loaded up with a billion sub-grids? That’s a paddling.

External application calls to MSCRM

MSCRM allows integration by other systems. A pattern I’ve seen often involves getting .NET developers to write a simplified set of web services that conform to organization-specific data models, acting as a wrapper around the CRM. This simplifies integration and transforms CRM objects into the required models; it also provides a bit of abstraction so you can minimize disruption if you upgrade the CRM installation later. Sounds awesome, and you can bash out some code pretty damn quickly that gives you the desired results using LINQ.

Assuming you are writing queries against the OData service:

– Only return the attributes and links that you need (see the sketch after this list)

– Specify only the attributes that you need and then execute the query

– Additional unnecessary attributes just result in additional overhead of serializing / de-serializing.

– A handy way to log the timing of your custom web services is to ensure ‘time-taken’ is logged in IIS (assuming ASP.NET web services). You can then analyze this for queries exceeding your target times.

– Understand that your LINQ query may result in multiple OData web service calls. These happen sequentially, which adds up to lost time.

– Test your queries using SOAP-UI directly against the CRM OData service

– Compare the results to SQL Server Filtered Views – try T-SQL in SQL Server Management Studio that gets similar result-sets, how does that perform by comparison?

– One option for reading data is to connect to the SQL Server Filtered Views – go straight to the heart of the beast.

– Don’t jump into this without considering the future implications – it won’t work in a CRM Online world for instance, but if the bulk of the operations for your web services are read-oriented it may be worth checking out.
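To see the effect of trimming attributes, you can hit the OData endpoint directly from any HTTP client. Below is a minimal JavaScript sketch (runnable from a browser on the CRM domain); AccountSet, Name and AccountNumber are standard CRM names, but treat the exact URL shape as an assumption for your organization:

    // Request only the columns we need via $select – every extra attribute
    // is just serialization / de-serialization overhead on both ends.
    var query = "http://<CRM-URL>/XRMServices/2011/OrganizationData.svc/AccountSet" +
                "?$select=Name,AccountNumber" + // only the attributes we need
                "&$top=10";

    var req = new XMLHttpRequest();
    req.open("GET", query, true);
    req.setRequestHeader("Accept", "application/json");
    req.onreadystatechange = function () {
        if (req.readyState === 4 && req.status === 200) {
            var results = JSON.parse(req.responseText).d.results;
            results.forEach(function (account) {
                console.log(account.Name, account.AccountNumber);
            });
        }
    };
    req.send();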

MSCRM Maintenance Jobs

CRM has a number of built-in maintenance-related jobs that perform things such as index re-calculations. By default these kick off once a day, roughly around the time of day that the CRM was installed – which is usually during business hours. An excellent tool for reviewing these schedules and changing them to a time of less interruption to the users is the CRM Job Editor – with editions for CRM 2011 / 2013 / 2015.

Infrastructure

Check the MSCRM Implementation Guide for appropriate hardware values such as RAM and CPU.

If you’ve got multiple CRM servers deployed behind a load balancer, are you able to sidestep the load balancer and browse the CRM server directly (from the end-user desktop environment), and does this make any difference to the performance? If it does, then check the NLB configuration for any issues.

What’s the network topology? When you’re browsing the CRM Web UI and getting your own feel of system performance, are you doing it only meters away from the data center, while the users complaining about performance are in some regional office connected over a wet piece of string? If the performance symptoms being complained about seem to be geo-specific, replicate testing from their end as much as possible (see the built-in diagnostic tools in the Web Interface section).

Have you got latency issues between end users and your CRM servers? CRM can be a bit chatty, and this can cause you pain over a connection with high latency.

Use of SQL Server Best Practices Analyzer

Of course Dynamics CRM performance is heavily dependent on the performance of the underlying SQL Server installation. So, have you run the SQL Server Best Practices Analyzer (2012 edition)?

– Memory and CPU – is SQL crying out for any more of either of these?

– Physical location of Data and Log files – are the data and log files on separate physical disks?

– Max Degree of Parallelism (MAXDOP) – it is recommended that this is set to 1. It affects the entire instance, and changes occur immediately. A word of caution: Tread carefully before making this change.

– Tempdb – it is generally recommended to have the same number of physical files for Tempdb data as the number of CPUs on the server. By default there will be one file.

– Growth of database files – check the auto-growth settings of the database files and pre-grow them to a larger size if your database is growing regularly. This can reduce the number of ‘disk grabs’ SQL makes as it expands the databases.

Use of SQL Statistics

SQL keeps some excellent statistics regarding costly queries, index usage and other tuning-related things. From a CRM perspective these can help reveal what your most costly queries are.

SQL Indexes

If you’re doing a lot of queries that involve WHERE clauses or ORDER BYs on custom attributes, chances are you can benefit from an index on those attributes – particularly if the number of records is large. Adding an index to a SQL table is the only supported change you can make in the SQL database. Things you need to consider: how unique is the data? How often are you reading from it vs. writing to it (inserts, updates)? The cost comes in terms of index recalculation as you make changes. These days this calculation happens ‘online’ and doesn’t block, but it still taxes CPU and memory, of course.

Run Management Studio Execution Plans for Costly Queries

Run queries similar to the ones that are performing slowly, directly in SQL Server Management Studio and make sure to include the execution plan. SQL will tell you the cost of the query components and reveal if an index would benefit that query.

For online deployments, if you put a support call in to Microsoft you can have them add the index for you.

Increase the performance of MSCRM forms

We often see an MSCRM form take a long time to load, which is really frustrating – everyone wants smooth processing in MSCRM, so we have to take care of the things that make it perform well. Slow processing may be due to the following reasons: a slow internet connection, too little RAM, and the JavaScript on the MSCRM form. Poorly written JavaScript code is one of the biggest reasons for slow form processing.

Below are some of the things you should remember when using JavaScript on an MSCRM form.

Do not include unnecessary JavaScript web resource libraries: The more scripts a page has to include on load, the more time it takes to load, so the more scripts you add to the form, the more time it takes to download them. Scripts get cached in the browser after the first load, but as we usually say, the first impression is the last impression, so try not to include unnecessary web resource libraries on the MSCRM form.

Try to avoid loading all scripts in the On Load event of the MSCRM form: If you have code that only supports On Change events for fields, or the On Save event, make sure to set the script library on the event handler for those events instead of the On Load event. This way, loading those libraries can be deferred, improving performance when the form loads – see the sketch below.
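As a hedged illustration, here is a minimal library that contains only field-level logic, so it can be registered against that field’s On Change event in the form editor rather than On Load (the field name new_discount is hypothetical):

    // Registered on the On Change event of the hypothetical new_discount
    // field – nothing here needs to run at form load.
    function discountOnChange() {
        var attr = Xrm.Page.getAttribute("new_discount");
        if (attr !== null && attr.getValue() > 50) {
            attr.setValue(50);            // cap the discount at 50
            attr.setSubmitMode("always"); // ensure the capped value is saved
        }
    }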

Use collapsed tabs to defer loading web resources: Here is one interesting thing – whenever web resources or IFRAMEs are included in sections inside a collapsed tab, they are not loaded while the tab is collapsed, so they add nothing to the initial page load. They are loaded only when the tab is expanded.
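If you later need to open such a tab from script – say, after a user action – the standard tab API triggers loading of its content at that point. A minimal sketch, assuming a hypothetical tab named tab_reports:

    // Expand a collapsed tab on demand; its web resources / IFRAMEs are
    // loaded at this point instead of during the initial form load.
    function showReportsTab() {
        var tab = Xrm.Page.ui.tabs.get("tab_reports"); // hypothetical tab name
        if (tab !== null) {
            tab.setVisible(true);
            tab.setDisplayState("expanded");
        }
    }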

Try to set default visibility options: Use the default functionality for hiding form elements; avoid using form scripts in the On Load event to hide them. Instead, set the default visibility options for form elements that might be hidden so they are not visible by default when the form loads. Then use scripts in the On Load event to show only those form elements you want to display, as sketched below.
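A hedged sketch of that On Load pattern, assuming hypothetical field names new_isvip and new_creditlimit, with the new_creditlimit control marked as not visible by default in the form editor:

    // On Load: show controls that are hidden by default only when needed.
    function onLoadShowHiddenFields() {
        var isVip = Xrm.Page.getAttribute("new_isvip");
        if (isVip !== null && isVip.getValue() === true) {
            Xrm.Page.getControl("new_creditlimit").setVisible(true);
        }
    }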

Creating a Custom Entity using the Import Data feature

Imagine a scenario where you need to create a custom entity and import 1000 records of data into it. For this we will begin with the creation of a custom entity named Stores. Following is the sequence that is usually followed:

1. Create a Custom Entity named Stores.

2. Create the fields that are required and are present in the Excel file.

3. Customize the form to place all the fields.

4. Publish Customizations

5. Import Data Wizard

– Map Record Types: Map the source Data file to the newly created custom entity named Stores.

– Map Fields.

6. Customize the default active view to display the required columns and Publish.

To summarize, the above six steps fall into two stages:

Stage 1: Configuration. This includes creating the custom entity and then each custom field, one by one.

Stage 2: Import Data. Once you are done with the configuration, run the import wizard and do the mapping of entity and fields.

Since we already have the data to be imported into the CRM, an alternative way to shorten the entire process is to create the new entity and fields during the Data Import stage itself.

Following is the sequence:

1. Import Data

a. On the Map Data window, select Create New. Type in the name of the custom entity once the new window pops up.

 

b. On the Map Fields window, select Create New Field

c. Select the field type

d. Review and Submit

That’s it! Now that the custom entity as well as the entire data set is inside the CRM, we have to perform a couple of final steps.

2. Define the area of display and customize the view as required.

3. The form customization process remains the same.

PROS:

The configuration time, and all the hassle associated with it, is saved, as all the configuration work can be done at the Data Import stage itself.

CONS:

1. Only one mandatory field can be created during the import stage: by default, only the primary field is set as mandatory. This can, however, be changed afterwards by editing the field and changing its requirement type.

2. For fields of type Option Set, the import considers only the option values present in the column; it will not allow adding any more option values to the option set during the import stage. Hence, make sure the source file has all the option values listed in that column. More option values can be added manually later by editing the field settings.

Fun with JavaScript

In this post we pick out some examples of working with JavaScript.

Part 1: Working with aliased values (getting attribute values from a link entity) in JavaScript.

Most of the content available on the web concentrates on how to retrieve an aliased value in C#. Below is an easy way to get the aliased value in JavaScript.

For example, the FetchXML is:

<fetch version="1.0" output-format="xml-platform" mapping="logical" distinct="false">
  <entity name="pcl_EntityA">
    <attribute name="pcl_fieldA" />
    <order attribute="pcl_fieldA" descending="true" />
    <link-entity name="pcl_EntityB" from="pcl_EntityBid" to="pcl_fieldBonEntityA" visible="false" link-type="outer" alias="AB">
      <attribute name="pcl_fieldB" />
    </link-entity>
  </entity>
</fetch>

Conventionally, to get the value of field A: collection[0].attributes.pcl_fieldA.value;

To get the value of field B from the link entity B: collection[0].attributes['AB.pcl_fieldB'].value;
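Putting it together – a hedged sketch assuming the collection above comes from a SOAP fetch helper such as XrmServiceToolkit’s Soap.Fetch (swap in whichever fetch wrapper your project uses):

    // Execute the FetchXML above and read both the entity's own attribute
    // and the aliased attribute from the linked entity. fetchXml is a
    // string holding the FetchXML shown above.
    var collection = XrmServiceToolkit.Soap.Fetch(fetchXml);
    if (collection.length > 0) {
        var fieldA = collection[0].attributes.pcl_fieldA.value;       // own attribute
        var fieldB = collection[0].attributes['AB.pcl_fieldB'].value; // aliased attribute
    }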

Part 2: Registering a JavaScript function on all CRM entities

Usually we register plugins on all CRM entities, but did you know we can also register a JavaScript function on all CRM entities?

For example: suppose there is a use case where a bulk update of records takes a considerable amount of time to complete. Ideally, one wouldn’t want the user to wait until it completes, but rather to be prompted once it’s done, in whichever entity they are currently working. For this, the following steps can be followed:

1. Add the Application Ribbon to the solution and open the solution in the Ribbon Workbench.

2. Select a button that is common to the required (or all) entities, for example the New button.

3. Add a new Custom Enable Rule, add your logic to it, and return true by default so the button stays enabled.

4. The enable rule fires on load of the form, so the function gets called each time a form loads – a sketch of such a rule follows.
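A hedged sketch of such an enable rule; isBulkUpdateComplete is a hypothetical helper that asynchronously checks your bulk job’s status:

    // Custom enable rule: always returns true so the button stays enabled,
    // but as a side effect starts polling for completion of the bulk update.
    function enableRuleCheckBulkUpdate() {
        if (!window._bulkCheckStarted) {
            window._bulkCheckStarted = true; // start polling only once per page
            setTimeout(function poll() {
                isBulkUpdateComplete(function (done) { // hypothetical async status check
                    if (done) {
                        alert("The bulk update has completed.");
                    } else {
                        setTimeout(poll, 30000); // try again in 30 seconds
                    }
                });
            }, 30000);
        }
        return true; // the button itself is always enabled
    }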

This post was written by Chetan Khandelwal.