
Archive for the ‘Performance’ Category

Windows 2008 R2 DFS doesn’t seem to use all the bandwidth available

April 14, 2011

If you have just created a DFS replication group between two machines, it should start replicating right away. However, you will notice that the replication is extremely slow and usually does not use much of the available bandwidth, especially in an intranet environment.

What happens is that in Windows Server 2008 R2, DFS Replication turns on remote differential compression (RDC) by default. RDC sends only the delta changes from the source server to the destination server instead of whole files.

This works great if you

  1. are paying for bandwidth
  2. have a low bandwidth pipe to do the transfer
  3. have large files
  4. have ample CPU, RAM and disk

 

But it's crap when

  1. there are lots of small files
  2. the link is a fat pipe (e.g. an intranet environment)
  3. you don't have much CPU, RAM or disk to spare, or those resources can be better utilized (computing delta changes takes up cycles and space)

 

So what can you do if you do not want to use RDC?

  1. Start up DFS management
  2. Select the replication that you want to modify
  3. Click on the connections tab
  4. Select the connections in question (multiple selection is possible)
  5. Right-click and select Properties
  6. Uncheck “Use remote differential compression (RDC)”
  7. Click OK

And the next thing you know, DFS will start gobbling up your bandwidth to send all the files over asap!
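
If you would rather script this than click through the GUI (for example, when there are many connections), the change can also be made through the DFSR WMI provider. The following is a minimal, untested C# sketch: it assumes the DfsrConnectionConfig class in the root\MicrosoftDFS namespace exposes the per-connection RDC setting as an RdcEnabled property (verify the class and property names with a WMI browser before relying on this), and it needs a reference to System.Management.

using System;
using System.Management;

class DisableRdc
{
    static void Main()
    {
        // Connect to the DFSR WMI provider on the local member server.
        ManagementScope scope = new ManagementScope(@"\\.\root\MicrosoftDFS");
        ObjectQuery query = new ObjectQuery("SELECT * FROM DfsrConnectionConfig");

        using (ManagementObjectSearcher searcher = new ManagementObjectSearcher(scope, query))
        {
            foreach (ManagementObject conn in searcher.Get())
            {
                // Turn off remote differential compression for this connection.
                conn["RdcEnabled"] = false;
                conn.Put();

                Console.WriteLine("RDC disabled on connection {0}", conn["ConnectionGuid"]);
            }
        }
    }
}

Narrow the WQL query (for example by partner name) if you only want to touch a specific connection rather than every connection on the member.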

Categories: Performance

Visual Studio 2010 Ultimate with MSDN now gives you unlimited virtual users!

March 25, 2011

Reproduced from http://blogs.msdn.com/b/vstsqualitytools/archive/2011/03/08/announcement-unlimited-load-testing-for-visual-studio-2010-ultimate-with-msdn-subscribers-now.aspx

 

Earlier today, S. Somasegar, Senior Vice President of Developer Division, announced Visual Studio 2010 Load Test Feature Pack, a new benefit for active and new Visual Studio 2010 Ultimate with MSDN subscribers to load test their applications with unlimited virtual users. Subscribers can immediately download the Visual Studio 2010 Load Test Feature Pack Deployment Guide and take advantage of the benefit today!

This is a huge deal! It democratizes access to load testing by giving development teams the tools to integrate performance testing into their application lifecycle early, without incurring incremental costs for the additional virtual users they require.

We have been on a journey to democratize Application Lifecycle Management (ALM).  We believe that performance testing should be a core functionality of a top tier development tool, not an add-on. This is why we’ve built these tools into the Visual Studio IDE. We want to enable developers to write more robust, secure code that can scale to the most demanding business requirements.

The benefits of Visual Studio 2010 Load Test Feature Pack include:

  • Improved Overall Software Quality through Early Lifecycle Performance Testing: Every day, businesses lose money, productivity and competitive advantage when application performance does not meet customer expectations. Visual Studio 2010 Ultimate lets you stress test your application early and throughout its development lifecycle with realistically modeled simulated load. By integrating performance validations early into your applications, you can ensure that your solution copes with real-world demands and behaves in a predictable manner, effectively increasing overall software quality.

 

  • Higher Productivity and Reduced TCO with the Ability to Scale without Incremental Costs: Visual Studio 2010 Load Test Feature Pack effectively removes the financial constraints that prevented customers from engaging in the level of performance testing that they know is required and that they wanted to do. Development teams no longer have to purchase Visual Studio Load Test Virtual User Pack 2010 (retails at $4,499 for a pack of 1000 virtual users).

 

  • First Class Product Support through Software Assurance (SA): Microsoft is committed to providing ongoing value to active SA customers. This Feature Pack is one of the five (5) Feature Packs which we have delivered to our valuable MSDN subscribers – you! Visual Studio 2010 Feature Packs enable you to extend Visual Studio 2010 with capabilities that enhance and complement the existing tools.

 

To read about the benefits and practice of early lifecycle performance testing, download an industry thought leadership whitepaper here.

Call to Action: Download the Visual Studio 2010 Load Test Feature Pack Deployment Guide today and use it to get started with stress and performance testing in Visual Studio 2010 Ultimate.


 

Categories: Performance

SQL Server having high RESOURCE_SEMAPHORE_QUERY_COMPILE waits

February 16, 2011

If your SQL Server is not showing particularly high CPU but queries are timing out, it may be worthwhile to look at the Resource Waits (Compilation) section in Activity Monitor, or to run the following SQL:

select * from sys.dm_os_wait_stats
where wait_type like 'resource%'
order by wait_type

If the wait time is very high (e.g. it is the first in the list of resource waits), you might want to look at recent queries and see what's being executed (Activity Monitor -> Processes, right-click, then Details).

Chances are you will see quite a fair number of queries which look more or less the same.

One easy way to relieve this issue is to run dbcc freeproccache, which should effectively drop your CPU back to normal values once the command completes. However, in the long run it's best to look at the queries causing this and see how to make them better.

The usual cause is a lot of SQL statements which look more or less the same being executed at the same time. Because the parameter values are hard-coded into the statement text, SQL Server has to create an execution plan for each and every query.

To solve this, you can either create plan guides, or else find ways to parameterize your queries so that they do not trigger so many compiles. A sketch of the parameterized approach follows below.
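
As an illustration of the parameterization option, here is a minimal C# sketch (the table, column and variable names are made up for the example): the commented-out form compiles a brand new ad-hoc plan for every distinct value, which is what piles up RESOURCE_SEMAPHORE_QUERY_COMPILE waits under load, while the parameterized form lets SQL Server compile one plan and reuse it.

using System.Data.SqlClient;

static void LoadOrders(string connectionString, int customerId)
{
    // Bad: "select * from dbo.Orders where CustomerId = " + customerId
    // bakes the value into the statement text, so every distinct value
    // gets its own compilation and cached plan.

    // Better: one parameterized statement, one reusable plan.
    using (SqlConnection conn = new SqlConnection(connectionString))
    using (SqlCommand cmd = new SqlCommand(
        "select * from dbo.Orders where CustomerId = @CustomerId", conn))
    {
        cmd.Parameters.AddWithValue("@CustomerId", customerId);
        conn.Open();
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                // process the row here
            }
        }
    }
}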

 

Categories: Performance, SQL Server

Microsoft Visual Studio Team Suite Load Test Part 2 – Optimizing your load tests

October 23, 2009

It's been quite a while since my first post on VSTS, and it seems Mercury LoadRunner has caught up in terms of functionality.

Anyway, here's my second post in this series, which tells you how to improve performance when doing a load test.

Things to note…

Since the load test agent is licensed per processor, it is generally advisable to get a processor with as many cores as possible and to pump up the RAM on the machine. However, as the agent is a 32-bit application, it will not be able to use more than 4 GB of RAM and does not work so well in an x64 environment. The recommended installation environment is still either Windows XP or Windows Server 2003.

The good news is that in VSTS 2010 the agent itself is x64, so all these issues should magically disappear 🙂

OK, back to the main topic: how to optimize your load tests.

  1. Stick to an x86 OS with a maximum of 4 GB of RAM for load test agents
  2. Turn on server GC

    To enable your application to use Server GC, you need to modify either the VSTestHost.exe.config or the QTAgent.exe.config. If you are not using a Controller and Agent setup, then you need to modify the VSTesthost.exe.config. If you are using a controller and agent, then modify the QTAgent.exe.config for each agent machine.

    Open the correct file. The locations are

    VSTestHost.exe.config – C:\Program Files\Microsoft Visual Studio 8\Common7\IDE

    QTAgent.exe.config – C:\Program Files\Microsoft Visual Studio 2005 Team Test Load Agent\LoadTest

    To enable gcServer you need to add the following line in the runtime section:

    <gcServer enabled="true" />

    It should look something like:

    <?xml version="1.0"?>
    <configuration>
      <runtime>
        <gcServer enabled="true" />
        <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
          <probing privatePath="PrivateAssemblies;PublicAssemblies"/>
        </assemblyBinding>
      </runtime>
    </configuration>

    Source: http://blogs.msdn.com/slumley/pages/improve-load-test-performance-on-multi-processor-machines.aspx

  3. Turn on agent connection pooling (Run Settings -> MySettingsName -> Properties -> WebTest Connection Pool)
    By default VSTS emulates clients opening and closing TCP connections to the server, but all this opening and closing of connections takes up resources. If your chief aim is to test the server, and especially if your server is sitting behind a load balancer with TCP offloading, turn this emulation off and let the agents pool connections instead
  4. Create a declarative WebTest program to easily change parameters inside your webtest
    http://blogs.msdn.com/densto/pages/declarativewebtest-declarativewebtestserializer.aspx
  5. Specify ResponseBodyCaptureLimit – by limiting how much of the response is captured for parsing, you can stay away from those dreaded agent out-of-memory errors. This matters especially when you are doing, e.g., a file download test (see the sketch after this list)
    http://msdn.microsoft.com/en-us/library/microsoft.visualstudio.testtools.webtesting.webtest.responsebodycapturelimit%28VS.80%29.aspx
  6. Monitor your servers, but not too frequently: you don't want to overload your servers just because you want to monitor their performance, right?
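
For point 5, the limit can be set from a coded web test. Below is a minimal sketch, assuming ResponseBodyCaptureLimit is settable on the WebTest instance as the MSDN link above suggests (the limit value and URL are made up for the example; I have not verified this against every agent version):

using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;

public class LargeDownloadWebTest : WebTest
{
    public LargeDownloadWebTest()
    {
        // Keep only the first 64 KB of each response body on the agent;
        // the rest of a large download is discarded instead of buffered.
        this.ResponseBodyCaptureLimit = 64 * 1024;
    }

    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        // Hypothetical large-file download request.
        yield return new WebTestRequest("http://myserver/files/big.iso");
    }
}
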
Categories: Performance, Visual Studio

Performance of Generics SortedDictionary and Dictionary

April 16, 2009

Vladimir Bodurov's blog post IDictionary options – performance test – SortedList vs. SortedDictionary vs. Dictionary vs. Hashtable seems to indicate that SortedDictionary is slower than Dictionary during inserts but faster during searching.

However, during the course of my development in ASP.NET 3.5, there have been several occasions which seemed to contradict this behaviour; perhaps it is because he is using WinForms and I'm using ASP.NET. So I took his code, modified it to run several rounds and inside IIS, and went from there (the source code can be found at the bottom of the page).

The test procedure is exactly the same as his, except that since there are only two collection types I ran two sets of 20: the first set doing SortedDictionary then Dictionary (test results 1 to 20) and the second set doing Dictionary then SortedDictionary (test results 21 to 40).

The averages of both sets seem to indicate that the execution order makes only a negligible difference.

The results are extremely interesting. Dictionary outperforms SortedDictionary on all counts! Memory usage, insertion time and search time are all much lower for Dictionary compared to SortedDictionary!

I am not really sure why this is so, but if I remember correctly Dictionary uses hashes, so my best guess is that looking up a hash is cheaper than the string comparisons a sorted structure has to perform on every insert and search.

[Chart: Time Taken for Search]

[Chart: Time Taken for Insert]

[Chart: Memory Usage for Insert]

The generated raw results from the web application

Test | Memory (bytes): SortedDictionary, Dictionary | Insert (ticks): SortedDictionary, Dictionary | Search (ticks): SortedDictionary, Dictionary
1 53919472 53376100 48921833 13100780 488 132
2 53919484 53376100 47371977 12744252 647 72
3 53919468 53376088 47386173 12994452 209 69
4 53919340 53376124 47428627 15464752 239 71
5 53919484 53376112 47508765 13040552 211 68
6 53919644 53376100 48597956 13617493 216 71
7 53919484 53376100 49073226 13166390 404 71
8 53919484 53376100 48807615 13104169 229 69
9 53919484 53376088 48633159 12904931 278 89
10 53919484 53376088 49147736 13043290 243 72
11 53919484 53376100 48755009 12901049 221 68
12 53919484 53376100 48720694 12851970 216 65
13 53919484 53376088 48968614 13938456 278 76
14 53919484 53376124 47349876 13072479 232 69
15 53917592 53376112 47436475 13217946 374 263
16 53919496 53376100 47271521 12936580 279 85
17 53919484 53376088 49116031 14775245 228 64
18 53919496 53376076 47675434 13543449 258 79
19 53919496 53376088 47454095 13330955 219 62
20 53919484 53376100 49053229 12932112 236 73
Average (Set 1) 53919390.6 53376098.8 48233902.25 13334065.1 285.25 84.4
21 53919496 53376052 35231348 9248780 245 108
22 53919544 53376064 35237264 8211268 183 312
23 53919544 53376052 35354053 8161178 170 82
24 53919516 53376040 35166182 8433640 248 117
25 53919984 53376000 35340838 8559027 161 92
26 53919532 53376064 35446193 8515937 190 87
27 53919532 53376040 35425444 8307412 301 89
28 53919484 53376124 35207419 8468252 173 88
29 53919532 53376064 35275126 8100160 326 193
30 53919532 53376064 35484239 8119661 188 86
31 53919532 53376064 35604412 8484580 180 82
32 53919532 53376052 35550023 8283421 247 108
33 53919532 53376076 35213423 8398157 187 85
34 53919484 53376136 35264521 8527381 183 89
35 53919484 53376136 35383377 8723896 179 84
36 53919484 53376100 35232178 8567521 162 83
37 53919544 53376064 35054454 8132677 174 80
38 53919532 53376052 35188672 8193882 184 86
39 53919496 53376136 35265145 8709565 179 103
40 53919532 53376064 35123234 8095966 199 87
Average (Set 2) 53919542.4 53376072.2 35302377.25 8412118.05 202.95 107.05
Average 53919466.5 53376085.5 41768139.75 10873091.58 244.1 95.725

Source Code (.aspx.cs)

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Diagnostics;
using System.Data;

public partial class test_cittka : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        DataTable tbl = new DataTable();
        tbl.Columns.Add("Memory_SortedDictionary");
        tbl.Columns.Add("Memory_Dictionary");
        tbl.Columns.Add("Insert_SortedDictionary");
        tbl.Columns.Add("Insert_Dictionary");
        tbl.Columns.Add("Search_SortedDictionary");
        tbl.Columns.Add("Search_Dictionary");

        for (int i = 0; i < 20; i++)
        {
            DataRow row = tbl.NewRow();
            sortedDictionary = new SortedDictionary<string, string>();
            dictionary = new Dictionary<string, string>();
            Insert_Test("SortedDictionary", sortedDictionary, row, 0, 2);
            Insert_Test("Dictionary", dictionary, row, 1, 3);
            Search_Test("SortedDictionary", sortedDictionary, row, 4);
            Search_Test("Dictionary", dictionary, row, 5);
            tbl.Rows.Add(row);
        }
        GridView gv = new GridView();
        gv.DataSource = tbl;
        gv.DataBind();
        Page.Form.Controls.Add(gv);
    }

    private readonly int searchIndex = 88888;
    private readonly int numberRepetition = 500000;
    private SortedDictionary<string, string> sortedDictionary = new SortedDictionary<string, string>();
    private Dictionary<string, string> dictionary = new Dictionary<string, string>();

    private void Insert_Test(string name, IDictionary<string, string> dict, DataRow row, int idxMemory, int idxInsert)
    {
        Response.Write(String.Format("<br>--------Insert {0}--------", name));
        string[] letters = { "A", "B", "C", "D", "E", "F", "G", "H", "I", "J" };
        long memoryStart = System.GC.GetTotalMemory(true);
        Stopwatch watch = new Stopwatch();
        watch.Start();
        Random rand = new Random();
        for (int i = 0; i < numberRepetition; i++)
        {
            string key = GetRandomLetter(letters, rand, i) + "_key" + i;
            string value = "value" + i;

            dict.Add(key, value);
        }
        long memoryEnd = System.GC.GetTotalMemory(true);
        watch.Stop();
        Response.Write(String.Format("<br>Memory Allocated by {0} is: {1} bytes",
            name, memoryEnd - memoryStart));
        PrintResults(watch);
        row[idxMemory] = (memoryEnd - memoryStart).ToString();
        row[idxInsert] = watch.ElapsedTicks.ToString();
    }

    private void Search_Test(string name, IDictionary<string, string> dict, DataRow row, int idx)
    {
        Stopwatch watch = new Stopwatch();
        Response.Write(String.Format("<br>--------Search {0}--------", name));
        watch.Start();
        Response.Write(String.Format("<br>Found:{0}", dict["A_key" + searchIndex]));
        watch.Stop();
        PrintResults(watch);
        row[idx] = watch.ElapsedTicks.ToString();
    }

    private void PrintResults(Stopwatch watch)
    {
        Response.Write(String.Format("<br>Elapsed: {0}", watch.Elapsed));
        Response.Write(String.Format("<br>In milliseconds: {0}", watch.ElapsedMilliseconds));
        Response.Write(String.Format("<br>In timer ticks: {0}", watch.ElapsedTicks));
    }

    private string GetRandomLetter(string[] letters, Random rand, int i)
    {
        if (i == searchIndex)
        {
            return "A";
        }
        return letters[rand.Next(0, 10)];
    }
}

Categories: ASP.NET, Performance

Performance Programming with ASP.NET C#

February 16, 2009

It is easy to develop with ASP.NET; you can have a web application with database operations up and running in less than half an hour. But this ease of development comes at the expense of several performance hits.

  1. Viewstate
    Web pages are stateless by design, which means that anything you have on the page is not retained when you submit the data. What ASP.NET tries to do is save the state of the page so that it is restored when the page submits its data.

    However this quickly becomes unmanageable when you have too many controls on the page, especially large server controls such as GridViews. As viewstate is stored on the page itself, a simple page with a few form elements may end up sending large amounts of data on postback, the majority of which is the viewstate itself.

    Other than increasing page size, there is also the performance cost incurred when serializing and deserializing the viewstate.

    Solution:

    • If a page is an output-only page and does not do any further processing, you can disable viewstate for the whole page with the <%@ Page EnableViewState="false" %> directive.
    • If a control on the page does not handle any events, is not a data-bound control, or its data is refreshed on every postback anyway, then you can disable viewstate for just that control.
    • If all else fails, you can look into storing viewstate in session state using SessionPageStatePersister (there is a drawback: the session may expire before the user is done with what he or she is doing on the page – think blogs, wikis etc). See the first sketch after this list.
  2. String concatenation & comparison
    Having experience with int i = 0; i = 1 + 2, you decide to go ahead and do the same for strings: string s = "aa" + "bb". This is bad practice, simply because strings work differently from numeric primitive data types. Strings are immutable, so this kind of concatenation allocates quite a fair bit of temporary memory to hold the intermediate strings and results.

    The same can be said for case-insensitive string comparison: usually you would do something like string1.ToLower() == string2.ToLower(). The same thing happens; temporary strings have to be created to hold the lowered values before the comparison is done.

    Solution:

    Use StringBuilder for string concatenation:
    System.Text.StringBuilder sb = new System.Text.StringBuilder();
    sb.Append("aa");
    sb.Append("bb");
    sb.ToString();

    Use the overloaded String.Compare method if you want to do case-insensitive comparisons:
    String.Compare(string strA, string strB, bool ignoreCase);

  3. For loops with easily checked conditions perform better than foreach loops. However, if the loop condition requires a property or method call (e.g. list.Count), that call is made on every iteration; in that case, read the count into a variable before the loop and check against that variable inside the loop (see the second sketch after this list).
  4. If you have multiple exit conditions, check them as early as possible and put the one most likely to happen first. This reduces the number of checks needed before exiting, which can give you a performance improvement.
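
For point 1, moving viewstate out of the page and into session state only needs a property override on the page (or on a common base page). A minimal sketch, assuming session state is enabled (the page class name here is made up):

using System.Web.UI;

public partial class ReportPage : Page
{
    // Persist viewstate in session state instead of in a hidden field on
    // the page, so only a small token travels with each postback.
    protected override PageStatePersister PageStatePersister
    {
        get { return new SessionPageStatePersister(this); }
    }
}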
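
And for point 3, hoisting the count out of the loop condition looks like this (a sketch with made-up names):

using System.Collections.Generic;

static decimal TotalValue(List<decimal> orderValues)
{
    decimal total = 0;
    int count = orderValues.Count;   // read the property once, outside the loop
    for (int i = 0; i < count; i++)
    {
        total += orderValues[i];
    }
    return total;
}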
Categories: ASP.NET, Performance

Performance Programming with ASP.NET Ajax

February 9, 2009

At first glance, Ajax seems to be a cinch with ASP.NET 3.5, what with all the UpdatePanels, Timers and such. However, behind this ease of development hides a danger which most developers may not notice.

Although UpdatePanels and Timers are easy to use, each update triggers a complete postback, and if your viewstate is stored on the page, that makes for an extremely large form post. On the server side, every event of a normal postback is fired and the whole control tree is built just to handle the Ajax form post. This is quite intensive both for the server and for the client.

This can be good or bad depending on how you look at it. It's good because it ensures that the postback is a valid one, and it gives you flexibility over which controls you want to change and what JavaScript you need to include, all done from the code-behind, without having to fiddle with the client-side JavaScript required to perform these actions.

On the flip side, there are also client-side APIs and ScriptManager service references which give you complete flexibility to go ahead and program using client-side JavaScript.

Nexus initially went with the first approach (i.e. a whole bunch of UpdatePanels), but the results were less than impressive. Moving to the second approach required a revamp of the whole source code, but it gave the user a much better experience along with substantially reducing server load.

To reduce the footprint on the server side, I utilized ScriptManager's ServiceReference parameters:

<asp:ScriptManager ID="ScriptManager" runat="server"
      onasyncpostbackerror="ScriptManager_AsyncPostBackError">
<Services>
    <asp:ServiceReference InlineScript="true" Path="JSON.asmx" />
    <asp:ServiceReference InlineScript="true" Path="ProxyAsync.asmx" />
</Services>
</asp:ScriptManager>

ServiceReference exposes web service methods as client-side JavaScript functions, so you can call these web services without having to write the plumbing JavaScript yourself. For example, I have a web service method called UserFirstLoad(), which is called every time a user logs into Nexus.

[WebMethod]
[ScriptMethod(UseHttpGet = false, XmlSerializeString = true)]
public JSON_Tabs UserFirstLoad()

Instead of my writing JavaScript to interface with this web service, ScriptManager automatically writes it for me and exposes it nicely as JSON.UserFirstLoad(), a function which I can easily call and which lets me do error handling if need be.
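
For completeness, the .asmx code-behind that allows this proxy to be generated looks roughly like the sketch below. The key ingredient is the [ScriptService] attribute, without which ScriptManager will not emit a client-side proxy; the class name JSON matches the Path="JSON.asmx" reference above, JSON_Tabs is my own return type from the snippet, and the method body here is just a placeholder rather than the actual Nexus code.

using System.Web.Services;
using System.Web.Script.Services;

[WebService(Namespace = "http://tempuri.org/")]
[ScriptService]   // required for ScriptManager to generate the JSON.* JavaScript proxy
public class JSON : WebService
{
    [WebMethod]
    [ScriptMethod(UseHttpGet = false, XmlSerializeString = true)]
    public JSON_Tabs UserFirstLoad()
    {
        // Build and return the tab layout for the current user
        // (JSON_Tabs is my own type; the real body is omitted here).
        return new JSON_Tabs();
    }
}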

As I'm not using UpdatePanel, GUI updates are done directly via the DOM. Fortunately the AJAX client script library provides some global shortcut functions which I can use to

  1. Locate the element I want to update ($get)
  2. Add events to the element ($addHandler)

It also helps that Visual Studio 2008 comes with JavaScript IntelliSense, which autocompletes your JavaScript as you type.

As JavaScript supports object-oriented programming, I decided to build my client scripts around objects and functions. This gives me the flexibility to override functions and keep nice clean code which I can then easily debug with VS2008.

Categories: AJAX, ASP.NET, Performance