Archive for May, 2011

Test Runs significantly add to TFS 2010 size

May 31, 2011

If you’re using the Test Run feature in Team Foundation Server 2010, you may notice that the size of your attachments table (tbl_Attachment) grows over time – there is no built-in facility for purging or archiving.  While all the diagnostic data captured can be really handy, you probably don’t need it after some time (just look at what you can collect during a test run).

Grant Holliday has a great post explaining how to use a command-line power tool known as the Test Attachment Cleaner for Visual Studio Ultimate 2010 & Test Professional 2010: TCMPT.exe.  He also provides some SQL queries you can run to check the size of your attachments.
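
As a rough illustration (not Grant’s actual queries), here’s a hedged C# sketch of the kind of size check involved.  Only tbl_Attachment comes from the post; the Content column and the collection database name are assumptions – adjust to your environment:

using System;
using System.Data.SqlClient;

class AttachmentSizeCheck
{
    static void Main()
    {
        // Counts rows and totals attachment bytes; column/database names are guesses.
        const string sql =
            "SELECT COUNT(*), SUM(CAST(DATALENGTH(Content) AS BIGINT)) / 1048576 " +
            "FROM tbl_Attachment";
        using (var conn = new SqlConnection(
            "Data Source=(local);Initial Catalog=Tfs_DefaultCollection;Integrated Security=SSPI"))
        using (var cmd = new SqlCommand(sql, conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                if (reader.Read())
                    Console.WriteLine("{0} attachments, ~{1} MB", reader[0], reader[1]);
            }
        }
    }
}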

-Krip

How to update model data in a receiving action

May 30, 2011

When using the standard out-of-the-box functionality of ASP.NET MVC, you may notice that you can’t simply update a model’s properties in an action method.  Oh, the code will execute just fine, but the data you see on the page will not reflect updates made in code.  Here’s an example of what I mean:

[HttpPost]
public ActionResult Index(myModel m)
{
   if (ModelState.IsValid)
   {
      // do some stuff
      m.Message = "data submitted";
   }
   return View(m);
}

The new value of m.Message will not be reflected on the page.  Instead you will see the previous value.  For the updates to be reflected, you must remove the old data from ModelState.  You can do this in one of two ways:

  • Either clear out the affected properties one by one, using ModelState.Remove(propertyName)
  • Or just clear out the entire model state with ModelState.Clear()

So the above code becomes:

[HttpPost]
public ActionResult Index(myModel m)
{
   if (ModelState.IsValid)
   {
      // do some stuff
      ModelState.Clear();  // remove stale values so the view renders the updated model
      m.Message = "data submitted";
   }
   return View(m);
}

-Krip

P.S. credit

Categories: ASP.NET, MVC

Performance of in-line SQL vs. stored procedures

May 27, 2011

I’ve just been doing some load tests with .NET code against SQL Server 2008 R2 Developer Edition via the Data Access Application Block in Enterprise Library.

I insert 5,000 records into a table with a primary key on an Identity column and a clustered index on the same.  I’m setting three nvarchar fields and one smallint field, then returning the Identity value of the newly added record.  The in-line SQL method calls SELECT SCOPE_IDENTITY() as the second statement in the batch and uses ExecuteScalar(), while the stored proc path merely returns the same value and is called via ExecuteNonQuery().  Using caching as detailed below, I achieve almost 1,000 records per second on my developer machine.

I’m afraid I can’t settle the stored procedure vs. in-line SQL debate, as sometimes the in-line SQL is quicker and other times the stored proc is!

However, to get the greatest throughput (with both in-line SQL and stored procs), be sure to cache (or reuse) the following – see the sketch after the list:

  • Database object (from DatabaseFactory.CreateDatabase) – although caching this didn’t make a huge difference
  • DbCommand (needed for next item!)
  • ALL DbParameters – merely overwrite the values on subsequent calls instead of calling Database.AddInParameter
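
Here’s a minimal sketch of that caching pattern against the Enterprise Library 5 DAAB.  The connection string name, table, and column are made up for illustration:

using System;
using System.Data;
using System.Data.Common;
using Microsoft.Practices.EnterpriseLibrary.Data;

public class RecordWriter
{
    // Cached once and reused for every insert.
    private readonly Database _db = DatabaseFactory.CreateDatabase("myDbConn");
    private readonly DbCommand _cmd;

    public RecordWriter()
    {
        // In-line SQL variant: the insert and SCOPE_IDENTITY() in one batch.
        _cmd = _db.GetSqlStringCommand(
            "INSERT INTO MyTable (Name) VALUES (@Name); SELECT SCOPE_IDENTITY();");
        // Parameters are added once here...
        _db.AddInParameter(_cmd, "@Name", DbType.String);
    }

    public int Insert(string name)
    {
        // ...and merely overwritten on subsequent calls.
        _db.SetParameterValue(_cmd, "@Name", name);
        return Convert.ToInt32(_db.ExecuteScalar(_cmd));
    }
}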

-Krip

Minimum configuration for Data Access Block v5

May 27, 2011

If you’re using the Data Access Block from Microsoft’s Enterprise Library (v5, e.g. 5.0.414.0), here is the minimum configuration you’ll need to be able to do data access:

<configuration>
  <configSections>
    <section name="dataConfiguration"
             type="Microsoft.Practices.EnterpriseLibrary.Data.Configuration.DatabaseSettings, Microsoft.Practices.EnterpriseLibrary.Data, Version=5.0.414.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
             requirePermission="true" />
  </configSections>
  <dataConfiguration defaultDatabase="myDbConn" />
  <connectionStrings>
    <add name="myDbConn"
         connectionString="data source=(local);Integrated Security=SSPI;Initial Catalog=myDatabase"
         providerName="System.Data.SqlClient" />
  </connectionStrings>
</configuration>

Without this, you’re liable to get the error: “The type Database cannot be constructed. You must configure the container to supply this value.”
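
For completeness, here’s a minimal sketch of code that relies on the configuration above (the table name is made up):

using System;
using System.Data;
using Microsoft.Practices.EnterpriseLibrary.Data;

class Program
{
    static void Main()
    {
        // Resolves the defaultDatabase ("myDbConn") declared in config.
        Database db = DatabaseFactory.CreateDatabase();
        object count = db.ExecuteScalar(CommandType.Text, "SELECT COUNT(*) FROM myTable");
        Console.WriteLine("Rows: {0}", count);
    }
}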

-Krip

How to use Forms Authentication with SSRS

May 26, 2011

Out of the box, SSRS supports Windows Authentication.  Microsoft, however, has baked in an extension model that allows you to plug in your own authentication scheme, so it’s entirely possible to change over to Forms Authentication.  This is particularly useful if you want to run reports from inside your own website that uses Forms Authentication.

When you convert SSRS over, all use of the system will then go through Forms Authentication.  This includes:

  • The web portal (Report Manager)
  • The power user reporting tool (Report Builder)
  • API access (Web Services use of Report Server)
  • SQL Server Management Studio (when connecting to SSRS)

I will say you end up with a robust solution, but getting there is not for the faint of heart!  There are a number of steps to take, including writing some custom code, adding some pages, and changing a bunch of settings in a number of config files.  I keep hoping Microsoft will one day make this a lever we just pull!  I’ve done these steps on more than one enterprise project in SQL Server 2005.  From what I can see, things haven’t changed much on this front in SQL Server 2008 R2, but feel free to drop me a line if you’ve got some notes there.

You can configure SSRS to use the very same aspnet database that your website uses, and that’s probably where the real value lies: you’re passing the same credentials from your web app through to SSRS.  If you do the extension right, you can even make use of ASP.NET Roles when granting access to reports and folders in SSRS.  That will reduce the administration burden, and IT will thank you.

So here are the steps in detail: Security Extension Sample for SQL Server 2005.  And if you’re using SQL Server 2008 R2, have a look at Implementing a Security Extension.
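
To give a flavour of the custom code involved, here’s a hedged sketch of the heart of such an extension – an IAuthenticationExtension that delegates to the same ASP.NET Membership provider your website uses.  It’s modeled loosely on Microsoft’s sample; the class name is mine and error handling is omitted:

using System;
using System.Security.Principal;
using System.Web;
using System.Web.Security;
using Microsoft.ReportingServices.Interfaces;

public class FormsAuthenticationExtension : IAuthenticationExtension
{
    public string LocalizedName { get { return null; } }

    public void SetConfiguration(string configuration) { }

    // Report Server calls this to validate credentials at logon.
    public bool LogonUser(string userName, string password, string authority)
    {
        return Membership.ValidateUser(userName, password);
    }

    // Used when granting access to reports and folders.
    public bool IsValidPrincipalName(string principalName)
    {
        return Membership.GetUser(principalName) != null;
    }

    // Called on each request once the user is authenticated.
    public void GetUserInfo(out IIdentity userIdentity, out IntPtr userId)
    {
        userIdentity = HttpContext.Current != null
            ? HttpContext.Current.User.Identity
            : null;
        userId = IntPtr.Zero;
    }
}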

If you’re going to have to do the changeover several times (and I’d be surprised if you didn’t), do yourself a favour and write a tool to do it.  We did that on our team and it was a lifesaver, particularly since more than one group was involved in installations, and we had dozens to do!

-Krip

How to save disk space on dev SQL Servers

May 23, 2011

One of the first things I do on any development installation of SQL Server is change the Recovery Model on the “model” system database from “Full” to “Simple”.  This ensures that newly created databases carry the Simple model, which means the transaction log will be truncated automatically.  I just checked a new SQL Server 2008 R2 Developer install and it was Full – I would have thought Microsoft would default to Simple on dev installs, but I guess not.
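
If you’d rather script the change than click through Management Studio, the T-SQL is a one-liner; here’s a hedged sketch that runs it from C# (the connection string is an assumption):

using System.Data.SqlClient;

class SetSimpleRecovery
{
    static void Main()
    {
        using (var conn = new SqlConnection("Data Source=(local);Integrated Security=SSPI"))
        using (var cmd = new SqlCommand("ALTER DATABASE model SET RECOVERY SIMPLE", conn))
        {
            conn.Open();
            cmd.ExecuteNonQuery();  // new databases now default to Simple recovery
        }
    }
}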

One other tip: if you restore a database – say you bring back a live or test copy onto your dev environment for analysis or debugging – it will keep the Recovery Model of the original, so be sure to change it to Simple if you don’t want a large transaction log building up during development!

-Krip

BizTalk Server Tuning Tips

May 22, 2011

Here are a number of things to pay attention to in order to get the most out of your BizTalk system (many of these are for SQL Server in general):

  • Place data and log files on separate disks
  • Create one tempdb data file per CPU (including cores); ensure each is exactly the same size; do not exceed 8 files (see Optimizing tempdb Performance for more details, and the sketch after this list)
  • Use dedicated SQL Servers
  • Be careful of constant database auto-growth – set to sensible values
  • Ensure all databases are getting backed up (which keeps transaction log size down) – use the generated SQL Agent job to back up the BizTalk databases – it’s the only supported means of backing them up
  • Ensure all BizTalk SQL Agent jobs are running successfully (one runs in a continuous loop and never terminates, so be aware of that; another does the backups, so if you’re doing your own, skip that one)
  • Microsoft does not want you changing any of their customized settings on the databases made during the BizTalk install – this includes any schema changes to the MsgBox database including indexes (See What you can and can’t do with the MsgBox database)
  • Put the MsgBox and Tracking database data and log files on separate disks (that’s 4 disks, so I hope you’re reading this before you’ve committed to your kit!)
  • Change the DTA Purge and Archive job to just purge if you don’t need archiving (see documentation below)
  • Place sending, receiving, processing (i.e. orchestrations), and tracking into separate BizTalk hosts (so a minimum of 4 hosts – and note again, a separate host for tracking)
  • If you’re on a 32-bit system and getting out-of-memory (host restarted) error messages, try the /3GB OS startup switch – this gives each host process 3GB of address space instead of 2GB (I’ve used it with great success)
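
As a sketch of the tempdb item above, here’s one hedged way to add the extra data files from C#.  The drive letter, sizes, and file names are assumptions, and the existing tempdev file should be resized to match:

using System;
using System.Data.SqlClient;

class AddTempdbFiles
{
    static void Main()
    {
        // One data file per core, capped at 8 (tempdev, file 1, already exists).
        int files = Math.Min(Environment.ProcessorCount, 8);
        using (var conn = new SqlConnection("Data Source=(local);Integrated Security=SSPI"))
        {
            conn.Open();
            for (int i = 2; i <= files; i++)
            {
                string sql = string.Format(
                    "ALTER DATABASE tempdb ADD FILE (NAME = tempdev{0}, " +
                    "FILENAME = 'T:\\tempdb{0}.ndf', SIZE = 1024MB, FILEGROWTH = 0)", i);
                using (var cmd = new SqlCommand(sql, conn))
                    cmd.ExecuteNonQuery();
            }
        }
    }
}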

Finally, a couple of great resources to check out.

First, I highly recommend the BizTalk Server Operations Guide (from which many of the above suggestions were derived).  It’s chock full of goodness.  There’s one for each version, but the 2010 edition may be sufficient.

Also, an absolutely fantastic tool that performs an automatic check of the health and configuration of your system and provides tuning tips is MsgBoxViewer.  I think it’s completely misnamed, as it does a lot more than just look at the MsgBox database.

-Krip