Saturday, October 02, 2010

TFS 2010 Basic is My Choice for Personal Use

Team Foundation Server (commonly abbreviated TFS) is a Microsoft product offering source control, data collection, reporting, and project tracking, intended for collaborative software development projects. It is a good choice for a big company like Chevron, but I wouldn't have picked TFS 2005 or TFS 2008 for personal use or for small companies. With the compact feature set of TFS 2010 Basic (source control, TFS Build, and work items) plus the power tools that make backup and restore easy, I now find TFS more attractive than tools such as Subversion or CVS as my personal source control system and application lifecycle management (ALM) system.

 

So I picked TFS 2010 Basic and installed it with SQL Server Express on my Windows 7 machine. The installation was pleasant and smooth; it works like a charm.

 

TFS 2010 Basic is my choice now for personal use. It can be yours too.

 

References:

·         Mahesh Mitkari's blog, My Coffee cup: Installing TFS 2010 on Windows 7

·         How to Backup / Restore TFS 2010

·         Download TFS Power Tools September 2010

 

 

Wednesday, June 16, 2010

SortedSet vs HashSet


HashSet<T> is very good at add and search operations. Search operations (Contains, Remove, and the like) are O(1) on average. That's great. On the minus side, however, HashSet<T> is not a sorted collection, so enumerating the elements in sorted order forces you to copy the items into a different collection (like a List<T>) and sort the resulting list. You could construct a LINQ query to order the elements, but internally that query will likely use some form of temporary storage to create the sorted sequence. Either way, every sort is an expensive operation: sorting is typically O(n log n). And because HashSet<T> has no Sort method, you also pay increased memory pressure and the time cost of copying the elements.
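A minimal sketch of the copy-and-sort workaround described above (the values here are arbitrary examples):

```csharp
using System;
using System.Collections.Generic;

class HashSetSortDemo
{
    static void Main()
    {
        var set = new HashSet<int> { 42, 7, 19, 3 };

        // Contains is O(1) on average.
        Console.WriteLine(set.Contains(19)); // True

        // HashSet<T> has no Sort method, so enumerating in order means
        // copying the items into a List<T> and sorting it: an O(n log n)
        // sort plus the memory cost of the copy.
        var sorted = new List<int>(set);
        sorted.Sort();
        Console.WriteLine(string.Join(", ", sorted)); // 3, 7, 19, 42
    }
}
```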

SortedSet<T> is new to the System.Collections.Generic namespace in .NET 4.0, and it has different characteristics. The sorted set ensures that its elements are always in sorted order. Every Add operation places the new element in the correct location in the set, which makes Add an O(log n) operation: SortedSet<T> must perform a binary search to find the correct location for the new element. The same search happens for the other search operations (Contains, Remove, etc.), so those are O(log n) as well. That makes it sound as though SortedSet<T> is always slower than HashSet<T>, and no one would use it if it were. But SortedSet<T> is much faster at iterating the set in sorted order: the elements are already in the correct order, so enumeration is an O(n) operation.
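The SortedSet<T> side of the comparison can be sketched like this (again with arbitrary example values):

```csharp
using System;
using System.Collections.Generic;

class SortedSetDemo
{
    static void Main()
    {
        var set = new SortedSet<int>();

        // Each Add does an O(log n) search for the insertion point,
        // so the set stays sorted at all times.
        set.Add(42);
        set.Add(7);
        set.Add(19);
        set.Add(3);

        // Enumeration is already in sorted order: O(n), no copy, no sort.
        Console.WriteLine(string.Join(", ", set)); // 3, 7, 19, 42

        // Min and Max fall out of the sorted structure for free.
        Console.WriteLine(set.Min); // 3
        Console.WriteLine(set.Max); // 42
    }
}
```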

Conclusion
SortedSet<T> will typically be faster than HashSet<T> when the majority of your operations require enumerating the set in one particular order. If, instead, most of the operations are searching, you'll find better performance using the HashSet<T>. The frequency of insert operations also has an effect on which collection would be better. The more frequently insert operations occur, the more likely HashSet<T> will be faster.

Thursday, February 25, 2010

System.OutOfMemoryException on WCF Web Service

I recently ran into a System.OutOfMemoryException from a .NET 3.0 WCF web service whenever the w3wp.exe process reached ~1.395 GB of memory. The WCF web service is hosted in IIS 6.0. After poking around, the problem was found...

IIS has limitations and warts when it comes to memory handling, and if your WCF service really must use more than 1.4 GB of memory on the server, then you need to host that WCF service yourself: in a console app, an NT service, a WinForms app, whichever way you choose to go.
Quick question, though: how is your server going to handle 10 simultaneous requests if handling each request uses up 1.4 GB of memory?

Keep in mind that you don't get access to all memory if you're running in ASP.NET; with a standard configuration it only allows you 2 GB. Maybe you should farm this out to a Windows service or a console app.


See here: "Fact: In a standard setup your worker process always have 2GB Virtual memory available (no matter if you have 1, 2 or 4GB physical memory in the machine)."

http://jesperen.wordpress.com/2007/05/23/understanding-aspnet-memory/

In that case, I am going to check out WCF streaming, which allows you to substantially reduce the size of the buffer memory needed on the server. Let me get back to this after I try WCF streaming out.
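For reference, switching a WCF endpoint to streaming is mostly a binding-configuration change. A minimal sketch, assuming a basicHttpBinding (the binding name "StreamedHttp" is hypothetical):

```xml
<system.serviceModel>
  <bindings>
    <basicHttpBinding>
      <!-- transferMode="Streamed" streams the message body instead of
           buffering the whole message in server memory; the message
           size limit can then be raised safely. -->
      <binding name="StreamedHttp"
               transferMode="Streamed"
               maxReceivedMessageSize="2147483647" />
    </basicHttpBinding>
  </bindings>
</system.serviceModel>
```

Note that streamed contracts typically need to expose a Stream (or a message with a single Stream body member) rather than a large byte array.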
