Wednesday, January 28, 2009

TFS Guide from Microsoft on CodePlex

I think this guide is helpful, as it provides very useful training/mentoring/resource information on Microsoft Team Foundation Server.

Part I, Fundamentals
* Chapter 1 - Introducing the Team Environment
* Chapter 2 - Team Foundation Server Architecture
Part II, Source Control
* Chapter 3 - Structuring Projects and Solutions in Source Control
* Chapter 4 - Structuring Projects and Solutions in Team Foundation Source Control
* Chapter 5 - Defining Your Branching and Merging Strategy
* Chapter 6 - Managing Source Control Dependencies in Visual Studio Team System
Part III, Builds
* Chapter 7 - Team Build Explained
* Chapter 8 - Setting Up Continuous Integration with Team Build
* Chapter 9 - Setting Up Scheduled Builds with Team Build
Part IV, Large Project Considerations
* Chapter 10 - Large Project Considerations
Part V, Project Management
* Chapter 11 - Project Management Explained
* Chapter 12 - Work Items Explained
Part VI, Process Templates
* Chapter 13 - Process Templates Explained
* Chapter 14 - MSF for Agile Software Development Projects
Part VII, Reporting
* Chapter 15 - Reporting Explained
Part VIII, Setting Up and Maintaining the Team Environment
* Chapter 16 - Team Foundation Server Deployment
* Chapter 17 - Providing Internet Access to Team Foundation Server
Part IX, Visual Studio Team System 2008 Team Foundation Server
* Chapter 18 - What's New in Visual Studio Team System 2008 Team Foundation Server

Monday, January 26, 2009

Setup TFS Workspaces for Developers

Read the linked posting. It's a pain in the neck to teach everyone how to set up workspaces, show them how to get latest, and hope they don't accidentally (or purposely) mess it up. I recommend reading the posting and downloading the script into the custom TFS tools solution that you maintain in the version control tree for your organization's architecture/processes TFS team project.
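For reference, the same workspace setup that such a script automates can be sketched with tf.exe (TFS 2008). The server URL, workspace name and paths below are placeholders for your environment:

```bat
rem Create a workspace, map a team project folder into it, and pull down sources.
tf workspace /new DevWorkspace /server:http://tfsserver:8080 /noprompt
tf workfold /map "$/MyTeamProject" "C:\src\MyTeamProject" /workspace:DevWorkspace /server:http://tfsserver:8080
tf get "C:\src\MyTeamProject" /recursive
```

Putting these three commands in a batch file is the low-tech version of the script: new developers run it once and get a consistent mapping.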

Not yet ready for 64-bit

Read the linked article. It's a reminder that TFS is not really ready for 64-bit yet. But stay tuned: it will be, and that's when it'll roll out to tens of thousands of users on a single server farm and work tremendous magic in the very, very large enterprises that adopt it well. The article also reminds me why I was wise to shift gears recently toward supporting virtualization as the way to go for all future TFS implementations.

TFS Branching Guide

Go read the guidance. Nobody should use branching on a VSTS/TFS project without reading up on the pain and lessons that others have gone through. There are 5 good PDF documents here that all by themselves would make an excellent brownbag:

- Main 2.0.pdf
- Scenarios 2.0.pdf
- Q&A 2.0.pdf
- Labs
- Drawings

Wednesday, January 21, 2009

TFS Install Bug with SQL Server Files

Today I set up a server for TFS 2008. I first installed SQL Server, which put the Reporting Services data/log files in the C:\Program Files\Microsoft SQL Server\MSSQL.2\MSSQL\Data directory. I then moved these files to the D:\TFSDATA\SQLDATA and D:\TFSDATA\SQLLOG directories on the D: drive and configured SQL Server to use those same directories as the defaults for data and logs. Then I installed TFS 2008 (including WSS). The installer put all the new databases' data/log files back in the C:\Program Files\Microsoft SQL Server\MSSQL.2\MSSQL\Data directory. Note to Microsoft: that's what I call a bug. The workaround is simple: I just have to move those files from the folders on C: over to the default folders on D:.
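The move itself can be scripted. A minimal T-SQL sketch for one of the databases, assuming the logical file names match the defaults (check yours in sys.master_files first; the database and file names here are illustrative):

```sql
-- Take the database offline, repoint its files, move them on disk, bring it back.
ALTER DATABASE TfsWorkItemTracking SET OFFLINE;

ALTER DATABASE TfsWorkItemTracking
    MODIFY FILE (NAME = TfsWorkItemTracking,
                 FILENAME = 'D:\TFSDATA\SQLDATA\TfsWorkItemTracking.mdf');
ALTER DATABASE TfsWorkItemTracking
    MODIFY FILE (NAME = TfsWorkItemTracking_log,
                 FILENAME = 'D:\TFSDATA\SQLLOG\TfsWorkItemTracking_log.ldf');

-- Now move the .mdf/.ldf files in the file system, then:
ALTER DATABASE TfsWorkItemTracking SET ONLINE;
```

Repeat per database that landed on C:.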

Saturday, January 17, 2009

Migrate VSS to TFS

Here are some steps to follow to learn about VSS to TFS migrations.

1. I recommend watching the following Microsoft video:

<br/><a href="" target="_new" title="How To - Migrate from VSS to Team Foundation Source Control">Video: How To - Migrate from VSS to Team Foundation Source Control</a>

2. Make sure you have installed TFS 2008 Service Pack 1 on your TFS server. See for information on VSSConverter improvements.

3. Read and to learn about the Analyze and Migrate commands for VSSConverter.

4. Read and do the five step process to prepare for migrations: (a) Back up your Visual SourceSafe database. (b) Identify and resolve data integrity issues in your existing database using the Visual SourceSafe Analyze tool. (c) Run the converter tool to identify potential sources of information loss. (d) Specify which Visual SourceSafe folders to migrate. (e) Create a user mapping file to map Visual SourceSafe users to Team Foundation users.
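For step (e), the VSSConverter user mapping file is a small XML document. Its general shape is something like the following (the VSS and Windows user names here are invented):

```xml
<?xml version="1.0" encoding="utf-8"?>
<UserMappings>
  <UserMap From="vss_jdoe" To="CONTOSO\jdoe" />
  <UserMap From="vss_asmith" To="CONTOSO\asmith" />
</UserMappings>
```

Any VSS user not listed will need to be resolved before history can be attributed correctly on the TFS side.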

5. Read and do the two part process for migrations: (a) 7 steps to modify the settings file to create a migration file, (b) 5 steps for running the converter. There's actually a 6th step that's essential for developers using VSTS. See for instructions on migrating source control bindings.

6. Be sure to search the MSDN forums (see or for the latest discussion threads on VSS->TFS migrations. Also see to check out the latest gossip about doing VSS->TFS migrations.

7. Be sure to Google VSSConverter for further information on the migration tool. I did a search today that turned up a few interesting links, including

8. If you really want total flexibility then learn about the Microsoft.TeamFoundation.VersionControl.Client namespace that opens up possibilities such as,, and

We Share Your Pain (WSYP)

Finally Microsoft will build pain-free software ;) This is funny.

<br/><a href="" target="_new" title="Sharing software Customer pain">Video: Sharing software Customer pain</a>

Thursday, January 8, 2009

Deleting Work Items in TFS

Many new TFS users are surprised to learn that deleting work items is not a piece of cake. Technically it's possible to do, but it's wisely kept from being straightforward. "Deleting" a work item means destroying history, and that's not good for honesty/transparency. What's recommended instead is to modify the work item type (WIT) template workflow to create a state called "Deleted". But if you must delete a work item, there are a couple of ways:

(1) Delete the work item in the database.

delete from dbo.WorkItemsLatest where ID = ?

delete from dbo.WorkItemsAre where ID = ?

delete from dbo.WorkItemsWere where ID = ?

delete from dbo.WorkItemLongTexts where ID = ?

delete from dbo.WorkItemFiles where ID = ?

Warning: You may also need to delete rows in the Attachments database, and the work item will still be in the data warehouse.

(2) Get the latest TFS Power Tools (see the link) and use the "tfpt destroywitd" command.

Warning: The work item will still be in the data warehouse, as this only deletes it from the operational store.

Wednesday, January 7, 2009

Setup of TFS for Virtualization

I was reminded today that the first step in working with TFS is to get it set up. If I were starting from scratch I'd do so with "virtualization" and a dedicated single-server solution. Whether to go with Hyper-V, VMware, Virtual Server, etc. all depends on your IT shop. I feel this way primarily because of the importance of backup/recovery procedures being reliable and straightforward, and the leading virtualization platforms have solid backup/recovery procedures in place that can be used effectively for TFS. I also recommend a one-server solution that's used for just TFS and its components and nothing else. Whoever is managing the backups/recoveries for your web servers, email servers, data servers, etc. ought to be the same person(s) managing these same processes for your TFS server(s).

See for the best timely article on TFS 2008 hardware recommendations. For most organizations, TFS can run on a server instance with 50 GB of hard disk space using the 32-bit editions of TFS 2008, SQL Server 2005, and Windows Server 2003 or Windows Server 2008. For virtualization, plan on 2-3 GB of available RAM at runtime. I'd also plan for about 1 GB per month of hard disk growth; if the IT teams attach a lot of multimedia content, your space requirements will grow proportionally.

Some other links to read include:

I'd also like to suggest bookmarking this site at and

Tuesday, January 6, 2009

Building a SharePoint Custom List

Microsoft is quickly moving the world towards SharePoint for everything application-related including TFS. Don't be surprised to see me write about SharePoint-related matters here as they greatly impact the possibilities with TFS.

I'd like to provide an article that covers how to create a basic custom site definition, how to create a basic custom list definition, and how to display that custom list on the default page at site creation. This knowledge is useful in TFS because SharePoint lists are great for working collaboratively with constituencies on setting priorities, gathering requirements, reporting bugs, reporting bad implementations and coming up with meaningful to-do lists. Build your custom list templates right and you'll extend TFS functionality properly.

See for the article. And I'd highly recommend reading to get some understanding of CAML as it relates to custom views.

Friday, January 2, 2009

Future Improvements on C# Code Commenting

Read to see a tutorial I wrote on what I consider to be the current "best practice" for C# Code Commenting. I have a few suggestions on future improvements to this "Best Practice":

(1) There should be a way to automate adding a post-build event to source code projects so that the Sandcastle CHM is generated as part of the build.
(2) Certainly other project documentation (such as project charters, requirements documents, design/architecture documents, TFS reports (i.e. builds, work items, changesets, etc.) and other artifacts can be integrated into the post-build event.
(3) Certainly the whole post-build process done locally on auto-building the CHM file and including other artifacts can be included as part of the TFS Build process.
(4) Certainly templates with the HP logo, SLM-labels, and other process needs can be integrated into the help-files generation process.
(5) Considering that code comments are compiled into an XML file using a defined schema, we can always customize the documentation however we want with XSLT, third-party tools, etc.
(6) As Sandcastle is "open source" and Visual Studio is quite extensible it should be possible to specify custom XML tags that can be processed to generate whatever output or perform whatever tasks we want done at build-time.
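To illustrate point (5): the /doc XML the C# compiler emits has a stable doc/members/member shape, so it's easy to post-process. Here's a small sketch using LINQ to XML instead of XSLT; the assembly and member names in the sample are made up:

```csharp
using System;
using System.Collections.Generic;
using System.Xml.Linq;

public static class DocXmlDemo
{
    // A minimal sample of the XML file the compiler emits for /// comments.
    public const string SampleXml = @"<doc>
  <assembly><name>MyLib</name></assembly>
  <members>
    <member name='T:MyLib.Widget'><summary>A sample widget.</summary></member>
    <member name='M:MyLib.Widget.Run'><summary>Runs the widget.</summary></member>
  </members>
</doc>";

    // Map each documented member id to its <summary> text.
    public static Dictionary<string, string> ExtractSummaries(string docXml)
    {
        var result = new Dictionary<string, string>();
        foreach (var m in XDocument.Parse(docXml).Descendants("member"))
        {
            var summary = m.Element("summary");
            if (summary != null)
                result[(string)m.Attribute("name")] = summary.Value.Trim();
        }
        return result;
    }
}
```

From a dictionary like that, generating custom HTML, reports or build-time checks is straightforward.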

Do you have any suggestions to add?

Best Practice on C# Code Commenting

For the next 30 minutes I recommend you do the following 7 steps. They'll teach you how to become proficient in a best practice for self-documenting your C# code in projects done using Microsoft Visual Studio. The same lessons apply to other .NET languages, but I mention C# because it's the most popular one. Once you've learned this process I'm confident you will use it from now on, because it automates the documentation process for all development work and allows you to build HTML/help files on the fly. Make sure you have Visual Studio 2005 or later installed on your workstation and that you've been able to successfully create and build a solution/project before proceeding with these steps.

(1) Read which is an excellent intro to C# Code Commenting.

(2) Read for a nice explanation on code commenting.

(3) Run Visual Studio, open a solution/project and follow these instructions. (a) Open the property page for the project, usually by right-clicking the project in the Solution Explorer and clicking Properties. (b) After the dialog has opened, click the Configuration Properties folder. (c) Click the Build option. (d) In the right pane there will be a property field called XML Documentation File. Set this to the path and file name of the desired file; the path entered is relative to the project directory, not absolute. (e) Put some "Hello World" text in a summary tag on at least one class/member within the project. (f) Build the project and view the XML Documentation File to see your "Hello World" text. (source:
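Step (e) looks something like this; the namespace and class names are illustrative:

```csharp
namespace MyProject
{
    /// <summary>
    /// Hello World! This text is extracted into the XML documentation file
    /// when the project builds.
    /// </summary>
    public class Greeter
    {
        /// <summary>Returns the greeting used to verify the doc build.</summary>
        public static string Hello()
        {
            return "Hello World";
        }
    }
}
```

After step (f), both summary texts should appear in the generated XML file.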

(4) Download and install the product. If you skip this step then you will be very sorry.

(5) Go to, download the latest "Sandcastle Help File Builder Installer" MSI file and install it. Then run it from the Start menu: All Programs -> Sandcastle Help File Builder. In the GUI, click the Add button to include the DLL that corresponds to the assembly you generated in Step (3)(f) of this tutorial. Then select Documentation -> Build Project from the menu to build a help file. Finally, select Documentation -> View Help File from the menu and search for your "Hello World" text.

(6) Read on recommended documentation tags and try them out in the project you used in Step 3. Build the project in Visual Studio. Then build the help file in Sandcastle to see the resulting help file.

(7) Read to see some other code commenting tools for .NET. However I recommend using Sandcastle because it has plenty of functionality, its source code is freely available at and a google search such as shows that there is sufficient documentation, assistance and community support for Sandcastle's future.

Did you do all 7 steps and bookmark this link for future reference? If so, you should now be proficient in the current best practice for self-documenting your C# or VB.NET code in projects done using Microsoft Visual Studio. Congratulations Guru!! See for ideas on enhancing this best practice.

Thursday, January 1, 2009

TFS Build and WSPBuilder

This may be the "holy grail" for SharePoint development, as Brian Farnhill describes how to make TFS Build and WSPBuilder work well together. This makes full-service, collaborative, large-team SharePoint development possible. Read the following link and keep in mind my summary notes on the 7 steps.!AEC42F315B4528B0!3290.entry

1. Install WSPBuilder on the TFS build server.

2. Add WSPBuilder to the paths environment variable. Add the WSPBuilder path (C:\Program Files\WSPTools\WSPBuilderExtensions) to the PATH system variable on both the development box and the TFS build box.

3. Set files included in the WSP file to copy to the output directory. In the Visual Studio Solution Explorer, select each 12-hive content file and, in the Properties window, (a) set the item to be Content, and (b) set it to copy if newer. That way, when TFS does a build, the directory where it puts the DLLs will also contain the other 12-hive files on the TFS server, ready for the WSPBuilder call to run.

4. Add the post-build activity to the appropriate projects. For each project in your solution that produces a WSP, follow Brian's instructions for the post-build action code. So if your solution builds 9 WSPs, you need to do this 9 times; usually, though, there will be just one WSP per solution and just one post-build snippet to write. I advise setting the $(OutDir) variable to always be the complete path to where the build outputs on the TFS build server.
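I won't reproduce Brian's exact snippet here, but the general shape of such a post-build event is a call into WSPBuilder from the project's output folder. The install path and switch below are assumptions; check them against your own WSPBuilder installation and Brian's post:

```bat
rem Hypothetical post-build event for a WSP-producing project.
cd "$(OutDir)"
"C:\Program Files\WSPTools\WSPBuilderExtensions\WSPBuilder.exe" -BuildWSP true
```

Because the path is taken from $(OutDir), the same event works on developer boxes and on the TFS build server.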

5. Create a new build in TFS. This is a simple build creation, but it's VITAL to choose the Debug configuration because WSPBuilder expects everything to be in the debug folder.

6. Add additional reference paths. Open the TFSBuild.proj file created for your TFS build and add an additional reference path as Brian instructs in the appropriate section (the very last one by default; read the comments to be sure). You can then set this folder up as a network share and drop referenced assemblies in there as required.

7. Add a pre-build action (if there is more than one WSP file in the solution). With multiple WSPs you will find that, if you don't follow Brian's instructions, the DLLs from the first projects built will make their way into the second and subsequent WSP files. With just one WSP you can skip this step.

Now you should find that your drop location contains all the DLLs, the PDB and config files, and your WSP files. Mission accomplished.

More links for my personal research

The following is here for knowledge retention purposes only so be forewarned that it might bore you and you can stop reading now.

Today I've been doing plenty of blog-reading. There are many more things I'd like to read further but I'm out of time. Here are 5 links without notes and 2 links with notes.

The Greatest Invention in Computer Science is "The Routine"

Aside from the invention of the computer, the routine is arguably the single greatest invention in computer science. It makes programs easier to read and understand. It makes them smaller (imagine how much larger your code would be if you had to repeat the code for every call to a routine instead of invoking the routine). And it makes them faster (imagine how hard it would be to make performance improvements in similar code used in a dozen places rather than making all the performance improvements in one routine). In large part, routines are what make modern programming possible.

the problem with routines: they only take a minute to learn, but a lifetime to master ....

* How long should this routine be? How long is too long? How short is too short? When is code "too simple" to be in a routine?
* What parameters should be passed to this routine? What data structures or data types? In what order? How will they be used? Which will be modified as a result of the routine?
* What's a good name for this routine? Naming is hard. Really hard.
* How is this routine related to other nearby routines? Do they happen at the same time, or in the same order? Do they share common data? Do they really belong together? What order should they be in?
* How will I know if the code in this routine succeeded? Should it return a success or error code? How will exceptions, problems, and error conditions be handled?
* Should this routine even exist at all?

One thing I'll add is this: I LOVE STUBS. That's where I decide, for a subroutine, on a test value that will get sent back to the calling method until I have time to properly design/write the subroutine. My favorite phrase is "Hello World from " followed by the routine name. For example, a subroutine called MyNamespace.MyClass.MySubroutine will send back "Hello World from MyNamespace.MyClass.MySubroutine 001" the first time. As new test versions are built and the code gets improved I might increase the number so it says "... 002", then "... 003", etc.
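A stub in that style might look like the following; the names come from the example above, and the trailing counter is just a hand-edited literal:

```csharp
namespace MyNamespace
{
    public class MyClass
    {
        // Stub: return a recognizable test value until the real logic is written.
        // Bump the trailing number by hand as new test builds go out.
        public static string MySubroutine()
        {
            return "Hello World from MyNamespace.MyClass.MySubroutine 001";
        }
    }
}
```

Anyone testing the build can see at a glance which routines are still stubs and which revision of the stub they're looking at.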

It is the job of a good software project manager to recognize the tell-tale symptoms of this classic mistake and address them head on before they derail the project. How? By forcing (okay, encouraging) developers to create a detailed list of everything they need to do. And then breaking that list down into subitems. And then adding all the subitems they inevitably forgot because they didn't think that far ahead. Once you have all those items on a list, then -- and only then -- can you begin to estimate how long the work will take.

Until you've got at least the beginnings of a task list, any concept of scheduling is utter fantasy. A very pleasant fantasy, to be sure, but the real world can be extremely unforgiving to such dreams.

Johanna Rothman makes the same point in a recent email newsletter, and offers specific actions you can take to avoid being stuck 90% done:

1. List everything you need to do to finish the big chunk of work. I include any infrastructure work such as setting up branches in the source control system.

2. Estimate each item on that list. This initial estimate will help you see how long it might take to complete the entire task.

3. Now, look to see how long each item on that list will take to finish. If you have a task longer than one day, break that task into smaller pieces. Breaking larger tasks into these inch-pebbles is critical for escaping the 90% Done syndrome.

4. Determine a way to show visible status to anyone who's interested. If you're the person doing the work, what would you have to do to show your status to your manager? If you're the manager, what do you need to see? You might need to see lists of test cases or a demo or something else that shows you visible progress.

5. Since you've got one-day or smaller tasks, you can track your progress daily. I like to keep a chart or list of the tasks, my initial estimated end time and the actual end time for each task. This is especially important for you managers, so you can see if the person is being interrupted and therefore is multitasking.

Reading and Blogging in 2009

IMHO the year 2009 will be the year when the best-organized IT managers/professionals get themselves in the habit of managing their blog-reading and knowledge retention well. The IT professionals who lead the efforts to apply technology to real-world business needs in 2009 are going to be doing plenty of reading in order to stay current, solve problems and come up with the best solutions for their organizations. Typically this will be done through an "in-the-cloud" RSS reader that's also installed on their mobile phone and lets them analyze statistics and prioritize their RSS feeds to be more productive.

They'll spend at least 10 hours/week reading blogs. It'll happen on the train, in restaurants or anywhere they're "waiting", on their mobile phones. It'll happen while they are using their computers at work or home. They will typically follow hundreds of blogs and add at least a dozen more to their subscriptions every month. They'll track the statistics on their subscriptions, organize them according to which ones are most useful in their work, and focus on getting updates from the highest-priority blogs.

In a typical week they'll glance at thousands of titles, pick a few hundred to click on to see the opening sentence of the posting, actually read a couple dozen of those, and just skim most of the rest. With all the constant change in technology, there is no more effective way to stay current in 2009 than fast-speed blog-reading.

If you want to be heard then you need to earn trust and be rated high on their watch list. You can accomplish this by doing the following:

(1) Regularly update your blog with current useful content.
(2) Generally make each entry short and easy to read.
(3) Provide all possible links and references to avoid plagiarism.
(4) It's imperative that you sincerely try to be completely unbiased and objective.
(5) Warn your readers if you are going to write lengthy details for your own knowledge retention purposes.

Communication and Blogs

One thing I really like about TFS and Sharepoint is how they really work to improve collaboration, communication and using blogs out-of-the-box to disseminate information appropriately. I found a blogger at who provides some good tips that should help every IT professional.

The single most important thing you must do to improve your programming career is improve your ability to communicate.

To program, you must elicit ideas from other people and share your ideas with them. Sharing ideas comes in many forms: explaining how you did something. Suggesting a new practice for the team. Demonstrating how something works. Convincing everyone to switch programming languages. Persuading a brilliant engineer to join your team. Persuading your manager to get out of the way and let you do your thing.

Advancing your career is entirely about communicating. Getting a job. Turning down a job. Asking for a promotion. Turning down a promotion. Getting onto a good team. Politely extricating yourself from a good team. Persuading a brilliant engineer to co-found a company. Helping a brilliant engineer understand why co-founding a company isn’t the right thing to do. Asking for funding. Turning down funding. Getting clients. Turning down clients.

If you take just one thing from this post, let it be this: To improve your programming career, the single most important thing you must do is improve your ability to communicate your ideas face to face.

I learn:

Three types of great BLOG posts:
What I learned from Language X that makes me a better programmer when I use Language Y
Something surprising that you probably wouldn’t guess about Language X from reading blog posts
My personal transformation about Idea X

A type of timewasting BLOG post:
Here’s why such-and-such

Using Routines in Programming

The longer I code, the more I've learned that the key to good programming is how you approach writing routines. Steve McConnell at lists the following reasons, and I provide commentary.

  • Reducing complexity - I like to keep my routines short enough that I can read from the routine declaration line to the closing "}" within one Visual Studio window. That's usually about 40 lines.

  • Avoiding duplicate code - I agree it's the most popular reason for routines.

  • Limiting effects of changes - Doing the repeatable job in one place is important and that's why good requirements analysis and good software design/architecture up-front is essential in creating good coding practices.

  • Hiding sequences - This is why good routine-writing is essential to good object-oriented programming: it's through hiding sequences in routines that we accomplish good encapsulation.

  • Improving performance - Without good, organized routines you will never figure out where the badly performing code is located.

  • Hiding data structures and global data - Good for encapsulation; sometimes bad for documenting the data structures' usage so that future DBAs and architects can read the code and make changes or integrate systems.

  • Promoting code reuse and planning for a family of programs - This is where object-oriented programming provides real value with routines. And with the [Obsolete] attribute in C# and the other code-management attributes and comment sections for .NET routines, it's much easier to plan, document and validate code reuse (or lack of reuse).

  • Improving readability - Good naming and keeping routines down to 20-40 lines are vital. Making routines too short, or not designing the high-level algorithm well, causes what I call "sub of sub of sub of sub of sub ... I'm lost and can't find the code" syndrome. Basically, it's all too easy to end up with too many nested levels of subroutines to dig through in order to find the code you need to work with, and this is a common problem I run across when subroutines are too short and the algorithm either wasn't well thought out or has undergone considerable change.

  • Improving portability and isolating use of nonstandard language functions - interfaces and gateways from C# into COM libraries, external web services, external "C" code, etc. are well-managed through good routine writing.

  • Isolating complex operations - This is one of the greatest uses of routines: any complex task should be done through ONE routine, with calls to multiple subroutines in cases where a lot of code is needed.

VS2010 Planned Features and "Later" Features

Follow the link for a list of planned features for Microsoft Visual Studio 2010, plus some features planned for AFTER Visual Studio 2010 ships.

For VS10:

* A new Windows Presentation Foundation-based (WPF) text editor
* More “modern,” with more of a WPF look and feel throughout the suite
* Smaller in size (in code and data) than Visual Studio 2008
* More reliable and modular

For some time “later”:

* Visual Studio Tools for Applications (VSTA) used for macros, plus other “end-user extensibility” improvements
* The ability to create more add-ins in managed code
* Full WPF shell
* Extensive support for the parallel framework for multicore hardware

Like just about every Microsoft product these days, VS 10 is going to get the Software+Services treatment ....

New Microsoft Programming Language Code-Named "D"

According to

A handful of Microsoft’s top developers are working to create a new programming language, code-named “D,” which will be at the heart of Microsoft’s push toward more intuitive software modeling.

D is a key component of Microsoft’s Oslo software-oriented architecture (SOA) technology and strategy. Microsoft outlined in vague terms its plans and goals for Oslo in late fall 2007, hinting that the company had a new modeling language in the works, but offering no details on what it was or when the final version would be delivered.

D will be a declarative language aimed at non-developers, and will be based on eXtensible Application Markup Language (XAML), said sources who asked not to be named.

Sources close to Microsoft confirmed the existence of D, which they described as a forthcoming “textual modeling language.” In addition to D, sources said, Microsoft also is readying a complementary editing tool, code-named “Intellipad,” that will allow developers to create content for the Oslo repository under development by Microsoft.

5 Key Areas of Microsoft Oslo

See for the following on the “Oslo” advancements that will be delivered through Microsoft server and tools products in five key areas:

Server. Microsoft BizTalk Server “6” will continue to provide a core foundation for distributed and highly scalable SOA and BPM solutions, and deliver the capability to develop, manage and deploy composite applications.

Services. BizTalk Services “1” will offer a commercially supported release of Web-based services enabling hosted composite applications that cross organizational boundaries. This release will include advanced messaging, identity and workflow capabilities.

Framework. The Microsoft .NET Framework “4” release will further enable model-driven development with Windows Communication Foundation (WCF) and Windows Workflow Foundation (WF).

Tools. New technology planned for Visual Studio “10” will make significant strides in end-to-end application life-cycle management through new tools for model-driven design of distributed applications.

Repository. There will also be investments in aligning the metadata repositories across the Server and Tools product sets. Microsoft System Center “5,” Visual Studio “10” and BizTalk Server “6” will utilize a repository technology for managing, versioning and deploying models.

David Chappell on Microsoft Oslo

See for the following:

Some Oslo details first went public in June of this year at TechEd. As described then, the code name "Oslo" applied to three things: a new version of Windows Workflow Foundation (WF), a server for running WF applications and others, and a set of modeling technologies, including a repository and visual editor. All of these technologies can be used together, so putting them under an umbrella code name made some sense.

Microsoft Oslo and Visual Studio 2010

Planning a technology roadmap for the next few years requires that we know what's coming up from Microsoft with Oslo, Visual Studio 2010, etc. I'm not exactly sure how Oslo and VS 2010 will affect each other as Microsoft is wisely keeping all the details under wraps. I just know that their core functionality must be married as Oslo is the coding foundation for future Microsoft software code development and Visual Studio is the future tool for making it all happen. Stay tuned or be sorry!!

Suggested links: