Wednesday, March 18, 2009

Problems in changing username for TFS services

I changed the username that runs all the TFS services on the TFS Application Tier machine. The next day I noticed that certain information, such as users in the Assigned To dropdown, Areas, Iterations, etc., was not getting updated in a timely manner. What was going on? It turns out the cause lies in the Integration services.

The following SQL, run on the TFS data tier, will show whether the username on the TFS application tier has all four necessary subscriptions in place.

select s.address, i.domain, i.account_name from TfsIntegration..tbl_subscription s
left join TfsIntegration..tbl_security_identity_cache i on s.subscriber = i.sid
where delivery_type = 2
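As a quick sanity check, a variation of the query above counts the subscriptions per account. This is a sketch assuming the same TFS 2008 TfsIntegration schema used above; the service account should come back with a count of 4.

```sql
-- Sketch: count Soap subscriptions per account (assumes the
-- TfsIntegration schema from the query above). The TFS service
-- account should show a subscription_count of 4.
select i.domain, i.account_name, count(*) as subscription_count
from TfsIntegration..tbl_subscription s
left join TfsIntegration..tbl_security_identity_cache i
    on s.subscriber = i.sid
where s.delivery_type = 2
group by i.domain, i.account_name
```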

In my case the results showed that not all four subscriptions were in place, so I had to log in to the application tier server as the username running the TFS services and run the following five commands in sequence, substituting my own server name for <servername>.

cd C:\Program Files\Microsoft Visual Studio 2008 Team Foundation Server\Tools

BisSubscribe.exe /eventType DataChangedEvent /deliverytype Soap /address http://<servername>:8080/VersionControl/v1.0/Integration.asmx /server http://<servername>:8080

BisSubscribe.exe /eventType DataChangedEvent /deliverytype Soap /address http://<servername>:8080/WorkItemTracking/v1.0/SyncEventsListener.asmx /server http://<servername>:8080

BisSubscribe.exe /eventType BuildCompletionEvent /deliverytype soap /address http://<servername>:8080/WorkItemTracking/v1.0/Integration.asmx /server http://<servername>:8080

BisSubscribe.exe /eventType ProjectCreatedEvent /deliverytype Soap /address http://<servername>:8080/Warehouse/v1.0/warehousecontroller.asmx /server http://<servername>:8080

On the third one, for the BuildCompletionEvent, I got the error "requested value 'soap' was not found". When I dropped the deliverytype parameter as follows, it worked fine:
BisSubscribe.exe /eventType BuildCompletionEvent /address http://<servername>:8080/WorkItemTracking/v1.0/Integration.asmx /server http://<servername>:8080

So now it appears to work properly for the Assigned To dropdown, but the Areas dropdown is still not working. The Event Viewer on the Application Tier server shows the TF51338 error. Solving this problem requires putting the username into the TFS server's Service Accounts group. This can't be done in the TFS GUI the way it can for other server or team project groups in TFS; it requires the TFSSECURITY.exe command line utility. I went to a DOS window on the application tier server and typed in the following:

cd C:\Program Files\Microsoft Visual Studio 2008 Team Foundation Server\Tools
tfssecurity.exe /server:<servername> /g+ "[Server]\Service Accounts" n:<domain>\tfsservices

Presto!! It worked :) My updates to Areas/Iterations now show up in real time in the Areas and Iterations tree views of the Work Item GUI in Visual Studio or TSWA. And as soon as I click "refresh" on the team project, the list of users in my "Assigned To" dropdown gets filled in.

Monday, March 16, 2009

Tasks "at the bottom" in MS Project

Microsoft Project is a great tool for creating task/schedule hierarchies for project tasks, and it integrates well with TFS. Whenever you run a query to import work items, it automatically filters out duplicates from your WIQL results, presenting a grid of unique work items meeting the WIQL criteria with a first column of pre-checked checkboxes. This way you can further filter which work items to bring into your project plan document.

The problem is that all the new work items come in "at the bottom" of the MS Project plan. The solution comes from Project's documentation, under the heading "Moving tasks". Here are the steps to follow:

1. Remember to drag and drop, and do not cut and paste.
2. For the task you wish to move, select the entire task row by clicking the gray row heading, which includes the task number.
3. Point to the row heading until a four-headed arrow appears, and then drag the task to where you want it in the active view.
4. A gray line along the row border indicates where the task will be inserted when you release the mouse button.

Friday, February 27, 2009

Tips on TFS Security

1. During permissions testing, always make sure that at least two TFS admins remain full admins on the TFS server and in the TFS security settings.

2. Define the security groups you want for TFS.

3. Create these groups as Windows groups on the server that's running TFS.

4. Create corresponding TFS server groups for these Windows groups and assign one member (i.e. the Windows group) to each of these TFS server groups.

5. Create team project security groups and, by default, assign TFS server groups as members of these groups.

6. Assign security permissions by team project security group at the Team Project level or at the Area levels.

7. On the Windows server, assign roles/permissions for SQL Server Reporting Services that correspond to the Windows Server groups you've created.

8. On the Windows server, assign roles/permissions for Windows SharePoint Services that correspond to the Windows Server groups you've created.

9. Remember to "keep it simple". Think of having a handful of Windows server groups that correspond 1:1 with TFS server groups, which in turn correspond 1:1 with team project groups. These team project groups then hold all the team-project-level and area-level security permissions, with those Windows server groups as the standard. Anything else should be treated as an exception. This approach handles the cascading of all permissions for SharePoint, SQL Server Reporting Services and TFS throughout all the services associated with TFS/SharePoint.

Warning: It can be disastrous to NOT "keep it simple". And keep in mind that this article assumes you are running a single server solution with all the databases and web front ends on the same Windows server.

Wednesday, February 25, 2009

Questions every CEO should ask their CIO

What platform do you use to provide real-time integration of all information on source code control, source changes, builds, tasks, bugs, issues, project plans, documentation, test scripts and metrics reporting?

How are your project managers aware of all saved code changes in other projects that may regressively impact their projects?

How many minutes would it take you to find out what specific code changes were made in relation to an invoice line item charged to a customer?

How many minutes does it take your organization to start tracking new metrics and have reporting procedures in place at both the high-level project and low-level task detail levels?

They should have a definitive answer or plan for the first two questions, and they should be able to answer the second two with a very small number. If the answer isn't TFS then I'd like to know what it is.

TFS Learning Curve for Project Managers

TFS often comes with a steep learning curve for project managers, CIO(s), CTO(s) and others in business and IT management. This is unnecessary if approached correctly. One word describes the key in overcoming this barrier. It's SharePoint.

Show TFS and VSTS to developers, DBA(s) and other techies and you'll see them get excited and see the magic of having a one-stop place to do all their work in Visual Studio. Show VSTS to project managers and you'll often see blank stares and confusion as they imagine that this new tool is only going to give them more headaches and work to do. The fact is that TFS was designed with the intent of making life easier for IT technical workers, especially those who have worked with Visual Studio. It's much like why Boeing and Airbus design cockpits with the needs of airplane pilots in mind rather than the real-time needs of their management back in the office building. During airplane takeoffs, landings and emergency crash landings, the last thing a pilot needs is to be called out of the cockpit to file some status report that some office-building manager deems necessary to help them do their job. Well, that's how techies often feel about their management. And that's why nearly all project teams and IT organizations struggle with communication.

In my opinion the best way to get project managers up and running with TFS is to not tell them they're using TFS. Approach it as a SharePoint site deployment. Set up your process template to include the SharePoint libraries they need: the starting Project/Excel files they'll use to do their job, the Word templates for documentation/processes, process guidance, reports and organizational links. Get them very familiar with SharePoint.

Once they know SharePoint well enough to be productive then show them how to use some of the tools they'll need to access TFS work items. Start with MS Excel by launching some of the Excel files in the SharePoint libraries that come out of the box in a Team Project site. Train them on how to use Excel with TFS work items. Then show them MS Project with TFS work items. Then show them the reports they can get. Finally introduce them to TSWA, Team Explorer and other tools that can help them see how work items integrate with version control files and builds.

Thursday, February 5, 2009

Essay on Team Foundation Server Capabilities

Team Foundation Server (TFS) provides the capabilities to manage the full software development lifecycle (SDLC). I've successfully implemented it in several environments and seen it perform well in meeting the needs of an IT organization. I've yet to run across any scenario for software development in a Windows-based environment (i.e. not counting Macs, Unix, etc.) where TFS couldn't be used effectively.

Out of the box, TFS implements both CMMI and Agile methodologies through its process templates. Implementing it for an organization is just a matter of starting a new TFS team project, setting up a list of tasks in the TFS work items catalog to accomplish the organizational mandates, and setting up organizational folders and document templates in the team project SharePoint portal site that gets auto-generated when a new TFS team project is created. If an organization wants to invest in automating the project launch/management process, one of the out-of-the-box TFS process templates can be customized and imported into TFS so that whenever a new team project is created, all the organizational mandates will be there (i.e. document templates, work items, workflow, control policies, reports, etc.).

TFS is a back end for IT project workers using Visual Studio in the same way that MS Exchange is the back end for end users using Outlook for email. There are four primary editions of Microsoft Visual Studio Team System (VSTS): for developers, architects, testers and database modelers/administrators. Learning and using the TFS capabilities in Visual Studio is quite intuitive for those already familiar with the Microsoft Visual Studio platform. In fact, TFS was specifically built with Visual Studio users in mind.

Some of the key TFS features are:

(1) Robust work item tracking. Each work item has attributes (i.e. fields, columns) to store information such as Title, Description, Iteration, Area, Discipline, Assigned To, Priority, Hours Completed, Hours Remaining, Start Date, End Date, related builds, related work items, related version control items, related version control changesets, hyperlinks, attachments, etc. The attributes available depend on the work item type (WIT). For the Agile process template the WITs are QoS requirements, bugs, tasks, scenarios and risks; the CMMI process template adds Requirement and Review WITs. The Process Template Editor can be used to create or modify WIT templates, allowing you to track almost anything in a work item and to set up workflow rules/procedures for the work items of those WITs.
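To make the field list concrete, here is a hedged WIQL sketch that queries work items by some of these attributes. The [System.*] field reference names and the @Me/@Project macros are the standard ones, but verify them against your own process template before relying on this.

```sql
-- WIQL sketch: list the open work items assigned to the current user
-- in the current team project. Field names assume the standard
-- System.* reference names shipped with the Microsoft templates.
SELECT [System.Id], [System.Title], [System.AssignedTo], [System.State]
FROM WorkItems
WHERE [System.TeamProject] = @Project
  AND [System.AssignedTo] = @Me
  AND [System.State] <> 'Closed'
ORDER BY [System.Id]
```

A query like this can be saved as a team query and then used from Team Explorer, Excel or MS Project as the import filter described above.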

(2) Robust version control using true client/server and web services technology. The TFS version control data is stored in SQL Server. Earlier version control systems such as SourceSafe and PVCS are much like dBase or ISAM files, where data corruption, uncleared locks and other processing errors are more commonplace, because those legacy products have no true, reliable client/server processing. Performance is noticeably improved in TFS over SourceSafe. Features such as shelving, changeset processing, branching, merging and reporting are quite effective and useful in TFS.

(3) SQL Server Reporting Services integration. About two dozen reports come out of the box and meet most project management needs, and reports can be created or modified to fit a specific project or organization. A data warehouse that integrates all the version control, project management, work item tracking, attachments, builds, integration and reporting information is provided. Its cubes, with measures and dimensions, are auto-refreshed on a configurable schedule.

(4) Microsoft Excel and Microsoft Project integrate well with TFS. It's easy to add or edit work items in these tools for any TFS team project. Project plans can be generated in MS Project and imported into TFS. To-do lists can be created in MS Excel and imported into TFS, or copied and pasted into Excel from some other program in order to create TFS work items.

(5) A SharePoint portal site is created for each new TFS team project. In this portal, users on the team (or in the management and project user communities) can participate with as limited or open collaboration functionality as needed. The documents and reports available through Visual Studio (or other front ends using TFS data) are available in this portal. All WSS 3.0 functionality is available in each TFS team project portal site.

(6) TFS build management is versatile glue for marrying together programming/testing, bugs/resolutions, development/maintenance and implementations. As many build projects as needed can be created for a team project. Build parameters such as schedule, build machine, destination, source code snapshot, notifications and build error handling can be set out of the box. As work items for bugs, development tasks, tests, etc. are completed, the builds in which the bugs were found or resolved are related to them.

(7) TFS is fully extensible. All features are available through .NET namespaces and/or server configuration utilities, where customizations, extensions and integration with any application, process or workflow can be made. The market is starting to see many third-party TFS tools built for purposes such as timesheet entry, task time tracking, resource planning, status reporting, integration with CRM/ERP suites, integration with accounting systems (i.e. GL, AP, AR, OE, etc.) and integration with system admin utilities. Developers building apps on platforms such as Java, Oracle, mainframes, etc. can still use TFS thanks to the extensibility of the TFS core.

(8) Security can be handled at the server level, team project level or area level. In each team project a hierarchy of areas can be created, as simple as one node or as complex as N generations with many nodes per generation, however complex the security needs are for a team project to grant or restrict rights to read and edit information at the server, team project or area level. Just an FYI: there is an Area attribute on each work item in TFS, and only an end user with change rights on both the old Area and the new Area can change a work item's Area. Also keep in mind that rights assigned at root nodes trickle down to their children/leaf nodes, and at any server/project/area node the rights can be assigned to either a TFS user or a TFS group. A TFS group is a collection of Active Directory users and/or Active Directory groups created at the server level or team project level. This security model can handle any realistic security requirement for enterprise-wide SDLC processes. However, an important thing to keep in mind is that server-level administrators can access ALL data on any TFS server where they have server-level access. It's also worth remembering that a VSTS user (or any Excel/Project/SharePoint/third-party user of TFS) can use multiple TFS servers at the same time without incurring extra per-seat costs for Visual Studio and CALs.

(9) Perhaps the greatest feature of TFS is how well it integrates the core SDLC elements: work item tracking, version control, build management, reporting and external integration. One example of how well this works is the need for a policy that developers regularly report what tasks they're working on and what code is associated. In this case a check-in policy can be established so that a developer can only check in changes by first associating work item(s) and entering changeset comments. In addition, since every check-in is timestamped, it's possible to provide very robust and detailed time/effort reporting. Rules can also be put in place (out of the box for simple rules, customized for complex ones) to immediately notify certain users when events occur, such as a user reporting a bug, a build failing, certain work items being marked as "done", critical-path tasks not being done in time, certain versioned items being checked in, or someone changing work items assigned to you.

If you have any other questions on TFS then please let me know.

Wednesday, January 28, 2009

TFS Guide from Microsoft on CodePlex

I think this guide is helpful as it provides very useful training, mentoring and resource information on Microsoft Team Foundation Server.

Part I, Fundamentals
* Chapter 1 - Introducing the Team Environment
* Chapter 2 - Team Foundation Server Architecture
Part II, Source Control
* Chapter 3 - Structuring Projects and Solutions in Source Control
* Chapter 4 - Structuring Projects and Solutions in Team Foundation Source Control
* Chapter 5 - Defining Your Branching and Merging Strategy
* Chapter 6 - Managing Source Control Dependencies in Visual Studio Team System
Part III, Builds
* Chapter 7 - Team Build Explained
* Chapter 8 - Setting Up Continuous Integration with Team Build
* Chapter 9 - Setting Up Scheduled Builds with Team Build
Part IV, Large Project Considerations
* Chapter 10 - Large Project Considerations
Part V, Project Management
* Chapter 11 - Project Management Explained
* Chapter 12 - Work Items Explained
Part VI, Process Templates
* Chapter 13 - Process Templates Explained
* Chapter 14 - MSF for Agile Software Development Projects
Part VII, Reporting
* Chapter 15 - Reporting Explained
Part VIII, Setting Up and Maintaining the Team Environment
* Chapter 16 - Team Foundation Server Deployment
* Chapter 17 - Providing Internet Access to Team Foundation Server
Part IX, Visual Studio Team System 2008 Team Foundation Server
* Chapter 18 - What's New in Visual Studio Team System 2008 Team Foundation Server

Monday, January 26, 2009

Setup TFS WorkSpaces for Developers

It's a pain in the neck to teach everyone how to set up workspaces, show them how to get latest, and hope they don't accidentally (or purposely) mess it up. I recommend reading the posting and downloading the script into the custom TFS tools solution that you maintain in the version control tree for your organization's architecture/processes TFS team project.

Not yet ready for 64-bit

This article is a reminder that TFS is not really ready for 64-bit yet. But stay tuned: it will be, and that's when it'll roll out to tens of thousands of users on a single server farm and work tremendous magic in very large enterprises that adopt it well. It also reminds me why I was wise to shift gears recently toward supporting virtualization as the way to go for all future TFS implementations.

TFS Branching Guide

Nobody should use branching on a VSTS/TFS project without reading up on the pain and lessons that others have gone through. The guide contains 5 good PDF documents that all by themselves would make an excellent brownbag:

- Main 2.0.pdf
- Scenarios 2.0.pdf
- Q&A 2.0.pdf
- Labs
- Drawings

Wednesday, January 21, 2009

TFS Install Bug with SQL Server Files

Today I set up a server for TFS 2008. I first installed SQL Server, which put the Reporting Services data/log files in the C:\Program Files\Microsoft SQL Server\MSSQL.2\MSSQL\Data directory. I then moved these files into the D:\TFSDATA\SQLDATA and D:\TFSDATA\SQLLOG directories on the D: drive and configured SQL Server to use those directories as the defaults for data/logs. Then I installed TFS 2008 (including WSS). The installer put all the new databases' data/log files in the C:\Program Files\Microsoft SQL Server\MSSQL.2\MSSQL\Data directory. Note to Microsoft: that's what I call a bug. The workaround is simple: move those files from the folders on C: over to the default folders on D:.

Saturday, January 17, 2009

Migrate VSS to TFS

Here are some steps to follow to learn about VSS to TFS migrations.

1. I recommend watching the following Microsoft video:

Video: How To - Migrate from VSS to Team Foundation Source Control

2. Make sure you have installed TFS 2008 Service Pack 1 on your TFS server; SP1 includes several VSSConverter improvements.

3. Learn about the Analyze and Migrate commands for VSSConverter.

4. Follow the five-step process to prepare for migration: (a) Back up your Visual SourceSafe database. (b) Identify and resolve data integrity issues in your existing database using the Visual SourceSafe Analyze tool. (c) Run the converter tool to identify potential sources of information loss. (d) Specify which Visual SourceSafe folders to migrate. (e) Create a user mapping file to map Visual SourceSafe users to Team Foundation users.

5. Follow the two-part migration process: (a) seven steps to modify the settings file to create a migration file, and (b) five steps for running the converter. There's actually a sixth step that's essential for developers using VSTS: migrating the source control bindings.
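For orientation, the VSSConverter settings file mentioned in steps 4 and 5 has roughly the shape below. The element names are from memory and the paths and server names are placeholders, so verify everything against the MSDN VSSConverter documentation before using it.

```xml
<?xml version="1.0" encoding="utf-8"?>
<SourceControlConverter>
  <ConverterSpecificSetting>
    <Source name="VSS">
      <!-- Folder containing srcsafe.ini (placeholder path) -->
      <VSSDatabase name="C:\VSSDatabase"></VSSDatabase>
      <!-- User mapping file created during preparation (placeholder) -->
      <UserMap name="C:\Migrate\UserMap.xml"></UserMap>
    </Source>
    <ProjectMap>
      <!-- Which VSS folder to migrate and where it lands in TFS -->
      <Project Source="$/MyProject" Destination="$/TeamProject/MyProject"></Project>
    </ProjectMap>
  </ConverterSpecificSetting>
  <Settings>
    <!-- Placeholder TFS server name -->
    <TeamFoundationServer name="tfsserver" port="8080" protocol="http"></TeamFoundationServer>
    <Output file="VSSMigrationReport.xml"></Output>
  </Settings>
</SourceControlConverter>
```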

6. Be sure to search the MSDN forums for the latest discussion threads on VSS->TFS migrations, and check out the latest gossip about doing them.

7. Be sure to google VSSConverter for further information on the migration tool; a search today turned up a few interesting links.

8. If you really want total flexibility, learn about the Microsoft.TeamFoundation.VersionControl.Client namespace, which opens up many further possibilities.

We Share Your Pain (WSYP)

Finally Microsoft will build pain-free software ;) This is funny.

Video: Sharing software Customer pain

Thursday, January 8, 2009

Deleting Work Items in TFS

Many new TFS users are surprised to learn that deleting work items is not a piece of cake. Technically it's possible to do, but it's wisely kept from being straightforward. "Deleting" a work item means destroying history, and that's not good for honesty and transparency. What's recommended is to modify the WIT template workflow to create a state called "Deleted". But if you must delete a work item, there are a couple of ways:

(1) Delete the work item in the database.

delete from dbo.WorkItemsLatest where ID = ?

delete from dbo.WorkItemsAre where ID = ?

delete from dbo.WorkItemsWere where ID = ?

delete from dbo.WorkItemLongTexts where ID = ?

delete from dbo.WorkItemFiles where ID = ?

Warning: You may also need to delete rows in the attachments tables, and the work item will still be in the data warehouse.
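The deletes above can be wrapped in a single transaction so a partial failure can be rolled back. This is a sketch assuming the TFS 2008 operational-store table names shown above; the work item ID is hypothetical.

```sql
-- Sketch: delete one work item from the TFS 2008 operational store
-- inside a transaction (table names as in the list above).
begin transaction;

declare @id int;
set @id = 12345;  -- hypothetical work item ID

delete from dbo.WorkItemsLatest   where ID = @id;
delete from dbo.WorkItemsAre      where ID = @id;
delete from dbo.WorkItemsWere     where ID = @id;
delete from dbo.WorkItemLongTexts where ID = @id;
delete from dbo.WorkItemFiles     where ID = @id;

commit transaction;
```

If anything looks wrong midway, issue a rollback instead of the commit. Either way, back up the databases first.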

(2) Get the latest TFS Power Tools and use the "tfpt destroywi" command.

Warning: The work item will still be in the data warehouse as this only deletes them from the operational store.

Wednesday, January 7, 2009

Setup of TFS for Virtualization

I was reminded today that the first step in working with TFS is getting it set up. If I were to start from scratch I'd do so with virtualization and a dedicated single-server solution. Whether to go with Hyper-V, VMware, Virtual Server, etc. depends on your IT shop. I feel this way primarily because of the importance of backup/recovery procedures being reliable and straightforward, and the leading virtualization platforms have solid backup/recovery procedures in place that can be used effectively for TFS. I also recommend a one-server solution that's used for just TFS and its components and nothing else. Whoever manages the backups/recoveries for your web servers, email servers, data servers, etc. ought to be the same person(s) managing these same processes for your TFS server(s).

For most organizations, TFS 2008 can run on a server instance with 50 GB of hard disk space using the 32-bit editions of TFS 2008, SQL Server 2005 and Windows Server 2003 or Windows Server 2008. For virtualization, plan on 2-3 GB of available RAM at runtime. I'd also plan for about 1 GB per month of hard disk growth; if the IT teams attach a lot of multimedia content, the space requirements will grow proportionally.


Tuesday, January 6, 2009

Building a SharePoint Custom List

Microsoft is quickly moving the world towards SharePoint for everything application-related including TFS. Don't be surprised to see me write about SharePoint-related matters here as they greatly impact the possibilities with TFS.

I'd like to provide an article that covers how to create a basic custom site definition, how to create a basic custom list definition, and how to display that custom list on the default page at site creation. This knowledge is useful in TFS, as SharePoint lists are great for collaboratively working with constituencies on setting priorities, gathering requirements, reporting bugs, reporting bad implementations and coming up with meaningful to-do lists. Build your custom list templates right and you'll extend TFS functionality properly.

I'd also highly recommend getting some understanding of CAML as it relates to custom views.

Friday, January 2, 2009

Future Improvements on C# Code Commenting

Below is a tutorial I wrote on what I consider to be the current "best practice" for C# code commenting. I have a few suggestions for future improvements to this "best practice":

(1) There should be a way to automate adding a post-build event for source code projects to have the Sandcastle CHM generated as part of the build.
(2) Certainly other project documentation (such as project charters, requirements documents, design/architecture documents, TFS reports (i.e. builds, work items, changesets, etc.) and other artifacts can be integrated into the post-build event.
(3) Certainly the whole post-build process done locally on auto-building the CHM file and including other artifacts can be included as part of the TFS Build process.
(4) Certainly templates with the HP logo, SLM-labels, and other process needs can be integrated into the help-files generation process.
(5) Considering that code comments are compiled into an XML file using a defined schema, we can always customize the documentation however we want with XSLT, third party tools, etc.
(6) As Sandcastle is "open source" and Visual Studio is quite extensible it should be possible to specify custom XML tags that can be processed to generate whatever output or perform whatever tasks we want done at build-time.

Do you have any suggestions to add?

Best Practice on C# Code Commenting

For the next 30 minutes I recommend you do the following 7 steps. They'll teach you how to become proficient in a best practice for self-documenting your C# code in projects done using Microsoft Visual Studio. The same lessons apply to other .NET languages, but I mention C# because it's the most popular one. Once you've learned this process I'm confident you will use it from now on, because it automates the documentation process for all development work and lets you build HTML/Help files on the fly. Make sure you have Visual Studio 2005 or later installed on your workstation and that you've been able to successfully create and build a solution/project before proceeding with these steps.

(1) Start with an excellent intro to C# code commenting.

(2) Read a nice explanation of how code commenting works.

(3) Run Visual Studio, open a solution/project and follow these instructions: (a) Open the property page for the project, usually by right-clicking the project in Solution Explorer and clicking Properties. (b) After the dialog opens, click the Configuration Properties folder. (c) Click the Build option. (d) In the right pane there is a property field called XML Documentation File; set this to the path and file name of the desired file. The path entered is relative to the project directory, not absolute. (e) Put some "Hello World" text in a summary tag on at least one class/member within the project. (f) Build the project and view the XML documentation file to see your "Hello World" text.
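After step (f), the XML documentation file the compiler emits looks roughly like the fragment below. The assembly and class names here are hypothetical; only the overall doc/assembly/members shape comes from the compiler.

```xml
<?xml version="1.0"?>
<doc>
  <assembly>
    <!-- "MyProject" is a hypothetical assembly name -->
    <name>MyProject</name>
  </assembly>
  <members>
    <!-- "T:" marks a type; methods appear as "M:", properties as "P:" -->
    <member name="T:MyProject.MyClass">
      <summary>Hello World</summary>
    </member>
  </members>
</doc>
```

This file, plus the compiled assembly, is exactly what Sandcastle consumes in the next steps to produce the help file.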

(4) Download and install Sandcastle. If you skip this step you will be very sorry.

(5) Download the latest "Sandcastle Help File Builder Installer" MSI file and install it. Then run it from the Start menu: "All Programs -> Sandcastle Help File Builder". In the GUI, click the Add button to include the DLL file for the assembly you generated in step (3)(f) of this tutorial. Then select "Documentation -> Build Project" from the menu to build a help file. Finally, select "Documentation -> View Help File" from the menu to open the help file and search for your "Hello World" text.

(6) Read up on the recommended documentation tags and try them out in the project you used in step 3. Build the project in Visual Studio, then build the help file in Sandcastle to see the result.

(7) There are other code commenting tools for .NET, but I recommend Sandcastle because it has plenty of functionality, its source code is freely available, and a quick Google search shows there is sufficient documentation, assistance and community support for Sandcastle's future.

Did you do all 7 steps and bookmark this page for future reference? If so then you should now be sufficiently proficient in the current best practice for self-documenting your C# or VB.NET code in projects done using Microsoft Visual Studio. Congratulations, guru!! See my post on future improvements for ideas on enhancing this best practice.

Thursday, January 1, 2009

TFS Build and WSPBuilder

This may be the "holy grail" for SharePoint development, as Brian Farnhill describes how to make TFS Build and WSPBuilder work well together. This makes full-service, collaborative, large-team SharePoint development possible. Read Brian's post and keep in mind my summary notes on the 7 steps.

1. Install WSPBuilder on the TFS build server.

2. Add WSPBuilder to the PATH environment variable. Add the WSPBuilder path (C:\Program Files\WSPTools\WSPBuilderExtensions) to the PATH system variable on both the development box and the TFS build box.

3. Set files included in the WSP file to copy to the output directory. In the Solution Explorer window of Visual Studio, select each 12-hive content file and do the following in the Properties window: (a) set the item's build action to Content, and (b) set it to copy to the output directory if newer. Thus when TFS does a build, the directory where it puts the DLLs will also contain the other 12-hive files, ready for the WSPBuilder call to run.

4. Add the post-build activity to the appropriate projects. For each project in your solution that produces a WSP, you need to follow Brian's instructions for the post-build action code. So if your solution builds 9 WSPs, you need to do this 9 times; usually there will be just one WSP per solution and just one post-build action snippet to write. I advise setting the $(OutDir) variable to always be the complete path to where the build is outputting on the TFS build server.

5. Create a new build in TFS. This is a simple build creation, but it's VITAL to make sure you choose the debug option, because WSPBuilder expects everything to be in the debug folder.

6. Add additional reference paths. Open the TFSBuild.proj file that is created for your TFS build and add an additional reference path as Brian instructs to the appropriate section (the very last one by default; read the comments to be sure). You can then set this folder up as a network share and drop referenced assemblies in there as required.

7. Add a pre-build action (if there is more than one WSP file in the solution). With multiple WSPs, if you don't follow Brian's instructions, the DLLs from the first projects that are built will make their way into the second and subsequent WSP files. With just one WSP you can skip this step.

Now you should find that your drop location contains all the DLL, PDB and config files, as well as your WSP files. Mission accomplished.

More links for my personal research

The following is here for knowledge retention purposes only, so be forewarned that it might bore you and you can stop reading now.

Today I've been doing plenty of blog-reading. There are many more things I'd like to read further but I'm out of time. Here are 5 links without notes and 2 links with notes.

The Greatest Invention in Computer Science is "The Routine"

Aside from the invention of the computer, the routine is arguably the single greatest invention in computer science. It makes programs easier to read and understand. It makes them smaller (imagine how much larger your code would be if you had to repeat the code for every call to a routine instead of invoking the routine). And it makes them faster (imagine how hard it would be to make performance improvements in similar code used in a dozen places rather than making all the performance improvements in one routine). In large part, routines are what make modern programming possible.

the problem with routines: they only take a minute to learn, but a lifetime to master ....

* How long should this routine be? How long is too long? How short is too short? When is code "too simple" to be in a routine?
* What parameters should be passed to this routine? What data structures or data types? In what order? How will they be used? Which will be modified as a result of the routine?
* What's a good name for this routine? Naming is hard. Really hard.
* How is this routine related to other nearby routines? Do they happen at the same time, or in the same order? Do they share common data? Do they really belong together? What order should they be in?
* How will I know if the code in this routine succeeded? Should it return a success or error code? How will exceptions, problems, and error conditions be handled?
* Should this routine even exist at all?

One thing I'll add is this: I LOVE STUBS. A stub is where I pick a test value for a subroutine to send back to the calling method until I have time to properly design and write the subroutine. My favorite phrase is "Hello World from ". For example, a subroutine called MyNamespace.MyClass.MySubroutine will send back "Hello World from MyNamespace.MyClass.MySubroutine 001" the first time. As new test versions are built and the code gets improved, I might increase the number so it says "... 002", then "... 003", etc.
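A minimal C# sketch of the stub pattern (the names come from the example above; the version suffix is bumped by hand):

```csharp
namespace MyNamespace
{
    public class MyClass
    {
        // Stub: returns a recognizable placeholder until the routine is
        // properly designed and written. Bump "001" to "002", "003", ...
        // as new test versions are built.
        public static string MySubroutine()
        {
            return "Hello World from MyNamespace.MyClass.MySubroutine 001";
        }
    }
}
```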

It is the job of a good software project manager to recognize the tell-tale symptoms of this classic mistake and address them head on before they derail the project. How? By encouraging (or, frankly, forcing) developers to create a detailed list of everything they need to do. And then breaking that list down into subitems. And then adding all the subitems they inevitably forgot because they didn't think that far ahead. Once you have all those items on a list, then -- and only then -- you can begin to estimate how long the work will take.

Until you've got at least the beginnings of a task list, any concept of scheduling is utter fantasy. A very pleasant fantasy, to be sure, but the real world can be extremely unforgiving to such dreams.

Johanna Rothman makes the same point in a recent email newsletter, and offers specific actions you can take to avoid being stuck 90% done:

1. List everything you need to do to finish the big chunk of work. I include any infrastructure work such as setting up branches in the source control system.

2. Estimate each item on that list. This initial estimate will help you see how long it might take to complete the entire task.

3. Now, look to see how long each item on that list will take to finish. If you have a task longer than one day, break that task into smaller pieces. Breaking larger tasks into these inch-pebbles is critical for escaping the 90% Done syndrome.

4. Determine a way to show visible status to anyone who's interested. If you're the person doing the work, what would you have to do to show your status to your manager? If you're the manager, what do you need to see? You might need to see lists of test cases or a demo or something else that shows you visible progress.

5. Since you've got one-day or smaller tasks, you can track your progress daily. I like to keep a chart or list of the tasks, my initial estimated end time and the actual end time for each task. This is especially important for you managers, so you can see if the person is being interrupted and therefore is multitasking.
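Johanna's list boils down to a simple data structure; a hypothetical C# sketch (all names and numbers are made up):

```csharp
using System.Collections.Generic;

// Hypothetical inch-pebble tracker: every task is one day or less,
// with the initial estimate and the actual time recorded per task.
public class PebbleTask
{
    public string Name;
    public double EstimatedDays;  // if > 1.0, break the task into smaller pieces
    public double? ActualDays;    // null until the task is finished

    public static double TotalEstimate(IEnumerable<PebbleTask> tasks)
    {
        double total = 0;
        foreach (PebbleTask t in tasks)
            total += t.EstimatedDays;
        return total;  // rough end-to-end estimate for the big chunk of work
    }
}
```

Comparing EstimatedDays against ActualDays day by day is what makes slippage (and multitasking) visible early.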

Reading and Blogging in 2009

IMHO 2009 will be the year when the best-organized IT managers and professionals get into the habit of managing their blog-reading and knowledge retention well. The IT professionals who lead the efforts to apply technology to real-world business needs in 2009 will be doing plenty of reading in order to stay current, solve problems and come up with the best solutions for their organizations. Typically this will be done through an "in-the-cloud" RSS reader that is also installed on their mobile phone and lets them analyze statistics and prioritize their RSS feeds to be more productive. They'll spend at least 10 hours a week reading blogs. It'll happen on the train, in restaurants or anywhere they're "waiting", on their mobile phones. It'll happen while they are using their computers at work or home. They will typically follow hundreds of blogs and add at least a dozen more every month to their subscriptions. Next they'll track the statistics on their subscriptions, organize them according to which are most useful in their work, and focus on getting updates from the highest-priority blogs. In a typical week they'll glance at thousands of titles, pick a few hundred to click on and see the opening sentence of the posting, actually read a couple dozen, and just skim most of the rest. With all the constant change in technology, there is no more effective way to stay current in 2009 than fast blog-reading.

If you want to be heard then you need to earn trust and be rated high on their watch list. You can accomplish this by doing the following:

(1) Regularly update your blog with current useful content.
(2) Generally make each entry short and easy to read.
(3) Provide all possible links and references to avoid plagiarism.
(4) It's imperative that you sincerely try to be completely unbiased and objective.
(5) Warn your readers if you are going to write lengthy details for your own knowledge retention purposes.

Communication and Blogs

One thing I really like about TFS and SharePoint is how they work out-of-the-box to improve collaboration and communication and to use blogs to disseminate information appropriately. I found a blogger at who provides some good tips that should help every IT professional.

The single most important thing you must do to improve your programming career is improve your ability to communicate.

To program, you must elicit ideas from other people and share your ideas with them. Sharing ideas comes in many forms: explaining how you did something. Suggesting a new practice for the team. Demonstrating how something works. Convincing everyone to switch programming languages. Persuading a brilliant engineer to join your team. Persuading your manager to get out of the way and let you do your thing.

Advancing your career is entirely about communicating. Getting a job. Turning down a job. Asking for a promotion. Turning down a promotion. Getting onto a good team. Politely extricating yourself from a good team. Persuading a brilliant engineer to co-found a company. Helping a brilliant engineer understand why co-founding a company isn’t the right thing to do. Asking for funding. Turning down funding. Getting clients. Turning down clients.

If you take just one thing from this post, let it be this: To improve your programming career, the single most important thing you must do is improve your ability to communicate your ideas face to face.

I learned:

Three types of great BLOG posts:
What I learned from Language X that makes me a better programmer when I use Language Y
Something surprising that you probably wouldn’t guess about Language X from reading blog posts
My personal transformation about Idea X

A type of timewasting BLOG post:
Here’s why such-and-such

Using Routines in Programming

The longer I code, the more I've learned that the key to good programming is in how you approach writing routines. Steve McConnell at lists the following reasons, and I provide commentary.

  • Reducing complexity - I like to keep my routines short enough that I can read from the routine declaration line to the closing "}" all within one Visual Studio window. That's usually about 40 lines.

  • Avoiding duplicate code - I agree it's the most popular reason for routines.

  • Limiting effects of changes - Doing the repeatable job in one place is important and that's why good requirements analysis and good software design/architecture up-front is essential in creating good coding practices.

  • Hiding sequences - This is why good routine-writing is essential to good object-oriented programming: it's through hiding sequences in routines that we accomplish good encapsulation.

  • Improving performance - Without good, organized routines you will never figure out where the poorly performing code is located.

  • Hiding data structures and global data - Good for encapsulation; sometimes bad for documenting the data structures' usage so that future DBAs and architects can read the code and make changes or integrate systems later.

  • Promoting code reuse and planning for a family of programs - This is where object-oriented programming provides real value with routines. And with the [Obsolete] attribute in C# and other code-management tags and comment sections for .NET routines, it is much easier to plan, document and validate code reuse (or lack of reuse).

  • Improving readability - Good naming and keeping the size of the routine down to 20-40 lines are vital. Making routines too short, or not designing the high-level algorithm well, causes what I call "sub of sub of sub of sub of sub ... I'm lost and can't find the code" syndrome. It is not easy to dig through many nested levels of subroutines to find the code you need to work with, and this is a common problem I run across when the subroutines are too short and the algorithm was either not well thought out or has undergone considerable change.

  • Improving portability and isolating use of nonstandard language functions - interfaces and gateways from C# into COM libraries, external web services, external "C" code, etc. are well-managed through good routine writing.

  • Isolating complex operations - This is one of the greatest uses of routines: any complex task should be done through ONE routine, with calls to multiple subroutines in cases where a lot of code is needed.
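A hypothetical sketch of that last point: one public routine isolates the complex operation behind a single entry point and delegates the steps to short, well-named private subroutines (all names and numbers are made up):

```csharp
public static class InvoiceProcessor
{
    // The ONE routine for the complex task; callers never see the steps.
    public static decimal Process(decimal[] lineAmounts, decimal taxRate)
    {
        decimal subtotal = Sum(lineAmounts);
        decimal tax = CalculateTax(subtotal, taxRate);
        return subtotal + tax;
    }

    private static decimal Sum(decimal[] amounts)
    {
        decimal total = 0;
        foreach (decimal a in amounts)
            total += a;
        return total;
    }

    private static decimal CalculateTax(decimal subtotal, decimal rate)
    {
        return subtotal * rate;
    }
}
```

Each private helper stays small and nameable, while the public routine reads like the high-level algorithm itself.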

VS2010 Planned Features and "Later" Features

See for a list of planned features for Microsoft Visual Studio 2010, plus some features planned for AFTER Visual Studio 2010 ships.

For VS10:

* A new Windows Presentation Foundation-based (WPF) text editor
* More “modern,” with more of a WPF look and feel throughout the suite
* Smaller in size (in code and data) than Visual Studio 2008
* More reliable and modular

For some time “later”:

* Visual Studio Tools for Applications (VSTA) used for macros, plus other “end-user extensibility” improvements
* The ability to create more add-ins in managed code
* Full WPF shell
* Extensive support for the parallel framework for multicore hardware

Like just about every Microsoft product these days, VS 10 is going to get the Software+Services treatment ....

New Microsoft Programming Language Code-Named "D"

According to

A handful of Microsoft's top developers are working to create a new programming language, code-named "D," which will be at the heart of Microsoft's push toward more intuitive software modeling.

D is a key component of Microsoft’s Oslo software-oriented architecture (SOA) technology and strategy. Microsoft outlined in vague terms its plans and goals for Oslo in late fall 2007, hinting that the company had a new modeling language in the works, but offering no details on what it was or when the final version would be delivered.

D will be a declarative language aimed at non-developers, and will be based on eXtensible Application Markup Language (XAML), sources, who asked not to be named, said.

Sources close to Microsoft confirmed the existence of D, which they described as a forthcoming "textual modeling language." In addition to D, sources said, Microsoft also is readying a complementary editing tool, code-named "Intellipad," that will allow developers to create content for the Oslo repository under development by Microsoft.

5 Key Areas of Microsoft Oslo

See for the following on the “Oslo” advancements that will be delivered through Microsoft server and tools products in five key areas:

Server. Microsoft BizTalk Server “6” will continue to provide a core foundation for distributed and highly scalable SOA and BPM solutions, and deliver the capability to develop, manage and deploy composite applications.

Services. BizTalk Services “1” will offer a commercially supported release of Web-based services enabling hosted composite applications that cross organizational boundaries. This release will include advanced messaging, identity and workflow capabilities.

Framework. The Microsoft .NET Framework “4” release will further enable model-driven development with Windows Communication Foundation (WCF) and Windows Workflow Foundation (WF).

Tools. New technology planned for Visual Studio “10” will make significant strides in end-to-end application life-cycle management through new tools for model-driven design of distributed applications.

Repository. There will also be investments in aligning the metadata repositories across the Server and Tools product sets. Microsoft System Center “5,” Visual Studio “10” and BizTalk Server “6” will utilize a repository technology for managing, versioning and deploying models.

David Chappell on Microsoft Oslo

See for the following:

Some Oslo details first went public in June of this year at TechEd. As described then, the code name "Oslo" applied to three things: a new version of Windows Workflow Foundation (WF), a server for running WF applications and others, and a set of modeling technologies, including a repository and visual editor. All of these technologies can be used together, so putting them under an umbrella code name made some sense.

Microsoft Oslo and Visual Studio 2010

Planning a technology roadmap for the next few years requires that we know what's coming up from Microsoft with Oslo, Visual Studio 2010, etc. I'm not exactly sure how Oslo and VS 2010 will affect each other, as Microsoft is wisely keeping all the details under wraps. I just know that their core functionality must be married: Oslo is the coding foundation for future Microsoft software development, and Visual Studio is the future tool for making it all happen. Stay tuned or be sorry!!

Suggested links: