Category Archives: ALM

Retrieve Work Item Discussion History using Visual Studio Online’s REST API

At TechEd North America Microsoft announced the availability of a preview REST API for Visual Studio Online.  Anyone who has spent time around Team Foundation Server will know that it’s always had a strongly-typed C# SDK which can be used to extract and manipulate work items held in TFS.  This REST API is new and will eventually be available for the on-premise hosted version of TFS.

In a lucky coincidence I’m doing some work that requires me to extract work item information for other purposes, so I decided to give the API a whirl.  As you’ll see, it went pretty well!

Strictly speaking, what you get back by default isn’t a complete Work Item: Attachments and History are both excluded.  Interestingly, both of these tend to be pain points for other traditional tools to deal with too :).

Luckily the new REST API offers support for retrieval of “Updates” which represent the set of fields changed on each update to the Work Item.  This can be any combination of fields or comments directly added to the history by users.

The demo code below shows how you can build out the history so it displays “Discussion Only” items and excludes all others.  As it’s early days for the API maybe we’ll see the sort of logic covered here wrapped in an API call so you don’t need to parse these objects locally.
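
Here’s a minimal sketch of the approach using HttpClient and Json.NET.  The endpoint shape, api-version and JSON field names reflect the preview documentation at the time of writing, so treat them as assumptions and check the current docs; the account, credentials and work item ID are all placeholders.

// Minimal sketch: fetch a work item's "Updates" and print only the discussion entries.
// Assumes "Alternate Credentials" are enabled on your profile (sent as HTTP basic auth)
// and the preview 'updates' endpoint - verify the api-version against the current docs.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using Newtonsoft.Json.Linq; // Json.NET

class DiscussionHistory
{
    static void Main()
    {
        var account = "youraccount"; // placeholder account name
        var workItemId = 42;         // placeholder work item id

        using (var client = new HttpClient())
        {
            var token = Convert.ToBase64String(Encoding.ASCII.GetBytes("username:password"));
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", token);

            var url = string.Format(
                "https://{0}.visualstudio.com/DefaultCollection/_apis/wit/workitems/{1}/updates?api-version=1.0-preview.2",
                account, workItemId);

            var response = JObject.Parse(client.GetStringAsync(url).Result);

            // Each update lists only the fields that changed in that revision;
            // a discussion comment surfaces as a System.History entry.
            foreach (var update in response["value"])
            {
                var fields = update["fields"];
                if (fields == null || fields["System.History"] == null) continue;

                Console.WriteLine("{0} ({1}): {2}",
                    update["revisedBy"]["displayName"],
                    update["revisedDate"],
                    fields["System.History"]["newValue"]);
            }
        }
    }
}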

Once the entire solution is ready I’ll post the source up on GitHub.  Note you’ll need to enable “Alternate Credentials” on your user profile to get this code working.


Use Tags to better manage your TFS work items

Updated: early in 2014 Microsoft released an update that makes it possible to query using Tags.  See the full details online.
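
For example, something along these lines should now work via the client object model (the collection URL, project name and tag are placeholders, and you’ll need client assemblies recent enough to know about the System.Tags field):

// Sketch: find all work items in a project carrying a given tag.
using System;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.WorkItemTracking.Client;

class TagQuery
{
    static void Main()
    {
        // authentication omitted for brevity
        using (var tpc = new TfsTeamProjectCollection(new Uri("https://youraccount.visualstudio.com/DefaultCollection")))
        {
            var store = tpc.GetService<WorkItemStore>();

            // 'Contains' is the operator to use against the Tags field
            var wiql = "Select [System.Id], [System.Title] From WorkItems " +
                       "Where [System.TeamProject] = 'YOURPROJECT' " +
                       "And [System.Tags] Contains 'keep'";

            foreach (WorkItem item in store.Query(wiql))
            {
                Console.WriteLine("{0}: {1}", item.Id, item.Title);
            }
        }
    }
}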

As many of you have found, at present it isn’t possible to customise work item templates to provide custom fields the way you can in the on-premise version of TFS. While the out-of-the-box work item types mostly suffice, there are cases where not being able to customise the template impacts your ability to properly report on the work you are managing.

To go some way towards addressing this, Microsoft released an update to Team Foundation Service in January 2013 that provides a ‘tag’ feature allowing users to add metadata to a work item.

I must admit that until my most recent engagement I hadn’t looked closely at tags, as I’d found that a well-thought-through Area Path scheme tended to work well when querying work items. I’m now working on a project where we are performing a large number of migrations between platforms and must deal with a fair amount of variation between our source system and our target.

Our primary concern is to arrange our work by business unit – this will ultimately determine the order in which we undertake work due to a larger project that will affect availability of our target system for these business units. To this end, the Area Path is set up so that we can query based on Business Unit.

In addition to business unit we then have a couple of key classifications that would be useful to filter results on: the type of the source system, and the long-term plan for the system once migrated.

When creating our backlog we tagged each item as we brought it in, resulting in a set of terms we can easily filter with.

Backlog with tags

This allows us to easily filter the backlog (and Sprint backlogs, if you use a query view for a Sprint) like this:

Backlog with applied filter

The great thing with this approach is that as you add a filter to a backlog, the remaining filter options display the number of work items that match that element – in the example below we can see that there are 90 “team sites” and 77 “applications” that are also tagged “keep”. This is extremely useful and a great way to get around some of the work item customisation limitations in Team Foundation Service.

Filter applied

One item to be aware of with tags is that right now the best experience with them is on the web. If you regularly use Team Explorer or Excel to manage your TFS backlog then they may be of limited use. You can still filter using them within Excel, but the manner in which they are presented (as a semi-colon-delimited list of keywords) means easily filtering by individual tags isn’t possible.

Excel filter


Quick Links To Help You Learn About Developing For The Cloud

Unsurprisingly, I think the way Cloud Computing is transforming the IT industry is also leading to easier ways to learn and develop skills around the Cloud. In this post I’m going to run down what I think are some of the best ways to start dipping your toe into this space if you haven’t already.

Sign up for a free trial

This is easy AND low cost. Turn up to the sign-up page for most major players and you’ll get free or low-cost services for a timed period. Sure, you couldn’t start the next Facebook at this level, but it will give you enough to start to learn what’s on offer.  You can run VMs, deploy solutions, utilise IaaS, PaaS and SaaS offerings and generally kick the tyres of the features of each. At the time of writing these are:

Learn the APIs and use the SDKs

Each of Amazon, Azure, Google, Office 365 and Rackspace offers some form of remote programmable API (typically presented as REST endpoints).  If you’re going to move into Cloud from traditional hosting or system development practices then starting to learn about programmable infrastructure is a must.  Understanding the APIs available will depend on leveraging existing documentation:

If you aren’t a fan of working so close to the wire you can always leverage one of the associated SDKs in the language of your choice:

The great thing about having .Net support is that you can then leverage those SDKs directly in PowerShell and automate a lot of tasks via scripting.

Developer Tool Support

While having an SDK is fine, there’s also a need to support developers within whatever IDE they happen to be using.  Luckily you get support here too:

Source Control and Release Management

The final piece of the puzzle and one not necessarily tied to the individual Cloud providers is where to put your source code and how to deploy it.

  • Amazon Web Services: You can leverage Elastic Beanstalk for deployment purposes (this is a part of the Visual Studio and Eclipse toolkits). http://aws.amazon.com/elasticbeanstalk/
  • Google App Engine: Depending on language you have a few options for auto-deploying applications using command-line tools from build scripts.  Eclipse tooling (covered above) also provides deployment capabilities.
  • Rackspace Cloud: no publicly available information on build and deploy.
  • Windows Azure: You can leverage deployment capabilities out of Visual Studio (probably not the best solution though) or utilise the in-built Azure platform support to deploy from a range of hosted source control providers such as BitBucket (Git or Mercurial), Codeplex, Dropbox (yes, I know), GitHub or TFS.  A really strong showing here from the Azure platform! http://www.windowsazure.com/en-us/develop/net/common-tasks/publishing-with-git/

So, there we have it – probably one of the most link-heavy posts you’ll ever come across – hopefully the links will stay valid for a while yet!  If you spot anything that’s dead or just plain wrong, leave me a comment.

HTH.


SharePoint Online 2013 ALM Practices

SharePoint has always been a bit of a challenge when it comes to structured ALM and developer practices, which is something Microsoft partially addressed with the release of SharePoint and Visual Studio 2010. Deploying and building solutions for SharePoint 2013 pretty much retains most of the IP from 2010, with the noted deprecation of Sandboxed Solutions (this means they’ll be gone in SharePoint vNext).

As part of the project I’m leading at Kloud at the moment we are rebuilding an Intranet to run on SharePoint Online 2013, so I wanted to share some of the Application Lifecycle Management (ALM) processes we’ve been using.

Packaging

Most of the work we have been doing to date has leveraged existing features within the SharePoint core – we have, however, spent time utilising the Visual Studio 2012 SharePoint templates to package our customisations so they can be moved between multiple environments. SharePoint Online still provides support for Sandboxed Solutions and we’ve found that they provide a convenient way to deploy elements that are not developed as Apps. Designer packages can also be exported and edited in Visual Studio to produce a re-deployable package (which results in a Sandboxed Solution).

PowerShell

At the time of writing, the number of PowerShell cmdlets for managing SharePoint Online is substantially smaller than for on-premise. If you need to modify any element below a Site Collection you are pretty much forced to write custom tooling or perform the tasks manually – we have made a call in some cases to build tooling using the Client Side Object Model (CSOM) and in others to perform tasks manually.
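
To give a flavour of what that tooling looks like, here’s a minimal CSOM sketch that renames a web sitting below the Site Collection root.  The tenant URL, account and sub-web name are placeholders (and you’d obviously source the password from somewhere more sensible):

// Minimal CSOM sketch: connect to SharePoint Online and update a sub-web's title.
using System;
using System.Security;
using Microsoft.SharePoint.Client; // SharePoint 2013 Client Components

class CsomTooling
{
    static void Main()
    {
        var password = new SecureString();
        foreach (var c in "password") password.AppendChar(c); // placeholder credentials

        using (var context = new ClientContext("https://yourtenant.sharepoint.com/sites/intranet"))
        {
            context.Credentials = new SharePointOnlineCredentials("admin@yourtenant.onmicrosoft.com", password);

            var web = context.Site.OpenWeb("teamnews"); // an element below the Site Collection root
            web.Title = "Team News";
            web.Update();

            context.ExecuteQuery(); // nothing hits the server until this call
        }
    }
}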

Development Environment

Microsoft has invested some time in the developer experience around SharePoint Online and now provides free access to an “Office 365 Developer Site” which gives you a single-license Office 365 environment in which to develop solutions. The General Availability of Office 365 Wave 15 (the 2013 suite) sees these sites only being available to businesses holding enterprise (E3 or E4) licenses.  Anyone else will need to utilise a 30-day trial tenant.

We have had each team member set up their own site and develop solutions locally prior to rolling them into our main deployment. Packaging and deployment is obviously key here as we need to be able to keep the developer instances in sync with each other, and the easiest way to achieve that is with WSPs that can be redeployed as required.

One other thing we have done around development is to utilise an on-premise setup in a VM to provide developers with a more rapid development experience in some cases (and more transparent troubleshooting). As long as you mostly stick to the SharePoint CSOM, a lot of your development these days resides in JavaScript, which means you shouldn’t hit any snags from relying on on-premise / full-trust features in your delivered solutions.

Note that the Office 365 Developer Site is a single-license environment which means you can’t do multi-user testing or content targeting. That’s where test environments come into play!

Test Environment

The best way to achieve a more structured ALM approach with Office 365 is to leverage an intermediate test environment – the easiest way for anyone to achieve this is to register for a trial Office 365 tenant. While technically only available for 30 days, this still provides you with the ability to test prior to deploying to your production environment.

Once everything is tested and good to go into production you’re already in a position to know the steps involved in deployment!

As you can see – it’s still not a perfect world for SharePoint ALM, but with a little work you can get to a point where you are at least starting to enforce a little rigour around build and deployment.

Hope this helps!


Create New Folder Hierarchies For TFS Projects using Git SCM

Like a lot of people who’ve worked heavily with TFS, you may not have spent much time working with Git or any of its DVCS brethren.

Firstly, a few key things:

1. Read and absorb the tutorial on how best to work with Git from the guys over at Atlassian.
http://atlassian.com/git/tutorial/git-basics

2. Install the Visual Studio 2012 Update 2 (currently in CTP, possibly in RTM by the time you read this).
http://www.microsoft.com/en-us/download/details.aspx?id=36539 (grab just vsupdate_KB2707250.exe)

3. Install the Git Tools for Visual Studio http://visualstudiogallery.msdn.microsoft.com/abafc7d6-dcaa-40f4-8a5e-d6724bdb980c

4. Install the most recent Git client software from http://git-scm.com/downloads

5. Set your default Visual Studio Source Control provider to be “Microsoft Git Provider”.

6. Setup an account on Team Foundation Service (https://tfs.visualstudio.com/), or if you’re lucky enough maybe you can even do this with your on-premise TFS instance now…

7. Make sure you enable and set alternate credentials in your TFS profile:

alt-credentials

8. Setup a project that uses Git for source control.

At this stage you have a couple of options – you can clone the repository using Visual Studio’s Git support

gitclone

OR you can do it right from the commandline using the standard Git tooling (make sure you’re at a good location on disk when you run this command):

git clone https://thesimpsons.visualstudio.com/defaultcollection/_git/bart milhouse
Cloning into 'milhouse'...
Username for 'https://thesimpsons.visualstudio.com/': homer
Password for 'https://thesimpsons.visualstudio.com/':
Warning: You appear to have cloned an empty repository.

I tend to set up a project directory hierarchy early on, and with Git support in Visual Studio I’d say it’s even more important as you don’t have a Source Control Explorer view of the world and Visual Studio can quickly create a mess when adding lots of projects or solution elements.  The challenge is that (as of writing) Git doesn’t support empty folders, and the easiest workaround is to create your folder structure and drop an empty file into each folder.

Now this is where Visual Studio’s Git tools won’t help you – they have no concept of files / folders held outside of Visual Studio solutions, so you will need to use the Git tools at the commandline to effect this change. Once you have your hierarchy set up with empty files in each folder, at a command prompt change into the root of your local repository and then do the following.

git add -A
git commit -m "Hmmmm donuts."

Now, at this point, if you issue “git push” you may experience a problem and receive this message:

No refs in common and none specified; doing nothing.
Perhaps you should specify a branch such as ‘master’.
Everything up-to-date.

Which, apart from being pretty good English (if we ignore ‘refs’), is pretty damn useless.

How to fix? Like this:

git push origin master

This explicitly tells Git to push your local ‘master’ branch to the ‘origin’ remote, and your newly populated hierarchy should be pushed to TFS, er Git, er TFS. You get the idea. Then the others on your team are able to clone the repository (or perform a pull) and will receive the updates.

HTH.

Update: a big gotcha that I’ve found, and it results in a subtle issue, is this: if you have a project that has spaces in its title (e.g. “Big Web”) then Git happily URL-encodes that and will write the folder to disk in the form “Big%20Web”, which is all fine and dandy until you try to compile anything in Visual Studio. Then you’ll start getting CS0006 compilation errors (unable to find metadata files).  The fix is to override the target when cloning the repository to make sure the folder is validly named (in my example above this checks out the “bart” project to the local “milhouse” folder).


Deploy Umbraco using Octopus Deploy

Every once in a while you come across a tool that really fits its purpose and delivers good value for not a whole lot of effort.  I’m happy to say that I think Octopus Deploy is one such tool!  While Octopus isn’t the first (or the most mature) tool in this space, it hits a sweet spot and really offers any sized team the ability to get on board the Continuous Deployment / Delivery bandwagon.

My team is using it to deploy a range of .Net websites, and we’re considering it for pushing database changes too (although these days a lot of what we build utilises Entity Framework so we don’t need to push that many DB scripts about any more).  One thing we’ve done a lot of is deploy Umbraco 4 sites.

It’s About The Code

One important fact to get out here: I’m only going to talk about how Octopus will help you deploy .Net code changes.  While you can generate SQL scripts and deploy them using Octopus (and ReadyRoll perhaps), Umbraco development has little to do with schema change and everything to do with instance data change.  This is not an easy space to be in – more so with larger websites – and even Umbraco has found it hard to solve despite producing Courier specifically for this challenge.  All this being said, I’m sure if you spend time working with SQL Data Compare you can come up with a database deployment step using scripts.

Setting It Up

Before you start Umbraco deployments using Octopus you need to make a decision about what to deploy each time and then prepare your target server accordingly.

When developing with Umbraco you will have a “media”, an “umbraco” and an “umbraco_client” folder in your solution folder but not necessarily included in your Visual Studio solution.  These three folders will also be present on your target deployment server and in order to leverage Octopus properly you need to manage these three folders appropriately.

Media folder

This folder holds files that are uploaded by CMS users over time.  It is rare that you would want to take media from a development environment and push it to another environment except on initial deployment.  If you do deploy it each time then your deployment will be (a) larger and (b) more challenging to execute (notice I didn’t say impossible).  You’ll also need to deal with merging of media “meta data” in the Umbraco CMS you’re deploying to (you’re back at Courier at this stage).

Regardless of whether you want to push media or not, you will need to deal with how you treat the media folder on your target server – Octopus can automatically re-map your IIS root folder for your website to your new deployment, so you’ll need to write PowerShell to deal with this (and with merging content if required).

Our team’s process is to not transfer media files via Octopus and we have solved the media folder problem by creating the folder as a Virtual Directory in IIS on the target web server.  As long as the physical folder has the right permissions you will have no problems with this approach.  The added benefit here is that when Octopus Deploy remaps your IIS root folder to a new deployment the media is already in place and not affected at all.

Umbraco folders

The two Umbraco folders are required for the CMS to function as expected.  While some of you might make changes internally to these folders, I’d recommend you revisit your reasons for doing so and see if you can’t make these two folders static and simply re-deploy the Umbraco binaries in your Octopus package.

There are a couple of ways to proceed with these folders – you can choose to redeploy them each time or you can treat them as exceptions and, as with the media folder, you can create Virtual Directories for them under IIS.

If you want to redeploy them as part of your package you will need to do a few things:

  1. Create a small stub “umbraco.zip” that is empty or that contains a single file (or similar) in your Visual Studio solution.
  2. Write some PowerShell in your PostDeploy.ps1 file that unzips those folders into the right place on your target web server.
  3. In your build script (on your build server, right?) utilise an appropriate MSBuild extension (like the trusty MSBuildCommunityTasks) to zip the two folders into a zip that replaces the stub you created in (1).
  4. Run your build in Release mode (required to trigger OctoPack) which will trigger the packaging of your outputs, including the new zip from (3).

On deployment you should see your Powershell execute and unpack your Umbraco folders into the right place!

Alternatively, you can choose not to redeploy these two folders each time – if this suits (and it does for us) then you can use the same approach as with the media folder and simply create two Virtual Directories.  Once you’ve deployed, everything will work as expected.

It’s Packaged (and that’s a Wrap)

So there we have a simple scenario for deploying Umbraco via Octopus Deploy – I’m sure there are more challenging scenarios than the above, but I bet with a little MSBuild and PowerShell-foo you can work something out.

I hope you’ve found this post useful and I’d recommend checking out the Octopus Deploy Blog to see what great work Paul does at taking feedback on board for the product.


TFS As A Service (TFSPreview.com) – Connect to and Query Using C#

Having worked with every version of Team Foundation Server (TFS) since its inception, I was keen to see what API support “TFSPreview.com” has. The good news is that (at time of blogging) the API accessibility is all there, is free, and aligns with the on-premise API and client object model.

I’ve always felt the strongly-typed client object model and library is a strength of the TFS offering over many of its competitors, and the classes that compose it provide some good extensibility possibilities – I’ve been on a project where all “Requirements” work item types from an MSF Agile project were exported via the API to a Word document and formatted for presentation to the customer – and we could re-run it any time!

This past week has seen the RTM availability of a bunch of Microsoft products including Visual Studio and Team Foundation Server 2012, which means that an RTM set of the TFS Client Object Model assemblies is now available. After grabbing them I fired up Visual Studio, added in the correct references and was able to connect to our TFS Preview instance and perform some query magic.

Here’s an (incomplete) sample of how easy it is!


// Add the namespaces.  Make sure it's version 11.0.0.0!
using System;
using System.Collections.Generic;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.WorkItemTracking.Client;

// some time later in your code...
// this code assumes the user who owns the current thread has previously authorised against the server
using (TfsTeamProjectCollection tpc = new TfsTeamProjectCollection(new Uri("https://yourinstance.tfspreview.com/defaultcollection"), new TfsClientCredentials()))
{
     var workItemService = tpc.GetService<WorkItemStore>();

     // @Me is a built-in macro; @Type and @TeamProject are supplied via the dictionary below
     var queryString = "Select [State], [Title] From WorkItems Where [System.TeamProject] = @TeamProject and [Work Item Type] = @Type and [Assigned to] = @Me";

     // the type arguments matter - a bare 'new Dictionary()' won't compile
     var variables = new Dictionary<string, string>()
                         {
                             {"Type", "Task"},
                             {"TeamProject", "YOURPROJECT"}
                         };

     var query = new Query(workItemService, queryString, variables);
     WorkItemCollection results = query.RunQuery();
}
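
If you want to do something with the results, the collection enumerates into strongly-typed WorkItem objects – for example (inside the using block above):

     // iterate the results - each entry is a strongly-typed WorkItem
     foreach (WorkItem workItem in results)
     {
         Console.WriteLine("{0} [{1}]: {2}", workItem.Id, workItem.State, workItem.Title);
     }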

So the above is just a simple and incomplete example – but it will connect and return results for you. The TFS extensibility options are pretty wide and varied as can be seen on MSDN! I’ll post more stuff up here over time as I work through my planned use of this server (a custom team board in our office… too big perhaps, but hey…).

If you don’t have a TFSPreview account I’d recommend getting one and having a play – Microsoft has said the platform will be free through to the end of 2012, so I’d say there’s no better way to try it out than that. They are also shipping updates for the platform every 3 weeks, keeping it ahead of the on-premise version, which will get quarterly updates (based on the TFSPreview updates). Get in and get informed.
