MyGet Blog

Package management made easier!


MyGet Update: version 1.3.0

We are happy to announce a new version of the MyGet Web site containing a set of improvements, fixes and updates.

The following work items have been addressed in this release:

Product

  • the number of plans has been reduced: we now have Free, Starter (formerly the Small plan), Professional (formerly the Large plan) and a new Enterprise plan
  • introduction of the new Enterprise plan, focused on companies that require more performance, integration and security

Usability

  • improved layout and readability on smaller screen resolutions
  • auto-lowercased routing for feed URLs (helps avoid issues caused by incorrect casing of the feed identifier)

Performance

  • upgraded to MVC4

Features

  • new identity provider: StackExchange
  • upgraded SymbolSource integration to benefit from their improved and freshly released API
  • Gallery feeds now support a README in Markdown format, available in the feed's Gallery Settings
  • Enterprise plan: upgraded administrative & quota management dashboard

More details about the new features will be covered in follow-up blog posts soon.
 
Happy packaging!

Site issues on July 2nd, 2012

What a day! July 2nd seemed to go from sunshine to rain in a couple of hours. The good news is: we’re back. Let’s first go through what happened…

What happened?

This morning, you've probably received an e-mail stating that "Your Free subscription at MyGet has expired". Free? Expired? Even we were puzzled! We quickly sent out an e-mail stating free subscriptions don't expire and that we were investigating the issue. In our rush to find out what happened, we even forgot to remove some template text from that e-mail. Professional of us, really.

A couple of hours later, the website started acting funny: one moment it was up, one moment it appeared down. The strange thing was: we did not see this happening in our server monitoring page. The brand new Windows Azure portal was all shiny and bright, all systems running. A short glitch? Probably.

And then 4:00 PM (GMT+1) came around: we started to receive SMS messages from Pingdom. UP, DOWN, UP, DOWN. Some sweaty hours later, we found the root cause of all this mess and shortly thereafter deployed a hotfix to the cloud. What a day!

All-in-all, we’ve spammed you in the morning, had some small glitches during the day and went berserk between 4:00 PM (GMT+1) and 7:00 PM (GMT+1).

What was wrong?

During the entire day, we suspected two of our latest changes. First of all, we had moved storage accounts in Windows Azure to make use of some of the new features there with regard to mirroring packages. The code isn’t done yet, but we did already switch the storage accounts. Next to that, we recently deployed our site using ASP.NET MVC 4 (RC). Was any of this the cause? We’ve been searching… But no.

Elmah revealed some interesting details. We were getting throttled on those new storage accounts. Was this due to the fact that we had enabled storage analytics? We disabled them and found that we were back up in full glory! But only for a moment, because this wasn’t the real reason we were being throttled…

We process several operations asynchronously, such as updating our caches. It turned out that there were 16,000 commands in there, all telling the system to update the cache for one specific feed. One of our newest MyGet members managed to break a couple of gears there! Not that we blame them; on the contrary: there’s no better stress test or user test than real users.

A rookie mistake

Now why were there 16,000 messages unprocessed and why were some of them on the dead-letter queue? And why were our machines looking fine (all green!) while the website did not respond? The answer lies in the following piece of code:

[DebuggerDisplay("{Title} Version={Version}")]
public class FeedPackage
    : TableServiceEntity
{
    public string Id { get; set; }
    public string Version { get; set; }

    // ...

    public override bool Equals(object obj)
    {
        if (ReferenceEquals(null, obj)) return false;
        if (ReferenceEquals(this, obj)) return true;
        if (obj.GetType() != typeof(FeedPackage)) return false;
        return GetHashCode() == obj.GetHashCode();
    }

    public override int GetHashCode()
    {
        int hashCode = 7;

        if (PartitionKey != null) hashCode ^= PartitionKey.GetHashCode();
        if (RowKey != null) hashCode ^= RowKey.GetHashCode();

        return hashCode;
    }
}

Did you spot it? We’ve gone with a complete rookie implementation of Equals and GetHashCode there…

The devil is in the fact that we’re comparing the hash code of package A with the hash code of package B. That’s a serious problem, as those hash codes are based on the hash codes of the package identifier and package version. Guess what MSDN tells us:

If two string objects are equal, the GetHashCode method returns identical values. However, there is not a unique hash code value for each unique string value. Different strings can return the same hash code.

Our newest member was in the position where he had a lot of packages uploaded to MyGet and by coincidence, two packages appeared to be “equal” using our faulty code above.

Now why did the website choke on that?

Windows Azure table storage, which we use to store all your data, is accessed in the .NET space through the Windows Azure storage client, which in turn uses WCF Data Services to fetch data. Deep in that framework, the Equals() method is called on every entity to check whether a specific entity is already being tracked by the data context in use. And since we had a duplicate result for Equals(), an unhandled exception was thrown there.

Wait a minute: you guys don’t catch unhandled exceptions? Yes we do. And when such exception occurs, we retry some operations for up to five times. Retry operations? Well, rather than failing immediately, a good pattern is to retry a failing operation to check that the error wasn’t just a minor glitch like a network fault. On Windows Azure, it is recommended to do this for calls to any external system, like a webserver calling storage. Check David Aiken’s blog for some info on this.
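
To illustrate the pattern (a minimal sketch with a hypothetical storageOperation delegate, not our literal production code; assumes the usual System and System.Threading usings), our retry logic behaved roughly like this:

static T ExecuteWithBlindRetry<T>(Func<T> storageOperation)
{
    // Blind retry: ANY exception triggers another attempt, up to five times.
    for (int attempt = 1; attempt <= 5; attempt++)
    {
        try
        {
            return storageOperation();
        }
        catch (Exception)
        {
            // Wait a second before retrying; under heavy throttling, these
            // sleeping threads are exactly what piled up on our servers.
            Thread.Sleep(TimeSpan.FromSeconds(1));
        }
    }

    throw new InvalidOperationException("Operation failed after 5 attempts.");
}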

So, retries… Yes. Retry an I/O operation with minor latency up to five times. Have a user who’s experiencing an issue with the website hit F5 a couple of times. And have that queue with 16,000 pending operations hammer this storage account again. What will happen? Throttling. Windows Azure tells our system to retry again after a second. And our retry logic does exactly that. This retry logic causes some threads to sleep, all the queue processing and F5-ing causes some more threads to sleep, and what happens? The server comes to a halt while it still looks okay to our external monitoring.

Are we running just one server? No. We’re running multiple, in a round-robin load-balanced farm. This means that all this refreshing impacted all of our servers, making the site as slow as a snail. And recover again. And go almost down again… and come up again. Exactly what we’ve been seeing today.

What are we doing to prevent this from happening in the future?

The truths of offering a cloud service are: hardware fails, software has bugs and people make mistakes. Hardware failures are mitigated by the Windows Azure system. Software bugs? It seems we’re proficient at that. And yes, that was a giant mistake of us. Our job is to mitigate all of these and provide a reliable, robust service to you. We’ll be using the lessons learned today to improve our service, your service.

First of all, we’ve fixed our rookie mistake. This should have never happened and we’ll make sure that our code base is checked and repaired for faulty Equals / GetHashCode implementations.
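
For reference, a correct implementation compares the actual key fields instead of the hash codes. A minimal sketch of the fix (not necessarily the exact code we shipped):

public override bool Equals(object obj)
{
    if (ReferenceEquals(null, obj)) return false;
    if (ReferenceEquals(this, obj)) return true;

    var other = obj as FeedPackage;
    if (other == null) return false;

    // Compare the fields themselves: different strings may share
    // a hash code, but string equality never lies.
    return string.Equals(PartitionKey, other.PartitionKey)
        && string.Equals(RowKey, other.RowKey);
}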

Next, we’ll be reviewing our retry logic. Blindly retrying on any exception type isn’t the smartest thing to do, so it seems. We’ll be making sure the retry logic only fires on transient exceptions and not blindly on any exception that occurs.
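
In code, that boils down to something like the following sketch, where IsTransient is a hypothetical helper that checks for timeouts and “server busy” throttling responses:

static T ExecuteWithRetry<T>(Func<T> storageOperation)
{
    for (int attempt = 1; attempt <= 5; attempt++)
    {
        try
        {
            return storageOperation();
        }
        catch (Exception ex)
        {
            // Rethrow immediately when the error is not transient,
            // or when we have exhausted our attempts.
            if (!IsTransient(ex) || attempt == 5)
            {
                throw;
            }

            // Back off a little longer on every attempt instead of
            // hammering a storage account that is throttling us.
            Thread.Sleep(TimeSpan.FromSeconds(attempt));
        }
    }

    throw new InvalidOperationException("Unreachable.");
}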

We’ll also be investigating additional monitoring. We’re not sure yet about possible tools or services there, but we do want to know when something is wrong and our CPUs are heating the entire datacenter at 100%.

We're sorry!

Happy packaging!

The MyGet team

Pushing packages from MyGet to NuGet (or another feed)

Imagine you have a package repository hosted on MyGet. Every time a team member of your open source or enterprise project commits source code changes, your build server pushes an updated release to this package repository in the form of a prerelease NuGet package. Now what happens if a release to the official NuGet package source has to be created? Typically, you will either create a fresh package which will be the package to release, or download a package from your build server, change the version and upload that one to NuGet.org (or another repository). No need for such an overloaded process anymore: MyGet will perform the push for you.

Setting up a continuous integration (CI) feed

First of all, you will need a CI feed to which your build server can push every NuGet package related to your project. Simply create your feed on MyGet; it’s a three-second process. After creating your feed, MyGet will present you with the feed URL as well as an API key. Optionally, you can make it a private feed and ensure only people who were granted the correct privileges can access your feed.

Next, configure your build server to push the NuGet artifacts to MyGet. Using TeamCity, for example, this can be done by adding a NuGet Push build step:

MyGet feed TeamCity
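
If your build server of choice has no NuGet-specific build step, a plain nuget.exe call achieves the same. A sketch, where the package name, API key and feed URL are placeholders for your own:

nuget push MyProject.1.0.0-ci0042.nupkg your-api-key -Source http://www.myget.org/F/yourcifeed/api/v1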

Pushing packages from MyGet to NuGet (or another feed)

The first time you want to push a package to another NuGet feed, you’ll probably have to configure the other feed’s URL and API key to use when pushing there. The push package feature is based on the package source proxy feature released earlier. Navigate to your feed and on the left hand side, click the “Package Sources” item and either add a new package source or edit the existing, default NuGet package source. In order to be able to push a package to another feed, the API key has to be specified. You can enter your NuGet.org (or another feed) API key and click the Save button.

MyGet package source selection

Pushing packages from MyGet to NuGet or any other feed is easy. Once a package source has been configured, the “Push” button enables you to push packages to another feed.


After clicking the “Push” button, MyGet will present you with an overview of the package which will be pushed to another feed. Select the feed to which you want to push and verify the other fields. To promote a prerelease package to a stable one, simply clear the Prerelease tag field. You can also modify the prerelease tag if you want. If you do intend to push a prerelease version, simply proceed by clicking the “Push” button.

Push a NuGet package

Told you it was easy. MyGet will now push the package to the selected feed and inform you by e-mail should anything go wrong. Happy packaging!

Pro NuGet is finally there!

Short version: Install-Package ProNuget or http://amzn.to/pronuget

Pro NuGet - Continuous integration Package Restore

It’s been a while since I wrote my first book. After I’ve been telling everyone that writing a book is horrendous (try writing a chapter per week after your office hours…) and that I would never write one again, my partner-in-crime Xavier Decoster and I had the same idea at the same time: what about a book on NuGet? So here it is: Pro NuGet is fresh off the presses (or on Kindle).

Special thanks go out to Scott Hanselman and Phil Haack for writing our foreword. Also big kudos to all who’ve helped us out now and then and did some small reviews. Yes Rob, Paul, David, Phil, Hadi: that’s you guys.

Why a book on NuGet?

Why not? At the time we decided we would start writing a book (September 2011), NuGet had been out there for a while already. Yet most users then (and still today) were using NuGet only as a means of installing packages, with some creating packages. But NuGet is much more! And that’s what we wanted to write about. We did not want to create a reference guide on what NuGet commands were available. We wanted to focus on best practices we’ve learned over the past few months using NuGet.

Some scenarios covered in our book:

  • What’s the big picture on package management?
  • Flashback last week: NuGet.org was down. How do you keep your team working if you depend on that external resource?
  • Is it a good idea to auto-update NuGet packages in a continuous integration process?
  • Use the PowerShell console in VS2010/11. How do I write my own NuGet PowerShell Cmdlets? What can I do in there?
  • Why would you host your own NuGet repository?
  • Using NuGet for continuous delivery
  • More!

I feel we’ve managed to cover a lot of concepts that go beyond “how to use NuGet vX” and instead have given as much guidance as possible. Questions, suggestions, remarks, … are all welcome. And a click on “Add to cart” is also a good idea ;-)

Introducing MyGet package source proxy (beta)

My blog already has quite a number of blog posts around MyGet, our NuGet-as-a-Service solution which my colleague Xavier and I are running. There are a lot of reasons to host your own personal NuGet feed (such as protecting your intellectual property or only adding approved packages to the feed, but there are many more, as you can <plug>read in our book</plug>). We’ve added support for another scenario: MyGet now supports proxying remote feeds.

Up until now, MyGet required you to upload your own NuGet packages and to include packages from the official NuGet feed. The problem with this is that you either had to have your team register multiple NuGet feeds in Visual Studio (which still is a good option), or register just your MyGet feed and add all the packages your team uses to it. Which, again, is also a good option.

With our package source proxy in place, we now provide a third option: MyGet can proxy upstream NuGet feeds. Let’s start with a quick diagram and afterwards walk you through a scenario elaborating on this:

MyGet Feed Proxy Aggregate Feed Connector

You are seeing this correctly: you can now register just your MyGet feed in Visual Studio and we’ll add upstream packages to your feed automatically, optionally filtered as well.

Enabling MyGet package source proxy

Enabling the MyGet package source proxy is very straightforward. Navigate to your feed of choice (or create a new one) and click the Package Sources item. This will present you with a screen similar to this:

MyGet hosted package source

From there, you can add external (or MyGet) feeds to your personal feed and add packages directly from them using the Add package dialog. More on that in Xavier’s blog post. What’s more: with the tick of a checkbox, these external feeds can also be aggregated with your feed in Visual Studio’s search results. Here’s the magical add dialog and the proxy checkbox:

Add package source proxy

As you can see, we also offer the option to filter upstream packages. For example, the filter string substringof('wp7', Tags) eq true that we used will filter all upstream packages whose tags contain “wp7”.

What will Visual Studio show us? Well, just the Windows Phone 7 packages from NuGet, served through our single-endpoint MyGet feed.
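
If you prefer the command line to the Visual Studio dialog for registering that single feed, nuget.exe can do it too (the feed URL is a placeholder for your own):

nuget sources add -Name "MyGet" -Source http://www.myget.org/F/yourfeedname/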

Conclusion

Instead of working with a number of NuGet feeds, your development team will just work with one feed that is aggregating packages from both MyGet and other package sources out there (NuGet, Orchard Gallery, Chocolatey, …). This centralizes managing external packages and makes it easier for your team members to find the packages they can use in your projects.

Do let us know what you think of this feature! Our UserVoice is there for you, and in fact, that’s where we got the idea for this feature from in the first place. Your voice is heard!

MyGet tops Vanilla NuGet feeds with a Chocolatey flavor

We recently deployed an all new version of MyGet.org, which contains quite a lot of optimizations and some new features as well. If you didn’t notice, go check it out!

My personal favorite is in fact the underlying architecture that allows us to aggregate feeds and link package sources. These package source presets are configurable on Feed level through the new Package Sources tab available in the Feed management interface.

managePackageSources

To add a package from these package sources onto your own feed (either referenced or mirrored, with or without its dependencies), navigate to the Add Package dialog and select the From Feed tab (previously called ‘From NuGet.org’).

addChocolateyPackage

The default setting is still NuGet.org obviously, but you might notice the dropdown containing another feed: Chocolatey.org! That’s right, why not add Chocolatey to your package sources and build a feed containing your favorite tools?

Build your own favorite Chocolatey tools feed

Building such a feed is very straightforward. Choose a feed name (which will be part of your feed URL) and provide a meaningful description.

myChocolateyFeedDetails

All that is left for a basic MyGet feed is to choose one of the predefined feed templates, as shown below. Obviously you can modify and tweak these settings further to meet your needs afterwards. We’ve updated our FAQ with a full explanation of MyGet’s security model and how you can assign or revoke user rights on a feed.

myChocolateyFeedTemplate

Once the feed is created, you can start pushing packages to it. We support various ways of doing so, including:

  • Uploading packages through the website (and optionally mirroring any dependencies found on the configured package source)
  • Referencing or mirroring packages from another feed (any package source, such as NuGet.org, Chocolatey.org, …)
  • Uploading a packages.config file, targeting any package source

Let’s add some packages from the Chocolatey Gallery, shall we? Simply select the Chocolatey Gallery package source and type any package name (or any other search criteria using our other search options). Autocomplete will kick in after you’ve typed a character or two and show you a list of possible matches. Select the one you want and verify whether you want to only reference it (copy the metadata onto your feed and keep the real package in the package source) or mirror it (deep copy). If you perform a deep copy, you might prefer to opt in and check the include dependencies checkbox.

addChocolateyPackageGitExtensions

After clicking the upload button, the operation is queued in the background for processing. It might take up to a minute until the package (and its dependencies, if requested) appears on your feed. Note that we also cache the packages list on the website; nevertheless, once processed, packages will instantly appear on your feed when queried from within Visual Studio, for instance.

For this demo, I only added GitExtensions to the feed but I mirrored it and included dependencies. After processing, my feed packages list contains the following packages.

myFavoriteToolsPackagesList

Now it’s a piece of cake to add those other tools I’m using. You might be wondering…

Why do you want to do that?

Very simple: I don’t want to mess around with a script or packages.config file on multiple computers. If I repave my system, that file is not on there. If I want to put it on that repaved machine, I need to find it back (that’s usually the real issue). And if I somehow manage to do so, it’s usually out-of-date.

What if there was a central (read: cloudy) location where I could keep track of this list? A single place to manage the tools list, and always the same location to refer to. Something that would reduce the installation of all my favorite tools to a one-liner. When drafting the Chocolatey chapter in our Pro NuGet book, I learned about the convenient -all command line option supported by many of Chocolatey’s commands, including the update (cup) and list (clist) commands. That’s when it struck me that this switch was missing for the install (cinst) command, so I bothered Rob Reynolds (@ferventcoder) with it :) Rob was so nice to help us out and made sure the cinst -all -source [feedUrl] scenario would be supported in the near future. Guess what: it is! Thanks again Rob!

Knowing this, it’s pretty straightforward to repave a system and get all your favorite tools installed in no time. It suffices to run the following command (replace my feed with your feed):

cinst -all -source http://www.myget.org/F/myfavoritetools

Not only can you use it after repaving your system, you could use it as well every time you work for a new customer and need to set up your development environment (maybe have multiple feeds for various scenarios?), or why not share the feed with your team members and make sure everyone benefits from these awesome tools out there.

If ever you have a question about MyGet or need further assistance to get you started, please refer to our blog, reach out on Twitter (@MyGetTeam) or use the Support form to contact us. We’ll be happy to help!

Publishing symbol packages for a MyGet feed

MyGet host your NuGet feed server

Ever since NuGet 1.2, there has been a great way for NuGet package authors to let their users debug into the package’s binaries. With almost no additional effort, package authors can publish their symbols and sources, and package consumers can debug into them from Visual Studio, simply by pushing a symbols package in addition to the standard NuGet package.

SymbolSource

Today, we’re proud to announce MyGet has partnered with SymbolSource.org to offer an easy workflow to publish symbol packages for a private MyGet feed. This means from now on you can publish symbol packages for your private feeds as well!

On a side note: we're sharing API keys between both services. If you also want to share the same password with both services, simply go to your MyGet profile page and re-enter your password. We'll keep it in sync after that.

Publishing a symbols package for use with MyGet

As I will assume you are used to publishing packages to NuGet and SymbolSource, here’s what changes. First of all, you will require the URLs to which to publish. Log in to MyGet and browse to your feed details. The Feed Details tab will give you all the information you need, as you can see in the following screenshot:


In short, your feed URL remains the same. If you want to consume your private feed in Visual Studio or using the NuGet Package Manager Console, simply add http://www.myget.org/F/yourfeedname as the source. The thing that changed is the publish URL: if you want to publish your packages to MyGet, use the URL http://www.myget.org/F/yourfeedname/api/v1 as the publish URL. For symbol packages, your URL will be in the form of http://nuget.gw.symbolsource.org/MyGet/yourfeedname.

The publish workflow to publish the SamplePackage.1.0.0.nupkg to a MyGet feed, including symbols, would be issuing the following two commands from the console:

nuget push SamplePackage.1.0.0.nupkg 00000000-0000-0000-0000-00000000000 -Source http://www.myget.org/F/somefeed/api/v1

nuget push SamplePackage.1.0.0.Symbols.nupkg 00000000-0000-0000-0000-00000000000 -Source http://nuget.gw.symbolsource.org/MyGet/somefeed

An example of these commands can also be found on the Feed Details tab for your MyGet feed.

Consuming symbol packages in Visual Studio

When logging in to MyGet, you can find the symbols URL compatible with Visual Studio under the Feed Details tab for your MyGet feed. This URL will be the same for all feeds you are allowed to consume, so no need to configure 10+ symbol servers in Visual Studio. Here’s how to configure it.

First of all, Visual Studio typically will only debug your own source code, the source code of the project or projects that are currently opened in Visual Studio. To disable this behavior and to instruct Visual Studio to also try to debug code other than the projects that are currently opened, open the Options dialog (under the menu Tools > Options). Find the Debugging node on the left and click the General node underneath. Turn off the option Enable Just My Code. Also turn on the option Enable source server support. This usually triggers a warning message but it is safe to just click Yes and continue with the settings specified.

MyGet symbol server in Visual Studio

Keep the Options dialog open and find the Symbols node under the Debugging node on the left. In that dialog, add the symbol server URL for your MyGet feed: http://srv.symbolsource.org/pdb/MyGet/username/11111111-1111-1111-1111-11111111111. After that, click OK to confirm the configuration changes and start consuming symbols for NuGet packages.

Enjoy!

Setting up a NuGet repository in seconds: MyGet public feeds

A few months ago, my colleague Xavier Decoster and I introduced MyGet as a tool where you can create your own, private NuGet feeds. A couple of weeks later we introduced some options to delegate feed privileges to other MyGet users allowing you to make another MyGet user “co-admin” or “contributor” to a feed. Since then we’ve expanded our view on the NuGet ecosystem and moved MyGet from a solution to create your private feeds to a service that allows you to set up a NuGet feed, whether private or public.

Supporting public feeds allows you to set up a structure similar to www.nuget.org: you can give any user privileges to publish a package to your feed while the user can never manage other packages on your feed. This is great in several scenarios:

  • You run an open source project and want people to contribute modules or plugins to your feed
  • You are a business and you want people to contribute internal packages to your feed whilst prohibiting them from updating or deleting other packages

Setting up a public feed

Setting up a public feed on MyGet is similar to setting up a private feed. In fact, both are identical except for the default privileges assigned to users. Navigate to www.myget.org and sign in using an identity provider of choice. Next, create a feed, for example:

Create a MyGet NuGet feed and host your own NuGet packages

This new feed may be named “public”, but it is in fact only private by obscurity: anyone who knows the URL to the feed can consume packages from it. Let’s change that. Go to the “Feed Security” tab and have a look at the privileges assigned to Everyone. By default, these are set to “Can consume this feed”, meaning that everyone can add the feed URL to Visual Studio and consume packages. Other options are “No access” (requires authentication prior to being able to consume the feed) and “Can contribute own packages to this feed”. This last one is what we want:

Setting up a NuGet feed

Assigning the “Can contribute own packages to this feed” privilege to a specific user or to everyone means that the user (or everyone) will be able to contribute packages to the feed, as long as the package id used is not already on the feed and as long as the package id was originally submitted by this user. Exactly the same model as www.nuget.org, that is.

For reference, all available privileges are:

  • Has no access to this feed (speaks for itself)
  • Can consume this feed (allows the user to use the feed in Visual Studio / NuGet)
  • Can contribute own packages to this feed (allows the user to contribute packages but can only update and remove his own packages and not those of others)
  • Can manage all packages for this feed (allows the user to add packages to the feed via the website and via the NuGet push API)
  • Can manage users and all packages for this feed (extends the above with feed privilege management capabilities)

Contributing to a public feed

Of course, if you have a public feed you may want to have people contributing to it. This is very easy: provide them with a link to your feed editing page (for example, http://www.myget.org/Feed/Edit/public). Users can publish their packages via the MyGet user interface in no time.

If you want to have users push packages using nuget.exe or NuGet Package Explorer, provide them a link to the feed endpoint (for example, http://www.myget.org/F/public/). Using their API key (which can be found in the MyGet profile for the user) they can push packages to the public feed from any API consumer.
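
For example, pushing a package with nuget.exe would look like the following, using the publish URL format described in the symbol packages post above (package name and API key are placeholders):

nuget push MyContribution.1.0.0.nupkg 00000000-0000-0000-0000-000000000000 -Source http://www.myget.org/F/public/api/v1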

Enjoy!

 

PS: We’re working on lots more, but will probably provide that in a MyGet Premium version. Make sure to subscribe to our newsletter on www.myget.org if this is of interest.

NuGet push... to Windows Azure

When looking at how people like to deploy their applications to a cloud environment, a large fraction seems to prefer being able to use their source control system as a source for their production deployment. While interesting, I see a lot of problems there: your source code may not run immediately and probably has to be compiled. You don’t want to maintain compiled assemblies in source control, right? Also, maybe some QA process is in place where a deployment can only occur after approval. Why not use source control for what it’s there for: source control? And how about using a NuGet repository as the source for our deployment? Meet the Windows Azure NuGetRole.

Disclaimer/Warning: this is demo material and should probably not be used for real-life deployments without making it bullet proof!

Download the sample code: NuGetRole.zip (262.22 kb)

How to use it

If you compile the source code (download), you have three steps left in getting your NuGetRole running on Windows Azure:

  • Specify the package source to use
  • Add some packages to the package source feed (which you can easily host on MyGet)
  • Deploy to Windows Azure

When all these steps have been taken care of, the NuGetRole will download the latest version of every package from the package source specified in ServiceConfiguration.cscfg:

<?xml version="1.0" encoding="utf-8"?>
<ServiceConfiguration serviceName="NuGetRole.Azure"
                      xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration"
                      osFamily="1"
                      osVersion="*">
  <Role name="NuGetRole.Web">
    <Instances count="1" />
    <ConfigurationSettings>
      <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="UseDevelopmentStorage=true" />
      <Setting name="PackageSource" value="http://www.myget.org/F/nugetrole/" />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>

Packages you publish should only contain a content and/or lib folder. Other package contents will currently be ignored by the NuGetRole. If you want to add some web content like a default page to your role, simply publish the following package:

NuGet Package Explorer MyGet NuGet NuGetRole Azure
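
If you’d rather build such a package from a .nuspec than in NuGet Package Explorer (shown above), a minimal example could look like this (the id and file names are made up for the example):

<?xml version="1.0"?>
<package>
  <metadata>
    <id>NuGetRole.SampleContent</id>
    <version>1.0.0</version>
    <authors>you</authors>
    <description>Default page for a NuGetRole web role.</description>
  </metadata>
  <files>
    <!-- content\ ends up in the web root; lib\ files would go to bin\ -->
    <file src="Default.aspx" target="content\Default.aspx" />
  </files>
</package>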

Just push, and watch your Windows Azure web role farm update their contents. Or have your build server push a NuGet package containing your application and have your server farm update itself. Whatever pleases you.

How it works

What I did was create a fairly empty Windows Azure project (download).  In this project, one Web role exists. This web role consists of nothing but a Web.config file and a WebRole.cs class which looks like the following:

public class WebRole : RoleEntryPoint
{
    private bool _isSynchronizing;
    private PackageSynchronizer _packageSynchronizer = null;

    public override bool OnStart()
    {
        var localPath = Path.Combine(Environment.GetEnvironmentVariable("RdRoleRoot") + "\\approot");

        _packageSynchronizer = new PackageSynchronizer(
            new Uri(RoleEnvironment.GetConfigurationSettingValue("PackageSource")), localPath);

        _packageSynchronizer.SynchronizationStarted += sender => _isSynchronizing = true;
        _packageSynchronizer.SynchronizationCompleted += sender => _isSynchronizing = false;

        RoleEnvironment.StatusCheck += (sender, args) =>
        {
            if (_isSynchronizing)
            {
                args.SetBusy();
            }
        };

        return base.OnStart();
    }

    public override void Run()
    {
        _packageSynchronizer.SynchronizeForever(TimeSpan.FromSeconds(30));

        base.Run();
    }
}

The above code essentially wires some configuration values, like the local web root and the NuGet package source to use, to a second class in this project: the PackageSynchronizer. This class simply checks the specified NuGet package source every few minutes, checks for the latest package versions and, if required, updates content and bin files. Each synchronization run does the following:

public void SynchronizeOnce()
{
    var packages = _packageRepository.GetPackages()
        .Where(p => p.IsLatestVersion == true).ToList();

    var touchedFiles = new List<string>();

    // Deploy new content
    foreach (var package in packages)
    {
        var packageHash = package.GetHash();
        var packageFiles = package.GetFiles();
        foreach (var packageFile in packageFiles)
        {
            // Keep filename
            var packageFileName = packageFile.Path.Replace("content\\", "").Replace("lib\\", "bin\\");

            // Mark file as touched
            touchedFiles.Add(packageFileName);

            // Do not overwrite content that has not been updated
            if (!_packageFileHash.ContainsKey(packageFileName) || _packageFileHash[packageFileName] != packageHash)
            {
                _packageFileHash[packageFileName] = packageHash;

                Deploy(packageFile.GetStream(), packageFileName);
            }
        }
    }

    // Remove obsolete content (after the package loop, so files belonging
    // to packages deployed later in the run are not removed by accident)
    var obsoleteFiles = _packageFileHash.Keys.Except(touchedFiles).ToList();
    foreach (var obsoletePath in obsoleteFiles)
    {
        _packageFileHash.Remove(obsoletePath);
        Undeploy(obsoletePath);
    }
}

Or in human language:

  • The specified NuGet package source is checked for packages
  • Every package marked “IsLatest” is downloaded and deployed onto the machine
  • Files that have not been used in the current synchronization step are deleted

This is probably not a bullet-proof solution, but I wanted to show you how easy it is to use NuGet not only as a package manager inside Visual Studio, but also from your code: NuGet is not just a package manager but in essence a package management protocol. Which you can easily extend.
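
As a small illustration of that: with the NuGet.Core package and the usual System, System.Linq and NuGet usings, listing a feed from your own code takes just a few lines (a sketch; the feed URL is a placeholder):

var repository = PackageRepositoryFactory.Default.CreateRepository("http://www.myget.org/F/yourfeedname/");

// List the latest version of every package on the feed,
// much like the IsLatestVersion filter in the synchronizer above.
foreach (var package in repository.GetPackages().Where(p => p.IsLatestVersion).ToList())
{
    Console.WriteLine("{0} {1}", package.Id, package.Version);
}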

One thing to note: I also made the Windows Azure load balancer ignore the role that’s updating itself. This means a role instance that is synchronizing its contents will never be available in the load balancing pool, so no traffic is sent to the role instance during an update.

Why MyGet uses Windows Azure

MyGet - NuGet hosting private feed

Recently, one of the Tweeps following me started fooling around and hit one of my sweet spots: Windows Azure. Basically, he mocked me for using Windows Azure for MyGet, a website with enough users but not enough to justify the “scalability” aspect he thought Windows Azure was offering. Since Windows Azure is much, much more than scalability alone, I decided to do a quick write-up about the various reasons why we use Windows Azure for MyGet. And no, those are not scalability.

First of all, here’s a high-level overview of our deployment, which may illustrate some of the aspects below:


Costs

Windows Azure is cheap. Cheap as in cost-effective, not as in, well, sleazy. Many will disagree with me, but the cost perspective of Windows Azure can be really cheap in some cases as well as very expensive in others. For example, if someone asks me if they should move to Windows Azure and they now have one server running 300 small sites, I’d probably tell them not to move, as it will be a tough price comparison.

With MyGet we run 2 Windows Azure instances in 2 datacenters across the globe (one in the US and one in the EU). For $180.00 per month this means 2 great machines in two very distant regions of the globe. You can probably find those with other hosters as well, but will they manage your machines? Patch and update them? Probably not, for that amount. In our scenario, Windows Azure is cheap.

Feel free to look at the cost calculator tool to estimate usage costs.

Traffic Manager

Traffic Manager, a great (beta) product in the Windows Azure offering allows us to do geographically distributed applications. For example, US users of MyGet will end up in the US datacenter, European users will end up in the EU datacenter. This is great, and we can easily add extra locations to this policy and have, for example, a third location in Asia.

Next to geographically distributing MyGet, Traffic Manager also ensures that if one datacenter goes down, the DNS pool will consist of only “live” datacenters and thus provide datacenter fail-over. Not ideal, as the web application is served faster from a server that’s closer to the end user, but at least the application will not go down.

One problem we have with this is storage. We use Windows Azure storage (blobs, tables and queues) as those only cost $0.12 per GB. Distributing the application does mean that our US datacenter server has to access storage in the EU datacenter, which of course adds some latency. We try to reduce this using extensive caching on all sides, but it’d be nicer if Traffic Manager allowed us to set up geo-replication for storage as well. This only affects storing package metadata and packages. Reading packages is not affected by this because we’re using the Windows Azure CDN for that.

CDN

The Windows Azure Content Delivery Network allows us to serve users fast. The main use case for MyGet is accessing and downloading packages. OK, updating has some latency due to the restrictions mentioned above, but if you download a package from MyGet it will always come from a CDN node near the end user to ensure low latency and fast access. Given that the CDN is just a checkbox on the management pages, integrating with it is a breeze. The only thing we’ve struggled with is finding an acceptable caching policy to ensure stale data is limited.

Windows Azure AppFabric Access Control

MyGet is not one application. MyGet is three applications: our development environment, staging and production. In fact, we even plan for tenants so every tenant in fact is its own application. To streamline, manage and maintain a clear overview of which user can authenticate to which application via which identity provider, we use ACS to facilitate MyGet authentication.

To give you an example: our dev environment allows logging in via OpenID on a development machine. Production allows for OpenID on a live environment. In staging, we only use Windows Live ID and Facebook, whereas our production website uses different identity providers. Tenants will, in the future, be given the option to authenticate to their own ADFS server; we’re pretty sure ACS will allow us to simply configure that and ensure only tenant X can use that ADFS server.

ACS has been a great time saver and is definitely something we want to use in future projects. It really eases common authentication pains and acts as a service bus between users, identity providers and our applications.

Windows Azure AppFabric Caching

Currently we don’t use Windows Azure AppFabric Caching in our application. We currently use the ASP.NET in-memory cache on all machines, but we do feel the need for a distributed caching solution. While AppFabric Caching is appealing, we are thinking about deploying Memcached in our application instead because of the cost structure involved. But we might as well end up with Windows Azure AppFabric Caching anyway, as it integrates nicely with our current codebase.

Conclusion

In short, Windows Azure is much more than hosting and scalability. It’s the building blocks available such as Traffic Manager, CDN and Access Control Service that make our lives easier. The pricing structure is not always that transparent but if you dig a little into it you’ll find affordable solutions that are really easy to use because you don’t have to roll your own.