dotnet, gists, build

I’ve run across a situation similar to what many folks have described online: I have a solution containing a pretty modular application, and when I build it, I don’t get all of the indirect dependencies copied into the output.

I found a blog article with an MSBuild target in it that supposedly fixes some of this indirect copying nonsense, but as it turns out, it doesn’t actually go far enough.

My app looks something like this (from a reference perspective):

  • Project: App Host
    • Project: App Startup/Coordination
      • Project: Core Utilities
      • Project: Server Utilities
        • NuGet references and extra junk

The application host is where I need everything copied so it all works, but the NuGet references and extra junk way down the stack aren’t making it into the output folder, so there are runtime explosions.

I also decided to solve this with MSBuild, but using an inline code task. This task will…

  1. Look at the list of project references in the current project.
  2. Go find the project files corresponding to those project references.
  3. Calculate the path to the project reference output assembly and include that in the list of indirect references.
  4. Calculate the paths to any third-party references that include a <HintPath> (indicating the item isn’t GAC’d) and include those in the list of indirect references.
  5. Look for any additional project references – if they’re found, go to step 2 and continue recursing until there aren’t any project references we haven’t seen.

While it’s sort of the “nuclear option,” it means that my composable application will have all the stuff ready and in place at the Host level for any plugin runtime assemblies to be dropped in and be confident they’ll find all the platform support they expect.

Before I paste in the code, the standard disclaimers apply: Works on my box; no warranty expressed or implied; no support offered; YMMV; and so on. If you grab this and need to tweak it to fit your situation, go for it. I’m not really looking to make this The Ultimate Copy Paste Solution for Dependency Copy That Works In Every Situation.

And with that, here’s a .csproj file snippet showing how to use the task as well as the task proper:

<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="12.0" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <!-- All the stuff normally found in the project, then in the AfterBuild event... -->
  <Target Name="AfterBuild">
    <!-- Here's the call to the custom task to get the list of dependencies -->
    <ScanIndirectDependencies StartFolder="$(MSBuildProjectDirectory)"
                              StartProjectReferences="@(ProjectReference)"
                              Configuration="$(Configuration)">
      <Output TaskParameter="IndirectDependencies" ItemName="IndirectDependenciesToCopy" />
    </ScanIndirectDependencies>

    <!-- Only copy the file in if we won't stomp something already there -->
    <Copy SourceFiles="%(IndirectDependenciesToCopy.FullPath)"
          DestinationFolder="$(OutputPath)"
          Condition="!Exists('$(OutputPath)\%(IndirectDependenciesToCopy.Filename)%(IndirectDependenciesToCopy.Extension)')" />
  </Target>


  <!-- THE CUSTOM TASK! -->
  <UsingTask TaskName="ScanIndirectDependencies" TaskFactory="CodeTaskFactory" AssemblyFile="$(MSBuildToolsPath)\Microsoft.Build.Tasks.v12.0.dll">
    <ParameterGroup>
      <StartFolder Required="true" />
      <StartProjectReferences ParameterType="Microsoft.Build.Framework.ITaskItem[]" Required="true" />
      <Configuration Required="true" />
      <IndirectDependencies ParameterType="Microsoft.Build.Framework.ITaskItem[]" Output="true" />
    </ParameterGroup>
    <Task>
      <Reference Include="System.Xml"/>
      <Using Namespace="Microsoft.Build.Framework" />
      <Using Namespace="Microsoft.Build.Utilities" />
      <Using Namespace="System" />
      <Using Namespace="System.Collections.Generic" />
      <Using Namespace="System.IO" />
      <Using Namespace="System.Linq" />
      <Using Namespace="System.Xml" />
      <Code Type="Fragment" Language="cs">
      <![CDATA[
var projectReferences = new List<string>();
var toScan = new List<string>(StartProjectReferences.Select(p => Path.GetFullPath(Path.Combine(StartFolder, p.ItemSpec))));
var indirectDependencies = new List<string>();

bool rescan;
do{
  rescan = false;
  foreach(var projectReference in toScan.ToArray())
  {
    if(projectReferences.Contains(projectReference))
    {
      toScan.Remove(projectReference);
      continue;
    }

    Log.LogMessage(MessageImportance.Low, "Scanning project reference for other project references: {0}", projectReference);

    var doc = new XmlDocument();
    doc.Load(projectReference);
    var nsmgr = new XmlNamespaceManager(doc.NameTable);
    nsmgr.AddNamespace("msb", "http://schemas.microsoft.com/developer/msbuild/2003");
    var projectDirectory = Path.GetDirectoryName(projectReference);

    // Find all project references we haven't already seen
    var newReferences = doc
          .SelectNodes("/msb:Project/msb:ItemGroup/msb:ProjectReference/@Include", nsmgr)
          .Cast<XmlAttribute>()
          .Select(a => Path.GetFullPath(Path.Combine(projectDirectory, a.Value)));

    if(newReferences.Count() > 0)
    {
      Log.LogMessage(MessageImportance.Low, "Found new referenced projects: {0}", String.Join(", ", newReferences));
    }

    toScan.Remove(projectReference);
    projectReferences.Add(projectReference);

    // Add any new references to the list to scan and mark the flag
    // so we run through the scanning loop again.
    toScan.AddRange(newReferences);
    rescan = true;

    // Include the assembly that the project reference generates.
    var outputLocation = Path.Combine(Path.Combine(projectDirectory, "bin"), Configuration);
    var localAsm = Path.GetFullPath(Path.Combine(outputLocation, doc.SelectSingleNode("/msb:Project/msb:PropertyGroup/msb:AssemblyName", nsmgr).InnerText + ".dll"));
    if(!indirectDependencies.Contains(localAsm) && File.Exists(localAsm))
    {
      Log.LogMessage(MessageImportance.Low, "Added project assembly: {0}", localAsm);
      indirectDependencies.Add(localAsm);
    }

    // Include third-party assemblies referenced by file location.
    var externalReferences = doc
          .SelectNodes("/msb:Project/msb:ItemGroup/msb:Reference/msb:HintPath", nsmgr)
          .Cast<XmlElement>()
          .Select(a => Path.GetFullPath(Path.Combine(projectDirectory, a.InnerText.Trim())))
          .Where(e => !indirectDependencies.Contains(e));

    Log.LogMessage(MessageImportance.Low, "Found new indirect references: {0}", String.Join(", ", externalReferences));
    indirectDependencies.AddRange(externalReferences);
  }
} while(rescan);

// Expand to include pdb and xml.
var xml = indirectDependencies.Select(f => Path.Combine(Path.GetDirectoryName(f), Path.GetFileNameWithoutExtension(f) + ".xml")).Where(f => File.Exists(f)).ToArray();
var pdb = indirectDependencies.Select(f => Path.Combine(Path.GetDirectoryName(f), Path.GetFileNameWithoutExtension(f) + ".pdb")).Where(f => File.Exists(f)).ToArray();
indirectDependencies.AddRange(xml);
indirectDependencies.AddRange(pdb);
Log.LogMessage("Located indirect references:\n{0}", String.Join(Environment.NewLine, indirectDependencies));

// Finally, assign the output parameter.
IndirectDependencies = indirectDependencies.Select(i => new TaskItem(i)).ToArray();
      ]]>
      </Code>
    </Task>
  </UsingTask>
</Project>

Boom! Yeah, that’s a lot of code. And I could probably tighten it up, but I’m only using it once, in one place, and it runs one time during the build. Ain’t broke, don’t fix it, right?

Hope that helps someone out there.

I’m messing around with Boxstarter and Chocolatey and one of the things I wanted to do was install the various “Command Prompt Here” context menu extensions I use all the time. These extensions are .inf files and, unfortunately, there isn’t really any documentation on how to create a Chocolatey package that installs an .inf.

So here’s how you do it:

First, package the .inf file in the tools folder of your package alongside the chocolateyInstall.ps1 script. .inf files are pretty small anyway, and you want the file to be around for uninstall, so it’s best to just include it.

Next, set your chocolateyInstall.ps1 to run InfDefaultInstall.exe. That’s an easier way to install .inf files than the rundll32.exe way and it’ll work with Vista and later. So… no XP support. Aw, shucks.

Here’s a sample chocolateyInstall.ps1:

$packageName = 'YourPackageNameHere'
$validExitCodes = @(0)

try {
  # Locate the .inf sitting next to this script in the package's tools folder.
  $scriptPath = split-path -parent $MyInvocation.MyCommand.Definition
  $target = Join-Path $scriptPath "YourInfFileNameHere.inf"
  $infdefaultinstall = Join-Path (Join-Path $Env:SystemRoot "System32") "InfDefaultInstall.exe"

  # The first argument is the statements/arguments to pass; the second is the
  # executable to run - InfDefaultInstall.exe <path-to-inf>.
  Start-ChocolateyProcessAsAdmin "$target" "$infdefaultinstall" -validExitCodes $validExitCodes
  Write-ChocolateySuccess "$packageName"
} catch {
  Write-ChocolateyFailure "$packageName" "$($_.Exception.Message)"
  throw
}

To support uninstall, add a chocolateyUninstall.ps1 script. This will have to use rundll32.exe to uninstall, but it’s not too bad.

$packageName = 'YourPackageNameHere'
$validExitCodes = @(0)

try {
  $scriptPath = split-path -parent $MyInvocation.MyCommand.Definition
  $target = Join-Path $scriptPath "YourInfFileNameHere.inf"

  # rundll32.exe calls into setupapi to run the DefaultUninstall section of the .inf.
  Start-ChocolateyProcessAsAdmin "SETUPAPI.DLL,InstallHinfSection DefaultUninstall 132 $target" "rundll32.exe" -validExitCodes $validExitCodes
  Write-ChocolateySuccess "$packageName"
} catch {
  Write-ChocolateyFailure "$packageName" "$($_.Exception.Message)"
  throw
}

That’s it! Run the packaging and you’re set to go. This will support both installation and uninstallation of the .inf file.

Note: At one point I was having some trouble getting this to run on a Windows Server 2012 VM using the one-click Boxstarter execution mechanism. I found this while testing an install script that installs something like 40 things. After rolling back the VM to a base snapshot (before running the script) I’m no longer able to see the failure I saw before, so I’m guessing it was something else in the script causing the problem. This INF install mechanism appears to work just fine.

net

I was testing out some changes to versioning in Autofac. We have a MyGet feed, but the internal dependencies of the various NuGet packages point to the CI versions when they’re built, so it’s hard to stage a test of what things will look like when they’re released: you have to rename each .nupkg file to remove the “-CI-XYZ” build number, open each .nupkg file, change the internal .nuspec file to remove the “-CI-XYZ” build number info, and then re-zip everything. In testing, I had to do this a few times, so I scripted it.

I put everything in a folder structure like this:

  • ~/TestFeed
    • backup – contains all of the original .nupkg files (renamed without the “-CI-XYZ”)
    • msbuildcommunitytasks – contains the MSBuild Community Tasks set

Then I wrote up a quick MSBuild script for doing all the extract/update/rezip stuff. I could have used any other scripting language, but, eh, the batching and file scanning in MSBuild made a few things easy.

msbuild fixrefs.proj /t:Undo puts the original packages (from the backup folder) into the test feed folder.

msbuild fixrefs.proj does the unzip/fix/re-zip.

One of the challenges I ran into was that the zip task in MSBuild Community Tasks seemed to always want to add an extra level of folders into the .nupkg – I couldn’t get the original contents to live right at the root of the package. Rather than fight it, I used 7-Zip to do the re-zipping. I probably could have gotten away from the MSBuild Community Tasks entirely had I some form of sed on my machine because I needed that FileUpdate task. But… Windows. And, you know, path of least resistance. I think this was a five-minute thing. Took longer to write this blog entry than it did to script this.

Here’s “fixrefs.proj”:

<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="All" xmlns="http://schemas.microsoft.com/developer/msbuild/2003" ToolsVersion="4.0">
  <PropertyGroup>
    <MSBuildCommunityTasksPath>.</MSBuildCommunityTasksPath>
    <SevenZip>C:\Program Files\7-Zip\7z.exe</SevenZip>
  </PropertyGroup>
  <Import Project="$(MSBuildProjectDirectory)\msbuildcommunitytasks\MSBuild.Community.Tasks.Targets"/>
  <ItemGroup>
    <Package Include="*.nupkg"/>
  </ItemGroup>
  <Target Name="All">
    <MakeDir Directories="%(Package.Filename)" />
    <Unzip ZipFileName="%(Package.FullPath)" TargetDirectory="%(Package.Filename)" />
    <ItemGroup>
      <NuSpec Include="**/*.nuspec" />
    </ItemGroup>
    <FileUpdate Files="@(NuSpec)" Regex="(.)\-CI\-\d+" ReplacementText="$1" WarnOnNoUpdate="true" />
    <Delete Files="@(Package)" />
    <CallTarget Targets="ZipNewPackage" />
    <RemoveDir Directories="%(Package.Filename)" />
  </Target>
  <Target Name="Undo">
    <Delete Files="@(Package)" />
    <ItemGroup>
      <Original Include="backup/*.nupkg" />
    </ItemGroup>
    <Copy SourceFiles="@(Original)" DestinationFolder="$(MSBuildProjectDirectory)" />
  </Target>
  <Target Name="ZipNewPackage" Inputs="@(Package)" Outputs="%(Identity).Dummy">
    <Exec
      Command="&quot;$(SevenZip)&quot; a -tzip &quot;$(MSBuildProjectDirectory)\%(Package.Filename)%(Package.Extension)&quot;"
      WorkingDirectory="$(MSBuildProjectDirectory)\%(Package.Filename)" />
  </Target>
</Project>

net

Until now, Autofac assemblies have used a slow-changing assembly version, while the NuGet package and file versions followed standard semantic versioning.

The benefit of that approach is we could avoid some painful assembly redirect issues.

The drawback, of course, is that even minor changes (adding new functionality in a backwards-compatible way) can cause problems – one project uses version 3.0.0.0 of Autofac and works great, a different project also uses version 3.0.0.0 of Autofac but breaks because it needs some of that newer functionality. That’s hard to troubleshoot and pretty much impossible to fix. (It’s the wrong version of 3.0.0.0? That’s a new kind of dependency hell.)

As a compromise, we’ve switched to working sort of like MVC and Web API: the assembly version will change for major and minor (X.Y) changes but not for patch-level changes, while the NuGet package and file versions will change for every release.
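
As a concrete (and entirely hypothetical) example, a 3.1.2 patch release under this scheme would carry version attributes roughly like this:

// Hypothetical 3.1.2 patch release - the numbers are purely illustrative.
[assembly: System.Reflection.AssemblyVersion("3.1.0.0")]      // changes only when major/minor (X.Y) changes
[assembly: System.Reflection.AssemblyFileVersion("3.1.2.0")]  // changes with every release, including patches
// ...and the NuGet package for that release would be versioned 3.1.2.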

This initial switch will potentially be a little painful for folks since it means every Autofac package has to be re-issued to ensure assembly dependencies line up. After that, we should be running smooth again.

You’ll see a 0.0.1 update to the packages – all of those have the new assemblies with the new versions and proper prerequisite references. (Not entirely sure 0.0.1 was the right semantic version increment, but, well, c’est la vie.)

Really sorry about the bit of upgrade pain here. I had hoped we could sneak the change out on a package-by-package basis, but as each integration or extras package gets released, it gets its dependencies set and has assembly references, so we’d end up releasing everything a few times – the first time for when the version of the integration package changes; a second time for when core Autofac changes; and one more time for every time any other dependencies change. For packages like Autofac.Extras.Multitenant.Wcf (which relies on Autofac, Autofac.Integration.Wcf, and Autofac.Extras.Multitenant), it’d mean releasing it a minimum of four times just for the assembly reference changes. Best just to rip the bandage off, right? (I hope?)

NuGet should take care of the assembly redirect issues for you, but if you see assembly dependency conflict warnings in your build, it’s because you’ve not updated all of your Autofac packages.

Relevant GitHub issues: #502, #508

dotnet, aspnet

I just spent a day fighting these so I figured I’d share. You may or may not run into them. They do get pretty low-level, like, “not the common use case.”

PROBLEM 1: Why Isn’t My Data Serializing as XML?

I had set up my media formatters so the XML formatter would kick in and provide some clean looking XML when I provided a querystring parameter, like http://server/api/something?format=xml. I did it like this:

var fmt = configuration.Formatters.XmlFormatter;
fmt.MediaTypeMappings.Add(new QueryStringMapping("format", "xml", "text/xml"));
fmt.UseXmlSerializer = true;
fmt.WriterSettings.Indent = true;

It seemed to work on super simple stuff, but then it seemed to arbitrarily just stop - I’d get XML for some things, but others would always come back in JSON no matter what.

The problem was the fmt.UseXmlSerializer = true; line. I picked the XmlSerializer option because it can create prettier XML without all the extra namespaces and cruft of the standard DataContractSerializer.

UPDATE: I just figured out it’s NOT IEnumerable<T> that’s the problem - it’s an object way deep down in my hierarchy that doesn’t have a parameterless constructor.

When I started returning IEnumerable&lt;T&gt; values, that’s when it stopped working. I thought it was because of the IEnumerable&lt;T&gt;, but it turned out that the objects I was enumerating had a property whose type had, a couple of levels further down, a property whose type didn’t have a parameterless constructor. Yeah, deep in the sticks. No logging or exception handling to explain that one. I had to find it by stepping into the bowels of the XmlMediaTypeFormatter.
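
For illustration, here’s a contrived sketch of the shape of the problem (type names made up). XmlSerializer reflects over the whole type graph up front and refuses to proceed if anything in the graph lacks a parameterless constructor:

public class ReportRow
{
    public ReportCell Cell { get; set; }
}

public class ReportCell
{
    // No parameterless constructor - XmlSerializer won't handle this type,
    // even though it's buried a level (or five) down in the object graph.
    public ReportCell(string value)
    {
        this.Value = value;
    }

    public string Value { get; set; }
}

// Somewhere in the serialization pipeline: this throws an InvalidOperationException
// complaining that ReportCell does not have a parameterless constructor.
var serializer = new System.Xml.Serialization.XmlSerializer(typeof(ReportRow));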

PROBLEM 2: Why Aren’t My Format Configurations Being Used?

Somewhat related to the first issue - I had the XML serializer set up for that query string mapping, and I had JSON set up to use camelCase and nice indentation, too. But for some weird reason, none of those settings were getting used at all when I made my requests.

Debugging into it, I could see that on some requests the configuration associated with the inbound request message was all reset to defaults. What?

This was because of some custom route registration stuff.

When you use attribute routes…

  1. The attribute routing mechanism gets the controller selector from the HttpConfiguration object.
  2. The controller selector gets the controller type resolver from the HttpConfiguration object to which it holds a reference.
  3. The controller type resolver locates all the controller types for the controller selector.
  4. The controller selector builds up a cached list of controller name-to-descriptor mappings. Each descriptor gets passed a reference to the HttpConfiguration object.
  5. The attribute routing mechanism gets the action selector from the HttpConfiguration object.
  6. The action selector uses type descriptors from the controller type selector and creates a cached set of action descriptors. Each action descriptor gets passed a reference to the HttpConfiguration object and gets a reference back to the parent controller descriptor.
  7. The actions from the action selector get looked at for attribute route definitions and routes are built from the action descriptor. Each route has a reference to the descriptor so it knows what to execute.
  8. Execution of an action corresponding to one of these specific routes will use the exact descriptor to which it was tied.

Basically. There’s a little extra complexity in there I yada-yada’d away. The big takeaway here is that you can see all the bajillion places references to the HttpConfiguration are getting stored. There’s some black magic here.

I was trying to do my own sort of scanning for attribute routes (like on plugin assemblies that aren’t referenced by the project), but I didn’t want to corrupt the main HttpConfiguration object, so I created little temporary ones that I used during the scanning process just to help coordinate things.
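
Something along these lines - a rough sketch rather than the actual code, and the plugin-assembly resolver is a made-up name:

// Roughly the idea: scan a plugin assembly for attribute routes using a
// throwaway configuration so the "real" OWIN-registered one stays untouched.
var tempConfig = new HttpConfiguration();

// PluginAssembliesResolver is hypothetical - something that returns only the
// plugin assembly being scanned.
tempConfig.Services.Replace(
    typeof(System.Web.Http.Dispatcher.IAssembliesResolver),
    new PluginAssembliesResolver(pluginAssembly));

tempConfig.MapHttpAttributeRoutes();
tempConfig.EnsureInitialized();

// Then lift the discovered routes into the real route table... except the
// action descriptors behind those routes still hold references to tempConfig.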

Yeah, you can’t do that.

Those temporary, mostly-default configurations were getting used when my scanned routes executed, rather than the configuration I had registered with OWIN.

Once I figured all that out, I was able to work around it, but it took most of the day. It’d be nice if things like the action descriptor would automatically chain up to the parent controller descriptor (if present) to get configuration rather than holding its own reference. And so on, all the way up the stack, such that routes get their configuration from the route table, which is owned by the root configuration object. Set it and forget it.