dotnet, csharp, vs

UPDATE OCT 25 2022: I filed an issue about some of the challenges here and the weird <Compile Remove> workaround I had to use to get around the CS2002 warning. I got a good comment that explained some of the things I didn't catch from the original (very long) issue about strongly-typed resource generation. I've updated the code and the article to include the fixes, so this is now a complete example.


In the not-too-distant past I switched from using Visual Studio for my full-time .NET IDE to using VS Code. No, it doesn’t give me quite as much fancy stuff, but it feels a lot faster and it’s nice to not have to switch to different editors for different languages.

Something I noticed, though, was that if I updated my *.resx files in VS Code, the associated *.Designer.cs was not getting auto-generated. There is a GitHub issue for this and it includes some different solutions to the issue involving some .csproj hackery, but it’s sort of hard to parse through and find the thing that works.

Here’s how you can get this to work for both Visual Studio and VS Code.

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <!--
        Target framework doesn't matter, but this solution is tested with
        .NET 6 SDK and above.
    -->
    <TargetFrameworks>net6.0</TargetFrameworks>

    <!--
        This is required because OmniSharp (VSCode) calls the build in a way
        that will skip resource generation. Without this line, OmniSharp won't
        find the generated .cs files and analysis will fail.
    -->
    <CoreCompileDependsOn>PrepareResources;$(CompileDependsOn)</CoreCompileDependsOn>
  </PropertyGroup>

  <ItemGroup>
    <!--
        Here's the magic. You need to specify everything for the generated
        designer file - the filename, the language, the namespace, and the
        class name.
    -->
    <EmbeddedResource Update="MyResources.resx">
      <!-- Tell Visual Studio that MSBuild will do the generation. -->
      <Generator>MSBuild:Compile</Generator>
      <LastGenOutput>MyResources.Designer.cs</LastGenOutput>
      <!-- Put generated files in the 'obj' folder. -->
      <StronglyTypedFileName>$(IntermediateOutputPath)MyResources.Designer.cs</StronglyTypedFileName>
      <StronglyTypedLanguage>CSharp</StronglyTypedLanguage>
      <StronglyTypedNamespace>Your.Project.Namespace</StronglyTypedNamespace>
      <StronglyTypedClassName>MyResources</StronglyTypedClassName>
    </EmbeddedResource>

    <!--
        If you have resources in a child folder it still works, but you need to
        make sure you update the StronglyTypedFileName AND the
        StronglyTypedNamespace.
    -->
    <EmbeddedResource Update="Some\Sub\Folder\OtherResources.resx">
      <Generator>MSBuild:Compile</Generator>
      <LastGenOutput>OtherResources.Designer.cs</LastGenOutput>
      <!-- Make sure this won't clash with other generated files! -->
      <StronglyTypedFileName>$(IntermediateOutputPath)OtherResources.Designer.cs</StronglyTypedFileName>
      <StronglyTypedLanguage>CSharp</StronglyTypedLanguage>
      <StronglyTypedNamespace>Your.Project.Namespace.Some.Sub.Folder</StronglyTypedNamespace>
      <StronglyTypedClassName>OtherResources</StronglyTypedClassName>
    </EmbeddedResource>
  </ItemGroup>
</Project>
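Once the build generates the designer class, using the resources from code looks the same as it would with the classic Visual Studio generator. A quick sketch - the namespace matches the StronglyTypedNamespace above, and the resource key SomeGreeting is a made-up name standing in for whatever entries your .resx actually contains:

```csharp
using System;
using Your.Project.Namespace;

public static class Program
{
    public static void Main()
    {
        // The generated MyResources class exposes each .resx entry as a
        // static property. "SomeGreeting" is hypothetical - substitute a
        // key that exists in your .resx file.
        Console.WriteLine(MyResources.SomeGreeting);
    }
}
```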

Additional tips:

Once you have this in place, you can .gitignore any *.Designer.cs files and remove them from source control. They'll be regenerated by the build. If you leave them checked in, the version of the generator Visual Studio uses will fight with the version the CLI build uses and you'll get constant churn: the substance of the generated code is the same, but the file headers may differ.
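For example, a minimal .gitignore entry for this (adjust the pattern if you have any hand-maintained designer files you still want tracked):

```
# Designer files are generated into obj/ at build time by the .csproj settings above.
*.Designer.cs
```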

You can use VS Code file nesting to nest localized *.resx files under the main *.resx files with this config. Note you won’t see the *.Designer.cs files in there because they’re going into the obj folder.

{
  "explorer.fileNesting.enabled": true,
  "explorer.fileNesting.patterns": {
    "*.resx": "${capture}.*.resx, ${capture}.designer.cs, ${capture}.designer.vb"
  }
}

azure, tfs, build

Using the Sonatype Nexus IQ for Azure DevOps task in your build, you may see some warnings that look like this:

WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by com.google.inject.internal.cglib.core.$ReflectUtils$1 (file:/agent/_work/_tasks/NexusIqPipelineTask_4f40d1a2-83b0-4ddc-9a77-e7f279eb1802/1.4.0/resources/nexus-iq-cli-1.143.0-01.jar) to method java.lang.ClassLoader.defineClass(java.lang.String,byte[],int,int,java.security.ProtectionDomain)
WARNING: Please consider reporting this to the maintainers of com.google.inject.internal.cglib.core.$ReflectUtils$1
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release

The task, internally, just runs java to execute the Sonatype scanner JAR/CLI. The warnings here are because that JAR assumes JDK 8 and the default JDK on an Azure DevOps agent is later than that.

The fix is to select JDK 8 before running the scan.

# Install JDK 8
- task: JavaToolInstaller@0
  inputs:
    versionSpec: '8'
    jdkArchitectureOption: x64
    jdkSourceOption: PreInstalled

# Then run the scan
- task: NexusIqPipelineTask@1
  inputs:
    nexusIqService: my-service-connection
    applicationId: my-application-id
    stage: "Release"
    scanTargets: my-scan-targets

csharp

I was doing some AutoMapper-ing the other day, converting my data object…

public class Source
{
  public Source();
  public string Description { get; set; }
  public DateTimeOffset? ExpireDateTime { get; set; }
  public string Value { get; set; }
}

…into an object needed for a system we’re integrating with.

public class Destination
{
  public Destination();
  public Destination(string value, DateTime? expiration = null);
  public Destination(string value, string description, DateTime? expiration = null);
  public string Description { get; set; }
  public DateTime? Expiration { get; set; }
  public string Value { get; set; }
}

It appeared to me that the most difficult thing here was going to be mapping ExpireDateTime to Expiration. Unfortunately, this was more like a three-hour tour.

I started out creating the mapping like this (in a mapping Profile):

// This is not the answer.
this.CreateMap<Source, Destination>()
    .ForMember(dest => dest.Expiration, opt => opt.MapFrom(src => src.ExpireDateTime));

This didn’t work because there’s no mapping from DateTimeOffset? to DateTime?. I next made a mistake that I think I make every time I run into this and have to relearn it, which is that I created that mapping, too.

// Still not right.
this.CreateMap<Source, Destination>()
    .ForMember(dest => dest.Expiration, opt => opt.MapFrom(src => src.ExpireDateTime));
this.CreateMap<DateTimeOffset?, DateTime?>()
    .ConvertUsing(input => input.HasValue ? input.Value.DateTime : (DateTime?)null);

It took a few tests to realize that AutoMapper handles nullable for you, so I was able to simplify a bit.

// Getting closer - don't map nullable, map the base type.
this.CreateMap<Source, Destination>()
    .ForMember(dest => dest.Expiration, opt => opt.MapFrom(src => src.ExpireDateTime));
this.CreateMap<DateTimeOffset, DateTime>()
    .ConvertUsing(input => input.DateTime);

However, it seemed that no matter what I did, the Destination.Expiration was always null. For the life of me, I couldn’t figure it out.

Then I had one of those “eureka” moments when I was thinking about how Autofac handles constructors: It chooses the constructor with the most parameters that it can fulfill from the set of registered services.

I looked again at that Destination object and realized there were three constructors, two of which default the Expiration value to null. AutoMapper also handles constructors in a way similar to Autofac. From the docs about ConstructUsing:

AutoMapper will automatically match up destination constructor parameters to source members based on matching names, so only use this method if AutoMapper can’t match up the destination constructor properly, or if you need extra customization during construction.

That’s it! The answer is to pick the zero-parameter constructor so the mapping isn’t skipped.

// This is the answer!
this.CreateMap<Source, Destination>()
    .ForMember(dest => dest.Expiration, opt => opt.MapFrom(src => src.ExpireDateTime))
    .ConstructUsing((input, context) => new Destination());
this.CreateMap<DateTimeOffset, DateTime>()
    .ConvertUsing(input => input.DateTime);
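To sanity-check the final configuration, here's a sketch of the wiring end-to-end. The profile class name SourceDestinationProfile is made up - use whatever profile holds the CreateMap calls above:

```csharp
using System;
using AutoMapper;

// Hypothetical profile name; the CreateMap calls above live inside it.
var config = new MapperConfiguration(cfg => cfg.AddProfile<SourceDestinationProfile>());
var mapper = config.CreateMapper();

var source = new Source
{
    Value = "token",
    ExpireDateTime = DateTimeOffset.UtcNow.AddHours(1),
};

var destination = mapper.Map<Destination>(source);
// With ConstructUsing forcing the zero-parameter constructor,
// destination.Expiration carries the value instead of defaulting to null.
```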

Hopefully that will save you some time if you run into it. Also, hopefully it will save me some time next time I’m stumped because I can search and find my own blog… which happens more often than you might think.

halloween, costumes

This year we had 140 trick-or-treaters. This is pretty low for us, but I can’t complain since it’s the first year after COVID-19.

2021: 140 trick-or-treaters.

Average Trick-or-Treaters by Time Block

Year-Over-Year Trick-or-Treaters

Halloween was on a Sunday and it was chilly and windy. It had been raining a bit but didn’t rain during prime trick-or-treat time.

We didn’t hand out candy last year due to the COVID-19 outbreak. Looking up and down our street, it appeared a lot of people chose again this year to not hand out candy. We also saw some “take one” bowls on porches and various creative “candy torpedo tubes” that would send candy from the porch to the kid in a distanced fashion.

Cumulative data:

Year   6:00p-6:30p   6:30p-7:00p   7:00p-7:30p   7:30p-8:00p   8:00p-8:30p   Total
2006        52            59            35            16             0         162
2007         5            45            39            25            21         139
2008        14            71            82            45            25         237
2009        17            51            72            82            21         243
2010        19            77            76            48            39         259
2011        31            80            53            25             0         189
2013        28            72           113            80             5         298
2014        19            54            51            42            10         176
2015        13            14            30            28             0          85
2016         1            59            67            57             0         184
2019         1            56            59            41            33         190
2021        16            37            30            50             7         140

Our costumes this year:

  • Me: Prisoner Loki from the Disney+ Loki show
  • Jenn: Medusa
  • Phoenix: Cerise Hood from Ever After High

Me as Prisoner Loki

linux, mac, windows

I used to think setting up your PATH for your shell - whichever shell you like - was easy. But then I got into a situation where I started using more than one shell on a regular basis (both PowerShell and Bash) and things started to break down quickly.

Specifically, I have some tools that are installed in my home directory. For example, .NET global tools get installed at ~/.dotnet/tools and I want that in my path. I would like this to happen for any shell I use, and I have multiple user accounts on my machine for testing scenarios so I’d like it to ideally be a global setting, not something I have to configure for every user.

This is really hard.

I’ll gather some of my notes here on various tools and strategies I use to set paths. It’s (naturally) different based on OS and shell.

This probably won’t be 100% complete, but if you have an update, I’d totally take a PR on this blog entry.

Shell Tips

Each shell has its own mechanism for setting up profile-specific values. In most cases this is the place you’ll end up setting user-specific paths - paths that require a reference to the user’s home directory. On Mac and Linux, the big takeaway is to use /etc/profile. Most shells appear to interact with that file on some level.

PowerShell

PowerShell has a series of profiles that range from system level (all users, all hosts) through user/host specific (current user, current host). The one I use the most is “current user, current host” because I store my profile in a Git repo and pull it into the correct spot on my local machine. I don’t currently modify the path from my PowerShell profile.

  • On Windows, PowerShell will use the system/user path setup on launch and then you can modify it from your profile.
  • On Mac and Linux, PowerShell appears to evaluate the /etc/profile and ~/.profile, then subsequently use its own profiles for the path. On Mac this includes evaluation of the path_helper output. (See the Mac section below for more on path_helper.) I say “appears to evaluate” because I can’t find any documentation on it, yet that’s the behavior I’m seeing. I gather this is likely due to something like a login shell (say zsh) executing first and then having that launch pwsh, which inherits the variables. I’d love a PR on this entry if you have more info.

If you want to use PowerShell as a login shell, on Mac and Linux you can provide the -Login switch (as the first switch when running pwsh!) and it will execute sh to include /etc/profile and ~/.profile execution before launching the PowerShell process. See Get-Help pwsh for more info on that.

Bash

Bash has a lot of profiles and rules about when each one gets read. Honestly, it's pretty complex, and it seems to have a lot to do with backwards compatibility with sh along with the need for more flexibility and override support.

/etc/profile seems to be the way to globally set user-specific paths. After /etc/profile, things start getting complex - for example, if you have a ~/.bash_profile, your ~/.profile will be ignored.
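If you need a ~/.bash_profile for other reasons, one common workaround (a sketch, not something Bash requires) is to source ~/.profile from it so those settings aren't silently lost:

```
# ~/.bash_profile
# Bash reads this file for login shells and then skips ~/.profile,
# so pull ~/.profile in explicitly to keep path setup in one place.
if [ -f "$HOME/.profile" ]; then
  . "$HOME/.profile"
fi
```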

zsh

zsh is the default login shell on Mac. It has profiles at:

  • /etc/zshrc and ~/.zshrc
  • /etc/zshenv and ~/.zshenv
  • /etc/zprofile and ~/.zprofile

It may instead use /etc/profile and ~/.profile if it’s invoked in a compatibility mode. In this case, it won’t execute the zsh profile files and will use the sh files instead. See the manpage under “Compatibility” for details or this nice Stack Overflow answer.

I’ve set user-specific paths in /etc/profile and /etc/zprofile, which seems to cover all the bases depending on how the command gets invoked.

Operating System Tips

Windows

Windows sets all paths in the System => Advanced System Settings => Environment Variables control panel. You can set system or user level environment variables there.

The Windows path separator is ;, which is different than Mac and Linux. If you’re building a path with string concatenation, be sure to use the right separator.

Mac and Linux

I’ve lumped these together because, with respect to shells and setting paths, things are largely the same. The only significant difference is that Mac has a tool called path_helper that is used to generate paths from a file at /etc/paths and files inside the folder /etc/paths.d. Linux doesn’t have path_helper.

The file format for /etc/paths and files in /etc/paths.d is plain text where each line contains a single path, like:

/usr/local/bin
/usr/bin
/bin
/usr/sbin
/sbin

Unfortunately, path_helper doesn’t respect the use of variables - it will escape any $ it finds. This is a good place to put global paths, but not great for user-specific paths.

In /etc/profile there is a call to path_helper to evaluate the set of paths across these files and set the path. I’ve found that just after that call is a good place to put “global” user-specific paths.

if [ -x /usr/libexec/path_helper ]; then
  eval `/usr/libexec/path_helper -s`
fi

PATH="$PATH:$HOME/go/bin:$HOME/.dotnet/tools:$HOME/.krew/bin"

Regardless of whether you’re on Mac or Linux, /etc/profile seems to be the most common place to put these settings. Make sure to use $HOME instead of ~ to indicate the home directory. The ~ won’t get expanded and can cause issues down the road.

If you want to use zsh, you’ll want the PATH set block in both /etc/profile and /etc/zprofile so it handles any invocation.

The Mac and Linux path separator is :, which is different than Windows. If you’re building a path with string concatenation, be sure to use the right separator.
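On either platform, profiles can get sourced more than once (a login shell plus nested shells, say), so guarding against duplicate entries is a nice touch. A sketch of the Mac/Linux version, using the : separator:

```shell
# Append $HOME/.dotnet/tools to PATH only if it's not already present.
# ':' is the Mac/Linux separator; on Windows you'd build with ';' instead.
case ":$PATH:" in
  *":$HOME/.dotnet/tools:"*) ;;
  *) PATH="$PATH:$HOME/.dotnet/tools" ;;
esac
export PATH
```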