Special Folder Enum Values on Windows and Mac in .Net Core

On Windows it is common to use Environment.SpecialFolder to access certain folders instead of hard coding the paths or writing the appropriate lookup code for them. Now that code is being ported to the Mac using .Net Core, I thought I would document the values returned for the special folders when running .Net Core code on a Mac. Below is a table containing the data for a user named john on both a Windows machine and a Mac OSX machine.

| Enum Value | Windows Value | Mac Value |
| --- | --- | --- |
| AdminTools | C:\Users\john\AppData\Roaming\Microsoft\Windows\Start Menu\Programs\Administrative Tools | |
| ApplicationData | C:\Users\john\AppData\Roaming | /Users/john/.config |
| CDBurning | C:\Users\john\AppData\Local\Microsoft\Windows\Burn\Burn | |
| CommonAdminTools | C:\ProgramData\Microsoft\Windows\Start Menu\Programs\Administrative Tools | |
| CommonApplicationData | C:\ProgramData | /usr/share |
| CommonDesktopDirectory | C:\Users\Public\Desktop | |
| CommonDocuments | C:\Users\Public\Documents | |
| CommonMusic | C:\Users\Public\Music | |
| CommonPictures | C:\Users\Public\Pictures | |
| CommonProgramFiles | C:\Program Files\Common Files | |
| CommonProgramFilesX86 | C:\Program Files (x86)\Common Files | |
| CommonPrograms | C:\ProgramData\Microsoft\Windows\Start Menu\Programs | |
| CommonStartMenu | C:\ProgramData\Microsoft\Windows\Start Menu | |
| CommonStartup | C:\ProgramData\Microsoft\Windows\Start Menu\Programs\Startup | |
| CommonTemplates | C:\ProgramData\Microsoft\Windows\Templates | |
| CommonVideos | C:\Users\Public\Videos | |
| Cookies | C:\Users\john\AppData\Local\Microsoft\Windows\INetCookies | |
| Desktop | C:\Users\john\Desktop | /Users/john/Desktop |
| DesktopDirectory | C:\Users\john\Desktop | /Users/john/Desktop |
| Favorites | C:\Users\john\Favorites | /Users/john/Library/Favorites |
| Fonts | C:\WINDOWS\Fonts | /Users/john/Library/Fonts |
| History | C:\Users\john\AppData\Local\Microsoft\Windows\History | |
| InternetCache | C:\Users\john\AppData\Local\Microsoft\Windows\INetCache | /Users/john/Library/Caches |
| LocalApplicationData | C:\Users\john\AppData\Local | /Users/john/.local/share |
| MyDocuments | C:\Users\john\Documents | /Users/john |
| MyMusic | C:\Users\john\Music | /Users/john/Music |
| MyPictures | C:\Users\john\Pictures | /Users/john/Pictures |
| MyVideos | C:\Users\john\Videos | |
| NetworkShortcuts | C:\Users\john\AppData\Roaming\Microsoft\Windows\Network Shortcuts | |
| ProgramFiles | C:\Program Files | /Applications |
| ProgramFilesX86 | C:\Program Files (x86) | |
| Programs | C:\Users\john\AppData\Roaming\Microsoft\Windows\Start Menu\Programs | |
| Recent | C:\Users\john\AppData\Roaming\Microsoft\Windows\Recent | |
| Resources | C:\WINDOWS\resources | |
| SendTo | C:\Users\john\AppData\Roaming\Microsoft\Windows\SendTo | |
| StartMenu | C:\Users\john\AppData\Roaming\Microsoft\Windows\Start Menu | |
| Startup | C:\Users\john\AppData\Roaming\Microsoft\Windows\Start Menu\Programs\Startup | |
| System | C:\WINDOWS\system32 | /System |
| SystemX86 | C:\WINDOWS\SysWOW64 | |
| Templates | C:\Users\john\AppData\Roaming\Microsoft\Windows\Templates | |
| UserProfile | C:\Users\john | /Users/john |
| Windows | C:\WINDOWS | |

The code for this is pretty straightforward. I enumerate over the possible enum values and output them to a CSV.

static void Main(string[] args)
{
    StringBuilder sb = new StringBuilder();
    foreach (Environment.SpecialFolder sf in Enum.GetValues(typeof(Environment.SpecialFolder)))
    {
        sb.AppendLine($"{sf}, {Environment.GetFolderPath(sf)}");
    }

    // Note: FullName is the assembly's display name, not a path;
    // Location gives the file path we actually want here.
    var path = System.IO.Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location);
    var fileName = GetFileName();
    var filePath = System.IO.Path.Combine(path, $"{fileName}.csv");
    System.IO.File.WriteAllText(filePath, sb.ToString());
}

static string GetFileName()
{
    if (RuntimeInformation.IsOSPlatform(OSPlatform.Windows))
        return "Win";
    else if (RuntimeInformation.IsOSPlatform(OSPlatform.OSX))
        return "OSX";

    return "Linux";
}

If you just want to pull the code and run it, I have a copy up on GitHub. As you can see, some special folders have a direct mapping to Mac OSX and others do not. When you think about it, they all make sense. As long as you understand the values you will get back in the various scenarios, you can use the values that are appropriate for your application.
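As an illustration of that last point, here is a hedged sketch (the SettingsLocator class and appName parameter are my own, not from the original code) of using ApplicationData to build a per-user settings path that works on both platforms:

```csharp
using System;
using System.IO;

static class SettingsLocator
{
    // Returns a per-user, per-application settings directory.
    // Per the table above, ApplicationData resolves under
    // AppData\Roaming on Windows and under ~/.config on the Mac,
    // so the same code works on both platforms.
    public static string GetSettingsDirectory(string appName)
    {
        var root = Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData);
        var dir = Path.Combine(root, appName);
        Directory.CreateDirectory(dir); // no-op if it already exists
        return dir;
    }
}
```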

Dealing with Duplicate Assembly Attributes in .Net Core

When migrating a project from .Net Framework to .Net Standard, you may run into issues where you get duplicate assembly attributes. An example you might see is something like this:

Severity: Error
Code: CS0579
Description: Duplicate 'System.Reflection.AssemblyTitleAttribute' attribute
Project: MyProject
File: D:\Dev\MyProject\obj\Debug\netstandard2.0\MyProject.AssemblyInfo.cs
Line: 20

I ran into this because I have an AssemblyInfo.cs with an AssemblyTitleAttribute and the .Net Standard project is also generating the AssemblyTitleAttribute. After reading through some GitHub issues, it appears there are two ways around this issue.

First, I could remove the AssemblyInfo.cs that I already had in my project and add the appropriate attributes to the csproj file. Since I am converting a .Net Framework project in place, with the new solution and csproj living alongside the old ones that still rely on AssemblyInfo.cs, this will not work for me. I am left with the second option.
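For reference, the first option would look something like this sketch (property values are placeholders); the SDK maps these csproj properties to the corresponding generated assembly attributes:

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
    <!-- Becomes AssemblyTitleAttribute, AssemblyCompanyAttribute, etc. -->
    <AssemblyTitle>MyProject</AssemblyTitle>
    <Company>My Company</Company>
    <Product>MyProject</Product>
  </PropertyGroup>
</Project>
```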

Add settings to the csproj file to indicate that the various attributes should not be generated. Here is an example csproj file with a few of the attributes disabled:

<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
    <GenerateAssemblyTitleAttribute>false</GenerateAssemblyTitleAttribute>
    <GenerateAssemblyDescriptionAttribute>false</GenerateAssemblyDescriptionAttribute>
    <GenerateAssemblyCompanyAttribute>false</GenerateAssemblyCompanyAttribute>
    <GenerateAssemblyProductAttribute>false</GenerateAssemblyProductAttribute>
  </PropertyGroup>

</Project>

Once those settings are added to the csproj file, everything compiles and there are no duplicate attribute errors.

Simple Interprocess Communication in .Net Core using Protobuf

In the past, I have used WCF to handle inter-process communication (IPC) between the various separate components of my client applications. Since .Net Core doesn't yet support WCF server-side code, I had to look into alternatives. The two main approaches I have found are raw TCP (TcpListener) and NamedPipeServerStream. Others have covered the TCP approach, so I wanted to see what could be done with the NamedPipeServerStream.

I started reading the MSDN documentation on the basics of IPC with named pipes and found that it worked with .Net Core 2.0 with no changes. This is the true benefit of .Net Core: an older article about IPC is still completely relevant even though the code is now running on a Mac instead of a Windows machine. One thing I didn't like too much about that article was the StreamString class, and I wanted to see what I could do with plain old C# objects.

I decided to try out Protobuf, via the protobuf-net library. I had heard about it in the past and figured this would be a good foray into learning more about it. Since I was developing a client and a server, I decided I would start with the API and put it into a shared class project. So I created a Common project, added a reference to protobuf-net, and defined a Person class in there:

[ProtoContract]
public class Person
{
    [ProtoMember(1)]
    public string FirstName { get; set; }

    [ProtoMember(2)]
    public string LastName { get; set; }
}

Decorating the class with the protobuf attributes was all I had to do. Now that it is defined in the common class, I could write a server to serve up the data and a client to consume the data, each referencing the Common library. Next up, I created the server. Following the linked example above, I defined the server console application as:

static void Main(string[] args)
{
    Console.WriteLine("Starting Server");

    var pipe = new NamedPipeServerStream("MyTest.Pipe", PipeDirection.InOut);
    Console.WriteLine("Waiting for connection....");
    pipe.WaitForConnection();

    Serializer.Serialize(pipe, new Person() { FirstName = "Janey", LastName = "McJaneFace" });
}

I am simply defining the NamedPipeServerStream to listen on a pipe named "MyTest.Pipe". For now, the code immediately writes an object to the connection, which can then be read from the client side. This is achieved using protobuf's Serializer.Serialize method. To define the client, I need to use a NamedPipeClientStream to connect to the same pipe.

static void Main(string[] args)
{
    var pipe = new NamedPipeClientStream(".", "MyTest.Pipe", PipeDirection.InOut, PipeOptions.None);
    pipe.Connect();

    var person = Serializer.Deserialize<Person>(pipe);
    Console.WriteLine($"Person: {person.FirstName} {person.LastName}");
}

Once I connect, I use protobuf's Serializer.Deserialize method to read from the stream and deserialize the Person object. That's it: I am passing data from one process to another in .Net Core. If you are using .Net Core 1.x, you will need to explicitly add a reference to the System.IO.Pipes NuGet package. For both 1.x and 2.0, you also need a NuGet reference to protobuf-net.
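In csproj terms, those references look something like this (the version numbers are illustrative placeholders; use whatever current versions restore for you):

```xml
<ItemGroup>
  <!-- protobuf-net provides Serializer.Serialize / Serializer.Deserialize -->
  <PackageReference Include="protobuf-net" Version="2.3.2" />
  <!-- Only needed on .Net Core 1.x; pipes ship in the 2.0 shared framework -->
  <PackageReference Include="System.IO.Pipes" Version="4.3.0" />
</ItemGroup>
```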

Even though this is a basic example, it does demonstrate the functionality and could be easily extended to handle much more complex scenarios.
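For example, to exchange more than one message over the same pipe, protobuf-net's length-prefixed helpers can frame each object. This is a sketch under my own naming (the Framing class is hypothetical; Person is the class defined above, and the pipe streams come from the server and client code):

```csharp
using System.IO;
using ProtoBuf;

static class Framing
{
    // Server side: write several framed messages to one connected stream.
    public static void WriteTwo(Stream pipe)
    {
        Serializer.SerializeWithLengthPrefix(pipe,
            new Person { FirstName = "Janey", LastName = "McJaneFace" }, PrefixStyle.Base128);
        Serializer.SerializeWithLengthPrefix(pipe,
            new Person { FirstName = "John", LastName = "Doe" }, PrefixStyle.Base128);
    }

    // Client side: read the messages back in the same order.
    public static (Person, Person) ReadTwo(Stream pipe)
    {
        var first = Serializer.DeserializeWithLengthPrefix<Person>(pipe, PrefixStyle.Base128);
        var second = Serializer.DeserializeWithLengthPrefix<Person>(pipe, PrefixStyle.Base128);
        return (first, second);
    }
}
```

The length prefix tells the reader where each message ends, which a bare Serialize/Deserialize pair cannot do on a long-lived stream.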

A fully working solution for this can be found as a sample GitHub project. There appear to be other .Net Core/Standard projects (1, 2) attempting to better facilitate IPC, and it will be interesting to see how they mature with the ecosystem. My hope is that some flavor of WCF server makes its way over to .Net Core, to make porting code that much easier.

Minified Javascript not Deploying With .Net Core Projects Created in Visual Studio 2017

I was working on a very simple site that I created using the new .Net Core project templates in Visual Studio 2017. Everything worked great on my machine, but when I deployed to Azure, none of my custom JavaScript or CSS was working properly. What gives?

After doing some digging, I found that the deployed site was trying to use the site.min.js and the site.min.css, but those files weren’t deployed to Azure. After googling a bit, I found that it was probably an issue with my bundling and when I opened the bundleconfig.json, Visual Studio tried to be helpful:

Extensions are available...

Of course, I ignored the extension warning and comment at first, but the extension that is missing solves the exact problem I was having. The link in the comments has an article on how to enable and configure bundling in ASP.NET Core.
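For context, the extension drives bundling from a bundleconfig.json at the project root. A minimal configuration might look like this (the file names shown are the template defaults; adjust them to your project):

```json
[
  {
    "outputFileName": "wwwroot/css/site.min.css",
    "inputFiles": [ "wwwroot/css/site.css" ]
  },
  {
    "outputFileName": "wwwroot/js/site.min.js",
    "inputFiles": [ "wwwroot/js/site.js" ],
    "minify": { "enabled": true }
  }
]
```

With the extension installed, the minified outputs are generated on build, so they exist on disk and get picked up by the deployment.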

So, while the Visual Studio team could work on making this a better experience, I have to remember to read the warnings and comments that are left in the generated code. They are there for a reason.

Zero to CI in Thirty Minutes or Less (or It's Free!)

One of the biggest complaints I hear from teams about CI is that it is too much work. While getting it perfect can be a lot of work, getting started couldn’t be easier.

I am going to demonstrate continuously building a C# project using Jenkins as our CI host.

To get started we’ll need a machine to be our build agent. I am going to create a VM in Azure to be my build agent. Since I am building a C# project, I am going to choose the template that already has Visual Studio 2017 installed on it. But this could be any machine. It could be an extra machine you have sitting under your desk or a virtual machine in your own datacenter.

Azure template image

Once the machine is created, you can connect to it and install Jenkins. Start by downloading and running the windows installer.

Once installed, a browser window will open that you can use to administer Jenkins. It may open before Jenkins has a chance to start, so you may need to refresh the page. Follow the instructions on the page to unlock Jenkins for first time use.

You will be prompted to install plugins. Plugins are the lifeblood of the Jenkins ecosystem and there are plugins to do pretty much everything. You can start by installing the suggested plugins.

Customize Jenkins

This will install a handful of plugins to get us started. Once complete, you will be asked to set up an admin user. Go through the steps of setting up the user, and then you can start using Jenkins. At this point, Jenkins is ready to go.

Jenkins is Ready

We are going to create a job to build a simple C# library I am hosting on GitHub to demonstrate Jenkins builds. Now we can create a new job and give it a name. The easiest project to configure in Jenkins is a freestyle project. This allows you to do any type of build you want by combining source control and scripts to accomplish your task (along with features from whatever plugins you have installed).

Jenkins freestyle project

Next, we will configure the project to pull from GitHub and give it a little batch script to build the project and run the tests. In the Source Code Management section, we will select Git and enter our repository URL.

Jenkins Source Control Configuration

Then, in the Build section, we will set up our script to build the project. Since this is a .Net Core project and I have the 2017 core tools installed, I can simply specify a batch command with the following script:

dotnet restore
dotnet build
cd UnitTests
dotnet test

Save the job, then click the Build Now button on the left-hand side. This will start a job, which will appear in the Build History portion of the page. You can click on the build number to get more information about the build; the most useful information in there is the console. If you click on Console Output, you can see the full console output of your build. Since this is your first build on the machine, you will see the local package cache being populated for the first time, then the project build output, and finally the tests running and passing.

At this point, we have a build server that builds our project on demand, but not continuously. To set that up, we can go back to the project page and select the Configure option. We’ll use the Poll SCM option to configure the job to poll for changes from GitHub every 15 minutes. In the Schedule box, enter the following value

H/15 * * * *

The format for this schedule mainly follows the cron syntax. Clicking on the ? next to the schedule box will give you plenty of examples and information on how you might want to configure your job. Save the job and you are good to go.
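A few schedule values you might try (the H token hashes the job name into a time slot, so many jobs with the same schedule don't all fire at once):

```
H/15 * * * *    poll every fifteen minutes
H/5 * * * *     poll every five minutes
H 2 * * *       once a day, during the 2 AM hour
H H * * 1-5     once a day on weekdays
```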

You now have a build server that will build within 15 minutes of a change to the repository. Congratulations, you have a CI server. As you can see, getting started with CI is not hard, and there is really no excuse for not having some sort of automation around your builds.


Q: But John, why did you ignore webhooks?

A: While setting up webhooks is very straightforward, securing a Jenkins installation to be accessible via the internet is another thing altogether. I decided that using polling was a better approach than teaching people to set up an insecure Jenkins installation and having them get hacked. I'll probably have a few more posts where I cover setting up webhooks for your Jenkins jobs.
