Speed up development in a NuGet package-centric solution

As microservices architectures become more popular, so does the use of NuGet as a way of sharing code amongst separate services. The last few projects I’ve worked on have typically contained a number of “Core” NuGet packages with shared code and interfaces that are consumed by one or more services in the solution. Our build pipeline publishes these NuGet packages to an in-house NuGet server.

This can cause a lot of developer friction when you’re working in one of the services and you decide you need to change one of the Core projects. Typically you’ll

  1. make the change to Core
  2. commit it
  3. (Perhaps raise a PR and wait for approval)
  4. wait for the build to publish new versions of the package
  5. and finally update the package in your service.

That’s probably at least a 20-minute wait for all that to happen. And then you realise that your change didn’t quite work, so you have to go through the cycle again with another change.

Well, here’s a shortcut to that cycle, and it’s pretty darn simple:

  1. make the change in Core and compile it locally
  2. copy the resulting dll(s) to your local NuGet packages feed folder. NOT the bin folder of your service, or the packages folder of your service. On my machine the local NuGet packages feed folder is C:\Users\matt\.nuget\packages\ – see the sketch below.
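To make step 2 concrete, here’s a minimal sketch of the copy. Everything in it is illustrative – the package id, version, target framework and paths are assumptions, so have a look inside your own .nuget\packages folder for the real layout (it follows the pattern <package-id>\<version>\lib\<target-framework>\) before overwriting anything.

// Minimal sketch with hypothetical paths - the package id, version and target
// framework below are assumptions, so check your own packages folder first.
using System.IO;

class CopyCoreDll
{
    static void Main()
    {
        var source = @"C:\src\MySolution\Core\bin\Debug\netstandard2.0\MyCompany.Core.dll";
        var target = @"C:\Users\matt\.nuget\packages\mycompany.core\1.2.3\lib\netstandard2.0\MyCompany.Core.dll";

        // Overwrite the dll that was restored from the NuGet feed
        File.Copy(source, target, overwrite: true);
    }
}

Of course a plain copy-paste in Explorer does the same job – the important part is that the destination is the dll inside the restored package folder, not your service’s bin or packages folder.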

As soon as the dll is copied, Visual Studio detects the change – you don’t need to tell it to update the NuGet packages or even compile!

Another surprise bonus of this – while debugging you can step into the Core code! You don’t need to mess around copying PDBs or anything. I don’t know how it works, but Visual Studio seems to somehow know that the DLL you’ve copied into the NuGet folder came from your local machine, and thus it knows where the source code for it is.

Obviously once you’re happy that your changes to Core are working correctly, then you can commit them and consume an updated version of the package in your service as usual.

Anyway, hopefully this will speed up your local development as much as it has mine.


How to include Swashbuckle .xml files in your Service Fabric project

If you’re using Swashbuckle’s IncludeXmlComments() option, then your build needs to output an XML file containing the various comments.
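For context, the consuming side looks something like this – a minimal sketch assuming a 2017-era Swashbuckle.AspNetCore setup in Startup.ConfigureServices, where the API title and the WebHost.xml file name are placeholders that should match the <DocumentationFile> configured further down:

// Sketch only - wiring the generated XML comments file into Swashbuckle.
// The title, version and file name are assumptions; adjust them to your project.
services.AddSwaggerGen(c =>
{
    c.SwaggerDoc("v1", new Info { Title = "My API", Version = "v1" });

    var xmlPath = Path.Combine(AppContext.BaseDirectory, "WebHost.xml");
    if (File.Exists(xmlPath))
    {
        c.IncludeXmlComments(xmlPath);
    }
});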

By default, the XML file will not be included in your Service Fabric deployment. To get it to work, one way is to add the following section to your Web project’s .csproj:

  <Target Name="PrepublishScript" BeforeTargets="PrepareForPublish">
    <ItemGroup>
      <DocFile Include="bin\x64\$(Configuration)\$(TargetFramework)\win7-x64\*.xml" />
    </ItemGroup>
    <Copy SourceFiles="@(DocFile)" DestinationFolder="$(PublishDir)" SkipUnchangedFiles="false" />
  </Target>

You should also make sure that your XML file is output for all of these build configurations, again in your Web project’s .csproj:

  
  <PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|AnyCPU'">
    <DocumentationFile>bin\Debug\net46\win7-x64\WebHost.xml</DocumentationFile>
    <NoWarn>1701;1702;1705;1591</NoWarn>
  </PropertyGroup>

  <PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'">
    <DocumentationFile>bin\Debug\net46\win7-x64\WebHost.xml</DocumentationFile>
    <NoWarn>1701;1702;1705;1591</NoWarn>
  </PropertyGroup>

  <PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|AnyCPU'">
    <DocumentationFile>bin\Release\net46\win7-x64\WebHost.xml</DocumentationFile>
    <NoWarn>1701;1702;1705;1591</NoWarn>
  </PropertyGroup>

  <PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'">
    <DocumentationFile>bin\Release\net46\win7-x64\WebHost.xml</DocumentationFile>
    <NoWarn>1701;1702;1705;1591</NoWarn>
  </PropertyGroup>

Thanks to vdevappa’s comment here https://github.com/dotnet/sdk/issues/795#issuecomment-306202030

Run a Service Fabric solution locally without deploying to Service Fabric

This is a similar piece to another post of mine from a few years ago, Run a Windows Azure cloud service locally without the Azure compute emulator.

So you’re working on a Service Fabric application which has an ASP.NET Core Web API host project. I find the debugging experience painful, for two reasons:

  1. Time to start debugging the project is a minimum of 45 seconds, every time, because the app gets deployed to a local Service Fabric cluster, which takes forever.
  2. You need to remember to run Visual Studio as administrator in order for the above local deployment to succeed.

If either of these things bugs you, then here’s a possible solution. Once we’re done you’ll be able to set the Web project in your solution as the StartUp project instead of the Service Fabric application, for much faster debugging, and you’ll no longer need to run VS as admin.

First, change the Program.cs in the Web project:

private static void Main()
{
	if (UseServiceFabric())
	{
		StartServiceFabric();
	}
	else
	{
		StartWebHost();
	}
}

private static bool UseServiceFabric()
{
	var webHostBuilder = new WebHostBuilder();
	var environment = webHostBuilder.GetSetting("environment");

	return environment != "Development";
}

private static void StartWebHost()
{
	var builder = new WebHostBuilder()
		.UseKestrel()
		.UseContentRoot(Directory.GetCurrentDirectory())
		.UseStartup<Startup>();

	var host = builder.Build();
	host.Run();
}

private static void StartServiceFabric()
{
	try
	{
		// The ServiceManifest.XML file defines one or more service type names.
		// Registering a service maps a service type name to a .NET type.
		// When Service Fabric creates an instance of this service type,
		// an instance of the class is created in this host process.

		ServiceRuntime.RegisterServiceAsync("Web1Type",
			context => new Web1(context)).GetAwaiter().GetResult();

		ServiceEventSource.Current.ServiceTypeRegistered(Process.GetCurrentProcess().Id, typeof(Web1).Name);

		// Prevents this host process from terminating so services keep running.
		Thread.Sleep(Timeout.Infinite);
	}
	catch (Exception e)
	{
		ServiceEventSource.Current.ServiceHostInitializationFailed(e.ToString());
		throw;
	}
}

The “environment” setting that UseServiceFabric() checks comes from the ASPNETCORE_ENVIRONMENT variable, so if that’s set to Development the app won’t use Service Fabric at all and will just use a plain ol’ ASP.NET Core WebHostBuilder to start the web host.

You’ll also need to change your Debug target to be the Web project, instead of IIS Express, via the VS Standard toolbar.

And just like that the app startup time has shrunk from around 40 seconds to around 5 seconds. Or from an unbearable 100 seconds for the application the team I just joined is working on.

Add an authorization header to your swagger-ui with Swashbuckle

Out of the box there’s no way to add an Authorization header to your API requests from swagger-ui. Fortunately (if you’re using ASP.NET), Swashbuckle 5.0 is extensible, so it’s very easy to add a new IOperationFilter to do it for us:

public class AddAuthorizationHeaderParameterOperationFilter : IOperationFilter
{
    public void Apply(Operation operation, SchemaRegistry schemaRegistry, ApiDescription apiDescription)
    {
        // Operations with no other parameters have a null collection, so create it first
        if (operation.parameters == null)
        {
            operation.parameters = new List<Parameter>();
        }

        operation.parameters.Add(new Parameter
        {
            name = "Authorization",
            @in = "header",
            description = "access token",
            required = false,
            type = "string"
        });
    }
}

Now all you need to do is register it in your EnableSwagger call:

configuration
    .EnableSwagger(c =>
    {
        c.SingleApiVersion("v1", "Commerce Services - Discounts");

        foreach (var commentFile in xmlCommentFiles)
        {
            c.IncludeXmlComments(commentFile);
        }

        c.OperationFilter<ExamplesOperationFilter>();
        c.OperationFilter<AddAuthorizationHeaderParameterOperationFilter>();
    })
    .EnableSwaggerUi(config => config.DocExpansion(DocExpansion.List));

Once that’s done it’ll give you an input field where you can paste your Authorization header. Don’t forget to add the word “bearer” if you’re using a JWT token.
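To be clear about what that field does: whatever you paste is sent as a standard Authorization header on each “try it out” request – roughly equivalent to this hypothetical HttpClient call (the URL and token are placeholders):

// Illustration only - this is effectively what swagger-ui sends once you've
// pasted "bearer <your JWT>" into the new Authorization field.
using System.Net.Http;
using System.Net.Http.Headers;

var client = new HttpClient();
client.DefaultRequestHeaders.Authorization =
    new AuthenticationHeaderValue("bearer", "your-jwt-goes-here");
var response = client.GetAsync("https://localhost/api/values").GetAwaiter().GetResult();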

Edit: I wrote this more than a year ago using Swashbuckle 5.2.1, it may not work with later versions.

The decline of Apple

Back in the 90s hardly anyone used Macs or Apple products. They had a small foothold in schools and graphic design shops but that was about it. The iPod was the beginning of their post-millennium rise – the 3rd generation iPod released in 2003 was the first piece of Apple hardware I’d ever wanted, and I eventually purchased a 4th gen iPod in 2004.

[Image: The fourth generation iPod “photo” model]

A few years later in 2007 the iPhone came out and completely changed what a smartphone was. Prior to the iPhone the best smartphone was probably a Blackberry, with its clunky little keyboard and tiny screen. But the iPhone with its giant glass screen and intuitive gestures was revolutionary.

[Image: The iPhone 3GS (2009), the first iPhone I owned]

There are hardly any products I can think of where I could afford to buy the best in the world. Take cars for example. I can’t afford a brand new Ferrari. Or TVs – I couldn’t afford (or couldn’t justify buying) a top of the line whizz-bang TV. But a phone – yes! The iPhone at the time was the best phone money could buy. Not that I gave a shit about having an awesome phone, all I wanted at the time was to have Wikipedia in my pocket so that I could be a know-all at parties.

Both the iPod and later the iPhone made people, especially us geeks, open our eyes to how good Apple’s products could be. The Macbook Air made us realise that the lightest and sexiest laptop/notebook was also made by Apple. Apple started shipping with Intel processors! Within a couple of years Apple laptops were everywhere, even at conferences for us Windows developers.

I never considered myself an Apple fanboy, but here I am in 2017 typing this on my trusty Macbook Air, with my aging iPhone 5 alongside, and an iPad sitting on the windowsill (which I still never use). At the time of each purchase it wasn’t that I wanted to buy an Apple product, it was just that I wanted the best laptop/phone/tablet on the market at the time, and each of those happened to be made by Apple.

Amongst my geeky friends it was pretty much the same pattern. Mac laptops and iPhones for the most part. Not all though – some swore by Android phones.

In the last couple of years though, Apple have made a few key mistakes with their products which have seen my geek friends desert them, and I think it won’t be long before I too leave Apple products behind.

iWatch sucked

The first big Apple letdown was the Apple Watch. Rumours had abounded for years that Apple would be bringing out a smartwatch that would change the world, just like the iPhone did. I was excited about it. Then when it was finally released, it was expensive and gimmicky. Hardly any of my geek friends bought one (hi Andy J, Alan, Alex).

No headphone port

Yep, another mistake was omitting the headphone port from the iPhone 7. Not a single iPhone buyer thought that was a good idea or a step forward. That small error was enough for us geeks to doubt that Apple knew what they were doing, and look at Android phones.

Downgrading the Macbook Pro

The killer mistake though was 2016’s Macbook Pro:

  • More expensive
  • No standard USB (USB-A) ports
  • No HDMI port
  • No SD card reader
  • No magsafe power
  • Similar spec CPU, memory and storage to the 2015 model
  • Lame touch bar

Although most users don’t use the F1-F12 keys on the top row of a QWERTY keyboard, we developers DO use them, so getting rid of those keys was a big deal.

See Benjamin Button reviews the new Macbook Pro

Now what?

In summary, Apple haven’t released anything amazing since Steve Jobs died.

So what’s a geek to buy in 2017 then? I don’t know – it’s not as straightforward as it was a couple of years ago. I usually go on my geek friends’ recommendations – on the phone front, Google Pixel phones are well regarded. iPhones are hanging on by their fingernails.

On the laptop front, NONE of my friends are buying new Macbooks. Microsoft(!) Surface Books or Surface Pros are looking like a good option. Windows laptops are making a comeback.

What now for Apple

For now, they’ve lost the geek crowd, and in technology, where the geeks lead, the world follows. Apple needs to release a new killer product to get us back. Or just drop the prices on their bloody phones and make a decent laptop again!

They have enough $$$ in the bank that they’re not gonna die any time soon, but until they win the geek crowd back I predict shrinking profits, maybe even losses, and a dropping share price for Apple (currently USD$132).

On music consumption

This morning I was thinking about how my music listening has changed over the years. The very first album I bought was Guns N’ Roses Appetite for Destruction, which I bought on a cassette tape with my pocket money for something ridiculous like $15 in 1987.


I didn’t spend too much money over the years on cassettes cos back then we all used to dub them – pretty much everyone had a stereo with 2 tape decks – one for playing and the other for recording.


A few years later CDs came out and wow, music got even more expensive. But the quality was worth it – no more tape hiss. However, I still didn’t spend a whole lot on music – I was still at high school, so the tape dubbing continued – except now the source was a digital quality CD instead of a tape :-) I remember getting Guns N’ Roses Use Your Illusion 1 AND 2 off my mate’s older brother (hi Ed!) this way.

Come 1997 (woah, 20 years ago) and I was studying computer science at university, and a friend showed me mp3s (hi Flip!). At first I didn’t get it – HDD space was expensive, so I was like “nah, I don’t really wanna fill my hard drive with music”. But once I realised how easy it was to share music (no more tape dubbing!) I was sold. So I got me a 2GB Bigfoot hard drive and I was like – wow, loads of space, bring it on.

[Image: Bigfoot hard drive]

This was the start of a period of CD borrowing and ripping – where you’d “rip” a CD on your computer to convert it into mp3s. Back then, on our 2x speed CD-ROM drives and Pentium 1 processors, it would take about half an hour to rip the CD and then I think most of the night(!?) to compress the ripped CD to an mp3 album. Playing an mp3 on your computer (using Winamp) was very CPU intensive – it would take pretty much 100% of your CPU, so you couldn’t use the machine for anything else while it was playing.

I fleshed out my music collection by borrowing friends’ CDs and ripping them – then you’d meet up with a fellow ripper (hi Trent!) and share mp3s with them, by unplugging the hard drive from your computer, taking it over to their house, and plugging it into their computer. Even though the mp3s were digital copies, so in theory perfect, every now and then you’d come across a track with pops and clicks in it – from when someone with a crappy CD-ROM drive had ripped it.

1999 and Napster came along. Peer-to-peer sharing of music over the Internet! Almost any album you wanted, available to download, for free! Although we only had 56k modems to connect to the Internet, it still meant you could download an entire album in about 2 hours – much quicker and easier than ripping! And then we started running the Linux version of Napster on the University’s computers, so we could pull down an album in ten or twenty minutes.

A few years later I was working my first post-Uni job so now I had money to spend. I did have a guilty conscience over all that music I’d ripped off so I started buying CDs – which would get converted to mp3s straight away, then the CD would never get played again. I’d moved to Auckland and I started going to concerts – my loose rule was – if I have an album by a band, and they come to Auckland, then I should go see them live. That was my way of supporting an artist.

Anyway, back to the mp3s, in the early 2000s. I was never a straight-up music hoarder. I didn’t want to have any old shite in my collection – it had to be good, memorable music, that I would still like in a few years. I became extremely pedantic with my organising and naming of the mp3s. Every mp3 had to be named correctly, with full metadata (ID3v1 and V2 tags), the correct genre, album art. I used JRiver Media Center software to manage it all. That program could do everything – I even used my newfound VB skills to write a plugin for it.

[Image: JRiver Media Center]

Every time I’d get an album (usually from Napster, or from copying friends’ music over our corporate network (hi Deano!)), I’d spend ages scrutinizing it – do I like it? Will I still like it a few years from now? If this album came on randomly, would I listen to it? If it came on publicly, would I be embarrassed by it? If it met those criteria then it was worthy enough to be added to my library. I’d almost always have to rename it correctly and populate all the metadata. All of which took time and effort.

All this time I was still constrained to listening to these mp3s through a PC – which was OK. I had a PC hooked up to a stereo in my bedroom, and then at work I’d be working on a PC wearing headphones all day. But mp3s on the go weren’t yet possible for me. Early portable mp3 players were clunky, prone to crashing, had crap interfaces, or just didn’t have the capacity to store my entire collection.

The first game changer was Apple’s iPod, released in 2001. I remember when it came out – it took the mp3 world by storm, mainly because it was pretty and easy to use. It solved the problem of navigating through 1000 songs thanks to its scrollwheel interface. The downside was that you had to have a Mac to use it – and absolutely no one had one of those. Back then Apple was dead. No one had a Mac desktop or laptop. The only place I’d seen them was in the Uni’s computer labs.


If only I’d bought Apple shares back then! I knew the iPod was a hit, but I didn’t think to invest in the company. It turned out to be the beginning of Apple’s turnaround. Back then their shares were around $1.50; now they’re $117.

The first iPod wasn’t big enough for me though – it was 5GB and I probably had around 20GB of music by then. But Moore’s law caught up to my music collection, and the 4th generation iPod with 60GB capacity (and Windows compatibility) was the first one I bought, in 2004. At last, my music everywhere.

In 2010 I moved to the UK and first heard about Spotify, which some of my friends were using. I ignored it for a few years, because I was pretty happy with my mp3 collection, and because I thought they might get shut down by the music industry (or just go broke), as so many other online music services had.

2014 and I started using Spotify at work, just to try it out. I realised that their playlists solve the problem of what to listen to – when you’ve got hundreds of albums to choose from, picking one can be tough. Spotify then became my go-to – I installed it on my iPad and it became our main source of music in the house.

So, alas, my carefully curated music collection became obsolete. Here it is, on my laptop right now, still frozen in 2014. The _2014 folder is for new music awaiting curation.

[Screenshot: the mp3 collection folder, frozen in 2014]

So, I’m a happy Spotify user. Until last week. My girlfriend bought me an Amazon Echo for Christmas, and the voice interface has me sold. I say “Alexa, play music”. And it replies “OK, here’s a station you might like: Adele”. And I’m usually fine with what it (she?) chooses. Alexa has further removed the choice – I don’t even need to think about which playlist to play. I just say “play music” and that’s it. I’ve gone from an avid music collector before to not really caring what I listen to now. Life’s too short to be tagging mp3s.

Anyway, I didn’t intend for this post to be so long – I was just going to write how Alexa has killed my mp3 collection and ended up going on a trip down memory lane. I’ve added a new “Musing” category to this blog as I have a few more topics in mind.

OnePlus 2 = poo

I’ve been a long time iPhone user, first with a 3GS in 2009 and then an iPhone 5 in 2012. So at 3+ years old my iPhone 5 was getting a bit long in the tooth. My colleague Nick recently replaced his iPhone 5 with a OnePlus 2 and he was very happy with it. Envious of that big screen and its relatively cheap price, I decided to give it a go. I also wanted to try out Android as I’d heard good things.


I wasn’t impressed.

On the phone hardware side: The ringtone was quiet and the vibration unnoticeable when in a pocket. I managed to cause it to crash a few times by accidentally hammering the back button. It would get hot. And battery life wasn’t amazing.

On the Android side: I didn’t like Android’s way of doing notifications. Some web pages would run really slowly, and I don’t tolerate lag in a brand new phone.

As for that big screen – well, the apps didn’t feel optimized for it – e.g. Facebook would still only show one news story at a time. Even with the smallest font there wouldn’t be much text on the screen at once, just lots of white space and big buttons. So the big screen felt wasted.

After 5 days I gave up on it and decided to go back to my iPhone 5, so I attempted to return the OnePlus. And that’s where their pathetic support team got involved. I started the returns process on Feb 16, and so far I’ve had 12 messages back and forth confirming my address, and confirming whether or not I want to return the cover, over and over. Finally today, March 11, I received an RMA form. So it’s taken almost a month of backwards and forwards with “Joey” and “Alex”. Let’s see how long it takes to get the money into my account.

Update: it took another month for the RMA to be processed, and I hadn’t heard anything until I chased them, so in total it took 2 months (and 18 emails) just to return the phone and get a refund.