I have to admit that my expectations for this book were so high that it was probably doomed to not satisfy me. I was expecting something deeply Asian, with fantastic elements and fresh ideas and characters. What I got is something that is almost accidentally fantastical and has few cultural elements to make it fresh. Yet it does have interesting characters and, if it weren't for the plot, which meanders whichever way the author needs to further her agenda, it would have been a good book.

Joan He is an American of Chinese descent (hence the name of the book?) and the culture described in Descendant of the Crane is based on an American's understanding of Chinese culture. That makes it both relatable and less Asian than I would have liked. What do I know, though? My feeling was that the author was exploring her own understanding of her origins instead of sharing something solid with the reader. There were some very intriguing ideas in the book, but they drove the story and the characters too strongly, making them inconsistent and irrational. This is an almost maybe book for me.

Bottom line: even without getting a lot of satisfaction out of it, I feel two stars out of five is too little, yet I am certain three is too much.

It was one of the few Brandon Sanderson books that I hadn't read and it is, at least at this moment, a standalone book. I know the title says that The Rithmatist is part of a series, but what successful book isn't? In truth, Sanderson started this book while working (and failing) on another and it took years to rethink and rewrite it into a finished story. It's best you take it as one of those wonderful accidents that successfully reach the reader despite the publishing industry.

Back to the book, it's almost classic Sanderson: the main character is young, passionate and intelligent, yet lacking power. Everybody else has it, though, and he is fascinated by it, while learning at a prestigious school that also teaches the techniques of power. In this gearpunk-like novel, the power is magical and involves drawing lines with chalk and imbuing them with personal will. The lines then become defenses, attacks, weapons and even magical minions. When the school is attacked and his friends are in danger, it is up to him to solve the mystery.

It's obvious that the story has issues, and that is probably why Sanderson worked so much on it to make it publishable, but one can get past them quickly. The characters are not as funny or punny as his others, or even very complex, being satisfied to have one or two goals in life and go for them (kind of like magical chalk drawings themselves, hmmm). The plot, involving the awesome power of carrying a piece of chalk with you and going through amazing duels that resemble tower defense games, is also not the most captivating. Yet the story kind of works.

Bottom line: a pleasant Brandon Sanderson classic, without being exceptional in any way.

I thought long and hard about how to start this review and I think the best place is the ending. In July 2015 the Internet and even the classic media exploded with the news of the American space probe named New Horizons reaching the last unexplored classical planet of the solar system: Pluto. Long thought to be a frozen piece of rock floating too far from the Sun to be of any interest, it had been ignored by NASA and every other space agency out there, only to be revealed as one of the most intriguing and beautiful gems of our system. New Horizons had been launched in 2006 and it took about another year after the flyby to get all the Pluto data back to Earth. As far as the general population was concerned, and even most of the people passionate about space, this was a timeline with three major events: launch, flyby and the end of the data download. As far as the media was concerned, it was a great discovery because it produced memeable pictures (the Pluto heart is one, for example).

Chasing New Horizons starts in 1989, when Alan Stern decided he would work toward a NASA mission to Pluto, and takes us through the Herculean job of creating interest, gathering people, drafting a project, finding support and funding, fighting competing teams, bureaucracy, political apathy and even bad will, the ups and downs, the almost-theres, the Sisyphean and thankless work of getting something up the Hill only to see it fall because of a change in administration, or a cut in the budget, or some hidden agenda, and even people petty enough to demote Pluto from planet status out of a personal grudge against the person who discovered it.

I liked how the book was written, even if at the start I had to skim past the usual platitudes meant to garner interest in space from the average reader and had to cope with the American units of measurement: feet, miles, Fahrenheit and, of course, the bus as a unit of size. However, no matter the small faults in the writing, the subject is so important that I can't rate the book lower than the maximum. This is a must read, even if it skirts the technical aspects and mostly discusses the 25-year effort from a managerial standpoint.

It's hard to describe how awesome these people are. Can you imagine working for a quarter century on something that can fail abruptly, with no positive outcome, at literally any moment? I thought I had it bad when a project I was working on for six months was cancelled - imagine having to go through something like this several times, at the exact moment when you thought it would all sail smoothly from then on, when you had the funding, the assurances and even the construction of the probe nearly finished. Four days before one critical moment, the flyby, when New Horizons was supposed to do almost all of its work, the on-board computer rebooted and lost all previously uploaded programming. In those four days, people had to scramble to recreate the entire software package they had worked on incrementally for the previous nine years and upload it to a machine whose signals took about four and a half hours to reach (nine hours round trip). One of the most critical moments of the mission (after 16 years of ground work to make it happen) was the launch, for example. The mission planners had no control over the mechanism of the launch vehicle. It could have blown up on the pad or in the air.

There would be no redos. First, there was no way the project would have been approved again after a failure so senseless. Second, Pluto would have moved into a region of its 248-year orbit where its atmosphere would freeze, making any future probe return much less interesting data than at that exact moment. If it failed, it would have been the first and probably last APL planetary exploration mission, after they fought tooth and nail to be the ones doing it, rather than the usual and entrenched JPL. People had lived their entire lives working on this thing and it could have failed in so many different ways.

Bottom line: you have to be insane to do what Stern did. A wonderful flavor of insanity that is both admirable and terrifying. The system behind NASA should value and support these people even if, especially if, they are insanely driven enough that they don't actually need it. I would say that New Horizons succeeded despite the American space industry and political system, not because of them. It really shouldn't be that hard. This is a book for all space fans, but also for people who have had difficulties in their own projects. While it might help with specific insights, this book will make almost any hardship you ever endured seem insignificant.

We already know how to load types in .NET Framework and we know what they say we should use in .NET Core. But what about Standard? Is that a trick question? Sort of. Right now we have two .NET Standard and three .NET Core versions, although .NET Core 3 is in preview mode. The signature of AssemblyLoadContext and how it is used have changed dramatically. Core 3 enables context unloading, but Standard 2 does not. So either you are forced to build your library as Core 3, or you have to give up unloading contexts, or you have to use reflection, which is not robust and probably will not be needed once a possible Standard 3 arrives.

But there are subtler issues at work. One of them is that, at least with .NET Core 3 Preview6, when you reference System.Runtime.Loader in a Standard library, so you can access AssemblyLoadContext, you get conflicts between the System.Runtime you are using and the one referenced by System.Runtime.Loader. The only solution is to use the System.Runtime.Loader NuGet package, but that returns you to the Standard 2 version of AssemblyLoadContext, even if the library version is higher!

The setup is this: I have an ITestInterface interface which resides in TestInterfaceLibrary.dll. I also have a TestImplementation class that can be found in TestImplementationLibrary.dll and implements ITestInterface. My program either does not reference any of these libraries or it only references the interface one. The task is to load both these types and then simply convert one instance of TestImplementation to ITestInterface. A simple test would be loading the types and then expecting interfaceType.IsAssignableFrom(implementationType) to be true.
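For reference, the two libraries can be as simple as this (a minimal sketch; the members don't matter for the test, only the type names do):
// TestInterfaceLibrary.dll
namespace TestInterfaceLibrary
{
    public interface ITestInterface { }
}
 
// TestImplementationLibrary.dll (references TestInterfaceLibrary.dll)
namespace TestImplementationLibrary
{
    public class TestImplementation : TestInterfaceLibrary.ITestInterface { }
}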

Core 3


Let's first try the Core 3 way:
var context = new AssemblyLoadContext("testContext", true);
 
var interfaceAssembly = context.LoadFromAssemblyPath(interfaceAssemblyPath);
var interfaceType = interfaceAssembly.GetType("TestInterfaceLibrary.ITestInterface");
Console.WriteLine(interfaceType?.ToString()??"interface type not loaded");
 
var implementationAssembly = context.LoadFromAssemblyPath(implementationAssemblyPath);
var implementationType = implementationAssembly.GetType("TestImplementationLibrary.TestImplementation");
Console.WriteLine(implementationType?.ToString() ?? "implementation type not loaded");
 
Console.WriteLine("implementation implements interface: "+interfaceType.IsAssignableFrom(implementationType));
 
context.Unload();
The output is:
TestInterfaceLibrary.ITestInterface
TestImplementationLibrary.TestImplementation
implementation implements interface: True

It works! But only because the interface assembly is loaded first. If you try to load just the implementation type first, it will come up as null. There are no exceptions thrown unless you enumerate all the assembly types or specify the throwOnError parameter in GetType. The exception is "System.IO.FileNotFoundException: 'Could not load file or assembly 'TestInterfaceLibrary, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null'. The system cannot find the file specified.'".

In order to solve this, we need to use the Resolve event of the AssemblyLoadContext class. Let's try this:
var context = new AssemblyLoadContext("testContext", true);
context.Resolving += Context_Resolving;
 
var implementationAssembly = context.LoadFromAssemblyPath(implementationAssemblyPath);
var implementationType = implementationAssembly.GetType("TestImplementationLibrary.TestImplementation", true);
Console.WriteLine(implementationType?.ToString() ?? "implementation type not loaded");
 
var interfaceAssembly = context.LoadFromAssemblyPath(interfaceAssemblyPath);
var interfaceType = interfaceAssembly.GetType("TestInterfaceLibrary.ITestInterface", true);
Console.WriteLine(interfaceType?.ToString() ?? "interface type not loaded");
 
Console.WriteLine("implementation implements interface: " + interfaceType.IsAssignableFrom(implementationType));
 
context.Resolving -= Context_Resolving;
context.Unload();
 
private static Assembly Context_Resolving(AssemblyLoadContext context, AssemblyName assemblyName)
{
var expectedPath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, assemblyName.Name + ".dll");
return context.LoadFromAssemblyPath(expectedPath);
}

And now it works again, by assuming the assembly name is the same as the assembly file name and that it is found in the same place.

But... if we try this in different contexts:
var context = new AssemblyLoadContext("testContext", true);
context.Resolving += Context_Resolving;
 
var implementationAssembly = context.LoadFromAssemblyPath(implementationAssemblyPath);
var implementationType = implementationAssembly.GetType("TestImplementationLibrary.TestImplementation", true);
Console.WriteLine(implementationType?.ToString() ?? "implementation type not loaded");
 
context.Resolving -= Context_Resolving;
context.Unload();
context = new AssemblyLoadContext("testContext2", true);
context.Resolving += Context_Resolving;
 
var interfaceAssembly = context.LoadFromAssemblyPath(interfaceAssemblyPath);
var interfaceType = interfaceAssembly.GetType("TestInterfaceLibrary.ITestInterface", true);
Console.WriteLine(interfaceType?.ToString() ?? "interface type not loaded");
 
Console.WriteLine("implementation implements interface: " + interfaceType.IsAssignableFrom(implementationType));
 
context.Resolving -= Context_Resolving;
context.Unload();
the output will show
implementation implements interface: False

This means that if we want to encapsulate this in a TypeLoader class or something, we cannot use different contexts for dynamically loading types. Even if we had one context that we would unload in order to refresh all the types, it could still be different from the main context, in case the interface is loaded twice or referenced directly in the project.

For example, if you reference TestInterfaceLibrary directly and you load TestImplementation dynamically it will work as expected, because ITestInterface is resolved automatically from the main context. However, if you load ITestInterface dynamically, too, it will be a different type from the referenced ITestInterface, even if they apparently have the same name and full name and assembly qualified name! So it kind of makes sense to not load a type twice. Is this where the context unloading comes in? Not really. Let's define a method that counts the number of types with a certain name in the current domain as
private static int CountTypes(string typeName)
{
return AppDomain.CurrentDomain.GetAssemblies()
.SelectMany(assembly => assembly.GetTypes().Where(t => t.FullName == typeName))
.Count();
}

And now let's run this code:
var context = new AssemblyLoadContext("testContext", true);
context.Resolving += Context_Resolving;
 
var referencedInterfaceType = typeof(ITestInterface);
Console.WriteLine(referencedInterfaceType?.ToString() ?? "interface type not loaded");
 
var interfaceAssembly = context.LoadFromAssemblyPath(interfaceAssemblyPath);
var interfaceType = interfaceAssembly.GetType("TestInterfaceLibrary.ITestInterface", true);
Console.WriteLine(interfaceType?.ToString() ?? "interface type not loaded");
 
Console.WriteLine($"Types are the same: {interfaceType==referencedInterfaceType}");
 
Console.WriteLine($"Number of types with name {interfaceType.FullName}: {CountTypes(interfaceType.FullName)}");
 
context.Resolving -= Context_Resolving;
context.Unload();
Console.WriteLine($"Number of types with name {interfaceType.FullName}: {CountTypes(interfaceType.FullName)}");

There is the referenced type, then we load the type dynamically again, inside a new context. We count the types loaded in the current domain, we unload the context, we count the types again. The result is
TestInterfaceLibrary.ITestInterface
TestInterfaceLibrary.ITestInterface
Types are the same: False
Number of types with name TestInterfaceLibrary.ITestInterface: 2
Number of types with name TestInterfaceLibrary.ITestInterface: 2
The types are always 2!

Bottom line, even when unloading the AssemblyLoadContext, the types used are not unloaded and trying to find a type by name will result in duplicates.

OK, so let's just agree that types with the same name, once loaded, should remain there and no other type with the same name should be loaded. Let's try to incorporate this into a TypeLoader class:
public class TypeLoader : IDisposable
{
private readonly AssemblyLoadContext _context;
 
public TypeLoader()
{
_context = new AssemblyLoadContext(GetType().FullName, true);
_context.Resolving += Context_Resolving;
}
 
private Assembly Context_Resolving(AssemblyLoadContext context, AssemblyName assemblyName)
{
var expectedPath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, assemblyName.Name + ".dll");
return context.LoadFromAssemblyPath(expectedPath);
}
 
public Type LoadType(string typeName, string assemblyPath)
{
var type = AppDomain.CurrentDomain.GetAssemblies()
.SelectMany(assembly => assembly.GetTypes().Where(t => t.FullName == typeName))
.FirstOrDefault();
if (type != null)
{
return type;
}
var assembly = _context.LoadFromAssemblyPath(assemblyPath);
return assembly.GetType(typeName, true);
}
 
public void Dispose()
{
_context.Resolving -= Context_Resolving;
_context.Unload();
}
}

The code in our test is now much clearer:
var interfaceAssemblyPath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "TestInterfaceLibrary.dll");
var implementationAssemblyPath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "TestImplementationLibrary.dll");
var interfaceTypeName = "TestInterfaceLibrary.ITestInterface";
var implementationTypeName = "TestImplementationLibrary.TestImplementation";
 
using (var loader = new TypeLoader())
{
Type referencedType = typeof(TestInterfaceLibrary.ITestInterface);
var interfaceType = loader.LoadType(interfaceTypeName, interfaceAssemblyPath);
var implementationType = loader.LoadType(implementationTypeName, implementationAssemblyPath);
Console.WriteLine($@"
referenced type: {referencedType}
interface type: {interfaceType}
implementation type: {implementationType}
referenced and loaded interfaces are the same: {referencedType == interfaceType}
interface implemented: {interfaceType.IsAssignableFrom(implementationType)}"
);
}
and the result is
referenced type: TestInterfaceLibrary.ITestInterface
interface type: TestInterfaceLibrary.ITestInterface
implementation type: TestImplementationLibrary.TestImplementation
referenced and loaded interfaces are the same: True
interface implemented: True

But we still use Unload. Maybe it will work some day as I want it to work, but until then, why not get rid of Unload and make TypeLoader a class in a Standard 2 library?

Standard 2


For this I will create a new Standard 2 library project and then reference it in our test Core 3 project. Then I will move the TypeLoader class in the library project.

The errors in the library project are related to not knowing what an AssemblyLoadContext is, therefore the first solution is to reference System.Runtime.Loader from the framework. I get the immediate error "Assembly 'System.Runtime.Loader' with identity 'System.Runtime.Loader, Version=4.1.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a' uses 'System.Runtime, Version=4.2.1.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a' which has a higher version than referenced assembly 'System.Runtime' with identity 'System.Runtime, Version=4.1.2.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a".

Solution 2: load the System.Runtime.Loader NuGet package, which at the time of writing this, is version 4.3.0. The error is now gone, but several things are immediately apparent:
  1. the Unload method doesn't exist anymore
  2. the constructor doesn't receive a name and a bool anymore
  3. AssemblyLoadContext is now abstract

In order to solve this I am creating a DynamicAssemblyLoadContext class that inherits from AssemblyLoadContext and just returns null from the Load method override, and I give it an Unload method and a constructor with a string and a bool that don't do anything. And it works again. The updated TypeLoader class is now:
public class TypeLoader : IDisposable
{
private readonly DynamicAssemblyLoadContext _context;
 
public TypeLoader()
{
_context = new DynamicAssemblyLoadContext(GetType().FullName, true);
_context.Resolving += Context_Resolving;
}
 
private Assembly Context_Resolving(AssemblyLoadContext context, AssemblyName assemblyName)
{
var expectedPath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, assemblyName.Name + ".dll");
return context.LoadFromAssemblyPath(expectedPath);
}
 
public Type LoadType(string typeName, string assemblyPath)
{
var type = AppDomain.CurrentDomain.GetAssemblies()
.SelectMany(ass => ass.GetTypes().Where(t => t.FullName == typeName))
.FirstOrDefault();
if (type != null)
{
return type;
}
var assembly = _context.LoadFromAssemblyPath(assemblyPath);
return assembly.GetType(typeName, true);
}
 
public void Dispose()
{
_context.Resolving -= Context_Resolving;
_context.Unload();
}
 
 
private class DynamicAssemblyLoadContext : AssemblyLoadContext
{
public DynamicAssemblyLoadContext(string name, bool isCollectible)
{
}
 
protected override Assembly Load(AssemblyName assemblyName)
{
return null;
}
 
public void Unload()
{
}
}
}

The safe way


The code above has an issue, though. If the interface type is dynamically loaded before its referenced type is used, this fails again. This is the case when you use dependency injection. You dynamically load the types in order to register the implementation relationship to the interface, but then, when you ask for a resolution for the interface type, now referenced by the main project, you get another type named just the same.
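To illustrate, here is a sketch of that scenario (using Microsoft.Extensions.DependencyInjection as an example container and the TypeLoader above; the method names are made up for the example):
// plugin discovery: both types are loaded dynamically, before the referenced
// interface is ever touched by the rest of the program
public static IServiceProvider RegisterPlugins(TypeLoader loader, string interfacePath, string implementationPath)
{
    var services = new ServiceCollection();
    var interfaceType = loader.LoadType("TestInterfaceLibrary.ITestInterface", interfacePath);
    var implementationType = loader.LoadType("TestImplementationLibrary.TestImplementation", implementationPath);
    services.AddTransient(interfaceType, implementationType);
    return services.BuildServiceProvider();
}
 
// later, business code asks for the *referenced* ITestInterface and gets null,
// because the registered service type is a different Type instance that merely
// shares the name, having been loaded in another context
public static TestInterfaceLibrary.ITestInterface ResolvePlugin(IServiceProvider provider)
{
    return provider.GetService<TestInterfaceLibrary.ITestInterface>();
}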

The safe way: considering that we don't really use Unload and can't count on it ever working, why not use the default context, the one where everything loads, and be done with it? When you do that, the code becomes a little uglier, but it works in all situations.

The final version:
public class TypeLoader
{
private readonly object _resolutionLock = new object();
 
private Assembly Context_Resolving(AssemblyLoadContext context, AssemblyName assemblyName)
{
var expectedPath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, assemblyName.Name + ".dll");
return context.LoadFromAssemblyPath(expectedPath);
}
 
public Type LoadType(string typeName, string assemblyPath)
{
var context = AssemblyLoadContext.Default;
lock (_resolutionLock)
{
var type = AppDomain.CurrentDomain.GetAssemblies()
.SelectMany(ass => ass.GetTypes().Where(t => t.FullName == typeName))
.FirstOrDefault();
if (type != null)
{
return type;
}
// attach the resolver only while loading and always detach it,
// even if loading the assembly or getting the type throws
context.Resolving += Context_Resolving;
try
{
var assembly = context.LoadFromAssemblyPath(assemblyPath);
return assembly.GetType(typeName, true);
}
finally
{
context.Resolving -= Context_Resolving;
}
}
}
}

You just gotta hate that adding and removing the event inside a lock, right? Well, if you find a better solution, let me know.

The Blood Mirror felt like the weakest book in the series, but really, if I think about it, it's the pattern that unfolded through the entire Lightbringer saga that feels wrong. The first book was amazing, with interesting characters, great world building, an intriguing story, but then came the second book - and I didn't see it then - which upended many of the concepts in the first and added many more. It was not a continuation, per se, but a reframing of the story with other parameters. Instead of closing story arcs, Brent Weeks kept transforming them, keeping them open and adding many more. The third book made this pattern obvious and in this book it became annoying.

Forget that everybody is a relative of everybody else or, at the extreme, a member of an organization that we didn't know existed or care about in previous books. Forget that after we follow a character as one thing, we have to follow them as something completely different in the next book, because of reasons we didn't know (or care) about. Forget even that threatening someone's loved ones seems to control everything with maximum efficiency in this universe, while actually harming them is a forgivable offense. Nothing. Ever. Ends. It just piles on. And since there is limited space in the book, important things - like the war or what the people are actually doing when the entire establishment blows up in their faces and destroys their lives - get sidelined or completely eliminated in favor of whatever insecurity Kip feels while discussing hot sex with his friends or his amazingly beautiful (and totally inconsistent) wife. And of course, the book ends in another cliffhanger.

In chess, when you are overwhelmed by the complexity of the position, you simplify it. You exchange pieces until the board is clearer. In Lightbringer, enemies just enjoy threatening each other and never following up while they work together for some completely pointless goal. Just like in TV soap operas, they all hate and love each other at the same time while things that could never have been predicted by the reader happen as chaotically as possible around them.

So, the fifth book will be published this year and I will read it, but my rating on the entire series just plunged dramatically. I don't expect things to really come to any conclusion, I don't expect characters to evolve in any meaningful way anymore or the lore behind it all to ever be explained. We started with seven colors and a god, now we have 11 colors and about 200 gods, for example. The chances that all of this mess will become clear in the future are remote.

The Broken Eye continues right after the shocking finale of The Blinding Knife! And that pretty much sucks, because the ending was the type of cliffhanger that just felt added on in order to make people quickly buy the next volume. Unfortunately, this book is no different. After a zillion story arcs that meet improbably and a lot of agitation one way or the other, Brent Weeks ends Broken Eye with an even shockier (is that a word?) ending.

And I will bite, I will read the fourth book in the series, The Blood Mirror, but only because I find the characters intriguing. Yet I definitely lost that feeling of respect for the story, the careful attention to detail that I enjoyed so much in the first volume. Weeks is a good writer, maybe even a great one, but instead of the series getting better, it just gets bloated until it needs over-the-top twists and abrupt cliffhangers. One of the most pervasive feelings when reading this volume was frustration that the stories of the characters I wanted to follow were interrupted by all of the others, and that each and every one of even the secondary heroes needed their own grand achievement, until it got claustrophobic. OK, you're the good guy, but when you see someone hurting everyone you know, you just kill them. You don't one-up them, you don't talk to them, you don't strategize or play games. OK, you're a powerful psycho, but it doesn't mean everything needs to be a power show. I mean, does Andross Guile even go to the bathroom, or does he just will his bowels into submission? OK, you are young and inexperienced so you don't know what to do when you love someone, but doing the exact opposite? And how come in this universe there are at most two degrees of separation? More like one and a half. And how come everyone knows what they need to do exactly when they need it, regardless of whether they ever learned it before?

I am already hooked into the story and Brent Weeks creates a complex and compelling one, however the experience of reading the books is only diminishing with stupid techniques like cliffhangers and hidden information and mindless expansions into new territories that absolutely did not need to be there. Too bad that now everything will need to at least maintain this insane level of tension and complexity, for fear of turning boring.

Bottom line: not bad, certainly not boring, but pointlessly exhausting.

This is more a backup for the extensions that I have installed on the two main IDEs I'm using for my job: Visual Studio and Visual Studio Code.

Visual Studio Code


In order to list the extensions installed, use the command code --list-extensions. For me, with each line prefixed by the install command so the output can be run directly, this results in:

code --install-extension aeschli.vscode-css-formatter
code --install-extension Angular.ng-template
code --install-extension christian-kohler.npm-intellisense
code --install-extension cmstead.jsrefactor
code --install-extension cssho.vscode-svgviewer
code --install-extension danwahlin.angular2-snippets
code --install-extension dbaeumer.jshint
code --install-extension dbaeumer.vscode-eslint
code --install-extension donjayamanne.jquerysnippets
code --install-extension donjayamanne.jupyter
code --install-extension DotJoshJohnson.xml
code --install-extension ecmel.vscode-html-css
code --install-extension eg2.vscode-npm-script
code --install-extension esbenp.prettier-vscode
code --install-extension fabianlauer.vs-code-xml-format
code --install-extension fknop.vscode-npm
code --install-extension HookyQR.beautify
code --install-extension humao.rest-client
code --install-extension joelday.docthis
code --install-extension k--kato.docomment
code --install-extension michelemelluso.code-beautifier
code --install-extension minhthai.vscode-todo-parser
code --install-extension mohsen1.prettify-json
code --install-extension mrmlnc.vscode-scss
code --install-extension ms-mssql.mssql
code --install-extension ms-vscode.csharp
code --install-extension ms-vscode.typescript-javascript-grammar
code --install-extension ms-vscode.vscode-typescript-tslint-plugin
code --install-extension msjsdiag.debugger-for-chrome
code --install-extension naumovs.color-highlight
code --install-extension pmneo.tsimporter
code --install-extension rbbit.typescript-hero
code --install-extension robinbentley.sass-indented
code --install-extension wayou.vscode-todo-highlight

In order to install them, use code --install-extension [extension name] for each line.

Visual Studio


For Visual Studio, funny enough, in order to export and import your extensions you need to use an extension: Extension Manager 2017, which on my system exports a file in .vsext format:

{
"id": "5f191824-b8a6-47c0-9f96-f607dfd3c09b",
"name": "My Visual Studio extensions",
"description": "A collection of my Visual Studio extensions",
"version": "1.0",
"extensions": [
{
"name": ".NET Portability Analyzer",
"vsixId": "55d15546-28ca-40dc-af23-dfa503e9c5fe"
},
{
"name": "Advanced Installer for Visual Studio 2017",
"vsixId": "Caphyon.AdvancedInstaller.23debb5a-cff4-4b91-88bf-6280f72a7ebb"
},
{
"name": "Azure Data Lake and Stream Analytics Tools",
"vsixId": "1e906ff5-9da8-4091-a299-5c253c55fdc9"
},
{
"name": "Azure Functions and Web Jobs Tools",
"vsixId": "Microsoft.VisualStudio.Web.AzureFunctions"
},
{
"name": "BuildVision",
"vsixId": "837c3c3b-8382-4839-9c9a-807b758a929f"
},
{
"name": "Clean Code .NET",
"vsixId": "CleanCode.NET.9ecfa9bb-0775-48d0-9898-4dbbbd529fe3"
},
{
"name": "Cloud Explorer for VS 2017",
"vsixId": "Microsoft.VisualStudio.CloudExplorer"
},
{
"name": "Code Cracker for C#",
"vsixId": "CodeCracker.Vsix..5b99e64c-1418-4a06-990c-fd4cf01f4f63"
},
{
"name": "Code Graph",
"vsixId": "CodeAtlasVSIX.Company.df5456fb-08ea-4256-b5ff-ecdb3a512ad3"
},
{
"name": "CodeMaid",
"vsixId": "4c82e17d-927e-42d2-8460-b473ac7df316"
},
{
"name": "CommentCop",
"vsixId": "CommentCop..0521EE68-1A5D-4C78-9970-B6A46B03FA6D"
},
{
"name": "EntityFramework Reverse POCO Generator",
"vsixId": "EntityFramework_Reverse_POCO_Generator..d542a934-8bd6-4136-b490-5f0049d62033"
},
{
"name": "Extension Manager 2017",
"vsixId": "e83d71b8-8bfc-4e06-b145-b0388910c016"
},
{
"name": "Fix Mixed Tabs",
"vsixId": "FixMixedTabs.9f1d3050-b986-4b10-ae36-97c6efc5e968"
},
{
"name": "Fix Namespace",
"vsixId": "f073da8c-bb52-41f8-b95a-a6346b1a0b52"
},
{
"name": "MetricsAnalyzer",
"vsixId": "MetricsAnalyzer..8026235d-7afc-401b-8f45-ba8624a07ef5"
},
{
"name": "Microsoft Code Analysis 2017",
"vsixId": "4db2d63d-3320-4fbd-bf80-07f8d1500bd3"
},
{
"name": "Moq.Analyzers",
"vsixId": "Moq.Analyzers..c3c7e3f8-2407-428d-beef-c4557253517b"
},
{
"name": "Object Exporter",
"vsixId": "07fb5b16-f4be-4488-9a19-b4f36d2c05a6"
},
{
"name": "Output enhancer",
"vsixId": "VSOutputEnhancer.Nikolay Balakin.a06be4c3-f97e-425c-8a0d-bdef08ac2abb"
},
{
"name": "Power Commands for Visual Studio",
"vsixId": "PowerCommands.3ecdd89b-f985-483d-8c94-be37de4dc083"
},
{
"name": "Ref12",
"vsixId": "SLaks-Ref12-086C4CE4-7061-4B1F-BC77-B64E4ED71B8E"
},
{
"name": "Reference Conflicts Analyser",
"vsixId": "ff477521-e67b-4ca3-931f-3edf36125d28"
},
{
"name": "Regular Expression Tester Extension",
"vsixId": "a65d58d2-ead8-4eea-a47d-fa60865a6043"
},
{
"name": "ResolveUR - Resolve Unused References",
"vsixId": "637ba02c-3388-4e54-9051-3eea7c51b054"
},
{
"name": "Roslyn Security Guard",
"vsixId": "RoslynSecurityGuard..45fa56c2-16f1-4395-8c10-a5a460084018"
},
{
"name": "Roslynator 2017",
"vsixId": "9289a8ab-1bb6-496b-9992-9f7ea27f66a8"
},
{
"name": "Security Code Scan (for VS2017 and newer)",
"vsixId": "955196A7-ACBF-4F6B-820B-51B8507CE853"
},
{
"name": "Solution Error Visualizer",
"vsixId": "SolutionErrorVisualizer.a392f96b-6b33-4b53-b4bb-3376a05f986c"
},
{
"name": "SonarLint for Visual Studio 2017",
"vsixId": "SonarLint.36871a7b-4853-481f-bb52-1835a874e81b"
},
{
"name": "SQL Search",
"vsixId": "Redgate.SQLSearch.VSExtension.9BD7AEDA-C291-4702-8191-4189B099F3A9"
},
{
"name": "Target Framework Migrator",
"vsixId": "TargetFrameworkMigrator..4f7666b9-e62c-46a1-af25-21ab8742ef00"
},
{
"name": "Trailing Whitespace Visualizer",
"vsixId": "4c1a78e6-e7b8-4aa9-8812-4836e051ff6d"
},
{
"name": "Unit Test Boilerplate Generator",
"vsixId": "UnitTestBoilerplate.RandomEngy.ca0bb824-eb5a-41a8-ab39-3b81f03ba3fe"
},
{
"name": "Visual Studio IntelliCode",
"vsixId": "IntelliCode.VSIX.598224b2-b987-401b-8509-f568d0c0b946"
},
{
"name": "Visual Studio Spell Checker (VS2017 and Later)",
"vsixId": "43EA967E-0DE2-4136-8E52-C6DCFB5C2748"
},
{
"name": "Wix Toolset Visual Studio 2017 Extension",
"vsixId": "WixToolset.VisualStudioExtension.Dev15"
}
]
}

The Blinding Knife continues the story of Kip the bastard, of Gavin/Dazen Guile the genius god-like Prism, and of just about every other person alive, each a mere mortal. It is just as entertaining as the first book, although more focused on action than lore. A lot of new concepts are explored here, like colors that are not on the spectrum but can be drafted, other gods, other chromatic skills, but, as fantasy focused on young boys has taught us, they stay unexplained and mysterious: he's too young to understand, people die before they can finish their sentences, etc. I hate that cliché and I really hope people would stop using it so much. I am talking to you, Brent Weeks!

Anyway, I can't say anything more about the story or the style or the author than I did when I read the first book in the Lightbringer series. It's a continuous story, split into book-sized volumes. I will start reading the next book in the saga momentarily. I recommend the writing style and I like the attention to detail and the lore, although after a while the boy genius recipe feels more and more like a Japanese manga and less like a real story.


The Problem


Phew, that's a mouthful. But the issue is that trying to serialize a FileInfo or a DirectoryInfo object with Newtonsoft's Json library in .NET Core fails with a vague exception:
Newtonsoft.Json.JsonSerializationException: Unable to serialize instance of 'System.IO.DirectoryInfo'.
at Newtonsoft.Json.Serialization.DefaultContractResolver.ThrowUnableToSerializeError(Object o, StreamingContext context)
at Newtonsoft.Json.Serialization.JsonContract.InvokeOnSerializing(Object o, StreamingContext context)
at Newtonsoft.Json.Serialization.JsonSerializerInternalWriter.OnSerializing(JsonWriter writer, JsonContract contract, Object value)
at Newtonsoft.Json.Serialization.JsonSerializerInternalWriter.SerializeObject(JsonWriter writer, Object value, JsonObjectContract contract, JsonProperty member, JsonContainerContract collectionContract, JsonProperty containerProperty)
at Newtonsoft.Json.Serialization.JsonSerializerInternalWriter.Serialize(JsonWriter jsonWriter, Object value, Type objectType)

It doesn't say why it fails, just that a method called ThrowUnableToSerializeError threw um... an unable to serialize error?
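For context, a minimal reproduction looks something like this (the path is just an example):
var dir = new DirectoryInfo(@"C:\Temp");
// on .NET Core this throws the JsonSerializationException above
var json = JsonConvert.SerializeObject(dir);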

The Cause


Looking at the Newtonsoft code, we eventually get to this piece of code:
// serializing DirectoryInfo without ISerializable will stackoverflow
// https://github.com/JamesNK/Newtonsoft.Json/issues/1541
if (Array.IndexOf(BlacklistedTypeNames, objectType.FullName) != -1)
{
contract.OnSerializingCallbacks.Add(ThrowUnableToSerializeError);
}

Later, another piece of code will execute the serializing callbacks and throw the exception. We can get rid of this functionality by using a custom contract resolver, like this:
var settings = new JsonSerializerSettings
{
ContractResolver = new FileInfoContractResolver()
};
 
private class FileInfoContractResolver : DefaultContractResolver
{
protected override JsonContract CreateContract(Type objectType)
{
var result = base.CreateContract(objectType);
if (typeof(FileSystemInfo).IsAssignableFrom(objectType))
{
result.OnSerializingCallbacks.Clear();
}
return result;
}
}

Yet now, when trying to serialize, we get the stack overflow exception described in the original Newtonsoft.Json issue. It stems from the difference between the .NET Framework implementation and the .NET Core implementation of ISerializable in FileSystemInfo, which in Core just throws PlatformNotSupportedException. It's still not clear why it goes to a StackOverflowException, probably some conflict with Newtonsoft code, but it's clear Microsoft does not intend to make these classes serializable. If you think about it, those classes suck for so many reasons!

The Solution


So, in order to solve it, we will use a custom JSON converter:
private class FileSystemInfoConverter : JsonConverter
{
public override bool CanConvert(Type objectType)
{
return typeof(FileSystemInfo).IsAssignableFrom(objectType);
}
 
public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
{
if (reader.TokenType == JsonToken.Null)
return null;
var jObject = JObject.Load(reader);
var fullPath = jObject["FullPath"].Value<string>();
return Activator.CreateInstance(objectType, fullPath);
}
 
public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
{
var info = value as FileSystemInfo;
var obj = info == null
? null
: new
{
FullPath = info.FullName
};
var token = JToken.FromObject(obj);
token.WriteTo(writer);
}
}
And we use it like this:
var settings = new JsonSerializerSettings
{
Converters = new List<JsonConverter>
{
new FileSystemInfoConverter()
}
};
var json = JsonConvert.SerializeObject(dir, settings);
var info = JsonConvert.DeserializeObject<DirectoryInfo>(json, settings);

Why FileInfo and DirectoryInfo suck


The answer of a senior developer to any question should be "Why?" or "Why on Earth or anywhere in the Solar System would you want to do a dumb thing like that?!?!". Why would you want to serialize a directory or file info object? The answer is that you should not. The info objects are defined by only one thing: a path, but they have so much baggage: properties that access the file system, unsafe methods, no interfaces or factory methods that can allow them to be mocked in unit tests. They might look like data objects, but they are not!

Imagine a scenario where you have a list of all the files in your drive. You enumerated them all and now you want to serialize them. Should the serializer save Exists or Length, for example? Because that means it will access the file system for each of them in the process of serialization, leading to a lot of work, propensity to access errors and so on.

Best practices say you should use model classes to move data around, like a simple FileSystemInfoModel with Type and FullPath and maybe Attributes or Size properties, or whatever you want to save, populated explicitly as a separate responsibility. And if you want the functionality of the Info classes, use System.IO.Abstractions or the new Core IFileProvider abstraction to get implementations of interfaces that you can mock in unit tests.
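As a sketch of such a model class (the name and properties here are illustrative, not from any library):
public class FileSystemInfoModel
{
    public string FullPath { get; set; }
    public bool IsDirectory { get; set; }
    // populate these explicitly, and only when you actually need them
    public long? Size { get; set; }
    public FileAttributes? Attributes { get; set; }
}
Mapping from FileInfo or DirectoryInfo to such a model then becomes an explicit step that you control, including whether you touch the file system at all.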

Tell me what you think.


It seems there is a dedicated fan base for the Riyria series that so enjoys the setup that it ignores the quality (or lack thereof) of the writing. The writing style is amateurish at best, the characters are not fleshed out, and what little building they get is contradicted with impunity whenever the plot requires it; the point of the book has not been revealed after more than half of it, while the plot doesn't make any sense most of the time.

I am sorry, Michael J. Sullivan, but I could only read 60% of The Crown Tower before deciding I would not continue and would not try any of the other books in the series. For the readers, imagine a story about implausibly competent youngsters who are forced to work together by a kindly old professor for no good reason other than that they have to work together. Imagine a prostitute who decides to fight the world and open her own brothel, right across the street from her former pimp and king of the street, but whose only concern is how to bribe city officials into giving her a business permit. After half of the book, in which the characters have barely begun to do any of the activities listed above, nothing has really happened, while hints have been placed to imply this is a world where magic, goblins, elves, dwarves and gods exist, yet none of them has made an appearance.

I don't understand how stuff like that gets any awards. Is it just because they sell? Toilet paper sells and doesn't win anything! Just... ugh!

You remember when you had to write a paper for college and you had the thing that you wanted to say, but then your coordinator told you to make it a chapter, and then add others that are related for context? This book kind of feels like that. In English it is called The Fear Factor, but the Romanian edition calls it "Altruist or being good without reward" (my direct translation, as Good for Nothing didn't feel right, even if it is the title of the book in the UK), showing that even editors didn't really agree with the author on the right way to label it.

Overall, what Abigail Marsh tries to say is simple: our capacity to do good to others without expecting a reward stems from an ancient mammalian mechanism designed to bond mothers to children, and it is triggered by our ability to empathize with the fear others feel, while being regulated by a network of brain centers, mainly the amygdala and hippocampus, using the oxytocin hormone. This takes the book through eight chapters, each kind of separate and which I liked in different measures. The ones describing carefully crafted experiments and their outcomes I liked best; the ones that felt like fillers, or the ones affirming that correlation doesn't imply causation and then proceeding to describe a lot of correlation, less so.

Marsh goes out of her way to portray a positive image of humanity, where most people are generous, empathetic and altruistic. She describes people who aren't capable of it - psychopaths, with their amygdala dysfunction - and people on the other side of the curve - superaltruists who don't care to whom they do good, they just do it - then goes through very interesting experiments and comes up with theories about how and why altruism, fear and empathy work. Her conclusion is that our focus on negative things makes us falsely believe things are getting worse and people less trustworthy, when the opposite is overwhelmingly true.

Bottom line: I liked the book, but some of the chapters felt forced. I didn't really need the exposition of her beach trip to save the turtles or how much she feared and then appreciated the help of a random guy who looked like a hood thug. Most of the information interesting to me was concentrated in the first chapters, while the last one - explaining what to do to become more altruistic and how that improves our well-being, and filled with international statistical charts on altruism - I could have done without entirely. It's not that it wasn't correct or well written, it just felt like an add-on that had little to do with the book or, worse, was there just to fill up space.

If you search on TED Talks, you will see the author has a talk there titled Abigail Marsh: Why some people are more altruistic than others.

This post starts from a simple question: how do I start a task with a timeout? You go to StackOverflow, of course, and find this answer: Asynchronously wait for Task<T> to complete with timeout. It's an elegant solution: the main idea is to also start a Task.Delay and continue when either task completes. However, in order to cancel the initial operation, one needs to pass a cancellation token to the original task and manually handle it, meaning polluting the entire business code with cancellation logic. This might be OK, yet are there alternatives?
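The pattern from that answer looks roughly like this (a sketch of the idea, not the exact accepted code; the cts parameter is the CancellationTokenSource whose token the original operation received):
public static async Task<TResult> WithTimeout<TResult>(Task<TResult> task, TimeSpan timeout, CancellationTokenSource cts)
{
    // whichever of the two finishes first "wins"
    var completed = await Task.WhenAny(task, Task.Delay(timeout));
    if (completed != task)
    {
        // signal the original operation that it should stop...
        cts.Cancel();
        // ...but it will only actually stop if its own code checks the token
        throw new TimeoutException();
    }
    return await task;
}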

But, isn't there the Task.Run(action) method that also accepts a CancellationToken? Yes, there is, and if you thought this runs an action until you cancel it, think again. Here is what Task.Run says it does: "Queues the specified work to run on the thread pool and returns a Task object that represents that work. A cancellation token allows the work to be cancelled." and if you scroll down to Remarks, here is what it actually does: "If cancellation is requested before the task begins execution, the task does not execute. Instead it is set to the Canceled state and throws a TaskCanceledException exception". You read that right: the token is only taken into account when the task starts running, not while it is actually executing.
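A quick illustration of that documented behavior:
var source = new CancellationTokenSource();
source.Cancel(); // cancelled before Task.Run is even called
var task = Task.Run(() => Console.WriteLine("this never runs"), source.Token);
// the task is already in the Canceled state; awaiting it throws a TaskCanceledException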

Surely, then, there must be a way to cancel a running Task. How about Task.Dispose()? Dispose throws a funny exception if you try it: "System.InvalidOperationException: 'A task may only be disposed if it is in a completion state (RanToCompletion, Faulted or Canceled).'". In normal speech, it means "Fuck you!". If you think about it, how would you abort a task execution? What if it does nasty things, leaves resources occupied, has to clean up after it? The .NET team took the safe path and refused to give you an out of the box unsafe cancelling mechanism.
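For illustration:
var task = Task.Run(() => Thread.Sleep(10000));
// throws InvalidOperationException, because the task is still running
task.Dispose();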

So, what is the solution? The recommended one is that you pass the token to all methods that can be cancelled and then check inside whether cancellation was requested (see the sketch right after this list). Of course, this only works if:
  1. you control what the task does
  2. you can split the operation into small chunks that are either executed sequentially or in a loop, so you can interrupt their flow
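A minimal sketch of that cooperative pattern (ProcessChunk stands in for your own unit of work):
private static void DoWorkInChunks(CancellationToken token)
{
    for (var i = 0; i < 100; i++)
    {
        // throws OperationCanceledException as soon as cancellation is requested
        token.ThrowIfCancellationRequested();
        ProcessChunk(i); // hypothetical unit of work
    }
}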
If you have something like an external process that is being executed, or a single long running operation, you are almost out of luck. Why almost? Well, CancellationTokenSource and CancellationToken do not expose events, but the token exposes a "wait handle" that you can wait on synchronously. And here is where it gets funky. Check out an example of a method that executes some long running action and can react to the token being cancelled:
/// <summary>
/// Executes the long running action and cancels it when needed
/// </summary>
/// <param name="token"></param>
private void LongRunningAction(CancellationToken token)
{
// instantiate a container and keep its reference
var container = new IdentificationContainer();
Task.Run(() =>
{
// wait until the token gets cancelled on another thread
token.WaitHandle.WaitOne();
// this will use the information in the container to kill the action
// (presumably by interrupting external processes or sending some kill signal)
KillLongRunningAction(container);
});
// this executes the action and populates the identification container if needed
RunLongRunningAction(container);
}
This introduces some other issues, like what happens to the monitoring task if you never cancel the token or dispose of the cancellation source, but that's a bit too deep.

In the code above we get a sort of a solution if we can control the code and we can actually cancel things gracefully inside of it. But what if I can't (or won't)? Can I get something that does what I wanted Task.Run to do: execute something and, when I cancel it, stop it from executing?

And the answer, using what we learned above, is yes, but as explained at the beginning, it may have effects like resource leaks. Here it is:
/// <summary>
/// Run an action and kill it when canceling the token
/// </summary>
/// <param name="action">The action to execute</param>
/// <param name="token">The token</param>
/// <param name="waitForGracefulTermination">If set, the task will be killed with delay so as to allow the action to end gracefully</param>
private static Task RunCancellable(Action action, CancellationToken token, TimeSpan? waitForGracefulTermination=null)
{
// we need a thread, because Tasks cannot be forcefully stopped
var thread = new Thread(new ThreadStart(action));
// we hold the reference to the task so we can check its state
Task task = null;
task = Task.Run(() =>
{
// task monitoring the token
Task.Run(() =>
{
// wait for the token to be canceled
token.WaitHandle.WaitOne();
// if we wanted graceful termination we wait
// in this case, action needs to know about the token as well and handle the cancellation itself
if (waitForGracefulTermination != null)
{
Thread.Sleep(waitForGracefulTermination.Value);
}
// if the task has not ended, we kill the thread
if (!task.IsCompleted)
{
thread.Abort();
}
});
// simply start the thread (and the action)
thread.Start();
// and wait for it to end so we return to the current thread
thread.Join();
// throw exception if the token was canceled
// this will not be reached unless the thread completes or is aborted
token.ThrowIfCancellationRequested();
}, token);
return task;
}

As you can see, the solution is to run the action on a thread and then manually kill the thread. This means that any control of where and how the action is executed is wrested from the default TaskScheduler and given to you. Also, in order to force the stopping of the task, you use Thread.Abort, which may have nasty side effects. The gist of what Microsoft says about it is that Thread.Abort is not supported on .NET Core and throws a PlatformNotSupportedException at runtime.

Bummer! .NET Core doesn't want you to kill threads. However, if you are really determined, there is a way :) Use ThreadEx.Abort(thread);


Bonus code: How do you get the cancellation token if you have the task?
var token = new TaskCanceledException(task).CancellationToken;
It might not help too much, especially if you want to use it inside the task itself, but it might help clean up the code.

Conclusion


Just like async/await, using the provided cancellation token method will only pollute your code with little effect. However, considering you want to use a common interface for the purpose, use RunCancellable instead of Task.Run and handle the token manually whenever you feel resources have been allocated and need to be cleaned up first.
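As a usage sketch (SomeLongRunningWork is a placeholder for your own blocking code):
var source = new CancellationTokenSource();
// start the work through RunCancellable instead of Task.Run
var task = RunCancellable(() => SomeLongRunningWork(), source.Token);
// later, from wherever you decide the operation took too long:
source.Cancel(); // the monitoring task will abort the worker thread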

If there is something that went wrong with this book, then it has to be the cover on Goodreads: a hipster young man with dark hair, a goatee and a pointlessly fancy dagger, which has almost no connection to the story. Instead, try the one on Amazon, which at least doesn't offend. And that concludes what went wrong with The Black Prism! I actually liked it a lot.

The story feels like so many other young adult fantasy novels, with the young child of important ancestry who had a bad childhood and is suddenly thrown into a world of magic, war and intrigue, but the characters are fresh, their motivations carefully crafted with respectful attention to detail. The world building follows suit, with a novel magic system and a deep history that is not all revealed yet. The writing is good, too. After reading this first book in the Lightbringer saga, I immediately felt the need to read the next one in the series. But there is a dark side to all this, too, as The Black Prism isn't a standalone book. If you like it, you will have to read it all.

Bottom line: I really liked the love Brent Weeks wove into his book. This is not one of those "give me your money now" kinds of works; it's something that has value and beauty. It's not the greatest book ever written, but what book is? For the fantasy genre, it was pretty entertaining (and big!).

When I was a child I was obsessed with dinosaurs. I was going through the pages of the Zoological Atlas again and again, looking at the big lizard-like monsters and memorizing all of their names. If I had had access to a book like Steve Brusatte's, I would have probably become a paleontologist! By that I mean that the book is good... for an eight year old, or for somebody who is already giddy at the prospect of reading about their favorite subject. Now, decades later, I really made an effort to enjoy The Rise and Fall of the Dinosaurs, but it had no wow factor anymore. The plethora of names that I hadn't known about when I was smaller than a toddler did not bring me joy. Hearing about feathered dinosaurs and the most likely reason feathers evolved at all, or how dinosaurs turned into their modern-day form - the birds - or even the tales about the dwarf dinosaurs found in my own homeland merely made the book bearable. Having a long chapter focused on Tyrannosaurs and having my book reader stop after each T. in T. Rex didn't help either.

My verdict, therefore, is that it is a good history book. It is well written and the passion of the author is palpable and admirable. Yet, unless you know nothing about dinosaurs or you already love them, you won't read anything really amazing or new. It is, quite literally, a history of how dinosaurs rose and fell, and it feels like reading a history book. Somehow, I was expecting more, something that explored a lost world in depth, but in fact it only made clear how little we know and how tiny the chances are that this will ever change. Instead of the feel of a lush green world where danger loomed and beauty abounded, I got a dry, dusty look at people digging in rocks for small hints of that world. It was like looking at shadows and trying to figure out what made them.


Have you ever found a book so bland that you just refused to continue reading it? To me it happens rarely, but it did with Malice, by John Gwynne. And I do feel a sense of loss, since the reviews I've seen are all overwhelmingly positive. Maybe if I had just read a few more formulaic chapters I would have gotten to the part where something, anything, happens.

But no, I do have a lot of books to read and I am not going to waste my time reading about another child who wants to be a hero, but he's weak and bullied, another large blacksmith who was once a soldier, another pair of good and evil gods and their minions, noble savages, strong princesses, evil viziers and so on and so on. After several chapters all I got was a bunch of people in different contexts, each with their own names, friends, family, dreams, history and narration. Whenever I thought something would happen, another character with a silly name came along to perform whatever ritual is assigned to its cardboard role. Confusing and boring as hell.

Bottom line, I couldn't even begin to finish it. I probably read about 10-15% and gave up.