The Invention of Nature is an ode to Alexander von Humboldt, the man who practically invented our concept of nature, who inspired Darwin, Goethe, Bolivar, Jefferson and so many others, and who created the ideas of ecology, of Gaia (although it wouldn't be called that for some time) and of volcanic activity being globally connected. He was among the first to popularize the idea that man's mindless exploitation of nature cannot last and will have dire consequences. The last true polymath, Andrea Wulf calls him, and on paper he seems a god: an avid reader, a great thinker, fluent in many languages, exploring tirelessly on foot until well into his seventies, dabbling not only in the natural sciences, but also in politics, social revolution, physics, drawing, prose and poetry. He was still actively writing and corresponding well into his eighties. The quintessential 19th century romantic scientist, he was interested in everything and everyone and wrote incessantly. At one point he ran out of money because he was paying for the publication of all his books himself, being interested in spreading knowledge, not profit. He collected rocks, insects, plants, soil samples and so on, then sent them to other scientists who were interested, asking for nothing in return.

His view of nature and the cosmos (a term that he coined) permeates the vision of our society even now. So how come so few people know about him? To my shame, that includes me: I vaguely knew the name, but had no idea how grand his influence was. Wulf's explanation is that after the First World War (and I guess the Second didn't help, either) an anti-German sentiment spread through Europe and America, leading to the burning of books, the lynching of German people and an overall erasure of anything Germanic from culture.

Now, half of the book is almost exclusively a Humboldt biography and it is awesome! I kept imagining how great it would be if someone made a TV series out of it (Netflix and National Geographic, I am looking at you!): so many details, so many adventures, so many important people of the age. I think the book would have been more accessible if it had been just that. But then the author also describes some other people who were influenced by Humboldt, and while it was worth learning that Darwin venerated the man and modeled everything he did on him from the moment he read one of his books, the others were less interesting or important.

Even so, the other people cover less than a quarter of the book... the rest is acknowledgements, bibliography, references, etc. Andrea Wulf did a wonderful job researching this and bringing Humboldt to life for me. Even if the ending of the book was not as satisfying as the beginning, it's hard for me to rate this as anything less than excellent. You need to read this!

I can't decide whether Velocity Weapon is brilliant or stupid. What I can say is that I didn't like it. Megan O'Keefe tells the story of three characters: a gunnery sergeant who ejects in her pod during a space battle and is picked up by an intelligent spaceship, her brother, a member of the Prime Protectorate, who does everything he can to find her, and a thief on some other planet who stumbles upon a strange lab that changes her entire life.

The writing is competent, though nothing inspiring, and that is probably why I had difficulty finishing the book. But there are also some aspects of the story that I didn't like. For example, of the three main characters who start the book on equal footing, the thief gets less and less space and, worse, her story never connects to the others'. It's as if O'Keefe wrote a book and a novella and then merged them into a larger book, even though their only commonality is the shared universe. Then there is a part of the story that I got invested in, only for it to be aborted midway; I can't say more without spoiling the story, but I didn't like that.

The thing that bothered me most, though, is how the plot meanders instead of getting to the point. I used to think that a good story should be less straightforward, but now that I have read one that just comes and goes, gives you glimpses of the world, then does nothing with them... it just felt like wasted time. Don't get me wrong, the author builds a world with vast opportunities: a universe of multiple colonized worlds connected by star gates, which are controlled by the Primes and whose technology originated from an alien artifact. She is just beginning the story. The characters might yet come together, the villains might become clearer, the whole thing feels potentially epic, only one would probably have to read all of the books to understand where O'Keefe is planning to go.

Basically the book is a string of almost random events, driven by forces that are never made clear, then somehow brought together by incredible coincidence, while the characters are barely sketched and hard to relate to, especially the male ones. The world has a lot of potential, but little is built on it so far. It feels a little like Star Wars: a galaxy far, far away where everybody is related to or knows everybody else and everything in a chapter happens on one planet only. And it felt dated as well.

So I can't decide: is this the start of a wonderful epic universe with immense potential or is it just a stupid space opera book that is not very good? I just didn't like it.

.NET Core comes with its own dependency injection engine, separated into the Microsoft.Extensions.DependencyInjection package, and ASP.NET Core uses it by default. In a very simplistic description, it uses an IServiceCollection to which you add services, then builds an IServiceProvider from that list, an interface which returns an implementation based on a requested type, or null if it finds none. Any change to the list of services after the provider is built is not supported. There are situations, though, where you want to add new services, one of them being dynamically resolving new types.
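
For context, here is a minimal sketch of how the built-in container is normally used; the IGreeter/Greeter pair is invented for this example and not part of the original code.
using System;
using Microsoft.Extensions.DependencyInjection;

public interface IGreeter { string Greet(string name); }
public class Greeter : IGreeter
{
    public string Greet(string name) => $"Hello, {name}!";
}

public static class Demo
{
    public static void Main()
    {
        // register services in the collection
        var services = new ServiceCollection();
        services.AddSingleton<IGreeter, Greeter>();

        // build the provider and resolve a service
        var provider = services.BuildServiceProvider();
        var greeter = provider.GetService<IGreeter>(); // null if not registered
        Console.WriteLine(greeter.Greet("world"));

        // adding to 'services' at this point no longer affects 'provider'
    }
}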

Therefore I set out to create a custom implementation of IServiceProvider that fixes that, using the mechanisms already existing in .NET Core. Note that this is just something I did out of frustration, "because I could". Most people choose to replace the entire IServiceProvider with an implementation that uses some other DI container, like StructureMap.

My first attempt was to proxy a normal ServiceProvider and keep a reference to the collection. Then I would just change the collection and recreate the service provider. That has two major problems. One is that the previous service provider is not disposed; if you do dispose it, you automatically dispose all the services already resolved from it, and if you do not, you keep dangling references to the created services. The second, and more dire, is that recreating the service provider will generate new instances of services, even for those registered as singletons. That is not good.
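
Here is a minimal sketch of that second problem, reusing the invented IGreeter/Greeter pair from above:
var services = new ServiceCollection();
services.AddSingleton<IGreeter, Greeter>();

var provider1 = services.BuildServiceProvider();
var greeter1 = provider1.GetService<IGreeter>();

// "change the collection and recreate the service provider"
services.AddTransient<Greeter>(); // pretend a new registration arrived
var provider2 = services.BuildServiceProvider();
var greeter2 = provider2.GetService<IGreeter>();

// the singleton is duplicated: each provider holds its own instance
// (and provider1 was never disposed, which is the other problem)
Console.WriteLine(ReferenceEquals(greeter1, greeter2)); // False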

I thought of a solution:
  1. keep a list of service providers, instead of just one
  2. use a custom service collection which will let us know when changes occurred
  3. whenever new services are added, add them to a list of new services
  4. whenever a service is resolved, go through the list of providers
  5. if any provider returns a value, provide it
  6. else, if there are any new services, create a new provider from them and add it to the list
  7. else return null
  8. when disposing, dispose all providers in the list

This works great, except that the newly added providers are separate from the existing ones, so when you try to resolve a type with a second provider and that type has in its constructor a type that was registered in the first provider, you get nothing.

One solution would be to add all services to the second provider, not only the new ones, but then we get back to the original issue of the singletons, only a bit more subtle (see the sketch after this list):
  1. register type1 as a singleton
  2. get an instance of type1 (1)
  3. build the provider
  4. get an instance of type1 (2)
  5. register type2 which receives a type1 in its constructor
  6. get an instance of type2
  7. now, type1 (1) is the same as type1 (2), because it was resolved by the same provider
  8. type1 is different from type2.type1, though, because that was resolved as a different singleton by the second provider in the list
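
Here is that scenario as a minimal sketch, with invented Type1/Type2 classes (Type2 takes a Type1 in its constructor):
public class Type1 { }
public class Type2
{
    public Type1 Type1 { get; }
    public Type2(Type1 type1) => Type1 = type1;
}

// ...

var services = new ServiceCollection();
services.AddSingleton<Type1>();

var provider1 = services.BuildServiceProvider();
var first = provider1.GetService<Type1>();
var second = provider1.GetService<Type1>();
Console.WriteLine(ReferenceEquals(first, second));       // True: same provider, same singleton

// a new registration arrives, so a second provider is built from ALL the services
services.AddSingleton<Type2>();
var provider2 = services.BuildServiceProvider();
var type2 = provider2.GetService<Type2>();

// type2 received its own Type1, created by the second provider
Console.WriteLine(ReferenceEquals(first, type2.Type1));  // False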

One solution would be to add all previous services as factories, then. For IType1, instead of returning typeof(Type1), return a factory method that resolves the value through our system. And it works... until it reaches a definition (like IOptions<>) that was registered as an open generic: services.AddSingleton(typeof(IType3<>),typeof(Type3<>)). In the case of open generics, you cannot use a descriptor with a factory, because the factory returns an object, regardless of the generic type argument used. It would not do to return a Type3<Banana> for a requested IType3<int>.

So, final version is this:
  1. keep a list of service providers, instead of just one
  2. keep a dictionary of the last object resolved for a type
  3. use a custom service collection which will let us know when changes occurred
  4. whenever new services are added, add them to a list of new services
  5. whenever a service is resolved, go through the list of providers
  6. if any provider returns a value, return it
  7. if no new services were registered, return null
  8. create a new provider from all the services like this:
    • if it's a new registration, use it as is
    • if it's an open generic definition type:
      • if it's a singleton, first add all the already resolved closed types that match the definition
      • use the original descriptor afterwards
    • otherwise, use a registration that proxies to the advanced resolution mechanism we created
  9. when disposing, dispose all providers in the list

This implementation also has a flaw: if a singleton service coming from an open generic type definition descriptor is first resolved as a constructor dependency by one of the additional service providers, and is then requested directly and can be resolved by a previous provider in the chain, you will get a different instance. Here is the scenario:
  1. the initial provider knows to map I<> to M<>
  2. you add a new singleton mapping from X to Y and Y gets a constructor parameter of type I<Z>
  3. you request an instance of X
  4. the first provider cannot resolve it
  5. the second provider can resolve it, therefore it will also resolve I<Z> as an M<Z> singleton instance
  6. you request an instance of I<Z>
  7. the first provider can resolve it, therefore it will return a NEW singleton instance of M<Z>

This is an edge case that I don't have the time to solve. So, with the caveat above, here is the final version.
Use it like this:
// IAdvancedServiceProvider either injected 
// or resolved via serviceProvider.GetService<IAdvancedServiceProvider>
// or even serviceProvider as IAdvancedServiceProvider
advancedServiceProvider.ServiceCollection.AddSingleton...
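
For a fuller, hypothetical wiring (again using the invented IGreeter/Greeter pair), the flow would look like this:
var services = new ServiceCollection();
// the provider registers itself as IAdvancedServiceProvider in its constructor
var provider = new AdvancedServiceProvider(services);

// later, a type is discovered at runtime and registered on the fly
var advanced = (IAdvancedServiceProvider)provider.GetService(typeof(IAdvancedServiceProvider));
advanced.ServiceCollection.AddSingleton<IGreeter, Greeter>();

// the next resolution builds an additional provider in the chain and finds the new service
var greeter = (IGreeter)provider.GetService(typeof(IGreeter));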

And this is the source code:
/// <summary>
/// Service provider that allows for dynamic adding of new services
/// </summary>
public interface IAdvancedServiceProvider : IServiceProvider
{
/// <summary>
/// Add services to this collection
/// </summary>
IServiceCollection ServiceCollection { get; }
}
 
/// <summary>
/// Service provider that allows for dynamic adding of new services
/// </summary>
public class AdvancedServiceProvider : IAdvancedServiceProvider, IDisposable
{
private readonly List<ServiceProvider> _serviceProviders;
private readonly NotifyChangedServiceCollection _services;
private readonly object _servicesLock = new object();
private List<ServiceDescriptor> _newDescriptors;
private Dictionary<Type, object> _resolvedObjects;
 
/// <summary>
/// Initializes a new instance of the <see cref="AdvancedServiceProvider"/> class.
/// </summary>
/// <param name="services">The services.</param>
public AdvancedServiceProvider(IServiceCollection services)
{
// registers itself in the list of services
services.AddSingleton<IAdvancedServiceProvider>(this);
 
_serviceProviders = new List<ServiceProvider>();
_newDescriptors = new List<ServiceDescriptor>();
_resolvedObjects = new Dictionary<Type, object>();
_services = new NotifyChangedServiceCollection(services);
_services.ServiceAdded += ServiceAdded;
_serviceProviders.Add(services.BuildServiceProvider(true));
}
 
private void ServiceAdded(object sender, ServiceDescriptor item)
{
lock (_servicesLock)
{
_newDescriptors.Add(item);
}
}
 
/// <summary>
/// Add services to this collection
/// </summary>
public IServiceCollection ServiceCollection { get => _services; }
 
/// <summary>
/// Gets the service object of the specified type.
/// </summary>
/// <param name="serviceType">An object that specifies the type of service object to get.</param>
/// <returns>A service object of type serviceType. -or- null if there is no service object of type serviceType.</returns>
public object GetService(Type serviceType)
{
lock (_servicesLock)
{
// go through the service provider chain and resolve the service
var service = GetServiceInternal(serviceType);
// if service was not found and we have new registrations
if (service == null && _newDescriptors.Count > 0)
{
// create a new service collection in order to build the next provider in the chain
var newCollection = new ServiceCollection();
foreach (var descriptor in _services)
{
foreach (var descriptorToAdd in GetDerivedServiceDescriptors(descriptor))
{
((IList<ServiceDescriptor>)newCollection).Add(descriptorToAdd);
}
}
var newServiceProvider = newCollection.BuildServiceProvider(true);
_serviceProviders.Add(newServiceProvider);
_newDescriptors = new List<ServiceDescriptor>();
service = newServiceProvider.GetService(serviceType);
}
if (service != null)
{
_resolvedObjects[serviceType] = service;
}
return service;
}
}
 
private IEnumerable<ServiceDescriptor> GetDerivedServiceDescriptors(ServiceDescriptor descriptor)
{
if (_newDescriptors.Contains(descriptor))
{
// if it's a new registration, just add it
yield return descriptor;
yield break;
}
 
if (!descriptor.ServiceType.IsGenericTypeDefinition)
{
// for a non open type generic singleton descriptor, register a factory that goes through the service provider
yield return ServiceDescriptor.Describe(
descriptor.ServiceType,
_ => GetServiceInternal(descriptor.ServiceType),
descriptor.Lifetime
);
yield break;
}
// if the registered service type for a singleton is an open generic type
// we register as factories all the already resolved specific types that fit this definition
if (descriptor.Lifetime == ServiceLifetime.Singleton)
{
foreach (var servType in _resolvedObjects.Keys.Where(t => t.IsGenericType && t.GetGenericTypeDefinition() == descriptor.ServiceType))
{
 
yield return ServiceDescriptor.Describe(
servType,
_ => _resolvedObjects[servType],
ServiceLifetime.Singleton
);
}
}
// then we add the open type registration for any new types
yield return descriptor;
}
 
private object GetServiceInternal(Type serviceType)
{
foreach (var serviceProvider in _serviceProviders)
{
var service = serviceProvider.GetService(serviceType);
if (service != null)
{
return service;
}
}
return null;
}
 
/// <summary>
/// Dispose the provider and all resolved services
/// </summary>
public void Dispose()
{
lock (_servicesLock)
{
_services.ServiceAdded -= ServiceAdded;
foreach (var serviceProvider in _serviceProviders)
{
try
{
serviceProvider.Dispose();
}
catch
{
// singleton classes might be disposed twice and throw some exception
}
}
_newDescriptors.Clear();
_resolvedObjects.Clear();
_serviceProviders.Clear();
}
}
 
/// <summary>
/// An IServiceCollection implementation that exposes a ServiceAdded event for added service descriptors
/// The collection doesn't support removal or inserting of services
/// </summary>
private class NotifyChangedServiceCollection : IServiceCollection
{
private readonly IServiceCollection _services;
 
/// <summary>
/// Fired when a descriptor is added to the collection
/// </summary>
public event EventHandler<ServiceDescriptor> ServiceAdded;
 
/// <summary>
/// Initializes a new instance of the <see cref="NotifyChangedServiceCollection"/> class.
/// </summary>
/// <param name="services">The services.</param>
public NotifyChangedServiceCollection(IServiceCollection services)
{
_services = services;
}
 
/// <summary>
/// Get the value at index
/// Setting is not supported
/// </summary>
public ServiceDescriptor this[int index]
{
get => _services[index];
set => throw new NotSupportedException("Inserting services in collection is not supported");
}
 
/// <summary>
/// Count of services in the collection
/// </summary>
public int Count { get => _services.Count; }
 
/// <summary>
/// Obviously not
/// </summary>
public bool IsReadOnly { get => false; }
 
/// <summary>
/// Adding a service descriptor will fire the ServiceAdded event
/// </summary>
/// <param name="item"></param>
public void Add(ServiceDescriptor item)
{
_services.Add(item);
ServiceAdded?.Invoke(this, item);
}
 
/// <summary>
/// Clear the collection is not supported
/// </summary>
public void Clear() => throw new NotSupportedException("Removing services from collection is not supported");
 
/// <summary>
/// True is the item exists in the collection
/// </summary>
public bool Contains(ServiceDescriptor item) => _services.Contains(item);
 
/// <summary>
/// Copy items to array of service descriptors
/// </summary>
public void CopyTo(ServiceDescriptor[] array, int arrayIndex) => _services.CopyTo(array, arrayIndex);
 
/// <summary>
/// Enumerator for service descriptors
/// </summary>
public IEnumerator<ServiceDescriptor> GetEnumerator() => _services.GetEnumerator();
 
/// <summary>
/// Index of item in the list
/// </summary>
public int IndexOf(ServiceDescriptor item) => _services.IndexOf(item);
 
/// <summary>
/// Inserting is not supported
/// </summary>
public void Insert(int index, ServiceDescriptor item) => throw new NotSupportedException("Inserting services in collection is not supported");
 
/// <summary>
/// Removing items is not supported
/// </summary>
public bool Remove(ServiceDescriptor item) => throw new NotSupportedException("Removing services from collection is not supported");
 
/// <summary>
/// Removing items is not supported
/// </summary>
public void RemoveAt(int index) => throw new NotSupportedException("Removing services from collection is not supported");
 
/// <summary>
/// Enumerator for objects
/// </summary>
IEnumerator IEnumerable.GetEnumerator() => ((IEnumerable)_services).GetEnumerator();
}
}

Stranger Than We Can Imagine feels like a companion book to the 2002 documentary The Century of the Self. Both are really well done, both discuss the abrupt changes that define the 20th century, and they complement each other in content. I recommend them highly to just about everyone except maybe little children.

John Higgs starts the book by comparing history to a landscape and the works describing it to roads. There are well trodden paths in this landscape, but also deep forests where few dare enter. He then promises that his book will try to describe the twentieth century by exploring these dark places, avoided by others. I didn't feel that was completely the case, but it is certainly a novel path to take in explaining history: Einstein, Heisenberg, Gödel, Lorenz, Mandelbrot, Freud, Picasso, Dalí, Joyce, Leary, Stravinsky, Crowley, Thatcher, The Rolling Stones, Miyamoto and so on. Its basic premise is that an abrupt change occurred at the beginning of the 20th century, when the general belief in absolutes (which he generically calls omphaloi) was replaced with relativism and individuality.

How would classical empires survive these changes, when at their core stands the belief in a supreme leader, representing and supported by a supreme god, who protects and enforces rules that are culturally accepted by everyone? They would not, hence the world wars that ended them. What absolute pillar of belief would survive general relativity, the uncertainty principle, quantum mechanics, the incompleteness theorems, the id and individualism, impressionism, cubism, modernism, postmodernism and, finally, the corporation? None of them. Religion does not so much die as break apart into small fragments that then fade away. Morality shakes under the reign of individual desires and psychopathic legal entities. Social norms, economic behavior, even the foundations of money are wiped out and replaced with the new. Art fractures as well, constantly redefining and contradicting itself and everything else. It is the century in which value exists only when seen from certain perspectives and nothing has any intrinsic value.

The book ends with a chapter that heralds the coming of a new age, the 21st century: the Internet and the erosion of the last remaining omphalos, truth. If truth also depends on the observer, if there is no one truth, if science is just a belief like any other, what awaits us in the post-truth era?

Overall it is a very interesting and informative book. More than simply stating facts, it is the unexpected connections between things that bring value to the reader, which is rather fitting considering the subject. Maybe it does not go into the depths of the dark forests, but it certainly explores their edges and the strange beings that live there. Top marks!

Long story short, check out this answer to a StackOverflow question: Package destination of restore of .net-core projects is always global package directory

The scenario is this: you are used to .NET Framework projects, for which Visual Studio restored NuGet packages into a packages folder in the solution folder, and then you switch to .NET Core. No packages folder! You google it and find that there is a global folder in your user's profile where NuGet downloads all of these packages, that .NET Core uses it by default and that you can change this behavior based on a property in nuget.config. So here are the issues you have to take into account:
  1. There are two ways of configuring NuGet packages for your projects: a packages.config file and PackageReference elements in your .csproj file
  2. The name of the property you need to set is different based on the type of configuration: repositoryPath and globalPackagesFolder, respectively
  3. There are two formats for configuring nuget properties: using the add element inside the config element and using the repositoryPath element inside the settings element
  4. The format that worked for me in VS 2017 was the config element
  5. There are two locations for the nuget.config file: in a .nuget folder inside your solution folder and directly in the solution folder (or any of its parent folders)
  6. The location that is accepted by the latest versions of NuGet is directly in the solution folder
  7. Sometimes you need to restart Visual Studio for the change in nuget.config files to be considered
  8. The path you specify is relative to the nuget.config file, no matter where it is

A bit of an overkill, but try this as the beginning of your nuget.config file that sits next to the .sln file in your solution:
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <config>
    <add key="repositoryPath" value="./packages" />
    <add key="globalPackagesFolder" value="./packages" />
  </config>
  <settings>
    <repositoryPath>./packages</repositoryPath>
  </settings>
  ...
</configuration>

Deathcaster is the final book in the Shattered Realms series, or at least it should be, since it kills off the villain and has everybody live happily ever after. It has one of the least satisfying endings I've read in a long time.

Cinda Williams Chima started slowly, creating a complex world of realms, magic and a multitude of characters and factions. She spent two books on that. The third book, Stormcaster, was about introducing a powerful and mysterious villain and yet more characters, realms and factions. Deathcaster pretty much ends it all in a place unknown until then, at a random time, for a completely random reason. Imagine Luke Skywalker walking around, playing with his sword, thinking about how to defeat the Death Star and accidentally bumping into and killing the Emperor and Darth Vader both. That is how this book feels: after wading through a zillion people, their feelings described in detail while any military or political strategy is explained (poorly) in a paragraph or two, through their relationships with other people, through their random interactions that always seem to bring them together for no apparent reason and then split them apart just as randomly, the villain basically stumbles and falls on their sword.

There is nothing interesting that actually happens, no moral in any of the stories, and the development of the characters amounts to little more than beefing up and aging a few years.

Bottom line: the ending of this book makes the reading of the three previous books and this one feel like a complete waste of time. How do you rate a book that makes all the previous ones unrateable? Cinda, you're a troll!

The Flight of Morpho Girl is a short story set in the Wild Cards universe. If you haven't read the books until now, you won't know who the characters are. Even so, this story is so basic that it feels like "The Unsuccessful Mugging of Batman" or "Murder of Crows v Superman": predictable and stakeless.

That doesn't mean that the authors didn't do a good job, it's just that this is a short that brings nothing to the table other than the introduction of Morpho Girl's (Adesina, the teenage daughter of Amazing Bubbles) post-cocoon form: a teenage girl with very tough butterfly wings. For me it's like a collectible item in the Wild Cards set, nothing more.

I have to admit that my expectations for this book were so high that it was probably doomed to not satisfy me. I was expecting something deeply Asian, with fantastic elements and fresh ideas and characters. What I got is something that is almost accidentally fantastical and has few cultural elements to make it fresh. Yet it does have interesting characters and, if it weren't for the plot, which meanders whichever way the author needs to further her agenda, it would have been a good book.

Joan He is an American of Chinese descent (hence the name of the book?) and the culture described in Descendant of the Crane is based on an American's understanding of Chinese culture. That makes it both relatable and less Asian than I would have liked. What do I know, though? My feeling was that the author was exploring her own understanding of her origins instead of sharing something solid with the reader. There were some very intriguing ideas in the book, but they rode the story and the characters too hard, making them inconsistent and irrational. This is an almost-maybe book for me.

Bottom line: even without getting a lot of satisfaction out of it, I feel two stars out of five is too little, yet I am certain three is too much.

This was one of the few Brandon Sanderson books that I hadn't read and it is, at least at this moment, a standalone book. I know the title says that The Rithmatist is part of a series, but what successful book isn't? In truth, Sanderson started this book while working (and failing) on another and it took years to rethink and rewrite it into a finished story. It's best to take it as one of those wonderful accidents that successfully reach the reader despite the publishing industry.

Back to the book: it's almost classic Sanderson. The main character is young, passionate and intelligent, yet lacking power. Everybody else has it, though, and he is fascinated by it, while studying at a prestigious school that also teaches the techniques of power. In this gearpunk-like novel, the power is magical and involves drawing lines with chalk and imbuing them with personal will. The lines then become defenses, attacks, weapons and even magical minions. When the school is attacked and his friends are in danger, it is up to him to solve the mystery.

It's obvious that the story has issues, and that is probably why Sanderson worked so much on it to make it publishable, but one can get past them quickly. The characters are not as funny or punny as his others, or even very complex, being satisfied to have one or two goals in life and go for them (kind of like magical chalk drawings themselves, hmmm). The plot, involving the awesome power of carrying a piece of chalk with you and going through amazing duels that resemble tower defense games, is also not the most captivating. Yet the story kind of works.

Bottom line: a pleasant Brandon Sanderson classic, without being exceptional in any way.

I thought long and hard about how to start this review and I think the best place is the ending. In July 2015 the Internet and even the classic media exploded with the news of the American space probe New Horizons reaching the last unexplored classical planet of the solar system: Pluto. Long thought to be a frozen piece of rock floating too far from the Sun to be of any interest, it had been ignored by NASA and every other space agency out there, only to be revealed as one of the most intriguing and beautiful gems of our system. New Horizons had been launched in 2006 and it took one more year after the flyby to get all the Pluto data back to Earth. As far as the general population was concerned, and even most of the people passionate about space, this was an axis in time with three major events: launch, flyby and end of data download. As far as the media was concerned, it was a great discovery because it produced memeable pictures (the Pluto heart being one, for example).

Chasing New Horizons starts in 1989, with Alan Stern deciding he would work toward a NASA mission to Pluto. It takes us through the Herculean job of creating interest, gathering people, drafting a project, finding support and funding, fighting competing teams, bureaucracy, political apathy and even bad will, through the ups and downs, the almost-theres, the Sisyphean and thankless work of getting something up the hill only to see it roll back down because of a change in administration, or a cut in the budget, or some hidden agenda, and even people petty enough to demote Pluto's status as a planet as a personal grudge against the person who discovered it.

I liked how the book was written, even if at the start I had to get past the usual platitudes meant to garner interest in space from the average reader and had to cope with the American units of measurement: feet, miles, Fahrenheit and, of course, the bus-size. However, no matter the small faults in the writing, the subject is so important that I can't rate the book lower than the maximum. This is a must read, even if it skirts the technical aspects and mostly discusses the 25 years of work from a managerial standpoint.

It's hard to describe how awesome these people are. Can you imagine working for a quarter of a century on something that could fail abruptly, with no positive outcome, at literally any moment? I thought I had it bad when a project I had been working on for six months was cancelled - imagine having to go through something like that several times, at the exact moment you thought it would all sail smoothly from then on, when you had the funding, the assurances and even the construction of the probe nearly finished. Four days before one critical moment, the flyby, when New Horizons was supposed to do almost all of its work, the on-board computer rebooted and lost all previously uploaded programming. In those four days, people had to scramble to recreate the entire software package they had been working on incrementally for the last 9 years and upload it to a machine that was 9 light hours away from Earth. The launch was another of the most critical moments of the mission (after 16 years of ground work to make it happen): the mission planners had no control over the mechanism of the launch vehicle, which could have blown up on the pad or in the air.

There would be no redos. First, there is no way the project would have been approved again after a failure so senseless. Second, Pluto would have moved into a region of its 248-year orbit where its atmosphere would freeze, making any future probe return much less interesting data than at that exact moment. Had it failed, it would have been the first and probably last APL planetary exploration mission, after they had fought tooth and nail to be the ones doing it, rather than the usual and entrenched JPL. People had spent their entire careers working on this thing and it could have failed in so many different ways.

Bottom line: you have to be insane to do what Stern did. A wonderful flavor of insanity that is both admirable and terrifying. The system behind NASA should value and support these people even if, especially if, they are insanely driven enough that they don't actually need it. I would say that New Horizons succeeded despite the American space industry and political system, not because of them. It really shouldn't be that hard. This is a book for all space fans, but also for people who have had difficulties in their own projects. While it might help with specific insights, this book will make almost any hardship you have ever endured seem insignificant.

We already know how to load types in .NET Framework and we know what they say we should use in .NET Core. But what about .NET Standard? Is that a trick question? Sort of. Right now we have two .NET Standard versions and three .NET Core versions, although .NET Core 3 is still in preview. The signature of AssemblyLoadContext and the way it is used have changed dramatically: Core 3 enables context unloading, but Standard 2 does not. So you are either forced to build your library as Core 3, or you have to give up unloading contexts, or you have to use reflection, which is not robust and will probably not be needed anyway with the possible arrival of Standard 3.

But there are subtler issues at work. One of them is that, at least with .NET Core 3 Preview 6, when you reference System.Runtime.Loader in a Standard library so that you can access AssemblyLoadContext, you get conflicts between the System.Runtime you are using and the one referenced by System.Runtime.Loader. The only solution is to use the System.Runtime.Loader NuGet package, but that takes you back to the Standard 2 version of AssemblyLoadContext, even if the library version is higher!

The setup is this: I have an ITestInterface interface which resides in TestInterfaceLibrary.dll. I also have a TestImplementation class that can be found in TestImplementationLibrary.dll and implements ITestInterface. My program either does not reference any of these libraries or references only the interface one. The task is to load both of these types and then simply convert one instance of TestImplementation to ITestInterface. A simple test would be loading the types and then expecting interfaceType.IsAssignableFrom(implementationType) to be true.
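
For reference, the two test libraries would contain something like this (my guess at their minimal shape, since the original sources are not shown; the DoSomething member is invented):
// TestInterfaceLibrary.dll
namespace TestInterfaceLibrary
{
    public interface ITestInterface
    {
        void DoSomething();
    }
}

// TestImplementationLibrary.dll, which references TestInterfaceLibrary
namespace TestImplementationLibrary
{
    public class TestImplementation : TestInterfaceLibrary.ITestInterface
    {
        public void DoSomething() { }
    }
}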

Core 3


Let's first try the Core 3 way:
var context = new AssemblyLoadContext("testContext", true);
 
var interfaceAssembly = context.LoadFromAssemblyPath(interfaceAssemblyPath);
var interfaceType = interfaceAssembly.GetType("TestInterfaceLibrary.ITestInterface");
Console.WriteLine(interfaceType?.ToString()??"interface type not loaded");
 
var implementationAssembly = context.LoadFromAssemblyPath(implementationAssemblyPath);
var implementationType = implementationAssembly.GetType("TestImplementationLibrary.TestImplementation");
Console.WriteLine(implementationType?.ToString() ?? "implementation type not loaded");
 
Console.WriteLine("implementation implements interface: "+interfaceType.IsAssignableFrom(implementationType));
 
context.Unload();
The output is:
TestInterfaceLibrary.ITestInterface
TestImplementationLibrary.TestImplementation
implementation implements interface: True

It works! But only because the interface assembly is loaded first. If you try to load just the implementation type first, it will come up as empty. There are no exceptions thrown unless you get all the assembly types or specify the throwOnError parameter in GetType. The exception is "System.IO.FileNotFoundException: 'Could not load file or assembly 'TestInterfaceLibrary, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null'. The system cannot find the file specified.'".

In order to solve this, we need to use the Resolving event of the AssemblyLoadContext class. Let's try this:
var context = new AssemblyLoadContext("testContext", true);
context.Resolving += Context_Resolving;
 
var implementationAssembly = context.LoadFromAssemblyPath(implementationAssemblyPath);
var implementationType = implementationAssembly.GetType("TestImplementationLibrary.TestImplementation", true);
Console.WriteLine(implementationType?.ToString() ?? "implementation type not loaded");
 
var interfaceAssembly = context.LoadFromAssemblyPath(interfaceAssemblyPath);
var interfaceType = interfaceAssembly.GetType("TestInterfaceLibrary.ITestInterface", true);
Console.WriteLine(interfaceType?.ToString() ?? "interface type not loaded");
 
Console.WriteLine("implementation implements interface: " + interfaceType.IsAssignableFrom(implementationType));
 
context.Resolving -= Context_Resolving;
context.Unload();
 
private static Assembly Context_Resolving(AssemblyLoadContext context, AssemblyName assemblyName)
{
var expectedPath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, assemblyName.Name + ".dll");
return context.LoadFromAssemblyPath(expectedPath);
}

And now it works again, assuming the assembly name is the same as the assembly file name and that the file is found in the same place.

But... if we try this in different contexts:
var context = new AssemblyLoadContext("testContext", true);
context.Resolving += Context_Resolving;
 
var implementationAssembly = context.LoadFromAssemblyPath(implementationAssemblyPath);
var implementationType = implementationAssembly.GetType("TestImplementationLibrary.TestImplementation", true);
Console.WriteLine(implementationType?.ToString() ?? "implementation type not loaded");
 
context.Resolving -= Context_Resolving;
context.Unload();
context = new AssemblyLoadContext("testContext2", true);
context.Resolving += Context_Resolving;
 
var interfaceAssembly = context.LoadFromAssemblyPath(interfaceAssemblyPath);
var interfaceType = interfaceAssembly.GetType("TestInterfaceLibrary.ITestInterface", true);
Console.WriteLine(interfaceType?.ToString() ?? "interface type not loaded");
 
Console.WriteLine("implementation implements interface: " + interfaceType.IsAssignableFrom(implementationType));
 
context.Resolving -= Context_Resolving;
context.Unload();
the output will show
implementation implements interface: False

This means that if we want to encapsulate this in a TypeLoader class or something, we cannot use different contexts for dynamically loading types. Even if we had one context that we would unload in order to refresh all the types, it could still be different from the main context, in case the interface is loaded twice or referenced directly in the project.

For example, if you reference TestInterfaceLibrary directly and you load TestImplementation dynamically, it will work as expected, because ITestInterface is resolved automatically from the main context. However, if you load ITestInterface dynamically too, it will be a different type from the referenced ITestInterface, even if they apparently have the same name, full name and assembly qualified name! So it kind of makes sense not to load a type twice. Is this where context unloading comes in? Not really. Let's define a method that counts the number of types with a certain name in the current domain as
private static int CountTypes(string typeName)
{
    return AppDomain.CurrentDomain.GetAssemblies()
        .SelectMany(assembly => assembly.GetTypes().Where(t => t.FullName == typeName))
        .Count();
}

And now let's run this code:
var context = new AssemblyLoadContext("testContext", true);
context.Resolving += Context_Resolving;
 
var referencedInterfaceType = typeof(ITestInterface);
Console.WriteLine(referencedInterfaceType?.ToString() ?? "interface type not loaded");
 
var interfaceAssembly = context.LoadFromAssemblyPath(interfaceAssemblyPath);
var interfaceType = interfaceAssembly.GetType("TestInterfaceLibrary.ITestInterface", true);
Console.WriteLine(interfaceType?.ToString() ?? "interface type not loaded");
 
Console.WriteLine($"Types are the same: {interfaceType==referencedInterfaceType}");
 
Console.WriteLine($"Number of types with name {interfaceType.FullName}: {CountTypes(interfaceType.FullName)}");
 
context.Resolving -= Context_Resolving;
context.Unload();
Console.WriteLine($"Number of types with name {interfaceType.FullName}: {CountTypes(interfaceType.FullName)}");

There is the referenced type, then we load the type dynamically again, inside a new context. We count the types loaded in the current domain, we unload the context, we count the types again. The result is
TestInterfaceLibrary.ITestInterface
TestInterfaceLibrary.ITestInterface
Types are the same: False
Number of types with name TestInterfaceLibrary.ITestInterface: 2
Number of types with name TestInterfaceLibrary.ITestInterface: 2
The types are always 2!

Bottom line, even when unloading the AssemblyLoadContext, the types used are not unloaded and trying to find a type by name will result in duplicates.

OK, so let's just agree that types with the same name, once loaded, should remain there and no other type with the same name should be loaded. Let's try to incorporate this into a TypeLoader class:
public class TypeLoader : IDisposable
{
    private readonly AssemblyLoadContext _context;

    public TypeLoader()
    {
        _context = new AssemblyLoadContext(GetType().FullName, true);
        _context.Resolving += Context_Resolving;
    }

    private Assembly Context_Resolving(AssemblyLoadContext context, AssemblyName assemblyName)
    {
        var expectedPath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, assemblyName.Name + ".dll");
        return context.LoadFromAssemblyPath(expectedPath);
    }

    public Type LoadType(string typeName, string assemblyPath)
    {
        var type = AppDomain.CurrentDomain.GetAssemblies()
            .SelectMany(assembly => assembly.GetTypes().Where(t => t.FullName == typeName))
            .FirstOrDefault();
        if (type != null)
        {
            return type;
        }
        var assembly = _context.LoadFromAssemblyPath(assemblyPath);
        return assembly.GetType(typeName, true);
    }

    public void Dispose()
    {
        _context.Resolving -= Context_Resolving;
        _context.Unload();
    }
}

The code in our test is now much clearer:
var interfaceAssemblyPath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "TestInterfaceLibrary.dll");
var implementationAssemblyPath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "TestImplementationLibrary.dll");
var interfaceTypeName = "TestInterfaceLibrary.ITestInterface";
var implementationTypeName = "TestImplementationLibrary.TestImplementation";
 
using (var loader = new TypeLoader())
{
Type referencedType = typeof(TestInterfaceLibrary.ITestInterface);
var interfaceType = loader.LoadType(interfaceTypeName, interfaceAssemblyPath);
var implementationType = loader.LoadType(implementationTypeName, implementationAssemblyPath);
Console.WriteLine($@"
referenced type: {referencedType}
interface type: {interfaceType}
implementation type: {implementationType}
referenced and loaded interfaces are the same: {referencedType == interfaceType}
interface implemented: {interfaceType.IsAssignableFrom(implementationType)}"
);
}
and the result is
referenced type: TestInterfaceLibrary.ITestInterface
interface type: TestInterfaceLibrary.ITestInterface
implementation type: TestImplementationLibrary.TestImplementation
referenced and loaded interfaces are the same: True
interface implemented: True

But we still use Unload. Maybe some day it will work the way I want it to, but until then, why not get rid of Unload and make TypeLoader a class in a Standard 2 library?

Standard 2


For this I will create a new Standard 2 library project and then reference it in our test Core 3 project. Then I will move the TypeLoader class in the library project.

The errors in the library project are related to not knowing what an AssemblyLoadContext is, therefore the first solution is to reference System.Runtime.Loader from the framework. I get the immediate error "Assembly 'System.Runtime.Loader' with identity 'System.Runtime.Loader, Version=4.1.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a' uses 'System.Runtime, Version=4.2.1.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a' which has a higher version than referenced assembly 'System.Runtime' with identity 'System.Runtime, Version=4.1.2.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a".

Solution 2: add the System.Runtime.Loader NuGet package, which at the time of writing is at version 4.3.0. The error is now gone, but several things become immediately apparent:
  1. the Unload method doesn't exist anymore
  2. the constructor doesn't receive a name and a bool anymore
  3. AssemblyLoadContext is now abstract

In order to solve this I am creating a DynamicAssemblyLoadContext class that inherits from AssemblyLoadContext, just returns null from the Load method override, and gets an Unload method and a constructor with a string and a bool that do nothing. And it works again. The updated TypeLoader class is now:
public class TypeLoader : IDisposable
{
    private readonly DynamicAssemblyLoadContext _context;

    public TypeLoader()
    {
        _context = new DynamicAssemblyLoadContext(GetType().FullName, true);
        _context.Resolving += Context_Resolving;
    }

    private Assembly Context_Resolving(AssemblyLoadContext context, AssemblyName assemblyName)
    {
        var expectedPath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, assemblyName.Name + ".dll");
        return context.LoadFromAssemblyPath(expectedPath);
    }

    public Type LoadType(string typeName, string assemblyPath)
    {
        var type = AppDomain.CurrentDomain.GetAssemblies()
            .SelectMany(ass => ass.GetTypes().Where(t => t.FullName == typeName))
            .FirstOrDefault();
        if (type != null)
        {
            return type;
        }
        var assembly = _context.LoadFromAssemblyPath(assemblyPath);
        return assembly.GetType(typeName, true);
    }

    public void Dispose()
    {
        _context.Resolving -= Context_Resolving;
        _context.Unload();
    }


    private class DynamicAssemblyLoadContext : AssemblyLoadContext
    {
        public DynamicAssemblyLoadContext(string name, bool isCollectible)
        {
        }

        protected override Assembly Load(AssemblyName assemblyName)
        {
            return null;
        }

        public void Unload()
        {
        }
    }
}

The safe way


The code above has an issue, though. If the interface type is dynamically loaded before its referenced type is used, this fails again. This is the case when you use dependency injection: you dynamically load the types in order to register the implementation against the interface, but then, when you ask for a resolution of the interface type, now referenced by the main project, you get another type named just the same.

The safe way, considering that we don't really use Unload and don't count on it ever working, is to use the default context, the one where everything loads, and be done with it. When you do that, the code becomes a little uglier, but it works in all situations.

Final version.
public class TypeLoader
{
    private readonly object _resolutionLock = new object();

    private Assembly Context_Resolving(AssemblyLoadContext context, AssemblyName assemblyName)
    {
        var expectedPath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, assemblyName.Name + ".dll");
        return context.LoadFromAssemblyPath(expectedPath);
    }

    public Type LoadType(string typeName, string assemblyPath)
    {
        var context = AssemblyLoadContext.Default;
        lock (_resolutionLock)
        {
            var type = AppDomain.CurrentDomain.GetAssemblies()
                .SelectMany(ass => ass.GetTypes().Where(t => t.FullName == typeName))
                .FirstOrDefault();
            if (type != null)
            {
                return type;
            }
            context.Resolving += Context_Resolving;
            try
            {
                var assembly = context.LoadFromAssemblyPath(assemblyPath);
                return assembly.GetType(typeName, true);
            }
            finally
            {
                context.Resolving -= Context_Resolving;
            }
        }
    }
}

You just gotta hate that adding and removing the event inside a lock, right? Well, if you find a better solution, let me know.

The Blood Mirror felt like the weakest book in the series, but really, if I think about it, it's the pattern that unfolded throughout the entire Lightbringer saga that feels wrong. The first book was amazing, with interesting characters, great world building and an intriguing story, but then came the second book - and I didn't see it then - which upended many of the concepts of the first and added many more. It was not a continuation, per se, but a reframing of the story with other parameters. Instead of closing story arcs, Brent Weeks was transforming them, keeping them open and adding many more. The third book made this pattern obvious and in this book it became annoying.

Forget that everybody is a relative of everybody else or, in the extreme, a member of an organization that we didn't know existed or care about in previous books. Forget that after we follow a character as one thing, we have to follow them as something completely different in the next book, because of reasons that we didn't know (or care) about. Forget even that threatening someone's loved ones seems to control everything with maximum efficiency in this universe, while actually harming them is a forgivable offense. Nothing. Ever. Ends. It just piles on. And since there is limited space in the book, important things - like the war, or what people actually do when the entire establishment blows up in their faces and destroys their lives - get sidelined or completely eliminated in favor of whatever insecurity Kip feels while discussing hot sex with his friends or his amazingly beautiful (and totally inconsistent) wife. And of course, the book ends in another cliffhanger.

In chess, when you are overwhelmed by the complexity of the position, you simplify it: you exchange pieces until the board is clearer. In Lightbringer, enemies just enjoy threatening each other and never following through, while working together toward some completely pointless goal. Just like in TV soap operas, they all hate and love each other at the same time, while things that could never have been predicted by the reader happen as chaotically as possible around them.

So, the fifth book will be published this year and I will read it, but my rating of the entire series has just plunged dramatically. I don't expect things to really come to any conclusion, I don't expect the characters to evolve in any meaningful way anymore or the lore behind it all to ever be explained. We started with seven colors and a god; now we have 11 colors and about 200 gods, for example. The chances that all of this mess will become clear in the future are remote.

The Broken Eye continues right after the shocking finale of The Blinding Knife! And that pretty much sucks, because that ending was the type of cliffhanger that just felt tacked on in order to make people quickly buy the next volume. Unfortunately, this book is no different. After a zillion story arcs that meet improbably and a lot of agitation one way or the other, Brent Weeks ends The Broken Eye with an even shockier (is that a word?) ending.

And I will bite, I will read the fourth book in the series, The Blood Mirror, but only because I find the characters intriguing. Yet I have definitely lost that feeling of respect for the story, for the careful attention to detail that I enjoyed so much in the first volume. Weeks is a good writer, maybe even a great one, but instead of the series getting better, it just gets bloated until it needs over the top twists and abrupt cliffhangers. One of the most pervasive feelings while reading this volume was frustration that the stories of the characters I wanted to follow were interrupted by all the others, and that each and every one of even the secondary heroes needed their own grand achievement, until it got claustrophobic. OK, you're the good guy, but when you see someone hurting everyone you know, you just kill them. You don't one-up them, you don't talk to them, you don't strategize or play games. OK, you're a powerful psycho, but that doesn't mean everything needs to be a power show. I mean, does Andross Guile even go to the bathroom, or does he just will his bowels into submission? OK, you are young and inexperienced so you don't know what to do when you love someone, but doing the exact opposite? And how come in this universe there are at most two degrees of separation? More like one and a half. And how come everyone knows what they need to do exactly when they need it, regardless of whether they ever learned it before?

I am already hooked on the story, and Brent Weeks creates a complex and compelling one, but the experience of reading the books is only diminished by cheap techniques like cliffhangers, hidden information and mindless expansions into new territories that absolutely did not need to be there. Too bad that now everything will need to at least maintain this insane level of tension and complexity, for fear of turning boring.

Bottom line: not bad, certainly not boring, but pointlessly exhausting.

This is more of a backup of the extensions that I have installed on the two main IDEs I'm using for my job: Visual Studio and Visual Studio Code.

Visual Studio Code


In order to list the extensions installed, use the command code --list-extensions. For me, this results in this output:

code --install-extension aeschli.vscode-css-formatter
code --install-extension Angular.ng-template
code --install-extension christian-kohler.npm-intellisense
code --install-extension cmstead.jsrefactor
code --install-extension cssho.vscode-svgviewer
code --install-extension danwahlin.angular2-snippets
code --install-extension dbaeumer.jshint
code --install-extension dbaeumer.vscode-eslint
code --install-extension donjayamanne.jquerysnippets
code --install-extension donjayamanne.jupyter
code --install-extension DotJoshJohnson.xml
code --install-extension ecmel.vscode-html-css
code --install-extension eg2.vscode-npm-script
code --install-extension esbenp.prettier-vscode
code --install-extension fabianlauer.vs-code-xml-format
code --install-extension fknop.vscode-npm
code --install-extension HookyQR.beautify
code --install-extension humao.rest-client
code --install-extension joelday.docthis
code --install-extension k--kato.docomment
code --install-extension michelemelluso.code-beautifier
code --install-extension minhthai.vscode-todo-parser
code --install-extension mohsen1.prettify-json
code --install-extension mrmlnc.vscode-scss
code --install-extension ms-mssql.mssql
code --install-extension ms-vscode.csharp
code --install-extension ms-vscode.typescript-javascript-grammar
code --install-extension ms-vscode.vscode-typescript-tslint-plugin
code --install-extension msjsdiag.debugger-for-chrome
code --install-extension naumovs.color-highlight
code --install-extension pmneo.tsimporter
code --install-extension rbbit.typescript-hero
code --install-extension robinbentley.sass-indented
code --install-extension wayou.vscode-todo-highlight

In order to install them, use code --install-extension [extension name] for each line.

Visual Studio


For Visual Studio, funnily enough, in order to export and import your extensions you need to use an extension: Extension Manager 2017, which on my system exports a file in the .vsext format:

{
"id": "5f191824-b8a6-47c0-9f96-f607dfd3c09b",
"name": "My Visual Studio extensions",
"description": "A collection of my Visual Studio extensions",
"version": "1.0",
"extensions": [
{
"name": ".NET Portability Analyzer",
"vsixId": "55d15546-28ca-40dc-af23-dfa503e9c5fe"
},
{
"name": "Advanced Installer for Visual Studio 2017",
"vsixId": "Caphyon.AdvancedInstaller.23debb5a-cff4-4b91-88bf-6280f72a7ebb"
},
{
"name": "Azure Data Lake and Stream Analytics Tools",
"vsixId": "1e906ff5-9da8-4091-a299-5c253c55fdc9"
},
{
"name": "Azure Functions and Web Jobs Tools",
"vsixId": "Microsoft.VisualStudio.Web.AzureFunctions"
},
{
"name": "BuildVision",
"vsixId": "837c3c3b-8382-4839-9c9a-807b758a929f"
},
{
"name": "Clean Code .NET",
"vsixId": "CleanCode.NET.9ecfa9bb-0775-48d0-9898-4dbbbd529fe3"
},
{
"name": "Cloud Explorer for VS 2017",
"vsixId": "Microsoft.VisualStudio.CloudExplorer"
},
{
"name": "Code Cracker for C#",
"vsixId": "CodeCracker.Vsix..5b99e64c-1418-4a06-990c-fd4cf01f4f63"
},
{
"name": "Code Graph",
"vsixId": "CodeAtlasVSIX.Company.df5456fb-08ea-4256-b5ff-ecdb3a512ad3"
},
{
"name": "CodeMaid",
"vsixId": "4c82e17d-927e-42d2-8460-b473ac7df316"
},
{
"name": "CommentCop",
"vsixId": "CommentCop..0521EE68-1A5D-4C78-9970-B6A46B03FA6D"
},
{
"name": "EntityFramework Reverse POCO Generator",
"vsixId": "EntityFramework_Reverse_POCO_Generator..d542a934-8bd6-4136-b490-5f0049d62033"
},
{
"name": "Extension Manager 2017",
"vsixId": "e83d71b8-8bfc-4e06-b145-b0388910c016"
},
{
"name": "Fix Mixed Tabs",
"vsixId": "FixMixedTabs.9f1d3050-b986-4b10-ae36-97c6efc5e968"
},
{
"name": "Fix Namespace",
"vsixId": "f073da8c-bb52-41f8-b95a-a6346b1a0b52"
},
{
"name": "MetricsAnalyzer",
"vsixId": "MetricsAnalyzer..8026235d-7afc-401b-8f45-ba8624a07ef5"
},
{
"name": "Microsoft Code Analysis 2017",
"vsixId": "4db2d63d-3320-4fbd-bf80-07f8d1500bd3"
},
{
"name": "Moq.Analyzers",
"vsixId": "Moq.Analyzers..c3c7e3f8-2407-428d-beef-c4557253517b"
},
{
"name": "Object Exporter",
"vsixId": "07fb5b16-f4be-4488-9a19-b4f36d2c05a6"
},
{
"name": "Output enhancer",
"vsixId": "VSOutputEnhancer.Nikolay Balakin.a06be4c3-f97e-425c-8a0d-bdef08ac2abb"
},
{
"name": "Power Commands for Visual Studio",
"vsixId": "PowerCommands.3ecdd89b-f985-483d-8c94-be37de4dc083"
},
{
"name": "Ref12",
"vsixId": "SLaks-Ref12-086C4CE4-7061-4B1F-BC77-B64E4ED71B8E"
},
{
"name": "Reference Conflicts Analyser",
"vsixId": "ff477521-e67b-4ca3-931f-3edf36125d28"
},
{
"name": "Regular Expression Tester Extension",
"vsixId": "a65d58d2-ead8-4eea-a47d-fa60865a6043"
},
{
"name": "ResolveUR - Resolve Unused References",
"vsixId": "637ba02c-3388-4e54-9051-3eea7c51b054"
},
{
"name": "Roslyn Security Guard",
"vsixId": "RoslynSecurityGuard..45fa56c2-16f1-4395-8c10-a5a460084018"
},
{
"name": "Roslynator 2017",
"vsixId": "9289a8ab-1bb6-496b-9992-9f7ea27f66a8"
},
{
"name": "Security Code Scan (for VS2017 and newer)",
"vsixId": "955196A7-ACBF-4F6B-820B-51B8507CE853"
},
{
"name": "Solution Error Visualizer",
"vsixId": "SolutionErrorVisualizer.a392f96b-6b33-4b53-b4bb-3376a05f986c"
},
{
"name": "SonarLint for Visual Studio 2017",
"vsixId": "SonarLint.36871a7b-4853-481f-bb52-1835a874e81b"
},
{
"name": "SQL Search",
"vsixId": "Redgate.SQLSearch.VSExtension.9BD7AEDA-C291-4702-8191-4189B099F3A9"
},
{
"name": "Target Framework Migrator",
"vsixId": "TargetFrameworkMigrator..4f7666b9-e62c-46a1-af25-21ab8742ef00"
},
{
"name": "Trailing Whitespace Visualizer",
"vsixId": "4c1a78e6-e7b8-4aa9-8812-4836e051ff6d"
},
{
"name": "Unit Test Boilerplate Generator",
"vsixId": "UnitTestBoilerplate.RandomEngy.ca0bb824-eb5a-41a8-ab39-3b81f03ba3fe"
},
{
"name": "Visual Studio IntelliCode",
"vsixId": "IntelliCode.VSIX.598224b2-b987-401b-8509-f568d0c0b946"
},
{
"name": "Visual Studio Spell Checker (VS2017 and Later)",
"vsixId": "43EA967E-0DE2-4136-8E52-C6DCFB5C2748"
},
{
"name": "Wix Toolset Visual Studio 2017 Extension",
"vsixId": "WixToolset.VisualStudioExtension.Dev15"
}
]
}

The Blinding Knife continues the story of Kip the bastard, Gavin/Dazen Guile the genius god-like Prism, and just about every other person alive, all mere mortals. It is just as entertaining as the first book, although more focused on action than lore. A lot of new concepts are explored here, like colors that are not on the spectrum but can be drafted, other gods, other chromatic skills, but, as fantasy focused on little boys has taught us, always unexplained, mysterious, too young to understand, people dying before they can finish their sentence, etc. I hate that cliché and I really hope people will stop using it so much. I am talking to you, Brent Weeks!

Anyway, I can't say anything more about the story, the style or the author than I did when I read the first book in the Lightbringer series. It's a continuous story, split into book-sized volumes. I will start reading the next book in the saga momentarily. I recommend the writing style and I like the attention to detail and the lore, although after a while the boy-genius recipe feels more and more like a Japanese manga and less like a real story.