
  So both Google and Microsoft are pushing for this PWA concept (Progressive Web Apps), where a normal web site can function like a native app if configured in a certain way. It's in its early stages, but it's supported by the major browsers and mobile operating systems. So I played around with the idea and now there is a new option in the menu of the blog: Notify me. What it does is let you grant notification permissions for the web page of the blog, after which you get a notification whenever I write a new post or update an old one. In order to remove the option you have to reset the notification permissions for the website yourself (there is no way to do it programmatically, yet).

  At the moment the registrations are saved in memory, so whenever the web site is restarted (like with a new update), notifications will stop being sent until you visit the web site again. I will work on persisting them to disk, but that will come later.

  Meanwhile, enjoy! Also, you can install the blog as an application, which makes little sense, really :) but it gives you the option of having it on your phone or in Windows, in the Start menu, pinned to the taskbar and everything.


Installing Windows 11 is a trip. Not only does it come with its own idea of how your taskbar and Start menu should look and feel, without giving you even the options Windows 10 had, but it also brings so much wisdom by telling you which applications you should use most. Like every single Microsoft Office app, plus the Office loader. Meanwhile, the apps you pinned are stuck at the end of the list. You have to scroll to get to them!

Easy to fix, you might think, so you right-click on the apps expecting to get an option to unpin and you do, and you get a loading prompt and then the menu closes and... nothing happens. Like at all. Even loading the app would have been a better result.

It turns out the answer is to press the Shift key and right-click the items; only then do you get a reasonable menu that gives you the option to unpin. No such luck as being able to enter an edit mode by long pressing, so you have to unpin every single useless thing from there until you get to the few things you actually want to use without, you know, pressing the Start key and then typing what you want.

Ugh! Every generation of designers out there becomes more and more idiotic. In the near future I am sure they will reach the conclusion that the entire digital experience of humanity can be condensed into a tridimensional metaverse infinite scroll and remove any other option.


  Imagine the fairy godmother teaching Cinderella the very basics of magic, warning her not to use it, then promptly vanishing forever. Then imagine Cinderella was French, right around the time of the French Revolution. That's the plot in a nutshell.

  All That Glitters has the hallmarks of greatness: good writing, a very interesting world and a character that grows with the reader. However, I found it really difficult to finish. I believe the reason was the telegraphing of the protagonist's suffering, making me think of all the horrid things that were going to happen to her, only for her to actually find rather convenient and facile ways of getting out of trouble.

  And I have to tell you that it was a weird feeling throughout. It made me feel guilty for fearing all the bad things that were predictably going to happen to the heroine and then resentful of Gita Trelease for letting her off the hook. I mean, this girl and her sister have to deal with the death of their parents, systemic classism, being dismissed for being women, having a violent addict and gambler of a brother who leeches from them even the money for food and rent, nobles, sorcerers and, of course, the worst of it all, romantic triangles! We can't miss those. And the only solution is a form of magic that feeds on one's sorrow and actual blood and only gives illusions in return.

  Now, of course, this is the first book in a bloody series, luckily a duology, at least for now. There are no standalone books anymore. Therefore the author has every opportunity to grow as a writer, torture her protagonist to her and the readers' hearts' content and settle the most important thing of them all: who is Camille going to marry? Can you imagine being able to turn anything iron into coins for a limited time and not once considering what (or who) else you can turn into what? Maybe that will happen in the next books, but I won't be reading them.

  Bottom line: a definite success of a debut, full of potential and value. However, it seems the author and I are focusing on different things in life and, even though we witness the same story, we only want to see the parts the other doesn't. I guess the book appeals more to the feminine side of the reader.


  Americans want to think of themselves as gods, the best of humanity, the all powerful rulers of the world. And the reason they get to think that is that we want them to be so. We entrust them with the faith of the world just like ordinary Russians believe Putin to be their savior. Yet once that faith is gone, so is their power, because with great power comes... pardon the sticky platitude... great responsibility.

  The U.S. economy is not resilient because of something they do, but because all the other economies anchor to it. It cannot fail, because then the world would fail. Yet one has to take care of said economy lest it become a joke no one believes in. Crises are losses of faith more than actual technical issues with whole economies.

  I will argue that the Americans did something right: they followed the money and indirectly attracted the science and the technology to maintain their growth. Now they have the responsibility to keep that growth going. It is not a given. Innovation needs to be nourished, risks be taken, solutions for new problems continuously found. But once you believe your own bullshit, that you're the best of them all, that you can't fail, that you need not do anything because your supremacy is ordained, you will fail and fail miserably.

  And no one actually wants that. Certainly not the Americans, with their horrendous privilege, which is national more than anything like race, gender, religion or sexual orientation, which they keep focusing on as a diversion. And no, it's not a conspiracy, it's the direction their thoughts must take in order to deflect from the truth. Americans are weird because they can't be anything but. And certainly nobody else wants the Americans to fail. Even "the enemies" like Iran or the vague terrorists, or China... they need the Americans to be where they are. Good or evil, they need to remain gods, otherwise the entire world belief structure would crumble. The U.S. is not the world, they are just the fixed point that Archimedes was talking about.

 It is complacency that will get us. Once we believe things are the way they are simply because they are, we stop making efforts. Ironically, the military-industrial complex that we like to malign is the only thing that dispels dreams, acts based on facts and pushes for world domination not because it is inherited or deserved, but because it must be fought for.

 Funny enough, it is the economic markets like the stock market that show what the world will become. Years of growth vanish like dreams if the market sentiment shifts. Growth is slow and long term, falls are short and immediate. The world is now hanging by a thread, on the belief that goodness is real, that Americans will save us all, but they need to act on it. Knee-jerk reactions and "we can't fail because we are right" discourse will not cut it. You guys need to lead, not just rule!

  In summary: monkey humans need an Alpha. In groups of people we have one person, in countries we have a government (or, for the stupid ones, a person) and in groups of countries, a country. The Alpha will first rise on their own strength, then on the belief of others in that strength, then on their ability to influence the beliefs of others. Finally they will lead as gods or die as devils. There are no alternatives.


  Chess is a game. In order for something to be called a game, it must be fun, it must be tailored to the level of the players and sometimes, especially nowadays, it needs to be exciting to an audience.

  Now, chess engines are fantastic at respecting the rules of chess and mating the king in the quickest possible way, but that's not a game anymore, it's a process. Occasionally people watch what computer engines are doing and notice the beauty in some of the ideas, but that beauty is coincidental, it has no value to the machine and "sparks no joy".

  I've been advocating for a while for training chess engines on other values, like beauty or excitement, but those are hard to quantify. So here is a list of values that I thought would be great to train chess engines on:

  1. player rating
    • which is great because it's constrained in time, so if someone is a GM, but completely drunk and hasn't slept for a week, the engine would adapt to their play at that time
    • I know that engines have a manual level configuration, but I doubt it was ever correctly modelled as an input. Most of the time, a random move is chosen from the list of best moves, which is not what I am suggesting here at all
  2. value and risk of a move
    • I know this sounds like what engines are doing now, but they are actually minimizing risk, not maximizing value
    • We also have the player rating to take into account now, so the calculation changes with the player! A move that would be negative because another perfect computer chess engine would take advantage of a minute flaw means nothing now, because there is no way an 1800 rated human will see it. And if they do, what a boost in confidence when they win and what pleasure in witnessing the moment!
  3. balance risk with the probability of winning
    • this is the best part. Riskier moves are more fun, but can cause one to lose. Allow a probability of the other player missing a move, based on what we have calculated above. We are actually adding a value of disrespect from the engine. It attempts to win despite the moves it makes, not because of them.

  What I am modelling here is not a computer that plays perfect chess, but a chess streamer. They gambit, they try weird stuff, they make moves that look good because they can think of what the other player or the audience are going to feel. They are min-maxing entertainment!

  A chess streamer is usually a guy rated around 2500, showing mercy and teaching when playing against lower rated players and trying entertaining strategies against equal or even better rated ones. They rate the level of a move, which is an essential metric for what strategies they are going to employ and what moves they are going to play. In other words, they never consider a move without taking context into account.

  Imagine a normal chess engine, using minimax or neural networks to determine how to win the game. Against another computer, the evaluation function is extremely important, since it limits the number of reasonable moves to one or two. Against a human noob, there are a lot of moves that will lead to a win. It is obvious that another metric is necessary to choose among them. That's how humans play!

  Short story shorter: use opponent rating to broaden the list of winning candidate moves, then filter them with a second metric that maximizes entertainment value.


  You've seen this before, either as a book or as a TV series or something similar: the hidden world of magic, the gatekeepers tasked to keep the veil over the eyes of the common folk, the particular technique that they use (in spite of many others in existence), the ethnic flavor of the inspiration, even the formulaic definitions of good and evil. As it stands, Ink & Sigil is a rather bland book, with very little original content, and what little there is is inspired by cultures other than the author's own.

  You see, it all happens in Scotland, where everybody speaks with a strong Scottish accent, even the goth lesbian battle seer girl who is Indian. And there is a magical world of the fae, separated from ours by... legal bindings, enforced by only five people in the whole world, who work for no particular reason, with few resources, and are themselves bound by inexplicable moral qualms. Every fae described is a horrid caricature, an average of the most common clichés. Every fight is fought exclusively with the particular magical trinkets specific to the gatekeepers and nothing else.

  So forgive me when I am not impressed by Ink & Sigil, another uninspired fantasy millionology which translates to a classic detective story with a little bit of magic and locale sprinkled for taste. It's as authentic as a Margarita in a Ruby Tuesday or a single malt whisky made in Texas.

  As for Kevin Hearne, I didn't know who he was, but I could feel he was not Scottish in any way, shape or form. Not because I am an expert in the culture of Scotland, mind you, but because it was obvious. It was funny how American the world view was, even when badmouthing Americans, people who leer when they see an attractive girl or, God forbid, are racist. The author tried to be subtle and not stink up his writing with politics, but he couldn't help being a raging progressive from time to time.

  Bottom line: it was partly fun, but it was a chore finishing the book while knowing exactly what was going to happen and trudging through the flood of clichés that made up this story. I would not recommend it.


  Sorcery of Thorns starts with an interesting idea that made me curious and involved: a library like a prison of Necronomicon-like books, bound in flesh, partly alive, trying to manipulate people in various ways, fighting amongst each other, destroying minds and bodies of unguarded people and able to transform into murderous demonic beasts when damaged.

  Then, almost immediately, Margaret Rogerson turns away from that premise and proceeds to write a very young adult fantasy romance where a Mary Sue orphan girl who has lived all her life in one of these libraries leaves it and falls in love with a young very eligible sorcerer while battling an old sorcerer in authority and patriarchy in general. Gad!

  The writing is not bad, but nothing spectacular either. It's the way the author fails to put her character through even the slightest challenge that makes this book average at most. Whenever something bad happens, or rather is about to happen, she immediately finds a new ability or a new friend to save her. Her "best friend" is there just to be used on various occasions and then forgotten for the rest of the book. Men dismiss her opinions, not because she is a shut-in orphan and poor and uneducated and doesn't know anything, but because she is a woman. Only to then do 180s and completely believe and support her when the actual need arises.

  And that ending! There is one thing that feels like a consequence, like it all wasn't some sort of bed game to spice up Elisabeth's romance, then it goes poof!

  In conclusion I can't recommend this book. It's not bad, but certainly not good. Somehow I got duped again by the legions of horny girls using fantasy to scratch their itch and then rating books in droves. The only good thing I can say about the book is that it's standalone and not part of some misbegotten series.


  Using my system of randomizing the choice of books, I started reading Dread Nation. On the very first page there is a dedication to all people of color. So I immediately deflated, as there are some people who think writing with a political agenda doesn't require any actual knowledge of bookcraft. Then there was the author's name, Justina Ireland, suggesting anything other than the Black female writer she is. And then it was a period piece, set right after the American Civil War. And then... it was also about zombies! So I prepared for a bad woke book written by a woman who looks to the past to justify her antiracist outrage. Yeah, I know, I'm a monster. But then I kind of liked the book!

  That doesn't mean I wasn't partly right. The story is told from the first person perspective of a character who is a Mary Sue. She is a Black girl, birthed by a White woman, but also partly raised as a slave, but also knowing how to read and being well read, but also speaking in Black English, but only randomly or when it suits her, she is smart, perfectly trained to fight zombies and also trained in etiquette, but also a rebel and a tomboy, but also cute enough to attract the attention of a beautiful and fiery Black boy, oppressed her entire life, but also capable of taking control of any situation, small of build and being hurt repeatedly, but then shrugging off damage like an action hero, which she also is, and so on. Then there are the male White characters, who are all bad, except maybe some who I am pretty sure will turn out to be bad too in the end. White women are vile, but some of them, if they are not rich, are OK. Black boys are naive and in need of guidance, even if they have their hearts in the right place.

  In short, the book is very inconsistent in its characterization and this girl can do *everything*, except maybe feel when people are sneaking up on her to cock guns when the story requires her to get caught and brutalized. But the world building is good. The author researched books about the forced Native American "educational" centers, another bright spot in U.S. history - land of the free if you survive, are not enslaved and are White - and created this world where the dead had risen right in the middle of the Civil War, abruptly terminating it, yet not solving any of the social issues that had been in dispute during it.

  And yes, it does feel a little "inspired" by the likes of Lovecraft Country, only instead of cosmic horror you get the run-of-the-mill zombie outbreak as the background for a story about racism. The writing is typical Young Adult, focused on what the character feels, intends and believes, with action and interaction with other people just there to further the story in a blatantly obvious way. But it was also fun. Unfortunately, it is yet another "first book in a trilogy" and you would have to read the two other books to get any closure. I liked the book, but not enough to continue reading the rest.

  Interesting SQL table hint I found today: READPAST. It instructs SQL queries to ignore locked rows. This comes with advantages and disadvantages. For one, it avoids blocking and deadlocks when trying to read or write an already locked row, but it can also return incomplete results. Just like NOLOCK, it works around the transaction mechanism, but while NOLOCK allows dirty reads of information partially changed in transactions that have not been committed, READPAST ignores locked rows completely.
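
  To make the difference concrete, here is a minimal sketch of the two hints side by side, using a hypothetical Accounts table (names and values are just for illustration):

-- Session 1: change a row inside a transaction and leave it uncommitted
BEGIN TRAN
UPDATE Accounts SET Balance = 200 WHERE Id = 1 -- row 1 is now locked

-- Session 2, while the transaction above is still open:
SELECT * FROM Accounts WITH (NOLOCK)   -- returns row 1 with the dirty, uncommitted value
SELECT * FROM Accounts WITH (READPAST) -- skips row 1 entirely and returns only the other rows

-- Session 1:
ROLLBACK TRAN -- the value NOLOCK reported never officially existed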

  There is one scenario where I think this works best: batched DELETE operations. You want to delete a lot of rows from a table, but without locking it. If you just do a delete for the entire table with some condition you will get these issues:

  • the operation will be slow, especially if you are deleting on a clustered index which moves data around in the table
  • if the number of deleted rows is too large (usually 5000 or more) then the operation will lock the entire table, not just the deleted rows
  • if there are many rows to be deleted, the operation will take a long while, increasing the possibility of deadlocks

  While there are several solutions for this, like partitioning the table and then truncating the partitions, or soft deletes, or designing your database to separate read and write operations, one type of implementation change that is small in scope and large in result is batched deletes. Basically, you run a flow like this:

  1. SELECT a small number of rows to be deleted (again, mind the ~5000 lock threshold that escalates to a table lock, and perhaps even use the ROWLOCK hint)
  2. DELETE the selected rows and their dependencies (DELETE TOP (x) should work as well for steps 1 and 2, but I understand that in some cases this syntax automatically causes a table lock; maybe also use the ROWLOCK hint)
  3. if the number of selected rows is larger than 0, go back to step 1

  This allows SQL to lock individual rows and, if your business logic is sound, no rows should be deleted while something is trying to read or write them. However, this is not always the case, especially in high stress cases with many concurrent reads and writes. But here, if you use READPAST, then locked rows will be ignored and the next loops will have the chance to delete them.

  But there is a catch. Let's take an example:

  1. Table has 2 rows: A and B
  2. Transaction 1 locks row A
  3. In a batched delete scenario, Transaction 2 gets the rows with READPAST and so only gets B
  4. Transaction 2 deletes row B and commits, and continues the loop
  5. Transaction 3 gets the rows with READPAST and gets no rows (A is still locked)
  6. Transaction 3 deletes nothing and exits the loop
  7. Transaction 1 unlocks row A
  8. Table now has 1 row: A, which should have been deleted, but wasn't

  There is a way to solve this: SELECT with NOLOCK and DELETE with READPAST

  • this will always select rows, even locked and uncommitted ones
  • this will only delete rows that are not locked
  • this will never deadlock, but will loop forever as long as some rows remain locked

  One more gotcha is that READPAST allows for a NOWAIT syntax, which says to immediately ignore locked rows, without waiting for the lock timeout period (specified by LOCK_TIMEOUT, which is in milliseconds) to see if they unlock. Since you are doing a loop, it would be wise to wait a little, so that it doesn't go into a rapid loop while some rows are locked. Barring that, you might want to use READPAST NOWAIT and then add a WAITFOR DELAY '00:00:00.010' at the end of the loop to add a 10 millisecond delay, but if you have a lot of rows to delete, it might make this too slow.

  Enough of this, let's see a code example:

DECLARE @batchSize INT = 1000
DECLARE @nrRows INT = 1

CREATE TABLE #temp (Id INT PRIMARY KEY)

WHILE (@nrRows>0)
BEGIN

  BEGIN TRAN

    -- start each iteration with an empty batch, otherwise rows that stayed locked
    -- in a previous iteration would be inserted again and violate the primary key
    TRUNCATE TABLE #temp

    -- NOLOCK: select candidate rows even if they are locked or uncommitted
    INSERT INTO #temp
    SELECT TOP (@batchSize) Id
    FROM MyTable WITH (NOLOCK)
    WHERE Condition=1

    SET @nrRows = @@ROWCOUNT

    -- READPAST: delete only the candidate rows that are not locked by other transactions
    DELETE FROM mt
    FROM MyTable mt WITH (READPAST NOWAIT)
    INNER JOIN #temp t
    ON mt.Id=t.Id

    -- give locked rows a chance to be released before the next iteration
    WAITFOR DELAY '00:00:00.010'

  COMMIT TRAN

END

DROP TABLE #temp

Now the scenario goes like this:

  1. Table has 2 rows: A and B
  2. Transaction 1 locks row A
  3. Transaction 2 gets the rows with NOLOCK and gets both A and B
  4. Transaction 2 deletes rows A and B with READPAST, but only B is actually deleted
  5. loop continues (2 rows selected)
  6. Transaction 3 gets the rows with NOLOCK and gets one row (A)
  7. Transaction 3 deletes with READPAST with no effect (A is still locked)
  8. loop continues (1 row selected)
  9. Transaction 1 unlocks row A
  10. Transaction 4 gets the rows with NOLOCK and gets row A (not locked)
  11. Transaction 4 deletes row A with READPAST
  12. loop continues (1 row selected), but the next iteration selects nothing, so the loop ends (0 rows selected)
  13. Table now has no rows and no deadlock occurred

Hope this helps.

  So yeah, I've decided to try out stock trading. I wanted to see how it works, how it feels and if it's a valid avenue for investment versus something like placing money in a bank. Long story short: it is! In fact, I would say placing money in banks feels stupid now. Will this make me a billionaire in Euros? No. But let me detail.

  Usually, when people get some extra money they think: should I leave it in my expenses account or should I move a sum to a savings account? The difference being the amount of interest and some rules against retrieving money from the savings account. One account is for fast operations, the other is for the rainy days; one you think of in days, the other in months. Well, imagine you have to save money in order to someday retire. That's one you would think of in decades. Well, in that case, stocks are what you need.

Here is a chart of QQQ, an ETF aggregating top performing stocks, for the last 22 years. Its value rose consistently and grew 536%. That's 8.7% a year on average. In comparison, the average inflation rate in the same period is something like a third of that. Tell me, which bank will give you this interest?
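
(For the curious, that average comes from compounding: a 536% gain means ending at about 6.4 times the starting value, and 6.36^(1/22) ≈ 1.088, which is where the roughly 8.7% a year figure comes from.)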

But take a closer look. You see that big spike at the end? That's November 2021, when the U.S. market reached its apex, due to various reasons. Since then it plummeted, so the value now is the same as in June 2021. If you had read a blog post like this and invested all your money in QQQ in November, you would have acquired a special set of skills, found me and killed me by now. Or look at the left of the chart, at the spike there. In March 2000 the value increased to 118, only to then go down for a period of 16 years, only to grow 250% in the next 6 years!

So in the end, it comes down to your trust in the world as a whole. Will it grow, stagnate or disintegrate? If you are optimistic in the long run, or at least think that the next 20 years will go the same way, then this is for you.

Of course, 2022 was an interesting moment to start learning and experimenting with stock trading. The boom driven by the trillions of US dollars injected by Biden into the economy because of Covid (so yeah, you read that right, the economy went up during the pandemic) has ended, and so has the distraction caused by Covid, which turned from an excitingly unexpected threat to life into an endemic virus that coexists with all the others we got used to. Now we have to look back at how to get those trillions paid, how much good Brexit does to the economy, how the European Union economy recovers and, to add insult to injury, another psychopathic world leader threatening World War III. Can you even think of making money on the stock market now?

The answer is again, yes! Did I make more money? No. But I didn't lose that much either and I believe that loss will disappear. I won't go into the details, but suffice it to say that while the stocks that took the market to that November high dropped, other stocks that are considered safe, like the dividend stocks of huge companies, went up. And there is another hook: if the market goes down and you trust it to increase (on average) every year, that means the lower the stocks go now, the higher they will rise in the future!

But, you will ask yourself, what am I missing? If everybody could do that, why don't they? Where is the high risk that everybody warns me about when talking about the stock market?

Well, first there are the short to medium term risks, like the 2008 economic crisis or a measly World War. However, can you show me, without looking at the years, where that crisis is on the chart above? As I said, this is a "sure thing" only over large periods of time and only while the global order remains largely unchanged. Also, money itself is a form of national stock. That's why you get inflation, where the buying power of the same sum of the same currency is vastly different from year to year. It's not a matter of money vs stock, but of stock vs stock, of managing risk.

Again with the risk! Where is it? Personally I think there is a huge psychological risk. Because you have a lot more options, you get more opportunities to fuck it all up. For example, a guy sold his house and bought Tesla stock with all the money. He even tweeted at Elon Musk to encourage him to increase the value of the stock from $900 to $1000. The highest value for Tesla was 1222, but now it's 838. The guy could have increased his personal wealth by 20% in just 20 days if he had bought in October 2021. He didn't.

There is a huge pressure to perform when you gamble (and that's the correct word) with your money. You may take a few hundred Euros like me and play around, then the pressure is not that high, but if you put most of your savings into this, you always get to second guess yourself. Did I buy the correct thing? Oh, it's growing! Oh, it's going down! Oh, no, I am losing money, should I sell early or wait until it gets back up?

Sometimes you trust a company so much that it makes no sense to invest in something else. So you just buy the one stock. And then it goes bankrupt! Or the stock falls so much and forever that you have lost all of your savings. Having a diverse portfolio decreases your risk, but also your revenue.

There is a saying among traders that goes something like this: 95% of people trading are losing money and the remaining 5% bought some stock and then forgot about it for a few years. This says something about the safest way to proceed, but it also tells you where the money from trading is coming from: those 95%.

So I am not an expert in any conceivable way, but I am going to try things out. There is a lot to learn, but when you push everything aside, there are two basic strategies: timing the market and investing long term.

Timing the market means "buying the dip" when the stocks are low and selling them when they spike. The good news is that it makes you filthy rich, the bad news is that you can't time the market. And I am not joking. This is basically playing Roulette. If you consistently place your bets on the right number, you become filthy rich (or are thrown out of the casino), but that's theoretically impossible. And while a casino game is probabilistic, the market is actually fighting against you, adapting to strategies and making them obsolete in days (if not minutes, considering you are competing with AI algorithms run by companies betting billions).

Investing long term is what I described above. You take your savings (which come after you've bought your house, saved some in the bank and have a comfortable sum left to live on) and you buy either diverse stocks from the top 500 or an ETF (exchange-traded fund) which does this for you, for a small percentage. Invesco QQQ from above is an ETF, for example. And you do it with your monthly savings, every month. And you leave it alone. And you count your money (or lack thereof) when you retire.

That being said, there is a lot to learn about trading. The statistical indicators, what they mean, the math, the taxes, the way to investigate companies, how to structure your portfolio, the information sources, the gotchas, the various people and tricks that want to manipulate you and/or the market so that they make the money.


  Decency makes us abstain from doing something that we could do, that we might even be inclined to do, but that we shouldn't do. It's living according to some general principles that are intimately connected to our own identity. And when someone else is indecent, we try to steer them towards the "right path", for our own sake as well as theirs. This is what I was raised to think. Today, though, decency is more and more proclaimed by actively opposing things that are declared indecent, and nothing else. It's the glee that gives it away, that twisted joy of destroying somebody else after having been given permission to do so. You see it in old photos, where decent town folk were happily and communally lynching some poor soul. After half a century the world is finally becoming a global village, but not because of the free sharing of information, as the creators of the Internet naively believed, but because of social media and 24 hour news cycles. And we are behaving like villagers in tiny, isolated, bigoted villages.

  South Park is a comedy animated show that has a similar premise: a small U.S. town as a mirror for the world at large. And while 25 years ago that was a funny idea, now it feels weirdly prescient. The latest episode of the show depicts the vilifying of some local residents of Russian descent because of the Ukraine conflict as a symptom of nostalgia towards the Cold War era. Then too, people were feeling mighty good about themselves as they were fighting the Ruskies, the Commies, the Hippies, or anything that was threatening democracy and the American way of life.

  This is not so much an American affliction as it is human nature. Witch hunts, lynchings, playing games with the heads of your enemies, sacrificing virgins, they all have the same thing in common: that feeling that you have social permission to hurt others and that if they are bad, that makes you good. But acting good is what makes you good, not merely destroying evil. When Stalin was fighting Hitler no one said what a nice decent guy Stalin was. Yet now this mob mentality has been exported, globalized, strengthened by the sheer number of people that now participate. It's not easy to mention decency when thousands of people may turn on you for defending their sworn enemy. This "either with us or against us" feeling is also old and symmetrically evil, because usually all sides harbor it towards the others.

  I have started this post two times before, deleting everything and starting again. At first I was continuing the story of the playground war, South Park style, where the town people refuse service to the family of the bully, start giving the victim crotch protectors and helmets at first, then baseball bats and pocket knives, slowly distancing themselves from that family and ostracizing it as "other", even while the two kids continue to go to school and the bullying continues. But it was the glee that gave it away. I was feeling smart pointing out the mistakes of others. Then I tried again, explaining how Putin is wrong, but that it's not the fault of the entire Russian people, most of them already living in poverty and now suffering even more while the rich are merely inconvenienced. I also cast doubt on the principledness of vilifying Russia when we seem to do no such thing to Israel, for example. And then I felt fear! What if this is construed to be antisemitic or pro-Putin? What if I want to get hired one day and corporate uses the post as proof that I am a terrible human being? Because some nations can be vilified, some must be, but others should never ever be. And I may be a terrible human being, as well.

  Isn't stifling free expression for the sake of democracy just as silly as invading a country for the sake of peace?

  Regardless of how I feel about it, I am inside the game already. I am not innocent, but corrupted by these ways of positioning and feeling and doing things. I am tempted to gleefully attack or to fearfully stay quiet even when I disagree. So take it with a grain of salt as I am making this plea for decency. The old kind, where acting badly against bad people is still bad and acting good and principled is necessary for the good of all.

  Only you can give yourself permission to do something, by the way.


  Senlin Ascends reminded me of many things: the intellectual protagonist lost in a world that feels part dream, like in Zamyatin's We; a metaphorical world that reflects our own social order, like in Snowpiercer; the cruel tourist traps hiding horror, like in Song of Kali. But the book is anything but derivative. Josiah Bancroft writes this in his own voice, slowly building both the world and the characters.

  I have to admit that I found it difficult to keep reading the book. The naïve character who never seems to catch a break and keeps getting abused by an uncaring world makes it hard to enjoy. The book is very well written, but it starts by destroying your faith in humanity. This also makes the last quarter of the book a little jarring, as the winds suddenly blow in a slightly different direction. It is possible that Bancroft found it as hard to torture his protagonist as I found it to bear reading about it. Yet he seems to have kept at it, as this is just the first book of a series of four (and a series of shorts).

  The book is a steampunkish novel set in a fictional tower of Babel where a teacher and his new bride go on a honeymoon, only to be swept into the tumultuous world contained by the tower. Each level of the tower is separate in culture and resources: the higher the level, the harder to get to and the richer the society. But that doesn't mean better, in any way.

  That's the plot in a nutshell, but the beauty is in the details. I can't say the book is perfect, but I am going to give it my highest rating because it is certainly a good book and refreshingly original.

 So I got assigned this bug where the date 1900-01-01 was displayed on the screen, so, as I am lazy, I started to look into the code without reproducing the issue. The SQL stored procedure looked fine, it was returning:

SELECT
  CASE WHEN SpecialCase=1 THEN ''
  ELSE SomeDate
  END as DateFilteredBySpecialCase

Then the value was being passed around through various application layers, but it wasn't transformed into anything, then it was displayed. So where did this magical value come from? I was expecting some kind of ISNULL(SomeDate,'1900-01-01') or some change in the mapping code or maybe SomeDate was 1900-01-01 in some records, but I couldn't find anything like that.

Well, at second glance, the selected column has to have a return type, so what is it? The Microsoft documentation explains:

Returns the highest precedence type from the set of types in result_expressions and the optional else_result_expression. For more information, see Data Type Precedence.

If you follow that link you will see that strings are at the very bottom, while dates are close to the top. In other words, a CASE statement that returns strings in some branches and dates in others will always have a date as its return type!

SELECT CAST('' as DATETIME) -- selects 1900-01-01
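
If you actually want the empty string for the special case, one way out is to make every branch of the CASE a string by converting the date explicitly (a sketch; the VARCHAR length and the 120 format code are just one choice):

SELECT
  CASE WHEN SpecialCase=1 THEN ''
  ELSE CONVERT(VARCHAR(10), SomeDate, 120)
  END as DateFilteredBySpecialCase

Alternatively, return NULL instead of '' and let the application layers decide how to display a missing date.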

Just a quickie. Hope it helps.


  Tracing and logging always seem simple, an afterthought, something to do when you've finished your code. Only then do you realize that you would have wanted it while you were testing your code, or when an unexpected issue occurs in production. And all you have to work with is an exception, something that tells you something went wrong, but without any context. Here is a post that attempts to create a simple method to enhance exceptions without actually needing to switch the logging level to Trace or anything like that and without great performance losses.

  Note that this is a proof of concept, not production ready code.

  First of all, here is an example of usage:

public string Execute4(DateTime now, string str, double dbl)
{
    using var _ = TraceContext.TraceMethod(new { now, str, dbl });
    throw new InvalidOperationException("Invalid operation");
}

  Obviously, the exception is something that would occur in a different way in real life. The magic, though, happens in the first line. I am using (heh!) the C# 8.0 using declaration syntax (a using statement without braces) so that there is no extra indentation and, I might say, this is one of the few situations where I would want to use this syntax. In fact, this post started from me thinking of a good place to use it without confusing any reader of the code.

  Also, TraceContext is a static class. That might be OK, since it is a very special class and not part of the business logic. With the new Roslyn source generators, one could insert lines like this automatically, without having to write them by hand. That's another topic altogether, though.

  So, what is going on there? Since there is no cheap way to get the names and values of the arguments of the currently executing method (not without huge performance costs), I am creating an anonymous object that has properties with the same names and values as the arguments of the method. This is the only thing that might differ from one place to another. Then, in TraceMethod, I return an IDisposable which will be disposed at the end of the method. Thus, I am generating a context for the entire method run which will be cleared automatically at the end.

  Now for the TraceContext class:

/// <summary>
/// Enhances exceptions with information about their calling context
/// </summary>
public static class TraceContext
{
    static ConcurrentStack<MetaData> _stack = new();

    /// <summary>
    /// Bind to FirstChanceException, which occurs when an exception is thrown in managed code,
    /// before the runtime searches the call stack for an exception handler in the application domain.
    /// </summary>
    static TraceContext()
    {
        AppDomain.CurrentDomain.FirstChanceException += EnhanceException;
    }

    /// <summary>
    /// Add to the exception dictionary information about caller, arguments, source file and line number raising the exception
    /// </summary>
    /// <param name="sender"></param>
    /// <param name="e"></param>
    private static void EnhanceException(object? sender, FirstChanceExceptionEventArgs e)
    {
        if (!_stack.TryPeek(out var metadata)) return;
        var dict = e.Exception.Data;
        if (dict.IsReadOnly) return;
        dict[nameof(metadata.Arguments)] = Serialize(metadata.Arguments);
        dict[nameof(metadata.MemberName)] = metadata.MemberName;
        dict[nameof(metadata.SourceFilePath)] = metadata.SourceFilePath;
        dict[nameof(metadata.SourceLineNumber)] = metadata.SourceLineNumber;
    }

    /// <summary>
    /// Serialize the name and value of arguments received.
    /// </summary>
    /// <param name="arguments">It is assumed this is an anonymous object</param>
    /// <returns></returns>
    private static string? Serialize(object arguments)
    {
        if (arguments == null) return null;
        var fields = arguments.GetType().GetProperties();
        var result = new Dictionary<string, object>();
        foreach (var field in fields)
        {
            var name = field.Name;
            var value = field.GetValue(arguments);
            result[name] = SafeSerialize(value);
        }
        return JsonSerializer.Serialize(result);
    }

    /// <summary>
    /// This would require most effort, as one would like to serialize different types differently and skip some.
    /// </summary>
    /// <param name="value"></param>
    /// <returns></returns>
    private static string SafeSerialize(object? value)
    {
        // naive implementation
        try
        {
            return JsonSerializer.Serialize(value).Trim('\"');
        }
        catch (Exception ex1)
        {
            try
            {
                return value?.ToString() ?? "";
            }
            catch (Exception ex2)
            {
                return "Serialization error: " + ex1.Message + "/" + ex2.Message;
            }
        }
    }

    /// <summary>
    /// Prepare to enhance any thrown exception with the calling context information
    /// </summary>
    /// <param name="args"></param>
    /// <param name="memberName"></param>
    /// <param name="sourceFilePath"></param>
    /// <param name="sourceLineNumber"></param>
    /// <returns></returns>
    public static IDisposable TraceMethod(object args,
                                            [CallerMemberName] string memberName = "",
                                            [CallerFilePath] string sourceFilePath = "",
                                            [CallerLineNumber] int sourceLineNumber = 0)
    {
        _stack.Push(new MetaData(args, memberName, sourceFilePath, sourceLineNumber));
        return new DisposableWrapper(() =>
        {
            _stack.TryPop(out var _);
        });
    }

    /// <summary>
    /// Just a wrapper over a method which will be called on Dispose
    /// </summary>
    public class DisposableWrapper : IDisposable
    {
        private readonly Action _action;

        public DisposableWrapper(Action action)
        {
            _action = action;
        }

        public void Dispose()
        {
            _action();
        }
    }

    /// <summary>
    /// Holds information about the calling context
    /// </summary>
    public class MetaData
    {
        public object Arguments { get; }
        public string MemberName { get; }
        public string SourceFilePath { get; }
        public int SourceLineNumber { get; }

        public MetaData(object args, string memberName, string sourceFilePath, int sourceLineNumber)
        {
            Arguments = args;
            MemberName = memberName;
            SourceFilePath = sourceFilePath;
            SourceLineNumber = sourceLineNumber;
        }
    }
}

Every call to TraceMethod adds a new MetaData object to a stack and every time the method ends, the stack pops an item. The static constructor of TraceContext subscribes to the FirstChanceException event of the current application domain and, whenever an exception is thrown (caught or otherwise), its Data dictionary gets enhanced with:

  • name of the method called
  • source file name
  • source file line number where the exception was thrown
  • serialized arguments (remember Exceptions need to be serializable, including whatever you put in the Data dictionary, so that is why we serialize it all)

(I have written another post about how .NET uses code attributes to get the first three items of information during build time) 

This way, you get information which would normally be "traced" (detailed logging which is usually detrimental to performance) in any thrown exception, but without filling up some trace log or having to change the production configuration and reproduce the problem again. Assuming your application does not throw exceptions all over the place, this adds very little overhead to the executed code.

Moreover, this will enhance exceptions with the source code file name and line number even in Release mode!

I am sure there are some issues with code that might fail and is not caught in a try/catch, and of course the serialization code is where people should put the most effort, since different types need to be serialized for inspection differently (think async methods and the like). And more methods should be added so that people can trace whatever they like in thrown exceptions. Yet, as I said, this is a POC, so I hope it gets you inspired.

 T-SQL Querying is a very good overview of SQL Server queries, indexing, best practices, optimization and troubleshooting. I can't imagine someone just reading it once and being done with it, as it is full of useful references, so it's good to keep it on the table. Also, it's relatively short, so one can peruse it in a day and then keep using it while doing SQL work.

What I didn't like so much was the inconsistent level of knowledge needed for the various chapters. It starts with a tedious explanation of types of queries and what JOINs are and what ORDER BY is and so on, then moves on to the actually interesting stuff. Also, what the hell is that title and cover? :) You'd think it's a gardening book.

Another great thing about it is that it is available free online, from its publishers: Packt.