I don't remember why I thought this would be a good book to read. Perhaps because it was one of those "gothic novels" and I had just read one that I liked a lot. The Owl Service is a short novel, but it took me ages to finish it. Whenever I had the time to read or listen to it, I always found something else to do. I think Alan Garner wanted to do right by the story, which is a reimagining of a traditional Welsh legend, but it ended up an opaque and pretentious mess with characters you cannot stand. If only the writing had called to me. Garner is not a bad writer, but the style didn't capture my attention; I had to make an effort to stay in the story and not let my mind wander.
The plot revolves around a valley in Wales where a British family owns property and where the locals are treated as uneducated peasants. The family comes to spend the summer and weird things start to happen. But these things are either completely random or, when they turn out to be some sort of possession or empowerment, there is always someone nearby to break the spell or destroy things in fear and righteous anger, which made it all rather boring. At no point was there anyone saying "Oh, that's peculiar, let's dig into it!" or "Hey, I can make books fly by themselves, let's see if I can use that for world hunger or space exploration".
The worst part was the characters, all entitled twats. Every single one of them believes they can order others around, force things upon them or do and say whatever the hell they want. And I mean everyone, including the Welsh help. If they don't insult you, force things upon you or treat you like scum just because you are different, they smack you upon the head in indignation for not having done what you were rudely ordered to do. And that's the maid doing it!
Bottom line: as a scholar of Welsh legend and the literary interpretation of myth in British literature I... hell, no! Just leave this book be! It's just bad.
I know I am shooting myself in the foot here, but, to paraphrase some people, staying quiet doesn't help anyone. I've come to love Dev.to, a knowledge sharing platform aimed at software developers, because it actually promotes blogging and dissemination of information. It doesn't do enough against clickbait, but it's great so far. So, hungry for some new dev stuff, I opened up the website only to find it spammed with big colorful posters and posts supporting female devs. It was annoying, but it got me thinking.
I like women in software! I too can honestly say I support them. I've always done so. I have worked with them, mentored them, learned from them, worked for them, hired them. I want them to get what they are due, just like any other person: quiet, happiness, money, respect, understanding. I support their right to tell me when they hate (or love) something I do or say, and I am totally against the assholes who would prey on them or belittle them. Not because they are women, but because they are human, and no one should stand for stupid little people who only think of themselves and have a chip on their shoulder.
And yes, women need more support than men, because traditionally they did not have it. For them it is an uphill battle to fit into communities that contain few women. They have to butt in, they have to push and struggle, and we need to understand their underdog status and protect them through that. But not because they are some fantasy creature, or perpetual victims, or some other thing, but because they are people. This applies to them, to minorities, to majorities, to every single person around you. I would feel the same about some guy not getting hired because he is too muscular as about some woman who doesn't get a job because she's plain looking.
So ask yourself, are you really supporting women, or are you just playing a game? Are you the one shouting loudly in the night "Night time! Everybody go to sleep!"? Are you protecting women or singling them out as something different that must be treated differently? Are you actually thinking of people or just doing politics? Because if you decide to annoy devs on behalf of women, you'd better do a good job supporting them for real.
This post will take you on an adventure through time and sound. It will touch the following software development concepts:
await/async in Javascript
named groups in regular expressions in Javascript
the AudioContext API in Javascript
musical note theory
Gorillas!
In times immemorial, computers were running something called the DOS operating system and almost the entire interface was text based. There was a way to draw things on the screen, by setting the values of pixels directly in the video memory. The sound was generated on a "PC speaker", which was little more than a small speaker connected to a power port that you had to make work by handling "interrupts". And yet, since this is when I spent my childhood, I remember so many weird little games and programs from that time with a lot of nostalgic glee.
One of these games was Gorillas, where two angry gorillas would attempt to murder each other by throwing explosive bananas. The player would have to enter the angle and speed and also take into account a wind speed that was displayed as an arrow on the bottom of the screen. That's all. The sounds were ridiculous, the graphics really abstract and yet it was fun. So, as I was remembering the game, I thought: what would it take to make that game available in a modern setting? I mean, the programming languages, the way people thought about development, the hardware platform, everything has changed.
In this post I will detail the PLAY command from the ancient programming language QBASIC. This command was used to generate sound by instructing the computer to play musical notes on the PC speaker. Here is an example of usage:
PLAY "MBT160O1L8CDEDCDL4ECC"
This would play the short song at the beginning of the Gorillas game. The string tells the computer to play the sound in the background, at a tempo of 160, in the first octave, with eighth-of-a-measure notes (CDEDCD), then ending with quarter-measure notes (ECC). I want to replicate this in Javascript, first because it's simpler to prototype and second because I can make the result work in this very post.
Sound and Music
But first, let's see how musical notes are generated in Javascript, using the audio API. First you have to create an AudioContext instance, with which you create an Oscillator. On the oscillator you set the frequency and then... after a while you stop the sound. The reason the API seems so simplistic is that it works by creating an audio graph of nodes that connect to each other and build on each other. There are multiple ways to generate sound, including filling a buffer with data and playing that, but I am not going to go that way.
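To make those steps concrete, here is a minimal sketch of that flow, playing a 440Hz tone for half a second (the values are only illustrative):
// minimal sketch: play a 440Hz tone for half a second
const audioContext = new AudioContext();
const oscillator = audioContext.createOscillator();
oscillator.frequency.value = 440;             // frequency in Hertz
oscillator.connect(audioContext.destination); // connect the node to the audio output
oscillator.start();                           // the sound starts now...
setTimeout(() => oscillator.stop(), 500);     // ...and stops half a second later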
Therefore, in order to PLAY in Javascript I need to translate concepts like tempo, octaves, notes and measures into values like duration and frequency. That's why we need a little bit of musical theory.
In music, sounds are split into domains called octaves, each holding seven notes that, depending on your country, are called either Do, Re, Mi, Fa, So, La, Si or A, B, C, D, E, F and G or something else. Then you have half notes, the so-called sharp or flat notes: A# is half a note above A and A♭ is half a note below A. A# is the same as B♭. For reasons that I don't even want to know, the octaves start with C. Also, the notes themselves are not equally spaced and the octaves are not of the same size in terms of frequency. Octave 0 starts at 16.35Hz and ends at 30.87Hz, octave 1 ranges between 32.70Hz and 61.74Hz. In fact, each octave spans twice as much frequency space as the one before, and each note has twice the frequency of the same note in the octave below.
In a more numerical way, each octave is split into 12 semitones: C, C#, D, E♭, E, F, F#, G, G#, A, B♭, B. Note (heh heh) that there are no half notes between B and C or between E and F. The frequency of each of these semitones is 2^(1/12) times the one before. Therefore one can compute the frequency of a note as:
Frequency = key note * 2^(n/12), where the key note is a note that you use as a base and n is the distance in semitones between the key note and the note you want to play.
The default key note is A4, or note A from octave 4, at 440Hz. That means B♭ has a frequency of 440*1.059463 = 466.2.
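In Javascript, assuming A4 (440Hz) as the key note, the computation might look like this (a small sketch, with n being the number of semitones away from A4):
// sketch: frequency of the note n semitones away from A4 (440Hz)
function noteFrequency(n) {
    return 440 * Math.pow(2, n / 12);
}
noteFrequency(0);  // 440 (A4)
noteFrequency(1);  // ~466.16 (B♭4)
noteFrequency(12); // 880 (A5)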
Having computed the frequency, we now need the duration. The input parameters for this are tempo, note length, mode and the occasional "dot" (a small computation sketch follows the list):
tempo is the number of quarter measures in a minute
this means that at a tempo of 120 a quarter measure lasts 60000 milliseconds divided by 120, so 500 milliseconds, and a full measure lasts 2000 milliseconds
note length - the length of a note relative to a measure
these are usually fractions of a measure: 1, 1/2, 1/4, 1/8, 1/16, etc
mode - this determines a general speed of playing the melody
as defined by the PLAY command, you have:
normal: a measure is 7/8 of a default measure
legato: a measure is a measure
staccato: a measure is 3/4 of a default measure
dotted note - this means a specific note will be played for 3/2 of the defined duration for that note
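Putting these parameters together, the duration of a note in milliseconds can be sketched like this (the function and parameter names are mine, not part of the PLAY command):
// sketch: compute a note's duration in milliseconds from tempo, note length, mode and dot
function noteDuration(tempo, noteLength, mode, dotted) {
    const measure = 60000 * 4 / tempo;              // a full measure is four quarter measures
    return mode * measure / noteLength * (dotted ? 3 / 2 : 1);
}
noteDuration(120, 4, 7 / 8, false); // a normal quarter note at tempo 120: 437.5 milliseconds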
With this knowledge, we can start writing code that will interpret musical values and play a sound. The code will hopefully be self-explanatory. The only thing I want to discuss outside of the audio-related topic is the use of async/await in Javascript, which I will do below the code. So here it is:
class QBasicSound {
constructor() {
this.octave = 4;
this.noteLength = 4;
this.tempo = 120;
this.mode = 7 / 8;
this.foreground = true;
this.type = 'square';
}
setType(type) {
this.type = type;
}
async playSound(frequency, duration) {
if (!this._audioContext) {
this._audioContext = new AudioContext();
}
// a 0 frequency means a pause
if (frequency == 0) {
await delay(duration);
} else {
const o = this._audioContext.createOscillator();
const g = this._audioContext.createGain();
o.connect(g);
g.connect(this._audioContext.destination);
o.frequency.value = frequency;
o.type = this.type;
o.start();
await delay(duration);
// slowly decrease the volume of the note instead of just stopping so that it doesn't click in an annoying way
g.gain.exponentialRampToValueAtTime(0.00001, this._audioContext.currentTime + 0.1);
// also stop the oscillator shortly after the ramp, so nodes don't accumulate
o.stop(this._audioContext.currentTime + 0.1);
}
}
getNoteValue(octave, note) {
// the index of a note in this string is its semitone offset within the octave (C=0, D=2, E=4, F=5, G=7, A=9, B=11)
const octaveNotes = 'C D EF G A B';
const index = octaveNotes.indexOf(note.toUpperCase());
if (index < 0) {
throw new Error(note + ' is not a valid note');
}
return octave * 12 + index;
}
async playNote(octave, note, duration) {
const A4 = 440;
const noteValue = this.getNoteValue(octave, note);
const freq = A4 * Math.pow(2, (noteValue - 48) / 12);
await this.playSound(freq, duration);
}
async play(commandString) {
const reg = /(?<octave>O\d+)|(?<octaveUp>>)|(?<octaveDown><)|(?<note>[A-G][#+-]?\d*\.?)|(?<noteN>N\d+\.?)|(?<length>L\d+)|(?<legato>ML)|(?<normal>MN)|(?<staccato>MS)|(?<pause>P\d+\.?)|(?<tempo>T\d+)|(?<foreground>MF)|(?<background>MB)/gi;
let match = reg.exec(commandString);
let promise = Promise.resolve();
while (match) {
let noteValue = null;
let longerNote = false;
let temporaryLength = 0;
if (match.groups.octave) {
this.octave = parseInt(match[0].substr(1));
}
if (match.groups.octaveUp) {
this.octave++;
}
if (match.groups.octaveDown) {
this.octave--;
}
if (match.groups.note) {
const noteMatch = /(?<note>[A-G])(?<suffix>[#+-]?)(?<shorthand>\d*)(?<longerNote>\.?)/i.exec(match[0]);
if (noteMatch.groups.longerNote) {
longerNote = true;
}
if (noteMatch.groups.shorthand) {
temporaryLength = parseInt(noteMatch.groups.shorthand);
}
noteValue = this.getNoteValue(this.octave, noteMatch.groups.note);
switch (noteMatch.groups.suffix) {
case '#':
case '+':
noteValue++;
break;
case '-':
noteValue--;
break;
}
}
if (match.groups.noteN) {
const noteNMatch = /N(?<noteValue>\d+)(?<longerNote>\.?)/i.exec(match[0]);
if (noteNMatch.groups.longerNote) {
longerNote = true;
}
noteValue = parseInt(noteNMatch.groups.noteValue);
}
if (match.groups.length) {
this.noteLength = parseInt(match[0].substr(1));
}
if (match.groups.legato) {
this.mode = 1;
}
if (match.groups.normal) {
this.mode = 7 / 8;
}
if (match.groups.staccato) {
this.mode = 3 / 4;
}
if (match.groups.pause) {
const pauseMatch = /P(?<length>\d+)(?<longerNote>\.?)/i.exec(match[0]);
if (pauseMatch.groups.longerNote) {
longerNote = true;
}
noteValue = 0;
temporaryLength = parseInt(pauseMatch.groups.length);
}
if (match.groups.tempo) {
this.tempo = parseInt(match[0].substr(1));
}
if (match.groups.foreground) {
this.foreground = true;
}
if (match.groups.background) {
this.foreground = false;
}
if (noteValue !== null) {
// a dotted note lasts 3/2 of the normal duration
const noteDuration = this.mode * (60000 * 4 / this.tempo) * (longerNote ? 3 / 2 : 1);
const duration = temporaryLength
? noteDuration / temporaryLength
: noteDuration / this.noteLength;
const A4 = 440;
const freq = noteValue == 0
? 0
: A4 * Math.pow(2, (noteValue - 48) / 12);
const playPromise = () => this.playSound(freq, duration);
promise = promise.then(playPromise)
}
match = reg.exec(commandString);
}
if (this.foreground) {
await promise;
} else {
// in background mode the promise is left to run on its own (fire and forget)
promise;
}
}
}
function delay(duration) {
return new Promise(resolve => setTimeout(resolve, duration));
}
One uses the code like this:
var player = new QBasicSound();
await player.play('T160O1L8CDEDCDL4ECC');
Note that you cannot start playing the sound directly, you need to wait for a user interaction first. An annoying rule to suppress annoying websites which would start playing the sound on load. And here is the result (press multiple times on Play for different melodies):
Javascript in modern times
There are two concepts used in this code that I want to discuss: named regular expression groups and async/await. Coincidentally, both are C# concepts that have crept into the modern Javascript specifications since .NET developers from Microsoft started contributing to the language.
Named groups are something that appeared in ES2018. It's a feature I've been using with joy in .NET and hated not having in some other languages. Look at the difference between the original design and the current one:
// original design
var match = /(a)bc/.exec('abcd');
if (match && match[1]) { /*do something with match[1]*/ }
// new feature
const match = /(?<theA>a)bc/.exec('abcd');
if (match && match.groups.theA) { /*do something with match.groups.theA*/ }
There are multiple advantages to this:
readability for people revisiting the code
robustness in the face of changes to the regular expression
the index might change if new groups are added to it
the code aligns with the C# code (I like that :) )
My advice is to always use named groups when using regular expressions.
Another concept is await/async. In .NET it is used to hide complex asynchronous interactions in the code, with the compiler taking care of coordinating all the tasks that run at the same time. Unfortunately, in C#, that means polluting the code with async keywords on all levels, as async methods can only be used inside other async methods. No such qualms in Javascript.
While in .NET the await/async system runs over Task<T> methods, in Javascript it runs over Promises. Both are abstractions over work that is being done asynchronously.
A most basic example is this:
// original design
getSomethingAsync(url,function(data) {
getSomethingElseAsync(data.url,function(data2) {
// do something with data2
}, errorHandler2);
},errorHandler1);
// Promises
getSomethingAsync(url)
.then(function(data) {
return getSomethingElseAsync(data.url); // return the promise so the next .then receives data2
})
.then(function(data2) {
// do something with data2
})
.catch(errorHandler);
// async/await
try {
var data = await getSomethingAsync(url);
var data2 = await getSomethingElseAsync(data.url);
// do something with data2
} catch(ex) {
errorHandler(ex);
}
You see that the await/async way looks like synchronous code, you can even catch errors. await can be used on any function that returns a Promise instance and the result of it is a non-blocking wait until the Promise resolves and returns the value that was passed to the resolve function.
If you go back to the QBasicSound class, at the end, depending on if the sound is in the foreground or background, the function is either awaiting a promise or ... just letting it run. You might also notice that I've added a delay function at the end of the code which is using setTimeout to resolve a Promise. Here is what is actually going on:
// using await
console.log(1);
await delay(1000).then(()=>console.log(2));
console.log(3);
// this logs 1,2,3
// NOT using await
console.log(1);
delay(1000).then(()=>console.log(2));
console.log(3);
// this logs 1,3,2
In the first case, the Promise that was constructed by a one second delay and then logging 2 is awaited, meaning the code waits for the result. After it is executed, 3 gets logged. In the second case, the logging of 2 is executed after one second delay, but the code does not wait for the result, therefore 3 is logged immediately and 2 comes after.
What sorcery is this?! Isn't Javascript supposed to be single threaded? How does it work? Well, consider that in the delay function, the resolve function will only be called after a timeout of one second. When executed, it starts the timeout, then reaches the end of the function. It has not been resolved yet, so it passes control back to the engine, which uses it to execute other things. When the timeout is fired, the engine takes back control, executes the resolve function, then passes control back. All of this is invisible to the user, who gets the illusion of multithreaded behavior.
Already some standard out of the box APIs are async, like fetch. In order to get an object from a REST API that is called via HTTP the code would look like this:
// fetch API
let response = await fetch('/article/promise-chaining/user.json');
let user = await response.json();
Conclusion
I spent an entire day learning about sounds and writing code that would emulate QBASIC code from a billion years ago. Who knows, maybe my next project will be to port the entire Gorillas game to Javascript. Now one can lovingly recreate the sounds of one's childhood.
We are all racists. We belittle dinosaurs for getting extinct, we pump our chests and declare we are the highest pinnacle of evolution and they are inferior, failed experiments of nature; we, mammals, are clearly the superior product. Yet they existed and flourished and ruled every ecosystem on Earth for hundreds of millions of years. Even today the number of species of birds, the direct descendants of dinosaurs, is more than double the number of species of mammals. Kenneth Lacovara starts his book with a similar assumption: Einstein was a schmuck! Every one of his great achievements means nothing because, in the end, Einstein died. If that idea is ridiculous for him, how come we still use it for dinosaurs?
Why Dinosaurs Matter is a short book, one in the TED Books series, and it pretty much adds detail to Lacovara's TED talk, like all of the TED books. Frankly, I am not very happy with the series, as it often adds little to the ideas summarised in the talks themselves. Some people are spending a lot of effort to summarize existing books into 15 minutes bite size media and TED books do the opposite, adding fat onto already fleshed out ideas. That doesn't mean this book is bad. It is well written, it has a lot of useful information, but it felt disjointed, like a combination of an opinion piece and a history book of discovered fossils. It gets its point across, but that's about it.
And the point is that we can learn a lot from dinosaurs, from how they spread around the world, adapted to all kinds of environments and the biological innovations they brought on with this. We can learn from their apparently absolute dominion and their immediate and humiliating downfall. Being at the top of the food chain is not only a matter of prideful boasting, but also a fragile spot with multiple dependencies. Once the natural order is disrupted, the top of the pyramid is the first to topple.
Bottom line: it is a nice introductory book in the world of dinosaurs, but not more than that. It's short enough to read on a long train ride or plane flight and it can be easily read by a child or teenager.
On the SQLite reference page for the WITH clause there is a little example of solving a Sudoku puzzle. Using SQL. I wanted to see it in action and therefore I've translated it into T-SQL.
You might think that there is a great algorithm at play, something that will blow your mind. I mean, people have blogged for ages about Sudoku solvers written to hone their programming skills, and they have put quite a lot of work into them, writing lines and lines about how clever they were. And this is SQL; it works, but how do you do something this complex in it? But no, it's very simple, very straightforward and also performant. Kind of a letdown, I know, but it pretty much generates all possible solutions and only keeps the valid ones, using CTEs (Common Table Expressions).
Here is the translation, followed by some explanation of the code:
DECLARE @Board VARCHAR(81) = '86....3...2...1..7....74...27.9..1...8.....7...1..7.95...56....4..1...5...3....81';
WITH x(s,ind) AS
(
SELECT sud,CHARINDEX('.',sud) as ind FROM (VALUES(@Board)) as input(sud)
UNION ALL
SELECT
CONVERT(VARCHAR(81),CONCAT(SUBSTRING(s,1,ind-1),z,SUBSTRING(s,ind+1,81))) as s,
CHARINDEX('.',CONCAT(SUBSTRING(s,1,ind-1),z,SUBSTRING(s,ind+1,81))) as ind
FROM x
INNER JOIN (VALUES('1'),('2'),('3'),('4'),('5'),('6'),('7'),('8'),('9')) as digits(z)
ON NOT EXISTS (
SELECT 1
FROM (VALUES(1),(2),(3),(4),(5),(6),(7),(8),(9)) as positions(lp)
WHERE z = SUBSTRING(s, ((ind-1)/9)*9 + lp, 1)
OR z = SUBSTRING(s, ((ind-1)%9) + (lp-1)*9 + 1, 1)
OR z = SUBSTRING(s, (((ind-1)/3) % 3) * 3
+ ((ind-1)/27) * 27 + lp
+ ((lp-1) / 3) * 6, 1)
)
WHERE ind>0
)
SELECT s FROM x WHERE ind = 0
The only changes I made to the original code are extracting the unsolved puzzle into its own variable and changing the puzzle values. I also used a clearer INNER JOIN syntax to replace the obnoxious, but still valid, comma (aka CROSS JOIN) notation. Here is the breakdown of the algorithm, as it were:
start with an initial state of the unsolved puzzle as a VARCHAR(81) string and the first index of a dot in that string, representing an empty slot - this is the anchor part
for the recursive member, join the current state with all the possible digit values (1 through 9) and return the strings with the first empty slot replaced by all valid possibilities and the position of the next empty slot
stop when there are no more empty slots
select the solutions (no empty slots)
It's that simple. And before you imagine it will generate a huge table in memory or that it will take a huge time, worry not. It takes less than a second (a lot less) to find the solution. Obviously, resource use increases exponentially when the puzzle doesn't have just one solution. If you empty the first slot (. instead of 8) the number of rows is 10 and it takes a second to compute them all. Empty the next slot, too (6) and you get 228 solutions in 26 seconds and so on.
The magical parts are the recursive Common Table Expression itself and the little piece of code that checks for validity, but the validity check is quite obvious as it is the exact translation of the Sudoku rules: no repeated digits on rows, columns or square sections. As for the recursive CTE, it is made of three parts (a minimal example follows the list):
an initial query that represents the starting state, often called the anchor member
a recursive query that references the CTE itself, called the recursive member, which is UNIONed with the anchor
a termination condition, to tell SQL when to end the recursion
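As a minimal illustration of this structure (unrelated to Sudoku), here is a small recursive CTE that simply counts from 1 to 10:
WITH numbers(n) AS
(
    SELECT 1            -- anchor member: the starting state
    UNION ALL
    SELECT n + 1        -- recursive member: references the CTE itself
    FROM numbers
    WHERE n < 10        -- termination condition
)
SELECT n FROM numbers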
In our case, we started with the unsolved puzzle, we recursed on all valid possibilities for filling the first empty slot and we stopped when there were no more empty slots.
CTEs are often confusing because the notation seems to indicate something else to a procedural programmer. You imagine doing this without CTEs, maybe in an object oriented programming language, and you think of this huge buffer that just keeps increasing and you have to remember where you left off so you don't process the same partial solution multiple times and you have to clean the data structure so it doesn't get too large, etc. SQL, though, is at heart a declarative programming language, very close to functional programming. It will take care not only of the recursion, but also filter the rows by the final condition of no empty slots while (and sometimes before) it makes the computations.
Once you consider the set of possible solutions for a problem as a working set, SQL can do wonders to find the solution, provided you can encode it in a way the SQL engine will understand. This is just another example of the right tool for the right job. Hope you learned something.
One Word Kill starts off like an episode of Stranger Things. You've got the weird kid, his weird friends and the mysterious girl who is beautiful and smart and hangs out with them to play D&D, all set in the 80's. Then the main character gets cancer and his future self comes to save... the girl. There is also a school boy psycho after them. But that's where the similarities end... the rest of the story is just... nothing. People explain at length things that needed little explaining and still make no sense, good kids and their parents run scared from a school boy, as psychotic as he could possibly be, without involving the police or gang member allies and, in the middle of all the drama: cancer, psycho killer, future self, time travel... they play Dungeons and Dragons, a game that promotes imagination and creativity, which the protagonists then fail to use in any amount in their real lives.
Having just read Prince of Thorns, I really expected a lot more from Mark Lawrence. Instead I get a derivative and boring story that brings absolutely nothing new to the table. It's reasonably well written, I guess, but nothing Wow!, which is exactly the reaction reviewers seem to have about this book. Have I read a different story somehow?
Bottom line: I am tempted to rate this average, on account of other raving reviews and the fact that I liked another Mark Lawrence book, but I have to be honest with myself and rate this book alone, which, I am sorry to say, is sub par.
This is something that appeared in C# 5, so a long time ago, with .NET 4.5, but I only found out about it recently. Remember when you wanted to know the name of a property when doing INotifyPropertyChanged? Or when you wanted to log the name of the method that was calling? Or you wanted to know which line in which source file is responsible for calling a certain piece of code? All of this can be done with the Caller Information feature.
And it is easy enough to use: just decorate a method parameter that has an explicit default value with one of these three attributes: CallerMemberName, CallerFilePath or CallerLineNumber (all found in the System.Runtime.CompilerServices namespace).
The parameter value, if not set when calling the method, will be filled in with the member name or file name or line number. It's something that the compiler does, so no overhead from reflection. Even better, it works on the caller of the method, not the interior of the method. Imagine you had to write a piece of code to do the same. How would you reference the name of the method calling the method you are in?
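A minimal sketch of how this might look in practice (the Logger class and its message format are made up for illustration):
using System;
using System.Runtime.CompilerServices;

public static class Logger
{
    public static void Log(string message,
        [CallerMemberName] string memberName = "",
        [CallerFilePath] string filePath = "",
        [CallerLineNumber] int lineNumber = 0)
    {
        // memberName, filePath and lineNumber are filled in by the compiler
        // with information about the caller, not about Log itself
        Console.WriteLine($"{message} (from {memberName} in {filePath}:{lineNumber})");
    }
}

// somewhere in the calling code:
// Logger.Log("Something happened"); // logs the calling method's name, source file and line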
So you want to use a queue, a structure that has items added at one side and removed on another, in Javascript code. Items are added to the tail of the queue, while they are removed at the head. We, Romanians, are experts because in the Communist times resources were scarce and people often formed long queues to get to them, sometimes only on the basis of rumour. They would see a line of people and ask "Don't they have meat here?" and the answer would come "No, they don't have milk here. It's next building they don't have meat at". Anyway...
There is an option that can be used directly out of the box: the humble array. It has methods like .push (add an item), .pop (remove the latest added item - when you use it as a stack) and .shift (remove the oldest added item - when you use it as a queue). For small cases, that is all you need.
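For example (a trivial sketch):
// the humble array used as a queue (or a stack)
const queue = [];
queue.push(1);   // add an item to the tail
queue.push(2);
queue.shift();   // returns 1, the oldest item added (queue behavior)
queue.pop();     // returns 2, the latest item added (stack behavior)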
However, I needed it in a high performance algorithm and if you think about it, removing the first element of an array usually means shifting (hence the name of the function) all elements one slot and decreasing the length of the array. Consider a million items array. This is not an option.
One of the data structure concepts we are taught at school is the linked list. Remember that? Each item has a reference to the next (and maybe the previous) item in the list. You explore it by going from one item to the next, without indexing, and you can remove any part of the list or add to any part of the list just by changing the value of these references. This also means that for each value you want stored you have the value, the reference(s) and the overhead of handling a more complex data object. Again, consider a million numbers array. It's not the right fit for this problem.
Only one option remains: still using an array, but moving the start and the end of the array in an abstract manner only, so that all queue/dequeue operations take no effort. This means keeping a reference to the tail and the head of the queue in relation to the length of the queue and of the underlying array.
But first let's establish a baseline. Let's write a test and implement a queue using the default array push/shift methods:
// the test
const size = 100000;
const q=new Queue();
time(()=> { for (let i=0; i<size; i++) q.enqueue(i); },'Enqueue '+size+' items');
time(()=> { for (let i=0; i<size; i++) q.dequeue(i); },'Dequeue '+size+' items');
time(()=> { for (let i=0; i<size/10; i++) {
for (let j=0; j<10; j++) q.enqueue(i);
for (let j=0; j<9; j++) q.dequeue(i);
} },'Dequeue and enqueue '+size+' items');
// the Queue implementation
class Queue {
constructor() {
this._arr = [];
}
enqueue(item) {
this._arr.push(item);
}
dequeue() {
return this._arr.shift();
}
}
// the results
Enqueue 100000 items, 10ms
Dequeue 100000 items, 1170ms
Dequeue and enqueue 100000 items, 19ms
The Enqueue operation is just adding to an array, and the mixed test, which enqueues ten items and dequeues nine at every step (leaving one item in the queue each time), is only slightly slower, as the amount of array shifting is negligible. Dequeuing, though, is pretty heavy. Note that increasing the number of items just a little leads to an exponential increase in time.
Now let's improve the implementation of the queue. We will keep enqueue using Array.push, but use a _head index to determine which items to dequeue. This means faster speed, but the queue will never shorten. It's the equivalent of Romanians getting their product, but remaining in the queue.
// the Queue implementation
class Queue {
constructor() {
this._arr = [];
this._head = 0;
}
enqueue(item) {
this._arr.push(item);
}
dequeue() {
if (this._head>=this._arr.length) return;
const result = this._arr[this._head];
this._head++;
return result;
}
}
// the results
Enqueue 200000 items, 11ms
Dequeue 200000 items, 4ms
Dequeue and enqueue 200000 items, 11ms
The performance has reached the expected level. Dequeuing is now even faster than enqueuing because it doesn't need to expand the array as items are added. However, for all scenarios the queue is only growing, even when dequeuing all the items. What I can do is reuse the slots of the dequeued items for the items to be added. Now it gets interesting!
My point is that right now we can improve the functionality of our queue by replacing dequeued but still stored items with newly enqueued items. That is the equivalent of Romanians leaving the queue only after they get the meat and a new Romanian comes to take their place. If there are more people coming than getting served, then people that got their meat will all leave and we can just add people to the tail of the queue.
So let's recap the algorithm:
we will use an array as a buffer
the queue items start at the head and end at the tail, but wrap around the array buffer
whenever we add an item, it will be added in the empty space inside the array and the tail will increment
if there is no empty space (queue length is the same as the array length) then the array will be rearranged so that it has space for new items
when we dequeue, the item at the head will be returned and the head incremented
whenever the head or tail reach the end of the array, they will wrap around
Some more improvements:
if we enqueue a lot of items then dequeue them, the array will not decrease until we dequeue them all. An improvement is to rearrange the array whenever the queue length drops below half of that of the array. It will add computation, but reduce space.
when we make space for new items (when the array size is the same as the one of the logical queue) we should add more space than just 1, so I will add the concept of a growth factor and the smallest size increase.
Here is the code:
/**
* A performant queue implementation in Javascript
*
* @class Queue
*/
class Queue {
/**
*Creates an instance of Queue.
* @memberof Queue
*/
constructor() {
this._array = [];
this._head = 0;
this._tail = 0;
this._size = 0;
this._growthFactor = 0.1;
this._smallestSizeIncrease = 64;
}
/**
* Adding an iterator so we can use the queue in a for...of loop or a destructuring statement [...queue]
*/
*[Symbol.iterator]() {
for (let i = 0; i < this._size; i++) {
yield this.getAt(i);
}
}
/**
* Returns the length of the queue
*
* @readonly
* @memberof Queue
*/
get length() {
return this._size;
}
/**
* Get an item based on its index in the queue
*
* @param {*} index
* @returns
* @memberof Queue
*/
getAt(index) {
if (index >= this._size) return;
return this._array[(this._head + index) % this._array.length];
}
/**
* Gets the item that would be dequeued, without actually dequeuing it
*
* @returns
* @memberof Queue
*/
peek() {
return this.getAt(0);
}
/**
* Clears the items and shrinks the underlying array
*/
clear() {
this._array.length = 0;
this._head = 0;
this._tail = 0;
this._size = 0;
}
/**
* Adds an item to the queue
*
* @param {*} obj
* @memberof Queue
*/
enqueue(obj) {
// special case when the size of the queue is the same as the underlying array
if (this._size === this._array.length) {
// this is the size increase for the underlying array
const sizeIncrease = Math.max(this._smallestSizeIncrease, ~~(this._size * this._growthFactor));
// if the tail is behind the head, it means we need to move the data from the head to
// the end of the array after we increase the array size
if (this._tail <= this._head) {
const toMove = this._array.length - this._head;
this._array.length += sizeIncrease;
for (let i = 0; i < toMove; i++) {
this._array[this._array.length - 1 - i] = this._array[this._array.length - 1 - i - sizeIncrease];
}
this._head = (this._head + sizeIncrease) % this._array.length;
}
else
// the array size can just increase (head is 0 and tail is the end of the array)
{
this._array.length += sizeIncrease;
}
}
this._array[this._tail] = obj;
this._tail = (this._tail + 1) % this._array.length;
this._size++;
}
/**
* Removes the oldest item from the queue and returns it
*
* @returns
* @memberof Queue
*/
dequeue() {
if (this._size === 0) {
return undefined;
}
const removed = this._array[this._head];
this._head = (this._head + 1) % this._array.length;
this._size--;
// special case when the size of the queue is too small compared to the size of the array
if (this._size > 1000 && this._size < this._array.length / 2 - this._smallestSizeIncrease) {
if (this._head<this._tail) {
this._array = this._array.slice(this._head,this._tail);
} else {
this._array=this._array.slice(this._head, this._array.length).concat(this._array.slice(0,this._tail));
}
this._head = 0;
this._tail = 0;
}
return removed;
}
}
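Here is how one might use it (a quick sketch):
const queue = new Queue();
queue.enqueue('first');
queue.enqueue('second');
queue.enqueue('third');
console.log(queue.length);    // 3
console.log(queue.dequeue()); // 'first'
console.log(queue.peek());    // 'second' (not removed)
console.log([...queue]);      // ['second', 'third'], courtesy of the iterator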
Final notes:
there is no specification on how an array should be implemented in Javascript, therefore I've used the growth factor concept, just like in C#. However, according to James Lawson, the array implementation is pretty smart in modern Javascript engines, so we might not even need it.
the optimization in dequeue might help with space, but it can be ignored if what you want is speed and you don't care about space usage
final benchmarking results are:
Enqueue 200000 items, 15ms, final array size 213106
Dequeue 200000 items, 19ms, final array size 1536
Dequeue and enqueue 200000 items, 13ms, final array size 20071
While on the road with his mother and baby brother, a ten year old prince is attacked by an enemy armed group. Thrown into a patch of thorns from where he cannot move, only watch, he sees his mother defiled and killed and his brother smashed on a rock like a toy. He vows vengeance. Such a classic story, right? Only we see him a few years later, leading a band of brigands, murdering and looting and raping, his vengeance all but forgotten and replaced by a desire to unite all the hundred little states warring against each other. Well, more interesting, but still pretty classic, right? Nope, stuff still happens that makes the lead character (and you) doubt his thoughts and the true nature of reality, and retroactively answers some of the questions an incredulous reader would be asking.
I would say Prince of Thorns is all about revealing layers of this world that Mark Lawrence is still shaping. I quite liked that. The first book sets things up, but it is not a setup book. It is filled with action. Nor does it tell us everything, leaving a lot to be explored in the next books in the series. That's something that is sorely missing in many modern stories. In order to enjoy the book, though, you have to suspend your disbelief when it tells of an eleven year old boy smashing heads, swinging swords and leading men. Yes, in feudal times being 11 is the time to have a midlife crisis, but it is all a little bit too much for a child.
It is a Game of Thrones kind of book, but mercifully told from the standpoint of a single character. There is not a lot of lore, but there is magic and a mysterious connection to an advanced but now dead civilisation, plenty of violence and strategy. I will probably read the next books in the series.
While working on my pet project Linqer (LINQ for Javascript and Typescript) I've spent quite a lot of time improving the performance of the Quicksort algorithm I am using for .orderBy. Therefore I am publishing it here, even if you could extract it just the same from the Linqer sources, with limited discussion on what is going on.
Why
First, why use it at all? Doesn't Javascript have the .sort method in the Array class? What's wrong with that?
The answer is that the implementation for sort is different from browser to browser, or better said, from Javascript engine to Javascript engine. In Chrome, for example, the algorithm used is insertion sort, which is simple, in place, stable and reasonably fast. It is optimized for the most common usage: small arrays that need to be sorted for UI purposes and such. However, when using large arrays, the algorithm doesn't perform as well as one might expect.
For Linqer I had an additional reason, because I would use ordering followed by skip and take methods that limited the scope of the need for sorting. Imagine a million items array that I wanted ordered and then needed the first ten items. Sorting the entire thing for just ten items would have been overkill. The default .sort function doesn't have parameters for such scenarios.
And there is another reason: the default function used to compare array items is alphanumeric. [1, 2, 10] would get ordered as [1, 10, 2].
Second, why Quicksort? There are a bunch of sorting algorithms out there: Mergesort, Heapsort, Radixsort, Timsort, Selectionsort. What's so special about Quicksort?
I have to admit that I went for it by googling fast sorting algorithm. It does have "quick" in the name, doesn't it? I also found it elegant and easy to comprehend. And for my particular scenario, I liked that it used a divide et impera strategy which allowed me to ignore parts of the array if I didn't need the items there. In other words, it's very well suited both as a general sorting algorithm and a partial sorting algorithm.
What
I would like to tell you that it's simple to explain what Quicksort does, but it requires some amount of attention and time. In general terms, it chooses an arbitrary item (called a pivot) then orders the remaining items relative to the pivot, in two so called partitions: the smaller items to the left, the larger to the right. Then it repeats the process for each of the two sides. How the pivot is chosen and how the partitions are handled is what differentiates Quicksort algorithms and determines their performance.
It is an in place algorithm, meaning it doesn't copy the array into some other type of structure, but instead moves items around inside it. It is not a stable algorithm, meaning the order of "equal" items is not preserved. The average computational complexity is O(n log n), with the worst cases being O(n^2). The space complexity is harder to determine. Most people say it is O(1) because it uses no additional data structures, but that is not really correct. Being a recursive algorithm, the call stack gets used quite a lot, an invisible storage that should be counted in the space complexity.
Unfortunately, the worst case scenarios are also very common: already sorted arrays and arrays filled with the same value. There are various optimizations to be used in order to handle this sort of thing. Also, Quicksort is efficient with large quantities of data, but less so with small numbers of items.
How
Finally, we get to the code. The _quicksort function receives:
an array
left and right index values determining the inclusive area that will be sorted (usually 0 and array.length-1)
a comparer function of the form (item1, item2) => 1, 0 or -1, defaulting to _defaultComparer, which compares items using the > and < operators
min and max index values determining the window of the array that we need to have sorted
The left and right indexes determine which section (before the sort) of the array will be sorted, the min and max indexes determine which items I am interested in (after the sort). This allows me to skip ordering partitions that are outside my area of interest.
As I said, the pivot choice is important. Some strategies are very popular:
the last item in the array as the pivot
this is the strategy used in the original incarnation of Quicksort
leads to very poor performance when the array is already sorted
the median item
this suggests parsing the array in order to get the value, implying extra computation
it only makes sense when the values in the array are numbers
the average between the first, the last and the middle item
it only makes sense when the values in the array are numbers
the item that is in the middle of the array
this is the one that I am using
a random item in the array
this makes the algorithm escape scenarios where the performance would be bad
the outcome of the sorting is unpredictable in terms of time used and stability of items
multiple pivots
an interesting concept, but one that complicated the algorithm too much for comfort
Then there is the matter of the partitioning. I've used an optimization that involves two indexes, one at the start the other at the end of a partition, coming toward each other and swapping items that are on the wrong side of the pivot. In some implementations, if the pivot is the last item, the partitioning is from one side only. In others, multiple indexes are used to handle multiple pivots.
In most implementations, the algorithm recurses on _quicksort, but I refactored it to only recurse on the partitioning. Then, because I didn't want to get stack overflows when bad data was used, I eliminated the recursion and instead used a stack of my own where the partitions to be sorted are stored and wait their turn. This is where the space complexity comes in. In my case I was using a little more data than I actually needed, because I was adding partitions to the stack while also incrementing the index of the current partition, meaning the stack array grew with every handled partition. Even if there is no computational performance benefit, I've optimized this as well by adding a queueIndex which is used to recycle the slots in the partition array that are behind the partitionIndex. New partitions are added behind the partitionIndex and queueIndex is incremented. When the loop reaches the last partition in the stack, a new loop is started with the partitions from 0 to queueIndex. Thus, for a ten million item array the partition stack rarely goes above 40000 in length.
A further optimization is to use insertion sort on partitions that have become too small (under 64 items). It irks me to have had to do this, I would have liked to use a "pure" algorithm, but this improved the performance and minimized the size of the partition stack.
The Code
That's about it. Here is the code:
function _insertionsort(arr, leftIndex, rightIndex, comparer) {
for (let j = leftIndex; j <= rightIndex; j++) {
const key = arr[j];
let i = j - 1;
while (i >= leftIndex && comparer(arr[i], key) > 0) {
arr[i + 1] = arr[i];
i--;
}
arr[i + 1] = key;
}
}
function _swapArrayItems(array, leftIndex, rightIndex) {
const temp = array[leftIndex];
array[leftIndex] = array[rightIndex];
array[rightIndex] = temp;
}
function _partition(items, left, right, comparer) {
const pivot = items[(right + left) >> 1];
while (left <= right) {
while (comparer(items[left], pivot) < 0) {
left++;
}
while (comparer(items[right], pivot) > 0) {
right--;
}
if (left < right) {
_swapArrayItems(items, left, right);
left++;
right--;
}
else {
if (left === right)
return left + 1;
}
}
return left;
}
const _insertionSortThreshold = 64;
function _quicksort(items,
left, right, comparer = _defaultComparer,
minIndex = 0, maxIndex = Number.MAX_SAFE_INTEGER) {
if (!items.length)
return items;
const partitions = [];
partitions.push({ left, right });
while (partitions.length) {
({ left, right } = partitions.pop());
if (right - left < _insertionSortThreshold) {
_insertionsort(items, left, right, comparer);
continue;
}
const index = _partition(items, left, right, comparer);
if (left < index - 1 && index - 1 >= minIndex) {
partitions.push({ left, right: index - 1 });
}
if (index < right && index < maxIndex) {
partitions.push({ left: index, right });
}
}
return items;
}
const _defaultComparer = (item1, item2) => {
if (item1 > item2)
return 1;
if (item1 < item2)
return -1;
return 0;
};
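To wrap up, a quick usage sketch (the data and index values are made up for illustration):
// sort a small array completely
const arr = [5, 3, 10, 1, 2, 8, 7, 4, 9, 6];
_quicksort(arr, 0, arr.length - 1); // arr is now [1, 2, 3, ..., 10]

// partially sort a million random numbers, caring only about the first ten items
const big = Array.from({ length: 1000000 }, () => Math.random());
_quicksort(big, 0, big.length - 1, _defaultComparer, 0, 10);
const firstTen = big.slice(0, 10); // the smallest values, in order; the rest is only partially sorted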
I've accepted that the old man should teach me as the only way to become a champion, but it is hard to swallow. He is very old, but mischievous, so whenever I try to learn something from him, he kicks me to the ground. He tricks me again and again and again. I am frustrated, but I am trying to keep my cool. I am strong. If I were to really fight him, he might be smart, but every attack would break bone and then what good would he be? Just a bag of meat and broken shards. I close my eyes, I breathe, I tell myself it is worth it.
The old man apologizes and offers me a hand, I take it, only to be kicked in the ass and thrown into a jumble of debris. I lose my temper and stomp away. He doesn't understand. Getting angry at him is pointless, hurting him futile. I have nothing to learn from him. I walk through the old grounds of my conquests, now just the walled in and decrepit underground of the large arena above. I feel a presence behind me and I see the old man is following me, eyes to the ground. Contrition? Surely another of his tricks. "Begone!" I roar at him, but he goes to his knees and kowtows in front of me, his hands touching my feet. I feel tears swelling up in my eyes. He might as well be a little boy asking for forgiveness. Just who is the teacher and who is the student? Who is the adult here?
"How did you get to a hundred years or whatever behaving like a little kid?! You are a child!" I shout at him in admonishment. I look around and ghosts of my past awaken my anguish. I feel my face contort into a painful grin as my tears flow freely. "Every week I was coming here to murder people!", I rage, my voice barely my own, a booming, low, animal growl, my expression that of an enraptured madman, for sure. "I would stake my life every time and I would leave, alive, every time!". The images of old fights flash before my wet blurred vision and I imagine that some of the painted white walls might contain some of the scrolls of the ancient arts, built over by a world that doesn't get it anymore. "I loved it!", I say, walking in the dead halls, every step a pulse of power overlaying glorious past over grey reality. My body is shaking with now uncontrollable weeping. "I killed so many people and I miss it... so.... very... MUCH!".
Does he get it now, I ask myself? Has he even an inkling of the power he needs to teach me to control? I burst through the door to the surface and climb the stairs that get me to the arena above. The seats are packed with oblivious spectators, all watching some performance I don't even care to notice. I breathe in the fresh air and feel better. Ready to come to a final understanding with the old man, if he is capable of it, I turn around. There is little time and we should not fight each other. But the old man is gone.
I strain my eyes into the darkness of the stairs and I feel it, The Beast, the adversary I need to fight is there. He's got the old man and, even if I cannot see it, I know it is there, all cunning, fury and power. My body roars by itself, a predator sound, strong and fearless, no sound a man should ever be able to make. The arena spectators panic in surprised horror, but I ignore them. I jump into the darkness with animal strength. I will fight this beast, I will meet it head on, I will be the most savage, alone I will remain alive.
The Grace of Kings feels long from the very start. Ken Liu is starting off from a fictional empire of seven islands, but it might as well have been a historical book. Everything is mostly realistic, with very human characters that do what human characters do: harm and kill other people and find rationalizations for it. Some of them are heroic and occasionally think of other people, too.
Halfway through the book (the first of a trilogy, of course) I couldn't keep up with all the characters that kind of did the same thing, the long expositions of why people did stupid or horrible things to others and the various anecdotes that made some of the characters heroes or villains. And I call them anecdotes because that's what they feel like: short moments that disrupt rather than reinforce the long and unfortunately boring history of the realm.
Bottom line, it feels like a Chinese Game of Thrones, with less interesting characters and no magic as of yet. It's not badly written, quite the contrary, but its subject is long winding and doesn't interest me. I will therefore abandon reading it.
I guess I don't have to tell you about ad blockers and browser extensions that improve YouTube. They are a dime a dozen and bring many features to the habitual YouTube watcher. However there is one particular new YouTube annoyance that you don't really need an extension to get rid of: the dreaded Video paused dialog.
To get rid of it is easy: on an interval, check if there is a visible element of a certain type containing a certain text and click its button. While this can be done in plain Javascript, I am lazy, so the script I am using first loads jQuery, then runs a one-line function periodically. The code is to be copied and pasted in the Console tab of the browser's development tools, then run by pressing Enter (a reconstructed sketch follows the list below). The periodic one-liner:
finds the element with id button containing the text "Yes"
inside an element of type yt-confirm-dialog-renderer which is visible and contains the text "Video paused"
and clicks it
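Based on that description, the console script might look something like this (a reconstructed sketch; the selectors reflect YouTube's markup at the time and may need adjusting):
// load jQuery, then check every second for the "Video paused" dialog and confirm it
var script = document.createElement('script');
script.src = 'https://code.jquery.com/jquery-3.4.1.min.js';
script.onload = () => setInterval(
    () => $('yt-confirm-dialog-renderer:visible:contains("Video paused") #button:contains("Yes")').click(),
    1000);
document.head.appendChild(script);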
There is an even more comfortable solution, though, that I recommend. You will need a Chrome extension called cjs that loads whatever script you tell it in whatever page you want. It gives you the option to inject jQuery for you, so all you have to do is write the periodic check itself (the setInterval part of the sketch above) as the custom script for the YouTube site.
Wakenhyrst is very well written, but where it excels is the dissection of the hypocrisy of people. Michelle Paver is telling the story from the viewpoint of a young girl who must navigate the world and her own adolescence in the house of a father that has no love for her or for her mother, finds every reason to blame others for his shortcomings and deeds, and yet is untouchable because he is a man and the lord of the manor. What legions of screeching feminists could not do, Paver manages with her subdued, yet defiant description of how women are used and ignored and pretty much treated as glorified pets. It is impossible to not hate the father figure in the book, even as the main character is torn between wanting to forgive him and dealing with the creepy and sometimes evil shit he pulls. The ending is powerful, as the daughter finds the strength to sublimate her hate into an even more appropriate emotion: pity.
But the story's power is not limited to the detailed analysis of the human psyche. It binds together Anglican folklore, medieval beliefs about devils and angels and art whitewashed (in the actual sense of the term) by Puritans and systematically destroyed by Victorians, the power of untamed nature and the horror of human complacency. How refreshing to have a very young girl be the rational and intelligent agent that fends for herself in a world of mystical belief and societal poppycock, so that we can identify with her and see it as it was. How wonderful to have Paver describe it all without any trace of anachronism, as if she has lived in that world herself.
The story starts slow and the pace almost never picks up, yet the tension and the level of details are constantly increasing, managing to somehow convey at the same time two distinct and contrary feelings: one of slow burn and the other of untamed power rising to a crescendo. It brilliantly mingles the oppressive hot wet feel of subconscious fear and superstition with cold analytical reason as its adversary. In the beginning I wanted to rate it above average only, but now, the more I think about it the more I admire the writing and the way the book tells the story. Good job, Michelle Paver!
Bottom line: move past the slower start, it is certainly worth reading. A gothic tale of subliminal supernatural horror and a very human and real one at the same time.
Salvation Lost is the second book in the Salvation Sequence trilogy from Peter F. Hamilton. I was commenting on the previous book, saying that it was mostly filler and action irrelevant to the larger story. This book is a lot more action packed and a bit more interesting, but ultimately just as pointless. A lot of characters that will only become relevant in the third book, if at all, a lot of stories that happen in the past while we already know what is going on 10,000 years into the future, and no closure on anything. The horror of the alien invasion is powerful, but not as much as it could have been. The story invests so much in some people only to kill them later with no apparent effect on the timeline of events.
Bottom line: I will probably read the last book, scheduled sometime in Sep 2020, but I feel this series is one of Hamilton's weakest.