He did it again! Or should I say he had done it before, since I am sort of moving backwards through Peter F. Hamilton's writings? Night's Dawn is a huge book, about 9MB of pure text, divided into three parts purely for paper-formatting reasons, I am sure. So far, this is the best thing he has written, at least in my opinion.

Maybe the guy is the kind of writer who writes his best work first, then tries unsuccessfully to follow up. Not that any of his follow-ups could be called a failure; it's just that Night's Dawn is really cool! I mean, who can seriously deal with possession, necromancy, devil worship, witch hunting, vampires, werewolves, ghosts and demons, all in a future world in which humanity has conquered space and tries to attack the situation with science and rationality? Seriously! If this guy had written the Bible, there would be no Muslims! (Ron Hubbard, eat your heart out!)

There isn't much else I can say. I certainly cannot summarize a book that spans about a dozen inhabited planets, each with its own history, its own socio-economic situation and its own characters adding plot (real plot) to the story. Right now I am terrified. I need to find the Greg Mandel trilogy, the last of Hamilton's big stories left for me, and there are only two possible outcomes: a) I hate it, which would waste my time and my trust in humanity; b) I love it, and then I go into withdrawal waiting for the last two volumes of the Void trilogy and whatever else he brilliantly writes in the future.

Bottom line: if you like Sci-Fi, you need to read this. Hamilton and the ReSharper guys are the only people I ever felt the need to send money to in order to apologize for my shameless Internet piracy.

Links:
Peter Hamilton's official site
Wikipedia's entry on the Night's Dawn trilogy
Peter F. Hamilton's entry in Wikipedia.

There are a lot of people in the world who have problems with parallel parking. It is not only difficult to manage when you have nothing to go on but the tiny images in the car's mirrors, but stressful as well, since the procedure is normally done on a crowded street with the drivers behind you urging you to do it faster.

Not a problem anymore! The Chinese have found the perfect way of parking in under 5 seconds! Here is a video tutorial on how to do it:



Here is an alternative solution.


Two months ago I wrote a post about the exotic fruits I found in a hypermarket in Sibiu; now it's time for some additions, found at the Bucharest Metro.


So here are the fruits I bought today:
the Maracuya (passion fruit). It has the same feel as a fruit from my first post, the Kiwano. The taste, though, is very sour, like a lemon, and a little spicy. I could eat it, but I don't think people really eat it raw. The outer skin is hard, thick and inedible.
the Kaki fruit. It is very tasty, although it has a pear-like texture that I didn't quite like. It is sweet, with a taste reminiscent of banana.
the Cactus fruit. It has a sweet coating around the hard, inedible seeds inside, with the texture of baby food and the taste of an unaromatic cantaloupe, similar to the Pepino. The seeds, though, make it less than pleasant to eat.
the Pepino melon. It has a texture similar to the Kaki, but tastes like cantaloupe, only not as aromatic.
the Papaya, an interesting-tasting fruit. I am afraid my best approximation is still the cantaloupe, but the papaya has aromatic notes of its own, closer to banana, and a different texture. It has a big core of inedible seeds and the skin is also inedible, which makes the useful part of the fruit rather small.

In conclusion, one must definitely try the papaya and the kaki. The maracuya has the weirdest taste of them all, not entirely pleasant, although I might try eating it with sugar or something. The net suggests letting the skin wrinkle, then mixing the seedy content with water and sugar.

and has 0 comments
I went to one of the Metro hypermarkets in Bucharest and was pleasantly surprised by the variety of stuff one can find there, compared with other hypermarkets, which all seem to be supplied from the same sources. Also check my next entry, where I write about exotic fruits.

In this post I will talk about calamari! There is an entire store dedicated to fish in Metro, filled with a lot of nice-looking and/or packaged treasures of the sea. My picks were swordfish steaks and one big calamari. Well, not that big... it's no Architeuthis, but it will do.

I went home, made a longitudinal incision, threw away the awkward-looking organ inside (which I suspect had a digestive function), along with the eyes and beak, then threw it into hot butter after sprinkling a bit of spice over it. I took it out after 2-3 minutes and ate it. Yes, it's that simple! The taste is not strong, but really special, and it was totally worth the buy.

Warning, as gathered from googling about calamari: it should be cooked either for 2-3 minutes or for more than 30. Anything in between turns it to rubber. So, if you are like my wife and want to spit it out after tasting it, you might want to try the slow-cooking calamari recipes out there. :)

A while ago I wrote a little post about changing a paging GridView to show a certain number of pages and a page index, while extracting from the database only the data you need. I was looking for something in SQL Server like MySQL's LIMIT: the T-SQL equivalent of TOP, but accepting two parameters (offset and count) instead of only one.

I just found out that there is a way of doing this in SQL Server 2005 by employing functions that return the position of a row within a certain ordering and partition, called ranking functions. From the little googling I did, it seems that Microsoft borrowed this technique from Oracle, but one never knows, maybe it was the other way around.

I will write a short example on ranking functions, then link to the appropriate articles. Given this SQL query:
select *,
Row_Number() OVER (ORDER BY name) as [Index],
Rank() OVER (ORDER BY name) as Rank,
Dense_Rank() OVER (ORDER BY name) as DenseRank,
NTile(3) OVER (ORDER BY name) as NTile,
Row_Number() OVER (PARTITION BY nr ORDER BY name) as IndexNr,
Rank() OVER (PARTITION BY nr ORDER BY name) as RankNr,
Dense_Rank() OVER (PARTITION BY nr ORDER BY name) as DenseRankNr,
NTile(3) OVER (PARTITION BY nr ORDER BY name) as NTileNr
from test
ORDER BY ID

you get the following result:
ID  Name     Nr  Nr2  Index  Rank  DRank  NTile  IndexNr  RankNr  DRankNr  NTileNr
1   Mark     1   7    8      7     4      2      2        2       2        2
2   Mike     1   4    11     11    6      3      3        3       3        3
3   John     2   8    5      5     3      2      1        1       1        1
4   Charles  3   2    1      1     1      1      1        1       1        1
5   Ellen    3   6    4      3     2      1      2        2       2        2
6   Mary     4   1    9      9     5      3      2        2       2        2
7   Mark     4   17   7      7     4      2      1        1       1        1
8   Mike     2   41   12     11    6      3      2        2       2        2
9   John     6   83   6      5     3      2      1        1       1        1
10  Charles  1   72   2      1     1      1      1        1       1        1
11  Ellen    0   68   3      3     2      1      1        1       1        1
12  Mary     3   21   10     9     5      3      3        3       3        3


As you can see, Row_Number returns the row index, Rank returns the rank (with gaps after ties), Dense_Rank returns consecutive ranks (no gaps between rank numbers), while NTile puts each row into one of a given number of categories. PARTITION BY makes the functions operate separately for each distinct value of a certain column, in this case nr. If the partition had been on nr2, all the partitioned ranking values would have been 1, since the values in the nr2 column are all distinct. The OVER clause can be used on more than just ranking functions; it also works on aggregate functions. Yummy!
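Back to the original paging problem: a minimal sketch, using the same test table as above, of fetching only one "page" of rows. Since OVER cannot appear directly in a WHERE clause, the ranked query goes in a subquery (the page boundaries 5 and 8 are just example values):

```sql
-- Return rows 5 through 8 of the test table, ordered by name.
SELECT ID, name, nr
FROM (
    SELECT ID, name, nr,
        Row_Number() OVER (ORDER BY name) AS RowIndex
    FROM test
) AS ranked
WHERE RowIndex BETWEEN 5 AND 8
ORDER BY RowIndex
```

In a paging GridView, the two boundaries would be computed from the page index and page size, which is exactly the two-parameter LIMIT behavior I was looking for.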

Links:
Ranking Functions (Transact-SQL)
OVER Clause (Transact-SQL)
Aggregate Functions (Transact-SQL)
This article also shows a similar method for SQL Server 2000, of which I knew nothing until today: Row_Number() function in SQL Server 2005
Returning Ranked Results with Microsoft SQL Server 2005

Maybe this works for other IIS versions as well, but I was certainly looking for a way of turning on ASPX compression on our Windows 2000 development/test computer. So this is the long story:
HOW TO: Enable ASPX Compression in IIS

and this is the short one:
Step 1: backup your site metabase
Go to the Internet Information Services (IIS) tab and right click on it, go to All Tasks, choose Backup/Restore Configuration and save it.

Step 2: make the change to the metabase
Create a .bat file that has the following content:
net stop iisadmin
cd C:\InetPub\adminscripts
CSCRIPT.EXE ADSUTIL.VBS SET W3Svc/Filters/Compression/GZIP/HcScriptFileExtensions "asp" "dll" "exe" "aspx"
CSCRIPT.EXE ADSUTIL.VBS SET W3Svc/Filters/Compression/DEFLATE/HcScriptFileExtensions "asp" "dll" "exe" "aspx"
net start w3svc


Make sure to restart the SMTP service or any others that were stopped by the bat file. I don't know how to start them from the command line and I pretty much don't care. The batch file will notify you of the services it is about to shut down, but in the end it only restarts the Web service.

The performance is immediately visible and it also works with Ajax.

Update:
This article was originally talking about Windows XP. Thanks to McHilarant (see comment below) I realized that, even if the changes in the metabase are possible on any IIS5 (Windows XP and Windows 2000), the actual compression will not be possible on XP. I remembered then that the actual modification that I did that time was not on my dev machine, but on our office server, therefore I updated the post accordingly.

Another Update:
Here is a link about a script to enable IIS 6 gzip compression: Script to Enable HTTP Compression (Gzip/Deflate) in IIS 6.

I was looking for ways of speeding up my web pages, considering that one utility page I built reached 500Kb, and of that, only 170Kb were actually content; the rest was Javascript. Bad Javascript, I admit, but it works, ya know?

So, after thinking and thinking I thought of a novel approach to this: how about googling for it? and I found this very nice and comprehensive article: Speed Up Your Javascript Load Time.

There isn't much else I can say about it.

Before Asp.Net Ajax was released I used to work with a library called AjaxPro (formerly Ajax.Net). It also wrapped specially decorated methods and sent them to the Page as javascript functions with a callback javascript function as the last parameter. In this library, by not using a callback you were saying that you want to execute the function synchronously, that you want to wait for a result. After all, that's a functionality included in the XmlHttpRequest object.

I was amazed to see that Asp.Net Ajax does not offer such an option. You NEED a callback function. So... how do you use a WebMethod as a javascript validation function, since the validators only accept a synchronous function?

The first idea I had was the easy way: loop inside the function until I had a result. Bad idea: Javascript is single-threaded, so the loop just took the CPU to 100% and that was the end of it: kill task. The second idea was the hard way: redo all the code in the XmlHttpRequest wrapper and use the included synchronous functionality. Too complicated, although I found an article about how to do that. So here is the way I did it:
1. Get the validator object
2. Set a property on it to tell if the validator is inside an async call or not
3. Set a property to hold the last value that was validated
4. Set a property to hold the last result returned
5. Set a method of the validator as the call back function
6. If it is in an async call, return false
7. If the last value is the same as the value currently being validated, return the last result
8. Execute the async validation function with the validator's callback method as a parameter and return false (or "in async")

Now, the callback method just sets the result and clears the async flag and also updates the visual of the validator. By doing this I essentially don't allow postbacks during validation calls, which makes sense. I also had an idea of creating a special "in async" validator that would not allow postback and would display something nice, but the solution I am explaining keeps it local, with ValidationGroups and everything working just fine.

I will show the code in a few seconds, but first be warned that I am using a validator made by myself which expects null or true as valid and a string or false as invalid. If the result is a string, it displays it as the ErrorMessage. So, on with the code:
The function to get a reference to the validator (which in some cases, especially Ajax, is not the same as the span with the same id)
function getValidator(id)
{
    // Page_Validators is the array of validators ASP.Net renders in the page
    if (typeof(Page_Validators)=='object')
        for (var c=0; c<Page_Validators.length; c++)
            if (Page_Validators[c].id==id) return Page_Validators[c];
    return false;
}


The preparing function, which can be reused for more validators. I couldn't find a better way to encapsulate this behaviour yet, please feel free to enlighten me with ideas
// parameters are the value to be validated
// and the validator object
function prepareAsync(value, validator) {
    if (!validator) {
        // no validator object to set doReturn on;
        // the caller must handle this case itself
        return 'Error in prepareAsync: validator not found!';
    }
    // it's validating
    if (validator.inAsync) {
        validator.doReturn=true;
        return 'In async validation';
    }
    // it has validated this value before
    if (validator.prevValue==value) {
        validator.doReturn=true;
        return validator.result;
    }
    // create the callBack function as a method
    // of the validator object
    if (!validator.callBack)
        validator.callBack=function(res) {
            validator.inAsync=false;
            validator.result=res;
            if (res==null) {
                validator.isvalid=true;
            } else if (typeof(res)=='boolean') {
                validator.isvalid=res;
            } else {
                // setting the validation message is
                // as simple as changing the innerHTML
                // of the validator span
                validator.innerHTML=res;
                validator.isvalid=false;
            }
            // that's an ASP.Net validator function
            // present since .NET 1.0
            ValidatorUpdateDisplay(validator);
        }
    validator.inAsync=true;
    validator.prevValue=value;
    validator.doReturn=false;
}


and now a sample validation function:
// parameters are the value to be validated
// and the id of the validator
function ValidateProdotto(value, id) {
    // get the validator object
    var validator = getValidator(id);
    // bail out early if the validator was not found
    if (!validator) return 'Validator not found!';
    // reuse this function for all validators in the page
    var res = prepareAsync(value, validator);
    // doReturn says if it already has a result for us
    if (validator.doReturn) return res;
    // perform the actual validation
    PageMethods.ValidateProdotto(value, validator.callBack);
    // and return false, since we are in async now
    return false;
}


Hope this helps somebody. This applies to short validations, but feel free to enhance my design with cancellations of running async operations in case the value is changed or with an array caching all the values tried and validated.

Yesterday I was trying to figure out what the hell was going on with my application. You see, I was moving controls from one panel to another when the browser was FireFox, and I forgot all about it. In Internet Explorer it would work; in FireFox, ASP.Net would crash with a "Server not available" error and no relevant message as to why. In the Application Event Log I would get entries like:
aspnet_wp.exe (PID: 872) stopped unexpectedly.
and
EventType clr20r3, P1 aspnet_wp.exe, P2 2.0.50727.832, P3 461ef1db, P4 system, P5 2.0.0.0, P6 461ef191, P7 143a, P8 1f, P9 system.stackoverflowexception, P10 NIL.

and no explanations. Google would give me a lot of links pertaining to web services, asynchronous operations and multithreading, but my application is a simple app. What was going on?

It turns out I was doing this:
pnlNoAjax.Controls.Add(UpdatePanel1.ContentTemplateContainer.Controls[0]);
Nothing wrong with it, except the control that I was adding was pnlNoAjax itself, so it went round and round chasing its own tail until a StackOverflowException was generated.

So, bottom line: try to see if you don't have a control adding itself to its control collection when you meet this weird error. Or, of course, the problem might be in a WebService or threading operation :)

You have one beautiful page with Ajax.Net, UpdatePanels and the sorts. Then you go test it with FireFox and it doesn't work. Or you have some other reason for not wanting Ajax in certain situations, like for example people with no Javascript. So what do you do? How can you tell the page NOT to do Ajax postbacks?

The solution is to set the ScriptManager's EnablePartialRendering property to false. I used to do it with a Panel outside the UpdatePanel: in the page Init cycle I would move all controls from the UpdatePanel to this normal panel. And it worked, too.

Quick fix: look for indexed views or indexes on tables with computed columns and delete the index or change it to not reference the view or the computed column.

I got this error while trying to optimize an SQL server. I ran the trace, I used the Database Engine Tuning Advisor, and it gave me an SQL file of magical CREATE STATISTICS and CREATE INDEX lines. But after I applied it, happy that I would get a 38% increase in my SQL speed, I got this weird error:
UPDATE failed because the following SET options have incorrect settings: 'ARITHABORT'

I wasn't aware that an index could break your database! So I found this little Microsoft article: PRB: Error "INSERT Failed" When You Update Table Referenced in an Indexed View.

Yet it only speaks about indexed views, and it somehow shifts the blame to a missing ARITHABORT setting. Not so. In my case it was one of the indexes referencing a computed column in its INCLUDE clause.

In my particular case, changing the offending index to not reference the computed column was the solution. Of course, indexing computed columns and views is totally possible, but it depends on how you create those indexes. In my case, SET ARITHABORT ON was set before creating the index. The solution in the Microsoft Support article might be better, even if less attractive to lazy people such as myself.
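For the record, here is a sketch of the alternative route (table, column and index names are hypothetical): creating an index that covers a computed column with the session options the engine requires, so that later updates with the same options set do not fail:

```sql
-- These session options must be ON (and NUMERIC_ROUNDABORT OFF)
-- when creating an index involving a computed column or an indexed
-- view, and when modifying the indexed data afterwards.
SET ARITHABORT ON;
SET ANSI_NULLS ON;
SET ANSI_PADDING ON;
SET ANSI_WARNINGS ON;
SET CONCAT_NULL_YIELDS_NULL ON;
SET QUOTED_IDENTIFIER ON;
SET NUMERIC_ROUNDABORT OFF;

-- Hypothetical example: Total is a computed column on the Orders table
CREATE INDEX IX_Orders_CustomerID
ON Orders (CustomerID)
INCLUDE (Total);
```

The catch is that every connection that later updates the table must have the same options set, which is why simply dropping the computed column from the index was the easier fix for me.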

Update February 2016: This post discusses a problem in .Net 2.0 that is more than 8 years old. Perhaps you should look elsewhere. That being said, there are several situations here:
  1. The default execution timeout in ASP.Net is 90 seconds; increase it to whatever you like (in seconds) with <system.web>
       <httpRuntime executionTimeout="VALUE" />
    </system.web>
    Also read this: What exactly is Appdomain recycling, since it likely applies in a lot more situations where the server decides the app needs to be recycled.
  2. You use Response.Redirect, Server.Transfer or Response.End directly; Response.End is the method that ultimately throws this exception. Response.Redirect has an optional bool parameter that, when set to false, does not call Response.End. Response.End itself is obsolete; the recommended replacement is the HttpApplication.CompleteRequest method. Read the remarks for Response.End: it is a method left there exclusively for compatibility with classic ASP and it is pretty damn awful. See here how to replace Response.End.
  3. The original problem that made me write this post: a long running method inside a web service in .Net 2.0, to which I had not found the solution back then. The solution is probably related to the AppDomain/application pool recycling from the first point; at the time I did not look hard enough at all the application pool settings.
  4. Some issues encountered by other readers were caused by the ASP.Net application writing in its own bin folder, triggering an automatic recompilation of the app.

Read more about the ThreadAbortException, which is raised when aborting a thread, which in itself is a pretty bad idea. Ultimately, I would say that getting this exception nowadays is a sure sign of bad design.

Now for the original post:

Update: the problem I was facing is still there, with no solution. I must be doing something wrong, but I can't figure out what. It involves a web service in .NET 2.0, added with a Web Reference to another app. The service has a method that takes a lot of time. If I call it synchronously it works, depending on the Timeout property of the web service proxy and the executionTimeout setting in web.config. If I call it asynchronously, though, it doesn't work! If it takes too long, the thread just gets killed. The Timeout property of the web service proxy doesn't even count in async calls. My solution was to call the method synchronously and be done with it. If anyone has an answer for this, please let me know!

(Oh, but if you mean to tell me that an asynchronous operation in a web service should not take long and that it is bad design and so on and so on, don't let me know!)

The single article most related to my problem that I found on the net is this: timeout problem with webservice. It suggests a client-side problem, but offers no solution other than asynchronously invoking a synchronous call to the web service, and no explanation of the underlying problem.

As for the regular solution for the thread being aborted, it's actually not my solution; it was given in a forum by a guy with the nickname dstefanov, but I stole it with impunity :)

Here it is:
ThreadAbortException happens when the thread serving the request doesn't finish for a long time. The ASP.NET hosting process aborts this thread after a certain time. You can adjust that time with:

<system.web>
<httpRuntime executionTimeout="600" />
</system.web>

By default, executionTimeout is 90 seconds.


This applies to ASP.Net web apps, including web services, and thus also to console or Windows Forms applications that use web services. My own personal favourite is 86400, the number of seconds in a day.

Thanks dstefanov!

I saw this chick sing "They" on MTV. She was stripping in space in that video, which I found a bit distasteful, even if it did reference a Jane Fonda scene from the film Barbarella, Queen of the Galaxy. But that's the MTV world: strip if you want to get noticed. True enough, I haven't seen any of her other videos on any music television since, but then again, I stopped watching them.

Jem (real name Jemma Griffiths) is a Welsh singer and this is (I guess) the original video of "They". She is young and beautiful, but also has a nice voice and sound. Check her out at her official web site or her MySpace page.

I had to rewrite this entire post. It started from one nice article by a guy called Dan Wahlin and ended with at least three :) separate links to his articles, and his blog added to my Technorati favourites.

Here are links to what appears to be a series about Asp.Net Ajax, really informative and concise:

Update Panel properties explained: Implement UpdatePanel Properties

Implement visual cues during updates: Inform Users With Customized Visual Feedback

Minimize the load on the server on many subsequent clicks or refresh requests: Coping With Click-Happy Users

Here is a complete list of the links in the same series:
ASP.NET AJAX Articles

I found this while randomly browsing the net: a substance called ampakine that is supposed to increase memory and brain plasticity. An article called it "steroids for the brain" and I borrowed that for the title, since I found it appropriate enough.

Like all mind-enhancing drugs, it was discovered by accident, while working on a cure for Alzheimer's. Why would anyone try to enhance one's brain on purpose, anyway? :-|

I am not much of a biochemist (even if I have recently made some acquaintances who are :) and could ask them to enlighten me), so I will just post a list of links I found on the subject. It seems that there are no publicly available pills yet, as the drug is still in trials, but who knows... maybe we can become smarter rather than dumber for a change.

Here are the links:

ampakines
A profile of the behavioral changes produced by facilitation of AMPA-type glutamate receptors

ampakine
Ampakines
'Memory pill' for the forgetful