This post will be quite lengthy, as it details my findings on best practices with SQL, specifically Microsoft SQL Server.

I started with an article written by Vyas Kondreddi in 2001: SQL Server TSQL Coding Conventions, Best Practices, and Programming Guidelines. In 2001 people were not microblogging!

Well, to summarize the article and bring it up to date a little, here are some of the most important points (in my view):
  • Decide upon a database naming convention, standardize it across your organization, and be consistent in following it. It helps make your code more readable and understandable.
  • Write comments in your stored procedures, triggers and SQL batches generously, whenever something is not very obvious.
  • Try to avoid server-side cursors as much as possible.
    As Vyas Kondreddi himself says: "I have personally tested and concluded that a WHILE loop is always faster than a cursor".
  • Avoid the creation of temporary tables while processing data as much as possible, as creating a temporary table means more disk I/O. Consider using advanced SQL, views, SQL Server 2000 table variable, or derived tables, instead of temporary tables.
    This is interesting, because I usually use a lot of temporary tables in my stored procedures to make the code more orderly. I guess that with SQL Server 2005 and later one can always use Common Table Expressions to make the code more readable (a small sketch comparing a temporary table to a CTE follows after this list). For SQL Server 2000 I found two interesting articles about not using temporary tables and replacing them with either derived tables (selects within selects) or with table variables, which do have some limitations, thoroughly explained in the latter post. Here are the links: Eliminate the Use of Temporary Tables For HUGE Performance Gains and Should I use a #temp table or a @table variable?
  • Try to avoid wildcard characters at the beginning of a word while searching using the LIKE keyword, as that results in an index scan, which defeats the purpose of an index.
    For a short analysis of index scans go to SQL SERVER - Index Seek Vs. Index Scan (Table Scan). A small example of a seek-friendly versus a scan-forcing LIKE is sketched after this list.
  • Use the graphical execution plan in Query Analyzer or SHOWPLAN_TEXT or SHOWPLAN_ALL commands to analyze your queries.
  • Use SET NOCOUNT ON at the beginning of your SQL batches, stored procedures and triggers in production environments, as this suppresses messages like '(1 row(s) affected)' after executing INSERT, UPDATE, DELETE and SELECT statements. This improves the performance of stored procedures by reducing network traffic.
  • Use the more readable ANSI-standard join clauses instead of the old-style joins (both styles are shown in a sketch after this list).
  • Incorporate your frequently required, complicated joins and calculations into a view so that you don't have to repeat those joins/calculations in all your queries.
  • Use User Defined Datatypes if a particular column repeats in a lot of your tables, so that the datatype of that column is consistent across all your tables.
    Here is a great article about SQL UDTs (not the new .NET CLR types): What's the Point of [SQL Server] User-Defined Types?. I have never used them myself, but then again I am not an SQL guy; for me it seems easier to control data from .NET code.
  • Do not let your front-end applications query/manipulate the data directly using SELECT or INSERT/UPDATE/DELETE statements. Instead, create stored procedures, and let your applications access these stored procedures.
    I am afraid I also fail at this point. I don't use stored procedures for simple actions like selecting a specific item or deleting a row. Many times I have to build search pages with lots of parameters and I find it really difficult to pass a variable number of parameters to a stored procedure, for example a string that I have to split by spaces and search for all the words found. Would it be worth using a stored procedure in such a situation? (One possible approach is sketched after this list.)
  • Avoid dynamic SQL statements as much as possible. Dynamic SQL tends to be slower than static SQL, as SQL Server must generate an execution plan every time at runtime.
    Personally, I never use dynamic SQL. If I need to create an SQL string I do it from .NET code, not from SQL.
  • Consider the following drawbacks before using the IDENTITY property for generating primary keys. IDENTITY is very much SQL Server specific, and you will have problems porting your database application to some other RDBMS. IDENTITY columns have other inherent problems. For example, IDENTITY columns can run out of numbers at some point, depending on the data type selected; numbers can't be reused automatically, after deleting rows; and replication and IDENTITY columns don't always get along well.
    So, come up with an algorithm to generate a primary key in the front-end or from within the inserting stored procedure. There still could be issues with generating your own primary keys too, like concurrency while generating the key, or running out of values. So, consider both options and go with the one that suits you best.
    This is interesting because I always use identity columns for primary keys. I don't think a data export or a database engine change justifies creating a custom identity system. However, I do have to agree that in the case that data is somehow corrupted a GUID or some other identifier would be more useful. I am sticking with my IDENTITY columns for now.
  • Use Unicode datatypes, like NCHAR, NVARCHAR, or NTEXT.
  • Perform all your referential integrity checks and data validations using constraints (foreign key and check constraints) instead of triggers, as they are faster.
  • Always access tables in the same order in all your stored procedures and triggers. This helps in avoiding deadlocks. Other things to keep in mind to avoid deadlocks are: keep your transactions as short as possible; touch as little data as possible during a transaction; never, ever wait for user input in the middle of a transaction; do not use higher-level locking hints or restrictive isolation levels unless they are absolutely needed; make your front-end applications deadlock-intelligent, that is, able to resubmit the transaction in case the previous one fails with error 1205; and, in your applications, process all the results returned by SQL Server immediately, so that the locks on the processed rows are released and no blocking occurs.
    I don't have much experience with transactions. Even if I needed transactions in some complex scenario, I would probably use the .NET transaction system.
  • Offload tasks, like string manipulations, concatenations, row numbering, case conversions, type conversions etc., to the front-end applications.
    I totally agree, except for the row numbering, where SQL Server 2005 added all those nice aggregate ranking options (see Getting the index or rank of rows in SQL Server 2005).
  • Always add a @Debug parameter to your stored procedures. This can be of BIT data type. When a 1 is passed for this parameter, print all the intermediate results and variable contents using SELECT or PRINT statements; when 0 is passed, do not print anything. This helps in quickly debugging stored procedures, as you don't have to add and remove these PRINT/SELECT statements before and after troubleshooting problems.
    Interesting, I may investigate this further, although the SQL debugging methods have improved significantly since the article was written. A minimal stored procedure skeleton combining the @Debug parameter with SET NOCOUNT ON and a return status is sketched after this list.
  • Make sure your stored procedures always return a value indicating their status. Standardize on the return values of stored procedures for success and failure. The RETURN statement is meant for returning the execution status only, not data. If you need to return data, use OUTPUT parameters.
  • If your stored procedure always returns a single row resultset, consider returning the resultset using OUTPUT parameters instead of a SELECT statement, as ADO handles output parameters faster than resultsets returned by SELECT statements.
  • Though T-SQL has no concept of constants (like the ones in the C language), variables can serve the same purpose. Using variables instead of constant values within your queries improves readability and maintainability of your code.
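To make a few of the points above more concrete, here are some minimal T-SQL sketches. Every table and column name in them (Orders, Customers, Products and so on) is invented for illustration, so treat them as hedged examples rather than ready-made code. First, the temporary table versus Common Table Expression point: the same aggregation written with a #temp table (extra work in tempdb) and with a CTE on SQL Server 2005 and later.

-- temporary table version: the intermediate result is written to tempdb
CREATE TABLE #CustomerTotals (CustomerId INT, Total MONEY);
INSERT INTO #CustomerTotals (CustomerId, Total)
SELECT CustomerId, SUM(Amount) FROM Orders GROUP BY CustomerId;
SELECT CustomerId, Total FROM #CustomerTotals WHERE Total > 1000;
DROP TABLE #CustomerTotals;

-- Common Table Expression version (SQL Server 2005 and later): no temporary object to create and drop
WITH CustomerTotals (CustomerId, Total) AS
(
    SELECT CustomerId, SUM(Amount) FROM Orders GROUP BY CustomerId
)
SELECT CustomerId, Total FROM CustomerTotals WHERE Total > 1000;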
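Second, the LIKE and leading wildcard point. Assuming an index exists on Customers.LastName, the first query below can use an index seek because the prefix is known, while the second forces a scan of every value:

SELECT CustomerId, LastName FROM Customers WHERE LastName LIKE 'Smi%'; -- index seek possible
SELECT CustomerId, LastName FROM Customers WHERE LastName LIKE '%son'; -- index or table scan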
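Third, the join style point. Both queries below return the same rows, but the ANSI-standard form keeps the join condition separate from the filtering condition:

-- old-style join: the join condition is mixed into the WHERE clause
SELECT o.OrderId, c.LastName
FROM Orders o, Customers c
WHERE o.CustomerId = c.CustomerId AND o.Amount > 100;

-- ANSI-standard join: the join condition is explicit
SELECT o.OrderId, c.LastName
FROM Orders o
INNER JOIN Customers c ON o.CustomerId = c.CustomerId
WHERE o.Amount > 100;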
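Fourth, the variable-number-of-search-parameters question. One possible approach, only sketched here, is to pass the whole search string to the procedure and split it into a table variable with a WHILE loop, so neither a cursor nor dynamic SQL is needed. Note that the '%word%' pattern will itself cause scans, as per the wildcard point above, so for serious text search a full-text index would be the better tool.

CREATE PROCEDURE SearchProducts
    @SearchWords NVARCHAR(500)
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @Words TABLE (Word NVARCHAR(100));
    DECLARE @Pos INT;
    -- split the space separated string into a table variable
    SET @SearchWords = LTRIM(RTRIM(@SearchWords)) + ' ';
    SET @Pos = CHARINDEX(' ', @SearchWords);
    WHILE @Pos > 0
    BEGIN
        IF @Pos > 1
            INSERT INTO @Words (Word) VALUES (LEFT(@SearchWords, @Pos - 1));
        SET @SearchWords = LTRIM(SUBSTRING(@SearchWords, @Pos + 1, LEN(@SearchWords)));
        SET @Pos = CHARINDEX(' ', @SearchWords);
    END
    -- return the products whose name contains any of the words
    SELECT DISTINCT p.ProductId, p.Name
    FROM Products p
    INNER JOIN @Words w ON p.Name LIKE '%' + w.Word + '%';
END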
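Finally, a minimal stored procedure skeleton combining SET NOCOUNT ON, the @Debug parameter and a standardized return status, again with invented table and column names:

CREATE PROCEDURE UpdateCustomerEmail
    @CustomerId INT,
    @Email NVARCHAR(200),
    @Debug BIT = 0
AS
BEGIN
    SET NOCOUNT ON; -- suppress the '(n row(s) affected)' messages

    IF @Debug = 1 PRINT 'Updating customer ' + CAST(@CustomerId AS NVARCHAR(20));

    UPDATE Customers SET Email = @Email WHERE CustomerId = @CustomerId;

    IF @@ROWCOUNT = 0
    BEGIN
        IF @Debug = 1 PRINT 'No customer found';
        RETURN 1; -- standardized failure status
    END

    RETURN 0; -- standardized success status
END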



The next stop was SQL Server Best Practices from Microsoft.

Here are the articles I found most important, covering everything from testing the I/O system of the machine you want to install SQL Server on, up to database backup, mirroring and maintenance:
Predeployment I/O Best Practices
SQL Server 2005 Deployment Guidance for Web Hosting Environments
SQL Server 2005 Security Best Practices - Operational and Administrative Tasks
Comparing Tables Organized with Clustered Indexes versus Heaps
Troubleshooting Performance Problems in SQL Server 2005
Implementing Application Failover with Database Mirroring
SQL Server 2005 Waits and Queues
TEMPDB Capacity Planning and Concurrency Considerations for Index Create and Rebuild
The Impact of Changing Collations and of Changing Data Types from Non-Unicode to Unicode
XML Best Practices for Microsoft SQL Server 2005
Performance Optimizations for the XML Data Type in SQL Server 2005
Top 10 Hidden Gems in SQL Server 2005

Lastly, some links that I will not go into in depth:

SQL Server 2000 Best Practices
SQL SERVER - 2005 Best Practices Analyzer Tutorial - Sample Example describes the Microsoft Best Practices Analyzer application. I tried it myself; it's not much. It touches mainly on the maintenance and security issues that I don't really concern myself with.
Top 10 Best Practices for Building a Large Scale Relational Data Warehouse. I don't think I will need it soon, but it is a short and interesting read.
SQL Server Pre-Code Review Tips. This Pinal Dave guy is pretty cool. He seems like a good resource for SQL related issues.
CMS Database Administration SQL Server Standards, a set of SQL coding standards for a medical government agency.

I am sure there are a lot of interesting resources on the net. I will update this post with new information once I get to it.


My computer was never the sanest of them all. Its most peculiar problem was that, after running for a while, you could not reset it. You had to either lower the CPU frequency or shut it down, wait for it to cool down, then start it up. And funnily enough, it did that only when it had 512MB of RAM; with an additional 512MB chip it did not present this abnormality.

About two weeks ago, it started to show "Write delayed failed" errors during the night, when the only utility running was some file sharing app. I started researching and found that the error itself wasn't indicative of anything! A lot of people had the same problem, but each and every one of them for a different reason. It was like an "I am on strike" sign from my computer.

Well, I thought updating the drivers for my SATA hard drive might solve the problem. In the same swoop I also updated to the latest NVidia display drivers. Nothing changed. I then looked for an application that would tell me the temperature of my CPU. I found SpeedFan, which is completely freeware and which I highly recommend, an application that indicated a temperature of about 63 degrees Celsius.

I thought that was too much, so I decided to clean up my CPU cooler. Between the cooler fan and the metal radiator there was a 2 mm thick layer of dust. No wonder it was so hot. I cleaned everything up and the temperature dropped to about 51C. Glad that I had solved everything, I started playing a game. Nothing fancy, not something 3D or anything complicated. The computer suddenly reset itself.

Now that was a complete revolt! Frustrated, I started a movie and went to the other room to watch it on my TV. You see, I use this wireless AV connector to transfer the TV-out signal from my display adapter to the TV in the other room. I could hear everything, but the screen was blank. What the...?

After hours of trial and error I reached the conclusion that the newest drivers just couldn't do video overlay on the TV. The fact that I chose 1280x1024 as my monitor resolution and it showed up as 640x480 was also caused by the drivers. Funnily enough, when I removed the "clone to TV" option, it worked OK. The solution was to download and install an older NVidia driver, version 91.47. When everything was said and done and I was halfway into watching the movie, the computer reset itself!

I am now at the stage where I have found an overclocking setting in the NVidia drivers and, when fiddling with it, I get my computer to reset. It is funny that it does that even if I set a lower clock setting, which was my original intention anyway. I believe that the new drivers (and the not-so-new drivers) are making my NVidia FX 5200 card go too fast. It overheats and causes my computer to stop. I can only hope that buying a cooler fan for the video card (it has only a small metal radiator) will solve the problem.

Meanwhile, I have set the Hardware acceleration setting in the 3D Settings menu of the NVidia control panel to Single display performance mode instead of Multiple display performance mode as it was set, since I have only one monitor and a TV clone. I could play that game without a reset, although that is not really indicative of anything yet.

Please, computer, I am sorry! I will never do it again! Please forgive me! Don't act insanely anymore! :(

Update: in the end it was neither the video card nor the processor. It was the SATA RAID drivers!! Updating them was NOT a good idea. I went to the specific site of the motherboard I have, installed some drivers from 2003, and it now works without any glitch.

But how did I realize that was the source? Because in a rare stroke of luck I watched the computer display a blue screen and then reset itself. In Windows XP, if you go to My Computer - Properties - Advanced - Startup and Recovery you find a System Failure groupbox with a bunch of options. Unchecking the "Automatically restart" box will make the computer display the BSOD and NOT reset, giving you the opportunity to read the error message. In my case it was viamraid.sys giving the error 0x000000D1: DRIVER_IRQL_NOT_LESS_OR_EQUAL.

Update 2: The problem with Write Delayed Failed was not from the SATA drivers, but from the USB external hard drive. After trying everything I knew and found on Google, I was really pissed off that this still happened, until I ran into an obscure forum where a guy said his problems all went away after changing the physical USB port!! So I moved the actual USB jack and the problem was solved!

Some other explanations I found were about the protocol for sending data packets: first a small chunk of data is sent, then, if everything went OK, the next data chunk is twice as big, so 256KB, 512, 1024, 2048 and so on. Windows has some problems with sending packets bigger than 1024KB! There is a utility on the net that patches a Windows system file to address the issue. I might have run it, I don't even remember its name :) I pretty much threw everything at my computer and ran it for a few weeks until I was satisfied that it worked. But do try the USB port change first...

How can I get some content from javascript (like a string) and send it to the client as a file? I don't have access to the content I want to send from the server.

This was a question posed to me in a rather more complex way: how do I take some file content from a web service and send it to the client as a file, without downloading the content to the web server first?

The simple answer right now: you cannot do it. If you guys know more about this, please let me know. I've exhausted all the avenues I could think of, but then again, I am no master of Javascript and HTML responses.

Here is what I have tried. Basically, I have a string containing an HTML table and I want it sent to the client browser as an Excel download. So I opened a new window with javascript and tried to write the content there:
var win2=window.open('');
win2.document.write(s);


It worked and it displayed a table. Now, all I wanted to do was add/change the content-type HTTP header to application/vnd.ms-excel. Apparently, you can't do that from Javascript. OK, how about getting the necessary headers from the ASP.NET server? (Remember, the restriction was only that the file content should not come from the web server.) So I created a new page that would render a completely empty page with the right headers:

protected void Page_Load(object sender, EventArgs e)
{
    Response.Clear();
    Response.Buffer = true;
    Response.ContentType = "application/vnd.ms-excel";
    Response.AppendHeader("Content-Disposition", "attachment;filename=test.xls");
    Response.Charset = "";

    Response.Flush();
    Response.End();
}


Then I just opened it in a new window (just ignore the browser pop-up filters for now) with
var win2=window.open('PushFile.aspx');
win2.document.write(s);


What happened was that the page was just rendered like a normal page. How come? I changed the code so that it would write the content after a few seconds, and I got this: first the browser asks me if I want to permit downloading the file, then, after a few seconds, the warning goes away and the string is displayed in the new window. I tried with document.createTextNode as well; it didn't work.

So far, none of my attempts to serve javascript content as a binary file worked. If you know of a way to achieve this, please let me know. Thanks!

Update:
Meaflux took a swipe at this request and came up with two delicious ideas that, unfortunately, don't really work. But I had no idea things like these existed, so it is very much worth mentioning.

First: the data URI. Unfortunately it is only supported by Firefox and the like, and has no way of setting a content-disposition header or some other way of telling the browser that I actually want the content saved. It would work for an Excel file, but an image, for example, would be opened in a browser window.

Second: the IE execCommand javascript function, which has a little command called SaveAs. Unfortunately this only works for actual HTML pages. Even if the browser opened a binary file, I doubt that a SaveAs command would save it correctly.

Besides, both these options, as well as my own attempts above, have a major flaw: there is no way to send chunks of data as you are receiving them from the web service. What is needed is a way to declare some sort of data stream, write stuff into it, and then programmatically declare it closed.

I've spent about half an hour trying to determine why the DataFormatString format would not be applied to cell values in a GridView. The short answer: set the HtmlEncode property of the BoundField to false!

Now for the long answer:

A while ago I wrote a small article about how to format the data in your autogenerated GridView columns. At the end of the post I added a small update that explained why I set the HtmlEncode to false. It was, in my opinion, a GridView bug.

However, I didn't realise at the time that the same thing applies to normal GridView BoundFields as well. The thing is, in order to display a value in a bound cell, it FIRST applies the HTML encoding to the value CAST TO STRING and THEN applies the FORMATTING. Here is the reflected source:


/// <summary>Formats the specified field value for a cell in the <see cref="T:System.Web.UI.WebControls.BoundField"></see> object.</summary>
/// <returns>The field value converted to the format specified by <see cref="P:System.Web.UI.WebControls.BoundField.DataFormatString"></see>.</returns>
/// <param name="dataValue">The field value to format.</param>
/// <param name="encode">true to encode the value; otherwise, false.</param>
protected virtual string FormatDataValue(object dataValue, bool encode)
{
    string text1 = string.Empty;
    if (!DataBinder.IsNull(dataValue))
    {
        string text2 = dataValue.ToString();
        string text3 = this.DataFormatString;
        int num1 = text2.Length;
        if ((num1 > 0) && encode)
        {
            text2 = HttpUtility.HtmlEncode(text2);
        }
        if ((num1 == 0) && this.ConvertEmptyStringToNull)
        {
            return this.NullDisplayText;
        }
        if (text3.Length == 0)
        {
            return text2;
        }
        if (encode)
        {
            return string.Format(CultureInfo.CurrentCulture, text3,
                new object[] { text2 });
        }
        return string.Format(CultureInfo.CurrentCulture, text3,
            new object[] { dataValue });
    }
    return this.NullDisplayText;
}



At least the method is virtual. As you can see, there is no way to format a DateTime, let's say, once it is in string format.

Therefore, if you ever want to format your data in a GridView by using DataFormatString, you should make sure HtmlEncode is set to false! Or at least create your own BoundField object that implements a better FormatDataValue method.


I know, you're thinking "Who made the great effort of coming up with this incredible gay name?", but keep reading a little more, because this anime series is rather interesting. I am refraining from calling it cool because of the name, because it is not over yet, and because it is partially mecha. Also, I think the direction it is going in is a bit off course. Now that the bad things are out of the way, let me tell you about the good ones.

Anyway, the whole thing revolves around Lelouch, the third prince in line for the throne of the Holy Empire of Britannia. It is set in an alternate universe where battles are fought with humanoid robots called Knightmares, and the above-mentioned empire considers the Britannians first class citizens while any other enslaved nation gets a number that designates its territory and its people. The Japanese are called Elevens after the occupation of Japan, and Japan itself is renamed Area 11.

I will let you read the plot on the Wikipedia page, and focus on the good bits: Lelouch is a very smart guy; he plays chess and defeats just about everybody. He uses his strategic skills to fight against the empire of Britannia as the faceless terrorist Zero - for reasons too complicated to explain here. He is still in high school (why must every Japanese story happen in high schools?!) and he has one more advantage: a geass. This is a magical ability that allows him to command any person he has eye contact with.

The first season had 25 episodes and was pretty cool. It involved strategy, drama, action, sci-fi and a tight script. The second season (R2) is more complex, but my opinion is that it has lost much of the power of the first season; it has reached episode 14 (released with English subs today). It is worth mentioning that the team that made Code Geass also worked on Planetes, a sci-fi anime based on Arthur C. Clarke's ideas, which I also liked a lot.

Some links:
Code Geass Wikipedia page

Oh, no! After such a glorious second volume, Baxter regressed for the third volume of the Destiny's Children series, Transcendent. What you get is basically a continuation of the first volume, but without the emotional content or the cool ideas of Coalescent. Same awkward family relationships that no one really cares about, same main character who is actually driven by the actions and thoughts of people around him, rather than his own, same single final moment that shapes the world without actually making the reader feel anything, same lengthy dialogue that brings important issues into discussion, but without drawing the reader in.

As Stalin said, one death is a tragedy, one million is a statistic. The same thing applies to humans 500,000 years in the future going back into the past to redeem the sins of humanity. No one cares! The Earth is pushed to the edge by global warming and the lead character is championing a great hydrate stabilisation engineering project. Who cares?!

Bottom line: the book was well written, but badly designed. It's like an engineer doing a great job building something that is fundamentally flawed. I struggled to finish the book just as I've struggled to finish Coalescent, which was far more interesting to begin with. The reason is simple: the reader cannot really empathise with any of the characters, except in disparate fragments of the storyline.

Now, this is a stupid post. And by stupid I mean the kind of thing that, if you hear about it, you think "Oh, it was obvious", but if it happens to you, you waste a lot of time trying to fix it, and then you think "Oh, it was obvious. If I blog about it I will sound stupid". Well, this kind of blog post is the most important, I think, so here it is, sounding stupid:

I had a div, absolutely positioned and of fixed width, with a long text inside that would just not wrap! That means the div would either expand its width to accommodate my text (Internet Explorer) or bleed the text out (Firefox). Was it white-space:nowrap? No! Was it that some of the table cells that contained elements had the "noWrap" attribute? No! What the hell was going on?!

Solution and explanation: wrapping occurs only on NORMAL SPACES, not on &NBSP;. The text inside my div didn't really have any spaces in it; it was one continuous string of words joined by the innocent &nbsp;.


I was looking for new manga to read on the OneManga site and I found Gantz. It has senseless violence, gratuitous sex scenes and great looking chicks in erotic positions. The concept is that a weird black ball takes people right before they die and brings them to a room from which they are sent to battle aliens in the streets of Tokyo. Normal people can't see them or the monsters, but they can be killed by them. And they often are.

[youtube:SJ5ICtGn6u8]

This is something only a Japanese could have thought of. People are dying, most people around don't care and they are all trying to show how superior they are compared to others. And then they find something to PROTECT and they cry all the time.

Bottom line: Monsters, Aliens, Vampires, Hot chicks, Sword fighting, Gore, Sex, Rape, Emotional torture... they are all in there. The script doesn't make much sense, though, and I think all the characters are emotionally stunted to the level of three year olds. That's how Naruto and InuYasha won so many fans, through carefully crafted emotional landscapes, something Gantz lacks almost completely.

Read Gantz at OneManga.
IMDb link for the anime

It was great! Not only was the setting nice (the four star Smart hotel is exactly what I had expected a hotel should be, except for the restaurant, maybe), but the weather was cool, the presentation helpful, the tutor (Aurelian Popa) above expectations and the people pleasant. Not to mention a week away from boring stuff. ;) I feel it would be pointless to detail what we did there, since it was either my own personal life or the actual workshop (which involves work), so I will give you some impressions of the technology and point you towards the resources that would allow you to go through the same learning process.

The whole thing was about WPF and Silverlight and I can tell you two conclusions right now:
WPF/XAML/Silverlight are a great technology and I expect a lot of .NET applications to migrate towards them in the next 6 to 12 months.
The complexity of this technology is likely to put a lot of people off, therefore tools like Expression Blend and the Visual Studio interface become completely indispensable and must evolve towards great ease of use and become more intuitive.

The entire presentation model allows one to use any graphical transformation available, including 3D, on any part of the interface. The controls now come without an appearance of their own; they have a default appearance that can be totally replaced with your own. A weird example is to use a 3D cube with video running on each side as a button. Of course, the whole thing is still a work in progress and some stuff is still difficult to do. Besides, you know Microsoft: a lot of complicated things are easy to do, while some of the simplest are next to impossible.

You can sense Microsoft's confidence in this by watching them release an entire design oriented suite (Expression) and work on making Silverlight available on all platforms and browsers. Just the fact that Silverlight can directly access the browser DOM is enough to make me remove all those patchy javascript scripts and replace them with nice Silverlight C# code.

Enough of this. Go learn for yourself!

Tools:
Silverlight is at version 2 beta 2. That is painfully obvious when new bugs are introduced and beta 1 applications break. The Expression Blend tool is at version 2.5 June 2008 CTP and it also has a long way to go towards becoming useful. Visual Studio 2008 performs rather well when faced with XAML and WPF stuff, but the Resharper 4.0 addon helps it out a lot. You need the Visual Studio 2008 Silverlight Tools, too. After this compulsory tool kit you could also look at Snoop, Blender and the Expression Deep Zoom Composer.

Learning material:
The simplest thing to do is to go to the Silverlight Hands-on Labs or the WPF Hands-on Labs, download them all, and run through the documentation script included with each one. There are video tutorials about how to use the tools, too. Here is one for Blend. Of course, all the blogs and materials available online a Google search away are helpful as well.

Community:
As with any community, it depends on your desired locality and interests. You can look for local .NET / WPF groups or browse blogs halfway around the globe from you. From my limited googling during the workshop I can see that there are people talking about their issues with WPF and Silverlight, but not nearly enough: the technology still needs to mature. I haven't really searched for it, but I've stumbled upon this site: WindowsClient.NET, which seems to centralize WPF, Windows Forms and a bit of Silverlight information.


A while ago I was recommending the anime called Inuyasha and then moved on to reading the manga. Well, after 558 episodes - each having around 19 manga (comic) pages - Inuyasha has reached the end. A bit anticlimactic, considering the things that attracted me to the story in the first place, but an end nevertheless.

You can read the entire story here at MangaStream

I wanted to write this great post about how to make Web User Controls that would have templates, just like Repeaters or GridViews or whatever, face any problems, then solve them. Unfortunately, MSDN already has a post like this: How to: Create Templated ASP.NET User Controls. So all I can do is tell you where to use this and what problems you might encounter.

I think the usage is pretty clear and useful: whenever you have as?x code that repeats itself, but with different content, you can use a templated Web User Control. The best example I can think of is a collapsible panel. You have a Panel, with some javascript attached to it, a hidden field to hold the collapse state, some buttons, images and text to act as a header, but every time the content is different.

Now for the issues one might encounter. In Visual Studio 2005 you get an error, while in VS 2008 you get a warning, telling you that the inner template, whatever name you gave it, is not supported. This is addressed by the
[ PersistenceMode(PersistenceMode.InnerProperty) ]
decoration of the ITemplate property of the control.
Then there is the issue of design mode, where you get an ugly error in all Visual Studio versions: Type 'System.Web.UI.UserControl' does not have a public property called '[yourTemplatePropertyName]'. As far as I know there is no way of getting rid of this; it is an issue within Visual Studio. However, the thing compiles and the source as?x code is warning free. I think one could easily sacrifice some design-time comfort for reusability.

I am not usually one to talk politics, especially since I don't really think there are essential differences between the people participating in this game. Yes, I do see it as a game, with rules that you need to follow to get the prize. But this year's mayoral election proved that some of the rules are more subtle than just spending the biggest amount of money on promotional ads and having the biggest party support you. Well, pretending to be "one of the people" seems to always help, though. :)

What happened? Internal struggles within the main opposition party led to them choosing as their candidate for city hall not the man with the most popular support (as resulting from opinion polls), but the man with the most connections in the party. Therefore the other guy decided to run independently. He spent almost nothing on campaign ads, while the leading party candidate spent about 600,000 euros just for the first round of the elections and God knows how much for the final round.

Conclusion? Sorin Oprescu, the independent candidate, has won the elections. The leading party candidate lost, with all his ad money, while no one even noticed the candidate from the opposition party. Apparently, a great victory for the people of Bucharest. I have no idea if the guy will be any good as a mayor, and I think that is the major flaw in Romanian elections, but the arrogant belief that party support and lots of money can just land you in a popular position has failed once more in Bucharest today.

But my theory is that Oprescu didn't win just through charisma or the total lack of charisma of his opponent, Blaga, but because of the ugly and cheap attacks against him and other candidates coming from the main party. With slogans like "Let's get rid of the garbage in sector 5, dump Vanghelie" and images of a bulldog and a snake with glasses (Blaga looks like a big ugly dog, while Oprescu wears glasses) they pushed people away. I guess the fact that the snake is a symbol of wisdom in many cultures passed them by completely.

Anyway, my conclusion is that arrogance is the worst thing a Romanian politician can display right now. They can be stupid, corrupt, pathetic, but NOT arrogant. It is traditional in Romania to dream of becoming powerful, rich, above all others, and it is even more traditional, since most people never do get rich or famous, to totally despise and hate the people who do or who behave like they do. Today was a lesson in humility for the political class.

Update: this fix is now on Github. Get the latest version from there.

The scenario is pretty straightforward: a ListBox or DropDownList or any control that renders as a Select HTML element with a few thousand entries or more causes an asynchronous UpdatePanel update to become incredibly slow on Internet Explorer and reasonably slow on Firefox, keeping the CPU at 100% during this time. Why is that?

Delving into the UpdatePanel's inner workings one can see that the actual update is done through an _updatePanel Javascript function. It contains three major parts: it runs all dispose scripts for the update panel, then it executes _destroyTree(element) and then sets element.innerHTML to whatever content it contains. Amazingly enough, the slow part comes from the _destroyTree function. It recursively takes all HTML elements in an UpdatePanel div and tries to dispose of them, their associated controls and their associated behaviours. I don't know why it takes so long with select elements; all I can tell you is that childNodes contains all the options of a select and thus the script tries to dispose of every one of them, but it is mostly an IE DOM issue.

What is the solution? Enter the ScriptManager.RegisterDispose method. It registers dispose Javascript scripts for any control during UpdatePanel refresh or delete. Remember the first part of _updatePanel? If you add a script that clears all the useless options of the select on dispose, you get an instantaneous update!

First attempt: I used select.options.length=0;. I realized that on Internet Explorer it took just as long to clear the options as it took to dispose of them in the _destroyTree function. The only way I could make it work instantly was with select.parentNode.removeChild(select). Of course, that means that the actual selection would be lost, so something more complicated was needed if I wanted to preserve the selection in the ListBox.

Second attempt: I would dynamically create another select, with the same id and name as the target select element, but then I would populate it only with the selected options from the target, then use replaceChild to make the switch. This worked fine, but I wanted something a little better, because I would have the same issue trying to dynamically create a select with a few thousand items.

Third attempt: I would dynamically create a hidden input with the same id and name as the target select, then I would set its value to the comma separated list of the values of the selected options in the target select element. That should have solved all problems, but somehow it didn't. When selecting 10000 items and updating the UpdatePanel, it took about 5 seconds to replace the select with the hidden field, but then it took minutes again to recreate the updatePanel!

Here is the piece of code that fixes most of the issues so far:

/// <summary>
/// Use it in Page_Load.
/// lbTest is a ListBox with 10000 items
/// updMain is the UpdatePanel in which it resides
/// </summary>
private void RegisterScript()
{
string script =
string.Format(@"
var select=document.getElementById('{0}');
if (select) {{
// first attempt
//select.parentNode.removeChild(select);


// second attempt
// var stub=document.createElement('select');
// stub.id=select.id;
// for (var i=0; i<select.options.length; i++)
// if (select.options[i].selected) {{
// var op=new Option(select.options[i].text,select.options[i].value);
// op.selected=true;
// stub.options[stub.options.length]=op;
// }}
// select.parentNode.replaceChild(stub,select);


// third attempt
var stub=document.createElement('input');
stub.type='hidden';
stub.id=select.id;
stub.name=select.name;
stub._behaviors=select._behaviors;
var val=new Array();
for (var i=0; i<select.options.length; i++)
if (select.options[i].selected) {{
val[val.length]=select.options[i].value;
}}
stub.value=val.join(',');
select.parentNode.replaceChild(stub,select);

}};"
,
lbTest.ClientID);
ScriptManager sm = ScriptManager.GetCurrent(this);
if (sm != null) sm.RegisterDispose(lbTest, script);
}



What made the whole thing still slow was the initialization of the page after the UpdatePanel updated. It goes all the way to the WebForms.js file embedded in System.Web.dll (NOT System.Web.Extensions.dll), so it is part of the .NET framework. What it does is take all the elements of the HTML form (for selects it takes all the selected options) and add them to the list of postbacked controls within the WebForm_InitCallback javascript function.

The code looks like this:

if (tagName == "select") {
    var selectCount = element.options.length;
    for (var j = 0; j < selectCount; j++) {
        var selectChild = element.options[j];
        if (selectChild.selected == true) {
            WebForm_InitCallbackAddField(element.name, element.value);
        }
    }
}
function WebForm_InitCallbackAddField(name, value) {
    var nameValue = new Object();
    nameValue.name = name;
    nameValue.value = value;
    __theFormPostCollection[__theFormPostCollection.length] = nameValue;
    __theFormPostData += name + "=" + WebForm_EncodeCallback(value) + "&";
}

That is funny enough, because __theFormPostCollection is only used to simulate a postback, by adding a hidden input for each of the collection's items to a xmlRequestFrame (just like my code above) in the WebForm_DoCallback function, which in turn is called only in the GetCallbackEventReference(string target, string argument, string clientCallback, string context, string clientErrorCallback, bool useAsync) method of the ClientScriptManager, which in turn is only used in rare scenarios involving the built-in javascript callback mechanism of GridViews, DetailsViews and TreeViews. And that is it!! The incredible delay in this javascript code comes from a useless piece of code! The whole WebForm_InitCallback function is useless most of the time! So I added this little bit of code to the RegisterScript method and it all went marvelously fast: 10 seconds for 10000 selected items.

string script = @"WebForm_InitCallback=function() {};";
ScriptManager.RegisterStartupScript(this, GetType(), "removeWebForm_InitCallback", script, true);



And that is it! Problem solved.

This is mostly a noob post, but I had to write it because I've had to work with a project written by a colleague of mine, and her method of maintaining values across postbacks was to use HiddenFields. I will explore that option and the ViewState option.

First of all, what are the advantages of using a hidden field? I can see only two:
1. it would work even if ViewState is disabled
2. its value is accessible through javascript
The disadvantages are:
1. it creates additional HTML markup
2. it can only store stuff in string format
3. its value is accessible through javascript

I would not use the hidden field option, mainly because it gives people the ability to see and change the value through simple javascript manipulation. It's a security risk, even if most of the time you don't really care about the security of some value maintained across postbacks. I would use it only when _I_ need to change that value through javascript.

For some code, I assume I want to store an integer value called MyValue. There will be a field called _myValue that will store the value during a cycle, but it is used mainly for caching (reading Request and ViewState is slow) and it is declared like this:

private int? _myValue;


Now, about the structure of such code. The simplest method is to actually create a hidden field (or many) in your page. You can then use the values directly. It is simple, but hardly maintainable:

Markup:

<asp:HiddenField ID="hfMyValue" runat="server" />

C# code:

public int MyValue
{
    get
    {
        if (_myValue == null)
        {
            if (String.IsNullOrEmpty(hfMyValue.Value)) MyValue = 10;
            else _myValue = Int32.Parse(hfMyValue.Value);
        }
        return _myValue.Value;
    }
    set
    {
        hfMyValue.Value = value.ToString();
        _myValue = value;
    }
}


I've wrapped the functionality of the hidden field in a property so I can easily use it through my code.

Another method of doing this is to use the RegisterHiddenField method of the ScriptManager like this:

C# code only:

public int MyValue
{
    get
    {
        if (_myValue == null)
        {
            if (Request["MyValue"] == null) MyValue = 10;
            else _myValue = Int32.Parse(Request["MyValue"]);
        }
        return _myValue.Value;
    }
    set
    {
        PreRender -= MyValue_Registration;
        PreRender += MyValue_Registration;
        _myValue = value;
    }
}

void MyValue_Registration(object sender, EventArgs e)
{
    if (_myValue.HasValue)
        ScriptManager.RegisterHiddenField(this, "MyValue", _myValue.Value.ToString());
}


As you can see, there is no need to change the markup. There is the ugly part of having to attach to the PreRender event in order to register the hidden field, because the ScriptManager doesn't offer any way of accessing the registered hidden field after you have registered it, or at least a way to un-register it. Registering it again doesn't change its value, either.

In both these cases the value is accessible through javascript:

<script>var myValue=parseInt(document.getElementById('<%=hfMyValue.ClientID%>').value);</script>
<script>var myValue=parseInt(document.getElementById('MyValue').value);</script>


But there is an easier way of storing values across postbacks, and that is by using the ViewState. In order to do that, your object only needs to be Serializable. It can be anything from a string to a complex DataSet. There is no way to access it through javascript, though. Here is the C# code for it:

public int MyValue
{
    get
    {
        if (_myValue == null)
        {
            if (ViewState["MyValue"] == null) MyValue = 10;
            else _myValue = (int)ViewState["MyValue"];
        }
        return _myValue.Value;
    }
    set
    {
        ViewState["MyValue"] = value;
        _myValue = value;
    }
}


Doesn't that look a lot simpler? And the beauty of it is, the ViewState can be sent inside the markup, as in the default behaviour, but it can just as easily be stored on the server, either by using a SessionPageStatePersister or in other ways.

Update: Also, a more complicated, but a lot more flexible way of doing things is described on the DimeBrain blog: Frictionless data persistence in ASP.NET WebForms.

Exultant, the second book in the Destiny's Children series, felt a lot better than Coalescent. Not without its own flaws, it made the entire experience better, but maybe that's just me.

The book describes a universe twenty thousand years into the future, when humankind has infested the galaxy, destroying all the sentient races it encountered with its immense war machine. Humanity is currently at war with a technologically superior enemy called the Xeelee, who are trapped at the core of the galaxy, pushed back by the sheer size of the human forces. The war has raged for 3,000 years and continues with no advancement of any kind, with the entire human philosophy focused on spewing more and more cannon fodder into a war that is neither to be won nor lost, just endured.

A rather bleak vision of the future, but fear not, there comes hope! Somehow, an eccentric aristocrat comes up with all the ideas and resources to create the ultimate weapon that will destroy the Xeelee! And the pages of the book describe how they go at it. This is where the book actually fails, because at such an immense scale of space and time, a solution of this simplicity is just not believable. You don't feel it in your GUT! But the book is well written, the style bringing back memories of Asimov, and the ideas in it are pretty interesting.

Stephen Baxter is again applying Universal Darwinism to his universe, bringing more and more species and types of lifeforms out of his magician's hat. The ending of the book is terribly naive, but without a bit of naivety, you cannot finish a great space saga in a single book.

Bottom line: if you like space fights, military stratagems, character development, time travel, large scale galactic intrigues and a lot of techno babble (and I know I do! :) ) you will love this book. I do think that some of the great ideas in the book would have mixed nicely with the late David Feintuch's writing. Anyway, on with the next book in the series: Transcendent.