Stellvia is an anime about teenage kids saving the world. It starts off as a kind of Harry Potter, only the main character is a girl, the academy is in space and there is no Voldemort or Slytherin in sight. All in all it was a fun series to watch, but so easygoing and adolescent oriented that I am sure it will not remain in my memory for long.

The plot is simple enough: Earth was devastated by a supernova blast wave, it recovered, then it set out on a mission to defend the Solar System from the second wave, slower but deadlier. The solution was to create a bunch of stellar academies, fill them with children trained by dedicated teachers, while the whole world stands united against the coming disaster. One can see from this plot alone that the focus is on neither realism nor human nature. However, since it does touch all the Japanese topics of choice, like pursuit of perfection, positive competition, love between school children, loyalty and "ganbatte"-ness, it was nice to watch and I easily enjoyed it.

Composed of 26 episodes, the series does leave room for more, like humanity exploring the stars. The aliens were never explained and the last episode does show a rebuilt Stellvia star academy with the trainees that saved the world as full students welcoming a new batch of recruits. However, it seems like a second season of Stellvia will never happen, due to creative differences.

What a sad day this is. I have been reading manga at OneManga on an almost daily basis for a few years now. I liked how you can easily find the manga you want to read, then go through it without tons of ads and crap distracting you. Today, I entered their site and this message appeared:
"There is an end to everything, to good things as well."

It pains me to announce that this is the last week of manga reading on One Manga (!!). Manga publishers have recently changed their stance on manga scanlations and made it clear that they no longer approve of it. We have decided to abide by their wishes, and remove all manga content (regardless of licensing status) from the site. The removal of content will happen gradually (so you can at least finish some of the outstanding reading you have), but we expect all content to be gone by early next week (RIP OM July 2010).

So what next? We're not really sure at this point, but we have some ideas we would like to try out. Until then, the One Manga forums will remain active and we encourage all of you to continue using them. OMF has developed into a great community and it would be a shame to see that disappear.

You can also show us some love in this moment of sadness by 'liking' our brand new Facebook page. It would be nice to see just how many of you came to enjoy our 'better than peanut butter and jelly' invention.


Regardless of whether you stay with us or not, on behalf of the One Manga team, I would like to thank you all for your unwavering support over the years. Through the ups and downs you have stuck with us, and that is what kept us going.

As a certain Porky was fond of saying... That's all folks!

Time for me to go lay down and let this all sink in.

- Zabi


Sure, there are a lot of free manga sites out there, but none of them had the soul of OneManga, a place where obvious passion, not financial greed, was fueling things. I will soon add a post with the newest place for free manga. I will also have to update all the manga links in the blog. Ugh! Nothing good lasts forever, it seems...

If there was any doubt about the style of writing and book structure after the first novel from Ian Cameron Esslemont in the Malazan universe, the second book, Return of the Crimson Guard, dispelled it. One can see barely a little more focus on action than on description compared to Steven Erikson, but, having read it, I feel like this is the tenth novel in the series, not the second in a parallel Malazan world.

First of all, it is a full length book, similar in size to the ones written by Erikson. Again we see an amassing of forces, set to converge towards the climactic end. There are the Avowed of the Crimson Guard with a full army of mercenaries in tow, there is Laseen, empress of the Malazans, there are Seguleh, man-beasts, D'ivers, Soletaken, mages of huge power, Claws, Talons, Seti, Wickans and the all pervading regular Malazan soldier, with focus on our favourite sort: the sapper :)

I have to say that the writing is so similar to Erikson's that it even acquired the same problems. There is a lack of finality to just about anything. One just knows that a lot of questions will remain... not unanswered, but simply ignored... and that the next books will bring more wonder, more magic, more characters, all dancing around this huge singleton of a main character which is the universe of the Malazan Empire. It's refreshing, it's great... it's annoying!! :)

Having said that, this was another great book, one of those writings that make me want to abandon programming and start writing, even if I know nothing about it, one of those books that make me want to abandon watching movies altogether, for their lack of detail and significance. Now my big dilemma is what I should read next...

I wanted to open a dialog in .NET asking for an image file and so I needed to construct a filter with all supported image types. Strangely enough, I didn't find it on Google, so I did it myself. Here is a piece of code that gets the filter string for you:

private string getImageFilter()
{
    StringBuilder sb = new StringBuilder();
    ImageCodecInfo[] codecs = ImageCodecInfo.GetImageEncoders();
    foreach (ImageCodecInfo info in codecs)
    {
        if (sb.Length > 0)
            sb.Append(";");
        sb.Append(info.FilenameExtension);
    }
    return sb.ToString();
}
As you can see, it enumerates the list of image encoders and builds the extension list. The filter, then, is obtained as

filter = string.Format(
    "Image Files({0})|{0}|All files (*.*)|*.*",
    getImageFilter());

Before using it, though, here is the (surprisingly disappointing) filter string: *.BMP;*.DIB;*.RLE;*.JPG;*.JPEG;*.JPE;*.JFIF;*.GIF;*.TIF;*.TIFF;*.PNG
Kind of short, isn't it?
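For completeness, here is how the resulting filter might be plugged into an open dialog. This is just a sketch: the WinForms context and the pictureBox control are my own illustration, not from the original code.

```csharp
// Assumes a WinForms form with System.Windows.Forms and System.Drawing referenced,
// and a hypothetical PictureBox named pictureBox on the form.
using (OpenFileDialog dialog = new OpenFileDialog())
{
    // Build the filter from the image encoder extension list
    dialog.Filter = string.Format(
        "Image Files({0})|{0}|All files (*.*)|*.*",
        getImageFilter());
    if (dialog.ShowDialog() == DialogResult.OK)
    {
        // Load the chosen image into the picture box
        pictureBox.Image = Image.FromFile(dialog.FileName);
    }
}
```

The "All files (*.*)" entry is optional, but it is a common courtesy for users with images in unusual formats.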


Google was born from an idea in 1996. It gained momentum and became a word in the English dictionary. To google means more than to search for something: it means to delegate the responsibility of the search, not simply to search, but to find the answers to your question.

It reminds me of that scifi joke about a universe populated by billions of races that decided to combine all their networks into one large information entity. Then they asked the question "Is there a God?" and the machine answered "Now there is" and melted the off switch with a bolt of lightning. Can one really trust the answers given by a machine?

I am not the paranoid type. This is not a blog post about the perils of machine domination or about Machiavellian manipulation by those who wield the company. Instead, it is an essay on the willingness of humans to delegate responsibility. "Surely Google is just a search engine, it is not intelligent and it could never take over the world", one might say. But that's exactly the problem. Millions of people in the world are willing to let this stupid thing find answers for them.

Why? Because it worked. The search engine has more information available than any human could possibly access, not to mention remember. It is a huge statistical machine that finds associations between words, concepts, the searcher's preferences, the relationships between people and any other data available, like who the searcher is. Any AI dabbler could tell you that this is the first step towards intelligence, but again, that is not the point. The algorithms employed are starting to fail. The information that has been gathered by Google is being eroded by "Search Engine Optimization" techniques, by time and by people's own internal algorithms, which have started to trust and care about only the first links in a search.

Already there are articles about the validity of the answers given by "Doctor Google", a nickname given to the search engine when it is used to find medical solutions. The same principle applies to almost everything. The basis of Google's search is that pages that are linked to by other sites and blogs are probably more important or interesting than those that are not. Of course, there is more to it than that, like when the page was last updated, black and white lists, and stuff like that, but basically, old information has better chances of getting into the first results. So does information on sites that are well done and organized. That raises the question: would a true specialist, who spends a large amount of effort and time researching their field of activity, have the skill set and be willing to spend the resources to have a professional web site? How about the people who are not specialists? How about people who are actively trying to take advantage of you?

You can easily check this by searching for a restaurant name. Chances are that the site of the restaurant is not even on the first page, which has been usurped by aggregators, review sites and others like that. If a technology has not changed its name but has gone through a large change, chances are that googling for its name will get you reading about it as it was before the change. Search for a book and you will get to Amazon, not a review or (God forbid) a download site. Search for "[anything] download" and you will get to huge ad-ridden sites that have a page for just about every search containing those words, but, surprise, no download.

Do not think that I am attempting to bash Google. Instead, I am trying to understand why such obvious things are not taken into consideration by the people doing the search. The same thing applies to other sites that have gained our confidence and so are now targets for more and more advanced cons. Confidence is a coin, after all, one that gets increasingly important as the distribution monopoly slips out of the hands of huge corporations and dissolves into a myriad of blogs and forum sites. This includes Wikipedia, IMDb, aggregators of all kinds, YouTube, Facebook, Twitter, blogs, etc. I know that we don't really have the time to do in-depth searches for everything, but do you remember the old saying "God is in the details"?

Has Google reached godhood? Is it one we faithfully turn to for our answers? The Church of Google seems to think so. There are articles being written now about Searching without searching, algorithms that would take into consideration who you are when you are searching in order to give you the relevant information. It is a great concept, but doesn't that mean we will trust in a machine's definition of our own identity?

I once needed to find some information about some Java functions. Google, either from statistical knowledge that .Net is cooler or because I have searched .Net related topics in the past, would swamp me with .Net results, which have pretty similar method names. Imagine you are trying to change your identity, exploring things that are beyond your scope of knowledge. Google would just try to stop you, just like family and friends, who give comfort, but also hold dearly to who you were rather than who you might be or want to become. And it is a global entity, there for you no matter where you are. You can't just move out!

To sum up, I have (quite recently) discovered that even for trivial searches, paying attention to all the links on the first page AND the second is imperative if I want to get the result I want, not just the one suggested. Seeing a Wikipedia link in the found items doesn't mean I should go there and not look at the others. IMDb is great at storing information about movies, but I can't trust the rating (or the first review on the page). YouTube is phenomenal at hosting video, but if I want something that is fresh and not lawyer approved I need to go to other sites as well. When having a problem and asking a friend, I appreciate their answer, but seek at least a second opinion.

To simply quote and link: "Unfortunately, this is where the SharedSizeGroup method breaks down. If you want to have a shared Grid that uses the whole available width and automatically adjusts when that space changes you're going to need a different method. A column set to * in a shared group acts just like an Auto column and won't fill or stay within the given space." Taken from John Bowen's blog.

I am not much of an art guy, but this thing just blew me away. Not so much the animation itself (it is very original, but... not an art guy) as the volume of effort and work this had to require. Just watch it, it is worth it.


BIG BANG BIG BOOM - the new wall-painted animation by BLU from blu on Vimeo.

Forced to wait for the tenth and final novel of the Malazan Book of the Fallen series, due to be published this year, I've started to read the books placed in the same universe written by Steven Erikson's friend, Ian Cameron Esslemont. The first of these books is Night of Knives, which is rather short compared with Erikson's novels or, indeed, with the second Esslemont book, Return of the Crimson Guard, which I am reading now.

The book is fast-paced, as it spans a single night on the island of Malaz, during a rare event which weakens the borders between realms. Anything can happen during this night and, indeed, does happen. The island is assaulted by alien ice magic water dwellers, the Deadhouse is under siege and Kellanved and Dancer are making their move towards the throne of the Shadow realm. Meanwhile Surly is Clawing her way onto the throne, a naturally talented girl with too much attitude is trying to get a job and start an adventure, and an old retired soldier gives his all once again.

All in all, it was a nice book. The writing style is clearly different from Erikson's, with fewer descriptive passages, a little more action and a more positive bias, tending to lend people more good qualities and having them end up a little better. However, it only takes a few pages to get into the Malazan feel of things and enjoy the book.

Ok, so I am doing a lot of Access work for a friend. And I ran into a problem that I was sure had a simple solution somewhere. Apparently it does not. The documentation for the issue is either nonexistent or buggy, and the "helpful" comments out there usually tell you that you are wrong, without offering a workable solution, or direct you to some commercial product. So this is for the people trying to solve the following problem: you have images embedded in an OLE Object field in a table, the images are jpg or whatever format and appear in the table editor as Package, and you want to display those images in an Image control in an Access form via VB, without binding anything. Also, I am using Access 2007.

The first thing you are going to find when googling is that putting images in the database is a bad idea. Whenever you see this, close the page. People will give you their solution, which is store the URL of the image in the database. We don't want that, for various reasons.

After googling some more, you will find there is no solution involving the Image control, but rather only Bound or Unbound Ole Object Frames. We don't want that either.

The only solution left, since the Image control does not support direct content, but only a path to an image, is to read the binary data from the field, store it in a temporary file, then display it. When looking for this you will get to a Microsoft knowledge base article, which does most of the work, but is buggy! You see, the FileData variable they use in the WriteBLOB function is defined as a string, and it should be defined as a byte array.

Also, you want to retrieve the data from the record as binary data, so you use CurrentDb.OpenRecordset("MyQuery"), only to get a stupid error like "Run-time error '3061': Too few parameters. Expected 1.". This is because the query has a form parameter and it just fails. There are some solutions for this, but what I basically did was read the ID of the record into a variable using a normal DLookup, then write a new SQL query inline: CurrentDb.OpenRecordset("SELECT Picture FROM MyTable WHERE ID=" & id).

When you finally manage to save the binary data in a file, you notice that the file is not what you wanted; instead it is a little bigger and starts with some mumbo jumbo containing the word Package again. That means that, in order to get the file we want, we need to decode the OLE package format.

And this is where I come in, with the following code:

' Declarations that should go at the beginning of your code file
' ==========================
Const BlockSize = 32768
Const UNIQUE_NAME = &H0

Private Declare Function GetTempPathA Lib "kernel32" _
    (ByVal nBufferLength As Long, _
     ByVal lpBuffer As String) As Long

Private Declare Function GetTempFileNameA Lib "kernel32" _
    (ByVal lpszPath As String, ByVal lpPrefixString As String, _
     ByVal wUnique As Long, ByVal lpTempFileName As String) _
    As Long
' ==========================

' Get a temporary file name
Public Function GetTempFileName() As String

    Dim sTmp As String
    Dim sTmp2 As String

    sTmp2 = GetTempPath
    sTmp = Space(Len(sTmp2) + 256)
    Call GetTempFileNameA(sTmp2, "", UNIQUE_NAME, sTmp)
    GetTempFileName = Left$(sTmp, InStr(sTmp, Chr$(0)) - 1)

End Function

' Get a temporary file path in the temporary files folder
Private Function GetTempPath() As String

    Dim sTmp As String
    Dim i As Integer

    i = GetTempPathA(0, "")
    sTmp = Space(i)

    Call GetTempPathA(i, sTmp)
    GetTempPath = AddBackslash(Left$(sTmp, i - 1))

End Function

' Add a trailing backslash if not already there
Private Function AddBackslash(s As String) As String

    If Len(s) > 0 Then
        If Right$(s, 1) <> "\" Then
            AddBackslash = s + "\"
        Else
            AddBackslash = s
        End If
    Else
        AddBackslash = "\"
    End If

End Function

' Write binary data from a recordset into a temporary file and return the file name
Function WriteBLOBToFile(T As DAO.Recordset, sField As String)

    Dim NumBlocks As Integer, DestFile As Integer, i As Integer
    Dim FileLength As Long, LeftOver As Long
    Dim FileData() As Byte
    Dim RetVal As Variant

    On Error GoTo Err_WriteBLOB

    ' Get the size of the field.
    FileLength = T(sField).FieldSize()
    If FileLength = 0 Then
        WriteBLOBToFile = Null
        Exit Function
    End If

    ' Read the Package format
    Dim pos As Long
    pos = 70 ' Skip the 64 byte header, the 4 byte length and the 2 byte version
    Do ' read a string that ends in a 0 byte
        FileData = T(sField).GetChunk(pos, 1)
        pos = pos + 1
    Loop Until FileData(0) = 0
    Do ' read a second string that ends in a 0 byte
        FileData = T(sField).GetChunk(pos, 1)
        pos = pos + 1
    Loop Until FileData(0) = 0
    pos = pos + 8 ' ignore 8 bytes
    Do ' read a third string that ends in a 0 byte
        FileData = T(sField).GetChunk(pos, 1)
        pos = pos + 1
    Loop Until FileData(0) = 0
    ' Get the original file size (a little-endian 4 byte integer)
    FileData = T(sField).GetChunk(pos, 4)
    FileLength = CLng(FileData(3)) * 256 * 256 * 256 + _
                 CLng(FileData(2)) * 256 * 256 + _
                 CLng(FileData(1)) * 256 + CLng(FileData(0))
    ' Read the original file data from the current position
    pos = pos + 4

    ' Calculate number of blocks to write and leftover bytes.
    NumBlocks = FileLength \ BlockSize
    LeftOver = FileLength Mod BlockSize

    ' Get a temporary file name
    Dim Destination As String
    Destination = GetTempFileName()

    ' Remove any existing destination file.
    DestFile = FreeFile
    Open Destination For Output As DestFile
    Close DestFile

    ' Open the destination file.
    Open Destination For Binary As DestFile

    ' SysCmd is used to manipulate the status bar meter.
    RetVal = SysCmd(acSysCmdInitMeter, "Writing BLOB", FileLength / 1000)

    ' Write the leftover data to the output file.
    FileData = T(sField).GetChunk(pos, LeftOver)
    Put DestFile, , FileData

    ' Update the status bar meter.
    RetVal = SysCmd(acSysCmdUpdateMeter, LeftOver / 1000)

    ' Write the remaining blocks of data to the output file.
    For i = 1 To NumBlocks
        ' Read a chunk and write it to the output file.
        FileData = T(sField).GetChunk(pos + (i - 1) * BlockSize _
            + LeftOver, BlockSize)
        Put DestFile, , FileData

        RetVal = SysCmd(acSysCmdUpdateMeter, _
            ((i - 1) * BlockSize + LeftOver) / 1000)
    Next i

    ' Terminate the function
    RetVal = SysCmd(acSysCmdRemoveMeter)
    Close DestFile
    WriteBLOBToFile = Destination
    Exit Function

Err_WriteBLOB:
    WriteBLOBToFile = Null
    Exit Function

End Function


The function is used like this:

Dim id As String
id = DLookup("ID", "MyTableQueryWithFormCriteria", "")
Dim rs As DAO.Recordset
Set rs = CurrentDb.OpenRecordset("SELECT Picture FROM MyTable WHERE ID=" & id)
Dim filename As String
filename = Nz(WriteBLOBToFile(rs, "Picture"), "")
imgMyImage.Picture = filename


So, MyTable is a fictional table that contains an ID field and a Picture field of type OLE Object. MyTableQueryWithFormCriteria is a query used inside the form to get the data for the current form. It contains the MyTable table and selects at least the ID field. The WriteBLOBToFile function creates a temporary file, writes into it the binary data from the OLE Object field and returns the file's name, so that we can feed it to the Image control.

The trick in the WriteBLOBToFile function is that, at least in my case with Access 2007, the binary data in the field is stored in a "Package". After looking at it I have determined that its format is like this:
  1. A 0x40 (64) byte header
  2. A 4 byte length
  3. A 2 byte (version?) field
  4. A string (characters ended with a 0 byte)
  5. Another string
  6. 8 bytes that I cared not to decode
  7. Another string
  8. The size of the packaged file (the original) in a 4 byte UInt32
  9. The data in the original file
  10. Some other rubbish that I ignored

The function thus goes to 64+6=70, reads 2 strings, moves 8 bytes, reads another string, then reads the length of the data and saves that much from the current position.

The examples in the pages I found said nothing about this, except that you need an OLE server for a specific format in order to read the field, etc., but all of them suggested saving the binary data as if it were the original file. Maybe in some cases this happens, or maybe it is related to the version of MS Access.

I have been trying to build this setup for a project I made, using WiX, the new Microsoft paradigm for setup packages. So I did what any programmer would do: copy paste from a previously working setup! :) However, there was a small change I needed to implement, as it was a .NET4.0 project. I built the setup, compiled it, ran the MSI and kaboom!

Here is a piece of the log file:
Action 15:34:48: FetchDatabasesAction. 
Action start 15:34:48: FetchDatabasesAction.
MSI (c) (A0:14) [15:34:48:172]: Invoking remote custom action. DLL: C:\DOCUME~1\siderite\LOCALS~1\Temp\MSI21CF.tmp, Entrypoint: FetchDatabases
MSI (c) (A0:68) [15:34:48:204]: Cloaking enabled.
MSI (c) (A0:68) [15:34:48:219]: Attempting to enable all disabled privileges before calling Install on Server
MSI (c) (A0:68) [15:34:48:251]: Connected to service for CA interface.
Action ended 15:34:48: FetchDatabasesAction. Return value 3.
DEBUG: Error 2896: Executing action FetchDatabasesAction failed.
The installer has encountered an unexpected error installing this package. This may indicate a problem with this package. The error code is 2896. The arguments are: FetchDatabasesAction, ,
Action ended 15:34:48: WelcomeDlg. Return value 3.
MSI (c) (A0:3C) [15:34:48:516]: Doing action: FatalError

In order to get the log of an MSI installation use this syntax:
msiexec /i YourSetup.msi /l*vvv! msiexec.log
vvv specifies verbosity, while the ! sign specifies that the log should be flushed after each line.

As you can notice, the error is a numeric one (2896) and nothing else. Googling it, you find a lot of people having security issues with it on Vista and Windows 7, but I have Windows XP on my computer. The error message description pretty much says what the log does: custom action failed. Adding message boxes and System.Diagnostics.Debugger.Launch(); didn't have any effect at all. It seemed the custom action was not even executed!

After hours of despair, I found what the problem was: a custom action is specified in a DLL, which has a config file containing this:

<startup>
  <supportedRuntime version="v2.0.50727"/>
</startup>
which specifies for the MSI installer which version of the .NET framework to use for the custom action. Not specifying it leads to a kind of version autodetect, which takes into account the version of the msiexec tool rather than the custom action dll. It is highly recommended to not omit it. The problem I had was that I had changed the target of the custom action to .NET 4.0 and had also changed the config file to:

<startup>
  <!--<supportedRuntime version="v2.0.50727"/>-->
  <supportedRuntime version="v4.0.30319.1"/>
</startup>


Changing the version to NET3.5 and adding the original config string fixed it. However, I am still unsure on what are the steps to take in order to make the 4.0 Custom Action work. I have tried both 4.0.30319 and 4.0.30319.1 versions (the framework version folder name and the version of the mscorlib.dll file in the .NET 4.0 framework). I have tried v4.0 only and even removed the version altogether, to no avail.

In the end, I opened the WiX3.5 sources and looked for a config file. I found one that had this:

<startup useLegacyV2RuntimeActivationPolicy="true">
  <supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.0"/>
  <supportedRuntime version="v2.0.50727" />
</startup>
As you can see, there is an extended supportedRuntime syntax in the 4.0 case, but that is not really relevant. The thing that makes it all work is useLegacyV2RuntimeActivationPolicy="true"!

So shame on whoever wrote msiexec for not specifying the actual problems that make a setup fail, and a curse on whoever decided to display a numeric code for an error, rather than trying to write as verbose a description as possible. I hope people will find this post when facing the same problem and not waste three hours or more on a simple and idiotic problem.

If you have a T4 Template .tt file that throws a weird "Compiling transformation: Invalid token 'this' in class, struct, or interface member" error that seems to come out of nowhere, try to delete extraneous spaces.

In my case, I had copied and pasted the .tt content from a web page and was trying to understand why it wouldn't work. I right clicked on the source, went to Advanced, chose Convert all spaces to tabs, then back to Convert all tabs to spaces. Then it worked. I guess some white spaces were not really spaces, or there was some other formatting issue.

If you don't have the options when you right click, it might be that they are features of the Tangible T4 Editor.

The word that I think best describes the book is "naive". That's not necessarily a bad thing; people have been hooked by naive stories since forever. Isaac Asimov had some very simplistic plots where everything was going well for the main character. The Harry Potter series was also what I could call naive; it didn't hurt it much. From the same perspective I can say that the Vorkosigan Saga, which now spans about twenty novels and short stories, had its share of success (and three Hugo awards) no matter the writing style. That style should have evolved anyway as each book was written.

Back to The Warrior's Apprentice, though. It's about a kid, son of royalty on his backward home planet, who singlehandedly buys a spaceship, runs a blockade, creates a mercenary force, fools everybody into believing he is older, is smarter than anyone and also foils a ploy to destroy his father. And the drama is, as teenagers go, that he doesn't get the girl. Now see why I call it naive?

However, I am sure I would have gobbled the whole series up when I was fifteen, so, even if I have decided to not read the other books in the series, it depends on what your tastes are. The Warrior's Apprentice is an easy to read, easy to follow, shortish book. As a travel book I guess it would be decent.

Update: Thanks to Tim Fischer from Tangible, I got to solve all the problems described in the post below using VolatileAssembly and macros like $(SolutionDir) or $(ProjectDir).

When T4 (Text Template Transformation Toolkit) appeared as a third party toolkit that you could install on Visual Studio 2008, I thought to myself that it is a cool concept, but I didn't get to actually use it. Now it is included in Visual Studio 2010 and I had the opportunity to use it in a project.

The idea is to automatically create code classes and other files directly in Visual Studio, integrated so that the files are generated when saving the template. All in all a wonderful idea... but it doesn't work. Well, I may be exaggerating a bit, but my initial experience has been off-putting. I did manage to solve all the problems, though, and that is what this blog post is about.

First of all, there is the issue of intellisense. I am using ReSharper with my Visual Studio, so the expectations for the computer knowing what I am doing are pretty high. In .tt files (the default extension for T4) you don't have any. The solution for this is to use the Tangible T4 editor (I think they were going for a fifth T here) that comes as a Visual Studio addon for VS2008 and VS2010. Fortunately, there is a free version. Unfortunately, it doesn't do intellisense on your own libraries unless you buy the paid one. Also, the intellisense is years behind the one provided by ReSharper or even the default Visual Studio one, and the actions one can perform automatically on code in a T4 template are pretty limited.

The second problem was when trying to link to an assembly using a relative path to the .tt file. The Assembly directive supports either the name of an assembly loaded in the GAC or a rooted path. Fortunately, the VS2010 version of the T4 engine supports macros like $(SolutionDir). I don't know if it supports all Visual Studio build macros in the link, but the path ones are certainly there.

Here is how you use it. Instead of


<#@ Assembly Name="Siderite.Contract.dll" #>

use


<#@ Assembly Name="$(SolutionDir)/Assemblies/Siderite.Contract.dll" #>



The third problem was that using an assembly directive locked the assembly used until you either reopened the solution or renamed the assembly file. That proved very limiting when using assemblies that needed compiling in the same solution.

This can be solved by installing the T4 Toolbox and using the VolatileAssembly directive. Actually, in the link above from Oleg Sych, you can also find advice about using the T4 Toolbox VolatileAssembly directive, in the Assembly Locking section.

Here is how you use it. Instead of


<#@ Assembly Name="$(SolutionDir)/Assemblies/Siderite.Contract.dll" #>

use


<#@ VolatileAssembly
processor="T4Toolbox.VolatileAssemblyProcessor"
name="$(SolutionDir)/Assemblies/Siderite.Contract.dll" #>

As you can see you need to specify the processor (the VolatileAssemblyProcessor would have been installed by the T4 Toolbox) and you can use macros to get to a relative rooted path.

So thanks to the efforts of Oleg and Tim here, we can actually use T4. It would have been terribly awkward to work with the solution in the obsolete section below. The Tangible guys have a learning T4 section on their site as well. I guess that using the resources there would have spared me a day and a half wasted on this.

The following is obsolete due to the solutions described above, but it is still an informative read and may provide solutions for similar problems.




Tips And Tricks:
Problem: the T4 generated file has some unexplained empty lines before the start of the text.
Solution: Remove any spaces from the ends of lines. Amazingly, some white space at the end of some lines was translated into empty lines in the generated file.

Problem: The generated code is not aligned properly.
Solution: Well, it should be obvious, but empty spaces before the T4 tags are translated into empty spaces in the generated file. In other words, stuff like <# ... should not be preceded by any indenting. It will make the template look a bit funny, but the generated output will look OK. If you dislike the way the indenting looks in the template, move the indent space inside the tag, where it will be treated as white space in the T4 code.
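
To illustrate (a hypothetical loop; items is an assumed variable of property names), the tags stay flush left while the indent lives inside them:

```
<#    foreach (var item in items) { #>
    public string <#= item #> { get; set; }
<#    } #>
```

The spaces after <# are just C# white space and produce nothing, while the indent before "public string" is literal template text, so it is the only indentation that reaches the generated file.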

I've finally finished the book WPF in Action with Visual Studio 2008 by Arlen Feldman and Maxx Daymon. Simply put, it was a great book. Most programming books focus too much on structure, resulting in very comprehensive information but giving one little in the way of actual work. WPF in Action describes features while using them in a few applications that are built almost entirely out of code printed in the book. I think this is the second book any beginner in WPF should read, the first being one of those boring comprehensive ones :)

The book goes from a brief history of Windows Forms and WPF to Hello World in part one, then describes layouts, styles, triggers, events and animations in the second part. The third part builds a wiki application using commands and binding, data templates, converters, triggers and validation, then custom controls and drawing (including 3D!). I am a big fan of the MVVM pattern, but I liked that in this book, while it got described, it didn't suffocate the other topics, getting only a small subchapter in the binding section. The fourth part explains navigation and XBAP, goes briefly through ClickOnce and Silverlight, then has a large chapter about printing (too large, I believe). The book finishes with transition effects, interoperability with Windows Forms and threading.

All in all I think it was a very nice read. The authors clearly have a lot of experience and are quite qualified to talk not only about the features in WPF, but also about the gotchas and some of the problematic implementations or even bugs. The fourth part of the book was a bit of a bore, though. After the pretty heavy 3D drawing at the end of part three, I got to read a whole lot about really boring stuff like printing. I am sure, though, that when the need arises this is the first book I will open to see what they did.

Bottom line: the first three parts are a must-read. Maybe skip the 3D drawing part at the end of part three. The fourth part is optional. The authors themselves said that they intended to write something that could be used as a reference, and I think they succeeded. So read the table of contents and see which of the WPF topics in those optional parts really interest you.

The WPF in Action with Visual Studio 2008 link goes to the publisher's site, where you can download the source code and even read some sample chapters.

and has 0 comments
The final chapter of the Fullmetal Alchemist story has been released today. Have the Elric brothers regained their bodies? Have they sacrificed everything in that Japanese way we so love? Did they get to yet another place filled with Nazis, turn vampire and become characters in an Uwe Boll movie? You will have to read the manga to find out! :) The good news is that by following the link above you can do just that!

As you may know, the anime finished abruptly a while ago, with the two brothers teleporting to our world in the middle of World War 2 and ending up in a ridiculous story. Luckily, the manga had none of that bullshit and continued on its merry way. Picking up from that, another anime was started, Fullmetal Alchemist: Brotherhood, which was supposed to erase from our memory the shame of the first anime's ending. It is now pretty close to ending itself.

My opinion of the whole story is that it was a pretty imaginative concept, a kind of alchemic steampunk universe, filled with wonder, horror and fun stories. I hope you read/watch it with just as much fun as I have.

Update 4th of July: The anime (Fullmetal Alchemist: Brotherhood) has also ended. It covered the exact same things as the manga this time.

I also forgot to mention that the story ends with a few loose ends: Al goes to explore the East and learn Alkahestry, accompanied by the two chimera men that want their original bodies back as well; Ed is going West, trying to learn as much as possible so that he can return and complete his brother's research and then help people together; Mrs Bradley is raising the last homunculus, Selim, as her son, trying to infuse him with love and make him a good person. These three threads can lead to a possible continuation of the Alchemist story. At least, I hope they do.