XMLPreprocess Getting Started

XMLPreprocess is a simple command-line utility that can be used to deploy config files to multiple environments without having to maintain multiple copies. It can also be used as a custom action in your MSI to transform the files as they are being deployed to different environments.

The goal of this tool is to minimize the maintenance of multiple configuration files for deployment, and I should say it meets that goal neatly. To understand it better: a single config file can be written and maintained as the source of truth, decorated with non-breaking comments that contain the instructions for the preprocessor.

Example:

<configuration>
  <system.web>
    <!-- ifdef ${production} -->
    <!-- <compilation defaultLanguage="c#" debug="false"/> -->
    <!-- else -->
    <compilation defaultLanguage="c#" debug="true"/>
    <!-- endif -->
  </system.web>
</configuration>
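To see what those directives do, here is a toy Python sketch of the ifdef/else/endif handling. This is purely my own illustration of the idea, not the tool's actual implementation:

```python
import re

def preprocess(lines, defined):
    """Toy model of ifdef/else/endif comment handling (not the real tool)."""
    out, keep, stack = [], True, []
    for raw in lines:
        line = raw.strip()
        m = re.match(r'<!--\s*ifdef\s*\$\{(\w+)\}\s*-->$', line)
        if m:
            stack.append(keep)                      # remember enclosing state
            keep = keep and (m.group(1) in defined)
        elif line == '<!-- else -->':
            keep = stack[-1] and not keep           # flip within this block
        elif line == '<!-- endif -->':
            keep = stack.pop()
        elif keep:
            # a kept line that is wrapped in a comment gets un-commented
            out.append(re.sub(r'^<!--\s?(.*?)\s?-->$', r'\1', line))
    return out

snippet = [
    '<!-- ifdef ${production} -->',
    '<!-- <compilation defaultLanguage="c#" debug="false"/> -->',
    '<!-- else -->',
    '<compilation defaultLanguage="c#" debug="true"/>',
    '<!-- endif -->',
]

print(preprocess(snippet, defined={'production'}))
```

With production defined, the commented-out branch survives (un-commented) and the debug line is dropped; with nothing defined, the else branch wins.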

Look carefully at the comments in the snippet above. You can easily see that we are turning off debug page compilation when in production.

That covers the introduction; now let's see it in action.

First things first: download XMLPreprocess from CodePlex, and extract the zip.

For this example, I will be using a spreadsheet (XML Spreadsheet 2003 format) as the storage for the environment-specific variables. However, you can also store them in a plain XML file.

I am using the sample files that came along with the download. I copied sample.config and SettingsSpreadsheet.xml into my XmlPreprocess bin folder just to keep the demo simpler; you can always provide full paths when processing.

My folder looks like:

[Screenshot: xmlprep1]

Open SettingsSpreadsheet.xml:

[Screenshot: xmlprep3]

The columns represent the different environments you are targeting, and the rows of column A hold the variable names you will use in the config file. During preprocessing, each variable is replaced with the value stored in the chosen environment's column.

For example, say I want to generate a config file for the test environment: I put the variable ServiceLocation in column A, and provide the corresponding value in the "Test" column. When preprocessed, "${ServiceLocation}" in the config file is replaced with that value from the settings file.
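The substitution itself is straightforward; here is a minimal Python sketch of the idea. The environment names and URLs below are made up for illustration only:

```python
import re

# hypothetical settings table: environment column -> variable values
settings = {
    "Test":       {"ServiceLocation": "http://testserver/Service.svc"},
    "Production": {"ServiceLocation": "http://prodserver/Service.svc"},
}

def expand(text, environment):
    # replace every ${Name} token with that environment's value
    return re.sub(r'\$\{(\w+)\}',
                  lambda m: settings[environment][m.group(1)], text)

print(expand('<endpoint address="${ServiceLocation}"/>', "Test"))
```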

Open the sample.config file:

[Screenshot: xmlprep2]

Look at the value "${ServiceLocation}" in the comments section. As mentioned above, you add this as a variable in the settings file: just add "ServiceLocation" (the dollar sign and braces are not required in the settings file), and provide different values for the different environments.

Now use this command to preprocess:

XmlPreprocess.exe /v /q /nologo /i sample.config /x SettingsSpreadsheet.xml /e Test /o Processed.config /d test

Once you run the above command, it generates another config file named Processed.config, in which the value "testserver" replaces localhost.

Now think of adding this to your WiX setup, or calling a .bat file with the preprocessing info as part of your build scripts. This reduces the work of maintaining different config files for different environments.
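As a sketch of that build-script idea, here is a small Python loop that emits one XmlPreprocess command line per environment. The environment names are hypothetical; adjust them to match your spreadsheet's columns:

```python
# assumed column names from SettingsSpreadsheet.xml
environments = ["Development", "Test", "Production"]

def build_command(env):
    # mirrors the command line used above; /e selects the spreadsheet column
    return ("XmlPreprocess.exe /q /nologo /i sample.config "
            f"/x SettingsSpreadsheet.xml /e {env} /o {env}.config")

for env in environments:
    print(build_command(env))
```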

This tool is really simple to understand, and when used efficiently gets rid of maintenance problems.

A few other resources that may help:

Loren’s Blog

Scott Hanselman’s Blog

Cheers!

LogParser 2.2 – First Look

Recently I was introduced to a tool called LogParser while working on a critical enterprise application. It lets you extract information from log files as easily as querying a SQL database. Anyone working on enterprise applications knows how important it is to log events and retain them for a period of time; logs contain information deemed crucial to a business. Logs help admins troubleshoot issues on a production box, and help developers when a debugger cannot be attached, typically when the app is running in a test environment. But sometimes it is a nightmare just to look through the logs, especially flat files, and trace out the information you really need. Log files are often very large, and finding meaningful information in them is nearly impossible. LogParser bridges this gap by providing SQL-like querying: it lets you treat log files as just another SQL table, whose rows can be queried and formatted as you choose.

LogParser helps you filter log entries matching specific criteria and sort the resulting entries by the values of specific fields. LogParser consists of three components: 1) an input processor, 2) a SQL query parser, and 3) an output processor. It can accept any common log format and output it in one of many formats. When you are done, you can combine all your separate logs into one common format for analysis.

OK, now let's get started. First, download LogParser 2.2 and install it on the machine where you want to process the log files. Then run LogParser from your Program Files folder to execute the samples provided here.

Look at this sample query: logparser.exe -i:EVT -o:NAT "SELECT TimeGenerated, EventID FROM System"

And here is the output:

[Screenshot: output]

Now for the first part of the query: -i:EVT is handled by the input processor, -o:NAT is handled by the output processor, and the rest is the SQL query handled by the SQL parser. In the above query you see fields like TimeGenerated and EventID; to find out exactly which fields are available, run the command with the help switch (-h).

Query example: LogParser -h -i:EVT

And here is the output:

[Screenshot: help]

-i:EVT: the parameter "EVT" is used to query the System event log.

-o:NAT: the parameter "NAT" is used to output readable, nicely formatted text to the console window.

In the SELECT query you can also use the WHERE, ORDER BY, and GROUP BY clauses to narrow down your output.
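For instance, a query along these lines (hypothetical, but using only standard LogParser SQL syntax) counts occurrences per event and lists the noisiest event IDs first:

```
logparser.exe -i:EVT -o:NAT "SELECT EventID, COUNT(*) AS Hits FROM System GROUP BY EventID ORDER BY Hits DESC"
```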

To output to a text file, use a SQL query with INTO. For example: logparser.exe -i:EVT -o:NAT "SELECT TimeGenerated, EventID INTO C:\out.txt FROM System". You can also write the output to a CSV file; just change .txt to .csv. To output to a datagrid, use -o:datagrid.

So much for the basics; now let us look at a scenario where we need to query a text log file and output to another format based on certain criteria. The obvious question people ask is: why use LogParser if the intention is just to extract information from one file and put it in another? To answer that, imagine your log file is 10 MB or more and sits on a server on the other side of the globe.

Look at this query which partially addresses this problem:

LogParser "SELECT INDEX, TEXT INTO C:\out.CSV FROM \\server\app\logs\runningTrace.log WHERE TEXT LIKE '%@@Start%' OR TEXT LIKE '%@@End%' OR TEXT LIKE '%TimeStamp%' ORDER BY INDEX ASC" -i:TEXTLINE -o:CSV

The log file being queried looks roughly like this:

Timestamp: 7/7/2009 11:12:30 AM
Trace Msg: 
MethodName : abc.HelloWorld
TraceMessage : ——————– Processing abc.HelloWorld() ——————–

***************
Some text logged here
***************
@@Start: CriticalMethod.Processing.Started at 11:12:30:5625000

*****************
Some text Logged here

*****************
@@End: CriticalMethod.Processing.Ended at 11:12:30:6875000

————————————————————————–

Now, the task was to calculate the time taken by the method under various scenarios, and this had to be observed for a week. Think of the size of the file generated over a week; going through and analyzing it by hand would have been a real pain. The above query produces a narrow result that can be analyzed at a glance, or fed to a small user-written application that reads the output.
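Once the narrowed output is in hand, the elapsed time is simple arithmetic. Here is a hedged Python sketch against the @@Start/@@End lines shown above, assuming the fraction after the last colon is in 100-nanosecond ticks, as the sample timestamps suggest:

```python
import re

LOG = """\
@@Start: CriticalMethod.Processing.Started at 11:12:30:5625000
@@End: CriticalMethod.Processing.Ended at 11:12:30:6875000
"""

def ticks(stamp):
    # "HH:MM:SS:fraction", fraction assumed to be 100-ns ticks
    h, m, s, frac = (int(p) for p in stamp.split(":"))
    return (h * 3600 + m * 60 + s) * 10_000_000 + frac

def elapsed_ms(log_text):
    start = re.search(r'@@Start: .+ at ([\d:]+)', log_text).group(1)
    end = re.search(r'@@End: .+ at ([\d:]+)', log_text).group(1)
    return (ticks(end) - ticks(start)) / 10_000  # 100-ns ticks -> milliseconds

print(elapsed_ms(LOG))  # 125.0 ms between @@Start and @@End
```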

The query used here is just a sample; the actual query we use is more complicated and more powerful (think of a complicated SQL query that yields a base result set). I was amazed to see this app address the problem of digging through logs, and I am posting this in the hope it helps someone. Do let me know how it has helped you, and share any interesting issues you come across.

Here is a book I found on Google; it should help.
And this one is very good too: Click Here
Cheers!