LogParser 2.2 – First Look

Recently I was introduced to a tool called LogParser while working on a critical enterprise application. It lets you extract information from log files as easily as querying a SQL database. Anyone working on enterprise applications knows how important it is to log events and retain those logs for a period of time – they contain information that is crucial to the business. Logs help the admin guys troubleshoot issues on a production box, and help developers when a debugger is not available – mostly when the app is running in a test environment. But sometimes it is a nightmare even to look at the logs, especially when they are flat files, and to trace out the information you really need. Log files are usually very large, and it is almost impossible to pick meaningful information out of them by hand. LogParser bridges this gap by providing a SQL-like querying ability: it lets you treat a log file as just another SQL table, whose rows can be queried and formatted however you choose.

LogParser helps you filter the log entries matching specific criteria and sort the resulting entries by the values of specific fields. LogParser consists of three components: 1) an input processor, 2) a SQL query parser, and 3) an output processor. It can accept any common log format and emit the results in one of many formats, so when you are done you can combine all your separate logs into one common format for analysis.

Ok, now let's get started. The first thing to do is download LogParser 2.2 and install it on the machine where you want to process the log files. Then run LogParser from your Program Files folder to try the samples given here.

Look at this sample query: logparser.exe -i:EVT -o:NAT "SELECT TimeGenerated, EventID FROM System"

And here is the output:


Now, in the first part of the query, -i:EVT is handled by the input processor, -o:NAT by the output processor, and the rest is the SQL query handled by the SQL parser. In the SQL query above you can see fields like TimeGenerated and EventID – to find out exactly which fields are available, run the command with the help switch (-h).

Query example: LogParser -h -i:EVT

And here is the output:


-i:EVT – the parameter “EVT” tells LogParser to query the Windows event log (here, the System log).

-o:NAT – the parameter “NAT” outputs readable, nicely formatted text to the console window.

In the SELECT query you can also use the WHERE, ORDER BY, and GROUP BY clauses to narrow down your results.
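If you want a feel for what these clauses do without firing up LogParser, here is a rough sketch in Python – not LogParser itself, just an illustration over made-up event records of what a WHERE filter plus ORDER BY amounts to:

```python
# Illustration only: mimic "SELECT TimeGenerated, EventID FROM System
# WHERE EventID = 6005 ORDER BY TimeGenerated ASC" over made-up records.
events = [
    {"TimeGenerated": "2009-07-07 11:14:02", "EventID": 6005},
    {"TimeGenerated": "2009-07-07 09:01:55", "EventID": 7036},
    {"TimeGenerated": "2009-07-07 08:00:10", "EventID": 6005},
]

# WHERE EventID = 6005
matching = [e for e in events if e["EventID"] == 6005]

# ORDER BY TimeGenerated ASC
matching.sort(key=lambda e: e["TimeGenerated"])

for e in matching:
    print(e["TimeGenerated"], e["EventID"])
```

The event IDs and timestamps above are invented for the example; the point is only that LogParser applies the same filter-then-sort logic to log rows for you.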

To output to a text file, add an INTO clause to the SQL query. For example: logparser.exe -i:EVT -o:NAT "SELECT TimeGenerated, EventID INTO C:\out.txt FROM System". You can also write the output to a CSV file – use -o:CSV and change the extension from .txt to .csv. To output to a datagrid, use -o:DATAGRID.
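Once the output lands in a CSV file, anything that speaks CSV can pick it up. Here is a small sketch using Python's csv module; the column names match the query above, and the sample content is made up to stand in for what LogParser would have written to C:\out.csv:

```python
import csv
import io

# Stand-in for the content of C:\out.csv produced by the query above.
sample = (
    "TimeGenerated,EventID\n"
    "2009-07-07 11:12:30,6005\n"
    "2009-07-07 11:14:02,7036\n"
)

# DictReader uses the header row as keys, mirroring the SELECT column list.
rows = list(csv.DictReader(io.StringIO(sample)))
for row in rows:
    print(row["TimeGenerated"], row["EventID"])
```

In practice you would open the real file (`open(r"C:\out.csv")`) instead of the in-memory sample.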

So that was a little of the basics. Now let us look at a scenario where we need to query a text log file and output it to another format based on certain criteria. The very first question people have is: why use LogParser at all, if the intention is just to extract information from one file and put it into another? To answer that, think of a situation where your log file is 10 MB or more and sits on a server located on the other side of the globe.

Look at this query which partially addresses this problem:

LogParser "SELECT INDEX, TEXT INTO C:\out.CSV FROM \\server\app\logs\runningTrace.log WHERE TEXT LIKE '%@@Start%' OR TEXT LIKE '%@@End%' OR TEXT LIKE '%TimeStamp%' ORDER BY INDEX ASC" -i:TEXTLINE -o:CSV

The log file being queried looks roughly like this:

Timestamp: 7/7/2009 11:12:30 AM
Trace Msg: 
MethodName : abc.HelloWorld
TraceMessage : ——————– Processing abc.HelloWorld() ——————–

Some text logged here
@@Start: CriticalMethod.Processing.Started at 11:12:30:5625000

Some text Logged here

@@End: CriticalMethod.Processing.Ended at 11:12:30:6875000


Now, the requirement was to calculate the time taken by the method under various scenarios, and this had to be observed for a week. Think of the size of the file that would generate over a week – going through it and analyzing it by hand would have been a real pain. The query above produces a narrow result set that can be analyzed just by looking at it, or fed to a small user-written application that reads the output.
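For the timing question itself, the narrowed output can be post-processed with a few lines of code. A sketch in Python, using the two lines from the sample log above – the timestamp format "HH:MM:SS:fffffff" (fraction in 100-nanosecond ticks) is what the sample shows, so adjust the parsing if your logger formats times differently:

```python
# Compute elapsed time between the @@Start and @@End lines from the sample log.
start_line = "@@Start: CriticalMethod.Processing.Started at 11:12:30:5625000"
end_line = "@@End: CriticalMethod.Processing.Ended at 11:12:30:6875000"

def to_seconds(stamp):
    """Parse 'HH:MM:SS:fffffff' (fraction in 100-ns ticks) into seconds."""
    h, m, s, ticks = stamp.split(":")
    return int(h) * 3600 + int(m) * 60 + int(s) + int(ticks) / 10_000_000

start = to_seconds(start_line.rsplit(" at ", 1)[1])
end = to_seconds(end_line.rsplit(" at ", 1)[1])
elapsed = end - start
print(f"CriticalMethod took {elapsed:.3f} s")  # prints "CriticalMethod took 0.125 s"
```

A real version would loop over the rows of the CSV that the query wrote, pairing each @@Start with its following @@End, but the arithmetic per pair is exactly this.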

The query used here is just a sample; the actual query we use is more complicated and more powerful (think of a complex SQL query that produces a base result set). I was amazed to see this app address the pain of digging through logs, and I am posting this in the hope it helps someone. Do let me know how it has helped you, and do share if you come across any interesting issues.

Here is a book I found on Google that should help, and another one that is also very good.

6 thoughts on “LogParser 2.2 – First Look”

  1. It is a really good article, and I know the importance of log files for big projects.

    LogParser is a really useful tool, because in my experience it is not easy to read log files and extract information from them.

    Nishant, if you have code, please upload it so I can understand LogParser more closely.

    1. Thanks Mitesh. This was just an intro, so I did not include complex code here. If there is anything specific you are looking for, let me know.

  2. Hi,

    I’ve used the following query for my project and it works fine. I wanted only the exception entries from a large log file, and I got them using this query:

    logparser "SELECT TOP * INTO c:\ABCD.TXT FROM
    WHERE TEXT LIKE '%Exception%'"

    1. Glad you found this article useful for querying your log files. I have edited the server details out of your comment so that they are not exposed to the outside world 🙂 Sorry about that.
