
Friday, July 9, 2010

VSTO, ClickOnce, MSBuild equals happiness

After spending several frustrating days working on VSTO and ClickOnce deployment, I thought I'd share the experience as guidance and instructions for newbies to follow (like I was a few days ago). There are some great articles out there, but also a lot of misinformation because of differences between VS2008 and VS2005.

The objective :- From an intranet hosting a VSTO package, enable a user to click on the URL of a document (http://mysebserver/VSTO/vstogluegood.docx) so that it opens on their PC and the required VSTO customization is installed and run. I wanted the VSTO ClickOnce package to be created using an MSBuild project file.

Target development environment and Office version
- VSTO developed using Visual Studio 2008
- Office 2007 (specifically targeting Word 2007)

Client side components required
Below is the list of client-side components required to run VSTO packages using ClickOnce.
- .NET Framework 3.5
- Microsoft Office 2007 System Update Redistributable PIA (o2007pia.msi)
- Microsoft Visual Studio Tools for the Microsoft Office system (vstor30.exe)

Client side component installs note
- I opted to pre-deploy the packages through Active Directory group policy. The o2007pia.msi is easy to place in the OU; however, vstor30.exe isn't because it is an .exe. The solution we found was to extract vstor30.exe using the below command :-

vstor30.exe /x:Extract

Using the extracted contents in <current directory>\Extract\, we simply placed the trin_trir.msi in the OU. I suspect this is less than ideal, but it worked for us.

Certificates
You need a certificate to deploy and run VSTO packages. There are 3 choices :-
a) Purchase a certificate (mandatory if you are deploying beyond your internal network)
b) Use the Temporary Visual Studio certificate created (downside - expires in 1 year)
c) Create a Certificate using makecert.

I'm using the 3rd choice as it provides you with a more robust certificate which can be deployed using Active Directory. Below are the steps :-

1. Open the Visual Studio Command Prompt
2. Type makecert -r -pe -n "CN=<yourCertificateName>" -b 01/01/2000 -e 01/01/2099 -ss My
3. The certificate now sits in your Personal certificate store, but for Visual Studio we need a .pfx certificate. Within Internet Explorer under Options->Content->Certificates find your certificate and export it as a .pfx (choose to export the private key). Keep this .pfx for signing the VSTO packages.
4. To create a .cer which isn't exportable (and can be pushed out onto client PCs), simply delete the certificate created in step 2, double-click on the .pfx file exported in step 3, install the certificate (mark it as not exportable) and then export it as a .cer.

You will need to install the non-exportable .cer on the users' PCs, while using the .pfx file in Visual Studio when you create your packages.

Certificate installation
The certificate created above needs to be installed in
a) Trusted Root Certificate Authorities
b) Trusted Publishers

Manual
- On each PC, double-click the .cer file created above as administrator and install it into the 2 locations (or script it as sketched below).
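
If you would rather script this than click through the certificate import wizard on every PC, a minimal VB.NET sketch (run as administrator; "gluegood.cer" is an assumed file name) could add the certificate to both stores:

  Imports System.Security.Cryptography.X509Certificates

  ' Sketch only: installs the .cer into both machine stores required for VSTO/ClickOnce.
  Private Sub InstallVstoCertificate()
    Dim oCert As New X509Certificate2("gluegood.cer")
    For Each eStoreName As StoreName In New StoreName() {StoreName.Root, StoreName.TrustedPublisher}
      Dim oStore As New X509Store(eStoreName, StoreLocation.LocalMachine)
      oStore.Open(OpenFlags.ReadWrite)
      oStore.Add(oCert)
      oStore.Close()
    Next
  End Sub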

OU
- In the Group Policy Object Editor add the certificates to Computer Configuration > Windows Settings > Security Settings > Public Key Policies. For a full walkthrough see https://support.smoothwall.net/index.php?_m=knowledgebase&_a=viewarticle&kbarticleid=180 (don't forget to place the certificate in both locations).

Word configuration
To automatically allow the Word document to run the ClickOnce deployment, the location of the website must be in Word's Trusted Locations.

Manual
- Within Word go to Office menu->Word Options->Trust Center->Trust Center Settings->Trusted Locations.
a. Allow Trusted Locations on my network – checked
b. The following list of locations :- e.g.
http://website-Prod/VSTO/
http://website-UAT/VSTO/
http://website-TEST/VSTO/
http://website-DEV/VSTO/
* Ensure 'Subfolders of this location are also trusted' is checked when adding each location.

Note: Word has a quirk where the VSTOs must be installed in a subfolder of the site (for Trusted Locations to apply), so you can't use http://MyWebSite/Gluegood.docx, but you can use http://MyWebSite/VSTO/Gluegood.docx
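
Under the covers these settings are just per-user registry values, so if group policy isn't an option you can also script a trusted location. The sketch below assumes Word 2007's registry layout; "Location50" is an arbitrary slot name - pick any LocationNN key not already in use.

  ' Sketch only: adds http://website-Prod/VSTO/ as a Word 2007 trusted location for the current user.
  Dim sKey As String = "HKEY_CURRENT_USER\Software\Microsoft\Office\12.0\Word\Security\Trusted Locations"
  My.Computer.Registry.SetValue(sKey, "AllowNetworkLocations", 1, Microsoft.Win32.RegistryValueKind.DWord)
  My.Computer.Registry.SetValue(sKey & "\Location50", "Path", "http://website-Prod/VSTO/", Microsoft.Win32.RegistryValueKind.String)
  My.Computer.Registry.SetValue(sKey & "\Location50", "AllowSubFolders", 1, Microsoft.Win32.RegistryValueKind.DWord)
  My.Computer.Registry.SetValue(sKey & "\Location50", "Description", "VSTO deployment site", Microsoft.Win32.RegistryValueKind.String)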

OU
- Follow instructions here on how to set the above in an OU (http://technet.microsoft.com/en-us/library/cc178948(office.12).aspx )


Web site configuration
So that the VSTO runs correctly you need to add a MIME type mapping on the web server from the .vsto extension to application/x-ms-vsto. See this article for instructions (http://msdn.microsoft.com/en-us/library/bb608629.aspx)
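
If the site is on IIS 7 the same mapping can also be dropped into the site's web.config (IIS 6 needs the MMC or the article above); a minimal snippet:

<configuration>
  <system.webServer>
    <staticContent>
      <mimeMap fileExtension=".vsto" mimeType="application/x-ms-vsto" />
    </staticContent>
  </system.webServer>
</configuration>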


MSBuild project file (aka the Magic happens here)
VS2008 comes with a nice 'Publish' UI; however, it is lacking in the following ways :-
1. You are unable to differentiate builds between the environments.
2. It can't be part of a scheduled build

There seem to be lots of other ways of doing the below, but I like the simplicity of building and creating the VSTO in one step using MSBuild without needing to fiddle with other tools.

Below I have provided VB.NET code to generate the MSBuild file and kick off the build. It can however easily be converted into a standalone .msbuild file and a .cmd file with hard-coded values.

There are 2 parts to the MSBuild file :-
a) The properties created in the Property Group
b) The Actual MSBuild settings

Below are parameters which are passed to the code :-
SolutionFile - This is the location of the .sln file to be compiled. e.g. C:\Projects\VSTOGluegood.sln

Version - This is the version of the VSTO project. e.g. 3.1.2.29

Environment - We use this to determine the SolutionId (see below)

Destination - This is the network address in which you want to create the VSTO. e.g. \\MyWebServer\e$\Site\VSTO\

DestinationURL - This is the public address from which the VSTO will be installed; when clients check for new versions this is the location they will use. e.g. http://MySite/VSTO

SolutionId - To make the VSTOs unique for each environment they must have a different AssemblyName and a different SolutionId (http://briannoyes.net/2006/11/03/ClickOnceDeploymentApplicationIdentity.aspx). The function CompileVSTOReports_GetSolutionId in the code below looks for the .vbproj and extracts the SolutionId, then replaces the last character so each environment is unique. e.g. our ProjId is dd0d7e0e-7913-4e9a-8541-d3ef27387d16; for VSTO deployments the trailing 6 becomes an 'a' in DEV, 'b' in TEST, 'c' in UAT and 'd' in PROD.
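
The function itself isn't reproduced in full here, but a sketch of the idea (find the .vbproj beside the .sln, pull out its <SolutionID> element and swap the last character per environment) looks roughly like this:

  Private Function CompileVSTOReports_GetSolutionId(ByVal Environment As String, _
                            ByVal SolutionFile As System.IO.FileInfo) As String
    ' Sketch only: read the <SolutionID> out of the first .vbproj under the solution folder.
    Dim sProjectFile As String = System.IO.Directory.GetFiles(SolutionFile.DirectoryName, _
                            "*.vbproj", System.IO.SearchOption.AllDirectories)(0)
    Dim sSolutionId As String = System.Text.RegularExpressions.Regex.Match( _
                            System.IO.File.ReadAllText(sProjectFile), _
                            "<SolutionID>(.*?)</SolutionID>").Groups(1).Value
    ' Replace the last character so each environment gets a unique SolutionID.
    Dim sSuffix As String
    Select Case Environment.ToUpper
      Case "DEV" : sSuffix = "a"
      Case "TEST" : sSuffix = "b"
      Case "UAT" : sSuffix = "c"
      Case Else : sSuffix = "d"   ' PROD
    End Select
    Return sSolutionId.Substring(0, sSolutionId.Length - 1) & sSuffix
  End Function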

ManifestCertificateThumbprint - This value is extracted from the .vbproj file and is the thumbprint of the .pfx certificate, written there by Visual Studio when the project is signed. To get this value, sign the project with the .pfx file in Visual Studio, open the .vbproj file in a text editor, find the value and paste it below.
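
For reference, after signing the project the .vbproj ends up with something like the below (the thumbprint shown is the one reused in the code that follows):

  <PropertyGroup>
    <SignManifests>true</SignManifests>
    <ManifestKeyFile>gluegood.pfx</ManifestKeyFile>
    <ManifestCertificateThumbprint>84489FD07E578BBA71CC67E9EB4F49E1B4EE740A</ManifestCertificateThumbprint>
  </PropertyGroup>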

  Private Sub CompileVSTOReports_CompileSolutionFile(ByVal SolutionFile As System.IO.FileInfo, _
                            ByVal Version As String, _
                            ByVal Environment As String, _
                            ByVal Destination As String, _
                            ByVal DestinationURL As String)

    Dim oStreamWriter As System.IO.StreamWriter = Nothing
    Dim sBuildFile As String = My.Computer.FileSystem.GetTempFileName


      Dim sMSBuild As System.Text.StringBuilder = New System.Text.StringBuilder

      sMSBuild.Append("<Project DefaultTargets=""Main"" xmlns=""http://schemas.microsoft.com/developer/msbuild/2003"">" & vbCrLf)
      sMSBuild.Append("<PropertyGroup>" & vbCrLf)
      sMSBuild.Append("  <SourceSolution>" & SolutionFile.FullName & "</SourceSolution>" & vbCrLf)
      ' If you change the PFX file you must change the Thumbprint below. This comes from the .vbproj file if you manually
      ' use the PFX file - when seen in a text editor.
      sMSBuild.Append("  <KeyFileLocation>" & My.Application.Info.DirectoryPath & "\gluegood.pfx</KeyFileLocation>" & vbCrLf)
      sMSBuild.Append("  <VersionNumber>" & Version & "</VersionNumber>" & vbCrLf)
      sMSBuild.Append("  <AssemblyName>" & "VSTO_" & Replace(SolutionFile.Name, ".sln", "") & "_" & Environment & "</AssemblyName>" & vbCrLf)
      sMSBuild.Append("  <PublishDir>" & Destination & "</PublishDir>" & vbCrLf)
      sMSBuild.Append("  <InstallUrl>" & DestinationURL & "</InstallUrl>" & vbCrLf)
      sMSBuild.Append("  <SolutionID>" & CompileVSTOReports_GetSolutionId(Environment, SolutionFile) & "</SolutionID>" & vbCrLf)
      sMSBuild.Append("</PropertyGroup>" & vbCrLf)

      sMSBuild.Append("<Target Name=""Main"">" & vbCrLf)
      sMSBuild.Append("  <MSBuild Projects=""$(SourceSolution)""" & vbCrLf)
      sMSBuild.Append("    Properties = ""Configuration=Release;" & vbCrLf)
      sMSBuild.Append("      InstallUrl=$(InstallUrl);" & vbCrLf)
      sMSBuild.Append("      PublishDir=$(PublishDir);" & vbCrLf)
      sMSBuild.Append("      PublishUrl=$(InstallUrl);" & vbCrLf)
      sMSBuild.Append("      SignAssembly=true;" & vbCrLf)
      sMSBuild.Append("      DelaySign=false;" & vbCrLf)
      sMSBuild.Append("      AssemblyOriginatorKeyFile=$(KeyFileLocation);" & vbCrLf)
      sMSBuild.Append("      ManifestKeyFile=($KeyFileLocation);" & vbCrLf)
      ' If you change the PFX file you must change the Thumbprint below. This comes from the .vbproj file if you manually
      ' use the PFX file - when seen in a text editor.
      sMSBuild.Append("      ManifestCertificateThumbprint=84489FD07E578BBA71CC67E9EB4F49E1B4EE740A;" & vbCrLf)
      sMSBuild.Append("      SignManifests=true;" & vbCrLf)
      sMSBuild.Append("      ApplicationVersion=$(VersionNumber);" & vbCrLf)
      sMSBuild.Append("      BootstrapperEnabled=true;" & vbCrLf)
      sMSBuild.Append("      UpdateEnabled=true;" & vbCrLf)
      sMSBuild.Append("      UpdateInterval=0;" & vbCrLf)
      sMSBuild.Append("      AssemblyName=$(AssemblyName);" & vbCrLf)
      sMSBuild.Append("      SolutionID=$(SolutionId);" & vbCrLf)
      sMSBuild.Append("      IsWebBootstrapper=True""" & vbCrLf)
      sMSBuild.Append("    ContinueOnError = ""false""" & vbCrLf)
      sMSBuild.Append("    Targets=""Publish"" />" & vbCrLf)
      sMSBuild.Append("</Target>" & vbCrLf)
      sMSBuild.Append("</Project>" & vbCrLf)

      oStreamWriter = New System.IO.StreamWriter(sBuildFile, False)
      oStreamWriter.Write(sMSBuild.ToString)
      oStreamWriter.Close()
      oStreamWriter = Nothing

Running the compile in code :-

      Dim sBuildLogFileName As String = "MSBuild_VSTO_" & Now.ToString("yyyyMMdd_hhmmss") & ".log"

      ' Run the build
      Dim oProcessInfo As System.Diagnostics.ProcessStartInfo = New ProcessStartInfo("C:\WINDOWS\Microsoft.NET\Framework\v3.5\MSBuild.exe", """" & sBuildFile & """ /clp:errorsonly /l:FileLogger,Microsoft.Build.Engine;logfile=" & sBuildLogFileName & " /filelogger")
      oProcessInfo.WindowStyle = ProcessWindowStyle.Hidden
      Dim oProcess = Process.Start(oProcessInfo)
      oProcess.WaitForExit()

      If oProcess.ExitCode <> 0 Then
        'Error
        Throw New Exception("Build failed for solution " & SolutionFile.FullName & ". See log file " & sBuildLogFileName)
      End If


Notes:
Debugging - VSTO and ClickOnce is 'fun' to debug. Below are some suggestions :-
1. Run the .vsto deployment file to install the package first. Do this directly by copying the file and then clicking on the link. If you get a security warning you know that Office is going to silently discard the VSTO, so troubleshoot this first.
2. In add/remove programs you can see the VSTO, so check it exists. Remove it manually between tests.
3. Initially run the .docx or .dotx locally from the C:\ of the test PC. This gets around any security issues.

Passing query string / parameters via URL - We needed to deliver some parameters to the VSTO. A really simple way of doing this is by passing Me.FullName to the below function e.g.

  Private Sub ThisDocument_Startup(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Startup
    ...
GetParameterIntFromQueryString("UserId", Me.Fullname)
   ...
  End Sub


  Public Shared Function GetParameterIntFromQueryString(ByVal ParameterName As String, ByVal Location As String) As Integer
    ' Retrieve the ParameterName from the QueryString used to launch the document. Pass to this function me.Fullname
    Dim oURI As UriBuilder = New UriBuilder(Location)
    Dim sQueryString = oURI.Query.ToString

    If sQueryString = vbNullString Then
      Return 0
    Else
      Dim NameValueTable As New Collections.Specialized.NameValueCollection()
      NameValueTable = System.Web.HttpUtility.ParseQueryString(sQueryString)
      If IsNumeric(NameValueTable(ParameterName)) Then
        Return CInt(NameValueTable(ParameterName))
      Else
        Return 0
      End If
    End If
  End Function


e.g. Opening the VSTO from a website with http://myvstoserver/VSTO/UserReport.docx?UserId=123 would return 123 from the above function.

I would strongly suggest prompting for the parameters if they are not supplied (use InputBox) so the document can be run in debug mode, or by users running it off the website.
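
For example, a small sketch of that fallback (the prompt text is up to you):

    Dim iUserId As Integer = GetParameterIntFromQueryString("UserId", Me.FullName)
    If iUserId = 0 Then
      ' No query string (e.g. running from Visual Studio, or opened outside the website) - ask instead.
      iUserId = CInt(Val(InputBox("Enter the UserId to report on", "UserReport")))
    End If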

VSTO run once from .docx - We required our VSTO to run once and then not again. Unfortunately the original developer created the VSTO as a Word Document (.docx), which meant the saved Word Document continued to embed the VSTO relationship, so when the user opened it again the code would re-run, wiping any manual changes they had made. The workaround is either to change the project to a Document Template (.dotx), or in ThisDocument_Startup, at the end of the Finally block, call Me.RemoveCustomization() - which strips the relationship between the document and the VSTO.
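
A sketch of the second option, assuming all the one-off work happens in ThisDocument_Startup:

  Private Sub ThisDocument_Startup(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Startup
    Try
      ' ... do the one-off work here (fill in the document, query the database, etc.) ...
    Finally
      ' Strip the relationship between the saved document and the VSTO so that
      ' re-opening the saved document does not re-run this code and wipe manual changes.
      Me.RemoveCustomization()
    End Try
  End Sub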

Enjoy!

*************************************************
** Legal **
This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version.This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.You should have received a copy of the GNU General Public License along with this program; if not, write to the Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.

Saturday, August 1, 2009

Freeware - LogParser.TFSQuery (Team Foundation Server)

I recently found myself needing to query Team Foundation Server (TFS) like I had previously done with Visual SourceSafe (VSS). I needed the following capabilities :-
- Querying and notification of recent work items
- Querying and notification of the most recent check-ins
- Querying and notification of check-outs which were over a time threshold
- Querying and notification of shelvesets (parked code) which were older than a certain threshold
- Querying and notification across linked work items

I could have used out-of-the-box TFS functionality for some of the above, but some of it doesn't exist without reaching into the database. Therefore, given I already have LogParser.Email (the ability to e-mail LogParser results), I decided to write a LogParser COM input format plugin for TFS to perform all the above.

Initially I thought I could build the tool directly against the TFS web services; however, after reading around online I decided against this approach, mostly because Microsoft don't document them, nor guarantee they won't break between releases. I additionally considered simply parsing the output from TF.EXE, but thought this was an inelegant and error-prone solution, and it wouldn't provide the Work Item querying I was looking for.

I therefore decided to look at the TFS APIs available in VB.NET. The available functionality is extensive, and blogs like James Manning's make it much easier to navigate, with various examples of using the APIs.

Below are details on the solution.

Features

  • Freeware with VB.NET 2005 source code included
  • Can query Team Foundation Server (TFS) work items, check-ins (ChangeSets), check-outs (Pending ChangeSets), and Shelvesets. This has been tested on TFS 2005 - it may work on earlier or later versions.
  • For work items supports either querying using existing Stored Queries, or using direct work item query language (WIQL). The queries can be run across a specific project, or multiple projects.
  • For check-ins, check-outs and shelveset queries it reports the file details and associated work items. In addition it supports single or multiple project paths in the same query, and control of the recursion level.
  • Work items and check-ins (changesets) support iCheckPoint functionality to enable you to retrieve only the latest changes since the last query. This works by recording the time (UTC) of the last query and then querying for items newer than that time.
  • The Work items query can retrieve specified fields (either in Stored Query, or WIQL) or return all fields available for that work item.
  • Can query multiple different TFS projects, or even different TFS servers and combine the results. Internally used Dataset.Merge functionality.
  • Work item query allows for LEFT JOIN-ing linked Work Items. You can specify the fields you want returned in the LEFT JOIN.
  • Implements a number of custom columns (as per standard LogParser) to provide extra querying features.
  • Logs any errors to console and to EventLog.Application
  • Supports Log Parser data types.
  • Uses the ILogParserInputContext interface to enable LogParser COM interface to directly call. i.e, you can code against this plugin.


*************************************************
Download VB.NET source is included in download.
*************************************************
To install:

This assumes that the TeamFoundation*.dll assemblies (from Team Explorer or Visual Studio) are installed in the GAC and working on your system. Both Team Explorer and Visual Studio install the DLLs into the GAC; stand-alone Team Explorer install instructions are available from Microsoft.

1. Copy the LogParser.TFSQuery folder to a location on your harddrive

2. Run the command .\LogParser.TFSQuery\InstalldotNETasCOM.bat.

3. Run your Log Parser query.

e.g. LogParser.exe -i:COM "select * From 'http://tfsserver:8080/TestProject/My Work Items'" -iProgId:Gluegood.LogParser.TFSQuery -o:DataGrid

A number of samples are available in the downloaded file in SampleQuery.bat

Syntax

LogParser.exe -i:COM "select * From '{Path1}','{Path2}',..." -iProgId:Gluegood.LogParser.TFSQuery -iComParams:"iAllFields={TrueFalse},iCheckPoint={Path},iWIQL={WIQL query},iQueryCheckOuts={TrueFalse},iQueryShelvesets={TrueFalse}, iRecursive={TrueFalse},iWorkItemLinksQuery={TrueWIQL queryField list}"

  • {Pathx} - a formatted path that adheres to the following standard.
    - Work items – {server:port}/{project}/{query} e.g. ‘http://tfsserver:8080/ProjectTest/All Work Items’ or ‘tfsserver/*/All Work Items’. The 3 slashes are expected as delimiters even when an item isn’t required.
    - Check-ins, Check-outs, shelvesets - {server:port}/{path} e.g. ‘http://tfsserver:8080/$/ProjectTest/Release’ or ‘tfsserver/$/
    Supports multiple Paths as long as they are delimited with a comma (,).
  • iAllFields - Ignore the columns defined in the query (or WIQL) and retrieve all columns. Default = False
  • iCheckPoint – Path to a checkpoint file. Default is no checkpoint.
  • iWIQL – WIQL query. Overrides any stored queries specified. Defaults to no iWIQL. Hint: if using from the command line any commas must be escaped with \u002c.
  • iQueryCheckOuts - Query the path specified looking for CheckOuts. If iQueryCheckOuts and iQueryShelvesets are not provided, or both false, then assume querying for Check-ins. Default is false.
  • iQueryShelvesets - Query the path specified looking for Shelvesets. If iQueryCheckOuts and iQueryShelvesets are not provided, or both false, then assume querying for Check-ins. Default is false.
  • iRecursive – defines whether to recursively scan subfolders, or stay within the current folder/project defined. Default is true.
  • iWorkItemLinksQuery – Defines whether to LEFT JOIN all linked Work Items. Extra columns returned will be prefixed with ‘LogParserTFSRelated’. Setting to True will return Linked Work Item Id, Type, State, and Title. Can additionally define your own WIQL (with only SELECT and FROM clause) or a field list. Default is "" (not enabled).

Columns returned (Work items)
LogParserTFSServer – Server in which the query is being executed.
LogParserTFSProject – TFS Project which this query has been executed
LogParserTFSQuery – TFS Stored Query, or WIQL query
LogParserTFSWorkItemId – Work Item Id for this item
LogParserRecordNumber – Record number for this item in the query.
LogParserTFSRelated{Field} – Only when iWorkItemLinksQuery is used; the linked work item columns specified in the query (or all columns), prefixed with LogParserTFSRelated


Columns returned (Check-ins / changesets)
LogParserTFSServer – Server in which the query is being executed.
LogParserRecordNumber – Record number for this item in the query.
ChangeType – Type of change
User – User who checked in this item.
Date – Date that this check-in occurred.
Comment – Comment associated to this check-in
Path – File path (server side) of this check-in
WorkItemId – Associated Work Item Ids. Multiple Ids are displayed separated by a comma.
ChangeSetId – ChangeSet Id.
PolicyOverrideComment – Policy override comment if entered.
PolicyFailures – Number of policy failures

Columns returned (Check-outs / pendingsets)
LogParserTFSServer – Server in which the query is being executed.
LogParserRecordNumber – Record number for this item in the query.
ChangeType – Type of change
User – User who checked out this item.
Date – Date that this check-out occurred.
Path – File path (server side) of this check-out
PendingSetId – PendingSet Id.
Computer – Computer from which this check-out was made.

Columns returned (Shelvesets)
LogParserTFSServer – Server in which the query is being executed.
LogParserRecordNumber – Record number for this item in the query.
ChangeType – Type of change
User – User who shelved this item.
Date – Date that this item was shelved.
Path – File path (server side) of the shelved item
PendingSetId – PendingSet Id.
Computer – Computer from which this item was shelved.
Name – Name of this shelveset

Challenges


CheckPoint, WIQL and store.Query
When using iCheckPoint with Work Items I had to modify the supplied stored query or WIQL with an additional clause returning only the latest items based on System.ChangedDate. I ran into a problem using {string}.Replace to look for an existing WHERE clause as it's case sensitive and I didn't think I could rely on the case, so I ended up using Regex.Replace thanks to this. In addition, because the CheckPoint stores a date and time, I had to change the way I executed the query from the standard store.Query approach listed in many examples to a more obscure method using the Query object and setting the dayPrecision flag.
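
Roughly what that clause injection looks like (sketch only - sCheckPointUtc is assumed to be the UTC time string read back from the checkpoint file):

    ' Append the checkpoint filter, coping with an existing WHERE clause in any case.
    Dim sWIQL As String = "SELECT [System.Id], [System.Title] FROM WorkItems"
    If System.Text.RegularExpressions.Regex.IsMatch(sWIQL, "\bWHERE\b", _
        System.Text.RegularExpressions.RegexOptions.IgnoreCase) Then
      sWIQL = System.Text.RegularExpressions.Regex.Replace(sWIQL, "\bWHERE\b", _
        "WHERE [System.ChangedDate] > '" & sCheckPointUtc & "' AND ", _
        System.Text.RegularExpressions.RegexOptions.IgnoreCase)
    Else
      sWIQL &= " WHERE [System.ChangedDate] > '" & sCheckPointUtc & "'"
    End If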

Performance and WorkItems
There is a lot of documentation stating that performance degrades if you query work item fields beyond those returned in the WIQL query (as TFS needs an extra round trip to the server), therefore when returning fields (without the iAllFields setting) I ensure I only reference the fields specified. In addition I bumped the PageSize up to 200.

Changeset, Pendingset and Shelveset and querying
There is some complexity around these (using the QueryPendingSets, QueryShelvedChanges, and QueryHistory APIs); I did some research and found this entry helpful in providing direction. In addition I leaned heavily on James Manning's site and the numerous examples provided, e.g. Shelvesets

Querying Linked WorkItems and Performance
The easy way would have been to return the Linked Work Item details on each Work Item that had a link. However this performs terribly (a round trip is made for every Work Item link). A better way I found was to (see the sketch after this list)

  1. Add the RelatedLinkCount field to the initial WIQL query (to determine if there were any links),
  2. Then loop through the WorkItem.Links collection (performance hit on this one and doesn't seem to be a way around this) determining if any are WorkItems and then storing any found WorkItem link ids into a BatchReadParameter.
  3. Then after cycling all Work Items I then execute a new WIQL store.query using the BatchReadParameterCollection (a list of Work Item Ids).
  4. Then I link the original Work Item dataset to the Linked Work Item dataset by moving through all the DataTable rows. There are a few tricks in this code to handle when a Work Item cannot be referenced due to permissions.
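
A stripped-down sketch of steps 2 and 3 (error handling and the permission checks omitted; assumes Imports Microsoft.TeamFoundation.WorkItemTracking.Client, with oWorkItem being the current WorkItem and oStore the WorkItemStore):

    ' Collect the ids of every linked work item found while looping the main result set.
    Dim oBatch As New BatchReadParameterCollection
    For Each oLink As Link In oWorkItem.Links
      Dim oRelated As RelatedLink = TryCast(oLink, RelatedLink)
      If oRelated IsNot Nothing Then
        oBatch.Add(New BatchReadParameter(oRelated.RelatedWorkItemId))
      End If
    Next
    ' One store.Query round trip for all the linked work items rather than one per link.
    Dim oLinked As WorkItemCollection = oStore.Query(oBatch, _
      "SELECT [System.Id], [System.WorkItemType], [System.State], [System.Title] FROM WorkItems")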

GACUTIL and REGASM
My approach with LogParser COM input add-ins had been to always install them in the GAC, as I mistakenly thought that LogParser needed the COM components both registered and installed in the GAC. With this project I realised I really didn't want to have to get the entire set of TeamFoundation assemblies into the GAC, so I played around with the install and made 2 discoveries.

a) the TeamFoundation assemblies appear to already be installed in the GAC (if you have Visual Studio Installed, or Team Explorer) and

b) that using REGASM with the /codebase (as per MSDN - The Codebase entry specifies the file path for an assembly that is not installed in the global assembly cache) means you don't need to install into the GAC.
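
e.g. (assuming the plugin assembly file is Gluegood.LogParser.TFSQuery.dll):

C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\RegAsm.exe .\Gluegood.LogParser.TFSQuery.dll /codebase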


*************************************************

** Legal **

This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version.This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.You should have received a copy of the GNU General Public License along with this program; if not, write to the Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.

Saturday, May 30, 2009

Freeware - LogParser.Email

Monitoring logs is always very challenging. You can use very expensive software which has its own interfaces and nuances, or you can write your own (which can be just as painful).

A few years ago I had the responsibility of monitoring a buggy 3rd party web application. I would often be woken in the middle of the night by overseas users simply to bounce the IIS service. To enjoy a more peaceful night's sleep I created a tool that monitored the IIS logs; when a particular string appeared I knew that the 3rd party software had crashed and could issue a command to restart IIS. The tool worked and I got some rest.

I wish at the time I had known of LogParser and had written the tool I've created in this post which would have solved the problem very quickly and very cleanly.

LogParser.Email can take the output from any LogParser query and deliver a 'nicely' formatted SMTP e-mail. In addition the tool has a 'trigger' concept that determines whether the e-mail should be sent. As an extension it can also run batch commands (e.g. restart IIS). Used in combination with scheduling software you have an extremely powerful monitoring solution.

LogParser.exe -i:EVT "Select Top 10 * into STDOUT from System" -tpl:EmailTemplate.tpl -stats:off | LogParser.Email.exe

Suggested usage
  • A cheaper alternative to some of the monitoring features in Microsoft Operations Manager (MOM), or Heroix RoboMon, HP Openview etc
  • Used to 'test' if a specific rule being introduced into an enterprise monitoring product is valid
  • SharePoint Alert replacement (in conjunction with LogParser.WSS30) where an e-mail can list items, or changes across multiple lists, wikis or document libraries
  • Query databases and send alerts for specific details (in conjunction with LogParser.OLEDB), e-mailing rows from multiple data sources. e.g. a poor man's security breach tool
  • Check for Low Disk Space, or Unexpected Server restarts
  • Notify when WLBS (monitor EventLog), or other Load Balancing product is down (monitor logs)
  • E-mail daily a combined list of file versions running on particular servers (using -i:FS)
  • Monitor IIS performance daily, weekly, monthly, or the health of the server (using -i:IISW3C)
  • Check anything for which you have a log file that you need to monitor.
Solution overview
1. Create a LogParser query which returns the data in exactly the way you'd want in presented in an e-mail. e.g. formatting, column positions, column names etc.

2. Create a LogParser Template file that matches the e-mail Template format. e.g. create a LogParser Template file named C:\Template.tpl. The Template should follow the format in the samples provided (a simple one is listed below). Add more or fewer FIELDNAME_X entries as per the number of columns in the LogParser query. Also update the EmailAddress and SMTPServer values.

<LPHEADER>
<Email>
<Trigger Operator=">" Value="0" />
<Actions>
<Action Type="EMAIL" EmailAddress="destination@destination.com" SMTPServer="mail"/>
</Actions>
<Rows>
</LPHEADER>
<LPBODY>
<Row>
<Field Name="%FIELDNAME_1%">%FIELD_1%</Field>
<Field Name="%FIELDNAME_2%">%FIELD_2%</Field>
<Field Name="%FIELDNAME_x%">%FIELD_x%</Field>
</Row>
</LPBODY>
<LPFOOTER>
</Rows>
</Email>
</LPFOOTER>

3. Run the LogParser query ensuring
a) that the INTO STDOUT clause is used
b) that -stats:off is used to ensure that only the XML is generated to STDOUT
c) that your LogParser results escape any XML characters. Use the REPLACE_STR function in LogParser e.g. escape < > " ' &
d) that you pipe the result to LogParser.Email.exe

LogParser.exe -i:EVT "Select Top 10 * into STDOUT from System" -tpl:EmailTemplate.tpl -stats:off | LogParser.Email.exe


Features of LogParser.Email.exe
  • Freeware with VB.NET 2005 source code included
  • LogParser.Email can take the output from any LogParser query and deliver a 'nicely' formatted HTML SMTP e-mail. Can deliver to multiple recipients in a single e-mail, or seperate e-mails via multiple EMAIL 'Actions'
  • Used the LogParser template (-tpl) flag and a specially crafted (not too complex) Template file to pass results to the tool.
  • The tool displays the LogParser results in a grid within the e-mail. Using LogParser native facilities you can name and order the columns and format the row results (e.g. date/time formats)
  • The tool works on the concept of 'Triggers' which allow you to define when to send the e-mail based on a threshold being e.g. only send an e-mail if more than 50 LogParser rows are returned (Rows>50 then send e-mail)
  • Can specify the SMTPServer, e-mail address, e-mail source address, e-mail subject, e-mail notes, e-mail footer text, and choose the colours (columns headings, and even and odd rows)
  • Using a special prefix 'NOHTMLESCAPE' on any results you can escape the HTML formatting and create HTML tags within the e-mails. e.g. hyperlinks
  • The tool can execute a batch command (or multiple via COMMAND 'Actions') based on the Trigger - see the example after this list
  • Logs any errors to console, EventLog.Application, and an AppPath\Trace folder
  • Can be used without LogParser by piping contents of an XML file (formatted as directed) to the utility.
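
For example, the trigger-driven IIS restart mentioned above could be wired up with an extra COMMAND action in the template (the .cmd path is purely illustrative):

<Actions>
<Action Type="EMAIL" EmailAddress="destination@destination.com" SMTPServer="mail"/>
<Action Type="COMMAND" Command="C:\Scripts\RestartIIS.cmd"/>
</Actions>
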
*************************************************

Download VB.NET source is included in download.

*************************************************

To install:
1. Copy the LogParser.Email.exe to C:\Program Files\Log Parser 2.2\

2. Set up the Template file required. Ensure the field names match the column names in your LogParser query and that the number of columns matches the FIELD_X values in the Template file.

3. Run your Log Parser query.

*************************************************
Syntax

{STDIN Template XML} | LogParser.Email.exe {Optional Debug Template}
e.g. TYPE c:\Email | LogParser.Email.exe True

*************************************************
XML Attribues

Below are details of the XML attributes within the Template file

Trigger
Operator - This is the operator that will be used to test the Value against the number of rows returned from LogParser, e.g. >, < or =. In the XML these should be escaped (&gt; or &lt;). e.g. an Operator of ">" with Value "0" means that if the LogParser query returns any rows the trigger will be fired.
Value - This is the value which the returned rows and operator are tested against.

Action
Type- Mandatory. EMAIL or COMMAND.
EmailAddress - Mandatory. SMTP destination e-mail address.
SMTPServer - Mandatory. SMTP Server name.
EmailText - Optional. Text that you'd like to appear above the results. Usually used to provide some guidance on what the result contains or what should be done to rectify it.
EmailTextFullControl - Optional. Allows you to override the standard header text with your own text (with NOHTMLESCAPE you can mark up the text).
EmailSourceAddress - Optional. Source e-mail address.
EmailSubject - Optional. Email subject
EmailFooter - Optional. Text you'd like displayed at the bottom of the e-mail.
EmailColumnColour - Optional. The HTML colour used for the column heading.
EmailRowColour - Optional. The HTML colour used for the even rows.
EmailRowAlternateColour - Optional. The HTML colour used for the odd rows.
Command - Mandatory. Used for COMMAND type only and the full path to the command that needs to be executed.

Row
Field name attribute - Name of the field
Field node value - Value for the field in this row. Values with NOHTMLESCAPE will display without their characters being escaped. e.g. hyperlinks.

*************************************************
** Legal **
This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version.This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.You should have received a copy of the GNU General Public License along with this program; if not, write to the Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.

Wednesday, April 1, 2009

Bitten by the cloud

Software as a service, cloud computing, SOA, or whatever the buzzword is - they all rely on trust. Trust that we won't do anything naughty with their service, and our trust that the service is reliable and that we get plenty of warning of closure.

I've been happily using Openomy.com for about 2 years, after Omnidrive died. At least Omnidrive was around for a few months before it disappeared, while Openomy simply died on the 24th of March after 10 days' notice from this site.

So once again I've moved content to a new provider - Box.net.

P.S. Some content will be unavailable temporarily while I recover it from various locations.

Thursday, November 13, 2008

Freeware - WSS30ListUpload (Sharepoint)

I recently published a solution to query Sharepoint lists/document libraries using LogParser. I then thought about how you could output LogParser results directly to Sharepoint (WSS 3.0) lists.

Some advantages that I can see of using SharePoint as a Log Parser output are that you have a more managed location for results and additionally you can utilise the alert facilities. e.g. provide a monthly IIS statistics report that is automatically distributed to the required people.

After some thinking I decided upon a solution using the LogParser output 'Template' format (-o:TPL) to generate the SharePoint batch XML, outputting the resultant XML to STDOUT (in the LogParser query) and then redirecting it (using a pipe, i.e. |) to a small tool named WSS30ListUpload (which I wrote) which uploads it to a SharePoint list or document library. This utilises features already within Log Parser and therefore provides a very clean solution with a great amount of flexibility.

Note: The WSS30ListUpload tool can be used without Log Parser. In essence using this tool you have a command line driven Sharepoint list uploading tool. e.g.

type c:\listdata.xml | WSS30ListUpload.exe "http://mysite/site/Lists/MyList/AllItems.aspx"

Solution overview

1. Create a LogParser Template file that matches the SharePoint Batch XML format. e.g. create a LogParser Template file named C:\Template.tpl. The Template should follow the below example. You can download a working XML below. Add more FIELDNAME_X entries as per the number of columns uploaded into SharePoint.

<LPHEADER><Batch OnError="Continue" PreCalc="TRUE" ListVersion="0" ViewName=""> </LPHEADER>
<LPBODY> <Method ID="%FIELD_1%" Cmd="New">
<Field Name="%FIELDNAME_2%">%FIELD_2%</Field>
<Field Name="%FIELDNAME_3%">%FIELD_3%</Field>
<Field Name="%FIELDNAME_4%">%FIELD_4%
</Field>
<Field Name="%FIELDNAME_5%">%FIELD_5%</Field>
<Field Name="%FIELDNAME_6%">%FIELD_6%</Field>
<Field Name="%FIELDNAME_7%">%FIELD_7%</Field>
<Field Name="%FIELDNAME_8%">%FIELD_8%</Field>
</Method>
</LPBODY>
<LPFOOTER></Batch>
</LPFOOTER>

2. Place WSS30ListUpload.exe in C:\Program Files\LogParser 2.2\

3. Run the LogParser query ensuring
a) that the first column is a unique number per row. e.g. RecordNumber
b) that a column [ID] is generated with the text 'New'
c) that the column names match either the SharePoint 'DisplayName' (the title in SharePoint), or the underlying Name (e.g. 'This_x0020_Field' vs 'This Field') within the list.
d) that the INTO STDOUT clause is used.
e) that -stats:off is used to ensure that only the XML is generated to STDOUT.
f) that you pipe the result to WSS30ListUpload.exe which takes a single command line parameter of the SharePoint List URL.

e.g. LogParser.exe -i:EVT "Select Top 10 RecordNumber,'New' as [ID],EventLog,RecordNumber,TimeGenerated,TimeWritten,EventID,EventType into STDOUT from System" -tpl:c:\SharePointTemplate.tpl -stats:off | WSS30ListUpload.exe "http://mysite/site/Lists/Test/AllItems.aspx"

Note: The result from the execution is the SharePoint Lists.asmx -> UpdateListItems XML response. Any errors resulting from the upload will be in this response. If you want to capture it, redirect it to a file, e.g. add > C:\Output.xml at the end of the example above.

Features of WSS30ListUpload

  • Freeware with VB.NET 2005 source code included
  • Can execute any Sharepoint Batch XML file against WSS 3.0 list, wiki, or document library content. Have only tested on WSS 3.0 - it may work on earlier or later versions. Uses the web service Lists.asmx -> UpdateListItems.
  • Single command line parameter is SharePoint URL to List/Document Library. Internally resolves Site and List Name based on the URL.
  • STDIN supports Batch XML
  • The STDOUT from WSS30ListUpload is the result of Sharepoint Lists.asmx -> UpdateListItems in XML format. This includes any errors.
  • Supports both Name and DisplayName for the Sharepoint column names. To do this it automatically translates Sharepoint column DisplayNames into Name (e.g 'This Field' becomes 'This_x0020_Field'). In addition if duplicate DisplayNames are found then will abort the upload.
  • Logs any errors to console and to EventLog.Application

*************************************************
Download VB.NET source is included in download.
*************************************************

To install:

1. Copy the WSS30ListUpload.exe to C:\Program Files\Log Parser 2.2\

2. Set up the Template file required. Ensure you set the field names equal to the SharePoint names in your LogParser query and that the number also matches the FIELD_X values in the Template file.

3. Run your Log Parser query.

*************************************************

Syntax

{STDIN SharePoint Batch XML} | WSS30ListUpload.exe {SharePointListURL}

e.g. TYPE c:\batch.xml | WSS30ListUpload.exe http://mysite/site/lists/MyList/AllItems.aspx

*************************************************

** Legal **

This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version.This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.You should have received a copy of the GNU General Public License along with this program; if not, write to the Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.

Saturday, November 8, 2008

Freeware - Logparser.WSS30 (SharePoint)

Also see the next post concerning Log Parser Sharepoint output solution

I recently came across the need to query SharePoint Document and List libraries programmatically. I played with CAML and found the experience fairly miserable. I played with LINQ and warmed a little to the technology, but in the end came back to LogParser and writing a custom input format plugin.

I decided that LogParser needed a SharePoint (WSS 3.0) Log Parser COM input format plugin. I've been playing with the plugin for a couple of months as the complexities in Sharepoint web services soon became obvious.

Hope you find the utility useful, and if not the utility then the code. Enjoy!

Features
  • Freeware with VB.NET 2005 source code included
  • Can query WSS 3.0 list, wiki, or document library content. Have only tested on WSS 3.0 - it may work on earlier or later versions.
  • Supports retrieving columns and rows back to the LogParser engine based on your existing defined views in the list, wiki, or document library. Additionally it can ignore the columns in the view and return all fields (including hidden ones).
  • No CAML. Create a SharePoint view and call it with LogParser using all its querying techniques on the returned data.
  • Supports multiple list, wiki, or document library querying.
  • Supports version (check in) details for list, wiki or document libraries. This enables you to retrieve all historical changes for a list if version control is enabled on the list, wiki, or document library.
  • Implements iCheckPoint to support only showing the latest added content to the list, wiki, or document library
  • Implements a number of custom columns (as per standard LogParser). They are listed below. This is to provide extra querying features.
  • Logs any errors to console and to EventLog.Application
  • Supports Log Parser data types.
  • Uses the ILogParserInputContext interface to enable LogParser COM interface to directly call. i.e, you can code against this plugin.

*************************************************

Download VB.NET source is included in download.

*************************************************

To install:
1. Copy the LogParser.WSS30 folder to a location on your harddrive
2. Run the command .\LogParser.WSS30\InstalldotNETasCOM.bat (need GACUTIL.EXE - part of the .NET FW 2.0 SDK - why?). This will install into the GAC the Gluegood.LogParser.WSS30.
3. Run your Log Parser query. e.g.

LogParser.exe -i:COM "select * From 'http://sharepoint.net/topsite/mysite/Lists/Change Control/AllItems.aspx'" -iProgId:Gluegood.LogParser.WSS30 -o:DataGrid

Syntax :-
LogParser.exe -i:COM "select * From '{URL1}','{URL2}',..." -iProgId:Gluegood.LogParser.WSS30 -o:DataGrid -iComParams:"iVersion={TrueFalse},iAllFields={TrueFalse},iSiteCollection={SiteURL},iListName={ListName or ListGUID},iViewGUID={ListGUID},"


  • {URLx} - a URL to the path of the SharePoint list and including the view name. Within Sharepoint open a document library, choose your view and then copy the link. Supports multiple URLs as long as they are deliminated with a comma (,).
  • iVersion - Whether the history for each record should be included. Default = False.
  • iAllFields - Ignore the columns defined in the view and retrieve all columns including hidden columns. Default = False
  • iSiteCollection, iListName, iViewGUID - You can manually define the site, list name and GUID for the view. This shouldn't be required except for where the code's logic fails to determine these 3 values from the {URL}. If using these settings set {URL} = '.' - see SampleQuery.bat for example.

Columns returned

  • LogParserListName - The name, or GUID of the list, wiki or document library. GUID is returned where the title of the list is different from the URL path. SharePoint strips special characters from the URL path name. e.g. v2.4 vs v24
  • LogParserRecordNumber - A counter used to uniquely identify the row.
  • LogParserItemURL - The URL to the item in the list, wiki or document library.
  • LogParserListItemId - (iVersion only) The unique item id.
  • LogParserVersionId - (iVersion only) The Version number of the check in.
  • LogParserVersionModified - (iVersion only) Time that the check in occurred.
  • LogParserVersionModifiedBy - (iVersion only) Who checked in the change.
  • LogParserVersionComments - (iVersion only) Provides the check in comments. Available only for Wiki and Document Libraries.
  • {Columns defined in the view} - beyond the fields above only the columns defined in the view. Sharepoint provides a DisplayName and Name for each column defined. Where possible I try to name the columns (as per the view) by their DisplayName. When however I find that the DisplayName is non-unique I revert to its unique Name.
Challenges -

Version - The implementation of retrieving version details in SharePoint isn't straightforward.
  • Trap 1 - There isn't a single call to retrieve version details for a view. To retrieve history for items you must retrieve all the nodes using Lists.asmx -> GetListItems and then loop through each node and each column in that node using Lists.asmx -> GetVersionCollection. That's right GetVersionCollection provides history per column. Performance is therefore fairly horrible so if you are looking for version details ensure that
    a) create a view in SharePoint that has the fields you require (reduce the number of columns) and
    b) consider using checkpoint to reduce the amount of rows returned.
    Thanks to this site for pointing me in the right direction in regards to versions and Sharepoint.
  • Trap 2 - You'd think that version comments using the SharePoint field ows__CheckinComments would come through using Lists.asmx -> GetVersionCollection, especially because it is part of GetListItems. Not in SharePoint land. You do get the Comments, but only the last one (miserable!). To get all comments you need to make a call to Versions.asmx -> GetVersions, look at the comments attribute and then match up the version to the rest of the list details.
  • Trap 3 - I really struggled to ensure that all parts of the version details aligned. The simplest way I could find was to create a DataTable with the columns I required and then create a base entry for every version available for that item based on the ows__UIVersionString field, using its Modified field (date/time). Then, because every GetVersionCollection call for a field returns a Modified value in addition to the field you are after, you are able to sync it up against the row version (why it doesn't return ows__UIVersionString is beyond me). Because in Versions.asmx -> GetVersions you can't rely on the Modified time, we are fortunate that the version is returned (equivalent to ows__UIVersionString) as a matching value.

Column headings - If you call Lists.asmx -> GetListItems you get a list of ALL fields available, and their column names are based on the internal SharePoint names, not the names displayed to the end user (and those internal names are truncated, so not very nice). To get around this I make an initial call to Lists.asmx -> GetListAndView which, for the view, supplies the internal SharePoint name (Name) and its friendly name (DisplayName). A useful trick is that in the DataTable I create, I store the Name as the Column.ColumnName and the DisplayName as the Column.Caption - useful as they are often different. SharePoint allows duplicate DisplayNames, so you need to (and I do) revert to the Name for the Column.Caption if I find a duplicate. When I return the results back to Log Parser I use the Column.Caption. By using the GetListAndView method you are also able to get the data types of the fields, which I can then map to LogParser data types. All up, by making this extra web service call you are able to get the columns in the view, their friendly names, and their data types.
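
In code the trick boils down to something like this (oDataTable assumed):

Dim oColumn As New System.Data.DataColumn("This_x0020_Field") ' internal Name from GetListAndView
oColumn.Caption = "This Field"                                ' friendly DisplayName shown to the user
oDataTable.Columns.Add(oColumn)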

ViewName (you mean ViewGUID) - SharePoint documentation for its web services describes a parameter named ViewName. e.g. GetListItems

Public Function GetListItems (listName As String, viewName As String, ... etc)

You'd think that like ListName you can provide either the 'friendly name' or the 'ListGUID'. In the case of ViewName it actually always means ViewGUID. So if you have the 'friendly name' you need to make a call to Views.asmx -> GetViewCollection and then hunt for your GUID. I do this in the code for you.

xpath and Sharepoint - I spent a couple of days fighting with xpath queries and Sharepoint. After a number of frustrating hours I finally discovered a site which talked about needing to use namespaces in SharePoint xpath queries.

e.g. this does not return any nodes :-

Dim nodeListItems As System.Xml.XmlNode = Webservice.GetListCollection()
Dim xpq As String = "//Lists/List"
For Each Node As System.Xml.XmlNode In nodeListItems.SelectNodes(xpq)

while this does :-

Dim nodeListItems As System.Xml.XmlNode = Webservice.GetListCollection()
Dim xpq As String = "//sp:Lists/sp:List"
For Each Node As System.Xml.XmlNode In RunSharePointXPathQuery(nodeListItems, xpq)


* Download code for function RunSharePointXPathQuery
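
If the download isn't handy, a minimal version of such a helper (assuming the standard SharePoint SOAP namespace, and that the node passed in belongs to an XmlDocument) looks roughly like this:

Private Function RunSharePointXPathQuery(ByVal Node As System.Xml.XmlNode, _
                        ByVal XPathQuery As String) As System.Xml.XmlNodeList
  ' Register the SharePoint namespace under the 'sp' prefix so the xpath resolves.
  Dim oNamespaceManager As New System.Xml.XmlNamespaceManager(Node.OwnerDocument.NameTable)
  oNamespaceManager.AddNamespace("sp", "http://schemas.microsoft.com/sharepoint/soap/")
  Return Node.SelectNodes(XPathQuery, oNamespaceManager)
End Function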

Date Time in SharePoint web service - It is truly amazing that different web methods in SharePoint return times in UTC or local timezone format. The most frustrating is that Lists.asmx -> GetListItems can return UTC, while Versions.asmx -> GetVersions and Lists.asmx -> GetVersionCollection do not. There is a useful blog talking about UTC and SharePoint.

*************************************************

** Legal **
This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version.This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.You should have received a copy of the GNU General Public License along with this program; if not, write to the Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.

Thursday, September 11, 2008

Freeware - Logparser.ZIP

My last project was an OLEDB Log Parser COM input format plugin. One of the comments was that ‘we’ (meaning I) should work on a solution to query within ZIP files. I read the comment at the time and thought the suggestion was very specific and technically very difficult. Several months have passed, I did some more research, and I found a few more comments from people trying to do this.....

Ta-da! Introducing a Log Parser COM input format ZIP plugin that will allow you to execute any Log Parser input query (e.g. –i:CSV, -i:IISW3C, -i:TSV) against a single, or multiple, Zip, GZip, Tar and BZip2 files. * For convenience I’ll refer to this set of files as ZIP files.

Features

  • Freeware with VB.NET 2005 source code included
  • Can query Zip files (GZip, Tar and BZip2 files are supported via an extension written by joelangley). To reduce disk requirements the LogParser.ZIP plugin extracts each file inside the zip, runs the Log Parser query, then deletes the unpacked file and repeats for each file.
  • Supports multiple zip files including wildcards, so in the same Log Parser query you can query across not only multiple files within a single ZIP, but multiple ZIP files, either using a comma-separated list or wildcards. E.g. "Select * From '2009*.zip','200812.zip'"
  • Supports all Log Parser Input Queries and their associated parameters. Note: Because the zip files are unpacked and then deleted the iCheckPoint parameter doesn’t work as it thinks the files have completely changed.
  • Implements iCheckPoint to support only showing the latest added files to the ZIP (not too sure if this will always work). This is supported across multiple ZIP files.
  • Implements 2 custom columns (as per standard LogParser). They are LogParserZIPFilename, LogParserRecordNumber in addition to returned columns. This is to provide extra querying features.
  • Logs any errors to console and to EventLog.Application
  • Supports all Log Parser data types except TIMESTAMP_TYPE, see challenges below.

*************************************************

Download. VB.NET source is included in download.

*************************************************
Note: I've given the code a good test, however there may be unexpected bugs. As this code has overwrite and delete file features please ensure that you test the software.

To install:
1. Copy the LogParser.ZIP folder to a location on your harddrive
2. Run the command .\LogParser.ZIP\InstalldotNETasCOM.bat (need GACUTIL.EXE - part of the .NET FW 2.0 SDK - why?). This will install into the GAC the 3 components being Gluegood.LogParser.ZIP, Interop.MSUtil.dll (LogParser wrapper) and ICSharpCode.SharpZipLib.dll.
3. Change directory to the location where LogParser.ZIP can extract and then delete the unpacked zip files and directories. E.g. cd c:\test
4. Run your Log Parser query. e.g.

LogParser.exe -i:COM "select * From 'C:\Test\Test.zip'" -iProgId:Gluegood.LogParser.ZIP -iCOMParams:iQuery="Select * From *.txt",iInputParameters="-i:CSV -HeaderRow:Off" -o:DataGrid

Syntax

LogParser.exe -i:COM "select * From {ZIPFile(s)}'" -iProgId:Gluegood.LogParser.ZIP -iCOMParams:iQuery="{LogParserQuery}",iInputParameters="{LogParser InputParameters}" -o:DataGrid

*************************************************

Challenges
1. Calling Log Parser APIs within VB.NET
The LogParser.ZIP COM Plugin is a VB.NET program that is COM interop enabled (to allow for LogParser to call it). Additionally it calls the LogParser COM API to execute LogParser queries thus making the whole interfacing rather technically challenging.

a) GAC, assemblies and DLLs
To enable LogParser to call the LogParser.ZIP COM input plugin you need to give it a strong name, register it and then publish it to the Global Assembly Cache (GAC). Because LogParser.ZIP resides in the GAC, you also need to create a .NET wrapper for the Log Parser COM APIs (LogParser.dll), give it a strong name, and place it too into the GAC. After a number of trials I found that the TLBIMP program did just this :-

C:\Program Files\Microsoft Visual Studio 8\SDK\v2.0\Bin\tlbimp "C:\Program Files\Log Parser 2.2\LogParser.dll" /out:Interop.MSUtil.dll /keyfile:"GluegoodStrongKey.snk"

I then needed to Register the dotNet wrapper (Interop.MSUtil.dll) using REGASM.

C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\RegAsm.exe .\Interop.MSUtil.dll

I then needed to place Interop.MSUtil.dll into the GAC

"C:\Program Files\Microsoft Visual Studio 8\SDK\v2.0\Bin\gacutil" /i .\Interop.MSUtil.dll

(The InstalldotNETasCOM.bat does all of the above for you)

I then added a reference to the Interop.MSUtil.dll within my VB.Net project and can now call the LogParser COM API.

b) Calling late bound a COM dll using VB.NET reflection
To enable the greatest flexibility I needed to let users specify any input type, e.g. –i:CSV, -i:TSV etc. Therefore I needed to dynamically call the COM Input Context class, and in my research this seems only achievable using Reflection. Unfortunately there isn’t a lot of documentation on COM interop reflection against assemblies in the GAC (a bit unique, I’d suggest). I found 2 useful articles that provided guidance on what to do – I’d like to thank the authors for documenting this.

Effectively I late-bound to the COM input class using code from the example on this site. One gotcha was the assembly name I had to use when loading Interop.MSUtil.dll from the GAC. Initially I used just Interop.MSUtil; however the late-bound call didn’t work, so I used the full name for the assembly, being

Interop.MSUtil, Version=1.0.0.0, Culture=neutral, PublicKeyToken=048dde0ba838787f, processorArchitecture=MSIL

I then had to set the Input Parameters and I used a function called CallByName, which allows you to set object properties by name, instead of early binding (very cool).
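
Stripped down, the late-bound call looks something like the below. The context class name and the property I set are illustrative only - the real names come from the generated Interop.MSUtil wrapper and from whichever -i: input format the user asked for.

Dim oAssembly As System.Reflection.Assembly = System.Reflection.Assembly.Load( _
  "Interop.MSUtil, Version=1.0.0.0, Culture=neutral, PublicKeyToken=048dde0ba838787f, processorArchitecture=MSIL")
' Illustrative type name - resolved at runtime from the requested input format.
Dim oType As System.Type = oAssembly.GetType("MSUtil.COMCSVInputContextClassClass")
Dim oInputContext As Object = Activator.CreateInstance(oType)
' Set an input parameter by name instead of early binding against the type.
CallByName(oInputContext, "headerRow", CallType.Set, False)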

2. Deciding on ZIP API
There are 2 major free ZIP APIs available for .NET developers: the jslib ZIP methods, or the #ZipLib .NET library. I decided on the latter simply because it is open source like my solution, it has a smaller footprint (it wasn't an extra download), and comments in the community tended to favour it. Additionally it provides more than just ZIP extraction, which I thought may be useful. For the sceptics: it was really easy to implement within my code.

3. TIMESTAMP_TYPE datatype conversion issue
This remains unsolved, so if anyone has any ideas let me know. The LogParser.ZIP program extracts each value in the IRecordSet response and places it into a DataTable. In my code I match the DataTable column data types to the LogParser column data types. When I then return the values to LogParser (via the GetFieldType method) I re-convert the DataTable column types to the appropriate LogParser FieldType. For some very strange reason, delivering TIMESTAMP_TYPE on the sample data I tried performed extremely poorly compared to STRING_TYPE, e.g. 2 seconds compared to 5 minutes. I therefore decided not to convert TIMESTAMP_TYPE, but to pass any TIMESTAMP columns back as STRING_TYPE.

*************************************************

** Legal **
This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version.This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.You should have received a copy of the GNU General Public License along with this program; if not, write to the Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.