Archive

September 2008 (9)

Christchurch Code Camp - November 1st, 2008

Just announced is the second Code Camp to be held in Christchurch, New Zealand.

The Code Camp is going to be held on November 1st, 2008 and again it is going to be at the offices of Trimble Navigation, 11 Birmingham Drive, Riccarton (where I work).

[image]

Date: Saturday 1st November, Christchurch.  9am - 6pm
Location: Trimble Navigation, 11 Birmingham Drive, Christchurch
Cost: Free! (Lunch provided)

Theme:  Keeping It Real

The sessions are designed to showcase .NET related tools and techniques that will be useful to you as a developer, focusing on real-world topics that will be of immediate use.

Featuring mostly local presenters, it's a time to talk, socialise and connect with others in the local community. An optional dinner in the evening is an ideal way to finish the day (the great restaurant last year is still being talked about!)

For more details and to register for the event (remember it is free) please visit http://www.codecamp.net.nz


Christchurch Weather Live - Updated Blackberry Page

I have updated my Blackberry Weather page to have a similar look to the main weather site at http://weather.crowe.co.nz

The image below fits perfectly on my Blackberry 8310 - Screen Size is 320 * 240 Pixels

[image]

And here is how it looks inside the Blackberry 8310.

[image]

The URL for the BB weather is http://weather.crowe.co.nz/bb

The image that is downloaded is only 6 KB, so it is small enough that it does not consume your bandwidth and $$$$


SQL Server 2008 - Backup Compression

* UPDATED : Sep 15, 2008 to include HyperBac for SQL Server in the comparison results.

SQL Server 2008 now supports compression when backing up your databases, but note that this is only included in the Enterprise edition, which is a pity.

Note: Though creating compressed backups is supported only in SQL Server 2008 Enterprise and later, every SQL Server 2008 or later edition can restore a compressed backup.
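On an Enterprise instance you can also make compression the default for every backup, so that plain BACKUP DATABASE statements compress without specifying WITH COMPRESSION each time. A quick sketch using the documented "backup compression default" server option:

```sql
-- Make compression the default for all backups on this instance.
-- Individual BACKUP statements can still override this with
-- WITH COMPRESSION or WITH NO_COMPRESSION.
EXEC sp_configure 'backup compression default', 1;
RECONFIGURE;
```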

Note: The tests were made on a low-end test machine, so the absolute throughput figures are not impressive, but the overall results are important as percentages.

In this example I will show you some metrics when dealing with SQL Server backups on SQL Server 2008 Enterprise. I am also comparing Red-Gate SQL Backup and Quest SQL LightSpeed.

I have a database I have created called Performance which is currently using 15.3GB

sp_helpdb

-- SQL Server 2008 Backup with Compression
BACKUP DATABASE [Performance] TO  
    DISK = N'C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSQL\Backup\Performance_Compressed' 
    WITH  COPY_ONLY, NOFORMAT, INIT,  NAME = N'Performance-Full Database Backup', SKIP, NOREWIND, NOUNLOAD, 
    COMPRESSION,  STATS = 10

10 percent processed.
20 percent processed.
30 percent processed.
40 percent processed.
50 percent processed.
60 percent processed.
70 percent processed.
80 percent processed.
90 percent processed.
Processed 384760 pages for database 'Performance', file 'Performance' on file 1.
Processed 128 pages for database 'Performance', file 'Orders2001' on file 1.
Processed 128 pages for database 'Performance', file 'Orders2002' on file 1.
Processed 128 pages for database 'Performance', file 'Orders2003' on file 1.
Processed 128 pages for database 'Performance', file 'Orders2004' on file 1.
Processed 128 pages for database 'Performance', file 'Orders2005' on file 1.
Processed 128 pages for database 'Performance', file 'Orders2006' on file 1.
Processed 128 pages for database 'Performance', file 'Orders2007' on file 1.
Processed 128 pages for database 'Performance', file 'Orders2008' on file 1.
Processed 128 pages for database 'Performance', file 'Orders2009' on file 1.
100 percent processed.
Processed 1 pages for database 'Performance', file 'Performance_log' on file 1.
BACKUP DATABASE successfully processed 385913 pages in 150.111 seconds (20.084 MB/sec).


-- SQL Server 2008 Backup with No Compression
BACKUP DATABASE [Performance] TO  
    DISK = N'C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSQL\Backup\Performance_UnCompressed' 
    WITH  COPY_ONLY, NOFORMAT, INIT,  NAME = N'Performance-Full Database Backup', SKIP, NOREWIND, NOUNLOAD, 
    NO_COMPRESSION,  STATS = 10

10 percent processed.
20 percent processed.
30 percent processed.
40 percent processed.
50 percent processed.
60 percent processed.
70 percent processed.
80 percent processed.
90 percent processed.
Processed 384760 pages for database 'Performance', file 'Performance' on file 1.
Processed 128 pages for database 'Performance', file 'Orders2001' on file 1.
Processed 128 pages for database 'Performance', file 'Orders2002' on file 1.
Processed 128 pages for database 'Performance', file 'Orders2003' on file 1.
Processed 128 pages for database 'Performance', file 'Orders2004' on file 1.
Processed 128 pages for database 'Performance', file 'Orders2005' on file 1.
Processed 128 pages for database 'Performance', file 'Orders2006' on file 1.
Processed 128 pages for database 'Performance', file 'Orders2007' on file 1.
Processed 128 pages for database 'Performance', file 'Orders2008' on file 1.
Processed 128 pages for database 'Performance', file 'Orders2009' on file 1.
100 percent processed.
Processed 1 pages for database 'Performance', file 'Performance_log' on file 1.
BACKUP DATABASE successfully processed 385913 pages in 206.006 seconds (14.635 MB/sec).

Now the results:

Database Size as listed in SQL Server : 15,350 MB

                 Backup time   Backup speed       File size
  Uncompressed   206 Seconds   14.635 MB/Second   3,087,475 KB
  Compressed     150 Seconds   20.084 MB/Second     452,749 KB

So as a result the compressed backup outperformed the uncompressed backup as follows:

  • It completed in 72.86% of the time (150 seconds vs 206 seconds)
  • 37.23% faster throughput (20.084 MB/sec vs 14.635 MB/sec)
  • The backup file was only 14.66% of the uncompressed file size (452,749 KB vs 3,087,475 KB)
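The three comparison figures above come straight from the raw measurements; the arithmetic, sketched in Python for clarity:

```python
# Measurements from the two BACKUP DATABASE runs above.
uncompressed = {"seconds": 206.006, "mb_per_sec": 14.635, "size_kb": 3_087_475}
compressed   = {"seconds": 150.111, "mb_per_sec": 20.084, "size_kb": 452_749}

# Compressed backup time as a fraction of the uncompressed time.
time_pct = compressed["seconds"] / uncompressed["seconds"] * 100

# Throughput improvement of the compressed backup.
speedup_pct = (compressed["mb_per_sec"] / uncompressed["mb_per_sec"] - 1) * 100

# Compressed file size as a fraction of the uncompressed file size.
size_pct = compressed["size_kb"] / uncompressed["size_kb"] * 100

print(f"{time_pct:.2f}% of the time")       # about 72.9
print(f"{speedup_pct:.2f}% faster")         # about 37.2
print(f"{size_pct:.2f}% of the file size")  # about 14.7
```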

So how does this compare to the 3rd party tools?

Red-Gate - SQL Backup 5

Red-Gate SQL Backup 5 supports compression and encryption of 32-bit and 64-bit versions of SQL Server 2008, SQL Server 2005 and SQL Server 2000 (SP 3 or later)

  Configuration                            Backup time   Backup speed   File size
  Uncompressed                             155 Seconds   19.42 MB/Sec   3,088,078 KB
  Compressed Level 1                        95 Seconds   31.68 MB/Sec     532,149 KB
  Compressed Level 2                       119 Seconds   25.29 MB/Sec     448,821 KB
  Compressed Level 3                       128 Seconds   23.51 MB/Sec     412,893 KB
  256-bit Encryption, No Compression       162 Seconds   18.58 MB/Sec   3,088,078 KB
  256-bit Encryption, Compressed Level 1   115 Seconds   26.17 MB/Sec     532,149 KB
  256-bit Encryption, Compressed Level 2    95 Seconds   31.68 MB/Sec     448,821 KB
  256-bit Encryption, Compressed Level 3   126 Seconds   23.88 MB/Sec     412,893 KB

* Red-Gate supports 3 different levels of compression and two encryption settings, listed as 128-bit key and 256-bit key.

Quest - SQL LightSpeed 5.0

Quest SQL LightSpeed supports compression and encryption of 32-bit and 64-bit versions of SQL Server 2008,  SQL Server 2005,  SQL Server 2000 (Service Pack 4), SQL Server 7.0 (Service Pack 4)

  Configuration                                    Backup time   Backup speed   File size
  Uncompressed (did not work - still compressed)   118 Seconds   25.50 MB/Sec   565,343 KB
  Compressed Level 1                                92 Seconds   32.71 MB/Sec   565,456 KB
  Compressed Level 5                               114 Seconds   26.40 MB/Sec   394,567 KB
  Compressed Level 11                              906 Seconds    3.32 MB/Sec   376,114 KB
  256-bit Encryption, No Compression               115 Seconds   26.17 MB/Sec   565,399 KB
  256-bit Encryption, Compressed Level 1           119 Seconds   25.30 MB/Sec   565,379 KB
  256-bit Encryption, Compressed Level 5           116 Seconds   25.94 MB/Sec   393,634 KB
  256-bit Encryption, Compressed Level 11          709 Seconds    4.24 MB/Sec   376,114 KB

* LightSpeed supports 11 different levels of compression and 9 encryption settings, listed as 40-bit RC2, 56-bit RC2, 112-bit RC2, 128-bit RC2, 168-bit 3DES, 128-bit RC4, 128-bit AES, 192-bit AES, 256-bit AES

Hyperbac for SQL Server

HyperBac for SQL Server also supports compression and encryption of SQL Server backups; which behaviour you get is controlled by the extension of the backup file (.HBC, .ZIP, .HBC2, .HBE).

  Configuration                                     Backup time   Backup speed    File size
  Uncompressed (custom ext)                         210 Seconds   14.293 MB/Sec   3,087,475 KB
  Compressed, .HBC ext                              114 Seconds   26.50 MB/Sec      433,857 KB
  Compressed, .ZIP ext (Zip)                        101 Seconds   29.65 MB/Sec      433,022 KB
  Compressed, .HBC2 ext (FastZip)                   112 Seconds   26.852 MB/Sec     545,840 KB
  256-bit Encryption, No Compression (custom ext)   196 Seconds   15.30 MB/Sec    3,087,475 KB
  256-bit Encryption, Compressed, .HBE ext          119 Seconds   25.18 MB/Sec      434,211 KB

* HyperBac supports 2 levels of compression and 3 encryption levels, listed as AES-256, AES-192, and AES-128

 

Overall Fastest Backup

  Tool                        Backup time   Backup speed   File size
  Quest - SQL LightSpeed       92 Seconds   32.71 MB/Sec   565,456 KB
  Red-Gate - SQL Backup 5      95 Seconds   31.68 MB/Sec   532,149 KB
  Hyperbac for SQL Server     101 Seconds   29.65 MB/Sec   433,022 KB
  Microsoft SQL Server 2008   150 Seconds   20.08 MB/Sec   452,749 KB


Overall Smallest Backup File

  Tool                                     Backup time   Backup speed   File size
  Quest - SQL LightSpeed (Level 11)        906 Seconds    3.32 MB/Sec   376,114 KB
  Quest - SQL LightSpeed (Enc, Level 5)    116 Seconds   25.94 MB/Sec   393,634 KB
  Red-Gate                                 126 Seconds   23.88 MB/Sec   412,893 KB
  Hyperbac for SQL Server                  101 Seconds   29.65 MB/Sec   433,022 KB
  Microsoft SQL Server 2008                150 Seconds   20.08 MB/Sec   452,749 KB

 


URL Rewrite Module for IIS 7.0

The Microsoft URL Rewrite Module for IIS 7.0 provides a flexible rules-based rewrite engine that can be used to perform a broad spectrum of URL manipulation tasks, including, but not limited to:

  • Enabling user-friendly and search-engine-friendly URLs with dynamic web applications;
  • Rewriting URLs based on HTTP headers and server variables;
  • Web site content handling;
  • Controlling access to web site content.

The Microsoft URL rewrite module includes these key features:

  • Rules-based URL rewriting engine. Rewrite rules are used to express the logic of what to compare/match the request URL with and what to do if comparison was successful. Web server and site administrators can use rewrite rule sets to define URL rewriting logic.

  • Regular expression pattern matching. Rewrite rules can use ECMA-262 compatible regular expression syntax for pattern matching.

  • Wildcard pattern matching. Rewrite rules can use wildcard syntax for pattern matching.

  • Global and distributed rewrite rules. Global rules are used to define server-wide URL rewriting logic. These rules are defined within applicationHost.config file and they cannot be overridden or disabled on any lower configuration levels. Distributed rules are used to define URL rewriting logic specific to a particular configuration scope. This type of rules can be defined on any configuration level by using web.config files.

  • Access to server variables and http headers. Server variables and HTTP headers provide additional information about current HTTP request. This information can be used to make rewriting decisions or to compose the output URL.

  • Various rule actions. Instead of rewriting a URL, a rule may perform other actions, such as issue an HTTP redirect, abort the request, or send a custom status code to HTTP client.

  • Rewrite maps. A rewrite map is an arbitrary collection of name-value pairs that can be used within rewrite rules to generate the substitution URL during rewriting. Rewrite maps are particularly useful when you have a large set of rewrite rules, all of which use static strings (i.e. there is no pattern matching used). In those cases, instead of defining a large set of simple rewrite rules, you can put all the mappings between input URLs and substitution URLs as keys and values into the rewrite map, and then have one rewrite rule that references the rewrite map to look up the substitution URL based on the input URL.

  • UI for managing rewrite rules. Rewrite rules can be added, removed and edited by using "URL Rewrite Module" feature in IIS Manager.

  • GUI tool for importing of mod_rewrite rules. URL rewrite module includes a GUI tool for converting rewrite rules from mod_rewrite format into an IIS format. 
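To make the rules-based engine concrete, here is a minimal distributed rule as it would appear in a site's web.config; the rule name and URL pattern are made up for this example:

```xml
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- Rewrite the friendly URL /article/42 to the real handler URL -->
        <rule name="ArticleFriendlyUrl">
          <match url="^article/([0-9]+)$" />
          <action type="Rewrite" url="article.aspx?id={R:1}" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```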

To download the module use the following links

 



Update: Christchurch Live Weather has been updated

The small web site I started about 6 months back has been getting quite a bit of work done to it over the last few weeks. With the help of Simon (who provides the data for the site) I have updated it to include a forecast, a web cam image, history, and charts.

The history goes back to around the start of 2007, and the charts at this time are for a whole month only.

The main screen has not changed a lot but it does have a new chart that plots the wind direction of the previous 24 hours.

[image]

The forecast is updated about every 1-2 hours and I have included a chart of the temperature and rainfall for the period of the forecast. The forecast is currently generated for up to 6 days in the future.

[image]

The history section allows you to see the current month in this year and any previous years for which there is data, and then it displays the data for each month going back to January 2007.

You can drill into any month of any year, you can also drill into the days, hours and minutes, and you can chart any month.

[image]

The charts section has had a lot of work done lately, and all the charts are written in c# and rendered on the fly.

[image]

If you live in Christchurch, New Zealand or have an interest in this little project you can browse the site at http://weather.crowe.co.nz


Panoramio - have you been there lately?

Panoramio now has more than 2 million photos online (on October 10th, 2006 there were 50,000). When you are browsing around Google Earth and Google Maps you may have seen little pics of the areas you are looking at, and most of these come from Panoramio. Panoramio allows users to upload their photos, and these photos can then (if liked) be included in the Google index.

I personally have two photos included - one from Dubrovnik in Croatia and another from the Moroccan Desert.

http://www.panoramio.com/user/1946508

I had not been to the site for a month or two and noticed that they have a new feature called Look Around. This allows you to see other people's photos of the same location as your photos (or the photos you are looking at) and it is amazing. Microsoft has done a lot of research into a similar project (which is now live) called PhotoSynth - http://livelabs.com/photosynth/ - but that project goes well beyond this and is absolutely amazing in its own right.

I live in Christchurch, New Zealand, and in the centre of the city we have a Cathedral. Clicking on the link below will open this up and you will be able to see other people's photos which include the Cathedral, with smooth transitions between the different users' photos.

Christchurch Cathedral
http://nv0.panoramio.com/navigate.php?id=573765

Dubrovnik, Croatia
http://nv0.panoramio.com/navigate.php?id=11680114

 

Check it out at http://www.panoramio.com/ - they also have an API so you can display photos from Panoramio on your own web site: http://www.panoramio.com/api/


Developer Express - a free set of 60 components for Windows Forms and ASP.NET

I have used Developer Express products for years and years, going back to the days of Delphi 3 and possibly before. I really love the controls; they are nice looking and provide a lot of additional functionality over the default controls included with Windows Forms and ASP.NET.

They are now offering a set of 60 components for Windows Forms and ASP.NET for free.

  • Visual Studio 2005 and Visual Studio 2008 are fully supported

The controls include their XtraEditors, which covers TextEdit, CalcEdit, DateEdit, ButtonEdit, ColorEdit, FontEdit, PictureEdit, MemoEdit + lots more.


 

In addition there are buttons, check boxes, filter controls, tab controls..... and some other styling controls including custom forms and usercontrols.

 

There are too many to list here so check them out at http://www.devexpress.com/Products/Free/WebRegistration60/ and remember there are controls for ASP.NET as well.

Note: The applications you create with these controls can be distributed royalty free


Charting wind direction over time - a c# custom chart

I have been updating my weather site lately and have included charts to display data for each month. I decided to use the free Google chart API for most of the charts since they are easy and quite cool.

For those who have not seen the Google Chart API see - http://code.google.com/apis/chart/

Here are some live examples of what you can do with the chart API

http://chart.apis.google.com/chart?cht=p3&chd=t:60,40&chs=250x100&chl=Hello|World

 

http://chart.apis.google.com/chart?cht=p3&chs=220x100&chd=s:Hellob&chl=May|Jun|Jul|Aug|Sep|Oct

http://chart.apis.google.com/chart?cht=lc&chd=s:pqokeYONOMEBAKPOQVTXZdecaZcglprqxuux393ztpoonkeggjp&chco=676767&chls=4.0,3.0,0.0&chs=200x125&chxt=x,y&chxl=0:|1|2|3|4|5|1:|0|50|100&chf=c,lg,0,76A4FB,1,ffffff,0|bg,s,EFEFEF

 

Right enough of the Google API - go check it out at - http://code.google.com/apis/chart/

 

Now, I do not know what you would call the type of chart I wrote in c# (with the help of Simeon - see http://simeonpilgrim.com/blog/2008/09/08/timedirection-graph/), but here is an example showing the wind direction over the month. The inner portion of the chart is the oldest data and the outer area is the newest data.

 

[image: RenderWindDirectionChart]

The chart above is made with 744 data points, 1 per hour over the period of the month of August 2008.
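The idea behind the chart can be sketched as a time/direction-to-polar mapping, with the oldest samples near the centre and the newest at the rim. This is a hypothetical Python sketch of that mapping, not the actual c# source (which has not been released); the radii and function name are made up for illustration:

```python
import math

def wind_point(hour_index, direction_deg, total_hours=744,
               inner_radius=20.0, outer_radius=100.0, cx=0.0, cy=0.0):
    """Map one hourly wind sample to an (x, y) plot coordinate.

    The radius grows with time, so the oldest data sits on the inner
    portion of the chart and the newest data on the outer edge; the
    angle is the wind direction, measured clockwise from north.
    """
    r = inner_radius + (outer_radius - inner_radius) * hour_index / (total_hours - 1)
    theta = math.radians(direction_deg)
    # Screen coordinates: y grows downwards, so north points up (-y).
    return (cx + r * math.sin(theta), cy - r * math.cos(theta))

# A northerly wind (0 degrees) in the first hour plots straight up
# from the centre at the inner radius.
x, y = wind_point(0, 0)
```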

At this point I will probably refine it a bit more but I will release the source when I am happy with it.

To see more of these charts in action look at http://weather.crowe.co.nz/Charts.aspx


ClientRaw.TXT IIS7 Managed Module Handler

Some background

I guess first off you will be asking what is ClientRaw.txt?

Well, this is a file format that a particular software package (Weather Display - http://www.weather-display.com) uses to represent weather data. It is a space-delimited file, basically undocumented except for a great tool available at http://www.tnetweather.com/wd-parser.php

So basically this is a simple example of how the data can look (truncated):

12345 0.0 0.0 273 6.28 85 1027.7 0.0 9.8 514.2 0.000 0.000 17.1 52 0.0 1 0.0 0 0 2.6 -100.0 255.0 0.0 -100.0 -100.0 -100.0 

So the Weather Display software creates this file and uploads it via FTP to one or more sites. Clients then download this file using some tool, like a Vista Gadget (http://weather.cobbnz.com/weather/VistaGadget.aspx), that parses the file and does something useful with it.
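Parsing the record is trivial once the file is fetched; here is a minimal sketch in Python. The only structure I rely on is what is visible above: whitespace-separated fields with the literal "12345" header (field meanings are deliberately left out, since the format is undocumented apart from the parser tool mentioned earlier):

```python
# Truncated sample record, taken from the example above.
SAMPLE = ("12345 0.0 0.0 273 6.28 85 1027.7 0.0 9.8 514.2 0.000 0.000 "
          "17.1 52 0.0 1 0.0 0 0 2.6 -100.0 255.0 0.0 -100.0 -100.0 -100.0")

def parse_clientraw(text):
    """Split a clientraw.txt record into its raw string fields.

    The first field is always the literal header "12345"; anything
    else suggests the file is incomplete or was caught mid-upload.
    """
    fields = text.split()
    if not fields or fields[0] != "12345":
        raise ValueError("not a valid clientraw.txt record")
    return fields

fields = parse_clientraw(SAMPLE)
```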

I have been working with a colleague from work on a number of projects such as http://weather.cobbnz.com (his) and http://weather.crowe.co.nz (mine), the Vista Gadget and a few other small projects that use this data.

What we found was a problem between the new file being uploaded via FTP and the clients downloading it. The file on the FTP site gets updated every 15 seconds and, depending on how many clients are requesting it, you can quite easily end up getting errors about not being able to open the file because it is in use by another process.

My solution

So my solution, since the site is hosted on IIS 7, was to write a new simple managed module that would cache the file for 15 seconds, hopefully reducing the issues we have seen.

So basically the solution is this:

A user requests the file, we have a c# managed module that will attempt to deliver the file to the user using the following logic:

  • Is the file in the cache?
    • If it is in the cache, then deliver it to the client
    • If it is not in the cache, load the file from disk
      • If it successfully reads the file it writes it to another file on disk (secondary cache) and delivers it to the client
      • If it can not read the file from disk it tries to read the secondary cache file
        • If it can read the secondary cache file then it delivers it to the client
        • If it can not read the secondary cache then we return an error

Now remember that the client is requesting a TEXT file, a simple static file of about 1 KB. In IIS 7, when the site is using the managed pipeline, we can create a module to handle this file and, using a single line in web.config, have IIS use our module.

In the case I am trying to fix the site is hosted on www.godaddy.com which means we have no real ability to modify the server other than the limited tools they provide and the ability to modify web.config.

 

So now for some code (Note: this is the complete module source code)

using System;
using System.Collections.Generic;
using System.Text;
using System.Web;

namespace ClientRawHandler
{
    public class ClientRawHandler : IHttpHandler
    {
        #region IHttpHandler Members
        public const string CacheKey = "ClientRaw";
        public const string CacheKeyDisk = "ClientRaw.cache";
        public const string CacheKeyDiskErr = "ClientRaw.err";

        public bool IsReusable
        {
            get { return false; }
        }

        public void ProcessRequest(HttpContext context)
        {
            context.Response.ContentType = "text/plain";
            try
            {
                if (context.Cache[CacheKey] == null)
                {
                    // Attempt to read the physical clientraw.txt file
                    string ClientRawFilespec = System.IO.Path.Combine(context.Request.PhysicalApplicationPath, "clientraw.txt");
                    string ClientRawText = System.IO.File.ReadAllText(ClientRawFilespec);

                    if (ClientRawText.StartsWith("12345") == false)
                        throw new Exception("clientraw.txt did not start with the expected '12345' header");

                    // Save a secondary cache copy to disk in case the live file
                    // cannot be read after the in-memory cache entry has expired
                    string FileSpec = System.IO.Path.Combine(context.Server.MapPath("/Weather"), CacheKeyDisk);
                    System.IO.File.WriteAllText(FileSpec, ClientRawText);

                    int ClientRawCacheTimeoutInSeconds = Convert.ToInt32(System.Configuration.ConfigurationManager.AppSettings["ClientRawCacheTimeoutInSeconds"]);
                    context.Cache.Add(CacheKey, ClientRawText, null, DateTime.Now.AddSeconds(ClientRawCacheTimeoutInSeconds), System.Web.Caching.Cache.NoSlidingExpiration, System.Web.Caching.CacheItemPriority.NotRemovable, null);

                    context.Response.Headers.Add("ClientRawHandler", "ReadFromDisk");
                    context.Response.Write(ClientRawText);
                }
                else
                {
                    context.Response.Headers.Add("ClientRawHandler", "ReadFromCache");
                    context.Response.Write(context.Cache[CacheKey].ToString());
                }
            }
            catch(Exception ex)
            {
                try
                {
                    // On any error, try to return the secondary cache copy from disk.
                    string FileSpec = System.IO.Path.Combine(context.Server.MapPath("/Weather"), CacheKeyDisk);
                    if (System.IO.File.Exists(FileSpec) == false)
                    {
                        // No secondary cache file available, so return an error message
                        context.Response.Headers.Add("ClientRawHandler", "NoSecondaryCache");
                        context.Response.Write("-NoCacheVersion- " + FileSpec + " - " + ex.ToString());
                    }
                    else
                    {
                        string Text = System.IO.File.ReadAllText(FileSpec);
                        if (Text.StartsWith("12345") == false)
                        {
                            context.Response.Headers.Add("ClientRawHandler", "InvalidSecondaryCache");
                            context.Response.Write("-CacheVersionEmpty- " + FileSpec + " - " + ex.ToString());
                        }
                        else
                        {
                            context.Response.Headers.Add("ClientRawHandler", "ReadFromSecondaryCache");
                            context.Response.Write(Text);
                        }
                    }
                }
                catch (Exception ex1)
                {
                    context.Response.Headers.Add("ClientRawHandler", "Exception");
                    context.Response.Headers.Add("ClientRawHandlerException", ex1.Message.ToString());
                    context.Response.Write("-Unknown Exception-" + ex1.ToString());

                    string FileSpec1 = System.IO.Path.Combine(context.Server.MapPath("/Weather"), CacheKeyDiskErr);
                    System.IO.File.WriteAllText(FileSpec1, ex1.ToString());
                }
            }
        }

        #endregion
    }
}

We compile this code into a .DLL and copy it to the BIN folder on the web server. We then edit the web.config file so IIS knows about our handler.

Basically we need to add the following:

<system.webServer>
    <handlers>
      <add name="ClientRawHandler" path="clientraw.txt" verb="GET" type="ClientRawHandler.ClientRawHandler,ClientRawHandler" preCondition="integratedMode" />
    </handlers>
</system.webServer>
 

Now that is it.

When a client requests the file identified by the "path" attribute above (in our case "clientraw.txt"), IIS will use our handler to process it.

In conclusion

Using this managed-module approach has basically stopped the glitches that people were seeing. It cannot prevent the underlying error itself (if the file is being written while you try to read it, the read still fails), but falling back to the secondary cache with the last known good copy handles that case quite cleanly.