Mark S. Rasmussen improve.dk
Apr 21
2010

Recently I was doing some experimental AS3 development. Much to my surprise, simple collection classes like Stack/Queue are not available in the framework - I guess I'm spoiled, being used to the .NET Framework.

I ended up implementing a simple stack using an internal linked list. There's nothing exciting about the implementation, but I thought others might be able to use it, so here it is :)

StackNode.as

package dk.improve.collections
{
    internal final class StackNode
    {
        public var value:Object;
        public var next:StackNode;

        public function StackNode(value:Object)
        {
            this.value = value;
        }
    }
}

Stack.as

package dk.improve.collections
{
    public class Stack
    {
        private var head:StackNode;

        public function push(obj:Object):void
        {
            // The new node points at the old head (null when the stack is empty)
            var newNode:StackNode = new StackNode(obj);
            newNode.next = head;

            head = newNode;
        }

        public function pop():Object
        {
            if(head != null)
            {
                var result:Object = head.value;
                head = head.next;

                return result;
            }
            else
                return null;
        }

        public function peek():Object
        {
            if(head != null)
                return head.value;
            else
                return null;
        }
    }
}

Mar 17
2010

I will be giving two presentations at Miracle OpenWorld 2010 in April.

Working with PDF documents

In this session I'll be sharing my experiences with dissecting PDF documents for the iPaper system. We'll look at various commercial as well as open source tools for reading, editing and rasterizing PDF documents. What are some of the common pitfalls you'll run into when working with PDF documents, and what does a PDF really consist of internally? While some of the tools and libraries can also be used to create PDFs, this session will concentrate on working with existing PDF files.

SQL Azure for DBAs

I'll be presenting together with Martin Schmidt from Miracle. The abstract is still in the works. It's safe to say though that this will be no typical sales pitch - we'll be looking at what limitations you'll run into, what happens when you reach the limits of SQL Azure and generally what the sales pitch happily ignores.

I highly recommend going to MOW as this is one of the best DBA & developer conferences around, with networking and fun taking center stage! There's a beach party - what more do you need?

Feb 19
2010

On March 10th I’ll be giving a presentation at Odense .NET User Group on Scalability & Availability on the Microsoft platform.

As the session will be in Danish, here's the abstract translated into English:

In this session we'll look at how to ensure high availability and scalability of ASP.NET web applications by using multiple machines set up as a cluster. How do we ensure consistent sharing of sessions across machines within a cluster? NLB will be demonstrated as a clustering technology, along with its strengths and weaknesses. Likewise, the new IIS7 Application Request Routing extension will be demonstrated, including how it can be used in combination with NLB.

If you want to join, please sign up at LinkedIn.

Dec 09
2009

What do Flash, uploads, cookies and IIS load balancing have to do with each other? More than I'd like :(

When users need to upload files I often use the Flash based SWFUpload component. It allows for multiple file selection and progress display during upload. Handling the uploaded files on the .NET side is rather easy:

for (int i = 0; i < Request.Files.Count; i++)
{
    HttpPostedFile hpf = Request.Files[i];

    // ... Save / process the HttpPostedFile
}
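Saving each file might then look like this (a sketch, not from the original post - the target folder name is hypothetical):

// Hypothetical upload folder - adjust to your application.
// Path.GetFileName strips the client side path that some browsers include in FileName.
string targetDir = Server.MapPath("~/Uploads");
hpf.SaveAs(Path.Combine(targetDir, Path.GetFileName(hpf.FileName)));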

One of the arguments for using Flash for web designs is that it'll look the same in all browsers. While that is literally true, there are a number of functionality differences when it comes to Flash and cross-browser support.

There's a bug in all current Flash players that causes the Flash player to send Internet Explorer's persistent cookies, no matter what browser you're currently using. That is, if you've previously visited a given website in IE and you're now visiting it in Chrome/Firefox - yup, your IE cookies will be sent to the website instead of the Firefox/Chrome cookies! There's a good description and discussion at the SWFUpload site.

This bug poses a number of problems if you're using SWFUpload on a password protected site that relies on cookie based forms authentication. Whenever a file is uploaded, the user will appear to not be logged in. This is because the forms authentication ticket is stored in a cookie (which is correctly stored by Firefox/Chrome), but when the upload request is made, IE's cookies are sent instead, and those do not contain a valid forms authentication ticket cookie.

Luckily there’s a workaround for this. Basically we’ll need to tell our upload SWF the current SessionID as well as the contents of the forms authentication ticket cookie:

var flashVars = {
    ASPSESSID: "<%= Session.SessionID %>",
    AUTHID: "<%= Request.Cookies[FormsAuthentication.FormsCookieName] == null ? "" : Request.Cookies[FormsAuthentication.FormsCookieName].Value %>"
};

Now we need to modify our SWFUpload code so it sends the SessionID and ticket values in the query string to the upload page, so instead of calling:

UploadFile_Upload.aspx

We’ll call:

UploadFile_Upload.aspx?ASPSESSID=e2u35jfs0pvevfugkfnmm045&AUTHID=E7BA5BDD2D6E9FBBC7CF613352EF10E01E0E8B0AD9920F62A465BC0CA20FB9CC2BA67F95D5A82F5D30B3162D6DFB3EA7FD505456E5EA5407094D03C1D48E6EE0B80F85F1B6AFD5F52FDC14C2ED6D77A8
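Server-side, that URL could be composed like this (a sketch - the original wiring happens in the SWFUpload JavaScript configuration; the names follow the flashVars above):

// Sketch: build the upload URL, URL-encoding the values to be safe
HttpCookie authCookie = Request.Cookies[FormsAuthentication.FormsCookieName];

string uploadUrl = "UploadFile_Upload.aspx" +
    "?ASPSESSID=" + HttpUtility.UrlEncode(Session.SessionID) +
    "&AUTHID=" + HttpUtility.UrlEncode(authCookie == null ? "" : authCookie.Value);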

Now that we have the SessionID and ticket value, we can manually restore those cookies in Global.asax (or an HttpModule, it doesn't matter). We'll be doing the fix in Application_BeginRequest, as this allows us to fix the cookies before ASP.NET performs its validation and notices the missing session and forms authentication cookies.

using System;
using System.Web;
using System.Web.Security;

public class Global : HttpApplication
{
    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        fixCookie("ASP.NET_SessionId", "ASPSESSID");
        fixCookie(FormsAuthentication.FormsCookieName, "AUTHID");
    }

    private void fixCookie(string cookieName, string queryStringKey)
    {
        // Did we get a querystring value to override the cookie value?
        if (HttpContext.Current.Request.QueryString[queryStringKey] != null)
        {
            // Try to get the current cookie value
            HttpCookie cookie = HttpContext.Current.Request.Cookies.Get(cookieName);

            if (cookie == null)
            {
                /* If there's no cookie, add a new one and add it to the Response.Cookies collection.
                   Note that it HAS to be put in the Response.Cookies collection even though Request.Cookies
                   makes more sense.
                */ 
                cookie = new HttpCookie(cookieName, HttpContext.Current.Request.QueryString[queryStringKey]);
                Response.Cookies.Add(cookie);
            }
            else
            {
                /* If there's already a cookie (one from IE perhaps), overwrite its value with the querystring
                   provided value.
                */
                cookie.Value = HttpContext.Current.Request.QueryString[queryStringKey];
                HttpContext.Current.Request.Cookies.Set(cookie);
            }
        }
    }
}

Note that there is a security implication in doing this, as it allows for session hijacking if you're able to fake another user's SessionID and forms authentication ticket! Thus, make sure you handle this, or at least know the risks of not doing so.
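One possible mitigation - a sketch using the standard FormsAuthentication API, not part of the original fix - is to verify that the AUTHID value actually decrypts to a valid, non-expired ticket before restoring the cookie:

// Requires System.Web.Security. Returns true only if the supplied value
// decrypts to a valid, non-expired forms authentication ticket.
private static bool isValidTicket(string encryptedTicket)
{
    try
    {
        FormsAuthenticationTicket ticket = FormsAuthentication.Decrypt(encryptedTicket);
        return ticket != null && !ticket.Expired;
    }
    catch (Exception)
    {
        // Decrypt throws if the value isn't a valid encrypted ticket
        return false;
    }
}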

OK, so that fixes the SWFUpload issue, and it ran perfectly for some time. However, once I placed an IIS7 Application Request Routing based load balancer in front of the machine serving the upload applications, the issue from before reappeared, even though my original cookie handling code was still in place.

The reason for the resurrection of the cookie bug was to be found in the way ARR maintains client affinity:

IIS ARR will set a cookie on the client that basically contains a hash identifying the content server to which the client is bound. This is a very simple and neat client affinity solution, as there's no shared state on the IIS ARR machine itself. That makes it easy to combine a number of IIS ARR servers using NLB and let IIS ARR handle client affinity, simplifying the NLB setup.

However, since client affinity is handled by a cookie, that cookie was now suffering from the same bug as before. Basically, the IIS ARR load balancer thought it was receiving a request from a completely new client and therefore assigned it to a random content server, giving each upload request just a 1/[num_machines] chance of hitting the correct content server.

The solution is similar, though there is one major difference. The previous problem occurred on the actual content machines because they were missing a cookie value; this time it's the load balancer itself that's missing the cookie. Thus, deploying a fix on the content servers won't do any good.

We'll create a new HttpModule that performs the fix in BeginRequest - which occurs before IIS ARR assigns the request to a content server. To ensure this fix doesn't affect normal requests if something goes wrong, exceptions are silently ignored. This is generally bad practice, but in this case I really do not want to affect the load balancer, as an error there would take down the website for all users. Note that while the handling is very similar to the previous bit of code, this time we're modifying the actual Cookie header directly. If we don't, IIS ARR won't pick up the overwritten cookie values and will still send the user to a random content server.

using System;
using System.Text.RegularExpressions;
using System.Web;

namespace iPaper.Web.ArrCookieRestorer
{
    public class ArrCookieRestorer : IHttpModule
    {
        public void Dispose()
        { }

        public void Init(HttpApplication context)
        {
            context.BeginRequest += context_BeginRequest;
        }

        private void context_BeginRequest(object sender, EventArgs e)
        {
            try
            {
                HttpContext context = HttpContext.Current;
                string serverHash = context.Request.QueryString["ARRIPARRAffinity"];

                if (serverHash != null)
                {
                    string cookieHeader = context.Request.Headers["Cookie"];

                    if (cookieHeader != null)
                    {
                        if (cookieHeader.Contains("IPARRAffinity="))
                            cookieHeader = Regex.Replace(cookieHeader, "IPARRAffinity=[0-9a-f]+;?", "IPARRAffinity=" + serverHash + ";");
                        else
                            cookieHeader += "; IPARRAffinity=" + serverHash;

                        context.Request.Headers["Cookie"] = cookieHeader;
                    }
                    else
                        context.Request.Headers.Add("Cookie", "IPARRAffinity=" + serverHash);
                }
            }
            catch
            {}
        }
    }
}

Once you've compiled the HttpModule, we need to install it on the IIS ARR machine. On a default installation of IIS ARR you'll have your rewrite rules set up as global rules at the IIS level. However, if you install the HttpModule at the IIS level, you'll get the following exception on all requests:

The virtual path ‘null’ maps to another application, which is not allowed

Apparently this is a bug in IIS 7.0 on Windows Server 2008 which has been fixed in IIS 7.5 on Windows Server 2008 R2. As I'm still running vanilla Windows Server 2008 and IIS 7.0, I had to get around it by moving the rewrite rules into the default website, which handles all incoming requests.

Make sure there's a bin folder in the default website root and place your HttpModule assembly in there. Then set up your web.config like so:

<?xml version="1.0" encoding="UTF-8"?>
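<!-- A minimal sketch of the remaining config: the module registration follows the
     namespace/class shown earlier, the assembly name is an assumption, and the
     rewrite rules are site-specific placeholders. -->
<configuration>
    <system.webServer>
        <modules>
            <add name="ArrCookieRestorer" type="iPaper.Web.ArrCookieRestorer.ArrCookieRestorer, ArrCookieRestorer" />
        </modules>
        <rewrite>
            <rules>
                <!-- Your ARR rewrite rules go here -->
            </rules>
        </rewrite>
    </system.webServer>
</configuration>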

This adds our HttpModule so it’ll run for all requests - fixing any missing ARR client affinity cookies. Note that your rewrite rules will likely differ from mine.

Nov 11
2009

Obtaining the movie height & width of a Flash file is an easy task using the swfdump tool that comes as part of the swftools package. Here's an example of how to invoke swfdump from C# and read out the height & width of a given Flash file.

Start out by downloading one of the swftools releases. I'm using the latest development snapshot. I'll be using one of the Flash files I made in a previous blog post as a test file, but you can use any .swf file you want. The test Flash file is called test.swf.

Once you’ve extracted the swftools package and copied both the test file and the swfdump.exe file into your solution directory, we can now test it out manually:

D:\Webmentor Projekter\Blog\RetrievingSwfProperties>swfdump -X -Y test.swf
-X 500 -Y 375

By providing the -X and -Y switches swfdump only prints out the movie height & width. You can see all the switches on the swfdump man page. At this point it’s a simple matter of spinning up a swfdump process and parsing the output:

using System;
using System.Diagnostics;
using System.Globalization;
using System.Text.RegularExpressions;

static void Main(string[] args)
{
    // Set process properties
    Process p = new Process();
    p.StartInfo = new ProcessStartInfo("swfdump.exe", "-X -Y test.swf");
    p.StartInfo.CreateNoWindow = true;
    p.StartInfo.RedirectStandardOutput = true;
    p.StartInfo.UseShellExecute = false;
    p.Start();

    // Read all output, waiting for process to end
    string output = p.StandardOutput.ReadToEnd();

    // Regex that'll match both the width and height output - has to take care of potential decimals
    Match m = Regex.Match(output, @"-X (?<width>\d+(\.\d+)?) -Y (?<height>\d+(\.\d+)?)");

    // Convert width & height to doubles forcing en-US culture
    double width = Convert.ToDouble(m.Groups["width"].Value, new CultureInfo("en-US"));
    double height = Convert.ToDouble(m.Groups["height"].Value, new CultureInfo("en-US"));

    Console.WriteLine("Width: " + width);
    Console.WriteLine("Height: " + height);
    Console.Read();
}

Result:

Width: 500
Height: 375

Nov 05
2009

As little children we've all been taught that it's better to program defensively than to rely on exceptions being thrown. However, sometimes it's preferable to just hope for the best and catch exceptions if they happen.

Defending against the improbable

Say we have a web application that receives an ID through the query string and serves a file accordingly. Usually we'd write that like:

if(File.Exists(path))
    serveFile(path);
else
    serve404();

This is what I've been doing in a large image serving website I run. However, it recently struck me that on average about 99.9% of all requests map to existing files; only the remaining 0.1% of requests actually result in 404 errors.

Given the above ratio, if we dropped the File.Exists() check, out of 10,000 requests only 10 would result in a FileNotFoundException being thrown, while the rest would just be served as usual. That means in 99.9% of all requests we waste resources on performing a trivial File.Exists() check that we know will most likely come back true anyway. What's worse is that this will hit the hard drive and actually cost us an IO operation!

A local test

Observe the following two methods. serveFileUnsafely will serve a file under the assumption that it probably exists, relying on a FileNotFoundException being thrown if it doesn't. serveFileSafely will ensure the file exists before actually serving it (trusting that nothing happens between File.Exists() and File.ReadAllText()).

private static void serveFileUnsafely(string path)
{
    try
    {
        File.ReadAllText(path);
    }
    catch (FileNotFoundException)
    {
        Console.WriteLine("File does not exist!");
    }
}

private static void serveFileSafely(string path)
{
    if (File.Exists(path))
        File.ReadAllText(path);
    else
        Console.WriteLine("File does not exist!");
}

The following two methods will be used to measure the time taken to serve 100 requests. I have created 100 identical files named [1-100].txt, each containing just the text “Hello world!”. I have then deleted a random file so there are only 99 left. Thus, in this example we assume that only 99% of all requests map to existing files, even though the actual app has a success rate in excess of 99.9%. Note that the two methods each hit a separate folder - Test and Test2. This is to avoid any advantage of prewarming the cache before running the second test.

private static void testLocalUnsafely()
{
    for (int file = 1; file <= 100; file++)
        serveFileUnsafely(@"E:\Test\" + file + ".txt");
}

private static void testLocalSafely()
{
    for (int file = 1; file <= 100; file++)
        serveFileSafely(@"E:\Test2\" + file + ".txt");
}
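For completeness, the test files described above could be generated like this (a sketch - this setup step isn't shown in the post itself):

// Create 100 identical files in each test folder, then delete one at random
for (int file = 1; file <= 100; file++)
{
    File.WriteAllText(@"E:\Test\" + file + ".txt", "Hello world!");
    File.WriteAllText(@"E:\Test2\" + file + ".txt", "Hello world!");
}

Random rnd = new Random();
File.Delete(@"E:\Test\" + rnd.Next(1, 101) + ".txt");
File.Delete(@"E:\Test2\" + rnd.Next(1, 101) + ".txt");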

The actual profiling goes like this. I’ll be using my CodeProfiler class to make the measurements, running a total of 500 iterations using a single thread - as well as running an automatic warmup iteration.

TimeSpan safeLocalTime = CodeProfiler.ProfileAction(() => testLocalSafely(), 500, 1);
TimeSpan unsafeLocalTime = CodeProfiler.ProfileAction(() => testLocalUnsafely(), 500, 1);

Console.WriteLine("Safe: " + safeLocalTime.TotalSeconds);
Console.WriteLine("Unsafe: " + unsafeLocalTime.TotalSeconds);

And the results?

Safe: 4,211061
Unsafe: 4,9368481

Conclusion: Unsafe is 17% slower than safe.

Interestingly, the defensive method actually performs the best! It's easy to conclude that throwing the exception is somewhat more expensive than hitting the IO layer to check for the file's existence.

It's no coincidence that I mention the IO layer and not the disk in the above statement. As this is run locally on a machine with 8GB of memory, all 200 files are easily cached in memory - making it a pure in-memory operation both to check for the file's existence and to read the file contents. This can be verified by the CPU running at 100% while the test is running.

A remote test

Back to the real scenario. The web servers are not serving files off of local disks, they’re serving the files from a backend SAN exposed as CIFS shares.

I've copied the two test directories onto a remote share.

Two new methods have been added. They’re identical to the last ones except that they access a remote share mapped to the local drive Z.

private static void testShareUnsafely()
{
    for (int file = 1; file <= 100; file++)
        serveFileUnsafely(@"Z:\Test\" + file + ".txt");
}

private static void testShareSafely()
{
    for (int file = 1; file <= 100; file++)
        serveFileSafely(@"Z:\Test2\" + file + ".txt");
}

The testing is performed like this. Note that I'm only running 10 iterations + warmup here, as it'd otherwise take far too long.

TimeSpan safeShareTime = CodeProfiler.ProfileAction(() => testShareSafely(), 10, 1);
TimeSpan unsafeShareTime = CodeProfiler.ProfileAction(() => testShareUnsafely(), 10, 1);

Console.WriteLine("Safe: " + safeShareTime.TotalSeconds);
Console.WriteLine("Unsafe: " + unsafeShareTime.TotalSeconds);

The results?

Safe: 4,1287161
Unsafe: 3,1327967

Conclusion: Unsafe is 24% faster than safe.

As usual - it depends!

As can be seen, sometimes it’s best to defend against exceptions and sometimes it’s better to just hope for the best and catch the exception if it occurs. In my scenario throwing an exception was… Well. The exception. Make sure you always consider the cost of avoiding exceptions before you do so blindly.

Oct 28
2009

All of the following samples are based on the following table:

CREATE TABLE [dbo].[tblCars]
(
	[CarID] [int] IDENTITY(2,5) NOT NULL,
	[Name] [nvarchar](50) NOT NULL
)

Find identity column seed and increment values

We can use the IDENT_SEED, IDENT_INCR and IDENT_CURRENT functions to retrieve the identity seed and increment values, as well as the current value. Note that the next row will have IDENT_CURRENT() + IDENT_INCR() as its identity value.

SELECT
	IDENT_SEED('tblCars') AS Seed,
	IDENT_INCR('tblCars') AS Increment,
	IDENT_CURRENT('tblCars') AS CurrentIdentity

Result:

Seed    Increment    CurrentIdentity
2       5            17

An alternative way is to query the sys.identity_columns system view for the same values. Note that the sys.columns view (from which sys.identity_columns inherits) has an object_id column specifying the object ID of the table to which the column belongs. Thus we'll have to apply a predicate filtering away any columns not belonging to the desired table - tblCars in this example.

SELECT
	seed_value AS Seed,
	increment_value AS Increment,
	last_value AS CurrentIdentity
FROM
	sys.identity_columns
WHERE
	object_id = OBJECT_ID('tblCars')

Result:

Seed    Increment   CurrentIdentity
2       5           17

A third way of finding the current identity value is to use the DBCC CHECKIDENT command:

DBCC CHECKIDENT(tblCars, NORESEED)

Result:

Checking identity information: current identity value '22', current column value '22'.
DBCC execution completed. If DBCC printed error messages, contact your system administrator.

Changing the seed value

Using the DBCC CHECKIDENT command we can manually apply a new seed value to our table. Note that this will enable you to set an identity value that’ll cause the identity column to have duplicates unless you have a unique index on the column, in which case you’ll get an error instead. Thus, if you manually reseed the table, make sure you won’t run into duplicate values.

DBCC CHECKIDENT(tblCars, RESEED, 500)

Result:

Checking identity information: current identity value '22', current column value '500'.
DBCC execution completed. If DBCC printed error messages, contact your system administrator.

If for some reason the identity value has gotten out of sync with the values in the table, we can automatically reseed the table to a valid identity value. In the following case I've manually set the seed to 10 while the highest identity value in the table is 27. After running RESEED with no explicit value, the seed is automatically set to 27; thus the next inserted row will have an identity value of 32, given the increment of 5.

DBCC CHECKIDENT(tblCars, RESEED)

Result:

Checking identity information: current identity value '10', current column value '27'.
DBCC execution completed. If DBCC printed error messages, contact your system administrator.

Getting the maximum and minimum identity values

Using the IDENTITYCOL alias for the identity column of a table (of which there can be at most one), we can easily select the maximum and minimum identity values:

SELECT
	MAX(IDENTITYCOL) AS MaximumIdentity,
	MIN(IDENTITYCOL) AS MinimumIdentity
FROM
	tblCars

Result:

MaximumIdentity MinimumIdentity
27              22

Changing the identity increment value

Unfortunately there's no easy way to change the increment value of an identity column. The only way to do so is to drop the identity column and add a new column with the new increment value. The following code will create a new temporary table, copy the data into it, recreate the original table with the correct increment value, and then finally copy the data back using SET IDENTITY_INSERT to insert explicit values into the identity column.

BEGIN TRAN

-- Create new temporary table to hold data while restructuring tblCars
CREATE TABLE tblCars_TMP
(
	CarID int NOT NULL,
	Name nvarchar(50) NOT NULL
)

-- Insert tblCars data into tblCars_TMP
INSERT INTO tblCars_TMP SELECT * FROM tblCars

-- Drop original table
DROP TABLE tblCars

-- Create new tblCars table with correct identity values (1,1) in this case
CREATE TABLE [dbo].[tblCars]
(
	[CarID] [int] IDENTITY(1,1) NOT NULL,
	[Name] [nvarchar](50) NOT NULL
)

-- Reinsert data into tblCars table
SET IDENTITY_INSERT tblCars ON
INSERT INTO tblCars (CarID, Name) SELECT CarID, Name FROM tblCars_TMP
SET IDENTITY_INSERT tblCars OFF

COMMIT

Oct 21
2009

Access denied errors are not uncommon when deploying new websites or features that interact with the filesystem. While everything might work in local testing, it suddenly doesn't anymore once deployed. Using Process Monitor, I'll show how to easily debug these issues.

I’ve made a very simple web application project with a Default.aspx file that has the following codebehind code:

using System;
using System.IO;
using System.Web.UI;

namespace FileWritingWebsite
{
	public partial class _Default : Page
	{
		protected void Page_Load(object sender, EventArgs e)
		{
			File.WriteAllText(@"C:\Test.txt", "Hello world!");

			Response.Write("Done!");
		}
	}
}

After deploying this to my webserver, we receive the archetypical access denied error.

In this case it's rather obvious where the error stems from, but the cause isn't as clear. We're running under IIS, but we may be impersonating a user profile, running under a non-standard user account for the application pool (that is, not NETWORK SERVICE), or explicitly writing the file on a thread that's running under a different user account (which we are not, in this case).

Looking at the user permissions for C: it’s clear that no special permissions have been granted for the web user. Thus our task is first and foremost to identify the user that’s trying to write the file.

Once you start up Process Monitor, you'll quickly be swamped with data that's irrelevant to the task at hand. The first filter we'll apply is the overall event type filter. There are five standard event classes, of which the first four are enabled by default: Registry, File System, Network, Process & Threads and Profiling. As we're having an access denied issue with the file system, disable all but the File System events.

At this point the number of events should already be filtered down a lot - down to 32% in my case. Now click the cyan funnel icon to open up the filter editor window.

Since we know IIS runs under the w3wp.exe process, we can add a filter that includes all events with a process name of w3wp.exe. As soon as we add an Include filter, all events that do not match an include filter are excluded.

At this point the event list is somewhat more manageable. The important event is clearly the one with a result of “ACCESS DENIED”. That event shows we're trying to write (CreateFile) the C:\Test.txt file and we're receiving an ACCESS DENIED error from the file system.

Now right click the ACCESS DENIED event and go to Properties. Once you've opened the properties window, switch to the Process tab. At this point you'll be able to see the exact user account that tried to perform the denied action. As can be seen from the screenshot, it was the NETWORK SERVICE user in this case - the default IIS user.

Once we've identified the user account, it's a simple matter of granting it NTFS write permissions to the C:\ directory.

And finally we can run the website again and verify that we now have the proper permissions for writing the Test.txt file to the C:\ directory.

Not just for web applications

While I gave an example of a web application security issue, Process Monitor can be used to solve any kind of permission issue. You can use it to debug why Windows Services won't start properly, why Outlook is suddenly complaining about access denied errors, etc. Note that Process Monitor also allows you to monitor the registry, and can thus be used to solve registry security issues just as easily as file system ones.

Process Monitor is also a great tool for monitoring 3rd party applications to discover their exact usage of the file system and registry.