Ok, so maybe I'm obsessing a bit but I just couldn't help myself. I have created a... 1TB (yes, that's terabyte) database on my home PC. Not only is the database itself 1TB, but there is also a 500GB log file and a 500GB tempdb (all pre-grown), for a total of 2TB of database goodness.
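
For anyone wondering what "pre-grown" looks like in practice, it's roughly the shape of the sketch below. The database name and file paths are placeholders, not my actual layout; the sizes just mirror the numbers above. Sizing the files up front avoids autogrow pauses in the middle of the big inserts.

```sql
-- Placeholder names and paths; sizes match the figures mentioned above.
CREATE DATABASE Permutations
ON PRIMARY
(
    NAME = Permutations_data,
    FILENAME = 'D:\Data\Permutations.mdf',
    SIZE = 1TB
)
LOG ON
(
    NAME = Permutations_log,
    FILENAME = 'D:\Data\Permutations.ldf',
    SIZE = 500GB
);

-- tempdb can't be recreated, only resized; grow its data file up front too.
ALTER DATABASE tempdb
MODIFY FILE (NAME = tempdev, SIZE = 500GB);
```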

Now, why the madness you ask?

If you read the two previous posts, those probably have something to do with it. You see, my web server ran out of space trying to do this little exercise, so I had to change venue. To date I have generated permutations of strings from 1 to 6 characters in length, totaling a staggering

2,238,976,116 rows of data. (yes that's billion with a B)
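
For the curious, that number lines up exactly if the alphabet is the 36 characters a-z plus 0-9: 36 + 36^2 + 36^3 + 36^4 + 36^5 + 36^6 = 2,238,976,116. Something along these lines will grow such a table one length at a time; the table and column names here are purely illustrative, not my actual schema.

```sql
-- Illustrative schema: assumes the 36-character alphabet a-z and 0-9.
CREATE TABLE dbo.Alphabet (c char(1) NOT NULL PRIMARY KEY);

INSERT INTO dbo.Alphabet (c)
VALUES ('a'),('b'),('c'),('d'),('e'),('f'),('g'),('h'),('i'),('j'),
       ('k'),('l'),('m'),('n'),('o'),('p'),('q'),('r'),('s'),('t'),
       ('u'),('v'),('w'),('x'),('y'),('z'),
       ('0'),('1'),('2'),('3'),('4'),('5'),('6'),('7'),('8'),('9');

CREATE TABLE dbo.Words
(
    Word varchar(7) NOT NULL CONSTRAINT PK_Words PRIMARY KEY CLUSTERED
);

-- Length 1 is just the alphabet itself.
INSERT INTO dbo.Words (Word)
SELECT c FROM dbo.Alphabet;

-- Each pass cross joins the previous length against the alphabet,
-- so pass n inserts 36^n rows, which is why each length takes so much longer.
DECLARE @len int = 2;
WHILE @len <= 6
BEGIN
    INSERT INTO dbo.Words (Word)
    SELECT w.Word + a.c
    FROM dbo.Words AS w
    CROSS JOIN dbo.Alphabet AS a
    WHERE LEN(w.Word) = @len - 1;

    SET @len += 1;
END;
```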

How long does a query take to execute? Oh, just a few milliseconds. The very first query against the data took about 23 seconds, which isn't bad considering this is on what is classified as a "home PC" sporting a quad-core 2.8GHz processor with 8GB of RAM and a 7TB RAID 5 array across 6 disks. Once the query plan is cached, it takes merely the blink of an eye to return a word from the dictionary.
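
The lookups I'm timing are along the lines of the sketch below (using the illustrative names from earlier, not my real schema). It boils down to a single seek on the clustered key, which is why the warm response is so quick.

```sql
-- A single clustered-index seek on the primary key; near-instant once warm.
SELECT Word
FROM dbo.Words
WHERE Word = 'zebra';
```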

Populating this table took an ever-increasing amount of time. The strings of length 1 to 4 took anywhere from a blink to just under 30 seconds. The 5-character strings took just over 9 minutes, and the 6-character strings pushed 13 hours.

What's next? Why, 7-character strings of course. That will put this at over 70 billion rows, pushing 600GB of data space.
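
For the back-of-the-envelope math (again assuming the a-z plus 0-9 alphabet), the next pass alone is 36^7 = 78,364,164,096 new rows on top of the existing 2,238,976,116. A quick check:

```sql
-- 36^7 new rows for the 7-character pass, plus the rows already generated.
SELECT POWER(CAST(36 AS bigint), 7)              AS NewRows,
       POWER(CAST(36 AS bigint), 7) + 2238976116 AS TotalRows;
```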