SQL Azure – Alternative to what?

After attending a presentation on SQL Azure by Microsoft’s George Huey, an interesting question came to mind: what is the actual use for a cloud-based database that has over 1400 incompatibilities with SQL Server, given that it is not a viable alternative to a production database?
I would say that it is an alternative to MS Access!!!
Let me explain that. I definitely see the use for it when you need to deploy a solution quickly for an event or conference and you need a reliable backend for your application that fits a tight budget and an even tighter schedule, or, as George suggested, for a startup that has no money for a datacenter-hosted DB environment.
In these situations, deploying a 1 GB database to host non-confidential information for under $10/month seems like a no-brainer. But I would not recommend it when HIPAA compliance is required, for instance.

My thanks to George for a great presentation.

Scorpions – Last album

For those of us who come from an Eastern European country, Scorpions are a legend. Their songs made us strong and gave us the strength to go forward like a “Wind of Change”. But sadly, they have announced that their newest album, “Sting in the Tail”, is also their last.

Thank you for keeping us energized for so many decades!

Important Message to our Fans
It was always our pleasure, our purpose in life, our passion and we were fortunate enough to make music for you – whether it was live on stage or in the studio, creating new songs.
While we were working on our album these past few months, we could literally feel how powerful and creative our work was – and how much fun we were still having, in the process. But there was also something else: We want to end the Scorpions’ extraordinary career on a high note. We are extremely grateful for the fact that we still have the same passion for music we’ve always had since the beginning. This is why, especially now, we agree we have reached the end of the road. We finish our career with an album we consider to be one of the best we have ever recorded and with a tour that will start in our home country Germany and take us to five different continents over the next few years.
We want you, our fans, to be the first to know about this. Thank you for your never-ending support throughout the years!

http://www.the-scorpions.com/english/

What three events brought you here?

Recently Paul Randal (aka Mr. Tripp) started a web version of the tag game in the SQL Server blogging community, asking what were the three most important events that shaped your career and got you to this point in your life. It is a very difficult question because, unlike data, life events do not accept a SELECT TOP 3 … ORDER BY [ImportanceRating]. For humans, the sorting function for life-changing events is subjective; it depends on the impact each event had on the individual. It is a lot harder than most would think, and I spent the weekend reviewing the events that shaped my life and career. Here’s what I came up with:

1) My Family

Growing up in communist Romania, I was one of the fortunate kids who had access to technology thanks to my parents, both university Computer Science professors. My father still teaches databases, from the 101 course to Oracle PL/SQL, as well as Assembly Language. My mom is now retired, but she used to teach the Fundamentals of Computer Programming to freshman students. As a kid I got to see all the different generations of technology, from ferrite-core memory to BASIC programming on a Romanian replica of the Sinclair ZX Spectrum+ Z80 home computer. I have to thank my parents for explaining to me how things work; as a very inquisitive boy I absorbed all that information without realizing that one day it would help me understand systems as a whole and see the big picture where others only see the frontend. The first database I worked on was dBase III, the precursor of FoxPro (anybody who used it understands why some people call database tables “files”). The interesting side to this is that while I was hooked on computers, my sister decided to go to law school, only to finally land in the wireless communications industry.

2) Discovering relational algebra

During my third year at the Politehnica University I found an opportunity to apply for an EU scholarship under the Erasmus Programme. I remember that one of the requirements was to be fluent in French, and because of that only 3 or 4 students applied, and I was selected. I chose the “DB and Artificial Intelligence” major at “Universite de Paris 6 – Pierre & Marie Curie“, and one of the courses was “Relational Databases”. We started from the basics of relational algebra operations and from representing query plans as trees that can be transformed, all of which fascinated me. I kept up my interest after I returned home from my year in Paris, and I got my Software Engineering MS in “Database Systems Optimizations”.

3) Real world challenge

After graduation I decided to continue the family tradition and started teaching labs on “Programming Languages” and “Relational Database Design” at the Politehnica University. A few years later I decided to get a job as an Oracle DBA. On my first day on the job they gave me a desktop and some form of an install manual for Oracle and their product, and asked me to read it and ask around if I had questions. Two days later, to everybody’s surprise, I had their product installed and running on Oracle 9i even though it was designed for 8i. This was one of the first moments when I realized that I like challenges and that I can live up to expectations.
A few years later came the moment when I realized just how lucky I am: I found out that I was one of those selected for the Diversity Visa program to get a Green Card and move to the US.

From that point forward I did not find it hard to get a job (I got my first job in the US in three days), but I learned that the most important thing is to find the right job and the right employer, one that would challenge me, just as Brent Ozar wrote in his post on the same subject:

I want to be a successful employee in my employer’s eyes, but when I take a job, one of the questions is, “One year after someone’s taken this position, what does success look like?  What is the best employee doing?  How are you rewarding them for what they’ve done?” 
In IT, this question takes people by surprise, but the answers reveal a lot.

To this day I can proudly say that I have exceeded all my employers’ expectations, but I cannot say the same about all the jobs I have held.

I’m going to tag Ted Krueger, Jorge Segarra, Mladen Prajdic and Jonathan Kehayias to see how they answer this question. The list does not have to end here; if you would like to answer this question, please link back to this post or to the original post by Paul.

Marrons Chauds, Marrons Chauds

Tonight we started a fire in the outdoor fireplace, enjoyed its warmth, and in the end I roasted chestnuts in the hot embers. It’s funny how memories are triggered by simple things like a smell or an object.


All of a sudden, while roasting the chestnuts, my mind wandered back to the time when I was a student in Paris. At this time of year, once the cold weather comes, a lot of street vendors start selling roasted chestnuts, yelling “Marrons chauds! Marrons chauds!” to get people to buy them. This was an interesting experience since it was the first time I roasted chestnuts myself. Some got burned, but most of them were delicious and we had a good time around the fire.

Query optimizer pains

As requested by Yavor Angelov from MSFT via Twitter, I’m writing down what seem to be my biggest pains with the query optimizer (QO) in SQL Server.

1)
Whenever you have a join between large tables, the QO will look at the indexes that exist on the join keys. If some indexes are missing, it seems that the default join type chosen is Nested Loops, which is OK for fast first results but is really slow if memory is scarce on the server or there is a high load.
Also, this kind of plan is hard to execute in parallel.
Usually in this situation I have to look at the execution plan and suggest the proper plan to the QO by creating the missing indexes.
I think that creating a temporary index when the cardinality of the joined tables is high would help a lot to change the QP from nested loops to hash joins, which can take advantage of multiple processors. The time to create the missing nonclustered index is also very low.
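As a minimal sketch of that manual workaround (the table and column names below are made up for the example), you create the missing nonclustered index on the join key and then compare the execution plans before and after:

-- Hypothetical tables: Orders and OrderDetails, both large, joined on OrderID.
-- Create the missing nonclustered index on the join key (names are illustrative).
CREATE NONCLUSTERED INDEX IX_OrderDetails_OrderID
    ON dbo.OrderDetails (OrderID);

-- Re-run the join and compare the actual execution plan; with the index in place
-- the optimizer has better cost estimates and more join strategies to choose from.
SELECT o.OrderID, o.CustomerID, od.ProductID, od.Quantity
FROM dbo.Orders AS o
JOIN dbo.OrderDetails AS od
    ON od.OrderID = o.OrderID;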

2)
A lot of times it seems that the QO is not looking at the statistics to apply selection first on the small tables and limit the result set that has to be joined with the other tables to a small number of rows.
The QO should calculate the QP cost using different methods and then pick the QP with the smallest cost. This does not seem to happen very well, especially when there is a mix of small and large tables to be joined.
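To illustrate the pattern I mean (the schema and filter below are hypothetical), the selective predicate on the small lookup table should ideally reduce the rows before the large tables are joined:

-- Hypothetical schema: Regions is a small lookup table, Customers and Orders are large.
-- Ideally the filter on Regions is applied first so only matching rows reach the big joins.
SELECT c.CustomerID, o.OrderID, o.OrderDate
FROM dbo.Regions   AS r
JOIN dbo.Customers AS c ON c.RegionID = r.RegionID
JOIN dbo.Orders    AS o ON o.CustomerID = c.CustomerID
WHERE r.RegionName = 'Midwest';   -- highly selective predicate on the small table
-- Comparing estimated vs. actual row counts in the plan shows whether the optimizer
-- pushed the selection down or joined the large tables first.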

To test these scenarios you can use a virtual server with minimal memory resources, and you’ll see that the differences between QPs become very evident.

Automating encryption tasks with PowerShell

Recently I answered a question on how to automate a task that runs a DTS package, encrypts the result with PGP, and then transfers the encrypted file to an FTP site. I became interested in this subject after hearing the same request from two people in less than a month, and realized that this is a standard pattern for a business process: Extract. Encrypt. Transfer.

The first person who asked me this found a solution in a .NET application designed to perform these tasks using existing .NET components.

I think an easier solution is a PowerShell script.

All we have to do is script the following:

#execute DTSPackage
DTSRun /S "(DBServer)" /N "DTSPackage" /A "StringPar":"8"="String Value" /A "IntPar":"3"="1" /W "1" /E
#Encrypt the result file (the recipient name is a placeholder for your PGP key)
gpg --encrypt --recipient "<recipient>" Datafile.csv
#transfer file over to FTP
function Open-FTPConnection($ftphost, $username, $password) {

[void][Reflection.Assembly]::LoadFrom("C:\IndyNet-Daily-20051007\Indy.Sockets.dll")
$ftp = new-object Indy.Sockets.FTP
$ftp.Disconnect()
$ftp.Host = $ftphost
$ftp.Username = $username
$ftp.Password = $password
$ftp.Connect()
$ftp.Passive=$true;
return $ftp
}
function Close-FTPConnection($ftp) {
    $ftp.Disconnect()
}

function Upload-FTPFile($ftp, $sourceFileName, $targetDir) {
    "Uploading {0} into {1}.." -f $sourceFileName, $targetDir
    $ftp.Put($sourceFileName, ($targetDir + $sourceFileName), $false)
    "Uploading of {0} into {1} is complete" -f $sourceFileName, $targetDir
}

# Get-FTPCurrentLocation is defined in the source linked below
$f = Open-FTPConnection "localhost" "foo" "bar"
Upload-FTPFile $f "C:[\Path]\Datafile.pgp" (Get-FTPCurrentLocation $f)
Close-FTPConnection $f

Source code found here:

http://groups.google.com/group/microsoft.public.windows.powershell/msg/6d9b5a4541160b25

The command-line GnuPG tool can be downloaded from here:

http://www.gnupg.org/

Of course this needs some polishing, but it can easily be reused.


SQL Server 2008 Failover Clustering with High Availability

I’m heading home from the August session of the Chicago PASS and SSUG meeting.
First off, this was the biggest meeting in Chicago so far: over 80 people signed up, and for a very good reason.
After a very interesting presentation on Microsoft SQL Server 2008 Administration with Windows PowerShell by The MAK and Yan Pan, Allan Hirt managed to cram in the essentials of the new MS Failover Cluster Manager, and he did it in less than an hour!
One very interesting and long-awaited feature is Live Migration. Basically, for the first time you can fail over an MS cluster without dropping any transactions. This only applies to planned failovers, of course, but it is a phenomenal achievement for the MS team. Until now, any time you failed over a cluster, the operation was the equivalent of stopping the SQL Server service and restarting it on a different machine; obviously, any pending transactions were dropped. With Live Migration, when a planned failover is initiated, data from memory is copied over to the new active node and all transactions are seamlessly ported over as well. There might be some performance decrease, but no transactions are dropped, achieving true High Availability.
And last but not least, all this is free with Windows Server 2008 Enterprise or Datacenter.
For more information, read this blog post from Microsoft:
http://blogs.technet.com/virtualization/archive/2009/05/06/microsoft-hyper-v-server-2008-r2-release-candidate-free-live-migration-ha-anyone.aspx

Thanks Allan!

How (not) to design a SQL Server table

I recently saw the worst example of how not to design a table and its indexes in SQL Server.

So here it is:

Table1(UserID,ActivityID,ActivityType,AddDate)

Role: log any activity that a user is invited to.

PrimaryKey: … Wait for it … tada: UserID, ActivityID, ActivityType, AddDate … Wait, you say … YES … all table columns are used for the primary key, and of course by default this becomes a clustered index. Which means that if you want to insert 100K records into this table, it will split a lot of pages to reorganize the index besides actually adding the table records.

Wait … it gets better…

Someone created a non clustered index on ActivityID, UserID, ActivityType, AddDate … yes, in that order … and NO, it does not help. But I guess not everybody knows that a non clustered index also carries the clustered index keys in addition to its own key columns.
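For reference, here is roughly what that design looks like in T-SQL (the data types are assumptions, since only the column names were given):

-- Sketch of the design described above; data types are illustrative assumptions.
CREATE TABLE dbo.Table1
(
    UserID       INT      NOT NULL,
    ActivityID   INT      NOT NULL,
    ActivityType TINYINT  NOT NULL,
    AddDate      DATETIME NOT NULL,
    -- Every column is part of the primary key, which defaults to the clustered index.
    CONSTRAINT PK_Table1 PRIMARY KEY CLUSTERED
        (UserID, ActivityID, ActivityType, AddDate)
);

-- The "helpful" nonclustered index: the same four columns again, in a different order.
-- Since the clustered key is carried in every nonclustered index anyway, this roughly
-- doubles the storage and the insert overhead for no real benefit.
CREATE NONCLUSTERED INDEX IX_Table1_Activity
    ON dbo.Table1 (ActivityID, UserID, ActivityType, AddDate);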

So let’s recap:

Table : 4 Columns

PK Clustered Index: Same 4 Columns

Non clustered index: Same 4 columns, effectively stored twice (index keys plus the clustered key)

Insert efficiency -> extremely poor.

Why did SQL Server cross the road?

I posted a comment on Kevin Kline’s blog regarding:

Microsoft Marketing Throws SQL Server Under the Bus

“A spokesperson for Microsoft said that the problem was not bandwidth but that its SQL Server database had reached excessive fragmentation levels caused by the tremendous surge of queries”.  Read about it here:

http://www.itnews.com.au/News/102445,windows-7-rc-download-crashes.aspx

and also here:

http://blogs.zdnet.com/Bott/?p=910

Why didn’t anyone involved bother to think through the fact that competitors and various elements of anti-Microsoft factions can make a lot of hay out of how Microsoft’s own product failed?!?

Just as politicians only ever admit to “wanting to spend more time with family”, a technology company shouldn’t point to their own product as the culprit of a major technology failure, especially when there are thousands of other reasons that the meltdown might’ve happened.

What are your thoughts?

Here are my thoughts.

I think the truth is somewhere in the middle. I have worked with both Oracle and SQL Server, and while I agree that in a lot of regards Oracle is a superior product, that is also because of their experience.
Let’s face it … Oracle started in 1979, while Microsoft licensed the SQL Server code base from Sybase in ’89.
In my opinion the first great products they had were Oracle 8 in 1997 and SQL Server 2005 in 2005. Right now Oracle has a stable, multi-tiered, scalable clustered architecture (including the app server), whereas Microsoft is still discussing deployment dates.
Also … Oracle’s query optimizer has IMO been one of the best in the industry since Oracle 8. Microsoft finally delivered a good (should I say … comparable) query optimizer in 2008.

So while SQL Server is a good product, we should compare apples to apples because, IMO, Oracle and Microsoft play in different leagues.

Where did the Microsoft Zune coders fail? Or: it’s a feature!!!

Eduard gives us the following  problem:

The result is that for one day every four years you realize the money spent on a Zune was wasted.

http://www.engadget.com/2008/12/31/30gb-zunes-mysteriously-begin-to-fail-at-12am-december-31st/

Now for the Microsoft solution (please refrain from ROTFL :-D!):

Follow these steps:

  1. Disconnect your Zune from USB and AC power sources.
  2. Because the player is frozen, its battery will drain—this is good. Wait until the battery is empty and the screen goes black. If the battery was fully charged, this might take a couple of hours.
  3. Wait until after noon GMT on January 1, 2009 (that’s 7 a.m. Eastern or 4 a.m. Pacific time).

http://www.zune.net/en-us/support/zune30.htm
