Join #SQLPASS virtual chapters for free online #SQLServer learning!

SQLPASS virtual chapters (VCs) provide free SQL Server training year-round.

If you are not signed up already, consider signing up! Here’s a quick walk-through on how to join a VC:

If you do not have a SQLPASS account:

a. Go to the SQLPASS registration page

b. Fill in the required information and register

After successful login/registration:

a. Go to your myPASS page

b. Switch to the myChapters section

c. Under Virtual Chapters, you will see a list of virtual chapters. Join the ones you are interested in!

my PASS my Chapter Azure VC

Azure PASS VC’s next meeting: Kung Fu Migration to Windows Azure SQL Database


Speaker: Scott Klein, Technical Evangelist, Microsoft

Summary: As cloud computing becomes more popular and cloud-based solutions become the norm rather than the fringe, the need to efficiently migrate your databases is crucial. This demo-filled session will discuss tips and tricks, methods, and strategies for migrating your on-premises SQL Server databases to Windows Azure SQL Database (AKA SQL Azure). Focusing primarily on SQL Server Data Tools and the DAC Framework, this session will show how these tools can make you a kung-fu migration master.

About Scott: Scott Klein is a Corporate Technical Evangelist for Microsoft focusing on Windows Azure SQL Database (AKA SQL Azure) and related cloud-ready data services. His entire career has been built around SQL Server; he has worked with it since the 4.2 days. Prior to Microsoft he was a SQL Server MVP for several years, and followed that up by being one of the first four SQL Azure MVPs. Scott is the author of over a half-dozen books for both Wrox and Apress, including Pro SQL Azure. He can be found talking about Windows Azure SQL Database, database scalability, and performance at events large and small, wherever he can get people to listen – SQL Saturday events, local SQL Server user groups, and TechEd.

Details at

Download the calendar file:

How to join the Azure PASS VC?

If you want to stay updated on meeting announcements, please consider registering on PASS’s website and joining our VC:

If you do not have a SQLPASS account:

a. Go to the SQLPASS registration page

b. Fill in the required information and register

After successful login/registration, go to your myPASS page:

a. Switch to the myChapters section

b. Under Virtual Chapters, you will see a list of virtual chapters. Join the ones you are interested in!

my PASS my Chapter Azure VC

I look forward to seeing you at the next Azure PASS VC meeting!

Hadoop on Azure’s JavaScript Interactive Console has basic graphing functions:

The Hadoop on Azure JavaScript console has basic graphing functions: bar, line, and pie. I think this is great because it gives an opportunity to visualize data that’s in HDFS directly from the Interactive JavaScript Console! Here’s a screenshot:

hadoop on azure bar and line graph javascript

In the console, I ran the help(“graph”) command to see how to use these functions:
graph.bar(data, options) Bar graph
graph.line(data, options) Line graph
graph.pie(data, options) Pie chart

data (array) Array of data objects
options (object) Options object, with
x (string) Property to use for x-axis values
y (string) Property to use for y-axis values
title (string) Graph title
orientation (number) x-axis label orientation in degrees
tickInterval (number) x-axis tick interval
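Based on that help output, a call might look like the sketch below. The data array and its property names are made up for illustration; graph.bar is a built-in of the Hadoop on Azure console, so the call is guarded here so the snippet is also valid plain JavaScript outside the console:

```javascript
// Hypothetical sample data: tweet counts per day (made-up values)
var data = [
  { day: "Mon", tweets: 120 },
  { day: "Tue", tweets: 340 },
  { day: "Wed", tweets: 210 }
];

// The options object maps object properties onto the axes,
// per the help("graph") output above.
var options = {
  x: "day",             // property used for x-axis values
  y: "tweets",          // property used for y-axis values
  title: "Tweets per day",
  orientation: 45       // x-axis label rotation, in degrees
};

// graph.bar exists only inside the Interactive JavaScript Console;
// guard the call so this snippet also runs elsewhere without error.
if (typeof graph !== "undefined") {
  graph.bar(data, options);
}
```

The same data and options shapes work for graph.line and graph.pie, per the help listing.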


In this blog post, I showed that Hadoop on Azure’s JavaScript Interactive Console has basic graphing functions.

Related articles:

How to load Twitter data into a Hadoop on Azure cluster and then analyze it via the Hive add-in for Excel?

In this blog post, we will:

1. Upload Twitter Text Data into Hadoop on Azure cluster

2. Create a Hive Table and load the data uploaded in step 1 to the Hive Table

3. Analyze data in Hive via Excel Add-in

Before we begin, I assume you have access to Hadoop on Azure, have your sample data ready (don’t have any? learn how to get some from a blog post), are familiar with the Hadoop ecosystem, and know your way around the Hadoop on Azure Dashboard.

Here are the steps involved:

Step 1: Upload Twitter text data into the Hadoop on Azure cluster

1. Have the data to be uploaded ready! I am just going to copy-paste the file from my host machine to the RDP’ed machine – in this case, the Hadoop on Azure cluster.

For the purpose of this blog post, I have a text file containing 1500 tweets:

upload twitter text data to hadoop on azure

2. Open a web browser and go to your cluster in Hadoop on Azure

3. RDP into your Hadoop on Azure cluster

Remote Desktop into Hadoop on Azure cluster

4. Copy-paste the file. It’s a small data file, so this approach works for now.

uploading twitter text data to hadoop on azure hdfs cluster

Step 2: Create a Hive Table and load the data uploaded in step 1 to the Hive Table

1. Stay on the machine that you RDP’ed into.

2. Open the Hadoop command line (you’ll see an icon on your Desktop)

3. Switch to Hive:

write hive commands in hadoop on azure

4. Use the following Hive Commands:


CREATE TABLE TweetSampleTable (
id string,
text string,
favorited string,
replyToSN string,
created string,
truncated string,
replyToSID string,
replyToUID string,
statusSource string,
screenName string
);

LOAD DATA LOCAL INPATH 'C:\apps\dist\examples\data\tweets.txt' OVERWRITE INTO TABLE TweetSampleTable;

Note that for the purpose of this blog post, I’ve chosen string as the data type for all fields. The right choice depends on the data that you have; if I were building a real solution, I would spend more time choosing the right data types.

Step 3. Analyze data in Hive via Excel Add-in

1. Switch to Hadoop on Azure Dashboard

2. Go to the Hive console and run show tables; to verify that tweetsampletable exists.

show all tables in hive hadoop on azure

3. If you haven’t already, download and install the Hive ODBC Driver from the Downloads section of your Hadoop on Azure Dashboard.

4. I set up an ODBC connection to Hive by following the instructions here: How To Connect Excel to Hadoop on Azure via HiveODBC (en-US)

5. After that, open Excel. I have Excel 2010, 64-bit.

6. Switch to the Data tab > Hive Pane

7. Choose the Hive connection > select the table > select the columns > and off you go!

You have Hive data in Excel!

Hadoop on azure Hive Excel addin

Now go Analyze!


In this blog post, we saw how to load Twitter data into a Hadoop on Azure cluster and then analyze it via the Hive add-in for Excel.

For my archives: a few questions answered on the Windows Azure & SQL Azure MSDN forums

I normally blog about the answers that I give on MSDN forums. The answer on the forum is generally brief and to the point, and in the blog post I expand it to cover related areas. Here are the questions for which I didn’t write a blog post; I am just going to archive them for now:

Azure PASS VC session on Monday, 24th Sep 2012: Getting Started with Windows Azure

Join the Azure PASS VC’s session on “Getting Started with Windows Azure” on:

Date: 24th Sep (Monday)

Time: 11 AM Eastern Time; 8 AM Pacific; 8:30 PM India Time; You can download the event calendar from here

Speaker: Brian Prince, Principal Cloud Evangelist, Microsoft

Session Abstract: Windows Azure is Microsoft’s cloud platform for quickly building and running scalable applications. We will cover just what the cloud is, as an industry, and what Microsoft is offering. We will look into the data centers and how they work, and take a high-level view of all the components of the platform.

More Details:

components of windows azure

The new Azure portal is ALL HTML5!

The new Azure portal is HTML5 – so what? It means the portal is accessible from all devices! Don’t get me wrong, I am not against Silverlight; it’s just that it was a little limiting, because the portal was not accessible from, say, an iPad. So from the accessibility standpoint, I am happy!

Let me share a conversation I had with @krisherpi a few months back, when he was not able to access the Azure portal from a tablet he had just bought. At the time, I commented that I wished the portal were built using HTML5 so that we could have more device options for connecting to it – well, it seems the Azure team was already working on that!

So I just wanted to point this out. And this is just one of the many awesome features that were discussed at the Meet Windows Azure event (7 June 2012).

The HTML5-powered Azure portal – and it’s Metro-styled!

new azure portal html 5

Quick updates from the Meet Windows Azure event for data professionals

1. SQL Azure Reporting is generally available and backed by an SLA

2. You can now run SQL Server on VM roles

3. Azure was rebranded a while back, but a quick reminder: SQL Azure was renamed Windows Azure SQL Database, so in the “new” portal you’ll see “SQL Database” instead of SQL Azure.

I’ll blog about these features as and when I get a chance to play with them.

Read all updates here: Now Available: New Services and Enhancements to Windows Azure

And I updated

Get started on Windows Azure: Attend “Meet Windows Azure” event Online

On June 7th 2012, there’s an online event called “Meet Windows Azure” where Scott Gu and his Windows Azure team will introduce the Windows Azure platform. You can register here:

If you’re planning to attend, there’s a very interesting tweet-up planned called “Social meet up on Twitter for MEET Windows Azure on June 7th” – all you have to do is follow #MeetAzure and #WindowsAzure on Twitter and interact! Simple!

There’s an unofficial blog relay: if you write a post, tweet it to @noopman. Here is the blog relay:

Played with Microsoft research “Project Daytona” – MapReduce on Windows Azure

Recently, I played with Project Daytona, which is MapReduce on Windows Azure.

It seems like a great “data analytics as a service.” I tried the k-means and word-count sample applications that come bundled with the project runtime download:

The documentation that ships with the project guides you step by step through setting up the environment, but for those who are curious, here is a brief description of how I set it up:

1) Uploaded the sample data-sets to Azure Storage

2) Edited the configuration file (ServiceConfiguration.cscfg) to point to correct Azure Storage

3) Chose the instance size and the number of instances for the deployment

4) Deployed the binaries to Windows Azure (.cspkg and .cscfg)

5) Ran the Word Count Sample

6) Ran the K-means Sample
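For step 2 above, ServiceConfiguration.cscfg is a plain XML file, and the edit amounts to pointing a connection-string setting at your own Azure Storage account and (for step 3) setting the instance count. Here is a trimmed sketch of what such a file looks like; the role and setting names are illustrative rather than Daytona’s actual ones, and AccountName/AccountKey are placeholders:

```xml
<?xml version="1.0" encoding="utf-8"?>
<ServiceConfiguration serviceName="Daytona"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="MapReduceWorkerRole">
    <!-- Step 3: number of worker instances for the deployment -->
    <Instances count="2" />
    <ConfigurationSettings>
      <!-- Step 2: point the deployment at your Azure Storage account -->
      <Setting name="DataConnectionString"
               value="DefaultEndpointsProtocol=https;AccountName=YOUR_ACCOUNT;AccountKey=YOUR_KEY" />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>
```

Once the .cscfg points at the storage account holding the sample data sets, the .cspkg and .cscfg pair is what gets deployed in step 4.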

Conclusion: It was pretty amazing to run MapReduce on Windows Azure. If you are into Big Data, MapReduce, or data analytics, then check out “Project Daytona”.

That’s about it for this post. And what do you think about Project Daytona – MapReduce on Windows Azure?