About Me

I now work for Microsoft Federal in Chevy Chase, MD.

Dedicated to providing customer-driven, results-focused solutions to the complex business problems of today... and tomorrow.

At SQLTrainer.com, LLC, we understand that the technical challenges businesses face today are far greater in both scope and complexity than ever before. Businesses must understand both local IT infrastructures and cloud-based technologies.

What is SQLTrainer.com?

Founded in 1998 by Ted Malone, SQLTrainer.com is a technical consulting, training and content development firm dedicated to the following core principles:

  • Technology Alone is NOT the Answer! - Implementing a particular technology simply because it is interesting or "cool" will not solve customer problems.
  • Technology Solutions do NOT need to be Overly Complex! - Developers and technical practitioners often attempt to build the cleverest solution possible. While this may stroke the egos of those involved, it rarely produces a maintainable solution.
  • Consultants Should be Mentors First! - When looking to hire an external consultant, businesses should look for the consultant who's willing to train themselves out of a paycheck.

Why the name, SQLTrainer.com?

SQL (pronounced See-Quell) stands for Structured Query Language, which is at the heart of every modern relational database system. Since many technology solutions today rely on some form of database storage or interaction, it was only logical to incorporate SQL into the name of the organization. Given that one of our core principles is to be a mentor/trainer above all else, the name SQLTrainer made sense. Since we also wanted to reflect our embrace of the cloud, it seemed logical to add ".com", referring to the biggest "cloud" of them all.

Live Feeds

Thursday, September 18, 2014 10:00:00 AM

Microsoft today announced that Azure HDInsight is now available for all customers in China as a public preview, making it the first global cloud provider to have a publicly available cloud Hadoop offering in China. With this launch, Microsoft is bringing Azure HDInsight’s ability to process big data volumes from unstructured and semi-structured sources to China. Both local Chinese organizations and multi-national corporations with offices in China can spin up a Hadoop cluster within minutes. 

Hadoop is an open-source platform for storing and processing massive amounts of data. By using Hadoop alongside your traditional data architecture, you can gain deep insights into data you never imagined being able to access. As an example, Blackball, a Taiwanese tea and dessert restaurant chain, was able to combine traditional data from its point-of-sale systems with new data from social sentiment and weather feeds to understand why customers purchase its products. By combining traditional sources with new “Big Data”, Blackball found that hot/cold weather was not really a factor in its sales of hot/cold drinks and was able to adjust to customer demand accordingly.

It is these types of results that are causing a wave of demand for big data. We invite you to try HDInsight today either in China or across the other Azure data centers. 

Learn more through the following resources:

Wednesday, September 17, 2014 11:05:00 AM

Microsoft Azure continues to offer customers more reasons why Azure Virtual Machines are an ideal place to develop, test and run your SQL Server applications. This promise is further enhanced with the release of new workload-optimized images for SQL Server that provide greater performance and simplified setup for running data warehousing and OLTP workloads in Azure. Four new SQL Server images are being released, with benefits outlined below:

  • New Data Warehousing Optimized Image for both SQL Server 2014 and SQL Server 2012 – These images simplify the setup process for customers by adding more automation, for example, automating the attachment of disks to the SQL Server VM running a data warehouse workload.
  • New OLTP Optimized Image for both SQL Server 2014 and SQL Server 2012 – These new images give customers better performance for high-I/O OLTP workloads. One of the key improvements in this tuned image is the ability to attach many disks to the SQL Server VM, which is critical for OLTP I/O because the number of disks has a direct impact on OLTP performance. In addition, new Windows features such as Storage Pools are used in multi-disk environments to improve I/O performance and latency.

The image below highlights the new optimized images available on Microsoft Azure.

Be sure to read the blog post “New VM Images Optimized for Transactional and DW Workloads in Azure VM Gallery” for a more detailed technical overview.

Continue learning more about Microsoft Azure, Virtual Machines, and how our customers are reaping the benefits of these technologies:

Try Microsoft Azure.

Learn more about Virtual Machines.

Read how Amway and Lufthansa leveraged Microsoft SQL Server 2014 and Windows Azure.

Friday, September 12, 2014 10:00:00 AM

We are delighted to announce the release of new optimized SQL Server images in the Microsoft Azure Virtual Machines Gallery. These images are pre-configured with optimizations for transactional and Data Warehousing workloads respectively by baking in our performance best practices for running SQL in Azure VMs.

What preconfigured VM images are available?

The following four new pre-configured VM images are now available in the Azure VM Gallery:

  • SQL Server 2014 Enterprise Optimized for Transactional Workloads on Windows Server 2012 R2
  • SQL Server 2014 Enterprise Optimized for Data Warehousing on Windows Server 2012 R2
  • SQL Server 2012 SP2 Enterprise Optimized for Transactional Workloads on Windows Server 2012
  • SQL Server 2012 SP2 Enterprise Optimized for Data Warehousing on Windows Server 2012

Currently we support these images on VM instances that allow up to 16 data disks attached to provide the highest throughput (or aggregate bandwidth). Specifically, these instances are Standard Tier A4, A7, A8 and A9 and Basic tier A4. Please refer to Virtual Machine and Cloud Service Sizes for Azure for further details on the sizes and options.

How to provision a VM from the gallery using the new transactional/DW images?

To provision a VM from an optimized transactional or DW image by using the Azure Management Portal:

  1. Sign in to the Azure Management Portal.
  2. Click VIRTUAL MACHINE in the Azure menu items in the left pane.
  3. Click NEW in the bottom left corner, and then choose COMPUTE, VIRTUAL MACHINE, and FROM GALLERY.
  4. On the Virtual machine image selection page, select one of the SQL Server images optimized for transactional or Data Warehousing workloads.
  5. On the Virtual machine configuration page, in the SIZE option, choose from the supported sizes.

    Please note that only Standard tier A4, A7, A8 and A9 and Basic Tier A4 are supported at this point and attempts to provision unsupported VM sizes will fail.
  6. Wait for the provisioning to finish. While waiting, you can see the provisioning status on the virtual machines page (as in the picture below). When the provisioning is finished, the status will be Running with a checkmark.

Alternatively, you can use the PowerShell cmdlet New-AzureQuickVM to create the VM. You will need to pass your cloud service name, VM name, image name, admin user name, password, and so on as parameters. A simple way to obtain the image name is to use Get-AzureVMImage to list all the available VM images.
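
For illustration, here is a minimal sketch of that approach using the Azure PowerShell (Service Management) cmdlets. The service name, VM name, size, credentials, and image filter below are placeholders; take the exact image name from the Get-AzureVMImage output, and note that a subscription must already be selected (Add-AzureAccount / Select-AzureSubscription).

    # Find one of the optimized SQL Server images in the gallery.
    # The label filter is illustrative; inspect the actual Label and ImageName values.
    $image = Get-AzureVMImage |
        Where-Object { $_.Label -like "*SQL Server 2014*Data Warehousing*" } |
        Sort-Object PublishedDate -Descending |
        Select-Object -First 1

    # Provision a VM from that image in a single call (placeholder names and credentials).
    New-AzureQuickVM -Windows `
        -ServiceName   "my-sql-service" `
        -Name          "sqldw-vm01" `
        -ImageName     $image.ImageName `
        -InstanceSize  "A7" `
        -AdminUsername "sqladmin" `
        -Password      "<strong password>" `
        -Location      "West US"

Specifying -Location creates a new cloud service; omit it if the cloud service already exists.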

What are the specific configurations included in the transactional/DW images?

The optimizations included in these images are based on the Performance Best Practices for SQL Server in Azure Virtual Machines. Specifically, they include:

 

 

Disk configurations

  • Number of data disks attached: 15 (both the Transactional and DW images)
  • Storage Spaces: two storage pools are created:
    - 1 data pool with 12 data disks; fixed size 12 TB; Column = 12
    - 1 log pool with 3 data disks; fixed size 3 TB; Column = 3
    One data disk remains for the user to attach and determine its usage.
    Stripe size: 64 KB (Transactional), 256 KB (DW)
  • Disk sizes, caching, allocation size: 1 TB each, HostCache = None, NTFS Allocation Unit Size = 64 KB

SQL configurations

  • Startup parameters: -T1117 to help keep data files the same size in case the database needs to autogrow, and -T1118 to assist with TEMPDB scalability (see here for more details)
  • Recovery model: no change (Transactional); set to SIMPLE for the MODEL database using ALTER DATABASE (DW)
  • Setup default locations: the SQL Server error log and trace file directories are moved to the data disks
  • Default locations for databases: system databases are moved to the data disks, and the location for creating user databases is changed to the data disks
  • Instant File Initialization: enabled
  • Locked pages: enabled (see here for more details)

 
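One quick way to confirm a few of these settings on a provisioned VM is to query the instance directly. The sketch below is illustrative: it assumes the SQL Server PowerShell module (SQLPS) that ships with SQL Server on these images and a default instance reachable as "localhost", and it only inspects settings rather than changing them.

    Import-Module SQLPS -DisableNameChecking

    # Trace flags enabled via startup parameters: -T1117 and -T1118 should be listed as globally enabled.
    Invoke-Sqlcmd -ServerInstance "localhost" -Query "DBCC TRACESTATUS(-1);"

    # Recovery model of the model database: SIMPLE on the DW image, unchanged on the transactional image.
    Invoke-Sqlcmd -ServerInstance "localhost" -Query "SELECT name, recovery_model_desc FROM sys.databases WHERE name = 'model';"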

FAQ

  • Any pricing difference between the optimized images and the non-optimized ones?
    No. The new optimized images follow exactly the same pricing model (details here) with no additional cost. Note that larger VM instance sizes carry correspondingly higher costs.
  • Any other performance fixes I should consider?
    Yes, consider applying the relevant performance fixes for SQL Server.
  • How can I find more information on Storage Spaces?
    For further details on Storage Spaces, please refer to Storage Spaces Frequently Asked Questions (FAQ).
  • What is the difference between the new DW image and the previous one?
    The previous DW image requires customers to perform additional steps, such as attaching the data disks after VM creation, while the new DW image is ready for use upon creation, so it is more streamlined and less error prone.
  • What if I need to use the previous DW image? Is there any way I can access it?
    The previous VM images are still available, just not directly accessible from the gallery. Instead, you can continue using the PowerShell cmdlets. For instance, you can use Get-AzureVMImage to list all images; once you locate the previous DW image based on its description and publish date, you can use New-AzureVM to provision it, as sketched below.
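
As an example, here is a hedged sketch of that flow with the same Service Management cmdlets; the description filter, names, size, and credentials are placeholders, not values taken from the gallery.

    # Locate the older DW image by inspecting descriptions and publish dates.
    $oldImage = Get-AzureVMImage |
        Where-Object { $_.Description -like "*Data Warehousing*" } |
        Sort-Object PublishedDate |
        Select-Object -First 1

    # Build a VM configuration from that image and provision it into a cloud service.
    $vmConfig = New-AzureVMConfig -Name "sqldw-legacy01" -InstanceSize "A7" -ImageName $oldImage.ImageName |
        Add-AzureProvisioningConfig -Windows -AdminUsername "sqladmin" -Password "<strong password>"

    New-AzureVM -ServiceName "my-sql-service" -Location "West US" -VMs $vmConfig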

Visit our Azure portal and give this new SQL VM image offering a try, and let us know what you think.

Let your colleagues know about the New VM Images available by sharing via your preferred social channels and don’t forget to follow @SQLServer on Twitter and find SQL Server on Facebook

Thursday, September 11, 2014 3:54:00 PM

Thank you to everyone who took the time to enter the contest #pass24hop Challenge!  As always, we had a great time listening to such passionate community speakers.

Congratulations to all the winners!  You will be notified via a Direct Message (DM) with details on how to redeem your free Microsoft Certification Exam.

So without further ado, the winners are……

Session #1

Taiob Alia @SQLTaiob correctly answered the question: “What did N.U.S.E stand for?” posed by Brent Ozar, who conducted the “Who Needs a DBA??” session.

Webcast Link: http://www.sqlpass.org/24hours/2014/summitpreview/Sessions/SessionDetails.aspx?sid=7273

Session #2

John Fewer @johnfewer correctly answered the question: “What is the default buffer size for SSIS pipelines?” posed by Brian Knight who conducted the “Performance Tuning SQL Server Integration Services”.

Webcast Link: http://www.sqlpass.org/24hours/2014/summitpreview/Sessions/SessionDetails.aspx?sid=7292

Session #3

Yang Shuai @shuaiyang correctly answered the question: “If it was required, what is the type of witness for quorum that would be used if using standalone instances of SQL Server in an availability group?” posed by Allan Hirt, who conducted the “Availability Groups vs Failover Cluster Instances: What’s the Difference?” session.

Webcast Link: http://www.sqlpass.org/24hours/2014/summitpreview/Sessions/SessionDetails.aspx?sid=7275

Session #4

Mike Cornell @DataMic correctly answered the question: “In the UPDATE demo, what city does John Smith move to?” posed by Kalen Delaney who conducted the “In-Memory OLTP Internals: How is a 30x Performance Boost Possible?”

Webcast Link: http://www.sqlpass.org/24hours/2014/summitpreview/Sessions/SessionDetails.aspx?sid=7276

Session #5

John Ludvig Brattas @intolerance correctly answered the question: “How many relationships can you define between two tables in Tabular?” posed by Marco Russo who conducted the “Create your first SSAS Tabular Model”.

Webcast Link: http://www.sqlpass.org/24hours/2014/summitpreview/Sessions/SessionDetails.aspx?sid=7277

Session #6

Sergio Pacheco @Sergmis correctly answered the question: “Using XE, how can you determine the number of times a query was executed during a specific timeframe?” posed by Erin Stellato and Jonathan Kehayias who conducted the “Everything You Never Wanted to Know about Extended Events”.

Webcast Link: http://www.sqlpass.org/24hours/2014/summitpreview/Sessions/SessionDetails.aspx?sid=7278

Session #7

The listeners were stumped on the question: “Who are the complainer hotel guests from the demo? What show are they from?” posed by Hope Foley who conducted the “Spatial Data: Looking Outside the Map”.

Webcast Link: http://www.sqlpass.org/24hours/2014/summitpreview/Sessions/SessionDetails.aspx?sid=7295

Session #8

The listeners were stumped on the question: “Without stats, what will the row estimation be for an equality predicate?” posed by Gail Shaw who conducted the “Guessing Games: Statistics, Heuristics, and Row Estimations”.

Webcast Link: http://www.sqlpass.org/24hours/2014/summitpreview/Sessions/SessionDetails.aspx?sid=7279

Session #9

@fja correctly answered the question: “What is the name of the executable that ships with SQL Server that can be used to collect diagnostic information?” posed by Tim Chapman and Denzil Ribeiro who conducted the “Troubleshoot Customer Performance Problems Like a Microsoft Engineer”

Webcast Link: http://www.sqlpass.org/24hours/2014/summitpreview/Sessions/SessionDetails.aspx?sid=7293

Session #10

Andrea Allred @RoyalSQL correctly answered the question: “What technology should you think about when you get a requirement to encrypt data on the fly?” posed by Argenis Fernandez who conducted the “Secure Your SQL Server Instance without Changing Any Code”.

Webcast Link: http://www.sqlpass.org/24hours/2014/summitpreview/Sessions/SessionDetails.aspx?sid=7280

Session #11

Theresa Iserman @Theresalserman correctly answered the question: “What are the 4 C’s of Hiring?”  posed by Joe Webb who conducted the “Hiring the Right People: Interviewing and Selecting the Right Team”.

Webcast Link: http://www.sqlpass.org/24hours/2014/summitpreview/Sessions/SessionDetails.aspx?sid=7294

Session #12

Andy Pho @andycsuf correctly answered the question:  “Why does Chris (@SQLShaw) recommend database mirroring as his primary technology choice to upgrade SQL Server?” posed by Chris Shaw and John Morehouse who conducted the “Real World SQL 2014 Migration Path Decisions”.

Webcast Link: http://www.sqlpass.org/24hours/2014/summitpreview/Sessions/SessionDetails.aspx?sid=7291

Session #13

Stephen Radford @stephen_radford correctly answered the question: “What are the first two cmdlets you should learn, when learning PowerShell?” posed by Robert Cain, Bradley Ball and Jason Strate who conducted the " Zero to Hero with PowerShell and SQL Server”.

Webcast Link: http://www.sqlpass.org/24hours/2014/summitpreview/Sessions/SessionDetails.aspx?sid=7281

Session #14

Mark Holmes @SQLJuJu correctly answered the question: “Which Excel data mining add-in offers functionality for the entire data mining lifecycle?” posed by Peter Myers who conducted the “Past to Future: Self-service Forecasting with Microsoft BI”

Webcast Link: http://www.sqlpass.org/24hours/2014/summitpreview/Sessions/SessionDetails.aspx?sid=7283

Session #15

P K Towett @pthepebble correctly answered the question: “SQL Server 2014 adds one piece of functionality to statistics maintenance, what is it?” posed by Grant Fritchey who conducted the “Query Performance Tuning in SQL Server 2014”.

Webcast Link: http://www.sqlpass.org/24hours/2014/summitpreview/Sessions/SessionDetails.aspx?sid=7284

Session #16

Allen Smith @cognitivebi correctly answered the question: “What is your first mission when trying to determine if indexes help or hurt you?” posed by Jes Borland who conducted the “Are your Indexes Hurting you or Helping You?”

Webcast Link: http://www.sqlpass.org/24hours/2014/summitpreview/Sessions/SessionDetails.aspx?sid=7296

Session #17

Andre Ranieri @sqlinseattle correctly answered the question: “What behavior is sometimes called 'the king of irreplaceable behaviors'?” posed by Kevin Kline who conducted the “Techniques to Fireproof Your IT Career”.

Webcast Link: http://www.sqlpass.org/24hours/2014/summitpreview/Sessions/SessionDetails.aspx?sid=7285

Session #18

Ginger Grant @DesertisleSQL correctly answered the question: “In what part of a predictive analytics project do people typically spend 60-80% of their time?” posed by Carlos Bossy who conducted the “Predictive Analytics in the Enterprise”.

Webcast Link: http://www.sqlpass.org/24hours/2014/summitpreview/Sessions/SessionDetails.aspx?sid=7286

Session #19

The listeners were stumped on the question: “What WSMan feature do you need to set up on your client system to remote to Azure VMs?” posed by Allen White who conducted the “Manage Both On-Prem and Azure Databases with PowerShell?”

Webcast Link: http://www.sqlpass.org/24hours/2014/summitpreview/Sessions/SessionDetails.aspx?sid=7287

Session #20

@wBob_uk correctly answered the question: “What is the name of the base theme used in the Adventure Works Power View demo today?” posed by Julie Koesmarno who conducted the “I Want It NOW!" Data Visualization with Power View”.

Webcast Link: http://www.sqlpass.org/24hours/2014/summitpreview/Sessions/SessionDetails.aspx?sid=7290

Session #21

Conan Farrell @SQL_Dub correctly answered the question: “What is the "mantra" for engineering a DWH solution?” posed by Davide Mauri who conducted the “Agile Data Warehousing: Start to Finish”.

Webcast Link: http://www.sqlpass.org/24hours/2014/summitpreview/Sessions/SessionDetails.aspx?sid=7289

Session #22

Nicky @ ADA @NickyvV correctly answered the question: “Which DAX function lets you access columns in other tables?” posed by Alberto Ferrari who conducted the “DAX Formulas in action”.

Webcast Link: http://www.sqlpass.org/24hours/2014/summitpreview/Sessions/SessionDetails.aspx?sid=7274

Session #23

Regis Baccaro @regbac correctly answered the question: “What R package do you need to use to connect R and PowerBI?” posed by Jen Stirrup who conducted the “Business Intelligence Toolkit Overview: Microsoft Power BI & R”.

Webcast Link: http://www.sqlpass.org/24hours/2014/summitpreview/Sessions/SessionDetails.aspx?sid=7288

Session #24

Anil Maharjan @Anil_Maharjan correctly answered the question: “Can the SSRS databases be clustered?” posed by Ryan Adams who conducted the “SQL Server AlwaysOn Quickstart”.

Webcast Link: http://www.sqlpass.org/24hours/2014/summitpreview/Sessions/SessionDetails.aspx?sid=7282

 

This is just a taste of what you can expect from PASS Summit 2014. To ensure you have a seat, register today at the PASS Summit 2014 registration page.

Wednesday, September 10, 2014 11:00:00 AM

We are excited to announce that the new SQL Database service tiers (Basic, Standard, and Premium) are now generally available. These service tiers raise the bar for what you can expect from a database-as-a-service, with business-class functionality that is both built-in and seamless to use, allowing you to dramatically increase the number of databases managed by a single database administrator.

Today is an important milestone for the Azure SQL Database community. Since the first public introduction in 2009, our journey has been influenced by our direct and deep engagements with customers and partners. Along the way we have increased the global scale and reach of the service, increased database sizes, and made it easier to run database diagnostics, to name a few improvements. Your drive to push the boundaries on what is possible in the cloud brought us to a million databases. Your feedback on what you need from a relational database-as-a-service helped us reimagine an approach that aligns best with the unique needs of cloud-based database workloads.

In April, we introduced the Basic, Standard, and Premium tiers into preview. These tiers address the needs of today’s demanding cloud applications by providing predictable performance for your light- to heavy-weight transactional applications, while also ensuring that the performance of your apps is no longer affected by other customer workloads. Additionally, the new tiers provide you with the following new capabilities:

  • Higher uptime SLA: previously 99.9%, uptime is now 99.99%, one of the highest in the database-as-a-service industry
  • Point-in-time restore, with built-in backups and up to 35 days of data retention (a restore sketch follows this list)
  • Active geo-replication and standard geo-replication options for continuous data replication to geographically dispersed secondaries
  • Larger database sizes: previously 150 GB, the maximum database size is now 500 GB
  • Auditing for added security confidence (remains in preview)
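
As an illustration of point-in-time restore, here is a hedged sketch using the Start-AzureSqlDatabaseRestore cmdlet from the Azure PowerShell module of that era; the server and database names are placeholders, and the exact parameter set should be verified against the installed module.

    # Restore a copy of a database to its state six hours ago (within the retention window for the tier).
    $pointInTime = (Get-Date).ToUniversalTime().AddHours(-6)

    Start-AzureSqlDatabaseRestore `
        -SourceServerName   "myserver" `
        -SourceDatabaseName "mydb" `
        -TargetDatabaseName "mydb-restored" `
        -PointInTime        $pointInTime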

Since the preview release in April, we have listened to important feedback on the new tiers and, as a result, pre-announced changes to GA based on this direct dialogue. A summary of the key changes:

  • New S0 performance level:  Within the Standard service tier, we have introduced an S0 performance level to ease the transition from Basic to Standard.
  • Premium and Standard price reductions:  Final pricing reflects up to 50% savings from previously-published GA pricing. GA pricing will take effect on November 1, 2014.
  • Hourly billing:  Starting today, Azure SQL Database will move to hourly billing in the new service tiers. 

I am incredibly excited about the value our ongoing investments will continue to deliver to you. Our customers Samsung, ESRI, Callaway Golf, and Pottermore, to name a few, are already using Azure SQL Database as a relational database service platform to help grow their cloud-based businesses. With an expanding portfolio of cloud services, including DocumentDB, Azure Search, Azure Machine Learning, and Azure HDInsight, and complementary data services from our partners, we’re committed to delivering a complete data platform that makes it easier for you to work with data of any type and size, using the tools, languages and frameworks you want in a trusted cloud environment.

Try the new SQL Database service tiers today.

Wednesday, September 10, 2014 10:00:00 AM

Earlier this week we talked about getting up and running on Azure to start exploring the new machine learning service. From here, you can really start to dig in, try out the capabilities of cloud ML, and put predictive analytics into practice.

Below you’ll find several more video tutorials that help you learn your way around the service. Check them out, and let us know what predictions you uncover or big ideas you solve.

  1. Getting and Saving Data in Azure ML Studio: Data access is the first step of the data science workflow. Azure Machine Learning supports numerous ways to connect to your data. This video illustrates several methods of data ingress in Azure Machine Learning.


     
  2. Pre-processing data in Azure ML Studio: Data preprocessing is the next step in the data science workflow and in general data analysis projects. This video illustrates the commonly used modules for cleaning and transforming data in Azure Machine Learning.


     
  3. R in Azure ML Studio: Azure Machine Learning supports R. You can bring your existing R code into Azure Machine Learning, run it in the same experiment with the provided learners, and publish it as a web service via Azure Machine Learning. This video illustrates how to incorporate your R code in ML Studio.


     
  4. Predictive Modeling with Azure ML: Azure Machine Learning features a palette of modules for building a predictive model, including state-of-the-art ML algorithms such as scalable boosted decision trees, Bayesian recommendation systems, deep neural networks, and decision jungles developed at Microsoft Research. This video walks through the steps of building, scoring, and evaluating a predictive model in Azure Machine Learning.


     
  5. Deploying a Predictive Model as a Service – Part 1: This video walks through creating a Web service graph for a predictive model and putting the predictive model into staging, using the Azure Machine Learning API service


     
  6. Deploying a Predictive Model as a Service – Part 2: Azure Machine Learning enables you to put a staging service into production via the Azure Management portal. This video walks through putting the predictive model staging service into production.


     

As a reminder, there are a ton of resources you can use to continue your learning:

Tuesday, September 9, 2014 10:00:00 AM

Earlier this summer at WPC, we announced the preview of Microsoft Azure Machine Learning, a fully-managed cloud service for building predictive analytics solutions. With this service, you can overcome the challenges most businesses have in deploying and using machine learning. How? By delivering a comprehensive machine learning service that has all the benefits of the cloud. In mere hours, with Azure ML, customers and partners can build data-driven applications to predict, forecast and change future outcomes – a process that previously took weeks and months.

But once you get your hands on Azure ML, what do you do with it? Some examples we already see happening include:

  • Consumer oriented firms with targeted marketing, churn analysis and online advertising
  • Manufacturing companies enabling failure and anomaly forecasting for predictive maintenance
  • Financial services companies providing credit scoring, bankruptcy prediction and fraud detection
  • Retailers doing demand forecasting, inventory planning, promotions and markdown management
  • Healthcare firms and hospitals supporting patient outcome prediction and preventive care.

So how can machine learning impact your organization? Walk through these tutorials and start exploring the possibilities. The video tutorials and learning resources below will help you to quickly get up and running on Azure ML.

  1. Create an Azure Account
    Before you begin, you must create an Azure account. Create a free trial here.

  2. Overview of Azure ML: Watch an overview of the Azure Machine Learning service: a browser-based workbench for the data science workflow, which includes authoring, evaluating, and publishing predictive models.


     
  3. Getting started with Azure ML Studio: Walk through a visual tour of the Azure Machine Learning studio workspaces and collaboration features.


     
  4. Introduction to Azure ML API Service: Learn about the Azure Machine Learning API service capabilities.


     
  5. Provisioning Azure ML Workspaces: Walk through steps needed to provision a Machine Learning workspace from the Azure Portal.

     

Look for more video tutorials later this week, when we’ll cover getting and saving data in Azure ML, pre-processing that data, how we handle R in Azure ML Studio, and deploying predictive models with Azure ML.

In the meantime, there are a ton of resources you can use to continue your learning:

Wednesday, September 3, 2014 10:00:00 AM

Calling all data junkies! How smart are you?  Want to get smarter?

Play along with the #pass24hop Challenge on Twitter starting at 5:00 AM PT Tuesday, September 9, 2014 to win a free Microsoft Exam Voucher! Simply watch 24 Hours of PASS and be the first to answer each question correctly. At the beginning of each of the 24 live 24 Hours of PASS sessions (approximately 5-8 minutes into each hour), a new question about the session will be posted on the @SQLServer Twitter account. The first tweet with the correct answer will win a prize. Your answer must include the hashtags #pass24hop and #24hopquiz.

To take part in the #pass24hop Challenge, you must:

  1. Sign in to your Twitter account. If you do not have an account, visit www.twitter.com to create one. Twitter accounts are free.
  2. Once logged into your Twitter account, follow the links and instructions to become a follower of @SQLServer.
  3. From your own account, reply with your answer to the question tweeted by @SQLServer.
  4. Your tweet must contain both the #pass24hop and #24hopquiz hashtags to be eligible for entry.
  5. Your tweet must include the complete answer to the question, or it will be disqualified.
  6. The first person to tweet a correct reply to the corresponding question will win the prize described below.

Register now for 24 Hours of PASS and get ready for 24 hours of play!  

To learn more about 24 Hours of PASS, read the official rules below.

 

NO PURCHASE NECESSARY. COMMON TERMS USED IN THESE RULES:

These are the official rules that govern how the 24 Hours of PASS Social Media Answer & Question Challenge (“Sweepstakes”) promotion will operate. This promotion will be simply referred to as the “Sweepstakes” throughout the rest of these rules. In these rules, “we,” “our,” and “us” refer to Microsoft Corporation, the sponsor of the Sweepstakes. “You” refers to an eligible Sweepstakes entrant.

WHAT ARE THE START AND END DATES?

This Sweepstakes starts at 5:00 AM PT Tuesday, September 9, 2014 and ends at 7:00 AM PT Wednesday, September 10, 2014 (“Entry Period”). The Sweepstakes consists of 24 prizes. Each Prize Period will begin immediately following each of the 24 sessions and run for 60 minutes.

CAN I ENTER?

You are eligible to enter this Sweepstakes if you meet the following requirements at time of entry:

· You are a professional or enthusiast with expertise in SQL Server or Business Intelligence and are 18 years of age or older; and

o If you are 18 years of age or older, but are considered a minor in your place of residence, you should ask your parent’s or legal guardian’s permission prior to submitting an entry into this Sweepstakes; and

· You are NOT a resident of any of the following countries: Cuba, Iran, North Korea, Sudan, and Syria.

PLEASE NOTE: U.S. export regulations prohibit the export of goods and services to Cuba, Iran, North Korea, Sudan and Syria. Therefore, residents of these countries/regions are not eligible to participate.

• You are NOT an employee of Microsoft Corporation or an employee of a Microsoft subsidiary; and

• You are NOT involved in any part of the administration and execution of this Sweepstakes; and

• You are NOT an immediate family (parent, sibling, spouse, child) or household member of a Microsoft employee, an employee of a Microsoft subsidiary, or a person involved in any part of the administration and execution of this Sweepstakes.

This Sweepstakes is void wherever prohibited by law.

HOW DO I ENTER?  

At the beginning of each of the 24 live 24 Hours of PASS sessions (approximately 5-8 minutes into each hour), a new question about the session will be posted on the @SQLServer Twitter account. The first tweet with the correct answer will win a prize. Your answer must include the hashtags #pass24hop and #24hopquiz. Failure to use these hashtags will automatically disqualify you.

To enter, you must do all of the following:

  1. Sign in to your Twitter account. If you do not have an account, visit www.twitter.com to create one. Twitter accounts are free.
  2. Once logged into your Twitter account, follow the links and instructions to become a follower of @SQLServer
  3. From your own account, reply with your answer to the question tweeted by @SQLServer.
  4. Your tweet must contain both the #pass24hop and #24hopquiz hashtags to be eligible for entry
  5. Your tweet must include the complete answer to the question, or it will be disqualified.
  6. The first person to tweet a correct reply to the corresponding question will win the prize described below.

Limit one entry per person, per session.  For the purposes of these Official Rules, a “day” begins 5:00 AM PT Tuesday, September 9, 2014 and ends at 7:00 AM PT Wednesday, September 10, 2014 (“Entry Period”). If you reply with more than one answer per session, all replies received from you for that session will be automatically disqualified.  You may submit one answer to each session, but will be eligible to win only one prize within the 24 hour contest period.

We are not responsible for entries that we do not receive for any reason, or for entries that we receive but are not decipherable for any reason, or for entries that do not include your Twitter handle.

We will automatically disqualify:

  • Any incomplete or illegible entry; and
  • Any entries that we receive from you that do not meet the requirements described above.

WINNER SELECTION AND PRIZES

The first person to respond correctly will receive a Microsoft Exam Voucher (approximate retail value: $150 each). A total of twenty-four prizes are available.

Within 48 hours following the Entry Period, we, or a company acting under our authorization, will select one winner per session to win one free Microsoft Certification Exam. The voucher has a retail value of $150. Prize eligibility is limited to one prize within the contest period. If you are selected as a winner for a session, you will be ineligible for additional prizes for any other session. In the event that you are the first to answer correctly in multiple sessions, the prize will go to the next person with the correct answer.

If there is a dispute as to who is the potential winner, we reserve the right to make the final decision on the winner based on the accuracy of the answer provided, whether the hashtag rules were followed, and the times the answers arrived as listed on www.twitter.com.

Selected winners will be notified via a Direct Message (DM) on Twitter within 48 business hours of the daily drawing. The winner must reply to our Direct Message (DM) within 48 hours of notification via DM on Twitter. If the notification that we send is returned as undeliverable, or you are otherwise unreachable for any reason, or you do not respond within 48 business hours, we will award the prize to a randomly selected alternate winner. Only one alternate winner will be selected and notified; after that, if unclaimed, the prize will remain unclaimed.

If you are a potential winner, we may require you to sign an Affidavit of Eligibility, Liability/Publicity Release within 10 days of notification. If you are a potential winner and you are 18 or older, but are considered a minor in your place of legal residence, we may require your parent or legal guardian to sign all required forms on your behalf. If you do not complete the required forms as instructed and/or return the required forms within the time period listed on the winner notification message, we may disqualify you and select an alternate winner.

If you are confirmed as a winner of this Sweepstakes:

  • You may not exchange your prize for cash or any other merchandise or services. However, if for any reason an advertised prize is unavailable, we reserve the right to substitute a prize of equal or greater value; and
  • You may not designate someone else as the winner. If you are unable or unwilling to accept your prize, we will award it to an alternate potential winner; and
  • If you accept a prize, you will be solely responsible for all applicable taxes related to accepting the prize; and
  • If you are otherwise eligible for this Sweepstakes, but are considered a minor in your place of residence, we may award the prize to your parent/legal guardian on your behalf.

WHAT ARE YOUR ODDS OF WINNING? 
There will be 24 opportunities to respond with the correct answer. Your odds of winning this Challenge depend on the number of responses and being the first to answer with the correct answer.

WHAT OTHER CONDITIONS ARE YOU AGREEING TO BY ENTERING THIS CHALLENGE? 
By entering this Challenge you agree:

· To abide by these Official Rules; and

· To release and hold harmless Microsoft, and its respective parents, subsidiaries, affiliates, employees and agents from any and all liability or any injury, loss or damage of any kind arising from or in connection with this Challenge or any prize won; and

· That Microsoft’s decisions will be final and binding on all matters related to this Challenge; and

· That by accepting a prize, Microsoft may use your proper name and state of residence online and in print, or in any other media, in connection with this Challenge, without payment or compensation to you, except where prohibited by law.

WHAT LAWS GOVERN THE WAY THIS CHALLENGE IS EXECUTED AND ADMINISTRATED? 
This Challenge will be governed by the laws of the State of Washington, and you consent to the exclusive jurisdiction and venue of the courts of the State of Washington for any disputes arising out of this Challenge.

WHAT IF SOMETHING UNEXPECTED HAPPENS AND THE CHALLENGE CAN’T RUN AS PLANNED? 
If cheating, a virus, bug, catastrophic event, or any other unforeseen or unexpected event that cannot be reasonably anticipated or controlled (also referred to as force majeure) affects the fairness and/or integrity of this Challenge, we reserve the right to cancel, change or suspend this Challenge. This right is reserved whether the event is due to human or technical error. If a solution cannot be found to restore the integrity of the Challenge, we reserve the right to select winners from among all eligible entries received before we had to cancel, change or suspend the Challenge. If you attempt to compromise the integrity or the legitimate operation of this Challenge by hacking or by cheating or committing fraud in ANY way, we may seek damages from you to the fullest extent permitted by law. Further, we may ban you from participating in any of our future Challenges, so please play fairly.

HOW CAN YOU FIND OUT WHO WON? 
To find out who won, send an email to v-daconn@microsoft.com by September 15, 2014 with the subject line: “SQL Server QQ Winners”.

WHO IS SPONSORING THIS CHALLENGE? 
Microsoft Corporation 
One Microsoft Way 
Redmond, WA 98052

Wednesday, August 27, 2014 10:00:00 AM

Do you have the “Data Gene”? 

Preparations for PASS Summit 2014 in Seattle, Washington are well underway.  We are very excited to have this year’s event back in Seattle and look forward to bringing you some great sessions and activities throughout the event.    

Tune in to this week’s TechNet Radio special to listen to Jennifer Moser, Lara Rubbelke, Ann Bachrach and SQL RockStar, Thomas LaRock talk about the “data gene” and why you don’t want to miss this year’s event.

Don’t procrastinate, get registered for PASS Summit 2014!  

Monday, August 25, 2014 2:55:02 PM

We are excited to announce the release of a SQL Server AlwaysOn template in the Microsoft Azure Portal Gallery. This offering was announced in Scott Guthrie’s blog post along with several other exciting new features.

This template fully automates the configuration of a highly available SQL Server deployment on Azure Infrastructure Services using AlwaysOn Availability Groups.

AlwaysOn Availability Groups

AlwaysOn Availability Groups, released in SQL Server 2012 and enhanced in SQL Server 2014, guarantee high availability for mission-critical workloads. Last year we started supporting Availability Groups on Azure Infrastructure Services. The main components of such a configuration are two SQL Server replicas (a primary and a secondary), and a listener (DNS name). The replicas are configured for automatic failover, and each replica is contained on a distinct Virtual Machine. The listener is a DNS name that client applications can use in their connection string to connect to the current primary replica. The image below shows a depiction of this setup.

Other components required are a file share witness, to guarantee quorum in the configuration and avoid “split brain” scenarios, and a domain controller to join all VMs to the same domain. Similar to the SQL replicas, there are a primary and a secondary domain controller to prevent a single point of failure for the domain. The SQL replicas are deployed to an availability set to ensure they are in different Azure failure and upgrade domains. Likewise, the domain controller replicas are in their own availability set. The configuration is depicted in the image below.
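
To illustrate the role of the listener, here is a minimal PowerShell connection sketch; the listener name, port, and database are placeholders, and MultiSubnetFailover=True helps clients reconnect quickly after a failover.

    # Connect through the availability group listener rather than an individual replica.
    $connectionString = "Server=tcp:myag-listener,1433;Database=MyAppDb;" +
                        "Integrated Security=True;MultiSubnetFailover=True"

    $connection = New-Object System.Data.SqlClient.SqlConnection $connectionString
    $connection.Open()

    # Ask which replica answered; after an automatic failover this returns the new primary.
    $command = $connection.CreateCommand()
    $command.CommandText = "SELECT @@SERVERNAME;"
    Write-Output ("Connected to primary replica: " + $command.ExecuteScalar())

    $connection.Close()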

 

SQL Server AlwaysOn Template

Setting up the Availability Group configuration requires a long set of steps and a decent time commitment. In order to dramatically simplify this, we have released a SQL Server AlwaysOn template in the Azure Gallery. This template fully automates the configuration of a highly available SQL Server deployment on Azure Infrastructure Services using an Availability Group. Currently, this feature only supports SQL Server 2014 Enterprise images.

The SQL Server AlwaysOn Template, depicted below, is found in the Gallery under “Virtual Machines” and “Recommended”.

 

After you select it, the portal shows a description of the configuration that will be created and the option to specify some arguments. This is depicted in the picture below.

The only arguments required are a Resource Group (an identifier of the deployment) and administrator credentials. From that point on, all settings are optional and will be auto-generated based on these 3 inputs. The domain Sysadmin account, the local SQL Server accounts, and the SQL Server service account password will be auto-generated based on the credentials entered. The names for all resources being created will be based off of what was entered for Resource Group name. The SQL Server service account name and the domain name will be auto-generated but will not be based on the Resource Group name or credentials. If you wish to customize any of these arguments, simply go to the other configurations and change the values entered for any setting. One argument that you may want to change is the Listener name, which your applications will use to connect to SQL Server. By default, entirely new resources will be provisioned for you. You have the option to select an existing domain for the deployment. In future updates, there will be more options to add existing resources to your configuration.

After the template has executed, 5 Virtual Machines will be created under the resource group: 2 Standard A5 VMs for the SQL Server replicas, 2 Standard A1 VMs for the Domain Controller replicas, and 1 Basic A0 VM for the file share witness. This is depicted below:

You can RDP to one of the SQL Server VMs to see the Availability Group configuration as depicted below:

Try out the SQL Server AlwaysOn Template today by going to the Azure portal: http://portal.azure.com/
