About Me

I now work for Microsoft Federal in Chevy Chase, MD.

Dedicated to providing customer-driven, results-focused solutions to the complex business problems of today... and tomorrow.

At SQLTrainer.com, LLC, we understand that the technical challenges businesses face today are greater in both scope and complexity than ever before. Businesses must understand their local IT infrastructures as well as cloud-based technologies.

What is SQLTrainer.com?

Founded in 1998 by Ted Malone, SQLTrainer.com is a technical consulting, training and content development firm dedicated to the following core principles:

  • Technology Alone is NOT the Answer! - Implementing a particular technology because it is interesting or "cool" will not solve customer problems.
  • Technology Solutions do NOT need to be Overly Complex! - Too often, developers and technical practitioners attempt to build the cleverest solution possible. While this may stroke the egos of those involved, it rarely produces a maintainable solution.
  • Consultants Should be Mentors First! - When looking to hire an external consultant, businesses should look to the consultant who's willing to train themselves out of a paycheck.

Why the name, SQLTrainer.com?

SQL (pronounced See-Quell) stands for Structured Query Language, which is at the heart of every modern-day relational database system. Since many technology solutions today rely on some form of database storage or interaction, it was only logical to find a way to incorporate SQL into the name of the organization. Given that one of our core principles is to be a mentor/trainer above everything else, the name SQLTrainer made sense. Since we also wanted to represent our embrace of the cloud, it seemed logical to add the ".com", referring to the biggest "cloud" of them all.

Live Feeds

RSS Feeds
Tuesday, April 15, 2014 11:45:00 AM

Earlier today, Microsoft hosted a customer event in San Francisco where I joined CEO Satya Nadella and COO Kevin Turner to share our perspective on the role of data in business. Satya outlined his vision of a platform built for an era of ambient intelligence. He also stressed the importance of a “data culture” that encourages curiosity, action and experimentation – one that is supported by technology solutions that put data within reach of everyone and every organization. 

Kevin shared how customers like Beth Israel Deaconess Medical Center, Condé Nast, Edgenet, KUKA systems, NASDAQ, telent, Virginia Tech and Xerox are putting Microsoft’s platform to work and driving real business results. He highlighted an IDC study on the tremendous opportunity for organizations to realize an additional $1.6 trillion dividend over the next four years by taking a comprehensive approach to data. According to the research, businesses that pull together multiple data sources, use new types of analytics tools and push insights to more people across their organizations at the right time, stand to dramatically increase their top-line revenues, cut costs and improve productivity. 

A platform centered on people, data and analytics
In my keynote, I talked about the platform required to achieve the data culture and realize the returns on the data dividend – a platform for data, analytics and people. 

It’s people asking questions about data that is the starting point – Power BI for Office 365 and Excel’s business intelligence features help get them there. Data is key – data from all kinds of sources, including SQL Server and Azure, plus access to the world’s data from within Excel. Analytics brings order and surfaces insights from broad data – analytics from SQL Server and Power BI for Office 365, and Azure HDInsight for running Hadoop in the cloud.

A platform that solves for people, data, and analytics accelerates with in-memory technology. We built the platform this way because customers increasingly need technology that scales with big data and accelerates their insights at the speed of modern business.

Having in-memory technology across the whole data platform creates speed that is revolutionary on its own, and with SQL Server we built it into the product that customers already know and have widely deployed. At the event we celebrated the launch of SQL Server 2014. With this version we now have in-memory capabilities across all data workloads, delivering breakthrough application performance in both throughput and latency. Our relational database in SQL Server has been handling data warehouse workloads at terabyte-to-petabyte scale using in-memory columnar data management. With the release of SQL Server 2014, we have added in-memory Online Transaction Processing. In-memory technology has been allowing users to manipulate millions of records at the speed of thought, and to scale analytics solutions to billions of records in SQL Server Analysis Services.
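
For readers curious what the new in-memory OLTP capability looks like in practice, here is a rough sketch of the T-SQL for declaring a memory-optimized table (all names are hypothetical, and the database needs a memory-optimized filegroup first):

```python
# A hedged sketch, not from the post: declaring a SQL Server 2014
# memory-optimized table. Table and column names are hypothetical,
# and the database must already contain a memory-optimized filegroup.
IN_MEMORY_TABLE_DDL = """
CREATE TABLE dbo.OrderEntries (
    OrderId    INT   NOT NULL
        PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1048576),
    CustomerId INT   NOT NULL,
    OrderTotal MONEY NOT NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
"""

print(IN_MEMORY_TABLE_DDL)
```

DURABILITY = SCHEMA_AND_DATA keeps the table fully durable; SCHEMA_ONLY trades durability for additional speed in scenarios such as staging or session state.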

The platform for people, data and analytics needs to be where the data and the people are. Our on-premises and cloud solutions provide endpoints along a continuum of how businesses actually manage data – making hybrid a part of every customer’s capability. Today we announced that our Analytics Platform System is generally available. This is the evolution of the Parallel Data Warehouse product, and it can now query across the traditional relational data warehouse and data stored in a Hadoop region – either in the appliance or in a separate Hadoop cluster. SQL Server integrates seamlessly with VMs in Azure to provide secondaries for high availability and disaster recovery. The data people access in the business intelligence experience comes through Excel, from their own data and partner data – and Power BI provides access wherever the data resides.

The platform for people, data and analytics needs to have full reach. The natural language search query Q&A feature in Power BI for Office 365 is significant in that it provides data insights to anyone who is curious enough to ask a question. We have changed who is able to reach insights by not demanding that everyone learn the vernacular of schemas and chart types. With SQL Server, the most widely deployed database on the planet, we have many people who already have the skills to take advantage of all the capabilities of the platform. And with a billion people who know how to use Excel, people have the skills to engage with the data.

Looking forward, we will be very busy. Satya mentioned some of the work we are doing in the machine learning space, and today we also announced a preview of Intelligent Systems Service – just a couple of the things we are working on to deliver a platform for the era of ambient intelligence. The machine learning work originates in what it takes to run services at Microsoft like Bing. We had to transform ML from a deep vertical domain into an engineering capability, and in doing so we learned what it would take to democratize ML for our customers. Stay tuned.

The Internet of Things (IoT) space is very clearly one of the most important trends in data today. Not only do we envision the data from IoT solutions being well served by the data platform, but we need to ensure the end-to-end solution can be realized by any customer. To that end, Intelligent Systems Service (ISS) is an Internet of Things offering built on Azure, which makes it easier to securely connect, manage, capture and transform machine-generated data regardless of the operating system platform.

It takes a data platform built for the era of ambient intelligence with data, analytics and people to let companies get the most value from their data and realize a data culture. I believe Microsoft is uniquely positioned to provide this platform – through the speed of in-memory, our cloud and our reach. Built on the world’s most widely-deployed database, connected to the cloud through Azure, delivering insights to billions through Office and understanding the world through our new IoT service – it is truly a data platform for a new era. When you put it all together only Microsoft is bringing that comprehensive a platform and that much value to our customers.

 

Quentin Clark
Corporate Vice President
Data Platform Group

Monday, April 14, 2014 9:00:00 AM

Tomorrow’s the day! Tune in to hear from Microsoft CEO Satya Nadella, COO Kevin Turner, and Data Platform Group CVP Quentin Clark about Microsoft’s approach to data, and how the latest advancements in technology can help you transform data into action.

Who should watch?

Join us tomorrow morning at 10AM PDT if you like data or want to learn more about it. If you store it, you manage it, you explore it, you slice and dice it, you analyze it, you visualize it, you present it, or if you make decisions based on it. If you’re architecting data solutions or deciding on the best data technology for your business. If you’re a DBA, business analyst, data scientist, or even just a data geek on the side, join the live stream.

What will I hear about?

Data infrastructure. Data tools. And ultimately, the power of data. From finding the connections that could cure cancer, to predicting the success of advertising campaigns, data can do incredible things. Join us online and get inspired. You’ll see how your peers are putting their data, big and small, to work.

From a product perspective, we’ll celebrate the latest advancements in SQL Server 2014, Power BI for Office 365, SQL Server Parallel Data Warehouse, and Microsoft Azure HDInsight. And ultimately, we’ll explore how these offerings can help you organize, analyze, and make sense of your data – no matter the size, type, or location.

Where do I sign up?

Mark your calendar now or RSVP on Facebook so you’re ready to go tomorrow. When streaming goes live, you can join us here for all the action live from San Francisco.

When do things get started?

Tomorrow, April 15, at 10AM PDT. Be there.

See you tomorrow!

Wednesday, April 9, 2014 10:00:00 AM

Guest blog post by PASS President Thomas LaRock. Thomas – a SQL Server MVP, MCM, and Head Geek at SolarWinds – is a seasoned IT professional with over a decade of technical and management experience. Author of DBA Survivor: Become a Rock Star DBA, he holds an MS degree in Mathematics from Washington State University and is a Microsoft Certified Trainer and a VMware vExpert. You can read his blog at thomaslarock.com and follow him on Twitter at @SQLRockstar.

*     *     *     *     *

April opened with the general availability of SQL Server 2014. But well before we could wrap our hands around the final bits of the new release, the SQL Server community had been getting an early taste of its exciting performance, availability, manageability, and cloud features, thanks to a grassroots launch and readiness program that has spread around the globe.

The Professional Association for SQL Server (PASS) and hundreds of our volunteers around the world have joined with Microsoft to host free SQL Server 2014 launch events and technical sessions that focus on what matters most to data pros. These sessions explain the new features in the release, how the features work, and how we can use them to benefit our companies.

From user group meetings and PASS SQLSaturday sessions to the ongoing SQL Server 2014 Countdown webinars with PASS Virtual Chapters, the launch of SQL Server 2014 has truly been a community affair – and we're just getting started. Whether you're already on the path to early adoption, preparing to take advantage of the new release soon, or gathering information for the future, here's how you can get involved and get the details you need to make smart decisions for your organization:

  • Connect with fellow SQL Server pros: Microsoft Data Platform Group GM Eron Kelly noted that for Community Technology Preview 2, there were nearly 200K evaluations of SQL Server 2014, including 20K evaluations with the new release running in a Microsoft Azure Virtual Machine. That's a lot of folks who now have first-hand knowledge of SQL Server 2014. Check out those who are blogging and speaking about their experiences and sharing at chapter meetings, then get to know them and what they know.
  • Share your questions, issues, and solutions: Have you tried out SQL Server's new built-in in-memory OLTP features? How about the enhanced mission-critical and availability capabilities? Have questions about implementing a hybrid data solution that bridges on-premises and cloud technologies? And how and when should you use the new delayed durability setting or clustered columnstore indexes? Share your experiences – and what you don't know or need more information about – and help the community build up resources that enable us all to work better, smarter, and faster.
  • Learn how to get the most from your data: Go inside the new release with experts on the SQL Server product team at upcoming live SQL Server 2014 Countdown webinars and watch on-demand replays of those you missed. You can also learn more about SQL Server 2014 and Microsoft's data platform strategy at the Accelerate Your Insights online launch event April 15 with Microsoft CEO Satya Nadella, COO Kevin Turner, and Data Platform Group Corporate Vice President Quentin Clark. And remember to check with your local PASS chapter, Virtual Chapter, or nearby SQLSaturday event for more SQL Server 2014 launch and learning events happening worldwide.
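
As one concrete reference point for the delayed durability question above: in SQL Server 2014 it is a database-level setting. This sketch, with a hypothetical database name, just prints the relevant T-SQL:

```python
# A hedged sketch; the database name is hypothetical. ALLOWED lets
# individual transactions opt in via COMMIT TRANSACTION WITH
# (DELAYED_DURABILITY = ON), trading a small window of potential data
# loss for lower commit latency.
DELAYED_DURABILITY_SQL = """
ALTER DATABASE SalesDB
SET DELAYED_DURABILITY = ALLOWED;  -- DISABLED | ALLOWED | FORCED
"""

print(DELAYED_DURABILITY_SQL)
```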

I'm grateful to be part of one of the most passionate technology communities in the world and excited to participate in a SQL Server 2014 launch program that, at its core, is about empowering SQL Server professionals and their organizations to be successful.

Thanks to everyone who is helping connect, share, and learn about SQL Server 2014.
Thomas

Tuesday, April 8, 2014 10:00:00 AM

With the recently announced general availability of SQL Server 2014, Microsoft brings to market new hybrid scenarios, enabling customers to take advantage of Microsoft Azure in conjunction with on-premises SQL Server.

SQL Server 2014 helps customers protect their data and make it more highly available using Azure. SQL Server Backup to Microsoft Azure builds on functionality first introduced in SQL Server 2012, adding a UI for easily configuring backup to Azure from SQL Server Management Studio (SSMS). Backups are encrypted and compressed, enabling fast and secure cloud backup storage. Setup requires only Azure credentials and an Azure storage account. For help getting started, this step-by-step guide will get you going with the easy, three-step process.
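
As a rough sketch of what the underlying T-SQL looks like (the storage account, container, and database names below are placeholders):

```python
# A hedged sketch of SQL Server 2014 backup to Azure Blob storage.
# The credential's IDENTITY is the storage account name and its SECRET
# is one of the account's access keys; all names here are placeholders.
BACKUP_TO_AZURE_SQL = """
CREATE CREDENTIAL AzureBackupCred
WITH IDENTITY = 'mystorageaccount',
     SECRET = '<storage-access-key>';

BACKUP DATABASE SalesDB
TO URL = 'https://mystorageaccount.blob.core.windows.net/backups/SalesDB.bak'
WITH CREDENTIAL = 'AzureBackupCred',
     COMPRESSION,
     STATS = 10;
"""

print(BACKUP_TO_AZURE_SQL)
```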

Storing backup data in Azure is cost-effective, secure, and inherently offsite, making it a useful component in business continuity planning. A March 2014 commissioned study on cloud backup and disaster recovery, conducted by Forrester Consulting on Microsoft's behalf, found that saving money on storage is the top benefit of cloud database backup, cited by 61% of respondents, followed closely by the 50% who cited savings on administrative cost. Backups stored in Azure also benefit from Azure's built-in geo-redundancy and high service levels, and can be restored to an Azure VM for fast recovery from onsite outages.

In addition to the SQL Server 2014 functionality for backing up to Azure, we have now made generally available a free standalone SQL Server Backup to Microsoft Azure Tool that can encrypt and compress backup files for all supported versions of SQL Server and store them in Azure – enabling a consistent backup-to-cloud strategy across your SQL Server environments. This fast, easy-to-configure tool lets you quickly create rules that direct a set of backups to Azure rather than local storage, as well as select encryption and compression settings.

Another new business continuity planning scenario enabled by SQL Server 2014 is disaster recovery (DR) in the cloud. Customers can now set up an asynchronous replica in Azure as part of an AlwaysOn high availability solution. A new SSMS wizard simplifies the deployment of replicas both on-premises and in Azure. As soon as a transaction is committed on-premises, it is sent asynchronously to the cloud replica. We still recommend you keep your synchronous replica on-premises, but by having the additional replicas in Azure you gain improved DR and can reduce the CAPEX and OPEX costs of physically maintaining additional hardware in additional data centers.

Another benefit of keeping an asynchronous replica in Azure is that the replica can be used efficiently for read workloads such as BI reporting, or for taking backups – speeding up the backup-to-Azure process, since the secondary is already in Azure.
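
A hedged sketch of the T-SQL involved in adding such a replica, with hypothetical availability group, node, and endpoint names (the Azure VM must already be joined to the underlying Windows cluster):

```python
# A hedged sketch: adding an asynchronous AlwaysOn secondary hosted in an
# Azure VM. All names and URLs below are hypothetical.
ADD_AZURE_REPLICA_SQL = """
ALTER AVAILABILITY GROUP [SalesAG]
ADD REPLICA ON 'AZURE-SQL-VM'
WITH (
    ENDPOINT_URL = 'TCP://azure-sql-vm.cloudapp.net:5022',
    AVAILABILITY_MODE = ASYNCHRONOUS_COMMIT,
    FAILOVER_MODE = MANUAL,
    SECONDARY_ROLE (ALLOW_CONNECTIONS = READ_ONLY)  -- usable for reporting and backups
);
"""

print(ADD_AZURE_REPLICA_SQL)
```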

But the greatest value to customers of an AlwaysOn replica in Azure is speed of recovery. Customers are finding that their recovery point objectives (RPO) can be tightened to limit data loss, and their recovery time objectives (RTO) can be measured in seconds:

  • Lufthansa Systems is a full-spectrum IT consulting and services organization that serves airlines, financial services firms, healthcare systems, and many more businesses. To better anticipate customer needs for high-availability and disaster-recovery solutions, Lufthansa Systems piloted a solution on SQL Server 2014 and Azure that led to faster and more robust data recovery, reduced costs, and the potential for a vastly increased focus on customer service and solutions. They expect to deploy the solution on a rolling basis starting in 2014.
  • Amway is a global direct seller. Amway conducted a pilot test of AlwaysOn Availability Groups for high availability and disaster recovery. With multisite data clustering with failover to databases hosted both on-premises and in Azure, Amway found that the test of SQL Server AlwaysOn with Azure replicas delivered 100 percent uptime and failover took place in 10 seconds or less. The company is now planning how best to deploy the solution.

Finally, SQL Server 2014 enables you to move your database files to Azure while keeping your applications on-premises for bottomless storage in the cloud and greater availability. The SQL Server Data Files in Microsoft Azure configuration also provides an alternative storage location for archival data, with cost effective storage and easy access.
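
As a rough sketch with placeholder names and URLs, the Data Files in Microsoft Azure configuration uses a credential holding a Shared Access Signature for the storage container, plus URL-based file names:

```python
# A hedged sketch of SQL Server 2014 Data Files in Microsoft Azure.
# The credential is named after the container URL and its SECRET is a
# Shared Access Signature token. All names here are placeholders.
DATA_FILES_IN_AZURE_SQL = """
CREATE CREDENTIAL [https://mystorageaccount.blob.core.windows.net/data]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '<sas-token>';

CREATE DATABASE ArchiveDB
ON (NAME = ArchiveDB_data,
    FILENAME = 'https://mystorageaccount.blob.core.windows.net/data/ArchiveDB.mdf')
LOG ON (NAME = ArchiveDB_log,
    FILENAME = 'https://mystorageaccount.blob.core.windows.net/data/ArchiveDB.ldf');
"""

print(DATA_FILES_IN_AZURE_SQL)
```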

If you're ready to evaluate how SQL Server 2014 can benefit your database environment, download a trial here. For greater flexibility deploying SQL Server on-premises and in the cloud, sign up for a free Azure evaluation. And, to get started backing up older versions of SQL Server to Azure, try our free standalone backup tool. Also, don't forget to save the date for the live stream of our April 15 Accelerate Your Insights event to hear more about our data platform strategy from CEO Satya Nadella, COO Kevin Turner and CVP of Data Platform Quentin Clark.

Tuesday, April 1, 2014 10:00:00 AM

Microsoft today released Microsoft SQL Server 2014, the foundation of our cloud-first data platform. SQL Server 2014 delivers breakthrough performance with new and enhanced in-memory technologies to help customers accelerate their business and enable new, transformational scenarios. In addition, SQL Server 2014 enables new hybrid cloud solutions to take advantage of the benefits of cloud computing with scenarios such as cloud backup and cloud disaster recovery for on-premises SQL Server installations.  SQL Server 2014 continues to offer industry-leading business intelligence capabilities through integration with familiar tools like Excel and Power BI for Office 365 for faster insights. 

We will also soon make generally available the SQL Server Backup to Microsoft Azure Tool, a free tool that allows customers to back up older versions of SQL Server to Azure storage.

Try the SQL Server 2014 release today

Download and try the generally available release of SQL Server 2014 today on premises, or get up and running in minutes in the cloud. And, please be sure to save the date for the live stream of our April 15 Accelerate Your Insights event to hear more about our data platform strategy from CEO Satya Nadella, COO Kevin Turner and CVP of Data Platform Quentin Clark.

Thanks.

Eron Kelly

General Manager

Data Platform Group

Friday, March 28, 2014 1:06:00 PM

Data enthusiasts & Microsoft talk about Big Data in the enterprise at Gigaom Structure Data 2014

Last week in New York, our data solutions experts spent a few days with more than 900 big data practitioners, technologists and executives at Structure Data, for a conversation about how big data can drive business success.

The rich conversations with attendees at the event were inspiring, and the broad range of speakers was impressive.  Our discussions over the two days in New York centered on what the Big Data solution looks like inside an enterprise and the challenges around accessing and processing big data to make better data-driven decisions. 

Structure Data attendees want to combine data from multiple sources to do a couple of key things – to gain deeper insights, and to ask new questions and get answers. Without the right technology to support that, however, this can be very challenging. That's where Microsoft comes in – and where we continued the dialog with attendees as our data experts used huge Microsoft touchscreens to show how easy it can be to transform big data into insights using simple front-end tools (like Excel and Power BI for Office 365) and back-end technology for scale, power and speed (like Windows Azure HDInsight and SQL Server).

Microsoft Research Distinguished Scientist John Platt also spoke at Structure Data and shared the latest on our work in machine learning, which is pervasive throughout many Microsoft products. If you missed it, take a moment to watch the short chat here.

Our data experts also gave attendees an insiders’ view at how Microsoft’s Cybercrime Center is using data to fight worldwide organized crime and BotNets. (See the video below for more.) 

Take the first step and learn more about Microsoft Big Data solutions.

Or, connect with us on Facebook.com/sqlserver and Twitter @SQLServer and learn how Microsoft’s approach to data, big and small, helps employees, IT professionals and data scientists quickly transform data into insight and action.

And don't forget about the April 15 Accelerate Your Insights event, where Microsoft will unveil the details of new capabilities for the appliance that combines scale-out relational data warehousing and Hadoop in the same box – evolving PDW from a solution built for high-performance relational data warehousing into a true turnkey big data analytics appliance.

Thursday, March 27, 2014 10:00:00 AM

Mark your calendar now to join us online on April 15 for the Accelerate Your Insights event, streaming live from San Francisco, California at 10:00 AM PDT.

Wondering what your data can do for you? Join us online to find out how to drive your business in real time, from apps to insights. You’ll hear from several of Microsoft’s top executives, including Chief Executive Officer Satya Nadella, Chief Operating Officer Kevin Turner, and Corporate Vice President of the Data Platform Group Quentin Clark.

Save the date to watch the keynotes streamed live on 4/15:

  • Mark your calendar
  • RSVP on Facebook

Join us as we share Microsoft’s data platform vision, and how the latest advancements in our data technologies can help you transform data into action.

See you there.

Wednesday, March 26, 2014 11:52:00 AM

In my recent post, Simplifying Business Intelligence through Power BI for Office 365, I described how the Power BI cloud service has changed the way I personally work. The ability to do “self-service BI” without any IT involvement—and to have the infrastructure transparently provisioned in a cloud computing environment—has enabled me to drop any dependency on specific hardware, and I can use just about any Web browser. We have heard from many business users that this self-service approach is extremely empowering, and it enables rapid progress and insights.

One potential downside of self-service BI is that the reports, charts, and graphs users develop with such tools can be cut off from other data and reports. Self-service BI solutions can also become out of date and, as such, unreliable. These are serious limitations. For self-service tools to be convenient as well as reliable and trustworthy, there needs to be a way to easily include user-created reports as part of a larger solution, and to operationalize those solutions. Without such capabilities, self-service tools are, at best, good for prototyping.

Power BI is a complete system, tying together the empowering aspects of self-service with the operational and collaborative capabilities that are critical for an enterprise-class solution. Power BI also provides governance and data stewardship capabilities, as needed, to enable management oversight. The result is a managed self-service system.

In this post, I want to talk about how Power BI enables me to create complete business insight solutions that are based on fresh, reliable data. I will not cover data governance capabilities, keeping that as a topic for another day.

Pulling It Together

For me, a Power BI site is the place where I can pull together a full solution that gives an “at a glance” view of the visualizations and insights available for a set – or sets – of data. It also is the place to enable operations such as refreshing the data for the solution. I use the “Featured Reports” row at the top of the Power BI sites application to highlight the reports and visualizations that are most important and that I want to call out visually.

Power BI Q&A is a good way to ask ad-hoc questions using natural language, but to be really effective, it is important to provide starting points from which users can begin exploring a dataset. For this, I use the “Featured Questions” capability, which lets me come up with starter questions that can be used in two ways:

1. To lead users into the Q&A experience from the Power BI site app main page through the Featured Questions row on Power BI sites.

2. To guide users to what they can ask on the Q&A page itself. The “About this data” slide out panel on the right side of the Q&A page lists several types of questions that users can ask.

Adding specific featured questions can be done from the Featured Questions view, accessible from the icon next to the Q&A text box and in the “About this data” slide-out panel.

Keeping It Fresh

Power BI leverages Excel workbooks to create data models and compelling interactive visuals. Data is mashed up in Excel and can come from various data sources. Typically, once I mash up data and have some interesting visuals to share, I upload the Excel workbook to a site on the Power BI service and, as simple as that, the solution is available in the cloud. So far so good – except that the data in the solution is “as of” the upload time. It can quickly get stale and might not make sense later if the data is time-sensitive. Of course, it is possible to re-upload the workbook with refreshed data, but that is a manual and potentially unreliable process. What is required is a way to automatically refresh the data. Power BI enables this via the “Scheduled Data Refresh” feature, which enables automatic updates on a regular schedule, keeping the data fresh and the reports reliable – critical for real-world solutions.

Like everything in Power BI, the process for setting up the Scheduled Data Refresh is self-service in nature. The ellipsis (…) menu for each workbook on the Power BI site is where a data refresh is configured and monitored.


Setting up the actual schedule is straightforward, with simple options. Of course, the flip side to simplicity is that the options are somewhat limited. Over time, we will add more customizations, and the overall goal is to enable business users to set up the refresh without requiring help from someone in IT.

Once this is set up properly, the refresh job runs as defined in the schedule, and switching to the “history” tab provides a quick view of jobs that have already run and the results of each run (including any errors).


In this example, my data source is an Azure SQL Database, so establishing the connection is simple. The connection I defined in my Excel workbook just works in the cloud after uploading to the Power BI service, and the refresh operation also just works. However, in many cases data comes from an on-premises source, and then the refresh is not as simple. To refresh data from an on-premises source, the first step is to install and configure the Data Gateway; then, add the data source to the Data Gateway. I will not cover this in detail here; it deserves a write-up of its own.

A Complete BI System

With this up and running, the overall solution is now a workable and practical implementation. Power BI sites provide the “face” of the solution and allow the user to become productive quickly. Scheduled Data Refresh ensures the data freshness and reliability that are so critical to business insights. Power BI converts the Excel workbook from an island of data into a full participant in a BI solution, with fresh, reliable data that the business can count on. Add this to the rest of Power BI's data management capabilities, and we have a complete BI system that gives business users powerful insights while still giving administrators a degree of oversight and control.

Read more about using the Power BI sites application with your SharePoint Online site.

Kamal Hathi, Director of PM, Data Platform Group

 

Check out the website to learn more about Power BI for Office 365 and start a free trial today. For those who want access to the upcoming SQL Server 2014 release as soon as possible, please sign up to be notified once the release is available. Also, please join us on April 15 for the Accelerate Your Insights event to learn about our data platform strategy and how more customers are gaining significant value with SQL Server 2014. There will also be additional launch events worldwide, so check with your local Microsoft representatives or your local PASS chapter for more information on SQL Server readiness opportunities.

Monday, March 24, 2014 1:00:00 PM

Microsoft will be a premier sponsor at the Gartner Business Intelligence and Analytics Summit, held in Las Vegas from March 30th to April 2nd. We’re looking forward to sharing our vision of how we’re making big data real through the familiar tools you already use – SharePoint, Excel and SQL Server – as well as new ones such as Power BI for Office 365 and Windows Azure HDInsight.

Over the last few years, we’ve all had to deal with an explosion in data types and the velocity at which we need to react to that data. In this world of big data, Microsoft has refined our data toolkit – adding performance and scaling capabilities on commodity hardware in SQL Server 2014, as well as the ability to store, process and analyze large volumes of data through Windows Azure HDInsight, our 100% compatible implementation of Apache Hadoop.

Just as we’ve added capabilities to our data platform, we’ve continued to focus on making it as easy as possible to get rich insights from your stored data – whether it’s in SQL Server, Windows Azure HDInsight or a 3rd party data provider such as a LOB system. We’ve built powerful visualizations right into Excel with Power View and have added geospatial mapping capabilities through Power Map. It’s also now possible to query your data with natural language through Q&A in Power BI.

Our focus at Gartner will be on showcasing how all of these innovations are coming together to enable all of your users to find, analyze and use the information they need quickly and easily.

We’d love to speak to you if you’ll be there. Stop by our booth; attend our session on April 2nd from 10:45 AM – 11:45 AM; or schedule an individual meeting with Microsoft through the Gartner concierge. We’re also co-hosting a learning lab on the show floor with our partner SAP where you can learn about how Power BI connects to SAP BusinessObjects BI Universes, both through small group sessions and hands-on demonstrations.

We hope to see you there. If you haven’t yet registered, you can use code BISP7 to get a $300 discount on registration.


Monday, March 24, 2014 8:00:00 AM

With the release-to-manufacturing announcement of SQL Server 2014, you will start to see more customer stories showcasing hybrid features that span the cloud and on-premises environments.

One such customer is global direct seller Amway. Its data centers support about 34 terabytes of information spread across 100 instances of Microsoft SQL Server, with the data load growing at an annual rate of about 15 percent. The company faces the same challenges as almost any sizable organization in maximizing data availability and ensuring disaster recovery. As Amway has grown, it has created additional secondary data centers as disaster-recovery sites. All this additional infrastructure has, inevitably, introduced more complexity and cost into the Amway data environment.

Previously, Amway had been concerned about cloud configurations that were not under its control. But with Windows Azure Infrastructure as a Service, the company could create its own virtual machine configuration image and install it on Windows Azure, addressing that concern. Amway uses the same virtual-machine media for instances on Windows Azure and in its own data centers, ensuring that installations are consistent across both environments.

Amway conducted a pilot test of a prerelease version of Microsoft SQL Server 2014, focusing on the software’s AlwaysOn Availability Groups for high availability and disaster recovery. That feature is based on multisite data clustering with failover to databases hosted both on-premises and in Windows Azure. The pilot test focused on a CRM application, with a test architecture consisting of three nodes in a hybrid on-premises/cloud configuration: a primary replica and a secondary replica, both located on-premises and operating synchronously to support high availability through automatic failover, and a second secondary replica located in Windows Azure, operating in asynchronous mode to provide disaster recovery through manual failover.
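To make the three-node topology concrete, a configuration along these lines could be defined in T-SQL. This is only an illustrative sketch, not Amway's actual configuration: the server, database and endpoint names are hypothetical, and the real deployment would also require Windows Server Failover Clustering setup, database mirroring endpoints and replica seeding.

```sql
-- Hypothetical sketch of the pilot's hybrid availability group:
-- two synchronous on-premises replicas with automatic failover,
-- plus one asynchronous Windows Azure replica with manual failover.
CREATE AVAILABILITY GROUP [CrmAG]
FOR DATABASE [CrmDB]
REPLICA ON
    N'ONPREM-NODE1' WITH (          -- primary replica, on-premises
        ENDPOINT_URL       = N'TCP://onprem-node1.corp.example.com:5022',
        AVAILABILITY_MODE  = SYNCHRONOUS_COMMIT,
        FAILOVER_MODE      = AUTOMATIC),
    N'ONPREM-NODE2' WITH (          -- secondary replica, on-premises
        ENDPOINT_URL       = N'TCP://onprem-node2.corp.example.com:5022',
        AVAILABILITY_MODE  = SYNCHRONOUS_COMMIT,
        FAILOVER_MODE      = AUTOMATIC),
    N'AZURE-NODE1' WITH (           -- disaster-recovery replica in Windows Azure
        ENDPOINT_URL       = N'TCP://azure-node1.cloudapp.net:5022',
        AVAILABILITY_MODE  = ASYNCHRONOUS_COMMIT,
        FAILOVER_MODE      = MANUAL);
```

The asynchronous mode on the cloud replica is what keeps on-premises transaction commits from waiting on the network round trip to Azure, at the cost of requiring a manual (potentially lossy) failover in a disaster-recovery scenario.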

Amway found that the test of SQL Server AlwaysOn Availability Groups with Windows Azure replicas delivered 100 percent uptime and failover took place in 10 seconds or less, compared to the 45 seconds Amway experienced with traditional SQL Server Failover Clusters. Amway is looking forward to an even bigger reduction in the time required to recover from a complete data center failure. Instead of the two-hour, three-person process required with database mirroring, Amway will be able to restore a data center with just 30 seconds of one DBA’s time.

You can learn more about the Amway solution by reading the more detailed case study here.  


© 2008 - 2014 SQLTrainer.com, LLC