About Me

I now work for Microsoft Federal in Chevy Chase, MD.

Dedicated to providing customer-driven, results-focused solutions to the complex business problems of today... and tomorrow.

At SQLTrainer.com, LLC, we understand that the technical challenges businesses face today are greater in both scope and complexity than ever before. Businesses must now understand local IT infrastructures as well as cloud-based technologies.

What is SQLTrainer.com?

Founded in 1998 by Ted Malone, SQLTrainer.com is a technical consulting, training and content development firm dedicated to the following core principles:

  • Technology Alone is NOT the Answer! - Implementing a particular technology because it is interesting or "cool" will not solve customer problems.
  • Technology Solutions do NOT need to be Overly Complex! - Developers and technical practitioners often attempt to build the cleverest solution possible. While this may stroke the egos of those involved, it rarely produces a maintainable solution.
  • Consultants Should be Mentors First! - When looking to hire an external consultant, businesses should look to the consultant who's willing to train themselves out of a paycheck.

Why the name, SQLTrainer.com?

SQL (pronounced See-Quell) stands for Structured Query Language, which is at the heart of every modern-day relational database system. Since many technology solutions today rely on some form of database storage or interaction, it was only logical to incorporate SQL into the name of the organization. Given that one of our core principles is to be a mentor/trainer above everything, the name SQLTrainer made sense. Since we also wanted to represent our embrace of the cloud, it seemed logical to add the ".com", referring to the biggest "cloud" of them all.
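For readers who have never seen it, SQL reads almost like English. A minimal illustrative query (the Customers table and its columns are hypothetical):

```sql
-- Hypothetical Customers table, for illustration only.
SELECT CustomerName, City
FROM Customers
WHERE Country = 'USA'
ORDER BY CustomerName;
```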

Live Feeds

Wednesday, October 29, 2014 1:01:00 AM

This blog post is authored by Joseph Sirosh, Corporate Vice President of Machine Learning at Microsoft.

Today, I am excited to announce three new services: Azure Stream Analytics, Azure Data Factory and Azure Event Hubs. These services continue to make Azure the best cloud platform for our customers to build big data solutions.

Azure Stream Analytics and Azure Data Factory are available in preview and Azure Event Hubs is now generally available. These new capabilities help customers process data from devices and sensors within the Internet of Things (IoT), and manage and orchestrate data across diverse sources. 

  • Stream Analytics is a cost-effective event processing engine that helps uncover real-time insights from devices, sensors, infrastructure, applications and data quickly and easily.
  • Azure Data Factory enables information production by orchestrating and managing diverse data.
  • Azure Event Hubs is a scalable service for collecting data from millions of “things” in seconds.

Azure Stream Analytics and Azure Event Hubs

Every day, the Internet of Things fuels vast amounts of data from millions of endpoints, streaming at high velocity into the cloud. Examples of streaming analytics can be found across many businesses, such as stock trading, fraud detection, identity protection services, sensors, web clickstream analytics and alerts from CRM applications. In this new and fast-moving world of cloud and devices, businesses can no longer wait weeks or months for insights generated from data.

With Azure Stream Analytics, businesses can gain insights in real time from data generated by devices, sensors, infrastructure, applications and other sources. Developers can easily combine streams of data – such as clickstreams, logs, metering data or device-generated events – with historic records or reference data. Complementing Stream Analytics, Azure Event Hubs is a highly scalable publish-subscribe ingestor that collects millions of events per second, allowing users to process and analyze data produced by connected assets such as devices and sensors. Stream Analytics provides out-of-the-box integration with Event Hubs – when connected, these two solutions enable customers to harness IoT by processing and analyzing massive amounts of data in real time.
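Stream Analytics jobs are authored in a SQL-like language. As a sketch of combining a stream with reference data over a time window — the input `SensorStream`, the reference data `DeviceInfo`, the output `AlertOutput` and all column names are assumptions for illustration, not part of the announcement:

```sql
-- Illustrative only: input, reference data, output and column names are hypothetical.
-- Joins a device event stream with reference data and aggregates
-- readings over a 30-second tumbling window.
SELECT
    s.DeviceId,
    d.Location,
    AVG(s.Temperature) AS AvgTemperature
INTO
    AlertOutput
FROM
    SensorStream s TIMESTAMP BY s.EventTime
JOIN
    DeviceInfo d ON s.DeviceId = d.DeviceId
GROUP BY
    s.DeviceId, d.Location, TumblingWindow(second, 30)
```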

One customer already using Stream Analytics and Event Hubs is Aerocrine, a medical products company focused on improving the management and care of patients with inflammatory airway diseases. The company is developing devices that collect telematics data from clinics. The devices will connect to Azure and use Stream Analytics and Event Hubs to collect telematics information and perform near-real-time analytics on the stream of data from the instruments. The system will collect usage and performance data to further improve the customer service experience and send out real-time maintenance alerts.

Azure Data Factory

Most organizations today are dealing with massive amounts of varied data from many different sources: across geographic locations, on-premises and in the cloud, unstructured and structured. Effectively managing, coordinating and processing this data can be challenging, especially when the system needs to constantly evolve to deal with new business requirements, scale to handle growing data volume and have broad enough scope to manage diverse systems – commercial or open source – from a single place.

Azure Data Factory helps solve this problem by providing customers with a single place to manage data movement, orchestration and monitoring across diverse data sources, including SQL Server, Azure Blobs and Tables, Azure SQL Database and SQL Server in Azure Virtual Machines. Developers can efficiently build data-driven workflows that join, aggregate and transform data from local, cloud-based and internet services, and set up complex data processing systems with little programming.

Milliman, an independent actuarial and consulting firm, is continuously innovating solutions for its clients and is now taking advantage of Azure Data Factory to unlock Azure HDInsight to organize and report over large and disorganized data sets. Milliman’s SaaS solution, Integrate™, will provide a data management environment to support both the creation of input data for the models and reporting across the vast amount of data generated by the models.

Rockwell Automation, the world’s largest company dedicated to industrial automation and information, is demonstrating IoT capabilities by offering remote monitoring services that collect data from sensors and securely send it to Microsoft Azure. A key component of their architecture is Data Factory. With Data Factory, Rockwell Automation orchestrates critical data pipelines for time series sensor data by leveraging Microsoft Azure HDInsight, so users can work with the data in Power BI and Azure Machine Learning.
 

Microsoft data services

Azure Stream Analytics, Azure Event Hubs and Data Factory are just a few of the data services we’ve added to Azure recently. Just this month at Strata + Hadoop World we introduced support for Apache Storm in Azure HDInsight, and over the past few months we announced Azure SQL Database, Azure DocumentDB, Azure Search and Azure Machine Learning. We’re delivering these new services so our customers have easier ways to manage, analyze and act on their data – using the tools, languages and frameworks they are familiar with – in a scalable and reliable cloud environment. To learn more, go here.

###

Tuesday, October 28, 2014 10:00:00 AM

Just one week from now, PASS Summit will bring together the #SQLFamily in Seattle for the best week of SQL Server and BI learning and networking on the calendar, Nov. 4-7. With a record 200+ technical sessions across 3 jam-packed days of connecting and sharing with 5,000 fellow SQL Server professionals from around the world, PASS Summit 2014 will be the biggest Summit yet.

In addition to sessions with top community and Microsoft experts, guidance from Microsoft CSS, SQLCAT, and SQL Tiger teams at the popular SQL Server Clinic, and hands-on instructor-led workshops, Summit attendees can get 50% off Microsoft certification exams onsite. What else can you look forward to at this year’s Summit? Here are just some of the networking activities, onsite events, and opportunities to immerse yourself in the #SQLCommunity like never before.

First Timers

First-time Summit attendee? Don’t know what to expect at the conference? Don’t miss one of our First-Timers’ orientation sessions Tuesday before the Welcome Reception to get an inside look at what’s in store for you at PASS Summit and tips on getting the most from your week. Then, jump into a Speed Networking session with your fellow First-Timers, and start making connections. We’ll also have daily “Get to Know Your Community” sessions on how to navigate Summit and get more involved in PASS and the #SQLCommunity year-round.

Speaker Idol

For the first time ever, watch as 12 presenters compete for a guaranteed speaking spot at PASS Summit 2015. With three rounds across three days, a panel of judges from the community will give Speaker Idol contestants real-time feedback and select the finalists for Friday’s “speak-off.” Drop by the Community Session Room (Room 400) to watch the competition and cheer on your favorite speakers.

Community Zone

The Community Zone, on the level-4 Skybridge, is the place to mix and mingle with members of the community. Local and Virtual Chapter leaders, Regional Mentors, SQLSaturday organizers, and MVPs will be on hand Wednesday through Friday to answer any questions you have about PASS. The Community Zone will also feature a different country/language spotlight every hour – come by and talk with community leaders from your area who speak your native language. Plus, meet community leaders from around the world and you could win a $250 VISA gift card in our SQL Around the World scavenger hunt-style game.

Luncheons

This year’s Summit features two great opportunities to learn more as you dig into lunch. Thursday’s Women in Technology Luncheon welcomes special guest keynoter Kimberly Bryant, founder of Black Girls CODE, to share her thoughts in a question-and-answer session. And join MVPs, speakers, PASS Board members, and fellow attendees at our closing-day Birds of a Feather Luncheon, focused on bringing people with the same passions together.

Evening Events

The fun continues even after sessions are over, with evening events designed to help you engage with the community and relax after an intense day of training. Help us kick off PASS Summit at Tuesday’s Welcome Reception, and rub shoulders with our sponsors and exhibitors at Wednesday’s Exhibitor Reception. Then enjoy Thursday’s Community Appreciation Party at the contemporary, cutting-edge Experience Music Project Museum, sponsored by PASS and Microsoft as a special thank you for being part of the #SQLCommunity.

Read what community bloggers are looking forward to at the SQL Server event of the year. And if you haven’t already, make sure you register by Oct. 31 to save $200 off the onsite rate. We can’t wait to see everyone there!

Wednesday, October 22, 2014 10:00:00 AM

Are you curious about how to begin working with big data using Hadoop? Perhaps you know you should be looking into big data analytics to power your business, but you’re not quite sure about the various big data technologies available to you, or you need a tutorial to get started.  

  1. If you want a quick overview on why you should consider cloud Hadoop: read this short article from MSDN Magazine that explores the implications of combining big data and the cloud and provides an overview of where Microsoft Azure HDInsight sits within the broader ecosystem.

  2. If you’re a technical leader who is new to Hadoop: check out this webinar about Hadoop in the cloud, and learn how you can take advantage of the new world of data and gain insights that were not possible before.

  3. If you’re on the front lines of IT or data science and want to begin or expand your big data capabilities: check out the ‘Working with big data on Azure’ Microsoft Press eBook, which provides an overview of the impact of big data on businesses, offers a step-by-step guide for deploying Hadoop clusters and running MapReduce in the cloud, and covers several use cases and helpful techniques.

  4. If you want a deeper tutorial for taking your big data capabilities to the next level: master the ins and outs of Hadoop for free on the Microsoft Virtual Academy with the ‘Implementing Big Data Analysis’ training series.

What questions do you have about big data or Hadoop? Are there any other resources you might find helpful as you learn and experiment? Let us know. And if you haven’t yet, don’t forget to claim your free one-month Microsoft Azure trial.

Tuesday, October 21, 2014 10:00:00 AM

We are working to make Azure the best cloud platform for big data, including Apache Hadoop. To accomplish this, we deliver a comprehensive set of solutions, such as our Hadoop-based Azure HDInsight service, as well as managed data services from partners, including Hortonworks. Last week Hortonworks announced the most recent milestone in our partnership, and yesterday we announced even more data options for our Azure customers through a partnership with Cloudera.

Cloudera is recognized as a leader in the Hadoop community, and that’s why we’re excited that Cloudera Enterprise has achieved Azure certification. As a result, organizations will be able to launch a Cloudera Enterprise cluster from the Azure Marketplace starting October 28. Initially, this will be an evaluation cluster with access to MapReduce, HDFS and Hive. At the end of this year, when Cloudera 5.3 is released, customers will be able to leverage the full Cloudera Enterprise distribution, including HBase, Impala, Search, and Spark.
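For readers new to Hive: it exposes files in HDFS (or, on Azure, blob storage) through a SQL-like language. A minimal sketch of querying log files kept in Azure Blob storage — the table, container, storage account and column names are made up for illustration:

```sql
-- Hypothetical example: table, container and storage account names are made up.
-- Maps tab-delimited log files in Azure Blob storage to a Hive table.
CREATE EXTERNAL TABLE weblogs (log_date STRING, level STRING, message STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION 'wasb://logs@mystorageaccount.blob.core.windows.net/weblogs/';

-- Count log entries by severity level.
SELECT level, COUNT(*) AS entries
FROM weblogs
GROUP BY level;
```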

We’re also working with Cloudera to ensure greater integration with the Analytics Platform System, SQL Server, Power BI and Azure Machine Learning. This will allow organizations to build big data solutions quickly and easily by using the best of Microsoft and Cloudera together. For example, Arvato Bertelsmann was able to help clients cut fraud losses in half and speed credit calculations by 1,000x.

Our partnership with Cloudera allows customers to use the Hadoop distribution of their choice while getting the cloud benefits of Azure. It is also a sign of our continued commitment to make Hadoop more accessible to customers by supporting big data workloads anywhere – on hosted VMs and managed services in the public cloud, on-premises or in hybrid scenarios.

From Strata in New York to our recent news from San Francisco, these are exciting times for those in the data space. We hope you join us for the ride!

Eron Kelly
General Manager, Data Platform

Friday, October 17, 2014 10:00:00 AM

Author: Shaun Connolly
VP Corporate Strategy - Hortonworks

Data growth threatens to overwhelm existing systems and bring those systems to their knees. That’s one of the big reasons we’ve been working with Microsoft to enable a Modern Data Architecture for Windows users and Microsoft customers.

A history of delivering Hadoop to Microsoft customers

Hortonworks and Microsoft have been partnering to deliver solutions for big data on Windows since 2011. Hortonworks is the company that Microsoft relies on for providing the industry’s first and only 100% native Windows distribution of Hadoop, as well as the core Hadoop platform for Microsoft Azure HDInsight and the Hadoop region of the Microsoft Analytics Platform System.

This week we made several announcements that further enable hybrid choice for Microsoft-focused enterprises interested in deploying Apache Hadoop on-premises and in the cloud.

New capabilities for Hadoop on Windows

Hortonworks has announced the newest version of the Hortonworks Data Platform for Windows – the market’s only Windows-native Apache Hadoop-based platform. This release brings many new innovations for managing, processing and analyzing big data, including:

  • Enterprise SQL at scale
  • New capabilities for data scientists
  • Internet of Things with Apache Kafka
  • Management and monitoring improvements
  • Easier maintenance with rolling upgrades

Automated cloud backup for Microsoft Azure

Data architects require Hadoop to act like other systems in the data center, and business continuity through replication across on-premises and cloud-based storage targets is a critical requirement. In HDP 2.2, we extended the capabilities of Apache Falcon to establish an automated policy for cloud backup to Microsoft Azure. This is an important first step in a broader vision to enable seamlessly integrated hybrid deployment models for Hadoop.

Certified Hadoop on Azure Infrastructure as a Service (IaaS)

Increasingly, the cloud is an important component of big data deployments. On Wednesday, October 15, we announced that the Hortonworks Data Platform (HDP) is the first Hadoop platform certified to run on Microsoft Azure Virtual Machines. This gives customers new choices for small and large deployments in the cloud. With this new certification, Hortonworks and Microsoft make Apache Hadoop more widely available and easier to deploy for data processing and analytic workloads, enabling enterprises to expand their modern data architectures.

Maximizing Hadoop Deployment choice for Microsoft Customers

These latest efforts further expand the deployment options for Microsoft customers while providing complete interoperability between workloads on-premises and in the cloud. This means that applications built on-premises can be moved to the cloud seamlessly. Complete compatibility between these infrastructures gives customers the freedom to use the infrastructure that best meets their needs. You can back up data where it resides (geographically) while giving others the flexibility to do Hadoop analytics in the cloud (globally).

We are excited to be the first Hadoop vendor to offer Hadoop on Azure Virtual Machines, and we look forward to continuing our long history of working with Microsoft to engineer and offer the most flexible and easiest-to-use deployment options for big data available, further increasing the power of the Modern Data Architecture.


Thursday, October 16, 2014 10:00:00 AM

Yesterday at Strata + Hadoop World, Microsoft announced the preview of Apache Storm clusters on Azure HDInsight.  This post will give you the ins and outs of Storm.

What is Storm?

Apache Storm is a distributed, fault-tolerant, open source real-time event processing solution. Storm was originally used by Twitter to process massive streams of data from the Twitter firehose. Today, Storm is an incubator project of the Apache Software Foundation. Typically, Storm is integrated with a scalable event queuing system like Apache Kafka or Azure Event Hubs.

What can it do?

Combined with an event queuing system, Storm can process large amounts of real-time data. This enables many different scenarios, such as real-time fraud detection, clickstream analysis, financial alerts, and telemetry from connected sensors and devices. For information on real-world scenarios, read how companies are using Storm.

How do I get started?

For Microsoft customers, we offer Storm as a preview cluster type in Azure HDInsight. This gives you a managed cluster that is easy to set up (a few clicks and a few minutes), highly available (clusters are monitored 24/7 and covered by the Azure uptime SLA), elastically scalable (more resources can be added as needed), and integrated with the broader Azure ecosystem (e.g., Event Hubs, HBase, VNet).

To get started, customers will need to have an Azure subscription or a free trial to Azure. With this in hand, you should be able to get a Storm cluster up and running in minutes by going through this getting started guide.

Wednesday, October 15, 2014 10:00:00 AM

This morning at Strata + Hadoop World, Microsoft announced the preview of Apache Storm clusters inside HDInsight as well as new machine learning capabilities in the Azure Marketplace.

Apache Storm is an open source project in the Hadoop ecosystem that gives users access to an event-processing analytics platform that can reliably process millions of events. Users of Hadoop can now gain insights into events as they happen, in real time.

As part of Strata, Microsoft partner Hortonworks announced that the next version of its Hadoop distribution, HDP 2.2, will include capabilities to orchestrate data from on-premises to Azure. This will allow customers to back up their on-premises data or elastically scale out using the power of the cloud.

Finally, Microsoft is offering new machine learning capabilities as part of the Azure Marketplace. Customers can now access machine learning as web services, enabling scenarios such as anomaly detection, recommendations and fraud detection, along with a set of R packages.

Read more of Microsoft’s Strata announcements on the Official Microsoft Blog.

Wednesday, October 8, 2014 10:15:51 AM

Got SQL Server 2005 running on Windows Server 2003?  We have fantastic pre-con and general sessions to help you plan your upgrade and migration strategies.

Interested in the new Azure data services like Azure DocumentDB, Azure ML, and Azure Search?  We have awesome people lined up to give you all the details.

Want to know the nitty-gritty about Azure SQL Database? We’ve got you covered.

It is PASS Summit time and we are counting down the days. We have Microsoft experts from our Redmond campus, field experts flying in from Italy and the U.K., and we’re bringing customers to share their stories – just to name a few.  Check out the Microsoft sessions below and add them to your PASS Summit session builder along with great community sessions. 

7 Databases in 70 Minutes: A Primer for NoSQL in Azure, Lara Rubbelke and Karen Lopez

Analytics Platform System Deep Dive (APS), Paul Dyke

Analytics Platform System Overview (APS), Nicolle Whitman

Analyzing tweets with HDInsight, Excel and Power BI, Miguel Martinez and Sanjay Soni

Application Lifecycle Management for SQL Server database development, Lonny Bastien and Steven Green

Azure CAT: Azure Data Platform: Picking the right storage solution for the right problem, Kun Cheng, Rama Ramani, and Ewan Fairweather

Azure CAT: Azure SQL DB Performance Tuning & Troubleshooting, Sanjay Mishra, Kun Cheng, and Silvano Coriani

Azure CAT: Deep dive of Real world complex Azure data solutions, Lindsey Allen and Rama Ramani

Azure CAT: Running your Line of Business application on Azure Virtual Machine Services, Juergen Thomas

Azure CAT: SQL Server 2014 Gems, Shep Sheppard

Azure CAT: SQL Server 2014 In-Memory Customer Deployments: Lessons Learned, Michael Weiner and Stephen Baron

Azure Search Deep Dive, Pablo Castro

Azure SQL Database Business Continuity and Auditing Deep Dive, Nadav Helfman and Sasha Nosov

Azure SQL Database Overview, Bill Gibson and Sanjay Nagamangalam

Azure SQL Database Performance and Scale Out Deep Dive, Torsten Garbs and Michael Ray

BI Power Hour, Matt Masson and Matthew Roche

Building a Big Data Predictive Application, Nishant Thacker and Karan Gulati

Built for Speed: Database Application Design for Performance, Pam Lahoud

ColumnStore Index: SQL Server 2014 and Beyond, Sunil Agarwal and Jamie Reding

Connecting SAP ERP and Microsoft BI Platform, Sanjay Soni

Data-tier Considerations of Cloud-based Modern Application Design, Scott Klein

Deep Dive into Power Query Formula Language, Matt Masson and Theresa Palmer-Boroski

Deploying Hadoop in a Hybrid Environment, Matt Winkler

Deployment and best practices for Power BI for Office 365, Miguel Llopis

End-to-End Demos with Power BI, Kasper de Jonge and Sanjay Soni

HBase: Building real-time big data apps in the cloud, Maxim Lukiyanov

Improve Availability using Online Operations in SQL Server 2014, Ajay Jagannathan and Ravinder Vuppula

In-Memory OLTP in SQL Server 2014: End-to-End Migration, George Li

Interactive Data Visualization with Power View, Will Thompson

Introducing Azure Machine Learning, Raymond Laghaeian

Introduction to Azure HDInsight and Visual Studio customizations, Matt Winkler

Just in Time Data Analytics with SQL Server 2014, Binh Cao and Tomas Polanco

Leveraging SQL Server in Azure Virtual Machines Best Practices, Scott Klein

Life in the fast lane with Azure DocumentDB, Stephen Baron

Making the most of Azure Machine Learning end-to-end, Parmita Mehta

Managing 1 Million+ DBs – How Big Data is used to run SQL Azure, Conor Cunningham

Match the database to the data – from on prem to the cloud, Buck Woody

Microsoft Azure SQL Database – Resource Management, Mine Tokus

Migration and Deployment Principles for SQL Server in Azure VMs, Selcin Turkarslan

Polybase in the Modern Data Warehouse, Artin Avanes

Power BI Hybrid Data Access via Data Management Gateway, Luming Han and Mini Nair

Power View with Analysis Services Multidimensional Models, Kasper de Jonge

Real world Healthcare BI transformations in the cloud, Matt Smith and Michael Wilmot

SQL Server 2014 AlwaysOn (High Availability and Disaster Recovery), Luis Carlos Vargas Herring

SQL Server 2014 in-Memory OLTP - Memory/Storage Monitoring and Troubleshooting, Sunil Agarwal

SQL Server 2014 In-Memory OLTP Query Processing, Jos de Bruijn

SQL Server 2014 In-Memory OLTP Transaction Processing, Jos de Bruijn

SQL Server 2014: In-Memory Overview, Kevin Farlee

SQL Server Hybrid Features End to End, Xin Jin

SQL Server in Azure VM Roadmap, Luis Carlos Vargas Herring

To The Cloud, Infinity, & Beyond: Top 10 Lessons Learned at MSIT, Jimmy May

Upgrading and Migrating SQL Server, John Martin

What's New in Microsoft Power Query for Excel, Miguel Llopis

Who Dunnit? A Walk Around the SQL Server 2014 Audit Feature, Timothy McAliley and Michael Ray

 

Still want more? No problem. Check back November 5th for additional sessions and speakers.

Only 28 more days until PASS Summit. You won’t want to miss it!

Tuesday, October 7, 2014 11:01:00 AM

Thank you for participating in our Twitter sweepstakes.

As always, the creativity of our community never fails to amaze.  Congratulations to @dragonfurther for the winning haiku.

Notable mentions include entries from @John_Deardurff and @sdevanny.
We look forward to seeing everyone at PASS Summit 2014. Can’t make it in person? Don’t miss the live streaming of the keynotes on November 5th and 6th at www.passsummit.com.



Wednesday, October 1, 2014 11:12:19 AM

Faster transactions, faster queries and faster analytics. Sounds like nirvana, right? Just imagine it: your customers can find what they want from your large product catalog more quickly and purchase without huge lags. Your business divisions can perform timely analytics highlighting product, website, and data trends. It’s all possible with in-memory technologies, and Microsoft SQL Server 2014’s in-memory is the secret speed sauce you need to realize these benefits.

Microsoft SQL Server 2014 offers optimized in-memory technologies for transaction processing (OLTP), data warehousing and data analytics built right into the product. We have a long history with in-memory technologies in SQL Server (more on that in a subsequent blog post), and the enhancements we’ve made to the In-Memory ColumnStore provide greater data compression and increased performance, resulting in world-record benchmarks on industry standard hardware.
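In SQL Server 2014 the columnstore index became clustered and updateable. A minimal sketch of opting a table in — the dbo.FactSales fact table is hypothetical:

```sql
-- dbo.FactSales is a hypothetical fact table, used for illustration.
-- A clustered columnstore index stores the table column-wise, compressing
-- the data and speeding up scan-heavy data warehouse queries.
CREATE CLUSTERED COLUMNSTORE INDEX cci_FactSales
    ON dbo.FactSales;
```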

So what does all this mean for you? Significant performance gains for starters. Microsoft’s in-memory solution leads to up to 30x faster transactions, over 100x faster queries and reporting, and easy management of millions of rows of data in Excel. The following video highlights just how in-memory can help speed your business:

Of course, gains vary by situation, but check out a few of our customers and how they’ve benefited from the latest in-memory improvements in SQL Server 2014:

  • Nasdaq was able to decrease query times from days to minutes, while at the same time reducing storage costs by 10x.
  • Bwin, using our in-memory technology on standard commodity servers, was able to boost performance gains by 17x and queries by 340x.
  • EdgeNet realized near-real time inventory updates and higher customer satisfaction because of the 7x faster performance our in-memory gave them.
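As a sketch of what opting in to In-Memory OLTP looks like in SQL Server 2014 — the table and column names are hypothetical, and the database must already have a MEMORY_OPTIMIZED_DATA filegroup, which is omitted here:

```sql
-- Hypothetical table; assumes a database with a MEMORY_OPTIMIZED_DATA filegroup.
-- The table lives entirely in memory; SCHEMA_AND_DATA makes it fully durable.
CREATE TABLE dbo.ShoppingCart
(
    CartId      INT IDENTITY(1,1) NOT NULL
        PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
    UserId      INT       NOT NULL,
    CreatedUtc  DATETIME2 NOT NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
```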

Best of all, Microsoft’s in-memory solution is included in SQL Server 2014 at no additional cost. It can be used on industry-standard hardware, without the need for expensive upgrades, and there are no new development tools, management tools or APIs to learn. We invite you to visit http://www.microsoft.com/en-us/server-cloud/solutions/in-memory.aspx where you can see more about our in-memory solution, how customers are using it to speed their business, and how you can get started.

© 2008 - 2014 SQLTrainer.com, LLC | Powered by mojoPortal | Design by mitchinson