About Me

I now work for Microsoft Federal in Chevy Chase, MD.

Dedicated to providing customer-driven, results-focused solutions to the complex business problems of today... and tomorrow.

At SQLTrainer.com, LLC, we understand that the technical challenges businesses face today are greater in both scope and complexity than ever before. Businesses must understand both local IT infrastructures and cloud-based technologies.

What is SQLTrainer.com?

Founded in 1998 by Ted Malone, SQLTrainer.com is a technical consulting, training and content development firm dedicated to the following core principles:

  • Technology Alone is NOT the Answer! - Implementing a particular technology because it is interesting or "cool" will not solve customer problems.
  • Technology Solutions do NOT need to be Overly Complex! - Developers and technical practitioners often try to build the cleverest solution possible. While this may stroke the egos of those involved, it rarely yields a maintainable solution.
  • Consultants Should be Mentors First! - When looking to hire an external consultant, businesses should look to the consultant who's willing to train themselves out of a paycheck.

Why the name, SQLTrainer.com?

SQL (pronounced See-Quell) stands for Structured Query Language, which is at the heart of every modern relational database system. Since many technology solutions today rely on some form of database storage or interaction, it was only logical to incorporate SQL into the name of the organization. Given that one of our core principles is to be a mentor/trainer above everything else, the name SQLTrainer made sense. Since we also wanted to represent our embrace of the cloud, it seemed logical to add the ".com", referring to the biggest "cloud" of them all.

Live Feeds

Friday, March 20, 2015 9:00:00 AM

In honor of the upcoming PASS Business Analytics conference, we wanted to take some time to spotlight the great work happening in the SQL and BI communities across the world. The conference is focused on business analytics, but PASS offers many great community activities for SQL Server and beyond. Learn about the various local and digital opportunities to connect with the PASS community here.

Name: Grant Fritchey
Role: Product Evangelist, Red Gate Software
Location: Grafton, MA, USA

What is an exciting project that you’re working on right now?

I’m helping to build a set of classes to teach people how to automate their database deployments in support of Database Lifecycle Management. Development is moving faster and faster in order to keep up with the demands of business. Because of this, databases must also be deployed faster and faster. But, you still have to ensure the protection of the vital business information stored within your databases. In the class I’m working on, we’ll show you how to get your database into source control alongside your application and how to perform continuous integration with databases. We’re going to cover all sorts of mechanisms for automating database deployments and database testing in order to work Database Lifecycle Management right into your Application Lifecycle Management.

What are your current analytics and/or database challenges, and how are you solving them?

The main challenges we have with databases are the same ones we’ve always had: performance and uptime. The thing is, we have blazing fast hardware these days. Or, if you’re looking at online solutions like Azure, we have very large VMs as well as methods for sharing across servers and databases. All this means that the underlying architectures of our database systems can perform very well. But, we still have to deal with the database design and the T-SQL code being run against the database. More and more we’re taking advantage of ORM tools such as Entity Framework, which really do speed up development. But, around 10% of the queries still need to be coded by hand in order to ensure adequate performance. Add to this the fact that we need to deploy all this while still ensuring up-time on the databases… Figuring out how to get adequate functionality in place without affecting up-time is tough work.

How does data help you do your job better?

Decisions on what to do with systems need to be based on information, not guesses. Data gathered about my systems shows me where I need to prioritize my work and directs choices on resource allocation.

What’s your favorite example of how data has provided an insight, a decision, or a shift in how business gets done?

Recently I found that I was seeing a serious “observer effect” in how I was collecting performance data. While tuning queries I was using STATISTICS IO and STATISTICS TIME, as I normally do. As I adjusted the code, I wasn’t seeing the kind of performance improvements I expected. In fact, some of my solutions seemed to be performing even worse. I was a little surprised, because I thought I was following a good methodology, so I tried turning off all the STATISTICS capturing and just used Extended Events. Suddenly, the tuning started working extremely well. I went back and experimented until I discovered that for some of my queries, STATISTICS IO was actually impacting query execution, affecting both the time and the reads. Turning it off cleared the problem completely. I’ve now switched to using Extended Events most of the time in order to minimize, or eliminate, that issue. Best of all, I’m able to use them within Azure SQL Database as well as on my earthed servers.
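The two measurement approaches contrasted above can be sketched roughly as follows. This is illustrative only: the table, database, and session names are placeholders, not anything from the interview.

```sql
-- Approach 1: STATISTICS IO/TIME, which can itself perturb the measurement.
SET STATISTICS IO ON;
SET STATISTICS TIME ON;
SELECT OrderId, Total FROM dbo.Orders WHERE CustomerId = 42;  -- query under tuning
SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;

-- Approach 2: an Extended Events session, a lighter-weight observer.
CREATE EVENT SESSION [QueryTuning] ON SERVER
ADD EVENT sqlserver.sql_batch_completed(
    ACTION (sqlserver.sql_text)
    WHERE sqlserver.database_name = N'MyDatabase')
ADD TARGET package0.event_file(SET filename = N'QueryTuning.xel');

ALTER EVENT SESSION [QueryTuning] ON SERVER STATE = START;
-- ... run the queries being tuned, then examine the .xel file ...
ALTER EVENT SESSION [QueryTuning] ON SERVER STATE = STOP;
```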

What or who do you read, watch, or follow to help grow your data skills?

I go to SQLSkills.com over and over, sometimes multiple times in a day. It’s one of the single best resources for detailed SQL Server information. I also go to SQLServerCentral.com regularly to ask and answer questions. It’s a great resource for expanding your knowledge.

What’s your favorite SQL command and why?

RESTORE DATABASE: Because it has saved my job and the companies I’ve worked for so many times.
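For readers who haven’t leaned on it yet, a minimal point-in-time restore looks something like this (database and file names are hypothetical):

```sql
-- Restore the full backup without recovering, so log backups can be applied.
RESTORE DATABASE [Sales]
    FROM DISK = N'D:\Backups\Sales_full.bak'
    WITH NORECOVERY, REPLACE;

-- Roll the log forward to just before the mistake, then bring the DB online.
RESTORE LOG [Sales]
    FROM DISK = N'D:\Backups\Sales_log.trn'
    WITH STOPAT = N'2015-03-20T08:55:00', RECOVERY;
```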

How does Azure help you protect your local databases?

There are a couple of ways you can use Azure to extend local capabilities. The first, and probably the easiest, is to use Azure Blob Storage as a means of ensuring that you have off-site storage of your backup files. You could pretty easily write a PowerShell script that copies your backups to Azure Storage. But, starting in SQL Server 2012, you can also issue a backup command to go straight to Azure Storage. Either way, you can be sure there’s a copy of your backups in case you suffer a catastrophic event locally.
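The backup-to-URL path described above can be sketched in T-SQL like this. The storage account, container, and key are placeholders; a real script would substitute your own values.

```sql
-- One-time setup: a credential holding the storage account access key.
CREATE CREDENTIAL [AzureBackupCredential]
    WITH IDENTITY = N'mystorageaccount',
    SECRET = N'<storage-account-access-key>';

-- Back up straight to Azure Blob Storage (SQL Server 2012 and later).
BACKUP DATABASE [Sales]
    TO URL = N'https://mystorageaccount.blob.core.windows.net/backups/Sales.bak'
    WITH CREDENTIAL = N'AzureBackupCredential', COMPRESSION;
```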

Another way to extend your local capabilities to the cloud is to set up a virtual network. You can incorporate Azure Virtual Machines directly into your local network. Because of this, you can set up Availability Groups between Azure VMs and your local machines. This would enable you to have a failover setup to Azure, allowing for additional protection of your data and your systems.

Are there other ways Azure can be used in combination with local databases?

It’s my opinion that every developer should be using a local copy of SQL Server for their development. This is to allow them to experiment, learn, and, well, break stuff, without affecting anyone else. But, some laptops might be underpowered, or this could in some way violate a corporate policy. As a workaround, people can take advantage of the fact that SQL Database covers the vast majority of standard SQL Server functionality, at the database level. This makes it a great place to develop and test databases, especially if you’re already developing applications for Azure. You only need to keep the database around while you’re developing, and because you can keep the size small, the costs are extremely minimal.

Any other benefits for Azure in combination with local machines?

Tons. For one, expanded capacity. What if you need to get a lot more servers online quickly, but you’re hurting on disk space, or the servers are on back-order? Go back to that virtual network we talked about earlier. Set that up and now you can very quickly, even in an automated fashion through PowerShell, add SQL Server machines to your existing systems.

Another thing you could do, although this is not something I’ve tried yet, is take advantage of the fact that in SQL Server 2014 you can actually add file groups that live in Azure Blob Storage. Do you need extra disks, right now, that you can’t get from the SAN team? Well, if you can afford a bit of latency, you can expand immediately into Azure Storage. I’d certainly be cautious with this one, but it’s exciting to think about the expanded capabilities it offers for dealing with certain kinds of disk-space emergencies.
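As a rough sketch of that SQL Server 2014 capability (storage account, SAS token, and database names are all placeholders): data files in Azure Blob Storage require a credential named for the container URL and backed by a Shared Access Signature.

```sql
-- Credential named after the container; secret is a SAS token for it.
CREATE CREDENTIAL [https://mystorageaccount.blob.core.windows.net/data]
    WITH IDENTITY = N'SHARED ACCESS SIGNATURE',
    SECRET = N'<sas-token>';

-- Add a filegroup whose file physically lives in Azure Blob Storage.
ALTER DATABASE [Sales] ADD FILEGROUP [AzureFG];
ALTER DATABASE [Sales] ADD FILE (
    NAME = N'Sales_Azure',
    FILENAME = N'https://mystorageaccount.blob.core.windows.net/data/Sales_Azure.mdf',
    SIZE = 10GB
) TO FILEGROUP [AzureFG];
```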

Thanks for joining us, Grant!

Know someone doing cool work with data? Nominate them for a spotlight in the comments.

Wednesday, March 18, 2015 9:00:00 AM

This is a guest blog post from Thomas LaRock, the President of PASS (Professional Association for SQL Server)

It’s no secret that the role of data in the IT industry, in business, and in the world at large is changing at a rapid pace. As technology continues to become a more integrated and integral part of our lives, the value of data continues to rise.

At PASS we have a 16-year history of empowering IT professionals who use Microsoft data technologies. The SQL Server community is the largest and most engaged group of data pros in the world. PASS volunteers and the PASS Board of Directors work together to help the PASS community succeed in connecting, sharing, and learning from one another. A big part of that effort is keeping an eye on the future of the data profession.

What we see is that data analytics is the next frontier for professionals passionate about data. The growth of Big (and Little) Data, the advent of cloud computing, and advances in machine learning are all areas that present challenges for our community. Data analysts, data scientists, and line-of-business managers are in high demand as organizations realize the potential of collecting and understanding the wealth of information that is available from a variety of sources.

PASS is dedicated to helping our members harness the technologies, skills, and networks that are the foundation of solid and successful careers. We believe that keeping up with industry advances is a vital skill for all data professionals. Setting and achieving new goals as well as learning new ways of working with data is a must.

Whether you’re coming from a background in SQL Server, business intelligence, or social media, there are specific cornerstones of turning all this data into something that can benefit your organization. We call this the “analyst’s journey.”

One such cornerstone is data discovery and integration. We want our members to be aware of the latest technologies in collecting, modeling, and preparing data for analysis. Next is data analysis and interpretation. We want to help our members understand the techniques and tools that enable sophisticated analysis, prediction, and optimization. Then there’s visualization: the creative side of things, where we get into report and dashboard design.

As with any career another key skillset is communication. The people who analyze and work with data are in the best position to help gain executive buy-in for data-driven business decisions. For years PASS has been the leader in helping data professionals improve their communication and soft skills.

One way in which we’re reaching out to those who want to learn more about analytics is the PASS Business Analytics Conference. This premier event brings together a stellar lineup of business and analytics speakers, including our keynote speakers Carlo Ratti and BI Brainz founder Mico Yuk. We have created a series of webinars and a Meet the Expert interview series to give people an idea of what the conference will offer. We also have replays from last year’s conference, and we have hours of training available through our Virtual Chapters.

We’re excited about data and analytics and we’re hearing from more and more SQL Server pros who share that excitement.

It’s a wonderful time to be a data professional.

See you in Santa Clara!

Thomas LaRock

President, PASS


Thursday, March 12, 2015 12:00:00 PM

As of March 11, 2015, SAP has certified support for SAP NetWeaver-based applications on Microsoft SQL Server 2014.  Now you can run even more of your Tier-1, mission-critical workloads on SQL Server.  And, the ability to run SAP on Microsoft Azure means that it can be accomplished with low total cost of ownership (TCO).

SQL Server 2014 provides the higher scale, availability, and breakthrough performance needed for your most demanding SAP workloads. The updatable in-memory columnstore delivers blazing-fast query performance for your SAP Business Warehouse (BW).  SQL Server AlwaysOn availability groups help meet the reliability and availability requirements of SAP systems by enabling multiple, readable secondaries that can be used for failover and for read workloads such as reporting and backup.
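To illustrate what "updatable" means here (the table below is purely hypothetical, in the style of a BW fact table): SQL Server 2014's clustered columnstore accepts inserts and updates directly, unlike the read-only nonclustered columnstore introduced in SQL Server 2012.

```sql
-- Hypothetical fact table with an updatable clustered columnstore index.
CREATE TABLE dbo.FactSales (
    DateKey     INT   NOT NULL,
    ProductKey  INT   NOT NULL,
    Quantity    INT   NOT NULL,
    Amount      MONEY NOT NULL
);

CREATE CLUSTERED COLUMNSTORE INDEX CCI_FactSales ON dbo.FactSales;

-- New in 2014: DML is allowed against the columnstore-organized table.
INSERT INTO dbo.FactSales VALUES (20150312, 42, 10, 99.90);
```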

With SAP’s certification, you can also run SAP in Microsoft Azure Virtual Machines with SQL Server 2014. Azure enables SAP customers to reduce TCO by leveraging Microsoft infrastructure as system needs grow, rather than investing in additional servers and storage.  With Microsoft Azure, customers can leverage development and test environments in the cloud that can be spun up and scaled out as needed.  SQL Server 2014 also introduced Disaster Recovery to Azure using an asynchronous AlwaysOn secondary, which can make Azure a part of your SAP disaster recovery plan.

With the certification, customers can now adopt SQL Server 2014 for mission-critical SAP workloads, and we look forward to telling you their stories soon. Here are some customers who are taking advantage of SAP on SQL Server today:

  • Quanta Computer Boosts Performance of Its SAP ERP System with In-Memory Technology
  • Zespri International Prunes Costs, Defends Business from Disasters by Running SAP in the Cloud
  • Saudi Electric Company Improves Query Times by 75 Percent, Can Respond Faster to Customers
  • Mitsui & Co. Deploys High-Availability and Disaster-Recovery Solution After Earthquake

Many companies are already betting their mission-critical apps on SQL Server 2014. To read about Microsoft’s leader position for business-critical operational database management and data warehouse workloads, see Gartner's Magic Quadrant for Operational Database Management Systems and Magic Quadrant for Data Warehouse Database Management Systems reports.  For more on how customers are already using SQL Server 2014 for mission-critical applications, see the case studies listed above.

For more about the powerful combination of Microsoft and SAP, visit http://www.microsoft.com/SAP.  To get started with SQL Server 2014, click here.

Tuesday, March 10, 2015 9:00:00 AM

Hybrid cloud solutions offer the best of both worlds: you get the flexible power of the cloud combined with the tight control of localized datacenters. SQL Server offers tons of out-of-the-box hybrid capabilities, and Microsoft Principal PM Nosheen Syed will show you how to make the most of them in his Ignite session "Microsoft SQL Server to Microsoft Azure Virtual Machines: Hybrid Story."

You'll learn all about:

  • Using Managed Backup to take charge of data storage sensibly
  • Boosting IT efficiency with Azure Replica Wizard
  • Easy "lift and shift" migration of on-site SQL Server workloads to Azure Virtual Machines
  • Backup to Block Blob, which is as easy to use as it is difficult to say

There’s plenty more happening at Ignite. Get email updates by subscribing here. And if you haven’t already, register for Ignite here.

Monday, March 9, 2015 1:00:00 PM

As part of SQL Server’s ongoing interoperability program, we are pleased to announce an updated Microsoft SQL Server driver for PHP.  The new driver, which supports PHP 5.6, is now available!

This driver allows developers who use the PHP scripting language to access Microsoft SQL Server and Microsoft Azure SQL Database, and to take advantage of new features implemented in ODBC. The new version works with Microsoft ODBC Driver 11 or higher.

You can download the PHP driver here.  We invite you to explore the latest the Microsoft Data Platform has to offer via a trial evaluation of Microsoft SQL Server 2014, or by trying the new preview of Microsoft Azure SQL Database.

Friday, March 6, 2015 3:00:00 PM

In January, we released the SQL Automated Backup and SQL Automated Patching services for SQL Server Virtual Machines in Azure. These services automate the processes of backing up and patching your SQL Server VMs. In that release, you were able to configure these services in the Azure Preview Portal when provisioning a new SQL Server 2014 Enterprise or Standard VM. You could also configure these services in an existing Virtual Machine via PowerShell commandlets.

We have now expanded the experience so you can configure these services on an existing SQL Server VM in the Azure Preview Portal. Whether or not you have already enabled these services inside your Virtual Machine, you can go to that VM in the Azure Preview Portal and either update your configuration or create a new one for each service. You will find both services under the Configuration label, shown in Figure 1 below.

Figure 1. Configuration label has both services

Try these services out in the Azure Portal, and check out the documentation for further details.

Thursday, March 5, 2015 12:00:00 PM

Guest post by Tiffany Wissner, Senior Director, Data Platform

As part of our commitment to delivering a world-class data platform for our customers, I am excited to announce the general availability of Azure Search and Azure DocumentDB to support the search and unstructured data needs of today’s modern cloud applications.

  • Generally available today, Azure Search is a fully managed search service that enables developers to easily add search capabilities to web and mobile applications.
  • We are also announcing the April 8 general availability of Azure DocumentDB, our NoSQL document database-as-a-service.

These new data services extend our investments in a broad portfolio of solutions to unlock insights from data. Azure is at the center of our strategy, offering customers scale, simplicity and great economics. And we continue to make it easier for customers to work with data of any type and size – using the tools, languages and frameworks they want – in a trusted cloud environment.

Azure Search now generally available

Azure Search helps developers build sophisticated search experiences into web and mobile applications. It reduces the friction and complexity of implementing full-text search and helps developers differentiate their applications through powerful features not available with other search packages. As an example, we are adding enhanced multi-language support for more than 50 languages, built on our many years of natural language processing experience from products like Microsoft Office and Bing. With general availability, Azure Search now offers customers the ability to more easily load data from Azure DocumentDB, Azure SQL Database, and SQL Server running in Azure VMs into Azure Search using new indexers. Plus, a .NET software development kit (SDK) is now available to make working with Azure Search a more familiar experience.

JLL, a professional services and investment management company that specializes in commercial real estate, is using Azure Search to enable search at scale, something that was previously difficult to achieve.

“It has always been our plan to have our entire web listings platform in the cloud, but the only component that stopped us from doing that was search. Now, with the Azure Search service, we can realize our goal,” said Sridhar Potineni, director of Innovation at JLL. 

Gjirafa, a full-text web search engine and news aggregator specialized in the Albanian language, is also using Azure Search to take on web search giants. Gjirafa is able to use Azure Search to prioritize results that directly tie to their business model.

“Using Azure Search to pre-process the language, we can determine the exact meaning of a phrase before returning the search results. This, along with our local data, means we can serve the Albanian market better than the big search engines,” said Mergim Cahani, founder and CEO of Gjirafa.

Azure DocumentDB generally available next month

Azure DocumentDB offers rich query and transactional processing over a schema-free JavaScript Object Notation (JSON) data model, which helps enable rapid development and high performance for cloud-born data and applications. Offered in units that scale to meet application performance and storage needs, Azure DocumentDB allows customers to quickly build, grow, and scale cloud applications. The global reach of Azure datacenters ensures that data can scale with the global growth of the application.

New at general availability are flexible performance levels within our standard tier which allow fine control over throughput and cost for data depending on application needs. Azure DocumentDB will be available in three standard performance levels: S1, S2, and S3. Collections of data within a DocumentDB database can be assigned to different performance levels allowing you to purchase only the performance you need. In preview, DocumentDB has been used for a wide variety of scenarios including telemetry and logging data, event and workflow data, device and app configuration data, and user-generated content.

Customers such as Telenor and News Republic are currently using Azure DocumentDB to store and query schema-free event and app configuration data. Telenor, based in Fornebu, Norway, is a mobile operator with subscribers in 27 nations. To attract customers to their service, the company used Azure DocumentDB to get a promotion up and running quickly and track user sign ups.

“With Azure DocumentDB, we didn’t have to say ‘no’ to the business, and we weren’t a bottleneck to launching the promotion—in fact, we came in ahead of schedule,” said Andreas Helland, Mobility architect at Telenor.

News Republic, a free mobile app that aggregates news and delivers it to more than 1 million users in 15 countries, uses Azure DocumentDB to make its app more interactive and create more user-focused features.

“Many people read the news passively, but we have built personalization and interactivity into our app with Azure DocumentDB. This is definitely a great way to get more people using the app and keep existing users interested,” said Marc Tonnes, database administrator for News Republic.

Microsoft data services

The value provided by our data services multiplies when customers use them together. For example, Xomni uses both Azure DocumentDB and Azure Search to create omni-channel marketing experiences for top-tier retailers like Gamestop. Likewise, we are committed to making it easy for customers to connect these services to others within the data platform through tools like the Search Indexer (to connect with SQL Database, Azure DocumentDB, and SQL Server in an Azure VM) and the recently announced Azure DocumentDB Hadoop Connector.  By making it simpler to connect data sources across Azure, we want to make it easier to implement big data and Internet-of-Things solutions.

Azure data services provide unparalleled choice for businesses, developers and IT pros, with a variety of managed services from Microsoft and our partners that work together seamlessly and connect to our customers’ data platform investments, from relational data to non-relational data, structured data to unstructured data, constant and evolving data models. I encourage you to try out our new and expanded Azure data services and let us know what you think.

Tuesday, March 3, 2015 10:00:00 AM

You've probably already heard why you should use the in-memory tech built into Microsoft SQL Server: 30 times faster transaction speed, 100 times the query speed, and millions of rows of data analyzed every second. If you want to learn more about the how, check out the Microsoft Ignite session with the no-nonsense title, In-Memory Technologies Overview.

Microsoft SQL Program Manager Kevin Farlee will walk you through all three in-memory technologies contained within SQL Server:

  • In-memory online transaction processing (OLTP)
  • In-memory data warehousing
  • Buffer pool extension (BPE)

He'll cover the thinking behind their inclusion in SQL Server, take a look at the tech behind the scenes, and explore real-world customer benefits. You'll come away with a deeper understanding of these powerful technologies, knowing exactly how to put them to work for your team.
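For a concrete taste of the first of those technologies, In-Memory OLTP, here is a minimal sketch. Database, filegroup, path, and table names are all illustrative; a memory-optimized table requires a MEMORY_OPTIMIZED_DATA filegroup first.

```sql
-- One-time database setup for In-Memory OLTP (SQL Server 2014).
ALTER DATABASE [Sales] ADD FILEGROUP [InMemFG] CONTAINS MEMORY_OPTIMIZED_DATA;
ALTER DATABASE [Sales] ADD FILE (NAME = N'InMemData', FILENAME = N'D:\Data\InMem')
    TO FILEGROUP [InMemFG];

-- A durable memory-optimized table with a hash index on its key.
CREATE TABLE dbo.SessionState (
    SessionId INT NOT NULL
        PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
    Payload   NVARCHAR(4000) NOT NULL
) WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
```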

There’s plenty more happening at Ignite. Get email updates by subscribing here. And if you haven’t already, register for Ignite here.

Friday, February 27, 2015 10:59:00 AM

If you are running older versions of SQL Server or use another database management system as your primary data platform, you may be wondering what the latest version of SQL Server has to offer. Upgrading to the latest version of SQL Server enables breakthrough performance, availability and manageability for your mission-critical applications – an investment that could pay you back in as little as 9.5 months.* Learn more about how SQL Server 2012 and 2014 can help run your tier-1, OLTP applications while maintaining lower costs, strong management and high security.

Join our guest speakers, Forrester Research Principal Analyst Noel Yuhanna, Senior TEI Consultant Anish Shah and Principal TEI Consultant Sean Owens for a webinar on SQL Server economics based on research findings from a recent Microsoft-commissioned study*.

Register to view this complimentary webinar to learn about:

  • The SQL Server tier one application market growth and outlook
  • The importance of security and management when considering a database platform
  • Forrester Consulting’s Total Economic Impact commissioned research on SQL Server for data applications
  • Key costs and benefits organizations should consider when evaluating SQL Server for data-intensive applications
  • Customer experience benefits, opportunities and cost savings enabled by SQL Server

Join us on Wednesday, March 4 at 10:00am Pacific. Register now for this webinar on the economic impact of SQL Server 2012 and 2014 to your business.

*Forrester Consulting, The Total Economic Impact(TM) of Microsoft SQL Server, a commissioned study conducted on behalf of Microsoft, July 2014

Wednesday, February 18, 2015 11:00:00 AM

Today at Strata + Hadoop World, Microsoft is announcing the public preview of Azure HDInsight running on Linux, the general availability of Storm on HDInsight, the general availability of Azure Machine Learning, and the availability of Informatica technology on Azure. These new services are part of our continued investment in a broad portfolio of solutions to unlock insights from data.

Head over to the announcement blogs to read more about these exciting developments:

© 2008 - 2015 SQLTrainer.com, LLC | Powered by mojoPortal | Design by mitchinson