About Me

I now work for Microsoft Federal in Chevy Chase, MD.

Dedicated to providing customer-driven, results-focused solutions to the complex business problems of today... and tomorrow.

At SQLTrainer.com, LLC, we understand that the technical challenges businesses face today are greater in both scope and complexity than ever before. Businesses must understand both local IT infrastructures and cloud-based technologies.

What is SQLTrainer.com?

Founded in 1998 by Ted Malone, SQLTrainer.com is a technical consulting, training and content development firm dedicated to the following core principles:

  • Technology Alone is NOT the Answer! - Implementing a particular technology because it is interesting or "cool" will not solve customer problems.
  • Technology Solutions do NOT need to be Overly Complex! - Developers and technical practitioners often attempt to build the cleverest solution possible. While this may stroke the egos of those involved, it doesn't produce a maintainable solution.
  • Consultants Should be Mentors First! - When looking to hire an external consultant, businesses should look to the consultant who's willing to train themselves out of a paycheck.

Why the name, SQLTrainer.com?

SQL (pronounced See-Quell) stands for Structured Query Language, which is at the heart of every modern relational database system. Since many technology solutions today rely on some form of database storage or interaction, it was only logical to incorporate SQL into the name of the organization. Given that one of our core principles is to be a mentor/trainer above everything else, the name SQLTrainer made sense. Since we also wanted to represent our embrace of the cloud, it seemed logical to add the ".com", referring to the biggest "cloud" of them all.

Live Feeds

Thursday, April 23, 2015 1:00:00 PM

Last week we announced the General Availability of Azure Premium Storage for Azure Virtual Machines.

Premium Storage provides steady high throughput (up to 64,000 IOPS; 8x more than Standard Storage) at low latency (single-digit milliseconds; 8x lower than Standard Storage). This enables enterprise SQL Server workloads (OLTP or data warehousing) that need consistent high performance to run on Azure VMs.

During the preview of Azure Premium Storage we worked with many SQL Server customers with workloads of different sizes to ensure that Premium Storage satisfied their requirements on Azure VMs. Here are some examples of customer results:

  • Transaction latency for thousands of concurrent users consistently within 10ms
  • Query times over large data sets reduced from minutes in standard storage to seconds
  • Batch loads for millions of records reduced from hours in standard storage to minutes
  • Backup/restore times on large databases reduced from many hours in standard storage to less than one hour

Azure Premium Storage and SQL Server

Premium Storage is based on Solid State Disks (SSDs) in the storage backend, dedicated fast connections between the storage backend and new compute clusters, and VMs' local read-only caches, which are also SSD-based. Writes are sent to the backend, which guarantees their persistence via 3 copies, and each write also updates the VM's read-only cache. Reads that can be served from the cache return immediately; the rest are served quickly from the backend, updating the cache as a result. More details here.

VMs using Premium Storage get a guaranteed higher storage bandwidth to serve writes and reads. Reads served from the cache are not counted towards this bandwidth. The high bandwidth allows writing and reading more data per second to the storage. This increases transaction throughput and reduces the time for query scans and other operations such as backup/restore, batch loads, and index rebuilds.

In one PerfMon capture, a SQL Server backup consistently read and wrote ~500 MB/s.

The main benefit of the fast storage writes is lower SQL Server transaction latency, achieved by shortening the time to synchronously write commit records to the log file. This benefits both standalone and AlwaysOn configurations, where the secondary must acknowledge writing commit records. Fast storage writes also reduce the time for other SQL Server write operations, such as checkpoints (asynchronously writing dirty pages to disk) and log redo on AlwaysOn secondaries.

The main benefit of fast reads is lower SQL Server query time, achieved by shortening the time to retrieve data pages, especially when they are served from the read-only cache. In addition, the higher storage bandwidth helps retrieve more data pages. The read-only cache benefits data files, as data pages are read very frequently; there is no benefit for log files, as log records are only read during infrequent operations (e.g. backups).

Another PerfMon capture showed a SQL Server workload executing an average of 9K batch requests per second, which accounted for 20K reads and 17K writes per second (37K IOPS). The average read latency was just 1 ms with a max of 6 ms, and the average write latency just 3 ms with a max of 10 ms.

Azure Premium Storage Options

There are 3 types of Premium Storage disks to choose from: P10, P20, and P30. The type of a disk is determined by its size, and each type is assigned different IOPS and bandwidth limits:

Disk Type | Disk Size | Storage IOPS | Storage Bandwidth (MB/s)
P10       | 128 GB    | 500          | 100
P20       | 512 GB    | 2,300        | 150
P30       | 1,024 GB  | 5,000        | 200

 

To support Premium Storage, there is a new series of VMs called DS-Series. The capabilities of these VMs are below:

VM Size | CPU Cores | Max Storage Disks | Max Storage Space | Max Storage IOPS* | Max Storage Bandwidth (MB/s) | Cache Size (GB)
DS1     | 1         | 2                 | 2 TB              | 3,200             | 32                           | 43
DS2     | 2         | 4                 | 4 TB              | 6,400             | 64                           | 86
DS3     | 4         | 8                 | 8 TB              | 12,800            | 128                          | 172
DS4     | 8         | 16                | 16 TB             | 25,600            | 256                          | 344
DS11    | 2         | 4                 | 4 TB              | 6,400             | 64                           | 72
DS12    | 4         | 8                 | 8 TB              | 12,800            | 128                          | 144
DS13    | 8         | 16                | 16 TB             | 25,600            | 256                          | 288
DS14    | 16        | 32                | 32 TB             | 50,000            | 512                          | 576

  * Doesn’t include IOPS served directly from the VM read-only cache

Notice that the total IOPS and bandwidth available depend on the combination of VM size, the number of disks, and the sizes of those disks.

Consider the size of your database, workload requirements, and pricing when choosing among the above. Notice that a VM can have disks of different sizes, and it’s even possible to mix disks from Premium and Standard Storage. More details here.
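For example, three P30 disks can together supply 15,000 IOPS (3 × 5,000), but on a DS3 VM they would be capped at the VM's 12,800 IOPS limit; on a DS4 (25,600 IOPS) the same disks could deliver their full combined throughput.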

Creating a new SQL Server Virtual Machine using Premium Storage

  1. Go to the new Azure Portal
  2. Create a storage account of type Premium Locally Redundant

    Notice that there is a limit of 32TB per storage account. If you need more storage, then create more storage accounts.
  3. Create a new VM using a SQL Server Image from the Gallery, specifying a DS-Series VM, and the Premium Storage account that you previously created (type PREMIUM-LRS). Notice that this VM can’t be added to resource groups that have other VM Series (DS-Series are hosted by new compute clusters).

  4. Attach disks to the VM
    Select the VM that you previously created, go to Disks, and select Attach New. Choose the Premium Storage account that you previously created, a container for the disk (by default vhds), the disk file name, size, and caching. A common basic configuration is using 3 disks, one for data, another for log, and another for TempDB.

Migrating an existing SQL Server to Premium Storage

Notice that it’s not possible to upgrade an existing Standard Storage account to Premium Storage, and that DS-Series VMs can’t be added to a resource group that contains other VM series.

To migrate an existing SQL Server to Premium Storage, create a new DS-Series VM that uses a Premium Storage account. Then back up and restore your databases and copy your SQL Server configuration (equivalent to a side-by-side migration).

To reduce downtime during the migration to a few minutes (a T-SQL sketch follows these steps):

  1. Take a full backup of the databases and restore them to the new SQL VM
  2. Disconnect the clients from the databases in the old SQL VM
    ALTER DATABASE [<database_name>] SET SINGLE_USER WITH ROLLBACK AFTER 20 SECONDS;
  3. Take a log backup of the databases (for any final transactions) and restore them to the new SQL VM
  4. Change the clients’ connection strings to point to the new SQL VM
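A minimal T-SQL sketch of steps 1-3, assuming a database named SalesDb and a backup share reachable from both VMs (names and paths are illustrative):

-- Step 1: full backup on the old VM, then restore on the new VM without recovery
BACKUP DATABASE SalesDb TO DISK = N'\\backupshare\SalesDb_full.bak';
-- on the new VM:
RESTORE DATABASE SalesDb FROM DISK = N'\\backupshare\SalesDb_full.bak' WITH NORECOVERY;

-- Step 2: disconnect clients on the old VM
ALTER DATABASE SalesDb SET SINGLE_USER WITH ROLLBACK AFTER 20 SECONDS;

-- Step 3: capture final transactions in a log backup, then bring the new copy online
BACKUP LOG SalesDb TO DISK = N'\\backupshare\SalesDb_tail.trn';
-- on the new VM:
RESTORE LOG SalesDb FROM DISK = N'\\backupshare\SalesDb_tail.trn' WITH RECOVERY;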

If you are using SQL AlwaysOn Availability Groups you can minimize downtime during the migration to seconds. Availability Groups allow you to failover a group of databases from a primary SQL Server replica to a secondary SQL Server replica in seconds without data loss. In addition, applications connect to the primary replica using a listener (virtual network name), so their connection string doesn’t need to change.

You can add a synchronous secondary SQL Server replica in a DS-Series VM that uses Premium Storage and failover to it. Notice that you will need to add the secondary replica VM to the same Windows Domain and the same Windows Cluster as the primary replica. In addition, you will need to create an endpoint for the secondary replica VM and add it to the load balancer supporting the Availability Group listener.
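Once the Premium Storage secondary is synchronized, the failover itself is a single command run on that secondary; a sketch assuming an availability group named AG1:

-- Run on the new DS-Series secondary once it reports SYNCHRONIZED
ALTER AVAILABILITY GROUP [AG1] FAILOVER;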

More details here.

Performance Best Practices

Most existing performance best practices apply to Premium Storage. Premium Storage disks have much higher bandwidth and IOPS limits than Standard Storage disks, so a smaller number of Premium Storage disks will satisfy the same performance requirements (especially P30 disks). Consider the bandwidth and IOPS limits of the DS-Series VM sizes when determining the number and types of disks.

To get the highest performance (a T-SQL file-layout sketch follows this list):

  1. Use a Premium storage account and VM in the same region
  2. Use separate disks for data files, log files, and TempDB files
  3. Enable the read-only cache for data disks and TempDB disks, but not for log disks
  4. If you need higher bandwidth or IOPS: use Storage Spaces over multiple disks to aggregate their IOPS, bandwidth, and storage space. Use separate Storage Spaces for data and log. Depending on your TempDB requirements you could put TempDB in the storage pool for data files or in a different pool.
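A minimal sketch of item 2, assuming the data, log, and TempDB disks are mounted as F:, G:, and H: (drive letters and names are illustrative, and the target folders must already exist):

-- Data and log files on separate Premium Storage disks
CREATE DATABASE SalesDb
ON PRIMARY (NAME = SalesDb_data, FILENAME = N'F:\Data\SalesDb.mdf')
LOG ON (NAME = SalesDb_log, FILENAME = N'G:\Log\SalesDb.ldf');

-- Move TempDB to its own disk (takes effect after a SQL Server restart)
ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, FILENAME = N'H:\TempDB\tempdb.mdf');
ALTER DATABASE tempdb MODIFY FILE (NAME = templog, FILENAME = N'H:\TempDB\templog.ldf');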

Summary

Premium Storage provides steady high throughput at low latency. This enables enterprise SQL Server workloads (OLTP or data warehousing) that need consistent high performance to run on Azure VMs.

Many SQL Server customers of different workload sizes have satisfied their requirements on Azure VM using Premium Storage. We hope that you will too!

Premium Storage is available in the following regions: West US, East US 2, West Europe, East China, Southeast Asia, West Japan. It’ll become available in other regions soon.

 

Learn more about SQL Server on Azure VM and try Premium Storage today!

Tuesday, April 21, 2015 9:00:00 AM

Don’t get caught off-guard.

Upgrading to Microsoft SQL Server 2014 helps you maintain security and compliance, gain faster data insights, and optimize your data infrastructure—all using familiar tools you already know and trust.

Be prepared when extended support for SQL Server 2005 ends. Protect your data with a thoughtful upgrade plan and mitigate cost and risk. Read this complimentary report today to learn more about the robust backward compatibility features and tools that enable a range of migration options. 

Read Directions on Microsoft’s report: Migrating from SQL Server 2005

Friday, April 17, 2015 5:40:15 PM

On Wednesday, April 15, Microsoft launched SQL Server 2014 Service Pack 1 (SP1).  Shortly after release we discovered an installation issue.  As a result, we have put the Service Pack downloads on hold.

SQL Server 2014 (SP1) will be re-released in the next few weeks. In the meantime, we recommend that you continue using SQL Server 2014 and its Cumulative Updates.

If you have encountered an issue during installation of the Service Pack, this article on the Release Services Blog will provide workaround steps.

Thursday, April 16, 2015 9:00:00 AM

In honor of the upcoming PASS Business Analytics conference, we wanted to take some time to spotlight the great work happening in the SQL and BI communities across the world. The conference is focused on business analytics, but PASS offers many great community activities for SQL Server and beyond. Learn about the various local and digital opportunities to connect with the PASS community here.

Name: Luan Moreno Medeiros Maciel
Role: SQL Server Database Consultant at The Pythian Group
Location: Brazil

 

What is an exciting project that you’re working on right now?

I have a few interesting projects going on right now…

Hybrid Cloud on SQL Server: This project consists of a hybrid environment spanning Amazon's cloud and the client's local datacenter. I'm designing this solution using Database Mirroring on SQL Server 2012. After the implementation, the client will have a highly available solution spanning the two sites.
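A minimal sketch of the mirroring pairing described above, with server and database names assumed for illustration (endpoint creation and the NORECOVERY restore on the mirror are omitted):

-- On the mirror (cloud) server first, point at the principal's endpoint
ALTER DATABASE SalesDb SET PARTNER = 'TCP://principal.datacenter.local:5022';
-- Then on the principal (on-premises) server, point at the mirror's endpoint
ALTER DATABASE SalesDb SET PARTNER = 'TCP://mirror.cloud.example.com:5022';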

In-Memory OLTP (a.k.a. “Hekaton”) Feature Implementation: Implementing In-Memory tables for a mid-sized company. The idea is to provide a big In-Memory table that will absorb all the requests and then, at a specific time of day, flush the data to disk-based tables.

Partitioning in SQL Server 2012: Partitioning a large table to simplify maintenance and speed up data retrieval. The data will be split by month.
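A sketch of what a monthly split can look like in T-SQL (object and column names are assumptions; boundary dates are illustrative):

-- Monthly partition function and scheme
CREATE PARTITION FUNCTION pfMonthly (datetime)
AS RANGE RIGHT FOR VALUES ('2015-01-01', '2015-02-01', '2015-03-01');

CREATE PARTITION SCHEME psMonthly
AS PARTITION pfMonthly ALL TO ([PRIMARY]);

-- Rebuild the big table's clustered index on the scheme, keyed by date
CREATE CLUSTERED INDEX cix_Orders_OrderDate ON dbo.Orders (OrderDate)
ON psMonthly (OrderDate);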

Creating DW (Data Warehouse) – Educational Company: Building the complete data warehouse process for a client who needs to analyze data with visualization tools in Excel (Power View, Power Map, and Power Pivot).

 

What are your current analytics and/or database challenges, and how are you solving them?

I have a few challenges I am currently navigating. The first is determining the proper tables to use in an In-Memory implementation. To help choose the correct tables to put In-Memory, I am using the Data Collector to analyze the client workload and generate insights. The second is creating the right table partitioning to improve one of my clients’ weekly maintenance. To do this I am analyzing the client server/database/maintenance to check what options are available and to determine the specific timeframe to apply the changes. I am also consulting some DMVs and the msdb database to check the jobs on this instance.
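For illustration, the msdb check described might look something like this (a sketch, not the actual queries used):

-- List SQL Agent jobs on the instance to find the maintenance windows
SELECT j.name, j.enabled, j.date_created
FROM msdb.dbo.sysjobs AS j
ORDER BY j.name;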

 

How does data help you do your job better?

Data is the center of my globe; I need to analyze, maintain, sustain, and provide insights and solutions to my clients. All of this can happen because of data.

 

What types of data-related challenges do you frequently see with customers, and how are you solving them?

Working with customers requires you to understand the customer environment, check what you have available to accomplish the work required, and then provide the best solution possible.

 

What’s your favorite example of how data has provided an insight, a decision, or a shift in how business gets done?

One of my clients had several problems related to data analysis. Because their data was not organized correctly in their database, the client couldn’t provide solutions to front-end users. To solve the problem, my team built a Data Quality Services structure with the client, organizing the data so that insights from the OLTP system could be surfaced. As a result of these improvements, the client began generating their own insights from the newly reorganized data.

 

What or who do you read, watch, or follow to help grow your data skills? (book, blog, Twitter handle, podcast, online course…anything!)

 

What’s your favorite SQL command and/or Excel function and why?

ALTER DATABASE imoltp
ADD FILEGROUP imoltp_mod CONTAINS MEMORY_OPTIMIZED_DATA

 

This command gives me certainty that I will be working with In-Memory OLTP (a.k.a. “Hekaton”).
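For context, that filegroup only becomes usable once a container is added to it; a minimal completion sketch (the file path, table, and bucket count are assumptions):

USE imoltp;

ALTER DATABASE imoltp
ADD FILE (NAME = 'imoltp_mod_dir', FILENAME = 'C:\Data\imoltp_mod_dir')
TO FILEGROUP imoltp_mod;

-- A durable memory-optimized table can then be created
CREATE TABLE dbo.SessionState (
    SessionId INT NOT NULL PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
    Payload VARBINARY(8000)
) WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);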

 

Thanks for joining us, Luan!

Know someone doing cool work with data? Nominate them for a spotlight in the comments.

Wednesday, April 15, 2015 2:00:00 PM

The SQL Server 2014 SP1 package download is temporarily unavailable.  Please read this post for more information.

 

One year ago, Satya Nadella spoke in San Francisco to launch SQL Server 2014.  Today, we are pleased to announce the release of SQL Server 2014 Service Pack 1 (SP1). The Service Pack is available for download on the Microsoft Download Center. As part of our continued commitment to software excellence for our customers, this upgrade is available to all customers with existing SQL Server 2014 deployments.

SQL Server 2014 SP1 contains fixes to issues that have been reported through our customer feedback platforms and Hotfix solutions provided in SQL Server 2014 CU 1 up to and including CU 5, as well as a rollup of fixes previously shipped in SQL Server 2012 SP2. 

A few of the customer-requested updates in Microsoft SQL Server 2014 SP1 are:

  • Column store performance is improved when batch mode operators spill over to disk. A new XEvent provides insights into column store inserts.
  • Several issues in buffer pool extension SSD configuration are addressed, including the ability to do instant initialization of the buffer pool extension file.
  • Certain queries compile faster when using the new cardinality estimator, and other cardinality estimation query plans are more efficient when using TF-4199.
  • The scalability benefits of two trace flags (TF-1236, TF-9024) are applied automatically.
  • Backup header information retrieval works better with encrypted backups.

Customers like Samsung Electro-Mechanics are already seeing the benefits of upgrading to SQL Server 2014. Samsung improved OLTP performance up to 24 times and DW performance up to 22 times when they upgraded to SQL Server 2014. SBI Liquidity Market sped up their transaction performance tenfold when they adopted SQL Server 2014 In-Memory OLTP.

For information about upgrading to SQL Server 2014, see the tools and resources available at www.microsoft.com/sqlserverupgrade.  To trial SQL Server 2014, visit the SQL Server 2014 Evaluation Center.

For more highlights of the release, please read the Knowledge Base Article for Microsoft SQL Server 2014 SP1.  To obtain SQL Server 2014 SP1 with its improved supportability, please visit the links below. SQL Server 2014 with SP1 will be available in additional venues including the Volume Licensing center and via Microsoft Update starting May 1, 2015.

Monday, April 13, 2015 9:00:00 AM

Can you believe it’s been 10 great years since SQL Server 2005 was released? Extended support for this offering will end on April 12, 2016. Before support ends, you’ll need a plan for migrating remaining instances of SQL Server 2005. But why wait to migrate when you have an opportunity to provide new value to your business now with a modern data platform? Many customers are already experiencing the benefits of upgrading to SQL Server 2014. GE Healthcare, for example, wanted a more flexible, scalable platform to deliver applications to healthcare providers worldwide. They met this goal using a cross-platform cloud strategy with SQL Server 2014 and Microsoft Azure. With SQL Server on Azure VMs, GE Healthcare can accelerate time-to-market, cut costs, and enable compliance for more customers.

If you’re hesitant to make this move, it is important that you know what end of support means for your business. After the end of support date, hotfixes and security updates will no longer be provided. The costs to maintain security and support as well as the potential liabilities associated with compliance audits can make it more expensive to stay on the old version than to upgrade. Now is the time to take advantage of new technology to support your business with SQL Server 2014, SQL Server 2014 in a VM (on-premises or in Azure), and/or Azure SQL Database.

Read more about SQL Server 2005 end of support on the Official Microsoft Blog and start planning your upgrade today. 

Thursday, April 9, 2015 9:00:00 AM

Updated April 21, 2015

This is your opportunity to put your data skills to use for a good cause! Come along and meet other data pros as you use big data technologies to solve real-world problems, make a difference in your community, and learn about Microsoft’s big data technologies and cloud services. The Hackathons offer the chance to get hands-on with the latest data science technologies and show the world what is possible when Hadoop/HDInsight, Machine Learning, and Power BI tools are used together to find solutions and unlock insights.

How does the Hackathon work? Contestants spend two days at the event, with food, drink, a $250 credit pass on Microsoft Azure, and other goodies & prizes provided. The agenda typically looks like this:

  • Day 1: Morning: registration and short presentations from top Microsoft data professionals and partners, providing insight and instruction on big data and data science topics such as Hadoop and HDInsight, Power BI, and Machine Learning. (Note: the experts will be available for the duration of the hackathon to assist with questions.) Hacking starts around noon, and the contestants self-organize into teams of 3-5 members to use data science tools to design and build an analytical model that solves a specific data science problem or challenge. The hacking continues well into the evening, and has been known to go all night! (Note: remember to bring your own laptop, cords, and dongles.)

  • Day 2: The competition continues from early morning to around 4pm, when the winning team will be announced by the Hackathon judging panel based on creativity, collaboration, innovation, and overall brilliance in data modelling and visualization. Cool prizes will be awarded to the winning teams.

Since November, over 800 data scientists and big data developers have participated in these big data Hackathons in London, Moscow, Johannesburg, Istanbul, San Jose, and Toronto. Didn’t make it to one of these events? You have more chances to participate! Check below for an upcoming Hackathon near you.

  • Register here for New York Hackathon, May 9th - 10th, 2015
  • Register here for Copenhagen, May 23rd – 24th, 2015
  • Register here for Seattle May 30th-31st, 2015

  • Registration details coming soon for Hackathons in Moscow, London & Milan!

Keep an eye out for Hackathons in a city near you! Those being considered right now are: Zurich, Frankfurt or Dusseldorf, Dubai, Chicago, an Israeli city, Auckland, Sao Paulo, Sydney, and possibly Dallas, Singapore, and Santiago!

Interested in having a big data hackathon in your city? Let us know by leaving a note in the comments section.

Want to learn more about Microsoft’s big data solutions? Visit www.microsoft.com/bigdata.

Photo of the teams in San Jose Feb 2015, hacking in groups of 3-6.

Friday, March 20, 2015 9:00:00 AM

In honor of the upcoming PASS Business Analytics conference, we wanted to take some time to spotlight the great work happening in the SQL and BI communities across the world. The conference is focused on business analytics, but PASS offers many great community activities for SQL Server and beyond. Learn about the various local and digital opportunities to connect with the PASS community here.

Name: Grant Fritchey
Role: Product Evangelist, Red Gate Software
Location: Grafton, MA, USA

What is an exciting project that you’re working on right now?

I’m helping to build a set of classes to teach people how to automate their database deployments in support of Database Lifecycle Management. Development is moving faster and faster in order to keep up with the demands of business. Because of this, databases must also be deployed faster and faster. But, you still have to ensure the protection of the vital business information stored within your databases. In the class I’m working on, we’ll show you how to get your database into source control alongside your application and how to perform continuous integration with databases. We’re going to cover all sorts of mechanisms for automating database deployments and database testing in order to work Database Lifecycle Management right into your Application Lifecycle Management.

What are your current analytics and/or database challenges, and how are you solving them?

The main challenges we have with databases are the same ones we’ve always had: performance and uptime. The thing is, we have blazing fast hardware these days. Or, if you’re looking at online solutions like Azure, we have very large VMs as well as methods for sharing across servers and databases. All this means that the underlying architectures of our database systems can perform very well. But, we still have to deal with the database design and the T-SQL code being run against the database. More and more we’re taking advantage of ORM tools such as Entity Framework, which really do speed up development. But, around 10% of the queries still need to be coded by hand in order to ensure adequate performance. Add to this the fact that we need to deploy all this while still ensuring up-time on the databases… Figuring out how to get adequate functionality in place without affecting up-time is tough work.

How does data help you do your job better?

Decisions on what to do with systems need to be based on information, not guesses. Data gathered about my systems shows me where I need to prioritize my work and directs choices on resource allocation.

What’s your favorite example of how data has provided an insight, a decision, or a shift in how business gets done?

Recently I found that I was seeing a serious “observer effect” in how I was collecting performance data. While tuning queries I was using STATISTICS IO and STATISTICS TIME, as I normally do. As I adjusted the code, I wasn’t seeing the kind of performance improvements I expected. In fact, some of my solutions seemed to be working even worse. I was a little surprised because I thought I was following a good methodology, so I tried turning off all the STATISTICS capturing and just used Extended Events. Suddenly, the tuning started working extremely well. I went back and experimented until I discovered that for some of my queries STATISTICS IO was actually impacting query execution, affecting both the time and the reads. Turning it off cleared the problem completely. I’ve now changed to using Extended Events most of the time in order to minimize, or eliminate, that issue. Best of all, I’m able to use these within Azure SQL Database as well as in my earthed servers.
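A sketch of an Extended Events session that can stand in for STATISTICS IO/TIME during tuning (the session name, database filter, and file path are assumptions):

-- sql_batch_completed reports duration, CPU time, and logical reads per batch
CREATE EVENT SESSION [QueryTuning] ON SERVER
ADD EVENT sqlserver.sql_batch_completed (
    ACTION (sqlserver.sql_text)
    WHERE (sqlserver.database_name = N'SalesDb'))
ADD TARGET package0.event_file (SET filename = N'C:\XE\QueryTuning.xel');

ALTER EVENT SESSION [QueryTuning] ON SERVER STATE = START;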

What or who do you read, watch, or follow to help grow your data skills?

I go to SQLSkills.com over and over, sometimes multiple times in a day. It’s one of the single best resources for detailed SQL Server information. I also go to SQLServerCentral.com regularly to ask and answer questions. It’s a great resource for expanding your knowledge.

What’s your favorite SQL command and why?

RESTORE DATABASE: Because it has saved my job and the companies I’ve worked for so many times.

How does Azure help you protect your local databases?

There are a couple of ways you can use Azure to extend local capabilities. The first, and probably the easiest, is to use Azure Blob Storage as a means of ensuring that you have off-site storage of your backup files. You could pretty easily write a PowerShell script that copies your backups to Azure Storage. But, starting in SQL Server 2012, you can also issue a backup command to go straight to Azure Storage. Either way, you can be sure there’s a copy of your backups in case you suffer a catastrophic event locally.
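A minimal sketch of the native backup-to-URL path mentioned above (the storage account, container, and credential names are assumptions):

-- One-time: store the storage account name and access key in a credential
CREATE CREDENTIAL AzureBackupCred
WITH IDENTITY = 'mystorageaccount',   -- storage account name
SECRET = '<storage-access-key>';

BACKUP DATABASE SalesDb
TO URL = N'https://mystorageaccount.blob.core.windows.net/backups/SalesDb.bak'
WITH CREDENTIAL = 'AzureBackupCred';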

Another way to extend your local capabilities to the cloud is to set up a virtual network. You can incorporate Azure Virtual Machines directly into your local network. Because of this, you can set up Availability Groups between Azure VMs and your local machines. This would enable you to have a failover setup to Azure, allowing for additional protection of your data and your systems.

Are there other ways Azure can be used in combination with local databases?

It’s my opinion that every developer should be using a local copy of SQL Server for their development. This is to allow them to experiment, learn, and, well, break stuff, without affecting anyone else. But, some laptops might be underpowered, or this could in some way violate a corporate policy. As a workaround, people can take advantage of the fact that SQL Database covers the vast majority of standard SQL Server functionality, at the database level. This makes it a great place to develop and test databases, especially if you’re already developing applications for Azure. You only need to keep the database around while you’re developing, and because you can keep the size small, the costs are extremely minimal.

Any other benefits for Azure in combination with local machines?

Tons. For one, expanded capacity. What if you need to get a lot more servers online quickly, but you’re hurting on disk space, or the servers are on back-order? Go back to that virtual network we talked about earlier. Set that up and now you can very quickly, even in an automated fashion through PowerShell, add SQL Server machines to your existing systems.

Another thing you could do, although this is not something I’ve tried yet, is take advantage of the fact that in SQL Server 2014 you can actually add file groups that are in Azure Blob Storage. Do you need extra disks, right now, that you can’t get from the SAN team? Well, if you can afford a bit of latency, you can just expand immediately into Azure Storage. I’d certainly be cautious with this one, but it’s exciting to think about the expanded capabilities this offers for dealing with certain kinds of disk space emergencies.
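For reference, a sketch of the SQL Server 2014 data-files-in-Azure syntax this refers to (the storage URL and SAS token are placeholders):

-- One-time: a credential named after the container URL, holding a Shared Access Signature
CREATE CREDENTIAL [https://mystorageaccount.blob.core.windows.net/data]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = '<sas-token>';

-- The database files then live directly in blob storage
CREATE DATABASE CloudDb
ON (NAME = CloudDb_data,
    FILENAME = N'https://mystorageaccount.blob.core.windows.net/data/CloudDb.mdf')
LOG ON (NAME = CloudDb_log,
    FILENAME = N'https://mystorageaccount.blob.core.windows.net/data/CloudDb.ldf');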

Thanks for joining us, Grant!

Know someone doing cool work with data? Nominate them for a spotlight in the comments.

Wednesday, March 18, 2015 9:00:00 AM

This is a guest blog post from Thomas LaRock, the President of PASS (Professional Association for SQL Server)

It’s no secret that the role of data in the IT industry, in business, and in the world at large is changing at a rapid pace. As technology continues to become a more integrated and integral part of our lives, the value of data continues to rise.

At PASS we have a 16-year history of empowering IT professionals who use Microsoft data technologies. The SQL Server community is the largest and most engaged group of data pros in the world. PASS volunteers and the PASS Board of Directors work together to help the PASS community succeed in connecting, sharing, and learning from one another. A big part of that effort is keeping an eye on the future of the data profession.

What we see is that data analytics is the next frontier for professionals passionate about data. The growth of Big (and Little) Data, the advent of cloud computing, and advances in machine learning are all areas that present challenges for our community. Data analysts, data scientists, and line-of-business managers are in high demand as organizations realize the potential of collecting and understanding the wealth of information that is available from a variety of sources.

PASS is dedicated to helping our members harness the technologies, skills, and networks that are the foundation of solid and successful careers. We believe that keeping up with industry advances is a vital skill for all data professionals. Setting and achieving new goals as well as learning new ways of working with data is a must.

Whether you’re coming from a background in SQL Server, business intelligence, or social media, there are specific cornerstones of turning all this data into something that can benefit your organization. We call this the “analyst’s journey.”

One such cornerstone is data discovery and integration. We want our members to be aware of the latest technologies in collecting, modeling, and preparing data for analysis. Next is data analysis and interpretation. We want to help our members understand the techniques and tools that enable sophisticated analysis, prediction, and optimization. Then there’s visualization: the creative side of things, where we get into report and dashboard design.

As with any career, another key skill set is communication. The people who analyze and work with data are in the best position to help gain executive buy-in for data-driven business decisions. For years PASS has been the leader in helping data professionals improve their communication and soft skills.

One way in which we’re reaching out to those who want to learn more about analytics is the PASS Business Analytics Conference. This premier event brings together a stellar lineup of business and analytics speakers, including our keynote speakers Carlo Ratti and BI Brainz founder Mico Yuk. We have created a series of webinars and a Meet the Expert interview series to give people an idea of what the conference will offer. We also have replays from last year’s conference, and we have hours of training available through our Virtual Chapters.

We’re excited about data and analytics and we’re hearing from more and more SQL Server pros who share that excitement.

It’s a wonderful time to be a data professional.

See you in Santa Clara!

Thomas LaRock

President, PASS

PASSBAConference.com

Thursday, March 12, 2015 12:00:00 PM

As of March 11, 2015, SAP has certified support for SAP NetWeaver-based applications on Microsoft SQL Server 2014. Now you can run even more of your Tier-1, mission-critical workloads on SQL Server. And the ability to run SAP on Microsoft Azure means that it can be accomplished with low total cost of ownership (TCO).

SQL Server 2014 provides the higher scale, availability, and breakthrough performance needed for your most demanding SAP workloads. The updatable in-memory ColumnStore will deliver blazing fast query performance for your SAP Business Warehouse (BW).  SQL Server AlwaysOn availability groups help with the reliability and availability requirements of SAP systems by enabling multiple, readable secondaries that can be used for failover and read workloads like reporting and backup.
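For illustration, enabling the updatable in-memory columnstore on a BW-style fact table is a single statement (the table name is an assumption; in SQL Server 2014 the table must have no other indexes):

-- SQL Server 2014: updatable clustered columnstore index
CREATE CLUSTERED COLUMNSTORE INDEX cci_FactSales ON dbo.FactSales;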

With SAP’s certification, you can also run SAP in Microsoft Azure Virtual Machines with SQL Server 2014. Azure enables SAP customers to reduce TCO by leveraging Microsoft infrastructure as system needs grow, rather than investing in additional servers and storage.  With Microsoft Azure, customers can leverage development and test environments in the cloud that can be spun up and scaled out as needed.  SQL Server 2014 also introduced Disaster Recovery to Azure using an asynchronous AlwaysOn secondary, which can make Azure a part of your SAP disaster recovery plan.

With the certification, customers can now adopt SQL Server 2014 for mission-critical SAP workloads, and we look forward to telling you their stories soon. Here are some customers who are taking advantage of SAP on SQL Server today:

  • Quanta Computer Boosts Performance of Its SAP ERP System with In-Memory Technology
  • Zespri International Prunes Costs, Defends Business from Disasters by Running SAP in the Cloud
  • Saudi Electric Company Improves Query Times by 75 Percent, Can Respond Faster to Customers
  • Mitsui & Co. Deploys High-Availability and Disaster-Recovery Solution After Earthquake

Many companies are already betting their mission-critical apps on SQL Server 2014. To read about Microsoft’s leader position for business-critical operational database management and data warehouse workloads, read Gartner’s Magic Quadrant for Operational Database Management Systems and Magic Quadrant for Data Warehouse Database Management Systems reports. For more information about how customers are already using SQL Server 2014 for mission-critical applications, see the case studies above.

For more about the powerful combination of Microsoft and SAP, visit http://www.microsoft.com/SAP.  To get started with SQL Server 2014, click here.
