SQL Server 2012

by Mary Branscombe

Whether you’re working with a traditional data warehouse or in the cloud, the latest release of SQL Server has plenty to offer.

HardCopy Issue: 56 | Published: May 1, 2012

With all the talk about ‘big data’, whether unstructured or streaming in from millions of sensors, it’s easy to think of SQL as outdated and expect Microsoft to focus on sexy new cloud technologies. But in reality the majority of businesses are still using structured data and the database applications built on top of it, so SQL Server 2012 has major improvements at every level from the database engine up. There’s also a new licensing model, which may or may not make SQL Server a better bargain for you. But what does it all add up to?

For a start, Microsoft is promising significant performance improvements, especially from new in-memory processing features like the xVelocity columnstore index. Adding a columnstore index, which stores the indexed columns in compressed column-oriented form and makes the table read-only, typically adds only around 20 per cent to the size of the database, but the same queries and reports are generated far faster, particularly the large scans and aggregations typical of data warehouse reporting. Furthermore you can run such queries on a secondary replica rather than the primary, avoiding a slowdown in other database processes and giving you the option of using cheaper, slower hardware without compromising database performance.
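As a rough sketch (the table and column names here are invented), creating one of the new columnstore indexes is a single T-SQL statement, though it's worth knowing that in SQL Server 2012 the table becomes read-only while the index exists:

  CREATE NONCLUSTERED COLUMNSTORE INDEX IX_FactSales_ColumnStore
  ON dbo.FactSales (OrderDateKey, ProductKey, SalesAmount, Quantity);

  -- To load new data, disable the index, insert, then rebuild:
  ALTER INDEX IX_FactSales_ColumnStore ON dbo.FactSales DISABLE;
  -- ... bulk load goes here ...
  ALTER INDEX IX_FactSales_ColumnStore ON dbo.FactSales REBUILD;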

Comparing the schemas of two databases to find differences or incompatibilities saves a lot of development headaches.

A new compression algorithm means databases take up 20 to 60 per cent less space, and SQL Server 2012 now supports up to 256 logical cores, which is particularly beneficial for in-memory processing. There are optimisations for getting performance advantages from a Solid-State Drive (SSD) without wearing it out – especially useful now that you can put tempdb on a clustered server’s local disk – and the number of partitions supported by a single table increases from 1,000 to 15,000. Furthermore you can now run SQL Server 2012 instances on a Server Core installation of Windows Server 2008 R2 or later, which means fewer patches and reboots, although Master Data Services and Data Quality Services aren’t supported there.


AlwaysOn availability

High availability features are much improved and less complicated to set up, with the choice of PowerShell or a wizard that tells you exactly what you need to do at each stage of a process, such as taking a full database backup or setting the recovery mode. AlwaysOn availability groups include up to four mirrors of your data as active secondaries with much faster failover because the solution is built on top of Windows failover clustering. SQL Server 2012 also supports failover cluster instances across subnets, or from one data centre to another.
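To give a flavour of what the wizard generates, here is a minimal T-SQL sketch of creating an availability group, assuming the Windows failover cluster and the database mirroring endpoints are already in place (the server names and the group itself are invented):

  CREATE AVAILABILITY GROUP SalesAG
  FOR DATABASE SalesDB
  REPLICA ON
    'SQLNODE1' WITH (ENDPOINT_URL = 'TCP://sqlnode1.example.com:5022',
      AVAILABILITY_MODE = SYNCHRONOUS_COMMIT, FAILOVER_MODE = AUTOMATIC),
    'SQLNODE2' WITH (ENDPOINT_URL = 'TCP://sqlnode2.example.com:5022',
      AVAILABILITY_MODE = SYNCHRONOUS_COMMIT, FAILOVER_MODE = AUTOMATIC,
      SECONDARY_ROLE (ALLOW_CONNECTIONS = READ_ONLY)); -- readable secondary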

This kind of resilient infrastructure is at the heart of the private clouds Microsoft expects larger businesses to be building, and there are new management options for SQL Server 2012 to help you on that journey: the latest version of the Microsoft Assessment and Planning Toolkit can inventory the databases you want to virtualise, right down to the SQL Server components they’re using, and you can manage availability groups in System Center or from the new management dashboard.

Knight Frank

Global estate agent Knight Frank wasn’t originally planning to upgrade the SQL Server 2008 R2 systems it uses for “more traditional enterprise reporting” immediately. Instead the company was evaluating other software for a project that would use geospatial data to display residential properties in London on an interactive map. However the integration of SQL Server 2012 with the Bing Maps for Enterprise API proved so useful that it’s become an early adopter of many other new features.

The new developer tools helped ISV itelligence take the project from initial concept to near completion in just 15 days. “We take an agile approach,” explains Graham Brammall of itelligence, “but we could get it up very quickly because the integration is so good.”

Joseph Megkousoglou, Knight Frank’s lead software architect, saw it as a test of SQL Server 2012 and he’s impressed with the results. “As a property business, geospatial support is key and we have been doing this for many years by implementing our own formula. We are the only estate agent in the UK that can offer true ‘near me’ functionality in our iPhone app; other agents just do a postcode search. In the past we have had to write tons of code and create complicated maps. It’s been working for years but with the SQL Server 2012 geospatial support we can eliminate a lot of that code. At the end of the day we’re estate agents; we’re not coders: we shouldn’t have to write complicated stuff like this and SQL Server does that for us.”
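The ‘near me’ feature he describes maps naturally onto the built-in methods of the geography type. A minimal sketch, against a hypothetical Properties table with a geography column named Location:

  -- geography::Point takes latitude, longitude and an SRID;
  -- SRID 4326 (WGS 84) means STDistance returns metres.
  DECLARE @me geography = geography::Point(51.5074, -0.1278, 4326);

  SELECT TOP (20) PropertyID, Address,
         Location.STDistance(@me) AS MetresAway
  FROM dbo.Properties
  ORDER BY Location.STDistance(@me); -- a spatial index on Location speeds this up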

Knight Frank has just started implementing PowerPivot and Power View, having developed a BI semantic model for its data. “We had already developed a very good platform based on Reporting Services,” says Megkousoglou. “What we were missing was the rich graphical environment for interacting with the data, and that’s where Power View will allow us to do a lot more collaboration.” Instead of developers creating the reports they think the users want, the users themselves can explore the data directly.

He’s also exploring Data Quality Services and the updated Master Data Services. “We’ve just implemented our own contact management system and a key part of that is data quality; we’re looking at DQS to see how it can help maintain it.” AlwaysOn availability groups proved ideal too. “We’re a global business and we can’t have downtime. We have Web sites around the world and we can’t take them offline. With AlwaysOn we can achieve much better uptime.” Megkousoglou is a big fan of the new in-memory database functionality too: “It allows us to page result sets from SQL instead of having to do crazy weird stuff; sounds trivial but trust me, I’ve done this and it really helps!”
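The result-set paging he mentions is presumably the OFFSET/FETCH clause, new to T-SQL in SQL Server 2012, which does away with the ROW_NUMBER() wrappers earlier versions required (table and columns invented):

  SELECT PropertyID, Address, Price
  FROM dbo.Properties
  ORDER BY Price DESC
  OFFSET 40 ROWS FETCH NEXT 20 ROWS ONLY; -- page three, at 20 rows per page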

There are also improvements to SQL Server Management Studio, especially when it comes to restoring databases from backups, while the new Distributed Replay feature helps with tuning and troubleshooting by letting you capture a workload and ‘replay’ it across multiple systems.

You can finally define your own server roles so that you can give users the specific permissions they need: an administrator can run Profiler, for example, without being given access to everything else on the server. You can also use contained databases to encapsulate authentication within the database itself, so users have access to the database but not to the whole instance. This simplifies deployment and moving databases between different SQL Server instances, though it does have some security implications. On the other hand, stronger encryption is supported for databases, going up to AES-256 for backup keys and SHA-512 for password hashes.
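Both features are plain T-SQL. A sketch with invented names: the server role bundles just the ALTER TRACE permission that Profiler requires, while the contained user, which needs a database created with CONTAINMENT = PARTIAL and the feature enabled on the instance, authenticates against the database rather than against a server login:

  CREATE SERVER ROLE TraceOperators;
  GRANT ALTER TRACE TO TraceOperators; -- the permission Profiler needs
  ALTER SERVER ROLE TraceOperators ADD MEMBER [CONTOSO\AuditAdmin];

  -- In a contained database (CONTAINMENT = PARTIAL):
  CREATE USER ReportingUser WITH PASSWORD = 'Pl4ceholder!Passw0rd';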

Auditing is much more secure and much more resilient. Previously, if the network wasn’t available or the audit log filled up, the audit would fail and, unless you could restore it quickly, would shut down the server to preserve integrity. Now only those actions that cause events to be written to an unreachable audit log will fail while other operations can continue as normal. The combination makes auditing more useful as well as less annoying.
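The new failure behaviour is exposed as an option on the audit itself; a minimal sketch with an invented name and file path:

  CREATE SERVER AUDIT ComplianceAudit
  TO FILE (FILEPATH = 'D:\Audits\')
  WITH (ON_FAILURE = FAIL_OPERATION); -- CONTINUE and SHUTDOWN are the alternatives
  ALTER SERVER AUDIT ComplianceAudit WITH (STATE = ON);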


Developer productivity

The developer tools included with SQL Server 2012 are a big step forward, particularly as they are nicely integrated with a helpful visual interface. The new SQL Server Data Tools (codename ‘Juneau’) are all about developer productivity and make it much easier to build, test and validate both databases and applications for multiple versions of SQL Server, from SQL Server 2005 to SQL Azure, and keep multiple schemas under control. You can use them with SQL Server Express and even with previous versions of SQL Server.

You can create a new database declaratively in the Data Tools or connect to an existing database, which can be on Windows Azure, and explore its objects visually. You can edit and debug SQL with full T-SQL IntelliSense code completion and static code analysis, edit scripts, and use the table designer, which keeps the graphical view and the underlying T-SQL in sync. You can validate and verify code even when you’re not connected to the database, so you can work on a database while you’re sitting on a train and publish the changes when you get back to the office. You can also compare schemas visually, with any incompatibilities between SQL Server versions clearly indicated, whether you’re comparing a project with a live database, two live databases that are supposed to be the same, or a local database with an Azure one.

Powerful as they are, you don’t have to use the Data Tools to unify application and database development. When you import a database into Visual Studio you can manage it as a source code artefact, complete with its objects such as tables and stored procedures, and Team Foundation Server now supports SQL Server 2012.


Data quality

The more you rely on your databases, the more important it is that they are consistent. Master Data Services in SQL Server 2012 gives you validation rules and reference data sets, a major improvement on the ad hoc cleanup scripts attached to many databases, particularly as you can buy reference data services from the Azure Data Market. You can even profile your data to find out how many records are missing values.
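The full profiling experience lives in the Data Quality Services tooling, but the idea is simple enough to sketch in plain T-SQL against a hypothetical Contacts table, relying on the fact that COUNT(column) ignores NULLs:

  SELECT COUNT(*) AS TotalRows,
         COUNT(*) - COUNT(Email) AS MissingEmail,
         COUNT(*) - COUNT(Phone) AS MissingPhone
  FROM dbo.Contacts;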

Power View in action, showing some of the ways in which you can slice, dice and chart your data.

But the right person to correct the way company names are spelled, addresses are formatted or products are classified isn’t the database developer or even the DBA; it’s the people who use the database to generate reports and deal with customers. They can still do this through a Web interface, which is improved in this release, but they can now also do it from the place where most business users work with data, namely Microsoft Excel. The Master Data Services add-in for Excel lets business users make corrections while still maintaining central management.


Business intelligence

The business intelligence tools in SQL Server 2012 are designed to balance the needs of end users who want to gain insights on the fly against the need to manage reports that are important enough to merit centralised development and support. This is achieved through an overall semantic model that is shared by all the BI tools, together with improvements that go much further than finally supporting the Excel and Word Open XML document formats for reports.

Working with the Apache Hadoop framework on Windows Azure.

End users get improvements in sorting, formatting, filtering and data modelling for the PowerPivot Excel add-in that lets them ‘mash up’ multiple data sources from Web sites, SQL databases and other spreadsheets. PowerPivot uses the xVelocity engine to handle much larger models than the one million rows supported by Excel and let you scroll through them at speed.

Users can share those models in SharePoint’s PowerPivot Gallery and even those without Excel installed can explore them in the Excel Web app. Admins can use SharePoint’s PowerPivot Management Dashboard to see how PowerPivot is being used and what resources it’s consuming. If a particular report proves popular then a developer can turn it into a BI Semantic Model (BISM) in Visual Studio or BI Development Studio, link it to a SQL Server Analysis Services engine for performance and use roles to control who can see what aspects of the model. Once a report has been converted to BISM, users can carry on experimenting using Power View, a Silverlight tool that runs in the browser and lets you create richer interactive graphs and reports that are easy to filter and sort. Any reports you save in Power View show up in the PowerPivot Gallery.

Licensing SQL Server 2012

The number of main editions has dropped from six to three, namely Enterprise, Business Intelligence and Standard, and there is still the free and powerful Express edition, which gets many of the new features but is limited to the lesser of one processor socket or four cores. The Developer edition can be bought separately for development purposes only, or as part of an MSDN subscription. The Web hosting edition is only available to SPLA partners reselling it as a service. The Datacenter, Workgroup and Small Business editions are discontinued, while the Parallel Data Warehouse product hasn’t yet moved to SQL Server 2012.

Enterprise includes all the features of the other editions and gets the functionality of the now-defunct Datacenter edition. It supports as many cores as the OS and you pay according to the number of cores you use. There are no server or CAL costs, so you don’t have to count users or tie licences to servers in a way that makes it hard to virtualise or migrate VMs. You can move your SQL Server installation from system to system every 90 days, or whenever you want if you have Software Assurance, and you can deliver services to any user whether they’re an employee, a partner or a customer. If you currently have SA for your SQL Enterprise licences then you can continue an existing Server and CAL licence model, or if you currently have per-processor licences then you can migrate to SQL Server 2012 now and not worry about counting cores until you come to renew.

The new Business Intelligence edition has the full set of reporting and analytics features for BI, including PowerPivot and Power View, plus the data quality management tools. It’s limited to 16 cores for standard database functionality but has no cap for the BI features, and you pay by server and CAL rather than by core, so you’re paying for the users who need access to the insights BI can deliver. CAL prices do go up from SQL Server 2008 R2, but Microsoft points out that CAL prices haven’t risen much since SQL Server 2000.

The same core limit applies to each instance of Standard edition, but you can license it either by core or by server and CAL, and you get all the updates to the engine plus basic reporting and analytics, though not the data quality management tools or the full BI functionality. The same migration rules apply as with Enterprise – unlimited moves with Software Assurance, even if you’re running some SQL Server 2008 R2 systems as well, or every 90 days without it.

Users can also tag data they’re particularly interested in with a Data Alert and get an email when the data, or a report that references it, changes. Again, administrators can see how many alerts are being set up and how well they’re working.

These tools are where SQL Server will extend beyond traditional analysis, supporting non-relational data sources like Hadoop. Microsoft rather grandly promises insight “from any data, any size, anywhere”, but as yet the Hadoop service for Azure, the Hive ODBC driver and the Excel add-in are still only at the community preview stage, and the on-premises Windows Server version of Hadoop is only available to selected testers.

But cloud data stores like Hadoop aren’t the only source of unstructured data; the documents on your file shares might hold the information you really need. You can still use BLOB, FILESTREAM and geospatial data, and there are improvements in all those areas. Furthermore a new option called FileTable lets you work directly with documents and files, using either T-SQL or Win32 streaming APIs that get T-SQL transactional semantics. This lets you dig through documents without having to change their format, or duplicate them in a database.
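Creating a FileTable is a single statement, though it assumes FILESTREAM is enabled on the instance and that the database has a FILESTREAM filegroup and directory name configured (the table name here is invented):

  CREATE TABLE dbo.Documents AS FileTable
  WITH (FILETABLE_DIRECTORY = 'Documents',
        FILETABLE_COLLATE_FILENAME = database_default);

Each row then surfaces a file that applications can open over a file share while T-SQL queries its metadata.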

SQL Server 2012’s support for unstructured data has certainly attracted the most attention, and it does extend the scope of what you can do with the product – or what you will be able to do once the tools mature. But it’s hard to highlight any single improvement because this release adds so much. The updates to the database engine itself are less about architecture and more about performance. The developer tools are extremely significant, as are the possibilities offered by the BI tools. This is a major new release, and it proves that structured, relational databases still have a great deal to offer.
