SQL Server 2016 Technical Preview Coming This Summer

Microsoft made some cool announcements at the Ignite Conference yesterday. If you’re a data geek, then you’ll be happy to know that the next version of SQL Server is coming up soon.

This summer, Microsoft is planning to release the SQL Server 2016 Public Preview. You can sign up to be notified once it becomes available.

Some of the high-level features they are advertising include:

  • Always Encrypted
  • Stretch Database
  • Real-time Operational Analytics & In-Memory OLTP
  • Built-in Advanced Analytics, PolyBase and Mobile BI
  • Row-level Security and Dynamic Data Masking
  • More Synchronous Replicas for AlwaysOn Availability Groups
  • Round Robin Load Balancing for Secondaries
  • Native JSON Support

You can read the data sheet information here.

Brent Ozar has a nice write-up with his thoughts on what it all means. He does a nice job of distilling the marketing buzz down to nuts and bolts and what it will really mean to the rest of us.

I know I’m signing up to download the SQL Server 2016 Preview as soon as it’s available. How about you?

SQL Saturday #396 Dallas BI Edition

This past weekend, I traveled to Dallas to attend SQL Saturday #396 BI Edition. SQL Saturday is a free (mostly) all-day training event about all things SQL Server. This particular event was focused on Business Intelligence and SQL Server. There were a total of 27 sessions on various topics including Power BI, SSIS, Azure, Big Data, Data Visualization, Reporting, and more.

SQL Saturday 396 – Dallas – BI Edition

The event was held at Microsoft’s office in Irving, Texas. The facility was very nice, with the exception of some issues with the air conditioning. Sometimes it was running, and other times it was off. Since we were there on a weekend, the system was not programmed to run as it normally would.

They had excellent Wi-Fi available for all attendees. This was a nice feature, as it allowed me to work on some Data Warehouse assignments I had, all while learning more about Data Warehouses. Very nice.

Session Highlights

The first session I attended was Russ Loski’s Introduction to BIML. BIML stands for Business Intelligence Markup Language. You can use BIML to help create templates for your SSIS packages. So for example, if you have to create 100 packages that perform a similar function, BIML can help you with that.

Next up, Bryan Smith showed us how to use Power Map. He was a last minute fill-in due to a schedule conflict, and I must say he did an awesome job. He only had one PowerPoint slide. Yeah! The rest was all demos. He showed us some cool use cases involving Hurricane Sandy, weather data, demographics, etc. He also had some cool tricks looking at Microsoft’s real estate information, energy usage, and heat. He was able to pinpoint who has a mini-fridge under their desk. That would be me! :-)

Finishing out the morning sessions was Allen Smith teaching us about Dimensional Modeling. He took us through the basics of relational vs. dimensional modeling, star and snowflake schemas, and had some group exercises where we had to determine what were fact tables and what were dimension tables.

Over lunch, several of the sponsors and vendors had technical sessions that they presented. I skipped out on these to grab some lunch, network, and socialize. All sessions and no gossip makes Steven a dull boy.

For the afternoon, I took in Mike Diehl’s presentation about incorporating the Agile Methodology with Microsoft development tools. I’ve worked in an Agile shop before, but using mostly open source and un*x-based tools. So it was neat seeing someone be an Agile Evangelist using the Microsoft Stack.

Next, was Meagan Longoria talking about data visualization and how to make it interesting and entertaining. She spoke to us about the horrors of relying solely on numbers arranged in tables for your reporting needs. And executive dashboards? If it doesn’t fit on one page, then it’s not a dashboard. Love it.

Last and certainly not least, Bill Fellows finished out the day with a session on learning to love SSIS. This was a very lively session, with Bill sharing his wisdom on all aspects of SSIS. He touched on configuration methods, logging, troubleshooting, and much more. While the session was only an hour, he tried to fit in as much content as possible. As a testament, even though this was the last session of the day, he kept going after the time ran out, and we all stayed until we had to leave. He gets my gold star for the day.

Brazilian Breakfast

For me, no trip to Dallas is complete without a stop at Cafe Brazil in Deep Ellum. I discovered this place years ago when I came to see Björk perform at Deep Ellum Live. I highly recommend the Brazilian Breakfast and Caramel Latte. This is the place where I learned to like coffee many moons ago.

Make love to your stomach

If you’ve never been, check it out, your stomach will thank you. Also, the Deep Ellum neighborhood is a great area with a funky, artsy vibe. If you’re familiar with South Austin, or the Montrose neighborhood in Houston, it reminds me of those places.

If you do go and your server is Kelsey, be sure to tip her well. When I stopped in it was her and one lone cook holding down the fort with lots of customers present. She was literally running back and forth, and I do mean literally. Shame on the owners for not scheduling more staff.

Last Call for SQL PASS 2015 Early Bird Pricing

If you are interested in going to PASS this year, please take note of your calendar. Early Bird pricing ends on January 11th. The Early Bird price is $1195, versus the full price of $2395. That is over a 50% discount.

The difference in price could cover most of your flight and hotel. If you are stuck convincing your manager about the need for some training, the folks at PASS have your back. They have already laid out a plan of attack to justify your attendance.

So, go talk to your boss and see if your employer can swing it. You might just be surprised as to what they say. If not, consider paying your own way. One hundred dollars a month is not very much when you consider all that you get from attending PASS. You don’t have to go every year, but if you’ve never gone, you owe it to yourself to do so at least once.

SQL PASS 2014 Early Bird

Next year, the PASS Summit will be held in Seattle on November 4th through the 7th. The full price is $2295. However, if you register early, you can get it for as little as $1095. That amounts to a discount of $1200, or a 52% discount.

This year, I am registering for PASS now to get the Early Bird Price. And, I encourage many of you to do the same thing. Act now, don’t delay. The Early Bird Price is only good until December 6th. Then, the price starts creeping up every couple of months.

The primary reason I am doing this is to take control of my career and my learning. While I have had employers agree to send me to PASS in years past, it was never a sure thing. Sometimes yes, sometimes no.

And to be fair, your job doesn’t really owe you that, either. However, I feel PASS is important enough to my career, that I am willing to pay the costs myself to ensure that I am able to attend.

Of course, the best option is if your employer covers the costs. And what better way to convince your boss than to make it cheaper for her? By paying for the registration myself, I’m doing two things. One, I’ve cut the costs in half. That’s enough to pay for most of my travel costs. And two, I have helped with budget planning. The training budget for 2014 may not be approved or funded yet. If your company is on a fiscal calendar, and most are, any training money for next November probably won’t be available until after the summer.

Hopefully, this will demonstrate some things to your employer. You are serious about SQL Server and your career. If you are willing to spend your own money for training, then they should recognize that you will be a good steward in spending their money. You are the one keeping up with current trends in technology. When it is time to help steer the company’s technology vision, you will be the one they call on.

Here are a few links to some other brave (or foolish) souls who have done the same thing:

“The only thing worse than training your employees and having them leave is not training them and having them stay.” – Henry Ford

SQL Saturday #255 Dallas Revisited

Last week, I presented my session on Troubleshooting Memory Pressure in SQL Server at the SQL Saturday in Dallas. I have uploaded my session slides and sample code both here and at the SQL Saturday website.

This SQL Saturday was a special one for me. However, there were both good and bad sides to my experience.

The Good

At this point, I have presented at all three SQL Saturdays in Texas: Austin, Houston, and Dallas. Of course, since I’ve only done one per year, it’s taken me three years to get this far. So I’m not exactly tearing it up. :-)

So my goal for next year is two-fold. On the one hand, I want to present at as many of the Texas SQL Saturdays as my schedule will allow. On the other, I would like to present at at least one SQL Saturday that is out of state. Of course, I have to balance this with work requirements and my family obligations.

I had a great crowd for my session. All the seats were filled and people were sitting on the floor in the aisles. The estimated crowd size was about 65 people. Not bad at all.

As always, one of the best aspects of any community event is the community itself. I really enjoyed the speakers’ dinner, reconnecting with old friends, and making some new ones in the process.

I got to see some great sessions in the afternoon. To start things off, Tim Mitchell had a great session full of tips to Make Your SSIS Packages Run Faster. Then, I sat in on Sri Sridharan’s session on Turbocharging Your Career. He had some great ideas on how to take your career to the next level. To close out the day, I checked out Mike Hotek’s session on how he designed a 10+ Exabyte Data Warehouse. Afterwards, several of us were trying to figure out who the client was. But alas, it was confidential.

The Bad

In the weeks leading up to SQL Saturday, I had been sick. So I was not exactly enthused about presenting in front of anybody. All I wanted to do was stay home and sleep. Add to this that I had some other fairly stressful things going on, and I wasn’t the happiest camper this particular weekend.

I signed up for Andy Leonard’s pre-con on SSIS. There wasn’t anything wrong about his presentation. But I felt so sick that I ended up leaving at lunchtime. Basically, I could do the pre-con or give my session the next day, but I couldn’t do both.

But that’s alright, because things soon took a turn for the worse, and helped me forget about any of that.

The Ugly

I had some pretty bad laptop problems during my presentation. Everything was going fine while I was getting set up, but the moment I started, my laptop display and the projector display cut out. It took the A/V guys a full ten minutes of futzing around with it to get everything running again.

At that point, I was thrown off balance a bit and ended up rushing through my presentation. To make matters worse, it seemed like every time I switched between PowerPoint and SSMS I would lose my laptop screen again. So I would have to crane my neck to look at the projector screen while setting up each demo. Lots of fun.

At the end of the day, I have to take responsibility for my laptop problems. I have given presentations several times before, but this was a new laptop and I had never used it before. If it were not for the Johnny-on-the-Spot A/V guy, this could have been a lot worse. Thank you, sir!

So I pushed on through my presentation, trying to make the best of a bad situation. I made some jokes about how the title should have been about Disaster Recovery. People laughed at my jokes, and no one walked out. Thank you, Dallas!

I gotta tell you, I was really dreading my evaluations. I did get dinged by a few people, fair enough. But I was pleasantly shocked to find that I overwhelmingly received fours and fives on my evaluations. Thank you again, Dallas.

Phoenix Rising

I’ve heard it said that whatever does not kill you, makes you stronger. I must say that I agree.

Just a week ago, I was sick, stressed out, and not a happy camper. Now, my career has taken a new turn that should be quite entertaining.

Many times, the SQL Community has given me a little boost that I needed to get myself back on track. Get involved with your local SQL User Group. Network, learn, and grow. And when you’re ready, or even when you are not, sign up to give a presentation. You might be surprised how your career starts to take off.

Veterans Day

Iwo Jima

Today is Veterans Day. I like to take some time to reflect on all the men and women in service who have sacrificed for the rest of us.

I am also thankful for the things that the Marines have provided me. They taught me the value of integrity and hard work. They also helped pay my way through college.

USMC

As you are going about your day, please take a moment to think about the world around us and be thankful for what you have.

Service and sacrifice are not limited to the military. Throughout human history, countless people have stood up for what they felt was right and have helped propel us along. It is that desire for improvement that makes us all human.

SQL Server 2014 CTP2

SQL Server 2014 CTP2 is now available. This is the last public beta before the final RTM release. So, if you like playing with new features and experimenting before everyone else then this is the download for you.

CTP2 offers some new enhancements with In-Memory OLTP, aka Hekaton. I’m interested to see how the new range indexes work with memory-optimized tables. In CTP1 we only had hash indexes.

Kalen Delaney has an updated white paper available. If you want to read more about Hekaton under the covers, then be sure to check it out.

I’m Speaking at SQL Saturday #255 Dallas

SQL Saturday is coming to town. SQL Saturday is coming to town. OK, so that’s not as awesome as Santa Claus coming to town, but it’s still pretty cool. You get a full day of SQL training for free. Well, almost free. There is a $10 charge for lunch.

sqlsat255_web

If you haven’t been to a SQL Saturday before, here is your chance. Dallas is hosting SQL Saturday #255 on Saturday, November 2nd. Saturday? Are you kidding me? I know, I thought it was a joke the first time I heard it, but this will make my 5th SQL Saturday.

SQL Saturdays are a ton of fun. You get to meet other SQL folks from all over the state. There are always a significant number of attendees who come from across the country, and even a few who come in from overseas.

Quite a few of the speakers will be top tier talent who have spoken at PASS or other national conferences. And then they let me in the club. Not sure how I made it, but I’ve been selected to speak, as well.

I’ll be presenting a session on Troubleshooting Memory Issues in SQL Server. I’ll go through the basics of Memory Pressure, and show you various tools and techniques to troubleshoot it. Be sure to bring some tomatoes and old DIMMs for the interactive portion of the show.

If you’re not interested in my session, there are a total of 42 sessions being offered. Douglas Adams would be proud. Sessions are organized into several tracks including Administration, Development, Business Intelligence, and Career Development.

Additionally, on Friday there are three full-day pre-con sessions being offered. These cost $100 and you must register ahead of time. I’ve registered for Andy Leonard’s session covering SSIS 2012. Another great one is Grant Fritchey’s session on Query Tuning. I saw his session at PASS last year; it’s a good one. At PASS these sessions cost about $300 – $400. So this is a huge discount for the same level of content.

So what are you waiting for? Grab a mouse, head to the website, and register. Oh, you don’t live anywhere near Dallas. That’s OK, because there’s a SQL Saturday coming soon to a town near you.

An Overview of SQL Server 2014 In-Memory OLTP Hekaton

So you’ve heard of Hekaton, but what is it, and why do you want it?

Hekaton, or In-Memory OLTP, is an entirely new set of data structures for tables and indexes, optimized for in-memory storage as opposed to disk-based storage.

Hekaton is the code name for In-Memory OLTP, so I will use these two terms interchangeably.

Why is Microsoft doing this? Short version: memory and servers are much cheaper now than they were when SQL Server first launched.

At this point, it is feasible to have enough memory on your server to house the entire database. Even large, one terabyte databases.

However, the query optimizer and its costing rules haven’t changed along with this. So, even if you have tons of memory, SQL Server still assumes that it will be reading data off of the disk.

Basic Glossary of Terms

  • Cross-Container Transactions – transactions that use both disk-based tables and memory-optimized tables
  • Disk-Based Tables – plain old normal tables, what you have right now, 8k pages
  • Hekaton – codename for In-Memory OLTP
  • In-Memory OLTP – new architecture and data structures using memory for data storage instead of disks
  • Interop – a TSQL query written against memory-optimized tables
  • Memory-Optimized Tables – tables using new memory data structures to store their data
  • Natively Compiled Stored Procedures – compiled machine code instead of interpreted TSQL, still written in TSQL but with some restrictions

Databases

In order to make use of In-Memory OLTP, you need a database that supports it. It’s fairly easy to do this. When you create the database you need a special filegroup with the CONTAINS MEMORY_OPTIMIZED_DATA clause. Additionally, you need to use a Windows BIN2 collation. This can be done at the database, table, or query level.
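As a minimal sketch, the filegroup clause looks something like this. The database name and file paths here are made up for the example; note that the memory-optimized container points to a directory, not a file:

```sql
-- Hypothetical database with a memory-optimized filegroup
CREATE DATABASE HekatonDemo
ON PRIMARY
    (NAME = HekatonDemo_data, FILENAME = 'C:\Data\HekatonDemo.mdf'),
FILEGROUP HekatonDemo_mod CONTAINS MEMORY_OPTIMIZED_DATA
    (NAME = HekatonDemo_mod, FILENAME = 'C:\Data\HekatonDemo_mod')
LOG ON
    (NAME = HekatonDemo_log, FILENAME = 'C:\Data\HekatonDemo.ldf');
```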

Tables

To create a Memory-Optimized Table, you use the MEMORY_OPTIMIZED = ON clause. There are several restrictions on column types, but in simple terms: no LOB data types are allowed, no whatever(max), and no CLR.

Rows are limited to 8060 bytes, with nothing stored off-row. The size limitation is enforced at creation, so all of your column sizes must fit within this limit.

DML triggers are not allowed, neither are foreign key or check constraints. Love GUIDs? I hope so, because identity columns are out, too.

There are two basic types of Memory-Optimized Tables, SCHEMA_ONLY and SCHEMA_AND_DATA.

SCHEMA_ONLY tables are non-durable. You can put data in there, but in the event of a restart or crash, the table is recreated but your data is gone. This could be useful for storing application session state or for staging tables in a data warehouse.
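To illustrate, here is a sketch of a non-durable session-state table; the table and column names are invented for the example:

```sql
-- SCHEMA_ONLY: the table definition survives a restart, the data does not
CREATE TABLE dbo.SessionState
(
    SessionID   UNIQUEIDENTIFIER NOT NULL
        PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
    SessionData VARBINARY(8000)  NULL,
    LastTouched DATETIME2        NOT NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_ONLY);
```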

Indexes

Memory-Optimized Tables can have two types of indexes, Hash Indexes and Range Indexes. All tables must have at least one index, and no more than eight. Also, tables that are defined as SCHEMA_AND_DATA must have a primary key. Indexes are rebuilt each time SQL Server starts up.

A Hash Index is an array of pointers, where each element points to a linked list of rows. The number of elements in the array is controlled by the BUCKET_COUNT clause. In general, you want to set the BUCKET_COUNT to at least the number of unique key values in your table.

If you have too few buckets, then multiple key values will share the same linked list, which will mean longer scans to look for your row. If you have too many, then you will be wasting memory with empty buckets.
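Once a table is loaded, you can sanity-check your BUCKET_COUNT estimate with the hash index stats DMV (the table name below is hypothetical):

```sql
-- Long average chains suggest too few buckets;
-- a high empty_bucket_count suggests too many.
SELECT i.name AS index_name,
       s.total_bucket_count,
       s.empty_bucket_count,
       s.avg_chain_length,
       s.max_chain_length
FROM sys.dm_db_xtp_hash_index_stats AS s
JOIN sys.indexes AS i
    ON i.object_id = s.object_id
   AND i.index_id  = s.index_id
WHERE s.object_id = OBJECT_ID('dbo.MyMemoryOptimizedTable');
```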

Range Indexes are good for when you will be searching for a range of values, or if you are not able to properly estimate the BUCKET_COUNT size. However, Range Indexes are not available in CTP1, so we’ll have to wait a bit to learn more about those.

Queries and Stored Procedures

There are two basic methods for querying Memory-Optimized Tables: Natively Compiled Stored Procedures, or good old-fashioned TSQL, known as Interop, which also includes regular stored procedures.

Natively Compiled Stored Procedures are going to be the fastest. However, they are only able to access Memory-Optimized Tables. If you want to query regular tables along with Memory-Optimized Tables, then you will need to use TSQL Interop. There are a variety of restrictions when using TSQL Interop: no MERGE against memory-optimized tables, no cross-database queries, no locking hints, and no linked servers.

TSQL Interop allows you to make a gradual migration to In-Memory OLTP. This way you can slowly convert a few objects at a time, based on which ones will give you the most performance gain.
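For comparison, a natively compiled procedure has a distinctive shape: NATIVE_COMPILATION, SCHEMABINDING, an EXECUTE AS clause, and an ATOMIC block are all required. The procedure and table names here are invented for the sketch:

```sql
CREATE PROCEDURE dbo.usp_GetSession
    @SessionID UNIQUEIDENTIFIER
WITH NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS OWNER
AS
BEGIN ATOMIC WITH
    (TRANSACTION ISOLATION LEVEL = SNAPSHOT, LANGUAGE = N'us_english')
    -- Only memory-optimized tables may be referenced in here
    SELECT SessionID, SessionData, LastTouched
    FROM dbo.SessionState
    WHERE SessionID = @SessionID;
END;
```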

One Big Caveat

One thing to keep in mind is that tables, indexes, and stored procedures cannot be modified in Hekaton. This means that you will need to drop and re-create these objects in order to make changes. Also, stats have to be rebuilt manually. And then to take advantage of them, the stored procedures would need to be recreated, as well.

Obviously, this is a fairly major restriction. However, I think I can live with this for a version one product. I hope that by the time SQL Server 2015 comes out, there will be an easier way to add a column to a Memory-Optimized Table.

Concurrency

Hekaton offers an improved versioned optimistic concurrency model for Memory-Optimized Tables that removes waiting on locks and latches. Explicit transactions are supported using the Repeatable Read, Serializable, and Snapshot isolation levels. Read Committed and RCSI are only available with autocommit transactions, and RCSI only if no disk-based tables are involved.

High Availability and Disaster Recovery

All the basics, such as backups and restores, are available. Additionally, AlwaysOn and Log Shipping are fully supported. Unfortunately, Mirroring and Transactional Replication are not. However, this isn’t too much of a surprise, since Microsoft is definitely pushing AlwaysOn as the new HA/DR solution.

Migration Assistance

The AMR Tool (Analyze, Migrate, Report) will identify unsupported data types and constraints in tables. It will also recommend which tables and stored procedures should see the most performance improvement from converting to In-Memory OLTP and Memory-Optimized Tables.

Management Data Warehouse offers the Transaction Performance Collection Set, which will help you gather the necessary data in order to let the AMR Tool work its magic.

SQL Server 2014 AMR Tool for In-Memory OLTP

SQL Server 2014 has many new features and improvements over SQL Server 2012. One feature that a lot of people are interested in is In-Memory OLTP. Knowing where or how to take advantage of this feature can hold people back from playing with it.

The AMR Tool (Analyze, Migrate, Report) helps you simplify migrations to SQL Server 2014 In-Memory OLTP.

SQL Server 2014 AMR Tool

The AMR Tool helps you identify which tables and stored procedures will benefit from In-Memory OLTP. If you already have some migration plans, then the AMR Tool can help validate your plans. It will evaluate what needs to be done to migrate your tables and stored procedures.

In order to take advantage of the AMR Tool you will need the following three items:

  • A target database that you want to migrate to SQL Server 2014. This needs to be SQL Server 2008 or higher. So no old-school databases here.
  • A copy of SQL Server 2014 CTP1 Management Studio installed. Note, you do not need a SQL Server 2014 instance or database, just the tools.
  • And last, a Management Data Warehouse with the Transaction Performance Collection Set installed.

Once you have these items set up, you are ready to begin using the AMR Tool to generate recommendations based on the access characteristics of your workload, contention statistics, and the CPU usage of stored procedures.

Resources

Benjamin Nevarez has a nice tutorial on using the AMR Tool. Another good resource is the Hekaton whitepaper by Kalen Delaney. If you don’t already have SQL Server 2014 CTP1, you can download it here.
