Clustered Columnstore Updates

Recently the SQLCAT team published a post on their blog titled “Data Loading performance considerations with Clustered Columnstore indexes”. In it they talk about loading data into partitioned tables where the load is aligned. Interestingly, their recommended process is to load into a staging table that has the clustered columnstore index created first.

This advice still hinges on being able to hit the magic number of 1,048,576 rows, or at the very least 102,400, per bulk insert, so that compressed segments are created directly. If you cannot, then I still feel that the method of loading into a heap, and then creating the clustered columnstore index, is preferable. This is because if you trickle insert into a CCI, you will end up with a mixture of open/closed delta stores and some compressed segments. You could leave it to the tuple mover to compress these, but you would still be left with some open delta stores, and, as Remus Rusanu points out, this is not recommended. To complete the process quickly, and to remove any open delta stores, you would need to rebuild the partition. Continue reading
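To illustrate the heap-first approach, here is a minimal T-SQL sketch; the table, file path, and index names are all made up for illustration:

```sql
-- Illustrative names throughout; not the exact tables from the post.
-- 1. Load into a plain heap first.
CREATE TABLE dbo.SalesStage
(
    SaleDate date  NOT NULL,
    Amount   money NOT NULL
);

BULK INSERT dbo.SalesStage
FROM 'C:\load\sales.dat'
WITH (TABLOCK);

-- 2. Build the clustered columnstore index afterwards, so every row
--    ends up in compressed segments rather than open delta stores.
CREATE CLUSTERED COLUMNSTORE INDEX cci_SalesStage
    ON dbo.SalesStage;

-- If you trickle insert into an existing CCI instead, closing the
-- remaining open delta stores means rebuilding, e.g. per partition:
-- ALTER TABLE dbo.SalesStage REBUILD PARTITION = 1;
```

The point of step 2 is that the index build sees all of the rows at once, so it never has to go through a delta store at all.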

Defaulting Data Compression on Creating Indexes

Here’s a theoretical situation:

  • You have a vendor database that is used solely for staging very large tables; the tables are created and dropped regularly.
  • The tables can exist for a few hours, or for days.
  • This database has only one type of table, with a few indexes created on it, none of which are compressed.
  • You cannot change the code.
  • However, when compressed, these tables/indexes can save up to 90% of space.
  • With space being a commodity, the idea that we can allocate a disk of a few hundred GB as opposed to several TB is very appealing to all concerned.
  • Compression can possibly even make the process run faster.
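One way to default compression without touching the vendor's code — a sketch of my own, an assumption rather than the only option — is a database-scoped DDL trigger that catches index creation and rebuilds the new index with page compression:

```sql
-- Sketch only: rebuild any newly created index with PAGE compression.
-- The trigger name is illustrative, and the dynamic rebuild is
-- deliberately simplistic (no partition or columnstore handling).
CREATE TRIGGER trg_DefaultCompression
ON DATABASE
FOR CREATE_INDEX
AS
BEGIN
    DECLARE @schema sysname =
        EVENTDATA().value('(/EVENT_INSTANCE/SchemaName)[1]', 'sysname');
    DECLARE @table  sysname =
        EVENTDATA().value('(/EVENT_INSTANCE/TargetObjectName)[1]', 'sysname');
    DECLARE @index  sysname =
        EVENTDATA().value('(/EVENT_INSTANCE/ObjectName)[1]', 'sysname');

    DECLARE @sql nvarchar(max) =
        N'ALTER INDEX ' + QUOTENAME(@index)
      + N' ON ' + QUOTENAME(@schema) + N'.' + QUOTENAME(@table)
      + N' REBUILD WITH (DATA_COMPRESSION = PAGE);';

    EXEC sys.sp_executesql @sql;
END;
```

The obvious trade-off is that every index gets built twice, so this only pays off when the compression saving (up to 90% here) outweighs the extra build time.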

Continue reading

Configuring Available Memory for SQL Server Reporting Services

If you need to configure the amount of memory available to an instance of SSRS you have to get your hands dirty and edit the RsReportServer.config file. The RsReportServer.config file stores settings that are used by Report Manager, the Report Server Web service, and background processing. Continue reading
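As an illustration, the memory ceiling and floor live inside the `<Service>` element of RsReportServer.config as `WorkingSetMaximum` and `WorkingSetMinimum`, expressed in kilobytes; the figures below are illustrative, not recommendations:

```xml
<Service>
  <!-- Values are in kilobytes; 4194304 KB = 4 GB (illustrative figures). -->
  <WorkingSetMaximum>4194304</WorkingSetMaximum>
  <WorkingSetMinimum>2097152</WorkingSetMinimum>
</Service>
```

SSRS needs to be restarted for changes to this file to take effect.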

Tidying Up Ugly Tab Titles in SSMS

Ever get tired of the ugly looking tabs at the top of SSMS, full of superfluous information? I was recently told of a method to tidy up the info displayed on the tabs, making them more legible:

[Screenshot: ugly tab titles]

Go to Tools > Options

[Screenshot: Tools menu]

Then navigate to Text Editor > Editor Tab and Status Bar

[Screenshot: Editor Tab and Status Bar options, before]

The options underneath “Tab Text” determine what is displayed on the tab; each one can be toggled on or off.

[Screenshot: Tab Text options]

Once you’ve made your changes, click “OK” and your tabs will be updated.

[Screenshot: tidied tab]

Microsoft’s Lousy Responses To More Connect Issues

I was catching up on some blogs yesterday, and one I really liked was this post on Spaghetti DBA. The author’s main complaint was that he had grown disillusioned with Connect and the way in which so many issues are dealt with (i.e., closed as “Won’t Fix”). All the author wanted was better feedback from the engineers when responding to issues. I’m adding my voice to this request because in the past year I have raised a couple of issues myself, only to be disappointed with the feedback: Continue reading

New Year Musings

Hello! And Happy New Year!

I’ve spent much of the past month off work looking after my daughter whilst my wife is at work. It seems that while this time of year is quiet for me at the office, in the NHS it’s the busiest period of the year. So it has been great to spend time with Phoebe at home, which has resembled a building site since the end of October. Indeed, as I work from home I have had to move the computer five different times whilst work was completed. During that time I’ve learnt more things about plumbing than I’ve ever wanted to know, and surprised myself when I kept a remarkably cool head when I noticed water leaking out of the ceiling (from the room I had just removed the radiator from successfully (I thought) and whose pipes I had capped) into our living room. And here is some advice which is as unrelated to technology as you’ll ever read on this site, but invaluable nonetheless: try not to reuse caps to cap off radiator pipes, as you have to turn them so tight they tend to break up when you try to use them again. Which is exactly what I had done. I thought they were screwed on well enough until I turned the heating on and water got flowing around the system, which was when the water started to leak out of the busted cap. Fortunately for me no damage was done and I was able to drain the entire heating system, which unfortunately coincided with us living without heating during the coldest days of 2014, until the plastering was done. It’s all part of us paying our dues until the house is done. Currently we are without a shower/bath, though mercifully we are not far away from friends who are kind enough to let us use their bathroom. Continue reading

What I Talk About When I Talk About Blogging

This post has been nearly a year in the making. When I hit the 100 post mark, which was roughly a year into writing this blog, I wanted to share some of my thoughts about blogging, what it meant to me, and how someone can start up a blog and still be actively posting a year later. But I decided not to, as I felt a year and 100 posts was not nearly enough time to post anything with any real authority. But a further year and another 100 posts later, I still wanted to share my thoughts on blogging. If nothing else, it’ll be interesting to read this post in 2/3/4 years’ time and see just how much of what I wrote I still agree with. This post is not definitive; rather, it’s like browsing a junk shop: I’m sure there’ll be something for someone to take home from this collection of thoughts. Continue reading

Processing Cubes After a Data Refresh: Batch Processing Using AMO

Recently I needed to update some cubes in a test environment after the underlying data warehouse had been refreshed. Typically I’d sync the cubes from prod, but as sync causes locking, and I needed to get this done during the day, I needed to reprocess all of the cubes in the test environment, and there were a lot of them, ranging in size from hundreds of GB down to a few MB. The simplest way to update the cubes was to run a Process Default on the dimensions and then a Process Default on the measure groups. Some of the measure groups were partitioned, so I could’ve got clever and fully processed only the unprocessed partitions, but I felt a default would do most of the hard work for me. Continue reading
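The dimensions-then-measure-groups pass can be sketched in PowerShell with AMO; the server and database names below are placeholders, and this assumes the AMO assembly is installed on the machine:

```powershell
# Sketch: Process Default on dimensions, then on measure groups, via AMO.
# "MyTestSsasServer" and "MyWarehouseDb" are placeholder names.
[void][Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices")

$server = New-Object Microsoft.AnalysisServices.Server
$server.Connect("MyTestSsasServer")

$db = $server.Databases.FindByName("MyWarehouseDb")

# Process Default brings each object up to a fully processed state,
# doing only the work that is actually required.
foreach ($dim in $db.Dimensions) {
    $dim.Process([Microsoft.AnalysisServices.ProcessType]::ProcessDefault)
}

foreach ($cube in $db.Cubes) {
    foreach ($mg in $cube.MeasureGroups) {
        $mg.Process([Microsoft.AnalysisServices.ProcessType]::ProcessDefault)
    }
}

$server.Disconnect()
```

Dimensions go first because processing a dimension can invalidate the measure groups that reference it, so the reverse order would mean processing some objects twice.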

Don’t Be A Jerk, Use Approved Verbs in PowerShell

Today I’m going on a mini rant about people abusing PowerShell, specifically to do with naming PowerShell functions. If you’re going to take the trouble to write PowerShell functions and add them to a module for others to use, please, don’t be a jerk: use a verb from the approved verbs list that is freely and readily available on the MSDN developer site, or even by typing “Get-Verb” in the console.
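For example (the function and noun names here are made up for illustration), “Download” is not on the approved list, but “Get” is:

```powershell
# Bad: "Download" is not an approved verb, so PowerShell warns when a
# module exporting this function is imported.
function Download-Data { param($Uri) Invoke-WebRequest -Uri $Uri }

# Good: "Get" is on the approved list, and the function does the same thing.
function Get-Data { param($Uri) Invoke-WebRequest -Uri $Uri }

# Check whether a verb is approved before naming a function:
Get-Verb | Where-Object { $_.Verb -eq 'Get' }
```

Sticking to approved verbs isn’t just pedantry: it’s what makes commands discoverable, because users can guess that fetching anything starts with Get-.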

Continue reading