Making the best use of Maintenance Plans on SQL Server

August 29, 2010

This article is about making the best use of SQL Server Maintenance Plans. I say that because if you really want to follow best-practice maintenance on a SQL Server, you should do it at the script level and trigger it with scheduled jobs; that way you have more control over the objects that actually need maintenance, each on its own schedule.  But if you are like me, whose main function is not DBA but Developer, then the next best thing is making the best use of what's out there: something you can use immediately, in the quickest and safest manner, without understanding the whole structure of the database.  This can be subjective, but this is as generic as I can get; you can tweak these recommendations to your needs, and I explain below why I made these choices.

Now, the image above shows my full maintenance plan, designed as close to best practice as possible with the help of other DBAs as well as answers found online. Take note of the scheduling and the precedence of the tasks; each is placed at that schedule for a reason, as explained below.

1. Back up Transaction Logs every 30 minutes during the hours the database is in use (you can tighten this to every 15 minutes depending on your requirements).  I say the schedule should cover the hours the database is in use because there is no point backing up transaction logs when there is no database activity, so usually this runs from 8:00 AM to 7:00 PM to cater for people who work early or late.  If your organization runs 24 hours, simply don't stop the schedule and let it continue for the whole 24 hours.  As a reminder of why you do this: you cannot perform a full backup every 30 minutes, so the next best thing is to back up the transaction logs, which capture the database transactions that happened in that specific period.

Now, on the configuration: I set the backups to expire after 7 days, and you can increase or decrease this depending on your requirements.  I keep 7 days so that if anything happens within the week I can restore easily instead of resorting to the tape backups.  I also write the backups to a drive outside my server, so if anything happens to the server the data is safe.  And finally, I check backup integrity just to make sure it is all OK.
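If you later move this step to script level, the equivalent T-SQL might look like the sketch below. The database name and the backup path are placeholders, not values from my plan:

```sql
-- Sketch only: [MyDatabase] and the UNC path are hypothetical placeholders.
-- RETAINDAYS marks the backup set as expiring after 7 days, matching the plan.
BACKUP LOG [MyDatabase]
TO DISK = N'\\backupserver\sqlbackups\MyDatabase_log.trn'
WITH RETAINDAYS = 7,
     CHECKSUM,   -- verify page checksums while writing the backup
     NAME = N'MyDatabase Transaction Log Backup';

-- Mirrors the "check backup integrity" option in the plan
RESTORE VERIFYONLY
FROM DISK = N'\\backupserver\sqlbackups\MyDatabase_log.trn';
```

Note that log backups only work when the database is in the FULL (or BULK_LOGGED) recovery model.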

2. Back up the Database, which should be a Full Backup, every day; this saves you a lot of restore time compared to replaying a long chain of transaction logs.  In this scenario, if a disaster occurs we restore from a full backup plus 22 transaction logs at most.  I always run it between 11:00 PM and 3:00 AM, when it's unlikely anyone is using the database, because a full backup takes a lot of resources and running it during busy hours heavily reduces database performance.  If your organization is active 24×7, then you have to do the backup at the time the database has the least activity.

Now, the configuration is similar to the Transaction Log backup.
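As a script-level sketch, the full backup looks almost identical to the log backup; again, the database name and path are hypothetical:

```sql
-- Sketch only: [MyDatabase] and the path are hypothetical placeholders.
BACKUP DATABASE [MyDatabase]
TO DISK = N'\\backupserver\sqlbackups\MyDatabase_full.bak'
WITH RETAINDAYS = 7,   -- expire after 7 days, as configured in the plan
     CHECKSUM,
     NAME = N'MyDatabase Full Backup';

RESTORE VERIFYONLY
FROM DISK = N'\\backupserver\sqlbackups\MyDatabase_full.bak';
```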

3. Reorganize Indexes to gain database performance; this task moves the index pages into a more efficient search order.  Then perform Update Statistics, since statistics are not updated during an index reorganize, and this adds a further performance kick for your database.  Statistics are metadata about the data within a table, used when generating the execution plans that access data in your database, so having accurate and up-to-date statistics is crucial.  In my schedule I do this every day, before the business day starts and after the full backup, as it is not a really intensive task compared to a full index rebuild.

Now, on the configuration of the reorganize index, you will notice there is a Compact Large Object option which I have left unchecked.  It is only available from SQL Server 2005 onward, and what it does is compact large object (LOB) data types such as images or text (similar to zipping a large file), so there is no point if you are not using those data types in your database; in any case it's not good practice to place large binary types in the database, as it greatly affects performance.  For Update Statistics I chose an update mode of All existing statistics, which means I keep up-to-date statistics on both the columns and the indexes of my database, and a scan type of 30% sample, which still gives me a good sample of the data; you can lower the value if you have massive amounts of data, or do a full scan if you have very little.  Think of it like sampling a population: if you're running statistics on the Pitcairn Islands, with a total population of 50, you would sample the whole population to get an exact result; but if you are doing your statistics on China, then 5% would give you a good idea, and if you do a full sampling, well…
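The two tasks above, scripted against a single table (the table name is a placeholder; a real script would loop over the tables that need it), might look like this:

```sql
-- Sketch only: dbo.MyTable is a hypothetical table name.
-- Reorganize all indexes on the table; LOB_COMPACTION = OFF matches
-- leaving the "Compact Large Object" option unchecked in the plan.
ALTER INDEX ALL ON dbo.MyTable
REORGANIZE WITH (LOB_COMPACTION = OFF);

-- Reorganize does not refresh statistics, so do it explicitly:
-- ALL = all existing statistics (columns and indexes),
-- SAMPLE 30 PERCENT = the 30% scan chosen in the plan.
UPDATE STATISTICS dbo.MyTable
WITH SAMPLE 30 PERCENT, ALL;
```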

4. Rebuild Indexes to rebuild everything again from scratch, which gives you better performance gains than a reorganize.  This is a more CPU-intensive task and shouldn't be done on a daily basis, so I do it once a week, on a weekend.  For this one we don't need a separate Update Statistics step, as statistics are updated when an index rebuild runs.

Now, on my configuration, the free-space option I chose is to reorganize with the default amount of free space, since the indexes' FILLFACTORs were properly set when the database was created.  If you want to use the free-space-percentage option instead, note that it applies the same value to all your indexes, so any customization you made on individual indexes is overwritten; best practice, if you want that level of control, is to do it at the script level so you can adjust the value per index.  For more information about FILLFACTOR there are good reads with good explanations online.
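This is exactly where the script-level approach pays off: you can rebuild most indexes with their stored fill factor and override it only where needed. Table and index names here are hypothetical:

```sql
-- Sketch only: dbo.MyTable and IX_MyTable_Col1 are hypothetical names.
-- Rebuild all indexes on the table, keeping each index's own FILLFACTOR.
ALTER INDEX ALL ON dbo.MyTable REBUILD;

-- Override the fill factor for one specific index only.
ALTER INDEX IX_MyTable_Col1 ON dbo.MyTable
REBUILD WITH (FILLFACTOR = 90);
```

A rebuild also updates the index statistics with the equivalent of a full scan, which is why no separate Update Statistics task follows it in the plan.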

5. Shrink Database is not really recommended, but I am using it to conserve space as I don't have much on my server.  This is also done once a week, every weekend, after a full backup, just to make sure.

Now, if you have searched Google a lot, you will notice this is not recommended by most DBAs, as it degrades performance; the expert recommendations on the subject are easy to find.

But what if you need space, badly?  There should be a balance, especially if you are constrained on disk.  If you don't have much space then you can do this, but in moderation; once a week, perhaps, would be good, but if you have heaps of space you don't need to do it at all.  The options are straightforward.  "Shrink database when it grows beyond" means the task will only start when the database reaches the specified size.  "Amount of free space to remain after shrink" means it stops shrinking when the free space in the database files reaches that size.  And for the last option I chose "Return freed space to operating system", as I need the extra disk space; the other option will not give you disk space back, because the database is condensed into contiguous pages but the pages are not de-allocated.
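The two behaviours map onto `DBCC SHRINKDATABASE` like this (the database name is a placeholder):

```sql
-- Sketch only: MyDatabase is a hypothetical database name.
-- Target 10% free space; by default the freed pages are de-allocated
-- and the space is returned to the operating system.
DBCC SHRINKDATABASE (N'MyDatabase', 10);

-- NOTRUNCATE condenses pages toward the front of the files but keeps
-- the freed space allocated, so the files do not shrink on disk --
-- this is the option that would NOT give you disk space back.
DBCC SHRINKDATABASE (N'MyDatabase', 10, NOTRUNCATE);
```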

6. Backing up the system databases is also important, as they keep the information on how your database entities are structured.  They must be backed up regularly, especially when you make a lot of changes to the structure of your database.  I do mine once a week, as we barely change database entities.

The options for this one are the same as for backing up the user databases.
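Scripted, this is just full backups of the key system databases; the paths are placeholders, and note that tempdb cannot be backed up:

```sql
-- Sketch only: the UNC paths are hypothetical placeholders.
BACKUP DATABASE [master] TO DISK = N'\\backupserver\sqlbackups\master_full.bak' WITH CHECKSUM;
BACKUP DATABASE [msdb]   TO DISK = N'\\backupserver\sqlbackups\msdb_full.bak'   WITH CHECKSUM;
BACKUP DATABASE [model]  TO DISK = N'\\backupserver\sqlbackups\model_full.bak'  WITH CHECKSUM;
```

msdb is worth special attention here, since it holds your Agent jobs, the maintenance plans themselves, and the backup history.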

7. Then Clean Up, which removes any logs that were created by backups, the SQL Server Agent, and the Maintenance Plan.  I retain only 4 weeks of data and run it every day; it is not a really process-intensive task, but it's still best done outside office hours.

Now, my option for this one is to clean everything.  I want everything cleaned anyway; if I need old logs I can go to the tape backups, and I have 4 weeks' worth of data readily available.
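The three kinds of history the task cleans up can also be purged with the msdb system procedures, using a four-week cutoff to match the plan:

```sql
-- Sketch only: purge history older than 4 weeks, as in the plan.
DECLARE @cutoff DATETIME;
SET @cutoff = DATEADD(WEEK, -4, GETDATE());

-- Backup and restore history
EXEC msdb.dbo.sp_delete_backuphistory @oldest_date = @cutoff;

-- SQL Server Agent job history
EXEC msdb.dbo.sp_purge_jobhistory @oldest_date = @cutoff;

-- Maintenance plan log entries
EXEC msdb.dbo.sp_maintplan_delete_log @oldest_time = @cutoff;
```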

Now that you have your maintenance plan set up, don't just stop there: good practice is always redundancy, so I suggest you also have file-level backups and SQL Agent backup jobs in place, and if you are in a virtual environment such as VMware, Xen, or Hyper-V, then back up the image as well.

