Archive for the ‘System Stored Procs’ Category

What does NULLIF() do? How is it different from ISNULL()?

October 7, 2010 3 comments

NULLIF( ) : Returns a null value if the two expressions specified in the function are equal.

The syntax used is

NULLIF ( expression1 ,expression2 )

If expression1 = expression2, then the function will return NULL.
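A quick sketch of the behaviour (the values and variable name are illustrative):

```sql
-- NULLIF returns NULL when both expressions are equal,
-- otherwise it returns the first expression.
SELECT NULLIF(10, 10) AS EqualCase;    -- returns NULL
SELECT NULLIF(10, 5)  AS UnequalCase;  -- returns 10

-- A common use: avoid divide-by-zero errors by turning a
-- zero denominator into NULL.
DECLARE @Denominator int;
SET @Denominator = 0;
SELECT 100 / NULLIF(@Denominator, 0) AS SafeDivision;  -- NULL, not an error
```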

ISNULL( ): The function replaces a null expression with a specified value.

The syntax used is

ISNULL ( check_expression , replacement_value )

Continue Reading …


Set Nocount On – Hide the number of records affected by a query

October 7, 2010 Leave a comment

Often it might be necessary to hide the number of records affected by a query, be it a simple SELECT or INSERT statement. Usually, as soon as we execute a query, a message pops up on the Messages tab, as shown below, with the number of records affected.
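A minimal sketch of the setting in action (the table name is illustrative):

```sql
SET NOCOUNT ON;   -- hide the "(N row(s) affected)" message for this session
SELECT * FROM MyTable;
INSERT INTO MyTable (Id) VALUES (1);   -- no rowcount message is returned
SET NOCOUNT OFF;  -- restore the default behaviour
```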

Continue Reading ….

Renaming MS SQL Server database

February 8, 2010 2 comments

Renaming a database is not done very often; however, I have done it myself many times. This can be done in two ways. The common method is to use an ALTER DATABASE script. An example is as shown below.
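A sketch of the ALTER DATABASE form, using the database names discussed below:

```sql
ALTER DATABASE OldDbName MODIFY NAME = NewDbName;
```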


The above script will rename the database “OldDbName” to “NewDbName”. Another way of achieving the same is by using the stored procedure “SP_RENAMEDB”. The syntax is as below:

EXEC SP_RENAMEDB 'OldDbName' , 'NewDbName'

ALTER DATABASE is preferred over SP_RENAMEDB, as the stored procedure will be deprecated in a future version of SQL Server.

SQL Server might throw an error while renaming a database as below.

This is due to existing connections to the database. To overcome this error, you can:

1. Drop all the connections to the database. Refer to my previous blog for more info on this.

2. Set the database to single-user mode. Refer to the link for more info on this.
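Putting the two together, a sketch of a rename that first forces single-user mode (the database names are illustrative):

```sql
-- Kick out other connections, rename, then reopen the database.
ALTER DATABASE OldDbName SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
ALTER DATABASE OldDbName MODIFY NAME = NewDbName;
ALTER DATABASE NewDbName SET MULTI_USER;
```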

Change the system error messages with custom messages

December 30, 2009 1 comment

This post discusses ways to change the system messages with custom messages. This is not widely used, as it is not advisable to change these messages. However, there is nothing wrong in knowing it. :)

We can change these messages by using the system procs “sp_dropmessage” and “sp_addmessage”.

I executed a query as given in the picture below. 

After this, I used the script below to change the message. The proc “sp_dropmessage” is used to drop the already existing message and “sp_addmessage” to add a new one.


IF EXISTS (SELECT 1 FROM sys.messages WHERE message_id = 100097)
BEGIN
    EXEC sp_dropmessage 100097;
END

EXEC sp_addmessage @msgnum = 100097,
     @severity = 16,
     @msgtext = 'SP %s : My new text message!';


The message was changed as shown in the picture. 
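You can verify the change yourself by raising the message; a sketch, assuming message id 100097 with one %s placeholder as added above:

```sql
-- Raises the custom message, substituting the parameter into %s.
RAISERROR (100097, 16, 1, 'MyProc');
```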

Now, can we do the same using the script below?


set text = 'SP %s : My new text message!'


Try it yourself.

Some SQL best practices

September 24, 2009 Leave a comment

I have been seeing a lot of SQL best-practices posts on the web, and I found some points missing in many of them. I have consolidated a few of my learnings below. Hope it will be useful to you.

Storing objects in Database files
You should store the database catalog in the primary file and all data and objects in secondary files. This configuration reduces disk-access contention.

Designing Filegroups
Create one user-defined filegroup to hold secondary data files and database objects. Configure this filegroup as the default; SQL Server will then automatically store all objects you create in this filegroup.
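A sketch of that configuration (database, filegroup, file names, and the path are illustrative):

```sql
ALTER DATABASE MyDb ADD FILEGROUP UserData;

ALTER DATABASE MyDb ADD FILE
(
    NAME = MyDb_Data2,
    FILENAME = 'D:\SQLData\MyDb_Data2.ndf',
    SIZE = 100MB
)
TO FILEGROUP UserData;

-- New objects will now be created in UserData by default.
ALTER DATABASE MyDb MODIFY FILEGROUP UserData DEFAULT;
```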

For good performance of data and log files
Do not place data files and operating system files on the same drive; doing so will result in disk contention.

Place the tempdb database on a separate drive if possible. In environments with intensive tempdb use, this gives better performance by letting SQL Server perform tempdb operations in parallel with other database operations.

Place data files and transaction log files on separate drives. This gives the best performance by reducing disk contention between data and transaction log files.

Use the CHECK_EXPIRATION and CHECK_POLICY options to apply the Windows password-expiration policy and the local Windows password policy to SQL Server logins.
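These options are set per login; a sketch (the login name and password are illustrative):

```sql
CREATE LOGIN AppUser
WITH PASSWORD = 'Str0ng!Passw0rd',
     CHECK_POLICY = ON,       -- apply the local Windows password policy
     CHECK_EXPIRATION = ON;   -- apply the Windows password-expiration policy
```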

Cleaning up temp objects
If you create a temporary table, drop it after use. This ensures that structures are not left hanging around and allows resources to be reused or reclaimed.

Do not wait for connections to be closed to clean up temporary tables, because many applications use connection pools in which the connections may never get closed. Explicitly dropping temporary objects ensures that you never receive errors when attempting to create them again.
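A sketch of the pattern (the table definition is illustrative):

```sql
CREATE TABLE #Staging (Id int PRIMARY KEY, Amount money);

INSERT INTO #Staging (Id, Amount) VALUES (1, 10.00);
-- ... work with #Staging ...

-- Drop explicitly; do not rely on the connection closing.
DROP TABLE #Staging;
```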

Security – assigning permissions

  • Security best practice dictates that you never grant permissions directly to a user. Instead, add a Windows login to a Windows group and add the Windows group as a login to SQL Server. You then add this group as a user in a database.
  • Create roles in a database corresponding to various job functions, and assign database users to the appropriate role.
  • Assign security permissions on objects in the database to the database role.
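A sketch of the whole chain, using the era-appropriate sp_addrolemember procedure (all names are illustrative):

```sql
-- Windows group becomes a SQL Server login...
CREATE LOGIN [DOMAIN\SalesGroup] FROM WINDOWS;

-- ...the login becomes a database user...
USE MyDb;
CREATE USER SalesGroupUser FOR LOGIN [DOMAIN\SalesGroup];

-- ...the user joins a role per job function...
CREATE ROLE SalesReader;
EXEC sp_addrolemember 'SalesReader', 'SalesGroupUser';

-- ...and permissions are granted to the role, never to the user.
GRANT SELECT ON dbo.Orders TO SalesReader;
```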

If possible, avoid SELECT * queries, which return all columns in a table. Always specify a column list, which ensures that you don’t select columns that are not needed.

Use different column names
Make sure that every output column of a query has a distinct column name. Applications should always rely on column names for retrieving data and should never depend on column ordinal position.

Avoid cursors
This might be the millionth time you are hearing it, but I thought of adding it anyway: avoid cursors. Ideally, cursors should be used only when a set-based solution is impossible to implement.

Test your code by Using transactions
Begin a transaction before running your code, and then roll back the transaction when you’re done testing. Your data will be in the same state it was in when you started.
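A sketch of the pattern (the table and values are illustrative):

```sql
BEGIN TRANSACTION;

UPDATE dbo.Orders SET Status = 'Shipped' WHERE OrderId = 42;  -- code under test
SELECT Status FROM dbo.Orders WHERE OrderId = 42;             -- inspect the effect

ROLLBACK TRANSACTION;  -- the data returns to its original state
```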

use schema binding
This helps avoid dropping objects accidentally. Create a view with SCHEMABINDING on a table, even if the view is not otherwise used; the underlying table then cannot be dropped while the schema-bound view exists.
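A sketch (the table and view names are illustrative):

```sql
-- The view does not need to be used anywhere; while it exists,
-- DROP TABLE dbo.Orders (and ALTER of the bound column) will fail.
CREATE VIEW dbo.vOrdersGuard
WITH SCHEMABINDING
AS
SELECT OrderId FROM dbo.Orders;
```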

Using Stored Procedures to modify data
Stored procedures are always a better option because you can more easily validate changes via stored procedures. Stored procedures are also more flexible.

Try to avoid recompilation of Stored Procs
Stored procedures are compiled into the query cache when executed. Compilation creates a query plan as well as an execution plan. SQL Server can reuse the query plan for subsequent executions, which conserves resources. But the RECOMPILE option forces SQL Server to discard the query plan each time the procedure is executed and create a new query plan. There are only a few extremely rare cases when recompiling at each execution is beneficial, such as if you add a new index from which the stored procedure might benefit. You typically should not add the RECOMPILE option to a procedure when you create it.

Code efficiency
Use built-in procedures and functions instead of custom ones, as these are optimized for performance.

Referential integrity and triggers
You can use triggers to enforce referential integrity. However, you should not use triggers in place of declarative referential integrity (DRI) via a FOREIGN KEY constraint. DRI is enforced when the modification is made, before the change is part of the table, and is much more efficient than executing trigger code. However, you cannot define FOREIGN KEY constraints across databases. To  enforce referential integrity across databases, you must use triggers.
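A sketch of the DRI alternative (the table and column names are illustrative):

```sql
-- Declarative referential integrity: enforced before the change
-- lands in the table, no trigger code required.
ALTER TABLE dbo.Orders
ADD CONSTRAINT FK_Orders_Customers
    FOREIGN KEY (CustomerId) REFERENCES dbo.Customers (CustomerId);
```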

Using filegroup backups
You should select a filegroup backup method when the size of a database makes it impractical to either back up or restore an entire database while still meeting your recovery requirements.
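A sketch of a filegroup backup (names and paths are illustrative):

```sql
BACKUP DATABASE MyDb
    FILEGROUP = 'UserData'
    TO DISK = 'D:\Backups\MyDb_UserData.bak';
```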

moving databases
Because creating the files from scratch can consume a significant amount of time, you should not drop a database before a restore if you are going to overwrite it. If you are using backup and restore to move a database to a different server with a different directory structure, or the directory structure has changed, you can use the WITH MOVE option to have the restore operation create the underlying files in a path different from the original backup.
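A sketch of a restore with WITH MOVE; the logical file names can be checked first with RESTORE FILELISTONLY (all names and paths are illustrative):

```sql
RESTORE DATABASE MyDb
FROM DISK = 'D:\Backups\MyDb.bak'
WITH MOVE 'MyDb_Data' TO 'E:\SQLData\MyDb.mdf',
     MOVE 'MyDb_Log'  TO 'F:\SQLLogs\MyDb.ldf';
```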

Recovering to a point in time after a disaster
During most disaster scenarios, you will have transactions in the log that have not yet been backed up. For this reason, your first step in any recovery operation is to issue one final BACKUP LOG command, which captures all remaining committed transactions that have not been backed up. You can issue a BACKUP LOG command against a database even if every data file, including the primary data file, is no longer available. The backup of the tail of the log then becomes the final transaction log that you apply in the restore process, enabling the database to be recovered without any loss of data.
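A sketch of that final tail-log backup (names and paths are illustrative):

```sql
-- NO_TRUNCATE allows the log backup even when data files are
-- damaged or missing.
BACKUP LOG MyDb
    TO DISK = 'D:\Backups\MyDb_tail.trn'
    WITH NO_TRUNCATE;
```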

Shrinking databases
As with the automatic shrink setting, the manual shrink process takes place in the background and does not affect users in the database, but the process of shrinking a database can consume system resources and degrade server performance. Also, as with auto shrinks, continually shrinking and regrowing a database can lead to fragmentation at the file level, which can be difficult to fix in busy database environments. DBAs should perform database shrink operations or transaction log shrink operations (covered in a moment) only when they are certain that the unused space being reclaimed will not be needed in the future.

When all instances reside within a single domain or across trusted domains, you should use Windows authentication. When instances span nontrusted domains, you should use certificate-based authentication.

use sys.dm_db_index_usage_stats
The sys.dm_db_index_usage_stats DMV can be used to find indexes that the query optimizer is not using. If the system has been running for a while and an index does not have any seeks, scans, or lookups registered for it, there is a strong possibility that the index is not being used to satisfy any queries. Or an index might show activity but no longer be used.
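A sketch of such a query against the current database (the exact filters are a judgment call):

```sql
-- Indexes with no seeks, scans, or lookups since the last restart.
SELECT OBJECT_NAME(i.object_id) AS TableName,
       i.name                   AS IndexName
FROM sys.indexes AS i
LEFT JOIN sys.dm_db_index_usage_stats AS s
       ON s.object_id   = i.object_id
      AND s.index_id    = i.index_id
      AND s.database_id = DB_ID()
WHERE OBJECTPROPERTY(i.object_id, 'IsUserTable') = 1
  AND (s.index_id IS NULL
       OR s.user_seeks + s.user_scans + s.user_lookups = 0);
```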

Use RC4 for minimal encryption strength and best performance, and AES if you require strong encryption; the stronger algorithm will affect performance.

When using log shipping for availability reasons only, use No Recovery mode.

Executing dynamic queries with more than 4000 characters using variables

September 4, 2009 Leave a comment

Usually a problem occurs when we try to build a query dynamically and its length exceeds 4000 characters (for a variable of type nvarchar) or 8000 (in the case of varchar): the query stored in the variable gets truncated when it reaches the limit. One of the easiest ways to overcome this is to split the query across a manageable number of variables. You can then execute the query by concatenating the variables. However, do not use “SP_EXECUTESQL” for this, as it will throw an error when you try to execute. SP_EXECUTESQL can execute a string or a variable, but not a concatenation of variables.

That is, Exec sp_executesql @var1+@var2 will throw an error.

You can use EXEC( ) to execute these types of queries. All you need to do is EXEC(@var1 + @var2 + @var3).


Declare @var1 nvarchar(max), @var2 nvarchar(max), @var3 nvarchar(max);

Set @var1 = 'Select ';

Set @var2 = ' * From ';

Set @var3 = ' mytable';

exec (@var1 + @var2 + @var3);

The above query will return the records in the table.

Database Compatibility Errors: “If it is intended as a parameter to a table-valued function, ensure that your database compatibility mode is set to 90.”

August 11, 2009 3 comments

You might have encountered errors related to compatibility while running DMVs and functions on SQL Server databases. The error in the header is a typical compatibility error. You might also get errors such as

If it is intended as a parameter to a table-valued function,
ensure that your database compatibility mode is set to 90.
Version90 database compatibility level is not supported

and some others stating that the number of parameters passed is wrong.

Most of the DMVs are not available in versions of SQL Server before 9.0 (2005). To resolve this, you need to make the database compatible with 2005 or above; the missing DMVs/system objects then get created in the database automatically.

You can use ALTER DATABASE to change the compatibility level.

General syntax:

ALTER DATABASE DatabaseName SET COMPATIBILITY_LEVEL = { 80 | 90 | 100 }

where the compatibility values map to versions as follows:
80 = SQL Server 2000 / 8.0

90 = SQL Server 2005 / 9.0

100 = SQL Server 2008 / 10.0

You can alternatively use the system stored proc “SP_DBCMPTLEVEL”.

syntax is : SP_DBCMPTLEVEL 'DBname','Compatibility_value'



It will be better if you change the compatibility level to 9.0 or greater.