TSQL Performance Rule #1

There’s no significance to this one being number one 🙂 it’s just the one I’ve been thinking about lately 🙂 I may have built this up a bit more than is warranted, so I hope you’re not expecting too much from my number-one performance rule. Oh, OK then, here goes …

“All numbers in stored-procedures should be in single quotes”.

Quoting costs nothing when the column really is an INT, but an unquoted number can force a whole VARCHAR column to be converted to INT.

Consider WHERE SomeColumn = 42. Data-type precedence means VARCHARs are always converted to INTs, never the other way around. That one numeric value (42) could force a million rows in “SomeColumn” to be converted to INT before they can be tested – significantly affecting performance.

Consider WHERE SomeColumn = ’42’. “SomeColumn” is either numeric or non-numeric. If it’s INT then just one value (the ’42’ in the WHERE clause) has to be converted to INT (taking no time at all). If “SomeColumn” is VARCHAR then there is no conversion at all.
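As a minimal illustration (the table and data here are hypothetical), compare the execution plans of the two queries below – the unquoted literal forces an implicit conversion on every row, while the quoted one allows a seek …

-- hypothetical lab: an indexed VARCHAR column
CREATE TABLE dbo.Orders (OrderRef VARCHAR(10) NOT NULL);
CREATE INDEX IX_Orders_OrderRef ON dbo.Orders (OrderRef);

SELECT * FROM dbo.Orders WHERE OrderRef = 42;   -- CONVERT_IMPLICIT on OrderRef: index scan (and a runtime error if any row is non-numeric)
SELECT * FROM dbo.Orders WHERE OrderRef = '42'; -- no conversion: index seek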

Exporting a Report to Excel

Finance wanted to export their reports into spreadsheets, but the company Logo and report Title were messing up the rendering.

To fix this I amended the SQL Server 2012 SSRS config file (called “rsreportserver.config”) after taking a copy.

The location of the config file was …

C:\Microsoft SQL Server\MSRS11.MSSQLSERVER\Reporting Services\ReportServer

I commented out the line …

<Extension Name="EXCELOPENXML" Type="Microsoft.ReportingServices.Rendering.ExcelOpenXmlRenderer.ExcelOpenXmlRenderer,Microsoft.ReportingServices.ExcelRendering"/>

… and replaced it with these 7 lines …

<Extension Name="EXCELOPENXML" Type="Microsoft.ReportingServices.Rendering.ExcelOpenXmlRenderer.ExcelOpenXmlRenderer,Microsoft.ReportingServices.ExcelRendering">
	<Configuration>
		<DeviceInfo>
			<SimplePageHeaders>True</SimplePageHeaders>
		</DeviceInfo>
	</Configuration>
</Extension>

To use this, I moved the report Logo and Title into a page-header block within the reports.

** UPDATE **

On another occasion I was unable to make this global change and resorted to making changes within individual reports.

The method was to right-click each item to be hidden, choose Properties, then Visibility, and paste this expression into the “Show or hide based on an expression” box …

=IIF(Globals!RenderFormat.Name = "EXCELOPENXML", True, False)


Report-Builder Cube Data-Source

I had a tricky situation connecting Report Builder 3 to a cube. I was able to copy the connection string from within SSDT, but it still would not work.

I will use “Adventure Works” for illustration.

The solution was in the error message: “Either the user, does not have access to the AdventureWorksDW2012 database, or the database does not exist.”

It turned out the database did not exist … under its SSMS Database-Engine name (“AdventureWorksDW2012”).

Connecting SSMS to Analysis Services, however, showed a different name: “AdventureWorksDW2012Multidimensional-EE”.

Plugging this into my connection string (with Data Source Type being Analysis Services, and Connect Using being Windows Integrated Security) worked, e.g. …

Provider=SQLNCLI11.1;
Data Source=(ServerName\InstanceName);
Integrated Security=SSPI;
Initial Catalog=AdventureWorksDW2012Multidimensional-EE

Annoyingly (grrr!), I found that just removing the Initial Catalog also worked (ah bah).

Update Statistics on a whole database

Whilst performance tuning an SSRS report server I wanted to update all the statistics within the two databases ‘ReportServer’ and ‘ReportserverTempDB’.

I chose a simply-coded, two-step method (for safety, and to keep control).

First I generated the commands (per database) …

SELECT 'UPDATE STATISTICS [' + SCHEMA_NAME(schema_id) + '].[' + [name] + '] WITH FULLSCAN'
FROM [ReportServer].sys.tables;

SELECT 'UPDATE STATISTICS [' + SCHEMA_NAME(schema_id) + '].[' + [name] + '] WITH FULLSCAN'
FROM [ReportServerTempDB].sys.tables;

… before executing them in a separate session (in the context of the relevant database).
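(For comparison, the undocumented sp_MSforeachtable can do the same in one step – the two-step method above just keeps more control. When a system procedure is called with a three-part name it should run in that database’s context – a sketch:)

EXEC [ReportServer].dbo.sp_MSforeachtable 'UPDATE STATISTICS ? WITH FULLSCAN';
EXEC [ReportServerTempDB].dbo.sp_MSforeachtable 'UPDATE STATISTICS ? WITH FULLSCAN';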

Adding ‘All’ to a report drop-down list

There is a problem when you configure a report parameter to ‘Allow Multiple Values’, in that it won’t work when the dataset is a stored procedure (SP).

One work-around is to accept only single items, but make one of them ‘All’, like this …

1) Create a separate SP to populate the parameter and add an ‘All’ option …

SELECT 'All' CustId
UNION
SELECT DISTINCT CustomerId
FROM SomeTable
ORDER BY CustId

2) Then amend the main SP by joining to SomeTable, and adding a CASE expression to the WHERE clause, like so …

WHERE SomeTable.CustomerId = CASE WHEN @CustId != 'All' THEN @CustId ELSE SomeTable.CustomerId END

Which translates as …

WHERE SomeTable.CustomerId = SomeTable.CustomerId

… when ‘All’ is selected (which will let everything through), or …

WHERE SomeTable.CustomerId = @CustId

… when ‘All’ is not selected.

This allows the user to select either ‘All’ or a single value.

However, if you need to select more than one value, but not all values, you will need another approach: a split-string function (see the sketch below).
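For example – a minimal sketch, assuming SQL Server 2016+ for STRING_SPLIT (older versions need a home-made equivalent), and assuming the report passes the selections to the SP as one comma-separated string, e.g. =Join(Parameters!CustId.Value, ",") …

-- @CustIds arrives as e.g. 'CUST1,CUST7,CUST9'
WHERE SomeTable.CustomerId IN (SELECT value FROM STRING_SPLIT(@CustIds, ','))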

A Report that uses its location

A customer had many copies of the same report – one for each site. And the application database had multiple copies of each table – one for each site – with the table names all prefixed by the site’s name, e.g. “WP London_SalesHeader”, “WP Barcelona_SalesHeader”, etc.

In Report Manager there was a folder for each site, containing sub-folders for the different categories of report (e.g. /London/Sales, or /London/Production).

This is a simplified account of how I created a Report that returned only information specific to its location.

In Report Builder I created a hidden parameter called @site, of type Text, with no “Available Values”, and with its “Default Values” using the built-in global variable ReportFolder.

As the output from this built-in variable would be something like “\Paris\Sales”, I had to create an expression for the “Default Value” of @site, searching for each site name in turn …

=IIf(Globals!ReportFolder.Contains("Barcelona"),"WP Barcelona",
IIf(Globals!ReportFolder.Contains("Paris"),"WP Paris", "WP London"))

Finally, in the report query I derived the table name using the @site parameter.

declare @cmd varchar(max) = 
'select	[SalesCode],
	[Description],
	[NetWeight],
	[SalesDate]
from	[Production].[dbo].[' + @site + '_SalesHeader]'

exec(@cmd)

(NB: As a best-practice I displayed the value of @site, along with the other parameter choices, in the report sub-title.)
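(One small hardening suggestion, since the table name is concatenated into the command: wrapping it in QUOTENAME guards against an unexpected @site value – a sketch …)

declare @cmd varchar(max) =
'select	[SalesCode],
	[Description],
	[NetWeight],
	[SalesDate]
from	[Production].[dbo].' + QUOTENAME(@site + '_SalesHeader')

exec(@cmd)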

Collecting Wait Stats

Based on the work of GS, here is my script to create a Job that collects wait stats every 15 minutes.

--CreateJob_DBA_CollectWaitStats.sql

USE [msdb]
GO

IF  EXISTS (SELECT job_id FROM msdb.dbo.sysjobs_view WHERE name = N'DBA_CollectWaitStats')
EXEC msdb.dbo.sp_delete_job @job_name=N'DBA_CollectWaitStats', @delete_unused_schedule=1
GO

USE [msdb]
GO

BEGIN TRANSACTION
DECLARE @ReturnCode INT
SELECT @ReturnCode = 0
IF NOT EXISTS (SELECT name FROM msdb.dbo.syscategories WHERE name=N'Database Maintenance' AND category_class=1)
BEGIN
EXEC @ReturnCode = msdb.dbo.sp_add_category @class=N'JOB', @type=N'LOCAL', @name=N'Database Maintenance'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
END

declare 
	@today varchar(50) = (select convert(varchar, getdate(), 112)),
	@nextweek varchar(50) = (select convert(varchar, getdate()+8, 112)),
	@dbname varchar(50) = 'master' --<<<<< change to the database that will hold the WaitStats table >>>>>

DECLARE @jobId BINARY(16)
EXEC @ReturnCode =  msdb.dbo.sp_add_job @job_name=N'DBA_CollectWaitStats', 
		@enabled=1, 
		@notify_level_eventlog=0, 
		@notify_level_email=0, 
		@notify_level_netsend=0, 
		@notify_level_page=0, 
		@delete_level=0, 
		@description=N'Collects wait stats for performance tuning.', 
		@category_name=N'Database Maintenance', 
		@owner_login_name=N'sa', @job_id = @jobId OUTPUT
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback

EXEC @ReturnCode = msdb.dbo.sp_add_jobstep @job_id=@jobId, @step_name=N'Create the table', 
		@step_id=1, 
		@cmdexec_success_code=0, 
		@on_success_action=3, 
		@on_success_step_id=0, 
		@on_fail_action=3, 
		@on_fail_step_id=0, 
		@retry_attempts=0, 
		@retry_interval=0, 
		@os_run_priority=0, @subsystem=N'TSQL', 
		@command=N'create table [dbo].[WaitStats] 
(
	WaitType nvarchar(60) not null,
	NumberOfWaits bigint not null,
	SignalWaitTime bigint not null,
	ResourceWaitTime bigint not null,
	SampleTime datetime not null
)', 
		@database_name=@dbname, 
		@flags=0
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback

EXEC @ReturnCode = msdb.dbo.sp_add_jobstep @job_id=@jobId, @step_name=N'Collect current waits', 
		@step_id=2, 
		@cmdexec_success_code=0, 
		@on_success_action=1, 
		@on_success_step_id=0, 
		@on_fail_action=2, 
		@on_fail_step_id=0, 
		@retry_attempts=0, 
		@retry_interval=0, 
		@os_run_priority=0, @subsystem=N'TSQL', 
		@command=N'INSERT INTO [dbo].[WaitStats]
SELECT  wait_type as WaitType,
        waiting_tasks_count AS NumberOfWaits,
        signal_wait_time_ms AS SignalWaitTime,
        wait_time_ms - signal_wait_time_ms AS ResourceWaitTime,
        GETDATE() AS SampleTime
FROM    sys.dm_os_wait_stats
WHERE [wait_type] NOT IN (
	N''BROKER_EVENTHANDLER'', N''BROKER_RECEIVE_WAITFOR'',
	N''BROKER_TASK_STOP'', N''BROKER_TO_FLUSH'',
	N''BROKER_TRANSMITTER'', N''CHECKPOINT_QUEUE'',
	N''CHKPT'', N''CLR_AUTO_EVENT'',
	N''CLR_MANUAL_EVENT'', N''CLR_SEMAPHORE'',
	N''DBMIRROR_DBM_EVENT'', N''DBMIRROR_EVENTS_QUEUE'',
	N''DBMIRROR_WORKER_QUEUE'', N''DBMIRRORING_CMD'',
	N''DIRTY_PAGE_POLL'', N''DISPATCHER_QUEUE_SEMAPHORE'',
	N''EXECSYNC'', N''FSAGENT'',
	N''FT_IFTS_SCHEDULER_IDLE_WAIT'', N''FT_IFTSHC_MUTEX'',
	N''HADR_CLUSAPI_CALL'', N''HADR_FILESTREAM_IOMGR_IOCOMPLETION'',
	N''HADR_LOGCAPTURE_WAIT'', N''HADR_NOTIFICATION_DEQUEUE'',
	N''HADR_TIMER_TASK'', N''HADR_WORK_QUEUE'',
	N''KSOURCE_WAKEUP'', N''LAZYWRITER_SLEEP'',
	N''LOGMGR_QUEUE'', N''MEMORY_ALLOCATION_EXT'',
	N''ONDEMAND_TASK_QUEUE'',
	N''PREEMPTIVE_XE_GETTARGETSTATE'',
	N''PWAIT_ALL_COMPONENTS_INITIALIZED'',
	N''PWAIT_DIRECTLOGCONSUMER_GETNEXT'',
	N''QDS_PERSIST_TASK_MAIN_LOOP_SLEEP'', N''QDS_ASYNC_QUEUE'',
	N''QDS_CLEANUP_STALE_QUERIES_TASK_MAIN_LOOP_SLEEP'',
	N''QDS_SHUTDOWN_QUEUE'',
	N''REQUEST_FOR_DEADLOCK_SEARCH'', N''RESOURCE_QUEUE'',
	N''SERVER_IDLE_CHECK'', N''SLEEP_BPOOL_FLUSH'',
	N''SLEEP_DBSTARTUP'', N''SLEEP_DCOMSTARTUP'',
	N''SLEEP_MASTERDBREADY'', N''SLEEP_MASTERMDREADY'',
	N''SLEEP_MASTERUPGRADED'', N''SLEEP_MSDBSTARTUP'',
	N''SLEEP_SYSTEMTASK'', N''SLEEP_TASK'',
	N''SLEEP_TEMPDBSTARTUP'', N''SNI_HTTP_ACCEPT'',
	N''SP_SERVER_DIAGNOSTICS_SLEEP'', N''SQLTRACE_BUFFER_FLUSH'',
	N''SQLTRACE_INCREMENTAL_FLUSH_SLEEP'',
	N''SQLTRACE_WAIT_ENTRIES'', N''WAIT_FOR_RESULTS'',
	N''WAITFOR'', N''WAITFOR_TASKSHUTDOWN'',
	N''WAIT_XTP_RECOVERY'',
	N''WAIT_XTP_HOST_WAIT'', N''WAIT_XTP_OFFLINE_CKPT_NEW_LOG'',
	N''WAIT_XTP_CKPT_CLOSE'', N''XE_DISPATCHER_JOIN'',
	N''XE_DISPATCHER_WAIT'', N''XE_TIMER_EVENT'')
AND	[waiting_tasks_count] > 0

', 
		@database_name=@dbname,
		@flags=0
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_update_job @job_id = @jobId, @start_step_id = 1
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_add_jobschedule @job_id=@jobId, @name=N'Every 15 mins for a week', 
		@enabled=1, 
		@freq_type=4, 
		@freq_interval=1, 
		@freq_subday_type=4, 
		@freq_subday_interval=15, 
		@freq_relative_interval=0, 
		@freq_recurrence_factor=0, 
		@active_start_date=@today, 
		@active_end_date=@nextweek, 
		@active_start_time=100, 
		@active_end_time=235959, 
		@schedule_uid=N'5b0842fe-8f80-44e9-8a09-aac6ce5c2b2e'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_add_jobserver @job_id = @jobId, @server_name = N'(local)'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
COMMIT TRANSACTION
GOTO EndSave
QuitWithRollback:
    IF (@@TRANCOUNT > 0) ROLLBACK TRANSACTION
EndSave:

GO
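Once the job has been running for a while, something like this (a sketch against the WaitStats table created above) pulls out the biggest waits in the latest sample. NB: the DMV is cumulative since the last restart, so differencing consecutive samples gives per-interval figures.

-- top waits in the most recent sample
SELECT TOP 10 WaitType, NumberOfWaits, SignalWaitTime, ResourceWaitTime, SampleTime
FROM dbo.WaitStats
WHERE SampleTime = (SELECT MAX(SampleTime) FROM dbo.WaitStats)
ORDER BY ResourceWaitTime DESC;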

More thoughts against Triggers

For Rob & Karl – Triggers run silently inside the transaction of the statement that fires them: roll back the insert and the trigger’s work rolls back with it, yet nothing in the calling code even hints that the trigger ran.

Triggers introduce a long-term maintenance headache. You can read a stored-procedure from top to bottom and imagine you understand what it does. But unless you examine every table it touches – you don’t. Little bits of code may be running silently, augmenting or even reversing some of the logic within the stored-procedure.

Triggers are used by lazy developers to ‘bolt on’ new features to applications, rather than tracking down all the code that could insert/update/delete from a table and adding the code (or a link to it) there.

This would be forgivable if the application code was closed or proprietary, but never when the application is open to the application developer, who just cannot be bothered to integrate code changes properly, and cares not-a-jot about long-term maintenance headaches (slow breaths, slow breaths :))

SQL Deadlock graph – arrows

In a SQL deadlock graph the direction of the arrows is an interesting thing.


With my mechanistic head on, I am imagining it as …

  1. Spid-a requested a lock, and then got a lock (a two-way trip)
  2. Spid-b requested a lock, and then got a lock (arrow ends-up pointing at spid)
  3. Spid-a requested a lock, and is waiting (a one-way thing)
  4. Spid-b requested a lock, and is waiting (arrow pointing away from spid)

Capture Deadlock Graph using Profiler

To capture a Deadlock Graph using Profiler (NB: with SQL 2008 and above you can also use an Extended Event – see the sketch after this list) …

  • File / New trace
  • Connection details
  • Use Template / Blank
  • Events Selection / Locks ..1) DeadlockGraph 2) Lock:Deadlock 3) Lock:Deadlock Chain
  • Event Extraction Settings / Save Deadlock XML events separately / (somefilename)
  • Each deadlock in a distinct file
  • All Deadlocks
  • Run
  • (wait)
  • File / Export / Extract SQL Server Events / Extract deadlock Events / (somefilename2)
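For reference, here is a minimal sketch of the Extended-Events alternative (SQL 2012+ syntax; note the built-in system_health session already captures these events by default):

CREATE EVENT SESSION [DeadlockCapture] ON SERVER
ADD EVENT sqlserver.xml_deadlock_report
ADD TARGET package0.event_file (SET filename = N'DeadlockCapture.xel');
GO
ALTER EVENT SESSION [DeadlockCapture] ON SERVER STATE = START;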

Check every Linked Server

I was unable to cobble together some PowerShell code that I could execute within a job-step to check our linked-servers were working.

So I resorted to making the best of the built-in, but flawed, “sp_testlinkedserver” (it’s flawed because when a link fails it errors out – slowly).

The code below, when run in a job-step overnight, will dynamically create one job for each linked-server on the box. The jobs then run, emailing the “DBA” operator for every linked-server that fails, before deleting themselves.

-- testlinkedservers.sql

-- get list of all linked servers on this box

	CREATE TABLE #temp (
		srv_name varchar(MAX), 
		srv_providername varchar(MAX), 
		srv_product varchar(MAX), 
		srv_datasource varchar(MAX), 
		srv_providerstring varchar(MAX), 
		srv_location varchar(MAX), 
		srv_cat varchar(MAX))
	INSERT INTO #temp EXEC sp_linkedservers
	DELETE FROM #temp WHERE srv_name LIKE 'LOGSHIP%'
	DELETE FROM #temp WHERE srv_name = @@SERVERNAME

-- loop

	DECLARE @name VARCHAR(MAX), @cmd VARCHAR(MAX), @run VARCHAR(MAX)
	WHILE (SELECT COUNT(*) FROM #temp) > 0
	BEGIN
	SELECT TOP 1 @name = srv_name FROM #temp

	-- create the job code

	SET @cmd = 'BEGIN TRANSACTION
	DECLARE @jobId BINARY(16)
	SELECT @jobId = job_id FROM msdb.dbo.sysjobs WHERE (name = N''DBA - LinkedServerTest ' + @name + ''')
	IF (@jobId IS NULL)
	BEGIN
	EXEC msdb.dbo.sp_add_job @job_name=N''DBA - LinkedServerTest ' + @name + ''', 
		@enabled=1, 
		@notify_level_eventlog=2, 
		@notify_level_email=2, 
		@notify_level_netsend=0, 
		@notify_level_page=0, 
		@delete_level=3, 
		@description=N''No description available.'', 
		@category_name=N''[Uncategorized (Local)]'', 
		@owner_login_name=N''sa'', 
		@notify_email_operator_name=N''DBA'', 
		@job_id = @jobId OUTPUT
	END

	-- create the job-step code

	IF NOT EXISTS (SELECT * FROM msdb.dbo.sysjobsteps WHERE job_id = @jobId AND step_id = 1)
	EXEC msdb.dbo.sp_add_jobstep @job_id=@jobId, @step_name=N''one'', 
		@step_id=1, 
		@cmdexec_success_code=0, 
		@on_success_action=1, 
		@on_success_step_id=0,
		@on_fail_action=2, 
		@on_fail_step_id=0, 
		@retry_attempts=0, 
		@retry_interval=0, 
		@os_run_priority=0, @subsystem=N''TSQL'', 
		@command=N''sp_testlinkedserver [' + @name + ']'', 
		@database_name=N''master'', 
		@flags=0;

	-- create instantiation code

	EXEC msdb.dbo.sp_add_jobserver @job_id = @jobId, @server_name = N''(local)''

	COMMIT TRANSACTION'

	-- create the job

	EXEC(@cmd)

	-- run the job

	SET @run = 'EXECUTE msdb.dbo.sp_start_job ''DBA - LinkedServerTest ' + @name + ''''
	EXEC(@run)

        -- move to next row in loop

	DELETE FROM #temp WHERE srv_name = @name
	END


Log-Shipping Monitor incorrect after outage

After a virtualization issue caused an unscheduled reboot of production, I found the DR (log-shipping) monitor incorrectly reporting issues.

It seemed the linked-server was no longer working, as @@SERVERNAME returned NULL on Prod.

On Production, sp_addserver failed because the server name was already in sys.servers – but not with server_id 0 (as needed for @@SERVERNAME).

Removing the incorrect entry with sp_dropserver failed because there were remote connections using it. And sp_dropremotelogin failed because there was no remote user called NULL.

The fix was to remove log-shipping first, using the GUI, which was only partially successful. Then manually: deleting the jobs from Prod and DR, and truncating the MSDB system-tables whose names start with log_shipping (on both servers).

Once log-shipping was cleaned off both machines I could successfully run EXEC sp_dropserver ‘ProdServer’, ‘droplogins’ followed by EXEC sp_addserver ‘ProdServer’, LOCAL. With the server name now correctly at the top of sys.servers, the only task left was to schedule a reboot so that SELECT @@SERVERNAME would pick up the new value.
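In code form (the server name is illustrative) …

EXEC sp_dropserver 'ProdServer', 'droplogins';
EXEC sp_addserver 'ProdServer', 'local';
-- NB: @@SERVERNAME only reflects the change after the instance restarts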

After which I could re-configure log-shipping.

Move Databases

I wanted to move about 50 databases on a Sharepoint server off the C-drive
(yes I know).

Sadly the only place I could move both datafiles and logfiles to was the D-Drive
(I know, I know).

Here’s the code I wrote to help me …

--move_db.sql

-- to move a user-database to different-drives on the same-server

USE [master]
GO

-- backup

	DECLARE @dbname VARCHAR(max) = 'SomeDatabaseName' -- database name

	DECLARE @backup_cmd VARCHAR(MAX) = 'BACKUP DATABASE ['+ @dbname + ']
	TO DISK = N''\\SomeNetworkShare\SomeServerName_' + @dbname + '.bak''
	WITH INIT, COMPRESSION, STATS = 1;'

	SELECT (@backup_cmd)
	--EXEC (@backup_cmd)


-- kill connections

	DECLARE @kill_cmd VARCHAR(MAX) = 'DECLARE @kill varchar(8000) = '''';
	SELECT @kill=@kill+''kill ''+convert(varchar(5),spid)+'';'' from master..sysprocesses 
	WHERE dbid=db_id(''' + @dbname + ''') and spid>50;
	EXEC (@kill);'

	SELECT (@kill_cmd)
	--EXEC (@kill_cmd)

-- restore
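	-- NB: assumes the logical file names are '<dbname>' and '<dbname>_log'
	-- (if unsure, check first with RESTORE FILELISTONLY)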

	DECLARE @restore_cmd VARCHAR(MAX) = 'RESTORE DATABASE [' + @dbname + ']
	FROM DISK = N''\\SomeNetworkShare\SomeServerName_' + @dbname + '.bak''
	WITH FILE = 1,  
	MOVE N''' + @dbname + ''' TO N''D:\SQL_Data\' + @dbname + '.mdf'',
	MOVE N''' + @dbname + '_log'' TO N''D:\SQL_Log\' + @dbname + '_log.ldf'',
	REPLACE,  STATS = 1;'

	SELECT (@restore_cmd)
	--EXEC (@restore_cmd)

sp_whoisactive

I tend to paste this (below) into SSMS keyboard shortcuts under Ctrl+F1 so I can instantly see what’s going on. 

EXEC [master].[dbo].[sp_WhoIsActive] @get_outer_command=1, @get_plans=1, @get_additional_info = 1, @get_task_info = 2, @output_column_list = '[dd%][session_id][block%][sql_text][login_name][CPU%][wait_info][tasks][tran_log%][database%][Program%][percent%][host%][reads%][writes%][sql_command][query_plan][locks][%]'

When it’s too slow 🙂 I use a “lite” version under Ctrl+3

EXEC [master].[dbo].[sp_WhoIsActive] @get_additional_info = 1, @output_column_list = '[dd%][session_id][block%][sql_text][sql_command][login_name][CPU%][wait_info][tasks][tran_log%][database%][Program%][percent%][host%][reads%][writes%][query_plan][locks][%]'

Who deleted that data?

Sadly I could not find out.

Going forward – to capture deletes on a table, I set up a regular ‘after delete’ trigger with some extra columns to hold system functions.

This allowed me to capture the date/time, PC-Name and login that originated deletes. Here is my working lab …

--lab_trigger_deletes.sql

--create table to be monitored and add some data
	CREATE TABLE t1 (c1 INT, c2 int)
	INSERT INTO t1 VALUES (1,7), (2,8), (3,9)

-- create audit table
	CREATE TABLE t1_audit (c1 INT, c2 INT, c3 DATETIME, c4 SYSNAME, c5 SYSNAME, c6 SYSNAME)

-- check contents of both tables
	SELECT * from t1
	SELECT * FROM t1_audit

-- create trigger
	CREATE TRIGGER trg_ItemDelete 
	ON dbo.t1 
	AFTER DELETE 
	AS
	INSERT INTO dbo.t1_audit(c1, c2, c3, c4, c5, c6)
			SELECT d.c1, d.c2, GETDATE(), HOST_NAME(), SUSER_SNAME(), ORIGINAL_LOGIN()
			FROM Deleted d

-- delete a row (firing the trigger)
	DELETE FROM t1 WHERE c1 = 2

-- check contents of both tables again
	SELECT * from t1
	SELECT * FROM t1_audit

-- tidy up
	IF OBJECT_ID ('trg_ItemDelete', 'TR') IS NOT NULL DROP TRIGGER trg_ItemDelete;
	DROP TABLE t1;
	DROP TABLE t1_audit;

Two SSDT’s

Microsoft seems to have two products called SSDT (SQL Server Data Tools) …

1. A development environment for SSIS etc.

2. A source-control tool that uses TFS.

To install 1, you choose SSDT as an option whilst installing SQL Server 2008 R2 or 2012.

To install 2, it is a separate download for Visual Studio 2010 and 2012, or a post-installation option for Visual Studio 2013.

(SSDT is really just a badly integrated set of tools. I find treating it as two distinct products keeps me sane whilst googling)

Log Space

I noticed a jump in logfile size the other day and wondered how to predict an autogrowth event.

I know old data is truncated after a log-backup, but that’s internal and not normally visible.

I came up with this to run across production …

--LogSpace.sql
-- To help find near-full logfiles that may autogrow soon.

-- create table to hold raw data (SizeMB is float – DBCC SQLPERF returns fractional MB)
CREATE TABLE #temp (DBName varchar(100), SizeMB float, UsedPct float, [STATUS] bit)

-- populate table
INSERT #temp EXEC('DBCC SQLPERF(logspace)')

-- output
SELECT DBName, SizeMB, UsedPct FROM #temp --WHERE UsedPct > 90 -- 90% full

-- clean-up
DROP TABLE #temp
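On SQL 2012 and above there is also a cleaner per-database DMV (run it in the database of interest) – a sketch:

-- log size and percent used for the current database
SELECT total_log_size_in_bytes / 1048576 AS SizeMB,
       used_log_space_in_percent AS UsedPct
FROM sys.dm_db_log_space_usage;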

Making index changes to Production

These days I use a SQL Job called ‘DBA – index maint’.

Whenever I have an index change to make I paste the script into a new step, name that step with today’s date, change the ‘start step’ to that step, and schedule it to run once overnight.

This gives me a history and outcome, alongside the exact action.

SQL Safe error “Cannot connect to SQL Server instance”

This was fixed by re-installing SQL Safe. Bonus: here is a working restore command with move …

EXEC [master].[dbo].[xp_ss_restore] 
	@database = 'SomeDatabase',
	@filename = 'J:\backups\SomeDatabase.BAK', 
	@backuptype = 'Full',
	@withmove = 'SomeDatabase_data "J:\sql_data\SomeDatabase_data.mdf"',
	@withmove = 'SomeDatabase_log "J:\sql_log\SomeDatabase_log.ldf"',
	@recoverymode = 'recovery',
	@replace = '1';

Diff Backup / Restore (part 3)

On my Reporting Server I created the SQL Job ‘DBA – Restore MyDatabase’ with two steps 1) Restore MyDatabase, and 2) Add Users.

Step 2 just de-orphaned some user accounts, e.g. …

EXEC sp_change_users_login 'Auto_Fix', 'uidServerLink';

Step 1 contained this code …

-- Restore

	EXECUTE [master].[dbo].[DatabaseRestore] 
		@dbName = 'MyDatabase',
		@SourceServer = 'MySourceServer',
		@backupPath = 'M:\Backups'


-- Change recovery model

	ALTER DATABASE MyDatabase set recovery SIMPLE

https://richardbriansmith.wordpress.com/2015/10/02/diff-backup-restore-part-1/

https://richardbriansmith.wordpress.com/2015/10/02/diff-backup-restore-part-2/

Diff Backup / Restore (part 2)

On the Production Server I created the Job ‘DBA – Backup/Restore MyDatabase’ with two steps 1) Backup MyDatabase, and 2) Restore MyDatabase.

Step 2 just started the Restore job on the Reporting server (detailed in “Part 3”)

Step 1 needed to check that no out-of-band backup had broken the differential base in the last 24 hours before starting a Diff backup …

-- If it's Friday or the LSNs do not match - do a FULL backup

	DECLARE @DBName VARCHAR(100) = 'MyDatabase' --<< Database Name
	IF (SELECT DATEPART(dw, GETDATE())) = 6	--<< 6 = Friday (with the default DATEFIRST of 7)
	OR (SELECT MAX(differential_base_lsn)
	    FROM [MyProdServer].[master].[sys].[master_files]
	    WHERE [name] LIKE '%' + @DBName + '%')
	    !=
	   (SELECT MAX(differential_base_lsn)
	    FROM [MyReportServer].[master].[sys].[master_files]
	    WHERE [name] LIKE '%' + @DBName + '%')

	BEGIN
		SELECT 'We can only do a FULL backup'
		EXECUTE [master].[dbo].[DatabaseBackup] 
			@Databases = @DBName,
			@Directory = N'\\MyReportServer\backups', 
			@BackupType = 'FULL', 
			@CleanupTime = 1, --<< ONE HOUR
			@CleanupMode = 'BEFORE_BACKUP',
			@Compress = 'Y',
			@CheckSum = 'Y',
			@LogToTable = 'Y'
	END

-- Else do a DIFF backup

	ELSE
	BEGIN
		SELECT 'we can do a diff backup' 
		EXECUTE [master].[dbo].[DatabaseBackup] 
			@Databases = @DBName,
			@Directory = N'\\MyReportServer\backups', 
			@BackupType = 'DIFF',
			@CleanupTime = 168, --<< ONE WEEK
			@CleanupMode = 'BEFORE_BACKUP',
			@Compress = 'Y',
			@CheckSum = 'Y',
			@LogToTable = 'Y'
	END

https://richardbriansmith.wordpress.com/2015/10/02/diff-backup-restore-part-3/

https://richardbriansmith.wordpress.com/2015/10/02/diff-backup-restore-part-1/

sysutility_get_views_data_into_cache_tables

Having uninstalled UCP and reinstalled MDW, I found the MDW job “sysutility_get_views_data_into_cache_tables” failing at step 3 with an error message about invalid synonyms.

The fix was to re-create the MSDB synonyms, scripted from a clean SQL 2012 server.

Namely …

USE [msdb]
GO

CREATE SYNONYM [dbo].[syn_sysutility_ucp_databases] 
FOR [msdb].[dbo].[sysutility_ucp_databases_stub]

CREATE SYNONYM [dbo].[syn_sysutility_ucp_filegroups] 
FOR [msdb].[dbo].[sysutility_ucp_filegroups_stub]

CREATE SYNONYM [dbo].[syn_sysutility_ucp_dacs] 
FOR [msdb].[dbo].[sysutility_ucp_dacs_stub]

CREATE SYNONYM [dbo].[syn_sysutility_ucp_smo_servers] 
FOR [msdb].[dbo].[sysutility_ucp_smo_servers_stub]

CREATE SYNONYM [dbo].[syn_sysutility_ucp_volumes] 
FOR [msdb].[dbo].[sysutility_ucp_volumes_stub]

CREATE SYNONYM [dbo].[syn_sysutility_ucp_computers] 
FOR [msdb].[dbo].[sysutility_ucp_computers_stub]

CREATE SYNONYM [dbo].[syn_sysutility_ucp_logfiles] 
FOR [msdb].[dbo].[sysutility_ucp_logfiles_stub]

CREATE SYNONYM [dbo].[syn_sysutility_ucp_datafiles] 
FOR [msdb].[dbo].[sysutility_ucp_datafiles_stub]

CREATE SYNONYM [dbo].[syn_sysutility_ucp_space_utilization] 
FOR [msdb].[dbo].[sysutility_ucp_space_utilization_stub]

CREATE SYNONYM [dbo].[syn_sysutility_ucp_cpu_utilization] 
FOR [msdb].[dbo].[sysutility_ucp_cpu_utilization_stub]

SSMS on Windows Server 2012

To start SSMS on Windows Server 2012 R2 Standard I …

1) Clicked in the bottom-left corner of the screen – which brought up a bunch of blue and green boxes.

2) Clicked on the little down-arrow in a circle near the bottom-left of the screen.

2b) If it was not visible, I clicked a blank part of the screen – outside the boxes.

3) This brought up something like an ‘All Programs’ screen.

4) Found and clicked “SQL Server Management Studio”.

Slave SQL Jobs

To run a SQL-Job on the same Server use …

EXECUTE msdb.dbo.sp_start_job 'JobName'

To run a SQL-Job on another Server use …

EXECUTE [ServerName].msdb.dbo.sp_start_job 'JobName'

(assuming Linked Servers etc. are already set up)

** Update ** I had an issue where the remote job-name could not be found. The cause (seen in sys.servers) was that I had used the wrong version of SSMS to create the link. The fix was to amend a working, scripted-out link.

Differential backup / restore using SQL Safe

I wanted to improve a backup/restore SQL job that populated a Reporting Server every night. I felt it would be quicker to do a weekly Full backup with daily Diff backups.

The 3rd-party backup software was “SQL Safe Backup” from Idera.

I used this code on the Production Server …

--BackupDBA.sql

-- FULL Backup if Friday

	IF (SELECT DATEPART(dw, GETDATE())) = 6
	BEGIN
		EXEC [master].[dbo].[xp_ss_backup]
		@database = 'DBA',
		@filename = '\\ReportingServer\m$\DBA_Full.safe',
		@backuptype = 'Full',
		@overwrite = 1,
		@verify = 1;
	END;


-- DIFF Backup

	EXEC [master].[dbo].[xp_ss_backup]
	@database = 'DBA',
	@filename = '\\ReportingServer\m$\DBA_Diff.safe',
	@backuptype = 'Diff',
	@overwrite = 1,
	@verify = 1;

… and this code I scheduled a few hours later on the Reporting Server …

--RestoreDBA.sql

-- First, kill any connections

	DECLARE @kill VARCHAR(8000) = '';
	SELECT  @kill = @kill + 'kill ' + CONVERT(VARCHAR(5), spid) + ';'
	FROM    [master].[dbo].[sysprocesses]
	WHERE   dbid = DB_ID('DBA')
	AND spid > 50;
	EXEC (@kill);


-- Restore FULL

	EXEC [master].[dbo].[xp_ss_restore] 
	@database = 'DBA',
	@filename = 'M:\DBA_Full.safe', 
	@backuptype = 'Full',
	@recoverymode = 'norecovery', 
	@replace = '1';


-- Restore DIFF

	EXEC [master].[dbo].[xp_ss_restore] 
	@database = 'DBA',
	@filename = 'M:\DBA_Diff.safe', 
	@backuptype = 'Diff',
	@recoverymode = 'recovery', 
	@replace = '1';

Finally, I added a step on Reporting to repair the orphaned user-accounts …

USE [DBA]
GO

EXEC sp_change_users_login 'Auto_Fix', 'ReportingLogin';
EXEC sp_change_users_login 'Auto_Fix', 'LinkedServerLogin';
GO

Reset File size and Autogrowth settings

This is the logical conclusion of my reset_tempdb and reset_model scripts. It shows all of the file sizes and autogrowth settings in the current instance, and the code to change them.

The suggested sizes (128 MB for Logfiles and 256 MB for Datafiles) are reasonable for Model, but should probably be amended for other databases dependent on current size and autogrowth history.

--autogrowth_all.sql

-- get current settings & create commands to change them
select	db.Name, case mf.[Type] when 0 then 'DATA' else 'LOG' end [FileType],
	convert(varchar(50), size*8/1024) + ' MB' [CurrentSize], 
	case mf.is_percent_growth 
		when 1 then convert(varchar(50), growth) + ' %' 
		when 0 then convert(varchar(50), growth*8/1024) + ' MB' end [AutoGrowth],
	'ALTER DATABASE [' + db.Name + '] MODIFY FILE (NAME = N''' + mf.Name + ''', SIZE = ' + case mf.[type] when 0 then '256' else '128' end + 'MB);' [ReSizeCommand],
	'ALTER DATABASE [' + db.Name + '] MODIFY FILE (NAME = N''' + mf.Name + ''', FILEGROWTH = ' + case mf.[type] when 0 then '256' else '128' end + 'MB);' [AutogrowthCommand]
from [master].[sys].[master_files] mf
join [master].[sys].[databases] db 
on mf.database_id = db.database_id
order by mf.database_id, mf.[type];

TempDB Autogrowth

I was shocked to find TempDB files were the most frequently autogrown in our environment – and changing them by hand soon got old.

Here’s my global script to set them to best-practice values. It only changes files that exist, and it will fail if a specified size is less than the current size (i.e. it fails safe).

--tempdb_autogrowth.sql

-- report size and growth
select name, size*8/1024 [sizeMB], growth*8/1024 [growthMB]
from master.sys.master_files
where db_name(database_id) = 'tempdb'
order by physical_name


-- resize datafile(s) to 256MB & logfile to 128MB
USE [master]
GO
-- each ALTER has its own TRY/CATCH, so one missing file cannot stop the rest
begin try ALTER DATABASE [tempdb] MODIFY FILE ( NAME = N'tempdev',  SIZE = 262144KB , FILEGROWTH = 262144KB ); end try begin catch end catch
begin try ALTER DATABASE [tempdb] MODIFY FILE ( NAME = N'tempdev2', SIZE = 262144KB , FILEGROWTH = 262144KB ); end try begin catch end catch
begin try ALTER DATABASE [tempdb] MODIFY FILE ( NAME = N'tempdev3', SIZE = 262144KB , FILEGROWTH = 262144KB ); end try begin catch end catch
begin try ALTER DATABASE [tempdb] MODIFY FILE ( NAME = N'tempdev4', SIZE = 262144KB , FILEGROWTH = 262144KB ); end try begin catch end catch
begin try ALTER DATABASE [tempdb] MODIFY FILE ( NAME = N'templog',  SIZE = 131072KB , FILEGROWTH = 131072KB ); end try begin catch end catch

Set Default Autogrowth

Here’s another one of my global changes. This one sets up ‘Model’, which is the template for all databases created in the future.

-- AutogrowthDefault.sql

-- first examine the current settings
select Name, size*8/1024 [SizeMB], case is_percent_growth 
	when 1 then convert(varchar(50), growth) + ' %' 
	when 0 then convert(varchar(50), growth*8/1024) + ' MB' end AutoGrowth
from master.sys.master_files
where db_name(database_id) = 'Model'
order by [type];

----set Initial-Size to best-practice
ALTER DATABASE [model] MODIFY FILE (NAME = N'modeldev', SIZE = 256MB);
ALTER DATABASE [model] MODIFY FILE (NAME = N'modellog', SIZE = 128MB);

---- set Autogrowth to best-practice
ALTER DATABASE [model] MODIFY FILE (NAME = N'modeldev', FILEGROWTH = 256MB);
ALTER DATABASE [model] MODIFY FILE (NAME = N'modellog', FILEGROWTH = 128MB);

Scripting out SQL Jobs

When I scripted out the “DROP and CREATE” options (on SQL 2008 R2) for a SQL job I found it only worked once, because sp_delete_job was scripted with the job’s GUID – and a re-created job gets a new GUID, so the scripted delete fails the next time it runs.

To fix this I replaced [@job_id=N’blah blah blah’] in [sp_delete_job] with [@job_name=N’blah blah’], using the actual job name pasted from [sp_add_job].

I also modified the script to work additionally with SQL 2000 and SQL 2005 by …

– commenting out [, @delete_unused_schedule=1] in [sp_delete_job]

– duplicating [sp_add_job], with the first copy preceded by [IF @@VERSION LIKE ‘%2000%’ BEGIN] and stripped of [@notify_email_operator_name]

– and the second copy preceded by [END ELSE]

Which SQL Process is running which SQL Instance?

To easily see which SQL process relates to which instance of SQL Server: in Windows Task Manager / Processes
– right-click one of the ‘sqlservr.exe’ entries
– and select ‘Open file location’.

The path of the file that opens shows which instance the executable belongs to.
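Alternatively (from SQL 2008 R2 SP1 onwards), each instance can report its own process-id and executable path from T-SQL, which you can match against Task Manager:

SELECT servicename, process_id, filename
FROM sys.dm_server_services;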

Migrate SQL 2000 to SQL 2012

Luckily there was a shiny new (virtual) server waiting to be the backend for this customer’s myriad in-house applications.

The live production server was SQL 2000 Standard edition, so I installed SQL Server 2008 R2 on my laptop and SQL 2012 on the new box – both Standard editions.

My plan …

– Tidy-up by removing dev and unused databases.
– Run SQL2008r2 upgrade-advisor against the SQL2k box & correct issues
– Backup all SQL2k user databases & drag to SQL2008 box
– Restore them (triggering an auto-upgrade)
– Change compatibility levels from 80 to 100

– Run SQL2012 upgrade-advisor against SQL2008r2 box & correct issues
– Backup all SQL2008 user databases & drag to SQL2012 box
– Restore them (triggering an auto-upgrade)
– Change compatibility level from 100 to 110
– DBCC CHECKDB WITH DATA_PURITY;
– EXEC sp_MSForEachDB ‘DBCC UPDATEUSAGE(?);’;

– Capture and apply SQL2k logins to SQL2012
– Rewrite 2x DTS packages as SSIS
– Apply SQL 2012 (SP2)
– Backup all SQL2012 databases

I was able to get one application (and its Reports) re-pointed to the new backend immediately; however, the others took ages.

Some of the ‘fun’ issues moving individual applications: some had links to a remote Oracle database, necessitating a 64-bit Oracle-client installation; many had old reporting issues.

To keep the SQL2012 data up to date during the ‘fun’, I ran the SSIS wizard from SQL2012 and pulled data directly from SQL2k.

To Empty a Database

To stop the import of ‘already there’ data (twice), I needed to empty every table. It’s a very, very good idea to back up your databases before using this code. It empties a whole database in one go!

-- empty database
use SomeDatabaseName -- ** change to subject database first & *** CHECK SERVER NAME **
go
EXEC sp_MSForEachTable 'TRUNCATE TABLE [?]'
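NB: TRUNCATE fails on any table referenced by a foreign key (see the ‘Cannot truncate table’ post below). A common work-around – sketched here, and slower because DELETE is fully logged – is to disable the constraints, delete, then re-enable with a trust-restoring check:

EXEC sp_MSForEachTable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL'
EXEC sp_MSForEachTable 'DELETE FROM ?'
EXEC sp_MSForEachTable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL'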

Untrusted Keys

Here’s my mod to Aaron Lowe’s @Vendoran script to get rid of untrusted foreign keys and constraints …

--fix_untrusted_keys.sql

SELECT 'ALTER TABLE [' + s.name + '].[' + o.name + '] WITH CHECK CHECK CONSTRAINT [' + i.name + ']' AS keyname
from sys.foreign_keys i
INNER JOIN sys.objects o ON i.parent_object_id = o.object_id
INNER JOIN sys.schemas s ON o.schema_id = s.schema_id
WHERE i.is_not_trusted = 1 AND i.is_not_for_replication = 0

Notification Test

Notification emails telling a DBA that something failed are all well and good, but if you don’t get any emails, how can you be sure the notification system is working?

The simplest solutions are often the best. I’ve taken to scheduling an 8am job (on every SQL Server that runs jobs) called ‘Notification Heartbeat’ that contains just one empty step. Its notification configuration is set to email the ‘DBA’ operator every time it succeeds.

Of course, I then have an Outlook rule that moves them all (everything with ‘Heartbeat’ in the subject) to the ‘DBA’ folder, so I don’t have to read them every day.

Now when my morning inbox is suspiciously empty – I have a way to verify that the Notification system was tested end-to-end at 8am.

NOTE: I used to try combining the notification-test with actual working Jobs like Backups, but this is simpler and the message value is clearer.

Remove oci.dll

When trying to uninstall Oracle Client (from a Windows server) the standalone deinstall tool couldn’t remove the file “oci.dll”.

To remove it (without rebooting) I tried to manually delete the file. The error message indicated which Windows service had it locked (first it was “SQL Server”, then – once I’d stopped that – “VMware Tools”, then DCOM, then COM+).

After stopping all the locking services, in order, I was able to delete the file.
Lastly I restarted all the services I’d stopped 😉

Removing Oracle Client

Whilst setting up a Linked-Server connection from SQL-2012 to Oracle the installation of “Instant client for Windows x64” failed.

To clean-up the mess left I downloaded the ‘Oracle Deinstall Tool’ from http://www.oracle.com/technetwork/database/enterprise-edition/downloads/index.html by clicking ‘see all’ next to Windows x64 within the Oracle 11g2 section.

To use the tool, I opened a command-prompt on my windows-server and changed directory to the folder containing ‘deinstall.bat’ from the above download. The command that worked for me was “deinstall -home D:\Oracle\Client_64”.

BTW: the path-part of the deinstall command (above) was the ‘Oracle_Home’ location. There are a bunch of ways to find that – I looked in the Oracle Universal Installer.

Finally, I tidied up my local environment Path, by cutting/pasting from/into the ‘path’ command – to remove all traces of Oracle.

Cannot truncate table xxx because it is being referenced by a FOREIGN KEY constraint.

When you want to empty a table, and TRUNCATE returns this not-very-helpful error message …

TRUNCATE TABLE dbo.SomeTable

Msg 4712, Level 16, State 1, Line 1
Cannot truncate table ‘dbo.SomeTable’ because it is being referenced by a FOREIGN KEY constraint.

Try using DELETE instead …

begin tran
DELETE dbo.SomeTable
rollback tran

Msg 547, Level 16, State 0, Line 2
The DELETE statement conflicted with the REFERENCE constraint “FK__SomeTable__7BFD7C01”. The conflict occurred in database “SomeDatabase”, table “dbo.SomeTable”, column ‘SomeColumn’.

… for a more helpful message.

Dropping & (re)Creating Foreign-Keys

I noticed that when I scripted out the dropping and re-creating of a foreign-key constraint it would only work once – because the constraint-name portion was missing from the ADD statement, so SQL Server generated a new random name that the next DROP would not match …

ALTER TABLE [dbo].[SomeTable] DROP CONSTRAINT [FK__SomeTable_SYS__FRSIn__1BFD2C07]
GO
ALTER TABLE [dbo].[SomeTable] WITH CHECK ADD FOREIGN KEY([SomeColumnName])
REFERENCES [dbo].[SomeOtherTable] ([SomeColumnName])
GO

The solution was to cut and paste the name portion from the DROP statement into the ADD statement …

ALTER TABLE [dbo].[SomeTable] WITH CHECK ADD CONSTRAINT [FK__SomeTable_SYS__FRSIn__1BFD2C07] FOREIGN KEY([SomeColumnName])
REFERENCES [dbo].[SomeOtherTable] ([SomeColumnName])
GO

Refreshing data (mid-migration)

After I migrated some databases from SQL 2000 to SQL 2012 (via SQL 2008 R2) I was unable to get the front-end applications switched to point at the new backend straight away.

This resulted in the unused SQL 2012 databases gradually becoming out-of-date.

To update them without the pain of repeating the migration I found that I could use the SQL 2012 Import Wizard.

There were three tricky bits to this …

1) to use [Microsoft OLE DB Provider for SQL Server] to connect to the SQL 2000 server,
2) to select all Tables but not Views, and
3) to edit the mappings for each table, choosing the [Delete rows in destination table] and [Enable identity insert] options.

After this it was simply a matter of saving the output SSIS packages and scheduling them as jobs.

** Update ** … or so I thought. This failed for one database that contained Foreign-keys. So I had to create a ‘pre update’ step to remove the constraints, then a ‘post update’ step to put them back (see next).

Database Compare

Here’s a neat script I wrote to list every database with the total number of rows in it.

The idea was to do a quick-n-dirty post-migration checksum between the old sql2000 and new sql2012 databases – to see which ones needed updating.

--database_compare.sql

EXECUTE sp_MSforeachdb 'use ?; DBCC UPDATEUSAGE(0) WITH COUNT_ROWS;'

create table #temp (DBName varchar(100), TotalRows bigint)
exec sp_msforeachdb 'insert into #temp select ''?'', sum(rows) 
from [?].[dbo].sysindexes i join [?].[dbo].sysobjects o on o.id=i.id
where indid < 2 and type=''U'''

select * 
from #temp
where DBName not in ('master','model','msdb','tempdb')
order by 1 --(or 2)

drop table #temp
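On SQL 2005 and above the same row-counts can be read from sys.partitions, without needing DBCC UPDATEUSAGE first – a sketch:

create table #temp2 (DBName varchar(100), TotalRows bigint)
exec sp_msforeachdb 'insert into #temp2
select ''?'', sum(p.rows)
from [?].sys.partitions p
join [?].sys.tables t on p.[object_id] = t.[object_id]
where p.index_id in (0,1)'

select * from #temp2
where DBName not in ('master','model','msdb','tempdb')
order by 1

drop table #temp2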