sp_whoisactive

I tend to paste this (below) into SSMS keyboard shortcuts under Ctrl+F1 so I can instantly see what’s going on. 

EXEC [master].[dbo].[sp_WhoIsActive] @get_outer_command=1, @get_plans=1, @get_additional_info = 1, @get_task_info = 2, @output_column_list = '[dd%][session_id][block%][sql_text][login_name][CPU%][wait_info][tasks][tran_log%][database%][Program%][percent%][host%][reads%][writes%][sql_command][query_plan][locks][%]'

When it’s too slow 🙂 I use a “lite” version under Ctrl+3

EXEC [master].[dbo].[sp_WhoIsActive] @get_additional_info = 1, @output_column_list = '[dd%][session_id][block%][sql_text][sql_command][login_name][CPU%][wait_info][tasks][tran_log%][database%][Program%][percent%][host%][reads%][writes%][query_plan][locks][%]'

Who deleted that data?

Sadly I could not find out.

Going forward – to capture deletes on a table I set up a regular ‘after delete’ trigger, with some extra columns to hold system functions.

This allowed me to capture the date/time, PC name, and login that originated deletes. Here is my working lab …

--lab_trigger_deletes.sql

--create table to be monitored and add some data
	CREATE TABLE t1 (c1 INT, c2 int)
	INSERT INTO t1 VALUES (1,7), (2,8), (3,9)

-- create audit table
	CREATE TABLE t1_audit (c1 INT, c2 INT, c3 DATETIME, c4 SYSNAME, c5 SYSNAME, c6 SYSNAME)

-- check contents of both tables
	SELECT * FROM t1
	SELECT * FROM t1_audit

-- create trigger
	CREATE TRIGGER trg_ItemDelete 
	ON dbo.t1 
	AFTER DELETE 
	AS
	INSERT INTO dbo.t1_audit(c1, c2, c3, c4, c5, c6)
			SELECT d.c1, d.c2, GETDATE(), HOST_NAME(), SUSER_SNAME(), ORIGINAL_LOGIN()
			FROM Deleted d

-- delete a row (firing the trigger)
	DELETE FROM t1 WHERE c1 = 2

-- check contents of both tables again
	SELECT * FROM t1
	SELECT * FROM t1_audit

-- tidy up
	IF OBJECT_ID ('trg_ItemDelete', 'TR') IS NOT NULL DROP TRIGGER trg_ItemDelete;
	DROP TABLE t1
	DROP TABLE t1_audit

Two SSDTs

Microsoft seems to have two products called SSDT (SQL Server Data Tools)

1. A development environment for SSIS etc.

2. A source control tool that uses TFS.

To install 1, you choose SSDT as an option whilst installing SQL Server 2008 R2 or 2012.

To install 2, it is a separate download for Visual Studio 2010 and 2012, or a post-installation option for Visual Studio 2013.

(SSDT is really just a badly integrated set of tools. I find treating it as two distinct products keeps me sane whilst googling)

Log Space

I noticed a jump in logfile size the other day and was wondering how to predict an autogrowth event.

I know old data is truncated after a log backup, but that’s internal and not normally visible.

I came up with this to run across production …

--LogSpace.sql
-- To help find near-full logfiles that may autogrow soon.

-- create table to hold raw data
CREATE TABLE #temp (DBName VARCHAR(100), SizeMB FLOAT, UsedPct FLOAT, [STATUS] BIT)

-- populate table
INSERT #temp EXEC('DBCC SQLPERF(logspace)')

-- output
SELECT DBName, SizeMB, UsedPct FROM #temp --WHERE UsedPct > 90 -- 90% full

-- clean-up
DROP TABLE #temp

My SQL 2008 Install optimizations

1) create service accounts & add to local admin
2) optimize power settings
3) optimize TempDB datafiles (number, size, & T1118)
4) set max memory
5) set compressed backups
6) raise cost threshold for parallelism (e.g. towards 80)
7) enable optimize-for-adhoc-workloads
8) resize & set autogrowth for system DBs
9) transfer logins (sp_help_revlogin)
10) install, customize, & schedule the Ola Hallengren solution
11) set up an operator & Database Mail
12) configure notifications
13) set up linked servers & permissions
14) install sp_WhoIsActive
15) check recovery models are FULL for log backups
16) initial manual backup/restore
17) update usage and purity
18) configure remote DTC
19) set xp_cmdshell to match the old server
20) add me to local admin
21) make a catch-all backup job
22) transfer jobs
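Several of the items above (4, 6, and 7) come down to sp_configure settings. A minimal sketch – the numbers here are illustrative examples only, not prescriptions; you would tune them per server:

```sql
-- Sketch of the sp_configure items from the list above (example values only)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

EXEC sp_configure 'max server memory (MB)', 28672;       -- 4) leave headroom for the OS
EXEC sp_configure 'backup compression default', 1;       -- 5) compressed backups by default
EXEC sp_configure 'cost threshold for parallelism', 80;  -- 6)
EXEC sp_configure 'optimize for ad hoc workloads', 1;    -- 7)
RECONFIGURE;
```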

Orphaned Users

Here’s a quick script to fix orphaned users after a migration (note: sp_change_users_login is deprecated in later versions, in favour of ALTER USER … WITH LOGIN) …

--OrphanedUsers.sql

-- create temp table
	CREATE TABLE #orphans (oname VARCHAR(100), oSID VARCHAR(100) PRIMARY KEY)
	DECLARE @cmd VARCHAR(MAX), @name VARCHAR(100)

-- populate temp table with orphaned logins
	INSERT #orphans(oname,osid)
	EXEC sp_change_users_login @Action='Report';

-- loop to fix / or else create login with default pw
	WHILE (SELECT COUNT(*) FROM #orphans) > 0
	BEGIN
		SELECT TOP 1 @name = oname FROM #orphans 
		SET @cmd = 'EXEC sp_change_users_login ''Auto_Fix'', ''' + @name + ''', NULL, ''B3r12-3x$098f6'';'
		DELETE FROM #orphans WHERE oname = @name
		EXEC (@cmd)
	END

-- tidy up
	DROP TABLE #orphans 

Making index changes to Production

These days I use a SQL Job called ‘DBA – index maint’.

Whenever I have an index change to make I paste the script into a new step, name that step with today’s date, change the ‘start step’ to that step, and schedule it to run once overnight.

This gives me a history and outcome, alongside the exact action.
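The mechanics can be scripted with the msdb job procedures. A sketch, where the job name, database, index, and step number are all illustrative:

```sql
-- Add today's index change as a new step in the 'DBA - index maint' job,
-- then point the job's start step at it (all names/IDs here are illustrative)
DECLARE @step SYSNAME
SET @step = CONVERT(VARCHAR(10), GETDATE(), 120)  -- step named with today's date

EXEC msdb.dbo.sp_add_jobstep
    @job_name      = N'DBA - index maint',
    @step_name     = @step,
    @subsystem     = N'TSQL',
    @database_name = N'SomeDatabase',
    @command       = N'CREATE INDEX IX_SomeTable_SomeColumn ON dbo.SomeTable (SomeColumn);';

-- make the new step the one the job starts at (assuming it was appended as step 5)
EXEC msdb.dbo.sp_update_job
    @job_name      = N'DBA - index maint',
    @start_step_id = 5;
```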

SQL Safe error “Cannot connect to SQL Server instance”

This was fixed by re-installing SQL Safe. Bonus – here is a working restore command with move …

EXEC [master].[dbo].[xp_ss_restore] 
	@database = 'SomeDatabase',
	@filename = 'J:\backups\SomeDatabase.BAK', 
	@backuptype = 'Full',
	@withmove = 'SomeDatabase_data "J:\sql_data\SomeDatabase_data.mdf"',
	@withmove = 'SomeDatabase_log "J:\sql_log\SomeDatabase_log.ldf"',
	@recoverymode = 'recovery',
	@replace = '1';

Diff Backup / Restore (part 3)

On my Reporting Server I created the SQL Job ‘DBA – Restore MyDatabase’ with two steps 1) Restore MyDatabase, and 2) Add Users.

Step 2 just de-orphaned some user accounts EG:-

EXEC sp_change_users_login 'Auto_Fix', 'uidServerLink';

Step 1 contained this code …

-- Restore

	EXECUTE [master].[dbo].[DatabaseRestore] 
		@dbName = 'MyDatabase',
		@SourceServer = 'MySourceServer',
		@backupPath = 'M:\Backups'


-- Change recovery model

	ALTER DATABASE MyDatabase set recovery SIMPLE

https://richardbriansmith.wordpress.com/2015/10/02/diff-backup-restore-part-1/

https://richardbriansmith.wordpress.com/2015/10/02/diff-backup-restore-part-2/

Diff Backup / Restore (part 2)

On the Production Server I created the Job ‘DBA – Backup/Restore MyDatabase’ with two steps 1) Backup MyDatabase, and 2) Restore MyDatabase.

Step 2 just started the Restore job on the Reporting server (detailed in “Part 3”)

Step 1 needed to check that no random backups had happened in the last 24 hours before starting a Diff Backup …

-- If it's Friday or the LSNs do not match - do a FULL backup

	DECLARE @DBName VARCHAR(100) = 'MyDatabase' --<< Database Name
	IF (SELECT DATEPART(dw, GETDATE())) = 6	--<< = Friday (with the default DATEFIRST of 7)
	OR (SELECT MAX(differential_base_lsn)
	    FROM [MyProdServer].[master].[sys].[master_files]
	    WHERE [name] LIKE '%' + @DBName + '%')
	    !=
	   (SELECT MAX(differential_base_lsn)
	    FROM [MyReportServer].[master].[sys].[master_files]
	    WHERE [name] LIKE '%' + @DBName + '%')

	BEGIN
		SELECT 'We can only do a FULL backup'
		EXECUTE [master].[dbo].[DatabaseBackup] 
			@Databases = @DBName,
			@Directory = N'\\MyReportServer\backups', 
			@BackupType = 'FULL', 
			@CleanupTime = 1, --<< ONE HOUR
			@CleanupMode = 'BEFORE_BACKUP',
			@Compress = 'Y',
			@CheckSum = 'Y',
			@LogToTable = 'Y'
	END

-- Else do a DIFF backup

	ELSE
	BEGIN
		SELECT 'we can do a diff backup' 
		EXECUTE [master].[dbo].[DatabaseBackup] 
			@Databases = @DBName,
			@Directory = N'\\MyReportServer\backups', 
			@BackupType = 'DIFF',
			@CleanupTime = 168, --<< ONE WEEK
			@CleanupMode = 'BEFORE_BACKUP',
			@Compress = 'Y',
			@CheckSum = 'Y',
			@LogToTable = 'Y'
	END

https://richardbriansmith.wordpress.com/2015/10/02/diff-backup-restore-part-3/

https://richardbriansmith.wordpress.com/2015/10/02/diff-backup-restore-part-1/

Diff Backup / Restore (part 1)

Although Ola’s Backup solution works great …
https://ola.hallengren.com/sql-server-backup.html

For this project I needed a corresponding Restore procedure, so I could setup nightly Diff Backup / Restores (from Prod to Reporting). Without having to write too much code 🙂

I modified the code from here for our environment (see below).
http://jason-carter.net/professional/restore-script-from-backup-directory-modified.html

In my next posts I will detail the SQL Jobs for this project.

USE [master]
GO

CREATE PROCEDURE [dbo].[DatabaseRestore]
		@dbName sysname,
		@SourceServer NVARCHAR(500),
		@backupPath NVARCHAR(500)

AS

/* To restore backups created from ola.hallengren's backup solution (RS) */

	SET NOCOUNT ON
	DECLARE @cmd NVARCHAR(500),
			@lastFullBackup NVARCHAR(500),
			@lastDiffBackup NVARCHAR(500),
			@backupFile NVARCHAR(500)

	DECLARE @fileList TABLE (backupFile NVARCHAR(255))
	DECLARE @directoryList TABLE (backupFile NVARCHAR(255))

/* Kill any connections */

	DECLARE @kill VARCHAR(8000) = '';
	SELECT  @kill = @kill + 'kill ' + CONVERT(VARCHAR(5), spid) + ';'
		FROM [master].[dbo].[sysprocesses]
		WHERE dbid = DB_ID(@dbName)
		AND spid > 50;
	EXEC (@kill);

/* Match Ola's output folder structure */

	SET @backupPath = @backupPath + '\' + @SourceServer + '\' + @dbName + '\'

/* Get List of Files */

	SET @cmd = 'DIR /s /b /O D ' + @backupPath
	IF (SELECT value_in_use
		FROM sys.configurations
		WHERE name = 'xp_cmdshell') = 0
	BEGIN /* cmd shell is disabled */
		EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
		EXEC sp_configure 'xp_cmdshell', 1; RECONFIGURE;
		INSERT INTO @fileList(backupFile) EXEC master.sys.xp_cmdshell @cmd
		EXEC sp_configure 'xp_cmdshell', 0; RECONFIGURE;
		EXEC sp_configure 'show advanced options', 0; RECONFIGURE;
	END
	ELSE /* cmd shell is enabled */
		INSERT INTO @fileList(backupFile) EXEC master.sys.xp_cmdshell @cmd

/* Find latest full backup */

	SELECT @lastFullBackup = MAX(backupFile) 
	FROM @fileList 
	WHERE backupFile LIKE '%' + @SourceServer + '_' + @dbName + '_FULL_%.bak'

	SET @cmd = 'RESTORE DATABASE [' + @dbName + '] FROM DISK = ''' 
	   + @lastFullBackup + ''' WITH NORECOVERY, REPLACE'
	SELECT (@cmd); EXEC (@cmd)

/* Find latest diff backup */

	SELECT @lastDiffBackup = MAX(backupFile)
	FROM @fileList 
	WHERE backupFile  LIKE '%' + @SourceServer + '_' + @dbName + '_DIFF_%.bak'
	AND RIGHT(backupfile, 19) > RIGHT(@lastFullBackup, 19)

/* check to make sure there is a diff backup */

	IF @lastDiffBackup IS NOT NULL
		BEGIN
		SET @cmd = 'RESTORE DATABASE [' + @dbName + '] FROM DISK = ''' 
			   + @lastDiffBackup + ''' WITH NORECOVERY'
		SELECT (@cmd); EXEC (@cmd)
		SET @lastFullBackup = @lastDiffBackup
		END

--/* check for log backups */

--	DECLARE backupFiles CURSOR FOR 
--	SELECT backupFile 
--	FROM @fileList
--	WHERE backupFile LIKE  '%' + @SourceServer + '_' + @dbName + '_LOG_%.trn'
--	AND RIGHT(backupfile, 19) > RIGHT(@lastFullBackup, 19)

--	OPEN backupFiles 

--/* Loop through all the files for the database */

--	FETCH NEXT FROM backupFiles INTO @backupFile 

--	WHILE @@FETCH_STATUS = 0 
--		BEGIN 
--		   SET @cmd = 'RESTORE LOG [' + @dbName + '] FROM DISK = ''' 
--			   + @backupFile + ''' WITH NORECOVERY'
--		   SELECT (@cmd); EXEC (@cmd)
--		   FETCH NEXT FROM backupFiles INTO @backupFile 
--		END

--	CLOSE backupFiles 
--	DEALLOCATE backupFiles 

/* put database in a useable state */

	SET @cmd = 'RESTORE DATABASE [' + @dbName + '] WITH RECOVERY'
	SELECT (@cmd); EXEC (@cmd)

GO

https://richardbriansmith.wordpress.com/2015/10/02/diff-backup-restore-part-2/

https://richardbriansmith.wordpress.com/2015/10/02/diff-backup-restore-part-3/

sysutility_get_views_data_into_cache_tables

Having uninstalled UCP and reinstalled MDW I found the MDW job “sysutility_get_views_data_into_cache_tables” failing at step-3 with an error message about invalid synonyms.

The fix was to re-create the MSDB synonyms, scripted from a clean SQL 2012 server.

Namely …

CREATE SYNONYM [dbo].[syn_sysutility_ucp_databases] 
FOR [msdb].[dbo].[sysutility_ucp_databases_stub]

CREATE SYNONYM [dbo].[syn_sysutility_ucp_filegroups] 
FOR [msdb].[dbo].[sysutility_ucp_filegroups_stub]

CREATE SYNONYM [dbo].[syn_sysutility_ucp_dacs] 
FOR [msdb].[dbo].[sysutility_ucp_dacs_stub]

CREATE SYNONYM [dbo].[syn_sysutility_ucp_smo_servers] 
FOR [msdb].[dbo].[sysutility_ucp_smo_servers_stub]

CREATE SYNONYM [dbo].[syn_sysutility_ucp_volumes] 
FOR [msdb].[dbo].[sysutility_ucp_volumes_stub]

CREATE SYNONYM [dbo].[syn_sysutility_ucp_computers] 
FOR [msdb].[dbo].[sysutility_ucp_computers_stub]

CREATE SYNONYM [dbo].[syn_sysutility_ucp_logfiles] 
FOR [msdb].[dbo].[sysutility_ucp_logfiles_stub]

CREATE SYNONYM [dbo].[syn_sysutility_ucp_datafiles] 
FOR [msdb].[dbo].[sysutility_ucp_datafiles_stub]

CREATE SYNONYM [dbo].[syn_sysutility_ucp_space_utilization] 
FOR [msdb].[dbo].[sysutility_ucp_space_utilization_stub]

CREATE SYNONYM [dbo].[syn_sysutility_ucp_cpu_utilization] 
FOR [msdb].[dbo].[sysutility_ucp_cpu_utilization_stub]

SSMS on Windows Server 2012

To start SSMS on Windows Server 2012 R2 Standard I …

1) Clicked in the bottom-left corner of the screen – which brought up a bunch of blue and green boxes.

2) Clicked on the little down-arrow in a circle near the bottom-left of the screen.

2b) If not visible I clicked a blank part of the screen – outside the boxes.

3) This is like an ‘All Programs’ screen.

4) Find and click “SQL Server Management Studio”.

DBA Rule #6

You can serve only one God.

Whatever you fall asleep thinking about, your subconscious will continue to work on overnight.

If your last thoughts are about impressing your boss, your technical work the next day will be pedestrian.

The two groups I see perpetually falling foul of this rule are Manager/DBAs and Developer/DBAs.

DBA Rule #2

Avoid Scripting.

A well crafted SQL statement is a thing of beauty, and writing one gives a great deal of satisfaction.

But creating art is not the job.

Script is fragile (to know how fragile, count the number of potential typos). Which is fine for those whose output is intended to be constantly improved upon. But I feel DBAs should favour robustness over the satisfaction of creativity, and implement for the long term.

If there is simply no other choice but using a script, avoid the near endless cycle of creating // debugging // augmenting your own, and download industrial-strength scripts from a trusted source (like Ola Hallengren). But read and understand them before using in production. 🙂

Slave SQL Jobs

To run a SQL-Job on the same Server use …

EXECUTE msdb.dbo.sp_start_job 'JobName'

To run a SQL-Job on another Server use …

EXECUTE [ServerName].msdb.dbo.sp_start_job 'JobName'

(assuming LinkedServers etc are already set-up)

** Update ** I had an issue where the remote job-name could not be found. The cause (I saw in sys.servers) was that I had used the wrong version of SSMS to create the link. The fix was to amend a working scripted-out link.
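The check I used boils down to querying sys.servers for the link’s metadata. A sketch of that sanity check:

```sql
-- Inspect linked-server definitions; an odd product/provider combination
-- can reveal a link scripted by the wrong version of SSMS
SELECT name, product, provider, data_source, modify_date
FROM sys.servers
WHERE is_linked = 1;
```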

Differential backup / restore using SQL Safe

I wanted to improve a backup / restore sql-job that populated a Reporting Server every night. I felt it would be quicker if it did a weekly Full backup and daily Diff backups.

The 3rd-party backup software was “SQL Safe Backup” from Idera.

I used this code on the Production Server …

--BackupDBA.sql

-- FULL Backup if Friday

	IF (SELECT DATEPART(dw, GETDATE())) = 6
	BEGIN
		EXEC [master].[dbo].[xp_ss_backup]
		@database = 'DBA',
		@filename = '\\ReportingServer\m$\DBA_Full.safe',
		@backuptype = 'Full',
		@overwrite = 1,
		@verify = 1;
	END;


-- DIFF Backup

	EXEC [master].[dbo].[xp_ss_backup]
	@database = 'DBA',
	@filename = '\\ReportingServer\m$\DBA_Diff.safe',
	@backuptype = 'Diff',
	@overwrite = 1,
	@verify = 1;

… and this code I scheduled a few hours later on the Reporting Server …

--RestoreDBA.sql

-- First, kill any connections

	DECLARE @kill VARCHAR(8000) = '';
	SELECT  @kill = @kill + 'kill ' + CONVERT(VARCHAR(5), spid) + ';'
	FROM    [master].[dbo].[sysprocesses]
	WHERE   dbid = DB_ID('DBA')
	AND spid > 50;
	EXEC (@kill);


-- Restore FULL

	EXEC [master].[dbo].[xp_ss_restore] 
	@database = 'DBA',
	@filename = 'M:\DBA_Full.safe', 
	@backuptype = 'Full',
	@recoverymode = 'norecovery', 
	@replace = '1';


-- Restore DIFF

	EXEC [master].[dbo].[xp_ss_restore] 
	@database = 'DBA',
	@filename = 'M:\DBA_Diff.safe', 
	@backuptype = 'Diff',
	@recoverymode = 'recovery', 
	@replace = '1';

Finally, I added a step on Reporting to repair the orphaned user-accounts …

USE [DBA]
GO

EXEC sp_change_users_login 'Auto_Fix', 'ReportingLogin';
EXEC sp_change_users_login 'Auto_Fix', 'LinkedServerLogin';
GO

Reset File size and Autogrowth settings

This is the logical conclusion of my reset_tempdb and reset_model scripts. It shows all of the file sizes and autogrowth settings in the current instance, and the code to change them.

The suggested sizes (128 MB for Logfiles and 256 MB for Datafiles) are reasonable for Model, but should probably be amended for other databases dependent on current size and autogrowth history.

--autogrowth_all.sql

-- get current settings & create commands to change them
select	db.Name, case mf.[Type] when 0 then 'DATA' else 'LOG' end [FileType],
	convert(varchar(50), size*8/1024) + ' MB' [CurrentSize], 
	case mf.is_percent_growth 
		when 1 then convert(varchar(50), growth) + ' %' 
		when 0 then convert(varchar(50), growth*8/1024) + ' MB' end [AutoGrowth],
	'ALTER DATABASE [' + db.Name + '] MODIFY FILE (NAME = N''' + mf.Name + ''', SIZE = ' + case mf.[type] when 0 then '256' else '128' end + 'MB);' [ReSizeCommand],
	'ALTER DATABASE [' + db.Name + '] MODIFY FILE (NAME = N''' + mf.Name + ''', FILEGROWTH = ' + case mf.[type] when 0 then '256' else '128' end + 'MB);' [AutogrowthCommand]
from [master].[sys].[master_files] mf
join [master].[sys].[databases] db 
on mf.database_id = db.database_id
order by mf.database_id, mf.[type];

PostgreSQL command-line

Having logged in locally on a linux box, I used these steps to access the database via a terminal session …

 $ sudo su - postgres
 [sudo] password for richard: *****
 $ psql
 Password: ******
 postgres=#
 postgres=# select version();
 PostgreSQL 9.4.4 on x86_64 (Red Hat 4.1.2-55), 64-bit

Line 1) As root, I switch to linux user “postgres” (including environmental variables)
Line 2) I typed in my password
Line 3) And ran the executable (psql)
Line 4) I typed in the password of the postgres user
Line 5) Success! and to prove it …
Line 6) My first SELECT statement, lol

BTW: to leave I typed “\q” to quit the PostgreSQL environment, “exit” to leave the postgres account, then “exit” again to close the terminal session.

Importing MID/MIF files into SQL Server 2012

What a true pain that was! “Shonky” (thanks Rory). There is just no proper documentation for spatial data on MS SQL Server yet (Oracle is so much simpler for GIS).

A migration for Fire & Rescue included some rather enhanced sat-nav data.

Firstly – to get OGR2OGR installed I downloaded the OSGeo4W setup utility, which failed with “Unable to get setup.ini from http://download.osgeo.org/osgeo4w/”.

Eventually I found the trick was to choose “Advanced Install” from the first screen, then “Use IE5 Settings”.

Then – after the local installation had completed – to get it installed on a server without internet connections I copied the entire 2GB folder c:\OSGeo4W\ across the network.

I needed to create a system variable called “GDAL_DATA”, set to the folder containing ‘coordinate_axis.csv’, namely “D:\OSGeo4W\share\gdal\”.

Lastly, I created a shortcut on my desktop to cmd.exe and customized it so it starts in the ogr2ogr.exe folder.

To import using ogr2ogr.exe … I played with a number of commands in a BAT file until this one worked (all on one line) …

D:\OSGeo4W\bin\ogr2ogr 
-f "MSSQLSpatial" 
"MSSQL:Server=localhost;Database=GIS;trusted_connection=yes" 
"D:\Media\Spatial Data\BFRS_10_MILE.mid" 
-t_srs "EPSG:27700" 
-lco "GEOM_TYPE=geometry" 
-lco "GEOM_NAME=geog27700"

To ‘decode’ that as much as I can …

– Line-1 is the path to ogr2ogr.exe
– Line-2 means import to SQL Server
– Line-3 is the SQL Server connection string (SQL Server 2012 btw)
– Line-4 is the path to the MID file (the MIF file will be imported too)
– Line-5 reprojects it to the British National Grid (EPSG:27700)
– Lines 6 and 7 store it as ‘Geometry’ in a column named geog27700
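To sanity-check an import like this I would run something along these lines in the GIS database – the table name ‘BFRS_10_MILE’ and column name ‘geog27700’ are assumptions, taken from the file name and the GEOM_NAME option above:

```sql
-- Quick check of the imported layer (table and column names assumed
-- from the ogr2ogr command above)
SELECT TOP 5
       geog27700.STGeometryType() AS shape_type,
       geog27700.STSrid           AS srid,     -- should be 27700
       geog27700.STAsText()       AS wkt
FROM dbo.BFRS_10_MILE;
```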

Emergency Friday afternoon Backups

I had a situation where a VMWare upgrade stopped the backup software from working across the board (mmm, a little notice would have been nice).

To avoid the possibility of a full log crashing an application over the weekend I created and deployed two Jobs: full backups every night at 10pm, and transaction-log backups every 3 hours.

The script (and scripted-out job) had to be compatible with all versions of SQL Server. The only prep I had to do was creating a folder for each server (_instance) at the backup destination.

The code I downloaded from http://www.mssqltips.com/sqlservertip/1070/simple-script-to-backup-all-sql-server-databases/ evolved into this …

--backup_all.sql

	DECLARE @name VARCHAR(50) -- database name  
	DECLARE @path VARCHAR(256) -- path for backup files  
	DECLARE @fileName VARCHAR(256) -- filename for backup 
	DECLARE @fileDate VARCHAR(20) -- added to file name
	DECLARE @sname varchar(100) -- server name
	DECLARE @DeleteDate DATETIME -- purge date ..
	SET @DeleteDate = getdate()-14 -- .. two weeks
	
-- get server name
	select @sname = replace(@@servername, '\', '_')
 
-- specify backup directory
	SET @path = '\\SomeServerName\BACKUPS\' + @sname + '\'

-- specify filename format
	SELECT @fileDate = CONVERT(VARCHAR(20),GETDATE(),112)
 
-- setup cursor
	DECLARE db_cursor CURSOR FOR  
	SELECT name FROM master.dbo.sysdatabases
	WHERE name NOT IN ('model', 'tempdb')  -- exclude these databases
 	OPEN db_cursor
	FETCH NEXT FROM db_cursor INTO @name   
 
 --loop through databases backing them up
	WHILE @@FETCH_STATUS = 0   
	BEGIN   
		SET @fileName = @path + @name + '_' + @fileDate + '.BAK'  
		BACKUP DATABASE @name TO DISK = @fileName  
		FETCH NEXT FROM db_cursor INTO @name
	END
	
-- close cursor
	CLOSE db_cursor   
	DEALLOCATE db_cursor   

-- purge old backups (but manually delete SQL2K)
	if @@version not like '%2000%' 
	exec master.sys.xp_delete_file 0, @path, 'BAK', @DeleteDate, 0

… and this for the log backups …

--backup_all_t.sql

	DECLARE @name VARCHAR(50) -- database name   
	DECLARE @path VARCHAR(256) -- path for backup files   
	DECLARE @fileName VARCHAR(256) -- filename for backup   
	DECLARE @fileDate VARCHAR(20) -- used for file name
	DECLARE @sname varchar(100) -- server name
	DECLARE @DeleteDate DATETIME -- purge date ..
	SET @DeleteDate = getdate()-7 -- .. one week  

-- get server name
	select @sname = replace(@@servername, '\', '_')

-- specify backup directory
	SET @path = '\\SomeServerName\BACKUPS\' + @sname + '\' 

-- specify filename format
	SELECT @fileDate = CONVERT(VARCHAR(20),GETDATE(),112)  
	   + '_' + REPLACE(CONVERT(VARCHAR(20),GETDATE(),108),':','') 

-- setup cursor
	DECLARE db_cursor CURSOR FOR   
	SELECT name FROM master.dbo.sysdatabases  
	WHERE name NOT IN ('master','model','msdb','tempdb')  -- exclude these databases
	AND DATABASEPROPERTYEX(name, 'Recovery') NOT IN ('SIMPLE') -- exclude Simple dbs
	OPEN db_cursor    
	FETCH NEXT FROM db_cursor INTO @name    

 --loop through databases, backing them up 
	WHILE @@FETCH_STATUS = 0    
	BEGIN    
		SET @fileName = @path + @name + '_' + @fileDate + '.TRN'   
		BACKUP LOG @name TO DISK = @fileName  
		FETCH NEXT FROM db_cursor INTO @name    
	END
	
-- close cursor
	CLOSE db_cursor    
	DEALLOCATE db_cursor    

-- purge old backups (but manually delete SQL2K)
	if @@version not like '%2000%' 
	exec master.sys.xp_delete_file 0, @path, 'TRN', @DeleteDate, 0

I heavily commented it as I was near the end of my contract, and knew “temporary solutions” can persist for a long time ;).

TempDB Autogrowth

I was shocked to find TempDB was the most often autogrown in our environment – and changing them by hand soon got old.

Here’s my global script to set them to best-practice values, that will only change files that exist, and will fail if the specified size is less than the current size (IE: it fails safe).

--tempdb_autogrowth.sql

-- report size and growth
select name, size*8/1024 [sizeMB], growth*8/1024 [growthMB]
from master.sys.master_files
where db_name(database_id) = 'tempdb'
order by physical_name


-- resize datafile(s) to 256MB & logfile to 128MB
USE [master]
GO
begin try
ALTER DATABASE [tempdb] MODIFY FILE ( NAME = N'tempdev',  SIZE = 262144KB , FILEGROWTH = 262144KB );
ALTER DATABASE [tempdb] MODIFY FILE ( NAME = N'tempdev2', SIZE = 262144KB , FILEGROWTH = 262144KB );
ALTER DATABASE [tempdb] MODIFY FILE ( NAME = N'tempdev3', SIZE = 262144KB , FILEGROWTH = 262144KB );
ALTER DATABASE [tempdb] MODIFY FILE ( NAME = N'tempdev4', SIZE = 262144KB , FILEGROWTH = 262144KB );
ALTER DATABASE [tempdb] MODIFY FILE ( NAME = N'templog',  SIZE = 131072KB , FILEGROWTH = 131072KB );
end try begin catch end catch

Set Default Autogrowth

Here’s another one of my global changes. This one sets-up ‘Model’, which is the template for all databases created in the future.

-- AutogrowthDefault.sql

-- first examine the current settings
select Name, size*8/1024 [SizeMB], case is_percent_growth 
	when 1 then convert(varchar(50), growth) + ' %' 
	when 0 then convert(varchar(50), growth*8/1024) + ' MB' end AutoGrowth
from master.sys.master_files
where db_name(database_id) = 'Model'
order by [type];

----set Initial-Size to best-practice
ALTER DATABASE [model] MODIFY FILE (NAME = N'modeldev', SIZE = 256MB);
ALTER DATABASE [model] MODIFY FILE (NAME = N'modellog', SIZE = 128MB);

---- set Autogrowth to best-practice
ALTER DATABASE [model] MODIFY FILE (NAME = N'modeldev', FILEGROWTH = 256MB);
ALTER DATABASE [model] MODIFY FILE (NAME = N'modellog', FILEGROWTH = 128MB);

DBA Audit

Building on my “is the database used?” article I developed this SQL job as a lightweight audit to find unused databases and logins.

The job was scheduled to run every minute on every server, and had to work on SQL 2000, 2005, 2008r2 and 2012.

There were 4 steps

1) Initial Setup – Database Audit

--create table of database names

	select [name] dbase_name, [dbid], cast(null as datetime) last_used
	into [master].[dbo].[dbaudit]
	from [master].[dbo].[sysdatabases]
	where [dbid] > 4

2) Initial Setup – Login Audit

-- initial setup

	CREATE TABLE loginAudit (
		login_name VARCHAR(200), last_login_date DATETIME)

	INSERT INTO loginAudit (login_name, last_login_date)
		SELECT DISTINCT sl.name, NULL FROM syslogins sl

Steps 1 and 2 create the tables to hold the results, and interestingly carry on to the next step if they fail (IE: the tables already existed). I did it this way so I could just run it without any preparation.

3) Update Results – Database Audit

--update connections

	update [master].[dbo].[dbaudit]
	set last_used = getdate()
	where [dbid] in 
		(select [dbid] from [master].[dbo].[sysprocesses])

4) Update Results – Login Audit

-- update logins
		
	SELECT MAX(login_time) [login_time], loginame login_name INTO #loginTempTable
	FROM master.dbo.sysprocesses
	where loginame not in ('sa')
	and loginame not like ' %'
	GROUP BY loginame

	UPDATE loginAudit
	SET last_login_date = ltt.login_time 
	FROM #loginTempTable ltt
	WHERE loginAudit.login_name = ltt.login_name

I wanted steps 3 and 4 to capture the current information without any table growth, so if my little job ran for years unnoticed the results-tables would not be huge.

BTW, I developed this on a candidate server from each of the 4 SQL versions, before scripting out the job on SQL server 2012 (“Drop and Create”).

FYI, I tested the resulting script (“CreateJob_DBAAudit.sql”) on the 4 servers again, and made some minor adjustments (so it would work more than once, and without errors on SQL-2000), before deploying to all 300+ production servers via SSMS Registered Servers.

Bulk import into Arcserve UDP

To bulk import my SQL Servers into Arcserve I ran this query on all my production servers via Registered Servers

--arc_import.sql
select cast(serverproperty('ComputerNamePhysicalNetBIOS') as varchar(100)) + ',DOM\svc-arcserve,password'

I pasted the column into a text file saved to my desktop. Then in Arcserve’s main console I chose to Add a Node, then Import from a File. Lastly, I navigated to the text document and imported them all. NOTE 1: you have to put the right domain, service account, and password into the script. NOTE 2: with clusters the script will only get the name of the current active node.

POST SCRIPT
by the way: that query won’t get the server name for SQL 2000 boxes. This one will get them all …

-- physical server names
exec master..xp_regread 'HKEY_LOCAL_Machine',
'SYSTEM\CurrentControlSet\Control\ComputerName\ComputerName\','ComputerName'

Find a SQL Job from an JobStepID

Looking through Waits in my MDW I often see things like …

Program Name: SQLAgent- TSQL JobStep (Job 0xD56…

… and can get the job name from …

SELECT * 
FROM msdb.dbo.sysjobs
WHERE job_id = 0xD56...

Or even better, go directly from the JobStep-ID to the executable causing the issue …

SELECT * 
FROM msdb.dbo.sysjobsteps
WHERE job_id = 0xD56...

Setting Cost Threshold for Parallelism

Prep: reset the Cost Threshold for Parallelism to Microsoft’s default (5) if it has been changed.

1. Set MAXDOP from the default (0) to Microsoft’s recommendation (IE: the number of cores in one CPU, to a maximum of 8).

2. Run dbo.sp_BlitzCache, and from the first result table cut-n-paste the first three columns into a spreadsheet.

3. Sort the spreadsheet by column-2 ‘cost’ and try to find a natural boundary between OLTP and OLAP queries.

4. Set Cost Threshold for Parallelism in the middle of this split.

NB: do not run step 2 again, as the results will be deceptive unless the prep step is actioned first.
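Steps 1 and 4 reduce to two sp_configure settings. A sketch – the values 8 and 50 are placeholders only; the whole point of steps 2–4 is to find your own number:

```sql
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- step 1: MAXDOP = cores in one CPU, capped at 8 (example value)
EXEC sp_configure 'max degree of parallelism', 8;

-- step 4: the midpoint found from the sp_BlitzCache spreadsheet (example value)
EXEC sp_configure 'cost threshold for parallelism', 50;
RECONFIGURE;
```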

When SnapManager backup fails

When a SnapManager backup failed I …
– manually ran the SQL job – which failed again
– remoted onto the server and read the logs (start/ NetApp/ SnapManager for SQL Management) – there weren’t any for that day!
– fired up SnapDrive (start/ NetApp/ SnapDrive) and examined the Disks – there weren’t any (dum dum duuuum)
– restarted all Snap~ services (x3)
– found the disks had re-appeared in Snapdrive
– manually started the sql backup job – success!

Pruning SQL 2005 backup history

When trying to prune the backup history of a SQL 2005 Server (SP3) I kept getting this error …

Msg 547, Level 16, State 0, Line 1
The DELETE statement conflicted with the REFERENCE constraint “FK__backupfil__backu__793DFFAF”.
The conflict occurred in database “msdb”, table “dbo.backupfilegroup”, column ‘backup_set_id’.

The FIX was to drop the constraints, do the dirty deed, then re-create the constraints.

Here’s the code I used …

--PruneBackupHistory_2005.sql

-- remove foreign keys
use msdb
go
--ALTER TABLE [restorefile] drop CONSTRAINT FK__restorefi__resto__00DF2177;
ALTER TABLE [restorefilegroup] drop CONSTRAINT FK__restorefi__resto__02C769E9;
ALTER TABLE [backupmediafamily] drop CONSTRAINT FK__backupmed__media__72910220;
ALTER TABLE [backupset] drop CONSTRAINT FK__backupset__media__76619304;
ALTER TABLE [backupfilegroup] drop CONSTRAINT FK__backupfil__backu__793DFFAF;
ALTER TABLE [backupfile] drop CONSTRAINT FK__backupfil__backu__7C1A6C5A;
ALTER TABLE [restorehistory] drop CONSTRAINT FK__restorehi__backu__7EF6D905;
go

-- empty tables
truncate table backupfile
truncate table backupfilegroup
truncate table backupmediafamily
truncate table backupmediaset
truncate table backupset
truncate table restorefile
truncate table restorefilegroup
truncate table restorehistory
go

-- replace foreign keys
--ALTER TABLE [restorefile] ADD CONSTRAINT FK__restorefi__resto__00DF2177 FOREIGN KEY (restore_history_id) REFERENCES restorehistory (restore_history_id);
ALTER TABLE [restorefilegroup] ADD CONSTRAINT FK__restorefi__resto__02C769E9 FOREIGN KEY (restore_history_id) REFERENCES restorehistory (restore_history_id);
ALTER TABLE [backupmediafamily] ADD CONSTRAINT FK__backupmed__media__72910220 FOREIGN KEY (media_set_id) REFERENCES backupmediaset (media_set_id);
ALTER TABLE [backupset] ADD CONSTRAINT FK__backupset__media__76619304 FOREIGN KEY (media_set_id) REFERENCES backupmediaset (media_set_id);
ALTER TABLE [backupfilegroup] ADD CONSTRAINT FK__backupfil__backu__793DFFAF FOREIGN KEY (backup_set_id) REFERENCES backupset (backup_set_id);
ALTER TABLE [backupfile] ADD CONSTRAINT FK__backupfil__backu__7C1A6C5A FOREIGN KEY (backup_set_id) REFERENCES backupset (backup_set_id);
ALTER TABLE [restorehistory] ADD CONSTRAINT FK__restorehi__backu__7EF6D905 FOREIGN KEY (backup_set_id) REFERENCES backupset (backup_set_id);
go
dbcc shrinkdatabase ('msdb')
go

It’s a bit of a sledgehammer, but that’s just the sort of day I had.
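For the record, msdb also ships a supported pruning procedure, sp_delete_backuphistory, which deletes history older than a given cut-off date. On a bloated msdb it can run painfully slowly (the history tables are poorly indexed), which is what pushed me to the sledgehammer above, but it is the gentler thing to try first (the 30-day cut-off here is just an example) …

```sql
-- supported alternative: prune backup/restore history older than 30 days
-- (can be very slow on a large msdb)
USE msdb
GO
DECLARE @oldest DATETIME
SET @oldest = DATEADD(DAY, -30, GETDATE())
EXEC dbo.sp_delete_backuphistory @oldest_date = @oldest
GO
```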

Update multiple XLSM spreadsheets using VBA

As part of a migration I needed to change the hard-coded connections within a ‘WSSI’ application (a suite of dynamic spreadsheets used for stock-control). In essence this was a folder in a shared location containing hundreds of ‘xlsm’ spreadsheets in sub-folders.

To do this manually I would have had to open each spreadsheet one by one, press ALT-F11, navigate down the tree in the top-left pane opening MODULES then CONSTANTS, then change the hard-coded value (e.g. [Public Const DB_SERVER As String = "Old-Name"]) to the "New-Name" before saving.

** This would have taken days as there were hundreds of them **

I was able to semi-automate the process and completed it in an hour. Here’s how …

– I copied the root folder to my C-Drive
– Created a new XLSM spreadsheet on my desktop
– Opened File \ Options \ Trust Center \ Trust Center Settings \ Macro Settings
– and ticked “Trust access to the VBA project object model”
– saved it as [wssi_mod.xlsm]
– selected any cell and typed ALT-F11 (to develop)
– In the top-left pane I right-clicked on “Sheet1”, Insert \ Module
– in the right-hand pane I pasted this VBA

Option Explicit

Sub Test()
  Dim Path As String, FName As String
  Dim SearchFor As String, ReplaceWith As String, Contents As String
  Dim Wb As Excel.Workbook
  Dim vbComp As Object 'VBIDE.VBComponent
  Dim Changed As Boolean
  
  'Customize this:
   Path = "C:\WSSI_2014\Dairy\"
     SearchFor = "Public Const DB_SERVER As String = ""old-name"""
   ReplaceWith = "Public Const DB_SERVER As String = ""new-name"""
    
  'Prepare Excel
  Application.EnableEvents = False
  Application.Calculation = xlCalculationManual
  Application.ScreenUpdating = False
  Application.DisplayAlerts = False
  
  'Find the first file
  FName = Dir(Path & "*.xlsm")
  'While found
  Do While FName <> ""
    'Open the file
    Set Wb = Workbooks.Open(Path & FName, False, False)
    Changed = False
    'For each module
    For Each vbComp In Wb.VBProject.VBComponents
      With vbComp.CodeModule
        'Any lines?
        If .CountOfLines > 0 Then
          'Get them
          Contents = .Lines(1, .CountOfLines)
          If InStr(1, Contents, SearchFor, vbTextCompare) > 0 Then
            Contents = Replace(Contents, SearchFor, ReplaceWith, , , vbTextCompare)
            'Replace the contents with the modified string
            .DeleteLines 1, .CountOfLines
            .InsertLines 1, Contents
            'Clean empty lines at the top and bottom
            Do While Len(Trim$(.Lines(1, 1))) = 0
              .DeleteLines 1, 1
            Loop
            Do While Len(Trim$(.Lines(.CountOfLines, 1))) = 0
              .DeleteLines .CountOfLines, 1
            Loop
            Changed = True
          End If
        End If
      End With
    Next
    'Close the file, save if necessary
    Wb.Close Changed
    'Next file
    FName = Dir
  Loop
  
  'Done
  Application.EnableEvents = True
  Application.Calculation = xlCalculationAutomatic
  Application.ScreenUpdating = True
  Application.DisplayAlerts = True
End Sub

– I customized the “Customize this” section as appropriate
– saved and closed [wssi_mod.xlsm]
– then ran it by opening it and typing ALT-F8 (run)
– I noticed the “Date modified” property of the files had all been updated
– and opened a few to confirm the changes had been completed correctly
– then I changed the path in the “Customize this” to the next sub-folder before saving and running again
– lastly I copied the whole lot back to the shared-drive

Who changed a stored-procedure?

To quickly find out who recently changed a stored procedure, I first found the date/time it was last changed …

SELECT name, create_date, modify_date
FROM sys.objects
WHERE type = 'P'
order by modify_date desc

… then in SQL Server Management Studio I right-clicked the Server-Name and chose Reports \ Standard Reports \ Schema Changes History, and scrolled down to that date/time.
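That standard report is driven by the default trace, so (assuming the default trace is still enabled) the same who/when detail can be queried directly – bearing in mind the trace files roll over, so only recent changes survive. A sketch …

```sql
-- read recent stored-procedure create/drop/alter events from the default trace
DECLARE @path NVARCHAR(260)
SELECT @path = [path] FROM sys.traces WHERE is_default = 1

SELECT t.StartTime, t.LoginName, t.HostName, t.DatabaseName, t.ObjectName
FROM sys.fn_trace_gettable(@path, DEFAULT) t
WHERE t.EventClass IN (46, 47, 164)  -- Object:Created, Object:Deleted, Object:Altered
  AND t.ObjectType = 8272            -- stored procedure
ORDER BY t.StartTime DESC
```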

Scripting out SQL Jobs

When I scripted out a SQL job with the “DROP and CREATE” option (on SQL 2008 R2) I found the script only worked once, because the DROP step referenced the job’s GUID, which changes every time the job is re-created.

To fix this I replaced [@job_id=N’blah blah blah’] in [sp_delete_job] with [@job_name=N’blah blah’], pasting in the actual job name from [sp_add_job].

I also modified the script to work with SQL 2000 and SQL 2005 as well, by …

– in [sp_delete_job], commenting out [--, @delete_unused_schedule=1]

– duplicating [sp_add_job], with the first copy preceded by [IF @@version LIKE ‘%2000%’; BEGIN] and stripped of [@notify_email_operator_name]

– and the second copy preceded by [END; ELSE]
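Pieced together, the reshaped skeleton looked something like this (the job name and operator name here are placeholders standing in for the values pasted from the generated script) …

```sql
-- drop by name rather than by the scripted GUID
EXEC msdb.dbo.sp_delete_job @job_name = N'MyJob'
    --, @delete_unused_schedule=1       -- commented out for SQL 2000

-- version-specific create
IF @@VERSION LIKE '%2000%'
BEGIN
    EXEC msdb.dbo.sp_add_job @job_name = N'MyJob'   -- no @notify_email_operator_name
END
ELSE
BEGIN
    EXEC msdb.dbo.sp_add_job @job_name = N'MyJob',
        @notify_email_operator_name = N'DBA-Team'
END
```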

Database Restore Msg 3634 Error ’32’

I kept getting an odd error message …

Msg 3634, Level 16, State 1, Line 3
The operating system returned the error '32(The process cannot access the file because it is being used by another process.)' while attempting 'RestoreContainer::ValidateTargetForCreation' on 'E:\Website.mdf'.
blah blah blah

… while executing this code (simplified) …

RESTORE DATABASE [Website]
FROM DISK = N'E:\backups\Website.bak' WITH FILE = 1,
MOVE N'Website' TO N'D:\Website.mdf',
MOVE N'Website_Indexes' TO N'D:\Website_1.ndf',
MOVE N'Website_Customers' TO N'D:\Website_2.ndf',
MOVE N'Website_log' TO N'D:\Website_3.ldf',
NOUNLOAD, REPLACE, STATS = 1
GO

The solution was to use a double-backslash in the path like so …

RESTORE DATABASE [Website]
FROM DISK = N'E:\\backups\Website.bak' WITH FILE = 1,
MOVE N'Website' TO N'D:\Website.mdf',
MOVE N'Website_Indexes' TO N'D:\Website_1.ndf',
MOVE N'Website_Customers' TO N'D:\Website_2.ndf',
MOVE N'Website_log' TO N'D:\Website_3.ldf',
NOUNLOAD, REPLACE, STATS = 1
GO
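When a restore misbehaves like this it is also worth double-checking the logical file names actually held inside the backup before MOVE-ing them …

```sql
-- list the logical and physical file names stored in the backup
RESTORE FILELISTONLY
FROM DISK = N'E:\backups\Website.bak' WITH FILE = 1
GO
```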

Empty Database

Further to my last post – here is my script to empty a database. Step 1 TRUNCATEs (very fast) all the tables that are not referenced by a foreign key. Step 2 then attempts to DELETE the contents of the remaining tables (this fails for any table referenced by a non-empty table). Step 3 has another go at those that failed, hopefully more successfully now that the referencing tables are empty.

-- EmptyDatabase.sql

-- 1) TRUNCATE child & independent tables

-- capture non-parent table names
SELECT SCHEMA_name(schema_id) [schema], [name] INTO #tables
FROM sys.tables
where name NOT IN (SELECT OBJECT_NAME(referenced_object_id) FROM sys.foreign_keys)
ORDER BY name

-- loop through list truncating tables
DECLARE @sql VARCHAR(300)
WHILE (SELECT COUNT(*) FROM #tables) > 0
BEGIN
	SELECT TOP 1 @sql = [schema] + '.[' + [name] + ']' FROM #tables
	DELETE FROM #tables WHERE [schema] + '.[' + [name] + ']' = @sql
	SELECT @sql = 'truncate table ' + @sql
	EXEC(@sql)
END

DROP TABLE #tables


-- 2) DELETE parents of child tables

--list all parent-tables
SELECT OBJECT_NAME(referenced_object_id) pname into #ptables
FROM sys.foreign_keys
GO

-- loop through list DELETING tables
DECLARE @sql VARCHAR(300)
WHILE (SELECT COUNT(*) FROM #ptables) > 0
BEGIN
	SELECT TOP 1 @sql = [pname] FROM #ptables
	DELETE FROM #ptables WHERE [pname] = @sql
	SET @sql = 'delete ' + @sql
	exec(@sql)
END

drop table #ptables


-- 3) DELETE parents of (now-empty) parent tables

--list all parent-tables
SELECT OBJECT_NAME(referenced_object_id) pname into #ptables2
FROM sys.foreign_keys
GO

-- loop through list DELETING tables
DECLARE @sql VARCHAR(300)
WHILE (SELECT COUNT(*) FROM #ptables2) > 0
BEGIN
	SELECT TOP 1 @sql = [pname] FROM #ptables2
	DELETE FROM #ptables2 WHERE [pname] = @sql
	SET @sql = 'delete ' + @sql
	exec(@sql)
END

drop table #ptables2

In use, this reduced a database of [965,849,336] total rows of data to just [10,860] rows in less than two seconds. Which after shrinking (this was a copy of a production-database on a dev-server set to Simple-mode), shrunk it from [822000.50 MB] to [65100.13 MB].

I think I deserve a beer!

Tables referenced by Foreign Keys

As part of my project to empty a database I wanted to TRUNCATE tables without Foreign Keys. Here’s a neat script, which I have not seen elsewhere, that lists the tables referenced by at least one Foreign Key (and which therefore could not be truncated) …

-- ParentTables.sql
SELECT DISTINCT OBJECT_NAME(referenced_object_id) AS TablesRefdByFKs
FROM sys.foreign_keys

And just for interest, here’s the script to list tables that are not parents (and may or may not be children) …

-- NonParentTables.sql
SELECT name NonParentTables
FROM sys.tables
where name NOT IN (SELECT OBJECT_NAME(referenced_object_id) FROM sys.foreign_keys)
ORDER BY name

Which SQL Process is running which SQL Instance?

To easily see which SQL Process relates to which Instance of SQL Server – In Windows Task Manager / Processes
– right-click on one of the ‘sqlservr.exe’ names
– and select ‘Open file location’.

By examining the path to the file that opens I was able to distinguish the particular instance that this executable related to.
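From the SQL side (2008 onwards) the mapping can be made the other way round too, as each instance reports its own Windows process ID, which can be matched against the PID column in Task Manager …

```sql
-- match this instance to its sqlservr.exe PID in Task Manager (SQL 2008+)
SELECT SERVERPROPERTY('ServerName') AS InstanceName,
       SERVERPROPERTY('ProcessID')  AS WindowsPID
```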