Greetings

Welcome to my technical blog. I like data and databases; they seem to be at the heart of everything, although my perspective may be a tad biased :).

Below I plan to jot down things that trip me up at work, along with things I find interesting. There will be no “level” to my posts: some will seem blindingly obvious; hopefully others will be deeper and of more interest.

I tend to write in the first person (“… this worked for me” rather than “this may work for you if your environment is similar to mine”) in the hope that you can adapt my succinct notes that ACTUALLY WORKED to your situation.

DBA Rule #6

You can serve only one God.

Whatever you fall asleep thinking about, your subconscious will continue to work on overnight.

If your last thoughts are about impressing your boss, your technical work the next day will be pedestrian.

The two groups I see perpetually falling foul of this rule are Manager/DBAs and Developer/DBAs.

DBA Rule #2

Avoid Scripting.

A well-crafted SQL statement is a thing of beauty, and writing one gives a great deal of satisfaction.

But creating art is not the job.

Script is fragile (to know how fragile, count the number of potential typos), which is fine for those whose output is intended to be constantly improved upon. But I feel DBAs should favour robustness over the satisfaction of creativity, and implement for the long term.

If there is simply no other choice but to use a script, avoid the near-endless cycle of creating // debugging // augmenting your own, and download industrial-strength scripts from a trusted source (like Ola Hallengren). But read and understand them before using them in production. 🙂

Emergency Friday afternoon Backups

I had a situation where a VMware upgrade stopped the backup software from working across the board (mmm, a little notice would have been nice).

To avoid the possibility of a full log crashing an application over the weekend I created and deployed two jobs: full backups at 10pm each night, and transaction-log backups every 3 hours.

The script (and scripted-out job) had to be compatible with all versions of SQL Server. The only prep I had to do was create a folder for each server (or server_instance) at the backup destination.

The code I downloaded from http://www.mssqltips.com/sqlservertip/1070/simple-script-to-backup-all-sql-server-databases/ evolved into this …

--backup_all.sql

	DECLARE @name VARCHAR(50) -- database name  
	DECLARE @path VARCHAR(256) -- path for backup files  
	DECLARE @fileName VARCHAR(256) -- filename for backup 
	DECLARE @fileDate VARCHAR(20) -- added to file name
	DECLARE @sname varchar(100) -- server name
	DECLARE @DeleteDate DATETIME -- purge date ..
	SET @DeleteDate = getdate()-14 -- .. two weeks
	
-- get server name
	select @sname = replace(@@servername, '\', '_')
 
-- specify backup directory
	SET @path = '\\SomeServerName\BACKUPS\' + @sname + '\'

-- specify filename format
	SELECT @fileDate = CONVERT(VARCHAR(20),GETDATE(),112)
 
-- setup cursor
	DECLARE db_cursor CURSOR FOR  
	SELECT name FROM master.dbo.sysdatabases
	WHERE name NOT IN ('model', 'tempdb')  -- exclude these databases
 	OPEN db_cursor
	FETCH NEXT FROM db_cursor INTO @name   
 
 --loop through databases backing them up
	WHILE @@FETCH_STATUS = 0   
	BEGIN   
		SET @fileName = @path + @name + '_' + @fileDate + '.BAK'  
		BACKUP DATABASE @name TO DISK = @fileName  
		FETCH NEXT FROM db_cursor INTO @name
	END
	
-- close cursor
	CLOSE db_cursor   
	DEALLOCATE db_cursor   

-- purge old backups (but manually delete SQL2K)
	if @@version not like '%2000%' 
	exec master.sys.xp_delete_file 0, @path, 'BAK', @DeleteDate, 0

… and this for the log backups …

--backup_all_t.sql

	DECLARE @name VARCHAR(50) -- database name   
	DECLARE @path VARCHAR(256) -- path for backup files   
	DECLARE @fileName VARCHAR(256) -- filename for backup   
	DECLARE @fileDate VARCHAR(20) -- used for file name
	DECLARE @sname varchar(100) -- server name
	DECLARE @DeleteDate DATETIME -- purge date ..
	SET @DeleteDate = getdate()-7 -- .. one week

-- get server name
	select @sname = replace(@@servername, '\', '_')

-- specify backup directory
	SET @path = '\\SomeServerName\BACKUPS\' + @sname + '\' 

-- specify filename format
	SELECT @fileDate = CONVERT(VARCHAR(20),GETDATE(),112)  
	   + '_' + REPLACE(CONVERT(VARCHAR(20),GETDATE(),108),':','') 

-- setup cursor
	DECLARE db_cursor CURSOR FOR   
	SELECT name FROM master.dbo.sysdatabases  
	WHERE name NOT IN ('master','model','msdb','tempdb')  -- exclude these databases
	AND DATABASEPROPERTYEX(name, 'Recovery') NOT IN ('SIMPLE') -- exclude Simple dbs
	OPEN db_cursor    
	FETCH NEXT FROM db_cursor INTO @name    

 --loop through databases, backing them up 
	WHILE @@FETCH_STATUS = 0    
	BEGIN    
		SET @fileName = @path + @name + '_' + @fileDate + '.TRN'   
		BACKUP LOG @name TO DISK = @fileName  
		FETCH NEXT FROM db_cursor INTO @name    
	END
	
-- close cursor
	CLOSE db_cursor    
	DEALLOCATE db_cursor    

-- purge old backups (but manually delete SQL2K)
	if @@version not like '%2000%' 
	exec master.sys.xp_delete_file 0, @path, 'TRN', @DeleteDate, 0
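
Each script was attached to its own SQL Agent job. As a rough sketch of how the two schedules were set up (the job and schedule names here are my own illustration, not the originals):

	--hypothetical sketch: schedules for the two backup jobs
	USE msdb

	-- nightly full backups at 10pm (runs backup_all.sql)
	EXEC dbo.sp_add_jobschedule
		@job_name = 'Backup - Full',         -- assumed job name
		@name = 'Nightly 10pm',
		@freq_type = 4,                      -- daily
		@freq_interval = 1,
		@active_start_time = 220000

	-- log backups every 3 hours (runs backup_all_t.sql)
	EXEC dbo.sp_add_jobschedule
		@job_name = 'Backup - Logs',         -- assumed job name
		@name = 'Every 3 hours',
		@freq_type = 4,
		@freq_interval = 1,
		@freq_subday_type = 8,               -- units of hours
		@freq_subday_interval = 3,
		@active_start_time = 0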

I heavily commented it, as I was near the end of my contract and knew “temporary solutions” can persist for a long time ;).

DBA Audit

Building on my “is the database used?” article, I developed this SQL-job as a lightweight audit to find unused databases and logins.

The job was scheduled to run every minute on every server, and had to work on SQL 2000, 2005, 2008r2 and 2012.

There were 4 steps:

1) Initial Setup – Database Audit

--create table of database names

	select [name] dbase_name, [dbid], cast(null as datetime) last_used
	into [master].[dbo].[dbaudit]
	from [master].[dbo].[sysdatabases]
	where [dbid] > 4

2) Initial Setup – Login Audit

-- initial setup

	CREATE TABLE loginAudit (
		login_name VARCHAR(200), last_login_date DATETIME)

	INSERT INTO loginAudit (login_name, last_login_date)
		SELECT DISTINCT sl.name, NULL FROM syslogins sl

Steps 1 and 2 create the tables to hold the results and, interestingly, carry on to the next step if they fail (IE: the tables already existed). I did it this way so I could just run it without any preparation.
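
An alternative to letting steps 1 and 2 fail harmlessly is to guard the creation, so the steps succeed whether or not the tables already exist. A minimal sketch (OBJECT_ID checks also work back on SQL 2000):

	-- create the audit tables only if they are not already there
	IF OBJECT_ID('master.dbo.dbaudit') IS NULL
		SELECT [name] dbase_name, [dbid], CAST(NULL AS datetime) last_used
		INTO [master].[dbo].[dbaudit]
		FROM [master].[dbo].[sysdatabases]
		WHERE [dbid] > 4

	IF OBJECT_ID('dbo.loginAudit') IS NULL
	BEGIN
		CREATE TABLE loginAudit (
			login_name VARCHAR(200), last_login_date DATETIME)
		INSERT INTO loginAudit (login_name, last_login_date)
			SELECT DISTINCT sl.name, NULL FROM syslogins sl
	END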

3) Update Results – Database Audit

--update connections

	update [master].[dbo].[dbaudit]
	set last_used = getdate()
	where [dbid] in 
		(select [dbid] from [master].[dbo].[sysprocesses])

4) Update Results – Login Audit

-- update logins
		
	SELECT MAX(login_time) [login_time], loginame login_name INTO #loginTempTable
	FROM master.dbo.sysprocesses
	where loginame not in ('sa')
	and loginame not like ' %'
	GROUP BY loginame

	UPDATE loginAudit
	SET last_login_date = ltt.login_time 
	FROM #loginTempTable ltt
	WHERE loginAudit.login_name = ltt.login_name

I wanted steps 3 and 4 to capture the current information without any table growth, so if my little job ran for years unnoticed the results-tables would not be huge.

BTW, I developed this on a candidate server from each of the 4 SQL versions, before scripting out the job on SQL Server 2012 (“Drop and Create”).

FYI, I tested the resulting script (“CreateJob_DBAAudit.sql”) on the 4 servers again, and made some minor adjustments (so it would work more than once, and without errors on SQL-2000), before deploying to all 300+ production servers via SSMS Registered Servers.

Setting Cost Threshold for Parallelism

Prep. Reset the Cost Threshold for Parallelism to Microsoft’s default (5) if it has been changed.

1. Set MAXDOP from the default (0) to Microsoft’s recommendation (IE: the number of cores in one CPU, to a maximum of 8).

2. Run dbo.sp_BlitzCache, and from the first result table cut-n-paste the first three columns into a spreadsheet.

3. Sort the spreadsheet by column-2 ‘cost’ and try to find a natural boundary between OLTP and OLAP queries.

4. Set Cost Threshold for Parallelism in the middle of this split.

NB: do not run #2 again, as the results will be deceptive unless (prep) is actioned first.
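
For reference, (prep) and steps 1 and 4 are just sp_configure changes. A sketch, where the step-4 value (50) is purely illustrative — yours comes from the boundary found in the spreadsheet:

	exec sp_configure 'show advanced options', 1; reconfigure;

	-- (prep) reset to Microsoft's default
	exec sp_configure 'cost threshold for parallelism', 5; reconfigure;

	-- 1. cap parallelism at the cores in one CPU (max 8)
	exec sp_configure 'max degree of parallelism', 8; reconfigure;

	-- 4. after finding the OLTP/OLAP boundary, set the threshold mid-split
	exec sp_configure 'cost threshold for parallelism', 50; reconfigure;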

Update multiple XLSM spreadsheets using VBA

As part of a migration I needed to change the hard-coded connections within a ‘WSSI’ application (a suite of dynamic spreadsheets used for stock control). In essence this was a folder in a shared location containing hundreds of ‘xlsm’ spreadsheets in sub-folders.

To do this manually I would have had to open each spreadsheet one by one, press ALT-F11, navigate down the tree in the top-left pane, opening MODULES then CONSTANTS, then change the hard-coded values (EG: [Public Const DB_SERVER As String = “Old-Name”]) and type in the “New-Name” before saving.

** This would have taken days as there were hundreds of them **

I was able to semi-automate the process and completed it in an hour. Here’s how …

– I copied the root folder to my C-Drive
– Created a new XLSM spreadsheet on my desktop
– Opened File \ Options \ Trust Center \ Trust Center Settings \ Macro Settings
– and ticked “Trust access to the VBA project object model”
– saved it as [wssi_mod.xlsm]
– selected any cell and typed ALT-F11 (to develop)
– In the top-left pane I right-clicked on “Sheet1”, Insert \ Module
– in the right-hand pane I pasted this VBA

Option Explicit

Sub Test()
  Dim Path As String, FName As String
  Dim SearchFor As String, ReplaceWith As String, Contents As String
  Dim Wb As Excel.Workbook
  Dim vbComp As Object 'VBIDE.VBComponent
  Dim Changed As Boolean
  
  'Customize this:
   Path = "C:\WSSI_2014\Dairy\"
     SearchFor = "Public Const DB_SERVER As String = ""old-name"""
   ReplaceWith = "Public Const DB_SERVER As String = ""new-name"""
    
  'Prepare Excel
  Application.EnableEvents = False
  Application.Calculation = xlCalculationManual
  Application.ScreenUpdating = False
  Application.DisplayAlerts = False
  
  'Find the first file
  FName = Dir(Path & "*.xlsm")
  'While found
  Do While FName <> ""
    'Open the file
    Set Wb = Workbooks.Open(Path & FName, False, False)
    Changed = False
    'For each module
    For Each vbComp In Wb.VBProject.VBComponents
      With vbComp.CodeModule
        'Any lines?
        If .CountOfLines > 0 Then
          'Get them
          Contents = .Lines(1, .CountOfLines)
          If InStr(1, Contents, SearchFor, vbTextCompare) > 0 Then
            Contents = Replace(Contents, SearchFor, ReplaceWith, , , vbTextCompare)
            'Replace the contents with the modified string
            .DeleteLines 1, .CountOfLines
            .InsertLines 1, Contents
            'Clean empty lines at the top and bottom
            Do While Len(Trim$(.Lines(1, 1))) = 0
              .DeleteLines 1, 1
            Loop
            Do While Len(Trim$(.Lines(.CountOfLines, 1))) = 0
              .DeleteLines .CountOfLines, 1
            Loop
            Changed = True
          End If
        End If
      End With
    Next
    'Close the file, save if necessary
    Wb.Close Changed
    'Next file
    FName = Dir
  Loop
  
  'Done
  Application.EnableEvents = True
  Application.Calculation = xlCalculationAutomatic
  Application.ScreenUpdating = True
  Application.DisplayAlerts = True
End Sub

– I customized the “Customize this” section as appropriate
– saved and closed [wssi_mod.xlsm]
– then ran it by opening it and typing ALT-F8 (run)
– I noticed the “Date modified” property of the files had all been updated
– and opened a few to confirm the changes had been completed correctly
– then I changed the path in the “Customize this” section to the next sub-folder before saving and running again
– lastly I copied the whole lot back to the shared-drive

Spreadsheet Sizer

A customer was constantly importing a variety of XLS files, and at least once a week having to scroll through raw data to find the cause of a generic “data would be truncated” type error.

I pieced together this SP (to be called by the already-in-place EMAIL_SORTER procedure) to indicate the column with the problem …

USE [DBA]
GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO

/***************************************************************************************************
Procedure Name	:	Spreadsheet_check
Date Created	:	12/02/2013

Description	:	Reports the Size of spreadsheet columns
***************************************************************************************************/

alter procedure [dbo].[Spreadsheet_check]
@requester varchar(200)
as

declare @recipients varchar(200), @query varchar(200), @subject varchar(100), @body varchar(200)
declare @file_attachments varchar(200), @xlsname varchar(50), @doscmd varchar(50), @sqlcmd varchar(max)

-- get the name of the oldest spreadsheet to be checked
set @doscmd = 'dir /b E:\DBAsizer\*.xls /OD'
truncate table tmp_inputs
insert into tmp_inputs
exec master.dbo.xp_cmdshell @doscmd
set @xlsname = (select top 1 * from tmp_inputs)

-- import the spreadsheet
begin try drop table tmp_import end try begin catch end catch -- ignore errors

begin try
set @sqlcmd = 'select * into tmp_import
				from openrowset(''microsoft.ace.oledb.12.0'',
				''excel 12.0;database=e:\dbasizer\' + @xlsname + ''',
				''select * from [Sheet1$]'')'
exec (@sqlcmd)
end try

	--ERROR: Wrong Sheet Name
	begin catch
	select	@recipients = @requester,
		@query = '',
		@subject = 'Excel Size checker: Wrong worksheet',
		@body = 'Sorry the SPREADSHEET SIZING TOOL can only work with default WorkSheet names IE: sheet1',
		@file_attachments = '\\' + dbo.RB_GetServerName() + '\DBAsizer\' + @xlsname
	exec dbo.RB_Emailsender @recipients, @query, @subject, @body, @file_attachments
	goto tidyup
	end catch

-- look up metadata
select @sqlcmd = stuff((select
'union all
select ' + quotename(column_name,'''') + ',
max(len(' + quotename(column_name) + '))
from ' + quotename(table_schema) + '.' + quotename(table_name)
from information_schema.columns
where table_name = 'tmp_import'
and table_schema = 'dbo'
order by column_name
for xml path(''),type).value('.','varchar(max)'),1,11,'')

-- email report
truncate table tmp_sizer
insert into tmp_sizer exec(@sqlcmd)

select	@recipients = @requester,
		@query = 'SELECT * FROM  [' + dbo.RB_GetServerName() + '].DBA.dbo.tmp_sizer',
		@subject = 'Excel Size Checker',
		@body = 'To use this SPREADSHEET SIZING TOOL please email an xls file to SQLSRV with the subject of SIZER. ' + Space(5) + ' Please find below the results for the attached spreadsheet. ' + Space(5),
		@file_attachments = '\\' + dbo.RB_GetServerName() + '\DBAsizer\' + @xlsname

exec dbo.RB_Emailsender @recipients, @query, @subject, @body, @file_attachments

tidyup:
set @doscmd = 'del \\' + dbo.RB_GetServerName() + '\DBAsizer\' + @xlsname
exec master.dbo.xp_cmdshell @doscmd

Find stored-procedure on current server

A SQL-Job was failing with Could not find stored procedure ‘RemovePushMsgs’.

I noticed the ‘database’ drop-down value was missing from the job-step – which was odd. To confirm my suspicion that the database had been removed but the job forgotten, I wrote the following script to search each and every database for the missing SP.

--find_sp.sql
-- to find stored-procedure names like '%RemoveM%' in all databases

create table #temp (name varchar(50), dbname varchar(50), xtype varchar(50))

exec sp_Msforeachdb "use [?];insert #temp select name, '?', xtype from sysobjects where name like '%RemoveM%'"

select name, dbname, case xtype
when 'P' then 'SP'
when 'S' then 'System Table'
when 'X' then 'XP'
when 'U' then 'Table'
when 'TF' then 'FunctionT'
when 'FN' then 'FunctionS'
when 'V' then 'View'
when 'IF' then 'FunctionI'
when 'D' then 'Default'
else xtype end [type]
from #temp
order by 1

drop table #temp

Nightly Restore from FTP

Similar to a previous project, a customer wanted a SQL-2005 database refreshing overnight from a backup on an FTP server. Using an already-in-place tool ‘CoreFTP’ via an already-in-place stored-procedure ‘usr_FTPDownload’ I created a job …

Step-1 “Download backup(s) from ftp site”
exec dba.dbo.usr_FTPDownload
@ftpfile = '*.bak',
@DLdir = 'E:\FTP',
@ftpprofile = 'FTP',
@logfile = 'E:\Microsoft SQL Server\MSSQL.1\MSSQL\LOG\Restore of AdventureWorks from FTP.log',
@email = 'someone@somewhere.com' -- on failure

Step-2 “Restore the latest backup”
-- GET NAME OF LATEST BACKUP
create table #temp (col1 varchar(max))
insert #temp(col1)
exec xp_cmdshell 'dir /b /o-d e:\ftp\*.bak'

declare @newest varchar(50)
set @newest = (select top 1 * from #temp)

--BUILD DYNAMIC COMMAND
declare @cmdstr varchar(200)
set @cmdstr = 'restore database AdventureWorks
from disk=''e:\ftp\' + @newest + '''
with file=1,
nounload,
replace,
stats=10'

--RUN
exec (@cmdstr)

Step-3 “grant access to user” (running on AdventureWorks)
create user an_other for login an_other
exec sp_addrolemember 'db_datareader', 'an_other'

Step-4 “rename the backup-file for housekeeping”
-- get name of latest backup
create table #temp (col1 varchar(max))
insert #temp(col1)
exec xp_cmdshell 'dir /b /o-d e:\sftp\*_*.bak'

declare @oldname varchar(50)
set @oldname = (select top 1 * from #temp)

--check if already renamed
if @oldname like '%File Not Found%' return

-- extract name-part
declare @namepart varchar(50)
set @namepart = left(@oldname,(patindex('%[_]%',@oldname)-1)) -- text before underscore

-- convert date-part into day-of-week
declare @weekday varchar(10)
set @weekday = datepart(dw, right(left(@oldname,(patindex('%.%',@oldname)-1)),10))

-- build up new-name
declare @newname varchar(50)
set @newname = @namepart + @weekday + '.bak'

-- run it
declare @cmdstr varchar(100)
set @cmdstr = 'ren e:\sftp\' + @oldname + ' ' + @newname
exec xp_cmdshell @cmdstr

Step-5 “email results – on success”

declare @datetime as varchar(20)
declare @email as varchar(100)

set @datetime = convert(varchar(17),(select top 1 restore_date
from msdb.dbo.restorehistory
where (destination_database_name = 'AdventureWorks')
order by restore_date desc),113)

set @email = 'Database "AdventureWorks" Refreshed at "' + @datetime + '".'

exec msdb.dbo.sp_send_dbmail
@recipients = 'someone@somewhere.com',
@subject = @email ;

Database Sizes

The script below is very useful for listing database sizes. However, sometimes it comes up with the following error …

Cannot insert the value NULL into column 'owner';

This is the result of a database not having an owner, which is quite a serious condition. Happily it is easily identified like this …

select name, suser_sname(sid) owner from master..sysdatabases

… and fixed like this …

alter authorization on database::[SomeDatabase] to sa;

Or this (which is deprecated from SQL 2012) …

exec sp_changedbowner ‘sa’;

Back on topic 🙂 … here’s the script …

-- dbsize.sql
set nocount on

create table #sizes (
name varchar(50),
db_size varchar(50),
owner varchar(50),
dbid int,
created smalldatetime,
status varchar(500),
compatibility_level varchar(3))

insert #sizes exec sp_helpdb

select name, db_size
from #sizes
order by db_size desc

drop table #sizes
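
Note that sp_helpdb returns db_size as a string (EG: ‘ 104.69 MB’), so the ORDER BY above is alphabetic rather than numeric. A hedged tweak to sort by the actual size:

	-- strip the ' MB' suffix and cast, so big databases really do sort first
	select name, db_size
	from #sizes
	order by cast(replace(db_size, ' MB', '') as decimal(18,2)) desc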

Tablespace has only 1% free

My daily checks revealed that tablespace D_LARGE3 had only 1% free (NB: these are manually extended).
I listed the current datafiles to see the path and name, and to see how large the current ones were …

col tablespace_name for a20
col file_name for a40

select tablespace_name, file_name, bytes/1024/1024
from dba_data_files
where tablespace_name = 'D_LARGE3'
order by file_name;

TABLESPACE FILE_NAME                                BYTES/1024/1024
---------- ---------------------------------------- ---------------
D_LARGE3   /db010/oradata/CIX/d_large3_01.dbf                  1985
D_LARGE3   /db010/oradata/CIX/d_large3_02.dbf                  1985
D_LARGE3   /db010/oradata/CIX/d_large3_03.dbf                  1985
D_LARGE3   /db010/oradata/CIX/d_large3_04.dbf                  1985
D_LARGE3   /db010/oradata/CIX/d_large3_05.dbf                  1985
D_LARGE3   /db010/oradata/CIX/d_large3_06.dbf                  1985
D_LARGE3   /db010/oradata/CIX/d_large3_07.dbf                  1985
D_LARGE3   /db010/oradata/CIX/d_large3_08.dbf                  1985
D_LARGE3   /db010/oradata/CIX/d_large3_09.dbf                  1985
D_LARGE3   /db010/oradata/CIX/d_large3_10.dbf                  1985
D_LARGE3   /db010/oradata/CIX/d_large3_11.dbf                  1985
D_LARGE3   /db010/oradata/CIX/d_large3_12.dbf                  1985
D_LARGE3   /db010/oradata/CIX/d_large3_13.dbf                  1985
D_LARGE3   /db010/oradata/CIX/d_large3_14.dbf                  1985
D_LARGE3   /db010/oradata/CIX/d_large3_15.dbf                  1985
D_LARGE3   /db010/oradata/CIX/d_large3_16.dbf                  1985
D_LARGE3   /db010/oradata/CIX/d_large3_17.dbf                  1985
D_LARGE3   /db010/oradata/CIX/d_large3_18.dbf                  1985
D_LARGE3   /db013/oradata/CIX/d_large3_19.dbf                  1000

… and noticed the last one was only 1GB in size. Therefore I didn’t need to add a new datafile, just expand it …

alter database datafile '/db013/oradata/CIX/d_large3_19.dbf' resize 2000m;

I confirmed this had worked with the ts-free script (attached) …

column owner format a10
column "%free" format 999

select a.tsname, a.total "Tot/Kb", a.total - b.free "Used/Kb",
b.free "Free/K", (b.free * 100) / a.total "%free"
from
(select tablespace_name tsname, sum(bytes)/1024 total
from dba_data_files
group by tablespace_name) a,
(select tablespace_name tsname, sum(bytes)/1024 free
from dba_free_space
group by tablespace_name) b
where a.tsname = b.tsname(+)
/

TSNAME                             Tot/Kb    Used/Kb     Free/K %free
------------------------------ ---------- ---------- ---------- -----
D_LARGE1                         58312704   56477696    1835008     3
D_LARGE3                         56929280   56142848     786432     2
Job done :-))

BTW, here (/node/179) is a similar incident where I had to add a datafile.

Useful Windows commands

This will be my repository for tried-and-tested useful Windows commands …

1) This is my perpetual alarm-clock whilst working at home during the week. It stops me clock-watching, and lets me know when I’m doing overtime. (Don’t forget to check the messenger-service is started).

at 17:00 /every:M,T,W,Th,F net send "richard" It's Five O'Clock lol

2) In the path (EG: C:\Windows) I create a file ‘remindme.cmd’ containing the code …

at %1 net send "richard" %*
exit

… then at some time in my busy future I can set a simple, robust reminder (robust as I don’t need anything running – like an email client). I just open a command prompt and type something like ‘remindme 13:58 call joe for lunch’.
(Oh, and don’t forget to check the messenger-service is working).