Backup Status

Here’s my script that estimates when a running backup will finish …

--backup_status.sql
SELECT r.session_id,
       r.command,
       CONVERT(NUMERIC(6,2), r.percent_complete) AS [Percent Complete],
       CONVERT(VARCHAR(20), DATEADD(ms, r.estimated_completion_time, GETDATE()), 20) AS [ETA Completion Time],
       CONVERT(NUMERIC(10,2), r.total_elapsed_time/1000.0/60.0) AS [Elapsed Min],
       CONVERT(NUMERIC(10,2), r.estimated_completion_time/1000.0/60.0) AS [ETA Min],
       CONVERT(NUMERIC(10,2), r.estimated_completion_time/1000.0/60.0/60.0) AS [ETA Hours],
       CONVERT(VARCHAR(1000), (SELECT SUBSTRING(text, r.statement_start_offset/2 + 1,
           CASE WHEN r.statement_end_offset = -1 THEN 1000
                ELSE (r.statement_end_offset - r.statement_start_offset)/2 END)
       FROM sys.dm_exec_sql_text(sql_handle))) AS [CommandText]
FROM sys.dm_exec_requests r
WHERE r.session_id > 50
--AND command IN ('RESTORE DATABASE','BACKUP DATABASE', 'KILLED/ROLLBACK')
--AND CONVERT(NUMERIC(6,2),r.percent_complete) > 0

Alert on low disk space

Here’s a script I wrote for SQL 2008 R2, to run hourly in a SQL Agent job and email me if the percentage of free disk space is getting low.

-- space_alert.sql

     DECLARE @trig int = 40, -- percentage free
		@operator varchar(50) = 'DBA', -- operator with email address
		@drive varchar(50), @free varchar(50), @email varchar(300)

-- capture free-space to temp-table

	if object_id('tempdb..#space') is not null drop table #space
     select volume_mount_point drive, cast(sum(available_bytes)*100 / sum(total_bytes) as int) Free
     into #space
     from sys.master_files f
     cross apply sys.dm_os_volume_stats(f.database_id, f.[file_id])
     group by volume_mount_point

-- loop through the table row-by-row

     while (select count(*) from #space) > 0
     begin
          set @drive = (select top 1 drive from #space order by drive)
          set @free = (select top 1 free from #space order by drive)

-- and send email if space low

          if @free < @trig
          begin
               set @email = 'EXECUTE msdb.dbo.sp_notify_operator @name=N''' + @operator + ''', @subject=N''Space Alert (' + @@servername + @drive + @free + '%)'',@body=N''On '+ @@servername  + ' the Volume ' + @drive + ' has only ' + @free + '% free space.'''
               exec(@email)
          end

-- then remove current line from table

          delete from #space where drive = @drive
     END

NOTE: I am using @@servername as there are only default instances here.

Try/Catch and Transactions

Try/Catch-blocks and Transaction-blocks really don’t play nice together. When nested, for example, an outer ROLLBACK TRANSACTION will roll back both inner and outer transaction-blocks, regardless of any COMMIT TRANSACTION statements already executed.

This is the most robust template I could come up with for nested Try/Catch-blocks that contain Transactions.

BEGIN TRY
    BEGIN TRY
        (inner DDL statements)
        begin transaction
            (inner DML statements)
        commit transaction
    END TRY
    BEGIN CATCH
        (inner error capture statements)
        rollback transaction
    END CATCH

    (outer DDL statements)
    begin transaction
        (outer DML statements)
    while @@trancount > 0 commit transaction
END TRY
BEGIN CATCH
    (outer error capture statements)
    while @@trancount > 0 rollback transaction
END CATCH

Configuring DAC

There is another definition of DAC in SQL Server – but this one is about the Dedicated Admin Connection, the emergency DBA connection method used when the server is too busy to get onto any other way.

DAC can be a life saver but has to be configured in advance. Luckily it’s a two-minute job and I do it on all Production servers. And here’s how …

– Right click on server-name (in SSMS Object explorer), and choose facets.
– when the Facets box pops up select Surface Area Configuration in the top box.
– in the lower box set RemoteDacEnabled to True, and save with OK.

** TO USE IT **

– startup SSMS and ensure nothing is connected.
– click on “New Query”
– in the “Connect to Server” box prefix the Server Name with “admin: ”
– complete Authentication and click “Connect”

You should now be connected within a minimal safe-mode type environment. That is, with a blank Object Explorer pane.
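If SSMS itself won’t open, the same DAC connection can be made from a command prompt with sqlcmd’s -A (admin) switch – the server name below is just a placeholder …

```
sqlcmd -A -S SERVERNAME -E
```

(-E uses Windows authentication; swap it for -U/-P if you need a SQL login.)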

** UPDATE 05 Mar 2015 **

To enable DAC on multiple servers (2005 and above) programmatically (via Registered Servers) use this …

Use master
GO
sp_configure 'remote admin connections', 1
GO
RECONFIGURE
GO

Check a whole server for database corruption

To check every database on a sql-2005 server for corruption (before an upgrade) I created a job with one step containing this code …

exec sp_msforeachdb 'DBCC CHECKDB(''?'')  WITH NO_INFOMSGS'

… then in the advanced tab I created an output file d:\checkdb.log.

A few hours after I’d manually started the job I was happy to see the output file was empty.

SQL Server 2012 Books-On-Line

To set up a local copy of SQL Server 2012 Books-On-Line (BOL) on Windows 2012 I downloaded it from here … http://www.microsoft.com/en-us/download/details.aspx?id=347 … unpacked it to the default location and noted it. Then in SSMS …

– choose Help / Manage Help Settings / Choose online or local help / I want to use local help / Install contents from disk
– browse within the unpack location for the HelpContentSetup.msha file
– click the three ADD links within the pop-up box

I then found BOL by clicking Help / View Help within SSMS.

Using OUTPUT to simplify the auditing of data changes.

Here’s an example of using OUTPUT to simplify the recording of updates etc …

-- create the audit log
create table audit_table (fname varchar(50), colour varchar(50), 
old_tasty int, changed smalldatetime, new_tasty int)

-- create and populate the fruit table
create table fruit (fname varchar(50),colour varchar(50),tasty int)
insert into fruit values ('banana','yellow',6),('kiwi','green',3),('gooseberry','green',2)

-- change score to 'yuk' for any green fruits
update fruit set tasty = 1
output deleted.*, getdate(), inserted.tasty into audit_table
where colour = 'green'

-- check audit log
select * from audit_table

… and another nail in the coffin of Triggers 🙂

Log Shipping is simply the best!

For robust High Availability and/or Disaster Recovery projects Log-Shipping is my ALL TIME FAVORITE option.

Log-Shipping is the idea that you take your existing backups and restore them on any number of other SQL Servers. This means it works with every version and edition of SQL Server.

There are no overheads on the production server that is being protected and the standby server(s) are fully open to use between restores. For example you could schedule restores to a reporting-server out of hours.

Compared to the complexity, restrictions, overhead, and manageability of other more fashionable HA/DR options, it’s a “no-brainer” for me.

For example, Mirroring is only one prod-server to one (read-only) standby server, Replication is high-maintenance and flattens referential-integrity, and AlwaysOn Availability Groups are difficult to set up correctly and near-impossible to troubleshoot.


** UPDATE ** It is best to log ship between the same versions of SQL Server.

SQL 2005 to SQL 2008 for example will look like it’s worked, but all restores will be skipped *unless you use no-recovery mode*.
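As a minimal sketch of the restore step on the standby server (database, file and undo-file names below are hypothetical): restoring WITH STANDBY leaves the database readable between restores, while WITH NORECOVERY does not …

```sql
-- restore a shipped transaction-log on the standby server
RESTORE LOG Actions
FROM DISK = 'E:\Backups\Actions_20150305.trn'
WITH STANDBY = 'E:\Backups\Actions_undo.ldf'; -- readable between restores

-- or, if the standby never needs to be read between restores:
-- RESTORE LOG Actions
-- FROM DISK = 'E:\Backups\Actions_20150305.trn'
-- WITH NORECOVERY;
```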

Spreadsheet Sizer

A customer was constantly importing a variety of XLS files, and at least once a week having to scroll through raw data to find the cause of a generic “data would be truncated” type error.

I pieced together this SP (to be called by the already in-place EMAIL_SORTER procedure) to indicate the column with the problem …

USE [DBA]
GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO

/***************************************************************************************************
Procedure Name	:	Spreadsheet_check
Date Created	:	12/02/2013

Description	:	Reports the Size of spreadsheet columns
***************************************************************************************************/

alter procedure [dbo].[Spreadsheet_check]
@requester varchar(200)
as

declare @recipients varchar(200), @query varchar(200), @subject varchar(100), @body varchar(200)
declare @file_attachments varchar(200), @xlsname varchar(50), @doscmd varchar(50), @sqlcmd varchar(max)

-- get the name of the oldest spreadsheet to be checked
set @doscmd = 'dir /b E:\DBAsizer\*.xls /OD'
truncate table tmp_inputs
insert into tmp_inputs
exec master.dbo.xp_cmdshell @doscmd
set @xlsname = (select top 1 * from tmp_inputs)

-- import the spreadsheet
begin try drop table tmp_import end try begin catch end catch -- ignore errors

begin try
set @sqlcmd = 'select * into tmp_import
				from openrowset(''microsoft.ace.oledb.12.0'',
				''excel 12.0;database=e:\DBAsizer\' + @xlsname + ''',
				''select * from [Sheet1$]'')'
exec (@sqlcmd)
end try

	--ERROR: Wrong Sheet Name
	begin catch
	select	@recipients = @requester,
		@query = '',
		@subject = 'Excel Size checker: Wrong worksheet',
		@body = 'Sorry the SPREADSHEET SIZING TOOL can only work with default WorkSheet names IE: sheet1',
		@file_attachments = '\\' + dbo.RB_GetServerName() + '\DBAsizer\' + @xlsname
	exec dbo.RB_Emailsender @recipients, @query, @subject, @body, @file_attachments
	goto tidyup
	end catch

-- look up metadata
select @sqlcmd = stuff((select
'union all
select ' + quotename(column_name,'''') + ',
max(len(' + quotename(column_name) + '))
from ' + quotename(table_schema) + '.' + quotename(table_name)
from information_schema.columns
where table_name = 'tmp_import'
and table_schema = 'dbo'
order by column_name
for xml path(''),type).value('.','varchar(max)'),1,11,'')

-- email report
truncate table tmp_sizer
insert into tmp_sizer exec(@sqlcmd)

select	@recipients = @requester,
		@query = 'SELECT * FROM  [' + dbo.RB_GetServerName() + '].DBA.dbo.tmp_sizer',
		@subject = 'Excel Size Checker',
		@body = 'To use this SPREADSHEET SIZING TOOL please email an xls file to SQLSRV with the subject of SIZER. ' + Space(5) + ' Please find below the results for the attached spreadsheet. ' + Space(5),
		@file_attachments = '\\' + dbo.RB_GetServerName() + '\DBAsizer\' + @xlsname

exec dbo.RB_Emailsender @recipients, @query, @subject, @body, @file_attachments

tidyup:
set @doscmd = 'del \\' + dbo.RB_GetServerName() + '\DBAsizer\' + @xlsname
exec master.dbo.xp_cmdshell @doscmd

Importing CSV-file produces NULL values

I had an issue where importing data from a CSV file resulted in NULLs in a column that should have contained bank sort-codes.

The CSV data in question was in the format ‘nn-nn-nn’ or ‘nn/nn/nn’.

Here’s the Select part of the import command …

SELECT sortcode
FROM OPENROWSET ('Microsoft.ACE.OLEDB.12.0',
		 'Text;Database=\someserver\somefolder;HDR=Yes;',
		 'SELECT * FROM [somefile.csv]')

I found adding IMEX=1 didn’t help.

The answer was to change the registry value …

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Office\14.0\Access Connectivity Engine\Engines\Text\ImportMixedTypes

from ‘Majority Type’ to ‘Text’.

(the majority of numbers in nn/nn/nn were setting this to numeric. Then on import the slashes made it look like dates lol)

** BEWARE – This is of course a global server setting – be ready to undo it – BEWARE **
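For reference, the same change can be scripted with reg.exe – assuming Office 14.0 and the default hive layout, and remembering the warning above applies equally to the scripted version …

```
reg add "HKLM\SOFTWARE\Microsoft\Office\14.0\Access Connectivity Engine\Engines\Text" /v ImportMixedTypes /t REG_SZ /d Text /f
```

(Swap /d Text for /d "Majority Type" to undo it.)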

CoreFTP: “Can’t find site profile”

Further to my post about using CoreFTP to automate FTPS transfers … I had an issue where the SQL Job-step kept failing with “cannot find profile!”.

The issue was caused by my configuring a ‘site-profile’ within CoreFTP’s site-manager whilst logged in as myself. This was then stored in my registry settings, and when the job-step executed as the SQL Service account it did not have access.

The fix (I chose) was to change the default-storage setting from the registry to a file (View / Options / Data / lock / admin / use default configuration file / lock / admin) then set-up the sites profile again.

Script to examine SQL Server 2005/08 logins

Adapting my previous “Script to examine SQL Server 2000 logins” for an upcoming SQL-2005 audit 🙂 …

-- sp_logins_report.sql
use dba
go

begin try drop table #rolemember, #dbnames, #report END TRY
BEGIN CATCH END CATCH -- ignore errors

--variables
declare @counter int
declare @dbname varchar(50)
declare @sqlstr nvarchar(4000)

--make table to hold database, user-define roles & user-names
create table #rolemember
(
rm_servername varchar(50) default @@servername,
rm_dbname varchar(1000),
rm_rolename varchar(1000),
rm_username varchar(1000),
rm_userid varchar(1000)
)

--make table to hold database names
create table #dbnames
(dbid int identity(1,1),
db_dbname varchar(50))

--make table to accumulate report
create table #report
(
re_servername varchar(50),
re_dbname varchar(1000),
re_rolename varchar(1000),
re_username varchar(1000),
re_userid varchar(1000)
)

--get members of each server role
insert into #rolemember (rm_rolename, rm_username, rm_userid)
exec dbo.sp_helpsrvrolemember

--get database names
insert into #dbnames (db_dbname)
select '[' + name + ']' from master.dbo.sysdatabases
where version > 0  -- online
set @counter = @@rowcount

--loop through databases to get members of database roles and user-defined roles
while @counter > 0
	begin

	--get database name from #dbnames table
	set @dbname = (select db_dbname from #dbnames where dbid = @counter)

	--get members of each database and user-defined role
	set @sqlstr = 'insert into #rolemember (rm_rolename, rm_username, rm_userid)
	exec ' + @dbname + '.dbo.sp_helprolemember'
	exec sp_executesql @sqlstr

	--update database name in rolemember table
	update #rolemember
	set rm_dbname = @dbname
	where rm_dbname is null

	set @counter = @counter - 1
	end

--put data into report table
insert into #report
select rm.* from #rolemember rm
left join #report re
on rm.rm_username = re.re_username
and rm.rm_dbname = re.re_dbname
and rm.rm_rolename = re.re_rolename
and rm.rm_servername = re.re_servername
where re.re_servername is null

--display report
select re_username UserName, re_rolename RoleName, re_dbname DBName, re_servername ServerName
from #report
where re_username != 'dbo'
--order by re_username --display by user
order by re_rolename --display by role
--order by re_dbname --display by database

Error displaying Standard-Reports

I installed the latest Service Pack and the Performance-Dashboard download on my SQL 2005 production server so I could see some of the reports I’m used to in SQL 2008. I’m aware these are just front-ends to the dynamic management views, so they wouldn’t increase the server load.

Unfortunately the very first report I wanted, “Server Dashboard”, came up with an error about the compatibility-mode setting …

“sql_handle” is not a recognized table hints option. If it is intended as a parameter to a table-valued function, ensure that your database compatibility mode is set to 90.

After much head-scratching the fix was to simply set my default database back to ‘Master’. lol.

Manage SQL job owners

Whilst taking over from the outgoing DBA I wrote this handy query to list the SQL-Jobs he owned …

--list_job_owners.sql
SELECT name job_name, SUSER_SNAME(owner_sid) job_owner
FROM msdb.dbo.SysJobs
ORDER BY name;

… and here’s the code to change them to ‘sa’ …

--change_job_owners.sql
EXEC msdb.dbo.sp_manage_jobs_by_login
    @action = N'REASSIGN',
    @current_owner_login_name = N'SomeLoginName',
    @new_owner_login_name = N'sa';

Biggest Tables

Here’s a handy script to list the sizes of all tables in the current database …

-- biggest_tables.sql

CREATE TABLE #temp (
table_name sysname ,
row_count INT,
reserved_size VARCHAR(50),
data_size VARCHAR(50),
index_size VARCHAR(50),
unused_size VARCHAR(50))

SET NOCOUNT ON
INSERT #temp
EXEC sp_msforeachtable 'sp_spaceused ''?'''

SELECT a.table_name,
a.row_count,
COUNT(*) AS col_count,
a.data_size
FROM #temp a
INNER JOIN information_schema.columns b
ON a.table_name collate database_default
= b.table_name collate database_default
GROUP BY a.table_name, a.row_count, a.data_size
order by a.row_count desc --row_count
--order by cast(COUNT(*) as int) desc --col_count
--ORDER BY CAST(REPLACE(a.data_size, ' KB', '') AS integer) DESC --data_size

DROP TABLE #temp

Changing RAC to noarchivelog mode

Whilst trying to change an Oracle 11g2 database to noarchivelog mode I kept getting the error “ORA-01126: database must be mounted exclusive and not open for this operation”.

The word “exclusive” reminded me this was part of a cluster. Here are the commands I ultimately used …

sql> select log_mode from gv$database; -- check the modes of all the nodes
c:> srvctl status database -d CLUSTER_NAME -- check nodes are running
c:> srvctl stop database -d CLUSTER_NAME -- stop all nodes of the database
c:> srvctl status database -d CLUSTER_NAME -- check nodes are stopped
sql> startup mount exclusive; -- mount database just on this node
sql> alter database noarchivelog; -- change the log-mode
sql> select log_mode from gv$database; -- check it worked
sql> shutdown immediate; -- shutdown this node
c:> srvctl start database -d CLUSTER_NAME -- startup all nodes of the database
sql> select log_mode from gv$database; -- check modes of all nodes
c:> srvctl status database -d CLUSTER_NAME -- check nodes are running

Of course I used the actual name of the cluster instead of “CLUSTER_NAME” above.

Documentation

Documentation is the constant burden of many a computer professional – but why? Because it is for-the-most-part a static, legacy reporting system.

The usefulness of documentation is undoubted when dealing with other professions, but the curse is that it needs to be maintained. And in a fast changing environment – like in the computer industry – documentation needs to be dynamic.

Spreadsheets should update themselves for example. Have you ever seen a correct server list?

So please please, file static documentation with telephone directories – in the bin 🙂

Configuring MySQL replication

This is a record of the steps I performed to enable replication between MySQL5.0.77 on RHEL and 5.5.25 on WindowsXP.

1. ensure two-way connectivity with pings.

2. make mysql on windows easier to work with by copying mysql.exe and mysqldump.exe to c:\windows\system32.

3. On the Replication-Master (RHEL) I made a copy of /etc/my.cnf then amended the original using VI adding the lines
“log-bin=mysql-bin” and “server-id=1” within the [mysqld] section.

I rebooted and tested that binary-logging was ‘on’ with the command “show variables like ‘log_bin'”.

4. On the WindowsXP / 5.5 / Slave machine I copied then amended the file “c:\program files\mysql\mysql server 5.5\my-large.ini” (note: I chose ~large.ini as my machine had 512mb memory).

server-id=2
master-host=169.254.0.17
master-user=repuser
master-password=password

I then confirmed all other conflicting settings were commented out (EG: server-id=1) and saved the file.
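Worth noting: MySQL 5.5 no longer reads master-host style settings from the config file, so the slave has to be pointed at the master with CHANGE MASTER TO instead. A sketch of the remaining steps (the binlog file name and position are placeholders – take the real values from SHOW MASTER STATUS on the master) …

```sql
-- on the master (RHEL): create the replication account
GRANT REPLICATION SLAVE ON *.* TO 'repuser'@'%' IDENTIFIED BY 'password';

-- on the slave (Windows): point it at the master and start replicating
CHANGE MASTER TO
    MASTER_HOST = '169.254.0.17',
    MASTER_USER = 'repuser',
    MASTER_PASSWORD = 'password',
    MASTER_LOG_FILE = 'mysql-bin.000001', -- placeholder
    MASTER_LOG_POS = 4;                   -- placeholder

START SLAVE;
SHOW SLAVE STATUS\G
```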

The difference between SQL Server and Oracle DBAs.

I feel that Microsoft DBAs are not so worried about knowing every nook-and-cranny of their DBMS because Microsoft can always be relied on to find a specialist to help out 🙂

Oracle DBAs on-the-other-hand know that Oracle PLC only does databases, not Support or Documentation (ok I may be exaggerating a bit). So Oracle DBAs are on their own, and really need an intimate understanding of their particular DBMS internals.

I’m not implying the superiority of one tribe over the other, just noting the difference 🙂

TransactionLog sizes

Here’s a nice script I like to run on all registered servers (at once) to show the sizes of the log-files …

--logdrivesize.sql

create table #temp(
dbname varchar(100),
logsize_mb real,
spaceused_pct real,
dbstatus int)
insert #temp exec ('DBCC SQLPERF (LOGSPACE)')

select dbname, logsize_mb
from #temp
where logsize_mb > 1000
order by logsize_mb desc

drop table #temp

Moving Logins

After migrating a database to another server, use the system procedure ‘sp_help_revlogin’ on the source to generate a script of all logins, including the passwords of SQL logins, then run the output on the target …

exec sp_help_revlogin --all
exec sp_help_revlogin 'somelogin' -- just for 'somelogin'

Orphaned users

Here’s my crib-sheet for finding and dealing with orphaned users …

--orphaned_users.sql

-- SOURCE = 
-- TARGET = 

-- step-1: list orphaned users in current db
EXEC sp_change_users_login 'Report';

-- step-2: map user to login - if they both exist
EXEC sp_change_users_login 'update_one', 'some_user', 'some_login';

-- step-3: copy login, then step-2, then step-1
-- (NOTE: execute this on the source, then execute the output on the target)
Select
'Create Login ' + QUOTENAME(A.name) 
+ ' With Password=' + CONVERT(varchar(max), A.password_hash, 1) + ' hashed'		--script out the password
+ ', DEFAULT_DATABASE=' + quotename(A.default_database_Name) --if you apply the script to a mirrored or log-shipped server, use the master database, as the user database is not in a usable state
+ ', DEFAULT_LANGUAGE=' + quotename(A.default_language_Name)
+ ', CHECK_POLICY=' + Case A.is_policy_checked when 0 then 'OFF' When 1 Then 'On' End 
+ ', CHECK_EXPIRATION=' + Case A.is_expiration_checked when 0 then 'OFF' When 1 Then 'On' End
+ ', SID=' + CONVERT(varchar(max), A.SID, 1)		--script out the SIDs
 As SQLLogin
From 
sys.sql_logins A
Where A.name Not like '##%##'  --remove those system generated sql logins
And A.sid != 0x01 --SA sid is always same
and QUOTENAME(A.name) = '[' + 'some_login' + ']'


-- Extra

-- list USERS (current server/db)
SELECT * FROM sys.database_principals where type in ('S', 'U') order by 1

-- list LOGINS (current server/db)
SELECT * FROM sys.server_principals where type in ('S', 'U') order by 1

Database sizes

Although I already have one – here’s a neat script to show all database sizes …

--database_sizes.sql
select db.name, round(sum(convert(bigint, mf.size)) * 8 /1024 /1000 ,0) [SizeGB]
from master.sys.master_files mf
inner join master.sys.databases db
on db.database_id = mf.database_id
where db.database_id > 4
group by db.name
--order by db.name
order by [SizeGB] desc

Scheduling CHECKDB

I modified my backup-all script to schedule a DBCC CHECKDB run on ‘most’ databases. Then I augmented it further – to quit if still running at 18:30 …

--Checkdb_most.sql
DECLARE @dbname VARCHAR(100)
declare @return int
set @return = 0

DECLARE db_cursor CURSOR FOR
SELECT name FROM master.dbo.sysdatabases
where dbid > 4 -- exclude system databases
and name not in ('somedatabasename')

OPEN db_cursor
FETCH NEXT FROM db_cursor INTO @dbname

WHILE (@@FETCH_STATUS = 0) AND (@return = 0)
BEGIN
       if getdate() > '01 feb 2012 18:20' set @return = 1
       DBCC CHECKDB (@dbname) with PHYSICAL_ONLY
       FETCH NEXT FROM db_cursor INTO @dbname
END

CLOSE db_cursor
DEALLOCATE db_cursor

Scripted full backup of all databases.

I had a requirement to script full backups and this was the simplest code I could find …

--backupAll.sql

DECLARE @dbname VARCHAR(50)
DECLARE @fileName VARCHAR(256)
DECLARE @path VARCHAR(256)

SET @path = N'H:\UpgradeBackups\'

DECLARE db_cursor CURSOR FOR
SELECT name FROM master.dbo.sysdatabases
WHERE name <> 'tempdb'

OPEN db_cursor
FETCH NEXT FROM db_cursor INTO @dbname

WHILE @@FETCH_STATUS = 0
BEGIN
       SET @fileName = @path + @dbname + '_full.Bak'
       BACKUP DATABASE @dbname TO DISK = @fileName with init, noskip, noformat, nounload, stats=10, checksum
       FETCH NEXT FROM db_cursor INTO @dbname
END

CLOSE db_cursor
DEALLOCATE db_cursor

Resolving UserID’s in Event Viewer.

I wanted to find out who made a change on a server – on a known day at a known time. Looking through event-viewer on the server all I could find was an irrelevant “TerminalServices-Printers” error. However within this error – by clicking the Details tab and expanding the ‘system’ paragraph – I saw a USERID, which was a long string of numbers and dashes.

To resolve this userid I opened regedit and expanded HKEY_USERS. I located the userID in question and right-clicked it. Examining the ‘permissions’ configuration, the username was listed.
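An alternative is to translate the SID from a command prompt with wmic – the SID below is a made-up placeholder …

```
wmic useraccount where sid="S-1-5-21-1111111111-2222222222-3333333333-1001" get name
```

(This only resolves local and domain accounts the machine can see; otherwise the regedit route above still works.)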

When was CheckDB last run?

There’s a whole bunch of scripts out there that show the history of when dbcc checkdb was last run. Many are inaccurate. Here’s the one I use …

--checkdb_history.sql
CREATE TABLE #DBInfo_LastKnownGoodCheckDB
	(
		ParentObject varchar(1000) NULL,
		Object varchar(1000) NULL,
		Field varchar(1000) NULL,
		Value varchar(1000) NULL,
		DatabaseName varchar(1000) NULL
	)

DECLARE csrDatabases CURSOR FAST_FORWARD LOCAL FOR
SELECT name FROM sys.databases WHERE name NOT IN ('tempdb')

OPEN csrDatabases

DECLARE
	@DatabaseName varchar(1000),
	@SQL varchar(8000)

FETCH NEXT FROM csrDatabases INTO @DatabaseName

WHILE @@FETCH_STATUS = 0
BEGIN
	--Create dynamic SQL to be inserted into temp table
	SET @SQL = 'DBCC DBINFO (' + CHAR(39) + @DatabaseName + CHAR(39) + ') WITH TABLERESULTS'

	--Insert the results of the DBCC DBINFO command into the temp table
	INSERT INTO #DBInfo_LastKnownGoodCheckDB
	(ParentObject, Object, Field, Value) EXEC(@SQL)

	--Set the database name where it has yet to be set
	UPDATE #DBInfo_LastKnownGoodCheckDB
	SET DatabaseName = @DatabaseName
	WHERE DatabaseName IS NULL

FETCH NEXT FROM csrDatabases INTO @DatabaseName
END

--Get rid of the rows that I don't care about
DELETE FROM #DBInfo_LastKnownGoodCheckDB
WHERE Field != 'dbi_dbccLastKnownGood'

SELECT Distinct
	DatabaseName,
	Value  as LastGoodCheckDB--,
	--DATEDIFF(dd, CAST(Value AS datetime), GetDate()) AS DaysSinceGoodCheckDB,
	--DATEDIFF(hh, CAST(Value AS datetime), GetDate()) AS HoursSinceGoodCheckDB
FROM #DBInfo_LastKnownGoodCheckDB
ORDER BY DatabaseName

DROP TABLE #DBInfo_LastKnownGoodCheckDB

Orphaned Datafiles ?

Occasionally, in my hunt for space, I find a datafile with a name unrelated to any currently installed database. This query lists datafiles and the databases they belong to …

SELECT DB_NAME(database_id) DBName,
	   [name] Logical_Name,
       Physical_Name, 
       CONVERT(FLOAT, size) * 8 / 1000000 SizeGB
FROM master.sys.master_files
-- WHERE DB_NAME(database_id) like 'Man%'
WHERE physical_name LIKE 'c%' -- on the c-drive!
-- ORDER BY database_id
ORDER BY size DESC;

What port is SQL 2005 using?

Well, to connect using SSMS you specify the server name / the instance, then a comma and the port (if it’s not standard)
EG: “lon101dds,5001”

If you don’t know the port …
– remote onto server
– start / programs / sql 2005 / configuration tools / configuration manager /
– find the instance under network config
– look in tcp/ip settings
– within the ip addresses tab
– scroll to the bottom “IP All”
– read tcp port
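If you can already get onto the instance some other way, a quick T-SQL alternative is to ask the connection DMV for its own port …

```sql
-- the TCP port of the current connection
-- (NULL if you are connected over shared memory rather than TCP)
SELECT local_tcp_port
FROM sys.dm_exec_connections
WHERE session_id = @@SPID;
```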

Scripted Notification

Here’s an interesting (ActiveX) SQL Job-Step. It can be used to send an email should the usual methods not be available.

Function Main()
	Main = DTSTaskExecResult_Success

Set objEmail = CreateObject("CDO.Message")
objEmail.From = "sendingemail@address.com"
objEmail.To = "receivingemail@address.com"
objEmail.Subject = "FAILED JOB: name SERVER: name"
objEmail.Textbody = "FAILED JOB: name SERVER: name"
objEmail.Configuration.Fields.Item ("http://schemas.microsoft.com/cdo/configuration/sendusing") = 2
objEmail.Configuration.Fields.Item ("http://schemas.microsoft.com/cdo/configuration/smtpserver") = "123.123.123.123"
objEmail.Configuration.Fields.Item ("http://schemas.microsoft.com/cdo/configuration/smtpserverport") = 25
objEmail.Configuration.Fields.Update
objEmail.Send
set objEmail = nothing
End Function

Where the ip-address (123.123.123.123) is the local email server.

Open SSMS as another user

I cobbled together this script, which sits on my desktop called ‘ssms.vbs’. The disadvantage is you need to hard-code your password into the script and keep it up to date.

set WshShell = WScript.CreateObject("WScript.Shell")

'build runas command 
WshShell.run "runas /user:DOMAIN\USERNAME %comspec%"
WScript.Sleep 100
WshShell.SendKeys "PASSWORD" 'send password
WshShell.SendKeys "{ENTER}"
WScript.Sleep 500

'Open SSMS
WshShell.SendKeys Chr(34) + "C:\Program Files\Microsoft SQL Server\100\Tools\Binn\VSShell\Common7\IDE\ssms.exe" + Chr(34)
WshShell.SendKeys "{ENTER}"

'Close command prompt
WshShell.SendKeys "exit"
WshShell.SendKeys "{ENTER}"
WScript.Sleep 1000

set wshshell = nothing

Working with xp_cmdshell

Most production systems these days have xp_cmdshell disabled as a security precaution, rightly so. If you NEED to execute an operating-system command it is best to use this sort of routine …

--cmdshell.sql

--	DECLARE @cmd varchar(100) = 'shutdown /r'
--      DECLARE @cmd varchar(100) = 'shutdown -r -f -t 0'
	DECLARE @cmd varchar(100) = 'Dir'

		-- DEBUG ... is xp_cmdshell enabled?
		SELECT case when value_in_use = 1 then 'YES' else 'no' end [is CMDSHELL enabled]
		FROM sys.configurations
		where name = 'xp_cmdshell'

IF (SELECT value_in_use /* cmd shell is disabled */
	FROM sys.configurations
	WHERE name = 'xp_cmdshell') = 0
BEGIN
	exec sp_configure 'show advanced options', 1
	reconfigure -- show advanced options
	exec sp_configure 'xp_cmdshell', 1
	reconfigure -- enable command-shell
	exec xp_cmdshell @cmd -- run the command
	exec sp_configure 'xp_cmdshell', 0
	reconfigure -- disable command-shell
	exec sp_configure 'show advanced options', 0
	reconfigure -- hide advanced options
END
ELSE /* cmd shell is enabled */
	exec xp_cmdshell @cmd -- just run the command


		-- DEBUG ... is xp_cmdshell enabled?
		SELECT case when value_in_use = 1 then 'YES' else 'no' end [is CMDSHELL enabled]
		FROM sys.configurations
		where name = 'xp_cmdshell'

(NOTE: executing sp_configure by itself will show the current settings)

Moving SSIS packages

To copy SSIS packages that are stored in MSDB from PROD to UAT I connected to the ‘Integration Services’ object of both servers within SSMS.

To ‘pull’ the package from PROD into UAT I first drilled down both trees (EG: server \ Stored Packages \ msdb) to see which folders were missing.

The first one was called ‘Press’ so I right-clicked on the UAT msdb icon  and created a new folder with that name.

I right-clicked on the new folder and chose ‘import package’. This opened a config-box, within which I just filled in the ‘server:’ and Package path:’ values. The first contained the name of the PROD server (including instance).

The ‘package path’ box contained one parent folder (‘SSIS Packages’) and subfolders for each package. I opened the ‘Press’ folder and selected the package within. Once I clicked ‘ok’ twice the package was copied over.
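As an alternative to the GUI, the same copy can be scripted with the dtutil utility – the server and package names below are hypothetical placeholders …

```
dtutil /SQL "Press\SomePackage" /SOURCESERVER PRODSERVER /COPY SQL;"Press\SomePackage" /DESTSERVER UATSERVER
```

(Handy when there are many packages to move, as it can be wrapped in a batch file.)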

Simple Log-shipping of ‘Actions’ database from London to Paris using SQL2000 Std

SOURCE

All the code below is freely available on the internet. I first saw it in the SQL2000 SDK, from where it has been improved many-many-times-over. Rather than rehash the implementation, here are my handover notes.

OVERVIEW OF THIS PROJECT

Transaction-logs (Tlogs) from the London database ‘Actions’ are backed up every hour to a warm standby in Paris and restored there.

DETAILS

Hourly, a SQL Job in London (‘Log ship Incidents to Paris’) executes a stored procedure (‘sp_ShipLogOnce’) that backs up the Tlog directly to Paris (to E:\Backups, shared as ‘Backup$’), then runs the Paris SQL Job (‘sp_ApplyStandByLog’) that restores it.

MAINTENANCE & LIMITATIONS

There is a Windows batch file on Paris (‘PurgeFilesExceptRecent.bat’) that deletes old Tlogs at 2am each day. It has been initially set to ignore the newest 720 (being one per hour for 30 days).

Logins on Paris need to be regularly updated from London (at least monthly). The stored procedure ‘sp_help_revlogin’ will script out the London logins, from which any new, sql, non-svc entries should be added to Paris.

The Tlogs need to be restored in strict order. After a temporary network failure all outstanding Tlogs can be restored by running this command at Paris …

C:\ApplyStandbyLogs.bat E:\Data S:\Backups

Alternatively individual Tlog can be restored using the stored procedure …

exec Sp_ApplyStandByLog
@DBName='Actions',
@BackupFileName='S:\Backups\Actions\[name.trn]',
@UndoFile='T:\DATA\Actions_undo.ldf',
@MoveFileTo='S:\Backups\Actions\Restored'

SWITCHING ROLES

  1. Disable the London LogShip job
  2. Apply final Tlog (sp_ShipLogOnce with @LocalUndoFile)
  3. Change Paris ‘Actions’ database to read/write, multi-user
  4. Enable the Paris LogShip job

DISASTER RECOVERY

If London is unavailable, only step-3 above is needed. IE:-

exec sp_dboption 'actions', 'dbo use only', 'false'
go
exec sp_dboption 'actions', 'read only', 'false'
go
restore database actions with recovery
go

Mirroring of BESMgmt from London to Paris using SQL Server 2008 r1 Std

PREREQUISITES

  • First I changed the recovery model of the BESMgmt database to use the Full Recovery Model.
  • To manage the BESMgmt Tlog files generated from this I created a Maintenance Plan to backup the Tlogs every 6 hours to the E:\Backups folder. These will be automatically deleted after two weeks (NOTE: The size of the TLogs can be reduced by increasing the backup frequency).
  • I reinstalled SQL Server 2008+SP1 on Paris ensuring that …
  •           The SQL service account used was SVC_SQL
  •           The SQL agent account used was SVC_AGT_SQL
  •           The collation was set to LATIN1_GENERAL_CI_AS
  •           The default location for datafiles was set to S:\DATA
  •           The default location for logfiles was set to T:\LOGS
  •           TempDB was located on the U-Drive

– Minimised potential connectivity delays by …

o Setting up Paris as a linked-server on London
o Ensuring the SQL Browser Services was active and working on both servers

SETUP

To initiate mirroring I …
– Backed up the BESMgmt database and restored it on Paris with no recovery
– Backed up the TLOG and restored it on Paris with no recovery
– Configured Mirroring of BESMgmt on London GUI without a witness.
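The GUI steps above boil down to something like this in T-SQL – the endpoint name, port and server addresses are hypothetical …

```sql
-- on each server: create a mirroring endpoint (the GUI does this for you)
CREATE ENDPOINT Mirroring
    STATE = STARTED
    AS TCP (LISTENER_PORT = 5022)
    FOR DATABASE_MIRRORING (ROLE = PARTNER);

-- on Paris (the mirror), point at London first
ALTER DATABASE BESMgmt SET PARTNER = 'TCP://london.domain.local:5022';

-- then on London (the principal), point at Paris to complete the pairing
ALTER DATABASE BESMgmt SET PARTNER = 'TCP://paris.domain.local:5022';
```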

OPERATION

There are three methods of failing over / swapping the Mirror-Live (Principal) and Mirror-Standby (Partner) roles.
1. There is a ‘Failover button’ on the Principal SSMS GUI mirror-configuration screen. This should be used for testing, and for planned maintenance of the Mirror-Live server.
2. I believe ‘BES Manager’ can be configured to automatically failover the database. This should be the normal way failover occurs.
3. A sql-command can be executed on the Mirror-Partner as a last-ditch option should the Mirror-Live server become unavailable ( IE: alter database besmgmt set partner force_service_allow_data_loss). Data loss may occur.

MONITORING

The Mirror Monitor (screen within SSMS) can be viewed from either server by right-clicking the BESMgmt database and choosing Tasks / Launch database Mirroring Monitor. I have initiated mirror warnings within this tool which will log an error to the application log if Mirroring becomes slow.