--disabled_triggers.sql
select name disabled_triggers
from dbo.sysobjects
where objectproperty(id, 'ExecIsTriggerDisabled') = 1
What about table joins in MongoDB?
er no … sorry. You can always join tables at the application level after uploading them – or just make a new table with the rows you need from other tables.
Joining sharded data would be very slow.
MongoDB indexing
Only uses primary indexes/ keys. No need for foreign-keys if there are no relationships.
If you need a table indexed on another column – make a copy of the table – indexed on the other column.
Documentation
Documentation is the constant burden of many a computer professional – but why? Because it is for-the-most-part a static, legacy reporting system.
The usefulness of documentation is undoubted when dealing with other professions, but the curse is that it needs to be maintained. And in a fast changing environment – like in the computer industry – documentation needs to be dynamic.
Spreadsheets should update themselves for example. Have you ever seen a correct server list?
So please please, file static documentation with telephone directories – in the bin 🙂
Configuring MySQL replication
This is a record of the steps I performed to enable replication between MySQL 5.0.77 on RHEL and 5.5.25 on Windows XP.
1. Ensure two-way connectivity with pings.
2. Make MySQL on Windows easier to work with by copying mysql.exe and mysqldump.exe to c:\windows\system32.
3. On the Replication-Master (RHEL) I made a copy of /etc/my.cnf then amended the original using VI adding the lines
“log-bin=mysql-bin” and “server-id=1” within the [mysqld] section.
I rebooted and tested that binary-logging was 'on' with the command "show variables like 'log_bin'".
4. On the Windows XP / 5.5 / slave machine I copied then amended the file "c:\program files\mysql\mysql server 5.5\my-large.ini" (note: I chose ~large.ini as my machine had 512MB memory).
server-id=2
master-host=169.254.0.17
master-user=repuser
master-password=password
I then confirmed all other conflicting settings were commented out (EG: server-id=1) and saved the file. (A sketch of the remaining replication wiring follows below.)
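For reference, the remaining wiring typically looks something like this (a sketch only: the binlog file name and position are placeholders to be read from SHOW MASTER STATUS, and on 5.5 the slave co-ordinates are normally set with CHANGE MASTER TO rather than the ini file) …

-- On the master (RHEL, 5.0): create the replication account
CREATE USER 'repuser'@'%' IDENTIFIED BY 'password';
GRANT REPLICATION SLAVE ON *.* TO 'repuser'@'%';
-- note the current binlog co-ordinates
SHOW MASTER STATUS;

-- On the slave (Windows, 5.5): point it at the master and start it
CHANGE MASTER TO
  MASTER_HOST='169.254.0.17',
  MASTER_USER='repuser',
  MASTER_PASSWORD='password',
  MASTER_LOG_FILE='mysql-bin.000001',  -- placeholder, from SHOW MASTER STATUS
  MASTER_LOG_POS=107;                  -- placeholder, from SHOW MASTER STATUS
START SLAVE;
SHOW SLAVE STATUS\G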
The difference between SQL Server and Oracle DBAs.
I feel that Microsoft DBAs are not so worried about knowing every nook-and-cranny of their DBMS because Microsoft can always be relied on to find a specialist to help out 🙂
Oracle DBAs, on the other hand, know that Oracle PLC only does databases, not Support or Documentation (ok, I may be exaggerating a bit). So Oracle DBAs are on their own, and really need an intimate understanding of their particular DBMS internals.
I'm not implying the superiority of one tribe over the other, just noting the difference 🙂
TransactionLog sizes
Here’s a nice script I like to run on all registered servers (at once) to show the sizes of the log-files …
--logdrivesize.sql
create table #temp(
    dbname varchar(100),
    logsize_mb real,
    spaceused_pct real,
    dbstatus int)
insert #temp exec ('DBCC SQLPERF (LOGSPACE)')
select dbname, logsize_mb
from #temp
where logsize_mb > 1000
order by logsize_mb desc
drop table #temp
Moving Logins
After migrating a database to another server, use the 'sp_help_revlogin' procedure (on the source) to create scripts of all logins, including the passwords of SQL logins, then run the output on the target …
exec sp_help_revlogin             -- all logins
exec sp_help_revlogin 'somelogin' -- just for 'somelogin'
Orphaned users
Here’s my crib-sheet for finding and dealing with orphaned users …
--orphaned_users.sql
-- SOURCE =
-- TARGET =

-- step-1: list orphaned users in current db
EXEC sp_change_users_login 'Report';

-- step-2: map user to login - if they both exist
EXEC sp_change_users_login 'update_one', 'some_user', 'some_login';

-- step-3: copy login, then step-2, then step-1
-- (NOTE: execute this on the source, then execute the output on the target)
Select 'Create Login ' + QUOTENAME(A.name)
    + ' With Password=' + CONVERT(varchar(max), A.password_hash, 1) + ' hashed' --script out the password
    + ', DEFAULT_DATABASE=' + quotename(A.default_database_Name) --if you apply the script to a mirrored or log-shipped server, use the master database, as the user database is not in a usable state
    + ', DEFAULT_LANGUAGE=' + quotename(A.default_language_Name)
    + ', CHECK_POLICY=' + Case A.is_policy_checked when 0 then 'OFF' When 1 Then 'On' End
    + ', CHECK_EXPIRATION=' + Case A.is_expiration_checked when 0 then 'OFF' When 1 Then 'On' End
    + ', SID=' + CONVERT(varchar(max), A.SID, 1) --script out the SIDs
    As SQLLogin
From sys.sql_logins A
Where A.name Not like '##%##' --remove those system generated sql logins
And A.sid != 0x01 --SA sid is always the same
and QUOTENAME(A.name) = '[' + 'some_login' + ']'

-- Extra
-- list USERS (current server/db)
SELECT * FROM sys.database_principals where type in ('S', 'U') order by 1
-- list LOGINS (current server/db)
SELECT * FROM sys.server_principals where type in ('S', 'U') order by 1
Backup devices
Here’s my crib-sheet of commands for dealing with ‘backup devices’ …
--devices.sql
USE master;
GO
--list
select * from sys.backup_devices
--create
--EXEC sp_addumpdevice 'disk', 'somedevicename', '\\somebackupserver\Somebackupshare\Someservername\Someinstancename\Somedatabasename.bak';
--delete
--EXEC sp_dropdevice 'somedevicename';
Database sizes
Although I already have one – here’s a neat script to show all database sizes …
--database_sizes.sql
select db.name,
       round(sum(convert(bigint, mf.size)) * 8 /1024 /1000, 0) [SizeGB]
from master.sys.master_files mf
inner join master.sys.databases db on db.database_id = mf.database_id
where db.database_id > 4
group by db.name
--order by db.name
order by [SizeGB] desc
Scheduling CHECKDB
I modified my backup-all script to schedule a DBCC CHECKDB run on ‘most’ databases. Then I augmented it further – to quit if still running at 18:30 …
--Checkdb_most.sql
DECLARE @dbname VARCHAR(100)
declare @return int
set @return = 0

DECLARE db_cursor CURSOR FOR
SELECT name FROM master.dbo.sysdatabases
where dbid > 4 -- exclude system databases
and name not in ('somedatabasename')

OPEN db_cursor
FETCH NEXT FROM db_cursor INTO @dbname

WHILE (@@FETCH_STATUS = 0) AND (@return = 0)
BEGIN
    if getdate() > '01 feb 2012 18:20' set @return = 1
    DBCC CHECKDB (@dbname) with PHYSICAL_ONLY
    FETCH NEXT FROM db_cursor INTO @dbname
END

CLOSE db_cursor
DEALLOCATE db_cursor
Scripted full backup of all databases.
I had a requirement to script full backups and this was the simplest code I could find …
--backupAll.sql
DECLARE @dbname VARCHAR(50)
DECLARE @fileName VARCHAR(256)
DECLARE @path VARCHAR(256)
SET @path = N'H:\UpgradeBackups\'

DECLARE db_cursor CURSOR FOR
SELECT name FROM master.dbo.sysdatabases
WHERE name <> 'tempdb'

OPEN db_cursor
FETCH NEXT FROM db_cursor INTO @dbname

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @fileName = @path + @dbname + '_full.Bak'
    BACKUP DATABASE @dbname TO DISK = @fileName
        with init, noskip, noformat, nounload, stats=10, checksum
    FETCH NEXT FROM db_cursor INTO @dbname
END

CLOSE db_cursor
DEALLOCATE db_cursor
Resolving UserID’s in Event Viewer.
I wanted to find out who made a change on a server on a known day at a known time. Looking through Event Viewer on the server, all I could find was an irrelevant "TerminalServices-Printers" error. However, within this error, by clicking the Details tab and expanding the 'System' paragraph, I saw a USERID, which was a long string of numbers and dashes.
To resolve this userid I opened regedit and expanded HKEY_USERS. I located the userID in question and right-clicked it. Examining the 'Permissions' configuration, the username was listed.
The little book of Mongo
http://openmymind.net/mongodb.pdf
Here’s an eBook by Karl Seguin that answered my basic questions.
When was CheckDB last run?
There’s a whole bunch of scripts out there that show the history of when dbcc checkdb was last run. Many are inaccurate. Here’s the one I use …
--checkdb_history.sql
CREATE TABLE #DBInfo_LastKnownGoodCheckDB
(
    ParentObject varchar(1000) NULL,
    Object varchar(1000) NULL,
    Field varchar(1000) NULL,
    Value varchar(1000) NULL,
    DatabaseName varchar(1000) NULL
)

DECLARE csrDatabases CURSOR FAST_FORWARD LOCAL FOR
SELECT name FROM sys.databases WHERE name NOT IN ('tempdb')

OPEN csrDatabases

DECLARE @DatabaseName varchar(1000), @SQL varchar(8000)

FETCH NEXT FROM csrDatabases INTO @DatabaseName

WHILE @@FETCH_STATUS = 0
BEGIN
    --Create dynamic SQL to be inserted into temp table
    SET @SQL = 'DBCC DBINFO (' + CHAR(39) + @DatabaseName + CHAR(39) + ') WITH TABLERESULTS'

    --Insert the results of the DBCC DBINFO command into the temp table
    INSERT INTO #DBInfo_LastKnownGoodCheckDB (ParentObject, Object, Field, Value)
    EXEC(@SQL)

    --Set the database name where it has yet to be set
    UPDATE #DBInfo_LastKnownGoodCheckDB
    SET DatabaseName = @DatabaseName
    WHERE DatabaseName IS NULL

    FETCH NEXT FROM csrDatabases INTO @DatabaseName
END

--Get rid of the rows that I don't care about
DELETE FROM #DBInfo_LastKnownGoodCheckDB
WHERE Field != 'dbi_dbccLastKnownGood'

SELECT Distinct DatabaseName,
    Value as LastGoodCheckDB--,
    --DATEDIFF(dd, CAST(Value AS datetime), GetDate()) AS DaysSinceGoodCheckDB,
    --DATEDIFF(hh, CAST(Value AS datetime), GetDate()) AS HoursSinceGoodCheckDB
FROM #DBInfo_LastKnownGoodCheckDB
ORDER BY DatabaseName

DROP TABLE #DBInfo_LastKnownGoodCheckDB
Orphaned Datafiles ?
Occasionally, in my hunt for space, I find a datafile with a name unrelated to the currently installed databases. This query lists datafiles and the databases they belong to …
select db_name(database_id) dbname, physical_name
from master.sys.master_files
where physical_name like 'N%' -- on the n-drive
order by physical_name;
What port is SQL 2005 using?
Well, to connect using SSMS you specify the server name / the instance, then a comma and the port (if it's not standard)
EG: "lon101\dds,5001"
If you don't know the port (there's also a registry-based query sketched after these steps) …
– remote onto server
– start / programs / sql 2005 / configuration tools / configuration manager /
– find the instance under network config
– look in tcp/ip settings
– within the ip addresses tab
– scroll to the bottom “IP All”
– read tcp port
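Alternatively, the port can usually be read without leaving SSMS via xp_instance_regread. This is a sketch only: the value names (TcpPort vs TcpDynamicPorts) vary between static and dynamic port setups …

--tcp_port.sql (sketch)
DECLARE @static_port varchar(20), @dynamic_port varchar(20)
EXEC master.dbo.xp_instance_regread
    N'HKEY_LOCAL_MACHINE',
    N'Software\Microsoft\Microsoft SQL Server\MSSQLServer\SuperSocketNetLib\Tcp\IpAll',
    N'TcpPort', @static_port OUTPUT
EXEC master.dbo.xp_instance_regread
    N'HKEY_LOCAL_MACHINE',
    N'Software\Microsoft\Microsoft SQL Server\MSSQLServer\SuperSocketNetLib\Tcp\IpAll',
    N'TcpDynamicPorts', @dynamic_port OUTPUT
SELECT @static_port AS tcp_port, @dynamic_port AS tcp_dynamic_port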
Download and Install MongoDB
http://www.mongodb.org/downloads
including Drivers and Packages
Scripted Notification
Here’s an interesting (ActiveX) SQL Job-Step. It can be used to send an email should the usual methods not be available.
Function Main()
    Main = DTSTaskExecResult_Success
    Set objEmail = CreateObject("CDO.Message")
    objEmail.From = "sendingemail@address.com"
    objEmail.To = "receivingemail@address.com"
    objEmail.Subject = "FAILED JOB: name SERVER: name"
    objEmail.Textbody = "FAILED JOB: name SERVER: name"
    objEmail.Configuration.Fields.Item ("http://schemas.microsoft.com/cdo/configuration/sendusing") = 2
    objEmail.Configuration.Fields.Item ("http://schemas.microsoft.com/cdo/configuration/smtpserver") = "123.123.123.123"
    objEmail.Configuration.Fields.Item ("http://schemas.microsoft.com/cdo/configuration/smtpserverport") = 25
    objEmail.Configuration.Fields.Update
    objEmail.Send
    set objEmail = nothing
End Function
Where the ip-address (123.123.123.123) is the local email server.
Cannot open backup device. Operating system error 1909
The reason for error 1909 was that its access token had expired (an Active Directory replication problem).
The workaround was …
net use \\somebackupserver\somebackupshare /user:domain\sqlagentloginaccount
Error: “Class not Registered” when connecting to SSIS
I found that SQL 2008 SSMS cannot connect to SQL 2005 SSIS. I needed to remote onto the Server, or install SQL 2005 locally.
Oh No! all my SSIS packages in MSDB have disappeared!!
The path to the SSIS config file is held in the registry and can get overwritten.
To fix, reset this registry-key (the default for SQL 2005 on Windows 2008 is "C:\Program Files\Microsoft SQL Server\90\DTS\Binn") …
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\MSDTS\ServiceConfigFile]
Open SSMS as another user
I cobbled together this script, which sits on my desktop as 'ssms.vbs'. The disadvantage is that you need to hard-code your password into the script and keep it up to date.
set WshShell = WScript.CreateObject("WScript.Shell")

'build runas command
WshShell.run "runas /user:DOMAIN\USERNAME %comspec%"
WScript.Sleep 100

'send password
WshShell.SendKeys "PASSWORD"
WshShell.SendKeys "{ENTER}"
WScript.Sleep 500

'Open SSMS
WshShell.SendKeys Chr(34) + "C:\Program Files\Microsoft SQL Server\100\Tools\Binn\VSShell\Common7\IDE\ssms.exe" + Chr(34)
WshShell.SendKeys "{ENTER}"

'Close command prompt
WshShell.SendKeys "exit"
WshShell.SendKeys "{ENTER}"
WScript.Sleep 1000

set wshshell = nothing
Working with xp_cmdshell
Most production systems these days have xp_cmdshell disabled as a security precaution, and rightly so. If you really NEED to execute an operating-system command, it's best to use this sort of routine …
--cmdshell.sql
-- DECLARE @cmd varchar(100) = 'shutdown /r'
-- DECLARE @cmd varchar(100) = 'shutdown -r -f -t 0'
DECLARE @cmd varchar(100) = 'Dir'

-- DEBUG ... is xp_cmdshell enabled?
SELECT case when value_in_use = 1 then 'YES' else 'no' end [is CMDSHELL enabled]
FROM sys.configurations where name = 'xp_cmdshell'

IF (SELECT value_in_use /* cmd shell is disabled */
    FROM sys.configurations
    WHERE name = 'xp_cmdshell') = 0
BEGIN
    exec sp_configure 'show advanced options', 1 reconfigure -- show advanced options
    exec sp_configure 'xp_cmdshell', 1 reconfigure           -- enable command-shell
    exec xp_cmdshell @cmd                                    -- run the command
    exec sp_configure 'xp_cmdshell', 0 reconfigure           -- disable command-shell
    exec sp_configure 'show advanced options', 0 reconfigure -- hide advanced options
END
ELSE /* cmd shell is enabled */
    exec xp_cmdshell @cmd -- just run the command

-- DEBUG ... is xp_cmdshell enabled?
SELECT case when value_in_use = 1 then 'YES' else 'no' end [is CMDSHELL enabled]
FROM sys.configurations where name = 'xp_cmdshell'
(NOTE: executing sp_configure by itself will show the current settings)
Moving SSIS packages
To copy SSIS packages that are stored in MSDB from PROD to UAT I connected to the 'Integration Services' object of both servers within SSMS.
To 'pull' the package from PROD into UAT I first drilled down both trees (EG: server / Stored Packages / MSDB) to see which folders were missing.
The first one was called 'Press', so I right-clicked on the UAT MSDB icon and created a new folder with that name.
I right-clicked on the new folder and chose 'import package'. This opened a config-box, within which I just filled in the 'Server:' and 'Package path:' values. The first contained the name of the PROD server (including instance).
The ‘package path’ box contained one parent folder (‘SSIS Packages’) and subfolders for each package. I opened the ‘Press’ folder and selected the package within. Once I clicked ‘ok’ twice the package was copied over.
Simple Log-shipping of the 'Actions' database from London to Paris using SQL 2000 Std
SOURCE
All the code below is freely available on the internet. I first saw it in the SQL 2000 SDK, from where it has been improved many, many times over. Rather than rehash the implementation, here are my handover notes.
OVERVIEW OF THIS PROJECT
Transaction-logs (Tlogs) from the London database 'Actions' are backed up every hour to a warm standby in Paris and restored there.
DETAILS
Hourly, a SQL Job in London ('Log ship Incidents to Paris') executes a stored procedure ('sp_ShipLogOnce') that backs up the Tlog directly to Paris (to E:\Backups, shared as 'Backup$'), then runs the Paris SQL Job ('sp_ApplyStandByLog') that restores it.
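For context only (the real procedures do much more), the guts of that hourly cycle boil down to a log backup straight to the Paris share followed by a restore in standby mode; the file name below is a placeholder …

-- On London: back the Tlog up directly to the Paris share
BACKUP LOG Actions
    TO DISK = '\\Paris\Backup$\Actions_[name].trn'
    WITH INIT

-- On Paris: apply it, leaving the database readable as a warm standby
RESTORE LOG Actions
    FROM DISK = 'E:\Backups\Actions_[name].trn'
    WITH STANDBY = 'T:\DATA\Actions_undo.ldf'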
MAINTENANCE & LIMITATIONS
There is a Windows batch file on Paris ('PurgeFilesExceptRecent.bat') that deletes old Tlogs at 2am each day. It has been initially set to ignore the newest 720 (being one per hour for 30 days).
Logins on Paris need to be regularly updated from London (at least monthly). The stored procedure 'sp_help_revlogin' will script out the London logins, from which any new, non-service SQL logins should be added to Paris.
The Tlogs need to be restored in strict order. After a temporary network failure all outstanding Tlogs can be restored by running this command at Paris …
C:\ApplyStandbyLogs.bat E:\Data S:\Backups
Alternatively, individual Tlogs can be restored using the stored procedure …
exec Sp_ApplyStandByLog @DBName='Actions', @BackupFileName='S:\Backups\Actions\[name.trn]', @UndoFile='T:\DATA\Actions_undo.ldf', @MoveFileTo='S:\Backups\Actions\Restored'
SWITCHING ROLES
- Disable the London LogShip job
- Apply final Tlog (sp_ShipLogOnce with @LocalUndoFile)
- Change the Paris 'Actions' database to read/write, multi-user
- Enable the Paris LogShip job
DISASTER RECOVERY
If London is unavailable, only the third step above is needed. IE:-
exec sp_dboption 'actions', 'dbo use only', 'false'
go
exec sp_dboption 'actions', 'read only', 'false'
go
restore database actions with recovery
go
Mirroring of BESMgmt from London to Paris using SQL Server 2008 r1 Std
PREREQUISITES
- First I changed the recovery model of the BESMgmt database to use the Full Recovery Model.
- To manage the BESMgmt Tlog files generated from this I created a Maintenance Plan to back up the Tlogs every 6 hours to the E:\Backups folder. These will be automatically deleted after two weeks (NOTE: The size of the TLogs can be reduced by increasing the backup frequency).
- I reinstalled SQL Server 2008+SP1 on Paris ensuring that …
- The SQL service account used was SVC_SQL
- The SQL agent account used was SVC_AGT_SQL
- The collation was set to LATIN1_GENERAL_CI_AS
- The default location for datafiles was set to S:\DATA
- The default location for logfiles was set to T:\LOGS
- TempDB was located on the U-Drive
– Minimised potential connectivity delays by …
o Setting up Paris as a linked-server on London
o Ensuring the SQL Browser service was active and working on both servers
SETUP
To initiate mirroring I …
– Backed up the BESMgmt database and restored it on Paris with no recovery
– Backed up the TLOG and restored it on Paris with no recovery
– Configured mirroring of BESMgmt via the London GUI, without a witness (a rough T-SQL equivalent is sketched below).
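For reference, the GUI wizardry is roughly equivalent to the following (a sketch only: the endpoint name, port and server addresses are made up) …

-- On each server: create a mirroring endpoint (name and port are illustrative)
CREATE ENDPOINT Mirroring
    STATE = STARTED
    AS TCP (LISTENER_PORT = 5022)
    FOR DATABASE_MIRRORING (ROLE = PARTNER)

-- Then point each side at its partner: Paris (the mirror) first, then London (the principal)
ALTER DATABASE BESMgmt SET PARTNER = 'TCP://london.mydomain.local:5022'  -- run on Paris
ALTER DATABASE BESMgmt SET PARTNER = 'TCP://paris.mydomain.local:5022'   -- run on London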
OPERATION
There are three methods of failing over / swapping the Mirror-Live (Principal) and Mirror-Standby (Partner) roles.
1. There is a 'Failover button' on the Principal SSMS GUI mirror-configuration screen. This should be used for testing, and for planned maintenance of the Mirror-Live server.
2. I believe 'BES Manager' can be configured to automatically fail over the database. This should be the normal way failover occurs.
3. A sql-command can be executed on the Mirror-Partner as a last-ditch option should the Mirror-Live server become unavailable (IE: alter database besmgmt set partner force_service_allow_data_loss). Data loss may occur.
MONITORING
The Mirror Monitor (screen within SSMS) can be viewed from either server by right-clicking the BESMgmt database and choosing Tasks / Launch database Mirroring Monitor. I have initiated mirror warnings within this tool which will log an error to the application log if Mirroring becomes slow.
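For a quick scripted check (rather than the GUI), something like this run on either server shows the current mirroring role and state …

--mirror_state.sql
SELECT db_name(database_id) AS dbname,
       mirroring_role_desc,
       mirroring_state_desc,
       mirroring_partner_instance
FROM sys.database_mirroring
WHERE mirroring_guid IS NOT NULL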
Removal of SQL Server 2008 R1
The day after I installed two instances of SQL Server 2008 R1 on my old Windows 2003 test-server, I decided to remove them. Here's what I did …
– rebooted 🙂
– start/ settings/ control panel/ add or remove programs/ Microsoft SQL 2008/ remove
– from the instance screen I chose the last instance installed
– from the component screen I ticked All Components
– rebooted and repeated the above steps for the other instance
– used the Add or Remove screen to remove the SQL 2008 browser then native client
– removed the base folders for the two instances from my c-drive
– used Start/Search to find any other folders modified yesterday (happily there were none)
– rebooted, then fired up SQL Server 2005 to test it still worked – no problems 🙂
“Date spin detected”
I didn't know what to make of the error message "Date spin detected", until I noticed a SQL job scheduled to run every 10 seconds!
The overhead of executing a job so often is very high. Naughty developers!
A much better solution would be for the SQL job to call a stored procedure, with the procedure containing a WHILE loop and a ten-second WAITFOR DELAY (sketched below).
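Something along these lines (a sketch only, with invented procedure names): the job runs hourly, and the ten-second polling happens inside the procedure …

CREATE PROCEDURE dbo.usp_poll_every_10_seconds
AS
BEGIN
    SET NOCOUNT ON
    DECLARE @stop datetime
    SET @stop = DATEADD(hh, 1, GETDATE())  -- let the hourly job schedule restart it
    WHILE GETDATE() < @stop
    BEGIN
        EXEC dbo.usp_do_the_actual_work    -- invented name for whatever the 10-second job used to do
        WAITFOR DELAY '00:00:10'           -- pause ten seconds between iterations
    END
END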
An example of a SQL Restore
An example of restoring the database 'Liath' using a SQL script.
If the database exists, put it into single-user mode:
Alter database liath set single_user with rollback immediate
Go
If possible, back up the tail (IE: any not-yet-backed-up log):
Backup log liath to disk = N'\\192.168.0.32\volume_1\SQLBackups_D2\liath\liathTail.bak' with init, norecovery
go
Restore the last full backup EG:-
Restore database liath from disk = N'\\192.168.0.32\volume_1\SQLBackups_D2\liath\liath_backup_2011_10_13_174000_5625000.bac' with norecovery
go
Restore the latest differential backup EG:-
Restore database liath from disk = N'\\192.168.0.32\volume_1\SQLBackups_D2\liath\liath_backup_2011_10_14_000001_1250000.dif' with norecovery
Go
Restore all the transactional logs in sequence from the last diff backup EG:-
Restore log liath from disk = N'\\192.168.0.32\volume_1\SQLBackups_D2\liath\liath_backup_2011_10_14_003001_1718750.trn' with norecovery
Go
Restore log liath from disk = N'\\192.168.0.32\volume_1\SQLBackups_D2\liath\liath_backup_2011_10_14_013001_3437500.trn' with norecovery
Go
Restore the final tail backup
Restore log liath from disk = N'\\192.168.0.32\volume_1\SQLBackups_D2\liath\liathTail.bak' with recovery
Go
Change the database to multi-user mode:
Alter database liath set multi_user
Go
Remove Completely Unused Indexes
Here's a great little script I found that lists indexes in the current database with no stats at all. That is, indexes that have never been read or even written to (note: since the last reboot). I'm not sure how much overhead these have on the system, but it's probably worth clearing them out anyway.
From the results I wouldn't delete the 'system' indexes (that is, ones starting with 'MS~' or 'sys~' and the like), or Clustered Indexes 🙂
I was about to work-up a script to delete them, then (phew) remembered … “dba-rule #23: never automate deletes”
SELECT object_name(i.object_id) as tableName, i.name as indexName
FROM sys.indexes i
LEFT JOIN sys.dm_db_index_usage_stats s
    ON i.object_id = s.object_id
    AND i.index_id = s.index_id
    AND s.database_id = db_id()
WHERE objectproperty(i.object_id,'IsUserTable') = 1
    and i.index_id > 0
    AND s.object_id IS NULL
    AND i.is_Primary_Key = 0
    AND i.is_unique_constraint = 0
    AND i.is_unique = 0
--DROP INDEX [index-name] ON [dbo].[table-name]
Index Reorganize/Rebuild
The best script I’ve found to Reorganize/Rebuild indexes and Update Stats can be found on this web-page …
http://ola.hallengren.com/
Record db Growth
To get my own metrics in place quickly at a new SQL 2008 R1 site, I created a 'dba' database with a table called 'dbsizes'. Then I scheduled a job to run at 10pm every night, with the first step called 'record sizes' containing this code …
set nocount on

create table #sizes (
    name varchar(50),
    db_size varchar(50),
    [owner] varchar(50),
    [dbid] char(1),
    created smalldatetime,
    [status] varchar(500),
    compatability_level char(2))

insert #sizes exec sp_helpdb

insert into dba.dbo.dbsizes
select name, getdate() dbdate, db_size
from #sizes
order by name desc

drop table #sizes
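The dbsizes table itself is just the three columns that insert expects; an assumed definition (column sizes copied from the temp table) …

--dbsizes table (assumed definition)
create table dba.dbo.dbsizes (
    name varchar(50),
    dbdate smalldatetime,
    db_size varchar(50))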
Cognos Report locking production tables.
Every time a Cognos8 report was executed, a critical application would freeze. And even when the report had finished, I still needed to go into SQL Server Management Studio and kill the blocking Cognos SPID.
Eventually I found that the solution was to change the isolation level used by the Cognos OLE DB connector from read-committed to read-uncommitted.
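To confirm which isolation level a Cognos connection has actually ended up with, a quick check (SQL 2005 onwards; the program_name filter is an assumption about how the connector identifies itself) …

--isolation_check.sql (transaction_isolation_level: 1 = read uncommitted, 2 = read committed)
SELECT session_id, host_name, program_name, transaction_isolation_level
FROM sys.dm_exec_sessions
WHERE program_name LIKE '%cognos%'  -- assumption: adjust to match the connector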
Simple-Mode to Full-Recovery-Mode script
Leading on from my script to list databases set to Simple recovery-mode, I found changing them to Full was not as easy as just changing the sys.databases flag from ‘3’ to ‘1’ (not allowed).
I had a requirement to change a whole server full of databases, but because of our 'smart' backup software (which, on discovering new Full recovery-mode databases, would immediately want to take full backups), I had to stagger the changes to one an hour – so the (production) box wouldn't slow to a crawl.
Here’s the job-step …
--capture the name of one simple-mode database
declare @name varchar(50)
select @name = [name]
from sys.databases
where recovery_model = 3 -- 1 = Full, 3 = Simple
and [name] not in ('master', 'msdb', 'tempdb')

--change it to full
declare @cmd varchar(200)
set @cmd = 'alter database ' + @name + ' set recovery full'
exec (@cmd)
When were the last Full, Diff, & Log Backups?
My all-time favorite script is this cross-tab script showing the latest backups of all databases on the current box …
--lastbackups.sql
select sdb.name,
    max(case bst.[type] when 'D' then convert(varchar(16), bst.backup_finish_date, 120) end) last_full_backup,
    max(case bst.[type] when 'I' then convert(varchar(16), bst.backup_finish_date, 120) end) last_diff_backup,
    max(case bst.[type] when 'L' then convert(varchar(16), bst.backup_finish_date, 120) end) last_log_backup,
    case sdb.[status]
        when '0' then 'Full'
        when '4' then 'Bulk-logged'
        when '8' then 'Simple'
        when '16' then 'Full'
        when '24' then 'Simple'
        when '32' then 'Restoring'
        when '528' then 'Off_line'
        when '536' then 'Off_line'
        when '1048' then 'Read_only'
        when '2072' then 'Restricted User'
        when '65536' then 'Full'
        when '65544' then 'Simple'
        when '2098176' then 'Standby / Read-Only'
        when '4194328' then 'Simple'
        when '4259848' then 'Simple'
        when '1073741840' then 'Full'
        when '1073807360' then 'ShutDown'
        else ''
    end db_status,
    sdb.[status]
from master..sysdatabases sdb
left join msdb..backupset bst on bst.database_name = sdb.name
where sdb.name != 'tempdb'
group by sdb.name, sdb.[status]
--order by sdb.name
order by max(bst.backup_finish_date) -- nulls at top
--order by sdb.[status]
Script to help manually resize datafiles.
Here's a great script I use when manually resizing datafiles. It lists the percentage of free space and suggests a new size that would be 60% larger.
--freespace.sql
SET NOCOUNT ON

IF OBJECT_ID('tempdb..#DBSTATS') IS NOT NULL
BEGIN
    DROP TABLE #DBSTATS
END

CREATE TABLE #DBSTATS
(
    [dbname] sysname,
    lname sysname,
    usage varchar(20),
    [size] decimal(9, 2) NULL,
    [used] decimal(9, 2) NULL
)

IF OBJECT_ID('tempdb..#temp_log') IS NOT NULL
BEGIN
    DROP TABLE #temp_log
END

CREATE TABLE #temp_log
(
    DBName sysname,
    LogSize real,
    LogSpaceUsed real,
    Status int
)

IF OBJECT_ID('tempdb..#temp_sfs') IS NOT NULL
BEGIN
    DROP TABLE #temp_sfs
END

CREATE TABLE #temp_sfs
(
    fileid int,
    filegroup int,
    totalextents int,
    usedextents int,
    name varchar(1024),
    filename varchar(1024)
)

DECLARE @dbname sysname, @sql varchar(8000)

IF OBJECT_ID('tempdb..#temp_db') IS NOT NULL
BEGIN
    DROP TABLE #temp_db
END

SELECT name
INTO #temp_db
FROM master.dbo.sysdatabases
WHERE DATABASEPROPERTY(name,'IsOffline') = 0
AND has_dbaccess(name) = 1
ORDER BY name

WHILE (1 = 1)
BEGIN
    SET @dbname = NULL

    SELECT TOP 1 @dbname = name
    FROM #temp_db
    ORDER BY name

    IF @dbname IS NULL
        GOTO _NEXT

    SET @sql = '
        USE [' + @dbname + ']
        TRUNCATE TABLE #temp_sfs
        INSERT INTO #temp_sfs
        EXECUTE(''DBCC SHOWFILESTATS'')

        INSERT INTO #DBSTATS (dbname, lname, usage, [size], [used])
        SELECT db_name(), name, ''Data'',
            totalextents * 64.0 / 1024.0,
            usedextents * 64.0 / 1024.0
        FROM #temp_sfs

        INSERT INTO #DBSTATS (dbname, lname, usage, [size], [used])
        SELECT db_name(), name, ''Log'', null, null
        FROM sysfiles
        WHERE status & 0x40 = 0x40'

    EXEC(@sql)

    DELETE FROM #temp_db WHERE name = @dbname
END

_NEXT:

INSERT INTO #Temp_Log
EXECUTE ('DBCC SQLPERF(LOGSPACE)')
--select * from #Temp_Log

UPDATE #DBSTATS
SET SIZE = B.LogSize,
    USED = LogSize * LogSpaceUsed / 100
FROM #DBSTATS A
INNER JOIN #Temp_Log B ON (A.DBNAME = B.DBNAME) AND (A.Usage = 'LOG')

SELECT dbname AS [database name],
    lname AS [file name],
    usage,
    [size] AS [space allocated (MB)],
    [size]*1.6 as 'plus 60% (MB)',
    -- used AS [space used (MB)],
    -- [size] - used AS [free space (MB)],
    -- cast(used/[size]*100 AS numeric(9,2)) AS [space used %],
    cast(100-(used/[size]*100) AS numeric(9,2)) AS [free space %]
FROM #DBSTATS
ORDER BY [free space %] --dbname, usage

DROP TABLE #DBSTATS
DROP TABLE #temp_db
DROP TABLE #temp_sfs
DROP TABLE #temp_log

SET NOCOUNT OFF
What port is SQL 2005 listening on ?
To find out, open Management Studio / Management / SQL Server Logs / Current~.
Scroll to the bottom and look for something like …
Server is listening on [some ip address] 1366
(NOTE: If there are two entries, the one used by clients will be the smaller one)
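Alternatively, the same line can be fished out of the log with the (undocumented) xp_readerrorlog; a sketch, as the search-string parameters vary a little between versions …

--port_from_errorlog.sql
-- 0 = current log, 1 = SQL Server error log (2 would be the SQL Agent log)
EXEC master.dbo.xp_readerrorlog 0, 1, N'Server is listening on'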
Script to return the SQL Server service-account
--service_accounts.sql
-- return the sql-server-service-account used by the default-instance
DECLARE @serviceaccount varchar(100)
EXECUTE master.dbo.xp_instance_regread
    N'HKEY_LOCAL_MACHINE',
    N'SYSTEM\CurrentControlSet\Services\MSSQLSERVER',
    N'ObjectName',
    @ServiceAccount OUTPUT,
    N'no_output'
SELECT @Serviceaccount ServiceAccount_DefaultInstance

-- ditto for a named-instance
DECLARE @serviceaccount2 varchar(100)
exec master.dbo.xp_instance_regread
    N'HKEY_LOCAL_MACHINE',
    N'System\CurrentControlSet\Services\MSSQL$sqlins01', -- edit
    N'ObjectName',
    @ServiceAccount2 OUTPUT,
    'no_output'
SELECT @Serviceaccount2 ServiceAccount_NamedInstance

-- and for the named-instance agent
DECLARE @agentaccount varchar(100)
EXECUTE master.dbo.xp_instance_regread
    N'HKEY_LOCAL_MACHINE',
    N'SYSTEM\CurrentControlSet\Services\sqlagent$sqlins01', --edit
    N'ObjectName',
    @agentaccount OUTPUT,
    N'no_output'
select @agentaccount ServiceAccount_NamedAgent
‘Management Studio 2008’ causes job-scripting issue.
When using SQL 2008 Management Studio against SQL 2000/5 servers, note that although a job-step's output CAN be set to 'Log to table':
a) it won't work (although it won't say so), and b) setting this option will cause an error if the job is ever scripted out.
Search All Tables – script
--dba_SearchAllTables.sql
--Searches all columns of all tables for a given search string (4.47 mins)
--EG: exec dba_searchalltables 'patriqu%'
alter proc dba_SearchAllTables
(
    @SearchStr nvarchar(100)
)
AS
begin
    create table #Results (ColumnName nvarchar(370), ColumnValue nvarchar(3630))

    set nocount on

    declare @TableName nvarchar(256), @ColumnName nvarchar(128), @SearchStr2 nvarchar(110)
    set @TableName = ''
    -- set @SearchStr2 = quotename('%' + @SearchStr + '%','''')
    SET @SearchStr2 = quotename('' + @SearchStr + '','''')

    while @TableName is not null
    begin
        set @ColumnName = ''
        set @TableName =
        (
            select min(quotename(table_schema) + '.' + quotename(table_name))
            from information_schema.tables
            where table_type = 'BASE TABLE'
            and quotename(table_schema) + '.' + quotename(table_name) > @TableName
            and objectproperty(
                    object_id(
                        quotename(table_schema) + '.' + quotename(table_name)
                    ), 'IsMSShipped'
                ) = 0
        )

        while (@TableName is not null) and (@ColumnName is not null)
        begin
            set @ColumnName =
            (
                select min(quotename(column_name))
                from information_schema.columns
                where table_schema = parsename(@TableName, 2)
                and table_name = parsename(@TableName, 1)
                and data_type in ('char', 'varchar', 'nchar', 'nvarchar')
                and quotename(column_name) > @ColumnName
            )

            if @ColumnName is not null
            begin
                insert into #Results
                exec
                (
                    'select ''' + @TableName + '.' + @ColumnName + ''', left(' + @ColumnName + ', 3630) from ' + @TableName + ' (nolock) ' +
                    ' where ' + @ColumnName + ' like ' + @SearchStr2
                )
            end
        end
    end

    select ColumnName, ColumnValue from #Results
end
GO
Orphaned Indexes
Our ‘check indexes’ job was failing occasionally with an error like …
Msg 7999, Level 16, State 40, Line 1
Could not find any index named 'index_name' for table 'table_name'.
I tried to just exclude the named index, but a few days later another one appeared. The fix was to dig into the SP and add a condition to the statement …
...
insert into #indexlist
select name
from sysindexes
where name not like '_WA%'
and indid not in (0,255)
and [root] is not null
and id = @tab_id
...
Temp Tables
When developing using temp tables here’s a good line to drop them if they exist.
IF OBJECT_ID('tempdb.dbo.#indexes','U') IS NOT NULL DROP TABLE dbo.[#indexes]
The Oracle SCN issue
The Oracle database uses a large number (the 'SCN') as a time-stamp for referential-integrity, backup/restores, patch levels, etc. When instances are linked, the highest SCN number overwrites all others.
Trouble is … a 'warning limit' is being reached (three quarters of the actual limit). And when it is exceeded, the Oracle database assumes it is corrupt and becomes unstable.
But wait! There's a 'fix', a patch which lets you manually set the SCN back below the soft limit. BUT this needs to happen on all linked instances within a short period, or they are just updated again from the one with the highest number (d'oh!).
And finally … the fix is only available for the latest version of Oracle. So an obscure, forgotten, out-of-date instance in the back-of-beyond can bring down hundreds of shiny new machines ('patched' or not) until it is upgraded.
'Oracle SCN management', then: not a disaster, just another ongoing DBA task 🙂
Run on each database
To run a stored-procedure against every database in turn without a bunch of ‘USE dbname GO’ headers, use …
exec sp_msforeachdb 'use ? drop statistics alldocs.allDocs_level'
Taste the freshness of a restore.
I struggled for ages to find the simplest way to handle a situation where a backup file may or may not have been uploaded to an FTP site for restoring (see my last, which worked fine – but failed to produce the desired error message when the folder was empty).
As it only took a few minutes to restore this particular database, it was acceptable to blindly restore it and then decide if doing so had been useful. For this I made use of the backup_set_id column in the restorehistory table within MSDB.
This column appears to contain a check-sum for every restore, and therefore it was easy enough to see if the last two restores had the same value. Here’s the full job-step code …
select top 2 backup_set_id
into #temp
from restorehistory
where destination_database_name = 'AdventureWorks'
order by restore_history_id desc

if (select max(backup_set_id) from #temp) = (select min(backup_set_id) from #temp)
    exec sp_send_dbmail
        @recipients = 'somebody@somewhere.com',
        @subject = 'AdventureWorks was restored from an old backup';
Using ‘CoreFTP’ command line
To download a backup from an FTP site then delete the original …
C:\path\corecmd.exe -s -O -site site -d /path/latestbackup.bak -p c:\ftp -delsrc -log c:\logfile.log
-s = silent
-O = overwrite
-site = pre configured connection details within coreftp gui
-d = path to ftp site
-p = local destination
-delsrc = delete source after successful download
-log = record a logfile
******** UPDATE Dec 2012 ********
I found that this utility was the only way I could find to script/schedule FTPS uploads & downloads. Here’s more working examples …
"C:\Program Files (x86)\CoreFTP\corecmd.exe" -s -OS -site thirdpartyname -u "E:\DBA\thirdpartyname\somefilename_2012*.txt" -p "/export files/" -log E:\DBA\thirdpartyname\logfile.log
"C:\Program Files (x86)\CoreFTP\corecmd.exe" -s -OS -site thirdpartyname -d "/Client imports/somefilename_*.txt" -p \\mylocalsqlserver\dba\thirdpartyname -log \\mylocalsqlserver\dba\thirdpartyname\logfile.log
Note:-
– The use of double quotes so the utility isn’t confused by spaces in the path.
– The -OS switch which means copy everything except what’s already copied.
– The use of wild-cards with -OS to simplify management of file-names containing dates.
– the use of -u for uploading, and -d for downloads.
Best Practice – Creating a Procedure that includes remote references
To create or alter a stored procedure that contains references to data on remote servers you need to connect using an account that can access all the data.
That is, to successfully alter a stored procedure that includes links to two remote servers, connect (in SSMS) using an account that is common to both entries in the 'Linked Servers' configuration. Test this by drilling down through 'Linked Servers' to see both sets of tables.
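As an illustration (all names invented), this is the shape of procedure I mean: the connecting account needs to be able to see both linked servers when the procedure is created or altered …

-- Hypothetical example: a proc joining data from two linked servers
CREATE PROCEDURE dbo.usp_combined_report
AS
BEGIN
    SET NOCOUNT ON
    SELECT a.customer_id, a.order_total, b.invoice_total
    FROM [LinkedServer1].[SalesDb].[dbo].[Orders] a
    JOIN [LinkedServer2].[FinanceDb].[dbo].[Invoices] b
        ON b.customer_id = a.customer_id
END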
Kill all users
In a technical sense of course 🙂
Here’s the code – very handy before a restore …
DECLARE processes CURSOR FOR
    select spid
    from sysprocesses
    where dbid = (select dbid from sysdatabases where name = 'AdventureWorks')
    and status in ('runnable','sleeping')

DECLARE @processid int -- spid of active processes

OPEN processes -- open the cursor
FETCH FROM processes INTO @processid

WHILE (@@FETCH_STATUS = 0) -- cycle through the list and kill each spid
BEGIN
    Print @processid
    Print 'killed'
    Exec ('kill ' + convert(varchar(10), @processid)) -- cast needed to concatenate the int spid into the dynamic SQL
    FETCH FROM processes INTO @processid
END

CLOSE processes
DEALLOCATE processes