Testing Read-Only Routing using SSMS

There are better and easier ways to test the read-only routing configuration of an Availability Group, but here is my walk-through using SQL Server Management Studio.

  1. In SSMS connect to “Database Engine”
  2. Type in the name of the listener, a comma, then the port used by the listener
  3. Click the “Options” button
  4. On the Connection Properties tab choose “Connect to Database” then “Browse Server”
  5. Choose a database protected by the AG
  6. On the Additional Connection Parameters tab type “ApplicationIntent=ReadOnly”
  7. Click “Connect”
  8. Run a read-only query like “SELECT @@SERVERNAME”
  9. The result should be the name of the secondary replica
  10. Close the query pane and disconnect
  11. Then set it up again – but without “ApplicationIntent=ReadOnly”
  12. Now the result of the query should be the name of the primary replica.

BONUS

The “better ways” include using SQLCMD from another server. For example, running this should return the name of the secondary replica.

sqlcmd -S (ListenerName),(ListenerPort) 
-d (ProtectedDatabaseName) 
-q "SELECT @@SERVERNAME;" 
-K ReadOnly

Then, running it again without the last line should return the name of the Primary replica.

Note: If there is more than one listener, it does not matter which listener/port you choose.
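
To see the routing configuration itself, this catalog-view query is a minimal sketch (run it on the primary replica; the column aliases are mine):

--ReadOnlyRoutingConfig.sql

SELECT
    src.replica_server_name AS WhenPrimaryIs,
    rl.routing_priority,
    tgt.replica_server_name AS SendReadOnlyTrafficTo,
    tgt.read_only_routing_url
FROM sys.availability_read_only_routing_lists AS rl
JOIN sys.availability_replicas AS src ON src.replica_id = rl.replica_id
JOIN sys.availability_replicas AS tgt ON tgt.replica_id = rl.read_only_replica_id
ORDER BY src.replica_server_name, rl.routing_priority;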

Today’s failed jobs

I cobbled this together to run on multiple servers via a CMS (Central Management Servers) query window. It is simpler and more trustworthy than alerts from monitoring software.

--JobFailures.sql

SELECT
    J.name,
    H.FailedAt,
    H.Message
FROM
    msdb.dbo.sysjobs AS J
    CROSS APPLY (
        SELECT TOP (1)
            FailedAt = msdb.dbo.agent_datetime(T.run_date, T.run_time),
            Message  = T.message
        FROM
            msdb.dbo.sysjobhistory AS T
        WHERE
            T.job_id = J.job_id
            AND T.run_status = 0 -- failed
            AND msdb.dbo.agent_datetime(T.run_date, T.run_time) > GETDATE() - 1 -- in the last 24 hrs (use GETDATE() - 3 to cover a weekend)
        ORDER BY
            T.instance_id DESC -- the most recent failure
    ) AS H;

Original code from here …

https://stackoverflow.com/questions/54215008/sql-agent-job-last-run-status
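
And to have the results emailed out rather than pulled ad hoc, a minimal sketch using sp_send_dbmail (assuming Database Mail is configured; the profile, recipient, and the stored procedure wrapping the query above are all placeholders):

EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'SomeMailProfile',
    @recipients = 'dba@company.com',
    @subject = 'Todays failed jobs',
    @query = 'EXEC SomeDatabase.dbo.JobFailures;'; -- JobFailures.sql wrapped in a proc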

Quick row count

When COUNT(*) was too slow to get total rows from a billion-row heap …

SELECT SUM(rows) AS [rows]
FROM sys.partitions
WHERE object_id = OBJECT_ID('TableName')
AND index_id IN (0, 1); -- heap or clustered index only, so nonclustered indexes are not double-counted

or the fancy pants version …

SELECT REPLACE(CONVERT(VARCHAR(20), CONVERT(MONEY, SUM(rows)), 1), '.00', '') AS [rows]
FROM sys.partitions
WHERE object_id = OBJECT_ID('TableName')
AND index_id IN (0, 1);

Original code from here …

https://www.i-programmer.info/programming/database/6576-sql-server-quickly-get-row-counts-for-tables-heaps-indexes-and-partitions.html
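
A similar trick, assuming VIEW DATABASE STATE permission, uses sys.dm_db_partition_stats (the SUM copes with partitioned tables):

SELECT SUM(row_count) AS [rows]
FROM sys.dm_db_partition_stats
WHERE object_id = OBJECT_ID('TableName')
AND index_id IN (0, 1); -- heap or clustered index only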

Adding a database to an Azure backup policy

Twice I had to ask how to do this – so it is probably time for me to write it down.

Setting the scene – this was a SQL Server on an Azure virtual machine, where I had already configured a backup policy within the Portal and then wanted to add another database to that policy.

Clicking around in the Backup center and Backup policy screens did not help. The trick was to go through the motions of taking a backup.

  1. Within the Backup center click “+ Backup”
  2. On the next screen make sure the Datasource type is “SQL in Azure VM” before selecting the right vault.
  3. Choose “Configure Backup” …
  4. Click “Add” …
  5. Then – you can choose the Instance (or AlwaysOn AG), database, then Enable Backup. Simples 🙂

Send an Email with a PDF attached

There are many posts on how to automatically generate a PDF receipt and email it by leveraging SSRS. Here is how it went for me.

  1. Create a report that shows the details required.
  2. Create a parameters table.
  3. Subscribe to the report using the parameters table.
  4. Create a stored-procedure to populate and fire the subscription.

On a version of SQL Server Reporting Services (SSRS) that supports data-driven subscriptions (DDS), I created a report called ‘SingleInvoice’ with one input parameter, ‘invoice number’.

Outside of this report, in preparation for the DDS, I created a data source pointing to the local [ReportServer] database.

Within the [ReportServer] database I created a table called [dbo].[InvoiceParams]

CREATE TABLE [dbo].[InvoiceParams](
	[InvoiceNumber] [VARCHAR](100) NULL,
	[ToEmailAddress] [VARCHAR](200) NULL,
	[CCEmailAddress] [VARCHAR](200) NULL,
	[BccEmailAddress] [VARCHAR](200) NULL,
	[ReplyToEmailAddress] [VARCHAR](200) NULL,
	[IncludeReport] [BIT] NULL,
	[RenderFormat] [VARCHAR](20) NULL,
	[Priority] [VARCHAR](15) NULL,
	[Subject] [VARCHAR](150) NULL,
	[Comment] [VARCHAR](150) NULL,
	[IncludeLink] [BIT] NULL,
	[Active] [BIT] NULL,
	[DateInserted] [DATETIME] NOT NULL
) ON [PRIMARY]
GO

ALTER TABLE [dbo].[InvoiceParams] ADD  DEFAULT (NULL) FOR [CCEmailAddress]
ALTER TABLE [dbo].[InvoiceParams] ADD  DEFAULT (NULL) FOR [BccEmailAddress]
ALTER TABLE [dbo].[InvoiceParams] ADD  DEFAULT (NULL) FOR [ReplyToEmailAddress]
ALTER TABLE [dbo].[InvoiceParams] ADD  DEFAULT ((1)) FOR [IncludeReport]
ALTER TABLE [dbo].[InvoiceParams] ADD  DEFAULT ('PDF') FOR [RenderFormat]
ALTER TABLE [dbo].[InvoiceParams] ADD  DEFAULT ('Normal') FOR [Priority]
ALTER TABLE [dbo].[InvoiceParams] ADD  DEFAULT ((0)) FOR [IncludeLink]
ALTER TABLE [dbo].[InvoiceParams] ADD  DEFAULT ((1)) FOR [Active]
ALTER TABLE [dbo].[InvoiceParams] ADD  DEFAULT (GETDATE()) FOR [DateInserted]
GO

To simplify the stored procedure I defined default values for all the columns I would not be dynamically populating.

Next I created a Data-driven subscription on the report with a schedule in the past – so it would never fire. For Destination I chose E-Mail.

Within the subscription I edited the dataset and chose the previously created shared data source [ReportServer].

I added this query before clicking ‘Apply’ …

SELECT * 
FROM dbo.InvoiceParams
WHERE Active = 1;

Back in the New Subscription form, I completed the Delivery options like this …

Within the user database I created this stored-procedure …

/*==================================================
  Author:		Richard Smith
  Create date:	10 Jul 2020
  Description:	To Email PDF receipts - demo version
  Test: Exec [dbo].[EmailReceipts] 'INV123456789',
            'Richard.Smith@company.com'
  =================================================*/

ALTER PROC [dbo].[EmailReceipts]
    @InvoiceNumber VARCHAR(100),
    @ToEmailAddress VARCHAR(200),
    @Subject VARCHAR(150) = 'test subject',
    @Comment VARCHAR(150) = 'test body',
    @SubscriptionID NVARCHAR(260) = '987654321' 
                       /* Report = "SingleInvoice" */
AS
BEGIN
    SET NOCOUNT ON;


/* 1 Save the inputs */

    INSERT INTO [ReportServer].[dbo].[InvoiceParams] 
            (InvoiceNumber, ToEmailAddress, [Subject], Comment)
    VALUES (@InvoiceNumber, @ToEmailAddress, @Subject, @Comment);


/* 2 Trigger subscription. Which will send the report (+ inputs) to the email-subsystem-queue */

    EXEC [ReportServer].[dbo].[AddEvent] @EventType = 'TimedSubscription', @EventData = @SubscriptionID;
    WAITFOR DELAY '00:00:10';


/* 3 If no longer in queue, flag as sent */

    IF NOT EXISTS (SELECT 1 FROM [ReportServer].[dbo].[Event] WHERE EventData = @SubscriptionID)
        UPDATE [ReportServer].[dbo].[InvoiceParams] 
        SET Active = 0
        WHERE InvoiceNumber = @InvoiceNumber
        AND ToEmailAddress = @ToEmailAddress;


/* 4 Manage the log */

    DELETE FROM [ReportServer].[dbo].[InvoiceParams] WHERE DateInserted < GETDATE() - 30;
    SELECT * FROM [ReportServer].[dbo].[InvoiceParams] ORDER BY DateInserted DESC;

END;
GO

The stored procedure input parameter @SubscriptionID was found within the SSMS Job Activity Monitor (it was the only job that had never run).
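
Alternatively, a minimal sketch for looking it up in the standard [ReportServer] catalog tables (the report name is from above):

SELECT S.SubscriptionID, C.[Name], S.Description
FROM [ReportServer].[dbo].[Subscriptions] AS S
JOIN [ReportServer].[dbo].[Catalog] AS C ON C.ItemID = S.Report_OID
WHERE C.[Name] = 'SingleInvoice';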

When executed with an invoice number and an email address, this stored procedure sends an email to that address with the PDF invoice attached.

Removing duplicate rows

No need to over-develop this – once you realise that the DELETE command can include the TOP option.

More? Ok. Create a SELECT command that shows the issue. EG:-

SELECT ID
FROM dbo._op2
WHERE ID = 'X123456';

Add TOP to return just the unwanted rows. IE: if the above query should return 1 row but erroneously returns 2 rows use TOP(1), if it returns 5 rows use TOP(4) …

(NOTE: It does not matter which particular rows get deleted … they are duplicates remember)

SELECT TOP(1) ID
FROM dbo._op2
WHERE ID = 'X123456';

Change SELECT to DELETE and remove the column name …

DELETE TOP(1)
FROM dbo._op2
WHERE ID = 'X123456';

… and only run it once 😉
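
If in doubt, a minimal sketch wrapping the delete in a transaction, so the remaining count can be checked before committing (same example table and key as above):

BEGIN TRANSACTION;

DELETE TOP(1)
FROM dbo._op2
WHERE ID = 'X123456';

SELECT COUNT(*) AS remaining
FROM dbo._op2
WHERE ID = 'X123456'; -- expecting 1

-- COMMIT TRANSACTION;   -- if the count is right
-- ROLLBACK TRANSACTION; -- if it is not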

Except and Intersect

Here is the simplest working example of EXCEPT and INTERSECT I can come up with (for Will)

/* Except.sql */

IF OBJECT_ID('tempdb..#t1') IS NOT NULL DROP TABLE #t1;
CREATE TABLE #t1 (c1 INT);
INSERT INTO #t1 VALUES (1), (2);

IF OBJECT_ID('tempdb..#t2') IS NOT NULL DROP TABLE #t2;
CREATE TABLE #t2 (c2 INT);
INSERT INTO #t2 VALUES (2), (3);

SELECT * FROM #t1
EXCEPT
SELECT * FROM #t2; /* = 1 */

SELECT * FROM #t1
INTERSECT
SELECT * FROM #t2; /* = 2 */

SELECT * FROM #t2
EXCEPT
SELECT * FROM #t1; /* = 3 */

SELECT * FROM #t1
EXCEPT
SELECT * FROM #t2
UNION
(SELECT * FROM #t2
EXCEPT
SELECT * FROM #t1); /* = 1 & 3 */

I use this frequently whilst refactoring, to check that the outputs are identical, and sometimes when syncing a MySQL table to MSSQL.

How to check SQL Jobs are actually doing something

Looking through the Database Mail log today, I accidentally discovered a job that had been busy sending emails for I-don’t-know-how-long using an email profile that no longer worked. The output of the job was ‘success’ as the emails had been successfully queued with the Database Mail sub-system.

After finding the emails would have been empty anyway, I disabled the job. But it made me wonder if there might be other jobs that were busy doing nothing – hour after hour – day after day.

Knowing the dangers of weakening the system, I did not want to fail a job or job-step just to flag a maintenance issue.

The lowest-risk change I could think of making (to the many, legacy, unfamiliar jobs) was to leave pertinent messages in the job history log using the PRINT command. For example:-

IF EXISTS (SELECT 1 FROM dbo.SomeTable)
BEGIN
    PRINT 'YES: there is new data';
    -- (do meaningful stuff here)
END
ELSE
    PRINT 'NO: there is no new data';

Then in the future I might notice that there is Never any new data!
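
A quick scan of recent job history would then surface them; PRINT output is captured in the step message. A minimal sketch, searching for the text from the example above:

SELECT
    J.name,
    msdb.dbo.agent_datetime(H.run_date, H.run_time) AS RanAt,
    H.message
FROM msdb.dbo.sysjobs AS J
JOIN msdb.dbo.sysjobhistory AS H ON H.job_id = J.job_id
WHERE H.message LIKE '%NO: there is no new data%'
ORDER BY RanAt DESC;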

Expensive Spaghetti

Now that cloud computing has made the cost of things more visible, it is easier to see how much money is being wasted carrying around legacy spaghetti code.

Where once I.T. managers prioritised the task of cleaning up inefficient-but-working code as “one day”, now a cloud compute unit can be tied directly to cold hard cash. “Technical Debt” suddenly has an individual price sticker.

Subtree Cost = DTUs = Money

Set every user’s default schema to DBO where it is blank

In this quick script I am assuming the Windows domain is called ‘DOM’. The output is one ALTER USER command per affected user per database: review them, then copy and run.

-- ChangeDefaultSchemaToDboWhereNull.sql

DECLARE @cmd VARCHAR(1000);

SET @cmd =
'USE ? IF DB_ID(''?'') > 4 SELECT ''USE ?; ALTER USER ['' + name + ''] WITH DEFAULT_SCHEMA = [dbo]''
 FROM sys.database_principals
 WHERE default_schema_name IS NULL
 AND [name] LIKE ''DOM\%''';

IF OBJECT_ID('tempdb..#output') IS NOT NULL DROP TABLE #output;
CREATE TABLE #output (command VARCHAR(1000));

INSERT INTO #output
EXEC sp_MSforeachdb @cmd;

SELECT *
FROM #output;


Save emergency contact details to Azure Blob Storage

This was part of the business continuity plan, so that HR had an off-site list of employees’ next-of-kin etc.

I created a Job with 2 steps (export using BCP, then upload using AzCopy).

-- bcp.sql


-- Create the command to be executed

	-- NB: the whole command must be one line; an embedded line-break would break xp_cmdshell
	DECLARE @cmd VARCHAR(1000) =
		'bcp SomeDatabase..SomeView out E:\SomePath\SomeFileName.txt -c -t, -T -S' + @@SERVERNAME;

-- Run this if command-prompt is disabled

	IF (SELECT value_in_use FROM sys.configurations WHERE name = 'xp_cmdshell') = 0
	BEGIN
		EXEC sp_configure 'show advanced options', 1; RECONFIGURE; -- show advanced options
		EXEC sp_configure 'xp_cmdshell', 1; RECONFIGURE; -- enable command-shell
		EXEC xp_cmdshell @cmd; -- run the command
		EXEC sp_configure 'xp_cmdshell', 0; RECONFIGURE; -- disable command-shell
		EXEC sp_configure 'show advanced options', 0; RECONFIGURE; -- hide advanced options
	END -- no semicolon here, or the ELSE below would be orphaned

-- Run this if command-prompt is enabled

	ELSE
		EXEC xp_cmdshell @cmd;

Step-1 overwrites the file each time.

In preparation for the azcopy command I used Azure Explorer to connect and create a fileshare called “BCExtract”.

Right-Clicking on this I chose “Generate Shared Access Signature”, and “Ad hoc Shared Access Signature”. I left all the defaults as they were except I increased Expiry time a bit.

I pasted the “Shared Access Signature URI” into notepad, added a backslash, then the “Shared Access Signature Token” making one long string.

Before this string I added “azcopy copy ” then the local path to the CSV file in double quotes.

-- azcopy.sql


-- Create the command to be executed

	-- NB: the whole command must be one line; an embedded line-break would break xp_cmdshell
	DECLARE @cmd VARCHAR(1000) =
		'azcopy copy "E:\SomePath\SomeFileName.txt" "https://SomeLocation/SomeSharedAccessToken"';

-- Run this if command-prompt is disabled

	IF (SELECT value_in_use FROM sys.configurations WHERE name = 'xp_cmdshell') = 0
	BEGIN
		EXEC sp_configure 'show advanced options', 1; RECONFIGURE; -- show advanced options
		EXEC sp_configure 'xp_cmdshell', 1; RECONFIGURE; -- enable command-shell
		EXEC xp_cmdshell @cmd; -- run the command
		EXEC sp_configure 'xp_cmdshell', 0; RECONFIGURE; -- disable command-shell
		EXEC sp_configure 'show advanced options', 0; RECONFIGURE; -- hide advanced options
	END -- no semicolon here, or the ELSE below would be orphaned

-- Run this if command-prompt is enabled

	ELSE
		EXEC xp_cmdshell @cmd;

Azure Backups and Restores

Backups

“Azure SQL Database” includes automatic (compressed) backups. The retention period is 7 days for Basic and 35 days for Standard and Premium. Backups are geo-replicated. The schedule is a weekly full backup, hourly differential backups, and transaction-log backups every 5 minutes.

All backups support point-in-time recovery, with a 12-hour RTO and a 5-minute RPO for in-region backups.

Max backup storage size is twice the tier’s database size limit (EG: Standard = 500GB of backup space). If exceeded, either a) request a reduced retention period, or b) pay extra at the standard read rate.

Long-term backup archiving can be done using a) a manual export to a BACPAC on Azure Blob Storage, or b) Long-Term Retention to an Azure Backup vault for up to 10 years.
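
For the BACPAC route, a minimal sketch using the SqlPackage command line (the server, database, login, and path are all placeholders):

sqlpackage /Action:Export
    /SourceServerName:"SomeServer.database.windows.net"
    /SourceDatabaseName:"SomeDb"
    /SourceUser:"SomeLogin" /SourcePassword:"SomePw"
    /TargetFile:"E:\SomePath\SomeDb.bacpac"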

Restores

Databases can be restored to any point within the retention period. Be aware that dropping a server deletes all its databases and their backups.

Local restores can be to a point in time, and the restored copy can be renamed, but it cannot overwrite an existing database.

Install Azure SQL Server and Databases

Portal click-by-click

1. Create SQL Server
– Home / SQL Servers / Add
– Subscription: “FT”
– Resource Group: “(New) SD-resource”
– Server Name “SD-server”
– Region: “(Europe) UK South” (London)
– Admin login “SD-login” / pw
– Allow Azure Services to access this server: “No”
– Enable advanced security: “Not now”
(wait 2 minutes)

2. Create SQL Database
– Home / SQL Databases / Add
– Database name: “SD-db”
– Elastic pool?: No
– Resources: “5 DTUs (Basic)”
– Data source: “None” (blank database)
(wait 2 minutes)
– Home / SQL Databases
– Database Features
– TDE: “Off”

3. Server level Connectivity
– Home / SQL Servers / Server name: “SD-server.database.windows.net” (cut)
– SSMS (paste) / SQL Server Authentication / SD-login / pw

3b. Messages
– “The requested name is valid, but no data of the requested type was found” = the server name was wrong.
– “New Firewall Rule” = your IP address is not on the server firewall list. Do you want to add it? = Yes

4. (optional) database connectivity
a. exec sp_set_database_firewall_rule N'dbfwrule1', '86.168.15.119', '86.168.15.119';
b. select * from sys.database_firewall_rules;
c. exec sp_delete_database_firewall_rule N'dbfwrule1';
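
The server-level equivalents live in the master database (the rule name ‘fwrule1’ is just an example):

a. exec sp_set_firewall_rule N'fwrule1', '86.168.15.119', '86.168.15.119';
b. select * from sys.firewall_rules;
c. exec sp_delete_firewall_rule N'fwrule1';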

Notes
– Azure SQL DB offers 3 service tiers: Basic, Standard, and Premium.
– These tiers define performance and recovery.
– Performance is measured in DTUs.
– Multiple databases can share resources through Elastic Database Pools.

Differential backup / restore using SQL Safe

I wanted to improve a backup/restore SQL job that repopulated a reporting server every night. I felt it would be quicker if it did a weekly full backup and daily differential backups.

The 3rd-party backup software was “SQL Safe Backup” from Idera.

I used this code on the Production Server …

--BackupDBA.sql

-- FULL Backup if Friday (DATEPART(dw) = 6 assumes the default DATEFIRST of 7, i.e. week starts Sunday)

	IF (SELECT DATEPART(dw, GETDATE())) = 6
	BEGIN
		EXEC [master].[dbo].[xp_ss_backup]
		@database = 'DBA',
		@filename = '\\ReportingServer\m$\DBA_Full.safe',
		@backuptype = 'Full',
		@overwrite = 1,
		@verify = 1;
	END;


-- DIFF Backup

	EXEC [master].[dbo].[xp_ss_backup]
	@database = 'DBA',
	@filename = '\\ReportingServer\m$\DBA_Diff.safe',
	@backuptype = 'Diff',
	@overwrite = 1,
	@verify = 1;

… and this code I scheduled a few hours later on the Reporting Server …

--RestoreDBA.sql

-- First, kill any connections

	DECLARE @kill VARCHAR(8000) = '';
	SELECT  @kill = @kill + 'kill ' + CONVERT(VARCHAR(5), spid) + ';'
	FROM    [master].[dbo].[sysprocesses]
	WHERE   dbid = DB_ID('DBA')
	AND spid > 50;
	EXEC (@kill);


-- Restore FULL

	EXEC [master].[dbo].[xp_ss_restore] 
	@database = 'DBA',
	@filename = 'M:\DBA_Full.safe', 
	@backuptype = 'Full',
	@recoverymode = 'norecovery', 
	@replace = '1';


-- Restore DIFF

	EXEC [master].[dbo].[xp_ss_restore] 
	@database = 'DBA',
	@filename = 'M:\DBA_Diff.safe', 
	@backuptype = 'Diff',
	@recoverymode = 'recovery', 
	@replace = '1';

Finally, I added a step on Reporting to repair the orphaned user-accounts …

USE [DBA]
GO

EXEC sp_change_users_login 'Auto_Fix', 'ReportingLogin';
EXEC sp_change_users_login 'Auto_Fix', 'LinkedServerLogin';
GO
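
Note: sp_change_users_login is deprecated. On newer versions of SQL Server the supported equivalent is ALTER USER … WITH LOGIN, which re-maps each user to its login in one step:

USE [DBA]
GO

ALTER USER [ReportingLogin] WITH LOGIN = [ReportingLogin];
ALTER USER [LinkedServerLogin] WITH LOGIN = [LinkedServerLogin];
GO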

TempDB Autogrowth

I was shocked to find that TempDB files were the most frequently autogrown files in our environment, and changing them by hand soon got old.

Here’s my global script to set them to best-practice values. It only changes files that exist, and any single change fails safely if the specified size is less than the current size.

--tempdb_autogrowth.sql

-- Report size and growth
SELECT name, size * 8 / 1024 AS [sizeMB], growth * 8 / 1024 AS [growthMB]
FROM master.sys.master_files
WHERE DB_NAME(database_id) = 'tempdb'
ORDER BY physical_name;


-- Resize datafile(s) to 256MB and logfile to 128MB.
-- Each file gets its own TRY/CATCH, so one missing (or already-bigger) file does not stop the rest.
USE [master]
GO
BEGIN TRY ALTER DATABASE [tempdb] MODIFY FILE ( NAME = N'tempdev',  SIZE = 262144KB , FILEGROWTH = 262144KB ); END TRY BEGIN CATCH END CATCH;
BEGIN TRY ALTER DATABASE [tempdb] MODIFY FILE ( NAME = N'tempdev2', SIZE = 262144KB , FILEGROWTH = 262144KB ); END TRY BEGIN CATCH END CATCH;
BEGIN TRY ALTER DATABASE [tempdb] MODIFY FILE ( NAME = N'tempdev3', SIZE = 262144KB , FILEGROWTH = 262144KB ); END TRY BEGIN CATCH END CATCH;
BEGIN TRY ALTER DATABASE [tempdb] MODIFY FILE ( NAME = N'tempdev4', SIZE = 262144KB , FILEGROWTH = 262144KB ); END TRY BEGIN CATCH END CATCH;
BEGIN TRY ALTER DATABASE [tempdb] MODIFY FILE ( NAME = N'templog',  SIZE = 131072KB , FILEGROWTH = 131072KB ); END TRY BEGIN CATCH END CATCH;