Here is the simplest working example of EXCEPT and INTERSECT I can come up with (for Will)
/* Except.sql */
IF OBJECT_ID('tempdb..#t1') IS NOT NULL DROP TABLE #t1;
CREATE TABLE #t1 (c1 INT);
INSERT INTO #t1 VALUES (1), (2);
IF OBJECT_ID('tempdb..#t2') IS NOT NULL DROP TABLE #t2;
CREATE TABLE #t2 (c2 INT);
INSERT INTO #t2 VALUES (2), (3);
SELECT * FROM #t1
EXCEPT
SELECT * FROM #t2; /* = 1 */

SELECT * FROM #t1
INTERSECT
SELECT * FROM #t2; /* = 2 */

SELECT * FROM #t2
EXCEPT
SELECT * FROM #t1; /* = 3 */

SELECT * FROM #t1
EXCEPT
SELECT * FROM #t2
UNION
(SELECT * FROM #t2
EXCEPT
SELECT * FROM #t1); /* = 1 & 3 */
I use this frequently whilst refactoring, to check that the outputs are identical. And sometimes when syncing a MySQL table to MSSQL.
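The refactoring check scales up to whole queries. Here is a sketch using hypothetical derived tables (OldVersion and NewVersion stand in for the query before and after the change); both directions returning zero rows means the outputs are identical. One caveat: EXCEPT is set-based, so it will not flag a difference in duplicate row counts.

```
/* Hypothetical refactoring check: run EXCEPT both ways round. */
SELECT * FROM (SELECT c1 FROM #t1) AS OldVersion  /* the old query */
EXCEPT
SELECT * FROM (SELECT c1 FROM #t1) AS NewVersion; /* the new query */

/* ...and the reverse, because EXCEPT is not symmetric: */
SELECT * FROM (SELECT c1 FROM #t1) AS NewVersion
EXCEPT
SELECT * FROM (SELECT c1 FROM #t1) AS OldVersion;

/* Zero rows from both = identical outputs. */
```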
Looking through the Database Mail log today, I accidentally discovered a job that had been busy sending emails for I-don’t-know-how-long using an email profile that no longer worked. The output of the job was ‘success’ as the emails had been successfully queued with the Database Mail sub-system.
After finding the emails would have been empty anyway, I disabled the job. But it made me wonder if there might be other jobs that were busy doing nothing – hour after hour – day after day.
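One place to look for similar casualties is msdb's Database Mail views. A sketch (the view and columns are the standard msdb ones; the filter may need adjusting for your environment):

```
/* Recent Database Mail items that never made it out the door.
   sysmail_allitems records every queued item with its sent_status
   ('sent', 'unsent', 'retrying' or 'failed'). */
SELECT mailitem_id,
       profile_id,
       recipients,
       subject,
       sent_status,
       send_request_date
FROM   msdb.dbo.sysmail_allitems
WHERE  sent_status <> 'sent'
ORDER  BY send_request_date DESC;
```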
Knowing the dangers of weakening the system, I did not want to fail a job or job-step just to flag a maintenance issue.
The lowest-risk change I could think of making (to the many, legacy, unfamiliar jobs) was to leave pertinent messages in the job history log using the PRINT command. For example:-
IF EXISTS (SELECT 1 FROM SomeTable)
BEGIN
    PRINT 'YES: there is new data';
    /* (Do meaningful stuff) */
END
ELSE
    PRINT 'NO: there is no new data';
Then in the future I might notice that there is Never any new data!
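Those messages can then be searched for retrospectively. A sketch, assuming the job step's "Include step output in history" advanced option is ticked (without it, PRINT output never reaches the history), and using the message text from the example above:

```
/* Find recent job-history rows containing the 'no new data' message. */
SELECT j.name AS job_name,
       h.step_name,
       h.run_date,
       h.run_time,
       h.message
FROM   msdb.dbo.sysjobhistory AS h
       JOIN msdb.dbo.sysjobs AS j
         ON j.job_id = h.job_id
WHERE  h.message LIKE '%NO: there is no new data%'
ORDER  BY h.run_date DESC, h.run_time DESC;
```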
There are many ways to copy (or move) a MySQL database (aka schema) from one server (aka instance) to another (including the data).
On this occasion I used the Export and Import utilities within “MySQL Workbench” (like doing a backup and restore). The fact that the source and target instances were both hosted on GCP was irrelevant (no brackets required!).
1. Connect to Source and start the Data Export utility …
2. Within the utility, I left the defaults as they were, apart from …
- Tick the schema I wanted to export (see screenshot)
- Select “Export to Self-Contained File” as I wanted all the tables
- and create a meaningful dump-file name
3. Because the Export and Import utilities would be using different logins, I clicked “Advanced Options” within the Export utility, and typed “OFF” over the top of the default “AUTO” setting for set-gtid-purged …
…before clicking “Return”, then “Start Export” back on the main page.
4. To keep it simple, I closed and reopened MySQL Workbench before connecting to the Target instance. Then from the Server menu I chose Data Import …
5. I left all the defaults as they were except …
- I chose “Import from Self-Contained File”
- and navigated to the dump-file
- I clicked “New” and typed the schema name that would receive the import.
6. Finally, on the “Import Progress” page I clicked “Start Import”, then waited about five minutes before anything seemed to happen.
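For anyone preferring the command line, the same copy can be sketched with mysqldump and the mysql client. The host, user and schema names below are placeholders; --set-gtid-purged=OFF mirrors the Advanced Option changed in step 3.

```
# Export the schema from the source instance (placeholders: source-host, myuser, myschema)
mysqldump --host=source-host --user=myuser --password \
          --set-gtid-purged=OFF \
          myschema > myschema_dump.sql

# Create the receiving schema on the target instance (mirrors clicking "New" in step 5)
mysql --host=target-host --user=myuser --password \
      --execute="CREATE DATABASE IF NOT EXISTS myschema"

# Import the dump-file into it
mysql --host=target-host --user=myuser --password \
      myschema < myschema_dump.sql
```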