MSDTC is required if transactions span multiple servers (e.g. via linked servers, ugh!)
Firstly, the SERVICE needs to be running >
Secondly, you need to allow communication (Allow Inbound and Allow Outbound tickboxes below) >
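For context, the sort of statement that needs MSDTC looks something like this (a rough sketch only - the linked server REMOTESRV and the table names are made up):

-- Hypothetical example: a transaction spanning a local table and a linked server.
-- The update against REMOTESRV is what promotes this to a distributed (MSDTC) transaction.
BEGIN DISTRIBUTED TRANSACTION;

    UPDATE dbo.Orders
    SET OrderStatus = 'Shipped'
    WHERE OrderID = 42;

    UPDATE REMOTESRV.Warehouse.dbo.Stock
    SET Quantity = Quantity - 1
    WHERE ProductID = 7;

COMMIT TRANSACTION;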
Thursday, 29 October 2009
Wednesday, 28 October 2009
Dynamic Management Objects : sys.dm_exec_query_stats
Sys.dm_exec_query_stats is a DMV (dynamic management view) that returns summary performance statistics for the query plans held in the plan cache.
Sys.dm_exec_sql_text(sql_handle) is a function that returns the text of the SQL batch identified by sql_handle.
Putting them together with CROSS APPLY (APPLY lets you join to the output of a table-valued function), you can see what is being run and how often.
SELECT t.text, s.*
FROM sys.dm_exec_query_stats s
CROSS APPLY sys.dm_exec_sql_text(sql_handle) t
WHERE t.text NOT LIKE 'SELECT * FROM(SELECT coalesce(object_name(s2.objectid)%'
ORDER BY execution_count DESC
This provides 35 columns summarising query activity, including execution counts, times, reads and writes.
SQLDenis has provided a great query, which I've slightly adapted to order by the most commonly executed queries.
It tells you where the SQL is called from (ProcedureName), or replaces it with 'Ad-hoc' if not called from a procedure.
SELECT *
FROM (
    SELECT COALESCE(OBJECT_NAME(s2.objectid), 'Ad-Hoc') AS ProcedureName,
           execution_count,
           (SELECT TOP 1
                   SUBSTRING(s2.TEXT,
                             statement_start_offset / 2 + 1,
                             ((CASE WHEN statement_end_offset = -1
                                    THEN (LEN(CONVERT(NVARCHAR(MAX), s2.TEXT)) * 2)
                                    ELSE statement_end_offset
                               END) - statement_start_offset) / 2 + 1)) AS sql_statement,
           last_execution_time
    FROM sys.dm_exec_query_stats AS s1
    CROSS APPLY sys.dm_exec_sql_text(sql_handle) AS s2
) x
WHERE sql_statement NOT LIKE 'SELECT * FROM(SELECT coalesce(object_name(s2.objectid)%'
ORDER BY execution_count DESC
SQLDenis's original post >http://blogs.lessthandot.com/index.php/DataMgmt/DBProgramming/finding-out-how-many-times-a-table-is-be-2008
MSDN Reference : http://msdn.microsoft.com/en-us/library/ms189741.aspx
Tuesday, 27 October 2009
Dynamic Management Objects : sys.dm_db_index_usage_stats
Sys.dm_db_index_usage_stats is a really helpful view that returns information on index usage.
Using AdventureWorks2008, we can list the indexes belonging to the TransactionHistory table ...
SELECT *
FROM sys.indexes
WHERE object_id = OBJECT_ID('production.transactionhistory')
Here we find out how they have been used ...
SELECT i.name, s.*
FROM sys.dm_db_index_usage_stats s
INNER JOIN sys.indexes i
    ON s.object_id = i.object_id
   AND s.index_id = i.index_id
WHERE s.database_id = DB_ID('adventureworks2008')
  AND s.object_id = OBJECT_ID('production.transactionhistory')
The view returns counts of seeks, scans and lookup operations as well as timestamps for the latest ones ...
This view is utilised by my Unused index and Index Sizes script as well as this simple example showing index usage in the current database.
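That simple example isn't reproduced above, but a minimal equivalent for the current database (a sketch, not the original script) is along these lines -

SELECT OBJECT_NAME(s.object_id) AS TableName,
       i.name AS IndexName,
       s.user_seeks,
       s.user_scans,
       s.user_lookups,
       s.user_updates
FROM sys.dm_db_index_usage_stats s
INNER JOIN sys.indexes i
    ON s.object_id = i.object_id
   AND s.index_id = i.index_id
WHERE s.database_id = DB_ID()
ORDER BY (s.user_seeks + s.user_scans + s.user_lookups) DESC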
Louis Davidson has an excellent article here >
http://sqlblog.com/blogs/louis_davidson/archive/2007/07/22/sys-dm-db-index-usage-stats.aspx
The MSDN reference is here >
http://msdn.microsoft.com/en-us/library/ms188755.aspx
Friday, 23 October 2009
MCITP : SQL 2008 Database Developer
Following passing my MCTS on Monday, I passed MCITP Database Developer 2008 (exam 70-451) this afternoon.
Very relieved to get my evenings back. :)
r
Wednesday, 21 October 2009
A trip down memory lane...
On the eve of the Windows 7 launch, Windows over the years...
http://news.bbc.co.uk/1/hi/technology/8316268.stm
I can personally remember using Windows 2 and running a Windows 3 network at school :)
Tuesday, 20 October 2009
Cursor Template TSQL
DECLARE @id BIGINT

DECLARE tablecursor CURSOR FORWARD_ONLY FOR
    SELECT id FROM SCHEMA.table

OPEN tablecursor
FETCH NEXT FROM tablecursor INTO @id

WHILE (1 = 1)
BEGIN
    IF @@FETCH_STATUS <> 0 BREAK;

    PRINT @id -- DO SOMETHING USEFUL HERE !

    FETCH NEXT FROM tablecursor INTO @id
END

CLOSE tablecursor
DEALLOCATE tablecursor
Bookmark : Constraints on Large Text Columns
An interesting way of enforcing unique constraints which tackles the problem of index size by using HASHBYTES (similar to CHECKSUM, but far less prone to collisions).
SQL Server Unique Constraints for Large Text Columns
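The gist of the technique is to hash the large column and enforce uniqueness on the hash instead. A rough sketch (the table and column names are made up, and bear in mind HASHBYTES input is limited to 8000 bytes on SQL 2005/2008) -

CREATE TABLE dbo.Documents
(
    DocumentId   INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
    DocumentText NVARCHAR(MAX) NOT NULL,
    -- hash of the large text column; 20 bytes is easily small enough to index
    DocumentHash AS CAST(HASHBYTES('SHA1', DocumentText) AS VARBINARY(20)) PERSISTED
)
GO

-- uniqueness is enforced on the hash rather than the large column itself
CREATE UNIQUE NONCLUSTERED INDEX UX_Documents_DocumentHash
ON dbo.Documents (DocumentHash)
GO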
Demystified : How SQL Server uses threads
Data files & threads - Clarification (From Steve Jones' blog).
"The bottom line is that SQL Server uses a thread for each unique disk drive, NOT files. In SQL 2000, you can fool the system if the disk drives are volumes on the same physical disk. However in SQL 2005 the system checks to see if these are the same physical disk.http://www.sqlservercentral.com/blogs/steve_jones/archive/2009/10/13/sql-server-legend-data-files-and-threads.aspx
There are some caveats, but if you hear this rumor, dispel it. Let people know that multiple file groups (or files) only help if you have different physical I/O paths and drives."
Therefore performance is only a reason for partitioning when separate disks are involved.
Obviously, partitioning is still worth the effort if you wish to reduce backup sizes & times by implementing a strategy that includes it.
Also, if data is partitioned and then your boss stumps up for another drive, you're ready to move some partitions!
SQL 2008 : Filtered Indexes
Nice new functionality in SQL 2008: Filtered Indexes are indexes defined with a WHERE clause!
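As a minimal syntax sketch only (hypothetical table and columns) -

-- index only the rows that are still unshipped
CREATE NONCLUSTERED INDEX IX_Orders_Unshipped
ON dbo.Orders (OrderDate)
WHERE ShippedDate IS NULL
GO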
Nothing further here until I have time to explore, implement and write my own experiences...
http://sqlblogcasts.com/blogs/grumpyolddba/archive/2008/07/11/filtered-indexes-really-cool.aspx
http://sqlfool.com/2009/04/filtered-indexes-what-you-need-to-know/
http://www.mssqltips.com/tip.asp?tip=1785
Monday, 19 October 2009
MCTS : SQL 2008, Database Development
TSQL : Partitioning an existing table
A few months back I published SQL 2005 / 2008 : Table Partitioning - Run through, just enough notes to get partitioning up and running (on SQL Enterprise / Developer editions).
What I forgot to post was how to move existing tables to the partition -
This is easily achieved by creating a clustered index on the table that uses the partition scheme -
USE TestDB;
GO

--- Step 8 : (Optional/Recommended) Create Index on Partitioned Table
CREATE UNIQUE CLUSTERED INDEX IX_TestTable
ON TestTable (ID)
ON TestDB_PartitionScheme (ID);
GO
Sunday, 18 October 2009
Persisted Columns Problem : Persisting a time difference
Below you'll see I attempt to define 2 calculated columns as PERSISTED...
CREATE TABLE [dbo].[WebActivity](
    [Id] [bigint] IDENTITY(1,1) NOT NULL,
    [Url] [varchar](255) NULL,
    [TrackingDateTime] [datetime2](0) NOT NULL,
    [TrackingSeconds] AS DATEDIFF(SECOND, '20000101', TrackingDateTime) PERSISTED,
    [TrackingDay] AS DATEDIFF(DAY, '20000101', TrackingDateTime) PERSISTED
) ON [PRIMARY]
GO
This fails however, with the error...
Msg 4936, Level 16, State 1, Line 1
Computed column 'TrackingSeconds' in table 'WebActivity' cannot be persisted because the column is non-deterministic.
What does this mean, then?
According to MSDN...
" Deterministic functions always return the same result any time they are called with a specific set of input values. "
and further down, the answer...
" Deterministic unless used with datetime, smalldatetime, or sql_variant. The datetime and smalldatetime data types are deterministic if the style parameter is also specified. "
So to obtain my deterministic DATETIME2 difference, I have used two CONVERT statements.
The first declares my defined epoch of 2000-01-01 as a DATETIME2.
The second casts the entire result back to a numeric data type.
CREATE TABLE [dbo].[WebActivity](
    [Id] [bigint] IDENTITY(1,1) NOT NULL,
    [Url] [varchar](255) NULL,
    [TrackingDateTime] [datetime2](0) NOT NULL,
    [TrackingSeconds] AS CONVERT(INTEGER, DATEDIFF(SECOND, CONVERT(DATETIME2, '20000101', 112), TrackingDateTime)) PERSISTED,
    [TrackingDay] AS CONVERT(SMALLINT, DATEDIFF(DAY, CONVERT(DATETIME2, '20000101', 112), TrackingDateTime)) PERSISTED
) ON [PRIMARY]
GO
Saturday, 17 October 2009
Visual Studio Add-in : BIDS Helper
BIDS Helper is a FREE add-in for BIDS/Visual Studio
As I’m currently working with Integration Services, these features are very much of interest >
* Create Fixed Width Columns
* Deploy SSIS Packages
* Design Warnings
* dtsConfig File Formatter
* Expression and Configuration Highlighter
* Expression List
* Fix Relative Paths
* Non-Default Properties Report
* Pipeline Component Performance Breakdown
* Reset GUIDs
* Smart Diff
* Sort Project Files
* Sortable Package Properties Report
* SSIS Performance Visualization
* Variables Window Extensions
Personally I’m particularly interested in the Performance Studio feature...
The Non-Default Properties Report is also excellent for when you lose track of what you’ve changed (sorry, tuned) :)
Friday, 16 October 2009
Stored Procedures - The Arguments
I'm 'pro' stored procedures. A couple of excellent URLs that discuss the reasons...
Paul Nielson - Why use stored procedures?
Kimberly Tripp - Stored procedures are not evil...
Wednesday, 14 October 2009
Free Tool : OpenDBDiff
This morning's SQLServerCentral email features a new (free and open source) tool called OpenDBDiff.
Basically, it is a free alternative to Redgate's SQL Compare and is compatible with SQL 2005 and SQL 2008.
You can download it from the codeplex site here.
The 'Mak' reviews it here.
Tuesday, 13 October 2009
SSIS : Introducing a wait step in the Control Flow
In a processing system I wanted SSIS to pause for a period of time (say 1 minute) to allow more data to arrive in my Import db.
The obvious solution was an empty 'For Loop' Container, constrained by looping until a minute had passed. Whilst trialing this approach I noticed the Physical Memory usage slowly increase (until it eventually consumed all memory and the package crashed).
My new approach is using an 'Execute SQL' task (shown below).
Basically, I am using the TSQL WAITFOR command, like this > WAITFOR DELAY '00:01:00'.
I am happy to report no memory issues and my package is running continuously with no unwanted side effects...
Monday, 12 October 2009
Saturday, 10 October 2009
Wednesday, 7 October 2009
SQL 2008 : Sparse Columns
If a column is sparsely (rarely) populated, you can benefit by declaring it as SPARSE.
By declaring columns where the majority of data is NULL as 'sparse' you can save storage space.
The Technet Explanation of Sparse column functionality.
As ever, Pinal Dave has investigated Sparse columns fully.
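As a rough sketch of the syntax (hypothetical table - a fax number being the classic mostly-NULL candidate) -

CREATE TABLE dbo.Customers
(
    CustomerId   INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
    CustomerName NVARCHAR(100) NOT NULL,
    FaxNumber    VARCHAR(20) SPARSE NULL  -- NULL values take no storage; non-NULL values cost a little extra
)
GO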
Monday, 5 October 2009
Finding Page Splits
Finding page splits by using the undocumented function fn_dblog (which queries the transaction log) -
SELECT *
FROM ::fn_dblog(NULL, NULL)
WHERE operation = 'LOP_DELETE_SPLIT'

ref : http://killspid.blogspot.com/2006/07/using-fndblog.html
You can summarise them like this -
SELECT COUNT(1) AS NumberOfSplits,
       AllocUnitName,
       Context
FROM fn_dblog(NULL, NULL)
WHERE operation = 'LOP_DELETE_SPLIT'
GROUP BY AllocUnitName, Context
ORDER BY NumberOfSplits DESC

ref - Identifying Page Splits
Other ways to monitor page splits -
- DBCC SHOWCONTIG
- Extended events (SQL 2008+ only)
Recommended links about Page Allocation
MS CSS Sql Server Engineers : How It Works: SQL Server Page Allocations
Recommended links about Page Splits
Tony Rogerson : What is a page split and why does it happen?
SQL Server Performance : At what point should I worry about page splits?
Michelle Ufford : Page Splitting & Rollbacks
Michelle Ufford : sys.fn_physLocCracker (SQL 2008 undocumented function)
Sunday, 4 October 2009
Minimising Data Compression Rebuild Time
ALTER TABLE <TableName> REBUILD WITH (DATA_COMPRESSION = PAGE, MAXDOP = 8)
Implementing table compression uses the server-level MAXDOP setting (sp_configure 'max degree of parallelism') if you don't set MAXDOP in the statement.
A good explanation of page compression and the MAXDOP setting is here >
http://sqlblog.com/blogs/linchi_shea/archive/2008/05/05/sql-server-2008-page-compression-using-multiple-processors.aspx
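If you want to check what the server-level default currently is, sp_configure will show it (advanced options need to be enabled first) -

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max degree of parallelism';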
Saturday, 3 October 2009
TSQL : While Loop (loop without cursor)
Really simple this, but I keep using it so I thought I'd post it.
How to loop in TSQL (without a CURSOR)
DECLARE @Counter INT
DECLARE @CounterText VARCHAR(30)
DECLARE @MaxCount INT
DECLARE @Stepsize INT

SELECT @Counter = 0
SELECT @MaxCount = 40000
SELECT @Stepsize = 1000

WHILE @Counter <= @MaxCount
BEGIN
    SET @CounterText = CONVERT(VARCHAR(30), @Counter, 23)
    RAISERROR (@CounterText, 10, 1) WITH NOWAIT
    -- do useful stuff here
    SELECT @Counter = @Counter + @Stepsize
END

and counting down....
DECLARE @Counter INT
DECLARE @CounterText VARCHAR(30)
DECLARE @MinCount INT
DECLARE @Stepsize INT

SELECT @Counter = 40000
SELECT @MinCount = 0
SELECT @Stepsize = 1000

WHILE @Counter >= @MinCount
BEGIN
    SET @CounterText = CONVERT(VARCHAR(30), @Counter, 23)
    RAISERROR (@CounterText, 10, 1) WITH NOWAIT
    -- do useful stuff here
    SELECT @Counter = @Counter - @Stepsize
END