Monday 31 March 2014

TSQL : Login Failures

Looking at a migrated server, I found the SQL Server error logs were full of login failures. I wanted to retrieve the list of client IP addresses involved, so I imported the log messages into a table where they could be queried. The script below is how I achieved this.
IF NOT EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[SQLErrorlog]'))
BEGIN
CREATE TABLE [dbo].[SQLErrorlog](
    [LogDate] [datetime] NULL,
    [ProcessInfo] [varchar](10) NULL,
    [Text] [varchar](1000) NULL,
    [ID] [bigint] IDENTITY(1,1) NOT NULL
) ON [PRIMARY]
END
GO

IF NOT EXISTS (SELECT * FROM sys.objects WHERE NAME = 'PK_SQLErrorlog' AND Type = 'PK')
 BEGIN
 ALTER TABLE [dbo].[SQLErrorlog] ADD CONSTRAINT PK_SQLErrorlog PRIMARY KEY CLUSTERED (ID) ON [PRIMARY]
 END

IF NOT EXISTS (SELECT * FROM sys.indexes WHERE NAME = 'NCI_SQLErrorLog_Text')
 BEGIN
 CREATE NONCLUSTERED INDEX [NCI_SQLErrorLog_Text] ON [dbo].[SQLErrorlog] ([Text] ASC) ON [PRIMARY]
 END

INSERT INTO [dbo].[SQLErrorlog]
-- All Login Failures
EXEC sp_readerrorlog 0,1, 'Login Failed' 
-- Could put all Error log entries in the table like this
--EXEC sp_readerrorlog 0,1
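
Note - sp_readerrorlog reads a single log file at a time; the first parameter is the file number, with 0 being the current log. If failures span log rollovers, a loop like this sketch pulls in the archived logs too (assuming the default retention of six archive files) -

DECLARE @LogNo int
SET @LogNo = 0

WHILE @LogNo <= 6
BEGIN
 -- 1 = SQL Server error log (2 would read the Agent log)
 INSERT INTO [dbo].[SQLErrorlog]
 EXEC sp_readerrorlog @LogNo, 1, 'Login Failed'
 SET @LogNo = @LogNo + 1
END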

-- Retrieve Client that failed to connect
SELECT    [LogDate]
  , REPLACE(RIGHT(Text,1+LEN(Text)-CHARINDEX('CLIENT:',Text)),']','') AS Client 
FROM [dbo].[SQLErrorlog] 
WHERE [Text] LIKE 'Login Failed%'
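
To see which clients are generating the noise, the same expression can be grouped and counted - a quick variation on the query above -

-- Count of login failures per client
SELECT REPLACE(RIGHT(Text,1+LEN(Text)-CHARINDEX('CLIENT:',Text)),']','') AS Client
 , COUNT(*) AS Failures
FROM [dbo].[SQLErrorlog]
WHERE [Text] LIKE 'Login Failed%'
GROUP BY REPLACE(RIGHT(Text,1+LEN(Text)-CHARINDEX('CLIENT:',Text)),']','')
ORDER BY COUNT(*) DESC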

Links -
MSSQLTips : Reading the SQL Server log files using TSQL
MSDN : How to archive your sql error logs into a table

Wednesday 12 March 2014

TSQL - Nibble Update

I've often had to use a 'Nibble Delete' approach to remove a large volume of data. Working in small batches stops lock escalation and I/O requests from overwhelming the system. Today I needed the same approach for an UPDATE statement, to combine a couple of columns. A generic version of the script is below, and a delete sketch follows it.

-- Nibble update

DECLARE @RowsToUpdate int
SELECT @RowsToUpdate = COUNT(*) FROM TableToUpdate WHERE myColumn = ''

DECLARE @Progress int, @BatchRows int
SET @Progress = 0

WHILE @Progress < @RowsToUpdate
BEGIN

 -- Combine the two columns for the next batch of 1000 rows
 UPDATE TOP (1000) TableToUpdate
 SET myColumn = CAST(RTRIM(column1) AS varchar(150)) + ' \\ ' + CAST(RTRIM(column2) AS varchar(150))
 FROM TableToUpdate
 WHERE myColumn = ''

 SET @BatchRows = @@ROWCOUNT

 -- Stop if a batch touches no rows, guarding against an infinite loop
 IF @BatchRows = 0 BREAK

 SET @Progress = @Progress + @BatchRows

END
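
For comparison, the 'Nibble Delete' pattern mentioned above has the same shape. A minimal sketch, using a made-up TableToPurge and retention rule -

-- Nibble delete: remove qualifying rows in batches of 1000
WHILE 1 = 1
BEGIN
 DELETE TOP (1000) FROM TableToPurge
 WHERE PurgeDate < '20130101'

 -- Stop when a batch removes nothing
 IF @@ROWCOUNT = 0 BREAK
END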

Tuesday 4 March 2014

DLM - Data Lifecycle Management

Going through my bookshelf, I came across an older title, ‘High Performance SQL Server DBA’. Being 9 years old now, its scripts are no longer relevant. Before I consign the volume to recycling, I have made notes on the headings in Chapter 1, which summarise the book's ideology of ‘Data Lifecycle Management’ (DLM).
Data Lifecycle Management advocates that data can be actively managed throughout its life, from creation through to removal.

The 3 components of DLM -

  1. Data Management Service Layer
  2. Database Management Service Layer
  3. Performance Management Service Layer


Data Management Service Layer

                Logical Data Management

  •  Data Definition – What the data represents.
  •  Standards Control – Consistent use of data types and naming conventions.
  •  Documentation – Document code, standards etc.
  •  Metadata Management – Use a data dictionary for attribute & data type definitions.


                Physical Data Management

  • Physical Design
  • ETL / Data Integration
  • Data Profiling / Firewall
  • Data Archival


Database Management Service Layer
  • Database Management
  • Installation / Configuration
  • Storage Management
  • Schema Design / Management
  • Security Management
  • Backup / Recovery Management
  • Business Logic Construction / Analysis
  • Change Management
  • Job Management

Performance Management Service Layer

Performance Lifecycle Management

                Proactive Action
  • Physical Database Design
  • Proactive Performance testing
  • Change Management
                Monitoring - (3 Major Performance methodologies)
  • Bottleneck Analysis
  • Workload Analysis
  • Ratio Analysis
    (Applying these both Ad-hoc & 24/7)
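
As a present-day illustration of Bottleneck Analysis (my example, not the book's), SQL Server's wait statistics DMV shows where an instance spends its time waiting -

-- Top waits on the instance, cumulative since the last restart
SELECT TOP (10)
   wait_type
 , wait_time_ms
 , waiting_tasks_count
FROM sys.dm_os_wait_stats
WHERE wait_type NOT IN ('SLEEP_TASK','LAZYWRITER_SLEEP','BROKER_TASK_STOP')
ORDER BY wait_time_ms DESC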
                Analysis
                Recording database performance information for trend analysis.

                Forecasting
                Use of analysis data to understand future needs.

                Tuning
                Revisiting the database as needs change, usage differs and time progresses.
                Configuration, SQL code & physical design may all need to change over time.