Wednesday, 27 March 2013

SQL Server CPUs at 100%?

This was an issue I faced roughly two months after I started my DBA career.
While I was still learning the basics, this problem kept catching me out:
every now and again our live production SQL Server's CPUs would spike to 100%.
There was no consistency in when it would happen; it could be first thing in the morning, mid-afternoon or late evening.

Instinct told me that CPUs at 100% would point to a big process that probably shouldn't be running during peak times: a query with a lot of large table/index scans or hash joins, or the like. Task Manager showed that the biggest CPU hogger was SQL Server itself, so my primary focus was to investigate the procedures currently running.

I started simple with sp_who2 and sp_lock. These commands themselves took up to 30 seconds to return any output! When they did, the number of processes running was about normal, but the one thing that did stand out was the number of blocked processes.
Since a lot of procedures were blocking a number of other processes, I decided to look into sysprocesses for long-running queries, open transactions and high CPU time, to see if anything stood out as being the problem.
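As a rough sketch, a check along these lines against the legacy sysprocesses view can surface blocked sessions and CPU-heavy work (the thresholds here are illustrative only; on SQL Server 2005+ the sys.dm_exec_requests and sys.dm_exec_sessions DMVs are the modern equivalent):

```sql
-- Find blocked sessions, open transactions and high cumulative CPU
SELECT spid,
       blocked,            -- spid of the blocking session (0 = not blocked)
       open_tran,          -- open transaction count for the session
       cpu,                -- cumulative CPU time (ms)
       physical_io,
       lastwaittype,
       DATEDIFF(SECOND, last_batch, GETDATE()) AS seconds_since_last_batch
FROM   master..sysprocesses
WHERE  blocked <> 0
   OR  cpu > 10000         -- illustrative threshold, tune for your workload
ORDER BY cpu DESC;
```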


So, to recap:

  • There were no external / third-party processes running on this box, as it's a dedicated SQL Server; the process with high CPU was SQL Server itself.
  • There were no long-running queries or high CPU times that stood out; almost every procedure had an open transaction because everything was taking so long to process.

Sure enough, failing the server over and restarting it turned out to be the only option.
But you can't keep doing that every time it happens.
Checking the SQL logs showed nothing other than the fact that I'd failed the server over.

The next step was to track the server's virtual address space (VAS) over time, to see whether it was being exhausted or fragmented. The script below captures the server name, the maximum free block size, the total available VAS and the capture date.

USE DBName
GO

--Create table to capture results
IF EXISTS (SELECT 1 FROM sys.objects WHERE object_id = OBJECT_ID(N'VASAvailability') AND type = N'U')
    DROP TABLE dbo.VASAvailability
GO

CREATE TABLE dbo.VASAvailability
(
    VASAvailabilityID INT IDENTITY(1,1),
    ServerName        VARCHAR(100),
    TotalAvailMem_KB  INT,
    MaxFreeSize_KB    INT,
    CreatedDate       DATETIME
)
GO


--Create VAS view
IF EXISTS (SELECT 1 FROM sys.objects WHERE object_id = OBJECT_ID(N'vwVas_Summary') AND type = N'V')
    DROP VIEW dbo.vwVas_Summary
GO

CREATE VIEW dbo.vwVas_Summary AS
SELECT
    Size     = Vas_Dump.Size,
    Reserved = SUM(CASE (CONVERT(INT, Vas_Dump.Base) ^ 0)   WHEN 0 THEN 0 ELSE 1 END),
    Free     = SUM(CASE (CONVERT(INT, Vas_Dump.Base) ^ 0x0) WHEN 0 THEN 1 ELSE 0 END)
FROM
(
    --- combine all allocations according to allocation base; don't take
    --- into account allocations with a zero allocation base
    SELECT CONVERT(VARBINARY, SUM(region_size_in_bytes)) AS Size,
           region_allocation_base_address AS Base
    FROM sys.dm_os_virtual_address_dump
    WHERE region_allocation_base_address <> 0x0
    GROUP BY region_allocation_base_address
    UNION
    (
        --- we shouldn't be grouping allocations with a zero allocation base;
        --- just get them as is
        SELECT CONVERT(VARBINARY, region_size_in_bytes), region_allocation_base_address
        FROM sys.dm_os_virtual_address_dump
        WHERE region_allocation_base_address = 0x0
    )
) AS Vas_Dump
GROUP BY Size
GO
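Once the view exists, it can also be queried on its own for an ad-hoc look at how VAS regions are distributed by size (the ORDER BY cast is an assumption on my part, since Size is stored as VARBINARY):

```sql
-- One-off look at the VAS region distribution: reserved vs free
-- region counts for each region size
SELECT Size, Reserved, Free
FROM   dbo.vwVas_Summary
ORDER BY CAST(Size AS BIGINT) DESC;
```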


Below is the script I used to insert the results of the VAS view (total available and max free); all that was required was to create a job and schedule it to run every minute.

INSERT INTO dbo.VASAvailability (ServerName, TotalAvailMem_KB, MaxFreeSize_KB, CreatedDate)
SELECT @@SERVERNAME,
       SUM(CONVERT(BIGINT, Size) * Free) / 1024 AS [TotalAvailMem_KB],
       CAST(MAX(Size) AS BIGINT) / 1024         AS [MaxFreeSize_KB],
       GETDATE()
FROM   vwVas_Summary
WHERE  Free <> 0
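With the job in place, the captured history can then be reviewed for a trend; a MaxFreeSize_KB that steadily shrinks while TotalAvailMem_KB stays roughly level would suggest VAS fragmentation. A minimal sketch:

```sql
-- Review the captured VAS history, newest first, to spot a shrinking
-- largest-free-block over time
SELECT CreatedDate,
       TotalAvailMem_KB,
       MaxFreeSize_KB
FROM   dbo.VASAvailability
ORDER BY CreatedDate DESC;
```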



