
 Database IO performance issue


Mathias_13
Starting Member

5 Posts

Posted - 2010-01-30 : 05:01:07
I have a problem with my new application. It uses five tables to work together with a PLC. The problem is that after about a week the computer has very poor I/O performance on the drive.

The database consists of 5 tables

(Realtime)
Column: Variable
Column: Status
Column: Write
Column: DataType
Column: Updated

(System) same structure as Realtime

(Log)
Column: TimeStamp
After TimeStamp come columns for whichever variables I want to log, so the number of columns can grow.

(CustomLog) same as Log but is only used on demand for short logging

(Alarms)
Column: Ind
Column: Triggered
Column: Ended
Column: Type
Column: Group
Column: Message

Then I have a trigger on both Realtime and System that fires on UPDATE. If Status and Write are not equal, it sends a notification to the application, which is used to write values to the PLC.

Until today I only have about 50 rows in Log and maybe 5 rows in Alarms, and these are the only tables that are kept; Realtime, System and CustomLog are dropped and created again when I restart the application.
But there is no improvement if I restart the application.
The only thing that helps is deleting the whole database, but then I lose my logged values and also my alarms.

Where I think the problem lies is the Realtime table, whose Status, Write and Updated columns are heavily updated because of variables changing in the PLC; some variables update every second. But I thought that dropping the table and creating it again should have solved that.

What I want to do is run some daily maintenance to correct this, but I have no clue what can be done.

Kristen
Test

22859 Posts

Posted - 2010-01-30 : 05:24:15
Probably you have some poorly performing queries.

I don't see any benefit in DROP and RECREATE of your tables - you just fragment the file. TRUNCATE or DELETE the data if you need to empty a table.

"i have a trigger for both realtime and system which triggers on update it then makes a notification to the application"

That sounds like a database AFTER UPDATE trigger is doing some communication action (outside of SQL) with something else, which I expect would be slow and fragile. A better approach would be to have the application POLL something. E.g. have the trigger update the "Most Recent Transaction Date" somewhere, and then have the application poll for "Is the [Most Recent Transaction Date] more recent than last time I checked?"
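To illustrate, the polling idea could be sketched like this (the LastChange table, the trigger name, and the Realtime columns are made up for illustration, not the poster's actual schema):

```sql
-- One-row table holding only the most recent change time (hypothetical name)
CREATE TABLE dbo.LastChange (LastUpdate datetime NOT NULL);
INSERT INTO dbo.LastChange (LastUpdate) VALUES (GETDATE());
GO
-- The trigger only stamps the time; no communication outside SQL
CREATE TRIGGER dbo.Realtime_Stamp ON dbo.Realtime
AFTER UPDATE
AS
    UPDATE dbo.LastChange SET LastUpdate = GETDATE();
GO
-- The application then polls on a timer:
-- SELECT LastUpdate FROM dbo.LastChange;
-- and only queries Realtime when that value is newer than the last one it saw.
```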

If you do not have routine housekeeping in place for your database then:

ALTER INDEX REBUILD will defragment your indexes, and generally tidy them up

and
UPDATE STATISTICS (use the WITH FULLSCAN option) will update all the statistics.

Those, alone, may speed up your queries. (If you are on SQL 2000, stop/start SQL Server after doing that [just this once!] if you can easily schedule the downtime. That will clear the caches etc., which will be a benefit if they are way out of whack. SQL 2005, and later, will automatically recompile the queries when the statistics are updated.)
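The two commands might look something like this (the table name dbo.Realtime is just an example taken from this thread; run the same pair for each table):

```sql
-- Defragment every index on the table and tidy it up
ALTER INDEX ALL ON dbo.Realtime REBUILD;

-- Rebuild statistics for the table by scanning all rows
UPDATE STATISTICS dbo.Realtime WITH FULLSCAN;

-- Or, for the whole database in one shot (sampled, not FULLSCAN):
EXEC sp_updatestats;
```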

If that doesn't fix it (enough) then we'll need to see some info about the Query Plan of your slow running query and then folk here will be able to advise on how to improve it.

Mathias_13
Starting Member

5 Posts

Posted - 2010-01-30 : 06:18:55
That sounds like a database AFTER UPDATE trigger is doing some communication action (outside of SQL) with something else, which I expect would be slow and fragile. A better approach would be to have the application POLL something. E.g. have the trigger update the "Most Recent Transaction Date" somewhere, and then have the application poll for "Is the [Most Recent Transaction Date] more recent than last time I checked?"

The trigger is used like this: if I update only the Write column with a value, and not Status, the trigger will notify the application with a string (so there is no need for the application to query) that I now want to write a value to the PLC, because Status is not equal to Write. The trigger also runs when a variable is updated in the PLC, but then it does not notify the application, because when a variable is updated in the PLC I update both Status and Write with the same value.
I don't update every variable every second; I use notifications from the PLC when a variable has changed, but some variables, like temperature, update every second because of physics. So the Last Transaction Date will probably always be very recent.


If you do not have routine housekeeping in place for your database then:

ALTER INDEX REBUILD will defragment your indexes, and generally tidy them up

and
UPDATE STATISTICS (use the WITH FULLSCAN option) will update all the statistics.


I have tried ALTER INDEX, but there was no improvement.
I will try UPDATE STATISTICS after a few days.


I will also skip dropping the tables and delete the records instead.

Kristen
Test

22859 Posts

Posted - 2010-01-30 : 06:23:44
Index Rebuild will also update statistics (on indexes), so I doubt UPDATE STATISTICS (which will also do non-index statistics) will help.

I don't understand your description of the Trigger. Please post the code.

Beyond that, you will need to post the Query Plan of a sample query that is running too slowly.

russell
Pyro-ma-ni-yak

5072 Posts

Posted - 2010-01-30 : 09:20:09
So you write a value to the PLC, then to the database, and you're using triggers on the tables to make sure the value didn't change in between?

I think the application code is poorly designed. If you know what value was written to PLC, then the application should write THAT value to the database.

If it is allowed to be different, then the application should handle the notification.

Also, how do you know it is an I/O issue? What counters are you measuring? A table with 50 records should never cause I/O problems. Are you using explicit transactions that are causing blocking?

If your tables stay under a few thousand records, then the indexes aren't the problem. In fact, if a table isn't going to grow large, you don't even need indexes other than to enforce uniqueness.

Mathias_13
Starting Member

5 Posts

Posted - 2010-01-30 : 14:24:48
I haven't measured anything to say it's an I/O issue, other than listening to the drive; navigating it is horrible after a week.
When I remove the database the problem is gone. Maybe saying it's an I/O issue is wrong; maybe I should say that the tables get physically fragmented on the disk because of all the updates I make in the Realtime table.

My SQL skills are not the best, so any suggestions on what to measure to find out exactly what the problem is are welcome.

I don't think the problem is the trigger, because if I stop the application there are no updates, so the trigger stops firing.
But the drive still sounds heavily fragmented and navigating it is a big problem.

Here is the trigger:

CREATE TRIGGER [dbo].[Write_Trigger_%table%]
ON [dbo].[%table%]
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @tableName sysname;
    SET @tableName = '%table%' + '!';

    DECLARE @Message xml;
    DECLARE @send bit;
    SET @send = 0;

    DECLARE @string nvarchar(1024);
    SET @string = @tableName;

    -- Copy the updated rows so they can be walked one by one
    DECLARE @tempTable TABLE
    (
        rowNumber int IDENTITY(0,1) NOT NULL,
        Variable varchar(30),
        [Status] varchar(30),
        [Write]  varchar(30)
    );
    INSERT INTO @tempTable (Variable, [Status], [Write])
        SELECT Variable, [Status], [Write] FROM Inserted;

    DECLARE @status varchar(30);
    DECLARE @write  varchar(30);
    DECLARE @count  int;
    SET @count = 0;
    DECLARE @numberOfRows int;
    SET @numberOfRows = (SELECT COUNT(Variable) FROM @tempTable);

    -- Build "Variable@Write?" pairs for every row where Status <> Write
    WHILE @count < @numberOfRows
    BEGIN
        SET @status = (SELECT [Status] FROM @tempTable WHERE rowNumber = @count);
        SET @write  = (SELECT [Write]  FROM @tempTable WHERE rowNumber = @count);

        IF @status != @write
        BEGIN
            SET @send = 1;
            SET @string = @string
                + (SELECT Variable FROM @tempTable WHERE rowNumber = @count)
                + '@'
                + (SELECT [Write] FROM @tempTable WHERE rowNumber = @count)
                + '?';
        END
        SET @count = @count + 1;
    END

    IF @send = 1
    BEGIN
        SET @Message = @string;

        -- Note: SET QUOTED_IDENTIFIER inside a trigger body has no effect;
        -- it is captured when the trigger is created, so it is omitted here.
        DECLARE @NotificationDialog uniqueidentifier;
        BEGIN DIALOG CONVERSATION @NotificationDialog
            FROM SERVICE ChangeNotifications
            TO SERVICE 'ChangeNotifications'
            ON CONTRACT [http://schemas.microsoft.com/SQL/Notifications/PostQueryNotification]
            WITH ENCRYPTION = OFF;

        SEND ON CONVERSATION @NotificationDialog
            MESSAGE TYPE [http://schemas.microsoft.com/SQL/Notifications/QueryNotification] (@Message);

        -- End the dialog so conversations don't accumulate in
        -- sys.conversation_endpoints (the original leaked one per notification);
        -- messages already sent are still delivered.
        END CONVERSATION @NotificationDialog;
    END
END
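As an aside, the WHILE loop in the trigger above could be replaced by a single set-based statement built directly from the Inserted pseudo-table. A sketch, using the FOR XML PATH concatenation idiom available in SQL 2005 (untested against the poster's schema, and keeping the '%table%' placeholder convention):

```sql
-- Concatenate "Variable@Write?" for every changed row in one SELECT
DECLARE @pairs nvarchar(1024);
SELECT @pairs = (
    SELECT Variable + '@' + [Write] + '?'
    FROM Inserted
    WHERE [Status] <> [Write]
    FOR XML PATH('')
);
-- @pairs is NULL when no row differs, so only send when it is not
IF @pairs IS NOT NULL
    SET @pairs = '%table%' + '!' + @pairs;
```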


Just so you understand how the application works, if I want to write to the PLC:
1 -> I put a value in the Write column of the Realtime table.
2 -> The application gets a notification and writes to the PLC.
3 -> The PLC notifies the application that the value has changed.
4 -> The application writes the value to both the Status and Write columns in Realtime; this way I know the value was written as it should be.

-> If the PLC updates a value, only steps 3 and 4 occur.

Steps 1 and 2 occur very rarely; it's steps 3 and 4 that happen almost every second.
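In SQL terms the handshake amounts to two kinds of updates (the table and column names are from this thread; the variable name and value are made up):

```sql
-- Step 1: a write is requested; Status and Write now differ,
-- so the trigger fires and notifies the application.
UPDATE dbo.Realtime
SET [Write] = '42', Updated = GETDATE()
WHERE Variable = 'Temperature';

-- Steps 3-4: the PLC confirms; both columns get the same value,
-- so the trigger fires but sends nothing.
UPDATE dbo.Realtime
SET [Status] = '42', [Write] = '42', Updated = GETDATE()
WHERE Variable = 'Temperature';
```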

russell
Pyro-ma-ni-yak

5072 Posts

Posted - 2010-01-30 : 14:30:20
You can hear when the drive is fragmented? I want you to work for me!

russell
Pyro-ma-ni-yak

5072 Posts

Posted - 2010-01-30 : 14:33:28
Open Perfmon and measure CPU % utilization and Physical Disk average disk queue length while the application is running.

Queue length should stay consistently under 2. On CPU you'll get a lot of opinions, but I like to see less than 35% on a regular basis. Quick spikes up into the 80s are OK as long as it isn't hanging around that high.

Also measure SQL Server Buffer Manager, Page Life Expectancy.

There are many other useful counters but those are a good start.
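If Perfmon is awkward, the same Buffer Manager counters can also be read from inside SQL Server via a DMV that exists in SQL 2005 and later, including Express (requires VIEW SERVER STATE permission):

```sql
-- Page Life Expectancy straight from the engine's performance counters
SELECT object_name, counter_name, cntr_value AS seconds
FROM sys.dm_os_performance_counters
WHERE counter_name = 'Page life expectancy'
  AND object_name LIKE '%Buffer Manager%';
```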

Is this running on a desktop or a server?

Mathias_13
Starting Member

5 Posts

Posted - 2010-01-30 : 14:51:05
I will test Perfmon to check whether I have a disk queue.
SQL Server Buffer Manager I can't find; I'm using SQL Server 2005 Express, maybe that is why?

The computer is a dedicated server which is running MediaPortal
(www.team-mediaportal.com) for streaming TV. The only thing MediaPortal does on the same drive is keep a SQL database for EPG data and settings.

One thing I wonder about is my Realtime table, which I think is the problem. The table is filled with all variables and values when the application starts; after that it doesn't grow any more. After this I update the table very often with new values. How will SQL put the data on the disk: will it overwrite the old value, or will it use space somewhere else on the disk? What I read on the internet is that UPDATE statements fragment the disk, and my application does that a lot.
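One way to answer that with numbers instead of guesswork is the index physical-stats DMV (available in SQL 2005; dbo.Realtime is the table name from this thread):

```sql
-- Logical fragmentation of every index on dbo.Realtime
SELECT i.name AS index_name,
       s.avg_fragmentation_in_percent,
       s.page_count
FROM sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID('dbo.Realtime'),
                                    NULL, NULL, 'LIMITED') AS s
JOIN sys.indexes AS i
  ON i.object_id = s.object_id AND i.index_id = s.index_id;
```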

Mathias_13
Starting Member

5 Posts

Posted - 2010-02-07 : 15:38:52
I'm now running UPDATE STATISTICS on a daily basis for all tables; this seems to be the solution to my problem.
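For reference, Express has no SQL Server Agent, so a daily run like this is typically scheduled through Windows Task Scheduler calling sqlcmd; the instance and database names below are assumptions, not from the thread:

```sql
-- Saved as e.g. update_stats.sql and scheduled daily via Task Scheduler with:
--   sqlcmd -S .\SQLEXPRESS -d PlcDatabase -i update_stats.sql
EXEC sp_updatestats;  -- refreshes statistics on every table that needs it
```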

Kristen
Test

22859 Posts

Posted - 2010-02-08 : 02:30:24
Excellent! Glad to hear it.
   
