System.Data.SqlClient.SqlException: Timeout expired. The timeout period
elapsed prior to completion of the operation or the server is not
responding.
The odd thing to me is that the error is happening partway through, after a
couple of thousand records have been processed, and on the Read() call of
the SqlDataReader. Through the logging capabilities in my app I can see each
iteration through the loop, and I'm getting about 25 iterations per second.
Why would a read from the SqlDataReader time out like this midway? It just
doesn't make sense to me that it would be looping fine and then suddenly
fail to read from the data reader. There is no time limit on how long I can
have a reader open, is there? It can't be a physical communication issue,
as it happens every time, and usually after about the same number of
iterations. SQL Server is running on my machine, as is the application
hitting it. When I start getting the timeouts there is no disk activity and
almost no CPU usage. I've tried increasing the CommandTimeout on the
SqlCommand to no avail (it just makes it take longer to fail, but it still
fails on the same basic iteration).
Here is a snippet of my logging. You can see it iterating and then within a
split second it fails on a timeout. Doesn't make sense to me.
2/13/2006 3:08:44 PM 3068 - ExtDocStat:
p-0aaba9ff-cd53-4656-b866-23e8362bcd2e
2/13/2006 3:08:44 PM 3069 - ExtDocStat:
p-37e842e5-c86a-4baf-a529-23eb82999d2c
2/13/2006 3:08:44 PM 3070 - ExtDocStat:
p-2df6cfea-04c5-467f-ae87-23ed60232a18
2/13/2006 3:08:44 PM Unexpected error in ExtractDocumentStatus:
System.Data.SqlClient.SqlException: Timeout expired. The timeout period
elapsed prior to completion of the operation or the server is not
responding.
Below is the code I'm using, simplified to show the important info (no
comments and reduced error handling). Am I doing something basic wrong
here?
-Brett-
public void ExtractDocumentStatus(string strConn)
{
    SqlConnection sqlConnRead = new SqlConnection(strConn);
    SqlConnection sqlConnWrite = new SqlConnection(strConn);
    try
    {
        sqlConnRead.Open();
        sqlConnWrite.Open();
        SqlCommand sqlCmd = new SqlCommand("SELECT * FROM Portfolio",
            sqlConnRead);
        SqlDataReader sqlRdr = sqlCmd.ExecuteReader();
        int ordID = sqlRdr.GetOrdinal("PortID");
        while (sqlRdr.Read())
        {
            SqlGuid id = sqlRdr.GetSqlGuid(ordID);
            string sql = string.Format(
                "UPDATE DocData SET Stat='OK' WHERE Pid='{0}'", id);
            SqlCommand sqlCmd2 = new SqlCommand(sql, sqlConnWrite);
            sqlCmd2.ExecuteNonQuery();
        }
        sqlRdr.Close();
    }
    finally
    {
        sqlConnRead.Close();
        sqlConnWrite.Close();
    }
}
Why are you doing it in a loop? I believe you could use just one SQL
statement to do this in one batch, something like:
UPDATE DocData SET Stat='OK' WHERE Pid IN (SELECT PortID FROM Portfolio)
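As a sketch of how that one statement could replace the loop in C#
(assuming the same table and column names as the original snippet):

```csharp
using System.Data.SqlClient;

public static int MarkDocumentsOk(string strConn)
{
    // One set-based UPDATE; the server joins Portfolio to DocData
    // in place, so no rows travel to the client at all.
    using (SqlConnection conn = new SqlConnection(strConn))
    using (SqlCommand cmd = new SqlCommand(
        "UPDATE DocData SET Stat='OK' " +
        "WHERE Pid IN (SELECT PortID FROM Portfolio)", conn))
    {
        conn.Open();
        return cmd.ExecuteNonQuery();  // number of rows updated
    }
}
```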
--
Val Mazur
Microsoft MVP
http://xport.mvps.org
"CuriousGeorge" <brettro...@newsgroup.nospam> wrote in message
news:eqD5HjCN...@TK2MSFTNGP09.phx.gbl...
As Val suggested, a set-based UPDATE is the best approach if this is
possible in your actual application. You still need to set the
CommandTimeout property to allow for the maximum time the command will
execute on the server. Personally, I usually set CommandTimeout to zero and
keep transactions as short as possible. 25 rows/sec seems a bit slow, so
check for blocking and perform tuning.
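For reference, CommandTimeout is set per SqlCommand and measured in seconds
(the default is 30). A hypothetical helper showing where it goes:

```csharp
using System.Data.SqlClient;

// 0 means wait indefinitely, so keep the surrounding
// transaction as short as possible.
public static int RunWithNoTimeout(SqlConnection conn, string sql)
{
    SqlCommand cmd = new SqlCommand(sql, conn);
    cmd.CommandTimeout = 0;  // default is 30 seconds
    return cmd.ExecuteNonQuery();
}
```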
--
Hope this helps.
Dan Guzman
SQL Server MVP
"CuriousGeorge" <brettro...@newsgroup.nospam> wrote in message
news:eqD5HjCN...@TK2MSFTNGP09.phx.gbl...
It's important to understand what SqlDataReader is actually
doing here. When you call Read() on a DataReader, the ADO.NET framework
isn't actually doing this:
App to DB: Get me one line of data
DB to App: Here's your data
App to DB: Get me one line of data
DB to App: Here's your data
App to DB: Get me one line of data
DB to App: Here's your data
What it really is doing is this:
App to DB: Get me all of the data
DB to App: I'm sending it all now! Buffer my output in another thread
so you don't have to make another round trip!
and that's it. Read() is actually pulling the next data row from the
connection's internal buffer, not making an extra round trip to the
database. This gives us an important clue -- your query is still
returning data while you're reading! So you get through a number of
rows before the query's timeout actually elapses, at which point you
get your exception. At least that would be my guess. How many records
are you dealing with in that base table? If there are too many, that's
probably what's causing it to blow up.
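If that guess is right, one workaround (a sketch reusing the names from the
original snippet) is to drain the reader into a local list first, close it,
and only then issue the updates:

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Data.SqlTypes;

public static void ExtractDocumentStatusTwoPass(string strConn)
{
    List<SqlGuid> ids = new List<SqlGuid>();
    using (SqlConnection conn = new SqlConnection(strConn))
    {
        conn.Open();

        // Pass 1: drain the reader completely so the SELECT finishes
        // (and releases its locks) before any UPDATE is issued.
        using (SqlCommand cmd = new SqlCommand(
            "SELECT PortID FROM Portfolio", conn))
        using (SqlDataReader rdr = cmd.ExecuteReader())
        {
            int ord = rdr.GetOrdinal("PortID");
            while (rdr.Read())
                ids.Add(rdr.GetSqlGuid(ord));
        }

        // Pass 2: run the updates on the now-idle connection,
        // with a parameter instead of string concatenation.
        foreach (SqlGuid id in ids)
        {
            using (SqlCommand upd = new SqlCommand(
                "UPDATE DocData SET Stat='OK' WHERE Pid=@id", conn))
            {
                upd.Parameters.AddWithValue("@id", id.Value);
                upd.ExecuteNonQuery();
            }
        }
    }
}
```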
-Brett-
"Val Mazur (MVP)" <grou...@hotmail.com> wrote in message
news:e2mEDDDN...@TK2MSFTNGP12.phx.gbl...
-Brett-
"Dan Guzman" <guzm...@nospam-online.sbcglobal.net> wrote in message
news:uWv100KN...@TK2MSFTNGP09.phx.gbl...
-Brett-
"Dave Markle" <"dma[remove_ZZ]ZZrkle"@gmail.dot.com> wrote in message
news:uS2Wt1LN...@TK2MSFTNGP10.phx.gbl...
Actually, the DataReader is used by the DataAdapter Fill method as well.
It's simply the low-level data access mechanism that's always been there
since the dawn of time--it's just been exposed for the first time since
DB-Library.
No, none of the data access interfaces (like DAO, OLE DB, ODBC, ADO or
ADO.NET) are designed to help you move large amounts of data. That's what
the DTS or BCP interfaces are for. And no, you should avoid application
designs that visit each row in a rowset for individual processing--that's
what stored procedures are for. Why move the data to the client for
processing when the server can do it in place?
--
____________________________________
William (Bill) Vaughn
Author, Mentor, Consultant
Microsoft MVP
INETA Speaker
www.betav.com/blog/billva
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
__________________________________
"CuriousGeorge" <brettro...@newsgroup.nospam> wrote in message
news:%23WNOE0v...@TK2MSFTNGP10.phx.gbl...
So what I am trying to do is inefficient, I'll freely admit that; my
reasoning is purely a lack of T-SQL expertise and a large amount of string
processing that I need to do on each record (which is simple in C#,
presumably not so simple in T-SQL).
Ignoring my inefficient design, there is nothing fundamentally wrong with
what I am trying to do, correct? So what might be causing my timeout
errors? Am I somehow hitting a deadlock between the DataReader and the
ExecuteNonQuery call on the same table? I really don't know where to go
from here.
-Brett-
"William (Bill) Vaughn" <billvaRe...@nwlink.com> wrote in message
news:%231Ct1nw...@TK2MSFTNGP09.phx.gbl...
So, why are you timing out? Check sp_lock to see if there are any locks
(remember, they might be row, page or table locks) that are preventing you
from continuing. Another approach might be to create a local DataTable with
the rowset and post-process those rows. Again, I still think a SQL
Server-only solution would be the best alternative. Stretching your TSQL
skills is good for your career too... ;)
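A minimal sketch of the local DataTable approach (table name assumed from
the earlier snippet):

```csharp
using System.Data;
using System.Data.SqlClient;

public static DataTable LoadPortfolio(string strConn)
{
    DataTable portfolio = new DataTable();
    // Fill opens and closes the connection itself when handed a
    // connection string, so no reader stays open afterward; the
    // rows can then be post-processed locally in C#.
    using (SqlDataAdapter da = new SqlDataAdapter(
        "SELECT * FROM Portfolio", strConn))
    {
        da.Fill(portfolio);
    }
    return portfolio;
}
```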
--
____________________________________
William (Bill) Vaughn
Author, Mentor, Consultant
Microsoft MVP
INETA Speaker
www.betav.com/blog/billva
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
__________________________________
"CuriousGeorge" <brettro...@newsgroup.nospam> wrote in message
news:u7$F2bzNG...@TK2MSFTNGP14.phx.gbl...