
DataTable Limitations


Christopher Luther

Mar 14, 2005, 6:40:34 PM
I've found an MSDN reference to the fact that an ADO.NET DataTable is limited
to 16,777,216 rows of data [1], but what I cannot find is a reference for
the maximum columns per DataTable, maximum bytes per DataRow, etc. Is this
information documented anywhere?


TIA!

- Christopher

[1]
http://msdn.microsoft.com/library/default.asp?url=/library/en-us/cpref/html/frlrfSystemDataDataTableClassTopic.asp

Miha Markic [MVP C#]

Mar 15, 2005, 3:15:53 AM
Hi Christopher,

Is this really important?

--
Miha Markic [MVP C#] - RightHand .NET consulting & development
www.rthand.com
SLODUG - Slovene Developer Users Group www.codezone-si.info

"Christopher Luther" <cluther@n0sp@m.xybernaut.com.invalid> wrote in message
news:%23VL9z8O...@TK2MSFTNGP14.phx.gbl...

Cor Ligthert

Mar 15, 2005, 3:34:57 AM
Christopher,

Why are you asking this?

The DataSet consists of a collection of DataTables (objects), each of which
has a collection of DataRows (objects), each of which has a collection of
items (objects).

Cor


Patrice

Mar 15, 2005, 6:10:34 AM
I would try adding new a1, a2, etc. fields until it fails. If you have a
particular goal in mind, I would check that you can reach it.

IMO, it's likely some theoretical limit that you are unlikely to reach...

Patrice


--

"Christopher Luther" <cluther@n0sp@m.xybernaut.com.invalid> a écrit dans le
message de news:%23VL9z8O...@TK2MSFTNGP14.phx.gbl...

Christopher Luther

Mar 15, 2005, 2:19:44 PM

"Cor Ligthert" <notmyfi...@planet.nl> wrote in message
news:%23ikKWnT...@TK2MSFTNGP10.phx.gbl...

> Christopher,
>
> Why are you asking this?
>
We've currently got a "process" that takes de-normalized data from SQL
Server, transforms the data into normalized tables (dynamically created),
and then uses Excel to perform calculations on the data.

There's LOTS of data, and we're hitting some hard limits within SQL Server.
The current thought goes something like "maybe we can eliminate SQL Server
as the repository of the normalized data and use an in-memory DataSet to
handle the normalized tables and such." But before expending $$$ on this
"thought", we'd like to know any defined DataTable limits.

> The DataSet consists of a collection of DataTables (objects), each of
> which has a collection of DataRows (objects), each of which has a
> collection of items (objects).
>

I fully understand the DataSet object model.

> Cor
>


Christopher Luther

Mar 15, 2005, 2:20:20 PM

"Miha Markic [MVP C#]" <miha at rthand com> wrote in message
news:exET2cTK...@TK2MSFTNGP15.phx.gbl...

> Hi Christopher,
>
> Is this really important?
>
Yes, it is important.

We've currently got a "process" that takes de-normalized data from SQL
Server, transforms the data into normalized tables (dynamically created),
and then uses Excel to perform calculations on the data.

There's LOTS of data, and we're hitting some hard limits within SQL Server.
The current thought goes something like "maybe we can eliminate SQL Server
as the repository of the normalized data and use an in-memory DataSet to
handle the normalized tables and such." But before expending $$$ on this
"thought", we'd like to know any defined DataTable limits.

- Christopher

Christopher Luther

Mar 15, 2005, 2:27:31 PM

"Patrice" <nob...@nowhere.com> wrote in message
news:%23PDud%23UKFH...@TK2MSFTNGP15.phx.gbl...

> I would try adding new a1, a2, etc. fields until it fails. If you have a
> particular goal in mind, I would check that you can reach it.
>
> IMO, it's likely some theoretical limit that you are unlikely to reach...
>
> Patrice
>
If there really is only some theoretical limit (as one would hope), then why
such an odd number as 16,777,216 for the maximum number of rows?

The .NET collection object model uses a standard int (Int32) value as the
counter for the collection. So if one were to use only int as the basis for
the number of rows, one would think that the maximum number of rows would be
something over 2 billion instead of a paltry 16 million.
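For what it's worth, 16,777,216 is exactly 2^24, so my guess -- and it is
only a guess, since the internal row storage isn't documented -- is that the
row index is packed into 24 bits inside the DataTable rather than using the
full Int32 range:

// Quick arithmetic check: the documented row limit is exactly 2^24.
using System;

class RowLimitCheck
{
    static void Main()
    {
        Console.WriteLine(16777216 == (1 << 24));   // prints True
    }
}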

So, obviously there is a hard limit for some reason, and we're trying to
ascertain whether there are any other hard limits before embarking on a
significant development effort.

- Christopher

Miha Markic [MVP C#]

Mar 15, 2005, 2:41:20 PM
Hi Christopher,

I think that you will hit the computer's memory limit way before you'll hit
any dataset limit.
That's why I don't think it is important.
However, if you want to be on the safe side, why don't you test it yourself
by adding columns, tables and rows?
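Something along these lines would tell you quickly (just a rough sketch that
I haven't run; whether it throws or simply runs out of memory first is an
open question):

// Rough probe: keep adding string columns until the DataTable refuses.
using System;
using System.Data;

class ColumnLimitProbe
{
    static void Main()
    {
        DataTable table = new DataTable("Probe");
        int count = 0;
        try
        {
            while (true)
            {
                table.Columns.Add("c" + count, typeof(string));
                count++;
            }
        }
        catch (Exception ex)
        {
            Console.WriteLine("Stopped after {0} columns: {1}",
                              count, ex.GetType().Name);
        }
    }
}

The same kind of loop over Rows.Add or Tables.Add would probe the row and
table limits, with the caveat that memory will probably give out first.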

--
Miha Markic [MVP C#] - RightHand .NET consulting & development
www.rthand.com
SLODUG - Slovene Developer Users Group www.codezone-si.info

"Christopher Luther" <cluther@n0sp@m.xybernaut.com.invalid> wrote in message

news:eP8SFQZ...@TK2MSFTNGP12.phx.gbl...

james

Mar 15, 2005, 5:55:46 PM
According to Access Help, the maximum number of fields per table is 255.
Just do a search for Access Specifications.
james

"Christopher Luther" <cluther@n0sp@m.xybernaut.com.invalid> wrote in message news:eorlFUZ...@TK2MSFTNGP12.phx.gbl...

james

Mar 15, 2005, 5:58:00 PM
Sorry for the follow-up to my previous response, but the max row/record
length is listed like this:

Number of characters in a record (excluding Memo and OLE Object fields): 2,000


james

"Christopher Luther" <cluther@n0sp@m.xybernaut.com.invalid> wrote in message news:eorlFUZ...@TK2MSFTNGP12.phx.gbl...
>

Cor Ligthert

Mar 16, 2005, 2:09:43 AM
Miha,

> I think that you will hit the computer's memory limit way before you'll
> hit any dataset limit.

That is what I already wrote yesterday in a message to Patrice. However,
he/she would probably have answered that he/she already knew that, which I
do not doubt.

Just to show that we have the same idea.

Cor


Cor Ligthert

Mar 16, 2005, 2:12:54 AM
James,

That is for Access, but it is a good addition nonetheless; it is not likely
that a DataTable would have many more columns than there are in the database.

Cor


Patrice

Mar 16, 2005, 5:57:39 AM
Keep in mind that DataSets are for disconnected data. I can hardly see a
realistic scenario where you would want to keep more than 10 million rows
in memory at the same time.
Have you tried to reach the other limits? Please let us know about your
findings.

You may also want to explain what you are trying to do. It looks like you
have quite an unusual scenario for DataSets. Perhaps someone could suggest
an alternate approach for what you are trying to do...

Patrice

--

"Christopher Luther" <cluther@n0sp@m.xybernaut.com.invalid> a écrit dans le

message de news:eorlFUZ...@TK2MSFTNGP12.phx.gbl...

Patrice

Mar 16, 2005, 6:33:12 AM
Saw this after my previous post.

I really doubt DataSets will perform better than SQL Server. What is the
limit you reached in SQL Server?

Patrice
--

"Christopher Luther" <cluther@n0sp@m.xybernaut.com.invalid> a écrit dans le

message de news:eP8SFQZ...@TK2MSFTNGP12.phx.gbl...

Christopher Luther

Mar 16, 2005, 4:30:42 PM
It's not a performance issue in SQL Server, Patrice. It's SQL Server's
physical limitation on record size that is killing us.

As I stated previously, the data is in a de-normalized state. That is, the
parent record, call it a Form, has zero, one, or many child records, with
each record representing a Field on the Form. Each of these Field records
has a key value for the parent-child relationship and a single nvarchar
field defined as nvarchar(2000) -- that is, up to 2,000 characters, or
4,000 bytes of storage.

As there is no way to know how many Fields a Form may have, transforming the
many Field records into a single row of data eventually causes SQL Server to
complain. Even with all the smarts the application has in trimming the
nvarchar fields and such, eventually we hit the hard limit on SQL Server's
row size (8,060 bytes).

Hence the thought of transforming the de-normalized SQL data into local
DataTable objects that can be serialized to disk as an XML file.
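Roughly what we have in mind, in sketch form (the column names, file name,
and column count below are made up for illustration; the real schema would
be built from the Field records at run time):

// Sketch: build the normalized Form table dynamically, one column per
// Field, then persist it to disk as XML instead of storing the wide row
// in SQL Server.
using System;
using System.Data;

class NormalizeSketch
{
    static void Main()
    {
        DataTable form = new DataTable("Form");
        form.Columns.Add("FormId", typeof(int));

        // One column per Field found in the de-normalized data; in the
        // real process these are added as the Field records are read,
        // not in a fixed loop.
        for (int i = 0; i < 500; i++)
        {
            form.Columns.Add("Field" + i, typeof(string));
        }

        DataRow row = form.NewRow();
        row["FormId"] = 1;
        row["Field0"] = "some value";
        form.Rows.Add(row);

        // Serialize the table (schema included) to an XML file via a
        // wrapping DataSet.
        DataSet ds = new DataSet("Forms");
        ds.Tables.Add(form);
        ds.WriteXml("Forms.xml", XmlWriteMode.WriteSchema);
    }
}

Whether hundreds of wide string columns per row stays workable in memory is
exactly the kind of limit we're trying to pin down before committing to this.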

- Christopher


"Patrice" <nob...@nowhere.com> wrote in message

news:%23L0Dyvh...@TK2MSFTNGP15.phx.gbl...

Christopher Luther

Mar 16, 2005, 4:33:16 PM
We're not using Access, James. We want to extract de-normalized data
from SQL Server and normalize the data into local .NET DataTable objects
(see the other message threads on this subject for more detail on this matter).

- Christopher


"james" <jjames700ReMoVeMe at earthlink dot net> wrote in message
news:eK%23KeIbK...@TK2MSFTNGP10.phx.gbl...

Christopher Luther

Mar 16, 2005, 4:36:12 PM
See the other message threads on this matter for details on what we're
thinking about doing.

And it's not the max rows I'm concerned about; it is the max row size and/or
max # of columns in the DataColumn collection.

- Christopher


"Patrice" <nob...@nowhere.com> wrote in message

news:%23c5n6bh...@TK2MSFTNGP12.phx.gbl...

Christopher Luther

Mar 18, 2005, 2:37:21 PM
Is there not anyone out here in Microsoft Land who knows the answer to
the question below?


"Christopher Luther" <cluther@n0sp@m.xybernaut.com.invalid> wrote in message

news:%23VL9z8O...@TK2MSFTNGP14.phx.gbl...
