TIA!
- Christopher
Is this really important?
--
Miha Markic [MVP C#] - RightHand .NET consulting & development
www.rthand.com
SLODUG - Slovene Developer Users Group www.codezone-si.info
"Christopher Luther" <cluther@n0sp@m.xybernaut.com.invalid> wrote in message
news:%23VL9z8O...@TK2MSFTNGP14.phx.gbl...
Why are you asking this?
A DataSet consists of a collection of DataTable objects, each of which has a
collection of DataRow objects, each of which in turn has a collection of
item objects.
Cor
IMO, it's likely some theoretical limit that you are unlikely to reach...
Patrice
--
"Christopher Luther" <cluther@n0sp@m.xybernaut.com.invalid> wrote in message
news:%23VL9z8O...@TK2MSFTNGP14.phx.gbl...
There's LOTS of data, and we're hitting some hard limits within SQL Server.
The current thought goes something like "maybe we can eliminate SQL Server
as the repository of the normalized data and use an in-memory DataSet to
handle the normalized tables and such." But before expending $$$ on this
"thought", we'd like to know any defined DataTable limits.
> A DataSet consists of a collection of DataTable objects, each of which has
> a collection of DataRow objects, each of which in turn has a collection of
> item objects.
>
I fully understand the DataSet object model.
> Cor
>
We've currently got a "process" that takes de-normalized data from SQL
Server, transforms the data into normalized tables (dynamically created),
and then uses Excel to perform calculations on the data.
There's LOTS of data, and we're hitting some hard limits within SQL Server.
The current thought goes something like "maybe we can eliminate SQL Server
as the repository of the normalized data and use an in-memory DataSet to
handle the normalized tables and such." But before expending $$$ on this
"thought", we'd like to know any defined DataTable limits.
- Christopher
The .NET collection object model uses a standard int (Int32) value as the
counter for the collection. So if one were to use only int as the basis for
the number of rows, one would think that the maximum number of rows would be
something over 2 billion instead of a paltry 16 million.
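For what it's worth, the two figures line up with common index widths: Int32.MaxValue is just over 2 billion, while 16,777,216 is exactly 2^24. That suggests the 16 million figure comes from a 24-bit internal record index rather than the collection counter -- an assumption on my part, not something documented in this thread:

```python
# Comparing the two limits mentioned above.
int32_max = 2**31 - 1   # maximum value of a signed 32-bit int
rows_24bit = 2**24      # rows addressable by a 24-bit index

print(int32_max)    # 2147483647 -- "something over 2 billion"
print(rows_24bit)   # 16777216   -- the "paltry 16 million"
```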
So, obviously there is a hard limit for some reason, and we're trying to
ascertain whether there are any other hard limits before embarking on a
significant development effort.
- Christopher
I think that you will hit the computer's memory limit well before you hit
any dataset limit.
That's why I don't think it is important.
However, if you want to be on the safe side, why don't you test it yourself
by adding columns, tables and rows?
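A rough back-of-the-envelope calculation supports the memory point. Using the figures mentioned elsewhere in the thread (16 million rows, one nvarchar(2000) field stored as UTF-16, illustrative assumptions rather than measurements), a single fully populated column already dwarfs the address space of a 32-bit machine:

```python
# Back-of-the-envelope: in-memory size of one wide nvarchar column.
rows = 16_000_000              # the 16 million row figure from the thread
bytes_per_field = 2_000 * 2    # nvarchar(2000), 2 bytes per character (UTF-16)

total_bytes = rows * bytes_per_field
print(total_bytes)             # 64_000_000_000 bytes, about 59.6 GiB
```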
--
Miha Markic [MVP C#] - RightHand .NET consulting & development
www.rthand.com
SLODUG - Slovene Developer Users Group www.codezone-si.info
"Christopher Luther" <cluther@n0sp@m.xybernaut.com.invalid> wrote in message
news:eP8SFQZ...@TK2MSFTNGP12.phx.gbl...
Number of characters in a record (excluding Memo and OLE Object fields): 2,000
james
"Christopher Luther" <cluther@n0sp@m.xybernaut.com.invalid> wrote in message news:eorlFUZ...@TK2MSFTNGP12.phx.gbl...
>
> I think that you will hit computer's memory limit way before you'll hit
> any dataset limit.
That is what I already wrote yesterday in a message to Patrice. However,
he/she would probably have answered that he/she already knew that, which I
do not doubt.
Just to show that we have the same idea.
Cor
That limit is for Access; still, it is a good addition. It is not likely that a
DataTable would have many more columns than the underlying database table.
Cor
You may also want to explain what you are trying to do. It looks like you have
quite an unusual scenario for DataSets. Perhaps someone could suggest an
alternative approach for what you are trying to do...
Patrice
--
"Christopher Luther" <cluther@n0sp@m.xybernaut.com.invalid> wrote in message
news:eorlFUZ...@TK2MSFTNGP12.phx.gbl...
I really doubt DataSets will perform better than SQL Server. What is the
limit you reached in SQL Server?
Patrice
--
"Christopher Luther" <cluther@n0sp@m.xybernaut.com.invalid> wrote in message
news:eP8SFQZ...@TK2MSFTNGP12.phx.gbl...
As I stated previously, the data is in a de-normalized state. That is, the
parent record, call it a Form, has zero, one, or many child records, with
each record representing a Field on the Form. Each of these Field records
has a key value for the parent-child relationship and a single nvarchar
field that is defined as nvarchar(2000) -- that is, 2,000 characters (4,000 bytes).
As there is no way to know how many Fields a Form may have, transforming the
many Field records into a single row of data causes SQL to eventually
complain. Even with all the smarts the application has in trimming the
nvarchar fields and such, eventually we hit SQL Server's hard limit on row
size (8,060 bytes per row).
Hence the thought of transforming the de-normalized SQL data into local
DataTable objects that can be serialized to disk as an XML file.
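The pivot being described -- many key/value Field records folded into one wide row per Form -- is language-agnostic; a minimal sketch (with invented form IDs, field names, and values purely for illustration) might look like this:

```python
# Sketch: fold de-normalized (form_id, field_name, value) records
# into one wide row per form. All data below is invented for illustration.
records = [
    (1, "Name", "Luther"),
    (1, "Dept", "Engineering"),
    (2, "Name", "Markic"),
]

forms = {}
for form_id, field, value in records:
    # Each distinct field name becomes one more "column" on the form's row.
    forms.setdefault(form_id, {})[field] = value

print(forms[1])   # {'Name': 'Luther', 'Dept': 'Engineering'}
```

The width of each folded row grows with the number of distinct fields, which is exactly what trips a fixed per-row limit in the database; an in-memory table sidesteps that limit but pays for it in RAM instead.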
- Christopher
"Patrice" <nob...@nowhere.com> wrote in message
news:%23L0Dyvh...@TK2MSFTNGP15.phx.gbl...
- Christopher
"james" <jjames700ReMoVeMe at earthlink dot net> wrote in message
news:eK%23KeIbK...@TK2MSFTNGP10.phx.gbl...
And it's not the max rows I'm concerned about; it is the max row size and/or
max # of columns in the DataColumn collection.
- Christopher
"Patrice" <nob...@nowhere.com> wrote in message
news:%23c5n6bh...@TK2MSFTNGP12.phx.gbl...
"Christopher Luther" <cluther@n0sp@m.xybernaut.com.invalid> wrote in message
news:%23VL9z8O...@TK2MSFTNGP14.phx.gbl...