On Tue, 2016-02-09, Jerry Stuckle wrote:
> On 2/8/2016 5:23 PM, Jorgen Grahn wrote:
>> On Fri, 2016-02-05, Jerry Stuckle wrote:
>>> On 2/4/2016 3:40 PM, Paavo Helde wrote:
>>>>
>>>> I have a base class with sizeof 24. There are zillions of objects of
>>>> this class so I have tried to get the size of this class to minimum.
>> ...
>>
>>> Exactly how many are "zillions"? Even if you have 1,000,000 of them at
>>> the same time (very doubtful - trying to process that many items would
>>> take a lot of time),
>>
>> I don't want to nitpick, but a million objects is not that uncommon,
>> nor do they necessarily take a lot of time to process.
>>
>> I don't believe you're making this mistake, but I sometimes meet
>> people who have absurdly low expectations in this area. Brute force,
>> modernish computers and the absence of total stupidity takes you far.
>> For example,
>>
>> $ time od /dev/urandom| head -1000000 | sort | md5sum
>>
>> Sorting those 1,000,000 lines of text takes 7 seconds on my machine,
>> which is ten years old. Calculating the checksum takes 200 ms.
>>
>>> you're only talking about 8 MB of additional memory.
> It depends on the randomness and the operations being performed. A
> simple sort of text could be quick. Other operations could take much
> longer. How many applications have you had that actually have 1,000,000
> objects in memory?
The one above, for example ... I didn't write sort(1) myself of
course, but I don't hesitate to operate on large data sets in Unix
pipelines.

And even I am frequently surprised when I see a badly written Perl
script chew up megabytes of data in almost no time at all.
> I don't think in almost 50 years of programming I have ever had it.
> But I have used databases with tens of billions of rows.
People do different things with computers, obviously.

I used to write simulators for mobile networks, and these were used to
simulate millions of cell phones doing stuff, each with one or more
data bearers (which also were modeled as objects, of course).

If you're asking because you still believe having a million objects is
a problem that needs special care, I can just say it never was, in my
experience.
I didn't have to do anything unusual to support that -- just normal
objects, and standard containers. I think at some point I analyzed
what used up memory in such an object, and split out things which were
shared by many similar CellPhone objects ... but that had more to do
with refactoring for clarity, and less with conservation of RAM.
> And, as Ian pointed out, your urandom may not be so random.
He did, but I fail to see how that's relevant. I just wanted to
produce a million lines of text, and forgot that I could use 'seq 1
1000000' instead of the head of an infinite sequence.