    new P1, .PerlArray
    set P1, 100
    bsr GETLEN
    set I0, P1[0]
    print "P1[0]="
    print I0
    print "\n"
    bsr GETLEN
    set I0, P1[10000]
    print "P1[10000]="
    print I0
    print "\n"
    bsr GETLEN
    end
GETLEN:
    set I0, P1
    print "length="
    print I0
    print "\n"
    ret
it prints:
length=100
P1[0]=0
length=100
P1[10000]=0
length=10001
fetching an element out of bounds changes the
length of the array. but should this really happen?
why does perlarray.pmc act like this:
    if (ix >= SELF->cache.int_val) {
        resize_array(interpreter, SELF, ix+1);
    }

instead of:

    if (ix >= SELF->cache.int_val) {
        value = pmc_new(INTERP, enum_class_PerlUndef);
        return value->vtable->get_integer(INTERP, value);
    }
is there any reason for this that I can't see, or does it
just need a patch?
cheers,
Aldo
__END__
$_=q,just perl,,s, , another ,,s,$, hacker,,print;
Because that's the way Perl's arrays work. Joys of autovivification.
--
Dan
--------------------------------------"it's like this"-------------------
Dan Sugalski even samurai
d...@sidhe.org have teddy bears and even
teddy bears get drunk
Only if the element fetched is being used in an lvalue context.
Just reading an array element will result in undef and no change to the
array internals.
Graham.
Good point. Read shouldn't extend, write should.
I have to disagree. The corresponding Perl script
does not show this behaviour:
my @a = (undef) x 100;
print "length=", scalar(@a), "\n";
print "a[0]=", $a[0], "\n";
print "length=", scalar(@a), "\n";
print "a[10000]=", $a[10000], "\n";
print "length=", scalar(@a), "\n";
the output is:
length=100
a[0]=
length=100
a[10000]=
length=100
It all depends. :-)
$\ = "\n";
$#a = 100;
print scalar(@a);
$x = $a[10000][0];
print scalar(@a);
101
10001
Perl has to autoviv if it has to drill down.
--
John Douglas Porter
Good point. But since we don't have multidimensional
array access right now (at least AFAIK), this seems
to be a non-issue. I don't know if p6 will autovivify
the way p5 does (and I hope not). IMHO, PerlArray.pmc
should behave the way I suggested: if the index is out
of bounds, return undef tout court. I have the patch
under my toes ;-)
> Aldo Calpini wrote:
>
>>I have to disagree. the corresponding Perl script
>>does not show this behaviour:
>>
>
> $\ = "\n";
> $#a = 100;
> print scalar(@a);
> $x = $a[10000][0];
This _writes_ to @a[10000] by generating the entry:
    P0 = 100
    P1 = new .PerlArray
    P1 = 0
    P0[10000] = P1
    I0 = P1[0]
> Perl has to autoviv if it has to drill down.
Not on reading.
I vote for the proposed patch.
leo
>It does on reading. I forget the eloquent explanation about the how or
>why, but all references bar the leftmost are vivified. (Even inside
>defined). In effect, all bar the last reference are in lvalue context -
>only the rightmost is rvalue.
The explanation is the part that would have been the most interesting...
Everyone: Is this just some unwanted but unavoidable behaviour that
results from the lvalue/rvalue processing or is there a real reason
behind it? (I don't see one, that's why I ask :-)
> > Perl has to autoviv if it has to drill down.
>
>
> Not on reading.
It does on reading. I forget the eloquent explanation about the how or
why, but all references bar the leftmost are vivified. (Even inside
defined). In effect, all bar the last reference are in lvalue context -
only the rightmost is rvalue.
$ cat ~/test/drilldown.pl
$\ = "\n";
$#a = 100;
print scalar(@a);
$x = $a[10000][0];
print scalar(@a);
__END__
$ perl ~/test/drilldown.pl
101
10001
Nicholas Clark
> >It does on reading. I forget the eloquent explanation about the how or
> >why, but all references bar the leftmost are vivified. (Even inside
> >defined). In effect, all bar the last reference are in lvalue context -
> >only the rightmost is rvalue.
>
> The explanation is the part that would have been the most interesting...
> Everyone: Is this just some unwanted but unavoidable behaviour that
> results from the lvalue/rvalue processing or is there a real reason
> behind it? (I don't see one, that's why I ask :-)
I am not entirely certain for Perl, but in C the [ ] operator dereferences a
pointer, and can be assigned to, so the [ ] operator yields a possible
lvalue.
If the resultant value is then dereferenced with another [ ] operator, you
have just used the first value in an lvalue context...
Thus each [ ] but the last forces lvalue context, and the final one waits to
see its own context...
Matt
PS: Apologies to Haegl for accidentally replying to him alone at first....
It's an artifact of the implementation, and most people'd be just as
happy if it went away.
IIRC, it's basically because Perl5 doesn't have multidimensional keys,
so $a[0]{"foo"}[24] becomes (in pseudo-perl5-assembler, assume each
op pops its args and pushes its results)
    push $a
    array_fetch 0
    hash_fetch "foo"
    array_fetch 24
So, in order that it not segfault in the middle by trying to do a fetch
out of a NULL hash or array, the first two fetches are called in lvalue
context (or rather, "this element must exist - create it if you have to"
context). With multidimensional keys, Perl6 can avoid this trap, but we
really still need an lvalue fetch - one reason being references. We need
to support

    $a = \@b[2];
    $$a = 4;

which only does a fetch on @b[2], but @b[2] had better autovivify, or the
deref may segfault.
-- BKS