When it comes to numbers, you should generally use the type "num"
unless you have strong reasons to pick a more specific type. For
example, if you're doing a for-loop, it makes sense to use "int". In
most other cases, you should use "num".
If you want to do truncating division, you should use the ~/ operator instead.
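For readers coming from Java, Dart's ~/ corresponds roughly to ordinary integer division there; a minimal Java sketch of truncation toward zero (class name is mine):

```java
public class TruncatingDivision {
    public static void main(String[] args) {
        // Java's int / int already truncates toward zero, which is the
        // behavior Dart exposes explicitly through the ~/ operator.
        int v = 35;
        int w = v / 35;   // 35 / 35 == 1
        int q = 7 / 2;    // truncates 3.5 down to 3
        int r = -7 / 2;   // truncates toward zero: -3, not -4
        System.out.println(w + " " + q + " " + r);
    }
}
```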
On Mon, Jan 9, 2012 at 10:57, Tom <tom....@gmail.com> wrote:
> int v = 35;
> int w = v / 35; //Error since v / 35 is a double
Actually, this is not an error. It is a type warning.
Cheers,
Peter
I understand what you mean by "type annotation", but it raises the
question of why "int" was introduced in the first place?
On Thursday, January 12, 2012, Tom <tom....@gmail.com> wrote:
> After trying a couple days and showing the code to my colleagues, I
> still think it is not a good idea. Using ~/ is handy, but it is really
> counter-intuitive. I strongly suggest:
>
> 1) An integer divided by an integer shall be an integer.
> 2) (v is int) && (v is double) shall always be false in any machine,
> including running as JS.
>
> The implementation is straightforward (GWT already does a good job,
> doesn't it?). Of course, there is some performance penalty, but the
> implementation is something that can be enhanced over time, especially
> when the Dart VM is available in different browsers. However, the
> language spec stays forever.
>
GWT is trying to implement Java. Java is a statically typed language
and relies on static types to determine what to do. For example, if I
write the following in Java:
double x = 5;
double y = 2;
double z = x / y;
The compiler will first generate code that converts the integer 5 to a
double (for large integer values, this conversion can silently lose
precision). Then it generates a conversion of the integer 2 to a double
(another potential loss of precision). Then, based on the static types,
the compiler generates a double division, and when you run the program,
z will end up holding the double value 2.5.
Had I written:
double z = 5 / 2;
The value in z will be the double value 2.0. This is because the
right-hand side of the assignment is a pure integer division, so the
compiler generates a truncating integer division followed by a
conversion from integer to double (silent loss of precision again).
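The two Java cases above can be checked directly; a small sketch (class name is mine):

```java
public class JavaDivision {
    public static void main(String[] args) {
        // Case 1: both operands are doubles, so the compiler emits a
        // double division and the fractional part is kept.
        double x = 5;
        double y = 2;
        double z1 = x / y;
        System.out.println(z1);  // prints 2.5

        // Case 2: 5 / 2 is a truncating integer division; only the
        // result (2) is then widened to a double.
        double z2 = 5 / 2;
        System.out.println(z2);  // prints 2.0
    }
}
```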
In Dart, all these versions behave the same:

double x = 5;
double y = 2;
double z = x / y;

int x = 5;
int y = 2;
double z = x / y;

num x = 5;
num y = 2;
num z = x / y;

var x = 5;
var y = 2;
var z = x / y;

double z = 5 / 2;
In each case, the value of z will be the double 2.5.
> Nowadays, programmers have to deal with different languages almost on
> a daily basis. Imagine what will happen if one of your less-experienced
> team members has to work on both the server and the client side. If he
> has to watch someone's video and remind himself, when switching tasks,
> of such a basic feature, there must be something wrong with the spec.
I think you assume that inexperienced programmers will find C
semantics intuitive. I don't agree with this assumption. I don't think
that C semantics are intuitive: all the way through the school system,
you learned that 5 / 2 = 2.5. C and Java do not agree with what you
learned in grade school, and that is not intuitive.
> Sorry for the strong wording. But, after programming in Java and JS
> for years, I believe Dart has a good chance. Most of the differences
> are more or less a matter of like or dislike, but I'm afraid the
> integer issue will turn off a lot of people.
I think you raise a good point here: programmers with experience in C,
C++, Java, or C# may find the Dart number semantics difficult to
understand.
On the other hand, I think that JavaScript programmers will have no
problems at all with the number semantics.
I assume that we will do usability studies of these kinds of things.
Cheers,
Peter