Hi
I just noticed that after adding the extra ID attribute to the prediction output, the prediction results are different. For example, I can't find any instance in the new output with the prediction of 0.32 that the third instance had before. Is this supposed to happen?
Thanks
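In case it helps, this is roughly what I did. I used the Explorer, but the sketch below is what I believe to be the Java-API equivalent of the two runs (the class name, the iris.arff path and the random seed of 1 are my assumptions, not something taken from the actual runs):

import java.util.Random;

import weka.classifiers.Evaluation;
import weka.classifiers.functions.LinearRegression;
import weka.core.Instances;
import weka.core.Utils;
import weka.core.converters.ConverterUtils.DataSource;
import weka.filters.Filter;
import weka.filters.unsupervised.attribute.AddID;
import weka.filters.unsupervised.attribute.MakeIndicator;

public class AddIdComparison {
    public static void main(String[] args) throws Exception {
        // Load iris and turn the last (class) attribute into a 0/1 indicator,
        // matching the relation name iris-...MakeIndicator-Clast-V1 below.
        Instances iris = DataSource.read("iris.arff");  // path is an assumption
        MakeIndicator indicator = new MakeIndicator();
        indicator.setOptions(Utils.splitOptions("-C last -V 1"));
        indicator.setInputFormat(iris);
        Instances data = Filter.useFilter(iris, indicator);
        data.setClassIndex(data.numAttributes() - 1);

        // Run 1: 10-fold CV with LinearRegression -S 0 -R 1.0E-8.
        evaluate(data);

        // Run 2: same thing after adding the ID attribute in first position,
        // matching ...AddID-Cfirst-NID below.
        AddID addId = new AddID();
        addId.setOptions(Utils.splitOptions("-C first -N ID"));
        addId.setInputFormat(data);
        Instances dataWithId = Filter.useFilter(data, addId);
        dataWithId.setClassIndex(dataWithId.numAttributes() - 1);
        evaluate(dataWithId);
    }

    private static void evaluate(Instances data) throws Exception {
        LinearRegression lr = new LinearRegression();
        lr.setOptions(Utils.splitOptions("-S 0 -R 1.0E-8"));
        Evaluation eval = new Evaluation(data);
        // Seed 1 is the Explorer default; assumed here.
        eval.crossValidateModel(lr, data, 10, new Random(1));
        System.out.println(eval.toSummaryString());
    }
}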
----------- Iris Setosa (original run, before adding the ID attribute)
Relation: iris-weka.filters.unsupervised.attribute.MakeIndicator-Clast-V1
Instances: 150
Attributes: 5
sepallength
sepalwidth
petallength
petalwidth
class
Test mode: 10-fold cross-validation
=== Classifier model (full training set) ===
Linear Regression Model
class =
0.079 * sepallength +
0.228 * sepalwidth +
-0.2561 * petallength +
0.1382
Time taken to build model: 0 seconds
=== Predictions on test data ===
inst#, actual, predicted, error
1 0 -0.065 -0.065
2 0 0.015 0.015
3 0 0.32 0.32
4 1 0.827 -0.173
5 0 0.102 0.102
6 0 -0.276 -0.276
7 0 0.175 0.175
8 0 -0.154 -0.154
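By the way, my understanding is that the predicted column is just the printed linear model applied to each row (the model shown is the one built on the full training set, so the per-fold cross-validation models can differ a little from it). Plugging a made-up setosa-like row into the coefficients above, purely as an arithmetic illustration (the 5.1 / 3.5 / 1.4 values are assumed, not taken from the runs):

public class ManualPrediction {
    public static void main(String[] args) {
        // Coefficients copied from the "full training set" model printed above;
        // the attribute values are made up for illustration only.
        double predicted = 0.079 * 5.1      // sepallength (assumed value)
                         + 0.228 * 3.5      // sepalwidth  (assumed value)
                         - 0.2561 * 1.4     // petallength (assumed value)
                         + 0.1382;          // intercept
        System.out.println(predicted);      // prints roughly 0.98
    }
}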
After adding the extra ID attribute through the AddID filter:
=== Run information ===
Scheme: weka.classifiers.functions.LinearRegression -S 0 -R 1.0E-8
Relation: iris-weka.filters.unsupervised.attribute.MakeIndicator-Clast-V1-weka.filters.unsupervised.attribute.AddID-Cfirst-NID
Instances: 150
Attributes: 6
ID
sepallength
sepalwidth
petallength
petalwidth
class
Test mode: 10-fold cross-validation
=== Classifier model (full training set) ===
Linear Regression Model
class =
0.079 * sepallength +
0.228 * sepalwidth +
-0.2561 * petallength +
0.1382
Time taken to build model: 0 seconds
=== Predictions on test data ===
inst#, actual, predicted, error (ID)
1 0 -0.077 -0.077 (141)
2 0 0.009 0.009 (146)
3 0 0.293 0.293 (65)
4 1 0.805 -0.195 (4)
5 0 0.06 0.06 (71)
6 0 -0.281 -0.281 (106)
7 0 0.191 0.191 (82)
8 0 -0.159 -0.159 (133)
9 1 0.988 -0.012 (6)
10 0 -0.104 -0.104 (144)
11 0 0.04 0.04 (78)
12 1 1.047 0.047 (37)
13 1 0.893 -0.107 (7)
14 0 0.047 0.047 (127)
15 1 0.824 -0.176 (2)
1 0 0.288 0.288 (80)
2 0 0.117 0.117 (55)
3 1 0.843 -0.157 (38)
4 1 0.952 -0.048 (32)
5 1 0.865 -0.135 (48)
6 0 -0.071 -0.071 (117)
7 0 0.221 0.221 (76)
8 0 0.109 0.109 (67)
9 1 0.991 -0.009 (41)
10 0 -0.122 -0.122 (103)
11 0 -0.019 -0.019 (150)
12 1 1.056 0.056 (23)
13 1 0.839 -0.161 (46)
14 1 0.914 -0.086 (44)
15 1 0.961 -0.039 (29)
1 0 -0.067 -0.067 (126)
2 0 0.385 0.385 (99)
3 0 -0.077 -0.077 (84)
4 0 0.239 0.239 (72)
5 1 1.104 0.104 (17)
6 1 0.991 -0.009 (22)
7 0 0.178 0.178 (87)
8 0 0.148 0.148 (60)
9 0 0.237 0.237 (83)
10 0 0.201 0.201 (93)
11 0 -0.023 -0.023 (69)
12 0 0.172 0.172 (70)
13 0 0.024 0.024 (124)
14 0 0.074 0.074 (74)
15 1 0.967 -0.033 (50)
1 0 -0.461 -0.461 (119)
2 1 0.835 -0.165 (24)
3 0 -0.232 -0.232 (108)
4 0 0.215 0.215 (66)
5 0 0.064 0.064 (77)
6 1 1.014 0.014 (19)
7 0 0.065 0.065 (128)
8 0 -0.077 -0.077 (121)
9 0 0.135 0.135 (92)
10 1 0.808 -0.192 (26)
11 0 0.025 0.025 (54)
12 0 -0.115 -0.115 (102)
13 0 0.193 0.193 (100)
14 0 -0.057 -0.057 (107)
15 0 -0.043 -0.043 (122)
1 0 0.097 0.097 (139)
2 1 0.959 -0.041 (18)
3 0 -0.073 -0.073 (143)
4 1 0.865 -0.135 (3)
5 1 0.944 -0.056 (40)
6 1 1.201 0.201 (34)
7 0 0.07 0.07 (111)
8 1 0.824 -0.176 (30)
9 0 0.186 0.186 (52)
10 1 1.048 0.048 (49)
11 0 -0.038 -0.038 (145)
12 1 0.826 -0.174 (10)
13 0 0.029 0.029 (140)
14 1 0.745 -0.255 (9)
15 1 0.818 -0.182 (13)
1 1 1.006 0.006 (11)
2 1 0.839 -0.161 (25)
3 0 0.219 0.219 (62)
4 1 1.01 0.01 (47)
5 1 1.032 0.032 (20)
6 1 0.883 -0.117 (21)
7 0 -0.27 -0.27 (109)
8 0 0.116 0.116 (61)
9 0 0.153 0.153 (81)
10 0 0.241 0.241 (89)
11 0 -0.14 -0.14 (130)
12 0 -0.394 -0.394 (123)
13 0 0.239 0.239 (94)
14 0 -0.073 -0.073 (113)
15 0 -0.09 -0.09 (147)
1 1 1.002 0.002 (5)
2 0 -0.239 -0.239 (135)
3 0 -0.054 -0.054 (110)
4 0 0.103 0.103 (63)
5 0 -0.076 -0.076 (138)
6 0 -0.124 -0.124 (118)
7 0 -0.156 -0.156 (104)
8 0 0.112 0.112 (95)
9 1 1.214 0.214 (16)
10 0 0.138 0.138 (53)
11 0 -0.201 -0.201 (131)
12 0 0.147 0.147 (59)
13 1 0.973 -0.027 (36)
14 0 -0.181 -0.181 (101)
15 0 0.045 0.045 (88)
1 1 0.607 -0.393 (42)
2 1 0.93 -0.07 (8)
3 0 0.188 0.188 (98)
4 0 -0.011 -0.011 (116)
5 0 0.126 0.126 (56)
6 0 -0.155 -0.155 (129)
7 1 0.961 -0.039 (28)
8 0 0.204 0.204 (97)
9 0 0.112 0.112 (64)
10 0 -0.165 -0.165 (120)
11 0 -0.01 -0.01 (137)
12 0 0.199 0.199 (68)
13 0 0.035 0.035 (149)
14 0 -0.138 -0.138 (105)
15 0 0.271 0.271 (86)
1 0 -0.002 -0.002 (148)
2 0 0.225 0.225 (58)
3 0 0.221 0.221 (96)
4 1 0.886 -0.114 (43)
5 0 0.033 0.033 (132)
6 1 1.13 0.13 (33)
7 0 0.128 0.128 (79)
8 1 0.939 -0.061 (45)
9 1 0.834 -0.166 (39)
10 0 0.199 0.199 (57)
11 0 0.048 0.048 (142)
12 0 0.117 0.117 (85)
13 1 1.198 0.198 (15)
14 1 0.82 -0.18 (31)
15 0 -0.027 -0.027 (125)
1 1 0.884 -0.116 (27)
2 1 0.888 -0.112 (12)
3 0 -0.011 -0.011 (134)
4 1 0.853 -0.147 (35)
5 0 0.108 0.108 (90)
6 1 0.869 -0.131 (14)
7 0 -0.132 -0.132 (115)
8 0 -0.154 -0.154 (114)
9 0 0.233 0.233 (51)
10 0 0.209 0.209 (75)
11 0 0.058 0.058 (91)
12 1 0.977 -0.023 (1)
13 0 -0.108 -0.108 (112)
14 0 -0.042 -0.042 (73)
15 0 -0.145 -0.145 (136)