Whether or not you should use an Activation as the last layer, and which kind of activation, depends on the range of values you want to output (for instance: if you need to output both negative and positive values, don't use ReLU, since it clips negatives to zero. And never use softmax for regression, since it outputs a probability distribution: values in (0, 1) that sum to 1).
If you aren't sure, it's probably better not to use an Activation as the last layer (a Dense layer with its default linear output would then be the last layer).
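To see why those activations are unsuitable, here is a small pure-Python sketch (not tied to any Keras version) of the output ranges that softmax and ReLU can actually produce:

```python
import math

def softmax(xs):
    # Subtract the max before exponentiating for numerical stability.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def relu(x):
    return max(0.0, x)

# Softmax squashes any input into a probability distribution:
# every output lies in (0, 1) and they sum to 1 -- so it can never
# represent a regression target like 37.5 or -2.0.
out = softmax([2.0, -1.0, 0.5])
print(out, sum(out))

# ReLU clips negatives to zero, so it cannot produce negative targets.
print(relu(-3.0))
```

A Dense layer with no activation has none of these constraints: its output is an unbounded linear combination, which is what a general regression target needs.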
Also, "show_accuracy" should not be set for a regression problem. The notion of accuracy only makes sense for a classification problem.