I think there are problems with how these tests are run. Is the speaking part actually examined, and if so, how is it marked? This is genuinely difficult to do. Are you marking the taciturn candidate down for his personality, for example, rather than for his linguistic skills? Which is better: fluent but poor English, or hesitant but high-quality English? Or do they use some sort of substitute for the spoken test, along the lines of "if the candidate got 80% on the comprehension section, then we can reasonably grant them 50% on the speaking section after a very short interview, or without testing at all"? And how well are the examiners trained? Examining a spoken test really is very difficult.
Many years ago, I was part of a group from the CLA who were trained (over several days) to give the test, and several of my colleagues later went on to write some of the tests themselves. They consisted of a scenario in which, for example, you might be a salesman for a pharmaceutical company selling to Africa. You have to speak to a potential buyer over the phone, exchange emails with him, read up on import regulations for the country in question, negotiate prices and conclude with a letter setting out the agreement reached. These are some of the things a test might include. So it is a complete situation involving all the linguistic skills. There was a complex grid to help the examiner reach a conclusion about the candidate's level.
This kind of test presupposes that the candidate will actually want to do this sort of activity, so for the conductor wanting to speak with his orchestra, or the history researcher wanting to discuss the Middle Ages with a colleague in Denmark, it wouldn't really be pertinent, of course.
Another drawback is that the candidate has to pay to take the test; at the moment it's €100. This is always a problem. If the test is serious, it costs money, and the more serious it is, the more it costs to administer.