I agree that it's worth searching for another model with some of (a lot of) the energy that goes into HATEOAS debates! Great topic :)
The biggest problem with the HATEOAS model, in my view, is that although it's incredibly flexible, it relies on having code which can interpret everything that's going on, on the fly. To take advantage of it, client code needs to be able to:
- follow arbitrary links
- interpret media types
- understand the semantics of the links & objects on the fly
- deal with change
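To make those four abilities concrete, here's a minimal sketch of what such a client has to do. The link format (HAL-style `_links`) and the media type names are hypothetical, and the "API" is just a dict, but the shape of the problem is the same:

```python
# Fake API: each "response" carries a media type and HAL-style _links.
# The link structure and vnd.* media types here are made up for illustration.
RESPONSES = {
    "/": {
        "media_type": "application/vnd.example+json",
        "body": {"_links": {"bookings": {"href": "/bookings"}}},
    },
    "/bookings": {
        "media_type": "application/vnd.example.booking+json",
        "body": {"items": ["booking-1"], "_links": {}},
    },
}

def fetch(href):
    """Stand-in for an HTTP GET."""
    return RESPONSES[href]

def follow(resource, rel):
    """Follow an arbitrary link by relation name (ability 1)."""
    href = resource["body"]["_links"][rel]["href"]
    return fetch(href)

def handle(resource):
    """Dispatch on media type (ability 2). The semantics (ability 3) is the
    hard part: the code still has to know what a 'booking' *means*."""
    if resource["media_type"] == "application/vnd.example.booking+json":
        return resource["body"]["items"]
    raise ValueError("unknown media type: " + resource["media_type"])

root = fetch("/")
bookings = handle(follow(root, "bookings"))
print(bookings)  # ['booking-1']
```

The brittle bit is ability 4: the moment the server invents a new relation name or media type, `handle` raises, and a human has to step in.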
Humans are awesome at this - their brains feed off all the cues they see and navigate the information at hand. They get to the semantics of what the API designer (and the system underneath) intended. [Note: I'm not talking about the semantics of GET/DELETE/PUT, which are clear, but the semantics of the data (e.g. Hotel, Car, Booking, Payment, ...).]
So if you could build a client to do this on the fly, the model would be amazing.
In reality you have to be able to write a robust bit of code which navigates an API (or a set of them) with predictable results.
I think it's agreed that an API which is just a list of method calls (the opposite end of the spectrum) is very limiting - it breaks every time you change something. So to my mind the search for an alternative framework should land somewhere in the middle between "hardcoded calls" and HATEOAS, such that:
- An API can change dynamically within a certain envelope
- It can be described/defined by a set of principles that don't have to explicitly enumerate every call, but in combination describe the set of API calls that can be made
- It is tied to media types / data models
So the API might have a more flexible structure, but there are certain things you, as a client developer, can depend on (a certain set of base media types, a certain set of patterns of calls / URL structures). That moves us from "I can throw anything I want at you at runtime" to "here are a bunch of things that, if you code against them, you won't get a total curveball at runtime".
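A sketch of what coding against such an "envelope" could look like - the media type names and URL patterns below are invented, but the idea is that the client agrees on a bounded space up front and rejects anything outside it, instead of failing unpredictably later:

```python
import re

# The agreed envelope: a fixed set of base media types and URL patterns.
# These specific names are hypothetical examples.
KNOWN_MEDIA_TYPES = {
    "application/vnd.hotel+json",
    "application/vnd.booking+json",
}
KNOWN_URL_PATTERNS = [
    re.compile(r"^/hotels(/[\w-]+)?$"),
    re.compile(r"^/bookings(/[\w-]+)?$"),
]

def within_envelope(url, media_type):
    """The server may vary its responses at runtime, but only inside this
    space -- which is what lets the client still be written robustly."""
    return (media_type in KNOWN_MEDIA_TYPES
            and any(p.match(url) for p in KNOWN_URL_PATTERNS))

print(within_envelope("/hotels/ritz-42", "application/vnd.hotel+json"))  # True
print(within_envelope("/anything-goes", "application/octet-stream"))     # False
```

The server is free to add a `/hotels/new-id` or vary what it links to at runtime; the client only breaks when the envelope itself changes.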
I hate to give this as an example, but CSS is somewhat like this: it's a language for defining what something should look like, with rules for the primitives and how they interact. That means that to describe what you want, you combine primitives; you don't have to enumerate everything explicitly.
Allowing an API to throw different things at a client at runtime is extremely challenging to deal with. In any "new model", an intermediate step should be that I can at least predict the space of those changes to some extent.
Second, the above refers to the API design framework, and that's often the center of the debate, but I think there is something even more important that would move the industry forward: higher-level shared media types (though maybe "data models" is a better word) - lots of them. Most media types now are things like JSON, images, CSS, ... and have nothing to do with application-level semantics.
What would be much more powerful are shared concepts of things like:
- hotel
- car
- rental
- house
- user
- currency
- cameras
- ...
This is because, as a client developer, the idea that a hotel is represented in the same way on all the travel sites, all the travel APIs, etc. that you might use (or at least in the same basic way, maybe with some extensions) is a fantastically large time saver. It allows you to build a much richer, more robust client. You can get on with pulling "hotels" out of these APIs, not "JSON". About the only common media type today that gets close to this is vCard.
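A tiny sketch of that payoff - two unrelated (entirely made-up) travel providers both speak the same core "hotel" model, so one parser handles both, and provider-specific extensions don't break it:

```python
from dataclasses import dataclass

@dataclass
class Hotel:
    """The hypothetical shared core model for a 'hotel' media type."""
    name: str
    city: str
    stars: int

def parse_hotel(payload):
    """One parser for every API that speaks the shared model.
    Extra, provider-specific keys are simply ignored."""
    return Hotel(payload["name"], payload["city"], payload["stars"])

# Responses from two unrelated providers -- same core shape.
provider_a = {"name": "Grand Hotel", "city": "Oslo", "stars": 5}
provider_b = {"name": "Budget Inn", "city": "Leeds", "stars": 2,
              "x-loyalty-tier": "gold"}  # provider-specific extension

print(parse_hotel(provider_a).city)   # Oslo
print(parse_hotel(provider_b).name)   # Budget Inn
```

Write `parse_hotel` once, reuse it against every API that has converged on the model - that's the time saver.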
There are plenty of semantic web efforts around these types of "higher-level" definitions, and it can be a hornet's nest to define them. There are also efforts in areas like travel to define data model standards (e.g. OTA). But for APIs to scale out to many hundreds of thousands, I think convergence at this level is incredibly important - probably more important than which verbs are being used and how. Also, to be clear, I'm not really that concerned about semantic-web-style hierarchies of concepts, inferences of one from another, etc. (although that would be nice), but plain and simple "this API uses THIS data model" - and oh, it's the same one I programmed against last week somewhere else.
The reason HTML+browsers work so well in HATEOAS mode is that they rely on the human brain to do the ultimate interpretation and navigation of what's being rendered. What APIs are often trying to do is get by without a human in the loop.
Unfortunately our ability to build such smart clients is limited, so any framework needs to be structured enough to allow code to be written.
I'm not down on HATEOAS - it's actually (I think) ultimately close to what we want, and I'm certain we can do better than what is done today (long lists of example URLs), but it has to remain within the realm of the doable for client programming.
steve.