I believe you want to get HW decode for h264 video streams in HTML5 <video>; if that's the case, then good news: it's already hooked up using
GpuVideoDecoder in the renderer talking to
OmxVideoDecodeAccelerator in the GPU process. If you have an OpenMAX IL 1.1.2 compliant library for your platform, you might just need to symlink it to a particular name or edit a single line of
code to point to your platform's name for it. You can tell whether HW decoding is enabled either by enabling vlogging in a Debug build, or by loading chrome://histograms/Media.Gp twice after playing a <video>. If no histogram shows up on that page, the
renderer didn't even try initializing HW decode; a "0" bucket counts successful HW decode Initialize() calls; non-"0" buckets with non-zero counts mean other
errors were encountered.
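A rough sketch of the two steps above. Note the soname the GPU process dlopen()s, and the exact paths, depend on your Chromium revision and platform; "libOmxCore.so" and the vendor library path below are assumed placeholders, so check omx_video_decode_accelerator.cc for the real name:

```shell
# Point the expected OpenMAX IL soname at your vendor's implementation
# ("libOmxCore.so" is an assumed placeholder; verify the name in
# omx_video_decode_accelerator.cc first).
ln -s /opt/vendor/lib/libVendorOmxIL.so /usr/lib/libOmxCore.so

# Run a Debug build with vlogging to watch HW decode initialization
# (--enable-logging and --vmodule are standard Chromium switches).
out/Debug/chrome --enable-logging=stderr --vmodule='*omx*=2,*gpu_video*=2'
```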
In case I misunderstood your goals/questions, in-line responses below.
Not quite; those $GYP_DEFINES are necessary for h264/mp4 support, but support for other codecs/containers is built into Chrome even without them (e.g. vp8/webm, theora/ogv, etc.).
However it's likely that your OpenMAX implementation only supports h264, so if that's what you're working on/towards, those defines are indeed necessary.
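For reference, the defines usually meant here are the standard ones for h264/mp4 support; I'm assuming these specific values, so adjust to match whatever your checkout actually needs:

```shell
# Assumed values: the usual GYP defines enabling proprietary codec
# (h264/mp4) support in Chromium builds.
export GYP_DEFINES="proprietary_codecs=1 ffmpeg_branding=Chrome"
gclient runhooks   # regenerate the build files so the defines take effect
```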