My understanding is that one of the reasons data channels were developed after audio/video (which you would think is the harder goal) was to prevent people from jumping the gun and trying to send audio/video over data channels. To do a good job of sending audio/video, you need some fairly sophisticated machinery that responds to the available network bandwidth by scaling the resolution and frame rate (and occasionally skipping ahead to remain current) so that your stream doesn't lag further and further behind over time, and that wouldn't be easy or even possible to build on top of data channels.
If you could get the bytes from an H.264 stream, and the bandwidth required wasn't too high, there would be nothing to stop you (other than the limited bandwidth available to data channels and the ensuing garbage-collection pressure from all that buffer churn), but it wouldn't be high on my list of weekend projects.
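If you did attempt it, the mechanics would look roughly like this: split the stream into messages small enough for a data channel and back off when the channel's send buffer fills up. This is only a sketch under my assumptions; the chunk size and threshold values are made up, though `bufferedAmount` and `bufferedAmountLowThreshold` are real parts of the `RTCDataChannel` API.

```javascript
// Sketch: push raw bytes through a data channel in chunks (assumed sizes).
// Data channel messages are commonly kept around 16 KiB for interoperability.
const CHUNK_SIZE = 16 * 1024;

// Split an ArrayBuffer into chunks no larger than chunkSize.
function chunkBuffer(buffer, chunkSize = CHUNK_SIZE) {
  const chunks = [];
  for (let offset = 0; offset < buffer.byteLength; offset += chunkSize) {
    chunks.push(buffer.slice(offset, offset + chunkSize));
  }
  return chunks;
}

// Hypothetical send loop: when the channel's internal buffer grows too
// large, wait for the standard "bufferedamountlow" event before sending more.
async function sendStream(dataChannel, buffer) {
  dataChannel.bufferedAmountLowThreshold = 4 * CHUNK_SIZE; // assumed value
  for (const chunk of chunkBuffer(buffer)) {
    if (dataChannel.bufferedAmount > 16 * CHUNK_SIZE) {
      await new Promise((resolve) => {
        dataChannel.addEventListener("bufferedamountlow", resolve, { once: true });
      });
    }
    dataChannel.send(chunk);
  }
}
```

Even with backpressure handled, this gives you none of the rate adaptation described above, which is the real work in shipping video.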
The bandwidth allowed to data channels is really low, presumably so that they don't interfere with audio/video streams. To the best of my knowledge, there is currently no way to prioritize one stream over another, but the prospect was discussed at the last WebRTC World conference.