Intro Audio for Videos


Mirtha Shikles

Aug 5, 2024, 12:09:54 PM
to cimennile
HTML5 to the rescue. It might not be apparent, but the rise of HTML5 has brought a surge of access to device hardware. Geolocation (GPS), the Orientation API (accelerometer), WebGL (GPU), and the Web Audio API (audio hardware) are perfect examples. These features are ridiculously powerful, exposing high-level JavaScript APIs that sit on top of the system's underlying hardware capabilities.

Several variants of "Media Capture APIs" have evolved over the past few years. Many folks recognized the need to be able to access native devices on the web, but that led everyone and their mom to put together a new spec. Things got so messy that the W3C finally decided to form a working group. Their sole purpose? Make sense of the madness! The Device APIs Policy (DAP) Working Group has been tasked to consolidate and standardize the plethora of proposals.


Kinda nice, right? I particularly like that it reuses a file input. Semantically, it makes a lot of sense. Where this particular "API" falls short is the ability to do realtime effects (e.g. render live webcam data to a <canvas> and apply WebGL filters). HTML Media Capture only allows you to record a media file or take a snapshot in time.


Many thought HTML Media Capture was too limiting, so a new spec emerged that supported any type of (future) device. Not surprisingly, the design called for a new element, the <device> element, which became the predecessor to getUserMedia().


Opera was among the first browsers to create initial implementations of video capture based on the <device> element. Soon after (the same day, to be precise), the WHATWG decided to scrap the <device> tag in favor of another up-and-comer, this time a JavaScript API called navigator.getUserMedia(). A week later, Opera put out new builds that included support for the updated getUserMedia() spec. Later that year, Microsoft joined the party by releasing a Lab for IE9 supporting the new spec.


Unfortunately, no released browser ever included <device>. One less API to worry about, I guess :) <device> did have two great things going for it though: 1.) it was semantic, and 2.) it was easily extensible to support more than just audio/video devices.


The pace to find a suitable capture API accelerated thanks to the larger WebRTC (Web Real-Time Communications) effort. That spec is overseen by the W3C WebRTC Working Group. Google, Opera, Mozilla, and a few others have implementations.


With navigator.mediaDevices.getUserMedia(), we can finally tap into webcam and microphone input without a plugin. Camera access is now a call away, not an install away. It's baked directly into the browser. Excited yet?


To use the webcam or microphone, we need to request permission. The first parameter to navigator.mediaDevices.getUserMedia() is an object specifying the details and requirements for each type of media you want to access. For example, if you want to access the webcam, the first parameter should be {video: true}. To use both the microphone and camera, pass {video: true, audio: true}:
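A minimal sketch of that request, assuming a page served over HTTPS (getUserMedia requires a secure context); the browser-only call is guarded so the snippet also loads outside a browser:

```javascript
// Constraints describing which kinds of media we want; both keys are optional.
const constraints = { video: true, audio: true };

// Request access; the returned promise resolves with a MediaStream
// once the user grants permission, and rejects if they deny it.
async function requestMedia() {
  return navigator.mediaDevices.getUserMedia(constraints);
}

// Only attempt the call where the API actually exists (i.e. a browser).
if (typeof navigator !== "undefined" && navigator.mediaDevices) {
  requestMedia().then((stream) => console.log("Got stream:", stream.id));
}
```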


OK. So what's going on here? Media capture is a perfect example of new HTML5 APIs working together. It works in conjunction with our other HTML5 buddies, <audio> and <video>. Notice that we're not setting a src attribute or including <source> elements on the <video> element. Instead of feeding the video a URL to a media file, we're setting srcObject to the MediaStream object representing the webcam.
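A sketch of that wiring, assuming the page contains a <video> element; the helper is kept pure so the browser-only parts stay in the guarded block:

```javascript
// Attach a MediaStream to a <video> element. Note: no src URL and no
// <source> children — srcObject points the element straight at the stream.
function attachStream(videoEl, stream) {
  videoEl.srcObject = stream;
  return videoEl.play(); // play() returns a promise in modern browsers
}

if (typeof navigator !== "undefined" && navigator.mediaDevices) {
  navigator.mediaDevices
    .getUserMedia({ video: true })
    .then((stream) => attachStream(document.querySelector("video"), stream))
    .catch((err) => console.error("getUserMedia failed:", err));
}
```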


The first parameter to getUserMedia() can also be used to specify more requirements (or constraints) on the returned media stream. For example, instead of just indicating you want basic access to video (e.g. video: true), you can additionally require the stream to be HD:
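A sketch using the modern constraint syntax, requiring at least 720p video:

```javascript
// Constrain the returned video stream to at least 1280x720.
const hdConstraints = {
  video: {
    width:  { min: 1280 },
    height: { min: 720 },
  },
};

async function requestHdVideo() {
  // Rejects with an OverconstrainedError if no camera can satisfy this.
  return navigator.mediaDevices.getUserMedia(hdConstraints);
}
```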


The enumerateDevices() method of the MediaDevices interface requests a list of the available media input and output devices, such as microphones, cameras, headsets, and so forth. The returned Promise is resolved with an array of MediaDeviceInfo objects describing the devices.
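One common use of that list is grouping devices by kind, e.g. to build a camera picker. A sketch, with the grouping kept as a pure helper so it also works on plain MediaDeviceInfo-like objects:

```javascript
// Group MediaDeviceInfo-like objects ({ kind, label, deviceId, ... })
// by their kind.
function devicesByKind(devices) {
  const groups = { audioinput: [], audiooutput: [], videoinput: [] };
  for (const d of devices) {
    if (groups[d.kind]) groups[d.kind].push(d);
  }
  return groups;
}

if (typeof navigator !== "undefined" && navigator.mediaDevices) {
  navigator.mediaDevices.enumerateDevices().then((devices) => {
    const { videoinput } = devicesByKind(devices);
    console.log(`${videoinput.length} camera(s) found`);
  });
}
```

Note that device labels are typically empty until the user has granted a media permission.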


Browsers show a permission dialog upon calling navigator.mediaDevices.getUserMedia(), which gives users the option to grant or deny access to their camera/mic. For example, here is Chrome's permission dialog:
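When the user denies (or something else goes wrong), the returned promise rejects with a DOMException whose name identifies the cause. A sketch of mapping those names to user-facing messages:

```javascript
// Map the error names getUserMedia can reject with to readable messages.
function describeMediaError(err) {
  switch (err.name) {
    case "NotAllowedError":
      return "Permission to use the camera/mic was denied.";
    case "NotFoundError":
      return "No camera or microphone was found.";
    case "NotReadableError":
      return "The device is already in use by another application.";
    case "OverconstrainedError":
      return "No device satisfies the requested constraints.";
    default:
      return `Unexpected error: ${err.name}`;
  }
}
```

Typical usage: `navigator.mediaDevices.getUserMedia(constraints).catch((err) => alert(describeMediaError(err)))`.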




Now that we are comfortable with adding simple images to a webpage, the next step is to start adding video and audio players to your HTML documents! In this article we'll look at doing just that with the <video> and <audio> elements; we'll then finish off by looking at how to add captions/subtitles to your videos.


Note: Before you begin here, you should also know that there are quite a few OVPs (online video providers) like YouTube, Dailymotion, and Vimeo, and online audio providers like Soundcloud. Such companies offer a convenient, easy way to host and consume videos, so you don't have to worry about the enormous bandwidth consumption. OVPs even usually offer ready-made code for embedding video/audio in your webpages; if you use that route, you can avoid some of the difficulties we discuss in this article. We'll be discussing this kind of service a bit more in the next article.


Users must be able to control video and audio playback (it's especially critical for people who have epilepsy). You must either use the controls attribute to include the browser's own control interface, or build your own interface using the appropriate JavaScript API. At a minimum, the interface must include a way to start and stop the media, and to adjust the volume.
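A minimal sketch of that JavaScript route, built on the standard HTMLMediaElement API (play(), pause(), paused, volume); button wiring is left out:

```javascript
// Toggle playback and report the new state, for a custom play/pause button.
function togglePlayback(media) {
  if (media.paused) {
    media.play();
    return "playing";
  }
  media.pause();
  return "paused";
}

// Set the volume, clamped to the valid 0..1 range of HTMLMediaElement.volume.
function setVolume(media, level) {
  media.volume = Math.min(1, Math.max(0, level));
  return media.volume;
}
```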


There's a problem with the above example. It is possible that the video might not play for you, because different browsers support different video (and audio) formats. Fortunately, there are things you can do to help prevent this from being an issue.


First, let's go through the terminology quickly. Formats like MP3, MP4 and WebM are called container formats. They define a structure in which the audio and/or video tracks that make up the media are stored, along with metadata describing the media, what codecs are used to encode its channels, and so forth.


A WebM file containing a movie which has a main video track and one alternate angle track, plus audio for both English and Spanish, in addition to audio for an English commentary track, can be conceptualized as shown in the diagram below. Also included are text tracks containing closed captions for the feature film, Spanish subtitles for the film, and English captions for the commentary.


The audio and video tracks within the container hold data in the appropriate format for the codec used to encode that media. Different formats are used for audio tracks versus video tracks. Each audio track is encoded using an audio codec, while video tracks are encoded using (as you probably have guessed) a video codec. As we talked about before, different browsers support different video and audio formats, and different container formats (like MP3, MP4, and WebM, which in turn can contain different types of video and audio).


There are some special cases. For example, for some types of audio, a codec's data is often stored without a container, or with a simplified container. One such instance is the FLAC codec, which is stored most commonly in FLAC files, which are just raw FLAC tracks.


Another such situation is the always-popular MP3 file. An "MP3 file" is actually an MPEG-1 Audio Layer III (MP3) audio track stored within an MPEG or MPEG-2 container. This is especially interesting since while most browsers don't support using MPEG media in the and elements, they may still support MP3 due to its popularity.


Because some of these codecs are covered by patents, browsers that wish to implement support for them must typically pay enormous license fees. In addition, some people prefer to avoid restricted software and prefer to use only open formats. Due to these legal and preferential reasons, web developers find themselves having to support multiple formats to capture their entire audience.


The codecs described in the previous section exist to compress video and audio into manageable files, since raw audio and video are both exceedingly large. Each web browser supports an assortment of codecs, like Vorbis or H.264, which are used to compress raw audio and video into binary data and decompress it again for playback. Each codec offers its own advantages and drawbacks, and each container may also offer its own positive and negative features affecting your decisions about which to use.


Things become slightly more complicated because not only does each browser support a different set of container file formats, they also each support a different selection of codecs. In order to maximize the likelihood that your website or app will work on a user's browser, you may need to provide each media file you use in multiple formats. If your site and the user's browser don't share a media format in common, your media won't play.
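One way to probe that support at runtime is HTMLMediaElement.canPlayType(), which returns "", "maybe", or "probably" for a given MIME type. A sketch, with the selection logic kept pure (the canPlayType function is passed in) so it can also run against a stub; file names are placeholders:

```javascript
// Return the first source the browser reports it may be able to play,
// or null if none match. `canPlayType` returns "", "maybe", or "probably".
function pickPlayableSource(canPlayType, sources) {
  for (const s of sources) {
    if (canPlayType(s.type) !== "") return s;
  }
  return null;
}

// In a browser:
// const video = document.createElement("video");
// const best = pickPlayableSource((t) => video.canPlayType(t), [
//   { src: "movie.webm", type: 'video/webm; codecs="vp9, opus"' },
//   { src: "movie.mp4",  type: 'video/mp4' },
// ]);
```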


Due to the intricacies of ensuring your app's media is viewable across every combination of browsers, platforms, and devices you wish to reach, choosing the best combination of codecs and container can be a complicated task. See Choosing the right container for help selecting the container file format best suited for your needs; similarly, see Choosing a video codec and Choosing an audio codec for help selecting the best media codecs to use for your content and your target audience.


One additional thing to keep in mind: mobile browsers may support additional formats not supported by their desktop equivalents, just like they may not support all the same formats the desktop version does. On top of that, both desktop and mobile browsers may be designed to offload handling of media playback to the operating system (either for all media or only for specific types it can't handle internally). This means media support is partly dependent on what software the user has installed.


Here we've taken the src attribute out of the actual <video> tag, and instead included separate <source> elements that point to their own sources. In this case the browser will go through the <source> elements and play the first one that it has the codec to support. Including WebM and MP4 sources should be enough to play your video on most platforms and browsers these days.
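The same fallback structure can be generated with DOM APIs. A sketch, with the document object passed in as a parameter (so the builder is testable) and placeholder file names:

```javascript
// Build a <video> with multiple <source> children; the browser will
// play the first source whose format it supports.
function buildVideo(doc, sources) {
  const video = doc.createElement("video");
  video.controls = true; // expose the browser's own control interface
  for (const { src, type } of sources) {
    const source = doc.createElement("source");
    source.src = src;
    source.type = type;
    video.appendChild(source);
  }
  return video;
}

// In a browser:
// document.body.appendChild(buildVideo(document, [
//   { src: "clip.webm", type: "video/webm" },
//   { src: "clip.mp4",  type: "video/mp4" },
// ]));
```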
