Need help enabling SSE2 acceleration in WebRTC


ABDUL MOIZ

Oct 7, 2011, 8:45:47 AM10/7/11
to discuss-webrtc

Hi,

I am using WebRTC on Ubuntu 10.04, on an x86 platform.
Can anybody help me enable SSE2 support in WebRTC?



Regards,
Abdul Moiz

Arun Raghavan

Oct 7, 2011, 8:52:58 AM10/7/11
to discuss...@googlegroups.com
On 7 October 2011 18:15, ABDUL MOIZ <abdulm...@gmail.com> wrote:
>
> Hi,
>
>        I am using webrtc in Ubuntu 10.04, X86 platform .
>  Can anybody help me in enabling SSE2 support in webrtc?

Looking at the code, if you're building for the correct architecture,
gcc will define __SSE2__ for you and that'll make sure SSE2 support is
built in. There also appears to be run-time detection of CPU features
to decide whether to use it or not.

Cheers,
--
Arun Raghavan
http://arunraghavan.net/
(Ford_Prefect | Gentoo) & (arunsr | GNOME)

ABDUL MOIZ

Oct 7, 2011, 9:42:10 AM10/7/11
to discuss...@googlegroups.com
Hi,

        I have found that the -msse2 flag is enabled in the WebRTC build.

But when I run the voe_cmd_test application (with echo cancellation enabled) with the -msse2 flag enabled, and again with the flag disabled, I get the same CPU usage in both cases.

Is there any command or other way to tell whether SSE2 instructions are actually being used?

Niklas Enbom

Oct 7, 2011, 10:06:47 AM10/7/11
to discuss...@googlegroups.com
The define __SSE2__ should trigger the definition of WEBRTC_USE_SSE2. My guess is that it's enabled in both your cases.

Niklas

ABDUL MOIZ

Oct 10, 2011, 2:57:28 AM10/10/11
to discuss...@googlegroups.com
Hi,

      I have disabled SSE2 support (by setting 'disable_sse2%': 1 in build/common.gypi) and found that __SSE2__ and WEBRTC_USE_SSE2 are not defined while the application is running.
But the CPU usage is still the same as before.
Can you please help me determine whether SSE2 is actually disabled?


Regards
Abdul Moiz

Andrew MacDonald

Oct 11, 2011, 12:46:29 AM10/11/11
to discuss...@googlegroups.com
On Sun, Oct 9, 2011 at 11:57 PM, ABDUL MOIZ <abdulm...@gmail.com> wrote:
> Hi,
>
>       I have disabled the sse2 support (by changing 'disable_sse2%': 1 in
> build/common.gypi file) and found that __SSE2__ and WEBRTC_USE_SSE2 are not
> defined while the application is running.

The gyp flag disable_sse2 only has an effect on Linux, and then only
when targeting a 32-bit architecture (since x86-64 processors always
have SSE2 support). The target architecture will match your host
architecture by default. If you're building on a 64-bit machine, you
can use the following to make disable_sse2 do something:

$ ./build/gyp_chromium --depth=. -Dtarget_arch=ia32 -Ddisable_sse2=1 webrtc.gyp

> But still the CPU usage is same as before.
> Can you please help me to know whether sse2 is actually disabled or not?

If you want to quickly ensure SSE2 is disabled, comment out "#define
WEBRTC_USE_SSE2" in src/typedefs.h and rebuild.

There's a better method in
src/modules/audio_processing/main/test/process_test/process_test.cc
that essentially disables the run-time detection. Look under the
"--noasm" flag.

On the audio side, only AEC uses SSE optimization. To see any
difference then, ensure you have AEC enabled in voe_cmd_test. (I see
you mentioned this, but just wanted to make it clear for others).
