Use Java-based audio as the default for WebRTC.
The work landed in 4034 (use of the HW AEC in AppRTC) is currently not
active by default since we build for OpenSL ES. I missed that when I
did my initial change (since I always disabled OpenSL via GYP_DEFINES).
This CL ensures that Java-based audio is used by default in WebRTC.
It would be great if we could shift over to OpenSL (to cut latency),
but today that would mean losing support for the HW AEC.
Hence, we are not ready to do so yet.
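For anyone who still wants the OpenSL path, the flag flipped in this CL can be overridden at build time. A minimal sketch, assuming a GYP-based checkout where build variables are passed through the GYP_DEFINES environment variable (for GN builds, the equivalent would be the `rtc_enable_android_opensl` arg shown in the diff below):

```shell
# Opt back into the OpenSL ES audio backend for a GYP build.
# Note: with OpenSL enabled, the HW AEC cannot be used.
export GYP_DEFINES="${GYP_DEFINES:-} enable_android_opensl=1"

# Confirm the override is present before regenerating projects.
echo "$GYP_DEFINES"
```

Leaving the variable unset (or set to 0) keeps the new Java-based audio layer, which is required for the HW AEC support landed in 4034.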
Review URL: https://webrtc-codereview.appspot.com/36699004
git-svn-id: http://webrtc.googlecode.com/svn/trunk@8040 4adac7df-926f-26a2-2b94-8c16560cd09d
diff --git a/webrtc/build/common.gypi b/webrtc/build/common.gypi
index 4322f37..ffadbbf 100644
@@ -106,7 +106,11 @@
# This may be subject to change in accordance to Chromium's MIPS flags
'mips_fpu%' : 1,
- 'enable_android_opensl%': 1,
+ # Use Java based audio layer as default for Android.
+ # Change this setting to 1 to use Open SL audio instead.
+ # TODO(henrika): add support for Open SL ES.
+ 'enable_android_opensl%': 0,
# Link-Time Optimizations
# Executes code generation at link-time instead of compile-time
diff --git a/webrtc/build/webrtc.gni b/webrtc/build/webrtc.gni
index e687af8..8e489a3 100644
@@ -59,7 +59,7 @@
mips_dsp_rev = 0
mips_fpu = true
- rtc_enable_android_opensl = true
+ rtc_enable_android_opensl = false
# Link-Time Optimizations.
# Executes code generation at link-time instead of compile-time.