page.title=Android KitKat
@jd:body
<script>
// Reveal the tab section that contains the requested anchor, then scroll to it.
function revealSection(hashy) {
  if (hashy != "" && !$(hashy).is(":visible")) {
    var sectionId = $(hashy).closest(".version-section").attr("id");
    var link = $("#title-tabs a[href$='" + sectionId + "']");
    link.parent().addClass("selected");
    link.parent().siblings().removeClass("selected");
    var sectionDiv = $(".version-section" + link.attr("href"));
    if (sectionDiv.length) {
      $(".version-section").hide();
      sectionDiv.show();
    }
    $('html, body').animate({
      scrollTop: $(hashy).offset().top
    }, 100);
  }
}
$(document).ready(function() {
$("#title-tabs li a").each(function() {
$(this).click(function(){
$(this).parent().addClass("selected");
$(this).parent().siblings().removeClass("selected");
$(".version-section").hide();
$($(this).attr("href")).show();
return false;
});
});
var hashy = escapeHTML(location.hash);
revealSection(hashy);
});
window.onhashchange = function () {
revealSection(escapeHTML(location.hash));
}
</script>
<!-- BEGIN ANDROID 4.4 -->
<div id="44-android-44" class="version-section">
<div style="padding:0px 0px 0px 60px;margin-top:-3px;float:right;">
<img src="{@docRoot}images/kk-android-44.png" alt="Android 4.4 on phone and tablet" width="380">
</div>
<div class="landing-docs" style="float:right;clear:both;margin:22px 0 2em 3em;">
<div class="col-4 normal-links highlights" style="font-size:12px;">
<h3 id="thisd" >Key Developer Features</h3>
<ul style="list-style-type:none;">
<!--<li><a href="#44-ui">UI refresh</a></li>-->
<li><a href="#44-hce">Host Card Emulation</a></li>
<li><a href="#44-printing">Printing framework</a></li>
<li><a href="#44-storage-access">Storage access framework</a></li>
<li><a href="#44-sensors">Low-power sensors</a></li>
<li><a href="#44-sms-provider">SMS provider</a></li>
<li><a href="#44-immersive">Full-screen Immersive mode</a></li>
<li><a href="#44-transitions">Transitions framework</a></li>
<li><a href="#44-webview">Chromium WebView</a></li>
<li><a href="#44-screen-recording">Screen recording</a></li>
<li><a href="#44-renderscript-ndk">RenderScript NDK</a></li>
<li><a href="#44-bluetooth">Bluetooth HOGP and MAP</a></li>
<li><a href="#44-ir-blasters">IR Blasters</a></li>
<li><a href="#44-closed-captioning">Closed captioning settings</a></li>
<li><a href="#44-international-users">RTL features</a></li>
<li><a href="#44-security">Security enhancements</a></li>
<li><a href="#44-tools">Tools for analyzing memory use</a></li>
</ul>
</div>
</div>
<p>Welcome to Android 4.4 KitKat!</p>
<p>
Android KitKat brings all of Android's most innovative, most beautiful, and
most useful features to more devices everywhere.
</p>
<p>
This document provides a glimpse of what's new for developers.
</p>
<p>
Find out more about KitKat for consumers at <a href=
"http://www.android.com/versions/kit-kat-4-4/">www.android.com</a>.
</p>
<h2 id="svelte" style="line-height:1.25em;">Making Android for everyone</h2>
<p>
<span style="white-space:nowrap;">Android 4.4</span> is designed to run fast,
smoothly, and responsively on a much broader range of devices than ever before
&mdash; including on millions of entry-level devices around the world that
have as little as <strong>512MB RAM</strong>.
</p>
<p>
KitKat streamlines every major component to reduce memory use and introduces
new APIs and tools to help you create innovative, responsive,
memory-efficient applications.
</p>
<p>
OEMs building the next generation of Android devices can take advantage of
<strong>targeted recommendations and options</strong> to run <span style=
"white-space:nowrap;">Android 4.4</span> efficiently, even on low-memory
devices. Dalvik JIT code cache tuning, kernel samepage merging (KSM), swap to
zRAM, and other optimizations help manage memory. New configuration options
let OEMs tune out-of-memory levels for processes, set graphics cache sizes,
control memory reclaim, and more.
</p>
<p>
In Android itself, changes across the system improve memory management and
reduce memory footprint. Core system processes are trimmed to <strong>use
less heap</strong>, and they now more <strong>aggressively protect system
memory</strong> from apps consuming large amounts of RAM. When multiple
services start at once &mdash; such as when network connectivity changes
&mdash; Android now <strong>launches the services serially</strong>, in small
groups, to avoid peak memory demands.
</p>
<p>
For developers, <span style="white-space:nowrap;">Android 4.4</span> helps
you deliver <strong>apps that are efficient and responsive</strong> on all
devices. A new API, <span style=
"font-size:11.5px;font-family:monospace;">ActivityManager.isLowRamDevice()</span>,
lets you tune your app's behavior to match the device's memory configuration.
You can modify or disable large-memory features as needed, depending on the
use-cases you want to support on entry-level devices. Learn more about
optimizing your apps for low-memory devices <a href="">here</a>.
</p>
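<p>
As a minimal sketch, an app might branch on the device's memory configuration
like this; the cache sizes and the useHighResAssets flag are illustrative
app-level choices, not part of the platform API.
</p>
<pre>
// Sketch: scale back large-memory features on entry-level devices.
ActivityManager am =
        (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);

boolean useHighResAssets;   // hypothetical app-level flag
int cacheSizeBytes;         // hypothetical bitmap cache budget
if (am.isLowRamDevice()) {
    useHighResAssets = false;
    cacheSizeBytes = 4 * 1024 * 1024;
} else {
    useHighResAssets = true;
    cacheSizeBytes = 32 * 1024 * 1024;
}
</pre>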
<p>
New tools also give you powerful insight into your app's memory use. The
<strong>procstats tool</strong> details memory use over time, with run times
and memory footprint for foreground apps and background services. An
on-device view is also available as a new developer option. The
<strong>meminfo tool</strong> is enhanced to make it easier to spot memory
trends and issues, and it reveals additional memory overhead that hasn't
previously been visible.
</p>
<h2 id="44-hce">New NFC capabilities through Host Card Emulation</h2>
<p>
<span style="white-space:nowrap;">Android 4.4</span> introduces new platform
support for secure NFC-based transactions through <strong>Host Card
Emulation</strong> (HCE), for payments, loyalty programs, card access,
transit passes, and other custom services. With HCE, any app on an Android
device can emulate an NFC smart card, letting users tap to initiate
transactions with an app of their choice &mdash; no provisioned secure
element (SE) in the device is needed. Apps can also use a new <strong>Reader
Mode</strong> to act as readers for HCE cards and other NFC-based
transactions.
</p>
<div style="float:right;margin:32px;width:200px;">
<img src="{@docRoot}images/kk-contactless-card.png" alt="" width="200" style=
"margin-bottom:0;">
</div>
<p>
Android HCE emulates ISO/IEC 7816 based smart cards that use the contactless
ISO/IEC 14443-4 (ISO-DEP) protocol for transmission. These cards are used by
many systems today, including the existing EMVCO NFC payment infrastructure.
Android uses Application Identifiers (AIDs) as defined in ISO/IEC 7816-4 as
the basis for routing transactions to the correct Android applications.
</p>
<p>
Apps declare the AIDs they support in their manifest files, along with a
category identifier that indicates the type of support available (for
example, "payments"). In cases where multiple apps support the same AID in
the same category, Android displays a dialog that lets the user choose which
app to use.
</p>
<p>
When the user taps to pay at a point-of-sale terminal, the system extracts
the preferred AID and routes the transaction to the correct application. The
app reads the transaction data and can use any local or network-based
services to verify and then complete the transaction.
</p>
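<p>
On the app side, an HCE app extends the new HostApduService class and responds
to APDUs routed to the AIDs it registered. The sketch below is schematic: the
status words and the isSelectAidApdu() helper are placeholders, and a real
card protocol defines its own commands and responses.
</p>
<pre>
// Sketch of an HCE service (its AIDs are declared in the app manifest).
public class MyHostApduService extends HostApduService {
    // Placeholder ISO 7816 status words.
    private static final byte[] STATUS_SUCCESS = {(byte) 0x90, (byte) 0x00};
    private static final byte[] STATUS_FAILED  = {(byte) 0x6F, (byte) 0x00};

    @Override
    public byte[] processCommandApdu(byte[] commandApdu, Bundle extras) {
        // Parse the command APDU and build a response for your card protocol.
        if (isSelectAidApdu(commandApdu)) {   // hypothetical helper
            return STATUS_SUCCESS;
        }
        return STATUS_FAILED;
    }

    @Override
    public void onDeactivated(int reason) {
        // The reader moved out of range or selected another AID; reset state here.
    }
}
</pre>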
<p>
Android HCE requires an NFC controller to be present in the device. Support
for HCE is already widely available on most NFC controllers, which offer
dynamic support for both HCE and SE transactions. <span style=
"white-space:nowrap;">Android 4.4</span> devices that support NFC will
include Tap &amp; Pay for easy payments using HCE.
</p>
<h2 id="44-printing">Printing framework</h2>
<p>
Android apps can now print any type of content over Wi-Fi or
cloud-hosted services such as Google Cloud Print. In print-enabled apps,
users can discover available printers, change paper sizes, choose specific
pages to print, and print almost any kind of document, image, or file.
</p>
<p>
<span style="white-space:nowrap;">Android 4.4</span> introduces native
platform support for printing, along with APIs for managing printing and
adding new types of printer support. The platform provides a print manager
that mediates between apps requesting printing and installed print services
that handle print requests. The print manager provides shared services and a
system UI for printing, giving users consistent control over printing from
any app. The print manager also ensures the security of content as it's
passed across processes, from an app to a print service.
</p>
<div style="float:right;margin:22px 0px 0px 24px;width:490px;">
<img src="{@docRoot}images/kk-print-land-n5.jpg" alt="" width="471" style=
"margin-bottom:0;">
<p class="img-caption" style=
"padding-top:1.5em;margin-left:6px;line-height:1.25em;width:480px;">
You can add printing support to your apps or develop print services to
support specific types of printers.
</p>
</div>
<p>
Printer manufacturers can use new APIs to develop their own <strong>print
services</strong> &mdash; pluggable components that add vendor-specific logic
and services for communicating with specific types of printers. They can
build print services and distribute them through Google Play, making it easy
for users to find and install them on their devices. Just as with other apps,
you can update print services over-the-air at any time.
</p>
<p>
<strong>Client apps</strong> can use new APIs to add printing capabilities to
their apps with minimal code changes. In most cases, you would add a print
action to your Action Bar and a UI for choosing items to print. You would
also implement APIs to create print jobs, query the print manager for status,
and cancel jobs. This lets you print nearly any type of content, from local
images and documents to network data or a view rendered to a canvas.
</p>
<p>
For broadest compatibility, Android uses PDF as its primary file format for
printing. Before printing, your app needs to generate a properly paginated
PDF version of your content. For convenience, the printing API provides
native and WebView helper classes to let you create PDFs using standard
Android drawing APIs. If your app knows how to draw the content, it can
quickly create a PDF for printing.
</p>
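<p>
For instance, a client app that already renders its content in a WebView can
hand the framework a print adapter with just a few lines; the job name below
is arbitrary, and webView is assumed to be an existing WebView in your UI.
</p>
<pre>
// Sketch: print the contents of a WebView using the new printing APIs.
PrintManager printManager =
        (PrintManager) getSystemService(Context.PRINT_SERVICE);
PrintDocumentAdapter adapter = webView.createPrintDocumentAdapter();

// The system displays the print UI and manages the job from here.
printManager.print("MyDocument", adapter, new PrintAttributes.Builder().build());
</pre>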
<p>
Most devices running <span style="white-space:nowrap;">Android 4.4</span>
will include Google Cloud Print pre-installed as a print service, as well as
several Google apps that support printing, including Chrome, Drive, Gallery,
and QuickOffice.
</p>
<h2 id="44-storage-access">Storage access framework</h2>
<p>
A new <strong>storage access framework</strong> makes it simple for users to
browse and open documents, images, and other files across all of their
preferred document storage providers. A standard, easy-to-use UI lets users
browse files and access recents in a consistent way across apps and
providers.
</p>
<div style="float:right;margin:22px 0px 0px 24px;width:490px;">
<img src="{@docRoot}images/kk-saf2-n5.jpg" alt="" width="240" style=
"margin-bottom:0;"> <img src="{@docRoot}images/kk-saf1-n5.jpg" alt="" width="240"
style="margin-bottom:0;padding-left:6px;">
<p class="img-caption" style=
"padding-top:1.5em;margin-left:6px;line-height:1.25em;width:480px;">
Box and others have integrated their services into the storage access
framework, giving users easy access to their documents from apps across the
system.
</p>
</div>
<p>
Cloud or local storage services can participate in this ecosystem by
implementing a new document provider class that encapsulates their services.
The provider class includes all of the APIs needed to register the provider
with the system and manage browsing, reading, and writing documents in the
provider. The document provider can give users access to any remote or local
data that can be represented as files &mdash; from text, photos, and
wallpapers to video, audio, and more.
</p>
<p>
If you build a <strong>document provider</strong> for a cloud or local
service, you can deliver it to users as part of your existing Android app.
After downloading and installing the app, users will have instant access to
your service from any app that participates in the framework. This can help
you gain exposure and user engagement, since users will find your services
more easily.
</p>
<p>
If you develop a <strong>client app</strong> that manages files or documents,
you can integrate with the storage access framework just by using new
<span style="font-size:11.5px;">CREATE_DOCUMENT</span> or <span style=
"font-size:11.5px;">OPEN_DOCUMENT</span> intents to open or create files
&mdash; the system automatically displays the standard UI for browsing
documents, including all available document providers.
</p>
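<p>
For example, a client app might let the user pick an image with a sketch like
the following; READ_REQUEST_CODE is just an arbitrary request code your app
defines.
</p>
<pre>
// Sketch: let the user pick an openable image from any document provider.
private static final int READ_REQUEST_CODE = 42;   // arbitrary request code

private void pickImage() {
    Intent intent = new Intent(Intent.ACTION_OPEN_DOCUMENT);
    intent.addCategory(Intent.CATEGORY_OPENABLE);   // only files that can be opened
    intent.setType("image/*");
    startActivityForResult(intent, READ_REQUEST_CODE);
}

// In onActivityResult(), resultData.getData() is a content:// URI that you can
// open with getContentResolver().openInputStream(uri).
</pre>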
<p>
You can integrate your client app one time, for all providers, without any
vendor-specific code. As users add or remove providers, they’ll continue to
have access to their preferred services from your app, without changes or
updates needed in your code.
</p>
<p>
The storage access framework is integrated with the existing <span style=
"font-size:11.5px;">GET_CONTENT</span> intent, so users also have access to
all of their previous content and data sources from the new system UI for
browsing. Apps can continue using <span style=
"font-size:11.5px;">GET_CONTENT</span> as a way to let users import data. The
storage access framework and system UI for browsing make it easier for users
to find and import their data from a wider range of sources.
</p>
<p>
Most devices running <span style="white-space:nowrap;">Android 4.4</span>
will include Google Drive and local storage pre-integrated as document
providers, and Google apps that work with files also use the new framework.
</p>
<h2 id="44-sensors">Low-power sensors</h2>
<h4 id="44-sensor-batching">Sensor batching</h4>
<p>
<span style="white-space:nowrap;">Android 4.4</span> introduces platform
support for <strong>hardware sensor batching</strong>, a new optimization
that can dramatically reduce power consumed by ongoing sensor activities.
</p>
<p>
With sensor batching, Android works with the device hardware to collect and
deliver sensor events efficiently in batches, rather than individually as
they are detected. This lets the device's application processor remain in a
low-power idle state until batches are delivered. You can request batched
events from any sensor using a standard event listener, and you can control
the interval at which you receive batches. You can also request immediate
delivery of events between batch cycles.
</p>
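<p>
As a sketch, the batching interval is requested through a maximum report
latency parameter on the existing listener registration call. The one-minute
latency below is just an example value, and listener is assumed to be a
SensorEventListener your app already defines.
</p>
<pre>
// Sketch: request accelerometer events in batches of up to one minute.
SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor accel = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);

sm.registerListener(listener, accel,
        SensorManager.SENSOR_DELAY_NORMAL,   // sampling period
        60 * 1000 * 1000);                   // max report latency, in microseconds

// To force delivery of any buffered events before the next batch is due:
sm.flush(listener);
</pre>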
<p>
Sensor batching is ideal for low-power, long-running use-cases such as
fitness, location tracking, monitoring, and more. It can make your app more
efficient, and it lets you track sensor events continuously &mdash; even while
the screen is off and the system is asleep.
</p>
<p>
Sensor batching is currently available on Nexus 5, and we're working with our
chipset partners to bring it to more devices as soon as possible.
</p>
<div style="float:right;margin:1em 0em 0em 3em;width:484px;clear:both">
<img src="{@docRoot}images/kk-sensors-moves-n5.jpg" alt="" width="240" style=
"margin-bottom:0;"> <img src="{@docRoot}images/kk-sensors-runtastic-n5.jpg" alt=""
width="240" style="margin-bottom:0;padding-left:4px;">
<p class="img-caption" style=
"padding-top:1.5em;margin-left:6px;line-height:1.25em;">
<strong>Moves</strong> and <strong>Runtastic Pedometer</strong> are using
the hardware step-detector to offer long-running, low-power services.
</p>
</div>
<h4 id="44-step-detector">Step Detector and Step Counter</h4>
<p>
<span style="white-space:nowrap;">Android 4.4</span> also adds platform
support for two new composite sensors &mdash; step detector
and step counter &mdash; that let your app track steps when
the user is walking, running, or climbing stairs. These new sensors are
implemented in hardware for low power consumption.
</p>
<p>
The step detector analyzes accelerometer input to recognize when the user has
taken a step, then triggers an event with each step. The step counter tracks
the total number of steps since the last device reboot and triggers an event
with each change in the step count. Because the logic and sensor management
is built into the platform and underlying hardware, you don't need to
maintain your own detection algorithms in your app.
</p>
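<p>
A sketch of reading the step counter is shown below; the sensor can be null on
devices without the hardware, so check before registering.
</p>
<pre>
// Sketch: track total steps since the last reboot using the step counter.
SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor stepCounter = sm.getDefaultSensor(Sensor.TYPE_STEP_COUNTER);

if (stepCounter != null) {
    sm.registerListener(new SensorEventListener() {
        @Override
        public void onSensorChanged(SensorEvent event) {
            // values[0] is the cumulative step count since the last reboot.
            float totalSteps = event.values[0];
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }
    }, stepCounter, SensorManager.SENSOR_DELAY_NORMAL);
}
</pre>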
<p>
Step detector and counter sensors are available on Nexus 5, and we're working
with our chipset partners to bring them to new devices as soon as possible.
</p>
<h2 id="44-sms-provider">SMS provider</h2>
<p>
If you develop a messaging app that uses SMS or MMS, you can now use a
<strong>shared SMS provider and new APIs</strong> to manage your app's
message storage and retrieval. The new SMS provider and APIs define a
standardized interaction model for all apps that handle SMS or MMS messages.
</p>
<p>
Along with the new provider and APIs, <span style=
"white-space:nowrap;">Android 4.4</span> introduces <strong>new
semantics</strong> for receiving messages and writing to the provider. When a
message is received, the system routes it directly to the user's default
messaging app using the new <span style=
"font-size:11.5px;">SMS_DELIVER</span> intent. Other apps can still listen
for incoming messages using the <span style=
"font-size:11.5px;">SMS_RECEIVED</span> intent. Also, the system now allows
only the default app to write message data to the provider, although other
apps can read at any time. Apps that are not the user's default can still
send messages &mdash; the system handles writing those messages to the
provider on behalf of the app, so that users can see them in the default app.
</p>
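<p>
As a sketch, the user's default messaging app receives incoming messages
through a broadcast receiver registered for the new SMS_DELIVER intent (the
receiver is protected by the BROADCAST_SMS permission in the manifest); the
storage and display steps are left as comments.
</p>
<pre>
// Sketch: receive incoming SMS in the user's default messaging app.
public class SmsDeliverReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        if (Telephony.Sms.Intents.SMS_DELIVER_ACTION.equals(intent.getAction())) {
            SmsMessage[] messages =
                    Telephony.Sms.Intents.getMessagesFromIntent(intent);
            for (SmsMessage msg : messages) {
                String from = msg.getOriginatingAddress();
                String body = msg.getMessageBody();
                // Store the message (the default app can write to the SMS
                // provider) and notify the user.
            }
        }
    }
}
</pre>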
<p>
The new provider and semantics help to improve the user's experience when
multiple messaging apps are installed, and they help you to build new
messaging features with fully-supported, forward-compatible APIs.
</p>
<h2 id="44-beautiful-apps">New ways to build beautiful apps</h2>
<div style="float:right;margin:14px 0px 0px 24px;width:246px;">
<img src="{@docRoot}images/kk-immersive-n5.jpg" alt="" width="240" style=
"margin-bottom:0;">
<p class="img-caption" style=
"padding-top:1.5em;margin-left:6px;line-height:1.25em;">
A new <strong>immersive mode</strong> lets apps use every pixel on the
screen to show content and capture touch events.
</p>
</div>
<h4 id="44-immersive">Full-screen Immersive mode</h4>
<p>
Now your apps can use <strong>every pixel on the device screen</strong> to
showcase your content and capture touch events. <span style=
"white-space:nowrap;">Android 4.4</span> adds a new full-screen immersive
mode that lets you create full-bleed UIs reaching from edge to edge on phones
and tablets, <strong>hiding all system UI</strong> such as the status bar and
navigation bar. It's ideal for rich visual content such as photos, videos,
maps, books, and games.
</p>
<p>
In the new mode, the system UI stays hidden, even while users are interacting
with your app or game &mdash; you can capture touch events from anywhere
across the screen, even areas that would otherwise be occupied by the system
bars. This gives you a great way to create a larger, richer, more immersive
UI in your app or game and also reduce visual distraction.
</p>
<p>
To make sure that users always have easy, consistent access to system UI from
full-screen immersive mode, <span style="white-space:nowrap;">Android
4.4</span> supports a new gesture &mdash; in immersive mode, an edge swipe
from the top or bottom of the screen now reveals the system UI.
</p>
<p>
To return to immersive mode, users can touch the screen outside of the bar
bounds or wait for a short period for the bars to auto-hide. For a consistent
user experience, the new gesture also works with previous methods of hiding
the status bar.
</p>
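<p>
A sketch of entering immersive mode by hiding the system bars with the new
flag follows; the "sticky" variant re-hides the bars automatically a moment
after the edge swipe.
</p>
<pre>
// Sketch: hide the status and navigation bars in immersive mode.
View decorView = getWindow().getDecorView();
decorView.setSystemUiVisibility(
        View.SYSTEM_UI_FLAG_LAYOUT_STABLE
        | View.SYSTEM_UI_FLAG_LAYOUT_HIDE_NAVIGATION
        | View.SYSTEM_UI_FLAG_LAYOUT_FULLSCREEN
        | View.SYSTEM_UI_FLAG_HIDE_NAVIGATION   // hide the navigation bar
        | View.SYSTEM_UI_FLAG_FULLSCREEN        // hide the status bar
        | View.SYSTEM_UI_FLAG_IMMERSIVE);       // or SYSTEM_UI_FLAG_IMMERSIVE_STICKY
</pre>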
<h4 id="44-transitions">Transitions framework for animating scenes</h4>
<p>
Most apps structure their flows around several key UI states that expose
different actions. Many apps also use animation to help users understand
their progress through those states and the actions available in each. To
make it easier to create <strong>high-quality animations</strong> in your
app, <span style="white-space:nowrap;">Android 4.4</span> introduces a new
transitions framework.
</p>
<p>
The transitions framework lets you define <strong>scenes</strong>, typically
view hierarchies, and transitions, which describe how to animate or transform
the scenes when the user enters or exits them. You can use several predefined
transition types to animate your scenes based on specific properties, such as
layout bounds or visibility. There's also an auto-transition type that
automatically fades, moves, and resizes views during a scene change. In
addition, you can define custom transitions that animate the properties that
matter most to your app, and you can plug in your own animation styles if
needed.
</p>
<p>
With the transitions framework you can also <strong>animate changes to your
UI on the fly</strong>, without needing to define scenes. For example, you
can make a series of changes to a view hierarchy and then have the
TransitionManager automatically run a delayed transition on those changes.
</p>
<p>
Once you've set up transitions, it's straightforward to invoke them from your
app. For example, you can call a single method to begin a transition, make
various changes in your view hierarchy, and on the next frame, animations that
animate the changes you specified begin automatically.
</p>
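<p>
For example, here's a sketch of animating a layout change on the fly;
R.id.container, someView, and otherView are illustrative names for views in
your own layout.
</p>
<pre>
// Sketch: animate the next layout change inside sceneRoot automatically.
ViewGroup sceneRoot = (ViewGroup) findViewById(R.id.container);

TransitionManager.beginDelayedTransition(sceneRoot);
// Any view changes made now -- visibility, position, size -- are animated
// automatically on the next frame.
someView.setVisibility(View.GONE);
otherView.setVisibility(View.VISIBLE);
</pre>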
<div style="float:right;margin:0px 0px 22px 32px;width:340px;">
<img src="{@docRoot}images/kk-home.jpg" alt="translucent system UI" widtdh="340"
style="margin-bottom:0">
<p class="img-caption" style=
"padding-top:1.5em;line-height:1.25em;margin-bottom:0;">
Apps can use new window styles to request translucent system bars.
</p>
</div>
<p>
For custom control over the transitions that run between specific scenes in
your application flow, you can use the TransitionManager. The
TransitionManager lets you define the relationship between scenes and the
transitions that run for specific scene changes.
</p>
<h4 id="44-translucent-system-ui">Translucent system UI styling</h4>
<p>
To get the most impact out of your content, you can now use new window styles
and themes to request <strong>translucent system UI</strong>, including both
the status bar and navigation bar. To ensure the legibility of navigation bar
buttons and status bar information, a subtle gradient is shown behind the
system bars. A typical use-case would be an app that needs to show through to
a wallpaper.
</p>
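<p>
A minimal sketch of requesting translucent bars from code is shown below; you
can also set the equivalent windowTranslucentStatus and
windowTranslucentNavigation attributes in your theme.
</p>
<pre>
// Sketch: request translucent system bars from code (or via the
// windowTranslucentStatus / windowTranslucentNavigation theme attributes).
getWindow().addFlags(WindowManager.LayoutParams.FLAG_TRANSLUCENT_STATUS);
getWindow().addFlags(WindowManager.LayoutParams.FLAG_TRANSLUCENT_NAVIGATION);
</pre>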
<h4 id="44-notification-access">Enhanced notification access</h4>
<p>
Notification listener services can now see <strong>more information about
incoming notifications</strong> that were constructed using the notification
builder APIs. Listener services can access a notification's actions as well
as new extras fields &mdash; text, icon, picture, progress, chronometer, and
many others &mdash; to extract cleaner information about the notification and
present the information in a different way.
</p>
<div style="float:left;margin:1em 2em 1em 2em;">
<a href=""><img src="{@docRoot}images/kk-chromium-icon.png" alt="" height="160" style=
"margin-bottom:0em;"></a>
</div>
<h4 id="44-webview">Chromium WebView</h4>
<p>
<span style="white-space:nowrap;">Android 4.4</span> includes a completely
new implementation of WebView that's based on <a href=
"http://www.chromium.org/Home" class="external-link">Chromium</a>. The new
Chromium WebView gives you the latest in standards support, performance, and
compatibility to build and display your web-based content.
</p>
<p>
Chromium WebView provides broad support for HTML5, CSS3, and JavaScript. It
supports most of the HTML5 features available in Chrome for Android 30. It
also brings an updated version of the JavaScript Engine (V8) that delivers
dramatically improved JavaScript performance.
</p>
<p style="clear:both;">
In addition, the new Chromium WebView supports remote debugging using
<a class="external-link" href=
"https://devsite.googleplex.com/chrome-developer-tools/docs/remote-debugging#debugging-webviews">
Chrome DevTools</a>. For example, you can use Chrome DevTools on your
development machine to inspect, debug, and analyze your WebView content live
on a mobile device.
</p>
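<p>
To make your app's WebViews visible to DevTools, enable web contents
debugging; a common sketch is to do so only in debuggable builds.
</p>
<pre>
// Sketch: expose this app's WebViews to Chrome DevTools (debug builds only).
if (0 != (getApplicationInfo().flags &amp; ApplicationInfo.FLAG_DEBUGGABLE)) {
    WebView.setWebContentsDebuggingEnabled(true);
}
</pre>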
<p>
The new Chromium WebView is included on all compatible devices running
<span style="white-space:nowrap;">Android 4.4</span> and higher. You can take
advantage of the new WebView right away, and with minimum modifications to
existing apps and content. In most cases, your content will migrate to the
new implementation seamlessly.
</p>
<h2 id="44-media">New media capabilities</h2>
<h4 id="44-screen-recording">Screen recording</h4>
<p>
Now it's easy to create high-quality video of your app, directly from your
Android device. <span style="white-space:nowrap;">Android 4.4</span> adds
support for screen recording and provides a <strong>screen recording
utility</strong> that lets you capture video as you use the device and store
it as an MP4 file. It's a great new way to create walkthroughs and tutorials
for your app, testing materials, marketing videos, and much more.
</p>
<p>
You can record at any device-supported resolution and bitrate you want, and
the output retains the aspect ratio of the display. By default, the utility
selects a resolution equal to or close to the device's display resolution in the
current orientation. When you are done recording, you can share the video
directly from your device or pull the MP4 file to your host computer for
post-production.
</p>
<p>
If your app plays video or other protected content that you don’t want to be
captured by the screen recorder, you can use <span style=
"font-size:11.5px;font-family:monospace;white-space:nowrap;">SurfaceView.setSecure()</span>
to mark the content as secure.
</p>
<p>
You can access screen recording through the adb tool included in the Android
SDK, using the command <span style=
"font-size:11.5px;font-family:monospace;white-space:nowrap;">adb shell
screenrecord</span>. You can also launch it through the DDMS panel in Android
Studio.
</p>
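<p>
For example, a typical recording session might look like the sketch below; the
size, bit-rate, and time-limit flags are optional, and the output path on the
device is up to you.
</p>
<pre>
# Record up to 30 seconds of the display at 720p and 4 Mbps, then pull the file.
adb shell screenrecord --size 1280x720 --bit-rate 4000000 --time-limit 30 /sdcard/demo.mp4
adb pull /sdcard/demo.mp4
</pre>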
<h4 id="44-adaptive-playback">Resolution switching through adaptive playback</h4>
<p>
<span style="white-space:nowrap;">Android 4.4</span> brings formal support
for adaptive playback into the Android media framework. Adaptive playback is
an optional feature of video decoders for MPEG-DASH and other formats that
enables <strong>seamless change in resolution during playback</strong>. The
client can start feeding the decoder input video frames of a new resolution,
and the resolution of the output buffers changes automatically, without a
significant gap.
</p>
<p>
Resolution switching in <span style="white-space:nowrap;">Android 4.4</span>
lets media apps offer a significantly better streaming video experience. Apps
can check for adaptive playback support at runtime using existing APIs and
implement resolution-switching using new APIs introduced in <span style=
"white-space:nowrap;">Android 4.4</span>.
</p>
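<p>
A sketch of the runtime check follows: the adaptive-playback feature is
queried on a decoder's capabilities for the MIME type you plan to play; the
"video/avc" MIME type below is just an example.
</p>
<pre>
// Sketch: check whether a decoder for this MIME type supports adaptive playback.
String mimeType = "video/avc";   // example MIME type
boolean adaptive = false;
for (int i = 0; i &lt; MediaCodecList.getCodecCount(); i++) {
    MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
    if (info.isEncoder()) continue;
    for (String type : info.getSupportedTypes()) {
        if (type.equalsIgnoreCase(mimeType)) {
            MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(type);
            adaptive = caps.isFeatureSupported(
                    MediaCodecInfo.CodecCapabilities.FEATURE_AdaptivePlayback);
        }
    }
}
</pre>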
<h4 id="44-cenc">Common Encryption for DASH</h4>
<p>
Android now supports <strong>Common Encryption (CENC)</strong> for
MPEG-DASH, providing a standard, multiplatform DRM scheme for managing
protected content. Apps can take advantage of CENC through Android's modular
DRM framework and platform APIs for supporting DASH.
</p>
<h4 id="44-hls">HTTP Live Streaming</h4>
<p>
<span style="white-space:nowrap;">Android 4.4</span> updates the platform's
HTTP Live Streaming (HLS) support to a superset of version 7 of the HLS
specification (version 4 of the protocol). See the <a href=
"http://tools.ietf.org/html/draft-pantos-http-live-streaming-07" class=
"external-link">IETF draft</a> for details.
</p>
<h4 id="44-audio-tunneling">Audio Tunneling to DSP</h4>
<p>
For high-performance, lower-power audio playback, <span style=
"white-space:nowrap;">Android 4.4</span> adds platform support for
audio tunneling to a digital signal processor (DSP) in the
device chipset. With tunneling, audio decoding and output effects are
off-loaded to the DSP, waking the application processor less often and using
less battery.
</p>
<p>
Audio tunneling can <strong>dramatically improve battery life</strong> for
use-cases such as listening to music over a headset with the screen off. For
example, with audio tunneling, Nexus 5 offers a total off-network audio
playback time of up to 60 hours, an increase of over 50% over non-tunneled
audio.
</p>
<p>
Media applications can take advantage of audio tunneling on supported devices
without needing to modify code. The system applies tunneling to optimize
audio playback whenever it's available on the device.
</p>
<div style="float:right;padding-top:1em;width:372px;margin-left:2em;">
<img src="{@docRoot}images/kk-loudnessEnhancerAnnotated.png" alt=
"Visualizer showing loudness enhancer audio effect" width="360" height="252"
style="border:1px solid #ddd;border-radius: 6px;">
<p class="img-caption" style="margin-left:6px;line-height:1.25em;">
Visualization of how the LoudnessEnhancer effect can make speech content
more audible.
</p>
</div>
<p>
Audio tunneling requires support in the device hardware. Currently audio
tunneling is available on Nexus 5 and we're working with our chipset partners
to make it available on more devices as soon as possible.
</p>
<h4 id="44-audio-monitoring">Audio monitoring</h4>
<p>
Apps can use new monitoring tools in the Visualizer effect to get updates on
the <strong>peak and RMS levels</strong> of any currently playing audio on
the device. For example, you could use this creatively in music visualizers
or to implement playback metering in a media player.
</p>
<h4 id="44-loudness">Loudness enhancer</h4>
<p>
Media playback applications can <strong>increase the loudness of spoken
content</strong> by using the new LoudnessEnhancer effect, which acts as
a compressor with time constants that are specifically tuned for speech.
</p>
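<p>
A sketch of attaching the effect to an existing MediaPlayer's audio session is
shown below; the target gain is just an example value, expressed in millibels,
and mediaPlayer is assumed to be a player your app already manages.
</p>
<pre>
// Sketch: boost spoken content on an existing MediaPlayer's audio session.
LoudnessEnhancer enhancer = new LoudnessEnhancer(mediaPlayer.getAudioSessionId());
enhancer.setTargetGain(600);   // example value, in millibels (mB)
enhancer.setEnabled(true);
</pre>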
<h4 id="44-audio-timestamps">Audio timestamps for improved AV sync</h4>
<p>
The audio framework can now report <strong>presentation timestamps</strong>
from the audio output HAL to applications, for better audio-video
synchronization. Audio timestamps let your app determine when a specific
audio frame will be (or was) presented off-device to the user; you can use
the timestamp information to more accurately synchronize audio with video
frames.
</p>
<h4 id="44-miracast">Wi-Fi CERTIFIED Miracastâ„¢</h4>
<p>
<span style="white-space:nowrap;">Android 4.4</span> devices can now be
certified to the Wi-Fi Alliance Wi-Fi Display Specification as Miracast
compatible. To help with testing, a new Wireless Display developer option
exposes advanced configuration controls and settings for Wireless Display
certification. You can access the option at <strong>Settings &gt; Developer
options &gt; Wireless display certification</strong>. Nexus 5 is a Miracast
certified wireless display device.
</p>
<h2 id="44-renderscript">RenderScript Compute</h2>
<div style="float:right;padding-top:1em;width:372px;margin-left:2em;">
<img src="{@docRoot}images/kk-rs-chart-versions.png" alt=
"Renderscipt optimizations chart" width="360" height="252" style=
"border:1px solid #ddd;border-radius: 6px;">
<p class="img-caption" style="margin-left:6px;line-height:1.25em;">
Performance benchmarks for Android&nbsp;4.4 relative to Android&nbsp;4.3,
run on the same devices (Nexus 7, Nexus 10).
</p>
</div>
<h4>Ongoing performance improvements</h4>
<p>
When your apps use RenderScript, they'll benefit from <strong>ongoing
performance tuning</strong> in the RenderScript runtime itself, without the
need for recompilation. The chart at right shows performance gains in Android
4.4 on two popular chipsets.
</p>
<h4>GPU acceleration</h4>
<p>
Any app using RenderScript on a supported device benefits from GPU
acceleration, without code changes or recompiling. Since RenderScript GPU
acceleration first debuted on the Nexus 10, various other hardware partners
have added support.
</p>
<p>
Now with <span style="white-space:nowrap;">Android 4.4</span>, GPU
acceleration is available on the Nexus 5, as well as the Nexus 4, Nexus 7
(2013), and Nexus 10, and we're working with our partners to bring it to more
devices as soon as possible.
</p>
<h4 id="44-renderscript-ndk">RenderScript in the Android NDK</h4>
<p>
Now you can take advantage of RenderScript <strong>directly from your native
code</strong>. A new C++ API in the Android Native Development Kit (NDK) lets
you access the same RenderScript functionality available through the
framework APIs, including script intrinsics, custom kernels, and more.
</p>
<p>
If you have large, performance-intensive tasks to handle in native code, you
can perform those tasks using RenderScript and integrate them with your
native code. RenderScript offers great performance across a wide range of
devices, with automatic support for multi-core CPUs, GPUs, and other
processors.
</p>
<p>
When you build an app that uses RenderScript through the NDK, you can
distribute it to any device running Android 2.2 or higher, just like with
the RenderScript support library available for framework APIs.
</p>
<h2 id="44-graphics">Graphics</h2>
<h4 id="44-surfaceflinger">GLES2.0 SurfaceFlinger</h4>
<p>
<span style="white-space:nowrap;">Android 4.4</span> upgrades its
SurfaceFlinger from OpenGL ES 1.0 to OpenGL ES 2.0. This boosts performance
by using multi-texturing, and it improves color calibration and supports more
advanced special effects.
</p>
<h4 id="44-composer">New Hardware Composer support for virtual displays</h4>
<p>
The latest version of Android Hardware Composer, HWComposer 1.3, supports
hardware composition of one virtual display in addition to the primary,
external (e.g. HDMI) display, and has improved OpenGL ES interoperability.
</p>
<h2 id="44-connectivity">New Types of Connectivity</h2>
<h4 id="44-bluetooth">New Bluetooth profiles</h4>
<p>
<span style="white-space:nowrap;">Android 4.4</span> support for two new
Bluetooth profiles to let apps support a broader range of low-power and media
interactions. <strong>Bluetooth HID over GATT</strong> (HOGP) gives apps a
low-latency link with low-power peripheral devices such as mice, joysticks,
and keyboards. <strong>Bluetooth MAP</strong> lets your apps exchange
messages with a nearby device, for example an automotive terminal for
handsfree use or another mobile device. As an <strong>extension to Bluetooth
AVRCP 1.3</strong>, users can now set absolute volume on the system from
their Bluetooth devices.
</p>
<p>
Platform support for HOGP, MAP, and AVRCP is built on the Bluedroid Bluetooth
stack introduced by Google and Broadcom in Android 4.2. Support is available
right away on Nexus devices and other Android-compatible devices that offer
compatible Bluetooth capabilities.
</p>
<h4 id="44-ir-blasters">IR Blasters</h4>
<p>
<span style="white-space:nowrap;">Android 4.4</span> introduces platform
support for built-in <strong>IR blasters</strong>, along with a new API and
system service that let you create apps to take advantage of them.
</p>
<p>
Using the new API, you can build apps that let users remotely control nearby
TVs, tuners, switches, and other electronic devices. The API lets your app
check whether the phone or tablet has an infrared emitter, query its carrier
frequencies, and then send infrared signals.
</p>
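<p>
As a sketch, the consumer IR service lets you check for an emitter and
transmit a pattern; the carrier frequency and on/off pattern below are
placeholders, not a real remote-control code.
</p>
<pre>
// Sketch: check for an IR emitter and transmit a signal.
ConsumerIrManager irManager =
        (ConsumerIrManager) getSystemService(Context.CONSUMER_IR_SERVICE);

if (irManager != null &amp;&amp; irManager.hasIrEmitter()) {
    // Placeholder on/off pattern and 38 kHz carrier; real codes depend on the
    // device you want to control.
    int[] pattern = {1000, 1000, 2000, 2000};
    irManager.transmit(38000, pattern);
}
</pre>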
<p>
Because the API is standard across Android devices running <span style=
"white-space:nowrap;">Android 4.4</span> or higher, your app can support the
broadest possible range of vendors without writing custom integration code.
</p>
<h4 id="44-wifi-tdls">Wi-Fi TDLS support</h4>
<p>
<span style="white-space:nowrap;">Android 4.4</span> introduces a seamless
way to stream media and other data faster between devices already on the same
Wi-Fi network by supporting Wi-Fi Tunneled Direct Link Setup (TDLS).
</p>
<h2 id="44-accessibility">Accessibility</h2>
<h4 id="44-closed-captioning">System-wide settings for closed captioning</h4>
<p>
<span style="white-space:nowrap;">Android 4.4</span> now supports a better
accessibility experience across apps by adding system-wide preferences for
Closed Captioning. Users can go to <strong>Settings</strong> &gt;
<strong>Accessibility</strong> &gt; <strong>Captions</strong> to set global
captioning preferences, such as whether to show captions and what language,
text size, and text style to use.
</p>
<p>
Apps that use video can now access the user's captioning settings and
<strong>adjust presentation to meet the user's preferences</strong>. A new
captioning manager API lets you check and monitor the user's captioning
preferences. The captioning manager provides you with the user's preferred
captioning state as well as preferred locale, scaling factor, and text style.
The text style includes foreground and background colors, edge properties,
and typeface.
</p>
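<p>
A sketch of reading the user's captioning preferences follows; a video app
would apply these values to its own caption rendering.
</p>
<pre>
// Sketch: read the user's system-wide captioning preferences.
CaptioningManager captioningManager =
        (CaptioningManager) getSystemService(Context.CAPTIONING_SERVICE);

boolean enabled = captioningManager.isEnabled();
float fontScale = captioningManager.getFontScale();
Locale locale = captioningManager.getLocale();            // may be null for default
CaptioningManager.CaptionStyle style = captioningManager.getUserStyle();
int foreground = style.foregroundColor;
int background = style.backgroundColor;
Typeface typeface = style.getTypeface();                  // may be null for default
</pre>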
<div style="float:right;margin:22px 0px 0px 24px;width:490px;">
<img src="{@docRoot}images/kk-captions-n5.jpg" alt="" width="471" style=
"margin-bottom:0;">
<p class="img-caption" style=
"padding-top:1.5em;margin-left:6px;line-height:1.25em;width:480px;">
Apps can now refer to the user's <strong>system-wide captions
preferences</strong>. An example of the expected display style is shown
right in the settings.
</p>
</div>
<p>
In addition, apps that use <strong>VideoView</strong> can use a new API to
pass a captioning stream along with a video stream for rendering. The system
automatically handles the display of the captions on video frames according
to the user's systemwide settings. Currently, VideoView supports auto-display
of captions in WebVTT format only.
</p>
<p>
<strong>All apps that show captions</strong> should make sure to check the
user's systemwide captioning preferences and render captions as closely as
possible to those preferences. For more insight into how specific
combinations of settings should look, you can look at a preview of captions
in different languages, sizes, and styles right in the Settings app.
</p>
<h4 id="44-enhanced-apis">Enhanced Accessibility APIs</h4>
<p>
<span style="white-space:nowrap;">Android 4.4</span> extends the
accessibility APIs to support <strong>more precise structural and semantic
description</strong> and observation of onscreen elements. With the new APIs,
developers can improve the quality of accessible feedback by providing
accessibility services with more information about on-screen elements.
</p>
<p>
In accessibility nodes, developers can now determine whether a node is a
popup, get its input type, and more. You can also use new APIs to work with
nodes that contain grid-like information, such as lists and tables. For
example, you can now specify new supported actions, collection information,
live region modes, and more.
</p>
<p>
New accessibility events let developers more closely follow the changes that
are taking place in window content, and they can now listen for changes in
the touch exploration mode on the device.
</p>
<h2 id="44-international-users">Support for international Users</h2>
<h4 id="44-drawable-mirroring">Drawable mirroring for RTL locales</h4>
<p>
If your app is targeting users who use RTL scripts, you can use a new API to
declare that a <strong>drawable should be auto-mirrored</strong> when the
user's locale setting includes an RTL language.
</p>
<p>
Declaring a drawable as auto-mirrored helps you <strong>prevent duplication
of assets</strong> in your app and reduces the size of your APK. When you
have drawables that are reusable for both LTR and RTL presentations, you
can declare the default versions as auto-mirrored and then omit those
Drawables from your RTL resources.
</p>
<div style="float:right;margin:16px 12px 0px 32px;width:260px;clear:both;">
<img src="{@docRoot}images/kk-pseudolocale-rtl.png" alt="" width="260" style=
"margin-bottom:0;">
<p class="img-caption" style="padding-top:1.5em;line-height:1.25em;">
Pseudo-locales make it easier to test your app's localization.
</p>
</div>
<p>
You can declare various types of drawables as auto-mirrored in your
application code, such as bitmap, nine-patch, layer, state list, and other
drawables. You can also declare a drawable as auto-mirrored in your resource
files by using a new attribute.
</p>
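<p>
For example, here's a sketch of marking a drawable as auto-mirrored in code;
the ic_forward resource name is illustrative.
</p>
<pre>
// Sketch: mark a drawable so the system mirrors it automatically in RTL locales.
Drawable forwardArrow = getResources().getDrawable(R.drawable.ic_forward);
forwardArrow.setAutoMirrored(true);

// The equivalent in a drawable resource file is the android:autoMirrored="true"
// attribute.
</pre>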
<h4 id="44-pseudolocale-rtl">RTL pseudo-locale</h4>
<p>
To make it easier to test and debug your layouts, Android includes an RTL
pseudo-locale as a new developer option.
</p>
<p>
The RTL pseudo-locale switches the device to RTL layout for all locales and
displays text in your current language. This can help you find layout issues
across your app, without having to display the app in an RTL language. You
can enable the RTL pseudo-locale in <strong>Settings &gt; Developer
options &gt; Force RTL layout direction</strong>.
</p>
<h2 id="44-security">Security enhancements</h2>
<h4 id="44-selinux">SELinux (enforcing mode)</h4>
<p>
<span style="white-space:nowrap;">Android 4.4</span> updates its SELinux
configuration from "permissive" to "enforcing." This means potential policy
violations within a SELinux domain that has an enforcing policy will be
blocked.
</p>
<h4 id="44-crytpo">Improved cryptographic algorithms</h4>
<p>
Android has improved its security further by adding support for two more
cryptographic algorithms. Elliptic Curve Digital Signature Algorithm (ECDSA)
support has been added to the keystore provider improving security of digital
signing, applicable to scenarios such as signing of an application or a data
connection. The Scrypt key derivation function is implemented to protect the
cryptographic keys used for full-disk encryption.
</p>
<h4 id="44-other">Other enhancements</h4>
<p>
On multiuser devices, VPNs are now applied per user. This can allow a user to
route all network traffic through a VPN without affecting other users on the
device. Also, Android now supports FORTIFY_SOURCE level 2, and all code is
compiled with those protections. FORTIFY_SOURCE has been enhanced to work
with clang.
</p>
<h2 id="44-tools">Tools for analyzing memory use</h2>
<h4 id="44-procstats">Procstats</h4>
<p>
A new tool called <strong>procstats</strong> helps you analyze the memory
resources your app uses, as well as the resources used by other apps and
services running on the system.
</p>
<p>
Procstats keeps track of <strong>how apps are running over time</strong>,
providing data about their execution durations and memory use to help
determine how efficiently they are performing. This is most important for
apps that start services that run in the background, since it lets you
monitor how long they are running and how much RAM they are using while doing
so. Procstats will also collect data for foreground applications about memory
use over time to determine the overall memory profile of the app.
</p>
<p>
Procstats can help you identify background services started by your app. You
can keep track of how long those services continue running and how much RAM
they use while doing so. Procstats also lets you profile your app while it's
in the foreground, using its memory use over time to determine its overall
memory profile.
</p>
<div style="margin:2em 0em;width:780px;">
<div style="float:left;width:390px;">
<img src="{@docRoot}images/kk-procstats.png" alt="" width="360" style=
"margin-bottom:0;box-shadow: 3px 10px 18px 1px #eee;border:1px solid #ddd;border-radius: 6px;">
<p class="img-caption" style=
"padding-top:1.5em;line-height:1.25em;width:360px;">
The new <strong>procstats</strong> tool lets you check the memory use of
apps and services over time.
</p>
</div>
<div style="float:right;width:390px;">
<img src="{@docRoot}images/kk-meminfo.png" alt="" width="360" style=
"margin-bottom:0;box-shadow: 3px 10px 12px 1px #eee;border:1px solid #ddd;border-radius: 6px;">
<p class="img-caption" style=
"padding-top:1.5em;line-height:1.25em;width:360px;">
The enhanced <strong>meminfo</strong> tool lets you see details of memory
use for an app.
</p>
</div>
</div>
<p style="clear:both;">
You can access procstats from the adb tool included in the Android SDK, using the command
<span style="font-size:11.5px;font-family:monospace;white-space:nowrap;">adb
shell dumpsys procstats</span>. Also, for on-device profiling, see the
Process Stats developer option, below.
</p>
<h4 id="44-procstats-ondevice" style="clear:both">On-device memory status and profiling</h4>
<p>
<span style="white-space:nowrap;">Android 4.4</span> includes a new developer
option to make it easier to analyze your app's memory profile while it's
running on any device or emulator. It's especially useful to get a view of
how your app uses memory and performs on devices with low RAM. You can access
the option at <strong>Settings &gt; Developer options &gt; Process
stats</strong>.
</p>
<div style="float:right;margin:22px 0px 0px 24px;width:490px;">
<img src="{@docRoot}images/kk-proc-device-overview-n5.jpg" alt="" width="240" style=
"margin-bottom:0;"> <img src="{@docRoot}images/kk-proc-device-detail-n5.jpg" alt=""
width="240" style="margin-bottom:0;padding-left:6px;">
<p class="img-caption" style=
"padding-top:1.5em;margin-left:6px;line-height:1.25em;width:480px;">
<strong>Process stats</strong> is a convenient way to check your app's
memory use. You can see how your app compares to other apps and zoom in on
specific data about your app or its background services.
</p>
</div>
<p>
The <strong>Process Stats</strong> option shows you a variety of high-level
metrics on your app's memory use, based on data collected using the new
procstats service. On the main screen you can see a summary of system memory
status. Green indicates the relative amount of time spent with low RAM usage,
yellow indicates moderate RAM usage, and red indicates high (critical) RAM
usage.
</p>
<p>
Below the summary is a list summarizing each app's <strong>memory load on the
system</strong>. For each app, a blue bar indicates the relative computed
memory load (runtime x avg_pss) of its process, and a percentage number
indicates the relative amount of time spent in the background. You can filter
the list to show only foreground, background, or cached processes, and you
can include or exclude system processes. You can also change the duration of
the data collected to 3, 6, 12, or 24 hours, and you can include or exclude
uss memory.
</p>
<p>
To take a closer look at a specific app's memory usage in isolation, tap the
app. For each app, you can now see a summary of the memory consumed and the
percentage of the collection interval that the app has been running. You can
also see the average and maximum usage over the collection period, and, below
that, the app's services and the percentage of time they've been running.
</p>
<p>
Analyzing your app using the data in Process Stats can reveal issues and
suggest possible optimizations for your app. For example, if your app is
running longer than it should or using too much memory over a period of time,
there could be bugs in your code that you can resolve to improve your app's
performance, especially when running on a device with low RAM.
</p>
</div><!-- END ANDROID 4.4 -->