
Automating Collection of Android* Application Compile and Launch Time

BY 01 Staff ON Sep 20, 2019

INTRODUCTION

Android, the mobile operating system developed by Google*, leads the smartphone segment and keeps pace with customer expectations through novel features and an enhanced user experience.

Android Runtime performance is measured by certain critical user experience factors, including application compile time and launch time. Collecting this data manually requires dedicated human effort, consumes significant time, and is prone to inaccuracies during data collection. These challenges motivated us to create an automated script that overcomes the inefficiencies in the data collection process.

The techniques and tools described in this article helped us reduce the time taken to collect compile-time data by 74.85% and launch-time data by 75.61% for the four Android apps that we tested on Intel® platforms. The tool can collect the data for any number of apps, provided they do not require sign-in.

BACKGROUND

Android is based on a modified version of the Linux* kernel and other open source software. Android's default user interface is primarily based on direct manipulation, using touch inputs that correspond to real-world actions, like swiping, tapping, pinching, and reverse-pinching to manipulate on-screen objects, along with a virtual keyboard.

User experience (UX) plays a critical role in the appeal of Android, including the responsiveness of the Android user interface and operating system. Android is a popular mobile operating system with a large number of applications, and the performance of these apps also plays a major role in the UX of Android. [Ref. 1]

APPROACH

The responsiveness of an app can be measured in two ways: the compile time and the launch time. The compile time is the time taken by the app to get installed onto the device. The launch time consists of first launch and subsequent launch measurements. First launch time is the time taken by the app to load and make its content visible to the user for the first time after the app has been installed. Subsequent launch time is the time taken by the app to load and display its contents after it has already been launched at least once.

This data provides insight into the performance and responsiveness of Android and the device on which it is installed. Collecting these values gives us a way to measure the responsiveness of Android. However, manually collecting the compile and launch times has significant challenges, including:

  • Not 100% accurate due to human error
  • Requires human resources to perform
  • Collections are tedious to perform

For these reasons, we developed an automated method for collecting compile time and launch time data.

Our first step was to define the different values used to measure the responsiveness of an Android application:

  • Compile Time: We take the dex2oat time as the compile time of the Android application. dex2oat takes a dex file and compiles it; the result is essentially an ELF file that is then executed natively. So instead of bytecode interpreted by a virtual machine, the app now has native code that the processor can execute directly. This is also called AOT (Ahead-of-Time) compilation.
  • First Launch: The first launch is also known as a cold start. A cold start happens when the app is launched for the first time since the device booted, or since the system killed the app. This type of start presents the greatest challenge in terms of minimizing startup time, because the system and app have more work to do than in the other launch states.
  • Subsequent Launch: A subsequent launch, also known as a hot start, is simpler and has lower overhead than a cold start. In a hot start, all the system does is bring the activity to the foreground. If the application’s activities are still resident in memory, then the app can avoid having to repeat object initialization, layout inflation, and rendering. [Ref. 2]
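The cold-start and hot-start measurements above can be sketched with adb and `am start -W`, which waits for the launch to complete and reports a TotalTime value. This is a minimal sketch, not the article's actual script: the package and activity names are placeholders, and it assumes a device is reachable over adb.

```shell
#!/bin/sh
# Minimal sketch of cold vs. subsequent (hot) launch measurement.
# com.example.app/.MainActivity is a placeholder package/activity.
PKG=com.example.app
ACT=.MainActivity

# Extract the TotalTime value (ms) from `am start -W` output.
parse_total_time() {
  printf '%s\n' "$1" | awk -F': ' '/^TotalTime/ { print $2 }'
}

# Only talk to a device if adb is available and one is attached.
if command -v adb >/dev/null 2>&1 && adb get-state >/dev/null 2>&1; then
  adb shell am force-stop "$PKG"            # kill the process: next start is cold
  cold=$(adb shell am start -W -n "$PKG/$ACT")
  echo "cold launch: $(parse_total_time "$cold") ms"

  adb shell input keyevent KEYCODE_HOME     # background the app, keep it in memory
  hot=$(adb shell am start -W -n "$PKG/$ACT")
  echo "hot launch: $(parse_total_time "$hot") ms"
fi
```

Forcing `am force-stop` before the first launch makes the cold-start condition reproducible regardless of what ran earlier on the device.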

For collecting compile time, we read the data from logcat.

dex2oat logs two time values: the first is the time taken by dex2oat itself, and the second is the total CPU time consumed.
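As a sketch, the two dex2oat timings can be pulled out of a captured logcat line with sed. The exact log format varies across Android releases, so the sample line in the comment and the `adb logcat` filter are illustrative assumptions rather than the article's script.

```shell
#!/bin/sh
# Extract "<wall-time> <cpu-time>" from a dex2oat logcat line such as:
#   I dex2oat : dex2oat took 1.234s (2.100s cpu)
parse_dex2oat() {
  printf '%s\n' "$1" | sed -n 's/.*dex2oat took \([^ ]*\) (\([^ ]*\) cpu).*/\1 \2/p'
}

# With a device attached, candidate lines could be captured like:
#   adb logcat -d -s dex2oat | grep 'dex2oat took'
```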

There are two ways to consistently capture launch times:

  • Using a high-speed camera to capture the frames displayed by the device, then calculating the launch time by comparing the captured frames.
  • Collect the times from the logcat.

We chose to use the second method. We read the data from the ADB (Android Debug Bridge) logs to measure the first and subsequent launch times. [Ref. 3]

After the app is launched, the script gathers the Displayed time reported in logcat for the launched process (for example, com.android.myexample).

This time includes everything it took to launch the process: loading the code, initializing the classes used at start time, running layout, and drawing the app for the first time.
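The ActivityManager "Displayed" line can be parsed with a small sed expression. This is a sketch: the log format shown in the comment is typical, but it may vary by Android version.

```shell
#!/bin/sh
# Extract the duration from an ActivityManager "Displayed" line such as:
#   I ActivityManager: Displayed com.android.myexample/.MainActivity: +1s234ms
parse_displayed() {
  printf '%s\n' "$1" | sed -n 's/.*Displayed [^ ]*: +\(.*\)/\1/p'
}

# With a device attached, the line could be captured like:
#   adb logcat -d | grep 'ActivityManager.*Displayed'
```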

Figure 1 below shows the steps we defined to collect the compile time and launch time.

Figure 1: Steps to measure Compile Time, First Launch Time, Subsequent Launch Time

Our approach uses a bash script that collects launch time data and compile-time data according to the steps defined above.

The launch and compile-time data collected is not always consistent. To get consistent data, we need to detect outliers and deal with them appropriately. If outliers are present, we recollect the data and recheck whether the new data is consistent. These steps are repeated until we get consistent data.
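One simple way to implement the outlier check is the classic mean ± 2σ rule. The two-standard-deviation threshold is our illustrative choice here, not a detail from the article:

```shell
#!/bin/sh
# Print any samples lying more than two standard deviations from the mean.
# Usage: detect_outliers 100 102 98 101 99 300
detect_outliers() {
  printf '%s\n' "$@" | awk '
    { v[NR] = $1; sum += $1 }
    END {
      mean = sum / NR
      for (i = 1; i <= NR; i++) ss += (v[i] - mean) ^ 2
      sd = sqrt(ss / NR)
      for (i = 1; i <= NR; i++)
        if (sd > 0 && (v[i] < mean - 2 * sd || v[i] > mean + 2 * sd))
          print v[i]
    }'
}
```

If this prints anything, the run is discarded and the data is recollected, matching the retry loop described above.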

We chose four of the most widely used apps for Chromebooks* to test our script. Refer to Table 1 for the list of apps chosen.

Application                   Category
Adobe* Photoshop Lightroom    Photography
Dropbox*                      Storage
Evernote*                     Productivity
WeVideo*                      Video Players and Editors

Table 1: Apps Chosen for Measurement of Compile Time and Launch Time

RESULTS

Our evaluation platform is the Intel® NUC Kit NUC715DNHE, which uses an i5-7576U processor with 2 cores. The processor can reach up to 4 GHz in Turbo mode, and the system has 8 GB of RAM. We ensured that “Internet Speed Test” was executed before collecting the data to confirm that the internet bandwidth was the same during execution of the tests.

Performance Gain

Table 2 below shows the time saved by collecting the launch and compile-time data using the automated technique described in this article instead of collecting the data manually.

Metric          Manual (sec)    Automation (sec)    Time Saved
Compile Time    338             85                  74.85%
Launch Time     1816            443                 75.61%

Table 2: Execution Time in Manual and Automation Mode

CONCLUSION

This method has made the collection of launch time and compile time data easier and faster. It can be integrated with regular testing methods, so that we can evaluate the launch time and compile time performance of various apps on different images.

 

ABOUT THE AUTHORS

This article was written by Jaishankar Rajendran, Biboshan Banerjee, Anuvarshini B.C., and Yasoda Aravapalli, who are members of the Google OS Run Times Team at Intel Technology India Pvt. Ltd.

Contact us via email to ask questions or discuss issues: jaishankar.rajendran@intel.com, biboshan.banerjee@intel.com, anuvarshini.bc@intel.com, or yasoda.aravapalli@intel.com

NOTICES AND DISCLAIMERS

Tests document performance of components on a particular test, in specific systems. Differences in hardware, software, or configuration will affect actual performance. Consult other sources of information to evaluate performance as you consider your purchase. For more complete information about performance and benchmark results, visit www.intel.com/benchmarks.

TEST CONFIGURATION

Software: Android 9.0, Kernel 4.19, OpenGL ES 3.1 Support, Fast Boot

Hardware: Intel® Core™ i7-7576U Processor, 2x4.0 GHz CPU, 8 GB RAM

Intel technologies’ features and benefits depend on system configuration and may require enabled hardware, software or service activation. Performance varies depending on system configuration. No computer system can be absolutely secure. Check with your system manufacturer or retailer or learn more at www.intel.com

REFERENCES

[1]   S. Kapoor, "Mobile App Design Fundamentals: User Experience vs. User Interface," [Online]. Available: https://clearbridgemobile.com/mobile-app-design-fundamentals-user-experience-user-interface/

[2]   App Startup Time, 23 January 2019. [Online]. Available: https://developer.android.com/topic/performance/vitals/launch-time

[3]   C. Haase, "Measuring Activity Startup Time", 29 October 2015. [Online]. Available: http://graphics-geek.blogspot.com/2015/10/measuring-activity-startup-time.html