[automerger skipped] ATest: Fix unit test failure. am: 99a7175286 am: 86f3578111 -s ours

am skip reason: Merged-In I96fb3d3f840d4f45ba6558cbf4320a2685eb0715 with SHA-1 99a7175286 is already in history

Original change: https://googleplex-android-review.googlesource.com/c/platform/tools/asuite/+/13698205

Change-Id: Iaf6a68c0513e3a6c6ad814c4ccc1bc3132c303b7
diff --git a/OWNERS b/OWNERS
index 7e1af30..925d49c 100644
--- a/OWNERS
+++ b/OWNERS
@@ -1,5 +1,6 @@
+albaltai@google.com
 dshi@google.com
 kevcheng@google.com
-albaltai@google.com
+morrislin@google.com
 patricktu@google.com
 yangbill@google.com
diff --git a/aidegen/Android.bp b/aidegen/Android.bp
index 8b3554d..84e46df 100644
--- a/aidegen/Android.bp
+++ b/aidegen/Android.bp
@@ -12,6 +12,10 @@
 // See the License for the specific language governing permissions and
 // limitations under the License.
 
+package {
+    default_applicable_licenses: ["Android-Apache-2.0"],
+}
+
 python_defaults {
     name: "aidegen_default",
     pkg_path: "aidegen",
@@ -39,6 +43,7 @@
     libs: [
         "atest_module_info",
         "asuite_cc_client",
+        "asuite_plugin_lib",
     ],
     dist: {
         targets: ["droidcore"],
@@ -57,6 +62,22 @@
     ]
 }
 
+python_library_host {
+    name: "aidegen_lib_common_util",
+    defaults: ["aidegen_default"],
+    srcs: [
+        "lib/common_util.py",
+        "lib/config.py",
+        "lib/errors.py",
+        "constant.py",
+        "templates.py"
+    ],
+    exclude_srcs: [
+        "*_unittest.py",
+        "*/*_unittest.py",
+    ]
+}
+
 python_test_host {
     name: "aidegen_unittests",
     main: "aidegen_run_unittests.py",
@@ -68,11 +89,13 @@
         "test_data/**/*",
     ],
     libs: [
-        "py-mock",
         "atest_module_info",
         "asuite_cc_client",
     ],
     test_config: "aidegen_unittests.xml",
-    test_suites: ["general-tests"],
+    test_suites: ["null-suite"],
     defaults: ["aidegen_default"],
+    test_options: {
+        unit_test: false,
+    },
 }
diff --git a/aidegen/README.md b/aidegen/README.md
index 738bca3..bbc16b3 100755
--- a/aidegen/README.md
+++ b/aidegen/README.md
@@ -1,63 +1,105 @@
-AIDEgen aims to automate the project setup process for developers to work on
-Java project in popular IDE environment. Developers no longer need to
-manually configure an IntelliJ project, such as all the project dependencies.
-It's a **command line tool** that offers the following features:
+# AIDEGen
 
-* Configure Intellij or Android Studio project files with the relevant module
-  dependencies resolved.
+AIDEGen aims to automate the project setup process for developers to work on
+Java or C/C++ projects in popular IDE environments. Developers no longer need
+to manually configure an IntelliJ project, such as resolving all the project
+dependencies. It's a
+**command line tool** that offers the following features:
 
-* Launch IDE for a specified sub-project or build target, i.e. frameworks/base
-  or Settings.
+*   Configure Android Studio or IntelliJ project files with the relevant module
+    dependencies resolved.
 
-* Launch IDE for a specified folder which contains build targets, i.e. cts.
+*   Launch IDE for a specified sub-project or build target, i.e. frameworks/base
+    or Settings.
 
-* Auto configure JDK and Android coding style for Intellij.
+*   Launch IDE for specified folder(s) which contain build targets, i.e. cts.
+
+*   Auto configure JDK and Android coding style for IntelliJ.
 
 ## 1. Prerequisites:
 
-    IDE installed and run $ '. build/envsetup.sh && lunch' under Android source
-    root folder.
+*   IDE installation: choose and install your preferred IDE among Android
+    Studio, IntelliJ IDEA, Eclipse, CLion and VS Code.
 
-## 2. Execution:
+*   Set up the Android development environment.
 
-    $ 'aidegen <module_name>... <module_path>...'
-      Example to generate and launch IntelliJ project for framework and
-      Settings:
-        $ aidegen Settings framework
-        $ aidegen packages/apps/Settings frameworks/base
-        $ aidegen packages/apps/Settings framework
+```
+$ source build/envsetup.sh && lunch <TARGET>
+```
 
-    $ 'aidegen <module> -i s'
-      Example to generate and launch Android Studio project for framework:
-        $ aidegen framework -i s
+## 2. Basic Usage:
 
-## 3. More argument:
+### Example 1: Launch IDE with module name
 
-    $ aidegen --help
+Example to generate and launch an IntelliJ project for framework and Settings:
 
-## 4. FAQ:
+```
+$ aidegen Settings framework
+```
 
-    1. Q: If I already have an IDE project file, and I run command AIDEGen to
-          generate the same project file again, what’ll happen?
-       A: The former IDEA project file will be overwritten by the newly
-          generated one from the aidegen command.
+### Example 2: Launch IDE with module path
 
-    2. Q: When do I need to re-run AIDEGen?
-       A: Re-run AIDEGen after repo sync.
+Example to generate and launch an IntelliJ project for framework and Settings:
 
-    3. Q: Does AIDEGen support debug log dump?
-       A: Use aidegen -v to get more debug information.
+```
+$ aidegen packages/apps/Settings frameworks/base
+```
 
-    4. Q: After the aidegen command run locally, if there’s no IDEA with
-          project shown up, what can I do ?
-       A: Basic steps to do troubleshooting:
-          - Make sure development environment is set up, please refer to
-            prerequisites section.
-          - Check error message in the aidegen command output.
+### Example 3: Launch IDE with build skipped
 
-# Hint
-    1. In Intellij, uses [File] > [Invalidate Caches / Restart…] to force
+Example to generate and launch an IntelliJ project for framework and Settings
+while skipping the build:
+
+```
+$ aidegen Settings framework -s
+```
+
+## 3. Optional arguments:
+
+Developers can also use the following optional arguments with AIDEGen commands.
+
+| Option | Long option       | Description                                     |
+| :----: | :---------------- | ----------------------------------------------- |
+| `-d`   | `--depth`         | The depth of module referenced by source.       |
+| `-i`   | `--ide`           | Launch IDE type: j=IntelliJ, s=Android Studio, e=Eclipse, c=CLion, v=VS Code. |
+| `-p`   | `--ide-path`      | Specify user's IDE installed path.              |
+| `-n`   | `--no_launch`     | Do not launch IDE.                              |
+| `-r`   | `--config-reset`  | Reset all AIDEGen's saved configurations.       |
+| `-s`   | `--skip-build`    | Skip building jars or modules.                  |
+| `-v`   | `--verbose`       | Displays DEBUG level logging.                   |
+| `-a`   | `--android-tree`  | Generate whole Android source tree project file for IDE. |
+| `-e`   | `--exclude-paths` | Exclude the directories in IDE.                 |
+| `-l`   | `--language`      | Launch IDE with a specific language: j=Java, c=C/C++, r=Rust. |
+| `-h`   | `--help`          | Shows help message and exits.                   |
+
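+For example (an illustrative invocation), the optional arguments can be
+combined; the following launches CLion for the C/C++ projects of
+frameworks/base and skips the build step:
+
+```
+$ aidegen frameworks/base -l c -s
+```
+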
+## 4. Troubleshooting tips:
+
+If you get an error like "Dependent modules dictionary is empty." or other
+errors, try running `make clean`.
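+
+For example (assuming a standard Android build environment):
+
+```
+$ make clean
+```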
+
+## 5. FAQ:
+
+Q1. If I already have an IDE project file, and I run command AIDEGen to generate
+the same project file again, what'll happen?
+
+A1: The former IDEA project file will be overwritten by the newly generated one
+from the aidegen command.
+
+Q2: When do I need to re-run AIDEGen?
+
+A2: Re-run AIDEGen after repo sync.
+
+## 6. Hint:
+
+1. In IntelliJ, use [File] > [Invalidate Caches / Restart...] to get the
        project panel updated when your IDE didn't sync.
 
-    2. If you run aidegen on a remote desktop, make sure there is no IntelliJ
+2. If you run aidegen on a remote desktop, make sure there is no IntelliJ
        running in a different desktop session.
+
+## 7. Feedback:
+
+If you have any questions or feedback, contact aidegen_tnn@google.com.
+
+If you have any bugs or feature requests, email them to buganizer-system+429701@google.com.
+
+
diff --git a/aidegen/aidegen_main.py b/aidegen/aidegen_main.py
index 05b8faf..aef2650 100644
--- a/aidegen/aidegen_main.py
+++ b/aidegen/aidegen_main.py
@@ -61,7 +61,7 @@
 from aidegen.vscode import vscode_native_project_file_gen
 from aidegen.vscode import vscode_workspace_file_gen
 
-AIDEGEN_REPORT_LINK = ('To report the AIDEGen tool problem, please use this '
+AIDEGEN_REPORT_LINK = ('To report an AIDEGen tool problem, please use this '
                        'link: https://goto.google.com/aidegen-bug')
 _CONGRATULATIONS = common_util.COLORED_PASS('CONGRATULATIONS:')
 _LAUNCH_SUCCESS_MSG = (
@@ -92,6 +92,7 @@
                         'languages as follows:\n{}\nPlease select the one you '
                         'would like to implement.\t')
 _LANGUAGE_OPTIONS = [constant.JAVA, constant.C_CPP]
+_NO_ANY_PROJECT_EXIST = 'There is no Java, C/C++ or Rust target.'
 
 
 def _parse_args(args):
@@ -131,11 +132,9 @@
     parser.add_argument(
         '-i',
         '--ide',
-        default=['j'],
-        # TODO(b/152571688): Show VSCode in help's Launch IDE type section at
-        # least until one of the launching native or Java features is ready.
+        default=['u'],
         help=('Launch IDE type, j: IntelliJ, s: Android Studio, e: Eclipse, '
-              'c: CLion.'))
+              'c: CLion, v: VS Code. The default value is \'u\': undefined.'))
     parser.add_argument(
         '-p',
         '--ide-path',
@@ -168,6 +167,17 @@
         dest='exclude_paths',
         nargs='*',
         help='Exclude the directories in IDE.')
+    parser.add_argument(
+        '-V',
+        '--version',
+        action='store_true',
+        help='Print aidegen version string.')
+    parser.add_argument(
+        '-l',
+        '--language',
+        default=['u'],
+        help=('Launch IDE with a specific language, j: Java, c: C/C++, r: '
+              'Rust. The default value is \'u\': undefined.'))
     return parser.parse_args(args)
 
 
@@ -215,12 +225,12 @@
 
 
 def _launch_native_projects(ide_util_obj, args, cmakelists):
-    """Launches native projects with IDE.
+    """Launches C/C++ projects with IDE.
 
 AIDEGen provides the IDE argument for CLion, but there's still an implicit
     way to launch it. The rules to launch it are:
-    1. If no target IDE, we don't have to launch any IDE for native project.
-    2. If the target IDE is IntelliJ or Eclipse, we should launch native
+    1. If no target IDE, we don't have to launch any IDE for C/C++ project.
+    2. If the target IDE is IntelliJ or Eclipse, we should launch C/C++
        projects with CLion.
 
     Args:
@@ -252,73 +262,80 @@
         _launch_ide(ide_util_obj, projects[0].project_absolute_path)
 
 
-def _get_preferred_ide_from_user(all_choices):
-    """Provides the option list to get back users single choice.
-
-    Args:
-        all_choices: A list of string type for all options.
-
-    Return:
-        A string of the user's single choice item.
-    """
-    if not all_choices:
-        return None
-    options = []
-    items = []
-    for index, option in enumerate(all_choices, 1):
-        options.append('{}. {}'.format(index, option))
-        items.append(str(index))
-    query = _CHOOSE_LANGUAGE_MSG.format(len(options), '\n'.join(options))
-    input_data = input(query)
-    while input_data not in items:
-        input_data = input('Please select one.\t')
-    return all_choices[int(input_data) - 1]
-
-
-# TODO(b/150578306): Refine it when new feature added.
-def _launch_ide_by_module_contents(args, ide_util_obj, jlist=None, clist=None,
-                                   both=False):
+def _launch_ide_by_module_contents(args, ide_util_obj, language, jlist=None,
+                                   clist=None, rlist=None, all_langs=False):
     """Deals with the suitable IDE launch action.
 
-    The rules AIDEGen won't ask users to choose one of the languages are:
-    1. Users set CLion as IDE: CLion only supports C/C++.
-    2. Test mode is true: if AIDEGEN_TEST_MODE is true the default language is
-       Java.
+    The rules of AIDEGen launching IDE with languages are:
+      1. If no IDE or language is specified, the priority of the language is:
+         a) Java
+            aidegen frameworks/base
+            launch Java projects of frameworks/base in IntelliJ.
+         b) C/C++
+            aidegen hardware/interfaces/vibrator/aidl/default
+            launch C/C++ project of hardware/interfaces/vibrator/aidl/default
+            in CLion.
+         c) Rust
+            aidegen external/rust/crates/protobuf
+            launch Rust project of external/rust/crates/protobuf in VS Code.
+      2. If the IDE is specified, launch the related projects in that IDE.
+         a) aidegen frameworks/base -i j
+            launch Java projects of frameworks/base in IntelliJ.
+            aidegen frameworks/base -i s
+            launch Java projects of frameworks/base in Android Studio.
+            aidegen frameworks/base -i e
+            launch Java projects of frameworks/base in Eclipse.
+         b) aidegen frameworks/base -i c
+            launch C/C++ projects of frameworks/base in CLion.
+         c) aidegen external/rust/crates/protobuf -i v
+            launch Rust project of external/rust/crates/protobuf in VS Code.
+      3. If the language is specified, launch the projects of that language in
+         the corresponding IDE.
+         a) aidegen frameworks/base -l j
+            launch Java projects of frameworks/base in IntelliJ.
+         b) aidegen frameworks/base -l c
+            launch C/C++ projects of frameworks/base in CLion.
+         c) aidegen external/rust/crates/protobuf -l r
+            launch Rust projects of external/rust/crates/protobuf in VS Code.
+      4. If both the IDE and the language are specified, launch the IDE with
+         the projects of that language. If the IDE conflicts with the
+         language, the IDE takes priority over the language.
+         a) aidegen frameworks/base -i j -l j
+            launch Java projects of frameworks/base in IntelliJ.
+         b) aidegen frameworks/base -i s -l c
+            launch C/C++ projects of frameworks/base in Android Studio.
+         c) aidegen frameworks/base -i c -l j
+            launch C/C++ projects of frameworks/base in CLion.
 
     Args:
         args: A list of system arguments.
         ide_util_obj: An ide_util instance.
-        jlist: A list of java build targets.
-        clist: A list of native build targets.
-        both: A boolean, True to launch both languages else False.
+        language: A string of the language to be edited in the IDE.
+        jlist: A list of Java build targets.
+        clist: A list of C/C++ build targets.
+        rlist: A list of Rust build targets.
+        all_langs: A boolean, True to launch all languages else False.
     """
-    if both:
+    if all_langs:
         _launch_vscode(ide_util_obj, project_info.ProjectInfo.modules_info,
-                       jlist, clist)
+                       jlist, clist, rlist)
         return
-    if not jlist and not clist:
-        logging.warning('\nThere is neither java nor native module needs to be'
-                        ' opened')
+    if not (jlist or clist or rlist):
+        print(constant.WARN_MSG.format(
+            common_util.COLORED_INFO('Warning:'), _NO_ANY_PROJECT_EXIST))
         return
-    answer = None
-    if constant.IDE_NAME_DICT[args.ide[0]] == constant.IDE_CLION:
-        answer = constant.C_CPP
-    elif common_util.to_boolean(
-            os.environ.get(constant.AIDEGEN_TEST_MODE, 'false')):
-        answer = constant.JAVA
-    if not answer and jlist and clist:
-        answer = _get_preferred_ide_from_user(_LANGUAGE_OPTIONS)
-    if (jlist and not clist) or (answer == constant.JAVA):
+    if language == constant.JAVA:
         _create_and_launch_java_projects(ide_util_obj, jlist)
         return
-    if (clist and not jlist) or (answer == constant.C_CPP):
+    if language == constant.C_CPP:
         native_project_info.NativeProjectInfo.generate_projects(clist)
         native_project_file = native_util.generate_clion_projects(clist)
         if native_project_file:
             _launch_native_projects(ide_util_obj, args, [native_project_file])
 
 
-def _launch_vscode(ide_util_obj, atest_module_info, jtargets, ctargets):
+def _launch_vscode(ide_util_obj, atest_module_info, jtargets, ctargets,
+                   rtargets):
     """Launches targets with VSCode IDE.
 
     Args:
@@ -326,28 +343,90 @@
         atest_module_info: A ModuleInfo instance contains the data of
                 module-info.json.
         jtargets: A list of Java project targets.
-        ctargets: A list of native project targets.
+        ctargets: A list of C/C++ project targets.
+        rtargets: A list of Rust project targets.
     """
     abs_paths = []
-    for target in jtargets:
-        _, abs_path = common_util.get_related_paths(atest_module_info, target)
-        abs_paths.append(abs_path)
+    if jtargets:
+        abs_paths.extend(_get_java_project_paths(jtargets, atest_module_info))
     if ctargets:
-        cc_module_info = native_module_info.NativeModuleInfo()
-        native_project_info.NativeProjectInfo.generate_projects(ctargets)
-        vs_gen = vscode_native_project_file_gen.VSCodeNativeProjectFileGenerator
-        for target in ctargets:
-            _, abs_path = common_util.get_related_paths(cc_module_info, target)
-            vs_native = vs_gen(abs_path)
-            vs_native.generate_c_cpp_properties_json_file()
-            if abs_path not in abs_paths:
-                abs_paths.append(abs_path)
+        abs_paths.extend(_get_cc_project_paths(ctargets))
+    if rtargets:
+        root_dir = common_util.get_android_root_dir()
+        abs_paths.extend(_get_rust_project_paths(rtargets, root_dir))
+    if not (jtargets or ctargets or rtargets):
+        print(constant.WARN_MSG.format(
+            common_util.COLORED_INFO('Warning:'), _NO_ANY_PROJECT_EXIST))
+        return
     vs_path = vscode_workspace_file_gen.generate_code_workspace_file(abs_paths)
     if not ide_util_obj:
         return
     _launch_ide(ide_util_obj, vs_path)
 
 
+def _get_java_project_paths(jtargets, atest_module_info):
+    """Gets the Java absolute project paths from the input Java targets.
+
+    Args:
+        jtargets: A list of strings of Java targets.
+        atest_module_info: A ModuleInfo instance contains the data of
+                module-info.json.
+
+    Returns:
+        A list of the Java absolute project paths.
+    """
+    abs_paths = []
+    for target in jtargets:
+        _, abs_path = common_util.get_related_paths(atest_module_info, target)
+        if abs_path:
+            abs_paths.append(abs_path)
+    return abs_paths
+
+
+def _get_cc_project_paths(ctargets):
+    """Gets the C/C++ absolute project paths from the input C/C++ targets.
+
+    Args:
+        ctargets: A list of strings of C/C++ targets.
+
+    Returns:
+        A list of the C/C++ absolute project paths.
+    """
+    abs_paths = []
+    cc_module_info = native_module_info.NativeModuleInfo()
+    native_project_info.NativeProjectInfo.generate_projects(ctargets)
+    vs_gen = vscode_native_project_file_gen.VSCodeNativeProjectFileGenerator
+    for target in ctargets:
+        _, abs_path = common_util.get_related_paths(cc_module_info, target)
+        if not abs_path:
+            continue
+        vs_native = vs_gen(abs_path)
+        vs_native.generate_c_cpp_properties_json_file()
+        if abs_path not in abs_paths:
+            abs_paths.append(abs_path)
+    return abs_paths
+
+
+def _get_rust_project_paths(rtargets, root_dir):
+    """Gets the Rust absolute project paths from the input Rust targets.
+
+    Args:
+        rtargets: A list of strings of Rust targets.
+        root_dir: A string of the Android root directory.
+
+    Returns:
+        A list of the Rust absolute project paths.
+    """
+    abs_paths = []
+    for rtarget in rtargets:
+        path = rtarget
+        # If rtarget is not an absolute path, make it an absolute one.
+        if not common_util.is_source_under_relative_path(rtarget, root_dir):
+            path = os.path.join(root_dir, rtarget)
+        abs_paths.append(path)
+    return abs_paths
+
+
 @common_util.time_logged(message=_TIME_EXCEED_MSG, maximum=_MAX_TIME)
 def main_with_message(args):
     """Main entry with skip build message.
@@ -380,8 +459,16 @@
     """
     exit_code = constant.EXIT_CODE_NORMAL
     launch_ide = True
+    ask_version = False
     try:
         args = _parse_args(argv)
+        if args.version:
+            ask_version = True
+            version_file = os.path.join(os.path.dirname(__file__),
+                                        constant.VERSION_FILE)
+            print(common_util.read_file_content(version_file))
+            sys.exit(constant.EXIT_CODE_NORMAL)
+
         launch_ide = not args.no_launch
         common_util.configure_logging(args.verbose)
         is_whole_android_tree = project_config.is_whole_android_tree(
@@ -411,10 +498,11 @@
             print(traceback_str)
             raise err
     finally:
-        print('\n{0} {1}\n'.format(_INFO, AIDEGEN_REPORT_LINK))
-        # Send the end message here on ignoring launch IDE case.
-        if not launch_ide and exit_code is constant.EXIT_CODE_NORMAL:
-            aidegen_metrics.ends_asuite_metrics(exit_code)
+        if not ask_version:
+            print('\n{0} {1}\n'.format(_INFO, AIDEGEN_REPORT_LINK))
+            # Send the end message here on ignoring launch IDE case.
+            if not launch_ide and exit_code is constant.EXIT_CODE_NORMAL:
+                aidegen_metrics.ends_asuite_metrics(exit_code)
 
 
 def aidegen_main(args):
@@ -429,7 +517,7 @@
          use ProjectConfig.get_instance() inside the function.
       3. Setup project_info.ProjectInfo.modules_info by instantiate
          AidegenModuleInfo.
-      4. Check if projects contain native projects and launch related IDE.
+      4. Check if projects contain C/C++ projects and launch related IDE.
 
     Args:
         args: A list of system arguments.
@@ -437,16 +525,20 @@
     config = project_config.ProjectConfig(args)
     config.init_environment()
     targets = config.targets
-    # Called ide_util for pre-check the IDE existence state.
-    ide_util_obj = ide_util.get_ide_util_instance(args.ide[0])
     project_info.ProjectInfo.modules_info = module_info.AidegenModuleInfo()
     cc_module_info = native_module_info.NativeModuleInfo()
-    jtargets, ctargets = native_util.get_native_and_java_projects(
+    jtargets, ctargets, rtargets = native_util.get_java_cc_and_rust_projects(
         project_info.ProjectInfo.modules_info, cc_module_info, targets)
-    both = config.ide_name == constant.IDE_VSCODE
-    # Backward compatible strategy, when both java and native module exist,
+    config.language, config.ide_name = common_util.determine_language_ide(
+        args.language[0], args.ide[0], jtargets, ctargets, rtargets)
+    # Called ide_util for pre-check the IDE existence state.
+    ide_util_obj = ide_util.get_ide_util_instance(
+        constant.IDE_DICT[config.ide_name])
+    all_langs = config.ide_name == constant.IDE_VSCODE
+    # Backward compatible strategy, when both java and C/C++ module exist,
     # check the preferred target from the user and launch single one.
-    _launch_ide_by_module_contents(args, ide_util_obj, jtargets, ctargets, both)
+    _launch_ide_by_module_contents(args, ide_util_obj, config.language,
+                                   jtargets, ctargets, rtargets, all_langs)
 
 
 if __name__ == '__main__':
diff --git a/aidegen/aidegen_main_unittest.py b/aidegen/aidegen_main_unittest.py
index c84f16e..7c8950c 100644
--- a/aidegen/aidegen_main_unittest.py
+++ b/aidegen/aidegen_main_unittest.py
@@ -39,6 +39,7 @@
 from aidegen.lib import project_file_gen
 from aidegen.lib import project_info
 from aidegen.vscode import vscode_workspace_file_gen
+from aidegen.vscode import vscode_native_project_file_gen
 
 
 # pylint: disable=protected-access
@@ -59,7 +60,7 @@
         """Test _parse_args with different conditions."""
         args = aidegen_main._parse_args([])
         self.assertEqual(args.targets, [''])
-        self.assertEqual(args.ide[0], 'j')
+        self.assertEqual(args.ide[0], 'u')
         target = 'tradefed'
         args = aidegen_main._parse_args([target])
         self.assertEqual(args.targets, [target])
@@ -220,65 +221,36 @@
         self.assertTrue(mock_ide_util.launch_ide.called)
         mock_print.return_value = None
 
-    @mock.patch('builtins.input')
-    def test_get_preferred_ide_from_user(self, mock_input):
-        """Test get_preferred_ide_from_user with different conditions."""
-        test_data = []
-        aidegen_main._get_preferred_ide_from_user(test_data)
-        self.assertFalse(mock_input.called)
-        mock_input.reset_mock()
-
-        test_data = ['One', 'Two', 'Three']
-        mock_input.return_value = '3'
-        self.assertEqual('Three', aidegen_main._get_preferred_ide_from_user(
-            test_data))
-        self.assertEqual(1, mock_input.call_count)
-        mock_input.reset_mock()
-
-        mock_input.side_effect = ['7', '5', '3']
-        self.assertEqual('Three', aidegen_main._get_preferred_ide_from_user(
-            test_data))
-        self.assertEqual(3, mock_input.call_count)
-        mock_input.reset_mock()
-
-        mock_input.side_effect = ('.', '7', 't', '5', '1')
-        self.assertEqual('One', aidegen_main._get_preferred_ide_from_user(
-            test_data))
-        self.assertEqual(5, mock_input.call_count)
-        mock_input.reset_mock()
-
     @mock.patch.object(project_config.ProjectConfig, 'init_environment')
-    @mock.patch('logging.warning')
+    @mock.patch('builtins.print')
     @mock.patch.object(aidegen_main, '_launch_vscode')
     @mock.patch.object(aidegen_main, '_launch_native_projects')
     @mock.patch.object(native_util, 'generate_clion_projects')
     @mock.patch.object(native_project_info.NativeProjectInfo,
                        'generate_projects')
     @mock.patch.object(aidegen_main, '_create_and_launch_java_projects')
-    @mock.patch.object(aidegen_main, '_get_preferred_ide_from_user')
-    def test_launch_ide_by_module_contents(self, mock_choice, mock_j,
-                                           mock_c_prj, mock_genc, mock_c,
-                                           mock_vs, mock_log, mock_init):
+    def test_launch_ide_by_module_contents(self, mock_j, mock_c_prj, mock_genc,
+                                           mock_c, mock_vs, mock_print,
+                                           mock_init):
         """Test _launch_ide_by_module_contents with different conditions."""
         args = aidegen_main._parse_args(['', '-i', 's'])
         mock_init.return_value = None
         self._init_project_config(args)
         ide_obj = 'ide_obj'
-        test_both = False
-        aidegen_main._launch_ide_by_module_contents(args, ide_obj, None,
-                                                    None, test_both)
+        test_all = False
+        lang = constant.JAVA
+        aidegen_main._launch_ide_by_module_contents(args, ide_obj, lang, None,
+                                                    None, None, test_all)
         self.assertFalse(mock_vs.called)
-        self.assertTrue(mock_log.called)
-        self.assertFalse(mock_choice.called)
-        self.assertFalse(mock_choice.called)
+        self.assertTrue(mock_print.called)
         self.assertFalse(mock_j.called)
         self.assertFalse(mock_c_prj.called)
         self.assertFalse(mock_genc.called)
         self.assertFalse(mock_c.called)
 
-        test_both = True
-        aidegen_main._launch_ide_by_module_contents(args, ide_obj, None,
-                                                    None, test_both)
+        test_all = True
+        aidegen_main._launch_ide_by_module_contents(args, ide_obj, lang, None,
+                                                    None, None, test_all)
         self.assertTrue(mock_vs.called)
         self.assertFalse(mock_j.called)
         self.assertFalse(mock_genc.called)
@@ -287,8 +259,7 @@
 
         test_j = ['a', 'b', 'c']
         test_c = ['1', '2', '3']
-        mock_choice.return_value = constant.JAVA
-        aidegen_main._launch_ide_by_module_contents(args, ide_obj, test_j,
+        aidegen_main._launch_ide_by_module_contents(args, ide_obj, lang, test_j,
                                                     test_c)
         self.assertFalse(mock_vs.called)
         self.assertTrue(mock_j.called)
@@ -296,12 +267,14 @@
         self.assertFalse(mock_c.called)
 
         mock_vs.reset_mock()
-        mock_choice.reset_mock()
         mock_c.reset_mock()
         mock_genc.reset_mock()
         mock_j.reset_mock()
-        mock_choice.return_value = constant.C_CPP
-        aidegen_main._launch_ide_by_module_contents(args, ide_obj, test_j,
+        args = aidegen_main._parse_args(['', '-l', 'c'])
+        mock_init.return_value = None
+        self._init_project_config(args)
+        lang = constant.C_CPP
+        aidegen_main._launch_ide_by_module_contents(args, ide_obj, lang, test_j,
                                                     test_c)
         self.assertTrue(mock_c_prj.called)
         self.assertFalse(mock_vs.called)
@@ -310,24 +283,23 @@
         self.assertFalse(mock_j.called)
 
         mock_vs.reset_mock()
-        mock_choice.reset_mock()
         mock_c.reset_mock()
         mock_genc.reset_mock()
         mock_j.reset_mock()
         test_none = None
-        aidegen_main._launch_ide_by_module_contents(args, ide_obj, test_none,
-                                                    test_c)
+        aidegen_main._launch_ide_by_module_contents(args, ide_obj, lang,
+                                                    test_none, test_c)
         self.assertFalse(mock_vs.called)
         self.assertTrue(mock_genc.called)
         self.assertTrue(mock_c.called)
         self.assertFalse(mock_j.called)
 
         mock_vs.reset_mock()
-        mock_choice.reset_mock()
         mock_c.reset_mock()
         mock_genc.reset_mock()
         mock_j.reset_mock()
-        aidegen_main._launch_ide_by_module_contents(args, ide_obj, test_j,
+        lang = constant.JAVA
+        aidegen_main._launch_ide_by_module_contents(args, ide_obj, lang, test_j,
                                                     test_none)
         self.assertFalse(mock_vs.called)
         self.assertTrue(mock_j.called)
@@ -336,15 +308,14 @@
 
         args = aidegen_main._parse_args(['frameworks/base', '-i', 'c'])
         mock_vs.reset_mock()
-        mock_choice.reset_mock()
         mock_c.reset_mock()
         mock_genc.reset_mock()
         mock_c_prj.reset_mock()
         mock_j.reset_mock()
-        aidegen_main._launch_ide_by_module_contents(args, ide_obj, test_j,
+        lang = constant.C_CPP
+        aidegen_main._launch_ide_by_module_contents(args, ide_obj, lang, test_j,
                                                     test_c)
         self.assertFalse(mock_vs.called)
-        self.assertFalse(mock_choice.called)
         self.assertFalse(mock_j.called)
         self.assertTrue(mock_c.called)
         self.assertTrue(mock_c_prj.called)
@@ -352,15 +323,15 @@
 
         args = aidegen_main._parse_args(['frameworks/base'])
         mock_vs.reset_mock()
-        mock_choice.reset_mock()
         mock_c.reset_mock()
         mock_genc.reset_mock()
         mock_c_prj.reset_mock()
         mock_j.reset_mock()
         os.environ[constant.AIDEGEN_TEST_MODE] = 'true'
-        aidegen_main._launch_ide_by_module_contents(args, None, test_j, test_c)
+        lang = constant.JAVA
+        aidegen_main._launch_ide_by_module_contents(args, None, lang, test_j,
+                                                    test_c)
         self.assertFalse(mock_vs.called)
-        self.assertFalse(mock_choice.called)
         self.assertTrue(mock_j.called)
         self.assertFalse(mock_c.called)
         self.assertFalse(mock_c_prj.called)
@@ -398,7 +369,7 @@
             self, mock_get_rel, mock_gen_code, mock_launch_ide):
         """Test _launch_vscode function without ide object."""
         mock_get_rel.return_value = 'rel', 'abs'
-        aidegen_main._launch_vscode(None, mock.Mock(), ['Settings'], [])
+        aidegen_main._launch_vscode(None, mock.Mock(), ['Settings'], [], [])
         self.assertTrue(mock_get_rel.called)
         self.assertTrue(mock_gen_code.called)
         self.assertFalse(mock_launch_ide.called)
@@ -411,7 +382,8 @@
             self, mock_get_rel, mock_gen_code, mock_get_ide):
         """Test _launch_vscode function with ide object."""
         mock_get_rel.return_value = 'rel', 'abs'
-        aidegen_main._launch_vscode(mock.Mock(), mock.Mock(), ['Settings'], [])
+        aidegen_main._launch_vscode(
+            mock.Mock(), mock.Mock(), ['Settings'], [], [])
         self.assertTrue(mock_get_rel.called)
         self.assertTrue(mock_gen_code.called)
         self.assertTrue(mock_get_ide.called)
@@ -431,8 +403,104 @@
         self.assertFalse(mock_get_ide.called)
         self.assertTrue(mock_vscode.called)
 
+    @mock.patch('builtins.print')
+    @mock.patch.object(aidegen_main, '_launch_ide')
+    @mock.patch.object(vscode_workspace_file_gen,
+                       'generate_code_workspace_file')
+    @mock.patch.object(common_util, 'get_android_root_dir')
+    @mock.patch.object(aidegen_main, '_get_rust_project_paths')
+    @mock.patch.object(aidegen_main, '_get_cc_project_paths')
+    @mock.patch.object(aidegen_main, '_get_java_project_paths')
+    def test_launch_vscode_with_logic(self, mock_get_java, mock_get_cc,
+                                      mock_get_rust, mock_get_root, mock_gen,
+                                      mock_launch, mock_print):
+        """Test _launch_vscode with the logic tests."""
+        aidegen_main._launch_vscode(None, mock.Mock(), [], [], [])
+        self.assertFalse(mock_get_java.called)
+        self.assertFalse(mock_get_cc.called)
+        self.assertFalse(mock_get_rust.called)
+        self.assertFalse(mock_get_root.called)
+        self.assertFalse(mock_gen.called)
+        self.assertFalse(mock_launch.called)
+        self.assertTrue(mock_print.called)
+
+        mock_get_java.reset_mock()
+        mock_get_cc.reset_mock()
+        mock_get_rust.reset_mock()
+        mock_get_root.reset_mock()
+        mock_gen.reset_mock()
+        mock_launch.reset_mock()
+        mock_print.reset_mock()
+
+        aidegen_main._launch_vscode(
+            mock.Mock(), mock.Mock(), ['Java'], ['CC'], ['Rust'])
+        self.assertTrue(mock_get_java.called)
+        self.assertTrue(mock_get_cc.called)
+        self.assertTrue(mock_get_rust.called)
+        self.assertTrue(mock_get_root.called)
+        self.assertTrue(mock_gen.called)
+        self.assertTrue(mock_launch.called)
+
+    @mock.patch.object(common_util, 'get_related_paths')
+    def test_get_java_project_paths(self, mock_get_related):
+        """Test _get_java_project_paths with conditions."""
+        abs_path = 'a/b/c/d'
+        rel_path = 'c/d'
+        mock_get_related.return_value = rel_path, abs_path
+        self.assertEqual(
+            [abs_path], aidegen_main._get_java_project_paths(
+                ['Java'], mock.Mock()))
+        mock_get_related.return_value = None, None
+        self.assertEqual(
+            [], aidegen_main._get_java_project_paths(['Java'], mock.Mock()))
+
+    @mock.patch.object(
+        vscode_native_project_file_gen.VSCodeNativeProjectFileGenerator,
+        'generate_c_cpp_properties_json_file')
+    @mock.patch.object(
+        vscode_native_project_file_gen, 'VSCodeNativeProjectFileGenerator')
+    @mock.patch.object(common_util, 'get_related_paths')
+    @mock.patch.object(
+        native_project_info.NativeProjectInfo, 'generate_projects')
+    @mock.patch.object(native_module_info, 'NativeModuleInfo')
+    def test_get_cc_project_paths(self, mock_mod_info, mock_gen, mock_get_rel,
+                                  mock_gen_vs, mock_gen_vs_file):
+        """Test _get_cc_project_paths with conditions."""
+        mock_get_rel.return_value = None, None
+        self.assertEqual([], aidegen_main._get_cc_project_paths(['Java']))
+        self.assertTrue(mock_mod_info.called)
+        self.assertTrue(mock_gen.called)
+        self.assertTrue(mock_get_rel.called)
+        self.assertFalse(mock_gen_vs.called)
+        self.assertFalse(mock_gen_vs_file.called)
+
+        mock_mod_info.reset_mock()
+        mock_gen.reset_mock()
+        mock_get_rel.reset_mock()
+        mock_gen_vs.reset_mock()
+        mock_gen_vs_file.reset_mock()
+
+        abs_path = 'a/b/c/d'
+        rel_path = 'c/d'
+        mock_get_rel.return_value = rel_path, abs_path
+        self.assertEqual([abs_path], aidegen_main._get_cc_project_paths(['CC']))
+        self.assertTrue(mock_mod_info.called)
+        self.assertTrue(mock_gen.called)
+        self.assertTrue(mock_get_rel.called)
+        self.assertTrue(mock_gen_vs.called)
+
+    def test_get_rust_project_paths(self):
+        """Test _get_rust_project_paths with conditions."""
+        abs_path = 'a/b/c/d'
+        rel_path = 'c/d'
+        root = 'a/b'
+        self.assertEqual(
+            [abs_path], aidegen_main._get_rust_project_paths([abs_path], root))
+        self.assertEqual(
+            [abs_path], aidegen_main._get_rust_project_paths([rel_path], root))
+
     @mock.patch.object(aidegen_main, '_launch_ide_by_module_contents')
-    @mock.patch.object(native_util, 'get_native_and_java_projects')
+    @mock.patch.object(native_util, 'get_java_cc_and_rust_projects')
     @mock.patch.object(native_module_info, 'NativeModuleInfo')
     @mock.patch.object(module_info, 'AidegenModuleInfo')
     @mock.patch.object(ide_util, 'get_ide_util_instance')
@@ -448,14 +516,15 @@
         mock_config.return_value = config
         ide = mock.Mock()
         mock_get_ide.return_value = ide
-        mock_get_project.return_value = config.targets, []
+        mock_get_project.return_value = config.targets, [], []
         aidegen_main.aidegen_main(args)
         self.assertTrue(mock_config.called)
         self.assertTrue(mock_get_ide.called)
         self.assertTrue(mock_mod_info.called)
         self.assertTrue(mock_native.called)
         self.assertTrue(mock_get_project.called)
-        mock_launch_ide.assert_called_with(args, ide, config.targets, [], True)
+        mock_launch_ide.assert_called_with(
+            args, ide, constant.JAVA, config.targets, [], [], True)
 
         mock_config.mock_reset()
         mock_get_ide.mock_reset()
@@ -472,7 +541,8 @@
         self.assertTrue(mock_mod_info.called)
         self.assertTrue(mock_native.called)
         self.assertTrue(mock_get_project.called)
-        mock_launch_ide.assert_called_with(args, ide, config.targets, [], False)
+        mock_launch_ide.assert_called_with(
+            args, ide, constant.JAVA, config.targets, [], [], False)
 
 
 if __name__ == '__main__':
diff --git a/aidegen/constant.py b/aidegen/constant.py
index a52288c..b254c44 100644
--- a/aidegen/constant.py
+++ b/aidegen/constant.py
@@ -25,6 +25,7 @@
 GEN_JAVA_DEPS = 'SOONG_COLLECT_JAVA_DEPS'
 GEN_CC_DEPS = 'SOONG_COLLECT_CC_DEPS'
 GEN_COMPDB = 'SOONG_GEN_COMPDB'
+GEN_RUST = 'SOONG_GEN_RUST_PROJECT'
 AIDEGEN_TEST_MODE = 'AIDEGEN_TEST_MODE'
 
 # Constants for module's info.
@@ -63,13 +64,23 @@
 IDE_INTELLIJ = 'IntelliJ'
 IDE_ANDROID_STUDIO = 'Android Studio'
 IDE_CLION = 'CLion'
-IDE_VSCODE = 'VSCode'
+IDE_VSCODE = 'VS Code'
+IDE_UNDEFINED = 'Undefined IDE'
 IDE_NAME_DICT = {
     'j': IDE_INTELLIJ,
     's': IDE_ANDROID_STUDIO,
     'e': IDE_ECLIPSE,
     'c': IDE_CLION,
-    'v': IDE_VSCODE
+    'v': IDE_VSCODE,
+    'u': IDE_UNDEFINED
+}
+IDE_DICT = {
+    IDE_INTELLIJ: 'j',
+    IDE_ANDROID_STUDIO: 's',
+    IDE_ECLIPSE: 'e',
+    IDE_CLION: 'c',
+    IDE_VSCODE: 'v',
+    IDE_UNDEFINED: 'u'
 }
 
 # Constants for asuite metrics.
@@ -78,6 +89,9 @@
 EXIT_CODE_AIDEGEN_EXCEPTION = 1
 AIDEGEN_TOOL_NAME = 'aidegen'
 ANDROID_TREE = 'is_android_tree'
+TYPE_AIDEGEN_BUILD_TIME = 200
+TYPE_AIDEGEN_PRE_PROCESS_TIME = 201
+TYPE_AIDEGEN_POST_PROCESS_TIME = 202
 
 # Exit code of the asuite metrics for parsing xml file failed.
 XML_PARSING_FAILURE = 101
@@ -93,22 +107,30 @@
 BLUEPRINT_JAVA_JSONFILE_NAME = 'module_bp_java_deps.json'
 BLUEPRINT_CC_JSONFILE_NAME = 'module_bp_cc_deps.json'
 COMPDB_JSONFILE_NAME = 'compile_commands.json'
+RUST_PROJECT_JSON = 'rust-project.json'
 CMAKELISTS_FILE_NAME = 'clion_project_lists.txt'
 CLION_PROJECT_FILE_NAME = 'CMakeLists.txt'
 ANDROID_BP = 'Android.bp'
 ANDROID_MK = 'Android.mk'
 JAVA_FILES = '*.java'
+KOTLIN_FILES = '*.kt'
 VSCODE_CONFIG_DIR = '.vscode'
 ANDROID_MANIFEST = 'AndroidManifest.xml'
+VERSION_FILE = 'VERSION'
+INTERMEDIATES = '.intermediates'
+TARGET_R_SRCJAR = 'R.srcjar'
+NAME_AAPT2 = 'aapt2'
 
 # Constants for file paths.
 RELATIVE_NATIVE_PATH = 'development/ide/clion'
 RELATIVE_COMPDB_PATH = 'development/ide/compdb'
+UNZIP_SRCJAR_PATH_HEAD = 'aidegen_'
 
 # Constants for whole Android tree.
 WHOLE_ANDROID_TREE_TARGET = '#WHOLE_ANDROID_TREE#'
 
 # Constants for ProjectInfo or ModuleData classes.
+SRCJAR_EXT = '.srcjar'
 JAR_EXT = '.jar'
 TARGET_LIBS = [JAR_EXT]
 
@@ -127,9 +149,22 @@
 # Constants for the languages aidegen supports.
 JAVA = 'Java'
 C_CPP = 'C/C++'
+RUST = 'Rust'
+UNDEFINED = 'undefined'
+LANG_UNDEFINED = 'u'
+LANG_JAVA = 'j'
+LANG_CC = 'c'
+LANG_RUST = 'r'
+LANGUAGE_NAME_DICT = {
+    LANG_UNDEFINED: UNDEFINED,
+    LANG_JAVA: JAVA,
+    LANG_CC: C_CPP,
+    LANG_RUST: RUST
+}
 
 # Constants for error message.
 INVALID_XML = 'The content of {XML_FILE} is not valid.'
+WARN_MSG = '\n{} {}\n'
 
 # Constants for default modules.
 FRAMEWORK_ALL = 'framework-all'
diff --git a/aidegen/idea/iml.py b/aidegen/idea/iml.py
index 9e3343d..d533805 100644
--- a/aidegen/idea/iml.py
+++ b/aidegen/idea/iml.py
@@ -173,27 +173,42 @@
     def _generate_srcs(self):
         """Generates the source urls of the project's iml file."""
         srcs = []
-        for src in self._mod_info[constant.KEY_SRCS]:
+        framework_srcs = []
+        for src in self._mod_info.get(constant.KEY_SRCS, []):
+            if constant.FRAMEWORK_PATH in src:
+                framework_srcs.append(templates.SOURCE.format(
+                    SRC=os.path.join(self._android_root, src),
+                    IS_TEST='false'))
+                continue
             srcs.append(templates.SOURCE.format(
                 SRC=os.path.join(self._android_root, src),
                 IS_TEST='false'))
-        for test in self._mod_info[constant.KEY_TESTS]:
+        for test in self._mod_info.get(constant.KEY_TESTS, []):
+            if constant.FRAMEWORK_PATH in test:
+                framework_srcs.append(templates.SOURCE.format(
+                    SRC=os.path.join(self._android_root, test),
+                    IS_TEST='true'))
+                continue
             srcs.append(templates.SOURCE.format(
                 SRC=os.path.join(self._android_root, test),
                 IS_TEST='true'))
         self._excludes = self._mod_info.get(constant.KEY_EXCLUDES, '')
+
+        # To resolve duplicate package names, sources under frameworks/base
+        # get higher priority.
+        srcs = sorted(framework_srcs) + sorted(srcs)
         self._srcs = templates.CONTENT.format(MODULE_PATH=self._mod_path,
                                               EXCLUDES=self._excludes,
-                                              SOURCES=''.join(sorted(srcs)))
+                                              SOURCES=''.join(srcs))
 
     def _generate_dep_srcs(self):
         """Generates the source urls of the dependencies.iml."""
         srcs = []
-        for src in self._mod_info[constant.KEY_SRCS]:
+        for src in self._mod_info.get(constant.KEY_SRCS, []):
             srcs.append(templates.OTHER_SOURCE.format(
                 SRC=os.path.join(self._android_root, src),
                 IS_TEST='false'))
-        for test in self._mod_info[constant.KEY_TESTS]:
+        for test in self._mod_info.get(constant.KEY_TESTS, []):
             srcs.append(templates.OTHER_SOURCE.format(
                 SRC=os.path.join(self._android_root, test),
                 IS_TEST='true'))
@@ -201,19 +216,19 @@
 
     def _generate_jars(self):
         """Generates the jar urls."""
-        for jar in self._mod_info[constant.KEY_JARS]:
+        for jar in self._mod_info.get(constant.KEY_JARS, []):
             self._jars.append(templates.JAR.format(
                 JAR=os.path.join(self._android_root, jar)))
 
     def _generate_srcjars(self):
         """Generates the srcjar urls."""
-        for srcjar in self._mod_info[constant.KEY_SRCJARS]:
+        for srcjar in self._mod_info.get(constant.KEY_SRCJARS, []):
             self._srcjars.append(templates.SRCJAR.format(
                 SRCJAR=os.path.join(self._android_root, srcjar)))
 
     def _generate_dependencies(self):
         """Generates the dependency module urls."""
-        for dep in self._mod_info[constant.KEY_DEPENDENCIES]:
+        for dep in self._mod_info.get(constant.KEY_DEPENDENCIES, []):
             self._deps.append(templates.DEPENDENCIES.format(MODULE=dep))
 
     def _create_iml(self):
diff --git a/aidegen/idea/xml_gen.py b/aidegen/idea/xml_gen.py
index 5bb886c..1329760 100644
--- a/aidegen/idea/xml_gen.py
+++ b/aidegen/idea/xml_gen.py
@@ -127,7 +127,7 @@
         module_path: A string, the absolute path of the module.
         git_paths: A list of git paths.
     """
-    git_mappings = [_GIT_PATH.format(GIT_DIR=p) for p in git_paths]
+    git_mappings = [_GIT_PATH.format(GIT_DIR=p) for p in git_paths if p]
     vcs = XMLGenerator(module_path, 'vcs.xml')
     if module_path != common_util.get_android_root_dir() or not vcs.xml_obj:
         common_util.file_generate(vcs.xml_path, templates.XML_VCS.format(
diff --git a/aidegen/lib/aidegen_metrics.py b/aidegen/lib/aidegen_metrics.py
index 5bf1611..1511f74 100644
--- a/aidegen/lib/aidegen_metrics.py
+++ b/aidegen/lib/aidegen_metrics.py
@@ -121,3 +121,24 @@
     stack_trace = common_util.remove_user_home_path(stack_trace)
     log = common_util.remove_user_home_path(log)
     ends_asuite_metrics(exit_code, stack_trace, log)
+
+
+def performance_metrics(process_type, duration):
+    """ Records each process runtime and send it to clearcut.
+
+    Args:
+        process_type: An integer of process type.
+        duration: Runtime for a specific process.
+
+    Returns:
+        Boolean: False if the metrics module does not exist.
+                 True when the metrics are successfully sent.
+    """
+    if not metrics:
+        return False
+
+    metrics.LocalDetectEvent(
+        detect_type=process_type,
+        result=int(duration))
+    return True
diff --git a/aidegen/lib/clion_project_file_gen.py b/aidegen/lib/clion_project_file_gen.py
index ea51c27..3900c8b 100644
--- a/aidegen/lib/clion_project_file_gen.py
+++ b/aidegen/lib/clion_project_file_gen.py
@@ -129,17 +129,20 @@
         cc_path: A string of generated CLion project file's path.
     """
 
-    def __init__(self, mod_info):
+    def __init__(self, mod_info, parent_dir=None):
         """ProjectFileGenerator initialize.
 
         Args:
             mod_info: A dictionary of native module's info.
+            parent_dir: The parent directory of this native module. The default
+                        value is None.
         """
         if not mod_info:
             raise errors.ModuleInfoEmptyError(_MODULE_INFO_EMPTY)
         self.mod_info = mod_info
         self.mod_name = self._get_module_name()
-        self.mod_path = CLionProjectFileGenerator.get_module_path(mod_info)
+        self.mod_path = CLionProjectFileGenerator.get_module_path(
+            mod_info, parent_dir)
         self.cc_dir = CLionProjectFileGenerator.get_cmakelists_file_dir(
             os.path.join(self.mod_path, self.mod_name))
         if not os.path.exists(self.cc_dir):
@@ -164,11 +167,28 @@
         return mod_name
 
     @staticmethod
-    def get_module_path(mod_info):
-        """Gets the first value of the 'path' key if it exists.
+    def get_module_path(mod_info, parent_dir=None):
+        """Gets the correct value of the 'path' key if it exists.
+
+        When a module has multiple paths, e.g.,
+            'libqcomvoiceprocessingdescriptors': {
+                'path': [
+                    'device/google/bonito/voice_processing',
+                    'device/google/coral/voice_processing',
+                    'device/google/crosshatch/voice_processing',
+                    'device/google/muskie/voice_processing',
+                    'device/google/taimen/voice_processing'
+                ],
+                ...
+            }
+        always choosing the first path might be wrong. For example, in this
+        case, if users run 'aidegen -i c device/google/coral', the correct
+        path they need is the second one.
 
         Args:
             mod_info: A module's info dictionary.
+            parent_dir: The parent directory of this native module. The default
+                        value is None.
 
         Returns:
             A string of the module's path.
@@ -179,7 +199,12 @@
         mod_paths = mod_info.get(constant.KEY_PATH, [])
         if not mod_paths:
             raise errors.NoPathDefinedInModuleInfoError(_DICT_NO_PATH_KEY)
-        return mod_paths[0]
+        mod_path = mod_paths[0]
+        if parent_dir and len(mod_paths) > 1:
+            for path in mod_paths:
+                if common_util.is_source_under_relative_path(path, parent_dir):
+                    mod_path = path
+        return mod_path
 
     @staticmethod
     @common_util.check_args(cc_path=str)
@@ -260,10 +285,13 @@
             logging.warning("No source files in %s's module info.",
                             self.mod_name)
             return
+        root = common_util.get_android_root_dir()
         source_files = self.mod_info[constant.KEY_SRCS]
         hfile.write(_LIST_APPEND_HEADER)
         hfile.write(_SOURCE_FILES_LINE)
         for src in source_files:
+            if not os.path.exists(os.path.join(root, src)):
+                continue
             hfile.write(''.join([_build_cmake_path(src, '    '), '\n']))
         hfile.write(_END_WITH_ONE_BLANK_LINE)
 
@@ -405,9 +433,11 @@
     project_dir = os.path.dirname(abs_project_path)
     hfile.write(_PROJECT.format(os.path.basename(project_dir)))
     root_dir = common_util.get_android_root_dir()
+    parent_dir = os.path.relpath(abs_project_path, root_dir)
     for mod_name in mod_names:
         mod_info = cc_module_info.get_module_info(mod_name)
-        mod_path = CLionProjectFileGenerator.get_module_path(mod_info)
+        mod_path = CLionProjectFileGenerator.get_module_path(
+            mod_info, parent_dir)
         file_dir = CLionProjectFileGenerator.get_cmakelists_file_dir(
             os.path.join(mod_path, mod_name))
         file_path = os.path.join(file_dir, constant.CLION_PROJECT_FILE_NAME)
diff --git a/aidegen/lib/clion_project_file_gen_unittest.py b/aidegen/lib/clion_project_file_gen_unittest.py
index b306e2c..a3573cd 100644
--- a/aidegen/lib/clion_project_file_gen_unittest.py
+++ b/aidegen/lib/clion_project_file_gen_unittest.py
@@ -566,6 +566,31 @@
                 clion_project_file_gen._SOURCE_FILES_HEADER))
         self.assertEqual(content, expected)
 
+    def test_get_module_path(self):
+        """Test get_module_path function with conditions."""
+        path_b = 'a/b/path_a_to_A'
+        path_c = 'a/c/path_a_to_A'
+        path_d = 'a/d/path_a_to_A'
+        mod_info = {
+            'module_name': 'A_Mod',
+            'path': [
+                path_b,
+                path_c,
+                path_d,
+            ]
+        }
+        res = clion_project_file_gen.CLionProjectFileGenerator.get_module_path(
+            mod_info)
+        self.assertEqual(res, path_b)
+        res = clion_project_file_gen.CLionProjectFileGenerator.get_module_path(
+            mod_info, 'a/b')
+        self.assertEqual(res, path_b)
+        res = clion_project_file_gen.CLionProjectFileGenerator.get_module_path(
+            mod_info, 'a/c')
+        self.assertEqual(res, path_c)
+        res = clion_project_file_gen.CLionProjectFileGenerator.get_module_path(
+            mod_info, 'a/d')
+        self.assertEqual(res, path_d)
 
 if __name__ == '__main__':
     unittest.main()
diff --git a/aidegen/lib/common_util.py b/aidegen/lib/common_util.py
index 685ca10..1d2643a 100644
--- a/aidegen/lib/common_util.py
+++ b/aidegen/lib/common_util.py
@@ -20,6 +20,7 @@
 other modules.
 """
 
+import fnmatch
 import inspect
 import json
 import logging
@@ -28,6 +29,7 @@
 import sys
 import time
 import xml.dom.minidom
+import zipfile
 
 from functools import partial
 from functools import wraps
@@ -57,6 +59,10 @@
 _DATE_FORMAT = '%Y-%m-%d %H:%M:%S'
 _ARG_IS_NULL_ERROR = "{0}.{1}: argument '{2}' is null."
 _ARG_TYPE_INCORRECT_ERROR = "{0}.{1}: argument '{2}': type is {3}, must be {4}."
+_IDE_UNDEFINED = constant.IDE_DICT[constant.IDE_UNDEFINED]
+_IDE_INTELLIJ = constant.IDE_DICT[constant.IDE_INTELLIJ]
+_IDE_CLION = constant.IDE_DICT[constant.IDE_CLION]
+_IDE_VSCODE = constant.IDE_DICT[constant.IDE_VSCODE]
 
 
 def time_logged(func=None, *, message='', maximum=1):
@@ -675,6 +681,8 @@
     data[constant.GEN_COMPDB] = os.path.join(get_soong_out_path(),
                                              constant.RELATIVE_COMPDB_PATH,
                                              constant.COMPDB_JSONFILE_NAME)
+    data[constant.GEN_RUST] = os.path.join(
+        root_dir, get_blueprint_json_path(constant.RUST_PROJECT_JSON))
     return data
 
 
@@ -726,3 +734,72 @@
             return os.path.dirname(real_path)
     logging.warning('%s can\'t find its .git folder.', relpath)
     return None
+
+
+def determine_language_ide(lang, ide, jlist=None, clist=None, rlist=None):
+    """Determines the language and IDE by the input language and IDE arguments.
+
+    If IDE and language are undefined, the priority of the language is:
+      1. Java
+      2. C/C++
+      3. Rust
+
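+    For example (an illustrative call, consistent with the unit tests): with
+    both the IDE and the language undefined and only C/C++ targets found,
+    determine_language_ide('u', 'u', None, ['some_module']) returns
+    (constant.C_CPP, constant.IDE_CLION), i.e. CLion is chosen for C/C++.
+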
+    Args:
+        lang: A character represents the input language.
+        ide: A character represents the input IDE.
+        jlist: A list of Android Java projects, the default value is None.
+        clist: A list of Android C/C++ projects, the default value is None.
+        rlist: A list of Android Rust projects, the default value is None.
+
+    Returns:
+        A tuple of the determined language and IDE name strings.
+    """
+    if ide == _IDE_UNDEFINED and lang == constant.LANG_UNDEFINED:
+        if jlist:
+            lang = constant.LANG_JAVA
+        elif clist:
+            lang = constant.LANG_CC
+        elif rlist:
+            lang = constant.LANG_RUST
+    if lang in (constant.LANG_UNDEFINED, constant.LANG_JAVA):
+        if ide == _IDE_UNDEFINED:
+            ide = _IDE_INTELLIJ
+        lang = constant.LANG_JAVA
+        if constant.IDE_NAME_DICT[ide] == constant.IDE_CLION:
+            lang = constant.LANG_CC
+    elif lang == constant.LANG_CC:
+        if ide == _IDE_UNDEFINED:
+            ide = _IDE_CLION
+        if constant.IDE_NAME_DICT[ide] == constant.IDE_INTELLIJ:
+            lang = constant.LANG_JAVA
+    elif lang == constant.LANG_RUST:
+        ide = _IDE_VSCODE
+    return constant.LANGUAGE_NAME_DICT[lang], constant.IDE_NAME_DICT[ide]
+
+
+def check_java_or_kotlin_file_exists(abs_path):
+    """Checks if any Java or Kotlin files exist in an abs_path directory.
+
+    Args:
+        abs_path: A string of the absolute path of the directory to be checked.
+
+    Returns:
+        True if any Java or Kotlin files exist otherwise False.
+    """
+    for _, _, filenames in os.walk(abs_path):
+        for extension in (constant.JAVA_FILES, constant.KOTLIN_FILES):
+            if fnmatch.filter(filenames, extension):
+                return True
+    return False
+
+
+@io_error_handle
+def unzip_file(src, dest):
+    """Unzips the source zip file and extract it to the destination directory.
+
+    Args:
+        src: A string of the file to be unzipped.
+        dest: A string of the destination directory to be extracted to.
+    """
+    with zipfile.ZipFile(src, 'r') as zip_ref:
+        zip_ref.extractall(dest)
diff --git a/aidegen/lib/common_util_unittest.py b/aidegen/lib/common_util_unittest.py
index dc8d392..2c40b28 100644
--- a/aidegen/lib/common_util_unittest.py
+++ b/aidegen/lib/common_util_unittest.py
@@ -346,7 +346,8 @@
         data = {
             constant.GEN_JAVA_DEPS: 'a/b/out/soong/bp_java_file',
             constant.GEN_CC_DEPS: 'a/b/out/soong/bp_java_file',
-            constant.GEN_COMPDB: path_compdb
+            constant.GEN_COMPDB: path_compdb,
+            constant.GEN_RUST: 'a/b/out/soong/bp_java_file'
         }
         self.assertEqual(
             data, common_util.get_blueprint_json_files_relative_dict())
@@ -388,6 +389,90 @@
         mock_exist.return_value = False
         self.assertEqual(common_util.find_git_root('c/d'), None)
 
+    def test_determine_language_ide(self):
+        """Test determine_language_ide function."""
+        ide = 'u'
+        lang = 'u'
+        self.assertEqual((constant.JAVA, constant.IDE_INTELLIJ),
+                         common_util.determine_language_ide(lang, ide))
+        self.assertEqual((constant.JAVA, constant.IDE_INTELLIJ),
+                         common_util.determine_language_ide(
+                             lang, ide, ['some_module']))
+        self.assertEqual((constant.C_CPP, constant.IDE_CLION),
+                         common_util.determine_language_ide(
+                             lang, ide, None, ['some_module']))
+        self.assertEqual((constant.RUST, constant.IDE_VSCODE),
+                         common_util.determine_language_ide(
+                             lang, ide, None, None, ['some_module']))
+        lang = 'j'
+        self.assertEqual((constant.JAVA, constant.IDE_INTELLIJ),
+                         common_util.determine_language_ide(lang, ide))
+        ide = 'c'
+        self.assertEqual((constant.C_CPP, constant.IDE_CLION),
+                         common_util.determine_language_ide(lang, ide))
+        ide = 'j'
+        lang = 'u'
+        self.assertEqual((constant.JAVA, constant.IDE_INTELLIJ),
+                         common_util.determine_language_ide(lang, ide))
+        lang = 'j'
+        self.assertEqual((constant.JAVA, constant.IDE_INTELLIJ),
+                         common_util.determine_language_ide(lang, ide))
+        ide = 'c'
+        self.assertEqual((constant.C_CPP, constant.IDE_CLION),
+                         common_util.determine_language_ide(lang, ide))
+        lang = 'c'
+        ide = 'u'
+        self.assertEqual((constant.C_CPP, constant.IDE_CLION),
+                         common_util.determine_language_ide(lang, ide))
+        ide = 'j'
+        self.assertEqual((constant.JAVA, constant.IDE_INTELLIJ),
+                         common_util.determine_language_ide(lang, ide))
+
+    @mock.patch('zipfile.ZipFile.extractall')
+    @mock.patch('zipfile.ZipFile')
+    def test_unzip_file(self, mock_zipfile, mock_extract):
+        """Test unzip_file function."""
+        src = 'a/b/c.zip'
+        dest = 'a/b/d'
+        common_util.unzip_file(src, dest)
+        mock_zipfile.assert_called_with(src, 'r')
+        self.assertFalse(mock_extract.called)
+
+    @mock.patch('os.walk')
+    def test_check_java_or_kotlin_file_exists(self, mock_walk):
+        """Test check_java_or_kotlin_file_exists with conditions."""
+        root_dir = 'a/path/to/dir'
+        folder = 'path/to/dir'
+        target = 'test.java'
+        abs_path = os.path.join(root_dir, folder)
+        mock_walk.return_value = [(root_dir, [folder], [target])]
+        self.assertTrue(common_util.check_java_or_kotlin_file_exists(abs_path))
+        target = 'test.kt'
+        abs_path = os.path.join(root_dir, folder)
+        mock_walk.return_value = [(root_dir, [folder], [target])]
+        self.assertTrue(common_util.check_java_or_kotlin_file_exists(abs_path))
+        target = 'test.cpp'
+        mock_walk.return_value = [(root_dir, [folder], [target])]
+        self.assertFalse(common_util.check_java_or_kotlin_file_exists(abs_path))
+
+        # Only VS Code IDE supports Rust projects right now.
+        lang = 'r'
+        ide = 'u'
+        self.assertEqual((constant.RUST, constant.IDE_VSCODE),
+                         common_util.determine_language_ide(lang, ide))
+        lang = 'r'
+        ide = 'v'
+        self.assertEqual((constant.RUST, constant.IDE_VSCODE),
+                         common_util.determine_language_ide(lang, ide))
+        lang = 'r'
+        ide = 'j'
+        self.assertEqual((constant.RUST, constant.IDE_VSCODE),
+                         common_util.determine_language_ide(lang, ide))
+        lang = 'r'
+        ide = 'c'
+        self.assertEqual((constant.RUST, constant.IDE_VSCODE),
+                         common_util.determine_language_ide(lang, ide))
+
 
 # pylint: disable=unused-argument
 def parse_rule(self, name, text):
diff --git a/aidegen/lib/config.py b/aidegen/lib/config.py
index a1309e6..eb0a187 100644
--- a/aidegen/lib/config.py
+++ b/aidegen/lib/config.py
@@ -58,6 +58,7 @@
         os.path.expanduser('~'), '.config', 'asuite', 'aidegen')
     _CONFIG_FILE_PATH = os.path.join(_CONFIG_DIR, _DEFAULT_CONFIG_FILE)
     _KEY_APPEND = 'preferred_version'
+    _KEY_PLUGIN_PREFERENCE = 'Asuite_plugin_preference'
 
     # Constants of enable debugger
     _ENABLE_DEBUG_CONFIG_DIR = 'enable_debugger'
@@ -125,6 +126,24 @@
         key = '_'.join([ide, self._KEY_APPEND]) if ide else self._KEY_APPEND
         self._config[key] = preferred_version
 
+    @property
+    def plugin_preference(self):
+        """Gets Asuite plugin user's preference
+
+        Returns:
+             A string of the user's preference: yes/no/auto.
+        """
+        return self._config.get(self._KEY_PLUGIN_PREFERENCE, '')
+
+    @plugin_preference.setter
+    def plugin_preference(self, preference):
+        """Sets Asuite plugin user's preference
+
+        Args:
+            preference: A string of the user's preference: yes/no/auto.
+        """
+        self._config[self._KEY_PLUGIN_PREFERENCE] = preference
+
     def _load_aidegen_config(self):
         """Load data from configuration file."""
         if os.path.exists(self._CONFIG_FILE_PATH):
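
The plugin_preference property added above is a thin getter/setter over the JSON-backed _config dictionary. Below is a minimal, self-contained illustration of the same pattern; the class name, file path and save() helper are placeholders and not part of AidegenConfig's real load/save plumbing.

    import json
    import os
    import tempfile


    class PluginPreferenceDemo:
        """Stand-in for the plugin_preference property on AidegenConfig."""

        _KEY_PLUGIN_PREFERENCE = 'Asuite_plugin_preference'

        def __init__(self, config_path):
            self._config_path = config_path
            self._config = {}
            if os.path.exists(config_path):
                with open(config_path, 'r', encoding='utf-8') as cfg:
                    self._config = json.load(cfg)

        @property
        def plugin_preference(self):
            """Returns 'yes', 'no', 'auto', or '' when unset."""
            return self._config.get(self._KEY_PLUGIN_PREFERENCE, '')

        @plugin_preference.setter
        def plugin_preference(self, preference):
            self._config[self._KEY_PLUGIN_PREFERENCE] = preference

        def save(self):
            with open(self._config_path, 'w', encoding='utf-8') as cfg:
                json.dump(self._config, cfg, indent=4)


    path = os.path.join(tempfile.gettempdir(), 'aidegen_demo_config.json')
    demo = PluginPreferenceDemo(path)
    demo.plugin_preference = 'auto'
    demo.save()
    print(PluginPreferenceDemo(path).plugin_preference)  # auto
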
diff --git a/aidegen/lib/config_unittest.py b/aidegen/lib/config_unittest.py
index d492ef2..30caa9e 100644
--- a/aidegen/lib/config_unittest.py
+++ b/aidegen/lib/config_unittest.py
@@ -305,6 +305,20 @@
         cfg.set_preferred_version('test', constant.IDE_INTELLIJ)
         self.assertEqual(cfg._config['IntelliJ_preferred_version'], 'test')
 
+    def test_set_plugin_preference(self):
+        """Test set_plugin_preference."""
+        cfg = config.AidegenConfig()
+        cfg._config[config.AidegenConfig._KEY_PLUGIN_PREFERENCE] = 'yes'
+        cfg.plugin_preference = 'no'
+        self.assertEqual(cfg._config[
+            config.AidegenConfig._KEY_PLUGIN_PREFERENCE], 'no')
+
+    def test_get_plugin_preference(self):
+        """Test get_plugin_preference."""
+        cfg = config.AidegenConfig()
+        cfg._config[config.AidegenConfig._KEY_PLUGIN_PREFERENCE] = 'yes'
+        self.assertEqual(cfg.plugin_preference, 'yes')
+
     @mock.patch('os.makedirs')
     @mock.patch('os.path.exists')
     def test_gen_enable_debug_sub_dir(self, mock_file_exists, mock_makedirs):
diff --git a/aidegen/lib/eclipse_project_file_gen.py b/aidegen/lib/eclipse_project_file_gen.py
index 5bc4b7e..3340c45 100644
--- a/aidegen/lib/eclipse_project_file_gen.py
+++ b/aidegen/lib/eclipse_project_file_gen.py
@@ -176,7 +176,7 @@
         links.update(self._gen_r_link())
         links.update(self._gen_bin_link())
         self.project_content = templates.ECLIPSE_PROJECT_XML.format(
-            PROJECTNAME=self.module_name,
+            PROJECTNAME=self.module_name.replace('/', '_'),
             LINKEDRESOURCES=''.join(sorted(list(links))))
 
     def _gen_r_path_entries(self):
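
The one-line change above sanitizes module names that contain path separators (for example a module addressed as 'external/foo/bar'), which would otherwise leak '/' into the generated Eclipse project name. A trivial illustration with a made-up template string rather than the real templates.ECLIPSE_PROJECT_XML:

    # Hypothetical, simplified template; the real one lives in aidegen/templates.py.
    ECLIPSE_PROJECT_XML = '<name>{PROJECTNAME}</name>'

    module_name = 'external/foo/bar'
    print(ECLIPSE_PROJECT_XML.format(PROJECTNAME=module_name.replace('/', '_')))
    # <name>external_foo_bar</name>
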
diff --git a/aidegen/lib/ide_common_util.py b/aidegen/lib/ide_common_util.py
index 3f74809..5731dab 100644
--- a/aidegen/lib/ide_common_util.py
+++ b/aidegen/lib/ide_common_util.py
@@ -92,20 +92,22 @@
                 yield exe_file
 
 
-def get_run_ide_cmd(sh_path, project_file):
+def get_run_ide_cmd(sh_path, project_file, new_process=True):
     """Get the command to launch IDE.
 
     Args:
         sh_path: The idea.sh path where IDE is installed.
         project_file: The path of IntelliJ IDEA project file.
+        new_process: If True (default), run the command in a new process.
 
     Returns:
         A string: The IDE launching command.
     """
+    process_flag = '&' if new_process else ''
     # In command usage, the space ' ' should be '\ ' for correctness.
     return ' '.join([
         constant.NOHUP, sh_path.replace(' ', r'\ '), project_file,
-        constant.IGNORE_STD_OUT_ERR_CMD
+        constant.IGNORE_STD_OUT_ERR_CMD, process_flag
     ])
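
With the new new_process flag, the launch command either ends with '&' so the IDE is detached into its own shell job (the default), or omits it so the caller can wait on the subprocess, which the IntelliJ first-run flow in ide_util.py relies on. A minimal reproduction of the string that gets built, assuming constant.NOHUP is 'nohup' and constant.IGNORE_STD_OUT_ERR_CMD is '2>/dev/null >&2' as the unit tests suggest:

    NOHUP = 'nohup'                             # assumed value of constant.NOHUP
    IGNORE_STD_OUT_ERR_CMD = '2>/dev/null >&2'  # assumed value of the constant


    def get_run_ide_cmd(sh_path, project_file, new_process=True):
        """Builds the IDE launch command; '&' backgrounds it as a new job."""
        process_flag = '&' if new_process else ''
        return ' '.join([NOHUP, sh_path.replace(' ', r'\ '), project_file,
                         IGNORE_STD_OUT_ERR_CMD, process_flag])


    print(get_run_ide_cmd('/a/b', '/x/y'))
    # nohup /a/b /x/y 2>/dev/null >&2 &
    print(get_run_ide_cmd('/a/b', '/x/y', new_process=False))
    # nohup /a/b /x/y 2>/dev/null >&2  (with a trailing space; harmless in a shell)
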
 
 
diff --git a/aidegen/lib/ide_common_util_unittest.py b/aidegen/lib/ide_common_util_unittest.py
index 9d2e419..908d65b 100644
--- a/aidegen/lib/ide_common_util_unittest.py
+++ b/aidegen/lib/ide_common_util_unittest.py
@@ -109,10 +109,17 @@
         test_project_path = 'xyz/.idea'
         test_result = ' '.join([
             constant.NOHUP, test_script_path, test_project_path,
-            constant.IGNORE_STD_OUT_ERR_CMD
+            constant.IGNORE_STD_OUT_ERR_CMD, '&'
         ])
         self.assertEqual(test_result, ide_common_util.get_run_ide_cmd(
             test_script_path, test_project_path))
+        fork_new_process = False
+        test_result = ' '.join([
+            constant.NOHUP, test_script_path, test_project_path,
+            constant.IGNORE_STD_OUT_ERR_CMD, ''
+        ])
+        self.assertEqual(test_result, ide_common_util.get_run_ide_cmd(
+            test_script_path, test_project_path, fork_new_process))
 
     @mock.patch('builtins.sorted')
     @mock.patch('glob.glob')
diff --git a/aidegen/lib/ide_util.py b/aidegen/lib/ide_util.py
index d9e95d1..989483f 100644
--- a/aidegen/lib/ide_util.py
+++ b/aidegen/lib/ide_util.py
@@ -83,19 +83,25 @@
 LINUX_ANDROID_SDK_PATH = os.path.join(os.getenv('HOME'), 'Android/Sdk')
 MAC_JDK_PATH = os.path.join(common_util.get_android_root_dir(),
                             'prebuilts/jdk/jdk8/darwin-x86')
-MAC_JDK_TABLE_PATH = 'options/jdk.table.xml'
-MAC_FILE_TYPE_XML_PATH = 'options/filetypes.xml'
+ALTERNAIVE_JDK_TABLE_PATH = 'options/jdk.table.xml'
+ALTERNAIVE_FILE_TYPE_XML_PATH = 'options/filetypes.xml'
 MAC_ANDROID_SDK_PATH = os.path.join(os.getenv('HOME'), 'Library/Android/sdk')
 PATTERN_KEY = 'pattern'
 TYPE_KEY = 'type'
-JSON_TYPE = 'JSON'
+_TEST_MAPPING_FILE_TYPE = 'JSON'
 TEST_MAPPING_NAME = 'TEST_MAPPING'
 _TEST_MAPPING_TYPE = '<mapping pattern="TEST_MAPPING" type="JSON" />'
 _XPATH_EXTENSION_MAP = 'component/extensionMap'
 _XPATH_MAPPING = _XPATH_EXTENSION_MAP + '/mapping'
+_SPECIFIC_INTELLIJ_VERSION = 2020.1
+_TEST_MAPPING_FILE_TYPE_ADDING_WARN = '\n{} {}\n'.format(
+    common_util.COLORED_INFO('WARNING:'),
+    ('TEST_MAPPING file type can\'t be added to filetypes.xml, possibly '
+     'because the parent tag for the TEST_MAPPING file type is missing.'))
 
 
 # pylint: disable=too-many-lines
+# pylint: disable=invalid-name
 class IdeUtil:
     """Provide a set of IDE operations, e.g., launch and configuration.
 
@@ -253,7 +259,7 @@
         """
         file_type_path = os.path.join(_config_path, self._IDE_FILE_TYPE_PATH)
         if not os.path.isfile(file_type_path):
-            logging.warning('Filetypes.xml is not found.')
+            logging.warning('The file: filetypes.xml is not found.')
             return
 
         file_type_xml = xml_util.parse_xml(file_type_path)
@@ -267,14 +273,17 @@
             attrib = mapping.attrib
             if PATTERN_KEY in attrib and TYPE_KEY in attrib:
                 if attrib[PATTERN_KEY] == TEST_MAPPING_NAME:
-                    if attrib[TYPE_KEY] != JSON_TYPE:
-                        attrib[TYPE_KEY] = JSON_TYPE
+                    if attrib[TYPE_KEY] != _TEST_MAPPING_FILE_TYPE:
+                        attrib[TYPE_KEY] = _TEST_MAPPING_FILE_TYPE
                         file_type_xml.write(file_type_path)
                     add_pattern = False
                     break
         if add_pattern:
-            root.find(_XPATH_EXTENSION_MAP).append(
-                ElementTree.fromstring(_TEST_MAPPING_TYPE))
+            ext_attrib = root.find(_XPATH_EXTENSION_MAP)
+            if not ext_attrib:
+                print(_TEST_MAPPING_FILE_TYPE_ADDING_WARN)
+                return
+            ext_attrib.append(ElementTree.fromstring(_TEST_MAPPING_TYPE))
             pretty_xml = common_util.to_pretty_xml(root)
             common_util.file_generate(file_type_path, pretty_xml)
 
@@ -514,8 +523,8 @@
             return
 
         show_hint = False
-        folder_path = os.path.join(os.getenv('HOME'), app_folder,
-                                   'config', 'plugins')
+        ide_version = self._get_ide_version(app_folder)
+        folder_path = self._get_config_dir(ide_version, app_folder)
         import_process = None
         while not os.path.isdir(folder_path):
             # Guide the user to go through the IDE flow.
@@ -525,16 +534,17 @@
                                            self.ide_name)))
                 try:
                     import_process = subprocess.Popen(
-                        ide_common_util.get_run_ide_cmd(run_script_path, ''),
-                        shell=True)
+                        ide_common_util.get_run_ide_cmd(run_script_path, '',
+                                                        False), shell=True)
                 except (subprocess.SubprocessError, ValueError):
                     logging.warning('\nSubprocess call gets the invalid input.')
                 finally:
                     show_hint = True
-        try:
-            import_process.wait(1)
-        except subprocess.TimeoutExpired:
-            import_process.terminate()
+        if import_process:
+            try:
+                import_process.wait(1)
+            except subprocess.TimeoutExpired:
+                import_process.terminate()
         return
 
     def _get_script_from_system(self):
@@ -579,6 +589,7 @@
             follows,
                 1. .IdeaIC2019.3
                 2. .IntelliJIdea2019.3
+                3. IntelliJIdea2020.1
         """
         if not run_script_path or not os.path.isfile(run_script_path):
             return None
@@ -586,20 +597,83 @@
         target_path = None if index == -1 else run_script_path[index:]
         if not target_path or '-' not in run_script_path:
             return None
+        return IdeIntelliJ._get_config_folder_name(target_path)
 
-        path_data = target_path.split('-')
+    @staticmethod
+    def _get_ide_version(config_folder_name):
+        """Gets IntelliJ version from the input app folder name.
+
+        Args:
+            config_folder_name: A string of the app folder name.
+
+        Returns:
+            A string of the IntelliJ version.
+        """
+        versions = re.findall(r'\d+', config_folder_name)
+        if not versions:
+            logging.warning('\nInvalid IntelliJ config folder name: %s.',
+                            config_folder_name)
+            return None
+        return '.'.join(versions)
+
+    @staticmethod
+    def _get_config_folder_name(script_folder_name):
+        """Gets IntelliJ config folder name from the IDE version.
+
+        The config folder name has been changed since 2020.1.
+
+        Args:
+            script_folder_name: A string of the script folder name of IntelliJ.
+
+        Returns:
+            A string of the IntelliJ config folder name.
+        """
+        path_data = script_folder_name.split('-')
         if not path_data or len(path_data) < 3:
             return None
-
-        config_folder = None
         ide_version = path_data[2].split(os.sep)[0]
+        numbers = ide_version.split('.')
+        if len(numbers) > 2:
+            ide_version = '.'.join([numbers[0], numbers[1]])
+        try:
+            version = float(ide_version)
+        except ValueError:
+            return None
+        pre_folder = '.IdeaIC'
+        if version < _SPECIFIC_INTELLIJ_VERSION:
+            if path_data[1] == 'ue':
+                pre_folder = '.IntelliJIdea'
+        else:
+            if path_data[1] == 'ce':
+                pre_folder = 'IdeaIC'
+            elif path_data[1] == 'ue':
+                pre_folder = 'IntelliJIdea'
+        return ''.join([pre_folder, ide_version])
 
-        if path_data[1] == 'ce':
-            config_folder = ''.join(['.IdeaIC', ide_version])
-        elif path_data[1] == 'ue':
-            config_folder = ''.join(['.IntelliJIdea', ide_version])
+    @staticmethod
+    def _get_config_dir(ide_version, config_folder_name):
+        """Gets IntelliJ config directory by the config folder name.
 
-        return config_folder
+        The IntelliJ config directory changed in version 2020.1. Get the IDE
+        version from the app folder name to determine the config directory.
+        URL: https://intellij-support.jetbrains.com/hc/en-us/articles/206544519
+
+        Args:
+            ide_version: A string of the IntelliJ version.
+            config_folder_name: A string of the IntelliJ config folder name.
+
+        Returns:
+            A string of the IntelliJ config directory.
+        """
+        try:
+            version = float(ide_version)
+        except ValueError:
+            return None
+        if version < _SPECIFIC_INTELLIJ_VERSION:
+            return os.path.join(
+                os.getenv('HOME'), config_folder_name)
+        return os.path.join(
+            os.getenv('HOME'), '.config', 'JetBrains', config_folder_name)
 
 
 class IdeLinuxIntelliJ(IdeIntelliJ):
@@ -658,15 +732,24 @@
             _config_folder = self._get_application_path(_path_data)
             if not _config_folder:
                 return None
+            ide_version = self._get_ide_version(_config_folder)
+            if not ide_version:
+                return None
+            try:
+                version = float(ide_version)
+            except ValueError:
+                return None
+            folder_path = self._get_config_dir(ide_version, _config_folder)
+            if version >= _SPECIFIC_INTELLIJ_VERSION:
+                self._IDE_JDK_TABLE_PATH = ALTERNAIVE_JDK_TABLE_PATH
+                self._IDE_FILE_TYPE_PATH = ALTERNAIVE_FILE_TYPE_XML_PATH
 
-            if not os.path.isdir(os.path.join(os.getenv('HOME'),
-                                              _config_folder)):
+            if not os.path.isdir(folder_path):
                 logging.debug("\nThe config folder: %s doesn't exist",
                               _config_folder)
                 self._setup_ide()
 
-            _config_folders.append(
-                os.path.join(os.getenv('HOME'), _config_folder))
+            _config_folders.append(folder_path)
         else:
             # TODO(b/123459239): For the case that the user provides the IDEA
             # binary path, we now collect all possible IDEA config root paths.
@@ -674,6 +757,9 @@
                 os.path.join(os.getenv('HOME'), '.IdeaI?20*'))
             _config_folders.extend(
                 glob.glob(os.path.join(os.getenv('HOME'), '.IntelliJIdea20*')))
+            _config_folders.extend(
+                glob.glob(os.path.join(os.getenv('HOME'), '.config',
+                                       'IntelliJIdea202*')))
             logging.debug('The config path list: %s.', _config_folders)
 
         return _config_folders
@@ -689,8 +775,8 @@
     """
 
     _JDK_PATH = MAC_JDK_PATH
-    _IDE_JDK_TABLE_PATH = MAC_JDK_TABLE_PATH
-    _IDE_FILE_TYPE_PATH = MAC_FILE_TYPE_XML_PATH
+    _IDE_JDK_TABLE_PATH = ALTERNAIVE_JDK_TABLE_PATH
+    _IDE_FILE_TYPE_PATH = ALTERNAIVE_FILE_TYPE_XML_PATH
     _JDK_CONTENT = templates.MAC_JDK_XML
     _DEFAULT_ANDROID_SDK_PATH = MAC_ANDROID_SDK_PATH
 
@@ -848,7 +934,7 @@
     """
 
     _JDK_PATH = MAC_JDK_PATH
-    _IDE_JDK_TABLE_PATH = MAC_JDK_TABLE_PATH
+    _IDE_JDK_TABLE_PATH = ALTERNAIVE_JDK_TABLE_PATH
     _JDK_CONTENT = templates.MAC_JDK_XML
     _DEFAULT_ANDROID_SDK_PATH = MAC_ANDROID_SDK_PATH
 
@@ -930,7 +1016,7 @@
         if (os.path.exists(os.path.expanduser(constant.ECLIPSE_WS))
                 or str(input(_ALERT_CREATE_WS)).lower() == 'y'):
             self.cmd.extend(['-data', constant.ECLIPSE_WS])
-        self.cmd.extend([constant.IGNORE_STD_OUT_ERR_CMD])
+        self.cmd.extend([constant.IGNORE_STD_OUT_ERR_CMD, '&'])
         return ' '.join(self.cmd)
 
     def apply_optional_config(self):
diff --git a/aidegen/lib/ide_util_unittest.py b/aidegen/lib/ide_util_unittest.py
index 477dbbf..d2f5da3 100644
--- a/aidegen/lib/ide_util_unittest.py
+++ b/aidegen/lib/ide_util_unittest.py
@@ -391,7 +391,7 @@
         # Test _get_ide_cmd.
         ide_base._installed_path = '/a/b'
         ide_base.project_abspath = '/x/y'
-        expected_result = 'nohup /a/b /x/y 2>/dev/null >&2'
+        expected_result = 'nohup /a/b /x/y 2>/dev/null >&2 &'
         self.assertEqual(ide_base._get_ide_cmd(), expected_result)
 
         # Test launch_ide.
@@ -618,7 +618,7 @@
         mock_exists.return_value = True
         expacted_result = ('eclipse -data '
                            '~/Documents/AIDEGen_Eclipse_workspace '
-                           '2>/dev/null >&2')
+                           '2>/dev/null >&2 &')
         test_result = eclipse._get_ide_cmd()
         self.assertEqual(test_result, expacted_result)
 
@@ -626,7 +626,7 @@
         eclipse.cmd = ['eclipse']
         mock_exists.return_value = False
         mock_input.return_value = 'n'
-        expacted_result = 'eclipse 2>/dev/null >&2'
+        expacted_result = 'eclipse 2>/dev/null >&2 &'
         test_result = eclipse._get_ide_cmd()
         self.assertEqual(test_result, expacted_result)
 
@@ -670,6 +670,77 @@
         test_result = ide._get_all_versions('a', 'b')
         self.assertEqual(test_result, ['a', 'b'])
 
+    @mock.patch('logging.warning')
+    def test_get_ide_version(self, mock_warn):
+        """Test _get_ide_version with conditions."""
+        self.assertEqual(
+            None, ide_util.IdeIntelliJ._get_ide_version('intellij-efg-hi'))
+        self.assertTrue(mock_warn.called)
+        mock_warn.reset_mock()
+        self.assertEqual(
+            '2020.1',
+            ide_util.IdeIntelliJ._get_ide_version('intellij-ue-2020.1'))
+        self.assertFalse(mock_warn.called)
+        mock_warn.reset_mock()
+        self.assertEqual(
+            '303', ide_util.IdeIntelliJ._get_ide_version('intellij-ue-303'))
+        self.assertFalse(mock_warn.called)
+
+    @mock.patch('os.path.join')
+    def test_get_config_dir(self, mock_join):
+        """Test _get_config_dir with conditions."""
+        config_folder_name = 'IntelliJ2020.1'
+        ide_version = '2020.1'
+        ide_util.IdeIntelliJ._get_config_dir(ide_version, config_folder_name)
+        self.assertTrue(
+            mock_join.called_with(
+                os.getenv('HOME'), '.config', 'JetBrains', config_folder_name))
+        mock_join.reset_mock()
+        config_folder_name = 'IntelliJ2019.3'
+        ide_version = '2019.3'
+        self.assertTrue(
+            mock_join.called_with(
+                os.getenv('HOME'), config_folder_name, 'config'))
+        mock_join.reset_mock()
+        self.assertEqual(None, ide_util.IdeIntelliJ._get_config_dir(
+            'Not-a-float', config_folder_name))
+        self.assertFalse(mock_join.called)
+
+    def test_get_config_folder_name(self):
+        """Test _get_config_folder_name with conditions."""
+        config_folder_name = 'intellij-ce-2019.3'
+        pre_folder = '.IdeaIC'
+        ide_version = '2019.3'
+        expected = ''.join([pre_folder, ide_version])
+        self.assertEqual(expected, ide_util.IdeIntelliJ._get_config_folder_name(
+            config_folder_name))
+        config_folder_name = 'intellij-ue-2019.3'
+        pre_folder = '.IntelliJIdea'
+        expected = ''.join([pre_folder, ide_version])
+        self.assertEqual(expected, ide_util.IdeIntelliJ._get_config_folder_name(
+            config_folder_name))
+        config_folder_name = 'intellij-ce-2020.1'
+        pre_folder = 'IdeaIC'
+        ide_version = '2020.1'
+        expected = ''.join([pre_folder, ide_version])
+        self.assertEqual(expected, ide_util.IdeIntelliJ._get_config_folder_name(
+            config_folder_name))
+        config_folder_name = 'intellij-ue-2020.1'
+        pre_folder = 'IntelliJIdea'
+        expected = ''.join([pre_folder, ide_version])
+        self.assertEqual(expected, ide_util.IdeIntelliJ._get_config_folder_name(
+            config_folder_name))
+        config_folder_name = 'intellij-ue-2020.1.4'
+        self.assertEqual(expected, ide_util.IdeIntelliJ._get_config_folder_name(
+            config_folder_name))
+        config_folder_name = 'intellij-unknown'
+        expected = None
+        self.assertEqual(expected, ide_util.IdeIntelliJ._get_config_folder_name(
+            config_folder_name))
+        config_folder_name = 'intellij-ue-NotAFloat'
+        self.assertEqual(expected, ide_util.IdeIntelliJ._get_config_folder_name(
+            config_folder_name))
+
 
 if __name__ == '__main__':
     unittest.main()
diff --git a/aidegen/lib/module_info.py b/aidegen/lib/module_info.py
index a635d40..66eafa2 100644
--- a/aidegen/lib/module_info.py
+++ b/aidegen/lib/module_info.py
@@ -14,7 +14,7 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-"""Module Info class used to hold cached merged_module_info.json.json."""
+"""Module Info class used to hold cached merged_module_info.json."""
 
 import logging
 import os
@@ -52,7 +52,7 @@
             os.remove(module_file_path)
         merged_file_path = os.path.join(common_util.get_soong_out_path(),
                                         constant.MERGED_MODULE_INFO)
-        if not os.path.isfile(module_file_path):
+        if not os.path.isfile(merged_file_path):
             logging.debug(
                 'Generating %s - this is required for the initial runs.',
                 merged_file_path)
diff --git a/aidegen/lib/module_info_util.py b/aidegen/lib/module_info_util.py
index bc447f3..f75f376 100644
--- a/aidegen/lib/module_info_util.py
+++ b/aidegen/lib/module_info_util.py
@@ -55,27 +55,24 @@
 _LAUNCH_PROJECT_QUERY = (
     'There exists an IntelliJ project file: %s. Do you want '
     'to launch it (yes/No)?')
-_BUILD_BP_JSON_ENV_OFF = {
-    constant.GEN_JAVA_DEPS: 'false',
-    constant.GEN_CC_DEPS: 'false',
-    constant.GEN_COMPDB: 'false'
-}
 _BUILD_BP_JSON_ENV_ON = {
     constant.GEN_JAVA_DEPS: 'true',
     constant.GEN_CC_DEPS: 'true',
-    constant.GEN_COMPDB: 'true'
+    constant.GEN_COMPDB: 'true',
+    constant.GEN_RUST: 'true'
 }
 _GEN_JSON_FAILED = (
     'Generate new {0} failed, AIDEGen will proceed and reuse the old {1}.')
-_WARN_MSG = '\n{} {}\n'
 _TARGET = 'nothing'
+_LINKFILE_WARNING = (
+    'File {} does not exist and we cannot create a symbolic link for it.')
+_RUST_PROJECT_JSON = 'out/soong/rust-project.json'
 
 
 # pylint: disable=dangerous-default-value
 @common_util.back_to_cwd
 @common_util.time_logged
-def generate_merged_module_info(env_off=_BUILD_BP_JSON_ENV_OFF,
-                                env_on=_BUILD_BP_JSON_ENV_ON):
+def generate_merged_module_info(env_on=_BUILD_BP_JSON_ENV_ON):
     """Generate a merged dictionary.
 
     Linked functions:
@@ -84,8 +81,6 @@
         _merge_dict(mk_dict, bp_dict)
 
     Args:
-        env_off: A dictionary of environment settings to be turned off, the
-                 default value is _BUILD_BP_JSON_ENV_OFF.
         env_on: A dictionary of environment settings to be turned on, the
                 default value is _BUILD_BP_JSON_ENV_ON.
 
@@ -99,7 +94,7 @@
     skip_build = config.is_skip_build
     main_project = projects[0] if projects else None
     _build_bp_info(
-        module_info, main_project, verbose, skip_build, env_off, env_on)
+        module_info, main_project, verbose, skip_build, env_on)
     json_path = common_util.get_blueprint_json_path(
         constant.BLUEPRINT_JAVA_JSONFILE_NAME)
     bp_dict = common_util.get_json_dict(json_path)
@@ -107,13 +102,12 @@
 
 
 def _build_bp_info(module_info, main_project=None, verbose=False,
-                   skip_build=False, env_off=_BUILD_BP_JSON_ENV_OFF,
-                   env_on=_BUILD_BP_JSON_ENV_ON):
+                   skip_build=False, env_on=_BUILD_BP_JSON_ENV_ON):
     """Make nothing to create module_bp_java_deps.json, module_bp_cc_deps.json.
 
     Use atest build method to build the target 'nothing' by setting env config
-    SOONG_COLLECT_JAVA_DEPS to false then true. By this way, we can trigger the
-    process of collecting dependencies and generate module_bp_java_deps.json.
+    SOONG_COLLECT_JAVA_DEPS to true, which triggers the dependency-collecting
+    process and generates module_bp_java_deps.json and related files.
 
     Args:
         module_info: A ModuleInfo instance contains data of module-info.json.
@@ -122,8 +116,6 @@
         skip_build: A boolean, if true, skip building if
                     get_blueprint_json_path(file_name) file exists, otherwise
                     build it.
-        env_off: A dictionary of environment settings to be turned off, the
-                 default value is _BUILD_BP_JSON_ENV_OFF.
         env_on: A dictionary of environment settings to be turned on, the
                 default value is _BUILD_BP_JSON_ENV_ON.
 
@@ -148,10 +140,13 @@
 
     logging.warning(
         '\nGenerate files:\n %s by atest build method.', files)
-    build_with_off_cmd = atest_utils.build([_TARGET], verbose, env_off)
     build_with_on_cmd = atest_utils.build([_TARGET], verbose, env_on)
 
-    if build_with_off_cmd and build_with_on_cmd:
+    # For Android Rust projects, we need to create a symbolic link to the file
+    # out/soong/rust-project.json to launch the rust projects in IDEs.
+    _generate_rust_project_link()
+
+    if build_with_on_cmd:
         logging.info('\nGenerate blueprint json successfully.')
     else:
         if not all([_is_new_json_file_generated(
@@ -170,12 +165,15 @@
     The generation of json files depends on env_on. If the env_on looks like,
     _BUILD_BP_JSON_ENV_ON = {
         'SOONG_COLLECT_JAVA_DEPS': 'true',
-        'SOONG_COLLECT_CC_DEPS': 'true'
+        'SOONG_COLLECT_CC_DEPS': 'true',
+        'SOONG_GEN_COMPDB': 'true',
+        'SOONG_GEN_RUST_PROJECT': 'true'
     }
-    We want to generate only two files: module_bp_java_deps.json and
-    module_bp_cc_deps.json. And in get_blueprint_json_files_relative_dict
-    function, there are three json files by default. We get the result list by
-    comparsing with these two dictionaries.
+    We want to generate four files: module_bp_java_deps.json,
+    module_bp_cc_deps.json, compile_commands.json and rust-project.json. The
+    get_blueprint_json_files_relative_dict function lists these four json
+    files by default, and this function returns a list of the absolute paths
+    of the existent files.
 
     Args:
         env_on: A dictionary of environment settings to be turned on, the
@@ -203,7 +201,8 @@
     failed_or_file = ' or '.join(file_paths)
     failed_and_file = ' and '.join(file_paths)
     message = _GEN_JSON_FAILED.format(failed_or_file, failed_and_file)
-    print(_WARN_MSG.format(common_util.COLORED_INFO('Warning:'), message))
+    print(constant.WARN_MSG.format(
+        common_util.COLORED_INFO('Warning:'), message))
 
 
 def _show_build_failed_message(module_info, main_project=None):
@@ -310,3 +309,20 @@
             merged_dict[module] = dict()
         _merge_module_keys(merged_dict[module], bp_dict[module])
     return merged_dict
+
+
+def _generate_rust_project_link():
+    """Generates out/soong/rust-project.json symbolic link in Android root."""
+    root_dir = common_util.get_android_root_dir()
+    rust_project = os.path.join(
+        root_dir, common_util.get_blueprint_json_path(
+            constant.RUST_PROJECT_JSON))
+    if not os.path.isfile(rust_project):
+        message = _LINKFILE_WARNING.format(_RUST_PROJECT_JSON)
+        print(constant.WARN_MSG.format(
+            common_util.COLORED_INFO('Warning:'), message))
+        return
+    link_rust = os.path.join(root_dir, constant.RUST_PROJECT_JSON)
+    if os.path.islink(link_rust):
+        os.remove(link_rust)
+    os.symlink(rust_project, link_rust)
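
_generate_rust_project_link above keeps a rust-project.json symlink in the Android root so that the VS Code Rust tooling can find the Soong-generated file. Its core is an idempotent replace-the-link step, sketched below with placeholder paths:

    import os


    def refresh_symlink(src, link_path):
        """Re-points link_path at src, replacing a stale link if one exists."""
        if not os.path.isfile(src):
            print('Warning: %s does not exist; skip creating the symlink.' % src)
            return
        if os.path.islink(link_path):
            os.remove(link_path)
        os.symlink(src, link_path)


    # Hypothetical paths; the real code links <root>/out/soong/rust-project.json
    # to <root>/rust-project.json.
    refresh_symlink('/tmp/demo_out/rust-project.json', '/tmp/rust-project.json')
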
diff --git a/aidegen/lib/module_info_util_unittest.py b/aidegen/lib/module_info_util_unittest.py
index c5cdb9a..73b9e07 100644
--- a/aidegen/lib/module_info_util_unittest.py
+++ b/aidegen/lib/module_info_util_unittest.py
@@ -94,6 +94,7 @@
         self.assertFalse(mock_time.called)
 
     # pylint: disable=too-many-arguments
+    @mock.patch.object(module_info_util, '_generate_rust_project_link')
     @mock.patch.object(module_info_util, '_show_build_failed_message')
     @mock.patch.object(module_info_util, '_show_files_reuse_message')
     @mock.patch.object(atest_utils, 'build')
@@ -103,7 +104,8 @@
     @mock.patch.object(module_info_util, '_get_generated_json_files')
     def test_of_build_bp_info_rebuild_jsons(self, mock_json, mock_isfile,
                                             mock_log, mock_time, mock_build,
-                                            mock_reuse, mock_fail):
+                                            mock_reuse, mock_fail,
+                                            mock_gen_rust):
         """Test of _build_bp_info on rebuilding jsons."""
         gen_files = ['file2', 'file_a', 'file_b']
         mock_json.return_value = gen_files
@@ -116,6 +118,7 @@
         module_info_util._build_bp_info(mod_info, skip_build=True)
         self.assertTrue(mock_json.called)
         self.assertTrue(mock_log.called)
+        self.assertTrue(mock_gen_rust.called)
 
         # Test of the well rebuild case.
         action_pass = '\nGenerate blueprint json successfully.'
@@ -124,6 +127,7 @@
         self.assertFalse(mock_reuse.called)
         self.assertFalse(mock_fail.called)
 
+    @mock.patch.object(module_info_util, '_generate_rust_project_link')
     @mock.patch.object(module_info_util, '_show_build_failed_message')
     @mock.patch.object(module_info_util, '_show_files_reuse_message')
     @mock.patch.object(module_info_util, '_is_new_json_file_generated')
@@ -135,7 +139,7 @@
     def test_of_build_bp_info_show_build_fail(self, mock_json, mock_isfile,
                                               mock_log, mock_time, mock_build,
                                               mock_judge, mock_reuse,
-                                              mock_fail):
+                                              mock_fail, mock_gen_rust):
         """Test of _build_bp_info to show build failed message."""
         gen_files = ['file3', 'file_a', 'file_b']
         mock_json.return_value = gen_files
@@ -152,9 +156,11 @@
         self.assertTrue(mock_json.called)
         self.assertTrue(mock_log.called)
         self.assertTrue(mock_build.called)
+        self.assertTrue(mock_gen_rust.called)
         self.assertFalse(mock_reuse.called)
-        self.assertTrue(mock_fail.called)
+        self.assertFalse(mock_fail.called)
 
+    @mock.patch.object(module_info_util, '_generate_rust_project_link')
     @mock.patch.object(module_info_util, '_show_build_failed_message')
     @mock.patch.object(module_info_util, '_show_files_reuse_message')
     @mock.patch.object(module_info_util, '_is_new_json_file_generated')
@@ -166,7 +172,7 @@
     def test_of_build_bp_info_rebuild_and_reuse(self, mock_json, mock_isfile,
                                                 mock_log, mock_time, mock_build,
                                                 mock_judge, mock_reuse,
-                                                mock_fail):
+                                                mock_fail, mock_gen_rust):
         """Test of _build_bp_info to reuse existing jsons."""
         gen_files = ['file4', 'file_a', 'file_b']
         mock_json.return_value = gen_files
@@ -181,9 +187,11 @@
         module_info_util._build_bp_info(
             mod_info, main_project=test_prj, skip_build=False)
         self.assertTrue(mock_log.called)
-        self.assertTrue(mock_reuse.called)
+        self.assertTrue(mock_gen_rust.called)
+        self.assertFalse(mock_reuse.called)
         self.assertFalse(mock_fail.called)
 
+    @mock.patch.object(module_info_util, '_generate_rust_project_link')
     @mock.patch.object(module_info_util, '_show_build_failed_message')
     @mock.patch.object(module_info_util, '_show_files_reuse_message')
     @mock.patch.object(module_info_util, '_is_new_json_file_generated')
@@ -194,7 +202,7 @@
     @mock.patch.object(module_info_util, '_get_generated_json_files')
     def test_of_build_bp_info_reuse_pass(self, mock_json, mock_isfile, mock_log,
                                          mock_time, mock_build, mock_judge,
-                                         mock_reuse, mock_fail):
+                                         mock_reuse, mock_fail, mock_gen_rust):
         """Test of _build_bp_info reuse pass."""
         gen_files = ['file5', 'file_a', 'file_b']
         mock_json.return_value = gen_files
@@ -208,6 +216,7 @@
         module_info_util._build_bp_info(mod_info, main_project=test_prj,
                                         skip_build=False)
         self.assertTrue(mock_log.called)
+        self.assertTrue(mock_gen_rust.called)
         self.assertFalse(mock_reuse.called)
         self.assertFalse(mock_fail.called)
 
@@ -287,7 +296,7 @@
         module_info_util._build_bp_info(amodule_info, unittest_constants.
                                         TEST_MODULE, False, skip)
         self.assertTrue(mock_time.called)
-        self.assertEqual(mock_build.call_count, 2)
+        self.assertEqual(mock_build.call_count, 1)
 
     @mock.patch('os.path.getmtime')
     @mock.patch('os.path.isfile')
@@ -538,6 +547,43 @@
         self.assertFalse(mock_show_reuse.called)
         self.assertFalse(mock_build_fail.called)
 
+    @mock.patch('builtins.print')
+    @mock.patch('os.symlink')
+    @mock.patch('os.remove')
+    @mock.patch('os.path.islink')
+    @mock.patch('os.path.isfile')
+    def test_generate_rust_project_link(self, mock_isfile, mock_islink,
+                                        mock_remove, mock_symlink, mock_print):
+        """Test _generate_rust_project_link function."""
+        mock_isfile.return_value = True
+        mock_islink.return_value = False
+        module_info_util._generate_rust_project_link()
+        self.assertFalse(mock_print.called)
+        self.assertFalse(mock_remove.called)
+        self.assertTrue(mock_symlink.called)
+
+        mock_symlink.reset_mock()
+        mock_remove.reset_mock()
+        mock_print.reset_mock()
+        mock_islink.return_value = True
+        module_info_util._generate_rust_project_link()
+        self.assertTrue(mock_remove.called)
+        self.assertFalse(mock_print.called)
+        self.assertTrue(mock_symlink.called)
+
+        mock_symlink.reset_mock()
+        mock_remove.reset_mock()
+        mock_print.reset_mock()
+        mock_isfile.return_value = False
+        module_info_util._generate_rust_project_link()
+        self.assertTrue(mock_print.called)
+
+        mock_symlink.reset_mock()
+        mock_remove.reset_mock()
+        mock_print.reset_mock()
+        mock_islink.return_value = True
+        module_info_util._generate_rust_project_link()
+        self.assertTrue(mock_print.called)
 
 if __name__ == '__main__':
     unittest.main()
diff --git a/aidegen/lib/native_module_info_unittest.py b/aidegen/lib/native_module_info_unittest.py
index ce017c0..bec73e5 100644
--- a/aidegen/lib/native_module_info_unittest.py
+++ b/aidegen/lib/native_module_info_unittest.py
@@ -130,6 +130,7 @@
         """Test get_module_includes with include paths."""
         mock_load.return_value = None, _CC_NAME_TO_MODULE_INFO
         mod_info = native_module_info.NativeModuleInfo()
+        mod_info.name_to_module_info = _CC_NAME_TO_MODULE_INFO
         result = mod_info.get_module_includes('multiarch')
         self.assertEqual(set(_NATIVE_INCLUDES1), result)
 
diff --git a/aidegen/lib/native_project_info.py b/aidegen/lib/native_project_info.py
index 0b36e44..138cfbb 100644
--- a/aidegen/lib/native_project_info.py
+++ b/aidegen/lib/native_project_info.py
@@ -18,11 +18,15 @@
 
 from __future__ import absolute_import
 
+import logging
+
+from aidegen.lib import common_util
 from aidegen.lib import native_module_info
 from aidegen.lib import project_config
 from aidegen.lib import project_info
 
 
+# pylint: disable=too-few-public-methods
 class NativeProjectInfo():
     """Native project information.
 
@@ -86,11 +90,18 @@
                      generated.
         """
         config = project_config.ProjectConfig.get_instance()
-        if config.is_skip_build:
-            return
         cls._init_modules_info()
         need_builds = cls._get_need_builds(targets)
+        if config.is_skip_build:
+            if need_builds:
+                print('{} {}'.format(
+                    common_util.COLORED_INFO('Warning:'),
+                    'Native modules build skipped:\n{}.'.format(
+                        '\n'.join(need_builds))))
+            return
         if need_builds:
+            logging.info('\nThe batch_build_dependencies function is called by '
+                         'NativeProjectInfo\'s generate_projects method.')
             project_info.batch_build_dependencies(need_builds)
 
     @classmethod
diff --git a/aidegen/lib/native_project_info_unittest.py b/aidegen/lib/native_project_info_unittest.py
index cb8f876..1a5662b 100644
--- a/aidegen/lib/native_project_info_unittest.py
+++ b/aidegen/lib/native_project_info_unittest.py
@@ -38,6 +38,9 @@
         native_project_info.NativeProjectInfo._init_modules_info()
         self.assertEqual(mock_mod_info.call_count, 1)
 
+    # pylint: disable=too-many-arguments
+    @mock.patch('logging.info')
+    @mock.patch('builtins.print')
     @mock.patch.object(project_info, 'batch_build_dependencies')
     @mock.patch.object(native_project_info.NativeProjectInfo,
                        '_get_need_builds')
@@ -45,22 +48,35 @@
                        '_init_modules_info')
     @mock.patch.object(project_config.ProjectConfig, 'get_instance')
     def test_generate_projects(self, mock_get_inst, mock_mod_info,
-                               mock_get_need, mock_batch):
+                               mock_get_need, mock_batch, mock_print,
+                               mock_info):
         """Test initializing NativeProjectInfo woth different conditions."""
         target = 'libui'
         config = mock.Mock()
         mock_get_inst.return_value = config
         config.is_skip_build = True
+        nativeInfo = native_project_info.NativeProjectInfo
+        nativeInfo.modules_info = mock.Mock()
+        nativeInfo.modules_info.is_module.return_value = [True, True]
+        nativeInfo.modules_info.is_module_need_build.return_value = [True, True]
         native_project_info.NativeProjectInfo.generate_projects([target])
-        self.assertFalse(mock_mod_info.called)
+        self.assertTrue(mock_mod_info.called)
+        self.assertTrue(mock_print.called)
+        self.assertFalse(mock_info.called)
 
         mock_mod_info.reset_mock()
+        mock_print.reset_mock()
+        mock_info.reset_mock()
         config.is_skip_build = False
+        nativeInfo.modules_info.is_module_need_build.return_value = [
+            False, False]
         mock_get_need.return_value = ['mod1', 'mod2']
         native_project_info.NativeProjectInfo.generate_projects([target])
         self.assertTrue(mock_mod_info.called)
         self.assertTrue(mock_get_need.called)
         self.assertTrue(mock_batch.called)
+        self.assertFalse(mock_print.called)
+        self.assertTrue(mock_info.called)
 
     def test_get_need_builds_without_needed_build(self):
         """Test _get_need_builds method without needed build."""
diff --git a/aidegen/lib/native_util.py b/aidegen/lib/native_util.py
index 874444a..c02396c 100644
--- a/aidegen/lib/native_util.py
+++ b/aidegen/lib/native_util.py
@@ -20,7 +20,6 @@
 launching native projects in IDE.
 """
 
-import fnmatch
 import os
 
 from aidegen import constant
@@ -28,6 +27,11 @@
 from aidegen.lib import common_util
 from aidegen.lib import native_module_info
 
+_RUST_JSON_NOT_EXIST = 'The json file: {} does not exist.'
+_RUST_DICT_BROKEN = 'The rust dictionary does not have "{}" key. It\'s broken.'
+_CRATES_KEY = 'crates'
+_ROOT_MODULE_KEY = 'root_module'
+
 
 def generate_clion_projects(targets):
     """Generates CLion projects by targets.
@@ -52,16 +56,17 @@
     """
     cc_module_info = native_module_info.NativeModuleInfo()
     parent_dir, targets = _get_merged_native_target(cc_module_info, targets)
-    module_names = []
-    for target in targets:
-        mod_info = cc_module_info.get_module_info(target)
-        clion_gen = clion_project_file_gen.CLionProjectFileGenerator(mod_info)
-        clion_gen.generate_cmakelists_file()
-        module_names.append(mod_info[constant.KEY_MODULE_NAME])
     rel_path = os.path.relpath(parent_dir, common_util.get_android_root_dir())
     # If the relative path is Android root, we won't show '.' in the path.
     if rel_path == '.':
         rel_path = ''
+    module_names = []
+    for target in targets:
+        mod_info = cc_module_info.get_module_info(target)
+        clion_gen = clion_project_file_gen.CLionProjectFileGenerator(
+            mod_info, rel_path)
+        clion_gen.generate_cmakelists_file()
+        module_names.append(mod_info[constant.KEY_MODULE_NAME])
     return clion_project_file_gen.generate_base_cmakelists_file(
         cc_module_info, rel_path, module_names)
 
@@ -136,13 +141,13 @@
     return parent_folder, new_targets
 
 
-def get_native_and_java_projects(atest_module_info, cc_module_info, targets):
+def get_java_cc_and_rust_projects(atest_module_info, cc_module_info, targets):
     """Gets native and java projects from targets.
 
     Separates native and java projects from targets.
     1. If it's a native module, add it to native projects.
     2. If it's a java module, add it to java projects.
-    3. Calls _analyze_native_and_java_projects to analyze the remaining targets.
+    3. If it's a rust module, add it to rust targets.
 
     Args:
         atest_module_info: A ModuleInfo instance contains the merged data of
@@ -152,9 +157,10 @@
         targets: A list of targets to be analyzed.
 
     Returns:
-        A tuple of a list of java build targets and a list of native build
-        targets.
+        A tuple of a list of java build targets, a list of C/C++ build
+        targets and a list of rust build targets.
     """
+    rtargets = _filter_out_rust_projects(targets)
     ctargets, lefts = _filter_out_modules(targets, cc_module_info.is_module)
     jtargets, lefts = _filter_out_modules(lefts, atest_module_info.is_module)
     path_info = cc_module_info.path_to_module_info
@@ -162,7 +168,7 @@
         atest_module_info, path_info, lefts)
     ctargets.extend(ctars)
     jtargets.extend(jtars)
-    return jtargets, ctargets
+    return jtargets, ctargets, rtargets
 
 
 def _analyze_native_and_java_projects(atest_module_info, path_info, targets):
@@ -171,12 +177,12 @@
     Args:
         atest_module_info: A ModuleInfo instance contains the merged data of
                            module-info.json and module_bp_java_deps.json.
-        path_info: A dictionary contains native projects' path as key
+        path_info: A dictionary contains C/C++ projects' path as key
                    and module's info dictionary as value.
         targets: A list of targets to be analyzed.
 
     Returns:
-        A tuple of a list of java build targets and a list of native build
+        A tuple of a list of java build targets and a list of C/C++ build
         targets.
     """
     jtargets = []
@@ -184,30 +190,15 @@
     for target in targets:
         rel_path, abs_path = common_util.get_related_paths(
             atest_module_info, target)
-        if _check_java_file_exists(abs_path):
+        if common_util.check_java_or_kotlin_file_exists(abs_path):
             jtargets.append(target)
         if _check_native_project_exists(path_info, rel_path):
             ctargets.append(target)
     return jtargets, ctargets
 
 
-def _check_java_file_exists(abs_path):
-    """Checks if any Java files exist in an abs_path directory.
-
-    Args:
-        abs_path: A string of absolute path of a directory to be check.
-
-    Returns:
-        True if any Java files exist otherwise False.
-    """
-    for _, _, filenames in os.walk(abs_path):
-        if fnmatch.filter(filenames, constant.JAVA_FILES):
-            return True
-    return False
-
-
 def _check_native_project_exists(path_to_module_info, rel_path):
-    """Checks if any native project exists in a rel_path directory.
+    """Checks if any C/C++ project exists in a rel_path directory.
 
     Args:
         path_to_module_info: A dictionary contains data of relative path as key
@@ -215,9 +206,62 @@
         rel_path: A string of relative path of a directory to be check.
 
     Returns:
-        True if any native project exists otherwise False.
+        True if any C/C++ project exists otherwise False.
     """
     for path in path_to_module_info:
         if common_util.is_source_under_relative_path(path, rel_path):
             return True
     return False
+
+
+def _filter_out_rust_projects(targets):
+    """Filters out if the input targets contain any Rust project.
+
+    Args:
+        targets: A list of targets to be checked.
+
+    Returns:
+        A list of Rust projects.
+    """
+    root_dir = common_util.get_android_root_dir()
+    rust_project_json = os.path.join(
+        root_dir,
+        common_util.get_blueprint_json_path(constant.RUST_PROJECT_JSON))
+    if not os.path.isfile(rust_project_json):
+        message = _RUST_JSON_NOT_EXIST.format(rust_project_json)
+        print(constant.WARN_MSG.format(
+            common_util.COLORED_INFO('Warning:'), message))
+        return None
+    rust_dict = common_util.get_json_dict(rust_project_json)
+    if _CRATES_KEY not in rust_dict:
+        message = _RUST_DICT_BROKEN.format(_CRATES_KEY)
+        print(constant.WARN_MSG.format(
+            common_util.COLORED_INFO('Warning:'), message))
+        return None
+    return _get_rust_targets(targets, rust_dict[_CRATES_KEY], root_dir)
+
+
+def _get_rust_targets(targets, rust_modules_info, root_dir):
+    """Gets Rust targets by checking input targets with a rust info dictionary.
+
+    Args:
+        targets: A list of targets to be checked.
+        rust_modules_info: A list of the Android Rust modules info.
+        root_dir: A string of the Android root directory.
+
+    Returns:
+        A list of Rust targets.
+    """
+    rtargets = []
+    for target in targets:
+        # For now a Rust project can only be specified by its path, not by its
+        # module name.
+        if not os.path.isdir(os.path.join(root_dir, target)):
+            continue
+        for mod_info in rust_modules_info:
+            if _ROOT_MODULE_KEY not in mod_info:
+                continue
+            path = mod_info[_ROOT_MODULE_KEY]
+            if common_util.is_source_under_relative_path(path, target):
+                rtargets.append(target)
+    return rtargets
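
_filter_out_rust_projects and _get_rust_targets above read out/soong/rust-project.json, whose 'crates' entries carry a 'root_module' source path, and keep every requested directory that contains at least one crate root. The sketch below reproduces that matching on hand-written data; the prefix check is only an approximation of common_util.is_source_under_relative_path.

    import os
    import tempfile


    def get_rust_targets(targets, crates, root_dir):
        """Returns the targets whose directory contains at least one crate root."""
        rust_targets = []
        for target in targets:
            # Rust projects are addressed by path, not by module name.
            if not os.path.isdir(os.path.join(root_dir, target)):
                continue
            for crate in crates:
                root_module = crate.get('root_module', '')
                # Simplified stand-in for is_source_under_relative_path.
                if root_module == target or root_module.startswith(target + '/'):
                    rust_targets.append(target)
                    break
        return rust_targets


    root = tempfile.mkdtemp()
    os.makedirs(os.path.join(root, 'external/rust/crates/libc/src'))
    crates = [{'root_module': 'external/rust/crates/libc/src/lib.rs'}]
    print(get_rust_targets(['external/rust/crates/libc'], crates, root))
    # ['external/rust/crates/libc']
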
diff --git a/aidegen/lib/native_util_unittest.py b/aidegen/lib/native_util_unittest.py
index 5eb6f2a..52f8f83 100644
--- a/aidegen/lib/native_util_unittest.py
+++ b/aidegen/lib/native_util_unittest.py
@@ -32,7 +32,7 @@
     """Unit tests for native_util.py"""
 
     @mock.patch.object(native_util, '_check_native_project_exists')
-    @mock.patch.object(native_util, '_check_java_file_exists')
+    @mock.patch.object(common_util, 'check_java_or_kotlin_file_exists')
     @mock.patch.object(common_util, 'get_related_paths')
     def test_analyze_native_and_java_projects(
             self, mock_get_related, mock_check_java, mock_check_native):
@@ -139,11 +139,14 @@
         self.assertEqual(
             result, native_util._filter_out_modules(targets, lambda x: True))
 
+    @mock.patch.object(native_util, '_filter_out_rust_projects')
     @mock.patch.object(native_util, '_analyze_native_and_java_projects')
     @mock.patch.object(native_util, '_filter_out_modules')
-    def test_get_native_and_java_projects(self, mock_fil, mock_ana):
-        """Test get_native_and_java_projects handling."""
+    def test_get_java_cc_and_rust_projects(self, mock_fil, mock_ana,
+                                           mock_fil_rust):
+        """Test get_java_cc_and_rust_projects handling."""
         targets = ['multiarch']
+        mock_fil_rust.return_value = []
         mock_fil.return_value = [], targets
         cc_mod_info = mock.Mock()
         cc_mod_info.is_module = mock.Mock()
@@ -152,23 +155,73 @@
         at_mod_info.is_module = mock.Mock()
         at_mod_info.is_module.return_value = True
         mock_ana.return_value = [], targets
-        native_util.get_native_and_java_projects(
+        native_util.get_java_cc_and_rust_projects(
             at_mod_info, cc_mod_info, targets)
         self.assertEqual(mock_fil.call_count, 2)
         self.assertEqual(mock_ana.call_count, 1)
 
-    @mock.patch('os.walk')
-    def test_check_java_file_exists(self, mock_walk):
-        """Test _check_java_file_exists with conditions."""
-        root_dir = 'a/path/to/dir'
-        folder = 'path/to/dir'
-        target = 'test.java'
-        abs_path = os.path.join(root_dir, folder)
-        mock_walk.return_value = [(root_dir, [folder], [target])]
-        self.assertTrue(native_util._check_java_file_exists(abs_path))
-        target = 'test.cpp'
-        mock_walk.return_value = [(root_dir, [folder], [target])]
-        self.assertFalse(native_util._check_java_file_exists(abs_path))
+    @mock.patch.object(native_util, '_get_rust_targets')
+    @mock.patch.object(common_util, 'get_json_dict')
+    @mock.patch('builtins.print')
+    @mock.patch('os.path.isfile')
+    @mock.patch('os.path.join')
+    @mock.patch.object(common_util, 'get_blueprint_json_path')
+    @mock.patch.object(common_util, 'get_android_root_dir')
+    def test_filter_out_rust_projects(self, mock_get_root, mock_get_json,
+                                      mock_join, mock_is_file, mock_print,
+                                      mock_get_dict, mock_get_rust):
+        """Test _filter_out_rust_projects with conditions."""
+        mock_is_file.return_value = False
+        native_util._filter_out_rust_projects(['a/b/rust'])
+        self.assertTrue(mock_get_root.called)
+        self.assertTrue(mock_get_json.called)
+        self.assertTrue(mock_join.called)
+        self.assertTrue(mock_print.called)
+        self.assertFalse(mock_get_dict.called)
+        self.assertFalse(mock_get_rust.called)
+
+        mock_get_root.reset_mock()
+        mock_get_json.reset_mock()
+        mock_join.reset_mock()
+        mock_print.reset_mock()
+        mock_get_dict.reset_mock()
+        mock_get_rust.reset_mock()
+        mock_is_file.return_value = True
+        mock_get_dict.return_value = {}
+        native_util._filter_out_rust_projects(['a/b/rust'])
+        self.assertTrue(mock_get_root.called)
+        self.assertTrue(mock_get_json.called)
+        self.assertTrue(mock_join.called)
+        self.assertTrue(mock_print.called)
+        self.assertFalse(mock_get_rust.called)
+
+        mock_get_root.reset_mock()
+        mock_get_json.reset_mock()
+        mock_join.reset_mock()
+        mock_print.reset_mock()
+        mock_get_rust.reset_mock()
+        mock_is_file.return_value = True
+        crates = [{native_util._ROOT_MODULE_KEY: 'a/b/rust/src'}]
+        mock_get_dict.return_value = {native_util._CRATES_KEY: crates}
+        mock_get_root.return_value = 'a/b'
+        native_util._filter_out_rust_projects(['a/b/rust'])
+        self.assertTrue(mock_get_json.called)
+        self.assertTrue(mock_join.called)
+        self.assertTrue(mock_get_rust.called)
+        mock_get_rust.assert_called_with(['a/b/rust'], crates, 'a/b')
+
+    @mock.patch.object(common_util, 'is_source_under_relative_path')
+    @mock.patch('os.path.isdir')
+    def test_get_rust_targets(self, mock_is_dir, mock_is_under):
+        """Test _get_rust_targets with conditions."""
+        mock_is_dir.return_value = True
+        mock_is_under.return_value = True
+        targets = ['a/b/rust']
+        self.assertEqual(
+            targets,
+            native_util._get_rust_targets(
+                targets, [{native_util._ROOT_MODULE_KEY: 'a/b/rust/src'}],
+                'a/b'))
 
 
 if __name__ == '__main__':
diff --git a/aidegen/lib/project_config.py b/aidegen/lib/project_config.py
index b13e090..eaa75d5 100644
--- a/aidegen/lib/project_config.py
+++ b/aidegen/lib/project_config.py
@@ -52,7 +52,7 @@
         _instance: A singleton instance of ProjectConfig.
 
     Attributes:
-        ide_name: The IDE name which user prefer to launch.
+        ide_name: The IDE name which users prefer to launch.
         is_launch_ide: A boolean for launching IDE in the end of AIDEGen.
         depth: The depth of module referenced by source.
         full_repo: A boolean decides import whole Android source repo.
@@ -62,6 +62,7 @@
         ide_installed_path: A string of IDE installed path.
         config_reset: A boolean if true to reset all saved configurations.
         atest_module_info: A ModuleInfo instance.
+        language: The programming language users prefer to deal with.
     """
 
     _instance = None
@@ -72,6 +73,7 @@
         Args:
             An argparse.Namespace object holds parsed args.
         """
+        self.language = constant.LANGUAGE_NAME_DICT[args.language[0]]
         self.ide_name = constant.IDE_NAME_DICT[args.ide[0]]
         self.is_launch_ide = not args.no_launch
         self.depth = args.depth
@@ -88,7 +90,10 @@
     def init_environment(self):
         """Initialize the environment settings for the whole project."""
         self._show_skip_build_msg()
-        self.atest_module_info = common_util.get_atest_module_info(self.targets)
+        # TODO(b/159078170): Skip the CLion IDE case for now; we should also
+        # skip Android Studio's native project case in the future.
+        targets = self.targets if self.language == constant.JAVA else None
+        self.atest_module_info = common_util.get_atest_module_info(targets)
         self.exclude_paths = _transform_exclusive_paths(
             self.atest_module_info, self.exclude_paths)
         self.targets = _check_whole_android_tree(self.targets, self.full_repo)
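The init_environment() change above hands targets to module-info loading only for Java projects. Below is a minimal sketch of that gating; the constant names, dictionary contents, and helper function are assumptions for illustration, not AIDEGen's actual definitions.

JAVA = 'Java'                                   # assumed value of constant.JAVA
LANGUAGE_NAME_DICT = {'j': JAVA, 'c': 'C/C++'}  # assumed mapping

def targets_for_module_info(targets, language_flag):
    """Return targets only for Java projects; native projects get None."""
    language = LANGUAGE_NAME_DICT[language_flag]
    return targets if language == JAVA else None

assert targets_for_module_info(['Settings'], 'j') == ['Settings']
assert targets_for_module_info(['Settings'], 'c') is None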
diff --git a/aidegen/lib/project_config_unittest.py b/aidegen/lib/project_config_unittest.py
index deb173a..d1533c7 100644
--- a/aidegen/lib/project_config_unittest.py
+++ b/aidegen/lib/project_config_unittest.py
@@ -62,7 +62,7 @@
         """Test __init__ method without launching IDE."""
         args = aidegen_main._parse_args(['a', '-n', '-s'])
         config = project_config.ProjectConfig(args)
-        self.assertEqual(config.ide_name, constant.IDE_INTELLIJ)
+        self.assertEqual(config.ide_name, constant.IDE_UNDEFINED)
         self.assertFalse(config.is_launch_ide)
         self.assertEqual(config.depth, 0)
         self.assertFalse(config.full_repo)
@@ -73,7 +73,7 @@
         self.assertFalse(config.config_reset)
         self.assertEqual(config.exclude_paths, None)
         config_obj = project_config.ProjectConfig.get_instance()
-        self.assertEqual(config_obj.ide_name, constant.IDE_INTELLIJ)
+        self.assertEqual(config_obj.ide_name, constant.IDE_UNDEFINED)
         self.assertFalse(config_obj.is_launch_ide)
         self.assertEqual(config_obj.depth, 0)
         self.assertFalse(config_obj.full_repo)
@@ -88,11 +88,11 @@
         """Test __init__ method with different arguments."""
         args = aidegen_main._parse_args([])
         config = project_config.ProjectConfig(args)
-        self.assertEqual(config.ide_name, constant.IDE_INTELLIJ)
+        self.assertEqual(config.ide_name, constant.IDE_UNDEFINED)
         self.assertEqual(config.targets, [''])
         config_obj = project_config.ProjectConfig.get_instance()
         self.assertEqual(config_obj.targets, [''])
-        self.assertEqual(config_obj.ide_name, constant.IDE_INTELLIJ)
+        self.assertEqual(config_obj.ide_name, constant.IDE_UNDEFINED)
         target = 'tradefed'
         args = aidegen_main._parse_args([target])
         config = project_config.ProjectConfig(args)
@@ -174,6 +174,11 @@
         self.assertTrue(mock_get_atest.called)
         self.assertTrue(mock_trans.called)
         self.assertTrue(mock_check_whole.called)
+        mock_get_atest.reset_mock()
+        args = aidegen_main._parse_args(['-i', 'c'])
+        config = project_config.ProjectConfig(args)
+        config.init_environment()
+        mock_get_atest.assert_called_with(None)
 
     @mock.patch('builtins.print')
     def test_show_skip_build_msg_with_skip(self, mock_print):
diff --git a/aidegen/lib/project_file_gen.py b/aidegen/lib/project_file_gen.py
index 62ad140..1115f9a 100644
--- a/aidegen/lib/project_file_gen.py
+++ b/aidegen/lib/project_file_gen.py
@@ -32,7 +32,7 @@
 from aidegen.lib import common_util
 from aidegen.lib import config
 from aidegen.lib import project_config
-from aidegen.project import source_splitter
+from aidegen.project import project_splitter
 
 # FACET_SECTION is a part of iml, which defines the framework of the project.
 _MODULE_SECTION = ('            <module fileurl="file:///$PROJECT_DIR$/%s.iml"'
@@ -109,7 +109,7 @@
         """
         # Initialization
         iml.IMLGenerator.USED_NAME_CACHE.clear()
-        proj_splitter = source_splitter.ProjectSplitter(projects)
+        proj_splitter = project_splitter.ProjectSplitter(projects)
         proj_splitter.get_dependencies()
         proj_splitter.revise_source_folders()
         iml_paths = [proj_splitter.gen_framework_srcjars_iml()]
@@ -143,13 +143,12 @@
             os.path.join(code_style_dir, _CODE_STYLE_CONFIG_XML),
             templates.XML_CODE_STYLE_CONFIG)
         code_style_target_path = os.path.join(code_style_dir, _PROJECT_XML)
-        if os.path.exists(code_style_target_path):
-            os.remove(code_style_target_path)
-        try:
-            shutil.copy2(_CODE_STYLE_SRC_PATH, code_style_target_path)
-        except (OSError, SystemError) as err:
-            logging.warning('%s can\'t copy the project files\n %s',
-                            code_style_target_path, err)
+        if not os.path.exists(code_style_target_path):
+            try:
+                shutil.copy2(_CODE_STYLE_SRC_PATH, code_style_target_path)
+            except (OSError, SystemError) as err:
+                logging.warning('%s can\'t copy the project files\n %s',
+                                code_style_target_path, err)
         # Create .gitignore if it doesn't exist.
         _generate_git_ignore(target_path)
         # Create jsonSchemas.xml for TEST_MAPPING.
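The code-style copy above now runs only when the target file is missing. A small standard-library sketch of that copy-if-missing behavior follows; the helper name and paths are hypothetical.

import logging
import os
import shutil

def copy_if_missing(src, dest):
    """Copy src to dest only when dest does not already exist."""
    if os.path.exists(dest):
        return
    try:
        shutil.copy2(src, dest)
    except (OSError, SystemError) as err:
        logging.warning('%s can\'t copy the project files\n %s', dest, err)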
diff --git a/aidegen/lib/project_file_gen_unittest.py b/aidegen/lib/project_file_gen_unittest.py
index 10bedf7..5fb64d3 100644
--- a/aidegen/lib/project_file_gen_unittest.py
+++ b/aidegen/lib/project_file_gen_unittest.py
@@ -29,7 +29,7 @@
 from aidegen.lib import project_config
 from aidegen.lib import project_file_gen
 from aidegen.lib import project_info
-from aidegen.project import source_splitter
+from aidegen.project import project_splitter
 
 
 # pylint: disable=protected-access
@@ -257,11 +257,12 @@
     @mock.patch.object(project_file_gen, '_merge_project_vcs_xmls')
     @mock.patch.object(project_file_gen.ProjectFileGenerator,
                        'generate_intellij_project_file')
-    @mock.patch.object(source_splitter.ProjectSplitter, 'gen_projects_iml')
-    @mock.patch.object(source_splitter.ProjectSplitter,
+    @mock.patch.object(project_splitter.ProjectSplitter, 'gen_projects_iml')
+    @mock.patch.object(project_splitter.ProjectSplitter,
                        'gen_framework_srcjars_iml')
-    @mock.patch.object(source_splitter.ProjectSplitter, 'revise_source_folders')
-    @mock.patch.object(source_splitter.ProjectSplitter, 'get_dependencies')
+    @mock.patch.object(project_splitter.ProjectSplitter,
+                       'revise_source_folders')
+    @mock.patch.object(project_splitter.ProjectSplitter, 'get_dependencies')
     @mock.patch.object(common_util, 'get_android_root_dir')
     @mock.patch.object(project_info, 'ProjectInfo')
     def test_generate_ide_project_files(self, mock_project, mock_get_root,
diff --git a/aidegen/lib/project_info.py b/aidegen/lib/project_info.py
index 3bcb18a..4ab74f6 100644
--- a/aidegen/lib/project_info.py
+++ b/aidegen/lib/project_info.py
@@ -20,26 +20,24 @@
 
 import logging
 import os
+import time
 
 from aidegen import constant
+from aidegen.lib import aidegen_metrics
 from aidegen.lib import common_util
 from aidegen.lib import errors
 from aidegen.lib import module_info
 from aidegen.lib import project_config
 from aidegen.lib import source_locator
+from aidegen.idea import iml
 
 from atest import atest_utils
 
 _CONVERT_MK_URL = ('https://android.googlesource.com/platform/build/soong/'
                    '#convert-android_mk-files')
-_ANDROID_MK_WARN = (
-    '{} contains Android.mk file(s) in its dependencies:\n{}\nPlease help '
-    'convert these files into blueprint format in the future, otherwise '
-    'AIDEGen may not be able to include all module dependencies.\nPlease visit '
-    '%s for reference on how to convert makefile.' % _CONVERT_MK_URL)
 _ROBOLECTRIC_MODULE = 'Robolectric_all'
-_NOT_TARGET = ('Module %s\'s class setting is %s, none of which is included in '
-               '%s, skipping this module in the project.')
+_NOT_TARGET = ('The module %s does not contain any Java or Kotlin file, '
+               'therefore we skip this module in the project.')
 # The module fake-framework have the same package name with framework but empty
 # content. It will impact the dependency for framework when referencing the
 # package from fake-framework in IntelliJ.
@@ -62,6 +60,8 @@
     Class attributes:
         modules_info: An AidegenModuleInfo instance whose name_to_module_info is
                       combining module-info.json with module_bp_java_deps.json.
+        projects: A list of instances of ProjectInfo that are generated in an
+                  AIDEGen command.
 
     Attributes:
         project_absolute_path: The absolute path of the project.
@@ -91,6 +91,12 @@
         is_main_project: A boolean to verify the project is main project.
         dependencies: A list of dependency projects' iml file names, e.g. base,
                       framework-all.
+        iml_name: The iml project file name of this project.
+        rel_out_soong_jar_path: A string of the project's relative path in the
+                                'out/soong/.intermediates' directory, e.g., if
+                                self.project_relative_path is 'frameworks/base',
+                                rel_out_soong_jar_path should be
+                                'out/soong/.intermediates/frameworks/base/'.
     """
 
     modules_info = None
@@ -121,8 +127,9 @@
         else:
             self.dep_modules = self.get_dep_modules()
         self._filter_out_modules()
-        self._display_convert_make_files_message()
         self.dependencies = []
+        self.iml_name = iml.IMLGenerator.get_unique_iml_name(abs_path)
+        self.rel_out_soong_jar_path = self._get_rel_project_out_soong_jar_path()
 
     def _set_default_modues(self):
         """Append default hard-code modules, source paths and jar files.
@@ -149,14 +156,6 @@
             'srcjar_path': set()
         }
 
-    def _display_convert_make_files_message(self):
-        """Show message info users convert their Android.mk to Android.bp."""
-        mk_set = set(self._search_android_make_files())
-        if mk_set:
-            print('\n{} {}\n'.format(
-                common_util.COLORED_INFO('Warning:'),
-                _ANDROID_MK_WARN.format(self.module_name, '\n'.join(mk_set))))
-
     def _search_android_make_files(self):
         """Search project and dependency modules contain Android.mk files.
 
@@ -179,9 +178,10 @@
                     yield '\t' + os.path.join(rel_path, constant.ANDROID_MK)
 
     def _get_modules_under_project_path(self, rel_path):
-        """Find modules under the rel_path.
+        """Find qualified modules under the rel_path.
 
-        Find modules whose class is qualified to be included as a target module.
+        Find modules which contain any Java or Kotlin file and treat them as
+        target modules. For the whole source tree project, add all modules.
 
         Args:
             rel_path: A string, the project's relative path.
@@ -189,17 +189,20 @@
         Returns:
             A set of module names.
         """
-        logging.info('Find modules whose class is in %s under %s.',
-                     constant.TARGET_CLASSES, rel_path)
+        logging.info('Finding modules with Java or Kotlin files under %s.',
+                     rel_path)
+        if rel_path == '':
+            return self.modules_info.name_to_module_info.keys()
         modules = set()
+        root_dir = common_util.get_android_root_dir()
         for name, data in self.modules_info.name_to_module_info.items():
             if module_info.AidegenModuleInfo.is_project_path_relative_module(
                     data, rel_path):
-                if module_info.AidegenModuleInfo.is_target_module(data):
+                if common_util.check_java_or_kotlin_file_exists(
+                        os.path.join(root_dir, data[constant.KEY_PATH][0])):
                     modules.add(name)
                 else:
-                    logging.debug(_NOT_TARGET, name, data.get('class', ''),
-                                  constant.TARGET_CLASSES)
+                    logging.debug(_NOT_TARGET, name)
         return modules
 
     def _get_robolectric_dep_module(self, modules):
@@ -368,6 +371,8 @@
             return
         if rebuild_targets:
             if build:
+                logging.info('\nThe batch_build_dependencies function is '
+                             'called by ProjectInfo\'s locate_source method.')
                 batch_build_dependencies(rebuild_targets)
                 self.locate_source(build=False)
             else:
@@ -415,6 +420,23 @@
                 if common_util.is_target(x, constant.TARGET_LIBS)
             ])
 
+    def _get_rel_project_out_soong_jar_path(self):
+        """Gets the projects' jar path in 'out/soong/.intermediates' folder.
+
+        Gets the relative project's jar path in the 'out/soong/.intermediates'
+        directory. For example, if the self.project_relative_path is
+        'frameworks/base', the returned value should be
+        'out/soong/.intermediates/frameworks/base/'.
+
+        Returns:
+            A string of relative project path in out/soong/.intermediates/
+            directory, e.g. 'out/soong/.intermediates/frameworks/base/'.
+        """
+        rdir = os.path.relpath(common_util.get_soong_out_path(),
+                               common_util.get_android_root_dir())
+        return os.sep.join(
+            [rdir, constant.INTERMEDIATES, self.project_relative_path]) + os.sep
+
     @classmethod
     def multi_projects_locate_source(cls, projects):
         """Locate the paths of dependent source folders and jar files.
@@ -424,16 +446,38 @@
                       such as project relative path, project real path, project
                       dependencies.
         """
+        cls.projects = projects
         for project in projects:
             project.locate_source()
+            _update_iml_dep_modules(project)
 
 
 class MultiProjectsInfo(ProjectInfo):
     """Multiple projects info.
 
     Usage example:
-        project = MultiProjectsInfo(['module_name'])
-        project.collect_all_dep_modules()
+        if folder_base:
+            project = MultiProjectsInfo(['module_name'])
+            project.collect_all_dep_modules()
+            project.gen_folder_base_dependencies()
+        else:
+            ProjectInfo.generate_projects(['module_name'])
+
+    Attributes:
+        _targets: A list of module names or project paths.
+        path_to_sources: A dictionary of modules' sources, the module's path
+                         as key and the sources as value.
+                         e.g.
+                         {
+                             'frameworks/base': {
+                                 'src_dirs': [],
+                                 'test_dirs': [],
+                                 'r_java_paths': [],
+                                 'srcjar_paths': [],
+                                 'jar_files': [],
+                                 'dep_paths': [],
+                             }
+                         }
     """
 
     def __init__(self, targets=None):
@@ -444,10 +488,48 @@
         """
         super().__init__(targets[0], True)
         self._targets = targets
+        self.path_to_sources = {}
+
+    def _clear_srcjar_paths(self, module):
+        """Clears the srcjar_paths.
+
+        Args:
+            module: A ModuleData instance.
+        """
+        module.srcjar_paths = []
+
+    def _collect_framework_srcjar_info(self, module):
+        """Clears the framework's srcjars.
+
+        Args:
+            module: A ModuleData instance.
+        """
+        if module.module_path == constant.FRAMEWORK_PATH:
+            framework_srcjar_path = os.path.join(constant.FRAMEWORK_PATH,
+                                                 constant.FRAMEWORK_SRCJARS)
+            if module.module_name == constant.FRAMEWORK_ALL:
+                self.path_to_sources[framework_srcjar_path] = {
+                    'src_dirs': [],
+                    'test_dirs': [],
+                    'r_java_paths': [],
+                    'srcjar_paths': module.srcjar_paths,
+                    'jar_files': [],
+                    'dep_paths': [constant.FRAMEWORK_PATH],
+                }
+            # In the folder-based case, AIDEGen has to ignore every module's
+            # srcjar files under frameworks/base except those of framework-all,
+            # because there are too many duplicate srcjars among the modules
+            # under frameworks/base. AIDEGen therefore keeps only the srcjar
+            # files from the framework-all module; other modules' srcjar files
+            # are removed. However, in the module-based case, srcjar files are
+            # collected by the ProjectInfo class, so removing srcjar_paths in
+            # this class does not impact the srcjar_paths collection of modules
+            # in the ProjectInfo class.
+            self._clear_srcjar_paths(module)
 
     def collect_all_dep_modules(self):
         """Collects all dependency modules for the projects."""
-        self.project_module_names = set()
+        self.project_module_names.clear()
         module_names = set(_CORE_MODULES)
         for target in self._targets:
             relpath, _ = common_util.get_related_paths(self.modules_info,
@@ -456,6 +538,30 @@
         module_names.update(self._get_robolectric_dep_module(module_names))
         self.dep_modules = self.get_dep_modules(module_names)
 
+    def gen_folder_base_dependencies(self, module):
+        """Generates the folder base dependencies dictionary.
+
+        Args:
+            module: A ModuleData instance.
+        """
+        mod_path = module.module_path
+        if not mod_path:
+            logging.debug('The %s\'s path is empty.', module.module_name)
+            return
+        self._collect_framework_srcjar_info(module)
+        if mod_path not in self.path_to_sources:
+            self.path_to_sources[mod_path] = {
+                'src_dirs': module.src_dirs,
+                'test_dirs': module.test_dirs,
+                'r_java_paths': module.r_java_paths,
+                'srcjar_paths': module.srcjar_paths,
+                'jar_files': module.jar_files,
+                'dep_paths': module.dep_paths,
+            }
+        else:
+            for key, val in self.path_to_sources[mod_path].items():
+                val.extend([v for v in getattr(module, key) if v not in val])
+
 
 def batch_build_dependencies(rebuild_targets):
     """Batch build the jar or srcjar files of the modules if they don't exist.
@@ -469,12 +575,17 @@
     Args:
         rebuild_targets: A set of jar or srcjar files which do not exist.
     """
+    start_time = time.time()
     logging.info('Ready to build the jar or srcjar files. Files count = %s',
                  str(len(rebuild_targets)))
     arg_max = os.sysconf('SC_PAGE_SIZE') * 32 - _CMD_LENGTH_BUFFER
     rebuild_targets = list(rebuild_targets)
     for start, end in iter(_separate_build_targets(rebuild_targets, arg_max)):
         _build_target(rebuild_targets[start:end])
+    duration = time.time() - start_time
+    logging.debug('Build time, duration = %s', str(duration))
+    aidegen_metrics.performance_metrics(constant.TYPE_AIDEGEN_BUILD_TIME,
+                                        duration)
 
 
 def _build_target(targets):
@@ -517,3 +628,30 @@
             arg_len = len(item) + _BLANK_SIZE
     if first_item_index < len(build_targets):
         yield first_item_index, len(build_targets)
+
+
+def _update_iml_dep_modules(project):
+    """Gets the dependent modules in the project's iml file.
+
+    Jar files whose source code duplicates the source files of cls.projects
+    should be removed from the jar paths in the dependencies.iml file; that
+    logic is implemented in aidegen.project.project_splitter.py.
+    We should also add the owning project's unique iml name into
+    project.dependencies, which is later written into that project's own iml
+    file. If these jar files stay in dependencies.iml, they cause duplicated
+    code in the IDE and raise issues. For example, when users run 'refactor'
+    to rename a class in the IDE, it searches all sources and dependencies'
+    jar paths and leads to the error.
+    """
+    keys = ('source_folder_path', 'test_folder_path', 'r_java_path',
+            'srcjar_path', 'jar_path')
+    for key in keys:
+        for jar in project.source_path[key]:
+            for prj in ProjectInfo.projects:
+                if prj is project:
+                    continue
+                if (prj.rel_out_soong_jar_path in jar and
+                        jar.endswith(constant.JAR_EXT)):
+                    if prj.iml_name not in project.dependencies:
+                        project.dependencies.append(prj.iml_name)
+                    break
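As an illustration of _update_iml_dep_modules above, a standalone sketch of the matching rule that turns a jar under another project's out/soong/.intermediates subtree into an iml dependency; the helper name, mapping, and paths are hypothetical.

JAR_EXT = '.jar'  # assumed value of constant.JAR_EXT

def owning_iml(jar_path, rel_soong_path_to_iml):
    """Return the iml name whose soong subtree contains jar_path, if any."""
    for rel_path, iml_name in rel_soong_path_to_iml.items():
        if rel_path in jar_path and jar_path.endswith(JAR_EXT):
            return iml_name
    return None

mapping = {'out/soong/.intermediates/frameworks/base/': 'base'}
jar = ('out/soong/.intermediates/frameworks/base/framework/'
       'android_common/combined/framework.jar')
assert owning_iml(jar, mapping) == 'base'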
diff --git a/aidegen/lib/project_info_unittest.py b/aidegen/lib/project_info_unittest.py
index 19f1d8f..884cf8c 100644
--- a/aidegen/lib/project_info_unittest.py
+++ b/aidegen/lib/project_info_unittest.py
@@ -28,6 +28,7 @@
 from aidegen.lib import common_util
 from aidegen.lib import project_info
 from aidegen.lib import project_config
+from aidegen.lib import source_locator
 
 _MODULE_INFO = {
     'm1': {
@@ -105,6 +106,7 @@
         self.args.verbose = False
         self.args.ide_installed_path = None
         self.args.config_reset = False
+        self.args.language = ['j']
 
     @mock.patch('atest.module_info.ModuleInfo')
     def test_get_dep_modules(self, mock_module_info):
@@ -141,11 +143,12 @@
             unittest_constants.TEST_MODULE)
 
     # pylint: disable=too-many-locals
+    @mock.patch('logging.info')
     @mock.patch.object(common_util, 'get_android_root_dir')
     @mock.patch('atest.module_info.ModuleInfo')
     @mock.patch('atest.atest_utils.build')
     def test_locate_source(self, mock_atest_utils_build, mock_module_info,
-                           mock_get_root):
+                           mock_get_root, mock_info):
         """Test locate_source handling."""
         mock_atest_utils_build.build.return_value = True
         test_root_path = os.path.join(tempfile.mkdtemp(), 'test')
@@ -173,6 +176,7 @@
         result_jar = set()
         project_info_obj.locate_source()
         self.assertEqual(project_info_obj.source_path['jar_path'], result_jar)
+        self.assertTrue(mock_info.called)
 
         # Test collects source and test folders.
         result_source = set(['packages/apps/test/src/main/java'])
@@ -217,6 +221,7 @@
         args.verbose = False
         args.ide_installed_path = None
         args.config_reset = False
+        args.language = ['j']
         project_config.ProjectConfig(args)
         project_info_obj = project_info.ProjectInfo(
             mock_module_info.get_paths()[0])
@@ -297,25 +302,6 @@
         mock_format.reset_mock()
         mock_build.reset_mock()
 
-    @mock.patch('builtins.print')
-    @mock.patch.object(project_info.ProjectInfo, '_search_android_make_files')
-    @mock.patch('atest.module_info.ModuleInfo')
-    def test_display_convert_make_files_message(
-            self, mock_module_info, mock_search, mock_print):
-        """Test _display_convert_make_files_message with conditions."""
-        mock_search.return_value = []
-        mock_module_info.get_paths.return_value = ['m1']
-        project_info.ProjectInfo.modules_info = mock_module_info
-        proj_info = project_info.ProjectInfo(self.args.module_name)
-        proj_info._display_convert_make_files_message()
-        self.assertFalse(mock_print.called)
-
-        mock_print.mock_reset()
-        mock_search.return_value = ['a/b/path/to/target.mk']
-        proj_info = project_info.ProjectInfo(self.args.module_name)
-        proj_info._display_convert_make_files_message()
-        self.assertTrue(mock_print.called)
-
     @mock.patch.object(project_info, '_build_target')
     @mock.patch.object(project_info, '_separate_build_targets')
     @mock.patch.object(logging, 'info')
@@ -327,6 +313,39 @@
         self.assertTrue(mock_sep.called)
         self.assertEqual(mock_build.call_count, 1)
 
+    @mock.patch('os.path.relpath')
+    def test_get_rel_project_out_soong_jar_path(self, mock_rel):
+        """Test _get_rel_project_out_soong_jar_path."""
+        out_dir = 'a/b/out/soong'
+        mock_rel.return_value = out_dir
+        proj_info = project_info.ProjectInfo(self.args.module_name, False)
+        expected = os.sep.join(
+            [out_dir, constant.INTERMEDIATES, 'm1']) + os.sep
+        self.assertEqual(
+            expected, proj_info._get_rel_project_out_soong_jar_path())
+
+    def test_update_iml_dep_modules(self):
+        """Test _update_iml_dep_modules with conditions."""
+        project1 = mock.Mock()
+        project1.source_path = {
+            'source_folder_path': [], 'test_folder_path': [], 'r_java_path': [],
+            'srcjar_path': [], 'jar_path': []
+        }
+        project1.dependencies = []
+        project2 = mock.Mock()
+        project2.iml_name = 'm2'
+        project2.rel_out_soong_jar_path = 'out/soong/.intermediates/m2'
+        project_info.ProjectInfo.projects = [project1, project2]
+        project_info._update_iml_dep_modules(project1)
+        self.assertEqual([], project1.dependencies)
+        project1.source_path = {
+            'source_folder_path': [], 'test_folder_path': [], 'r_java_path': [],
+            'srcjar_path': [],
+            'jar_path': ['out/soong/.intermediates/m2/a/b/any.jar']
+        }
+        project_info._update_iml_dep_modules(project1)
+        self.assertEqual(['m2'], project1.dependencies)
+
 
 class MultiProjectsInfoUnittests(unittest.TestCase):
     """Unit tests for MultiProjectsInfo class."""
@@ -349,9 +368,122 @@
         expected = set(project_info._CORE_MODULES)
         expected.update({'sub_module', 'robo_module'})
         proj = project_info.MultiProjectsInfo(['a'])
+        proj.project_module_names = {'framework-all'}
         proj.collect_all_dep_modules()
         self.assertTrue(mock_get_dep_modules.called_with(expected))
 
+    @mock.patch.object(logging, 'debug')
+    @mock.patch.object(source_locator, 'ModuleData')
+    @mock.patch.object(project_info.ProjectInfo, '__init__')
+    def test_gen_folder_base_dependencies(self, mock_init, mock_module_data,
+                                          mock_log):
+        """Test _gen_folder_base_dependencies."""
+        mock_init.return_value = None
+        proj = project_info.MultiProjectsInfo(['a'])
+        module = mock.Mock()
+        mock_module_data.return_value = module
+        mock_module_data.module_path = ''
+        proj.gen_folder_base_dependencies(mock_module_data)
+        self.assertTrue(mock_log.called)
+        mock_module_data.module_path = 'a/b'
+        mock_module_data.src_dirs = ['a/b/c']
+        mock_module_data.test_dirs = []
+        mock_module_data.r_java_paths = []
+        mock_module_data.srcjar_paths = []
+        mock_module_data.jar_files = []
+        mock_module_data.dep_paths = []
+        proj.gen_folder_base_dependencies(mock_module_data)
+        expected = {
+            'a/b': {
+                'src_dirs': ['a/b/c'],
+                'test_dirs': [],
+                'r_java_paths': [],
+                'srcjar_paths': [],
+                'jar_files': [],
+                'dep_paths': [],
+            }
+        }
+        self.assertEqual(proj.path_to_sources, expected)
+        mock_module_data.srcjar_paths = ['x/y.srcjar']
+        proj.gen_folder_base_dependencies(mock_module_data)
+        expected = {
+            'a/b': {
+                'src_dirs': ['a/b/c'],
+                'test_dirs': [],
+                'r_java_paths': [],
+                'srcjar_paths': ['x/y.srcjar'],
+                'jar_files': [],
+                'dep_paths': [],
+            }
+        }
+        self.assertEqual(proj.path_to_sources, expected)
+
+    @mock.patch.object(source_locator, 'ModuleData')
+    @mock.patch.object(project_info.ProjectInfo, '__init__')
+    def test_add_framework_base_path(self, mock_init, mock_module_data):
+        """Test _gen_folder_base_dependencies."""
+        mock_init.return_value = None
+        proj = project_info.MultiProjectsInfo(['a'])
+        module = mock.Mock()
+        mock_module_data.return_value = module
+        mock_module_data.module_path = 'frameworks/base'
+        mock_module_data.module_name = 'framework-other'
+        mock_module_data.src_dirs = ['a/b/c']
+        mock_module_data.test_dirs = []
+        mock_module_data.r_java_paths = []
+        mock_module_data.srcjar_paths = ['x/y.srcjar']
+        mock_module_data.jar_files = []
+        mock_module_data.dep_paths = []
+        proj.gen_folder_base_dependencies(mock_module_data)
+        expected = {
+            'frameworks/base': {
+                'dep_paths': [],
+                'jar_files': [],
+                'r_java_paths': [],
+                'src_dirs': ['a/b/c'],
+                'srcjar_paths': [],
+                'test_dirs': [],
+            }
+        }
+        self.assertDictEqual(proj.path_to_sources, expected)
+
+    @mock.patch.object(source_locator, 'ModuleData')
+    @mock.patch.object(project_info.ProjectInfo, '__init__')
+    def test_add_framework_srcjar_path(self, mock_init, mock_module_data):
+        """Test _gen_folder_base_dependencies."""
+        mock_init.return_value = None
+        proj = project_info.MultiProjectsInfo(['a'])
+        module = mock.Mock()
+        mock_module_data.return_value = module
+        mock_module_data.module_path = 'frameworks/base'
+        mock_module_data.module_name = 'framework-all'
+        mock_module_data.src_dirs = ['a/b/c']
+        mock_module_data.test_dirs = []
+        mock_module_data.r_java_paths = []
+        mock_module_data.srcjar_paths = ['x/y.srcjar']
+        mock_module_data.jar_files = []
+        mock_module_data.dep_paths = []
+        proj.gen_folder_base_dependencies(mock_module_data)
+        expected = {
+            'frameworks/base': {
+                'dep_paths': [],
+                'jar_files': [],
+                'r_java_paths': [],
+                'src_dirs': ['a/b/c'],
+                'srcjar_paths': [],
+                'test_dirs': [],
+            },
+            'frameworks/base/framework_srcjars': {
+                'dep_paths': ['frameworks/base'],
+                'jar_files': [],
+                'r_java_paths': [],
+                'src_dirs': [],
+                'srcjar_paths': ['x/y.srcjar'],
+                'test_dirs': [],
+            }
+        }
+        self.assertDictEqual(proj.path_to_sources, expected)
+
 
 if __name__ == '__main__':
     unittest.main()
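A standalone sketch (hypothetical data) of the merge rule exercised by test_gen_folder_base_dependencies above: when a module path is seen again, only values not already recorded are appended.

existing = {'src_dirs': ['a/b/c'], 'srcjar_paths': []}
incoming = {'src_dirs': ['a/b/c'], 'srcjar_paths': ['x/y.srcjar']}
for key, val in existing.items():
    # Append only the incoming values that are not recorded yet.
    val.extend([v for v in incoming[key] if v not in val])
assert existing == {'src_dirs': ['a/b/c'], 'srcjar_paths': ['x/y.srcjar']}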
diff --git a/aidegen/lib/source_locator.py b/aidegen/lib/source_locator.py
index 7a8e5a2..9c13ae3 100644
--- a/aidegen/lib/source_locator.py
+++ b/aidegen/lib/source_locator.py
@@ -36,14 +36,11 @@
 # File extensions
 _JAVA_EXT = '.java'
 _KOTLIN_EXT = '.kt'
-_SRCJAR_EXT = '.srcjar'
 _TARGET_FILES = [_JAVA_EXT, _KOTLIN_EXT]
 _JARJAR_RULES_FILE = 'jarjar-rules.txt'
 _KEY_JARJAR_RULES = 'jarjar_rules'
-_NAME_AAPT2 = 'aapt2'
-_TARGET_R_SRCJAR = 'R.srcjar'
-_TARGET_AAPT2_SRCJAR = _NAME_AAPT2 + _SRCJAR_EXT
-_TARGET_BUILD_FILES = [_TARGET_AAPT2_SRCJAR, _TARGET_R_SRCJAR]
+_TARGET_AAPT2_SRCJAR = constant.NAME_AAPT2 + constant.SRCJAR_EXT
+_TARGET_BUILD_FILES = [_TARGET_AAPT2_SRCJAR, constant.TARGET_R_SRCJAR]
 _IGNORE_DIRS = [
     # The java files under this directory have to be ignored because it will
     # cause duplicated classes by libcore/ojluni/src/main/java.
@@ -68,7 +65,8 @@
         jar_files: A list to keep the unique jar file relative paths.
         r_java_paths: A list to keep the R folder paths to use in Eclipse.
         srcjar_paths: A list to keep the srcjar source root paths to use in
-                      IntelliJ.
+                      IntelliJ. Some modules' srcjar_paths will be removed when
+                      running with MultiProjectsInfo.
         dep_paths: A list to keep the dependency modules' path.
         referenced_by_jar: A boolean to check if the module is referenced by a
                            jar file.
@@ -220,10 +218,10 @@
         target_folder, target_file = os.path.split(srcjar)
         base_dirname = os.path.basename(target_folder)
         if target_file == _TARGET_AAPT2_SRCJAR:
-            return os.path.join(target_folder, _NAME_AAPT2)
-        if target_file == _TARGET_R_SRCJAR and base_dirname == _ANDROID:
+            return os.path.join(target_folder, constant.NAME_AAPT2)
+        if target_file == constant.TARGET_R_SRCJAR and base_dirname == _ANDROID:
             return os.path.join(os.path.dirname(target_folder),
-                                _NAME_AAPT2, 'R')
+                                constant.NAME_AAPT2, 'R')
         return None
 
     def _init_module_path(self):
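A standalone sketch of the srcjar-to-source-root mapping in the hunk above; the constant values are assumptions inferred from the names used there.

import os

NAME_AAPT2 = 'aapt2'          # assumed value of constant.NAME_AAPT2
TARGET_R_SRCJAR = 'R.srcjar'  # assumed value of constant.TARGET_R_SRCJAR
ANDROID = 'android'

def srcjar_source_root(srcjar):
    """Map a generated srcjar path to the folder IntelliJ should index."""
    folder, name = os.path.split(srcjar)
    if name == NAME_AAPT2 + '.srcjar':
        return os.path.join(folder, NAME_AAPT2)
    if name == TARGET_R_SRCJAR and os.path.basename(folder) == ANDROID:
        return os.path.join(os.path.dirname(folder), NAME_AAPT2, 'R')
    return None

assert srcjar_source_root('gen/android/R.srcjar') == os.path.join(
    'gen', 'aapt2', 'R')
assert srcjar_source_root('gen/aapt2.srcjar') == os.path.join('gen', 'aapt2')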
diff --git a/aidegen/project/project_splitter.py b/aidegen/project/project_splitter.py
new file mode 100644
index 0000000..06f488a
--- /dev/null
+++ b/aidegen/project/project_splitter.py
@@ -0,0 +1,494 @@
+#!/usr/bin/env python3
+#
+# Copyright 2020 - The Android Open Source Project
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Separate the sources from multiple projects."""
+
+import logging
+import os
+import shutil
+
+from aidegen import constant
+from aidegen.idea import iml
+from aidegen.lib import common_util
+from aidegen.lib import project_config
+
+_KEY_SOURCE_PATH = 'source_folder_path'
+_KEY_TEST_PATH = 'test_folder_path'
+_SOURCE_FOLDERS = [_KEY_SOURCE_PATH, _KEY_TEST_PATH]
+_KEY_SRCJAR_PATH = 'srcjar_path'
+_KEY_R_PATH = 'r_java_path'
+_KEY_JAR_PATH = 'jar_path'
+_EXCLUDE_ITEM = '\n            <excludeFolder url="file://%s" />'
+# Temporarily exclude the test-dump and src_stub folders to prevent symbol
+# resolving failures caused by incorrect references. These two folders should
+# be removed after b/136982078 is resolved.
+_EXCLUDE_FOLDERS = ['.idea', '.repo', 'art', 'bionic', 'bootable', 'build',
+                    'dalvik', 'developers', 'device', 'hardware', 'kernel',
+                    'libnativehelper', 'pdk', 'prebuilts', 'sdk', 'system',
+                    'toolchain', 'tools', 'vendor', 'out', 'external',
+                    'art/tools/ahat/src/test-dump',
+                    'cts/common/device-side/device-info/src_stub']
+_PERMISSION_DEFINED_PATH = ('frameworks/base/core/res/framework-res/'
+                            'android_common/gen/')
+_ANDROID = 'android'
+_R = 'R'
+
+
+class ProjectSplitter:
+    """Splits the sources from multiple projects.
+
+    It is a specific solution to deal with the source folders in the
+    multiple-project case. Since IntelliJ does not allow duplicate source
+    folders, AIDEGen needs to separate the source folders for each project.
+    The single-project case is no different from the current structure.
+
+    Usage:
+    project_splitter = ProjectSplitter(projects)
+
+    # Find the dependencies between the projects.
+    project_splitter.get_dependencies()
+
+    # Clear the source folders for each project.
+    project_splitter.revise_source_folders()
+
+    Attributes:
+        _projects: A list of ProjectInfo.
+        _all_srcs: A dictionary contains all sources of multiple projects.
+                   e.g.
+                   {
+                       'module_name': 'test',
+                       'path': ['path/to/module'],
+                       'srcs': ['src_folder1', 'src_folder2'],
+                       'tests': ['test_folder1', 'test_folder2']
+                       'jars': ['jar1.jar'],
+                       'srcjars': ['1.srcjar', '2.srcjar'],
+                       'dependencies': ['framework_srcjars', 'base'],
+                       'iml_name': '/abs/path/to/iml.iml'
+                   }
+        _framework_exist: A boolean, True if framework is one of the projects.
+        _framework_iml: A string, the name of the framework's iml.
+        _full_repo: A boolean, True if loading with full Android sources.
+        _full_repo_iml: A string, the name of the Android folder's iml.
+        _permission_r_srcjar: A string, the relative path of the R.srcjar file
+                              where permission-related constants are defined.
+        _permission_aapt2: A string, the relative path of the aapt2/R directory
+                           where permission-related constants are defined.
+    """
+    def __init__(self, projects):
+        """ProjectSplitter initialize.
+
+        Args:
+            projects: A list of ProjectInfo object.
+        """
+        self._projects = projects
+        self._all_srcs = dict(projects[0].source_path)
+        self._framework_iml = None
+        self._framework_exist = any(
+            {p.project_relative_path == constant.FRAMEWORK_PATH
+             for p in self._projects})
+        if self._framework_exist:
+            self._framework_iml = iml.IMLGenerator.get_unique_iml_name(
+                os.path.join(common_util.get_android_root_dir(),
+                             constant.FRAMEWORK_PATH))
+        self._full_repo = project_config.ProjectConfig.get_instance().full_repo
+        if self._full_repo:
+            self._full_repo_iml = os.path.basename(
+                common_util.get_android_root_dir())
+        self._permission_r_srcjar = _get_permission_r_srcjar_rel_path()
+        self._permission_aapt2 = _get_permission_aapt2_rel_path()
+
+    def revise_source_folders(self):
+        """Resets the source folders of each project.
+
+        There should be no duplicate source root paths in IntelliJ. The issue
+        doesn't happen in the single-project case. Once users choose multiple
+        projects, several identical source paths may exist across different
+        projects. In order to prevent that, we should remove the source paths
+        in dependencies.iml which duplicate the paths in the [module].iml
+        files.
+
+        Steps to prevent the duplicate source root path in IntelliJ:
+        1. Copy all sources from sub-projects to main project.
+        2. Delete the source and test folders which are not under the
+           sub-projects.
+        3. Delete the sub-projects' source and test paths from the main project.
+        """
+        self._collect_all_srcs()
+        self._keep_local_sources()
+        self._remove_duplicate_sources()
+
+    def _collect_all_srcs(self):
+        """Copies all projects' sources to a dictionary."""
+        for project in self._projects[1:]:
+            for key, value in project.source_path.items():
+                self._all_srcs[key].update(value)
+
+    def _keep_local_sources(self):
+        """Removes source folders which are not under the project's path.
+
+        1. Remove the source folders which are not under the project.
+        2. Remove the duplicate project's source folders from the _all_srcs.
+        """
+        for project in self._projects:
+            srcs = project.source_path
+            relpath = project.project_relative_path
+            is_root = not relpath
+            for key in _SOURCE_FOLDERS:
+                srcs[key] = {s for s in srcs[key]
+                             if common_util.is_source_under_relative_path(
+                                 s, relpath) or is_root}
+                self._all_srcs[key] -= srcs[key]
+
+    def _remove_duplicate_sources(self):
+        """Removes the duplicate source folders from each sub project.
+
+        Priority processing with the longest path length, e.g.
+        frameworks/base/packages/SettingsLib must have priority over
+        frameworks/base.
+        (b/160303006): Remove the parent project's source and test paths under
+        the child's project path.
+        """
+        root = common_util.get_android_root_dir()
+        projects = sorted(self._projects, key=lambda k: len(
+            k.project_relative_path), reverse=True)
+        for child in projects:
+            for parent in self._projects:
+                is_root = not parent.project_relative_path
+                if parent is child:
+                    continue
+                if (common_util.is_source_under_relative_path(
+                        child.project_relative_path,
+                        parent.project_relative_path) or is_root):
+                    for key in _SOURCE_FOLDERS:
+                        parent.source_path[key] -= child.source_path[key]
+                        rm_paths = _remove_child_duplicate_sources_from_parent(
+                            child, parent.source_path[key], root)
+                        parent.source_path[key] -= rm_paths
+
+    def get_dependencies(self):
+        """Gets the dependencies between the projects.
+
+        Check if the current project's source folders exist in other projects.
+        If so, the current project is a dependency module of the other.
+        """
+        projects = sorted(self._projects, key=lambda k: len(
+            k.project_relative_path))
+        for project in projects:
+            proj_path = project.project_relative_path
+            project.dependencies = [constant.FRAMEWORK_SRCJARS]
+            if self._framework_exist and proj_path != constant.FRAMEWORK_PATH:
+                project.dependencies.append(self._framework_iml)
+            if self._full_repo and proj_path:
+                project.dependencies.append(self._full_repo_iml)
+            srcs = (project.source_path[_KEY_SOURCE_PATH]
+                    | project.source_path[_KEY_TEST_PATH])
+            dep_projects = sorted(self._projects, key=lambda k: len(
+                k.project_relative_path))
+            for dep_proj in dep_projects:
+                dep_path = dep_proj.project_relative_path
+                is_root = not dep_path
+                is_child = common_util.is_source_under_relative_path(dep_path,
+                                                                     proj_path)
+                is_dep = any({s for s in srcs
+                              if common_util.is_source_under_relative_path(
+                                  s, dep_path) or is_root})
+                if dep_proj is project or is_child or not is_dep:
+                    continue
+                dep = iml.IMLGenerator.get_unique_iml_name(os.path.join(
+                    common_util.get_android_root_dir(), dep_path))
+                if dep not in project.dependencies:
+                    project.dependencies.append(dep)
+            project.dependencies.append(constant.KEY_DEPENDENCIES)
+
+    def gen_framework_srcjars_iml(self):
+        """Generates the framework_srcjars.iml.
+
+        Create the iml file with only the srcjars of module framework-all. These
+        srcjars will be separated from the modules under frameworks/base.
+
+        Returns:
+            A string of the framework_srcjars.iml's absolute path.
+        """
+        self._remove_permission_definition_srcjar_path()
+        mod = dict(self._projects[0].dep_modules[constant.FRAMEWORK_ALL])
+        mod[constant.KEY_DEPENDENCIES] = []
+        mod[constant.KEY_IML_NAME] = constant.FRAMEWORK_SRCJARS
+        if self._framework_exist:
+            mod[constant.KEY_DEPENDENCIES].append(self._framework_iml)
+        if self._full_repo:
+            mod[constant.KEY_DEPENDENCIES].append(self._full_repo_iml)
+        mod[constant.KEY_DEPENDENCIES].append(constant.KEY_DEPENDENCIES)
+        srcjar_dict = dict()
+        permission_src = self._get_permission_defined_source_path()
+        if permission_src:
+            mod[constant.KEY_SRCS] = [permission_src]
+            srcjar_dict = {constant.KEY_DEP_SRCS: True,
+                           constant.KEY_SRCJARS: True,
+                           constant.KEY_DEPENDENCIES: True}
+        else:
+            logging.warning('The permission definition relative paths are '
+                            'missing.')
+            srcjar_dict = {constant.KEY_SRCJARS: True,
+                           constant.KEY_DEPENDENCIES: True}
+        framework_srcjars_iml = iml.IMLGenerator(mod)
+        framework_srcjars_iml.create(srcjar_dict)
+        self._all_srcs[_KEY_SRCJAR_PATH] -= set(mod.get(constant.KEY_SRCJARS,
+                                                        []))
+        return framework_srcjars_iml.iml_path
+
+    def _get_permission_defined_source_path(self):
+        """Gets the source path where permission relative constants are defined.
+
+        For the definition permission constants, the priority is,
+        1) If framework-res/android_common/gen/aapt2/R directory exists, return
+           it.
+        2) If the framework-res/android_common/gen/android/R.srcjar file exists,
+           unzip it to 'aidegen_r.srcjar' folder and return the path.
+
+        Returns:
+            A string of the path of aapt2/R or android/aidegen_r.srcjar folder,
+            else None.
+        """
+        if os.path.isdir(self._permission_aapt2):
+            return self._permission_aapt2
+        if os.path.isfile(self._permission_r_srcjar):
+            dest = os.path.join(
+                os.path.dirname(self._permission_r_srcjar),
+                ''.join([constant.UNZIP_SRCJAR_PATH_HEAD,
+                         os.path.basename(self._permission_r_srcjar).lower()]))
+            if os.path.isdir(dest):
+                shutil.rmtree(dest)
+            common_util.unzip_file(self._permission_r_srcjar, dest)
+            return dest
+        return None
+
+    def _gen_dependencies_iml(self):
+        """Generates the dependencies.iml."""
+        rel_project_soong_paths = self._get_rel_project_soong_paths()
+        self._unzip_all_scrjars()
+        mod = {
+            constant.KEY_SRCS: _get_real_dependencies_jars(
+                rel_project_soong_paths, self._all_srcs[_KEY_SOURCE_PATH]),
+            constant.KEY_TESTS: _get_real_dependencies_jars(
+                rel_project_soong_paths, self._all_srcs[_KEY_TEST_PATH]),
+            constant.KEY_JARS: _get_real_dependencies_jars(
+                rel_project_soong_paths, self._all_srcs[_KEY_JAR_PATH]),
+            constant.KEY_SRCJARS: _get_real_dependencies_jars(
+                rel_project_soong_paths,
+                self._all_srcs[_KEY_R_PATH] | self._all_srcs[_KEY_SRCJAR_PATH]),
+            constant.KEY_DEPENDENCIES: _get_real_dependencies_jars(
+                rel_project_soong_paths, [constant.FRAMEWORK_SRCJARS]),
+            constant.KEY_PATH: [self._projects[0].project_relative_path],
+            constant.KEY_MODULE_NAME: constant.KEY_DEPENDENCIES,
+            constant.KEY_IML_NAME: constant.KEY_DEPENDENCIES
+        }
+        if self._framework_exist:
+            mod[constant.KEY_DEPENDENCIES].append(self._framework_iml)
+        if self._full_repo:
+            mod[constant.KEY_DEPENDENCIES].append(self._full_repo_iml)
+        dep_iml = iml.IMLGenerator(mod)
+        dep_iml.create({constant.KEY_DEP_SRCS: True,
+                        constant.KEY_SRCJARS: True,
+                        constant.KEY_JARS: True,
+                        constant.KEY_DEPENDENCIES: True})
+
+    def _unzip_all_scrjars(self):
+        """Unzips all scrjar files to a specific folder 'aidegen_r.srcjar'.
+
+        For some versions of IntelliJ no more supports unzipping srcjar files
+        automatically, we have to unzip it to a 'aidegen_r.srcjar' directory.
+        The rules of the unzip process are,
+        1) If it's a aapt2/R type jar or other directory type sources, add them
+           into self._all_srcs[_KEY_SOURCE_PATH].
+        2) If it's an R.srcjar file, check if the same path of aapt2/R directory
+           exists if so add aapt2/R path into into the
+           self._all_srcs[_KEY_SOURCE_PATH], otherwise unzip R.srcjar into
+           the 'aidegen_r.srcjar' directory and add the unzipped path into
+           self._all_srcs[_KEY_SOURCE_PATH].
+        """
+        sjars = self._all_srcs[_KEY_R_PATH] | self._all_srcs[_KEY_SRCJAR_PATH]
+        self._all_srcs[_KEY_R_PATH] = set()
+        self._all_srcs[_KEY_SRCJAR_PATH] = set()
+        for sjar in sjars:
+            if not os.path.exists(sjar):
+                continue
+            if os.path.isdir(sjar):
+                self._all_srcs[_KEY_SOURCE_PATH].add(sjar)
+                continue
+            sjar_dir = os.path.dirname(sjar)
+            sjar_name = os.path.basename(sjar).lower()
+            aapt2 = os.path.join(
+                os.path.dirname(sjar_dir), constant.NAME_AAPT2, _R)
+            if os.path.isdir(aapt2):
+                self._all_srcs[_KEY_SOURCE_PATH].add(aapt2)
+                continue
+            dest = os.path.join(
+                sjar_dir, ''.join([constant.UNZIP_SRCJAR_PATH_HEAD, sjar_name]))
+            if os.path.isdir(dest):
+                shutil.rmtree(dest)
+            common_util.unzip_file(sjar, dest)
+            self._all_srcs[_KEY_SOURCE_PATH].add(dest)
+
+    def gen_projects_iml(self):
+        """Generates the projects' iml file."""
+        root_path = common_util.get_android_root_dir()
+        excludes = project_config.ProjectConfig.get_instance().exclude_paths
+        for project in self._projects:
+            relpath = project.project_relative_path
+            exclude_folders = []
+            if not relpath:
+                exclude_folders.extend(get_exclude_content(root_path))
+            if excludes:
+                exclude_folders.extend(get_exclude_content(root_path, excludes))
+            mod_info = {
+                constant.KEY_EXCLUDES: ''.join(exclude_folders),
+                constant.KEY_SRCS: project.source_path[_KEY_SOURCE_PATH],
+                constant.KEY_TESTS: project.source_path[_KEY_TEST_PATH],
+                constant.KEY_DEPENDENCIES: project.dependencies,
+                constant.KEY_PATH: [relpath],
+                constant.KEY_MODULE_NAME: project.module_name,
+                constant.KEY_IML_NAME: iml.IMLGenerator.get_unique_iml_name(
+                    os.path.join(root_path, relpath))
+            }
+            dep_iml = iml.IMLGenerator(mod_info)
+            dep_iml.create({constant.KEY_SRCS: True,
+                            constant.KEY_DEPENDENCIES: True})
+            project.iml_path = dep_iml.iml_path
+        self._gen_dependencies_iml()
+
+    def _get_rel_project_soong_paths(self):
+        """Gets relative projects' paths in 'out/soong/.intermediates' folder.
+
+        Gets relative projects' paths in the 'out/soong/.intermediates'
+        directory, e.g., if the projects' paths are ['frameworks/base'], the
+        returned list should be ['out/soong/.intermediates/frameworks/base/'].
+
+        Returns:
+            A list of relative projects' paths in out/soong/.intermediates.
+        """
+        out_soong_dir = os.path.relpath(common_util.get_soong_out_path(),
+                                        common_util.get_android_root_dir())
+        rel_project_soong_paths = []
+        for project in self._projects:
+            relpath = project.project_relative_path
+            rel_project_soong_paths.append(os.sep.join(
+                [out_soong_dir, constant.INTERMEDIATES, relpath]) + os.sep)
+        return rel_project_soong_paths
+
+    def _remove_permission_definition_srcjar_path(self):
+        """Removes android.Manifest.permission definition srcjar path.
+
+        If framework-res/android_common/gen/aapt2/R directory or
+        framework-res/android_common/gen/android/R.srcjar file exists in
+        self._all_srcs[_KEY_SRCJAR_PATH], remove them.
+        """
+        if self._permission_aapt2 in self._all_srcs[_KEY_SRCJAR_PATH]:
+            self._all_srcs[_KEY_SRCJAR_PATH].remove(self._permission_aapt2)
+        if self._permission_r_srcjar in self._all_srcs[_KEY_SRCJAR_PATH]:
+            self._all_srcs[_KEY_SRCJAR_PATH].remove(self._permission_r_srcjar)
+
+
+def _get_real_dependencies_jars(list_to_check, list_to_be_checked):
+    """Gets real dependencies' jar from the input list.
+
+    There are jar files which have the same source codes as the
+    self.projects should be removed from dependencies. Otherwise these files
+    will cause the duplicated codes in IDE and lead to issues: b/158583214 is an
+    example.
+
+    Args:
+        list_to_check: A list of the projects' relative paths under the
+                       out/soong/.intermediates folder, used to check whether
+                       they are contained in the paths of list_to_be_checked.
+        list_to_be_checked: A list of dependencies' paths to be checked.
+
+    Returns:
+        A list of dependency jar paths after duplicated ones removed.
+    """
+    file_exts = [constant.JAR_EXT]
+    real_jars = list_to_be_checked.copy()
+    for jar in list_to_be_checked:
+        ext = os.path.splitext(jar)[-1]
+        for check_path in list_to_check:
+            if check_path in jar and ext in file_exts:
+                real_jars.remove(jar)
+                break
+    return real_jars
+
+
+def get_exclude_content(root_path, excludes=None):
+    """Get the exclude folder content list.
+
+    It returns the exclude folders content list.
+    e.g.
+    ['<excludeFolder url="file://a/.idea" />',
+    '<excludeFolder url="file://a/.repo" />']
+
+    Args:
+        root_path: Android source file path.
+        excludes: A list of exclusive directories, the default value is None but
+                  will be assigned to _EXCLUDE_FOLDERS.
+
+    Returns:
+        String: exclude folder content list.
+    """
+    exclude_items = []
+    if not excludes:
+        excludes = _EXCLUDE_FOLDERS
+    for folder in excludes:
+        folder_path = os.path.join(root_path, folder)
+        if os.path.isdir(folder_path):
+            exclude_items.append(_EXCLUDE_ITEM % folder_path)
+    return exclude_items
+
+
+def _remove_child_duplicate_sources_from_parent(child, parent_sources, root):
+    """Removes the child's duplicate source folders from the parent source list.
+
+    Removes all of the child's subdirectories from the parent's source list if
+    there are any.
+
+    Args:
+        child: A child project of ProjectInfo instance.
+        parent_sources: The parent project sources of the ProjectInfo instance.
+        root: A string of the Android root.
+
+    Returns:
+        A set of the sources to be removed.
+    """
+    rm_paths = set()
+    for path in parent_sources:
+        if (common_util.is_source_under_relative_path(
+                os.path.relpath(path, root), child.project_relative_path)):
+            rm_paths.add(path)
+    return rm_paths
+
+
+def _get_permission_aapt2_rel_path():
+    """Gets android.Manifest.permission definition srcjar path."""
+    out_soong_dir = os.path.relpath(common_util.get_soong_out_path(),
+                                    common_util.get_android_root_dir())
+    return os.path.join(out_soong_dir, constant.INTERMEDIATES,
+                        _PERMISSION_DEFINED_PATH, constant.NAME_AAPT2, _R)
+
+
+def _get_permission_r_srcjar_rel_path():
+    """Gets android.Manifest.permission definition srcjar path."""
+    out_soong_dir = os.path.relpath(common_util.get_soong_out_path(),
+                                    common_util.get_android_root_dir())
+    return os.path.join(out_soong_dir, constant.INTERMEDIATES,
+                        _PERMISSION_DEFINED_PATH, _ANDROID,
+                        constant.TARGET_R_SRCJAR)
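For readers skimming the new helpers above, here is a minimal, standalone sketch of the duplicate-jar filtering that _get_real_dependencies_jars performs. The names below (filter_real_dependency_jars, the hard-coded '.jar' extension) are illustrative stand-ins for the aidegen constants and are not part of this change.

```python
# Standalone sketch of the duplicate-jar filtering in
# _get_real_dependencies_jars; filter_real_dependency_jars and the
# hard-coded '.jar' extension stand in for the aidegen names.
import os

_JAR_EXT = '.jar'


def filter_real_dependency_jars(project_rel_paths, dependency_paths):
    """Drops .jar dependencies built from any of the listed project paths."""
    real_jars = dependency_paths.copy()
    for jar in dependency_paths:
        ext = os.path.splitext(jar)[-1]
        for rel_path in project_rel_paths:
            if rel_path in jar and ext == _JAR_EXT:
                real_jars.remove(jar)
                break
    return real_jars


if __name__ == '__main__':
    deps = ['out/soong/.intermediates/src1/lib.jar',
            'out/soong/.intermediates/other/lib.jar',
            'out/soong/.intermediates/src1/gen.srcjar']
    # Only the .jar built under the opened project src1 is dropped.
    print(filter_real_dependency_jars(['out/soong/.intermediates/src1/'], deps))
```

Only jars that are both under one of the opened projects' intermediates paths and end in .jar are dropped; srcjars and unrelated jars are kept.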
diff --git a/aidegen/project/project_splitter_unittest.py b/aidegen/project/project_splitter_unittest.py
new file mode 100644
index 0000000..1e9665c
--- /dev/null
+++ b/aidegen/project/project_splitter_unittest.py
@@ -0,0 +1,414 @@
+#!/usr/bin/env python3
+#
+# Copyright 2020 - The Android Open Source Project
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Unittests for project_splitter."""
+
+import os
+import shutil
+import tempfile
+import unittest
+from unittest import mock
+
+from aidegen import constant
+from aidegen import unittest_constants
+from aidegen.idea import iml
+from aidegen.lib import common_util
+from aidegen.lib import project_config
+from aidegen.lib import project_info
+from aidegen.project import project_splitter
+
+
+# pylint: disable=protected-access
+class ProjectSplitterUnittest(unittest.TestCase):
+    """Unit tests for ProjectSplitter class."""
+
+    _TEST_DIR = None
+    _TEST_PATH = unittest_constants.TEST_DATA_PATH
+    _SAMPLE_EXCLUDE_FOLDERS = [
+        '\n            <excludeFolder url="file://%s/.idea" />' % _TEST_PATH,
+        '\n            <excludeFolder url="file://%s/out" />' % _TEST_PATH,
+    ]
+
+    def setUp(self):
+        """Prepare the testdata related data."""
+        projects = []
+        targets = ['a', 'b', 'c', 'framework']
+        ProjectSplitterUnittest._TEST_DIR = tempfile.mkdtemp()
+        for i, target in enumerate(targets):
+            with mock.patch.object(project_info, 'ProjectInfo') as proj_info:
+                projects.append(proj_info(target, i == 0))
+        projects[0].project_relative_path = 'src1'
+        projects[0].source_path = {
+            'source_folder_path': {'src1', 'src2', 'other1'},
+            'test_folder_path': {'src1/tests'},
+            'jar_path': {'jar1.jar'},
+            'jar_module_path': dict(),
+            'r_java_path': set(),
+            'srcjar_path': {'srcjar1.srcjar'}
+        }
+        projects[1].project_relative_path = 'src2'
+        projects[1].source_path = {
+            'source_folder_path': {'src2', 'src2/src3', 'src2/lib', 'other2'},
+            'test_folder_path': {'src2/tests'},
+            'jar_path': set(),
+            'jar_module_path': dict(),
+            'r_java_path': set(),
+            'srcjar_path': {'srcjar2.srcjar'}
+        }
+        projects[2].project_relative_path = 'src2/src3'
+        projects[2].source_path = {
+            'source_folder_path': {'src2/src3', 'src2/lib'},
+            'test_folder_path': {'src2/src3/tests'},
+            'jar_path': {'jar3.jar'},
+            'jar_module_path': dict(),
+            'r_java_path': set(),
+            'srcjar_path': {'srcjar3.srcjar'}
+        }
+        projects[3].project_relative_path = 'frameworks/base'
+        projects[3].source_path = {
+            'source_folder_path': set(),
+            'test_folder_path': set(),
+            'jar_path': set(),
+            'jar_module_path': dict(),
+            'r_java_path': set(),
+            'srcjar_path': {'framework.srcjar', 'other.srcjar'}
+        }
+        with mock.patch.object(project_config.ProjectConfig,
+                               'get_instance') as proj_cfg:
+            config = mock.Mock()
+            config.full_repo = False
+            proj_cfg.return_value = config
+            self.split_projs = project_splitter.ProjectSplitter(projects)
+
+    def tearDown(self):
+        """Clear the testdata related path."""
+        self.split_projs = None
+        shutil.rmtree(ProjectSplitterUnittest._TEST_DIR)
+        iml.IMLGenerator.USED_NAME_CACHE.clear()
+
+    @mock.patch.object(common_util, 'get_android_root_dir')
+    @mock.patch.object(project_config.ProjectConfig, 'get_instance')
+    @mock.patch('builtins.any')
+    def test_init(self, mock_any, mock_project, mock_root):
+        """Test initialize the attributes."""
+        self.assertEqual(len(self.split_projs._projects), 4)
+        mock_any.return_value = False
+        mock_root.return_value = ProjectSplitterUnittest._TEST_DIR
+        with mock.patch.object(project_info, 'ProjectInfo') as proj_info:
+            config = mock.Mock()
+            config.full_repo = False
+            mock_project.return_value = config
+            project = project_splitter.ProjectSplitter(proj_info(['a'], True))
+            self.assertFalse(project._framework_exist)
+            config.full_repo = True
+            project = project_splitter.ProjectSplitter(proj_info(['a'], True))
+            self.assertEqual(project._full_repo_iml,
+                             os.path.basename(
+                                 ProjectSplitterUnittest._TEST_DIR))
+
+    @mock.patch.object(project_splitter.ProjectSplitter,
+                       '_remove_duplicate_sources')
+    @mock.patch.object(project_splitter.ProjectSplitter,
+                       '_keep_local_sources')
+    @mock.patch.object(project_splitter.ProjectSplitter,
+                       '_collect_all_srcs')
+    def test_revise_source_folders(self, mock_copy_srcs, mock_keep_srcs,
+                                   mock_remove_srcs):
+        """Test revise_source_folders."""
+        self.split_projs.revise_source_folders()
+        self.assertTrue(mock_copy_srcs.called)
+        self.assertTrue(mock_keep_srcs.called)
+        self.assertTrue(mock_remove_srcs.called)
+
+    def test_collect_all_srcs(self):
+        """Test _collect_all_srcs."""
+        self.split_projs._collect_all_srcs()
+        sources = self.split_projs._all_srcs
+        expected_srcs = {'src1', 'src2', 'src2/src3', 'src2/lib', 'other1',
+                         'other2'}
+        self.assertEqual(sources['source_folder_path'], expected_srcs)
+        expected_tests = {'src1/tests', 'src2/tests', 'src2/src3/tests'}
+        self.assertEqual(sources['test_folder_path'], expected_tests)
+
+    def test_keep_local_sources(self):
+        """Test _keep_local_sources."""
+        self.split_projs._collect_all_srcs()
+        self.split_projs._keep_local_sources()
+        srcs1 = self.split_projs._projects[0].source_path
+        srcs2 = self.split_projs._projects[1].source_path
+        srcs3 = self.split_projs._projects[2].source_path
+        all_srcs = self.split_projs._all_srcs
+        expected_srcs1 = {'src1'}
+        expected_srcs2 = {'src2', 'src2/src3', 'src2/lib'}
+        expected_srcs3 = {'src2/src3'}
+        expected_all_srcs = {'other1', 'other2'}
+        expected_all_tests = set()
+        self.assertEqual(srcs1['source_folder_path'], expected_srcs1)
+        self.assertEqual(srcs2['source_folder_path'], expected_srcs2)
+        self.assertEqual(srcs3['source_folder_path'], expected_srcs3)
+        self.assertEqual(all_srcs['source_folder_path'], expected_all_srcs)
+        self.assertEqual(all_srcs['test_folder_path'], expected_all_tests)
+
+    @mock.patch.object(
+        project_splitter, '_remove_child_duplicate_sources_from_parent')
+    def test_remove_duplicate_sources(self, mock_remove):
+        """Test _remove_duplicate_sources."""
+        self.split_projs._collect_all_srcs()
+        self.split_projs._keep_local_sources()
+        mock_remove.return_value = set()
+        self.split_projs._remove_duplicate_sources()
+        srcs2 = self.split_projs._projects[1].source_path
+        srcs3 = self.split_projs._projects[2].source_path
+        expected_srcs2 = {'src2', 'src2/lib'}
+        expected_srcs3 = {'src2/src3'}
+        self.assertEqual(srcs2['source_folder_path'], expected_srcs2)
+        self.assertEqual(srcs3['source_folder_path'], expected_srcs3)
+        self.assertTrue(mock_remove.called)
+
+    def test_get_dependencies(self):
+        """Test get_dependencies."""
+        iml.IMLGenerator.USED_NAME_CACHE.clear()
+        self.split_projs.get_dependencies()
+        dep1 = ['framework_srcjars', 'base', 'src2', 'dependencies']
+        dep2 = ['framework_srcjars', 'base', 'dependencies']
+        dep3 = ['framework_srcjars', 'base', 'src2', 'dependencies']
+        self.assertEqual(self.split_projs._projects[0].dependencies, dep1)
+        self.assertEqual(self.split_projs._projects[1].dependencies, dep2)
+        self.assertEqual(self.split_projs._projects[2].dependencies, dep3)
+
+    @mock.patch.object(iml.IMLGenerator, 'create')
+    @mock.patch.object(project_splitter.ProjectSplitter,
+                       '_get_permission_defined_source_path')
+    @mock.patch.object(project_splitter.ProjectSplitter,
+                       '_remove_permission_definition_srcjar_path')
+    @mock.patch.object(common_util, 'get_android_root_dir')
+    def test_gen_framework_srcjars_iml(
+        self, mock_root, mock_remove, mock_get, mock_create_iml):
+        """Test gen_framework_srcjars_iml."""
+        mock_root.return_value = self._TEST_DIR
+        mock_get.return_value = 'aapt2/R'
+        self.split_projs._projects[0].dep_modules = {
+            'framework-all': {
+                'module_name': 'framework-all',
+                'path': ['frameworks/base'],
+                'srcjars': ['framework.srcjar'],
+                'iml_name': 'framework_srcjars'
+            }
+        }
+        self.split_projs._framework_exist = False
+        self.split_projs.gen_framework_srcjars_iml()
+        srcjar_dict = {constant.KEY_DEP_SRCS: True, constant.KEY_SRCJARS: True,
+                       constant.KEY_DEPENDENCIES: True}
+        mock_create_iml.assert_called_with(srcjar_dict)
+        expected_srcjars = [
+            'other.srcjar',
+            'srcjar1.srcjar',
+            'srcjar2.srcjar',
+            'srcjar3.srcjar',
+        ]
+        expected_path = os.path.join(self._TEST_DIR,
+                                     'frameworks/base/framework_srcjars.iml')
+        self.split_projs._framework_exist = True
+        self.split_projs.revise_source_folders()
+        mock_get.return_value = None
+        iml_path = self.split_projs.gen_framework_srcjars_iml()
+        srcjars = self.split_projs._all_srcs['srcjar_path']
+        self.assertEqual(sorted(list(srcjars)), expected_srcjars)
+        self.assertEqual(iml_path, expected_path)
+        self.assertTrue(mock_remove.called)
+        srcjar_dict = {constant.KEY_SRCJARS: True,
+                       constant.KEY_DEPENDENCIES: True}
+        mock_create_iml.assert_called_with(srcjar_dict)
+
+    @mock.patch.object(project_splitter.ProjectSplitter, '_unzip_all_scrjars')
+    @mock.patch.object(iml.IMLGenerator, 'create')
+    @mock.patch.object(common_util, 'get_android_root_dir')
+    def test_gen_dependencies_iml(self, mock_root, mock_create_iml, mock_unzip):
+        """Test _gen_dependencies_iml."""
+        mock_root.return_value = self._TEST_DIR
+        self.split_projs.revise_source_folders()
+        self.split_projs._framework_exist = False
+        self.split_projs._gen_dependencies_iml()
+        self.assertTrue(mock_unzip.called)
+        mock_unzip.reset_mock()
+        self.split_projs._framework_exist = True
+        self.split_projs._gen_dependencies_iml()
+        self.assertTrue(mock_create_iml.called)
+        self.assertTrue(mock_unzip.called)
+
+    @mock.patch.object(project_splitter.ProjectSplitter, '_unzip_all_scrjars')
+    @mock.patch.object(project_splitter, 'get_exclude_content')
+    @mock.patch.object(project_config.ProjectConfig, 'get_instance')
+    @mock.patch.object(iml.IMLGenerator, 'create')
+    @mock.patch.object(common_util, 'get_android_root_dir')
+    def test_gen_projects_iml(self, mock_root, mock_create_iml, mock_project,
+                              mock_get_excludes, mock_unzip):
+        """Test gen_projects_iml."""
+        mock_root.return_value = self._TEST_DIR
+        config = mock.Mock()
+        mock_project.return_value = config
+        config.exclude_paths = []
+        self.split_projs.revise_source_folders()
+        self.split_projs.gen_projects_iml()
+        self.assertTrue(mock_create_iml.called)
+        self.assertTrue(mock_unzip.called)
+        mock_unzip.reset_mock()
+        self.assertFalse(mock_get_excludes.called)
+        config.exclude_paths = ['a']
+        self.split_projs.gen_projects_iml()
+        self.assertTrue(mock_get_excludes.called)
+        self.assertTrue(mock_unzip.called)
+
+    def test_get_exclude_content(self):
+        """Test get_exclude_content."""
+        exclude_folders = project_splitter.get_exclude_content(self._TEST_PATH)
+        self.assertEqual(self._SAMPLE_EXCLUDE_FOLDERS, exclude_folders)
+
+    def test_remove_child_duplicate_sources_from_parent(self):
+        """Test _remove_child_duplicate_sources_from_parent with conditions."""
+        child = mock.Mock()
+        child.project_relative_path = 'c/d'
+        root = 'a/b'
+        parent_sources = ['a/b/d/e', 'a/b/e/f']
+        result = project_splitter._remove_child_duplicate_sources_from_parent(
+            child, parent_sources, root)
+        self.assertEqual(set(), result)
+        parent_sources = ['a/b/c/d/e', 'a/b/e/f']
+        result = project_splitter._remove_child_duplicate_sources_from_parent(
+            child, parent_sources, root)
+        self.assertEqual(set(['a/b/c/d/e']), result)
+
+    @mock.patch('os.path.relpath')
+    def test_get_rel_project_soong_paths(self, mock_rel):
+        """Test _get_rel_project_soong_paths."""
+        mock_rel.return_value = 'out/soong'
+        expected = [
+            'out/soong/.intermediates/src1/',
+            'out/soong/.intermediates/src2/',
+            'out/soong/.intermediates/src2/src3/',
+            'out/soong/.intermediates/frameworks/base/'
+        ]
+        self.assertEqual(
+            expected, self.split_projs._get_rel_project_soong_paths())
+
+    def test_get_real_dependencies_jars(self):
+        """Test _get_real_dependencies_jars with conditions."""
+        expected = ['a/b/c/d']
+        self.assertEqual(expected, project_splitter._get_real_dependencies_jars(
+            [], expected))
+        expected = ['a/b/c/d.jar']
+        self.assertEqual(expected, project_splitter._get_real_dependencies_jars(
+            ['a/e'], expected))
+        expected = ['a/b/c/d.jar']
+        self.assertEqual([], project_splitter._get_real_dependencies_jars(
+            ['a/b'], expected))
+        expected = ['a/b/c/d.srcjar']
+        self.assertEqual(expected, project_splitter._get_real_dependencies_jars(
+            ['a/b'], expected))
+        expected = ['a/b/c/gen']
+        self.assertEqual(expected, project_splitter._get_real_dependencies_jars(
+            ['a/b'], expected))
+
+    @mock.patch.object(common_util, 'get_android_root_dir')
+    @mock.patch.object(common_util, 'get_soong_out_path')
+    def test_get_permission_aapt2_rel_path(self, mock_soong, mock_root):
+        """Test _get_permission_aapt2_rel_path."""
+        mock_soong.return_value = 'a/b/out/soong'
+        mock_root.return_value = 'a/b'
+        expected = ('out/soong/.intermediates/frameworks/base/core/res/'
+                    'framework-res/android_common/gen/aapt2/R')
+        self.assertEqual(
+            expected, project_splitter._get_permission_aapt2_rel_path())
+
+    @mock.patch.object(common_util, 'get_android_root_dir')
+    @mock.patch.object(common_util, 'get_soong_out_path')
+    def test_get_permission_r_srcjar_rel_path(self, mock_soong, mock_root):
+        """Test _get_permission_r_srcjar_rel_path."""
+        mock_soong.return_value = 'a/b/out/soong'
+        mock_root.return_value = 'a/b'
+        expected = ('out/soong/.intermediates/frameworks/base/core/res/'
+                    'framework-res/android_common/gen/android/R.srcjar')
+        self.assertEqual(
+            expected, project_splitter._get_permission_r_srcjar_rel_path())
+
+    @mock.patch.object(project_splitter, '_get_permission_r_srcjar_rel_path')
+    @mock.patch.object(project_splitter, '_get_permission_aapt2_rel_path')
+    def test_remove_permission_definition_srcjar_path(
+        self, mock_get_aapt2, mock_get_r_srcjar):
+        """Test _remove_permission_definition_srcjar_path with conditions."""
+        expected_srcjars = [
+            'other.srcjar',
+            'srcjar1.srcjar',
+            'srcjar2.srcjar',
+            'srcjar3.srcjar',
+        ]
+        mock_get_aapt2.return_value = 'none/aapt2/R'
+        mock_get_r_srcjar.return_value = 'none.srcjar'
+        self.split_projs._all_srcs['srcjar_path'] = expected_srcjars
+        self.split_projs._remove_permission_definition_srcjar_path()
+        srcjars = self.split_projs._all_srcs['srcjar_path']
+        self.assertEqual(sorted(list(srcjars)), expected_srcjars)
+
+        expected_srcjars = [
+            'other.srcjar',
+            'srcjar2.srcjar',
+            'srcjar3.srcjar',
+        ]
+        mock_get_r_srcjar.return_value = 'srcjar1.srcjar'
+        self.split_projs._all_srcs['srcjar_path'] = expected_srcjars
+        self.split_projs._remove_permission_definition_srcjar_path()
+        srcjars = self.split_projs._all_srcs['srcjar_path']
+        self.assertEqual(sorted(list(srcjars)), expected_srcjars)
+
+    @mock.patch('os.path.join')
+    @mock.patch.object(common_util, 'unzip_file')
+    @mock.patch('shutil.rmtree')
+    @mock.patch('os.path.isfile')
+    @mock.patch('os.path.isdir')
+    def test_get_permission_defined_source_path(
+        self, mock_is_dir, mock_is_file, mock_rmtree, mock_unzip, mock_join):
+        """Test _get_permission_defined_source_path function."""
+        mock_is_dir.return_value = True
+        self.split_projs._get_permission_defined_source_path()
+        self.assertFalse(mock_is_file.called)
+        self.assertFalse(mock_join.called)
+        self.assertFalse(mock_rmtree.called)
+        self.assertFalse(mock_unzip.called)
+        mock_is_dir.return_value = False
+        self.split_projs._get_permission_defined_source_path()
+        self.assertTrue(mock_is_file.called)
+        self.assertTrue(mock_join.called)
+        self.assertFalse(mock_rmtree.called)
+        self.assertTrue(mock_unzip.called)
+
+    @mock.patch.object(common_util, 'unzip_file')
+    @mock.patch('shutil.rmtree')
+    @mock.patch('os.path.join')
+    @mock.patch('os.path.dirname')
+    @mock.patch('os.path.isdir')
+    def test_unzip_all_scrjars(
+        self, mock_is_dir, mock_dirname, mock_join, mock_rmtree, mock_unzip):
+        """Test _unzip_all_scrjars function."""
+        mock_is_dir.return_value = True
+        self.split_projs._unzip_all_scrjars()
+        self.assertFalse(mock_dirname.called)
+        self.assertFalse(mock_join.called)
+        self.assertFalse(mock_rmtree.called)
+        self.assertFalse(mock_unzip.called)
+
+
+if __name__ == '__main__':
+    unittest.main()
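test_get_exclude_content above drives get_exclude_content against the test data path. The following self-contained sketch shows the same behavior outside the aidegen tree; the template string mirrors _EXCLUDE_ITEM and the folder list is trimmed for illustration.

```python
# Standalone sketch of project_splitter.get_exclude_content: an
# <excludeFolder> entry is emitted only for folders that exist under the
# given root. The template matches _EXCLUDE_ITEM; the folder list is
# trimmed for illustration.
import os
import tempfile

_EXCLUDE_ITEM = '\n            <excludeFolder url="file://%s" />'
_EXCLUDE_FOLDERS = ['.idea', '.repo', 'out']


def get_exclude_content(root_path, excludes=None):
    """Returns exclude folder content strings for existing directories."""
    exclude_items = []
    for folder in excludes or _EXCLUDE_FOLDERS:
        folder_path = os.path.join(root_path, folder)
        if os.path.isdir(folder_path):
            exclude_items.append(_EXCLUDE_ITEM % folder_path)
    return exclude_items


if __name__ == '__main__':
    with tempfile.TemporaryDirectory() as root:
        os.makedirs(os.path.join(root, '.idea'))
        os.makedirs(os.path.join(root, 'out'))
        # '.repo' is absent, so only two entries are returned.
        print(get_exclude_content(root))
```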
diff --git a/aidegen/project/source_splitter.py b/aidegen/project/source_splitter.py
deleted file mode 100644
index 17ca12c..0000000
--- a/aidegen/project/source_splitter.py
+++ /dev/null
@@ -1,292 +0,0 @@
-#!/usr/bin/env python3
-#
-# Copyright 2020 - The Android Open Source Project
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-"""Separate the sources from multiple projects."""
-
-import os
-
-from aidegen import constant
-from aidegen.idea import iml
-from aidegen.lib import common_util
-from aidegen.lib import project_config
-
-_KEY_SOURCE_PATH = 'source_folder_path'
-_KEY_TEST_PATH = 'test_folder_path'
-_SOURCE_FOLDERS = [_KEY_SOURCE_PATH, _KEY_TEST_PATH]
-_KEY_SRCJAR_PATH = 'srcjar_path'
-_KEY_R_PATH = 'r_java_path'
-_KEY_JAR_PATH = 'jar_path'
-_EXCLUDE_ITEM = '\n            <excludeFolder url="file://%s" />'
-# Temporarily exclude test-dump and src_stub folders to prevent symbols from
-# resolving failure by incorrect reference. These two folders should be removed
-# after b/136982078 is resolved.
-_EXCLUDE_FOLDERS = ['.idea', '.repo', 'art', 'bionic', 'bootable', 'build',
-                    'dalvik', 'developers', 'device', 'hardware', 'kernel',
-                    'libnativehelper', 'pdk', 'prebuilts', 'sdk', 'system',
-                    'toolchain', 'tools', 'vendor', 'out',
-                    'art/tools/ahat/src/test-dump',
-                    'cts/common/device-side/device-info/src_stub']
-
-
-class ProjectSplitter:
-    """Splits the sources from multiple projects.
-
-    It's a specific solution to deal with the source folders in multiple
-    project case. Since the IntelliJ does not allow duplicate source folders,
-    AIDEGen needs to separate the source folders for each project. The single
-    project case has no different with current structure.
-
-    Usage:
-    project_splitter = ProjectSplitter(projects)
-
-    # Find the dependencies between the projects.
-    project_splitter.get_dependencies()
-
-    # Clear the source folders for each project.
-    project_splitter.revise_source_folders()
-
-    Attributes:
-        _projects: A list of ProjectInfo.
-        _all_srcs: A dictionary contains all sources of multiple projects.
-                   e.g.
-                   {
-                       'module_name': 'test',
-                       'path': ['path/to/module'],
-                       'srcs': ['src_folder1', 'src_folder2'],
-                       'tests': ['test_folder1', 'test_folder2']
-                       'jars': ['jar1.jar'],
-                       'srcjars': ['1.srcjar', '2.srcjar'],
-                       'dependencies': ['framework_srcjars', 'base'],
-                       'iml_name': '/abs/path/to/iml.iml'
-                   }
-        _framework_exist: A boolean, True if framework is one of the projects.
-        _framework_iml: A string, the name of the framework's iml.
-        _full_repo: A boolean, True if loading with full Android sources.
-        _full_repo_iml: A string, the name of the Android folder's iml.
-    """
-    def __init__(self, projects):
-        """ProjectSplitter initialize.
-
-        Args:
-            projects: A list of ProjectInfo object.
-        """
-        self._projects = projects
-        self._all_srcs = dict(projects[0].source_path)
-        self._framework_iml = None
-        self._framework_exist = any(
-            {p.project_relative_path == constant.FRAMEWORK_PATH
-             for p in self._projects})
-        if self._framework_exist:
-            self._framework_iml = iml.IMLGenerator.get_unique_iml_name(
-                os.path.join(common_util.get_android_root_dir(),
-                             constant.FRAMEWORK_PATH))
-        self._full_repo = project_config.ProjectConfig.get_instance().full_repo
-        if self._full_repo:
-            self._full_repo_iml = os.path.basename(
-                common_util.get_android_root_dir())
-
-    def revise_source_folders(self):
-        """Resets the source folders of each project.
-
-        There should be no duplicate source root path in IntelliJ. The issue
-        doesn't happen in single project case. Once users choose multiple
-        projects, there could be several same source paths of different
-        projects. In order to prevent that, we should remove the source paths
-        in dependencies.iml which are duplicate with the paths in [module].iml
-        files.
-
-        Steps to prevent the duplicate source root path in IntelliJ:
-        1. Copy all sources from sub-projects to main project.
-        2. Delete the source and test folders which are not under the
-           sub-projects.
-        3. Delete the sub-projects' source and test paths from the main project.
-        """
-        self._collect_all_srcs()
-        self._keep_local_sources()
-        self._remove_duplicate_sources()
-
-    def _collect_all_srcs(self):
-        """Copies all projects' sources to a dictionary."""
-        for project in self._projects[1:]:
-            for key, value in project.source_path.items():
-                self._all_srcs[key].update(value)
-
-    def _keep_local_sources(self):
-        """Removes source folders which are not under the project's path.
-
-        1. Remove the source folders which are not under the project.
-        2. Remove the duplicate project's source folders from the _all_srcs.
-        """
-        for project in self._projects:
-            srcs = project.source_path
-            relpath = project.project_relative_path
-            is_root = not relpath
-            for key in _SOURCE_FOLDERS:
-                srcs[key] = {s for s in srcs[key]
-                             if common_util.is_source_under_relative_path(
-                                 s, relpath) or is_root}
-                self._all_srcs[key] -= srcs[key]
-
-    def _remove_duplicate_sources(self):
-        """Removes the duplicate source folders from each sub project.
-
-        Priority processing with the longest path length, e.g.
-        frameworks/base/packages/SettingsLib must have priority over
-        frameworks/base.
-        """
-        for child in sorted(self._projects, key=lambda k: len(
-                k.project_relative_path), reverse=True):
-            for parent in self._projects:
-                is_root = not parent.project_relative_path
-                if parent is child:
-                    continue
-                if (common_util.is_source_under_relative_path(
-                        child.project_relative_path,
-                        parent.project_relative_path) or is_root):
-                    for key in _SOURCE_FOLDERS:
-                        parent.source_path[key] -= child.source_path[key]
-
-    def get_dependencies(self):
-        """Gets the dependencies between the projects.
-
-        Check if the current project's source folder exists in other projects.
-        If do, the current project is a dependency module to the other.
-        """
-        for project in sorted(self._projects, key=lambda k: len(
-                k.project_relative_path)):
-            proj_path = project.project_relative_path
-            project.dependencies = [constant.FRAMEWORK_SRCJARS]
-            if self._framework_exist and proj_path != constant.FRAMEWORK_PATH:
-                project.dependencies.append(self._framework_iml)
-            if self._full_repo and proj_path:
-                project.dependencies.append(self._full_repo_iml)
-            srcs = (project.source_path[_KEY_SOURCE_PATH]
-                    | project.source_path[_KEY_TEST_PATH])
-            for dep_proj in sorted(self._projects, key=lambda k: len(
-                    k.project_relative_path)):
-                dep_path = dep_proj.project_relative_path
-                is_root = not dep_path
-                is_child = common_util.is_source_under_relative_path(dep_path,
-                                                                     proj_path)
-                is_dep = any({s for s in srcs
-                              if common_util.is_source_under_relative_path(
-                                  s, dep_path) or is_root})
-                if dep_proj is project or is_child or not is_dep:
-                    continue
-                dep = iml.IMLGenerator.get_unique_iml_name(os.path.join(
-                    common_util.get_android_root_dir(), dep_path))
-                if dep not in project.dependencies:
-                    project.dependencies.append(dep)
-            project.dependencies.append(constant.KEY_DEPENDENCIES)
-
-    def gen_framework_srcjars_iml(self):
-        """Generates the framework-srcjars.iml.
-
-        Create the iml file with only the srcjars of module framework-all. These
-        srcjars will be separated from the modules under frameworks/base.
-
-        Returns:
-            A string of the framework_srcjars.iml's absolute path.
-        """
-        mod = dict(self._projects[0].dep_modules[constant.FRAMEWORK_ALL])
-        mod[constant.KEY_DEPENDENCIES] = []
-        mod[constant.KEY_IML_NAME] = constant.FRAMEWORK_SRCJARS
-        if self._framework_exist:
-            mod[constant.KEY_DEPENDENCIES].append(self._framework_iml)
-        if self._full_repo:
-            mod[constant.KEY_DEPENDENCIES].append(self._full_repo_iml)
-        mod[constant.KEY_DEPENDENCIES].append(constant.KEY_DEPENDENCIES)
-        framework_srcjars_iml = iml.IMLGenerator(mod)
-        framework_srcjars_iml.create({constant.KEY_SRCJARS: True,
-                                      constant.KEY_DEPENDENCIES: True})
-        self._all_srcs[_KEY_SRCJAR_PATH] -= set(mod[constant.KEY_SRCJARS])
-        return framework_srcjars_iml.iml_path
-
-    def _gen_dependencies_iml(self):
-        """Generates the dependencies.iml."""
-        mod = {
-            constant.KEY_SRCS: self._all_srcs[_KEY_SOURCE_PATH],
-            constant.KEY_TESTS: self._all_srcs[_KEY_TEST_PATH],
-            constant.KEY_JARS: self._all_srcs[_KEY_JAR_PATH],
-            constant.KEY_SRCJARS: (self._all_srcs[_KEY_R_PATH]
-                                   | self._all_srcs[_KEY_SRCJAR_PATH]),
-            constant.KEY_DEPENDENCIES: [constant.FRAMEWORK_SRCJARS],
-            constant.KEY_PATH: [self._projects[0].project_relative_path],
-            constant.KEY_MODULE_NAME: constant.KEY_DEPENDENCIES,
-            constant.KEY_IML_NAME: constant.KEY_DEPENDENCIES
-        }
-        if self._framework_exist:
-            mod[constant.KEY_DEPENDENCIES].append(self._framework_iml)
-        if self._full_repo:
-            mod[constant.KEY_DEPENDENCIES].append(self._full_repo_iml)
-        dep_iml = iml.IMLGenerator(mod)
-        dep_iml.create({constant.KEY_DEP_SRCS: True,
-                        constant.KEY_SRCJARS: True,
-                        constant.KEY_JARS: True,
-                        constant.KEY_DEPENDENCIES: True})
-
-    def gen_projects_iml(self):
-        """Generates the projects' iml file."""
-        root_path = common_util.get_android_root_dir()
-        excludes = project_config.ProjectConfig.get_instance().exclude_paths
-        for project in self._projects:
-            relpath = project.project_relative_path
-            exclude_folders = []
-            if not relpath:
-                exclude_folders.extend(get_exclude_content(root_path))
-            if excludes:
-                exclude_folders.extend(get_exclude_content(root_path, excludes))
-            mod_info = {
-                constant.KEY_EXCLUDES: ''.join(exclude_folders),
-                constant.KEY_SRCS: project.source_path[_KEY_SOURCE_PATH],
-                constant.KEY_TESTS: project.source_path[_KEY_TEST_PATH],
-                constant.KEY_DEPENDENCIES: project.dependencies,
-                constant.KEY_PATH: [relpath],
-                constant.KEY_MODULE_NAME: project.module_name,
-                constant.KEY_IML_NAME: iml.IMLGenerator.get_unique_iml_name(
-                    os.path.join(root_path, relpath))
-            }
-            dep_iml = iml.IMLGenerator(mod_info)
-            dep_iml.create({constant.KEY_SRCS: True,
-                            constant.KEY_DEPENDENCIES: True})
-            project.iml_path = dep_iml.iml_path
-        self._gen_dependencies_iml()
-
-
-def get_exclude_content(root_path, excludes=None):
-    """Get the exclude folder content list.
-
-    It returns the exclude folders content list.
-    e.g.
-    ['<excludeFolder url="file://a/.idea" />',
-    '<excludeFolder url="file://a/.repo" />']
-
-    Args:
-        root_path: Android source file path.
-        excludes: A list of exclusive directories, the default value is None but
-                  will be assigned to _EXCLUDE_FOLDERS.
-
-    Returns:
-        String: exclude folder content list.
-    """
-    exclude_items = []
-    if not excludes:
-        excludes = _EXCLUDE_FOLDERS
-    for folder in excludes:
-        folder_path = os.path.join(root_path, folder)
-        if os.path.isdir(folder_path):
-            exclude_items.append(_EXCLUDE_ITEM % folder_path)
-    return exclude_items
diff --git a/aidegen/project/source_splitter_unittest.py b/aidegen/project/source_splitter_unittest.py
deleted file mode 100644
index 9908ddb..0000000
--- a/aidegen/project/source_splitter_unittest.py
+++ /dev/null
@@ -1,254 +0,0 @@
-#!/usr/bin/env python3
-#
-# Copyright 2020 - The Android Open Source Project
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-"""Unittests for source_splitter."""
-
-import os
-import shutil
-import tempfile
-import unittest
-from unittest import mock
-
-from aidegen import unittest_constants
-from aidegen.idea import iml
-from aidegen.lib import common_util
-from aidegen.lib import project_config
-from aidegen.lib import project_info
-from aidegen.project import source_splitter
-
-
-# pylint: disable=protected-access
-class ProjectSplitterUnittest(unittest.TestCase):
-    """Unit tests for ProjectSplitter class."""
-
-    _TEST_DIR = None
-    _TEST_PATH = unittest_constants.TEST_DATA_PATH
-    _SAMPLE_EXCLUDE_FOLDERS = [
-        '\n            <excludeFolder url="file://%s/.idea" />' % _TEST_PATH,
-        '\n            <excludeFolder url="file://%s/out" />' % _TEST_PATH,
-    ]
-
-    def setUp(self):
-        """Prepare the testdata related data."""
-        projects = []
-        targets = ['a', 'b', 'c', 'framework']
-        ProjectSplitterUnittest._TEST_DIR = tempfile.mkdtemp()
-        for i, target in enumerate(targets):
-            with mock.patch.object(project_info, 'ProjectInfo') as proj_info:
-                projects.append(proj_info(target, i == 0))
-        projects[0].project_relative_path = 'src1'
-        projects[0].source_path = {
-            'source_folder_path': {'src1', 'src2', 'other1'},
-            'test_folder_path': {'src1/tests'},
-            'jar_path': {'jar1.jar'},
-            'jar_module_path': dict(),
-            'r_java_path': set(),
-            'srcjar_path': {'srcjar1.srcjar'}
-        }
-        projects[1].project_relative_path = 'src2'
-        projects[1].source_path = {
-            'source_folder_path': {'src2', 'src2/src3', 'src2/lib', 'other2'},
-            'test_folder_path': {'src2/tests'},
-            'jar_path': set(),
-            'jar_module_path': dict(),
-            'r_java_path': set(),
-            'srcjar_path': {'srcjar2.srcjar'}
-        }
-        projects[2].project_relative_path = 'src2/src3'
-        projects[2].source_path = {
-            'source_folder_path': {'src2/src3', 'src2/lib'},
-            'test_folder_path': {'src2/src3/tests'},
-            'jar_path': {'jar3.jar'},
-            'jar_module_path': dict(),
-            'r_java_path': set(),
-            'srcjar_path': {'srcjar3.srcjar'}
-        }
-        projects[3].project_relative_path = 'frameworks/base'
-        projects[3].source_path = {
-            'source_folder_path': set(),
-            'test_folder_path': set(),
-            'jar_path': set(),
-            'jar_module_path': dict(),
-            'r_java_path': set(),
-            'srcjar_path': {'framework.srcjar', 'other.srcjar'}
-        }
-        with mock.patch.object(project_config.ProjectConfig,
-                               'get_instance') as proj_cfg:
-            config = mock.Mock()
-            config.full_repo = False
-            proj_cfg.return_value = config
-            self.split_projs = source_splitter.ProjectSplitter(projects)
-
-    def tearDown(self):
-        """Clear the testdata related path."""
-        self.split_projs = None
-        shutil.rmtree(ProjectSplitterUnittest._TEST_DIR)
-        iml.IMLGenerator.USED_NAME_CACHE.clear()
-
-    @mock.patch.object(common_util, 'get_android_root_dir')
-    @mock.patch.object(project_config.ProjectConfig, 'get_instance')
-    @mock.patch('builtins.any')
-    def test_init(self, mock_any, mock_project, mock_root):
-        """Test initialize the attributes."""
-        self.assertEqual(len(self.split_projs._projects), 4)
-        mock_any.return_value = False
-        mock_root.return_value = ProjectSplitterUnittest._TEST_DIR
-        with mock.patch.object(project_info, 'ProjectInfo') as proj_info:
-            config = mock.Mock()
-            config.full_repo = False
-            mock_project.return_value = config
-            project = source_splitter.ProjectSplitter(proj_info(['a'], True))
-            self.assertFalse(project._framework_exist)
-            config.full_repo = True
-            project = source_splitter.ProjectSplitter(proj_info(['a'], True))
-            self.assertEqual(project._full_repo_iml,
-                             os.path.basename(
-                                 ProjectSplitterUnittest._TEST_DIR))
-
-    @mock.patch.object(source_splitter.ProjectSplitter,
-                       '_remove_duplicate_sources')
-    @mock.patch.object(source_splitter.ProjectSplitter,
-                       '_keep_local_sources')
-    @mock.patch.object(source_splitter.ProjectSplitter,
-                       '_collect_all_srcs')
-    def test_revise_source_folders(self, mock_copy_srcs, mock_keep_srcs,
-                                   mock_remove_srcs):
-        """Test revise_source_folders."""
-        self.split_projs.revise_source_folders()
-        self.assertTrue(mock_copy_srcs.called)
-        self.assertTrue(mock_keep_srcs.called)
-        self.assertTrue(mock_remove_srcs.called)
-
-    def test_collect_all_srcs(self):
-        """Test _collect_all_srcs."""
-        self.split_projs._collect_all_srcs()
-        sources = self.split_projs._all_srcs
-        expected_srcs = {'src1', 'src2', 'src2/src3', 'src2/lib', 'other1',
-                         'other2'}
-        self.assertEqual(sources['source_folder_path'], expected_srcs)
-        expected_tests = {'src1/tests', 'src2/tests', 'src2/src3/tests'}
-        self.assertEqual(sources['test_folder_path'], expected_tests)
-
-    def test_keep_local_sources(self):
-        """Test _keep_local_sources."""
-        self.split_projs._collect_all_srcs()
-        self.split_projs._keep_local_sources()
-        srcs1 = self.split_projs._projects[0].source_path
-        srcs2 = self.split_projs._projects[1].source_path
-        srcs3 = self.split_projs._projects[2].source_path
-        all_srcs = self.split_projs._all_srcs
-        expected_srcs1 = {'src1'}
-        expected_srcs2 = {'src2', 'src2/src3', 'src2/lib'}
-        expected_srcs3 = {'src2/src3'}
-        expected_all_srcs = {'other1', 'other2'}
-        expected_all_tests = set()
-        self.assertEqual(srcs1['source_folder_path'], expected_srcs1)
-        self.assertEqual(srcs2['source_folder_path'], expected_srcs2)
-        self.assertEqual(srcs3['source_folder_path'], expected_srcs3)
-        self.assertEqual(all_srcs['source_folder_path'], expected_all_srcs)
-        self.assertEqual(all_srcs['test_folder_path'], expected_all_tests)
-
-    def test_remove_duplicate_sources(self):
-        """Test _remove_duplicate_sources."""
-        self.split_projs._collect_all_srcs()
-        self.split_projs._keep_local_sources()
-        self.split_projs._remove_duplicate_sources()
-        srcs2 = self.split_projs._projects[1].source_path
-        srcs3 = self.split_projs._projects[2].source_path
-        expected_srcs2 = {'src2', 'src2/lib'}
-        expected_srcs3 = {'src2/src3'}
-        self.assertEqual(srcs2['source_folder_path'], expected_srcs2)
-        self.assertEqual(srcs3['source_folder_path'], expected_srcs3)
-
-    def test_get_dependencies(self):
-        """Test get_dependencies."""
-        iml.IMLGenerator.USED_NAME_CACHE.clear()
-        self.split_projs.get_dependencies()
-        dep1 = ['framework_srcjars', 'base', 'src2', 'dependencies']
-        dep2 = ['framework_srcjars', 'base', 'dependencies']
-        dep3 = ['framework_srcjars', 'base', 'src2', 'dependencies']
-        self.assertEqual(self.split_projs._projects[0].dependencies, dep1)
-        self.assertEqual(self.split_projs._projects[1].dependencies, dep2)
-        self.assertEqual(self.split_projs._projects[2].dependencies, dep3)
-
-    @mock.patch.object(common_util, 'get_android_root_dir')
-    def test_gen_framework_srcjars_iml(self, mock_root):
-        """Test gen_framework_srcjars_iml."""
-        mock_root.return_value = self._TEST_DIR
-        self.split_projs._projects[0].dep_modules = {
-            'framework-all': {
-                'module_name': 'framework-all',
-                'path': ['frameworks/base'],
-                'srcjars': ['framework.srcjar'],
-                'iml_name': 'framework_srcjars'
-            }
-        }
-        self.split_projs._framework_exist = False
-        self.split_projs.gen_framework_srcjars_iml()
-        expected_srcjars = [
-            'other.srcjar',
-            'srcjar1.srcjar',
-            'srcjar2.srcjar',
-            'srcjar3.srcjar',
-        ]
-        expected_path = os.path.join(self._TEST_DIR,
-                                     'frameworks/base/framework_srcjars.iml')
-        self.split_projs._framework_exist = True
-        self.split_projs.revise_source_folders()
-        iml_path = self.split_projs.gen_framework_srcjars_iml()
-        srcjars = self.split_projs._all_srcs['srcjar_path']
-        self.assertEqual(sorted(list(srcjars)), expected_srcjars)
-        self.assertEqual(iml_path, expected_path)
-
-    @mock.patch.object(iml.IMLGenerator, 'create')
-    @mock.patch.object(common_util, 'get_android_root_dir')
-    def test_gen_dependencies_iml(self, mock_root, mock_create_iml):
-        """Test _gen_dependencies_iml."""
-        mock_root.return_value = self._TEST_DIR
-        self.split_projs.revise_source_folders()
-        self.split_projs._framework_exist = False
-        self.split_projs._gen_dependencies_iml()
-        self.split_projs._framework_exist = True
-        self.split_projs._gen_dependencies_iml()
-        self.assertTrue(mock_create_iml.called)
-
-    @mock.patch.object(source_splitter, 'get_exclude_content')
-    @mock.patch.object(project_config.ProjectConfig, 'get_instance')
-    @mock.patch.object(iml.IMLGenerator, 'create')
-    @mock.patch.object(common_util, 'get_android_root_dir')
-    def test_gen_projects_iml(self, mock_root, mock_create_iml, mock_project,
-                              mock_get_excludes):
-        """Test gen_projects_iml."""
-        mock_root.return_value = self._TEST_DIR
-        config = mock.Mock()
-        mock_project.return_value = config
-        config.exclude_paths = []
-        self.split_projs.revise_source_folders()
-        self.split_projs.gen_projects_iml()
-        self.assertTrue(mock_create_iml.called)
-        self.assertFalse(mock_get_excludes.called)
-        config.exclude_paths = ['a']
-        self.split_projs.gen_projects_iml()
-        self.assertTrue(mock_get_excludes.called)
-
-    def test_get_exclude_content(self):
-        """Test get_exclude_content."""
-        exclude_folders = source_splitter.get_exclude_content(self._TEST_PATH)
-        self.assertEqual(self._SAMPLE_EXCLUDE_FOLDERS, exclude_folders)
-
-
-if __name__ == '__main__':
-    unittest.main()
diff --git a/aidegen/run_tests.sh b/aidegen/run_tests.sh
index 3ac3500..fb52235 100755
--- a/aidegen/run_tests.sh
+++ b/aidegen/run_tests.sh
@@ -19,14 +19,13 @@
 [ "$(uname -s)" == "Darwin" ] && { realpath(){ echo "$(cd $(dirname $1);pwd -P)/$(basename $1)"; }; }
 AIDEGEN_DIR=$(dirname $(realpath $0))
 ASUITE_DIR="$(dirname $AIDEGEN_DIR)"
-CORE_DIR="$(dirname $ASUITE_DIR)/tradefederation/core"
-ATEST_DIR="$CORE_DIR/atest"
+ATEST_DIR="$ASUITE_DIR/atest"
 RC_FILE=${AIDEGEN_DIR}/.coveragerc
 MOD_COVERAGE='coverage:import coverage'
 MOD_PROTOBUF='protobuf:from google import protobuf'
 
 function get_python_path() {
-    echo "$PYTHONPATH:$CORE_DIR:$ATEST_DIR:$ASUITE_DIR"
+    echo "$PYTHONPATH:$ASUITE_DIR:$ATEST_DIR"
 }
 
 function print_summary() {
@@ -48,7 +47,7 @@
     local specified_tests=$@
     local rc=0
 
-    # Get all unit tests under tools/acloud.
+    # Get all unit tests under asuite/aidegen.
     local all_tests=$(find $AIDEGEN_DIR -type f -name "*_unittest.py");
     local tests_to_run=$all_tests
 
@@ -56,8 +55,8 @@
     for t in $tests_to_run; do
         echo "Testing" $t
         if ! PYTHONPATH=$(get_python_path) python3 -m coverage run --append --rcfile=$RC_FILE $t; then
-            rc=1
-            echo -e "${RED}$t failed${NC}"
+           rc=1
+           echo -e "${RED}$t failed${NC}"
         fi
     done
 
diff --git a/aidegen/sdk/jdk_table.py b/aidegen/sdk/jdk_table.py
index 84a7102..21de99e 100644
--- a/aidegen/sdk/jdk_table.py
+++ b/aidegen/sdk/jdk_table.py
@@ -272,5 +272,8 @@
         self._generate_jdk_config_string()
         self._generate_sdk_config_string()
         if self._modify_config:
+            if not os.path.exists(self._config_file):
+                common_util.file_generate(
+                    self._config_file, templates.JDK_TABLE_XML)
             self._xml.write(self._config_file)
         return bool(self._android_sdk_version)
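The new guard above seeds jdk.table.xml from the packaged template before ElementTree writes into it. A minimal sketch of that pattern, with _TEMPLATE_XML and write_config as illustrative stand-ins for templates.JDK_TABLE_XML and the real JDKTableXML flow:

```python
# Illustrative sketch of the guard added above; _TEMPLATE_XML and
# write_config stand in for templates.JDK_TABLE_XML and the real
# JDKTableXML flow.
import os
import tempfile
import xml.etree.ElementTree as ElementTree

_TEMPLATE_XML = ('<application>\n'
                 '  <component name="ProjectJdkTable" />\n'
                 '</application>\n')


def write_config(tree, config_file):
    """Seeds the config file from the template if missing, then writes it."""
    if not os.path.exists(config_file):
        os.makedirs(os.path.dirname(config_file), exist_ok=True)
        with open(config_file, 'w') as config:
            config.write(_TEMPLATE_XML)
    tree.write(config_file)


if __name__ == '__main__':
    demo_file = os.path.join(tempfile.mkdtemp(), 'jdk.table.xml')
    demo_tree = ElementTree.ElementTree(ElementTree.fromstring(_TEMPLATE_XML))
    write_config(demo_tree, demo_file)
    print('wrote', demo_file)
```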
diff --git a/aidegen/sdk/jdk_table_unittest.py b/aidegen/sdk/jdk_table_unittest.py
index 7af9179..b38af61 100644
--- a/aidegen/sdk/jdk_table_unittest.py
+++ b/aidegen/sdk/jdk_table_unittest.py
@@ -32,6 +32,7 @@
 
 
 # pylint: disable=protected-access
+# pylint: disable=too-many-arguments
 class JDKTableXMLUnittests(unittest.TestCase):
     """Unit tests for JDKTableXML class."""
 
@@ -142,13 +143,16 @@
 
     @mock.patch.object(jdk_table.JDKTableXML, '_override_xml')
     @mock.patch.object(ElementTree.ElementTree, 'write')
-    @mock.patch.object(jdk_table.JDKTableXML, '_generate_jdk_config_string')
+    @mock.patch('os.path.exists')
     @mock.patch.object(jdk_table.JDKTableXML, '_generate_sdk_config_string')
+    @mock.patch.object(jdk_table.JDKTableXML, '_generate_jdk_config_string')
     @mock.patch.object(jdk_table.JDKTableXML, '_check_structure')
     def test_config_jdk_table_xml(self, mock_check_structure, mock_gen_jdk,
-                                  mock_gen_sdk, mock_xml_write, mock_override):
+                                  mock_gen_sdk, mock_exist, mock_xml_write,
+                                  mock_override):
         """Test config_jdk_table_xml."""
         mock_check_structure.return_value = True
+        mock_exist.return_value = True
         self.jdk_table_xml.config_jdk_table_xml()
         self.assertTrue(mock_gen_jdk.called)
         self.assertTrue(mock_gen_sdk.called)
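The parameter reshuffle above follows unittest.mock's rule that stacked @mock.patch decorators inject mocks bottom-up: the decorator closest to the test maps to the first mock argument. A self-contained illustration:

```python
# Stacked @mock.patch decorators inject mocks bottom-up: the decorator
# closest to the test method maps to the first mock parameter after self.
import unittest
from unittest import mock


class OrderingExample(unittest.TestCase):
    """Shows the decorator-to-parameter order used in the tests above."""

    @mock.patch('os.path.exists')   # third mock argument: mock_exists
    @mock.patch('os.path.isdir')    # second mock argument: mock_isdir
    @mock.patch('os.path.isfile')   # first mock argument: mock_isfile
    def test_order(self, mock_isfile, mock_isdir, mock_exists):
        mock_isfile.return_value = True
        mock_isdir.return_value = False
        mock_exists.return_value = True
        self.assertTrue(mock_isfile('x'))
        self.assertFalse(mock_isdir('x'))
        self.assertTrue(mock_exists('x'))


if __name__ == '__main__':
    unittest.main()
```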
diff --git a/aidegen/templates.py b/aidegen/templates.py
index d7e2657..678c5bb 100644
--- a/aidegen/templates.py
+++ b/aidegen/templates.py
@@ -101,6 +101,9 @@
             </list>
         </option>
     </component>
+    <component name="FrameworkDetectionExcludesConfiguration">
+        <type id="android" />
+    </component>
     <component name="ContinuousBuildConfigurationComponent">
         <builds>
             <build intervalToCheckBuild="1" buildKey="" buildLabel=""
@@ -407,9 +410,7 @@
 <module type="JAVA_MODULE" version="4">
   <component name="FacetManager">
     <facet type="android" name="Android">
-      <configuration>
-        <proGuardCfgFiles />
-      </configuration>
+      <configuration />
     </facet>
   </component>
   <component name="NewModuleRootManager" inherit-compiler-output="true">
diff --git a/aidegen/test_data/packages/apps/test/src/java.java b/aidegen/test_data/packages/apps/test/src/java.java
index 9c57962..077ca3b 100644
--- a/aidegen/test_data/packages/apps/test/src/java.java
+++ b/aidegen/test_data/packages/apps/test/src/java.java
@@ -16,7 +16,7 @@
 
 package tests.packages;
 
-/** Dummy Class file for unit tests. */
+/** Unused Class file for unit tests. */
 public class SomeClassForTesting {
-    private static final String SOME_DUMMY_VAR = "For testing purposes";
+    private static final String SOME_UNUSED_VAR = "For testing purposes";
 }
diff --git a/aidegen/test_data/packages/apps/test/src/main/java/com/android/java.java b/aidegen/test_data/packages/apps/test/src/main/java/com/android/java.java
index aff8b64..57a1e45 100644
--- a/aidegen/test_data/packages/apps/test/src/main/java/com/android/java.java
+++ b/aidegen/test_data/packages/apps/test/src/main/java/com/android/java.java
@@ -16,7 +16,7 @@
 
 package com.android;
 
-/** Dummy Class file for unit tests. */
+/** Unused Class file for unit tests. */
 public class SomeClassForTesting {
-    private static final String SOME_DUMMY_VAR = "For testing purposes";
+    private static final String SOME_UNUSED_VAR = "For testing purposes";
 }
diff --git a/aidegen/test_data/packages/apps/test/src/main/java/com/android/no_package.java b/aidegen/test_data/packages/apps/test/src/main/java/com/android/no_package.java
index aaa822c..0359828 100644
--- a/aidegen/test_data/packages/apps/test/src/main/java/com/android/no_package.java
+++ b/aidegen/test_data/packages/apps/test/src/main/java/com/android/no_package.java
@@ -14,7 +14,7 @@
  * limitations under the License.
  */
 
-/** Dummy Class file for unit tests. */
+/** Unused Class file for unit tests. */
 public class SomeClassForTesting {
-    private static final String SOME_DUMMY_VAR = "For testing purposes";
+    private static final String SOME_UNUSED_VAR = "For testing purposes";
 }
diff --git a/aidegen/test_data/packages/apps/test/src/tests.packages/java.java b/aidegen/test_data/packages/apps/test/src/tests.packages/java.java
index 9c57962..077ca3b 100644
--- a/aidegen/test_data/packages/apps/test/src/tests.packages/java.java
+++ b/aidegen/test_data/packages/apps/test/src/tests.packages/java.java
@@ -16,7 +16,7 @@
 
 package tests.packages;
 
-/** Dummy Class file for unit tests. */
+/** Unused Class file for unit tests. */
 public class SomeClassForTesting {
-    private static final String SOME_DUMMY_VAR = "For testing purposes";
+    private static final String SOME_UNUSED_VAR = "For testing purposes";
 }
diff --git a/aidegen/test_data/packages/apps/test/src/tests.packages/test/java.java b/aidegen/test_data/packages/apps/test/src/tests.packages/test/java.java
index 3b73d9e..446b067 100644
--- a/aidegen/test_data/packages/apps/test/src/tests.packages/test/java.java
+++ b/aidegen/test_data/packages/apps/test/src/tests.packages/test/java.java
@@ -16,7 +16,7 @@
 
 package tests.packages.test;
 
-/** Dummy Class file for unit tests. */
+/** Unused Class file for unit tests. */
 public class SomeClassForTesting {
-    private static final String SOME_DUMMY_VAR = "For testing purposes";
+    private static final String SOME_UNUSED_VAR = "For testing purposes";
 }
diff --git a/aidegen/test_data/packages/apps/test/tests/com/android/test.java b/aidegen/test_data/packages/apps/test/tests/com/android/test.java
index aff8b64..57a1e45 100644
--- a/aidegen/test_data/packages/apps/test/tests/com/android/test.java
+++ b/aidegen/test_data/packages/apps/test/tests/com/android/test.java
@@ -16,7 +16,7 @@
 
 package com.android;
 
-/** Dummy Class file for unit tests. */
+/** Unused Class file for unit tests. */
 public class SomeClassForTesting {
-    private static final String SOME_DUMMY_VAR = "For testing purposes";
+    private static final String SOME_UNUSED_VAR = "For testing purposes";
 }
diff --git a/aidegen_functional_test/Android.bp b/aidegen_functional_test/Android.bp
index 9147178..4b7c05c 100644
--- a/aidegen_functional_test/Android.bp
+++ b/aidegen_functional_test/Android.bp
@@ -12,6 +12,10 @@
 // See the License for the specific language governing permissions and
 // limitations under the License.
 
+package {
+    default_applicable_licenses: ["Android-Apache-2.0"],
+}
+
 python_defaults {
     name: "aidegen_functional_test_default",
     pkg_path: "aidegen_functional_test",
diff --git a/aidegen_functional_test/aidegen_functional_test_main.py b/aidegen_functional_test/aidegen_functional_test_main.py
index 8615943..62b2b0e 100644
--- a/aidegen_functional_test/aidegen_functional_test_main.py
+++ b/aidegen_functional_test/aidegen_functional_test_main.py
@@ -17,6 +17,7 @@
 """Functional test for aidegen project files."""
 
 from __future__ import absolute_import
+from __future__ import print_function
 
 import argparse
 import functools
@@ -32,6 +33,7 @@
 from aidegen import aidegen_main
 from aidegen import constant
 from aidegen.lib import clion_project_file_gen
+# pylint: disable=no-name-in-module
 from aidegen.lib import common_util
 from aidegen.lib import errors
 from aidegen.lib import module_info_util
@@ -47,6 +49,7 @@
 _VERIFY_COMMANDS_JSON = os.path.join(_TEST_DATA_PATH, 'verify_commands.json')
 _GOLDEN_SAMPLES_JSON = os.path.join(_TEST_DATA_PATH, 'golden_samples.json')
 _VERIFY_BINARY_JSON = os.path.join(_TEST_DATA_PATH, 'verify_binary_upload.json')
+_VERIFY_PRESUBMIT_JSON = os.path.join(_TEST_DATA_PATH, 'verify_presubmit.json')
 _ANDROID_COMMON = 'android_common'
 _LINUX_GLIBC_COMMON = 'linux_glibc_common'
 _SRCS = 'srcs'
@@ -114,6 +117,12 @@
         '--remove_bp_json',
         action='store_true',
         help='Remove module_bp_java_deps.json for each use case test.')
+    parser.add_argument(
+        '-m',
+        '--make_clean',
+        action='store_true',
+        help=('Make clean before testing to create a clean environment; it '
+              'runs only once per invocation when specified.'))
     group.add_argument(
         '-u',
         '--use_cases',
@@ -127,6 +136,12 @@
         help=('Verify aidegen\'s use cases by executing different aidegen '
               'commands.'))
     group.add_argument(
+        '-p',
+        action='store_true',
+        dest='binary_presubmit_verified',
+        help=('Verify the aidegen tool in the presubmit test by executing '
+              'different aidegen commands.'))
+    group.add_argument(
         '-a',
         '--test-all',
         action='store_true',
@@ -198,6 +213,7 @@
         dep_name: a string of the merged project and dependencies file's name,
                   e.g., frameworks-dependencies.iml.
     """
+    # pylint: disable=maybe-no-member
     code_name = project_file_gen.ProjectFileGenerator.get_unique_iml_name(
         abs_path)
     file_name = ''.join([code_name, '.iml'])
@@ -366,6 +382,7 @@
 
     Args:
         test_list: a list of module name and module path.
+
     Returns:
         data: a dictionary contains dependent files' data of project file's
               contents.
@@ -384,7 +401,6 @@
                 ]
             }
     """
-    _make_clean()
     data = {}
     spec_and_cur_commit_id_dict = _checkout_baseline_code_to_spec_commit_id()
     for target in test_list:
@@ -410,6 +426,7 @@
     with open(_GOLDEN_SAMPLES_JSON, 'r') as infile:
         try:
             data_sample = json.load(infile)
+        # pylint: disable=maybe-no-member
         except json.JSONDecodeError as err:
             print("Json decode error: {}".format(err))
             data_sample = {}
@@ -559,7 +576,8 @@
 # pylint: disable=eval-used
 @common_util.back_to_cwd
 @common_util.time_logged
-def _verify_aidegen(verified_file_path, forced_remove_bp_json):
+def _verify_aidegen(verified_file_path, forced_remove_bp_json,
+                    is_presubmit=False):
     """Verify various use cases of executing aidegen.
 
     There are two types of running commands:
@@ -596,9 +614,9 @@
         raise errors.JsonFileNotExistError(
             '%s does not exist, error: %s.' % (verified_file_path, err))
 
-    _make_clean()
+    if not is_presubmit:
+        _compare_sample_native_content()
 
-    _compare_sample_native_content()
     os.chdir(common_util.get_android_root_dir())
     for use_case in data:
         print('Use case "{}" is running.'.format(use_case))
@@ -677,7 +695,6 @@
         becomes
         prebuilts/gcc/linux-x86/x86/x86_64-linux-android-4.9 # in AIDEGen
     """
-    env_off = {'SOONG_COLLECT_JAVA_DEPS': 'false'}
     target_arch_variant = 'x86_64'
     env_on = {
         'TARGET_PRODUCT': 'aosp_x86_64',
@@ -685,14 +702,13 @@
         'TARGET_ARCH_VARIANT': target_arch_variant,
         'SOONG_COLLECT_JAVA_DEPS': 'true',
         'SOONG_GEN_CMAKEFILES': '1',
-        'SOONG_GEN_CMAKEFILES_DEBUG': '0',
         'SOONG_COLLECT_CC_DEPS': '1'
     }
 
     try:
         project_config.ProjectConfig(
             aidegen_main._parse_args(['-n', '-v'])).init_environment()
-        module_info_util.generate_merged_module_info(env_off, env_on)
+        module_info_util.generate_merged_module_info(env_on)
         cc_path = os.path.join(common_util.get_soong_out_path(),
                                constant.BLUEPRINT_CC_JSONFILE_NAME)
         mod_name = 'libui'
@@ -742,12 +758,18 @@
     args = _parse_args(argv)
     common_util.configure_logging(args.verbose)
     os.environ[constant.AIDEGEN_TEST_MODE] = 'true'
+
+    if args.make_clean:
+        _make_clean()
+
     if args.create_sample:
         _create_some_sample_json_file(args.targets)
     elif args.use_cases_verified:
         _verify_aidegen(_VERIFY_COMMANDS_JSON, args.remove_bp_json)
     elif args.binary_upload_verified:
         _verify_aidegen(_VERIFY_BINARY_JSON, args.remove_bp_json)
+    elif args.binary_presubmit_verified:
+        _verify_aidegen(_VERIFY_PRESUBMIT_JSON, args.remove_bp_json, True)
     elif args.test_all_samples:
         _test_all_samples_iml()
     elif args.compare_sample_native:
@@ -757,6 +779,7 @@
             _test_some_sample_iml()
         else:
             _test_some_sample_iml(args.targets)
+
     del os.environ[constant.AIDEGEN_TEST_MODE]
 
 
diff --git a/aidegen_functional_test/test_data/golden_samples.json b/aidegen_functional_test/test_data/golden_samples.json
index 883c359..3b67a2a 100644
--- a/aidegen_functional_test/test_data/golden_samples.json
+++ b/aidegen_functional_test/test_data/golden_samples.json
@@ -250,7 +250,6 @@
             "file://$PROJECT_DIR$/frameworks/base/tests/ActivityManagerPerfTests/tests/src",
             "file://$PROJECT_DIR$/frameworks/base/tests/ActivityManagerPerfTests/utils/src",
             "file://$PROJECT_DIR$/frameworks/base/tests/ActivityTests/src",
-            "file://$PROJECT_DIR$/frameworks/base/tests/ActivityViewTest/src",
             "file://$PROJECT_DIR$/frameworks/base/tests/AmSlam/src",
             "file://$PROJECT_DIR$/frameworks/base/tests/AppLaunch/src",
             "file://$PROJECT_DIR$/frameworks/base/tests/AppLaunchWear/src",
@@ -356,7 +355,7 @@
             "file://$PROJECT_DIR$/frameworks/base/tests/permission/src",
             "file://$PROJECT_DIR$/frameworks/base/tests/testables/src",
             "file://$PROJECT_DIR$/frameworks/base/tests/testables/tests/src",
-            "file://$PROJECT_DIR$/frameworks/base/tests/utils/DummyIME/src",
+            "file://$PROJECT_DIR$/frameworks/base/tests/utils/StubIME/src",
             "file://$PROJECT_DIR$/frameworks/base/tests/utils/testutils/java",
             "file://$PROJECT_DIR$/frameworks/base/tools/aapt2/integration-tests/StaticLibTest/App/src",
             "file://$PROJECT_DIR$/frameworks/base/tools/aapt2/integration-tests/StaticLibTest/LibOne/src",
@@ -670,4 +669,4 @@
             "jar://$PROJECT_DIR$/prebuilts/jdk/jdk8/linux-x86/jre/lib/rt.jar!/"
         ]
     }
-}
\ No newline at end of file
+}
diff --git a/aidegen_functional_test/test_data/verify_presubmit.json b/aidegen_functional_test/test_data/verify_presubmit.json
new file mode 100644
index 0000000..7df3455
--- /dev/null
+++ b/aidegen_functional_test/test_data/verify_presubmit.json
@@ -0,0 +1,4 @@
+{
+    "test whole android tree with frameworks/base -a": ["aidegen frameworks/base -a -n -s"],
+    "test help": ["aidegen -h"]
+}
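
For reference, a minimal sketch of how a verification JSON in this shape (a use-case name mapped to a list of aidegen command lines) can be consumed. The loader below is illustrative only and is not part of this change:

    import json
    import subprocess

    def run_use_cases(json_path):
        """Run every command list recorded in a verification JSON."""
        with open(json_path, 'r') as infile:
            use_cases = json.load(infile)
        for name, commands in use_cases.items():
            print('Use case "{}" is running.'.format(name))
            for command in commands:
                # shell=True so the recorded aidegen command line runs as-is.
                subprocess.check_call(command, shell=True)

    # Example: run_use_cases('verify_presubmit.json')
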
diff --git a/asuite_plugin/build.sh b/asuite_plugin/build.sh
new file mode 100755
index 0000000..8187595
--- /dev/null
+++ b/asuite_plugin/build.sh
@@ -0,0 +1 @@
+./gradlew build
diff --git a/asuite_plugin/prebuilt/asuite_plugin-1.0.jar b/asuite_plugin/prebuilt/asuite_plugin-1.0.jar
deleted file mode 100644
index a9d588c..0000000
--- a/asuite_plugin/prebuilt/asuite_plugin-1.0.jar
+++ /dev/null
Binary files differ
diff --git a/asuite_run_unittests.py b/asuite_run_unittests.py
index 8e885da..4f66b4d 100755
--- a/asuite_run_unittests.py
+++ b/asuite_run_unittests.py
@@ -27,16 +27,16 @@
 import subprocess
 import sys
 
-
+ASUITE_HOME = os.path.dirname(os.path.realpath(__file__))
+ASUITE_PLUGIN_PATH = os.path.join(ASUITE_HOME, "asuite_plugin")
+ATEST_CMD = os.path.join(ASUITE_HOME, "atest", "atest_run_unittests.py")
+ATEST2_CMD = os.path.join(ASUITE_HOME, "atest-py2", "atest_run_unittests.py")
+AIDEGEN_CMD = "atest aidegen_unittests --host"
+PLUGIN_LIB_CMD = "atest plugin_lib_unittests --host"
+GRADLE_TEST = "/gradlew test"
+# Definition of exit codes.
 EXIT_ALL_CLEAN = 0
 EXIT_TEST_FAIL = 1
-ASUITE_PLUGIN_PATH = "tools/asuite/asuite_plugin"
-# TODO: remove echo when atest migration has done.
-ATEST_CMD = "echo {}/tools/asuite/atest/atest_run_unittests.py".format(
-    os.getenv('ANDROID_BUILD_TOP'))
-AIDEGEN_CMD = "atest aidegen_unittests --host"
-GRADLE_TEST = "/gradlew test"
-
 
 def run_unittests(files):
     """Parse modified files and tell if they belong to aidegen, atest or both.
@@ -50,14 +50,16 @@
     cmd_dict = {}
     for f in files:
         if 'atest' in f:
-            cmd_dict.update({ATEST_CMD : None})
+            cmd_dict.update({ATEST_CMD: None})
+        if 'atest-py2' in f:
+            cmd_dict.update({ATEST2_CMD: None})
         if 'aidegen' in f:
-            cmd_dict.update({AIDEGEN_CMD : None})
+            cmd_dict.update({AIDEGEN_CMD: None})
+        if 'plugin_lib' in f:
+            cmd_dict.update({PLUGIN_LIB_CMD: None})
         if 'asuite_plugin' in f:
-            full_path = os.path.join(
-                os.getenv('ANDROID_BUILD_TOP'), ASUITE_PLUGIN_PATH)
-            cmd = full_path + GRADLE_TEST
-            cmd_dict.update({cmd : full_path})
+            cmd = ASUITE_PLUGIN_PATH + GRADLE_TEST
+            cmd_dict.update({cmd : ASUITE_PLUGIN_PATH})
     try:
         for cmd, path in cmd_dict.items():
             subprocess.check_call(shlex.split(cmd), cwd=path)
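
As a usage illustration of the dispatch above, a standalone sketch that maps substrings of changed file paths to unit-test commands and runs each matching command once. The commands mirror the constants in the hunk; the helper name is illustrative:

    import shlex
    import subprocess

    # Substring of a changed path -> (command, working directory or None).
    CMD_MAP = {
        'aidegen': ('atest aidegen_unittests --host', None),
        'plugin_lib': ('atest plugin_lib_unittests --host', None),
    }

    def run_matching_unittests(changed_files):
        """Collect and run the unit-test command for every matching file."""
        cmds = {}
        for path in changed_files:
            for key, (cmd, cwd) in CMD_MAP.items():
                if key in path:
                    cmds[cmd] = cwd
        for cmd, cwd in cmds.items():
            subprocess.check_call(shlex.split(cmd), cwd=cwd)

    # Example (assumes atest is on PATH):
    # run_matching_unittests(['aidegen/lib/common_util.py'])
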
diff --git a/atest-py2/Android.bp b/atest-py2/Android.bp
index 5136051..08b86ad 100644
--- a/atest-py2/Android.bp
+++ b/atest-py2/Android.bp
@@ -12,6 +12,10 @@
 // See the License for the specific language governing permissions and
 // limitations under the License.
 
+package {
+    default_applicable_licenses: ["Android-Apache-2.0"],
+}
+
 python_binary_host {
     name: "atest-py2",
     main: "atest.py",
@@ -30,7 +34,6 @@
     ],
     data: [
         "tools/updatedb_darwin.sh",
-        ":asuite_version",
     ],
     // Make atest's built name to atest-py2-dev
     stem: "atest-py2-dev",
@@ -40,32 +43,30 @@
     },
 }
 
-// Exclude atest_updatedb_unittest due to it's a test for ATest's wrapper of updatedb, but there's
-// no updatedb binary on test server.
-python_test_host {
-    name: "atest-py2_unittests",
-    main: "atest_run_unittests.py",
-    pkg_path: "atest",
-    srcs: [
-        "**/*.py",
-    ],
-    data: [
-        "tools/updatedb_darwin.sh",
-        "unittest_data/**/*",
-        "unittest_data/**/.*",
-    ],
-    exclude_srcs: [
-        "asuite_lib_test/*.py",
-        "proto/*_pb2.py",
-        "proto/__init__.py",
-    ],
-    libs: [
-        "py-mock",
-        "atest_py2_proto",
-    ],
-    test_config: "atest_unittests.xml",
-    defaults: ["atest_py2_default"],
-}
+//python_test_host {
+//    name: "atest-py2_unittests",
+//    main: "atest_run_unittests.py",
+//    pkg_path: "atest",
+//    srcs: [
+//        "**/*.py",
+//    ],
+//    data: [
+//        "tools/updatedb_darwin.sh",
+//        "unittest_data/**/*",
+//        "unittest_data/**/.*",
+//    ],
+//    exclude_srcs: [
+//        "asuite_lib_test/*.py",
+//        "proto/*_pb2.py",
+//        "proto/__init__.py",
+//    ],
+//    libs: [
+//        "py-mock",
+//        "atest_py2_proto",
+//    ],
+//    test_config: "atest_unittests.xml",
+//    defaults: ["atest_py2_default"],
+//}
 
 python_library_host {
     name: "atest_py2_proto",
diff --git a/atest-py2/TEST_MAPPING b/atest-py2/TEST_MAPPING
deleted file mode 100644
index 6cbf5e7..0000000
--- a/atest-py2/TEST_MAPPING
+++ /dev/null
@@ -1,31 +0,0 @@
-// Below lists the TEST_MAPPING tests to do ASuite unittests to make sure
-// the expectation of ASuite are still good.
-{
-  "presubmit": [
-//    {
-//      // Host side ATest unittests.
-//      "name": "atest_unittests",
-//      "host": true
-//    },
-    {
-      // Host side metrics tests.
-      "name": "asuite_metrics_lib_tests",
-      "host": true
-    },
-    {
-      // Host side metrics tests with Python3.
-      "name": "asuite_metrics_lib_py3_tests",
-      "host": true
-    },
-    {
-      // Host side clearcut tests.
-      "name": "asuite_cc_lib_tests",
-      "host": true
-    },
-    {
-      // Host side clearcut tests with Python3.
-      "name": "asuite_cc_lib_py3_tests",
-      "host": true
-    }
-  ]
-}
diff --git a/atest/Android.bp b/atest/Android.bp
index 64386e5..7e8d0bf 100644
--- a/atest/Android.bp
+++ b/atest/Android.bp
@@ -14,6 +14,10 @@
 
 // Set of error prone rules to ensure code quality
 // PackageLocation check requires the androidCompatible=false otherwise it does not do anything.
+package {
+    default_applicable_licenses: ["Android-Apache-2.0"],
+}
+
 java_library_host {
     name: "atest-tradefed-shell",
     java_resource_dirs: ["res"],
@@ -40,7 +44,7 @@
 }
 
 python_defaults {
-    name: "atest_py3_default",
+    name: "atest_default",
     pkg_path: "atest",
     version: {
         py2: {
@@ -100,21 +104,35 @@
         "tf_proto/*_pb2.py",
     ],
     libs: [
-        "atest_py3_proto",
+        "atest_proto",
+        "tradefed-protos-py",
+        "py-google-api-python-client",
+        "py-oauth2client",
+        "py-six",
     ],
     data: [
         "tools/updatedb_darwin.sh",
-        ":asuite_version",
     ],
     // Make atest's built name to atest-dev
     stem: "atest-dev",
-    defaults: ["atest_py3_default"],
+    defaults: ["atest_default"],
     dist: {
         targets: ["droidcore"],
     },
 }
 
 python_library_host {
+    name: "atest_proto",
+    defaults: ["atest_default"],
+    srcs: [
+        "proto/*.proto",
+    ],
+    proto: {
+        canonical_path_from_root: false,
+    },
+}
+
+python_library_host {
     name: "atest_module_info",
     defaults: ["atest_lib_default"],
     srcs: [
@@ -131,17 +149,6 @@
 }
 
 python_library_host {
-    name: "atest_py3_proto",
-    defaults: ["atest_py3_default"],
-    srcs: [
-        "proto/*.proto",
-    ],
-    proto: {
-        canonical_path_from_root: false,
-    },
-}
-
-python_library_host {
     name: "asuite_proto",
     defaults: ["asuite_default"],
     srcs: [
@@ -184,6 +191,9 @@
     srcs: [
         "**/*.py",
     ],
+    test_options: {
+        unit_test: true,
+    },
     data: [
         "tools/updatedb_darwin.sh",
         "unittest_data/**/*",
@@ -198,11 +208,12 @@
         "tf_proto/*_pb2.py",
     ],
     libs: [
-        "atest_py3_proto",
+        "atest_proto",
+        "tradefed-protos-py",
     ],
     test_config: "atest_unittests.xml",
     test_suites: ["general-tests"],
-    defaults: ["atest_py3_default"],
+    defaults: ["atest_default"],
 }
 
 python_test_host {
@@ -216,21 +227,11 @@
         "INTEGRATION_TESTS",
     ],
     test_config: "atest_integration_tests.xml",
-    test_suites: ["general-tests"],
-    defaults: ["atest_py3_default"],
-}
-
-genrule {
-    name: "asuite_version",
-    cmd: "DATETIME=$$(TZ='America/Log_Angelos' date +'%F');" +
-         "if [[ -n $$BUILD_NUMBER ]]; then" +
-         "  echo $${DATETIME}_$${BUILD_NUMBER} > $(out);" +
-         "else" +
-         "  echo $$(date +'%F_%R') > $(out);" +
-         "fi",
-    out: [
-        "VERSION",
-    ],
+    test_suites: ["null-suite"],
+    defaults: ["atest_default"],
+    test_options: {
+        unit_test: false,
+    },
 }
 
 sh_binary_host {
diff --git a/atest/INTEGRATION_TESTS b/atest/INTEGRATION_TESTS
index 2bf986e..d11a5e8 100644
--- a/atest/INTEGRATION_TESTS
+++ b/atest/INTEGRATION_TESTS
@@ -28,8 +28,12 @@
 
 ###[Test Finder: QUALIFIED_CLASS, Test Runner:AtestTradefedTestRunner]###
 ###Purpose: Test with finder: QUALIFIED_CLASS and runner: AtestTradefedTestRunner###
-# com.android.server.display.DisplayManagerServiceTest
-# com.android.server.wm.ScreenDecorWindowTests#testMultipleDecors
+# .java class
+android.sample.cts.SampleDeviceReportLogTest
+android.sample.cts.SampleDeviceTest#testSharedPreferences
+# .kt class
+android.os.cts.CompanionDeviceManagerTest
+android.os.cts.CompanionDeviceManagerTest#testIsDeviceAssociatedWithCompanionApproveWifiConnectionsPermission
 
 
 ###[Test Finder: MODULE_PACKAGE, Test Runner:AtestTradefedTestRunner]###
@@ -50,10 +54,8 @@
 ###[Test Finder: CC_CLASS, Test Runner:AtestTradefedTestRunner]###
 ###Purpose: Test with finder: CC_CLASS and runner: AtestTradefedTestRunner###
 PacketFragmenterTest
-# PacketFragmenterTest#test_no_fragment_necessary
 PacketFragmenterTest#test_no_fragment_necessary,test_ble_fragment_necessary
 
-
 ###[Test Finder: INTEGRATION, Test Runner:AtestTradefedTestRunner]###
 ###Purpose: Test with finder: INTEGRATION and runner: AtestTradefedTestRunner###
 native-benchmark
@@ -61,7 +63,8 @@
 
 ###[Test Finder: MODULE, Test Runner: VtsTradefedTestRunner]####
 ###Purpose: Test with finder: MODULE and runner: VtsTradefedTestRunner###
-VtsCodelabHelloWorldTest
+# VtsCodelabHelloWorldTest is commented out because vts10 no longer exists.
+#VtsCodelabHelloWorldTest
 
 
 ###[Test Finder: MODULE, Test Runner: RobolectricTestRunner]#####
@@ -84,3 +87,23 @@
 ###[MULTIPLE-TESTS + AtestTradefedTestRunner]###
 ###Purpose: Test with mixed testcases###
 CtsSampleDeviceTestCases CtsAnimationTestCases
+
+
+###[Parameterized GTest + AtestTradefedTestRunner]###
+###Purpose: Test with parameterized GTest testcases###
+# Commented out because multiple selection is not supported in integration tests.
+# PerInstance/CameraHidlTest.startStopPreview/0_internal_0
+VtsHalCameraProviderV2_4TargetTest:PerInstance/CameraHidlTest#startStopPreview/0_internal_0
+
+###[Parameterized Java Test + AtestTradefedTestRunner]###
+###Purpose: Test with parameterized Java testcases###
+CtsWindowManagerDeviceTestCases:android.server.wm.DisplayCutoutTests#testDisplayCutout_default
+cts/tests/framework/base/windowmanager/src/android/server/wm/DisplayCutoutTests.java#testDisplayCutout_default
+cts/tests/tests/os/src/android/os/cts/CompanionDeviceManagerTest.kt#testIsDeviceAssociated
+
+###[Java Test]###
+###Purpose: Find the test method by traversing parent classes###
+MixedManagedProfileOwnerTest#testPasswordSufficientInitially
+
+###[Option verify]###
+--help
diff --git a/atest/TEST_MAPPING b/atest/TEST_MAPPING
deleted file mode 100644
index 5a9b62a..0000000
--- a/atest/TEST_MAPPING
+++ /dev/null
@@ -1,36 +0,0 @@
-// Below lists the TEST_MAPPING tests to do ASuite unittests to make sure
-// the expectation of ASuite are still good.
-{
-  "presubmit": [
-    {
-      // Host side ATest unittests.
-      "name": "atest_unittests",
-      "host": true
-    },
-    {
-      // Host side ATest-py3 unittests.
-      "name": "atest-py3_unittests",
-      "host": true
-    },
-    {
-      // Host side metrics tests.
-      "name": "asuite_metrics_lib_tests",
-      "host": true
-    },
-    {
-      // Host side metrics tests with Python3.
-      "name": "asuite_metrics_lib_py3_tests",
-      "host": true
-    },
-    {
-      // Host side clearcut tests.
-      "name": "asuite_cc_lib_tests",
-      "host": true
-    },
-    {
-      // Host side clearcut tests with Python3.
-      "name": "asuite_cc_lib_py3_tests",
-      "host": true
-    }
-  ]
-}
diff --git a/atest/asuite_lib_test/Android.bp b/atest/asuite_lib_test/Android.bp
index 4889909..99e0a5f 100644
--- a/atest/asuite_lib_test/Android.bp
+++ b/atest/asuite_lib_test/Android.bp
@@ -17,26 +17,13 @@
 // tests result is accurate, separate them to two different test modules.
 
 // For testing asuite_metrics python2 libs
-python_test_host {
-    name: "asuite_metrics_lib_tests",
-    main: "asuite_lib_run_tests.py",
-    // These tests primarily check that the metric libs can be imported properly (see b/132086641).
-    // Specify a different pkg_path so that we can properly test them in isolation.
-    pkg_path: "asuite_test",
-    srcs: [
-        "asuite_lib_run_tests.py",
-        "asuite_metrics_test.py",
-    ],
-    libs: [
-        "asuite_metrics",
-    ],
-    test_suites: ["general-tests"],
-    defaults: ["atest_py2_default"],
+package {
+    default_applicable_licenses: ["Android-Apache-2.0"],
 }
 
 // For testing asuite_metrics python3 libs
 python_test_host {
-    name: "asuite_metrics_lib_py3_tests",
+    name: "asuite_metrics_lib_tests",
     main: "asuite_lib_run_tests.py",
     pkg_path: "asuite_test",
     srcs: [
@@ -46,11 +33,13 @@
     libs: [
         "asuite_metrics",
     ],
-    test_suites: ["general-tests"],
+    test_options: {
+        unit_test: true,
+    },
     defaults: ["atest_lib_default"],
 }
 
-// For testing asuite_cc_client python2 libs
+// For testing asuite_cc_client python3 libs
 python_test_host {
     name: "asuite_cc_lib_tests",
     main: "asuite_lib_run_tests.py",
@@ -62,22 +51,8 @@
     libs: [
         "asuite_cc_client",
     ],
-    test_suites: ["general-tests"],
-    defaults: ["atest_py2_default"],
-}
-
-// For testing asuite_cc_client python3 libs
-python_test_host {
-    name: "asuite_cc_lib_py3_tests",
-    main: "asuite_lib_run_tests.py",
-    pkg_path: "asuite_test",
-    srcs: [
-        "asuite_lib_run_tests.py",
-        "asuite_cc_client_test.py",
-    ],
-    libs: [
-        "asuite_cc_client",
-    ],
-    test_suites: ["general-tests"],
+    test_options: {
+        unit_test: true,
+    },
     defaults: ["atest_lib_default"],
 }
diff --git a/atest/asuite_lib_test/asuite_cc_client_test.py b/atest/asuite_lib_test/asuite_cc_client_test.py
index a0aefa1..ab3d2b0 100644
--- a/atest/asuite_lib_test/asuite_cc_client_test.py
+++ b/atest/asuite_lib_test/asuite_cc_client_test.py
@@ -33,6 +33,7 @@
         from asuite.metrics import metrics
         from asuite.metrics import metrics_base
         from asuite.metrics import metrics_utils
+        from asuite import atest_utils
 
         # TODO (b/132602907): Add the real usage for checking if metrics pass or fail.
 
diff --git a/atest/asuite_metrics.py b/atest/asuite_metrics.py
index 69dae5f..7e69f1c 100644
--- a/atest/asuite_metrics.py
+++ b/atest/asuite_metrics.py
@@ -34,17 +34,17 @@
                           '.config', 'asuite', '.metadata')
 _ANDROID_BUILD_TOP = 'ANDROID_BUILD_TOP'
 
-DUMMY_UUID = '00000000-0000-4000-8000-000000000000'
+UNUSED_UUID = '00000000-0000-4000-8000-000000000000'
 
 
 #pylint: disable=broad-except
-def log_event(metrics_url, dummy_key_fallback=True, **kwargs):
+def log_event(metrics_url, unused_key_fallback=True, **kwargs):
     """Base log event function for asuite backend.
 
     Args:
         metrics_url: String, URL to report metrics to.
-        dummy_key_fallback: Boolean, If True and unable to get grouping key,
-                            use a dummy key otherwise return out. Sometimes we
+        unused_key_fallback: Boolean. If True and unable to get the grouping
+                            key, use an unused key; otherwise return. Sometimes we
                             don't want to return metrics for users we are
                             unable to identify. Default True.
         kwargs: Dict, additional fields we want to return metrics for.
@@ -53,9 +53,9 @@
         try:
             key = str(_get_grouping_key())
         except Exception:
-            if not dummy_key_fallback:
+            if not unused_key_fallback:
                 return
-            key = DUMMY_UUID
+            key = UNUSED_UUID
         data = {'grouping_key': key,
                 'run_id': str(uuid.uuid4())}
         if kwargs:
@@ -97,7 +97,7 @@
 def _get_old_key():
     """Get key from old meta data file if exists, else return None."""
     old_file = os.path.join(os.environ[_ANDROID_BUILD_TOP],
-                            'tools/tradefederation/core/atest', '.metadata')
+                            'tools/asuite/atest', '.metadata')
     key = None
     if os.path.isfile(old_file):
         with open(old_file) as f:
diff --git a/atest/atest.py b/atest/atest.py
index d90cd67..f1f7533 100755
--- a/atest/atest.py
+++ b/atest/atest.py
@@ -38,6 +38,7 @@
 from multiprocessing import Process
 
 import atest_arg_parser
+import atest_configs
 import atest_error
 import atest_execution_info
 import atest_utils
@@ -52,7 +53,7 @@
 from metrics import metrics_base
 from metrics import metrics_utils
 from test_runners import regression_test_runner
-from tools import atest_tools
+from tools import atest_tools as at
 
 EXPECTED_VARS = frozenset([
     constants.ANDROID_BUILD_TOP,
@@ -71,29 +72,25 @@
 TEST_TYPE = 'test_type'
 # Tasks that must run in the build time but unable to build by soong.
 # (e.g subprocesses that invoke host commands.)
-EXTRA_TASKS = {
-    'index-targets': atest_tools.index_targets
-}
+ACLOUD_CREATE = at.acloud_create
+INDEX_TARGETS = at.index_targets
 
 
-def _run_extra_tasks(join=False):
-    """Execute EXTRA_TASKS with multiprocessing.
+def _run_multi_proc(func, *args, **kwargs):
+    """Start a process with multiprocessing and return Process object.
 
     Args:
-        join: A boolean that indicates the process should terminate when
-        the main process ends or keep itself alive. True indicates the
-        main process will wait for all subprocesses finish while False represents
-        killing all subprocesses when the main process exits.
+        func: The callable to be run as the target of the process.
+        args/kwargs: See the multiprocessing documentation:
+        https://docs.python.org/3.8/library/multiprocessing.html#process-and-exceptions
+
+    Returns:
+        multiprocessing.Process object.
     """
-    _running_procs = []
-    for task in EXTRA_TASKS.values():
-        proc = Process(target=task)
-        proc.daemon = not join
-        proc.start()
-        _running_procs.append(proc)
-    if join:
-        for proc in _running_procs:
-            proc.join()
+
+    proc = Process(target=func, *args, **kwargs)
+    proc.start()
+    return proc
 
 
 def _parse_args(argv):
@@ -116,7 +113,9 @@
     args = parser.parse_args(pruned_argv)
     args.custom_args = []
     if custom_args_index is not None:
-        args.custom_args = argv[custom_args_index+1:]
+        for arg in argv[custom_args_index+1:]:
+            logging.debug('Quoting regex argument %s', arg)
+            args.custom_args.append(atest_utils.quote(arg))
     return args
 
 
@@ -126,6 +125,8 @@
     Args:
         verbose: A boolean. If true display DEBUG level logs.
     """
+    # Clear existing handlers so that logging.basicConfig takes effect again.
+    logging.getLogger('').handlers = []
     log_format = '%(asctime)s %(filename)s:%(lineno)s:%(levelname)s: %(message)s'
     datefmt = '%Y-%m-%d %H:%M:%S'
     if verbose:
@@ -198,7 +199,10 @@
                 'sharding': constants.SHARDING,
                 'tf_debug': constants.TF_DEBUG,
                 'tf_template': constants.TF_TEMPLATE,
-                'user_type': constants.USER_TYPE}
+                'user_type': constants.USER_TYPE,
+                'flakes_info': constants.FLAKES_INFO,
+                'tf_early_device_release': constants.TF_EARLY_DEVICE_RELEASE,
+                'request_upload_result': constants.REQUEST_UPLOAD_RESULT}
     not_match = [k for k in arg_maps if k not in vars(args)]
     if not_match:
         raise AttributeError('%s object has no attribute %s'
@@ -245,8 +249,11 @@
     err_msg = None
     # In the case of '$atest <device-only> --host', exit.
     if (host_tests or args.host) and constants.DEVICE_TEST in all_device_modes:
-        err_msg = ('Test side and option(--host) conflict. Please remove '
-                   '--host if the test run on device side.')
+        device_only_tests = [x.test_name for x in test_infos
+                             if x.get_supported_exec_mode() == constants.DEVICE_TEST]
+        err_msg = ('Specified --host, but the following tests are device-only:\n  ' +
+                   '\n  '.join(sorted(device_only_tests)) + '\nPlease remove the option '
+                   'when running device-only tests.')
     # In the case of '$atest <host-only> <device-only> --host' or
     # '$atest <host-only> <device-only>', exit.
     if (constants.DEVICELESS_TEST in all_device_modes and
@@ -470,13 +477,14 @@
 
 
 # pylint: disable=too-many-locals
-def _run_test_mapping_tests(results_dir, test_infos, extra_args):
+def _run_test_mapping_tests(results_dir, test_infos, extra_args, mod_info):
     """Run all tests in TEST_MAPPING files.
 
     Args:
         results_dir: String directory to store atest results.
         test_infos: A set of TestInfos.
         extra_args: Dict of extra args to add to test run.
+        mod_info: ModuleInfo object.
 
     Returns:
         Exit code.
@@ -501,7 +509,7 @@
         atest_utils.colorful_print(header, constants.MAGENTA)
         logging.debug('\n'.join([str(info) for info in tests]))
         tests_exit_code, reporter = test_runner_handler.run_all_tests(
-            results_dir, tests, args, delay_print_summary=True)
+            results_dir, tests, args, mod_info, delay_print_summary=True)
         atest_execution_info.AtestExecutionInfo.result_reporters.append(reporter)
         test_results.append((tests_exit_code, reporter, test_type))
 
@@ -527,20 +535,21 @@
     return all_tests_exit_code
 
 
-def _dry_run(results_dir, extra_args, test_infos):
+def _dry_run(results_dir, extra_args, test_infos, mod_info):
     """Only print the commands of the target tests rather than running them in actual.
 
     Args:
         results_dir: Path for saving atest logs.
         extra_args: Dict of extra args for test runners to utilize.
         test_infos: A list of TestInfos.
+        mod_info: ModuleInfo object.
 
     Returns:
         A list of test commands.
     """
     all_run_cmds = []
     for test_runner, tests in test_runner_handler.group_tests_by_test_runners(test_infos):
-        runner = test_runner(results_dir)
+        runner = test_runner(results_dir, module_info=mod_info)
         run_cmds = runner.generate_run_commands(tests, extra_args)
         for run_cmd in run_cmds:
             all_run_cmds.append(run_cmd)
@@ -615,7 +624,7 @@
         atest_utils.colorful_print(stop_msg, constants.RED)
         atest_utils.colorful_print(msg, constants.CYAN)
 
-def _dry_run_validator(args, results_dir, extra_args, test_infos):
+def _dry_run_validator(args, results_dir, extra_args, test_infos, mod_info):
     """Method which process --dry-run argument.
 
     Args:
@@ -623,9 +632,12 @@
         result_dir: A string path of the results dir.
         extra_args: A dict of extra args for test runners to utilize.
         test_infos: A list of test_info.
+        mod_info: ModuleInfo object.
+    Returns:
+        Exit code.
     """
     args.tests.sort()
-    dry_run_cmds = _dry_run(results_dir, extra_args, test_infos)
+    dry_run_cmds = _dry_run(results_dir, extra_args, test_infos, mod_info)
     if args.verify_cmd_mapping:
         try:
             atest_utils.handle_test_runner_cmd(' '.join(args.tests),
@@ -637,8 +649,58 @@
     if args.update_cmd_mapping:
         atest_utils.handle_test_runner_cmd(' '.join(args.tests),
                                            dry_run_cmds)
-    sys.exit(constants.EXIT_CODE_SUCCESS)
+    return constants.EXIT_CODE_SUCCESS
 
+def _exclude_modules_in_targets(build_targets):
+    """Method that excludes MODULES-IN-* targets.
+
+    Args:
+        build_targets: A set of build targets.
+
+    Returns:
+        A set of build targets that excludes MODULES-IN-*.
+    """
+    shrank_build_targets = build_targets.copy()
+    logging.debug('Will exclude all "%s*" from the build targets.',
+                  constants.MODULES_IN)
+    for target in build_targets:
+        if target.startswith(constants.MODULES_IN):
+            logging.debug('Ignore %s.', target)
+            shrank_build_targets.remove(target)
+    return shrank_build_targets
+
+def acloud_create_validator(results_dir, args):
+    """Check lunch'd target before running 'acloud create'.
+
+    Args:
+        results_dir: A string of the results directory.
+        args: A list of arguments.
+
+    Returns:
+        If the target is valid:
+            A tuple of (multiprocessing.Process,
+                        string of report file path)
+        else:
+            None, None
+    """
+    if not any((args.acloud_create, args.start_avd)):
+        return None, None
+    if args.start_avd:
+        args.acloud_create = ['--num=1']
+    acloud_args = ' '.join(args.acloud_create)
+    target = os.getenv('TARGET_PRODUCT', "")
+    if 'cf_x86' in target:
+        report_file = at.get_report_file(results_dir, acloud_args)
+        acloud_proc = _run_multi_proc(
+            func=ACLOUD_CREATE,
+            args=[report_file],
+            kwargs={'args':acloud_args,
+                    'no_metrics_notice':args.no_metrics})
+        return acloud_proc, report_file
+    atest_utils.colorful_print(
+        '{} is not cf_x86 family; will not create any AVD.'.format(target),
+        constants.RED)
+    return None, None
 
 # pylint: disable=too-many-statements
 # pylint: disable=too-many-branches
@@ -664,21 +726,25 @@
         cwd=os.getcwd(),
         os=os_pyver)
     _non_action_validator(args)
+    proc_acloud, report_file = acloud_create_validator(results_dir, args)
     mod_info = module_info.ModuleInfo(force_build=args.rebuild_module_info)
     if args.rebuild_module_info:
-        _run_extra_tasks(join=True)
-    translator = cli_translator.CLITranslator(module_info=mod_info,
-                                              print_cache_msg=not args.clear_cache)
+        proc_idx = _run_multi_proc(INDEX_TARGETS)
+        proc_idx.join()
+    translator = cli_translator.CLITranslator(
+        module_info=mod_info,
+        print_cache_msg=not args.clear_cache)
     if args.list_modules:
         _print_testable_modules(mod_info, args.list_modules)
         return constants.EXIT_CODE_SUCCESS
-    # Clear cache if user pass -c option
-    if args.clear_cache:
-        atest_utils.clean_test_info_caches(args.tests)
     build_targets = set()
     test_infos = set()
     if _will_run_tests(args):
+        find_start = time.time()
         build_targets, test_infos = translator.translate(args)
+        if args.no_modules_in:
+            build_targets = _exclude_modules_in_targets(build_targets)
+        find_duration = time.time() - find_start
         if not test_infos:
             return constants.EXIT_CODE_TEST_NOT_FOUND
         if not is_from_test_mapping(test_infos):
@@ -691,7 +757,8 @@
                                                               test_infos)
     extra_args = get_extra_args(args)
     if any((args.update_cmd_mapping, args.verify_cmd_mapping, args.dry_run)):
-        _dry_run_validator(args, results_dir, extra_args, test_infos)
+        return _dry_run_validator(args, results_dir, extra_args, test_infos,
+                                  mod_info)
     if args.detect_regression:
         build_targets |= (regression_test_runner.RegressionTestRunner('')
                           .get_test_runner_build_reqs())
@@ -701,18 +768,49 @@
         if constants.TEST_STEP in steps and not args.rebuild_module_info:
             # Run extra tasks along with build step concurrently. Note that
             # Atest won't index targets when only "-b" is given(without -t).
-            _run_extra_tasks(join=False)
+            proc_idx = _run_multi_proc(INDEX_TARGETS, daemon=True)
         # Add module-info.json target to the list of build targets to keep the
         # file up to date.
         build_targets.add(mod_info.module_info_target)
         build_start = time.time()
         success = atest_utils.build(build_targets, verbose=args.verbose)
+        build_duration = time.time() - build_start
         metrics.BuildFinishEvent(
-            duration=metrics_utils.convert_duration(time.time() - build_start),
+            duration=metrics_utils.convert_duration(build_duration),
             success=success,
             targets=build_targets)
+        rebuild_module_info = constants.DETECT_TYPE_NOT_REBUILD_MODULE_INFO
+        if args.rebuild_module_info:
+            rebuild_module_info = constants.DETECT_TYPE_REBUILD_MODULE_INFO
+        metrics.LocalDetectEvent(
+            detect_type=rebuild_module_info,
+            result=int(build_duration))
         if not success:
             return constants.EXIT_CODE_BUILD_FAILURE
+        # Always reload module-info after the build finishes.
+        # TODO(b/178675689) Move it to a thread when running test.
+        mod_info.generate_atest_merged_dep_file()
+        if proc_acloud:
+            proc_acloud.join()
+            status = at.probe_acloud_status(report_file)
+            if status != 0:
+                return status
+            acloud_duration = at.get_acloud_duration(report_file)
+            find_build_duration = find_duration + build_duration
+            if find_build_duration - acloud_duration >= 0:
+                # find+build took longer, saved acloud create time.
+                logging.debug('Saved acloud create time: %ss.',
+                              acloud_duration)
+                metrics.LocalDetectEvent(
+                    detect_type=constants.DETECT_TYPE_ACLOUD_CREATE,
+                    result=round(acloud_duration))
+            else:
+                # acloud create took longer, saved find+build time.
+                logging.debug('Saved Find and Build time: %ss.',
+                              find_build_duration)
+                metrics.LocalDetectEvent(
+                    detect_type=constants.DETECT_TYPE_FIND_BUILD,
+                    result=round(find_build_duration))
     elif constants.TEST_STEP not in steps:
         logging.warning('Install step without test step currently not '
                         'supported, installing AND testing instead.')
@@ -722,15 +820,16 @@
     if constants.TEST_STEP in steps:
         if not is_from_test_mapping(test_infos):
             tests_exit_code, reporter = test_runner_handler.run_all_tests(
-                results_dir, test_infos, extra_args)
+                results_dir, test_infos, extra_args, mod_info)
             atest_execution_info.AtestExecutionInfo.result_reporters.append(reporter)
         else:
             tests_exit_code = _run_test_mapping_tests(
-                results_dir, test_infos, extra_args)
+                results_dir, test_infos, extra_args, mod_info)
     if args.detect_regression:
         regression_args = _get_regression_detection_args(args, results_dir)
         # TODO(b/110485713): Should not call run_tests here.
-        reporter = result_reporter.ResultReporter()
+        reporter = result_reporter.ResultReporter(
+            collect_only=extra_args.get(constants.COLLECT_TESTS_ONLY))
         atest_execution_info.AtestExecutionInfo.result_reporters.append(reporter)
         tests_exit_code |= regression_test_runner.RegressionTestRunner(
             '').run_tests(
@@ -751,11 +850,11 @@
 
 if __name__ == '__main__':
     RESULTS_DIR = make_test_run_dir()
-    ARGS = _parse_args(sys.argv[1:])
-    with atest_execution_info.AtestExecutionInfo(sys.argv[1:],
-                                                 RESULTS_DIR,
-                                                 ARGS) as result_file:
-        if not ARGS.no_metrics:
+    atest_configs.GLOBAL_ARGS = _parse_args(sys.argv[1:])
+    with atest_execution_info.AtestExecutionInfo(
+            sys.argv[1:], RESULTS_DIR,
+            atest_configs.GLOBAL_ARGS) as result_file:
+        if not atest_configs.GLOBAL_ARGS.no_metrics:
             atest_utils.print_data_collection_notice()
             USER_FROM_TOOL = os.getenv(constants.USER_FROM_TOOL, '')
             if USER_FROM_TOOL == '':
@@ -763,11 +862,12 @@
             else:
                 metrics_base.MetricsBase.tool_name = USER_FROM_TOOL
 
-        EXIT_CODE = main(sys.argv[1:], RESULTS_DIR, ARGS)
+        EXIT_CODE = main(sys.argv[1:], RESULTS_DIR, atest_configs.GLOBAL_ARGS)
         DETECTOR = bug_detector.BugDetector(sys.argv[1:], EXIT_CODE)
-        metrics.LocalDetectEvent(
-            detect_type=constants.DETECT_TYPE_BUG_DETECTED,
-            result=DETECTOR.caught_result)
-        if result_file:
-            print("Run 'atest --history' to review test result history.")
+        if EXIT_CODE not in constants.EXIT_CODES_BEFORE_TEST:
+            metrics.LocalDetectEvent(
+                detect_type=constants.DETECT_TYPE_BUG_DETECTED,
+                result=DETECTOR.caught_result)
+            if result_file:
+                print("Run 'atest --history' to review test result history.")
     sys.exit(EXIT_CODE)
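
For context, a minimal sketch of the concurrency pattern atest.py adopts here: start a long-running helper (indexing targets or 'acloud create') in a separate multiprocessing.Process, keep building on the main process, then join and inspect the result. Only the Process usage mirrors _run_multi_proc; the helper below is a placeholder:

    import time
    from multiprocessing import Process

    def _run_multi_proc(func, *args, **kwargs):
        """Start func in its own process and return the Process object."""
        proc = Process(target=func, *args, **kwargs)
        proc.start()
        return proc

    def create_avd(report_file):
        # Placeholder for the long-running 'acloud create' step.
        time.sleep(2)
        with open(report_file, 'w') as report:
            report.write('{"status": "SUCCESS"}')

    if __name__ == '__main__':
        proc = _run_multi_proc(create_avd,
                               kwargs={'report_file': '/tmp/acloud_report.json'})
        # ... run the build here while the AVD is being created ...
        proc.join()  # Wait for the background step before running tests.
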
diff --git a/atest/atest_arg_parser.py b/atest/atest_arg_parser.py
index 178936e..3a62edf 100644
--- a/atest/atest_arg_parser.py
+++ b/atest/atest_arg_parser.py
@@ -32,6 +32,7 @@
              ' options.')
 
 # Constants used for arg help message(sorted in alphabetic)
+ACLOUD_CREATE = 'Create AVD(s) via acloud command.'
 ALL_ABI = 'Set to run tests for all abis.'
 BUILD = 'Run a build.'
 CLEAR_CACHE = 'Wipe out the test_infos cache of the test.'
@@ -40,11 +41,13 @@
 DISABLE_TEARDOWN = 'Disable test teardown and cleanup.'
 DRY_RUN = 'Dry run atest without building, installing and running tests in real.'
 ENABLE_FILE_PATTERNS = 'Enable FILE_PATTERNS in TEST_MAPPING.'
+FLAKES_INFO = 'Test result with flakes info.'
 HISTORY = ('Show test results in chronological order(with specified number or '
            'all by default).')
 HOST = ('Run the test completely on the host without a device. '
         '(Note: running a host test that requires a device without '
         '--host will fail.)')
+HOST_UNIT_TEST_ONLY = ('Run all host unit tests under the current directory.')
 INCLUDE_SUBDIRS = 'Search TEST_MAPPING files in subdirs as well.'
 INFO = 'Show module information.'
 INSTALL = 'Install an APK.'
@@ -55,30 +58,42 @@
 LATEST_RESULT = 'Print latest test result.'
 LIST_MODULES = 'List testable modules for the given suite.'
 NO_METRICS = 'Do not send metrics.'
+NO_MODULES_IN = ('Do not include MODULES-IN-* as build targets. Warning: this '
+                 'may result in missing-dependency issues.')
 REBUILD_MODULE_INFO = ('Forces a rebuild of the module-info.json file. '
                        'This may be necessary following a repo sync or '
                        'when writing a new test.')
+REQUEST_UPLOAD_RESULT = 'Request permission to upload test result or not.'
 RERUN_UNTIL_FAILURE = ('Rerun all tests until a failure occurs or the max '
                        'iteration is reached. (10 by default)')
 RETRY_ANY_FAILURE = ('Rerun failed tests until passed or the max iteration '
                      'is reached. (10 by default)')
 SERIAL = 'The device to run the test on.'
+SHARDING = 'Option to specify the sharding count. The default value is 2.'
+START_AVD = 'Automatically create an AVD and run tests on the virtual device.'
 TEST = ('Run the tests. WARNING: Many test configs force cleanup of device '
-        'after test run. In this case, "-d" must be used in previous test run to '
-        'disable cleanup for "-t" to work. Otherwise, device will need to be '
-        'setup again with "-i".')
+        'after test run. In this case, "-d" must be used in previous test run '
+        'to disable cleanup for "-t" to work. Otherwise, device will need to '
+        'be setup again with "-i".')
 TEST_MAPPING = 'Run tests defined in TEST_MAPPING files.'
+TEST_CONFIG_SELECTION = ('If multiple test configs belong to the same test '
+                         'module, pop up a selection menu on the console.')
+TF_DEBUG = ('Enable tradefed debug mode with a specified port. Default value '
+            'is 10888.')
+TF_EARLY_DEVICE_RELEASE = ('Tradefed flag to release the device as soon as '
+                           'done with it.')
 TF_TEMPLATE = ('Add extra tradefed template for ATest suite, '
                'e.g. atest <test> --tf-template <template_key>=<template_path>')
-TF_DEBUG = 'Enable tradefed debug mode with a specify port. Default value is 10888.'
-SHARDING = 'Option to specify sharding count. The default value is 2'
 UPDATE_CMD_MAPPING = ('Update the test command of input tests. Warning: result '
-                      'will be saved under tools/tradefederation/core/atest/test_data.')
-USER_TYPE = 'Run test with specific user type, e.g. atest <test> --user-type secondary_user'
+                      'will be saved under '
+                      'tools/asuite/atest/test_data.')
+USER_TYPE = ('Run test with specific user type, e.g. atest <test> --user-type '
+             'secondary_user')
 VERBOSE = 'Display DEBUG level logging.'
 VERIFY_CMD_MAPPING = 'Verify the test command of input tests.'
 VERSION = 'Display version string.'
-WAIT_FOR_DEBUGGER = 'Wait for debugger prior to execution (Instrumentation tests only).'
+WAIT_FOR_DEBUGGER = ('Wait for debugger prior to execution (Instrumentation '
+                     'tests only).')
 
 def _positive_int(value):
     """Verify value by whether or not a positive integer.
@@ -96,8 +111,8 @@
         if converted_value < 1:
             raise argparse.ArgumentTypeError(err_msg)
         return converted_value
-    except ValueError:
-        raise argparse.ArgumentTypeError(err_msg)
+    except ValueError as value_err:
+        raise argparse.ArgumentTypeError(err_msg) from value_err
 
 
 class AtestArgParser(argparse.ArgumentParser):
@@ -105,9 +120,9 @@
 
     def __init__(self):
         """Initialise an ArgumentParser instance."""
-        super(AtestArgParser, self).__init__(
-            description=HELP_DESC, add_help=False)
+        super().__init__(description=HELP_DESC, add_help=False)
 
+    # pylint: disable=too-many-statements
     def add_atest_args(self):
         """A function that does ArgumentParser.add_argument()"""
         self.add_argument('tests', nargs='*', help='Tests to build and/or run.')
@@ -123,7 +138,8 @@
                           help=INSTALL)
         self.add_argument('-m', constants.REBUILD_MODULE_INFO_FLAG,
                           action='store_true', help=REBUILD_MODULE_INFO)
-        self.add_argument('-s', '--serial', help=SERIAL)
+        self.add_argument('--no-modules-in', help=NO_MODULES_IN,
+                          action='store_true')
         self.add_argument('--sharding', nargs='?', const=2,
                           type=_positive_int, default=0,
                           help=SHARDING)
@@ -131,6 +147,8 @@
                           const=constants.TEST_STEP, help=TEST)
         self.add_argument('-w', '--wait-for-debugger', action='store_true',
                           help=WAIT_FOR_DEBUGGER)
+        self.add_argument('--request-upload-result', action='store_true',
+                          help=REQUEST_UPLOAD_RESULT)
 
         # Options related to Test Mapping
         self.add_argument('-p', '--test-mapping', action='store_true',
@@ -142,6 +160,10 @@
         self.add_argument('--enable-file-patterns', action='store_true',
                           help=ENABLE_FILE_PATTERNS)
 
+        # Options related to Host Unit Test.
+        self.add_argument('--host-unit-test-only', action='store_true',
+                          help=HOST_UNIT_TEST_ONLY)
+
         # Options for information queries and dry-runs:
         # A group of options for dry-runs. They are mutually exclusive
         # in a command line.
@@ -156,6 +178,27 @@
         self.add_argument('-v', '--verbose', action='store_true', help=VERBOSE)
         self.add_argument('-V', '--version', action='store_true', help=VERSION)
 
+        # Options that have to do with acloud/AVDs.
+        agroup = self.add_mutually_exclusive_group()
+        agroup.add_argument('--acloud-create', nargs=argparse.REMAINDER, type=str,
+                            help=ACLOUD_CREATE)
+        agroup.add_argument('--start-avd', action='store_true',
+                            help=START_AVD)
+        agroup.add_argument('-s', '--serial', help=SERIAL)
+
+        # Options to query flakes info in test results.
+        self.add_argument('--flakes-info', action='store_true',
+                          help=FLAKES_INFO)
+
+        # Options for tradefed to release test device earlier.
+        self.add_argument('--tf-early-device-release', action='store_true',
+                          help=TF_EARLY_DEVICE_RELEASE)
+
+        # Options to enable a selection menu if multiple test configs belong
+        # to the same test module.
+        self.add_argument('--test-config-select', action='store_true',
+                          help=TEST_CONFIG_SELECTION)
+
         # Obsolete options that will be removed soon.
         self.add_argument('--generate-baseline', nargs='?',
                           type=int, const=5, default=0,
@@ -244,39 +287,48 @@
     Returns:
         STDOUT from pydoc.pager().
     """
-    epilog_text = EPILOG_TEMPLATE.format(ALL_ABI=ALL_ABI,
-                                         BUILD=BUILD,
-                                         CLEAR_CACHE=CLEAR_CACHE,
-                                         COLLECT_TESTS_ONLY=COLLECT_TESTS_ONLY,
-                                         DISABLE_TEARDOWN=DISABLE_TEARDOWN,
-                                         DRY_RUN=DRY_RUN,
-                                         ENABLE_FILE_PATTERNS=ENABLE_FILE_PATTERNS,
-                                         HELP_DESC=HELP_DESC,
-                                         HISTORY=HISTORY,
-                                         HOST=HOST,
-                                         INCLUDE_SUBDIRS=INCLUDE_SUBDIRS,
-                                         INFO=INFO,
-                                         INSTALL=INSTALL,
-                                         INSTANT=INSTANT,
-                                         ITERATION=ITERATION,
-                                         LATEST_RESULT=LATEST_RESULT,
-                                         LIST_MODULES=LIST_MODULES,
-                                         NO_METRICS=NO_METRICS,
-                                         REBUILD_MODULE_INFO=REBUILD_MODULE_INFO,
-                                         RERUN_UNTIL_FAILURE=RERUN_UNTIL_FAILURE,
-                                         RETRY_ANY_FAILURE=RETRY_ANY_FAILURE,
-                                         SERIAL=SERIAL,
-                                         SHARDING=SHARDING,
-                                         TEST=TEST,
-                                         TEST_MAPPING=TEST_MAPPING,
-                                         TF_DEBUG=TF_DEBUG,
-                                         TF_TEMPLATE=TF_TEMPLATE,
-                                         USER_TYPE=USER_TYPE,
-                                         UPDATE_CMD_MAPPING=UPDATE_CMD_MAPPING,
-                                         VERBOSE=VERBOSE,
-                                         VERSION=VERSION,
-                                         VERIFY_CMD_MAPPING=VERIFY_CMD_MAPPING,
-                                         WAIT_FOR_DEBUGGER=WAIT_FOR_DEBUGGER)
+    epilog_text = EPILOG_TEMPLATE.format(
+        ACLOUD_CREATE=ACLOUD_CREATE,
+        ALL_ABI=ALL_ABI,
+        BUILD=BUILD,
+        CLEAR_CACHE=CLEAR_CACHE,
+        COLLECT_TESTS_ONLY=COLLECT_TESTS_ONLY,
+        DISABLE_TEARDOWN=DISABLE_TEARDOWN,
+        DRY_RUN=DRY_RUN,
+        ENABLE_FILE_PATTERNS=ENABLE_FILE_PATTERNS,
+        FLAKES_INFO=FLAKES_INFO,
+        HELP_DESC=HELP_DESC,
+        HISTORY=HISTORY,
+        HOST=HOST,
+        HOST_UNIT_TEST_ONLY=HOST_UNIT_TEST_ONLY,
+        INCLUDE_SUBDIRS=INCLUDE_SUBDIRS,
+        INFO=INFO,
+        INSTALL=INSTALL,
+        INSTANT=INSTANT,
+        ITERATION=ITERATION,
+        LATEST_RESULT=LATEST_RESULT,
+        LIST_MODULES=LIST_MODULES,
+        NO_METRICS=NO_METRICS,
+        NO_MODULES_IN=NO_MODULES_IN,
+        REBUILD_MODULE_INFO=REBUILD_MODULE_INFO,
+        REQUEST_UPLOAD_RESULT=REQUEST_UPLOAD_RESULT,
+        RERUN_UNTIL_FAILURE=RERUN_UNTIL_FAILURE,
+        RETRY_ANY_FAILURE=RETRY_ANY_FAILURE,
+        SERIAL=SERIAL,
+        SHARDING=SHARDING,
+        START_AVD=START_AVD,
+        TEST=TEST,
+        TEST_CONFIG_SELECTION=TEST_CONFIG_SELECTION,
+        TEST_MAPPING=TEST_MAPPING,
+        TF_DEBUG=TF_DEBUG,
+        TF_EARLY_DEVICE_RELEASE=TF_EARLY_DEVICE_RELEASE,
+        TF_TEMPLATE=TF_TEMPLATE,
+        USER_TYPE=USER_TYPE,
+        UPDATE_CMD_MAPPING=UPDATE_CMD_MAPPING,
+        VERBOSE=VERBOSE,
+        VERSION=VERSION,
+        VERIFY_CMD_MAPPING=VERIFY_CMD_MAPPING,
+        WAIT_FOR_DEBUGGER=WAIT_FOR_DEBUGGER)
     return pydoc.pager(epilog_text)
 
 
@@ -297,27 +349,34 @@
         -a, --all-abi
             {ALL_ABI}
 
+            If you only need to run tests for a specific ABI, use:
+                atest <test> -- --abi arm64-v8a   # ARM 64-bit
+                atest <test> -- --abi armeabi-v7a # ARM 32-bit
+
         -b, --build:
             {BUILD} (default)
 
         -d, --disable-teardown
             {DISABLE_TEARDOWN}
 
-        -D --tf-debug
+        -D, --tf-debug
             {TF_DEBUG}
 
-        --history
-            {HISTORY}
-
         --host
             {HOST}
 
+        --host-unit-test-only
+            {HOST_UNIT_TEST_ONLY}
+
         -i, --install
             {INSTALL} (default)
 
         -m, --rebuild-module-info
             {REBUILD_MODULE_INFO} (default)
 
+        --no-modules-in
+            {NO_MODULES_IN}
+
         -s, --serial
             {SERIAL}
 
@@ -327,12 +386,20 @@
         -t, --test
             {TEST} (default)
 
+        --test-config-select
+            {TEST_CONFIG_SELECTION}
+
+        --tf-early-device-release
+            {TF_EARLY_DEVICE_RELEASE}
+
         --tf-template
             {TF_TEMPLATE}
 
         -w, --wait-for-debugger
             {WAIT_FOR_DEBUGGER}
 
+        --request-upload-result
+            {REQUEST_UPLOAD_RESULT}
 
         [ Test Mapping ]
         -p, --test-mapping
@@ -349,6 +416,9 @@
         --collect-tests-only
             {COLLECT_TESTS_ONLY}
 
+        --history
+            {HISTORY}
+
         --info
             {INFO}
 
@@ -397,6 +467,20 @@
         --retry-any-failure
             {RETRY_ANY_FAILURE}
 
+
+        [ Testing With AVDs ]
+        --start-avd
+            {START_AVD}
+
+        --acloud-create
+            {ACLOUD_CREATE}
+
+
+        [ Testing With Flakes Info ]
+        --flakes-info
+            {FLAKES_INFO}
+
+
         [ Metrics ]
         --no-metrics
             {NO_METRICS}
@@ -603,8 +687,34 @@
         atest <test> --retry-any-failure 20
 
 
+    - - - - - - - - - - - -
+    RUNNING TESTS ON AVD(s)
+    - - - - - - - - - - - -
+
+    Atest can run tests on a newly created AVD: it runs the build and 'acloud create' simultaneously, and runs the tests once the AVD has been created successfully.
+
+    Examples:
+    - Start an AVD before running tests on that newly created device.
+
+        acloud create && atest <test>
+
+    can be simplified by:
+
+        atest <test> --start-avd
+
+    - Start AVD(s) by specifying 'acloud create' arguments and run tests on the newly created device(s).
+
+        atest <test> --acloud-create "--build-id 6509363 --build-target aosp_cf_x86_phone-userdebug --branch aosp_master"
+
+    For details about these arguments, run 'acloud create --help'.
+
+    [WARNING]
+    * --acloud-create must be the LAST optional argument: all remaining args are consumed as its positional args.
+    * --acloud-create/--start-avd do not delete newly created AVDs; users must delete them manually.
+
+
     - - - - - - - - - - - - - - - -
-    REGRESSION DETECTION (obsolute)
+    REGRESSION DETECTION (obsolete)
     - - - - - - - - - - - - - - - -
 
     ********************** Warning **********************
@@ -684,6 +794,14 @@
     Example:
         atest -v <test> -- <custom_args1> <custom_args2>
 
+    Examples of passing options to the modules:
+        atest <test> -- --module-arg <module-name>:<option-name>:<option-value>
+        atest GtsPermissionTestCases -- --module-arg GtsPermissionTestCases:ignore-business-logic-failure:true
 
-                                                     2019-12-19
+    Examples of passing options to the runner type or class:
+        atest <test> -- --test-arg <test-class>:<option-name>:<option-value>
+        atest CtsVideoTestCases -- --test-arg com.android.tradefed.testtype.JarHosttest:collect-tests-only:true
+
+
+                                                     2021-04-22
 '''
diff --git a/atest/atest_configs.py b/atest/atest_configs.py
new file mode 100644
index 0000000..02088cc
--- /dev/null
+++ b/atest/atest_configs.py
@@ -0,0 +1,20 @@
+# Copyright 2020, The Android Open Source Project
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""
+Various global config settings used by atest.
+"""
+
+# For saving the args to global for sub class usage.
+GLOBAL_ARGS = None
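
A minimal sketch of how such a shared config module is typically used: the entry point assigns the parsed args once (as atest.py now does with atest_configs.GLOBAL_ARGS), and other modules read them through the module attribute instead of threading the namespace through every call. The consumer function below is hypothetical:

    # In the entry point (mirrors the assignment made in atest.py):
    #     atest_configs.GLOBAL_ARGS = _parse_args(sys.argv[1:])

    # In any other module:
    import atest_configs

    def test_config_select_enabled():
        """Read the --test-config-select flag from the stored namespace."""
        args = atest_configs.GLOBAL_ARGS
        return bool(args and getattr(args, 'test_config_select', False))
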
diff --git a/atest/atest_execution_info.py b/atest/atest_execution_info.py
index 013f308..e283b83 100644
--- a/atest/atest_execution_info.py
+++ b/atest/atest_execution_info.py
@@ -40,16 +40,18 @@
 _TEST_TIME_KEY = 'test_time'
 _TEST_DETAILS_KEY = 'details'
 _TEST_RESULT_NAME = 'test_result'
+_TEST_RESULT_LINK = 'test_result_link'
 _EXIT_CODE_ATTR = 'EXIT_CODE'
 _MAIN_MODULE_KEY = '__main__'
 _UUID_LEN = 30
-_RESULT_LEN = 35
+_RESULT_LEN = 20
+_RESULT_URL_LEN = 35
 _COMMAND_LEN = 50
 _LOGCAT_FMT = '{}/log/invocation_*/{}*logcat-on-failure*'
 
-_SUMMARY_MAP_TEMPLATE = {_STATUS_PASSED_KEY : 0,
-                         _STATUS_FAILED_KEY : 0,
-                         _STATUS_IGNORED_KEY : 0,}
+_SUMMARY_MAP_TEMPLATE = {_STATUS_PASSED_KEY: 0,
+                         _STATUS_FAILED_KEY: 0,
+                         _STATUS_IGNORED_KEY: 0}
 
 PREPARE_END_TIME = None
 
@@ -96,11 +98,19 @@
     target = '%s/20*_*_*' % root
     paths = glob.glob(target)
     paths.sort(reverse=True)
-    print('{:-^{uuid_len}} {:-^{result_len}} {:-^{command_len}}'
-          .format('uuid', 'result', 'command',
-                  uuid_len=_UUID_LEN,
-                  result_len=_RESULT_LEN,
-                  command_len=_COMMAND_LEN))
+    if has_url_results():
+        print('{:-^{uuid_len}} {:-^{result_len}} {:-^{result_url_len}} {:-^{command_len}}'
+              .format('uuid', 'result', 'result_url', 'command',
+                      uuid_len=_UUID_LEN,
+                      result_len=_RESULT_LEN,
+                      result_url_len=_RESULT_URL_LEN,
+                      command_len=_COMMAND_LEN))
+    else:
+        print('{:-^{uuid_len}} {:-^{result_len}} {:-^{command_len}}'
+              .format('uuid', 'result', 'command',
+                      uuid_len=_UUID_LEN,
+                      result_len=_RESULT_LEN,
+                      command_len=_COMMAND_LEN))
     for path in paths[0: int(history_arg)+1]:
         result_path = os.path.join(path, 'test_result')
         if os.path.isfile(result_path):
@@ -108,15 +118,28 @@
                 with open(result_path) as json_file:
                     result = json.load(json_file)
                     total_summary = result.get(_TOTAL_SUMMARY_KEY, {})
-                    summary_str = ', '.join([k+':'+str(v)
+                    summary_str = ', '.join([k[:1]+':'+str(v)
                                              for k, v in total_summary.items()])
-                    print('{:<{uuid_len}} {:<{result_len}} atest {:<{command_len}}'
-                          .format(os.path.basename(path),
-                                  summary_str,
-                                  result.get(_ARGS_KEY, ''),
-                                  uuid_len=_UUID_LEN,
-                                  result_len=_RESULT_LEN,
-                                  command_len=_COMMAND_LEN))
+                    test_result_url = result.get(_TEST_RESULT_LINK, '')
+                    if has_url_results():
+                        print('{:<{uuid_len}} {:<{result_len}} '
+                              '{:<{result_url_len}} atest {:<{command_len}}'
+                              .format(os.path.basename(path),
+                                      summary_str,
+                                      test_result_url,
+                                      result.get(_ARGS_KEY, ''),
+                                      uuid_len=_UUID_LEN,
+                                      result_len=_RESULT_LEN,
+                                      result_url_len=_RESULT_URL_LEN,
+                                      command_len=_COMMAND_LEN))
+                    else:
+                        print('{:<{uuid_len}} {:<{result_len}} atest {:<{command_len}}'
+                              .format(os.path.basename(path),
+                                      summary_str,
+                                      result.get(_ARGS_KEY, ''),
+                                      uuid_len=_UUID_LEN,
+                                      result_len=_RESULT_LEN,
+                                      command_len=_COMMAND_LEN))
             except ValueError:
                 pass
 
@@ -131,6 +154,9 @@
         with open(path) as json_file:
             result = json.load(json_file)
             print("\natest {}".format(result.get(_ARGS_KEY, '')))
+            test_result_url = result.get(_TEST_RESULT_LINK, '')
+            if test_result_url:
+                print('\nTest Result Link: {}'.format(test_result_url))
             print('\nTotal Summary:\n{}'.format(au.delimiter('-')))
             total_summary = result.get(_TOTAL_SUMMARY_KEY, {})
             print(', '.join([(k+':'+str(v))
@@ -179,7 +205,26 @@
             or args.history
             or args.info
             or args.version
-            or args.latest_result)
+            or args.latest_result
+            or args.history)
+
+
+def has_url_results():
+    """Get if contains url info."""
+    for root, _, files in os.walk(constants.ATEST_RESULT_ROOT):
+        for file in files:
+            if file != 'test_result':
+                continue
+            json_file = os.path.join(root, 'test_result')
+            with open(json_file) as result:
+                try:
+                    result = json.load(result)
+                    url_link = result.get(_TEST_RESULT_LINK, '')
+                    if url_link:
+                        return True
+                except ValueError:
+                    pass
+    return False
 
 
 class AtestExecutionInfo:
@@ -248,12 +293,11 @@
 
     def __exit__(self, exit_type, value, traceback):
         """Write execution information and close information file."""
-        if self.result_file:
+        if self.result_file and not has_non_test_options(self.args_ns):
             self.result_file.write(AtestExecutionInfo.
                                    _generate_execution_detail(self.args))
             self.result_file.close()
-            if not has_non_test_options(self.args_ns):
-                symlink_latest_result(self.work_dir)
+            symlink_latest_result(self.work_dir)
         main_module = sys.modules.get(_MAIN_MODULE_KEY)
         main_exit_code = getattr(main_module, _EXIT_CODE_ATTR,
                                  constants.EXIT_CODE_ERROR)
@@ -308,13 +352,15 @@
         """
         info_dict[_TEST_RUNNER_KEY] = {}
         for reporter in reporters:
+            if reporter.test_result_link:
+                info_dict[_TEST_RESULT_LINK] = reporter.test_result_link
             for test in reporter.all_test_results:
                 runner = info_dict[_TEST_RUNNER_KEY].setdefault(
                     test.runner_name, {})
                 group = runner.setdefault(test.group_name, {})
-                result_dict = {_TEST_NAME_KEY : test.test_name,
-                               _TEST_TIME_KEY : test.test_time,
-                               _TEST_DETAILS_KEY : test.details}
+                result_dict = {_TEST_NAME_KEY: test.test_name,
+                               _TEST_TIME_KEY: test.test_time,
+                               _TEST_DETAILS_KEY: test.details}
                 group.setdefault(test.status, []).append(result_dict)
 
         total_test_group_summary = _SUMMARY_MAP_TEMPLATE.copy()
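The new test_result_link handling widens the history table only when some stored result actually carries a URL. A standalone sketch of the scan that has_url_results() performs, assuming each run directory under a results root holds a 'test_result' JSON file; the root path below is an assumption for illustration.

    import json
    import os

    RESULT_ROOT = os.path.expanduser('~/.atest/results')  # assumed location
    TEST_RESULT_LINK = 'test_result_link'

    def any_result_has_url(root=RESULT_ROOT):
        """Return True as soon as one test_result JSON carries a URL link."""
        for dirpath, _, files in os.walk(root):
            if 'test_result' not in files:
                continue
            try:
                with open(os.path.join(dirpath, 'test_result')) as result_file:
                    if json.load(result_file).get(TEST_RESULT_LINK, ''):
                        return True
            except ValueError:
                pass  # Skip partially written or invalid JSON files.
        return False

    print(any_result_has_url())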
diff --git a/atest/atest_integration_tests.py b/atest/atest_integration_tests.py
index a0d43e7..f96847c 100755
--- a/atest/atest_integration_tests.py
+++ b/atest/atest_integration_tests.py
@@ -41,7 +41,7 @@
 _FAILED_LINE_LIMIT = 50
 _INTEGRATION_TESTS = 'INTEGRATION_TESTS'
 _EXIT_TEST_FAILED = 1
-
+_ALTERNATIVES = ('-dev', '-py2')
 
 class ATestIntegrationTest(unittest.TestCase):
     """ATest Integration Test Class."""
@@ -129,9 +129,14 @@
 
 if __name__ == '__main__':
     # TODO(b/129029189) Implement detail comparison check for dry-run mode.
-    ARGS = ' '.join(sys.argv[1:])
+    ARGS = sys.argv[1:]
     if ARGS:
-        ATestIntegrationTest.OPTIONS = ARGS
+        for exe in _ALTERNATIVES:
+            if exe in ARGS:
+                ARGS.remove(exe)
+                ATestIntegrationTest.EXECUTABLE += exe
+        ATestIntegrationTest.OPTIONS = ' '.join(ARGS)
+    print('Running tests with {}\n'.format(ATestIntegrationTest.EXECUTABLE))
     TEST_PLANS = os.path.join(os.path.dirname(__file__), _INTEGRATION_TESTS)
     try:
         LOG_PATH = os.path.join(create_test_run_dir(), _LOG_FILE)
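The argv handling above pops the '-dev'/'-py2' markers out of the option list and appends them to the executable name so the same test plans can exercise alternative atest binaries. A minimal sketch of that filtering, independent of the test class:

    import sys

    _ALTERNATIVES = ('-dev', '-py2')

    def split_executable_suffix(argv, executable='atest'):
        """Strip alternative-executable markers and build the binary name."""
        args = list(argv)
        for marker in _ALTERNATIVES:
            if marker in args:
                args.remove(marker)
                executable += marker
        return executable, ' '.join(args)

    exe, options = split_executable_suffix(sys.argv[1:])
    print('Running tests with {} {}'.format(exe, options))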
diff --git a/atest/atest_run_unittests.py b/atest/atest_run_unittests.py
index 74e77b7..e6269b4 100755
--- a/atest/atest_run_unittests.py
+++ b/atest/atest_run_unittests.py
@@ -16,8 +16,6 @@
 
 """Main entrypoint for all of atest's unittest."""
 
-# pylint: disable=line-too-long
-
 import logging
 import os
 import sys
@@ -25,6 +23,11 @@
 
 from importlib import import_module
 
+import atest_utils
+
+COVERAGE = 'coverage'
+RUN_COVERAGE = COVERAGE in sys.argv
+SHOW_MISSING = '--show-missing' in sys.argv
 # Setup logging to be silent so unittests can pass through TF.
 logging.disable(logging.ERROR)
 
@@ -53,25 +56,60 @@
 
     return testable_modules
 
-def main(_):
-    """Main unittest entry.
+def run_test_modules(test_modules):
+    """Main method of running unit tests.
 
     Args:
-        argv: A list of system arguments. (unused)
+        test_modules: A list of module names.
 
     Returns:
-        0 if success. None-zero if fails.
+        result: A unittest TestResult object of the run.
     """
-    test_modules = get_test_modules()
     for mod in test_modules:
         import_module(mod)
 
     loader = unittest.defaultTestLoader
     test_suite = loader.loadTestsFromNames(test_modules)
     runner = unittest.TextTestRunner(verbosity=2)
-    result = runner.run(test_suite)
-    sys.exit(not result.wasSuccessful())
+    return runner.run(test_suite)
+
+# pylint: disable=import-outside-toplevel
+def main(run_coverage=False, show_missing=False):
+    """Main unittest entry.
+
+    Args:
+        run_coverage: Boolean of whether to collect coverage data.
+        show_missing: Boolean of whether to show lines missing coverage.
+
+    Returns:
+        0 if success. Non-zero if fails.
+    """
+    if not all((run_coverage, atest_utils.has_python_module(COVERAGE))):
+        result = run_test_modules(get_test_modules())
+        if not result.wasSuccessful():
+            sys.exit(not result.wasSuccessful())
+        sys.exit(0)
+
+    from coverage import coverage
+    # cover_pylib=False excludes only the standard libs; therefore, these
+    # third-party libs must be omitted explicitly when creating the coverage
+    # object.
+    ignore_libs = ['*/__init__.py',
+                   '*dist-packages/*.py',
+                   '*site-packages/*.py']
+    cov = coverage(omit=ignore_libs)
+    cov.erase()
+    cov.start()
+    result = run_test_modules(get_test_modules())
+    if not result.wasSuccessful():
+        cov.erase()
+        sys.exit(not result.wasSuccessful())
+    cov.stop()
+    cov.save()
+    cov.report(show_missing=show_missing)
+    cov.html_report()
 
 
 if __name__ == '__main__':
-    main(sys.argv[1:])
+    if len(sys.argv) > 1:
+        main(RUN_COVERAGE, SHOW_MISSING)
+    else:
+        main()
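The rewritten entry point optionally wraps the unittest run in coverage collection. A minimal, self-contained sketch of that wrapping, assuming the third-party 'coverage' package is installed; the commented module name is hypothetical.

    import unittest

    from coverage import coverage  # pip3 install coverage

    def run_suite_with_coverage(test_modules, show_missing=False):
        """Run the named unittest modules under coverage and print a report."""
        cov = coverage(omit=['*/__init__.py', '*site-packages/*.py'])
        cov.erase()
        cov.start()
        suite = unittest.defaultTestLoader.loadTestsFromNames(test_modules)
        result = unittest.TextTestRunner(verbosity=2).run(suite)
        cov.stop()
        cov.save()
        cov.report(show_missing=show_missing)
        cov.html_report()
        return result.wasSuccessful()

    # Example (hypothetical module name):
    # run_suite_with_coverage(['atest_utils_unittest'], show_missing=True)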
diff --git a/atest/atest_tradefed.sh b/atest/atest_tradefed.sh
index c43d056..1b3afd4 100755
--- a/atest/atest_tradefed.sh
+++ b/atest/atest_tradefed.sh
@@ -35,7 +35,6 @@
           cts-tradefed.jar
           sts-tradefed.jar
           vts-tradefed.jar
-          vts10-tradefed.jar
           csuite-harness.jar
           tradefed-isolation.jar
           host-libprotobuf-java-full.jar
diff --git a/atest/atest_unittest.py b/atest/atest_unittest.py
index a56b78f..1798390 100755
--- a/atest/atest_unittest.py
+++ b/atest/atest_unittest.py
@@ -28,6 +28,7 @@
 from io import StringIO
 from unittest import mock
 
+# pylint: disable=wrong-import-order
 import atest
 import constants
 import module_info
@@ -93,6 +94,8 @@
                     atest._has_valid_test_mapping_args(parsed_args),
                     'Failed to validate: %s' % args)
 
+    @mock.patch.object(module_info.ModuleInfo, '_merge_soong_info')
+    @mock.patch.dict('os.environ', {constants.ANDROID_BUILD_TOP:'/'})
     @mock.patch('json.load', return_value={})
     @mock.patch('builtins.open', new_callable=mock.mock_open)
     @mock.patch('os.path.isfile', return_value=True)
@@ -100,7 +103,7 @@
     @mock.patch.object(module_info.ModuleInfo, 'get_module_info',)
     def test_print_module_info_from_module_name(self, mock_get_module_info,
                                                 _mock_has_colors, _isfile,
-                                                _open, _json):
+                                                _open, _json, _merge):
         """Test _print_module_info_from_module_name method."""
         mod_one_name = 'mod1'
         mod_one_path = ['src/path/mod1']
@@ -145,13 +148,15 @@
         # Check if no module_info, then nothing printed to screen.
         self.assertEqual(capture_output.getvalue(), null_output)
 
+    @mock.patch.object(module_info.ModuleInfo, '_merge_soong_info')
+    @mock.patch.dict('os.environ', {constants.ANDROID_BUILD_TOP:'/'})
     @mock.patch('json.load', return_value={})
     @mock.patch('builtins.open', new_callable=mock.mock_open)
     @mock.patch('os.path.isfile', return_value=True)
     @mock.patch('atest_utils._has_colors', return_value=True)
     @mock.patch.object(module_info.ModuleInfo, 'get_module_info',)
     def test_print_test_info(self, mock_get_module_info, _mock_has_colors,
-                             _isfile, _open, _json):
+                             _isfile, _open, _json, _merge):
         """Test _print_test_info method."""
         mod_one_name = 'mod1'
         mod_one = {constants.MODULE_NAME: mod_one_name,
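The two extra decorators add one more mocked argument to each test method; stacked mock.patch decorators are applied bottom-up, so the newly added patch object arrives last in the signature (and mock.patch.dict contributes no argument). A small sketch of that ordering with throwaway targets:

    import os
    import unittest
    from unittest import mock

    class DecoratorOrderTest(unittest.TestCase):
        """Show that the bottom-most patch maps to the first mock argument."""

        @mock.patch.dict('os.environ', {'ANDROID_BUILD_TOP': '/'})
        @mock.patch('os.path.isfile', return_value=True)
        @mock.patch('os.path.isdir', return_value=False)
        def test_order(self, mock_isdir, mock_isfile):
            # os.path.isdir is the bottom decorator, so it is the first argument.
            self.assertFalse(os.path.isdir('x'))
            self.assertTrue(os.path.isfile('y'))
            self.assertEqual('/', os.environ['ANDROID_BUILD_TOP'])

    if __name__ == '__main__':
        unittest.main()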
diff --git a/atest/atest_utils.py b/atest/atest_utils.py
index aefc020..05d72e5 100644
--- a/atest/atest_utils.py
+++ b/atest/atest_utils.py
@@ -18,10 +18,13 @@
 
 
 # pylint: disable=import-outside-toplevel
+# pylint: disable=too-many-lines
 
 from __future__ import print_function
 
+import fnmatch
 import hashlib
+import importlib
 import itertools
 import json
 import logging
@@ -31,7 +34,24 @@
 import shutil
 import subprocess
 import sys
+import sysconfig
+import time
+import zipfile
 
+import xml.etree.ElementTree as ET
+
+from distutils.util import strtobool
+
+# This is a workaround for b/144743252, where http.client failed to load
+# because googleapiclient was found before the built-in libs; enabling the
+# embedded launcher (b/135639220) has not been reliable and other issues
+# would arise.
+# The workaround repositions the built-in libs before other third-party libs
+# in PYTHONPATH (sys.path) to eliminate the failure to load http.client.
+sys.path.insert(0, os.path.dirname(sysconfig.get_paths()['purelib']))
+sys.path.insert(0, os.path.dirname(sysconfig.get_paths()['stdlib']))
+
+#pylint: disable=wrong-import-position
 import atest_decorator
 import atest_error
 import constants
@@ -39,20 +59,31 @@
 # This proto related module will be auto generated in build time.
 # pylint: disable=no-name-in-module
 # pylint: disable=import-error
-from tools.asuite.atest.tf_proto import test_record_pb2
-
+try:
+    from tools.asuite.atest.tf_proto import test_record_pb2
+except ImportError as err:
+    pass
 # b/147562331 only occurs when running atest in source code. We don't encourage
 # the users to manually "pip3 install protobuf", therefore when the exception
 # occurs, we don't collect data and the tab completion for args is silenced.
 try:
+    from metrics import metrics
     from metrics import metrics_base
     from metrics import metrics_utils
-except ModuleNotFoundError:
-    # This exception occurs only when invoking atest in source code.
-    print("You shouldn't see this message unless you ran 'atest-src'."
-          "To resolve the issue, please run:\n\t{}\n"
-          "and try again.".format('pip3 install protobuf'))
-    sys.exit(constants.IMPORT_FAILURE)
+except ImportError as err:
+    # TODO(b/182854938): remove this ImportError after refactoring the metrics dir.
+    try:
+        from asuite.metrics import metrics
+        from asuite.metrics import metrics_base
+        from asuite.metrics import metrics_utils
+    except ImportError as err:
+        # This exception occurs only when invoking atest in source code.
+        print("You shouldn't see this message unless you ran 'atest-src'."
+              "To resolve the issue, please run:\n\t{}\n"
+              "and try again.".format('pip3 install protobuf'))
+        print('Import error: %s' % err)
+        print('sys.path: %s' % sys.path)
+        sys.exit(constants.IMPORT_FAILURE)
 
 _BASH_RESET_CODE = '\033[0m\n'
 # Arbitrary number to limit stdout for failed runs in _run_limited_output.
@@ -66,7 +97,7 @@
 _BUILD_FAILURE = 'FAILED: '
 CMD_RESULT_PATH = os.path.join(os.environ.get(constants.ANDROID_BUILD_TOP,
                                               os.getcwd()),
-                               'tools/tradefederation/core/atest/test_data',
+                               'tools/asuite/atest/test_data',
                                'test_commands.json')
 BUILD_TOP_HASH = hashlib.md5(os.environ.get(constants.ANDROID_BUILD_TOP, '').
                              encode()).hexdigest()
@@ -84,7 +115,11 @@
     "| awk '{{print $1}}');"
     # Get the list of modified files from HEAD to previous $ahead generation.
     "git diff HEAD~$ahead --name-only")
+_ANDROID_BUILD_EXT = ('.bp', '.mk')
 
+# Set of special chars for various purposes.
+_REGEX_CHARS = {'[', '(', '{', '|', '\\', '*', '?', '+', '^'}
+_WILDCARD_CHARS = {'?', '*'}
 
 def get_build_cmd():
     """Compose build command with no-absolute path and flag "--make-mode".
@@ -120,6 +155,26 @@
     return capture_output
 
 
+def _capture_limited_output(full_log):
+    """Return the limited error message from capture_failed_section.
+
+    Args:
+        full_log: List of strings representing full output of build.
+
+    Returns:
+        output: List of strings that are build errors.
+    """
+    # Parse out the build error to output.
+    output = _capture_fail_section(full_log)
+    if not output:
+        output = full_log
+    if len(output) >= _FAILED_OUTPUT_LINE_LIMIT:
+        output = output[-_FAILED_OUTPUT_LINE_LIMIT:]
+    output = 'Output (may be trimmed):\n%s' % ''.join(output)
+    return output
+
+
+# TODO: b/187122993 refine subprocess with 'with-statement' in fixit week.
 def _run_limited_output(cmd, env_vars=None):
     """Runs a given command and streams the output on a single line in stdout.
 
@@ -159,16 +214,51 @@
     # Wait for the Popen to finish completely before checking the returncode.
     proc.wait()
     if proc.returncode != 0:
-        # Parse out the build error to output.
-        output = _capture_fail_section(full_output)
+        # get error log from "OUT_DIR/error.log"
+        error_log_file = os.path.join(get_build_out_dir(), "error.log")
+        output = []
+        if os.path.isfile(error_log_file):
+            if os.stat(error_log_file).st_size > 0:
+                with open(error_log_file) as f:
+                    output = f.read()
         if not output:
-            output = full_output
-        if len(output) >= _FAILED_OUTPUT_LINE_LIMIT:
-            output = output[-_FAILED_OUTPUT_LINE_LIMIT:]
-        output = 'Output (may be trimmed):\n%s' % ''.join(output)
+            output = _capture_limited_output(full_output)
         raise subprocess.CalledProcessError(proc.returncode, cmd, output)
 
 
+def get_build_out_dir():
+    """Get android build out directory.
+
+    Returns:
+        String of the out directory.
+    """
+    build_top = os.environ.get(constants.ANDROID_BUILD_TOP)
+    # Get the out folder if user specified $OUT_DIR
+    custom_out_dir = os.environ.get(constants.ANDROID_OUT_DIR)
+    custom_out_dir_common_base = os.environ.get(
+        constants.ANDROID_OUT_DIR_COMMON_BASE)
+    user_out_dir = None
+    if custom_out_dir:
+        if os.path.isabs(custom_out_dir):
+            user_out_dir = custom_out_dir
+        else:
+            user_out_dir = os.path.join(build_top, custom_out_dir)
+    elif custom_out_dir_common_base:
+        # When OUT_DIR_COMMON_BASE is set, the output directory for each
+        # separate source tree is named after the directory holding the
+        # source tree.
+        build_top_basename = os.path.basename(build_top)
+        if os.path.isabs(custom_out_dir_common_base):
+            user_out_dir = os.path.join(custom_out_dir_common_base,
+                                        build_top_basename)
+        else:
+            user_out_dir = os.path.join(build_top, custom_out_dir_common_base,
+                                        build_top_basename)
+    if user_out_dir:
+        return user_out_dir
+    return os.path.join(build_top, "out")
+
+
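A usage sketch that mirrors the resolution order of the new get_build_out_dir() helper, assuming constants.ANDROID_OUT_DIR and constants.ANDROID_OUT_DIR_COMMON_BASE map to the OUT_DIR and OUT_DIR_COMMON_BASE environment variables; the sample paths are illustrative only.

    import os

    def resolve_out_dir(build_top, out_dir=None, out_dir_common_base=None):
        """Sketch of the out-directory resolution, not the actual helper."""
        if out_dir:
            # A relative OUT_DIR is rooted at the source tree.
            if os.path.isabs(out_dir):
                return out_dir
            return os.path.join(build_top, out_dir)
        if out_dir_common_base:
            # OUT_DIR_COMMON_BASE holds one output dir per source tree basename.
            base = os.path.basename(build_top)
            if os.path.isabs(out_dir_common_base):
                return os.path.join(out_dir_common_base, base)
            return os.path.join(build_top, out_dir_common_base, base)
        return os.path.join(build_top, 'out')

    print(resolve_out_dir('/src/aosp', out_dir='out-custom'))        # /src/aosp/out-custom
    print(resolve_out_dir('/src/aosp', out_dir_common_base='/mnt'))  # /mnt/aosp
    print(resolve_out_dir('/src/aosp'))                              # /src/aosp/out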
 def build(build_targets, verbose=False, env_vars=None):
     """Shell out and make build_targets.
 
@@ -204,6 +294,9 @@
         return True
     except subprocess.CalledProcessError as err:
         logging.error('Error building: %s', build_targets)
+        print(constants.REBUILD_MODULE_INFO_MSG.format(
+            colorize(constants.REBUILD_MODULE_INFO_FLAG,
+                     constants.RED)))
         if err.output:
             logging.error(err.output)
         return False
@@ -224,6 +317,7 @@
     return False
 
 
+# pylint: disable=unused-argument
 def get_result_server_args(for_test_mapping=False):
     """Return list of args for communication with result server.
 
@@ -231,15 +325,8 @@
         for_test_mapping: True if the test run is for Test Mapping to include
             additional reporting args. Default is False.
     """
-    # TODO (b/147644460) Temporarily disable Sponge V1 since it will be turned
-    # down.
-    if _can_upload_to_result_server():
-        if for_test_mapping:
-            return (constants.RESULT_SERVER_ARGS +
-                    constants.TEST_MAPPING_RESULT_SERVER_ARGS)
-        return constants.RESULT_SERVER_ARGS
-    return []
-
+    # Customize test mapping argument here if needed.
+    return constants.RESULT_SERVER_ARGS
 
 def sort_and_group(iterable, key):
     """Sort and group helper function."""
@@ -254,6 +341,7 @@
     which means the test value is a test group name in TEST_MAPPING file, e.g.,
     `:postsubmit`.
 
+    If --host-unit-test-only is applied, it is not a test mapping run.
     If any test mapping options is specified, the atest command must also be
     set to run tests in test mapping files.
 
@@ -265,10 +353,12 @@
         otherwise.
     """
     return (
-        args.test_mapping or
+        not args.host_unit_test_only and
+        (args.test_mapping or
         args.include_subdirs or
         not args.tests or
-        (len(args.tests) == 1 and args.tests[0][0] == ':'))
+        (len(args.tests) == 1 and args.tests[0][0] == ':')))
+
 
 @atest_decorator.static_var("cached_has_colors", {})
 def _has_colors(stream):
@@ -418,6 +508,8 @@
         with open(result_path) as json_file:
             full_result_content = json.load(json_file)
     former_test_cmds = full_result_content.get(input_test, [])
+    test_cmds = _normalize(test_cmds)
+    former_test_cmds = _normalize(former_test_cmds)
     if not _are_identical_cmds(test_cmds, former_test_cmds):
         if do_verification:
             raise atest_error.DryRunVerificationError(
@@ -428,18 +520,10 @@
             # are willing to update the result.
             print('Former cmds = %s' % former_test_cmds)
             print('Current cmds = %s' % test_cmds)
-            try:
-                from distutils import util
-                if not util.strtobool(
-                        input('Do you want to update former result '
-                              'with the latest one?(Y/n)')):
-                    print('SKIP updating result!!!')
-                    return
-            except ValueError:
-                # Default action is updating the command result of the
-                # input_test. If the user input is unrecognizable telling yes
-                # or no, "Y" is implicitly applied.
-                pass
+            if not prompt_with_yn_result('Do you want to update former result '
+                                         'to the latest one?', True):
+                print('SKIP updating result!!!')
+                return
     else:
         # If current commands are the same as the formers, no need to update
         # result.
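A condensed sketch of the dry-run comparison flow introduced here: normalize both command lists by dropping run-specific flags (the ones handled by _normalize() further below), compare the sorted results, and fall back to a yes/no prompt with a default answer, as prompt_with_yn_result() does. The helper names are illustrative only.

    from distutils.util import strtobool

    _IGNORED_PREFIXES = ('--atest-log-file-path', 'LD_LIBRARY_PATH=',
                         '--proto-output-file=')

    def normalize(cmd_list):
        """Split a one-element command list and drop run-specific arguments."""
        tokens = ' '.join(cmd_list).split()
        return [t for t in tokens if not t.startswith(_IGNORED_PREFIXES)]

    def commands_match(current, former):
        """Compare two normalized command lists regardless of argument order."""
        return sorted(normalize(current)) == sorted(normalize(former))

    def ask_update(msg, default=True):
        """Prompt for y/n; return the default on odd input or Ctrl-C."""
        suffix = ' [Y/n]: ' if default else ' [y/N]: '
        try:
            return bool(strtobool(input(msg + suffix)))
        except (ValueError, KeyboardInterrupt):
            return default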
@@ -449,12 +533,37 @@
         json.dump(full_result_content, outfile, indent=0)
         print('Save result mapping to %s' % result_path)
 
-
-def _are_identical_cmds(current_cmds, former_cmds):
-    """Tell two commands are identical. Note that '--atest-log-file-path' is not
+def _normalize(cmd_list):
+    """Method that normalize commands. Note that '--atest-log-file-path' is not
     considered a critical argument, therefore, it will be removed during
     the comparison. Also, atest can be ran in any place, so verifying relative
-    path is regardless as well.
+    path, LD_LIBRARY_PATH, and --proto-output-file is regardless as well.
+
+    Args:
+        cmd_list: A list with one element. E.g. ['cmd arg1 arg2 True']
+
+    Returns:
+        A list with elements. E.g. ['cmd', 'arg1', 'arg2', 'True']
+    """
+    _cmd = ' '.join(cmd_list).split()
+    for cmd in _cmd:
+        if cmd.startswith('--atest-log-file-path'):
+            _cmd.remove(cmd)
+            continue
+        if cmd.startswith('LD_LIBRARY_PATH='):
+            _cmd.remove(cmd)
+            continue
+        if cmd.startswith('--proto-output-file='):
+            _cmd.remove(cmd)
+            continue
+        if _BUILD_CMD in cmd:
+            _cmd.remove(cmd)
+            _cmd.append(os.path.join('./', _BUILD_CMD))
+            continue
+    return _cmd
+
+def _are_identical_cmds(current_cmds, former_cmds):
+    """Tell two commands are identical.
 
     Args:
         current_cmds: A list of strings for running input tests.
@@ -463,32 +572,10 @@
     Returns:
         True if both commands are identical, False otherwise.
     """
-    def _normalize(cmd_list):
-        """Method that normalize commands.
-
-        Args:
-            cmd_list: A list with one element. E.g. ['cmd arg1 arg2 True']
-
-        Returns:
-            A list with elements. E.g. ['cmd', 'arg1', 'arg2', 'True']
-        """
-        _cmd = ''.join(cmd_list).split()
-        for cmd in _cmd:
-            if cmd.startswith('--atest-log-file-path'):
-                _cmd.remove(cmd)
-                continue
-            if _BUILD_CMD in cmd:
-                _cmd.remove(cmd)
-                _cmd.append(os.path.join('./', _BUILD_CMD))
-                continue
-        return _cmd
-
-    _current_cmds = _normalize(current_cmds)
-    _former_cmds = _normalize(former_cmds)
     # Always sort cmd list to make it comparable.
-    _current_cmds.sort()
-    _former_cmds.sort()
-    return _current_cmds == _former_cmds
+    current_cmds.sort()
+    former_cmds.sort()
+    return current_cmds == former_cmds
 
 def _get_hashed_file_name(main_file_name):
     """Convert the input string to a md5-hashed string. If file_extension is
@@ -505,7 +592,99 @@
     hashed_name = hashed_fn.hexdigest()
     return hashed_name + '.cache'
 
-def get_test_info_cache_path(test_reference, cache_root=TEST_INFO_CACHE_ROOT):
+def md5sum(filename):
+    """Generate MD5 checksum of a file.
+
+    Args:
+        filename: A string of a filename.
+
+    Returns:
+        A string of hashed MD5 checksum.
+    """
+    if not os.path.isfile(filename):
+        return ""
+    with open(filename, 'rb') as target:
+        content = target.read()
+    return hashlib.md5(content).hexdigest()
+
+def check_md5(check_file, missing_ok=False):
+    """Method equivalent to 'md5sum --check /file/to/check'.
+
+    Args:
+        check_file: A string of the file path that stores filenames and their
+                   md5 checksums.
+        missing_ok: A boolean that treats a missing check_file as OK. Using
+                    missing_ok=True allows skipping the md5 check, especially
+                    for the initial run when the check_file has not been
+                    generated yet. Using missing_ok=False ensures the
+                    consistency of files and guarantees the process completed
+                    successfully.
+
+    Returns:
+        When missing_ok is True (soft check):
+          - True if the checksum is consistent with the actual MD5, even the
+            check_file is missing or not a valid JSON.
+          - False when the checksum is inconsistent with the actual MD5.
+        When missing_ok is False (ensure the process completed properly):
+          - True if the checksum is consistent with the actual MD5.
+          - False otherwise.
+    """
+    if not os.path.isfile(check_file):
+        if not missing_ok:
+            logging.warning(
+                'Unable to verify: %s not found.', check_file)
+        return missing_ok
+    if not is_valid_json_file(check_file):
+        logging.warning(
+            'Unable to verify: %s invalid JSON format.', check_file)
+        return missing_ok
+    with open(check_file, 'r+') as _file:
+        content = json.load(_file)
+        for filename, md5 in content.items():
+            if md5sum(filename) != md5:
+                logging.debug('%s has been altered.', filename)
+                return False
+    return True
+
+def save_md5(filenames, save_file):
+    """Method equivalent to 'md5sum file1 file2 > /file/to/check'
+
+    Args:
+        filenames: A list of filenames.
+        save_file: Filename for storing files and their md5 checksums.
+    """
+    if os.path.isfile(save_file):
+        os.remove(save_file)
+    data = {}
+    for name in filenames:
+        if not os.path.isfile(name):
+            logging.warning('%s is not a file.', name)
+        data.update({name: md5sum(name)})
+    with open(save_file, 'w+') as _file:
+        json.dump(data, _file)
+
+def get_cache_root():
+    """Get the root path dir for cache.
+
+    Use branch and target information as cache_root.
+    The path will look like ~/.atest/info_cache/$hash(branch+target)
+
+    Returns:
+        A string of the path of the root dir of cache.
+    """
+    manifest_branch = get_manifest_branch()
+    if not manifest_branch:
+        manifest_branch = os.environ.get(
+            constants.ANDROID_BUILD_TOP, constants.ANDROID_BUILD_TOP)
+    # target
+    build_target = os.path.basename(
+        os.environ.get(constants.ANDROID_PRODUCT_OUT,
+                       constants.ANDROID_PRODUCT_OUT))
+    branch_target_hash = hashlib.md5(
+        (constants.MODE + manifest_branch + build_target).encode()).hexdigest()
+    return os.path.join(os.path.expanduser('~'), '.atest', 'info_cache',
+                        branch_target_hash[:8])
+
+def get_test_info_cache_path(test_reference, cache_root=None):
     """Get the cache path of the desired test_infos.
 
     Args:
@@ -515,11 +694,12 @@
     Returns:
         A string of the path of test_info cache.
     """
-    return os.path.join(cache_root,
-                        _get_hashed_file_name(test_reference))
+    if not cache_root:
+        cache_root = get_cache_root()
+    return os.path.join(cache_root, _get_hashed_file_name(test_reference))
 
 def update_test_info_cache(test_reference, test_infos,
-                           cache_root=TEST_INFO_CACHE_ROOT):
+                           cache_root=None):
     """Update cache content which stores a set of test_info objects through
        pickle module, each test_reference will be saved as a cache file.
 
@@ -528,6 +708,8 @@
         test_infos: A set of TestInfos.
         cache_root: Folder path for saving caches.
     """
+    if not cache_root:
+        cache_root = get_cache_root()
     if not os.path.isdir(cache_root):
         os.makedirs(cache_root)
     cache_path = get_test_info_cache_path(test_reference, cache_root)
@@ -544,7 +726,7 @@
             constants.ACCESS_CACHE_FAILURE)
 
 
-def load_test_info_cache(test_reference, cache_root=TEST_INFO_CACHE_ROOT):
+def load_test_info_cache(test_reference, cache_root=None):
     """Load cache by test_reference to a set of test_infos object.
 
     Args:
@@ -554,6 +736,8 @@
     Returns:
         A list of TestInfo namedtuple if cache found, else None.
     """
+    if not cache_root:
+        cache_root = get_cache_root()
     cache_file = get_test_info_cache_path(test_reference, cache_root)
     if os.path.isfile(cache_file):
         logging.debug('Loading cache %s.', cache_file)
@@ -573,13 +757,15 @@
                 constants.ACCESS_CACHE_FAILURE)
     return None
 
-def clean_test_info_caches(tests, cache_root=TEST_INFO_CACHE_ROOT):
+def clean_test_info_caches(tests, cache_root=None):
     """Clean caches of input tests.
 
     Args:
         tests: A list of test references.
         cache_root: Folder path for finding caches.
     """
+    if not cache_root:
+        cache_root = get_cache_root()
     for test in tests:
         cache_file = get_test_info_cache_path(test, cache_root)
         if os.path.isfile(cache_file):
@@ -644,3 +830,369 @@
         A string of delimiter.
     """
     return prenl * '\n' + char * length + postnl * '\n'
+
+def find_files(path, file_name=constants.TEST_MAPPING):
+    """Find all files with given name under the given path.
+
+    Args:
+        path: A string of path in source.
+        file_name: The file name pattern for finding matched files.
+
+    Returns:
+        A list of paths of the files with the matching name under the given
+        path.
+    """
+    match_files = []
+    for root, _, filenames in os.walk(path):
+        for filename in fnmatch.filter(filenames, file_name):
+            match_files.append(os.path.join(root, filename))
+    return match_files
+
+def extract_zip_text(zip_path):
+    """Extract the text files content for input zip file.
+
+    Args:
+        zip_path: The file path of zip.
+
+    Returns:
+        A string of the text content in the input zip file.
+    """
+    content = ''
+    try:
+        with zipfile.ZipFile(zip_path) as zip_file:
+            for filename in zip_file.namelist():
+                if os.path.isdir(filename):
+                    continue
+                # Force change line if multiple text files in zip
+                content = content + '\n'
+                # read the file
+                with zip_file.open(filename) as extract_file:
+                    for line in extract_file:
+                        if matched_tf_error_log(line.decode()):
+                            content = content + line.decode()
+    except zipfile.BadZipfile as err:
+        logging.debug('Exception raised: %s', err)
+    return content
+
+def matched_tf_error_log(content):
+    """Check if the input content matched tradefed log pattern.
+    The format will look like this.
+    05-25 17:37:04 W/XXXXXX
+    05-25 17:37:04 E/XXXXXX
+
+    Args:
+        content: Log string.
+
+    Returns:
+        True if the content matches the regular expression for tradefed error or
+        warning log.
+    """
+    reg = ('^((0[1-9])|(1[0-2]))-((0[1-9])|([12][0-9])|(3[0-1])) '
+           '(([0-1][0-9])|([2][0-3])):([0-5][0-9]):([0-5][0-9]) (E|W/)')
+    if re.search(reg, content):
+        return True
+    return False
+
+def has_valid_cert():
+    """Check whether the certificate is valid.
+
+    Returns: True if the cert is valid.
+    """
+    if not constants.CERT_STATUS_CMD:
+        return False
+    try:
+        return (not subprocess.check_call(constants.CERT_STATUS_CMD,
+                                          stdout=subprocess.DEVNULL,
+                                          stderr=subprocess.DEVNULL))
+    except subprocess.CalledProcessError:
+        return False
+
+# pylint: disable=too-many-locals
+def get_flakes(branch='',
+               target='',
+               test_name='',
+               test_module='',
+               test_method=''):
+    """Get flake information.
+
+    Args:
+        branch: A string of branch name.
+        target: A string of target.
+        test_name: A string of test suite name.
+        test_module: A string of test module.
+        test_method: A string of test method.
+
+    Returns:
+        A dictionary of flake info. None if no flakes service exists.
+    """
+    if not branch:
+        branch = constants.FLAKE_BRANCH
+    if not target:
+        target = constants.FLAKE_TARGET
+    if not test_name:
+        test_name = constants.FLAKE_TEST_NAME
+    # Currently the flake information is limited to test-mapping tests,
+    # which only run on cuttlefish (x86) devices.
+    # TODO: extend supporting other devices
+    if test_module:
+        test_module = 'x86 {}'.format(test_module)
+    flake_service = os.path.join(constants.FLAKE_SERVICE_PATH,
+                                 constants.FLAKE_FILE)
+    if not os.path.exists(flake_service):
+        logging.debug('Get flakes: Flake service path does not exist.')
+        # Send (3, 0) to represent no flakes info because the service does not
+        # exist.
+        metrics.LocalDetectEvent(
+            detect_type=constants.DETECT_TYPE_NO_FLAKE,
+            result=0)
+        return None
+    if not has_valid_cert():
+        logging.debug('Get flakes: No valid cert.')
+        # Send (3, 1) to represent no flakes info because there is no valid cert.
+        metrics.LocalDetectEvent(
+            detect_type=constants.DETECT_TYPE_NO_FLAKE,
+            result=1)
+        return None
+    flake_info = {}
+    start = time.time()
+    try:
+        shutil.copy2(flake_service, constants.FLAKE_TMP_PATH)
+        tmp_service = os.path.join(constants.FLAKE_TMP_PATH,
+                                   constants.FLAKE_FILE)
+        os.chmod(tmp_service, 0o0755)
+        cmd = [tmp_service, branch, target, test_name, test_module, test_method]
+        logging.debug('Executing: %s', ' '.join(cmd))
+        output = subprocess.check_output(cmd).decode()
+        percent_template = "{}:".format(constants.FLAKE_PERCENT)
+        postsubmit_template = "{}:".format(constants.FLAKE_POSTSUBMIT)
+        for line in output.splitlines():
+            if line.startswith(percent_template):
+                flake_info[constants.FLAKE_PERCENT] = line.replace(
+                    percent_template, '')
+            if line.startswith(postsubmit_template):
+                flake_info[constants.FLAKE_POSTSUBMIT] = line.replace(
+                    postsubmit_template, '')
+    # pylint: disable=broad-except
+    except Exception as e:
+        logging.debug('Exception:%s', e)
+        return None
+    # Send (4, time) to represent that flakes info exists and the time it took.
+    duration = round(time.time()-start)
+    logging.debug('Took %ss to get flakes info', duration)
+    metrics.LocalDetectEvent(
+        detect_type=constants.DETECT_TYPE_HAS_FLAKE,
+        result=duration)
+    return flake_info
+
+def read_test_record(path):
+    """A Helper to read test record proto.
+
+    Args:
+        path: The proto file path.
+
+    Returns:
+        The test_record proto instance.
+    """
+    with open(path, 'rb') as proto_file:
+        msg = test_record_pb2.TestRecord()
+        msg.ParseFromString(proto_file.read())
+    return msg
+
+def has_python_module(module_name):
+    """Detect if the module can be loaded without importing it in real.
+
+    Args:
+        module_name: A string of the module name to check.
+
+    Returns:
+        True if found, False otherwise.
+    """
+    return bool(importlib.util.find_spec(module_name))
+
+def is_valid_json_file(path):
+    """Detect if input path exist and content is valid.
+
+    Args:
+        path: The json file path.
+
+    Returns:
+        True if the file exists and its content is valid, False otherwise.
+    """
+    if isinstance(path, bytes):
+        path = path.decode('utf-8')
+    try:
+        if os.path.isfile(path):
+            with open(path) as json_file:
+                json.load(json_file)
+            return True
+        logging.warning('%s: File not found.', path)
+    except json.JSONDecodeError:
+        logging.warning('Exception happened while loading %s.', path)
+    return False
+
+def get_manifest_branch():
+    """Get the manifest branch via repo info command.
+
+    Returns:
+        The manifest branch; None if the environment variable
+        ANDROID_BUILD_TOP is not set or the 'repo info' command fails.
+    """
+    build_top = os.getenv(constants.ANDROID_BUILD_TOP, None)
+    if not build_top:
+        return None
+    try:
+        # The repo command needs the default "http" lib; adding a non-default
+        # lib might cause a repo command execution error.
+        splitter = ':'
+        env_vars = os.environ.copy()
+        org_python_path = env_vars['PYTHONPATH'].split(splitter)
+        default_python_path = [p for p in org_python_path
+                               if not p.startswith('/tmp/Soong.python_')]
+        env_vars['PYTHONPATH'] = splitter.join(default_python_path)
+        output = subprocess.check_output(
+            ['repo', 'info', '-o', constants.ASUITE_REPO_PROJECT_NAME],
+            env=env_vars,
+            cwd=build_top,
+            universal_newlines=True)
+        branch_re = re.compile(r'Manifest branch:\s*(?P<branch>.*)')
+        match = branch_re.match(output)
+        if match:
+            return match.group('branch')
+        logging.warning('Unable to detect branch name through:\n %s', output)
+    except subprocess.CalledProcessError:
+        logging.warning('Exception happened while getting branch')
+    return None
+
+def get_build_target():
+    """Get the build target form system environment TARGET_PRODUCT."""
+    return os.getenv(constants.ANDROID_TARGET_PRODUCT, None)
+
+def parse_mainline_modules(test):
+    """Parse test reference into test and mainline modules.
+
+    Args:
+        test: A string of the test reference.
+
+    Returns:
+        A string of test without mainline modules,
+        A string of mainline modules.
+    """
+    result = constants.TEST_WITH_MAINLINE_MODULES_RE.match(test)
+    if not result:
+        return test, ""
+    test_wo_mainline_modules = result.group('test')
+    mainline_modules = result.group('mainline_modules')
+    return test_wo_mainline_modules, mainline_modules
+
+def has_wildcard(test_name):
+    """ Tell whether the test_name(either a list or string) contains wildcard
+    symbols.
+
+    Args:
+        test_name: A list or a str.
+
+    Return:
+        True if test_name contains wildcard, False otherwise.
+    """
+    if isinstance(test_name, str):
+        return any(char in test_name for char in _WILDCARD_CHARS)
+    if isinstance(test_name, list):
+        for name in test_name:
+            if has_wildcard(name):
+                return True
+    return False
+
+def is_build_file(path):
+    """ If input file is one of an android build file.
+
+    Args:
+        path: A string of file path.
+
+    Return:
+        True if path is android build file, False otherwise.
+    """
+    return bool(os.path.splitext(path)[-1] in _ANDROID_BUILD_EXT)
+
+def quote(input_str):
+    """ If the input string -- especially in custom args -- contains shell-aware
+    characters, wrap it in a pair of single quotes.
+
+    e.g. unit(test|testing|testing) -> 'unit(test|testing|testing)'
+
+    Args:
+        input_str: A string from user input.
+
+    Returns: A string with single quotes if regex chars were detected.
+    """
+    if has_chars(input_str, _REGEX_CHARS):
+        return "\'" + input_str + "\'"
+    return input_str
+
+def has_chars(input_str, chars):
+    """ Check if the input string contains one of the designated characters.
+
+    Args:
+        input_str: A string from user input.
+        chars: An iterable object.
+
+    Returns:
+        True if the input string contains one of the special chars.
+    """
+    for char in chars:
+        if char in input_str:
+            return True
+    return False
+
+def prompt_with_yn_result(msg, default=True):
+    """Prompt message and get yes or no result.
+
+    Args:
+        msg: The question you want to ask.
+        default: Boolean of the default answer, True for Yes or False for No.
+    Returns:
+        The user's answer; the default value on KeyboardInterrupt or ValueError.
+    """
+    suffix = '[Y/n]: ' if default else '[y/N]: '
+    try:
+        return strtobool(input(msg+suffix))
+    except (ValueError, KeyboardInterrupt):
+        return default
+
+def get_android_junit_config_filters(test_config):
+    """Get the dictionary of a input config for junit config's filters
+
+    Args:
+        test_config: The path of the test config.
+    Returns:
+        A dictionary including all the filters in the input config.
+    """
+    filter_dict = {}
+    xml_root = ET.parse(test_config).getroot()
+    option_tags = xml_root.findall('.//option')
+    for tag in option_tags:
+        name = tag.attrib['name'].strip()
+        if name in constants.SUPPORTED_FILTERS:
+            filter_values = filter_dict.get(name, [])
+            value = tag.attrib['value'].strip()
+            filter_values.append(value)
+            filter_dict.update({name: filter_values})
+    return filter_dict
+
+def get_config_parameter(test_config):
+    """Get all the parameter values for the input config
+
+    Args:
+        test_config: The path of the test config.
+    Returns:
+        A set including all the parameters of the input config.
+    """
+    parameters = set()
+    xml_root = ET.parse(test_config).getroot()
+    option_tags = xml_root.findall('.//option')
+    for tag in option_tags:
+        name = tag.attrib['name'].strip()
+        if name == constants.CONFIG_DESCRIPTOR:
+            key = tag.attrib['key'].strip()
+            if key == constants.PARAMETER_KEY:
+                value = tag.attrib['value'].strip()
+                parameters.add(value)
+    return parameters
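Both new config readers walk every <option> tag of a TradeFed XML config with ElementTree. A self-contained sketch of the same traversal on an in-memory config; the option and parameter names below are made up for illustration.

    import xml.etree.ElementTree as ET

    SAMPLE_CONFIG = '''
    <configuration>
        <option name="include-annotation" value="include1" />
        <option name="include-annotation" value="include2" />
        <option name="config-descriptor:metadata" key="parameter" value="instant_app" />
    </configuration>
    '''

    def collect_options(xml_text):
        """Group every <option> value under its name attribute."""
        options = {}
        for tag in ET.fromstring(xml_text).findall('.//option'):
            options.setdefault(tag.attrib['name'].strip(), []).append(
                tag.attrib.get('value', '').strip())
        return options

    print(collect_options(SAMPLE_CONFIG))
    # {'include-annotation': ['include1', 'include2'],
    #  'config-descriptor:metadata': ['instant_app']}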
diff --git a/atest/atest_utils_unittest.py b/atest/atest_utils_unittest.py
index cfbcad6..35082c6 100755
--- a/atest/atest_utils_unittest.py
+++ b/atest/atest_utils_unittest.py
@@ -32,6 +32,7 @@
 import atest_utils
 import constants
 import unittest_utils
+import unittest_constants
 
 from test_finders import test_info
 
@@ -50,6 +51,15 @@
                                  TEST_SUITE_A, TEST_MODULE_CLASS_A,
                                  TEST_INSTALL_LOC_A)
 TEST_INFO_A.test_finder = TEST_FINDER_A
+TEST_ZIP_DATA_DIR = 'zip_files'
+TEST_SINGLE_ZIP_NAME = 'single_file.zip'
+TEST_MULTI_ZIP_NAME = 'multi_file.zip'
+
+REPO_INFO_OUTPUT = '''Manifest branch: test_branch
+Manifest merge branch: refs/heads/test_branch
+Manifest groups: all,-notdefault
+----------------------------
+'''
 
 #pylint: disable=protected-access
 class AtestUtilsUnittests(unittest.TestCase):
@@ -82,6 +92,7 @@
             for attr in tm_option_attributes:
                 setattr(args, attr, attr == attr_to_test)
             args.tests = []
+            args.host_unit_test_only = False
             self.assertTrue(
                 atest_utils.is_test_mapping(args),
                 'Failed to validate option %s' % attr_to_test)
@@ -89,19 +100,29 @@
         args = mock.Mock()
         for attr in tm_option_attributes:
             setattr(args, attr, False)
+        args.tests = []
+        args.host_unit_test_only = True
+        self.assertFalse(atest_utils.is_test_mapping(args))
+
+        args = mock.Mock()
+        for attr in tm_option_attributes:
+            setattr(args, attr, False)
         args.tests = [':group_name']
+        args.host_unit_test_only = False
         self.assertTrue(atest_utils.is_test_mapping(args))
 
         args = mock.Mock()
         for attr in tm_option_attributes:
             setattr(args, attr, False)
         args.tests = [':test1', 'test2']
+        args.host_unit_test_only = False
         self.assertFalse(atest_utils.is_test_mapping(args))
 
         args = mock.Mock()
         for attr in tm_option_attributes:
             setattr(args, attr, False)
         args.tests = ['test2']
+        args.host_unit_test_only = False
         self.assertFalse(atest_utils.is_test_mapping(args))
 
     @mock.patch('curses.tigetnum')
@@ -411,5 +432,221 @@
         """Test method delimiter"""
         self.assertEqual('\n===\n\n', atest_utils.delimiter('=', 3, 1, 2))
 
+    def test_has_python_module(self):
+        """Test method has_python_module"""
+        self.assertFalse(atest_utils.has_python_module('M_M'))
+        self.assertTrue(atest_utils.has_python_module('os'))
+
+    @mock.patch.object(atest_utils, 'matched_tf_error_log', return_value=True)
+    def test_read_zip_single_text(self, _matched):
+        """Test method extract_zip_text include only one text file."""
+        zip_path = os.path.join(unittest_constants.TEST_DATA_DIR,
+                                TEST_ZIP_DATA_DIR, TEST_SINGLE_ZIP_NAME)
+        expect_content = '\nfile1_line1\nfile1_line2\n'
+        self.assertEqual(expect_content, atest_utils.extract_zip_text(zip_path))
+
+    @mock.patch.object(atest_utils, 'matched_tf_error_log', return_value=True)
+    def test_read_zip_multi_text(self, _matched):
+        """Test method extract_zip_text include multiple text files."""
+        zip_path = os.path.join(unittest_constants.TEST_DATA_DIR,
+                                TEST_ZIP_DATA_DIR, TEST_MULTI_ZIP_NAME)
+        expect_content = ('\nfile1_line1\nfile1_line2\n\nfile2_line1\n'
+                          'file2_line2\n')
+        self.assertEqual(expect_content, atest_utils.extract_zip_text(zip_path))
+
+    def test_matched_tf_error_log(self):
+        """Test method extract_zip_text include multiple text files."""
+        matched_content = '05-25 17:37:04 E/XXXXX YYYYY'
+        not_matched_content = '05-25 17:37:04 I/XXXXX YYYYY'
+        # Test matched content
+        self.assertEqual(True,
+                         atest_utils.matched_tf_error_log(matched_content))
+        # Test not matched content
+        self.assertEqual(False,
+                         atest_utils.matched_tf_error_log(not_matched_content))
+
+    @mock.patch('os.chmod')
+    @mock.patch('shutil.copy2')
+    @mock.patch('atest_utils.has_valid_cert')
+    @mock.patch('subprocess.check_output')
+    @mock.patch('os.path.exists')
+    def test_get_flakes(self, mock_path_exists, mock_output, mock_valid_cert,
+                        _cpc, _cm):
+        """Test method get_flakes."""
+        # Test par file does not exist.
+        mock_path_exists.return_value = False
+        self.assertEqual(None, atest_utils.get_flakes())
+        # Test par file exists.
+        mock_path_exists.return_value = True
+        mock_output.return_value = (b'flake_percent:0.10001\n'
+                                    b'postsubmit_flakes_per_week:12.0')
+        mock_valid_cert.return_value = True
+        expected_flake_info = {'flake_percent':'0.10001',
+                               'postsubmit_flakes_per_week':'12.0'}
+        self.assertEqual(expected_flake_info,
+                         atest_utils.get_flakes())
+        # Test no valid cert
+        mock_valid_cert.return_value = False
+        self.assertEqual(None,
+                         atest_utils.get_flakes())
+
+    @mock.patch('subprocess.check_call')
+    def test_has_valid_cert(self, mock_call):
+        """Test method has_valid_cert."""
+        # raise subprocess.CalledProcessError
+        mock_call.raiseError.side_effect = subprocess.CalledProcessError
+        self.assertFalse(atest_utils.has_valid_cert())
+        with mock.patch("constants.CERT_STATUS_CMD", ''):
+            self.assertFalse(atest_utils.has_valid_cert())
+        with mock.patch("constants.CERT_STATUS_CMD", 'CMD'):
+            # has valid cert
+            mock_call.return_value = 0
+            self.assertTrue(atest_utils.has_valid_cert())
+            # no valid cert
+            mock_call.return_value = 4
+            self.assertFalse(atest_utils.has_valid_cert())
+
+    # pylint: disable=no-member
+    def test_read_test_record_proto(self):
+        """Test method read_test_record."""
+        test_record_file_path = os.path.join(unittest_constants.TEST_DATA_DIR,
+                                             "test_record.proto.testonly")
+        test_record = atest_utils.read_test_record(test_record_file_path)
+        self.assertEqual(test_record.children[0].inline_test_record.test_record_id,
+                         'x86 hello_world_test')
+
+    def test_is_valid_json_file_file_not_exist(self):
+        """Test method is_valid_json_file if file not exist."""
+        json_file_path = os.path.join(unittest_constants.TEST_DATA_DIR,
+                                      "not_exist.json")
+        self.assertFalse(atest_utils.is_valid_json_file(json_file_path))
+
+    def test_is_valid_json_file_content_valid(self):
+        """Test method is_valid_json_file if file exist and content is valid."""
+        json_file_path = os.path.join(unittest_constants.TEST_DATA_DIR,
+                                      "module-info.json")
+        self.assertTrue(atest_utils.is_valid_json_file(json_file_path))
+
+    def test_is_valid_json_file_content_not_valid(self):
+        """Test method is_valid_json_file if file exist but content is valid."""
+        json_file_path = os.path.join(unittest_constants.TEST_DATA_DIR,
+                                      "not-valid-module-info.json")
+        self.assertFalse(atest_utils.is_valid_json_file(json_file_path))
+
+    @mock.patch('subprocess.check_output')
+    @mock.patch('os.getenv')
+    def test_get_manifest_branch(self, mock_env, mock_check_output):
+        """Test method get_manifest_branch"""
+        mock_env.return_value = 'any_path'
+        mock_check_output.return_value = REPO_INFO_OUTPUT
+        self.assertEqual('test_branch', atest_utils.get_manifest_branch())
+
+        mock_env.return_value = 'any_path'
+        mock_check_output.return_value = 'not_matched_branch_pattern.'
+        self.assertEqual(None, atest_utils.get_manifest_branch())
+
+        mock_env.return_value = 'any_path'
+        mock_check_output.side_effect = subprocess.CalledProcessError(
+            1,
+            'repo info')
+        self.assertEqual(None, atest_utils.get_manifest_branch())
+
+        mock_env.return_value = None
+        mock_check_output.return_value = REPO_INFO_OUTPUT
+        self.assertEqual(None, atest_utils.get_manifest_branch())
+
+    def test_has_wildcard(self):
+        """Test method of has_wildcard"""
+        self.assertFalse(atest_utils.has_wildcard('test1'))
+        self.assertFalse(atest_utils.has_wildcard(['test1']))
+        self.assertTrue(atest_utils.has_wildcard('test1?'))
+        self.assertTrue(atest_utils.has_wildcard(['test1', 'b*', 'a?b*']))
+
+    # pylint: disable=anomalous-backslash-in-string
+    def test_quote(self):
+        """Test method of quote()"""
+        target_str = r'TEST_(F|P)[0-9].*\w$'
+        expected_str = '\'TEST_(F|P)[0-9].*\w$\''
+        self.assertEqual(atest_utils.quote(target_str), expected_str)
+        self.assertEqual(atest_utils.quote('TEST_P224'), 'TEST_P224')
+
+    @mock.patch('builtins.input', return_value='')
+    def test_prompt_with_yn_result(self, mock_input):
+        """Test method of prompt_with_yn_result"""
+        msg = 'Do you want to continue?'
+        mock_input.return_value = ''
+        self.assertTrue(atest_utils.prompt_with_yn_result(msg, True))
+        self.assertFalse(atest_utils.prompt_with_yn_result(msg, False))
+        mock_input.return_value = 'y'
+        self.assertTrue(atest_utils.prompt_with_yn_result(msg, True))
+        mock_input.return_value = 'nO'
+        self.assertFalse(atest_utils.prompt_with_yn_result(msg, True))
+
+    def test_get_android_junit_config_filters(self):
+        """Test method of get_android_junit_config_filters"""
+        no_filter_test_config = os.path.join(
+            unittest_constants.TEST_DATA_DIR,
+            "filter_configs", "no_filter.cfg")
+        self.assertEqual({},
+                         atest_utils.get_android_junit_config_filters(
+                             no_filter_test_config))
+
+        filtered_test_config = os.path.join(
+            unittest_constants.TEST_DATA_DIR,
+            'filter_configs', 'filter.cfg')
+        filter_dict = atest_utils.get_android_junit_config_filters(
+            filtered_test_config)
+        include_annotations = filter_dict.get(constants.INCLUDE_ANNOTATION)
+        include_annotations.sort()
+        self.assertEqual(
+            ['include1', 'include2'],
+            include_annotations)
+        exclude_annotation = filter_dict.get(constants.EXCLUDE_ANNOTATION)
+        exclude_annotation.sort()
+        self.assertEqual(
+            ['exclude1', 'exclude2'],
+            exclude_annotation)
+
+    def test_md5sum(self):
+        """Test method of md5sum"""
+        exist_string = os.path.join(unittest_constants.TEST_DATA_DIR,
+                                    unittest_constants.JSON_FILE)
+        inexist_string = os.path.join(unittest_constants.TEST_DATA_DIR,
+                                      unittest_constants.CLASS_NAME)
+        self.assertEqual(
+            atest_utils.md5sum(exist_string), 'c26aab9baae99bcfb97633b69e9ceefd')
+        self.assertEqual(
+            atest_utils.md5sum(inexist_string), '')
+
+    def test_check_md5(self):
+        """Test method of check_md5"""
+        file1 = os.path.join(unittest_constants.TEST_DATA_DIR,
+                            unittest_constants.JSON_FILE)
+        checksum_file = '/tmp/_tmp_module-info.json'
+        atest_utils.save_md5([file1], '/tmp/_tmp_module-info.json')
+        self.assertTrue(atest_utils.check_md5(checksum_file))
+        os.remove(checksum_file)
+        self.assertFalse(atest_utils.check_md5(checksum_file))
+        self.assertTrue(atest_utils.check_md5(checksum_file, missing_ok=True))
+
+    def test_get_config_parameter(self):
+        """Test method of get_config_parameter"""
+        parameter_config = os.path.join(
+            unittest_constants.TEST_DATA_DIR,
+            "parameter_config", "parameter.cfg")
+        no_parameter_config = os.path.join(
+            unittest_constants.TEST_DATA_DIR,
+            "parameter_config", "no_parameter.cfg")
+
+        # Test parameter empty value
+        self.assertEqual(set(),
+                         atest_utils.get_config_parameter(
+                             no_parameter_config))
+
+        # Test parameters with values
+        self.assertEqual({'value_1', 'value_2', 'value_3', 'value_4'},
+                         atest_utils.get_config_parameter(
+                             parameter_config))
+
 if __name__ == "__main__":
     unittest.main()
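For reference, a runnable sketch of the save/check round trip that test_check_md5 above exercises, using a temporary directory instead of the repo's test data:

    import hashlib
    import json
    import os
    import tempfile

    def file_md5(path):
        """Return the md5 hex digest of a file's content."""
        with open(path, 'rb') as target:
            return hashlib.md5(target.read()).hexdigest()

    with tempfile.TemporaryDirectory() as tmp:
        tracked = os.path.join(tmp, 'module-info.json')
        checksum_file = os.path.join(tmp, 'md5.json')
        with open(tracked, 'w') as source:
            source.write('{"mod1": {}}')
        # save_md5-style: record the file and its current checksum.
        with open(checksum_file, 'w') as out:
            json.dump({tracked: file_md5(tracked)}, out)
        # check_md5-style: recompute each checksum and compare.
        with open(checksum_file) as saved:
            recorded = json.load(saved)
        print(all(file_md5(name) == md5 for name, md5 in recorded.items()))  # True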
diff --git a/atest/cli_translator.py b/atest/cli_translator.py
index 8bac153..92872b3 100644
--- a/atest/cli_translator.py
+++ b/atest/cli_translator.py
@@ -36,14 +36,18 @@
 from metrics import metrics
 from metrics import metrics_utils
 from test_finders import module_finder
+from test_finders import test_finder_utils
 
 FUZZY_FINDER = 'FUZZY'
 CACHE_FINDER = 'CACHE'
+TESTNAME_CHARS = {'#', ':', '/'}
 
 # Pattern used to identify comments start with '//' or '#' in TEST_MAPPING.
 _COMMENTS_RE = re.compile(r'(?m)[\s\t]*(#|//).*|(\".*?\")')
 _COMMENTS = frozenset(['//', '#'])
 
+_MAINLINE_MODULES_EXT_RE = re.compile(r'(\.apex|\.apks|\.apk)$')
+
 #pylint: disable=no-self-use
 class CLITranslator:
     """
@@ -78,13 +82,17 @@
                         'to clean the old cache.)')
 
     # pylint: disable=too-many-locals
-    def _find_test_infos(self, test, tm_test_detail):
+    # pylint: disable=too-many-branches
+    # pylint: disable=too-many-statements
+    def _find_test_infos(self, test, tm_test_detail,
+                         is_rebuild_module_info=False):
         """Return set of TestInfos based on a given test.
 
         Args:
             test: A string representing test references.
             tm_test_detail: The TestDetail of test configured in TEST_MAPPING
                 files.
+            is_rebuild_module_info: Boolean of args.rebuild_module_info.
 
         Returns:
             Set of TestInfos based on the given test.
@@ -95,6 +103,22 @@
         test_finders = []
         test_info_str = ''
         find_test_err_msg = None
+        mm_build_targets = []
+        test, mainline_modules = atest_utils.parse_mainline_modules(test)
+        if not self._verified_mainline_modules(test, mainline_modules):
+            return test_infos
+        test_modules_to_build = []
+        test_mainline_modules = []
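+        # Gather build targets for mainline modules declared in the test's
+        # module-info entry, stripping artifact extensions.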
+        if self.mod_info and self.mod_info.get_module_info(test):
+            test_mainline_modules = self.mod_info.get_module_info(test).get(
+                constants.MODULE_MAINLINE_MODULES, [])
+        for modules in test_mainline_modules:
+            for module in modules.split('+'):
+                test_modules_to_build.append(re.sub(
+                    _MAINLINE_MODULES_EXT_RE, '', module))
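+        # Mainline modules given on the command line are built as well, again
+        # with their artifact extensions stripped.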
+        if mainline_modules:
+            mm_build_targets = [re.sub(_MAINLINE_MODULES_EXT_RE, '', x)
+                                for x in mainline_modules.split('+')]
         for finder in test_finder_handler.get_find_methods_for_test(
                 self.mod_info, test):
             # For tests in TEST_MAPPING, find method is only related to
@@ -108,6 +132,12 @@
             if found_test_infos:
                 finder_info = finder.finder_info
                 for test_info in found_test_infos:
+                    test_deps = set()
+                    if self.mod_info:
+                        test_deps = self.mod_info.get_install_module_dependency(
+                            test_info.test_name)
+                        logging.debug('(%s) Test dependencies: %s',
+                                      test_info.test_name, test_deps)
                     if tm_test_detail:
                         test_info.data[constants.TI_MODULE_ARG] = (
                             tm_test_detail.options)
@@ -115,6 +145,17 @@
                         test_info.host = tm_test_detail.host
                     if finder_info != CACHE_FINDER:
                         test_info.test_finder = finder_info
+                    test_info.mainline_modules = mainline_modules
+                    test_info.build_targets = {
+                        x for x in test_info.build_targets
+                        if x not in test_modules_to_build}
+                    test_info.build_targets.update(mm_build_targets)
+                    # Only add dependencies to build_targets when they are in
+                    # module info
+                    test_deps_in_mod_info = [
+                        test_dep for test_dep in test_deps
+                        if self.mod_info.is_module(test_dep)]
+                    test_info.build_targets.update(test_deps_in_mod_info)
                     test_infos.add(test_info)
                 test_found = True
                 print("Found '%s' as %s" % (
@@ -126,7 +167,8 @@
                 test_info_str = ','.join([str(x) for x in found_test_infos])
                 break
         if not test_found:
-            f_results = self._fuzzy_search_and_msg(test, find_test_err_msg)
+            f_results = self._fuzzy_search_and_msg(test, find_test_err_msg,
+                                                   is_rebuild_module_info)
             if f_results:
                 test_infos.update(f_results)
                 test_found = True
@@ -143,15 +185,49 @@
         # non-test_mapping tests.
         if test_infos and not tm_test_detail:
             atest_utils.update_test_info_cache(test, test_infos)
-            print(self.msg)
+            if self.msg:
+                print(self.msg)
         return test_infos
 
-    def _fuzzy_search_and_msg(self, test, find_test_err_msg):
+    def _verified_mainline_modules(self, test, mainline_modules):
+        """ Verify the test with mainline modules is acceptable.
+
+        The test must be a module and mainline modules are in module-info.
+        The syntax rule of mainline modules will check in build process.
+        The rule includes mainline modules are sorted alphabetically, no space,
+        and no duplication.
+
+        Args:
+            test: A string representing test references
+            mainline_modules: A string of mainline_modules.
+
+        Returns:
+            True if this test is acceptable. Otherwise, print the reason and
+            return False.
+        """
+        if not mainline_modules:
+            return True
+        if not self.mod_info.is_module(test):
+            print('Test mainline modules (%s) for %s failed. Only module '
+                  'tests are supported.'
+                  % (atest_utils.colorize(mainline_modules, constants.RED),
+                     atest_utils.colorize(test, constants.RED)))
+            return False
+        if not self.mod_info.has_mainline_modules(test, mainline_modules):
+            print('Error: mainline modules (%s) are not defined for %s.'
+                  % (atest_utils.colorize(mainline_modules, constants.RED),
+                     atest_utils.colorize(test, constants.RED)))
+            return False
+        return True
+
+    def _fuzzy_search_and_msg(self, test, find_test_err_msg,
+                              is_rebuild_module_info=False):
         """ Fuzzy search and print message.
 
         Args:
             test: A string representing test references
             find_test_err_msg: A string of find test error message.
+            is_rebuild_module_info: Boolean of args.rebuild_module_info.
 
         Returns:
             A list of TestInfos if found, otherwise None.
@@ -160,6 +236,8 @@
               atest_utils.colorize(test, constants.RED))
         # Currently we focus on guessing module names. Append names on
         # results if more finders support fuzzy searching.
+        if atest_utils.has_chars(test, TESTNAME_CHARS):
+            return None
         mod_finder = module_finder.ModuleFinder(self.mod_info)
         results = mod_finder.get_fuzzy_searching_results(test)
         if len(results) == 1 and self._confirm_running(results):
@@ -175,20 +253,22 @@
             print('%s\n' % (atest_utils.colorize(
                 find_test_err_msg, constants.MAGENTA)))
         else:
-            print('(This can happen after a repo sync or if the test'
-                  ' is new. Running: with "%s" may resolve the issue.)'
-                  '\n' % (atest_utils.colorize(
-                      constants.REBUILD_MODULE_INFO_FLAG,
-                      constants.RED)))
+            if not is_rebuild_module_info:
+                print(constants.REBUILD_MODULE_INFO_MSG.format(
+                    atest_utils.colorize(constants.REBUILD_MODULE_INFO_FLAG,
+                                         constants.RED)))
+            print('')
         return None
 
-    def _get_test_infos(self, tests, test_mapping_test_details=None):
+    def _get_test_infos(self, tests, test_mapping_test_details=None,
+                        is_rebuild_module_info=False):
         """Return set of TestInfos based on passed in tests.
 
         Args:
             tests: List of strings representing test references.
             test_mapping_test_details: List of TestDetail for tests configured
                 in TEST_MAPPING files.
+            is_rebuild_module_info: Boolean of args.rebuild_module_info.
 
         Returns:
             Set of TestInfos based on the passed in tests.
@@ -197,7 +277,8 @@
         if not test_mapping_test_details:
             test_mapping_test_details = [None] * len(tests)
         for test, tm_test_detail in zip(tests, test_mapping_test_details):
-            found_test_infos = self._find_test_infos(test, tm_test_detail)
+            found_test_infos = self._find_test_infos(test, tm_test_detail,
+                                                     is_rebuild_module_info)
             test_infos.update(found_test_infos)
         return test_infos
 
@@ -210,9 +291,9 @@
         Returns:
             True if the answer is affirmative.
         """
-        decision = input('Did you mean {0}? [Y/n] '.format(
-            atest_utils.colorize(results[0], constants.GREEN)))
-        return decision in constants.AFFIRMATIVES
+        return atest_utils.prompt_with_yn_result(
+            'Did you mean {0}?'.format(
+                atest_utils.colorize(results[0], constants.GREEN)), True)
 
     def _print_fuzzy_searching_results(self, results):
         """Print modules when fuzzy searching gives multiple results.
@@ -279,6 +360,13 @@
                 grouped_tests = all_tests.setdefault(test_group_name, set())
                 tests = []
                 for test in test_list:
+                    # TODO: uncomment below when atest support testing mainline
+                    # module in TEST_MAPPING files.
+                    if constants.TEST_WITH_MAINLINE_MODULES_RE.match(test['name']):
+                        logging.debug('Skipping mainline module: %s',
+                                      atest_utils.colorize(test['name'],
+                                                           constants.RED))
+                        continue
                     if (self.enable_file_patterns and
                             not test_mapping.is_match_file_patterns(
                                 test_mapping_file, test)):
@@ -307,28 +395,13 @@
                 grouped_tests.update(tests)
         return all_tests, imports
 
-    def _find_files(self, path, file_name=constants.TEST_MAPPING):
-        """Find all files with given name under the given path.
-
-        Args:
-            path: A string of path in source.
-
-        Returns:
-            A list of paths of the files with the matching name under the given
-            path.
-        """
-        test_mapping_files = []
-        for root, _, filenames in os.walk(path):
-            for filename in fnmatch.filter(filenames, file_name):
-                test_mapping_files.append(os.path.join(root, filename))
-        return test_mapping_files
-
     def _get_tests_from_test_mapping_files(
-            self, test_group, test_mapping_files):
+            self, test_groups, test_mapping_files):
         """Get tests in the given test mapping files with the match group.
 
         Args:
-            test_group: Group of tests to run. Default is set to `presubmit`.
+            test_groups: Groups of tests to run. Default is set to `presubmit`
+                and `presubmit-large`.
             test_mapping_files: A list of path of TEST_MAPPING files.
 
         Returns:
@@ -352,24 +425,26 @@
                 grouped_tests = merged_all_tests.setdefault(
                     test_group_name, set())
                 grouped_tests.update(test_list)
-
-        tests = set(merged_all_tests.get(test_group, []))
-        if test_group == constants.TEST_GROUP_ALL:
-            for grouped_tests in merged_all_tests.values():
-                tests.update(grouped_tests)
+        tests = set()
+        for test_group in test_groups:
+            temp_tests = set(merged_all_tests.get(test_group, []))
+            tests.update(temp_tests)
+            if test_group == constants.TEST_GROUP_ALL:
+                for grouped_tests in merged_all_tests.values():
+                    tests.update(grouped_tests)
         return tests, merged_all_tests, all_imports
 
     # pylint: disable=too-many-arguments
     # pylint: disable=too-many-locals
     def _find_tests_by_test_mapping(
-            self, path='', test_group=constants.TEST_GROUP_PRESUBMIT,
+            self, path='', test_groups=None,
             file_name=constants.TEST_MAPPING, include_subdirs=False,
             checked_files=None):
         """Find tests defined in TEST_MAPPING in the given path.
 
         Args:
             path: A string of path in source. Default is set to '', i.e., CWD.
-            test_group: Group of tests to run. Default is set to `presubmit`.
+            test_groups: A list of test groups to run.
             file_name: Name of TEST_MAPPING file. Default is set to
                 `TEST_MAPPING`. The argument is added for testing purpose.
             include_subdirs: True to include tests in TEST_MAPPING files in sub
@@ -385,6 +460,9 @@
             grouped by test group.
         """
         path = os.path.realpath(path)
+        # Default test_groups is set to [`presubmit`, `presubmit-large`].
+        if not test_groups:
+            test_groups = constants.DEFAULT_TEST_GROUPS
         test_mapping_files = set()
         all_tests = {}
         test_mapping_file = os.path.join(path, file_name)
@@ -393,7 +471,7 @@
         # Include all TEST_MAPPING files in sub-directories if `include_subdirs`
         # is set to True.
         if include_subdirs:
-            test_mapping_files.update(self._find_files(path, file_name))
+            test_mapping_files.update(atest_utils.find_files(path, file_name))
         # Include all possible TEST_MAPPING files in parent directories.
         root_dir = os.environ.get(constants.ANDROID_BUILD_TOP, os.sep)
         while path not in (root_dir, os.sep):
@@ -410,7 +488,7 @@
             return test_mapping_files, all_tests
 
         tests, all_tests, imports = self._get_tests_from_test_mapping_files(
-            test_group, test_mapping_files)
+            test_groups, test_mapping_files)
 
         # Load TEST_MAPPING files from imports recursively.
         if imports:
@@ -425,7 +503,7 @@
                 # Search for tests based on the imported search path.
                 import_tests, import_all_tests = (
                     self._find_tests_by_test_mapping(
-                        path, test_group, file_name, include_subdirs,
+                        path, test_groups, file_name, include_subdirs,
                         checked_files))
                 # Merge the collections
                 tests.update(import_tests)
@@ -440,11 +518,13 @@
             targets |= test_info.build_targets
         return targets
 
-    def _get_test_mapping_tests(self, args):
+    def _get_test_mapping_tests(self, args, exit_if_no_test_found=True):
         """Find the tests in TEST_MAPPING files.
 
         Args:
             args: arg parsed object.
+            exit_if_no_test_found: A flag to exit atest if no test mapping
+                tests are found.
 
         Returns:
             A tuple of (test_names, test_details_list), where
@@ -454,28 +534,29 @@
         """
         # Pull out tests from test mapping
         src_path = ''
-        test_group = constants.TEST_GROUP_PRESUBMIT
+        test_groups = constants.DEFAULT_TEST_GROUPS
         if args.tests:
             if ':' in args.tests[0]:
                 src_path, test_group = args.tests[0].split(':')
+                test_groups = [test_group]
             else:
                 src_path = args.tests[0]
 
         test_details, all_test_details = self._find_tests_by_test_mapping(
-            path=src_path, test_group=test_group,
+            path=src_path, test_groups=test_groups,
             include_subdirs=args.include_subdirs, checked_files=set())
         test_details_list = list(test_details)
-        if not test_details_list:
+        if not test_details_list and exit_if_no_test_found:
             logging.warning(
                 'No tests of group `%s` found in TEST_MAPPING at %s or its '
                 'parent directories.\nYou might be missing atest arguments,'
                 ' try `atest --help` for more information',
-                test_group, os.path.realpath(''))
+                test_groups, os.path.realpath(''))
             if all_test_details:
                 tests = ''
                 for test_group, test_list in all_test_details.items():
                     tests += '%s:\n' % test_group
-                    for test_detail in sorted(test_list):
+                    for test_detail in sorted(test_list, key=str):
                         tests += '\t%s\n' % test_detail
                 logging.warning(
                     'All available tests in TEST_MAPPING files are:\n%s',
@@ -489,6 +570,31 @@
         test_names = [detail.name for detail in test_details_list]
         return test_names, test_details_list
 
+    def _extract_testable_modules_by_wildcard(self, user_input):
+        """Extract the given string with wildcard symbols to testable
+        module names.
+
+        Assume the available testable modules are:
+            ['Google', 'google', 'G00gle', 'g00gle']
+        and the user_input is:
+            ['*oo*', 'g00gle']
+        This method will return:
+            ['Google', 'google', 'g00gle']
+
+        Args:
+            user_input: A list of input strings.
+
+        Returns:
+            A list of testable modules.
+        """
+        testable_mods = self.mod_info.get_testable_modules()
+        extracted_tests = []
+        for test in user_input:
+            if atest_utils.has_wildcard(test):
+                extracted_tests.extend(fnmatch.filter(testable_mods, test))
+            else:
+                extracted_tests.append(test)
+        return extracted_tests
 
     def translate(self, args):
         """Translate atest command line into build targets and run commands.
@@ -502,14 +608,38 @@
         tests = args.tests
         # Test details from TEST_MAPPING files
         test_details_list = None
+        # Loading Host Unit Tests.
+        host_unit_tests = []
+        if not args.tests:
+            logging.debug('Finding Host Unit Tests...')
+            path = os.path.relpath(
+                os.path.realpath(''),
+                os.environ.get(constants.ANDROID_BUILD_TOP, ''))
+            host_unit_tests = test_finder_utils.find_host_unit_tests(
+                self.mod_info, path)
+            logging.debug('Found host_unit_tests: %s', host_unit_tests)
         if atest_utils.is_test_mapping(args):
             if args.enable_file_patterns:
                 self.enable_file_patterns = True
-            tests, test_details_list = self._get_test_mapping_tests(args)
+            tests, test_details_list = self._get_test_mapping_tests(
+                args, not bool(host_unit_tests))
         atest_utils.colorful_print("\nFinding Tests...", constants.CYAN)
         logging.debug('Finding Tests: %s', tests)
         start = time.time()
-        test_infos = self._get_test_infos(tests, test_details_list)
+        # Clear cache if user pass -c option
+        if args.clear_cache:
+            atest_utils.clean_test_info_caches(tests + host_unit_tests)
+        # Process tests which might contain wildcard symbols in advance.
+        if atest_utils.has_wildcard(tests):
+            tests = self._extract_testable_modules_by_wildcard(tests)
+        test_infos = self._get_test_infos(tests, test_details_list,
+                                          args.rebuild_module_info)
+        if host_unit_tests:
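+            # Wrap each host unit test as a host-side TestDetail so it goes
+            # through the same TestInfo lookup as TEST_MAPPING entries.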
+            host_unit_test_details = [test_mapping.TestDetail(
+                {'name':test, 'host':True}) for test in host_unit_tests]
+            host_unit_test_infos = self._get_test_infos(host_unit_tests,
+                                                        host_unit_test_details)
+            test_infos.update(host_unit_test_infos)
         logging.debug('Found tests in %ss', time.time() - start)
         for test_info in test_infos:
             logging.debug('%s\n', test_info)
diff --git a/atest/cli_translator_unittest.py b/atest/cli_translator_unittest.py
index afeb0c6..f3889ab 100755
--- a/atest/cli_translator_unittest.py
+++ b/atest/cli_translator_unittest.py
@@ -30,6 +30,7 @@
 
 import cli_translator as cli_t
 import constants
+import module_info
 import test_finder_handler
 import test_mapping
 import unittest_constants as uc
@@ -38,6 +39,7 @@
 from metrics import metrics
 from test_finders import module_finder
 from test_finders import test_finder_base
+from test_finders import test_finder_utils
 
 
 # TEST_MAPPING related consts
@@ -58,7 +60,8 @@
 
 
 #pylint: disable=unused-argument
-def gettestinfos_side_effect(test_names, test_mapping_test_details=None):
+def gettestinfos_side_effect(test_names, test_mapping_test_details=None,
+                             is_rebuild_module_info=False):
     """Mock return values for _get_test_info."""
     test_infos = set()
     for test_name in test_names:
@@ -66,6 +69,10 @@
             test_infos.add(uc.MODULE_INFO)
         if test_name == uc.CLASS_NAME:
             test_infos.add(uc.CLASS_INFO)
+        if test_name == uc.HOST_UNIT_TEST_NAME_1:
+            test_infos.add(uc.MODULE_INFO_HOST_1)
+        if test_name == uc.HOST_UNIT_TEST_NAME_2:
+            test_infos.add(uc.MODULE_INFO_HOST_2)
     return test_infos
 
 
@@ -85,6 +92,7 @@
         self.args.test_mapping = False
         self.args.include_subdirs = False
         self.args.enable_file_patterns = False
+        self.args.rebuild_module_info = False
         # Cache finder related args
         self.args.clear_cache = False
         self.ctr.mod_info = mock.Mock
@@ -100,8 +108,9 @@
     @mock.patch.object(metrics, 'FindTestFinishEvent')
     @mock.patch.object(test_finder_handler, 'get_find_methods_for_test')
     # pylint: disable=too-many-locals
-    def test_get_test_infos(self, mock_getfindmethods, _metrics, mock_getfuzzyresults,
-                            mock_findtestbymodule, mock_input):
+    def test_get_test_infos(self, mock_getfindmethods, _metrics,
+                            mock_getfuzzyresults, mock_findtestbymodule,
+                            mock_input):
         """Test _get_test_infos method."""
         ctr = cli_t.CLITranslator()
         find_method_return_module_info = lambda x, y: uc.MODULE_INFOS
@@ -214,12 +223,29 @@
                     test_detail2.options,
                     test_info.data[constants.TI_MODULE_ARG])
 
+    @mock.patch.dict('os.environ', {constants.ANDROID_BUILD_TOP:'/'})
+    @mock.patch.object(module_finder.ModuleFinder, 'get_fuzzy_searching_results')
+    @mock.patch.object(metrics, 'FindTestFinishEvent')
+    @mock.patch.object(test_finder_handler, 'get_find_methods_for_test')
+    def test_get_test_infos_with_mod_info(
+            self, mock_getfindmethods, _metrics, mock_getfuzzyresults):
+        """Test _get_test_infos method with a real ModuleInfo instance."""
+        mod_info = module_info.ModuleInfo(
+            module_file=os.path.join(uc.TEST_DATA_DIR, uc.JSON_FILE))
+        ctr = cli_t.CLITranslator(module_info=mod_info)
+        null_test_info = set()
+        mock_getfindmethods.return_value = []
+        mock_getfuzzyresults.return_value = []
+        unittest_utils.assert_strict_equal(
+            self, ctr._get_test_infos('not_exist_module'), null_test_info)
+
     @mock.patch.object(cli_t.CLITranslator, '_get_test_infos',
                        side_effect=gettestinfos_side_effect)
     def test_translate_class(self, _info):
         """Test translate method for tests by class name."""
         # Check that we can find a class.
         self.args.tests = [uc.CLASS_NAME]
+        self.args.host_unit_test_only = False
         targets, test_infos = self.ctr.translate(self.args)
         unittest_utils.assert_strict_equal(
             self, targets, uc.CLASS_BUILD_TARGETS)
@@ -231,22 +257,27 @@
         """Test translate method for tests by module or class name."""
         # Check that we get all the build targets we expect.
         self.args.tests = [uc.MODULE_NAME, uc.CLASS_NAME]
+        self.args.host_unit_test_only = False
         targets, test_infos = self.ctr.translate(self.args)
         unittest_utils.assert_strict_equal(
             self, targets, uc.MODULE_CLASS_COMBINED_BUILD_TARGETS)
         unittest_utils.assert_strict_equal(self, test_infos, {uc.MODULE_INFO,
                                                               uc.CLASS_INFO})
 
+    @mock.patch.object(test_finder_utils, 'find_host_unit_tests', return_value=[])
     @mock.patch.object(cli_t.CLITranslator, '_find_tests_by_test_mapping')
     @mock.patch.object(cli_t.CLITranslator, '_get_test_infos',
                        side_effect=gettestinfos_side_effect)
-    def test_translate_test_mapping(self, _info, mock_testmapping):
+    def test_translate_test_mapping(self, _info, mock_testmapping,
+                                    _find_unit_tests):
         """Test translate method for tests in test mapping."""
         # Check that test mappings feeds into get_test_info properly.
         test_detail1 = test_mapping.TestDetail(uc.TEST_MAPPING_TEST)
         test_detail2 = test_mapping.TestDetail(uc.TEST_MAPPING_TEST_WITH_OPTION)
         mock_testmapping.return_value = ([test_detail1, test_detail2], None)
         self.args.tests = []
+        self.args.host = False
+        self.args.host_unit_test_only = False
         targets, test_infos = self.ctr.translate(self.args)
         unittest_utils.assert_strict_equal(
             self, targets, uc.MODULE_CLASS_COMBINED_BUILD_TARGETS)
@@ -264,6 +295,7 @@
         mock_testmapping.return_value = ([test_detail1, test_detail2], None)
         self.args.tests = ['src_path:all']
         self.args.test_mapping = True
+        self.args.host = False
         targets, test_infos = self.ctr.translate(self.args)
         unittest_utils.assert_strict_equal(
             self, targets, uc.MODULE_CLASS_COMBINED_BUILD_TARGETS)
@@ -292,7 +324,7 @@
         with mock.patch.dict('os.environ', os_environ_mock, clear=True):
             tests, all_tests = self.ctr._find_tests_by_test_mapping(
                 path=TEST_MAPPING_DIR,
-                test_group=constants.TEST_GROUP_POSTSUBMIT,
+                test_groups=[constants.TEST_GROUP_POSTSUBMIT],
                 file_name='test_mapping_sample', checked_files=set())
         expected_presubmit = set([TEST_1, TEST_2, TEST_5, TEST_7, TEST_9])
         expected = set([TEST_3, TEST_6, TEST_8, TEST_10])
@@ -309,7 +341,7 @@
         os_environ_mock = {constants.ANDROID_BUILD_TOP: uc.TEST_DATA_DIR}
         with mock.patch.dict('os.environ', os_environ_mock, clear=True):
             tests, all_tests = self.ctr._find_tests_by_test_mapping(
-                path=TEST_MAPPING_DIR, test_group=constants.TEST_GROUP_ALL,
+                path=TEST_MAPPING_DIR, test_groups=[constants.TEST_GROUP_ALL],
                 file_name='test_mapping_sample', checked_files=set())
         expected_presubmit = set([TEST_1, TEST_2, TEST_5, TEST_7, TEST_9])
         expected = set([
@@ -371,6 +403,54 @@
 
         self.assertEqual(test_mapping_dict, test_mapping_dict_gloden)
 
+    @mock.patch.dict('os.environ', {constants.ANDROID_BUILD_TOP:'/'})
+    @mock.patch.object(module_info.ModuleInfo, 'get_testable_modules')
+    def test_extract_testable_modules_by_wildcard(self, mock_mods):
+        """Test _extract_testable_modules_by_wildcard method."""
+        mod_info = module_info.ModuleInfo(
+            module_file=os.path.join(uc.TEST_DATA_DIR, uc.JSON_FILE))
+        ctr = cli_t.CLITranslator(module_info=mod_info)
+        mock_mods.return_value = ['test1', 'test2', 'test3', 'test11',
+                                  'Test22', 'Test100', 'aTest101']
+        # test '*'
+        expr1 = ['test*']
+        result1 = ['test1', 'test2', 'test3', 'test11']
+        self.assertEqual(ctr._extract_testable_modules_by_wildcard(expr1),
+                         result1)
+        # test '?'
+        expr2 = ['test?']
+        result2 = ['test1', 'test2', 'test3']
+        self.assertEqual(ctr._extract_testable_modules_by_wildcard(expr2),
+                         result2)
+        # test '*' and '?'
+        expr3 = ['*Test???']
+        result3 = ['Test100', 'aTest101']
+        self.assertEqual(ctr._extract_testable_modules_by_wildcard(expr3),
+                         result3)
+
+    @mock.patch.object(test_finder_utils, 'find_host_unit_tests',
+                       return_value=[uc.HOST_UNIT_TEST_NAME_1,
+                                     uc.HOST_UNIT_TEST_NAME_2])
+    @mock.patch.object(cli_t.CLITranslator, '_find_tests_by_test_mapping')
+    @mock.patch.object(cli_t.CLITranslator, '_get_test_infos',
+                       side_effect=gettestinfos_side_effect)
+    def test_translate_test_mapping_host_unit_test(
+            self, _info, mock_testmapping, _find_unit_tests):
+        """Test translate method for tests belong to host unit tests."""
+        # Check that test mappings feeds into get_test_info properly.
+        test_detail1 = test_mapping.TestDetail(uc.TEST_MAPPING_TEST)
+        test_detail2 = test_mapping.TestDetail(uc.TEST_MAPPING_TEST_WITH_OPTION)
+        mock_testmapping.return_value = ([test_detail1, test_detail2], None)
+        self.args.tests = []
+        self.args.host = False
+        self.args.host_unit_test_only = False
+        _, test_infos = self.ctr.translate(self.args)
+        unittest_utils.assert_strict_equal(self,
+                                           test_infos,
+                                           {uc.MODULE_INFO,
+                                            uc.CLASS_INFO,
+                                            uc.MODULE_INFO_HOST_1,
+                                            uc.MODULE_INFO_HOST_2})
 
 if __name__ == '__main__':
     unittest.main()
diff --git a/atest/constants_default.py b/atest/constants_default.py
index ac902f8..0ef70db 100644
--- a/atest/constants_default.py
+++ b/atest/constants_default.py
@@ -56,6 +56,10 @@
 TF_DEBUG = 'TF_DEBUG'
 COLLECT_TESTS_ONLY = 'COLLECT_TESTS_ONLY'
 TF_TEMPLATE = 'TF_TEMPLATE'
+FLAKES_INFO = 'FLAKES_INFO'
+TF_EARLY_DEVICE_RELEASE = 'TF_EARLY_DEVICE_RELEASE'
+REQUEST_UPLOAD_RESULT = 'REQUEST_UPLOAD_RESULT'
+MODULES_IN = 'MODULES-IN-'
 
 # Application exit codes.
 EXIT_CODE_SUCCESS = 0
@@ -66,6 +70,14 @@
 EXIT_CODE_TEST_FAILURE = 5
 EXIT_CODE_VERIFY_FAILURE = 6
 EXIT_CODE_OUTSIDE_ROOT = 7
+EXIT_CODE_AVD_CREATE_FAILURE = 8
+EXIT_CODE_AVD_INVALID_ARGS = 9
+# Conditions that atest should exit without sending result to metrics.
+EXIT_CODES_BEFORE_TEST = [EXIT_CODE_ENV_NOT_SETUP,
+                          EXIT_CODE_TEST_NOT_FOUND,
+                          EXIT_CODE_OUTSIDE_ROOT,
+                          EXIT_CODE_AVD_CREATE_FAILURE,
+                          EXIT_CODE_AVD_INVALID_ARGS]
 
 # Codes of specific events. These are exceptions that don't stop anything
 # but sending metrics.
@@ -85,13 +97,19 @@
 MODULE_CLASS_NATIVE_TESTS = 'NATIVE_TESTS'
 MODULE_CLASS_JAVA_LIBRARIES = 'JAVA_LIBRARIES'
 MODULE_TEST_CONFIG = 'test_config'
+MODULE_MAINLINE_MODULES = 'test_mainline_modules'
+MODULE_DEPENDENCIES = 'dependencies'
+MODULE_SRCS = 'srcs'
+MODULE_IS_UNIT_TEST = 'is_unit_test'
 
 # Env constants
 ANDROID_BUILD_TOP = 'ANDROID_BUILD_TOP'
 ANDROID_OUT = 'OUT'
 ANDROID_OUT_DIR = 'OUT_DIR'
+ANDROID_OUT_DIR_COMMON_BASE = 'OUT_DIR_COMMON_BASE'
 ANDROID_HOST_OUT = 'ANDROID_HOST_OUT'
 ANDROID_PRODUCT_OUT = 'ANDROID_PRODUCT_OUT'
+ANDROID_TARGET_PRODUCT = 'TARGET_PRODUCT'
 
 # Test Info data keys
 # Value of include-filter option.
@@ -109,8 +127,10 @@
 TEST_MAPPING = 'TEST_MAPPING'
 # Test group for tests in TEST_MAPPING
 TEST_GROUP_PRESUBMIT = 'presubmit'
+TEST_GROUP_PRESUBMIT_LARGE = 'presubmit-large'
 TEST_GROUP_POSTSUBMIT = 'postsubmit'
 TEST_GROUP_ALL = 'all'
+DEFAULT_TEST_GROUPS = [TEST_GROUP_PRESUBMIT, TEST_GROUP_PRESUBMIT_LARGE]
 # Key in TEST_MAPPING file for a list of imported TEST_MAPPING file
 TEST_MAPPING_IMPORTS = 'imports'
 
@@ -125,6 +145,7 @@
 TF_MODULE_ARG_VALUE_FMT = '{test_name}:{option_name}:{option_value}'
 TF_SUITE_FILTER_ARG_VALUE_FMT = '"{test_name} {option_value}"'
 TF_SKIP_LOADING_CONFIG_JAR = '--skip-loading-config-jar'
+TF_MODULE_FILTER = '--module'
 
 # Suite Plans
 SUITE_PLANS = frozenset(['cts'])
@@ -139,13 +160,10 @@
 # ANSI code shift for colorful print
 BLACK, RED, GREEN, YELLOW, BLUE, MAGENTA, CYAN, WHITE = range(8)
 
-# Answers equivalent to YES!
-AFFIRMATIVES = ['y', 'Y', 'yes', 'Yes', 'YES', '']
-LD_RANGE = 2
-
 # Types of Levenshetine Distance Cost
 COST_TYPO = (1, 1, 1)
 COST_SEARCH = (8, 1, 5)
+LD_RANGE = 2
 
 # Value of TestInfo install_locations.
 DEVICELESS_TEST = 'host'
@@ -171,8 +189,20 @@
 TF_PREPARATION = 'tf-preparation'
 
 # Detect type for local_detect_event.
-# Next expansion : DETECT_TYPE_XXX = 1
+# Next expansion: DETECT_TYPE_XXX = 9
 DETECT_TYPE_BUG_DETECTED = 0
+DETECT_TYPE_ACLOUD_CREATE = 1
+DETECT_TYPE_FIND_BUILD = 2
+DETECT_TYPE_NO_FLAKE = 3
+DETECT_TYPE_HAS_FLAKE = 4
+DETECT_TYPE_TF_TEARDOWN_LOGCAT = 5
+DETECT_TYPE_REBUILD_MODULE_INFO = 6
+DETECT_TYPE_NOT_REBUILD_MODULE_INFO = 7
+DETECT_TYPE_ONLY_BUILD_MODULE_INFO = 8
+# XTS suite types encode from 100 to 199
+DETECT_TYPE_XTS_SUITE = {'cts': 101,
+                         'vts': 104}
+
 # Considering a trade-off between speed and size, we set UPPER_LIMIT to 100000
 # to make maximum file space 10M(100000(records)*100(byte/record)) at most.
 # Therefore, to update history file will spend 1 sec at most in each run.
@@ -202,11 +232,13 @@
 # generate modules' dependencies info when make.
 # With SOONG_COLLECT_JAVA_DEPS enabled, out/soong/module_bp_java_deps.json will
 # be generated when make.
-ATEST_BUILD_ENV = {'RECORD_ALL_DEPS':'true', 'SOONG_COLLECT_JAVA_DEPS':'true'}
+ATEST_BUILD_ENV = {'RECORD_ALL_DEPS':'true', 'SOONG_COLLECT_JAVA_DEPS':'true',
+                   'SOONG_COLLECT_CC_DEPS':'true'}
 
 # Atest index path and relative dirs/caches.
 INDEX_DIR = os.path.join(os.getenv(ANDROID_HOST_OUT, ''), 'indexes')
 LOCATE_CACHE = os.path.join(INDEX_DIR, 'mlocate.db')
+LOCATE_CACHE_MD5 = os.path.join(INDEX_DIR, 'mlocate.md5')
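+# Checksum file for LOCATE_CACHE, presumably used to detect a stale database.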
 INT_INDEX = os.path.join(INDEX_DIR, 'integration.idx')
 CLASS_INDEX = os.path.join(INDEX_DIR, 'classes.idx')
 CC_CLASS_INDEX = os.path.join(INDEX_DIR, 'cc_classes.idx')
@@ -232,7 +264,11 @@
                                r'(?P<package>[^(;|\s)]+)\s*')
 
 ATEST_RESULT_ROOT = '/tmp/atest_result'
+ATEST_TEST_RECORD_PROTO = 'test_record.proto'
 LATEST_RESULT_FILE = os.path.join(ATEST_RESULT_ROOT, 'LATEST', 'test_result')
+ACLOUD_REPORT_FILE_RE = re.compile(r'.*--report[_-]file(=|\s+)(?P<report_file>[\w/.]+)')
+TEST_WITH_MAINLINE_MODULES_RE = re.compile(r'(?P<test>.*)\[(?P<mainline_modules>.*'
+                                           r'[.](apk|apks|apex))\]$')
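+# e.g. 'SomeTest[foo.apex+bar.apk]' parses into test='SomeTest' and
+# mainline_modules='foo.apex+bar.apk' (module names here are illustrative).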
 
 # Tests list which need vts_kernel_tests as test dependency
 REQUIRED_KERNEL_TEST_MODULES = [
@@ -249,3 +285,69 @@
     'vts_ltp_test_x86_64',
     'vts_ltp_test_x86'
 ]
+
+# XTS suite set dependency.
+SUITE_DEPS = {}
+
+# Tradefed log file name term.
+TF_HOST_LOG = 'host_log_*'
+
+# Flake service par path
+FLAKE_SERVICE_PATH = '/foo'
+FLAKE_TMP_PATH = '/tmp'
+FLAKE_FILE = 'flakes_info.par'
+FLAKE_TARGET = 'aosp_cf_x86_phone-userdebug'
+FLAKE_BRANCH = 'aosp-master'
+FLAKE_TEST_NAME = 'suite/test-mapping-presubmit-retry_cloud-tf'
+FLAKE_PERCENT = 'flake_percent'
+FLAKE_POSTSUBMIT = 'postsubmit_flakes_per_week'
+
+# cert status command
+CERT_STATUS_CMD = ''
+
+ASUITE_REPO_PROJECT_NAME = 'platform/tools/asuite'
+
+# logstorage api scope.
+SCOPE_BUILD_API_SCOPE = ''
+STORAGE_API_VERSION = ''
+STORAGE_SERVICE_NAME = ''
+DO_NOT_UPLOAD = 'DO_NOT_UPLOAD'
+CLIENT_ID = ''
+CLIENT_SECRET = ''
+CREDENTIAL_FILE_NAME = ''
+TOKEN_FILE_PATH = ''
+INVOCATION_ID = 'INVOCATION_ID'
+WORKUNIT_ID = 'WORKUNIT_ID'
+RESULT_LINK = ''
+TF_GLOBAL_CONFIG = ''
+UPLOAD_TEST_RESULT_MSG = 'Upload test result?'
+
+# messages that share among libraries.
+REBUILD_MODULE_INFO_MSG = ('(This can happen after a repo sync or if the test'
+                           ' is new. Running with "{}" may resolve the issue.)')
+
+# AndroidJUnitTest related argument.
+ANDROID_JUNIT_CLASS = 'com.android.tradefed.testtype.AndroidJUnitTest'
+INCLUDE_ANNOTATION = 'include-annotation'
+EXCLUDE_ANNOTATION = 'exclude-annotation'
+SUPPORTED_FILTERS = [INCLUDE_ANNOTATION, EXCLUDE_ANNOTATION]
+
+# Tradefed config-descriptor metadata.
+CONFIG_DESCRIPTOR = 'config-descriptor:metadata'
+PARAMETER_KEY = 'parameter'
+
+# Tradefed related constant.
+TF_TEST_ARG = '--test-arg'
+TF_AND_JUNIT_CLASS = 'com.android.tradefed.testtype.AndroidJUnitTest'
+TF_EXCLUDE_ANNOTATE = 'exclude-annotation'
+INSTANT_MODE_ANNOTATE = 'android.platform.test.annotations.AppModeInstant'
+TF_PARA_INSTANT_APP = 'instant_app'
+TF_PARA_SECOND_USR = 'secondary_user'
+TF_PARA_MULTIABI = 'multi_abi'
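+# Parameterized-module variants that atest excludes by default (see
+# DEFAULT_EXCLUDE_PARAS below).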
+DEFAULT_EXCLUDE_PARAS = {TF_PARA_INSTANT_APP,
+                         TF_PARA_SECOND_USR,
+                         TF_PARA_MULTIABI
+                         }
+DEFAULT_EXCLUDE_NOT_PARAS = {'not_' + TF_PARA_INSTANT_APP,
+                             'not_' + TF_PARA_SECOND_USR,
+                             'not_' + TF_PARA_MULTIABI}
diff --git a/atest/docs/developer_workflow.md b/atest/docs/developer_workflow.md
index d3c2a32..cdf7eb6 100644
--- a/atest/docs/developer_workflow.md
+++ b/atest/docs/developer_workflow.md
@@ -52,7 +52,7 @@
 
 ##### Where does the Python code live?
 
-The python code lives here: `tools/tradefederation/core/atest/`
+The python code lives here: `tools/asuite/atest/`
 (path relative to android repo root)
 
 ##### Writing tests
diff --git a/atest/unittest_data/test_config/a.xml b/atest/logstorage/__init__.py
similarity index 100%
copy from atest/unittest_data/test_config/a.xml
copy to atest/logstorage/__init__.py
diff --git a/atest/logstorage/atest_gcp_utils.py b/atest/logstorage/atest_gcp_utils.py
new file mode 100644
index 0000000..8896f50
--- /dev/null
+++ b/atest/logstorage/atest_gcp_utils.py
@@ -0,0 +1,141 @@
+#  Copyright (C) 2020 The Android Open Source Project
+#
+#  Licensed under the Apache License, Version 2.0 (the "License");
+#  you may not use this file except in compliance with the License.
+#  You may obtain a copy of the License at
+#
+#       http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+#  limitations under the License.
+"""
+Utility functions for atest.
+"""
+from __future__ import print_function
+
+import os
+import logging
+try:
+    import httplib2
+except ModuleNotFoundError as e:
+    logging.debug('Import error due to %s', e)
+import constants
+
+try:
+    # pylint: disable=import-error
+    from oauth2client import client as oauth2_client
+    from oauth2client.contrib import multistore_file
+    from oauth2client import tools as oauth2_tools
+except ModuleNotFoundError as e:
+    logging.debug('Import error due to %s', e)
+
+
+class RunFlowFlags():
+    """Flags for oauth2client.tools.run_flow."""
+    def __init__(self, browser_auth):
+        self.auth_host_port = [8080, 8090]
+        self.auth_host_name = "localhost"
+        self.logging_level = "ERROR"
+        self.noauth_local_webserver = not browser_auth
+
+
+class GCPHelper():
+    """GCP bucket helper class."""
+    def __init__(self, client_id=None, client_secret=None,
+                 user_agent=None, scope=constants.SCOPE_BUILD_API_SCOPE):
+        """Init stuff for GCPHelper class.
+        Args:
+            client_id: String, client id from the cloud project.
+            client_secret: String, client secret for the client_id.
+            user_agent: The user agent for the credential.
+            scope: String, scopes separated by space.
+        """
+        self.client_id = client_id
+        self.client_secret = client_secret
+        self.user_agent = user_agent
+        self.scope = scope
+
+    def get_refreshed_credential_from_file(self, creds_file_path):
+        """Get refreshed credential from file.
+        Args:
+            creds_file_path: Credential file path.
+        Returns:
+            An oauth2client.OAuth2Credentials instance.
+        """
+        credential = self.get_credential_from_file(creds_file_path)
+        if credential:
+            try:
+                credential.refresh(httplib2.Http())
+            except oauth2_client.AccessTokenRefreshError as e:
+                logging.debug('Token refresh error: %s', e)
+            if not credential.invalid:
+                return credential
+        logging.debug('Cannot get credential.')
+        return None
+
+    def get_credential_from_file(self, creds_file_path):
+        """Get credential from file.
+        Args:
+            creds_file_path: Credential file path.
+        Returns:
+            An oauth2client.OAuth2Credentials instance.
+        """
+        storage = multistore_file.get_credential_storage(
+            filename=os.path.abspath(creds_file_path),
+            client_id=self.client_id,
+            user_agent=self.user_agent,
+            scope=self.scope)
+        return storage.get()
+
+    def get_credential_with_auth_flow(self, creds_file_path):
+        """Get Credential object from file.
+        Get credential object from file. Run oauth flow if haven't authorized
+        before.
+
+        Args:
+            creds_file_path: Credential file path.
+        Returns:
+            An oauth2client.OAuth2Credentials instance.
+        """
+        credentials = self.get_refreshed_credential_from_file(creds_file_path)
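+        # No valid cached credential; fall back to the interactive oauth flow.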
+        if not credentials:
+            storage = multistore_file.get_credential_storage(
+                filename=os.path.abspath(creds_file_path),
+                client_id=self.client_id,
+                user_agent=self.user_agent,
+                scope=self.scope)
+            return self._run_auth_flow(storage)
+        return credentials
+
+    def _run_auth_flow(self, storage):
+        """Get user oauth2 credentials.
+
+        Args:
+            storage: GCP storage object.
+        Returns:
+            An oauth2client.OAuth2Credentials instance.
+        """
+        flags = RunFlowFlags(browser_auth=False)
+        flow = oauth2_client.OAuth2WebServerFlow(
+            client_id=self.client_id,
+            client_secret=self.client_secret,
+            scope=self.scope,
+            user_agent=self.user_agent)
+        credentials = oauth2_tools.run_flow(
+            flow=flow, storage=storage, flags=flags)
+        return credentials
diff --git a/atest/logstorage/logstorage_utils.py b/atest/logstorage/logstorage_utils.py
new file mode 100644
index 0000000..2283a26
--- /dev/null
+++ b/atest/logstorage/logstorage_utils.py
@@ -0,0 +1,160 @@
+#  Copyright (C) 2020 The Android Open Source Project
+#
+#  Licensed under the Apache License, Version 2.0 (the "License");
+#  you may not use this file except in compliance with the License.
+#  You may obtain a copy of the License at
+#
+#       http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+#  limitations under the License.
+
+
+""" Utility functions for logstorage. """
+from __future__ import print_function
+
+import logging
+import constants
+
+# pylint: disable=import-error
+try:
+    import httplib2
+    from googleapiclient.discovery import build
+except ImportError as e:
+    logging.debug('Import error due to: %s', e)
+
+
+class BuildClient:
+    """Build api helper class."""
+
+    def __init__(self, creds):
+        """Init BuildClient class.
+        Args:
+            creds: An oauth2client.OAuth2Credentials instance.
+        """
+        http_auth = creds.authorize(httplib2.Http())
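+        # Build the discovery-based API client over the authorized http
+        # object, with discovery caching disabled.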
+        self.client = build(
+            serviceName=constants.STORAGE_SERVICE_NAME,
+            version=constants.STORAGE_API_VERSION,
+            cache_discovery=False,
+            http=http_auth)
+
+    def list_branch(self):
+        """List all branch."""
+        return self.client.branch().list(maxResults=10000).execute()
+
+    def list_target(self, branch):
+        """List all target in the branch."""
+        return self.client.target().list(branch=branch,
+                                         maxResults=10000).execute()
+
+    def insert_local_build(self, external_id, target, branch):
+        """Insert a build record.
+        Args:
+            external_id: unique id of build record.
+            target: build target.
+            branch: build branch.
+
+        Returns:
+            A build record object.
+        """
+        body = {
+            "buildId": "",
+            "externalId": external_id,
+            "branch": branch,
+            "target": {
+                "name": target,
+                "target": target
+            },
+            "buildAttemptStatus": "complete",
+        }
+        return self.client.build().insert(buildType="local",
+                                          body=body).execute()
+
+    def insert_build_attempts(self, build_record):
+        """Insert a build attempt record.
+        Args:
+            build_record: build record.
+
+        Returns:
+            A build attempt object.
+        """
+        build_attempt = {
+            "id": 0,
+            "status": "complete",
+            "successful": True
+        }
+        return self.client.buildattempt().insert(
+            buildId=build_record['buildId'],
+            target=build_record['target']['name'],
+            body=build_attempt).execute()
+
+    def insert_invocation(self, build_record):
+        """Insert a build invocation record.
+        Args:
+            build_record: build record.
+
+        Returns:
+            A build invocation object.
+        """
+        invocation = {
+            "primaryBuild": {
+                "buildId": build_record['buildId'],
+                "buildTarget": build_record['target']['name'],
+                "branch": build_record['branch'],
+            },
+            "schedulerState": "running"
+        }
+        return self.client.invocation().insert(body=invocation).execute()
+
+    def update_invocation(self, invocation):
+        """Insert a build invocation record.
+        Args:
+            invocation: invocation record.
+
+        Returns:
+            A invocation object.
+        """
+        return self.client.invocation().update(
+            resourceId=invocation['invocationId'],
+            body=invocation).execute()
+
+    def insert_work_unit(self, invocation_record):
+        """Insert a workunit record.
+        Args:
+            invocation_record: invocation record.
+
+        Returns:
+            The workunit object.
+        """
+        workunit = {
+            'invocationId': invocation_record['invocationId']
+        }
+        return self.client.workunit().insert(body=workunit).execute()
diff --git a/atest/metrics/metrics_base.py b/atest/metrics/metrics_base.py
index 18320a3..9ffc837 100644
--- a/atest/metrics/metrics_base.py
+++ b/atest/metrics/metrics_base.py
@@ -92,7 +92,7 @@
         _user_key = str(asuite_metrics._get_grouping_key())
     #pylint: disable=broad-except
     except Exception:
-        _user_key = asuite_metrics.DUMMY_UUID
+        _user_key = asuite_metrics.UNUSED_UUID
     _user_type = get_user_type()
     _log_source = ATEST_LOG_SOURCE[_user_type]
     cc = clearcut_client.Clearcut(_log_source)
diff --git a/atest/module_info.py b/atest/module_info.py
index 0be78ed..024d234 100644
--- a/atest/module_info.py
+++ b/atest/module_info.py
@@ -21,13 +21,25 @@
 import json
 import logging
 import os
+import shutil
+import sys
+import tempfile
+import time
 
 import atest_utils
 import constants
 
+from metrics import metrics
+
 # JSON file generated by build system that lists all buildable targets.
 _MODULE_INFO = 'module-info.json'
-
+# JSON file generated by build system that lists dependencies for java.
+_JAVA_DEP_INFO = 'module_bp_java_deps.json'
+# JSON file generated by build system that lists dependencies for cc.
+_CC_DEP_INFO = 'module_bp_cc_deps.json'
+# JSON file generated by atest that merges the content of module-info.json,
+# module_bp_java_deps.json, and module_bp_cc_deps.json.
+_MERGED_INFO = 'atest_merged_dep.json'
 
 class ModuleInfo:
     """Class that offers fast/easy lookup for Module related details."""
@@ -61,6 +73,7 @@
         Returns:
             Tuple of module_info_target and path to module file.
         """
+        logging.debug('Probing and validating module info...')
         module_info_target = None
         root_dir = os.environ.get(constants.ANDROID_BUILD_TOP, '/')
         out_dir = os.environ.get(constants.ANDROID_PRODUCT_OUT, root_dir)
@@ -78,13 +91,20 @@
             module_file_path = os.path.join(
                 os.environ.get(constants.ANDROID_PRODUCT_OUT), _MODULE_INFO)
             module_info_target = module_file_path
-        if not os.path.isfile(module_file_path) or force_build:
+        # Make sure module-info exists and can be loaded properly.
+        if not atest_utils.is_valid_json_file(module_file_path) or force_build:
             logging.debug('Generating %s - this is required for '
-                          'initial runs.', _MODULE_INFO)
+                          'initial runs or forced rebuilds.', _MODULE_INFO)
             build_env = dict(constants.ATEST_BUILD_ENV)
-            atest_utils.build([module_info_target],
-                              verbose=logging.getLogger().isEnabledFor(
-                                  logging.DEBUG), env_vars=build_env)
+            build_start = time.time()
+            if not atest_utils.build([module_info_target],
+                                     verbose=logging.getLogger().isEnabledFor(
+                                         logging.DEBUG), env_vars=build_env):
+                sys.exit(constants.EXIT_CODE_BUILD_FAILURE)
+            build_duration = time.time() - build_start
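+            # Record how long the standalone module-info build took.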
+            metrics.LocalDetectEvent(
+                detect_type=constants.DETECT_TYPE_ONLY_BUILD_MODULE_INFO,
+                result=int(build_duration))
         return module_info_target, module_file_path
 
     def _load_module_info_file(self, force_build, module_file):
@@ -105,8 +125,15 @@
         if not file_path:
             module_info_target, file_path = self._discover_mod_file_and_target(
                 force_build)
+        merged_file_path = self.get_atest_merged_info_path()
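+        # Reuse the previously merged atest_merged_dep.json when it exists and
+        # does not need to be regenerated.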
+        if (not self.need_update_merged_file(force_build)
+            and os.path.exists(merged_file_path)):
+            file_path = merged_file_path
+            logging.debug('Loading %s as module-info.', file_path)
         with open(file_path) as json_file:
             mod_info = json.load(json_file)
+        if self.need_update_merged_file(force_build):
+            mod_info = self._merge_build_system_infos(mod_info)
         return module_info_target, mod_info
 
     @staticmethod
@@ -136,11 +163,13 @@
 
     def is_module(self, name):
         """Return True if name is a module, False otherwise."""
-        return name in self.name_to_module_info
+        if self.get_module_info(name):
+            return True
+        return False
 
     def get_paths(self, name):
         """Return paths of supplied module name, Empty list if non-existent."""
-        info = self.name_to_module_info.get(name)
+        info = self.get_module_info(name)
         if info:
             return info.get(constants.MODULE_PATH, [])
         return []
@@ -158,16 +187,16 @@
                 for m in self.path_to_module_info.get(rel_module_path, [])]
 
     def get_module_info(self, mod_name):
-        """Return dict of info for given module name, None if non-existent."""
+        """Return dict of info for given module name, None if non-existence."""
         module_info = self.name_to_module_info.get(mod_name)
         # Android's build system will automatically add a 2nd arch bitness
         # string at the end of the module name, which will make atest unable to
-        # finding matched module. Rescan the module-info with matched module
+        # find the matched module. Rescan the module-info with the matched module
         # name without bitness.
         if not module_info:
-            for _, module_info in self.name_to_module_info.items():
-                if mod_name == module_info.get(constants.MODULE_NAME, ''):
-                    break
+            for _, mod_info in self.name_to_module_info.items():
+                if mod_name == mod_info.get(constants.MODULE_NAME, ''):
+                    return mod_info
         return module_info
 
     def is_suite_in_compatibility_suites(self, suite, mod_info):
@@ -264,7 +293,7 @@
             String of module that is the runnable robolectric module, None if
             none could be found.
         """
-        module_name_info = self.name_to_module_info.get(module_name)
+        module_name_info = self.get_module_info(module_name)
         if not module_name_info:
             return None
         module_paths = module_name_info.get(constants.MODULE_PATH, [])
@@ -306,7 +335,7 @@
             True if the test config file will be generated automatically.
         """
         if self.is_module(module_name):
-            mod_info = self.name_to_module_info.get(module_name)
+            mod_info = self.get_module_info(module_name)
             auto_test_config = mod_info.get('auto_test_config', [])
             return auto_test_config and auto_test_config[0]
         return False
@@ -337,3 +366,235 @@
         mod_info = self.get_module_info(module_name)
         return constants.MODULE_CLASS_NATIVE_TESTS in mod_info.get(
             constants.MODULE_CLASS, [])
+
+    def has_mainline_modules(self, module_name, mainline_modules):
+        """Check if the mainline modules are in module-info.
+
+        Args:
+            module_name: A string of the module name.
+            mainline_modules: A list of mainline modules.
+
+        Returns:
+            True if mainline_modules is in module-info, False otherwise.
+        """
+        # TODO: (b/165425972) Check AndroidTest.xml or specific test config.
+        mod_info = self.get_module_info(module_name)
+        if mainline_modules in mod_info.get(constants.MODULE_MAINLINE_MODULES,
+                                            []):
+            return True
+        return False
+
+    def generate_atest_merged_dep_file(self):
+        """Method for generating atest_merged_dep.json."""
+        self._merge_build_system_infos(self.name_to_module_info,
+                                       self.get_java_dep_info_path(),
+                                       self.get_cc_dep_info_path())
+
+    def _merge_build_system_infos(self, name_to_module_info,
+            java_bp_info_path=None, cc_bp_info_path=None):
+        """Merge the full build system's info to name_to_module_info.
+
+        Args:
+            name_to_module_info: Dict of module name to module info dict.
+            java_bp_info_path: String of path to java dep file to load up.
+                               Used for testing.
+            cc_bp_info_path: String of path to cc dep file to load up.
+                             Used for testing.
+
+        Returns:
+            Dict of name_to_module_info merged with the given dependency info files.
+        """
+        # Merge _JAVA_DEP_INFO
+        if not java_bp_info_path:
+            java_bp_info_path = self.get_java_dep_info_path()
+        if atest_utils.is_valid_json_file(java_bp_info_path):
+            with open(java_bp_info_path) as json_file:
+                java_bp_infos = json.load(json_file)
+                logging.debug('Merging Java build info: %s', java_bp_info_path)
+                name_to_module_info = self._merge_soong_info(
+                    name_to_module_info, java_bp_infos)
+        # Merge _CC_DEP_INFO
+        if not cc_bp_info_path:
+            cc_bp_info_path = self.get_cc_dep_info_path()
+        if atest_utils.is_valid_json_file(cc_bp_info_path):
+            with open(cc_bp_info_path) as json_file:
+                cc_bp_infos = json.load(json_file)
+            logging.debug('Merging CC build info: %s', cc_bp_info_path)
+            # CC's dep json format is different from Java's.
+            # Below is the example content:
+            # {
+            #   "clang": "${ANDROID_ROOT}/bin/clang",
+            #   "clang++": "${ANDROID_ROOT}/bin/clang++",
+            #   "modules": {
+            #       "ACameraNdkVendorTest": {
+            #           "path": [
+            #                   "frameworks/av/camera/ndk"
+            #           ],
+            #           "srcs": [
+            #                   "frameworks/tests/AImageVendorTest.cpp",
+            #                   "frameworks/tests/ACameraManagerTest.cpp"
+            #           ],
+            name_to_module_info = self._merge_soong_info(
+                name_to_module_info, cc_bp_infos.get('modules', {}))
+        return name_to_module_info
+
+    def _merge_soong_info(self, name_to_module_info, mod_bp_infos):
+        """Merge the dependency and srcs in mod_bp_infos to name_to_module_info.
+
+        Args:
+            name_to_module_info: Dict of module name to module info dict.
+            mod_bp_infos: Dict of module name to bp's module info dict.
+
+        Returns:
+            Dict of name_to_module_info merged with mod_bp_infos.
+        """
+        merge_items = [constants.MODULE_DEPENDENCIES, constants.MODULE_SRCS]
+        for module_name, dep_info in mod_bp_infos.items():
+            if name_to_module_info.get(module_name, None):
+                mod_info = name_to_module_info.get(module_name)
+                for merge_item in merge_items:
+                    dep_info_values = dep_info.get(merge_item, [])
+                    mod_info_values = mod_info.get(merge_item, [])
+                    for dep_info_value in dep_info_values:
+                        if dep_info_value not in mod_info_values:
+                            mod_info_values.append(dep_info_value)
+                    mod_info_values.sort()
+                    name_to_module_info[
+                        module_name][merge_item] = mod_info_values
+        output_file = self.get_atest_merged_info_path()
+        if not os.path.isdir(os.path.dirname(output_file)):
+            os.makedirs(os.path.dirname(output_file))
+        # b/178559543: saving the merged module info in a temp file and copying
+        # it to atest_merged_dep.json eliminates the possibility of concurrent
+        # access producing an invalid JSON file.
+        temp_file = tempfile.NamedTemporaryFile()
+        with open(temp_file.name, 'w') as _temp:
+            json.dump(name_to_module_info, _temp, indent=0)
+        shutil.copy(temp_file.name, output_file)
+        temp_file.close()
+        return name_to_module_info
+
+    def get_module_dependency(self, module_name, depend_on=None):
+        """Get the dependency sets for input module.
+
+        Recursively find all the dependencies of the input module.
+
+        Args:
+            module_name: String of module to check.
+            depend_on: A set of parent dependencies.
+
+        Returns:
+            Set of dependency modules.
+        """
+        if not depend_on:
+            depend_on = set()
+        deps = set()
+        mod_info = self.get_module_info(module_name)
+        if not mod_info:
+            return deps
+        mod_deps = set(mod_info.get(constants.MODULE_DEPENDENCIES, []))
+        # Remove items from mod_deps that are already in depend_on.
+        mod_deps = mod_deps - depend_on
+        deps = deps.union(mod_deps)
+        for mod_dep in mod_deps:
+            deps = deps.union(set(self.get_module_dependency(
+                mod_dep, depend_on=depend_on.union(deps))))
+        return deps
+
+    def get_install_module_dependency(self, module_name, depend_on=None):
+        """Get the dependency set for the given modules with installed path.
+
+        Args:
+            module_name: String of module to check.
+            depend_on: A set of parent dependencies.
+
+        Returns:
+            Set of dependency modules which have an installed path.
+        """
+        install_deps = set()
+        deps = self.get_module_dependency(module_name, depend_on)
+        logging.debug('%s depends on: %s', module_name, deps)
+        for module in deps:
+            mod_info = self.get_module_info(module)
+            if mod_info and mod_info.get(constants.MODULE_INSTALLED, []):
+                install_deps.add(module)
+        logging.debug('modules %s required by %s have installed paths',
+                      install_deps, module_name)
+        return install_deps
+
+    @staticmethod
+    def get_atest_merged_info_path():
+        """Returns the path for atest_merged_dep.json.
+
+        Returns:
+            String for atest_merged_dep.json.
+        """
+        return os.path.join(atest_utils.get_build_out_dir(),
+                            'soong', _MERGED_INFO)
+
+    @staticmethod
+    def get_java_dep_info_path():
+        """Returns the path for atest_merged_dep.json.
+
+        Returns:
+            String for atest_merged_dep.json.
+        """
+        return os.path.join(atest_utils.get_build_out_dir(),
+                            'soong', _JAVA_DEP_INFO)
+
+    @staticmethod
+    def get_cc_dep_info_path():
+        """Returns the path for atest_merged_dep.json.
+
+        Returns:
+            String for atest_merged_dep.json.
+        """
+        return os.path.join(atest_utils.get_build_out_dir(),
+                            'soong', _CC_DEP_INFO)
+
+    def has_soong_info(self):
+        """Ensure the existence of soong info files.
+
+        Returns:
+            True if soong info need to merge, false otherwise.
+        """
+        return (os.path.isfile(self.get_java_dep_info_path()) and
+                os.path.isfile(self.get_cc_dep_info_path()))
+
+    def need_update_merged_file(self, force_build=False):
+        """Check if need to update/generated atest_merged_dep.
+
+        If force_build, always update merged info.
+        If not force build, if soong info exist but merged inforamtion not exist,
+        need to update merged file.
+
+        Args:
+            force_build: Boolean to indicate whether the user wants to rebuild
+                         the module_info file regardless of whether it exists.
+
+        Returns:
+            True if atest_merged_dep should be updated, False otherwise.
+        """
+        return (force_build or
+                (self.has_soong_info() and
+                 not os.path.exists(self.get_atest_merged_info_path())))
+
+    def is_unit_test(self, mod_info):
+        """Return True if input module is unit test, False otherwise.
+
+        Args:
+            mod_info: ModuleInfo to check.
+
+        Returns:
+            True if the input module is a unit test, False otherwise.
+        """
+        return mod_info.get(constants.MODULE_IS_UNIT_TEST, '') == 'true'
+
+    def get_all_unit_tests(self):
+        """Get a list of all the module names which are unit tests."""
+        unit_tests = []
+        for mod_name, mod_info in self.name_to_module_info.items():
+            if mod_info.get(constants.MODULE_NAME, '') == mod_name:
+                if self.is_unit_test(mod_info):
+                    unit_tests.append(mod_name)
+        return unit_tests
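
For reference, a minimal sketch of the recursive walk that the new get_module_dependency() performs, using a hand-written dict in place of the merged module-info. The graph, the 'dependencies' key, and the module names below are illustrative stand-ins, not real build modules.

    # Illustrative only: walk a hand-written dependency graph the way
    # get_module_dependency() walks the merged module-info. The depend_on set
    # carries everything already visited so a cyclic graph cannot recurse forever.
    DEP_GRAPH = {
        'mod_top': {'dependencies': ['mod_a', 'mod_b']},
        'mod_a': {'dependencies': ['mod_leaf']},
        'mod_b': {'dependencies': ['mod_a']},        # shared dependency
        'mod_leaf': {'dependencies': ['mod_b']},     # cycle back to mod_b
    }

    def get_module_dependency(name, depend_on=None):
        depend_on = depend_on or set()
        deps = set()
        info = DEP_GRAPH.get(name)
        if not info:
            return deps
        # Drop anything already seen on this path, then recurse on the rest.
        mod_deps = set(info.get('dependencies', [])) - depend_on
        deps |= mod_deps
        for dep in mod_deps:
            deps |= get_module_dependency(dep, depend_on=depend_on | deps)
        return deps

    print(sorted(get_module_dependency('mod_top')))
    # -> ['mod_a', 'mod_b', 'mod_leaf']
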
diff --git a/atest/module_info_unittest.py b/atest/module_info_unittest.py
index 994918f..59df34b 100755
--- a/atest/module_info_unittest.py
+++ b/atest/module_info_unittest.py
@@ -23,6 +23,7 @@
 
 from unittest import mock
 
+import atest_utils
 import constants
 import module_info
 import unittest_utils
@@ -42,8 +43,8 @@
 NON_RUN_ROBO_MOD_NAME = 'robo_mod'
 RUN_ROBO_MOD_NAME = 'run_robo_mod'
 NON_RUN_ROBO_MOD = {constants.MODULE_NAME: NON_RUN_ROBO_MOD_NAME,
-                    constants.MODULE_PATH: ROBO_MOD_PATH,
-                    constants.MODULE_CLASS: ['random_class']}
+                    constants.MODULE_PATH: ROBO_MOD_PATH,
+                    constants.MODULE_CLASS: ['random_class']}
 RUN_ROBO_MOD = {constants.MODULE_NAME: RUN_ROBO_MOD_NAME,
                 constants.MODULE_PATH: ROBO_MOD_PATH,
                 constants.MODULE_CLASS: [constants.MODULE_CLASS_ROBOLECTRIC]}
@@ -60,15 +61,22 @@
                constants.MODULE_PATH: 'a/b/c/path',
                constants.MODULE_CLASS: ['random_class']}
 NAME_TO_MODULE_INFO = {'random_name' : MODULE_INFO}
+MERGED_DEP = '/tmp/out/atest_merged_dep.json'
 
 #pylint: disable=protected-access
 class ModuleInfoUnittests(unittest.TestCase):
     """Unit tests for module_info.py"""
 
+    def tearDown(self):
+        if os.path.isfile(MERGED_DEP):
+            os.remove(MERGED_DEP)
+
+    @mock.patch.object(module_info.ModuleInfo, '_merge_soong_info')
     @mock.patch('json.load', return_value={})
     @mock.patch('builtins.open', new_callable=mock.mock_open)
     @mock.patch('os.path.isfile', return_value=True)
-    def test_load_mode_info_file_out_dir_handling(self, _isfile, _open, _json):
+    def test_load_mode_info_file_out_dir_handling(self, _isfile, _open, _json,
+                                                 _merge):
         """Test _load_module_info_file out dir handling."""
         # Test out default out dir is used.
         build_top = '/path/to/top'
@@ -104,7 +112,8 @@
             self.assertEqual(custom_abs_out_dir_mod_targ,
                              mod_info.module_info_target)
 
-    @mock.patch.object(module_info.ModuleInfo, '_load_module_info_file',)
+    @mock.patch.dict('os.environ', {constants.ANDROID_BUILD_TOP:'/'})
+    @mock.patch.object(module_info.ModuleInfo, '_load_module_info_file')
     def test_get_path_to_module_info(self, mock_load_module):
         """Test that we correctly create the path to module info dict."""
         mod_one = 'mod1'
@@ -124,6 +133,7 @@
         self.assertDictEqual(path_to_mod_info,
                              mod_info._get_path_to_module_info(mod_info_dict))
 
+    @mock.patch.dict('os.environ', {constants.ANDROID_BUILD_TOP:'/'})
     def test_is_module(self):
         """Test that we get the module when it's properly loaded."""
         # Load up the test json file and check that module is in it
@@ -131,6 +141,7 @@
         self.assertTrue(mod_info.is_module(EXPECTED_MOD_TARGET))
         self.assertFalse(mod_info.is_module(UNEXPECTED_MOD_TARGET))
 
+    @mock.patch.dict('os.environ', {constants.ANDROID_BUILD_TOP:'/'})
     def test_get_path(self):
         """Test that we get the module path when it's properly loaded."""
         # Load up the test json file and check that module is in it
@@ -139,6 +150,7 @@
                          EXPECTED_MOD_TARGET_PATH)
         self.assertEqual(mod_info.get_paths(MOD_NO_PATH), [])
 
+    @mock.patch.dict('os.environ', {constants.ANDROID_BUILD_TOP:'/'})
     def test_get_module_names(self):
         """test that we get the module name properly."""
         mod_info = module_info.ModuleInfo(module_file=JSON_FILE_PATH)
@@ -148,6 +160,7 @@
             self, mod_info.get_module_names(PATH_TO_MULT_MODULES),
             MULT_MOODULES_WITH_SHARED_PATH)
 
+    @mock.patch.dict('os.environ', {constants.ANDROID_BUILD_TOP:'/'})
     def test_path_to_mod_info(self):
         """test that we get the module name properly."""
         mod_info = module_info.ModuleInfo(module_file=JSON_FILE_PATH)
@@ -158,6 +171,7 @@
         TESTABLE_MODULES_WITH_SHARED_PATH.sort()
         self.assertEqual(module_list, TESTABLE_MODULES_WITH_SHARED_PATH)
 
+    @mock.patch.dict('os.environ', {constants.ANDROID_BUILD_TOP:'/'})
     def test_is_suite_in_compatibility_suites(self):
         """Test is_suite_in_compatibility_suites."""
         mod_info = module_info.ModuleInfo(module_file=JSON_FILE_PATH)
@@ -171,6 +185,7 @@
         self.assertTrue(mod_info.is_suite_in_compatibility_suites("vts10", info3))
         self.assertFalse(mod_info.is_suite_in_compatibility_suites("ats", info3))
 
+    @mock.patch.dict('os.environ', {constants.ANDROID_BUILD_TOP:'/'})
     @mock.patch.object(module_info.ModuleInfo, 'is_testable_module')
     @mock.patch.object(module_info.ModuleInfo, 'is_suite_in_compatibility_suites')
     def test_get_testable_modules(self, mock_is_suite_exist, mock_is_testable):
@@ -186,6 +201,7 @@
         self.assertEqual(0, len(mod_info.get_testable_modules('test_suite')))
         self.assertEqual(1, len(mod_info.get_testable_modules()))
 
+    @mock.patch.dict('os.environ', {constants.ANDROID_BUILD_TOP:'/'})
     @mock.patch.object(module_info.ModuleInfo, 'has_test_config')
     @mock.patch.object(module_info.ModuleInfo, 'is_robolectric_test')
     def test_is_testable_module(self, mock_is_robo_test, mock_has_test_config):
@@ -218,13 +234,16 @@
         self.assertTrue(mod_info.has_test_config({}))
         # Validate when actual config exists and there's no auto-generated config.
         mock_is_auto_gen.return_value = False
+        info = {constants.MODULE_PATH:[uc.TEST_DATA_DIR]}
         self.assertTrue(mod_info.has_test_config(info))
         self.assertFalse(mod_info.has_test_config({}))
         # Validate the case mod_info MODULE_TEST_CONFIG be set
         info2 = {constants.MODULE_PATH:[uc.TEST_CONFIG_DATA_DIR],
-                 constants.MODULE_TEST_CONFIG:[os.path.join(uc.TEST_CONFIG_DATA_DIR, "a.xml")]}
+                 constants.MODULE_TEST_CONFIG:[os.path.join(
+                     uc.TEST_CONFIG_DATA_DIR, "a.xml.data")]}
         self.assertTrue(mod_info.has_test_config(info2))
 
+    @mock.patch.dict('os.environ', {constants.ANDROID_BUILD_TOP:'/'})
     @mock.patch.object(module_info.ModuleInfo, 'get_module_names')
     def test_get_robolectric_test_name(self, mock_get_module_names):
         """Test get_robolectric_test_name."""
@@ -241,6 +260,7 @@
         self.assertEqual(mod_info.get_robolectric_test_name(
             NON_RUN_ROBO_MOD_NAME), None)
 
+    @mock.patch.dict('os.environ', {constants.ANDROID_BUILD_TOP:'/'})
     @mock.patch.object(module_info.ModuleInfo, 'get_module_info')
     @mock.patch.object(module_info.ModuleInfo, 'get_module_names')
     def test_is_robolectric_test(self, mock_get_module_names, mock_get_module_info):
@@ -259,6 +279,7 @@
         mock_get_module_info.return_value = None
         self.assertFalse(mod_info.is_robolectric_test('rand_mod'))
 
+    @mock.patch.dict('os.environ', {constants.ANDROID_BUILD_TOP:'/'})
     @mock.patch.object(module_info.ModuleInfo, 'is_module')
     def test_is_auto_gen_test_config(self, mock_is_module):
         """Test is_auto_gen_test_config correctly detects the module."""
@@ -277,6 +298,7 @@
         self.assertFalse(mod_info.is_auto_gen_test_config(MOD_NAME3))
         self.assertFalse(mod_info.is_auto_gen_test_config(MOD_NAME4))
 
+    @mock.patch.dict('os.environ', {constants.ANDROID_BUILD_TOP:'/'})
     def test_is_robolectric_module(self):
         """Test is_robolectric_module correctly detects the module."""
         mod_info = module_info.ModuleInfo(module_file=JSON_FILE_PATH)
@@ -288,6 +310,143 @@
         self.assertTrue(mod_info.is_robolectric_module(MOD_INFO_DICT[MOD_NAME1]))
         self.assertFalse(mod_info.is_robolectric_module(MOD_INFO_DICT[MOD_NAME2]))
 
+    @mock.patch.object(module_info.ModuleInfo, 'get_atest_merged_info_path')
+    @mock.patch.dict('os.environ', {constants.ANDROID_BUILD_TOP:'/'})
+    def test_merge_build_system_infos(self, _merge):
+        """Test _merge_build_system_infos."""
+        _merge.return_value = MERGED_DEP
+        mod_info = module_info.ModuleInfo(module_file=JSON_FILE_PATH)
+        java_dep_file = os.path.join(uc.TEST_DATA_DIR,
+                                     'module_bp_java_deps.json')
+        mod_info_1 = {constants.MODULE_NAME: 'module_1',
+                      constants.MODULE_DEPENDENCIES: []}
+        name_to_mod_info = {'module_1' : mod_info_1}
+        expect_deps = ['test_dep_level_1_1', 'test_dep_level_1_2']
+        name_to_mod_info = mod_info._merge_build_system_infos(
+            name_to_mod_info, java_bp_info_path=java_dep_file)
+        self.assertEqual(
+            name_to_mod_info['module_1'].get(constants.MODULE_DEPENDENCIES),
+            expect_deps)
+
+    @mock.patch.object(module_info.ModuleInfo, 'get_atest_merged_info_path')
+    @mock.patch.dict('os.environ', {constants.ANDROID_BUILD_TOP:'/'})
+    def test_merge_dependency_with_ori_dependency(self, _merge):
+        """Test _merge_dependency."""
+        _merge.return_value = MERGED_DEP
+        mod_info = module_info.ModuleInfo(module_file=JSON_FILE_PATH)
+        java_dep_file = os.path.join(uc.TEST_DATA_DIR,
+                                     'module_bp_java_deps.json')
+        mod_info_1 = {constants.MODULE_NAME: 'module_1',
+                      constants.MODULE_DEPENDENCIES: ['ori_dep_1']}
+        name_to_mod_info = {'module_1' : mod_info_1}
+        expect_deps = ['ori_dep_1', 'test_dep_level_1_1', 'test_dep_level_1_2']
+        name_to_mod_info = mod_info._merge_build_system_infos(
+            name_to_mod_info, java_bp_info_path=java_dep_file)
+        self.assertEqual(
+            name_to_mod_info['module_1'].get(constants.MODULE_DEPENDENCIES),
+            expect_deps)
+
+    @mock.patch.object(module_info.ModuleInfo, 'get_atest_merged_info_path')
+    @mock.patch.dict('os.environ', {constants.ANDROID_BUILD_TOP:'/'})
+    def test_get_module_dependency(self, _merge):
+        """Test get_module_dependency."""
+        _merge.return_value = MERGED_DEP
+        mod_info = module_info.ModuleInfo(module_file=JSON_FILE_PATH)
+        java_dep_file = os.path.join(uc.TEST_DATA_DIR,
+                                     'module_bp_java_deps.json')
+        expect_deps = {'test_dep_level_1_1', 'module_1', 'test_dep_level_1_2',
+                       'test_dep_level_2_2', 'test_dep_level_2_1', 'module_2'}
+        mod_info._merge_build_system_infos(mod_info.name_to_module_info,
+                                   java_bp_info_path=java_dep_file)
+        self.assertEqual(
+            mod_info.get_module_dependency('dep_test_module'),
+            expect_deps)
+
+    @mock.patch.object(module_info.ModuleInfo, 'get_atest_merged_info_path')
+    @mock.patch.dict('os.environ', {constants.ANDROID_BUILD_TOP:'/'})
+    def test_get_module_dependency_w_loop(self, _merge):
+        """Test get_module_dependency with problem dep file."""
+        _merge.return_value = MERGED_DEP
+        mod_info = module_info.ModuleInfo(module_file=JSON_FILE_PATH)
+        # Java dependency file with a endless loop define.
+        java_dep_file = os.path.join(uc.TEST_DATA_DIR,
+                                     'module_bp_java_loop_deps.json')
+        expect_deps = {'test_dep_level_1_1', 'module_1', 'test_dep_level_1_2',
+                       'test_dep_level_2_2', 'test_dep_level_2_1', 'module_2'}
+        mod_info._merge_build_system_infos(mod_info.name_to_module_info,
+                                   java_bp_info_path=java_dep_file)
+        self.assertEqual(
+            mod_info.get_module_dependency('dep_test_module'),
+            expect_deps)
+
+    @mock.patch.object(module_info.ModuleInfo, 'get_atest_merged_info_path')
+    @mock.patch.dict('os.environ', {constants.ANDROID_BUILD_TOP:'/'})
+    def test_get_install_module_dependency(self, _merge):
+        """Test get_install_module_dependency."""
+        _merge.return_value = MERGED_DEP
+        mod_info = module_info.ModuleInfo(module_file=JSON_FILE_PATH)
+        java_dep_file = os.path.join(uc.TEST_DATA_DIR,
+                                     'module_bp_java_deps.json')
+        expect_deps = {'module_1', 'test_dep_level_2_1'}
+        mod_info._merge_build_system_infos(mod_info.name_to_module_info,
+                                           java_bp_info_path=java_dep_file)
+        self.assertEqual(
+            mod_info.get_install_module_dependency('dep_test_module'),
+            expect_deps)
+
+    @mock.patch.object(module_info.ModuleInfo, 'get_atest_merged_info_path')
+    @mock.patch.dict('os.environ', {constants.ANDROID_BUILD_TOP:'/'})
+    def test_cc_merge_build_system_infos(self, _merge):
+        """Test _merge_build_system_infos for cc."""
+        _merge.return_value = MERGED_DEP
+        mod_info = module_info.ModuleInfo(module_file=JSON_FILE_PATH)
+        cc_dep_file = os.path.join(uc.TEST_DATA_DIR,
+                                     'module_bp_cc_deps.json')
+        mod_info_1 = {constants.MODULE_NAME: 'module_cc_1',
+                      constants.MODULE_DEPENDENCIES: []}
+        name_to_mod_info = {'module_cc_1' : mod_info_1}
+        expect_deps = ['test_cc_dep_level_1_1', 'test_cc_dep_level_1_2']
+        name_to_mod_info = mod_info._merge_build_system_infos(
+            name_to_mod_info, cc_bp_info_path=cc_dep_file)
+        self.assertEqual(
+            name_to_mod_info['module_cc_1'].get(constants.MODULE_DEPENDENCIES),
+            expect_deps)
+
+    @mock.patch.object(atest_utils, 'get_build_out_dir')
+    def test_get_atest_merged_info_path(self, mock_out_dir):
+        """Test get_atest_merged_info_path."""
+        expect_out = '/test/output/'
+        mock_out_dir.return_value = expect_out
+        expect_path = os.path.join(expect_out, 'soong',
+                                   module_info._MERGED_INFO)
+        self.assertEqual(expect_path, module_info.ModuleInfo.get_atest_merged_info_path())
+
+    @mock.patch.object(atest_utils, 'get_build_out_dir')
+    def test_get_java_dep_info_path(self, mock_out_dir):
+        """Test get_java_dep_info_path."""
+        expect_out = '/test/output/'
+        mock_out_dir.return_value = expect_out
+        expect_path = os.path.join(expect_out, 'soong',
+                                   module_info._JAVA_DEP_INFO)
+        self.assertEqual(expect_path, module_info.ModuleInfo.get_java_dep_info_path())
+
+    @mock.patch.object(atest_utils, 'get_build_out_dir')
+    def test_get_cc_dep_info_path(self, mock_out_dir):
+        """Test get_cc_dep_info_path."""
+        expect_out = '/test/output/'
+        mock_out_dir.return_value = expect_out
+        expect_path = os.path.join(expect_out, 'soong',
+                                   module_info._CC_DEP_INFO)
+        self.assertEqual(expect_path, module_info.ModuleInfo.get_cc_dep_info_path())
+
+    @mock.patch.dict('os.environ', {constants.ANDROID_BUILD_TOP:'/'})
+    def test_is_unit_test(self):
+        """Test is_unit_test."""
+        module_name = 'myModule'
+        maininfo_with_unittest = {constants.MODULE_NAME: module_name,
+                                  constants.MODULE_IS_UNIT_TEST: 'true'}
+        mod_info = module_info.ModuleInfo(module_file=JSON_FILE_PATH)
+        self.assertTrue(mod_info.is_unit_test(maininfo_with_unittest))
 
 if __name__ == '__main__':
     unittest.main()
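
The updated unit tests repeatedly pin ANDROID_BUILD_TOP with mock.patch.dict so ModuleInfo never reads the real environment. A standalone sketch of that pattern, with an invented build_top() helper standing in for the code under test:

    # Minimal sketch of the mock.patch.dict pattern the new tests use to pin an
    # environment variable for one test. Only the patching technique mirrors the
    # unittest changes; build_top() is a made-up stand-in.
    import os
    import unittest
    from unittest import mock


    def build_top():
        """Stand-in for code that reads ANDROID_BUILD_TOP from the environment."""
        return os.environ.get('ANDROID_BUILD_TOP', '')


    class EnvPinningTest(unittest.TestCase):

        @mock.patch.dict('os.environ', {'ANDROID_BUILD_TOP': '/'})
        def test_build_top_is_pinned(self):
            # Inside the test the variable is forced to '/'; the decorator
            # restores the original environment afterwards.
            self.assertEqual(build_top(), '/')


    if __name__ == '__main__':
        unittest.main()
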
diff --git a/atest/result_reporter.py b/atest/result_reporter.py
index 2d433a4..225dec9 100644
--- a/atest/result_reporter.py
+++ b/atest/result_reporter.py
@@ -12,6 +12,7 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
+# pylint: disable=import-outside-toplevel
 # pylint: disable=line-too-long
 
 """
@@ -80,6 +81,7 @@
 BENCHMARK_OPTIONAL_KEYS = {'bytes_per_second', 'label'}
 BENCHMARK_EVENT_KEYS = BENCHMARK_ESSENTIAL_KEYS.union(BENCHMARK_OPTIONAL_KEYS)
 INT_KEYS = {'cpu_time', 'real_time'}
+ITER_SUMMARY = {}
 
 class PerfInfo():
     """Class for storing performance test of a test run."""
@@ -260,7 +262,7 @@
               'VtsTradefedTestRunner': {'Module1': RunStat(passed:4, failed:0)}}
     """
 
-    def __init__(self, silent=False):
+    def __init__(self, silent=False, collect_only=False, flakes_info=False):
         """Init ResultReporter.
 
         Args:
@@ -274,6 +276,9 @@
         self.log_path = None
         self.silent = silent
         self.rerun_options = ''
+        self.collect_only = collect_only
+        self.flakes_info = flakes_info
+        self.test_result_link = None
 
     def process_test_result(self, test):
         """Given the results of a single test, update stats and print results.
@@ -333,25 +338,42 @@
         """Print starting text for running tests."""
         print(au.colorize('\nRunning Tests...', constants.CYAN))
 
-    def print_summary(self, is_collect_tests_only=False):
-        """Print summary of all test runs.
+    def set_current_summary(self, run_num):
+        """Set current test summary to ITER_SUMMARY."""
+        run_summary = []
+        for runner_name, groups in self.runners.items():
+            for group_name, stats in groups.items():
+                name = group_name if group_name else runner_name
+                summary = self.process_summary(name, stats)
+                run_summary.append(summary)
+        summary_list = ITER_SUMMARY.get(run_num, [])
+        # Do not add redundant summary items.
+        if not set(run_summary).issubset(set(summary_list)):
+            summary_list.extend(run_summary)
+            ITER_SUMMARY[run_num] = summary_list
 
-        Args:
-            is_collect_tests_only: A boolean of collect_tests_only.
+    # pylint: disable=too-many-branches
+    def print_summary(self):
+        """Print summary of all test runs.
 
         Returns:
             0 if all tests pass, non-zero otherwise.
 
         """
-        if is_collect_tests_only:
+        if self.collect_only:
             return self.print_collect_tests()
         tests_ret = constants.EXIT_CODE_SUCCESS
         if not self.runners:
             return tests_ret
         print('\n{}'.format(au.colorize('Summary', constants.CYAN)))
         print(au.delimiter('-', 7))
-        if self.rerun_options:
-            print(self.rerun_options)
+        iterations = len(ITER_SUMMARY)
+        for iter_num, summary_list in ITER_SUMMARY.items():
+            if iterations > 1:
+                print(au.colorize("ITERATION %s" % (int(iter_num) + 1),
+                                  constants.BLUE))
+            for summary in summary_list:
+                print(summary)
         failed_sum = len(self.failed_tests)
         for runner_name, groups in self.runners.items():
             if groups == UNSUPPORTED_FLAG:
@@ -370,7 +392,8 @@
                 if stats.run_errors:
                     tests_ret = constants.EXIT_CODE_TEST_FAILURE
                     failed_sum += 1 if not stats.failed else 0
-                print(summary)
+                if not ITER_SUMMARY:
+                    print(summary)
         self.run_stats.perf_info.print_perf_info()
         print()
         if tests_ret == constants.EXIT_CODE_SUCCESS:
@@ -383,6 +406,12 @@
             self.print_failed_tests()
         if self.log_path:
             print('Test Logs have saved in %s' % self.log_path)
+        # TODO(b/174535786) Handle unexpected exceptions raised while uploading
+        # test results.
+        # TODO(b/174627499) Save this information in atest history.
+        if self.test_result_link:
+            print('Test Result uploaded to %s'
+                  % au.colorize(self.test_result_link, constants.GREEN))
         return tests_ret
 
     def print_collect_tests(self):
@@ -411,8 +440,21 @@
         """Print the failed tests if existed."""
         if self.failed_tests:
             for test_name in self.failed_tests:
-                print('%s' % test_name)
+                failed_details = test_name
+                if self.flakes_info:
+                    flakes_method = test_name.replace('#', '.')
+                    flakes_info = au.get_flakes(test_method=flakes_method)
+                    if (flakes_info and
+                            flakes_info.get(constants.FLAKE_PERCENT, None)):
+                        failed_details += (
+                            ': flakes percent: {}%, flakes postsubmit per week:'
+                            ' {}'.format(float(flakes_info.get(
+                                constants.FLAKE_PERCENT)),
+                                         flakes_info.get(
+                                             constants.FLAKE_POSTSUBMIT, '0')))
+                print(failed_details)
 
+    # pylint: disable=too-many-locals
     def process_summary(self, name, stats):
         """Process the summary line.
 
@@ -434,25 +476,56 @@
         """
         passed_label = 'Passed'
         failed_label = 'Failed'
+        flakes_label = ''
         ignored_label = 'Ignored'
         assumption_failed_label = 'Assumption Failed'
         error_label = ''
+        host_log_content = ''
+        flakes_percent = ''
         if stats.failed > 0:
             failed_label = au.colorize(failed_label, constants.RED)
+            mod_list = name.split()
+            module = ''
+            if len(mod_list) > 1:
+                module = mod_list[1]
+            if module and self.flakes_info:
+                flakes_info = au.get_flakes(test_module=module)
+                if (flakes_info and
+                        flakes_info.get(constants.FLAKE_PERCENT, None)):
+                    flakes_label = au.colorize('Flakes Percent:',
+                                               constants.RED)
+                    flakes_percent = '{:.2f}%'.format(float(flakes_info.get(
+                        constants.FLAKE_PERCENT)))
         if stats.run_errors:
             error_label = au.colorize('(Completed With ERRORS)', constants.RED)
+            # Only extract host_log_content if the test runner is Tradefed.
+            # Import here to prevent circular-import error.
+            from test_runners import atest_tf_test_runner
+            if name == atest_tf_test_runner.AtestTradefedTestRunner.NAME:
+                find_logs = au.find_files(self.log_path,
+                                          file_name=constants.TF_HOST_LOG)
+                if find_logs:
+                    host_log_content = au.colorize(
+                        '\n\nTradefederation host log:', constants.RED)
+                for tf_log_zip in find_logs:
+                    host_log_content = host_log_content + au.extract_zip_text(
+                        tf_log_zip)
         elif stats.failed == 0:
             passed_label = au.colorize(passed_label, constants.GREEN)
-        summary = '%s: %s: %s, %s: %s, %s: %s, %s: %s %s' % (name,
-                                                             passed_label,
-                                                             stats.passed,
-                                                             failed_label,
-                                                             stats.failed,
-                                                             ignored_label,
-                                                             stats.ignored,
-                                                             assumption_failed_label,
-                                                             stats.assumption_failed,
-                                                             error_label)
+        summary = ('%s: %s: %s, %s: %s, %s: %s, %s: %s, %s %s %s %s'
+                   % (name,
+                      passed_label,
+                      stats.passed,
+                      failed_label,
+                      stats.failed,
+                      ignored_label,
+                      stats.ignored,
+                      assumption_failed_label,
+                      stats.assumption_failed,
+                      flakes_label,
+                      flakes_percent,
+                      error_label,
+                      host_log_content))
         return summary
 
     def _update_stats(self, test, group):
@@ -497,6 +570,7 @@
         underline = '-' * (len(title))
         print('\n%s\n%s' % (title, underline))
 
+    # pylint: disable=too-many-branches
     def _print_result(self, test):
         """Print the results of a single test.
 
@@ -519,37 +593,32 @@
             self.pre_test = test
             return
         if test.test_name:
+            color = ''
             if test.status == test_runner_base.PASSED_STATUS:
                 # Example of output:
                 # [78/92] test_name: PASSED (92ms)
-                print('[%s/%s] %s: %s %s' % (test.test_count,
-                                             test.group_total,
-                                             test.test_name,
-                                             au.colorize(
-                                                 test.status,
-                                                 constants.GREEN),
-                                             test.test_time))
-                for key, data in test.additional_info.items():
-                    if key not in BENCHMARK_EVENT_KEYS:
-                        print('\t%s: %s' % (au.colorize(key, constants.BLUE), data))
-            elif test.status == test_runner_base.IGNORED_STATUS:
+                color = constants.GREEN
+            elif test.status in (test_runner_base.IGNORED_STATUS,
+                                 test_runner_base.ASSUMPTION_FAILED):
                 # Example: [33/92] test_name: IGNORED (12ms)
-                print('[%s/%s] %s: %s %s' % (test.test_count, test.group_total,
-                                             test.test_name, au.colorize(
-                                                 test.status, constants.MAGENTA),
-                                             test.test_time))
-            elif test.status == test_runner_base.ASSUMPTION_FAILED:
                 # Example: [33/92] test_name: ASSUMPTION_FAILED (12ms)
-                print('[%s/%s] %s: %s %s' % (test.test_count, test.group_total,
-                                             test.test_name, au.colorize(
-                                                 test.status, constants.MAGENTA),
-                                             test.test_time))
+                color = constants.MAGENTA
             else:
                 # Example: [26/92] test_name: FAILED (32ms)
-                print('[%s/%s] %s: %s %s' % (test.test_count, test.group_total,
-                                             test.test_name, au.colorize(
-                                                 test.status, constants.RED),
-                                             test.test_time))
-        if test.status == test_runner_base.FAILED_STATUS:
-            print('\nSTACKTRACE:\n%s' % test.details)
+                color = constants.RED
+            print('[{}/{}] {}'.format(test.test_count,
+                                      test.group_total,
+                                      test.test_name), end='')
+            if self.collect_only:
+                print()
+            else:
+                print(': {} {}'.format(au.colorize(test.status, color),
+                                       test.test_time))
+            if test.status == test_runner_base.PASSED_STATUS:
+                for key, data in test.additional_info.items():
+                    if key not in BENCHMARK_EVENT_KEYS:
+                        print('\t%s: %s' % (au.colorize(key, constants.BLUE),
+                                            data))
+            if test.status == test_runner_base.FAILED_STATUS:
+                print('\nSTACKTRACE:\n%s' % test.details)
         self.pre_test = test
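
The reporter now collects per-iteration summaries in the module-level ITER_SUMMARY dict before printing them. A simplified, self-contained sketch of that bookkeeping; the summary strings are invented, and only the de-duplication and iteration grouping mirror set_current_summary():

    # Simplified sketch of accumulating per-iteration summary lines, in the
    # spirit of ITER_SUMMARY/set_current_summary(). Not the actual reporter.
    ITER_SUMMARY = {}

    def set_current_summary(run_num, run_summary):
        """Append new summary lines for run_num, skipping duplicates."""
        summary_list = ITER_SUMMARY.get(run_num, [])
        if not set(run_summary).issubset(set(summary_list)):
            summary_list.extend(run_summary)
            ITER_SUMMARY[run_num] = summary_list

    set_current_summary(0, ['CtsFooTestCases: Passed: 4, Failed: 0'])
    set_current_summary(0, ['CtsFooTestCases: Passed: 4, Failed: 0'])  # duplicate, ignored
    set_current_summary(1, ['CtsFooTestCases: Passed: 3, Failed: 1'])
    for iter_num, lines in ITER_SUMMARY.items():
        print('ITERATION %s' % (iter_num + 1))
        for line in lines:
            print('  %s' % line)
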
diff --git a/atest/run_atest_unittests.sh b/atest/run_atest_unittests.sh
index 5ba6ed9..c528eda 100755
--- a/atest/run_atest_unittests.sh
+++ b/atest/run_atest_unittests.sh
@@ -30,7 +30,7 @@
 PIP=pip3
 
 function python3_checker() {
-    if ! which $PYTHON; then
+    if ! which $PYTHON >/dev/null 2>&1; then
         echo "python3 not found."; exit 1
     fi
 }
@@ -42,7 +42,7 @@
 function print_summary() {
     local test_results=$1
     if [[ $COVERAGE == true ]]; then
-        coverage report -m
+        coverage report --show-missing
         coverage html
     fi
     if [[ $test_results -eq 0 ]]; then
@@ -57,10 +57,9 @@
     for mod in ${mods_to_check[@]}; do
         if ! $PIP freeze | grep $mod >/dev/null 2>&1; then
             $PIP install -U --user $mod
-        else
-            if ! (head -n1 $(which $mod) | grep -q $PYTHON); then
-                sed -i "1 s/python/$PYTHON/" $(which $mod)
-            fi
+        fi
+        if ! (head -n1 $(which $mod) | grep -q $PYTHON); then
+            sed -i "1 s/python/$PYTHON/" $(which $mod)
         fi
     done
 }
diff --git a/atest/test_data/test_commands.json b/atest/test_data/test_commands.json
index 2b6c008..cf456eb 100644
--- a/atest/test_data/test_commands.json
+++ b/atest/test_data/test_commands.json
@@ -1,59 +1,490 @@
 {
-"hello_world_test": [
-"atest_tradefed.sh template/atest_local_min --template:map test=atest --include-filter hello_world_test --log-level WARN --skip-loading-config-jar --logcat-on-failure --no-enable-granular-attempts"
-], 
-"packages/apps/Car/Messenger/tests/robotests/src/com/android/car/messenger/MessengerDelegateTest.java": [
-"./build/soong/soong_ui.bash --make-mode RunCarMessengerRoboTests"
-], 
-"CtsAnimationTestCases:AnimatorTest": [
-"atest_tradefed.sh template/atest_local_min --template:map test=atest --include-filter CtsAnimationTestCases --atest-include-filter CtsAnimationTestCases:android.animation.cts.AnimatorTest --log-level WARN --skip-loading-config-jar --logcat-on-failure --no-enable-granular-attempts"
-], 
-"CtsSampleDeviceTestCases:android.sample.cts.SampleDeviceReportLogTest": [
-"atest_tradefed.sh template/atest_local_min --template:map test=atest --include-filter CtsSampleDeviceTestCases --atest-include-filter CtsSampleDeviceTestCases:android.sample.cts.SampleDeviceReportLogTest --log-level WARN --skip-loading-config-jar --logcat-on-failure --no-enable-granular-attempts"
-], 
-"CtsAnimationTestCases CtsSampleDeviceTestCases": [
-"atest_tradefed.sh template/atest_local_min --template:map test=atest --include-filter CtsAnimationTestCases --include-filter CtsSampleDeviceTestCases --log-level WARN --skip-loading-config-jar --logcat-on-failure --no-enable-granular-attempts"
-], 
 "AnimatorTest": [
-"atest_tradefed.sh template/atest_local_min --template:map test=atest --include-filter CtsAnimationTestCases --atest-include-filter CtsAnimationTestCases:android.animation.cts.AnimatorTest --log-level WARN --skip-loading-config-jar --logcat-on-failure --no-enable-granular-attempts"
-], 
-"PacketFragmenterTest": [
-"atest_tradefed.sh template/atest_local_min --template:map test=atest --include-filter net_test_hci --atest-include-filter net_test_hci:PacketFragmenterTest.* --log-level WARN --skip-loading-config-jar --logcat-on-failure --no-enable-granular-attempts"
-], 
-"android.animation.cts": [
-"atest_tradefed.sh template/atest_local_min --template:map test=atest --include-filter CtsAnimationTestCases --atest-include-filter CtsAnimationTestCases:android.animation.cts --log-level WARN --skip-loading-config-jar --logcat-on-failure --no-enable-granular-attempts"
-], 
-"platform_testing/tests/example/native/Android.bp": [
-"atest_tradefed.sh template/atest_local_min --template:map test=atest --include-filter hello_world_test --log-level WARN --skip-loading-config-jar --logcat-on-failure --no-enable-granular-attempts"
-], 
-"tools/tradefederation/core/res/config/native-benchmark.xml": [
-"atest_tradefed.sh template/atest_local_min --template:map test=atest --include-filter native-benchmark --log-level WARN --logcat-on-failure --no-enable-granular-attempts"
-], 
-"native-benchmark": [
-"atest_tradefed.sh template/atest_local_min --template:map test=atest --include-filter native-benchmark --log-level WARN --logcat-on-failure --no-enable-granular-attempts"
-], 
-"platform_testing/tests/example/native": [
-"atest_tradefed.sh template/atest_local_min --template:map test=atest --include-filter hello_world_test --log-level WARN --skip-loading-config-jar --logcat-on-failure --no-enable-granular-attempts"
-], 
-"VtsCodelabHelloWorldTest": [
-"vts10-tradefed run commandAndExit vts-staging-default -m VtsCodelabHelloWorldTest --skip-all-system-status-check --skip-preconditions --primary-abi-only"
-], 
-"aidegen_unittests": [
-"atest_tradefed.sh template/atest_local_min --template:map test=atest --atest-log-file-path=/tmp/atest_run_1568627341_v33kdA/log --include-filter aidegen_unittests --log-level WARN"
-], 
-"HelloWorldTests": [
-"atest_tradefed.sh template/atest_local_min --template:map test=atest --include-filter HelloWorldTests --log-level WARN --skip-loading-config-jar --logcat-on-failure --no-enable-granular-attempts"
-], 
-"CtsSampleDeviceTestCases:SampleDeviceTest#testSharedPreferences": [
-"atest_tradefed.sh template/atest_local_min --template:map test=atest --include-filter CtsSampleDeviceTestCases --atest-include-filter CtsSampleDeviceTestCases:android.sample.cts.SampleDeviceTest#testSharedPreferences --log-level WARN --skip-loading-config-jar --logcat-on-failure --no-enable-granular-attempts"
-], 
-"CtsSampleDeviceTestCases:android.sample.cts": [
-"atest_tradefed.sh template/atest_local_min --template:map test=atest --include-filter CtsSampleDeviceTestCases --atest-include-filter CtsSampleDeviceTestCases:android.sample.cts --log-level WARN --skip-loading-config-jar --logcat-on-failure --no-enable-granular-attempts"
-], 
-"PacketFragmenterTest#test_no_fragment_necessary,test_ble_fragment_necessary": [
-"atest_tradefed.sh template/atest_local_min --template:map test=atest --include-filter net_test_hci --atest-include-filter net_test_hci:PacketFragmenterTest.test_ble_fragment_necessary:PacketFragmenterTest.test_no_fragment_necessary --log-level WARN --skip-loading-config-jar --logcat-on-failure --no-enable-granular-attempts"
-], 
+"--atest-include-filter",
+"--include-filter",
+"--log-level",
+"--log-level-display",
+"--logcat-on-failure",
+"--no-early-device-release",
+"--no-enable-granular-attempts",
+"--skip-loading-config-jar",
+"--template:map",
+"--test-arg",
+"CtsAnimationTestCases",
+"CtsAnimationTestCases:android.animation.cts.AnimatorTest",
+"VERBOSE",
+"VERBOSE",
+"atest_tradefed.sh",
+"com.android.tradefed.testtype.AndroidJUnitTest:exclude-annotation:android.platform.test.annotations.AppModeInstant",
+"template/atest_local_min",
+"test=atest"
+],
 "CarMessengerRoboTests": [
-"./build/soong/soong_ui.bash --make-mode RunCarMessengerRoboTests"
+"--make-mode",
+"./build/soong/soong_ui.bash",
+"RunCarMessengerRoboTests"
+],
+"CtsAnimationTestCases:AnimatorTest": [
+"--atest-include-filter",
+"--include-filter",
+"--log-level",
+"--log-level-display",
+"--logcat-on-failure",
+"--no-early-device-release",
+"--no-enable-granular-attempts",
+"--skip-loading-config-jar",
+"--template:map",
+"--test-arg",
+"CtsAnimationTestCases",
+"CtsAnimationTestCases:android.animation.cts.AnimatorTest",
+"VERBOSE",
+"VERBOSE",
+"atest_tradefed.sh",
+"com.android.tradefed.testtype.AndroidJUnitTest:exclude-annotation:android.platform.test.annotations.AppModeInstant",
+"template/atest_local_min",
+"test=atest"
+],
+"CtsSampleDeviceTestCases:SampleDeviceTest#testSharedPreferences": [
+"--atest-include-filter",
+"--include-filter",
+"--log-level",
+"--log-level-display",
+"--logcat-on-failure",
+"--no-early-device-release",
+"--no-enable-granular-attempts",
+"--skip-loading-config-jar",
+"--template:map",
+"--test-arg",
+"CtsSampleDeviceTestCases",
+"CtsSampleDeviceTestCases:android.sample.cts.SampleDeviceTest#testSharedPreferences",
+"VERBOSE",
+"VERBOSE",
+"atest_tradefed.sh",
+"com.android.tradefed.testtype.AndroidJUnitTest:exclude-annotation:android.platform.test.annotations.AppModeInstant",
+"template/atest_local_min",
+"test=atest"
+],
+"CtsSampleDeviceTestCases:android.sample.cts": [
+"--atest-include-filter",
+"--include-filter",
+"--log-level",
+"--log-level-display",
+"--logcat-on-failure",
+"--no-early-device-release",
+"--no-enable-granular-attempts",
+"--skip-loading-config-jar",
+"--template:map",
+"--test-arg",
+"CtsSampleDeviceTestCases",
+"CtsSampleDeviceTestCases:android.sample.cts",
+"VERBOSE",
+"VERBOSE",
+"atest_tradefed.sh",
+"com.android.tradefed.testtype.AndroidJUnitTest:exclude-annotation:android.platform.test.annotations.AppModeInstant",
+"template/atest_local_min",
+"test=atest"
+],
+"CtsSampleDeviceTestCases:android.sample.cts.SampleDeviceReportLogTest": [
+"--atest-include-filter",
+"--include-filter",
+"--log-level",
+"--log-level-display",
+"--logcat-on-failure",
+"--no-early-device-release",
+"--no-enable-granular-attempts",
+"--skip-loading-config-jar",
+"--template:map",
+"--test-arg",
+"CtsSampleDeviceTestCases",
+"CtsSampleDeviceTestCases:android.sample.cts.SampleDeviceReportLogTest",
+"VERBOSE",
+"VERBOSE",
+"atest_tradefed.sh",
+"com.android.tradefed.testtype.AndroidJUnitTest:exclude-annotation:android.platform.test.annotations.AppModeInstant",
+"template/atest_local_min",
+"test=atest"
+],
+"CtsAnimationTestCases CtsSampleDeviceTestCases": [
+"--include-filter",
+"--include-filter",
+"--log-level",
+"--log-level-display",
+"--logcat-on-failure",
+"--no-early-device-release",
+"--no-enable-granular-attempts",
+"--skip-loading-config-jar",
+"--template:map",
+"--test-arg",
+"CtsAnimationTestCases",
+"CtsSampleDeviceTestCases",
+"VERBOSE",
+"VERBOSE",
+"atest_tradefed.sh",
+"com.android.tradefed.testtype.AndroidJUnitTest:exclude-annotation:android.platform.test.annotations.AppModeInstant",
+"template/atest_local_min",
+"test=atest"
+],
+"CtsWindowManagerDeviceTestCases:android.server.wm.DisplayCutoutTests#testDisplayCutout_default": [
+"--atest-include-filter",
+"--include-filter",
+"--log-level",
+"--log-level-display",
+"--logcat-on-failure",
+"--no-early-device-release",
+"--no-enable-granular-attempts",
+"--skip-loading-config-jar",
+"--template:map",
+"CtsWindowManagerDeviceTestCases",
+"CtsWindowManagerDeviceTestCases:android.server.wm.DisplayCutoutTests#testDisplayCutout_default*",
+"VERBOSE",
+"VERBOSE",
+"atest_tradefed.sh",
+"template/atest_local_min",
+"test=atest"
+],
+"HelloWorldTests": [
+"--include-filter",
+"--log-level",
+"--log-level-display",
+"--logcat-on-failure",
+"--no-early-device-release",
+"--no-enable-granular-attempts",
+"--skip-loading-config-jar",
+"--template:map",
+"HelloWorldTests",
+"VERBOSE",
+"VERBOSE",
+"atest_tradefed.sh",
+"template/atest_local_min",
+"test=atest"
+],
+"android.animation.cts": [
+"--atest-include-filter",
+"--include-filter",
+"--log-level",
+"--log-level-display",
+"--logcat-on-failure",
+"--no-early-device-release",
+"--no-enable-granular-attempts",
+"--skip-loading-config-jar",
+"--template:map",
+"--test-arg",
+"CtsAnimationTestCases",
+"CtsAnimationTestCases:android.animation.cts",
+"VERBOSE",
+"VERBOSE",
+"atest_tradefed.sh",
+"com.android.tradefed.testtype.AndroidJUnitTest:exclude-annotation:android.platform.test.annotations.AppModeInstant",
+"template/atest_local_min",
+"test=atest"
+],
+"cts/tests/framework/base/windowmanager/src/android/server/wm/DisplayCutoutTests.java#testDisplayCutout_default": [
+"--atest-include-filter",
+"--include-filter",
+"--log-level",
+"--log-level-display",
+"--logcat-on-failure",
+"--no-early-device-release",
+"--no-enable-granular-attempts",
+"--skip-loading-config-jar",
+"--template:map",
+"CtsWindowManagerDeviceTestCases",
+"CtsWindowManagerDeviceTestCases:android.server.wm.DisplayCutoutTests#testDisplayCutout_default*",
+"VERBOSE",
+"VERBOSE",
+"atest_tradefed.sh",
+"template/atest_local_min",
+"test=atest"
+],
+"hello_world_test": [
+"--include-filter",
+"--log-level",
+"--log-level-display",
+"--logcat-on-failure",
+"--no-early-device-release",
+"--no-enable-granular-attempts",
+"--skip-loading-config-jar",
+"--template:map",
+"VERBOSE",
+"VERBOSE",
+"atest_tradefed.sh",
+"hello_world_test",
+"template/atest_local_min",
+"test=atest"
+],
+"native-benchmark": [
+"--include-filter",
+"--log-level",
+"--log-level-display",
+"--logcat-on-failure",
+"--no-early-device-release",
+"--no-enable-granular-attempts",
+"--template:map",
+"VERBOSE",
+"VERBOSE",
+"atest_tradefed.sh",
+"native-benchmark",
+"template/atest_local_min",
+"test=atest"
+],
+"packages/apps/Car/Messenger/tests/robotests/src/com/android/car/messenger/MessengerDelegateTest.java": [
+"--make-mode",
+"./build/soong/soong_ui.bash",
+"RunCarMessengerRoboTests"
+],
+"platform_testing/tests/example/native": [
+"--include-filter",
+"--log-level",
+"--log-level-display",
+"--logcat-on-failure",
+"--no-early-device-release",
+"--no-enable-granular-attempts",
+"--skip-loading-config-jar",
+"--template:map",
+"VERBOSE",
+"VERBOSE",
+"atest_tradefed.sh",
+"hello_world_test",
+"template/atest_local_min",
+"test=atest"
+],
+"platform_testing/tests/example/native/Android.bp": [
+"--include-filter",
+"--log-level",
+"--log-level-display",
+"--logcat-on-failure",
+"--no-early-device-release",
+"--no-enable-granular-attempts",
+"--skip-loading-config-jar",
+"--template:map",
+"VERBOSE",
+"VERBOSE",
+"atest_tradefed.sh",
+"hello_world_test",
+"template/atest_local_min",
+"test=atest"
+],
+"tools/tradefederation/core/res/config/native-benchmark.xml": [
+"--include-filter",
+"--log-level",
+"--log-level-display",
+"--logcat-on-failure",
+"--no-early-device-release",
+"--no-enable-granular-attempts",
+"--template:map",
+"VERBOSE",
+"VERBOSE",
+"atest_tradefed.sh",
+"native-benchmark",
+"template/atest_local_min",
+"test=atest"
+],
+"PacketFragmenterTest": [
+"--atest-include-filter",
+"--include-filter",
+"--log-level",
+"--log-level-display",
+"--logcat-on-failure",
+"--no-early-device-release",
+"--no-enable-granular-attempts",
+"--skip-loading-config-jar",
+"--template:map",
+"VERBOSE",
+"VERBOSE",
+"atest_tradefed.sh",
+"net_test_hci",
+"net_test_hci:PacketFragmenterTest.*",
+"template/atest_local_min",
+"test=atest"
+],
+"PacketFragmenterTest#test_no_fragment_necessary,test_ble_fragment_necessary": [
+"--atest-include-filter",
+"--include-filter",
+"--log-level",
+"--log-level-display",
+"--logcat-on-failure",
+"--no-early-device-release",
+"--no-enable-granular-attempts",
+"--skip-loading-config-jar",
+"--template:map",
+"VERBOSE",
+"VERBOSE",
+"atest_tradefed.sh",
+"net_test_hci",
+"net_test_hci:PacketFragmenterTest.test_ble_fragment_necessary:PacketFragmenterTest.test_no_fragment_necessary",
+"template/atest_local_min",
+"test=atest"
+],
+"VtsHalCameraProviderV2_4TargetTest:PerInstance/CameraHidlTest#startStopPreview/0_internal_0": [
+"--atest-include-filter",
+"--include-filter",
+"--log-level",
+"--log-level-display",
+"--logcat-on-failure",
+"--no-early-device-release",
+"--no-enable-granular-attempts",
+"--skip-loading-config-jar",
+"--template:map",
+"VERBOSE",
+"VERBOSE",
+"VtsHalCameraProviderV2_4TargetTest",
+"VtsHalCameraProviderV2_4TargetTest:PerInstance/CameraHidlTest.startStopPreview/0_internal_0",
+"atest_tradefed.sh",
+"template/atest_local_min",
+"test=atest"
+],
+"MixedManagedProfileOwnerTest#testPasswordSufficientInitially": [
+"--atest-include-filter",
+"--include-filter",
+"--log-level",
+"--log-level-display",
+"--logcat-on-failure",
+"--no-early-device-release",
+"--no-enable-granular-attempts",
+"--skip-loading-config-jar",
+"--template:map",
+"CtsDevicePolicyManagerTestCases",
+"CtsDevicePolicyManagerTestCases:com.android.cts.devicepolicy.MixedManagedProfileOwnerTest#testPasswordSufficientInitially",
+"VERBOSE",
+"VERBOSE",
+"atest_tradefed.sh",
+"template/atest_local_min",
+"test=atest"
+],
+"android.sample.cts.SampleDeviceReportLogTest": [
+"--atest-include-filter",
+"--include-filter",
+"--log-level",
+"--log-level-display",
+"--logcat-on-failure",
+"--no-early-device-release",
+"--no-enable-granular-attempts",
+"--skip-loading-config-jar",
+"--template:map",
+"--test-arg",
+"CtsSampleDeviceTestCases",
+"CtsSampleDeviceTestCases:android.sample.cts.SampleDeviceReportLogTest",
+"VERBOSE",
+"VERBOSE",
+"atest_tradefed.sh",
+"com.android.tradefed.testtype.AndroidJUnitTest:exclude-annotation:android.platform.test.annotations.AppModeInstant",
+"template/atest_local_min",
+"test=atest"
+],
+"android.sample.cts.SampleDeviceTest#testSharedPreferences": [
+"--atest-include-filter",
+"--include-filter",
+"--log-level",
+"--log-level-display",
+"--logcat-on-failure",
+"--no-early-device-release",
+"--no-enable-granular-attempts",
+"--skip-loading-config-jar",
+"--template:map",
+"--test-arg",
+"CtsSampleDeviceTestCases",
+"CtsSampleDeviceTestCases:android.sample.cts.SampleDeviceTest#testSharedPreferences",
+"VERBOSE",
+"VERBOSE",
+"atest_tradefed.sh",
+"com.android.tradefed.testtype.AndroidJUnitTest:exclude-annotation:android.platform.test.annotations.AppModeInstant",
+"template/atest_local_min",
+"test=atest"
+],
+"com.android.server.wm.ScreenDecorWindowTests": [
+"--atest-include-filter",
+"--include-filter",
+"--log-level",
+"--log-level-display",
+"--logcat-on-failure",
+"--no-early-device-release",
+"--no-enable-granular-attempts",
+"--skip-loading-config-jar",
+"--template:map",
+"VERBOSE",
+"VERBOSE",
+"WmTests",
+"WmTests:com.android.server.wm.ScreenDecorWindowTests",
+"atest_tradefed.sh",
+"template/atest_local_min",
+"test=atest"
+],
+"com.android.server.wm.ScreenDecorWindowTests#testMultipleDecors": [
+"--atest-include-filter",
+"--include-filter",
+"--log-level",
+"--log-level-display",
+"--logcat-on-failure",
+"--no-early-device-release",
+"--no-enable-granular-attempts",
+"--skip-loading-config-jar",
+"--template:map",
+"VERBOSE",
+"VERBOSE",
+"WmTests",
+"WmTests:com.android.server.wm.ScreenDecorWindowTests#testMultipleDecors",
+"atest_tradefed.sh",
+"template/atest_local_min",
+"test=atest"
+],
+"android.os.cts.CompanionDeviceManagerTest": [
+"--atest-include-filter",
+"--include-filter",
+"--log-level",
+"--log-level-display",
+"--logcat-on-failure",
+"--no-early-device-release",
+"--no-enable-granular-attempts",
+"--skip-loading-config-jar",
+"--template:map",
+"--test-arg",
+"CtsOsTestCases",
+"CtsOsTestCases:android.os.cts.CompanionDeviceManagerTest",
+"VERBOSE",
+"VERBOSE",
+"atest_tradefed.sh",
+"com.android.tradefed.testtype.AndroidJUnitTest:exclude-annotation:android.platform.test.annotations.AppModeInstant",
+"template/atest_local_min",
+"test=atest"
+],
+"android.os.cts.CompanionDeviceManagerTest#testIsDeviceAssociatedWithCompanionApproveWifiConnectionsPermission": [
+"--atest-include-filter",
+"--include-filter",
+"--log-level",
+"--log-level-display",
+"--logcat-on-failure",
+"--no-early-device-release",
+"--no-enable-granular-attempts",
+"--skip-loading-config-jar",
+"--template:map",
+"--test-arg",
+"CtsOsTestCases",
+"CtsOsTestCases:android.os.cts.CompanionDeviceManagerTest#testIsDeviceAssociatedWithCompanionApproveWifiConnectionsPermission",
+"VERBOSE",
+"VERBOSE",
+"atest_tradefed.sh",
+"com.android.tradefed.testtype.AndroidJUnitTest:exclude-annotation:android.platform.test.annotations.AppModeInstant",
+"template/atest_local_min",
+"test=atest"
+],
+"cts/tests/tests/os/src/android/os/cts/CompanionDeviceManagerTest.kt#testIsDeviceAssociated": [
+"--atest-include-filter",
+"--include-filter",
+"--log-level",
+"--log-level-display",
+"--logcat-on-failure",
+"--no-early-device-release",
+"--no-enable-granular-attempts",
+"--skip-loading-config-jar",
+"--template:map",
+"--test-arg",
+"CtsOsTestCases",
+"CtsOsTestCases:android.os.cts.CompanionDeviceManagerTest#testIsDeviceAssociated",
+"VERBOSE",
+"VERBOSE",
+"atest_tradefed.sh",
+"com.android.tradefed.testtype.AndroidJUnitTest:exclude-annotation:android.platform.test.annotations.AppModeInstant",
+"template/atest_local_min",
+"test=atest"
 ]
-}
+}
\ No newline at end of file
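
The fixture above now stores each expected command as a sorted list of arguments instead of one long command string, which allows comparisons that ignore argument order. A hedged sketch of how such a comparison could be done; the helper name and the shlex-based tokenization are assumptions for illustration, not the actual atest test code:

    # Hypothetical helper: compare a generated command line against a fixture
    # stored as a sorted token list, ignoring argument order.
    import shlex

    def tokens_match(command_line, expected_tokens):
        """Return True if the command's tokens equal expected_tokens, order aside."""
        return sorted(shlex.split(command_line)) == sorted(expected_tokens)

    expected = ['--make-mode', './build/soong/soong_ui.bash', 'RunCarMessengerRoboTests']
    print(tokens_match('./build/soong/soong_ui.bash --make-mode RunCarMessengerRoboTests',
                       expected))  # True
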
diff --git a/atest/test_finder_handler.py b/atest/test_finder_handler.py
index 67f5a34..33261af 100644
--- a/atest/test_finder_handler.py
+++ b/atest/test_finder_handler.py
@@ -63,7 +63,8 @@
                                         'MODULE_PACKAGE', 'MODULE_FILE_PATH',
                                         'INTEGRATION_FILE_PATH', 'INTEGRATION',
                                         'SUITE', 'CC_CLASS', 'SUITE_PLAN',
-                                        'SUITE_PLAN_FILE_PATH', 'CACHE'])
+                                        'SUITE_PLAN_FILE_PATH', 'CACHE',
+                                        'CONFIG'])
 
 _REF_TYPE_TO_FUNC_MAP = {
     _REFERENCE_TYPE.MODULE: module_finder.ModuleFinder.find_test_by_module_name,
@@ -83,6 +84,7 @@
     _REFERENCE_TYPE.SUITE_PLAN_FILE_PATH:
         suite_plan_finder.SuitePlanFinder.find_test_by_suite_path,
     _REFERENCE_TYPE.CACHE: cache_finder.CacheFinder.find_test_by_cache,
+    _REFERENCE_TYPE.CONFIG: module_finder.ModuleFinder.find_test_by_config_name,
 }
 
 
@@ -119,6 +121,7 @@
         pass
     return test_finders_list
 
+# pylint: disable=too-many-branches
 # pylint: disable=too-many-return-statements
 def _get_test_reference_types(ref):
     """Determine type of test reference based on the content of string.
@@ -147,14 +150,22 @@
                     _REFERENCE_TYPE.INTEGRATION_FILE_PATH,
                     _REFERENCE_TYPE.MODULE_FILE_PATH,
                     _REFERENCE_TYPE.SUITE_PLAN_FILE_PATH]
+        if ':' in ref:
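+            # e.g. 'int/test:Class#method' may be an integration path or a
+            # MODULE:CLASS reference, so MODULE_CLASS is also considered.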
+            return [_REFERENCE_TYPE.CACHE,
+                    _REFERENCE_TYPE.INTEGRATION_FILE_PATH,
+                    _REFERENCE_TYPE.MODULE_FILE_PATH,
+                    _REFERENCE_TYPE.INTEGRATION,
+                    _REFERENCE_TYPE.SUITE_PLAN_FILE_PATH,
+                    _REFERENCE_TYPE.MODULE_CLASS]
         return [_REFERENCE_TYPE.CACHE,
                 _REFERENCE_TYPE.INTEGRATION_FILE_PATH,
                 _REFERENCE_TYPE.MODULE_FILE_PATH,
                 _REFERENCE_TYPE.INTEGRATION,
                 _REFERENCE_TYPE.SUITE_PLAN_FILE_PATH,
+                _REFERENCE_TYPE.CC_CLASS,
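+                # e.g. 'Class_Prefix/Class#Method/Method_Suffix' may also be a
+                # parameterized gtest (CC) class.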
                 # TODO: Uncomment in SUITE when it's supported
                 # _REFERENCE_TYPE.SUITE
-               ]
+                ]
     if '.' in ref:
         ref_end = ref.rsplit('.', 1)[-1]
         ref_end_is_upper = ref_end[0].isupper()
@@ -196,6 +207,7 @@
             # TODO: Uncomment in SUITE when it's supported
             # _REFERENCE_TYPE.SUITE,
             _REFERENCE_TYPE.MODULE,
+            _REFERENCE_TYPE.CONFIG,
             _REFERENCE_TYPE.SUITE_PLAN,
             _REFERENCE_TYPE.CLASS,
             _REFERENCE_TYPE.CC_CLASS]
diff --git a/atest/test_finder_handler_unittest.py b/atest/test_finder_handler_unittest.py
index 5888565..4578553 100755
--- a/atest/test_finder_handler_unittest.py
+++ b/atest/test_finder_handler_unittest.py
@@ -98,22 +98,26 @@
         self.assertEqual(
             test_finder_handler._get_test_reference_types('ModuleOrClassName'),
             [REF_TYPE.CACHE, REF_TYPE.INTEGRATION, REF_TYPE.MODULE,
-             REF_TYPE.SUITE_PLAN, REF_TYPE.CLASS, REF_TYPE.CC_CLASS]
+             REF_TYPE.CONFIG, REF_TYPE.SUITE_PLAN, REF_TYPE.CLASS,
+             REF_TYPE.CC_CLASS]
         )
         self.assertEqual(
             test_finder_handler._get_test_reference_types('Module_or_Class_name'),
             [REF_TYPE.CACHE, REF_TYPE.INTEGRATION, REF_TYPE.MODULE,
-             REF_TYPE.SUITE_PLAN, REF_TYPE.CLASS, REF_TYPE.CC_CLASS]
+             REF_TYPE.CONFIG, REF_TYPE.SUITE_PLAN, REF_TYPE.CLASS,
+             REF_TYPE.CC_CLASS]
         )
         self.assertEqual(
             test_finder_handler._get_test_reference_types('SuiteName'),
             [REF_TYPE.CACHE, REF_TYPE.INTEGRATION, REF_TYPE.MODULE,
-             REF_TYPE.SUITE_PLAN, REF_TYPE.CLASS, REF_TYPE.CC_CLASS]
+             REF_TYPE.CONFIG, REF_TYPE.SUITE_PLAN, REF_TYPE.CLASS,
+             REF_TYPE.CC_CLASS]
         )
         self.assertEqual(
             test_finder_handler._get_test_reference_types('Suite-Name'),
             [REF_TYPE.CACHE, REF_TYPE.INTEGRATION, REF_TYPE.MODULE,
-             REF_TYPE.SUITE_PLAN, REF_TYPE.CLASS, REF_TYPE.CC_CLASS]
+             REF_TYPE.CONFIG, REF_TYPE.SUITE_PLAN, REF_TYPE.CLASS,
+             REF_TYPE.CC_CLASS]
         )
         self.assertEqual(
             test_finder_handler._get_test_reference_types('some.package'),
@@ -187,7 +191,7 @@
             test_finder_handler._get_test_reference_types('rel/path/to/test'),
             [REF_TYPE.CACHE, REF_TYPE.INTEGRATION_FILE_PATH,
              REF_TYPE.MODULE_FILE_PATH, REF_TYPE.INTEGRATION,
-             REF_TYPE.SUITE_PLAN_FILE_PATH]
+             REF_TYPE.SUITE_PLAN_FILE_PATH, REF_TYPE.CC_CLASS]
         )
         self.assertEqual(
             test_finder_handler._get_test_reference_types('/abs/path/to/test'),
@@ -198,24 +202,41 @@
             test_finder_handler._get_test_reference_types('int/test'),
             [REF_TYPE.CACHE, REF_TYPE.INTEGRATION_FILE_PATH,
              REF_TYPE.MODULE_FILE_PATH, REF_TYPE.INTEGRATION,
-             REF_TYPE.SUITE_PLAN_FILE_PATH]
+             REF_TYPE.SUITE_PLAN_FILE_PATH, REF_TYPE.CC_CLASS]
         )
         self.assertEqual(
             test_finder_handler._get_test_reference_types('int/test:fully.qual.Class#m'),
             [REF_TYPE.CACHE, REF_TYPE.INTEGRATION_FILE_PATH,
              REF_TYPE.MODULE_FILE_PATH, REF_TYPE.INTEGRATION,
-             REF_TYPE.SUITE_PLAN_FILE_PATH]
+             REF_TYPE.SUITE_PLAN_FILE_PATH, REF_TYPE.MODULE_CLASS]
         )
         self.assertEqual(
             test_finder_handler._get_test_reference_types('int/test:Class#method'),
             [REF_TYPE.CACHE, REF_TYPE.INTEGRATION_FILE_PATH,
              REF_TYPE.MODULE_FILE_PATH, REF_TYPE.INTEGRATION,
-             REF_TYPE.SUITE_PLAN_FILE_PATH]
+             REF_TYPE.SUITE_PLAN_FILE_PATH, REF_TYPE.MODULE_CLASS]
         )
         self.assertEqual(
             test_finder_handler._get_test_reference_types('int_name_no_slash:Class#m'),
             [REF_TYPE.CACHE, REF_TYPE.INTEGRATION, REF_TYPE.MODULE_CLASS]
         )
+        self.assertEqual(
+            test_finder_handler._get_test_reference_types(
+                'gtest_module:Class_Prefix/Class#Method/Method_Suffix'),
+            [REF_TYPE.CACHE, REF_TYPE.INTEGRATION_FILE_PATH,
+             REF_TYPE.MODULE_FILE_PATH, REF_TYPE.INTEGRATION,
+             REF_TYPE.SUITE_PLAN_FILE_PATH, REF_TYPE.MODULE_CLASS]
+        )
+        self.assertEqual(
+            test_finder_handler._get_test_reference_types(
+                'Class_Prefix/Class#Method/Method_Suffix'),
+            [REF_TYPE.CACHE,
+             REF_TYPE.INTEGRATION_FILE_PATH,
+             REF_TYPE.MODULE_FILE_PATH,
+             REF_TYPE.INTEGRATION,
+             REF_TYPE.SUITE_PLAN_FILE_PATH,
+             REF_TYPE.CC_CLASS]
+        )
 
     def test_get_registered_find_methods(self):
         """Test that we get the registered find methods."""
diff --git a/atest/test_finders/cache_finder.py b/atest/test_finders/cache_finder.py
index 7e0765c..951d925 100644
--- a/atest/test_finders/cache_finder.py
+++ b/atest/test_finders/cache_finder.py
@@ -16,7 +16,10 @@
 Cache Finder class.
 """
 
+import logging
+
 import atest_utils
+import constants
 
 from test_finders import test_finder_base
 from test_finders import test_info
@@ -25,8 +28,9 @@
     """Cache Finder class."""
     NAME = 'CACHE'
 
-    def __init__(self, **kwargs):
-        super(CacheFinder, self).__init__()
+    def __init__(self, module_info=None):
+        super().__init__()
+        self.module_info = module_info
 
     def _is_latest_testinfos(self, test_infos):
         """Check whether test_infos are up-to-date.
@@ -43,6 +47,7 @@
         for cached_test_info in test_infos:
             sorted_cache_ti = sorted(vars(cached_test_info).keys())
             if not sorted_cache_ti == sorted_base_ti:
+                logging.debug('test_info is not up-to-date.')
                 return False
         return True
 
@@ -57,6 +62,120 @@
             TestInfo format, else None.
         """
         test_infos = atest_utils.load_test_info_cache(test_reference)
-        if test_infos and self._is_latest_testinfos(test_infos):
+        if test_infos and self._is_test_infos_valid(test_infos):
             return test_infos
         return None
+
+    def _is_test_infos_valid(self, test_infos):
+        """Check if the given test_infos are valid.
+
+        Args:
+            test_infos: A list of TestInfo.
+
+        Returns:
+            True if test_infos are all valid. Otherwise, False.
+        """
+        if not self._is_latest_testinfos(test_infos):
+            return False
+        for t_info in test_infos:
+            if not self._is_test_path_valid(t_info):
+                return False
+            if not self._is_test_build_target_valid(t_info):
+                return False
+            if not self._is_test_filter_valid(t_info):
+                return False
+        return True
+
+    def _is_test_path_valid(self, t_info):
+        """Check if test path is valid.
+
+        Args:
+            t_info: TestInfo that has been filled out by a find method.
+
+        Returns:
+            True if test path is valid. Otherwise, False.
+        """
+        # A Robolectric test won't have a 'MODULES-IN-' build target. Treat
+        # the test path as valid if cached_test_paths is None.
+        cached_test_paths = t_info.get_test_paths()
+        if cached_test_paths is None:
+            return True
+        current_test_paths = self.module_info.get_paths(t_info.test_name)
+        if not current_test_paths:
+            return False
+        if sorted(cached_test_paths) != sorted(current_test_paths):
+            logging.debug('Not a valid test path.')
+            return False
+        return True
+
+    def _is_test_build_target_valid(self, t_info):
+        """Check if test build targets are valid.
+
+        Args:
+            t_info: TestInfo that has been filled out by a find method.
+
+        Returns:
+            True if test's build target is valid. Otherwise, False.
+        """
+        # If the cached build target can be found in the current module-info,
+        # then it is a valid build target of the test.
+        for build_target in t_info.build_targets:
+            if str(build_target).startswith(constants.MODULES_IN):
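+                # MODULES-IN-* targets are not listed in module-info; skip them.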
+                continue
+            if not self.module_info.is_module(build_target):
+                logging.debug('%s is not a valid build target.', build_target)
+                return False
+        return True
+
+    def _is_test_filter_valid(self, t_info):
+        """Check if test filter is valid.
+
+        Args:
+            t_info: TestInfo that has been filled out by a find method.
+
+        Returns:
+            True if test filter is valid. Otherwise, False.
+        """
+        test_filters = t_info.data.get(constants.TI_FILTER, [])
+        if not test_filters:
+            return True
+        for test_filter in test_filters:
+            # Check if the class filter is under current module.
+            # TODO: (b/172260100) The test_name may not necessarily be equal
+            #  to the module_name.
+            if self._is_java_filter_in_module(t_info.test_name,
+                                              test_filter.class_name):
+                return True
+            # TODO: (b/172260100) Also check for CC.
+        logging.debug('Not a valid test filter.')
+        return False
+
+    def _is_java_filter_in_module(self, module_name, filter_class):
+        """Check if input class is part of input module.
+
+        Args:
+            module_name: A string of the module name of the test.
+            filter_class: A string of the class name field of TI_FILTER.
+
+        Returns:
+            True if input filter_class is in the input module. Otherwise, False.
+        """
+        mod_info = self.module_info.get_module_info(module_name)
+        if not mod_info:
+            return False
+        module_srcs = mod_info.get(constants.MODULE_SRCS, [])
+        # If the module has no src information, treat the cached filter as
+        # still valid. Remove this once all Java srcs can be found in
+        # module-info.
+        if not module_srcs:
+            return True
+        ref_end = filter_class.rsplit('.', 1)[-1]
+        if '.' in filter_class:
+            file_path = str(filter_class).replace('.', '/')
+            # A Java class file always starts with a capital letter.
+            if ref_end[0].isupper():
+                file_path = file_path + '.'
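+                # e.g. 'a.b.c.MyTestClass1' becomes 'a/b/c/MyTestClass1.' so it
+                # matches 'src/a/b/c/MyTestClass1.java' but not 'MyTestClass10'.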
+            for src_path in module_srcs:
+                # For a Java class, check if the class file is in the module's srcs.
+                if src_path.find(file_path) >= 0:
+                    return True
+        return False
diff --git a/atest/test_finders/cache_finder_unittest.py b/atest/test_finders/cache_finder_unittest.py
index fcb3e54..2e09560 100755
--- a/atest/test_finders/cache_finder_unittest.py
+++ b/atest/test_finders/cache_finder_unittest.py
@@ -24,9 +24,12 @@
 from unittest import mock
 
 import atest_utils
+import constants
+import module_info
 import unittest_constants as uc
 
 from test_finders import cache_finder
+from test_finders import test_info
 
 
 #pylint: disable=protected-access
@@ -35,9 +38,15 @@
     def setUp(self):
         """Set up stuff for testing."""
         self.cache_finder = cache_finder.CacheFinder()
+        self.cache_finder.module_info = mock.Mock(spec=module_info.ModuleInfo)
 
+    @mock.patch.object(cache_finder.CacheFinder, '_is_test_filter_valid',
+                       return_value=True)
+    @mock.patch.object(cache_finder.CacheFinder, '_is_test_build_target_valid',
+                       return_value=True)
     @mock.patch.object(atest_utils, 'get_test_info_cache_path')
-    def test_find_test_by_cache(self, mock_get_cache_path):
+    def test_find_test_by_cache(self, mock_get_cache_path,
+            _mock_build_target_valid, _mock_filter_valid):
         """Test find_test_by_cache method."""
         uncached_test = 'mytest1'
         cached_test = 'hello_world_test'
@@ -51,6 +60,8 @@
         self.assertIsNone(self.cache_finder.find_test_by_cache(uncached_test))
         # Hit matched cache file and original_finder is in it,
         # should return cached test infos.
+        self.cache_finder.module_info.get_paths.return_value = [
+            'platform_testing/tests/example/native']
         mock_get_cache_path.return_value = os.path.join(
             test_cache_root,
             '78ea54ef315f5613f7c11dd1a87f10c7.cache')
@@ -61,5 +72,98 @@
             '39488b7ac83c56d5a7d285519fe3e3fd.cache')
         self.assertIsNone(self.cache_finder.find_test_by_cache(uncached_test2))
 
+    @mock.patch.object(cache_finder.CacheFinder, '_is_test_build_target_valid',
+                       return_value=True)
+    @mock.patch.object(atest_utils, 'get_test_info_cache_path')
+    def test_find_test_by_cache_wo_valid_path(self, mock_get_cache_path,
+            _mock_build_target_valid):
+        """Test find_test_by_cache method."""
+        cached_test = 'hello_world_test'
+        test_cache_root = os.path.join(uc.TEST_DATA_DIR, 'cache_root')
+        # Return None when the actual test_path is not identical to that in the
+        # existing cache.
+        self.cache_finder.module_info.get_paths.return_value = [
+            'not/matched/test/path']
+        mock_get_cache_path.return_value = os.path.join(
+            test_cache_root,
+            '78ea54ef315f5613f7c11dd1a87f10c7.cache')
+        self.assertIsNone(self.cache_finder.find_test_by_cache(cached_test))
+
+    @mock.patch.object(cache_finder.CacheFinder, '_is_test_build_target_valid',
+                       return_value=False)
+    @mock.patch.object(cache_finder.CacheFinder, '_is_test_path_valid',
+                       return_value=True)
+    @mock.patch.object(atest_utils, 'get_test_info_cache_path')
+    def test_find_test_by_cache_wo_valid_build_target(self, mock_get_cache_path,
+            _mock_path_valid, _mock_build_target_valid):
+        """Test find_test_by_cache method."""
+        cached_test = 'hello_world_test'
+        test_cache_root = os.path.join(uc.TEST_DATA_DIR, 'cache_root')
+        # Return None when the build target does not exist in module-info.
+        mock_get_cache_path.return_value = os.path.join(
+            test_cache_root,
+            '78ea54ef315f5613f7c11dd1a87f10c7.cache')
+        self.assertIsNone(self.cache_finder.find_test_by_cache(cached_test))
+
+    @mock.patch.object(cache_finder.CacheFinder, '_is_test_filter_valid',
+                       return_value=False)
+    @mock.patch.object(cache_finder.CacheFinder, '_is_test_build_target_valid',
+                       return_value=True)
+    @mock.patch.object(cache_finder.CacheFinder, '_is_test_path_valid',
+                       return_value=True)
+    @mock.patch.object(atest_utils, 'get_test_info_cache_path')
+    def test_find_test_by_cache_wo_valid_java_filter(self, mock_get_cache_path,
+        _mock_path_valid, _mock_build_target_valid, _mock_filter_valid):
+        """Test _is_test_filter_valid method."""
+        cached_test = 'hello_world_test'
+        test_cache_root = os.path.join(uc.TEST_DATA_DIR, 'cache_root')
+        # Return None if the cached test filter is not valid.
+        mock_get_cache_path.return_value = os.path.join(
+            test_cache_root,
+            '78ea54ef315f5613f7c11dd1a87f10c7.cache')
+        self.assertIsNone(self.cache_finder.find_test_by_cache(cached_test))
+
+    def test_is_java_filter_in_module_for_java_class(self):
+        """Test _is_java_filter_in_module method if input is java class."""
+        mock_mod = {constants.MODULE_SRCS:
+                             ['src/a/b/c/MyTestClass1.java']}
+        self.cache_finder.module_info.get_module_info.return_value = mock_mod
+        # Should not match if class name does not exist.
+        self.assertFalse(
+            self.cache_finder._is_java_filter_in_module(
+                'MyModule', 'a.b.c.MyTestClass'))
+        # Should match if class name exist.
+        self.assertTrue(
+            self.cache_finder._is_java_filter_in_module(
+                'MyModule', 'a.b.c.MyTestClass1'))
+
+    def test_is_java_filter_in_module_for_java_package(self):
+        """Test _is_java_filter_in_module method if input is java package."""
+        mock_mod = {constants.MODULE_SRCS:
+                        ['src/a/b/c/MyTestClass1.java']}
+        self.cache_finder.module_info.get_module_info.return_value = mock_mod
+        # Should not match if package name does not match the src.
+        self.assertFalse(
+            self.cache_finder._is_java_filter_in_module(
+                'MyModule', 'a.b.c.d'))
+        # Should match if package name matches the src.
+        self.assertTrue(
+            self.cache_finder._is_java_filter_in_module(
+                'MyModule', 'a.b.c'))
+
+    def test_is_test_build_target_valid_module_in(self):
+        """Test _is_test_build_target_valid method if target has MODULES-IN."""
+        t_info = test_info.TestInfo('mock_name', 'mock_runner',
+                                    {'MODULES-IN-my-test-dir'})
+        self.cache_finder.module_info.is_module.return_value = False
+        self.assertTrue(self.cache_finder._is_test_build_target_valid(t_info))
+
+    def test_is_test_build_target_valid(self):
+        """Test _is_test_build_target_valid method."""
+        t_info = test_info.TestInfo('mock_name', 'mock_runner',
+                                    {'my-test-target'})
+        self.cache_finder.module_info.is_module.return_value = False
+        self.assertFalse(self.cache_finder._is_test_build_target_valid(t_info))
+
 if __name__ == '__main__':
     unittest.main()
diff --git a/atest/test_finders/module_finder.py b/atest/test_finders/module_finder.py
index e9d311c..4b62566 100644
--- a/atest/test_finders/module_finder.py
+++ b/atest/test_finders/module_finder.py
@@ -21,6 +21,7 @@
 import logging
 import os
 
+import atest_configs
 import atest_error
 import atest_utils
 import constants
@@ -32,7 +33,6 @@
 from test_runners import robolectric_test_runner
 from test_runners import vts_tf_test_runner
 
-_MODULES_IN = 'MODULES-IN-%s'
 _ANDROID_MK = 'Android.mk'
 
 # These are suites in LOCAL_COMPATIBILITY_SUITE that aren't really suites so
@@ -47,11 +47,11 @@
     _VTS_TEST_RUNNER = vts_tf_test_runner.VtsTradefedTestRunner.NAME
 
     def __init__(self, module_info=None):
-        super(ModuleFinder, self).__init__()
+        super().__init__()
         self.root_dir = os.environ.get(constants.ANDROID_BUILD_TOP)
         self.module_info = module_info
 
-    def _determine_testable_module(self, path):
+    def _determine_testable_module(self, path, file_path=None):
         """Determine which module the user is trying to test.
 
         Returns the module to test. If there are multiple possibilities, will
@@ -59,11 +59,14 @@
 
         Args:
             path: String path of module to look for.
+            file_path: String path of input file.
 
         Returns:
             A list of the module names.
         """
         testable_modules = []
+        # A list of testable modules whose srcs information is empty.
+        testable_modules_no_srcs = []
         for mod in self.module_info.get_module_names(path):
             mod_info = self.module_info.get_module_info(mod)
             # Robolectric tests always exist in pairs of 2, one module to build
@@ -73,7 +76,22 @@
                 # return a list with one module name if it is robolectric.
                 return [mod]
             if self.module_info.is_testable_module(mod_info):
+                # If the test module defines srcs, the input file_path should
+                # be listed in the module's srcs.
+                module_srcs = mod_info.get(constants.MODULE_SRCS, [])
+                if file_path and os.path.relpath(
+                    file_path, self.root_dir) not in module_srcs:
+                    logging.debug('Skip module: %s for %s', mod, file_path)
+                    # Collect modules that have no srcs information in
+                    # module-info; fall back to this list if no other matched
+                    # module has src information.
+                    if not module_srcs:
+                        testable_modules_no_srcs.append(
+                            mod_info.get(constants.MODULE_NAME))
+                    continue
                 testable_modules.append(mod_info.get(constants.MODULE_NAME))
+        if not testable_modules:
+            testable_modules.extend(testable_modules_no_srcs)
         return test_finder_utils.extract_test_from_tests(testable_modules)
 
     def _is_vts_module(self, module_name):
@@ -165,6 +183,25 @@
             return self._update_to_robolectric_test_info(test)
         rel_config = test.data[constants.TI_REL_CONFIG]
         test.build_targets = self._get_build_targets(module_name, rel_config)
+        # A device-side Java test uses
+        # com.android.compatibility.testtype.DalvikTest (from
+        # cts-dalvik-device-test-runner.jar) as its test runner.
+        if self.module_info.is_auto_gen_test_config(module_name):
+            if constants.MODULE_CLASS_JAVA_LIBRARIES in test.module_class:
+                test.build_targets.update(test_finder_utils.DALVIK_TEST_DEPS)
+        # Update the test name if the test belongs to an extra config, which
+        # means its test config name differs from the module name. For an
+        # extra config, its index will be greater than or equal to 1.
+        try:
+            if (mod_info.get(constants.MODULE_TEST_CONFIG, []).index(rel_config)
+                    > 0):
+                config_test_name = os.path.splitext(os.path.basename(
+                    rel_config))[0]
+                logging.debug('Replace test_info.name(%s) to %s',
+                              test.test_name, config_test_name)
+                test.test_name = config_test_name
+        except ValueError:
+            pass
         return test
 
     def _get_build_targets(self, module_name, rel_config):
@@ -185,14 +222,21 @@
         if constants.VTS_CORE_SUITE in self.module_info.get_module_info(
                 module_name).get(constants.MODULE_COMPATIBILITY_SUITES, []):
             targets.add(constants.VTS_CORE_TF_MODULE)
+        for suite in self.module_info.get_module_info(
+            module_name).get(constants.MODULE_COMPATIBILITY_SUITES, []):
+            targets.update(constants.SUITE_DEPS.get(suite, []))
         for module_path in self.module_info.get_paths(module_name):
             mod_dir = module_path.replace('/', '-')
-            targets.add(_MODULES_IN % mod_dir)
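+            # e.g. a module path 'my/test/dir' becomes the build target
+            # 'MODULES-IN-my-test-dir'.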
+            targets.add(constants.MODULES_IN + mod_dir)
         # (b/156457698) Force add vts_kernel_tests as build target if our test
         # belong to REQUIRED_KERNEL_TEST_MODULES due to required_module option
         # not working for sh_test in soong.
         if module_name in constants.REQUIRED_KERNEL_TEST_MODULES:
             targets.add('vts_kernel_tests')
+        # (b/184567849) Force adding module_name as a build_target. This
+        # allows excluding MODULES-IN-* and prevents missing build targets.
+        if module_name and self.module_info.is_module(module_name):
+            targets.add(module_name)
         return targets
 
     def _get_module_test_config(self, module_name, rel_config=None):
@@ -208,18 +252,31 @@
             rel_config: XML for the given test.
 
         Returns:
-            A string of test_config path if found, else return rel_config.
+            A list of test_config paths if found, else a list containing rel_config.
         """
+        default_all_config = not (atest_configs.GLOBAL_ARGS and
+                                  atest_configs.GLOBAL_ARGS.test_config_select)
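+        # default_all_config is True when test_config_select is not set in the
+        # global args, meaning all matched test configs are kept by default.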
         mod_info = self.module_info.get_module_info(module_name)
         if mod_info:
-            test_config = ''
+            test_configs = []
             test_config_list = mod_info.get(constants.MODULE_TEST_CONFIG, [])
             if test_config_list:
-                test_config = test_config_list[0]
-            if not self.module_info.is_auto_gen_test_config(module_name) and test_config != '':
-                return test_config
-        return rel_config
+                # multiple test configs
+                if len(test_config_list) > 1:
+                    test_configs = test_finder_utils.extract_test_from_tests(
+                        test_config_list, default_all=default_all_config)
+                else:
+                    test_configs = test_config_list
+            if test_configs:
+                return test_configs
+            # Double check whether the section below is still needed.
+            if (not self.module_info.is_auto_gen_test_config(module_name)
+                    and len(test_configs) > 0):
+                return test_configs
+        return [rel_config] if rel_config else []
 
+    # pylint: disable=too-many-branches
+    # pylint: disable=too-many-locals
     def _get_test_info_filter(self, path, methods, **kwargs):
         """Get test info filter.
 
@@ -245,17 +302,41 @@
         elif file_name and constants.JAVA_EXT_RE.match(file_name):
             full_class_name = test_finder_utils.get_fully_qualified_class_name(
                 path)
+            # If the input class is a parameterized Java class, add * to the
+            # end of the method filter string so that the generated method
+            # names can still be matched.
+            if test_finder_utils.is_parameterized_java_class(path):
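+                # e.g. method 'testFoo' becomes 'testFoo*' so generated names
+                # such as 'testFoo[0]' still match.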
+                update_methods = []
+                for method in methods:
+                    update_methods.append(method + '*')
+                methods = frozenset(update_methods)
             ti_filter = frozenset(
                 [test_info.TestFilter(full_class_name, methods)])
         # Path to cc file.
         elif file_name and constants.CC_EXT_RE.match(file_name):
+            # TODO (b/173019813) Should set up the correct filter for an input file.
             if not test_finder_utils.has_cc_class(path):
                 raise atest_error.MissingCCTestCaseError(
                     "Can't find CC class in %s" % path)
-            if methods:
-                ti_filter = frozenset(
-                    [test_info.TestFilter(test_finder_utils.get_cc_filter(
-                        kwargs.get('class_name', '*'), methods), frozenset())])
+            # Extract class_name, method_name and parameterized_class from
+            # the given cc path.
+            file_classes, _, file_para_classes = (
+                test_finder_utils.get_cc_test_classes_methods(path))
+            cc_filters = []
+            # When instantiated (parameterized) tests are found, recompose the
+            # class name as $(InstantiationName)/$(ClassName).
+            for file_class in file_classes:
+                if file_class in file_para_classes:
+                    file_class = '*/%s' % file_class
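+                    # e.g. a TEST_P class 'MyClass' becomes '*/MyClass' so any
+                    # instantiation like 'MyInstantiation/MyClass' matches.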
+                cc_filters.append(
+                    test_info.TestFilter(
+                        test_finder_utils.get_cc_filter(file_class, methods),
+                        frozenset()))
+            ti_filter = frozenset(cc_filters)
+        # If the input path is a folder and class_name information is given.
+        elif (not file_name and kwargs.get('class_name', None)):
+            ti_filter = frozenset(
+                [test_info.TestFilter(kwargs.get('class_name', None), methods)])
         # Path to non-module dir, treat as package.
         elif (not file_name
               and kwargs.get('rel_module_dir', None) !=
@@ -273,6 +354,7 @@
                         ti_filter = frozenset(
                             [test_info.TestFilter(package_name, methods)])
                         break
+        logging.debug('_get_test_info_filter() ti_filter: %s', ti_filter)
         return ti_filter
 
     def _get_rel_config(self, test_path):
@@ -311,24 +393,26 @@
             module_names = [module_name]
         else:
             module_names = self._determine_testable_module(
-                os.path.dirname(rel_config))
+                os.path.dirname(rel_config),
+                test_path if self._is_comparted_src(test_path) else None)
         test_infos = []
         if module_names:
             for mname in module_names:
                 # The real test config might be record in module-info.
-                rel_config = self._get_module_test_config(mname,
-                                                          rel_config=rel_config)
-                mod_info = self.module_info.get_module_info(mname)
-                tinfo = self._process_test_info(test_info.TestInfo(
-                    test_name=mname,
-                    test_runner=self._TEST_RUNNER,
-                    build_targets=set(),
-                    data={constants.TI_FILTER: test_filter,
-                          constants.TI_REL_CONFIG: rel_config},
-                    compatibility_suites=mod_info.get(
-                        constants.MODULE_COMPATIBILITY_SUITES, [])))
-                if tinfo:
-                    test_infos.append(tinfo)
+                rel_configs = self._get_module_test_config(
+                    mname, rel_config=rel_config)
+                for rel_cfg in rel_configs:
+                    mod_info = self.module_info.get_module_info(mname)
+                    tinfo = self._process_test_info(test_info.TestInfo(
+                        test_name=mname,
+                        test_runner=self._TEST_RUNNER,
+                        build_targets=set(),
+                        data={constants.TI_FILTER: test_filter,
+                              constants.TI_REL_CONFIG: rel_cfg},
+                        compatibility_suites=mod_info.get(
+                            constants.MODULE_COMPATIBILITY_SUITES, [])))
+                    if tinfo:
+                        test_infos.append(tinfo)
         return test_infos
 
     def find_test_by_module_name(self, module_name):
@@ -341,23 +425,27 @@
             A list that includes only 1 populated TestInfo namedtuple
             if found, otherwise None.
         """
+        tinfos = []
         mod_info = self.module_info.get_module_info(module_name)
         if self.module_info.is_testable_module(mod_info):
             # path is a list with only 1 element.
             rel_config = os.path.join(mod_info['path'][0],
                                       constants.MODULE_CONFIG)
-            rel_config = self._get_module_test_config(module_name,
-                                                      rel_config=rel_config)
-            tinfo = self._process_test_info(test_info.TestInfo(
-                test_name=module_name,
-                test_runner=self._TEST_RUNNER,
-                build_targets=set(),
-                data={constants.TI_REL_CONFIG: rel_config,
-                      constants.TI_FILTER: frozenset()},
-                compatibility_suites=mod_info.get(
-                    constants.MODULE_COMPATIBILITY_SUITES, [])))
-            if tinfo:
-                return [tinfo]
+            rel_configs = self._get_module_test_config(module_name,
+                                                       rel_config=rel_config)
+            for rel_config in rel_configs:
+                tinfo = self._process_test_info(test_info.TestInfo(
+                    test_name=module_name,
+                    test_runner=self._TEST_RUNNER,
+                    build_targets=set(),
+                    data={constants.TI_REL_CONFIG: rel_config,
+                          constants.TI_FILTER: frozenset()},
+                    compatibility_suites=mod_info.get(
+                        constants.MODULE_COMPATIBILITY_SUITES, [])))
+                if tinfo:
+                    tinfos.append(tinfo)
+            if tinfos:
+                return tinfos
         return None
 
     def find_test_by_kernel_class_name(self, module_name, class_name):
@@ -370,23 +458,30 @@
         Returns:
             A list of populated TestInfo namedtuple if test found, else None.
         """
+
         class_name, methods = test_finder_utils.split_methods(class_name)
-        test_config = self._get_module_test_config(module_name)
-        test_config_path = os.path.join(self.root_dir, test_config)
-        mod_info = self.module_info.get_module_info(module_name)
-        ti_filter = frozenset(
-            [test_info.TestFilter(class_name, methods)])
-        if test_finder_utils.is_test_from_kernel_xml(test_config_path, class_name):
-            tinfo = self._process_test_info(test_info.TestInfo(
-                test_name=module_name,
-                test_runner=self._TEST_RUNNER,
-                build_targets=set(),
-                data={constants.TI_REL_CONFIG: test_config,
-                      constants.TI_FILTER: ti_filter},
-                compatibility_suites=mod_info.get(
-                    constants.MODULE_COMPATIBILITY_SUITES, [])))
-            if tinfo:
-                return [tinfo]
+        test_configs = self._get_module_test_config(module_name)
+        if not test_configs:
+            return None
+        tinfos = []
+        for test_config in test_configs:
+            test_config_path = os.path.join(self.root_dir, test_config)
+            mod_info = self.module_info.get_module_info(module_name)
+            ti_filter = frozenset(
+                [test_info.TestFilter(class_name, methods)])
+            if test_finder_utils.is_test_from_kernel_xml(test_config_path, class_name):
+                tinfo = self._process_test_info(test_info.TestInfo(
+                    test_name=module_name,
+                    test_runner=self._TEST_RUNNER,
+                    build_targets=set(),
+                    data={constants.TI_REL_CONFIG: test_config,
+                          constants.TI_FILTER: ti_filter},
+                    compatibility_suites=mod_info.get(
+                        constants.MODULE_COMPATIBILITY_SUITES, [])))
+                if tinfo:
+                    tinfos.append(tinfo)
+        if tinfos:
+            return tinfos
         return None
 
     def find_test_by_class_name(self, class_name, module_name=None,
@@ -407,32 +502,118 @@
             A list of populated TestInfo namedtuple if test found, else None.
         """
         class_name, methods = test_finder_utils.split_methods(class_name)
+        search_class_name = class_name
+        # For a parameterized gtest, the test class is instantiated as
+        # $(class_prefix)/$(base_class). Use $(base_class) when searching for
+        # matching TEST_P definitions to make sure the test class is found.
+        if '/' in search_class_name:
+            search_class_name = str(search_class_name).split('/')[-1]
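+            # e.g. 'Instantiation/MyGtestClass' is searched as 'MyGtestClass'.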
         if rel_config:
             search_dir = os.path.join(self.root_dir,
                                       os.path.dirname(rel_config))
         else:
             search_dir = self.root_dir
-        test_paths = test_finder_utils.find_class_file(search_dir, class_name,
+        test_paths = test_finder_utils.find_class_file(search_dir, search_class_name,
                                                        is_native_test, methods)
         if not test_paths and rel_config:
             logging.info('Did not find class (%s) under module path (%s), '
                          'researching from repo root.', class_name, rel_config)
             test_paths = test_finder_utils.find_class_file(self.root_dir,
-                                                           class_name,
+                                                           search_class_name,
                                                            is_native_test,
                                                            methods)
+        test_paths = test_paths if test_paths is not None else []
+        # If a module name is known, use its path in module-info as the test_path.
         if not test_paths:
-            return None
+            if not module_name:
+                return None
+            # Use the module path as test_path.
+            module_paths = self.module_info.get_paths(module_name)
+            test_paths = []
+            for rel_module_path in module_paths:
+                test_paths.append(os.path.join(self.root_dir, rel_module_path))
         tinfos = []
         for test_path in test_paths:
             test_filter = self._get_test_info_filter(
                 test_path, methods, class_name=class_name,
                 is_native_test=is_native_test)
-            tinfo = self._get_test_infos(test_path, rel_config,
-                                         module_name, test_filter)
-            if tinfo:
-                tinfos.extend(tinfo)
-        return tinfos
+            test_infos = self._get_test_infos(
+                test_path, rel_config, module_name, test_filter)
+            # If the input includes methods, check whether each tinfo matches.
+            if test_infos and len(test_infos) > 1 and methods:
+                test_infos = self._get_matched_test_infos(test_infos, methods)
+            if test_infos:
+                tinfos.extend(test_infos)
+        return tinfos if tinfos else None
+
+    def _get_matched_test_infos(self, test_infos, methods):
+        """Get the test_infos matched the given methods.
+
+        Args:
+            test_infos: A list of TestInfo obj.
+            methods: A set of method name strings.
+
+        Returns:
+            A list of matched TestInfo namedtuple, else None.
+        """
+        matched_test_infos = set()
+        for tinfo in test_infos:
+            test_config, test_srcs = test_finder_utils.get_test_config_and_srcs(
+                tinfo, self.module_info)
+            if test_config:
+                filter_dict = atest_utils.get_android_junit_config_filters(
+                    test_config)
+                # Always treat the test_info is matched if no filters found.
+                if not filter_dict.keys():
+                    matched_test_infos.add(tinfo)
+                    continue
+                for method in methods:
+                    if self._is_srcs_match_method_annotation(method, test_srcs,
+                                                             filter_dict):
+                        logging.debug('For method:%s Test:%s matched '
+                                      'filter_dict: %s', method,
+                                      tinfo.test_name, filter_dict)
+                        matched_test_infos.add(tinfo)
+        return list(matched_test_infos)
+
+    def _is_srcs_match_method_annotation(self, method, srcs, annotation_dict):
+        """Check if input srcs matched annotation.
+
+        Args:
+            method: A string of test method name.
+            srcs: A list of source file of test.
+            annotation_dict: A dictionary record the include and exclude
+                             annotations.
+
+        Returns:
+            True if the input method matches the annotations of the input
+            srcs, else False.
+        """
+        include_annotations = annotation_dict.get(
+            constants.INCLUDE_ANNOTATION, [])
+        exclude_annotations = annotation_dict.get(
+            constants.EXCLUDE_ANNOTATION, [])
+        for src in srcs:
+            include_methods = set()
+            src_path = os.path.join(self.root_dir, src)
+            # Add methods matching include_annotations.
+            for annotation in include_annotations:
+                include_methods.update(
+                    test_finder_utils.get_annotated_methods(
+                        annotation, src_path))
+            if exclude_annotations:
+                # For exclude annotations, get all the methods in the input
+                # srcs and filter out those matching the annotations.
+                exclude_methods = set()
+                all_methods = test_finder_utils.get_java_methods(src_path)
+                for annotation in exclude_annotations:
+                    exclude_methods.update(
+                        test_finder_utils.get_annotated_methods(
+                            annotation, src_path))
+                include_methods = all_methods - exclude_methods
+            if method in include_methods:
+                return True
+        return False
 
     def find_test_by_module_and_class(self, module_class):
         """Find the test info given a MODULE:CLASS string.
@@ -496,9 +677,14 @@
             search_dir = self.root_dir
         package_paths = test_finder_utils.run_find_cmd(
             test_finder_utils.FIND_REFERENCE_TYPE.PACKAGE, search_dir, package)
+        package_paths = package_paths if package_paths is not None else []
         # Package path will be the full path to the dir represented by package.
         if not package_paths:
-            return None
+            if not module_name:
+                return None
+            module_paths = self.module_info.get_paths(module_name)
+            for rel_module_path in module_paths:
+                package_paths.append(os.path.join(self.root_dir, rel_module_path))
         test_filter = frozenset([test_info.TestFilter(package, frozenset())])
         test_infos = []
         for package_path in package_paths:
@@ -506,7 +692,7 @@
                                          module_name, test_filter)
             if tinfo:
                 test_infos.extend(tinfo)
-        return test_infos
+        return test_infos if test_infos else None
 
     def find_test_by_module_and_package(self, module_package):
         """Find the test info given a MODULE:PACKAGE string.
@@ -527,7 +713,7 @@
             package, module_info.test_name,
             module_info.data.get(constants.TI_REL_CONFIG))
 
-    def find_test_by_path(self, path):
+    def find_test_by_path(self, rel_path):
         """Find the first test info matching the given path.
 
         Strategy:
@@ -539,13 +725,13 @@
             path_to_any_other_dir --> Resolve as MODULE
 
         Args:
-            path: A string of the test's path.
+            rel_path: A string of the relative path to $BUILD_TOP.
 
         Returns:
             A list of populated TestInfo namedtuple if test found, else None
         """
-        logging.debug('Finding test by path: %s', path)
-        path, methods = test_finder_utils.split_methods(path)
+        logging.debug('Finding test by path: %s', rel_path)
+        path, methods = test_finder_utils.split_methods(rel_path)
         # TODO: See if this can be generalized and shared with methods above
         # create absolute path from cwd and remove symbolic links
         path = os.path.realpath(path)
@@ -559,6 +745,20 @@
         rel_module_dir = test_finder_utils.find_parent_module_dir(
             self.root_dir, dir_path, self.module_info)
         if not rel_module_dir:
+            # Try to find unit-test for input path.
+            path = os.path.relpath(
+                os.path.realpath(rel_path),
+                os.environ.get(constants.ANDROID_BUILD_TOP, ''))
+            unit_tests = test_finder_utils.find_host_unit_tests(
+                self.module_info, path)
+            if unit_tests:
+                tinfos = []
+                for unit_test in unit_tests:
+                    tinfo = self._get_test_infos(path, constants.MODULE_CONFIG,
+                                                 unit_test, frozenset())
+                    if tinfo:
+                        tinfos.extend(tinfo)
+                return tinfos
             return None
         rel_config = os.path.join(rel_module_dir, constants.MODULE_CONFIG)
         test_filter = self._get_test_info_filter(path, methods,
@@ -655,3 +855,52 @@
             if _distance <= abs(constants.LD_RANGE):
                 guessed_modules.append(_module)
         return guessed_modules
+
+    def find_test_by_config_name(self, config_name):
+        """Find test for the given config name.
+
+        Args:
+            config_name: A string of the test's config name.
+
+        Returns:
+            A list that includes only 1 populated TestInfo namedtuple
+            if found, otherwise None.
+        """
+        for module_name, mod_info in self.module_info.name_to_module_info.items():
+            test_configs = mod_info.get(constants.MODULE_TEST_CONFIG, [])
+            for test_config in test_configs:
+                test_config_name = os.path.splitext(
+                    os.path.basename(test_config))[0]
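+                # e.g. 'path/to/Foo.xml' yields the config name 'Foo'.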
+                if test_config_name == config_name:
+                    tinfo = test_info.TestInfo(
+                        test_name=test_config_name,
+                        test_runner=self._TEST_RUNNER,
+                        build_targets=self._get_build_targets(module_name,
+                                                              test_config),
+                        data={constants.TI_REL_CONFIG: test_config,
+                              constants.TI_FILTER: frozenset()},
+                        compatibility_suites=mod_info.get(
+                            constants.MODULE_COMPATIBILITY_SUITES, []))
+                    if tinfo:
+                        # There should be only one test_config with the same
+                        # name in the source tree.
+                        return [tinfo]
+        return None
+
+    def _is_comparted_src(self, path):
+        """Check if the input path need to match srcs information in module.
+
+        If path is a folder or android build file, we don't need to compart
+        with module's srcs.
+
+        Args:
+            path: A string of the test's path.
+
+        Returns:
+            True if input path need to match with module's src info, else False.
+        """
+        if os.path.isdir(path):
+            return False
+        if atest_utils.is_build_file(path):
+            return False
+        return True
diff --git a/atest/test_finders/module_finder_unittest.py b/atest/test_finders/module_finder_unittest.py
index 805a857..94d8a6e 100755
--- a/atest/test_finders/module_finder_unittest.py
+++ b/atest/test_finders/module_finder_unittest.py
@@ -17,7 +17,10 @@
 """Unittests for module_finder."""
 
 # pylint: disable=line-too-long
+# pylint: disable=too-many-lines
+# pylint: disable=unsubscriptable-object
 
+import copy
 import re
 import unittest
 import os
@@ -25,6 +28,8 @@
 from unittest import mock
 
 import atest_error
+import atest_configs
+import atest_utils
 import constants
 import module_info
 import unittest_constants as uc
@@ -39,7 +44,7 @@
 MODULE_PACKAGE = '%s:%s' % (uc.MODULE_NAME, uc.PACKAGE)
 CC_MODULE_CLASS = '%s:%s' % (uc.CC_MODULE_NAME, uc.CC_CLASS_NAME)
 KERNEL_TEST_CLASS = 'test_class_1'
-KERNEL_TEST_CONFIG = 'KernelTest.xml'
+KERNEL_TEST_CONFIG = 'KernelTest.xml.data'
 KERNEL_MODULE_CLASS = '%s:%s' % (constants.REQUIRED_KERNEL_TEST_MODULES[0],
                                  KERNEL_TEST_CLASS)
 KERNEL_CONFIG_FILE = os.path.join(uc.TEST_DATA_DIR, KERNEL_TEST_CONFIG)
@@ -116,7 +121,7 @@
 
     # pylint: disable=unused-argument
     @mock.patch.object(module_finder.ModuleFinder, '_get_build_targets',
-                       return_value=uc.MODULE_BUILD_TARGETS)
+                       return_value=copy.deepcopy(uc.MODULE_BUILD_TARGETS))
     def test_find_test_by_module_name(self, _get_targ):
         """Test find_test_by_module_name."""
         self.mod_finder.module_info.is_robolectric_test.return_value = False
@@ -135,6 +140,58 @@
         self.mod_finder.module_info.is_testable_module.return_value = False
         self.assertIsNone(self.mod_finder.find_test_by_module_name('Not_Module'))
 
+    @mock.patch('builtins.input', return_value='1')
+    @mock.patch.object(module_finder.ModuleFinder, '_get_build_targets',
+                       return_value=copy.deepcopy(uc.MODULE_BUILD_TARGETS))
+    def test_find_test_by_module_name_w_multiple_config(
+            self, _get_targ, _mock_input):
+        """Test find_test_by_module_name."""
+        atest_configs.GLOBAL_ARGS = mock.Mock()
+        atest_configs.GLOBAL_ARGS.test_config_select = True
+        self.mod_finder.module_info.is_robolectric_test.return_value = False
+        self.mod_finder.module_info.has_test_config.return_value = True
+        mod_info = {'installed': ['/path/to/install'],
+                    'path': [uc.MODULE_DIR],
+                    constants.MODULE_CLASS: [],
+                    constants.MODULE_COMPATIBILITY_SUITES: [],
+                    constants.MODULE_TEST_CONFIG: [
+                        uc.CONFIG_FILE,
+                        uc.EXTRA_CONFIG_FILE]}
+        self.mod_finder.module_info.get_module_info.return_value = mod_info
+        t_infos = self.mod_finder.find_test_by_module_name(uc.MODULE_NAME)
+        # Only select one test
+        self.assertEqual(len(t_infos), 1)
+        # The t_info should be the EXTRA_CONFIG_FILE one.
+        unittest_utils.assert_equal_testinfos(
+            self, t_infos[0], uc.MODULE_INFO_W_CONFIG)
+
+    @mock.patch.object(module_finder.ModuleFinder, '_get_build_targets',
+                       return_value=copy.deepcopy(uc.MODULE_BUILD_TARGETS))
+    def test_find_test_by_module_name_w_multiple_config_all(
+            self, _get_targ):
+        """Test find_test_by_module_name."""
+        atest_configs.GLOBAL_ARGS = mock.Mock()
+        atest_configs.GLOBAL_ARGS.test_config_select = False
+        self.mod_finder.module_info.is_robolectric_test.return_value = False
+        self.mod_finder.module_info.has_test_config.return_value = True
+        mod_info = {'installed': ['/path/to/install'],
+                    'path': [uc.MODULE_DIR],
+                    constants.MODULE_CLASS: [],
+                    constants.MODULE_COMPATIBILITY_SUITES: [],
+                    constants.MODULE_TEST_CONFIG: [
+                        uc.CONFIG_FILE,
+                        uc.EXTRA_CONFIG_FILE]}
+        self.mod_finder.module_info.get_module_info.return_value = mod_info
+        t_infos = self.mod_finder.find_test_by_module_name(uc.MODULE_NAME)
+        unittest_utils.assert_equal_testinfos(self, t_infos[0], uc.MODULE_INFO)
+        unittest_utils.assert_equal_testinfos(
+            self, t_infos[1], uc.MODULE_INFO_W_CONFIG)
+
+    @mock.patch.object(test_finder_utils, 'find_host_unit_tests',
+                       return_value=[])
+    @mock.patch.object(atest_utils, 'is_build_file', return_value=True)
+    @mock.patch.object(test_finder_utils, 'is_parameterized_java_class',
+                       return_value=False)
     @mock.patch.object(test_finder_utils, 'has_method_in_file',
                        return_value=True)
     @mock.patch.object(module_finder.ModuleFinder, '_is_vts_module',
@@ -148,7 +205,9 @@
     #pylint: disable=unused-argument
     def test_find_test_by_class_name(self, _isdir, _isfile, _fqcn,
                                      mock_checkoutput, mock_build,
-                                     _vts, _has_method_in_file):
+                                     _vts, _has_method_in_file,
+                                     _is_parameterized, _is_build_file,
+                                     _mock_unit_tests):
         """Test find_test_by_class_name."""
         mock_build.return_value = uc.CLASS_BUILD_TARGETS
         self.mod_finder.module_info.is_auto_gen_test_config.return_value = False
@@ -165,12 +224,12 @@
             self, t_infos[0], uc.CLASS_INFO)
 
         # with method
-        mock_build.return_value = uc.MODULE_BUILD_TARGETS
+        mock_build.return_value = copy.deepcopy(uc.MODULE_BUILD_TARGETS)
         class_with_method = '%s#%s' % (uc.CLASS_NAME, uc.METHOD_NAME)
         t_infos = self.mod_finder.find_test_by_class_name(class_with_method)
         unittest_utils.assert_equal_testinfos(
             self, t_infos[0], uc.METHOD_INFO)
-        mock_build.return_value = uc.MODULE_BUILD_TARGETS
+        mock_build.return_value = copy.deepcopy(uc.MODULE_BUILD_TARGETS)
         class_methods = '%s,%s' % (class_with_method, uc.METHOD2_NAME)
         t_infos = self.mod_finder.find_test_by_class_name(class_methods)
         unittest_utils.assert_equal_testinfos(
@@ -194,6 +253,8 @@
             self, t_infos[0],
             CLASS_INFO_MODULE_2)
 
+    @mock.patch.object(test_finder_utils, 'is_parameterized_java_class',
+                       return_value=False)
     @mock.patch.object(test_finder_utils, 'has_method_in_file',
                        return_value=True)
     @mock.patch.object(module_finder.ModuleFinder, '_is_vts_module',
@@ -206,7 +267,8 @@
     #pylint: disable=unused-argument
     def test_find_test_by_module_and_class(self, _isfile, _fqcn,
                                            mock_checkoutput, mock_build,
-                                           _vts, _has_method_in_file):
+                                           _vts, _has_method_in_file,
+                                           _is_parameterized):
         """Test find_test_by_module_and_class."""
         # Native test was tested in test_find_test_by_cc_class_name().
         self.mod_finder.module_info.is_native_test.return_value = False
@@ -222,7 +284,7 @@
         t_infos = self.mod_finder.find_test_by_module_and_class(MODULE_CLASS)
         unittest_utils.assert_equal_testinfos(self, t_infos[0], uc.CLASS_INFO)
         # with method
-        mock_build.return_value = uc.MODULE_BUILD_TARGETS
+        mock_build.return_value = copy.deepcopy(uc.MODULE_BUILD_TARGETS)
         t_infos = self.mod_finder.find_test_by_module_and_class(MODULE_CLASS_METHOD)
         unittest_utils.assert_equal_testinfos(self, t_infos[0], uc.METHOD_INFO)
         self.mod_finder.module_info.is_testable_module.return_value = False
@@ -255,6 +317,7 @@
         self.mod_finder.module_info.is_auto_gen_test_config.return_value = False
         self.mod_finder.module_info.is_robolectric_test.return_value = False
         self.mod_finder.module_info.has_test_config.return_value = True
+        self.mod_finder.module_info.get_paths.return_value = []
         mock_build.return_value = uc.CLASS_BUILD_TARGETS
         mod_info = {constants.MODULE_INSTALLED: DEFAULT_INSTALL_PATH,
                     constants.MODULE_PATH: [uc.CC_MODULE_DIR],
@@ -264,7 +327,7 @@
         t_infos = self.mod_finder.find_test_by_module_and_class(CC_MODULE_CLASS)
         unittest_utils.assert_equal_testinfos(self, t_infos[0], uc.CC_MODULE_CLASS_INFO)
         # with method
-        mock_build.return_value = uc.MODULE_BUILD_TARGETS
+        mock_build.return_value = copy.deepcopy(uc.MODULE_BUILD_TARGETS)
         mock_fcf.side_effect = [None, None, '/']
         t_infos = self.mod_finder.find_test_by_module_and_class(CC_MODULE_CLASS_METHOD)
         unittest_utils.assert_equal_testinfos(self, t_infos[0], uc.CC_METHOD_INFO)
@@ -275,7 +338,7 @@
         self.assertIsNone(self.mod_finder.find_test_by_module_and_class(bad_module))
 
     @mock.patch.object(module_finder.ModuleFinder, '_get_module_test_config',
-                       return_value=KERNEL_CONFIG_FILE)
+                       return_value=[KERNEL_CONFIG_FILE])
     @mock.patch.object(module_finder.ModuleFinder, '_is_vts_module',
                        return_value=False)
     @mock.patch.object(module_finder.ModuleFinder, '_get_build_targets')
@@ -293,6 +356,7 @@
         self.mod_finder.module_info.is_auto_gen_test_config.return_value = False
         self.mod_finder.module_info.is_robolectric_test.return_value = False
         self.mod_finder.module_info.has_test_config.return_value = True
+        self.mod_finder.module_info.get_paths.return_value = []
         mock_build.return_value = uc.CLASS_BUILD_TARGETS
         mod_info = {constants.MODULE_INSTALLED: DEFAULT_INSTALL_PATH,
                     constants.MODULE_PATH: [uc.CC_MODULE_DIR],
@@ -302,6 +366,8 @@
         t_infos = self.mod_finder.find_test_by_module_and_class(KERNEL_MODULE_CLASS)
         unittest_utils.assert_equal_testinfos(self, t_infos[0], KERNEL_MODULE_CLASS_INFO)
 
+    @mock.patch.object(test_finder_utils, 'find_host_unit_tests',
+                       return_value=[])
     @mock.patch.object(module_finder.ModuleFinder, '_is_vts_module',
                        return_value=False)
     @mock.patch.object(module_finder.ModuleFinder, '_get_build_targets')
@@ -310,7 +376,7 @@
     @mock.patch('os.path.isdir', return_value=True)
     #pylint: disable=unused-argument
     def test_find_test_by_package_name(self, _isdir, _isfile, mock_checkoutput,
-                                       mock_build, _vts):
+                                       mock_build, _vts, _mock_unit_tests):
         """Test find_test_by_package_name."""
         self.mod_finder.module_info.is_auto_gen_test_config.return_value = False
         self.mod_finder.module_info.is_robolectric_test.return_value = False
@@ -354,6 +420,7 @@
         self.mod_finder.module_info.is_auto_gen_test_config.return_value = False
         self.mod_finder.module_info.is_robolectric_test.return_value = False
         self.mod_finder.module_info.has_test_config.return_value = True
+        self.mod_finder.module_info.get_paths.return_value = []
         mock_build.return_value = uc.CLASS_BUILD_TARGETS
         mod_info = {constants.MODULE_INSTALLED: DEFAULT_INSTALL_PATH,
                     constants.MODULE_PATH: [uc.MODULE_DIR],
@@ -383,6 +450,13 @@
         self.mod_finder.module_info.get_module_info.return_value = mod_info
         self.assertIsNone(self.mod_finder.find_test_by_module_and_package(bad_pkg))
 
+    @mock.patch.object(test_finder_utils, 'find_host_unit_tests',
+                       return_value=[])
+    @mock.patch.object(test_finder_utils, 'get_cc_test_classes_methods',
+                       return_value=(set(), set(), set()))
+    @mock.patch.object(atest_utils, 'is_build_file', return_value=True)
+    @mock.patch.object(test_finder_utils, 'is_parameterized_java_class',
+                       return_value=False)
     @mock.patch.object(test_finder_utils, 'has_method_in_file',
                        return_value=True)
     @mock.patch.object(test_finder_utils, 'has_cc_class',
@@ -398,9 +472,10 @@
     @mock.patch.object(test_finder_utils, 'find_parent_module_dir')
     @mock.patch('os.path.exists')
     #pylint: disable=unused-argument
-    def test_find_test_by_path(self, mock_pathexists, mock_dir, _isfile, _real,
-                               _fqcn, _vts, mock_build, _has_cc_class,
-                               _has_method_in_file):
+    def test_find_test_by_path(
+            self, mock_pathexists, mock_dir, _isfile, _real, _fqcn, _vts,
+            mock_build, _has_cc_class, _has_method_in_file, _is_parameterized,
+            _is_build_file, _get_cc_test_classed, _mock_unit_tests):
         """Test find_test_by_path."""
         self.mod_finder.module_info.is_robolectric_test.return_value = False
         self.mod_finder.module_info.has_test_config.return_value = True
@@ -429,6 +504,12 @@
         unittest_utils.assert_equal_testinfos(
             self, uc.CLASS_INFO, t_infos[0])
 
+        class_with_method = '%s#%s' % (class_path, uc.METHOD_NAME)
+        mock_build.return_value = copy.deepcopy(uc.MODULE_BUILD_TARGETS)
+        t_infos = self.mod_finder.find_test_by_path(class_with_method)
+        unittest_utils.assert_equal_testinfos(
+            self, t_infos[0], uc.METHOD_INFO)
+
         class_path = '%s.java' % uc.CLASS_NAME
         mock_build.return_value = uc.CLASS_BUILD_TARGETS
         t_infos = self.mod_finder.find_test_by_path(class_path)
@@ -436,13 +517,13 @@
             self, uc.CLASS_INFO, t_infos[0])
 
         class_with_method = '%s#%s' % (class_path, uc.METHOD_NAME)
-        mock_build.return_value = uc.MODULE_BUILD_TARGETS
+        mock_build.return_value = copy.deepcopy(uc.MODULE_BUILD_TARGETS)
         t_infos = self.mod_finder.find_test_by_path(class_with_method)
         unittest_utils.assert_equal_testinfos(
             self, t_infos[0], uc.METHOD_INFO)
 
         class_with_methods = '%s,%s' % (class_with_method, uc.METHOD2_NAME)
-        mock_build.return_value = uc.MODULE_BUILD_TARGETS
+        mock_build.return_value = copy.deepcopy(uc.MODULE_BUILD_TARGETS)
         t_infos = self.mod_finder.find_test_by_path(class_with_methods)
         unittest_utils.assert_equal_testinfos(
             self, t_infos[0],
@@ -463,7 +544,7 @@
             self, uc.CC_PATH_INFO2, t_infos[0])
 
     @mock.patch.object(module_finder.ModuleFinder, '_get_build_targets',
-                       return_value=uc.MODULE_BUILD_TARGETS)
+                       return_value=copy.deepcopy(uc.MODULE_BUILD_TARGETS))
     @mock.patch.object(module_finder.ModuleFinder, '_is_vts_module',
                        return_value=False)
     @mock.patch.object(test_finder_utils, 'find_parent_module_dir',
@@ -503,6 +584,9 @@
         unittest_utils.assert_equal_testinfos(
             self, uc.CC_PATH_INFO, t_infos[0])
 
+    @mock.patch.object(test_finder_utils, 'find_host_unit_tests',
+                       return_value=[])
+    @mock.patch.object(atest_utils, 'is_build_file', return_value=True)
     @mock.patch.object(test_finder_utils, 'has_method_in_file',
                        return_value=True)
     @mock.patch.object(module_finder.ModuleFinder, '_is_vts_module',
@@ -514,7 +598,8 @@
     #pylint: disable=unused-argument
     def test_find_test_by_cc_class_name(self, _isdir, _isfile,
                                         mock_checkoutput, mock_build,
-                                        _vts, _has_method):
+                                        _vts, _has_method, _is_build_file,
+                                        _mock_unit_tests):
         """Test find_test_by_cc_class_name."""
         mock_build.return_value = uc.CLASS_BUILD_TARGETS
         self.mod_finder.module_info.is_auto_gen_test_config.return_value = False
@@ -531,14 +616,14 @@
             self, t_infos[0], uc.CC_CLASS_INFO)
 
         # with method
-        mock_build.return_value = uc.MODULE_BUILD_TARGETS
+        mock_build.return_value = copy.deepcopy(uc.MODULE_BUILD_TARGETS)
         class_with_method = '%s#%s' % (uc.CC_CLASS_NAME, uc.CC_METHOD_NAME)
         t_infos = self.mod_finder.find_test_by_cc_class_name(class_with_method)
         unittest_utils.assert_equal_testinfos(
             self,
             t_infos[0],
             uc.CC_METHOD_INFO)
-        mock_build.return_value = uc.MODULE_BUILD_TARGETS
+        mock_build.return_value = copy.deepcopy(uc.MODULE_BUILD_TARGETS)
         class_methods = '%s,%s' % (class_with_method, uc.CC_METHOD2_NAME)
         t_infos = self.mod_finder.find_test_by_cc_class_name(class_methods)
         unittest_utils.assert_equal_testinfos(
@@ -592,6 +677,485 @@
         self.assertEqual(self.mod_finder._get_build_targets('', ''),
                          {constants.VTS_CORE_TF_MODULE})
 
+    @mock.patch.object(test_finder_utils, 'is_parameterized_java_class',
+                       return_value=False)
+    @mock.patch.object(module_finder.ModuleFinder, '_is_vts_module',
+                       return_value=False)
+    @mock.patch.object(module_finder.ModuleFinder, '_get_build_targets')
+    @mock.patch('subprocess.check_output', return_value='')
+    @mock.patch.object(test_finder_utils, 'get_fully_qualified_class_name',
+                       return_value=uc.FULL_CLASS_NAME)
+    @mock.patch('os.path.isfile', side_effect=unittest_utils.isfile_side_effect)
+    @mock.patch('os.path.isdir', return_value=True)
+    #pylint: disable=unused-argument
+    def test_find_test_by_class_name_w_module(self, _isdir, _isfile, _fqcn,
+                                              mock_checkoutput, mock_build,
+                                              _vts, _is_parameterized):
+        """Test test_find_test_by_class_name with module but without class found."""
+        mock_build.return_value = uc.CLASS_BUILD_TARGETS
+        self.mod_finder.module_info.is_auto_gen_test_config.return_value = False
+        self.mod_finder.module_info.is_robolectric_test.return_value = False
+        self.mod_finder.module_info.has_test_config.return_value = True
+        self.mod_finder.module_info.get_module_names.return_value = [uc.MODULE_NAME]
+        self.mod_finder.module_info.get_module_info.return_value = {
+            constants.MODULE_INSTALLED: DEFAULT_INSTALL_PATH,
+            constants.MODULE_NAME: uc.MODULE_NAME,
+            constants.MODULE_CLASS: [],
+            constants.MODULE_COMPATIBILITY_SUITES: []}
+        self.mod_finder.module_info.get_paths.return_value = [uc.TEST_DATA_CONFIG]
+        t_infos = self.mod_finder.find_test_by_class_name(
+            uc.FULL_CLASS_NAME, module_name=uc.MODULE_NAME,
+            rel_config=uc.CONFIG_FILE)
+        unittest_utils.assert_equal_testinfos(
+            self, t_infos[0], uc.CLASS_INFO)
+
+    @mock.patch.object(module_finder.ModuleFinder, '_is_vts_module',
+                       return_value=False)
+    @mock.patch.object(module_finder.ModuleFinder, '_get_build_targets')
+    @mock.patch('subprocess.check_output', return_value='')
+    @mock.patch('os.path.isfile', side_effect=unittest_utils.isfile_side_effect)
+    @mock.patch('os.path.isdir', return_value=True)
+    #pylint: disable=unused-argument
+    def test_find_test_by_package_name_w_module(self, _isdir, _isfile,
+                                                mock_checkoutput, mock_build,
+                                                _vts):
+        """Test find_test_by_package_name with module but without package found."""
+        self.mod_finder.module_info.is_auto_gen_test_config.return_value = False
+        self.mod_finder.module_info.is_robolectric_test.return_value = False
+        self.mod_finder.module_info.has_test_config.return_value = True
+        mock_build.return_value = uc.CLASS_BUILD_TARGETS
+        self.mod_finder.module_info.get_module_names.return_value = [uc.MODULE_NAME]
+        self.mod_finder.module_info.get_module_info.return_value = {
+            constants.MODULE_INSTALLED: DEFAULT_INSTALL_PATH,
+            constants.MODULE_NAME: uc.MODULE_NAME,
+            constants.MODULE_CLASS: [],
+            constants.MODULE_COMPATIBILITY_SUITES: []
+        }
+        self.mod_finder.module_info.get_paths.return_value = [uc.TEST_DATA_CONFIG]
+        t_infos = self.mod_finder.find_test_by_package_name(
+            uc.PACKAGE, module_name=uc.MODULE_NAME, rel_config=uc.CONFIG_FILE)
+        unittest_utils.assert_equal_testinfos(
+            self, t_infos[0],
+            uc.PACKAGE_INFO)
+
+    @mock.patch.object(atest_utils, 'is_build_file', return_value=True)
+    @mock.patch.object(test_finder_utils, 'is_parameterized_java_class',
+                       return_value=True)
+    @mock.patch.object(test_finder_utils, 'has_method_in_file',
+                       return_value=True)
+    @mock.patch.object(test_finder_utils, 'has_cc_class',
+                       return_value=True)
+    @mock.patch.object(module_finder.ModuleFinder, '_get_build_targets')
+    @mock.patch.object(module_finder.ModuleFinder, '_is_vts_module',
+                       return_value=False)
+    @mock.patch.object(test_finder_utils, 'get_fully_qualified_class_name',
+                       return_value=uc.FULL_CLASS_NAME)
+    @mock.patch('os.path.realpath',
+                side_effect=unittest_utils.realpath_side_effect)
+    @mock.patch('os.path.isfile', side_effect=unittest_utils.isfile_side_effect)
+    @mock.patch.object(test_finder_utils, 'find_parent_module_dir')
+    @mock.patch('os.path.exists')
+    #pylint: disable=unused-argument
+    def test_find_test_by_path_is_parameterized_java(
+            self, mock_pathexists, mock_dir, _isfile, _real, _fqcn, _vts,
+            mock_build, _has_cc_class, _has_method_in_file, _is_parameterized,
+            _is_build_file):
+        """Test find_test_by_path and input path is parameterized class."""
+        self.mod_finder.module_info.is_robolectric_test.return_value = False
+        self.mod_finder.module_info.has_test_config.return_value = True
+        mock_build.return_value = set()
+        mock_pathexists.return_value = True
+        self.mod_finder.module_info.get_module_names.return_value = [uc.MODULE_NAME]
+        self.mod_finder.module_info.get_module_info.return_value = {
+            constants.MODULE_INSTALLED: DEFAULT_INSTALL_PATH,
+            constants.MODULE_NAME: uc.MODULE_NAME,
+            constants.MODULE_CLASS: [],
+            constants.MODULE_COMPATIBILITY_SUITES: []}
+        # Happy path testing.
+        mock_dir.return_value = uc.MODULE_DIR
+        class_path = '%s.java' % uc.CLASS_NAME
+        # Input includes only one method.
+        class_with_method = '%s#%s' % (class_path, uc.METHOD_NAME)
+        mock_build.return_value = copy.deepcopy(uc.MODULE_BUILD_TARGETS)
+        t_infos = self.mod_finder.find_test_by_path(class_with_method)
+        unittest_utils.assert_equal_testinfos(
+            self, t_infos[0], uc.PARAMETERIZED_METHOD_INFO)
+        # Input includes multiple methods.
+        class_with_methods = '%s,%s' % (class_with_method, uc.METHOD2_NAME)
+        mock_build.return_value = copy.deepcopy(uc.MODULE_BUILD_TARGETS)
+        t_infos = self.mod_finder.find_test_by_path(class_with_methods)
+        unittest_utils.assert_equal_testinfos(
+            self, t_infos[0], uc.PARAMETERIZED_FLAT_METHOD_INFO)
+
+    @mock.patch.object(test_finder_utils, 'find_host_unit_tests',
+                       return_value=[])
+    @mock.patch.object(atest_utils, 'is_build_file', return_value=True)
+    @mock.patch.object(test_finder_utils, 'is_parameterized_java_class',
+                       return_value=True)
+    @mock.patch.object(test_finder_utils, 'has_method_in_file',
+                       return_value=True)
+    @mock.patch.object(module_finder.ModuleFinder, '_is_vts_module',
+                       return_value=False)
+    @mock.patch.object(module_finder.ModuleFinder, '_get_build_targets')
+    @mock.patch('subprocess.check_output', return_value=uc.FIND_ONE)
+    @mock.patch.object(test_finder_utils, 'get_fully_qualified_class_name',
+                       return_value=uc.FULL_CLASS_NAME)
+    @mock.patch('os.path.isfile', side_effect=unittest_utils.isfile_side_effect)
+    @mock.patch('os.path.isdir', return_value=True)
+    #pylint: disable=unused-argument
+    def test_find_test_by_class_name_is_parameterized(
+            self, _isdir, _isfile, _fqcn, mock_checkoutput, mock_build, _vts,
+            _has_method_in_file, _is_parameterized, _is_build_file,
+            _mock_unit_tests):
+        """Test find_test_by_class_name and the class is parameterized java."""
+        mock_build.return_value = uc.CLASS_BUILD_TARGETS
+        self.mod_finder.module_info.is_auto_gen_test_config.return_value = False
+        self.mod_finder.module_info.is_robolectric_test.return_value = False
+        self.mod_finder.module_info.has_test_config.return_value = True
+        self.mod_finder.module_info.get_module_names.return_value = [uc.MODULE_NAME]
+        self.mod_finder.module_info.get_module_info.return_value = {
+            constants.MODULE_INSTALLED: DEFAULT_INSTALL_PATH,
+            constants.MODULE_NAME: uc.MODULE_NAME,
+            constants.MODULE_CLASS: [],
+            constants.MODULE_COMPATIBILITY_SUITES: []}
+        # With method
+        mock_build.return_value = copy.deepcopy(uc.MODULE_BUILD_TARGETS)
+        class_with_method = '%s#%s' % (uc.CLASS_NAME, uc.METHOD_NAME)
+        t_infos = self.mod_finder.find_test_by_class_name(class_with_method)
+        unittest_utils.assert_equal_testinfos(
+            self, t_infos[0], uc.PARAMETERIZED_METHOD_INFO)
+        # With multiple methods
+        mock_build.return_value = copy.deepcopy(uc.MODULE_BUILD_TARGETS)
+        class_methods = '%s,%s' % (class_with_method, uc.METHOD2_NAME)
+        t_infos = self.mod_finder.find_test_by_class_name(class_methods)
+        unittest_utils.assert_equal_testinfos(
+            self, t_infos[0], uc.PARAMETERIZED_FLAT_METHOD_INFO)
+
+    # pylint: disable=unused-argument
+    @mock.patch.object(module_finder.ModuleFinder, '_get_build_targets',
+                       return_value=copy.deepcopy(uc.MODULE_BUILD_TARGETS))
+    def test_find_test_by_config_name(self, _get_targ):
+        """Test find_test_by_config_name."""
+        self.mod_finder.module_info.is_robolectric_test.return_value = False
+        self.mod_finder.module_info.has_test_config.return_value = True
+
+        mod_info = {'installed': ['/path/to/install'],
+                    'path': [uc.MODULE_DIR],
+                    constants.MODULE_TEST_CONFIG: [uc.CONFIG_FILE,
+                                                   uc.EXTRA_CONFIG_FILE],
+                    constants.MODULE_CLASS: [],
+                    constants.MODULE_COMPATIBILITY_SUITES: []}
+        name_to_module_info = {uc.MODULE_NAME: mod_info}
+        self.mod_finder.module_info.name_to_module_info = name_to_module_info
+        t_infos = self.mod_finder.find_test_by_config_name(uc.MODULE_CONFIG_NAME)
+        unittest_utils.assert_equal_testinfos(
+            self,
+            t_infos[0],
+            uc.TEST_CONFIG_MODULE_INFO)
+
+    @mock.patch.object(test_finder_utils, 'is_parameterized_java_class',
+                       return_value=False)
+    @mock.patch.object(test_finder_utils, 'has_method_in_file',
+                       return_value=True)
+    @mock.patch.object(test_finder_utils, 'has_cc_class',
+                       return_value=True)
+    @mock.patch.object(module_finder.ModuleFinder, '_get_build_targets')
+    @mock.patch.object(module_finder.ModuleFinder, '_is_vts_module',
+                       return_value=False)
+    @mock.patch.object(test_finder_utils, 'get_fully_qualified_class_name',
+                       return_value=uc.FULL_CLASS_NAME)
+    @mock.patch('os.path.realpath',
+                side_effect=unittest_utils.realpath_side_effect)
+    @mock.patch('os.path.isfile', side_effect=unittest_utils.isfile_side_effect)
+    @mock.patch.object(test_finder_utils, 'find_parent_module_dir')
+    @mock.patch('os.path.exists')
+    #pylint: disable=unused-argument
+    def test_find_test_by_path_w_src_verify(
+            self, mock_pathexists, mock_dir, _isfile, _real, _fqcn, _vts,
+            mock_build, _has_cc_class, _has_method_in_file, _is_parameterized):
+        """Test find_test_by_path with src information."""
+        self.mod_finder.module_info.is_robolectric_test.return_value = False
+        self.mod_finder.module_info.has_test_config.return_value = True
+        self.mod_finder.module_info.get_module_names.return_value = [uc.MODULE_NAME]
+        mock_build.return_value = uc.CLASS_BUILD_TARGETS
+
+        # Happy path testing.
+        mock_dir.return_value = uc.MODULE_DIR
+        # Test path not in module's src list.
+        class_path = '%s.java' % uc.CLASS_NAME
+        self.mod_finder.module_info.get_module_info.return_value = {
+            constants.MODULE_INSTALLED: DEFAULT_INSTALL_PATH,
+            constants.MODULE_NAME: uc.MODULE_NAME,
+            constants.MODULE_CLASS: [],
+            constants.MODULE_COMPATIBILITY_SUITES: [],
+            constants.MODULE_SRCS: ['not_matched_%s' % class_path]}
+        t_infos = self.mod_finder.find_test_by_path(class_path)
+        self.assertEqual(0, len(t_infos))
+
+        # Test input file is in module's src list.
+        class_path = '%s.java' % uc.CLASS_NAME
+        self.mod_finder.module_info.get_module_info.return_value = {
+            constants.MODULE_INSTALLED: DEFAULT_INSTALL_PATH,
+            constants.MODULE_NAME: uc.MODULE_NAME,
+            constants.MODULE_CLASS: [],
+            constants.MODULE_COMPATIBILITY_SUITES: [],
+            constants.MODULE_SRCS: [class_path]}
+        t_infos = self.mod_finder.find_test_by_path(class_path)
+        unittest_utils.assert_equal_testinfos(self, uc.CLASS_INFO, t_infos[0])
+
+    @mock.patch.object(test_finder_utils, 'get_cc_test_classes_methods')
+    @mock.patch.object(atest_utils, 'is_build_file', return_value=True)
+    @mock.patch.object(test_finder_utils, 'is_parameterized_java_class',
+                       return_value=False)
+    @mock.patch.object(test_finder_utils, 'has_method_in_file',
+                       return_value=True)
+    @mock.patch.object(test_finder_utils, 'has_cc_class',
+                       return_value=True)
+    @mock.patch.object(module_finder.ModuleFinder, '_get_build_targets')
+    @mock.patch.object(module_finder.ModuleFinder, '_is_vts_module',
+                       return_value=False)
+    @mock.patch.object(test_finder_utils, 'get_fully_qualified_class_name',
+                       return_value=uc.FULL_CLASS_NAME)
+    @mock.patch('os.path.realpath',
+                side_effect=unittest_utils.realpath_side_effect)
+    @mock.patch('os.path.isfile', side_effect=unittest_utils.isfile_side_effect)
+    @mock.patch.object(test_finder_utils, 'find_parent_module_dir')
+    @mock.patch('os.path.exists')
+    #pylint: disable=unused-argument
+    def test_find_test_by_path_for_cc_file(self, mock_pathexists, mock_dir,
+        _isfile, _real, _fqcn, _vts, mock_build, _has_cc_class,
+        _has_method_in_file, _is_parameterized, _is_build_file,
+        _mock_get_cc_test_class):
+        """Test find_test_by_path for handling correct CC filter."""
+        self.mod_finder.module_info.is_robolectric_test.return_value = False
+        self.mod_finder.module_info.has_test_config.return_value = True
+        mock_build.return_value = set()
+        # Check that we don't return anything with invalid test references.
+        mock_pathexists.return_value = False
+        mock_pathexists.return_value = True
+        mock_dir.return_value = None
+        self.mod_finder.module_info.get_module_names.return_value = [uc.MODULE_NAME]
+        self.mod_finder.module_info.get_module_info.return_value = {
+            constants.MODULE_INSTALLED: DEFAULT_INSTALL_PATH,
+            constants.MODULE_NAME: uc.MODULE_NAME,
+            constants.MODULE_CLASS: [],
+            constants.MODULE_COMPATIBILITY_SUITES: []}
+        # Happy path testing.
+        mock_dir.return_value = uc.MODULE_DIR
+        # Cc path testing when get_cc_test_classes_methods finds that info.
+        self.mod_finder.module_info.get_module_names.return_value = [uc.CC_MODULE_NAME]
+        self.mod_finder.module_info.get_module_info.return_value = {
+            constants.MODULE_INSTALLED: DEFAULT_INSTALL_PATH,
+            constants.MODULE_NAME: uc.CC_MODULE_NAME,
+            constants.MODULE_CLASS: [],
+            constants.MODULE_COMPATIBILITY_SUITES: []}
+        mock_dir.return_value = uc.CC_MODULE_DIR
+        class_path = '%s' % uc.CC_PATH
+        mock_build.return_value = uc.CLASS_BUILD_TARGETS
+        # Test without a parameterized test
+        founded_classed = {'class1'}
+        founded_methods = {'method1'}
+        founded_para_classes = set()
+        _mock_get_cc_test_class.return_value = (founded_classed,
+                                                founded_methods,
+                                                founded_para_classes)
+        cc_path_data = {constants.TI_REL_CONFIG: uc.CC_CONFIG_FILE,
+                        constants.TI_FILTER: frozenset(
+                            {test_info.TestFilter(class_name='class1.*',
+                                                  methods=frozenset())})}
+        cc_path_info = test_info.TestInfo(uc.CC_MODULE_NAME,
+                                          atf_tr.AtestTradefedTestRunner.NAME,
+                                          uc.CLASS_BUILD_TARGETS, cc_path_data)
+        t_infos = self.mod_finder.find_test_by_path(class_path)
+        unittest_utils.assert_equal_testinfos(self, cc_path_info, t_infos[0])
+        # Test with a parameterized test defined in the input path
+        founded_classed = {'class1'}
+        founded_methods = {'method1'}
+        founded_para_classes = {'class1'}
+        _mock_get_cc_test_class.return_value = (founded_classed,
+                                                founded_methods,
+                                                founded_para_classes)
+        cc_path_data = {constants.TI_REL_CONFIG: uc.CC_CONFIG_FILE,
+                        constants.TI_FILTER: frozenset(
+                            {test_info.TestFilter(class_name='*/class1.*',
+                                                  methods=frozenset())})}
+        cc_path_info = test_info.TestInfo(uc.CC_MODULE_NAME,
+                                          atf_tr.AtestTradefedTestRunner.NAME,
+                                          uc.CLASS_BUILD_TARGETS, cc_path_data)
+        t_infos = self.mod_finder.find_test_by_path(class_path)
+        unittest_utils.assert_equal_testinfos(self, cc_path_info, t_infos[0])
+
+    # pylint: disable=unused-argument
+    @mock.patch.object(module_finder.ModuleFinder, '_is_vts_module',
+                       return_value=False)
+    @mock.patch.object(module_finder.ModuleFinder, '_get_build_targets',
+                       return_value=copy.deepcopy(uc.MODULE_BUILD_TARGETS))
+    def test_process_test_info(self, _get_targ, _is_vts):
+        """Test _process_test_info."""
+        mod_info = {'installed': ['/path/to/install'],
+                    'path': [uc.MODULE_DIR],
+                    constants.MODULE_CLASS: [
+                        constants.MODULE_CLASS_JAVA_LIBRARIES],
+                    constants.MODULE_COMPATIBILITY_SUITES: []}
+        self.mod_finder.module_info.is_robolectric_test.return_value = False
+        self.mod_finder.module_info.is_auto_gen_test_config.return_value = True
+        self.mod_finder.module_info.get_module_info.return_value = mod_info
+        processed_info = self.mod_finder._process_test_info(
+            copy.copy(uc.MODULE_INFO))
+        unittest_utils.assert_equal_testinfos(
+            self,
+            processed_info,
+            uc.MODULE_INFO_W_DALVIK)
+
+    @mock.patch.object(test_finder_utils, 'get_annotated_methods')
+    def test_is_srcs_match_method_annotation_include_anno(
+        self, _mock_get_anno_methods):
+        """Test _is_srcs_match_method_annotation with include annotation."""
+        annotation_dict = {constants.INCLUDE_ANNOTATION: 'includeAnnotation1'}
+        input_method = 'my_input_method'
+        input_srcs = ['src1']
+        # Test if input method matched include annotation.
+        _mock_get_anno_methods.return_value = {input_method,
+                                               'not_my_input_method'}
+
+        is_matched = self.mod_finder._is_srcs_match_method_annotation(
+            input_method, input_srcs, annotation_dict)
+
+        self.assertTrue(is_matched)
+        # Test if input method not matched include annotation.
+        _mock_get_anno_methods.return_value = {'not_my_input_method'}
+
+        is_matched = self.mod_finder._is_srcs_match_method_annotation(
+            input_method, input_srcs, annotation_dict)
+
+        self.assertFalse(is_matched)
+
+    @mock.patch.object(test_finder_utils, 'get_annotated_methods')
+    @mock.patch.object(test_finder_utils, 'get_java_methods')
+    def test_is_srcs_match_method_exclude_anno(self, _mock_get_java_methods,
+        _mock_get_exclude_anno_methods):
+        """Test _is_srcs_match_method_annotation with exclude annotation."""
+        annotation_dict = {constants.EXCLUDE_ANNOTATION: 'excludeAnnotation1'}
+        input_method = 'my_input_method'
+        input_srcs = ['src1']
+        _mock_get_java_methods.return_value = {input_method,
+                                               'method1',
+                                               'method2'}
+        # Test if input method matched exclude annotation.
+        _mock_get_exclude_anno_methods.return_value = {input_method, 'method1'}
+
+        is_matched = self.mod_finder._is_srcs_match_method_annotation(
+            input_method, input_srcs, annotation_dict)
+
+        self.assertFalse(is_matched)
+
+        # Test if input method not matched exclude annotation.
+        _mock_get_exclude_anno_methods.return_value = {'method2'}
+
+        is_matched = self.mod_finder._is_srcs_match_method_annotation(
+            input_method, input_srcs, annotation_dict)
+
+        self.assertTrue(is_matched)
+
+    @mock.patch.object(atest_utils, 'get_android_junit_config_filters')
+    @mock.patch.object(test_finder_utils, 'get_test_config_and_srcs')
+    def test_get_matched_test_infos_no_filter(self, _mock_get_conf_srcs,
+        _mock_get_filters):
+        """Test _get_matched_test_infos without test filters."""
+        test_info1 = 'test_info1'
+        test_infos = [test_info1]
+        test_config = 'test_config'
+        test_srcs = ['src1', 'src2']
+        _mock_get_conf_srcs.return_value = test_config, test_srcs
+        filter_dict = {}
+        _mock_get_filters.return_value = filter_dict
+
+        self.assertEqual(
+            self.mod_finder._get_matched_test_infos(test_infos, {'method'}),
+            test_infos)
+
+    @mock.patch.object(module_finder.ModuleFinder,
+                       '_is_srcs_match_method_annotation')
+    @mock.patch.object(atest_utils, 'get_android_junit_config_filters')
+    @mock.patch.object(test_finder_utils, 'get_test_config_and_srcs')
+    def test_get_matched_test_infos_get_filter_method_match(
+        self, _mock_get_conf_srcs, _mock_get_filters, _mock_method_match):
+        """Test _get_matched_test_infos with test filters and method match."""
+        test_infos = [KERNEL_MODULE_CLASS_INFO]
+        test_config = 'test_config'
+        test_srcs = ['src1', 'src2']
+        _mock_get_conf_srcs.return_value = test_config, test_srcs
+        filter_dict = {'include-annotation': 'annotate1'}
+        _mock_get_filters.return_value = filter_dict
+        _mock_method_match.return_value = True
+
+        unittest_utils.assert_strict_equal(
+            self,
+            self.mod_finder._get_matched_test_infos(test_infos, {'method'}),
+            test_infos)
+
+    @mock.patch.object(module_finder.ModuleFinder,
+                       '_is_srcs_match_method_annotation')
+    @mock.patch.object(atest_utils, 'get_android_junit_config_filters')
+    @mock.patch.object(test_finder_utils, 'get_test_config_and_srcs')
+    def test_get_matched_test_infos_filter_method_not_match(
+        self, _mock_get_conf_srcs, _mock_get_filters, _mock_method_match):
+        """Test _get_matched_test_infos but method not match."""
+        test_infos = [KERNEL_MODULE_CLASS_INFO]
+        test_config = 'test_config'
+        test_srcs = ['src1', 'src2']
+        _mock_get_conf_srcs.return_value = test_config, test_srcs
+        filter_dict = {'include-annotation': 'annotate1'}
+        _mock_get_filters.return_value = filter_dict
+        _mock_method_match.return_value = False
+
+        self.assertEqual(
+            self.mod_finder._get_matched_test_infos(test_infos, {'method'}),
+            [])
+
+    @mock.patch.object(module_finder.ModuleFinder, '_get_matched_test_infos')
+    @mock.patch.object(module_finder.ModuleFinder, '_get_test_infos',
+                       return_value=uc.MODULE_INFO)
+    @mock.patch.object(module_finder.ModuleFinder, '_get_test_info_filter',
+                       return_value=uc.CLASS_FILTER)
+    @mock.patch.object(test_finder_utils, 'find_class_file',
+                       return_value=['path1'])
+    def test_find_test_by_class_name_not_matched_filters(
+        self, _mock_class_path, _mock_test_filters,
+        _mock_test_infos, _mock_matched_test_infos):
+        """Test find_test_by_class_name which has not matched filters."""
+        found_test_infos = [uc.MODULE_INFO, uc.MODULE_INFO2]
+        _mock_test_infos.return_value = found_test_infos
+        matched_test_infos = [uc.MODULE_INFO2]
+        _mock_matched_test_infos.return_value = matched_test_infos
+
+        # Test if class without method
+        test_infos = self.mod_finder.find_test_by_class_name('my.test.class')
+        self.assertEqual(len(test_infos), 2)
+        unittest_utils.assert_equal_testinfos(
+            self, test_infos[0], uc.MODULE_INFO)
+        unittest_utils.assert_equal_testinfos(
+            self, test_infos[1], uc.MODULE_INFO2)
+
+        # Test if class with method
+        test_infos = self.mod_finder.find_test_by_class_name(
+            'my.test.class#myMethod')
+        self.assertEqual(len(test_infos), 1)
+        unittest_utils.assert_equal_testinfos(
+            self, test_infos[0], uc.MODULE_INFO2)
+
+    @mock.patch.object(module_finder.ModuleFinder, '_get_test_infos',
+                       return_value=None)
+    @mock.patch.object(module_finder.ModuleFinder, '_get_test_info_filter',
+                       return_value=uc.CLASS_FILTER)
+    @mock.patch.object(test_finder_utils, 'find_class_file',
+                       return_value=['path1'])
+    def test_find_test_by_class_name_get_test_infos_none(
+        self, _mock_class_path, _mock_test_filters, _mock_test_infos):
+        """Test find_test_by_class_name which has not matched test infos."""
+        self.assertEqual(
+            self.mod_finder.find_test_by_class_name('my.test.class'),
+            None)
 
 if __name__ == '__main__':
     unittest.main()
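
The repeated change in the hunks above swaps bare `uc.MODULE_BUILD_TARGETS` return values for `copy.deepcopy(uc.MODULE_BUILD_TARGETS)`. A minimal sketch of the likely rationale, using an invented stand-in constant and mock rather than the real `uc` fixtures: if the code under test mutates the set a mock returns, every assertion sharing the bare constant sees that mutation.

```python
import copy
from unittest import mock

# Invented stand-in for uc.MODULE_BUILD_TARGETS; not the real fixture.
MODULE_BUILD_TARGETS = {'tradefed-core'}

mock_build = mock.Mock()
mock_build.return_value = MODULE_BUILD_TARGETS   # shares the module-level object
mock_build().add('extra-target')                 # caller mutates the "returned" set
assert 'extra-target' in MODULE_BUILD_TARGETS    # the shared constant is now polluted

MODULE_BUILD_TARGETS = {'tradefed-core'}
mock_build.return_value = copy.deepcopy(MODULE_BUILD_TARGETS)  # independent copy
mock_build().add('extra-target')
assert 'extra-target' not in MODULE_BUILD_TARGETS              # constant stays intact
```
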
diff --git a/atest/test_finders/test_finder_utils.py b/atest/test_finders/test_finder_utils.py
index 705e7a0..705ee23 100644
--- a/atest/test_finders/test_finder_utils.py
+++ b/atest/test_finders/test_finder_utils.py
@@ -17,6 +17,7 @@
 """
 
 # pylint: disable=line-too-long
+# pylint: disable=too-many-lines
 
 from __future__ import print_function
 
@@ -32,6 +33,7 @@
 import atest_decorator
 import atest_error
 import atest_enum
+import atest_utils
 import constants
 
 from metrics import metrics_utils
@@ -41,12 +43,15 @@
 # We want to make sure we don't grab apks with paths in their name since we
 # assume the apk name is the build target.
 _APK_RE = re.compile(r'^[^/]+\.apk$', re.I)
-# RE for checking if TEST or TEST_F is in a cc file or not.
-_CC_CLASS_RE = re.compile(r'^[ ]*TEST(_F|_P)?[ ]*\(', re.I)
-# RE for checking if there exists one of the methods in java file.
-_JAVA_METHODS_PATTERN = r'.*[ ]+({0})\(.*'
-# RE for checking if there exists one of the methods in cc file.
-_CC_METHODS_PATTERN = r'^[ ]*TEST(_F|_P)?[ ]*\(.*,[ ]*({0})\).*'
+# Group matches "class" of line "TEST_F(class, "
+_CC_CLASS_METHOD_RE = re.compile(
+    r'^\s*TEST(_F|_P)?\s*\(\s*(?P<class>\w+)\s*,\s*(?P<method>\w+)\s*\)', re.M)
+# Group matches parameterized "class" of line "INSTANTIATE_TEST_CASE_P( ,class "
+_PARA_CC_CLASS_RE = re.compile(
+    r'^\s*INSTANTIATE[_TYPED]*_TEST_(SUITE|CASE)_P\s*\(\s*(?P<instantiate>\w+)\s*,'
+    r'\s*(?P<class>\w+)\s*\,', re.M)
+# Group that matches java/kt method.
+_JAVA_METHODS_RE = r'.*\s+(fun|void)\s+(?P<methods>\w+)\(\)'
 # Parse package name from the package declaration line of a java or
 # a kotlin file.
 # Group matches "foo.bar" of line "package foo.bar;" or "package foo.bar"
@@ -54,6 +59,12 @@
 # Matches install paths in module_info to install location(host or device).
 _HOST_PATH_RE = re.compile(r'.*\/host\/.*', re.I)
 _DEVICE_PATH_RE = re.compile(r'.*\/target\/.*', re.I)
+# RE for checking if a java class is parameterized.
+_PARAMET_JAVA_CLASS_RE = re.compile(
+    r'^\s*@RunWith\s*\(\s*(Parameterized|TestParameterInjector|'
+    r'JUnitParamsRunner|DataProviderRunner|JukitoRunner|Theories|BedsteadJUnit4'
+    r').class\s*\)', re.I)
+_PARENT_CLS_RE = re.compile(r'.*class\s+\w+\s+extends\s+(?P<parent>[\w\.]+.*)\s\{')
 
 # Explanation of FIND_REFERENCE_TYPEs:
 # ----------------------------------
@@ -109,8 +120,18 @@
 _CTS_JAR = "cts-tradefed"
 _XML_PUSH_DELIM = '->'
 _APK_SUFFIX = '.apk'
+DALVIK_TEST_RUNNER_CLASS = 'com.android.compatibility.testtype.DalvikTest'
+LIBCORE_TEST_RUNNER_CLASS = 'com.android.compatibility.testtype.LibcoreTest'
+DALVIK_TESTRUNNER_JAR_CLASSES = [DALVIK_TEST_RUNNER_CLASS,
+                                 LIBCORE_TEST_RUNNER_CLASS]
+DALVIK_DEVICE_RUNNER_JAR = 'cts-dalvik-device-test-runner'
+DALVIK_HOST_RUNNER_JAR = 'cts-dalvik-host-test-runner'
+DALVIK_TEST_DEPS = {DALVIK_DEVICE_RUNNER_JAR,
+                    DALVIK_HOST_RUNNER_JAR,
+                    _CTS_JAR}
 # Setup script for device perf tests.
 _PERF_SETUP_LABEL = 'perf-setup.sh'
+_PERF_SETUP_TARGET = 'perf-setup'
 
 # XML tags.
 _XML_NAME = 'name'
@@ -208,7 +229,7 @@
     """
     with open(test_path) as class_file:
         for line in class_file:
-            match = _CC_CLASS_RE.match(line)
+            match = _CC_CLASS_METHOD_RE.match(line)
             if match:
                 return True
     return False
@@ -222,7 +243,7 @@
 
     Returns:
         A string of the package name or None
-      """
+    """
     with open(file_name) as data:
         for line in data:
             match = _PACKAGE_RE.match(line)
@@ -230,11 +251,26 @@
                 return match.group('package')
 
 
-def has_method_in_file(test_path, methods):
-    """Find out if there is at least one method in the file.
+def get_parent_cls_name(file_name):
+    """Parse the parent class name from a java file.
 
-    Note: This method doesn't handle if method is in comment sections or not.
-    If the file has any method(even in comment sections), it will return True.
+    Args:
+        file_name: A string of the absolute path to the java file.
+
+    Returns:
+        A string of the parent class name or None
+    """
+    with open(file_name) as data:
+        for line in data:
+            match = _PARENT_CLS_RE.match(line)
+            if match:
+                return match.group('parent')
+
+# pylint: disable=too-many-branches
+def has_method_in_file(test_path, methods):
+    """Find out if every method can be found in the file.
+
+    Note: This method doesn't check whether a method is inside comment sections.
 
     Args:
         test_path: A string of absolute path to the test file.
@@ -245,19 +281,40 @@
     """
     if not os.path.isfile(test_path):
         return False
-    methods_re = None
     if constants.JAVA_EXT_RE.match(test_path):
-        methods_re = re.compile(_JAVA_METHODS_PATTERN.format(
-            '|'.join([r'%s' % x for x in methods])))
-    elif constants.CC_EXT_RE.match(test_path):
-        methods_re = re.compile(_CC_METHODS_PATTERN.format(
-            '|'.join([r'%s' % x for x in methods])))
-    if methods_re:
-        with open(test_path) as test_file:
-            for line in test_file:
-                match = re.match(methods_re, line)
-                if match:
-                    return True
+        # omit parameterized pattern: method[0]
+        _methods = set(re.sub(r'\[\S+\]', '', x) for x in methods)
+        if _methods.issubset(get_java_methods(test_path)):
+            return True
+        parent = get_parent_cls_name(test_path)
+        package = get_package_name(test_path)
+        if parent and package:
+            # Remove <Generics> when needed.
+            parent_cls = re.sub(r'\<\w+\>', '', parent)
+            # Use the fully qualified class name for searching precisely.
+            # package org.gnome;
+            # public class Foo extends com.android.Boo -> com.android.Boo
+            # public class Foo extends Boo -> org.gnome.Boo
+            if '.' in parent_cls:
+                parent_fqcn = parent_cls
+            else:
+                parent_fqcn = package + '.' + parent_cls
+            try:
+                logging.debug('Searching methods in %s', parent_fqcn)
+                return has_method_in_file(
+                    run_find_cmd(FIND_REFERENCE_TYPE.QUALIFIED_CLASS,
+                                os.environ.get(constants.ANDROID_BUILD_TOP),
+                                parent_fqcn,
+                                methods)[0], methods)
+            except TypeError:
+                logging.debug('Out of searching range: no test found.')
+                return False
+    if constants.CC_EXT_RE.match(test_path):
+        # omit parameterized pattern: method/argument
+        _methods = set(re.sub(r'\/.*', '', x) for x in methods)
+        _, cc_methods, _ = get_cc_test_classes_methods(test_path)
+        if _methods.issubset(cc_methods):
+            return True
     return False
 
 
@@ -280,25 +337,19 @@
     if isinstance(output, str):
         output = output.splitlines()
     for test in output:
-        # compare CC_OUTPUT_RE with output
         match_obj = constants.CC_OUTPUT_RE.match(test)
+        # Legacy "find" cc output (with TEST_P() syntax):
         if match_obj:
-            # cc/cpp
             fpath = match_obj.group('file_path')
             if not methods or match_obj.group('method_name') in methods:
                 verified_tests.add(fpath)
-        else:
-            # TODO (b/138997521) - Atest checks has_method_in_file of a class
-            #  without traversing its parent classes. A workaround for this is
-            #  do not check has_method_in_file. Uncomment below when a solution
-            #  to it is applied.
-            # java/kt
-            #if not methods or has_method_in_file(test, methods):
+        # "locate" output path for both java/cc.
+        elif not methods or has_method_in_file(test, methods):
             verified_tests.add(test)
     return extract_test_from_tests(sorted(list(verified_tests)))
 
 
-def extract_test_from_tests(tests):
+def extract_test_from_tests(tests, default_all=False):
     """Extract the test path from the tests.
 
     Return the test to run from tests. If more than one option, prompt the user
@@ -315,7 +366,7 @@
         A string list of paths.
     """
     count = len(tests)
-    if count <= 1:
+    if default_all or count <= 1:
         return tests if count else None
     mtests = set()
     try:
@@ -425,7 +476,10 @@
         return None
     ref_name = FIND_REFERENCE_TYPE[ref_type]
     start = time.time()
-    if os.path.isfile(FIND_INDEXES[ref_type]):
+    # Validate mlocate.db before using 'locate' or 'find'.
+    # TODO: b/187146540 record abnormal mlocate.db in Metrics.
+    is_valid_mlocate = atest_utils.check_md5(constants.LOCATE_CACHE_MD5)
+    if os.path.isfile(FIND_INDEXES[ref_type]) and is_valid_mlocate:
         _dict, out = {}, None
         with open(FIND_INDEXES[ref_type], 'rb') as index:
             try:
@@ -436,8 +490,8 @@
                     constants.ACCESS_CACHE_FAILURE)
                 os.remove(FIND_INDEXES[ref_type])
         if _dict.get(target):
-            logging.debug('Found %s in %s', target, FIND_INDEXES[ref_type])
             out = [path for path in _dict.get(target) if search_dir in path]
+            logging.debug('Found %s in %s', target, out)
     else:
         prune_cond = _get_prune_cond_of_ignored_dirs()
         if '.' in target:
@@ -517,15 +571,20 @@
         # TODO (b/112904944) - migrate module_finder functions to here and
         # reuse them.
         rel_dir = os.path.relpath(current_dir, root_dir)
-        # Check if actual config file here
-        if os.path.isfile(os.path.join(current_dir, constants.MODULE_CONFIG)):
+        # Check if the actual config file is here, but also make sure that a
+        # module exists in module-info for the parent dir.
+        if (os.path.isfile(os.path.join(current_dir, constants.MODULE_CONFIG))
+                and module_info.get_module_names(current_dir)):
             return rel_dir
         # Check module_info if auto_gen config or robo (non-config) here
         for mod in module_info.path_to_module_info.get(rel_dir, []):
             if module_info.is_robolectric_module(mod):
                 return rel_dir
             for test_config in mod.get(constants.MODULE_TEST_CONFIG, []):
-                if os.path.isfile(os.path.join(root_dir, test_config)):
+                # Even if the test config doesn't exist until it is
+                # auto-generated at build time (under <android_root>/out),
+                # atest still recognizes it as testable.
+                if test_config:
                     return rel_dir
             if mod.get('auto_test_config'):
                 auto_gen_dir = rel_dir
@@ -616,8 +675,7 @@
         if _is_apk_target(name, value):
             target_to_add = _get_apk_target(value)
         elif _PERF_SETUP_LABEL in value:
-            targets.add(_PERF_SETUP_LABEL)
-            continue
+            target_to_add = _PERF_SETUP_TARGET
 
         # Let's make sure we can actually build the target.
         if target_to_add and module_info.is_module(target_to_add):
@@ -633,6 +691,8 @@
         fqcn = class_attr.attrib['class'].strip()
         if fqcn.startswith(_COMPATIBILITY_PACKAGE_PREFIX):
             targets.add(_CTS_JAR)
+        if fqcn in DALVIK_TESTRUNNER_JAR_CLASSES:
+            targets.update(DALVIK_TEST_DEPS)
     logging.debug('Targets found in config file: %s', targets)
     return targets
 
@@ -997,3 +1057,164 @@
             if option_tag.attrib['key'] == test_name:
                 return True
     return False
+
+
+def is_parameterized_java_class(test_path):
+    """Find out if input test path is a parameterized java class.
+
+    Args:
+        test_path: A string of absolute path to the java file.
+
+    Returns:
+        Boolean: Is parameterized class or not.
+    """
+    with open(test_path) as class_file:
+        for line in class_file:
+            match = _PARAMET_JAVA_CLASS_RE.match(line)
+            if match:
+                return True
+    return False
+
+
+def get_java_methods(test_path):
+    """Find out the java test class of input test_path.
+
+    Args:
+        test_path: A string of absolute path to the java file.
+
+    Returns:
+        A set of methods.
+    """
+    with open(test_path) as class_file:
+        content = class_file.read()
+    matches = re.findall(_JAVA_METHODS_RE, content)
+    if matches:
+        methods = {match[1] for match in matches}
+        logging.debug('Available methods: %s', methods)
+        return methods
+    return set()
+
+
+def get_cc_test_classes_methods(test_path):
+    """Find out the cc test class of input test_path.
+
+    Args:
+        test_path: A string of absolute path to the cc file.
+
+    Returns:
+        A tuple of sets: classes, methods and para_classes.
+    """
+    classes = set()
+    methods = set()
+    para_classes = set()
+    with open(test_path) as class_file:
+        content = class_file.read()
+        # Search matched CC CLASS/METHOD
+        matches = re.findall(_CC_CLASS_METHOD_RE, content)
+        logging.debug('Found cc classes: %s', matches)
+        for match in matches:
+            # The elements of `matches` will be "Group 1"(_F),
+            # "Group class"(MyClass1) and "Group method"(MyMethod1)
+            classes.update([match[1]])
+            methods.update([match[2]])
+        # Search matched parameterized CC CLASS.
+        matches = re.findall(_PARA_CC_CLASS_RE, content)
+        logging.debug('Found parameterized classes: %s', matches)
+        for match in matches:
+            # The elements of `matches` will be "Group 1"(_F),
+            # "Group instantiate class"(MyInstantClass1)
+            # and "Group class"(MyClass1)
+            para_classes.update([match[2]])
+    return classes, methods, para_classes
+
+def find_host_unit_tests(module_info, path):
+    """Find host unit tests for the input path.
+
+    Args:
+        module_info: ModuleInfo obj.
+        path: A string of the relative path from $BUILD_TOP we want to search.
+
+    Returns:
+        A list that includes the module name of unit tests, otherwise an empty
+        list.
+    """
+    logging.debug('finding unit tests under %s', path)
+    found_unit_tests = []
+    unit_test_names = module_info.get_all_unit_tests()
+    logging.debug('All the unit tests: %s', unit_test_names)
+    for unit_test_name in unit_test_names:
+        for test_path in module_info.get_paths(unit_test_name):
+            if test_path.find(path) == 0:
+                found_unit_tests.append(unit_test_name)
+    return found_unit_tests
+
+def get_annotated_methods(annotation, file_path):
+    """Find all the methods annotated by the input annotation in the file_path.
+
+    Args:
+        annotation: A string of the annotation class.
+        file_path: A string of the file path.
+
+    Returns:
+        A set of all the methods annotated.
+    """
+    methods = set()
+    annotation_name = '@' + str(annotation).split('.')[-1]
+    with open(file_path) as class_file:
+        enter_annotation_block = False
+        for line in class_file:
+            if str(line).strip().startswith(annotation_name):
+                enter_annotation_block = True
+                continue
+            if enter_annotation_block:
+                matches = re.findall(_JAVA_METHODS_RE, line)
+                if matches:
+                    methods.update({match[1] for match in matches})
+                    enter_annotation_block = False
+                    continue
+    return methods
+
+def get_test_config_and_srcs(test_info, module_info):
+    """Get the test config path for the input test_info.
+
+    The search rule is:
+    If the test name in test_info can be found in module_info, use
+      1. AndroidTest.xml under the module path if no test config is set.
+      2. The first test config defined in Android.bp if a test config is set.
+    If the test name has no matched module in module_info, search every
+    module's test config names for a match.
+
+    Args:
+        test_info: TestInfo obj.
+        module_info: ModuleInfo obj.
+
+    Returns:
+        A tuple of the config path string and the list of srcs; (None, None)
+        if the test config does not exist.
+    """
+    android_root_dir = os.environ.get(constants.ANDROID_BUILD_TOP)
+    test_name = test_info.test_name
+    mod_info = module_info.get_module_info(test_name)
+    if mod_info:
+        test_configs = mod_info.get(constants.MODULE_TEST_CONFIG, [])
+        if len(test_configs) == 0:
+            # Check for AndroidTest.xml at the module path.
+            for path in mod_info.get(constants.MODULE_PATH, []):
+                config_path = os.path.join(
+                    android_root_dir, path, constants.MODULE_CONFIG)
+                if os.path.isfile(config_path):
+                    return config_path, mod_info.get(constants.MODULE_SRCS, [])
+        if len(test_configs) >= 1:
+            test_config = test_configs[0]
+            config_path = os.path.join(android_root_dir, test_config)
+            if os.path.isfile(config_path):
+                return config_path, mod_info.get(constants.MODULE_SRCS, [])
+    else:
+        for _, info in module_info.name_to_module_info.items():
+            test_configs = info.get(constants.MODULE_TEST_CONFIG, [])
+            for test_config in test_configs:
+                config_path = os.path.join(android_root_dir, test_config)
+                config_name = os.path.splitext(os.path.basename(config_path))[0]
+                if config_name == test_name and os.path.isfile(config_path):
+                    return config_path, info.get(constants.MODULE_SRCS, [])
+    return None, None
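
The `_CC_CLASS_METHOD_RE` and `_PARA_CC_CLASS_RE` patterns introduced above are what `get_cc_test_classes_methods()` builds its class/method/parameterized-class sets from. A small standalone sketch of that extraction, with the regexes copied from the patch and an invented gtest snippet as input:

```python
import re

# Copied verbatim from the patch above.
_CC_CLASS_METHOD_RE = re.compile(
    r'^\s*TEST(_F|_P)?\s*\(\s*(?P<class>\w+)\s*,\s*(?P<method>\w+)\s*\)', re.M)
_PARA_CC_CLASS_RE = re.compile(
    r'^\s*INSTANTIATE[_TYPED]*_TEST_(SUITE|CASE)_P\s*\(\s*(?P<instantiate>\w+)\s*,'
    r'\s*(?P<class>\w+)\s*\,', re.M)

# Invented gtest snippet used only for illustration.
CC_SOURCE = '''
TEST_F(MyClass1, MyMethod1) {}
TEST(FooTest, DoesBar) {}
INSTANTIATE_TEST_SUITE_P(MyInstantiation, MyParamClass, testing::Values(1));
'''

# Mirrors the tuple indices the new helper uses: match[1] is the class group,
# match[2] is the method (or, for the parameterized pattern, the class).
classes = {m[1] for m in _CC_CLASS_METHOD_RE.findall(CC_SOURCE)}
methods = {m[2] for m in _CC_CLASS_METHOD_RE.findall(CC_SOURCE)}
para_classes = {m[2] for m in _PARA_CC_CLASS_RE.findall(CC_SOURCE)}

print(sorted(classes))       # ['FooTest', 'MyClass1']
print(sorted(methods))       # ['DoesBar', 'MyMethod1']
print(sorted(para_classes))  # ['MyParamClass']
```
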
diff --git a/atest/test_finders/test_finder_utils_unittest.py b/atest/test_finders/test_finder_utils_unittest.py
index fdb6fd9..46ea09f 100755
--- a/atest/test_finders/test_finder_utils_unittest.py
+++ b/atest/test_finders/test_finder_utils_unittest.py
@@ -19,6 +19,7 @@
 # pylint: disable=line-too-long
 
 import os
+import tempfile
 import unittest
 
 from unittest import mock
@@ -30,7 +31,9 @@
 import unittest_utils
 
 from test_finders import test_finder_utils
+from test_finders import test_info
 
+JSON_FILE_PATH = os.path.join(uc.TEST_DATA_DIR, uc.JSON_FILE)
 CLASS_DIR = 'foo/bar/jank/src/android/jank/cts/ui'
 OTHER_DIR = 'other/dir/'
 OTHER_CLASS_NAME = 'test.java'
@@ -41,7 +44,7 @@
 FIND_TWO = uc.ROOT + 'other/dir/test.java\n' + uc.FIND_ONE
 FIND_THREE = '/a/b/c.java\n/d/e/f.java\n/g/h/i.java'
 FIND_THREE_LIST = ['/a/b/c.java', '/d/e/f.java', '/g/h/i.java']
-VTS_XML = 'VtsAndroidTest.xml'
+VTS_XML = 'VtsAndroidTest.xml.data'
 VTS_BITNESS_XML = 'VtsBitnessAndroidTest.xml'
 VTS_PUSH_DIR = 'vts_push_files'
 VTS_PLAN_DIR = 'vts_plan_files'
@@ -63,12 +66,12 @@
                    'CtsDeviceInfo.apk',
                    'DATA/app/DeviceHealthTests/DeviceHealthTests.apk',
                    'DATA/app/sl4a/sl4a.apk'}
-VTS_PLAN_TARGETS = {os.path.join(uc.TEST_DATA_DIR, VTS_PLAN_DIR, 'vts-staging-default.xml'),
-                    os.path.join(uc.TEST_DATA_DIR, VTS_PLAN_DIR, 'vts-aa.xml'),
-                    os.path.join(uc.TEST_DATA_DIR, VTS_PLAN_DIR, 'vts-bb.xml'),
-                    os.path.join(uc.TEST_DATA_DIR, VTS_PLAN_DIR, 'vts-cc.xml'),
-                    os.path.join(uc.TEST_DATA_DIR, VTS_PLAN_DIR, 'vts-dd.xml')}
-XML_TARGETS = {'CtsJankDeviceTestCases', 'perf-setup.sh', 'cts-tradefed',
+VTS_PLAN_TARGETS = {os.path.join(uc.TEST_DATA_DIR, VTS_PLAN_DIR, 'vts-staging-default.xml.data'),
+                    os.path.join(uc.TEST_DATA_DIR, VTS_PLAN_DIR, 'vts-aa.xml.data'),
+                    os.path.join(uc.TEST_DATA_DIR, VTS_PLAN_DIR, 'vts-bb.xml.data'),
+                    os.path.join(uc.TEST_DATA_DIR, VTS_PLAN_DIR, 'vts-cc.xml.data'),
+                    os.path.join(uc.TEST_DATA_DIR, VTS_PLAN_DIR, 'vts-dd.xml.data')}
+XML_TARGETS = {'CtsJankDeviceTestCases', 'perf-setup', 'cts-tradefed',
                'GtsEmptyTestApp'}
 PATH_TO_MODULE_INFO_WITH_AUTOGEN = {
     'foo/bar/jank' : [{'auto_test_config' : True}]}
@@ -79,6 +82,17 @@
     'foo/bar' : [{'auto_test_config' : True},
                  {'auto_test_config' : True}],
     'foo/bar/jank': [{constants.MODULE_CLASS : [constants.MODULE_CLASS_ROBOLECTRIC]}]}
+UNIT_TEST_SEARCH_ROOT = 'my/unit/test/root'
+IT_TEST_MATCHED_1_PATH = os.path.join(UNIT_TEST_SEARCH_ROOT, 'sub1')
+UNIT_TEST_MATCHED_2_PATH = os.path.join(UNIT_TEST_SEARCH_ROOT, 'sub1', 'sub2')
+UNIT_TEST_NOT_MATCHED_1_PATH = os.path.join(
+    os.path.dirname(UNIT_TEST_SEARCH_ROOT), 'sub1')
+UNIT_TEST_MODULE_1 = 'unit_test_module_1'
+UNIT_TEST_MODULE_2 = 'unit_test_module_2'
+UNIT_TEST_MODULE_3 = 'unit_test_module_3'
+DALVIK_TEST_CONFIG = 'AndroidDalvikTest.xml.data'
+LIBCORE_TEST_CONFIG = 'AndroidLibCoreTest.xml.data'
+DALVIK_XML_TARGETS = XML_TARGETS | test_finder_utils.DALVIK_TEST_DEPS
 
 #pylint: disable=protected-access
 class TestFinderUtilsUnittests(unittest.TestCase):
@@ -150,13 +164,26 @@
             test_path, frozenset(['testMethod1'])))
         test_path = os.path.join(uc.TEST_DATA_DIR, 'class_file_path_testing',
                                  'hello_world_test.java')
-        self.assertTrue(test_finder_utils.has_method_in_file(
+        self.assertFalse(test_finder_utils.has_method_in_file(
             test_path, frozenset(['testMethod', 'testMethod2'])))
         test_path = os.path.join(uc.TEST_DATA_DIR, 'class_file_path_testing',
                                  'hello_world_test.java')
         self.assertFalse(test_finder_utils.has_method_in_file(
             test_path, frozenset(['testMethod'])))
 
+    def test_has_method_in_kt_file(self):
+        """Test has_method_in_file method with kt class path."""
+        test_path = os.path.join(uc.TEST_DATA_DIR, 'class_file_path_testing',
+                                 'hello_world_test.kt')
+        self.assertTrue(test_finder_utils.has_method_in_file(
+            test_path, frozenset(['testMethod1'])))
+        self.assertFalse(test_finder_utils.has_method_in_file(
+            test_path, frozenset(['testMethod'])))
+        self.assertTrue(test_finder_utils.has_method_in_file(
+            test_path, frozenset(['testMethod1', 'testMethod2'])))
+        self.assertFalse(test_finder_utils.has_method_in_file(
+            test_path, frozenset(['testMethod', 'testMethod2'])))
+
     @mock.patch('builtins.input', return_value='1')
     def test_extract_test_from_tests(self, mock_input):
         """Test method extract_test_from_tests method."""
@@ -337,16 +364,45 @@
         mock_module_info = mock.Mock(spec=module_info.ModuleInfo)
         mock_module_info.is_module.side_effect = lambda module: (
             not module == 'is_not_module')
-        xml_file = os.path.join(uc.TEST_DATA_DIR, constants.MODULE_CONFIG)
+        xml_file = os.path.join(uc.TEST_DATA_DIR,
+                                constants.MODULE_CONFIG + '.data')
         unittest_utils.assert_strict_equal(
             self,
             test_finder_utils.get_targets_from_xml(xml_file, mock_module_info),
             XML_TARGETS)
 
+    def test_get_targets_from_dalvik_xml(self):
+        """Test get_targets_from_xml method with dalvik class."""
+        # Mocking Etree is near impossible, so use a real file, but mocking
+        # ModuleInfo is still fine. Just have it return False for the module
+        # named 'is_not_module'.
+        mock_module_info = mock.Mock(spec=module_info.ModuleInfo)
+        mock_module_info.is_module.side_effect = lambda module: (
+            not module == 'is_not_module')
+        xml_file = os.path.join(uc.TEST_DATA_DIR, DALVIK_TEST_CONFIG)
+        unittest_utils.assert_strict_equal(
+            self,
+            test_finder_utils.get_targets_from_xml(xml_file, mock_module_info),
+            DALVIK_XML_TARGETS)
+
+    def test_get_targets_from_libcore_xml(self):
+        """Test get_targets_from_xml method with libcore class."""
+        # Mocking Etree is near impossible, so use a real file, but mocking
+        # ModuleInfo is still fine. Just have it return False for the module
+        # named 'is_not_module'.
+        mock_module_info = mock.Mock(spec=module_info.ModuleInfo)
+        mock_module_info.is_module.side_effect = lambda module: (
+            not module == 'is_not_module')
+        xml_file = os.path.join(uc.TEST_DATA_DIR, LIBCORE_TEST_CONFIG)
+        unittest_utils.assert_strict_equal(
+            self,
+            test_finder_utils.get_targets_from_xml(xml_file, mock_module_info),
+            DALVIK_XML_TARGETS)
+
     @mock.patch.object(test_finder_utils, '_VTS_PUSH_DIR',
                        os.path.join(uc.TEST_DATA_DIR, VTS_PUSH_DIR))
     def test_get_targets_from_vts_xml(self):
-        """Test get_targets_from_xml method."""
+        """Test get_targets_from_vts_xml method."""
         # Mocking Etree is near impossible, so use a real file, but mock out
         # ModuleInfo,
         mock_module_info = mock.Mock(spec=module_info.ModuleInfo)
@@ -567,15 +623,20 @@
         self.assertEqual(test_finder_utils.get_install_locations(no_installed_paths),
                          no_expect)
 
+    # Disable the failing test due to breakage after the test xml files were
+    # renamed to xml.data.
+    # pylint: disable=pointless-string-statement
+    '''
     def test_get_plans_from_vts_xml(self):
         """Test get_plans_from_vts_xml method."""
-        xml_path = os.path.join(uc.TEST_DATA_DIR, VTS_PLAN_DIR, 'vts-staging-default.xml')
+        xml_path = os.path.join(uc.TEST_DATA_DIR, VTS_PLAN_DIR,
+                                'vts-staging-default.xml.data')
         self.assertEqual(
             test_finder_utils.get_plans_from_vts_xml(xml_path),
             VTS_PLAN_TARGETS)
         xml_path = os.path.join(uc.TEST_DATA_DIR, VTS_PLAN_DIR, 'NotExist.xml')
         self.assertRaises(atest_error.XmlNotExistError,
                           test_finder_utils.get_plans_from_vts_xml, xml_path)
+    '''
 
     def test_get_levenshtein_distance(self):
         """Test get_levenshetine distance module correctly returns distance."""
@@ -585,6 +646,178 @@
         self.assertEqual(test_finder_utils.get_levenshtein_distance(uc.MOD3, uc.FUZZY_MOD3,
                                                                     dir_costs=(1, 2, 1)), 8)
 
+    def test_is_parameterized_java_class(self):
+        """Test is_parameterized_java_class method."""
+        matched_contents = (['@RunWith(Parameterized.class)'],
+                            [' @RunWith( Parameterized.class ) '],
+                            ['@RunWith(TestParameterInjector.class)'],
+                            ['@RunWith(JUnitParamsRunner.class)'],
+                            ['@RunWith(DataProviderRunner.class)'],
+                            ['@RunWith(JukitoRunner.class)'],
+                            ['@RunWith(Theories.class)'],
+                            ['@RunWith(BedsteadJUnit4.class)'])
+        not_matched_contents = (['// @RunWith(Parameterized.class)'],
+                                ['*RunWith(Parameterized.class)'])
+        # Test matched patterns
+        for matched_content in matched_contents:
+            try:
+                tmp_file = tempfile.NamedTemporaryFile(mode='wt')
+                tmp_file.writelines(matched_content)
+                tmp_file.flush()
+                self.assertTrue(
+                    test_finder_utils.is_parameterized_java_class(
+                        tmp_file.name))
+            finally:
+                tmp_file.close()
+        # Test not matched patterns
+        for not_matched_content in not_matched_contents:
+            try:
+                tmp_file = tempfile.NamedTemporaryFile(mode='wt')
+                tmp_file.writelines(not_matched_content)
+                tmp_file.flush()
+                self.assertFalse(
+                    test_finder_utils.is_parameterized_java_class(
+                        tmp_file.name))
+            finally:
+                tmp_file.close()
+
+    def test_get_cc_test_classes_methods(self):
+        """Test get_cc_test_classes_methods method."""
+        expect_classes = ('MyClass1', 'MyClass2', 'MyClass3', 'MyClass4',
+                          'MyClass5')
+        expect_methods = ('Method1', 'Method2', 'Method3', 'Method5')
+        expect_para_classes = ('MyInstantClass1', 'MyInstantClass2',
+                               'MyInstantClass3', 'MyInstantTypeClass1',
+                               'MyInstantTypeClass2')
+        expected_result = [sorted(expect_classes), sorted(expect_methods),
+                           sorted(expect_para_classes)]
+        file_path = os.path.join(uc.TEST_DATA_DIR, 'my_cc_test.cc')
+        classes, methods, para_classes = (
+            test_finder_utils.get_cc_test_classes_methods(file_path))
+        self.assertEqual(expected_result,
+                         [sorted(classes),
+                          sorted(methods),
+                          sorted(para_classes)])
+
+    def test_get_java_method(self):
+        """Test get_java_method"""
+        expect_methods = {'testMethod1', 'testMethod2'}
+        target_java = os.path.join(uc.TEST_DATA_DIR,
+                                   'class_file_path_testing',
+                                   'hello_world_test.java')
+        self.assertEqual(expect_methods,
+                         test_finder_utils.get_java_methods(target_java))
+        target_kt = os.path.join(uc.TEST_DATA_DIR,
+                                 'class_file_path_testing',
+                                 'hello_world_test.kt')
+        self.assertEqual(expect_methods,
+                         test_finder_utils.get_java_methods(target_kt))
+
+    def test_get_parent_cls_name(self):
+        """Test get_parent_cls_name"""
+        parent_cls = 'AtestClass'
+        target_java = os.path.join(uc.TEST_DATA_DIR,
+                                   'path_testing',
+                                   'PathTesting.java')
+        self.assertEqual(parent_cls,
+                         test_finder_utils.get_parent_cls_name(target_java))
+
+    def test_get_package_name(self):
+        """Test get_package_name"""
+        package_name = 'com.test.hello_world_test'
+        target_java = os.path.join(uc.TEST_DATA_DIR,
+                                   'class_file_path_testing',
+                                   'hello_world_test.java')
+        self.assertEqual(package_name,
+                         test_finder_utils.get_package_name(target_java))
+        target_kt = os.path.join(uc.TEST_DATA_DIR,
+                                 'class_file_path_testing',
+                                 'hello_world_test.kt')
+        self.assertEqual(package_name,
+                         test_finder_utils.get_package_name(target_kt))
+
+    def get_paths_side_effect(self, module_name):
+        """Mock return values for module_info.get_paths."""
+        if module_name == UNIT_TEST_MODULE_1:
+            return [IT_TEST_MATCHED_1_PATH]
+        if module_name == UNIT_TEST_MODULE_2:
+            return [UNIT_TEST_MATCHED_2_PATH]
+        if module_name == UNIT_TEST_MODULE_3:
+            return [UNIT_TEST_NOT_MATCHED_1_PATH]
+        return []
+
+    @mock.patch.dict('os.environ', {constants.ANDROID_BUILD_TOP:'/'})
+    @mock.patch.object(module_info.ModuleInfo, 'get_all_unit_tests',
+                       return_value=[UNIT_TEST_MODULE_1,
+                                     UNIT_TEST_MODULE_2,
+                                     UNIT_TEST_MODULE_3])
+    @mock.patch.object(module_info.ModuleInfo, 'get_paths',)
+    def test_find_host_unit_tests(self, _get_paths, _mock_get_unit_tests):
+        """Test find_host_unit_tests"""
+        mod_info = module_info.ModuleInfo(module_file=JSON_FILE_PATH)
+        _get_paths.side_effect = self.get_paths_side_effect
+        expect_unit_tests = [UNIT_TEST_MODULE_1, UNIT_TEST_MODULE_2]
+        self.assertEqual(
+            sorted(expect_unit_tests),
+            sorted(test_finder_utils.find_host_unit_tests(
+                mod_info, UNIT_TEST_SEARCH_ROOT)))
+
+    def test_get_annotated_methods(self):
+        """Test get_annotated_methods"""
+        sample_path = os.path.join(
+            uc.TEST_DATA_DIR, 'annotation', 'sample.txt')
+        real_methods = list(test_finder_utils.get_annotated_methods(
+            'TestAnnotation1', sample_path))
+        real_methods.sort()
+        expect_methods = ['annotation1_method1', 'annotation1_method2']
+        expect_methods.sort()
+        self.assertEqual(expect_methods, real_methods)
+
+    @mock.patch.dict('os.environ', {constants.ANDROID_BUILD_TOP:'/'})
+    @mock.patch('os.path.isfile', side_effect=unittest_utils.isfile_side_effect)
+    def test_get_test_config_use_androidtestxml(self, _isfile):
+        """Test get_test_config_and_srcs using default AndroidTest.xml"""
+        android_root = '/'
+        mod_info = module_info.ModuleInfo(module_file=JSON_FILE_PATH)
+        t_info = test_info.TestInfo(
+            'androidtest_config_module', 'mock_runner', build_targets=set())
+        expect_config = os.path.join(android_root, uc.ANDTEST_CONFIG_PATH,
+                                     constants.MODULE_CONFIG)
+        result, _ = test_finder_utils.get_test_config_and_srcs(t_info, mod_info)
+        self.assertEqual(expect_config, result)
+
+    @mock.patch.dict('os.environ', {constants.ANDROID_BUILD_TOP:'/'})
+    @mock.patch('os.path.isfile', side_effect=unittest_utils.isfile_side_effect)
+    def test_get_test_config_single_config(self, _isfile):
+        """Test get_test_config_and_srcs manualy set it's config"""
+        android_root = '/'
+        mod_info = module_info.ModuleInfo(module_file=JSON_FILE_PATH)
+        t_info = test_info.TestInfo(
+            'single_config_module', 'mock_runner', build_targets=set())
+        expect_config = os.path.join(
+            android_root, uc.SINGLE_CONFIG_PATH, uc.SINGLE_CONFIG_NAME)
+        result, _ = test_finder_utils.get_test_config_and_srcs(t_info, mod_info)
+        self.assertEqual(expect_config, result)
+
+    @mock.patch.dict('os.environ', {constants.ANDROID_BUILD_TOP:'/'})
+    @mock.patch('os.path.isfile', side_effect=unittest_utils.isfile_side_effect)
+    def test_get_test_config_main_multiple_config(self, _isfile):
+        """Test get_test_config_and_srcs which is the main module of multiple config"""
+        android_root = '/'
+        mod_info = module_info.ModuleInfo(module_file=JSON_FILE_PATH)
+        t_info = test_info.TestInfo(
+            'multiple_config_module', 'mock_runner', build_targets=set())
+        expect_config = os.path.join(
+            android_root, uc.MULTIPLE_CONFIG_PATH, uc.MAIN_CONFIG_NAME)
+        result, _ = test_finder_utils.get_test_config_and_srcs(t_info, mod_info)
+        self.assertEqual(expect_config, result)
+
+    @mock.patch.dict('os.environ', {constants.ANDROID_BUILD_TOP:'/'})
+    @mock.patch('os.path.isfile', side_effect=unittest_utils.isfile_side_effect)
+    def test_get_test_config_subtest_in_multiple_config(self, _isfile):
+        """Test get_test_config_and_srcs not the main module of multiple config"""
+        android_root = '/'
+        mod_info = module_info.ModuleInfo(module_file=JSON_FILE_PATH)
+        t_info = test_info.TestInfo(
+            'Multiple2', 'mock_runner', build_targets=set())
+        expect_config = os.path.join(
+            android_root, uc.MULTIPLE_CONFIG_PATH, uc.SUB_CONFIG_NAME_2)
+        result, _ = test_finder_utils.get_test_config_and_srcs(t_info, mod_info)
+        self.assertEqual(expect_config, result)
 
 if __name__ == '__main__':
     unittest.main()
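
A note on the parameterized-class cases exercised above: a minimal sketch of
the kind of annotation matching they assume (the helper name and regex below
are illustrative, not the actual test_finder_utils implementation) is to skip
'//' and '*' comment lines and match @RunWith(<runner>.class):

    import re

    # Illustrative sketch only; the real atest check may differ.
    _RUNNERS = ('Parameterized', 'TestParameterInjector', 'JUnitParamsRunner',
                'DataProviderRunner', 'JukitoRunner', 'Theories',
                'BedsteadJUnit4')
    _RUN_WITH_RE = re.compile(
        r'^\s*@RunWith\(\s*(%s)\.class\s*\)' % '|'.join(_RUNNERS))

    def looks_parameterized(java_file):
        """Return True if the file uses a parameterized JUnit runner."""
        with open(java_file) as src:
            for line in src:
                stripped = line.lstrip()
                # Commented-out annotations must not count as a match.
                if stripped.startswith('//') or stripped.startswith('*'):
                    continue
                if _RUN_WITH_RE.match(line):
                    return True
        return False
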
diff --git a/atest/test_finders/test_info.py b/atest/test_finders/test_info.py
index 42c0e86..fe3b550 100644
--- a/atest/test_finders/test_info.py
+++ b/atest/test_finders/test_info.py
@@ -30,7 +30,8 @@
     # pylint: disable=too-many-arguments
     def __init__(self, test_name, test_runner, build_targets, data=None,
                  suite=None, module_class=None, install_locations=None,
-                 test_finder='', compatibility_suites=None):
+                 test_finder='', compatibility_suites=None,
+                 mainline_modules=None):
         """Init for TestInfo.
 
         Args:
@@ -47,6 +48,9 @@
             compatibility_suites: A list of compatibility_suites. It's a
                         snippet of compatibility_suites in module_info. e.g.
                         ["device-tests",  "vts10"]
+            mainline_modules: A list of mainline modules.
+                    e.g. ['some1.apk', 'some2.apex', 'some3.apks',
+                          'some1.apk+some2.apex']
         """
         self.test_name = test_name
         self.test_runner = test_runner
@@ -62,19 +66,21 @@
         # attribute is only set through TEST_MAPPING file.
         self.host = False
         self.test_finder = test_finder
-        self.compatibility_suites = (map(str, compatibility_suites)
+        self.compatibility_suites = (compatibility_suites
                                      if compatibility_suites else [])
+        self.mainline_modules = mainline_modules if mainline_modules else []
 
     def __str__(self):
         host_info = (' - runs on host without device required.' if self.host
                      else '')
         return ('test_name: %s - test_runner:%s - build_targets:%s - data:%s - '
                 'suite:%s - module_class: %s - install_locations:%s%s - '
-                'test_finder: %s - compatibility_suites:%s' % (
+                'test_finder: %s - compatibility_suites:%s -'
+                'mainline_modules:%s' % (
                     self.test_name, self.test_runner, self.build_targets,
                     self.data, self.suite, self.module_class,
                     self.install_locations, host_info, self.test_finder,
-                    self.compatibility_suites))
+                    self.compatibility_suites, self.mainline_modules))
 
     def get_supported_exec_mode(self):
         """Get the supported execution mode of the test.
@@ -108,6 +114,22 @@
             return constants.DEVICE_TEST
         return constants.BOTH_TEST
 
+    def get_test_paths(self):
+        """Get the relative path of test_info.
+
+        Search build target's MODULE-IN as the test path.
+
+        Return:
+            A list of string of the relative path for test, None if test
+            path information not found.
+        """
+        test_paths = []
+        for build_target in self.build_targets:
+            if str(build_target).startswith(constants.MODULES_IN):
+                test_paths.append(
+                    str(build_target).replace(
+                        constants.MODULES_IN, '').replace('-', '/'))
+        return test_paths if test_paths else None
 
 class TestFilter(TestFilterBase):
     """Information needed to filter a test in Tradefed"""
diff --git a/atest/test_finders/test_info_unittest.py b/atest/test_finders/test_info_unittest.py
new file mode 100755
index 0000000..25a56c5
--- /dev/null
+++ b/atest/test_finders/test_info_unittest.py
@@ -0,0 +1,39 @@
+#!/usr/bin/env python3
+#
+# Copyright 2020, The Android Open Source Project
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Unittests for cache_finder."""
+
+
+import unittest
+
+from test_finders import test_info
+
+
+#pylint: disable=protected-access
+class TestInfoUnittests(unittest.TestCase):
+    """Unit tests for cache_finder.py"""
+
+    def test_get_test_path(self):
+        """Test test_get_test_paths method."""
+        build_targets = set()
+        exp_rel_paths = ['a/b/c', 'd/e/f']
+        for exp_rel_path in exp_rel_paths:
+            build_targets.add('MODULES-IN-%s' % exp_rel_path.replace('/', '-'))
+        t_info = test_info.TestInfo('mock_name', 'mock_runner', build_targets)
+        self.assertEqual(sorted(t_info.get_test_paths()), sorted(exp_rel_paths))
+
+if __name__ == '__main__':
+    unittest.main()
diff --git a/atest/test_finders/tf_integration_finder.py b/atest/test_finders/tf_integration_finder.py
index a23c8aa..c6dbd6e 100644
--- a/atest/test_finders/tf_integration_finder.py
+++ b/atest/test_finders/tf_integration_finder.py
@@ -22,8 +22,12 @@
 import logging
 import os
 import re
+import shutil
+import tempfile
 import xml.etree.ElementTree as ElementTree
 
+from zipfile import ZipFile
+
 import atest_error
 import constants
 
@@ -33,12 +37,13 @@
 from test_runners import atest_tf_test_runner
 
 # Find integration name based on file path of integration config xml file.
-# Group matches "foo/bar" given "blah/res/config/blah/res/config/foo/bar.xml
-_INT_NAME_RE = re.compile(r'^.*\/res\/config\/(?P<int_name>.*).xml$')
+# Group matches "foo/bar" given "blah/res/config/foo/bar.xml" from the source
+# code res directory, or "blah/config/foo/bar.xml" from prebuilt jars.
+_INT_NAME_RE = re.compile(r'^.*\/config\/(?P<int_name>.*).xml$')
 _TF_TARGETS = frozenset(['tradefed', 'tradefed-contrib'])
 _GTF_TARGETS = frozenset(['google-tradefed', 'google-tradefed-contrib'])
 _CONTRIB_TARGETS = frozenset(['google-tradefed-contrib'])
-_TF_RES_DIR = '../res/config'
+_TF_RES_DIRS = frozenset(['../res/config', 'res/config'])
 
 
 class TFIntegrationFinder(test_finder_base.TestFinderBase):
@@ -48,12 +53,13 @@
 
 
     def __init__(self, module_info=None):
-        super(TFIntegrationFinder, self).__init__()
+        super().__init__()
         self.root_dir = os.environ.get(constants.ANDROID_BUILD_TOP)
         self.module_info = module_info
         # TODO: Break this up into AOSP/google_tf integration finders.
         self.tf_dirs, self.gtf_dirs = self._get_integration_dirs()
         self.integration_dirs = self.tf_dirs + self.gtf_dirs
+        self.temp_dir = tempfile.TemporaryDirectory()
 
     def _get_mod_paths(self, module_name):
         """Return the paths of the given module name."""
@@ -62,7 +68,8 @@
             # changed to ../res/config.
             if module_name in _CONTRIB_TARGETS:
                 mod_paths = self.module_info.get_paths(module_name)
-                return [os.path.join(path, _TF_RES_DIR) for path in mod_paths]
+                return [os.path.join(path, res_path) for path in mod_paths
+                        for res_path in _TF_RES_DIRS]
             return self.module_info.get_paths(module_name)
         return []
 
@@ -125,6 +132,8 @@
                     logging.warning('skipping <include> tag with no "name" value')
                     continue
                 full_paths = self._search_integration_dirs(integration_name)
+                if not full_paths:
+                    full_paths = self._search_prebuilt_jars(integration_name)
                 node = None
                 if full_paths:
                     node = self._load_xml_file(full_paths[0])
@@ -168,8 +177,9 @@
         if ':' in name:
             name, class_name = name.split(':')
         test_files = self._search_integration_dirs(name)
-        if test_files is None:
-            return None
+        if not test_files:
+            # Check prebuilt jars if input name is in jars.
+            test_files = self._search_prebuilt_jars(name)
         # Don't use names that simply match the path,
         # must be the actual name used by TF to run the test.
         t_infos = []
@@ -179,6 +189,82 @@
                 t_infos.append(t_info)
         return t_infos
 
+    def _get_prebuilt_jars(self):
+        """Get prebuilt jars based on targets.
+
+        Returns:
+            A tuple of lists of strings of prebuilt jars.
+        """
+        prebuilt_jars = []
+        for tf_dir in self.tf_dirs:
+            for tf_target in _TF_TARGETS:
+                jar_path = os.path.join(
+                    self.root_dir, tf_dir, '..', 'filegroups', 'tradefed',
+                    tf_target + '.jar')
+                if os.path.exists(jar_path):
+                    prebuilt_jars.append(jar_path)
+        for gtf_dir in self.gtf_dirs:
+            for gtf_target in _GTF_TARGETS:
+                jar_path = os.path.join(
+                    self.root_dir, gtf_dir, '..', 'filegroups',
+                    'google-tradefed', gtf_target + '.jar')
+                if os.path.exists(jar_path):
+                    prebuilt_jars.append(jar_path)
+        return prebuilt_jars
+
+    def _search_prebuilt_jars(self, name):
+        """Search tradefed prebuilt jar which has matched name.
+
+        Search if input name matched prebuilt tradefed jar. If matched, extract
+        the jar file to temp directly for later on test info handling.
+
+        Args:
+            name: A string of integration name as seen in tf's list configs.
+
+        Returns:
+            A list of test path.
+        """
+
+        xml_path = 'config/{}.xml'.format(name)
+        test_files = []
+        prebuilt_jars = self._get_prebuilt_jars()
+        logging.debug('Found prebuilt_jars=%s', prebuilt_jars)
+        for prebuilt_jar in prebuilt_jars:
+            with ZipFile(prebuilt_jar, 'r') as jar_file:
+                jar_contents = jar_file.namelist()
+                if xml_path in jar_contents:
+                    extract_path = os.path.join(
+                        self.temp_dir.name, os.path.basename(prebuilt_jar))
+                    if not os.path.exists(extract_path):
+                        logging.debug('Extracting %s to %s',
+                                      prebuilt_jar, extract_path)
+                        jar_file.extractall(extract_path)
+                    test_files.append(os.path.join(extract_path, xml_path))
+
+        # TODO(b/194362862): Remove the logic below once prebuilt jars can be
+        # loaded by atest_tradefed.sh directly from the prebuilt folder.
+        # If the config is found in prebuilt jars, manually copy the related
+        # tradefed jars to out/host so they are on tradefed's java path.
+        if test_files:
+            host_framework_dir = os.path.join(
+                os.getenv(constants.ANDROID_HOST_OUT, ''), 'framework')
+            if not os.path.isdir(host_framework_dir):
+                os.makedirs(host_framework_dir)
+            prebuilt_dirs = []
+            for prebuilt_jar in prebuilt_jars:
+                prebuilt_dir = os.path.dirname(prebuilt_jar)
+                if prebuilt_dir not in prebuilt_dirs:
+                    prebuilt_dirs.append(prebuilt_dir)
+            for prebuilt_dir in prebuilt_dirs:
+                prebuilts = os.listdir(prebuilt_dir)
+                for prebuilt in prebuilts:
+                    if os.path.splitext(prebuilt)[1] == '.jar':
+                        prebuilt_jar = os.path.join(prebuilt_dir, prebuilt)
+                        logging.debug('Copy %s to %s',
+                                      prebuilt_jar, host_framework_dir)
+                        shutil.copy2(prebuilt_jar, host_framework_dir)
+        return test_files
+
     def _get_test_info(self, name, test_file, class_name):
         """Find the test info matching the given test_file and class_name.
 
@@ -253,6 +339,7 @@
         # create absolute path from cwd and remove symbolic links
         path = os.path.realpath(path)
         if not os.path.exists(path):
+            logging.debug('"%s": file not found!', path)
             return None
         int_dir = test_finder_utils.get_int_dir_from_path(path,
                                                           self.integration_dirs)
diff --git a/atest/test_finders/tf_integration_finder_unittest.py b/atest/test_finders/tf_integration_finder_unittest.py
index 0b1b8ea..3ac4577 100755
--- a/atest/test_finders/tf_integration_finder_unittest.py
+++ b/atest/test_finders/tf_integration_finder_unittest.py
@@ -126,8 +126,9 @@
     def test_load_xml_file(self, search):
         """Test _load_xml_file and _load_include_tags methods."""
         search.return_value = [os.path.join(uc.TEST_DATA_DIR,
-                                            'CtsUiDeviceTestCases.xml')]
-        xml_file = os.path.join(uc.TEST_DATA_DIR, constants.MODULE_CONFIG)
+                                            'CtsUiDeviceTestCases.xml.data')]
+        xml_file = os.path.join(uc.TEST_DATA_DIR,
+                                constants.MODULE_CONFIG + '.data')
         xml_root = self.tf_finder._load_xml_file(xml_file)
         include_tags = xml_root.findall('.//include')
         self.assertEqual(0, len(include_tags))
@@ -138,6 +139,32 @@
                 included = True
         self.assertTrue(included)
 
+    @mock.patch.object(tf_integration_finder.TFIntegrationFinder,
+                       '_get_prebuilt_jars')
+    def test_search_prebuilt_jars(self, prebuilt_jars):
+        """Test _search_prebuilt_jars method."""
+        test_plan = 'performance/inodeop-benchmark'
+        prebuilt_jars.return_value = [
+            os.path.join(
+                uc.TEST_DATA_DIR,
+                'tradefed_prebuilt/prebuilts/filegroups/tradefed/tradefed-contrib.jar')]
+        expect_path = [
+            os.path.join(self.tf_finder.temp_dir.name, 'tradefed-contrib.jar',
+                         'config', test_plan + '.xml')]
+        self.assertEqual(self.tf_finder._search_prebuilt_jars(test_plan),
+                         expect_path)
+
+    def test_get_prebuilt_jars(self):
+        """Test _get_prebuilt_jars method."""
+        tf_int_finder = tf_integration_finder.TFIntegrationFinder()
+        tf_int_finder.tf_dirs = ['tradefed_prebuilt/prebuilts/test_harness']
+        tf_int_finder.root_dir = uc.TEST_DATA_DIR
+        expect_prebuilt_jars = [
+            os.path.join(uc.TEST_DATA_DIR,
+                         'tradefed_prebuilt/prebuilts/test_harness/..',
+                         'filegroups/tradefed/tradefed-contrib.jar')]
+        self.assertEqual(tf_int_finder._get_prebuilt_jars(),
+                         expect_prebuilt_jars)
 
 if __name__ == '__main__':
     unittest.main()
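
For reference, a quick illustration of what the relaxed _INT_NAME_RE extracts
(the sample paths are invented): both the source res/config layout and the
prebuilt jar config layout yield the same integration name.

    import re

    _INT_NAME_RE = re.compile(r'^.*\/config\/(?P<int_name>.*).xml$')

    for path in ('blah/res/config/performance/inodeop-benchmark.xml',
                 'blah/config/performance/inodeop-benchmark.xml'):
        match = _INT_NAME_RE.match(path)
        print(match.group('int_name'))  # performance/inodeop-benchmark
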
diff --git a/atest/test_runner_handler.py b/atest/test_runner_handler.py
index 5229c88..0d21eed 100644
--- a/atest/test_runner_handler.py
+++ b/atest/test_runner_handler.py
@@ -95,16 +95,16 @@
     Returns:
         Set of build targets required by the test runners.
     """
-    dummy_result_dir = ''
+    unused_result_dir = ''
     test_runner_build_req = set()
     for test_runner, _ in group_tests_by_test_runners(test_infos):
         test_runner_build_req |= test_runner(
-            dummy_result_dir,
+            unused_result_dir,
             module_info=module_info).get_test_runner_build_reqs()
     return test_runner_build_req
 
 
-def run_all_tests(results_dir, test_infos, extra_args,
+def run_all_tests(results_dir, test_infos, extra_args, module_info,
                   delay_print_summary=False):
     """Run the given tests.
 
@@ -112,11 +112,14 @@
         results_dir: String directory to store atest results.
         test_infos: List of TestInfo.
         extra_args: Dict of extra args for test runners to use.
+        module_info: ModuleInfo object.
 
     Returns:
         0 if tests succeed, non-zero otherwise.
     """
-    reporter = result_reporter.ResultReporter()
+    reporter = result_reporter.ResultReporter(
+        collect_only=extra_args.get(constants.COLLECT_TESTS_ONLY),
+        flakes_info=extra_args.get(constants.FLAKES_INFO))
     reporter.print_starting_text()
     tests_ret_code = constants.EXIT_CODE_SUCCESS
     for test_runner, tests in group_tests_by_test_runners(test_infos):
@@ -126,7 +129,7 @@
         ret_code = constants.EXIT_CODE_TEST_FAILURE
         stacktrace = ''
         try:
-            test_runner = test_runner(results_dir)
+            test_runner = test_runner(results_dir, module_info=module_info)
             ret_code = test_runner.run_tests(tests, extra_args, reporter)
             tests_ret_code |= ret_code
         # pylint: disable=broad-except
@@ -144,5 +147,4 @@
                    'stacktrace': stacktrace}])
     if delay_print_summary:
         return tests_ret_code, reporter
-    return (reporter.print_summary(extra_args.get(constants.COLLECT_TESTS_ONLY))
-            or tests_ret_code, reporter)
+    return reporter.print_summary() or tests_ret_code, reporter
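
A small worked example of the return-code aggregation kept above: per-runner
exit codes are OR'ed together, so any failing runner marks the whole run as
failed (the values below are hypothetical).

    EXIT_CODE_SUCCESS = 0  # assumed to mirror constants.EXIT_CODE_SUCCESS

    tests_ret_code = EXIT_CODE_SUCCESS
    for ret_code in (0, 0, 1):  # hypothetical per-runner results
        tests_ret_code |= ret_code
    print(tests_ret_code)  # 1 -> the overall run is reported as failed
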
diff --git a/atest/test_runner_handler_unittest.py b/atest/test_runner_handler_unittest.py
index ca94405..853729c 100755
--- a/atest/test_runner_handler_unittest.py
+++ b/atest/test_runner_handler_unittest.py
@@ -19,12 +19,16 @@
 # pylint: disable=protected-access
 # pylint: disable=line-too-long
 
+import os
 import unittest
 
 from unittest import mock
 
 import atest_error
+import constants
+import module_info
 import test_runner_handler
+import unittest_constants as uc
 
 from metrics import metrics
 from test_finders import test_info
@@ -119,29 +123,32 @@
             test_runner_handler.get_test_runner_reqs(empty_module_info,
                                                      test_infos))
 
+    @mock.patch.dict('os.environ', {constants.ANDROID_BUILD_TOP:'/'})
     @mock.patch.object(metrics, 'RunnerFinishEvent')
     def test_run_all_tests(self, _mock_runner_finish):
         """Test that the return value as we expected."""
         results_dir = ""
         extra_args = {}
+        mod_info = module_info.ModuleInfo(
+            module_file=os.path.join(uc.TEST_DATA_DIR, uc.JSON_FILE))
         # Tests both run_tests return 0
         test_infos = [MODULE_INFO_A, MODULE_INFO_A_AGAIN]
         self.assertEqual(
             0,
             test_runner_handler.run_all_tests(
-                results_dir, test_infos, extra_args)[0])
+                results_dir, test_infos, extra_args, mod_info)[0])
         # Tests both run_tests return 1
         test_infos = [MODULE_INFO_B, MODULE_INFO_B_AGAIN]
         self.assertEqual(
             1,
             test_runner_handler.run_all_tests(
-                results_dir, test_infos, extra_args)[0])
+                results_dir, test_infos, extra_args, mod_info)[0])
         # Tests with on run_tests return 0, the other return 1
         test_infos = [MODULE_INFO_A, MODULE_INFO_B]
         self.assertEqual(
             1,
             test_runner_handler.run_all_tests(
-                results_dir, test_infos, extra_args)[0])
+                results_dir, test_infos, extra_args, mod_info)[0])
 
 if __name__ == '__main__':
     unittest.main()
diff --git a/atest/test_runners/atest_tf_test_runner.py b/atest/test_runners/atest_tf_test_runner.py
index be954fe..add149d 100644
--- a/atest/test_runners/atest_tf_test_runner.py
+++ b/atest/test_runners/atest_tf_test_runner.py
@@ -25,13 +25,19 @@
 import select
 import shutil
 import socket
+import uuid
 
 from functools import partial
+from pathlib import Path
 
 import atest_utils
 import constants
 import result_reporter
 
+from logstorage import atest_gcp_utils
+from logstorage import logstorage_utils
+from metrics import metrics
+from test_finders import test_finder_utils
 from test_finders import test_info
 from test_runners import test_runner_base
 from .event_handler import EventHandler
@@ -40,13 +46,13 @@
 SOCKET_HOST = '127.0.0.1'
 SOCKET_QUEUE_MAX = 1
 SOCKET_BUFFER = 4096
-SELECT_TIMEOUT = 5
+SELECT_TIMEOUT = 0.5
 
 # Socket Events of form FIRST_EVENT {JSON_DATA}\nSECOND_EVENT {JSON_DATA}
 # EVENT_RE has groups for the name and the data. "." does not match \n.
 EVENT_RE = re.compile(r'\n*(?P<event_name>[A-Z_]+) (?P<json_data>{.*})(?=\n|.)*')
 
-EXEC_DEPENDENCIES = ('adb', 'aapt')
+EXEC_DEPENDENCIES = ('adb', 'aapt', 'fastboot')
 
 TRADEFED_EXIT_MSG = 'TradeFed subprocess exited early with exit code=%s.'
 
@@ -67,8 +73,9 @@
     # TODO(b/142630648): Enable option enable-granular-attempts
     # in sharding mode.
     _LOG_ARGS = ('--logcat-on-failure --atest-log-file-path={log_path} '
-                 '--no-enable-granular-attempts')
-    _RUN_CMD = ('{exe} {template} --template:map '
+                 '--no-enable-granular-attempts '
+                 '--proto-output-file={proto_path}')
+    _RUN_CMD = ('{env} {exe} {template} --template:map '
                 'test=atest {tf_customize_template} {log_args} {args}')
     _BUILD_REQ = {'tradefed-core'}
     _RERUN_OPTION_GROUP = [constants.ITERATIONS,
@@ -77,13 +84,15 @@
 
     def __init__(self, results_dir, module_info=None, **kwargs):
         """Init stuff for base class."""
-        super(AtestTradefedTestRunner, self).__init__(results_dir, **kwargs)
+        super().__init__(results_dir, **kwargs)
         self.module_info = module_info
         self.log_path = os.path.join(results_dir, LOG_FOLDER_NAME)
         if not os.path.exists(self.log_path):
             os.makedirs(self.log_path)
-        log_args = {'log_path': self.log_path}
-        self.run_cmd_dict = {'exe': self.EXECUTABLE,
+        log_args = {'log_path': self.log_path,
+                    'proto_path': os.path.join(self.results_dir, constants.ATEST_TEST_RECORD_PROTO)}
+        self.run_cmd_dict = {'env': self._get_ld_library_path(),
+                             'exe': self.EXECUTABLE,
                              'template': self._TF_TEMPLATE,
                              'tf_customize_template': '',
                              'args': '',
@@ -91,6 +100,21 @@
         self.is_verbose = logging.getLogger().isEnabledFor(logging.DEBUG)
         self.root_dir = os.environ.get(constants.ANDROID_BUILD_TOP)
 
+    def _get_ld_library_path(self):
+        """Get the extra environment setup string for running TF.
+
+        Returns:
+            A string for the environment passed to TF. Currently only
+            LD_LIBRARY_PATH, so TF loads the correct local shared libraries.
+        """
+        out_dir = os.environ.get(constants.ANDROID_HOST_OUT, '')
+        lib_dirs = ['lib', 'lib64']
+        path = ''
+        for lib in lib_dirs:
+            lib_dir = os.path.join(out_dir, lib)
+            path = path + lib_dir + ':'
+        return 'LD_LIBRARY_PATH=%s' % path
+
     def _try_set_gts_authentication_key(self):
         """Set GTS authentication key if it is available or exists.
 
@@ -130,9 +154,145 @@
         # Set google service key if it's available or found before
         # running tests.
         self._try_set_gts_authentication_key()
-        if os.getenv(test_runner_base.OLD_OUTPUT_ENV_VAR):
-            return self.run_tests_raw(test_infos, extra_args, reporter)
-        return self.run_tests_pretty(test_infos, extra_args, reporter)
+        result = 0
+        creds, inv = self._do_upload_flow(extra_args)
+        try:
+            if os.getenv(test_runner_base.OLD_OUTPUT_ENV_VAR):
+                result = self.run_tests_raw(test_infos, extra_args, reporter)
+            result = self.run_tests_pretty(test_infos, extra_args, reporter)
+        finally:
+            if inv:
+                try:
+                    logging.disable(logging.INFO)
+                    # Always set invocation status to completed due to the ATest
+                    # handle whole process by its own.
+                    inv['schedulerState'] = 'completed'
+                    logstorage_utils.BuildClient(creds).update_invocation(inv)
+                    reporter.test_result_link = (constants.RESULT_LINK
+                                                 % inv['invocationId'])
+                finally:
+                    logging.disable(logging.NOTSET)
+        return result
+
+    def _do_upload_flow(self, extra_args):
+        """Run upload flow.
+
+        Asking user's decision and do the related steps.
+
+        Args:
+            extra_args: Dict of extra args to add to test run.
+        Return:
+            tuple(invocation, workunit)
+        """
+        config_folder = os.path.join(os.path.expanduser('~'), '.atest')
+        creds = self._request_consent_of_upload_test_result(
+            config_folder,
+            extra_args.get(constants.REQUEST_UPLOAD_RESULT, None))
+        if creds:
+            inv, workunit = self._prepare_data(creds)
+            extra_args[constants.INVOCATION_ID] = inv['invocationId']
+            extra_args[constants.WORKUNIT_ID] = workunit['id']
+            if not os.path.exists(os.path.dirname(constants.TOKEN_FILE_PATH)):
+                os.makedirs(os.path.dirname(constants.TOKEN_FILE_PATH))
+            with open(constants.TOKEN_FILE_PATH, 'w') as token_file:
+                token_file.write(creds.token_response['access_token'])
+            return creds, inv
+        return None, None
+
+    def _prepare_data(self, creds):
+        """Prepare data for build api using.
+
+        Args:
+            creds: The credential object.
+        Return:
+            invocation and workunit object.
+        """
+        try:
+            logging.disable(logging.INFO)
+            external_id = str(uuid.uuid4())
+            client = logstorage_utils.BuildClient(creds)
+            branch = self._get_branch(client)
+            target = self._get_target(branch, client)
+            build_record = client.insert_local_build(external_id,
+                                                     target,
+                                                     branch)
+            client.insert_build_attempts(build_record)
+            invocation = client.insert_invocation(build_record)
+            workunit = client.insert_work_unit(invocation)
+            return invocation, workunit
+        finally:
+            logging.disable(logging.NOTSET)
+
+    def _get_branch(self, build_client):
+        """Get source code tree branch.
+
+        Args:
+            build_client: The build client object.
+        Return:
+            "git_master" in internal git, "aosp-master" otherwise.
+        """
+        default_branch = ('git_master'
+                          if constants.CREDENTIAL_FILE_NAME else 'aosp-master')
+        local_branch = atest_utils.get_manifest_branch()
+        branches = [b['name'] for b in build_client.list_branch()['branches']]
+        return local_branch if local_branch in branches else default_branch
+
+    def _get_target(self, branch, build_client):
+        """Get local build selected target.
+
+        Args:
+            branch: The branch want to check.
+            build_client: The build client object.
+        Return:
+            The matched build target, "aosp_x86-userdebug" otherwise.
+        """
+        default_target = 'aosp_x86-userdebug'
+        local_target = atest_utils.get_build_target()
+        targets = [t['target']
+                   for t in build_client.list_target(branch)['targets']]
+        return local_target if local_target in targets else default_target
+
+    def _request_consent_of_upload_test_result(self, config_folder,
+                                               request_to_upload_result):
+        """Request the consent of upload test results at the first time.
+
+        Args:
+            config_folder: The directory path to put config file.
+            request_to_upload_result: Prompt message for user determine.
+        Return:
+            The credential object.
+        """
+        if not os.path.exists(config_folder):
+            os.makedirs(config_folder)
+        not_upload_file = os.path.join(config_folder,
+                                       constants.DO_NOT_UPLOAD)
+        # Do nothing if there are no related config or DO_NOT_UPLOAD exists.
+        if (not constants.CREDENTIAL_FILE_NAME or
+                not constants.TOKEN_FILE_PATH):
+            return None
+
+        creds_f = os.path.join(config_folder, constants.CREDENTIAL_FILE_NAME)
+        if request_to_upload_result:
+            if os.path.exists(not_upload_file):
+                os.remove(not_upload_file)
+            if os.path.exists(creds_f):
+                os.remove(creds_f)
+
+        # If the credential file exists or the user answers "Yes", ATest will
+        # try to get the credential from the file; otherwise it creates a
+        # DO_NOT_UPLOAD file to record the user's decision.
+        if not os.path.exists(not_upload_file):
+            if (os.path.exists(creds_f) or
+                    (request_to_upload_result and
+                     atest_utils.prompt_with_yn_result(
+                         constants.UPLOAD_TEST_RESULT_MSG, False))):
+                return atest_gcp_utils.GCPHelper(
+                    client_id=constants.CLIENT_ID,
+                    client_secret=constants.CLIENT_SECRET,
+                    user_agent='atest').get_credential_with_auth_flow(creds_f)
+
+        Path(not_upload_file).touch()
+        return None
 
     def run_tests_raw(self, test_infos, extra_args, reporter):
         """Run the list of test_infos. See base class for more.
@@ -178,20 +338,22 @@
             self.handle_subprocess(subproc, partial(self._start_monitor,
                                                     server,
                                                     subproc,
-                                                    reporter))
+                                                    reporter,
+                                                    extra_args))
             server.close()
             ret_code |= self.wait_for_subprocess(subproc)
         return ret_code
 
     # pylint: disable=too-many-branches
     # pylint: disable=too-many-locals
-    def _start_monitor(self, server, tf_subproc, reporter):
+    def _start_monitor(self, server, tf_subproc, reporter, extra_args):
         """Polling and process event.
 
         Args:
             server: Socket server object.
             tf_subproc: The tradefed subprocess to poll.
             reporter: Result_Reporter object.
+            extra_args: Dict of extra args to add to test run.
         """
         inputs = [server]
         event_handlers = {}
@@ -222,7 +384,12 @@
                         else:
                             event_handler = event_handlers.setdefault(
                                 socket_object, EventHandler(
-                                    result_reporter.ResultReporter(),
+                                    result_reporter.ResultReporter(
+                                        collect_only=extra_args.get(
+                                            constants.COLLECT_TESTS_ONLY),
+                                        flakes_info=extra_args.get(
+                                            constants.FLAKES_INFO)),
+
                                     self.NAME))
                         recv_data = self._process_connection(data_map,
                                                              socket_object,
@@ -234,9 +401,15 @@
                 # Subprocess ended and all socket clients were closed.
                 if tf_subproc.poll() is not None and len(inputs) == 1:
                     inputs.pop().close()
+                    if not reporter.all_test_results:
+                        atest_utils.colorful_print(
+                            r'No test to run. Please check: '
+                            r'{} for detail.'.format(reporter.log_path),
+                            constants.RED, highlight=True)
                     if not data_map:
                         raise TradeFedExitError(TRADEFED_EXIT_MSG
                                                 % tf_subproc.returncode)
+                    self._handle_log_associations(event_handlers)
 
     def _process_connection(self, data_map, conn, event_handler):
         """Process a socket connection betwen TF and ATest.
@@ -292,10 +465,20 @@
     def generate_env_vars(self, extra_args):
         """Convert extra args into env vars."""
         env_vars = os.environ.copy()
+        if constants.TF_GLOBAL_CONFIG:
+            env_vars["TF_GLOBAL_CONFIG"] = constants.TF_GLOBAL_CONFIG
         debug_port = extra_args.get(constants.TF_DEBUG, '')
         if debug_port:
             env_vars['TF_DEBUG'] = 'true'
             env_vars['TF_DEBUG_PORT'] = str(debug_port)
+        filtered_paths = []
+        for path in str(env_vars['PYTHONPATH']).split(':'):
+            # TODO (b/166216843) Remove the hacky PYTHON path workaround.
+            if (str(path).startswith('/tmp/Soong.python_') and
+                    str(path).find('googleapiclient') > 0):
+                continue
+            filtered_paths.append(path)
+        env_vars['PYTHONPATH'] = ':'.join(filtered_paths)
         return env_vars
 
     # pylint: disable=unnecessary-pass
@@ -347,8 +530,7 @@
 
     # pylint: disable=too-many-branches
     # pylint: disable=too-many-statements
-    @staticmethod
-    def _parse_extra_args(extra_args):
+    def _parse_extra_args(self, test_infos, extra_args):
         """Convert the extra args into something tf can understand.
 
         Args:
@@ -392,6 +574,8 @@
                 continue
             if constants.DRY_RUN == arg:
                 continue
+            if constants.FLAKES_INFO == arg:
+                continue
             if constants.INSTANT == arg:
                 args_to_append.append('--enable-parameterized-modules')
                 args_to_append.append('--module-parameter')
@@ -427,7 +611,31 @@
             if constants.TF_DEBUG == arg:
                 print("Please attach process to your IDE...")
                 continue
+            if arg in (constants.TF_TEMPLATE,
+                       constants.TF_EARLY_DEVICE_RELEASE,
+                       constants.INVOCATION_ID,
+                       constants.WORKUNIT_ID):
+                continue
             args_not_supported.append(arg)
+        # Set exclude instant app annotation for non-instant mode run.
+        if (constants.INSTANT not in extra_args and
+            self._has_instant_app_config(test_infos, self.module_info)):
+            args_to_append.append(constants.TF_TEST_ARG)
+            args_to_append.append(
+                '{tf_class}:{option_name}:{option_value}'.format(
+                    tf_class=constants.TF_AND_JUNIT_CLASS,
+                    option_name=constants.TF_EXCLUDE_ANNOTATE,
+                    option_value=constants.INSTANT_MODE_ANNOTATE))
+        # If the test config auto-enables parameterized modules, force-exclude
+        # the default parameters (e.g. instant_app, secondary_user).
+        if '--enable-parameterized-modules' not in args_to_append:
+            for tinfo in test_infos:
+                if self._is_parameter_auto_enabled_cfg(tinfo, self.module_info):
+                    args_to_append.append('--enable-parameterized-modules')
+                    for exclude_parameter in constants.DEFAULT_EXCLUDE_PARAS:
+                        args_to_append.append('--exclude-module-parameters')
+                        args_to_append.append(exclude_parameter)
+                    break
         return args_to_append, args_not_supported
 
     def _generate_metrics_folder(self, extra_args):
@@ -471,13 +679,22 @@
         if metrics_folder:
             test_args.extend(['--metrics-folder', metrics_folder])
             logging.info('Saved metrics in: %s', metrics_folder)
-        log_level = 'WARN'
-        if self.is_verbose:
-            log_level = 'VERBOSE'
-            test_args.extend(['--log-level-display', log_level])
+        if extra_args.get(constants.INVOCATION_ID, None):
+            test_args.append('--invocation-data invocation_id=%s'
+                             % extra_args[constants.INVOCATION_ID])
+        if extra_args.get(constants.WORKUNIT_ID, None):
+            test_args.append('--invocation-data work_unit_id=%s'
+                             % extra_args[constants.WORKUNIT_ID])
+        # For detailed logs, set TF options log-level/log-level-display as
+        # 'VERBOSE' by default.
+        log_level = 'VERBOSE'
+        test_args.extend(['--log-level-display', log_level])
         test_args.extend(['--log-level', log_level])
+        # Set no-early-device-release by default to speed up TF teardown time.
+        if not constants.TF_EARLY_DEVICE_RELEASE in extra_args:
+            test_args.extend(['--no-early-device-release'])
 
-        args_to_add, args_not_supported = self._parse_extra_args(extra_args)
+        args_to_add, args_not_supported = self._parse_extra_args(test_infos, extra_args)
 
         # TODO(b/122889707) Remove this after finding the root cause.
         env_serial = os.environ.get(constants.ANDROID_SERIAL)
@@ -616,7 +833,13 @@
             # if it's integration finder.
             if info.test_finder in _INTEGRATION_FINDERS:
                 has_integration_test = True
-            args.extend([constants.TF_INCLUDE_FILTER, info.test_name])
+            # For non-parameterized test modules, use --include-filter, but
+            # for tests whose config auto-enables parameterized modules, use
+            # --module instead.
+            if self._is_parameter_auto_enabled_cfg(info, self.module_info):
+                args.extend([constants.TF_MODULE_FILTER, info.test_name])
+            else:
+                args.extend([constants.TF_INCLUDE_FILTER, info.test_name])
             filters = set()
             for test_filter in info.data.get(constants.TI_FILTER, []):
                 filters.update(test_filter.to_set_of_tf_strings())
@@ -670,3 +893,73 @@
         """
         return ' '.join(['--template:map %s'
                          % x for x in extra_args.get(constants.TF_TEMPLATE, [])])
+
+    def _handle_log_associations(self, event_handlers):
+        """Handle TF's log associations information data.
+
+        log_association dict:
+        {'loggedFile': '/tmp/serial-util11375755456514097276.ser',
+         'dataName': 'device_logcat_setup_127.0.0.1:58331',
+         'time': 1602038599.856113},
+
+        Args:
+            event_handlers: Dict of {socket_object:EventHandler}.
+
+        """
+        log_associations = []
+        for _, event_handler in event_handlers.items():
+            if event_handler.log_associations:
+                log_associations += event_handler.log_associations
+        device_test_end_log_time = ''
+        device_teardown_log_time = ''
+        for log_association in log_associations:
+            if 'device_logcat_test' in log_association.get('dataName', ''):
+                device_test_end_log_time = log_association.get('time')
+            if 'device_logcat_teardown' in log_association.get('dataName', ''):
+                device_teardown_log_time = log_association.get('time')
+        if device_test_end_log_time and device_teardown_log_time:
+            teardowntime = (float(device_teardown_log_time) -
+                            float(device_test_end_log_time))
+            logging.debug('TF logcat teardown time=%s seconds.', teardowntime)
+            metrics.LocalDetectEvent(
+                detect_type=constants.DETECT_TYPE_TF_TEARDOWN_LOGCAT,
+                result=int(teardowntime))
+
+    @staticmethod
+    def _has_instant_app_config(test_infos, mod_info):
+        """Check if one of the input tests defined instant app mode in config.
+
+        Args:
+            test_infos: A set of TestInfo instances.
+            mod_info: ModuleInfo object.
+
+        Returns: True if one of the tests sets up instant app mode.
+        """
+        for tinfo in test_infos:
+            test_config, _ = test_finder_utils.get_test_config_and_srcs(
+                tinfo, mod_info)
+            if test_config:
+                parameters = atest_utils.get_config_parameter(test_config)
+                if constants.TF_PARA_INSTANT_APP in parameters:
+                    return True
+        return False
+
+    @staticmethod
+    def _is_parameter_auto_enabled_cfg(tinfo, mod_info):
+        """Check if input tests contains auto enable support parameters.
+
+        Args:
+            test_infos: A set of TestInfo instances.
+            mod_info: ModuleInfo object.
+
+        Returns: True if input test has parameter setting which is not in the
+                 exclude list.
+        """
+        test_config, _ = test_finder_utils.get_test_config_and_srcs(
+            tinfo, mod_info)
+        if test_config:
+            parameters = atest_utils.get_config_parameter(test_config)
+            if (parameters - constants.DEFAULT_EXCLUDE_PARAS
+                - constants.DEFAULT_EXCLUDE_NOT_PARAS):
+                return True
+        return False
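
To make the log-association handling above concrete, a tiny worked example of
the teardown-time computation in _handle_log_associations (the timestamps are
invented).

    # Hypothetical log associations in the shape shown in the docstring above.
    log_associations = [
        {'dataName': 'device_logcat_test_127.0.0.1:58331',
         'time': 1602038580.0},
        {'dataName': 'device_logcat_teardown_127.0.0.1:58331',
         'time': 1602038599.856113},
    ]

    test_end_time = teardown_time = ''
    for log_association in log_associations:
        if 'device_logcat_test' in log_association.get('dataName', ''):
            test_end_time = log_association.get('time')
        if 'device_logcat_teardown' in log_association.get('dataName', ''):
            teardown_time = log_association.get('time')
    if test_end_time and teardown_time:
        seconds = float(teardown_time) - float(test_end_time)
        print('TF logcat teardown time=%s seconds.' % int(seconds))  # 19
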
diff --git a/atest/test_runners/atest_tf_test_runner_unittest.py b/atest/test_runners/atest_tf_test_runner_unittest.py
index fc79a2a..2219ba4 100755
--- a/atest/test_runners/atest_tf_test_runner_unittest.py
+++ b/atest/test_runners/atest_tf_test_runner_unittest.py
@@ -27,10 +27,13 @@
 from io import StringIO
 from unittest import mock
 
+import atest_utils
 import constants
 import unittest_constants as uc
 import unittest_utils
 
+from logstorage import atest_gcp_utils
+from test_finders import test_finder_utils
 from test_finders import test_info
 from test_runners import event_handler
 from test_runners import atest_tf_test_runner as atf_tr
@@ -41,10 +44,14 @@
 METRICS_DIR_ARG = '--metrics-folder %s ' % METRICS_DIR
 # TODO(147567606): Replace {serial} with {extra_args} for general extra
 # arguments testing.
-RUN_CMD_ARGS = '{metrics}--log-level WARN{serial}'
+RUN_CMD_ARGS = ('{metrics}--log-level-display VERBOSE --log-level VERBOSE'
+                '{device_early_release}{serial}')
 LOG_ARGS = atf_tr.AtestTradefedTestRunner._LOG_ARGS.format(
-    log_path=os.path.join(uc.TEST_INFO_DIR, atf_tr.LOG_FOLDER_NAME))
+    log_path=os.path.join(uc.TEST_INFO_DIR, atf_tr.LOG_FOLDER_NAME),
+    proto_path=os.path.join(uc.TEST_INFO_DIR, constants.ATEST_TEST_RECORD_PROTO))
+RUN_ENV_STR = 'tf_env_var=test'
 RUN_CMD = atf_tr.AtestTradefedTestRunner._RUN_CMD.format(
+    env=RUN_ENV_STR,
     exe=atf_tr.AtestTradefedTestRunner.EXECUTABLE,
     template=atf_tr.AtestTradefedTestRunner._TF_TEMPLATE,
     tf_customize_template='{tf_customize_template}',
@@ -174,8 +181,11 @@
 class AtestTradefedTestRunnerUnittests(unittest.TestCase):
     """Unit tests for atest_tf_test_runner.py"""
 
+    #pylint: disable=arguments-differ
+    @mock.patch.object(atf_tr.AtestTradefedTestRunner, '_get_ld_library_path')
     @mock.patch.dict('os.environ', {constants.ANDROID_BUILD_TOP:'/'})
-    def setUp(self):
+    def setUp(self, mock_get_ld_library_path):
+        mock_get_ld_library_path.return_value = RUN_ENV_STR
         self.tr = atf_tr.AtestTradefedTestRunner(results_dir=uc.TEST_INFO_DIR)
 
     def tearDown(self):
@@ -250,7 +260,7 @@
         mock_process.side_effect = ['abc', 'def', False, False]
         mock_subproc.poll.side_effect = [None, None, None, None,
                                          None, True]
-        self.tr._start_monitor(mock_server, mock_subproc, mock_reporter)
+        self.tr._start_monitor(mock_server, mock_subproc, mock_reporter, {})
         self.assertEqual(mock_process.call_count, 4)
         calls = [mock.call.accept(), mock.call.close()]
         mock_server.assert_has_calls(calls)
@@ -280,7 +290,7 @@
         # TF exit early but have not processed data in socket buffer.
         mock_subproc.poll.side_effect = [None, None, True, True,
                                          True, True]
-        self.tr._start_monitor(mock_server, mock_subproc, mock_reporter)
+        self.tr._start_monitor(mock_server, mock_subproc, mock_reporter, {})
         self.assertEqual(mock_process.call_count, 4)
         calls = [mock.call.accept(), mock.call.close()]
         mock_server.assert_has_calls(calls)
@@ -388,26 +398,31 @@
         unittest_utils.assert_strict_equal(
             self,
             self.tr.generate_run_commands([], {}),
-            [RUN_CMD.format(metrics='',
+            [RUN_CMD.format(env=RUN_ENV_STR,
+                            metrics='',
                             serial='',
-                            tf_customize_template='')])
+                            tf_customize_template='',
+                            device_early_release=' --no-early-device-release')])
         mock_mertrics.return_value = METRICS_DIR
         unittest_utils.assert_strict_equal(
             self,
             self.tr.generate_run_commands([], {}),
             [RUN_CMD.format(metrics=METRICS_DIR_ARG,
                             serial='',
-                            tf_customize_template='')])
+                            tf_customize_template='',
+                            device_early_release=' --no-early-device-release')])
         # Run cmd with result server args.
         result_arg = '--result_arg'
         mock_resultargs.return_value = [result_arg]
         mock_mertrics.return_value = ''
         unittest_utils.assert_strict_equal(
             self,
-            self.tr.generate_run_commands([], {}),
+            self.tr.generate_run_commands(
+                [], {}),
             [RUN_CMD.format(metrics='',
                             serial='',
-                            tf_customize_template='') + ' ' + result_arg])
+                            tf_customize_template='',
+                            device_early_release=' --no-early-device-release') + ' ' + result_arg])
 
     @mock.patch('os.environ.get')
     @mock.patch.object(atf_tr.AtestTradefedTestRunner, '_generate_metrics_folder')
@@ -426,7 +441,8 @@
             self.tr.generate_run_commands([], {}),
             [RUN_CMD.format(metrics='',
                             serial=env_serial_arg,
-                            tf_customize_template='')])
+                            tf_customize_template='',
+                            device_early_release=' --no-early-device-release')])
         # Serial env be set but with --serial arg.
         arg_device_serial = 'arg-device-0'
         arg_serial_arg = ' --serial %s' % arg_device_serial
@@ -435,14 +451,16 @@
             self.tr.generate_run_commands([], {constants.SERIAL:arg_device_serial}),
             [RUN_CMD.format(metrics='',
                             serial=arg_serial_arg,
-                            tf_customize_template='')])
+                            tf_customize_template='',
+                            device_early_release=' --no-early-device-release')])
         # Serial env be set but with -n arg
         unittest_utils.assert_strict_equal(
             self,
             self.tr.generate_run_commands([], {constants.HOST: True}),
             [RUN_CMD.format(metrics='',
                             serial='',
-                            tf_customize_template='') +
+                            tf_customize_template='',
+                            device_early_release=' --no-early-device-release') +
              ' -n --prioritize-host-config --skip-host-arch-check'])
 
 
@@ -542,10 +560,12 @@
         unittest_utils.assert_equal_testinfo_sets(self, test_infos,
                                                   {FLAT2_CLASS_INFO})
 
-    def test_create_test_args(self):
+    @mock.patch.object(test_finder_utils, 'get_test_config_and_srcs')
+    def test_create_test_args(self, mock_config):
         """Test _create_test_args method."""
         # Only compile '--skip-loading-config-jar' in TF if it's not
         # INTEGRATION finder or the finder property isn't set.
+        mock_config.return_value = '', ''
         args = self.tr._create_test_args([MOD_INFO])
         self.assertTrue(constants.TF_SKIP_LOADING_CONFIG_JAR in args)
 
@@ -581,7 +601,8 @@
             [RUN_CMD.format(
                 metrics='',
                 serial='',
-                tf_customize_template='')])
+                tf_customize_template='',
+                device_early_release=' --no-early-device-release')])
         # Testing  with collect-tests-only
         mock_resultargs.return_value = []
         mock_mertrics.return_value = ''
@@ -592,7 +613,8 @@
             [RUN_CMD.format(
                 metrics='',
                 serial=' --collect-tests-only',
-                tf_customize_template='')])
+                tf_customize_template='',
+                device_early_release=' --no-early-device-release')])
 
 
     @mock.patch('os.environ.get', return_value=None)
@@ -616,6 +638,7 @@
             [RUN_CMD.format(
                 metrics='',
                 serial='',
+                device_early_release=' --no-early-device-release',
                 tf_customize_template=
                 '--template:map {}={}').format(tf_tmplate_key1,
                                                tf_tmplate_val1)])
@@ -631,6 +654,7 @@
             [RUN_CMD.format(
                 metrics='',
                 serial='',
+                device_early_release=' --no-early-device-release',
                 tf_customize_template=
                 '--template:map {}={} --template:map {}={}').format(
                     tf_tmplate_key1,
@@ -638,6 +662,200 @@
                     tf_tmplate_key2,
                     tf_tmplate_val2)])
 
+    @mock.patch.object(atest_gcp_utils.GCPHelper, 'get_credential_with_auth_flow')
+    @mock.patch('builtins.input')
+    def test_request_consent_of_upload_test_result_yes(self,
+                                                       mock_input,
+                                                       mock_get_credential_with_auth_flow):
+        """test request_consent_of_upload_test_result method."""
+        constants.CREDENTIAL_FILE_NAME = 'cred_file'
+        constants.GCP_ACCESS_TOKEN = 'access_token'
+        tmp_folder = tempfile.mkdtemp()
+        mock_input.return_value = 'Y'
+        not_upload_file = os.path.join(tmp_folder,
+                                       constants.DO_NOT_UPLOAD)
+
+        self.tr._request_consent_of_upload_test_result(tmp_folder, True)
+        self.assertEqual(1, mock_get_credential_with_auth_flow.call_count)
+        self.assertFalse(os.path.exists(not_upload_file))
+
+        self.tr._request_consent_of_upload_test_result(tmp_folder, True)
+        self.assertEqual(2, mock_get_credential_with_auth_flow.call_count)
+        self.assertFalse(os.path.exists(not_upload_file))
+
+    @mock.patch.object(atest_gcp_utils.GCPHelper, 'get_credential_with_auth_flow')
+    @mock.patch('builtins.input')
+    def test_request_consent_of_upload_test_result_no(self,
+                                                      mock_input,
+                                                      mock_get_credential_with_auth_flow):
+        """test request_consent_of_upload_test_result method."""
+        mock_input.return_value = 'N'
+        constants.CREDENTIAL_FILE_NAME = 'cred_file'
+        constants.GCP_ACCESS_TOKEN = 'access_token'
+        tmp_folder = tempfile.mkdtemp()
+        not_upload_file = os.path.join(tmp_folder,
+                                       constants.DO_NOT_UPLOAD)
+
+        self.tr._request_consent_of_upload_test_result(tmp_folder, True)
+        self.assertTrue(os.path.exists(not_upload_file))
+        self.assertEqual(0, mock_get_credential_with_auth_flow.call_count)
+        self.tr._request_consent_of_upload_test_result(tmp_folder, True)
+        self.assertEqual(0, mock_get_credential_with_auth_flow.call_count)
+
+    @mock.patch('os.environ.get', return_value=None)
+    @mock.patch.object(atf_tr.AtestTradefedTestRunner, '_generate_metrics_folder')
+    @mock.patch('atest_utils.get_result_server_args')
+    def test_generate_run_commands_with_tf_early_device_release(
+            self, mock_resultargs, mock_mertrics, _):
+        """Test generate_run_command method."""
+        # Testing  without collect-tests-only
+        mock_resultargs.return_value = []
+        mock_mertrics.return_value = ''
+        extra_args = {constants.TF_EARLY_DEVICE_RELEASE: True}
+        unittest_utils.assert_strict_equal(
+            self,
+            self.tr.generate_run_commands([], extra_args),
+            [RUN_CMD.format(
+                metrics='',
+                serial='',
+                tf_customize_template='',
+                device_early_release='')])
+
+    @mock.patch.object(atf_tr.AtestTradefedTestRunner, '_prepare_data')
+    @mock.patch.object(atf_tr.AtestTradefedTestRunner, '_request_consent_of_upload_test_result')
+    def test_do_upload_flow(self, mock_request, mock_prepare):
+        """test _do_upload_flow method."""
+        fake_extra_args = {}
+        fake_creds = mock.Mock()
+        fake_creds.token_response = {'access_token': 'fake_token'}
+        mock_request.return_value = fake_creds
+        fake_inv = {'invocationId': 'inv_id'}
+        fake_workunit = {'id': 'workunit_id'}
+        mock_prepare.return_value = fake_inv, fake_workunit
+        constants.TOKEN_FILE_PATH = tempfile.NamedTemporaryFile().name
+        creds, inv = self.tr._do_upload_flow(fake_extra_args)
+        self.assertEqual(fake_creds, creds)
+        self.assertEqual(fake_inv, inv)
+        self.assertEqual(fake_extra_args[constants.INVOCATION_ID],
+                         fake_inv['invocationId'])
+        self.assertEqual(fake_extra_args[constants.WORKUNIT_ID],
+                         fake_workunit['id'])
+
+        mock_request.return_value = None
+        creds, inv = self.tr._do_upload_flow(fake_extra_args)
+        self.assertEqual(None, creds)
+        self.assertEqual(None, inv)
+
+    @mock.patch.object(test_finder_utils, 'get_test_config_and_srcs')
+    def test_has_instant_app_config(self, mock_config):
+        """test _has_instant_app_config method."""
+        no_instant_config = os.path.join(
+            uc.TEST_DATA_DIR, "parameter_config", "parameter.cfg")
+        instant_config = os.path.join(
+            uc.TEST_DATA_DIR, "parameter_config", "instant_app_parameter.cfg")
+        # Test finding the instant app config.
+        mock_config.return_value = instant_config, ''
+        self.assertTrue(
+            atf_tr.AtestTradefedTestRunner._has_instant_app_config(
+                ['test_info'], 'module_info_obj'))
+        # Test not finding the instant app config.
+        mock_config.return_value = no_instant_config, ''
+        self.assertFalse(
+            atf_tr.AtestTradefedTestRunner._has_instant_app_config(
+                ['test_info'], 'module_info_obj'))
+
+    @mock.patch.object(atf_tr.AtestTradefedTestRunner,
+                       '_has_instant_app_config', return_value=True)
+    @mock.patch('os.environ.get', return_value=None)
+    @mock.patch.object(atf_tr.AtestTradefedTestRunner,
+                       '_generate_metrics_folder')
+    @mock.patch('atest_utils.get_result_server_args')
+    def test_generate_run_commands_has_instant_app_config(
+        self, mock_resultargs, mock_mertrics, _, _mock_has_config):
+        """Test generate_run_command method which has instant app config."""
+        # Basic Run Cmd
+        mock_resultargs.return_value = []
+        mock_mertrics.return_value = ''
+        extra_tf_arg = (
+            '{tf_test_arg} {tf_class}:{option_name}:{option_value}'.format(
+            tf_test_arg=constants.TF_TEST_ARG,
+            tf_class=constants.TF_AND_JUNIT_CLASS,
+            option_name=constants.TF_EXCLUDE_ANNOTATE,
+            option_value=constants.INSTANT_MODE_ANNOTATE))
+        unittest_utils.assert_strict_equal(
+            self,
+            self.tr.generate_run_commands([], {}),
+            [RUN_CMD.format(env=RUN_ENV_STR,
+                            metrics='',
+                            serial='',
+                            tf_customize_template='',
+                            device_early_release=' --no-early-device-release '
+                                                 + extra_tf_arg)])
+
+    @mock.patch.object(atest_utils, 'get_config_parameter')
+    @mock.patch.object(test_finder_utils, 'get_test_config_and_srcs')
+    def test_is_parameter_auto_enabled_cfg(self, mock_config, mock_cfg_para):
+        """test _is_parameter_auto_enabled_cfg method."""
+        # Test if TF_PARA_INSTANT_APP is match
+        mock_config.return_value = 'test_config', ''
+        mock_cfg_para.return_value = {list(constants.DEFAULT_EXCLUDE_PARAS)[1],
+                                      list(constants.DEFAULT_EXCLUDE_PARAS)[0]}
+        self.assertFalse(
+            atf_tr.AtestTradefedTestRunner._is_parameter_auto_enabled_cfg(
+                ['test_info'], 'module_info_obj'))
+        # Test if DEFAULT_EXCLUDE_NOT_PARAS is matched.
+        mock_cfg_para.return_value = {
+            list(constants.DEFAULT_EXCLUDE_NOT_PARAS)[2],
+            list(constants.DEFAULT_EXCLUDE_NOT_PARAS)[0]}
+        self.assertFalse(
+            atf_tr.AtestTradefedTestRunner._is_parameter_auto_enabled_cfg(
+                ['test_info'], 'module_info_obj'))
+        # Test if there is a parameter not in the default exclude paras.
+        mock_cfg_para.return_value = {
+            'not match parameter',
+            list(constants.DEFAULT_EXCLUDE_PARAS)[1],
+            list(constants.DEFAULT_EXCLUDE_NOT_PARAS)[2]}
+        self.assertTrue(
+            atf_tr.AtestTradefedTestRunner._is_parameter_auto_enabled_cfg(
+                ['test_info'], 'module_info_obj'))
+
+    @mock.patch.object(atf_tr.AtestTradefedTestRunner,
+                       '_is_parameter_auto_enabled_cfg',
+                       return_value=True)
+    @mock.patch.object(test_finder_utils, 'get_test_config_and_srcs')
+    def test_create_test_args_with_auto_enable_parameter(
+        self, mock_config, _mock_is_enable):
+        """Test _create_test_args method with auto enabled parameter config."""
+        # Should have the module filter (TF_MODULE_FILTER) in args, but not
+        # --include-filter.
+        mock_config.return_value = '', ''
+        args = self.tr._create_test_args([MOD_INFO])
+        self.assertTrue(constants.TF_MODULE_FILTER in args)
+        self.assertFalse(constants.TF_INCLUDE_FILTER in args)
+
+    @mock.patch.object(atf_tr.AtestTradefedTestRunner,
+                       '_is_parameter_auto_enabled_cfg')
+    @mock.patch.object(test_finder_utils, 'get_test_config_and_srcs')
+    def test_parse_extra_args(self, mock_config, _mock_is_enable):
+        """Test _parse_extra_args ."""
+        # If extra_arg enable instant_app or secondary users, should not have
+        # --exclude-module-rameters even though test config parameter is auto
+        # enabled.
+        mock_config.return_value = '', ''
+        _mock_is_enable.return_value = True
+        args, _ = self.tr._parse_extra_args([MOD_INFO], [constants.INSTANT])
+        self.assertFalse('--exclude-module-parameters' in args)
+
+        # If extra_args does not enable instant_app or secondary users, there
+        # should be --exclude-module-parameters if the config parameter is
+        # auto enabled.
+        _mock_is_enable.return_value = True
+        args, _ = self.tr._parse_extra_args([MOD_INFO], [constants.ALL_ABI])
+        self.assertTrue('--exclude-module-parameters' in args)
+
+        # If extra_args does not enable instant_app or secondary users, there
+        # should be no --exclude-module-parameters if the config parameter is
+        # not auto enabled.
+        _mock_is_enable.return_value = False
+        args, _ = self.tr._parse_extra_args([MOD_INFO], [constants.ALL_ABI])
+        self.assertFalse('--exclude-module-parameters' in args)
 
 if __name__ == '__main__':
     unittest.main()
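For orientation, a minimal sketch of how the new {device_early_release} placeholder in RUN_CMD_ARGS gets filled in; the values below mirror the tests above rather than the runner itself:

    RUN_CMD_ARGS = ('{metrics}--log-level-display VERBOSE --log-level VERBOSE'
                    '{device_early_release}{serial}')

    # Default behavior: the device is held until the invocation ends.
    print(RUN_CMD_ARGS.format(metrics='', serial='',
                              device_early_release=' --no-early-device-release'))

    # With constants.TF_EARLY_DEVICE_RELEASE in extra_args, the flag is dropped.
    print(RUN_CMD_ARGS.format(metrics='', serial='', device_early_release=''))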
diff --git a/atest/test_runners/event_handler.py b/atest/test_runners/event_handler.py
index 9cd216e..499a9e9 100644
--- a/atest/test_runners/event_handler.py
+++ b/atest/test_runners/event_handler.py
@@ -25,6 +25,7 @@
 from datetime import timedelta
 
 import atest_execution_info
+import result_reporter
 
 from test_runners import test_runner_base
 
@@ -81,6 +82,8 @@
         self.runner_name = name
         self.state = CONNECTION_STATE.copy()
         self.event_stack = deque()
+        self.log_associations = []
+        self.run_num = 0
 
     def _module_started(self, event_data):
         if atest_execution_info.PREPARE_END_TIME is None:
@@ -91,6 +94,7 @@
 
     def _run_started(self, event_data):
         # Technically there can be more than one run per module.
+        self.run_num = event_data.get('runAttempt', 0)
         self.state['test_run_name'] = event_data.setdefault('runName', '')
         self.state['current_group_total'] = event_data['testCount']
         self.state['test_count'] = 0
@@ -150,8 +154,12 @@
             additional_info={},
             test_run_name=self.state['test_run_name']))
 
+    # pylint: disable=unused-argument
     def _run_ended(self, event_data):
-        pass
+        # Renew the ResultReporter if it is module level (reporter.silent=False).
+        if not self.reporter.silent:
+            self.reporter.set_current_summary(self.run_num)
+            self.reporter = result_reporter.ResultReporter(silent=False)
 
     def _module_ended(self, event_data):
         pass
@@ -200,7 +208,8 @@
             test_run_name=self.state['test_run_name']))
 
     def _log_association(self, event_data):
-        pass
+        event_data.setdefault('time', time.time())
+        self.log_associations.append(event_data)
 
     switch_handler = {EVENT_NAMES['module_started']: _module_started,
                       EVENT_NAMES['run_started']: _run_started,
diff --git a/atest/test_runners/robolectric_test_runner.py b/atest/test_runners/robolectric_test_runner.py
index 103a7ef..302c9c5 100644
--- a/atest/test_runners/robolectric_test_runner.py
+++ b/atest/test_runners/robolectric_test_runner.py
@@ -56,8 +56,12 @@
     # pylint: disable=useless-super-delegation
     def __init__(self, results_dir, **kwargs):
         """Init stuff for robolectric runner class."""
-        super(RobolectricTestRunner, self).__init__(results_dir, **kwargs)
-        self.is_verbose = logging.getLogger().isEnabledFor(logging.DEBUG)
+        super().__init__(results_dir, **kwargs)
+        # TODO: Roll back once a solution to b/183335046 is found.
+        if not os.getenv(test_runner_base.OLD_OUTPUT_ENV_VAR):
+            self.is_verbose = True
+        else:
+            self.is_verbose = logging.getLogger().isEnabledFor(logging.DEBUG)
 
     def run_tests(self, test_infos, extra_args, reporter):
         """Run the list of test_infos. See base class for more.
@@ -70,9 +74,10 @@
         Returns:
             0 if tests succeed, non-zero otherwise.
         """
+        # TODO: Roll back once a solution to b/183335046 is found.
         if os.getenv(test_runner_base.OLD_OUTPUT_ENV_VAR):
-            return self.run_tests_raw(test_infos, extra_args, reporter)
-        return self.run_tests_pretty(test_infos, extra_args, reporter)
+            return self.run_tests_pretty(test_infos, extra_args, reporter)
+        return self.run_tests_raw(test_infos, extra_args, reporter)
 
     def run_tests_raw(self, test_infos, extra_args, reporter):
         """Run the list of test_infos with raw output.
diff --git a/atest/test_runners/robolectric_test_runner_unittest.py b/atest/test_runners/robolectric_test_runner_unittest.py
index e036aa4..0edd061 100755
--- a/atest/test_runners/robolectric_test_runner_unittest.py
+++ b/atest/test_runners/robolectric_test_runner_unittest.py
@@ -19,9 +19,10 @@
 # pylint: disable=line-too-long
 
 import json
-import unittest
+import platform
 import subprocess
 import tempfile
+import unittest
 
 from unittest import mock
 
@@ -94,7 +95,10 @@
         self.suite_tr. _exec_with_robo_polling(event_file, robo_proc, mock_pe)
         calls = [mock.call.process_event(event_name,
                                          json.loads(event1 + event2))]
-        mock_pe.assert_has_calls(calls)
+        # (b/147569951) subprocessing 'echo' behaves differently between
+        # Linux and Darwin, so only assert the calls on Linux.
+        if platform.system() == 'Linux':
+            mock_pe.assert_has_calls(calls)
 
     @mock.patch.object(event_handler.EventHandler, 'process_event')
     def test_exec_with_robo_polling_with_fail_stacktrace(self, mock_pe):
diff --git a/atest/test_runners/suite_plan_test_runner.py b/atest/test_runners/suite_plan_test_runner.py
index 948a349..aa686d1 100644
--- a/atest/test_runners/suite_plan_test_runner.py
+++ b/atest/test_runners/suite_plan_test_runner.py
@@ -19,9 +19,9 @@
 import copy
 import logging
 
-import atest_utils
 import constants
 
+from metrics import metrics
 from test_runners import atest_tf_test_runner
 
 class SuitePlanTestRunner(atest_tf_test_runner.AtestTradefedTestRunner):
@@ -32,7 +32,7 @@
 
     def __init__(self, results_dir, **kwargs):
         """Init stuff for suite tradefed runner class."""
-        super(SuitePlanTestRunner, self).__init__(results_dir, **kwargs)
+        super().__init__(results_dir, **kwargs)
         self.run_cmd_dict = {'exe': '',
                              'test': '',
                              'args': ''}
@@ -44,8 +44,7 @@
             Set of build targets.
         """
         build_req = set()
-        build_req |= super(SuitePlanTestRunner,
-                           self).get_test_runner_build_reqs()
+        build_req |= super().get_test_runner_build_reqs()
         return build_req
 
     def run_tests(self, test_infos, extra_args, reporter):
@@ -62,8 +61,7 @@
         run_cmds = self.generate_run_commands(test_infos, extra_args)
         ret_code = constants.EXIT_CODE_SUCCESS
         for run_cmd in run_cmds:
-            proc = super(SuitePlanTestRunner, self).run(run_cmd,
-                                                        output_to_stdout=True)
+            proc = super().run(run_cmd, output_to_stdout=True)
             ret_code |= self.wait_for_subprocess(proc)
         return ret_code
 
@@ -114,11 +112,19 @@
         cmds = []
         args = []
         args.extend(self._parse_extra_args(extra_args))
-        args.extend(atest_utils.get_result_server_args())
+        # TODO(b/183069337): Enable result server args after the suite is ready.
+        #args.extend(atest_utils.get_result_server_args())
         for test_info in test_infos:
             cmd_dict = copy.deepcopy(self.run_cmd_dict)
             cmd_dict['test'] = test_info.test_name
             cmd_dict['args'] = ' '.join(args)
             cmd_dict['exe'] = self.EXECUTABLE % test_info.suite
             cmds.append(self._RUN_CMD.format(**cmd_dict))
+            if constants.DETECT_TYPE_XTS_SUITE:
+                xts_detect_type = constants.DETECT_TYPE_XTS_SUITE.get(
+                    test_info.suite, '')
+                if xts_detect_type:
+                    metrics.LocalDetectEvent(
+                        detect_type=xts_detect_type,
+                        result=1)
         return cmds
diff --git a/atest/tf_proto/Android.bp b/atest/tf_proto/Android.bp
index 3756212..9c302dd 100644
--- a/atest/tf_proto/Android.bp
+++ b/atest/tf_proto/Android.bp
@@ -13,6 +13,10 @@
 // limitations under the License.
 
 // This is a copy of the proto from Tradefed at tools/tradefederation/core/proto
+package {
+    default_applicable_licenses: ["Android-Apache-2.0"],
+}
+
 python_library_host {
     name: "tradefed-protos-py",
     pkg_path: "atest",
diff --git a/atest/tools/Android.bp b/atest/tools/Android.bp
index b1f383b..1f33e92 100644
--- a/atest/tools/Android.bp
+++ b/atest/tools/Android.bp
@@ -13,6 +13,10 @@
 // limitations under the License.
 
 // This is a copy of the proto from Tradefed at tools/tradefederation/core/proto
+package {
+    default_applicable_licenses: ["Android-Apache-2.0"],
+}
+
 python_library_host {
     name: "metrics-protos",
     pkg_path: "tools",
diff --git a/atest/tools/atest_tools.py b/atest/tools/atest_tools.py
index d424e85..899301d 100755
--- a/atest/tools/atest_tools.py
+++ b/atest/tools/atest_tools.py
@@ -19,13 +19,16 @@
 
 from __future__ import print_function
 
+import json
 import logging
 import os
 import pickle
 import shutil
 import subprocess
 import sys
+import time
 
+import atest_utils as au
 import constants
 import module_info
 
@@ -35,6 +38,7 @@
 MAC_UPDB_DST = os.path.join(os.getenv(constants.ANDROID_HOST_OUT, ''), 'bin')
 UPDATEDB = 'updatedb'
 LOCATE = 'locate'
+ACLOUD_DURATION = 'duration'
 SEARCH_TOP = os.getenv(constants.ANDROID_BUILD_TOP, '')
 MACOSX = 'Darwin'
 OSNAME = os.uname()[0]
@@ -83,6 +87,27 @@
         if os.path.isfile(index):
             os.remove(index)
 
+def get_report_file(results_dir, acloud_args):
+    """Get the acloud report file path.
+
+    This method can parse either of the following strings:
+        --acloud-create '--report-file=/tmp/acloud.json'
+        --acloud-create '--report-file /tmp/acloud.json'
+    and return '/tmp/acloud.json' as the report file. Otherwise, it returns the
+    default path (/tmp/atest_result/<hashed_dir>/acloud_status.json).
+
+    Args:
+        results_dir: string of directory to store atest results.
+        acloud_args: A string of acloud create arguments.
+
+    Returns:
+        A string path of acloud report file.
+    """
+    match = constants.ACLOUD_REPORT_FILE_RE.match(acloud_args)
+    if match:
+        return match.group('report_file')
+    return os.path.join(results_dir, 'acloud_status.json')
+
 def has_command(cmd):
     """Detect if the command is available in PATH.
 
@@ -125,7 +150,8 @@
     try:
         full_env_vars = os.environ.copy()
         logging.debug('Executing: %s', updatedb_cmd)
-        subprocess.check_call(updatedb_cmd, env=full_env_vars)
+        if subprocess.check_call(updatedb_cmd, env=full_env_vars) == 0:
+            au.save_md5([constants.LOCATE_CACHE], constants.LOCATE_CACHE_MD5)
     except (KeyboardInterrupt, SystemExit):
         logging.error('Process interrupted or failure.')
 
@@ -205,7 +231,13 @@
         index: A string path of the index file.
     """
     logging.debug('indexing testable modules.')
-    testable_modules = module_info.ModuleInfo().get_testable_modules()
+    try:
+        # b/178559543: module-info.json becoming invalid after a successful build
+        # is unlikely, but wrap it in a try-except to guard against it anyway.
+        testable_modules = module_info.ModuleInfo().get_testable_modules()
+    except json.JSONDecodeError:
+        logging.error('Invalid module-info.json detected. Will not index modules.')
+        return
     with open(index, 'wb') as cache:
         try:
             pickle.dump(testable_modules, cache, protocol=2)
@@ -352,6 +384,94 @@
             logging.error(err.output)
         _delete_indexes()
 
+# pylint: disable=consider-using-with
+# TODO: b/187122993 refine subprocess with 'with-statement' in fixit week.
+def acloud_create(report_file, args="", no_metrics_notice=True):
+    """Method which runs acloud create with specified args in background.
+
+    Args:
+        report_file: A path string of acloud report file.
+        args: A string of arguments.
+        no_metrics_notice: Boolean whether sending data to metrics or not.
+    """
+    notice = constants.NO_METRICS_ARG if no_metrics_notice else ""
+    match = constants.ACLOUD_REPORT_FILE_RE.match(args)
+    report_file_arg = '--report-file={}'.format(report_file) if not match else ""
+    # (b/161759557) Assume yes for acloud create to streamline atest flow.
+    acloud_cmd = ('acloud create -y {ACLOUD_ARGS} '
+                  '{REPORT_FILE_ARG} '
+                  '{METRICS_NOTICE} '
+                  ).format(ACLOUD_ARGS=args,
+                           REPORT_FILE_ARG=report_file_arg,
+                           METRICS_NOTICE=notice)
+    au.colorful_print("\nCreating AVD via acloud...", constants.CYAN)
+    logging.debug('Executing: %s', acloud_cmd)
+    start = time.time()
+    proc = subprocess.Popen(acloud_cmd, shell=True)
+    proc.communicate()
+    acloud_duration = time.time() - start
+    logging.info('"acloud create" process has completed.')
+    # Insert acloud create duration into the report file.
+    if au.is_valid_json_file(report_file):
+        try:
+            with open(report_file, 'r') as _rfile:
+                result = json.load(_rfile)
+            result[ACLOUD_DURATION] = acloud_duration
+            with open(report_file, 'w+') as _wfile:
+                _wfile.write(json.dumps(result))
+        except OSError as e:
+            logging.error("Failed dumping duration to the report file: %s", str(e))
+
+def probe_acloud_status(report_file):
+    """Method which probes the 'acloud create' result status.
+
+    If the report file exists and the status is 'SUCCESS', then the creation is
+    successful.
+
+    Args:
+        report_file: A path string of acloud report file.
+
+    Returns:
+        0: success.
+        8: acloud creation failure.
+        9: invalid acloud create arguments.
+    """
+    # 1. Created but the status is not 'SUCCESS'
+    if os.path.exists(report_file):
+        if not au.is_valid_json_file(report_file):
+            return constants.EXIT_CODE_AVD_CREATE_FAILURE
+        with open(report_file, 'r') as rfile:
+            result = json.load(rfile)
+
+        if result.get('status') == 'SUCCESS':
+            logging.info('acloud create succeeded!')
+            # Always fetch the adb of the first created AVD.
+            adb_port = result.get('data').get('devices')[0].get('adb_port')
+            os.environ[constants.ANDROID_SERIAL] = '127.0.0.1:{}'.format(adb_port)
+            return constants.EXIT_CODE_SUCCESS
+        au.colorful_print(
+            'acloud create failed. Please check\n{}\nfor details'.format(
+                report_file), constants.RED)
+        return constants.EXIT_CODE_AVD_CREATE_FAILURE
+
+    # 2. Failed to create because of invalid acloud arguments.
+    logging.error('Invalid acloud arguments found!')
+    return constants.EXIT_CODE_AVD_INVALID_ARGS
+
+def get_acloud_duration(report_file):
+    """Method which gets the duration of 'acloud create' from a report file.
+
+    Args:
+        report_file: A path string of acloud report file.
+
+    Returns:
+        A float of seconds that acloud create took.
+    """
+    if not au.is_valid_json_file(report_file):
+        return 0
+    with open(report_file, 'r') as rfile:
+        return json.load(rfile).get(ACLOUD_DURATION, 0)
+
 
 if __name__ == '__main__':
     if not os.getenv(constants.ANDROID_HOST_OUT, ''):
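Putting the acloud helpers above together, a hedged usage sketch; the acloud arguments, results dir, and report path below are illustrative only:

    import sys

    import constants
    from tools import atest_tools

    # Illustrative inputs; '--local-instance' and the paths are made up.
    results_dir = '/tmp/atest_result/example_run'
    acloud_args = '--local-instance --report-file=/tmp/my_acloud_report.json'

    # get_report_file() honors an explicit --report-file argument and otherwise
    # falls back to <results_dir>/acloud_status.json.
    report_file = atest_tools.get_report_file(results_dir, acloud_args)

    atest_tools.acloud_create(report_file, args=acloud_args, no_metrics_notice=True)

    status = atest_tools.probe_acloud_status(report_file)
    if status != constants.EXIT_CODE_SUCCESS:
        sys.exit(status)
    print('acloud create took %.1f seconds'
          % atest_tools.get_acloud_duration(report_file))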
diff --git a/atest/tools/atest_tools_unittest.py b/atest/tools/atest_tools_unittest.py
index a6ef31c..fc26785 100755
--- a/atest/tools/atest_tools_unittest.py
+++ b/atest/tools/atest_tools_unittest.py
@@ -26,6 +26,7 @@
 
 from unittest import mock
 
+import constants
 import unittest_constants as uc
 
 from tools import atest_tools
@@ -106,5 +107,55 @@
             self.assertEqual(atest_tools.has_command(UPDATEDB), False)
             self.assertEqual(atest_tools.has_command(LOCATE), False)
 
+    def test_get_report_file(self):
+        """Test method get_report_file."""
+        report_file = '/tmp/acloud_status.json'
+
+        arg_with_equal = '-a --report-file={} --all'.format(report_file)
+        self.assertEqual(atest_tools.get_report_file('/abc', arg_with_equal),
+                         report_file)
+
+        arg_with_equal = '-b --report_file={} --ball'.format(report_file)
+        self.assertEqual(atest_tools.get_report_file('/abc', arg_with_equal),
+                         report_file)
+
+        arg_without_equal = '-c --report-file {} --call'.format(report_file)
+        self.assertEqual(atest_tools.get_report_file('/abc', arg_without_equal),
+                         report_file)
+
+        arg_without_equal = '-d --report_file {} --dall'.format(report_file)
+        self.assertEqual(atest_tools.get_report_file('/abc', arg_without_equal),
+                         report_file)
+
+        arg_without_report = '-e --build-id 1234567'
+        self.assertEqual(atest_tools.get_report_file('/tmp', arg_without_report),
+                         report_file)
+
+    def test_probe_acloud_status(self):
+        """Test method prob_acloud_status."""
+        success = os.path.join(SEARCH_ROOT, 'acloud', 'create_success.json')
+        self.assertEqual(atest_tools.probe_acloud_status(success),
+                         constants.EXIT_CODE_SUCCESS)
+
+        failure = os.path.join(SEARCH_ROOT, 'acloud', 'create_failure.json')
+        self.assertEqual(atest_tools.probe_acloud_status(failure),
+                         constants.EXIT_CODE_AVD_CREATE_FAILURE)
+
+        inexistence = os.path.join(SEARCH_ROOT, 'acloud', 'inexistence.json')
+        self.assertEqual(atest_tools.probe_acloud_status(inexistence),
+                         constants.EXIT_CODE_AVD_INVALID_ARGS)
+
+    def test_get_acloud_duration(self):
+        """Test method get_acloud_duration."""
+        success = os.path.join(SEARCH_ROOT, 'acloud', 'create_success.json')
+        success_duration = 152.659824
+        self.assertEqual(atest_tools.get_acloud_duration(success),
+                         success_duration)
+
+        failure = os.path.join(SEARCH_ROOT, 'acloud', 'create_failure.json')
+        failure_duration = 178.621254
+        self.assertEqual(atest_tools.get_acloud_duration(failure),
+                         failure_duration)
+
 if __name__ == "__main__":
     unittest.main()
diff --git a/atest/unittest_constants.py b/atest/unittest_constants.py
index 0d162ba..7ecdc5f 100644
--- a/atest/unittest_constants.py
+++ b/atest/unittest_constants.py
@@ -34,6 +34,9 @@
 MODULE_DIR = 'foo/bar/jank'
 MODULE2_DIR = 'foo/bar/hello'
 MODULE_NAME = 'CtsJankDeviceTestCases'
+MODULE_CONFIG_NAME = 'CtsJankDeviceTestCases2'
+HOST_UNIT_TEST_NAME_1 = 'host_unit_test1'
+HOST_UNIT_TEST_NAME_2 = 'host_unit_test2'
 TYPO_MODULE_NAME = 'CtsJankDeviceTestCase'
 MODULE2_NAME = 'HelloWorldTests'
 CLASS_NAME = 'CtsDeviceJankUi'
@@ -51,25 +54,55 @@
 GTF_INT_DIR = 'gtf/core/res/config'
 
 CONFIG_FILE = os.path.join(MODULE_DIR, constants.MODULE_CONFIG)
+EXTRA_CONFIG_FILE = os.path.join(MODULE_DIR, MODULE_CONFIG_NAME + '.xml')
 CONFIG2_FILE = os.path.join(MODULE2_DIR, constants.MODULE_CONFIG)
 JSON_FILE = 'module-info.json'
 MODULE_INFO_TARGET = '/out/%s' % JSON_FILE
 MODULE_BUILD_TARGETS = {'tradefed-core', MODULE_INFO_TARGET,
                         'MODULES-IN-%s' % MODULE_DIR.replace('/', '-'),
                         'module-specific-target'}
+MODULE_BUILD_TARGETS_W_DALVIK = (MODULE_BUILD_TARGETS |
+                                 {'cts-dalvik-device-test-runner',
+                                  'cts-dalvik-host-test-runner',
+                                  'cts-tradefed'})
 MODULE_BUILD_TARGETS2 = {'build-target2'}
 MODULE_DATA = {constants.TI_REL_CONFIG: CONFIG_FILE,
                constants.TI_FILTER: frozenset()}
 MODULE_DATA2 = {constants.TI_REL_CONFIG: CONFIG_FILE,
                 constants.TI_FILTER: frozenset()}
+MODULE_DATA_W_CONFIG = {constants.TI_REL_CONFIG: EXTRA_CONFIG_FILE,
+                        constants.TI_FILTER: frozenset()}
 MODULE_INFO = test_info.TestInfo(MODULE_NAME,
                                  atf_tr.AtestTradefedTestRunner.NAME,
                                  MODULE_BUILD_TARGETS,
                                  MODULE_DATA)
+MODULE_INFO_W_DALVIK = test_info.TestInfo(
+    MODULE_NAME,
+    atf_tr.AtestTradefedTestRunner.NAME,
+    MODULE_BUILD_TARGETS_W_DALVIK,
+    MODULE_DATA,
+    module_class=[constants.MODULE_CLASS_JAVA_LIBRARIES])
+MODULE_INFO_W_CONFIG = test_info.TestInfo(MODULE_CONFIG_NAME,
+                                          atf_tr.AtestTradefedTestRunner.NAME,
+                                          MODULE_BUILD_TARGETS,
+                                          MODULE_DATA_W_CONFIG)
 MODULE_INFO2 = test_info.TestInfo(MODULE2_NAME,
                                   atf_tr.AtestTradefedTestRunner.NAME,
                                   MODULE_BUILD_TARGETS2,
                                   MODULE_DATA2)
+TEST_CONFIG_MODULE_INFO = test_info.TestInfo(
+    MODULE_CONFIG_NAME,
+    atf_tr.AtestTradefedTestRunner.NAME,
+    MODULE_BUILD_TARGETS,
+    MODULE_DATA_W_CONFIG)
+MODULE_INFO_HOST_1 = test_info.TestInfo(HOST_UNIT_TEST_NAME_1,
+                                        atf_tr.AtestTradefedTestRunner.NAME,
+                                        MODULE_BUILD_TARGETS,
+                                        MODULE_DATA)
+MODULE_INFO_HOST_2 = test_info.TestInfo(HOST_UNIT_TEST_NAME_2,
+                                        atf_tr.AtestTradefedTestRunner.NAME,
+                                        MODULE_BUILD_TARGETS,
+                                        MODULE_DATA)
 MODULE_INFOS = [MODULE_INFO]
 MODULE_INFOS2 = [MODULE_INFO, MODULE_INFO2]
 CLASS_FILTER = test_info.TestFilter(FULL_CLASS_NAME, frozenset())
@@ -134,6 +167,23 @@
                               set(),
                               data={constants.TI_REL_CONFIG: INT_CONFIG,
                                     constants.TI_FILTER: frozenset()})
+# Golden sample test filter for method under parameterized java.
+PARAMETERIZED_METHOD_FILTER = test_info.TestFilter(
+    FULL_CLASS_NAME, frozenset([METHOD_NAME + '*']))
+PARAMETERIZED_METHOD_INFO = test_info.TestInfo(
+    MODULE_NAME,
+    atf_tr.AtestTradefedTestRunner.NAME,
+    MODULE_BUILD_TARGETS,
+    data={constants.TI_FILTER: frozenset([PARAMETERIZED_METHOD_FILTER]),
+          constants.TI_REL_CONFIG: CONFIG_FILE})
+PARAMETERIZED_FLAT_METHOD_FILTER = test_info.TestFilter(
+    FULL_CLASS_NAME, frozenset([METHOD_NAME + '*', METHOD2_NAME + '*']))
+PARAMETERIZED_FLAT_METHOD_INFO = test_info.TestInfo(
+    MODULE_NAME,
+    atf_tr.AtestTradefedTestRunner.NAME,
+    MODULE_BUILD_TARGETS,
+    data={constants.TI_FILTER: frozenset([PARAMETERIZED_FLAT_METHOD_FILTER]),
+          constants.TI_REL_CONFIG: CONFIG_FILE})
 GTF_INT_INFO = test_info.TestInfo(
     GTF_INT_NAME,
     atf_tr.AtestTradefedTestRunner.NAME,
@@ -251,3 +301,11 @@
 
 # TF's log dir
 TEST_INFO_DIR = '/tmp/atest_run_1510085893_pi_Nbi'
+
+# Constants for get_test_config unit tests.
+ANDTEST_CONFIG_PATH = 'my/android/config/path'
+SINGLE_CONFIG_PATH = 'my/single/config/path'
+MULTIPLE_CONFIG_PATH = 'my/multiple/config/path'
+MAIN_CONFIG_NAME = 'main_test_config.xml'
+SINGLE_CONFIG_NAME = 'test_config.xml'
+SUB_CONFIG_NAME_2 = 'Multiple2.xml'
diff --git a/atest/unittest_data/AndroidDalvikTest.xml.data b/atest/unittest_data/AndroidDalvikTest.xml.data
new file mode 100644
index 0000000..654b6ee
--- /dev/null
+++ b/atest/unittest_data/AndroidDalvikTest.xml.data
@@ -0,0 +1,19 @@
+<configuration description="Config for CTS Jank test cases">
+  <option name="test-suite-tag" value="cts" />
+  <option name="not-shardable" value="true" />
+  <option name="config-descriptor:metadata" key="component" value="graphics" />
+  <target_preparer class="com.android.tradefed.targetprep.suite.SuiteApkInstaller">
+    <option name="cleanup-apks" value="true" />
+    <option name="test-file-name" value="CtsJankDeviceTestCases.apk" />
+    <option name="test-file-name" value="is_not_module.apk" />
+    <option name="push" value="GtsEmptyTestApp.apk->/data/local/tmp/gts/packageinstaller/GtsEmptyTestApp.apk" />
+  </target_preparer>
+  <include name="CtsUiDeviceTestCases"/>
+  <test class="com.android.tradefed.testtype.AndroidJUnitTest" >
+    <option name="package" value="android.jank.cts" />
+    <option name="runtime-hint" value="11m20s" />
+  </test>
+  <option name="perf_arg" value="perf-setup.sh" />
+  <test class="com.android.compatibility.class.for.test" />
+  <test class="com.android.compatibility.testtype.DalvikTest" />
+</configuration>
diff --git a/atest/unittest_data/AndroidLibCoreTest.xml.data b/atest/unittest_data/AndroidLibCoreTest.xml.data
new file mode 100644
index 0000000..718d439
--- /dev/null
+++ b/atest/unittest_data/AndroidLibCoreTest.xml.data
@@ -0,0 +1,15 @@
+<configuration description="Config for CTS Jank test cases">
+  <option name="test-suite-tag" value="cts" />
+  <option name="not-shardable" value="true" />
+  <option name="config-descriptor:metadata" key="component" value="graphics" />
+  <target_preparer class="com.android.tradefed.targetprep.suite.SuiteApkInstaller">
+    <option name="cleanup-apks" value="true" />
+    <option name="test-file-name" value="CtsJankDeviceTestCases.apk" />
+    <option name="test-file-name" value="is_not_module.apk" />
+    <option name="push" value="GtsEmptyTestApp.apk->/data/local/tmp/gts/packageinstaller/GtsEmptyTestApp.apk" />
+  </target_preparer>
+  <include name="CtsUiDeviceTestCases"/>
+  <option name="perf_arg" value="perf-setup.sh" />
+  <test class="com.android.compatibility.class.for.test" />
+  <test class="com.android.compatibility.testtype.LibcoreTest" />
+</configuration>
diff --git a/atest/unittest_data/AndroidTest.xml b/atest/unittest_data/AndroidTest.xml
index 431eafc..e69de29 100644
--- a/atest/unittest_data/AndroidTest.xml
+++ b/atest/unittest_data/AndroidTest.xml
@@ -1,18 +0,0 @@
-<configuration description="Config for CTS Jank test cases">
-  <option name="test-suite-tag" value="cts" />
-  <option name="not-shardable" value="true" />
-  <option name="config-descriptor:metadata" key="component" value="graphics" />
-  <target_preparer class="com.android.tradefed.targetprep.suite.SuiteApkInstaller">
-    <option name="cleanup-apks" value="true" />
-    <option name="test-file-name" value="CtsJankDeviceTestCases.apk" />
-    <option name="test-file-name" value="is_not_module.apk" />
-    <option name="push" value="GtsEmptyTestApp.apk->/data/local/tmp/gts/packageinstaller/GtsEmptyTestApp.apk" />
-  </target_preparer>
-  <include name="CtsUiDeviceTestCases"/>
-  <test class="com.android.tradefed.testtype.AndroidJUnitTest" >
-    <option name="package" value="android.jank.cts" />
-    <option name="runtime-hint" value="11m20s" />
-  </test>
-  <option name="perf_arg" value="perf-setup.sh" />
-  <test class="com.android.compatibility.class.for.test" />
-</configuration>
diff --git a/atest/unittest_data/AndroidTest.xml.data b/atest/unittest_data/AndroidTest.xml.data
new file mode 100644
index 0000000..431eafc
--- /dev/null
+++ b/atest/unittest_data/AndroidTest.xml.data
@@ -0,0 +1,18 @@
+<configuration description="Config for CTS Jank test cases">
+  <option name="test-suite-tag" value="cts" />
+  <option name="not-shardable" value="true" />
+  <option name="config-descriptor:metadata" key="component" value="graphics" />
+  <target_preparer class="com.android.tradefed.targetprep.suite.SuiteApkInstaller">
+    <option name="cleanup-apks" value="true" />
+    <option name="test-file-name" value="CtsJankDeviceTestCases.apk" />
+    <option name="test-file-name" value="is_not_module.apk" />
+    <option name="push" value="GtsEmptyTestApp.apk->/data/local/tmp/gts/packageinstaller/GtsEmptyTestApp.apk" />
+  </target_preparer>
+  <include name="CtsUiDeviceTestCases"/>
+  <test class="com.android.tradefed.testtype.AndroidJUnitTest" >
+    <option name="package" value="android.jank.cts" />
+    <option name="runtime-hint" value="11m20s" />
+  </test>
+  <option name="perf_arg" value="perf-setup.sh" />
+  <test class="com.android.compatibility.class.for.test" />
+</configuration>
diff --git a/atest/unittest_data/CtsUiDeviceTestCases.xml b/atest/unittest_data/CtsUiDeviceTestCases.xml.data
similarity index 100%
rename from atest/unittest_data/CtsUiDeviceTestCases.xml
rename to atest/unittest_data/CtsUiDeviceTestCases.xml.data
diff --git a/atest/unittest_data/KernelTest.xml b/atest/unittest_data/KernelTest.xml.data
similarity index 100%
rename from atest/unittest_data/KernelTest.xml
rename to atest/unittest_data/KernelTest.xml.data
diff --git a/atest/unittest_data/VtsAndroidTest.xml b/atest/unittest_data/VtsAndroidTest.xml.data
similarity index 100%
rename from atest/unittest_data/VtsAndroidTest.xml
rename to atest/unittest_data/VtsAndroidTest.xml.data
diff --git a/atest/unittest_data/acloud/create_failure.json b/atest/unittest_data/acloud/create_failure.json
new file mode 100644
index 0000000..2921ceb
--- /dev/null
+++ b/atest/unittest_data/acloud/create_failure.json
@@ -0,0 +1,22 @@
+{
+  "command": "create_cf",
+  "data": {
+    "devices": [
+      {
+        "adb_port": 58167,
+        "branch": "aosp-master",
+        "build_id": "6561305",
+        "build_target": "aosp_cf_x86_phone-userdebug",
+        "fetch_artifact_time": 56.8,
+        "gce_create_time": 24.11,
+        "instance_name": "ins-1f40f0db-6561305-aosp-cf-x86-phone-userdebug",
+        "ip": "34.72.216.4",
+        "launch_cvd_time": 69.71,
+        "vnc_port": 40707
+      }
+    ]
+  },
+  "errors": [],
+  "status": "FAILURE",
+  "duration": 178.621254
+}
diff --git a/atest/unittest_data/acloud/create_success.json b/atest/unittest_data/acloud/create_success.json
new file mode 100644
index 0000000..d64ea9f
--- /dev/null
+++ b/atest/unittest_data/acloud/create_success.json
@@ -0,0 +1,22 @@
+{
+  "command": "create_cf",
+  "data": {
+    "devices": [
+      {
+        "adb_port": 58167,
+        "branch": "aosp-master",
+        "build_id": "6561305",
+        "build_target": "aosp_cf_x86_phone-userdebug",
+        "fetch_artifact_time": 56.8,
+        "gce_create_time": 24.11,
+        "instance_name": "ins-1f40f0db-6561305-aosp-cf-x86-phone-userdebug",
+        "ip": "34.72.216.4",
+        "launch_cvd_time": 69.71,
+        "vnc_port": 40707
+      }
+    ]
+  },
+  "errors": [],
+  "status": "SUCCESS",
+  "duration": 152.659824
+}
diff --git a/atest/unittest_data/annotation/sample.txt b/atest/unittest_data/annotation/sample.txt
new file mode 100644
index 0000000..acc8350
--- /dev/null
+++ b/atest/unittest_data/annotation/sample.txt
@@ -0,0 +1,25 @@
+@RunWith(AndroidJUnit4.class)
+public final class SampleTest {
+
+    @Test
+    @TestAnnotation1
+    @Postsubmit(reason="new test")
+    public void annotation1_method1() {
+    }
+
+    @Test
+    @TestAnnotation1
+    public void annotation1_method2() {
+    }
+
+    @Test
+    @TestAnnotation2
+    @Postsubmit(reason="new test")
+    public void annotation2_method1() {
+    }
+
+    @Test
+    @TestAnnotation3
+    public void annotation3_method1() {
+    }
+}
diff --git a/atest/unittest_data/cache_root/78ea54ef315f5613f7c11dd1a87f10c7.cache b/atest/unittest_data/cache_root/78ea54ef315f5613f7c11dd1a87f10c7.cache
index 3b384c7..b196fd3 100644
--- a/atest/unittest_data/cache_root/78ea54ef315f5613f7c11dd1a87f10c7.cache
+++ b/atest/unittest_data/cache_root/78ea54ef315f5613f7c11dd1a87f10c7.cache
Binary files differ
diff --git a/atest/unittest_data/class_file_path_testing/hello_world_test.kt b/atest/unittest_data/class_file_path_testing/hello_world_test.kt
index 623b4a2..ad2775b 100644
--- a/atest/unittest_data/class_file_path_testing/hello_world_test.kt
+++ b/atest/unittest_data/class_file_path_testing/hello_world_test.kt
@@ -1 +1,9 @@
-package com.test.hello_world_test
\ No newline at end of file
+package com.test.hello_world_test
+
+class HelloWorldTest : InstrumentationTestCase() {
+    @Test
+    fun testMethod1() throws Exception {}
+
+    @Test
+    fun testMethod2() throws Exception {}
+}
\ No newline at end of file
diff --git a/atest/unittest_data/filter_configs/filter.cfg b/atest/unittest_data/filter_configs/filter.cfg
new file mode 100644
index 0000000..4aa13f9
--- /dev/null
+++ b/atest/unittest_data/filter_configs/filter.cfg
@@ -0,0 +1,39 @@
+<?xml version="1.0" encoding="utf-8"?>
+<!-- Copyright (C) 2021 The Android Open Source Project
+
+     Licensed under the Apache License, Version 2.0 (the "License");
+     you may not use this file except in compliance with the License.
+     You may obtain a copy of the License at
+
+          http://www.apache.org/licenses/LICENSE-2.0
+
+     Unless required by applicable law or agreed to in writing, software
+     distributed under the License is distributed on an "AS IS" BASIS,
+     WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+     See the License for the specific language governing permissions and
+     limitations under the License.
+-->
+<!-- This test config file is auto-generated. -->
+<configuration description="Runs hello_world_test.">
+    <target_preparer class="com.android.tradefed.targetprep.RootTargetPreparer">
+        <option name="force-root" value="false" />
+    </target_preparer>
+
+    <target_preparer class="com.android.tradefed.targetprep.PushFilePreparer">
+        <option name="cleanup" value="true" />
+        <option name="push" value="hello_world_test->/data/local/tmp/hello_world_test" />
+    </target_preparer>
+
+    <test class="com.android.tradefed.testtype.GTest" >
+        <option name="native-test-device-path" value="/data/local/tmp" />
+        <option name="module-name" value="hello_world_test" />
+    </test>
+
+    <test class="com.android.tradefed.testtype.AndroidJUnitTest" >
+        <option name="include-annotation" value="include1" />
+        <option name="include-annotation" value="include2" />
+        <option name="exclude-annotation" value="exclude1" />
+        <option name="exclude-annotation" value="exclude2" />
+        <option name="hidden-api-checks" value="false" />
+    </test>
+</configuration>
diff --git a/atest/unittest_data/filter_configs/no_filter.cfg b/atest/unittest_data/filter_configs/no_filter.cfg
new file mode 100644
index 0000000..06cb8ef
--- /dev/null
+++ b/atest/unittest_data/filter_configs/no_filter.cfg
@@ -0,0 +1,31 @@
+<?xml version="1.0" encoding="utf-8"?>
+<!-- Copyright (C) 2021 The Android Open Source Project
+
+     Licensed under the Apache License, Version 2.0 (the "License");
+     you may not use this file except in compliance with the License.
+     You may obtain a copy of the License at
+
+          http://www.apache.org/licenses/LICENSE-2.0
+
+     Unless required by applicable law or agreed to in writing, software
+     distributed under the License is distributed on an "AS IS" BASIS,
+     WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+     See the License for the specific language governing permissions and
+     limitations under the License.
+-->
+<!-- This test config file is auto-generated. -->
+<configuration description="Runs hello_world_test.">
+    <target_preparer class="com.android.tradefed.targetprep.RootTargetPreparer">
+        <option name="force-root" value="false" />
+    </target_preparer>
+
+    <target_preparer class="com.android.tradefed.targetprep.PushFilePreparer">
+        <option name="cleanup" value="true" />
+        <option name="push" value="hello_world_test->/data/local/tmp/hello_world_test" />
+    </target_preparer>
+
+    <test class="com.android.tradefed.testtype.GTest" >
+        <option name="native-test-device-path" value="/data/local/tmp" />
+        <option name="module-name" value="hello_world_test" />
+    </test>
+</configuration>
diff --git a/atest/unittest_data/module-info.json b/atest/unittest_data/module-info.json
index 0959fad..462cb60 100644
--- a/atest/unittest_data/module-info.json
+++ b/atest/unittest_data/module-info.json
@@ -15,5 +15,16 @@
   "multiarch2": { "class": ["JAVA_LIBRARIES"],  "path": ["shared/path/to/be/used2"], "tags": ["optional"],  "installed": ["out/host/linux-x86/framework/tradefed-contrib.jar"], "module_name": "multiarch2" },
   "multiarch2_32": { "class": ["JAVA_LIBRARIES"],  "path": ["shared/path/to/be/used2"], "tags": ["optional"],  "installed": ["out/host/linux-x86/framework/tradefed-contrib.jar"], "module_name": "multiarch2" },
   "multiarch3": { "class": ["JAVA_LIBRARIES"],  "path": ["shared/path/to/be/used2"], "tags": ["optional"],  "installed": ["out/host/linux-x86/framework/tradefed-contrib.jar"], "module_name": "multiarch3" },
-  "multiarch3_32": { "class": ["JAVA_LIBRARIES"],  "path": ["shared/path/to/be/used2"], "tags": ["optional"],  "installed": ["out/host/linux-x86/framework/tradefed-contrib.jar"], "module_name": "multiarch3_32" }
+  "multiarch3_32": { "class": ["JAVA_LIBRARIES"],  "path": ["shared/path/to/be/used2"], "tags": ["optional"],  "installed": ["out/host/linux-x86/framework/tradefed-contrib.jar"], "module_name": "multiarch3_32" },
+  "dep_test_module": { "module_name": "dep_test_module", "dependencies": ["module_1", "module_2"] },
+  "module_1": { "module_name": "module_1", "dependencies": [], "installed": ["path1"] },
+  "module_2": { "module_name": "module_2", "dependencies": [] },
+  "module_3": { "module_name": "module_3", "dependencies": [] },
+  "test_dep_level_1_1": { "module_name": "test_dep_level_1_1", "dependencies": [] },
+  "test_dep_level_1_2": { "module_name": "test_dep_level_1_2", "dependencies": [] },
+  "test_dep_level_2_1": { "module_name": "test_dep_level_2_1", "dependencies": [], "installed": ["path2"] },
+  "test_dep_level_2_2": { "module_name": "test_dep_level_2_2", "dependencies": [] },
+  "single_config_module": {"module_name": "single_config_module", "test_config":  ["my/single/config/path/test_config.xml"],  "path": ["my/single/config/path"]},
+  "androidtest_config_module": {"module_name": "androidtest_config_module", "test_config":  [],  "path": ["my/android/config/path"]},
+  "multiple_config_module": {"module_name": "multiple_config_module", "test_config":  ["my/multiple/config/path/main_test_config.xml", "my/multiple/config/path/Multiple1.xml", "my/multiple/config/path/Multiple2.xml"], "path": ["my/multiple/config/path"]}
 }
diff --git a/atest/unittest_data/module_bp_cc_deps.json b/atest/unittest_data/module_bp_cc_deps.json
new file mode 100644
index 0000000..a1b6549
--- /dev/null
+++ b/atest/unittest_data/module_bp_cc_deps.json
@@ -0,0 +1,33 @@
+{
+	"clang": "${ANDROID_ROOT}/prebuilts/clang/host/linux-x86/clang-r399163b/bin/clang",
+	"clang++": "${ANDROID_ROOT}/prebuilts/clang/host/linux-x86/clang-r399163b/bin/clang++",
+	"modules": {
+        "module_cc_1": {
+                "dependencies": [
+                        "test_cc_dep_level_1_1",
+                        "test_cc_dep_level_1_2"
+                ]
+        },
+        "module_cc_2": {
+                "dependencies": [
+                        "test_cc_dep_level_1_2"
+                ]
+        },
+        "module_cc_3": {
+        },
+        "test_cc_dep_level_1_1": {
+                "dependencies": [
+                        "test_cc_dep_level_2_1"
+                ]
+        },
+        "test_cc_dep_level_1_2": {
+                "dependencies": [
+                        "test_cc_dep_level_2_2"
+                ]
+        },
+        "test_cc_dep_level_2_1": {
+        },
+        "test_cc_dep_level_2_2": {
+        }
+  }
+}
diff --git a/atest/unittest_data/module_bp_java_deps.json b/atest/unittest_data/module_bp_java_deps.json
new file mode 100644
index 0000000..72b1839
--- /dev/null
+++ b/atest/unittest_data/module_bp_java_deps.json
@@ -0,0 +1,29 @@
+{
+        "module_1": {
+                "dependencies": [
+                        "test_dep_level_1_1",
+                        "test_dep_level_1_2"
+                ]
+        },
+        "module_2": {
+                "dependencies": [
+                        "test_dep_level_1_2"
+                ]
+        },
+        "module_3": {
+        },
+        "test_dep_level_1_1": {
+                "dependencies": [
+                        "test_dep_level_2_1"
+                ]
+        },
+        "test_dep_level_1_2": {
+                "dependencies": [
+                        "test_dep_level_2_2"
+                ]
+        },
+        "test_dep_level_2_1": {
+        },
+        "test_dep_level_2_2": {
+        }
+}
diff --git a/atest/unittest_data/module_bp_java_loop_deps.json b/atest/unittest_data/module_bp_java_loop_deps.json
new file mode 100644
index 0000000..f74aced
--- /dev/null
+++ b/atest/unittest_data/module_bp_java_loop_deps.json
@@ -0,0 +1,31 @@
+{
+        "module_1": {
+                "dependencies": [
+                        "test_dep_level_1_1",
+                        "test_dep_level_1_2"
+                ]
+        },
+        "module_2": {
+                "dependencies": [
+                        "test_dep_level_1_2"
+                ]
+        },
+        "test_dep_level_1_1": {
+                "dependencies": [
+                        "module_1",
+                        "test_dep_level_2_1"
+                ]
+        },
+        "test_dep_level_1_2": {
+                "dependencies": [
+                        "test_dep_level_2_2"
+                ]
+        },
+        "test_dep_level_2_1": {
+                "dependencies": [
+                        "module_1"
+                ]
+        },
+        "test_dep_level_2_2": {
+        }
+}
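These three dependency fixtures share one shape: a map from module name to an optional "dependencies" list (the C/C++ variant nests the same map under "modules" next to the clang paths), and the loop variant adds a module_1 ↔ test_dep_level_* cycle so any transitive walk has to guard against revisiting nodes. A minimal sketch of that kind of cycle-safe traversal over the fixture, not the actual aidegen/atest implementation; the helper name and the repo-relative path are assumptions:

```python
import json

def collect_transitive_deps(dep_map, module_name):
    """Collect transitive deps from a {module: {"dependencies": [...]}} map."""
    visited = set()
    stack = [module_name]
    while stack:
        current = stack.pop()
        if current in visited:
            continue  # already expanded; this is what breaks the cycle
        visited.add(current)
        stack.extend(dep_map.get(current, {}).get('dependencies', []))
    visited.discard(module_name)
    return visited

with open('atest/unittest_data/module_bp_java_loop_deps.json') as dep_file:
    deps = json.load(dep_file)
print(sorted(collect_transitive_deps(deps, 'module_1')))
# ['test_dep_level_1_1', 'test_dep_level_1_2',
#  'test_dep_level_2_1', 'test_dep_level_2_2']
```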
diff --git a/atest/unittest_data/my_cc_test.cc b/atest/unittest_data/my_cc_test.cc
new file mode 100644
index 0000000..db0047c
--- /dev/null
+++ b/atest/unittest_data/my_cc_test.cc
@@ -0,0 +1,72 @@
+INSTANTIATE_TEST_SUITE_P( Instantiation1, MyInstantClass1,
+    testing::Combine(testing::Values(Options::Language::CPP, Options::Language::JAVA,
+                                     Options::Language::NDK, Options::Language::RUST),
+                     testing::ValuesIn(kTypeParams)),
+    [](const testing::TestParamInfo<std::tuple<Options::Language, TypeParam>>& info) {
+      return Options::LanguageToString(std::get<0>(info.param)) + "_" +
+             std::get<1>(info.param).kind;
+    });
+
+INSTANTIATE_TEST_CASE_P(Instantiation2,
+    MyInstantClass2,
+    testing::Combine(testing::Values(Options::Language::CPP, Options::Language::JAVA,
+                                     Options::Language::NDK, Options::Language::RUST),
+                     testing::ValuesIn(kTypeParams)),
+    [](const testing::TestParamInfo<std::tuple<Options::Language, TypeParam>>& info) {
+      return Options::LanguageToString(std::get<0>(info.param)) + "_" +
+             std::get<1>(info.param).kind;
+    });
+
+INSTANTIATE_TEST_SUITE_P(
+    Instantiation3, MyInstantClass1 ,
+    testing::Combine(testing::Values(Options::Language::CPP, Options::Language::JAVA,
+                                     Options::Language::NDK, Options::Language::RUST),
+                     testing::ValuesIn(kTypeParams)),
+    [](const testing::TestParamInfo<std::tuple<Options::Language, TypeParam>>& info) {
+      return Options::LanguageToString(std::get<0>(info.param)) + "_" +
+             std::get<1>(info.param).kind;
+    });
+
+
+INSTANTIATE_TEST_CASE_P(
+    Instantiation4,
+    MyInstantClass3,
+    testing::Combine(testing::Values(Options::Language::CPP, Options::Language::JAVA,
+                                     Options::Language::NDK, Options::Language::RUST),
+                     testing::ValuesIn(kTypeParams)),
+    [](const testing::TestParamInfo<std::tuple<Options::Language, TypeParam>>& info) {
+      return Options::LanguageToString(std::get<0>(info.param)) + "_" +
+             std::get<1>(info.param).kind;
+    });
+    
+TEST_P( MyClass1, Method1) {
+  Run("List<{}>", kListSupportExpectations);
+}
+
+TEST_F(
+MyClass1, 
+Method2) {
+  Run("List<{}>", kListSupportExpectations);
+}
+
+TEST_P(MyClass2, 
+       Method3) {
+  Run("List<{}>", kListSupportExpectations);
+}
+
+TEST_F(MyClass3, Method2) {
+  Run("List<{}>", kListSupportExpectations);
+}
+
+TEST(MyClass4, Method5) {
+  Run("List<{}>", kListSupportExpectations);
+}
+
+TEST(MyClass5, Method5) {
+  Run("List<{}>", kListSupportExpectations);
+}
+
+INSTANTIATE_TYPED_TEST_CASE_P(Instantiation5, MyInstantTypeClass1, IntTypes);
+
+INSTANTIATE_TYPED_TEST_SUITE_P(Instantiation6, MyInstantTypeClass2, IntTypes);
+
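This fixture intentionally mixes old and new gtest macros (TEST/TEST_F/TEST_P, INSTANTIATE_TEST_CASE_P vs. INSTANTIATE_TEST_SUITE_P, plus the typed variants), stray whitespace, and arguments split across lines, so whatever scans it has to tolerate all of those spellings. A rough sketch of that kind of macro matching; the regexes and helper below are illustrative only, not atest's actual parser:

```python
import re

# Illustrative patterns; a real parser covers more macros (e.g. TYPED_TEST).
_TEST_MACRO = re.compile(
    r'^\s*TEST(?:_F|_P)?\s*\(\s*(?P<suite>\w+)\s*,\s*(?P<name>\w+)\s*\)',
    re.MULTILINE)
_INSTANTIATION = re.compile(
    r'^\s*INSTANTIATE(?:_TYPED)?_TEST_(?:SUITE|CASE)_P'
    r'\s*\(\s*(?P<prefix>\w+)\s*,\s*(?P<suite>\w+)',
    re.MULTILINE)

def find_gtest_cases(path):
    """Return ({suite: set(test names)}, {instantiation prefix: suite})."""
    with open(path, encoding='utf-8') as src:
        content = src.read()
    suites, prefixes = {}, {}
    for match in _TEST_MACRO.finditer(content):
        suites.setdefault(match.group('suite'), set()).add(match.group('name'))
    for match in _INSTANTIATION.finditer(content):
        prefixes[match.group('prefix')] = match.group('suite')
    return suites, prefixes
```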
diff --git a/atest/unittest_data/not-valid-module-info.json b/atest/unittest_data/not-valid-module-info.json
new file mode 100644
index 0000000..ea28ea3
--- /dev/null
+++ b/atest/unittest_data/not-valid-module-info.json
@@ -0,0 +1,3 @@
+{
+  "AmSlam": { "class": ["APPS"],  "path": ["foo/bar/AmSlam"],  "tags": ["tests"],  "installed": ["out/target/product/generic/data/app/AmSlam/AmSlam.apk"], "module_name": "AmSlam"
+}
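The fixture above is deliberately malformed (its braces do not balance), so a strict json.load() raises on it; it exists to exercise how the code under test reacts to a corrupt module-info file. A quick way to see what the fixture does, assuming the path is resolved relative to the asuite repo root:

```python
import json

try:
    with open('atest/unittest_data/not-valid-module-info.json') as mi_file:
        json.load(mi_file)
except json.JSONDecodeError as err:
    print('rejected as expected: %s' % err)
```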
diff --git a/atest/unittest_data/parameter_config/instant_app_parameter.cfg b/atest/unittest_data/parameter_config/instant_app_parameter.cfg
new file mode 100644
index 0000000..923d91b
--- /dev/null
+++ b/atest/unittest_data/parameter_config/instant_app_parameter.cfg
@@ -0,0 +1,19 @@
+<?xml version="1.0" encoding="utf-8"?>
+<!-- Copyright (C) 2021 The Android Open Source Project
+
+     Licensed under the Apache License, Version 2.0 (the "License");
+     you may not use this file except in compliance with the License.
+     You may obtain a copy of the License at
+
+          http://www.apache.org/licenses/LICENSE-2.0
+
+     Unless required by applicable law or agreed to in writing, software
+     distributed under the License is distributed on an "AS IS" BASIS,
+     WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+     See the License for the specific language governing permissions and
+     limitations under the License.
+-->
+<configuration description="Config for parameter tests">
+    <option name="config-descriptor:metadata" key="parameter" value="value_1" />
+    <option name="config-descriptor:metadata" key="parameter" value="instant_app" />
+</configuration>
diff --git a/atest/unittest_data/parameter_config/no_parameter.cfg b/atest/unittest_data/parameter_config/no_parameter.cfg
new file mode 100644
index 0000000..89b56db
--- /dev/null
+++ b/atest/unittest_data/parameter_config/no_parameter.cfg
@@ -0,0 +1,19 @@
+<?xml version="1.0" encoding="utf-8"?>
+<!-- Copyright (C) 2021 The Android Open Source Project
+
+     Licensed under the Apache License, Version 2.0 (the "License");
+     you may not use this file except in compliance with the License.
+     You may obtain a copy of the License at
+
+          http://www.apache.org/licenses/LICENSE-2.0
+
+     Unless required by applicable law or agreed to in writing, software
+     distributed under the License is distributed on an "AS IS" BASIS,
+     WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+     See the License for the specific language governing permissions and
+     limitations under the License.
+-->
+<configuration description="Config for no parameter tests">
+    <option name="config-descriptor:metadata" key="parameter_no" value="value_1" />
+    <option name="config-descriptor:metadata_no" key="parameter" value="value_2" />
+</configuration>
diff --git a/atest/unittest_data/parameter_config/parameter.cfg b/atest/unittest_data/parameter_config/parameter.cfg
new file mode 100644
index 0000000..f94d2b1
--- /dev/null
+++ b/atest/unittest_data/parameter_config/parameter.cfg
@@ -0,0 +1,21 @@
+<?xml version="1.0" encoding="utf-8"?>
+<!-- Copyright (C) 2021 The Android Open Source Project
+
+     Licensed under the Apache License, Version 2.0 (the "License");
+     you may not use this file except in compliance with the License.
+     You may obtain a copy of the License at
+
+          http://www.apache.org/licenses/LICENSE-2.0
+
+     Unless required by applicable law or agreed to in writing, software
+     distributed under the License is distributed on an "AS IS" BASIS,
+     WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+     See the License for the specific language governing permissions and
+     limitations under the License.
+-->
+<configuration description="Config for parameter tests">
+    <option name="config-descriptor:metadata" key="parameter" value="value_1" />
+    <option name="config-descriptor:metadata" key="parameter" value="value_2" />
+    <option name="config-descriptor:metadata" key="parameter" value="value_3" />
+    <option name="config-descriptor:metadata" key="parameter" value="value_4" />
+</configuration>
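The three parameter_config fixtures differ only in whether an option carries the exact name="config-descriptor:metadata" / key="parameter" pair, which is the attribute combination a parameter scanner has to key on (no_parameter.cfg deliberately gets each attribute slightly wrong). A minimal, illustrative scanner written against these fixtures; it is not atest's actual helper:

```python
import xml.etree.ElementTree as ET

def get_config_parameters(config_path):
    """Return the 'parameter' metadata values declared in a test config."""
    root = ET.parse(config_path).getroot()
    return {
        option.get('value')
        for option in root.iter('option')
        if option.get('name') == 'config-descriptor:metadata'
        and option.get('key') == 'parameter'
    }

# parameter.cfg             -> {'value_1', 'value_2', 'value_3', 'value_4'}
# instant_app_parameter.cfg -> {'value_1', 'instant_app'}
# no_parameter.cfg          -> set()
```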
diff --git a/atest/unittest_data/path_testing/PathTesting.java b/atest/unittest_data/path_testing/PathTesting.java
index 468307a..6f744bd 100644
--- a/atest/unittest_data/path_testing/PathTesting.java
+++ b/atest/unittest_data/path_testing/PathTesting.java
@@ -16,7 +16,7 @@
 
 package android.jank.cts.ui;
 
-/** Dummy Class file for unit tests. */
-public class SomeClassForTesting {
-    private static final String SOME_DUMMY_VAR = "For testing purposes";
+/** Unused Class file for unit tests. */
+public class SomeClassForTesting extends AtestClass {
+    private static final String SOME_UNUSED_VAR = "For testing purposes";
 }
diff --git a/atest/unittest_data/test_config/a.xml b/atest/unittest_data/test_config/a.xml.data
similarity index 100%
rename from atest/unittest_data/test_config/a.xml
rename to atest/unittest_data/test_config/a.xml.data
diff --git a/atest/unittest_data/test_record.proto.testonly b/atest/unittest_data/test_record.proto.testonly
new file mode 100644
index 0000000..7dd8734
--- /dev/null
+++ b/atest/unittest_data/test_record.proto.testonly
Binary files differ
diff --git a/atest/unittest_data/tradefed_prebuilt/prebuilts/filegroups/suite/compatibility-tradefed.jar b/atest/unittest_data/tradefed_prebuilt/prebuilts/filegroups/suite/compatibility-tradefed.jar
new file mode 100755
index 0000000..ae9d928
--- /dev/null
+++ b/atest/unittest_data/tradefed_prebuilt/prebuilts/filegroups/suite/compatibility-tradefed.jar
Binary files differ
diff --git a/atest/unittest_data/tradefed_prebuilt/prebuilts/filegroups/tradefed/tradefed-contrib.jar b/atest/unittest_data/tradefed_prebuilt/prebuilts/filegroups/tradefed/tradefed-contrib.jar
new file mode 100755
index 0000000..2c5b14a
--- /dev/null
+++ b/atest/unittest_data/tradefed_prebuilt/prebuilts/filegroups/tradefed/tradefed-contrib.jar
Binary files differ
diff --git a/atest/unittest_data/test_config/a.xml b/atest/unittest_data/tradefed_prebuilt/prebuilts/test_harness/tmp
similarity index 100%
copy from atest/unittest_data/test_config/a.xml
copy to atest/unittest_data/tradefed_prebuilt/prebuilts/test_harness/tmp
diff --git a/atest/unittest_data/vts_plan_files/vts-aa.xml b/atest/unittest_data/vts_plan_files/vts-aa.xml.data
similarity index 100%
rename from atest/unittest_data/vts_plan_files/vts-aa.xml
rename to atest/unittest_data/vts_plan_files/vts-aa.xml.data
diff --git a/atest/unittest_data/vts_plan_files/vts-bb.xml b/atest/unittest_data/vts_plan_files/vts-bb.xml.data
similarity index 100%
rename from atest/unittest_data/vts_plan_files/vts-bb.xml
rename to atest/unittest_data/vts_plan_files/vts-bb.xml.data
diff --git a/atest/unittest_data/vts_plan_files/vts-cc.xml b/atest/unittest_data/vts_plan_files/vts-cc.xml.data
similarity index 100%
rename from atest/unittest_data/vts_plan_files/vts-cc.xml
rename to atest/unittest_data/vts_plan_files/vts-cc.xml.data
diff --git a/atest/unittest_data/vts_plan_files/vts-dd.xml b/atest/unittest_data/vts_plan_files/vts-dd.xml.data
similarity index 100%
rename from atest/unittest_data/vts_plan_files/vts-dd.xml
rename to atest/unittest_data/vts_plan_files/vts-dd.xml.data
diff --git a/atest/unittest_data/vts_plan_files/vts-staging-default.xml b/atest/unittest_data/vts_plan_files/vts-staging-default.xml.data
similarity index 100%
rename from atest/unittest_data/vts_plan_files/vts-staging-default.xml
rename to atest/unittest_data/vts_plan_files/vts-staging-default.xml.data
diff --git a/atest/unittest_data/zip_files/multi_file.zip b/atest/unittest_data/zip_files/multi_file.zip
new file mode 100644
index 0000000..129af09
--- /dev/null
+++ b/atest/unittest_data/zip_files/multi_file.zip
Binary files differ
diff --git a/atest/unittest_data/zip_files/single_file.zip b/atest/unittest_data/zip_files/single_file.zip
new file mode 100644
index 0000000..1d67faa
--- /dev/null
+++ b/atest/unittest_data/zip_files/single_file.zip
Binary files differ
diff --git a/atest/unittest_utils.py b/atest/unittest_utils.py
index a2f806f..35db172 100644
--- a/atest/unittest_utils.py
+++ b/atest/unittest_utils.py
@@ -81,7 +81,7 @@
             raise AssertionError('No matching TestInfo (%s) in [%s]' %
                                  (test_info_a, ';'.join([str(t) for t in test_info_set_b])))
 
-
+# pylint: disable=too-many-return-statements
 def isfile_side_effect(value):
     """Mock return values for os.path.isfile."""
     if value == '/%s/%s' % (uc.CC_MODULE_DIR, constants.MODULE_CONFIG):
@@ -100,6 +100,18 @@
         return True
     if value.endswith(uc.GTF_INT_NAME + '.xml'):
         return True
+    if value.endswith(
+        '/%s/%s' % (uc.ANDTEST_CONFIG_PATH, constants.MODULE_CONFIG)):
+        return True
+    if value.endswith(
+        '/%s/%s' % (uc.SINGLE_CONFIG_PATH, uc.SINGLE_CONFIG_NAME)):
+        return True
+    if value.endswith(
+        '/%s/%s' % (uc.MULTIPLE_CONFIG_PATH, uc.MAIN_CONFIG_NAME)):
+        return True
+    if value.endswith(
+        '/%s/%s' % (uc.MULTIPLE_CONFIG_PATH, uc.SUB_CONFIG_NAME_2)):
+        return True
     return False
 
 
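The new branches extend the os.path.isfile stub so tests that resolve the single- and multiple-config fixtures see those configs as present without touching the real filesystem. A hedged sketch of how such a stub is typically wired up with mock.patch (the test class and path below are hypothetical, and it assumes the atest source directory is on sys.path):

```python
import os
import unittest
from unittest import mock

import unittest_utils


class ExampleTest(unittest.TestCase):
    """Hypothetical test showing how the isfile stub is applied."""

    @mock.patch('os.path.isfile', side_effect=unittest_utils.isfile_side_effect)
    def test_only_fixture_paths_exist(self, _mock_isfile):
        # The patched os.path.isfile answers from the canned list above,
        # so the outcome never depends on the machine running the test.
        self.assertFalse(os.path.isfile('/tmp/not_a_known_fixture'))


if __name__ == '__main__':
    unittest.main()
```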
diff --git a/plugin_lib/Android.bp b/plugin_lib/Android.bp
new file mode 100644
index 0000000..adddfd4
--- /dev/null
+++ b/plugin_lib/Android.bp
@@ -0,0 +1,64 @@
+// Copyright (C) 2020 The Android Open Source Project
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+//
+//      http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package {
+    default_applicable_licenses: ["Android-Apache-2.0"],
+}
+
+python_defaults {
+    name: "plugin_default",
+    pkg_path: "plugin_lib",
+    version: {
+        py2: {
+            enabled: false,
+            embedded_launcher: false,
+        },
+        py3: {
+            enabled: true,
+            embedded_launcher: false,
+        },
+    },
+}
+
+python_library_host {
+    name: "asuite_plugin_lib",
+    defaults: ["plugin_default"],
+    srcs: [
+        "**/*.py",
+    ],
+    exclude_srcs: [
+        "*_unittest.py",
+        "**/*_unittest.py",
+    ],
+}
+
+python_test_host {
+    name: "plugin_lib_unittests",
+    main: "plugin_run_unittests.py",
+    pkg_path: "plugin_lib",
+    srcs: [
+        "**/*.py",
+    ],
+    libs: [
+        "aidegen_lib_common_util",
+        "atest_module_info",
+        "asuite_cc_client",
+    ],
+    test_config: "plugin_lib_unittests.xml",
+    test_suites: ["null-suite"],
+    defaults: ["plugin_default"],
+    test_options: {
+        unit_test: false,
+    },
+}
diff --git a/plugin_lib/OWNERS b/plugin_lib/OWNERS
new file mode 100644
index 0000000..2c33f13
--- /dev/null
+++ b/plugin_lib/OWNERS
@@ -0,0 +1,5 @@
+shinwang@google.com
+patricktu@google.com
+bralee@google.com
+albaltai@google.com
+dshi@google.com
diff --git a/atest/unittest_data/test_config/a.xml b/plugin_lib/__init__.py
similarity index 100%
copy from atest/unittest_data/test_config/a.xml
copy to plugin_lib/__init__.py
diff --git a/plugin_lib/deployment.py b/plugin_lib/deployment.py
new file mode 100644
index 0000000..04f178c
--- /dev/null
+++ b/plugin_lib/deployment.py
@@ -0,0 +1,130 @@
+#!/usr/bin/env python3
+#
+# Copyright 2020 - The Android Open Source Project
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Asuite plugin deployment."""
+import os
+import subprocess
+
+from aidegen.lib import common_util
+from aidegen.lib import config
+
+_ASK_INSTALL_PLUGIN = """\nAsuite plugin is a new tool with the following features:
+    -Atest UI widget. For more information: go/atest_plugin
+    -Code search integration. For more information and to locate build modules: go/android-platform-plugin
+Would you like to install the Asuite plugin? (Yes/no/auto)"""
+_ASK_UPGRADE_PLUGIN = ('\nAsuite plugin has a new version. Would you like to '
+                       'upgrade Asuite plugin? (Yes/no/auto)')
+_YES_RESPONSE = 'Thank you, Asuite plugin will be installed in IntelliJ.'
+_NO_RESPONSE = ('Thank you, if you want to install Asuite plugin, please use '
+                'aidegen --plugin.')
+_AUTO_RESPONSE = ('Thank you, Asuite plugin will be installed in IntelliJ, and '
+                  'automatically updated to the newest version.')
+_THANKS_UPGRADE = 'Thank you for upgrading the Asuite plugin.'
+_NO_NEED_UPGRADE = 'Awesome! You have the newest Asuite plugin.'
+_SELECTION_ITEM = {'yes': 'yes', 'no': 'no', 'auto': 'auto', 'y': 'yes',
+                   'n': 'no', 'a': 'auto', '': 'yes'}
+
+
+class PluginDeployment:
+    """The utility class for Asuite plugin deployment.
+
+    Usage:
+        PluginDeployment.install_asuite_plugin()
+        It will start the installation process.
+
+    Attributes:
+        is_internal: True if the user is an internal user.
+    """
+
+    def __init__(self):
+        """Initializes PluginDeployment."""
+        self.is_internal = self._is_internal_user()
+
+    def install_asuite_plugin(self):
+        """The main entry function for installing the Asuite plugin."""
+
+    def _ask_for_install(self):
+        """Asks the user to install the Asuite plugin."""
+        input_data = input(_ASK_INSTALL_PLUGIN)
+        while input_data.lower() not in _SELECTION_ITEM.keys():
+            input_data = input(_ASK_INSTALL_PLUGIN)
+        choice = _SELECTION_ITEM.get(input_data)
+        self._user_selection = choice
+        if choice == 'no':
+            print(_NO_RESPONSE)
+        else:
+            self._copy_jars()
+            if choice == 'yes':
+                print(_YES_RESPONSE)
+            else:
+                print(_AUTO_RESPONSE)
+
+    def _ask_for_upgrade(self):
+        """Asks the user to upgrade the Asuite plugin."""
+
+    def _copy_jars(self):
+        """Copies jars to IntelliJ plugin folders."""
+
+    def _build_jars(self):
+        """Builds the Asuite plugin jars with the bundled Gradle wrapper."""
+        asuite_plugin_path = os.path.join(common_util.get_android_root_dir(),
+                                          'tools/asuite/asuite_plugin/')
+        asuite_plugin_gradle_path = os.path.join(asuite_plugin_path, 'gradlew')
+        cmd = [asuite_plugin_gradle_path, 'build']
+        subprocess.check_call(cmd, cwd=asuite_plugin_path)
+
+    def _is_plugin_installed(self):
+        """Checks if the user has installed the Asuite plugin before.
+
+        Return:
+            True if the user has installed the Asuite plugin.
+        """
+
+    def _is_version_up_to_date(self):
+        """Checks if all plugins' versions are up to date or not.
+
+        Return:
+            True if all plugins' versions are up to date.
+        """
+
+    @property
+    def _user_selection(self):
+        """Reads the user's selection from the config file.
+
+        Return:
+            A string of the user's selection: yes/no/auto.
+        """
+        with config.AidegenConfig() as aconf:
+            return aconf.plugin_preference
+
+    @_user_selection.setter
+    def _user_selection(self, selection):
+        """Writes the user's selection to the config file.
+
+        Args:
+            selection: A string of the user's selection: yes/no/auto.
+        """
+        with config.AidegenConfig() as aconf:
+            aconf.plugin_preference = selection
+
+    @staticmethod
+    def _is_internal_user():
+        """Checks whether the user is an internal or external user.
+
+        Return:
+            True if the user is an internal user.
+        """
+        return True
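Most of PluginDeployment is still stubbed out; the piece with real behavior is the _user_selection property, which round-trips the yes/no/auto choice through AidegenConfig. A hedged usage sketch, assuming the same import setup as deployment_unittest.py below and a writable aidegen config directory:

```python
from deployment import PluginDeployment

deployment = PluginDeployment()
deployment._user_selection = 'auto'  # persisted via aidegen.lib.config.AidegenConfig
print(deployment._user_selection)    # 'auto' on subsequent reads
```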
diff --git a/plugin_lib/deployment_unittest.py b/plugin_lib/deployment_unittest.py
new file mode 100644
index 0000000..116453a
--- /dev/null
+++ b/plugin_lib/deployment_unittest.py
@@ -0,0 +1,68 @@
+#!/usr/bin/env python3
+#
+# Copyright 2020 - The Android Open Source Project
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Unittests for deployment."""
+import os
+import shutil
+import subprocess
+import tempfile
+import unittest
+from unittest import mock
+
+from deployment import PluginDeployment
+
+
+# pylint: disable=protected-access
+from aidegen.lib import config
+
+
+class DeploymentUnittests(unittest.TestCase):
+    """Unit tests for deployment.py."""
+
+    _TMP_DIR = None
+
+    def setUp(self):
+        """Prepare the testdata related path."""
+        DeploymentUnittests._TMP_DIR = tempfile.mkdtemp()
+        config.AidegenConfig._CONFIG_DIR = os.path.join(
+            DeploymentUnittests._TMP_DIR, '.config', 'asuite', 'aidegen')
+        config.AidegenConfig._CONFIG_FILE_PATH = os.path.join(
+            config.AidegenConfig._CONFIG_DIR,
+            config.AidegenConfig._DEFAULT_CONFIG_FILE)
+
+    def tearDown(self):
+        """Clear the testdata related path."""
+        shutil.rmtree(DeploymentUnittests._TMP_DIR)
+
+    @mock.patch('builtins.input')
+    def test_ask_for_install(self, mock_input):
+        """Test _ask_for_install."""
+        mock_input.return_value = 'y'
+        PluginDeployment()._ask_for_install()
+        self.assertTrue(mock_input.called)
+
+    @mock.patch.object(subprocess, 'check_call')
+    def test_build_jars(self, mock_check_call):
+        """Test _build_jars."""
+        PluginDeployment()._build_jars()
+        self.assertTrue(mock_check_call.called)
+
+    def test_write_read_selection(self):
+        """Test _read_selection and _write_selection."""
+        PluginDeployment._user_selection = 'yes'
+        self.assertEqual(PluginDeployment._user_selection, 'yes')
+        PluginDeployment._user_selection = 'no'
+        self.assertEqual(PluginDeployment._user_selection, 'no')
diff --git a/plugin_lib/plugin_lib_unittests.xml b/plugin_lib/plugin_lib_unittests.xml
new file mode 100644
index 0000000..598b570
--- /dev/null
+++ b/plugin_lib/plugin_lib_unittests.xml
@@ -0,0 +1,20 @@
+<?xml version="1.0" encoding="utf-8"?>
+<!-- Copyright (C) 2020 The Android Open Source Project
+     Licensed under the Apache License, Version 2.0 (the "License");
+     you may not use this file except in compliance with the License.
+     You may obtain a copy of the License at
+          http://www.apache.org/licenses/LICENSE-2.0
+     Unless required by applicable law or agreed to in writing, software
+     distributed under the License is distributed on an "AS IS" BASIS,
+     WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+     See the License for the specific language governing permissions and
+     limitations under the License.
+-->
+<configuration description="Config to run plugin lib unittests">
+    <option name="test-suite-tag" value="plugin_lib_unittests" />
+
+    <test class="com.android.tradefed.testtype.python.PythonBinaryHostTest" >
+        <option name="par-file-name" value="plugin_lib_unittests" />
+        <option name="test-timeout" value="2m" />
+    </test>
+</configuration>
diff --git a/plugin_lib/plugin_run_unittests.py b/plugin_lib/plugin_run_unittests.py
new file mode 100644
index 0000000..b88780a
--- /dev/null
+++ b/plugin_lib/plugin_run_unittests.py
@@ -0,0 +1,67 @@
+#!/usr/bin/env python3
+#
+# Copyright 2020 The Android Open Source Project
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+"""Main entrypoint for all of plugin_lib's unittests."""
+
+import logging
+import os
+import sys
+import unittest
+from importlib import import_module
+
+# Set up logging to be silent so unittests can pass through TF.
+logging.disable(logging.ERROR)
+
+def get_test_modules():
+    """Returns a list of testable modules.
+
+    Finds all the test files (*_unittest.py), gets their relative
+    paths (internal/lib/utils_test.py), and translates them to import paths
+    with the .py extension stripped (internal.lib.utils_test).
+
+    Returns:
+        A list of strings (the testable module import path).
+    """
+    testable_modules = []
+    base_path = os.path.dirname(os.path.realpath(__file__))
+
+    for dirpath, _, files in os.walk(base_path):
+        for _file in files:
+            if _file.endswith("_unittest.py"):
+                # Now transform it into a relative import path.
+                full_file_path = os.path.join(dirpath, _file)
+                rel_file_path = os.path.relpath(full_file_path, base_path)
+                rel_file_path, _ = os.path.splitext(rel_file_path)
+                rel_file_path = rel_file_path.replace(os.sep, ".")
+                testable_modules.append(rel_file_path)
+
+    return testable_modules
+
+
+def main():
+    """Main unittest entry."""
+    test_modules = get_test_modules()
+    for mod in test_modules:
+        import_module(mod)
+
+    loader = unittest.defaultTestLoader
+    test_suite = loader.loadTestsFromNames(test_modules)
+    runner = unittest.TextTestRunner(verbosity=2)
+    result = runner.run(test_suite)
+    sys.exit(not result.wasSuccessful())
+
+
+if __name__ == '__main__':
+    main()
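The discovery above is purely path based: every *_unittest.py under the script's directory becomes a dotted module name that unittest.defaultTestLoader can load. A worked example of that translation for a hypothetical nested test file:

```python
import os

# Hypothetical file found by the walk, relative to plugin_lib/.
rel_file_path = os.path.join('subdir', 'example_unittest.py')
rel_file_path, _ = os.path.splitext(rel_file_path)  # 'subdir/example_unittest'
module_name = rel_file_path.replace(os.sep, '.')    # 'subdir.example_unittest'
print(module_name)
```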