Robolectric with Gradle and Android Studio

Over Christmas break, I attempted to get Robolectric tests working in Gradle, both from the command line and from the IDE, without the aid of a plugin. This approach was intentional: I had tried to convert a new, empty project to use a couple of different Robolectric unit test plugins with zero success. The experience left me embittered against plugins for a number of reasons.

  • Third-party modules in Gradle are not easily extensible. If something is broken, pretty much the only real solution that’s not a hack is to patch the plugin and wait for an update.

  • Updates to the plugins come through packaged releases or snapshots over Maven. Most plugins have very limited release cycles, so many projects end up depending on SNAPSHOT builds in the meantime, which can change out from under you while you’re using them!

  • If the plugin maintainer disagrees with your approach, your only alternative is to fork the plugin, taking on a bunch of code you didn’t write and for the most part don’t understand. Not only are you responsible for the code, you’re now also responsible for new releases, meaning uploading to nexus-int yourself each time you change the code.

Beyond all this, in my experience, most plugins depend on particular versions of the toolchain, are in some way broken or abandoned, or impose limitations on how your project is organized. The theory was that if I could write custom code to get this working in my own build:

  • I would understand the build process better.

  • I could maintain the build myself, instead of depending on a third party to update their solution when it breaks.

  • I wouldn’t have to worry about the plugin being deprecated when Android Studio 1.1 comes out with support for unit testing.

With that, here is an explanation of the approach.

Groovy Modifications

  1. Create a Java library module that depends on the “app” project (“File” > “New Module”).

  2. Compile the Android project’s classpath, compiled classes, and boot classpath into your test module.

       apply plugin: 'java' 
    
       evaluationDependsOn(':app')
    
       dependencies {
         def app = project(':app')
         compile app
         app.android.applicationVariants.each { variant ->
           if ( variant.name == 'developDebug') {
             testCompile variant.javaCompile.classpath
             testCompile variant.javaCompile.outputs.files
           }
         }
    
         testCompile files(
           app.plugins.findPlugin("com.android.application").getBootClasspath()
         )
       }
    

    Make sure Gradle knows to evaluate the app Gradle project first by using evaluationDependsOn(). Then pull all three into your test module with the testCompile directive.

    Note that applicationVariants comes directly from the Android Gradle Plugin User Guide. An application variant is the combination of a product flavor and a build type (here, the develop flavor plus the debug build type). We are making the Robolectric tests directly dependent on the developDebug variant. Without this explicit dependency, Gradle and IDEA seem to choose different build variants/flavors as the dependency for the project.

    In the last line above, we use the Android application plugin to resolve the boot classpath.

  3. Reorder the classpath in Gradle and in IntelliJ to make sure the Android dependencies fall behind Robolectric, so that Robolectric’s classes are found before the stub implementations in android.jar. Note: these classpaths are, for some reason, calculated in two different ways!

      apply plugin: 'idea'
    
      configurations {
        junitTestCompile
        roboTestCompile
      }
    
      dependencies {
        junitTestCompile 'org.hamcrest:hamcrest-all:1.3'
        // junitTestCompile '<all other dependencies>'
        roboTestCompile 'org.robolectric:robolectric:2.4'
      }
    
      sourceSets.test.compileClasspath = configurations.junitTestCompile + \
        configurations.roboTestCompile + \
        sourceSets.test.compileClasspath
    
      sourceSets.test.runtimeClasspath = configurations.junitTestCompile + \
        configurations.roboTestCompile + \
        sourceSets.test.runtimeClasspath
    
      idea {
        module {
          scopes.COMPILE.plus += [ configurations.junitTestCompile ]
          scopes.TEST.plus += [ configurations.roboTestCompile ]
        }
      }
    

    By modifying the sourceSets directly, we’re able to reorder the classpath. Strangely, this has nothing to do with the order in which dependencies are declared, as one might initially think. That takes care of Gradle. Unfortunately, IntelliJ ignores the sourceSets directive, preferring to calculate its own classpath, so we also need to apply the idea plugin and reorder the dependencies via its scopes concept.

  4. Tell the Test task which tests to launch by including them explicitly.

    tasks.withType(Test) {
        scanForTestClasses = false
        include "**/*Test.class"
    }
    

This last modification keeps Gradle from trying to grab random class files and run them as JUnit tests.

The above is all that is required from a Gradle/Groovy perspective. With these configurations in place, Robolectric will work just fine with @Config(manifest = Config.NONE). However, your build will need more help if you need to find resources from your actual app, or want to reference anything declared in your Android manifest (e.g., ShadowActivities and the like). For that, you’ll need to tell Robolectric where to find both.
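
For example, a test that only exercises framework classes, and so needs neither app resources nor a manifest, can run against the stock runner with Config.NONE. Below is a minimal sketch; the class name and the assertion are illustrative placeholders rather than code from the project above.

import static org.junit.Assert.assertTrue;

import android.text.TextUtils;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.robolectric.RobolectricTestRunner;
import org.robolectric.annotation.Config;

// No manifest or resources are loaded, but Robolectric still supplies the framework classes.
@RunWith(RobolectricTestRunner.class)
@Config(manifest = Config.NONE)
public class TextUtilsTest {

  @Test
  public void isEmpty_returnsTrueForEmptyString() {
    assertTrue(TextUtils.isEmpty(""));
  }
}

Note that the class name ends in Test, so it matches the include pattern configured in step 4.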

Modified TestRunner

import android.os.Build;

import org.robolectric.AndroidManifest;
import org.robolectric.RobolectricTestRunner;
import org.robolectric.annotation.Config;
import org.robolectric.res.Fs;

import java.io.File;
import java.io.IOException;

public class RobolectricGradleTestRunner extends RobolectricTestRunner {
  private static final String PROJECT_DIR =
    getProjectDirectory();

  private static final String MANIFEST_PROPERTY =
    PROJECT_DIR + "src/main/AndroidManifest.xml";

  private static final String RES_PROPERTY = 
    PROJECT_DIR + "build/intermediates/res/develop/debug/";

  private static final int MAX_SDK_SUPPORTED_BY_ROBOLECTRIC = 
    Build.VERSION_CODES.JELLY_BEAN_MR2;

  public RobolectricGradleTestRunner(final Class<?> testClass) throws Exception {
    super(testClass);
  }

  private static AndroidManifest getAndroidManifest() {
    return new AndroidManifest(
      Fs.fileFromPath(MANIFEST_PROPERTY), 
      Fs.fileFromPath(RES_PROPERTY)) {
        @Override public int getTargetSdkVersion() {
          return MAX_SDK_SUPPORTED_BY_ROBOLECTRIC;
        }
    };
  }

  private static String getProjectDirectory() {
    String path = "";
    try {
      File file = new File("..");
      path = file.getCanonicalPath();
      path = path + "/app/";
    } catch (IOException ex) {}
    return path;
  }

  @Override public AndroidManifest getAppManifest(Config config) {
    return getAndroidManifest();
  }
}

The above custom test runner keeps you from having to point Robolectric at the resources and the manifest in every new test you write. It also caps the target SDK version at 18 (Jelly Bean MR2) so that Robolectric will run; sadly, Robolectric does not yet work with newer target versions of the Android API. This is a known issue.
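
To use the runner, annotate your test classes with it. Below is a minimal sketch, assuming the test class sits alongside RobolectricGradleTestRunner and that MainActivity stands in for one of your app’s activities.

import static org.junit.Assert.assertNotNull;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.robolectric.Robolectric;

// MainActivity is a placeholder for an activity declared in the app module's manifest.
@RunWith(RobolectricGradleTestRunner.class)
public class MainActivityTest {

  @Test
  public void activityShouldBeCreated() {
    // Drives the activity through create(); this relies on the manifest
    // and resource paths that the custom runner resolved above.
    MainActivity activity = Robolectric.buildActivity(MainActivity.class)
        .create()
        .get();
    assertNotNull(activity);
  }
}

Because the runner supplies the manifest and the resource directory, Robolectric can resolve the activity’s declared attributes and resources during create().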

Cross Platform Tutorial

I have created a tutorial on how to write a cross-platform iOS/Android app using C++ and the Android NDK. I’ve posted the source code on GitHub. For anyone looking to get started, this is a great resource for playing around with the basic mechanics.

Core Foundation on Android

On a recent iOS project, I started programming a static library with the intention of sharing it with an Android project that was to contain very similar functionality. This much was easy: creating the project in Xcode, I had access to almost any library I needed by simply adding the appropriate frameworks and header files to my project and building just like I normally would. There were a few bumps and bruises on the journey to Android, however. Below I’ve outlined that journey:

  • Learn the NDK structure, but from whom? First, I needed to decipher what needed to happen where and when, as the NDK has changed twice in the past six months, making many blog posts obsolete or misleading. Honestly, had I started with the documentation that shipped with the NDK I downloaded, I probably would have been much better off, and much less confused. That would be my advice to anyone getting started. In the end you will realize that (as strange as it seems) those txt docs are the best resource out there, simply because they don’t lie, and the Internet does with stuff like this that is always changing.

  • Reverse-engineer various function declarations from the Bionic header files (Android’s version of glibc). The android.git.kernel.org site was down while I was trying to figure all this out, so that was part of the problem. But, honestly, I’m not sure there is any documentation on the Bionic header files at all. Compared to the extensive documentation Apple provides for all of its APIs, developing on the NDK has a feeling not unlike that of any project you’ve seen created in some dude’s basement.

  • Getting Boost and STLport to compile and link. Later versions of the NDK (r5 and later) include STLport and support for runtime exceptions, so this wasn’t too bad. I also found a blog post that was invaluable for getting Boost to compile.

  • Debugging errors and getting stack traces from adb’s logcat. Again, searching for how to do this on the Internet was pointless. The NDK now includes a script creatively called ndk-stack. The usage is pretty self-explanatory:

    Usage:
       ndk-stack -sym <path> [-dump <path>]
          -sym  Contains full path to the root directory for symbols.
          -dump Contains full path to the file containing the crash dump.
                This is an optional parameter. If omitted, ndk-stack will
                read input data from stdin.

    See docs/NDK-STACK.html in your NDK installation tree for more details.
    

Once these items were in place, the last hurdle was getting Core Foundation Lite to compile on Android, as I couldn’t find documentation showing that anyone had accomplished it. The closest I came was finding someone who had compiled Core Foundation Lite on Linux five years ago.

Based on work from that posting, I was able to get a compiling, working version of CF 299.33 on Android by making a few relatively minor patches to the source code. I’ve since posted this version on GitHub under the terms of the Apple Public Source License.

Why use Core Foundation in a library you’re sharing between iOS and Android, you might ask? One of the biggest pluses from an iOS perspective was the toll-free bridging support with Cocoa. Retaining arrays, strings, dictionaries, etc. was as quick and easy as casting the Core Foundation objects to their corresponding Cocoa types. For example, given:

class Library { 
public:
    CFArrayRef getCFArrayRef();
};

Obtaining an array of data from your static library can be as easy as:

NSArray* myArray = (NSArray*)library->getCFArrayRef();
[myArray retain];

When your library object falls out of scope or is deleted, myArray will still hold a valid reference to your array. This cuts down on data copying and makes your application more performant on iOS, where hardware is typically less powerful than on Android.

From the Android side of things, using Core Foundation makes sense because it gives you easy, cross-platform support for Unicode strings. Both Dalvik (Android’s VM) and Cocoa use UTF-16 strings by default. This keeps you away from the nightmare that is wchar_t.

Finally, Core Foundation Lite gives Android the ability to read property list (plist) files, which means you can have one set of configuration documents shared across both platforms.