Layout inflation and why AppCompat components replace standard ones

Some time ago I saw the following question on Stack Overflow: Why does Android select AppCompat components when I don’t declare them explicitly?

I was debugging my app and, while hovering over an ImageView reference, found that it is an AppCompatImageView instead of an ImageView. The same happened with a TextView (it was an AppCompatTextView).

So, why do Android Views get “automagically” converted into their AppCompat* equivalents? The answer comes from the fact that the asker’s activity extends AppCompatActivity instead of Activity or another alternative (in fact, the original question also asks whether they should keep extending AppCompatActivity or start using Activity).

The short answer to “Should I just extend from an Activity from now on?” is no: you should keep extending AppCompatActivity, as it provides backwards-compatible features on older devices. In the case of AppCompatImageView, the documentation says:

A ImageView which supports compatible features on older versions of the platform, including:

  • Allows dynamic tint of its background via the background tint methods in ViewCompat.
  • Allows setting of the background tint using backgroundTint and backgroundTintMode.
  • Allows dynamic tint of its image via the image tint methods in ImageViewCompat.
  • Allows setting of the image tint using tint and tintMode.

Also, it adds compatibility with vector drawables for older Android versions.
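For example, a minimal sketch (not part of the original question): enabling support-library vector drawables in the module’s build.gradle lets AppCompatImageView load them on pre-Lollipop devices:

```groovy
android {
    defaultConfig {
        // Let AppCompat inflate vector drawables on API < 21
        vectorDrawables.useSupportLibrary = true
    }
}
```

In the layout you then reference the asset with `app:srcCompat` instead of `android:src` (the drawable name below is a placeholder): `app:srcCompat="@drawable/ic_example"`.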

So, how does an AppCompatImageView get created when the layout specifies ImageView?

AppCompatActivity installs a LayoutInflater.Factory2 to intercept the inflation of certain views. The code of this inflater can be seen in AppCompatViewInflater.java.

The function responsible for creating the Views is AppCompatViewInflater#createView(View, String, Context, AttributeSet, boolean, boolean, boolean, boolean), and as you can see, it checks for simple view names (without a package prefix) and creates the AppCompat* version instead:

public final View createView(View parent, final String name, @NonNull Context context,
        @NonNull AttributeSet attrs, boolean inheritContext,
        boolean readAndroidTheme, boolean readAppTheme, boolean wrapContext) {
    final Context originalContext = context;

    // ...

    View view = null;

    // We need to 'inject' our tint aware Views in place of the standard framework versions
    switch (name) {
        case "TextView":
            view = new AppCompatTextView(context, attrs);
            break;
        case "ImageView":
            view = new AppCompatImageView(context, attrs);
            break;
        case "Button":
            view = new AppCompatButton(context, attrs);
            break;
        case "EditText":
            view = new AppCompatEditText(context, attrs);
            break;
        case "Spinner":
            view = new AppCompatSpinner(context, attrs);
            break;
        case "ImageButton":
            view = new AppCompatImageButton(context, attrs);
            break;
        case "CheckBox":
            view = new AppCompatCheckBox(context, attrs);
            break;
        case "RadioButton":
            view = new AppCompatRadioButton(context, attrs);
            break;
        case "CheckedTextView":
            view = new AppCompatCheckedTextView(context, attrs);
            break;
        case "AutoCompleteTextView":
            view = new AppCompatAutoCompleteTextView(context, attrs);
            break;
        case "MultiAutoCompleteTextView":
            view = new AppCompatMultiAutoCompleteTextView(context, attrs);
            break;
        case "RatingBar":
            view = new AppCompatRatingBar(context, attrs);
            break;
        case "SeekBar":
            view = new AppCompatSeekBar(context, attrs);
            break;
    }

    if (view == null && originalContext != context) {
        // If the original context does not equal our themed context, then we need to manually
        // inflate it using the name so that android:theme takes effect.
        view = createViewFromTag(context, name, attrs);
    }

    // ...

    return view;
}

Forcing the usage of non-AppCompat views

So, in order to force the creation of a regular ImageView (not an AppCompatImageView) while still using AppCompatActivity, you need to specify the fully qualified class name, for example:

    <android.widget.ImageView
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:src="@mipmap/test"/>

For more information on how layout inflation works, see the amazing talk “LayoutInflater: Friend or Foe?” by Chris Jenx, author of Calligraphy.

Android compatibility with 32-bit libraries on a 64-bit device, updated

Some time ago I wrote the following post: Android compatibility with 32-bit libraries on a 64-bit device.

In that article, the proposed solution was to manually remove the problematic library files using exclude; see that article for the details on why.

There’s a better way using ABI Filters, though.

android {
    // ...
    buildTypes {
        // ...
        release {
            ndk {
                abiFilters "armeabi", "armeabi-v7a", "x86"
            }
        }
    }
}

With ABI filters we specify the architectures we want to keep, which is the opposite of the previous approach of manually removing the libraries we didn’t want. So, what we’re doing here is keeping the 32-bit architectures, both ARM and x86 (and excluding the 64-bit and MIPS libraries).

Notice that nowadays I would also exclude “armeabi”, as all current devices support at least “armeabi-v7a”.

Finally, notice that starting in August 2019, all apps using native libraries will have to provide 64-bit versions in addition to their 32-bit versions, as Google states in the blog post “Improving app security and performance on Google Play for years to come”:

In August 2019, Play will require that new apps and app updates with native libraries provide 64-bit versions in addition to their 32-bit versions.
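Under that policy, an updated filter set would keep both 32-bit and 64-bit variants — a sketch, assuming your native libraries are built for these ABIs:

```groovy
android {
    buildTypes {
        release {
            ndk {
                // 32-bit and 64-bit ARM and x86; legacy "armeabi" and MIPS dropped
                abiFilters "armeabi-v7a", "arm64-v8a", "x86", "x86_64"
            }
        }
    }
}
```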

See also Reducing APK size by using ABI Filters and APK split by Igor Wojda for other uses of ABI Filters.

How to specify different buildTypes in Gradle 3.0.0 beta 4 and later

After updating to Gradle plugin 3.0.0 beta 4 our build failed with the following message:

buildTypeMatching has been removed. Use buildTypes..fallbacks

Our libraries have release and debug buildTypes, but our app has two additional buildTypes: releaseWithLogs and debugMinified.

Snippet of our app Gradle file:

android {
    // ...
    buildTypeMatching 'releaseWithLogs', 'release'
    buildTypeMatching 'debugMinified', 'debug'

    buildTypes {
        debug {
            // ...
        }
        debugMinified {
            // ...
        }
        release {
            // ...
        }
        releaseWithLogs {
            // ...
        }
    }
}

After some investigation, we found the following announcement: Android Studio 3.0 Beta 4 is now available. It mentions:

You now provide fallbacks for missing build types and flavors using matchingFallbacks (which replaces buildTypeMatching and productFlavorMatching). You also provide the default selection and fallbacks for missing dimensions using missingDimensionStrategy (which replaces flavorSelection).

So, our previous app build.gradle gets converted to:

android {
    // ...
    //buildTypeMatching 'releaseWithLogs', 'release' // remove this
    //buildTypeMatching 'debugMinified', 'debug'     // remove this

    buildTypes {
        debug {
            // ...
        }
        debugMinified {
            // ...
            matchingFallbacks = ['debug']    // instead use this
        }
        release {
            // ...
        }
        releaseWithLogs {
            // ...
            matchingFallbacks = ['release']  // instead use this
        }
    }
}

Notice that, instead of declaring that the releaseWithLogs buildType will also match release (buildTypeMatching 'releaseWithLogs', 'release'), we now specify the fallback inside the buildType itself. The same applies to debugMinified matching debug. Also notice that there is no need to add this to the release and debug buildTypes, as they already match.

The original question and answer can be found in Gradle plugin 3.0.0 beta 4: “buildTypeMatching has been removed. Use buildTypes..fallbacks”

Kotlin interop: mixing Kotlin and Java ButterKnife-annotated activities

I’ve been working with Kotlin for a while, mainly for side projects or toy projects. Since the Google I/O 2017 announcement, it has become clear that there are no more reasons or excuses not to use it in production.

One of the big selling points of Kotlin is that you can start small, by converting a class or two, or by creating new ones, while keeping all the remaining code in Java. So, interop between the two languages is almost 100% transparent. Almost.

While converting a small project step by step, I started to convert activities into Kotlin. Those activities use ButterKnife (I’m using the current version, 8.7.0) to inject the views. After converting the first activity, I stumbled upon a problem with the annotation processor: in the Gradle script, you have to use either annotationProcessor or kapt, but not both at the same time. So, you have to choose:

  • using annotationProcessor only will not process Kotlin classes, and because of that, injection will silently fail at runtime,
  • using kapt only will make compilation fail.

The final workaround I found was:

  • using kapt3 (by applying the kotlin-kapt plugin in the Gradle script) and,
  • adding a @JvmField annotation in addition to the ButterKnife annotations, so the Kotlin compiler generates public fields instead of getters and setters.

By applying kapt3 we fix the compilation error involving “kotlin.jvm.internal.FunctionReference.(ILjava/lang/Object;)V”, and by turning Kotlin properties into plain-old Java fields we allow the ButterKnife compiler to find the fields to inject, as it is unable to find Kotlin properties.
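In the Gradle script, this boils down to something like the following sketch (ButterKnife 8.7.0 coordinates; depending on your Android Gradle plugin version, you may need `compile` instead of `implementation`):

```groovy
apply plugin: 'com.android.application'
apply plugin: 'kotlin-android'
apply plugin: 'kotlin-kapt' // enables kapt3 annotation processing

dependencies {
    implementation 'com.jakewharton:butterknife:8.7.0'
    // kapt (not annotationProcessor) so Kotlin classes are processed too
    kapt 'com.jakewharton:butterknife-compiler:8.7.0'
}
```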

You can find the source code with the different options in different branches (the one with the final solution is kotlin-workaround) in this GitHub project.

The project has two activities: one (MainActivity) that is written in Java and kept in that language, and a second one (NextActivity) that is converted to Kotlin. Notice that a simple test suite is available to check that both activities are correctly injected: each activity has a TextView whose text is replaced in code to prove that the injection succeeded.

Hope this tip is useful!

Offtopic: mechanical calculators (cardboard and wood!)

A few days ago I happened to stumble upon this fun article about a mechanical 4-bit calculator made of cardboard. Even if it’s quite unreliable, it is interesting to see the methodical approach and all the creativity needed to make this work with only cardboard and marbles.

In the comments of the article there is a YouTube video of a wooden calculator that implements 6-bit addition. It is very reliable, and it takes a completely different approach.

Disabling (and removing) code on release builds

Repo updated on 2017-03-13: added additional debug configurations to check different options for enabling/disabling minification, optimization and obfuscation. See the discussion in Optimize without obfuscate your debug build, including this comment.

Sometimes you have to add code to your applications for debugging purposes. This can be very useful, and sometimes it is kept there because it helps in developing and debugging different parts of the application. But some of this code can have unintended consequences:

  • it can reveal sensitive data to a potential attacker (internal URLs, session cookies, etc.)
  • it can have a performance impact on your application (excessive logging, performing operations not needed in release builds, etc.)
  • it can lower the security of your application (backdoor-like features to help while debugging, that can disable certain security features, or completely bypass them, etc.)


Jenkins builds with a different Java version

We have a Jenkins server taking care of CI for an Android project. On the server we were using Java 7, but after updating a few dependencies, we needed Java 8 to run some Gradle plugins. After the change, we suddenly started getting this error in the builds:

* What went wrong:
A problem occurred evaluating project ':my-project-app'.
> java.lang.UnsupportedClassVersionError: com/android/build/gradle/AppPlugin : Unsupported major.minor version 52.0

After googling it, I found the following SO answer, pointing to the need for Java 8 (class file major version 52): How to fix java.lang.UnsupportedClassVersionError: Unsupported major.minor version.

So, given that:

  1. the server has both Java 7 and Java 8 installed
  2. Jenkins is configured to run using Java 7, and changing that may pose problems (and currently any failure could put our schedule in jeopardy)

we decided to just run the Gradle script using Java 8. To configure it, the following change was made to the job configuration: in the Build Environment section, enable “Inject environment variables to the build process” and add the following to “Properties Content”:

JAVA_HOME=(path_to_jdk)

in our case:

JAVA_HOME=/usr/lib/jvm/java-8-oracle

And hit Rebuild!

npm install not installing all the packages

A few days ago I was trying to install a project’s dependencies with a simple npm install, but the instructions I was given included an extra step to manually install one additional package. When I checked package.json, the package was there… So, what was going on!?

After searching for a while, it turned out that a previous developer had used npm shrinkwrap “to predownload the packages” (or something like that, from what he said). This command “freezes” the list of packages at their current versions, so the next time you run npm install you get exactly the same dependency versions, even if you use imprecise version specifiers in your package.json. It does this by creating an npm-shrinkwrap.json file next to your package.json (similar to pip freeze, but it also affects npm install). The problem is that once you have this file, package.json is ignored, and even if you add a new package there, it will not be installed.
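As an illustration (hypothetical package name and versions), the generated npm-shrinkwrap.json pins each dependency to an exact resolved version, overriding the looser range in package.json:

```json
{
  "name": "my-project",
  "version": "1.0.0",
  "dependencies": {
    "lodash": {
      "version": "4.17.4",
      "from": "lodash@^4.17.0",
      "resolved": "https://registry.npmjs.org/lodash/-/lodash-4.17.4.tgz"
    }
  }
}
```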

So, while this is good for reproducibility of environments, when you need to update your package list, you also need to update the npm-shrinkwrap.json file. So, the steps are:

$ npm install my_new_package --save
$ npm shrinkwrap

Or, to update:

$ npm update
$ npm shrinkwrap

If your dependencies are screwed up, as happened to me, the easiest way to clean up the mess is:

$ rm -rf node_modules
$ npm install
$ npm shrinkwrap


Fixing Python virtualenv on OS X

This weekend I’ve been toying with an old project I created in Django a while (~2 years) ago. The project uses virtualenv, as is common in this kind of project. It was originally created on a different machine and later migrated to the current one. The original machine had OS X Lion, I think… or Snow Leopard, I’m not sure. The point is, the project had been migrated through 4 or 5 major OS X versions… and the installed Python was a different version from the one originally used to create the virtualenv.

So, back to the problem: while trying to use Anaconda for Sublime Text, I found that the Python interpreter I was specifying in the configuration was not working. After some Googling, I found a link (which I’m not able to find again…) on how to troubleshoot the configuration, which in the end boiled down to executing the Python interpreter inside the virtualenv with the following command:

~/Virtualenvs/myvirtualenv/bin/python -c pass

In my case, the result was the following:

dyld: Library not loaded: @executable_path/../.Python
  Referenced from: ~/Virtualenvs/myvirtualenv/bin/python
  Reason: image not found
[1]    74178 trace trap  ~/Virtualenvs/myvirtualenv/bin/python -c pass

After some Googling about this problem, I found the following post: “dyld: Library not loaded: @executable_path/../.Python”. Basically: “dump the packages configuration, recreate the virtualenv, restore the packages, and move everything there”. In my case, the “move everything there” part was a bit tricky, and the “restore packages” part was even worse, as the project used some unsupported packages that were impossible to install (no pip support, and installing manually was quite time-consuming…). So, I tried to do something in between, and finally I found the real difference, and the reason the interpreter was failing (notice the different Python version):

$ ls -la old-virtualenv
...
lrwxr-xr-x   1 myuser  staff      78 29 nov  2014 .Python -> /usr/local/Cellar/python/2.7.5/Frameworks/Python.framework/Versions/2.7/Python
...

$ ls -la new-virtualenv
...
lrwxr-xr-x   1 myuser  staff      80  5 mai 22:42 .Python -> /usr/local/Cellar/python/2.7.8_2/Frameworks/Python.framework/Versions/2.7/Python
...

So, the real fix in my case was:

$ cd old-virtualenv
$ rm .Python
$ ln -s /usr/local/Cellar/python/2.7.8_2/Frameworks/Python.framework/Versions/2.7/Python .Python

Problems with Instant Run (Android Studio 2.0 beta 4) and Retrofit

This morning I was playing with a toy app of mine that uses Retrofit, and I found the following problem with it (and Instant Run):

02-16 07:41:55.550 2976-2976/com.test.android A/art: art/runtime/thread.cc:1329] Throwing new exception 'length=227; index=1017' with unexpected pending exception: java.lang.ArrayIndexOutOfBoundsException: length=227; index=1017

The exception was in the call itself, so it was either Retrofit doing strange things or something deeper. The problem turned out to be an interaction between ART and Instant Run, and it had already been reported.

The original reporter has opened a bug in the AOSP issue tracker, and it is still unsolved. See also the related Retrofit issue.

All in all, there is currently no fully reliable solution. So, if you find this problem in a call using Retrofit, the best course of action is to disable Instant Run and rebuild: Preferences -> Build, Execution, Deployment -> Instant Run -> uncheck “Enable Instant Run to hot swap code/resource changes on deploy” (enabled by default).

Disable Instant Run