Some time ago I wrote the following post: Android compatibility with 32-bit libraries on a 64-bit device.
In that article, the proposed solution was to manually remove the library files that were causing the problem using
exclude. See that article for the details on why that was needed.
There’s a better way using ABI Filters, though.
abiFilters "armeabi", "armeabi-v7a", "x86"
With ABI Filters we specify the architectures that we want to keep, which is the opposite of the previous approach, where we manually removed the libraries we didn't want. So here we are keeping the 32-bit architectures, both for ARM and x86 (and excluding the 64-bit libraries and MIPS).
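For context, a minimal sketch of where that line lives in build.gradle (the surrounding android block is an assumption, not taken from the original post):

```groovy
android {
    defaultConfig {
        ndk {
            // Keep only 32-bit ARM and x86 native libraries;
            // 64-bit and MIPS .so files are dropped from the APK.
            abiFilters "armeabi", "armeabi-v7a", "x86"
        }
    }
}
```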
Notice that nowadays I would also exclude “armeabi”, as current devices use at least “armeabi-v7a”.
Finally, notice that starting in August 2019 all apps using native libraries will have to provide both 32-bit and 64-bit versions, as Google states in the blog post “Improving app security and performance on Google Play for years to come”:
In August 2019, Play will require that new apps and app updates with native libraries provide 64-bit versions in addition to their 32-bit versions.
See also Reducing APK size by using ABI Filters and APK split by Igor Wojda for other uses of ABI Filters.
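The APK split approach mentioned in that last article is worth a quick sketch too: instead of filtering libraries out of a single APK, Gradle produces one APK per ABI. The block below is a sketch, not taken from that article:

```groovy
android {
    splits {
        abi {
            enable true
            reset()                      // drop the default ABI list
            include "armeabi-v7a", "x86" // one APK per listed ABI
            universalApk false           // skip the fat APK with all ABIs
        }
    }
}
```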
After updating to Gradle plugin 3.0.0 beta 4 our build failed with the following message:
buildTypeMatching has been removed. Use buildTypes..fallbacks
Our libraries have release and debug buildTypes, but our app has two additional buildTypes: releaseWithLogs and debugMinified.
Snippet of our app Gradle file:
buildTypeMatching 'releaseWithLogs', 'release'
buildTypeMatching 'debugMinified', 'debug'
After some investigation, I found the following announcement: Android Studio 3.0 Beta 4 is now available. There, it mentions:
You now provide fallbacks for missing build types and flavors using
matchingFallbacks (which replaces
buildTypeMatching and
productFlavorMatching). You also provide the default selection and fallbacks for missing dimensions using
missingDimensionStrategy (which replaces
flavorSelection).
So, our previous app build.gradle gets converted to:
buildTypes {
    releaseWithLogs {
        //buildTypeMatching 'releaseWithLogs', 'release' // remove this
        matchingFallbacks = ['release'] // instead use this
    }
    debugMinified {
        //buildTypeMatching 'debugMinified', 'debug' // remove this
        matchingFallbacks = ['debug'] // instead use this
    }
}
Notice that, instead of saying at the top level that buildType
releaseWithLogs will also match
release (with
buildTypeMatching 'releaseWithLogs', 'release'), we specify the match inside the buildType itself. Same for
debugMinified. Also notice that there's no need to include this in the
release and
debug buildTypes, as they already match.
The original question and answer can be found in Gradle plugin 3.0.0 beta 4: “buildTypeMatching has been removed. Use buildTypes..fallbacks”
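For flavor dimensions the announcement points to the analogous missingDimensionStrategy; here is a hypothetical sketch (the dimension and flavor names are made up, not from our project):

```groovy
android {
    defaultConfig {
        // If a library declares a "tier" dimension that the app lacks,
        // pick its "free" flavor, falling back to "demo".
        missingDimensionStrategy 'tier', 'free', 'demo'
    }
}
```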
I’ve been working with Kotlin for a while, mainly for side projects or toy projects. Since the Google I/O 2017 announcement it has become clear that there are no more reasons or excuses not to use it in production.
One of the big selling points of Kotlin is that you can start small, by converting one class or two, or by creating new ones, while keeping all the remaining code in Java. So, interop between the two languages is almost 100% transparent. Almost.
Working to convert a small project step by step, I started converting activities to Kotlin. Those activities use ButterKnife (I'm using the current version, which is 8.7.0) to inject the views. So, after converting the first activity I stumbled upon a problem with the annotation processor: in the Gradle script, you can use either
annotationProcessor or
kapt, but not both at the same time. So, you have to choose:
- annotationProcessor only will not find Kotlin classes, and because of that injection will silently fail at runtime,
- kapt only will make compilation fail.
The final workaround I found was:
- using kapt3 (by applying the
kotlin-kapt plugin in the Gradle script), and
- adding a
@JvmField annotation in addition to the ButterKnife annotations, so the Kotlin compiler generates public fields instead of getters and setters.
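In Gradle terms, the first point boils down to something like this (dependency coordinates are for ButterKnife 8.7.0; the compile configuration matches the Gradle plugin of that era):

```groovy
apply plugin: 'kotlin-kapt'

dependencies {
    compile 'com.jakewharton:butterknife:8.7.0'
    // kapt replaces annotationProcessor for the ButterKnife compiler
    kapt 'com.jakewharton:butterknife-compiler:8.7.0'
}
```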
By applying kapt3 we fix the compilation error involving “
kotlin.jvm.internal.FunctionReference.<init>(ILjava/lang/Object;)V”, and by converting Kotlin fields to plain old Java fields we allow the ButterKnife compiler to find the fields to inject, as it is unable to find Kotlin fields.
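Put together, the second point looks roughly like this in a converted activity (the view id, layout, and text are made up for the example; note that @JvmField cannot be combined with lateinit, hence the nullable var):

```kotlin
import android.os.Bundle
import android.support.v7.app.AppCompatActivity
import android.widget.TextView
import butterknife.BindView
import butterknife.ButterKnife

class NextActivity : AppCompatActivity() {

    // @JvmField exposes a real public field for the ButterKnife processor
    @BindView(R.id.text_view)
    @JvmField var textView: TextView? = null

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_next)
        ButterKnife.bind(this)
        textView?.text = "Injected!"
    }
}
```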
You can find the source code with different options in different branches (the one with the final solution is kotlin-workaround) in this GitHub project.
The project has two activities: one (
MainActivity) that is written in Java and kept in that language, and a second one (
NextActivity) that is converted to Kotlin. Notice that a simple suite of tests is available to check that both activities are being correctly injected: there is a
TextView in each activity whose text is replaced from code to prove that the activity has been successfully injected.
Hope this tip is useful!
A few days ago I happened to stumble upon this fun article about a mechanical 4-bit calculator made of cardboard. Even if it's quite unreliable, it is interesting to see the methodical approach and all the creativity needed to make this work with only cardboard and marbles.
In the comments of the article there is a YouTube video of a calculator, made of wood, that implements 6-bit addition. It is very reliable, and it takes a completely different approach.
Repo updated on 2017-03-13: added additional debug configurations to check different options to enable/disable minification, optimization and obfuscation. See the discussion in Optimize without obfuscate your debug build, including this comment.
Sometimes you have to add code to your applications that is used for debugging purposes. This can be very useful, and sometimes it is kept there because it helps in the development and debugging of different parts of the application. But some of this code can have unintended consequences:
- it can reveal sensitive data to a potential attacker (internal URLs, session cookies, etc.)
- it can have a performance impact in your application (excessive logging, performing operations not needed for release builds, etc.)
- it can lower the security of your application (backdoor-like features to help while debugging, that can disable certain security features, or completely bypass them, etc.)
Continue reading Disabling (and removing) code on release builds
We have a Jenkins server taking care of CI for an Android project. In the server we were using Java 7, but since we updated a few dependencies we needed Java 8 to run some Gradle plugins. After the change, we suddenly started to get this error in the builds:
* What went wrong:
A problem occurred evaluating project ':my-project-app'.
> java.lang.UnsupportedClassVersionError: com/android/build/gradle/AppPlugin : Unsupported major.minor version 52.0
After googling it, I found the following SO answer, pointing to the need for Java 8 (class file major version 52): How to fix java.lang.UnsupportedClassVersionError: Unsupported major.minor version.
So, given that:
- server has both Java 7 and Java 8 installed
- Jenkins is configured to run using Java 7, and changing that may pose some problems (and currently any failure could put our schedule in jeopardy)
we decided to just run the Gradle script using Java 8. To configure it, the following change was made to the job configuration: in the Build Environment section, enable Inject environment variables to the build process and add a JAVA_HOME entry pointing at the Java 8 installation to Properties Content.
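For illustration, such a Properties Content entry could look like this (the path is a typical Debian/Ubuntu location, shown as an example rather than the real one):

```
JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
```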
And hit Rebuild!
A few days ago I was trying to install a project’s dependencies with a simple npm install, but in the instructions I was given there was an extra step to manually install one additional package. When I checked package.json, the package was there… So, what was going on!?
After searching for a while, it turned out that a previous developer had used npm shrinkwrap “to predownload the packages” (or something like that, from what he said). This command “freezes” the list of packages at their current versions, so the next time you do an npm install you will get exactly the same dependency versions, even if you use imprecise version specifications in your package.json. It does this by creating an npm-shrinkwrap.json file next to your package.json (similar to a pip freeze, but it also affects npm install). The problem is that once you have this file, package.json is ignored, and even if you add a new package to it, it will not be installed.
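To make the “freezing” concrete, here is a made-up, heavily trimmed npm-shrinkwrap.json: every dependency gets pinned to an exact version, regardless of the ranges in package.json (the package names and versions below are examples only):

```json
{
  "name": "my-app",
  "version": "1.0.0",
  "dependencies": {
    "express": {
      "version": "4.15.4",
      "from": "express@^4.15.0",
      "resolved": "https://registry.npmjs.org/express/-/express-4.15.4.tgz"
    }
  }
}
```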
So, while this is good for the reproducibility of environments, when you update your package list you also need to update the npm-shrinkwrap.json file. The steps to add a new package are:
$ npm install my_new_package --save
$ npm shrinkwrap
Or, to update:
$ npm update
$ npm shrinkwrap
If your dependencies are screwed up, like it happened to me, the easiest way to clean up the mess is:
$ rm -rf node_modules
$ npm install
$ npm shrinkwrap