Link to the English original: Mastering OpenCV Android Application Programming on Amazon.
The book's companion code is organized by chapter, targets Android versions ranging from API 19 to API 21, and uses either OpenCV 2.4.9 or 2.4.10 depending on the chapter. The code given in this post is based on the book's code, adapted to Android 7.0 (API 24) and OpenCV 3.2, with some preprocessing and other steps added where appropriate so that the code is more sensible overall.
The complete code of the original book can be downloaded here: https://www.packtpub.com/lcode_download/22299.
The updated code is hosted on GitHub: https://github.com/johnhany/MOAAP/tree/master/MOAAP-Chp3-r3.
For setting up an OpenCV 3.1 development environment in Android Studio, see 《在Android Studio上进行OpenCV 3.1开发》.
For setting up an OpenCV NDK development environment in Android Studio with CMake, see 《Android Studio 2.3利用CMAKE进行OpenCV 3.2的NDK开发》.
This chapter shows how to use the OpenCV 3.2 Java API and native API in Android Studio 2, together with Android NDK r15b, to build an image-matching app. The algorithms covered are SIFT, SURF, ORB, BRISK, and FREAK; for how these algorithms work, see Chapter 3 of 《深入OpenCV Android应用开发》 (the Chinese edition of Mastering OpenCV Android Application Programming).
Page 62 of the Chinese edition describes how to add the nonfree module to an Android project, but that approach does not work with OpenCV 3.2. First, in OpenCV 3.2 the SIFT family of matching algorithms was moved into the xfeatures2d module of the opencv_contrib project. Second, even if you add the required source files to the project's Android.mk, the build still fails, partly because of a bug in OpenCV 3.2 itself, and partly because most of these matching algorithms are patent-encumbered, so the features2d_manual.hpp file in the OpenCV sources does not declare the corresponding C++ interfaces by default. You therefore have to build an OpenCV Android SDK that supports SIFT and the other algorithms yourself, and generate the matching OpenCV Manager package. As a result, the code in this post differs considerably from the book's.
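Once the rebuilt SDK and the xfeatures2d JNI module are in place, the patented algorithms become reachable through the ordinary OpenCV 3.2 Java factory methods. A minimal fragment, distilled from the MainActivity listing further below (all names match that listing):

// After OpenCV Manager reports SUCCESS, load our native module...
System.loadLibrary("xfeatures2d");
// ...and SIFT can then be created through the standard 3.2 Java factories:
FeatureDetector detector = FeatureDetector.create(FeatureDetector.SIFT);
DescriptorExtractor extractor = DescriptorExtractor.create(DescriptorExtractor.SIFT);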
In addition, since all of the matching algorithms involved are computationally expensive, this chapter brings in Android multithreading: the core computation runs in an AsyncTask so that the main UI thread is never blocked. See the full code below for the details.
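The pattern, isolated from the anonymous AsyncTask used in MainActivity below: doInBackground() runs the heavy OpenCV calls on a worker thread, and onPostExecute() receives the result back on the UI thread, where it is safe to touch views.

new AsyncTask<Void, Void, Bitmap>() {
    @Override
    protected Bitmap doInBackground(Void... params) {
        // Worker thread: run the expensive detection and matching here.
        return executeTask();
    }

    @Override
    protected void onPostExecute(Bitmap bitmap) {
        // UI thread: safe to update the ImageView with the result.
        ivImage1.setImageBitmap(bitmap);
    }
}.execute();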
Development environment:
Ubuntu 16.04 x64 (for building the OpenCV Android SDK; you can also download my prebuilt SDK directly)
Windows 10 x64 Pro (for project development)
Android Studio 2.3.3 (Gradle 3.3, Android Plugin 2.3.3)
Android 7.0 (API 24)
Android NDK r15b
JDK 8u141
OpenCV 3.2.0 Android SDK (needs to be built yourself)
Building the OpenCV Android SDK:
Follow the procedure in 《Ubuntu 16.04下为Android编译OpenCV 3.2.0 Manager》 to build an OpenCV Android SDK that supports the SIFT, SURF, and FREAK algorithms for this project, then copy the resulting OpenCV-android-sdk folder to some location, e.g. /home/john/Android/OpenCV-contrib-android-sdk (I renamed my own build to OpenCV-contrib-android-sdk to distinguish it from the official prebuilt library).
My Android SDK directory is /home/john/Android/Sdk, and the Android NDK directory is /home/john/Android/Sdk/ndk-bundle.
When the self-built OpenCV Android SDK is imported into the Android project, the displayed name is openCVLibrary310dev, and two modules appear in the project structure view, openCVLibrary310 and openCVLibrary310dev; it is openCVLibrary310dev that must be added to the app module's dependencies. If the SDK was built from the 3.2.0 release sources, the imported module is simply named openCVLibrary320.
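If the import does not register the module automatically, also make sure settings.gradle lists it next to the app module. A one-line sketch, assuming the module ended up named openCVLibrary320 (as in the Gradle files later in this post):

include ':app', ':openCVLibrary320'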
Code and brief explanations:
1. Create an Android Studio project with the package name net.johnhany.moaap_chp3.
2. Find the net.johnhany.moaap_chp3 package in the app\src\main\java directory and add the following code to MainActivity.java:
package net.johnhany.moaap_chp3;

import android.Manifest;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.net.Uri;
import android.os.AsyncTask;
import android.os.Bundle;
import android.os.Environment;
import android.support.annotation.NonNull;
import android.support.v4.app.ActivityCompat;
import android.support.v4.content.ContextCompat;
import android.support.v7.app.AppCompatActivity;
import android.util.Log;
import android.view.Menu;
import android.view.MenuItem;
import android.widget.ImageView;
import android.widget.TextView;
import android.widget.Toast;

import org.opencv.android.BaseLoaderCallback;
import org.opencv.android.LoaderCallbackInterface;
import org.opencv.android.OpenCVLoader;
import org.opencv.android.Utils;
import org.opencv.core.Core;
import org.opencv.core.CvType;
import org.opencv.core.Mat;
import org.opencv.core.MatOfDMatch;
import org.opencv.core.MatOfKeyPoint;
import org.opencv.core.Point;
import org.opencv.core.Scalar;
import org.opencv.core.DMatch;
import org.opencv.features2d.DescriptorExtractor;
import org.opencv.features2d.DescriptorMatcher;
import org.opencv.features2d.FeatureDetector;
import org.opencv.features2d.Features2d;
import org.opencv.imgcodecs.Imgcodecs;
import org.opencv.imgproc.Imgproc;

import java.io.FileNotFoundException;
import java.io.InputStream;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;

public class MainActivity extends AppCompatActivity {

    private final int SELECT_PHOTO_1 = 1;
    private final int SELECT_PHOTO_2 = 2;
    private ImageView ivImage1;
    private TextView tvKeyPointsObject1, tvKeyPointsObject2, tvKeyPointsMatches, tvTime;
    private int keypointsObject1, keypointsObject2, keypointMatches;
    Mat src1, src2, src1_gray, src2_gray;
    static int ACTION_MODE = 0;
    private boolean src1Selected = false, src2Selected = false;
    static int REQUEST_READ_EXTERNAL_STORAGE = 11;
    static int REQUEST_WRITE_EXTERNAL_STORAGE = 12;
    static boolean read_external_storage_granted = false;
    static boolean write_external_storage_granted = false;

    private BaseLoaderCallback mOpenCVCallBack = new BaseLoaderCallback(this) {
        @Override
        public void onManagerConnected(int status) {
            switch (status) {
                case LoaderCallbackInterface.SUCCESS:
                    // OpenCV loaded successfully; now load our own native module.
                    System.loadLibrary("xfeatures2d");
                    break;
                default:
                    super.onManagerConnected(status);
                    break;
            }
        }
    };

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_3_2_0, this, mOpenCVCallBack);

        if (getSupportActionBar() != null) {
            getSupportActionBar().setDisplayHomeAsUpEnabled(true);
        }

        ivImage1 = (ImageView) findViewById(R.id.ivImage1);
        tvKeyPointsObject1 = (TextView) findViewById(R.id.tvKeyPointsObject1);
        tvKeyPointsObject2 = (TextView) findViewById(R.id.tvKeyPointsObject2);
        tvKeyPointsMatches = (TextView) findViewById(R.id.tvKeyPointsMatches);
        keypointsObject1 = keypointsObject2 = keypointMatches = -1;
        tvTime = (TextView) findViewById(R.id.tvTime);

        Intent intent = getIntent();
        if (intent.hasExtra("ACTION_MODE")) {
            ACTION_MODE = intent.getIntExtra("ACTION_MODE", 0);
        }

        // Request the external-storage read permission at runtime.
        if (ContextCompat.checkSelfPermission(MainActivity.this, Manifest.permission.READ_EXTERNAL_STORAGE)
                != PackageManager.PERMISSION_GRANTED) {
            Log.i("permission", "request READ_EXTERNAL_STORAGE");
            ActivityCompat.requestPermissions(MainActivity.this,
                    new String[]{Manifest.permission.READ_EXTERNAL_STORAGE}, REQUEST_READ_EXTERNAL_STORAGE);
        } else {
            Log.i("permission", "READ_EXTERNAL_STORAGE already granted");
            read_external_storage_granted = true;
        }
    }

    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        getMenuInflater().inflate(R.menu.menu_main, menu);
        return true;
    }

    @Override
    public boolean onOptionsItemSelected(MenuItem item) {
        int id = item.getItemId();
        if (id == R.id.action_load_first_image && read_external_storage_granted) {
            Intent photoPickerIntent = new Intent(Intent.ACTION_PICK);
            photoPickerIntent.setType("image/*");
            startActivityForResult(photoPickerIntent, SELECT_PHOTO_1);
            return true;
        } else if (id == R.id.action_load_second_image && read_external_storage_granted) {
            Intent photoPickerIntent = new Intent(Intent.ACTION_PICK);
            photoPickerIntent.setType("image/*");
            startActivityForResult(photoPickerIntent, SELECT_PHOTO_2);
            return true;
        } else if (!read_external_storage_granted) {
            Log.e("APP", "pick image failed");
            return true;
        }
        return super.onOptionsItemSelected(item);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent imageReturnedIntent) {
        super.onActivityResult(requestCode, resultCode, imageReturnedIntent);

        switch (requestCode) {
            case SELECT_PHOTO_1:
                if (resultCode == RESULT_OK && read_external_storage_granted) {
                    try {
                        final Uri imageUri = imageReturnedIntent.getData();
                        final InputStream imageStream = getContentResolver().openInputStream(imageUri);
                        final Bitmap selectedImage = BitmapFactory.decodeStream(imageStream);
                        src1 = new Mat(selectedImage.getHeight(), selectedImage.getWidth(), CvType.CV_8UC4);
                        src1_gray = new Mat(selectedImage.getHeight(), selectedImage.getWidth(), CvType.CV_8UC1);
                        //ivImage1.setImageBitmap(selectedImage);
                        Utils.bitmapToMat(selectedImage, src1);
                        // Convert to single-channel grayscale; FREAK matches single-channel images by default.
                        Imgproc.cvtColor(src1, src1_gray, Imgproc.COLOR_BGRA2GRAY);
                        src1Selected = true;
                    } catch (FileNotFoundException e) {
                        e.printStackTrace();
                    }
                }
                break;
            case SELECT_PHOTO_2:
                if (resultCode == RESULT_OK && read_external_storage_granted) {
                    try {
                        final Uri imageUri = imageReturnedIntent.getData();
                        final InputStream imageStream = getContentResolver().openInputStream(imageUri);
                        final Bitmap selectedImage = BitmapFactory.decodeStream(imageStream);
                        src2 = new Mat(selectedImage.getHeight(), selectedImage.getWidth(), CvType.CV_8UC4);
                        src2_gray = new Mat(selectedImage.getHeight(), selectedImage.getWidth(), CvType.CV_8UC1);
                        Utils.bitmapToMat(selectedImage, src2);
                        Imgproc.cvtColor(src2, src2_gray, Imgproc.COLOR_BGRA2GRAY);
                        src2Selected = true;
                    } catch (FileNotFoundException e) {
                        e.printStackTrace();
                    }
                }
                break;
        }
        Toast.makeText(MainActivity.this, "First image: " + src1Selected + " Second image: " + src2Selected,
                Toast.LENGTH_SHORT).show();

        if (src1Selected && src2Selected) {
            Log.i("APP", "Before Execute");
            // Run the matching on a background thread so the UI thread is not blocked.
            new AsyncTask<Void, Void, Bitmap>() {
                private long startTime, endTime;

                @Override
                protected void onPreExecute() {
                    super.onPreExecute();
                    startTime = System.currentTimeMillis();
                }

                @Override
                protected Bitmap doInBackground(Void... params) {
                    return executeTask();
                }

                @Override
                protected void onPostExecute(Bitmap bitmap) {
                    super.onPostExecute(bitmap);
                    endTime = System.currentTimeMillis();
                    ivImage1.setImageBitmap(bitmap);
                    tvKeyPointsObject1.setText(getString(R.string.result_target_1, keypointsObject1));
                    tvKeyPointsObject2.setText(getString(R.string.result_target_2, keypointsObject2));
                    tvKeyPointsMatches.setText(getString(R.string.result_target_matches, keypointMatches));
                    tvTime.setText(getString(R.string.result_time_cost, endTime - startTime));
                }
            }.execute();
        }
    }

    private Bitmap executeTask() {
        Log.i("APP", "Execute");
        final int MAX_MATCHES = 50;
        FeatureDetector detector;
        MatOfKeyPoint keypoints1, keypoints2;
        DescriptorExtractor descriptorExtractor;
        Mat descriptors1, descriptors2;
        DescriptorMatcher descriptorMatcher;
        MatOfDMatch matches = new MatOfDMatch();
        keypoints1 = new MatOfKeyPoint();
        keypoints2 = new MatOfKeyPoint();
        descriptors1 = new Mat();
        descriptors2 = new Mat();

        Log.i("APP", "before switch");
        switch (ACTION_MODE) {
            case HomeActivity.MODE_SIFT:
                detector = FeatureDetector.create(FeatureDetector.SIFT);
                descriptorExtractor = DescriptorExtractor.create(DescriptorExtractor.SIFT);
                descriptorMatcher = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_SL2);
                //descriptorMatcher = DescriptorMatcher.create(DescriptorMatcher.FLANNBASED);
                break;
            case HomeActivity.MODE_SURF:
                detector = FeatureDetector.create(FeatureDetector.SURF);
                descriptorExtractor = DescriptorExtractor.create(DescriptorExtractor.SURF);
                descriptorMatcher = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_SL2);
                break;
            case HomeActivity.MODE_ORB:
                detector = FeatureDetector.create(FeatureDetector.ORB);
                descriptorExtractor = DescriptorExtractor.create(DescriptorExtractor.ORB);
                descriptorMatcher = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_HAMMING);
                break;
            case HomeActivity.MODE_BRISK:
                detector = FeatureDetector.create(FeatureDetector.BRISK);
                descriptorExtractor = DescriptorExtractor.create(DescriptorExtractor.BRISK);
                descriptorMatcher = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_HAMMING);
                break;
            case HomeActivity.MODE_FREAK:
                // FREAK is descriptor-only; FAST supplies the keypoints.
                detector = FeatureDetector.create(FeatureDetector.FAST);
                descriptorExtractor = DescriptorExtractor.create(DescriptorExtractor.FREAK);
                descriptorMatcher = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_HAMMING);
                break;
            default:
                detector = FeatureDetector.create(FeatureDetector.FAST);
                descriptorExtractor = DescriptorExtractor.create(DescriptorExtractor.BRIEF);
                descriptorMatcher = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_HAMMING);
                break;
        }
        Log.i("APP", "After switch");

        detector.detect(src2_gray, keypoints2);
        detector.detect(src1_gray, keypoints1);
        Log.i("APP", CvType.typeToString(src1_gray.type()) + " " + CvType.typeToString(src2_gray.type()));
        Log.i("APP", keypoints1.toArray().length + " keypoints");
        Log.i("APP", keypoints2.toArray().length + " keypoints");
        Log.i("APP", "Detect");
        keypointsObject1 = keypoints1.toArray().length;
        keypointsObject2 = keypoints2.toArray().length;

        descriptorExtractor.compute(src1_gray, keypoints1, descriptors1);
        descriptorExtractor.compute(src2_gray, keypoints2, descriptors2);

        descriptorMatcher.match(descriptors1, descriptors2, matches);
        Log.i("APP", matches.toArray().length + " matches");
        keypointMatches = matches.toArray().length;

        // matches.toList() returns a copy, so sort that copy and write the
        // truncated result back with fromList().
        List<DMatch> listOfDMatch = matches.toList();
        Collections.sort(listOfDMatch, new Comparator<DMatch>() {
            @Override
            public int compare(DMatch o1, DMatch o2) {
                if (o1.distance < o2.distance) return -1;
                if (o1.distance > o2.distance) return 1;
                return 0;
            }
        });
        if (listOfDMatch.size() > MAX_MATCHES) {
            matches.fromList(listOfDMatch.subList(0, MAX_MATCHES));
        }

        //Mat src3 = src1.clone();
        //Features2d.drawMatches(src1, keypoints1, src2, keypoints2, matches, src3);
        Mat src3 = drawMatches(src1_gray, keypoints1, src2_gray, keypoints2, matches, false);
        Log.i("APP", CvType.typeToString(src3.type()));
        Bitmap image1 = Bitmap.createBitmap(src3.cols(), src3.rows(), Bitmap.Config.ARGB_8888);
        Utils.matToBitmap(src3, image1);
        Imgproc.cvtColor(src3, src3, Imgproc.COLOR_BGR2RGB);

        // Request the external-storage write permission before saving the result.
        if (ContextCompat.checkSelfPermission(MainActivity.this, Manifest.permission.WRITE_EXTERNAL_STORAGE)
                != PackageManager.PERMISSION_GRANTED) {
            Log.i("permission", "request WRITE_EXTERNAL_STORAGE");
            ActivityCompat.requestPermissions(MainActivity.this,
                    new String[]{Manifest.permission.WRITE_EXTERNAL_STORAGE}, REQUEST_WRITE_EXTERNAL_STORAGE);
        } else {
            Log.i("permission", "WRITE_EXTERNAL_STORAGE already granted");
            write_external_storage_granted = true;
        }
        if (write_external_storage_granted) {
            boolean bool = Imgcodecs.imwrite(Environment.getExternalStorageDirectory() + "/Download/" + ACTION_MODE + ".png", src3);
            Log.i("APP", bool + " " + Environment.getExternalStorageDirectory() + "/Download/" + ACTION_MODE + ".png");
        }

        return image1;
    }

    @Override
    public void onRequestPermissionsResult(int requestCode, @NonNull String permissions[], @NonNull int[] grantResults) {
        if (requestCode == REQUEST_READ_EXTERNAL_STORAGE) {
            // If the request is cancelled, the result arrays are empty.
            if (grantResults.length > 0 && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
                Log.i("permission", "READ_EXTERNAL_STORAGE granted");
                read_external_storage_granted = true;
            } else {
                Log.i("permission", "READ_EXTERNAL_STORAGE denied");
            }
        } else if (requestCode == REQUEST_WRITE_EXTERNAL_STORAGE) {
            if (grantResults.length > 0 && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
                Log.i("permission", "WRITE_EXTERNAL_STORAGE granted");
                write_external_storage_granted = true;
            } else {
                Log.i("permission", "WRITE_EXTERNAL_STORAGE denied");
            }
        }
    }

    @Override
    protected void onResume() {
        super.onResume();
    }

    static Mat drawMatches(Mat img1, MatOfKeyPoint key1, Mat img2, MatOfKeyPoint key2, MatOfDMatch matches, boolean imageOnly) {
        //https://github.com/mustafaakin/image-matcher/tree/master/src/in/mustafaak/imagematcher
        Mat out = new Mat();
        Mat im1 = new Mat();
        Mat im2 = new Mat();
        Imgproc.cvtColor(img1, im1, Imgproc.COLOR_GRAY2RGB);
        Imgproc.cvtColor(img2, im2, Imgproc.COLOR_GRAY2RGB);
        if (imageOnly) {
            MatOfDMatch emptyMatch = new MatOfDMatch();
            MatOfKeyPoint emptyKey1 = new MatOfKeyPoint();
            MatOfKeyPoint emptyKey2 = new MatOfKeyPoint();
            Features2d.drawMatches(im1, emptyKey1, im2, emptyKey2, emptyMatch, out);
        } else {
            Features2d.drawMatches(im1, key1, im2, key2, matches, out);
        }
        //Bitmap bmp = Bitmap.createBitmap(out.cols(), out.rows(), Bitmap.Config.ARGB_8888);
        Imgproc.cvtColor(out, out, Imgproc.COLOR_BGR2RGB);
        Imgproc.putText(out, "Frame", new Point(img1.width() / 2, 30),
                Core.FONT_HERSHEY_PLAIN, 2, new Scalar(0, 255, 255), 3);
        Imgproc.putText(out, "Match", new Point(img1.width() + img2.width() / 2, 30),
                Core.FONT_HERSHEY_PLAIN, 2, new Scalar(255, 0, 0), 3);
        return out;
    }
}
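The listing above simply keeps the 50 smallest-distance matches. An alternative worth knowing, not used in the book, is Lowe's ratio test via knnMatch(). A hedged sketch that reuses the descriptorMatcher, descriptors1, descriptors2, and matches variables from executeTask() and assumes java.util.ArrayList is imported:

List<MatOfDMatch> knnMatches = new ArrayList<>();
// For each query descriptor, retrieve its two nearest neighbors.
descriptorMatcher.knnMatch(descriptors1, descriptors2, knnMatches, 2);
List<DMatch> goodMatches = new ArrayList<>();
for (MatOfDMatch m : knnMatches) {
    DMatch[] pair = m.toArray();
    // Keep a match only if it is clearly better than the second-best candidate.
    if (pair.length >= 2 && pair[0].distance < 0.75f * pair[1].distance) {
        goodMatches.add(pair[0]);
    }
}
matches.fromList(goodMatches);

Unlike a fixed top-50 cut, the ratio test adapts the number of surviving matches to how ambiguous each descriptor actually is.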
Compared with the book's code, the main changes are:
(1) removed some unused imports and added some necessary ones;
(2) added a runtime request for the external-storage read permission in onCreate(), with the corresponding variables REQUEST_READ_EXTERNAL_STORAGE and read_external_storage_granted, and a request for the external-storage write permission just before the code that saves the output image, with the corresponding variables REQUEST_WRITE_EXTERNAL_STORAGE and write_external_storage_granted (a distilled sketch of this flow follows the list);
(3) replaced some Log.d() calls with Log.i() to make the debug output easier to see;
(4) updated the OpenCV version: OPENCV_VERSION_2_4_9 becomes OPENCV_VERSION_3_2_0;
(5) renamed the JNI module from nonfree to xfeatures2d;
(6) since FREAK matches single-channel images by default, added code that converts the color images to grayscale, so the results are shown in single-channel form for all methods.
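A distilled sketch of the runtime-permission flow mentioned in (2), using the same support-library calls and field names as the full listing:

// In onCreate(): ask for READ_EXTERNAL_STORAGE if it has not been granted yet.
if (ContextCompat.checkSelfPermission(this, Manifest.permission.READ_EXTERNAL_STORAGE)
        != PackageManager.PERMISSION_GRANTED) {
    ActivityCompat.requestPermissions(this,
            new String[]{Manifest.permission.READ_EXTERNAL_STORAGE},
            REQUEST_READ_EXTERNAL_STORAGE);
} else {
    read_external_storage_granted = true;
}
// The user's decision arrives asynchronously in onRequestPermissionsResult(),
// which flips read_external_storage_granted; image picking is enabled only after that.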
3. Add a file named HomeActivity.java to the app\src\main\java directory (in the same net.johnhany.moaap_chp3 package) with the following content:
package net.johnhany.moaap_chp3;

import android.app.Activity;
import android.content.Intent;
import android.os.Bundle;
import android.view.View;
import android.widget.Button;

public class HomeActivity extends Activity {

    public static final int MODE_SIFT = 1;
    public static final int MODE_SURF = 2;
    public static final int MODE_ORB = 3;
    public static final int MODE_BRISK = 4;
    public static final int MODE_FREAK = 5;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_home);

        Button bSIFT, bSURF, bORB, bBRISK, bFREAK;
        bSIFT = (Button) findViewById(R.id.bSIFT);
        bSURF = (Button) findViewById(R.id.bSURF);
        bORB = (Button) findViewById(R.id.bORB);
        bBRISK = (Button) findViewById(R.id.bBRISK);
        bFREAK = (Button) findViewById(R.id.bFREAK);

        // Each button launches MainActivity with the chosen algorithm in the ACTION_MODE extra.
        bSIFT.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                Intent i = new Intent(getApplicationContext(), MainActivity.class);
                i.putExtra("ACTION_MODE", MODE_SIFT);
                startActivity(i);
            }
        });
        bSURF.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                Intent i = new Intent(getApplicationContext(), MainActivity.class);
                i.putExtra("ACTION_MODE", MODE_SURF);
                startActivity(i);
            }
        });
        bORB.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                Intent i = new Intent(getApplicationContext(), MainActivity.class);
                i.putExtra("ACTION_MODE", MODE_ORB);
                startActivity(i);
            }
        });
        bBRISK.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                Intent i = new Intent(getApplicationContext(), MainActivity.class);
                i.putExtra("ACTION_MODE", MODE_BRISK);
                startActivity(i);
            }
        });
        bFREAK.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                Intent i = new Intent(getApplicationContext(), MainActivity.class);
                i.putExtra("ACTION_MODE", MODE_FREAK);
                startActivity(i);
            }
        });
    }
}
4. Modify app/CMakeLists.txt to the following:
cmake_minimum_required(VERSION 3.4.1)

set(CMAKE_VERBOSE_MAKEFILE on)

set(ocvlibs "E:/dev-lib/opencv-contrib-android-sdk/sdk/native/libs")
include_directories(E:/dev-lib/opencv-contrib-android-sdk/sdk/native/jni/include)

add_library(libopencv_java3 SHARED IMPORTED)
set_target_properties(libopencv_java3 PROPERTIES
                      IMPORTED_LOCATION "${ocvlibs}/${ANDROID_ABI}/libopencv_java3.so")

add_library( # Sets the name of the library.
             xfeatures2d
             # Sets the library as a shared library.
             SHARED
             # Provides a relative path to your source file(s).
             src/main/cpp/xfeatures2d_init.cpp
             src/main/cpp/sift.cpp
             src/main/cpp/surf.cpp
             src/main/cpp/freak.cpp )

find_library( # Sets the name of the path variable.
              log-lib
              # Specifies the name of the NDK library that
              # you want CMake to locate.
              log )

target_link_libraries( # Specifies the target library.
                       xfeatures2d
                       android
                       log
                       libopencv_java3
                       # Links the target library to the log library
                       # included in the NDK.
                       ${log-lib} )
For the meaning of each entry in this file, see 《Android Studio 2.3利用CMAKE进行OpenCV 3.2的NDK开发》. Note that the library name xfeatures2d declared in add_library() must match the System.loadLibrary("xfeatures2d") call in MainActivity.
5. Copy the six files freak.cpp, precomp.hpp, sift.cpp, surf.cpp, surf.hpp, and xfeatures2d_init.cpp from the /home/john/Downloads/opencv-master/opencv_contrib/modules/xfeatures2d/src directory into the app\src\main\cpp folder. precomp.hpp then needs the following changes.

Comment out lines 52-53:

#include "opencv2/core/private.hpp"
#include "opencv2/core/private.cuda.hpp"

Then comment out line 62:

#include "opencv2/core/private.hpp"
6. Modify the app/src/main/res/layout/activity_main.xml file to the following:
<?xml version="1.0" encoding="utf-8"?>
<ScrollView xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context="net.johnhany.moaap_chp3.MainActivity">

    <LinearLayout
        android:orientation="vertical"
        android:layout_width="match_parent"
        android:layout_height="wrap_content">

        <ImageView
            android:layout_width="match_parent"
            android:layout_height="0dp"
            android:layout_weight="0.5"
            android:id="@+id/ivImage1" />

        <TextView
            android:id="@+id/tvKeyPointsObject1"
            android:layout_width="match_parent"
            android:layout_height="wrap_content" />

        <TextView
            android:id="@+id/tvKeyPointsObject2"
            android:layout_width="match_parent"
            android:layout_height="wrap_content" />

        <TextView
            android:id="@+id/tvKeyPointsMatches"
            android:layout_width="match_parent"
            android:layout_height="wrap_content" />

        <TextView
            android:id="@+id/tvTime"
            android:layout_width="match_parent"
            android:layout_height="wrap_content" />

    </LinearLayout>
</ScrollView>
7. Add an activity_home.xml file to the app/src/main/res/layout directory with the following content:
<?xml version="1.0" encoding="utf-8"?>
<ScrollView xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent" >

    <LinearLayout
        android:layout_height="wrap_content"
        android:layout_width="match_parent"
        android:orientation="vertical" >

        <Button
            android:id="@+id/bSIFT"
            android:layout_height="wrap_content"
            android:layout_width="wrap_content"
            android:text="SIFT" />

        <Button
            android:id="@+id/bSURF"
            android:layout_height="wrap_content"
            android:layout_width="wrap_content"
            android:text="SURF" />

        <Button
            android:id="@+id/bORB"
            android:layout_height="wrap_content"
            android:layout_width="wrap_content"
            android:text="ORB" />

        <Button
            android:id="@+id/bBRISK"
            android:layout_height="wrap_content"
            android:layout_width="wrap_content"
            android:text="BRISK" />

        <Button
            android:id="@+id/bFREAK"
            android:layout_height="wrap_content"
            android:layout_width="wrap_content"
            android:text="FREAK" />

    </LinearLayout>
</ScrollView>
8. Create a new folder named menu under app/src/main/res and add a menu_main.xml file to it with the following content:
<?xml version="1.0" encoding="utf-8"?>
<menu xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    tools:context="net.johnhany.moaap_chp3.MainActivity">
    <item
        android:id="@+id/action_load_first_image"
        android:title="@string/action_load_first_image"
        android:orderInCategory="1"
        app:showAsAction="never" />
    <item
        android:id="@+id/action_load_second_image"
        android:title="@string/action_load_second_image"
        android:orderInCategory="2"
        app:showAsAction="never" />
</menu>
9. Modify the app/src/main/res/values/strings.xml file to:
<resources>
    <string name="app_name">第三章 - 深入OpenCV Android应用开发</string>
    <string name="action_load_first_image">Load First Image</string>
    <string name="action_load_second_image">Load Second Image</string>
    <string name="title_activity_main">第三章 - 深入OpenCV Android应用开发</string>
    <string name="title_activity_home">第三章 - 深入OpenCV Android应用开发</string>
    <string name="result_target_1">目标1:%1$d</string>
    <string name="result_target_2">目标2:%1$d</string>
    <string name="result_target_matches">关键点匹配:%1$d</string>
    <string name="result_time_cost">耗费时间:%1$d ms</string>
</resources>
10. Modify the app/src/main/AndroidManifest.xml file to the following:
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="net.johnhany.moaap_chp3">

    <supports-screens
        android:anyDensity="true"
        android:largeScreens="true"
        android:normalScreens="true"
        android:resizeable="true"
        android:smallScreens="true" />

    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
    <uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />

    <application
        android:allowBackup="true"
        android:icon="@mipmap/ic_launcher"
        android:label="@string/app_name"
        android:supportsRtl="true"
        android:theme="@style/AppTheme">
        <activity
            android:name=".MainActivity"
            android:label="@string/title_activity_main"
            android:parentActivityName=".HomeActivity" >
        </activity>
        <activity
            android:name=".HomeActivity"
            android:label="@string/title_activity_home">
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />
                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>
    </application>

</manifest>
11. Modify the app/build.gradle file to:
apply plugin: 'com.android.application'

android {
    compileSdkVersion 24
    buildToolsVersion "26.0.1"
    defaultConfig {
        applicationId "net.johnhany.moaap_chp3"
        minSdkVersion 16
        targetSdkVersion 24
        versionCode 1
        versionName "1.0"
        testInstrumentationRunner "android.support.test.runner.AndroidJUnitRunner"
        externalNativeBuild {
            cmake {
                cppFlags "-std=c++11", "-frtti", "-fexceptions"
                abiFilters 'x86', 'x86_64', 'armeabi', 'armeabi-v7a', 'arm64-v8a', 'mips', 'mips64'
            }
        }
    }
    sourceSets {
        main {
            jniLibs.srcDirs = ['E:\\dev-lib\\opencv-contrib-android-sdk\\sdk\\native\\libs']
        }
    }
    buildTypes {
        release {
            minifyEnabled false
            proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
        }
    }
    externalNativeBuild {
        cmake {
            path "CMakeLists.txt"
        }
    }
}

dependencies {
    compile fileTree(include: ['*.jar'], dir: 'libs')
    androidTestCompile('com.android.support.test.espresso:espresso-core:2.2.2', {
        exclude group: 'com.android.support', module: 'support-annotations'
    })
    compile 'com.android.support:appcompat-v7:24.2.1'
    compile 'com.android.support.constraint:constraint-layout:1.0.2'
    testCompile 'junit:junit:4.12'
    compile project(':openCVLibrary320')
}
12. Modify the build.gradle file of the openCVLibrary320 module to:
apply plugin: 'com.android.library'

android {
    compileSdkVersion 24
    buildToolsVersion "26.0.1"
    defaultConfig {
        minSdkVersion 16
        targetSdkVersion 24
    }
    buildTypes {
        release {
            minifyEnabled false
            proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.txt'
        }
    }
}
13. Confirm that the build.gradle file in the project root reads:
// Top-level build file where you can add configuration options common to all sub-projects/modules.
buildscript {
    repositories {
        jcenter()
    }
    dependencies {
        classpath 'com.android.tools.build:gradle:2.3.3'
        // NOTE: Do not place your application dependencies here; they belong
        // in the individual module build.gradle files
    }
}

allprojects {
    repositories {
        jcenter()
    }
}

task clean(type: Delete) {
    delete rootProject.buildDir
}
Results:
Comments:

With Android Studio 3.4 I get:
ERROR: Expected NDK STL shared object file at D:\Android\Sdk\ndk-bundle\sources\cxx-stl\gnu-libstdc++\4.9\libs\arm64-v8a\libgnustl_shared.so

Do you still have the Android Studio 2.3 version of the code? The download location now seems to hold a 3.x version.

The code given in the article should still work with 2.3; on GitHub I try to keep everything updated to the latest tool versions. The core code barely differs anyway; only a few files such as build.gradle are different.
Following this method, I end up with: Error: fatal error: 'opencv2/core/cuda.hpp' file not found

You need to disable CUDA support when building OpenCV.
I'm using your prebuilt 3.2 library. In CMakeLists.txt, the include path has to point into the SDK's own include directory:
include_directories(D:/opencv/opencv-contrib-android-sdk/sdk/native/jni/include)
If I instead copy the files from include into cpp and use the following, it errors out. What is the difference between the two?
#include_directories(${CMAKE_SOURCE_DIR}/src/main/cpp/include)

The difference between these two approaches is discussed in this article: http://johnhany.net/2017/07/opencv-ndk-dev-with-cmake-on-android-studio/, take a look!
Could you send a copy of the book's companion code to my email? Thanks in advance! 18241266008@sina.cn

Hi, if you can't download the original book's code, you can get it here: https://pan.baidu.com/s/1pLLGsyf
My code can be downloaded from my GitHub page; the link is in the article.
Any news on Chapter 4? The book's own code is really hard to follow; the versions are far too old.

Could you send me the compiled code?

Thanks for sharing, big thumbs up!!

Thanks for the support! :)
I followed the steps above but got this error; where did I go wrong?
java.lang.UnsatisfiedLinkError: dlopen failed: cannot locate symbol "_ZN2cv3hal6exp32fEPKfPfi" referenced by "libxfeatures2d.so"

By the way, about the step of copying the six files freak.cpp, precomp.hpp, sift.cpp, surf.cpp, surf.hpp, and xfeatures2d_init.cpp from /home/john/Downloads/opencv-master/opencv_contrib/modules/xfeatures2d/src into the app\src\main\jni folder and modifying precomp.hpp as described:
I downloaded those six files from the web via Google, so I'm not sure whether that matters!!
Could you provide your copies of these six files,
or the compiled .so files?
The full error message is:
E/art: dlopen("/data/app/com.example.yantingchen.testopencv3-2/lib/arm/libxfeatures2d.so", RTLD_LAZY) failed: dlopen failed: cannot locate symbol "_ZN2cv3hal6exp32fEPKfPfi" referenced by "libxfeatures2d.so"…
E/AndroidRuntime: FATAL EXCEPTION: main
Process: com.example.yantingchen.testopencv3, PID: 13942
java.lang.UnsatisfiedLinkError: dlopen failed: cannot locate symbol "_ZN2cv3hal6exp32fEPKfPfi" referenced by "libxfeatures2d.so"…
at java.lang.Runtime.loadLibrary(Runtime.java:371)
at java.lang.System.loadLibrary(System.java:989)
at com.example.yantingchen.testopencv3.MainActivity$1.onManagerConnected(MainActivity.java:68)
at org.opencv.android.AsyncServiceHelper$3.onServiceConnected(AsyncServiceHelper.java:319)
at android.app.LoadedApk$ServiceDispatcher.doConnected(LoadedApk.java:1314)
at android.app.LoadedApk$ServiceDispatcher$RunConnection.run(LoadedApk.java:1331)
at android.os.Handler.handleCallback(Handler.java:739)
at android.os.Handler.dispatchMessage(Handler.java:95)
at android.os.Looper.loop(Looper.java:155)
at android.app.ActivityThread.main(ActivityThread.java:5696)
at java.lang.reflect.Method.invoke(Native Method)
at java.lang.reflect.Method.invoke(Method.java:372)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:1028)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:823)
The phone I'm testing on runs API 21.
On an API 23 emulator the error is different, though.

How can I make OpenCV 3.1 support Android API level 18? The project's minSdkVersion is 18, so I'm wondering whether there is a way to support down to 18.

It should be possible. Besides changing compileSdkVersion and targetSdkVersion to 18, the runtime permission-request code also has to be removed. There may be other changes needed as well; I would have to try it to know for sure.
I bought 《深入OpenCV Android应用开发》 and have just started working through it. Do you plan to update the code for the other chapters as well?

Sorry for the late reply. The later chapters will of course be updated! I've just been held up by other things recently; I'll get to them as soon as I can :)

You're amazing! I've only just started with Android Studio and OpenCV, and you're a lifesaver!!! Looking forward to what comes next~~~

Thanks for the support! Come back often ;)
Could you please send me a copy of the book's complete code? I can't download it from the site. Email: 3302436132@qq.com
Hi, regarding the opencv_contrib files in step 6: I downloaded the six files from GitHub and commented out lines 52-53 and 66 of precomp.hpp, and I'm using the OpenCV SDK you compiled, but the build fails with: Error:(46, 40) opencv2/xfeatures2d/cuda.hpp: No such file or directory
I then commented out lines 46 and 48 as well, but it still fails:
Error:(48, 4) error: #error this is a private header which should not be used from outside of the OpenCV library
Error:(52, 22) cvconfig.h: No such file or directory
What could be causing this? Could you share your freak.cpp, precomp.hpp, sift.cpp, surf.cpp, surf.hpp, and xfeatures2d_init.cpp files?
I think I misunderstood: I assumed the package at http://pullrequest.opencv.org/buildbot/export/opencv_releases/precommit_pack_android/20160723-073613_116/
was the one you compiled. Could you share your own OpenCV Android SDK build? I don't have an Ubuntu environment, and I don't know how to build it under Windows.

Here is the library I compiled, though I haven't tested whether it works under Windows ;)
http://pan.baidu.com/s/1slAc6Bj

It works, thanks a lot!

Hi, I downloaded your compiled SDK and noticed that java\src\org\opencv contains bioinspired and structured_light, two modules that belong to opencv_contrib and are not in the official standard library. Did you compile those in yourself? If so, could you share how? Thanks in advance.
I see you've made it public now, thank you.

OK~

Hi, your SDK actually has the six cpp files from the opencv_contrib/modules/xfeatures2d/src directory compiled in, right? I didn't use the library you provided, and I have no Manager apk; with just your compiled SDK I successfully extracted SURF features from an image.

That's the main part of the work, and on top of that the declarations of these classes in features2d_manual.hpp have to be modified. With the cpp implementations plus the declarations in the hpp, the new module is in!

Right, that explains why I could simply include the headers, write my C++ code, and use it directly. Thanks!

Hi, do you happen to have an OpenCV 2.4.10 library built with CMake? If so, could you share it? My email is 1219493801@qq.com

Sorry, I haven't used 2.4.10 for a long time and can no longer find a build of that version on my machine ^_^
Did you write all this code for a professor's project, or purely out of personal interest? Really impressive!

Thanks, thanks! The code for the 《深入OpenCV Android应用开发》 series is the book's code adapted to newer platforms and tools; the other posts are mostly driven by interest, since project code would involve copyright issues and the like ;)