OpenCV in Android
Using OpenCV in Android. This tutorial is tested under Ubuntu 10.04 + Android SDK r05 + Android NDK r03.
Preparing the development environment
Download and install the Android SDK. Details can be found here
Download Eclipse and install the ADT plugin. Details can be found here
Download the Android NDK. This tool is used to cross-compile OpenCV source code for Android. Currently (NDK r3) only C is fully supported, so I can only use OpenCV 1.1 under Android. The latest version of OpenCV uses lots of STL functions. :(
Download OpenCV source code
Download the source code here. The source code in the cxcore and cv directories is based on the OpenCV 1.1pre version. I changed fewer than 10 lines to make it pass compilation. I do not include highgui because it is too much work to cross-compile libraries like IJG. I provide two simple functions in cvjni.cpp to pass images between the JVM and the native environment.
Test the code
Create a new Android project in eclipse. For example, called testOpenCV.
In the root directory of the project, create a new folder called jni and extract all files in android_opencv.tar.gz to this folder.
Change to the Android NDK directory and run
./build/host-setup.sh
This will tell you whether your system is ready to cross-compile the source code. If there is no error, switch to the apps directory, create a new folder called opencv, and move Application.mk from the jni folder to opencv.
Modify the Application.mk file to make sure APP_BUILD_SCRIPT and APP_PROJECT_PATH point to the correct file and path.
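As a minimal sketch, the relevant lines might look like this (the project path below is hypothetical; point it at your own project):

```makefile
# Hypothetical paths -- adjust to where your project actually lives.
APP_PROJECT_PATH := /home/user/workspace/testOpenCV
APP_BUILD_SCRIPT := $(APP_PROJECT_PATH)/jni/Android.mk
```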
Go back to the Android NDK directory and run
make APP=opencv
This step may take several minutes. Be patient.
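Taken together, the build steps look roughly like this on the command line (the NDK path is hypothetical; use wherever you unpacked NDK r3):

```shell
# Hypothetical NDK location -- substitute your own path.
cd ~/android-ndk-r3
./build/host-setup.sh    # checks the host is ready to cross-compile
mkdir -p apps/opencv     # per-app build folder expected by the NDK
# move Application.mk from the project's jni folder into apps/opencv,
# then build:
make APP=opencv
```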
Now let's write a simple app to test whether OpenCV works. When I created the project, I used the namespace edu.stanford.zixuanpc. If you use another namespace, you need to change the function names in cvjni.cpp accordingly (JNI symbols follow the pattern Java_<package>_<class>_<method>) and run make again. In this simple app, we use the camera to take an image, extract SURF features from it, and show the result.
I have three Java files, which can be downloaded here. They are a little long, so I do not paste all the source code here. The general idea is that I use an intent to start the camera or gallery activity to get the image, then send this image to OpenCV. After OpenCV finishes extracting SURF features, it sends the processed image back to the JVM. The interface between the JVM and OpenCV is pretty simple: setSourceImage and getSourceImage.
OpenCV.java
package edu.stanford.zixuanpc;

public class OpenCV {
    static {
        System.loadLibrary("opencv");
    }

    public native boolean setSourceImage(int[] pixels, int width, int height);
    public native byte[] getSourceImage();
    public native void extractSURFFeature();
}
And here is how we use these functions:
Code snippet in testOpenCVActivity.java
Bitmap bitmap = BitmapFactory.decodeFile(mCurrentImagePath);
int width = bitmap.getWidth();
int height = bitmap.getHeight();
int[] pixels = new int[width * height];
bitmap.getPixels(pixels, 0, width, 0, 0, width, height);

opencv.setSourceImage(pixels, width, height);
opencv.extractSURFFeature();
byte[] imageData = opencv.getSourceImage();

bitmap = BitmapFactory.decodeByteArray(imageData, 0, imageData.length);
mImageView.setImageBitmap(bitmap);
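The int[] that setSourceImage receives comes from Bitmap.getPixels, which packs each pixel as a single ARGB int: alpha in the high byte, then red, green, blue. A small plain-Java sketch of that layout (the class and method names here are my own illustration, not part of the tutorial code):

```java
public class ArgbLayout {
    // Unpack one Android color int into its four 0-255 channels.
    // Layout: bits 31-24 alpha, 23-16 red, 15-8 green, 7-0 blue.
    static int[] unpack(int pixel) {
        return new int[] {
            (pixel >>> 24) & 0xFF, // alpha
            (pixel >>> 16) & 0xFF, // red
            (pixel >>> 8)  & 0xFF, // green
            pixel          & 0xFF  // blue
        };
    }

    public static void main(String[] args) {
        int opaqueRed = 0xFFFF0000; // fully opaque, pure red
        int[] argb = unpack(opaqueRed);
        System.out.println(argb[0] + " " + argb[1] + " " + argb[2] + " " + argb[3]);
        // prints: 255 255 0 0
    }
}
```

How the native side maps these channels onto its image buffer (for example, a BGR IplImage in OpenCV 1.1) depends on the cvjni.cpp implementation.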
You can use other OpenCV functions supported in 1.1.
Source: http://www.stanford.edu/~zxwang/android_opencv.html