The Face Detection API detects human faces automatically in media files such as images and videos. I have developed an application that detects faces in photos taken with the device camera as well as in images picked from the gallery.
Step 1: Add camera permission & face detection metadata in manifest
<uses-feature android:name="android.hardware.camera" android:required="true" />
<uses-feature android:name="android.hardware.camera.autofocus" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<meta-data android:name="com.google.android.gms.vision.DEPENDENCIES" android:value="face" />
Step 2: Check camera & external storage (gallery) permissions at runtime
ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.CAMERA},
        CAMERA_PERMISSION_REQUEST_CODE);
ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.READ_EXTERNAL_STORAGE},
        GALLERY_PERMISSION_REQUEST_CODE);
Step 3: Custom camera activity using SurfaceView
We have to initialize the SurfaceView in onCreate(), get the SurfaceHolder from it, and add the callbacks.
We have to start the camera preview in the surfaceChanged() callback and open the camera in the surfaceCreated() callback.
surfaceView = (SurfaceView) findViewById(R.id.camerapreview);
surfaceHolder = surfaceView.getHolder();
surfaceHolder.addCallback(this);
surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
    if (previewing) {
        camera.stopPreview();
        previewing = false;
    }
    if (camera != null) {
        try {
            int rotation = setCameraDisplayOrientation(AndroidCamera.this,
                    Camera.CameraInfo.CAMERA_FACING_FRONT, camera);
            camera.setDisplayOrientation(rotation);
            camera.setPreviewDisplay(surfaceHolder);
            camera.startPreview();
            previewing = true;
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
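The post does not show the matching surfaceDestroyed() callback; with the older Camera API it is important to stop the preview and release the camera there so other apps can use it. A minimal sketch, assuming the same camera and previewing fields:

```java
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
    // Release the camera when the surface goes away so other apps can use it
    if (camera != null) {
        camera.stopPreview();
        previewing = false;
        camera.release();
        camera = null;
    }
}
```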
@Override
public void surfaceCreated(SurfaceHolder holder) {
    camera = Camera.open(Camera.CameraInfo.CAMERA_FACING_FRONT);
}
Step 4: Avoid mirrored camera imaging
Set the camera orientation based on the Surface rotation.
android.hardware.Camera.CameraInfo info = new android.hardware.Camera.CameraInfo();
android.hardware.Camera.getCameraInfo(cameraId, info);
int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
int degrees = 0;
switch (rotation) {
    case Surface.ROTATION_0:   degrees = 0;   break;
    case Surface.ROTATION_90:  degrees = 90;  break;
    case Surface.ROTATION_180: degrees = 180; break;
    case Surface.ROTATION_270: degrees = 270; break;
}
int result;
if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
    result = (info.orientation + degrees) % 360;
    result = (360 - result) % 360; // compensate for the front camera's mirror effect
} else {
    result = (info.orientation - degrees + 360) % 360;
}
Step 5: Take picture & save image
Call the takePicture() method; the image byte data is returned to the picture callback.
camera.takePicture(myShutterCallback, myPictureCallback_RAW, myPictureCallback_JPG);
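The PictureCallback implementations themselves are not shown in the post; here is a minimal sketch of the JPEG callback. The field name imageByte and the method saveCameraImage() are assumptions, chosen to match the saving code in the next step:

```java
Camera.PictureCallback myPictureCallback_JPG = new Camera.PictureCallback() {
    @Override
    public void onPictureTaken(byte[] data, Camera camera) {
        // JPEG byte data arrives here once capture completes
        imageByte = data;   // keep the bytes for decoding/saving
        saveCameraImage();  // hypothetical method wrapping the save code below
    }
};
```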
fileUserPhoto = new File(dir.getAbsolutePath(), "pic.png");
try {
    fileUserPhoto.createNewFile();
} catch (IOException ex) {
    Log.e(TAG, "IOException:" + ex.getLocalizedMessage());
}
try {
    FileOutputStream fileOS = new FileOutputStream(fileUserPhoto);
    android.hardware.Camera.CameraInfo info = new android.hardware.Camera.CameraInfo();
    android.hardware.Camera.getCameraInfo(Camera.CameraInfo.CAMERA_FACING_FRONT, info);
    int rotation = setCameraDisplayOrientation(AndroidCamera.this,
            Camera.CameraInfo.CAMERA_FACING_FRONT, camera);
    rotation = rotation + 180;
    Bitmap realImage = BitmapFactory.decodeByteArray(imageByte, 0, imageByte.length);
    Bitmap bitmap = rotate(realImage, rotation);
    boolean isSaved = bitmap.compress(Bitmap.CompressFormat.JPEG, 100, fileOS);
    Log.d(TAG, "saveCameraImage:" + isSaved);
    fileOS.close();
    setResult(RESULT_OK);
    finish();
} catch (FileNotFoundException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
}
Step 6: Detecting faces in FaceOverlayView
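Before moving on: the rotate(...) helper called in the Step 5 saving code is not defined in the post; a minimal sketch using android.graphics.Matrix could look like this:

```java
// Hypothetical implementation of the rotate(...) helper used in Step 5
private Bitmap rotate(Bitmap source, float degrees) {
    Matrix matrix = new Matrix();
    matrix.postRotate(degrees);
    return Bitmap.createBitmap(source, 0, 0,
            source.getWidth(), source.getHeight(), matrix, true);
}
```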
Once the image bitmap is assigned to the overlay frame view, faces are detected from it.
FaceDetector detector = new FaceDetector.Builder(getContext())
        .setTrackingEnabled(false)
        .build();
if (!detector.isOperational()) {
    Toast.makeText(getContext(), "Face detection error.", Toast.LENGTH_LONG).show();
} else {
    Frame frame = new Frame.Builder().setBitmap(bitmap).build();
    mFaces = detector.detect(frame);
    detector.release();
}
invalidate();
Step 7: Drawing face box over the taken photo
The Paint should be defined as a member variable rather than created on every onDraw() call; it is created here only for clarity.
Paint paint = new Paint();
paint.setColor(Color.GREEN);
paint.setStyle(Paint.Style.STROKE);
paint.setStrokeWidth(5);
for (int i = 0; i < mFaces.size(); i++) {
    Face face = mFaces.valueAt(i);
    float left = (float) (face.getPosition().x * scale);
    float top = (float) (face.getPosition().y * scale);
    float right = (float) (scale * (face.getPosition().x + face.getWidth()));
    float bottom = (float) (scale * (face.getPosition().y + face.getHeight()));
    canvas.drawRect(left, top, right, bottom, paint);
}
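The scale factor used above maps bitmap coordinates to view coordinates. One common way to derive it inside onDraw() is shown below; the mBitmap field name is an assumption:

```java
// Ratio of the view's width to the bitmap's intrinsic width; with this
// scale, both the drawn bitmap and the face boxes land in view coordinates
double scale = (double) getWidth() / (double) mBitmap.getWidth();
canvas.drawBitmap(mBitmap, null,
        new Rect(0, 0, getWidth(), (int) (mBitmap.getHeight() * scale)), null);
```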
Add FaceOverlayView to the main layout of the application and assign it the taken photo or a picture selected from the gallery (i.e., picking images using an intent).
<com.truedreamz.facedetection.FaceOverlayView
    android:id="@+id/imgPhoto"
    android:layout_width="fill_parent"
    android:layout_height="match_parent"
    android:layout_margin="2dp"
    android:layout_gravity="center_horizontal"
    android:scaleType="fitXY"
    />
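Picking a gallery image via an intent can be sketched as follows; the PICK_IMAGE_REQUEST constant and the setBitmap(...) setter on the overlay view are assumptions:

```java
// Launch the system gallery picker
Intent intent = new Intent(Intent.ACTION_PICK,
        MediaStore.Images.Media.EXTERNAL_CONTENT_URI);
startActivityForResult(intent, PICK_IMAGE_REQUEST);

@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode == PICK_IMAGE_REQUEST && resultCode == RESULT_OK && data != null) {
        try {
            // Decode the selected image and hand it to the overlay for detection
            Bitmap bitmap = MediaStore.Images.Media.getBitmap(
                    getContentResolver(), data.getData());
            faceOverlayView.setBitmap(bitmap);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
```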
I have uploaded the full source code to GitHub for reference:
https://github.com/JayaprakashR-Zealot/Knowledge-Circle---Android/tree/master/FaceDetection
Happy Coding !!!
Thanks!!!