
Implementing Face Detection with JavaCV


This article shares a working example of face detection implemented with JavaCV on Android, for your reference. The full code is as follows.

/*
 * Copyright (C) 2010,2011,2012 Samuel Audet
 *
 * FacePreview - A fusion of OpenCV's facedetect and Android's CameraPreview samples,
 *        with JavaCV + JavaCPP as the glue in between.
 *
 * This file was based on CameraPreview.java that came with the samples for
 * Android SDK API 8, revision 1 and contained the following copyright notice:
 *
 * Copyright (C) 2007 The Android Open Source Project
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *   http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 *
 *
 * IMPORTANT - Make sure the AndroidManifest.xml file looks like this:
 *
 * <?xml version="1.0" encoding="utf-8"?>
 * <manifest xmlns:android="http://schemas.android.com/apk/res/android"
 *   package="com.googlecode.javacv.facepreview"
 *   android:versionCode="1"
 *   android:versionName="1.0" >
 *   <uses-sdk android:minSdkVersion="4" />
 *   <uses-permission android:name="android.permission.CAMERA" />
 *   <uses-feature android:name="android.hardware.camera" />
 *   <application android:label="@string/app_name">
 *     <activity
 *       android:name="FacePreview"
 *       android:label="@string/app_name"
 *       android:screenOrientation="landscape">
 *       <intent-filter>
 *         <action android:name="android.intent.action.MAIN" />
 *         <category android:name="android.intent.category.LAUNCHER" />
 *       </intent-filter>
 *     </activity>
 *   </application>
 * </manifest>
 */
 
package com.googlecode.javacv.facepreview;

import android.app.Activity;
import android.app.AlertDialog;
import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.ImageFormat;
import android.graphics.Paint;
import android.hardware.Camera;
import android.hardware.Camera.Size;
import android.os.Bundle;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.Window;
import android.view.WindowManager;
import android.widget.FrameLayout;
import java.io.File;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.List;
import com.googlecode.javacpp.Loader;
import com.googlecode.javacv.cpp.opencv_objdetect;

import static com.googlecode.javacv.cpp.opencv_core.*;
import static com.googlecode.javacv.cpp.opencv_imgproc.*;
import static com.googlecode.javacv.cpp.opencv_objdetect.*;
import static com.googlecode.javacv.cpp.opencv_highgui.*;
 
// ---------------------------------------------------------------------- 
 
public class FacePreview extends Activity {
  private FrameLayout layout;
  private FaceView faceView;
  private Preview mPreview;

  @Override
  protected void onCreate(Bundle savedInstanceState) {
    // Hide the window title.
    requestWindowFeature(Window.FEATURE_NO_TITLE);

    super.onCreate(savedInstanceState);

    getWindow().addFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN);

    // Create our Preview view and set it as the content of our activity.
    try {
      layout = new FrameLayout(this);
      faceView = new FaceView(this);
      mPreview = new Preview(this, faceView);
      layout.addView(mPreview);
      layout.addView(faceView);
      setContentView(layout);
    } catch (IOException e) {
      e.printStackTrace();
      new AlertDialog.Builder(this).setMessage(e.getMessage()).create().show();
    }
  }
}
 
// ---------------------------------------------------------------------- 
 
class FaceView extends View implements Camera.PreviewCallback {
  public static final int SUBSAMPLING_FACTOR = 4;

  private IplImage grayImage;
  private CvHaarClassifierCascade classifier;
  private CvMemStorage storage;
  private CvSeq faces;

  public FaceView(FacePreview context) throws IOException {
    super(context);

    // Load the classifier file from Java resources.
    File classifierFile = Loader.extractResource(getClass(),
      "/com/googlecode/javacv/facepreview/haarcascade_frontalface_alt2.xml",
      context.getCacheDir(), "classifier", ".xml");
    if (classifierFile == null || classifierFile.length() <= 0) {
      throw new IOException("Could not extract the classifier file from Java resource.");
    }

    // Preload the opencv_objdetect module to work around a known bug.
    Loader.load(opencv_objdetect.class);
    classifier = new CvHaarClassifierCascade(cvLoad(classifierFile.getAbsolutePath()));
    classifierFile.delete();
    if (classifier.isNull()) {
      throw new IOException("Could not load the classifier file.");
    }
    storage = CvMemStorage.create();
  }
 
  public void onPreviewFrame(final byte[] data, final Camera camera) {
    try {
      Camera.Size size = camera.getParameters().getPreviewSize();
      processImage(data, size.width, size.height);
      camera.addCallbackBuffer(data);
    } catch (RuntimeException e) {
      // The camera has probably just been released, ignore.
    }
  }
 
  protected void processImage(byte[] data, int width, int height) {
    // First, downsample the frame and convert it into a grayscale IplImage.
    // The default NV21 preview format stores the full-resolution Y (luminance)
    // plane first, so sampling every f-th luminance byte is enough.
    int f = SUBSAMPLING_FACTOR;
    if (grayImage == null || grayImage.width() != width/f || grayImage.height() != height/f) {
      grayImage = IplImage.create(width/f, height/f, IPL_DEPTH_8U, 1);
    }
    int imageWidth = grayImage.width();
    int imageHeight = grayImage.height();
    int dataStride = f*width;
    int imageStride = grayImage.widthStep();
    ByteBuffer imageBuffer = grayImage.getByteBuffer();
    for (int y = 0; y < imageHeight; y++) {
      int dataLine = y*dataStride;
      int imageLine = y*imageStride;
      for (int x = 0; x < imageWidth; x++) {
        imageBuffer.put(imageLine + x, data[dataLine + f*x]);
      }
    }
    // Transpose and flip the downsampled image to compensate for the
    // orientation of the front-camera preview before running the detector.
    IplImage grayImageT = IplImage.create(height/f, width/f, IPL_DEPTH_8U, 1);
    //cvSaveImage("/storage/emulated/0/Pictures/grayImage.jpg", grayImage);
    cvTranspose(grayImage, grayImageT);
    //cvSaveImage("/storage/emulated/0/Pictures/grayImageT.jpg", grayImageT);
    cvFlip(grayImageT, grayImageT, 0);
    //cvSaveImage("/storage/emulated/0/Pictures/grayImageT_x.jpg", grayImageT);
    cvFlip(grayImageT, grayImageT, 1);
    //cvSaveImage("/storage/emulated/0/Pictures/grayImageT_y.jpg", grayImageT);

    // Run the Haar cascade: 1.1 scale step, at least 3 neighbors per detection,
    // with Canny pruning to skip regions containing too few edges.
    cvClearMemStorage(storage);
    faces = cvHaarDetectObjects(grayImageT, classifier, storage, 1.1, 3, CV_HAAR_DO_CANNY_PRUNING);
    postInvalidate();
  }
 
  @Override
  protected void onDraw(Canvas canvas) {
    Paint paint = new Paint();
    paint.setColor(Color.RED);
    paint.setTextSize(20);

    String s = "FacePreview - This side up.";
    float textWidth = paint.measureText(s);
    canvas.drawText(s, (getWidth()-textWidth)/2, 20, paint);

    if (faces != null) {
      paint.setStrokeWidth(2);
      paint.setStyle(Paint.Style.STROKE);
      // Scale the rectangles from the downsampled image up to the view size.
      float scaleX = (float)getWidth()/grayImage.width();
      float scaleY = (float)getHeight()/grayImage.height();
      int total = faces.total();
      for (int i = 0; i < total; i++) {
        CvRect r = new CvRect(cvGetSeqElem(faces, i));
        int x = r.x(), y = r.y(), w = r.width(), h = r.height();
        canvas.drawRect(x*scaleX, y*scaleY, (x+w)*scaleX, (y+h)*scaleY, paint);
      }
    } else {
      canvas.drawText("No face detected", (getWidth()-textWidth)/2, 20, paint);
    }
  }
}
 
// ---------------------------------------------------------------------- 
 
class Preview extends SurfaceView implements SurfaceHolder.Callback {
  SurfaceHolder mHolder;
  Camera mCamera;
  Camera.PreviewCallback previewCallback;

  Preview(Context context, Camera.PreviewCallback previewCallback) {
    super(context);
    this.previewCallback = previewCallback;

    // Install a SurfaceHolder.Callback so we get notified when the
    // underlying surface is created and destroyed.
    mHolder = getHolder();
    mHolder.addCallback(this);
    mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
  }
 
  public void surfaceCreated(SurfaceHolder holder) {
    // The Surface has been created, acquire the camera and tell it where
    // to draw. CAMERA_FACING_FRONT (1) is passed here as a camera id,
    // which opens the front camera on most devices.
    mCamera = Camera.open(Camera.CameraInfo.CAMERA_FACING_FRONT);
    try {
      mCamera.setPreviewDisplay(holder);
    } catch (IOException exception) {
      mCamera.release();
      mCamera = null;
      // TODO: add more exception handling logic here
    }
  }

  public void surfaceDestroyed(SurfaceHolder holder) {
    // Surface will be destroyed when we return, so stop the preview.
    // Because the CameraDevice object is not a shared resource, it's very
    // important to release it when the activity is paused.
    mCamera.stopPreview();
    mCamera.release();
    mCamera = null;
  }
 
 
  private Size getOptimalPreviewSize(List<Size> sizes, int w, int h) {
    final double ASPECT_TOLERANCE = 0.05;
    double targetRatio = (double) w / h;
    if (sizes == null) return null;

    Size optimalSize = null;
    double minDiff = Double.MAX_VALUE;

    int targetHeight = h;

    // Try to find a size that matches both the aspect ratio and the target height.
    for (Size size : sizes) {
      double ratio = (double) size.width / size.height;
      if (Math.abs(ratio - targetRatio) > ASPECT_TOLERANCE) continue;
      if (Math.abs(size.height - targetHeight) < minDiff) {
        optimalSize = size;
        minDiff = Math.abs(size.height - targetHeight);
      }
    }

    // If no size matches the aspect ratio, ignore that requirement.
    if (optimalSize == null) {
      minDiff = Double.MAX_VALUE;
      for (Size size : sizes) {
        if (Math.abs(size.height - targetHeight) < minDiff) {
          optimalSize = size;
          minDiff = Math.abs(size.height - targetHeight);
        }
      }
    }
    return optimalSize;
  }
 
  public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
    // Now that the size is known, set up the camera parameters and begin
    // the preview.
    Camera.Parameters parameters = mCamera.getParameters();

    List<Size> sizes = parameters.getSupportedPreviewSizes();
    Size optimalSize = getOptimalPreviewSize(sizes, w, h);
    parameters.setPreviewSize(optimalSize.width, optimalSize.height);

    mCamera.setParameters(parameters);
    if (previewCallback != null) {
      mCamera.setPreviewCallbackWithBuffer(previewCallback);
      Camera.Size size = parameters.getPreviewSize();
      byte[] data = new byte[size.width*size.height*
          ImageFormat.getBitsPerPixel(parameters.getPreviewFormat())/8];
      mCamera.addCallbackBuffer(data);
    }
    mCamera.startPreview();
  }

}
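If you want to verify the same cascade and detection call without an Android device, below is a minimal sketch, assuming the same legacy JavaCV 0.x packages used above (com.googlecode.javacv.cpp.*). The class name StillImageFaceDetect and the file names "haarcascade_frontalface_alt2.xml", "test.jpg" and "faces.jpg" are placeholders for your own paths, not part of the original project. It loads a still image, converts it to grayscale, and runs cvHaarDetectObjects with the same parameters as FaceView.

import com.googlecode.javacpp.Loader;
import com.googlecode.javacv.cpp.opencv_objdetect;

import static com.googlecode.javacv.cpp.opencv_core.*;
import static com.googlecode.javacv.cpp.opencv_imgproc.*;
import static com.googlecode.javacv.cpp.opencv_objdetect.*;
import static com.googlecode.javacv.cpp.opencv_highgui.*;

public class StillImageFaceDetect {
  public static void main(String[] args) {
    // Preload opencv_objdetect before calling cvLoad, the same workaround used in FaceView.
    Loader.load(opencv_objdetect.class);
    CvHaarClassifierCascade classifier =
        new CvHaarClassifierCascade(cvLoad("haarcascade_frontalface_alt2.xml"));

    // Load a test image and convert it to the single-channel grayscale input
    // that the Haar detector expects.
    IplImage color = cvLoadImage("test.jpg");
    IplImage gray = cvCreateImage(cvGetSize(color), IPL_DEPTH_8U, 1);
    cvCvtColor(color, gray, CV_BGR2GRAY);

    // Same parameters as processImage(): 1.1 scale step, 3 minimum neighbors,
    // Canny pruning to skip regions with too few edges.
    CvMemStorage storage = CvMemStorage.create();
    CvSeq faces = cvHaarDetectObjects(gray, classifier, storage, 1.1, 3,
        CV_HAAR_DO_CANNY_PRUNING);

    // Draw a red rectangle around each detected face and save the result.
    for (int i = 0; i < faces.total(); i++) {
      CvRect r = new CvRect(cvGetSeqElem(faces, i));
      cvRectangle(color, cvPoint(r.x(), r.y()),
          cvPoint(r.x() + r.width(), r.y() + r.height()),
          CV_RGB(255, 0, 0), 2, 8, 0);
    }
    cvSaveImage("faces.jpg", color);
  }
}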

That is the end of this article. I hope it helps with your learning, and thank you for your continued support.