I'm fairly new to Android. My project requires using the phone camera to record video, but only the 30 seconds before and the 30 seconds after a certain event. I've tried several code samples from the web and couldn't get any of them to run; I've been stuck and frustrated for days. Could an expert give me a direction to work in, e.g. which class to use for recording video?
Most sample code I've seen online uses MediaRecorder to record video, but I don't know whether that class can capture a specific time window like this.
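As far as I know, MediaRecorder has no built-in "pre-event" recording, so a common approach is to buffer recent frames yourself: keep the last 30 seconds in a ring buffer, and when the event fires, keep capturing for another 30 seconds. A minimal sketch of that idea in plain Java follows; the class name `PrePostEventBuffer`, the fixed frame rate, and the byte[]-per-frame model are my assumptions, not Android API:

```java
import java.util.ArrayDeque;

// Sketch: hold the most recent secondsBefore of frames; after the event,
// keep everything until secondsAfter more frames have arrived.
class PrePostEventBuffer {
    private final ArrayDeque<byte[]> ring = new ArrayDeque<>();
    private final int capacity;      // max frames kept before the event
    private int postFramesLeft = -1; // frames still to capture after the event

    PrePostEventBuffer(int fps, int secondsBefore) {
        this.capacity = fps * secondsBefore;
    }

    /** Feed every preview frame here (e.g. from onPreviewFrame). */
    boolean onFrame(byte[] frame) {
        ring.addLast(frame);
        if (postFramesLeft < 0) {
            // No event yet: evict the oldest frame once the window is full.
            while (ring.size() > capacity) {
                ring.removeFirst();
            }
            return false;
        }
        // Event happened: returns true once the post-event window is complete.
        return --postFramesLeft <= 0;
    }

    /** Call this at the moment the event occurs. */
    void onEvent(int fps, int secondsAfter) {
        postFramesLeft = fps * secondsAfter;
    }

    ArrayDeque<byte[]> frames() {
        return ring;
    }
}
```

When `onFrame` returns true, the buffer holds the full before-and-after window and can be written out or encoded.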
But I have no idea how to convert this into a playable video format. I've never done any video development and know nothing about the various video formats. Could anyone point me in the right direction?
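On the format question: MediaRecorder does not expose individual frames, but it does encode straight to a playable file (3GP/MP4), so no conversion step is needed. A rough, untested sketch against the old android.hardware.Camera API; the output path and encoder choices here are just examples:

```java
Camera camera = Camera.open();
camera.unlock();                          // hand the camera over to MediaRecorder

MediaRecorder recorder = new MediaRecorder();
recorder.setCamera(camera);
recorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.MPEG_4_SP);
recorder.setOutputFile("/sdcard/camera.mp4");
recorder.setPreviewDisplay(mSurfaceHolder.getSurface());
recorder.prepare();
recorder.start();
// ... record ...
recorder.stop();
recorder.release();
camera.lock();
```

Note, however, that MediaRecorder only records from start() onward, so on its own it cannot capture the 30 seconds *before* the event.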
import java.io.File;
import java.io.RandomAccessFile;

import android.app.Activity;
import android.graphics.PixelFormat;
import android.hardware.Camera;
import android.os.Bundle;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.Window;
import android.view.WindowManager;

public class AndroidCamera extends Activity implements SurfaceHolder.Callback {

    private SurfaceView mSurfaceView = null;
    private SurfaceHolder mSurfaceHolder = null;
    private Camera mCamera = null;
    private boolean mPreviewRunning = false;

    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        getWindow().setFormat(PixelFormat.TRANSLUCENT);
        requestWindowFeature(Window.FEATURE_NO_TITLE);
        getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
                WindowManager.LayoutParams.FLAG_FULLSCREEN);
        setContentView(R.layout.main);
        mSurfaceView = (SurfaceView) findViewById(R.id.surface_camera);
        mSurfaceHolder = mSurfaceView.getHolder();
        mSurfaceHolder.addCallback(this);
        // Required on old devices so the camera can push frames to the surface.
        mSurfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        if (mPreviewRunning) {
            mCamera.stopPreview();
        }
        Camera.Parameters p = mCamera.getParameters();
        // Note: the surface size is not guaranteed to be a supported preview
        // size; a robust version should pick one from getSupportedPreviewSizes().
        p.setPreviewSize(width, height);
        mCamera.setParameters(p);
        // Deliver every raw preview frame to VideoData.onPreviewFrame().
        mCamera.setPreviewCallback(new VideoData(width, height));
        try {
            mCamera.setPreviewDisplay(holder);
        } catch (Exception e) {
            e.printStackTrace();
        }
        mCamera.startPreview();
        mPreviewRunning = true;
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        mCamera = Camera.open();
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        Log.v("AndroidCamera", "surfaceDestroyed");
        if (mCamera != null) {
            mCamera.setPreviewCallback(null);
            mCamera.stopPreview();
            mPreviewRunning = false;
            mCamera.release();
            mCamera = null;
        }
    }
}

/** Dumps every raw preview frame (NV21 by default) to /sdcard/camera.dat. */
class VideoData implements Camera.PreviewCallback {

    private RandomAccessFile raf = null;

    public VideoData(int width, int height) {
        Log.v("androidCamera", "new VideoData");
        try {
            Log.v("androidCamera", "Create File: /sdcard/camera.dat");
            File file = new File("/sdcard/camera.dat");
            raf = new RandomAccessFile(file, "rw");
        } catch (Exception ex) {
            ex.printStackTrace();
        }
    }

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        if (data == null || raf == null) {
            return;
        }
        // int previewWidth = camera.getParameters().getPreviewSize().width;
        // int previewHeight = camera.getParameters().getPreviewSize().height;
        // byte[] rgbBuffer = new byte[previewWidth * previewHeight * 3];
        // decodeYUV420SP(rgbBuffer, data, previewWidth, previewHeight);
        try {
            raf.write(data, 0, data.length);
        } catch (Exception ex) {
            ex.printStackTrace();
        }
    }

    @Override
    protected void finalize() {
        if (raf != null) {
            try {
                raf.close();
            } catch (Exception ex) {
                ex.printStackTrace();
            }
        }
        try {
            super.finalize();
        } catch (Throwable e) {
            e.printStackTrace();
        }
    }
}