Capture video in Unity3d

Capture video in Unity3d using Intel INDE Media Pack for Android

In the comments to the article about capturing video in OpenGL applications, the possibility of capturing video in applications created with Unity3d was mentioned. The topic interested us: indeed, why limit ourselves to "pure" OpenGL applications, when many developers build their games on top of various libraries and frameworks? So in this article we look at capturing video in Android applications written with Unity3d.

Next, we consider two ways of capturing video in Unity3d:

  1. As a post effect. This method works only in the Pro version, and Unity GUI objects will not be captured in the video.
  2. Using a framebuffer (FrameBuffer). This works in all versions of Unity3d, Pro and free alike, and Unity GUI objects will also be recorded in the video.

We will need:

  1. Unity3d version 4.3: the Pro version for both methods, or the free version, for which only the framebuffer method is available
  2. Installed Android SDK
  3. Installed Intel INDE Media Pack
  4. Apache Ant (to build the Unity plugin for Android)
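
To verify that Apache Ant is installed and available on your PATH, you can run:

ant -version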

Create a project

Open the Unity Editor and create a new project. In the Assets folder, create a Plugins folder, and inside it an Android folder.

From the libs directory of the folder where you installed Intel INDE Media Pack for Android, copy the two jar files (android-<version>.jar and domain-<version>.jar) into the Assets/Plugins/Android folder of your project.
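
By the end of this section, the plugin folder will contain roughly the following (the jar version numbers depend on your Media Pack release):

Assets/
    Plugins/
        Android/
            android-<version>.jar
            domain-<version>.jar
            Capturing.java
            VideoCapture.java
            AndroidManifest.xml
            build.xml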

In the same folder, create a new Java file named Capturing.java and copy the following code into it:

Capturing.java

package com.intel.inde.mp.samples.unity;

import com.intel.inde.mp.android.graphics.FullFrameTexture;

import android.os.Environment;
import java.io.IOException;
import java.io.File;

public class Capturing
{

    private static FullFrameTexture texture;
    
    public Capturing()
    {
        texture = new FullFrameTexture();
    }

    // Returns the path to the folder where the video will be saved
    public static String getDirectoryDCIM()
    {
        return Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DCIM) + File.separator;
    }

    // Configuring video settings
    public void initCapturing(int width, int height, int frameRate, int bitRate)
    {
        VideoCapture.init(width, height, frameRate, bitRate);
    }

    // Start the video capture
    public void startCapturing(String videoPath)
    {
        VideoCapture capture = VideoCapture.getInstance();

        synchronized (capture) 
        {
            try 
            {
                capture.start(videoPath);
            } 
            catch (IOException e) 
            {
                // Swallowed for brevity; a real application should report this error
            }
        }
    }

    // Called for each captured frame
    public void captureFrame(int textureID)
    {
        VideoCapture capture = VideoCapture.getInstance();

        synchronized (capture) 
        {
            capture.beginCaptureFrame();
            texture.draw(textureID);
            capture.endCaptureFrame();
        }
    }

    // Stop the video capture
    public void stopCapturing()
    {
        VideoCapture capture = VideoCapture.getInstance();

        synchronized (capture) 
        {
            if (capture.isStarted()) 
            {
                capture.stop();
            }
        }
    }
}

Add another Java file, this time with the name VideoCapture.java:

VideoCapture.java

package com.intel.inde.mp.samples.unity;

import com.intel.inde.mp.*;
import com.intel.inde.mp.android.AndroidMediaObjectFactory;
import com.intel.inde.mp.android.AudioFormatAndroid;
import com.intel.inde.mp.android.VideoFormatAndroid;

import java.io.IOException;

public class VideoCapture
{
    private static final String TAG = "VideoCapture";

    private static final String Codec = "video/avc";
    private static int IFrameInterval = 1;

    private static final Object syncObject = new Object();
    private static volatile VideoCapture videoCapture;

    private static VideoFormat videoFormat;
    private static int videoWidth;
    private static int videoHeight;
    private GLCapture capturer;

    private boolean isConfigured;
    private boolean isStarted;
    private long framesCaptured;

    private VideoCapture()
    {
    }
    
    public static void init(int width, int height, int frameRate, int bitRate)
    {
        videoWidth = width;
        videoHeight = height;
    	
        videoFormat = new VideoFormatAndroid(Codec, videoWidth, videoHeight);
        videoFormat.setVideoFrameRate(frameRate);
        videoFormat.setVideoBitRateInKBytes(bitRate);
        videoFormat.setVideoIFrameInterval(IFrameInterval);
    }

    // Thread-safe singleton access (double-checked locking)
    public static VideoCapture getInstance()
    {
        if (videoCapture == null) 
        {
            synchronized (syncObject) 
            {
                if (videoCapture == null)
                {
                    videoCapture = new VideoCapture();
                }
            }
        }
        return videoCapture;
    }

    public void start(String videoPath) throws IOException
    {
        if (isStarted())
        {
            throw new IllegalStateException(TAG + " already started!");
        }

        capturer = new GLCapture(new AndroidMediaObjectFactory());
        capturer.setTargetFile(videoPath);
        capturer.setTargetVideoFormat(videoFormat);

        AudioFormat audioFormat = new AudioFormatAndroid("audio/mp4a-latm", 44100, 2);
        capturer.setTargetAudioFormat(audioFormat);

        capturer.start();

        isStarted = true;
        isConfigured = false;
        framesCaptured = 0;
    }    
    
    public void stop()
    {
        if (!isStarted())
        {
            throw new IllegalStateException(TAG + " not started or already stopped!");
        }

        try 
        {
            capturer.stop();
            isStarted = false;
        } 
        catch (Exception ex) 
        {
            // Ignored: capturing is being torn down anyway
        }

        capturer = null;
        isConfigured = false;
    }

    private void configure()
    {
        if (isConfigured())
        {
            return;
        }

        try 
        {
            capturer.setSurfaceSize(videoWidth, videoHeight);
            isConfigured = true;
        } 
        catch (Exception ex) 
        {
            // If the surface size cannot be set, we stay unconfigured and frames are skipped
        }
    }

    public void beginCaptureFrame()
    {
        if (!isStarted())
        {
            return;
        }

        configure();

        if (!isConfigured())
        {
            return;
        }

        capturer.beginCaptureFrame();
    }

    public void endCaptureFrame()
    {
        if (!isStarted() || !isConfigured())
        {
            return;
        }

        capturer.endCaptureFrame();

        framesCaptured++;
    }

    public boolean isStarted()
    {
        return isStarted;
    }

    public boolean isConfigured()
    {
        return isConfigured;
    }
}

Important: note the package name com.intel.inde.mp.samples.unity. It must match the name in the project settings (Player Settings / Other Settings / Bundle Identifier).

You should also use the same name in the C# script that calls the Java class. If these names do not match, your game will crash at startup.
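
For example, the C# wrapper we create later looks up the Java class by its fully qualified name, which includes this package:

IntPtr classID = AndroidJNI.FindClass("com/intel/inde/mp/samples/unity/Capturing");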

Add some dynamic content to your scene. You can also integrate Intel INDE Media Pack for Android into any existing project instead of creating one from scratch. Either way, try to have something dynamic in the scene; otherwise it will not be very interesting to watch a video in which nothing changes.

Now, as in any other Android application, we need to set up the manifest. Create an AndroidManifest.xml file in the /Plugins/Android folder and copy the following contents into it:

AndroidManifest.xml

<?xml version="1.0" encoding="utf-8"?>
<manifest
    xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.intel.inde.mp.samples.unity"
    android:installLocation="preferExternal"
    android:theme="@android:style/Theme.NoTitleBar"
    android:versionCode="1"
    android:versionName="1.0">
      
    <uses-sdk android:minSdkVersion="18" />
    
    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
    <uses-permission android:name="android.permission.INTERNET"/>
    <!-- uses microphone -->
    <uses-permission android:name="android.permission.RECORD_AUDIO" />
    
    <!-- requires OpenGL ES >= 2.0. -->
    <uses-feature
        android:glEsVersion="0x00020000"
        android:required="true"/>
    
    <application
     android:icon="@drawable/app_icon"
        android:label="@string/app_name"
        android:debuggable="true">
        <activity android:name="com.unity3d.player.UnityPlayerNativeActivity"
                  android:label="@string/app_name">
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />
                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
            <meta-data android:name="unityplayer.UnityActivity" android:value="true" />
            <meta-data android:name="unityplayer.ForwardNativeEventsToDalvik" android:value="false" />
        </activity>
    </application>

</manifest>

Note the line:

package="com.intel.inde.mp.samples.unity"

The package name must match what you specified earlier.

Now we have everything we need. Since Unity cannot compile our Java files on its own, we create an Ant script.

Note: if you use other classes or libraries, change the Ant script accordingly.

More about Ant scripts: http://ant.apache.org/manual/

The following Ant script is intended only for this tutorial. Create a build.xml file in the /Plugins/Android folder:

build.xml

<?xml version="1.0" encoding="UTF-8"?>
<project name="UnityCapturing">
    <!-- Change this in order to match your configuration -->
    <property name="sdk.dir" value="C:\Android\sdk"/>
    <property name="target" value="android-18"/>
    <property name="unity.androidplayer.jarfile" value="C:\Program Files (x86)\Unity\Editor\Data\PlaybackEngines\androiddevelopmentplayer\bin\classes.jar"/>
    <!-- Source directory -->
    <property name="source.dir" value="\ProjectPath\Assets\Plugins\Android" />
    <!-- Output directory for .class files-->
    <property name="output.dir" value="\ProjectPath\Assets\Plugins\Android\classes"/>
    <!-- Name of the jar to be created. Please note that the name should match the name of the class and the name
    placed in the AndroidManifest.xml-->
    <property name="output.jarfile" value="Capturing.jar"/>
      <!-- Creates the output directories if they don't exist yet. -->
    <target name="-dirs"  depends="message">
        <echo>Creating output directory: ${output.dir} </echo>
        <mkdir dir="${output.dir}" />
    </target>
   <!-- Compiles this project's .java files into .class files. -->
    <target name="compile" depends="-dirs"
                description="Compiles project's .java files into .class files">
        <javac encoding="ascii" target="1.6" debug="true" destdir="${output.dir}" verbose="${verbose}" includeantruntime="false">
            <src path="${source.dir}" />
            <classpath>
                <pathelement location="${sdk.dir}\platforms\${target}\android.jar"/>
                <pathelement location="${source.dir}\domain-1.0.903.jar"/>
                <pathelement location="${source.dir}\android-1.0.903.jar"/>
                <pathelement location="${unity.androidplayer.jarfile}"/>
            </classpath>
        </javac>
    </target>
    <target name="build-jar" depends="compile">
        <zip zipfile="${output.jarfile}"
            basedir="${output.dir}" />
    </target>
    <target name="clean-post-jar">
         <echo>Removing post-build-jar-clean</echo>
         <delete dir="${output.dir}"/>
    </target>
    <target name="clean" description="Removes output files created by other targets.">
        <delete dir="${output.dir}" verbose="${verbose}" />
    </target>
    <target name="message">
     <echo>Android Ant Build for Unity Android Plugin</echo>
        <echo>   message:      Displays this message.</echo>
        <echo>   clean:     Removes output files created by other targets.</echo>
        <echo>   compile:   Compiles project's .java files into .class files.</echo>
        <echo>   build-jar: Compiles project's .class files into .jar file.</echo>
    </target>
</project>

Pay attention to the paths source.dir and output.dir and, of course, to the name of the output jar file, output.jarfile.

At the command prompt, navigate to the project's /Plugins/Android folder and start the plugin build:

ant build-jar clean-post-jar

If you did everything as described above, then after a few seconds you will see a message saying that the build completed successfully.

A new file, Capturing.jar, containing the code of our plugin, should appear in the folder.

The plugin is ready; it remains to make the necessary changes on the Unity3d side. First of all, we create a wrapper linking Unity and our Android plugin. To do this, add a Capture.cs file to the project:

Capture.cs

using UnityEngine;
using System.Collections;
using System.IO;
using System;

[RequireComponent(typeof(Camera))]
public class Capture : MonoBehaviour
{
    public int videoWidth = 720;
    public int videoHeight = 1094;
    public int videoFrameRate = 30;
    public int videoBitRate = 3000;

    private string videoDir;
    public string fileName = "game_capturing-";
    
    private float nextCapture = 0.0f;
    public bool inProgress { get; private set; }
    
    private static IntPtr constructorMethodID = IntPtr.Zero;
    private static IntPtr initCapturingMethodID = IntPtr.Zero;
    private static IntPtr startCapturingMethodID = IntPtr.Zero;
    private static IntPtr captureFrameMethodID = IntPtr.Zero;
    private static IntPtr stopCapturingMethodID = IntPtr.Zero;

    private static IntPtr getDirectoryDCIMMethodID = IntPtr.Zero;

    private IntPtr capturingObject = IntPtr.Zero;

    void Start()
    {
        if(!Application.isEditor) 
        {
            // Gets a pointer to our class
            IntPtr classID = AndroidJNI.FindClass("com/intel/inde/mp/samples/unity/Capturing");

            // Look up the constructor
            constructorMethodID = AndroidJNI.GetMethodID(classID, "<init>", "()V");

            // Look up the methods implemented by the class
            initCapturingMethodID = AndroidJNI.GetMethodID(classID, "initCapturing", "(IIII)V");
            startCapturingMethodID = AndroidJNI.GetMethodID(classID, "startCapturing", "(Ljava/lang/String;)V");
            captureFrameMethodID = AndroidJNI.GetMethodID(classID, "captureFrame", "(I)V");
            stopCapturingMethodID = AndroidJNI.GetMethodID(classID, "stopCapturing", "()V");
            getDirectoryDCIMMethodID = AndroidJNI.GetStaticMethodID(classID, "getDirectoryDCIM", "()Ljava/lang/String;");

            jvalue[] args = new jvalue[0];

            videoDir = AndroidJNI.CallStaticStringMethod(classID, getDirectoryDCIMMethodID, args);

            // Create an object
            IntPtr local_capturingObject = AndroidJNI.NewObject(classID, constructorMethodID, args);
            if (local_capturingObject == IntPtr.Zero) 
            {
                Debug.LogError("Can't create Capturing object");
                return;
            }

            // Save a pointer to the object
            capturingObject = AndroidJNI.NewGlobalRef(local_capturingObject);
            AndroidJNI.DeleteLocalRef(local_capturingObject);

            AndroidJNI.DeleteLocalRef(classID);
        }

        inProgress = false;
        nextCapture = Time.time;
    }

    void OnRenderImage(RenderTexture src, RenderTexture dest)
    {
        if (inProgress && Time.time > nextCapture)
        {
            CaptureFrame(src.GetNativeTextureID());
            nextCapture += 1.0f / videoFrameRate;
        }

        Graphics.Blit(src, dest);
    }

    public void StartCapturing()
    {
        if (capturingObject == IntPtr.Zero)
        {
            return;
        }

        jvalue[] videoParameters =  new jvalue[4];

        videoParameters[0].i = videoWidth;
        videoParameters[1].i = videoHeight;
        videoParameters[2].i = videoFrameRate;
        videoParameters[3].i = videoBitRate;

        AndroidJNI.CallVoidMethod(capturingObject, initCapturingMethodID, videoParameters);

        DateTime date = DateTime.Now;

        string fullFileName = fileName + date.ToString("ddMMyy-hhmmss.fff") + ".mp4";
        jvalue[] args = new jvalue[1];
        args[0].l = AndroidJNI.NewStringUTF(videoDir + fullFileName);
        AndroidJNI.CallVoidMethod(capturingObject, startCapturingMethodID, args);

        inProgress = true;
    }

    private void CaptureFrame(int textureID)
    {
        if (capturingObject == IntPtr.Zero)
        {
            return;
        }

        jvalue[] args = new jvalue[1];
        args[0].i = textureID;

        AndroidJNI.CallVoidMethod(capturingObject, captureFrameMethodID, args);
    }

    public void StopCapturing()
    {
        inProgress = false;

        if (capturingObject == IntPtr.Zero)
        {
            return;
        }

        jvalue[] args = new jvalue[0];

        AndroidJNI.CallVoidMethod(capturingObject, stopCapturingMethodID, args);
    }
}

Assign the script to the main camera. Before capturing video, you must configure the video format. You can do this directly in the editor by changing the corresponding parameters (videoWidth, videoHeight, etc.).
The Start(), StartCapturing() and StopCapturing() methods are fairly trivial wrappers for calling the plugin code from Unity.
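
The same parameters can also be set from another script before recording starts. A minimal sketch, assuming the Capture component is attached to the main camera (the values are only an example):

Capture capture = Camera.main.GetComponent<Capture>();

capture.videoWidth = 720;
capture.videoHeight = 1094;
capture.videoFrameRate = 30;
capture.videoBitRate = 3000; // KBytes, as consumed by setVideoBitRateInKBytes()

capture.StartCapturing();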

More interesting is the OnRenderImage() method. It is called when the entire rendering is finished, just before the result is displayed on the screen. The input image is contained in the src texture, and we must write the result into the dest texture.

This mechanism lets you modify the final image by applying various effects, but that is outside our scope here; we are interested in the picture as it is. To capture video, we have to copy the final frame into the video. We do this by calling captureFrame() on the Capturing object and passing the texture id as the input parameter.

To draw on the screen, simply copy src to dest:

Graphics.Blit(src, dest);

For convenience, let's create a button that turns video recording on and off from the game interface.

To do this, create a GUI object and assign it a handler. The handler will live in the file CaptureGUI.cs:

CaptureGUI.cs

using UnityEngine;
using System.Collections;

public class CaptureGUI : MonoBehaviour
{
    public Capture capture;
    private GUIStyle style = new GUIStyle();

    void Start()
    {
        style.fontSize = 48;
        style.alignment = TextAnchor.MiddleCenter;
    }

    void OnGUI()
    {
        style.normal.textColor = capture.inProgress ? Color.red : Color.green;

        if (GUI.Button(new Rect(10, 200, 350, 100), capture.inProgress ? "[Stop Recording]" : "[Start Recording]", style)) 
        {
            if (capture.inProgress) 
            {
                capture.StopCapturing();
            } 
            else 
            {
                capture.StartCapturing();
            }
        }
    }	
}

Do not forget to initialize the capture field with an instance of the Capture class.
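
The simplest way is to drag the camera carrying the Capture component onto the capture field in the Inspector. Alternatively, a minimal sketch that looks it up from code, assuming Capture is attached to the main camera, could be added to CaptureGUI:

void Awake()
{
    // Hypothetical fallback: find the Capture component if it was not set in the Inspector
    if (capture == null)
    {
        capture = Camera.main.GetComponent<Capture>();
    }
}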

Clicking the button starts or stops video capture; the result is saved in the /mnt/sdcard/DCIM/ folder.

As mentioned earlier, this method works only in the Pro version (the free version cannot use OnRenderImage() or call Graphics.Blit). Another peculiarity: the final video will not contain Unity GUI objects. Both restrictions are removed by the second method, which uses a FrameBuffer.

Capturing video using the framebuffer. Edit Capturing.java by simply replacing its contents:

Capturing.java

package com.intel.inde.mp.samples.unity;

import com.intel.inde.mp.android.graphics.FullFrameTexture;
import com.intel.inde.mp.android.graphics.FrameBuffer;

import android.os.Environment;

import java.io.IOException;
import java.io.File;

public class Capturing
{
    private static FullFrameTexture texture;
    private FrameBuffer frameBuffer;
    
    public Capturing(int width, int height)
    {
        frameBuffer = new FrameBuffer();
        frameBuffer.create(width, height);

        texture = new FullFrameTexture();
    }

    public static String getDirectoryDCIM()
    {
        return Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DCIM) + File.separator;
    }

    public void initCapturing(int width, int height, int frameRate, int bitRate)
    {
        VideoCapture.init(width, height, frameRate, bitRate);
    }

    public void startCapturing(String videoPath)
    {
        VideoCapture capture = VideoCapture.getInstance();

        synchronized (capture) 
        {
            try 
            {
                capture.start(videoPath);
            } 
            catch (IOException e) 
            {
            }
        }
    }
    
    public void beginCaptureFrame()
    {
        // Redirect all further rendering into the offscreen framebuffer
        frameBuffer.bind();
    }
    
    public void captureFrame(int textureID)
    {
        VideoCapture capture = VideoCapture.getInstance();

        synchronized (capture) 
        {
            capture.beginCaptureFrame();
            texture.draw(textureID);
            capture.endCaptureFrame();
        }
    }
    
    public void endCaptureFrame()
    {
        // Stop rendering into the offscreen framebuffer
        frameBuffer.unbind();

        int textureID = frameBuffer.getTexture();

        // Feed the rendered frame to the video encoder...
        captureFrame(textureID);
        // ...and draw the same texture to the screen
        texture.draw(textureID);
    }

    public void stopCapturing()
    {
        VideoCapture capture = VideoCapture.getInstance();

        synchronized (capture) 
        {
            if (capture.isStarted()) 
            {
                capture.stop();
            }
        }
    }
}

As you can see, not much has changed. The main difference is the appearance of a new object:

FrameBuffer frameBuffer;

The constructor now takes the frame width and height as parameters; they are needed to create a framebuffer of the required size.

Two new public methods have appeared: beginCaptureFrame() and endCaptureFrame(). Their purpose will become clearer when we get to the C# code.

We leave the file VideoCapture.java unchanged.

Next, rebuild the Android plugin in the same way as before:
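
ant build-jar clean-post-jar
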
Now we can switch to Unity. Open the Capture.cs script and replace its contents:

Capture.cs

using UnityEngine;
using System.Collections;
using System.IO;
using System;

[RequireComponent(typeof(Camera))]
public class Capture : MonoBehaviour
{
    public int videoWidth = 720;
    public int videoHeight = 1094;
    public int videoFrameRate = 30;
    public int videoBitRate = 3000;

    private string videoDir;
    public string fileName = "game_capturing-";
    
    private float nextCapture = 0.0f;
    public bool inProgress { get; private set; }
    private bool finalizeFrame = false;
    private Texture2D texture = null;
    
    private static IntPtr constructorMethodID = IntPtr.Zero;
    private static IntPtr initCapturingMethodID = IntPtr.Zero;
    private static IntPtr startCapturingMethodID = IntPtr.Zero;
    private static IntPtr beginCaptureFrameMethodID = IntPtr.Zero;
    private static IntPtr endCaptureFrameMethodID = IntPtr.Zero;
    private static IntPtr stopCapturingMethodID = IntPtr.Zero;

    private static IntPtr getDirectoryDCIMMethodID = IntPtr.Zero;

    private IntPtr capturingObject = IntPtr.Zero;

    void Start()
    {
        if (!Application.isEditor) 
        {
            // Get a pointer to our class
            IntPtr classID = AndroidJNI.FindClass("com/intel/inde/mp/samples/unity/Capturing");

            // Look up the constructor (note the new (II)V signature)
            constructorMethodID = AndroidJNI.GetMethodID(classID, "<init>", "(II)V");

            // Look up the methods implemented by the class
            initCapturingMethodID = AndroidJNI.GetMethodID(classID, "initCapturing", "(IIII)V");
            startCapturingMethodID = AndroidJNI.GetMethodID(classID, "startCapturing", "(Ljava/lang/String;)V");
            beginCaptureFrameMethodID = AndroidJNI.GetMethodID(classID, "beginCaptureFrame", "()V");
            endCaptureFrameMethodID = AndroidJNI.GetMethodID(classID, "endCaptureFrame", "()V");
            stopCapturingMethodID = AndroidJNI.GetMethodID(classID, "stopCapturing", "()V");

            getDirectoryDCIMMethodID = AndroidJNI.GetStaticMethodID(classID, "getDirectoryDCIM", "()Ljava/lang/String;");
            jvalue[] args = new jvalue[0];
            videoDir = AndroidJNI.CallStaticStringMethod(classID, getDirectoryDCIMMethodID, args);

            // Create the Capturing object, passing the screen dimensions to the constructor
            jvalue[] constructorParameters = new jvalue[2];

            constructorParameters[0].i = Screen.width;
            constructorParameters[1].i = Screen.height;

            IntPtr local_capturingObject = AndroidJNI.NewObject(classID, constructorMethodID, constructorParameters);

            if (local_capturingObject == IntPtr.Zero) 
            {
                Debug.LogError("Can't create Capturing object");
                return;
            }

            // Save a pointer to the object
            capturingObject = AndroidJNI.NewGlobalRef(local_capturingObject);
            AndroidJNI.DeleteLocalRef(local_capturingObject);

            AndroidJNI.DeleteLocalRef(classID);
        }

        inProgress = false;
        nextCapture = Time.time;
    }

    void OnPreRender()
    {
        if (inProgress && Time.time > nextCapture) 
        {
            finalizeFrame = true;
            nextCapture += 1.0f / videoFrameRate;
            BeginCaptureFrame();
        }
    }

    public IEnumerator OnPostRender()
    {
        if (finalizeFrame) 
        {
            finalizeFrame = false;
            yield return new WaitForEndOfFrame();
            EndCaptureFrame();
        } 
        else 
        {
            yield return null;
        }
    }

    public void StartCapturing()
    {
        if (capturingObject == IntPtr.Zero)
        {
            return;
        }

        jvalue[] videoParameters =  new jvalue[4];

        videoParameters[0].i = videoWidth;
        videoParameters[1].i = videoHeight;
        videoParameters[2].i = videoFrameRate;
        videoParameters[3].i = videoBitRate;

        AndroidJNI.CallVoidMethod(capturingObject, initCapturingMethodID, videoParameters);

        DateTime date = DateTime.Now;

        string fullFileName = fileName + date.ToString("ddMMyy-hhmmss.fff") + ".mp4";
        jvalue[] args = new jvalue[1];

        args[0].l = AndroidJNI.NewStringUTF(videoDir + fullFileName);
        AndroidJNI.CallVoidMethod(capturingObject, startCapturingMethodID, args);

        inProgress = true;
    }

    private void BeginCaptureFrame()
    {
        if (capturingObject == IntPtr.Zero)
        {
            return;
        }

        jvalue[] args = new jvalue[0];
        AndroidJNI.CallVoidMethod(capturingObject, beginCaptureFrameMethodID, args);
    }

    private void EndCaptureFrame()
    {
        if (capturingObject == IntPtr.Zero)
        {
            return;
        }

        jvalue[] args = new jvalue[0];
        AndroidJNI.CallVoidMethod(capturingObject, endCaptureFrameMethodID, args);
    }

    public void StopCapturing()
    {
        inProgress = false;

        if (capturingObject == IntPtr.Zero)
        {
            return;
        }

        jvalue[] args = new jvalue[0];
        AndroidJNI.CallVoidMethod(capturingObject, stopCapturingMethodID, args);
    }
}

There are more changes in this code, but the logic remains simple. First we pass the frame dimensions to the Capturing constructor; note the new constructor signature, (II)V. On the Java side, we create a FrameBuffer object and pass it these parameters.

The OnPreRender() method is invoked before the camera starts rendering the scene. This is where we bind our FrameBuffer, so the entire scene is rendered into the texture attached to it.

The OnPostRender() method is called after rendering. It is declared as returning IEnumerator, so Unity runs it as a coroutine, which lets us yield WaitForEndOfFrame(). Once the frame is complete, we unbind the FrameBuffer and copy its texture both into the video and directly to the screen by means of the Media Pack (see the endCaptureFrame() method of Capturing.java).

To evaluate performance, let's add a simple FPS counter. To do this, add a Unity GUI object to the scene and attach the following script to it:

FPS.cs

using UnityEngine;
using System.Collections;

public class FPSCounter : MonoBehaviour
{
    public float updateRate = 4.0f; // 4 updates per sec.

    private int frameCount = 0;
    private float nextUpdate = 0.0f;
    private float fps = 0.0f;
    private GUIStyle style = new GUIStyle();

    void Start()
    {
        style.fontSize = 48;
        style.normal.textColor = Color.white;

        nextUpdate = Time.time;
    }

    void Update()
    {
        frameCount++;

        if (Time.time > nextUpdate) 
        {
            nextUpdate += 1.0f / updateRate;
            fps = frameCount * updateRate;
            frameCount = 0;
        }
    }

    void OnGUI()
    {
        GUI.Label(new Rect(10, 110, 300, 100), "FPS: " + fps, style);
    }
}

With this, our work can be considered finished. Run the project and experiment with recording.


Original article: http://habrahabr.ru/company/intel/blog/219649/

 
