How to fix Unity ARKit Remote
This article is an English summary of a Japanese article I wrote, prepared in response to a request for an English explanation.
Only the key points are covered here, so please see the linked article for the detailed explanation.
It does not work properly!
What is wrong?
The amount of data transmitted is too large for the receiving side to handle, so the sending side fails to send it.
This problem occurs because the iPhone X has a high resolution.
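To get a feel for the data volume involved, here is a rough per-frame estimate. The 1920×1080 resolution and the NV12-style layout (full-resolution Y plane plus half-resolution interleaved UV plane) are assumptions for illustration; the actual capture resolution depends on the device and configuration.

```csharp
using System;

public static class FrameSize
{
    public static void Main()
    {
        // Assumed camera resolution; the real ARKit capture size may differ.
        int width = 1920, height = 1080;

        int yBytes  = width * height;      // one luma byte per pixel
        int uvBytes = width * height / 2;  // interleaved UV at half resolution

        Console.WriteLine("Y: " + yBytes + " bytes, UV: " + uvBytes
            + " bytes, total: " + (yBytes + uvBytes) + " bytes/frame");
    }
}
```

That is on the order of 3 MB per frame before compression, which is why shrinking the payload helps.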
How to fix
On the sending side, compress the data with a DeflateStream (backed by a MemoryStream) before sending it.
The point is:
- Compress the data on the sending side.
- Decompress the data on the receiving side.
Compress & Decompress
/// <summary>
/// Compress using deflate.
/// </summary>
/// <returns>The compressed bytes.</returns>
/// <param name="source">Source.</param>
public static byte[] ConvertByteCompress(byte[] source)
{
    using (MemoryStream ms = new MemoryStream())
    using (DeflateStream compressedDStream = new DeflateStream(ms, CompressionMode.Compress, true))
    {
        compressedDStream.Write(source, 0, source.Length);
        compressedDStream.Close();

        byte[] destination = ms.ToArray();
        Debug.Log(source.Length.ToString() + " vs " + ms.Length.ToString());
        return destination;
    }
}

/// <summary>
/// Decompress using deflate.
/// </summary>
/// <returns>The decompressed bytes.</returns>
/// <param name="source">Source.</param>
public static byte[] ConvertByteDecompress(byte[] source)
{
    using (MemoryStream input = new MemoryStream(source))
    using (MemoryStream output = new MemoryStream())
    using (DeflateStream decompressedDstream = new DeflateStream(input, CompressionMode.Decompress))
    {
        decompressedDstream.CopyTo(output);

        byte[] destination = output.ToArray();
        Debug.Log("Decompress Size : " + output.Length);
        return destination;
    }
}
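As a quick sanity check outside Unity, the same round trip can be exercised with plain .NET. The method bodies below mirror the snippet above, with Console.WriteLine standing in for Debug.Log; this is a verification sketch, not part of the plugin.

```csharp
// Standalone round-trip check (plain .NET, no Unity): compresses a sample
// buffer with Deflate and verifies that decompression restores it exactly.
using System;
using System.IO;
using System.IO.Compression;

public static class ByteConverter
{
    public static byte[] ConvertByteCompress(byte[] source)
    {
        using (MemoryStream ms = new MemoryStream())
        {
            using (DeflateStream ds = new DeflateStream(ms, CompressionMode.Compress, true))
            {
                ds.Write(source, 0, source.Length);
            }
            return ms.ToArray();
        }
    }

    public static byte[] ConvertByteDecompress(byte[] source)
    {
        using (MemoryStream input = new MemoryStream(source))
        using (MemoryStream output = new MemoryStream())
        using (DeflateStream ds = new DeflateStream(input, CompressionMode.Decompress))
        {
            ds.CopyTo(output);
            return output.ToArray();
        }
    }

    public static void Main()
    {
        // Highly repetitive data, similar in spirit to a camera Y plane,
        // compresses well with Deflate.
        byte[] original = new byte[4096];
        for (int i = 0; i < original.Length; i++) original[i] = (byte)(i % 8);

        byte[] compressed = ByteConverter.ConvertByteCompress(original);
        byte[] restored = ByteConverter.ConvertByteDecompress(compressed);

        Console.WriteLine(original.Length + " -> " + compressed.Length
            + " -> " + restored.Length);
    }
}
```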
How to use
UnityRemoteVideo.cs
public void OnPreRender()
{
    // ... abridged ...

    //connectToEditor.SendToEditor (ConnectionMessageIds.screenCaptureYMsgId, YByteArrayForFrame(1-currentFrameIndex));
    connectToEditor.SendToEditor(ConnectionMessageIds.screenCaptureYMsgId,
        ByteConverter.ConvertByteCompress(YByteArrayForFrame(1 - currentFrameIndex)));

    //connectToEditor.SendToEditor (ConnectionMessageIds.screenCaptureUVMsgId, UVByteArrayForFrame(1-currentFrameIndex));
    connectToEditor.SendToEditor(ConnectionMessageIds.screenCaptureUVMsgId,
        ByteConverter.ConvertByteCompress(UVByteArrayForFrame(1 - currentFrameIndex)));
}
ARKitRemoteConnection.cs
void ReceiveRemoteScreenYTex(MessageEventArgs mea)
{
    if (!bTexturesInitialized)
        return;

    //remoteScreenYTex.LoadRawTextureData(mea.data);
    remoteScreenYTex.LoadRawTextureData(ByteConverter.ConvertByteDecompress(mea.data));
    remoteScreenYTex.Apply();

    UnityARVideo arVideo = Camera.main.GetComponent<UnityARVideo>();
    if (arVideo)
    {
        arVideo.SetYTexure(remoteScreenYTex);
    }
}

void ReceiveRemoteScreenUVTex(MessageEventArgs mea)
{
    if (!bTexturesInitialized)
        return;

    //remoteScreenUVTex.LoadRawTextureData(mea.data);
    remoteScreenUVTex.LoadRawTextureData(ByteConverter.ConvertByteDecompress(mea.data));
    remoteScreenUVTex.Apply();

    UnityARVideo arVideo = Camera.main.GetComponent<UnityARVideo>();
    if (arVideo)
    {
        arVideo.SetUVTexure(remoteScreenUVTex);
    }
}
UnityRemoteVideo.cs and ARKitRemoteConnection.cs are scripts that ship with the Unity ARKit Plugin.
The compression and decompression code (ByteConverter) is the script I wrote for this fix.
Caution!
To use the CopyTo method of the DeflateStream class, I use .NET Framework 4.6 instead of .NET Framework 3.5.
Building the Unity ARKit Remote scene requires .NET Framework 3.5, so switch to .NET Framework 4.6 after building, and then run the Editor.
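If you would rather stay on .NET Framework 3.5, which lacks Stream.CopyTo, the copy can be done with a manual read loop. This helper is my own sketch, not part of the original article or the Unity ARKit Plugin:

```csharp
using System.IO;

public static class StreamCopy
{
    // Manual replacement for Stream.CopyTo (added in .NET 4.0):
    // repeatedly reads into a buffer and writes what was read.
    public static void CopyStream(Stream source, Stream destination)
    {
        byte[] buffer = new byte[4096];
        int read;
        // Read returns 0 at end of stream.
        while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            destination.Write(buffer, 0, read);
        }
    }
}
```

ConvertByteDecompress would then call StreamCopy.CopyStream(decompressedDstream, output) in place of decompressedDstream.CopyTo(output).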
Run
With these changes, Unity ARKit Remote works correctly.
If there is anything you do not understand, please leave a comment :)