醬學堂 | Getting Started with HoloLens Development (a HoloLens tutorial)
Part 1: Development Requirements
HoloLens runs on Windows 10, and its applications are built on UWP (the Universal Windows Platform). Developing holographic experiences like those on HoloLens places fairly high demands on your PC configuration.
Hardware requirements:
1. 64-bit Windows 10 Pro, Enterprise, or Education (the Home edition does not support Hyper-V)
2. A 64-bit CPU
3. 8 GB of RAM or more
4. In the BIOS, the following features must be supported and enabled:
- Hardware-assisted virtualization
- Second Level Address Translation (SLAT)
- Hardware-based Data Execution Prevention (DEP)
5. GPU: DirectX 11.0 or later, with a WDDM 1.2 driver or later
About Hyper-V: it is Microsoft's virtualization product, built on hypervisor technology similar to VMware's products and Citrix's open-source Xen.
Part 2: Installation
1. Enable virtualization, i.e. turn on hardware virtualization on your PC.
For detailed steps, see:
https://msdn.microsoft.com/library/windows/apps/jj863509(v=vs.105).aspx
2. Enable Hyper-V
3. Install Visual Studio 2017 or Visual Studio 2015 Update 3 (https://developer.microsoft.com/en-us/windows/downloads)
4. Install the HoloLens emulator (https://developer.microsoft.com/en-us/windows/mixed-reality/hololens_emulator_archive)
5. Install Unity (https://unity3d.com/cn/get-unity/download)
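As a sketch, steps 1 and 2 can also be done from an elevated PowerShell prompt instead of the Control Panel (this assumes Windows 10 Pro or higher; a reboot is required afterwards):

```powershell
# Check whether the CPU and firmware expose the required virtualization features
# (see the "Hyper-V Requirements" section at the end of the output)
systeminfo

# Enable Hyper-V and its management tools (run as administrator, then reboot)
Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Hyper-V -All
```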
For a detailed installation walkthrough, there is a video tutorial in English. For some reason the video did not pass Tencent Video's review, so you can watch it on Youku or tap "Read the original article".
Part 3: The HoloLens Emulator
The HoloLens emulator lets you test holographic applications on a PC without a physical HoloLens, and it ships with the HoloLens development toolset. The emulator runs in a Hyper-V virtual machine.
About input:
- Walk forward, back, left, and right: use the W, A, S, and D keys on the keyboard, or the left stick on an Xbox controller.
- Look up, down, left, and right: click and drag the mouse, use the arrow keys on the keyboard, or the right stick on an Xbox controller.
- Air-tap gesture: right-click the mouse, press Enter on the keyboard, or press the A button on an Xbox controller.
- Bloom gesture: press the Windows key or F2 on the keyboard, or press the B button on an Xbox controller.
- Hand movement for scrolling: hold Alt plus the right mouse button and drag the mouse up/down, or on an Xbox controller hold the right trigger and the A button and move the right stick up and down.
About the toolbar:
On the right side of the main window you will find the emulator toolbar, which contains the following buttons:
- Close: closes the emulator.
- Minimize: minimizes the emulator window.
- Human Input: the mouse and keyboard are used to simulate human input to the emulator.
- Keyboard and Mouse Input: keyboard and mouse input is passed directly to the HoloLens OS as keyboard and mouse events, as if you had connected a Bluetooth keyboard and mouse.
- Fit to Screen: fits the emulator to the screen.
- Zoom: makes the emulator larger or smaller.
- Help: opens the emulator help.
- Open Device Portal: opens the Windows Device Portal for the HoloLens OS in the emulator.
- Tools: opens the "Additional Tools" pane.
Part 4: Development: Hello, HoloLens!
First, create a new project in Unity and add a simple 3D model for testing, for example:
Next, switch the build platform to Windows Store:
Then click Build to generate the Visual Studio project:
Launch Visual Studio:
By default, a UWP application exported from Unity runs on any Windows 10 device. Since HoloLens is different, the application should take advantage of features available only on HoloLens. To do this, in Visual Studio set TargetDeviceFamily to "Windows.Holographic" in the Package.appxmanifest file, as follows:
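For reference, the relevant fragment of Package.appxmanifest looks roughly like this (the MinVersion and MaxVersionTested values are placeholders from a typical project of that era and may differ in yours):

```xml
<Dependencies>
  <TargetDeviceFamily Name="Windows.Holographic"
                      MinVersion="10.0.10240.0"
                      MaxVersionTested="10.0.10586.0" />
</Dependencies>
```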
Now you can run it:
Part 5: Summary of Input Events
1. Gaze
On HoloLens, gaze is driven by the position and orientation of the user's head, not the eyes.
Sample code (the key part is the raycast):
using UnityEngine;

public class WorldCursor : MonoBehaviour
{
    private MeshRenderer meshRenderer;

    // Use this for initialization
    void Start()
    {
        // Grab the mesh renderer that's on the same object as this script.
        meshRenderer = this.gameObject.GetComponentInChildren<MeshRenderer>();
    }

    // Update is called once per frame
    void Update()
    {
        // Do a raycast into the world based on the user's
        // head position and orientation.
        var headPosition = Camera.main.transform.position;
        var gazeDirection = Camera.main.transform.forward;

        RaycastHit hitInfo;
        if (Physics.Raycast(headPosition, gazeDirection, out hitInfo))
        {
            // If the raycast hit a hologram, display the cursor mesh.
            meshRenderer.enabled = true;

            // Move the cursor to the point where the raycast hit.
            this.transform.position = hitInfo.point;

            // Rotate the cursor to hug the surface of the hologram.
            this.transform.rotation = Quaternion.FromToRotation(Vector3.up, hitInfo.normal);
        }
        else
        {
            // If the raycast did not hit a hologram, hide the cursor mesh.
            meshRenderer.enabled = false;
        }
    }
}
2. Gesture input
Sample code:
using UnityEngine;
using UnityEngine.VR.WSA.Input;

public class GazeGestureManager : MonoBehaviour
{
    public static GazeGestureManager Instance { get; private set; }

    // Represents the hologram that is currently being gazed at.
    public GameObject FocusedObject { get; private set; }

    GestureRecognizer recognizer;

    // Use this for initialization
    void Start()
    {
        Instance = this;

        // Set up a GestureRecognizer to detect Select gestures.
        recognizer = new GestureRecognizer();
        recognizer.TappedEvent += (source, tapCount, ray) =>
        {
            // Send an OnSelect message to the focused object and its ancestors.
            if (FocusedObject != null)
            {
                FocusedObject.SendMessageUpwards("OnSelect");
            }
        };
        recognizer.StartCapturingGestures();
    }

    // Update is called once per frame
    void Update()
    {
        // Figure out which hologram is focused this frame.
        GameObject oldFocusObject = FocusedObject;

        // Do a raycast into the world based on the user's
        // head position and orientation.
        var headPosition = Camera.main.transform.position;
        var gazeDirection = Camera.main.transform.forward;

        RaycastHit hitInfo;
        if (Physics.Raycast(headPosition, gazeDirection, out hitInfo))
        {
            // If the raycast hit a hologram, use that as the focused object.
            FocusedObject = hitInfo.collider.gameObject;
        }
        else
        {
            // If the raycast did not hit a hologram, clear the focused object.
            FocusedObject = null;
        }

        // If the focused object changed this frame,
        // start detecting fresh gestures again.
        if (FocusedObject != oldFocusObject)
        {
            recognizer.CancelGestures();
            recognizer.StartCapturingGestures();
        }
    }
}
The Update method continuously checks which object is being gazed at and records it as the focused object, so that a tap event can be sent to it when the user air-taps. The GestureRecognizer is responsible for recognizing the user's gestures.
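SendMessageUpwards("OnSelect") invokes any method named OnSelect on the focused object or one of its ancestors. A minimal sketch of a receiving script (the name SphereCommands and the falling-on-tap behavior are illustrative assumptions, not part of the original code above):

```csharp
using UnityEngine;

// Hypothetical receiver: attach to a hologram so it reacts to the air-tap.
public class SphereCommands : MonoBehaviour
{
    // Called by GazeGestureManager via SendMessageUpwards("OnSelect").
    void OnSelect()
    {
        // On the first tap, add a Rigidbody so gravity makes the object fall.
        if (this.GetComponent<Rigidbody>() == null)
        {
            var rigidbody = this.gameObject.AddComponent<Rigidbody>();
            rigidbody.collisionDetectionMode = CollisionDetectionMode.Continuous;
        }
    }
}
```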
3. Voice input
Sample code:
using System.Collections.Generic;
using System.Linq;
using UnityEngine;
using UnityEngine.Windows.Speech;

public class SpeechManager : MonoBehaviour
{
    KeywordRecognizer keywordRecognizer = null;
    Dictionary<string, System.Action> keywords = new Dictionary<string, System.Action>();

    // Use this for initialization
    void Start()
    {
        keywords.Add("Reset world", () =>
        {
            // Call the OnReset method on every descendant object.
            this.BroadcastMessage("OnReset");
        });

        keywords.Add("Drop Object", () =>
        {
            var focusObject = GazeGestureManager.Instance.FocusedObject;
            if (focusObject != null)
            {
                // Call the OnDrop method on just the focused object.
                focusObject.SendMessage("OnDrop");
            }
        });

        // Tell the KeywordRecognizer about our keywords.
        keywordRecognizer = new KeywordRecognizer(keywords.Keys.ToArray());

        // Register a callback for the KeywordRecognizer and start recognizing!
        keywordRecognizer.OnPhraseRecognized += KeywordRecognizer_OnPhraseRecognized;
        keywordRecognizer.Start();
    }

    private void KeywordRecognizer_OnPhraseRecognized(PhraseRecognizedEventArgs args)
    {
        System.Action keywordAction;
        if (keywords.TryGetValue(args.text, out keywordAction))
        {
            keywordAction.Invoke();
        }
    }
}
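The "Reset world" and "Drop Object" keywords reach holograms through Unity messages, so a hologram responds simply by defining OnReset and OnDrop methods. A minimal sketch of such a receiver (the class name and the position-restoring logic are assumptions for illustration):

```csharp
using UnityEngine;

// Hypothetical receiver for the voice commands registered in SpeechManager.
public class VoiceCommandTarget : MonoBehaviour
{
    Vector3 originalPosition;

    void Start()
    {
        // Remember where the object started so "Reset world" can restore it.
        originalPosition = this.transform.localPosition;
    }

    // Called via BroadcastMessage("OnReset") when the user says "Reset world".
    void OnReset()
    {
        // Remove any Rigidbody so the object stops moving, then put it back.
        var rigidbody = this.GetComponent<Rigidbody>();
        if (rigidbody != null)
        {
            DestroyImmediate(rigidbody);
        }
        this.transform.localPosition = originalPosition;
    }

    // Called via SendMessage("OnDrop") on the gazed-at object for "Drop Object".
    void OnDrop()
    {
        // Add a Rigidbody so gravity takes over and the object drops.
        if (this.GetComponent<Rigidbody>() == null)
        {
            this.gameObject.AddComponent<Rigidbody>();
        }
    }
}
```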
4. Audio
Sample code:
using UnityEngine;

public class SphereSounds : MonoBehaviour
{
    AudioSource audioSource = null;
    AudioClip impactClip = null;
    AudioClip rollingClip = null;

    bool rolling = false;

    void Start()
    {
        // Add an AudioSource component and set up some defaults
        audioSource = gameObject.AddComponent<AudioSource>();
        audioSource.playOnAwake = false;
        audioSource.spatialize = true;
        audioSource.spatialBlend = 1.0f;
        audioSource.dopplerLevel = 0.0f;
        audioSource.rolloffMode = AudioRolloffMode.Custom;

        // Load the Sphere sounds from the Resources folder
        impactClip = Resources.Load<AudioClip>("Impact");
        rollingClip = Resources.Load<AudioClip>("Rolling");
    }

    // Occurs when this object starts colliding with another object
    void OnCollisionEnter(Collision collision)
    {
        // Play an impact sound if the sphere impacts strongly enough.
        if (collision.relativeVelocity.magnitude >= 0.1f)
        {
            audioSource.clip = impactClip;
            audioSource.Play();
        }
    }

    // Occurs each frame that this object continues to collide with another object
    void OnCollisionStay(Collision collision)
    {
        Rigidbody rigid = this.gameObject.GetComponent<Rigidbody>();

        // Play a rolling sound if the sphere is rolling fast enough.
        if (!rolling && rigid.velocity.magnitude >= 0.01f)
        {
            rolling = true;
            audioSource.clip = rollingClip;
            audioSource.Play();
        }
        // Stop the rolling sound if rolling slows down.
        else if (rolling && rigid.velocity.magnitude < 0.01f)
        {
            rolling = false;
            audioSource.Stop();
        }
    }

    // Occurs when this object stops colliding with another object
    void OnCollisionExit(Collision collision)
    {
        // Stop the rolling sound if the object falls off and stops colliding.
        if (rolling)
        {
            rolling = false;
            audioSource.Stop();
        }
    }
}
The OnCollisionEnter, OnCollisionStay, and OnCollisionExit events determine when to start playing an audio clip, whether to keep it playing, and when to stop it.
Original content by AR醬; please credit the source when reposting.
WeChat ID: AR醬 (ARchan_TT)
AR醬 official site: www.arjiang.com