MediaPipe is an open-source project from Google that provides cross-platform implementations of common ML solutions; see the link below for details.
MediaPipe
MediaPipe wraps the low-level implementation of hand tracking, so once the Python environment is set up, hand recognition takes only a few simple calls.
Set up the environment as follows:
pip install mediapipe
pip install opencv-python
A minimal implementation takes very little code:
import cv2
import mediapipe as mp
import time

cap = cv2.VideoCapture('video.mp4')  # replace 'video.mp4' with 0 to open the webcam
mpHands = mp.solutions.hands
hands = mpHands.Hands()
mpDraw = mp.solutions.drawing_utils

pTime = 0  # timestamp of the previous frame, used to compute FPS
cTime = 0

while True:
    success, img = cap.read()
    if not success:
        break
    imgRGB = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)  # MediaPipe expects RGB input
    results = hands.process(imgRGB)
    if results.multi_hand_landmarks:
        for handLms in results.multi_hand_landmarks:
            for id, lm in enumerate(handLms.landmark):
                # landmark coordinates are normalized; convert them to pixels
                h, w, c = img.shape
                cx, cy = int(lm.x * w), int(lm.y * h)
                print(id, cx, cy)
                cv2.circle(img, (cx, cy), 15, (255, 0, 255), cv2.FILLED)
            mpDraw.draw_landmarks(img, handLms, mpHands.HAND_CONNECTIONS)
    cTime = time.time()
    fps = 1 / (cTime - pTime)
    pTime = cTime
    cv2.putText(img, str(int(fps)), (10, 70), cv2.FONT_HERSHEY_PLAIN, 3,
                (255, 255, 255), 2)
    cv2.imshow("Image", img)
    cv2.waitKey(1)
The code above prints lm.x and lm.y, the x and y coordinates of each landmark. Each landmark also has a z value, lm.z, which gives its depth relative to landmark 0 (whose z is 0). The 21 landmark positions are laid out as follows:
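If the landmark diagram is not at hand, the indices and names can also be listed directly from MediaPipe's HandLandmark enum; a quick sketch:

import mediapipe as mp

# Print the index and name of each of the 21 hand landmarks:
# 0 = WRIST, 1-4 = thumb (CMC, MCP, IP, TIP), 5-8 = index finger,
# 9-12 = middle finger, 13-16 = ring finger, 17-20 = pinky.
for lm in mp.solutions.hands.HandLandmark:
    print(lm.value, lm.name)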
To obtain depth information, you can exploit the fact that the farther landmarks 5 and 17 are from the camera, the smaller the on-screen distance between them, and fit a function to that relationship to predict depth. This only works well while the palm is parallel to the camera; once the hand rotates, the estimated depth changes as well. The fitting code is as follows:
import math
import numpy as np

# Calibration data: x is the pixel distance between landmarks 5 and 17,
# y is the corresponding real distance from the camera in centimeters.
x = [300, 245, 200, 170, 145, 130, 112, 103, 93, 87, 80, 75, 70, 67, 62, 59, 57]
y = [20, 25, 30, 35, 40, 45, 50, 55, 60, 65, 70, 75, 80, 85, 90, 95, 100]
coff = np.polyfit(x, y, 2)  # fit a 2nd-degree polynomial y = A*x^2 + B*x + C

# HandLeft is assumed to hold the (x, y, z) pixel coordinates of one hand's 21 landmarks
Lx1, Ly1 = HandLeft[5][:2]
Lx2, Ly2 = HandLeft[17][:2]
Ldistance = int(math.sqrt((Ly2 - Ly1) ** 2 + (Lx2 - Lx1) ** 2))
A, B, C = coff
LdistanceCM = A * Ldistance ** 2 + B * Ldistance + C  # estimated distance in cm
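HandLeft above is assumed to be a per-hand list of landmark pixel coordinates; a hypothetical way to build it from the detection loop shown earlier might be:

# Hypothetical helper: collect (x, y, z) for one detected hand, in pixel units for x and y.
HandLeft = []
if results.multi_hand_landmarks:
    handLms = results.multi_hand_landmarks[0]  # first detected hand
    h, w, c = img.shape
    for lm in handLms.landmark:
        HandLeft.append((int(lm.x * w), int(lm.y * h), lm.z))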
In addition, an interface can be set up in Python to communicate with other programs. Here the example is communication with a Unity script over UDP, with a small Tkinter GUI used to enter the target port:
# GUI for choosing the UDP port
import tkinter as tk

port = 43513                                  # default port; must match the Unity receiver
serverAddressPort = ("127.0.0.1", port)       # UDP target address

window = tk.Tk()
labelP = tk.Label(window, text='port: ')
labelP.place(x=150, y=0)
var = tk.IntVar()
var.set(port)
entery = tk.Entry(window, textvariable=var)   # port input field
entery.pack()

def insertPort():
    # Read the port from the entry box and update the UDP target address
    global port, serverAddressPort
    port = int(entery.get())
    var.set(port)
    serverAddressPort = ("127.0.0.1", port)

button = tk.Button(window, text='Enter', width=15, height=2, command=insertPort)
button.pack()
button = tk.Button(window, text='Start', width=15, height=2, command=changeSwitch)  # changeSwitch (defined elsewhere) toggles sending
button.pack()
window.mainloop()
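The UDP send itself is not shown above. A minimal sketch of the sending side, using the serverAddressPort set by the GUI, could look like the following; the flattened lmList format and the 10000 scaling are assumptions inferred from the Unity code below, not the article's exact code:

import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def sendLandmarks(lmList):
    # lmList is assumed to be a flat list of integers: the landmark values
    # already multiplied by 10000, in the order the Unity script expects.
    data = str(lmList)                        # e.g. "[5123, 7980, -40, ...]"
    sock.sendto(data.encode("utf-8"), serverAddressPort)

Calling a function like this once per processed frame inside the detection loop streams the landmarks to Unity.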
Before sending, it helps to print the message once so the format can be checked against what Unity expects. On the Unity side, the incoming values are received like this:
using UnityEngine;
using UnityEngine.UI;
using System;
using System.Text;
using System.Net;
using System.Net.Sockets;
using System.Threading;

public class UDPReceive : MonoBehaviour
{
    Thread receiveThread;
    UdpClient client = null;
    public int port = 43513;
    public bool startRecieving = true;
    public bool printToConsole = false;
    public string data;

    public void Start()
    {
        // Receive on a background thread so Unity's main loop is not blocked
        receiveThread = new Thread(new ThreadStart(ReceiveData));
        receiveThread.Start();
    }

    private void ReceiveData()
    {
        client = new UdpClient(port);
        while (startRecieving)
        {
            try
            {
                IPEndPoint anyIP = new IPEndPoint(IPAddress.Any, 0);
                byte[] dataByte = client.Receive(ref anyIP);
                data = Encoding.UTF8.GetString(dataByte);
                if (printToConsole) { print(data); }
            }
            catch (Exception err)
            {
                print(err.ToString());
            }
        }
    }
}
After that, strip the surrounding characters from the received string to recover the x, y, z values, then use them to create objects and draw lines to visualize the hand:
GameObject[] point = new GameObject[21];
Vector3[] handPoints = new Vector3[21];
// pointsRight: string array obtained each frame by splitting UDPReceive.data on ','
// (the exact trimming/parsing depends on the format the Python sender uses)
string[] pointsRight;

public void Start()
{
    for (int i = 0; i < 21; i++)
    {
        point[i] = GameObject.CreatePrimitive(PrimitiveType.Sphere);
        point[i].SetActive(true);
        point[i].transform.localScale = new Vector3(0.005f, 0.005f, 0.005f);
        Renderer render = point[i].GetComponent<Renderer>();
        //render.material = new Material(Shader.Find("Unlit/Color"));
        render.material.SetColor("_Color", Color.red);
    }
}

public void Update()
{
    for (int i = 0; i < 21; i++)
    {
        // Each landmark occupies four consecutive entries; the values were scaled by 10000 before sending
        float x = float.Parse(pointsRight[i * 4]) / 10000;
        float y = float.Parse(pointsRight[i * 4 + 1]) / 10000;
        float z = float.Parse(pointsRight[i * 4 + 2]) / 10000;
        if (x == 0 && y == 0 && z == 0)
            return;
        handPoints[i] = new Vector3(x, -y, z);  // flip y: image coordinates grow downwards
        point[i].transform.position = handPoints[i];
    }
    // Connect consecutive landmarks along each finger, skipping the jump from one fingertip to the next finger
    for (int i = 1; i < 21; i++)
    {
        if (i != 4 && i != 8 && i != 12 && i != 16 && i != 20)
            Debug.DrawLine(handPoints[i], handPoints[i + 1], Color.green);
    }
    // Connect the wrist (landmark 0) to the base of each finger
    Debug.DrawLine(handPoints[0], handPoints[1], Color.black);
    Debug.DrawLine(handPoints[0], handPoints[5], Color.black);
    Debug.DrawLine(handPoints[0], handPoints[9], Color.black);
    Debug.DrawLine(handPoints[0], handPoints[13], Color.black);
    Debug.DrawLine(handPoints[0], handPoints[17], Color.black);
}
The result is shown below.