A recent project required an ARM development board to capture image data from a camera and send it to a backend. The board runs Fedora, which led to quite a few pitfalls, so I am recording them here.
https://blog.csdn.net/qq_30910355/article/details/119106776
https://blog.csdn.net/who__are__you_/article/details/82628390
Running a dnf install on the board may fail with:

$ dnf install xxx
Error: Failed to synchronize cache for repo
https://blog.csdn.net/l1175832366/article/details/104175396
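The first things worth trying for this error (a sketch; the exact cause varies by repo and network configuration, see the link above):

```shell
# Rebuild dnf's metadata cache from scratch
dnf clean all
dnf makecache
# If the board has no working network/DNS, cache sync also fails with this error:
ping -c1 mirrors.fedoraproject.org
```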
https://blog.csdn.net/heng615975867/article/details/80519274
After configuring, move the nginx-rtmp-module-master folder from the download directory into /usr/local/nginx, since it may not have been moved there automatically.
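For reference, the rtmp module is compiled in at nginx build time with `./configure --add-module=/usr/local/nginx/nginx-rtmp-module-master` (the module path here is an assumption based on the step above), and then enabled in nginx.conf. A minimal rtmp block matching the `live` application and port 1935 used in the push URL later in this post would look like:

```nginx
rtmp {
    server {
        listen 1935;            # port used in the push/pull URL
        application live {      # the "live" segment of rtmp://host:1935/live/<key>
            live on;
            record off;
        }
    }
}
```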
https://blog.csdn.net/weixin_30539625/article/details/98516616
After installation you may need to add ffmpeg to your PATH:
vim /etc/profile
Add the line:
export PATH=/usr/local/ffmpeg/bin:$PATH
Then reload the environment:
source /etc/profile
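To confirm the new entry took effect, you can check that the directory is on PATH (a sketch; `/usr/local/ffmpeg/bin` is the install prefix assumed above):

```shell
export PATH=/usr/local/ffmpeg/bin:$PATH
# Verify the directory is on PATH
case ":$PATH:" in
  *":/usr/local/ffmpeg/bin:"*) echo "ffmpeg dir on PATH" ;;
  *) echo "missing" ;;
esac
# On the board, also check that the binary resolves:
# which ffmpeg && ffmpeg -version
```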
Change the address in the script to the address you want to push to:
rtmpUrl = "rtmp://139.159.142.192:1935/live/1"
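The pieces of this URL map onto the nginx setup: the host is the server IP, 1935 is the port the rtmp block listens on, `live` is the application name, and `1` is the stream key. A quick sanity check of the address, as a sketch using only the standard library:

```python
from urllib.parse import urlparse

rtmpUrl = "rtmp://139.159.142.192:1935/live/1"
parts = urlparse(rtmpUrl)
# hostname -> server IP, port -> nginx rtmp port
app, stream_key = parts.path.lstrip("/").split("/", 1)
print(parts.hostname, parts.port, app, stream_key)
```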
If passing 0 as the camera parameter reports that the camera cannot be found, refer to the camera-index lookup links at the end of this post. Because I was using an AI development board, passing 0 directly may not open the camera.
vid = cv2.VideoCapture(r"/usr/local/web/studey/mysite/chat/video/4.mp4")
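If index 0 fails, one way to find a working index is to probe a range of indices. A minimal sketch; the `open_fn` indirection is my addition so the probing logic can be exercised without a camera, and on the board you would pass `cv2.VideoCapture` for it:

```python
def find_camera_indices(open_fn, max_index=5):
    """Return indices i in [0, max_index) that open successfully.

    open_fn is expected to behave like cv2.VideoCapture: the returned
    object must provide isOpened() and release().
    """
    found = []
    for i in range(max_index):
        cap = open_fn(i)
        if cap.isOpened():
            found.append(i)
        cap.release()
    return found
```

On the board: `find_camera_indices(cv2.VideoCapture)` returns the usable camera indices.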
import queue
import subprocess as sp
from threading import Thread

import cv2

# Queue holding captured frames
frame_queue = queue.Queue()

# Push address; the frontend pulls the stream from this address.
# Host IP plus the port configured for ffmpeg/rtmp in nginx.
rtmpUrl = "rtmp://139.159.142.192:1935/live/1"

# ffmpeg arguments for the push; there are quite a few options,
# look them up online for details
command = ['ffmpeg',
           '-y',
           '-f', 'rawvideo',
           '-vcodec', 'rawvideo',
           '-pix_fmt', 'bgr24',
           '-s', "{}x{}".format(640, 480),  # frame resolution
           '-r', str(25.0),                 # video frame rate
           '-i', '-',
           '-c:v', 'libx264',
           '-pix_fmt', 'yuv420p',
           '-preset', 'ultrafast',
           '-f', 'flv',
           rtmpUrl]


def Video():
    # Capture frames (here from a video file; use the camera index
    # instead to capture from a live camera)
    vid = cv2.VideoCapture(r"/usr/local/web/studey/mysite/chat/video/4.mp4")
    if not vid.isOpened():
        raise IOError("Couldn't open webcam or video")
    while vid.isOpened():
        return_value, frame = vid.read()
        if not return_value:
            break
        # Push the raw frame into the queue
        frame_queue.put(frame)


def push_frame():
    # Push frames to the server through an ffmpeg pipe
    p = sp.Popen(command, stdin=sp.PIPE)
    while True:
        # Take a frame out of the queue (blocks until one is available);
        # process it here if needed, then write it to ffmpeg's stdin
        frame = frame_queue.get()
        p.stdin.write(frame.tobytes())


def run():
    # Two threads: one captures, the other pushes
    thread1 = Thread(target=Video)
    thread1.start()
    thread2 = Thread(target=push_frame)
    thread2.start()


if __name__ == '__main__':
    run()
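Note that each write to ffmpeg's stdin must hand over exactly the number of bytes the `-s`/`-pix_fmt` arguments promise: 640 × 480 pixels × 3 bytes per pixel for bgr24. A quick check of that invariant, using a synthetic frame in place of a captured one:

```python
import numpy as np

width, height = 640, 480
# A synthetic frame shaped like what cv2.VideoCapture.read() returns
frame = np.zeros((height, width, 3), dtype=np.uint8)
# bytes written per frame must equal width * height * 3 for bgr24
assert len(frame.tobytes()) == width * height * 3
```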
To pull the stream, use your push address:
vid = cv2.VideoCapture("你的推流地址")
References for finding the local camera index:
https://blog.csdn.net/weixin_44345862/article/details/91047938?ops_request_misc=&request_id=&biz_id=102&utm_term=%E6%9F%A5%E7%9C%8B%E8%BF%9E%E6%8E%A5%E6%91%84%E5%83%8F%E5%A4%B4%E5%8F%B7&utm_medium=distribute.pc_search_result.none-task-blog-2allsobaiduweb~default-0-91047938.nonecase&spm=1018.2226.3001.4187
https://blog.csdn.net/keith_bb/article/details/54172899?spm=1001.2101.3001.6650.2&utm_medium=distribute.pc_relevant.none-task-blog-2%7Edefault%7EOPENSEARCH%7Edefault-2.tagcolumn&depth_1-utm_source=distribute.pc_relevant.none-task-blog-2%7Edefault%7EOPENSEARCH%7Edefault-2.tagcolumn