Capturing an H.264 stream from a camera with GStreamer


I'm trying to capture an H.264 stream from a locally installed Logitech C920 camera at /dev/video0 with the GStreamer 1.0 v4l2src element.

v4l2-ctl --list-formats shows that the camera is capable of delivering the H264 video format:

# v4l2-ctl --list-formats
ioctl: VIDIOC_ENUM_FMT
        ...

        Index       : 1
        Type        : Video Capture
        Pixel Format: 'H264' (compressed)
        Name        : H.264

        ...

But the pipeline

# gst-launch-1.0 -vvv v4l2src device=/dev/video0 ! video/x-h264, width=800, height=448, framerate=30/1 ! fakesink


keeps giving me a not-negotiated (-4) error:

/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-h264, width=(int)800, height=(int)448, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-h264, width=(int)800, height=(int)448, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstFakeSink:fakesink0.GstPad:sink: caps = video/x-h264, width=(int)800, height=(int)448, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-h264, width=(int)800, height=(int)448, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2809): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming task paused, reason not-negotiated (-4)
Execution ended after 67687169 ns.

Any help would be appreciated!

5 Solutions

#1 (9 votes)

Is GStreamer mandatory for your needs? I also have lots of problems with the Logitech C920 in H264 mode with GStreamer, but I managed to use VLC as an RTSP server to stream H264 from the C920:

$ cvlc -v v4l2:///dev/video0:chroma="H264":width=1024:height=570:fps=30 \
       --sout="#rtp{sdp=rtsp://:8554/live}"

Then I can connect with another VLC instance to the URI rtsp://localhost:8554/live.
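For example, a minimal client-side check (assuming a VLC desktop client is available on the same machine) could be:

$ vlc rtsp://localhost:8554/live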

If GStreamer is mandatory for you, I only managed to use it with a capture utility that you can find here: https://github.com/csete/bonecam (directory "capture").

You have to compile it, but if you have some programming skills it should be very easy, as there is only one C file and a script to help. Just pass "host" as a parameter to the script:

# Get the bonecam/capture content or git clone the directory, and then
$ cd bonecam/capture
$ ./build host

You can use the "capture" utility with something like this:

$ v4l2-ctl -d /dev/video0 --set-fmt-video=width=1024,height=570,pixelformat=1
$ v4l2-ctl -d /dev/video0 --set-parm=30
$ ./bonecam/capture/capture -d /dev/video0 -c 100000 -o | \
      gst-launch -e filesrc location=/dev/fd/0 ! legacyh264parse ! rtph264pay ! udpsink host=10.0.0.42 port=5000
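On the receiving host (10.0.0.42 in the example above), a GStreamer 1.0 pipeline along these lines should be able to display the stream; this is only a sketch, and the RTP caps (clock-rate, payload type) may need adjusting:

$ gst-launch-1.0 udpsrc port=5000 caps="application/x-rtp, media=video, clock-rate=90000, encoding-name=H264, payload=96" ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink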

If you do not want to specify the number of frames to grab (the "-c" parameter of "capture"), there is a fork of this utility that you can find here: https://github.com/DeLaGuardo/bonecam

I know there is a plugin categorized as "bad", called uvch264, for GStreamer 0.10 that should work with the C920. But I do not know about GStreamer 1.0, and I could not test it.

UPD:

Don't forget to add --rtsp-timeout=-1 to the cvlc command line, like:

$ cvlc -v v4l2:///dev/video0:chroma="H264":width=1024:height=570:fps=30 \
       --sout="#rtp{sdp=rtsp://:8554/live}" --rtsp-timeout=-1

Without this option, streaming only lasts for 60 seconds by default.

#2 (1 vote)

I have been trying to do the same thing and I got the same error. I believe I was using GStreamer 1.0.6.

What I found, possibly even thanks to Fergal Butler's answer, was the following page:

http://kakaroto.homelinux.net/2012/09/uvc-h264-encoding-cameras-support-in-gstreamer/

Here Youness Alaoui describes the uvch264_src element he made to bring H264 camera support to GStreamer.

In his article he describes the port to GStreamer 1.0 as pending. So over the last week I've been looking into this. It turns out that it has now been ported to GStreamer 1.0, but only in a development release (version 1.1.2).

You can get version 1.1.2 here:

http://gstreamer.freedesktop.org/src/

It's called "uvch264src" now, and it's a part of gst-plugins-bad. I think it is also present in version 1.1.1 but I haven't really looked into that.

I had a bit of a hard time getting it installed, I think mostly due to conflicts with GST 1.0 packages installed on my PC (so my own fault). But note that it has dependencies on libgudev-1.0-dev and libusb-1.0-0-dev, so install these packages first - it took me a while to work out it was those two I was missing.
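On a Debian/Ubuntu-based system (an assumption; package names may differ on other distributions), installing them first could look like:

sudo apt-get install libgudev-1.0-dev libusb-1.0-0-dev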

Here is a pipeline I got working which uses uvch264:

gst-launch-1.0 uvch264src device=/dev/video0 name=src auto-start=true src.vfsrc ! video/x-raw, format=YUY2, width=160, height=90, framerate=5/1 ! xvimagesink src.vidsrc ! queue ! video/x-h264, width=800, height=448, framerate=30/1 ! h264parse ! avdec_h264 ! xvimagesink

If you don't want to use the preview video (from the vfsrc pad), just hook src.vfsrc straight up to a fakesink. I should also mention that even though this pipeline is working for me, I get a lot of warnings about "Got data flow before segment event", so obviously I'm not doing something right, but I'm not sure what.
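For example, a variant of the pipeline above with the preview discarded (a sketch only; the vfsrc caps are kept since uvch264src typically still needs them to negotiate the preview stream) would be:

gst-launch-1.0 uvch264src device=/dev/video0 name=src auto-start=true src.vfsrc ! video/x-raw, format=YUY2, width=160, height=90, framerate=5/1 ! fakesink src.vidsrc ! queue ! video/x-h264, width=800, height=448, framerate=30/1 ! h264parse ! avdec_h264 ! xvimagesink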

Anyway, after all of that messing about getting 1.1.2 and uvch264src completely installed and working, I decided to give v4l2src a quick go again. And it turns out that v4l2src supports H264 properly after all :/. (See the short answer.)


Short Answer:

So the short answer to your question is that if you are happy to install 1.1.2 from source, you'll be able to do exactly what you want in the same way you've been trying to do it. You shouldn't need uvch264src. I've tested your pipeline and it worked fine with my installation. I've also tried this simple pipeline to display the video on-screen, and it worked fine for me as well:

gst-launch-1.0 -e v4l2src device=/dev/video0 ! video/x-h264, width=800, height=448, framerate=30/1 ! avdec_h264 ! xvimagesink sync=false

#3 (0 votes)

I don't believe v4l2src supports h264 at the moment. See here:

http://www.oz9aec.net/index.php/gstreamer/473-using-the-logitech-c920-webcam-with-gstreamer

and here:

http://kakaroto.homelinux.net/2012/09/uvc-h264-encoding-cameras-support-in-gstreamer/

#4 (0 votes)

Try using videoconvert to automatically convert the video to a format understood by the video sink:

gst-launch-1.0 -vvv v4l2src device=/dev/video0 ! videoconvert ! ...
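A complete pipeline along these lines might look like the following (a sketch only: it captures raw video rather than requesting the camera's H264 format, and lets autovideosink pick a suitable output):

gst-launch-1.0 -vvv v4l2src device=/dev/video0 ! videoconvert ! autovideosink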

#5 (0 votes)

I've also got a Logitech C920 camera, and have used the following pipeline to record H.264 video from the camera:

gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-h264,width=1280,height=720,framerate=30/1 ! mpegtsmux ! filesink location=output.ts

This asks the camera to produce H.264 data, which I then mux into an MPEG transport stream container and write to disk. I can play the resulting file successfully with Totem.
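A quick playback check with GStreamer itself (assuming output.ts is in the current directory) could be:

gst-launch-1.0 playbin uri=file://$(pwd)/output.ts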

The above pipeline records at 720p. The camera can also record at 1080p if you change the requested format to width=1920,height=1080.
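A sketch of the same pipeline at 1080p (not tested here) would be:

gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-h264,width=1920,height=1080,framerate=30/1 ! mpegtsmux ! filesink location=output.ts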

