Setting up 720p live video monitoring

For the Logitech C310, you can run:

./test-launch "v4l2src device=/dev/video1 ! video/x-raw,format=I420,width=1280,height=720 ! omxh264enc ! rtph264pay name=pay0 pt=96"

I don’t have experience with it. Other users may share their experience.

Thanks a lot, DaneLLL.

About issue 2, can anyone help me?

I need multi-PC streaming at the same time.

B/R

Hi,
Instead of an IP address, I want to see the video frames directly. I need to stream them, and I need to do it in C++. Is it possible to stream frame by frame as a live stream using RTSP/GStreamer, viewable in VLC or MX Player? Please help me…

Hi rojachedey,
Please share more about your use case. Is your source a camera?

Hi DaneLLL,

Thanks for responding. I am using the Snapdragon 801 board, which has a front camera; that is my source camera. From it I am taking the video frames and processing them using OpenCV. Now I need to stream the processed frames using GStreamer RTSP in C++.

Hi rojachedey,
Is the Snapdragon 801 board a camera board connected to the TK1?

Hi,
The Snapdragon 801 board has built-in cameras: a front camera and a downward camera. I don’t know what a TK1 is, can you please tell me?

This is the support forum for the Jetson TK1. Many of those Jetsons are used to process camera output, which I’m guessing is how you got here. Specifics:
[url]https://developer.nvidia.com/embedded/faq[/url]

Snapdragon 801 is probably similar, but without the GPU capabilities.

Is it possible, using the TK1, to stream the processed frames live using C++? If it is possible, how can I send them from my Snapdragon? If you know, please tell me.

I’m definitely not a camera guru, but the TK1 excels at video processing. There is definitely a list of things you’d need to know first; e.g., I think not all video formats are supported, but I couldn’t give you a list of what is needed.

Note that if there is video output from your snapdragon (such as over gigabit ethernet or as a USB device) then odds are good you can do this. It kind of depends on your needs. If you are using a custom sensor this may complicate things. A lot more would need to be known about what methods the snapdragon has for sending the video data (what physical interface and what video standards).

Hi linuxdev :

I ran into some trouble, can you help me?

When I use a TX1 + USB camera, the device ID of the USB camera changes often, sometimes video1, sometimes video0!

But I use the OpenCV command "cap = cv2.VideoCapture(1)", so this command often does not work.

What should I do to lock the device ID of the USB camera?

Thanks in advance!

USB is a “hot plug” system, and whenever something plugs in it is enumerated. The process of enumeration does not guarantee the order at which devices will be presented to drivers. When drivers see a device they will simply number them in a more or less undefined way. If it happens that the device driver is still registered and not yet deconstructed when you unplug a device and plug it back in, then the numbering could in fact be incremented by its own presence even if there is no other device.

Rules to provide repeatable naming depend on udev. udev can rename things, or provide second names. As an example, I have many serial UARTs for debugging. Currently I see “/dev/ttyUSB0” through “/dev/ttyUSB3”, and no way to know which cable goes to which device. One day ttyUSB0 might go to a particular board, and another day it might go to another. However, udev also produces “/dev/serial/by-id/” and “/dev/serial/by-path/”. The contents of those directories are symbolic links to the actual ttyUSB#. The “by-id” directory uses the ID of the device as queried through lsusb, and is repeatable (e.g., I use “usb-FTDI_FT232R_USB_UART_A10171DT-if00-port0”, which points at ttyUSB0, and I always get that cable regardless of enumeration). I use these.

Your particular device would need to have a udev rule to do this if you want to identify it that way. It might already have one, but it depends on there being either a commonly used rule or one specifically for that device before it will show up there. There is no way to predict the name unless you see it in dmesg or browse the systemd udev files. So the particular camera may already have something set up, or you may need to add code to test for which “/dev/video#” devices exist, and then use the first device.
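
As a rough illustration only (a sketch that assumes the camera is the only V4L2 device present and that OpenCV is what ends up consuming it), “use the first device” in Python could be as simple as:

import glob
import cv2

# Enumeration order is not guaranteed, so just take the lowest-numbered node present.
nodes = sorted(glob.glob("/dev/video*"))
if not nodes:
    raise RuntimeError("no /dev/video# device found")

# cv2.VideoCapture() takes a numeric index; derive it from the node name.
index = int(nodes[0].replace("/dev/video", ""))
cap = cv2.VideoCapture(index)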

Does dmesg say anything as the device is plugged in which might give a particular name you can search for in “/dev” with find?

Hi linuxdev :

Thanks very much for your detailed response.

Following your way, I can get the device with this command:

 #ls  /dev/v4l/by-id
 ##usb-lihappe8_Corp._USB_2.0_Camera-video-index0

But when I use this ID, as follows:

 [i]cap = cv2.VideoCapture(usb-lihappe8_Corp._USB_2.0_Camera-video-index0)[/i]

I get an error: "SyntaxError: invalid syntax".


How do I solve it?

 B/R

Perhaps a command substitution can provide the real path behind the symbolic link. As an example, for serial devices (I have many) I can run this and see the real full path for the udev symbolic name:

readlink -f /dev/serial/by-id/*

In your case it might be something like this:

readlink -f /dev/v4l/by-id/usb-lihappe8_Corp._USB_2.0_Camera-video-index0

To get this command’s output substituted as a file name, there are two ways. As an example, to embed it directly in the “echo” command, either of these works:

echo "$(readlink -f /dev/v4l/by-id/usb-lihappe8_Corp._USB_2.0_Camera-video-index0)"
echo "`readlink -f /dev/v4l/by-id/usb-lihappe8_Corp._USB_2.0_Camera-video-index0`"

The first syntax can be nested in complicated ways; the second is simpler to type. You could embed this to determine the actual device path.

One way is to set an environment variable:

export MYUSBCAM="$(readlink -f /dev/v4l/by-id/usb-lihappe8_Corp._USB_2.0_Camera-video-index0)"
echo $MYUSBCAM
whatever_your_program_is $MYUSBCAM

Or just embed it directly:

whatever_your_program_is "$(readlink -f /dev/v4l/by-id/usb-lihappe8_Corp._USB_2.0_Camera-video-index0)"

FYI, I’ve had similar issues with serial console programs which demand that the device special file only be named “ttyS#” format…they don’t allow “ttyUSB#” format, nor the by-path or by-id entries. I tend to put these in config files and not bother trying to access it from their GUI, which also works. For your video program there may not be a config file you can directly put the name in…it’s a case-by-case basis.
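
For the Python/OpenCV side of this, here is a minimal sketch (untested on your exact setup; it assumes the by-id entry from your listing and derives the numeric index that cv2.VideoCapture() expects). The key point is that the name must be a quoted string, not a bare identifier, which is what caused the “SyntaxError”:

import os
import cv2

# The by-id name must be a quoted string in Python.
BY_ID = "/dev/v4l/by-id/usb-lihappe8_Corp._USB_2.0_Camera-video-index0"

# Resolve the symbolic link to the real /dev/video# node (same idea as readlink -f).
real_node = os.path.realpath(BY_ID)            # e.g. "/dev/video0" or "/dev/video1"
index = int(real_node.replace("/dev/video", ""))

cap = cv2.VideoCapture(index)
if not cap.isOpened():
    raise RuntimeError("could not open " + real_node)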

Hi all, I’ve successfully run the test app following the instructions.

What is the suggested way to stream OpenCV-processed frames (application written in C++) over RTSP?
Is there a simple way to use a cv::VideoWriter against gst-rtsp-server (preferred), or do I have to manually integrate (and manage) the library into my application?

Thanks in advance
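
One possible direction, sketched here in Python only for brevity (the equivalent cv::VideoWriter constructor exists in C++), and only if your OpenCV build includes GStreamer support: a VideoWriter can push processed frames into a GStreamer encoding pipeline. Note that the pipeline below sends RTP over UDP to a single fixed receiver rather than acting as a true RTSP server; serving RTSP to multiple clients would still mean feeding gst-rtsp-server through an appsrc element. The host, port, resolution, and encoder settings are placeholders.

import cv2

WIDTH, HEIGHT, FPS = 1280, 720, 30

cap = cv2.VideoCapture(0)

# x264enc is the software encoder; on a Jetson, omxh264enc could be substituted.
# udpsink pushes RTP packets to one receiver -- this is not a full RTSP server.
pipeline = (
    "appsrc ! videoconvert ! "
    "x264enc tune=zerolatency speed-preset=ultrafast bitrate=2000 ! "
    "rtph264pay config-interval=1 pt=96 ! "
    "udpsink host=127.0.0.1 port=5000"
)
out = cv2.VideoWriter(pipeline, cv2.CAP_GSTREAMER, 0, FPS, (WIDTH, HEIGHT), True)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.resize(frame, (WIDTH, HEIGHT))
    # ... per-frame OpenCV processing goes here ...
    out.write(frame)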