Since my WODC refuses to send anything to the microSD card within it, the only things on the card I have installed in its base are the 12-second event recordings.
It would be great if Wyze could integrate RTSP or FTP and allow up to 5 minutes of continuous recording as long as motion is present, then dump it to an FTP server. I need to capture longer video.
Is it true that if you switch to this feature you are going to lose all Wyze app features, including their internet access?
I also read that if we do the firmware update they are not going to support it except for security fixes.
Do yourself a favor and just use a V3 (if/when they deliver the promised RTSP support), or a V2, or some other brand that can provide the feature you need. No reason to wait in vain for something that makes no sense for an old product.
I've been trying for a few days to get this Pelco Spectra Enhanced video camera to feed RTSP to a re-encoding web service (Wowza.com). I'm able to pull up the feed fine through the VLC application, but when feeding it to this service, I don't get the video. I'm using a bi-directional static source address translation NAT policy and have also attempted the application-override instructions in the only RTSP article here, with no luck.
I've used the inbound application rtsp and also service ports 554/udp and 554/tcp, both of which allow the inbound connection to be made when Wowza initiates the pull of the RTSP feed, but I'm seeing something that appears odd when looking at the traffic going back out from the internal camera IP to the Wowza server. I see the NAT source as the internal IP, ingress and egress as the internal sub-interface, and no data received for rtp and rtp-base. I'm still learning about RTSP, but judging by the amount of data being sent, I'm assuming this is the video. I'm wondering if this is my issue and the camera is not NAT-aware, and if so, are there any workarounds? Thanks!
I've found that I get the same type of entries in the logs when I open VLC on my home system and pull the RTSP feed, which works and returns the RTCP and RTP-Base connections. The only obvious difference I see is that the "To Port" number of the RTP-Base traffic is in the 50,000+ range when using VLC, while when pulling from Wowza it's in the 6900 range. Not sure what to blame here: the camera, Wowza's system, or the PAN, which I really need to rule out.
Here I see the RTP-Base sessions to Wowza start up but then disappear after about 30 seconds. If I stream from my own system, I see rtp-base call back from the camera to my home public IP, and it stays active.
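If the camera supports it, a quick way to take the UDP/NAT question out of the picture is to request RTP interleaved over the RTSP TCP connection, so nothing but port 554 has to traverse the NAT. A minimal test with ffprobe (the camera address and path are placeholders):

    ffprobe -rtsp_transport tcp rtsp://<camera-ip>:554/<stream-path>

If that negotiates and reports the streams while the default UDP transport stalls, the camera most likely isn't NAT-aware and the high-port RTP return traffic is what's being dropped.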
I'm struggling to work out how to input the RTSP stream information into the NVR1008H interface. It suggests the resource path should merely be the stream number, but where does the full URL get entered? It works fine in VLC. The camera is a Samsung SNH-P6410.
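For what it's worth, on NVRs that split the URL into separate fields, the full URL that works in VLC usually just decomposes field by field. Assuming a working URL shaped like the placeholder below, the mapping would be:

    Works in VLC:       rtsp://192.168.1.50:554/profile2/media.smp
    NVR host/IP field:  192.168.1.50
    NVR port field:     554
    Stream/resource:    profile2/media.smp   (everything after the port)

The address, port, and path here are example values, not the camera's actual ones.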
Currently I'm using FFmpeg to receive and decode the stream, saving it to an mp4 file. This works perfectly, but is CPU intensive, and will severely limit the number of RTSP streams I can receive simultaneously on my server.
I have tried VLC, which is even more CPU intensive than FFmpeg. I've also looked at this question, where the answer says dumping RTSP to a file is not useful, and this question, where a comment below the question says "Raw RTSP content is not well suited for save and replay...", which seems to indicate that there is no straightforward way.
If you are re-encoding in your ffmpeg command line, that may be why it is CPU intensive. You only need to copy the streams into a single container. Since I don't have your command line I can't suggest a specific improvement; all I can say is that your acodec and vcodec should be set to copy.
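For example, a copy-only invocation along these lines (the URL is a placeholder) should bring CPU usage down to almost nothing, since no decoding or encoding takes place:

    ffmpeg -rtsp_transport tcp -i rtsp://<camera>/<path> -c:v copy -c:a copy out.mp4

(-c:v and -c:a are the current spellings of -vcodec and -acodec; ffmpeg just rewrites the RTP payload into the MP4 container.)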
In the mplayer source file "stream/stream_rtsp.c" there is a prebuffer_size setting of 640k, with no option to change the size other than recompiling. The result is that writing the stream is always delayed, which can be annoying for things like cameras, but beyond that you get an output file and can play it back most places without a problem.
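With that caveat in mind, the dump itself is a one-liner (the URL is a placeholder); the output file only starts growing once the 640k prebuffer has filled:

    mplayer -dumpstream -dumpfile out.dump rtsp://<camera>/<path>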
I have tried various media players, but couldn't stream the file. I know for sure this is not a server problem, because I could stream a test video on that server that doesn't require authentication.
I was surprised that VLC couldn't handle such URLs, so I tried mplayer, but it couldn't play the streams either.
mpv was the first player in which I managed to play the test video I mentioned above, but I couldn't stream URLs of the form I wrote above. With mpv I tried this command line:
Pass the rtsp:// URL to mplayer on its command line. There are servers out there that serve files containing a rtsp:// URL over HTTP, but then serve content in the MMS protocol. This is for compatibility with some older Microsoft players (my memory is hazy over the details), but it breaks clients that believe that RTSP is RTSP and MMS is MMS. If you obtain an rtsp:// URL that doesn't work at all, try replacing the scheme with mms://.
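So if you end up with an authenticated URL that mplayer refuses outright, it may be worth trying both schemes and both ways of passing credentials (the user, password, host, and path below are placeholders; some servers accept credentials embedded in the URL, others need mplayer's -user/-passwd options):

    mplayer rtsp://user:pass@server.example.com/stream
    mplayer -user user -passwd pass mms://server.example.com/stream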
I am having an issue getting smooth video playback when streaming from my IP cameras on my Jetson Nano. It can be seen with the deepstream-app sample application, even with only one RTSP source and with inferencing and tracking turned off. The displayed video seems to skip frames, which is very noticeable when there are moving objects (people or cars).
Running either of the following commands (without nvstreammux) produces buttery-smooth playback. Note that I have specified an unusually long latency here; generally 500 ms works well for me. The default in the deepstream-app is 100 ms, which produces the jumpiness, but the 5-second latency in the examples demonstrates this best. On running the command, the video playback shows one frame, pauses for 5 seconds, and then streams smoothly.
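As a sketch of the kind of nvstreammux-free pipeline in question, assuming an H.264 camera and stock GStreamer elements rather than the Jetson-specific decoder and sink (the URL is a placeholder):

    gst-launch-1.0 rtspsrc location=rtsp://<camera>/<path> latency=5000 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink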
My deepstream-app config file is based on the reference file you mentioned. I have set up my 8 RTSP sources and configured sync=0 on [sink0] and live-source=1 on [streammux] as per the DeepStream FAQ, with no difference. As mentioned, this is not an inferencing bottleneck, as the issue persists even without inferencing and tracking (please see my post again).
Adding sync=0 does not help. Adding sync=0 to any of the above samples causes the streams to start playing immediately with no buffering, and the jumpiness then occurs in all samples, even those without nvstreammux.
Yes, this does occur with the above pipeline with latency=5000. It shows the initial frame, waits 5000 ms (5 s), and then starts playing. If one looks at an RTSP source with an embedded timestamp (e.g. an IP camera), one sees it fast-forward the stream until it catches up with the latest frame, with the jumpy/laggy display issue happening.
If one runs the above pipeline without latency=5000, it falls back to rtspsrc's default of 2000 ms. In this case it behaves the same as above, but waits only 2 s and catches up faster, with the jumpy/laggy display issue still occurring.
The problem, though, is that this option does not work with the general deepstream pipeline (source(n) -> nvstreammux -> nvinfer -> nvtracker -> nvtiler -> nvosd -> sink), where one uses nvtiler for tiled display.
Hi,
We have not tested this kind of scenario; it is more like a feature enhancement. No solution is available for the 5.0 GA release, as it needs changes in the nvstreammux component. We would like to understand why you have this specific requirement or use case, so that we can evaluate supporting it in a future release.
Therefore I need the visual display to be smooth, without any laggy/stuttering issues, as the visual impression a client gets from looking at the display is that the software is not processing in real time / is laggy. This is a real show-stopper!
In my test case the network latency is very low, and smooth streaming is possible, as the nvcompositor sample above confirms. This is not the case with nvstreammux in the pipeline, resulting in poor user perception.
Hi,
There may be a quick solution for smooth playback. Could you check whether the fps of all sources is the same? If so, we can try setting live-source=false. With this, the timestamps of outgoing nvstreammux gst buffers will increase based on the framerate of the source. If fps info is not available from caps, timestamps will increase as per a 30 fps framerate.
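In the deepstream-app config that is just the [streammux] group, e.g. (other keys omitted):

    [streammux]
    live-source=0

It is worth first confirming that every camera really does deliver the same fps, since buffer timestamps will then be synthesized from that rate.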
I am trying to link some of my Meraki cameras to my access control system. I am having to use RTSP to do this, but when I add the camera IP into my access control system and then attempt to view the camera, a popup window comes up telling me that I need to install the VLC Media Player extension (which no longer exists). I contacted the access control software people and they told me that their system will play any camera that connects via IP and uses ONVIF.
Meraki MV cameras do not support ONVIF. Beyond RTSP for making the video stream available, the MV system is designed to be proprietary, to ensure its high degree of security, reliability, and ease of use. Do you have Meraki licensing for your MV cameras? If so, you'd conduct all of your management of the system through the Meraki Dashboard or the related Vision Portal.
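If your access control system can take a plain RTSP URL rather than requiring ONVIF, the MV's external RTSP stream (enabled per camera in the Dashboard) is reportedly of the form below; treat the port and path as assumptions to verify against your own Dashboard:

    rtsp://<camera-ip>:9000/live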
"Use RTP over RTSP (TCP)" was already unchecked. Admittedly, I didn't capture the very first stream between my machine and the camera (so maybe there's some dynamic settings caching going on), but I see no UDP traffic between the two when starting streams now, not even an initial 'give UDP a go'. I may retest later, when I can fully restart my PC.
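A simple way to confirm the absence of UDP is to leave a capture running while a stream starts (the interface and address are placeholders):

    tcpdump -i eth0 host 192.168.1.60 and udp

If nothing appears there while the TCP/554 session carries the media, the client has settled on interleaved transport regardless of the checkbox.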
DHCP and DNS come in on UDP ports. I know this has never even been considered, but is it possible to send RTSP over port 53 UDP? I've got 3 cameras I'm getting ready to set up, and I can get up-to-date software for the PoE switch.
So the only question becomes: can the stream be sent over port 53? Ports 136 and 500 would each be required as well, along with ports 67 and 68. It would be an interesting test to see if it could be done. I do know for a fact that port 53 UDP can be used to set up a VPN for hacking.
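For what it's worth, one way to actually test it, assuming a Linux box with nothing else bound to port 53 and root privileges (ports below 1024 are privileged): have VLC serve a file over RTSP on port 53 and connect from a client. Note this only moves the RTSP control channel to 53, and that channel is TCP; the RTP media still negotiates its own ports:

    cvlc sample.mp4 --sout '#rtp{sdp=rtsp://0.0.0.0:53/test}'
    vlc rtsp://<server-ip>:53/test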