pyCA vs. Galicaster (vs. future opencast native agent)


mostolog

Apr 27, 2017, 6:42:49 PM
to Opencast Users
Hi


We are starting to use Galicaster as the capture agent for our environment. As pyCA seems to be getting some traction, we were wondering if there is a feature comparison (advantages/disadvantages) with Galicaster or any other widely used capture agent.

AFAIK, Galicaster requires X11, which IMHO is a drawback, while pyCA can be launched from a terminal or script, or even managed with the recent signal support.
It would be great if pyCA had a 'manually start/pause/stop recording' web GUI (it seems it has one), but we haven't played with it enough to know how powerful it is.

Does anyone dare to give us some feedback?

Another question (perhaps for another thread) would be: what inventory system do you use to manage IP cameras, agents and all the other Opencast satellites?

Regards

Lars Kiesow

Apr 27, 2017, 8:06:49 PM
to us...@opencast.org
Hi mostolog,
as the main pyCA developer, let me try to answer some of your questions.
For me, the three main problems with Galicaster which led to the pyCA
development were:

- The CC-BY-NC license makes Galicaster completely unreliable and
  unusable for most tasks. Besides the license not being designed for
  software, which may cause issues of its own, the term non-commercial
  is always hard to judge. Where exactly does commercial use start? It's
  way too easy to find an interpretation by which you have violated the
  license agreement.

- It is a GUI application with tons of dependencies and a need for a
  lot of resources. Though small devices like the Raspberry Pi, which I
  wanted to use, have become significantly more powerful, I'm still not
  sure how comfortably Galicaster would run there. More importantly, if
  you want it to run on a server, without any UI at all, you cannot.
  I'm not sure what the current situation is regarding properly
  starting it as a service, but that was an issue as well.

- Finally, an issue was that Galicaster was using deprecated
  libraries, namely GStreamer 0.10, which are not included in recent
  systems anymore. But AFAIK, this has recently changed with version 2.


Now to the comparison with pyCA. First, you should keep in mind that
the main focus of the two capture agents is different. Galicaster's
main focus is on the GUI, which a user can use to control the agent.
You have things like start and stop buttons, a live view, the ability
to modify metadata, …

pyCA is more meant to run automatically: use it as a service, let it
get its schedule from Opencast, manage it from somewhere else, maybe
have it run on smaller devices or servers. Its GUI is less important.

As for other capture agents, I guess TheRec is even more focused on
the GUI and manual recordings, and the deprecated reference CA's focus
was somewhere between pyCA and Galicaster, with a tendency toward pyCA.


As for the current state of the pyCA UI: it will give you status
information about the recorder, its event buffer and its recordings. It
does not allow you to interact with events or manually start recordings.

Currently in development is a RESTful API which will make the web
interface more dynamic and allow users to interact with recording
dates. Still, the first additions will be to make re-ingest easier, …
Manual control is something for later. For more details, have a look at:
https://github.com/opencast/pyCA/pull/73

I hope this helps. Feel free to ask more questions if something is not
clear.

Best regards,
Lars

PS: Please ask different questions in different topics.



mostolog

May 2, 2017, 6:13:18 AM
to Opencast Users
Thanks for your reply. It's time to deal with the boss to adopt pyCA ;)

Jan Koppe

May 2, 2017, 7:33:00 AM
to us...@opencast.org
Hello mostolog,

we are currently using pyCA for all our capture agents at the University
of Münster; that is 12 active agents right now, with more being
installed as we speak. So we're probably one of the bigger pyCA users.
Also, out of necessity I'm involved in pyCA development, so I might be
a bit biased here. :-)

In the beginning we tried to use Galicaster Pro (1.4 back then), which
we had trouble operating reliably and performantly even after several
weeks. For us, the most prominent cons of Galicaster were the dependency
on Ubuntu (12.04 LTS at that time), the GUI and just the general
perceived bloat, which makes it much more prone to errors.

In comparison, I had pyCA up and running within a few days. The main
work there is to build your own recording command or script which does
the actual capturing. That's one of the bigger differences to Galicaster
from a plain user's PoV. This is quite easy, though, with ffmpeg for
example. We're using Magewell capture cards, which make this stage a
breeze. If you have any questions or need help in that regard, feel free
to ask.
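To give an idea of what such a recording command can look like, here is a sketch. The device paths and encoder settings are assumptions about your hardware, not our exact setup; only the `{{time}}`, `{{dir}}` and `{{name}}` placeholders are pyCA's own, which it substitutes when a scheduled recording starts:

```shell
# Hypothetical recording command for pyca.conf. /dev/video0 (the
# capture card) and the ALSA "default" device are stand-ins for your
# hardware. pyCA fills in {{time}} (recording duration), {{dir}} and
# {{name}} (output location) when the recording starts.
ffmpeg -f v4l2 -i /dev/video0 \
       -f alsa -i default \
       -t {{time}} \
       -c:v libx264 -preset veryfast -crf 23 \
       -c:a aac \
       '{{dir}}/{{name}}.mkv'
```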

Our deployment uses Arch Linux as a really minimal base, with only pyCA,
a Zabbix agent and a small script for livestreaming. This is really easy
to manage (we're using Ansible) and to keep in check security-wise
because of the small number of dependencies and services.

Any questions are welcome, hope that I could help.

Regards,
Jan


--
Jan Koppe
eLectures / LearnWeb
Westfälische Wilhelms-Universität
Georgskommende 25 - Room 310
48143 Münster/Westf. - Germany
Tel. + 49 (0) 251 - 83 29295
E-mail: jan....@wwu.de


Max Lira

May 2, 2017, 8:52:33 AM
to Opencast Users
Hi Jan:

That is impressive! I want to try pyCA, and your deployment seems to be very good for scheduled recordings.

I have a couple of questions about it:

1. How do you record unscheduled lectures?
2. Is there a way to check how the pyCA recording is working (e.g. whether the presentation stream is being recorded, or whether the camera or mic is adjusted correctly)?
3. Can you show us the scripts that you use to make that happen?
4. Do you know if the Blackmagic cards work with pyCA?

Ruth Lang

May 2, 2017, 10:05:23 AM
to Opencast Users
Hi,

I have the same questions as Max and an additional one:
how would you control a camera remotely?

I am sure that one can write as many scripts as one likes to add features to pyCA.
These are the reasons why we are still using Galicaster:

1.) You can easily start/stop a recording manually.
2.) Because Galicaster has a GUI, you can do that remotely or locally in the lecture rooms (which is often required in Germany by the staff council).
3.) One can easily see if there is a problem in the lecture room (no audio, camera or beamer signal ...).
4.) One can easily control a Sony PTZ camera or an Axis camera. We use this feature in the current semester.
5.) Recording profiles can also be changed / adapted to your needs.
6.) We run Galicaster 2.0 with GStreamer version > 1.0.

If one does not like to work with a terminal-based system only, one has to deal with a desktop system and with
"tons of dependencies and the need for a lot of resources", but on the other side you also have to work with Opencast,
not exactly a "lightweight" system that, in my opinion, lugs a lot of old code around.

Regards
Ruth

Jan Koppe

May 2, 2017, 10:20:51 AM
to us...@opencast.org
Hello Max,

On 02.05.2017 14:52, Max Lira wrote:
> Hi Jan:
>
> That is impressive! I want to try pyCA, and your deployment seems to
> be very good for scheduled recordings.
>
> I have a couple of questions about it:
>
> 1.- How do you record unscheduled lectures?
We don't. In our process we always schedule recordings via Opencast.
This is required anyway because lecturers have to sign a consent form
prior to being recorded. We're working quite flexible hours, so it's not
a problem, if need be we can schedule within 10 minutes or so. That
said, the upcoming ad-hoc functionality in pyCA will be much appreciated
for testing and possible future integration with the control panels in
our lecture rooms.
> 2. Is there a way to check how the pyCA recording is working (e.g.
> whether the presentation stream is being recorded, or whether the
> camera or mic is adjusted correctly)?
Our capture agents are capable of delivering a Picture-in-Picture
livestream mix, so we can do a quick check that everything is fine. Our
cameras are all in a fixed location (Axis P1428-E), so that's not really
necessary. Other than that, we're constantly monitoring all hardware
sensors (voltages, temperatures, fan speeds) as well as general
processes, CPU & RAM usage, disk space and ffmpeg processes in
particular, so we have a lot of insight into what the machines are doing.
> 3. Can you show us the scripts that you use to make that happen?
Of course! I was thinking about documenting our capture agents as a
whole and making that openly available as a document, including hardware
choice and general system setup. I will try to tackle this in the next
few days.
> 4. Do you know if the Blackmagic cards work with pyCA?
Yes. And I absolutely advise against them (sorry Ruth, should have
listened in Cologne!). We started out with the Blackmagic Design Mini
Recorder cards. They do work. But their (driver) support is much more
hassle than it is worth. You will have to compile the kernel module,
package the driver headers, then compile ffmpeg with the
`--enable-decklink` flag, then sed the configuration file so that the
HDMI input is used, use an external scaler to make sure only one exact
size and refresh rate is used, and then find out what the exact device
string to use with ffmpeg is. And then, if you did everything right,
you will get a picture. Not sure if they're compatible with Linux 4.5+
yet, but a few months ago they were not. Quite annoying if you can't
even use linux-lts.

You can do that if you really want to, but in the end the Magewell cards
were just so much easier and more robust to handle: run the installer
provided by Magewell (works just fine with Arch; just watch out when
doing kernel upgrades: reboot first, then run mwcap-repair.sh) and then
do a simple v4l2 capture on /dev/video0. Done.
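For illustration, the "simple v4l2 capture" can be as small as this sketch (the device path, resolution and encoder settings are assumptions, not our exact configuration):

```shell
# Hypothetical minimal capture from a Magewell card exposed as a V4L2
# device. Adjust the device path, input format and encoder as needed.
ffmpeg -f v4l2 -framerate 25 -video_size 1920x1080 -i /dev/video0 \
       -c:v libx264 -preset veryfast -crf 23 \
       recording.mkv
```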

The price difference was about 150€ IIRC (150€ for Blackmagic versus
300€ for Magewell), and the Magewell cards also include a sound card,
which we previously had to install separately as well (server boards, so
no onboard sound). This came in very handy with our newer 1U version.
Well worth the money.

Regards,
Jan

Jan Koppe

May 2, 2017, 10:30:54 AM
to us...@opencast.org
Hello Ruth,

On 02.05.2017 16:05, 'Ruth Lang' via Opencast Users wrote:
> Hi,
>
> I have the same questions as Max and an additional one:
> how would you control a camera remotely?
We're using Axis P1428-E network cameras exclusively. Right now we
capture the raw 4K stream as well as a digital-tracking 720p stream. As
soon as track4k is usable in Opencast, we will get to work on
re-tracking existing recordings and use that from then on. The
digital-tracking stream is also used for the PiP livestream mix.
>
> If one does not like to work with a terminal-based system only, one
> has to deal with a desktop system and with "tons of dependencies and
> the need for a lot of resources", but on the other side you also have
> to work with Opencast, not exactly a "lightweight" system that, in my
> opinion, lugs a lot of old code around.
Agreed on the Opencast point. But if I have the easy choice between pyCA
and Galicaster (which does not provide huge benefits over pyCA for
_us_), that is certainly something to consider. The need for resources
was quite substantial: our pyCA solution, for example, does not need to
re-encode the camera streams and therefore uses around 12-15% CPU in
regular use (1080p beamer signal encoding + AAC audio encoding). With
Galicaster we were seeing around 70-80% CPU usage because it was
apparently not possible to avoid useless re-encoding. That's a
dealbreaker, IMHO.

Regards,
Jan

Ruth Lang

May 2, 2017, 10:53:38 AM
to Opencast Users
Hi Jan,

thanks for your comments.

If I understand you correctly, you do not need to steer your cameras because they are fixed?
At our university this is an absolute "NO-GO".
Funny enough, because both universities are located in North Rhine-Westphalia ...

I am not quite sure what you mean by "re-encoding" on the Galicaster side.
Our CPU load is also around 12-15% for recording.

But I agree with you that everyone should use the capture solution which fits best for the institution.

Regards
Ruth

Jan Koppe

May 2, 2017, 11:01:25 AM
to us...@opencast.org
Hi Ruth,

On 02.05.2017 16:53, 'Ruth Lang' via Opencast Users wrote:
> If I understand you correctly, you do not need to steer your cameras
> because they are fixed?
> At our university this is an absolute "NO-GO".
Correct. I'm stumped that a fixed camera, which provides a fixed region
that could be recorded, is a no-go, but a camera that is freely
positionable over the network, where one cannot really be sure where it
points, is allowed. But I gave up on trying to understand German
bureaucracy and laws a long time ago, so...
> I am not quite sure what you mean with "reencoding" on Galicaster side.
> Our CPU load is also around 12-15% for recording.
The cameras provide an h.264-encoded stream via RTSP. Galicaster would
always decode that stream and then re-encode it right away, no matter
what we did or configured. With pyCA/ffmpeg, a simple -c:v copy
eliminates the CPU usage for that stream (barely 1% to write the stream
into an MP4 container). Not sure why Galicaster did that, but especially
with a 4K stream it was unusable.
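The effect of -c:v copy is easy to demonstrate without any camera. This small self-contained sketch (assuming ffmpeg is installed) encodes a short test clip once, then remuxes it with a stream copy; only the first step costs any real CPU:

```shell
# Encode a one-second H.264 test clip; this is the expensive step.
ffmpeg -loglevel error -y -f lavfi -i testsrc=duration=1:size=320x240:rate=25 \
       -c:v libx264 -pix_fmt yuv420p src.mp4
# Remux with -c:v copy: the bitstream is copied into the new container
# without decoding or re-encoding, so CPU usage stays near zero.
ffmpeg -loglevel error -y -i src.mp4 -c:v copy copy.mp4
```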

Stephen Marquard

May 2, 2017, 11:03:48 AM
to us...@opencast.org
Galicaster allows recording camera streams as-is. We're recording some 4K streams with Galicaster (2.x, but the change is in the 1.x versions as well).

Regards
Stephen

---
Stephen Marquard, Learning Technologies Co-ordinator,
Centre for Innovation in Learning and Teaching (CILT)
University of Cape Town
http://www.cilt.uct.ac.za
stephen....@uct.ac.za
Phone: +27-21-650-5037 Cell: +27-83-500-5290

Jan Koppe

May 2, 2017, 11:06:22 AM
to us...@opencast.org
On 02.05.2017 17:03, Stephen Marquard wrote:
> Galicaster allows recording camera streams as-is. We're recording some 4K streams with Galicaster (2.x but the change is in 1.x version as well).
That is very good to hear! Do you mind sharing configuration details?
Was this added after 1.4.1? Is this also possible with Galicaster Pro?
This might be of interest to other users.

-jan

Stephen Marquard

May 2, 2017, 11:10:56 AM
to us...@opencast.org
This is a 2.x profile, but I think the rtp bin has been around for a while.

[track2]
name = presenter
device = rtp
flavor = presenter
location = rtspt://camera.hostname/axis-media/media.amp
file = presenter.mkv
cameratype = h264
audio = False
muxer = matroskamux
active = True

You can't use rtspt with flvmux, but you can use rtsp / mkv if you want (or probably some other combinations).

We're not using GCPro at the moment. Initially it didn't support IP cams, but I think it might now.

Regards
Stephen



Carlos Turro Ribalta

May 4, 2017, 9:13:31 AM
to us...@opencast.org

Hi Jan, one question: do you also carry the audio from the Axis camera? I've taken a look at the P1428-E on the Axis website and I'm not sure how to do it.

Thanks!

Carlos


Jan Koppe

May 6, 2017, 3:56:27 PM
to us...@opencast.org
Hello Carlos,

Sorry for the late reply, I didn't see your mail.

On 04.05.2017 15:13, Carlos Turro Ribalta wrote:
>
> Hi Jan, one question. ¿Do you carry the audio also from the Axis
> camera?. I’ve taken a look to the P1428-E in the axis website and I’m
> not sure how to do it
>
No, we capture the audio signal via the analog inputs on our Magewell
capture cards. This introduces some problems regarding audio/video
synchronization, because the RTSP stream from the Axis cameras has an
inherent delay, which can be unpredictable. I've got a hunch about how
to trick ffmpeg into synchronizing audio & video more reliably, but I'm
still unsure whether it will actually work. I intend to post to the
users mailing list once I have a proper solution and a thorough writeup
for this.

One thing to note is that Axis has several "smaller" cameras in their
portfolio that have a stereo analog input for audio or a microphone.
Those have sensors with lower resolutions, though, which was not an
option for us because we do digital tracking on a 4K sensor signal.

Carlos Turro Ribalta

May 7, 2017, 3:41:58 AM
to us...@opencast.org
Thanks Jan.

It's a pity, because the camera looked very good. If you find a good way to sync, that would be great.

Carlos


===
Carlos Turro

Head of Media Services
Universitat Politecnica de Valencia




Max Lira

May 8, 2017, 9:26:54 AM
to Opencast Users
About the Axis and other IP cameras topic:

So there is no way to sync the audio correctly unless you can carry the audio in the RTP stream? What IP cameras with audio in do you recommend?

Rüdiger Rolf

May 8, 2017, 9:43:01 AM
to us...@opencast.org
Hi Max,

For Full HD we have an Axis Q1755, which has at least an audio input and offers an H.264 stream that can be recorded.

For 4K we use a Panasonic WV-SPV781L, which also has an audio input. A demo video can be found here:
http://video4.virtuos.uos.de/engage/theodul/ui/core.html?id=4b20e4f5-ef24-4a40-b874-23c348c1e3a5
(select 4K in the Quality dropdown, and you can then zoom in)

ETH Zürich currently has a demo unit of the Sony SNC-VB770, which is an extremely expensive IP camera because of its full-frame sensor, but the video quality is very good:
http://video4.virtuos.uos.de/engage/theodul/ui/core.html?id=de3c9160-adc9-474c-a3c0-73b1477a10c7
(select 4K in the Quality dropdown, and you can then zoom in)

Regards
Rüdiger




-- 
________________________________________________
Rüdiger Rolf, M.A.
Universität Osnabrück - Zentrum virtUOS
Heger-Tor-Wall 12, 49069 Osnabrück
Telefon: (0541) 969-6511 - Fax: (0541) 969-16511
E-Mail: rr...@uni-osnabrueck.de
Internet: www.virtuos.uni-osnabrueck.de

miguel gusils

Jun 3, 2017, 11:33:54 AM
to us...@opencast.org
Hi Jan,

Could you elaborate on your livestreaming setup?
Is it in use in a production environment?

Thanks!!
-mg

> Our deployment uses Arch Linux as a really minimal base and only pyCA, a
> Zabbix Agent and a small script for livestreaming. This is really easy
> to manage (we're using Ansible) and keep in check security wise because
> of the small amount of dependencies and services.


---
miguel gusils
miguel...@harvard.edu

Jan Koppe

Jun 3, 2017, 1:26:12 PM
to us...@opencast.org
Hello Miguel,

On 03.06.2017 17:33, miguel gusils wrote:
> Hi Jan,
>
> Could you elaborate on your livestreaming setup?
> Is it in use in a production environment?
>

Contrary to the huge demand signaled by university personnel, we have
only had one occasion where a livestream was requested, but that worked
completely fine for about 2 hours of continuous streaming to ~10
guests, which were other universities in Germany playing the stream on
beamers in big lecture halls. The majority used the MPEG-DASH stream.

The essential part of this setup is a single ffmpeg process. This
process captures all inputs (RTSP stream from the camera, audio input,
local HDMI input from the Magewell card) and does a Picture-in-Picture
mix on the fly, as well as some basic audio filtering & heavy
compression. That's all done via filter_complex options. On the agent,
the stream gets a very light h.264 720p compression, and the output is
pushed via RTMP to the university's central Wowza server. There, the
stream gets transcoded to several smaller resolutions and delivered via
HLS and MPEG-DASH, using a small Video.js web player with the dash.js
and hls plugins.

The ffmpeg process on the capture agent is managed via systemd, which
will try to keep the stream running no matter what, as long as the
systemd unit is enabled.

Now, the tricky (and still a bit messy) part is central control. The
first version, which I quickly cobbled together because we were in a
bit of a hurry, used a small Node script on the agents to listen for
commands via HTTP and in turn enable or disable the systemd unit, plus
a central Python Flask server to handle authentication, the web
interface and relaying commands. This was all a bit fragile, so I've
revised it in the past week: the new version uses a central etcd
instance on which a very simple bash script watches certain keys and in
turn enables/disables the units. On one of the servers, a Node server
runs a web interface that changes these specific keys in etcd, reports
back the status from the agents and generates the web players.
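As a purely hypothetical sketch of such a watcher (the key name, unit name and the use of the etcd v3 CLI are my assumptions, not the actual scripts), the agent side could look roughly like this:

```shell
# Hypothetical agent-side watcher: whenever the flag key changes,
# re-read it and toggle the livestream systemd unit to match.
KEY=/livestream/agent01/enabled
etcdctl watch "$KEY" | while read -r _event; do
  if [ "$(etcdctl get "$KEY" --print-value-only)" = "true" ]; then
    systemctl enable --now livestream.service
  else
    systemctl disable --now livestream.service
  fi
done
```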

I'm planning to open-source that software, but before that it will need
a bit more cleaning up.

The important part is, as I've mentioned, this ffmpeg command:

ffmpeg \
    -i "{{ camera_url }}" \
    -f v4l2 -i /dev/video0 \
    -thread_queue_size 2048 -f alsa -i dsnoop \
    -filter_complex "highpass=f=120,acompressor=threshold=0.3:makeup=4:release=20:attack=5:knee=4:ratio=10:detection=peak,alimiter=limit=0.8" \
    -filter_complex "[0:v]scale=w=500:h=-1[a];[1:v][a]overlay=(W-w-20):(H-h-20)[b];[b]scale=w=1280:h=720[x]" \
    -map "[x]:0" -map "2:0" \
    -c:a aac \
    -c:v libx264 -bf 0 -g 25 -crf 25 -preset veryfast \
    -f flv "rtmp://{{ wowza.auth.user }}:{{ wowza.auth.pass }}@{{ wowza.server }}/{{ wowza.application }}/{{ agent_name }}"

The dsnoop input on ALSA allows multiple processes to capture from an
ALSA input at the same time; this is necessary to avoid conflicts with
the recording. You will most likely have to modify the second
filter_complex chain. It will first scale the camera input to 500 px
width and then overlay it on top of the capture card input, which is
captured at 1080p, so the camera takes up roughly a quarter of the
overall width. The position is defined by (W-w-20)...; in this case,
the bottom right corner. Finally, the whole video is scaled down from
1080p to 720p.

I hope that this was of help to you. If you have any questions, feel
free to ask!

Regards

Jan


Miguel Gusils

Jun 6, 2017, 2:08:13 PM
to us...@opencast.org
Hi Jan, 
..interesting 
Thanks!!
-mg


---
Miguel Gusils





