For my programming stream at twitch.tv/stapelberg, I wanted to add an additional camera to show test devices, electronics projects, etc. I couldn’t find my old webcam, and new ones are hard to come by currently, so I figured I would try to include a phone camera somehow.
The setup that I ended up with is:

iPhone camera → Instant Webcam → WiFi → websocat → gstreamer → v4l2loopback → OBS
Disclaimer: I was only interested in a video stream! I don’t think this setup would be helpful for video conferencing, due to the lack of audio/video synchronization.
iPhone Software: Instant Webcam app
I’m using the PhobosLab Instant Webcam (install from the Apple App Store) app on an old iPhone 8 that I bought used.
There are a few interesting related blog posts by app author Dominic Szablewski:
- HTML5 Live Video Streaming via WebSockets (2013-Sep)
- Decode it like it’s 1999 (2017-Feb)
After some git archeology, I figured out that jsmpeg was rewritten in commit 7bf420fd just after v0.2. You can browse the old version on GitHub.
Notably, the Instant Webcam app seems to still use the older v0.2 version, which starts WebSocket streams with a custom 8-byte header that we need to strip.
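The header stripping itself is easy to demonstrate with dd (the 8-byte string `HEADER00` below is made up for illustration; it stands in for the app’s real header):

```shell
# dd with bs=8 skip=1 drops exactly the first 8-byte block and passes
# the rest of the stream through unchanged:
printf 'HEADER00payload' | dd bs=8 skip=1 2>/dev/null
# prints: payload
```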
First, install the v4l2loopback kernel module, e.g. via the v4l2loopback-dkms package on Arch Linux or Debian. I used version 0.12.5-1 at the time of writing.
Then, install gstreamer and the required plugin packages: gst-plugins-good (for v4l2sink), gst-plugins-bad (for mpegvideoparse) and gst-libav (for avdec_mpeg2video). I used version 1.16.2 for all of these.
Lastly, install either websocat or wsta for accessing WebSockets. I successfully tested with websocat 1.5.0 and with wsta.
First, load the v4l2loopback kernel module:
% sudo modprobe v4l2loopback video_nr=10 card_label=v4l2-iphone
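If you want the loopback device to come up automatically at boot, the same parameters can go into config files (a sketch assuming a systemd-based distribution; the paths are the standard modules-load.d/modprobe.d locations):

```
# /etc/modules-load.d/v4l2loopback.conf — load the module at boot:
v4l2loopback

# /etc/modprobe.d/v4l2loopback.conf — same parameters as the
# modprobe invocation above:
options v4l2loopback video_nr=10 card_label=v4l2-iphone
```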
Then, we’re going to use gstreamer to decode the WebSocket MPEG1 stream (after stripping the custom 8-byte header) and send it into the /dev/video10 device provided by the v4l2loopback kernel module:
% websocat --binary ws://iPhone.lan/ws | \
    dd bs=8 skip=1 | \
    gst-launch-1.0 \
      fdsrc \
      ! queue \
      ! mpegvideoparse \
      ! avdec_mpeg2video \
      ! videoconvert \
      ! videorate \
      ! 'video/x-raw, format=YUY2, framerate=30/1' \
      ! v4l2sink device=/dev/video10 sync=false
Here are a couple of notes about individual parts of this pipeline:
- You must set websocat (or the alternative wsta) into binary mode, otherwise they will garble the output stream with newline characters, resulting in a seemingly kinda working stream that just displays garbage. Ask me how I know.
- The queue element uncouples decoding from reading from the network socket, which should help in case the network has intermittent troubles.
- Without framerate=30/1, you cannot cancel and restart the gstreamer pipeline: subsequent invocations will fail with streaming stopped, reason not-negotiated (-4).
- The format=YUY2 setting is required for ffmpeg-based decoders to play the stream. Without this setting, e.g. ffplay will fail with: [ffmpeg/demuxer] video4linux2,v4l2: Dequeued v4l2 buffer contains 462848 bytes, but 460800 were expected. Flags: 0x00000001.
- With sync=false, v4l2sink plays frames as quickly as possible without trying to do any synchronization.
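Because the pipeline cannot simply be restarted in place, I re-launch the whole thing when it exits. Here is a retry wrapper sketch (the MAX_RETRIES default and the 2 second delay are arbitrary choices of mine, not part of the original setup):

```shell
#!/bin/sh
# Re-runs the whole capture pipeline whenever it exits, e.g. after a
# dropped WiFi connection. MAX_RETRIES=0 would mean "retry forever".
MAX_RETRIES=${MAX_RETRIES:-3}
tries=0
while :; do
  websocat --binary ws://iPhone.lan/ws |
    dd bs=8 skip=1 2>/dev/null |
    gst-launch-1.0 fdsrc ! queue ! mpegvideoparse ! avdec_mpeg2video \
      ! videoconvert ! videorate \
      ! 'video/x-raw, format=YUY2, framerate=30/1' \
      ! v4l2sink device=/dev/video10 sync=false
  tries=$((tries + 1))
  if [ "$MAX_RETRIES" -gt 0 ] && [ "$tries" -ge "$MAX_RETRIES" ]; then
    break
  fi
  echo "pipeline exited, retrying in 2s" >&2
  sleep 2
done
```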
Now, consumers such as OBS (Open Broadcaster Software), ffplay or mpv can capture from the /dev/video10 device:

% ffplay /dev/video10
% mpv av://v4l2:/dev/video10 --profile=low-latency
Hopefully the instructions above just work for you, but in case things go wrong, maybe the following notes are helpful.
To debug issues, I used the GST_DEBUG_DUMP_DOT_DIR environment variable as described on gstreamer’s Debugging tools: Getting pipeline graphs page. In these graphs, you can quickly see which pipeline elements negotiate which caps.
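Concretely, the workflow looks something like this (gstreamer only writes the .dot files; rendering them with graphviz’s dot is my assumption about your toolchain):

```shell
# gst-launch-1.0 writes one .dot file per pipeline state change when
# GST_DEBUG_DUMP_DOT_DIR points at an existing directory:
export GST_DEBUG_DUMP_DOT_DIR=$(mktemp -d)
echo "pipeline graphs will be written to $GST_DEBUG_DUMP_DOT_DIR"
# …run the gst-launch-1.0 pipeline from above in this shell, then
# render the graphs to PNG with graphviz, e.g.:
#   dot -Tpng "$GST_DEBUG_DUMP_DOT_DIR"/*.dot -O
```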
I also used the PL_MPEG example program to play the supplied MPEG test file. PL_MPEG is written by Dominic Szablewski as well, and you can read more about it in Dominic’s blog post MPEG1 Single file C library. I figured the codec and parameters might be similar between the different projects of the same author and used this to gain more confidence into the stream parameters.
I also used Wireshark to look at the stream traffic, which is how I discovered that websocat and wsta garble the stream output by default unless the --binary flag is used.