How to keep your YouTube stream live!
In my last post, I showed you how easy it is to forward a livestream to YouTube from your local camera.
The problem
YouTube will actually stop your stream when there are no viewers. That's fine in principle, but when I stream, I'm not expecting a large audience.
The idea
The idea is to simulate a viewer, which means I have to download the video. After a little bit of googling around (with Bing) I found the youtube-dl toolkit, a small command-line tool that takes the URL of your YouTube video and downloads it.
You can just run the following command to start downloading a video:
youtube-dl YOUR_YOUTUBE_URL
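By the way, youtube-dl can also list which formats a stream offers; that is where the format id 93 (360p) used further down comes from. A quick way to check it yourself (just a side note, not part of the original workflow):

youtube-dl -F YOUR_YOUTUBE_URL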
But we have a livestream, so the download will not stop after a given amount of time. So I have to use a trick and pass the stream to ffmpeg; this way I control how long to "view" it.
So I created a minimal script for this, and it ends up looking like this:
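In essence, the trick is just these two steps (a minimal sketch; myurl stands for your own stream link):

# Resolve the raw stream address of the live video (format 93 = 360p)
URL=$(youtube-dl -g -f 93 myurl)
# "Watch" it for 5 seconds with ffmpeg and throw the output away
ffmpeg -t 5 -i "$URL" -f null -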
YOUTUBE_STREAM_URL=myurl
DURATION_IN_SECONDS=5
DURATION_IN_SECONDS_TO_WAIT=120

echo "YOUTUBE_STREAM_URL=$YOUTUBE_STREAM_URL"
echo "DURATION_IN_SECONDS=$DURATION_IN_SECONDS"
echo "DURATION_IN_SECONDS_TO_WAIT=$DURATION_IN_SECONDS_TO_WAIT"

while :
do
    echo "Downloading video for $DURATION_IN_SECONDS seconds"
    # Option -f 93 selects the 360p stream
    URL=$(youtube-dl -g -f 93 $YOUTUBE_STREAM_URL)
    echo "Stream internal address(es): $URL"
    # Pass the stream to ffmpeg and "watch" it for the given duration
    echo "$URL" | xargs -n 1 -I {} ffmpeg -t $DURATION_IN_SECONDS -i "{}" -f null -
    echo "Waiting for $DURATION_IN_SECONDS_TO_WAIT seconds to restart the stream"
    sleep $DURATION_IN_SECONDS_TO_WAIT
done
This script takes myurl and "views" it for 5 seconds. After that, it waits 2 minutes and does it all again, simulating a recurring viewer.
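If you want to try it directly on your machine before putting it into a container, save it for example as keepalive.sh (the filename is my own choice), make it executable, and start it:

chmod +x keepalive.sh
./keepalive.sh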
Dockerize it
Yep, I want to run this headless on my home server. I created a script file for the entry point and pass parameters into it, so I can control the behaviour of the script. My final script looks like this:
#!/bin/sh

if [ "$#" -ne 3 ]; then
    >&2 echo "Required arguments: YOUTUBE_STREAM_URL DURATION_IN_SECONDS DURATION_IN_SECONDS_TO_WAIT"
    exit 1
fi

YOUTUBE_STREAM_URL=$1
DURATION_IN_SECONDS=$2
DURATION_IN_SECONDS_TO_WAIT=$3

echo "YOUTUBE_STREAM_URL=$YOUTUBE_STREAM_URL"
echo "DURATION_IN_SECONDS=$DURATION_IN_SECONDS"
echo "DURATION_IN_SECONDS_TO_WAIT=$DURATION_IN_SECONDS_TO_WAIT"

while :
do
    echo "Downloading video for $DURATION_IN_SECONDS seconds"
    # -f 93 is for 360p stream
    URL=$(youtube-dl -g -f 93 $YOUTUBE_STREAM_URL)
    echo "Stream internal address(es): $URL"
    echo "$URL" | xargs -n 1 -I {} ffmpeg -t $DURATION_IN_SECONDS -i "{}" -f null -
    echo "Waiting for $DURATION_IN_SECONDS_TO_WAIT seconds"
    sleep $DURATION_IN_SECONDS_TO_WAIT
done
I saved it as entrypoint.sh and put it beside a Dockerfile with the following content:
# Usage
# docker run youtube-stream-download https://youtube.com/live/h4j-dpvDWXY
FROM jrottenberg/ffmpeg

#MAINTAINER Sascha Bajonczak <xbeejayx@hotmail.com>

RUN apt-get update && apt-get install -y wget python && \
    ## Get the youtube downloader toolkit (ignore invalid certificate)
    wget https://yt-dl.org/downloads/latest/youtube-dl --no-check-certificate -O /usr/local/bin/youtube-dl && \
    chmod a+rx /usr/local/bin/youtube-dl && \
    rm -rf /var/lib/apt/lists/* /var/cache/apt

# Adding the entry point script
ADD entrypoint.sh /entrypoint.sh

# Pass the docker run parameters on to the entrypoint script
ENTRYPOINT ["/entrypoint.sh"]
This Dockerfile takes the ffmpeg base image, downloads youtube-dl, and installs it in the image itself. After that, it adds the script to the root filesystem and sets it as the entrypoint that runs when the container starts.
Now we can build the image in that directory with the following command:
docker build -t yourimagename .
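If you want to publish your own build to Docker Hub as well (I did this later), a sketch with placeholder names would be:

docker tag yourimagename yourdockerhubuser/yourimagename
docker push yourdockerhubuser/yourimagename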
Test run
After building the container image, I test it by starting a container:
docker run --restart=always -v /etc/localtime:/etc/localtime yourimagename https://youtube.com/live/STREAMID 5 120
This passes the host's current time into the container (the /etc/localtime mount), along with the stream URL, the number of seconds to record, and the number of seconds to wait between iterations. Inside the container the output will look like this:
Downloading video for 5 seconds
WARNING: Assuming --restrict-filenames since file system encoding cannot encode all characters. Set the LC_ALL environment variable to fix this.
ERROR: Unable to extract uploader id; please report this issue on https://yt-dl.org/bug . Make sure you are using the latest version; type youtube-dl -U to update. Be sure to call youtube-dl with the --verbose flag and include its complete output.
Stream internal address(es):
Waiting for 1 minute
Downloading video for 5 seconds
WARNING: Assuming --restrict-filenames since file system encoding cannot encode all characters. Set the LC_ALL environment variable to fix this.
ERROR: Unable to extract uploader id; please report this issue on https://yt-dl.org/bug . Make sure you are using the latest version; type youtube-dl -U to update. Be sure to call youtube-dl with the --verbose flag and include its complete output.
Stream internal address(es):
Waiting for 1 minute
Downloading video for 5 seconds
WARNING: Assuming --restrict-filenames since file system encoding cannot encode all characters. Set the LC_ALL environment variable to fix this.
ERROR: Unable to extract uploader id; please report this issue on https://yt-dl.org/bug . Make sure you are using the latest version; type youtube-dl -U to update. Be sure to call youtube-dl with the --verbose flag and include its complete output.
Stream internal address(es):
Waiting for 1 minute
You can safely ignore the error; it happens when youtube-dl tries to extract the uploader_id metadata, which we don't need here.
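If the encoding warning bothers you, one option (my assumption, not something the original image does) is to set a UTF-8 locale in the Dockerfile:

# Assumption: C.UTF-8 is available in the base image; this silences the youtube-dl encoding warning
ENV LC_ALL=C.UTF-8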
I pushed the image to Docker Hub, so you can use it directly with this command:
docker run -it --restart=always -v /etc/localtime:/etc/localtime stream.downloader https://youtube.com/live/0qPRKqRp-ys 5 120
It will fetch the stream from https://youtube.com/live/0qPRKqRp-ys for 5 seconds and wait 120 seconds for the next iteration.
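On a headless home server I would rather run it detached instead of interactively; a possible variant of the command above (the container name is my own choice):

docker run -d --name youtube-keepalive --restart=always -v /etc/localtime:/etc/localtime stream.downloader https://youtube.com/live/0qPRKqRp-ys 5 120

You can then check what it is doing with docker logs -f youtube-keepalive.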
Conclusion
So you can now prevent YouTube from stopping your stream by "viewing" it yourself at regular intervals. For me this is a little helper that keeps my stream going. But unfortunately ... my cam is broken. But hey, the next cam will arrive soon, and then I will update this post with the correct video URL.
Edit
Here is the livestream now: