Self Hosting Video
The scrappy way
I recently set up a low tech system for hosting videos on my website. Here is the first video I uploaded to test it!
Why?
I used to make videos on YouTube. But I dislike someone earning money from my videos. I have no intention of earning money from them, and almost nobody watches them anyway, so it just gives me a bad feeling.
But I do want to be able to put videos online. I want to be able to present them to people without ads or other distractions. I want people to be able to watch them through their RSS readers! I want to offer a direct download link, and an original-quality torrent download next to the video.
I want my visitors to feel like fellow humans, rather than a way to make money.
Goals
These are the goals for my setup:
- Low maintenance, I don’t want to be forced to fix my server.
- Portable, I don’t want to be stuck on a specific server.
- Easy to use, both to set up and to use day to day.
These goals align with a slippy mindset.
All I want to have to do is upload a video somewhere over ssh. When the video is uploaded, it should automatically create a page for the video in my website repo, transcode the video to a lighter format, and push that to the website repo as well.
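For example (the server name and paths here are made-up placeholders), an upload is a single command:

scp my-video.mp4 video@example.com:/home/video/upload/

The page creation and transcoding then happen on the server automatically.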
Then I can add a title and description to it and publish it manually.
How I did it
I combined several smaller pieces into something that does exactly what I need, and nothing I don’t. It’s low maintenance, because each program has been around forever, won’t change out from under me with updates, and is likely to keep being maintained for a long time. And if something breaks, everything is at a level where I can go in there and fix it myself, or swap out one of the components.
Here are the key parts in this project:
- scp to send videos to a directory on a server.
- entr for reacting to a file being sent to that directory.
- ffmpeg for transcoding videos.
- git to push transcoded videos to the website repository.
- bash for putting everything together into a script.
I put all these pieces together into this ~100 line bash script:
#!/usr/bin/env bash
transcode_video() {
  # Transcodes the input file to the $OUT path prefix.
  #
  # Transcodes to one widely supported mainstream format.
  #
  # I'm using https://developer.mozilla.org/en-US/docs/Web/Media/Formats/Video_codecs to decide which formats
  # have good enough support.
  #
  # As of May 2024:
  # Modern format: VP9 & Opus
  # Widely compatible format: H.264 & AAC (what this script uses)
  local IN="$1"
  local OUT="$2"
  # 960 is half of 1920,
  # so this is half of 1080p.
  # The long side is constrained instead of the short side,
  # because this ensures the resolution doesn't go above half of 1080p
  # regardless of the video's orientation.
  local RES=960
  local SCALING="scale=w='min($RES,iw)':h='min($RES,ih)':force_original_aspect_ratio=decrease"
  # x264 constant rate factor (CRF) encoding.
  ffmpeg \
    -n \
    -i "$IN" \
    -vf "$SCALING" \
    -c:v libx264 \
    -crf 30 \
    -c:a aac \
    -b:a 128k \
    "${OUT}.mp4"
  # Create a thumbnail from the first frame.
  ffmpeg -n -i "$IN" -vf "$SCALING" -frames:v 1 -q:v 5 "${OUT}.jpg"
}
push_to_git () {
  # Adds and pushes all files in the repository.
  # Assumes the GIT_WORK_TREE and GIT_DIR environment variables are set.
  git fetch
  git merge
  git add --all
  git commit -m "push_to_git from $(hostname)"
  git push
}
folder_transcode () {
  # Transcodes all video files from one folder
  # and puts the results into another folder.
  local DIR_IN="$1"
  local HUGO_DIR="$2"
  local DIR_OUT="$HUGO_DIR/assets/video"
  for f in "$DIR_IN"/*; do
    # Skip anything that isn't a regular file.
    if ! test -f "$f"; then
      continue
    fi
    filename=$(basename -- "$f")
    out_filename_no_ext=${filename%.*}
    out_filename_ext="$out_filename_no_ext.mp4"
    if ! test -f "$DIR_OUT/$out_filename_ext"; then
      # The output file does not exist yet.
      # Create and push a new draft page for the video,
      # so it can be edited while the video transcodes.
      "$HUGO_DIR/hugo" -s "$HUGO_DIR" new "video/$out_filename_no_ext.md"
      push_to_git
      echo "Pushed video info file"
      # Transcode and push the video.
      out_file_path="$DIR_OUT/$out_filename_no_ext"
      transcode_video "$f" "$out_file_path"
      push_to_git
    fi
  done
}
# This script watches a folder,
# and transcodes all videos to the output folder when it changes.
# Print every command as it executes.
set -x
declare DIR_IN=$1
declare HUGO_DIR=$2
# Set the git repo we're using.
export GIT_WORK_TREE=$HUGO_DIR
export GIT_DIR=$HUGO_DIR/.git
# Ensure the website repo is up to date first.
git fetch
git merge
while sleep 1; do
  # Transcode all files in the directory.
  folder_transcode "$DIR_IN" "$HUGO_DIR"
  # Block until there is a change in the directory:
  # -d makes entr watch the directory and exit when a new file is added,
  # -p postpones the first run, -n disables interactive mode,
  # and the kill terminates entr (the spawned shell's parent) on other changes,
  # so either way the pipeline ends and the loop runs again.
  ls -d "$DIR_IN" | entr -npd -s 'kill $PPID'
done
Feel free to copy and do whatever you want with this script!
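If you want to test it without systemd, it takes the upload directory and the Hugo site repo as its two arguments; something like this (the paths are examples, and I’ve assumed the script is saved as folder-watch-transcode):

bash folder-watch-transcode /home/video/upload /home/video/my-site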
I’ve added the script as a systemd service that starts when the computer starts; once started, the script runs forever in its loop. systemd is configured with NixOS, and the configuration there looks like this:
systemd.services.folder-watch-transcode = {
  description = "Auto-transcode and upload videos";
  wantedBy = [ "multi-user.target" ];
  path = with pkgs; [
    bash
    openssh
    entr
    ffmpeg_7-headless
    gitMinimal
    hostname
    lsof
  ];
  serviceConfig = {
    User = "video";
    Group = "users";
    Type = "simple";
    ExecStart = ''${pkgs.bash}/bin/bash /home/video/bin/folder-watch-transcode /home/video/upload /home/video/orsvarn'';
  };
};
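Once the service is running, it can be checked with the usual systemd tools, using the service name from the config above:

systemctl status folder-watch-transcode
journalctl -u folder-watch-transcode -f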
There is much more than this to the NixOS configuration, but that’s for another blog post. I also had to modify my website to display the videos; that’s not covered in this post either.
NixOS
NixOS enables me to have the setup for an entire system in no more than a few configuration files.
If I want to have the same setup on another computer, I can copy the configuration files there, run nixos-rebuild switch --flake ., and within a few minutes I’ve got all the programs, users, systemd services, etc. up and running.
Some things still need to be done manually, like adding private SSH keys, and adding git repos where they’re supposed to be, but it reduces the amount of work by at least 80%!
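That remaining manual work might look something like this sketch (the key type, repo URL, and paths are placeholders, not my exact setup):

ssh-keygen -t ed25519
git clone git@example.com:me/orsvarn.git /home/video/orsvarn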
NixOS improves the “low maintenance” and “portability” goals. This comes at the cost of the “easy to use” goal though, because it’s hard!
Next steps
I want to add a torrent download for each video. I’m investigating whether I need to run my own tracker to avoid relying on a third party, and which command-line programs can create torrent files.
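One such program is mktorrent; a minimal sketch might look like this (the tracker and web seed URLs are placeholders):

mktorrent \
  -a udp://tracker.example.com:6969/announce \
  -w https://example.com/video/original.mp4 \
  -o original.torrent \
  original.mp4

The -w web seed flag points at a plain HTTP copy of the file, so the torrent keeps working even when no peers are online.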
Why don’t I use PeerTube?
PeerTube has all the features I want. It transcodes videos, allows embedding them, and uses the Torrent protocol to alleviate server load.
I didn’t go with PeerTube though, because it doesn’t satisfy my goals.
- It’s potentially high maintenance, because I don’t know what is going to happen with it in the future. If development stops, I’d eventually be forced to move to something else. Since it’s a relatively new and complex project, the risk of it being left unmaintained is higher. And when updates do arrive, they come with an unknown amount of work for me.
- It’s less portable because it’s a bigger thing. What sort of hardware does it need to run? Can it run on a Raspberry Pi 4? What sort of database setup do I need? Once I’ve set up a system, I don’t want to feel like it would be a huge task to migrate it to another server.
- It’s hard to use, because it’s a big project that requires a database, reasonably powerful hardware, and other infrastructure to set up. If something goes wrong, I don’t know how to fix it. And if I want to upload something, I need access to a web browser.
So even though PeerTube has the features I need, and I really like the project, it doesn’t satisfy my needs.
Small web!!
I hope that someday all this stuff can be available for less technical people too. It isn’t fair that only a few technologically capable people with free time can be truly free on the web.
Have you built something similar? Do you have suggestions for how it can be improved? Contact me via email.