Launching a multistreaming server with AWS

Colin Cabana
8 min read · Dec 26, 2021

A nice little tradition my community at Xythereon and I have is running a 12-Hour Marathon at the end of every year, and because this event is a special one for us, I like to stream it on every platform I can.

Now — I’m sorta broke, which makes paying $30 for Restream.io for one stream a bit of a dumb decision. Luckily, I’ve been practicing Linux and using the cloud for a bit now and I thought “this is the perfect time to put my skills to the test!” and let me just say it was no picnic, which is why I’m writing this. If there’s someone out there who’s a bit of a techie like me (but not an expert), this guide is for you!

Comparing Restream.io

With this setup, you could theoretically stream around 1.5TB of content per month before your bill becomes more expensive than Restream.io. And unlike Restream.io, you only pay for what you actually use: if you don’t need a full month’s worth of multistreaming servers, just shut them down. Delete them if you need to, so you never have to pay an EBS bill for months you don’t even plan to touch them.

The plan is simple…

For the 12-Hour Marathon, I deployed 3 servers, but you could 1000% just use one. If you’ve streamed before, you’ll know that every platform has an RTMP “ingest” server that you put into your broadcasting software (like OBS). Your broadcasting software transmits your stream to whichever server you enter, such as YouTube’s or Twitch’s. The issue is that most people only have enough bandwidth to stream one broadcast at a time. My internet connection is only 12 Mbps up and I regularly stream at 8 Mbps. If I streamed to just one more platform, the demand would jump to 16 Mbps up, and you always have to account for the bandwidth your game needs on top of that. If you completely saturate your connection, you get higher latency in your game and dropped frames on the stream.
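To put numbers on that, here’s the arithmetic as a quick shell sketch. The bitrates are from my setup, so swap in your own:

```shell
# Upload bandwidth needed to stream directly to N platforms at once.
# These figures are from my connection; adjust for yours.
STREAM_BITRATE=8       # Mbps per outgoing stream
UPLOAD_CAPACITY=12     # Mbps my ISP gives me upstream
PLATFORMS=2            # streaming to two platforms at once

NEEDED=$((STREAM_BITRATE * PLATFORMS))
echo "Direct: need ${NEEDED} Mbps of ${UPLOAD_CAPACITY} Mbps"  # 16 of 12: oversubscribed

# With a relay server, OBS sends ONE stream and the server fans it out,
# so the demand on your home upload stays at the single-stream bitrate.
echo "Relay: need ${STREAM_BITRATE} Mbps of ${UPLOAD_CAPACITY} Mbps"
```

The relay doesn’t make your upload faster, it just moves the fan-out onto Amazon’s network where bandwidth is cheap and plentiful.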

For this, the plan is simple: We’re going to deploy an AWS t2.micro instance to multistream your broadcast for you in Amazon’s high-speed network, and if you’d like to copy the 12-Hour Marathon: we’re going to set up 2 more t2.micro instances for a backup stream.

Backup streams are great for streams that must remain online at all times — even if your stream from your computer fails. If that happens, this backup stream kicks in and your viewers will see a screen that says “Hey! I’m experiencing technical difficulties so don’t worry — it’s not you. I’ll be back up in a few moments!”.

So, the TLDR: We’re deploying 1 server to multistream your broadcast from your computer, and 2 servers to multistream a backup stream to your platform’s backup stream servers. Right now, YouTube and Facebook support this. Twitch doesn’t, but they do have disconnect protection.

Why AWS?

Alrighty, it’s time to bust out our first-grade math books. Please open to page 1.

Amazon Web Services (while harder to set up) is cheaper than Google Cloud for US Central users. For the 12-Hour stream, I estimated 300GB of egress from my VM ((3 streams x 2 servers) x 50GB of stream egress each = 300GB).

Google charges $0.12 per GB. While it’s not much, it adds up fairly quickly. 300GB x $0.12 easily becomes a snazzy $36 bill — and that’s not even including the VM price.

Amazon charges $0.01 per GB, which comes to just $3 for the same 300GB.

Obviously, your mileage may vary depending on where you are.

For the total cost, we need to bring the whole thing together. 3 instances x $0.0116 per t2.micro-hour x 14 hours comes to just $0.4872. Add the $3 for network egress and that’s only $3.4872. A 10GB EBS volume tacks on roughly another dollar per month at typical gp2 rates.
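If you want to sanity-check my arithmetic, here’s the same estimate as a small shell sketch. The prices are the ones I assumed above, not a quote from AWS:

```shell
# Rough cost estimate for the marathon setup (my assumed prices, not gospel).
INSTANCES=3
RATE=0.0116       # USD per t2.micro hour (on-demand, at the time I checked)
HOURS=14          # 12-hour stream plus 2 hours of setup/teardown
EGRESS_GB=300
EGRESS_RATE=0.01  # USD per GB, the estimate from above

awk -v i=$INSTANCES -v r=$RATE -v h=$HOURS -v g=$EGRESS_GB -v e=$EGRESS_RATE \
  'BEGIN { compute = i * r * h; egress = g * e;
           printf "compute $%.4f + egress $%.2f = $%.4f\n", compute, egress, compute + egress }'
# prints: compute $0.4872 + egress $3.00 = $3.4872
```

Change the variables at the top to match your own stream length and destinations and re-run it.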

These estimates are for the 12-Hour Challenge plus 2 hours to set up before the stream and shut down afterward. The math is all above: just adjust the numbers for how long you plan to use each instance, how many destinations you want, and whatever else, and you’re all set. You can also use https://calculator.aws to estimate how much you’ll spend. At some point this does get more expensive than Restream.io, so run the numbers before you burn your money. One bonus trick: you can point this server at a Restream.io free account and broadcast to Twitch and YouTube for the price of a single destination. Since Restream.io doesn’t let you stream to Facebook Gaming for free, you can reach DLive, YouTube, Twitch, Twitter and more through that one destination, then add Facebook as a separate push on your server!

As always, your mileage may vary. Your actual bill may be higher or lower than what I’m saying above and on the AWS Calculator. These are designed to give you a general idea of what you’re getting into.

Time to get to work!

Head to https://aws.amazon.com/ and sign up. Once you’re all set, select the region YOU are closest to. This lowers latency from you and the servers, which prevents stream delay, possible dropped packets and unreliable connections.

Once you’re all set, head to Elastic Compute Cloud (or “EC2” for short) and select “Launch Instance”.

Once you’re there, select a Linux distribution. I recommend Ubuntu for first-timers as it comes with a lot of packages already installed. You can also use Debian or maybe even Amazon Linux if you’re already an expert. This tutorial is designed for Ubuntu because I am indeed not an expert.

If you’re making the backup servers, deploy 3 at once. It’ll save you some time.

Run the two commands you should always run:

sudo apt update

sudo apt upgrade

Now, we’re going to run a tutorial by “dodgepong” on the OBS forums here: https://obsproject.com/forum/resources/how-to-set-up-your-own-private-rtmp-server-using-nginx.50/

Basically, what we’re doing is we’re building our own special version of nginx that has an RTMP module. You can’t get that by using apt — it’s something you need to make yourself.

Then install the build dependencies:

sudo apt-get install build-essential libpcre3 libpcre3-dev libssl-dev

Now it’s time to download nginx and the RTMP module.

wget http://nginx.org/download/nginx-1.15.1.tar.gz

wget https://github.com/sergey-dryabzhinsky/nginx-rtmp-module/archive/dev.zip

And now to install it:

sudo apt install unzip

tar -zxvf nginx-1.15.1.tar.gz

unzip dev.zip

cd nginx-1.15.1

./configure --with-http_ssl_module --add-module=../nginx-rtmp-module-dev

make

sudo make install

If you get an error saying that zlib isn’t installed, run "sudo apt-get install libpcre3 libpcre3-dev zlib1g zlib1g-dev libssl-dev" and run ./configure again.

Now, if you’re planning on using RTMPS: that is actually not supported by the RTMP module we built into nginx, so we need something called “stunnel”. Stunnel accepts the plain RTMP traffic you send it on a local port and wraps it in TLS before forwarding it on, which is exactly what RTMPS is.

sudo apt install stunnel

Now it’s time to configure everything. Run “sudo nano /etc/stunnel/stunnel.conf” to create/edit a configuration file for stunnel. (On Ubuntu, you may also need to set ENABLED=1 in /etc/default/stunnel4 so the service starts at boot.)

Paste this into the file:

# RTMP -> RTMPS tunnel
[fb-live]
client = yes
accept = 127.0.0.1:19350
connect = live-api-s.facebook.com:443
verifyChain = no

You can replace the “connect” value with YouTube or any other platform as well if you want to encrypt your traffic going there. Save the file and restart stunnel with “sudo systemctl restart stunnel4”.
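For example, if you also wanted an encrypted path to YouTube, you could add a second tunnel on its own local port. The YouTube RTMPS hostname below is my best recollection, so double-check it against your YouTube ingest settings:

# Second tunnel: local port 19351 -> YouTube’s RTMPS ingest
[yt-live]
client = yes
accept = 127.0.0.1:19351
connect = a.rtmps.youtube.com:443
verifyChain = no

Each tunnel gets its own [name] section and its own local accept port; later on, nginx pushes to that local port and stunnel handles the encryption.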

Now, head to /usr/local/nginx/conf/nginx.conf and paste this at the end of the file:

rtmp {
    server {
        listen 1935;
        chunk_size 4096;

        application live {
            live on;
            record off;
        }
    }
}

Under “record off;”, add a push directive for each streaming platform you want to reach. Note the trailing semicolon; nginx requires it:

push rtmp://<other streaming service rtmp url>/<stream key>;

To route a stream through the stunnel connection instead, enter “push rtmp://127.0.0.1:19350/live/<stream-key>;”
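Put together, a filled-in application block might look something like this. The stream keys are obviously placeholders, and the path after the stunnel port has to match whatever app name the platform’s RTMPS ingest expects:

application live {
    live on;
    record off;

    # Direct pushes: plain RTMP straight to the platform
    push rtmp://a.rtmp.youtube.com/live2/<youtube-stream-key>;
    push rtmp://live.twitch.tv/app/<twitch-stream-key>;

    # Encrypted push: goes through the local stunnel from earlier
    push rtmp://127.0.0.1:19350/live/<facebook-stream-key>;
}

One incoming stream, three outgoing pushes: that’s the whole multistream.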

Now, use these commands to stop and restart the server.

sudo /usr/local/nginx/sbin/nginx -s stop
sudo /usr/local/nginx/sbin/nginx

Now, get your server’s IP, add “/live” to the end of it and put it into OBS or any streaming software of choice.

Example: rtmp://10.0.0.69/live

Create any random stream key you’d like and start a test stream. See if you’re up and online!

If all goes well, you’re done! You now have a fully functioning multistreaming server on the cloud. Just remember to turn it off when you’re not using it or you’ll be billed for using resources that are idling. It’s like leaving the lights on and being the poor sucker who has to pay the power bill. Remember to run “sudo /usr/local/nginx/sbin/nginx” to start nginx RTMP!

Backup Servers

Configure your second server the same way you configured the first with the instructions above, but replace the RTMP links you’re pushing to with your platforms’ backup ingest links. YouTube has a second server for backup streams and lets you reuse the same stream key for that broadcast, while Facebook uses an entirely different stream key on the same address. Your platform may vary. For the third server, install FFmpeg by running “sudo apt install ffmpeg”.

Create a graphic or video you’d like to use for your backup stream. If it’s a still image, use an editing application (I recommend DaVinci Resolve if you need a free editor) to render it out as a 5-second video; otherwise, use whatever video you just made.

Upload that to your server (you can use WinSCP to SFTP into the server) as “offline.mp4”.

Now, find your second server’s internal IP address and replace “<ip-address>” with it on this next command:

ffmpeg -stream_loop -1 -i 'offline.mp4' -vf realtime,scale=1920:1080,format=yuv420p -r 30 -g 60 -c:v libx264 -preset superfast -x264-params keyint=2 -bufsize 500k -c:a aac -ar 44100 -b:a 128k -f flv "rtmp://<ip-address>/live/test"
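If you don’t want to keep an SSH session open to babysit that loop, one option is to wrap the command in a systemd unit. This is a sketch: the unit name is made up, and it assumes offline.mp4 lives in /home/ubuntu.

# /etc/systemd/system/backup-stream.service
[Unit]
Description=Backup stream FFmpeg loop
After=network-online.target

[Service]
WorkingDirectory=/home/ubuntu
ExecStart=/usr/bin/ffmpeg -stream_loop -1 -i offline.mp4 -vf realtime,scale=1920:1080,format=yuv420p -r 30 -g 60 -c:v libx264 -preset superfast -x264-params keyint=2 -bufsize 500k -c:a aac -ar 44100 -b:a 128k -f flv rtmp://<ip-address>/live/test
Restart=always

[Install]
WantedBy=multi-user.target

Enable it with “sudo systemctl enable --now backup-stream” and systemd will restart the loop if FFmpeg ever dies.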

That will livestream your video on loop until you hit “q” or “Ctrl+C” to stop it.

Challenges for Linux experts

There are a number of ways to improve on my tutorial — and if you do, let me know! I’d love to see it!

For example, figure out a way to make it more secure and use just one stream key. That way, if someone does somehow find your server’s IP address, they can’t stream. You can also figure out how to merge Server A and Server B together, thus saving money on a secondary server. Heck, t2.micro has the capacity to run all three servers simultaneously. If you can figure that out, please let me know! Another cool challenge is to make an automatic shut-down sequence so that when the server isn’t in use, it’ll shut down automatically, or even an automatic start-up script so that your server will be ready for action the second it turns on and becomes completely hands off!
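To get you started on the auto-shutdown challenge, here’s a sketch of the decision logic: count established connections on the RTMP port and only power off after several consecutive idle checks. The poll counts, the threshold and the shutdown call are all my assumptions, so dry-run it with echo before wiring up the real thing:

```shell
#!/bin/sh
# Idle watchdog sketch: decide whether the relay should shut down.
# Prints "shutdown" only after IDLE_LIMIT consecutive idle polls,
# otherwise prints the updated idle-poll count.
IDLE_LIMIT=3

check_idle() {
    # $1 = number of established connections on port 1935 this poll
    # $2 = consecutive idle polls seen so far
    conns=$1; idle=$2
    if [ "$conns" -eq 0 ]; then
        idle=$((idle + 1))
    else
        idle=0
    fi
    if [ "$idle" -ge "$IDLE_LIMIT" ]; then
        echo "shutdown"
    else
        echo "$idle"
    fi
}

# In a real cron job or loop you might get the count with something like:
#   conns=$(ss -tn state established '( sport = :1935 )' | tail -n +2 | wc -l)
# and run: sudo shutdown -h now   when check_idle says "shutdown".
check_idle 0 2   # third idle poll in a row -> prints "shutdown"
```

An active stream resets the counter, so a brief OBS hiccup between polls won’t kill your server mid-marathon.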

With that being said, I hope you were able to deploy these servers successfully. Happy streaming!
