A Day With Pleroma

I was looking at the costs associated with running a Mastodon instance, mainly for myself, and it wasn’t very cost efficient. I looked around and found Pleroma, which promises to be so efficient that it can run on a Raspberry Pi. I decided to give it a shot this morning. The install was rather easy following the succinct installation instructions. I spent a lot of the day just playing with the software and optimizing things behind the scenes (Cloudflare and nginx).

Here are some initial thoughts and observations of Pleroma from my first day administering and using it:

  • The maximum size of statuses/updates can be changed. It defaults to 5,000 characters, compared to Mastodon’s 500-character limit (see the config.exs sketch after this list).
  • UPDATE This turned out to be an nginx configuration setting. See below for details. Photo uploads from clients work. There is some tiny upload limit with Mastodon that forced me to resize pictures on my phone before uploading them. It works perfectly with Pleroma; the photos get uploaded and resized correctly on the server side.
  • Mastodon downloads images/media from remote instances to the local instance for performance reasons. This behavior cannot be turned off, and it causes ever-increasing disk space usage on a Mastodon instance when there are a lot of federated statuses with media. Pleroma takes a different approach: it uses nginx as a reverse proxy and caches remote media locally as it is accessed, with nginx handling expiration of the files as defined in the nginx configuration. If the Pleroma admin wants, they can turn all of this off altogether. I have it turned off; more on that further down in this write-up.
  • With Pleroma, uploads are saved into their own directory and are not intermingled with remote media.
  • UPDATE I made a mistake on this one. See below for the update. There’s no search in Pleroma: I can’t search for hashtags, nor is there the option of using Elasticsearch in the background for full-text searches.
  • UPDATE I figured it all out after looking in the config.exs file and discovering the S3 settings. Details here on how to set it up. The S3 object store setup for Mastodon works effortlessly and is needed because of the remote media caching. There are options to set up an S3 object store on Pleroma, but I could not get S3 object storage working, so I ended up configuring s3fs to mount a Wasabi bucket for use by Pleroma.
  • UPDATE This is a client issue. Clients that work with Mastodon should also work with Pleroma, though the clients are still limited to the 500-character limit. On iOS, I tested with Amaroq (works), Tootdon (can’t authenticate) and Tottle (can’t authenticate).
  • Pleroma uses very few system resources. I am running Pleroma on a 1GB RAM Digital Ocean instance and half the memory is still available. My Mastodon instance runs on a 2GB RAM Digital Ocean instance, and even then I see memory usage spike to 80% when running maintenance tasks.
  • There are fewer moving parts with Pleroma. All that was needed for Pleroma was Elixir, nginx and Postgres. Mastodon needs Rails, Postgres, Redis, Yarn, Node and nginx.
  • UPDATE This seems to be a client issue. I had an issue where images were not showing up in Amaroq if they had not yet been loaded from a web browser. I traced it down to some HTTP 406 errors in the logs. After a bunch of troubleshooting, and flipping settings on and off again, I figured it out: the Cloudflare CDN and the Pleroma media proxy did not like each other. Once I turned off the Pleroma media proxy (that toggle is in the config.exs sketch below) and let remote images load via Pleroma rather than through the nginx reverse proxy, things started working fine again.
  • The default Pleroma front-end is nicer than the TweetDeck-inspired Mastodon front-end. I like the Twitter-like interface that allows for more column space for the feed.
  • Pleroma seems to have issues when used in conjunction with Cloudflare’s Rocket Loader. I ended up having to turn that off in order for the Pleroma front-end to load.
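
For anyone curious what those knobs look like, here is a minimal config.exs sketch covering the character limit, the media proxy toggle and the S3 uploader settings. The exact keys can vary between Pleroma versions, and the bucket name and endpoint are placeholders, so treat it as a starting point rather than my exact configuration:

```elixir
# A minimal sketch of Pleroma config.exs tweaks; keys may differ by version.

# Maximum status length (Pleroma's default is 5,000 characters)
config :pleroma, :instance,
  limit: 5_000

# Disable the nginx-backed media proxy entirely (what I ended up doing,
# since it did not get along with Cloudflare in my setup)
config :pleroma, :media_proxy,
  enabled: false

# Optional S3-style object storage for uploads; the bucket and endpoint
# below are placeholders, not my real values
config :pleroma, Pleroma.Upload,
  uploader: Pleroma.Uploaders.S3

config :pleroma, Pleroma.Uploaders.S3,
  bucket: "my-pleroma-media",
  public_endpoint: "https://s3.wasabisys.com"
```

Restart the Pleroma service after changing any of these for the settings to take effect.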

I’ll see about writing up more about Pleroma in the near future as I get more time with it. So far, I am digging Pleroma.

Update: 9/9/18 - @kosh@vorlon.social pinged me to let me know that the Pleroma front-end doesn’t have the facilities to do searches, but the Pleroma back-end definitely supports it. He suggested that I use the Mastodon front-end (accessible via /web on your instance) to try it out. I did, and it works great: full-text (and other) search without needing Elasticsearch on the back-end like Mastodon does. Thanks, Kosh!

Update: 9/27/18 - For uploading large files, add this line to the nginx server block: client_max_body_size 16m;
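
For context, that directive goes inside the nginx server block that proxies requests to Pleroma. Below is a stripped-down sketch of where it sits; the hostname is a placeholder, and the real block from the Pleroma install instructions also carries the TLS certificate and proxy-header settings:

```nginx
server {
    listen 443 ssl http2;
    server_name pleroma.example.com;   # placeholder hostname

    # nginx's default body limit is 1 MB, which rejects larger photo
    # uploads with a 413 Request Entity Too Large error
    client_max_body_size 16m;

    location / {
        proxy_http_version 1.1;
        proxy_set_header Host $host;
        proxy_pass http://127.0.0.1:4000;   # Pleroma's default local port
    }
}
```

Reload nginx (for example with systemctl reload nginx) after adding the line.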