Uncloud Journal

  1. 2022.06.05 – Status Page!
  2. 2022.04.14 – Metrics
  3. 2022.04.06 – Synapse, Cryptpad and first website
  4. 2022.03.09 – Encrypting the disk
  5. 2022.03.02 – Site sketch, backup bug
  6. 2022.03.01 – The First Entry

Uncloud is an attempt to fight big tech and privacy-invasive companies like Google, Amazon, Facebook, and Apple. Built by me and my friend Lucas Severo.

2022.06.05 – Status Page!

It's been a long time since the last update! Here's the news:

Lately I've been freelancing more and working less on Uncloud... but, like a gardener, sometimes you need to leave nature on its own.

2022.04.14 – Metrics

Just deployed a basic metrics system! It is composed of Prometheus, Node Exporter (for host metrics), cAdvisor (for Docker containers), and Grafana. I hope we can soon offer some open dashboards 🤓.
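For reference, a minimal `prometheus.yml` for such a stack might look like this (the service names are assumptions; 9100 and 8080 are the default Node Exporter and cAdvisor ports):

```yaml
scrape_configs:
  - job_name: node
    static_configs:
      - targets: ['node-exporter:9100']   # host metrics (CPU, disk, memory)
  - job_name: cadvisor
    static_configs:
      - targets: ['cadvisor:8080']        # per-container metrics
```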

2022.04.06 – Synapse, Cryptpad and first website

It's been a while without updates: I'm traveling in my home country, Brasil 🇧🇷.

Anyway, Lucas got a Synapse and a Cryptpad up and running!

AND we finally have the very first website, uncloud.do! You can take a look at the source code here.

2022.03.09 – Encrypting the disk

By default, the disks on dedicated machines at Hetzner are not encrypted at rest. Last week I worked on encrypting the disks using Hetzner's installimage script. Worked like a charm. You can check more here.
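As a sketch of what that involves: installimage supports full-disk encryption from its config file, roughly like this (partition sizes and the password are made up here; check Hetzner's documentation for the exact options):

```
# installimage config sketch (sizes and password are assumptions)
CRYPTPASSWORD some-strong-passphrase
PART /boot ext4 1G          # /boot stays unencrypted so the machine can boot
PART /     ext4 all crypt   # "crypt" marks the partition for LUKS encryption
```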

2022.03.02 – Site sketch, backup bug

Two updates for the journal today.

First, we have a sketch for our future website. My partner Isabela is working lots on it. Implementation is coming soon!

Secondly, I had a ridiculous bug hunt yesterday. We are temporarily storing backups in Premiumize, by mounting their storage with SSHFS. The shell script basically did:
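Something like this, reconstructed from memory with temp dirs standing in for the local backup dir and the SSHFS mount (the real paths aren't in this journal), including a `touch -d` to simulate the bogus timestamp SSHFS gave new files under cron:

```shell
# Runnable reconstruction of the buggy order: copy first, clean up second.
set -eu
SRC=$(mktemp -d)   # stand-in for the local backup directory
DEST=$(mktemp -d)  # stand-in for the Premiumize SSHFS mount

echo "backup payload" > "$SRC/latest.tar.gz"

# 1. copy today's backup to the "remote" storage
cp "$SRC/latest.tar.gz" "$DEST/backup.tar.gz"

# Under cron, the new file briefly carried a bogus old mtime; simulate that:
touch -d "1999-09-01" "$DEST/backup.tar.gz"

# 2. delete backups older than 7 days -- this wipes the file we just copied
find "$DEST" -type f -mtime +7 -delete

ls "$DEST"   # empty: the fresh backup is gone
```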

It worked fine when run by hand, line by line, but failed under cron: the files were never copied.

After much trial and error, I found out that files were being created with a weird old date, like September 1999, and updated to the correct timestamp a couple of seconds later. Since my script was deleting older files right after copying, it deleted the newly copied file. Fuck.

The fix was simple: delete the old files before copying, so we avoid any timestamp problem. It took a couple of hours, though.
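With the same stand-in setup as before (temp dirs in place of the real backup dir and SSHFS mount), the fixed order looks like this: the cleanup runs before the copy, so a bogus timestamp on the incoming file can no longer get it deleted.

```shell
# Sketch of the fixed order: delete old files FIRST, then copy.
set -eu
SRC=$(mktemp -d)   # stand-in for the local backup directory
DEST=$(mktemp -d)  # stand-in for the Premiumize SSHFS mount

echo "backup payload" > "$SRC/latest.tar.gz"

# 1. delete old backups first -- nothing new has been copied yet
find "$DEST" -type f -mtime +7 -delete

# 2. then copy today's backup; whatever mtime SSHFS gives it, it survives
cp "$SRC/latest.tar.gz" "$DEST/backup.tar.gz"

ls "$DEST"   # backup.tar.gz is still there
```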

2022.03.01 – The First Entry

The project has been around since 2022.02.06; at least, that's the day we rented a dedicated machine and bought the domain.

So far, we have a couple of services:

Everything is hosted on a single Hetzner dedicated machine (so far). The whole infrastructure is open and public; you can check it here. Time will tell if making our infrastructure this public is a good idea.

My partner was watching me host FreshRSS today and gave me the idea to document this, so here it is.

By the way, the FreshRSS logs mix the web server and the app together, like this:

freshrss_1  | FreshRSS[2704]: [<username>] [Tue, 01 Mar 2022 21:23:56 +0100] [warning] --- cURL error 22: The requested URL returned error: 404 Not Found [http://ohoh-blog.blogspot.com/feeds/posts/default]
freshrss_1  | FreshRSS[2704]: SimplePie GET https://patchworkcactus.typepad.com/
freshrss_1  | PHP Notice: A feed could not be found at `http://patchworkcactus.typepad.com/blog/atom.xml`; the status code is `302` and content-type is `text/html; charset=utf-8` in /var/www/FreshRSS/lib/SimplePie/SimplePie.php on line 1807
freshrss_1  | FreshRSS[2704]: [<username>] [Tue, 01 Mar 2022 21:23:59 +0100] [warning] --- A feed could not be found at `http://patchworkcactus.typepad.com/blog/atom.xml`; the status code is `302` and content-type is `text/html; charset=utf-8` [http://patchworkcactus.typepad.com/blog/atom.xml]
freshrss_1  | - - [01/Mar/2022:21:24:18 +0100] "GET /i/?c=javascript&a=nbUnreadsPerFeed HTTP/1.1" 200 33 "-" "Go-http-client/1.1"
freshrss_1  | - - [01/Mar/2022:21:24:23 +0100] "POST /i/?c=entry&a=read HTTP/1.1" 200 11 "-" "Go-http-client/1.1"
freshrss_1  | - - [01/Mar/2022:21:24:24 +0100] "GET /i/?c=javascript&a=nbUnreadsPerFeed HTTP/1.1" 200 1488 "-" "Go-http-client/1.1"

One thing we want with Uncloud is to be as anonymous as possible, so we should probably drop the <username> from those logs. I'll work on this soon.
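One hypothetical approach (the `[redacted]` placeholder and the pipeline are my own idea, not a FreshRSS feature): pipe the log stream through `sed` and blank out the bracketed username that follows the `FreshRSS[pid]:` prefix.

```shell
# Hypothetical log scrubber: replace the "[<username>]" field that follows
# "FreshRSS[<pid>]:" with a fixed placeholder. "alice" is a made-up example.
echo 'freshrss_1  | FreshRSS[2704]: [alice] [Tue, 01 Mar 2022 21:23:56 +0100] [warning] --- cURL error 22' \
  | sed -E 's/(FreshRSS\[[0-9]+\]: )\[[^]]+\]/\1[redacted]/'
```

A filter like this could sit between `docker-compose logs` and whatever ships the logs off the box, so usernames never leave the host.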