Yeah, it was $2.50/TB/month; now it’s $4.10/TB/month.
Still cheaper than Backblaze’s $6, which seems to be the only other option everyone suggests, so it’ll have to do for the moment.
I’m assuming you mean updating every service, right?
If you don’t need anything new from a service, you can stay on the version you’re using for as long as you like, as long as your services aren’t public.
You could just install Tailscale and connect everything inside the tailnet.
From there you’ll only need to update Tailscale (and probably your firewall, Docker, and OS), or act whenever any of the services you use receives a security update.
I’ve lagged several versions behind on Immich because I don’t have time to monitor the updates and handle the breaking changes, so I just use a version until I have free time.
Then it’s just an afternoon of reading through the breaking changes, updating the compose file and config, and running docker compose pull && docker compose up -d.
In theory there could be issues here; that’s where your backups come in. But I’ve never had any issues.
The rest of the 20+ services I have are just running there, because I don’t need anything new from them, or I can mindlessly run the same compose commands to update them.
There were only one or two times I actually had to go into some kind of emergency mode because a service suddenly broke, and I had to spend a day or two figuring out what happened.
I’d say syncthing is not really a backup solution.
If for some reason something happens to a file on one side, it’ll also happen to the file on the other side, so you’ll lose your “backup”.
Plus, what guarantees that your friend won’t snoop around or make their own copies of your data?
Use proper backup software to send your data offsite (restic, borg, duplicati, etc.), which will send it encrypted (use a password manager to set a strong and unique password for each backup),
And follow the 3-2-1 rule MangoPenguin mentioned.
Remember, this rule is just for data you can’t find anywhere else: your photos, files you’ve generated, databases of the services you self-host, stuff like that. If you really want, you could back up hard-to-find media, but if you already have a torrent file, don’t bother backing that media up.
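As a minimal sketch of that offsite flow with restic; the remote host, repo path, and backed-up directories are placeholders for your own setup:

```shell
# one-time: point restic at the unique password from your password manager
export RESTIC_PASSWORD_FILE="$HOME/.config/restic/offsite.pass"

# one-time: initialize the encrypted repository on the remote side
restic -r sftp:friend@friends-box:/backups/mine init

# recurring (cron or a systemd timer): back up only the irreplaceable data
restic -r sftp:friend@friends-box:/backups/mine backup ~/photos ~/services/db-dumps

# keep the repo from growing forever
restic -r sftp:friend@friends-box:/backups/mine forget --keep-daily 7 --keep-weekly 4 --prune
```

Everything leaves your machine already encrypted, so whoever hosts the other end can’t read it or make useful copies.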
What do you mean Jellyfin uses the *arr suite?
I have Jellyfin with media in different directories; I just try to match the format the docs mention.
So, as long as I can get the media in any way, I can put it in any directory and it’ll be added to the library.
Is it similar with Odin, or does it directly fetch the media from where you want to download it?
FreshRSS has been amazing. As you said, other readers have other goals in mind, and RSS seems to be just an add-on.
On Android there are also no good clients; I’ve been using the PWA, which is good enough.
There are several extensions for mobile menu improvements; I have Smart Mobile Menu, Mobile Scroll Menu, and Touch Control (that last one works great on Firefox but not on Brave, where it’s too sensitive, so YMMV).
There’s also ReadingTime, but some feeds don’t send the whole body of the post, so you might only see a 1-minute read because of that.
The AutoTTL extension processes the feeds and updates each one only when it’s more likely to have new items, instead of every X minutes as configured in FreshRSS.
There’s still a problem when MaxTTL is reached: all feeds are allowed to update at once and you might hit rate limits, so I developed a rate limiter. There are also issues with AutoTTL because of how extensions are loaded and the HTTP codes reported by FreshRSS.
I found this project, which receives newsletter emails and turns them into an RSS feed. I’ve only used it for one feed and have only received one entry; not sure if the newsletter is that quiet or if the site struggles to receive/show the emails. I haven’t tried it with anything else.
https://github.com/leafac/kill-the-newsletter
There’s also this repo linking a lot of sites with feeds. Some sites that don’t offer feeds directly are provided via FeedBurner (which seems to be a Google service; Wikipedia says it’s "primarily for monetizing RSS feeds, primarily by inserting targeted advertisements into them", so use those at your own discretion):
https://github.com/plenaryapp/awesome-rss-feeds
Just for privacy reasons?
I can decouple the traffic fingerprinting of some sites, like Amazon, YouTube, Reddit, etc.
And because I have a Squid proxy routed through the VPN (set up via a couple of Docker containers), I keep a Firefox container that always sends its traffic over the proxy, which lets me easily search for stuff both inside and outside the VPN.
Aside from that, I also use the proxy to send scripted requests over the VPN so my real IP doesn’t get rate limited.
And for what VPNs are actually for: accessing geo-blocked content.
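The scripted side is just standard proxy flags or environment variables; proxy.lan:3128 below is a placeholder for wherever your Squid container listens:

```shell
# this request exits through the VPN via squid
curl -x http://proxy.lan:3128 https://ifconfig.me

# the same request without the proxy exits through the normal connection,
# so the two should print different IPs
curl https://ifconfig.me

# most CLI tools and libraries also honor the conventional env variables
export https_proxy=http://proxy.lan:3128
```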
I’ve always used them as bookmarks, especially now that they have lists.
There are projects with tens of thousands of stars whose last commits are from 2-3 years ago, or that only have dependabot commits, or that show 0 open issues while every recently closed one was closed by stalebot because the owner doesn’t care to maintain the repo.
Stars are not a way to know if a repo is good.
Maybe you could submit an issue to the repo asking for a way to change the format of the saved folders.
(I’m thinking of something similar to how Immich lets you change some formats.)
Looking at my instance, the names seem like some sort of timestamp. I’m not sure if the code uses them in a meaningful way, so the solution would probably be to create symlinks with the name of the site (or some other format) while keeping the timestamp, so the rest of the code can still rely on it.
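A rough sketch of that symlink idea, assuming the folders really are named with unix timestamps (e.g. 1700000000); it uses a readable date as the alias, since getting the site name would mean reading each snapshot’s metadata:

```shell
# adds a readable alias next to each timestamp-named folder via a symlink,
# without renaming anything the rest of the code might rely on
link_by_date() {
  archive=$1
  mkdir -p "$archive/by-date"
  for dir in "$archive"/[0-9]*/; do
    [ -d "$dir" ] || continue
    ts=$(basename "$dir")
    # GNU date turns the epoch seconds into a readable name
    name=$(date -u -d "@$ts" +%Y-%m-%d_%H%M%S 2>/dev/null) || continue
    ln -sfn "../$ts" "$archive/by-date/$name"
  done
}

# link_by_date /srv/archive   # path is a placeholder for your instance
```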
I bought this one and it’s been wonderful for running 20+ services. A few of those are Forgejo (GitHub replacement), Jellyfin (Plex, but actually self-hosted), Immich (Google Photos replacement), and Frigate (to process one security camera).
(Only Immich does transcoding; Jellyfin’s media was all preprocessed on my laptop’s GPU.)
I bought it barebones since I already had the RAM and an SSD, plus I wasn’t going to use Windows. During this year I’ve bought another SSD and an HDD.
I bought it on Amazon, but you could buy it from the seller directly, although I’d recommend Amazon so you don’t have to deal with the import and you get an easy return policy.
Gameloft was a sister company, wasn’t it?
What’s happened to them?
I’ve seen a few trailers for their games in some Nintendo Directs; are they any good, or have they followed a similar path to Ubisoft?
I’d say it’s one thing, and better, to be tracked only at the account level than at the traffic level.
That way only your history on that site can be used, as opposed to whatever other fingerprinting sites might do at the browser, cookie, or IP level.
Found the issue '^-^
UFW also blocks traffic between Docker and the host.
I had to add these rules:
ufw allow proto tcp from 172.16.0.0/12 to 172.16.0.0/12 port 80
ufw allow proto tcp from 172.16.0.0/12 to 172.16.0.0/12 port 443
Same problem.
I tried a few values and it’s the same: ping works but curl doesn’t.
I wonder if Trudeau will make the same move as the Mexican president and tell the actual truth about the meeting.
Why not report it in the repo?
Not sure what you’d consider lightweight; I’ve been using https://github.com/jhj0517/Whisper-WebUI with faster-whisper.
The GPU integration has never worked well for me, but the CPU one works wonders.
You’ll have to check if the models offer good results for those languages.
The video on YT and the track played in YTMusic are two different uploads; you can easily get the YT one by checking the YTM URL and taking the ID. So yeah, yt-dlp should get you only the song, as long as you created a playlist with only songs instead of music videos.
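Sketched as a one-liner, with a placeholder playlist ID; -x tells yt-dlp to keep only the audio:

```shell
# downloads each entry of the (songs-only) playlist and extracts the audio
yt-dlp -x --audio-format mp3 "https://music.youtube.com/playlist?list=PLAYLIST_ID"
```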
Maybe FreshRSS with some extensions?
I saw a recent commit that fires an event when saving a favorite, so you could probably write an extension that sends the link to something like ArchiveBox for the pages you favorite.
I’ve only fiddled with an already-created extension, but they seem fairly simple, so creating your own should be easy.
Of course you can inject JS so you could make it more complex if you want.
For Invidious: in FreshRSS I use the YouTube extension, which gives the embedded video player; you just need to update this part of the code: https://github.com/FreshRSS/Extensions/blob/master/xExtension-YouTube/extension.php#L153-L163
It’s easy, just replace it with this:
public function getHtmlContentForLink(FreshRSS_Entry $entry, string $link): string
{
    $domain = 'www.youtube.com';
    if ($this->useNoCookie) {
        $domain = 'www.youtube-nocookie.com';
    }
    $domain = 'invidious.personal.com';
    $params = 'quality=dash';
    $url = str_replace('//www.youtube.com/watch?v=', '//'.$domain.'/embed/', $link);
    $url = str_replace('http://', 'https://', $url);
    $url = $url . '?' . $params;
    return $this->getHtml($entry, $url);
}
The only changes are setting $domain = 'invidious.personal.com';
and adding the quality=dash parameter.
Seems there’s also this one: https://github.com/tunbridgep/freshrss-invidious, but I haven’t tried it.
My question would be, if you’re only archiving repos, why do you need a forge?
A simple
git clone <repo>
into your archival directory would be enough to store them; there’s no need for forge software. Are there any other features of Gitea you use?
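If you do drop the forge, a bare mirror is a bit more thorough than a plain clone, since it keeps every branch and tag and can be refreshed in place; the URL and paths below are placeholders:

```shell
# clone (or refresh) a bare mirror of $1 into $2
mirror_repo() {
  src=$1
  dest=$2
  if [ -d "$dest" ]; then
    git -C "$dest" remote update          # refresh all refs on later runs
  else
    git clone --mirror "$src" "$dest"     # bare copy with every branch and tag
  fi
}

# mirror_repo https://example.com/user/repo.git /srv/archive/repo.git
```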