I for one use and self-host Meshcentral. The GUI is ugly, but it works well.
In terms of software compatibility, on Linux you have the option of making chroots. Since the kernel devs make a lot of effort to preserve compatibility, old software can still work fine. If I remember correctly, some kernel devs tested really old versions of bash, gcc, etc. a while ago, and they still worked fine with modern kernels.
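As a rough sketch of the idea (the directory, suite, and mirror here are just examples; this needs root and the debootstrap package installed):

```shell
# Install a minimal old Debian userland into a directory,
# then run a shell inside it with the plain chroot command.
sudo debootstrap stretch /srv/old-debian http://archive.debian.org/debian
sudo chroot /srv/old-debian /bin/bash
```

From inside that shell, the old distribution’s binaries run against the modern host kernel, which is exactly what the kernel’s compatibility guarantees make possible.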
Red Hat. Probably Canonical too.
I know it for a fact since I worked for a bank that chose Red Hat and since I also know someone working for Red Hat.
For those wondering, it also works with a Linux VM:
It’s not easy to set up, but it works. I’m able to run games like Borderlands 3 at ~50 FPS at 1920x1080 with visual effects set to the max (important: disable vsync in the games!).
The only problem is disk access: it tends to add some latency. So with sloppily coded games (e.g. Raft), the framerate drops a lot (Raft goes down to 20 FPS sometimes).
You can use du -sh to figure out what’s using most of the space. Something along the lines of:
sudo -i
du -sh /home /usr /var
du -sh /var/*
du -sh /var/log/*
# etc
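If you want the biggest offenders sorted for you, du combines well with sort -h. A sketch (the throwaway directory below is only there to make the example harmless to run as-is):

```shell
# Demo on a throwaway directory instead of real system paths.
demo=$(mktemp -d)
mkdir -p "$demo/big" "$demo/small"
head -c 1048576 /dev/zero > "$demo/big/file"   # 1 MiB
head -c 512 /dev/zero > "$demo/small/file"     # 512 B
du -sh "$demo"/* | sort -h                     # biggest entry comes last
rm -rf "$demo"
```

On a real system the same idea is just: du -sh /var/* 2>/dev/null | sort -h | tail -n 5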
If it’s one of your log files (likely), you can run something like tail -n 100 /var/log/[culprit] or tail -F /var/log/[culprit] to see exactly what is flooding that log file. Then you can try to fix it.
As suggested by others, your processes may be using too much memory. However, I would also suggest you keep an eye on the output of dmesg. Maybe one of your disks is failing.
Oh, actually, if you are worried about vendor lock-in: Meshcentral is open source. So even if they decide to try something stupid, a fork would likely happen.
I think there is some confusion here.
Paperwork is a desktop application, not a web application. (Eh, self-hosting doesn’t necessarily imply web applications! :) I for one use Nextcloud to keep my Paperwork work directories in sync on all my computers.
Paperless is a web application. Paperwork is a desktop application.
I self-host Meshcentral. I haven’t seen any limitation at all. I don’t know if there are limitations when you use meshcentral.com instead of self-hosting.
Paperwork seems to fit the bill except for one thing: it won’t scroll to where the search hit is (but it will highlight the matching keywords).
Just beware that Paperwork won’t just create an index. It’ll organize the PDFs its own way, in its own work directory.
(full disclosure: I’m its main dev)
Based on my tests on my family and friends, the main problem is tech support. Most geeks seem to assume other people want the same things as they do (privacy, freedom, etc.). Well, they don’t. They want a computer that just works.
Overall, when using Linux, people actually don’t need much tech support, but when they do need it, they really need it. My father put it really well: “the best OS is your neighbor’s.”
I apply a few rules:
The deal with my family and friends is simple: you want tech support from me? OK, then I’m going to pick your computer (usually old Lenovo Thinkpads bought on eBay at ~300€) and I’m going to install Linux on it.
I’m not shy. I ask them if they want me to have remote access to their computer. If they accept, I install a Meshcentral agent. Thing is, on other OSes, they are already spied on by Google, Microsoft, Apple, etc. And most people think they “have nothing to hide”. So why should they worry more about a family member or a friend than about some unknown big company? Fun fact: I’ve been really surprised by how easily people accept that I keep remote access to their computer, even people that are not family! Pretty much everybody has gladly agreed up to now (and God knows I’ve been really clear that I can access their computer whenever I want).
I install the system for them and I do the major updates for them. Therefore, since I have remote access to the system, I pick the distribution I’m most at ease with (Debian). They just don’t care what actually runs on their computers.
When they have a problem, they call me after 8pm. With remote access, most problems are solved in a matter of minutes. Usually, they call me a few times the first days, and then I never hear from them anymore until the next major update.
So far, everybody seems really happy with this deal. And for those wondering, I can see in Meshcentral they really do use those computers :-P
Also, thanks to Wine/Proton. You have to give it to Valve: overall, it works surprisingly well.
At first sight, it looks like it can be used with chroots thanks to systemd-nspawn (I haven’t tried it, though).
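For the record, a minimal sketch of what that could look like (directory and suite are made up; needs root plus the debootstrap and systemd-container packages):

```shell
# Install a minimal Debian tree, then enter it as a lightweight container.
# systemd-nspawn is essentially "chroot done right": it sets up /proc, /dev,
# a private hostname, etc. for you.
sudo debootstrap stable /var/lib/machines/testbox http://deb.debian.org/debian
sudo systemd-nspawn -D /var/lib/machines/testbox /bin/bash   # one-shot shell
sudo systemd-nspawn -b -D /var/lib/machines/testbox          # or boot its init
```

Again, I haven’t tried this myself, so take it as a starting point rather than a recipe.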
I use OPNsense virtualized on top of Proxmox. Each physical interface of the host system (ethX and friends) is in its own bridge (vmbrX), and for each bridge, the OPNsense VM has a virtual interface that is part of that bridge. It has worked flawlessly for months now.
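For those curious, the host side is just standard Linux bridging. A sketch of what one such bridge looks like in /etc/network/interfaces on the Proxmox host (interface names here are examples, not my actual config):

```
auto vmbr0
iface vmbr0 inet manual
        bridge-ports eth0
        bridge-stp off
        bridge-fd 0
```

The OPNsense VM then simply gets one virtual NIC attached to each vmbrX.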
I worked for a bank. When they decided to deploy Linux on their infrastructure, they chose RHEL and they signed a big contract with Red Hat for tech support.
Overall, they chose Red Hat for the same reason they chose Microsoft before: tech support. They have >10,000 engineers, and yet somehow they think they absolutely need tech support… They pay a lot for it. In my building, they even had a Microsoft engineer on-site once a week until Covid. I can’t speak for the other people working for this bank, but I asked for Microsoft support only once in 2 years. In the end, their guy sent me back an email telling me “I’ve transmitted your question to the corresponding engineering team” and… diddlysquat.
Now, to be fair, for paying customers, Red Hat and Microsoft both ensure security updates for a really long time. Red Hat pays a lot of people to backport security patches from upstream to previous versions. It allows companies like the bank I worked for to keep running completely crappy and obsolete software for an insane amount of time without having to worry too much about security vulnerabilities.
Anyway, regarding Red Hat contributions, a lot of them are subtle.
This list is far from exhaustive. I’m sure they have paid for a lot of other things you’re using daily without knowing it.
That’s something I would like to do someday. Unfortunately, last time I checked, libraries for reading DjVu files exist and are OK, but not for writing them. Most programs I found that write DjVu files don’t actually use the DjVuLibre library; they shell out to the DjVuLibre command-line tools.
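To illustrate what “running the DjVuLibre commands” means in practice (the file names are made up; this assumes the djvulibre-bin tools are installed):

```shell
# c44 is DjVuLibre's encoder for scanned/photographic pages;
# djvm bundles single-page DjVu files into one multi-page document.
c44 page1.jpg page1.djvu
c44 page2.jpg page2.djvu
djvm -c book.djvu page1.djvu page2.djvu
```

So a program “writing DjVu” this way is really just driving these CLI tools, not linking against a library.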
Shameless self-promotion: Paperwork
To quickly introduce myself, I’m the main author of Paperwork. I’ve packaged Paperwork in various ways, and many people have packaged it in various distributions as well.
I’m fine with Flatpak. In my opinion, it has its use cases. I find it complementary to other existing methods (distribution packages, AppImage, …).
However, I’m not fine with Snap. I haven’t used it much, but my understanding is that it is centered on Canonical’s servers. You can change its configuration to use other servers, but it defaults to Canonical’s (and we all know most users never change default settings). To me, this is a slippery slope towards proprietary services/software.
Moreover, I’m really annoyed by Canonical pushing Snap by default in Ubuntu (Firefox, Chromium, etc. are packaged only as Snaps now; the APT packages just install the Snap packages). It doesn’t bring anything to the users. Those packages could have been packaged just as well with APT (see the *-updates repositories in Debian, for instance).
My wife and I use a Nextcloud application called Cospend.