linuxPIPEpower

joined 9 months ago
[–] [email protected] 2 points 3 months ago (1 children)

Thanks!

I elaborated on why I'm using USB HDDs in this comment. I have been a bit stuck on how to proceed to avoid these problems. I am willing to get a new desktop at some point but I'm not sure what is needed and don't have unlimited resources. If I buy a new device, I'll have to live with it for a long time. I have about 6 or 8 external HDDs in total. I will probably eventually consolidate the smaller ones into a larger drive, which would bring that number down; several are 2-4TB and could be replaced with a single 12TB. But I will probably keep using the existing ones for backup if at all possible.

Re the VPN: people keep mentioning this, but I don't understand what it would do. I mostly need to access my files from within the LAN. Certainly not enough to justify the security risk of a dummy like me running a public service. I'd rather just copy files to an encrypted disk for those occasions and feel safe with my ports closed to outsiders.

Is there some reason to consider a VPN for inside the LAN?

[–] [email protected] 4 points 3 months ago (1 children)
  1. In another comment I ran iperf3 Laptop (wifi) ---> Desktop (ethernet), which was about 80-90 Mbits/s. Whereas Desktop ---> OtherDesktop was in the 900-950 Mbits/s range. So I think I can say the networking is fine enough when it's all ethernet. Is there some other kind of benchmarking to do?

  2. Just posted a more detailed description of the desktops in this comment (4th paragraph). It's not ideal but for now it's what I have. I did actually take the time (gnome-disks benchmarking) to test different cables, ports, etc. to find the best possible configuration. While there is an upper limit, if you are forced to use USB, this makes a big difference. (CLI equivalent of that benchmark after this list.)

  3. Other people suggested ZeroTier or VPNs generally. I don't really understand the role this component would be playing? I have a LAN and I really only want local access. Why the VPN?

  4. Ya, I have tried using syncthing for this before and it involves deleting stuff all the time, then re-syncing it when you need it again. And you need to be careful not to accidentally delete while synced, which could destroy all copies.

  5. Resilio: I used it a long time ago. Didn't realize it was still around! IIRC it was somewhat based on bittorrent, with the idea of peers providing data to one another.
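
Re point 2: the CLI equivalent of the gnome-disks read test is roughly the following, with /dev/sdX standing in for whichever USB drive:

# quick sequential-read timing; read-only, so safe to run on a live disk
sudo hdparm -t /dev/sdX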

[–] [email protected] 2 points 3 months ago (6 children)

Maybe Syncthing is the way forward. I've used it for years and am reasonably comfortable with it. When it works, it works. Problem is that when it doesn't work, it's hard to solve or even to know about. For the present use case it would involve making a lot of shares and manually toggling them on and off all the time. And it would need some kind of error-checking system to avoid deleting unsynced files.

Others have also suggested NFS but I am having a difficult time finding basic info about what it is and what I can expect. How is it different from an SSHFS mount? Assuming I continue limping along on my existing hardware, do you think it can do any of the local-caching type stuff I was hoping for?
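
From the little I've pieced together so far, the setup seems to amount to exporting a directory on Desktop and mounting it on Laptop, something like this (paths and subnet are made up):

# on Desktop (the server), a line in /etc/exports:
/srv/media  192.168.1.0/24(rw,no_subtree_check)

# apply the export, then on Laptop mount it like a local filesystem:
sudo exportfs -ra
sudo mount -t nfs desktop.lan:/srv/media /mnt/media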

Re the hardware, thanks for the feedback! I am only recently learning about this side of computing. Am not a gamer and usually have had laptops, so never got too much into the hardware.

I actually have 2 desktops, both 10+ years old. One is a Mac Mini, so there is no chance of getting the storage properly installed. I believe its CPU is better and it has more RAM, because it was upgraded when it was my main machine. The other is a "small" tower (about 14") picked up cheaply to learn about PCs. It has not been upgraded at all other than an SSD for the system drive. Both are running Debian now.

In another comment I ran iperf3 Laptop (wifi) ---> Desktop (ethernet), which was about 80-90 Mbits/s. Whereas Desktop ---> OtherDesktop was in the 900-950 Mbits/s range. So I think I can say the networking is fine enough when it's all ethernet.

One thing I wasn't expecting from the tower is that it only supports 2x internal HDDs. I was hoping to get all the loose USB devices inside the box, like you suggest. It didn't occur to me that I could only get the system drive + one extra. I don't know if that's common? Or if there is some way to expand the capacity? There isn't too much room inside the box but if there was a way to add trays, most of them could fit inside with a bit of air between them.

This is the kind of pitfall I wanted to learn about when I bought this machine so I guess it's doing its job. :)

Efforts to research what I would like to have instead have led me to be quite overwhelmed. I find a lot of people online who have way more time and resources to devote than I do, who want really high performance. I always just want "good enough". If I followed the advice I found online I would end up with a PC costing more than everything else I own in the world put together.

As far as I can tell, the solution for the miniPC type device is to buy an external drive holder rack. Do you agree? They are sooo expensive though, like $200-300 for basically a box. I don't understand why they cost so much.

[–] [email protected] 1 points 3 months ago (5 children)

What would be the role of Zerotier? It seems like some sort of VPN-type application. I don't understand what it's needed for though. Someone else also suggested it albeit in a different configuration.

Just doing some reading on NFS; it certainly seems promising. Naturally the ArchWiki has a fairly clear instruction document. But I am having a hard time seeing what it is exactly. Why is it faster than SSHFS?

Using the Cache with NFS > Cache Limitations with NFS:

Opening a file from a shared file system for direct I/O automatically bypasses the cache. This is because this type of access must be direct to the server.

Which raises the question: what is "direct I/O", and is it something I use? This page calls direct I/O "an alternative caching policy" and the limited amount I can understand elsewhere leads me to infer I don't need to worry about it. Does anyone know otherwise?
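
From what I can gather, direct I/O means a program explicitly opens the file with O_DIRECT, which normal desktop apps don't do; about the only everyday tool that uses it is dd, and only when asked:

# read a file with O_DIRECT, bypassing the page cache entirely
dd if=bigfile of=/dev/null bs=1M iflag=direct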

The issue with syncing is usually needing to sync everything.

Yes, this is why syncthing proved difficult when I last tried it for this purpose.

Beyond the actual files, it would be really handy if some lower-level stuff could be cached/synced between devices, like thumbnails and other metadata. To my mind, remotely perusing the Desktop filesystem from Laptop should be just as fast as looking through local files. I wouldn't mind dedicating a reasonable chunk of local storage to keeping this available.

[–] [email protected] 2 points 3 months ago (1 children)

What would be the role of Zerotier? It seems like some sort of VPN-type application. What do I need that for?

rclone is cool and I used it before. I was never able to get it to work really consistently so always gave up. But that's probably user error.

That said, I can mount network drives and access them from within the file system. I think GVFS is doing the lifting for that. I've tried a couple of different ways, including with rclone; none seemed superior performance-wise. I should say the Desktop computer is just old and slow; there is only so much improvement possible if the files reside there. I would much prefer to work on my Laptop directly and move files back to Desktop for safekeeping when done.

"vfs cache" is certainly an intriguing term

Looks like maybe the main documentation is rclone mount > vfs-file-caching, specifically --vfs-cache-mode full

In this mode the files in the cache will be sparse files and rclone will keep track of which bits of the files it has downloaded.

So if an application only reads the starts of each file, then rclone will only buffer the start of the file. These files will appear to be their full size in the cache, but they will be sparse files with only the data that has been downloaded present in them.

I'm not totally sure what this would be doing, whether it is exactly what I want, or close enough. I am remembering now one reason I didn't stick with rclone, which is that I find the documentation difficult to understand. This is a really useful lead though.
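
If I'm reading the docs right, the invocation would be something like this (remote name and paths invented):

# mount the rclone remote with full file caching;
# "desktop:" is a placeholder for a configured remote
rclone mount desktop:/srv/media ~/mnt/media \
    --vfs-cache-mode full \
    --vfs-cache-max-size 20G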

[–] [email protected] 1 points 3 months ago (2 children)

I don't know what that means

[–] [email protected] 2 points 3 months ago (1 children)

if you delete a file on your laptop it will also be deleted on your desktop on the next sync

This is my fear! I have done it before.... Forgetting something is synced and deleting what I thought was "an extra copy" only to realize later that it propagated to the original.

[–] [email protected] 3 points 3 months ago (3 children)

hmm interesting idea. I don't get the impression that Nextcloud is reliably "easy"; it's kind of a joke how complex it can be.

Someone else suggested WebDAV, which I believe is the file-sharing protocol Nextcloud uses. Does Nextcloud add anything relevant beyond what's available from plain WebDAV?

[–] [email protected] 3 points 3 months ago (1 children)

A few weeks ago I put some serious time/brainpower into the network and got it waaaay smoother and faster than before. Finally implemented some upgraded hardware that has been sitting on a shelf for too long.

I tried iperf. Actually iperf3, because that's the first tutorial I found. Do you have any opinion on iperf vs iperf3? On Desktop I ran:

iperf3 -s -p 7673

On Laptop I am currently doing some stuff I didn't want to quit, so this may not be a totally fair test. I'll try re-running it later. That said, I ran:

 iperf3 -c desktop.lan -p 7673 --bidir

And what looks like a summary at the bottom:

[ ID] Interval           Transfer     Bitrate         Retr
[  5]   0.00-10.00  sec   102 MBytes  86.0 Mbits/sec  152             sender
[  5]   0.00-10.00  sec   102 MBytes  85.6 Mbits/sec                  receiver

I actually have AnotherDesktop on the LAN also connected via ethernet. Going from Laptop ---> AnotherDesktop gets similar to the above.

However going AnotherDesktop ---> Desktop gets 10x better results:

[  5]   0.00-10.00  sec  1.09 GBytes   936 Mbits/sec    0             sender
[  5]   0.00-10.00  sec  1.09 GBytes   933 Mbits/sec                  receiver

Laptop has an Intel Dual Band Wireless-AC 8260, whose max speed is 867 Mbps. It probably isn't the bottleneck. Although with the distro running at the moment (Fedora) I have a LOT of problems with everything, so possibly things aren't set up ideally here.

I still didn't upgrade the actual wireless access point for the network; I don't recall the max speed of the current WAP, but it could be around 100 Mbps, which would line up with the ~86 Mbits/s I'm seeing over wifi.

So this is an interesting path to optimize. However, I am still interested in solving the original problem, because even when I am using Desktop directly, things are slow. I do not really want to upgrade it if I can get away with a software solution. There are many items on my list of projects and purchases that I'd rather concentrate on.

[–] [email protected] 2 points 3 months ago

I've used WebDAV here and there. I found some aspects of set up frustrating so I tend to keep away from it except for smaller, short term use cases.

Does it do the caching thing or is it more of an alternative to SSH/SFTP?

If it's an alternative, what is the benefit?

IIRC WebDAV can be set up from inside certain file managers (like Nautilus with an extension installed), by using a web server like Apache, or by using smaller stand-alone services.
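
On Debian, the Apache route was roughly this if I remember right (you still have to add an Alias + "Dav On" block to the vhost config by hand):

# enable the WebDAV modules and give Apache a directory to serve
sudo a2enmod dav dav_fs
sudo mkdir -p /srv/dav && sudo chown www-data: /srv/dav
sudo systemctl reload apache2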

[–] [email protected] 1 points 3 months ago

I love this logo. <3

[–] [email protected] 28 points 3 months ago

By showing how you drew a comic about it then posted it to lemmy ofc

 

I really like the advanced find and replace in the Kate editor. You can optionally use regex and operate on multiple files.

Very importantly, it has a robust preview-changes ability. It is comfortable to use even with lots of hits across lots of files. So you do not need to apply a bunch of changes and hope you considered every permutation, as with a CLI tool like sed.
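
To be concrete, the CLI workflow I'm comparing against is roughly this ('colour' is just a stand-in pattern), where the only "preview" is eyeballing grep output:

# preview: list every match with file and line number
grep -rn 'colour' src/
# apply: no preview of the actual replacements, just hope
grep -rlZ 'colour' src/ | xargs -0 sed -i 's/colour/color/g'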

One thing that would really improve my life would be a tool like this which allows you to save search queries and options.

These don't work for me:

  • Kate has a history popup in the fields, which is somewhat helpful but limited. When trying out different queries you don't have a way to remember which one actually worked, so going by the history just ends in repeating the same errors over and over. Also it doesn't pair the "find" and "replace" fields, nor does it associate them with the other options like directory, etc.

  • Keeping notes in a text file is of course possible but cumbersome. I would like the computer to do work like that for me.

For single-file searches, regex101.com (non-FLOSS) and regexr.com (GPLv3) are great in-browser tools for learning, and you can save the search. But for operating locally on many files, they don't work.

Does anyone know any tools that do anything like this? I can find various utilities that operate on file names, but I am looking for file content. Certainly this exists, ya?

(Post image is a screenshot from the Kate website of Kate on Windows.)

 

Anytime I search for an add-on via the search box in Settings > Add-ons Manager, I get all these theme results. Here is a search for "syntax" (via the add-ons manager); I had to zoom way out to fit the long page into the screen cap:

I use themes personally to visually differentiate between profiles. And I have nothing against fun and frivolous user customizations. Am not hating on the concept.

I am curious why they are so aggressively pushed: they show up by default when searching for add-ons, and you need to toggle them off every time. Searching for an add-on that does something and searching for a theme that happens to include some keyword seem to me like totally different tasks, and mixing them up is a strange choice.

Is this like a major thing Firefox thinks people like about it? Do people like it?

 

I am learning some bash scripting.

I am interested to learn about getting input for my scripts via a GUI interface. It seems that yad (forked from zenity) is the most robust tool for this. (But if there is a better choice I would like to hear about it too.)

Is it possible to obtain 2 or more named variables using yad, rather than just getting the values by their positions ($1, $2, etc.) with awk? See the "What doesn't work" spoiler for those.

What doesn't work

I find how to obtain one named variable, for example:

inputStr=$(zenity --entry --title="My Title" --text="My Text:")

I also find solutions relying on opening single-variable dialogues sequentially but that's a terrible interface.

Everything else relies on chopping up the output with awk, based on the positions $1, $2, $3, etc. In this script, $jpgfile is obtained:

jpgfile=$(echo "$OUTPUT" | awk 'BEGIN { FS="," } { print $1 }')

This seems unmanageable because adding a new field or failing to provide input for a field will both change the output order of every subsequent value. It's way too fragile.

For a simple example, I want to ask the user for a file name and some content. Creating the dialogue is like this:

yad --title "Create a file" --form --field="File name" --field="Content"

If you fill both fields the output in the terminal is file|this is some text|. How do I get them into variables like $filename and $filecontent? So then I can finish the script like this:

touch "$filename"
echo "$filecontent" > "$filename"
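
The nearest thing I can cobble together is capturing the output and splitting it on yad's | delimiter with read, but under the hood that is still positional, not named:

# capture the form output; yad separates the fields with |
OUTPUT=$(yad --title "Create a file" --form --field="File name" --field="Content")
# split into variables; the order still depends on field positions
IFS='|' read -r filename filecontent _ <<< "$OUTPUT"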

Is this possible??? I do not find it anywhere. I looked through all kinds of websites like YAD Guide, the yad man page, smokey01. Maybe I missed something. On yaddemo I read about bash arrays and it seemed to come close, but I couldn't quite piece it together.

 

cross-posted from: https://discuss.tchncs.de/post/9585677

My dream: I want a way to arbitrarily close and later open groups of applications including their states such as open files, window arrangement, scrollback, even undo histories etc. So working on a specific project I can close everything neatly and return to it later.

In my research/experiments, here is what I've come up with. Do you agree?

  1. in the terminal-only environment this would be tmux or another multiplexer (see the sketch after this list)

  2. But when you start including GUI applications (which I must), then it is something else that doesn't exactly exist

  3. Applications store their current states in a variety of places and some of them don't really do restoring in any way so it would be hard to force.

  4. the best option for this is something like xpra where you can have multiple sessions. If you had a machine that stayed powered-on all the time it might be possible to create sessions, log in remotely and use them that way.

  5. Using xpra or similar, the sessions are never really actually closed. You would only close the connection from the local machine. If the machine faces a power-off then too bad. As far as I can see there is basically no way to accomplish this goal where power-offs are accommodated.
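
For item 1, the tmux flavor is the part I do understand. Something like:

tmux new -s myproject      # start a named session for the project
# ... work, detach with Ctrl-b d, even log out ...
tmux attach -t myproject   # everything is as you left it (until a reboot)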

I have tried some remote-login options but they are too slow for normal use. I tend to have pretty low-end hardware running (because so far it works for most things) so maybe if I upgraded it would improve.

  1. is it plausible?
  2. how to estimate hardware/performance needs of host, client and LAN? anything else to consider?

I typically use manjaro + XFCE but would be willing to try something different to accomplish the goal. I only want to do this locally on LAN not remotely.

re XFCE session manager

XFCE has session management but the majority of programs don't totally work with it. Like maybe the application will re-open when the session is restored, but no files will be open even if they were when the session was saved. Or distribution across workspaces, window size, etc. will not be restored.

 


28 points | submitted 8 months ago* (last edited 8 months ago) by [email protected] to c/[email protected]
 

I am really struggling to replace Facebook Messenger / WhatsApp for a few casual conversations. My friends and I all want to move away. We are not heavy users, but we need it to work. I think the requirements are:

  • FLOSS clients for Android, Linux, Windows

  • persistent history across devices

  • reasonable security

  • no need to self-host the server

  • can send a message to an offline user; they get it when they come online

  • not tied to or reliant on a phone number / cell service

  • ETA: end-user documentation explaining how to set up and common troubleshooting

tried:

  • Matrix: the thing with having to keep track of room keys and stuff is too complicated. Every time someone uses a new device there are a ton of issues, and we could never quite get it ironed out

  • Signal: tied to a phone number, no history across devices

  • XMPP: similar to Matrix, the key situation is confusing; also no cross-device history

  • ETA: SimpleX: a lot of people here are mentioning SimpleX. It didn't come up in previous investigations so I will give it a shot.

    • ETA 2: It doesn't seem to have persistent history across devices. Clarification?

I actually didn't think this would be such a problem, but it is breaking us. We don't need a lot of the sophisticated features like voice, video, moderation, 1000s of participants, spam protection, etc. that seem to be the focus of the projects. Just simple text chat.

 

I really like comparison tables on wikipedia but find them hard to navigate.

For example: Comparison of web browsers > General Information

Say I want a web browser for Linux which has been recently updated. I can sort by the "Platform" column, or by "Latest release: Date" but not both.

Sometimes tables can be very wide and/or very tall. Once you get to scrolling, it is impossible to see either the row or column headings, so you can't tell where you even are in the table. Example: Table of AMD processors. Also, tables can have complex structures with merged headings and content.

Ideally I would like to apply some basic spreadsheet-type operations like hiding rows/columns, filtering, sorting by multiple columns etc. Even if there was a way to easily get the table into an actual spreadsheet that would be helpful. I tried some extensions that export tables to other formats but nothing worked without a lot of cleanup.

Is there some kind of trick or tool or extension that makes these ginormous tables useful? I can't tell how people even add information to these things, they are so large.

 

