
    Posts made by guyinpv

    • RE: Best backup strategy for NextCloud?

      @obsolesce said in Best backup strategy for NextCloud?:

      What's your retention policy for data on NextCloud? I think that would help determine your backup method and software (along with RTO and RPO policies) and several other factors of course.
      At what point in time should you be allowed to say "Sorry, our backups don't go back that far, so we can't recover that file from that period of time"? As well as "The oldest copy I have is from XXX, and the most recent is from YYY. It will take XXX long to restore it, and cost XXX to restore it."?

      I don't have any of that 😉

      Essentially, VULTR's automatic backup only retains the latest 2 images. To me that's a little shallow; I'd like to push it to a week at least. For example, if a virus or something screws up all our files late on Friday, I don't want to come in on Monday and find that our oldest backup is still screwed up. I haven't been able to find out whether VULTR lets me download or transfer a backup on my own; if it does, I could at least push that to another storage service.

      Assuming I can't download an image, that leaves me having to do my own "standard" NextCloud backup procedure, grabbing my files and DB manually.

      I'm OK with setting up this manual backup, but I'm still curious about doing it incrementally just to save space. I'm just not sure how incrementals will work with the rest of the NC files and the database. Do I dump the DB and include it with each incremental backup? And if so, how does that help if I need to restore a bunch of files from an incremental? I don't think that would work. Or maybe that situation would never happen?
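
      Something like this is roughly what I'm picturing, as a rough sketch only; I'm assuming the data directory is /var/www/nextcloud/data, a MySQL database named "nextcloud", the web user www-data, and snapshots kept under /backup (all of those are placeholders, not my actual setup):

      ```
      #!/bin/bash
      # Rough sketch of a nightly incremental NextCloud backup.
      set -euo pipefail

      TODAY=$(date +%F)
      DEST=/backup/$TODAY
      LATEST=/backup/latest
      mkdir -p "$DEST"

      # Hold the server still so the DB dump and the file copy stay consistent
      sudo -u www-data php /var/www/nextcloud/occ maintenance:mode --on

      # Dump the database alongside this snapshot (credentials via ~/.my.cnf)
      mysqldump --single-transaction nextcloud > "$DEST/nextcloud-db.sql"

      # Incremental file copy: unchanged files are hard-linked against the
      # previous snapshot, so each day only costs the few MB that changed
      rsync -a --delete --link-dest="$LATEST" /var/www/nextcloud/data/ "$DEST/data/"

      sudo -u www-data php /var/www/nextcloud/occ maintenance:mode --off

      # Point "latest" at the new snapshot and prune anything older than 14 days
      ln -sfn "$DEST" "$LATEST"
      find /backup -mindepth 1 -maxdepth 1 -type d -mtime +14 -exec rm -rf {} +
      ```

      Each dated folder plus its matching DB dump could then be pushed to B2 (or anywhere else) as a unit.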

      @jaredbusch said in Best backup strategy for NextCloud?:

      BackBlaze B2 would be good, or any other provider that you can script an rsync to. It sounds to me like you do not want backups; you just want a copy offsite.
      If you want actual backups that is different. Then things like the script @black3dynamite mentioned or URBackup as @DustinB3403 mentioned become the best choice.

      We have a B2 account. It's really for archival storage, but I guess that doesn't matter; I can still delete older backups from the account without any issues, right?

      @emad-r said in Best backup strategy for NextCloud?:

      @guyinpv

      The thing most people forget with NC is that it depends on the user putting the files into the NC folder that gets created; if the user does not put the files into the sync folder, nothing gets backed up.

      Am I saying this right? Or is there a way to force backup, like mapping the sync folder to C:\Users? But that is not what NC is made for...

      My theory is: if you create an NC folder for users on the desktop and they don't use it, there are no backups.

      For us, NC is not for backups, it's for file sharing. We have a different backup system for workstations.

      For that matter, I could cheat this whole thing and just back up the NC files daily as part of the normal workstation backup, but even those are one-week incrementals. At least it would be something: I'd have a week's worth of NC changes on a local workstation to pull files out of if ever needed.

      Or, likewise, I could create a second backup job on my workstation to do NC-only incrementals to our NAS.

      posted in IT Discussion
    • RE: Need to track what PHP script is generating a file on nix

      I thought maybe just a simple stack trace log that could be "turned on" in Apache and/or PHP temporarily, for a few days, then turned back off.

      Logging all PHP functions for multiple days would likely produce a mountain of data, so I'd have to figure out how to save that and search it.

      Wouldn't the Zend engine or some other PHP diagnostic monitoring tool be able to do this? I think it's something that should be doable with Apache/PHP tools rather than underlying OS tools, but I don't know.
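
      If Xdebug counts as one of those tools, its function tracing looks like roughly the right shape. A rough sketch of what I mean, assuming Xdebug 2.x is installed and PHP picks up extra ini files from /etc/php.d/ (paths and the restart command vary by distro/cPanel, so treat these as placeholders):

      ```
      # Turn on Xdebug function tracing temporarily (this generates a LOT of data)
      printf '%s\n' \
        'zend_extension=xdebug.so' \
        'xdebug.auto_trace=1' \
        'xdebug.trace_output_dir=/var/log/php-traces' \
        'xdebug.collect_params=1' > /etc/php.d/zz-trace.ini
      mkdir -p /var/log/php-traces && chmod 1777 /var/log/php-traces

      # Restart PHP/Apache, let it run for a day or two, then delete the ini
      # file and restart again to switch tracing back off.
      ```

      In theory, the trace files' timestamps could then be matched against the mystery files' timestamps to see which script was running at the time.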

      posted in IT Discussion
    • Best backup strategy for NextCloud?

      I'm running NC on a VULTR box. I'll have their built-in backup system doing dailies I suppose, but they only keep 2 copies if I remember right.
      There are also snapshots I can take at any time.

      My question is, for something like NC, there are really two things going on. There is the server and software itself, the NC "image" so to speak. And then the huge dump of all the files we store there.

      I feel like doing a daily backup of the entire file system is a bit aggressive. Overkill, if you will.

      Let's say we have 30GB of files in there. In a given day, we might edit 50 files to the tune of 10MB of changes.

      Part of me says, do some kind of incremental backup just to get the changed files and store those by date.

      If I only store changed files, I could back up multiple times a day, or even do something more akin to revisions. Of course, NC itself stores revisions, but I just mean a differential backup of only changed files.

      Anyway, my point is, it seems wasteful and time-consuming to do a full system backup daily when I only truly need about 10MB of changed files since the last image. It's not like the OS and NC software are changing; only the files and DB are.

      I don't know how to set something like that up; it's probably not even possible since the files are so closely linked to the NC database. Copying files in and out directly would mess up the database, right? And if I want to restore something like a deleted folder from 2 weeks ago, it probably wouldn't be possible just by copying the files back without also restoring the DB parts. No?
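
      From what I've read, copying files back in isn't necessarily fatal as long as you tell NC to rescan afterwards. A rough sketch of restoring one folder from a dated snapshot, assuming snapshots live under /backup and NC is in /var/www/nextcloud (the user and folder names here are made up):

      ```
      # Copy the folder back from the snapshot into the user's data directory
      rsync -a /backup/2018-04-01/data/alice/files/ProjectX/ \
               /var/www/nextcloud/data/alice/files/ProjectX/

      # Ask NextCloud to re-register the restored files in its database
      sudo -u www-data php /var/www/nextcloud/occ files:scan --path="alice/files/ProjectX"
      ```

      But that is exactly the sort of thing I'd want someone to confirm before relying on it.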

      Am I stuck with only doing a 30-40GB daily full system image, or can I do something more interesting with my backup strategy?

      Note that on VULTR, you can only do a full system restore, so I wouldn't easily be able to jump in some backup system somewhere and look for a missing file from 2 weeks ago that was accidentally deleted.

      That sort of thing would be nice though, to do a "restore" at the file level, assuming the missing file is not in the NC trash bin.

      I guess I just want a couple of levels to the backup plan: a full system image in case of catastrophe, but also a kind of backstop behind the NC trash bin. Store my own 2 or 3 weeks of files so I can go back in time and find anything that's gone missing. I certainly don't want to do a full system restore just to look for a few files, as this would wipe out all the other files added since that backup. I could never really do a full restore in this sense; I don't need the entire server restored, I just need the latest few changes.

      I'm trying to picture a situation where this would be needed. I suppose it's simply that someone accidentally deleted a file days ago. I assume that when you delete a file in NC you also lose all its revisions. But there is no way I'm restoring a multi-day-old full system image to look for one or two files. I need a way to browse in a more "partial" way for specific files, assuming the trash bin doesn't apply.

      posted in IT Discussion
    • RE: Need to track what PHP script is generating a file on nix

      @dbeato I should be able to install and use any tool I want; it's a VPS.

      But I don't think monitoring the folder will work. At best it would only be able to tell me that the PHP process wrote a file, not which script did it. I would need some kind of application monitor that watches all the PHP scripts as well as when they write files to that tmp folder.

      posted in IT Discussion
    • Need to track what PHP script is generating a file on nix

      I've got a LiquidWeb VPS with WHM/cPanel and a couple sites on there.

      Some part of the WordPress site hosted on there is creating random files in the /var/tmp folder, to the point where it will fill up over time and run out of space. All the files are about 5MB in size, and they all follow the same naming pattern: "php" followed by 6 random characters, with no file extension.
      Opening a file shows only binary gobbledygook with no real clues about what it's for.

      The files are created as the user/group of the cPanel account. And as far as I've seen, about 4 or 5 of these files are created per day, at various times.

      No errors appear in normal PHP error logs to match, so it does not appear to coincide with any kind of error dump. The files are also not generated through normal means by the PHP or Apache or MySQL processes.

      My best guess is that they are created by a poorly written plugin. It's creating the files and not cleaning them up.

      I have attempted to match up the timestamps of the files' creation dates with the raw access logs of the server. I went line by line, opening every URL endpoint from the access log and then checking whether a file was created, but this did not work; I couldn't get one to generate by going down the access log.

      So that leaves my final method. I need some way to backtrack: to detect when such a file is written and somehow record or log what process or code is writing it.
      I need some kind of Linux tool to monitor the file operations of the WP site and log whenever a file gets created in the /var/tmp folder, and by what script.
      Not sure if this is even possible.
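
      Something like the kernel audit subsystem might be able to do it. A rough sketch, assuming auditd is installed (I haven't tried this here yet):

      ```
      # Watch /var/tmp for writes and attribute changes, tagged with a key
      auditctl -w /var/tmp -p wa -k php-tmp-files

      # ...wait for a few of the 5MB files to appear, then search the audit log
      ausearch -k php-tmp-files -i | less
      ```

      Each matching record should show the PID, the executable (php, php-fpm, etc.) and the working directory, which would at least narrow down which site is responsible; matching the timestamp back against the access log could then point at the specific request.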

      If not, or if it would consume way too many resources to monitor all file operations, I'm open to other ideas. But right now, some stupid code somewhere is writing endless 5MB files to /var/tmp and filling it up over and over.

      Since this is a production site, I really can't experiment by messing with WP, like turning off plugins, changing the theme, and stuff like that.

      Any ideas?

      posted in IT Discussion
    • Best virtualized environments for freelance web dev?

      So I'm dealing with a common problem, and that is creating dev environments to match client environments. The projects are basically either clones of websites, or custom apps or scripts.

      I'd say 80% of the time, the environments are common, either a PHP 5.6 or 7.0+ web server using relatively late model Apache/Nginx and MySQL.

      But what I'm trying to sort out is:

      1. Unique dev environment for each client to hold their project.
      2. Ability to easily access this environment and do work from any location (at least 3; home, office, laptop/mobile).
      3. Ability to spec the environment to match the client's. Namely this would boil down to which versions of the web stack are used.
      4. Ability to package up and archive the projects, or spring them back into action if further work is ever needed.
      5. Provide limited public access to the project, for example giving the client access to particular parts of the project, or a final build folder, etc. Ability to provide a public URL to let them test things out and see progress.
      6. Easy to maintain and manage. Starting a new project and choosing the environment shouldn't take a half day of writing scripts and configs and fooling around trying to get it to work. Backing them up, archiving, restoring, changing the environment even, should be easy to do.

      So this sounds like a job for virtual machines. Of course, something like Vagrant, VirtualBox, etc. But I have to wonder, just to get an alternate version of PHP and Apache, do I really require an entire virtual computer? I definitely don't want overkill as the solution.
      But I also want to be able to work on multiple projects at once, so I can have a PHP 7 project open over here and a PHP 5.6 project open over there without them conflicting. So they MUST be isolated somehow into a container, but I do wonder if it must be the entire virtual hardware and operating system. But maybe it should be? One client might be on CentOS and one on Ubuntu, so that has to be taken into account.

      I know that hosting companies now let you do things like switch PHP version on a per-folder basis, but those folders still share the rest of the stack.

      I also want this to be portable, as mentioned, so doing Vagrant on my home desktop doesn't really help if I want to work on my laptop at the coffee shop.

      But then you might say: the Vagrant config is portable, so store that in the cloud somewhere, then you can just pull it from the laptop and create the VM. Yes, but then I also need to pull the data. So maybe the data within each VM also gets pushed to a Git server, and then I can pull and push to my heart's content, pulling and pushing Vagrant configs and so on.
      This is not sounding fun.

      I could move the entire thing to the cloud: use VULTR or similar and just spin up a new box for every client, then use a provisioning service or something to get the correct stack of tools installed.
      That way at least I could access it from any computer. But the backups, archiving, remote access for the client, and everything else would still be a pain to do manually on a fresh VM.

      So then there are provisioning services that sit between me and my VPSes. All the ones I've seen cost a bit to use.
      If I'm going to create this entire unique environment for each client, I can't be spending hundreds a month to manage each one. Nor can I spend anything while a project is idle, archived, etc.

      There are other technologies like Docker and Bitnami and non-VM container systems, but I don't know if those are just for apps, or if they can contain the entire LAMP stack. And even so, it would have to meet all the other criteria.
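
      From what I've gathered so far, the container route could look something like this. A rough sketch assuming Docker and the official php images (the ports and project paths are made up), and I haven't verified it ticks the rest of the boxes on my list:

      ```
      # Two client projects on different PHP versions, side by side on one machine
      docker run -d --name clientA -p 8081:80 \
        -v ~/projects/clientA:/var/www/html php:5.6-apache

      docker run -d --name clientB -p 8082:80 \
        -v ~/projects/clientB:/var/www/html php:7.2-apache
      ```

      Each project would still need its own MySQL container and config to really match production, and archiving would come down to keeping the project folder plus whatever commands recreate the containers.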

      So that is what I'm researching: how to create containerized environments for client projects without a ton of complexity, without being locked to a particular workstation, etc. The main goals are to match each client's production environment, create separation between projects, and manage active and inactive projects, backups, and archives that can be activated again for future work. And to provide the client some degree of access for testing and for pulling out the assets when finished.

      posted in Developer Discussion
    • RE: Looking for a self-hosted file share tool

      @jame_s said in Looking for a self-hosted file share tool:

      @guyinpv Odd, I have had no issues with clients; everything is done via HTTPS. As far as pricing goes, I use the free version and have about 30 users. No stability issues whatsoever.

      Isn't the free version limited to 3 users?

      I'm sure a lot of my issues came from trying to run it on an internal server. Doing HTTPS and dealing with virtual networking and routing and everything didn't pan out. I could probably do it better on a cloud VPS.

      Regardless, NextCloud does have a couple of advantages over SeaFile, but that goes the other way around too, so... pick your poison!

      posted in Water Closet
    • RE: Looking for a self-hosted file share tool

      Speaking of deleting cache entries: this seems to be exactly what the occ command files:cleanup is for.
      It even reports that way:
      0 orphaned file cache entries deleted

      "orphaned file cache entries" seems like exactly what I found, it had orphaned entries for the group share folder. When I deleted them, that seemed to me to be "deleting orphaned file cache entries."

      So I guess maybe the files:cleanup tool is not functional, or that is where the bug really lies if it's not working?

      It wouldn't help to put this in a cron job if it doesn't work anyway.
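
      For reference, this is roughly how I'm running it, and how I'd schedule it if it actually did anything (assuming a stock Ubuntu/Apache install with the web user www-data and NC in /var/www/nextcloud):

      ```
      # Manual run
      sudo -u www-data php /var/www/nextcloud/occ files:cleanup

      # Nightly cron entry (e.g. in /etc/cron.d/nextcloud-cleanup)
      # 30 2 * * * www-data php /var/www/nextcloud/occ files:cleanup >> /var/log/nc-cleanup.log 2>&1
      ```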

      posted in Water Closet
    • RE: Looking for a self-hosted file share tool

      What I did by deleting rows in the filecache table referencing the group share seemed to work. I just can't be sure there aren't any other pieces or references anywhere else in the DB. Hoping it doesn't lead to bugs later if there are still leftovers.

      @jame_s said in Looking for a self-hosted file share tool:

      @guyinpv seafile

      https://www.seafile.com/

      Been using it here for quite a while.

      Each user here has an upload link I store in Active Directory, and it is added to each user's signature.

      I tested it for a while on our internal server. I had a heck of a time getting clients to connect, though: lots of troubleshooting of routing, DNS, ports, etc., and it was never really stable. Perhaps it would be better on a VPS instead of internal. But there are other reasons I didn't go with it, like the pricing model. Some info on that here: https://mangolassi.it/topic/9882/why-would-you-chose-nextcloud-over-seafile/17

      Seafile had some great concepts; the desktop tool was pretty nice, with fast sync. But it's not quite what I was looking for.

      posted in Water Closet
    • RE: Looking for a self-hosted file share tool

      @jaredbusch said in Looking for a self-hosted file share tool:

      @guyinpv I have never seen this issue.

      I have been using ownCloud since version 7 and switched to Nextcloud with version 10/11.

      The steps to reproduce it are at the top of this bug report: https://github.com/nextcloud/server/issues/3502

      I'd be curious whether you've tried it. The very last comment on that bug report is from a person on version 13 who experienced it. And I did too, on 13.0.1.

      The main difference in my case is that I used the group folders plugin, which, I believe, makes use of NextCloud's ability to connect to external storage. So I'm not sure whether this is a bug with normal shares, or only with external storage shares.

      Whatever the case, if anybody reads any of this, I would definitely stay away from the group folder plugin.

      posted in Water Closet
    • RE: Looking for a self-hosted file share tool

      I think I fixed it, but who knows.

      When I look at entries in the filecache table, they still reference the original group folder; here is a sample query:

      [screenshot: filecache rows still referencing the group folder]

      That group folder doesn't exist any more; it was deleted through normal means in the web interface and the group folder UI, and then the plugin itself was removed. So this definitely seems like a bug, since I didn't remove the folder in any strange way; it was all done in the web interface.

      I took a chance and simply deleted all rows that referenced "groupfolders": a nice 13,631 rows deleted. Now the same query doesn't show any entries for "groupfolders".
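
      For anyone curious, the delete was roughly this shape (the default oc_ table prefix is assumed here, and obviously back up the DB before trying anything like it):

      ```
      mysql nextcloud -e "DELETE FROM oc_filecache WHERE path LIKE '%groupfolders%';"
      ```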

      A quick refresh of the web interface and all the duplicates disappeared, now looks normal.

      [screenshot: web interface after the cleanup, no more duplicate entries]

      You'll notice in my earlier screenshot, the interface was outputting the entire path, rather than just the relative folder name, which was weird. But now it's only showing the folder names, not the whole paths.

      I don't know if I want to trust such buggy software; it can't even keep its own filecache in order when moving folders between shares. ownCloud and NextCloud both seem to be dealing with this inconsistency, and the filecache-cleaning command-line tool doesn't fix it.

      posted in Water Closet
    • RE: Looking for a self-hosted file share tool

      @bnrstnr said in Looking for a self-hosted file share tool:

      @guyinpv Just create a share with an admin account that will never be deleted, then you can create a group for all of your users and add the group to the share as shown in the pic. The folder will automatically appear for everyone in that group.

      You can also use the normal link sharing rules with this, so every user shares the same link.

      https://i.imgur.com/Q69nHOi.png

      This is exactly what I did, right out of the box.
      But then I found this: https://apps.nextcloud.com/apps/groupfolders

      So I created one of those. Then I found the features of group folders didn't work for me. For example, the trash bin doesn't work at all for group folders.

      So I went back to method #1. Created a share linked to a group.

      Really guys, nothing weird going on here. Then all I did was select all the files in the one folder (the group share) and use the button called "move" right in the interface to move them to the normal share, which seemed to work. And then I saw all the duplicates. Here is a screenshot of how the web interface looks:

      [screenshot: web interface showing duplicated folders and files]

      @wirestyle22 said in Looking for a self-hosted file share tool:

      Whatever that long-winded post is up there, I have no idea. That's an incredibly convoluted way to do that.

      I don't see how it's convoluted to want to move files and folders from one folder into another. You'd think a program meant for file management could do such trivial tasks without screwing up its own cache or indexes or whatever went wrong.

      Anyway, I looked in the file system and there are definitely no duplicates there. It looks as it should.
      I ran the occ command files:scan for all users; no errors were reported. Then I ran files:cleanup to clean the filecache.
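
      Concretely, that was along these lines (assuming the web user www-data and NC in /var/www/nextcloud):

      ```
      sudo -u www-data php /var/www/nextcloud/occ files:scan --all
      sudo -u www-data php /var/www/nextcloud/occ files:cleanup
      ```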

      I didn't see any other commands that seemed relevant.

      Anyway, I logged in and the duplicates are still showing. Grrr.

      Then I went in and deleted the previous group share folder (it is empty anyway) and then removed the group share plugin entirely.

      At this point there are no clients connected, I'm only working in the web interface.

      Others have seen this problem too: https://help.nextcloud.com/t/files-and-folders-shown-twice-in-web-ui/12416/2

      https://help.nextcloud.com/t/duplicate-files-delete-not-possible/8869

      https://github.com/nextcloud/server/issues/3502

      Looks like ownCloud was dealing with the same issue as well: https://github.com/owncloud/core/issues/28018

      posted in Water Closet
    • RE: Looking for a self-hosted file share tool

      @jaredbusch said in Looking for a self-hosted file share tool:

      @guyinpv said in Looking for a self-hosted file share tool:

      @jaredbusch Cool! 120 users connected and yet it reports about 800MB RAM. Mine is on PHP 7.0.28, MySQL 5.7.21, Ubuntu 16.04. That makes me feel a little better.

      With as many file sync tools as I've tried over the years, the single most important feature becomes stability. As long as the thing keeps syncing and stays accurate, I'll deal with the rest.

      You'd be surprised how quickly I get myself into trouble.

      I found this app for "Group Shared Folders" so that I could have a master share that everybody uses; it seemed like a good idea at the time. But then I found out the deleted-files folder (trash can) doesn't work on group shares.

      I created another admin user to act as a master account that I could put a share in for the whole office. Then I used the NC web interface to move all the files from the group share into the new share. This seemed to work fine.

      On my test workstations, I had turned off sync so I could move the files locally as well; I want to avoid using a ton of bandwidth if I can help it.

      When I turned sync back on, it seemed to go just fine until I noticed in the web interface that almost every folder was duplicated! Lots of files appear duplicated as well. But this doesn't carry over to Windows; there I only see one copy of everything.
      When I open the sync client, the duplicates show up in the folder selection tree.

      Now it gets weirder. In the web UI, if I click the share icon on one of the duplicate folders/files, the duplicate magically disappears, just by viewing the share info in the sidebar. But if I refresh the folder or click on the folder name, all the folders/files reappear as duplicates. This is freaking bizarre. It kinda goes against my need for a stable and robust system.

      I have no clue WTF you are doing.

      But it sounds like your browser shit.

      Look at your data store directly.

      If it is all good and it still looks bad in a new porn mode browser, then have it rescan the files from the command line.

      haha, what does that mean?

      Like I said, I originally did a "group share"; this is a feature provided by a plugin.
      When the features of a group share didn't work out, I went back to just having a regular share owned by a user.
      In the web interface, I selected all the files in the group share and used the "move" link to move them into the new share.
      Maybe I'm going mad, but this seems like a pretty standard operation: move some files and folders from one share into another??

      But now all those folders/files appear duplicated.

      What are the magic commands to force things to rescan at the command line?

      posted in Water Closet
    • RE: Looking for a self-hosted file share tool

      @jaredbusch Cool! 120 users connected and yet it reports about 800MB RAM. Mine is on PHP 7.0.28, MySQL 5.7.21, Ubuntu 16.04. That makes me feel a little better.

      With as many file sync tools as I've tried over the years, the single most important feature becomes stability. As long as the thing keeps syncing and stays accurate, I'll deal with the rest.

      You'd be surprised how quickly I get myself into trouble.

      I found this app for "Group Shared Folders" so that I could have a master share that everybody uses; it seemed like a good idea at the time. But then I found out the deleted-files folder (trash can) doesn't work on group shares.

      I created another admin user to act as a master account that I could put a share in for the whole office. Then I used the NC web interface to move all the files from the group share into the new share. This seemed to work fine.

      On my test workstations, I had turned off sync so I could move the files locally as well; I want to avoid using a ton of bandwidth if I can help it.

      When I turned sync back on, it seemed to go just fine until I noticed in the web interface that almost every folder was duplicated! Lots of files appear duplicated as well. But this doesn't carry over to Windows; there I only see one copy of everything.
      When I open the sync client, the duplicates show up in the folder selection tree.

      Now it gets weirder. In the web UI, if I click the share icon on one of the duplicate folders/files, the duplicate magically disappears, just by viewing the share info in the sidebar. But if I refresh the folder or click on the folder name, all the folders/files reappear as duplicates. This is freaking bizarre. It kinda goes against my need for a stable and robust system.

      posted in Water Closet
    • RE: Looking for a self-hosted file share tool

      With all our files uploaded, and just me and my test laptop connected, it's using about 360MB RAM.

      posted in Water Closet
    • RE: Looking for a self-hosted file share tool

      I've put NodeQuery on it to monitor resource use; it'll alert me if anything goes above 80%. I'll be curious to see how it behaves as I add users to it.

      posted in Water Closet
    • RE: Looking for a self-hosted file share tool

      @scottalanmiller Interesting. Well, if that's the case, I'll just bump up to the $10 plan. Still cheaper than any other service charging $5 to $10 per user.

      I gave Turnkey Cloud a try, and boy, was that a joke.

      posted in Water Closet
    • RE: What Are You Watching Now

      Been watching this one, latest binge.

      [embedded YouTube video]

      At first I didn't care for the acting, but it's grown on me. It's also annoying that every character has to invent a different kind of accent. Jared Harris is awesome, but I don't know about this weird British-slash-Chinese accent he's trying to pull off.

      Overall a good story, though it can be hard to follow if you don't remember the detailed bits.

      posted in Water Closet
    • RE: Looking for a self-hosted file share tool

      @scottalanmiller said in Looking for a self-hosted file share tool:

      Until recently we used Vultr and Fedora for NC and it worked well. RAM was tight for sure, though.

      The $5 box on VULTR is 1GB now. Do you think we'd stretch that with about 12 users and moderate activity on mostly Word/Excel files, about 13GB total?

      posted in Water Closet
    • RE: Looking for a self-hosted file share tool

      I typically use VULTR, and they have it as a default app, so I just ran that. It installed without a hitch on Ubuntu 16.04.
      I did some initial configuring and created a couple of users. Now I'm uploading a few gigs of some of our files.

      Anything I need to know about running this? Troubleshooting common issues? Ways to make it perform better? Tricks or tips?

      Note that our users don't use the web interface; I couldn't pay them enough to make them use a web interface for file management. All that matters is how robust the Windows sync tool is.

      posted in Water Closet