    Pete.S
    @Pete.S

    Reputation: 1566
    Profile views: 2312
    Topics: 272
    Posts: 3492
    Best: 1093
    Controversial: 10
    Followers: 0
    Following: 0
    Groups: 0

    Best posts made by Pete.S

    • How to install and run Geekbench 4 on Linux

      If you want to run Geekbench 4 on a Linux server, this is how to install and run it.
      Note that you need a working internet connection on the server.
      You can run it as root or as any other user.

      Let's start from the home directory and put the files there.
      cd

      Download the files from geekbench.com:
      (change version number if needed for latest version)
      wget http://cdn.geekbench.com/Geekbench-4.3.3-Linux.tar.gz

      Extract the downloaded files:
      tar -zxvf Geekbench-4.3.3-Linux.tar.gz

      Go to the extracted folder:
      cd Geekbench-4.3.3-Linux

      Run the test in tryout mode; results are uploaded automatically:
      ./geekbench_x86_64

      After a few minutes the test completes and you'll see a link to a webpage that is unique to each test.

      Upload succeeded. Visit the following link and view your results online:
      https://browser.geekbench.com/v4/cpu/1234567

      Just enter the link in any browser and you'll see the results of the test.
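
      If you want to run everything in one go, here is a minimal shell sketch of the same steps (the version number and URL are the ones from above and may need updating for a newer release):

      #!/bin/sh
      # download, extract and run Geekbench 4 in tryout mode (results are uploaded automatically)
      VERSION=4.3.3
      cd ~
      wget http://cdn.geekbench.com/Geekbench-$VERSION-Linux.tar.gz
      tar -zxvf Geekbench-$VERSION-Linux.tar.gz
      cd Geekbench-$VERSION-Linux
      ./geekbench_x86_64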

      posted in IT Discussion geekbench
      Pete.S
    • RE: NVMe and RAID?

      @dbeato said in NVMe and RAID?:

      One of the first Dell Servers with Hotswap NVME was the R7415 so yeah
      https://www.dell.com/en-us/work/shop/povw/poweredge-r7415

      Not sure what others have seen.

      The newer ones have a 5 in the model number, so R7515, R6515, etc.
      Those are the ones you want to buy, with AMD EPYC 2 Rome CPUs.

      Dual-socket models are R7525, R6525, etc.

      And to make this complete: the 6 means 1U and the 7 means 2U, so R6515 is 1U and R7515 is 2U.

      posted in IT Discussion
      Pete.S
    • RE: Macbook Air for College

      @jasgot said in Macbook Air for College:

      Daughter wants a Mac laptop for college. Any suggestions?

      Yes, take her to the store and buy the one she wants.

      Buying an Apple product is not a technical issue that needs to be figured out. It's an emotional issue. Like a Gucci bag.

      posted in IT Discussion
      Pete.S
    • SAS expanders explained

      If you have a RAID controller with 8 ports, you can connect up to 8 SAS/SATA drives directly to the RAID controller. That's fine, but if you have, say, a server with 36 drive bays, you would need a 36-port RAID controller. Those are hard or maybe even impossible to find.

      Well, here is where the SAS expander comes into play. It will work somewhat like a network switch but for SAS/SATA ports.

      [image: sas_expander.png]

      The SAS expander IC can be integrated directly on the backplane of the drive bays or it can be a standalone card or PCIe card. These are often used when you have more than 8 drive bays and even more so when you have 16 or more drive bays.

      It allows you to expand the number of drives the RAID controller is able to connect to. It's transparent to the user because the RAID cards have integrated support for SAS expanders. This is also true of HBAs (Host Bus Adapters).

      The only drawback is that the maximum transfer rate is, as always, limited by the PCIe link to the RAID controller card, but also by the SAS connections between the RAID controller and the SAS expander. In real life these limits are seldom actual bottlenecks, since it's uncommon to read from all drives at the same time, and the drives themselves are often the slower part, especially HDDs.
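
      As a rough, illustrative calculation (my own numbers, assuming SAS-3 at 12 Gbit/s per lane, roughly 1.2 GB/s usable per lane after encoding overhead, and HDDs doing about 250 MB/s sequential):

      controller to expander over two x4 wide ports = 8 lanes x 1.2 GB/s ≈ 9.6 GB/s
      36 HDDs x 250 MB/s ≈ 9 GB/s

      So even if every drive streamed at full sequential speed at the same time, the links to the expander would barely be the limit, and real workloads rarely get anywhere near that.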

      SAS expanders are also used heavily in external JBOD chassis: expansion enclosures full of drive bays that you connect to a server so you can attach more drives than fit in the server's own enclosure, aka Direct Attached Storage (DAS). In that case the SAS expander sits inside the JBOD chassis.

      posted in IT Discussion sas sata sas expander
      Pete.S
    • RE: How to Secure a Website at Home

      @hobbit666 said in How to Secure a Website at Home:

      Why not GitHub or GitLab for free?

      That was part of the etc 😁😁😁😁
      Also I thought GitHub was more for storing scripts and opensource stuff.

      It's not generic website hosting, as you don't have the control you would have on a normal webserver.

      It's simplified hosting, and GitHub/GitLab Pages were initially intended to complement the projects hosted there, so it would be easy to make an HTML website from the git repositories, for instance for documentation.

      Since you can store any files on GitLab/GitHub, you can of course also use Pages for any type of static website.

      Here is how to get started in the simplest way possible:
      https://guides.github.com/features/pages/
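
      If you want to skip the guide, here is a minimal sketch of the same idea (the repository must be named <username>.github.io; <username> is a placeholder for your own account):

      # create a repository named <username>.github.io on GitHub first, then:
      git clone https://github.com/<username>/<username>.github.io
      cd <username>.github.io
      echo "Hello World" > index.html
      git add --all
      git commit -m "Initial commit"
      git push -u origin main    # or master, depending on your default branch name

      The site then shows up at https://<username>.github.io shortly after the push.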

      posted in Water Closet
      Pete.S
    • RE: Make a Bootable Windows 10 USB Installer from Fedora Linux

      Good to know about WoeUSB.

      If you have Windows available, Rufus is an easy tool for making bootable USB drives.
      It doesn't need to be installed and it's fast.
      https://rufus.akeo.ie/

      But in all honesty it's very easy to make a bootable Windows installer USB drive manually. Just make a primary bootable FAT32 partition on the USB drive and copy the files from the ISO onto it. Done.

      You can copy more files onto the drive, for instance drivers or other software. If you do that, it makes sense to make a dd image of the entire thing when you're done. That way you can easily write a new USB drive with your custom files on it.
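
      A minimal sketch of the manual method from Linux (the device name /dev/sdX and the ISO filename are placeholders - double-check the device before writing to it):

      # one bootable FAT32 primary partition on the USB drive
      parted /dev/sdX --script mklabel msdos mkpart primary fat32 1MiB 100% set 1 boot on
      mkfs.vfat -F 32 /dev/sdX1

      # copy the contents of the Windows ISO onto it
      mkdir -p /mnt/usb /mnt/iso
      mount /dev/sdX1 /mnt/usb
      mount -o loop Win10.iso /mnt/iso
      cp -r /mnt/iso/. /mnt/usb/
      umount /mnt/iso /mnt/usb

      # optional: take a dd image of the finished drive so it can be rewritten later
      dd if=/dev/sdX of=win10-usb.img bs=4M status=progress

      One caveat: FAT32 can't hold a single file over 4 GB, so newer ISOs with a very large install.wim need that file split or a different approach.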

      posted in IT Discussion
      Pete.S
    • RE: Incorporating Ransomware Protection into Backup Plan

      First, ransomware is big business run by organized crime. I think it's about a 19 billion dollar per year industry.

      Everything can be compromised in different ways. There is just no way to protect your data 100% and to think otherwise is just naive.

      We have chosen to go with tape as our last line of defense. Once you take it offline there is no way it can be remotely compromised. We believe that is enough to be able to recover from most attacks and the cost is reasonable.

      posted in IT Discussion
      Pete.S
    • How to check the integrity of a set of files with md5deep

      Integrity of files

      If you want to check the integrity of a bunch of files you can do it with md5deep, which can be thought of as a recursive version of md5sum. It was initially designed for forensic work.

      If a file has the same hash as another file, they are, for all practical purposes, identical. If you save the md5 hash of a file and later recheck it, you can be confident the file hasn't been changed, corrupted or tampered with.

      Installation on Debian

      You'll find it in the package md5deep.

      apt install md5deep
      

      Inside the package you'll also find sha256deep and some other good stuff. Use sha256deep instead if you want SHA-256 hashes. It's more secure than MD5 but might be slower. You use it in exactly the same way though.

      Besides Linux, it's also available on other OSes such as Windows and macOS. You can build it from source too. https://github.com/jessek/hashdeep

      Create MD5 signatures

      md5deep -rl /check_this_dir/* > files.md5
      

      This will create a text file (files.md5) with the md5 hash of every file (*) under the "/check_this_dir" directory, including subdirectories because of -r.

      Check MD5 signatures

      md5deep -rlX files.md5 /check_this_dir/*
      

      It will return the files that don't match. So if any file has been changed, it will show up.

      Common Options

      -r is to go into subdirectories as well
      -l is to use relative paths instead of absolute paths
      -X is to check files against the saved signatures (it lists the files that don't match)

      -e is to show progress while it's working.

      Find more info on basic usage with examples here:
      http://md5deep.sourceforge.net/start-md5deep.html#basic

      Example

      Let's check that the files in /boot and its subdirectories stay intact.

      First let's create an md5 file that we will compare with.

      md5deep -r /boot/ > boot.md5
      

      Let's verify the files have not been tampered with.

      md5deep -rX boot.md5 /boot/ 
      

      If one or several files have been changed, it will return the files and the new hashes (exit code 1).
      If all is good it will not return anything (exit code 0).
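
      If you want to run the check regularly, here is a minimal sketch of a script you could call from cron (the baseline and log paths are placeholders; it relies on the exit codes described above):

      #!/bin/sh
      # verify /boot against a previously saved hash list and log any mismatches
      BASELINE=/root/boot.md5
      LOG=/var/log/boot-integrity.log

      if ! md5deep -rX "$BASELINE" /boot/ > /tmp/boot-changes.txt; then
          echo "$(date): changed files detected in /boot" >> "$LOG"
          cat /tmp/boot-changes.txt >> "$LOG"
      fi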

      posted in IT Discussion md5 md5sum hashing corruption
      Pete.S
    • RE: XenServer gave error I'm not familiar with

      Maybe stop using USB drives for things they aren't designed for?

      It looks like we've been down this road before.
      https://mangolassi.it/topic/20070/so-xen-server-gave-me-an-error-what-do-i-do

      A small SSD would do better. Something with write endurance and something that is designed to be attached 24/7 in a hot environment.

      If you don't have drive bays or don't want to waste them, use a SATA DOM.

      posted in IT Discussion
      Pete.S
    • RE: System Admin - checklist for Don'ts and Important points please!

      Maybe I'm alone, but at the top of my list:

      1. Only use Microsoft as a last resort when all other options have been explored.
      2. If you get paid by the hour, disregard #1.

      posted in IT Discussion
      Pete.S

    Latest posts made by Pete.S

    • RE: 3CX Desktop VoIP Client Hit with Supply Chain Attack

      @scottalanmiller said in 3CX Desktop VoIP Client Hit with Supply Chain Attack:

      Well, everyone using it opted to not have code visibility and self compilation or code verification. Not that people would, but this is a risk people opt for.

      It's not that simple, since it was the open source Electron framework from GitHub that had been tampered with.

      I don't think it's known where it was hosted though. It could have been GitHub or a local repository. But if I understand correctly, it was only there that it had been compromised, not upstream.

      More info will probably be known in a week or two.

      posted in News
      Pete.S
    • RE: Allow Binaries on Linux to Run on Well Known Privileged Ports

      @scottalanmiller

      Yeah, me too.
      The search is not particularly good on NodeBB.

      If you search for net_bind you would assume it would find both of these posts, but it finds nothing.

      Since this site isn't indexed by Google and others anymore, you can't use those to search either.

      posted in IT Discussion
      Pete.S
    • RE: Allow Binaries on Linux to Run on Well Known Privileged Ports

      FYI
      https://mangolassi.it/topic/25022/bind-linux-process-to-well-known-web-ports-when-not-root

      posted in IT Discussion
      Pete.S
    • 3CX Desktop VoIP Client Hit with Supply Chain Attack

      3CX Desktop VoIP Client Hit with Supply Chain Attack

      The 3CX VoIP Desktop Client was compromised by what is believed to be a threat group associated with the North Korean government. Millions of users of the 3CX software are affected. The malware in the compromised version of the 3CX VoIP client exfiltrated data from affected users, allowing full remote control of infected systems.

      The attack affects both Windows and macOS users. The attack gained notice when 3CX users began complaining that security products were flagging and, in some cases, removing the software from their computers.

      https://www.cisa.gov/news-events/alerts/2023/03/30/supply-chain-attack-against-3cxdesktopapp

      More detailed information and discussion for those who are interested:
      [YouTube video]

      posted in News 3cx
      Pete.S
    • RE: Dell Server: The server power action is initiated because the host device initiated a warm-reset operation.

      @scottalanmiller said in Dell Server: The server power action is initiated because the host device initiated a warm-reset operation.:

      Just verifying that this log entry tells us that a human hit the power button on the server? This is a log entry in the iDrac.

      I don't think so. A warm reset is a reset, like the reset button or Ctrl+Alt+Del.

      If you press the power button you get a shutdown / power off, not a reset, because after powering off it will not start again on its own.

      If you have another Dell server available maybe you can verify.

      posted in IT Discussion
      Pete.S
    • RE: Marketing - Video Editing Storage

      @Jimmy9008

      The only way to make a cloud-only video editing solution work is to use a VDI solution.

      Basically you have a LAN in the cloud where your editing workstations, storage and backups are located. Your editors remote into those workstations to do the work.

      All files that are shot or produced need to be uploaded first before any editing can be done.

      posted in IT Discussion
      Pete.S
    • RE: Marketing - Video Editing Storage

      @Jimmy9008 said in Marketing - Video Editing Storage:

      I assume the 'video editing' cloud storage providers are like OneDrive or Dropbox, where each machine has a local cache so they do not have to pull from the cloud each time. But wouldn't that mean that our video editing workstations all need to have a 20 TB local drive to store this local cache?

      When you use on-demand sync, the cache only holds the files you need access to. So local storage on the workstation only has to be big enough for the files you are currently working on.

      posted in IT Discussion
      Pete.S
    • RE: Marketing - Video Editing Storage

      @Jimmy9008 said in Marketing - Video Editing Storage:

      Hi all,

      I am being asked to find a storage solution for our video editing department. They use various Adobe tools on Mac clients.

      They have around 40TB of files they use, with the largest files being RAW files of 300 GB - 400 GB. Average file sizes are 10 GB - 20 GB.

      1/2 of this is archive and can be stored on cloud storage for pull down should they require again. Leaving around 20TB live data and 20TB in archive.

      Originally, I was looking at proposing a 20 - 30 TB NAS populated with SSDs in the local office, with 10 Gbps NIC. This would provide high speed local access over the LAN to 6 marketing users.

      Marketing said speed was the main concern. Accessing the large files is currently slow causing many delays. They have so far been sharing USB 3.0 devices between each other, without backups.

      Our CIO is now pushing for cloud only solutions for the storage where marketing can check in/out the files they need, killing my NAS idea. I have concerns on this but am open, so would like some advice. What have you used or what would you suggest to use to provide this?

      I am concerned that when an editor wants to do something with the 300 GB file, that will have to pull down from the cloud, bringing our WAN link to a crawl until the download finishes. Then, once they have finished editing, they have to upload the end product, again causing bandwidth issues. This is exacerbated should multiple editors be pulling multiple files at the same time. Even at 500 Mbps with no overhead that is around 1.5h of time to sit and wait for a 300 GB file whilst everybody else is affected on the same WAN link.

      I assume the 'video editing' cloud storage providers are like OneDrive or Dropbox, where each machine has a local cache so they do not have to pull from the cloud each time. But wouldn't that mean that our video editing workstations all need to have a 20 TB local drive to store this local cache?

      Kind of all over the place on this one, any ideas folks?

      You need to present the math for the CIO.

      It's unrealistic to assume you can use your entire bandwidth for one download. It's also completely unrealistic to assume you can get 500 Mbps sustained from whatever cloud storage you have.

      I would say you're lucky to get 100 Mbit/s sustained. That's roughly 10 MB/s of data, and a little over 8 hours for a 300 GB file. 50 Mbit/s is, however, a more realistic number.

      Next up are the maximum file size limits on cloud storage. Microsoft, for example, has a 250 GB maximum file size on OneDrive and SharePoint. In other words, you can't store your largest video files there.

      Using cloud for archival storage you may never need is fine. But for working files that the editing team is sharing, you must have those files on the LAN.

      You should run a test. Sign up for a cloud storage solution and take one of those 400GB RAW files and upload it. And then download it. See what happens and how long it takes.
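
      To make the math concrete (illustrative numbers only):

      300 GB = 2,400 Gbit
      at 500 Mbit/s line rate, no overhead: 2,400 / 500 = 4,800 s, about 1.3 hours (the best case quoted above)
      at 100 Mbit/s sustained, roughly 10 MB/s after overhead: 300,000 MB / 10 MB/s = 30,000 s, a little over 8 hours
      at 50 Mbit/s sustained, roughly 5 MB/s after overhead: about 16.7 hours

      And that is for a single 300 GB file, while the rest of the office shares the same WAN link.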

      posted in IT Discussion
      Pete.S
    • RE: Easy Computer to Computer File Transfer Over Internet

      @VoIP_n00b said in Easy Computer to Computer File Transfer Over Internet:

      @Pete-S said in Easy Computer to Computer File Transfer Over Internet:

      @VoIP_n00b said in Easy Computer to Computer File Transfer Over Internet:

      Reading through your requirements again, Syncthing is perfect for what you're looking to do

      Syncthing goes through community-hosted intermediate servers, called relay servers, so it's not direct. https://relays.syncthing.net/

      https://docs.syncthing.net/users/relaying.html

      Yes, that's exactly what I'm saying.

      The normal situation between two residential computers is that a direct connection is never possible because they're both behind NAT or CGNAT.

      A direct connection is only possible if you're willing and able to set up port forwarding on one side or the other.

      posted in IT Discussion
      Pete.S
    • RE: Easy Computer to Computer File Transfer Over Internet

      @VoIP_n00b said in Easy Computer to Computer File Transfer Over Internet:

      Reading through your requirements again, Syncthing is perfect for what you're looking to do

      Syncthing goes through community-hosted intermediate servers, called relay servers, so it's not direct. https://relays.syncthing.net/

      If both source and destination are behind NAT, you unfortunately can't transfer directly without opening ports.

      I would just use a VM in the cloud (running something like Ubuntu with a desktop) and open SSH. Transfer all files over SSH (SFTP), and then run a remote desktop session over SSH to upload the files from the cloud to wherever they need to go.
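
      A minimal sketch of that workflow (hostname, username and filenames are placeholders):

      # from the source machine: push the files to the cloud VM over SSH
      scp bigfile.zip user@vm.example.com:/home/user/incoming/

      # from the destination machine (or from a remote desktop session on the VM):
      # pull the files back down over SSH
      scp user@vm.example.com:/home/user/incoming/bigfile.zip .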

      posted in IT Discussion
      Pete.S