What makes people want IPSEC at line speed
-
@Donahue said in What makes people want IPSEC at line speed:
I've got 25 users connecting
This should not matter unless you are doing something stupid like routing 100% of their traffic through the VPN before going back out to the internet via the main office.
@Donahue said in What makes people want IPSEC at line speed:
Also sometimes large CAD files being accessed across it.
WTF would even set this kind of expectation?
@Donahue said in What makes people want IPSEC at line speed:
I am also pulling real time backups across this ATM because of a poorly designed (by me) network layout from a few years ago. This is part of the issues we are fixing with my current project.
So 100% bad design and not real-world expectations, because I push Veeam replication jobs across much slower connections than that.
-
My current setup is such that I've got two primary resources that my users access, and two locations, each about the same size. One resource, the file server, is at one location, and our ERP is at the other. Both sides utilize both resources all day long. We upgraded from a 100/100 ptp ethernet circuit to this because it was constantly saturated. Also, all internet traffic for one location was funneled through said ptp, and exited our other location, because the first did not have its own connection to the outside world. Our setup was far from ideal, which is why it is mostly getting scrapped.
-
@Donahue said in What makes people want IPSEC at line speed:
We upgraded from a 100/100 ptp ethernet circuit
If it is PTP, you don't need VPN.
-
@JaredBusch said in What makes people want IPSEC at line speed:
@Donahue said in What makes people want IPSEC at line speed:
We upgraded from a 100/100 ptp ethernet circuit
If it is PTP, you don't need VPN.
That's what they USED to have. They didn't need it back then.
-
Please bear in mind that a lot of our issues were the direct result of my personal inexperience and decisions from about 4 years ago, and of my company's overall lack of experience when it comes to how IT should be done. Four years ago we did not have IT; we had a 2-drive NAS, and that was basically it. To put that into perspective, we literally did not have email access until 2011, which is about a year before I got here. The company I work for was, and to a large degree still is, either behind the times or working from bad assumptions. I will claim the mistakes I made previously (and perhaps more recently) and say that a large part of why I am here on ML is to learn where I went wrong, and to try to make a plan for the present and future that is more in line with what would be considered "standard IT practices".
-
My company has two locations that are about 10 miles apart. Each has ~25 office staff and ~100 shop employees. We are a manufacturing/fabrication company.
My first bad apple is my ERP, which we picked out 4 years ago. As I have learned since, it's not a great one. For reasons that I cannot remember at the moment, it is at one branch. Our file server is located at the other office because, at the time, the 2-drive NAS it replaced was at that location, and the majority of our engineers are also there. We bought into the myth that we needed MS everything with AD and all that, so I originally had two hosts, one at each location, each with a DC and one of the primary services. At that time, we only had a 10/10 PTP that our Nortel PBX required; this was left over from before we had data access between the plants. Shortly after getting everyone on the ERP, we discovered that 10/10 was simply not sufficient for the users that have to access it remotely. We set up an RDS server to try to alleviate some of this, but it can only do so much. We then bumped that up to the 100/100 that we still have, and will still have for a while because of contracts. At this point, I also moved my office to the location that has the ERP.
One final thing that is currently in place, but is planned to be removed, is our offsite backups. At the moment, I am required by company policy to make copies of some of the data and take it offsite, to the safe deposit box at the bank. The way I was doing this was, and still is, very inefficient: I have been copying individual files from all over our systems, a lot of them coming over the site-to-site link, to an external SSD. This one step was absolutely killing the network and was unsustainable. It is actually what prompted my current project, because it was a bad backup plan.
That brings me to where we were a few months ago. I spend the majority of my time dealing with our ERP and other things like that, and I have not had enough time to really learn the best way this should have been set up in the beginning. Because of this, and because I want to make sure that I am not setting us up for failure, we brought in outside help this time, specifically from someone who is not a VAR. We've been led astray by advice from VARs, and they might share a little of the blame for where I am now.
-
I am not sure how you want to slice it up, but the 100/100 pipe just wasn't cutting it. When we had a shift change in the shop, people clocking in and out would cause a very noticeable slowdown across the entire network at both sites. If I happened to be running that offsite backup (which was taking something like 150 hours each time), I would get constant complaints. Now, I can see that a lot of it has to do with how we are set up, but changing the router and internet service is a lot quicker than resolving issues with our ERP or with how things are laid out on the network. So call it a bandaid if you must, but it got us out of some of the immediate issues we were facing, and it has allowed me to take a lot more time making sure that when I change everything up, what I am putting in is actually better, not just throwing money at the problems and hoping they go away.
-
So because of that, I was trying to get line speed out of IPsec so that everything would behave like it was local, and service location would not impact the users so much.
A large portion of my plan is to move our file server over to the same location as the ERP and treat that location as the HQ. This has many benefits, but the one main drawback is that the majority of the engineers still work at the (now) branch location, and they will have to access all files remotely, as will all the other users at that location.
Something like Nextcloud may be part of my solution, but I have not gotten that far yet.
-
What ERP are you using?
-
The main performance problem we have with it is that it is only a 2-tier design: the fat client talks directly to the database, so every query pays the WAN latency. We have had many problems related to functionality too, but that is not really the topic of this thread.
-
I don't see any solution making those remote engineers happy with data access to the CAD drawings.
What was wrong with RDS for the users not at the ERP location?
-
@Dashrender said in What makes people want IPSEC at line speed:
I don't see any solution making those remote engineers happy with data access to the CAD drawings.
What was wrong with RDS for the users not at the ERP location?
Mostly, it's about user confusion. In a vacuum, RDS isn't so bad, but the way the application tolerates it leaves a lot to be desired. Since I am publishing just the application itself, the user never sees their desktop or any other part of their user profile on the RDS server. So they are constantly trying to save items to the desktop or documents folder, not realizing that they are actually seeing those folders on the RDS server and not on their workstations. They also have no way to kill the app when it freezes, which is somewhat common; since they treat RDP like it's magic, I have to log in and kill it for them. I have trained them to always try a restart before they call me, and this is one area in which that does absolutely nothing. We also have small issues with the way the application integrates with email. One main source of trouble is a conflict between said ERP and the CAD file viewers that also have to run on RDS; this conflict alone probably causes half of the RDS issues that people call me about.
At the end of the day, the user experience is generally better when running the local client, but the performance is worse. It's kind of a lose-lose situation.
The only solution I could think of to help make the remote engineers happy is to have a 1 Gbps IPsec tunnel. Latency is still an issue and will continue to be, but total bandwidth is better.
Having said all this, my actual tunnel speed is really about 650-700 Mbps. I learned after doing all this that the small regional ISP only gets a 2 Gbps total connection to their provider, and that I am trying to take half of it. Their provider is my other ISP, FYI. Both plants are in bad locations for fiber.
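For what it's worth, encryption overhead by itself shouldn't account for that gap. A quick back-of-envelope (the per-packet overhead figures below are assumptions for ESP tunnel mode with AES-GCM; exact numbers vary by cipher and whether NAT-T is in play):

```python
# Back-of-envelope: goodput ceiling after ESP tunnel-mode overhead on a
# 1 Gbps link. Overhead byte counts are rough assumptions for AES-GCM.
LINK_MBPS = 1000                  # nominal line rate
MTU = 1500                        # outer packet size on the wire
ESP_OVERHEAD = 20 + 16 + 16 + 2   # outer IP + ESP hdr/IV + ICV + trailer
INNER_HEADERS = 20 + 20           # inner IP + TCP headers

payload = MTU - ESP_OVERHEAD - INNER_HEADERS
print(f"TCP goodput ceiling: ~{LINK_MBPS * payload / MTU:.0f} Mbps")
# -> ~937 Mbps, so a 650-700 Mbps measurement points at something
#    upstream (like that 2 Gbps uplink), not at the encryption itself.
```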
-
You're 10 miles apart; any chance for a site-to-site wireless link?
OK, so rather than making them split between some things local and some remote, why not move them 100% remote? Give the users a full RDS desktop and have them completely stop using their local system?
A 1 Gb connection for CAD is still going to be an issue in my mind. I don't really see this solution being better, but who knows, you might get lucky.
What is your end goal for backups? If it's to continue taking tapes to the bank, why not just pick up two tapes/drives, whatever, one from each site, and deliver them to the bank instead of copying over the WAN?
-
@Dashrender said in What makes people want IPSEC at line speed:
You're 10 miles apart; any chance for a site-to-site wireless link?
OK, so rather than making them split between some things local and some remote, why not move them 100% remote? Give the users a full RDS desktop and have them completely stop using their local system?
A 1 Gb connection for CAD is still going to be an issue in my mind. I don't really see this solution being better, but who knows, you might get lucky.
What is your end goal for backups? If it's to continue taking tapes to the bank, why not just pick up two tapes/drives, whatever, one from each site, and deliver them to the bank instead of copying over the WAN?
No wireless without large towers, unfortunately. I looked into it before settling on what we did, but I didn't want to deal with renting space on someone else's tower. It was getting intimidating, and that plan would probably have been more than I could have pulled off.
I am not sure a full RDS desktop would work under the CAD load, and I know it is not allowed under Autodesk licensing without getting Citrix involved.
In theory, the 1 Gbps WAN should be similar to the 1 Gbps LAN, or at least that was my thought. I realize now that latency may still be an issue, but it has only been in place for maybe 2 months. Time will tell if that is the long-term solution.
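To put a number on the latency worry (the window size and RTTs below are illustrative assumptions; modern stacks scale the TCP window, and SMB adds its own round trips on top):

```python
# Why a 1 Gbps WAN can still feel slower than a 1 Gbps LAN: a single
# TCP stream can never exceed window_size / RTT, whatever the pipe.
def max_single_stream_mbps(window_bytes: int, rtt_ms: float) -> float:
    return (window_bytes * 8) / (rtt_ms / 1000) / 1_000_000

WINDOW = 64 * 1024                      # conservative 64 KB window, no scaling
for rtt_ms in (0.2, 2.0, 10.0, 20.0):   # LAN vs. typical metro-WAN RTTs
    print(f"RTT {rtt_ms:>4} ms -> {max_single_stream_mbps(WINDOW, rtt_ms):8.1f} Mbps")
# RTT  0.2 ms -> 2621.4 Mbps  (the link, not latency, is the limit)
# RTT 10.0 ms ->   52.4 Mbps  (latency caps one stream far below 1 Gbps)
```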
For backups, I have been and currently am doing everything from my location, which is now the HQ. I am backing up roughly 600 GB onto a 1 TB external SSD via USB 3. I've got somewhere between 6 and 8 TB of total data that I would like to back up, but I had neither the space nor the time to get all of that onto a single device that I could take offsite. This forced me to choose what to back up, because no one above me could or would give me a solid business policy to follow. I don't like being responsible for deciding what does and does not make it into these offsite backups. One problem I am running into is that the person giving me the requirement for offsite backups (the CEO) has no clue what there even is to back up in the first place, because no one here (with a few possible exceptions) understands this stuff. I had a conversation just yesterday with him about wanting some direction on how long he wanted to retain backups, and whether he wanted that retention done onsite or offsite. He couldn't really give me an answer; he just wants the "drawings" to be backed up "forever". In the end, I basically talked him into officially telling me to do what I had planned on doing in the first place, just so that we had "officially" talked about it. That is probably off topic though.
Current: like I said, we are currently backing up 600 GB worth of files to a single USB SSD that I rotate out on a weekly basis. Before the IPsec was in place, it took ~150 hours to complete, which, since they were weekly backups, meant the job ran for basically the entire week. Now they are completing in ~50 hours, but I am still pulling individual files across the WAN.
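Running those numbers (a quick sketch; decimal gigabytes assumed) suggests the links themselves were never the real bottleneck:

```python
# Effective throughput of the weekly ~600 GB individual-file copy job,
# before and after the upgrade.
DATA_GB = 600
for label, hours in (("old 100/100 PTP", 150), ("new 1 Gbps IPsec", 50)):
    mbps = DATA_GB * 8 * 1000 / (hours * 3600)
    print(f"{label}: {mbps:.1f} Mbps effective")
# -> old: 8.9 Mbps, new: 26.7 Mbps. Neither is anywhere near line rate,
#    which points at per-file overhead (lots of small files, latency,
#    the USB target), not raw bandwidth.
```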
My plan at this point is to move everything over to a single new host at my HQ. This host will be running local SSDs; see https://mangolassi.it/topic/18201/large-or-small-raid-5-with-ssd. I've got two existing hosts (I picked one up along the way) that will be repurposed once the new host is in place. One will become a Veeam host (it will be getting new storage), and the other will become an empty host used only for restores. All three hosts will be on a new 10G network, and the Veeam host will be getting a tape drive (most likely; see https://mangolassi.it/topic/18209/adding-tape-drive). By using LTO-7 tapes, I can back up literally everything I have and take those offsite (a quick capacity check is below). I am going to back up to disk on the Veeam host and then copy those backups to tape. I am also going to be copying my backups across to my branch site. With the new setup, I should be able to do the offsite copy job in a matter of hours. So I will have 4 copies of the data: 1 production, 2 onsite backups, and 1 offsite backup. I will also be able to run everything from Veeam instead of trying to mix that with individual files.
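On the capacity question (LTO-7 is 6 TB native per tape; the ~15 TB "compressed" figure depends entirely on the data, so I am not counting on it):

```python
import math

# Will 6-8 TB fit on LTO-7 tape?
TOTAL_TB = 8            # upper end of my 6-8 TB estimate
LTO7_NATIVE_TB = 6      # native capacity per tape
tapes = math.ceil(TOTAL_TB / LTO7_NATIVE_TB)
print(f"{TOTAL_TB} TB -> {tapes} LTO-7 tape(s) at native capacity")
# -> 2 tapes worst case, 1 tape if compression reaches ~1.4:1,
#    so taking everything offsite in one trip to the bank is realistic.
```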
I still need to decide how much storage to give said Veeam host, but it seems challenging to determine how much space each backup requires, especially since I am deduping mine now using Windows Server.
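For sizing, a common back-of-envelope is one full plus a chain of incrementals. A sketch with placeholder numbers (every rate below is a guess to be replaced with measured values, not a Veeam-published figure):

```python
# Rough backup-repository sizing: one full plus retained incrementals.
SOURCE_TB      = 8      # data being protected
REDUCTION      = 0.5    # fraction left after compression/dedupe (guess)
DAILY_CHANGE   = 0.05   # daily change rate (guess; measure it)
RESTORE_POINTS = 14     # retained restore points (1 full + incrementals)
HEADROOM       = 1.2    # ~20% slack for growth and working space

full = SOURCE_TB * REDUCTION
incr = SOURCE_TB * DAILY_CHANGE * REDUCTION * (RESTORE_POINTS - 1)
print(f"Estimated repository: {(full + incr) * HEADROOM:.1f} TB")
# -> ~7.9 TB with these guesses; the daily change rate is the number
#    most worth measuring before buying disks.
```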