Setting up a small business network

I used to work for a managed service provider. Before that, I worked infrastructure for a number of very large companies. Before that, I was a consulting systems engineer specializing in networking, dealing with both large and small companies.

I’d highly recommend going with Google or Dropbox or Office 365 with OneDrive. You don’t need to do your own disaster recovery, and you don’t need to figure out whether your managed service provider is screwing you (hint: it isn’t a good business model in 2018; it’s very hard to keep talent on a team that can provide the services you promised in a timely manner, and if you don’t have the technical skills to do the job, you don’t have the technical skills to evaluate whether they are doing it). Google is fantastic. I put it into a large (20,000+ employee) company and my husband’s company uses it. If companies like Conde Nast, Gartner, Salesforce and Colgate-Palmolive aren’t worried about Google security, you really shouldn’t be.

File servers are very much a 20th-century technology. Maybe in 2005 this was still a good idea. Skip it in 2017, unless you have spotty or bad internet because your business is in the middle of Africa and dependent on satellite connectivity.

Do you know if cloud storage does file verification? A few flipped bits or lost bytes in a video or audio file isn’t going to stop it from playing, but it may corrupt a Word doc or Excel worksheet. Not discounting cloud storage, but am honestly curious about file verification.

I do multi-terabyte transfers and backups of my video and audio files every few months and use Teracopy to copy and verify. And once in a while I’ll get a successful verification on a file that Windows reports as byte-for-byte identical, and when I play the file, it is corrupted.

Whenever you read the phrase “cloud computing”, mentally replace it with “someone else’s computer”. Likewise, “in the cloud” with “on someone else’s computer.” It’s not magic maintenance-free technology, and we have no way of knowing here what OP’s operational requirements are. A small operation can latch onto consumer-level cloud services ad-hoc style and then run into trouble when it suddenly hits a situation where a more architected solution would have handled a security risk (internal or external), where a small-scale on-premises backup would have covered an outage, or where data ownership and employee account turnover needed smooth handling.

Large companies that use O365, for instance, are not doing the equivalent of signing everyone up for a personal Gmail account and letting them run with it. It requires the same expertise to configure and administer O365 to a large business’s requirements as to configure on-premises Exchange. It’s just that someone else effectively installed Exchange Server for you. MS is not setting up your internal security groups, mail filtering and retention rules, integration with single sign-on, etc. for you.

O365 has outages from time to time, and the more you outsource, the more you place your business in the hands of a third party you have no direct control over. It can be an effective trade-off but needs to be considered carefully. “X company uses Y provider” doesn’t mean that company exclusively uses that provider’s system for all business operations, and it doesn’t mean they don’t have their own internal support either, if only to handle the minutiae of accessing the cloud provider when problems arise.

Some of these solutions sound mighty expensive for a one man business expanding to 3 employees.

I am dealing with a managed service company at the moment who wants to sell services to a small charity I work with. It has not been a pleasant experience. They are a bunch of incompetent leeches who exploit the lack of skills within a small organisation and try to scare it into buying their over-priced services.

Keep it simple. Google Apps and a couple of other services are a good fit for small businesses. You get email using your domain name and you can access your office stuff from anywhere, not tied to the office. That is worth having. Making sure you can rebuild the system and restore data to any important desktop or laptop is a challenge in itself, but it is a get-out-of-jail-free card for when disaster strikes. Big internal or external hard drives are cheap. Flash drives are also cheap and very useful.

Depending on your own server is another world of pain that has to be thought through very carefully. How many precious eggs are in that basket?

While it is tempting to geek out on technical solutions, they are just tools. It is people and their skills that are crucial to success.

Of course it does file verification. Seriously, we ran a billion dollar business off G-Suite. Files were not getting corrupted.

Yep, that describes not only the one I worked for, but every one of our competitors. I lasted eight months before I helped our two biggest clients hire IT managers who would fire us and bring staff in house, and then I quit.

(It also describes IBM or Accenture acting as the MSP for a Fortune 500. That’s WAY more expensive, and I’ve been there and done that one as well, but it’s the same scam.)

Granted, the billion dollar business also had 300 Oracle instances and a few thousand Unix and Microsoft servers in house running business applications… but for basic file services and email, it used what was then known as Google Apps.

So here is how our deal with G-Suite worked. I had been in charge of our Active Directory infrastructure, so I’m pretty familiar with security groups and all the rest.

Our HR system would send Google a feed. Google would use that to set up the basic account - including email - no email administration on our part was required. When someone was terminated, the HR feed would take care of that as well. Access to files would be automatically granted to the manager of record at the time of termination.

We turned over administration of groups - both file directory groups and email groups - to the end user communities. You want a group - you set it up and manage it. We provided training for this, and it was really simple. We, of course, retained about three administrators with high level access to Google to straighten out the inevitable “Bob left and now no one can manage the group” problems and “Carrie should take over all of George’s stuff” and do the group audits. Previously we’d had about 103 people managing a worldwide server infrastructure.

AD stayed, as well as the core group of a half dozen AD Administrators/Server Engineers - we had applications dependent on it and it had been our file system - so when I left, people still had files on it.

The MSP used O365 for some of their clients - and that was far more of an administrative nightmare. But they were so incompetent that I don’t know how much of that was O365 and how much of that was them. I suspect a little of both - I’ve worked with MS too long to believe that they didn’t just take all your headaches and migrate them to their datacenters - leaving you with the pain.

Still curious about file verification on transfers. Is it done on the fly, or after the file is uploaded and stored on the hard drive(s)?

When I transfer files at home with Teracopy, it copies all the files first, then runs a CRC check afterward, reporting if the hashes don’t match exactly. The verification process takes about 1/4 to 1/2 as long as the actual file transfer. There are also other programs that can perform the check after the files are copied.
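That copy-then-verify pass can be sketched in a few lines. This is a hedged illustration, not Teracopy’s actual implementation: it copies a file, then re-reads both copies from disk and compares digests. SHA-256 is used here as a stand-in; Teracopy offers several checksum algorithms, but the principle is the same.

```python
import hashlib
import shutil


def file_hash(path, algo="sha256", chunk_size=1 << 20):
    """Hash a file in 1 MiB chunks so multi-gigabyte video files
    never need to fit in RAM."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


def copy_and_verify(src, dst):
    """Copy src to dst, then re-read both files and compare digests
    (a post-copy verify pass). Raises if the copies differ."""
    shutil.copyfile(src, dst)
    src_digest = file_hash(src)
    dst_digest = file_hash(dst)
    if src_digest != dst_digest:
        raise IOError(f"verify failed: {src} -> {dst}")
    return src_digest
```

One caveat: the re-read may be served from the OS file cache rather than the physical disk, which is one reason a file can verify cleanly right after the copy and still turn out corrupted later.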

I found this old discussion about how cloud storage verification works and whether the provider (all of them?) performs a checksum on the file. I’m assuming this has to mean reading the file back from the hard drive(s) on the server. Curious because, from what I’ve read and done, all activity on an upload (down and up, and on the local drive) stops once the file is received on the server. I know there’s packet loss/error correction in the upload stream, but what about the integrity of the file on the server?
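For what it’s worth, the major object stores do expose a server-side checksum of the stored object that you can compare against one computed locally: Google Cloud Storage reports an MD5 (and CRC32C) in the object metadata, and S3’s ETag is the MD5 for single-part uploads. A minimal sketch of the client side of that comparison; the provider-side fields named in the comment are real metadata fields, everything else is illustrative:

```python
import base64
import hashlib


def local_md5_b64(path, chunk_size=1 << 20):
    """Compute the base64-encoded MD5 of a local file, streaming in
    chunks (the encoding GCS uses for its md5Hash metadata field)."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return base64.b64encode(h.digest()).decode()

# After an upload, compare this against the checksum the provider
# reports for the stored object (e.g. the md5Hash field in a GCS
# object's metadata, or the S3 ETag for a single-part upload). A match
# tells you the bytes the provider indexed are the bytes you sent.
```
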

Edit: It’s just me, but I get worried about anything I didn’t do/see myself.

Look at something like ZFS. With a modern copy-on-write filesystem, thousands of snapshots can be maintained, with the only overhead being the space needed to store the changes between each snapshot and the “live” filesystem. I run the server for my small department (about 100 employees) this way. Yes, the link between my primary and backup server is very fast, but that doesn’t really matter.

I also run my home setup the same way, and send my offsite backups from home to an external drive plugged into my desktop at work. I’m using Comcast Business internet at home, so exactly the same internet service that a small business will be using. Pushing out the snapshots to offsite is much slower, but even on days I’m uploading gigabytes, it will eventually finish.

That is all on small servers that would be appropriately scaled for a small business. The petabytes of storage attached to our supercomputing cluster (on which I’m only a user) are fundamentally the same, just on a much bigger scale. Frequent snapshotting and replication, to whatever level each group is willing to pay for.
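The “only overhead is the changed blocks” property of copy-on-write snapshots can be shown with a toy model. This is a conceptual sketch, not how ZFS is implemented: blocks are stored once, keyed by checksum, and a snapshot is just a list of block references, so a second snapshot that shares most of its blocks with the first costs almost nothing extra.

```python
import hashlib


class ToyCowStore:
    """Toy model of copy-on-write snapshots: each unique block is
    stored exactly once, keyed by its checksum; a snapshot is merely
    an ordered list of block keys."""

    def __init__(self):
        self.blocks = {}     # checksum -> block bytes (stored once)
        self.snapshots = {}  # snapshot name -> list of checksums

    def snapshot(self, name, data, block_size=4):
        keys = []
        for i in range(0, len(data), block_size):
            block = data[i:i + block_size]
            key = hashlib.sha256(block).hexdigest()
            self.blocks.setdefault(key, block)  # unchanged blocks cost nothing
            keys.append(key)
        self.snapshots[name] = keys

    def read(self, name):
        """Reconstruct a snapshot's full contents from its block list."""
        return b"".join(self.blocks[k] for k in self.snapshots[name])

    def stored_bytes(self):
        return sum(len(b) for b in self.blocks.values())


store = ToyCowStore()
store.snapshot("mon", b"aaaabbbbcccc")
store.snapshot("tue", b"aaaabbbbdddd")  # only the last block changed
print(store.stored_bytes())  # 16, not 24: shared blocks are referenced
```

Two 12-byte “snapshots” that differ in one 4-byte block consume 16 bytes of block storage rather than 24; the shared blocks are referenced, not copied.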

Here is an encryption whitepaper from Google for you - file verification is usually done as a subset of encryption when you are talking about this sort of infrastructure.

I think something totally missed here is budget; any recommendation requires real, hard numbers. Does the OP have $5,000 to spend, $500, nothing? A monthly fee to Google or Microsoft might actually be much easier to manage than coming up with $10,000 right now. Whenever I talk to people who do this kind of stuff at a corporate level, I sometimes need to remind them to pull back a bit. Downtime can be OK. The company isn’t losing millions of dollars for each minute the server is down. If the consumer-PC-based server is down for 45 minutes while a non-hot-swap drive in the RAID is replaced, so what? There’s other stuff people can be working on, and at least one employee is working on the server. And the cold-spare drive going into the server is kept on a shelf at Best Buy until needed.

As stated in this thread many times, the thing that can’t be tolerated is data loss. Even if the budget is nothing, money might have to be found to guard against data loss.

That document describes your data as being encrypted with keys that are stored on a “key management service”… controlled and operated by Google. The 128-bit AES encryption also sounds a bit weak. Then again, by the time your business has to worry about corporate and government espionage, it should have procedures in place to protect its own data.

Regarding filesystems, a filesystem like ZFS checksums blocks of data using hash functions, unless you turn that feature off, which you would never, ever want to do; I have witnessed the occasional bit getting flipped.
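That per-block checksumming is exactly what catches a flipped bit: the checksum is stored apart from the data (ZFS keeps it in the parent block pointer), so when a block is read back and re-checksummed, silent corruption shows up as a mismatch. A toy illustration, using SHA-256 as a stand-in for whatever checksum the pool is configured with:

```python
import hashlib


def checksum(block: bytes) -> str:
    # ZFS defaults to fletcher4 and can use sha256; sha256 is the
    # stand-in here.
    return hashlib.sha256(block).hexdigest()


# Store the checksum separately from the data, as ZFS does, so the
# checksum itself cannot be silently corrupted along with the block.
block = bytearray(b"quarterly-report-figures")
stored_sum = checksum(bytes(block))

block[3] ^= 0x01  # simulate a single flipped bit on disk

# On read-back, the recomputed checksum no longer matches: detected.
assert checksum(bytes(block)) != stored_sum
```

With redundancy (mirror or RAID-Z), ZFS can then repair the bad block from a good copy; without it, you at least know the file is damaged instead of silently serving bad data.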

Of course the KMS is going to be owned and operated by Google; that’s part of what you are paying them for.