This article is a follow-up to "The small office that went to telework without realizing it", so I recommend reading that one before continuing.
With all the information stored on the NAS, there was a risk of a problem occurring and the information being lost. And yes, it can happen to us, with our computer or whatever device we use. That is why backups are necessary.
What is a backup for?
Wherever we keep our information, it is always in danger:
- The device containing the information may fail
- Malware can alter or delete it
- We can delete or modify it ourselves by mistake
- Our device can be stolen
- There could be an accident, a fire…
The only way to have peace of mind is to keep backups. But backups must meet some minimum requirements:
- They must be on a different device: if we copy to the same disk, almost all of the risks mentioned above could affect the copy too. If the disk fails, malware gets in, there is a fire or a theft… we would lose both the information and its copy.
- They must be in a different place: if the copy is on another disk but in the same location, a theft or a fire could still affect it.
Experience tells us that it is also advisable for them to meet the following conditions:
- They should be automatic: if copies have to be made manually by the user, they most likely will not be made. One day we forget, another day we are in a hurry, and so on.
- They must be versioned: different copies or versions of the same file must be kept as we edit it. That way, if we make a mistake, we can go back several versions until we find a copy that has what we are looking for.
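To make the versioning idea concrete, here is a toy sketch in Python (mine, not the Hyper Backup software the office actually used; all names are made up): each run creates a timestamped snapshot directory and hard-links unchanged files to the previous snapshot, so each version only costs the space of the files that actually changed.

```python
import os
import shutil
from datetime import datetime
from pathlib import Path

def snapshot(source: Path, backups: Path) -> Path:
    """Create a timestamped, versioned snapshot of `source` in `backups`.

    Unchanged files are hard-linked to the latest snapshot, so each
    version only takes up the space of what changed.
    """
    backups.mkdir(parents=True, exist_ok=True)
    snapshots = sorted(backups.iterdir())
    latest = snapshots[-1] if snapshots else None
    dest = backups / datetime.now().strftime("%Y%m%d-%H%M%S.%f")

    for path in source.rglob("*"):
        if not path.is_file():
            continue
        rel = path.relative_to(source)
        target = dest / rel
        target.parent.mkdir(parents=True, exist_ok=True)
        old = latest / rel if latest is not None else None
        if old is not None and old.is_file() and old.stat().st_mtime == path.stat().st_mtime:
            os.link(old, target)        # unchanged file: hard link, no extra space
        else:
            shutil.copy2(path, target)  # new or modified file: real copy (keeps mtime)
    return dest
```

Run something like this every night from cron, a systemd timer or the task scheduler and the "automatic" condition is covered too; each timestamped directory is a version you can go back to.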
Here's how the office did it
Returning to the case of the small office, a backup system based on two disks was set up.
On one side there was a USB hard drive connected to the NAS, where Synology's Hyper Backup application performed versioned backups every night. No user had access to that disk, only an account created on the NAS specifically for backups, with its own password. That way, nobody could tamper with it, not even by mistake.
But this disk doesn't meet the requirement of being in a different place. If someone breaks in to steal, or there is a fire, the copy could be lost along with the original.
So another identical NAS was set up at the owner's home. Hyper Backup also created a versioned copy every night on that other NAS, using a dedicated user account.
Why keep the USB disk, then, if the information is already safe at the owner's home? Besides the extra security of one more copy, local access is much faster when something needs to be recovered.
Outlook, a problem for backups
All the information handled by the office was stored directly on the NAS, and everyone worked on it from there over the Samba (SMB) protocol. So no backups were made of the users' computers: if any of them had a problem, the work was safe on the NAS and its copies. Except for email.
The backup system checked every night which files had changed (an updated Word document, a new PDF…) and copied only those. So the first day's copy was very large and took a long time, but on the following days it only had to copy the changes made that same day.
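That nightly "copy only what changed" step comes down to comparing modification times against the time of the last run. A minimal sketch of the idea (mine, not Hyper Backup's actual logic):

```python
from pathlib import Path

def changed_since(root: Path, last_run: float) -> list[Path]:
    """Files under `root` whose modification time is newer than `last_run`.

    A file-level incremental backup copies exactly these files. It also
    shows the .pst problem: one new email updates the mtime of the whole
    multi-GB file, so the entire file lands in that night's copy.
    """
    return [p for p in root.rglob("*")
            if p.is_file() and p.stat().st_mtime > last_run]
```

Untouched Word documents and PDFs never show up in the list; a monolithic mailbox file shows up every single day.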
The problem is that Outlook stores all its information in a single file, a .pst. In this case that file took up 20 to 40 GB per user (it shouldn't, but users do what they want). Copying 20 to 40 GB per user every night was madness, and with versioning it meant that every night the space used grew by that same amount.
But the real problem was the copy to the NAS at the owner's home. Back then there was only ADSL, no fiber, so speeds were much lower than today; worse still, the upload speed was far lower than the download speed (we are usually talking about 1 Mb/s). And to top it off, for reasons I never understood, Movistar was providing even slower upload speeds than usual at the time: 512 kb/s in some cases, 800 kb/s in others. Uploading a 20 to 40 GB copy for a single user was impossible; it would have taken several days.
We might think of leaving all the archived mail in one or more .pst files and keeping only the most recent mail, say the last month, in another one. But the archive .pst files, merely by being opened for reading, were already marked as modified, so the backup software included them in that night's copy anyway.
Thunderbird and Maildir
The solution came with the Maildir mail storage format, where each email is saved on the hard drive as a separate file (in a standard format that any program can open). This is a big plus: at the end of the day, only the emails received or sent that day count as modified files (plus a few housekeeping files used to manage that storage). So the nightly copies could be small again.
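Maildir is a standard layout, and Python's standard `mailbox` module can read and write it, which makes the one-file-per-message property easy to see (the addresses below are made up):

```python
import mailbox
import tempfile
from email.message import Message
from pathlib import Path

# A throwaway Maildir: each message is stored as its own file under the
# new/ or cur/ subdirectory, so a nightly file-level backup only picks up
# the messages that actually arrived or were sent that day.
root = Path(tempfile.mkdtemp()) / "demo-maildir"
md = mailbox.Maildir(root, create=True)

msg = Message()
msg["From"] = "alice@example.com"     # made-up addresses
msg["To"] = "bob@example.com"
msg["Subject"] = "Backup-friendly storage"
msg.set_payload("One file per message.")

md.add(msg)  # writes exactly one new file on disk
print(len(md), "message,", len(list((root / "new").iterdir())), "file in new/")
```

Each `md.add()` call produces one more file in `new/` (moved to `cur/` once read), which is exactly the granularity a file-level incremental backup wants.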
Outlook cannot use this storage format, but Thunderbird can. So the mail clients were switched to Thunderbird and all the mail was migrated to Maildir. This way, backups could be made every night, both locally and remotely.
And in all the years it has been running, it has never caused a problem.
Thunderbird also improved other aspects of working with email, but that is not the subject of this article.
Note that Thunderbird doesn't use Maildir by default; it uses mbox (just like Apple's Mail). Mbox also piles everything into files, though not a single file like Outlook: it uses one file per mail folder. That is still not as good for this purpose as Maildir. So the first thing to do in Thunderbird is to switch its message storage to Maildir.
Luckily (fingers crossed, knocking on wood…), the backups have never actually been needed.