This is closely related to the "Levels of Access" discussion.
It is obviously important that we keep regular backups of the n-lab, and that several people do so. However, a full backup contains all the information in the database, including the personal webs and the passwords, so there's a question of access here as well.
There are various options available. The following spring to mind, but there may be others.
Of course, what we'll do will be a combination of these. Those who would have access to all the data anyway may as well keep a full backup of the whole database: this is also the simplest way of rebuilding the labs in case of a massive systems failure on the part of our hosts. But due to the security implications, this can't be extended to everyone. On the other hand, there's clearly no problem with everyone having access to a copy of the content (either current or full) of the entire n-lab, but personal web owners may feel differently about their webs.
Then there's the question of how to get the data. The full database is rather large, and sending it over http just compounds the problem. The ideal system would transfer only the differences each time: export the content to a version control system and then update that copy on each backup (I think I got that idea from Jacques somewhere). How difficult would people find that to use? Does anyone have a clue what I'm talking about?
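To make the version-control idea concrete, here is a minimal local simulation of how it would work. Everything below is hypothetical: the repository names and page contents are made up, and git stands in for whichever version control system we'd actually use. The point is just that after one full clone, each later update transfers only the changes.

```shell
set -e
# Local simulation (names here are made up): a "server" repository holds
# the exported pages; a backup clone pulls only the changes each update.
tmp=$(mktemp -d) && cd "$tmp"

# The "server" side: an export of the wiki content under version control.
git init -q server
cd server
git config user.email nlab@example.org
git config user.name "nlab backup demo"
echo "page, version 1" > HomePage
git add HomePage
git commit -qm "initial export"
cd ..

# First backup: one full clone (the only full download ever needed).
git clone -q "$tmp/server" backup

# Later the wiki changes; the backup then fetches only the difference.
echo "page, version 2" > server/HomePage
(cd server && git commit -qam "daily export")
(cd backup && git pull -q --ff-only)
cat backup/HomePage   # prints: page, version 2
```

So a person keeping a backup would only ever run the equivalent of `git pull`, which is about as easy as it gets, and the bandwidth cost is proportional to what changed, not to the size of the whole database.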
Remember that as these are backups, it's not important that everyone do it, just that enough people do it.
Would it make sense to automatically provide a variety of backup files somewhere where they may be downloaded via sftp as desired?
This is how we do (or, will do) full backups. To do this, there needs to be a place that both the person downloading the backup and the "mathforge machine" have access to. The obvious place is mathforge itself, which means that the person doing the backup needs secure (ssh) access to it. At the moment, that's in one of the top 3 levels of access.
The reason I say "do (or, will do)" is that for the top 2 levels of access (currently you, me, and Toby), this is all in place already - I just need to tell you where the files are located on the server and then you can download them whenever you want.
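For anyone unfamiliar with sftp, the download step would look something like the sketch below. The host and path are hypothetical placeholders (the real location is whatever gets announced privately to those with ssh access); the runnable part just shows a sanity check worth doing on any downloaded dump before trusting it.

```shell
# Hypothetical host and path; the real location is announced privately:
#
#     sftp you@mathforge.example:/backups/nlab-full.sql.gz .
#
# Before relying on a downloaded dump, check it decompresses cleanly.
# (Here a stand-in file is created so the check can be demonstrated.)
echo "dummy database dump" | gzip > nlab-full.sql.gz
gunzip -t nlab-full.sql.gz && echo "backup archive OK"   # prints: backup archive OK
```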
For the other levels, version control can work over http, and since each update wouldn't be a single file of the whole lot, it wouldn't greatly affect our traffic.
At the moment, though, the precise technicalities are not so important - but perhaps my original post wasn't clear on this. What I'm most interested in is:
(But on that last point, remember that security on the n-lab process is not very tight anyway so making the backups available over a 2048-bit SSL encrypted channel with Implicit Graph Network Disruption Detection is a bit overkill.)
((Go on, admit it. Who googled "Implicit graph network disruption detection"?))
I just need to tell you where the files are located on the server and then you can download them whenever you want.
Have you told us? I forget. I haven't been making backups, but I should start to.
((Go on, admit it. Who googled "Implicit graph network disruption detection"?))
Give me a chance! You asked too quickly.
For those with personal webs
If they're personal but not very private, then anybody can make a backup (albeit possibly only the hard way, by crawling all of the revision pages, and only the HTML if it's merely published). In principle, someone with a personal but world-readable web could request us not to back up revisions, but making the web world-readable is a much bigger hole. So it's probably really for private (password-protected and not published) webs that we need to clarify this.
Instructions for getting the full weekly backup are now in the instiki-user help document since that seems the best place to put them.
Thanks! I'll have to do that when I have a good connection.
Just added the daily backup instructions. I had to change a few permissions, so it might be 24hrs before the daily backups are available for download. Fortunately, in experimenting I created a new full backup, so there's no need to download the dailies until tomorrow. Incidentally, even compressed, the full backup is of the order of 50Mb (the dailies are uncompressed and tend to be a couple of Mb each).
Bleugh. Just realised that the links to the daily backups were pointing to the current files rather than the ones from the previous day, so they were being updated during the day. Fixed now (I hope).