Discussion:
Copying backup files
Philip Herlihy
2013-01-21 22:29:17 UTC
One of my customers (one of the rare few I can persuade to run backups
at all) wants to keep an additional copy on a further disk, just to be
on the safe side.

Now, you wouldn't want to re-configure the destination and run the
Backup program again, as that would break the sequence of Baseline and
Incrementals in the main backup location. So it's better to copy the
files.

However, those files are created with permissions which prevent the user
from accessing them (for understandable reasons) without changing
ownership and access rights - not something many people are comfortable
doing.

I've been trying to come up with something he can run without having to
think about it too hard.

One possibility is to run a baseline from time to time directed to the
additional location, immediately followed by a new baseline to the main
backup store - but that doesn't allow him to duplicate all the
subsequent incrementals.

Another is to use something like Acronis True Image (which somehow
manages to acquire all the permissions it needs) to clone the main
backup store to the additional location - possibly using True Image's
own Incremental facility. Hmmm...

I'm inclined to write him a command-line script which will first
traverse the main backup store using takeown and icacls to make sure the
user has sufficient permissions, and then use something like Robocopy to
duplicate the files - Robocopy usefully skips files already copied. One
problem with such a script is that if the additional store is on USB it
may end up with a different drive letter (we don't have the Pro version,
so we can't use a network location). Maybe if the script resides on the
external disk I could get it to be drive-letter independent.
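
Roughly what I have in mind - an untested sketch, where the source
path is only an example (it would be wherever Windows Backup writes
its files):

  @echo off
  rem Run from the external disk: %~d0 expands to whatever drive
  rem letter the script itself was launched from.
  set SRC=E:\WindowsImageBackup
  set DEST=%~d0\BackupCopy

  rem Take ownership and grant the current user read access.
  takeown /f "%SRC%" /r /d y
  icacls "%SRC%" /grant "%USERNAME%":(OI)(CI)R /t

  rem Copy across; /xo skips files already copied and unchanged.
  robocopy "%SRC%" "%DEST%" /e /xo /r:2 /w:5 /log:"%~d0\copylog.txt"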

Or I guess we could enable the built-in Administrator account and have
him use that to run the copy. Actually - experimenting a bit - an
elevated command window does seem to be able to access these folders and
files, so maybe a plain Robocopy script (run with elevation) is the way
to go?
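
(For what it's worth, a script can prompt for its own elevation rather
than relying on a right-click - a sketch, assuming PowerShell is
present, as it is on Windows 7. Placed at the top of the batch file:

  net session >nul 2>&1
  if errorlevel 1 (
      powershell -Command "Start-Process -FilePath '%~f0' -Verb RunAs"
      exit /b
  )

"net session" fails unless the window is already elevated, so the
script relaunches itself via a UAC prompt and the original instance
quits.)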

Any thoughts? Any hazards?
--
Phil, London
Gene E. Bloch
2013-01-21 23:17:25 UTC
Post by Philip Herlihy
...
I have two backup drives marked A and B with tape labels, and in
Macrium I created two backup scripts labeled A and B to match.

Both have been configured to look at alternative drive letters, in case
they have changed.

Unfortunately, the way all that is done is IMO not easy for a naive user
to set up. It's not that easy for me, and I claim to be experienced.

Also - I just tested - Macrium does allow me to access the files for
copying, if you prefer that method. The account is of class
Administrator, but it's not *the* admin account.
--
Gene E. Bloch (Stumbling Bloch)
Philip Herlihy
2013-01-22 00:12:32 UTC
Post by Gene E. Bloch
...
Thanks, Gene. I think Macrium is in most respects equivalent to True
Image, in that it's copying sectors rather than files (it's a cloning
tool) and it also seems to elevate itself (or get you to do it).

I've been experimenting, and I have a basic script which (if run
elevated) can access the files without problem and copy them to the
device on which the script resides, whatever its current drive letter.
You can use parameter substitution to tease the drive letter out of the
invocation string. So far, that looks like the best option I can think
of. If run without elevation, you simply get "Access Denied" errors.
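
(The substitution in question is the %~d0 modifier, which expands to
the drive the script was invoked from - so a line like

  robocopy "E:\WindowsImageBackup" "%~d0\BackupCopy" /e /xo

always writes to the disk the script lives on, whatever letter Windows
has assigned it this time. The source path is just an example.)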
--
Phil, London
Gene E. Bloch
2013-01-22 19:10:32 UTC
Post by Philip Herlihy
...
You say of Macrium, "it's a cloning tool". But it's two tools, a cloning
tool and an imaging tool. The imaging tool allows for incremental
backups, and allows for separate access to any dated image. They are
compressed images a la vhd files (I hope I recalled the extension
correctly), i.e., analogous to virtual hard drives.
--
Gene E. Bloch (Stumbling Bloch)
Philip Herlihy
2013-01-22 23:03:15 UTC
Post by Gene E. Bloch
...
Sure. I'm not entirely comfortable using sector-backup tools for
backing up ordinary files (though I'm not entirely sure why not). I'd
imagine if you did a defragmentation, the incremental backup next time
would be huge. It just seems so wasteful. I have to say, I liked
ntbackup, which goes back to Windows 2000!
--
Phil, London
Gene E. Bloch
2013-01-22 23:11:11 UTC
Post by Philip Herlihy
...
My experience doesn't seem to agree with your speculation.

It might be informative to do an experiment: do an image backup, defrag,
and do an incremental right away.

Since Windows automatically defrags my computer when I'm not looking (I
never killed that default behavior), I should see some lengthy
incremental backups in Macrium, but I don't.
--
Gene E. Bloch (Stumbling Bloch)
Gene E. Bloch
2013-01-22 23:46:39 UTC
Post by Gene E. Bloch
...
Well, what the heck. I decided to read the manual. In the program's
Help, which sends you to the Macrium website,

http://www.macrium.com/help/v5/reflect_v5.htm

Under Advanced Topics, I found this:

"When backing up a disk or partition you can select to do so using
Intelligent Sector Copy. This is the recommended method for creating
images. Using this method will result in smaller images and create them
with greater speed.

How does it work ? In the case of a full image, Macrium Reflect will
take a snapshot and then save only the clusters that are in use on the
disk. In the case of a differential or an incremental, after the
snapshot has been taken, Macrium Reflect will compare the clusters in
use with the previous image and then save only those clusters that have
been modified."

So I'm forced to agree with you, much as I hate to admit it :-)

I had somehow believed that was the way cloning, and only cloning, works
in Macrium Reflect, but I haven't found documentary evidence that they
do cloning the way full images are described above. Of course, I think
they should do that.

When imaging, I still haven't seen huge incremental files or seen huge
backup times unless I have gone way too long since the last image
backup, which happens much too often.
--
Gene E. Bloch (Stumbling Bloch)
Philip Herlihy
2013-01-23 15:20:55 UTC
Post by Gene E. Bloch
...
Thanks for this - it does seem to confirm my presumption, although I
note that you haven't experienced large incrementals as a result. I
still think it's a messy way to do file backups, though - although I
certainly recognise the value of a system image.
--
Phil, London
Gene E. Bloch
2013-01-23 21:05:31 UTC
Post by Philip Herlihy
...
I have mixed feelings about it. It's a bit counter-intuitive, but at the
same time, it makes some sense. If properly done you certainly should
end up with an image that reflects accurately the state of the partition
at the time of backup (granted that Shadowing is black magic, in terms
of my understanding), and it ought to be more efficient than rewriting
entire files when only one cluster has changed.

I'm not sure why Macrium doesn't seem to do the same analysis before a
clone. Maybe they do, and I just don't realize it.
--
Gene E. Bloch (Stumbling Bloch)
Philip Herlihy
2013-01-23 22:19:50 UTC
Post by Gene E. Bloch
...
I haven't used Macrium, but I have used Acronis True Image, and that has
a facility to do what they call a 'Differential' backup. But it works
at the Sector level, not the file level. I was interested in your
report that defragging a disk doesn't seem to generate spurious
differential backups - perhaps defragging also works at the Sector level
- below my radar, I'm afraid. Surely it can't be as space-efficient,
though. If you change one tiny text file, you have to back up the
entire cluster, typically 4 KB. I guess it's conceivable that such
products might be able to back up only part of a file, which could
make them more space-efficient. Ah - just found this in the
Acronis True Image Home 2012 Help:

"An incremental or differential backup created after a disk is
defragmented might be considerably larger than usual. This is because
the defragmentation program changes file locations on the disk and the
backups reflect these changes. Therefore, it is recommended that you re-
create a full backup after disk defragmentation."
--
Phil, London
Gene E. Bloch
2013-01-23 22:36:56 UTC
Post by Philip Herlihy
...
Odds are that the same would be true of Macrium, then, since they use
the same approach.

Surely the people at Acronis, at least, are more knowledgeable than I
am, wouldn't you think? So you might have just forced me to abandon a
prejudice or two :-)
--
Gene E. Bloch (Stumbling Bloch)
Philip Herlihy
2013-01-23 22:50:54 UTC
Post by Gene E. Bloch
...
The older you get, the more you know. And the less you're certain of
any of it...
--
Phil, London
Robin Bignall
2013-01-24 00:23:19 UTC
Post by Philip Herlihy
...
True. I know less than anyone about how disks operate in detail, but I
suppose common sense will tell us that there must be trade-offs. For
example, if you have a system that examines the sectors in some way to
back up just those that have changed, this operation must take time,
during which many other sectors will change on a very busy disk (or even
SSD). I can't see how, in that case, one can end up with an accurate
image of one instant, even though CPUs are very much faster than I/O
devices. I don't know if Shadow Protect does it at the sector level for
incrementals or differentials, or some unit larger, but their idea is to
capture an image in the shortest possible time, and then write it out to
a disk at leisure, in parallel with other processing. Disk space is
cheap; it doesn't matter how large incrementals are with that approach.
--
Robin Bignall
Herts, England
Philip Herlihy
2013-01-25 12:31:37 UTC
Post by Robin Bignall
...
All true (providing you are taking incrementals and not copying
everything every time, and especially not overwriting previous copies).
However, if I want to restore a particular file, the fact that the
backup archive is structured in terms of sectors seems an unnecessary
complication. I certainly value disk images, which work via sectors,
but my own feeling is that it's a stretch to use that technology for
simple file backup, which is a mature field of its own.
--
Phil, London
Robin Bignall
2013-01-25 16:31:15 UTC
Post by Philip Herlihy
...
ShadowProtect incrementals run every two hours on my machine during the
week. They only copy what's changed, and neither they, nor full backups,
copy free space unless you want them to. The Incs are strung out
separately "below" the full to which they refer; they don't overwrite
the previous Inc.

SP, along with other systems, has a Mount command for any image. It
presents the image's files in a Windows Explorer format and you can copy
any file out of the image. It also has a Write option so that you can
copy any file INTO the image. The image, when demounted, will then
create a slightly expanded Inc that contains the new data. I seldom use
this facility unless, say, I want to shrink an image to restore to a
smaller device.

I agree with you about file backup, and there are systems that are based
on a file, rather than a volume, method. SP, however, has met all of my
needs, so far.
--
Robin Bignall
Herts, England
Philip Herlihy
2013-01-26 14:39:20 UTC
Post by Robin Bignall
...
Seems you can only buy it 5 licenses at a time ($89).
--
Phil, London
Robin Bignall
2013-01-26 17:45:22 UTC
Post by Philip Herlihy
...
It's certainly not cheap, but I didn't realise my licence covers 5
machines.
--
Robin Bignall
Herts, England
Gene E. Bloch
2013-01-26 20:32:56 UTC
Post by Robin Bignall
...
If you guys are talking about the ShadowProtect that is found at

http://www.storagecraft.com/

I just looked there. I see a single Desktop Edition license at $89.95
and a three-user license (Desktop Edition - Home User Bundle) at
$209.95.

I'd say fairly expensive.
--
Gene E. Bloch (Stumbling Bloch)
Rodney Pont
2013-01-26 21:05:23 UTC
Post by Gene E. Bloch
...
Could the confusion here be that it's ShadowProtect version 5?
(ie ShadowProtect 5 $89.99)
--
Regards - Rodney Pont
The from address exists but is mostly dumped,
please send any emails to the address below
e-mail rpont (at) gmail (dot) com
Gene E. Bloch
2013-01-26 21:28:23 UTC
Post by Rodney Pont
...
Yeah, I can believe that :-)

It is written as 'SHADOWPROTECT 5'. The '5' is very prominent & it isn't
adorned with the word 'version' or any abbreviation of it.
--
Gene E. Bloch (Stumbling Bloch)
Robin Bignall
2013-01-26 21:55:15 UTC
Post by Rodney Pont
...
Could the confusion here be that it's ShadowProtect version 5?
(ie ShadowProtect 5 $89.99)
Yes, probably. That's the latest version.
--
Robin Bignall
Herts, England
Philip Herlihy
2013-01-27 13:59:16 UTC
Post by Gene E. Bloch
...
If you guys are talking about the ShadowProtect that is found at
http://www.storagecraft.com/
I just looked there. I see a single Desktop Edition license at $89.95
and a three-user license (Desktop Edition - Home User Bundle) at
$209.95.
I'd say fairly expensive.
I think I must have mis-read "5 Desktop", which does appear on the
page... (a senior moment).
--
Phil, London
Gene E. Bloch
2013-01-28 00:08:29 UTC
Post by Philip Herlihy
...
Senior, but probably not serious :-)

Actually, there was a post above from Rodney Pont, who guessed as much.
Thanks for the proof (we need proof!).
--
Gene E. Bloch (Stumbling Bloch)
Robin Bignall
2013-01-28 00:08:52 UTC
Post by Philip Herlihy
...
I just did my usual C: backup with SP. It took 14 minutes. I also did
a Windows backup of C: including a system image. It took about an hour.
Both to the same HDD. What on earth is Windows doing that makes it so
slow? Copying empty space?
--
Robin Bignall
Herts, England
Philip Herlihy
2013-01-28 13:00:16 UTC
Post by Robin Bignall
...
Was the SP backup a full backup, or an incremental/differential? Was
the Windows backup a "complete" backup (image) or a file backup?

You'll be aware that file backup programs traditionally track what
they've copied via the Archive attribute (clearing it after each copy),
so running two backup programs needs to be thought through very
carefully.
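
(Robocopy illustrates the convention: its /m switch copies only files
whose Archive attribute is set, then clears the attribute, so a second
program relying on the same flag would see nothing left to copy. For
example:

  robocopy C:\Data E:\Copy /e /m

where the paths are only examples; running "attrib" shows an "A" flag
on any file that has changed since the attribute was last cleared.)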
--
Phil, London
Robin Bignall
2013-01-28 15:56:57 UTC
Post by Philip Herlihy
...
It was just an experiment. I don't use Windows backup. SP does not
care about files being marked; it images volumes.
--
Robin Bignall
Herts, England
Gene E. Bloch
2013-01-25 19:07:34 UTC
Permalink
Post by Philip Herlihy
Post by Robin Bignall
...
Post by Philip Herlihy
The older you get, the more you know. And the less you're certain of
any of it...
True. I know less than anyone about how disks operate in detail, but I
suppose common sense will tell us that there must be trade-offs. For
example, if you have a system that examines in some way the sectors to
just backup those that have changed, this operation must take time,
during which many other sectors will change on a very busy disk (or even
SSD). I can't see how, in that case, one can end up with an accurate
image of one instant, even though CPUs are very much faster than I/O
devices. I don't know if Shadow Protect does it at the sector level for
incrementals or differentials, or some unit larger, but their idea is to
capture an image in the shortest possible time, and then write it out to
a disk at leisure, in parallel with other processing. Disk space is
cheap; it doesn't matter how large incrementals are with that approach.
All true (providing you are taking incrementals and not copying
everything every time, and especially not overwriting previous copies).
However, if I want to restore a particular file, the fact that the
backup archive is structured in terms of sectors seems an unnecessary
complication. I certainly value disk images, which work via sectors,
but my own feeling is that it's a stretch to use that technology for
simple file backup, which is a mature field of its own.
I'm not sure how or why this bothers you. You don't need to (can't, in
fact) deal directly with the way the files are backed up or with the way
the backups are structured. If you need to restore an entire drive, the
program does it for you. If you need to reclaim a file or files, the
program mounts the backup and you see it as a Windows disk drive. You
can just copy files or folders at will, as needed. In both cases you can
choose a backup of a specific date, if there is a series of
incrementals.

There's a twist (in Macrium, at least), in that when you mount an image,
you have to notice & check an obscure checkbox in order to mount it with
full access. Otherwise you get unpleasant surprises when you try to
access things like your Documents :-)
--
Gene E. Bloch (Stumbling Bloch)
Philip Herlihy
2013-01-26 14:41:08 UTC
Permalink
Post by Gene E. Bloch
...
Post by Philip Herlihy
However, if I want to restore a particular file, the fact that the
backup archive is structured in terms of sectors seems an unnecessary
complication. I certainly value disk images, which work via sectors,
but my own feeling is that it's a stretch to use that technology for
simple file backup, which is a mature field of its own.
I'm not sure how or why this bothers you. You don't need to (can't, in
fact) deal directly with the way the files are backed up or with the way
the backups are structured. If you need to restore an entire drive, the
program does it for you. If you need to reclaim a file or files, the
program mounts the backup and you see it as a Windows disk drive. You
can just copy files or folders at will, as needed. In both cases you can
choose a backup of a specific date, if there is a series of
incrementals.
There's a twist (in Macrium, at least), in that when you mount an image,
you have to notice & check an obscure checkbox in order to mount it with
full access. Otherwise you get unpleasant surprises when you try to
access things like your Documents :-)
Well, when MS designed a backup facility for Windows Vista, they chose
to separate sector-based imaging from file-based backups, and that just
feels right to me.
--
Phil, London
Gene E. Bloch
2013-01-26 19:59:03 UTC
Permalink
Post by Philip Herlihy
...
Well, when MS designed a backup facility for Windows Vista, they chose
to separate sector-based imaging from file-based backups, and that just
feels right to me.
File-based backup, in my experience, means a backup scheme wherein you
save a specific chosen set of files and folders, and nothing else. You
can recover only the files you initially chose, since they are the only
ones that get saved.

Image-based backups, like clones, mean backups where what you save is a
complete version of the partition or drive that is backed up. You can
recover any file, including the entire hard drive boot structure if
needed.

I like images and clones precisely because I don't need to decide today
what I might need in the future, with the potential problem that the one
I need next Thursday will be one of the files I didn't select to back
up.
--
Gene E. Bloch (Stumbling Bloch)
Char Jackson
2013-01-26 22:21:35 UTC
Permalink
Post by Gene E. Bloch
I like images and clones precisely because I don't need to decide today
what I might need in the future, with the potential problem that the one
I need next Thursday will be one of the files I didn't select to back
up.
+1
--
Char Jackson
Robin Bignall
2013-01-22 23:48:49 UTC
Permalink
On Tue, 22 Jan 2013 15:11:11 -0800, "Gene E. Bloch"
Post by Gene E. Bloch
...
I have two backup drives respectively marked A and B with tape labels
and in Macrium I created two backup scripts respectively labeled A and
B.
Both have been configured to look at alternative drive letters, in case
they have changed.
Unfortunately, the way all that is done is IMO not easy for a naive user
to set up. It's not that easy for me, and I claim to be experienced.
Also - I just tested - Macrium does allow me to access the files for
copying, if you prefer that method. The account is of class
Administrator, but it's not *the* admin account.
Thanks, Gene. I think Macrium is in most respects equivalent to True
Image, in that it's copying sectors rather than files (it's a cloning
tool) and it also seems to elevate itself (or get you to do it).
I've been experimenting, and I have a basic script which (if run
elevated) can access the files without problem and copy them to the
device on which the script resides, whatever its current drive letter.
You can use parameter substitution to tease the drive letter out of the
invocation string. So far, that looks like the best option I can think
of. If run without elevation, you simply get "Access Denied" errors.
You say of Macrium, "it's a cloning tool". But it's two tools, a cloning
tool and an imaging tool. The imaging tool allows for incremental
backups, and allows for separate access to any dated image. They are
compressed images a la vhd files (I hope I recalled the extension
correctly), i.e., analogous to virtual hard drives.
Sure. I'm not entirely comfortable using sector-backup tools for
backing up ordinary files (though I'm not entirely sure why not). I'd
imagine if you did a defragmentation, the incremental backup next time
would be huge. It just seems so wasteful. I have to say, I liked
ntbackup, which goes back to Windows 2000!
My experience doesn't seem to agree with your speculation.
It might be informative to do an experiment: do an image backup, defrag,
and do an incremental right away.
Since Windows automatically defrags my computer when I'm not looking (I
never killed that default behavior), I should see some lengthy
incremental backups in Macrium, but I don't.
Neither do I using ShadowProtect. Although the SP people do warn
against doing an incremental during a defrag run, I've never seen any
problems.
--
Robin Bignall
Herts, England
Philip Herlihy
2013-01-23 15:22:25 UTC
Permalink
Post by Robin Bignall
...
Post by Gene E. Bloch
Post by Philip Herlihy
Sure. I'm not entirely comfortable using sector-backup tools for
backing up ordinary files (though I'm not entirely sure why not). I'd
imagine if you did a defragmentation, the incremental backup next time
would be huge. It just seems so wasteful. I have to say, I liked
ntbackup, which goes back to Windows 2000!
My experience doesn't seem to agree with your speculation.
It might be informative to do an experiment: do an image backup, defrag,
and do an incremental right away.
Since Windows automatically defrags my computer when I'm not looking (I
never killed that default behavior), I should see some lengthy
incremental backups in Macrium, but I don't.
Neither do I using ShadowProtect. Although the SP people do warn
against doing an incremental during a defrag run, I've never seen any
problems.
Interesting. Perhaps one of those 'theoretical' problems which doesn't
matter in practice.
--
Phil, London
Ashton Crusher
2013-01-22 23:22:43 UTC
Permalink
On Mon, 21 Jan 2013 22:29:17 -0000, Philip Herlihy
Post by Philip Herlihy
...
Maybe my memory is bad but I don't recall having any problem simply
copying my TrueImage backup files to a new location. AFAIK they are
just files like any other file.
Philip Herlihy
2013-01-23 15:23:48 UTC
Permalink
In article <***@4ax.com>, ***@moore.net
says...
Post by Ashton Crusher
On Mon, 21 Jan 2013 22:29:17 -0000, Philip Herlihy
...
Post by Ashton Crusher
Maybe my memory is bad but I don't recall having any problem simply
copying my TrueImage backup files to a new location. AFAIK they are
just files like any other file.
I was talking about the files generated by Windows Backup, which are
indeed protected against access by ordinary users. True Image doesn't
do this.
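
If anyone does want the manual route to those protected files, the
outline (from an elevated prompt) would be something like this - the
path is an invented example, so experiment on a copy first:

    rem Take ownership of the whole backup tree, answering prompts with Yes
    takeown /F "E:\Backups" /R /D Y
    rem Grant the user full control, inherited by subfolders and files
    icacls "E:\Backups" /grant "Fred:(OI)(CI)F" /T
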
--
Phil, London
Gordon
2013-01-22 23:54:02 UTC
Permalink
On Mon, 21 Jan 2013 22:29:17 -0000, Philip Herlihy
Post by Philip Herlihy
...
Will someone please correct me if I'm making a serious error in this
matter. I use three external 500 GB hard drives in separate DYNEX
cases. I back up or rather copy the entire Libraries folder from each
computer once a week. I just create a new folder on one of the
external hard drives, using the date as part of the folder name
(130122 Pavilion) for example. Then do a copy/paste from the
computer's Libraries into the hard drive's new folder. This gives me a
new copy of the Libraries folder each week or so and I have several
earlier copies remaining on these hard drives, just in case I run into
a problem. I can delete the older copies when the external hard drives
begin to get too full.

Is this not a workable, easy way to assure myself that I have safe,
secure backups? I put these backup external drives in separate
buildings, as a safeguard against some disaster like a fire or
tornado.
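
For what it's worth, that weekly copy could be scripted so the dated
folder is created automatically. A rough sketch, with invented paths,
using WMIC for a locale-independent date:

    @echo off
    rem LocalDateTime looks like 20130122093000.500000+000; keep the value line
    for /f %%I in ('wmic os get LocalDateTime ^| find "."') do set DT=%%I
    rem Characters 2-7 give a YYMMDD stamp, e.g. 130122
    set STAMP=%DT:~2,6%
    rem Copy this week's documents into a new dated folder on the external drive
    robocopy "C:\Users\Fred\Documents" "F:\%STAMP% Pavilion\Documents" /E
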
Gene E. Bloch
2013-01-23 00:33:36 UTC
Permalink
Post by Gordon
On Mon, 21 Jan 2013 22:29:17 -0000, Philip Herlihy
Post by Philip Herlihy
...
Will someone please correct me if I'm making a serious error in this
matter. I use three external 500 GB hard drives in separate DYNEX
cases. I back up or rather copy the entire Libraries folder from each
computer once a week. I just create a new folder on one of the
external hard drives, using the date as part of the folder name
(130122 Pavilion) for example. Then do a copy/paste from the
computer's Libraries into the hard drive's new folder. This gives me a
new copy of the Libraries folder each week or so and I have several
earlier copies remaining on these hard drives, just in case I run into
a problem. I can delete the older copies when the external hard drives
begin to get too full.
Is this not a workable, easy way to assure myself that I have safe,
secure backups? I put these backup external drives in separate
buildings, as a safeguard against some disaster like a fire or
tornado.
If you're sure that all the data you care about is in the libraries,
that should be OK.

But I find libraries a bit strange. They don't always do what I expect
of them, so I would suggest spending some quality time looking at one of
your backup folders to make sure it really does contain what you think
should be there.

All in all, clones and image backups work for me because they contain
everything on my hard drive, so if I need something that I didn't
anticipate that I might need, it will be there, assuming I can find it
:-) ... I wouldn't rely on your method - but that's my prejudice (or
superstition) speaking.
--
Gene E. Bloch (Stumbling Bloch)
Philip Herlihy
2013-01-23 16:29:05 UTC
Permalink
Post by Gordon
On Mon, 21 Jan 2013 22:29:17 -0000, Philip Herlihy
...
Post by Gordon
Will someone please correct me if I'm making a serious error in this
matter. I use three external 500 GB hard drives in separate DYNEX
cases. I back up or rather copy the entire Libraries folder from each
computer once a week. I just create a new folder on one of the
external hard drives, using the date as part of the folder name
(130122 Pavilion) for example. Then do a copy/paste from the
comptuer's Libraries into the hard drive's new folder. This gives me a
new copy of the Libraries folder each week or so and I have several
earlier copies remaining on these hard drives, just in case I run into
a problem. I can delete the older copies when the external hard drives
begin to get too full.
Is this not a workable, easy way to assure myself that I have safe,
secure backups? I put these backup external drives in separate
buildings, as a safeguard against some disaster like a fire or
tornado.
No serious error, but I think you could do better. Maybe I'm stuck in
an era when disk space really wasn't cheap, but you seem to be using
your available space wastefully. If you had to back up a file-server
containing masses of information, only a little of which changes much
(the most common situation), you would probably need to be economical
with the space you use, instead of just generating four full copies of
everything. Similarly, backing up using sector-copying tools seems to
be wasting the information which Windows uses (the 'Archive' flag) to
mark new or changed files. The backup schemes I use depend on this, and
they scale to large quantities of data, frequently backed-up.

A backup scheme also needs (in my view) to prioritise the most likely
causes of loss of data. In my situation, I think the most likely cause
is accidentally deleting or overwriting a file, followed by disk
failure. Fire, theft and a giant meteorite blowing up my part of South
East England are lower down the scale, but a failure to maintain regular
backups because you can't be bothered to go and get the disk out the
cupboard is a real risk. So, my preferred backup strategy is this:

I mount an additional disk in the machine to be backed-up. I use backup
software (that in Windows 2000 or XP Pro was fine, and the Vista/W7
backup software is also ok) to generate backups automatically on a
schedule. I can rely on this running, as the disk is always present. I
check the 'report' manually once a week (having once seen a colleague
ruefully surveying 6 months of unexamined failure reports!).

On one customer's site, the file server does a Baseline ("full") backup
once a month - note that a Baseline backup resets the Archive flag for
every file it touches, after copying it. Every weekday, a Differential
is done (only copies changed or new files, but doesn't reset the Archive
flags) with an Incremental (same but does reset the flag) every
Saturday. All of this is managed by a command-line script I've evolved.
The advantage of this scheme is that (provided you haven't had to delete
the files involved through running out of space) you can go back to the
version of any file which existed on any chosen date, yet for a full
restore you need only the Baseline, the latest Incremental, and any
subsequent Differentials to get back to the latest position.
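
Quite apart from whatever backup utility you use, robocopy can
demonstrate those two Archive-flag behaviours on its own - paths
invented for the example:

    rem Differential-style: copy files whose Archive flag is set,
    rem leaving the flag alone
    robocopy "D:\Data" "E:\Backup\Diff" /E /A
    rem Incremental-style: copy the flagged files and reset the flag
    robocopy "D:\Data" "E:\Backup\Incr" /E /M
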

Note that if you damage a file without realising it, and then copy the
damaged one over the most recent backup, you've lost the game - keeping
versions is important in some situations (think of an important
financial spreadsheet which is edited daily). Note also that these file
servers run XP or Windows 2000, which allow fine control of backup modes
- Vista and later only have Baselines and Incrementals, and the utility
decides which you get.

Now, that leaves you exposed to the risk that a PSU firestorm (or some
such event) could take out the whole machine, so it's well worth copying
the backup files (the complete, interdependent set) to another host. My
script uses robocopy to update a complete set of backup files held on
another machine. In this situation, the two machines are nearby,
leaving an exposure to fire, theft, meteorites, etc. I've discussed
this with the client, but he doesn't want to bother even moving the
second machine to a different room. It's for him to weigh the cost of
loss with his view of the likelihood of these events, so I can't insist.
I have encouraged him to keep critical files (like those spreadsheets)
on something like Dropbox or Skydrive or Google Docs, as an additional
precaution.
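
The shape of that robocopy job is simple enough - machine and share
names here are made up:

    rem Mirror the complete backup set to the second machine;
    rem /MIR also removes files that have vanished from the source
    robocopy "D:\Backups" "\\SERVER2\Backups" /MIR /R:2 /W:5 /NP /LOG+:"C:\Logs\mirror.log"
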

That file server holds a lot of stuff, much of it obsolete, of course,
but I can't be the one to weed it. A Baseline takes under 20 minutes and an
Incremental takes less than a minute, copying between two internal
disks. The Robocopy job over the network takes a bit longer (no figures
to hand, but under an hour, max.) This scheme allows all user files to
be backed-up daily, and recovered from any point in time up to several
months back. Without using vast quantities of disk space.

I do something similar on my own PC. When doing serious software
development (not as often as I'd like, these days) I use version control
software which amounts to backing up many times a day.

Going back to my OP: my customer wants to do something similar on a
Vista PC (his idea - few of my domestic customers can be persuaded even
to think about backups, sadly). The one external disk is to be USB
(I've persuaded him to invest in a USB3 card to match the drive he's
bought) which raises the possibility of a changing drive letter, which
is a problem if seeming to automate the process. And Vista protects
backup files against access by the user who initiates their creation. I
now have a tested prototype script which resides on the destination
volume and detects its current drive letter, and, if run 'elevated' can
happily access and copy the backup files, using robocopy to mirror the
backup archive on the main backup disk. He's happy with that, and so am
I. If it wasn't for the need for elevation, I could even use an
autorun.inf file to run the copying script automatically when the drive
was plugged-in.
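
In essence that script is only a few lines - this isn't the exact one,
and the folder names are invented, but %~d0 (which expands to the drive
on which the script itself resides) is the trick that makes it
drive-letter independent:

    @echo off
    rem %~d0 is this script's own drive, e.g. F: - so the copy works
    rem whatever letter Windows gives the USB disk this time
    set DEST=%~d0\BackupMirror
    rem Mirror the main backup store; needs an elevated prompt
    robocopy "D:\WindowsBackup" "%DEST%" /MIR /R:2 /W:5 /LOG+:"%~d0\mirror.log"
    pause
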
--
Phil, London
Ken Blake
2013-01-23 19:17:38 UTC
Permalink
On Wed, 23 Jan 2013 16:29:05 -0000, Philip Herlihy
Post by Philip Herlihy
A backup scheme also needs (in my view) to prioritise the most likely
causes of loss of data. In my situation, I think the most likely cause
is accidentally deleting or overwriting a file, followed by disk
failure. Fire, theft and a giant meteorite blowing up my part of South
East England are lower down the scale, but a failure to maintain regular
backups because you can't be bothered to go and get the disk out the
cupboard is a real risk.
...
I mount an additional disk in the machine to be backed-up. I use backup
software (that in Windows 2000 or XP Pro was fine, and the Vista/W7
backup software is also ok) to generate backups automatically on a
schedule. I can rely on this running, as the disk is always present.
...
Each to his own, but in my view, backup to an internal hard drive is
better than no backup at all, but just barely. Next to backup to a
second partition on your only hard drive, it's the weakest form of
backup there is. It leaves you susceptible to simultaneous loss of
the original and backup to many of the most common dangers: severe
power glitches, nearby lightning strikes, virus attacks, fire, user
error, even theft of the computer.

Giant meteorites, on the other hand, are not terribly likely. <g>
Philip Herlihy
2013-01-23 19:37:33 UTC
Permalink
Post by Ken Blake
On Wed, 23 Jan 2013 16:29:05 -0000, Philip Herlihy
...
Each to his own, but in my view, backup to an internal hard drive is
better than no backup at all, but just barely. Next to backup to a
second partition on your only hard drive, it's the weakest form of
backup there is. It leaves you susceptible to simultaneous loss of
the original and backup to many of the most common dangers: severe
power glitches, nearby lightning strikes, virus attacks, fire, user
error, even theft of the computer.
Giant meteorites, on the other hand, are not terribly likely. <g>
Depends entirely on your assessment of the probability of those hazards.
As I've said, the likeliest problems are inadvertent deletion or
overwriting of a file, or failure of a single disk. And, of course,
inertia preventing you from getting the external disk and plugging it
in. For some people, a NAS is a good solution.
--
Phil, London
NY
2013-01-23 20:36:23 UTC
Permalink
Post by Philip Herlihy
Depends entirely on your assessment of the probability of those hazards.
As I've said, the likeliest problems are inadvertent deletion or
overwriting of a file, or failure of a single disk. And, of course,
inertia preventing you from getting the external disk and plugging it
in. For some people, a NAS is a good solution.
Yes, a NAS has the advantage that it can be physically remote from the
computer that is being backed up (less likely to be stolen or go up in
flames if the main PC is) and yet is always online and so you can't forget
to connect it when the automatic backup process runs.

I backup to external HDDs that are plugged into USB, and I use a
manually-initiated automatic backup program (MS Sync Toy) to compare PC
against backup drive and copy just the new/changed files. I definitely want
a backup which makes an exact copy of each file, rather than something that
merges the whole backup into one big proprietary file because it's a pain
searching through that if you want to restore a file, and if the file gets
corrupted you've lost everything. It's also why I prefer Outlook Express,
Windows Mail or Windows Live Mail over Outlook because each message is in a
separate file rather than putting everything into one humungous PST file.

Ideally I'd use a NAS in another room, but the problem with that is getting
a network signal to it: it's a pain getting Ethernet cable into other rooms
and wireless is horrendously slow for backing up multi-GB files (TV
recordings). Either way knackers the network during the backup, which is
fine if you schedule the backup for overnight, but I run SyncToy as and when
I think "right, I've changed or added some files - let's make a backup".

When I remember I remove the backup drives and keep them in another room
(guarding against casual theft, although not fire) and I take them with me
when I leave the house for a few days, both to keep them separate from the
main PC and so I've got access to files while I'm away from home, though for
trivial access I can also use LogMeIn back to the main PC - also useful for
setting extra TV programmes to record that I'd forgotten to set before I
went on holiday!

I agree that the most likely disaster is accidental deletion of files. I
once deleted a folder containing loads of TV programmes. Almost all were
on the backup drive, but a few were new ones I was about to back up: I
was first deleting (as I thought) unwanted programmes from the drive to
be backed up, before comparing and backing up the rest - except I
somehow deleted the parent folder instead of a few of the files in it.
And I'd already started restoring the majority of the files from the
backup before I realised that I could use Recuva to undelete them - by
then it was too late for a lot of them. Talk about compounding a problem
through my own stupidity! In the event, all I lost were about 10
programmes, which I was able to download from BBC iPlayer to watch, even
if I no longer had a way to keep them forever - only for the few weeks
you get before iPlayer recordings time out.
Philip Herlihy
2013-01-23 22:49:23 UTC
Permalink
...
Post by NY
I backup to external HDDs that are plugged into USB, and I use a
manually-initiated automatic backup program (MS Sync Toy) to compare PC
against backup drive and copy just the new/changed files. I definitely want
a backup which makes an exact copy of each file, rather than something that
merges the whole backup into one big proprietary file because it's a pain
searching through that if you want to restore a file, and if the file gets
corrupted you've lost everything.
...

Backups were originally usually sent to tape - hence the large
'archive' file format. I've seen no appreciable problems due to this
(including PST files, etc).

Your scheme doesn't scale to a situation where you have a dozen users
changing files daily, and you find you need to go back a couple of
months of daily backups to get an undamaged file. I persuaded one
customer of this a couple of months ago, and a couple of weeks later she
realised she'd accidentally deleted large parts of a vital spreadsheet
before saving it. (Easy to do with many types of file if you select
more than you intended to, and then overtype it.) We found an intact
original from a point just after I'd sorted out a 'proper' backup regime
for her. All the subsequent versions were exact copies of the damaged
file.

For me, a utility only deserves the name 'Backup' if it captures a
sequence of all versions of a file, and is aware of the Archive
attribute. Anything else is just a copy, and rubbish copied is still
rubbish.

It's worth noting that DropBox, Google Docs and Skydrive maintain
versions of files (for Skydrive, only Office files) if you edit the
files through those sites.
--
Phil, London
Mike Barnes
2013-01-23 17:49:17 UTC
Permalink
Post by Gordon
Will someone please correct me if I'm making a serious error in this
matter. I use three external 500 GB hard drives in separate DYNEX
cases. I back up or rather copy the entire Libraries folder from each
computer once a week. I just create a new folder on one of the
external hard drives, using the date as part of the folder name
(130122 Pavilion) for example. Then do a copy/paste from the
computer's Libraries into the hard drive's new folder. This gives me a
new copy of the Libraries folder each week or so and I have several
earlier copies remaining on these hard drives, just in case I run into
a problem. I can delete the older copies when the external hard drives
begin to get too full.
I do something similar, with three Samsung 1TB USB drives (F:). But
there are differences. My data is separated into system (C:), documents
(D:), and media (M:). My weekly backup to the currently-mounted F:
treats each drive letter differently. C:, which is the least important,
is backed up using Windows Backup, and there is only the latest copy. D:
is the most complex because I want multiple copies and I want
encryption. So my backup consists of a password-protected TrueCrypt
volume, and like you I put the date in the filename and prune when the
disk starts to run low on space. Rather than copy and paste I use sync
software (AJC Sync) which copies only new files and files where the
timestamp has changed. I also use AJC Sync to back up the media files on
M:, again only where the timestamp has changed.

The entire process is automated and runs under the Windows task
scheduler in the early hours of every Saturday morning, taking about
half an hour. I'm presented with a report on the success or otherwise of
the backups. I then rotate the disks, with there always being one
attached to the PC, one in my car, and one somewhere else.
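
For anyone wanting to set up something similar, the scheduling half
needs only one line - task name and script path invented for the
example:

    schtasks /Create /TN "WeeklyBackup" /TR "C:\Scripts\backup.cmd" /SC WEEKLY /D SAT /ST 03:00 /RL HIGHEST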

Devising and testing the software for automating all this wasn't trivial
but it pays dividends in reliability. The weakest link in the chain is
me, and the less I have to do the better.
--
Mike Barnes
Ken Blake
2013-01-23 19:19:28 UTC
Permalink
On Wed, 23 Jan 2013 17:49:17 +0000, Mike Barnes
Post by Mike Barnes
...
The entire process is automated and runs under the Windows task
scheduler in the early hours of every Saturday morning, taking about
half an hour. I'm presented with a report on the success or otherwise of
the backups. I then rotate the disks, with there always being one
attached to the PC, one in my car, and one somewhere else.
I was about to disagree with your scheme, until I got to that last
paragraph above. What you say there is the key, and gives you
significant extra security.
Philip Herlihy
2013-01-23 19:35:13 UTC
Permalink
In article <***@g52lk5g23lkgk3lk345g.invalid>,
***@gmail.com says...
...
Post by Mike Barnes
Devising and testing the software for automating all this wasn't trivial
but it pays dividends in reliability. The weakest link in the chain is
me, and the less I have to do the better.
+1!
--
Phil, London