File Operations: Alternatives to Windows Explorer
In modern operating systems with a graphical user interface (GUI), operations on files and folders can easily be carried out with a file manager. On Windows, the Explorer serves this purpose by default. The visual manipulation of clearly displayed objects is beneficial in many ways.
But when such operations cover entire folder trees or large file collections, some serious drawbacks come to light. They stem essentially from the fact that this type of software is largely designed for interactive work: the user is basically forced to monitor the entire process and to react to various possible, and often unforeseen, interruptions.
A common cause for an interruption is that, for safety reasons, a single object will not be processed without explicit confirmation. In general this is a good thing, because a potential problem could have been overlooked. In other situations, however, this behaviour is very troublesome, because the interruption may occur at any time: file managers like the Explorer ask for the user's decision only at the moment it is needed. So for (perhaps repeated) operations on larger file collections, a program that runs automatically without further inquiry will be more suitable.
Copying files (and/or folders) may be one of the most common and important tasks, in particular the transfer between the PC's hard disk and removable media. It is quite easy to select folders or files with the Explorer and initiate a copy operation. However, the process may be interrupted for several reasons: the operation will stop when a target object already exists, and again when a "protected" file is affected. 1
For example, anyone who wants to create backup copies of entire folders and leave this possibly lengthy procedure unattended will perceive interposed queries as useless and annoying. That is why copy commands (console programs) like Xcopy may do a better job in such cases. They usually offer the opportunity to anticipate certain decisions that would otherwise be requested later.
This is because the operation can be controlled by starting the program with (additional) parameters. For instance:
xcopy "My Data" "E:\My Data" /e /r /v /c /i /y
Here the parameter (argument) /y suppresses the prompt before overwriting existing files at the destination, and /r allows even read-only files to be overwritten. The parameter /c instructs the program to continue on error. 2
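For illustration, the same unattended behaviour can be sketched in a cross-platform way with Python's standard library. This is only a rough analogue of the Xcopy call above, not the author's script: shutil.copytree with dirs_exist_ok=True (Python 3.8+) overwrites existing files without asking, and it keeps copying past individual file errors, collecting them for a report at the end, similar to /y and /c.

```python
import shutil

def backup_copy(src, dst):
    """Copy a folder tree unattended: overwrite existing files at the
    destination and continue past per-file errors (roughly xcopy /e /c /y).
    Illustrative sketch only; names are assumptions, not from the article."""
    try:
        # dirs_exist_ok=True lets the copy proceed into an existing target
        # tree, replacing files that are already there without inquiry
        shutil.copytree(src, dst, dirs_exist_ok=True)
    except shutil.Error as err:
        # copytree continues after individual failures and collects them;
        # report the skipped files instead of aborting the whole run
        for src_name, dst_name, why in err.args[0]:
            print(f"skipped {src_name}: {why}")
```

Run a second time against the same destination, the function simply overwrites the earlier copies, which is exactly the behaviour one wants for a repeated, unattended backup.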
As mentioned elsewhere, I prepared a VBScript for backup copying, which employs a console command (configurable, Xcopy by default). That script can be used right away or as a template for similar copy tasks. The packed file bakcopy.zip contains the script along with an explanation.
The deletion of folders and files with the Explorer generally proceeds without questions or difficulties. However, this smoothness is usually due to the fact that these objects are not actually deleted but just moved into a special system directory. This directory, typically displayed in the GUI as the "Recycle Bin" or the like, enables the recovery of deleted objects. The erase operation has thus only been postponed.
During the actual (final) deletion of files, quite a few circumstances can trigger an interruptive inquiry. The process stops when it encounters a "protected" file. And this can happen several times even after a blanket consent, because the Explorer distinguishes at least three different types, each handled separately (read-only file, system file, and program file).
Of course, such warnings are very useful. Yet there are tasks where the deletion of folder structures should proceed without interruption. Again, the recourse to console programs is advisable:
attrib -r -s -h "old data"\*.* /s /d
rd /s /q "old data"
This combination of two "DOS" commands ensures that all files within the folder old data (and its subdirectories) lose the special attributes that could prevent their erasure. Then the folder, including all its contents, is deleted.
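The same "strip the protective attributes, then delete" idea can be sketched in Python as a cross-platform illustration (an assumption for demonstration, not one of the utilities described below): shutil.rmtree accepts an error handler that, when a read-only file blocks the deletion, makes it writable and retries, roughly what attrib -r prepares for rd /s /q.

```python
import os, shutil, stat

def force_delete(folder):
    """Delete a folder tree without interruption, clearing the read-only
    flag on files that would otherwise block the erasure.
    Illustrative sketch; the function name is an assumption."""
    def on_error(func, path, exc_info):
        # a read-only file prevented the deletion:
        # make it writable and retry the failed operation
        os.chmod(path, stat.S_IWRITE)
        func(path)
    shutil.rmtree(folder, onerror=on_error)
```

As with the console commands, the danger of passing the wrong folder name is obvious; there is no Recycle Bin safety net here.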
Windows provides further means for this task. The folder deletion can, for example, also be performed using VBScript:
Set fso = CreateObject("Scripting.FileSystemObject")
fso.DeleteFolder "old data", vbTrue
The packed file killdirs.zip contains both implementations as small utilities, where the folder name is, of course, a variable parameter given at program start. Both programs also go a little beyond the commands shown above, because they briefly evaluate the parameter and display a short message if necessary. (The file killdirs_de.zip provides German versions.)
- The script killdir.vbs is suitable for Explorer integration, but can also be called directly:
killdir "old data" /quiet
- The command file killdir2.bat is meant to be started in a console window:
killdir2 "old data"
Of course, it is very important to beware of the danger of incorrect entries!
And please note that this information and the script/command files are intended for experienced PC users, and that any usage is at your own risk!
1) The process also always stops when an error occurs; a continuation is often impossible. But this problem may also appear with non-interactive programs, such as console commands (see Note 2).
2) This "fault tolerance" is sometimes useful, especially when it comes to backups, where an error must not prevent the copying of other folders and files.
However, this setting does not help when a file is too large for the destination drive or file system. Unfortunately, this can happen regardless of the free capacity, namely when a file larger than 4 GB is copied to a FAT system. Most programs handle (and report) the problem as if the target drive were full. Xcopy, too, aborts the whole process at this point.
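One way to avoid this abort in an unattended run is to check the source tree beforehand. The following Python sketch (an illustration I am adding, not part of Xcopy or the scripts above) lists files that exceed the FAT32 per-file limit, so they can be handled separately before the copy starts.

```python
import os

# maximum file size on FAT32: 4 GB minus 1 byte
FAT32_LIMIT = 4 * 1024**3 - 1

def oversized_files(folder, limit=FAT32_LIMIT):
    """Return the paths of all files under 'folder' that are larger
    than 'limit' and would therefore abort a copy to a FAT target."""
    hits = []
    for root, _dirs, files in os.walk(folder):
        for name in files:
            path = os.path.join(root, name)
            if os.path.getsize(path) > limit:
                hits.append(path)
    return hits
```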