Batch Rename Multiple Files with Linux

Here is a list of ways to rename multiple files with Linux. The easiest and quickest way I found was to use 'rename'.
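
For example, assuming the Perl-based rename that Ubuntu installs by default (some distributions ship the util-linux rename with a simpler from/to syntax instead, so treat this as a sketch and do a dry run first), lower-casing all .JPG extensions looks like this:

rename -n 's/\.JPG$/.jpg/' *.JPG    # -n only prints what would be renamed
rename 's/\.JPG$/.jpg/' *.JPG       # looks good? run it for real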

One hack of a perfect (as in jack of all trades) backup solution for Ubuntu Linux (remote, flexible, instant restore, automated, reliable)

This is a work in progress (and most likely will always be so)!

Here is what I have been working on and looking for to equip myself with. I wanted to keep working on my daily stuff without any hassle, just as I always had and with changes to come. But at the same time I needed to be sure that in a situation where I needed older versions of my files — be it due to a system or hard disk breakdown, a file deleted erroneously, or changes that need to be undone — they would be no more than one command away. In short, I wanted a time machine for my files that just works™ — also in at least 5 years' time. When recreating older versions I want to be able to focus on what to restore, not how. And with previous backups I have had corrupted archive files or unreadable part files/CDs too many times (one is even too many), or I've had issues because of too old a file format (mostly proprietary formats).

Here is what I’ve been looking for feature-wise generally:

  • no expenses money-wise
  • robust
  • using only small and freely available tools — the more system core utils the better
  • version control
  • snapshot system
  • remote storage
  • private, i.e. secure data transmission over network and reliably encrypted storage
  • suitable for mobility, independent of how I’m connected
  • simple yet flexible usage

for daily backups:

  • automation using cron
  • no need for interaction
  • easy and flexible declaration of files or folders to omit from backup

and for restoring data:

  • just works™ (see above)
  • fast and easy lookup of which versions are available, ideally via a GUI like Timeline with filter options
  • at the very best some sort of offline functionality, e.g. caching of the most likely (whatever that means) required older versions

Alternative (or partial) solutions I have come across along the way

  • Sun's ZFS (z file system): Haven't had enough time to get it working with Ubuntu Linux (not packaged because of license issues, only working via FUSE so far). Needs its own partition setup and is thus rather involved. Not sure about the networking/mobility demands, e.g. a remote snapshot location, nor about ease of use.
  • subversion together with svk: Easy and flexible to use and automate, version control per se, distributed and offline operations (svk). Contra: recovery relies on the Subversion software, i.e. no plain cp or mv. The basic idea is to work on a copy (checkout before you start) and have daily automated commits. Should need no interaction since I'm the only one working with my "backup projects". See this lengthy description.
  • Coda file system: distributed file system with caching. Haven't had enough time to try it out.
  • rsnapshot: Has remote feature (ssh, rsync), automation, rotation. Relies on file systems using hard links within backup folder hierarchy for “non-incremental files” and runs as root only (system wide conf file, ssh configure issue, ssh-key, …). Workaround could be to use a specific group.
  • sshfs: FUSE add on to use remote directories via ssh transparently.
  • cron-driven bash backup script using tar and gzip; daily incremental and a monthly full "snapshot", similar to logrotate (a minimal sketch follows after this list).
  • grsync: gnome GUI for rsync optimized for (incremental) backups
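
To make the tar/gzip idea a bit more concrete, here is a minimal sketch of such a script; the paths, the retention scheme and the use of tar's --listed-incremental snapshot file are my assumptions, not a tested setup:

#!/bin/bash
# daily incremental, monthly full backup via tar/gzip (sketch)
SRC="$HOME"                      # what to back up (assumed)
DST="/var/backups/mine"          # where the archives go (assumed)
SNAR="$DST/state.snar"           # tar's incremental state file
STAMP=$(date +%Y-%m-%d)

mkdir -p "$DST"

if [ "$(date +%d)" = "01" ]; then
    # first of the month: forget the old state and take a full backup
    rm -f "$SNAR"
    tar --listed-incremental="$SNAR" -czf "$DST/full-$STAMP.tar.gz" "$SRC"
else
    # every other day: archive only what changed since the last run
    tar --listed-incremental="$SNAR" -czf "$DST/incr-$STAMP.tar.gz" "$SRC"
fi

A crontab entry such as 0 3 * * * /home/user/bin/backup.sh (path hypothetical) would then run it nightly.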

Update 10/2009: A few weeks ago I stumbled upon Back In Time, which has astonishingly many of the properties I expect from a perfect backup solution. It is based on the FlyBack project and TimeVault. There is a — for some people maybe a little lengthy — video on blip.tv that shows how to install and use it and how straightforward the GUI is.

Ubuntu Linux backup utilities and links


Emerge clone for Debian-based distributions like Ubuntu, or: Compile your own, dude!

There is a nice overview about apt-build, the package I'm talking about here, so I will not say much; only what to do to try it out. On my system GNOME's system monitor is fairly slow, so I gave it a try:

  1. install the bundle:
    sudo aptitude install apt-build
    
  2. configure your processor (the debconf dialog asks you about it during installation; you can redo this later with sudo dpkg-reconfigure apt-build)
  3. add a deb-src line to your sources.list if you haven't already (see the example after this list)
  4. run it on gnome-system-monitor:
    sudo apt-build install gnome-system-monitor
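
The deb-src entry in step 3 simply mirrors the corresponding deb line; for a Gutsy-era system it could look roughly like this (mirror and release name are placeholders for whatever your sources.list already uses):

# /etc/apt/sources.list
deb     http://archive.ubuntu.com/ubuntu gutsy main restricted universe multiverse
deb-src http://archive.ubuntu.com/ubuntu gutsy main restricted universe multiverse

Don't forget a sudo aptitude update (or apt-get update) afterwards so the source package lists get fetched.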
    

And there you have it. You might want to copy the list of packages that apt-build installs via apt-get build-dep so you can mark them as auto installed using aptitude when done:

sudo aptitude markauto list_of_packages_you_copied_before

or, even easier, use apt-build's --remove-builddep option.

It really does make a difference!

If you're really keen, or you happen to have an older system just wasting away, try this:

sudo apt-build world

and see what happens ;)

Delete folder recursively via PHP if FTP says: Prohibited directory name

Using Eclipse to sync an FTP folder with my local XAMPP installation I ran into the following situation: I ended up with a folder created on the remote machine called 'C:\xampp\tmp' that I could not delete, or even view the properties or contents of, with any FTP client I could get my hands on (FileZilla, ncftp3, GNOME's "Connect to Server"). Since the folder was on web space I thought: "Hmm, why not try PHP." So I created a file called delfolder.php right in the folder containing the mysterious C: directory and finally came up with this:

<?php
// Recursively delete a directory including all files and subdirectories.
function rm_recursive($filepath) {
    if (is_dir($filepath) && !is_link($filepath)) {
        if ($dh = opendir($filepath)) {
            while (($sf = readdir($dh)) !== false) {
                if ($sf == '.' || $sf == '..' ) {
                    continue;
                }
                if (!rm_recursive($filepath.'/'.$sf)) {
                    throw new Exception($filepath.'/'.$sf.' could not be deleted.');
                }
            }
            closedir($dh);
        }
        return rmdir($filepath);
    }
    return unlink($filepath);
}

// Path to directory you want to delete
$directory = 'C:\xampp\tmp';

// Delete it
if (rm_recursive($directory)) {
    echo "{$directory} has been deleted";
} else {
    echo "{$directory} could not be deleted";
}

This, of course, only works if you either have direct shell access (but then again, why not just use rm -Rf 'C:\xampp\tmp') or have both FTP and web, i.e. HTTP, access to the folder in question.


Drupal: An easy way to set up multiple sites on localhost

This works for a couple of subdomains/sites only. If you need a large number of sites or other settings, this is not for you. On the other hand, this method needs no web server configuration.

  1. Open your system's hosts file in your favourite text editor (on Windows that's %SYSTEMROOT%\system32\drivers\etc\hosts, on Linux it's /etc/hosts).
  2. Find the line defining 127.0.0.1, i.e. your local horst, erm, localhost :) and append sitename1.localhost to the end of the line. Do so for every site name you need (see the example after this list).
  3. Go to your Apache/web server's document folder holding your Drupal installation. Say it's htdocs/drupal; then you need the folder sites. There should be at least two folders in there, called all and default. Copy default and name it sitename1.localhost, i.e. exactly the string you added to your hosts file (again you need to replace sitename1 with your site's name, but do include the dot!).
  4. In your web browser type sitename1.localhost/drupal to test whether Drupal shows up at all (meaning your OS resolves your "domain name" correctly) and whether it shows your old content (meaning it works).
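
As a concrete example with made-up site names, the hosts line and the copy step could look like this (the htdocs/drupal path is just an example):

# /etc/hosts
127.0.0.1   localhost sitename1.localhost sitename2.localhost

# then, from the web server's document root
cd htdocs/drupal/sites
cp -r default sitename1.localhost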

Now you have two options to actually set up your "new site": either edit the settings.php that should be in the new folder to use a different database (one that should be well stocked with Drupal data), or just install a fresh Drupal site. You achieve the latter by doing:

  1. Delete settings.php. That should leave you with a file named default.settings.php.
  2. Point your browser to sitename1.localhost/drupal/install.php and do everything like you did with the first install, but use a different database (or the same one with a different database prefix).
  3. Done.

Resources

  • settings.php


Ubuntu: Changing Hostname from Command Line

As described at debianadmin.com, it's straightforward:

sudo /bin/hostname mynewhostname

Supplying the full path to the binary is for security reasons, I guess, to make sure we have the right binary (even though it could have been replaced there, too…).

But the people at debianadmin.com forgot to mention that in order to avoid "hostname: Unknown host" you have to

sudoedit /etc/hosts

and change it there, too!
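
On a default Ubuntu install the line in question is usually the 127.0.1.1 entry; assuming the machine used to be called myoldhostname (a hypothetical name), the change looks like this:

# /etc/hosts, before
127.0.1.1   myoldhostname

# /etc/hosts, after
127.0.1.1   mynewhostname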

Ubuntu: Howto setup mp3 preview on Mouse-Over

Straight from Ubuntuguide.org:

You can also get mouse-over preview to work by installing:

sudo apt-get install mpg321
sudo apt-get install mpg123-esd
sudo apt-get install vorbis-tools
sudo apt-get install esound
sudo apt-get install ubuntu-restricted-extras

With this setup Skype still functions.

Three easy steps to Install Ubuntu Fresh but still have all your Favourite Packages Installed

Also, there is a very interesting article at Linux Owns showing three steps to get all your favourite packages (back) fast. I added a fourth step, actually saving your package list for later use. Deriving it straight from there (without testing, since unfortunately my last machine was "hardyed", i.e. upgraded to Hardy, just a couple of hours ago):

  1. integrate medibuntu sources.list
    sudo wget http://www.medibuntu.org/sources.list.d/gutsy.list -O /etc/apt/sources.list.d/medibuntu.list
    
  2. add server key
    wget -q http://packages.medibuntu.org/medibuntu-key.gpg -O- | sudo apt-key add - && sudo apt-get update
    
  3. write a text file listing all the package names you wish to have installed, separated by spaces — you should be able to put each package name on its own line instead, but as I said: I haven't tested it yet! Name it, say, most_important_debs (a small example follows below).
  4. sudo aptitude --assume-yes install $(cat most_important_debs)
    

    You might want to approve the package list before installing. In that case omit --assume-yes
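
For illustration, a (made-up) most_important_debs could be as simple as

vlc gimp inkscape build-essential openssh-server

which the shell then expands into the aptitude call via $(cat most_important_debs).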

Let me know if someone has used it (hopefully with success).

Update 2008/05/15: It does help to read and think before you speak (or write, for that matter). I got it completely wrong. The linked article is about packages from Medibuntu only. OK then, if it's like this I just alter 1, 2, 4 and 3 -> 1 :)

  1. build a list of your (most important, enduser) packages
  2. update-manager -d, i.e. dist-upgrade your system
  3. employ aptitude to read in your file (to be exact, it's the shell that expands the file's contents into the command line…)

Of course, this method still does not solve the problem of saving your personal settings while still getting all the system settings from the new distro release. But this shouldn't be that hard for release maintainers, since they (potentially) know which package version delivered or generated which config files. From there it should be easy to determine whether a system config file has been changed by the user -> show a diff. Or am I overlooking something once again?

Update 2008/05/15: Even better look aptitude-create-state-bundle and its counterpart aptitude-run-state-bundle:

DESCRIPTION
aptitude-create-state-bundle produces a compressed archive storing the files that are required to replicate the current
package archive state.  The following files and directories are included in the bundle:
·      $HOME/.aptitude
·      /var/lib/aptitude
·      /var/lib/apt
·      /var/cache/apt/*.bin
·      /etc/apt
·      /var/lib/dpkg/status
The output of this program can be used as an argument to aptitude-run-state-bundle(1).
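
So, if I read the man pages right (untested on my side), saving the state and replaying it on a fresh install would be along these lines (the archive name is made up):

aptitude-create-state-bundle my-package-state.tar.bz2        # on the old system
aptitude-run-state-bundle my-package-state.tar.bz2 aptitude  # on the new one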

Update 2008/05/15: A good starting point would be either

aptitude search '~i' | editor

or, if you don't use aptitude, this also shows (only currently) installed packages:

dpkg -l | grep ^i | editor

One needs to remove the non-package-name strings, though. As I haven't come around to learning sed (line editing) I cannot show how to deploy sed to do it. Anyone?
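
For what it's worth, something along these lines should do it (a sketch, assuming you only want the names of packages in state "ii"):

dpkg -l | awk '/^ii/ { print $2 }' > most_important_debs

# or, sticking with sed as asked:
dpkg -l | sed -n 's/^ii  *\([^ ]*\).*/\1/p' > most_important_debs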
