
Search results for "Getting started with RHEL4's built-in LVM tools"

Unix+clones Remote backup using ssh, tar and cron
Post date: April 12, 2005, 21:04 Category: Miscellaneous Views: 51
Tutorial quote: Are you looking for a solution to back up your data to a remote location? While solid backup solutions such as Arkeia or IBM's TSM are nice from an enterprise point of view, simpler solutions are available from a home user's perspective. I will walk you through how you can back up your data to a remote server, using the default tools available on all Linux systems. In a nutshell, we will use ssh's capabilities to allow a cron job to transfer a tarball from your local machine to a remote machine.

For the purpose of this tutorial, the local machine will be called “localmachine” (running Slackware) and the remote server will be called “remoteserver” (Slackware as well). The user will be joe (me). You will have to substitute these three with your own machine names and user.
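
In practice the approach boils down to something like the sketch below. It assumes key-based ssh login from localmachine to remoteserver is already set up for joe; the script path, the /backups directory on the remote side and the 02:30 schedule are placeholders, not the tutorial's exact choices.

    #!/bin/sh
    # /home/joe/bin/remote-backup.sh -- tar up the home directory and stream it
    # straight to the remote server over ssh, leaving no temporary file locally
    DATE=`date +%Y%m%d`
    tar czf - /home/joe | ssh joe@remoteserver "cat > /backups/joe-home-$DATE.tar.gz"

    # crontab entry (edit with crontab -e): run the backup every night at 02:30
    30 2 * * * /home/joe/bin/remote-backup.sh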
Unix+clones Keeping Your Life in Subversion
Post date: October 2, 2005, 12:10 Category: Software Views: 119
Tutorial quote: I keep my life in a Subversion repository. For the past five years, I've checked every file I've created and worked on, every email I've sent or received, and every config file I've tweaked into revision control. Five years ago, when I started doing this using CVS, people thought I was nuts to use revision control in this way. Today it's still not a common practice, but thanks to my earlier article "CVS homedir" (Linux Journal, issue 101), I know I'm not alone. In this article I will describe how my new home directory setup is working now that I've switched from CVS to Subversion.

Subversion is a revision-control system. Like the earlier and much cruftier CVS, its purpose is to manage chunks of code, such as free software programs with multiple developers, or in-house software projects involving several employees. Unlike CVS, Subversion handles directories and file renaming reasonably, which is more than sufficient reason to switch to it if you're already using CVS. It also fixes most of CVS's other misfeatures. Subversion still has its warts, though, such as an inability to store symbolic links and some file permissions, and its need for twice as much disk space as you'd expect thanks to the copies of everything in those .svn directories. These problems can be quite annoying when you're keeping your whole home directory in svn. Why bother?
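
The basic moves of such a setup can be sketched with standard Subversion commands; the repository location (/var/svn/home), the trunk layout and the working-copy path are placeholders, not the author's actual configuration.

    # create a repository (kept outside the home directory) and import the home directory into it
    svnadmin create /var/svn/home
    svn import /home/joe file:///var/svn/home/trunk -m "initial import of home directory"

    # work from a checked-out copy, committing changes as files are tweaked
    svn checkout file:///var/svn/home/trunk /home/joe/home-wc
    svn commit /home/joe/home-wc/.bashrc -m "tweak .bashrc"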
Unix+clones Execute Commands on Multiple Linux or UNIX Servers part II
Post date: December 28, 2005, 05:12 Category: System Views: 51
Tutorial quote: I have already covered how to execute commands on multiple Linux or UNIX servers via a shell script. The disadvantage of that script is that the commands do not run in parallel on all servers. However, several tools exist to automate this procedure in parallel. With the help of a tool called tentakel, you can run distributed command execution. It is a program for executing the same command on many hosts in parallel using ssh (it supports other methods too). The main advantage is that you can create several sets of servers according to your requirements: for example a web server group, a mail server group, a home server group, and so on. The command is executed in parallel on all servers in a group, which saves time. By default, every result is printed to stdout (the screen). The output format can be defined for each group.
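
The underlying idea can be illustrated with nothing but ssh and the shell; the host names below are placeholders, and this is only a sketch of running one command on a group of servers in parallel, not tentakel's own configuration or syntax.

    #!/bin/sh
    # run the same command on every host in a group, in parallel, over ssh
    GROUP="web1 web2 mail1"
    for host in $GROUP; do
        ssh "$host" uptime > "/tmp/run-$host.out" 2>&1 &
    done
    wait                     # block until every background ssh has finished
    cat /tmp/run-*.out       # print the collected output to stdout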
Debian An apt-get primer
Post date: April 12, 2005, 13:04 Category: System Views: 90
Tutorial quote: If any single program defines the Debian Linux project, that program is apt-get. apt-get is Debian's main tool for installing and removing software. Working with the .deb package format, apt-get offers sophisticated package management that few Red Hat Package Manager (RPM)-based distributions can match.

Besides the convenience, an advantage of apt-get is that it reduces the chances of falling into dependency hell, that limbo where software installation fails for lack of another piece of software, whose installation fails for lack of another piece of software, and so on. If you know how Debian's archive system works, and how to choose the sources that apt-get uses, and use a few precautions in your upgrades, then the chances are that dependency problems will never bedevil you. Should you descend into dependency hell anyway, apt-get offers useful tools for climbing out of it.
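
The day-to-day commands the primer covers look roughly like the following; the package name is a placeholder, and the sources that apt-get draws on live in /etc/apt/sources.list.

    apt-get update          # refresh the package lists from the configured sources
    apt-cache search lvm    # search the package database for a keyword
    apt-get install lvm2    # install a package along with whatever it depends on
    apt-get remove lvm2     # remove it again
    apt-get upgrade         # upgrade every package currently installed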
Linux Upstream Provider Woes? Point the Ping of Blame
Post date: April 14, 2005, 08:04 Category: Network Views: 42
Tutorial quote: Your users are complaining that "the Internet is, like, all slow." Users are always complaining, but you're seeing a lot of timeouts when you check mail, surf the Web, or try to log in for remote administration. Or even worse, latency is so bad that you keep getting killed all to heck in your favorite gory violent online multi-player game, so you know there is a problem. But there are a lot of potential bottlenecks between your PC and the outside world, like your Internet gateway, proxy server, firewall, Internet service provider, and so forth, so where do you begin?

One of the best and most versatile network tools you can have is a notebook PC running Linux. This lets you plug in anywhere to run tests and find out what is going on. Make it a nothing-to-lose box--don't keep data on it so you can wipe and reinstall the operating system as necessary, because you want to be able to run tests outside of firewalls. Don't run any services. You can put a minimal iptables firewall on it, as there is no point in being totally exposed, but keep it simple. (Use MondoRescue to make a system snapshot for fast restores.)
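
From that notebook, the first probes are the standard ones sketched below; the addresses are placeholders for your own gateway, your ISP's first hop and an outside host.

    ping -c 10 192.168.1.1               # latency and packet loss to the local gateway
    ping -c 10 gw.example-isp.net        # the first hop outside your own network
    traceroute www.example.com           # see where along the path the delay appears
    mtr --report www.example.com         # combined ping/traceroute statistics over time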