Minimal linux distro for underpowered machine

Drake Diedrich dld at coyote.com.au
Tue May 21 16:55:33 EST 2002


On Tue, May 21, 2002 at 11:28:30AM +1000, Ben.Westgarth at facs.gov.au wrote:
> 
> I am looking for a small linux distro which will run on an old dusty 486 dx4/100
> with 24MB ram and a 540MB hard drive (please don't laugh). I also need it to run
> X. Does anyone have any recommendations on distros which could pull this off? I
> had a quick look at Peanut Linux and it claims to be able to do so, but before I
> tie up the phone line for a couple of days I thought I'd quickly canvas the list
> for any other suggestions.

   Overkill.  :)  I've run Debian with X on a 20MB machine in production
(for about a year, 1996ish).  A very minimal 16MB B&W xterminal 486/33 also
ran until about 2 years ago as my second (x2x'able) terminal.  The HDD on
both machines was ~200MB, and the distribution was approximately Debian 2.0,
with libc and a few other things upgraded.  I still run some early-to-mid
Pentium machines only slightly more capable than these as full workstations
around the house, all running Debian/unstable (lagging the bleeding edge by
a few weeks at most).
    Debian _runs_ perfectly well on very small machines.  Upgrading with
dselect, dpkg, and apt, though, is a serious pain against a full archive,
especially the latest.  What I do instead is upgrade my largest machine,
copy all the .debs out of /var/cache/apt/archives, and (using a slightly
modified dpkg-scanpackages) build a new local archive (a Packages.gz file
in a directory full of .debs) that contains only the packages I could be
bothered installing on the large machine.  This smaller archive is much
easier to install from on machines short of disk or memory.  For very small
machines (8MB), skipping preconfigured archives entirely and installing the
.debs directly with dpkg -i is the most practical way to maintain them.
Down at 4MB you might need to NFS-export the filesystem read-write, without
root_squash, to a big-brother machine, and use dpkg's admindir/rootdir
options to administer the small machine remotely.  Very painful.  (Or
physically transfer the HDD to the larger machine for maintenance.)  At
24MB, turning off X during apt-get runs might be sufficient to keep all the
package data in core and avoid swapping madly during upgrades and
maintenance.

   For X, you can lose a lot of fonts and still have a usable machine. 
Memory is more precious than disk space, so using a remote font server isn't
a win.  You *want* most fonts to be unavailable so that programs will fall
back and use only a handful of minimal standard fonts (like "fixed").  Fonts
are server side, so you have to pay local memory for them, even if you're
mostly using the machine for remote applications.  Keeping the window
manager and light apps like the terminal and ssh local improves latency and
security.  Large apps like graphical web browsers, XEmacs, or GIMP really
benefit from being run off a larger remote machine.  If you share your home
directory (you can see the whole of your local network, right? :) ) between
beefy and smaller machines there are few differences other than the
application speed between local and remote apps, and even that can be
tolerable with remote execution on the faster machines. It also makes it
easier to trade seats when someone really does *need* to be sprawled out in
front of a machine with 384M of RAM, DRI capable video card, or DVD drive.
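As a concrete sketch of the font and remote-app suggestions (the xfonts-*
package names are the usual Debian ones of that era; "bighost" is a
placeholder for whatever faster machine you have on the network):

```shell
# Remove the bulky font packages so X clients fall back to the handful
# of minimal standard fonts (like "fixed"); xfonts-base stays installed
apt-get remove xfonts-75dpi xfonts-100dpi xfonts-scalable

# Keep the window manager, terminal, and ssh local; run the heavyweight
# apps on a faster box, displayed locally over forwarded X11
ssh -X bighost gimp &
```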

   You don't have to limit fonts or large applications, or play with smaller
local .deb archives, on a 24-32MB machine, but these suggestions should
improve responsiveness if you find the machine too slow to install on or to
use once installed.

-Drake




More information about the linux mailing list