
CCL Anonymized Responses

Which Linux to use in a research/educational environment

Original Message

Posted by jlabanow at nd edu "Jan Labanowski"
Dear CCL,

With Linux getting more popular and becoming a commercial product,
one has to decide what to install on one's desktop.
The situation is even more complicated if you need to install Linux
on hundreds of desktops. The critical points from my perspective are:

1) Which flavor/vendor of Linux will be supported by the vendors
    of application software. As much as I may like Debian, many
    commercial packages will not work with it (e.g., due to file
    system differences).

2) What is the deployment cost (in a broad sense, "the real cost
    of ownership"):
    a) how much you need to pay for the license (for example,
       for some "Enterprise" Linux distributions)?

    b) how solid is a Linux distribution and how much time you need
       to install it and make it work?

    c)  How much maintenance is required?
          i)  The "Enterprise" (and pricey) Linuces promise stability,
              easy upgrade and patching paths, and administration tools,
              which make maintenance easier for a large installed base.
              They also promise continuity and backward compatibility
              for older packages installed on your computer.
         ii)  The "development" (and free) Linuces will require frequent
              patches, release upgrades, etc. You will most likely face
              glibc compatibility problems for many packages when
              you upgrade, have constant problems with kernel versions
              (e.g., device drivers that require recompilation for
              a specific kernel), etc.

3) Portability... Related to the above. If you compile your software on
    one Linux, can you expect it to run on many other Linux boxes,
    or only on a specific flavor of Linux?

There are obviously more issues. I will summarize the answers and
will make sure that I protect the sources (for corporate respondents,
this may be important).

Thanks in advance

Jan
jlabanow at nd edu

From Estonia

Rather than answer any of the questions you asked, may I point you to a
website which has an interesting and, IMHO, useful perspective:

http://www.infrastructures.org/

I have followed some of their advice, and plan to pick up more when we
re-install most of our Linux boxes after the next version of our
favourite Linux distribution comes out.

From US Academia

Here is my $0.02 worth of a reply, as an individual who does some
computational chemistry part-time on his own...
See *** embedded in the text below.
## 1) Which flavor/vendor...
*** I used Red Hat until they dropped their desktop platform.
I wanted simplicity and was willing to pay a bit for it. So, I moved
to Suse v. 9.0 Professional. Fedora looked like it would mean more
sysadmin work and less chemistry.
## 2) What is the deployment cost...
## a) how much you need to pay
*** I use it on a single system within a simple LAN. Deployment has been
easy, although I have an NVidia board and the 3D driver had to be installed
separately. This required a floppy drive for the initial installation,
which I did not have, so getting one, etc., slowed things a bit. The main
cost is the desktop package (ca. $100).

*** Moving from Red Hat to Suse has required some education. The systems
are different. This is an additional cost, but one paid in time.

## b) how solid is a Linux distribution...
*** It is solid. The initial installation took one hour if I exclude the
time it took to get the floppy drive for the NVidia drivers.
## c) How much maintenance is required?
*** I've chosen to pay a bit for the simplicity. So far, upgrading
has been easy and I am pleased with YaST/YOU and Suse.

## 3) Portability...
*** Portability is an issue. As someone indicated, RPMs from Red Hat and
Suse are not interchangeable because they may put libraries in slightly
different places, etc. This is more of a problem with software that uses
dynamic libraries, and it is not limited to RPMs. For example, ADF is
built with the PGI compiler, and you have to download and install PGI's
dynamic libraries for Linux for things to work.
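
A quick way to see whether this will bite you on a particular machine is
to check a binary's shared-library dependencies before anything else. The
sketch below is only an illustration (it assumes a Linux box with the
standard ldd tool and Python available; the binary path is a placeholder)
and lists the libraries the dynamic loader cannot resolve:

    #!/usr/bin/env python
    # Illustrative sketch: run ldd on a binary and report the shared
    # libraries that the dynamic loader cannot find on this machine.
    # Assumes a glibc-based Linux with ldd on the PATH; the default
    # path below is only a placeholder for the binary you care about.
    import subprocess
    import sys

    binary = sys.argv[1] if len(sys.argv) > 1 else "/usr/local/bin/some_app"
    out = subprocess.run(["ldd", binary], capture_output=True, text=True).stdout
    missing = [line.strip() for line in out.splitlines() if "not found" in line]
    if missing:
        print("Unresolved shared libraries:")
        for entry in missing:
            print("  " + entry)
    else:
        print("All shared-library dependencies were found.")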

I suspect that with vendors complying with the LSB some of this may improve,
but the use of shared objects by software vendors makes things a bit
difficult when those objects come from compiler producers such as PGI
(as in the ADF case).

Suse has its own way of distributing the Java libraries, etc. They warn you
about it, but you need to make the adjustments.

I can see why people like Debian and its deb/tgz packages. I have
not looked into this distro, but may in the future.

From US Corporate

Dear Jan,

I am surprised that this hardware issue is discussed on CCL, but since
the maintainer brought it up, I am interested in the results anyway.

I have ported some software that uses MPI to Linux clusters with MPICH
installed, and have set up several Linux clusters.
My customers seem to go for RedHat Enterprise, because of better support
in comparison with other Linuxes.
I would like to get a feeling for where Linux is going in the opinion of
the CCL participants, who might be our future customers.
My personal bet is:
i) Linux will diverge, as Unix did before it.
ii) Windows is becoming a more serious competitor: Grid computing on Windows
   machines vs. Linux clusters.
iii) Linux OS development is infected by consumer needs: games,
     clicks, and colours, but no stability. So it might lose its biggest
     advantage over Windows: stability.
iv) The development is highly inconsistent: Gnome, KDE - if you got used
    to one version, the next is completely redesigned - no evolution.
v) Windows does a much better job of consistent window management and user
   friendliness.
I hope Linux will improve enough to survive, at least in the world of
scientific computing.

Best regards,

From The Netherlands, Academia

## 1) Which flavor/vendor
At my work position, RedHat is the university supported linux version, though
I maintain a Debian machine, and SuSe is used at our department too. My
personal desktop runs SunOS.

I mostly work with free tools, but we have Matlab running on one Debian system
and Gaussian98 on two Debian systems.

## 2) What is the deployment cost
## a) how much you need to pay
RedHat is covered by a campus license, but I don't know the cost. Debian was
downloaded from the Internet, and a personal license was bought for the one
SuSE machine (I think).
## b) ...how much time you need
One day for installing the machine at its location and installing the OS.

## c)  How much maintenance is required
The RedHat machines are booted from a faculty server and centrally
maintained. The SuSE machine has been running for almost two years now, and
the Debian machines too. One of the latter (the one I maintain) is running
testing, and it takes me one day every two months to keep it up to date...

But you know the virtues of maintaining Debian... Two weeks ago I installed
KDE 4.2, which did not fully work, so I just as easily downgraded to KDE 4.1.5
again... 3 hours spent... Last week it took me one hour to install the newer
KDE packages, and it has now run for a week without problems.

## 3) Portability...
No problems here, working with C/C++ (which compiles on Debian and SunOS from
one automake system) and Java (cdk.sf.net), which compiles on Debian with
Kaffe and on any platform with Sun's JDK. For the rest I work with R
(www.r-project.org), which runs on SunOS and Debian...

I hope this helps a bit, even though our department does not keep a big
computer room...

From France Gov Research Lab

## 1) Which flavor/vendor of Linux
My impression is that RedHat has the best support from commercial
software vendors. However, the SuSE distribution that I use is similar
enough that most packages offered for RedHat work. A lot of commercial
software (Mathematica comes to mind) assumes the lowest common
denominator and still uses libc5, for example. Such packages often
require the installation of support libraries (old versions of standard
libraries), but work nearly everywhere.
## 2) What is the deployment cost
## a) how much you need to pay
I have yet to see any non-free ingredient in a Linux distribution that  
is of use in science.
## b) how solid is a Linux distribution
That also depends on the hardware. If trouble-free installation is
important, buy PCs with Linux preinstalled. Even if you end up
reinstalling Linux to optimize the installation, you at least know
that the distribution shipped with the hardware supports all
components.

Technical support also matters, of course. I have had good experience
with SuSE.
## c)  How much maintenance is required?
The main aspects in my experience are:
1) Security updates
2) Version updates for new features

A good distribution will help a lot with 1) but rarely much with 2). I  
am satisfied with SuSE's approach of online security updates, which  
works well enough and at the right level of automaticity (the system  
makes recommendations, but leaves the decisions to the administrator).
## 3) Portability... 
Source code portability is rarely a problem, unless some code relies on  
very recent libraries. Binary portability rarely matters unless you  
distribute commercial software.

My personal history: I had been using RedHat from 1996 to 2002, then  
switched to SuSE in January 2003 because the new PCs we bought were  
offered with SuSE preinstalled. The users have hardly noticed the  
difference, and from my administrator's perspective, SuSE was easier to  
handle, but not so much that I would have switched for that reason  
alone. Both are much easier to administrate than the Debian I had on my  
home machine (until I replaced it with SuSE as well, for just that  
reason).

From US

Ease of 'upgradeability', especially from a cluster perspective, is a
critical point that is not on your list. The cluster provider I chose to
use, based on their 'commitment to service and support', refuses to
provide support for upgrading my cluster from Red Hat 7.3 to any other level
of Red Hat, unless I give them $1,000s or hire an 'experienced Linux
Administrator' (also costing $1,000s).

Also, the consequences of upgrades are another. For example, the recent
upgrade from Red Hat 8 to 9 broke many standard tools (e.g., compilers) as
well as compiled applications. It was (and still is, for some) a real
nightmare.


From Germany, Academia

I think it depends on experience and personal taste. It is a great
question to start a flame war, since the points below are hard to
analyse in an objective way. Who knows what the future will bring?

The choice for installing something on a personal desktop and on
hundreds of desktops is (maybe) different. For a single desktop, especially
with little Linux experience, a commercial
insert-CD-and-press-a-button installation might be preferable. For
hundreds of PCs one should think a bit more carefully:

- How different will the desktops be? Best: have only a few (or one)
groups of computers (maybe even with similar hardware), but this depends on
how similar the users' work is.
- What is more important:
   - a stable system
   - long update cycles (not having to install a new distro every 3 months on
all computers, as is the case for some commercial ones)
   Or:
   - cutting-edge features
   - the newest software versions installed
Probably you have to make a trade-off. Maybe you have some developers
who depend on having the newest glibc, or lots of feature-loving
kiddies who complain all the time if they do not have the coolest, newest
features in their desktop environment.
For me personally, stability and security count most, and I want to have
as little work as possible (sysadmin is not my main job, but I want to
do my PhD at some point...). A desktop environment is good for placing
lots of console windows, but I do not need any features except that it
does not consume much space on my screen. But people coming from M$ are
probably less purist...
## 1) Which flavor/vendor 
At the moment there is no Linux distro with which you are free of such
problems. Suse RPMs do not work on Redhat and vice versa. If you have
to deal a lot with commercial packages from the US, Redhat is probably
preferred. In Europe (especially Germany) it will be Suse. For some
open-source projects you may only find DEBs and tarballs, but no RPMs...
So if you require some commercial package on all computers (and
(re-)building a package or installing from a tarball (e.g., on a central
fileserver) is not possible), you may have to take whatever the company
forces you to use. (I would think twice if there were no alternative to
the software...)
I use Debian for our lab. If possible (and desired) I try to get DEBs
and install them on all computers, but for special cases (scientific
software, software with frequent new versions, e.g., browsers) I think it
is better to have a central server and mount it on all desktops via NFS.
So far it works fine for me.
## 2) What is the deployment cost
## a) how much you need to pay 
And how often you have to pay it. If you do not follow the upgrade path
of the commercial distro and skip some updates it is quite likely that
you will have to reinstall everything.
## b) how solid is a Linux distribution
Solid: as stated above, this is most important for me. Debian stable is
therefore my preference, with some rebuilt packages from testing for
things I use often where the stable version lacks functionality. It is an
advantage of Debian that you can mix packages by using apt pinning.
(Unfortunately the official packages are built against different glibc
versions, so that recompiling is necessary, but this too is largely
automated in Debian.)

Installation time: On a single computer it might take longer because
hardware autodetection is not as advanced as in some commercial
distros, but I like it that way because I never trust plug-and-play...
If you want to install hundreds of computers you cannot do it
individually anyway. You need an automated solution. Suse started
relatively late with something like that, and it was not very usable
when I looked at it (but this may have changed - it was over a year
ago). Redhat has its Kickstart, which seems to be usable. I have not
used it myself, but know others who use it. For Debian there is FAI,
which works fine. It takes a while to define what you actually want, but
then you can install your 100 computers in 15 minutes (depending on
hardware, package selection, etc.).

Make it work: I am not sure what you mean exactly. I guess you mean the
post-installation phase, after the installation program has finished. This
depends on your network infrastructure, e.g., how user management is
organized, whether you use DHCP or not, whether you have centralized
servers, e.g., for printing, etc. If everything about your setup is
standard, the installation program might be able to do everything right.
If you are less standard, then it is more handwork, but this can be
automated by the network installation software. FAI is very flexible
on that.
Another point is providing all the infrastructure (servers for all
services) needed in your network. This is more work than the 100
desktops if they are all the same. There are all-in-one solutions from
some companies, but I personally don't like it when I don't know what is
going on. If you install everything by hand, you get a system suited
to your needs, and if there are errors you can debug them. With the
commercial solutions you depend on the support - do you want that?
For me security is as important as stability (an insecure system is not
a reliable system). Therefore I don't trust any company solution. If I
do it myself, I know where the weaknesses are. Company solutions usually
aim to replace a sysadmin with a GUI, so easy usage becomes more
important than security. I think this is a misconception.
## c)  How much maintenance is required?
But mainly they (the "Enterprise" vendors) want to sell new versions and
support contracts.
## The "development" (and free) Linuces...
Debian Woody has been out since July 2002. Of course you have to do some
security updates, but this can be nearly automated with a few scripts.
Patches etc. are only necessary if you want to be cutting edge.
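
As a rough illustration, such a script need not be more than the sketch
below (assumptions: an apt-based system like Debian stable with the
security archive in sources.list, Python available, and the job run as
root, e.g., from cron):

    #!/usr/bin/env python
    # Sketch of a nightly update job for an apt-based system.
    # Assumes apt-get is available and the script runs as root;
    # it refreshes the package lists and then applies pending
    # upgrades non-interactively.
    import subprocess

    def run(cmd):
        print("+ " + " ".join(cmd))
        subprocess.run(cmd, check=True)

    run(["apt-get", "update"])
    run(["apt-get", "-y", "upgrade"])
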
## glibc compatibility
If you go to a newer glibc you have to upgrade many packages at the same
time, but such dependencies are resolved automatically by apt. For most
programs you can still get packages for older versions. I had more
problems with Suse being ahead of certain package developments (packages
not yet ready for the new version).
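
If you want to know in advance whether a binary built elsewhere has a
chance of running, the first thing to compare is the glibc version on the
two machines. A minimal sketch (assuming a glibc-based Linux where the C
library is available as libc.so.6, and Python with ctypes):

    #!/usr/bin/env python
    # Sketch: print the glibc version of the machine this runs on.
    # A binary linked against a newer glibc will generally refuse to
    # run on a machine that reports an older version here.
    import ctypes

    libc = ctypes.CDLL("libc.so.6")
    libc.gnu_get_libc_version.restype = ctypes.c_char_p
    print("glibc version:", libc.gnu_get_libc_version().decode())
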
## kernel versions
If you need lots of exotic hardware...
A conservative approach (work out what is really needed and provide only
that) saves a lot of time.
## 3) Portability... 
It depends, but it is not unlikely that you will get problems with different
libraries. But if you keep all computers the same, then there are no
problems. If you have too many differences you will never get a running
network, because you have to take care of each computer individually. Put
time into automating things instead of redoing things on different
computers. This scales much better.
An advantage of Unix is that you CAN compile your software for your OS
and architecture. This is what I understand by real portability...
...
In the end you have to try different distros yourself. Again, it depends
on experience, time (are you a full-time admin, or is something else your
real job?), how many computers there are and how similar they are, what
the users do with them, whether you prefer GUI or command line, whether
you want to know what is going on or the software should be 'smart', etc.
I started with Suse, but I failed with it. It was too much work to do
upgrades between versions (3-month cycle), and after skipping one I had to
reinstall everything. Suse follows its own concept of config files, and if
you start editing configs by hand instead of using YaST, it will overwrite
your changes the next time you use it (I know there are also config
options for that...). There is no automated installation (or at least
there was none at the time). Debian maybe takes a bit longer to get
running initially, but then saves you a lot of time in the long run.
Stability improved a lot when going from Suse to Debian. I don't think
it was completely Suse's fault (early 2.4 kernels were not that stable,
there were problems with ReiserFS, early NVidia drivers were
unstable...); it was just my observation.
With Redhat I do not have much experience. I didn't like the company
politics aiming for a monopoly on Linux (especially in the US). We already
have something like that, and we see where it leads...

From US Academia

This is an interesting and important topic. I work in academia (..)
Our lab uses Redhat Linux exclusively (that is, no different
Linux flavors).
## 1) Which flavor/vendor
In my experience, this would be Redhat. Since they have now switched to
Enterprise versions, which are commercial, we use Fedora Core, which works
fine for us. I haven't had major problems with either Redhat 8/9 or Fedora
Core (which is essentially RH 10).
## 2) What is the deployment cost
Our license is free.
## b) how solid is a Linux distribution
Again, from my/our experience, Redhat Linux 8-10 is very stable. There are
sometimes issues with GUI applications, which can cause trouble, but in
general, stability is very satisfying. Installation doesn't take much longer
than installing a Windows system.
One very problematic thing for Linux is the newer integrated Intel graphics
chips (the i810 and the like). It is impossible to get a satisfying
resolution on the screen, and one needs to buy a separate card.
...
I had some problems with the installation of the specific driver for my
GeForce graphics card, but after I found the respective kernel header files,
everything worked fine. I upgrade my system about every two to three months,
and haven't encountered any major problems.
## 3) Portability... 
I must say that I haven't tried that a lot, but the one or two times I ran
a binary on one machine that was compiled on a different machine, it
worked. Not between different flavors of Linux, though.

From US Gov Lab

## 1) Which flavor/vendor 
I use Debian, and I haven't heard of this problem; what file system
differences are you referring to?

Debian also has a wonderful automated installation tool named FAI,
great for installing clusters or hundreds of desktops.

I copy executables and binary data files back and forth between Debian
and RedHat systems frequently, without much difficulty.
[jkl answered: Debian has a different way of mounting file systems due to its
 partition naming scheme. We had problems getting some binary RPM-based
 distributions to install correctly.
 Of course, for me it is not a problem. But for people who want to push the
 button, it is...]

From US Corporate

Jan, while it's not (directly?) pertinent, I wanted to point
out that Mac OS X might be worthy of inclusion. I/we develop
at work under Linux, and we're supported by 1-2 sysadmins who
keep the machines running. I don't have this luxury at home,
and I'm tired of having to play one...

Couple this with the confusion caused by RedHat's dropping
of their home/low-cost product, and going back to the Mac
was the thing for me to do at home.  Now, I run straight
from Apple OS X, and it has all of the development stuff I
use under Linux.  Granted, it's BSD-based but I was able to
port my stuff (C/Python) in an evening.  The Mac even has
some memory-checking stuff which I don't get using dmalloc
under Linux.

It's also the only platform where I can do "work" and "PC"
stuff without an emulator or dual boot.  I'm not even sure
how to get a Linux running on a Windows PC, and I wasn't
interested in the effort or cost of getting Windows on a
Linux machine.  While it cost a few hundred more (I got a
12" G4 powerbook for the small footprint), it's a whole lot
less hassle - and I've been able to get rid of two machines
from my desk at home.

From a software vendor's point of view, Linux is a problem
area for support.  RHat 7.2 or 7.3 is the last fully
supported version, with 8.0 and 9.0 being more or less
problematic.  RHat Enterprise edition seems to be
getting back on track, but which version works with which
graphics cards is still a real bother.  Getting cards and
systems which run GL properly still isn't an easy thing.

I've now done 2-3 years of development under Linux and it's
quite easy to use.  Quite similar to the SGI, except for
gdb rather than dbx.  While I've lost a little performance
with gcc vs. the native C compilers, it's not a real issue
(and we can pursue native compilers if we wish).  I'm not
doing Fortran at the moment, but I've been told the
performance hit is worse with g77.

I suspect, but have no way to prove, that SuSE will take
over from RHat.  They already seem to be heavily used in
Europe, and Novell/IBM's support might drag it to the
top here in the States.  I'm curious as to what the other
responses will be, and whether anybody other than me
gives a flip about Macs :-).

From US Academia

Jan,
I am currently using Suse 9.0.
I have had some issues with Mathematica and AVS, but these were very minor.
I use CrossOver Office to run the all-too-necessary Microsoft
applications, and it works very well.
I am using the Portland compilers and have had no problems.
AVS works well, but with some UI issues.
My friend at [Pharma Company] uses this too, and of course it is all over
Europe.
Hope this helps.
