HP HOWTO: Configuration Guide and Use of HP Products under Linux (Version 0.94)
Chapter 2. Presentation of Linux and Free Software
Once these definitions are given, it is important to dwell on the ideas promoted by the free software movement, and to dispel some of the misconceptions circulating about this software. This section therefore presents a varied set of arguments in favour of introducing free software, and ends with the real problems that remain to be solved.
In fact, the philosophy promoted by the free software movement is not that different from the one the scientific community has followed for a long time already: pooling ideas and collective knowledge to allow research to progress and that knowledge to grow. The mapping of the human genome is one example of such collaborative work.
The computer engineering world, and especially the software world, seems to have turned away from these basic concepts of science over the last 20 years. It prefers, on the contrary, to keep the customer captive rather than give him the information needed to exploit his computing environment as well as he can. It was indeed following such a problem, at the beginning of the 80's, that Richard Stallman, who was doing research in artificial intelligence at MIT, decided to create the GNU project. This project is the foundation of the current free software movement.
The main ideas promoted by this movement, as stated by Richard Stallman himself, are:
Liberty: every user should be free to copy, distribute and modify a program, either to share it with others or to adapt it to his own needs. Likewise, he should be able to analyse it in order to understand, imitate, improve and verify its operation, just as every scientific result is published and reviewed by peers for verification, study, understanding and the creation of derived works. Could you imagine a vaccine against liver cancer that another laboratory couldn't build on to make a vaccine against pancreatic cancer? Of course, the negative impact of patents at that level is obvious.
Equality: every person should have the same rights over the software. Thus the provider is not privileged and cannot keep captive the customers to whom he supplied his work. Could you imagine that only the producer of our vaccine were allowed to use it? With software patents, if you don't pay, you cannot use them (think of the discussions around RAND licensing and W3C standards).
Fraternity: this way of working encourages the whole computer engineering community to cooperate, and thus to produce software that is ever more reliable and useful to all. Could you imagine that a discovery like the vaccine above couldn't help everyone and foster other discoveries? Again, with software patents, nothing like that is possible.
Beyond the utopian character of these ideas, there are other reasons which have allowed free software to spread so widely today. They are detailed in Section 2.2.2.
The free software movement also materializes itself through a community of people. That community, an informal gathering of personalities, is heterogeneous in its contents, actions and ideas, even if all its members share the same belief in the freedom of software. That community created for itself the tools needed for its communication: the Internet and Usenet. And these communication tools rely, of course, on a lot of free software to work. Among the outstanding persons of this movement, we can mention:
Linus Torvalds, creator of Linux.
Richard Stallman, founder of the GNU project.
Eric S. Raymond, author of several excellent essays which have inspired many vocations.
Larry Wall, author of Perl and of the patch tool, and philosopher.
Alan Cox, mister "do everything".
Tim O'Reilly <[email protected]>, free software advocate and editor of several books dedicated to them.
All these personalities are, above all, excellent computer engineers, which allows them to be recognized as major actors of the free software movement. Their human and communication qualities are also strong traits of their character. In any case, they are not respected for their power, but for their knowledge.
Of course, the free software community is made up of thousands of programmers, whose complete list would be too tedious to give. All share the desire to produce useful, free work, and to be recognized above all for their technical qualities.
Using free software to provide solutions in a computing environment is a choice. First, it favours a plurality of solutions, mainly in the personal computer world, which tends to be monopolistic. Then the choice is made, and that is what finally matters, on the intrinsic qualities of free software, which are detailed just below.
This point, the availability of the source code, is the most important in the choice, because it allows the understanding, adaptation, correction, distribution and improvement of the software.
That quality derives from the previous one: free software is the combined result of the experience and intelligence of all the participants. Its reliability thus increases as time passes, with all the corrections that are made. Moreover, no marketing pressure forces the software's producer to deliver it to customers before it is in a satisfactory state.
This quality is not intrinsic to free software, but is very often observed in it. Indeed, if a piece of software meets success, it will necessarily be adapted to environments other than those initially considered. By increasing its availability in this way, its portability and reliability are also increased. Linux runs today on an HP Jornada or an IBM watch, as well as on an S/390 or a SuperDome.
One essential quality of free software is the naturally universal character of the data formats used. Even when they don't follow standards, the availability of the source code assures users that they will understand them, and moreover that they will be able to write any filter needed to reuse these data or exchange them with other software. This also allows users to stabilize their environment, because they are not forced to migrate due to incompatible data formats in their applications. Don't forget that your data are precious and that it's better to archive them in a *ML format (HTML, XML, SGML, ...) rather than a proprietary one.
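As a minimal illustration of this openness (the file name and element are hypothetical), an open text format can be processed with nothing more than standard shell tools, without needing any vendor application:

    # Extract every <title> element from an XML archive using only
    # standard text tools -- feasible because the format is open text.
    grep -o '<title>[^<]*</title>' data.xml | sed -e 's/<[^>]*>//g'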
Benefiting from many reviews, from the use of algorithms coming from advanced research work, and from testing under varied usages, free software has good performance by nature. Frequently, large portions of code are rewritten to reuse the original ideas with better code and thus increase performance. Several tests made by various organisations tend to prove it as well.
Table 2-1. Performances of Free Software

Subject                              URL
Apache Web Server and competitors    http://www5.zdnet.com/products/content/pcmg/1709/305867.html
SMB SaMBa server vs Windows NT       http://www.zdnet.com/sr/stories/news/0,4538,2196106,00.html
Once more, there is no obligation to release an application whose performance would be poor.
Interoperability is a reality in today's enterprises. Historically, the Unix environment has always been a breeding ground for interoperability with other systems (large and medium-size systems, as well as personal computers). The support in Linux, for example, of many network protocols, filesystem formats, and even binary compatibility modes ensures good interoperability. Note that interoperability requires two actors, and having only one of them open is generally not sufficient. That is the whole point of RFCs, norms, standards, ...
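A minimal sketch of this interoperability in practice (hostnames and paths are hypothetical): the same Linux host can mount both a Windows share and a Unix NFS export with the standard mount command:

    # Mount a Windows (SMB) share and an NFS export side by side.
    mount -t smbfs -o username=guest //ntserver/public /mnt/smb
    mount -t nfs unixserver:/export/home /mnt/nfs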
Considering the longer and longer development cycles of software editors, the reactivity brought by the free software movement is attractive for many sites concerned with quickly obtaining corrections to a given problem. Thus, during the recent discoveries of IP problems (ping of death, ...), patches were always available within three days. Better still, only the patch correcting the hole found was delivered: no functionality was added that could have created other instabilities.
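As an illustration (the patch file name is hypothetical), such a fix is applied to a source tree with the patch tool mentioned earlier, touching only the files concerned:

    # Apply a security fix to the kernel sources, then rebuild.
    cd /usr/src/linux
    patch -p1 < /tmp/security-fix.diff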
The best possible computing security is ensured by robust construction, public and renowned algorithms, quick communication about flaws, ... in other words, by transparency. Obscurity in this domain is bad, useless and even dangerous. In the free software world, the reactivity described in the previous paragraph guarantees increased security, ... on the condition that patches are applied regularly.
Independently of these qualities, it is possible to give other reasons, of various kinds, in favour of free software, depending on the audience.
Studies from IDC highlight the irresistible rise of Linux as a server operating system. In 1998, Linux was credited with a 17% market share, growing at 212%, the fastest in that domain. The following graphics give the whole market share breakdown.
This was confirmed in 1999, with a market share climbing to 24% and an increase of 93%, again more than four times the growth of its closest follower.
Dataquest estimates, for its part, that Linux servers will represent, with 1.1 million units, 14% of the servers sold in 2003.
The Net itself produces marketing tools that demonstrate the superiority of free software. Counters on web server software are regularly updated by Netcraft and others, and another was maintained by IOS Counter for the servers on the Internet. The results show the importance taken by Apache, with more than 16 million operational sites (among them, 30% run Linux), crushing the competition, and by the free operating systems Linux and *BSD, which dominate the world of Internet servers.
A detailed analysis based on the most precise figures available is also regularly updated, showing the advantages of using free and open source software; see http://www.dwheeler.com/oss_fs_why.html.
Financial factors also speak for free software. First, the acquisition price is low. Low, because it is never zero: even if you can download it from the Internet, you have to consider the costs related to that link. Still, the costs are much lower than for commercial software. A RedHat 7.2 Linux distribution, delivered with more than 1400 software packages, costs about 60 USD, whereas you have to pay more than 800 USD to obtain Windows 2000 Server, delivered only with IIS.
Moreover, free software has no notion of per-user or per-additional-service licensing. Thus there is no extra cost when you have to extend the use of this software in your organisation. That is of course not the case with commercial software, whose economic logic is often based on the number of licenses.
Free software also brings better control of the TCO (Total Cost of Ownership), mentioned so frequently in massive deployments of personal computers. Administration costs are reduced because systems like Linux or FreeBSD, like Unix, can be managed completely remotely, either on the command line (with telnet or ssh) or in graphical mode using X-Window. Moreover, we benefit from a true multi-user mode, which improves these management operations. Still in this domain, remote management is also possible, either through the hardware capabilities of the machine (such as the Remote Assistant card integrated in most HP NetServers), or through a remote connection (modem, ISDN adapter or a permanent link) thanks to the native PPP protocol and secure connection systems such as tunneling or ssh. This management could even be performed by an external entity, as outsourcing.
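A minimal sketch of such remote administration (hostnames are hypothetical): the whole system can be driven through an ssh session, including graphical tools via X11 forwarding:

    # Run a one-off command on a remote server, then start a graphical
    # tool displayed locally through X11 forwarding.
    ssh root@netserver uptime
    ssh -X root@netserver xterm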
Finally, hardware costs themselves can be controlled. On the one hand, if by chance free software doesn't meet the needs, it is always possible to buy commercial software to cover the rest on the same hardware. On the other hand, solutions based on free software perform well by nature and can use hardware platforms that would be considered obsolete by the standard criteria of other operating systems or applications. Separating the graphical interface from the rest of the running system is key here. It is thus possible to use "old" hardware, mainly for prototyping, and then to invest, with precise knowledge, when putting the solution into production, if needed. The increase in power can naturally take place progressively.
These arguments were already given in the previous sections. I think, nevertheless, that some notions deserve complementary information.
Concerning the reliability of free software based solutions, it is important to note that they imply very high continuous running times (a standard characteristic of Unix systems in general). This is measured by the uptime command. One of Medasys' and HP's customers, the Saint-Michel Hospital in Paris, has had a Vectra VL5 acting as a router under Linux for more than 300 days. And that is not an isolated case.
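For reference, uptime reports how long the system has been running since the last boot; the output below is only illustrative:

    $ uptime
     10:15am  up 312 days,  4:07,  2 users,  load average: 0.08, 0.03, 0.01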
Respect for standards and norms, as well as the extreme portability of free software, also guarantees the same qualities to applications developed on these platforms. Notably, if in production the performance or services provided by a free software based architecture proved insufficient (which may be caused by architectural limits such as PCI bandwidth or the number of processors available), it would be easy to migrate to machines offering more performance and more room for growth, such as the HP 9000 systems running HP-UX.
Finally, a development model centered on performance implies modularity: it is possible to trim the system kernel to fit the capacities of the hardware, or to load modules dynamically as needed. A package installation may vary from 40 MB for a minimal system up to many GB for a complete distribution. The system's scalability also allows the support of multi-processor (SMP) machines (tested up to 32 processors on a Sparc machine). The system's modularity also makes it possible to obtain an operational system on a 1.44 MB floppy disk, either as a minimal repair environment or as a perfectly operational router. The world of embedded systems, besides, shows more and more interest in systems such as Linux because, beyond their modularity, source availability makes communication with dedicated peripherals (acquisition cards, probes, ...) easier. Entities such as CERN or Thomson already use such solutions.
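This modularity is visible on any running system (the driver name below is just an example): modules can be listed, loaded and removed without rebooting:

    # Inspect and manage dynamically loaded kernel modules.
    lsmod               # list the modules currently loaded
    modprobe 3c59x      # load a network driver and its dependencies
    rmmod 3c59x         # unload it again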
This argument is probably the most important of all, because it is useless to have free software if you can't do something useful with it, or offer solutions to the demands of entities willing to use it. In which sectors can free software bring solutions today? Well, you have to admit it is in nearly every sector of enterprise computing.
Historically, Open Source software was used to build Internet/Intranet servers, because its growth followed that of the Net. It is thus possible to cover all aspects linked to the Internet: the Web server (Apache), the FTP server (Wu-Ftpd), the DNS server (Bind), the e-mail server (Sendmail or PostFix), the Usenet news server (INN), the proxy server (IPmasqadm), the firewall (IPChains or IPTables), the Virtual Private Network (OpenSSH), the Web cache server (Squid), and also the time server (NTP), the directory server (LDAP) or the content management server (Midgard). All this software is available as standard in a Linux distribution; a quick way to verify its presence is shown below. The client computer only has to be equipped with the software corresponding to the application used (mail reader, news reader, web browser, ...), whatever its operating system. The choice of the client is free, as all these tools respect the standards decreed in the RFCs.
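On an RPM-based distribution, for example, one can check that the usual server packages are installed (package names vary between distributions and are given here as examples):

    # Query the package database for a few standard Internet servers.
    rpm -q apache bind sendmail squid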
The second preferred domain for free software is file and print serving. For these services, the clients may be varied: Unix type (using NFS and KNFS, or also Coda and Inter-Mezzo, for file sharing, and lpd or CUPS for printing), Microsoft Windows type (using SaMBa, which also allows the use of the clients' local printers), Novell type (using Mars_nwe) or Macintosh type (using NetAtalk). All this software is provided as standard in a Linux distribution and doesn't need any modification on the client side to work.
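As a minimal sketch (the workgroup, share name and path are hypothetical), a few lines of smb.conf are enough for SaMBa to export a directory to unmodified Windows clients:

    # Hypothetical minimal Samba configuration: one public share.
    [global]
        workgroup = MYGROUP
        security = user

    [public]
        path = /home/public
        writable = yes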
The other domains where a system such as Linux can bring solutions are: computation, with multiprocessor support and the building of multi-node clusters with Mosix or BeoWulf over high-speed network interfaces (100 Mbit/s, Gigabit or Myrinet); data security, with support for HP NetRaid cards, allowing RAID levels 0, 1, 3, 5, 10 and 50 and hot-spare disks, managed by the hardware; centralized fax service, with free software like HylaFAX; archive/backup service with HP SureStore DAT or DLT libraries, thanks to a GPL software like Amanda or a commercial one like Arkeia; and finally database service, with free solutions like PostgreSQL and MySQL, or commercial ones like Oracle, to mention only those three.
On the client side, even if it is less in the spotlight for the moment, the possibilities to use solutions based on free or commercial software are numerous. Here also the Internet part is the main one, with graphical web browsers (Netscape, Mozilla or Konqueror) and textual ones (lynx or w3m), many graphical mail readers (Kmail, XFMail, ...) and textual ones (mutt, elm, ...). But you also have the whole panel of tools indispensable on a personal computer today: a PDF reader (Acrobat Reader or xpdf), image manipulation tools (ImageMagick, the Gimp, the RealPlayer tools, ...), word processors (LyX, LaTeX, SGMLTools, WordPerfect, ...), commercial office suites (ApplixWare, StarOffice) and open source ones (KOffice, OpenOffice, ...), sound management tools (Xmms, eplaymidi, xmcd, ...), CD burning tools (cdrecord, BurnIT, ... with complements such as mkisofs, cdparanoia and cdrdao), free and commercial emulators of various systems (Wine, Executor, WABI, DOSEmu, Win4Lin, VMware, ...), compilers and interpreters for all the languages (C, C++, Pascal, Fortran, Basic, Tcl/Tk, Perl, Python, Ada, Eiffel, Lisp, Scheme, Prolog, ...), including commercial versions (PGI, Intel, ...), and graphical environments (Gnome, KDE, Motif, OpenMotif, ...). The evolution of these last tools suggests that the 2000's may be the years when Linux and free software break through on the client in their turn.
I want to mention that this document was produced on an HP Brio BAx, then on a Vectra VL400, equipped only with a Linux distribution, with the help of tools like a DSSSL style sheet, OpenJade, DocBook and ViM, which allowed generating the HTML, text, RTF, PostScript and PDF formats from a single source.
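As a hedged sketch of that toolchain (the file and stylesheet names are hypothetical), the same SGML source is rendered to different formats by selecting an OpenJade backend and a DSSSL stylesheet:

    # Produce HTML and RTF from one DocBook/SGML source.
    openjade -t sgml -d html.dsl hp-howto.sgml
    openjade -t rtf  -d print.dsl hp-howto.sgml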
Support was for a long time a blocking point for the expansion of free software in firms. That is not the case today. Many service providers and hardware manufacturers, like HP, now master these solutions and propose support around them, up to mission-critical engagements if requested.
Other sources of information are also available, in abundance, through the several web sites dedicated to these solutions, specialised mailing lists, and various Usenet groups such as, for Linux, the international groups under comp.os.linux.* or, for French speakers, those under fr.comp.os.linux.*.
Concerning skills, more and more young engineers and academics finish their studies trained in the use of free applications and operating systems. This wealth of competence is now arriving on the labour market and will contribute to the generalisation of these tools. Finally, many firms have skills internally without knowing it: their employees often install this software at home and master it well, and that mastery becomes usable when the deployment of this software reaches their professional structure.
Advocating free software also means mentioning some commonly accepted ideas concerning it, and fighting them.
As seen previously, support is currently getting organised. A firm like RedHat provides support for its solutions today. In France alone, we may mention firms like Medasys, Atrid and Alcove which provide support on free software. Likewise, still in France, training on free software is given by HP France, Learning Tree or the IUT de Vélizy, without mentioning generic network and Unix training courses (also proposed by the same organisations), which represent a fundamental base in a training curriculum. And finally, we should mention the ability of each of us to self-train, mainly thanks to the huge documentation available (see Chapter 7).
There is a whole set of manuals, the Linux Documentation Project, made of FAQ (Frequently Asked Questions) and HOWTO documents, counting more than 300 documents around Linux, the main ones being translated into French and Japanese, all available as free documentation. The quality of this documentation varies and it is more or less up to date depending on the subject, certainly, but it forms a corpus which allows one to learn a Linux distribution and all its components on one's own. For myself, I have always found in it everything I needed to do my job with free software. And when complementary information is needed, many web sites and Usenet groups can again provide some of the missing elements, not to mention the innumerable manual pages available on line. Each distribution also comes with a comprehensive set of manuals covering all the tasks of installing, handling and managing it.
Besides, the publishers O'Reilly and SSC have specialised in books around free software, generally written by the authors of the software themselves. Their books are considered references in their respective domains.
You should always distinguish between free (as in speech) and free (as in beer). Too much freeware in the Microsoft environment is in fact a toy of poor quality. That is absolutely not the case for free software, as shown in the previous sections. Just remember that it is reliable by construction.
Linux is a professional operating system. As such, it requires competence to install, like any other professional operating system, such as the other Unix systems or Windows NT. But it isn't harder to install than those either, mainly thanks to distributions like RedHat, Mandrake, ... You need about 30 minutes to complete the installation of such a distribution, about the same as for HP-UX and noticeably less than for Windows NT Server.
On the other hand, just as you have to check a server against Microsoft's Hardware Compatibility List before installing Windows NT on it, for Linux it is strongly recommended to check the Hardware HOWTO and, for HP machines, to refer to Section 3.2.
This is less and less true, and this criticism is made obsolete by the latest versions of the Linux kernel, which include a journalled filesystem, allowing true application clusters. Linux already supports multi-processor machines and multi-node computation clusters. And don't forget that it is used by the portal Voila (France Telecom) and the engine Google, among other prestigious references. Likewise, FreeBSD is used with success to run the world's biggest ftp server, the Walnut Creek CDROM server. Useful projects to consult in this area of high availability are http://www.linux-vs.org, http://www.opengfs.org and http://www.linux-ha.org.
There is no legal recourse, because software licences disclaim all responsibility of the authors in case of problems. But in reality, developers are always ready to help when problems occur, and try to correct the bugs encountered as soon as possible (for the F00F bug of the Pentium, a patch for the Linux kernel was published within three days, for example). On the other hand, commercial editors give users very weak guarantees against problems other than packaging errors. Please read the notes furnished with your software to judge.
It would not be honest to deny that certain problems linked to free software remain. Some have begun to disappear, others are inherent to the model, and others will take time to disappear.
The first problem, inherent to the free software model, is the multiplicity of tools and distributions available. If you want to set up a mail server, you have to choose between Sendmail, Exim, PostFix, Qmail and Smail. Likewise, if you want to install Linux, you may choose between the distributions RedHat, SuSE, Slackware, Mandrake, Turbo Linux and Debian. This is often a problem for the newcomer, but the experienced user will always prefer to have a large choice which he can weigh against his own constraints and experience. As long as an actor respects the rules by freeing his code (as is the case for the rpm and deb formats, for example), there is little risk from the community's point of view. The key point is that the choice is made on technical criteria only.
The second problem, also inherent to the origins of free software, is the need for strong Unix and Internet skills to manage such solutions. The power available through these systems is proportional to the competence of their administrators. And that will stay true even with the growth of more and more graphical solutions to manage them. On the other hand, the time invested in learning their functions is capitalized and doesn't disappear, because you don't have to re-learn everything from one version to the next (I have used the same editor, vi, for the last 15 years). Don't forget that the systems you use daily seem simple only because you have spent enough time learning them. For Internet skills, it is stating the obvious to say that this investment isn't lost. Finally, even with the work of translators providing information in French and other languages, a good knowledge of technical English is definitely a plus.
The last problem met when implementing free software solutions is succeeding in convincing some managers to go against the prevailing opinion. The aim of this chapter is precisely to provide all sorts of arguments to achieve that goal, but you will need conviction every time to get your way in the end. As soon as these solutions are adopted, principally by big firms, resistance will disappear.