There are many issues that actually affect the security setup of a computer. How secure does it need to be? Is the machine networked? Will there be interactive user accounts (telnet/SSH)? Will users be using it as a workstation, or is it a server? The last one has a big impact, since "workstations" and "servers" have traditionally been very different beasts, although the line is blurring with the introduction of very powerful and cheap PCs, as well as operating systems that take advantage of them. The main difference between computers in today's world is usually not the hardware, or even the OS (Linux is Linux, NT Server and NT Workstation are close family, etc.); it is in what software packages are loaded (Apache, X, etc.) and how users access the machine (interactively, at the console, and so forth). Some general rules that will save you a lot of grief in the long run:
1. Keep users off of the servers. That is to say: do not give
them interactive login shells, unless you absolutely must.
2. Lock down the workstations; assume users will try to 'fix'
things (heck, they might even be hostile, temp
workers/etc).
3. Use encryption wherever possible to keep plain text passwords,
credit card numbers and other sensitive information from lying
around.
4. Regularly scan the network for open ports/installed
software/etc. that shouldn't be there, and compare the
results against previous scans.
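The fourth rule is easy to automate: save each scan's results and diff them against a saved baseline. A minimal sketch, assuming nmap is installed (the network range, file names, and the `compare_scans` helper are examples, not part of any standard tool):

```shell
#!/bin/sh
# compare_scans: diff a fresh scan snapshot against a saved baseline.
# Prints the differences and a verdict; returns non-zero if anything changed.
compare_scans() {
    baseline=$1
    current=$2
    if diff -u "$baseline" "$current"; then
        echo "no changes"
        return 0
    else
        echo "changes detected"
        return 1
    fi
}

# Typical use from cron (nmap and the 192.168.1.0/24 range are examples):
#   nmap -sT -p 1-1024 -oG - 192.168.1.0/24 | grep open > current.txt
#   compare_scans baseline.txt current.txt || mail -s "scan diff" root < current.txt
```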
Remember: security is not a solution; it is a way of life (or a procedure if you want to tell your manager).
Generally speaking, workstations are used by people who don't really care about the underlying technology; they just want to get their work done and retrieve their email in a timely fashion. There are, however, many users who will have the ability to modify their workstation, for better or worse (install packet sniffers, warez ftp sites, www servers, IRC bots, etc.). To add to this, most users have physical access to their workstations, meaning you really have to lock them down if you want to do it right.
1. Use BIOS passwords to lock users out of the BIOS (they
should never be in there; also remember that older BIOSes have
universal passwords).
2. Set the machine to boot from the appropriate hard drive
only.
3. Password the LILO prompt.
4. Do not give the user root access, use sudo to tailor access to
privileged commands as needed.
5. Use firewalling, so that even if they do set up services they
won't be accessible to the world.
6. Regularly scan the process table, open ports, installed
software, and so on for change.
7. Have a written security policy that users can understand, and
enforce it.
8. Remove all sharp objects (compilers, etc.) from a system
unless they are needed.
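Point 5 can be done with the kernel's own packet filtering; on a 2.2.x kernel that means ipchains. A configuration sketch that blocks new inbound connections to ports where user-started services tend to live (the interface and port choices are examples only, not a complete policy):

```shell
# Deny new inbound TCP connections (-y matches SYN packets) to
# unprivileged ports on eth0, where user-run daemons usually end up.
ipchains -A input -i eth0 -p tcp -y -d 0.0.0.0/0 1024:65535 -j DENY

# Log (-l) and deny new inbound connections to a commonly abused port.
ipchains -A input -i eth0 -p tcp -y -d 0.0.0.0/0 8080 -l -j DENY
```

These rules must be run as root and reloaded at boot (e.g. from an init script), or they are lost on restart.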
Remember: security in depth.
Properly set up, a Linux workstation is almost user-proof (nothing is 100% secure), and generally a lot more stable than a comparable Wintel machine. With the added joy of remote administration (SSH/Telnet/NSH) you can keep your users happy and productive.
Servers are a different ball of wax altogether, and generally more important than workstations (if one workstation dies, one user is affected; if the email/www/ftp/etc. server dies, your boss phones you up in a bad mood). Unless there is a strong need, keep the number of users with interactive shells (bash, pine, lynx-based, whatever) to a bare minimum. Segment services (have a mail server, a www server, and so on) to minimize single points of failure. Generally speaking, a properly set up server will run and not need much maintenance (I have one email server at a client location that has been in use for 2 years with about 10 hours of maintenance in total). Any upgrades should be planned carefully and executed on a test system first. Some important points to remember with servers:
1. Restrict physical access to servers.
2. Policy of least privilege: users can break fewer things this
way.
3. MAKE BACKUPS!
4. Regularly check the servers for changes (ports, software,
etc), automated tools are great for this.
5. Software changes should be carefully planned/tested, as they
can have adverse effects (kernel 2.2.x no longer uses
ipfwadm, for example; wouldn't it be embarrassing if you forgot
to install ipchains).
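Point 3 deserves automation too. A minimal sketch of a nightly backup job using tar; the paths, the naming scheme, and the `backup_dir` helper are examples of my own, not a standard utility:

```shell
#!/bin/sh
# backup_dir: write a dated, compressed tarball of a directory tree.
# $1 = directory to back up, $2 = directory to store the tarball in.
backup_dir() {
    src=$1
    dest=$2
    stamp=$(date +%Y%m%d)
    # -C changes into the parent first, so the archive holds relative paths.
    tar czf "$dest/backup-$stamp.tar.gz" -C "$(dirname "$src")" "$(basename "$src")"
}

# Typical use from cron (paths are examples):
#   backup_dir /var/spool/mail /backups
```

Remember that a backup you have never test-restored is not a backup; verify the tarballs regularly, and keep copies off the machine being backed up.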
Minimization of privileges means giving users (and administrators, for that matter) the minimum amount of access required to do their job. Giving a user "root" access to their workstation would make sense if all users were Linux savvy and trustworthy, but they generally aren't (on either count). And even if they were, it would be a bad idea, as chances are they would install some piece of software that is broken, insecure, or worse. If all a user needs to do is shut down/reboot the workstation, then that is the amount of access they should be granted. You certainly wouldn't leave accounting files on a server with world-readable permissions just so the accountants can view them; this concept extends across the network as a whole. Limiting access will also limit damage in the event of an account penetration (have you ever read the post-it notes people put on their monitors?).
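With sudo, the shutdown/reboot case above is a one-line policy. A sketch of an /etc/sudoers entry (the user name "bob" is an example; always edit the file with visudo so syntax errors are caught):

```
# Allow user 'bob' to shut down or reboot this machine, and nothing else.
bob    ALL = /sbin/shutdown, /sbin/reboot
```

The user then runs "sudo /sbin/reboot" and authenticates with their own password; any other command through sudo is refused and logged.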
Written by Kurt Seifried