Sensitive Web-based Applications
Revision History:
2/20/01 - Added provision for informing users when personal information is collected without their knowledge.
3/28/00 - Added pointer to SANS Step-by-step Security Guidelines.
6/10/99 - First Draft.
Sensitive data are increasingly being collected and analyzed over the web. News stories often include reports of websites being defaced and sensitive data being stolen. These standards were developed to limit the risks associated with using web-based applications for sensitive data.
Physical Security
The computer running the web server should be kept physically secured in a locked area. Any backup storage media (tapes, removable disks, etc.) should be similarly protected.
Operating system security
The services offered by the computer running the web server should be kept to a minimum. This minimizes the threats to the web server, since each network service carries its own risks. By eliminating all nonessential services, you eliminate potential holes through which an attacker could break into your system. Examples of services that may pose unneeded risks include mail, FTP, file sharing, remote access, etc.
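As a quick check, the following Python sketch probes a server for a handful of commonly unneeded services so they can be identified and disabled. The host name and port list are illustrative assumptions, not an exhaustive inventory.

import socket

HOST = "webserver.example.edu"   # hypothetical server to audit
SERVICES = {
    21: "FTP",
    23: "telnet (remote access)",
    25: "SMTP (mail)",
    139: "NetBIOS (file sharing)",
}

def listening(host, port, timeout=2.0):
    """Return True if something accepts TCP connections on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0

for port, name in SERVICES.items():
    if listening(HOST, port):
        print(f"{name} is listening on port {port}; disable it if it is not required.")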
Most privileged user
The number of users with most privileged access (e.g. root in UNIX or Administrator in NT) should be kept to a minimum. The most privileged user must never use cleartext, re-usable passwords for remote authentication since passwords can easily be sniffed over public networks.
Limited number of accounts
The number of user accounts on the system should be kept to a minimum. This reduces the threat because it limits the number of accounts from which an attacker could attempt to elevate privileges without authorization.
Authentication
If weak authentication (i.e. re-usable, cleartext passwords) is to be used for unprivileged accounts, then user passwords must be at least seven characters long; must not be dictionary words; must contain a mix of alphabetic, numeric and special characters (e.g. "*&^%$%$#"); and must change at least every sixty days. Good password security is the first line of defense against system abuse. Intruders will often try to guess passwords or will try to crack them after stealing the encrypted password database.
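As an illustration, the following Python sketch checks the length, character-mix, and dictionary rules above (the sixty-day change interval is an operational matter and is not checked here). The word-list location is an assumption; use whatever dictionary is available on your system.

import re

DICTIONARY = "/usr/share/dict/words"   # assumed location of a word list

def load_dictionary(path=DICTIONARY):
    try:
        with open(path) as f:
            return {line.strip().lower() for line in f}
    except OSError:
        return set()    # no word list available; skip the dictionary check

def password_ok(password, words):
    """Apply the length, dictionary, and character-mix rules above."""
    if len(password) < 7:
        return False
    if password.lower() in words:
        return False
    has_alpha = re.search(r"[A-Za-z]", password)
    has_digit = re.search(r"[0-9]", password)
    has_special = re.search(r"[^A-Za-z0-9]", password)
    return bool(has_alpha and has_digit and has_special)

words = load_dictionary()
print(password_ok("s3cret!Xy", words))   # True
print(password_ok("password", words))    # False: dictionary word, no digits or specials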
Weak authentication is subject to various attacks, including password guessing, sharing, cracking and sniffing. Especially sensitive applications may require a form of strong authentication for unprivileged users. One example of strong authentication is the SecurID authenticator token. To successfully authenticate, the user must physically possess the token and must know a password.
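The sketch below illustrates the general idea of two-factor verification: the user must supply both a memorized password and the current code from a token that shares a secret with the server. This is a conceptual illustration only; it does not implement SecurID's actual protocol, and a production system would use a salted password hash.

import hashlib
import hmac
import time

def token_code(shared_secret, interval=60):
    """Derive a six-digit code from the token's secret and the current time window."""
    window = int(time.time() // interval)
    digest = hmac.new(shared_secret, str(window).encode(), hashlib.sha1).hexdigest()
    return str(int(digest, 16) % 1_000_000).zfill(6)

def authenticate(password, code, stored_password_hash, shared_secret):
    """Succeed only if the user knows the password AND holds the token."""
    knows = hmac.compare_digest(
        hashlib.sha256(password.encode()).hexdigest(), stored_password_hash)
    has = hmac.compare_digest(code, token_code(shared_secret))
    return knows and has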
Vulnerability Assessment
Configuring a system securely, and ensuring that it remains so over time, is difficult. System upgrades, patches, and routine maintenance can introduce unintended side effects that undermine security or even reopen holes that had previously been closed. The system administrator may be very competent, but no one is perfect. Automated vulnerability assessment tools should be used as a check on human error. Free tools are available (e.g. COPS and Tiger for UNIX) but may be old and outdated. Commercial tools include System Security Scanner (for UNIX and NT) from Internet Security Systems and Kane Security Analyst (for NT) from Network Associates, Inc. Penn's Information Security office will also run limited network-based vulnerability scans on request.
Platform-specific risks
Most operating systems are insecure by default when they arrive new, out of the box. Securing them before placing them on PennNet is a critical step in the overall security of your application. Penn has licensed detailed checklists for securing Windows NT, Solaris, and Linux (requires a PennNet ID and password). Checklists for some other operating systems are available at:
Vendors of operating systems and application software regularly issue patches to fix serious security weaknesses in their software. Security patches must be applied on a timely and ongoing basis. Pointers to many vendors' security patches are available at:
Logging
Logs help ensure accountability. Knowledge that logs are kept acts as a deterrent to abuse. Logs are also essential in investigating incidents after the fact. Logs are typically created both by the operating system and by applications such as web servers and mail servers. The following events should be logged: failed and successful logins, attempts to access files or directories without authority, and successful and failed attempts to access sensitive data. Log entries should include (where feasible) the time and date of the activity, the user ID, the commands (and command arguments) executed, and the ID of the local terminal or remote computer initiating the connection. To ensure integrity, logs should be written to another computer whenever possible.
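As a sketch of these requirements, the example below records events with a time stamp, user ID, and source, and ships each entry to a separate log host over syslog. The host name "loghost.example.edu", the event names, and the field layout are assumptions.

import logging
import logging.handlers

logger = logging.getLogger("webapp.audit")
logger.setLevel(logging.INFO)
handler = logging.handlers.SysLogHandler(address=("loghost.example.edu", 514))
handler.setFormatter(logging.Formatter("%(asctime)s %(name)s %(message)s"))
logger.addHandler(handler)

def audit(event, user, source, detail=""):
    """Record who did what, from where, and when; the entry is sent to the log host."""
    logger.info("event=%s user=%s source=%s detail=%s", event, user, source, detail)

audit("login_failed", user="jsmith", source="203.0.113.7")
audit("sensitive_read", user="jsmith", source="203.0.113.7", detail="orders.db")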
Logs often contain sensitive information such as dates and times of user access. Logs containing sensitive information should only be accessible to authorized staff and should not be publicly accessible.
Application security
One technique to find sensitive data copied inadvertently is to create unique test data and then to search the entire hard drive with a low-level search tool like Norton Utilities. For instance, on a commerce site, you might create a dummy/test purchase order in your own name with a fake credit card number, and then search your hard drive for all files containing the credit card number. You should be able to account for all files containing the credit card number, and you should verify that they are properly protected through a combination of operating system and application security.
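A minimal sketch of this check, assuming a fake card number has been planted in a test order and that the application data lives under /var/www (widen the search root as appropriate):

import os

TEST_NUMBER = b"4111111111111111"    # fake card number planted in a test order
SEARCH_ROOT = "/var/www"             # assumed data root; widen the search if practical

def files_containing(needle, root):
    """Yield every readable regular file under root that contains the test data."""
    for dirpath, _dirnames, filenames in os.walk(root, onerror=lambda err: None):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if not os.path.isfile(path):     # skip devices, sockets, broken links
                continue
            try:
                with open(path, "rb") as f:
                    if needle in f.read():
                        yield path
            except OSError:
                continue                     # unreadable file; skip it

for path in files_containing(TEST_NUMBER, SEARCH_ROOT):
    print("Contains the test card number:", path)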
Another technique is to create test data and then point a web indexing robot at your site. Search the resulting index for information which should not be publicly accessible. You might search on test credit card numbers, customer names, order numbers, etc.
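A minimal crawler sketch along these lines, using only the Python standard library; the start URL and test strings are illustrative:

from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

START = "https://survey.example.edu/"
TEST_STRINGS = ["4111111111111111", "Test Q. Customer"]

class LinkParser(HTMLParser):
    """Collect the href targets of anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(v for k, v in attrs if k == "href" and v)

seen, queue = set(), [START]
while queue:
    url = queue.pop(0)
    if url in seen or urlparse(url).netloc != urlparse(START).netloc:
        continue                     # stay on the site being tested
    seen.add(url)
    try:
        page = urlopen(url, timeout=10).read().decode("utf-8", "replace")
    except OSError:
        continue
    for s in TEST_STRINGS:
        if s in page:
            print(f"Publicly visible test data '{s}' at {url}")
    parser = LinkParser()
    parser.feed(page)
    queue.extend(urljoin(url, link) for link in parser.links)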
The SSL (Secure Sockets Layer) protocol is a de facto standard for protecting web-based network traffic. The SSL protocol protects data from alteration and disclosure while it is in transit. It also gives users some assurance that they are communicating with the sites they think they are. There is also provision in the SSL standard for client certificates which allow the server to authenticate the end user. SSL is widely deployed in web browsers and servers.
SSL comes in two versions: exportable and domestic (for use within the United States). Exportable SSL is limited to secret keys that may not exceed forty bits, while the domestic version uses stronger 128-bit secret keys. Forty-bit secret-key encryption is relatively weak; it has been shown that messages encrypted with it can be decrypted in a matter of hours without any prior knowledge of the key. It is recommended that applications handling sensitive data support 128-bit encryption as an option for users whose browsers implement the stronger domestic protocol.
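On the server side, weak cipher suites can be refused outright. The sketch below shows the idea with Python's ssl module; modern TLS stacks no longer offer 40-bit export ciphers, but the same principle (accept only strong suites) applies. The certificate and key file names are assumptions.

import ssl

context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.load_cert_chain(certfile="server.pem", keyfile="server.key")  # assumed paths
# Restrict negotiation to high-strength suites; reject weak, export, and anonymous ciphers.
context.set_ciphers("HIGH:!aNULL:!eNULL:!EXPORT:!LOW:!RC4")
# context.wrap_socket(...) would then be used to accept TLS connections.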
It is important that users at public workstations shut down their web browsers so that subsequent users cannot re-use the session ID during its remaining life.
It is a bad practice to authenticate users by storing a re-usable PIN or password as a hidden field or by embedding it in the URL. It is possible for the next user at that workstation to obtain the password and gain unlimited access.
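The safer pattern is to keep credentials on the server and hand the browser only a random, short-lived session identifier, as in this sketch (the fifteen-minute lifetime is an illustrative choice):

import secrets
import time

SESSION_LIFETIME = 15 * 60      # assumed fifteen-minute lifetime, in seconds
_sessions = {}                  # session_id -> (user, expiry); kept server-side only

def create_session(user):
    """Issue a random session ID; only the ID is sent to the browser, never the password."""
    session_id = secrets.token_urlsafe(32)
    _sessions[session_id] = (user, time.time() + SESSION_LIFETIME)
    return session_id

def current_user(session_id):
    """Return the owning user, or None if the session is unknown or expired."""
    record = _sessions.get(session_id)
    if record is None or record[1] < time.time():
        _sessions.pop(session_id, None)
        return None              # require a fresh login
    return record[0]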
It is recommended that security-sensitive web-based applications be run stand-alone on dedicated computers. Running other applications unrelated to the web server almost invariably entails added risk. To the extent that other applications are needed, it is important that they be kept up-to-date with pertinent security patches.
Data Security
Files containing sensitive data should be encrypted wherever possible, using strong encryption, or should be transferred as soon as practical to a secured system that does not provide public services. Unencrypted sensitive files should not be allowed to accumulate on public web servers.
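As one example of strong encryption at rest, the sketch below uses the third-party Python "cryptography" package, whose Fernet construction provides 128-bit AES with integrity protection. The file names are illustrative, and the key itself must be stored securely, away from the public web server.

from cryptography.fernet import Fernet

key = Fernet.generate_key()          # store this key somewhere safe, not on the web server
fernet = Fernet(key)

with open("orders.csv", "rb") as f:          # hypothetical sensitive file
    ciphertext = fernet.encrypt(f.read())

with open("orders.csv.enc", "wb") as f:
    f.write(ciphertext)
# The cleartext file should then be securely removed from the web server.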
Additional Requirements When Using the Web to Conduct Research: Surveys Collecting Personally-Identifiable Data
Web survey subjects must be prevented from viewing any survey records other than their own.
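A minimal sketch of the required access check, with a hypothetical record store: a record is returned only to the authenticated user who owns it.

survey_records = {
    "R-1001": {"owner": "alice", "answers": {"q1": "yes"}},
    "R-1002": {"owner": "bob", "answers": {"q1": "no"}},
}

def get_record(record_id, authenticated_user):
    """Return the respondent's own answers, and nothing else."""
    record = survey_records.get(record_id)
    if record is None or record["owner"] != authenticated_user:
        return None          # never reveal whether the record exists
    return record["answers"]

print(get_record("R-1001", "alice"))   # {'q1': 'yes'}
print(get_record("R-1001", "bob"))     # None: not the owner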
Survey respondents must be informed of any personal data collected about them without their knowledge. Respondents must be given the option of not providing such information or not completing the survey before personal data are collected. Use of "web bugs," URL keywords, or other methods to track respondents' identity without their knowledge is inappropriate unless survey respondents are informed in advance that personal information will be collected.
Web sites collecting personally-identifiable survey information must provide on their web page a privacy statement that describes the kind of information that is collected, how it is to be used, and how it may be disclosed. Example privacy statements can be found at:
http://www.gore2000.org/privacy.html
http://www.doc.gov/ecommerce/privst.htm
Other Resources
The WWW-Security FAQ is a good resource:
http://www.w3.org/Security/Faq/www-security-faq.html
Other good resources include:
Web Security & Commerce, by Simson Garfinkel and Gene Spafford, O'Reilly & Associates, 1997.