Electronic Commerce

Initially, the Internet was designed to be used by government and academic users, but now it is rapidly becoming commercialized. It has on-line “shops”, even electronic “shopping malls”. Customers, browsing at their computers, can view products, read descriptions, and sometimes even try samples. What they lack is the means to buy from their keyboard, on impulse. They could pay by credit card, transmitting the necessary data by modem; but intercepting messages on the Internet is trivially easy for a smart hacker, so sending a credit-card number in an unscrambled message is inviting trouble.

It would be relatively safe to send a credit-card number encrypted with a hard-to-break code. That would require either the general adoption of standard encryption protocols across the Internet, or prior arrangements between buyers and sellers. Both consumers and merchants could see a windfall if these problems are solved. For merchants, a secure and easily divisible supply of electronic money will motivate more Internet surfers to become on-line shoppers.

Electronic money will also make it easier for smaller businesses to achieve a level of automation already enjoyed by many large corporations, whose Electronic Data Interchange heritage means streams of electronic bits now flow instead of cash in back-end financial processes. We need to resolve four key technology issues before consumers and merchants anoint electronic money with the same real and perceived values as our tangible bills and coins. These four key areas are security, authentication, anonymity, and divisibility.

Commercial R&D departments and university labs are developing measures to address security for both Internet and private-network transactions. The venerable answer to securing sensitive information, like credit-card numbers, is to encrypt the data before you send it out. MIT’s Kerberos, which is named after the three-headed watchdog of Greek mythology, is one of the best-known private-key encryption technologies. It creates an encrypted data packet, called a ticket, which securely identifies the user.

To make a purchase, you generate the ticket during a series of coded messages you exchange with a Kerberos server, which sits between your computer system and the one you are communicating with. These latter two systems share a secret key with the Kerberos server to protect information from prying eyes and to assure that your data has not been altered during the transmission. But this technology has a potentially weak link: breach the server, and the watchdog rolls over and plays dead.
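
To make the shared-key idea concrete, here is a minimal sketch in Python, assuming the third-party cryptography package; it only illustrates the principle of a trusted server issuing an encrypted “ticket” that the merchant’s shared key can open, and is not the real Kerberos protocol. The key names, ticket fields and lifetime are invented for the example.

```python
# Minimal sketch of the shared-key "ticket" idea behind Kerberos-style systems.
# Not the real Kerberos protocol; it only shows how a trusted server can vouch
# for a user with symmetric encryption. Requires the "cryptography" package.
import json
import time
from cryptography.fernet import Fernet

merchant_key = Fernet.generate_key()   # key the trusted server shares with the merchant

def issue_ticket(user_id: str, key: bytes, lifetime_s: int = 300) -> bytes:
    """Trusted server encrypts the user's identity with the merchant's key."""
    ticket = {"user": user_id, "expires": time.time() + lifetime_s}
    return Fernet(key).encrypt(json.dumps(ticket).encode())

def verify_ticket(token: bytes, key: bytes) -> str:
    """Merchant decrypts the ticket; tampering or a wrong key raises an error."""
    data = json.loads(Fernet(key).decrypt(token))
    if data["expires"] < time.time():
        raise ValueError("ticket expired")
    return data["user"]

ticket = issue_ticket("alice", merchant_key)
print(verify_ticket(ticket, merchant_key))   # -> alice
```

As the text notes, everything hinges on the server holding the keys: anyone who compromises it can mint tickets at will.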

An alternative to private-key cryptography is a public-key system that directly connects consumers and merchants. Businesses need two keys in public-key encryption: one to encrypt the message, the other to decrypt it. Everyone who expects to receive a message publishes a key. To send digital cash to someone, you look up the public key and use the algorithm to encrypt the payment. The recipient then uses the private half of the key pair for decryption. Although encryption fortifies our electronic transactions against thieves, there is a cost: the processing overhead of encryption and decryption makes high-volume, low-value payments prohibitively expensive.
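
The two-key idea can be sketched in a few lines of Python, again assuming the third-party cryptography package; the key size, padding choice and payment message below are illustrative assumptions, not a prescription for a payment system.

```python
# Sketch of public-key encryption: anyone may encrypt with the published public
# key, but only the holder of the matching private key can decrypt.
# Requires the "cryptography" package.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()          # this half is published

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

payment = b"pay merchant 12.50 USD, order 1001"
ciphertext = public_key.encrypt(payment, oaep)             # sender uses the public key
assert private_key.decrypt(ciphertext, oaep) == payment    # recipient uses the private key
```

These asymmetric operations are exactly the overhead the text mentions: they are far slower than the symmetric ciphers used once a shared key is in place.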

Processing time for a reasonably safe digital signature conspires against keeping costs per transaction low. Depending on key length, an average machine can only sign between twenty and fifty messages per second; decryption is faster. One way to factor out the overhead is to use a trustee organization, one that collects batches of small transactions before passing them on to the credit-card organization for processing. First Virtual, an Internet-based banking organization, relies on this approach.
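
The trustee’s batching idea can be sketched briefly; the class below is a hypothetical aggregator, not First Virtual’s actual system, and the ten-dollar flush threshold is an arbitrary assumption.

```python
# Illustrative sketch of a trustee that batches small payments before forwarding
# a single aggregate charge per customer to the card network.
from collections import defaultdict

class BatchingTrustee:
    def __init__(self, flush_threshold: float = 10.00):
        self.flush_threshold = flush_threshold   # dollars per customer
        self.pending = defaultdict(list)         # customer -> list of amounts

    def record(self, customer: str, amount: float) -> None:
        """Accept a micro-payment; forward one batch once it is worth the fee."""
        self.pending[customer].append(amount)
        if sum(self.pending[customer]) >= self.flush_threshold:
            self.flush(customer)

    def flush(self, customer: str) -> None:
        total = sum(self.pending.pop(customer, []))
        if total:
            print(f"one charge to the card network: {customer} ${total:.2f}")

trustee = BatchingTrustee()
for _ in range(50):
    trustee.record("alice", 0.25)   # fifty 25-cent purchases, only one $10.00 charge so far
```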

Consumers register their credit cards with First Virtual over the phone to eliminate security risks, and from then on they use personal identification numbers (PINs) to make purchases. Encryption may help make electronic money more secure, but we also need guarantees that no one alters the data, most notably the denomination of the currency, at either end of the transaction. One form of verification is the secure hash algorithm, which represents a large file of multiple megabytes with a relatively short number consisting of a few hundred bits.

We use the surrogate file, whose smaller size saves computing time, to verify the integrity of a larger block of data. Hash algorithms work similarly to the checksums used in communications protocols: the sender adds up all the bytes in a data packet and appends the sum to the packet. The recipient performs the same calculation and compares the two sums to make sure everything arrived correctly.
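
A short Python sketch makes the contrast concrete: a standard-library hash digest stands in for the “surrogate” of a large file, next to the byte-sum checksum just described. The payload and packet contents are invented for illustration.

```python
# A short digest stands in for a large file: any change to the data changes the
# digest, so a recipient can verify integrity without re-reading the whole file.
import hashlib

def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()    # a 256-bit "surrogate" value

def checksum(packet: bytes) -> int:
    return sum(packet) % 65536                 # simple protocol-style byte sum

payload = b"\x00" * 5_000_000                  # pretend multi-megabyte file
print(digest(payload))                         # a few hundred bits, not megabytes
print(checksum(b"denomination=1.00"))          # sender appends this; receiver recomputes it
```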

One possible implementation of secure hash functions is in a zero-knowledge-proof system, which relies on challenge/response protocols. The server poses a question, and the system seeking access offers an answer. If the answer checks out, access is granted. In practice, developers could incorporate the common knowledge into software or a hardware encryption device, and the challenge could then consist of a random-number string. The device might, for example, submit the number to a secure hash function to generate the response.
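
The scheme described here behaves like a keyed-hash challenge/response, which can be sketched with Python’s standard hmac module. This simplified stand-in omits the mathematics of a true zero-knowledge proof, and the shared secret is an invented example of the “common knowledge” built into the device.

```python
# Challenge/response on a keyed hash: the server sends a random number, and the
# device answers with a hash only a holder of the shared secret could compute.
import hashlib
import hmac
import secrets

shared_secret = b"secret-burned-into-the-device"   # illustrative only

def respond(challenge: bytes, secret: bytes) -> str:
    return hmac.new(secret, challenge, hashlib.sha256).hexdigest()

challenge = secrets.token_bytes(16)                # the server's random-number string
answer = respond(challenge, shared_secret)         # the device's reply
granted = hmac.compare_digest(answer, respond(challenge, shared_secret))
print("access granted" if granted else "access denied")
```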

The third component of the electronic-currency infrastructure is anonymity: the ability to buy and sell as we please without threatening our fundamental freedom of privacy. If unchecked, all our transactions, as well as analyses of our spending habits, could eventually reside on the corporate databases of individual companies or in central clearinghouses, like those that now track our credit histories. Serial numbers offer the greatest opportunity for broadcasting our spending habits to the outside world. Today’s paper money floats so freely throughout the economy that serial numbers reveal nothing about our spending habits.

But a company that mints an electronic dollar could keep a database of serial numbers that records who spent the currency and what the dollars purchased. It is therefore important to build a degree of anonymity into electronic money. Blind signatures are one answer. Devised by a company named DigiCash, they let consumers scramble serial numbers. When a consumer makes an E-cash withdrawal, the PC calculates the number of digital coins needed and generates random serial numbers for the coins.

The PC specifies a blinding factor, a random number that it uses to multiply the coin serial numbers. A bank encodes the blinded numbers using its own secret key and debits the consumer’s account. The bank then sends the authenticated coins back to the consumer, who removes the blinding factor. The consumer can spend bank-validated coins, but the bank itself has no record of how the coins were spent.
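
The blinding arithmetic can be illustrated with textbook-sized RSA numbers. The sketch below follows the classic blind-signature recipe rather than DigiCash’s production system; every constant is chosen only for readability, and it assumes Python 3.8 or later for modular inverses with pow().

```python
# Toy RSA blind signature illustrating the blinding step described above.
# Textbook-sized numbers for readability; a real system uses full-size keys.
import secrets
from math import gcd

p, q = 61, 53
n = p * q                            # bank's public modulus
e = 17                               # bank's public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # bank's private signing exponent

coin_serial = 1234                   # consumer's random coin serial number
r = 0
while gcd(r, n) != 1:                # blinding factor, coprime with the modulus
    r = secrets.randbelow(n - 2) + 2

blinded = (coin_serial * pow(r, e, n)) % n         # consumer blinds the serial
signed_blinded = pow(blinded, d, n)                # bank signs without seeing it
signature = (signed_blinded * pow(r, -1, n)) % n   # consumer removes the blinding factor

assert pow(signature, e, n) == coin_serial % n     # anyone can verify the bank's mark
```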

The fourth technical component in the evolution of electronic money is divisibility. Everything may work fine if transactions use nice round dollar amounts, but that changes when a company sells information for a few cents or even fractions of a cent per page, a business model that’s evolving on the Internet. Electronic-money systems must be able to handle high volume at a marginal cost per transaction. Millicent, a micropayment system developed at Digital Equipment, may achieve this goal. Millicent uses a variation on the digital-check model with decentralized validation at the vendor’s server, and relies on third-party organizations that take care of account management, billing, and other administrative duties.

Millicent transactions use scrip, digital money that is valid only within Millicent. Scrip consists of a digital signature, a serial number, and a stated value (typically a cent or less). To authenticate transactions, Millicent uses a variation of the zero-knowledge-proof system: consumers receive a secret code when they obtain scrip, and this code proves ownership of the currency when it is spent. The vendor that issued the scrip uses a master customer secret to verify the consumer’s secret.
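
The passage above does not spell out the exact derivations, so the routine below is only a guess at the flavor of the scheme: scrip certificates and customer secrets are derived from vendor-held master secrets with a keyed hash, letting the vendor validate a coin locally with cheap hashing instead of public-key operations. All field names and secrets are hypothetical.

```python
# Illustrative scrip check in the spirit of Millicent (assumed details, not
# Digital's specification): one cheap keyed hash validates the coin locally.
import hashlib
import hmac

MASTER_SCRIP_SECRET = b"vendor-master-scrip-secret"        # hypothetical
MASTER_CUSTOMER_SECRET = b"vendor-master-customer-secret"  # hypothetical

def mint_scrip(serial: str, value_cents: int) -> dict:
    cert = hmac.new(MASTER_SCRIP_SECRET, f"{serial}:{value_cents}".encode(),
                    hashlib.sha256).hexdigest()
    return {"serial": serial, "value_cents": value_cents, "cert": cert}

def customer_secret(customer_id: str) -> str:
    return hmac.new(MASTER_CUSTOMER_SECRET, customer_id.encode(),
                    hashlib.sha256).hexdigest()

def validate(scrip: dict, presented_secret: str, customer_id: str) -> bool:
    expected = hmac.new(MASTER_SCRIP_SECRET,
                        f"{scrip['serial']}:{scrip['value_cents']}".encode(),
                        hashlib.sha256).hexdigest()
    return (hmac.compare_digest(scrip["cert"], expected) and
            hmac.compare_digest(presented_secret, customer_secret(customer_id)))

coin = mint_scrip("S-0001", 1)                              # a one-cent piece of scrip
print(validate(coin, customer_secret("alice"), "alice"))    # -> True
```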

The system hasn’t yet been launched commercially, but Digital says internal tests of transactions across TCP/IP networks indicate the system can validate approximately 1000 requests per second, with TCP connection handling taking up most of the processing time. Digital sees the system as a way for companies to charge for information that Internet users obtain from Web sites. Security, authentication, anonymity, and divisibility all have developers working to produce the collective answers that may open the floodgates to electronic commerce in the near future.

The fact is that the electronic-money genie is already out of the bottle. The market will demand electronic money because of the accompanying new efficiencies that will shave costs in both consumer and supplier transactions. Consumers everywhere will want the bounty of a global marketplace, not one that’s tied to bankers’ hours. These efficiencies will push developers to overcome today’s technical hurdles, allowing bits to replace paper as our most trusted medium of exchange.

Internet Censorship

Everyone has heard of the Internet and how it is going to help set the world free. The Internet is the fastest growing form of communication and is becoming more and more common in the home. Companies these days do big business over the Internet, and online shopping has grown tremendously in the last few years. For instance, the online auction site eBay sells millions of items every year online. Many companies are making even more plans to expand their business to the Internet. Unfortunately, there have been numerous attempts lately to censor the Internet.

If the Internet is controlled, regulated, restricted, or censored, its capabilities will be harshly limited. In recent years, America’s economy has become increasingly dependent on the need to instantly move large amounts of information across long distances. Computerization has changed everyone’s life in ways that were never before possible. The global network of interconnected computers allows people to send electronic mail messages across the world in the blink of an eye and stay updated on world events as they happen; the world has become a much smaller place as a result of this global communication and exchange of ideas.

There are also thousands of online communities of people who share common interests through message boards, chat rooms, and electronic mailing lists (Wilmott 106). Right now, the Internet is the ultimate demonstration of the First Amendment’s guarantee of free speech: a place where people can speak their minds without being punished for what they say or how they choose to say it. The Internet owes its incredible worldwide success to its protection of free speech, not only in America, but also in countries where freedom of speech is not guaranteed.

For some, it is the only place where they can speak their minds without fear of political or religious persecution (Cyberchaos). The Internet is also one of America’s most valuable types of technology; scientists use email for quick and easy communication, and they post their current scientific discoveries on online newsgroups so other scientists in the same field all over the world can learn of them in minutes. Ordinary people use the Internet for communication, expressing their opinions in newsgroups, obtaining new information from the WWW, downloading all types of media files, or just surfing for their own personal enjoyment.

Users of the Internet have the freedom to express anything they believe. The fact that the Internet has no single authority figure creates a problem about what kind of materials should be available on the Internet. (Hentoff 12) The largest controversy that surrounds censoring the Internet is what information should be considered offensive. The Internet can be viewed in many different ways. It can be considered a carrier of common data, similar to a phone company, which must ignore what is broadcast for privacy reasons.

Or, it can be considered a distributor and broadcaster of information, much like a television or radio station, which is responsible for what it broadcasts and has to conform to federal standards and regulations. This argument is the main concern of the censorship debate. The Internet is a carrier of information, not a broadcaster, since it only provides the basic structure for information transfer and sharing (Cyberchaos). But this angers lawmakers, because the laws that exist today do not apply well to the Internet.

The Internet cannot be viewed as one type of transfer medium under current broadcast definitions (Muzzling the Internet). One large difference that sets the Internet apart from broadcast media is the fact that one can’t stumble across a vulgar or obscene site without first entering an address or following a link from another page. There are exceptions, of course, but for the most part, if people want to find dirty material on the Internet, they have to go out and look for it. The Internet is much more like going into a bookstore and choosing to look at adult magazines than it is like channel surfing on television (Miller 75).

The Internet is a great place of entertainment and education, but like all places used by millions of people, it has some bad influences that people would rather not have their children view. Parents usually try to protect their children, but there are no boundaries to the Internet. For this reason, there have been many attempts at censoring the Internet in the name of protecting children. One example is the Communications Decency Act of 1995.

The Communications Decency Act, also known as the Internet Censorship Act, was introduced in the U.S. Congress in 1995. It would make it a crime to make anything indecent available to a child, or to send anything indecent with “intent to annoy, abuse, threaten, or harass.” The goal of this bill was to try to make all public material on the Internet suitable for young children. The bill would have forced commercial servers that carry pictures of nudity, like those run by Penthouse or Playboy, to shut down immediately or face prosecution. The same would apply to any amateur web site that features nudity, sex talk, or dirty language.

Posting any dirty words in an online discussion group, which occurs often, could make one liable for a $50,000 fine and six months in jail (Levy 78-79). Why does it suddenly become illegal to post something that has been legal for years in print? If it had passed, the bill would also have criminalized private mail: “… I can call my brother on the phone and say anything – but if I say it on the Internet, it’s illegal” (Levy 53). Most Internet users are enjoying their freedom of speech on the Net, which is supposed to be protected by the First Amendment of the United States.

It is believed by many that the Internet provides greater freedom of speech and press than anything before in our history. “Heavy-handed attempts to impose restrictions on the unruly but incredibly creative anarchy of the Net could kill the spirit of cooperative knowledge-sharing that makes the Net valuable to millions” (Rheingold). The freedom of ideas and expression is what makes the Internet important and enjoyable, and it should not be suppressed for any reason. Additionally, only a very small portion of the Internet contains offensive material.

Most people do not use the Internet for pornography. There is no doubt that porn is popular, but the Net is mostly used for communication and information exchange, and only a tiny portion of it contains pornography and other offensive material. While people are concerned about Internet pornography, much of it is perfectly legal; pornography is legal in video and magazines, for example, so it is contradictory to ban the Internet equivalents (Legal Definition… ). “Citizens should have the right to restrict the information flow into their homes.

They should be able to exclude from their home any subject matter that they do not want their children to see. But sooner or later, their children will be exposed to everything from which they have shielded them…” (Rheingold). The Internet is definitely not the only means for teenagers to find inappropriate material. If kids want to get hold of dirty pictures or magazines, there are many other ways to find them besides the Internet. If the purpose of censorship is to prevent minors from being exposed to indecent material, then the Internet is not the only thing that would have to be censored.

Censoring the Net will only eliminate one way for children to find this material. Government censorship is not the solution to the problem, and other measures with the same effect can be used instead. For example, there are many software programs, available for purchase or as free downloads, that block out web sites with offensive content. Programs such as Net Nanny, Cyber Patrol, and Net Watch can be set up by parents to block access to websites containing words or foul language that may be unsuitable for children.

While these programs have many flaws (a completely appropriate website on breast cancer could be blocked), they are definitely a much smarter and fairer alternative to government censorship (Parental Control Ware). To conclude, the Internet is one of the world’s greatest resources for freedom of speech and expression, and it has the potential to bring education and better communication to every part of the world. All types of people use it, and its free speech and equality have made it incredibly popular.

However, government attempts to censor the Internet in the name of protecting children can only have harmful effects. Censoring the Internet will only contribute to limiting its potential. It also has to be taken into account that indecent or pornographic websites only make up a tiny portion of the Net, and much pornography is legal. The solution to keeping kids from getting into inappropriate websites is to monitor their access, use filtering software, and teach them morals. Censoring the Internet can only be harmful to everyone else who uses it.

The History of the Internet and the WWW

The Internet started out as an information resource for the government so that its agencies could talk to each other. They called it “The Indestructible Network” because so many computers were linked together that if one server went down, no one would notice. This report will mainly focus on the history of the World Wide Web (WWW) because it is the fastest growing resource on the Internet. The Internet consists of different protocols such as the WWW, Gopher (like the WWW but text based), FTP (File Transfer Protocol), and Telnet (which allows you to connect to different BBS’s).

There are many smaller ones as well, too numerous to list. A BBS is an abbreviation for Bulletin Board Service: a computer that you can either dial into or access from the Internet. BBS’s are normally text based. Tim Berners-Lee, a graduate of Oxford University, England, is now with the Laboratory for Computer Science (LCS) at the Massachusetts Institute of Technology (MIT). He directs the W3 Consortium, an open forum of companies and organizations with the mission to realize the full potential of the Web.

With a background of system design in real-time communications and text processing software development, in 1989 he invented the World Wide Web, an Internet-based hypermedia initiative for global information sharing, while working at CERN, the European Particle Physics Laboratory. Earlier, he spent two years with Plessey Telecommunications Ltd, a major UK telecom equipment manufacturer, working on distributed transaction systems, message relays, and bar code technology.

In 1978 Tim left Plessey to join D. G. Nash Ltd, where he wrote, among other things, typesetting software for intelligent printers, a multitasking operating system, and a generic macro expander. A year and a half spent as an independent consultant included a six-month stint as consultant software engineer at CERN, the European Particle Physics Laboratory in Geneva, Switzerland. Whilst there, he wrote, for his own private use, his first program for storing information using random associations.

Named “Enquire”, and never published, this program formed the conceptual basis for the future development of the World Wide Web. I could go on and on forever telling you about this person, but my report is not about him. From 1981 until 1984, Tim was a founding Director of Image Computer Systems Ltd, with technical design responsibility. In 1984, he took up a fellowship at CERN, to work on distributed real-time systems for scientific data acquisition and system control. In 1989, he proposed a global hypertext project, to be known as the World Wide Web.

Based on the earlier “Enquire” work, it was designed to allow people to work together by combining their knowledge in a web of hypertext documents. He wrote the first World Wide Web server and the first client, a WYSIWYG hypertext browser/editor which ran in the NeXTStep environment. This work was started in October 1990, and the program “WorldWideWeb” was first made available within CERN in December, and on the Internet at large in the summer of 1991. From 1991 through 1993, Tim continued working on the design of the Web, coordinating feedback from users across the Internet.

His initial specifications of URIs, HTTP and HTML were refined and discussed in larger circles as the Web technology spread. In 1994, Tim joined the Laboratory for Computer Science (LCS) at the Massachusetts Institute of Technology (MIT) as Director of the W3 Consortium, which coordinates W3 development worldwide, with teams at MIT and at INRIA in France. The consortium takes as its goal to realize the full potential of the Web, ensuring its stability through rapid evolution and revolutionary transformations of its usage.

In 1995, Tim Berners-Lee received the Kilby Foundation’s “Young Innovator of the Year” Award for his invention of the World Wide Web, and was co-recipient of the ACM Software Systems Award. He has been named as the recipient of the 1996 ACM Kobayashi award, and co-recipient of the 1996 Computers and Communication (C&C) award. He has honorary degrees from the Parsons School of Design, New York (D.F.A., 1996) and Southampton University (D.Sc., 1996), and is a Distinguished Fellow of the British Computer Society. This has just been about Tim, but here is the real history of the WWW.

The Internet: its effects and its future

The Internet is, quite literally, a network of networks. It comprises tens of thousands of interconnected networks spanning the globe. The computers that form the Internet range from huge mainframes in research establishments to modest PCs in people’s homes and offices. Despite the recent hype, the Internet is not a new phenomenon. Its roots lie in a collection of computers that were linked together in the 1970s to form the US Department of Defense’s communications systems.

Because its designers feared the consequences of nuclear attack, there was no central computer holding vast amounts of data; rather, the information was dispersed across thousands of machines. A set of rules, or protocols, known as TCP/IP was developed to allow disparate devices to work together. The original network has long since been upgraded and expanded, and TCP/IP is now a “de facto” standard. Millions of people worldwide are using the Internet to share information, make new associations and communicate.

Individuals and businesses, from students and journalists, to consultants, programmers and corporate giants are all harnessing the power of the Internet. For many businesses the Internet is becoming integral to their operations. Imagine the ability to send and receive data: messages, notes, letters, documents, pictures, video, sound- just about any form of communication, as effortlessly as making a phone call. It is easy to understand why the Internet is rapidly becoming the corporate communications medium.

Using the mouse on your computer, the familiar point-and-click functionality gives you access to electronic mail for sending and receiving data, and file transfer for copying files from one computer to another. Telnet services allow you to establish connections with systems on the other side of the world as if they were just next door. This flood of information is a beautiful thing and it can only open the minds of society. With the explosion of the World Wide Web, anyone could publish his or her ideas to the world.

Before, in order to be heard, one had to go through publishers who were willing to invest in one’s ideas to get something put into print. With the advent of the Internet, anyone who has something to say can be heard by the world. By letting everyone speak their mind, the Internet opens up all new ways of thinking to anyone who is willing to listen. Moreover, the Internet is an information resource for you to search, gathering new data on key aspects of your market. Perhaps most importantly, the Internet offers a new way of doing business.

It is a virtual marketplace where customers can, at the push of a button, select goods, place an order and pay using a secure electronic transaction. Businesses are discovering the Internet as the most powerful and cost-effective tool in history. The Net provides a faster, more efficient way to work with colleagues, customers, vendors and business partners, irrespective of location or operating system, and harnessing this powerful resource gives companies strategic advantages by leveraging information into an essential business asset. The “technology of the future” is here today.

Businesses making the transition are prospering, and will continue to prosper; those that do not will most certainly suffer the consequences. One of the most commonly asked questions is, “Will the Net help me sell more product?” The answer is yes, but in ways you might not expect. The Internet is a communication “tool” first, not an advertising medium. Unlike print or broadcast media, the Internet is interactive; and unlike the telephone, it is both visual and content rich. A Web site is an excellent way to reduce costs, improve customer service, disseminate information and even sell to your market.

Perhaps the most important facts about the Internet are that it contains a wealth of information that can be sent across the world almost instantly, and that it can unite people in wildly different locations as if they were next to each other. The soundest claims for the importance of the Internet in today’s society are based upon these very facts. People of like minds and interests can share information with one another through electronic mail and chat rooms. E-mail is enabling radically new forms of worldwide human collaboration.

Approximately 225 million people can send and receive e-mail, and they represent a network of potentially cooperating individuals dwarfing anything that even the mightiest corporation or government can muster. Mailing-list discussion groups and online conferencing allow us to gather together to work on a multitude of projects that are interesting or helpful to us. Chat rooms and mailing lists can connect groups of users to discuss a topic and share ideas. Materials from users can be added to a Web site to share with others and can be updated quickly and easily at any time.

However, the most exciting part of the Internet is its multimedia and hypertext capabilities. The Web provides information in many different formats. Of course, text is still a popular way to transmit information, but the Web also presents information in sound bites, such as music, voice, or special effects. Graphics may be still photographs, drawings, cartoons, diagrams, tables, or other artwork, but they also may be moving, such as animated video. Hypertext links allow users to move from one piece of information to another. A link might be an underlined word or phrase, an icon or a symbol, or a picture, for example.

When a link is selected, usually by clicking the mouse on the link, the user sees another piece of information, which may be electronically stored on another computer thousands of miles away. Of major importance is the fact that the Internet supports online education. Online education introduces unprecedented options for teaching, learning, and knowledge building. Today, access to a microcomputer, modem, telephone line, and communication program offers learners and teachers the possibility of interactions that transcend the boundaries of time and space.

Even from an economic standpoint, the costs of establishing a brand new educational program for a few thousand students are far less than the cost of a building to house the same number of students. New social and intellectual connectivity is proliferating as educational institutions adopt computer-mediated communication for educational interactions. There are many school based networks that link learners to discuss, share and examine specific subjects such as environmental concerns, science, local and global issues, or to enhance written communication skills in first- or second- language proficiency activities.

Online education is a unique expression of both existing and new attributes. It shares certain attributes with the distance mode and with the face-to-face mode; however, in combination, these attributes form a new environment for learning. Online education is distinguished by the social nature of the learning environment that it offers: like face-to-face education, it supports interactive group communication. Historically, the social, affective, and cognitive benefits of peer interaction and collaboration have been available only in face-to-face learning.

The introduction of online education opens unprecedented opportunities for educational interactivity. The mediation of the computer further distinguishes the nature of the activity online, introducing entirely new elements to the learning process. The potential of online education can be explored through five attributes that, taken together, both delineate its differences from existing modes of education and also characterize online education as a unique mode.

They may learn independently, at their own pace, in a convenient location, at a convenient time, about a greater variety of subjects, from a greater variety of institutions or educators and trainers. But no matter how great and significant the effects of the Internet on our lives might be, there are some quite considerable drawbacks. A very important disadvantage is that the Internet is addictive. One of the first people to take the phenomenon seriously was Kimberly S. Young, Ph.D., professor of psychology at the University of Pittsburgh.

She takes it so seriously, in fact, that she founded the Center for Online Addiction, an organization that provides consultation for educational institutions, mental health clinics and corporations dealing with Internet misuse problems. Psychologists now recognize Internet Addiction Syndrome (IAS) as a new illness that could ruin hundreds of lives. Internet addicts are people who reportedly stay online for six, eight, ten or more hours a day, every day. They use the Internet as a way of escaping problems or relieving distressed moods. Their usage can cause problems in their family, work and social lives.

They feel anxious and irritable when offline and crave getting back online. Despite the consequences, they continue to use the Internet regardless of admonishments from friends and family. Special help groups have been set up to give out advice and offer links with other addicts. Internets Anonymous and Webaholics are two of the sites offering help, but only through logging onto the Internet. A study of 100 students by Margaret Martin of Glasgow University found that one in six (16%) felt irritable, tense, depressed or restless if they were barred from using the Internet.

More than one in four (27%) felt guilty about the time they spent online. One in ten (10%) admitted neglecting a partner, child or work because of overuse. One in twenty-five (4%) said it had affected their mental or physical health for the worse. Another psychologist, Maressa Hecht Orzack, Ph.D., posits that people use the Internet compulsively because it so easily facilitates the reward response common to addictive behavior: “If they are lonely and need compassion, camaraderie or romance, it can be found immediately. If they are looking for sex or pornography, they need only to click a button.

They can experience the thrill of gambling, playing interactive games from the comfort of their chairs. They can entertain fantasies by pretending to be other people, or engaging in interactive role-playing games. The reward received from these activities can manifest itself physically, so that the person begins to crave more of it.” The effects lead to headaches, lack of concentration and tiredness. Addicts need not cut off access altogether, but they should set time limits and limit Internet usage to a set number of hours each day.

Psychologist Robert Kraut says on the subject: “We have evidence that people who are online for long periods of time show negative changes in how much they talk to people in their family and how many friends and acquaintances they say they keep in contact with. They also report small but increased amounts of loneliness, stress and depression. What we do not know is exactly why. Being online takes up time, and it may be taking time away from sleep, social contact or even eating. Our negative results are understandable if people’s interactions on the net are not as socially valuable as their other activities.”

Another considerable drawback of the Internet is that it is susceptible to hackers. Hackers are people who have tremendous knowledge of computer systems and use it to steal, cheat or misuse confidential or classified information for the sake of fun or profit. As the world increases its reliance on computer systems, we become more vulnerable to extremists who use computer technology as a weapon. It is called cyber-terrorism, and research groups within the CIA and FBI say cyber-warfare has become one of the main threats to global security. But how serious is hacking?

In 1989, the Computer Emergency Response Team, a nonprofit organization that monitors security issues throughout America from its base at Carnegie Mellon University in Pittsburgh, reported 132 computer intrusions. Last year it recorded 2341. And in recent months, a few celebrated cases have shed new light on the hacker’s netherworldly activities. One notorious hacker is American Kevin Mitnick, a 31-year-old computer junkie arrested by the FBI in February for allegedly pilfering more than $1 million worth of data and 20,000 credit-card numbers through the Internet.

Still, the new wave of network hacking is presenting fresh problems for companies, universities and law-enforcement officials in every industrial country. In July, John Deutch, head of the CIA, told Congress that he ranked information warfare as the second most serious threat to national security, just below weapons of mass destruction in terrorist hands. The Internet suffers around a million successful penetrations every year, while the Pentagon, headquarters of the US military, faces 250,000 attempts to hack into its computers. But what can be done about hacking?

There are ways for corporations to safeguard against hackers, and the demand for safety has led to a boom industry in data security. Security measures range from user IDs and passwords to thumbprint, voiceprint or retinal scan technologies. Another approach is public-key encryption, used in software packages such as Entrust. An information system girded with firewalls and gates, broken vertically into compartments and horizontally by access privileges, where suspicion is the norm and nothing can be trusted, will probably reduce the risk of information warfare as we know it today to negligible levels.

Yet such a fortress approach is increasingly intrusive and somewhat antithetical to the purposes for which science is pursued; it is no accident that the World Wide Web was invented to enable particle physicists to share knowledge. Michael Binder, an assistant deputy minister in Canada’s industry department, asks another key question: “How would you regulate the Internet?” Computer and legal experts all agree that enforcement is difficult. Still, a committee of the Canadian Association of Chiefs of Police has made several recommendations. One would make it illegal to possess computer hacking programs, those used to break into computer systems.

Another would make the use of computer networks and telephone lines in the commission of a crime an offence in itself. The committee also recommends agreements with the United States that would allow police officials in both countries to search computer data banks. But for the time being, Binder says, the government is in no rush to rewrite the statute book. “We don’t know how it will evolve. We don’t want to stifle communication. We don’t want to shut down the Net.” The problem with regulating the Internet is that no one owns it and no one controls it.

Messages are passed from computer system to computer system in milliseconds, and the network literally resembles a web of computers and connecting telephone lines. It crosses borders in less time than it takes to cross most streets, and connections to Asia or Australia are as commonplace as dialing your neighbor next door. It is the Net’s very lack of frontiers that make law enforcement so difficult. Confronted with the difficulty of trying to grab onto something as amorphous as the Net, some critics and government officials are hoping that Internet service providers can police the Net themselves.

However, Ian Kyer, a lawyer and President of Computer Law Association Inc., believes that much of the debate about the Internet arises because it is so new. “We’re just sort of waking up to it. Now that it’s an everyday thing, it’s coming to the attention of the legislators and police forces, and I think they’re not going to like what they see. One of the real problems with the law of the Internet is deciding, where does the offence occur?” The best guide to the way the law should work is to study the past and the present, not to attempt to predict every possible future.

As Justice Oliver Wendell Holmes said long ago, “The life of the law has not been logic; it has been experience. ” When a new media technology emerges, the best thing to do is to wait and see what problems actually emerge, not panic about what could happen. Once we understand the actual risks, we can legislate accordingly and with full regard to the competing interests at stake. But there is another problem that practically circulates through the Internet: The viruses. They can move stealthily and strike without warning.

Yet they have no real life of their own, and go virtually unnoticed until they find a suitable host. Computer viruses, tiny bits of programming code capable of destroying vast amounts of stored data, bear an uncannily close relationship to their biological namesake. And like natural viruses they are constantly changing, which makes them more and more difficult to detect. It is estimated that two or three new varieties are written each day. Most experts believe that a virus is created by an immature, disenchanted computer whiz, frequently called a “cracker”.

The effects may be benign: one variation of the famous “Stoned” virus merely displays a message calling for the legalization of marijuana. Other viruses, however, can scramble files or create a frenzy of duplication that may cause a computer’s microchips to fail. The rapid increase in computer networks, with their millions of users exchanging vast amounts of information, has only made things worse. With word-processing macros embedded in text, opening e-mail can now unleash a virus in a network or on a hard disk. Web browsers can also download running code, some of it possibly malign.

Distributing objects over global networks without a good way to authenticate them leads to similar risks. Crackers have also succeeded in tainting software sold by brand-name manufacturers. A clutch of companies offer antiviral programs, capable of detecting viruses before they have the chance to spread. Such programs find the majority of viruses but virus detection is likely to remain a serious problem because of the ingenuity of crackers. One new type of virus, known as polymorphic, evades discovery by changing slightly each time it replicates itself.

Another extremely important issue on the Internet is child pornography. Computer technology is providing child molesters and child pornographers with powerful new tools for victimizing children. The result is an explosive growth in the production and distribution of illegal child pornography, as well as new forms of child predation. Research carried out at Stockholm University identified 5651 postings of child pornography in four discussion groups. Children around the world are being sexually assaulted, molested and exploited by people who also misuse computers and related technology.

The abuse is being photographed and distributed to an international marketplace of child pornography consumers via the Internet. That marketplace- along with related Internet sites that encourage child sexual abuse- is leading to new assaults against children. No longer are schools, public libraries and homes safe harbors from sexual pedophiles- people whose sexual fantasies focus on girls or boys- from around the world. In the past photographs of children being raped, sexually abused and exploited were sold at high prices through tightknit, difficult-to-access networks.

Today, those illegal pictures are available for free online, at any hour of the day. Anyone with rudimentary computer skills and an interest in the material can obtain it. Computer networks can also allow pedophiles to identify and contact potential victims without revealing their identities. Often, adult predators pretend to be children until they have gained their victims’ confidence. Federal law defines child pornography as photographs or video that depict people under the age of 18 involved in sexually explicit conduct- such as sexual intercourse, bestiality, masturbation and sadistic or masochistic abuse.

Also prohibited are pictures involving children that include a “lascivious exhibition of the genitals or pubic area”. The Government has introduced a number of measures to deal with pornography and obscene material, including their use on computers. The Criminal Justice and Public Order Act 1998 increased the maximum sentence for possession of indecent photographs of children to 5 years in prison, a $250,000 fine, or both. People convicted of distributing child pornography face up to 15 years in prison and/or a $300,000 fine.

It also gave the police the power to arrest without warrant people suspected of obscenity and certain child pornography offences and greater powers to search and seize obscene material and child pornography. It also closed a potential legal loophole by extending the law to cover simulated child pornography manufactured and stored on computers. In Singapore authorities announced plans to establish a “neighborhood police post” on the Internet to monitor and receive complaints of criminal activity- including the distribution of child pornography.

And in the United States a bill has been introduced, vocally opposed by civil liberties organizations and computer-user groups, that would outlaw the electronic distribution of words and images that are “obscene, lewd, lascivious, filthy or indecent.” However, Federal agencies lack the manpower to cope with all the criminal activity taking place online. Few local law enforcement officers are trained in computer technology. Moreover, Internet providers generally fail to educate their customers about ways to protect children from sexual predators.

Few schools or libraries offer real safety training programs for children online. Many parents have no idea what threats exist or even how the technologies in question work. Last but not least is the threat to our privacy when online. These days, the most skilful manipulators of new information and communications technology to build up files on individuals are private companies collecting personal data on tens of millions of people. Simon Davies, the British head of Privacy International, a human rights watchdog group, says that every citizen of an industrialized country appears today in about 200 different databases.

Such mines of information are centralized, sifted through and correlated to produce very detailed profiles of consumers. The files are then resold to all kinds of firms, which use them to sharpen their marketing strategies, assess the economic reliability of customers and adjust to specific commercial demands. The Internet is an ideal tool for this meticulous task of categorizing the population. It is an extraordinary source of data as well as a practical way to handle such information and circulate it.

To make matters worse, the Internet is a world of invisible tracks. You get the impression when you surf the Web that you leave no traces behind you. The truth is rather different. Some sites place spying devices, or “cookies”, on your computer’s hard drive the moment you log on to them, so they can tell which pages of the site you have looked at, when you looked and for how long. A survey last year by the Electronic Privacy Information Center (EPIC) showed that a quarter of the 100 most popular sites on the Web use cookies to obtain profiles of their users.
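
The mechanism is easy to illustrate with Python’s standard http.cookies module: the site’s first response carries a Set-Cookie header, and the browser sends the same identifier back on later visits, letting the site link each request to the profile it stored earlier. The cookie name and value are invented for the example.

```python
# How a site recognizes a returning browser: the first response sets a cookie,
# and the browser echoes it on every later request to the same site.
from http.cookies import SimpleCookie

# First visit: the server attaches an identifier to its response.
outgoing = SimpleCookie()
outgoing["visitor_id"] = "a1b2c3"          # hypothetical tracking value
outgoing["visitor_id"]["path"] = "/"
print(outgoing.output())                   # -> Set-Cookie: visitor_id=a1b2c3; Path=/

# Later visit: the browser returns the header, so the site can tie this request
# to whatever pages and timings it recorded under "a1b2c3".
incoming = SimpleCookie("visitor_id=a1b2c3")
print(incoming["visitor_id"].value)        # -> a1b2c3
```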

When you next visit them, they can present you with advertising tailored to your interests, or even send you, without your knowledge, programs like Java applets, which can reconfigure a site according to each visitor’s tastes. Arguments persist that the erosion of privacy is not such a big deal; the economic benefits of information availability and mobility, it is said, outweigh limitations on our personal privacy. Is privacy an ethical nicety, an expendable luxury, then, or is it a basic natural right that needs legal protection?

Some philosophers and legal scholars have argued that privacy is an intrinsic good, implying that the right to privacy is fundamental and irreducible. Others contend that privacy is more of an instrumental good. Hence the right to privacy is derived from other rights such as property, bodily security and freedom. While both approaches have validity, the latter seems more compelling. It is especially persuasive when applied to those rights involving our liberty and personal autonomy.

A primary moral foundation for the value of privacy is its role as a condition of freedom: a shield of privacy is absolutely essential if one is freely to pursue his or her projects or cultivate intimate social relationships. Under the European Union’s data protection directive, which came into effect on 25 October 1998, the processing of data about ethnic origins, political opinions, religious and philosophical beliefs, trade union membership, health and sex life is prohibited except where there are special exemptions or derogations.

Moreover, in each of the European Union’s fifteen Member States, a special authority is to protect individuals’ rights and freedoms with regard to the processing of personal data. It is to guarantee citizens the right to be informed, to have access to data concerning them, the right to correct it, and the right to erase data whose processing does not comply with the provisions of the directive. Article 25 states the principle that the transfer of personal data to third countries may only take place if the receiving countries offer a level of protection that is “adequate” within the meaning of EU legislation.

In a globalized economy where information about consumers is the new gold mine, the stakes are huge, involving no more and no less than the future of all banking and trade transactions, especially electronic. The United States has already gone on the offensive by accusing Europe of using privacy protection laws to erect barriers around the valuable European market of 370 million people. President Bill Clinton’s Internet policy adviser Ira Magaziner has even threatened to go to the World Trade Organization (WTO) about it.

At the same time he insists that the US is just as concerned to protect the privacy of its citizens as European governments are. And all studies show that Internet commerce cannot succeed unless consumers can count on information about themselves being kept confidential. There are currently about 250 bills relating to the Internet pending in Congress. Many of them deal specifically with privacy. However, only very few of these have become law, largely because the Clinton administration and Congress have taken a wait-and-see approach to this conflict.

Most lawmakers feel the Internet develops too quickly for static laws to work effectively. Instead, politicians from Vice President Al Gore down are encouraging the Internet industry to regulate itself, while suggesting that their patience is not inexhaustible. It will be very difficult to regulate the Internet because it is global and decentralized, and it is very hard to identify Internet users. The key is developing something that is enforceable. Good intentions are one thing, but in the self-regulatory environment, if somebody is hurt by the misuse of personal information, who pays?

Who provides a remedy to that harmed individual? Nobody does. Privacy is a tough area for personal injury lawyers because it is difficult under our tort law to prove that somebody has been harmed. It is very hard to prove damage to reputation or intentional infliction of emotional distress in cases involving disclosure of personal information. Many individuals and organizations are now relying more heavily on digital networks as they routinely communicate by e-mail, post messages to electronic bulletin boards on the Internet and visit Web sites.

But in the process they become more exposed and vulnerable to those seeking to collect and sell their personal data. When users visit Web sites they often fill out detailed personal profiles that become grist for marketing lists sold to third parties. Digital networks have also made consumer information even more widely and easily available: the use of these networks greatly expands the capability of checking up on someone’s personal background or receiving an electronic list of prospective customers quickly and inexpensively. Indeed, we are moving perilously close to the reality of immediate online personal data.

More disturbing than the loss of our privacy as consumers is the loss of privacy about our financial affairs. Once again, government data banks have usually provided the building blocks for these records. Certain financial information that was always in the public domain, such as real estate and bankruptcy records, is now treated as a basic commodity. Data brokers such as Information America, Inc. allow their subscribers quick online access to the county and court records for many states. Their vast databases contain business records, bankruptcy records, lawsuit information and property records, including liens and judgments.

By computerizing these real estate records, liens, incorporations, licenses and so on, they become more than public documents. They are now on-line commodities, more easily accessed and distributed than their physical counterparts. In addition, this data can be recombined with other personal and financial background information. The most recent assault on privacy has developed in the health care industry, in which patient records have also become commodities for sale. These records, containing highly sensitive and revealing information, are being collected and stored in databases maintained by hospitals.

Thus, medical privacy seems destined to be another victim of our evolving information technologies. By putting so much medical data online without proper safeguards, the Government, the health care industry and the information industry are clearly undermining the foundation of the confidential doctor-patient relationship. It seems quite evident that our right to informational privacy, the right to control the disclosure of and access to our personal information, has been sacrificed for the sake of economic efficiency and other social objectives.

As our personal information becomes tangled in the Web of information technology, our control over how that data will be utilized and distributed is notably diminished. Our personal background and purchases are tracked by many companies that consider us prospects for their products or services; our financial profiles and credit histories are available to a plethora of “legitimate” users, and our medical records are more widely accessible than ever before.

The Net effect is that each of us can become an open book to anyone who wants to take the time to investigate our background. Another adverse consequence is that we can be more easily targeted and singled out, either as individuals or as members of certain groups. Database technology makes it easy to find and exploit certain groups based on age, income level, place of residence, or purchasing habits. At the same time, online data banks now make it especially simple to pinpoint individuals electronically.

If public policy makers do become convinced that privacy is worth preserving, what should be done? Are there any viable solutions? Further complicating the issue, of course, are legitimate economic considerations. Privacy cannot be protected without incurring some costs. And we cannot ignore the economic benefit of acquiring and distributing information and using data as a commercial tool to target the right customers. If the information flow about consumers is overly constrained, a substantial negative economic impact cannot be discounted.

In addition, there must be stricter controls for especially sensitive information such as medical data. If a centralized national database becomes a reality, it will be necessary to achieve a broad public consensus on the definition of the health care trustees who should have access to that data. In summary, then, if informed consent is made mandatory for the reuse of consumer data and there are stricter safeguards for more critical information such as medical data, we can begin to make some progress in protecting privacy rights.

But unless we come to terms with this problem, the boundaries between what is public and private could become much more tenuous. A world where privacy is in such short supply will undermine our freedom and dignity and pose a great threat to our security and well-being. But what is the future of the Internet? The Internet is moving from a relatively passive publishing medium to a truly interactive application deployment platform.

Internet Regulation: Policing Cyberspace

The Internet is a method of communication and a source of information that is becoming more popular among those who are interested in it and have the time to surf the information superhighway. The problem with so much information being accessible to so many people is that some of it is deemed inappropriate for minors. The government wants censorship, but a segment of the population does not.

Legislative regulation of the Internet would be an appropriate function of the government. The Communications Decency Act is an amendment which prevents the information superhighway from becoming a computer “red light district.” On June 14, 1995, by a vote of 84-16, the United States Senate passed the amendment. It is now being brought through the House of Representatives.1 The Internet is owned and operated by the government, which gives it the obligation to restrict the materials available through it.

Though it appears to have sprung up overnight, the inspiration of free-spirited hackers, it in fact was born in Defense Department Cold War projects of the 1950s.2 The United States Government owns the Internet and has the responsibility to determine who uses it and how it is used. The government must control what information is accessible through it. Since this material is not lawfully available through the mail or over the telephone, there is no valid reason these perverts should be allowed to distribute it unimpeded on the Internet.

Since our initiative, the industry has commendably advanced some blocking devices, but they are not a substitute for legislation. Because the Internet has become one of the biggest sources of information in this world, legislative safeguards are essential. The government gives citizens the privilege of using the Internet, but it has never given them the right to use it however they please. They seem to rationalize that the framers of the constitution planned and plotted at great length to make certain that, above all else, the profiteering pornographer, the pervert and the pedophile must be free to practice their pursuits in the presence of children on a taxpayer-created and -subsidized network. People like this are the ones in the wrong. Taxpayers’ dollars are being spent bringing obscene text and graphics into the homes of people all over the world. The government must take control to prevent pornographers from using the Internet however they see fit because they are breaking laws that have existed for years.

Cyberpunks, those most popularly associated with the Internet, are members of a rebellious society that are olluting these networks with information containing pornography, racism, and other forms of explicit When they start rooting around for a crime, new cybercops are entering a pretty unfriendly environment. Cyberspace, especially the Internet, is full of those who embrace a frontier culture that is hostile to authority and fearful that any intrusions of police or government will destroy The self-regulating environment desired by the cyberpunks is an opportunity to do whatever they want.

The Communications Decency Act is an attempt on part of the government to control their “free attitude” displayed in homepages such as Sex, Adult Pictures, X-Rated Porn”, “Hot Sleazy Pictures (Cum again + again)” and “sex, sex, sex. heck, it’s better even better than real sex”6. What we are doing is simply making the same laws, held constitutional time and time again by the courts with regard to obscenity and indecency through the mail and telephones, applicable to the Internet. “7

To keep these kinds of pictures off home computers, the government must control information on the Internet, just as it controls obscenity through the mail or Legislative regulations must be made to control information on the Internet because the displaying or istribution of obscene material is illegal. The courts have generally held that obscenity is illegal under all circumstances for all ages, while “indecency” is generally allowable to adults, but that laws protecting children from this “lesser” form are acceptable.

It’s called protecting those among us who are children from The constitution of the United States has set regulations to determine what is categorized as obscenity and what is not. In Miller vs. California, 413 U. S. at 24-25, the court announced its “Miller Test” and held, at 29, that its three part test constituted “concrete uidelines to isolate ‘hard core’ pornography from expression protected by the First Amendment. 9 By laws previously set by the government, obscene pornography should not be accessible on the Internet. The government must police the Internet because people are breaking laws.

“Right now, cyberspace is like a neighborhood without a police department. “10 Currently anyone can put anything he wants on the Internet with no penalties. The Communications Decency Act gives law enforcement new tools to prosecute those who would use a computer to make the equivalent of obscene telephone calls, o prosecute ‘electronic stalkers’ who terrorize their victims, to clamp down on electronic distributors of obscene materials, and to enhance the chances of prosecution of those who would provide pornography to children via a The government must regulate the flow of information on the Internet because some of the commercial blocking devices used to filter this information are insufficient.

“Cybercops especially worry that outlaws are now able to use powerful cryptography to send and receive uncrackable secret communications and are also aided by anonymous re-mailers. 11 By using features like these it is impossible to use blocking devices to stop children from accessing this information. Devices set up to detect specified strings of characters will not filter those that The government has to stop obscene materials from being transferred via the Internet because it violates laws It is not a valid argument that “consenting adults” should be allowed to use the computer BBS and “Internet” systems to receive whatever they want.

If the materials are obscene, the law can forbid the use of means and facilities of interstate commerce and common carriers to ship or When supplies and information are passed over state or national boundaries, they are subject to the laws governing interstate and intrastate commerce. When information is passed between two computers, it is subjected to the same The government having the power to regulate the information being put on the Internet is a proper extension of its powers. With an information based system such as the Internet there is bound to be material that is not appropriate for minors to see. In passing of an amendment like the Communications Decency Act, the government would be given the power to regulate that material.

Internet Censorship Essay

The Internet is a wonderful place of entertainment and education, but like all places used by millions of people, it has some murky corners people would prefer children not to explore. In the physical world, society as a whole wants to protect children, but there are no social or physical constraints to Internet surfing. During the past decade, our society has become based solely on the ability to move large amounts of information across large distances quickly. Computerization has influenced everyone’s life.

The natural evolution of computers and this need for ultra-fast communications has caused a global network of interconnected computers to develop. This global net allows a person to send E-mail across the world in mere fractions of a second, and enables even the common person to access information worldwide. Is the Internet really a Global phenomenon? Since the beginning of time, mankind has been involved in a never-ending quest to learn more about the world in which he lives.

Our natural curiosity leads us to question everything and investigate the way in which the world around us works. Human beings also have a natural tendency to want to explore and see things in the world around them, which they have never seen before. It is in that sense that the Internet is a perfect extension of human nature. It is a medium that transcends boundaries and makes the world a smaller place, much as the media of printing, radio and television did before it. The Internet is the largest and most versatile source of information in the world today.

With its web sites and chat rooms, it is a means of communicating with people in places all over the face of the earth. Since its conception in 1973, the Internet has grown at a whirlwind rate. An estimated 51 million adults were on-line in the United States alone as of the second quarter of 1997. With advances such as software that allows users with a sound card to use the Internet as a carrier for long distance voice calls and video conferencing, this network is key to the future of the knowledge society. At present, this net is the epitome of the first amendment: free speech.

It is a place where people can speak their mind without being reprimanded for what they say, or how they choose to say it. The key to the worldwide success of the Internet is its protection of free speech, not only in America, but also in other countries where free speech is not protected by a constitution. To be found on the Internet is a huge collection of obscene graphics, Anarchists’ cookbooks and countless other things that offend some people.

With over 30 million Internet users in the U. S. alone (only 3 million of whom surf the net from home), everything is bound to offend someone. The newest wave of laws floating through law making bodies around the world threatens to stifle this area of spontaneity. Recently, Congress has been considering passing laws that will make it a crime punishable by jail to send “vulgar” language over the net, and to export encryption software. No matter how small, any attempt at government intervention in the Internet will stifle the greatest communication innovation of this century.

The government wants to maintain control over this new form of communication, and they are trying to use the protection of children as a smoke screen to pass laws that will allow them to regulate and censor the Internet, while banning techniques that could eliminate the need for regulation. Censorship of the Internet threatens to destroy its freelance atmosphere, while widespread encryption could help prevent the need for government intervention. The current body of laws existing today in America does not apply well to the Internet.

Is the Internet like a bookstore, where servers cannot be expected to review every title? Is it like a phone company that must ignore what it carries because of privacy? Is it like a broadcasting medium, where the government monitors what is broadcast? The trouble is that the Internet can be all or none of these things depending on how it’s used. The Internet cannot be viewed as one type of transfer medium under current broadcast definitions. The Internet differs from broadcasting media in that one cannot just happen upon a vulgar site without first entering a complicated address, or following a link from another source.

“The Internet is much more like going into a book store and choosing to look at adult magazines” (Miller 75). Jim Exon, a democratic senator from Nebraska, wants to pass a decency bill regulating the Internet. If the bill passes, certain commercial servers that post pictures of unclad beings, like those run by Penthouse or Playboy, would of course be shut down immediately or risk prosecution. The same goes for any amateur web site that features nudity, sex talk, or rough language.

Posting any dirty words in a Usenet discussion group, which occurs routinely, could make one liable for a $50,000 fine and six months in jail. Even worse, if a magazine that commonly runs some of those nasty words in its pages, The New Yorker for instance, decided to post its contents on-line, its leaders would be held responsible for a $100,000 fine and two years in jail. Why does it suddenly become illegal to post something that has been legal for years in print? Exon’s bill apparently would also “criminalize private mail,” …

I can call my brother on the phone and say anything–but if I say it on the Internet, it’s illegal” (Levy 53). Congress, in their pursuit of regulations, seems to have overlooked the fact that the majority of the adult material on the Internet comes from overseas. Although many U. S. government sources helped fund Arpanet, the predecessor to the Internet, they no longer control it. Many of the new Internet technologies, including the World Wide Web, have come from overseas. There is no clear boundary between information held in the U. S. and information stored in other countries.

Data held in foreign computers is just as accessible as data in America; all it takes is the click of a mouse to access it. Even if our government tried to regulate the Internet, we have no control over what is posted in other countries, and we have no practical way to stop it. The Internet’s predecessor was originally designed to uphold communications after a nuclear attack by rerouting data to compensate for destroyed telephone lines and servers. Today’s Internet still works on a similar design. The very nature of this design allows the Internet to overcome any kind of barriers put in its way.

If a major line between two servers, say in two countries, is cut, then the Internet users will find another way around this obstacle. This obstacle avoidance makes it virtually impossible to separate an entire nation from indecent information in other countries. If it were physically possible to isolate America’s computers from the rest of the world, it would be devastating to our economy. Recently, a major university attempted to regulate what types of Internet access its students had, with results reminiscent of a 1960’s protest.

A research associate at Carnegie Mellon University conducted a study of pornography on the school’s computer networks. Martin Rimm put together quite a large picture collection (917,410 images) and he also tracked how often each image had been downloaded (a total of 6.4 million). A local court had recently declared pictures of similar content obscene, and the school feared they might be held responsible for the content of its network. The school administration quickly removed access to all these pictures, and to the newsgroups where most of this obscenity is suspected to come from.

A total of 80 newsgroups were removed, causing a large disturbance among the student body, the American Civil Liberties Union, and the Electronic Frontier Foundation, all of whom felt this was unconstitutional. After only half a week, the college had backed down, and restored the newsgroups. This is a tiny example of what may happen if the government tries to impose censorship (Elmer-Dewitt 102). Altered views of an electronic world translate easily into altered views of the real world. “When it comes to our children, censorship is a far less important issue than good parenting.

We must teach our kids that the Internet is an extension and a reflection of the real world, and we have to show them how to enjoy the good things and avoid the bad things. This isn’t the government’s responsibility. It’s ours” (Miller 76). Not all restrictions on electronic speech are bad. Most of the major on-line communication companies have restrictions on what their users can “say.” They must respect their customers’ privacy, however. Private E-mail content is off limits to them, but they may act swiftly upon anyone who spouts obscenities in a public forum.

Self-regulation by users and servers is the key to avoiding government-imposed intervention. Many on-line sites such as Playboy and Penthouse have started to regulate themselves. Both post clear warnings that adult content lies ahead and list the countries where this is illegal. The film and videogame industries subject themselves to ratings, and if Internet users want to avoid government imposed regulations, then it is time they begin to regulate themselves. It all boils down to protecting children from adult material, while protecting the first amendment right to free speech between adults.

Government attempts to regulate the Internet are not just limited to obscenity and vulgar language; they also reach into other areas, such as data encryption. By nature, the Internet is an insecure method of transferring data. A single E-mail packet may pass through hundreds of computers from its source to destination. At each computer, there is the chance that the data will be archived and someone may intercept that data. Credit card numbers are a frequent target of hackers. Encryption is a means of encoding data so that only someone with the proper “key” can decode it.
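
As a rough illustration of what encryption buys for data in transit, the sketch below encrypts and decrypts a message with a shared key. It assumes Python and the third-party cryptography package (the essay names no particular tool, so both are assumptions); to an eavesdropper along the route, the token is just unreadable bytes.

    # Minimal sketch: symmetric encryption with a shared key, assuming
    # the third-party "cryptography" package (pip install cryptography).
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()            # the shared secret "key"
    cipher = Fernet(key)

    card_number = b"4111 1111 1111 1111"   # hypothetical credit-card number
    token = cipher.encrypt(card_number)    # what an interceptor would see
    print(token)                           # unreadable without the key
    print(cipher.decrypt(token))           # only a key holder recovers the data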

Why do you need PGP (encryption)? It’s personal. It’s private. And it’s no one’s business but yours. You may be planning a political campaign, discussing our taxes, or having an illicit affair. Or you may be doing something that you feel shouldn’t be illegal, but is. Whatever it is, you don’t want your private electronic mail (E-mail) or confidential documents read by anyone else. There’s nothing wrong with asserting your privacy. Privacy is as apple-pie as the Constitution. Perhaps you think your E-mail is legitimate enough that encryption is unwarranted.

If you really are a law-abiding citizen with nothing to hide, then why don’t you always send your paper mail on postcards? Why not submit to drug testing on demand? Why require a warrant for police searches of your house? Are you trying to hide something? You must be a subversive or a drug dealer if you hide your mail inside envelopes. Or maybe a paranoid nut. Do law-abiding citizens have any need to encrypt their E-mail? What if everyone believed that law-abiding citizens should use postcards for their mail?

If some brave soul tried to assert his privacy by using an envelope for his mail, it would draw suspicion. Perhaps the authorities would open his mail to see what he’s hiding. Fortunately, we don’t live in that kind of world, because everyone protects most of their mail with envelopes. So no one draws suspicion by asserting their privacy with an envelope. There’s safety in numbers. Analogously, it would be nice if everyone routinely used encryption for all their E-mail, innocent or not, so that no one drew suspicion by asserting their E-mail privacy with encryption.

Think of it as a form of solidarity (Zimmerman).” Until the development of the Internet, the U. S. government controlled most new encryption techniques. With the development of faster home computers and a worldwide web, they no longer hold control over encryption. New algorithms have been discovered that reportedly cannot be cracked, even by the FBI and the NSA. This is a major concern to the government because they want to maintain the ability to conduct wiretaps, and other forms of electronic surveillance into the digital age.

To stop the spread of data encryption software, the U. S. government has imposed very strict laws on its exportation. One very well known example of this is the PGP (Pretty Good Privacy) scandal. PGP was written by Phil Zimmerman, and is based on “public key” encryption. This system uses complex algorithms to produce two codes, one for encoding and one for decoding. To send an encoded message to someone, a copy of that person’s “public” key is needed. The sender uses this public key to encrypt the data, and the recipient uses their “private” key to decode the message.
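
The two-key flow described above can be sketched in a few lines. This is not PGP itself, only an illustrative RSA example using the assumed Python cryptography package: anyone can encrypt with the recipient's published key, but only the holder of the private half can decrypt.

    # Illustrative public-key sketch (RSA via the "cryptography" package),
    # not PGP itself: encrypt with the recipient's public key, decrypt
    # with the matching private key.
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes

    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()    # this half can be published

    message = b"meet me at noon"              # hypothetical plaintext
    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    ciphertext = public_key.encrypt(message, oaep)      # anyone can do this
    plaintext = private_key.decrypt(ciphertext, oaep)   # only the key owner can
    assert plaintext == message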

As Zimmerman was finishing his program, he heard about a proposed Senate bill to ban cryptography. This prompted him to release his program for free, hoping that it would become so popular that its use could not be stopped. One of the original users of PGP posted it to an Internet site, where anyone from any country could download it, causing a federal investigator to begin investigating Phil for violation of this new law. As with any new technology, this program has allegedly been used for illegal purposes, and the FBI and NSA are believed to be unable to crack this code.

When told about the illegal uses of his program, Zimmerman replies: “If I had invented an automobile, and was told that criminals used it to rob banks, I would feel bad, too. But most people agree the benefits to society that come from automobiles — taking the kids to school, grocery shopping and such — outweigh their drawbacks” (Levy 56). Currently, PGP can be downloaded from MIT. They have a very complicated system that changes the location of the software to be sure that they are protected. All that needs to be done is click “YES” to four questions dealing with exportation and use of the program, and it is there for the taking.

This seems to be a lot of trouble to protect a program from spreading that is already world wide. The government wants to protect their ability to legally wiretap, but what good does it do them to stop encryption in foreign countries? They cannot legally wiretap someone in another country, and they sure cannot ban encryption in the U. S. The government has not been totally blind to the need for encryption. For nearly two decades, a government sponsored algorithm, Data Encryption Standard (DES), has been used primarily by banks.

The government always maintained the ability to decipher this code with their powerful supercomputers. Now that new forms of encryption have been devised that the government can’t decipher, they are proposing a new standard to replace DES. This new standard is called Clipper, and is based on the “public key” algorithms. Instead of software, Clipper is a microchip that can be incorporated into just about anything (Television, Telephones, etc. ). This algorithm uses a much longer key that is 16 million times more powerful than DES.
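
A rough check of the “16 million times” figure is possible with simple arithmetic, assuming the comparison is between Clipper's 80-bit Skipjack key and DES's 56-bit key (the key sizes are not stated in the essay) and using a purely hypothetical brute-force rate to show how such estimates are formed:

    # Back-of-the-envelope check, assuming an 80-bit Clipper/Skipjack key
    # and a 56-bit DES key (sizes assumed, not taken from the essay).
    des_keys = 2 ** 56
    clipper_keys = 2 ** 80

    print(clipper_keys // des_keys)        # 16,777,216 -- roughly "16 million times"

    trials_per_second = 100_000            # hypothetical brute-force rate
    seconds_per_year = 60 * 60 * 24 * 365
    years = clipper_keys / (trials_per_second * seconds_per_year)
    print(f"{years:.2e} years to try every key")   # on the order of 10**11 years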

It is estimated that today’s fastest computers would take 400 billion years to break this code using every possible key (Lehrer 378). “The catch: At the time of manufacture, each Clipper chip will be loaded with its own unique key, and the Government gets to keep a copy, placed in escrow. Not to worry, though: the Government promises that they will use these keys to read your traffic only when duly authorized by law. Of course, to make Clipper completely effective, the next logical step would be to outlaw other forms of cryptography (Zimmerman).

If privacy is outlawed, only outlaws will have privacy. Intelligence agencies have access to good cryptographic technology. So do the big arms and drug traffickers. So do defense contractors, oil companies, and other corporate giants. But ordinary people and grassroots political organizations mostly have not had access to affordable “military grade” public-key cryptographic technology. Until now. PGP empowers people to take their privacy into their own hands. There’s a growing social need for it. That’s why I wrote it (Zimmerman).

The most important benefits of encryption have been conveniently overlooked by the government. If everyone used encryption, there would be absolutely no way that an innocent bystander could happen upon something they choose not to see. Only the intended receiver of the data could decrypt it (using public key cryptography, not even the sender can decrypt it) and view its contents. Each coded message also has an encrypted signature verifying the sender’s identity. The sender’s secret key can be used to encrypt an enclosed signature message, thereby “signing” it.

This creates a digital signature of a message, which the recipient (or anyone else) can check by using the sender’s public key to decrypt it. This proves that the sender was the true originator of the message, and that the message has not been subsequently altered by anyone else, because the sender alone possesses the secret key that made that signature. “Forgery of a signed message is infeasible, and the sender cannot later disavow his signature” (Zimmerman). Gone would be the hate mail that causes many problems, and gone would be the ability to forge a document with someone else’s address.
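
To make the sign-and-verify step concrete, here is a minimal sketch, again using an assumed RSA key pair from the Python cryptography package rather than PGP itself; verification fails if the message is altered or a different key was used to sign.

    # Minimal digital-signature sketch: sign with the private key,
    # verify with the public key (RSA-PSS via the "cryptography" package).
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes
    from cryptography.exceptions import InvalidSignature

    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()

    message = b"I agree to the terms."      # hypothetical message
    pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                      salt_length=padding.PSS.MAX_LENGTH)
    signature = private_key.sign(message, pss, hashes.SHA256())

    try:
        public_key.verify(signature, message, pss, hashes.SHA256())
        print("signature checks out")       # sender verified, message intact
    except InvalidSignature:
        print("message altered or not signed by this key")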

The government, if it did not have ulterior motives, should mandate encryption, not outlaw it. As the Internet continues to grow throughout the world, more governments may try to impose their views onto the rest of the world through regulations and censorship. It will be a sad day when the world must adjust its views to conform to that of the most prudish regulatory government. If too many regulations are enacted, then the Internet as a tool will become nearly useless, and the Internet as a mass communication device and a place for freedom of mind and thoughts, will become non-existent.

The users, servers, and parents of the world must regulate themselves, so as not to force government regulations that may stifle the best communication instrument in history. If encryption catches on and becomes as widespread as Zimmerman predicts it will, then there will no longer be a need for the government to meddle in the Internet, and the biggest problem will work itself out. The government should rethink its approach to the censorship and encryption issues, allowing the Internet to continue to grow and mature.

Internet Regulation: Policing Cyberspace

The Internet is a method of communication and a source of information that is becoming more popular among those who are interested in, and have the time to surf the information superhighway. The problem with this much information being accessible to this many people is that some of it is deemed inappropriate for minors. The government wants censorship, but a segment of the population does not. Legislative regulation of the Internet would be an appropriate function of the government. The Communications Decency Act is an amendment which prevents the information superhighway from becoming a computer “red light district.”

On June 14, 1995, by a vote of 84-16, the United States Senate passed the amendment. It is now being brought through the House of Representatives. 1 The Internet is owned and operated by the government, which gives them the obligation to restrict the materials available through it. Though it appears to have sprung up overnight, the inspiration of free-spirited hackers, it in fact was born in Defense Department Cold War projects of the 1950s. 2 The United States Government owns the Internet and has the responsibility to determine who uses it and how it is used.

The government must control what information is accessible from its agencies. This material is not lawfully available through the mail or over the telephone; there is no valid reason these perverts should be allowed to operate unimpeded on the Internet. Since our initiative, the industry has commendably advanced some blocking devices, but they are not a substitute for well-reasoned law. 4 Because the Internet has become one of the biggest sources of information in this world, legislative safeguards are imperative. The government gives citizens the privilege of using the Internet, but it has never given them the right to use it.

They seem to rationalize that the framers of the constitution planned & plotted at great length to make certain that above all else, the profiteering pornographer, the pervert and the pedophile must be free to practice their pursuits in the presence of children on a taxpayer created and subsidized computer network. 3 People like this are the ones in the wrong. Taxpayers’ dollars are being spent bringing obscene text and graphics into the homes of people all over the world. The government must take control to prevent pornographers from using the Internet however they see fit because they are breaking laws that have existed for years.

Cyberpunks, those most popularly associated with the Internet, are members of a rebellious society that are polluting these networks with information containing pornography, racism, and other forms of explicit information. When they start rooting around for a crime, new cybercops are entering a pretty unfriendly environment. Cyberspace, especially the Internet, is full of those who embrace a frontier culture that is hostile to authority and fearful that any intrusions of police or government will destroy their self-regulating world. The self-regulating environment desired by the cyberpunks is an opportunity to do whatever they want.

The Communications Decency Act is an attempt on the part of the government to control their “free attitude” displayed in homepages such as “Sex, Adult Pictures, X-Rated Porn”, “Hot Sleazy Pictures (Cum again + again)” and “sex, sex, sex. heck, it’s better even better than real sex”6. “What we are doing is simply making the same laws, held constitutional time and time again by the courts with regard to obscenity and indecency through the mail and telephones, applicable to the Internet.

To keep these kinds of pictures off home computers, the government must control information on the Internet, just as it controls obscenity through the mail or on the phone. Legislative regulations must be made to control information on the Internet because the displaying or distribution of obscene material is illegal. The courts have generally held that obscenity is illegal under all circumstances for all ages, while “indecency” is generally allowable to adults, but that laws protecting children from this “lesser” form are acceptable.

It’s called protecting those among us who are children from the vagrancies of adults. 8 The constitution of the United States has set regulations to determine what is categorized as obscenity and what is not. In Miller vs. California, 413 U. S. at 24-25, the court announced its “Miller Test” and held, at 29, that its three-part test constituted “concrete guidelines to isolate ‘hard core’ pornography from expression protected by the First Amendment.” 9 By laws previously set by the government, obscene pornography should not be accessible on the Internet.

The government must police the Internet because people are breaking laws. “Right now, cyberspace is like a neighborhood without a police department.” 10 Currently anyone can put anything he wants on the Internet with no penalties. “The Communications Decency Act gives law enforcement new tools to prosecute those who would use a computer to make the equivalent of obscene telephone calls, to prosecute ‘electronic stalkers’ who terrorize their victims, to clamp down on electronic distributors of obscene materials, and to enhance the chances of prosecution of those who would provide pornography to children via a computer.

The government must regulate the flow of information on the Internet because some of the commercial blocking devices used to filter this information are insufficient. “Cybercops especially worry that outlaws are now able to use powerful cryptography to send and receive uncrackable secret communications and are also aided by anonymous re-mailers.” 11 By using features like these it is impossible to use blocking devices to stop children from accessing this information. Devices set up to detect specified strings of characters will not filter what they cannot read.
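
That blind spot is easy to demonstrate: a filter that scans for fixed strings sees nothing objectionable once the text is no longer plain. The sketch below is a hypothetical Python example in which simple Base64 encoding stands in for real encryption; actual ciphertext is even less readable.

    # A naive keyword filter fails as soon as the payload is not plain text.
    # Base64 stands in here for real encryption.
    import base64

    BLOCKED_WORDS = {"obscene", "indecent"}     # hypothetical blocklist

    def filter_allows(text: str) -> bool:
        # True if no blocked word appears as a plain substring.
        lowered = text.lower()
        return not any(word in lowered for word in BLOCKED_WORDS)

    message = "this post contains obscene material"
    print(filter_allows(message))                          # False: caught
    encoded = base64.b64encode(message.encode()).decode()
    print(filter_allows(encoded))                          # True: slips through
    print(base64.b64decode(encoded).decode())              # recipient recovers it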

The government has to stop obscene materials from being transferred via the Internet because it violates laws dealing with interstate commerce. It is not a valid argument that “consenting adults” should be allowed to use the computer BBS and “Internet” systems to receive whatever they want. If the materials are obscene, the law can forbid the use of means and facilities of interstate commerce and common carriers to ship or disseminate the obscenity. 2 When supplies and information are passed over state or national boundaries, they are subject to the laws governing interstate and intrastate commerce.

When information is passed between two computers, it is subjected to the same standards. The government having the power to regulate the information being put on the Internet is a proper extension of its powers. With an information based system such as the Internet there is bound to be material that is not appropriate for minors to see. In passing an amendment like the Communications Decency Act, the government would be given the power to regulate that material.

Government Intervention On The Internet

During the past decade, our society has become based solely on the ability to move large amounts of information across great distances in a very short amount of time and at very low costs. The evolution of the computer era and our growing need for ultra-fast communications has caused a global network of interconnected computers to develop, commonly referred to as the Internet or the world wide web. The Internet has influenced practically everyone’s life in some way, whether it was done directly or indirectly. Our children are exposed to the Internet at school, and we are exposed to the Internet simply by just watching our television sets.

The Internet has become the primary key to the future of communication in our society today. Because of this, the government feels that it has the right to regulate and control the contents of information distributed through the World Wide Web, contrary to the opinions of most Internet users, myself included.

Freedom of Speech Over the Internet

At the present, this network is the epitome of the first amendment, freedom of speech. It is a place where people can speak their minds without being reprimanded for what they say, or how they choose to say it.

The key to the success of the Internet is its protection of free speech, not only in America, but in other countries as well, where free speech is not protected by a constitution. Because there are no laws regulating Internet material, people may find some of its content offending, ranging from pornography, to hate-group forums, to countless other forms of information. With over 30 million Internet users in the U. S. alone, some of the material is bound to be interpreted as offensive to some other Internet user. My advice to these people is to change the station if you don’t like what you see.

Laws and the Internet

The newest waves of laws making their way through Congress threaten to stifle the spontaneity of the Internet. Recently, Congress has considered passing laws that will make it a crime to send vulgar language or encryption software over the web. These crimes could result in prosecutions punishable by jail time. No matter how insignificant, any attempt at government intervention on the Internet will stifle the greatest communication innovation of this century. The government wants to maintain control over this new form of communication, and it is trying to use the protection of children as a smoke screen to impose these laws upon us.

Censorship of the Internet threatens to destroy its freelance atmosphere, while widespread encryption could help eliminate the need for government intervention.

How Do We Interpret the Internet

The current body of laws existing today in America does not apply well to the Internet. Is the Internet like a broadcasting medium, where the government monitors what is broadcast? Is it like a bookstore, where servers cannot be expected to review every title? Is it like a phone company that must ignore what it carries because of privacy?

The trouble is that the Internet can be all or none of these things depending on how it is used. The Internet cannot be viewed as one type of transfer medium under the current broadcast definitions. The Internet differs from the broadcasting media in that one cannot just happen upon a vulgar site without first keying in a complicated address, or following a link from another source. The Internet is much more like going into a book store and choosing to look at adult magazines (Miller 75). Because our use of the Internet varies from person to person, its meaning may be interpreted in a number of different ways.

Nudity on the Internet

Jim Exon, a democratic senator from Nebraska, wants to pass a decency bill regulating sexual content on the Internet. If the bill is passed, certain commercial servers that post nude pictures, like those run by Penthouse or Playgirl, would of course be shut down immediately or risk prosecution. The same goes for any amateur web site that features nudity, sex talk, or sexually explicit words. Posting any sexual words in a Usenet discussion group, which occurs routinely, could cause a person to be liable for a $50,000 fine and six months in jail.

Why does it suddenly become illegal to post something that has been legal for years in print? Exon’s bill apparently would also criminalize private mail, … I can call my brother on the phone and say anything–but if I say it on the Internet, it’s illegal (Levy 56).

Internet Access To Other Countries

Congress, in their pursuit of regulations, seems to have overlooked the fact that the majority of the adult material on the Internet is sent from overseas. Many of the new Internet technologies, including the World Wide Web, have been developed overseas.

There is no clear boundary between information existing in the U. S. and information existing in other countries. Data held in foreign computers is just as accessible as data in America. All it takes is the click of a mouse to access it. Even if our government tried to regulate the Internet, we have no control over what is posted in other countries or sent from other countries, and we have no practical way to stop it. The Internet was originally designed to uphold communications after a nuclear attack occurred by rerouting data to compensate for destroyed telephone lines and servers. Today’s Internet still works on a similar design.

The building blocks of the Internet were designed to overcome any kind of communication barriers put in its way. For example, if a major line between two servers is cut, the Internet users will find another way around this obstacle, whether the servers reside in different cities, states, or countries. This characteristic of the Internet makes it virtually impossible to separate an entire nation from indecent information in other countries (Wilson 33).

Internet Regulating Gone Bad

Recently, a major university attempted to implement limitations on the Internet access available to its students, with results reminiscent of a 1960s protest.

The university had become concerned that it might be held responsible for allowing students access to sexually explicit material, after a research associate found quite a large collection of pornographic pictures (917,410 images to be exact) that several students had downloaded. Frightened by a local court case that had recently declared pictures of similar content obscene, the school administration quickly removed access to all these pictures and to the newsgroups where most of this obscenity was suspected to have come from.

A total of 80 newsgroups were removed, causing a large disturbance among the student body, and shortly thereafter, the American Civil Liberties Union and the Electronic Frontier Foundation became involved, all of whom felt this was unconstitutional. After only half a week, the college had backed down, and restored the newsgroups. This is a small example of what may happen if the government tries to impose censorship (Elmer-Dewitt 102).

Children and the Internet

Currently, there is software being released that promises to block children’s access to known X-rated Internet newsgroups and sites.

However, most adults rely on their computer literate children to install and set these programs up, which inevitably defeats the purpose behind childproofing software. Even if this software is installed by an adult, who’s to say that the child can’t go to a friend’s house and surf the web without any restrictions or supervision? Children will find ways to get around these restrictions. Regardless of what types of software or safeguards are used to protect these children, there will always be ways around them. This makes educating children to deal with reality a necessity.

Altered views of an electronic world translate easily into altered views of the real world. When it comes to our children, censorship is a far less important issue than good parenting. We must teach our kids that the Internet is an extension and a reflection of the real world. We have to show them how to enjoy the good things and avoid the bad things. This isn’t the government’s responsibility. It’s ours as parents (Miller 76).

Self Regulation of the Internet

Some restrictions on electronic speech imposed by major online companies are not so bad.

Most of these communication companies have restrictions on what their users can say in public forum areas (Messmer). They must, however, respect their customers’ privacy. Private e-mail content is off limits to them, but they may act swiftly upon anyone who spouts obscenities in a public forum. Self-regulation by users and servers is the key to avoiding government imposed intervention. Many on-line sites such as Playgirl and Penthouse have started to regulate themselves. Both of these sites post clear warnings that adult content lies ahead and list the countries where this is illegal.

The film and video game industries subject themselves to ratings, and similarly, if Internet users want to avoid government imposed regulations, maybe it is time they began to regulate themselves.

Encryption

Government attempts to regulate the Internet are not just limited to obscenity and vulgar language. These attempts also fall into other areas, such as data encryption. By nature, the Internet is an insecure method of transferring data. A single e-mail packet may pass through hundreds of computers from its source to its final destination.

At each computer, there is the chance that the data will be archived and someone may intercept that data. Encryption is a means of encoding data so that only someone with the proper key can decode it. Why do you need encryption? It’s personal. It’s private. And it’s no one’s business but yours (Laberis). You may be planning a political campaign, discussing our taxes, or having an illicit affair. Or you may be doing something that you feel shouldn’t be illegal, but it is. Whatever it is, you don’t want your private electronic mail or confidential documents read by anyone else.

There’s nothing wrong with asserting your privacy. Perhaps you are not really concerned about encrypting your e-mail because you believe that you have nothing to hide. I mean, you haven’t broken the law in any way, right? Well then, why not just write letters on postcards instead of sealing them away in envelopes? Why not submit to drug testing on demand? Why require a warrant for police searches of your house? Do law-abiding citizens have any need to encrypt their e-mail? What if everyone believed that law-abiding citizens should use postcards for their mail, for the simple reason that they have nothing to hide?

Just because you haven’t done anything wrong doesn’t mean that you want the whole world to have access to your letters or e-mail. Analogously, it would be nice if everyone routinely used encryption for all their e-mail, innocent or not, so that no one drew suspicion by asserting their e-mail privacy with encryption. Think of it as a form of solidarity (Zimmerman). Until the development of the Internet, the U. S. government controlled most new encryption techniques. With the development of faster home computers and a worldwide web, the government no longer holds control over encryption.

New algorithms have been discovered that are reportedly unable to be cracked, even by the FBI and the NSA. This is a major concern to the government because they want to maintain the ability to conduct wiretaps and other forms of electronic surveillance into the digital age.

Pretty Good Privacy

To stop the spread of data encryption software, the U. S. government has imposed very strict laws on its exportation. One very well known example of this is the PGP (Pretty Good Privacy) scandal. PGP was written by Phil Zimmerman, and is based on public key encryption.

This system uses complex algorithms to produce two codes, one for encoding and one for decoding. To send an encoded message to someone, a copy of that person’s public key is needed. The sender uses this public key to encrypt the data, and the recipient uses their private key to decode the message. As Zimmerman was finishing his program, he heard about a proposed Senate bill to ban cryptography. This prompted him to release his program for free, hoping that it would become so popular that its use could not be stopped.

One of the original users of PGP posted it to an Internet site, where anyone from any country could download it, causing a federal investigator to begin investigating Phil for violation of this new law. As with any new technology, this program has allegedly been used for illegal purposes, and the FBI and NSA are believed to be unable to crack this code. When told about the illegal uses of his program, Zimmerman replied, If I had invented an automobile, and was told that criminals used it to rob banks, I would feel bad, too.

But most people agree the benefits to society that come from automobiles — taking the kids to school, grocery shopping and such — outweigh their drawbacks.

Data Encryption Standard

The government has not been totally blind to the need for encryption. For nearly two decades, a government sponsored algorithm, Data Encryption Standard (DES), has been used primarily by banks. The government has always maintained the ability to decipher this code with their powerful supercomputers. Now that new forms of encryption have been devised that the government cannot decipher, they are proposing a new standard to replace DES.

Clipper Chips

This new standard is called Clipper, and is based on the public key algorithms. Instead of software, Clipper is a microchip that can be incorporated into just about anything (Television, Telephones, etc.). This algorithm uses a much longer key that is 16 million times more powerful than DES. It is estimated that today’s fastest computers would take 400 billion years to break this code using every possible key (Lehrer 378). The catch: At the time of manufacture, each Clipper chip will be loaded with its own unique key, and the Government gets to keep a copy, placed in escrow.

Not to worry though, the Government promises that they will use these keys to read your traffic only when duly authorized by law. Of course, to make Clipper completely effective, the next logical step would be to outlaw other forms of cryptography. If privacy is outlawed, only outlaws will have privacy. Intelligence agencies have access to good cryptographic technology. So do the big arms and drug traffickers. So do defense contractors, oil companies, and other corporate giants. But ordinary people and grassroots political organizations mostly have not had access to affordable military grade public-key cryptographic technology.

Until now. PGP empowers people to take their privacy into their own hands. There’s a growing social need for it. That’s why I wrote it (Zimmerman).

Signatures

The most important benefits of encryption have been conveniently overlooked by the government. If everyone used encryption, there would be absolutely no way that an innocent bystander could happen upon material they find offensive. Only the intended receiver of the data could decrypt it (using public key cryptography, not even the sender can decrypt it) and view its contents.

Each coded message also has an encrypted signature verifying the sender’s identity. The sender’s secret key can be used to encrypt an enclosed signature message, thereby signing it. This creates a digital signature of a message, which the recipient (or anyone else) can check by using the sender’s public key to decrypt it. This proves that the sender was the true originator of the message, and that the message has not been subsequently altered by anyone else, because the sender alone possesses the secret key that made that signature.

Forgery of a signed message is infeasible, and the sender cannot later disavow his signature (Zimmerman). Gone would be the hate mail that causes many problems, and gone would be the ability to forge a document with someone else’s address. The government, if it did not have ulterior motives, should mandate encryption, not outlaw it.

Conclusion

As the Internet continues to grow throughout the world, more governments may try to impose their views onto the rest of the world through regulations and censorship.

It will be a sad day when the world must adjust its views to conform to those of the most prudish regulatory governments in existence. If too many regulations are enacted, then the Internet as a tool will become nearly useless, and the Internet as a mass communication device and a place for freedom of mind and thoughts will become nonexistent. There exists a very fine line between protecting our children from pornographic material and protecting our rights to freedom of speech.

The users, servers, and parents of the world must regulate themselves, so as not to force government regulations that may stifle the best communication instrument in history. If encryption catches on and becomes as widespread as Zimmerman predicts it will, then there will no longer be a need for the government to intrude in the matters of the Internet, and the biggest problems will work themselves out. The government should rethink its approach to the censorship and encryption issues, allowing the Internet to continue to grow and mature on its own.

The Internet Report

Imagine a place where people interact in business situations, shop, play video games, do research, or study and get tutoring. Now imagine that there are no office buildings, no shopping centers, no arcades, no libraries, and no schools. These places all exist in a location called the Internet – “an anarchic system (to use an oxymoron) of public and private computer networks that span the globe” (Clark 3). This technological advance not only benefits people of the present, but also brings forth future innovations. People use the Internet for many purposes, yet there are three popular reasons.

First, there is the sending and receiving of messages through electronic mail. Second, there are discussion groups with a wide range of topics in which people can join. Finally, people are free to browse the vast collection of resources (or databases) of the World Wide Web. Electronic mail (e-mail) brings a new perspective to communication. Although it did not replace the traditional means of communication such as letters and telephone calls, it has created a new method of transmitting information in a more efficient way.

E-mail shortens the interval between sending and receiving a message. An e-mail message sent halfway around the world can arrive at its destination within a minute or two. In comparison, a letter can take from a few days to a couple of weeks, according to the distance it travels. Furthermore, e-mail is inexpensive. The cost of connection to the Internet is relatively cheaper than that of cable television. Evidently e-mail is both time-saving and cost-effective.

Discussion groups are a great way to interact with others in the world and to expand one’s horizons. The response is instantaneous, just like the telephone, except it is non-verbal (typed). Discussion groups are on-line services that make use of non-verbal communication in the interest of the user. Services can range from tutor sessions to chat-lines where people just want to mingle. Communication through the Internet is a way of meeting new people. There is no racial judgement in meeting on the Internet because physical appearance is not perceived.

However, attitude and personal characteristics are evident from the style in which a person talks (or types). This kind of communication helps narrow the gap between people and cultural differences. Communicating in discussion groups sometimes leads to one-to-one conversations that soon enough become a link to friendship. Connections are being made when people meet each other; therefore, information on Web sites of interest can be passed on. The World Wide Web (WWW) holds information that answers the user’s questions.

The main purpose of the WWW is to give a variety of information ranging from literature to world geography. The WWW contains Web sites created by everyone from government agencies and institutions to businesses and individuals. The WWW carries text, graphics, and sound to catch the interest of people browsing through the different Web sites. New Web sites are being added daily, while the existing sites are being revised to compensate for more updated information and interests. This growth of information will soon become a world library of topics on anything that one can imagine.

A person using the Internet for one day encounters more information than a person reading in the library for a whole year. It is the convenience of the Internet that allows a person to go through an enormous amount of information in a short period of time. This information community can pull the minds of users closer together, thus making the world smaller. The Internet is full of people who are requesting and giving out information to the ones who are interested, since “information wants to be free.” – Stewart Brand (Van der Leun 25).

Hypothetically, if everyone is connected to at least one other person on the Internet, eventually everyone will meet each other. In other words, the world will gradually evolve into a “global village,” which can be defined as “the world, especially of the late 1900’s, thought of as a village, a condition arising from shrinking distance by instantaneous world-wide electronic communication” (Nault 907). Thus, the Internet is a wonderful tool and medium in which people can interact with the information society. After all, information is like the building blocks of technological advancement.

An Internet Perspective

Computer scientist Robert Taylor developed a new system of communication that would change the world. Taylor connected two separate computers that were capable of communicating small bits of information between one another. This was only intended to send simple text messages and numbers using an analog signal, but would prove to be a bigger help than originally imagined. Consequently, this basic networking of computers would soon develop into something much bigger and more vast than what had originally been envisioned.

The Internet has become a fast and efficient way of connecting people of all cultures and locales. This in turn has given rise to an entirely unique form of business practice and consumer buying power. Social interactions between all types of peoples from around the world have also become more widespread. The Internet has become a hotbed of business activity, a virtual shopping mall, a social paradise, and a culture all wrapped up in a neat little package. Despite these advantages, this synthetic global connection with its massive networking of computers has drawbacks, such as an avoidance of direct social contact and alienation.

The power to access both the business and social world from the average user’s home hinders the desire to connect with the outside physical world. The Internet serves many purposes, but has specifically altered the standard economic practices of businesses previously dependent on direct social contact in attracting and maintaining a healthy clientele. For instance, the use of email to communicate messages and send file attachments is a system that has eliminated much of the legwork involved in exchanging information pertinent to the needs of that particular customer.

In turn, this decreases the need for added employees, eliminating the costs of having to hire personnel to do such work, and has made customer interactions both faster and more efficient. Furthermore, the net is also used as an on-line trading tool between companies, customer web sites, company purchases, and various other integrated uses. These kinds of applications serve the company and customer in a fashion that allows more compatibility, speed, and convenience for both sides.

The Internet’s speed and efficiency can be a great asset to a business as well as its consumers, but it is subject to such intrusions as hacking for the purpose of industrial espionage, which can easily result in stolen marketing and product ideas. This can greatly affect a business if such attempts are successful. In addition, other problems such as system crashes can incapacitate a business for an inordinate amount of time, during which shipping and purchasing would cease. From the perspective of the consumer, this new form of business has spawned a new form of convenience and accessibility.

The ability to access goods and services through the computer increases consumer base. Consumers who were previously unable to purchase goods on their own now have the power to purchase such products from their personal computer. With a search engine and the knowledge of what a person is looking for, an unlimited array of businesses will include a virtual catalog of what their company has to offer to the customer. As a result, the customer has the power to choose whatever he or she desires without the inconvenience of leaving their own home.

People can browse for hours without the time and effort involved in going from store to store to find just the right product. The elimination of price markups to pay employees and showcase merchandise is also appealing to the consumer. Despite the ease and accessibility such a system of acquiring goods and services would have, there are some drawbacks. First and foremost, the customer is divorced from the physical aspect of selecting their product. They must rely on photographs and descriptions to select their product, which may turn out not to suit their specific needs and wants.

Another issue concerning the purchase of products on-line is the security risk surrounding credit card transactions, which are the main source of buying power for the consumer. This creates hesitation on the part of the consumer for fear of hackers who can intercept credit card numbers while online purchases are being made. The Internet is not always used for the specific purpose of business relations and customer transactions. The net allows people who would not normally be able to interact with other social groups and cultures to communicate with a wide range of different persons.

Chat rooms are the most widespread of all places to find common interest groups that are willing to share information about themselves and their lives. This type of global social interaction is reminiscent of having a pen pal, but on a much larger and more efficient scale. Secondly, this type of interaction eliminates the social awkwardness of direct physical contact when meeting new people. Net users can reach out to others while maintaining a comfortable level of personal space.

While this can be a new and exciting tool to meeting other users from around the world, this can also discourage people from using their social skills outside their own home. This comfort of anonymity can be used to falsify information to others. This means that the information received about another person could very well be fabricated. These encounters, whether they are sincere or falsified, can lead to unhealthy infatuations. Such incidences can result in dangerous encounters with individuals who insist on pursuing an uninterested party.

These individuals will go to such lengths as obtaining personal information for the purpose of stalking. In cases such as these, the Internet can provide a false sense of security. For those who insist on using the Internet as their primary source of interaction with others, this method of human relations can discourage conventional socialization. Despite all the issues concerning the Internet, it is becoming an ever more common form of communication and will continue to flourish well into the 21st century.

Businesses are finding that with the unprecedented rate of growth the net is currently undergoing, a substantial increase in potential consumers can only serve to increase the profitability of their organizations. The sheer volume of current, as well as future, consumers far outweighs any technical problems that may occur. Such conveniences are also appealing to customers and encourage them to make their purchases on-line. This convenience overrides any doubts the consumer may have regarding their purchases.

The growing volume of on-line users creates an environment for business growth as well as social interaction; users are able to connect with a wide variety of people and come to view the Internet as a social garden. Issues of safety and isolation within the home do not seem as relevant as the benefits of cross-global interaction. People who struggle in customary social situations find comfort in this type of environment. The Internet is a useful and versatile tool in all of these situations. There will always be problems with a system as large as this, but the advantages are ever increasing.

Internet – Method Of Communication

The Internet is a method of communication and a source of information that is becoming more popular among those who are interested in, and have the time to surf, the information superhighway. The problem with so much information being accessible to so many people is that some of it is deemed inappropriate for minors. The government wants censorship, but a segment of the population does not. In this examination of government intervention of the Internet, I will attempt to present both sides of the issue.

During the past decade, our society has become based on the ability to move large amounts of information across large distances quickly. Computerization has influenced everyone’s life. The natural evolution of computers and this need for ultra-fast communications has caused a global network of interconnected computers to develop. This global net allows a person to send E-mail across the world in mere fractions of a second, and enables even the common person to access information worldwide.

With the advances in software that allow users with a sound card to use the Internet as a carrier for long-distance voice calls and video conferencing, this network is the key to the future of the knowledge society. At present this net is the epitome of the First Amendment: freedom of speech. It is a place where people can speak their mind without being reprimanded for what they say, or how they choose to say it. Recently, Congress has been considering passing laws that would make it a crime punishable by jail to send “vulgar” language over the net.

The government wants to maintain control over this new form of communication, and it is trying to use the protection of children as a smoke screen to pass laws that will allow it to regulate and censor the Internet, while banning techniques that could eliminate the need for regulation. Censorship of the Internet threatens to destroy its freewheeling atmosphere, while methods such as encryption could help prevent the need for government intervention. The current body of laws existing today in America does not apply well to the Internet. Is the Internet like a bookstore, where servers cannot be expected to review every title?

Well, according to an article written by Michael Miller, “Cybersex Shock,” in the October 10, 1995 issue of PC Magazine (p. 75), “The Internet is much more like going into a book store and choosing to look at adult magazines.” The Internet differs from other forms of media in that one cannot just happen upon a vulgar site without first either entering a complicated address, following a link from another source, or clicking on the agreement statement at the beginning of the site acknowledging that one is of the legal age of 18.

This lawless atmosphere bothered many people. One such person is Nebraska Senator James Exon (D), one of the founding fathers of the Communications Decency Act of 1996, Section 502, 47 U.S.C. Section 223[a], which regulates the transmission of “any obscene or indecent material via the Internet to anyone under 18 years of age.” According to an article written by Steven Levy in an April 1995 issue of Newsweek magazine (p. 53), Exon’s bill would also “criminalize private mail.” Levy also remarked emotionally, “I can call my brother on the phone and say anything, but if I say it on the Internet, it’s illegal.”

One thing that Congress seems to have overlooked in its pursuit of regulation is that there are no clear boundaries keeping information from being accessed over the Internet from other countries. All it takes is a click of a mouse: even if our government tried to regulate information accessed from other countries, we would have no control over what is posted in those countries, and we would have no practical way to stop it. Today’s Internet works much like the human brain, in that if one barrier or option is taken away, it tries to find an alternate route or option.

The Internet works on a similar design: if a major line between two servers, say in two countries, is cut, then Internet users will find another way around this obstacle. This process of obstacle avoidance makes it virtually impossible to separate an entire nation from indecent information in other countries. Even if it were physically possible to isolate America’s computers from the rest of the world, in my opinion it would be devastating to our economy.

In an article published in Time magazine, written by Philip Elmer-DeWitt and titled “Censoring Cyberspace: Carnegie Mellon’s Attempt to Ban Sex from Its Campus Computer Network Sends a Chill Along the Info Highway” (Nov. 1994, p. 102): “Martin Rimm put together quite a large picture collection (917,410 images) and he also tracked how often each image had been downloaded (a total of 6.4 million). A local court had recently declared pictures of similar content obscene, and the school felt they might be held responsible for the content on its network.

The school administration quickly removed access to all these pictures, and to the newsgroups where this obscenity was suspected to come from. A total of 80 newsgroups were removed, causing a large disturbance among the student body, the American Civil Liberties Union, and the Electronic Frontier Foundation, all of whom felt this was unconstitutional. After only half a week, the college had backed down and restored the newsgroups.” This is only a tiny example of what may happen if the government tries to impose censorship.

Regardless of what types of software or safeguards are used to protect the children of the information age, there will always be ways around them. As stated in an article printed in PC Magazine on Oct. 10, 1995, written by Michael Miller (p. 76) and titled “Cybersex Shock”: “When it comes to our children, censorship is a far less important issue than good parenting. We must teach our kids that the Internet is an extension and reflection of the real world, and we have to show them how to enjoy the good things and avoid the bad things.

This isn’t the government’s responsibility. It’s ours.” Until the development of the Internet, the U.S. government controlled most new communication techniques. With the development of faster personal computers and the addition of the World Wide Web, it no longer had control over the vast range of this style of communication. To stop the spread of data encryption, the U.S. government has imposed strict laws on its exportation.

This is explained in an article by Phil Zimmermann entitled “Pretty Good Privacy,” found online at ftp://net-dist.mit.edu: “To send an encoded message to someone, a copy of that person’s ‘public’ key is needed. The sender uses this public key to encrypt the data, and the recipient uses their ‘private’ key to decode the message.” As with any new technology, this program has allegedly been used for illegal purposes, and the FBI and NSA are believed to be unable to crack this code. Zimmermann’s reply to this rumor was quoted in Steven Levy’s article published in the Apr. 1995 issue of Newsweek, titled “The Encryption Wars: Privacy Good or Bad?” (p. 6):
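The exchange Zimmermann describes can be sketched in a few lines of Python. This is only an illustration using RSA-OAEP from the third-party cryptography package, not PGP itself, and the message is an arbitrary example.

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# The recipient generates a key pair and publishes the public half.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# The sender encrypts with the recipient's public key...
ciphertext = public_key.encrypt(b"my credit card number", oaep)

# ...and only the matching private key can recover the plaintext.
assert private_key.decrypt(ciphertext, oaep) == b"my credit card number"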

“If I had invented an automobile, and was told that criminals used it to rob banks, I would feel bad, too. But most people agree that the benefits to society that come from automobiles (taking the kids to school, grocery shopping and such) outweigh their drawbacks.” As the Internet continues to grow throughout the world, more governments may try to impose their views onto the rest of the world through regulations and censorship. It will be a sad day when the world must adjust its views to conform to those of the most prudish regulatory government.

If too many regulations are enacted, then the Internet as a tool will become nearly useless, and the Internet as a mass communication device and a place for freedom of mind and thought will become nonexistent. The government should rethink its approach to the censorship and encryption issues, allowing the Internet to continue to grow and mature. The users, parents, and servers of the world need to regulate themselves so as not to push the government into forcing these types of regulations on what might be the best communication instrument in history.

The History of the Internet and the WWW

1. The History of the World Wide Web-

The Internet started out as an information resource for the government so that its agencies could talk to each other. They called it “The Indestructible Network” because so many computers were linked together that if one server went down, no one would notice. This report will mainly focus on the history of the World Wide Web (WWW) because it is the fastest growing resource on the Internet.

The Internet consists of different protocols such as the WWW, Gopher (like the WWW but text based), FTP (File Transfer Protocol), and Telnet (which allows you to connect to different BBSs). There are many smaller ones as well, too numerous to list. BBS is an abbreviation for Bulletin Board Service: a computer that you can either dial into or access from the Internet. BBSs are normally text based.

2. The Creator of the WWW-

A graduate of Oxford University, England, Tim Berners-Lee is now with the Laboratory for Computer Science (LCS) at the Massachusetts Institute of Technology (MIT). He directs the W3 Consortium, an open forum of companies and organizations with the mission to realize the full potential of the Web.

With a background of system design in real-time communications and text-processing software development, in 1989 he invented the World Wide Web, an Internet-based hypermedia initiative for global information sharing, while working at CERN, the European Particle Physics Laboratory. Earlier, he had spent two years with Plessey Telecommunications Ltd, a major UK telecom equipment manufacturer, working on distributed transaction systems, message relays, and bar code technology.

In 1978 Tim left Plessey to join D. G. Nash Ltd, where he wrote, among other things, typesetting software for intelligent printers, a multitasking operating system, and a generic macro expander.

A year and a half spent as an independent consultant included a six-month stint as a consultant software engineer at CERN, the European Particle Physics Laboratory in Geneva, Switzerland. Whilst there, he wrote, for his own private use, his first program for storing information using random associations. Named “Enquire” and never published, this program formed the conceptual basis for the future development of the World Wide Web. I could go on and on forever telling you about this person, but my report is not about him.

From 1981 until 1984, Tim was a founding Director of Image Computer Systems Ltd, with technical design responsibility. In 1984, he took up a fellowship at CERN, to work on distributed real-time systems for scientific data acquisition and system control.

In 1989, he proposed a global hypertext project, to be known as the World Wide Web. Based on the earlier “Enquire” work, it was designed to allow people to work together by combining their knowledge in a web of hypertext documents. He wrote the first World Wide Web server and the first client, a WYSIWYG hypertext browser/editor which ran in the NeXTStep environment. This work was started in October 1990, and the program “WorldWideWeb” was first made available within CERN in December, and on the Internet at large in the summer of 1991.

From 1991 through 1993, Tim continued working on the design of the Web, coordinating feedback from users across the Internet. His initial specifications of URIs, HTTP and HTML were refined and discussed in larger circles as the Web technology spread.

In 1994, Tim joined the Laboratory for Computer Science (LCS) at the Massachusetts Institute of Technology (MIT) as Director of the W3 Consortium, which coordinates W3 development worldwide, with teams at MIT and at INRIA in France. The consortium takes as its goal to realize the full potential of the Web, ensuring its stability through rapid evolution and revolutionary transformations of its usage.

In 1995, Tim Berners-Lee received the Kilby Foundation’s “Young Innovator of the Year” Award for his invention of the World Wide Web, and was corecipient of the ACM Software Systems Award. He has been named as the recipient of the 1996 ACM Kobayashi award, and corecipient of the 1996 Computers and Communication (C&C) award.

He has honorary degrees from the Parsons School of Design, New York (D.F.A., 1996) and Southampton University (D.Sc., 1996), and is a Distinguished Fellow of the British Computer Society. This has just been about Tim, but here is the real history of the WWW.

3. History of the WWW dates –

“Information Management: A Proposal” was written by Tim BL in 1989 and circulated for comments at CERN (TBL). The paper “HyperText and CERN” was produced as background (text or WriteNow format). The project proposal was reformulated with encouragement from CN and ECP divisional management; Robert Cailliau (ECP) is co-author. The name World-Wide Web was chosen because it tells you what the resource does. HyperText Markup Language (HTML) is the language that users who want home pages on the Internet use to write them. (See a sample of this on the last page.)

In November of 1990 the initial WorldWideWeb program was developed on the NeXT (TBL). This was a WYSIWYG browser/editor with direct inline creation of links, which made the WWW easier to use and navigate without having to type long numbers. Technical student Nicola Pellow (CN) joined and started work on the line-mode browser. Bernd Pollermann (CN) helped get an interface to the CERNVM “FIND” index running. TBL gave a colloquium on hypertext in general. When this happened the WWW really started sprouting, because these new browsers made it easier to navigate.

4. History of the World Wide Web dates 1991-1993

In 1991 a line-mode browser (www) was released to a limited audience on “priam” VAX, RS/6000, and Sun-4 machines. On the 17th of May a general release of WWW software was made available on CERN servers. This allowed people to start their own Internet providers, such as America Online and South Carolina SuperNet. On the 12th of June a seminar was held for the WWW that allowed people to come in and see this new software in progress. I would like to skip ahead to the present day, because more interesting things are happening now.

5. Present Day World Wide Web and Internet Resources-

The World Wide Web today is the most popular resource on the Internet. Facts show that the Internet averages 45 million users a day, with one more joining every eight seconds. The Internet transmits at a maximum speed of 100 Mb per second. The present-day Internet is fast and reliable, and it is also very popular. The Internet started out as just a few computers linked together, and now look what we have. The Internet will live on, and so will the WWW, though I believe the WWW will eventually be replaced by something new in the next 10 years.

Internet Privacy and Internet Censorship

During the last decade, our society has come to be based on the ability to move large amounts of information across great distances quickly. Computerization has influenced everyone’s life in numerous ways. The natural evolution of computer technology and this need for ultra-fast communications has caused a global network of interconnected computers to develop.

This global network allows a person to send E-mail across the world in mere fractions of a second, and allows a common person to access a wealth of information worldwide. This newfound global network, originally called ARPANET, was developed and funded solely by and for the U.S. government. It was to be used in the event of a nuclear attack in order to keep communications lines open across the country by rerouting information through different servers.

Does this mean that the government owns the Internet, or is it no longer a tool limited by the powers that govern? Generalities such as these have sparked great debates within our nation’s government. This paper will attempt to focus on two high-profile ethical aspects concerning the Internet and its usage. These subjects are Internet privacy and Internet censorship. At the moment, the Internet is the epitome of our First Amendment: free speech.

It is a place where a person can speak their mind without being reprimanded for what they say or how they choose to say it. But the Internet also contains a huge collection of obscene graphics, anarchists’ cookbooks, and countless other things that offend many people. There are over 30 million Internet surfers in the U.S. alone, and much is to be said about what offends whom and how. As with many new technologies, today’s laws don’t apply well when it comes to the Internet. Is the Internet like a bookstore, where servers cannot be expected to review every title?

Is it like a phone company, which must ignore what it carries because of privacy, or is it like a broadcast medium, where the government monitors what is broadcast? The problem we are facing today is that the Internet can be all or none of the above, depending on how it is used. Internet censorship: what does it mean? Is it possible to censor an amount of information that is by itself unimaginable? The Internet was originally designed to “find a way around” in case of broken communications lines, and it seems that explicit material keeps finding its “way around” too.

I am opposed to such content on the Internet and therefore am a firm believer in Internet censorship. However, the question at hand is just how much censorship the government should impose. Because the Internet has become the largest source of information in the world, legislative safeguards are indeed imminent. Explicit material is not readily available over the mail or telephone, and distribution of obscene material is illegal. Therefore, there is no reason this material should go unimpeded across the Internet. Sure, there are some blocking devices, but they are no substitute for well-reasoned law.

To counter this, the United States has set regulations to determine what is categorized as obscenity and what is not. By laws set previously by the government, obscene material should not be accessible through the Internet. The problem society is now facing is that cyberspace is like a neighborhood without a police department. “Outlaws” are now able to use powerful cryptography to send and receive uncrackable communications across the Internet. Devices set up to filter certain communications cannot filter that which cannot be read, which leads to my other topic of interest: data encryption.

By nature, the Internet is an insecure method of transferring data. A single E-mail packet may pass through hundreds of computers between its source and destination. At each computer, there is a chance that the data will be archived and someone may intercept the data, private or not. Credit card numbers are a frequent target of hackers. Encryption is a means of encoding data so that only someone with the proper “key” can decode it. So far, recent attempts by the government to control data encryption have failed. They are concerned that encryption will block their monitoring capabilities, but there is nothing wrong with asserting our privacy.
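As a concrete illustration of “only someone with the proper key can decode it,” here is a minimal sketch using the Fernet recipe from the third-party Python cryptography package; the sample message is made up, and a real mail system would manage its keys far more carefully.

from cryptography.fernet import Fernet

key = Fernet.generate_key()   # the shared secret; never sent along with the message
f = Fernet(key)

token = f.encrypt(b"Card number: 4111 1111 1111 1111")
print(token)                  # what an eavesdropper on the wire would see
print(f.decrypt(token))       # only a holder of `key` recovers the original text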

Privacy is an inalienable right given to us by our Constitution. For example, your E-mail may be legitimate enough that encryption is unnecessary. If we do indeed have nothing to hide, then why don’t we send our paper mail on postcards? Are we trying to hide something? By comparison, is it wrong to encrypt E-mail? Before the advent of the Internet, the U.S. government controlled most new encryption techniques. But with the development of the WWW and faster home computers, it no longer has the control it once had. New algorithms have been discovered that are reportedly uncrackable even by the FBI and NSA.

The government is concerned that it will be unable to maintain the ability to conduct electronic surveillance into the digital age. To stop the spread of data encryption software, it has imposed very strict laws on its exportation. One programmer, Phil Zimmermann, wrote an encryption program he called PGP (Pretty Good Privacy). When he heard of the government’s intent to ban the distribution of encryption software, he immediately released his program to the public for free. PGP is among the most powerful public encryption tools available. The government has not been totally blind to the need for encryption.

The banks have sponsored an algorithm called DES, which has been used by banks for decades. To some, its usage by banks may seem more ethical, but what makes it unethical for everyone else to use encryption too? The government is now developing a new encryption method that relies on a microchip that may be placed inside just about any type of electronic equipment. It is called the Clipper chip; it is said to be 16 million times more powerful than DES, and today’s fastest computers would take approximately 400 billion years to decipher it. At the time of manufacture, the chips are loaded with their own unique key, and the government gets a copy.

But don’t worry: the government promises that it will use these keys only to read traffic when duly authorized by law. But before this new chip can be used effectively, the government must get rid of all other forms of cryptography. The relevance of my two topics of choice seems to have been conveniently overlooked by our government. Internet privacy through data encryption and Internet censorship are linked in one important way. If everyone used encryption, there would be no way that an innocent bystander could stumble upon something they weren’t meant to see.

Only the intended receiver of an encrypted message can decode it and view its contents; once encrypted, not even the sender can decode the message without the recipient’s private key. Each coded message can also carry an encrypted signature verifying the sender’s identity. Gone would be the hate mail that causes many problems, as well as the ability to forge a document with someone else’s address. If the government didn’t have ulterior motives, it would mandate encryption, not outlaw it. As the Internet grows throughout the world, more governments may try to impose their views onto the rest of the world through regulations and censorship.

If too many regulations are enacted, then the Internet as a tool will become nearly useless, and our mass communication device, a place of freedom for our minds’ thoughts, will fade away. We must regulate ourselves so as not to force the government to regulate us. If encryption is allowed to catch on, there will no longer be a need for the government to intervene on the Internet, and the biggest problem may work itself out. As a whole, we all need to rethink our approach to censorship and encryption and allow the Internet to continue to grow and mature.

Internet Firewalls Essay

The Internet is a complex web of interconnected servers and workstations that span the globe, linking millions of people and companies. But there is a dark side: The convenient availability of valuable and sensitive electronic information invites severe misuse in the form of stolen, corrupted, or destroyed data found therein. Compounding this problem is the unfortunate fact that there are ample opportunities to intercept and misuse information transmitted on the Internet.

For example, information sent across telephone lines can not only be seen, but also can be easily manipulated and retransmitted, or software can be developed to do something as fundamental as deny Internet service. Preventing unauthorized access is a cost that should be factored into every Internet equation. What follows is an explanation of Internet security and the concept of Firewalls.

What Makes the Internet Vulnerable? Let’s look at some of the most common security threats: Impersonating a User or System – To authenticate Internet users, a system of user-IDs and passwords is used. Anyone intent on gaining access to the Internet can repeatedly make guesses until the right combination is found, a simple but time-consuming process made all the easier by programs which systematically try all character combinations until the correct one is eventually generated.
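To make that exhaustive-guessing idea concrete, here is a minimal sketch in Python; the check_password function is a hypothetical stand-in for whatever login check a real system performs, and the tiny alphabet and short length are there only to keep the example small.

import itertools
import string

def check_password(guess):
    # Hypothetical placeholder for a real system's login check.
    return guess == "cab"

def brute_force(max_length=4):
    # Systematically try every combination of lowercase letters,
    # shortest guesses first, until one is accepted.
    alphabet = string.ascii_lowercase
    for length in range(1, max_length + 1):
        for combo in itertools.product(alphabet, repeat=length):
            guess = "".join(combo)
            if check_password(guess):
                return guess
    return None

print(brute_force())  # prints "cab"

Even this toy search grows exponentially with password length, which is why longer passwords and account lockout policies blunt the attack.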

User-IDs and passwords can also be trapped by finding security holes in programs; a person looking to abuse the Internet finds these holes and uses the information leaked through them for his or her own personal agenda. Even someone who has been entrusted with high-level network access, such as a system administrator, can misuse his or her authorization to gain access to sensitive areas by impersonating other users. Eavesdropping – By making a complete transcript of network activity, sensitive data such as passwords, data, and procedures for performing certain functions can be obtained.

Eavesdropping can be accomplished through the use of programs that monitor the packets of information transmitted across the network or, less often, by tapping network circuits in a manner similar to telephone wiretapping. Regardless of technique, it is very difficult to detect the presence of an intruder. Packet Replay – The recording of transmitted message packets over the network is a significant threat for programs requiring authentication sequences, because an intruder saves and later replays (retransmits) legitimate authentication sequences to gain access to a system.

Packet Modification – This significant integrity threat involves one system intercepting and modifying a packet destined for another system; more significantly, in many cases, packet information can be just as easily destroyed as it can be modified. Denial of Service – Multi-user, multi-tasking operating systems are subject to denial of service attacks where one user can render the system unusable by hogging a resource or by damaging or destroying resources so that they cannot be used.

Service overloading, message flooding, and signal grounding are three common forms of denial-of-service attacks. System administrators must protect against these threats without denying access to legitimate users, yet such attacks are very hard to prevent. Many denial-of-service attacks can be hindered by restricting access to critical accounts, resources, and files, and by protecting them from unauthorized users. Many opportunities exist on the Internet for invasive access to corporate and personal information. Such intrusions do occur, and care should be taken to guard against them.

This is the function of a firewall: To provide a barrier between an Internet server and anyone intent on invading its sensitive data. Countering the Threat with a Firewall As the name implies, an Internet Firewall is a system set up specifically to shield a Web site from abuse and to provide protection from exposure to inherently insecure services, probes, and attacks from other computers on the network. A Firewall can be thought of as a pair of mechanisms: one, which exists to block traffic, and the other that exists to permit traffic.

Some firewalls place a greater emphasis on blocking traffic, while others emphasize permitting traffic. A major Firewall benefit is centralized security through which all Internet access must pass, which is far easier to maintain since there are fewer servers to update and fewer places in which to find suspected security breaches. The most important thing to remember is that a Firewall should be designed to implement an access control policy that best fits your specific needs to protect your unique data and resources.

Components of a Firewall Now, let’s look at the individual components of a Firewall and how they operate: First, it is important to realize that the term Firewall defines a security concept rather than a specific device or program. A Firewall takes many forms, from a router that filters TCP/IP packets based on information in the packet to sophisticated packet filtering, logging, and application gateway servers which closely scrutinize requested functions.

Often firewalls are a collection of systems, each providing a piece of the overall security scheme. Acer has stepped up to the challenge by manufacturing gateway servers for a broad range of Firewall applications. The AcerAltos product family, from the entry-level applications AA900 Single Pentium and AA900Pro Single Pentium Pro servers to the mid-range AA9000 Dual Pentium and AA9000Pro Dual Pentium Pro servers, to the AA19000 Dual Pentium Pro server, fit any size Firewall application.

AcerAltos servers can provide the reliability and fault tolerance required by demanding Firewall applications. Packet Filtering – Accomplished by using a packet filtering router designed to examine each packet as it passes between the router’s input/output interfaces, services can be limited or even disabled, access can be restricted to and from specific systems or domains, and information about subnets can be hidden. The following packet fields are available for examination: Packet type – such as IP, UDP, ICMP, or TCP

Source IP address – the system from which the packet was sent Destination IP address – the system to which the packet is being sent Destination TCP/UDP port – a number designating a service such as telnet, ftp, smtp, nfs, etc. Source TCP/UDP port – the port number of the service on the host originating the connection The decision to filter certain protocols and fields depends on the site security policy; i. e. , which systems should have Internet access and the type of access permitted.
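A rough sketch of such a policy check, written in Python purely for illustration, might look like the following; the rules, addresses, and ports are hypothetical examples rather than a recommended configuration.

from dataclasses import dataclass

@dataclass
class Packet:
    proto: str      # "TCP", "UDP", "ICMP", ...
    src_ip: str
    dst_ip: str
    src_port: int
    dst_port: int

# Each rule: (action, protocol, destination prefix, destination port); None means "any".
RULES = [
    ("deny",  "TCP", None,      23),   # block inbound telnet everywhere
    ("allow", "TCP", "10.1.2.", 21),   # allow FTP only to one subnet
    ("allow", "TCP", None,      25),   # allow SMTP to any internal host
]

def filter_packet(pkt):
    # The first matching rule decides; anything unmatched is rejected by default.
    for action, proto, dst_prefix, dst_port in RULES:
        if proto is not None and pkt.proto != proto:
            continue
        if dst_prefix is not None and not pkt.dst_ip.startswith(dst_prefix):
            continue
        if dst_port is not None and pkt.dst_port != dst_port:
            continue
        return action
    return "deny"

print(filter_packet(Packet("TCP", "198.51.100.7", "10.1.2.9", 40000, 21)))  # allow
print(filter_packet(Packet("TCP", "198.51.100.7", "10.1.2.9", 40000, 23)))  # deny

A real router applies the same first-match logic in its firmware, which is why the ordering of filtering rules matters so much.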

The Firewall’s location will influence this policy; for example, if a Firewall is located on a site’s Internet gateway, the decision to block inbound telnet access still permits access to other site systems, whereas if it is located on a subnet, the decision to block inbound telnet to the subnet will prevent access from other site subnets. While some services such as FTP or telnet are inherently risky, blocking these services completely may be too harsh a policy; not all systems, though, require access to all services.

For example, restricting telnet or FTP access from the Internet to only those systems requiring such access can improve security without affecting user convenience. On the other hand, while services such as Network News Transfer Protocol (NNTP) or Network Time Protocol (NTP) may seem to pose no threat, restricting these protocols helps create a cleaner network environment, thereby reducing the likelihood of exploitation from yet-to-be-discovered vulnerabilities and threats.

Unfortunately, Packet Filtering routers suffer from a number of weaknesses: The filtering rules can be difficult to specify; testing must be done manually; the filtering rules can be very complex depending on the site’s access requirements; and no logging capability exists, thus if a router’s (lack of) rules were to still let dangerous packets through, they may go undetected until a break-in has occurred. In addition, some routers filter only on the destination address rather than on the source address.

Event Logging-Used to detect suspicious activity that might lead to break-ins, a host system with packet-filtering capability can more readily monitor traffic than a host in combination with a packet-filtering router, unless the router can be configured to send all rejected packets to a specific logging host. In addition to standard logging that would include statistics on packet types, frequency, and source/destination address, the following types of activity should be captured:

Connection Information – to include the point of origin, destination, username, time of day, and duration.
Attempted Use of Any Banned Protocols – such as TFTP, domain name service zone transfers, portmapper, and RPC-based services, all of which would be indicative of probing or attempts to break in.
Attempts to Spoof Internal Systems – to identify traffic from an outside system attempting to masquerade as an internal system.
Routing Re-Directions – to identify access from unauthorized sources (unknown routers).

A downside to logging is that logs will have to be read frequently.
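As a small illustration of how such logs get read, here is a sketch that scans a rejected-packet log for banned-protocol probes; the log format (one line per packet: timestamp, source, destination, protocol, destination port) and the port numbers are assumptions made up for the example.

BANNED_PORTS = {69: "TFTP", 111: "portmapper/RPC"}   # example banned services

def scan_log(path):
    alerts = []
    with open(path) as log:
        for line in log:
            timestamp, src_ip, dst_ip, proto, dst_port = line.split()
            if int(dst_port) in BANNED_PORTS:
                service = BANNED_PORTS[int(dst_port)]
                alerts.append(f"{timestamp}: {src_ip} probed {service} on {dst_ip}")
    return alerts

for alert in scan_log("rejected_packets.log"):   # hypothetical log file name
    print(alert)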

If suspicious behavior is detected, a call to the site’s administrator can often determine the source of the behavior and put an end to it; however, the Firewall administrator also has the option of blocking traffic from the offending site. Application Gateways – Also referred to as proxy servers. A site would use an application gateway server such as an AcerAltos server to provide a guarded gate through which application traffic must first pass before being permitted access to specific (pre-defined) systems.

These gateway servers are used in conjunction with packet filtering and event logging to provide a higher level of security for applications that are not blocked at the firewall; examples include telnet, FTP, and SMTP. They are located where all traffic is destined for a host within a subnet; data is first sent to the application gateway, and any traffic not directed at the application gateway will be rejected via packet filtering. The application gateway then passes authorized traffic to the subnet, rejecting all unauthorized traffic.
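The flow just described can be sketched as a small relay: accept a connection, consult an allow-list, and pipe permitted traffic on to the protected host. The addresses, ports, and allow-list below are hypothetical Python examples, not a real site configuration or a production-grade proxy.

import socket
import threading

ALLOWED_CLIENTS = {"192.0.2.10"}                 # pre-authorized source addresses
INTERNAL_HOST, INTERNAL_PORT = "10.0.0.5", 25    # protected SMTP host behind the gateway
LISTEN_PORT = 2525                               # port the gateway listens on

def pipe(src, dst):
    # Relay bytes in one direction until either side closes the connection.
    try:
        while True:
            data = src.recv(4096)
            if not data:
                break
            dst.sendall(data)
    except OSError:
        pass
    finally:
        src.close()
        dst.close()

def handle(client, addr):
    if addr[0] not in ALLOWED_CLIENTS:           # reject unauthorized traffic outright
        client.close()
        return
    upstream = socket.create_connection((INTERNAL_HOST, INTERNAL_PORT))
    threading.Thread(target=pipe, args=(client, upstream), daemon=True).start()
    threading.Thread(target=pipe, args=(upstream, client), daemon=True).start()

with socket.create_server(("", LISTEN_PORT)) as gateway:
    while True:
        conn, addr = gateway.accept()
        handle(conn, addr)

Because all traffic funnels through this single point, the gateway is also the natural place to add the authentication and logging described above.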

Here are a number of advantages over the default mode of permitting application traffic to pass directly to internal hosts:

Information Hiding – The names of internal systems are not made known via DNS to outside systems; only the application gateway host name is made known.
Robust Authentication and Logging – Application traffic can be pre-authenticated before it reaches internal hosts and can be logged more effectively than standard host logging.
Cost-Effectiveness – Third-party authentication or logging software/hardware need be located only at the application gateway.

Less-Complex Filtering Rules – The rules at the packet filtering router are less complex than if the router needed to filter application traffic and direct it to a number of specific systems; the router need only allow application traffic destined for the application gateway and reject the rest.

Note that an application gateway is application specific; to support a new application protocol, new proxy software must be developed for it. Several proxy application tool kits have been developed and can be used as a starting place to develop your own gateway software.

Alternatively, packages have appeared on the market that offer a complete solution in lieu of costly development time. An application gateway is the focal point of all traffic to and from the Internet. Selecting the proper server hardware is critical to efficient, reliable Internet access. Underestimating the load with a server that is too small produces bottlenecks that affect every Internet user, while overestimating the load with a server that is too large wastes money, which affects the corporation.

Acer has effectively addressed this problem with a broad range of servers and upgrade options. Selecting the proper server is made easier by the inherent flexibility of the AcerAltos product line: from the uni-processor 133MHz Pentium AA900 and the 180MHz/256KB or 200MHz/256KB Pentium Pro AA900Pro to the dual-processor 166MHz Pentium AA9000 and the 200MHz/256KB Pentium Pro AA9000Pro and AA19000 models, the line offers the appropriate level of power that best fits the application.

The expandability and scalability of the AcerAltos product line ensure incremental growth and performance improvement with minimal cost. Other Technologies – There are other emerging technologies that, while not new, are just now gaining recognition and standardization. Certain industry niches such as financial services require a higher degree of security. It is imperative for these companies to maintain the safety of financial data and build customer trust; Internet transactions must be made as safe as, if not safer than, traditional transactions.

To do this, these and other organizations have begun relying on two closely linked technologies: Authentication and Encryption. An application of encryption, which further enhances privacy, is the Virtual Private Network (VPN). Authentication is the process in which the receiver of a digital message can be confident of the identity of the sender and integrity of the message. Authentication protocols can be based on secret key cryptosystems or public key signature systems.

Secret key cryptosystems use a key, or seed, to encode data transmitted over the Internet. Once the data is encoded by the sender, the same key is needed to decode it on the receiving end. Only the sender and the receiver know the key. Should an unauthorized person intercept the message, it is unreadable and nearly impossible to decode without a great deal of time and a powerful computer. Public key technology uses the concept of digital signatures to assert that a named person wrote or otherwise agreed to the document on which the signatures appear.

The signature is an unforgeable piece of data allowing the recipient, as well as a third party, to verify both that the document did originate from the person who signed it and that the document has not been altered since it was signed. A secure digital signature system thus consists of two parts: a method of signing a document so that forgery is infeasible and a method of signature verification. Moreover, secure digital signatures cannot be repudiated; that is, the signer of a document cannot later disown it by claiming it was forged, since each digital signature is registered with a Certificate Authority.
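The two parts of such a system, signing and verification, can be sketched with RSA-PSS from the third-party Python cryptography package; this is only an illustration, and a real deployment would also involve a Certificate Authority vouching for the public key.

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

signer_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
document = b"I agree to the terms of this contract."

pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)

# Part one: sign the document with the private key.
signature = signer_key.sign(document, pss, hashes.SHA256())

# Part two: anyone holding the public key can verify origin and integrity.
try:
    signer_key.public_key().verify(signature, document, pss, hashes.SHA256())
    print("signature valid; document unaltered")
except InvalidSignature:
    print("forged or altered document")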

A Brief History of the Internet

Within our society there has been a revolution, one that rivals the Industrial Revolution: the Technological Revolution. At the head of this revolution is the Internet, a place full of information, adventure, and even, for some, romance. In our society today everyone has heard of this technological wonder, and many use it on a daily basis, but for some the question still remains: what is the Internet, and where did it come from? Some thirty years ago, the RAND Corporation, America’s foremost Cold War think-tank, faced a strange strategic problem. How could the US authorities successfully communicate after a nuclear war?

Post nuclear America would need a command-and-control network, linked from city to city, state-to-state, and base-to-base. But no matter how thoroughly that network was armored or protected, its switches and wiring would always be vulnerable to the impact of atomic bombs. A nuclear attack would reduce any conceivable network to tatters. And how would the network itself be commanded and controlled? Any central authority, any network central citadel, would be an obvious and immediate target for an enemy missile. RAND mulled over this grim puzzle in deep military secrecy, and arrived at a daring solution.

The network would have no central authority. Furthermore, it would be designed from the beginning to operate while in tatters. The principles were simple: the network itself would be assumed to be unreliable at all times (Krol 11). It would be designed from the get-go to transcend its own unreliability. All the nodes (computers hooked to the network) in the network would be equal in status to all other nodes, each node with its own authority to originate, pass, and receive messages. The messages themselves would be divided into packets, each packet separately addressed.

Each packet would begin at some specified source node, and end at some other specified destination node, winding its way through the network on an individual basis (Krol 11). The particular route that the packet took would be unimportant. Only final results would count. Basically, the packet would be tossed like a hot potato from node to node, more or less in the direction of its destination, until it ended up in the proper place. If big pieces of the network had been blown away, that simply wouldn’t matter; the packets would still stay airborne, lateralled wildly across the network by whatever node happened to survive.

The National Physical Laboratory in Great Britain set up the first test network on these principles in 1968. Shortly afterward, the Pentagon’s Advanced Research Projects Agency (ARPA) decided to fund a larger, more ambitious project in the USA. In the fall of 1969, the first node was installed at UCLA, and by December 1969 there were four nodes on the infant network, which was rightly named ARPANET, after its Pentagon sponsor (Overview of Internet Technology). The four computers could transfer data on dedicated high-speed transmission lines. They could even be programmed remotely from other nodes.

Thanks to ARPANET, scientists and researchers could share one another’s computer facilities by long-distance. By the second year of operation, however, an odd fact became clear. ARPANET’s users had warped the computer-sharing network into a dedicated, high-speed, federally subsidized electronic post office. The main traffic on ARPANET was not long-distance computing. Instead, it was news and personal messages. Researchers were using ARPANET to collaborate on projects, to trade notes on work, and eventually, to downright gossip and schmooze.

People had their own personal user accounts on the ARPANET computers, and their own personal address for electronic mail (email) (Overview of Internet Technology). Throughout the 70s, ARPANET’s network grew; its decentralized structure made expansion easy. As long as individual machines could speak the packet-switching lingua franca of the new, anarchic network, their brand names, their content, and even their ownership were irrelevant. The ARPANET’s original standard for communication was known as NCP, “Network Control Protocol” (Moschovits 62).

As time passed and the technique advanced, NCP was surpassed by a higher-level, more sophisticated standard known as TCP/IP (Overview of Internet Technology). TCP, or “Transmission Control Protocol,” converts messages into streams of packets at the source, and then reassembles them back into messages at the destination (Krol 23). IP, or “Internet Protocol,” handles the addressing, seeing to it that packets are routed across multiple nodes and even multiple networks with multiple standards (Krol 20). As the 70s and 80s advanced, many very different social groups found themselves in possession of powerful computers.
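A toy Python sketch of the split-and-reassemble idea behind TCP, with the packet size and sample text chosen arbitrarily for illustration:

import random

def to_packets(message, size=8):
    # Break the message into numbered chunks, like TCP segments.
    return [(seq, message[i:i + size])
            for seq, i in enumerate(range(0, len(message), size))]

def reassemble(packets):
    # Put the chunks back in order by sequence number, whatever order they arrived in.
    return "".join(data for _, data in sorted(packets))

packets = to_packets("News and personal messages crossed the ARPANET.")
random.shuffle(packets)   # packets may take different routes and arrive out of order
assert reassemble(packets) == "News and personal messages crossed the ARPANET."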

It was fairly easy to link these computers to the growing network-of-networks. As the use of TCP/IP became more common, entire other networks fell into the digital embrace of the Internet, and messily adhered. Since the software called TCP/IP was public domain (Overview of Internet Technology), and the basic technology was decentralized and rather anarchic by its very nature, it was very difficult to stop people from barging in and linking up somewhere-or-other. In point of fact, nobody wanted to stop them from joining this branching complex of networks, which came to be known as the “Internet”.

Connecting to the Internet cost the taxpayer little or nothing, since each node was independent, and had to handle its own financing and its own technical requirements. The more the merrier. Like the phone network, the computer network became steadily more valuable as it embraced larger and larger territories of people and resources. A fax machine is only valuable if everybody else has a fax machine. Until they do, a fax machine is just a curiosity. ARPANET, too, was a curiosity for a while, and then computer networking became an utter necessity.

In 1984 the National Science Foundation got into the act, through its Office of Advanced Scientific Computing. The new NSFNET set a blistering pace for technical advancement, linking newer, faster, shinier supercomputers, through thicker, faster links, upgraded and expanded, again and again, in 1986, 1988, and 1990 (Krol 12). The nodes in this growing network-of-networks were divided up into basic varieties. Foreign computers, and a few American ones, chose to be denoted by their geographical locations. The others were grouped by the six basic Internet “domains”: gov, mil, edu, com, org, and net.

Gov, Mil, and Edu denoted governmental, military and educational institutions. Com, however, stood for “commercial” institutions, which were soon bursting into the network like rodeo bulls, surrounded by a dust-cloud of eager nonprofit “orgs. ” (The “net” computers served as gateways between networks) (Kehoe 2). ARPANET itself formally expired in 1990, a happy victim of its own overwhelming success. (Overview of Internet Technology) Its users hardly noticed, for ARPANET’s functions not only continued but steadily improved.

In 1969, a mere thirty years ago, there were only four nodes in the ARPANET network. Today there are tens of thousands of nodes in the Internet, scattered over forty-two countries, with more coming on-line every day (Cerf). Three million, possibly four million people use this gigantic mother of all computer networks. The Internet’s pace of growth in the 90s is spectacular, almost ferocious. Last year the Internet was growing at a rate of twenty percent a month, and the number of “host” machines with direct connection to TCP/IP has been doubling every year since 1988 (Cerf).

The Internet is moving out of its original base in military and research institutions, and into elementary and high schools, as well as into public libraries and even coffee shops. Long distance computing was an original inspiration of ARPANET and is still a very useful service, at least for some. Programmers can maintain accounts on distant, powerful computers, run programs there or write their own. Scientists can make use of powerful supercomputers a continent away. Libraries offer their electronic card catalogs for free search. Enormous CD-Rom catalogs are increasingly available through this service.

And there are fantastic amounts of free software available. The headless, anarchic, million-limbed Internet is spreading like bread mold. Any computer with sufficient power is a potential spore of the Internet, and today such computers sell for less than $2,000 and are in the hands of people all over the world. ARPA’s network, designed to assume control of a ravaged society after nuclear holocaust, has been surpassed by its mutant child the Internet, which is thoroughly out of control, and spreading exponentially through the post Cold War electronic global village.

The spread of the Internet in the 90s resembles the spread of personal computing in the 1970s, though it is even faster and perhaps much more important. More important, perhaps, because it may give those personal computers a means of cheap, easy storage and access that is truly planetary in scale. The future of the Internet bids fair to be bigger and exponentially faster. Commercialization of the Internet is a very hot topic today, with every manner of wild new commercial information service promised. The federal government, pleased with an unsought success, is also very much in the act.

NREN, the National Research and Education Network, was approved by Congress in fall 1991, as a five year, $2 billion project to upgrade the Internet “backbone”. NREN is now fifty times faster than the fastest network in 1991, allowing the electronic transfer of the entire Encyclopedia Britannica in one hot second (EFF Information Infrastructure). Computer networks worldwide now feature 3-D animated graphics, portable computers (PALMs) and cellular phone links to the Internet, and this is only the beginning.

As we enter this new millennium, the Internet will be the road we drive on to get us to the next level of technological greatness. As astonishing as man walking on the moon: who would have thought that a device created to help during the world’s worst hour would become the device that has united the globe? ARPANET may have got the ball rolling, but this animal the Internet is on a blazing run to who knows where. We can only sit back, hang on, and enjoy the ride as we, the technological generation, wait for the next great technological wonder to change everything once more.

The Internet: Its Effects and Its Future

The Internet is, quite literally, a network of networks. It comprises tens of thousands of interconnected networks spanning the globe. The computers that form the Internet range from huge mainframes in research establishments to modest PCs in people’s homes and offices. Despite the recent hype, the Internet is not a new phenomenon. Its roots lie in a collection of computers that were linked together in the 1970s to form the US Department of Defense’s communications systems.

Because its designers feared the consequences of nuclear attack, there was no central computer holding vast amounts of data; rather, the information was dispersed across thousands of machines. A set of rules, or protocols, known as TCP/IP was developed to allow disparate devices to work together. The original network has long since been upgraded and expanded, and TCP/IP is now a “de facto” standard.

Individuals and businesses, from students and journalists, to consultants, programmers and corporate giants are all harnessing the power of the Internet. For many businesses the Internet is becoming integral to their operations. Imagine the ability to send and receive data: messages, notes, letters, documents, pictures, video, sound- just about any form of communication, as effortlessly as making a phone call. It is easy to understand why the Internet is rapidly becoming the corporate communications medium.

Using the mouse on your computer, the familiar point-and-click functionality gives you access to electronic mail for sending and receiving data, and file transfer for copying files from one computer to another. Telnet services allow you to establish connections with systems on the other side of the world as if they were just next door. This flood of information is a beautiful thing and it can only open the minds of society. With the explosion of the World Wide Web, anyone could publish his or her ideas to the world.

Before, in order to be heard, one would have to go through publishers who were willing to invest in one’s ideas to get something put into print. With the advent of the Internet, anyone who has something to say can be heard by the world. By letting everyone speak their mind, the Internet opens up new ways of thinking to anyone who is willing to listen. Moreover, the Internet is an information resource for you to search, gathering new data on key aspects of your market. Perhaps most importantly, the Internet offers a new way of doing business.

A virtual marketplace where customers can, at the push of a button, select goods, place an order and pay using a secure electronic transaction. Businesses are discovering the Internet as the most powerful and cost-effective tool in history. The Net provides a faster, more efficient way to work with colleagues, customers, vendors and business partners, irrespective of location or operating system. Harnessing this powerful resource gives companies strategic advantages by leveraging information into an essential business asset. “The technology of the future” is here today. This is a fact.

Businesses making the transition will, and are prospering; however those that do not will most certainly suffer the consequences. One of the most commonly asked questions is, “Will the Net help me sell more product? ” The answer is yes, but in ways you might not expect. The Internet is a communication “tool” first, not and advertisement medium. Unlike print or broadcasting media, the Internet is interactive; and unlike the telephone, it is both visual and content rich. A Web site is an excellent way to reduce costs, improve customer service, disseminate information nd even sell to your market.

Perhaps the most important facts about the Internet are that it contains a wealth of information that can be sent across the world almost instantly, and that it can unite people in wildly different locations as if they were next to each other. The soundest claims for the importance of the Internet in today’s society are based upon these very facts. People of like minds and interests can share information with one another through electronic mail and chat rooms. E-mail is enabling radically new forms of worldwide human collaboration.

Approximately 225 million people can send and receive it, and they all represent a network of potentially cooperating individuals dwarfing anything that even the mightiest corporation or government can muster. Mailing-list discussion groups and online conferencing allow us to gather together to work on a multitude of projects that are interesting or helpful to us. Chat rooms and mailing lists can connect groups of users to discuss a topic and share ideas. Materials from users can be added to a Web site to share with others and can be updated quickly and easily at any time.

However, the most exciting part of the Internet is its multimedia and hypertext capabilities. The Web provides information in many different formats. Of course, text is still a popular way to transmit information, but the Web also presents information in sound clips, such as music, voice, or special effects. Graphics may be still photographs, drawings, cartoons, diagrams, tables, or other artwork, but they may also be moving, such as animation or video. Hypertext links allow users to move from one piece of information to another. A link might be an underlined word or phrase, an icon or a symbol, or a picture, for example.

When a link is selected, usually by clicking the mouse on the link, the user sees another piece of information, which may be electronically stored on another computer thousands of miles away. Of major importance is the fact that the Internet supports online education. Online education introduces unprecedented options for teaching, learning, and knowledge building. Today, access to a microcomputer, modem, telephone line, and communication program offers learners and teachers the possibility of interactions that transcend the boundaries of time and space.

Even from an economic standpoint, the costs of establishing a brand new educational program for a few thousand students are far less than the cost of a building to house the same number of students. New social and intellectual connectivity is proliferating as educational institutions adopt computer-mediated communication for educational interactions. There are many school-based networks that link learners to discuss, share and examine specific subjects such as environmental concerns, science, local and global issues, or to enhance written communication skills in first- or second-language proficiency activities.

Online education is a unique expression of both existing and new attributes. It shares certain attributes with the distance mode and with the face-to-face mode; however, in combination, these attributes form a new environment for learning. Online education is further distinguished by the social nature of the learning environment that it offers. Like face-to-face education, it supports interactive group communication. Historically, the social, affective, and cognitive benefits of peer interaction and collaboration have been available only in face-to-face learning.

The introduction of online education opens unprecedented opportunities for educational interactivity. The mediation of the computer further distinguishes the nature of the activity online, introducing entirely new elements to the learning process. The potential of online education can be explored through five attributes that, taken together, both delineate its differences from existing modes of education and characterize online education as a unique mode.

Learners may study independently, at their own pace, in a convenient location, at a convenient time, about a greater variety of subjects, from a greater variety of institutions or educators and trainers. But no matter how great and significant the effects of the Internet on our lives might be, there are some quite considerable drawbacks. A very important disadvantage is that the Internet is addictive. One of the first people to take the phenomenon seriously was Kimberly S. Young, Ph.D., professor of psychology at the University of Pittsburgh.

She takes it so seriously, in fact, that she founded the Center for Online Addiction, an organization that provides consultation for educational institutions, mental health clinics and corporations dealing with Internet misuse problems. Psychologists now recognize Internet Addiction Syndrome (IAS) as a new illness that could ruin hundreds of lives. Internet addicts are people who reportedly stay online for six, eight, ten or more hours a day, every day. They use the Internet as a way of escaping problems or relieving distressed moods. Their usage can cause problems in their family, work and social lives.

They feel anxious and irritable when offline and crave getting back online. Despite the consequences, they continue using the Internet regardless of admonishments from friends and family. Special help groups have been set up to give out advice and offer links with other addicts. Internets Anonymous and Webaholics are two of the sites offering help, but only through logging onto the Internet. A study of 100 students by Margaret Martin of Glasgow University found that one in six (16%) felt irritable, tense, depressed or restless if they were barred from using the Internet.

More than one in four (27%) felt guilty about the time they spent online. One in ten (10%) admitted neglecting a partner, child or work because of overuse. One in twenty-five (4%) said it had affected their mental or physical health for the worse. Another Ph.D. psychologist, Maressa Hecht Orzack, posits that people use the Internet compulsively because it so easily facilitates the reward response common to addictive behavior. “If they are lonely and need compassion, camaraderie or romance, it can be found immediately. If they are looking for sex or pornography, they need only to click a button.

They can experience the thrill of gambling, playing interactive games from the comfort of their chairs. They can entertain fantasies by pretending to be other people, or by engaging in interactive, role-playing games. The reward received from these activities can manifest itself physically, so that the person begins to crave more of it.” The effects lead to headaches, lack of concentration and tiredness. Addicts need not cut off access altogether, but they should set time limits and restrict Internet usage to a set number of hours each day.

Robert Kraut, a doctoral psychologist, says on the subject: “We have evidence that people who are online for long periods of time show negative changes in how much they talk to people in their family and how many friends and acquaintances they say they keep in contact with. They also report small but increased amounts of loneliness, stress and depression. What we do not know is exactly why. Being online takes up time, and it may be taking time away from sleep, social contact or even eating. Our negative results are understandable if people’s interactions on the net are not as socially valuable as their other activities.”

Another considerable drawback of the Internet is that it is susceptible to hackers. Hackers are people with tremendous knowledge of the subject who use it to steal, cheat or misuse confidential or classified information for the sake of fun or profit. As the world increases its reliance on computer systems, we become more vulnerable to extremists who use computer technology as a weapon. This is called cyber-terrorism, and research groups within the CIA and FBI say cyber-warfare has become one of the main threats to global security. But how serious is hacking?

In 1989, the Computer Emergency Response Team, a nonprofit organization that monitors security issues throughout America from its base at Carnegie Mellon University in Pittsburgh, reported 132 computer intrusions. Last year it recorded 2,341. And in recent months, a few celebrated cases have shed new light on the hacker’s netherworldly activities. One notorious hacker is American Kevin Mitnick, a 31-year-old computer junkie arrested by the FBI in February for allegedly pilfering more than $1 million worth of data and 20,000 credit-card numbers through the Internet.

Still, the new wave of network hacking is presenting fresh problems for companies, universities and law-enforcement officials in every industrial country. In July, John Deutch, head of the CIA, told Congress that he ranked information warfare as the second most serious threat to national security, just below weapons of mass destruction in terrorist hands. The Internet suffers around a million successful penetrations every year, while the Pentagon, headquarters of the US military, faces 250,000 attempts to hack into its computers. But what can be done about hacking?

There are ways for corporations to safeguard against hackers, and the demand for safety has led to a boom industry in data security. Security measures range from user IDs and passwords to thumbprint, voiceprint or retinal-scan technologies. Another approach is public-key encryption, used in software packages such as Entrust. An information system girded with firewalls and gates, broken vertically into compartments and horizontally by access privileges, where suspicion is the norm and nothing can be trusted, will probably reduce the risk of information warfare as we know it today to negligible levels.
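
To make the idea concrete, here is a minimal sketch of the public-key pattern itself, written in Python with the third-party cryptography package. It illustrates only the general encrypt-with-the-public-half, decrypt-with-the-private-half technique, not Entrust or any other particular product.

    # Minimal sketch of public-key encryption (RSA with OAEP padding).
    # Illustrates the general technique only, not any commercial package.
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes

    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()        # this half can be published freely

    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    ciphertext = public_key.encrypt(b"confidential corporate record", oaep)
    plaintext = private_key.decrypt(ciphertext, oaep)   # only the key's owner can do this
    print(plaintext)

Anyone can encrypt a message to the published key, but only the holder of the private half can read it, which is why the approach connects senders and receivers without a trusted intermediary.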

Yet such an architecture is increasingly intrusive, and somewhat antithetical to the open sharing of knowledge for which much of the network was built. It is no accident that the World Wide Web was invented to enable particle physicists to share knowledge. Michael Binder, an assistant deputy minister in Canada’s industry department, asks another key question: “How would you regulate the Internet?” Computer and legal experts all agree that enforcement is difficult. Still, a committee of the Canadian Association of Chiefs of Police has made several recommendations. One would make it illegal to possess computer hacking programs, those used to break into computer systems.

Another would make the use of computer networks and telephone lines in the commission of a crime an offence in itself. The committee also recommends agreements with the United States that would allow police officials in both countries to search computer data banks. But for the time being, Binder says, the government is in no rush to rewrite the statute book. “We don’t know how it will evolve. We don’t want to stifle communication. We don’t want to shut down the Net.” The problem with regulating the Internet is that no one owns it and no one controls it.

Messages are passed from computer system to computer system in milliseconds, and the network literally resembles a web of computers and connecting telephone lines. It crosses borders in less time than it takes to cross most streets, and connections to Asia or Australia are as commonplace as dialing your neighbor next door. It is the Net’s very lack of frontiers that makes law enforcement so difficult. Confronted with the difficulty of trying to grab onto something as amorphous as the Net, some critics and government officials are hoping that Internet service providers can police the Net themselves.

However, Ian Kyer, a lawyer and president of the Computer Law Association Inc., believes that much of the debate about the Internet arises because it is so new. “We’re just sort of waking up to it. Now that it’s an everyday thing, it’s coming to the attention of the legislators and police forces, and I think they’re not going to like what they see. One of the real problems with the law of the Internet is deciding, where does the offence occur?” The best guide to the way the law should work is to study the past and the present, not to attempt to predict every possible future.

As Justice Oliver Wendell Holmes said long ago, “The life of the law has not been logic; it has been experience.” When a new media technology emerges, the best thing to do is to wait and see what problems actually emerge, not panic about what could happen. Once we understand the actual risks, we can legislate accordingly and with full regard to the competing interests at stake. But there is another problem that literally circulates through the Internet: viruses. They can move stealthily and strike without warning.

Yet they have no real life of their own, and go virtually unnoticed until they find a suitable host. Computer viruses- tiny bits of programming code capable of destroying vast amounts of stored data- bear an uncannily close relationship to their biological namesake. And like natural viruses, they are constantly changing, making them more and more difficult to detect. It is estimated that two or three new varieties are written each day. Most experts believe that a virus is created by an immature, disenchanted computer whiz, frequently called a “cracker”.

The effects may be benign: one variation of the famous “Stoned” virus merely displays a message calling for the legalization of marijuana. Other viruses, however, can scramble files or create a frenzy of duplication that may cause a computer’s microchips to fail. The rapid increase in computer networks, with their millions of users exchanging vast amounts of information, has only made things worse. With word-processing macros embedded in text, opening e-mail can now unleash a virus in a network or a hard disk. Web browsers can also download running code, some of it possibly malign.

Distributing objects over global networks without a good way to authenticate them leads to similar risks. Crackers have also succeeded in tainting software sold by brand-name manufacturers. A clutch of companies offer antiviral programs capable of detecting viruses before they have a chance to spread. Such programs find the majority of viruses, but virus detection is likely to remain a serious problem because of the ingenuity of crackers. One new type of virus, known as polymorphic, evades discovery by changing slightly each time it replicates itself.
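
The core of most antiviral programs is simple pattern matching against a library of known signatures, which is exactly what a polymorphic virus is designed to defeat. The following is a minimal sketch in Python; the byte patterns are invented placeholders, not real virus signatures.

    # Minimal sketch of signature-based virus scanning.
    # The signatures below are made-up placeholders for illustration only.
    import os

    SIGNATURES = {
        "demo-virus-a": bytes.fromhex("deadbeefcafe"),
        "demo-virus-b": b"LEGALISE-IT",      # a fixed text marker, as some old viruses carried
    }

    def scan_file(path):
        """Report which known signatures occur in the file's raw bytes."""
        with open(path, "rb") as handle:
            data = handle.read()
        return [name for name, pattern in SIGNATURES.items() if pattern in data]

    for entry in os.listdir("."):
        if os.path.isfile(entry):
            hits = scan_file(entry)
            if hits:
                print(entry, "matches", hits)

    # A polymorphic virus re-encodes its body on every replication, so no fixed
    # byte pattern survives from one copy to the next and this kind of scan fails.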

Another extremely important issue concerning the Internet is child pornography. Computer technology is providing child molesters and child pornographers with powerful new tools for victimizing children. The result is an explosive growth in the production and distribution of illegal child pornography, as well as new forms of child predation. Research carried out at Stockholm University identified 5,651 postings of child pornography in four discussion groups. Children around the world are being sexually assaulted, molested and exploited by people who also misuse computers and related technology.

The abuse is being photographed and distributed to an international marketplace of child-pornography consumers via the Internet. That marketplace- along with related Internet sites that encourage child sexual abuse- is leading to new assaults against children. No longer are schools, public libraries and homes safe harbors from pedophiles- people whose sexual fantasies focus on girls or boys- around the world. In the past, photographs of children being raped, sexually abused and exploited were sold at high prices through tight-knit, difficult-to-access networks.

Today, those illegal pictures are available for free online, at any hour of the day. Anyone with rudimentary computer skills and an interest in the material can obtain it. Computer networks can also allow pedophiles to identify and contact potential victims without revealing their identities. Often, adult predators pretend to be children until they have gained their victims’ confidence. Federal law defines child pornography as photographs or video that depict people under the age of 18 involved in sexually explicit conduct- such as sexual intercourse, bestiality, masturbation and sadistic or masochistic abuse.

Also prohibited are pictures involving children that include a “lascivious exhibition of the genitals or pubic area”. The Government has introduced a number of measures to deal with pornography and obscene material, including its use on computers. The Criminal Justice and Public Order Act 1998 increased the maximum sentence for possession of indecent photographs of children to 5 years in prison, a $250,000 fine, or both. People convicted of distributing child pornography face up to 15 years in prison and/or a $300,000 fine.

It also gave the police the power to arrest without warrant people suspected of obscenity and certain child pornography offences and greater powers to search and seize obscene material and child pornography. It also closed a potential legal loophole by extending the law to cover simulated child pornography manufactured and stored on computers. In Singapore authorities announced plans to establish a “neighborhood police post” on the Internet to monitor and receive complaints of criminal activity- including the distribution of child pornography.

And in the United States a bill has been introduced- vocally opposed by civil liberties organizations and computer-user groups- that would outlaw the electronic distribution of words and images that are “obscene, lewd, lascivious, filthy or indecent.” However, Federal agencies lack the manpower to cope with all the criminal activity taking place online. Few local law enforcement officers are trained in computer technology. Moreover, Internet providers generally fail to educate their customers about ways to protect children from sexual predators.

Few schools or libraries offer real safety training programs for children online. Many parents have no idea what threats exist or even how the technologies in question work. Last but not least is the threat to our privacy when online. These days, the most skilful manipulators of new information and communications technology are private companies building up files on individuals by collecting personal data on tens of millions of people. Simon Davies, the British head of Privacy International, a human rights watchdog group, says that every citizen of an industrialized country appears today in about 200 different databases.

Such mines of information are centralized, sifted through and correlated to produce very detailed profiles of consumers. The files are then resold to all kinds of firms, which use them to sharpen their marketing strategies, assess the economic reliability of customers and adjust to specific commercial demands. The Internet is an ideal tool for this meticulous task of categorizing the population. It is an extraordinary source of data as well as a practical way to handle such information and circulate it.

To make matters worse, the Internet is a world of invisible tracks. You get the impression when you surf the Web that you leave no traces behind you. The truth is rather different. Some sites place small tracking files, known as “cookies”, on your computer’s hard drive the moment you log on to them, so they can tell which pages of the site you have looked at, when you looked and for how long. A survey last year by the Electronic Privacy Information Center (EPIC) showed that a quarter of the 100 most popular sites on the Web use cookies to build profiles of their users.
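
The mechanism is mundane. Here is a minimal sketch of the cookie round trip using Python’s standard http.cookies module; the cookie name and value are invented for illustration.

    # Minimal sketch of how a cookie round-trip works.
    # "visitor_id" and its value are invented placeholders.
    from http.cookies import SimpleCookie

    # 1. The server's first response includes a Set-Cookie header.
    server_cookie = SimpleCookie()
    server_cookie["visitor_id"] = "a1b2c3"
    server_cookie["visitor_id"]["path"] = "/"
    print(server_cookie.output())          # e.g. "Set-Cookie: visitor_id=a1b2c3; Path=/"

    # 2. The browser stores it and sends it back on every later request to the same site.
    request_header = "visitor_id=a1b2c3"

    # 3. The server parses the returned header and recognises the visitor, which is how
    #    pages viewed, times and durations can be tied to a single profile.
    returned = SimpleCookie()
    returned.load(request_header)
    print("recognised visitor:", returned["visitor_id"].value)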

When you next visit them, they can present you with advertising tailored to your interests, or even send you, without your knowledge, programs such as Java applets, which can reconfigure a site according to each visitor’s tastes. Arguments persist that the erosion of privacy is not such a big deal; the economic benefits of information availability and mobility, it is said, outweigh limitations on our personal privacy. Is privacy an ethical nicety, an expendable luxury, then, or is it a basic natural right that needs legal protection?

Some philosophers and legal scholars have argued that privacy is an intrinsic good, implying that the right to privacy is fundamental and irreducible. Others contend that privacy is more of an instrumental good. Hence the right to privacy is derived from other rights such as property, bodily security and freedom. While both approaches have validity, the latter seems more compelling. It is especially persuasive when applied to those rights involving our liberty and personal autonomy.

A primary moral foundation for the value of privacy is its role as a condition of freedom: a shield of privacy is absolutely essential if one is freely to pursue his or her projects or cultivate intimate social relationships. Under the European Union’s data protection directive, which came into effect on 25 October 1998, the processing of data about ethnic origins, political opinions, religious and philosophical beliefs, trade union membership, health and sex life is prohibited except where there are special exemptions or derogations.

Moreover, in each of the European Union’s fifteen Member States, a special authority is to protect individuals’ rights and freedoms with regard to the processing of personal data. It is to guarantee citizens the right to be informed, to have access to data concerning them, to correct that data, and to erase data whose processing does not comply with the provisions of the directive. Article 25 states the principle that the transfer of personal data to third countries may only take place if the receiving countries offer a level of protection that is “adequate” within the meaning of EU legislation.

In a globalized economy where information about consumers is the new gold mine, the stakes are huge, involving no more and no less than the future of all banking and trade transactions, especially electronic ones. The United States has already gone on the offensive by accusing Europe of using privacy protection laws to erect barriers around the valuable European market of 370 million people. President Bill Clinton’s Internet policy adviser, Ira Magaziner, has even threatened to take the matter to the World Trade Organization (WTO).

At the same time, he insists that the US is just as concerned to protect the privacy of its citizens as European governments are. And all studies show that Internet commerce cannot succeed unless consumers can count on information about themselves being kept confidential. There are currently about 250 bills relating to the Internet pending in Congress. Many of them deal specifically with privacy. However, only a very few have become law. That is largely because the Clinton administration and Congress have taken a largely wait-and-see approach to this conflict.

Most lawmakers feel the Internet develops too quickly for static laws to work effectively. Instead, politicians from Vice President Al Gore down are encouraging the Internet industry to regulate itself, while suggesting that their patience is not inexhaustible. It will be very difficult to regulate the Internet because it is global and decentralized, and it is very hard to identify Internet users. The key is developing something that is enforceable. Good intentions are one thing, but in the self-regulatory environment, if somebody is hurt by the misuse of personal information, who pays?

Who provides a remedy to that harmed individual? Nobody does. Privacy is a tough area for personal injury lawyers because it is difficult under our tort law to prove that somebody has been harmed. It is very hard to prove damage to reputation or intentional infliction of emotional distress in cases involving disclosure of personal information. Many individuals and organizations are now relying more heavily on digital networks as they routinely communicate by e-mail, post messages to electronic bulletin boards on the Internet and visit Web sites.

But in the process they become more exposed and vulnerable to those seeking to collect and sell their personal data. When users visit Web sites they often fill out detailed personal profiles that become grist for marketing lists sold to third parties. Digital networks have also made consumer information even more widely and easily available: the use of these networks greatly expands the capability of checking up on someone’s personal background or obtaining an electronic list of prospective customers quickly and inexpensively. Indeed, we are moving perilously close to a world of instantly available online personal data.

More disturbing than the loss of our privacy as consumers is the loss of privacy about our financial affairs. Once again, government data banks have usually provided the building blocks for these records. Certain financial information that was always in the public domain, such as real estate and bankruptcy records, is now treated as a basic commodity. Data brokers such as Information America, Inc. allow their subscribers quick online access to county and court records for many states. Their vast databases contain business records, bankruptcy records, lawsuit information and property records, including liens and judgments.

By computerizing these real estate records, liens, incorporations, licenses and so on, they become more than public documents. They are now on-line commodities, more easily accessed and distributed than their physical counterparts. In addition, this data can be recombined with other personal and financial background information. The most recent assault on privacy has developed in the health care industry, in which patient records have also become commodities for sale. These records, containing highly sensitive and revealing information, are being collected and stored in databases maintained by hospitals.

Thus, medical privacy seems destined to be another victim of our evolving information technologies. By putting so much medical data online without proper safeguards, the Government, the health care industry and the information industry are clearly undermining the foundation of the confidential doctor-patient relationship. It seems quite evident that our right to informational privacy- the right to control the disclosure of and access to one’s personal information- has been sacrificed for the sake of economic efficiency and other social objectives.

As our personal information becomes tangled in the web of information technology, our control over how that data will be utilized and distributed is notably diminished. Our personal background and purchases are tracked by many companies that consider us prospects for their products or services; our financial profile and credit history are available to a plethora of “legitimate” users; and our medical records are more widely accessible than ever before. The net effect is that each of us can become an open book to anyone who wants to take the time to investigate our background.

Another adverse consequence of all this is that we can be more easily targeted and singled out, either as individuals or as members of certain groups. Database technology makes it easy to find and exploit certain groups based on age, income level, place of residence, or purchasing habits. At the same time, online data banks now make it especially simple to pinpoint individuals electronically. If public policy makers do become convinced that privacy is worth preserving, what should be done? Are there any viable solutions?

Further complicating the issue, of course, are legitimate economic considerations. Privacy cannot be protected without incurring some costs. And we cannot ignore the economic benefit of acquiring and distributing information and using data as a commercial tool to target the right customers. If the information flow about consumers is overly constrained, a substantial negative economic impact cannot be discounted. In addition, there must be stricter controls for especially sensitive information such as medical data.

If a centralized national database becomes a reality, it will be necessary to achieve a broad public consensus on the definition of the health care trustees who should have access to that data. In summary, then, if informed consent is made mandatory for the reuse of consumer data and there are stricter safeguards for more critical information such as medical data, we can begin to make some progress in protecting privacy rights. But unless we come to terms with this problem, the boundaries between what is public and what is private will continue to blur.

Internet Censorship Paper

“Inevitably, being an uncontrolled system, means that the Internet will be subjected to subversive applications of some unscrupulous users.” (Kershaw) The concept of the Internet was created in answer to a strategic problem faced by the United States government during the Cold War era. A nuclear attack would easily disrupt a traditional computer network and hence make communication impossible. The solution was found in a new type of network: a network where all nodes would be equal in status, that is to say, each could send and receive messages.

The resulting projects were the first steps towards the birth of the Internet, as we know it. Today, the Internet consists of several parts, which include the World Wide Web, News groups, and Email. The Internet is continuing to grow at a rate of 40% a year, with roughly 20 million users to date. Over the past few years, the issue of Internet censorship has been subject to an unprecedented amount of controversy. Both sides of the debate present very strong arguments about why the Internet should or should not be censored.

The point most often brought forward by advocates of Internet censorship is that “inappropriate” material can all too easily land in the hands of children via this powerful new medium. “Inappropriate” mostly describes the sexually explicit and racist material that is easily found on the Internet. The debate that currently rages, however, centres mainly on pornographic material. The essay is divided into three content-based sections. The first section examines the data that is available about pornography on the Internet, and conclusions on the significance of the data are offered.

Section two examines the legal issues and difficulties surrounding the idea of censorship. The final section discusses alternative ways of protecting children from pornography and offers a final conclusion on the attributes of the problem and the suggestion of a solution. Censorship of the Internet is a big issue, and not much of it can be covered in an essay at this level. In early 1995, a research team at Carnegie Mellon University in Pittsburgh, Pennsylvania released one of the most revealing studies into online pornography.

The value of the study, titled “Marketing Pornography on the Information Superhighway”, lies mainly in its massive sample size. There are several issues about pornography on the Internet that were highlighted by the study. The research team surveyed 917,410 “sexually explicit pictures, descriptions, short stories and film clips”. Of special interest were Usenet newsgroups, which are basically electronic forums. It was found that 83.5 percent of the digitised images stored on these newsgroups were pornographic pictures.

This finding indicates that there clearly is a substantial amount of pornography on the net. This however does not necessarily indicate that this material is easy to find. To come to a conclusion, this student conducted several experiments using the on-line “Altavista” search engine. The key searchword “sex” was entered. 616,156 links were returned. Out of the first 20 entries listed on the first page only 2 links were to pornographic sites. The search keyword “tits” however, returned 69,920 links. Out of the first 20 links listed on the first page 17 were to pornographic websites or bulletin boards.

Amidst all these links was one that led to a French children’s pen-pal club called “Les P’tits Garnements”. After reviewing the posted messages and photographs on one of the bulletin boards that showed up as a link, it was apparent that the purpose of the site was purely the exchange of child pornography. It is most likely that a minor would come across explicit areas of the Internet through search engines. Children are very likely to search using traditionally rude four-letter words, more as a source of childish amusement than anything else.

It is hard to argue that the resulting links do not justify the level of parental anxiety that we are witnessing today. Explicit sexual material on the Internet is not the product of an unfounded moral panic. Anyone who takes the time to conduct a few experiments as detailed above will realize that this is a most serious issue. The survey also determined that 71 percent of the sexual images on the newsgroups surveyed originate from adult-oriented computer bulletin-board systems (BBS) whose operators are trying to lure customers to their private collections of X-rated material.

There are thousands of these BBS services, which charge fees (typically $10 to $30 a month) and take credit cards; the five largest have annual revenues in excess of $1 million. Contrary to what seems to be popular belief, explicit material is not being circulated by “perverted socially reclusive computer nerds”. This is a commercial activity. As long as people are willing to pay for it, it will be supplied. This is not a new problem that society faces. Prostitution and drug trading are other older facets of this same concept. The Internet has simply brought a new face of the same issue.

Perhaps the most disturbing discovery of the Carnegie Mellon study is one that relates to the changing face of pornography. It is no longer “just naked women”. There is great demand and inevitably great supply of “pedophilia” (nude photos of children), “hebephilia” (youths) and what the researchers call “paraphilia” (“a mixture of deviant material that includes images of bondage, sadomasochism, urination, defecation, and sex acts with a barnyard full of animals”). Anti-censorship activists often argue that censoring the net “makes no difference” because “obscene” material is available from any old corner shop.

These newer “types” of pornography may actually render this argument obsolete. Children are certainly exposed to material that even the most adventurous of them would not have normally come across. Most societies like to think of themselves as at least doing something to limit the development of “problems” such as pornography on the Internet. Governments of the United Kingdom and United States have both taken legislative steps towards this effect. It has not been easy in either case and the outcomes have arguably been altogether unsatisfactory.

United Kingdom legislation includes several statutes that are of particular interest. Section 1(1) of the 1959 Obscene Publications Act provides the following test for obscenity: “For the purposes of this Act an article shall be deemed to be obscene if its effect or (where the article comprises two or more distinct items) the effect of any one of its items is, if taken as a whole, such as to tend to deprave and corrupt persons who are likely, having regard to all relevant circumstances, to read, see or hear the matter contained or embodied in it.”

This definition of “obscene” stems from the opinion of Lord Cockburn in Regina v. Hicklin (1868), which enunciated the first important guide for determining what material is obscene. It is open to serious criticism. The fundamental problem with this definition is that it can condemn material that deals with sex legitimately. Section 43 of the Telecommunications Act 1984 makes it an offence to send by means of a public telecommunications system “a message or other matter that is grossly offensive or of an indecent, obscene or menacing character”, and this is an imprisonable offence with a maximum term of six months.

When carefully scrutinised, it is clear that the Act itself does not penalise the act of procuring a message to be sent. As usual, loopholes abound. When a telecommunication system located outside the jurisdiction is used to send obscene materials into the country, no offence has been committed. The 1984 Act will also not apply to cases where the data is transmitted over a local area network, unless part of the transmission is routed through a public telecommunications system.

Even though UK legislation has recently been amended by the Criminal Justice and Public Order Act 1994 (CJPOA 1994) in order to keep up with technological changes, there are still wrinkles in its enforcement with respect to the Internet. United States legislation has gone through many different tests of “obscenity” for reasons too many and varied to be discussed here. The current definition of obscenity is based on several conditions as opposed to a single one. On June 14, 1995, the Senate debated and voted on Title IV of the Telecommunications Competition and Deregulation Act of 1995 (S. 652).

Also known as the Communications Decency Act of 1995, it proposed to amend Section 223 (47 U.S.C. 223) to read: “Whoever, by means of a telecommunications device knowingly makes, creates, or solicits, and initiates the transmission of, any comment, request, suggestion, proposal, image, or other communication which is obscene, lewd, lascivious, filthy, or indecent, with intent to annoy, abuse, threaten, or harass another person” will be charged with a felony punishable by a fine of up to $100,000 or up to two years in prison, or both. There are two immediately obvious reasons why the CDA was doomed to fail.

Firstly, it may have severely restricted the flow of information and free speech. In its online analysis the “Electronic Frontier Foundation” (EFF), stated: “[T]he legislation not only fails to solve the problems it is intended to address, but it also imposes content restrictions on computer communications that would chill First Amendment-protected speech and, in effect, restrict adults in the public forums of computer networks to writing and reading only such content as is suitable for children. ” A second problem was the uncompromising nature of the bill.

A particular example would perhaps best illustrate this point. An online appeal from an organisation called Computer Professionals for Social Responsibility (CPSR) stated the following: “This proposed law could mean the demise of the Seattle Community Network (SCN), a 6,500-member free, public access computer network established to benefit the public. Under the proposed legislation, if an individual member of the SCN posted a message on an SCN forum or from SCN that was later deemed to be “indecent,” SCN could be fined $100,000, and SCN’s board of Directors and staff could face two-year prison sentences.

Yet without community networks like SCN, the Internet would be out of reach to millions of citizens. ” It is easy to admire the ideals that the CDA stands for. The Senate approved the Act with a vote of 84-16. The approach of the legislation however, is much too naive. In June of 1997 the CDA was declared unconstitutional by the Supreme Court on the grounds that it “violates constitutional guarantees of freedom of expression”. The general problem with legislation with respect to the Internet is the fact that regulation laws have for so long been focused on traditional media channels such as radio and television.

The Internet is a new and extreme form of media. It is thus incompatible with existing laws. New and equally extreme laws would be necessary to regulate the Internet. However, as experience with the CDA has shown, extreme laws are not very popular and are likely to be contested and overthrown.

Section 3

Even the most zealous censorship advocate would be blind not to realise that a world-wide censorship programme would be futile. National laws and community standards vary from nation to nation and even between regions of the same country.

It would be virtually impossible to get every nation on earth to agree to a single censorship act. Thus, the Internet, in its entirety, can never truly be censored. Even if the politics of it could be appeased, there are too many sources of pornography. Even the most ambitious regulatory agency could not hope to control them all. So the question of how to keep pornography away from children still remains. Following the 1996 period of extensive controversy over pornography on the Internet, many sexually explicit websites introduced password protection schemes.

Before being allowed access to the site, a password is required. To obtain a password, a user must subscribe to a company such as “Adultcheck”. Online credit card payment is required before passwords are dispensed, the idea being that minors will not have access to credit cards. It seems like an attractive solution. Legislation could be used to compel the pornography industry to regulate itself in order to safeguard its substantial profits. There are two reasons, however, why this would be an incomplete solution.

Firstly, many websites would be outside the jurisdiction of the legislation and thus free to operate without password schemes. There would probably even be a significant number of renegade sites that would ignore legislation and get away with it. Secondly, many people simply give away their passwords on bulletin boards, which defeats the whole purpose of the scheme. The solution that parents and educators are currently turning to most is the censor-software approach. Censor or filtering software works by denying access to “objectionable” websites on the Internet. There are two main reasons why this approach is at best inadequate.
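
Before turning to those two drawbacks, it helps to see how simple the underlying mechanism usually is. Here is a minimal sketch in Python of the host- and keyword-blocklist idea; both lists are invented placeholders, and the deliberately crude keyword match already hints at the overblocking problem discussed below.

    # Minimal sketch of how censor/filtering software decides to deny access.
    # Both lists are invented placeholders; real products ship far larger, hidden lists.
    from urllib.parse import urlparse

    BLOCKED_HOSTS = {"example-adult-site.com"}
    BLOCKED_WORDS = {"xxx", "breast"}        # crude keyword matching, as many products use

    def is_blocked(url):
        """Deny access if the host is blocklisted or the URL contains a banned word."""
        lowered = url.lower()
        if urlparse(lowered).hostname in BLOCKED_HOSTS:
            return True
        return any(word in lowered for word in BLOCKED_WORDS)

    for url in ["http://example-adult-site.com/page",
                "http://breast-cancer-support.example.org/"]:
        print(url, "->", "blocked" if is_blocked(url) else "allowed")

Note that the second, perfectly legitimate health-support address is blocked by the keyword rule, which is exactly the kind of false positive the studies cited below complain about.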

The first drawback relates to technological imperfections and was well highlighted in a study titled “Faulty Filters”, released by the Electronic Privacy Information Centre (EPIC) in November 1997. The study involved conducting up to 100 searches using a normal search engine and then conducting the same searches via one of the new search engines that claim to return only “family-friendly” links. It was found that the family-friendly search engine “typically blocked access to 95-99 percent of the material available on the Internet that might be of interest to young people”.

The study also noted that even when strict “blocking criteria” were used, links to “objectionable” material still showed up. The conclusion of the study seriously questioned the filtering-software approach to Internet censorship on the basis of its potential to “ultimately diminish the educational value of the Internet”. The study is not optimistic about any improved performance of filtering software with the passage of time. The second drawback is concerned with a much more sinister aspect of censor software. In the February 1998 edition of “.net” magazine, an article about filtering software presented some disturbing facts.

The article revealed that Solid Oak’s CYBERsitter, a popular filtering program, actually blocked the sites of groups such as “The National Organisation for Women” and “The Gay and Lesbian Alliance against Defamation”. An even more worrying revelation was that among the list of words and phrases to be blocked by the program were the phrases “dontbuycybercitter” and “bennethasleton”, the name of an 18-year-old anti-censorship activist. This implies that there are serious political and commercial agendas behind the facade of “censorship”. This may perhaps be the general problem with censorship.

It tries to impose a particular rigid set of morals and values on everybody concerned, and at times is abused in order to advance other hidden agendas. Whatever the solution to the problem of pornography on the Internet, it is unlikely to have anything to do with technically unsophisticated software that considers “gay rights” to be an objectionable topic. In August 1995, at the Massachusetts Institute of Technology (MIT), a twenty-two-company consortium held a conference to discuss the need to create self-imposed ratings systems which would allow parents to control what children are allowed to see on the Internet.

The result of the group’s work, to date, is the Platform for Internet Content Selection (PICS). PICS is similar to the V-chip technology that was suggested for shielding children from sex and violence on television. In this case, websites voluntarily “label” their content. What this technology may mean for parents is the ability to specify viewing levels for their children according to what they feel their children should be exposed to. Intuition suggests that a complete censorship program would be practically impossible.
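
The checking logic a label-aware browser needs is modest. The following is a minimal sketch in Python using a simplified stand-in for the real PICS label syntax; the category names and numeric levels are invented for illustration.

    # Minimal sketch of the PICS idea: a site labels its own content, and the browser
    # compares the label against limits a parent has set. The label format shown here
    # is a simplified stand-in, not actual PICS syntax.

    site_label = {"violence": 1, "nudity": 3, "language": 0}      # hypothetical self-applied label
    parental_limits = {"violence": 2, "nudity": 0, "language": 2}  # limits chosen by a parent

    def allowed(label, limits):
        """Permit the page only if every rated category stays within the parent's limit."""
        return all(label.get(category, 0) <= maximum
                   for category, maximum in limits.items())

    print("page allowed:", allowed(site_label, parental_limits))   # False: nudity 3 exceeds limit 0

The values are chosen by the parent rather than hard-coded by a software vendor, which is the key difference from the blocklist products discussed above.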

There are too many obstacles including the inability of current legislation to deal with the unique nature of the Internet without resorting to extreme and unpopular measures. The Internet, to be utilised to its full value, simply must remain as it is. Protection of minors from “obscene” material must be a sort of add-on. Filtering software is an attractive concept, but as discussed above, has enough shortcomings to make it a questionable approach. The only hope lies in the advancement of technology.

The MIT consortium and its work with PICS illustrate the kind of solution that would be most ideal. PICS is simply a technical standard that will supposedly allow concerned parents and educators to tighten their control over what children can see on the Internet. Unlike filtering software, it does not impose any scheme of values on its user population or advance any hidden agendas. Hopefully, with the advancement of fifth-generation computers and artificial intelligence, schemes such as this will eventually approach minimal defect.

The question that will then remain will be much more of a human one. Technology can never really replace parental guidance. Children absorb sexually explicit material because of their basic curiosity, which stems from inexperience in the ways of the world. Parents and educators must recognise their duty to nurture the growth of children and encourage broad-minded yet ethically aware mental development. Only then will society be facing up to its basic problems, as opposed to merely trying to diminish their effects.

The Napster Debate

1. Background

The Napster software (http://www.napster.com), launched early in 1999, allows Internet users to share and download MP3 files directly from any computer connected to the Napster network. The software is used by downloading a client program from the Napster site and then connecting to the network through this software, which allows sharing (uploading and downloading) of MP3 files between all users connected to the network. While Napster does not condone copyright infringement, the software offers no means of stopping it, nor of paying royalties to artists whose songs are being duplicated for free.

Unlike similar file-sharing applications (Gnutella, Freenet), Napster limits users to uploading and downloading MP3 files only. MP3 files are compressed audio files; their advantage is that they are approximately one-tenth the size of the corresponding .wav file while remaining close to CD quality. It is for this reason that many artists, record labels and other music industry stakeholders are concerned by the MP3 file format and by applications like Napster that simplify the sharing of copyrighted material.
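
The “one-tenth” figure is easy to sanity-check with a back-of-the-envelope calculation, assuming CD-quality uncompressed audio and a typical 128 kbit/s MP3 encoding.

    # Rough size comparison for a four-minute song, under the stated assumptions.
    seconds = 4 * 60                                  # a four-minute song
    wav_bytes = 44_100 * 2 * 2 * seconds              # 44.1 kHz, stereo, 16-bit samples
    mp3_bytes = 128_000 / 8 * seconds                 # 128 kbit/s encoded stream

    print(f"WAV: {wav_bytes / 1e6:.1f} MB")           # about 42.3 MB
    print(f"MP3: {mp3_bytes / 1e6:.1f} MB")           # about 3.8 MB
    print(f"ratio: {wav_bytes / mp3_bytes:.1f} : 1")  # about 11 : 1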

Other file formats in common use on the Internet are not as threatening to the recording industry, primarily because of the reduced quality of the recording. RealAudio (.ra, .rm) files have reduced sound quality (comparable to radio) and are usually streamed over a different protocol, allowing people to listen to songs without having (or being able) to download the source files. Another ‘music’ file format common on the Internet is MIDI. These files are no threat to the music industry because they are not actually a recording of the music, but rather a set of instructions telling the computer what sounds to play (and there is no way to duplicate vocal tracks). This format is also becoming outdated and is used less and less.

2. Impact

The reaction from recording artists, record labels and other music industry players has been varied, but primarily anti-Napster. The first action to be taken against Napster was by the band Metallica. In April of this year, they sued Napster Inc for copyright infringement. The case was settled out of court when Napster agreed to ban some 300,000 users who had allegedly downloaded Metallica songs. Again in June, Napster Inc was sued for copyright infringement by the Recording Industry Association of America (RIAA), a trade group representing the US recording industry, alleging that “Napster is enabling and encouraging the illegal copying and distribution of copyrighted music”.

Napster claims that the Audio Home Recording Act, which permits copying of material for personal use, allows its users to swap MP3s. Napster further claims immunity by defining the company as an ISP under the Digital Millennium Copyright Act. The RIAA unsuccessfully applied for an injunction to stop Napster’s operations until after the court case in September, so Napster will continue to operate unless and until the court rules against it.

Other artists and record labels (http://www.napster.com/speakout/artists.html and http://www.napster.com/speakout/labels.html) have responded to the advent of Napster and similar applications in a more positive way, embracing the new technology rather than rejecting it. On their website, the Offspring say that “MP3 technology and programs such as Napster [are] a vital and necessary means to promote music and foster better relationships with our fans.” Interestingly enough, the Offspring’s last album, Americana, was made available online illegally before it was commercially released, yet it is the band’s best-selling album to date. Furthermore, a number of surveys have found that Napster users actually buy more CDs after ‘sampling’ songs online.

It is this issue- whether Napster and similar applications will mean reduced CD sales- that is at the core of the RIAA lawsuit. Napster does challenge the traditional distribution of music (CDs, cassettes, vinyl, etc.), but whether this should be viewed as a threat or simply a new medium to be exploited by the music industry is another issue. Some record labels, most notably Epitaph (http://www.epitaph.com), have partnered with sites like e-music.com to sell full albums and single songs in MP3 format over the web. In this case, the record company has in fact gained a new distribution method, rather than seeing it as the ‘enemy’. Of course, in this scenario, the record company still gets a cut of the profits, something that artists whose songs are downloaded through Napster don’t get.

The fact that Napster is free and more convenient than visiting a record store makes it an appealing way to get music for consumers. The problem the record companies have is that there is no way of regulating who has access to the information, and hence no way of profiting from it.

Napster also facilitates international distribution for unsigned artists, and this too threatens record labels. Previously, without being signed to a record label, an artist simply could not get the exposure to make a living as a musician. With the Internet and sites like mp3.com and Napster, this is now possible.

While Napster does allow music sharing to an extent that could theoretically destroy the retail music industry, stopping Napster will not solve all of the industry’s problems. Record labels need to see this new technology not as a threat but as a challenge. They need to come up with ideas to encourage people to buy CDs (multimedia components, attractive artwork, lyrics, picture books, etc.). Perhaps if they offered better services to their signed artists, fewer artists would want to release their music themselves.

Napster challenges the music industry’s monopoly on distribution. People can now download music for free in their own homes and artists can release their music themselves. In theory, this could mean the end of record labels and other associated companies, and that is why groups like the RIAA are so worried.

Censorship and the Internet

The concept of the Internet was created in answer to a strategic problem faced by the United States government during the Cold War era. A nuclear attack would easily disrupt a traditional computer network and hence make communication impossible. The solution was found in a new type of network: a network where all nodes would be equal in status, that is to say, each could send and receive messages. The resulting projects were the first steps towards the birth of the Internet as we know it. Today, the Internet consists of several parts, which include the World Wide Web, FTP, IRC, News groups, Gopher, WAIS, Archie, and Email.

The Internet is continuing to grow at a rate of 40% a year, with roughly 20 million users to date. Over the past few years, the issue of Internet censorship has been subject to an unprecedented amount of controversy. Both sides of the debate present very strong arguments about why the Internet should or should not be censored. The point most often brought forward by advocates of Internet censorship is that “inappropriate” material can all too easily land in the hands of children via this powerful new medium. “Inappropriate” mostly describes the sexually explicit and racist material that is easily found on the Internet.

The debate that currently rages, however, centres mainly on pornographic material. The essay is divided into three content-based sections. The first section examines the data that is available about pornography on the Internet, and conclusions on the significance of the data are offered. Section two examines the legal issues and difficulties surrounding the idea of censorship. The final section discusses alternative ways of protecting children from pornography and offers a final conclusion on the attributes of the problem and the suggestion of a solution.

Censorship of the Internet is a big issue, and not much of it can be covered in an essay at this level. The essay deliberately focuses only on pornography. While many aspects had to be left out and others discussed minimally, the result of this essay remains a brief synopsis of relevant issues and conclusions on these issues.

Section 1

In early 1995, a research team at Carnegie Mellon University in Pittsburgh, Pennsylvania released one of the most revealing studies into online pornography. The value of the study, titled “Marketing Pornography on the Information Superhighway”, lies mainly in its massive sample size.

There are several issues about pornography on the Internet that were highlighted by the study. The research team surveyed 917,410 “sexually explicit pictures, descriptions, short stories and film clips”. Of special interest were Usenet newsgroups, which are basically electronic forums. It was found that 83.5 percent of the digitised images stored on these newsgroups were pornographic pictures. This finding indicates that there clearly is a substantial amount of pornography on the net. This however does not necessarily indicate that this material is easy to find.

To come to a conclusion, this student conducted several experiments using the on-line “Altavista” search engine. The key search word “sex” was entered, and 616,156 links were returned. Out of the first 20 entries listed on the first page, only 2 links were to pornographic sites. The search keyword “tits”, however, returned 69,920 links. Out of the first 20 links listed on the first page, 17 were to pornographic websites or bulletin boards. Amidst all these links was one that led to a French children’s pen-pal club called “Les P’tits Garnements”.

After reviewing the posted messages and photographs on one of the bulletin boards that showed up as a link, it was apparent that the purpose of the site was purely the exchange of child pornography. It is most likely that a minor would come across explicit areas of the Internet through search engines. Children are very likely to search using traditionally rude four-letter words, more as a source of childish amusement than anything else. It is hard to argue that the resulting links do not justify the level of parental anxiety that we are witnessing today.

Explicit sexual material on the Internet is not the result of an unfounded moral panic. Anyone that takes the time to conduct a few experiments as detailed above will realise that this is a most serious issue. The survey also determined that 71 percent of the sexual images on the newsgroups surveyed originate from adult-oriented computer bulletin-board systems (BBS) whose operators are trying to lure customers to their private collections of X-rated material. There are thousands of these BBS services, which charge fees (typically $10 to $30 a month) and take credit cards; the five largest have annual revenues in excess of $1 million.

This finding is a valuable one. Contrary to what seems to be popular belief, explicit material is not being circulated by “perverted socially reclusive computer nerds”. This is a commercial activity. As long as people are willing to pay for it, it will be supplied. This is not a new problem that society faces. Prostitution and drug trading are other older facets of this same concept. The Internet has simply brought a new face of the same issue. Perhaps the most disturbing discovery of the Carnegie Mellon study is one that relates to the changing face of pornography.

It is no longer “just naked women”. There is great demand and inevitably great supply of “pedophilia” (nude photos of children), “hebephilia” (youths) and what the researchers call “paraphilia” (“a grab bag of deviant material that includes images of bondage, sadomasochism, urination, defecation, and sex acts with a barnyard full of animals”). Anti-censorship activists often argue that censoring the net “makes no difference” because “obscene” material is available from any old corner shop. These newer “types” of pornography may actually render this argument obsolete.

Children are certainly exposed to material that even the most adventurous of them would not have normally come across.

Section 2

Most societies like to think of themselves as at least doing something to limit the development of “problems” such as pornography on the Internet. Governments of the United Kingdom and the United States have both taken legislative steps towards this effect. It has not been easy in either case, and the outcomes have arguably been altogether unsatisfactory. United Kingdom legislation includes several statutes that are of particular interest.

Section 1(1) of the 1959 Obscene Publications Act provides the following test for obscenity: “For the purposes of this Act an article shall be deemed to be obscene if its effect or (where the article comprises two or more distinct items) the effect of any one of its items is, if taken as a whole, such as to tend to deprave and corrupt persons who are likely, having regard to all relevant circumstances, to read, see or hear the matter contained or embodied in it. ”

This definition of “obscene” stems from the opinion of Lord Cockburn in Regina v. Hicklin (1868), which enunciated the first important guide for determining what material is obscene. It is open to serious criticism. The fundamental problem with this definition is that it can condemn material that deals with sex legitimately. Section 43 of the Telecommunications Act 1984 makes it an offence to send by means of a public telecommunications system “a message or other matter that is grossly offensive or of an indecent, obscene or menacing character”, and this is an imprisonable offence with a maximum term of six months.

When carefully scrutinised, it is clear that the Act itself does not penalise the act of procuring a message to be sent. As usual, loopholes abound. When a telecommunication system located outside the jurisdiction is used to send obscene materials into the country, no offence has been committed. The 1984 Act will also not apply to cases where the data is transmitted over a local area network, unless part of the transmission is routed through a public telecommunications system.

Even though UK legislation has recently been amended by the Criminal Justice and Public Order Act 1994 (CJPOA 1994) in order to keep up with technological changes, there are still wrinkles in its enforcement with respect to the Internet. United States legislation has gone through many different tests of “obscenity”, for reasons too many and varied to be discussed here; the current definition of obscenity is based on several conditions as opposed to a single one. On June 14, 1995, the Senate debated and voted on Title IV of the Telecommunications Competition and Deregulation Act of 1995 (S. 652).

Also known as the Communications Decency Act of 1995, it proposed to amend Section 223 (47 U.S.C. 223) to read: “Whoever, by means of telecommunications device knowingly makes, creates, or solicits, and initiates the transmission of, any comment, request, suggestion, proposal, image, or other communication which is obscene, lewd, lascivious, filthy, or indecent, with intent to annoy, abuse, threaten, or harass another person,” will be charged with a felony punishable by a fine of up to $100,000 or up to two years in prison, or both. There are two immediately obvious reasons why the CDA was doomed to fail.

Firstly, it may have severely restricted the flow of information and free speech. In its online analysis, the Electronic Frontier Foundation (EFF) stated: “[T]he legislation not only fails to solve the problems it is intended to address, but it also imposes content restrictions on computer communications that would chill First Amendment-protected speech and, in effect, restrict adults in the public forums of computer networks to writing and reading only such content as is suitable for children.” A second problem was the uncompromising nature of the bill.

A particular example would perhaps best illustrate this point. An online appeal from an organisation called Computer Professionals for Social Responsibility (CPSR) stated the following: “This proposed law could mean the demise of the Seattle Community Network (SCN), a 6,500-member free, public access computer network established to benefit the public. Under the proposed legislation, if an individual member of the SCN posted a message on an SCN forum or from SCN that was later deemed to be “indecent,” SCN could be fined $100,000, and SCN’s board of Directors and staff could face two-year prison sentences.

Yet without community networks like SCN, the Internet would be out of reach to millions of citizens.” The Senate approved the Act by a vote of 84-16. It is easy to admire the ideals that the CDA stands for; the approach of the legislation, however, is much too naive. In June 1997 the CDA was declared unconstitutional by the Supreme Court on the grounds that it “violates constitutional guarantees of freedom of expression”. The general problem with legislating for the Internet is that regulation has for so long been focused on traditional media channels such as radio and television.

The Internet is a new and extreme form of media, and it is thus incompatible with existing laws. New and equally extreme laws would be necessary to regulate it. However, as experience with the CDA has shown, extreme laws are not very popular and are likely to be contested and overturned.

Section 3

Even the most zealous censorship advocate would be blind not to realise that a world-wide censorship programme would be futile. National laws and community standards vary from nation to nation and even between regions of the same country.

It would be virtually impossible to get every nation on earth to agree to a single censorship act. Thus, the Internet, in its entirety, can never truly be censored. Even if the political obstacles could be overcome, there are too many sources of pornography; even the most ambitious regulatory agency could not hope to control them all. So the question of how to keep pornography away from children still remains. Following the 1996 period of extensive controversy over pornography on the Internet, many sexually explicit websites introduced password protection schemes.

Before being allowed access to the site, a password is required. To obtain a password, a user must subscribe to a company such as “Adultcheck”. Online credit card payment is required before passwords are dispensed, the idea being that minors will not have access to credit cards. It seems like an attractive solution: legislation could be used to compel the pornography industry to regulate itself in order to safeguard its substantial profits. There are two reasons, however, why this would be an incomplete solution.

Firstly, many websites would be outside the jurisdiction of the legislation and thus free to operate without password schemes. There would probably even be a significant number of renegade sites that would ignore the legislation and get away with it. Secondly, many people simply give away their passwords on bulletin boards, which defeats the whole purpose of the scheme. The solution that parents and educators are currently turning to most is the censor software approach. Censor or filtering software works by denying access to “objectionable” websites on the Internet. There are two main reasons why this approach is inadequate at best.

The first drawback relates to technological imperfections and was well highlighted in a study titled “Faulty Filters”, released by the Electronic Privacy Information Centre (EPIC) in November 1997. The study involved conducting up to 100 searches using a normal search engine and then repeating the same searches via one of the new search engines that claim to return only “family-friendly” links. It was found that the family-friendly search engine “typically blocked access to 95-99 percent of the material available on the Internet that might be of interest to young people”.

The study also noted that even when strict “blocking criteria” were used, links to “objectionable” material still showed up. The conclusion of the study seriously questioned the filtering-software approach to Internet censorship on the basis of its potential to “ultimately diminish the educational value of the Internet”. The study is not optimistic about any improved performance of filtering software with the passage of time. The second drawback concerns a much more sinister aspect of censor software. In the February 1998 edition of “.net” magazine, an article about filtering software presented some disturbing facts.

The article revealed that Solid Oak’s CYBERsitter, a popular filtering program, actually blocked the sites of groups such as the National Organisation for Women and the Gay and Lesbian Alliance Against Defamation. An even more worrying revelation was that among the list of words and phrases to be blocked by the program were “dontbuycybercitter” and “bennethasleton”, the name of an 18-year-old anti-censorship activist. This implies that there are serious political and commercial agendas behind the facade of “censorship”. This may perhaps be the general problem with censorship.

It tries to impose a particular rigid set of morals and values on everybody concerned, and at times is abused in order to advance other hidden agendas. Whatever the solution to the problem of pornography on the Internet, it is unlikely to have anything to do with technically unsophisticated software that considers “gay rights” to be an objectionable topic. In August 1995, at the Massachusetts Institute of Technology (MIT), a consortium of twenty-two companies held a conference to discuss the need to create self-imposed ratings systems which would allow parents to control what children are allowed to see on the Internet.

The result of the group’s work, to date, is the Platform for Internet Content Selection (PICS). PICS is similar to the V-chip technology that was suggested for shielding children from sex and violence on television. In this case websites voluntarily “label” their content. What this technology may mean for parents is the ability to specify viewing levels for their children according to what they feel their children should be exposed to. Intuition suggests that a complete censorship program would be practically impossible.

There are too many obstacles including the inability of current legislation to deal with the unique nature of the Internet without resorting to extreme and unpopular measures. The Internet, to be utilised to its full value, simply must remain as it is. Protection of minors from “obscene” material must be a sort of add-on. Filtering software is an attractive concept, but as discussed above, has enough shortcomings to make it a questionable approach. The only hope lies in the advancement of technology.

The MIT consortium and its work with PICS illustrate the kind of solution that would be most ideal. PICS is simply a technical standard that will supposedly allow concerned parents and educators to tighten their control over what children can see on the Internet. Unlike filtering software, it does not impose any scheme of values on its user population or advance any hidden agendas. Hopefully, with the advancement of fifth-generation computers and artificial intelligence, schemes such as this will eventually approach minimal defect.

The question that will then remain will be much more of a human one. Technology can never really replace parental guidance. Children absorb sexually explicit material because of their basic curiosity, which stems from inexperience in the ways of the world. Parents and educators must recognise their duty to nurture the growth of children and encourage broad-minded yet ethically aware mental development. Only then will society be facing up to its basic problems as opposed to merely trying to diminish their effects.

The Evolution of the Internet

So you believe Al Gore created the Internet? Well, that’s not possible, because I did. Yes, it’s true, a few years ago I was sitting in my basement with nothing to do and suddenly the idea came to me: why not create an inter-connected network of networks that will allow users to send mail instantly, download copyrighted songs, and order pizza, all from the comfort of their own living room? OK, so maybe I didn’t exactly invent the Internet, but neither did Al Gore. So who was the genius behind the information superhighway, you ask?

Well, let’s take a step back to the sixties, a decade when Cold War tension caused nationwide fear of nuclear warfare. Early in the decade, two groups of researchers, the privately owned RAND Corporation (America’s leading nuclear war think-tank) and the federal agency ARPA (Advanced Research Projects Agency), grappled with a bizarre strategic mystery: in the event of nuclear war, how could political and military officials communicate successfully? It was obvious that a network, linking cities and military bases, would be necessary.

But the advent of the atomic bomb made switches, wiring, and command posts for this network highly vulnerable. A nuclear-safe network would need to operate with missing links and without central authority. In 1964, RAND Corporation’s Paul Baran made public his solution to the problem. Essentially, the concept was simple: Baran’s network would be assumed to be unreliable at all times. Information would be broken into many small pieces called packets and then sent to various points, or nodes, in the network until they reached their destination.
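
To make the packet idea concrete, here is a minimal, purely illustrative Python sketch (not Baran's actual design) of splitting a message into individually addressed packets, letting them arrive out of order, and reassembling them at the destination; the packet size and node names are invented for the example.

    import random

    PACKET_SIZE = 8  # bytes of payload per packet (illustrative only)

    def packetize(message, src, dst):
        """Split a message into small, individually addressed packets."""
        data = message.encode("utf-8")
        packets = []
        for seq, start in enumerate(range(0, len(data), PACKET_SIZE)):
            packets.append({
                "src": src,      # originating node
                "dst": dst,      # destination node
                "seq": seq,      # sequence number used for reassembly
                "payload": data[start:start + PACKET_SIZE],
            })
        return packets

    def reassemble(packets):
        """Reorder packets by sequence number and rebuild the original message."""
        ordered = sorted(packets, key=lambda p: p["seq"])
        return b"".join(p["payload"] for p in ordered).decode("utf-8")

    packets = packetize("Messages survive even if links vanish.", src="UCLA", dst="SRI")
    random.shuffle(packets)  # packets may travel different routes and arrive out of order
    print(reassemble(packets))

Because every packet carries its own addressing, any surviving node can forward it by whatever route still exists, which is exactly the property that made the scheme attractive for a damaged network.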

ARPA embraced Baran’s idea for three reasons. First, if nuclear bombs blew away large components of the network, data would still reach its destination. Second, it would be relatively secure from espionage, since spies tapping into parts of the network would be able to intercept only portions of transmissions. Lastly, it would be much more efficient, because files and transmissions couldn’t clog portions of the network. Only five years after Baran proposed his version of a computer network, ARPANET went online.

Named after its federal sponsor, ARPANET initially linked four high-speed computers and was intended to allow scientists and researchers to share computing facilities over long distances. By 1971, ARPANET had grown to fifteen nodes, and by 1972, thirty-seven. ARPA’s original standard for communication was known as Network Control Protocol, or NCP. As time passed, however, NCP grew obsolete and was replaced by a new, higher-level standard known as TCP-IP, which is still in use today.

TCP, or Transmission Control Protocol, is responsible for converting messages into streams of packets at the source of the transmission and then reassembling the streams at the final destination. IP, or Internet Protocol, is responsible for routing packets across multiple nodes and networks, even those using the original NCP standard. The Internet, as it came to be called, spread like wildfire. Since software for TCP-IP was available to the public and the technology for networking was decentralized by its very nature, nodes and networks easily joined in.
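
To see what the TCP-IP service looks like to an application today, here is a small Python sketch that opens a TCP connection and exchanges a message; the operating system's TCP/IP stack does the packetizing, routing, and reassembly described above. The loopback address, port number, and message are placeholders chosen for the example.

    import socket
    import threading
    import time

    HOST, PORT = "127.0.0.1", 50007  # placeholder address and port for this sketch

    def echo_server():
        # A one-shot TCP server: accept a connection, read the data, echo it back.
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
            srv.bind((HOST, PORT))
            srv.listen(1)
            conn, _ = srv.accept()
            with conn:
                data = conn.recv(1024)        # TCP hands the application an ordered byte stream
                conn.sendall(b"echo: " + data)

    threading.Thread(target=echo_server, daemon=True).start()
    time.sleep(0.2)  # give the server a moment to start listening

    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(b"hello over TCP/IP")     # IP routes the packets; TCP reassembles them
        print(cli.recv(1024).decode())        # prints: echo: hello over TCP/IP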

Each node covered its own expenses, and thus there was no one to disapprove of expansion. Like the telephone, the Internet was relatively useless without universal participation. In 1989, ARPANET was retired in favor of the TCP-IP-based Internet, which already contained over 300,000 nodes. Practical applications for the Internet popped up everywhere. Graduate students at North Carolina and Duke invented an electronic bulletin board called Usenet, and researchers at the University of Minnesota developed a primitive search engine called Gopher.

In 1991, the foundation for the modern Internet was laid when Tim Berners-Lee, working at CERN in Switzerland, released his system of Internet hypertext. Prior to hypertext, Internet use was limited to those who knew the commands needed to communicate through the text-based network. Hypertext allowed users to link words, pictures, and files with mouse clicks. It wasn’t long before University of Illinois student Marc Andreessen and fellow computer programmers created the Mosaic Web browser, which popularized graphical browsing. The introduction of Web browsers spawned a new era of communication and, in response to the 1993 release of Mosaic, Internet use grew 341,634%.

You may have noticed that Al Gore’s name was never mentioned in this brief history of the Internet. While Gore did in fact help secure government funding for a related project in the 1980s, he did not invent the Internet. In fact, there is no specific creator of the Internet, nor is there a date when the Internet came into being. It evolved over thirty years from the government’s means of post-nuclear-war communication into today’s Information Superhighway. The number of hosts now far exceeds ten million, and hundreds of millions of users from over 150 countries are connected.

What is WAP

WAP stands for Wireless Application Protocol. The idea was developed by some of the wireless telecommunications giants, such as Nokia and Ericsson. WAP relies on a gateway between the mobile network and the Internet, bringing the Internet and the services it provides into our lives while we are on the move, right to the screen of our mobile phone. WAP offers the possibility to call up specific WAP pages; these pages can be seen on the display, but their presentation is reduced, without illustrations and charts.

The possibilities of WAP are nearly endless: entertainment, sending messages, calling up sports results, stock exchange quotations, arrivals and departures of airplanes and trains – (nearly) everything is possible. Especially in business, WAP seems to be the “star performer” for corporate communication solutions.

Trends in WAP

Customers willing to make use of these new possibilities need a WAP phone. This is a GSM mobile phone with a built-in modem and WAP browser. The data is transmitted at the usual GSM transmission rate of 9.6 kbit/s.

As to consumers’ interest in WAP, there are still controversies. Mobile phones recorded very high sales in the UK during the second quarter of 2000; the sales of WAP phones, however, were not as good. At the end of May 2000 there were more than 150,000 WAP pages worldwide, and the number of WAP pages continues to increase at explosive growth rates (Forrester Research, 2000). International analysts predict that the demand for appropriate equipment will increase rapidly in the coming few months.

After the online boom of the 1990s, WAP is expected to be the next growth industry of the twenty-first century. The conditions seem to be ideal: consumers’ interest in online services and e-commerce continues to trend upward. Worldwide, many millions of people already use mobile phones, and their number is still growing. WAP is based on GSM technology, the world’s most widely used mobile phone system, which has become the standard on all continents.

(Only in the USA are there other systems competing with GSM.) Forrester Research predicts in its study that in 2002 more than 100 million people worldwide will use WAP phones. A forecast by the manufacturer Ericsson confirms this trend: in 2001 nearly 50% of mobile phone users are expected to take advantage of WAP. In a study conducted by the NOP Research Group among Internet users in Great Britain, France and Germany, 9%, 22% and 11% respectively declared their intention to buy a WAP phone within the next 12 months.

A direct connection between Internet experience and the intention to use WAP services already exists: in the three countries, only 8% of the interviewed persons describing themselves as an “Internet beginner” were interested in buying a WAP phone, while 28% of the “Internet experts” confirmed their intention to buy one. According to Kaufcom (Wapitout), 200,000 WAP phones had been sold in Switzerland by the end of May. At that time more than 40,000 persons in Switzerland made daily use of WAP – almost one out of five persons in possession of a WAP phone went online! The pioneers regarding Internet surfing are the Japanese.

When Europe started the first WAP tests, Japan saw a real boom in mobile data services. This boom started with “i-mode”, a service offered by the world’s largest mobile telephony provider, NTT DoCoMo Inc., that used a protocol similar to WAP technology. Despite additional costs of approximately 35 ATS per month, this service recorded more than 3 million users within 18 months of its start.

Blogging: It’s for everyone

Recently, Merriam-Webster announced that, based on “online lookups,” the number one word of the year was “blogs” (Morse, Page 1). Their definition of a blog is “a web site that contains an online personal journal with reflections, comments, and often hyperlinks provided by the writer” (Morse, Page 1). This definition is inaccurate based on my research, as blogs are not always “personal” and can include more than one author. Throughout my research, many bloggers in the blogosphere have referred to as blogs websites that discuss business only, or business and personal details, and that contain more than mere “reflections” of a personal nature.

As blogs become more popular and affect different forms of communication to an ever greater degree, I am confident that the definition of blogs will morph closer to my definition (short for weblog, a web site that contains an online journal including, but not limited to, reflections, comments, and often hyperlinks provided by the writer(s)) than to the Merriam-Webster definition. This paper will discuss blogs (what they are), bloggers (who they are), blogging (should you do it and is it profitable), and the impact of blogs on media.

I will start by talking about how blogs started, and who some bloggers are. Next, I will discuss the amount of revenue that can be made, and how that revenue is made, from starting a blog. Finally, I will show the impact blogs have had on the mainstream media, specifically the most recent Presidential Election. The culture of the internet has created a subculture of bloggers that, as evidenced by the number of persons looking to find a definition of the word (however inaccurate the definition may be), is growing in popularity and is therefore a pertinent topic for people to be informed about.

Blogging started, albeit without a proper name and with an even more vague definition, as soon as the internet was invented. Just as writing a journal started with the first writers thousands of years ago, blogs arose at the same time as the medium of the internet was born. This created some new challenges to the conventional writer. According to The Handbook of Digital Publishing, the greatest strength of publishing online material is “displaying the interrelated nature of information connected with hyperlinks” (Kleper, Page 197). The use of hyperlinks is extensive in blogs, confirming Kleper’s thesis.

The value of hyperlinks is determined by the author of the blog and how he or she chooses to use them. Mostly, I have found hyperlinks used as a reference to, and complement of, the idea the blogger is trying to impress upon the reader. For instance, if I’m writing a blog about Winston Churchill, I can create a hyperlink to an encyclopedia entry online describing Mr. Churchill, which will let the reader of my blog know who I am talking about and give an impression of the context in which I reference Mr. Churchill.

I don’t have to provide a biography of Mr. Churchill in the blog, and waste the time of those who know of him already, but others who don’t know of Mr. Churchill (they should learn!) can click on the link and more fully understand what I am writing about. This is a valuable tool for the writer if used correctly. Writing with hyperlinks is different from traditional serial prose writing, but not excessively different. The main difference between traditional writing and writing for a blog is that a blog is a cumulative piece of writing that needs to be continuously updated and revised (Kleper, Page 194).

Once you write an article in print, that article is done. Blogs need to be written and added to over and over, especially considering “the prospect of finding timely information with each site visit is among the strongest incentives for repeat visits” (Kleper, Page 194). Keep in mind, however, that although repeated revision and an eye-catching web page are preferred over bland pages, observations suggest that after the first visit, the usefulness of the site is what ultimately motivates the user to return (Kleper, Page 196).

You can have a blog about paint drying, update it very often and work hard describing the process in great detail, but it is unlikely to become widely read. Some popular bloggers today include Hugh Hewitt (www.hughhewitt.com), Joshua Micah Marshall (www.talkingpointsmemo.com), and Glenn Reynolds (www.instapundit.com). Although I’ve chosen political bloggers to comment on here, be advised that blogging can be done by anyone with internet access about any topic. In fact, there is even a blog dedicated solely to shaving, www.shavingstuff.com (Brewer, Page 1).

The rise in the number of blogs is a testimonial to how many different topics are covered. In January of 2002, there were about 100,000 websites dedicated solely to blogging (Gard, Page 1). Today, there are over 6 million blogs (Gard, Page 1). The vast majority of bloggers are under 30, and most bloggers are under 19 (McGann, Page 1). These demographics indicate that blogging is on the rise, and many wonder if it can be a legitimate source of income. Like most things, the answer to that depends on several factors.

Time, talent, and the level of effort put into the blog most likely will determine the success the writer achieves, measured in how many readers and consequently how much income is possible. Blogging is a sole-proprietorship classification of business in an industry that follows the characteristics of monopolistic competition, namely easy entry into the industry (few barriers) and unique content among competitors (Case and Fair, Page 281). Starting a blog takes little more than an internet connection, computer, and a small monthly fee.

Because anyone can create a blog, there are many blogs around. This means the content of each blog must be unique and interesting in order to succeed commercially. Bloggers can earn income in three different ways, and, considering that the “monetization of the internet will increase 30% over the next years,” it might be profitable to learn how bloggers earn revenue (Meeker, Page 1). First, bloggers can find advertisers, companies that promote their products or services on the blog page (Rowse, Page 1). This is typical on many websites that exist today.

Banner ads, ads shown before being transferred to the desired page (pre-commercials), and pop-up ads are all forms of advertising. As a sole proprietor, the major difficulty you face is finding advertisers that want to promote their products on your site. This is where a company like Blogads comes in handy. Blogads will look at the quality of your site and number of visitors and then find companies to advertise their products on your website for you (Rowse, Page 1). The more visitors you have, the more you get paid for each advertisement.

In some cases, advertisers pay a small commission to the owner of a website that redirects a customer to their site. For example, anytime you click on a text link and get transferred to Amazon.com, if you purchase the product that linked you to Amazon, the owner of the website that transferred you to Amazon gets a commission, based on the amount you spent (Rowse, Page 1). Another way to earn income from blogging is by starting a niche blog whose content members are willing to pay to see (Rowse, Page 1).

One of the most amazing stories I found in researching this paper was that of OhMyNews.com. According to Spencer Ante of Business Week magazine, OhMyNews.com earns money through members of the site paying for its content, and “the barely profitable site has been widely credited with helping elect the country’s new President, Roh Moo Hyun” (Ante, Page 1). OhMyNews.com has individuals go out and blog different stories, then directs readers to the stories editors like the most. Many readers found out about former small-time attorney Roh Moo Hyun through OhMyNews.com, and propelled him to the presidency through their support.

Businesses that are constantly changing, like many technical industries, can also have members that will pay for constantly updated content. Techdirt.com is a blog that charges each member a monthly fee and allows access to up-to-the-minute information about online publishing (Rowse, Page 1). Many blogs have followed this model, usually for niche markets that wouldn’t appeal to laypersons outside a specific industry. There is great debate among bloggers over whether free information is the goal and ultimate aspiration of blogging, or whether the opposite view, profit through membership fees, is the future and the best way to proceed.

As of now, most blogs are free to read and no membership is required. Only time will tell whether pay-per-view blogging will become the norm or not. The third and final way to earn income from blogging is the easiest: ask your readers for donations. Many blogs have a “tip jar” that allows readers to contribute to the blog at the reader’s discretion (Rowse, Page 1). Much like PBS, some blogs can make a nice profit by simply asking for money. Andrew Sullivan, of www.andrewsullivan.com, makes “$6000 per month off of donations” (Bushell, Page 1).

In order to make any significant sum of money through donations, you would need a lot of readers, a few wealthy readers who like your site a lot, or a really good story that makes people pity you and decide to help you out. Indeed, although not reliable enough to source, there are internet rumors of people running up credit card debt and begging others to help them, with some degree of success being reported. This is probably not the best way to earn a living, but as a way to keep a blog going, it certainly can’t hurt. Making a living off of a blog is difficult right now, mainly because of the high level of competition.

Many professional bloggers are former, or current, journalists with many industry contacts and years of experience. Since political blogs, basically online editorials, are currently the most profitable blogs, without gravitas it would be hard to make any money blogging. An example of how to gain credibility, and therefore readers, which leads to revenue, couldn’t be more canonical than the story of www.powerlineblog.com.

Certainly creating more than just a “personal journal of reflection,” as the folks at Merriam-Webster would have you believe, the founders of Powerlineblog.com (Paul Mirengoff, Scott Johnson, and John Hinderaker) were a group of successful attorneys who decided to write a non-personal blog about political issues, mainly from a Midwestern perspective (they live in Minnesota). Being conservative, they were one of the first groups to notice problems with Dan Rather’s story about George W. Bush and his National Guard record. The Powerline crew got together experts who roundly criticized the “scoop” that Rather claimed he got, and eventually had so much evidence (they are all attorneys) that they demanded CBS recant the story.

CBS, and Dan Rather in particular, would not recant the story. Much like Richard Nixon, a figure Rather had treated worse than Saddam Hussein, Rather refused to admit his mistake until guilt was no longer deniable. The “scoop” was based on documents received from a known (and vocal) Bush-hater. The documents were so obviously fake that experts from CBS would not authenticate them before the story aired. The fake documents were identified as fraudulent within hours of airing once the men from Powerline got on the case, yet Rather arrogantly took over a full week to admit his error.

The three small-time attorneys from Minnesota were instrumental in forcing Rather to admit his mistake, possibly causing his early resignation. Without the power of the internet and the Powerline blog, this scenario would have unfolded much differently, although no one can be sure how. In closing, I have defined (more accurately than Merriam-Webster) what a blog is. The possibility of creating a blog is open to everyone and I would highly encourage it, but if one lesson can be learned from this paper it is this: blogging for profit is very tough.

Much like other areas of life, don’t blog for money but blog because you enjoy it. With several different ways to earn revenue, you might be able to make a little extra cash, but it would be a very tough way to make a living. As blogging matures and technology advances, there are limitless possibilities for weblogs. Being familiar with the form and concept of blogs can only be an asset for most people. Suggested areas of further research include finding advertising for your blog by attending trade shows and advertising your blog (on industry related sites).

Also, forming links with similarly minded blogs can be a worthy enterprise resulting in increased site visits. Further information regarding people other than Dan Rather who have felt the sting of bloggers can be found by researching Trent Lott and Howard Dean. As A. J. Liebling once said, “Freedom of the press is guaranteed only to those who own one” (Simpson, Page 82). With the power of a blog, everyone can now own their own version of Liebling’s press, and the power of that concept is freedom at its highest form.

Children and the Internet

Although the U.S. created the Internet in the 1960s as a communications tool for the military, it was not until after the government opened it to the public in the late 1980s that the Internet became a unique communications phenomenon. Nobody could predict the speed by which people all over the world grabbed onto this new form of technological communication. In 1995, there were an estimated 56 million Internet users worldwide; by 1999, this figure is expected to rise to 200 million.

This tremendous growth has caused something our world has never seen before; for the first time in history, the governments of this planet are facing something that is larger than all of them combined . . . and they are terrified. A wealth of information is readily available to those who possess the technological means to access and contribute to it. It is the place where “any person can become a town crier with a voice that resonates farther than it could from any soapbox” (Sheremata 22). This has made the Internet a very powerful and positive forum for free expression.

Parents, however, are concerned that the Internet makes pornographic, hateful, violent, profane and destructive content easily accessible to their children. But who is ultimately responsible for keeping the Internet safe for children: parents, educators, Internet service providers or the government? And how can one regulate this new form of communication without infringing on peoples’ right to freedom of expression? Some would say that the Internet needs to be regulated by eliminating all pornographic, destructive, violent and hateful web sites. This would ensure the protection and welfare of everyone’s children.

Pornography and “adult-oriented sites” are the main part of the Internet that parents do not want their children to have access to. From text to images, the graphic portrayals of almost every form of sexual activity are available to anyone regardless of age or gender. Without any restrictions one may view these images or read these “stories” within a few minutes of logging onto the Net. Making Internet service providers (ISPs) delete all the pornography may be the key to getting rid of Internet smut. Hate propaganda is not a recent symptom of the Net but has grown significantly in the past few years.

The Internet has more than 150 extreme web sites, offering a vast array of racist literature and graphics. The propaganda that these racist sites publish is believed to be detrimental to children’s welfare and mental stability. Anti-Semitic views and harassment are also part of the hate propaganda expressed on the Net. Many people think that the Internet should be free from such racism and hate, and that getting rid of these web sites would be beneficial for all. Another problem with the Internet is all the sites that contain destructive and violent texts.

Recipes for things like how to make pipe bombs have parents worried. Parents become very frightened for the safety of their children when those children can make a bomb from household materials. About two years ago at a bus terminal near Erminskin Shopping Centre in Edmonton, Alberta, a teenager placed a homemade pipe bomb inside a watermelon; it blew apart not only the watermelon but the bus terminal as well. Such recipes for disaster are everywhere on the net and should be eliminated before more children get hurt or killed. Children are vulnerable to what they see and read on the Internet.

Pornography, hate propaganda, and violent content are upsetting and confusing, can give children incorrect information, and can also be emotionally destructive. Children do not have the knowledge to decide whether this material is good for them or not and cannot differentiate between healthy and unhealthy sexual activity. Children are easily influenced and should not be able to view or access such sites which contain pornography or hate propaganda. Canadian law states that a person must be 18 or older to purchase erotica, yet a child of nine can access such pictures on the Net that would make most adults sick.

Hate propaganda on the net may recruit kids into cults or influence their beliefs so that they exhibit racist behaviours. Also, any user can enter a hate web site and download pages of text and graphics without any legal hindrance or technological impediment. Children must not be exposed to such ideas and beliefs, for their own safety. Destructive and violent texts also need to be kept at arm’s length from children, because children are curious by nature. They may attempt to make some of these homemade recipes from the Net.

Children have had their hands blown off before, when attempting to make pipe bombs from a recipe off the Internet. The Internet provides access to more pornography and hate speech than parents can handle, but it also provides the tools to let them protect their children from such evils. Many markets have an incentive to regulate themselves, competing to offer consumers protection from unpleasant experiences. Parents can choose Internet access from service providers that can help them keep their offspring away from offensive material.

America Online (AOL) is one of the ISPs that offer parental controls for differing ages, such as child, teenager, and adult. The government is also regulating ISPs in order to make them accept some of the responsibility for the material that they circulate over their networks on the Internet, and they will also be required to obtain a license. To guarantee that minors are not accessing inappropriate material, content providers will be required to register all users who visit their sites to verify their age.

This registration process will prevent anonymous browsing and speech online, since the user will have to identify themselves in order to enter that specific site. Also, filter and blocking software can control the presence and accessibility on the Internet of material judged objectionable by parents. The user-based filtering software is presented as an option for parents and educators so that they, instead of governments or ISPs, can engage in blocking sites considered unsuitable for children.

Some better-known filters that block keywords, phrases, or certain blacklisted sites are Net Nanny, SurfWatch, CyberPatrol, and CyberSitter. None of these current filtering tools is 100% effective, however; most achieve around 97% effectiveness. The newest filter on the market, My Father’s Eyes, is claimed to be the first and only Internet filtering device which is 100% secure and guarantees that children will not be exposed to material which is either questionable or offensive. However, blocking and filtering software could, in a worst-case scenario, lead to a bland and homogenized Internet for children.
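
To illustrate the general approach such products take (this is a simplified sketch, not the actual logic of Net Nanny, SurfWatch, CyberPatrol, CyberSitter, or My Father's Eyes), a filter can combine a blacklist of sites with a list of blocked keywords; the domains and keyword lists below are invented for the example.

    # Simplified illustration of blacklist plus keyword filtering; not any vendor's real rules.
    BLOCKED_SITES = {"example-adult-site.com"}   # hypothetical blacklisted domain
    BLOCKED_KEYWORDS = {"xxx", "porn"}           # hypothetical blocked keywords

    def is_blocked(url, page_text=""):
        """Return True if the URL or page content matches the blacklist or keyword list."""
        url_l, text_l = url.lower(), page_text.lower()
        if any(site in url_l for site in BLOCKED_SITES):
            return True
        return any(word in url_l or word in text_l for word in BLOCKED_KEYWORDS)

    print(is_blocked("http://example-adult-site.com/index.html"))   # True (blacklisted site)
    print(is_blocked("http://www.example.edu/", "human anatomy"))   # False (no match)

Exactly this crudeness is what produces both kinds of failure: an innocuous page that happens to contain a blocked word is refused, while an objectionable site missing from the lists slips through.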

The vast majority of the content on the Internet is not porn, but people who want adult content can find it. Child pornography is on the Net as well and should be altogether wiped out; however, some forms of pornography may, in certain instances, provide a useful social service, such as “relief for the lonely or education for the dysfunctional” (O’Connell, 36). Pornography is legal, as anyone can see by looking behind the counter at their local convenience store. The pornographic magazines may be wrapped in brown paper but they are still there.

Of course, children do not have access to this pornographic medium. They do, however, have access to the Net through their homes, schools, and libraries. A recent survey by SurfWatch showed that nearly all parents whose children have Internet access are aware of the dangerous material on the Internet, but many have no idea how much time their children are spending on it. Parents need to pay attention to what their children are doing, and even closer attention to how they are using the Net.

The protection of children must also extend outside of the home, to their schools and their libraries. Presently, many schools and libraries at the elementary, junior and senior levels are beginning to use these filtering devices to block certain aspects of the Internet that they deem questionable. This still presents the problem of censorship, but at a minimal level and for children rather than adults. People may not like pornography or racist views, but the right to freedom of expression guarantees that people can have those views and post that material.

Not only is anonymous communication another class of constitutionally protected speech, but it is a vital component of online interaction. David Jones, a professor at McMaster University in Hamilton, Ont., has found that “the more experience people have on the Net, the more they appreciate its openness” (5). Jones also added that, “people come to realize that they can make choices themselves and [that] they don’t need some bureaucrat to decide what they can [or cannot] see” (3). The Internet is an adult environment, with all the problems of the real adult world.

Therefore, a system of restrictions should be placed on children below the age of 18. Getting rid of certain forms of expression on the Net diminishes what it stands for – freedom of expression. People who are over the age of 18 need not be censored, for they are presumed to be adults who can make the necessary decisions as to what they wish to view or read on the Internet. Most authors using electronic media do not produce material that is any “worse” than that available from newsagents, video shops, or mail-order sources.

What is new is that all types of material are equally, and easily, accessible to everyone. Regulation of the Internet is not needed but one must regulate children’s access to it. Through the use of ISPs that have parental controls or by using your own filtering software one may keep all the “evils” of the Net away from their children. Freedom of expression is not limited in this process but children’s viewing capabilities are. However, since kids are usually more computer smart than their parents, keeping one step ahead of them in the technology arena can be quite a trick.

To censor the Internet or let the government handle it would be ultimately the worst thing that could happen. Where would the government stop once they had this power to eliminate ideas that aren’t in their “best interests”? We do not live in Singapore where they restrict the Internet. The Internet does not present the views of a few privileged speakers, but rather allows all participants to publish, comment on, and even refute, what they read. Therefore, any attempt to regulate the Internet would impact heavily, and negatively, on free expression.

The Quantitative Challenges from Click stream Data

A common thread through all techniques discussed is the need for data. Fortunately, a natural byproduct of users accessing WWW pages is a dataset that contains the sequence of URLs they visited, how long they viewed them, and at what time. This dataset is called the click stream. To maximize its potential, managers can merge the click stream with demographic and purchase information. Three potential sources exist for collecting click stream data: (1) The host server (the computer of the site being visited) keeps a record of visits, usually called a server log.

As a user requests a page, the server records identifying information (IP address, previous URL visited, and browser type) in the server log; a minimal parsing sketch appears after this enumeration of sources. (2) A third party can capture information about web requests. For example, if a user contacts an Internet Service Provider (ISP) or Commercial On-line Service (COS), such as AOL, it can record any requests the user makes as it relays them to the requested server. Because many ISPs and COSs cache their users’ requested pages, they do not pass all requests on to the server; instead, they serve many pages from local cache archives to speed up responses to user requests.

Unfortunately, this means that server logs contain only a subset of the viewings that occur. Dreze and Zufryden [1998] discuss some of the challenges of using server log data to measure advertising effectiveness. (3) A final, and perhaps the most reliable, source of click stream data is a program installed on the computer where the browser program is running that can “watch” the user and record the URLs of each page viewed in the browser window as well as other application programs that the user is running.

It records the actual pages viewed, and thus avoids the problem of cached requests. Such a program can also record how long windows are active. The drawback is that the analyst must choose the individuals and obtain their consent to participate in such a panel. Generally web users are randomly sampled to construct a representative sample. The information from this sample can be projected to the national population using statistical inference. The largest provider of such information is Media Metrix [Coffey 1999].
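
Returning to source (1), the sketch below shows how a single server-log entry in the widely used combined log format can be parsed into the identifying pieces mentioned above (IP address, referring URL, and browser type); the log line itself is a made-up example, and real logs vary in format.

    import re

    # A made-up entry in combined log format (Common Log Format plus referrer and user agent).
    LOG_LINE = ('192.0.2.17 - - [12/Mar/1999:10:31:08 -0500] "GET /products/index.html HTTP/1.0" '
                '200 5120 "http://www.example.com/home.html" "Mozilla/4.0"')

    PATTERN = re.compile(
        r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] "(?P<request>[^"]*)" '
        r'(?P<status>\d{3}) (?P<bytes>\S+) "(?P<referrer>[^"]*)" "(?P<browser>[^"]*)"'
    )

    match = PATTERN.match(LOG_LINE)
    if match:
        record = match.groupdict()
        # The fields a site can learn about each request from its own log.
        print(record["ip"], record["time"], record["referrer"], record["browser"])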

The click stream of an actual user session as collected by Media Metrix shows that the user frequently views the same page repeatedly and sometimes pauses to do other tasks between page views (for example run other applications or watch television). Only five of the 12 viewings the user requested could generate a “hit” to the server. This illustrates the advantage of collecting data at a user’s machine and not from a host site since it includes all requests, eliminating a potential source of bias. Information about where and how frequently users access web sites is used for various tasks.

Marketers use such information to target banner ads. For example, users who often visit business sites may receive targeted banner ads for financial services even while browsing at non-business sites. Web managers may use this information to understand consumer behavior at their site. Additionally, it can be used to compare competing web sites. Members of the financial community use such information to value dot-com companies. Analysts use click stream information to track trends in a particular site or within a general community.

Financial analysts find this type of intelligence useful for assessing the values of companies because many traditional accounting and finance measures can be poor predictors of firms’ values. Another use of click stream data is to profile visitors to a web site. Identifying characteristics of visitors to a site is an important prerequisite of personalization. One way to find out characteristics of visitors is to ask them to fill out a survey. However, not everyone is willing to fill one out, creating what is known in marketing research as a self-selection bias.

The information may be inaccurate as well; for example, visitors may give invalid mailing addresses to protect their privacy or inaccurately report incomes to inflate their egos. Also, completing a survey takes time, and the effort required may severely skew the type of individuals that complete it and the results. An alternative way to profile users is with click stream data. The demographic profiles of sites reported by companies like Media Metrix can be used to determine what type of individuals visit a site. For example, Media Metrix reports that 66 percent of visitors to ivillage.com are female.

Even without knowing anything about a user except that they visit ivillage.com, the odds are two to one that a visitor is female. This is quite reasonable because ivillage.com offers content geared toward issues of primary concern to women. Some gaming sites appeal primarily to teenage boys, and sports sites may draw predominately adult men. On the other hand, such portals as Yahoo! and Excite draw audiences that are fairly representative of the web as a whole. Media Metrix can identify demographic characteristics of visitors using information provided to them by panelists.

However, knowledge of the web sites visited by a user, together with profiles of these web sites (that is, the demographic characteristics of a sample of their visitors), is enough to make a good prediction about a visitor’s demographics. For example, suppose we wish to predict whether a user is a woman. In general, about 45 percent of web users are female. Therefore, without knowing what sites a person visited, one would guess that there is a 45 percent probability of the user being female and a 55 percent probability of being male.

If forced to choose, one would guess the user to be male, but this would be an inaccurate guess since the odds are almost equal. However, if one knows that this individual visited ivillage.com, whose visitors are 66 percent women, the hypothesis that this user is female can be improved. This is a Bayesian hypothesis-updating problem, and an analyst could apply Bayes’ formula to recompute the probability that the user is female using this new information: the prior probability of being female is p0 = 0.45, the new information from the site profile is p = 0.66, and updating yields a posterior probability for our hypothesis of p1 = 0.62.

In other words, the probability that this is a female user has increased from 45 percent to 62 percent. While most of the web sites visited by this individual indicate the user is most likely a female, some of the sites visited (aol.com, eplay.com, halcyon.com, lycos.com, and netradio.net) might point to the individual being male. However, based on information from all 22 sites, the probability that the user is female is 99.97 percent.
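
A hedged sketch of this kind of updating in Python follows. It assumes a simple independence-based (naive-Bayes style) combination rule consistent with the figures quoted above (a 0.45 prior and a 0.66 site profile giving a posterior of roughly 0.62); the exact formula used in the original analysis may differ slightly, and the site profiles below are invented for illustration.

    def update(prior, site_female_share):
        """Combine the current probability that the user is female with one site's profile,
        treating the site profile as independent evidence (naive-Bayes style)."""
        num = prior * site_female_share
        return num / (num + (1 - prior) * (1 - site_female_share))

    # Invented example profiles: the share of each site's visitors who are female.
    site_profiles = {"ivillage.com": 0.66, "lycos.com": 0.48, "crafts.example.com": 0.70}

    p = 0.45  # prior: share of all web users who are female
    for site, share in site_profiles.items():
        p = update(p, share)
        print(f"after {site}: P(female) = {p:.2f}")

Each additional site visit nudges the estimate up or down depending on whether that site's audience skews female or male, which is how a long click stream can drive the posterior toward a figure as extreme as 99.97 percent.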

To assess the accuracy of this technique I applied it to actual usage information from a sample of 19,000 representative web users with one month of usage and known gender. There is a great deal of variation among users in this sample: some visit only one site, while others visit hundreds. If the model predicts more than an 80 percent probability that the user is male, then the user is classified as male; similar predictions are made for female users. There was enough information to classify 60 percent of users as either male or female. Most users classified as male are male, and 96 percent of users classified as female are female.

The agreement between the predictions and the users’ actual genders validates the accuracy of these techniques. More advanced techniques that appropriately account for statistical dependence between web site visits can increase the accuracy of these predictions. In this example, all of the web sites visited by a user were known, but this is not typical. However, many web advertising agencies, such as DoubleClick and Flycast, have such information for their advertising partners since they serve all their partners’ banner ads.

These partners give them wide coverage of the web. User profiling techniques allow them to accurately predict characteristics of their users without ever having to ask a user to fill out a survey. Predicting the gender of an individual seems innocuous, but the same techniques can be applied to other, more sensitive demographic variables, such as income (for example, does a user make more than $75,000?). Just as there are sites that provide valuable information about gender, there are sites that provide information about income (for example, millionaire.com, wsj.com, businessweek.com).

While this example illustrates that these techniques can be used successfully, questions about privacy need to also be considered before using these techniques in practice. The collection, processing, and use of click stream data is not cost free, as some would claim. A major portal can collect 30 to 50 gigabytes of information from web accesses each day in a server log. Even a small site will generate several megabytes. To analyze these large datasets, managers need data mining and statistical techniques.

These techniques generally require skill, and their inaccurate application can result in poor or misleading predictions about visitors and what they want to see. If, for example, a visitor is wrongly classified as male, the visitor could be shown messages that would not appeal to a female user or, even worse, cause her to leave the site and never return. Therefore, to apply these techniques appropriately, one should take into account the value of personalization and the costs of misclassifying a visitor.

Cookies and Internet Privacy

Netscape’s Client Side State definition: Cookies are a general mechanism which server side connections (such as CGI scripts) can use to both store and retrieve information on the client side of the connection. The addition of a simple, persistent, client-side state significantly extends the capabilities of Web-based client/server applications. Kington, Andy, Andy’s HTTP Cookie Notes, Available from http://www.illuminatus.com/cookie_pages/ [modified 6 June 1997, cited 14 March, 1999] In English, this means that webservers can create web pages that will be customized from user to user.

By saving these preferences on your computer, the web page can reload according to your chosen options. This is accomplished by retrieving the cookie, through your browser, when you access the web page. The problem with privacy begins with the cookie revealing personal information that you do not wish to be available. Your browser is probably revealing more than you might want: which computer you are coming from, what software and hardware you are using, details of the link you clicked on, and possibly even your email address.

Junkbusters, How Web Servers’ Cookies Threaten Your Privacy, [Online], Available from http://www.junkbusters.com/ht/en/cookies.html, [written 11 December, 1998, cited 14 March, 1999] By receiving this information, the webservers could sell it as part of an advertising database, resulting in both electronic and paper junk mail. Legislative action has been enacted to curtail the illegal use of personal information. “The WWW offers a wide variety of communication, information and interaction. Cookies provide for necessary customization. But the Internet is not outside the law.

Existing regulations, targeted at protecting personal information, limit the use and application of cookies. Current cookie usage violates such norms. Content providers continuing to use cookies that violate these regulations and browser producers unwilling or incapable of bringing their products into accordance with these laws both risk legal liability. It should be their concern to avoid legal action; and it should be our concern to safeguard our privacy. ”

Mayer-Schoenberger, Viktor, “The Internet and Privacy Legislation: Cookies for a Treat?”, West Virginia Journal of Law and Technology, [journal online], Available from http://www.wvjolt.wvu.edu/wvjolt/current/issue1/articles/mayer/mayer.htm, [cited 14 March, 1999] Another possibility of potential privacy violation is cookies retrieving information from other locations on your hard drive. The safety of personal information stored on the user’s hard drive has also been of concern in the cookie debate. Concerns have been raised about the possibility of cookies being written that would allow access to other information that the user has stored. Pitt, Andrew, Internet Privacy: The Cookie Controversy, [online], Available from http://www.cookiecentral.com/ccstory/cc3.htm, [cited 14 March, 1999]

I feel that the best way to minimize your risk is to be careful about allowing sites to place cookies on your system. This can easily be accomplished by changing your internet settings to allow manual confirmation of each cookie. You can also prevent any cookies from being sent to your system at all by using the browser options.

In Internet Explorer 4.0, choose the View, Internet Options command, click the Advanced tab and click the Disable All Cookie Use option. In Netscape 4.0, choose the Edit, Options command, click on Advanced and click the Disable Cookies option. After that, no cookies will be stored on your system. You will need to turn cookies back on if you want to use any online services that require them. You can also choose the option to prompt you before accepting a cookie, but at many sites you will be continually closing the warning dialog box.
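
To make the mechanism described at the start of this section concrete, here is a small, illustrative Python sketch using the standard library's http.cookies module (not any particular site's code) of a server issuing a cookie and later reading it back from a request; the cookie name and value are invented.

    from http import cookies

    # Server side: issue a cookie that records a user preference (invented example).
    outgoing = cookies.SimpleCookie()
    outgoing["layout"] = "two-column"
    outgoing["layout"]["path"] = "/"
    print(outgoing.output())          # emits a "Set-Cookie: layout=two-column; Path=/" header

    # On a later request, the browser sends the cookie back in its Cookie header,
    # and the server parses it to restore the stored preference.
    incoming = cookies.SimpleCookie("layout=two-column")
    print(incoming["layout"].value)   # "two-column"

Because the value travels in ordinary HTTP headers and sits on the user's machine, the same mechanism that restores preferences can also be used to recognise and track a visitor across requests, which is the privacy concern the authors quoted above raise.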

How The Internet Got Started

Some thirty years ago, the RAND Corporation, America's foremost Cold War think tank, faced a strange strategic problem. How could the US authorities successfully communicate after a nuclear war? Postnuclear America would need a command-and-control network, linked from city to city, state to state, base to base. But no matter how thoroughly that network was armored or protected, its switches and wiring would always be vulnerable to the impact of atomic bombs. A nuclear attack would reduce any conceivable network to tatters.

And how would the network itself be commanded and controlled? Any central authority, any network citadel, would be an obvious and immediate target for an enemy missile. The center of the network would be the very first place to go. RAND mulled over this grim puzzle in deep military secrecy and arrived at a daring solution, made public in 1964. The principles were simple. The network itself would be assumed to be unreliable at all times. It would be designed from the get-go to transcend its own unreliability.

All the nodes in the network would be equal in status to all other nodes, each node with its own authority to originate, pass, and receive messages. The messages would be divided into packets, each packet separately addressed. Each packet would begin at some specified source node, and end at some other specified destination node. Each packet would wind its way through the network on an individual basis. In fall 1969, the first such node was installed at UCLA. By December 1969, there were four nodes on the infant network, which was named ARPANET, after its Pentagon sponsor.

The four computers could even be programmed remotely from the other nodes. Thanks to ARPANET, scientists and researchers could share one another’s computer facilities over long distances. This was a very handy service, for computer time was precious in the early 1970s. In 1971 there were fifteen nodes in ARPANET; by 1972, thirty-seven nodes. And it was good. As early as 1977, TCP/IP was being used by other networks to link to ARPANET. ARPANET itself remained fairly tightly controlled, at least until 1983, when its military segment broke off and became MILNET.

As TCP/IP became more common, entire other networks fell into the digital embrace of the Internet, and messily adhered. Since the software called TCP/IP was public domain and the basic technology was decentralized and rather anarchic by its very nature, it was difficult to stop people from barging in and linking up somewhere or other. Nobody wanted to stop them from joining this branching complex of networks, which came to be known as the “Internet.” Connecting to the Internet cost the taxpayer little or nothing, since each node was independent, and had to handle its own financing and its own technical requirements. The more, the merrier.

Like the phone network, the computer network became steadily more valuable as it embraced larger and larger territories of people and resources. A fax machine is only valuable if everybody else has a fax machine. Until they do, a fax is just a curiosity. ARPANET, too, was a curiosity for a while. Then computer networking became an utter necessity. In 1984 the National Science Foundation got into the act, through its Office of Advanced Scientific Computing. The new NSFNET set a blistering pace for technical advancement, linking newer, faster, shinier supercomputers through thicker, faster links, upgraded and expanded again and again, in 1986, 1988, and 1990.

And other government agencies leapt in: NASA, the National Institutes of Health, the Department of Energy, each of them maintaining a digital satrapy in the Internet confederation. The nodes in this growing network-of-networks were divided up into basic varieties. Foreign computers, and a few American ones, chose to be denoted by their geographical locations. The others were grouped by the six basic Internet domains: gov (government), mil (military), edu (education), com (commercial), org (organization), and net (network). Gov, mil, and edu were, of course, the pioneers. Just think: by 1997 the standards for computer networking had become global. In 1969, there were only four nodes in the ARPANET network.

Today there are tens of thousands of nodes in the Internet, scattered over forty-two countries, with more coming online every single day. By one estimate, as of December 1996 over 50 million people used this network. Probably the most important scientific instrument of the late twentieth century is the Internet. It is spreading faster than cellular phones, faster than fax machines. The Internet offers simple freedom. There are no censors, no bosses. The only rules are technical, not social or political. It is a bargain: you can talk to anyone, anywhere, and it does not charge for long-distance service. It belongs to everyone and no one.

The most widely used part of the “Net” is the World Wide Web. Internet mail, or e-mail, is far faster than US Postal Service mail; Internet regulars call the postal system “snail mail.” File transfers allow Internet users to access remote machines and retrieve programs or text. Many Internet computers allow any person to access them anonymously and simply copy their public files, free of charge. Entire books can be transferred through direct access in a matter of minutes. Finding a link to the Internet will become easier and cheaper. At the turn of the century, network literacy will be forcing itself into every individual’s life.

Security On The Internet

How do you secure something that is changing faster than you can fix it? The Internet has had security problems since its earliest days as a pure research project. Today, after several years and orders of magnitude of growth, it still has security problems. It is being used for a purpose for which it was never intended: commerce. It is somewhat ironic that the early Internet was designed as a prototype for a high-availability command and control network that could resist outages resulting from enemy actions, yet it cannot resist college undergraduates.

The problem is that the attackers are on, and make up a part of, the network they are attacking. Designing a system that is capable of resisting attack from within, while still growing and evolving at a breakneck pace, is probably impossible. Deep infrastructure changes are needed, and once you have achieved a certain amount of size, the sheer inertia of the installed base may make it impossible to apply fixes. The challenges for the security industry are growing. With electronic commerce spreading over the Internet, there are issues such as nonrepudiation to be solved.

Financial institutions will have both technical concerns, such as the security of a credit card number or banking information, and legal concerns for holding individuals responsible for their actions such as their purchases or sales over the Internet. Issuance and management of encryption keys for millions of users will pose a new type of challenge. While some technologies have been developed, only an industry-wide effort and cooperation can minimize risks and ensure privacy for users, data confidentiality for the financial institutions, and nonrepudiation for electronic commerce.

With the continuing growth in linking individuals and businesses over the Internet, some social issues are starting to surface. Society may take time to adapt to the new concept of transacting business over the Internet. Consumers may take time to trust the network and accept it as a substitute for transacting business in person. Another class of concerns relates to restricting access over the Internet. Preventing distribution of pornography and other objectionable material over the Internet has already been in the news.

We can expect new social hurdles over time and hope the great benefits of the Internet will continue to override these hurdles through new technologies and legislations. The World Wide Web is the single largest, most ubiquitous source of information in the world, and it sprang up spontaneously. People use interactive Web pages to obtain stock quotes, receive tax information from the Internal Revenue Service, make appointments with a hairdresser, consult a pregnancy planner to determine ovulation dates, conduct election polls, register for a conference, search for old friends, and the list goes on.

It is only natural that the Web’s functionality, popularity, and ubiquity have made it the seemingly ideal platform for conducting electronic commerce. People can now go online to buy CDs, clothing, concert tickets, and stocks. Several companies, such as Digicash, Cybercash, and First Virtual, have sprung up to provide mechanisms for conducting business on the Web. The savings in cost and the convenience of shopping via the Web are incalculable. Whereas most successful computer systems result from careful, methodical planning, followed by hard work, the Web took on a life of its own from the very beginning.

The introduction of a common protocol and a friendly graphical user interface was all that was needed to ignite the Internet explosion. The Web’s virtues are extolled without end, but its rapid growth and universal adoption have not been without cost. In particular, security was added as an afterthought. New capabilities were added ad hoc to satisfy the growing demand for features without carefully considering the impact on security. As general-purpose scripts were introduced on both the client and the server sides, the dangers of accidental and malicious abuse grew.

It did not take long for the Web to move from the scientific community to the commercial world. At this point, the security threats became much more serious. The incentive for malicious attackers to exploit vulnerabilities in the underlying technologies is at an all-time high. This is indeed frightening when we consider what attackers of computer systems have accomplished when their only incentive was fun and boosting their egos. When business and profit are at stake, we cannot assume anything less than the most dedicated and resourceful attackers trying their utmost to steal, cheat, and perform malice against users of the Web.

When people use their computers to surf the Web, they have many expectations. They expect to find all sorts of interesting information, they expect to have opportunities to shop, and they expect to be bombarded with all sorts of ads. Even people who do not use the Web are in jeopardy of being impersonated on the Web. There are simple and advanced methods for ensuring browser security and protecting user privacy. The simpler techniques are user certification schemes, which rely on digital IDs. Netscape Navigator and Internet Explorer allow users to obtain and use personal certificates.

Currently, the only company offering such certificates is Verisign, which offers digital IDs that consist of a certificate of a user’s identity, signed by Verisign. There are four classes of digital IDs, each representing a different level of assurance in the holder’s identity, and each comes at an increasingly higher cost. The assurance is determined by the effort that goes into identifying the person requesting the certificate. Class 1 Digital IDs, intended for casual Web browsing, provide users with an unambiguous name and e-mail address within Verisign’s domain.

A Class 1 ID provides assurance to the server that the client is using an identity issued by Verisign but little guarantee about the actual person behind the ID. Class 2 Digital IDs require third party confirmation of name, address, and other personal information related to the user, and they are available only to residents of the United States and Canada. The information provided to Verisign is checked against a consumer database maintained by Equifax. To protect against insiders at Verisign issuing bogus digital IDs, a hardware device is used to generate the certificates.

Class 3 Digital IDs are not yet available; their purpose is to bind an individual to an organization. Thus, a user in possession of such an ID could, theoretically, prove that he or she belongs to the organization that employs him or her. The idea behind Digital IDs is that they are entered into the browser and then are automatically sent when users connect to sites requiring personal certificates. Unfortunately, the only practical effect is to make impersonating users on the network only a little bit more difficult.
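To make the idea of a certificate concrete, the sketch below (a hedged illustration, not Verisign’s actual issuance process) uses Python’s standard ssl module to fetch a server’s certificate and print the identity fields the signer vouches for; the host name is just a placeholder.

# A hedged sketch: fetch a TLS server's certificate and inspect the identity
# it binds to a public key. The host is a placeholder; this is the server-side
# analogue of the personal certificates described above.
import socket
import ssl

def fetch_certificate(host, port=443):
    """Connect over TLS and return the server's validated certificate."""
    context = ssl.create_default_context()
    with socket.create_connection((host, port)) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            return tls.getpeercert()

cert = fetch_certificate("www.example.com")
print(cert["subject"])    # whose identity the certificate claims to establish
print(cert["issuer"])     # the authority that signed (vouched for) the claim
print(cert["notAfter"])   # when that assurance expires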

Many Web sites require their users to register a name and a password. When users connect to these sites, their browser pops up an authentication window that asks for these two items. Usually, the browser then sends the name and password to the server, which can then allow retrieval of the remaining pages at the site. The authentication information can be protected from eavesdropping and replay by using the SSL protocol. As the number of sites requiring simple authentication grows, so does the number of passwords that each user must maintain.
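A minimal sketch of that name-and-password exchange from the client side, assuming the third-party requests library; the URL and credentials are placeholders, and carrying them over HTTPS (SSL/TLS) is what keeps them from being read in transit.

# Hedged sketch of simple (basic) authentication over SSL/TLS.
# The URL and credentials below are placeholders, not a real site.
import requests

response = requests.get(
    "https://www.example.com/members/report.html",  # hypothetical protected page
    auth=("alice", "s3cret"),                        # sent as an Authorization header
    timeout=10,
)

if response.status_code == 200:
    print("Authenticated; the remaining pages at the site can now be retrieved.")
elif response.status_code == 401:
    print("The server rejected this name/password pair.")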

In fact, users are often required to have several different passwords for systems in their workplace, for personal accounts, for special accounts relating to payroll and vacation, and so on. It is not uncommon for users to have more than six sites they visit that require passwords. In the early days of networking, firewalls were intended less as security devices than as a means of preventing broken networking software or hardware from crashing wide-area networks. In those days, malformed packets or bogus routes frequently crashed systems and disrupted servers.

Desperate network managers installed screening systems to reduce the damage that could happen if a subnet’s routing tables got confused or if a system’s Ethernet card malfunctioned. When companies began connecting to what is now the Internet, firewalls acted as a means of isolating networks to provide security as well as enforce an administrative boundary. Early hackers were not very sophisticated; neither were early firewalls. Today, firewalls are sold by many vendors and protect tens of thousands of sites.

The products are a far cry from the first-generation firewalls, now including fancy graphical user interfaces, intrusion detection systems, and various forms of tamper-proof software. To operate, a firewall sits between the protected network and all external access points. To work effectively, firewalls have to guard all access points into the network’s perimeter; otherwise, an attacker can simply go around the firewall and attack an undefended connection. The simple days of firewalls ended when the Web exploded.

Suddenly, instead of handling only a few simple services in an “us versus them” manner, firewalls must now contend with complex data and protocols. Today’s firewall has to handle multimedia traffic, attached downloadable programs (applets), and a host of other protocols plugged into Web browsers. This development has produced a basic conflict: the firewall is in the way of the things users want to do. A second problem has arisen as many sites want to host Web servers: does the Web server go inside or outside of the firewall?

Firewalls are both a blessing and a curse. Presumably, they help deflect attacks. They also complicate users’ lives, make Web server administrators’ jobs harder, rob network performance, add an extra point of failure, cost money, and make networks more complex to manage. Firewall technologies, like all other Internet technologies, are rapidly changing. There are two main types of firewalls, plus many variations. The main types of firewalls are proxy and network-layer.

The idea of a proxy firewall is simple: rather than have users log into a gateway host and then access the Internet from there, give them a set of restricted programs running on the gateway host and let them talk to those programs, which act as proxies on behalf of the user. The user never has an account or login on the firewall itself, and he or she can interact only with a tightly controlled, restricted environment created by the firewall’s administrator. This approach greatly enhances the security of the firewall itself because it means that users do not have accounts or shell access to the operating system.

Most UNIX bugs require that the attacker have a login on the system to exploit them. By throwing the users off the firewall, it becomes just a dedicated platform that does nothing except support a small set of proxies; it is no longer a general-purpose computing environment. The proxies, in turn, are carefully designed to be reliable and secure because they are the only real point of the system against which an attack can be launched. Proxy firewalls have evolved to the point where today they support a wide range of services and run on a number of different UNIX and Windows NT platforms.
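The proxy idea can be shown with a minimal sketch: inside users talk only to a restricted relay on the gateway host, and the relay speaks to the outside world on their behalf. The addresses below are placeholders, and a real proxy firewall would add protocol checks, logging, and concurrency.

# Minimal single-connection relay illustrating the proxy-firewall idea.
# Addresses are placeholders; a production proxy adds far more control.
import socket
import threading

LISTEN_ADDR = ("0.0.0.0", 8080)           # where inside users connect
UPSTREAM_ADDR = ("www.example.com", 80)   # the only outside host this proxy allows

def relay(src, dst):
    """Copy bytes one way until the connection closes."""
    while True:
        data = src.recv(4096)
        if not data:
            break
        dst.sendall(data)

def serve_one_client(client):
    upstream = socket.create_connection(UPSTREAM_ADDR)
    # Outbound and return traffic both pass through the proxy, never directly.
    threading.Thread(target=relay, args=(client, upstream), daemon=True).start()
    relay(upstream, client)

listener = socket.socket()
listener.bind(LISTEN_ADDR)
listener.listen(1)
conn, _ = listener.accept()
serve_one_client(conn)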

Many security experts believe that proxy firewalls are more secure than other types of firewalls, largely because the first proxy firewalls were able to apply additional control to the data traversing the proxy. The real reason for proxy firewalls was their ease of implementation, not their security properties. For security, it does not really matter where in the processing of data the security check is made; what is more important is that it is made at all. Because they do not allow any direct communication between the protected network and the outside world, proxy firewalls inherently provide network address translation.

Whenever an outside site gets a connection from the firewall’s proxy address, the firewall in turn hides and translates the addresses of the systems behind it. Prior to the invention of firewalls, routers were often pressed into service to provide security and network isolation. Many sites connecting to the Internet in the early days relied on ordinary routers to filter the types of traffic allowed into or out of the network. Routers operate on each packet as a unique event unrelated to previous packets, filtering on IP source, IP destination, IP port number, and a few other basic data contained in the packet header.
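That stateless, per-packet style of filtering is easy to picture with a toy rule table; the addresses, ports, and rules below are invented purely for illustration.

# Toy packet filter: each packet is judged only on its own header fields,
# with no memory of earlier packets. Rules and addresses are hypothetical.
RULES = [
    # (source prefix, destination prefix, destination port, action)
    ("*",        "192.168.1.", 25,  "allow"),   # inbound mail to the internal server
    ("10.0.0.",  "*",          "*", "allow"),   # anything originating inside
    ("*",        "*",          "*", "deny"),    # default: drop everything else
]

def matches(pattern, value):
    return pattern == "*" or str(value).startswith(str(pattern))

def filter_packet(src_ip, dst_ip, dst_port):
    for src, dst, port, action in RULES:
        if matches(src, src_ip) and matches(dst, dst_ip) and matches(port, dst_port):
            return action
    return "deny"

print(filter_packet("203.0.113.9", "192.168.1.5", 25))   # allow (mail)
print(filter_packet("203.0.113.9", "192.168.1.5", 23))   # deny (telnet from outside)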

Filtering, strictly speaking, does not constitute a firewall because it does not have quite enough detailed control over data flow to permit building highly secure connections. The biggest problem with using filtering routers for security is the FTP protocol, which, as part of its specification, makes a callback connection in which the remote system initiates a connection to the client, over which data is transmitted. Cryptography is at the heart of computer and network security. The important cryptographic functions are encryption, decryption, one-way hashing, and digital signatures.

Ciphers are divided into two categories, symmetric and asymmetric, or public-key systems. Symmetric ciphers are functions where the same key is used for encryption and decryption. Public-key systems can be used for encryption, but they are also useful for key agreement and digital signatures. Key-agreement protocols enable two parties to compute a secret key, even in the face of an eavesdropper. Symmetric ciphers are the most efficient way to encrypt data so that its confidentiality and integrity are preserved.

That is, the data remains secret to those who do not possess the secret key, and modifications to the ciphertext can be detected during decryption. Two of the most popular symmetric ciphers are the Data Encryption Standard (DES) and the International Data Encryption Algorithm (IDEA). The DES algorithm operates on blocks of 64 bits at a time using a key length of 56 bits. The 64 bits are permuted according to the value of the key, so encryption with two keys that differ in only one bit produces two completely different ciphertexts.

The most popular mode of DES is called Cipher Block Chaining (CBC) mode, in which output from the previous block is mixed with the plaintext of each block. The first block is mixed with a special value called the Initialization Vector. Despite its size and rapid growth, the Web is still in its infancy. So is the software industry. We are just beginning to learn how to develop secure software, and we are beginning to understand that for our future, if it is to be online, we need to incorporate security into the basic underpinnings of everything we develop.
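To make the block chaining described above concrete, here is a minimal sketch of CBC encryption and decryption. It assumes the third-party cryptography package and uses AES, a modern block cipher, as a stand-in for DES; the key, IV, and message are throwaway values.

# Hedged sketch of Cipher Block Chaining (CBC) with the "cryptography" package.
# AES stands in for DES; key, IV, and plaintext are throwaway example values.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(32)                # secret key shared by sender and receiver
iv = os.urandom(16)                 # Initialization Vector mixed into the first block
plaintext = b"attack at dawn!!"     # exactly one 16-byte block, so no padding needed

encryptor = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
ciphertext = encryptor.update(plaintext) + encryptor.finalize()

decryptor = Cipher(algorithms.AES(key), modes.CBC(iv)).decryptor()
recovered = decryptor.update(ciphertext) + decryptor.finalize()

assert recovered == plaintext       # same key and IV restore the original data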

Hackers: Information Warfare

The popularity of the Internet has grown immeasurably in the past few years. Along with it, the so-called “hacker” community has grown and risen to a level where it’s less of a black market scenario and more of an “A Current Affair” scenario. Misconceptions as to what a hacker is and does run rampant in everyone who thinks they understand what the Internet is after using it a few times. In the next few pages I’m going to do my best to prove the true definition of what a hacker is, how global economic electronic warfare ties into it, and background on the Internet, along with a plethora of scatological material purely for your reading enjoyment.

I will attempt to use the least technical computer terms I can, but in order to make my point at times I have no choice. There are many misconceptions as to the definition of what a hacker truly is; in all my research this is the best definition I’ve found: Pretend you’re walking down the street, the same street you have always walked down. One day, you see a big wooden or metal box with wires coming out of it sitting on the sidewalk where there had been none. Many people won’t even notice. Others might say, “Oh, a box on the street.” A few might wonder what it does and then move on.

The hacker, the true hacker, will see the box, stop, examine it, wonder about it, and spend mental time trying to figure it out. Given the proper circumstances, he might come back later to look closely at the wiring, or even be so bold as to open the box. Not maliciously, just out of curiosity. The hacker wants to know how things work. (8) Hackers truly are “America’s Most Valuable Resource” (4:264), as ex-CIA officer Robert Steele has said. But if we don’t stop screwing over our own countrymen, we will never be looked at as anything more than common gutter trash.

Hacking computers for the sole purpose of collecting systems like space-age baseball cards is stupid and pointless, and can only lead to a quick trip up the river. Let’s say that everyone was given an opportunity to hack without any worry of prosecution, with free access to a safe system to hack from, with the only catch being that you could not hack certain systems. Military, government, financial, commercial and university systems would all still be fair game. Every operating system, every application, every network type, all open to your curious minds.

Would this be a good alternative? Could you follow a few simple guidelines for the offer of virtually unlimited hacking with no worry of governmental interference? Where am I going with this? Right now we are at war. You may not realize it, but we all feel the implications of this war, because it’s a war with no allies, and enormous stakes. It’s a war of economics. The very countries that shake our hands over the conference tables of NATO and the United Nations are picking our pockets.

Whether it be the blatant theft of American R&D by Japanese firms, or the clandestine and governmentally-sanctioned bugging of Air France first-class seating, or the cloak-and-dagger hacking of the SWIFT network (1:24) by the German BND’s Project Rahab (1:24), America is getting screwed. Every country on the planet is coming at us. Let’s face it, we are the leaders in everything. Period. Every important discovery in this century has been by an American or by an American company. Certainly other countries have better profited by our discoveries, but nonetheless, we are the world’s think-tank.

So, is it fair that we keep getting shafted by these so-called “allies”? Is it fair that we sit idly by, like some old hound too lazy to scratch at the ticks sucking out our life’s blood by the gallon? Hell no. Let’s say that an enterprising group of computer hackers decided to strike back. Using equipment bought legally, using network connections obtained and paid for legally, and making sure that all usage was tracked and paid for, this same group began a systematic attack of foreign computers.

Then, upon having gained access, the group gave any and all information obtained to American corporations and the Federal government. What laws would be broken? Federal computer crime statutes specifically target so-called “Federal Interest Computers” (6:133) (i.e., banks, telecommunications, military, etc.). Since these attacks would involve foreign systems, those statutes would not apply. If all calls and network connections were promptly paid for, no toll-fraud or other communications-related laws would apply.

International law is so muddled that the chances of getting extradited by a country like France for breaking into systems in Paris from Albuquerque are slim at best. Even more slim when factoring in that the information gained was given to the CIA and American corporations. Every hacking case involving international break-ins has been tried and convicted based on other crimes. Although the media may spray headlines like “Dutch Hackers Invade Internet” or “German Hackers Raid NASA,” those hackers were tried for breaking into systems within THEIR OWN COUNTRIES, not somewhere else.

A hacker who uses the handle of 8lgm in England got press for hacking world-wide, but got nailed hacking locally (3). Australia’s Realm hackers, Phoenix, Electron and Nom, hacked almost exclusively in other countries, but use of AT&T calling cards rather than Australian Telecom got them a charge of defrauding the Australian government (3). Dutch hacker RGB got huge press hacking a US military site and creating a “dquayle” account, but got nailed while hacking a local university (3). The list goes on and on.

I asked several people about the workability of my proposal. Most seemed to concur that it was highly unlikely that anyone would have to fear any action by American law enforcement, or of extradition to foreign soil to face charges there. The most likely form of retribution would be eradication by agents of that government. Well, I’m willing to take that chance, but only after I get further information from as many different sources as I can. I’m not looking for anyone to condone these actions, nor to finance them.

I’m only interested in any possible legal action that may interfere with my freedom. We must take the offensive, and attack the electronic borders of other countries as vigorously as they attack us, if not more so. This is indeed a war, and America must not lose. There have always been confrontations online. On the net, as in life, unpleasantness is unavoidable. However, on the net the behavior is far more pronounced, since it effects a much greater response from the limited online environments than it would in the real world.

People behind such behavior in the real world can be dealt with or avoided, but online they cannot. In the real world, annoying people don’t impersonate you in national forums. In the real world, annoying people don’t walk into your room and go through your desk and run through the town showing everyone your private papers or possessions. In the real world, people can’t readily imitate your handwriting or voice and insult your friends and family by letter or telephone. In the real world people don’t rob or vandalize and leave your fingerprints behind. The Internet is not the real world.

All of the above continually happens on the Internet, and there is little anyone can do to stop it. The perpetrators know full well how impervious they are to retribution, since the only people who can put their activities to a complete halt are reluctant to open cases against computer criminals due to the complex nature of the crimes. The Internet still clings to the anarchy of the Arpanet that spawned it, and many people would love for the status quo to remain. However, the actions of a few miscreants will force lasting changes on the net as a whole.

The wanton destruction of sites, the petty forgeries, the needless break-ins and the poor blackmail attempts do not go unnoticed by the authorities. I personally could care less what people do on the net. I know it is fantasy land. I know it exists only in our minds, and should not have any long lasting effect in the real world. Unfortunately, as the net’s presence grows larger and larger, and the world begins to accept it as an entity in and of itself, it will be harder to convince those inexperienced users that the net is not real.

I have always played by certain rules and they have worked well for me in the years I’ve been online. These rules can best be summed up by the following quote, “We are taught to love all our neighbors. Be courteous. Be peaceful. But if someone lays his hands on you, send them to the cemetery. ” The moment someone crosses the line, and interferes with my well-being in any setting (even one that is arguably unreal such as the Internet) I will do whatever necessary to ensure that I can once again go about minding my own business unmolested.

I am not alone in this feeling. There are hundreds of net-loving anarchists who don’t want the extra attention and bad press brought to our little fantasy land by people who never learned how to play well as children. Even these diehard anti-authoritarians are finding themselves caught in a serious quandary: do they do nothing and suffer attacks, or do they make the phone call to Washington and try to get the situation resolved? Many people cannot afford the risk of striking back electronically, as some people may suggest.

Other people do not have the skill set needed to orchestrate an all out electronic assault against an unknown, even if they pay no heed to the legal risk. Even so, should anyone attempt such retribution electronically, the assailant will merely move to a new site and begin anew. People do not like to deal with police. No one LOVES to call up their local law enforcement office and have a nice chat. Almost everyone feels somewhat nervous dealing with these figures knowing that they may just as well decide to turn their focus on you rather than the people causing problems.

Even if you live your life crime-free, there is always that underlying nervousness; even in the real world. However, begin an assault directed against any individual, and I guarantee he or she will overcome such feelings and make the needed phone call. It isn’t the “hacking” per se that will cause anyone’s downfall nor bring about governmental regulation of the net, but the unchecked attitudes and gross disregard for human dignity that runs rampant online. What good can come from any of this? Surely people will regain the freedom to go about their business, but what of the added governmental attentions?

Electronic Anti-Stalking Laws? Electronic Trespass? Electronic Forgery? False Electronic Identification? Electronic Shoplifting? Electronic Burglary? Electronic Assault? Electronic Loitering? Illegal Packet Sniffing equated as Illegal Wiretaps? (7:69) The potential for new legislation is immense. As the networks further permeate our real lives, the continual unacceptable behavior and following public outcry in that setting will force the ruling bodies to draft such laws. And who will enforce these laws? And who will watch the watchmen?

Oftentimes these issues are left to resolve themselves after the laws have passed. Is this the future we want? One of increased legislation and governmental regulation? With the development of the supposed National Information Super-Highway, the tools will be in place for a new body to continually monitor traffic for suspect activity and uphold any newly passed legislation. Do not think that the ruling forces have not considered that potential. The Information Age has arrived, and most people don’t recognize the serious nature behind it.

Computers and the related technology can either be the answer to the human race’s problems or a cause for the demise of the race. Right now we rely on computers too much, and have too little security to protect us if they fail. In the coming years, we will see amazing technology permeate every part of our lives; some of it will be welcomed, some won’t be, and some will be used against us. If we don’t learn to handle the power that computers give us in the next few years, we will all pay dearly for it. Remember the warning. The future is here now, and most people aren’t ready to handle it.

Internet Security Essay

What will US politics and the economy be like as we progress through the twenty-first century? There is no single vision, but many people perceive a type of digital democracy. The use of information via Internet or World Wide Web will dramatically change politics and the way government takes place. For example, a digital democracy can inform people about political candidates and issues. Volunteers also use email and web sites to encourage people to go to the polls and vote for their candidate. (1) This really boosts voting and political participation but the problem of security and privacy comes along with this digital democracy.

Security would seem easy with today’s technology, but how do you secure something that is changing faster than you can find a solution? The Internet has had security problems since its earliest days as a pure research project. Even today, after several years and orders of magnitude of growth, it still has security problems. It is being used for a purpose for which it was never intended: commerce. (2) It is somewhat ironic that the early Internet was designed as a prototype for a command and control network that could resist outages resulting from enemy actions, but it cannot resist college undergraduates.

The problem is that the attackers are on, and make up a part of, the network they are attacking. Designing a system that is capable of resisting attack from within, while still growing and evolving at a breakneck pace, is probably impossible. (1) Deep infrastructure changes are needed, and once you have achieved a certain amount of size, the sheer inertia of the installed base may make it impossible to apply repairs. (1) As general-purpose scripts were introduced on both the client and the server sides, the dangers of accidental and malicious abuse grew.

It did not take long for the Web to move from the scientific community to the commercial world. At this point, the security threats became much more serious. The incentive for malicious attackers to exploit vulnerabilities in the underlying technologies is at an all-time high. (1) When business and profit are at stake, we cannot assume anything less than the most dedicated and resourceful attackers trying their utmost to steal, cheat, and perform malice against users of the Web. (2)

With the Web being the single largest source of information in the world, people are capable of obtaining stock quotes, getting tax information from the Internal Revenue Service, conducting election polls, registering for a conference, and the list goes on. (2) It is only natural that the Web’s functionality, popularity, and ubiquity have made it the seemingly ideal platform for conducting electronic commerce. (1) The Web’s virtues are extolled without end, but its rapid growth and universal adoption have not been without cost.

Along with the costs the challenges of security are growing. With the electronic commerce spreading over the Internet, there are issues such as non-repudiation to be resolved. (3) Financial institutions will have both technical concerns, such as the security of a credit card number or banking information, and legal concerns for holding individuals responsible for their actions such as their purchases or sales over the Internet. Issuance and management of encryption keys for millions of users will also pose a challenge of confidentiality.
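One way to picture the nonrepudiation issue raised above is a digital signature: an order that verifies against a customer’s public key is hard for that customer to later deny. Below is a hedged sketch, assuming the third-party cryptography package; the keys and order text are placeholders, not part of any real system.

# Hedged sketch of nonrepudiation via digital signatures, using the
# third-party "cryptography" package; keys and order text are placeholders.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

customer_key = Ed25519PrivateKey.generate()        # held only by the customer
order = b"Buy 10 shares of XYZ at market price"

signature = customer_key.sign(order)               # the customer signs the order

# The merchant (or later, a court) checks the signature with the public key.
# If it verifies, the customer cannot plausibly repudiate the order.
public_key = customer_key.public_key()
try:
    public_key.verify(signature, order)
    print("Signature valid: the order is bound to this customer's key.")
except InvalidSignature:
    print("Signature invalid: the order cannot be attributed to this key.")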

A simple method of protecting security and user privacy is using certification schemes, which rely on digital IDs. (3) Netscape Navigator and Internet Explorer allow users to obtain and use personal certificates. (3) Many Web sites require their users to register a name and a password. When users connect to these sites, their browser pops up an authentication window that asks for these two items. Usually, the browser then sends the name and password to the server, which can then allow retrieval of the remaining pages at the site. (2)

Despite efforts at maintaining security, whether for an individual or for the confidentiality of a firm, only an across-the-board effort of cooperation and integrity can minimize risks and ensure privacy for users. We can expect new social hurdles over time and hope the great benefits of the Internet will continue to override these hurdles through new technologies and legislation. (3) We are just beginning to learn how to develop secure software, and we need to realize that for our future, if it is to be online, we must incorporate security into the foundation of everything we develop.

Internet Censorship Essay

Presently, it seems that the Internet is playing a very important role in everyone’s daily life. This multipurpose network has many different functions useful for everyday work and entertainment. The freedom of the Internet has prompted various debates and protests that take issue with its open form of communication.

Because of the misuse of the Internet, many people believe that there should be some kind of Internet censorship, while others are against Internet censorship, stating that “it is both unnecessary and impossible to implement and that because of its nature the internet should be afforded the same freedom and protection as the print media” (Bradsher 2). People who are in favor of Internet censorship believe that the Internet is unregulated and that, unlike any other form of communication available today, it is open to abuse and misuse in many different ways.

Anyone can use the Internet to send almost any type of data to anyone. This leaves it open to abuse in ways unheard of before, because data can be transmitted anonymously and secretly. While these people fight for Internet censorship, others argue against it. The basic argument made by people who oppose Internet censorship is that it should be granted the same rights as any other means of communication. These people state that the Internet “allows for the first time for everyone in a group to have the same opportunities of engaging in and partaking of debates” (Bradsher 2).

It doesn’t discriminate against anyone because of sex, religion or race, and it is open for people with disabilities, who are very often excluded from other media outlets, to access and contribute equally. The magazine Society states that “the Internet has been shown to be used by pedophile rings across the world to trade in pictures and to encourage their depraved and sick habits” (37). This makes it impossible to stop juveniles using it to access pornography, because there is no way of knowing someone’s age.

People can use the Internet to impose themselves on others; anonymous email can be sent by people; people can log onto chat servers and interrupt by saying things which offend or upset those already communicating. “People who have had no access to pornography and other depravities are using the Internet to get at something that they would otherwise never be able to see” (Society 38). While there are people favoring Internet censorship, there are others that are against it. Bradsher states that “to censor the Internet would fundamentally harm it and destroy the equality, which makes it most popular – its freedom” (3).

People like and use the Internet because they feel they can come onto it, talk and email hundreds of others across the world. Debaters against Internet censorship believe that there are simply too many users of the Internet to be able to ensure that one doesn’t offend anyone. Many of the most important newspapers and news organizations are moving to the Internet. “If censorship is introduced our national media will be severely curtailed in a way which would never be acceptable if it were to be done to the print media” (Bradsher 3).

Those against Internet censorship state that “making commercial providers responsible for the activities of their customers is fundamentally unfair” (Society 38). If a customer is allowed to use the Internet, then unless you watch and monitor every word or piece of data they send, it is very easy for them to do something unwanted. To try and carry out this form of control would make the entire Internet unworkable. Those in favor of Internet censorship state that “all other forms of communication, TV, Radio, Satellite, are regulated and strictly controlled.

Why should the Internet not be?” (Society 38). The rights of the child to protection from pedophilia and pornography outweigh the rights of users who wish full freedom. The web is unlike any other form of publishing today. It allows people to publish quickly and quietly. People can set up pages that are unavailable unless one knows the address. This allows for the publishing of child porn and other dangerous material, such as instructions for bomb making, without even the owner of the computer knowing about it.

People believe it must be strictly controlled; they can set up links to bring people to web pages they don’t want to see and can subject them to images and text about the basest forms of immorality. Children can use the web to find pictures, videos, and texts about subjects that are not suitable. Bradsher sates that “the web should be afforded the same protections as currently apply to the printed word” (4). To ban certain newsgroups and web sites would force the pornography “underground” (Bradsher 4). People would still post their files but would have to do so under cover.

This would mean those not wanting that material could have it forced upon them. “Individual users and parents, not the government, should decide what material is appropriate for their children” (Bradsher 4). Parents can make use of the new porn-blocking software to stop children accessing sites which they feel are inappropriate for children. Bradsher states that “blanket censorship effects very serious and worthwhile organizations like those involved in the fields of breast cancer, rape, HIV/aids and others working on behalf of marginalized and disadvantaged groups” (4).

People must be entitled to view and obtain whatever information they want to as long as they are not hurting other people in the process. Those in favor of internet censorship propose different measures to deal with the internet such as: “make it a criminal offense to transmit child pornography across the internet; make it illegal to send unwanted emails to people when they make it clear they are not interested; instigate some way of controlling all forms of communication to ensure compliance with regulation, and outlaw the publishing of offensive and obscene information” (Society 39).

Bradsher proposes “the making available of new technologies to parents which block all unwanted sites to ensure their children can ‘surf safely’” (4) and “the introduction of a rating system where each page contains a few bytes of data which identify content type, whether suitable for minors, Christians etc. and would warn those concerned before loading up data” (4). As a result, Internet censorship is a topic of many debates, and those in favor and against it state their own ideas and perceptions of what is appropriate and not appropriate for the benefit of our society.

The Internet Revolution

Now, with the click of a button, consumers are buying just about anything imaginable, and all from the convenience of the Internet. People no longer have to leave their homes, work, or wherever there is Internet access to make important purchases. Technology has advanced so that companies are conducting business around the world without ever meeting. No longer do consumers or businessmen have to shake hands to complete a deal or a sale; they merely click down on the mouse and the numbers change. Some Internet companies have never seen their customers, and yet some traditional retailers have not yet acknowledged the Internet.

However, “convergence is the new religion” (“The Real…” 53). Big companies are changing the business world as we see it through the Internet. On June 1, Merrill Lynch announced it was joining the Internet revolution and would begin selling stocks for $29.95 a trade (Cropper 60). A division of the Sabre Holdings Company of Fort Worth, Texas, and Preview Travel, an exclusive partner of America Online, announced they were merging to form one of the nation’s largest Internet commerce sites, with an expected revenue of nearly one billion dollars (Jones C-7).

Companies are merging and joining the Internet, all out of the Internet revolution craze. The Internet is revolutionizing the way the world does business through faster, easier and more direct consumer access to their desired companies. Of course, such direct contact with these companies means that the “middleman” is often eliminated. People like accountants, travel agents and stockbrokers are all ending up with commissions being cut and even losing their jobs. “My commission was first cut from 10% to 8% and now to 5% on plane tickets. People are now buying their tickets online. It’s much easier than going to a travel agency” (Halbert, interview).

People can also manage their money from home with on-line banking, which is now offered by many banks and which also eliminates the need for an accountant. Companies like E-trade and Ameritrade are taking jobs away from stockbrokers by offering $8 trades on-line as well. All these transactions eliminate jobs and are dangerous too. “Most online purchases today are completed without a customer’s actual signature” (Swisher R-22). This means that anyone with access to a credit card number can make a purchase without the holder knowing, until he gets his statement. Electronic commerce is easy, but dangerous.

Then again, that is what Mr. Aminifard and his company are alert for. If companies know how to detect fraudulent transactions, then they can obviously avoid them. “With a few clues, you can pretty much guess (with 90% certainty) that an order is going to be fraudulent” (Swisher R-22). Companies want to avoid these transactions because they are left with the credit card bill in the end. There is obvious security in the Internet, and that is one reason companies continue to expand and people continue to buy. Companies are joining the Internet revolution, and for good reasons; if they don’t, their competitors will.
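The kind of clue-based screening the Swisher quotation alludes to might look like the sketch below; the rules, weights, and order fields are invented purely for illustration and are not taken from any real merchant’s system.

# Hypothetical, illustrative fraud-screening rules -- not a real merchant's model.
def fraud_score(order):
    """Return a rough 0-100 score built from a few simple clues."""
    score = 0
    if order["ship_country"] != order["card_country"]:
        score += 40          # card issued far from the delivery address
    if order["email"].endswith("@tempmail.example"):
        score += 20          # throwaway e-mail domain (made-up example)
    if order["amount"] > 1000:
        score += 25          # unusually large first purchase
    if order["rush_shipping"]:
        score += 15          # fraudsters tend to want the goods shipped fast
    return min(score, 100)

order = {"ship_country": "US", "card_country": "RO",
         "email": "buyer@tempmail.example", "amount": 1500, "rush_shipping": True}
print(fraud_score(order))    # 100 -> hold for manual review before charging the card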

“The internet is the great equalizer. It can make small companies seem like large companies and…large companies take care, for the formerly minor competitor may take your business” (Goodman 112). The Internet is saving time and money. “British Telecommunications will save a billion dollars next year by sourcing exclusively through the internet” (Goodman 112). Companies like Sears, the nation’s leading seller of appliances, need not spend money on establishing a delivery system when they begin selling more products online, because one is already established (Coleman A-4).

The internet can save consumers time and effort as well because they can research the product before they phone in the order or go to the store. Companies are making purchases a lot more accessible. Investment companies such as Merrill Lynch, Ameritrade, E-trade and Charles Schwab all give detailed reports, graphs, analysis and many other tools to keep track of stocks and other investments. They also allow the investor to take all responsibility and buy, sell or trade as much as they want for a minimal fee.

“In the first two months of 1999 alone, the number of trades executed over the internet has increased by 25% to 425,000 per day” (Wilson 36). Internet trading is growing steadily, and the same can be said for on-line banking. “From 1993 to 1998, the typical U.S. bank’s assets grew at 8%. Over the same period, Telebank, the nation’s largest on-line bank, grew at 53% per year” (Wilson 35). This is a great sign for the public. With so much competition in online trading and banking, companies are constantly offering cheaper trading rates and boosting interest rates to gain the attention of potential customers.

Banks are also saving money through the web, with an in-person transaction costing $1.07 and an on-line transaction costing only $0.01, as reported in “Can Online Banking Replace Conventional Banks?” (32). With the holidays soon approaching, companies are rushing to update their Internet sites for the shopping season. Department stores such as J.C. Penney, Nordstrom, Sears and K-Mart are all upgrading for the rush. Tiffany, having sworn never to sell their diamonds and pearls over something as common as the web, is doing just that (“The Real…” 53).

Excite.com has introduced an electronic “wallet” just in time for the holidays. The wallet stores money and scrambled credit card information on a personal computer in a way that is only accessible to the owner (Swisher R-22). A perfect solution to the overcrowded shopping malls this season is to shop online. Sites will be quicker in terms of transaction processing, and more products will be available, which will hopefully boost revenue from the $8.2 billion of last year’s shopping season to something greater (Coleman A-4). Electronic commerce has certainly been beneficial to many companies.

Dell Computers, which generates more than $12 million from its website every day, claims that Internet sales account for nearly half of its overall sales (McUsic C-4). With e-commerce in the U.S. alone set to rise from $12 billion a year to $41 billion a year by 2002, companies are not hesitating to join the revolution (“The Real…” 53). The Internet is simply easier for consumers and businesses to deal with. Sabre Holding Corporation and Preview Travel are strictly on-line agencies, and when their merger completes, the expected revenue of $1 billion is more than likely to happen.

Travelers can get as much and even more information on their planned destination from on-line sites than from conventional travel agencies. This might be bad for travel agents, but there is simply plenty more information, and it is quicker and easier to use the Internet. The Internet has really exploded since its beginning in 1968 (see Appendix Graph 1). There are now thousands of sites all around the world, some generating billions of dollars and others for amusement. Whatever their intention, the Internet revolution is in full effect.

Businesses have really found comfort in the web. “The [internet] revolution is sweeping the world” (Goodman 113). Multi-billion dollar companies are all using the web. Microsoft, K-Mart, Nordstrom, Dell Computers, Gateway Computers, Macintosh, AT&T, America Online and many other mass money-producing companies have all joined the Internet revolution. Companies keep joining the revolution, no matter how big or small they are, because they know that some day they too can be as large and as powerful as the companies that have already taken part in the Internet.

The Internet literally provides all of our necessities at our fingertips. The Internet revolution has provided us with fast, convenient and direct contact to virtually anything we want. The revolution is available to anyone, so as Davis Goodman titles his article and reminds us all, “It’s Your Internet: Don’t leave it to your competitors” (112). The opportunities are endless with the Internet revolution. It is just up to the willingness and determination of the people to see how far it will go.

Cookies and their Impact on Privacy

In today’s fast-paced world of Internet commerce it would be hard to accomplish many of the tasks without the creation of “cookies.” Since their advent, cookies have been given a bad name and associated immediately with a loss of privacy. In April of 2001 a newspaper article defined cookies as “programs that Web sites put on your hard disk.

They sit on your computer gathering information about you and everything you do on the Internet, and whenever the Web site wants to it can download all of the information the cookie has collected” (www.howstuffworks.com). This article could not be any farther from the truth. Cookies are not programs and do not perform any actions as they sit on your hard drive.

According to Netscape, “Cookies are a general mechanism which server side connections (such as CGI scripts) can use to both store and retrieve information on the client side of the connection. The addition of a simple, persistent, client-side state significantly extends the capabilities of Web-based client/server applications.”
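A small sketch of the mechanism Netscape is describing, assuming a Python CGI-style script; the cookie name and values are placeholders. The server stores state in the browser with a Set-Cookie header and reads it back on the next request.

# Hedged sketch of server-side cookie handling in a CGI-style Python script.
# The cookie name ("visits") and its values are placeholders for illustration.
import os
from http.cookies import SimpleCookie

# Reading: the browser echoes previously stored cookies in the HTTP_COOKIE variable.
incoming = SimpleCookie(os.environ.get("HTTP_COOKIE", ""))
visits = int(incoming["visits"].value) if "visits" in incoming else 0

# Storing: the response asks the browser to keep the updated value for next time.
outgoing = SimpleCookie()
outgoing["visits"] = str(visits + 1)
outgoing["visits"]["path"] = "/"

print(outgoing.output())               # e.g. "Set-Cookie: visits=3; Path=/"
print("Content-Type: text/plain")
print()                                # blank line ends the HTTP headers
print(f"You have visited this page {visits + 1} time(s).")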

As cookies have moved to the forefront, their association with user privacy has become more of an issue as time progresses. Even though cookies serve an important role in today’s e-commerce and advertising industries, it is impossible not to think of them as a breach in user security. There is something unsettling about a seemingly forced piece of information being saved on your computer for the use of a computer hundreds or even thousands of miles away. One can only think of one word: privacy. Who’s to say that companies are using the information gathered by these cookies for good?

How do I know that you are collecting cookies for your own advertising or e-commercial purposes rather than probing me as a candidate for the ever-present adware? Do I want vendors to know exactly what it is I usually shop for when I get online? Do I really need to save my shopping time by one or two clicks with the sacrifice of decreased privacy? These are questions that each user asks themselves when we look at our internet security settings or when we are denied access to a site based on our cookie settings.

Many companies have been accused of improper actions concerning cookies. A company named DoubleClick was forced to reach a settlement in 2002 for improper conduct concerning cookies. DoubleClick failed to keep its findings private and to use the information for strictly commercial purposes. DoubleClick is one example of the many companies that have been forced to pay settlements for improper use of cookie information. (EPIC, 2002)

Though cookies are quickly becoming essential to a pleasurable web experience, users must decide whether the decreased security is worth the benefits. The onus of whether you accept a cookie or not is on the individual. Some of us don’t mind the extra clicks to get to what we want. After all, why do we need someone to remind us of what we want or send us extra advertisements? Most people just do not care. Whether cookies continue to remain a nuisance to the Internet community or emerge as an epidemic in Internet privacy, only time will tell.

Why Is Gambling Such a Problem on the Internet?

There are plenty of reasons, and you are going to hear all of them throughout this research paper. There are three main types of Internet gambling: there is the sports book, there are casinos and the lotto, and last but not least there is horseracing. Throughout this paper I will explain the laws against online gambling, why people can get around the laws, and the styles of gambling and how to do them. I think that online gambling is fine; the only person you could hurt is yourself. First I will explain horseracing betting online.

I don’t know whether you have been to the horse track before. It is rather simple: you pick one to three horses that you would like to bet. The hard part is that when you actually go to the track you have to learn the slang terms for the bets. You have to say the track name and the money you want to bet, then the horse or horses. With online horse betting you just have to enter it on the computer. You can even watch a race on the Internet; it’s not like watching television, but you find out what horses win, place and show. That’s what really matters anyway.

There is also a new form of horse gambling on www.lousianadowns.com: a simulated horse race every month. You have to go to the web site and fill out some info. You design your own horse by choosing the color, the name, and the jockey. You pay $29.95 per month. You have to get on the website every day to feed the horse and train him. Then twice a month, your horse will race a simulated race, and if it comes in win, place, or show, you will win money (4). This form of online gambling is my favorite. My second form of online gambling is the casinos and lotteries.

These are everywhere. You can play everything that you can play in an actual casino. For example, blackjack, roulette, poker, slots, craps, and keno (3) are available online. Blackjack is a game where you try to beat the dealer. Poker is a game you can play either way: you can play where you beat the dealer, or you can play where you are betting against other players. Slot machines are what I like to call a gambler’s game for somebody that doesn’t know how to gamble. All you really do is pull the lever and hope to win.

Craps is where you put your money down, roll the dice, then try to roll that number again before rolling a seven, or you lose. Roulette is when you pick a number, they roll the number, and if you win you get 35/1 odds. Then there is the lottery online; it’s your typical lottery. The state or organization running it will draw numbers, and if those are your numbers you win. You can also check your real numbers from the gas station on the Internet as well. This is a very popular way to gamble on the Internet. This isn’t really the way I like to do it, but if somebody wants to take the risk then that’s their choice.
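That 35-to-1 roulette payout sounds generous until you work out the expected value. A quick sketch, assuming a standard American wheel with 38 pockets (1 through 36 plus 0 and 00):

# Expected value of a $1 single-number roulette bet, assuming an American
# wheel with 38 pockets and the 35-to-1 payout mentioned above.
stake = 1.0
pockets = 38
win_probability = 1 / pockets
payout = 35 * stake                    # winnings paid on top of the returned stake

expected_value = win_probability * payout - (1 - win_probability) * stake
print(f"Expected value per $1 bet: ${expected_value:.4f}")   # about -$0.0526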

My third and final way to gamble on the Internet is the online sports book. This is a really simple but also the most dangerous way to gamble. How it works is that you contact your bookie, or the organization that runs the sports book, and find out the spread on a game. A spread is the number of points the sports book gives the underdog to make it an even game; this is called handicapping. For example, say the Cowboys are playing the Rams in football. The Rams are clearly favored to win, so the sports book might give the Cowboys 10 points.

That means that if you bet on the Rams, they would have to beat the Cowboys by more than 10 points or the bet is a loser. A tie is a push, which means nobody wins. The reason this is so dangerous is that the average bettor thinks he knows everything about sports, throws big money on a game, and loses; then he wants to quit but can't. He wants to get back on top, and before long he is addicted, just like with cigarettes. All forms of gambling are addictive and dangerous, and the online world makes it that much easier on a gambler.

They don't even have to leave the house. Is online gambling legal? Whether online gambling is legal or illegal is up to each state, and the debate is always changing from state to state (1). Congress decided to act because of the inconsistencies among the states' legislation and enforcement (1). Here are some laws that prohibit online gambling that you should know about. The Wire Wager Act prohibits the use of the Internet to place bets by preventing the use of a wire transmission facility to further a gambling pursuit (1).

It says that anybody who, being engaged in the business of betting or wagering, knowingly uses a wire communication facility in interstate or foreign commerce to transmit bets or wagers, or information assisting in the placing of bets or wagers on a sporting event or contest, shall be fined under this title, imprisoned for up to two years, or both (1). The second federal statute is the Travel Act, which penalizes a person who travels in interstate or foreign commerce, or uses the mail, to further any unlawful activity (1); unlawful activity includes a business enterprise involving gambling (1). This applies to the Internet because the Internet connection runs through a wire.

The last law that I will discuss is the Professional and Amateur Sports Protection Act, which basically prohibits gambling on a game in which a college or professional athlete participates (1). In November of 1999 the U.S. Senate unexpectedly turned its energies to an anti-gambling bill (2) sponsored by Arizona's Sen. John Kyl. Kyl built on the Wire Wager Act and sent the bill to Congress, where it passed unanimously (2), proof that the fight is still going on. There are, however, some enforcement problems with these Internet laws. First, activity on the Internet is very hard to trace.

Gambling sites can also operate overseas, which is what normally happens, and the United States cannot do anything if the operators do not come over here (1). With virtually every state allowing some form of gambling, mostly the lottery, how could we go to another country and form alliances to prevent any type of gambling (1)? Another effective way for operators to hide is to use sophisticated encryption and to stop using credit-card-based deposit systems (1). The government has one option that is still debated: forcing Internet service providers (ISPs) to block access to gambling sites (1).

If the government finds a way to stop online betting, there will be a way around it; with the technology we have there is always a loophole, so the safe bet is that online gambling will go on (2). The ongoing battle over Internet gambling will probably never stop. Some people believe that gambling is a natural need and that it teaches people to take a risk every once in a while (2). The main problem with online gambling, or any gambling, is that some people don't take it seriously while others take it far too seriously, which sometimes ends in death or beatings; once one illegal thing happens, more are sure to follow.

It is hard to find a person who was harmed by it, because such cases were normally hidden, it being an illegal activity. Throughout this piece I have described the three main types of online gambling, the laws that try to prevent it, and the ways around them. I myself love to gamble, though I really don't do it over the Internet; I prefer to go to the actual site and gamble. Maybe that is my way of saving money; the Internet makes it too easy not to care about, or even realize, what you are doing with your money. Like I said before, it's your money; as long as you have it, do what you want with it.

Internet Censorship Report

The freedom of speech that has been possible on the Internet could now be subjected to governmental approval. For example, China is attempting to restrict political expression in the name of security and social stability. It requires users of the Internet and electronic mail (e-mail) to register so that it may monitor their activities. In the United Kingdom, state secrets and personal attacks are off limits on the Internet; laws are strict and the government is extremely interested in regulating the Internet with respect to these issues. 10 Laws intended for other types of communication will not necessarily apply to this medium.

Through all the components of the Internet it becomes easy to transfer material that particular governments might find objectionable. However, all of these means of communicating on the Internet make up a large and vast system. For inspectors to monitor every e-mail, every article in every newsgroup, every web page, every IRC channel, every Gopher site, and every FTP site would be nearly impossible. Besides taking an extraordinary amount of money and time, attempts to censor the Internet violate freedom of speech rights that are included in democratic constitutions and international laws. 11 It would be a breach of the First Amendment.

The Constitution of the United States of America declares that "Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances." 12 Therefore it would be unconstitutional for any sort of censorship to occur on the Internet and affiliated services. Despite the illegality, restrictions on Internet access and content are increasing worldwide under all forms of government.

In France, a country where the press generally enjoys a large amount of freedom, the Internet has recently been in the spotlight. A banned book on the health history of former French president Francois Mitterrand was republished electronically on the World Wide Web (WWW). Apparently, the electronic reproduction of Le Grand Secret by a third party was not covered by the court ruling that found the printed version of the book had unlawfully violated Mitterrand's privacy.

To enforce censorship of the Internet, free societies find that they become more repressive, while closed societies find new ways to crush political expression and opposition. 13 Vice President Al Gore, in a keynote address at an international conference in Brussels about the Internet, said that "[Cyberspace] is about protecting and enlarging freedom of expression for all our citizens … Ideas should not be checked at the border." 14 Another person attending that conference was Ann Breeson of the American Civil Liberties Union, an organization dedicated to preserving many things, including free speech.

She is quoted as saying, "Our big victory at Brussels was that we pressured them enough so that Al Gore in his keynote address made a big point of stressing the importance of free speech on the Internet." 15 Many other organizations have fought against such laws and have succeeded.

A prime example of this is the fight that various groups put up against the recent Communications Decency Act (CDA) of the U.S. Senate. On 26 February 1996 the Citizens Internet Empowerment Coalition filed a historic lawsuit in Philadelphia against the U.S. Department of Justice and Attorney General Janet Reno to make certain that the First Amendment of the U.S.A. would not be compromised by the CDA.

The sheer range of plaintiffs alone, including the American Booksellers Association, the Freedom to Read Foundation, Apple, Microsoft, America Online, the Society of Professional Journalists, the Commercial Internet eXchange Association, Wired, and HotWired, as well as thousands of netizens (citizens of the Internet), shows the dedication felt by many different people and groups to the cause of free speech on the Internet. 16 Words like *censored*, *censored*, piss, and tits: words of which our mothers (at least some of them) would no doubt disapprove, but which by no means should be regulated by the government.

But it's not just about dirty words. It's also about words like AIDS, gay, and breasts. It's about sexual content and politically controversial topics like drug addiction, euthanasia, and racism. 17 Just recently in France, a high court struck down a bill that promoted censorship of the Internet. Other countries have attempted similar moves. The Internet cannot be regulated in the way of other media simply because it is not the same as anything else we have. It is a totally new and unique form of communication and deserves to be given a chance to prove itself.

The laws of one country cannot hold jurisdiction in another country, and this holds true on the Internet because it has no borders. Although North America (mainly the United States) has the largest share of servers, the Internet is still a worldwide network. This means that domestic regulations cannot oversee the rules of foreign countries. It would be just as easy for an American teen to download (receive) pornographic material from England as it would be from down the street. One of the major problems is this lack of physical boundaries, which makes it difficult to determine where violations of the law should be prosecuted.

There is no one place through which all information passes. That was one of the key points stressed during the original days of the Internet, then called ARPANET. It started out as a defense project that would allow communication in the event of an emergency such as a nuclear attack. Without a central authority, information would pass around until it got where it was going. 18 This was intended to be similar to the road system: it is not necessary to take any specific route, as any one of them will do. In the same way, information on the Internet starts out and eventually gets to its destination.

The Internet is full of anonymity. Since text is the standard form of communication on the Internet, it becomes difficult to determine the identity and/or age of a specific person. Nothing is known for certain about a person accessing content. There are no signatures or photo IDs on the Internet, so it is difficult to certify that illegal activities (such as minors accessing restricted data) are taking place. Take, for example, a conversation on IRC: two people could be talking to one another, but all that they see is text.

It would be extremely difficult, if not impossible, to ascertain gender and/or age just from communication of this sort. And if the conversationalist lies about any of the points mentioned above, it would be extremely difficult to know or prove otherwise. In this way governments cannot restrict access to certain sites on the basis of age. A thirteen-year-old boy in British Columbia could decide that he wanted to download pornography from an adult site in the U.S. The site may have warnings and age restrictions, but it has no way of stopping him from receiving its material if he says he is 19 years of age when prompted.

The complexity of the way information is passed around the Internet means that once information has been posted, deleting it becomes almost impossible. A good example of this is the junk mail that people refer to as spam, which includes e-mails advertising products and Usenet articles that invite flames. Flames are heated messages that often have no foundation behind them. They seem to float around for ages before dying out because they are perfect material for flamewars: long, drawn-out, and highly heated discussions consisting of flames, which often, and obscenely, slander a person's reputation and persona.

Mostly these are immature arguments that are totally pointless except to those involved. The millions of people who participate on the Internet every day have access to almost all of the data present. As well, it is easy to copy something that exists on the Internet with only the click of a button. The relative ease of copying data means that the second information is posted to the Internet, it may be archived somewhere else.

There are in fact many sites on the Internet devoted to the archiving of information, including ftp.cdrom.com (which, among other things, archives an extraordinary amount of software), www.archive.org (which is working towards archiving as much of the WWW as possible), and wuarchive.wustl.edu (which is dedicated to archiving software, publications, and many other types of data).

It becomes hard to censor material that might be duplicated or triplicated within a matter of minutes. An example is the recent hacking of the U.S. Department of Justice's homepage and of the Central Intelligence Agency's homepage. Someone illegally obtained access to the computers on which these homepages were stored and modified them.

It was done as a prank; however, both of these agencies have since shut down their pages. 2600 (www.2600.com), a magazine devoted to hacking, has republished the hacked DoJ and CIA homepages on its website. The magazine either copied the data straight from the hacked sites or the hacked pages were submitted to the magazine. I don't know which is true, but it does show the ease with which data can be copied and distributed, as well as the difficulty of preventing material deemed inappropriate from appearing where it shouldn't.

The Internet is much too complex a network for censorship to effectively occur. It is a totally new and unique environment in which communications transpire. Existing laws are not applicable to this medium. The lack of tangible boundaries causes confusion as to where violations of law take place. The Internet is made up of nameless interaction and anonymous communication. The intricacy of the Internet makes it near impossible to delete data that has been publicized. No one country should be allowed to, or could, regulate or censor the Internet.

Government Intervention Of The Internet

During the past decade, our society has come to depend on the ability to move large amounts of information across large distances quickly. Computerization has influenced everyone's life. The natural evolution of computers and this need for ultra-fast communication have caused a global network of interconnected computers to develop. This global net allows a person to send e-mail across the world in mere fractions of a second and enables even the ordinary person to access information worldwide.

With advances such as software that allows users with a sound card to use the Internet as a carrier for long-distance voice calls and video conferencing, this network is key to the future of the knowledge society. At present, this net is the epitome of the First Amendment: free speech. It is a place where people can speak their minds without being reprimanded for what they say or how they choose to say it. The key to the worldwide success of the Internet is its protection of free speech, not only in America but in other countries where free speech is not protected by a constitution.

To be found on the Internet is a huge collection of obscene graphics, anarchists' cookbooks, and countless other things that offend some people. With over 30 million Internet users in the U.S. alone (only 3 million of whom surf the net from home), everything is bound to offend someone. The newest wave of laws floating through lawmaking bodies around the world threatens to stifle this spontaneity. Recently, Congress has been considering passing laws that would make it a crime punishable by jail to send "vulgar" language over the net or to export encryption software.

No matter how small, any attempt at government intervention in the Internet will stifle the greatest communication innovation of this century. The government wants to maintain control over this new form of communication, and it is trying to use the protection of children as a smoke screen to pass laws that will allow it to regulate and censor the Internet, while banning techniques that could eliminate the need for regulation. Censorship of the Internet threatens to destroy its freewheeling atmosphere, while widespread encryption could help eliminate the need for government intervention.

The body of laws existing today in America does not apply well to the Internet. Is the Internet like a bookstore, where servers cannot be expected to review every title? Is it like a phone company, which must ignore what it carries because of privacy? Is it like a broadcasting medium, where the government monitors what is broadcast? The trouble is that the Internet can be all or none of these things depending on how it is used. The Internet cannot be viewed as one type of transfer medium under current broadcast definitions.

The Internet differs from broadcast media in that one cannot just happen upon a vulgar site without first entering a complicated address or following a link from another source. "The Internet is much more like going into a book store and choosing to look at adult magazines" (Miller 75). Jim Exon, a Democratic senator from Nebraska, wants to pass a decency bill regulating the Internet. If the bill passes, certain commercial servers that post pictures of unclad beings, like those run by Penthouse or Playboy, would of course be shut down immediately or risk prosecution.

The same goes for any amateur website that features nudity, sex talk, or rough language. Posting any dirty words in a Usenet discussion group, which occurs routinely, could make one liable for a $50,000 fine and six months in jail. Even worse, if a magazine that commonly runs some of those nasty words in its pages, The New Yorker for instance, decided to post its contents on-line, its leaders would be held responsible for a $100,000 fine and two years in jail. Why does it suddenly become illegal to post something that has been legal for years in print?

Exon's bill apparently would also "criminalize private mail": "I can call my brother on the phone and say anything, but if I say it on the Internet, it's illegal" (Levy 53). Congress, in its pursuit of regulations, seems to have overlooked the fact that the majority of the adult material on the Internet comes from overseas. Although many U.S. government sources helped fund ARPANET, the predecessor of the Internet, they no longer control it. Many of the new Internet technologies, including the World Wide Web, have come from overseas.

There is no clear boundary between information held in the U.S. and information stored in other countries. Data held in foreign computers is just as accessible as data in America; all it takes is the click of a mouse. Even if our government tried to regulate the Internet, we have no control over what is posted in other countries, and we have no practical way to stop it. The Internet's predecessor was originally designed to maintain communications after a nuclear attack by rerouting data to compensate for destroyed telephone lines and servers.

Today's Internet still works on a similar design. The very nature of this design allows the Internet to overcome any kind of barrier put in its way. If a major line between two servers, say in two different countries, is cut, then Internet traffic will find another way around the obstacle. This obstacle avoidance makes it virtually impossible to separate an entire nation from indecent information in other countries. Even if it were physically possible to isolate America's computers from the rest of the world, doing so would be devastating to our economy.

Recently, a major university attempted to regulate what types of Internet access its students had, with results reminiscent of a 1960s protest. A research associate at Carnegie Mellon University, Martin Rimm, conducted a study of pornography on the school's computer networks. He put together quite a large picture collection (917,410 images) and also tracked how often each image had been downloaded (a total of 6.4 million times). Pictures of similar content had recently been declared obscene by a local court, and the school feared it might be held responsible for the content of its network.

The school administration quickly removed access to all these pictures and to the newsgroups from which most of this obscenity was suspected to come. A total of 80 newsgroups were removed, causing a large disturbance among the student body, the American Civil Liberties Union, and the Electronic Frontier Foundation, all of whom felt this was unconstitutional. After only half a week, the college backed down and restored the newsgroups. This is a small example of what may happen if the government tries to impose censorship (Elmer-Dewitt 102).

Currently, software is being released that promises to block children's access to known X-rated Internet newsgroups and sites. However, since many adults rely on their computer-literate children to set up these programs, the children will be able to find ways around them. This mimics real life, where these children would surely be able to get their hands on an adult magazine. Regardless of what types of software or safeguards are used to protect the children of the information age, there will be ways around them. This makes it necessary to educate children to deal with reality.

Altered views of an electronic world translate easily into altered views of the real world. "When it comes to our children, censorship is a far less important issue than good parenting. We must teach our kids that the Internet is an extension and a reflection of the real world, and we have to show them how to enjoy the good things and avoid the bad things. This isn't the government's responsibility. It's ours" (Miller 76). Not all restrictions on electronic speech are bad. Most of the major on-line communication companies have restrictions on what their users can "say." They must respect their customers' privacy, however.

Private e-mail content is off limits to them, but they may act swiftly against anyone who spouts obscenities in a public forum. Self-regulation by users and servers is the key to avoiding government-imposed intervention. Many on-line sites, such as Playboy and Penthouse, have started to regulate themselves. Both post clear warnings that adult content lies ahead and list the countries where this content is illegal. The film and video game industries subject themselves to ratings, and if Internet users want to avoid government-imposed regulations, then it is time they began to regulate themselves.

It all boils down to protecting children from adult material while protecting the First Amendment right to free speech between adults. Government attempts to regulate the Internet are not limited to obscenity and vulgar language; they also reach into other areas, such as data encryption. By nature, the Internet is an insecure method of transferring data. A single e-mail packet may pass through hundreds of computers from its source to its destination. At each computer there is a chance that the data will be archived and that someone may intercept it. Credit card numbers are a frequent target of hackers.

Encryption is a means of encoding data so that only someone with the proper "key" can decode it. "Why do you need PGP (encryption)? It's personal. It's private. And it's no one's business but yours. You may be planning a political campaign, discussing your taxes, or having an illicit affair. Or you may be doing something that you feel shouldn't be illegal, but is. Whatever it is, you don't want your private electronic mail (E-mail) or confidential documents read by anyone else. There's nothing wrong with asserting your privacy. Privacy is as apple-pie as the Constitution.

Perhaps you think your E-mail is legitimate enough that encryption is unwarranted. If you really are a law-abiding citizen with nothing to hide, then why don’t you always send your paper mail on postcards? Why not submit to drug testing on demand? Why require a warrant for police searches of your house? Are you trying to hide something? You must be a subversive or a drug dealer if you hide your mail inside envelopes. Or maybe a paranoid nut. Do law-abiding citizens have any need to encrypt their E-mail? What if everyone believed that law-abiding citizens should use postcards for their mail?

If some brave soul tried to assert his privacy by using an envelope for his mail, it would draw suspicion. Perhaps the authorities would open his mail to see what he’s hiding. Fortunately, we don’t live in that kind of world, because everyone protects most of their mail with envelopes. So no one draws suspicion by asserting their privacy with an envelope. There’s safety in numbers. Analogously, it would be nice if everyone routinely used encryption for all their E-mail, innocent or not, so that no one drew suspicion by asserting their E-mail privacy with encryption.

Think of it as a form of solidarity" (Zimmerman). Until the development of the Internet, the U.S. government controlled most new encryption techniques. With the development of faster home computers and a worldwide web, it no longer holds control over encryption. New algorithms have been discovered that are reportedly uncrackable even by the FBI and the NSA. This is a major concern to the government because it wants to carry the ability to conduct wiretaps and other forms of electronic surveillance into the digital age.

To stop the spread of data encryption software, the U.S. government has imposed very strict laws on its exportation. One well-known example of this is the PGP (Pretty Good Privacy) scandal. PGP was written by Phil Zimmerman and is based on "public key" encryption. This system uses complex algorithms to produce two keys, one for encoding and one for decoding. To send an encoded message to someone, a copy of that person's "public" key is needed. The sender uses this public key to encrypt the data, and the recipient uses their "private" key to decode the message.
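As a rough illustration of that two-key idea, the toy Python sketch below uses deliberately tiny RSA-style numbers so the arithmetic stays visible. It is not PGP's actual implementation, which layers symmetric ciphers, padding, and key management on top of much larger keys.

# Toy "public key" encryption with tiny RSA-style numbers (illustration only).
def make_keypair():
    p, q = 61, 53                 # two small primes; real keys are hundreds of digits long
    n = p * q                     # modulus shared by both keys
    phi = (p - 1) * (q - 1)       # Euler's totient of n
    e = 17                        # public exponent, chosen coprime with phi
    d = pow(e, -1, phi)           # private exponent: modular inverse of e (Python 3.8+)
    return (e, n), (d, n)         # (public key, private key)

def encrypt(message_number, public_key):
    e, n = public_key
    return pow(message_number, e, n)    # ciphertext = m^e mod n

def decrypt(ciphertext, private_key):
    d, n = private_key
    return pow(ciphertext, d, n)        # plaintext = c^d mod n

public, private = make_keypair()
secret = 42                             # a message already encoded as a number smaller than n
ciphertext = encrypt(secret, public)    # anyone who looks up the public key can do this step
print(decrypt(ciphertext, private) == secret)   # only the private-key holder can reverse it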

As Zimmerman was finishing his program, he heard about a proposed Senate bill to ban cryptography. This prompted him to release his program for free, hoping that it would become so popular that its use could not be stopped. One of the original users of PGP posted it to an Internet site where anyone from any country could download it, prompting a federal investigator to begin investigating Zimmerman for violating export restrictions on cryptography. As with any new technology, the program has allegedly been used for illegal purposes, and the FBI and NSA are believed to be unable to crack its code.

When told about the illegal uses of his program, Zimmerman replies: "If I had invented an automobile, and was told that criminals used it to rob banks, I would feel bad, too. But most people agree the benefits to society that come from automobiles — taking the kids to school, grocery shopping and such — outweigh their drawbacks" (Levy 56). Currently, PGP can be downloaded from MIT, which uses a rather complicated system that changes the software's location to make sure the distribution is protected. All that needs to be done is to click "YES" to four questions dealing with exportation and use of the program, and it is there for the taking.

This seems like a lot of trouble to prevent the spread of a program that is already worldwide. The government wants to protect its ability to wiretap legally, but what good does it do to stop encryption in foreign countries? It cannot legally wiretap someone in another country, and it certainly cannot ban encryption in the U.S. The government has not been totally blind to the need for encryption. For nearly two decades a government-sponsored algorithm, the Data Encryption Standard (DES), has been used, primarily by banks.

The government always maintained the ability to decipher this code with its powerful supercomputers. Now that new forms of encryption have been devised that the government cannot decipher, it is proposing a new standard to replace DES. This new standard is called Clipper, and it is based on "public key" algorithms. Instead of software, Clipper is a microchip that can be incorporated into just about anything (televisions, telephones, etc.). Its algorithm uses a much longer key and is roughly 16 million times stronger than DES (an 80-bit key allows 2^24, about 16.8 million, times as many possible keys as DES's 56-bit key).

It is estimated that today's fastest computers would take 400 billion years to break this code by trying every possible key (Lehrer 378). "The catch: at the time of manufacture, each Clipper chip will be loaded with its own unique key, and the Government gets to keep a copy, placed in escrow. Not to worry, though; the Government promises that they will use these keys to read your traffic only when duly authorized by law.

Of course, to make Clipper completely effective, the next logical step would be to outlaw other forms of cryptography (Zimmerman). “If privacy is outlawed, only outlaws will have privacy. Intelligence agencies have access to good cryptographic technology. So do the big arms and drug traffickers. So do defense contractors, oil companies, and other corporate giants. But ordinary people and grassroots political organizations mostly have not had access to affordable “military grade” public-key cryptographic technology. Until now. PGP empowers people to take their privacy into their own hands.

There's a growing social need for it. That's why I wrote it" (Zimmerman). The most important benefits of encryption have been conveniently overlooked by the government. If everyone used encryption, there would be no way for an innocent bystander to happen upon something they chose not to see. Only the intended receiver of the data could decrypt it (with public-key cryptography, not even the sender can decrypt it) and view its contents. Each coded message can also carry an encrypted signature verifying the sender's identity. The sender's secret key can be used to encrypt an enclosed signature message, thereby "signing" it.

This creates a digital signature of the message, which the recipient (or anyone else) can check by using the sender's public key to decrypt it. This proves that the sender was the true originator of the message and that the message has not been subsequently altered by anyone else, because the sender alone possesses the secret key that made that signature. "Forgery of a signed message is infeasible, and the sender cannot later disavow his signature" (Zimmerman). Gone would be the hate mail that causes many problems, and gone would be the ability to forge a document with someone else's address.
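The sign-then-verify flow just described can be sketched with the same toy numbers used in the earlier encryption sketch. A real system such as PGP hashes the message and signs it with a far larger key, so this is only meant to show why an altered message no longer matches the signature.

import hashlib

# Toy RSA-style key pair (same tiny primes as before; far too small for real use).
n = 61 * 53                # modulus
e, d = 17, 2753            # public and private exponents (d is the inverse of e mod 3120)

def sign(message: bytes) -> int:
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(digest, d, n)        # only the holder of the secret key d can produce this

def verify(message: bytes, signature: int) -> bool:
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == digest    # anyone with the public key e can check it

msg = b"I did send this message."
sig = sign(msg)
print(verify(msg, sig))                      # True: message and signature match
print(verify(b"I never sent this.", sig))    # False: altering the text breaks the signature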

The government, if it did not have ulterior motives, would mandate encryption, not outlaw it. As the Internet continues to grow throughout the world, more governments may try to impose their views on the rest of the world through regulation and censorship. It will be a sad day when the world must adjust its views to conform to those of the most prudish regulatory government. If too many regulations are enacted, then the Internet as a tool will become nearly useless, and the Internet as a mass communication device and a place for freedom of mind and thought will cease to exist.

The users, servers, and parents of the world must regulate themselves, so as not to force government regulations that may stifle the best communication instrument in history. If encryption catches on and becomes as widespread as Zimmerman predicts it will, then there will no longer be a need for the government to meddle in the Internet, and the biggest problem will work itself out. The government should rethink its approach to the censorship and encryption issues, allowing the Internet to continue to grow and mature.

The Internet and Its Services

Working with the Internet does not mean just browsing the WWW and sending and receiving e-mail. The basic structure of the Internet developed over its 30 years of existence. The Internet is a heterogeneous worldwide network consisting of a large number of host computers and local area networks, and it uses the TCP/IP suite of protocols. This allows the integration of a large number of different computers into one single network with highly efficient communication between them. This way, the user can access information on all kinds of host computers from a desktop PC, a Macintosh, or whatever he or she has available.

TCP/IP, the communication standard underlying the Internet, originates from work done at the U.S. Department of Defense in the late 1960s. The first version of the Internet was built in 1969 and consisted of just four computers. In 1982 a set of specifications and protocols was implemented, which became known as TCP/IP in reference to its two major elements, the "Transmission Control Protocol" and the "Internet Protocol". The development and implementation of TCP/IP stimulated a massive growth process for the Internet. "By late 1987 it was estimated that the growth had reached 15% per month and remained high for the following two years.

By 1990, the connected Internet included over 3,000 active networks and over 200,000 computers. By January 1992 the number of hosts on the Internet was 727,000, doubling about every 7 months." Various groups of users are connected to the Internet: universities and other educational institutions, government agencies, the military, and, in increasing numbers, private businesses. The most fundamental function of the Internet is to pass electronic information from one computer to another. A 32-bit Internet address, or IP number, identifies every computer on the network.

This number is commonly represented as four numbers joined by periods. The Internet uses these numbers to guide information through the network; this is called routing. For human users, however, such numbers are difficult to keep in mind. Therefore, computers are also identified by domain names, which are to some extent similar to mailing addresses. Special programs, called name servers, translate domain names into IP addresses. Internet services can be divided into two groups: communication services and information services.
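Before turning to those services, the two points about addresses, that the dotted form is just one 32-bit number written out for humans and that a name server turns a domain name into such a number, can be seen with Python's standard socket module; the host name below is only a placeholder.

import socket
import struct

def dotted_to_int(address: str) -> int:
    # Pack the four period-separated numbers into a single 32-bit integer.
    return struct.unpack("!I", socket.inet_aton(address))[0]

def int_to_dotted(value: int) -> str:
    # Render a 32-bit integer back in the familiar dotted form.
    return socket.inet_ntoa(struct.pack("!I", value))

print(dotted_to_int("192.0.2.1"))      # 3221225985: the number routers actually work with
print(int_to_dotted(3221225985))       # "192.0.2.1"

# A name server resolves a human-readable domain name to an IP address.
# "www.example.com" is a placeholder; any reachable host name would do.
print(socket.gethostbyname("www.example.com"))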

In the first group, the Internet mediates communication between two or more individuals. In the second group, the user turns to an Internet service in search of some particular information. Communication services can roughly be compared to a telephone call, information services to a dictionary. The most important communication services on the Internet are electronic mail, Netnews, and some derived services. Major information services are terminal emulation and file transfer, Gopher, WAIS, and the World Wide Web. Electronic mail is the most popular and widely used network service.

It can be viewed as the electronic equivalent of a regular mail letter. When one user wants to send a message to another Internet user, he types the message into a special computer program, adds the e-mail address of the recipient, and sends the message off through the network. Typically, the message reaches its destination almost immediately, even when the recipient is on another continent. Practically all gateways between the Internet and other computer networks can handle e-mail messages. Therefore, many more people can be reached by e-mail than are connected to the Internet via TCP/IP.
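That compose-address-send sequence can be scripted directly. The sketch below uses Python's standard smtplib with placeholder addresses and a placeholder mail server, which would have to be replaced with real ones.

import smtplib
from email.message import EmailMessage

# Compose the message: body text plus the recipient's e-mail address.
msg = EmailMessage()
msg["From"] = "alice@example.org"        # placeholder sender
msg["To"] = "bob@example.net"            # placeholder recipient
msg["Subject"] = "Greetings from another continent"
msg.set_content("E-mail usually arrives within seconds, wherever you are.")

# Hand the message to a mail server, which relays it through the network.
# "mail.example.org" stands in for whatever SMTP server the sender actually uses.
with smtplib.SMTP("mail.example.org", 25) as server:
    server.send_message(msg)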

The major advantage of e-mail over regular mail is that an e-mail message comes in electronic form and can therefore easily be handled and interpreted by a computer program. This feature is used by electronic discussion lists, which maintain a database of subscribers and automatically distribute each incoming e-mail message to every subscriber. Listserv, the most popular program for managing discussion lists, also handles subscriptions via mail messages, archives incoming messages, and allows users to retrieve these and other archived files. Thousands of discussion lists focusing on all kinds of topics exist on the Internet.

Topics range from hobbies and political discussions to operational aspects of different computer systems and research questions. For the user, discussion lists are an easy way to identify and contact a large number of people with similar interests. A discussion list can also be considered a worldwide forum for expressing views and discussing opinions. Whereas the messages of a discussion list are automatically sent to all subscribed users, and one has to be subscribed in order to receive them, messages in Netnews are distributed among a network of servers.

Messages are organized in a hierarchy of newsgroups. Incoming messages are stored for a particular period in a publicly accessible area. Each user can connect to this area, browse through the stored messages, and respond to any of them. In this way Netnews allows a better overview of ongoing discussions, but it requires the user to actively connect to the respective area. One of the reasons for the creation of a computer network like the Internet was to give users access to remote computers and to allow them to transfer files to and from those machines. These are the typical tasks of Telnet and of FTP, the File Transfer Protocol.

In both services, the user specifies the remote host he wants to access through its IP number or domain name. When the user has an account on this remote host, he can work there just as on a local machine. With Telnet available, Internet sites increasingly allowed outside users to access some of their information services. Typical examples are electronic library card catalogs, campus information systems, and other database applications. Today, the electronic card catalogs of practically all major libraries and many university libraries in Europe are publicly accessible through the Internet.

Many sites run large specialized information systems for the network community. A similar type of anonymous access is available through FTP: many Internet sites allow users to log in as "anonymous" through FTP, and in this way the sites can make files publicly available. This system has led to a huge supply of freeware and shareware software distributed through this channel; practically all network software is available in this way. In addition, information files, specifications of network standards, research papers, multimedia files, and even the complete texts of classic books can be accessed through anonymous FTP.

When using an FTP client program to download files, make sure that it sends a bogus password, such as a made-up e-mail address, rather than your real one. If your browser lets you, turn off the feature that sends your e-mail address as the password for anonymous FTP sessions. The File Transfer Protocol, as its name states, is a set of rules that dictates how files should be transferred over TCP/IP. A basic FTP connection consists of a client and a server: the client gets a file by opening a connection to the server. Usually the server runs on port 21, although the system administrator can change this if he or she wishes.
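A minimal sketch of that client/server exchange, using Python's standard ftplib against a placeholder host (any server that still offers anonymous FTP behaves the same way):

from ftplib import FTP

# Connect to the server's FTP port (21 by default) and log in anonymously.
# "ftp.example.org" is a placeholder; the password is deliberately not a real address.
ftp = FTP("ftp.example.org")
ftp.login(user="anonymous", passwd="guest@")

ftp.cwd("/pub")              # move to the public download area
ftp.retrlines("LIST")        # print the directory listing

# Retrieve one file: the client asks, the server streams the bytes back.
with open("README", "wb") as local_file:
    ftp.retrbinary("RETR README", local_file.write)

ftp.quit()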

Everything You Ever Wanted to Know About EDI

EDI, or Electronic Data Interchange, is the transfer of business documents such as sales invoices, purchase orders, and price quotations, using a pre-established format in a paperless electronic environment. Usually this transfer occurs over VANs (Value Added Networks), but it is becoming increasingly popular over the Internet because of the cost savings and ease of use. EDI has been around for approximately 30 years. "The true genesis of EDI occurred in the mid-1960s, as an early attempt at implementing the fictional 'paperless' office by companies in the transportation, grocery and retail industry segments.

Although EDI never eliminated paper documents, it decreased the number of times such documents were handled by people. Reduced handling resulted in fewer errors and faster transfers” (Millman, 83). EDI technology is rapidly changing the way business is conducted throughout the world. Firms that use EDI are more efficient and responsive to the needs of customers and partners and in many cases have jumped out ahead of the competition.

Many businesses are already using EDI with suppliers and customers, and if your firm wants to do business with companies involved in government dealings, EDI must be part of your business no later than January 1, 1999. In May of this year, the major industrial groups in charge of standards setting for EDI united behind a set of standards that will allow seamless web-based forms using the Extensible Markup Language (XML), similar to HTML, thereby increasing the accessibility of EDI for small businesses on the Internet (Campbell, 28).

An example of an application for EDI is filing tax returns with the Internal Revenue Service. The IRS offers several options for filing your tax return, one of which is filing electronically and receiving your refund by electronic funds transfer, or "direct deposit." The forms used are available in tax-preparation software, which can be downloaded from the Internet or purchased at retail. The forms are filled out directly on a PC and then transmitted to another computer, which acts as a midpoint to the IRS. The IRS receives your forms and can issue a refund without ever having to reprocess the data.

By using this method you save yourself and the IRS time and money (Campbell, 28). (Table: Information Technology for Management, Turban, McLean, and Wetherbe, p. 244.) Information, such as purchase orders for medical supplies, flows from the hospital's information system into an EDI station, which consists of a PC, an EDI translator, and a modem. From there, the information moves to a VAN (Value Added Network). The VAN transfers the formatted information to the vendor, where the vendor-side EDI translator converts it to the desired format (Turban, 244).

An EDI translator converts the data into a standard format; a simplified sketch of such formatting appears after the hospital cost example below. (The original table is from Information Technology for Management, Turban, McLean, and Wetherbe, p. 243.) "An average hospital generates about 15,000 purchase orders each year at a processing cost of about $70 per order. The Health Industry Business Communication Council estimates that EDI can reduce this cost to $4 per order, a potential yearly saving of $840,000 per hospital. The required investment ranges between $8,000 and $15,000.

This includes the purchase of a PC with an EDI translator, a modem, and a link to the mainframe-based information system. The hospital can have two or three ordering points. These are connected to a value-added network (VAN), which connects the hospital to its suppliers. The system can also connect to other hospitals, or to centralized joint purchasing agencies" (Turban, 244).
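To make the idea of a translator concrete, the sketch below flattens a purchase order into a delimited, standardized message. The segment names and layout are simplified stand-ins, not real ANSI X12 or EDIFACT syntax, and the order data is invented.

# A purchase order as it might sit in the hospital's information system (invented data).
order = {
    "po_number": "4501234",
    "order_date": "19981102",
    "buyer": "GENERAL HOSPITAL",
    "lines": [
        {"item": "SURGICAL GLOVES", "qty": 200, "unit_price": 4.50},
        {"item": "SYRINGE 10ML", "qty": 500, "unit_price": 0.35},
    ],
}

def translate(order: dict) -> str:
    # Flatten the order into segments: one header, one segment per line item, one trailer.
    segments = [f"HDR*{order['po_number']}*{order['order_date']}*{order['buyer']}"]
    for line in order["lines"]:
        segments.append(f"ITM*{line['item']}*{line['qty']}*{line['unit_price']:.2f}")
    segments.append(f"TRL*{len(order['lines'])}")
    return "~".join(segments) + "~"      # '~' separates segments, '*' separates fields

print(translate(order))
# HDR*4501234*19981102*GENERAL HOSPITAL~ITM*SURGICAL GLOVES*200*4.50~ITM*SYRINGE 10ML*500*0.35~TRL*2~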

There are numerous benefits associated with the adoption of EDI. Probably the largest and most important benefit is efficiency: by utilizing EDI, businesses are able to streamline their whole supply-chain process. Whether it is upstream to suppliers or downstream to customers, EDI eliminates repetitive tasks such as entering data multiple times and cuts the costs of printing hard copies and of transportation. EDI also allows you to send and receive large amounts of data quickly to or from anywhere in the world; anywhere there is access to the Internet there is access to EDI. For example, a supplier could be located in Taiwan while the customer is sitting in Memphis, TN, and in no more than a few seconds thousands of product order forms could be sent between the two without any errors or lost data.

Companies in partnership agreements can gain access to one another's shared databases to retrieve and store regular transactions. These partnerships also tend to last a long time because of the commitment of a long-term investment and the refinement of the system over a period of time. EDI creates a completely paperless transaction processing system environment, which saves money and increases efficiency. Collecting bills and making payments can be shortened by several weeks because the data does not have to be re-entered several times (Turban, 245).

There are other benefits to using EDI: security and validation. Using EDI is secure as long as it is not conducted over the Internet. The information is transmitted over a VAN and on to your partner, but never enters the realm of the World Wide Web. There are only three points of contact, versus the millions of interconnections and links on the Internet. The use of EDI also provides a means of validation through time codes embedded in the string of electronic codes attached to each file; it is time-coded at every step in the transmission process.

Imagine no longer having to rely on postmarks, call a package delivery service, or check to make sure a fax went through (Campbell, 28). Despite all of the benefits of EDI, there are still some disadvantages that have drawn much criticism. First and foremost is the cost: the only companies that can really afford to utilize EDI to its fullest potential are the Fortune 1,000 and Global 2,000 firms (Millman, 83). "Traditional EDI works fine in the larger enterprises because they have IS professionals to maintain the system," says Dennis Freeman, director of product marketing at Harbinger, an EDI software and services supplier in Atlanta.

"Those companies exchange business documents with their trading partners and save themselves a huge amount of money by not using paper," Freeman continues. "For smaller companies, that process becomes much more daunting. They don't want, nor can they afford, to own a full-blown EDI server. For them, Internet-based EDI is a low-cost answer" (Millman, 38). Another disadvantage of EDI is that there can be communication problems with trading partners who use different EDI software. Internet EDI has solved this problem, but at the cost of decreased security, which is a major issue right now, not to mention that network capacity may be unsatisfactory.

EDI is also used in the federal government. Under the Federal Acquisition Streamlining Act (FASA), as of June 1, 1998, any organization that wanted to do business with the Department of Defense was required to register with the Centralized Contractor Registration (CCR) to receive government contracts. Companies with existing contracts did not have to register initially, but the DOD will institute a system-wide electronic funds transfer payment system by January 1, 1999. "By that time every contractor that wants to get paid must complete its registration" (Campbell, 29). The resistance of some firms towards the adoption of EDI is still very evident today.

"Recently EDI has received much attention in the transportation and logistics literature, particularly in the area of shipper-carrier relationships. EDI is often considered an indicator of mutual cooperation between a carrier and a shipper, as well as between a customer and a supplier in a supply chain. As people began viewing EDI as the glue that holds partnerships together, the decision to adopt (or not adopt) became a decision to be made by multiple firms rather than a single-firm decision. The use of EDI for transportation-related transactions by a shipper, for example, requires the use of EDI by its carriers.

Similarly, if a retailer wants to use EDI for achieving higher operational efficiency, the use of EDI by its suppliers is a must. In other words, a successful implementation of EDI strategy (strategic use of EDI for achieving higher cost efficiency and improved customer service) greatly depends on whether a firm can persuade its trading partners to adopt the EDI system. Given these situations, firms pursuing the EDI strategy began feeling the need to learn effective ways of persuading their trading partners to adopt the EDI system” (Suzuki, 36).

In an attempt to provide insight into how to persuade non-adopters to adopt EDI, numerous researchers conducted studies to determine the factors that influence the decision to adopt or not. Those studies are invaluable tools that firms can use to attempt to persuade their trading partners to adopt EDI. The studies, however, did not examine why firms choose not to adopt EDI. Adoption and resistance behavior may seem to be opposite sides of the same coin, but in fact there can be numerous reasons not to adopt EDI.

For instance, a firm's business environment may not require the adoption of EDI; that does not mean the firm is resisting EDI, it just means it does not need it. On the other hand, if a firm has adopted EDI but only uses it for certain limited purposes, such as issuing bills of lading, despite repeated requests by its trading partners to increase usage, then that firm, although it has adopted EDI, still shows strong resistance. The studies identified three main factors that contribute to a firm's resistance to EDI.

These factors are uncertainty, EDI standardization, and perceived EDI benefit; each is discussed in the following paragraphs. The concept of uncertainty has been defined as the "inability of a firm to forecast accurately the technical requirements of the EDI system in the relatively near future" (Suzuki, 37). When there is a high level of unpredictability, firms tend not to form long-lasting agreements with other firms, such as those found in an EDI partnership, because they wish to retain the flexibility to terminate relationships whenever necessary.

EDI increases the strength of the partner relationship; therefore, a firm is less likely to accept EDI sharing with its trading partners when it perceives a high degree of uncertainty in EDI technology. EDI standardization refers to how widely an industry-wide EDI format has been adopted. When company-exclusive (many different) EDI formats are more common in an industry, EDI standardization in that industry is considered low. However, if industry-wide (standardized or similar) EDI formats are more common, EDI standardization is high.

The lack of EDI standardization is considered a major barrier to the adoption of EDI. Firms with limited EDI knowledge can become confused by all of the different EDI formats and be dissuaded from adopting EDI at all. Basically, in industries where firms adopt their own EDI formats, other firms are less likely to accept EDI agreements, because linking with those firms would require adopting still more document formats. EDI benefits have been discussed at length in this paper, in the literature, and in just about every article or study about EDI.

The lower the benefit a firm perceives in EDI, the less likely that firm is to adopt the technology. If a firm believes that EDI offers it no benefit, it may not, and probably will not, adopt EDI despite the pleadings of its trading partners, or it may adopt the system but be reluctant to use it (Suzuki, 39). The most obvious benefits of EDI are the speed and accuracy of data transmission, but they can, and often do, include improved customer service quality.

"A successful implementation of EDI strategy greatly depends on whether a firm can persuade its supply-chain partners to adopt the EDI system. Convincing other parties to use the EDI system, however, may not be an easy task, because partners often show strong resistance to EDI usage. The study of resistance to the adoption of EDI indicated that firms tend to show stronger resistance to EDI when they perceive a high level of uncertainty, a low distribution rate of industry-wide EDI formats and standards, and little benefit from using EDI to reduce processing time" (Suzuki, 40).

Therefore, in order to implement EDI successfully across the entire supply chain, a firm must clarify the details of the agreement, adequately demonstrate the reliability and benefits of EDI, and ensure a set global format that can be used anywhere. While EDI may not be for everyone, no other form of electronic commerce can match its benefits and popularity across all industries. "EDI's cost savings are the stuff of accounting legends.

For example, RJR Nabisco estimates that the cost for processing a paper-based purchase order is almost $70, whereas the same transaction performed through EDI costs less than $1. Further, EDI’s single entry of transactions minimizes repetitive data entry and reduces keystroke errors. The almost instantaneous nature of an EDI transaction shortens the time between creating a purchase order and sending an invoice to seconds. Until recently, there were few alternatives to EDI that offered the speed, standardization, and acceptance in the global business community.

Now, Internet-based EDI solutions, such as those found at Hewlett Packard, promise to trim the cost and mass of EDI and add new life to an antiquated system" (Millman, 83). In the past, EDI adoption was a slow process because EDI exchanges were based on exclusive company-to-company systems. Now EDI on the Web will greatly accelerate the exchange of business documents all over the globe. As the price of EDI plummets, interest continues to soar, which will bring about lower prices and offer more flexibility for companies conducting Internet commerce.

RadioFreeCash's Innovative Pay-to-Listen Strategy

In today's society radio serves as a popular source of entertainment, and online radio stations have been popping up all over the Internet. A relatively new radio website, www.radiofreecash.com, now offers its listeners money to tune in to its station. RadioFreeCash's innovative pay-to-listen strategy benefits the web industry, but it may not benefit its listeners as much. Radio stations have been around since the 1920s. "Some long term research indicates a very gradual decline in radio listening" (LaRose, 160). Radio formats are constantly changing with new generations.

The generation of today can turn to computers for almost anything, and radio stations took note of this. "We view this period as a turning point for radio broadcasting," said investment bank Robertson Stephens in a recently released research report. "We see the internet medium as a tremendous opportunity for radio to reinvent itself" (Stephens, 1). Online radio stations started in 1998 (LaRose, 131) and flourish in today's computer society. "According to Arbitron, there are currently an estimated 30 million users of Internet Radio.

As the growth of broadband continues, that number is expected to triple by mid-2000" (theGlobe.com, 1). With such a large number of users wanting to listen to the radio while surfing the Internet, the company RadioFreeCash was formed. RadioFreeCash was founded in September 1999 and has become very popular within the last year. It is very similar to an online radio station: members can choose from hundreds of music channels, where they find formats that meet their personal tastes in music.

"RadioFreeCash's mission is to violently change the strategic landscape of delivering radio content and audio advertising over the internet" (theGlobe.com, 2). So RadioFreeCash already had the music; now it needed the advertising. RadioFreeCash uses outside companies to place banner and audio advertisements on its website and radio bar. So basically, RadioFreeCash members get paid for "listening to tunes while chilling out at the computer," as Fox anchor Shepherd Smith put it while interviewing RadioFreeCash's CEO.

RadioFreeCash doesn't charge a membership fee, but it does require some personal information: your name, address, city, state, ZIP code, and your personal interests. Personal interests might be travel, hobbies, sports, fashion, and so on; you must choose at least three. After you fill in all this information, you create a user name, download the software, and, once that is done, enjoy the music and the money. So what's the catch? Your name and address are never given out, but your personal interests are given to the advertisers.

These interests "enable the creation of a loyal listener community of known demographics and predictable attributes, and thus commands much greater leverage with audio advertising clients worldwide" (theGlobe.com, 2). When a member listens to the online radio, advertisements scroll across the radio bar or pop up on screen, a feature that can be very disruptive at times. Even so, the site attracts enough listeners for the advertising to be profitable.

A study showed that "eight out of 10 Internet radio listeners claim they would be likely to seek information on products and services from a station web site" (Stephens, 1). The big factor, though, is that members get paid to listen to the radio. It's all very simple: users of RadioFreeCash are compensated for every minute that they listen, as well as for every minute that the friends and family they referred listen. For example, if you and all your referrals listened an average of six hours per day, RadioFreeCash would pay you an estimated $140 (see chart for better understanding).
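The payout arithmetic can be sketched as follows. The essay does not state RadioFreeCash's actual per-minute rates, so the rates and referral count below are hypothetical placeholders used only to show how such an estimate might be computed.

# Hypothetical sketch of a pay-to-listen payout estimate.
# The rates and referral count are invented placeholders, not RadioFreeCash's real terms.
OWN_RATE_PER_MINUTE = 0.005        # assumed payout for your own listening
REFERRAL_RATE_PER_MINUTE = 0.001   # assumed payout per minute a referral listens

def monthly_payout(own_hours_per_day, referrals, referral_hours_per_day, days=30):
    own_minutes = own_hours_per_day * 60 * days
    referral_minutes = referrals * referral_hours_per_day * 60 * days
    return own_minutes * OWN_RATE_PER_MINUTE + referral_minutes * REFERRAL_RATE_PER_MINUTE

# You and five referrals, each listening six hours a day:
print(f"Estimated monthly payout: ${monthly_payout(6, 5, 6):,.2f}")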

RadioFreeCash has become one of the most popular online radio sites. This past September it was the #1 online radio station (to see how it compared to other online radio companies, refer to the chart). This new industry is growing fast, and sooner than we can say "radio" there will be three more of these pay-to-listen companies popping up, much like the advertisements themselves. This type of online radio listening is fun and rewarding, once you get past all the advertisements.

The astonishing growth of the Internet

With the astonishing growth of the Internet, many companies are finding new and exciting ways to expand upon their business opportunities. There are very few successful companies that do not use computers in their everyday business activities, which also means there are few companies that do not use e-commerce. To emphasise how widespread the effect of the Internet is in today's business communities, one online article stated that, as of February 1999, more than 100,000 companies had Internet addresses and 20,000 companies had home pages on the Internet.

These numbers have more than tripled since 1995, and the trend shows no signs of slowing. But what exactly is e-commerce? To most casual Internet surfers, e-commerce means online shopping: workaholics pointing their web browsers to Amazon.com to order an emergency present because they forgot someone's birthday again (Weiss, 1999). As we will soon find out, this is far from the whole story. Simply put, e-commerce is the exchange of business information between two or more organizations. An example of this would be buying and selling products or services over the Internet.

E-commerce became very popular soon after it proved to be an efficient means to conduct long distance transactions. The purpose of this report is to discuss some of the advantages and disadvantages of e-commerce, as well as to examine its potential for the future of business. Electronic commerce, or e-commerce, has developed very rapidly in the last few years and has left some people wondering what it is all about. "Most people think e-commerce is just about buying and selling things over the Internet" (Wareham, 2000). E-commerce is a broad term describing the electronic exchange of business data between two or more organizations' computers.

Some examples might be the electronic filing of your income tax return, on-line services like Prodigy, and on-line billing for services or products received. E-commerce also includes buying and selling any item over the Internet, electronic fund transfer, smart cards, and all other methods of conducting business over digital networks. The primary technological goal of e-commerce is to integrate businesses, government agencies, and contractors into a single community with the ability to communicate with one another across any computer platform. (Edwards, 1998)

Electronic commerce was built on a foundation laid more than 125 years ago, when Western Union introduced money transfer over the telegraph. In the early 1900s the advent of credit cards as a payment system revolutionised automated commerce. In the mid 1980s the introduction of the ATM card was the latest improvement to electronic commerce. The Internet was conceived in 1969, when the Department of Defence began funding research into computer networking. The Internet, as a means for commerce, did not become reality until the 1990s.

Before this time, it was mainly a tool for the military and a research device for some American universities. Its popularity grew when it proved to be a fast and efficient means to conduct long distance transactions, as well as an effective way to distribute information. Clearly, e-commerce will change the face of business forever. Companies that are thousands of miles apart can complete business transactions and exchange information in a matter of seconds. As one online article explained: Dell Computers sells more than $14 million worth of computer equipment a day from its web site.

By taking its customer service department to the web, Federal Express began saving $10,000 a day. The Internet provides businesses with the opportunity to sell their products to millions of people, 24 hours a day. (Baxton, 1999) Figure #1 shows the revenues generated on the Internet dating back to 1996, as well as estimated revenues through the year 2002. With 1998 revenue equalling almost 74 billion dollars, and experts predicting that it will climb to as much as 1,234 billion dollars by 2002, anyone can see that e-commerce is the wave of the future.

Figure #1: Internet-generated revenues in US dollars. Source: NUA Internet Surveys

"Without a doubt, the Internet is ushering in an era of sweeping change that will leave no business or industry untouched. In just three years, the Net has gone from a playground for nerds into a vast communications and trading centre where some 90 million people swap information or do deals around the world. Imagine: It took radio more than 30 years to reach 60 million people, and television 15 years. Never has a technology caught fire so fast." (Edwards, 1998)
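As an aside, the growth implied by these figures, roughly 74 billion dollars in 1998 rising to a projected 1,234 billion by 2002, works out to a compound annual growth rate of about 102 percent; the short sketch below shows the arithmetic.

# Compound annual growth rate implied by the revenue figures cited above:
# revenue_2002 = revenue_1998 * (1 + rate) ** years, solved for rate.
revenue_1998 = 74.0     # billions of US dollars, figure cited in the text
revenue_2002 = 1234.0   # projected figure cited in the text
years = 2002 - 1998
cagr = (revenue_2002 / revenue_1998) ** (1 / years) - 1
print(f"Implied compound annual growth rate: {cagr:.1%}")   # about 102% per year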

The number one advantage that e-commerce possesses is speed. The Internet and World Wide Web give businesses opportunities to exchange messages or complete transactions almost instantaneously. Even with the slowest connections, doing business electronically is much faster than traditional modes. With increased speeds of communication, delivery time is expedited, and that makes the whole transaction, from start to finish, more efficient. Also, you can find practically any product for sale on the Internet, as one author put it, "from books and compact disks (from www.amazon.com) to French bread (available from www.sourdoughbread.com)" (Buskin, 1998).

Even more significant is the fact that information appearing on the Internet can be changed extremely rapidly. This gives business owners the ability to inform customers of any changes to the services they offer, and allows them to update marketing and promotional materials as often as they would like. The second advantage of electronic commerce is the opportunity it offers to save on costs.

By using the Internet, marketing, distribution, personnel, phone, postage, and printing costs, among many others, can be reduced. You can start doing business in cyberspace for as little as $100. Most businesses will spend more than this, but compared to the cost of opening a physical store, the savings are tremendous. These funds can then be diverted to marketing and advertising of your product or service. Cyberspace knows no national boundaries, which means you can do business all over the world as easily as you can in your own neighbourhood.

Since the Internet connects everyone in cyberspace, information is transmitted at nearly the speed of light, whatever your connection. Distance becomes meaningless, so you can link to anyone on the globe and anyone on the globe can link to you. The ability to provide links makes doing business on the Internet attractive to customers in any part of the world. Using the web to provide customer support is an excellent vehicle for building the reliability and effectiveness of your product or service.

The ability to provide on-line answers to problems through email, or to provide an archive of frequently asked questions 24 hours a day, 365 days a year, builds customer confidence and retention. In fact, a whole series of IBM e-commerce commercials were based on this one single point. The Internet tends to be a more personal environment: people expect to get a real person when they send mail. This can work to your advantage whether you are a small start-up company or a large corporation. No matter what business you are involved in, an online help feature is an extraordinary advantage to have.

A potential source of trouble is customer concern about privacy and security. Anything sent over the Internet passes through several different computers before it reaches its destination, and the worry is that unscrupulous hackers can capture credit card or checking account data as it is transferred, or break into the computers that hold that information. Security on the Internet is much like security for your home: there is a point when the effort outweighs the advantages, and as with your home, you usually stop adding security features when you feel safe.

Making a customer feel safe is what is important in doing business on the Internet. Even though no one can guarantee 100% security of transferring financial information over the Internet, e-commerce is still safer than using credit cards at an actual store or restaurant, or paying for something with the use of a 1-800 number (unknown author, 1999). Also, every time you throw away a credit card receipt you could make yourself vulnerable to fraud. But how do we, as consumers, know this for sure? What precautions do e-commerce websites take to avoid such problems? The answer is simple: encryption.

Ever since the 2.0 versions of Netscape Navigator and Microsoft Internet Explorer, transactions have been able to be encrypted using Secure Sockets Layer (SSL), an Internet protocol that creates a secure connection to the server, protecting the information as it travels over the Internet. SSL uses public-key encryption, one of the strongest encryption methods around. One way to tell that a Web site is secured by SSL is that the URL begins with https instead of http.
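To make the SSL description concrete, the short Python sketch below opens an SSL/TLS-protected connection to a web server and reports the negotiated protocol and cipher. The host name is only a placeholder; modern servers will typically negotiate TLS, the successor to the original SSL protocol.

# Minimal sketch: open an SSL/TLS connection and inspect the negotiated session.
# www.example.com is a placeholder host; any https site should behave similarly.
import socket
import ssl

host = "www.example.com"
context = ssl.create_default_context()           # verifies the server's certificate
with socket.create_connection((host, 443)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=host) as tls_sock:
        print("Protocol:", tls_sock.version())   # e.g. "TLSv1.3" on modern servers
        print("Cipher:  ", tls_sock.cipher()[0])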

Browser makers and credit card companies are also promoting an additional security standard called Secure Electronic Transactions (SET). SET encodes the credit card numbers that sit on vendors' servers so that only banks and credit card companies can read the numbers. Obviously no e-commerce system can guarantee 100-percent protection for your credit card, but you are less likely to get your pocket picked online than in a real store (Weiss, 1999). E-commerce is based on the assumption that the participants will pay for what they buy, yet there has been a noted reluctance among Internet users to actually pay, particularly for digital goods and services.

As a result, much of the current business on the Internet is funded using business models other than user-pays, primarily advertising and sponsorship. If a company is selling something, then it needs to find a way to accept payment that is not only convenient for itself but, most importantly, convenient for its customers. Setting up a simple web site can be very inexpensive, but if you are unsure how to go about creating one, a "simple" web site may not be so simple after all. And if you don't know what you are doing, your site will not be effective.

A functional web site with online ordering requires expertise in four different areas. If a business owner does not have experience with HTML, CGI scripting, ODBC, and the special programs needed for online payment clearing, they may want to consider outsourcing, the use of a third-party service company to provide the missing pieces of the site's total functionality. This is a cost-effective way to get a site up and running much faster, and it lets the owner concentrate on the product or service rather than be overwhelmed by the technical challenges.

Finally, a possible disadvantage of e-commerce is not having a strong organizational commitment. A functional web site that is going to be successful will soon need additional resources in technology and skills. E-commerce is evolving at a very rapid rate, and the business owner must be willing to evolve with it. Newer and more advanced technology will cost more, but that cost should be offset by additional revenues. Also, the company must be willing to change the entire business, or start a new one, when it can see the need for change.

Yahoo started as a commercial operation in 1995, with a simple, if enormous, list of Web sites to help people navigate the Web. But like the Web itself, Yahoo is changing fast. The once amazing ability to search the entire World Wide Web became outdated in a Net instant, so Yahoo, at the tender age of two years, began reinventing itself as a place to trade stocks, make travel reservations, and conduct commerce. (Hof, 1998) Rest assured the future of e-commerce is intact and ever changing. "Like electricity, antibiotics, or the car, the Internet is a revolutionary technology." (France, 1999)

It is quite evident that e-commerce is only gaining speed. As one article stated, "The growth of e-commerce won't diminish; it will become such a pervasive influence on how a company works that all functions within an organization will have a stake in their e-commerce strategy" (Wareham, 2000). With Internet traffic doubling every 100 days, the digital economy is alive and growing. The huge growth of virtual communities is causing shifts in economic power from large corporations to smaller businesses. "Virtual communities erode the marketing and sales advantages of large companies.

A small company with a better product and better customer service can use these communities to challenge larger competitors, something it probably could not do in the real world" (CommerceNet, 1999). With many of the technological advances in the banking, on-line trading, and retail industries, e-commerce will soon become as much a foundation of our lives as radio, the telephone, and television have in the past. Technology has a place in everyone's day-to-day activities, and soon e-commerce will be a major factor in the decisions we have to make. Remember, e-commerce is more complex than just buying that special someone's birthday present.

E-commerce, along with the Internet, is an outlet for business. It is a way for the new guy to compete with the proven giants in the industry. An example of this would be the launch of Wal-Mart's new web site, intended to compete with industry giant Amazon.com. The new venture allows Wal-Mart to go outside its usual corporate sphere for Web-savvy talent geared for dot-com commerce, such as engineers, programmers, and marketers. It also provides the company with the necessary "Web wampum," such as options, warrants, and shares, that is essential to attracting top talent.

Simply put, the Internet and the use of e-commerce provides many opportunities for even the smallest of businesses to compete with large corporations, in essence leveling the playing field. With the steady growth of the Internet, and the fact that every year more and more families are plugging in and surfing the web, can a company survive without the use of the Internet and e-commerce? Probably, but not for long. The Internet and e-commerce are here to stay, so businesses can either change with the times, or get left behind. The choice is theirs to make.

Internet Privacy Paper

Today a profound shift in the privacy equation is under way. Technology brings enormous efficiency to the collection, sorting and distribution of personal information. This efficiency has revolutionized countless organizations, but it has also increased opportunities for snooping. The ability of computers to sift through personal information may make much of your life an open book, unless privacy policies are implemented. In many countries a large number of records are public, available to anyone.

In the United States, for example, public records typically include voter registration lists, Department of Motor Vehicle records, federal tax liens, arrest and conviction records, court proceedings and judgments, bankruptcy and probate records, and lists of births, deaths and marriage licenses. Additional information is available in commercial directories and databases, including phone books, city directories, professional directories and newspaper databases.

Many records that we think are confidential today, such as credit reports, income-tax records, social security records, loan applications, bank account records, credit card records, telephone company records, military records and medical records, may not be confidential tomorrow. If a curious neighbor wants to know something about you, it currently takes some expense and effort; but with the rise of computer databases and the Internet, it is getting easier to correlate information and develop a profile of somebody.

Already, some companies are beginning to offer services that collect personal information and sell it on CD-ROMs or online. Take police reports, for example: they are public. The Internet will make this kind of information extraordinarily public, unless the rules change. The way to protect privacy in this age of information technology is to develop respect for the privacy of personal information. We shouldn't ban the collection of legitimate information or deny organizations the benefit of databases.

Sharing information can be appropriate in some commercial settings. Many consumers appreciate having products and services tailored to their individual needs or profiles, as long as their privacy isn’t unduly compromised. But to protect privacy, access should be denied to people who don’t have a legitimate reason for the information they seek. A nosy neighbor shouldn’t be able to check your credit rating.

Society must define the appropriate purposes for specific kinds of information and fashion ways to confine the use of the information to those purposes. It won't be easy, but it is possible. Individuals and organizations can do their part by making the protection of privacy an important objective. As people wake up to how much information about them is stored on computers and how it can be used, the issue of privacy will command more and more attention from the powers that be.

The privacy concerns over the Lotus and Equifax "MarketPlace: Households" mailing list centered on how dissemination of the product would be controlled, whether the data could be resold, and whether consumers would be able to delete their names from the database or make corrections before it hit the market. There were also concerns that not everyone in the database had agreed to participate. These were certainly legitimate concerns for consumers. It is clearly a control issue, and scary to think that a stranger is controlling how your personal information is used. In the face of so much criticism, Lotus and Equifax canceled the product.

The Evolution of the Internet

So you believe Al Gore created the Internet? Well that’s not possible, because I did. Yes, it’s true, a few years ago I was sitting in my basement with nothing to do and suddenly the idea came to me: why not create an inter-connected network of networks that will allow users to send mail instantly, download copyrighted songs, and order pizza, all from the comfort of their own living room? OK, so maybe I didn’t exactly invent the Internet, but neither did Al Gore. So who was the genius behind the information superhighway, you ask?

Well let’s take a step back to the sixties, a decade when Cold War tension caused nationwide fear of nuclear warfare. Early in the decade, two groups of researchers, privately owned RAND Corporation (America’s leading nuclear war think-tank) and federal agency ARPA (Advanced Research Projects Agency), grappled with a bizarre strategic mystery: in the event of nuclear war, how could political and military officials communicate successfully? It was obvious that a network, linking cities and military bases, would be necessary. But the advent of the atomic bomb made switches, wiring, and command posts for this network highly vulnerable.

A "nuclear-safe" network would need to operate with missing links and without central authority. In 1964, RAND Corporation's Paul Baran made public his solution to the problem. Essentially, the concept was simple. Baran's network would be assumed to be unreliable at all times. Information would be broken into many small pieces called "packets" and then sent to various points, or nodes, in the network until they reached their destination. ARPA embraced Baran's idea for three reasons. First, if nuclear bombs blew away large components of the network, data would still reach its destination.

Second, it would be relatively secure from espionage, since spies tapping into parts of the network would be able to intercept only portions of transmissions. Lastly, it would be much more efficient, because files and transmissions couldn't clog portions of the network. Only five years after Baran proposed his version of a computer network, ARPANET went online. Named after its federal sponsor, ARPANET initially linked four high-speed supercomputers and was intended to allow scientists and researchers to share computing facilities by long-distance.

By 1971, ARPANET had grown to fifteen nodes, and by 1972, thirty-seven. ARPA’s original standard for communication was known as “Network Control Protocol” or NCP. As time passed, however, NCP grew obsolete and was replaced by a new, higher-level standard known as TCP-IP, which is still in use today. TCP, or “Transmission Control Protocol,” is responsible for converting messages into streams of packets at the source of the transmission and then assembling the streams at the final destination.

IP, or "Internet Protocol," ensured that packets were routed across multiple nodes and networks, even those using the original NCP standard. The Internet, as it came to be called, spread like wildfire. Since software for TCP-IP was available to the public and the technology for networking was decentralized by its very nature, nodes and networks easily joined in. Each node covered its own expenses, and thus there was little resistance to expansion. Like the telephone, the Internet was relatively useless without universal participation.
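The division-and-reassembly role described above can be illustrated with a toy sketch. This is not real TCP, which also handles loss, duplication, and retransmission; it simply numbers the pieces of a message, shuffles them to mimic packets arriving out of order over different routes, and rebuilds the original text from the sequence numbers.

# Toy illustration of packet switching: split a message into numbered packets,
# deliver them out of order, and reassemble the original from sequence numbers.
import random

def packetize(message, size=8):
    """Break a message into (sequence_number, chunk) packets."""
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets):
    """Rebuild the message regardless of the order packets arrived in."""
    return "".join(chunk for _, chunk in sorted(packets))

original = "Packets may take different routes but arrive as one message."
packets = packetize(original)
random.shuffle(packets)              # simulate out-of-order arrival
assert reassemble(packets) == original
print(reassemble(packets))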

In 1989, ARPANET itself was retired in favor of the TCP-IP-based Internet, which already contained over 300,000 nodes. Practical applications for the Internet popped up everywhere. Graduate students at North Carolina and Duke invented an electronic bulletin board called Usenet, and researchers at the University of Minnesota developed a primitive search engine called Gopher. In 1991, the foundation for the modern Internet was built when Tim Berners-Lee, working in Switzerland at CERN, released his system of Internet "hypertext."

Prior to hypertext, Internet use was limited to "nerds" who knew the commands needed to communicate through the text-based network. Hypertext allowed users to link words, pictures, and files with mouse clicks. It wasn't long before University of Illinois student Marc Andreessen and fellow computer programmers created Mosaic, the "Web browser" that popularized graphical browsing. The introduction of Web browsers spawned a new era of communication and, in response to the 1993 release of Mosaic, Internet use grew 341,634%. You may have noticed that Al Gore's name was never mentioned in this brief history of the Internet.

While Gore did in fact help secure government funding for a related project in the 1980s, he did not invent the Internet. In fact, there is no specific creator of the Internet, nor is there a date when the Internet came into being. It evolved over thirty years from a government plan for post-nuclear-war communication into today's Information Superhighway. The number of hosts now far exceeds ten million, and hundreds of millions of users from over 150 countries are connected. Sorry Al, the Internet would have done just fine without you.

Internet Access: Flat Fee vs. Pay-Per-Use

Most Internet users are either not charged to access information, or pay a low-cost flat fee. The Information SuperHighway, on the other hand, will likely be based upon a pay-per-use model. On a gross level, one might say that the payment model for the Internet is closer to that of broadcast (or perhaps cable) television, while the model for the Information SuperHighway is likely to be more like that of pay-per-view TV. "Pay-per-use" environments affect user access habits.

"Flat fee" situations encourage exploration. Users in flat-fee environments navigate through webs of information and tend to make serendipitous discoveries. "Pay-per-use" situations give the public the incentive to focus their attention on what they know they already want, or to look for well-known items previously recommended by others.

In “pay-per-use” environments, people tend to follow more traditional paths of discovery, and seldom explore totally unexpected avenues. “Pay-per-use” environments discourage browsing. Imagine how a person’s reading habits would change if they had to pay for each article they looked at in a magazine or newspaper. Yet many of the most interesting things we learn about or find come from following unknown routes, bumping into things we weren’t looking for.
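The magazine analogy can be put in rough numbers. The prices in the sketch below are purely hypothetical, chosen only to show how quickly per-item charges overtake a flat fee once a reader begins to browse freely.

# Hypothetical comparison of a flat monthly fee against pay-per-use charges.
# Both prices are invented for illustration, not actual rates from any provider.
FLAT_FEE_PER_MONTH = 20.00     # assumed flat-rate subscription
PRICE_PER_ARTICLE = 0.25       # assumed charge for each article viewed

for articles in (20, 80, 300, 1000):
    per_use = articles * PRICE_PER_ARTICLE
    cheaper = "flat fee" if FLAT_FEE_PER_MONTH < per_use else "pay-per-use"
    print(f"{articles:>4} articles: per-use ${per_use:7.2f} vs flat ${FLAT_FEE_PER_MONTH:.2f} -> {cheaper} is cheaper")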

Indeed, Thomas Kuhn makes the claim that, even in the hard sciences, real breakthroughs and interesting discoveries only come from following these unconventional routes [Kuhn, Thomas, The Structure of Scientific Revolutions, Chicago: University of Chicago Press, 1962]. And people who have to pay each time they use a piece of information are likely to rely increasingly upon specialists and experts. For example, in a situation where the reader has to pay to read each paragraph of background on Bosnia, s/he is more likely to rely upon State Department summaries instead of paying to become more generally informed him/herself.

And in the 1970s and 1980s the library world learned that the introduction of expensive pay-per-use databases discouraged individual exploration and introduced the need for intermediaries who specialized in searching techniques.

Producers vs. Consumers

On the Internet anyone can be an information provider or an information consumer. On the Information SuperHighway most people will be relegated to the role of information consumer.

Because services like "movies-on-demand" will drive the technological development of the Information SuperHighway, movies' need for high bandwidth into the home and only narrow bandwidth coming back out will likely dominate (see Besser, Howard, "Movies on Demand May Significantly Change the Internet", Bulletin of the American Association for Information Science, October 1994). Metaphorically, this will be like a ten-lane highway coming into the home and only a tiny path leading back out (just wide enough to take a credit card number or to answer multiple-choice questions).
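The highway image can also be given rough numbers. The link speeds below are assumptions chosen only to illustrate the asymmetry, not measurements of any proposed system.

# Illustration of asymmetric bandwidth: how long the same file would take to
# receive versus send. The link speeds are hypothetical round numbers.
DOWNSTREAM_BITS_PER_SEC = 10_000_000   # assumed 10 Mbit/s into the home
UPSTREAM_BITS_PER_SEC = 64_000         # assumed 64 kbit/s back out

def transfer_seconds(size_bytes, bits_per_sec):
    return size_bytes * 8 / bits_per_sec

video_clip = 50 * 1024 * 1024          # a 50 MB clip
down = transfer_seconds(video_clip, DOWNSTREAM_BITS_PER_SEC)
up = transfer_seconds(video_clip, UPSTREAM_BITS_PER_SEC)
print(f"Receive: {down / 60:.1f} minutes, send: {up / 3600:.1f} hours")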

This kind of asymmetrical design implies that only a limited number of sites will have the capability of outputting large volumes of bandwidth onto the Information SuperHighway. If such a configuration becomes prevalent, this is likely to have several far-reaching results. It will inevitably lead to some form of gatekeeping. Managers of those sites will control all high-volume material that can be accessed. And for reasons of scarcity, politics, taste, or personal/corporate preference, they will make decisions on a regular basis as to what material will be made accessible and what will not.

This kind of model resembles broadcast or cable television much more than it does today's Internet. The scarcity of outbound bandwidth will discourage individuals and small groups from becoming information producers, and will further solidify their role as information consumers. "Interactivity" will be defined as responding to multiple-choice questions and entering credit card numbers onto a keypad. It should come as no surprise that some of the major players trying to build the Information SuperHighway are those who introduced televised "home shopping".

Information vs. Entertainment

The telecommunications industry continues to insist that functions such as entertainment and home shopping will be the driving forces behind the construction of the Information SuperHighway. Yet there is a growing body of evidence that suggests that consumers want more information-related services, and would be more willing to pay for these than for movies-on-demand, video games, or home shopping services. Two surveys published in October 1994 had very similar findings.

According to the Wall Street Journal (Bart Ziegler, "Interactive Options May be Unwanted, Survey Indicates," Oct. 1994, page B8), a Lou Harris poll found that "a total of 63% of consumers surveyed said they would be interested in using their TV or PC to receive health-care information, lists of government services, phone numbers of businesses and non-profit groups, product reviews and similar information. In addition, almost three-quarters said they would like to receive a customized news report, and about half said they would like some sort of communications service, such as the ability to send messages to others.

But only 40% expressed interest in movies-on-demand or in ordering sports programs, and only about a third said they want interactive shopping." A survey commissioned by MacWorld (Charles Piller, "Dreamnet", MacWorld, Oct 1994, pages 96-105), which claims to be "one of the most extensive benchmarks of consumer demand for interactive services yet conducted," found that "consumers are much more interested in using emerging networks for information access, community involvement, self-improvement, and communication, than for entertainment."

Out of a total of 26 possible online capabilities, respondents rated video-on-demand tenth, with only 28% indicating that this service was highly desirable. Much more desirable activities included on-demand access to reference materials, distance learning, interactive reports on local schools, and access to information about government services and training. Thirty-four percent of the sample was willing to pay over $10 per month for distance learning, yet only 19% was willing to pay that much for video-on-demand or other entertainment services.

If people say they desire informational services more than entertainment and shopping (and say that they’re willing to pay for it), why does the telecommunications industry continue to focus on plans oriented towards entertainment and shopping? Because, in the long run, the industry believes that this other set of services will prove more lucrative. After all, there are numerous examples in other domains of large profits made from entertainment and shopping services, and very few such examples from informational services.

It is also possible that the industry believes that popular opinion can easily be shifted from favoring informational services to favoring entertainment and shopping. For several years telecommunications industry supporters have been attempting to gain support for deregulation of that industry by citing the wealth of interesting informational services that would be available if this industry was freed from regulatory constraints. Sectors of the industry may well believe that the strength of consumer desire for the Information SuperHighway to meet information needs (as shown in these polls) is a result of this campaign.

According to this argument, if popular opinion can be swayed in one direction, it can be swayed back in the other. Popular discourse would have us believe that the Information SuperHighway will just be a faster, more powerful version of the Internet. But there are key differences between these two entities, and in many ways they are diametrically opposed models.

Privacy

The metering that will have to accompany pay-per-view on the Information SuperHighway will need to track everything that an individual looks at (in case s/he wants to challenge the bill).

It will also give governmental agencies the opportunity to monitor reading habits. Many times in the past the FBI has tried to view library circulation records to see who has been reading which books. In the online age, service providers can track everything a user has bought, read, or even looked at. And they plan to sell this information to anyone willing to pay for it. In an age where people engage in a wide variety of activities online, service providers will amass a wealth of demographic and consumption information on each individual.

This information will be sold to other organizations, who will use it in their marketing campaigns. Some organizations are already using computers and telephone messaging systems to experiment with this kind of demographic targeting. For example, in mid-1994, Rolling Stone magazine announced a new telephone-based ordering system for music albums. After using previous calls to build "a profile of each caller's tastes … custom messages will alert them to new releases by their favorite artists or recommend artists based on previous selections." ("Phone Service Previews Albums" by Laura Evenson, San Francisco Chronicle, 6/30/94, p. D1)

Some of the early experiments promoted as tests of interactive services on the Information SuperHighway were actually designed to gather demographic data on users. ("Interacting at the Jersey shore: FutureVision courts advertisers for Bell Atlantic's test in Toms River", Advertising Age, May 9, 1994)

Conclusion

No one can predict the future with certainty.

But we can analyze and evaluate predictions by seeing how they fit into patterns. And an analysis of the discourse around the Information SuperHighway shows remarkable similarity to that which surrounded cable TV nearly a quarter-century before. Though there is no guarantee that the promises of this technology will prove as empty as those of the previous technology, we can safely say that certain powerful groups are more interested in promoting hype than in weighing the possible effects of the Information SuperHighway.

The Information SuperHighway will not just be a faster Internet; in fact it is possible that many of the elements that current Internet users consider vital will disappear in the new infrastructure. Though the average consumer will have many more options than they do from their home television today, attempts at mass distribution will likely favor mainstream big-budget programs over those that are controversial or appeal to a narrower audience.

It is possible that diversity available from all sources will decrease and independent productions will be even further marginalized. And the adoption of an asymmetric architecture (a ten-lane highway coming into the library or home with a tiny path leading back out) would pose a significant barrier to those seeking to be information providers, and would favor a model of relatively passive consumption. And the kind of massification and leveling of culture that will follow is likely to be similar to the effects of broadcast television on culture.

Internet Rating Systems: Censors by Default

The Internet, first designed for the military and the scientific community, has grown larger and faster than anyone could have ever expected. Now being a potpourri of information, from business to entertainment, the Internet is quickly gaining respect as a useful and important tool in thousands of applications, both globally and domestically. But, the growth that the Internet has seen in the last few years has come with some growing pains.

Reports of harmful information reaching children are always painful to hear; who wouldn't feel for a mother who lost a child to a pipe bomb that was built from instructions on the Internet? But the greatest pain thus far has been the issue of the accessibility of pornography on the Internet, and it has many parents concerned. But is it as big a threat as the media would like us to think, or has it been a bit exaggerated? On July 3, 1995, Time magazine published a story called "On a Screen Near You: Cyberporn."

This article discussed the types of pornography that could be found on the Internet, such as pedophilia, S and M, urination, defecation, bestiality, and everything else in between. In her Humanist article, Julia Wilkins states that the Time story was based on a Georgetown University undergraduate student's law journal paper, which claimed that 83.5 percent of the pictures on the Internet were pornographic. Unfortunately, after Time published the article, the paper's research was discovered to be wrong.

So wrong, in fact, that Time retracted the figure, which really was less than 1 percent, yet the damage had already been done (1). She also claimed that the article, which was the first of its kind, was responsible for sparking what can be compared to a Salem witch-hunt or the McCarthy hearings, in effect setting off many child-protection and religious groups who were fueled more by inaccurate data and a "moral panic" attitude than by the facts (1). Pressured by these groups, government officials declared war, and the anti-Internet campaign had begun.

The first attack came from Sen. Jim Exon (D-Nebraska) in March 1995. He introduced legislation that made material considered obscene, lewd, lascivious, filthy, or indecent against the law (qtd. in Lead-up). This legislation made its way into the Telecommunication Reform Package, and ultimately into the Communication Decency Act (CDA). The Telecommunications Act, which includes the CDA, was passed by the Senate and the House and signed by President Clinton on February 8, 1996.

(Lead-up) The same day the CDA was signed, the opposition, led by the ACLU and other advocacy groups, along with industry leaders like AOL and Microsoft, filed suit in a Philadelphia district court challenging the constitutionality of the new law (Kramer, qtd. in Lead-up). The landmark ACLU v. Reno case found that the CDA violated the First Amendment and was, therefore, unconstitutional. Then on June 26, 1997, in the appeal, Reno v.

ACLU, the justices of the U.S. Supreme Court reaffirmed that the CDA was unconstitutional and that it was a cure worse than the disease (Lead-up). By a vote of 7-2, the CDA, and the moral panic, went away (Wilkins). Or did it? Despite the Supreme Court's ruling that the Communication Decency Act was a violation of the First Amendment and that the Internet is entitled to the highest level of free speech protection, there is a new, less obvious threat to the freedom of speech (qtd. in Beeson). According to the ACLU, the new threat is hiding in the smoke screen of Internet rating systems (Beeson 2).

These types of rating systems have been around for a while, designed as a tool to protect children from inappropriate material and to help businesses keep their Internet users focused. While benign on the surface, the ACLU warns, their long-term ramifications may in fact destroy the Internet and the rights that come with it. Parental-level blocking programs are not only the most effective way to keep children from inappropriate information on the Internet; unlike labeling systems, they also provide the rest of us with the freedom of information we deserve. The protection of children is the focus of most of the issues surrounding the Net today.

Protection from pornography seems to be the main focus, but there are others, such as information on bomb making or protection from cults and sexual predators. All of these are examples of why parents want to block or protect their children from this information, and all sides agree that parents should have tools to do so. There is software, developed and maintained by private companies and not reliant on any global system for its blocking mechanism, that a parent can buy and use at home to block sites deemed inappropriate for children (Krantz 1).

Although these types of third-party software do a good job of keeping undesirable information away from children, the ACLU warns that many of them fail to block sites that have not been rated, and are known to block some sites that wouldn't be considered inappropriate for children (Beeson 10). The ACLU has taken a buyer-beware attitude toward these end-user packages. Likewise, the ACLU warns against the new direction the industry is taking with global labeling-type systems, and states that many of the new rating schemes "pose far greater free speech concerns than do end-user software programs" (qtd. in Beeson 10).

The Platform for Internet Content Selection (PICS) is the newest labeling scheme being embraced by Internet industry leaders. It is not a filter but a labeling standard for creating filtering or blocking tools. In theory, it is designed to be a labeling mechanism for web sites that can then be used by end-user software (Webber 1). However, because the PICS system has many technological holes, it is drawing opposition from many advocacy groups, including the ACLU, the ALA, and the Electronic Privacy Information Center (EPIC).

In early December 1997, Internet industry leaders, along with the White House, armed with a goal of standardizing an Internet rating scheme, held a summit on the issue of protecting children online. This was an attempt at collectively finding middle ground for protecting children from Internet porn. But Bruce Handy, a reporter for Time, describes government officials pleading with the private sector to develop a system that works, and industry leaders looking for the potential profit that a clean and well-lighted Internet can bring (Handy 2).

Bringing big business and the government together to discuss the strategies of the subject was an image booster for both parties, but little else. According to Barry Steinhardt of the ACLU, one of the few advocacy groups invited, "This had seemed more like a trade show than a summit" (qtd. in Handy). It ended up producing more questions than answers. One of the few questions answered by the so-called summit proved to many free-speech advocacy groups that their coalition with big business was crumbling and that the new battle against rating systems would be fought alone.

The big businesses that fought gallantly beside the ACLU in the fight against the CDA have now sold the free-speech groups out, seeing the dollar signs that a homogenized Web could bring (Beeson 8). But the free-speech groups have some ammunition on their side: the First Amendment for one, and the technological shortcomings of the PICS system for another. One such shortcoming is the question of how a labeling system would keep up with the ever-changing Web. With new content moving through cyberspace constantly, how can all of it be labeled?

Voluntary self-labeling is the answer, according to the Clinton administration (Weber 1). But that brings up many questions of its own. The first question is: what happens if the webmaster (the person who creates the web site) decides not to volunteer? The answer is simple. The site is blocked, because the PICS system must, by default, block unlabeled sites in order to be effective; so much for "voluntary" (Webber 2).
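The default-blocking behavior just described is easy to sketch. The label format below is a deliberately simplified stand-in, not actual PICS syntax; the point is only that any filter which must be effective against unlabeled content ends up blocking everything that carries no label at all.

# Simplified sketch of a label-based filter that blocks unlabeled sites by default.
# The label format is invented for illustration and is not actual PICS syntax.
site_labels = {
    "news.example.org": {"violence": 1, "sex": 0},
    "research.example.edu": {"violence": 0, "sex": 0},
    # "unlabeled.example.com" has published no label at all
}
MAX_ALLOWED = {"violence": 2, "sex": 0}

def allowed(host):
    label = site_labels.get(host)
    if label is None:
        return False               # unlabeled sites are blocked by default
    return all(label.get(cat, 0) <= limit for cat, limit in MAX_ALLOWED.items())

for host in ("news.example.org", "research.example.edu", "unlabeled.example.com"):
    print(host, "->", "allow" if allowed(host) else "block")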

Another question is: if this system is truly voluntary, what incentive would a webmaster have to label his site? Or what would keep the webmaster from mislabeling? According to the ACLU, eventually a law would, because without a penalty system for misrating, the entire concept of a self-rating system breaks down. They also state that, despite all good intentions, such a scheme "… would lead to heavy handed government censorship" (Beeson 8). This would convince most that a labeling system is inappropriate for the Internet, but there are many more holes in the PICS scheme. Another hole in the PICS labeling system is the cost and the man-hours required to do such a task.

It probably wouldn't hurt empires like CNN or Microsoft to spend the man-hours to label their entire web presence, but the ACLU states that non-profit organizations would be greatly impacted by the burden that rating large sites would require (Beeson 6). We all know what would happen if organizations decided not to label themselves. So what choice would they have? They would have no choice, and that is wrong, because the cost and burden of such a task would effectively shut most noncommercial speakers out of the Internet marketplace (Beeson 6). Along the same lines, in the landmark case Reno v. ACLU the U.S. Supreme Court, regarding the use of age verification for Internet forums, stated that it would be "prohibitively expensive" (qtd. in Beeson 6).

And according to the ACLU, this ruling would also apply to self-rating for some organizations (Beeson 6). Even if rating proponents get past the shortcomings of government intervention and justify the costs of the self-rating proposals, what they sometimes forget is that the Internet is global and knows no boundaries. Let's face it: the rest of the world gets a lot closer on the Internet. For the first time in the history of man, there is a nearly global system of communication.

Ever wondered what a Russian's day is like? Just ask, and you'll probably get more answers than you originally wanted. Or look out the window onto a street in Cairo and see what's happening there at that moment. These are some impractical examples of what global communication can do; the practical ones, for both business and entertainment, are obvious. But rating this vast virtual expanse presents an equally vast number of problems, both technical and practical.

According to the ACLU, more than half of the information on the Net comes from outside the U.S., and they warn that a rating system would put up borders around the U.S., making us what they call "Fortress America" (Beeson 7). This would again infringe upon the rights of Net users both foreign and domestic, and devalue the rich resources that we find on the web today. If we can't rely on the PICS system to protect our children and our free-speech ideals, why is it still being pursued as the standard? There are a couple of reasons. The first is a near monopoly: two companies, Microsoft (no surprise) and Netscape, hold 90 percent of the browser market.

Both support PICS labeling software packages (Beeson 13). Unfortunately, where these two companies go, most of the rest of the industry has no choice but to follow. The second reason that the faulty PICS system is still being pursued is ignorance, pure and simple: the same ignorance and bad journalism that led to the CDA and to Julia Wilkins' moral panic, the same ignorance that held the bogus Time article up on the Senate floor and claimed it as fact (Wilkins 2). And according to the ACLU, it is the same ignorance that can be likened to burning down the house to roast the pigs (qtd. in Beeson 2).

The bottom line, according to Julia Wilkins, is that children are very much the minority on the web, and for every child on the Web there is an adult paying for it. And for those who have children and want to keep them safe, there is plenty of education and software out there to protect the children without sacrificing the rights of the majority (4). Likewise, in his article "Do We Need Internet Content Rating?", Tim Haight agrees that users have the right to protect themselves and their children, but that it needs to be done without any burden on or effect upon the rights of others (2).

The ACLU also agrees, and has always agreed, with providing education to parents, and it fully endorses parents' right to choose (Beeson 12). If not the PICS system, then what may we see in the future? That remains to be seen. But the ACLU made its position clear in the white paper presented at the Internet summit in Washington, which meticulously examined the free-speech issues of all rating systems, not just PICS. It urges industry leaders, policy makers, children's groups, and Internet users to engage in genuine debate about the free-speech ramifications of the rating and blocking systems being proposed.

The ACLU also gives five recommendations and principles to help guide the debate. The first is: Internet users know best, meaning that individual adult users know best what information should or shouldn't be filtered or blocked, whether for themselves or their children. The second ACLU principle is: default setting on free speech. The ACLU fears that setting the blocking mechanism up as the default in software such as browsers or search engines would greatly restrict the freedom of speech. The third is: buyers beware.

Again the ACLU stresses complete user control: control over when the user wants to block, and control over knowing what is being blocked. The fourth is: no government coercion or censorship; remember the First Amendment. Finally, the fifth ACLU recommendation is: libraries are free speech zones. The ACLU claims that mandatory use of blocking software is a violation of the First Amendment (qtd. in Beeson 5+). In conclusion, based on the key shortfalls of the recently endorsed PICS content labeling system, we can see that parental-level blocking systems are, at the present time, the only solution that can protect our children as well as our freedoms.

We see that a voluntary rating system will only lead to government censorship and commercialized, homogenized content, potentially leaving the U.S. with a "Fortress America" Internet presence instead of the important global one. Free-speech advocacy groups such as the ACLU support parents' and educators' rights to choose what is and isn't appropriate for children. They continue to urge caution: freedom of speech is one of our greatest freedoms, and failing to examine the long-term ramifications of a system such as PICS may someday cost us that freedom, and others.

Should it be legal to release certain indecent content in print but not electronically through the Internet

This question has plagued many in years past and will continue to be a source of controversy for years to come. Supporters of Internet censorship believe that this new information medium, currently unregulated and expanding at an alarming rate, must be filtered and controlled to avoid the risk of so-called undesirable content being easily accessible by minors (or, in some cases, anyone). The goal of these individuals and organizations is, essentially, to have laws put in place like those to which the television and radio industry are subjected.

Many others strongly oppose any type of Internet censorship; they cite, among other arguments, United States citizens' constitutional right to freedom of speech. The majority of these advocates propose self-regulation for concerned individuals (such as parents). For the majority, the quick expansion of the Internet has come as a surprise, and matters such as this should be taken into careful consideration for the future of the Internet, our children, and, to some extent, our society. At first, the Internet was only used by military personnel for national security purposes.

Then, its horizon broadened to include computer hobbyists and corporations. Now, because of its flexibility and ease of use, it is part of the life of mainstream America. Because of this so-called explosion in the number of households and institutions going online, some have suggested that the Internet be regulated to keep indecent and vulgar content from appearing. Although bills have been proposed to do just what these people wish, as of this point all of them have either died in the United States Congress or been struck down by the courts.

Pro Internet Censorship

Bills such as the Protection of Children from Computer Pornography Act of 1995 (PCCPA) began to appear before the House and Senate when organizations such as Enough is Enough lobbied the Senate for legislation to protect children from online pornography: "Women speak with a special authority on the issue of pornography, for we, and our children, are its primary subjects, and its primary victims. Pornography demeans and degrades women, victimizes children and ruins men. It contributes to domestic and spouse abuse, rape, incest and child molestation.

And a great share of it is not protected speech, any more than libel, slander or false advertising is protected speech; therefore, it is not a 1st Amendment issue. It is not legal material. Many Americans do not realize this fact." (Dee Jepsen) Pornography is not the only kind of content that many want regulated on the Internet; some contend that hate speech should not be openly accessible either. Opponents note that hate groups have used the Internet as a medium to spread their message, discovering that it costs much less and reaches a larger audience than anything they could do beforehand.

Supporters of regulating the Internet believe that it is a powerful tool which, with near-total anonymity, is easy to abuse. They profess that scamming and harassing other users is not free speech and that those people cannot hide behind the First Amendment. When the Communications Decency Act (CDA) went on trial, the Justice Department argued that the CDA is necessary because "The Internet threatens to give every child a free pass into the equivalent of every adult bookstore and every adult video store in the country" (Mattos).

Supporters of the CDA, such as President Bill Clinton, believe the bill is Constitutional: “I remain convinced, as I was when I signed the bill, that our Constitution allows us to help parents by enforcing this Act to prevent children from being exposed to objectionable material transmitted through computer networks” (Clinton). With regulations to restrict indecent content put in place, children would be able to browse the Internet freely and there would be no need for filtering software or to monitor them. Laws passed to restrict free expression on the Internet would make it safer, easier, and more useful for all involved.

Con Internet Censorship

Other activists believe that censoring the Internet is a ridiculous, unconstitutional, and impractical solution to the problems of indecent content. The majority of these opponents propose self-regulation, either through a parent watching his or her child or through blocking software that does not let a child access inappropriate material. People against censoring the Internet feel that it would take away a person's right to free expression, which some feel is one of the most important assets of the Internet.

These activists have argued that the Communications Decency Act, which was brought before the Senate in the summer of 1996, is unjust and certainly not constitutional. By imposing broadcast-style content regulations on the open, decentralized Internet, the CDA severely restricts the First Amendment rights of all Americans and threatens the very existence of the Internet itself. Although they believe that the act is well intentioned, they insist that the CDA can never be effective at controlling content on a global medium, where a web site in Sweden is as close as a site in Connecticut.

The Citizens Internet Empowerment Coalition (CIEC) case is based on the argument that the only effective and constitutional way to control children's access to objectionable material on the Internet is to rely on user control ("CIEC, The Internet Is Not A Television"). The CIEC challenged the CDA on the grounds that "the Internet is a unique communications medium, different from traditional broadcast mass media, which deserves broad First Amendment protections" (CIEC).

The media did not let this issue pass; condemning the CDA, James C. Plummer wrote in Consumers Research Magazine as follows: The CDA penalizes not only people who transmit or make available indecent and/or patently offensive material to minors, but also those who “knowingly permit any telecommunications facility under [his] control to be used for any activity prohibited. ” What does this mean? In effect, it means that your Internet Service Provider (ISP) is legally liable for anything you email, post to a newsgroup, or put on a web page. (Plummer)

Microsoft CEO and computer industry expert Bill Gates, in his book on the future of the Internet, The Road Ahead, asserts that the idea of having Internet Service Providers act as censors would be unreasonable: Some critics have suggested that communications companies be made gatekeepers, charged with filtering the content of what they carry. This idea would put companies in the business of censoring all communication. It's entirely unworkable, for one thing because the volume of communicated information is way too large.

This idea is no more feasible or desirable than asking a telephone company to monitor and accept legal responsibility for everything that's spoken or transmitted on its telephone wires. (Gates 310) The CDA, however, was struck down as unconstitutional. In his decision, Judge Ronald L. Buckwalter wrote: It is, of course, correct that statutes that attempt to regulate the content of speech presumptively violate the First Amendment… That is as it should be. The prohibition against Government's regulation of speech cannot be set forth any clearer than in the language of the First Amendment.

President Clinton, although steadfastly holding to the belief that censorship laws are necessary for the Internet, endorses self-regulation of the Internet through the use of voluntary rating systems: [The Clinton Administration] vigorously supports the development and widespread availability of products that allow both parents and schools to block objectionable materials from reaching computers that children use. And we also support the industry's accelerating efforts to rate Internet sites so that they are compatible with these blocking techniques.

Agreeing that the CDA is unconstitutional, Judge Stewart Dalzell wrote: the Internet may fairly be regarded as a never-ending worldwide conversation. The Government may not, through the CDA, interrupt that conversation. As the most participatory form of mass speech yet developed, the Internet deserves the highest protection from governmental intrusion. (“ACLU v. Reno”) Some believe that creating a new set of laws for the Internet is unnecessary because federal, state and local laws already apply to users of the Internet if they are in the jurisdiction of said laws.

For example, child pornography is not legal online or off, and several cases have shown that libel on the Internet does not go unpunished. In conclusion, advocates for free speech on the Internet believe that any type of government interference would lead to chaos and would not work in any way, even if setting aside the First Amendment in this situation were the correct thing to do.

Personal Opinion

Before starting this research paper, I believed that the Internet should be free from government regulation.

After comparing the cases of both sides, I still believe, now more strongly, that any interference would have negative results and would not be the least bit effective. The arguments of the supporters of Internet censorship are mostly rooted in moral issues and, as I see it, blatantly show a disregard for the First Amendment. It is obvious that the Internet has not created these issues, but rather raised them to a more global level. Censoring the Internet would not fix any of the problems it is supposed to; instead, it would create new ones.

Laws such as the CDA attempt to apply to the entire, global Internet. Even if it were plausible, enforcing United States laws in foreign countries would be absurd because the United States has no jurisdiction there. The government of the United States cannot rightfully, or practically, enforce any law that would affect the entire world. If such a law were passed, it could not be enforced and therefore would not work in any way for any amount of time.

As an avid Internet user, publisher, and activist, I believe that the U.S. government should treat U.S. citizen Internet publishers the same as print publishers. Since the First Amendment was ratified in the 18th century, the federal government has recognized the freedom of the press: Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances. (The Constitution of the United States of America)

The Internet is unlike any information medium in history. Though there are many similarities between the Internet and other means of communication, the free flow of information that the Internet provides makes it a unique medium. Unlike television and radio, which offer only a channel-changing remote and a limited number of stations, the Internet gives users access to far more information. The ability of the Internet to provide access to information is limited only by those who use it.

Unlike traditional print media such as the newspaper, the Internet is relatively cheap and widely accessible. With a computer, network connection, and the proper software, anybody can become a web publisher. There is virtually no limit to how much content can be published on the Internet, whereas newspapers are limited by physical costs such as the cost of paper or the number of writers. The Internet is truly an exceptional, unmatched information medium and should be treated like one.

Fight to Maintain Freedom of Speech on the Internet

Imagine yourself trapped inside another world, a world where your essence is made entirely of words that can say whatever you desire. You could be young or old, male or female or neither; you are limited only by your imagination. Now imagine that someone wants to have a say in what can be said, seen, and done in this brave new world. What would this change, and more importantly, who decides what's ‘good’ and what's ‘bad’? In the ordinary and mundane world of real life, people have always fought for the pursuit of happiness, free speech, and so on.

These are causes that have always resonated in the hearts of our nation's heroes, and rightfully so. What would our world be like if the government controlled what we were allowed to see and to say? It seems that George Orwell described it best in his book 1984 when he gave the scenario of a society in which people who committed the heinous act of thoughtcrime, the act of thinking something that goes against the party line, mysteriously disappeared into the night, never to be seen again.

Thankfully, the hordes of would-be ‘thought police’ have been staved off throughout history, and we have achieved a relatively liberal society where people are, for the most part, able to speak their minds openly. Well, even in today's world there are still people who get pissed off when they think that free speech goes too far, and they say something about it. This brings me to my main point: the Internet. A land made possible in 1968 by the Dept. of

Defense, with the idea that if all other lines of communication were destroyed in the advent of war, then at least we'd have computers (I don't know, maybe they thought the electricity might magically produce itself after the bombing stopped). Anyway, thankfully the Internet has evolved beyond that into something which encompasses just about every possible human interest out there. A hodgepodge of political ideals, ranging from big-business capitalism to the gender-erasing equality of the socialist mindset, makes the Internet a place where conflicts of interest often arise.

These conflicts are what I'm going to concentrate on in this essay. I hope to show you that, although I don't believe that kids need to be looking at stuff like drug pages or cyberporn, I don't believe that censoring it so that no one can take a peep is the answer we need. The answer I am going to present will focus more on increasing the level of awareness on the part of the Internet user instead of blacklisting offensive material. I will also talk about the growing debate on the ethics of canceling someone who is writing offensive material.

The question that we must address on this issue is: do we, as Internet users, have the right to decide what others can say on the net? For years now the Internet has played the role of seventh heaven for those who want to get their hands on information of any kind. Sometimes, however, it is the ‘any kind’ part that scares parents who don't want their young son ‘accidentally’ catching his first exposure to the birds and the bees on the family's PC, or their daughter looking at the latest news on the Cannabis Cup results.

Parents have every right to be alarmed by the material that their children see, because that's part of being a parent: looking after your kid. However, does being a parent extend to deciding what everyone else may look at? First you have to look at the availability of objectionable material, in this case pornography. The following excerpt is by James Herrington, from his essay “Beware of Chilling Freedom of Expression” in CyberReader: The Internet also can be a conduit for obscenity, and for children who know how to find it. Not that obscenity is new to human existence, even for minors.

Nor is salacious and vulgar material easy to locate on the Internet; ferreting it out requires a certain adeptness, even for seasoned Internet surfers. Its availability alone should not set the stage for cybercensorship. Although Herrington suggests that finding pornographic material on the net requires “a certain adeptness,” this difficulty is only temporary. Kids are becoming more familiar with the Internet due to its increased use in schools, and because of this, it won't be long before it becomes easier for kids to find things on the net.

However, the limited knowledge that kids possess right now gives us some time before the problem amplifies, and this is where the parents come in. Parents today need no longer look at the Internet with apprehension: there is software available to block out offensive sites using keywords, and Internet providers also have the right and power to enforce morality codes on their users. The responsibility to install this software falls upon the shoulders of concerned parents.

There doesn't have to be an outright ban on “salacious and vulgar” material; people have the option to, in effect, create an Internet which is tailored to their needs without restricting the type of material presented on the entire net. Parents aren't the only ones calling for changes on the net; regular users have often complained of ‘flaming’, the act of sending verbally abusive messages. In reaction to flaming, netizens have begun utilizing the ‘cancelbot’ function, in other words destroying offending messages.

There is a danger in this, however: where do we draw the line between canceling someone because they're offensive and canceling them because you don't want to hear what they have to say? Here's what Daniel P. Dern, author of The Internet Guide for New Users, has to say about canceling: There is a danger of the cancel wars shifting from inappropriate resource use to canceling somebody based on ‘I don't like your opinion.’ At what point does somebody say, ‘I don't like this person, and I'm going to cancel them’? (Lewis, “No More ‘Anything Goes’: Cyberspace Gets Censors”)

The question we face now is whether the cancel command should be continued or another system imposed. One idea is to reduce the whole thing to a majority-rule basis. I do not believe this is the right choice; the whole idea of free speech is to allow the minority the right to voice their opinions. So, what answer can we turn to? A system in which a complaint is lodged with a committee would slow the process to an intolerable length for the petitioner. I offer this idea: that the system remain the same, minus the cancelbot.

There isn't any need to cancel someone; to do so is to take the out-of-sight, out-of-mind approach, which only ignores the problem. As Clifford Stoll, author of Silicon Snake Oil: Second Thoughts on the Information Highway, points out (220), Internet users have the capability of self-redemption: I suspect that the main reason why we see so few lawsuits is that the network provides an ideal system for rebuttal. Whatever someone says against you online, you can reply to within hours, with the same distribution, and to the same audience.

The Internet is a land where people may defend themselves by the verbal sword, a land of freedom wherein its people should be able to speak freely. This is the code by which the Internet should live. People must become educated in the areas of software advances and the Internet itself; otherwise, the Internet might never live up to its potential greatness. It is the users themselves who decide what freedoms remain on the net, and so I implore you to become more responsible netizens and uphold the edicts and ethics of the net to stop the invasion of would-be ‘thought police’.

Internet Access: Flat Fee vs. Pay-Per-Use

Most Internet users are either not charged to access information, or pay a low-cost flat fee. The Information SuperHighway, on the other hand, will likely be based upon a pay-per-use model. On a gross level, one might say that the payment model for the Internet is closer to that of broadcast (or perhaps cable) television, while the model for the Information SuperHighway is likely to be more like that of pay-per-view TV. “Pay-per-use” environments affect user access habits.

“Flat fee” situations encourage exploration. Users in flat-fee environments navigate through webs of information and tend to make serendipitous discoveries. “Pay-per-use” situations give the public the incentive to focus their attention on what they know they already want, or to look for well-known items previously recommended by others.
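
To make the incentive difference concrete, here is a minimal sketch with purely hypothetical prices (a flat $20 monthly fee against ten cents per article) of how quickly per-use charges overtake a flat fee once a reader starts browsing widely.

    # Hypothetical comparison of flat-fee vs. pay-per-use reading costs.
    # The prices below are illustrative assumptions, not figures from this essay.
    FLAT_FEE = 20.00          # dollars per month, unlimited browsing
    PRICE_PER_ARTICLE = 0.10  # dollars charged for each article viewed

    for articles_per_month in (50, 200, 500, 1000):
        per_use_cost = articles_per_month * PRICE_PER_ARTICLE
        cheaper = "flat fee" if FLAT_FEE < per_use_cost else "pay-per-use"
        print(f"{articles_per_month:5d} articles: pay-per-use ${per_use_cost:7.2f} "
              f"vs. flat ${FLAT_FEE:.2f} -> {cheaper} is cheaper")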

In “pay-per-use” environments, people tend to follow more traditional paths of discovery, and seldom explore totally unexpected avenues. “Pay-per-use” environments discourage browsing. Imagine how a person’s reading habits would change if they had to pay for each article they looked at in a magazine or newspaper. Yet many of the most interesting things we learn about or find come from following unknown routes, bumping into things we weren’t looking for.

Indeed, Thomas Kuhn makes the claim that, even in the hard sciences, real breakthroughs and interesting discoveries only come from following these unconventional routes (Kuhn, Thomas, The Structure of Scientific Revolutions, Chicago: University of Chicago Press, 1962). And people who have to pay each time they use a piece of information are likely to rely increasingly upon specialists and experts. For example, in a situation where the reader has to pay to read each paragraph of background on Bosnia, s/he is more likely to rely upon State Department summaries instead of paying to become more generally informed him/herself.

And in the 1970s and 1980s the library world learned that the introduction of expensive pay-per-use databases discouraged individual exploration and introduced the need for intermediaries who specialized in searching techniques.

Producers vs. Consumers

On the Internet anyone can be an information provider or an information consumer. On the Information SuperHighway most people will be relegated to the role of information consumer.

Because services like “movies-on-demand” will drive the technological development of the Information SuperHighway, movies' need for high bandwidth into the home and only narrow bandwidth coming back out will likely dominate (see Besser, Howard. “Movies on Demand May Significantly Change the Internet”, Bulletin of the American Association for Information Science, October 1994). Metaphorically, this will be like a ten-lane highway coming into the home and only a tiny path leading back out (just wide enough to take a credit card number or to answer multiple-choice questions).

This kind of asymmetrical design implies that only a limited number of sites will have the capability of outputting large volumes of bandwidth onto the Information SuperHighway. If such a configuration becomes prevalent, this is likely to have several far-reaching results. It will inevitably lead to some form of gatekeeping. Managers of those sites will control all high-volume material that can be accessed. And for reasons of scarcity, politics, taste, or personal/corporate preference, they will make decisions on a regular basis as to what material will be made accessible and what will not.

This kind of model resembles broadcast or cable television much more than it does today's Internet. The scarcity of outbound bandwidth will discourage individuals and small groups from becoming information producers, and will further solidify their role as information consumers. “Interactivity” will be defined as responding to multiple-choice questions and entering credit card numbers onto a keypad. It should come as no surprise that some of the major players trying to build the Information SuperHighway are those who introduced televised “home shopping”.

Information vs. Entertainment

The telecommunications industry continues to insist that functions such as entertainment and home shopping will be the driving forces behind the construction of the Information SuperHighway. Yet there is a growing body of evidence that suggests that consumers want more information-related services, and would be more willing to pay for these than for movies-on-demand, video games, or home shopping services. Two surveys published in October 1994 had very similar findings.

According to the Wall Street Journal (Bart Ziegler, “Interactive Options May Be Unwanted, Survey Indicates,” Oct. 1994, page B8), a Lou Harris poll found that “a total of 63% of consumers surveyed said they would be interested in using their TV or PC to receive health-care information, lists of government services, phone numbers of businesses and non-profit groups, product reviews and similar information. In addition, almost three-quarters said they would like to receive a customized news report, and about half said they would like some sort of communications service, such as the ability to send messages to others.

But only 40% expressed interest in movies-on-demand or in ordering sports programs, and only about a third said they want interactive shopping.” A survey commissioned by MacWorld (Charles Piller, “Dreamnet”, MacWorld, Oct 1994, pages 96-105), which claims to be “one of the most extensive benchmarks of consumer demand for interactive services yet conducted,” found that “consumers are much more interested in using emerging networks for information access, community involvement, self-improvement, and communication, than for entertainment.”

Out of a total of 26 possible online capabilities, respondents rated video-on-demand tenth, with only 28% indicating that this service was highly desirable. Much more desirable activities included on-demand access to reference materials, distance learning, interactive reports on local schools, and access to information about government services and training. Thirty-four percent of the sample was willing to pay over $10 per month for distance learning, yet only 19% was willing to pay that much for video-on-demand or other entertainment services.

If people say they desire informational services more than entertainment and shopping (and say that they’re willing to pay for it), why does the telecommunications industry continue to focus on plans oriented towards entertainment and shopping? Because, in the long run, the industry believes that this other set of services will prove more lucrative. After all, there are numerous examples in other domains of large profits made from entertainment and shopping services, and very few such examples from informational services.

It is also possible that the industry believes that popular opinion can easily be shifted from favoring informational services to favoring entertainment and shopping. For several years telecommunications industry supporters have been attempting to gain support for deregulation of that industry by citing the wealth of interesting informational services that would be available if this industry was freed from regulatory constraints. Sectors of the industry may well believe that the strength of consumer desire for the Information SuperHighway to meet information needs (as shown in these polls) is a result of this campaign.

According to this argument, if popular opinion can be swayed in one direction, it can be swayed back in the other direction. Popular discourse would have us believe that the Information SuperHighway will just be a faster, more powerful version of the Internet. But there are key differences between these two entities, and in many ways they are diametrically opposed models.

Privacy

The metering that will have to accompany pay-per-view on the Information SuperHighway will need to track everything that an individual looks at (in case s/he wants to challenge the bill).

It will also give governmental agencies the opportunity to monitor reading habits. Many times in the past the FBI has tried to view library circulation records to see who has been reading which books. In the online age, service providers can track everything a user has bought, read, or even looked at. And they plan to sell this information to anyone willing to pay for it. In an age where people engage in a wide variety of activities online, service providers will amass a wealth of demographic and consumption information on each individual.

This information will be sold to other organizations, who will use it in their marketing campaigns. Some organizations are already using computers and telephone messaging systems to experiment with this kind of demographic targeting. For example, in mid-1994, Rolling Stone magazine announced a new telephone-based ordering system for music albums. After using previous calls to build “a profile of each caller's tastes … custom messages will alert them to new releases by their favorite artists or recommend artists based on previous selections.” (“Phone Service Previews Albums” by Laura Evenson, San Francisco Chronicle, 6/30/94, p. D1)

Some of the early experiments promoted as tests of interactive services on the Information SuperHighway were actually designed to gather demographic data on users. (“Interacting at the Jersey shore: FutureVision courts advertisers for Bell Atlantic's test in Toms River”, Advertising Age, May 9, 1994)

Conclusion

No one can predict the future with certainty.

But we can analyze and evaluate predictions by seeing how they fit into patterns. And an analysis of the discourse around the Information SuperHighway shows remarkable similarity to that which surrounded cable TV nearly a quarter-century before. Though there is no guarantee that the promises of this technology will prove as empty as those of the previous technology, we can safely say that certain powerful groups are more interested in promoting hype than in weighing the possible effects of the Information SuperHighway.

The Information SuperHighway will not just be a faster Internet; in fact, it is possible that many of the elements that current Internet users consider vital will disappear in the new infrastructure. Though the average consumer will have many more options than they do from their home television today, attempts at mass distribution will likely favor mainstream big-budget programs over those that are controversial or appeal to a narrower audience.

It is possible that diversity available from all sources will decrease and independent productions will be even further marginalized. And the adoption of an asymmetrical architecture (a ten-lane highway coming into the library or home with a tiny path leading back out) would pose a significant barrier to those seeking to be information providers, and would favor a model of relatively passive consumption. And the kind of massification and leveling of culture that will follow is likely to be similar to the effects of broadcast television on culture.

History of Internet

Without a doubt, the Internet is undergoing a major transition as it experiences a tremendous influx of new users. Due to the anarchic, distributed nature of the net, we cannot even begin to enumerate the population of the Internet or its growth. As more of the world’s population moves on-line, new concerns will arise which did not confront the earlier generations. The new culture will demand different resources, services and technology than the old generations expected and used. Already we can witness a clash between the emergent culture and the entrenched culture.

The largest conflicts occurring now are about sharing resources, the impending commercialization of the net, and the growing problem of computer crime. The Internet was born in the union of government and researchers, and for two decades afterwards remained mostly the realm of those two groups. The net began as ARPANET, the Advanced Research Projects Agency Network, designed to be decentralized enough to sustain operations through a nuclear attack. This nature persists today in the resilience of the net, both technologically and in its culture.

ARPANET was phased out in 1990 and the net backbone was taken over by NSFNET (the National Science Foundation Network). Since 1969 the main users of cyberspace have been involved in research or in the university community as computer experts or hackers, exploring the limitations and capabilities of this new technology. These people formed a cohesive community with many of the same goals and ethics. In addition to the homogeneity of the net, the small size contributed to a strong feeling of community.

There has been some conflict between the hackers and the researchers over sharing resources, and philosophies about security and privacy, but on the whole, the two groups have co-existed without major incident. The newest of the members of the so-called old generation are the university users who are not involved in research work on the net. Generally these are the students using the net for email, reading netnews and participating in interactive real-time conversations through talk, telnet or irc. This wave of people integrated smoothly with the community as it existed.

Still sharing the common research and education orientation, the community remained cohesive and the culture did not change much; perhaps it only expanded in the more playful areas. These users did not compete with the researchers for resources other than computer time, which was rapidly becoming more available throughout the eighties. It is only in the past year or two that we have begun to see the explosion of the new generation on the Internet. Businesses have begun connecting themselves to the net, especially with the prospect of the NSFNET backbone changing hands to permit commercial traffic.

Public access nets run by communities or businesses are springing up in cities all over the world, bringing in users who know little about computers and are more interested in the entertainment and information they can glean from the net. Commercial providers like America Online and Compuserve are beginning to open gateways from their exclusive services to the open Internet, specifically allowing their users to access email, netnews and soon ftp and telnet services. The explosion of BBSs and the shared Fidonet software has brought many users who were previously unable to get an account through a university to the world of email and netnews.

At this point, anyone with a computer and a modem can access these most basic services. Several states, such as Maryland, have begun efforts to connect all their residents to the net, often through their library system. The city of Cambridge, MA now offers access to the World Wide Web for short segments of time in its public libraries, and even several progressive coffeehouses in the San Francisco Bay area, and soon in the Boston area, are offering public net access. In the last 20 years, the net has developed slowly, adapting comfortably as its population grew steadily and shifted the culture to more diverse interests.

But as the net faces a huge increase in its users in a short time, the reaction is bound to be more severe, and debate will center around several key issues that were irrelevant in a small homogeneous community. The establishment of new customs concerning these issues will define the culture of the future Internet. Most resources on the net currently are not designed to handle the amount of usage that will occur within the next six months. Sites which offer access to ftp archives are particularly worried about the massive influx of new users from commercial services opening access soon.

America Online administrators addressed this issue in a recent piece of email to ftp sysadmins in which they recognized the perceived problem and stated that they would “request that AOL members limit their FTP traffic to off-peak hours for sites” and “work with administrators to help manage load problems.” They offer to set up mirror sites for easier access to these resources. Unfortunately, this may not be adequate; it is certainly agreed by now that Internet users will need more patience in the future when accessing the information they want.

Many net users have been complaining recently about the influx of AOL users onto Usenet. Of course, perceptions of these new posters were not enhanced by a bug that caused their messages to be reposted eight times. Newsgroups (such as alt.aol-rejects) were created specifically with the intent of insulting AOL users and resenting their entrance onto Usenet. As the net becomes more crowded, we can expect more animosity and rivalry over “rights” to access resources. As the NSFNET backbone changes hands to allow business traffic, we will see even more of a business presence than that which already exists.

At the present time the ethics of business on the net are very unclear. The perception of commercial use as inappropriate use of the net still exists among many segments of the net community. Incidents such as the mass advertisements from the law team of Canter & Siegel have made many people fearful of the potential for abuse of access in cyberspace. On the other hand, useful services are coming on-line, especially with the advent of fill-out forms on the World Wide Web. With technology advancements like authentication and digital money, commercial activity will become even more widespread.

Computer crime becomes a much more immediate problem as the net's population expands without control. The old and new generations on the net have different security and privacy needs, and different views of what constitutes a computer crime. Even as this conflict plays out on the net, the print media sensationalize every story of computer break-ins and computer pornography rings. Often crimes that only incidentally involve the net are promoted as being symptomatic of the destructive anarchy that exists on the net.

This attitude towards news about the net will eventually bring with it stricter laws governing cyberspace. Major concerns in net crime now involve break-ins, data theft, privacy violations and harassment. When the net was new, it existed solely for the purpose of cooperation and collaboration between researchers. Thus, resources were shared regularly and uncomplainingly. There were few enough users that one could take the resources one needed without disturbing other people's use of the net. Of course, there was not as much available then for which users would compete.

A few years ago, the idea of commercializing the net was anathema to most of the users, but slowly and surely, businesses are establishing themselves on the net and will soon form a large portion of the traffic. The old generation fears the abuse of the anarchy of the net for advertising. Most people oppose intrusive methods of advertising, such as junk-mailing lists and “spamming” Usenet, or posting messages to many newsgroups, as Canter & Siegel did. Individual choice in viewing promotional material is important to the older generations because this is not intrusive, and in fact supplies a desirable service.

Word of mouth is an important factor in deciding to view information about a product or a service. On the smaller net of the past, there was less crime, less reason for crime, and less vulnerability to major damage. The net was a homogeneous community, dedicated to collaboration, and the information stored on the net was hardly as sensitive as the information soon to be spreading across the net: credit card numbers, driver's records, medical histories, proprietary information and sensitive financial information. The action most frowned upon by members of the old generation was misuse of resources.

Most realized that their systems and accounts were not very secure and tolerated some exploration by curious hackers (though not destruction of data). However, the old generation received a rude awakening in November 1988 with the Internet worm. As the worm spread to machines all over the nation, bringing down computer systems by the dozens, the net community began to realize that the security of the net would help them protect their data and their resources. Although the worm was not a malicious invention, it was easy to conceive of a recurrence of the worm with destructive attributes.

In the early beginnings, many systems were open to all who wished to come and share data or read documents. Computer experts enjoyed exploring systems and finding entrances just for the knowledge to be gained from these activities. This “breaking in” to systems was not a major concern for users. Over time, though, people began to feel a right to privacy and security of their information, and hackers fell into disfavor. Data theft was also not a big concern, as the purpose of the net was to share data, not to restrict information. There was very little personal or private information stored on the net.

The small community only included users with legitimate research concerns at the beginning, and cyberspace was not as anonymous as it is now, so harassment was not a concern. The new generation has heard of the infinite resources of the net and the hundreds of communities established on-line. In the last several years the news media have been trumpeting the magical things that the Internet can do for our society. Tantalized by these reports, thousands of people unaffiliated with research institutions or the government are streaming onto the Internet to access these resources.

This influx is causing a monumental change in the direction and the culture of the Internet. We are seeing the beginning of commercialization of the net. This definitely represents a trend away from the old attitudes, as commercial activity has been frowned upon for years. Now the people of the net demand commercial services and information about products, and companies demand access to consumers. It is unclear to me what the new generation of net users wants in the form of advertising. Within the last year, however, we have seen a frightening example of the potential for abuse of the Internet by advertisers with the law team of Canter and Siegel.

Their message, which was posted to almost all newsgroups, was considered very invasive and extremely inappropriate, yet the duo state that they considered the advertisement a success and are willing to repeat it. Is this the kind of advertising the new generations want to see? Do we want our inboxes filled with junk email and our travels on the net interspersed with advertising? Because more of us will be on-line, and more of our commercial and business transactions will be taking place on-line in the future, crime will rise in cyberspace, and people will need to be protected.

Currently the net operates mostly in an anarchic state, with sysadmins and government officials patrolling the borders. There may, however, be a call for greater security on the net. Because of the existence of much proprietary and personal information on the net in the future, access to sites will be restricted severely, and breaking into systems will become a more serious crime. Many people are willing to let the government install our safeguards, but there has been recent controversy about what kind of access the government should have to our information.

Computer crime has been sensationalized recently in the media, especially crimes linked to sex offenders or pornography distributors. I believe that this kind of reporting is detrimental to the future of the net because it may incite unnaturally stringent lawmaking in cyberspace. As the Internet grows to encompass a larger segment of the world's population, its diversity will increase until it begins to mirror the external world. We are beginning to see a breakdown in the previously homogeneous characteristics of economic status and educational background.

In the San Francisco Bay area there are coffeehouses with cheap access to an on-line chat area that even homeless people can afford, and indeed, many homeless people have come to find that these chat areas give them a sense of community and “home.” Local library systems across the nation are providing net access. Maryland's Sailor project is a good example: they provide gopher access in the libraries and through toll-free dialup, and individual libraries will begin to offer full access with mail, ftp and telnet. With the coming of the National Information Infrastructure, net access may become as common as telephone access.

It will cease to be merely a useful toy and tool for the research community and will be a simple fact of life, a point of access to a wealth of information and a meeting place for dispersed communities. We can easily expect conflict to arise in this nascent world net community simply because of differences in needs and visions for the net. An old attitude that makes it difficult to create harmony between the old generations and the new is the behavior of more experienced users towards ‘newbies’ on the net. In the past, one could expect other users to be somewhat familiar with the computing environment.

People who asked too many ‘stupid’ questions were ostracized and ‘flamed’. Now the net must handle a gigantic influx of users with less computer experience, who will ask thousands of questions in their exploration of the obscure operations of the Internet. People come to the net with great expectations of the vast resources available to them, and they do make use of them. Unfortunately, not all sites are able to accommodate the increase in traffic, especially with services like Compuserve and America On-line opening their gates to the Internet.

In a letter to ftp sysadmins, Robert Hirsh of AOL states that AOL will request that its members limit traffic to off-peak hours and that AOL will work with administrators to manage load problems, specifically by providing local mirror sites for AOL users and for Internet users. One Internet user from the University of Massachusetts voiced his fears in a post to the newsgroup alt.aol-sucks: “… careless actions by AOLers could seriously jeopardize access and availability on sites already overloaded and restricted” and “Those who depend on the Internet for legitimate information retrieval/sharing and communication will find themselves swamped in a sea of curiosity seekers, net.sex geeks, and those who are convinced that ‘telnet’ is synonymous with ‘Information Superhighway.’” The old generation perceives the new generations as overtaxing the resources and resents the burgeoning population. Conflicts are inevitable in the commercialization of the net. Simply put, the old common philosophy was opposed to commercial activity on the net because the net existed solely for research purposes.

The new generations see the net as the center for many services and operations, and thus will require heavy commercialization of the net. Commercialization does promise to bring more advancement in technology and more investment in the net. The old generation is being forced to accept commercialization, and there has been little outcry over the appearance of commercial WWW sites. More than anything else, the old generation fears the intrusion of advertising, but this may become commonplace as people join the net through commercial providers and access commercial servers.

Beyond resource management and commercial use, the area of most concern, both policy-wise and legally, is that of computer crime. The older generation was used to an anarchic Internet, and some would like to continue this experiment in the spirit of freedom, but new users are demanding protections similar to those we enjoy in the physical world. I believe that the need for security is justified, though, because of the expanding and changing nature of the Internet. In particular, breaking in for exploratory purposes will be frowned upon.

As our cyber-dealings gain importance and we begin to think of our cyber-personae as extensions of ourselves into the realm of cyberspace, privacy violations, data theft and other crime will become more serious. We will spend more time in cyberspace handling our business correspondence, purchasing products, disseminating information and interacting with other people. Through these activities we will gain identities in cyberspace that will be as important to us as our identities in the physical world.

We will need to have easily available forms of authentication of people's identities, probably through a digital signature. Will we need to ensure that people have only one identity in cyberspace? This may seem logical at first, just as in the physical world we are recognized as only one identity by the government for purposes of law and finance. However, I believe that imposing too many restraints in cyberspace will fail, because there is a tradition of working around the technical solutions of authority to access greater freedom. Perhaps it will work in the business world, because fair dealings involve authentication of identity.
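
As one concrete illustration of what such digital-signature authentication could look like, here is a minimal sketch using the third-party Python cryptography package and an Ed25519 key pair; the message and identity shown are hypothetical, and this is only one of many possible signature schemes.

    # Minimal sketch: signing a message and verifying the signature.
    # Assumes the third-party "cryptography" package is installed.
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    private_key = Ed25519PrivateKey.generate()   # kept secret by the signer
    public_key = private_key.public_key()        # published, like a digital ID

    message = b"I, user@example.org, authorize this purchase."
    signature = private_key.sign(message)

    try:
        public_key.verify(signature, message)    # raises if message or signature was altered
        print("signature valid: identity confirmed")
    except InvalidSignature:
        print("signature invalid: do not trust this message")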

The net will become increasingly supported by commercial services, and many of the resources we now have free of charge will become commercial because they cannot serve the increasing population without funding. Advertisement will become a commonplace occurrence on the net, though I hope that by convention it will remain unobtrusive. I fear that as more information about ourselves becomes available on-line, marketers will not resist the opportunity to use this knowledge to their advantage by targeting us for specific product pitches.

Cyberspace will be policed in the future. I envision an agreement between nations regarding illegal actions occurring in cyberspace on an international scale, not unlike the current law of the sea. We will see the most control occurring where people get their access to the net. Walls will go up in cyberspace; information will be hidden and restrained. We will still have hackers working their art on the net, finding ways around our technological barriers, and they will become more dangerous as we have more sensitive information on the net.

Crime stories on the net will be sensationalized because there will still be fear and misunderstanding of cyberspace, and because of the increasing importance of on-line security. The diversity of the emerging cultures will segment into like-minded communities. Information on the net is oriented towards serving interests, not uniting diverse ones. Thus, I fear that the division between the older generations and the new ones will become institutionalized as each culture builds the part of cyberspace in which it wishes to exist, and there will be little communication between the parts culturally.

As we progress into the information age, everyone will move into cyberspace, just as most people have adopted telephones and integrated them into their homes and businesses. Thus, the on-line culture will slowly begin to duplicate the physical world in its inequalities and segmentation, its diversity and opportunity. Restrictions will go up and walls will be built in cyberspace. There will be laws and regional police to enforce those laws and monitor security in their regions.

We are undergoing a transition perhaps on the same scale as the transition to literacy several hundred years ago. For many centuries after writing began, this skill was left in the hands of the educated elite – mainly the church servants. When literacy finally came to the majority of the middle class and some of the lower class, the Renaissance began. Similarly, we are witnessing the opening of a new medium of information to the general populace, and we can only guess at the outcome.

Internet Tax Essay

The sign at Wal-Mart says, “99 Cents.” However, if you go through the checkout line and greet the cashier with merely a dollar, the cashier would laugh at you. The price isn't ninety-nine cents, it's $1.06. The taxman at our friendly Raymore Wal-Mart claims an additional 7.45% of one's hard-earned dollars. The additional seven cents doesn't sound like much, but it adds up when it's seven additional cents for each dollar of the thousands of dollars a person spends at a business like Wal-Mart every year. These seven-cent deposits go to the Missouri and Raymore tax funds.

These taxes help pay for public education, Medicaid, and other state and local services (Alster). There is only one problem: a new form of commerce is taking business away from normal “brick and mortar” businesses like Wal-Mart. This form of commerce isn't new; it's actually a few years old, and it's growing exponentially now. This new standard of commerce is called “electronic commerce” or “e-commerce.” E-commerce accounted for $4 billion in revenues in 1997 (“Let's Not Rush”), which is almost negligible compared to the trillions of dollars in total revenues in the United States.

However, it is forecast to be much more popular in the future. Shopping on the Internet has many benefits over shopping in local stores. E-commerce, or electronic commerce, is commerce done over the Internet. The Internet is seen as the future for business and information technologies. Why would a person want to spend a few minutes getting dressed, drive fifteen minutes to Wal-Mart, spend an hour or two shopping, and then spend another half an hour, on a good day, checking out and driving back home?

In the middle of all that, don't forget the hassle of trying to find the right toy while listening to obnoxious kids and dealing with sub-par customer service. There is another problem: most stores aren't open 24 hours a day like Wal-Mart and QuickTrip. The biggest problem with regular business isn't the customer service or the busy lines; it's the six to eight percent boost to the sale price that occurs at the end of that busy line.

Until October 21, 2001, by way of the Internet Tax Freedom Act (“Bill”), any business on the Internet not located in your state of residence isn't required to charge its customers a single cent of sales tax. In other words, a person saves 7.45% by purchasing goods from the Internet instead of Wal-Mart. That in itself is enough to bring booming business to the Internet, not to mention that you can buy stuff online at four in the morning with no clothes on and no one else in the world would ever know.

Although a person saves on sales tax, the customer must pay to have the products delivered. Depending on how much you spend on the Internet, this fee may be a little more or substantially less than the money saved by not paying sales tax. There is a hidden shipping cost when a person shops at Wal-Mart, too. Don't forget about the gas you burn driving there and back, the slow wear and tear on your vehicle of choice, and all the time you spend driving around town looking for the best prices.
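
To put rough numbers on that trade-off, here is a minimal sketch using the Raymore 7.45% rate quoted earlier; the flat $40 shipping charge and the two item prices are illustrative assumptions rather than real quotes.

    # Compare the at-the-register cost of a local purchase with an online purchase.
    SALES_TAX_RATE = 0.0745   # combined Missouri/Raymore rate cited in this essay
    SHIPPING_FEE = 40.00      # illustrative flat shipping charge

    def in_store_total(price: float) -> float:
        """Sticker price plus local sales tax."""
        return round(price * (1 + SALES_TAX_RATE), 2)

    def online_total(price: float) -> float:
        """Sticker price plus shipping, with no sales tax (pre-2001 rules)."""
        return round(price + SHIPPING_FEE, 2)

    for price in (0.99, 2000.00):
        print(f"${price:>8.2f} item: in store ${in_store_total(price):>8.2f}, "
              f"online ${online_total(price):>8.2f}")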

On the Internet, there are web sites that tell you the best prices for products and where to get them. Buying some items on the Internet is the only way to go. Ever buy a computer for $2,000 and pay an additional $150 in taxes? Not on the Internet. It might cost forty dollars to have it shipped, but that extra $110 would fit nicely back into a wallet. The Internet Tax Freedom Act of 1998 guarantees no sales taxes until after October 21, 2001. The act also established a commission to research and plan a way to tax the Internet. The problem is very complex.

For instance, pretend person A owns a business on the Internet. Person A lives in Missouri, but the computer that houses his business is in New York. Person B comes along and buys something from the web site. Person B lives in Kansas. The problem is that Missouri, New York, and Kansas all want to collect that sales tax. A person shouldn't have to pay tax on something three times, though. The debate is over who gets the tax. On the other hand, what if the business is housed on a computer in London, England? Then what happens?

Don't forget that the next customer could be from China as well. New technologies lead to new problems. No matter what happens, if there is a sales tax, it will discriminate against somebody. The only solution I can see is to not have a sales tax, but that discriminates against normal “brick and mortar” businesses, unless sales taxes are abolished nationally and higher income taxes are paid instead, which may be the only viable option. The Internet is a growing marketplace, but it will never totally replace “brick and mortar” businesses.

When it's 5:00 p.m. and there's nothing in the fridge for dinner, ordering bread and lunchmeat on the Internet isn't a feasible option; driving to Price Chopper is. Grocery stores probably have the least to worry about. Who would order milk or ice cream over the Internet? Non-consumable goods, on the other hand, have a lot to fear. Clothes, electronics, music, cars, furniture, toys, and many other items are quickly finding themselves on the Internet, all presently tax-free. Many items not included in general store stock are easily obtainable on the Internet.

There are generally more variations and brands to choose from on the Internet, all accessible without driving all over town. The largest fear for most people who are interested in shopping online is Internet security. People worry about their credit card numbers getting stolen, products not getting delivered, and other problems. The easiest answer to that is to use large, well-known stores online. Don't buy something from some guy who merely claims he has what you need. Seek other people's opinions about the site as well.

E-mail the store and ask about its security practices, and as far as that goes, call and place the order over the phone if you are too concerned. New 128-bit encryption, used to scramble messages on the Internet, protects customers. If a person intercepts a message between a customer and the store, he or she won't be able to read it because it's encrypted, and it could take hours or days to crack the code (“Is it safe”). It's more likely that a cashier at a local store would memorize your credit card number and use it later.
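
As a rough illustration of the idea, the sketch below uses the Fernet recipe from the third-party Python cryptography package, which scrambles data with a 128-bit AES key; the order text and key handling are hypothetical, and real online stores rely on SSL/TLS sessions rather than this exact recipe.

    # Minimal sketch: a shared secret key scrambles an order so an eavesdropper
    # who intercepts the bytes in transit cannot read the card number.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()          # in practice, negotiated between browser and store
    cipher = Fernet(key)

    order = b"1 toy truck, card number 4111-1111-1111-1111"
    ciphertext = cipher.encrypt(order)   # what an eavesdropper would see

    print(ciphertext[:40], b"...")       # unreadable without the key
    print(cipher.decrypt(ciphertext))    # only the key holder recovers the order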

As far as security goes, when was the last time a person got robbed in the Internet store parking lot or got car-jacked on the way home? Ever have car problems while out shopping? Not on the Internet. The only thing you have to worry about is accidentally turning off the computer or the Internet service provider crashing. While shopping on the Internet, customers are in the safety and convenience of their own home. The Internet is undeniably the best source of information in the world, and it may very well be the next standard for commerce. No taxes and other benefits have made it possible for e-commerce to grow at an outlandish speed.

Taxing it would slow its growth, although finding a way to tax it fairly is the hard part. “Brick and mortar” companies are in rich supply these days. In the future, they will probably dwindle, but they'll never disappear. Choosing the right product is a lot easier when you see it first hand, and some people enjoy shopping simply for the social activity. Meanwhile, the rest of us can sit around in our pajamas and buy stuff for great prices at two in the morning; paying shipping definitely sucks, but it's better than paying Uncle Sam, at least for now.

Security on the Internet

How do you secure something that is changing faster than you can fix it? The Internet has had security problems since its earliest days as a pure research project. Today, after several years and orders of magnitude of growth, it still has security problems. It is being used for a purpose for which it was never intended: commerce. It is somewhat ironic that the early Internet was designed as a prototype for a high-availability command and control network that could resist outages resulting from enemy actions, yet it cannot resist college undergraduates.

The problem is that the attackers are on, and make up a part of, the network they are attacking. Designing a system that is capable of resisting attack from within, while still growing and evolving at a breakneck pace, is probably impossible. Deep infrastructure changes are needed, and once you have achieved a certain amount of size, the sheer inertia of the installed base may make it impossible to apply fixes. The challenges for the security industry are growing. With electronic commerce spreading over the Internet, there are issues such as nonrepudiation to be solved.

Financial institutions will have both technical concerns, such as the security of a credit card number or banking information, and legal concerns for holding individuals responsible for their actions such as their purchases or sales over the Internet. Issuance and management of encryption keys for millions of users will pose a new type of challenge. While some technologies have been developed, only an industry-wide effort and cooperation can minimize risks and ensure privacy for users, data confidentiality for the financial institutions, and nonrepudiation for electronic commerce.

With the continuing growth in linking individuals and businesses over the Internet, some social issues are starting to surface. Society may take time to adapt to the new concept of transacting business over the Internet. Consumers may take time to trust the network and accept it as a substitute for transacting business in person. Another class of concerns relates to restricting access over the Internet. Preventing distribution of pornography and other objectionable material over the Internet has already been in the news.

We can expect new social hurdles over time and hope the great benefits of the Internet will continue to outweigh these hurdles through new technologies and legislation. The World Wide Web is the single largest, most ubiquitous source of information in the world, and it sprang up spontaneously. People use interactive Web pages to obtain stock quotes, receive tax information from the Internal Revenue Service, make appointments with a hairdresser, consult a pregnancy planner to determine ovulation dates, conduct election polls, register for a conference, search for old friends, and the list goes on.

It is only natural that the Web's functionality, popularity, and ubiquity have made it the seemingly ideal platform for conducting electronic commerce. People can now go online to buy CDs, clothing, concert tickets, and stocks. Several companies, such as Digicash, Cybercash, and First Virtual, have sprung up to provide mechanisms for conducting business on the Web. The savings in cost and the convenience of shopping via the Web are incalculable. Whereas most successful computer systems result from careful, methodical planning followed by hard work, the Web took on a life of its own from the very beginning.

The introduction of a common protocol and a friendly graphical user interface was all that was needed to ignite the Internet explosion. The Web’s virtues are extolled without end, but its rapid growth and universal adoption have not been without cost. In particular, security was added as an afterthought. New capabilities were added ad hoc to satisfy the growing demand for features without carefully considering the impact on security. As general-purpose scripts were introduced on both the client and the server sides, the dangers of accidental and malicious abuse grew.

It did not take long for the Web to move from the scientific community to the commercial world. At this point, the security threats became much more serious. The incentive for malicious attackers to exploit vulnerabilities in the underlying technologies is at an all-time high. This is indeed frightening when we consider what attackers of computer systems have accomplished when their only incentive was fun and boosting their egos. When business and profit are at stake, we cannot assume anything less than the most dedicated and resourceful attackers trying their utmost to steal, cheat, and perform malice against users of the Web.

When people use their computers to surf the Web, they have many expectations. They expect to find all sorts of interesting information, they expect to have opportunities to shop, and they expect to be bombarded with all sorts of ads. Even people who do not use the Web are in jeopardy of being impersonated on the Web. There are simple and advanced methods for ensuring browser security and protecting user privacy. The simpler techniques are user certification schemes, which rely on digital IDs. Netscape Navigator and Internet Explorer allow users to obtain and use personal certificates.

Currently, the only company offering such certificates is Verisign, whose digital IDs consist of a certificate of a user’s identity, signed by Verisign. There are four classes of digital IDs, each representing a different level of assurance in the identity, and each comes at an increasingly higher cost. The assurance is determined by the effort that goes into identifying the person requesting the certificate. Class 1 Digital IDs, intended for casual Web browsing, provide users with an unambiguous name and e-mail address within Verisign’s domain.

A Class 1 ID provides assurance to the server that the client is using an identity issued by Verisign but little guarantee about the actual person behind the ID. Class 2 Digital IDs require third party confirmation of name, address, and other personal information related to the user, and they are available only to residents of the United States and Canada. The information provided to Verisign is checked against a consumer database maintained by Equifax. To protect against insiders at Verisign issuing bogus digital IDs, a hardware device is used to generate the certificates.

Class 3 Digital IDs are not yet available; their purpose is to bind an individual to an organization, so a user in possession of such an ID could, theoretically, prove that he or she belongs to the organization that employs him or her. The idea behind digital IDs is that they are entered into the browser and then sent automatically when users connect to sites requiring personal certificates. Unfortunately, the only practical effect is to make impersonating users on the network only a little bit more difficult.
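
As a concrete illustration of a client presenting a personal certificate, the sketch below uses Python’s requests library to connect to a hypothetical site that requires client authentication; the URL and the certificate file names are assumptions for illustration, not part of any product described above.

    # Sketch: presenting a personal certificate (digital ID) when connecting
    # to a site that requires client authentication. The URL and the
    # certificate/key file names are hypothetical placeholders.
    import requests

    response = requests.get(
        "https://members.example.com/account",             # hypothetical protected site
        cert=("my_digital_id.pem", "my_private_key.pem"),  # client certificate and key
        verify=True,                                       # also check the server's certificate
    )
    print(response.status_code)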

Many Web sites require their users to register a name and a password. When users connect to these sites, their browser pops up an authentication window that asks for these two items. Usually, the browser then sends the name and password to the server, which can then allow retrieval of the remaining pages at the site. The authentication information can be protected from eavesdropping and replay by using the SSL protocol. As the number of sites requiring simple authentication grows, so does the number of passwords that each user must maintain.
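
A minimal sketch of this name-and-password exchange, assuming Python’s requests library and a made-up HTTPS URL; carrying the request over SSL/TLS is what protects the credentials from eavesdropping and replay.

    # Sketch: simple name/password (basic) authentication sent over HTTPS so
    # the credentials are encrypted in transit. URL and credentials are
    # placeholders.
    import requests
    from requests.auth import HTTPBasicAuth

    resp = requests.get(
        "https://www.example.com/protected/index.html",
        auth=HTTPBasicAuth("alice", "correct horse battery staple"),
    )
    print(resp.status_code)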

In fact, users are often required to have several different passwords for systems in their workplace, for personal accounts, for special accounts relating to payroll and vacation, and so on. It is not uncommon for users to have more than six sites they visit that require passwords.

In the early days of networking, firewalls were intended less as security devices than as a means of preventing broken networking software or hardware from crashing wide-area networks. In those days, malformed packets or bogus routes frequently crashed systems and disrupted servers.

Desperate network managers installed screening systems to reduce the damage that could happen if a subnet’s routing tables got confused or if a system’s Ethernet card malfunctioned. When companies began connecting to what is now the Internet, firewalls acted as a means of isolating networks to provide security as well as enforce an administrative boundary. Early hackers were not very sophisticated; neither were early firewalls. Today, firewalls are sold by many vendors and protect tens of thousands of sites.

The products are a far cry from the first-generation firewalls, now including fancy graphical user interfaces, intrusion detection systems, and various forms of tamper-proof software. To operate, a firewall sits between the protected network and all external access points. To work effectively, firewalls have to guard every access point into the network’s perimeter; otherwise, an attacker can simply go around the firewall and attack an undefended connection. The simple days of firewalls ended when the Web exploded.

Suddenly, instead of handling only a few simple services in an “us versus them” manner, firewalls must now contend with complex data and protocols. Today’s firewall has to handle multimedia traffic, downloadable programs (applets), and a host of other protocols plugged into Web browsers. This development has produced a basic conflict: the firewall is in the way of the things users want to do. A second problem has arisen as many sites want to host Web servers: does the Web server go inside or outside of the firewall?

Firewalls are both a blessing and a curse. Presumably, they help deflect attacks. They also complicate users’ lives, make Web server administrators’ jobs harder, rob network performance, add an extra point of failure, cost money, and make networks more complex to manage. Firewall technologies, like all other Internet technologies, are changing rapidly. There are two main types of firewalls, proxy and network-layer, plus many variations.

The idea of a proxy firewall is simple: rather than have users log into a gateway host and then access the Internet from there, give them a set of restricted programs running on the gateway host and let them talk to those programs, which act as proxies on behalf of the user. The user never has an account or login on the firewall itself and can interact only with a tightly controlled, restricted environment created by the firewall’s administrator. This approach greatly enhances the security of the firewall itself because it means that users do not have accounts or shell access to the operating system.

Most UNIX bugs require that the attacker have a login on the system to exploit them. By throwing the users off the firewall, it becomes just a dedicated platform that does nothing except support a small set of proxies; it is no longer a general-purpose computing environment. The proxies, in turn, are carefully designed to be reliable and secure because they are the only real point of the system against which an attack can be launched. Proxy firewalls have evolved to the point where today they support a wide range of services and run on a number of different UNIX and Windows NT platforms.
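
The proxy idea can be sketched in a few lines: a small relay program on the gateway that is the only thing users ever talk to, and that forwards their traffic to a permitted destination. This toy sketch in Python omits everything a real proxy firewall adds (protocol awareness, access control, logging), and the host names and ports are invented.

    # Toy sketch of a proxy on a gateway host: users connect only to this
    # relay, never to the outside system directly. Host and ports are
    # illustrative; a real proxy firewall adds access control and logging.
    import socket
    import threading

    LISTEN_PORT = 8080
    ALLOWED_DESTINATION = ("www.example.com", 80)   # the one place this proxy will relay to

    def pipe(src, dst):
        # Copy bytes in one direction until the connection closes.
        try:
            while True:
                data = src.recv(4096)
                if not data:
                    break
                dst.sendall(data)
        except OSError:
            pass
        finally:
            try:
                dst.shutdown(socket.SHUT_WR)
            except OSError:
                pass

    def handle(client):
        upstream = socket.create_connection(ALLOWED_DESTINATION)
        threading.Thread(target=pipe, args=(client, upstream), daemon=True).start()
        threading.Thread(target=pipe, args=(upstream, client), daemon=True).start()

    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("", LISTEN_PORT))
    server.listen(5)
    while True:
        connection, _address = server.accept()
        handle(connection)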

Many security experts believe that proxy firewalls are more secure than other types of firewalls, largely because the first proxy firewalls were able to apply additional controls to the data traversing the proxy. The real reason for proxy firewalls, however, was their ease of implementation, not their security properties. For security, it does not really matter where in the processing of the data the security check is made; what matters is that it is made at all. Because they do not allow any direct communication between the protected network and the outside world, proxy firewalls inherently provide network address translation.

Whenever an outside site gets a connection from the firewall’s proxy address, the firewall in turn hides and translates the addresses of the systems behind it.

Prior to the invention of firewalls, routers were often pressed into service to provide security and network isolation. Many sites connecting to the Internet in the early days relied on ordinary routers to filter the types of traffic allowed into or out of the network. Routers operate on each packet as a unique event unrelated to previous packets, filtering on the IP source address, the IP destination address, the port number, and a few other basic fields in the packet header.
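
To show what judging each packet as a unique event looks like, the sketch below applies a stateless rule list to individual packet headers; the addresses and rules are invented for illustration.

    # Sketch of stateless packet filtering: every packet is judged on its
    # header fields alone, with no memory of earlier packets. The rules and
    # addresses below are invented for illustration.
    import ipaddress

    RULES = [
        # (source network, destination network, destination port or None, action)
        ("0.0.0.0/0",    "192.0.2.25/32", 25,   "allow"),  # inbound mail to our mail host
        ("0.0.0.0/0",    "0.0.0.0/0",     23,   "deny"),   # block telnet everywhere
        ("192.0.2.0/24", "0.0.0.0/0",     None, "allow"),  # anything outbound from our net
    ]

    def decide(src_ip, dst_ip, dst_port):
        for src_net, dst_net, port, action in RULES:
            if (ipaddress.ip_address(src_ip) in ipaddress.ip_network(src_net)
                    and ipaddress.ip_address(dst_ip) in ipaddress.ip_network(dst_net)
                    and (port is None or port == dst_port)):
                return action
        return "deny"   # default: drop anything not explicitly allowed

    print(decide("203.0.113.9", "192.0.2.25", 25))   # allow (inbound mail)
    print(decide("203.0.113.9", "192.0.2.40", 23))   # deny (telnet)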

Filtering, strictly speaking, does not constitute a firewall because it does not have quite enough detailed control over data flow to permit building highly secure connections. The biggest problem with using filtering routers for security is the FTP protocol, which, as part of its specification, makes a callback connection in which the remote system initiates a connection to the client, over which data is transmitted.

Cryptography is at the heart of computer and network security. The important cryptographic functions are encryption, decryption, one-way hashing, and digital signatures.
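
As a quick illustration of the one-way hashing just mentioned (SHA-256 is used here only as an example of such a function), note how a one-character change in the input produces a completely different digest.

    # Sketch of a one-way hash: easy to compute, infeasible to invert, and
    # extremely sensitive to any change in the input.
    import hashlib

    print(hashlib.sha256(b"Pay Alice $10").hexdigest())
    print(hashlib.sha256(b"Pay Alice $11").hexdigest())   # one character changed, digest entirely different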

Ciphers are divided into two categories, symmetric and asymmetric, or public-key systems. Symmetric ciphers are functions where the same key is used for encryption and decryption. Public-key systems can be used for encryption, but they are also useful for key agreement and digital signatures. Key-agreement protocols enable two parties to compute a secret key, even in the face of an eavesdropper. Symmetric ciphers are the most efficient way to encrypt data so that its confidentiality and integrity are preserved.
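
The key-agreement idea mentioned above can be sketched in the Diffie-Hellman style: both parties derive the same secret even though an eavesdropper sees only the public values. The numbers here are deliberately tiny and purely illustrative.

    # Toy sketch of key agreement: Alice and Bob exchange only public values
    # yet compute the same shared secret. Real systems use very large primes.
    p, g = 23, 5                  # toy modulus and generator (far too small for real use)

    a = 6                         # Alice's private value
    b = 15                        # Bob's private value

    A = pow(g, a, p)              # Alice publishes A
    B = pow(g, b, p)              # Bob publishes B

    alice_secret = pow(B, a, p)   # Alice combines Bob's public value with her private value
    bob_secret = pow(A, b, p)     # Bob does the same with Alice's public value

    assert alice_secret == bob_secret
    print(alice_secret)           # the shared secret; an eavesdropper saw only p, g, A, B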

With a symmetric cipher, the data remains secret to those who do not possess the secret key, and modifications to the ciphertext can be detected during decryption. Two of the most popular symmetric ciphers are the Data Encryption Standard (DES) and the International Data Encryption Algorithm (IDEA). The DES algorithm operates on blocks of 64 bits at a time using a key length of 56 bits. The 64 bits are permuted according to the value of the key, so encryption with two keys that differ in only one bit produces two completely different ciphertexts.
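
The one-bit observation is easy to demonstrate, assuming the PyCryptodome package supplies the DES primitive: encrypting the same 64-bit block under two keys that differ in a single key bit gives two unrelated ciphertexts.

    # Sketch of the claim above, assuming PyCryptodome (pip install pycryptodome):
    # two DES keys differing in one key bit encrypt the same block to
    # completely different ciphertexts.
    from Crypto.Cipher import DES

    block = b"8bytes!!"                            # one 64-bit plaintext block
    key1 = bytes.fromhex("0123456789ABCDEF")
    key2 = bytes.fromhex("8123456789ABCDEF")       # top bit flipped (a key bit, not a parity bit)

    c1 = DES.new(key1, DES.MODE_ECB).encrypt(block)
    c2 = DES.new(key2, DES.MODE_ECB).encrypt(block)

    print(c1.hex())
    print(c2.hex())                                # bears no resemblance to c1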

The most popular mode of DES is Cipher Block Chaining (CBC) mode, in which the output from the previous block is mixed with the plaintext of each block; the first block is mixed with a special value called the Initialization Vector (a short sketch of this chaining appears below).

Despite its size and rapid growth, the Web is still in its infancy. So is the software industry. We are just beginning to learn how to develop secure software, and we are beginning to understand that for our future, if it is to be online, we need to incorporate security into the basic underpinnings of everything we develop.
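
Here is the chaining referred to above as a minimal sketch, again assuming PyCryptodome for the underlying DES block operation: each plaintext block is XORed with the previous ciphertext block (the Initialization Vector for the first block) before being encrypted. The key, IV and message are placeholders.

    # Sketch of Cipher Block Chaining built by hand on top of DES in ECB mode
    # (PyCryptodome assumed). Each plaintext block is XORed with the previous
    # ciphertext block; the first block is XORed with the Initialization Vector.
    from Crypto.Cipher import DES

    def xor_blocks(a, b):
        return bytes(x ^ y for x, y in zip(a, b))

    def des_cbc_encrypt(key, iv, plaintext):
        # plaintext length is assumed to be a multiple of the 8-byte block size
        ecb = DES.new(key, DES.MODE_ECB)
        previous = iv
        ciphertext = b""
        for i in range(0, len(plaintext), 8):
            block = xor_blocks(plaintext[i:i + 8], previous)
            previous = ecb.encrypt(block)
            ciphertext += previous
        return ciphertext

    key = b"56bitkey"                              # 8 bytes; DES uses 56 of them as key bits
    iv = bytes.fromhex("0011223344556677")         # the Initialization Vector
    print(des_cbc_encrypt(key, iv, b"an exact sixteen").hex())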

E-commerce and Security

This project will look at e-commerce, concentrating on the security measures of an online auction site, eBay. Security on the Internet is a concern for any online business in today’s society. We will discuss online services, how businesses on the Internet conduct their transactions, and shipping.

Description of the problem: Technology is moving ahead at a rapid pace and reinventing the way business is done. E-business has the potential to affect every part of the value chain, from inbound logistics and operations through to outbound logistics, marketing and after-sales support. Forecasts indicate that in 2001 e-business will top US$434 billion, and much of that will be in business-to-business transactions. The advantages are clear: e-business can help cut costs, link supply chains more efficiently and serve markets more effectively.

Scalability, flexibility, security, performance and back-end integration are all vital issues, and getting the architecture right is the key. While online services are growing, security measures are becoming more of a concern. Background to Online Services. Offering an end-to-end online service: The concept of e-commerce here essentially gives the customer indirect access to the company’s mainframe-based insurance systems. The customer can get information and quotes online whenever it suits them, and they can also buy their insurance coverage online and be instantly insured.

This is an end-to-end, fully online operation available 24 hours a day. Integrating the supply chain: Businesses need to take advantage of e-commerce to beat the competition. One example is a leading organization in the building industry, with over 2000 suppliers and 25 stores serving both trade and retail customers. They wanted to reduce overheads, strengthen their supplier relationships and take advantage of e-commerce. One solution is to automate the purchase order process: orders are sent directly to a supplier’s system, with a consolidated order sent at the end of the day.

Suppliers complete the process by invoicing the client back electronically; the invoice is automatically generated when the goods are despatched. Orders and invoices can be in any file format required by either the buyer or the supplier. All that is required is a common network between the two parties, such as the Internet, and neither party is tied to any particular Internet service or bandwidth capabilities. In fact, the solution even works if one party does not have a computer, because both orders and invoices can be arranged electronically over the phone.
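
A minimal sketch of the end-of-day consolidation step described above, with invented order records and a made-up CSV layout; the real service’s file formats and transport are not specified here.

    # Sketch of consolidating a day's purchase orders into one combined order
    # per item before it is sent to the supplier's system. Records, supplier
    # name and CSV layout are invented for illustration.
    import csv
    import io

    orders_today = [
        {"supplier": "acme-timber", "item": "PINE-2400", "qty": 40},
        {"supplier": "acme-timber", "item": "PINE-1800", "qty": 25},
        {"supplier": "acme-timber", "item": "PINE-2400", "qty": 10},
    ]

    def consolidate(orders):
        # Merge the day's orders per item so one combined line goes out.
        totals = {}
        for order in orders:
            totals[order["item"]] = totals.get(order["item"], 0) + order["qty"]
        return totals

    def as_csv(totals):
        buffer = io.StringIO()
        writer = csv.writer(buffer)
        writer.writerow(["item", "quantity"])
        for item, qty in sorted(totals.items()):
            writer.writerow([item, qty])
        return buffer.getvalue()

    print(as_csv(consolidate(orders_today)))   # this text would be transmitted to the supplier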

Becoming an electronic middleman: A unique electronic infrastructure means a wide range of businesses can transact their e-business via the client’s service, over the Internet for example. Flexibility, security and ease of replication are key business requirements. The solution provides diverse means of access, high-grade public-key encryption of messages, full transaction logging and audit, and a transaction manager that combines multiple back-end applications into a single scalable view for the customer.

These technical skills have made a new multi-bank bill payment service possible. Customers can pay their bills via the Internet using secure payment services from banks around the world. They can view bills before payment, set up automatic payments, and load new payments at any time by selecting from lists of banks and billers. From a business perspective, the service gives banks and billers a cost-effective electronic bill delivery and payment service without their having to develop their own systems.

Any transaction activity done on the Internet, especially anything dealing with payments, must be highly secure. No customer information is held on the Web server, the system uses the latest 128-bit encryption technology, and customers have passwords and IDs. International retailing via the net: There are some examples of international retailing done on the Internet. For example, an internationally known publisher wanted an e-business model to launch an Internet-based weekly subscription service.

They required an international electronic subscription mechanism and a secure credit card payment system with payments processed in the customer’s own country. The middleware that allows the company to both capture and manage subscriptions and process payments works with a range of banks around the world and allows dealings in some 120 countries. The solution also provides a wide range of administrative and reporting functions. The business can extract marketing data and keep a profile of its customers.

They can also check statistics, such as visits to the site, and monitor the performance of their network, Web servers and Internet links. The Web site is based on the architectural principles of scalability, security, platform independence and open Internet standards. This was a leading-edge project and the first site outside the U.S. to use Open Market payment middleware. The solution was developed and implemented within just four months, which enabled the client to launch their product globally within a very short timeframe.