By: Steve Outing
Most people who regularly use the World Wide Web will tell you that, in their experience, at certain times of day it takes longer to pull up a page from a remote server than at others. A common belief is that the Internet gets overloaded at certain times of day, and that bottlenecks on the ’Net slow things down.
A new study by OnTime Delivery, a page delivery audit service for Web publishers, debunks that myth. OnTime Delivery monitors Web pages for paying clients and reports back on how well the client’s server delivers pages to the desktop. The system periodically accesses clients’ URLs, records how long it takes to pull down each entire page, then compares the results to those of all the other servers it tracks.
An analysis of its clients’ URLs during a beta test from May to September 1995 found no significant day-to-day or hour-to-hour variation in page delivery times. “Our assumption that there are ‘best times’ to cruise the Web didn’t stand up to actual measurement,” the OnTime Delivery report states. “Hourly and daily load statistics are essentially useless.”
Individual Web pages can show wide variations in delivery time from minute to minute, according to the report. It is local conditions on your Web server and its Internet connection that make the difference, not the overall traffic on the Internet backbones. “The important point is, you should focus your efforts on short term (minute to minute) load rather than watching the hourly and daily load,” the report says.
Some other interesting factoids from the OnTime Delivery report:
* When browsing Web pages, a 14.4 modem on average took 40% more time to retrieve a page than a 28.8 modem — rather than 100% more time, as you might expect.
* Most pages take 10 seconds or less to load with the graphics-loading option turned off on your Web browser.
* Most Web authors choose to create pages that load in under 30 seconds.
* The ideal page size, according to OnTime Delivery, is 10K for small pages and a maximum of 70K for large pages.
Beware Alta Vista!
The hottest Web search engine yet appeared on the scene a few weeks ago. Digital’s Alta Vista service does an incredible job of indexing the massive contents of the world’s Web sites, cataloging far more pages than any other search engine out there (16 million pages and 8 billion words). Run a typical search containing two or three words, and you’re likely to get back thousands of documents matching your request. Fine-tune your search, and if the information you’re looking for is somewhere on the Web, Alta Vista will probably find it when other search engines fail. (There’s currently no charge for using the search service.)
Alta Vista’s comprehensiveness has brought out a problem that Web publishers need to be aware of: Some Web site operators make a practice of putting documents in their public directories — particularly test files — that are not meant for public consumption. You might think that no one could view those files, because none of your publicly published documents contain links to them. But guess what? They very well could show up in an Alta Vista search, just as if they were public.
The simple solution: Don’t put anything confidential in an unsecured area of your Web server. Search engines as powerful as Alta Vista now exist, and their Web robots (sometimes known as spiders) scour deep into the content of every publicly accessible Web site, indexing pages as they go. A slightly more involved solution is to exclude robots from certain parts of your Web server by creating a file on the server that specifies an access policy for robots. To learn how, see Martijn Koster’s page on Web robot etiquette.
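Under Koster’s robots-exclusion convention, that access-policy file is a plain-text file named robots.txt placed at the top level of the server; well-behaved robots fetch it before indexing a site. A minimal sketch follows — the directory names are hypothetical examples, not taken from any actual site:

```
# /robots.txt -- fetched by well-behaved robots before they index this site
# "User-agent: *" means the rules below apply to all robots.
# The directory names are hypothetical examples.
User-agent: *
Disallow: /test/
Disallow: /drafts/
```

Note that this is etiquette, not security: the file only asks cooperating robots to stay out. It does nothing to stop a person (or an ill-behaved robot) from fetching those URLs directly.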
A similar “security trap” that some unwary Web site operators have fallen into is putting information in the “Comments” field of a Web page’s HTML code that they don’t expect outsiders to see. Reading the code behind a Web page is as simple as selecting “View source” from within Netscape, for instance. It may seem obvious, but some Web publishers treat Comment codes within Web documents as secure. They’re not.
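For instance, an HTML comment like the one below never appears in the browser window, but it travels with the page, so anyone who views the source reads it verbatim (the page and the note are made-up examples):

```
<HTML>
<BODY>
<!-- NOTE TO STAFF: pull this page before launch.
     (A hypothetical comment -- invisible on screen,
     plainly visible via "View source.") -->
<P>Welcome to our home page.</P>
</BODY>
</HTML>
```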
Best Online Newspaper Services Competition
Please don’t forget to nominate your own company or another for Editor & Publisher/The Kelsey Group’s 1996 Best Online Newspaper Services Competition. The nomination form is on the Web at http://www.mediainfo.com/contest.form.html. Deadline for nominations is January 24, 1996. Winners will be announced at the Interactive Newspapers conference in San Francisco on February 24, 1996.
Presented 5 days a week by Steve Outing, Planetary News LLC.
Made possible by Editor & Publisher magazine.
This column is written by Steve Outing and underwritten by Editor & Publisher magazine. Tips, letters and feedback can be sent to Steve at firstname.lastname@example.org