Many years ago when I got my first palmtop, the HP 95LX, I subscribed to The HP Palmtop Paper. The majority of the articles would make reference to the HPHAND forum on CompuServe. I would log onto HPHAND every Saturday and print the forum messages for the past week.
I did not want to be tied to a computer screen and a desk to read the messages, so I printed them on paper. I thought there had to be a better way than wasting paper and toner, but there were no other options yet.
After a few months I noticed a program called acCIS, which permitted offline reading of the forum messages on the 95LX. It was shareware, so I gave it a try. The only problem was the lack of mass storage on my HP 95LX; I had only a small 256K SRAM card, so I used my battery-operated Drive95 portable floppy drive. It was not an elegant solution, so I stayed with paper.
A little while later acCIS became a commercial product with more robustness, and I purchased a 1-meg SRAM memory card. It was perfect timing and I began my offline reading of the CompuServe HPHAND forum. I added the AP Newswire Online and TIME magazine to my list via the scripting available with acCIS. The scripting was cumbersome, as the format on CompuServe kept changing, and I missed the depth that the newspapers provided.
The only thing missing was reading The Wall Street Journal and The New York Times on my palmtop. I felt eliminating the daily pile of newspapers and the newsprint ink from my fingers would make sense, but until recently there were no options.
WWW/LX
With WWW/LX, HV, and GET.COM I was finally able to read The Wall Street Journal on my palmtop. In fact, I can read many different publications, such as Business Week, The New York Times, and others. When I wake up in the morning, The Wall Street Journal and The New York Times are waiting for me. I created an alarm which runs a batch file (NEWS.BAT) before I wake up. The night before, I plug my palmtop into the modem and turn the modem on. The batch file, using GET.COM and WWW/LX, goes online and downloads the Web pages to my flash disk. The front pages of each section have the same Web address each day, which makes this process easy.
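Since the front pages have fixed addresses, NEWS.BAT can be little more than a single GET.COM call against a hand-made list of those addresses. A sketch of the idea (the file name NEWS.DAT and the exact list format are my assumptions here; check the D&A setup instructions for the format GET.COM actually expects):

NEWS.BAT
www "!get a:\hv\news.dat"

where NEWS.DAT lists the front-page addresses, one per line, such as http://www.wsj.com and http://www.nytimes.com.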
The NEWS.BAT file downloads the front pages of the different sections of The Wall Street Journal. Each article is summarized in a few sentences, with a link to the full article as a separate Web page. While I have my morning coffee I use HV to read the downloaded pages. By reading or scanning the summaries on each front page I get an overview of the day's news, and I mark the articles where I want the full story and details. I download those articles once I have finished the summaries. The New York Times works similarly: some sections contain summaries with the link to the full article, and some contain only the headline or title with the link to the full page.
When I am done reading the front pages I go back online using GETART.BAT to download the new articles (Web pages) I wish to read. I then use HV to read these new articles. By automating the download process with the batch files and GET.COM, you save a tremendous amount of time compared to reading the summaries, then selecting and reading the articles, all while online.
The Wall Street Journal is a paid subscription Web site. The cost is $49.00 per year. If you already subscribe to the paper version you get a discount on the Web version. The New York Times is free, but requires you to register. Business Week does not require any registration. I have noticed that many Web sites do not require any registration or fee.
The best way to register is to log on with a full-featured desktop Web Browser of your choice, as HV does not handle the registration pages. For The Wall Street Journal, point your browser to www.wsj.com. For The New York Times, point your browser to www.nytimes.com.
How it all works
Maybe I should explain how the different pieces work together. There are three programs. WWW/LX is a communication platform which dials your Internet provider and provides the communication backbone; it is basically transparent to the user. HV is a viewer which lets you read Web pages both online (using WWW/LX) and offline. GET.COM uses WWW/LX to retrieve the Web pages you have selected with HV and stores them on your palmtop, where you view them offline with HV. GET.COM also does one very important thing: it fixes the Web links on each page. Links are usually relative to the current page, which works fine while you are online. But once the file is saved to the A drive (as when you are reading a downloaded article), a relative link would point to the A drive rather than to the actual full Web address. GET.COM converts the relative links to fixed Web links as it saves the files during the download process. To read a Web page offline and mark links for further downloading, the page must therefore be retrieved with GET.COM. If you instead save a page while online by pressing (F10), the links will not be updated and will point to the A drive instead of a real Web address.
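To illustrate with a simplified, made-up example: a front page retrieved from www.wsj.com might contain a relative link such as

<A HREF="article1.htm">Full story</A>

Read offline from A:\HV\WSJ, that link would resolve to a nonexistent file on the A drive. GET.COM rewrites it during the download to the full address:

<A HREF="http://www.wsj.com/article1.htm">Full story</A>

so that marking the link with (Fn) (Copy) records a real Web address for the next download. (The file name article1.htm is invented for this example.)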
The directory structure I use keeps things in order and helps me find the different pages on my flash disk. I keep WWW/LX and GET.COM in A:\WWW. I keep HV, its related programs, and the GET.DAT file in A:\HV. The Wall Street Journal articles go in A:\HV\WSJ and The New York Times articles in A:\HV\NYT. At the end of the day I delete all the articles I have read, to make room for the next day, and move the unread articles to another directory, using the date as part of the name. The unread articles from June 15 would be in A:\HV\WSJ0615. If after a few days I still have not read them, I delete them. Just as with the print version, you have to admit defeat and toss the old papers for recycling.
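The end-of-day cleanup can itself be a small batch file. A sketch (the name CLEAN.BAT is my own; you must edit the dated directory name by hand each day, and I delete the read articles first so only the unread ones are moved; I use COPY followed by DEL because the palmtop's DOS may not include a MOVE command):

CLEAN.BAT
md a:\hv\wsj0615
copy a:\hv\wsj\*.* a:\hv\wsj0615
del a:\hv\wsj\*.*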
The Wall Street Journal and The New York Times both require you to enter a username and a password. GET.COM works with authentication, using a file called GET.AUT to store the required information. I use the following format in the GET.AUT with xxxx as the username and yyyy as the password. You can create this file with Memo.
GET.AUT
HTTP://WWW.NYTIMES.COM
xxxx:yyyy
HTTP://WWW.WSJ.COM
xxxx:yyyy
There is one very important point regarding the *.DAT files. The file names and Web addresses in them cannot contain spaces, but Memo inserts spaces when it wraps these long, space-free lines at the right margin, and GET.COM will hang on the extra space. Why Memo does this I don't know. I set Memo to 120 characters per line before loading a *.DAT file; Memo works fine as long as the right margin is wide enough that no line wraps. Otherwise I use EDIT.COM from my desktop to edit the *.DAT files. I prefer the EDIT.COM from Win95 because it permits multiple files to be open at the same time, so I can cut and paste among them easily, but the EDIT.COM from DOS 5 or 6 works too.
I read the front pages of the different sections and select the articles I would like to read by pointing to the reference, pressing (Fn) (Copy), and giving it a file name. The pages have summaries of the articles; by just reading the front pages of The Wall Street Journal you get a good feel for the news of the day. I set my cache directory to a:\hv\wsj so that path pops up automatically when a file name is requested. It is important to make the file name descriptive so that you read the important articles first.
A&E, CNN, and Business Week
You will note that I also have a reference for the A&E schedule for the day and for a computer news Web site. You could add the CNN news site (www.cnn.com) and others to suit your news preferences. Once a week I download the Business Week table of contents from www.businessweek.com/contents.htm and select the articles I want to read.
To retrieve the articles I have selected I use the following batch file. It uses the GET.DAT file which was created by the (Fn) (Copy) selection. After I have the articles downloaded I delete the GET.DAT file, since I no longer need it and do not need to download the articles a second time. GETART stands for GET ARTicle.
GETART.BAT
www "!get a:\hv\get.dat"
Be selective when choosing the articles you download, consider how much time you have, and choose accordingly. If you do not get to the article, I suggest you delete it, or scan it briefly if the topic is important for later reading. It's no different than with the print version, when you finally admit you will not get to that stack and you bundle it for recycling.
I average about 200K to 400K of disk space for the articles, etc., for one day. I usually delete the front pages before I retrieve the articles, to save disk space.
The most recent versions of HV and GET.COM are at the D&A home page (www.dasoft.com) with instructions for their setup and use.
I have used this combination to download the concert schedule from a radio station Web site, tax forms and instructions, and other business information I need, all from my palmtop. HV does permit you to view graphics inside a document, but I generally leave graphics turned off. The process is much faster without them, though even with graphics turned on it is much faster than logging on with my desktop browser.
I now get to read my paper first thing in the morning, and I do not have to wait for the mail delivery. Although reading the newspaper on the palmtop was difficult at first, I quickly adapted. With the print version I scan the articles, but at first on the palmtop I did not scan at all; I would read every word of each article I downloaded. Now that I have learned to scan on the palmtop screen, I get through the news much faster and still read the full text when I need to. If an article is not interesting or informative I no longer feel compelled to read every word. The Wall Street Journal pages also contain references to other sites, such as the full text of governmental reports, if you need them. I find The Wall Street Journal to be the best site, and I probably get most of my news that way. Although the articles come from the print version, the Web site adds many "Web specific" features.