

Description of OBFUSCAT

Download count: 13 this month, 3853 altogether.

Downloads for OBFUSCAT:
VMARC archive: v-8K

The essential files in this package are:

  • OBFUSCAT EXEC -- a REXX program for use with VM SFS-based web pages.
  • OBFUSCAT.CMD -- an OS/2 version, included for the same reason that men climb mountains.
  • OBFUSCAT DESCRIPT -- this file: a description, explanation, guide, and all-around useful thing to have.
The author of all the programs and documentation is Tim Greer.

The OBFUSCAT package helps you create pages on the World Wide Web which are publicly accessible but which include portions that cannot be reached directly. A visitor to the web site can thus be restricted to viewing the pages only in certain orders. In particular, the method eliminates the potential for viewing a subpage in unknown context, e.g. via a link from another web site or via a hotlist entry.

OBFUSCAT EXEC is a REXX program intended for use with a web page residing on VM in multiple Shared File System directories. One directory contains files which are to be publicly available but to which you do not want surfers to have direct access. Surfers may view pages and images in this directory, but only by following links from pages in another of your directories.

Run OBFUSCAT EXEC frequently, for example every night. It scans the no-outside-links directory and renames each file to some random set of characters, chosen to avoid duplication. The program then scans each file in both directories, finds hypertext links and references to the former name of each file, and updates all such links and references to point to the new file name. The result is that links from other sites and hotlist entries keep working if they point to files in the second directory, but the only working links to files in the first directory are those from within the directories updated by the program.
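The nightly pass described above can be sketched in more modern terms. The following Python sketch is mine, not the author's REXX: it uses ordinary directories in place of VM SFS, and a naive substring replace in place of real HTML scanning, so all function names and conventions here are illustrative assumptions.

```python
import os
import random
import string

def random_name(taken, length=8):
    """Pick a random file name not already in use, mirroring the EXEC's
    rule that random names are chosen to avoid duplication."""
    while True:
        name = "".join(random.choices(string.ascii_lowercase, k=length)) + ".html"
        if name not in taken:
            return name

def obfuscate(target_dir, other_dir):
    """Rename every file in target_dir, then repoint references to the
    old names in the files of both directories.
    Returns the old-name -> new-name table."""
    renames = {}
    taken = set(os.listdir(target_dir))
    # Pass 1: rename each file in the no-outside-links directory.
    for old in sorted(os.listdir(target_dir)):
        new = random_name(taken)
        taken.add(new)
        os.rename(os.path.join(target_dir, old),
                  os.path.join(target_dir, new))
        renames[old] = new
    # Pass 2: update links and references in files of both directories.
    for directory in (target_dir, other_dir):
        for fname in os.listdir(directory):
            path = os.path.join(directory, fname)
            with open(path, encoding="utf-8") as f:
                text = f.read()
            for old, new in renames.items():
                # Naive: a real pass would match only inside HREF/SRC
                # attributes, not every occurrence of the string.
                text = text.replace(old, new)
            with open(path, "w", encoding="utf-8") as f:
                f.write(text)
    return renames
```

After a run, a link such as `<a href="inner.html">` in either directory points at whatever random name `inner.html` was given, so only links rewritten by the pass continue to work.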

Why would anyone want such a program? You might want to...

  • Keep your entire site transparently public, yet ensure visitors see a warning/disclaimer page.
  • Force surfers to a first page in order to gather statistics.
  • Enable presentation of an unaltered HTML file while ensuring that it is presented within a specific temporal context. (An example would be one religious group presenting a rebuttal of another's web-based propaganda.)
  • Ensure context of pages in a multi-page web document. An example of this need comes from keyword searching with web search engines: I have frequently gotten hits on what are obviously interior pages, and often the author has failed even to give me a link back to the beginning! (Of course, this EXEC does not prevent the search engines from pointing to these interior pages. What would happen is that when you tried to go to a page on which you got a hit, the link would no longer work, because the search engine builds its tables in advance and the file name will have changed in the meantime. What we need to fix this is an extra HTML tag to tell search engines not to point hits to *this* page, but rather to page xxx, where presumably xxx would guide us to this page. This would work like the existing tag with which you can tell search engines what text to display in describing your page: <META name="description" content="Pack my box with five dozen liquor jugs.">.)
  • Block unauthorized/unexpected links.
    • Interior links to a complex web or a complex subsection.
    • <IMG SRC=...> links to images without the accompanying text.
  • Put advertising on one page and content on another while ensuring that the advertisements still get seen. (This would nicely open the way to effective advertising even for surfers who have image loading turned off.)
  • Force users of an online catalog to go to a "specials" page before accessing the rest of the catalog.


Put your web files into two SFS directories, one for which external links are allowed and one for which external links are undesirable. Then you need to modify two lines in OBFUSCAT EXEC to point to your directories. Here is an excerpt from the program, showing the lines you must change:

   /* Randomize files in this directory */                                      
  TARGETDIR = "SERVK1:TGREER1.WEBTEST.OBFUSCUS"                                 
   /* Update anchors in this directory */                                       
  OTHERDIR  = "SERVK1:TGREER1.WEBTEST"                                          
The variable TARGETDIR holds the name of the directory for which external links to files therein are not desired. That is, TARGETDIR is the no-outside-links directory. OTHERDIR holds the name of the directory where external links are ok.

Of course, you will probably want to modify the program more than that. For example, you may want to scan more than just two directories, or search for other types of HTML tags. You may even want to work on other platforms, which is why an OS/2 version, OBFUSCAT.CMD, is included.
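As a hint toward handling other tag types: the references the program must update live in attributes such as HREF and SRC. Here is a small illustrative pattern of my own (not taken from the EXEC) that pulls such targets out of a page; extending the alternation is how additional tags would be covered.

```python
import re

# Matches the quoted target of href=... and src=... attributes,
# case-insensitively. Illustrative only -- the EXEC's actual scanning
# logic is not reproduced here.
LINK_RE = re.compile(r'(?:href|src)\s*=\s*"([^"]*)"', re.IGNORECASE)

page = '<a HREF="intro.html">start here</a> <IMG SRC="logo.gif">'
targets = LINK_RE.findall(page)   # ['intro.html', 'logo.gif']
```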

Both OBFUSCAT EXEC and OBFUSCAT.CMD have undergone some minimal testing, but only in the author's environment. I encourage you to test carefully in your own environment, even if you make no modifications other than those required for TARGETDIR and OTHERDIR. Feel free to e-mail the author with suggestions and bug reports; I'd like to know you are using my program. However, since the program works for me and pretty much does what I need, it is fairly unlikely that I'll be coming out with new, improved versions.

Finally, a wee little warning. This program does rename files in the TARGETDIR directory. That makes those files inconvenient to modify (because you can't find the $%&@#+! file!), and makes the HTML links to them from other files unrecognizable. There are ways to handle this, e.g. keep "originals" in other directories, or modify OBFUSCAT EXEC to keep the original names somewhere. But do plan ahead.
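One way to follow the "keep the original names somewhere" suggestion, sketched here in Python as my own addition (the file name and JSON format are arbitrary choices, not part of the package): maintain a small table mapping each file's true original name to its current random name, chaining through repeated runs so you can always find the $%&@#+! file.

```python
import json
import os

def update_name_map(renames, path):
    """Record original -> current file names across runs.

    renames maps name-before-this-run -> name-after-this-run, i.e. the
    table produced by one obfuscation pass.
    """
    mapping = {}
    if os.path.exists(path):
        with open(path) as f:
            mapping = json.load(f)            # original -> current
    # Invert so a file renamed again can be chained back to its original.
    current_to_orig = {cur: orig for orig, cur in mapping.items()}
    for old, new in renames.items():
        orig = current_to_orig.get(old, old)  # first run: old IS the original
        mapping[orig] = new
    with open(path, "w") as f:
        json.dump(mapping, f, indent=2)
    return mapping
```

Calling this after every pass leaves a file you can consult to edit a page by its original name, no matter how many times it has been renamed since.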