 
 Computer underground Digest    Sun  June 30, 1994   Volume 6 : Issue 60
 ISSN  1004-042X
 
 Editors: Jim Thomas and Gordon Meyer ([email protected])
 Archivist: Brendan Kehoe
 Retiring Shadow Archivist: Stanton McCandlish
 Shadow-Archivists: Dan Carosone / Paul Southworth
 Ralph Sims / Jyrki Kuoppala
 Ian Dickinson
 Coptic Idolator:       Ephram Shrewdlieu
 
 CONTENTS, #6.60 (Sun, June 30, 1994)
 
 File 1--Open Letter to Veep Al Gore in re New Computer Standard
 File 2--PDC'94 CFP-Artifacts session (revised)
 File 3--ACM Releases Crypto Study
 
 Cu-Digest is a weekly electronic journal/newsletter. Subscriptions are
 available at no cost electronically.
 
 CuD is available as a Usenet newsgroup: comp.society.cu-digest
 
 Or, to subscribe, send a one-line message:  SUB CUDIGEST  your name
 Send it to [email protected] or [email protected]
 The editors may be contacted by voice (815-753-0303), fax (815-753-6302)
 or U.S. mail at:  Jim Thomas, Department of Sociology, NIU, DeKalb, IL
 60115, USA.
 
 Issues of CuD can also be found in the Usenet comp.society.cu-digest
 news group; on CompuServe in DL0 and DL4 of the IBMBBS SIG, DL1 of
 LAWSIG, and DL1 of TELECOM; on GEnie in the PF*NPC RT
 libraries and in the VIRUS/SECURITY library; from America Online in
 the PC Telecom forum under "computing newsletters;"
 On Delphi in the General Discussion database of the Internet SIG;
on RIPCO BBS (312) 528-5020 (and via Ripco on the Internet);
 and on Rune Stone BBS (IIRGWHQ) (203) 832-8441.
 CuD is also available via Fidonet File Request from
 1:11/70; unlisted nodes and points welcome.
 
 EUROPE:   from the ComNet in LUXEMBOURG BBS (++352) 466893;
 In ITALY: Bits against the Empire BBS: +39-461-980493
 
 UNITED STATES:  etext.archive.umich.edu (141.211.164.18)  in /pub/CuD/
 ftp.eff.org (192.88.144.4) in /pub/Publications/CuD
 aql.gatech.edu (128.61.10.53) in /pub/eff/cud/
 world.std.com in /src/wuarchive/doc/EFF/Publications/CuD/
 uceng.uc.edu in /pub/wuarchive/doc/EFF/Publications/CuD/
 wuarchive.wustl.edu in /doc/EFF/Publications/CuD/
 EUROPE:         nic.funet.fi in pub/doc/cud/ (Finland)
 ftp.warwick.ac.uk in pub/cud/ (United Kingdom)
 
 JAPAN:          ftp.glocom.ac.jp /mirror/ftp.eff.org/
 
 COMPUTER UNDERGROUND DIGEST is an open forum dedicated to sharing
 information among computerists and to the presentation and debate of
 diverse views.  CuD material may  be reprinted for non-profit as long
 as the source is cited. Authors hold a presumptive copyright, and
 they should be contacted for reprint permission.  It is assumed that
 non-personal mail to the moderators may be reprinted unless otherwise
 specified.  Readers are encouraged to submit reasoned articles
 relating to computer culture and communication.  Articles are
 preferred to short responses.  Please avoid quoting previous posts
 unless absolutely necessary.
 
 DISCLAIMER: The views represented herein do not necessarily represent
 the views of the moderators. Digest contributors assume all
 responsibility for ensuring that articles submitted do not
 violate copyright protections.
 
 ----------------------------------------------------------------------
 
 Date: Thu, 23 Jun 1994 17:12:16 -0500 (CDT)
 From: Wade Riddick <[email protected]>
 Subject: File 1--Open Letter to Veep Al Gore in re New Computer Standard
 
 An Open Letter To Al Gore,
 Vice President of the United States of America
 
 A New Computer Standard: Fixing the Flats on the Information Highway
 
 The U.S. must manage the early adoption of industrywide
 standards that render emerging technologies compatible with
 each other and speed commercial acceptance.  Such standards
 make it easier for purchasers to experiment with equipment
 embodying new technology and reduce the risk of committing to
 a technology that quickly becomes obsolete . . .
 In the U.S., technological standards are set with little
 regard to such issues. Large companies or government agencies
 set de facto standards...  Unfortunately, none of these
 sources of standards has explicit responsibility for managing
 the standards process to best promote a new technology.
 
- Robert Reich [1]
 
 One important roadblock often missed by policymakers as they
 work to lay the foundations of the information super-highway is the
 incompatibility that exists among the operating systems and microchips
 that will form the highway's roadbed.  When the Clinton Administration
 opened the telecommunications industry to competition, its goal was
 not to limit consumer choice, but rather to broaden choice by
 weakening narrow, monopolistic controls over technology and allowing
 small private companies to move technology in many different
 directions.
 None of this will be possible without a common standard to allow
 these diverse innovations to interact.  Just as the national economy
 needs a common currency and a common language in which to conduct
 business, so too does the information superhighway need a standard
 through which its components can interact.  Since the development of
 the U.S. Department of Defense's Advanced Research Projects Agency
 Network (ARPANET) in the 1960s, the federal government has done an
 admirable job establishing network protocols, which are rules needed
 for seamless long-distance data transmission between computers.
 Without such standards, today's international computer network, known
 as the Internet, would not exist.
 The U.S. government, however, has not done a good job of
 standardizing the basic commands needed to operate computers-the
 languages, compilers, operating systems and other instructions
 governing the microprocessor (the central processing unit, or CPU,
 that is a computer's "brain").  These forms of programming
 instructions are the most valuable types of electronic data because
 they tell computers how to handle information.  If an application
(program) can be transmitted between two different computers but
 cannot run on both machines-the current norm in the industry-the
 application's value is limited.
 Companies like Apple, IBM, Microsoft, Intel and Novell have
 little incentive to create truly open or common standards for
 operating systems or microchip instructions because each company in
 one way or another competes successfully on the basis of differences
 in its products.  Proprietary standards (where all rights to the
 standard are retained by one firm) are one way these companies can
 protect their research and development (R&D) costs from reengineering
by competing firms. [2]
 
 The Problem
 
 Just as the mercantilist nations of the last century forced
 their currency on their colonies and used tariff barriers to
 discourage trade with other powers, computer makers in the twentieth
 century have set standards governing the internal commerce of their
products to the detriment of the competition. [3]  In the same way that
 19th-century Britain bucked the mercantilist trend, maintained a free
 trading regime, and lost ground to "freeloading" traders as a result,
 IBM defined an open PC standard and bore the costs of maintaining it
 while clone makers got a free ride.  With no need for heavy R&D
 expenses, these companies could undercut IBM prices by a significant
 margin.
 In the past, proprietary standards have acted as unfair exchange
 standards, making it unnecessarily expensive for consumers to move
 their investments in data-and particularly software-around from one
platform (operating system) to another.  This deters investment, just
as the asset-trapping nature of a command economy or a non-convertible
currency deterred foreign investment in Eastern Europe for many years.
 Consumers have started demanding more compatibility between
 systems, but companies have been slow to react.  As _The Economist_
 put it,  "every firm wants a monopoly-and every firm wants to call
 it an open standard."4  Recently,  corporations have begun
 establishing interfirm alliances to allow their systems to support
 "multiple personalities" (multiple operating systems).  Future IBM
 computers will be able to run Mac software, while Apple's new Power PC
 will run Windows and OS/2, thanks to the use of translation and
emulation software. [5]
 John Sculley-the ex-CEO of Apple-points out in _Defying Gravity_
 that computer designs can no longer be based just on the engineers'
 experience of using the system.  No one company has the business
 expertise to design an entire system in a world where more diverse
 products have to be brought to market faster than ever.  That speed
 requires higher levels of coordination, cooperation and
 standardization between companies.  The current proliferation of
 cross-licensing agreements falls short of a universal standard.  The
 incentive to sell incompatible platforms is still there; companies
 have just decided to rely on translation software that they make,
 called microkernels, instead of full-blown operating systems for their
 profits.  They have failed to break up the operating system into
 individual components that can be built by different companies
 according to comparative (instead of historical) advantage.
 Someday, as happened with railroads and automobiles, a standard
 for interchangeable software parts will emerge, either through
 government intervention or the natural evolution of a monopoly out of
the market. [6]  This monopoly will, however, require government
 regulation at some point to prevent abuse, as was necessary with the
 railroad and telephone empires.
 It is often forgotten why, how, and at what cost the national
 railroads were unified.  According to John Browning, "like
 railroads, new information networks are increasingly being built in
 large, monolithic chunks, starting from the long distance links and
working down to the local one." [7]  Long distance links were the last
 part of the national rail system to be built, because it took an
 immense effort to integrate incompatible regional networks-
particularly in the South where there were only spur lines. [8]  In fact,
railroads, highways and even computers [9] to a certain extent have been
 built up regionally with government stimulus and later coordinated
 through national structures.  Regional and local monopolies had to be
 granted so that proposed standards would be self-enforcing, since
 where there is incentive to compete, there is incentive to deviate
 from the standard and affect the distribution of market share.
 Railroads were easy to standardize because the tracks were
 originally built with iron rails that wore out quickly.  Tracks had to
 be rebuilt often, so it was not difficult-given adequate financial
incentive-to rebuild the gauges to a particular width. [10]  The advent
 of steel, because of its durability, might actually have threatened
 this standardization.  Fortunately, just as steel was replacing iron
 in the 1870s and '80s,  local railroad companies came together in
 regional alliances to standardize gauges and policies for
 transcontinental shipping, ending decades of chaos in the industry.
 These alliances greatly reduced costs to the consumer and spurred
 investment in new railroad technology.
 Some railroad companies concerned with standardization feared
 the emergence of a monopoly and tried to preserve their independence
 by confederating.  They borrowed from the American federalist model of
 government to create their own tripartite government with a
 legislative assembly, executive branch, and judiciary for settling
 disputes.  This structure balanced  competing regional interests
against one another and produced an efficient, egalitarian,
state-of-the-art continental transportation system. [11]  Since the governing
 convention created by these small cartels did not include all rail
 companies, nor address all of the public interest, it collapsed when
 Jay Gould and others began forming large conglomerates.  New,
 antidemocratic giants emerged, which Congress then stepped in to
 regulate.
 Either through market evolution or government intervention, such
 a standardization of CPUs and operating systems is inevitable.
 According to _The Economist_, the computer industry is rapidly
 becoming "a commodity business"12 with all the accompanying industry-
 wide conventions.  This is occurring in an industry producing goods
 with the highest intellectual property content in history (hardly
 characteristic of most commodities).
 It is possible for government to move in now, avoid further
 costs of incompatibility and establish a forward-looking, flexible
 standard that will preclude the development of a monopoly and will
 reshape the way value is created in the software industry.  In the
 process,  the hyper-competitive aspects of the computer industry that
 have served society so well could be preserved.  As  the National
 Performance Review prescribes, government can set clear goals and act
 as a catalyst while allowing private actors to move the ball down the
 field.
 Because of the peculiar nature of information, such a standard
 need not be autocratic, nor would setting one be risky.  The Japanese
 and European efforts to set High-Definition Television (HDTV)
 standards flopped because they locked industry into analog hardware
 before superior digital technology was ready.  Immature technologies
 have never been successfully pushed on society.  The software industry
 has almost the opposite problem-not so much inventing the wheel or
 prematurely setting it in stone as constantly having to reinvent it
(in order to operate applications under different systems). [13]
A computer's instructions are vastly different from the regular
 objects that come to mind when standards are discussed.  The
 instructions CPUs use are virtual; they are not materially dependent
 on any particular piece of hardware.  As symbols, they can always grow
 and be reinterpreted, unlike manufactured products such as metal pipe,
 whose dimensions cannot be changed once cast.  Corporate planners,
 long resistant to the adoption of a standardizing framework, are
 beginning to see the adaptability of computer code as an advantage
 upon which a new standard could be based.  As the senior technical
 editor of *BYTE* put it, "the battle is no longer about whether to
 layer object-oriented services and emulation systems . . . on a small
 kernel . . . nor whether to build an operating system in this style
but how to do the job right." [14]  The remaining problem is one of
 coordination between corporations in getting these new systems to work
 together.
 
 The Solution
 
 The essential features of such a system are easily described.
 The system could be called DNA, after its biological counterpart which
 binds all organic matter into the same competitive framework.  While
object orientation [15]-the way in which commonly used types of data are
 paired with the instructions needed to manipulate that data-makes data
 transportable and software highly extensible *within* a platform, DNA
 would make that operating system and processor object oriented so that
 both data *and* software would be transportable across platforms.  In
 other words, when a processor receives a standard DNA message telling
 it to do something like add two numbers or draw a line, it will have a
 library available to translate the instruction into the host language
 of that particular processor.
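
To make this concrete, here is a minimal sketch in Java (chosen
purely for illustration; DNA is the hypothetical standard proposed
here, and every name below, such as DnaLibrary and Ppc601Library, is
invented).  The standard declares each instruction once; each CPU
maker supplies the translation library for its own hardware:

    // Hypothetical sketch of the DNA idea.  All names are invented.
    interface DnaLibrary {                    // vendor translation library
        int add(int a, int b);                // standard instruction: add
        void drawLine(int x1, int y1, int x2, int y2);  // standard: draw a line
    }

    // Each CPU maker implements the same interface in its host language.
    class Ppc601Library implements DnaLibrary {
        public int add(int a, int b) { return a + b; }  // native 601 code in reality
        public void drawLine(int x1, int y1, int x2, int y2) {
            System.out.println("601 line " + x1 + "," + y1 + " -> " + x2 + "," + y2);
        }
    }

    public class DnaDemo {
        public static void main(String[] args) {
            DnaLibrary lib = new Ppc601Library();  // chosen at load time per machine
            System.out.println(lib.add(2, 3));     // same program runs on any host
            lib.drawLine(0, 0, 10, 10);
        }
    }

An application written against DnaLibrary never learns which chip it
runs on; porting it means supplying a new library, not rewriting the
application.
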
 Under this system, it would be up to the CPU's manufacturer to
 supply the most basic translation libraries, but other firms could
 supply add-ons or extensions for functions too complex for the CPU to
 execute.  This way, market competition could be used to set standards
 for new forms of data, instead of having the government mandate
 standards for immature technologies.  A company marketing a product
 which uses a completely novel form of data-say a device for producing
certain odors [16]-would have an opportunity to create its own standard
 for data by marketing a new extension for the DNA system.  A
 competitor might also market a similar plug-in, and both companies
 could compete to gain supporters for their mini-standard.  In the end,
 the best solution would likely win out.  Companies would not have to
 worry about maintaining compatibility with an existing base because no
 previous software could produce odors.
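
A sketch of how such competing mini-standards might coexist, again in
Java with invented names (OdorExtension and both vendor classes are
hypothetical): each firm registers its plug-in at the same extension
point, and buyers decide which one survives.

    import java.util.HashMap;
    import java.util.Map;

    // Hypothetical extension point for a novel data type.
    interface OdorExtension { void emit(String scent); }

    class AcmeOdor implements OdorExtension {      // one vendor's plug-in
        public void emit(String s) { System.out.println("Acme emits " + s); }
    }

    class RivalOdor implements OdorExtension {     // a competitor's plug-in
        public void emit(String s) { System.out.println("Rival emits " + s); }
    }

    public class ExtensionRegistry {
        private static final Map<String, OdorExtension> plugins = new HashMap<>();

        static void register(String name, OdorExtension e) { plugins.put(name, e); }

        public static void main(String[] args) {
            register("acme", new AcmeOdor());      // both mini-standards coexist...
            register("rival", new RivalOdor());
            plugins.get("acme").emit("rose");      // ...until the market picks one
        }
    }
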
 The uniform interface of DNA would allow individual firms to use
 their expertise to replace inefficient system components easily,
 thereby broadening the market for their products.  If DNA contained a
 standard driver for reading keyboard input, for example, and someone
 wanted to market a new voice recognition device that would be
 compatible with past software, that company could make a substitute
 for the keyboard interface that instead uses the firm's voice
 recognition hardware.  DNA would increase the marketability of the
 voice recognition device, because customers could buy the physical
 device without having to upgrade their entire software library.
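
The substitution works because old applications talk to an interface,
not to the keyboard itself.  A sketch in Java (all names are invented
for illustration):

    import java.util.Scanner;

    // Hypothetical uniform input interface.
    interface InputDriver { String readLine(); }

    class KeyboardDriver implements InputDriver {  // the standard driver
        private final Scanner in = new Scanner(System.in);
        public String readLine() { return in.nextLine(); }
    }

    class VoiceDriver implements InputDriver {     // drop-in replacement
        public String readLine() { return recognize(); }
        private String recognize() { return "hello"; }  // stub for real hardware
    }

    public class OldApplication {
        public static void main(String[] args) {
            InputDriver in = new VoiceDriver();    // swapped in without a rewrite
            System.out.println("You said: " + in.readLine());
        }
    }
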
According to *The Economist*, "today all firms need a niche" [17]
 in the computer market-and universal standards can provide the
 necessary framework.  DNA would not pick winners, but would instead
 make it easier for  winners to emerge.  Systems would be built
 component by component on the basis of efficiency, rather than through
 political or alliance considerations.
 Much DNA code may have to be interpreted on each platform, but
 with a common object code standard each platform would be able to do
 this in the most efficient manner.  If this standard's basic design is
 flawed or technology passes it by (since technology moves faster than
 anyone's capacity to plan ahead), certain instructions could be
 reserved in advance to switch to a completely new, but as yet
 unspecified standard.
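
One way to picture that escape hatch is a toy interpreter with a band
of opcodes held back from day one (Java; every opcode value and name
here is invented, not part of any real instruction set):

    // Toy DNA-style interpreter with a reserved opcode band.
    public class DnaInterpreter {
        static final int PUSH = 0, ADD = 1, PRINT = 2;
        static final int RESERVED_BASE = 0xF0;  // held back for a future standard

        static void run(int[] code) {
            int[] stack = new int[16];
            int sp = 0;
            for (int pc = 0; pc < code.length; pc++) {
                int op = code[pc];
                if (op >= RESERVED_BASE)        // the switch to the next standard
                    throw new UnsupportedOperationException("reserved opcode " + op);
                switch (op) {
                    case PUSH:  stack[sp++] = code[++pc]; break;
                    case ADD:   stack[sp - 2] += stack[sp - 1]; sp--; break;
                    case PRINT: System.out.println(stack[--sp]); break;
                    default:    throw new IllegalStateException("bad opcode " + op);
                }
            }
        }

        public static void main(String[] args) {
            run(new int[] {PUSH, 2, PUSH, 3, ADD, PRINT});  // prints 5
        }
    }

Old object code never touches the reserved band, so it keeps running
even after the band is assigned to a successor standard.
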
 In the past, companies have objected to the slight performance
 degradation caused by interpretation.  The Macintosh has been
successful precisely because of the huge "toolbox" [18] of standard
 commands it makes available to applications.  Because programs "call"
 these functions in the system, instead of in the application itself,
 Apple has managed to reduce program size and smoothly maintain the
 system's evolutionary growth path.
Apple's new Power Macintosh is the first example of a "multiple
personality" PC capable of running under more than one operating
system.  The Power Macintosh is built on a new platform and
microprocessor, the PowerPC 601.  To run old software, which is
written for the 68000 family of microprocessors, the machine
interprets and translates that code for the 601.
 Reinterpreting the old 68000 instructions slows things down, but by
 rewriting the toolbox to run on the faster new 601, Apple makes up for
 that loss.  Users see no performance degradation with old software and
 see tremendous gains with new software.  Most of Apple's competitors
 are planning similar interpretation schemes for their new systems.
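
The trick is easy to sketch (Java; the opcodes and trap numbers are
invented, and this is only a cartoon of such an emulator, not Apple's
actual design): legacy instructions pass through a slow interpretation
loop, but calls into the toolbox jump straight to code rewritten for
the new chip.

    // Cartoon of "interpret the old code, run the toolbox native."
    public class LegacyEmulator {
        static final int OP_NOP = 0;   // stands in for ordinary 68000 work
        static final int OP_TRAP = 1;  // a call into the system toolbox

        static void nativeToolbox(int trap) {   // rewritten for the new CPU
            System.out.println("fast native toolbox routine " + trap);
        }

        static void run(int[] legacyCode) {
            for (int pc = 0; pc < legacyCode.length; pc++) {
                switch (legacyCode[pc]) {
                    case OP_NOP:                 // emulated, one instruction at a time
                        break;
                    case OP_TRAP:                // escape to full native speed
                        nativeToolbox(legacyCode[++pc]);
                        break;
                    default:
                        throw new IllegalStateException();
                }
            }
        }

        public static void main(String[] args) {
            run(new int[] {OP_NOP, OP_TRAP, 42});  // old binary, new machine
        }
    }
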
 Since an open standard requires some sort of monopolistic
 power, it is clear that if DNA is implemented, companies will no
 longer profit from the creation of monolithic operating systems.  The
 way value is created in the software and hardware industries would be
 radically altered under DNA, as shown in Figure 1, but who wants to
 make money reinventing the wheel?  Real money is made on the cutting
 edges of technology, and this technological advancement should
 continue to be driven by the free market.
 U.S. policymakers must think seriously now about how to keep
 American industries globally competitive for the next fifty years.  By
 2040, no software power will make money reinventing the wheel.  In a
 world where microprocessor architectures are proliferating instead of
 unifying and where technical progress is speeding up in all areas of
 science, a DNA-type standard is needed, if for no other reason than to
 coordinate the diffusion of technical expertise.  Only by making new
 technology generic, so that a user can plug it in and go, will the
 learning curve needed to use new technologies efficiently be
 conquered.
 Technology transfer needs to become more automatic.  Many
 writers, James Dearing among them, have thought of technology transfer
 as a "difference-reduction"19 problem-one of trying to get users and
 inventors to share the same knowledge about an invention so that the
 person in the field knows how to apply it as well as the inventor.  In
 fact, really useful technology gets put to uses never dreamed of by
 its inventors.  The problem is how to insulate the information needed
 to use new technology from the knowledge of how it works-which
 confuses most consumers.
 The historical trend in U.S. technological development is clear;
 either government or industry will eventually take steps to stop this
 continual rebuilding of operating systems from the ground up.  The
 real issue to be decided in the telecommunications debate is not over
 who owns the virtual asphalt or builds the on-ramps.  The question is
 who will own the resulting computer standard governing the packaging
 of information.  Any firm which wins control will have a power not
 unlike the government's ability to print money: the firm will control
 the currency of day-to-day electronic transactions.  This fact is
 becoming increasingly apparent and important to policymakers.
 According to Admiral Bobby Inman and Daniel Burton, "arcane topics
 like technical standards . . . that once were viewed as the
 responsibility of obscure bureaucrats will increasingly engage public
officials at the highest levels." [20]
 There is already a consensus in the industry as to what features
 computers will incorporate in the next decade.  It is also clear that
 some sort of standard for object code will emerge as well.
 Government, though, has several options for the role it can play in
 this process:  (1) the Commerce Department, perhaps with some
 authorizing legislation, could call industry heads together and order
 them to set a common object code standard; (2) Commerce could accept
 bids from various companies and groups for such a standard; or (3)
 finally, the federal government could itself craft a standard with the
 help of qualified but disinterested engineers, and then try to force
 it upon the industry through the use of government procurement rules,
 control over the flow of research and development money or other
economic levers.  The recent verdict in the patent fight between
Stac Electronics and Microsoft over data-compression features in
Microsoft's operating system indicates that some reform of the
intellectual property laws may be needed as well.
 Given the acrimony in the current debate over the definition of
 a much-needed encryption (data security) standard, it is difficult to
 identify the most politically feasible path for policymakers to follow
 in developing common object code standards. There is enough of a
 consensus in the industry and among users now to begin the search for
 a solution.  A serious effort should also be made to reach a consensus
 with other industrialized nations, for computers are globally
 interconnected to a degree that no other mass consumer product has
 been.
 Government can prevent a monopoly if it moves now.  The unique
 nature of information technology would allow a common standard to
 develop without locking the industry into risky, immature technologies
 and would accelerate rather than hinder innovation.  According to
 Nicholas Negroponte, director of MIT's Media Lab, "an open systems
 approach is likely to foster the most creative energies for new
services and be a vehicle for the most rapid change and evolution." [21]
 Such an approach would simply provide a stable framework within
 which businesses could compete on the basis of their expertise and not
 on their historical advantage.  This is what America's founding
 fathers designed federalism to do from the start: balance competing
 sectoral and regional interests against one another to spur
 competition and development for the benefit of all.
 
 By  Wade Riddick
 
 Author Biography
 
 Wade Riddick is a graduate student and National Science Foundation
 Fellow in the Department of Government at the University of Texas.  He
 received his B.A. in English from Louisiana State University.  He can
 be reached at [email protected].
 
 Figure 1
 
 Traditional
 
 Microsoft Windows ->  Disk / Screen / Memory / Audio / ...   ->  User
 
 IBM OS/2          ->  Disk / Screen / Memory / Audio / ...   ->  User
 
 Apple Macintosh   ->  Disk / Screen / Memory / Audio / ...   ->  User
 
 Currently users have to pick one complete operating system to run.
 
 __________________________________________________________________
 
 New Systems
 
                                 /-> Microsoft Windows
Microsoft Windows NT -> kernel --+-> IBM OS/2            -> User
                                 \-> Apple Macintosh

                                 /-> Microsoft Windows
Apple/IBM PowerPC    -> kernel --+-> IBM OS/2            -> User
                                 \-> Apple Macintosh
 
 In systems being introduced this year, users have to pick one
 company's kernel and then another company's operating system(s).
 
 ___________________________________________________________________
 
 DNA Common Standard
 
Microsoft       Apple      IBM
   |              |          |
   v              v          v
 Disk    +     Screen  +  Memory   +  .....        -> User
 
 Under DNA, no one company will make *the* operating system.
 
 ___________________________________________________________________
 
 Notes
 
 1      Robert Reich, "The Quiet Path to Technological Preeminence,"
 *Scientific American*, vol. 261, no. 4, (October, 1989), p. 45.
 2      There are many different ways to accomplish the same task.
 Reengineering allows one firm to copy the functionality of another
 firm's design without exactly copying the design itself and infringing
 on the patent.  If a plumber could not find 1" aluminum pipes at the
 hardware store, but had the proper connectors, he might instead use 2"
 pipes;  this is essentially what computer engineers do.
 Most successful companies do not mind that others clone their
 products, because the technological frontier expands so quickly.  One
 generation of chips may have a heyday of only two years.  After that,
 a better chip appears that can do what the old one does and much more.
 Intel, for example, makes its money on the cutting edge of technology
 by selling new chips like the Pentium (i.e., P5) and does not mind
 that Advanced Micro Devices sells a clone of the older  (P4) chip.
 
 Since it is Intel's chip family, users trust only Intel to release the
 next generation standard.  If AMD tried to release a P6 first, no one
 would buy it because it might not be compatible with the P6 Intel
 releases.
 3      Computer instructions can be thought of as forms of money
 because they control specific system resources.  Just as societies
 accept the convention that a piece of paper with symbols has monetary
 value and can be exchanged for something tangible like a candy bar,
 computer makers decide that certain numbered instructions mean certain
 things and perform certain physical tasks on the computer.  Operating
 systems are like political regimes because they set the rules for
 using resources and determine what types of money are permissible.
 Just as businesses in America will not take British pound notes
 because different symbols are printed on the bill, incompatible
 computers do not recognize each other's  basic commands because
 different numbers code for different instructions-even though all
 computers can perform the same logical tasks.  Unlike nations, though,
 assets cannot be moved across computer families because no convention
 for exchanging currencies exists.
 4      "The Computer Industry: Do It My Way," *The Economist*, vol.
 326, no. 7800, (February 27th, 1993), p. 11.
 5      For a detailed description of this technology, see *BYTE*'s
 January 1994 issue.
 6      The most likely stimulus for a desktop PC standard will come
 from interactive TV manufacturers whose profits are not made selling
 operating systems but rather set-top boxes.
 7      "Get on Track: There Will Be No Info Highway," *Wired
 Magazine*,  vol. 2, no. 2, (February, 1994), p. 65.
 8      *The Economist* compared the development of the information
 superhighway to the "the railway free-for-all of the 19th century."
 See "America's Information Highway," *The Economist*, vol. 329, no.
 7843, (December 25, 1993), p. 35.
 9      If one thinks of the fragmentation as sectoral instead of
 regional (e.g., IBM mainframes in banking, Macintoshes in publishing
 and so on).
 10      Companies used non-standard widths to force customers to use
 their railcars and prevent them from riding through their network
 without paying.  The cost to efficiency was high, because
 transcontinental cargo had to be loaded and unloaded several times.
 11      For an account of this standardization process see Alfred
 Chandler's *The Visible Hand* (Cambridge, Mass: Harvard University
 Press, 1977), esp. pp. 130-142.  Because these small firms had
 monopolies in their local markets, they had an interest in adhering to
 and maintaining rail gauge and coupler standards.  In essence, they
 created one big monopoly, but one whose ownership and profits were
 evenly distributed across the countryside.
 12      "The Computer Industry:  Reboot System and Start Again," *The
 Economist*, vol. 326, no. 7800, (February 27th, 1993), p. 4.
 13      Object-oriented programming seeks to solve part of this
 problem by permitting code reuse on particular platforms, but it has
 no standard and does not address the problem of microprocessor Babel,
 so objects cannot easily work across platforms.
14      Jon Udell, "The Great OS Debate," *BYTE*, vol. 19, no. 1,
 (January, 1994), p. 117.
 15      Objects are ways of pairing commonly used types ("classes") of
 data with the instructions needed to manipulate them ("methods").
 Programs then perform their tasks by creating or using existing
 objects and sending "messages" to the objects to tell them what to do.
 For instance, a line object might hold two values and a program could
 send it messages creating a new line, changing its location, or
 deleting it.
 This approach cuts down on redundant code.  The programs that
 draw lines can share the same line object.  Small objects can be
 easily combined into more complex systems.  A square could be a
 combination of four lines.  When a program sends a "create" message to
 the square, the square sends four "create" messages to the line
 object.
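
A minimal Java rendering of this example (names invented): the line
object pairs its data with its methods, and the square reuses four
lines instead of duplicating their code.

    import java.util.ArrayList;
    import java.util.List;

    class Line {
        int x1, y1, x2, y2;                    // the object's data

        Line(int x1, int y1, int x2, int y2) { // the "create" message
            this.x1 = x1; this.y1 = y1; this.x2 = x2; this.y2 = y2;
        }

        void move(int dx, int dy) {            // the "change location" message
            x1 += dx; x2 += dx; y1 += dy; y2 += dy;
        }
    }

    class Square {
        final List<Line> sides = new ArrayList<>();

        Square(int x, int y, int len) {        // one create sends four creates
            sides.add(new Line(x, y, x + len, y));
            sides.add(new Line(x + len, y, x + len, y + len));
            sides.add(new Line(x + len, y + len, x, y + len));
            sides.add(new Line(x, y + len, x, y));
        }
    }

    public class Shapes {
        public static void main(String[] args) {
            Square sq = new Square(0, 0, 5);
            System.out.println(sq.sides.size() + " lines make one square");
        }
    }
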
 16      Presumably for virtual reality or pharmaceutical research.
 17      "The Computer Industry: Harsh New World," *The Economist*,
 vol. 326, no. 7800, (February 27th, 1993), p. 7.
 18      Toolboxes are large sets of functions provided by the
 operating system to applications.  On the Mac, for instance, the
 toolbox draws windows and plays sounds.  Programmers do not need to
 write their own code to do these things because they are provided by
 the system.  Since all programs use these standard services,
 applications can be written faster and appear the same to users, so
 the learning curve for using Mac programs is much shorter.
 Other companies have adopted this approach and now provide
 extensive services through what they call an API (Application Program
 Interface).
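
The same pattern survives today.  In Java, for example, a program
asks the system's windowing toolbox (AWT) for a window instead of
drawing one itself; this bare-bones example uses only the standard
library:

    import java.awt.Frame;

    // The toolbox supplies the window's drawing, sizing and look.
    public class ToolboxDemo {
        public static void main(String[] args) {
            Frame window = new Frame("Toolbox window");
            window.setSize(300, 200);  // the system does the rest
            window.setVisible(true);
        }
    }
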
 19      James Dearing, "Rethinking Technology Transfer,"
 *International Journal of Technology Management*, vol. 8, pp. 1-8.
20      Bobby Inman and Daniel Burton, "Technology and Competitiveness,"
 *Scientific American*, vol. 269, no. 1 (January 1991), p. 126.
 21      Nicholas Negroponte, "Set-Top Box As Electronic Toll Booth:
 Why We Need Open-Architecture TV," *Wired*, vol. 1, no. 4  (Sept/Oct,
 1993), p. 120.
 
 
 ------------------------------
 
 Date: Fri, 10 Jun 1994 15:41:54 -0700
 From: email list server <[email protected]>
 Subject: File 2--PDC'94 CFP-Artifacts session (revised)
 
 ==================================================================
 
 CALL FOR PARTICIPATION-Artifacts session
 PDC'94
 Third Biennial Conference on Participatory Design
 Chapel Hill, North Carolina
 October 27-28, 1994
 
 Sponsored by Computer Professionals for Social Responsibility
 ==================================================================
 
 In the last few years, participatory approaches to design have gained
 adherents around the world. Participatory design approaches have at
 their core the involvement of workers in the design and development of
 new technologies and work practices that have the potential of
 improving their work lives. Collaborative design projects combine the
 skills and knowledge of workers who will use or are using the
 technology, with the technological and organizational expertise of
 those involved in its development.
 
 The first Participatory Design conference explored the historical roots
 of this way of working, by bringing European practitioners together
 with American researchers and industry developers. By the second
 conference, PDC'92, participatory approaches to design had taken root
 in the US, not only in research environments, but also at several
 commercial firms. The goal at that time was to take a further step
towards defining and nurturing participatory design. In PDC '94, we
 would like both to consider our ways of working and to foster a
 substantial dialog among practitioners. The conference is an
 international forum where this emerging community can meet, exchange
 ideas and experiences, and investigate the incorporation of
 participatory design approaches in new areas such as: product
 development, long-term system maintenance and redesign, and settings in
 the developing world.
 
 We encourage the participation of all those interested in learning
 about participatory design and in trying it in their own settings, as
 well as those currently employing participatory approaches to design
 (possibly under other names).
 ==================================================================
 
 Artifacts submissions
 (including posters and demonstrations)
 
 The Artifacts program brings together representations, techniques,
 methodologies and technologies developed for or through participatory
 design. (A representation may take the form of documents and other
 objects that reflect work practices, designs, and associated materials,
 and should include both the artifact itself and how it is used in the
 work situation.)
 
 A contribution to the Artifacts program should be intended to be shown
 or demonstrated informally at a booth. The Artifacts program will take
 place in conjunction with the conference dinner and thus will not
 overlap with the papers/panels/workshops tracks.
 
Submission Requirements:

- Description and motivation of the artifact and how it is used in
  practice (5 copies, maximum 3 pages). Include non-textual materials
  like photographs, videotapes, sketches, etc., if appropriate (only
  one copy of a videotape is required, and photographs may be provided
  in photocopied form). Be sure to describe any plans to engage
  conference participants directly in using the artifact.

- A one-page short paper for the PDC'94 Proceedings. Each accepted
  artifact will be represented by this published short paper. The
  short paper MUST be received in camera-ready format as part of the
  submission, due 15 July 1994. Please contact Michael Muller at the
  addresses given below to obtain a copy of the author's kit, or
  consult the format/guidelines available through cpsr.org.

- Brief description of the artifact presenter's relevant experience
  and background.

- Any special equipment or power requirements.
 
Submissions and requests for information to:

Michael Muller, PDC'94 Artifacts Co-Chair
U S WEST Advanced Technologies
4001 Discovery Drive / Suite 280
Boulder CO 80303 USA
 
 tel: +1 303 541 6564
 fax: +1 303 541 6003
 email: [email protected]
 ==================================================================
 
 IMPORTANT DATES (in 1994)
 
 July 15:        Artifacts proposals received
 August 1:       Final versions of papers/panels/workshops received for
 proceedings
 August 15:      Acceptance notifications to artifact presenters
 ==================================================================
 
 Accepted submissions and proposals from all categories will appear in a
 proceedings distributed to conference participants. We look forward to
 seeing you in North Carolina in the Fall of 1994.
 
 Sincerely,
 
 PDC '94 Conference Committee
 
 Bill Anderson                           Conference Chair
 Susan Suchman & David Bellin            Local Co-chairs
 Susan Irwin Anderson & Randall Trigg    Program Co-chairs
 Andrew Clement                          Panels Chair
 Finn Kensing                            Workshops Chair
 Annette Adler & Michael Muller          Artifacts Co-chairs
 Elizabeth Erickson                      Proceedings Chair
 Erran Carmel                            Treasurer
 Barbara Katzenberg & Peter Piela        Publicity Co-chairs
 =================================================================
 
 PDC '94 Program Committee
 
 Annette Adler (Artifacts Co-Chair), Xerox Corporate Architecture
 Susan Irwin Anderson (Program Co-Chair)
 Susanne Bodker, Aarhus University
 Tone Bratteteig, University of Oslo, Norway
 Andrew Clement (Panels Chair), University of Toronto
 Yrjo Engestrom, University of California, San Diego
 Christiane Floyd, University of Hamburg
 Joan Greenbaum, LaGuardia College, City University of New York
 Judith Gregory, University of California, San Diego
 Kaj Gronbaek, Aarhus University, Denmark
 Jonathan Grudin, University of California, Irvine
 Mike Hales, University of Brighton, United Kingdom
 Karen Holtzblatt, InContext Enterprises
 Finn Kensing (Workshops Chair), Roskilde University Center, Denmark
 Sarah Kuhn, University of Massachusetts, Lowell
 Michael Muller (Artifacts Co-Chair), US West Advanced Technologies
 Charley Richardson, University of Massachusetts, Lowell
 Patricia Sachs, NYNEX Science and Technology
 Randall Trigg (Program Co-Chair), Xerox Palo Alto Research Center
 Eline Vedel, The National Bank of Norway
 Ina Wagner, Technical University, Vienna
 Terry Winograd, Stanford University / Interval Research
 ==================================================================
 
 For registration information write c/o Information Foundation, 46
 Oakwood Dr., Chapel Hill, NC, 27514 or send electronic mail to
 [email protected].
 
For program information write William L. Anderson, Xerox Corp.,
817-02B, 295 Woodcliff Drive, Fairport, NY 14450 USA
email: [email protected]      tel: (716) 383-7983
 ==================================================================
 
 Conference information is also available via the World Wide Web at
 http://cpsr.org/cpsr/conferences/pdc94 or via anonymous ftp at
 ftp.cpsr.org in the /cpsr/conferences/pdc94 directory.
 
 ------------------------------
 
 Date: Thu, 30 Jun 1994 16:34:47 +0000
 From: "US ACM, DC Office" <[email protected]>
 Subject: File 3--ACM Releases Crypto Study
 
 Association for Computing Machinery
 
 PRESS RELEASE
 __________________________________________________
 
 Thursday, June 30, 1994
 
 Contact:
 
 Joseph DeBlasi, ACM Executive Director (212) 869-7440
 Dr. Stephen Kent, Panel Chair (617) 873-3988
 Dr. Susan Landau, Panel Staff (413) 545-0263
 
 COMPUTING SOCIETY RELEASES REPORT ON ENCRYPTION POLICY
 
 "CLIPPER CHIP" CONTROVERSY EXPLORED BY EXPERT PANEL
 
 WASHINGTON, DC - A panel of experts convened by the nation's
 foremost computing society today released a comprehensive report
 on U.S. cryptography policy.  The report, "Codes, Keys and
Conflicts: Issues in U.S. Crypto Policy," is the culmination of a
 ten-month review conducted by the panel of representatives of the
 computer industry and academia, government officials, and
 attorneys.  The 50-page document explores the complex technical
 and social issues underlying the current debate over the Clipper
 Chip and the export control of information security technology.
 
 "With the development of the information superhighway,
 cryptography has become a hotly debated policy issue," according
 to Joseph DeBlasi, Executive Director of the Association for
 Computing Machinery (ACM), which convened the expert panel.  "The
 ACM believes that this report is a significant contribution to the
 ongoing debate on the Clipper Chip and encryption policy.  It cuts
 through the rhetoric and lays out the facts."
 
 Dr. Stephen Kent, Chief Scientist for Security Technology
with the firm of Bolt Beranek and Newman, said that he was
 pleased with the final report.  "It provides a very balanced
 discussion of many of the issues that surround the debate on
 crypto policy, and we hope that it will serve as a foundation for
 further public debate on this topic."
 
The ACM report addresses the competing interests of the
various stakeholders in the encryption debate -- law
enforcement agencies, the intelligence community, industry and
 users of communications services.  It reviews the recent history
 of U.S. cryptography policy and identifies key questions that
 policymakers must resolve as they grapple with this controversial
 issue.
 
 The ACM cryptography panel was chaired by Dr. Stephen Kent.
 Dr. Susan Landau, Research Associate Professor in Computer Science
 at the University of Massachusetts, co-ordinated the work of the
 panel and did most of the writing. Other panel members were Dr.
 Clinton Brooks, Advisor to the Director, National Security Agency;
 Scott Charney, Chief of the Computer Crime Unit, Criminal
 Division, U.S. Department of Justice; Dr. Dorothy Denning,
 Computer Science Chair, Georgetown University; Dr. Whitfield
 Diffie, Distinguished Engineer, Sun Microsystems; Dr. Anthony
 Lauck, Corporate Consulting Engineer, Digital Equipment
 Corporation; Douglas Miller, Government Affairs Manager, Software
 Publishers Association; Dr. Peter Neumann, Principal Scientist,
 SRI International; and David Sobel, Legal Counsel, Electronic
 Privacy Information Center.  Funding for the cryptography study
 was provided in part by the National Science Foundation.
 
The ACM, founded in 1947, is an 85,000-member non-profit
 educational and scientific society dedicated to the development
 and use of information technology, and to addressing the impact of
 that technology on the world's major social challenges.  For
 general information, contact ACM, 1515 Broadway, New York, NY
 10036. (212) 869-7440 (tel), (212) 869-0481 (fax).
 
 Information on accessing the report electronically will be
 posted soon in this newsgroup.
 
 ------------------------------
 
 ------------------------------
 
 End of Computer Underground Digest #6.60
 ************************************
 
 
 