
What’s Next in the Computer Science Sector?

By Stephen DeAngelis

One of the speakers I was fortunate enough to book for the First Annual Enterra Solutions® Cognitive Computing Summit was C. Gordon Bell, an American electrical engineer and entrepreneur. An early employee of Digital Equipment Corporation (DEC) from 1960 to 1966, Bell designed several of its PDP machines and later served as Vice President of Engineering (1972–1983), overseeing the development of the VAX. His later career included work as an entrepreneur and investor, a term as founding Assistant Director of the National Science Foundation’s Computing and Information Science and Engineering Directorate (1986–1987), and a post as researcher emeritus at Microsoft Research (1995–2015). He is also known for developing Bell’s Law of computer classes (discussed below). I owe a debt of gratitude to Nancy Kleinrock of TTI/Vanguard, who captured the highlights of Bell’s presentation.

Is the Past a Prologue for the Future?

Bell asserted that the past half-century of progress in computers has been highly predictable. He noted that Moore’s Law, formulated in 1965, predicted the progress of transistor density: density would double every 18 months, a pace that works out to a 100-fold improvement each decade. That prediction has held true for over 40 years. In 1971, Bell extrapolated on Moore’s projection in two directions: first, an established computer class would improve at Moore’s Law pace at a constant price; and, second, a new computer class would emerge every decade. This latter corollary has become known as Bell’s Law. Consequences of Bell’s Law include the following (a quick check of the arithmetic behind these figures follows the list):

  • Each new computer class would cost roughly one-tenth of its predecessor.
  • Each new computer class would achieve an installed base an order of magnitude greater than its predecessor.
  • Each new computer class would introduce new technologies, manufacturers, and new uses — that is, new markets.
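
The two figures in Moore’s prediction are consistent with each other: a decade holds 120 months, or roughly 6.7 eighteen-month doubling periods, and 2^6.7 is roughly 100. Below is a minimal sketch of that arithmetic, together with the one-tenth-cost corollary; the starting price and class labels are illustrative, drawn loosely from the article’s own examples rather than from Bell’s talk.

```python
# Back-of-the-envelope check of the figures quoted above (illustrative only).

# Moore's Law as quoted: transistor density doubles every 18 months.
months_per_decade = 120
doublings_per_decade = months_per_decade / 18          # ~6.7 doublings per decade
density_gain_per_decade = 2 ** doublings_per_decade    # ~100x, the "100-fold" figure

print(f"Gain per decade at one doubling every 18 months: ~{density_gain_per_decade:.0f}x")

# Bell's Law corollary as quoted: each new computer class costs ~1/10 of its predecessor.
# Starting price ("a couple of million dollars" for a 1960s mainframe) is a rough illustration.
starting_price = 2_000_000
classes = ["mainframe (1960s)", "minicomputer (1970s)", "workstation/PC (1980s)",
           "web server/palm (1990s)", "cloud/smartphone (2000s)"]
for n, name in enumerate(classes):
    print(f"{name}: roughly ${starting_price / 10 ** n:,.0f}")
```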

In the 1960s, the dominant computer class was the mainframe. In the 1970s, minicomputers were introduced. That class was followed by networked workstations and personal computers in the 1980s. The 1990s saw the rise of the browser-web-server structure and palm computing. The new century saw the rise of web services (i.e., cloud computing) and smartphones. Bell had predicted that home and body area networks would form by 2010. As part of his historical tour d’horizon, Bell took Summit participants on a stroll through a past in which he played a significant role. When he was working at DEC, IBM dominated the mainframe market, but DEC made a name for itself with its Programmed Data Processor (PDP) line of minicomputers. The PDP emerged not as a direct competitor to the mainframe, but as a less capable, less expensive alternative for computational tasks that required fewer resources. When introduced in 1965, the PDP-8 cost $18K, compared to an IBM System/360, which cost a couple of million dollars. By 1972, a new PDP-8 could be had for a mere $3K.

DEC’s strategy of introducing a less-capable, but more useful and affordable, product was disruptive. Clayton Christensen describes this phenomenon in his best-selling book The Innovator’s Dilemma. When mainframes were the only computers available, people couldn’t imagine ever needing one. As computer prices fell, that all changed. During the 1980s, the personal computer was to the minicomputer what the minicomputer had been to the mainframe: undercutting it on price and features, but poised to overtake it with the promise of Moore’s Law. Bell observed that minicomputers followed a path that PCs have since retraced: some models fell in price while retaining their performance, while others held their price and gained new capabilities with each generation. He also noted, “By the time the minicomputer market had matured, the 100 manufacturers of 1965 had consolidated, moved on, or died out; in 1985, a mere half dozen remained.”

The big question Bell’s presentation raised is: Where does the computer science sector go from here? At the Summit, Bell expressed his belief that every “thing,” whether born digital or as a physical object, will eventually be in cyberspace and possess a digital identity with a state that can be sensed and controlled. He indicated he wasn’t sure when this future reality would emerge or how difficult it would be to develop the necessary technology. He acknowledged that the challenges are not all technical; essential nontechnical considerations to achieving this end state exist as well. For example, he asked, “Does society — people or governments or firms — desire it?” He even wondered, “Is it (or will it be) legal?” Bell’s point is that the future may not be as predictable as the past.

Supercomputers Are Getting More Powerful

Bell observed that one thing has remained constant in the ever-changing computer science sector: the continued development of supercomputers, the largest, most capable machines of their era, designed and manufactured to spec for the most computationally intensive problems of the day. Bell noted that Seymour Cray pioneered supercomputers, first at Control Data Corporation and then at his self-named firm, at a time when IBM dominated the mainframe market. The Cray-1 (1976) delivered 17 million floating-point operations per second (MFLOPS). Currently, the Chinese Sunway supercomputer (2017) features over 100 PFLOPS (i.e., 100 quadrillion floating-point operations per second) across 10.6 million cores. The advancement in supercomputer technology has been made possible by the exponential improvement in transistor density. To illustrate this point, Bell noted that Chris Fenton and Andras Tantos replicated the architecture and performance of the 5.5-ton Cray-1 four decades later using a field-programmable gate array (FPGA) board in a ten-inch-high machine that is otherwise a visual replica of the original. Today, supercomputers generally take the form of computing clusters.
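
To put that jump in perspective, here is a quick calculation using only the figures quoted above (a sketch; published benchmark numbers for both machines vary by source). The performance ratio works out to nearly ten orders of magnitude, implying a doubling time in the neighborhood of the Moore’s Law cadence cited earlier.

```python
import math

# Rough sanity check on the supercomputer comparison above,
# using only the figures as quoted in the article.
cray_1_flops = 17e6     # Cray-1 (1976): 17 MFLOPS
sunway_flops = 100e15   # Sunway (2017): over 100 PFLOPS
years = 2017 - 1976     # roughly four decades apart

ratio = sunway_flops / cray_1_flops            # ~5.9e9, nearly ten orders of magnitude
doublings = math.log2(ratio)                   # ~32.5 doublings
months_per_doubling = years * 12 / doublings   # ~15 months, close to the Moore's Law cadence

print(f"Performance ratio: {ratio:.1e}")
print(f"Implied doubling time: ~{months_per_doubling:.0f} months")
```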

The Smartphone and Beyond

Bell told Summit participants that even though PCs remain a mainstay of the business and home markets, the game-changing platform of the 2000s was the smartphone. “It continues to reign today as the personal computer that puts the Internet — and all the connectivity and information resources that implies — literally in the hands of nearly half the world’s population. With the cloud performing the server role and smartphones as clients, this pocketable device has made the always-on, always-aware, always-connected lifestyle the norm.” The Internet of Things (IoT) is going to provide ubiquitous connectivity in the future and will make possible Bell’s vision of every “thing” having a digital identity. “Using low-cost technologies, connectable ‘things’ will span industrial process controls, home automation, personal wearables, home-bots (e.g., Amazon Echo), sensor networks for environmental and natural resource and urban monitoring, and robotics and vehicular automation.” He predicted that ubiquitous millimeter-scale devices will represent one aspect of the next class of computing.

As things become more connected, Bell predicted, we will see more urban and environmental automation: smart cities and smart agriculture will become more commonplace. Bell noted that municipalities are gaining visibility into traffic, utilities, air quality, security, and other facets of the urban environment by analyzing and viewing the instantaneous and longitudinal output of distributed sensors at an arbitrary level of granularity. In the agricultural sector, environmental sensor data, integrated from multiple agencies and jurisdictions, provides a window into impending problems and informs farmers. Bell summed up the IoT as “Things as a Service” (TaaS) and considered it the latest paradigm shift. His bottom line is that technology is only going to keep progressing, and how it is used is limited only by our imaginations.
