
Computing in Higher Education and at Cornell in the 1980s - Ken King

Information Technology in Higher Education in the 80's

Prior to the early 1980's, a relatively small percentage of faculty members in Universities were involved with computing, and those who were tended to be fairly technologically sophisticated. Computing was generally important in a relatively small number of courses. Administrative uses were predominantly in the areas of the financial office and the registrar's office. Institutions tended to adopt a specific vendor's technology (IBM or Digital Equipment most frequently), and the pace of change of hardware and software was dictated by the rate of evolution of the technology provided by that vendor. Computer services staff members were frequently attached to specific users or specific administrative offices and worked within an organizational structure that was fairly flat with a minimum of procedures, rules, standards and management. Within the constraints of the technology and institutional resources, the fundamental goal of computer services staffs was to keep their important users happy, and they did this by providing individualized service and by trying hard to give them whatever they wanted. Computer users were willing to trade reliability for advanced function. As long as all the important users were happy, the top-level administration tended to ignore computing aside from worrying about the ever-increasing cost. Leadership on campus in computing issues tended to be shared between computer services and a small number of people outside of computer services who depended on it. Institutions of higher education had difficulty in defining the role of information technology in the institution, in adapting to its rapid rate of change, and in funding it in a stable and consistent manner. Directors of Computer Centers generally reported at about the same level as Buildings and Grounds, and computer center staff were generally housed in basements or other marginal space.

In the early 80's it was possible to classify roughly an institution's approach to technology into five categories: pioneering, early followers, competitive, conservative and resistive. A small number of institutions had taken an active role in developing information technology. These pioneering institutions tended to be a couple of years ahead of early followers, who were early adopters of leading edge technology developed elsewhere. Competitive institutions attempted to keep up with the leaders in their peer group and were usually about 4 years or one computer generation behind the pioneers. The vast majority of institutions were conservative. Their technology level was more than one generation behind that of the leaders. Resistive institutions tended to be two or more generations behind the pace setters. In the early 80's pioneers usually spent more than 7% of their total institutional budget on information technology; early followers often spent 5% to 7%; competitive institutions 4% to 5%; conservative institutions 3% to 4%; and resistive institutions less than 3%. The US national average was about 3.5%, of which about 2% was spent on academic computing and 1.5% on administrative computing.
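Read as a rule of thumb, those spending bands amount to a simple threshold lookup. A minimal sketch in Python, encoding the percentages quoted above (the handling of exact boundary values is my own assumption, since the quoted ranges meet at their endpoints):

```python
def classify_it_spend(percent_of_budget: float) -> str:
    """Classify a 1980s institution by the share of its total budget
    spent on information technology, per the bands in the text."""
    if percent_of_budget > 7:
        return "pioneering"
    if percent_of_budget >= 5:
        return "early follower"
    if percent_of_budget >= 4:
        return "competitive"
    if percent_of_budget >= 3:
        return "conservative"
    return "resistive"

# The US national average of about 3.5% falls in the conservative band.
print(classify_it_spend(3.5))  # prints "conservative"
```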

By the middle 1980's, with advances in large-scale integrated circuits and communications and the appearance of personal computers and workstations attached to global networks, the world changed dramatically. Computing became democratized and an essential tool for every knowledge worker. The cost of computing cycles on microcomputers became a small fraction of the cost on mainframes and mini-computers, and network access made global information resources available to small and remote institutions. Suddenly the number of people depending on computing on campus began to grow exponentially, with most of the new users technologically unsophisticated. Computing began to permeate every discipline, and every administrative office lived or died by its computer connection. The future careers of students began to depend on their ability to understand the role and application of information technology in their future profession. Instead of depending on one vendor's hardware and software, there were now many alternatives. Computer hardware and software became a commodity market with the best deal changing by the hour.

In the new environment, all the rules changed for computer services organizations. With small staffs and exponentially growing demand for support, it was no longer possible to never say no. It was no longer possible to attach staff members to particular users to provide individualized service. Trading reliability for technological sophistication was no longer a winner, because the few people who benefited were greatly outnumbered by the people whose lives were disrupted when the system was down. Rules, standards and procedures were necessary to simplify the environment and provide reliable, dependable service. Finding ways to leverage support activities became essential. Managing computer services successfully began to depend more and more on leadership and management skills in addition to technical skills.

As it became apparent that information technology was rapidly changing the nature of work and scholarship in many disciplines and that the whole institution was dependent on good computer support, top-level college and university administrators could no longer ignore computing. Some institutions sought competitive advantage by declaring themselves to be "computer intensive" and greatly increasing computer support budgets. Computer Center Directors began to be replaced by Chief Information Officers, who reported a level or two higher than computer center directors and whose job it was to provide strong leadership on technological issues and to manage limited information technology resources in an optimal way.

Computing at Cornell in the 1980's

When I visited Cornell in the summer of 1980, my assessment was that Cornell could be classified as competitive with respect to research support and facilities, conservative with respect to support for infusing computing into the curriculum, and resistive with respect to support for administrative and library systems. On the plus side, computer services had a highly talented staff, and the administration, in response to criticism from the faculty and some trustees, was eager to improve computing. As a result of a failed and expensive attempt to move medical computing to the campus to save money, the administration had little respect for computer services leadership or competence, and computer services morale seemed poor. The staff was in multiple locations and housed in marginal space. The main campus computer was housed off campus at the airport. As a legacy of an earlier report by a faculty committee, called the DeBoer report, some staff believed that the administration did not appreciate their contribution to the University.

Prior to accepting a position at Cornell, I spent a week interviewing some of the staff, all of the Deans and the faculty members who were most active in computing. I then developed a plan that would move computing up to the status of early follower over a three-year period. I also insisted that the staff be housed in their own building and the campus mainframe computer be brought back to campus so that the whole staff could be housed together. I made acceptance of this plan and the title of Vice Provost a condition for accepting the job. Provost Keith Kennedy had a discussion with the President and they accepted my plan and its funding consequences. I then accepted the challenge of quickly moving Cornell into a national leadership position in computing. Fortunately the computing talent required to accomplish this was mostly already in place. In addition, key administrative people like Keith Kennedy, Jim Spencer and Jack Ostrom were highly supportive. Cornell had a world class faculty that was able to quickly exploit an improved computing environment. Faculty members who were particularly supportive and influential included Bob Cooke, Geoff Chester, Juris Hartmanis, Tim Teitelbaum, Ken Wilson, Paul Velleman, and Bob McGinnis. My task was to help them argue for resources and to keep out of people's way.

By the time I left Cornell in 1987, Cornell was a National Supercomputer Center and a national leader in developing instructional software, in distributed computing, and in networking. Its main administrative systems were online and supported by a database management system, and the library had an online catalog system. Cornell was one of a small number of institutions that were excellent in all of these categories.

Some anecdotes from this period

I recall that Bob Cooke met me at the airport when I came up, and as we were driving over that one-lane bridge down near the Plantations, he said, "The most difficult thing about Cornell is that it is a very, very complicated institution." So I said, "Well, I just finished running computing for the City of New York for two years, and the City University of New York has 19 institutions, so I am used to complexity!" Cornell was not that complicated. The complicated thing about Cornell was the Statutory and Endowed split. The Endowed Colleges were always trying to get more money from the Statutory Colleges, and the Statutory Colleges were determined to argue about every shared charge.

Starting at the beginning, my immediate impression was that there was a huge amount of talent on the staff but very little confidence and support from Day Hall. This lack of confidence was the result of a disastrous attempt to move computing from the Cornell Medical Center in New York City to the campus. The absence of microcode support for APL on the IBM 168 brought the campus machine to a standstill when running Medical Center administrative systems that were written in APL. The Medical Center had disposed of its 360 model 50, which supported APL microcode, in the move, and restoring it was very expensive.

In my interview a major concern was my feelings about the DACS group that Doug Gale had just been hired to lead. Faculty on the interview committee wanted to make sure I wasn't opposed to microcomputers. I had been very busy at City University buying lots of microcomputers for the Colleges as I had decided that the way to do instructional computing was with microcomputers. So, that was not a problem.

There were a couple of big problems while I was at Cornell. The biggest was that there had never been a resolution of the problem of splitting central computing costs between the Endowed and the Statutory Colleges and the Professional Schools. It was always an open sore: the Business School in particular thought we were overpriced, and the Ag School always had the general feeling that they were getting screwed. So every year the budget was put together with a lot of uncertainty as to where the funding was coming from. In the end, the Endowed Colleges would make up the difference. A key breakthrough occurred when the University Controller, Jack Ostrom, developed an indirect cost pool he called the Glob. The Glob included administrative computing costs and all other computing costs, like consulting, that couldn't be assigned directly. The Glob was allocated by formula to all of the Colleges. There was constant grumbling about the Glob, but Jack had found a creative mechanism to keep computing funded in a stable way.

Developing administrative systems in a University environment is particularly challenging. The administration expects these systems to reduce staffing, but this almost never happens. I vividly remember that in the development of the student system it cost a horrendous amount of money just to put grade point average into the system. Every college had different rules on what courses counted, and I was told that there was no way to change that - it was a fact of life and not a central administration decision. Another problem was that in specifying a system, the student system for example, the Registrar insisted that all the paper look exactly the way it always had, so there was no saving in any of the student offices; in fact, they added people. This problem plagued many aspects of administrative system development. Dave Koehler described it as creating a platinum-covered cow path. Unlike a business, where you design a system to substitute computers for manual labor, saving labor was absolutely impossible at Cornell because of the respect for local autonomy in the administrative offices. The systems that we did implement - basically the student, admissions, financial aid, and public affairs systems - were never designed to save labor. They were designed to give people quicker access to information without rummaging through file cabinets.

The most interesting event while I was at Cornell occurred when I was attending an IBM symposium in New Haven. There was a snowstorm and all the flights were grounded, so instead of flying back that evening I was stuck in New Haven, and I went out to dinner with Carl Ledbetter. Over dinner we started talking about the NSF supercomputer competition. Carl was moaning that he was trying to sell IBM on developing vector units to attach to their machines, and that without them IBM had nothing that would enable it to compete. We discussed the fact that we had 4 array processor systems from Floating Point Systems (FPS) and were adding cycles this way, and maybe if we added enough of them we could tide ourselves over until IBM could provide a vector capability. We did a back of the envelope calculation that showed that if we attached something like 16 FPS boxes to an IBM machine, we would have a machine that was reasonably respectable. So we decided that, just for the hell of it, Carl would feel out IBM about providing a system that would be based on 16 FPS processors. He would try to sell IBM internally and I would try to sell Cornell.
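The napkin arithmetic might have looked something like the sketch below. The peak ratings are my own ballpark assumptions for illustration (roughly 11 MFLOPS for an FPS-164-class array processor, roughly 160 MFLOPS for a Cray-1), not figures from the conversation:

```python
# Hypothetical back-of-envelope: aggregate peak throughput of 16 FPS
# array processors compared with a Cray-1. Both peak figures below are
# assumed ballpark numbers, not from the original conversation.
FPS_PEAK_MFLOPS = 11     # assumed: FPS-164-class array processor
CRAY1_PEAK_MFLOPS = 160  # assumed: Cray-1

boxes = 16
aggregate = boxes * FPS_PEAK_MFLOPS
ratio = aggregate / CRAY1_PEAK_MFLOPS
print(f"{boxes} FPS boxes: {aggregate} MFLOPS peak, "
      f"{ratio:.2f}x a Cray-1")
```

On assumptions like these, 16 boxes land in the same rough league as a Cray-1's peak, which is the "reasonably respectable" conclusion described above.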

I came back to Cornell, and Ken Wilson, Provost Bob Barker, and Joe Ballantyne, then VP for Research, all thought that Cornell getting a supercomputer center with a major contribution from IBM would be wonderful. So I kept talking to Ledbetter, who kept telling me that he was running into lots of problems at IBM because they knew that if they agreed to the interim FPS plan, they were committing themselves to developing vector units. Finally we got to the point where Donna Bergmark, with the help of others, had written an NSF proposal. We had a proposal ready but no commitment from IBM as the submission deadline approached. I came back from some trip and learned that everyone was waiting for me because we were flying to Yorktown. We had chartered a plane, and Barker and Wilson and I were going to Yorktown to meet the IBM Director of Research to determine whether this was go or no go. They decided that I ought to make the pitch, so on the airplane I put together some foils in my own handwriting sketching out the system. We showed up and there were 12 people there for the meeting, with Ledbetter looking absolutely ashen, like death warmed over. Things must not have been going well.

After my presentation on the proposed cooperative proposal to the NSF, the IBM Director of Research went around the room; about half the people said it was reasonable, and about 3 people said it was too much money, a big mistake, and potentially embarrassing. The whole meeting lasted no more than 30-40 minutes. After listening to everyone he smiled at us and said, "So, we'll do it! You'll get a letter by tomorrow!" Barker nearly fell out of his chair. On the way to the airport Barker observed that Ledbetter had started breathing again. Barker also said he wasn't sure when they decided to do it, but he was sure it was before we got there. Obviously, the Director of the IBM labs and the Vice President of Research would have cleared it with Akers, since it was a $48 million grant.

Then the second cliffhanger was the NSF site visit associated with the competition. Wilson and I had insisted on a dress rehearsal: all the people who were going to testify had to come and give their pitch a couple of days before the committee arrived. After all, winning or losing depended on these site visits. Wilson, Ravi Sudan, and I asked the questions, pretending we were the site visit committee. My favorite question was, "Would you rather compute on a Cray?" One of the researchers replied, "Of course, I'd rather have a Cray," causing Wilson to roll his eyes heavenward! The people who came were upset because they had to take time out to rehearse; they were professionals who had done site visits before, so why did they have to come and listen to these questions? The day of the site visit was absolutely serendipitous. Wilson, who had done a somewhat rambling job of his presentation during the dress rehearsal, was absolutely brilliant and on target. He made everyone believe that it would be the end of civilization as we know it if Cornell did not develop parallel computing. Everyone who testified was enthusiastic. Wilson had brought an expert on parallel computing from Europe, who said this was absolutely the wave of the future, this was the way computing was going, and you had to have one site that did this. The difference between the site visit presentations and those at the dress rehearsal was the difference between gold and lead. I would have to say that that day marked the height of my career at Cornell.

Shortly after Cornell won the competition for a National Supercomputer Center, Cornell and the 4 other winners met with Dennis Jennings from the NSF at NCAR in Colorado to talk about networking access to the centers. We recommended immediately connecting the centers with a network. A major issue was what protocol we would use on this network. Wilson and I strongly supported TCP/IP because it was an open protocol, as opposed to the other major contender, DECNET, which was favored by the Physics community. TCP/IP was the protocol developed for ARPA, but the ARPANet had essentially died by this point in favor of DECNET, a network with commercial support that ran on DEC VAXes. TCP/IP was a favorite of the most advanced part of the computer science community, which ran Unix. While at City University I had signed the first license with Western Electric, the licensing arm of AT&T, to bring Unix to a University, and when I came to Cornell I had started a small Unix support group in DACS under Alison Brown, Ken Wilson's wife. Thus we were familiar with the advantages of TCP/IP. As an open system, its future and functionality were determined by the University community rather than by a commercial company. We carried the day largely because of Ken Wilson's prestige as a Nobel Prize winner.

Dennis Jennings, who was on loan to the NSF from the University of Dublin in Ireland and who was a major player in the international development of Bitnet, was a risk taker and a strong supporter of the TCP/IP decision. He believed that it was easier to ask for forgiveness than to ask for permission. So he agreed on the spot to fund connecting the national supercomputer centers to a network that we called NSFNet. The following day Cornell ordered the communication lines needed to connect the centers (well in advance of receiving an NSF grant) and the University of Illinois ordered the routers (these routers were called fuzzballs and were developed by a group at the University of Delaware). The existing ARPANet used IMPs developed by BBN that were very expensive. Shortly thereafter, these routers were replaced by commercial routers made by a new startup called Cisco.

A second part of the grand plan was to create regional networks that would connect to this backbone network. I called a meeting of computer center directors from New York State Universities at Cornell and out of that meeting Richard Mandelbaum from the University of Rochester and I agreed to seek funding for NYSERNet (New York State Educational and Research Network), that would connect New York State Universities to the NSF backbone, thus connecting them to the national supercomputer centers. The University of Rochester was a member of the Princeton University supercomputer consortium and thus a necessary partner in this effort. We succeeded in getting funding from the NSF and New York State to create this network.

At the same time Doug VanHouweling, then at Carnegie Mellon having left Cornell, and I started a group within EDUCOM called the NTTF (Networking and Telecommunications Task Force) to argue for increased funding for Universities nationally to connect to NSFNet. We hired Mike Roberts from Stanford to lead this effort. Shortly thereafter, I left Cornell to become the President of EDUCOM, and a major piece of our agenda was funding for NSFNet. We worked closely with Senator Al Gore and his staff to develop the required legislation. Al Gore's father had sponsored the National Super Highway act, and Al coined the phrase "National Information Superhighway". He worked tirelessly to get this legislation passed. The NSFNet was spectacularly successful, and in April of 1993 NSFNet became the Internet. As they say, "the rest is History"!

Prepared by Ken King and John Rudan    September 2004