A Different Way of Computing
I need a computer. Actually, I need the processing power of hundreds of computing hours. Heretofore, running data or testing a model meant relying solely on the computing power available on my campus computing system. For a major operation, that might mean waiting behind other faculty and student projects, and then having to run my data over days at a time. Today, that computing power can be had at my fingertips in a matter of minutes – or even seconds.
Likewise, my email, my files, my programs were all formerly on my computer – or on my campus’ mainframe. Today, those operations – and my data – may reside on servers in Washington State – or in Bangalore. And this may not just be for my computing needs. Today, it may be for my entire campus and all of the institution’s students and faculty.
Welcome to the world of cloud computing!
The Cloud Computing Concept
The Economist reminds us that: “Computing has constantly changed shape and location—mainly as a result of new technology, but often also because of shifts in demand.” We have seen revolutionary computing technologies – truly “game changing” concepts – come about roughly once each decade in the “modern era” of computing, dating from around 1945, when computing came to mean computations performed by a machine rather than by hand: from the mainframe era of the 1960s to the advent of minicomputers in the 1970s, the personal computer in the 1980s, the growth of the Internet and the Web in the 1990s, and the explosion of cell phones and other smart, Web-connected devices in the past decade.
Now, many think that cloud computing will be “the next big thing.” Indeed, Gartner believes that in the end, the impact of the cloud model will be “no less influential than e-business.” If industry analysts are correct, we thus stand at an inflection point – a true paradigm change – in the evolution of computing.
The basic idea behind cloud computing is that anything that could be done in computing – whether on an individual PC or in a corporate data center – from storing data to communicating via email to collaborating on documents or crunching numbers on large data sets – can be shifted to the cloud. As can be seen in Table 1, cloud computing encompasses a wide variety of offerings, including: SaaS (Software as a Service), PaaS (Platform as a Service), and IaaS (Infrastructure as a Service).
Table 1. Variants of Cloud Computing

SaaS (“Software as a Service”): Companies host applications in the cloud that many users access through Internet connections. The service being sold or offered is a complete end-user application.

PaaS (“Platform as a Service”): Developers can design, build, and test applications that run on the cloud provider’s infrastructure and then deliver those applications to end users from the provider’s servers.

IaaS (“Infrastructure as a Service”): System administrators obtain general processing, storage, database management, and other resources and applications through the network and pay only for what gets used.
Cloud computing has now become “shorthand” for the larger trend of computing services delivered over the Internet. From the perspective of the market analyst, IDC, cloud computing represents “an emerging IT development, deployment and delivery model, enabling real-time delivery of products, services and solutions over the Internet.” As one commentator recently characterized it: “Cloud computing — in which vast stores of information and processing resources can be tapped from afar, over the Internet, using a personal computer, cell phone or other device — holds great promise…to cut the costs, complexity and headaches of technology for companies and government agencies.”
Certainly, one of the hallmarks of cloud computing is that it enables users to interact with systems, data, and each other in a manner that minimizes concern about the underlying technology. According to the Cloud Computing Manifesto: “The key characteristics of the cloud are the ability to scale and provision computing power dynamically in a cost efficient way and the ability of the consumer (end user, organization or IT staff) to make the most of that power without having to manage the underlying complexity of the technology.”
The Economist captured the meaning of this trend in stating: “The plethora of devices wirelessly connected to the Internet will speed up a shift that is already under way: from a ‘device-centric’ to an ‘information-centric’ world….(and) as wireless technology gets better and cheaper, more and more different kinds of objects will connect directly to the cloud.” Technology guru Clay Shirky perhaps put it best when he said: “What is driving this shift is a change in perspective from seeing the computer as a box to seeing the computer as a door.” The emerging cloud computing paradigm is thus based on a “user-centric interface” that minimizes user concern over the supporting infrastructure.
How does this new, on-demand, information-centric model of computing fit in the world of higher education – and what does it entail for research, for collaboration and for communication in colleges and universities? This article examines the early evidence from the field and discusses the practical and institutional implications. It concludes with a Cloud Migration Strategy for college and university IT executives to follow as they seek to best integrate cloud computing into their overall IT strategies.
Cloud Computing in Universities Today
For universities, migrating to cloud-based services affords the ability to provide improved collaboration and research capabilities while, at the same time, offering an opportunity to cut IT costs and still deliver the same – or better – levels of computing service. With public and private institutions grappling with significant budget shortfalls and under pressure to pare overhead costs, cloud computing allows universities to draw on the resources of commercial cloud providers – many of which are available to them either for free or at reduced cost. With the cloud model, students and faculty can work and communicate from anywhere and on any device using cloud-based applications.
The benefits for higher education center upon the scalability and the economics of cloud computing. These will be discussed in subsequent sections.
Scalability of Resources
One of the most important impacts of cloud computing will be the notion of computing power on demand. One industry expert described this newfound power as what happens “when you radically democratize computing so that anyone has access at any moment to supercomputer-type capacity and all the data storage they need.” This “democratization” of computing processing and storage power could have profound implications for everything from scientific inquiry (by making no problem too big to compute) to new enterprise formation (by drastically reducing the need for upfront investment in IT resources – and the people to support and maintain them) to public agencies (by making IT more affordable and available to governments at all levels and in all locales). Thus, we may be entering a truly new era in which democratized computing technology helps to bring “the benefits of high-powered computers and communications to all.”
Cloud computing is a revolutionary concept in IT due to the unprecedented elasticity of resources made possible by the cloud model. In everyday use, elasticity is commonly thought of not just as the ability of an object to stretch out when needed, but also to contract as necessary (think of a rubber band or a bungee cord). In computing terms, elasticity can be defined as: “The ability of a system to dynamically acquire or release compute resources on-demand.” Under the cloud model, organizations that need more computing power can “scale up” resources on demand, without having to pay a premium for that ability. Say, for instance, that a researcher or a department has large, batch-oriented processing tasks. The individual or group can run the operations far faster than previously and at no additional cost, since using 1,000 servers for one hour costs no more than using one server for 1,000 hours. This unique attribute of cloud computing is commonly referred to as “cost associativity,” and it allows computational needs to be addressed far faster and far cheaper than in the past. In short, cloud computing gives organizations – even individual users – unprecedented scalability.
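The arithmetic behind cost associativity can be sketched in a few lines of code. This is an illustration only; the hourly rate below is a made-up figure, not any provider’s actual price:

```python
HOURLY_RATE = 0.10  # dollars per server-hour (hypothetical figure)

def cloud_cost(servers: int, hours: float) -> float:
    """Cost of renting `servers` machines for `hours` each.

    Under cost associativity, only the product servers * hours matters.
    """
    return servers * hours * HOURLY_RATE

# 1,000 servers for one hour costs the same as one server for 1,000 hours...
wide = cloud_cost(servers=1000, hours=1)
narrow = cloud_cost(servers=1, hours=1000)
assert wide == narrow  # ...but the wide run finishes roughly 1,000x sooner.
print(f"${wide:.2f} either way")
```

The practical upshot is that a researcher pays nothing extra for finishing a batch job in an hour instead of six weeks.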
Additionally, where in the past only the largest universities had supercomputing capabilities, cloud computing – with number-crunching capability available on an on-demand basis – allows researchers anywhere to scale their computing power to match the scale of their research questions, bringing supercomputing to the mainstream of research. As Delic and Walker recently characterized it, cloud computing might just “enable new insights into challenging engineering, medical and social problems,” as researchers will now have newfound capabilities “to tackle peta-scale type(s) of problems” and to “carry out mega-scale simulations.” Craig A. Stewart, Associate Dean for Research Technologies at Indiana University, recently remarked that with cloud computing, “You reduce the barrier to use advanced computing facilities.”
We have already seen this reduction of barriers paying dividends in research. At pharmaceutical giant Eli Lilly, researchers needed to queue their projects to run in Lilly’s internal data center, and provisioning enough server capacity for a given project often meant a delay of up to two months. Today, with cloud computing, research scientists can provision the necessary processing capacity for their projects in five minutes. This allows researchers at Lilly and other research organizations to crunch data and test theories in ways that might have gone unexplored in the prior era, when they depended solely on in-house computing resources. Similar experiences are being reported at universities, both in the U.S. and abroad. For instance, at the International Institute of Information Technology in Hyderabad, India, Associate Professor Vasudeva Varma reports that the ability to run more data from more experiments more quickly has resulted in more publications for faculty in the Institute’s Search and Information Extraction Lab, which he heads.
Economics of Computing
There is much discussion today about the whole concept of “free” pricing for products and services – and many of the email, storage, hosting, and application offerings at the forefront of cloud computing are indeed free. The most notable of these are the product offerings of Google (Gmail, Google Apps, Google Docs, and others). Much attention has been devoted to the concept of “freeconomics,” most notably in the recent book by Wired magazine editor Chris Anderson entitled Free: The Future of a Radical Price. Most consumer-level cloud offerings would be labeled “freemiums” – free versions supported by paid, premium versions. The freemium is an emerging business model, particularly popular among online service and software companies. And, when faced with competing against “free” alternatives, older, more established companies have seen users migrate to the gratis alternative. Indeed, some see an entire “culture of free” emerging, where from music to entertainment to news to software, people are coming to expect that free is the price they should pay.
In the corporate computing market, as software, hardware and processing power, and storage capacity become more and more commoditized, cloud computing becomes a free – or lower cost – alternative to the way things have been done for decades. As Gartner analyst Andrea DiMaio recently remarked: “Why should I bother looking for an email client to replace Outlook and coexist with my newly installed OpenOffice, if I can get email and office suite as a service with somebody like Google at a fraction of the cost and – most importantly – giving up the IT management burden too? Why are we talking about moving servers from Windows to Linux when the real question is why do we need to have our own servers in the first place?”
Already, many campuses have switched to Google- or Microsoft-hosted email. Google and Microsoft host email for over four thousand colleges and universities, not just in the U.S., but in over 80 countries worldwide. In fact, almost half of all campuses now make use of hosted email services. The switch to hosted services is paying significant dividends for the early adopting institutions. By switching to Gmail, Notre Dame reports that it saved .5 million in storage and other tech costs, while at the same time finding that students’ satisfaction with the campus’ email rose by over a third. Likewise, institutions such as Arizona State and Washington State consistently report at least six-figure annual savings from switching to Google- or Microsoft-hosted systems. Even more importantly, by switching to hosted email and productivity software, the job and focus of college IT staff can change. As Pepperdine University’s CIO Timothy Chester recently observed, his smaller IT staff can now be used more efficiently and be more productive: “We want our staff working more with students and faculty and less on the nuts and bolts of delivering technology.”
Certainly, with challenging budgetary times forecast to persist across higher education for at least the next few years, there will likely be even greater pressure on colleges and universities to replace “paid” software and computing resources with “free” or low-cost cloud alternatives. From the cloud provider standpoint, Google has stated that its incentive in providing such free services to universities is to create “relationships for life” with students and faculty.
Many in higher education now concur that cloud computing will be the model of the future for information technology delivery and utilization in colleges and universities. Across higher education, the cloud computing landscape should be quite active over the next few years, as both coordinated efforts and “rogue” operations test how and where cloud computing can be effectively applied. As we have seen, colleges and universities will in many instances lead the way, and they will continue to do so, based on their need for computing power on demand and for the types of ready – and in many cases free – IT resources they can provide to their faculty and students. With pressure to reduce the fixed costs of higher education – and IT being a very rich target – the shift to the cloud may in some cases be more forced than on-the-ground circumstances would dictate. Indeed, some of the most exciting uses and best practices for cloud computing could well come from the world of higher education.
We have seen predictions that, due to the cost and operational benefits of cloud computing, more and more companies will find themselves outsourcing most – if not all – of their IT to cloud providers, creating what have been termed “serverless organizations.” Indeed, it has been predicted that organizations of all sizes will find it beneficial to concentrate on and optimize their business processes by outsourcing the IT function. So, why not “serverless universities”? Outsourcing almost all IT and data storage/handling may prove a viable proposition for colleges and universities, particularly as cloud offerings expand and become more secure and reliable.
As we have seen in this article, there are certainly discussions and embryonic efforts underway – both in the U.S. and abroad – as public and private universities examine how best to make the cloud concept work for them and their students and faculty. Universities are beginning to work collaboratively in the cloud to pool their IT resources, as has already occurred in Virginia and North Carolina. In the Commonwealth, a dozen colleges and universities have come together to form the Virginia Virtual Computing Lab. Such efforts allow institutions to cut their IT costs by reducing their need for software licensing, for upgrade capabilities, and perhaps for maintaining their own data centers, all while improving the IT resources available to their faculty and students. Already, by shifting to cloud offerings, North Carolina State University has been able to dramatically lower expenditures on software licenses and, simultaneously, reduce the campus’ IT staff from 15 to 3 full-time employees.
Additionally, there have been calls for the federal government to take the lead to create a universal cloud computing environment, to be available for use by all colleges and universities nationwide. In doing so, proponents argue for the economic and educational benefits that such a resource would provide, as it would democratize computing technology and “level the playing field” so all students and faculty could have access to the scale and type of computing power enjoyed only by elite institutions.
A Cloud Migration Strategy for Higher Education
It is important to bear in mind that, as one commentator recently put it, “cloud computing is a tool, not a strategy.” IT leaders in higher education will thus be well advised to undertake a programmed assessment of how cloud computing can fit into their overall IT strategy, in support of the mission and overall strategy of their institution. This should take the form of a six-step process, which this author has labeled the Cloud Migration Strategy.
The Cloud Migration Strategy begins with learning about the basics of cloud computing – through attending seminars, networking, talking with vendors, and reading articles such as this one. Given that cloud computing represents a new paradigm in computing technology, it will be important for technology transfer to occur – the “techies” in and outside the institution will need to go the extra mile to educate and inform the “non-techies” among their ranks and constituencies as to the merits and value of cloud computing. It will be especially important to devote sufficient funding for research establishing how cloud computing is working – or not working – in various areas of the university and across institutions, so as to ground policies and develop best practices regarding the use of cloud computing.
Then, IT executives should conduct an honest assessment of their institution’s present IT needs, structure, and capacity utilization. In a cloud computing environment, where resources can be added – or subtracted – based on needs and demand, it will be critical for IT managers to candidly assess their institution’s IT baseline for faculty, students, and operations. In looking at data center utilization, it will be vital to identify which resources are used all the time and are necessary for day-to-day operations, in order to establish a baseline for internally hosted operations. Only then can one decide whether to continue to host “excess” capacity in the data center or to contract for cloud services as needed to scale up to meet demands for greater amounts of computing resources.
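As a rough sketch of what such a baseline assessment might look like in practice, the snippet below splits a set of utilization samples into a steady baseline and a burst component. The sample figures and the percentile heuristic are illustrative assumptions, not data or methodology from any actual campus:

```python
# Hypothetical hourly utilization samples (percent of data-center capacity);
# a real assessment would pull these from the campus monitoring logs.
samples = [35, 40, 38, 42, 95, 37, 41, 88, 39, 36, 92, 40]

def split_capacity(samples, steady_fraction=0.75):
    """Split demand into a baseline (kept in-house) and a burst component
    (a candidate for on-demand cloud capacity).

    The baseline is the utilization level that demand meets or exceeds at
    least `steady_fraction` of the time -- a simple percentile heuristic.
    """
    ordered = sorted(samples)
    idx = int(len(ordered) * (1 - steady_fraction))
    baseline = ordered[min(idx, len(ordered) - 1)]
    burst = max(ordered) - baseline
    return baseline, burst

base, burst = split_capacity(samples)
print(f"baseline to host internally: {base}%, burst to offload: {burst}%")
```

In this toy data set the steady demand sits near 40% of capacity while occasional spikes approach 95% – exactly the gap that on-demand cloud capacity is suited to absorb.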
University IT leaders should then pick one area – even one specific project – to “cloud pilot,” and assess their ability to manage and bring such a project to fruition. As with any new technology, we are seeing a great deal of pure experimentation with cloud computing – “science project”-like work, for the most part, up till now. All of us who use the Internet are experimenting with cloud applications in our daily lives – from Twittering to Gmail to using photo-sharing sites. In the same way, we are seeing organizations conducting cloud computing trials – what one writer termed “science experiments” in the use of the technology. Such efforts tend to sit far away from core IT operations, many times on (or trying to connect) the periphery of the organization. Often – even in the public sector, and especially on campuses – these experiments are “rogue” operations, taken on by individuals and units to test the utility of the technology. These are important efforts, and they should be supported – and reported within and outside the institution – so that others in IT and the wider community can learn of the successes – and the downsides – of operating in the clouds. Thus, it will be vitally important to share both “best practices” and “lessons learned” in cloud computing. Indeed, many predict that such “science projects” in large and small organizations will drive the eventual acceptance and adoption of cloud computing.
After the internal assessment and the external outreach stemming from the pilot effort, IT leaders should then conduct an overall cloud-readiness assessment: determine whether they have data and applications that could readily move to a cloud environment, decide whether a public, private, or hybrid cloud would be suitable for those purposes, and rank-order potential projects. Finally, it is time to begin a cloud rollout strategy – gaining buy-in from both institutional leadership and IT staffers and communicating with both internal and external stakeholders as to the goals, progress, and costs/benefits of each cloud project. This is where the cloud goes from being a test effort to becoming mainstream in the way the university manages its data, its operations, and its people. It becomes part of “normal” operations, just as prior tech innovations (from telephony to fax to the Internet to email to social media) have become IT tools used in support of the institution’s IT strategy and, more importantly, its overall strategy.
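One hedged way to picture the rank-ordering of potential cloud projects is a simple scoring rubric. The criteria, weights, and project names below are hypothetical examples invented for illustration, not a standard instrument:

```python
# Each candidate project is scored 1-10 on three illustrative criteria.
candidates = {
    "student email":       {"data_sensitivity": 2, "cloud_fit": 9, "savings": 8},
    "research batch jobs": {"data_sensitivity": 4, "cloud_fit": 8, "savings": 7},
    "student records":     {"data_sensitivity": 9, "cloud_fit": 5, "savings": 4},
}

def readiness(profile):
    """Higher cloud fit and savings raise the score; sensitive data lowers it."""
    return profile["cloud_fit"] + profile["savings"] - profile["data_sensitivity"]

ranked = sorted(candidates, key=lambda name: readiness(candidates[name]), reverse=True)
print(ranked)  # most cloud-ready project first
```

Under this toy rubric, low-sensitivity commodity services such as student email surface first – consistent with the pattern of hosted email leading campus cloud adoption.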
At this point, the process enters its final stage – call it “continuous cloud improvement” – in which the institution continues to move appropriate data and applications to the cloud – and perhaps even back from the cloud to internally hosted operations, if necessary – based on a thorough and continuous assessment of the appropriate use of cloud technologies for the particular university.
Implications for Higher Education
The shift to more cloud-based applications will indeed bring newfound capabilities to communicate, collaborate, and conduct research to university faculty, staff, and students. However, it will also necessitate a flurry of policy decisions and new operational rules. For instance, IT policy decisions will have to be made as to who can access which files and what type of access they will have (i.e., read-only or editing access). The shift will also require institutions to examine how they will secure and procure their computing environments in the cloud.
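A minimal sketch of such a file-access policy is shown below. The roles, file names, and permission labels are hypothetical examples, not the interface of any actual cloud product:

```python
# Hypothetical access-control list: file -> role -> granted permission.
ACL = {
    "grant-proposal.doc": {"faculty": "edit", "research_assistant": "edit", "student": "read"},
    "course-syllabus.doc": {"faculty": "edit", "student": "read"},
}

def can(role: str, filename: str, action: str) -> bool:
    """True if `role` may perform `action` ('read' or 'edit') on the file.

    Edit permission implies read permission; unknown files or roles get nothing.
    """
    granted = ACL.get(filename, {}).get(role)
    if granted is None:
        return False
    return granted == action or (granted == "edit" and action == "read")

print(can("student", "grant-proposal.doc", "read"))  # True
print(can("student", "grant-proposal.doc", "edit"))  # False
```

Even this toy version shows why such decisions are policy questions first and technology questions second: someone must decide which roles exist and what each may see.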
Indeed, one of the principal concerns about cloud computing is whether it is secure and reliable. Unfortunately, worries over cloud reliability and availability – or specifically, the lack thereof – are not just theoretical, as there have been well-publicized outages of many of the most popular public cloud services. And, as one industry analyst astutely pointed out, when cloud service outages or inaccessibility occur, “most of the risk and blame if something goes wrong will fall directly on the shoulders of IT — and not on the cloud computing service providers.”
Security concerns may indeed impede the shift to cloud-based models. As with prior shifts in information technology – the advent of the Internet and the Web, the introduction of e-mail, and the explosion of social media – growth and adoption rates have been slowed by initial fears, some justified and some quite unjustified, over security and the loss of control over data and operations. Certainly, privacy and security questions will need to be addressed as institutional data and applications move into a cloud environment. Indeed, analogies have been drawn between the advent of cloud computing today and the introduction of wireless technologies a decade ago. Finally, security is undoubtedly a hard metric to quantify. And, all too often, from the perspective of Bernard Golden and other observers, the IT community has a somewhat damaging tendency to treat all risks – whatever their real nature – as the very worst case scenario, without judging the true impact – and likelihood – of their occurrence.
Finally, universities’ often outdated and byzantine procurement rules and regulations, some of which may even preclude the use of cloud computing in select instances, will need to be changed to be more cloud-friendly and encourage the savings and efficiencies that can come from this new model of IT. There will also need to be changes made in not just the language, but in the mindset of contracting for computing services. For while IT administrators look at capacity and systems, end users look to performance. As Joab Jackson recently put it, the key metric will now become: “When I sit down at that computer, do I see the functionality I need?”
In time, we may look back on the latter portion of this first decade of the new millennium as a true turning point in the history of computing. The transition, however, will take years, perhaps even decades, and we are not yet close to the day when we will simply have computing at our fingertips. However, all signs point to a true, campus-led revolution in computing.
David C. Wyld (firstname.lastname@example.org) is the Robert Maurin Professor of Management at Southeastern Louisiana University in Hammond, Louisiana. He is a management consultant, researcher/writer, and executive educator. His blog, Wyld About Business, can be viewed at http://wyld-business.blogspot.com/. He also serves as the Director of the Reverse Auction Research Center (http://reverseauctionresearch.blogspot.com/), a hub of research and news in the expanding world of competitive bidding. Dr. Wyld also maintains compilations of works he has helped his students to turn into editorially-reviewed publications at the following sites:
Management Concepts (http://toptenmanagement.blogspot.com/)
Book Reviews (http://wyld-about-books.blogspot.com/) and
Travel and International Foods (http://wyld-about-food.blogspot.com/).