Consumerization is the reorientation of product and service designs to focus on (and market to) the end user as an individual consumer, in contrast with an earlier era of organization-oriented offerings designed solely for business-to-business or business-to-government sales. Technologies whose first commercialization was at the inter-organizational level thus have potential for later consumerization.

The emergence of the individual consumer as the primary driver of product and service design is most commonly associated with the IT industry, as large business and government organizations dominated the early decades of computer usage and development. The microcomputer revolution, in which electronic computing moved from exclusively enterprise and government use to include personal computing, is thus a cardinal example of consumerization. Many other technology-based products, such as calculators and mobile phones, also had their origins in business markets, and only over time did they become dominated by high-volume consumer usage as these products commoditized and prices fell. An example of enterprise software that became consumer software is optical character recognition software, which originated with banks and postal systems (to automate cheque clearing and mail sorting) but eventually became personal productivity software.

In a different sense, consumerization of IT is the proliferation of personally owned IT in the workplace (in addition to, or even instead of, company-owned IT): devices and services that originate in the consumer market but are used for professional purposes. This bring-your-own-device (BYOD) trend has significantly changed corporate IT policies, as employees now often use their own laptops, netbooks, tablets, and smartphones on the hardware side, and social media, web conferencing, cloud storage, and software as a service on the software side.
Consumerization has existed for many decades; the consumerization of refrigeration, for example, occurred from the 1910s through the 1950s. The consumerization of IT is believed to have been first regularly called by that term by Douglas Neal and John Taylor of the Leading Edge Forum in 2001; the first known published paper on the topic appeared from the LEF in June 2004.[1] The term is now used widely throughout the IT industry and is the topic of numerous conferences and articles. One of the first mainstream articles was a special insert in "The Economist" magazine on October 8, 2011.[2] The term "consumerization of IT" has since been used ambiguously; in an effort to structure its amorphous nature, researchers have suggested taking three distinct perspectives: individual, organizational, and market.[3]
The technology behind the consumerization of computing can be traced to the development of eight-bit, general-purpose microprocessors in the early 1970s and, eventually, the personal computer in the late 1970s and early 1980s. The microcomputer revolution, in which electronic computing moved from exclusively enterprise and government use to include personal computing, is thus the cardinal example of consumerization. Significantly, however, the great success of the IBM PC in the first half of the 1980s was driven primarily by business markets. Business preeminence continued during the late 1980s and early 1990s with the rise of the Microsoft Windows PC platform. Meanwhile, other technology-based products, such as calculators, fax machines, and mobile phones, also had their origins in business markets, and only over time did they become dominated by high-volume consumer usage as these products commoditized and prices fell.
It was the growth of the World Wide Web in the mid-1990s that began to reverse this pattern. In particular, the rise of free, advertising-based services such as email and search from companies such as Hotmail and Yahoo began to establish the idea that consumer IT offerings based on a simple Internet browser were often viable alternatives to traditional business computing approaches. It has also been argued that the consumerization of IT embodies more than the diffusion of consumer IT; it also presents an opportunity for considerable productivity gains. It "reflects how enterprises will be affected by, and can take advantage of, new technologies and models that originate and develop in the consumer space, rather than in the enterprise IT sector".[4]
The primary impact of consumerization is that it is forcing businesses, especially large enterprises, to rethink the way they procure and manage IT equipment and services. Historically, central IT organizations controlled the great majority of IT usage within their firms, choosing or at least approving the systems and services that employees used. Consumerization enables alternative approaches. Today, employees and departments are becoming increasingly self-sufficient in meeting their IT needs. Products have become easier to use, and cloud-based, software-as-a-service offerings are addressing an ever-widening range of business needs, including video conferencing, digital imaging, business collaboration, sales force support, and systems backup.
Similarly, there is increasing interest in so-called bring-your-own-device (BYOD) strategies, in which individual employees choose, and often own, the computers and smartphones they use at work. The Apple iPhone and iPad have been particularly important in this regard: both products were designed for individual consumers, but their appeal in the workplace has been great. They have demonstrated that choice, style, and entertainment are now critical dimensions of the computer industry that businesses cannot ignore.
Equally important, large enterprises have become increasingly dependent upon consumerized services such as search, mapping, and social media. The capabilities of firms such as Google, Facebook, and Twitter are now essential components of many firms' marketing strategies. One of the most important consumerization questions going forward is to what extent such advertising-based services will spread into major corporate applications such as email, customer relationship management (CRM), and intranets.
One of the more serious negative implications of consumerization is that security controls have been adopted more slowly in the consumer space. As a result, there is increased risk to the information assets accessed through these less trustworthy consumerized devices. In a CSO Online article, Joan Goodchild reported on a survey that found that, when asked about the greatest barriers to enabling employees to use personal devices at work, 83 percent of IT respondents cited "security concerns".[5] This shortcoming may be remedied by chip manufacturers with technologies such as Intel's Trusted Execution Technology[6] and ARM's TrustZone,[7] which are designed to increase the trustworthiness of both enterprise and consumer devices.
In addition to the mass market changes above, consumer markets are now changing large-scale computing as well. The giant data centers that have been and are being built by firms such as Google, Apple, and Amazon are far larger and generally much more efficient than the data centers used by most large enterprises. For example, Google is said to support over 300 million Gmail accounts while executing more than 1 billion searches per day.
Supporting these consumer-driven volumes requires new levels of efficiency and scale, and this is transforming many traditional data center approaches and practices. Among the major changes are a reliance on low-cost commodity servers, N+1 system redundancy, and largely unmanned data center operations. The associated software innovations are equally important in areas such as algorithms, artificial intelligence, and big data. In this sense, consumerization seems likely to transform much of the overall computing stack, from individual devices to many of the most demanding large-scale challenges.