Vision 2020: Why The Present Is Not What It Used To Be

The year 2020 has been featured in many predictions and long-term visions in the past, implying not only the terminal point for the forecast or planning period but also a crystal-clear crystal ball. Now that the year 2020 is our present, we can clearly see where these prognostications went wrong and try to understand why they were so cloudy.


3 principles of cloudy crystal balls


A case in point is the book Vision 2020 by Stan Davis and Bill Davidson, published in 1991. The authors were very confident in their present and future observations: “Everybody is aware that we are now in what is generally known as the information economy. This current economy will come to an end in the 2020s, about seven decades after it began… Today, information-based enhancements have become the main avenue to revitalize mature businesses and to transform them into new ones… the information economy will go through its entire lifecycle in less time than the industrial economy. It is currently in early middle age … 2010, plus or minus a decade, will mark the three-quarter point for the present economy.”

“Smart toilets” were one of the prime examples for Davis and Davidson of the “informationalizing” of existing products and businesses, the main basis for competition and innovation in the mature phase of the information economy. The term “internet” does not appear in the book’s index, nor in their detailed list of federal government investments in infrastructure.

The internet was launched in 1969. The internet economy, which created and/or bolstered the five most valuable US companies today (with two internet-born Chinese companies not far behind), was launched in 1995. The first internet search engine, Archie, and the first website were launched in 1990. In 1991, the Gopher protocol, designed for distributing, searching, and retrieving documents over the Internet, was released; Tim Berners-Lee published the code for the World Wide Web on the Internet; and the first US-based website went live.

Davis and Davidson had a lot of company in 1991. As a market researcher (and forecaster) for Digital Equipment Corporation at the time, I read all the reports issued by analysts and forecasters and researchers and went to all the right forward-looking conferences. The internet was never mentioned until about 1993 or 1994. Then of course, we had a sudden avalanche of reports and forecasts and analyses (i.e., lots of made-up numbers) about the “New Economy” (later known as the “dot-com bubble”).


I also didn’t see what was coming. In my DEC reports, I predicted that the 1990s would give rise to “a distributed network of data centers, servers and desktop devices.” This was an accurate prediction, except that the internet was what made this “distributed network” possible, and I never mentioned it. Why pay attention to an obscure network, which I used a few times to respond to questions about my reports from some geeks at places with unfamiliar names, when Digital had at the time the largest private network in the world, Easynet, and more than 10,000 communities of VAX Notes (electronic bulletin boards with which DEC employees – and authorized partners and customers and friendly geeks – collaborated and shared information)?


The future, we all agreed—at DEC, and IBM, and AT&T, and all the other large tech companies at the time, and all the analysts and forecasters following them—belonged to “robust,” commercial, closed and proprietary, existing networks.

That’s the first principle of cloudy crystal balls: We extrapolate from the present and, often, predict a desired future (as in extensions of existing business models). 

A great example of this principle is another one of the “future visions” produced by tech companies that were so popular in the late 1980s and early 1990s. It was unique in showing the future from the point of view of a knowledge worker and highlighting knowledge applications. But like the ones from DEC and IBM and AT&T, it promoted the agenda and business focus of the company producing it. This 1987 “concept video” was Apple’s Knowledge Navigator, the brainchild of master marketer John Sculley, Apple’s CEO at the time, who used it as a motivational and recruiting tool.

Even with their different emphases, all of these late 1980s and early 1990s predictions basically shared the same view of the future, of “let’s-use-a-heavy-duty-access-device-to-find-or-get-costly-information-from-centralized-databases-running-on-top-of-an-expensive-network.” This vision was thwarted by one man, Tim Berners-Lee, and his 1989 invention, the World Wide Web. Originally developed as three software standards running on top of the internet, it rapidly exploded into a vast global, commercial market, with billions of customers, totally eclipsing the “enterprise IT,” business-to-business market, the only meaningful IT market until the early 2000s and the IT market at the center of previous future visions.

Which brings us to the second principle of cloudy crystal balls: We focus on the end-result, not on how we may get there. Most of these pre-Web predictions got some of the details right, even if their timing was off, but they failed to envision the road that led to many of the new uses and devices they foresaw, the road paved by the Web.

Which meant Amazon, and Google, and Facebook, and billions of new users, and a myriad of new uses for “IT.” It also meant data overload, a term I used in 1991 when I made this prediction: “With the destruction of both human and systems barriers to access, users may find themselves facing an overwhelming amount of data, without any means of sorting it and capturing only what they need at a particular point in time.” I’d like to think that I predicted Google in the next sentence—“It is the means of sorting through the data that carry the potential for true Enterprise Integration in the 1990s”—but the truth is that there was no Web and there were no consumers in my enterprise-focused predictions.

Speaking of data and the data explosion, that is another dimension of the road not taken, of focusing on the end-result, in many predictions—ignoring an underlying trend, the most significant trend driving all others. As I wrote last year, “the most astute and influential observers of the tech landscape have been counting and reporting on the number of devices, the number of users, the volume of eCommerce, the number of online ads, the number of apps, the number of images and videos, and so on. What’s largely been missing is data on data, on what drives the growth of all of these digital entities and what drives new businesses and business models and innovation and change.” Data has been eating the world, and the world has been shifting from analog to digital, from just about when Vision 2020 was published, but data has been viewed (if viewed at all) as a by-product of other tech developments.


On the contrary, data has always been the cause, not the effect. If you ignore it, you ignore important questions: why people buy or invest in tech, what the demand is for tech tools and applications, who is producing and consuming data, and how data serves as the catalyst for business and market innovations.

Shifting the focus of predictions from tools to data brings us closer to what matters most in the evolution of technology: People. And if you bring people into your predictions, you must understand the social aspects of technology adoption. This is the third principle of cloudy crystal balls: We ignore humans—their motivations, aspirations, attitudes—and we ignore how society works.

The best example of such a failure I’m familiar with is a report published in 1976 by the Long-Range Planning Service of the Stanford Research Institute (SRI). The manager of 1985, the report predicted, will not have a personal secretary. Instead he (decidedly not she) will be assisted, along with other managers, by a centralized pool of assistants (decidedly and exclusively, according to the report, of the female persuasion). He will contact the “administrative support center” whenever he needs to dictate a memo, find a document (helped by an “information storage/retrieval specialist”), or rely on an “administrative support specialist” to help him make decisions.

Unlike many similar forecasts, this report does consider sociological factors, in addition to organizational, economic, and technological trends. But it could only see what was in the air at the time—the “women’s liberation” movement—and it completely missed its future implications: “Working women are demanding and receiving increased responsibility, fulfillment, and opportunities for advancement. The secretarial position as it exists today is under fire because it usually lacks responsibility and advancement potential… In the automated office of the future, repetitious and dull work is expected to be handled by personnel with minimal education and training. Secretaries will, in effect, become administrative specialists, relieving the manager they support of a considerable volume of work.”

Despite the women’s liberation movement of his day, the author could not see beyond the creation of a two-tier system in which some women would continue to perform dull and unchallenging tasks, while other women would be “liberated” into a fulfilling new job category of “administrative support specialist.” In this 1976 forecast, there are no women managers.


But this is not the only sociological factor the report missed. The most interesting sociological revolution of the office in the 1980s – and one missing from most accounts of the PC revolution – is what managers (male and female) did with their new word processing, communicating, calculating machine. They took over some of the “dull” secretarial tasks that no self-respecting manager would deign to perform before the 1980s.

This was the real revolution: The typing of memos (later emails), the filing of documents, the creation of more and more data in digital form. In short, a large part of the management of office information, previously exclusively in the hands of secretaries, became in the 1980s (and progressively more so in the 1990s and beyond) an integral part of managerial work.

It was a question of status. No manager would type before the 1980s because it was perceived as work that was not commensurate with his status. Many managers started to type in the 1980s because now they could do it with a new “cool” tool, the PC, which conferred on them the leading-edge, high-status image of this new technology. What mattered was that you were important enough to have one of these cool things, not that you performed with it tasks that were considered beneath you just a few years before.

Status matters. People matter. What we do with technology—and why and how—is more important than the technology itself. It’s difficult to see that when you focus on technology, on its made-up “laws” (as in Moore’s Law—which Moore himself never called a “law”), on some deterministic, must-happen technological trajectory. This widely shared perspective is driven by “paradigms” and “models,” more often than not shaped by the historical-materialism-influenced education many forecasters get (or simply by the comforting notion that the world can be predicted because it follows a specific pattern). What tripped Davis and Davidson into believing that the information economy was already at its “mature stage” in 1991 was the concept of the s-curve, which all “economies” (they believed) must follow.
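The s-curve they had in mind is conventionally modeled as a logistic function. A minimal sketch (the symbols are generic illustrations, not taken from Vision 2020):

```latex
% Logistic (s-curve) model of a lifecycle:
%   L   -- saturation level
%   k   -- growth rate
%   t_0 -- inflection point (midpoint of the curve)
\[
  f(t) = \frac{L}{1 + e^{-k (t - t_0)}}
\]
% The "three-quarter point" of the curve, where f(t) = 0.75 L,
% falls at:
\[
  t_{3/4} = t_0 + \frac{\ln 3}{k}
\]
```

This is what makes such forecasts seductive and fragile at once: declare a starting decade and a three-quarter point (as Davis and Davidson did with the 1950s and 2010), and the midpoint, growth rate, and endpoint of the entire “economy” follow mechanically. The forecast is only as good as the assumption that the curve applies at all.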

A big part of ignoring how society works is this adherence to the comfortable (and comforting) notion that history unfolds inexorably according to universal laws. That was the reason for the popularity, also in 1991, of the ridiculous notion of “the end of history,” that we have reached the universalization of Western liberal democracy as the final form of human government.

That is also the reason that in 1991, a dozen years after the start of China’s “reform and opening-up,” the re-orientation of more than a billion people toward economic competition, no one predicted that in 2020, China would be the second largest economy in the world.
