Suddenly it’s OK to talk about the Internet again. Much of the discussion has centered on stock prices: The value of the Philadelphia Internet Index has doubled since it hit a low in October 2002. But a far more interesting topic is: what’s next? The short answer is “a lot.” John R. Patrick, formerly IBM’s chief Internet technology officer, estimates that the Internet revolution is less than 3% complete.
What will distinguish this next generation from the last? Anyone foolish enough to write about the future of something as dynamic and unpredictable as the Internet does well to review past predictions and current trends before embarking on any personal flights of fantasy. Helpfully, Fortune magazine published a special Fall 2000 edition on “The Future of the Internet.” The cover photo, featuring Ben Affleck and his Internet venture, captured the general mindless excitement of the time. There were, however, some cautionary words from Fortune surrounding its theme that “broadband would change everything.” Broadband was labeled “the trillion-dollar bet,” and skeptics were quoted as saying that there were no compelling broadband applications. The article cited a Vertical Systems Group prediction that 15% of U.S. households would have a broadband connection by 2004. The four technologies identified as shaping the next-generation Web were voice browsers, Bluetooth, peer-to-peer networking and XML.
We know that Mr. Affleck didn’t quit his day job, but how have the other predictions turned out? Those making “the big bet” generally lost, and broadband hasn’t yet changed the Internet very much. The Vertical Systems Group prediction was inaccurate as well. It was too conservative! About 15% of U.S. households had a broadband connection at the end of 2002, with the number expected to rise to 19% by the end of 2003.
It might also be argued that correctly identifying two out of four major new technologies (XML and peer-to-peer) isn’t bad, especially if we count Wi-Fi and other forms of wireless Internet alongside Bluetooth as enabling technologies. So for all the hype, the underlying trends Fortune identified to support the growth of second-generation Internet businesses weren’t wrong, or even very late.
In terms of consumer behavior, however, broadband seems to have changed very little. The most recent UCLA Internet Report, “Surveying the Digital Future,” details Internet usage: The percentage of Americans using the Internet increased to 72.3% in 2001 from 66.9% in 2000, and they spent an average of 9.8 hours on the Net per week in 2001 vs. 9.4 hours in 2000. Most interesting, however, was that there was almost no difference in the activities of Internet users with five or more years of experience vs. those with less than one year of experience. Web browsing and email still dominated for both groups, despite the fact that experienced users were on the Internet almost twice as much as new users (12 hours per week vs. 6 hours). Unfortunately, the study did not break out activities by connection type, modem vs. broadband. Assuming that broadband was much more prevalent among experienced users, it had almost no observable impact.
It looks like the Internet has been used to date by consumers primarily as a glorified teletype and search engine, or, some say, TV, since TV viewing goes down as Internet use goes up. Most projected broadband applications seem aimed at bringing one or another variant of pay-TV or video games to the consumer. So, have we already seen the future of the Internet, and is it TV?
In terms of enabling any mass wave of new, entrepreneurial companies to profit from the consumer in the Internet’s next generation, the answer is probably yes. This could be interpreted as bad news since historically the combination of the entertainment business and venture capital has hardly been synonymous with successful outcomes.
Yet, the future is not bleak since, as Lou Gerstner once said, “The Internet is about business.” This is certainly true for venture investing. Like all financial bubbles, the Internet bubble began with a basis in reality. People instinctively understood how the Internet could increase the efficiency of business. The financial payoff from successfully implementing these plans was judged by the market to be huge. These visions were and are attainable, just not as quickly or as easily as the ever-increasing prices of Internet stocks once foretold.
So why is the business landscape littered with B-to-B failures? Many of the B-to-B exchanges failed because they did not understand the complexity of old business practices and the resistance to change from the major industry players. Many of the application service providers (ASPs) were simply resellers of other people’s software. What these and numerous other B-to-B failures had in common was that they relied almost exclusively on the Internet as a communications network. We now know that simply providing a connection to the Internet doesn’t add enough value to support a viable business model.
The Internet is a business enabler, not a business raison d’être. ADP Corporation, a payroll processor and the world’s most successful “ASP” even before the Internet, adds value by navigating the arcane cash-collection and reporting procedures of various taxing authorities and by implicitly assuring its customers that they will not run afoul of them. Knovel Corp. (a Milestone portfolio company) provides scientific and engineering information to its customers via the Internet. Its value-add lies in its aggregation of information sources and editorial selection, in combination with software tools to ensure that its customers can find answers more quickly and use data more effectively. Its value is in enhanced productivity, and only secondarily in providing access to information over the Internet. The first lesson for next-generation Internet investments is that they must add significant value beyond the Internet.
The biggest stumbling block to implementing many of the legitimate Internet B-to-B business plans, as well as intra-corporate efficiency initiatives, was and is the problem of integration. The continuing proliferation of computing/communication devices, whether PCs, PDAs or cell phones, and the onset of peer-to-peer and its more sophisticated kin, grid computing, compound the integration problem. XML and Web services are “solutions” to the integration problem, ergo they gotta be big!
The fate of ASPs ought to instill some caution regarding the opportunities for Web services software components. Yes, Microsoft is pushing .Net, but the suspicion grows that this has more to do with pushing its customers toward a subscription business model than with customer benefits. Yet, it is likely that third-party provision of computer “services” via the Web will be a major growth area. The reason lies less in software applications than in storage or information management.
The seemingly endless development of new computing/communication devices also means that an ever-increasing amount of data will have to be collected, sorted and synchronized, properly filed and protected, and then retrieved. Trying to manage data on each device separately, or with the PC as the hub, and then integrating the result will not compute. This is especially true since the data will often have to be shared collaboratively with co-workers and business partners, both within and without the enterprise. This data will need to be collected in real time via the Internet in one virtual location, separate from the devices, and processed, at least preliminarily. Establishing and managing geographically large, heterogeneous networks where numerous users interact with the data is not easy or cheap. Third parties will most likely do this, except for very large organizations. The integration and management of what is becoming a constant data stream will be a major growth area. Many of these services will be provided as “collaborative platforms” or applications. The success of Salesforce.com, as differentiated from other ASPs, is attributable to the network and data management problems inherent in dealing with the sales function.
XML will also help with the integration problem; anything that helps data interchange and understanding is good, especially if it is based on an independent standard. But XML itself is not a solution. To quote John Patrick again, “Once information is properly tagged [with XML], you will be able to find what you are looking for with much greater speed and precision.” But who would put XML headers on the phone calls they get? Some software program is going to have to do this. And this is only an instance of the larger problem of properly storing and finding data that will have to be solved. Addressing it successfully is likely to involve natural language understanding, which will be a problem, since there has probably been less progress in this field of computer science than in any other over the past 50 years.
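To make Patrick’s point concrete, here is a minimal sketch (in Python, using the standard-library XML parser; the element names and data are illustrative inventions, not taken from any real product) of why tagged data can be queried with precision rather than searched as raw text:

```python
# Minimal sketch: once data carries XML tags, a query can name the exact
# field it wants instead of pattern-matching free text.
# Element names (catalog, item, melting_point) are illustrative only.
import xml.etree.ElementTree as ET

raw = """
<catalog>
  <item name="copper"><melting_point unit="C">1085</melting_point></item>
  <item name="iron"><melting_point unit="C">1538</melting_point></item>
</catalog>
"""

root = ET.fromstring(raw)

# With tags, finding iron's melting point is one precise query.
iron = root.find("./item[@name='iron']/melting_point")
print(iron.text, iron.get("unit"))  # -> 1538 C
```

The catch, of course, is exactly the one the paragraph above raises: someone, or some program, has to put the tags there in the first place.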
Achieving integrated, high-performance systems at Internet scale will require security, management and trust-certification solutions that do not now exist. These issues are grouped together since it’s clear that they, and the systems that will have to deal with them, are interconnected. Hackers are bad, but 80% of computer security issues involve employees. Who can get access to what level of information, both within and without the enterprise, and how to make sure that the person you’re dealing with is in fact who you think it is: these will be persistent questions. Trust continues to be a major problem both for the enterprise and the consumer.
What do all of these problems have in common? Most importantly, solving them will require that the Internet become an intelligent network. This means not limiting its intelligence to routing and reassembling the packets of digital bits sent over the Internet, but actually using the massive computing power inherent in the Internet to begin to process and understand the data being transmitted and take some preliminary actions based on this understanding. It will require going beyond just using the Internet as a giant and improved communications network and toward the vision that Tim Berners-Lee, considered the father of the Web, calls the “Semantic Web.”
The difference between the current generation of the Internet and the next will not be broadband, but rather that it will be rudimentarily intelligent, able to take some action on the data being transmitted. Its “intelligence” will be limited, but sufficient to have systems that begin to realize the potential of the Internet forecast by the bubble. This will be a long, slow process, but one with plenty of opportunity for new and innovative solutions and companies.
Richard J. Dumler is general partner with Milestone Venture Partners, a New York-based venture capital firm focusing on early-stage, enterprise information technology companies.