An umbrella term for the second wave of the World Wide Web. The term was coined at a 2004 conference on the subject hosted by O'Reilly Media and CMP Media (which later took the name of its parent, United Business Media). Sometimes called the "New Internet" as well as "Internet 2.0," Web 2.0 is not a specific technology; rather, it refers to two major paradigm shifts. The one most often touted, "user-generated content," relates chiefly to individuals. The second, equally significant but more related to business, is "cloud computing."
#1 - The User Rules!
User-generated content, exemplified by Wikipedia, Facebook, Twitter and myriad blogs, lets everyone have their say on anything and publish it to the world at large. People can easily create a blog or personal Web page and upload their own opinions, audio and video. Users are augmenting the news, sometimes reporting current events faster than the professional news media and with details the professionals overlook or ignore.
Although millions of opinions and videos, often very amateurish, only add to our information overload, a significant advantage of user-generated content is that truly talented authors, artists, musicians and moviemakers can gain an audience much more easily than they could in the past. Word-of-mouth via the Internet is worth a fortune in promotion. Web 2.0 is leveling the playing field in all arenas just as the PC leveled the playing field in business. See Mobile 2.0, hot topics and trends, social networking site and user reviews.
#2 - Cloud Computing
In cloud computing, data and applications are stored on Web servers, and a user has access from any computer via a Web browser. Cloud computing turns the Web into a gigantic application server that is slowly but surely supplanting locally installed office applications. Many believe this particular aspect of cloud computing is the ultimate manifestation of Web 2.0. Another aspect of cloud computing relates to developers and Web publishers (see cloud computing).
Cloud services are having a significant impact on the type of personal computers people choose. As more software is executed from scripts embedded in Web pages, the underlying CPU chip and operating system become less relevant. Web browsers interpret scripts the same way regardless of the hardware and software environment they run in (most of the time, that is). For example, in 2007, Google combined several of its office applications into Standard and Premium Editions, the latter a paid service with tech support. Because of Google's influence, this was a watershed event for cloud computing (see Google Apps). See ASP, Web application, thin client and Enterprise 2.0.
What Caused Web 2.0?
Bandwidth and power. Faster than the very costly T1 lines used in the enterprise, cable, DSL and FiOS hookups have extended high-speed connections to individuals and small businesses. Browsing Web pages full of images and video as well as downloading huge video files have become routine.
In addition, entry-level computers became powerful enough to execute scripts in an HTML page without noticeable delays. Combined with refinements in Web programming (see AJAX), the Web has become a transparent extension of an individual's PC, just as local area networks (LANs) extended the user's computing resources inside the enterprise in the 1980s and 1990s.
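The AJAX refinement mentioned above can be sketched as follows: instead of reloading a whole page, a script in the page requests just the data it needs in the background and patches the result into the document. The /api/quote endpoint and its fields are hypothetical, and renderQuote is a helper invented for this sketch.

```javascript
// Minimal AJAX sketch: request data in the background, update part of
// the page. No full page reload takes place.

// Pure step: turn a JSON response into the HTML fragment to display.
// (The field names here are assumptions for the sketch.)
function renderQuote(quote) {
  return `<span class="price">${quote.symbol}: $${quote.price.toFixed(2)}</span>`;
}

// Browser step: fetch() is the modern successor to the XMLHttpRequest
// object that gave AJAX its name. The endpoint is hypothetical.
async function refreshQuote() {
  const response = await fetch("/api/quote?symbol=ACME");
  const quote = await response.json();
  document.getElementById("quote").innerHTML = renderQuote(quote);
}
```

Because the browser interprets this script the same way on any operating system, the page behaves like a small local application wherever it runs, which is what the paragraph above means by the Web becoming a transparent extension of the PC.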
In the mid-1990s, the Web began (Web 1.0) as a repository of information and static content. Within a couple of years, a huge amount of content was dynamic, returning custom results to users. By the turn of the century, the Web had become much more interactive (call it Web 1.5), allowing users to play, stop, rewind and fast forward through audio and video content. With Web 2.0, Web-based apps feel like local applications and run just as smoothly. See Web 3.0.