What is a Content Management System?
Simply put, a CMS is a complete computer system that manages information. A CMS can be programmed in any computer language and run on any computer system. It allows data to be input, stored in a database, edited by authorized users, and displayed to the public. A good CMS handles every aspect of formatting, storing, cataloguing, and retrieving data, which means that editors and users need no technical expertise and no special programs. CMSs are used both professionally (by newspapers, libraries, online stores, and academic journals) and non-professionally, for personal applications like online photo albums, diaries, and music collections.
The best way to explain what a CMS does is to summarize the way it's being used to upgrade the Bahá'í Library Online [note: as of August 2003, when the system was under initial construction. -J.W.]. I'll describe the painstaking, slow old way and the fairly error-free, infinitely-expandable, multi-language new way. Note that all the points in the old method start with "I", but the new method is done by "Users" -- turning the Bahá'í Library into a CMS thus weans it off of a single editor and opens it to the world.

The old way (a partial summary):
Pros: full control over formatting and cataloguing
Cons: only works well with sites of a few hundred files (the Bahá'í Library now has 18,000 files); the site ends up full of errors, dead links, outdated content, and limited cross-referencing, and becomes wholly unmanageable for an individual or even a small team of individuals.
The new way:
Pros: Anyone can register, upload content, correct errors, and add cross-references; the site is easy to navigate and search; content is always up-to-date; the site can and soon will house millions of files; the site can handle documents in any language
Cons: Limited flexibility; significant up-front time requirements
How's it work?
Easy to explain. First, understand that there are two computers involved in any internet transaction: the home user's "client" computer, which sits on her desk at home or office, and the website's "server" computer, which can be anywhere in the world (and often is). The client-side computer is called the "front end" and the server-side computer is called the "back end".
In the old way of doing things, a website editor has to run a bunch of programs on her home computer in order to create a webpage. She has to use one program to edit photos, another program to convert documents to html, another program to upload them, yet another program (an internet browser) to view them online, and a handful of other specialized programs for various webmastering tasks. This requires installing a lot of expensive software on one's own computer, learning how to use it, and keeping up with regular upgrades. Other than storing the files, the server computer does almost nothing -- all the work is done on the client-side computer.
With a CMS, the only program a user needs is an internet browser like Internet Explorer or Netscape; everything else is done on the server. After preparing the HTML, an editor uses the browser to upload and edit documents, and the server software takes care of the rest. In other words, it's almost a complete reversal of the old method! This puts all the burden on the website programmer and relieves the home user of any need for technical knowledge.

Here's the actual process:
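The division of labor described above can be sketched in a few lines. The Bahá'í Library's actual system is PHP and MySQL; purely as a hypothetical illustration (all names below are invented), here is what a server-side upload handler boils down to, written in Python with an in-memory SQLite database:

```python
import sqlite3

# Hypothetical sketch: the browser submits a simple form, and everything
# below runs on the server -- the user needs nothing but a web browser.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE documents (title TEXT, body TEXT)")

def handle_upload(form):
    """Store a submitted document in the database (all server-side)."""
    db.execute("INSERT INTO documents (title, body) VALUES (?, ?)",
               (form["title"], form["body"]))
    db.commit()
    return "Saved: " + form["title"]

# A browser form submission reaches the server as plain key/value pairs:
print(handle_upload({"title": "My Essay", "body": "Some text."}))
```

The point of the sketch is that no client-side software appears anywhere: the home computer only fills in a form, and the server does the storing, formatting, and cataloguing.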
Content Management Systems represent a wholly new paradigm for using computers. Home users have taken two decades to get used to the idea of a file system, where data is stored in discrete documents, each with a name and a location in a directory. This paradigm was intended to be a close facsimile of a real office, with desktops, files, folders, and so forth. In a physics analogy, this is a Newtonian virtual world.
CMSs will lead users to accept a new paradigm, one in which the filesystem is discarded in favor of a database backend, interpreted by a language like PHP, with the HTML interface created on-the-fly. In this paradigm data is nebulous and without fixed form. In our physics analogy it is a quantum reality, where content is only given shape in the process of interacting with it. It is thus an extremely flexible virtual world, where structure is created by interpretation and can be infinitely expanded, shaped, and delivered.
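To make the "quantum" paradigm concrete: in a database-backed site, a page has no file of its own; it is assembled the moment someone asks for it. The Bahá'í Library does this with PHP and MySQL, but purely as an illustrative sketch (names invented), the same idea in Python with SQLite looks like this:

```python
import sqlite3

# Content is rows in a database, not .html files in a directory.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE documents (id INTEGER PRIMARY KEY, title TEXT, body TEXT)")
db.execute("INSERT INTO documents (title, body) VALUES (?, ?)",
           ("Sample Document", "This page exists only while it is being rendered."))

def render_page(doc_id):
    """Create the HTML interface on the fly from a database row."""
    title, body = db.execute(
        "SELECT title, body FROM documents WHERE id = ?", (doc_id,)).fetchone()
    return (f"<html><head><title>{title}</title></head>"
            f"<body><h1>{title}</h1><p>{body}</p></body></html>")

print(render_page(1))
```

The HTML string is discarded after it is sent; change the database row, and every future rendering of "the page" changes with it.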
History of CMSs
CMSs as described above have existed for at least a decade, and were first created as custom-coded internal computer networks for libraries, newspapers, and other information-oriented sites. Now the term is used almost exclusively to refer to web-based systems.
1992-95: In the early days of the web, CMSs were employed only by big-budget websites such as Amazon.com. They were programmed mostly in C++ and were run on commercial databases like Oracle. Set-up costs ran into the millions of dollars.
1995: In 1995 the first free, or "Open-Source", CMS was created for the Portland Pattern Repository. The programmer named his system "Wiki Wiki," Hawaiian slang for "quick and easy". He credited the seed for the system to Apple's Hypercard, which was released in the mid-1980s and has been said to be the first true CMS.
1997: PHP, a dynamic HTML engine first created in 1995, was retooled and released to instant acclaim for its versatility, stability, and ease of use. The acronym "PHP" originally stood for "Personal Home Page" but was rechristened "PHP: Hypertext Preprocessor" (a recursive acronym).
1995-1999: Thousands of Perl-based Wiki sites appeared around the internet, each devoted to highly specific areas of interest and each with numerous dedicated contributors. A number of PHP-based CMSs also appeared, such as PhpNuke and PostNuke, and a combination of the two called PhpWiki (which is in use at bahai9.com). However, they were largely of interest only to techies and not in common use. Most CMSs of this period remained those created by well-funded companies using proprietary ("expensive"!) software such as Allaire's ColdFusion, Microsoft's ASP ("Active Server Pages"), and Oracle databases.
During these years three other Open Source elements matured: the back-end database program MySQL, the server operating system Linux, and the webserver software Apache all became more useful and more stable, and enjoyed widespread implementation by host providers, meaning they became available to home users at low cost.
2000: PHP 4.0 was released, which was a vast improvement on all earlier versions. Now all the pieces were in place to witness the beginnings of an Internet revolution, one which is just now becoming apparent. The combination of a free stable operating system (Linux), a free web server (Apache), a free backend database (MySQL), and a free easy-to-learn HTML generator (PHP) allowed web developers to build and manage complex sites at little cost and without years of technical training. These four tools together are such a complete and well-integrated package that they earned the designation LAMP: Linux Apache MySQL PHP. (Not all open-source CMSs use LAMP -- there are many variants on this combination, e.g. using Unix instead of Linux or coding in Perl instead of PHP -- but the LAMP combination is so common that it serves as a useful shorthand.)
2003: Fast forward three years to an internet landscape very different from the 1990s: anyone can now do, for free and on their own, something equivalent to the system Amazon.com invested millions of dollars and thousands of people in creating just a decade before. In the past three years the number of LAMP sites has blossomed from a few tens of thousands to millions. LAMP enthusiasts claim advantages over proprietary systems like Microsoft's and Oracle's not only in price but also in ease of use, stability, security, and large, helpful user communities. Not only do proprietary systems often not seem worth their price, they can be harder to use and less secure as well.
2005 [addendum]: Above I mentioned "an Internet revolution". This is a topic worthy of another presentation, so for now let me offer an observation and a prediction. My observation is that keen observers of contemporary social and political trends are beginning to suspect that the unfolding of the information age has been accompanied by an unprecedented transfer of political power from the American people to the forces of technology and capital. However, the same computer technologies which have allowed capital to control the emergent plutocracy may also provide the seeds of its undoing. My prediction is that by the end of 2004 Americans will be very familiar with the words weblog (or blog) and sites such as MeetUp, IndyMedia, or WikiPedia, and will credit such CMS projects with helping revitalize and eventually restore democracy.