A couple of weeks ago, IDC put out a study (http://www.emc.com), paid for by EMC, that estimated the amount of data produced in 2006 and projected the load for 2010.
The figures are astounding.
All of us together, mostly individual users, created 161 exabytes of data in 2006.
That's 161 billion gigabytes.
In 2010, that figure will rise to almost a zettabyte. That's roughly 1,000,000,000,000,000,000,000 bytes.
My eyes go cross just trying to count the number of commas.
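The arithmetic is easy to double-check without counting commas. A minimal sketch in Python, assuming the decimal (SI) units IDC uses, where a gigabyte is 10^9 bytes, an exabyte is 10^18, and a zettabyte is 10^21:

```python
# Decimal (SI) storage units, as used in the IDC study.
GIGABYTE = 10**9
EXABYTE = 10**18
ZETTABYTE = 10**21

total_2006 = 161 * EXABYTE                 # IDC's 2006 estimate

# 161 exabytes expressed in gigabytes: 161 billion of them.
print(total_2006 // GIGABYTE)              # 161000000000

# One zettabyte, written out with separators, so nobody has to
# count the commas by eye.
print(f"{ZETTABYTE:,}")                    # 1,000,000,000,000,000,000,000
```

Note these are decimal prefixes; the binary units (gibibyte, zebibyte) differ by a few percent, but at this scale the distinction hardly changes the point.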
Now, there are many scary things about all this data.
For example, if you're looking for a needle in it, how will you find it?
Will today's search techniques be up to the job, particularly if the data is stored here, there, and everywhere?
Also, the quality of all this data will be — how to put this politely — uneven.
Back in the 18th century, an educated person might well have been able to read everything that had ever been published, because the archive was not all that large.
And to get published, a work had to meet some minimal standards at a publisher.
Today, any fool can publish, and does. (You're reading this, aren't you?)
Profusely. So who figures out what the good stuff is?
Where is the trusted arbiter of quality? Brand names will figure prominently here.
If you believe the Wall Street Journal, then you'll buy its version of things, and if you believe Jerry Falwell's version of things, then you'll buy into whatever the National Liberty Journal says.
So the moral here is... check the background and reputation of the sources of your information before you give them weight or credence...
or the drivel that you are regurgitating as fact may be your own!