Curated by: Luigi Canali De Rossi

Tuesday, June 27, 2006

Independent Publishers Soon To Challenge Mainstream Media With A Little Help From Superchips


Sooner than you may think possible, independent publishers of all kinds are going to challenge the mainstream media establishment more deeply, thanks to the immense new computing power that will be available to them, at very affordable cost, over the next few years.

Photo credit: Kuazo Kuazo

In this surprising scenario, small independent publishers may become as competitive as, or more competitive than, large media publishing organizations, thanks to their ability to focus on very specific topics with a high degree of competence, their ability to personalize content around tightly framed themes, and their greater talent as direct marketing communicators.

In this future-looking article, John Blossom reveals what appears to be a fascinating scenario for small independent publishers, as well as an unsettling one for established media publishers facing competitive factors that they have never accounted for in their present marketing and sales strategies.

"...tomorrow's wired and wireless users will be able to afford technology that can allow them to jump easily into competition with many mainstream publishers on a scale never achieved before.

Their ability to fill the Web with their own content and content of their own selection from other sources will far outstrip even today's rowdy environment of technology-empowered users feeling their oats."

Newsmasters, content curators and news jockeys may very well be the ones to challenge, much sooner than you may have thought, the scores of media publications, newspapers and websites which have been trumpeting for too long a communication approach fit only for an industrial, mass-media-driven society. The target of that mass communication approach is rapidly fragmenting into myriad niche audiences, actively searching, selecting, aggregating and redistributing information in a million new combinations.

Photo credit: Kiyoshi Takahase Segundo

The 100x Factor: A New Generation of Microprocessors Challenges Content Providers

After chugging along with decades' worth of PCs that never seem to get appreciably faster relative to our content needs, hope is on the way.

IBM's new experimental computer chips promise a 100x improvement in processor performance, with availability to everyday users likely within years rather than decades.

For those who had hoped for an evolutionary progression of publishing into the electronic realm, forgive me for being the bearer of bad news - the revolution will be at your fingertips even sooner than expected.

When I fired up my first Windows-powered PC in 1992, I waited patiently as it chugged through its startup routine, a process of several minutes once I'd loaded on a good amount of software.

Today, with more than twenty times the processing power of that then-"hot" machine, I power on my Windows PC and... I wait several minutes for all of the software to load.

Computing technology has advanced, but only as much as has been necessary to maintain a certain status quo. Thus the changes to the content industry tied to electronic delivery, as radical as they have been over this period of technological innovation, have been throttled to some degree by machines tied to software and hardware providers trying to give their products a reasonable shelf life.

But this careful balance might be upset if IBM's innovations come to market any time soon. As noted in The New York Times, IBM has come up with a microprocessor that operates at speeds of 500 gigahertz when chilled to a super-cooled state. At room temperature, it plods along at a mere 350 gigahertz - about 100 times faster than today's typical high-end PC or corporate server.

Bernard Meyerson - IBM

While commercial production based on this new design is a ways off, Bernard Meyerson, vice president and chief technologist in IBM's Systems and Technology Group, noted in the article that technologies in this advanced state typically take about 12 to 24 months to make their way into production.

Assuming that it would take another two or three years for these chips to work their way into the consumer end of the market, it will be only about five years before we have personal computing devices and infrastructure that are 100 times more powerful than those we have today.

That would be five times the advance in processing power, in just a few years, that my PCs have experienced over fifteen.
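The back-of-envelope arithmetic above can be checked quickly. The 3.5 gigahertz clock for a typical high-end PC today is an illustrative assumption, not a figure from the article; the 350 gigahertz and 20x figures come from the text itself:

```python
# Back-of-envelope check of the speedups discussed above.
# typical_pc_ghz is an assumed figure for illustration; the other
# numbers come from the article.

room_temp_chip_ghz = 350.0   # IBM's experimental chip at room temperature
typical_pc_ghz = 3.5         # assumed clock of a high-end PC today

projected_speedup = room_temp_chip_ghz / typical_pc_ghz  # the "100x" factor
past_speedup = 20.0          # 1992 PC -> today's PC, per the author

print(projected_speedup)                 # 100.0
print(projected_speedup / past_speedup)  # 5.0: five times the advance
```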

In a world in which "10x" improvements in productivity are likely to bring huge accolades and profits, 100x improvements in processing efficiency, in the hands of individuals and institutions equipped with these powerful technologies, are likely to have a profound effect on publishing as we know it today.

It means, first and foremost, that tomorrow's wired and wireless users will be able to afford technology that can allow them to jump easily into competition with many mainstream publishers on a scale never achieved before.

Their ability to fill the Web with their own content and content of their own selection from other sources will far outstrip even today's rowdy environment of technology-empowered users feeling their oats.

Here are some key factors to bear in mind as computing power begins to accelerate the growth of user publishing beyond previous bounds:

  • The central database meets its match.

    While the processing power provided by databases in major data centers will always be an important element in publishing, a radical increase in processing power at the user level combined with immense storage capabilities will accelerate the ability of users to find and aggregate more valuable content in personalized contexts closer to its ultimate consumers.

    Instead of having to rely on central services to determine content relevance at a generic level, it will become far easier for networks of individuals and institutions to collaborate directly to determine relevance in ways that take their own knowledge bases and publishing into account much more cost-effectively.

    This will not mean the death of central search services such as Google, but it will mean that networks of individuals, with their own high-power search appliances built into their desktops, will be able to become active members of both public and private search and sharing "grids" that can help them solve problems far more efficiently than central services that must respond to more generic needs.

  • The focus of marketing shifts.

    100x processing power may not spell the end of advertising as we know it, but the intimacy with audiences that marketers crave is likely to take new turns that will accelerate where and how money is spent to persuade today's buyers.

    Already major corporate advertisers such as GM are beginning to recognize that being in on user-initiated online conversations is a crucial key to being able to communicate marketing messages effectively. The facilitation of user-enabled conversations and knowledge forming through increased processing power and immense local storage systems is likely to accelerate the need for marketers to come up with strategies that embed them in those conversations effectively.

    Marketing communications will become much more personalized - and more focused on content of all kinds, with or without origins in copyrighted sources, that can gain context in those settings.

  • Value-add services go real-time.

    Web services that enable content to take on life with advanced functionality are today largely relegated to portal services that pull up information from central databases: they are little used outside of relatively specialized publishing applications and a handful of "mashups". But an enormous increase in processing power opens up the need for Web services to be deployed closer to users, who have their own personal portal environments forming on their desktops and in their portable devices.

    Large-scale destination sites will continue to thrive in this environment but the enormous power of distributed computing will place a far greater emphasis on bringing content-enabled Web services to users' desktops for their personal use. The era of the true "content concierge" that anticipates our interests and serves up valuable content and new services is coming soon.
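One way to picture the search "grids" described in the first bullet above - purely as an illustrative sketch, with all names and data hypothetical - is as peers that each answer a query from their own local index, with the results merged by relevance:

```python
# Hypothetical sketch of a peer-to-peer "search grid": each peer keeps
# its own locally built index, a query fans out to every peer in the
# grid, and the ranked hits are merged. Names and data are illustrative.

from dataclasses import dataclass, field


@dataclass
class Peer:
    name: str
    # term -> list of (document, relevance score) pairs
    index: dict = field(default_factory=dict)

    def search(self, term):
        # Each peer answers from its own local index.
        return self.index.get(term, [])


def grid_search(peers, term, top_k=3):
    # Fan the query out across the grid and merge the ranked hits.
    hits = []
    for peer in peers:
        hits.extend(peer.search(term))
    return sorted(hits, key=lambda h: h[1], reverse=True)[:top_k]


alice = Peer("alice", {"microchips": [("ibm-500ghz.html", 0.9)]})
bob = Peer("bob", {"microchips": [("chip-history.html", 0.4)]})

print(grid_search([alice, bob], "microchips"))
# [('ibm-500ghz.html', 0.9), ('chip-history.html', 0.4)]
```

A real grid would of course add network transport, trust, and index-freshness concerns; the point is only that relevance ranking can happen at the edge rather than in one central database.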

The transformations seen through three decades of personal computing technologies are profound, yet we forget that many of these changes have happened at a fairly slow pace - slow enough for these technologies' capabilities to be absorbed fairly comfortably.

The changes that we can expect from omnipresent supercomputing will be manifold and often unanticipated, but in that power looms the final sledgehammer against the wall separating traditional media operations from personal and institutional publishers.

For publishers hoping to engineer a smooth transition to online revenues, beware - the road is going to become far more revolutionary than we can begin to imagine.

Originally written by John Blossom, President of Shore Communications Inc., on ContentBlogger(TM) with the original title "The 100x Factor: A New Generation of Microprocessors Challenges Content Providers", June 26th, 2006.

Read more about John and the management consulting services of Shore Communications Inc., covering enterprise, media and personal publishing, at the Shore Communications website.
posted by Robin Good on Tuesday, June 27 2006, updated on Tuesday, May 5 2015
