The Digital Information Librarian is one of the emerging professional roles best qualified to ride, and leverage to best advantage, the revolution about to sweep the information management, search, retrieval and independent news publishing industries.
In particular, the academic background and research experience these professionals generally hold make them highly fitting for the emergent "newsmastering" role and for the score of corollary expressions of this critical new profession.
The newsmaster, as I personally envision it, is a human capable of creating, refining and syndicating high-quality, content-rich and very focused content channels, generally created by searching, filtering and aggregating selected news sources according to complex and long-refined criteria.
After the establishment of the blogosphere as a first layer for scouting, filtering and organizing online content, we are now seeing the rapid and unstoppable arrival of information-organizing tools of all kinds (delicious, Furl, Onfolio, Content Saver, etc.) and the inevitable birth of specialized professionals who will craft and design the new alternative information sources and news channels of tomorrow.
Among the better qualified professionals in this area is Marcus P. Zillman, who has long devoted his research time to identifying intelligent and semi-automatic search tools and intelligent software agents that scour the Internet for the specific information you are looking for.
I recently had the pleasure of discovering Marcus' excellent research skills and generous online publishing workflow while carrying out some queries on new search tools.
What differentiates my view of the future from his is only the fact that I am already completely sold on RSS and its impact, while he is not (yet). When I first asked for his views on using RSS, he mentioned that he was taking a wait-and-see approach as RSS was gradually coming around, and especially because the blooming of several RSS publishing formats suggested to him a difficult and troublesome deployment.
Once I explained to him that RSS flavours are really a major issue only for developers and companies creating tools that work with RSS, and that most up-to-date and well-designed RSS tools can read any and all of the popular RSS formats, he took good note and said he would then look at RSS with different eyes.
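This point about format plurality can be made concrete. Every popular feed flavour wraps each story in an item (RSS 0.9x/1.0/2.0) or entry (Atom) element with a title child; only the surrounding namespaces differ, which is why a reader-side tool can normalize them all. A minimal, standard-library-only sketch of the idea (the feed snippets are invented for illustration):

```python
import xml.etree.ElementTree as ET

def local(tag):
    """Strip an XML namespace: '{ns}title' -> 'title'."""
    return tag.rsplit('}', 1)[-1]

def extract_titles(xml_text):
    """Return story titles from RSS 0.9x/1.0/2.0 or Atom feeds.

    Relies on the shared shape of all popular flavours: each story is
    an <item> (RSS) or <entry> (Atom) element with a <title> child.
    """
    root = ET.fromstring(xml_text)
    titles = []
    for elem in root.iter():
        if local(elem.tag) in ("item", "entry"):
            for child in elem:
                if local(child.tag) == "title" and child.text:
                    titles.append(child.text.strip())
    return titles

rss2 = """<rss version="2.0"><channel>
  <item><title>RSS 2.0 story</title></item>
</channel></rss>"""

rss1 = """<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
                   xmlns="http://purl.org/rss/1.0/">
  <item><title>RSS 1.0 story</title></item>
</rdf:RDF>"""

atom = """<feed xmlns="http://www.w3.org/2005/Atom">
  <entry><title>Atom story</title></entry>
</feed>"""

print(extract_titles(rss2))  # -> ['RSS 2.0 story']
print(extract_titles(rss1))  # -> ['RSS 1.0 story']
print(extract_titles(atom))  # -> ['Atom story']
```

A production aggregator of course handles far more (dates, enclosures, malformed markup), but the sketch shows why the multiplicity of formats burdens tool developers rather than readers.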
Notwithstanding the amazing wealth of deep Web resources, artificial intelligence tools and advanced search technologies Marcus and other research librarians have been using, the reach and flexibility that RSS-based information flows are going to provide is just orders of magnitude greater and more powerful.
As I realized the generational gap separating my view and perceptions from his, I kindly invited Marcus to an open interview, so that other librarians and information researchers could tap into our common desire to organize and publish information in ways that make it extremely easy and effective for the end user to get, just in time, the specific information she needs.
Robin Good: What do you see as the key driving forces in the tremendous surge of new applications and resources to manage information?
Marcus P. Zillman: We are in the process of a true information revolution, and the new web applications, as well as the ability to identify and harness this information, have become the focal point for many initiatives. Ontologies, AI-based reasoning and learning systems, knowledge harvesting, knowledge and data meta-description formats, and the constant ebb and flow of ever better formats and protocols to digest all this information into manageable bites: it is all happening! We are in truly exciting times for the information explorer!
Robin Good: What are, in your opinion, the tools most used today to handle the vast information flow and to find relevant content?
Marcus P. Zillman: This would all depend upon the ability of many folks to access the various tools available on the Internet. Search engines such as Google are available to all, even though most folks do not understand or attempt to learn about the "advanced" search functions of Google and the related APIs, such as those described in the fine book by Tara Calishain titled Google Hacks.
Specialized search engines that penetrate the deep web are also used such as Bright Planet and Profusion.com.
I just finished an in-depth feature article/paper on Deep Web Research (http://www.llrx.com/features/deepweb.htm) which highlights many of the specialty search products and database information extractors used to discover new knowledge from databases as well as many other formats of text data.
Robin Good: Who do you think is best poised to identify and bring about new ways of managing the information flow?
Marcus P. Zillman: The best person to identify and bring about new ways of managing information flows is the Librarian and more specifically the electronic reference librarian. This person has an immediate handle on what is happening with the flow of information and how to properly evaluate, disseminate and categorize all the incoming and outgoing information resources.
Robin Good: What are the new tools and technologies that you have seen facilitate and speed up this process?
Marcus P. Zillman: New tools to aid the information gatherer and explorer are news aggregators, RSS feeds, AI-based reasoning systems and knowledgeBases, and niched bots/intelligent agents. With the advent of new markup languages as well as advanced AI algorithms, we will be observing smarter and smarter bots and intelligent agents that discover, evaluate, disseminate and warehouse actively sought-after data. IBM's WebFountain is an example of what will be commonplace in the future for information retrieval/discovery.
Robin Good: What is the future of news online?
Marcus P. Zillman: News online will be even faster, with RSS feeds to alert us to the key words that we have identified as outliners for information discovery.
eMail alerts will be replaced with smart RSS feeds that will follow us wherever we go and contact us via the appropriate message protocol. AI algorithms will constantly monitor our selected newsfeeds to capture only the in-depth points of our current interests and projects. Smart snapshots of information will be analyzed, formulated and then disseminated to our project or personal information bots.
Robin Good: Which will be the sources of information for those who want or need to be on the cutting edge?
Marcus P. Zillman: "Cutting edge"… I have permanent razor burns from being on the cutting and bleeding edge of information over the years! KnowledgeBase data mining by bots designed to gather and extract information based upon our desires and project needs will be used by the cutting-edge folks. Web mining and data mining of text information is hot now and will become even hotter in the next year. Add to that web mining for graphics and related images, and a whole host of happenings will be generated to please the enormous appetite of information-hungry bots.
Robin Good: Can specialized professionals and small businesses consider becoming reliable and valid alternative information sources?
Marcus P. Zillman: The answer to this is definitely yes. Information professionals who understand the process will bring human touch and evaluation to information sources, helping businesses meet their information needs at a realistic cost without having to pay for sophisticated computers and expensive algorithm formulae.
They will offer valid alternative information sources, as they will combine the best of both worlds.
The high tech of bots and the real touch of humans, when combined, give the best overall view of the information.
Down the road, in many years, we might… and the word might is used here… have a bot/computer that can outthink and out-evaluate humans… not now and not in the foreseeable future. One of my latest presentations is titled "Searching the Internet Using Brains and Bots", where I describe the combination of harnessing human brain power to the routine search capability of bots.
Robin Good: If you were to describe your ideal technology to find and collect news, what features would it have?
Marcus P. Zillman: The ability to monitor all news feeds and sources on a 24/7 basis, and an algorithm to constantly search for keywords and key phrases based upon active definitions. This information would then be cross-referenced with existing data to create an up-to-date, exacting report and analysis.
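The keyword-matching core of such a monitoring tool is simple to sketch. The snippet below is a minimal illustration, not Zillman's system: the watch list and the feed items are invented, and a real tool would pull items continuously from live RSS feeds rather than from a hard-coded list.

```python
import re

def build_matcher(keywords):
    """Compile one case-insensitive pattern covering all watched words/phrases."""
    escaped = (re.escape(k) for k in keywords)
    return re.compile(r"\b(?:%s)\b" % "|".join(escaped), re.IGNORECASE)

def filter_items(items, matcher):
    """Keep only feed items whose title or summary mentions a watched term."""
    hits = []
    for item in items:
        text = "%s %s" % (item.get("title", ""), item.get("summary", ""))
        found = sorted(set(m.group(0).lower() for m in matcher.finditer(text)))
        if found:
            hits.append({"title": item["title"], "matched": found})
    return hits

# Hypothetical watch list and feed items, for illustration only.
watch = build_matcher(["deep web", "RSS", "intelligent agents"])
items = [
    {"title": "New RSS aggregator released", "summary": "..."},
    {"title": "Cooking tips", "summary": "nothing relevant here"},
    {"title": "Mining the deep web", "summary": "bots and intelligent agents"},
]
for hit in filter_items(items, watch):
    print(hit["title"], "->", hit["matched"])
```

Cross-referencing the hits against existing data, as Zillman describes, would be a second stage layered on top of this filter.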
Robin Good: If you were to recommend a few products and online services to better manage search and information management tasks, outside of your own, which ones would you recommend?
Marcus P. Zillman: I have learned over the years never to recommend resources other than those that I have complete control over ;-)).
With the rapid change of Internet protocols, not to mention web applications and scripts, I would recommend companies that can change quickly and be prepared to change at a moment's notice. As I have said, we will be experiencing a tremendous amount of change in the very near future, and we must prepare ourselves to accept that change and, if we really want and desire information, to go with the flow of the change. Also, proprietary versus open source will be an ongoing and constant challenge, with positive points for both.
The availability of information coming direct from the author without peer review or with automated peer review will be knocking at the door of the information professional.
We will see more and more databases that will serve information directly to the information consumer and directly from the author.
Exciting times indeed, and much of this is happening now. I read about an article or paper, immediately search for the author, discover their blog/web page, click on publications and, presto, I have their latest .pdf/.ps/.html article/paper. I do this all the time, as more and more authors show where their research and papers are online, or host them themselves, to be disseminated to the discovering information researcher.
With peer-to-peer in a grid network situation, I can envision in the near future that we will have servers programmed to disseminate the papers that we find, as well as a search bot that will constantly search the grid and peer-to-peer networks for the latest paper/article on the keywords/phrases that we have programmed for. I am working on this now, to build an Information Computer that will both search for and serve papers to and for the Internet community. This could even be done anonymously using the current FreeNet P2P protocols… truly interesting and exciting times…
Robin Good: In your opinion, will RSS play a critical role in all of what you have described above?
Marcus P. Zillman: Yes, RSS feeds will play an important and critical role.
The question is whether major corporations will accept RSS feeds and place them in the mainstream of Internet information dissemination, or whether RSS feeds will remain with the blogging/news-retrieval niche community.
If the big corporate software community takes RSS on, then it will become mainstream almost overnight, as it will be part of the browser. If not, RSS will continue, but be used by a select niche of Internet information explorers and news gatherers.
In some ways I wish it could happen both ways, as there are distinct advantages in each area! Only the future will tell…
About Marcus P. Zillman
Marcus P. Zillman, M.S., A.M.H.A. is the Executive Director of the Virtual Private Library, CEO/President of BotTechnology.com, Inc., Chairman/CEO TrademarkBots.com, Inc., Creator/Founder BotSpot.com and Executive Producer of BOT2000 and BOT2001 conferences.
Marcus is a benefactor member of the Internet Society, a participant in the IETF Users Services Working Group, and was selected to participate in the U.S. Government's Open Meeting Electronic Forum as a non-governmental expert on information retrieval and access.
He is the Creator/Founder of BotSpot.com, "The Spot for all Bots and Intelligent Agents on the Net", one of the Internet's most awarded sites (over 350 awards), which is considered the definitive resource for bots, intelligent agents and artificial intelligence on the Internet.
PC Magazine selected it as one of the Top 100 Best Web Sites on the Internet in 1998, and NetGuide selected it as one of the Top 10 of all Internet sites during 1998.
Marcus P. Zillman is currently Executive Director of the Virtual Private Library, creators of the Subject Tracer Information Blogs, and CEO/President of BotTechnology.com, which consults on, creates and develops bots and intelligent agents.
His latest book, "Mining the Invisible Web - Leveraging the Vast Resources of Hidden Content on the Internet for Research and Intelligence", is co-authored with Sundar Kadayam, CTO and Co-Founder of Intelliseek, Inc. and creator of InvisibleWeb.com.
Marcus P. Zillman also publishes a daily RSS news feed, writes a monthly column on the latest Internet resources and publishes a monthly newsletter titled Awareness Watch.
His current websites include:
Marcus P. Zillman's Blog
Marcus P. Zillman Abbreviated Bio
Internet MiniGuides 2004
Originally written by Robin Good and first published on MasterNewMedia.