The Fourth Paradigm = programs to manage and mine enormous data sets

In my work with scientists in mouse and ag-biotech research, a constant challenge is managing the data and information collected at a seemingly exponential rate.  The capacity to create this data (knowledge) far outpaces our ability to develop appropriate programs to manage it.  As a result, we are data/information heavy (a good thing) but have no real capacity to optimize its sharing and use (a bad thing), even within the tighter (presumably more manageable) boundaries of a given project.  


I came across an interesting article in the Harvard Business Review today entitled “The Big Idea: The Next Scientific Revolution.”  According to its author, Tony Hey, experts do have a good understanding of data and the ability to see the often invisible links “between the columns”: finding non-obvious or latent connections within or between disciplines that can serve as catalysts for new and innovative possibilities.  But we have nearly reached a crucial point.  Experts are now DROWNING in data.  Information is streaming in at a dizzying rate, making it challenging to organize, analyze, and store.  The late Jim Gray (American computer scientist and recipient of the 1998 Turing Award) proposed what he called “the fourth paradigm” for scientific exploration.

“[Gray’s] vision of powerful new tools to analyze, visualize, mine, and manipulate scientific data may represent the only systematic hope we have for solving some of our thorniest global challenges” writes Hey. “The fourth paradigm*… involves powerful computers. But instead of developing programs based on known rules, scientists begin with the data. They direct programs to mine enormous databases looking for relationships and correlations, in essence using the programs to discover the rules. We consider big data part of the solution, not the problem. The fourth paradigm isn’t trying to replace scientists or the other three methodologies, but it does require a different set of skills. Without the ability to harness sophisticated computer tools that manipulate data, even the most highly trained expert would never manage to unearth the insights that are now starting to come into focus.”
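Hey’s description of the data-first approach (programs mining data to discover relationships, rather than encoding known rules) can be sketched in a few lines of Python. This is purely an illustrative toy, not anything from the article: the dataset and column names are invented, and a simple correlation scan stands in for the far more sophisticated mining that fourth-paradigm tools perform at scale.

```python
# Toy illustration of "letting the program discover the rules": instead of
# encoding a known relationship, scan every pair of columns for strong
# correlations and let the strong pairs surface as candidate relationships.
from itertools import combinations
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def mine_correlations(table, threshold=0.8):
    """Return column pairs whose |correlation| meets the threshold."""
    hits = []
    for a, b in combinations(table, 2):
        r = pearson(table[a], table[b])
        if abs(r) >= threshold:
            hits.append((a, b, round(r, 3)))
    return hits

# Invented data: 'dose' and 'expression' move together; 'noise' does not.
data = {
    "dose":       [1, 2, 3, 4, 5, 6],
    "expression": [2.1, 3.9, 6.2, 8.1, 9.8, 12.2],
    "noise":      [5, 1, 4, 2, 6, 3],
}

print(mine_correlations(data))  # only the dose/expression pair survives
```

The scan reports the dose/expression pair and discards the noise column, without anyone having told it which relationship to look for. Real fourth-paradigm systems apply the same idea to enormous databases with far richer statistics.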

This ties in nicely with Ostrom’s work on the “commons.”  I have a few blog entries on her and the IAD Framework in “Consider Icarus…” (search term = Ostrom)

_ _ _ _ _

*The first two paradigms are experiment and theory; computation/simulation is the third.



Knowledge management practice for public good

…The assumption that knowledge flows linearly through a chain of significantly different types of institutions or social networks poses challenges for managing that knowledge around bottlenecks, which may arise from disparities in language, culture, social capital, etc. In this entry from the KnowledgeCore’s Blog, the author(s) outline knowledge management for the public good, drawing on theories (new to me) from Spender, Griffiths and Wiig. According to Wiig (quoted below), good societal knowledge management practice should draw on or leverage models developed and practiced in the private sector… sounds good to me!

“Wiig progresses to discuss societal knowledge management (SKM) where he posits that ‘effective SKM is required to build, maintain, and make the best use of the country’s broad knowledge assets’ (p. 150). It could be questioned whether this is any different from the fundamental aims of any KM process, especially when compared against the knowledge-base view. This is acknowledged by Wiig where he states that ‘in general, SKM shares the same foundation as the private sector KM. Hence SKM uses approaches developed and perfected in the private sector. Most management, organisational, and operational principles are similar’ (p. 151). In Griffiths et al. we put forward the K-Core, a new model for KM, encompassing 4 functions and 12 enablers.”

Check out more at:

First Look – Knowledge for the Public Good (Societal Knowledge

The KnowledgeCore’s Blog…