Tag Archives: data visualization

Magic Quadrant Business Intelligence 2014

Over the last two years we have posted visualizations and interpretations of Gartner’s Magic Quadrant analysis of BI companies; the previous articles appeared in 2012 and 2013.

A Blog reader contacted me about the 2014 update; he sent me the {x,y} coordinate data for 2014 and so it was relatively straightforward to update the public Tableau workbook for it. Here is the image of all 29 companies with their changes from 2013 to 2014:

Gartner’s Magic Quadrant for Business intelligence, changes from 2013 to 2014

With the slider controls for Execution and Vision as well as the changes thereof, it is easy to filter the dashboard interactively. For example, there were a dozen companies who improved in their execution score (moving up in the quadrant):

Subset of companies who improved execution over the last year.

Most of the companies improving their execution are niche players, with SAP being the only leader improving its execution score.
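For readers who want to reproduce this kind of filtering outside of Tableau, here is a minimal pandas sketch of the delta computation the sliders perform. The companies and scores below are invented placeholders, not Gartner’s actual coordinates.

    import pandas as pd

    # Hypothetical extract of the coordinate data: one row per company and year,
    # with execution (vertical axis) and vision (horizontal axis) scores.
    mq = pd.DataFrame({
        "company":   ["A", "A", "B", "B", "C", "C"],
        "year":      [2013, 2014] * 3,
        "execution": [3.9, 4.1, 3.6, 3.8, 4.0, 3.7],
        "vision":    [4.0, 3.9, 3.4, 3.9, 3.8, 3.8],
    })

    # Year-over-year changes per company (what the "change" sliders filter on)
    mq = mq.sort_values(["company", "year"])
    deltas = mq.groupby("company")[["execution", "vision"]].diff()
    mq["d_execution"] = deltas["execution"]
    mq["d_vision"] = deltas["vision"]

    # Companies that moved up in the quadrant (improved execution)
    print(mq[(mq["year"] == 2014) & (mq["d_execution"] > 0)]["company"].tolist())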

Most of the leaders improved in their vision score (moving right in the quadrant), including Tableau, QlikTech, Tibco and SAS.

Subset of companies who improved vision over the last year.

 

Seven companies, most of them leaders, lost ground on both execution and vision (moving to the bottom-left):

Companies who lost ground on both execution and vision in 2014

 

Lastly, I have updated the Public Tableau workbook – originally published in 2012 – with the data for 2013 and 2014. (Click here for the interactive drawing.)

Public Tableau workbook with 7 years of BI Magic Quadrant data.

 
Posted on September 28, 2014 in Industrial

Visualizing Conversion Rates (Funnels, Bullet Charts)

Most sales processes go through a series of stages, from first contact through successive engagement of the potential client to the close. One can think of these as special cases of attrition-based workflows. These are very typical in online (B2C eCommerce) or telephone (call center) marketing, and companies usually collect a lot of data around the conversion rates at each step. How can one visualize such workflows?

One metaphor for these processes is that of a sales funnel. A lot of leads feed in on one side, and at each stage fewer pass through to the next. It is then straightforward to want to visualize such funnels, such as shown here by Tableau.

Tutorial video explaining how to create a funnel chart using Tableau (Source: Tableau Training Video)

Aside from the somewhat tedious steps involved – custom field calculations, left-right duplication of the chart, etc. – it turns out, however, that funnel charts are not easy to interpret. For example, they are not well suited to answering the following questions:

  • What’s the percentage reduction at each step?
  • Comparing two or more funnels, which has better conversions at each step?
  • What are the absolute numbers in each step?
  • Are the conversion rates above or below targets at each step?

Ash Maurya from Spark59 wrote a very useful article on this topic entitled “Why not the funnel chart?”. In it he looks specifically at comparisons of funnels (current vs. prior time intervals, or A/B testing).

Time Series comparison of funnel performance (Source: Ash Maurya’s Spark59 Blog)

As a next step he shows that the funnel shape doesn’t add to the readability. Simple bar charts do just as well:

Same information with Bar Charts (Source: Ash Maurya’s Spark59 Blog)

For a multi-step funnel, the problem remains that with the first step set to 100%, subsequent steps often have fairly small percentages and are thus hard to read and compare. Suppose you are sending emails to 100,000 users, of whom 30% click on a link in the email, of whom only 10% (3% of total) proceed to register, of whom again only 10% (0.3% of total) proceed to subscribe to a service. Bars with 3% or even 0.3% of the original length will be barely visible. One interesting variation is to normalize each step in the funnel such that the new, expected conversion number (or that from the prior period) is scaled back to 100%. In that scenario it is easy to see which steps are performing above or below expectations. (Here: a big jump in Registrations from Jan to Feb, then a small drop in Mar.)

Bar Charts with absolute vs. relative numbers
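The normalization idea can be made concrete in a few lines of Python. The funnel counts follow the email example above; the “expected” counts are invented for illustration.

    funnel = [
        # (step, actual count, expected count)
        ("Emails sent",  100000, 100000),
        ("Clicked link",  30000,  25000),
        ("Registered",     3000,   2500),
        ("Subscribed",      300,    400),
    ]

    total = funnel[0][1]
    for step, actual, expected in funnel:
        pct_of_total    = 100.0 * actual / total     # 0.3% bars are barely visible
        pct_of_expected = 100.0 * actual / expected  # ~100% means "on target"
        print(f"{step:<13} {pct_of_total:8.1f}% of total "
              f"{pct_of_expected:8.0f}% of expected")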

Next, Ash Maurya uses the Bullet Chart as introduced by Stephen Few in 2008. The Bullet Chart is a variation of a Bar Chart that uses grey-scale bands to indicate performance ranges (such as poor – ok – good) as well as a target marker to show whether performance was above or below expectations. The target marker makes it possible to combine two charts into just one, giving a compact representation of the relative performance:

Bullet Chart showing funnel performance (Source: Ash Maurya’s Spark59 Blog)
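Outside of Excel or Tableau, such a chart can also be sketched directly with matplotlib: grey-scale performance bands, a bar for the measured value, and a tick for the target. All numbers below are hypothetical.

    import matplotlib.pyplot as plt

    def bullet(ax, label, value, target, bands=(50, 75, 100)):
        # Grey performance bands, darkest = poor (leftmost)
        greys = ["#999999", "#bbbbbb", "#dddddd"]
        left = 0
        for edge, grey in zip(bands, greys):
            ax.barh(0, edge - left, left=left, height=0.8, color=grey, zorder=1)
            left = edge
        ax.barh(0, value, height=0.3, color="black", zorder=2)         # measure
        ax.plot([target, target], [-0.35, 0.35], color="black", lw=2)  # target
        ax.set_yticks([0])
        ax.set_yticklabels([label])
        ax.set_xlim(0, bands[-1])

    # Hypothetical funnel steps, normalized so that 100 = expected conversion
    steps = [("Clicked", 120, 100), ("Registered", 85, 100), ("Subscribed", 60, 100)]
    fig, axes = plt.subplots(len(steps), 1, figsize=(6, 2.5))
    for ax, (label, value, target) in zip(axes, steps):
        bullet(ax, label, value, target)
    plt.tight_layout()
    plt.show()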

Various authors have looked at how to create such bullet charts in Excel. For example, Peltier Tech covers it in the article “How to make horizontal bullet graphs in Excel”. There is still considerable effort involved in creating such charts, as Excel doesn’t directly support bullet charts. Adding color may make sense, although it quickly leads to overuse of color in a dashboard (as Stephen Few points out in his preference for grey scales).

Bullet Graphs in Excel (Source: Peltier’s Excel Blog)

Another interesting approach comes from Chandoo, with an approximation of a bullet graph in cells (as compared to a separate Excel chart object). In the article “Excel Bullet Graphs” he shows how to use conditional formatting and custom formulae to build bullet graphs in a series of cells, which can then be included in a table, one chart per row.

In-Cell Bullet Graph in Excel (Source: Chandoo’s Blog)

It is somewhat surprising that modern data visualization tools do not yet widely support bullet charts out of the box.

Measuring how marketing efforts influence conversions can be difficult, especially when your customers interact with multiple marketing channels over time before converting. To that end, Google has introduced multi-channel funnels (MCF) in Google Analytics, as well as released an API to report on MCFs. This enables new sets of graphs, which we may cover in a separate post.
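In the meantime, as a hedged sketch (the view ID and date range below are placeholders, and authorized OAuth2 credentials are assumed to be already set up), a query against the MCF Reporting API with the Python client library looks roughly like this:

    from googleapiclient.discovery import build

    # `credentials` is assumed to be an authorized OAuth2 credentials object
    analytics = build("analytics", "v3", credentials=credentials)

    result = analytics.data().mcf().get(
        ids="ga:XXXXXXXX",             # placeholder view (profile) ID
        start_date="2013-01-01",
        end_date="2013-03-31",
        metrics="mcf:totalConversions",
        dimensions="mcf:sourcePath",   # the channel path leading to conversion
    ).execute()
    print(result.get("totalsForAllResults"))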

 
Posted on March 31, 2013 in Industrial

Magic Quadrant Business Intelligence 2013

It’s that time of the year again: Gartner has released its report on Business Intelligence and Analytics platforms. One year ago we looked at how the data in the Magic Quadrant – the two-dimensional space of execution vs. vision – can be used to visualize movement over time. In fact, the article Gartner’s Magic Quadrant for Business Intelligence became the most viewed post on this Blog.

I had also uploaded a Tableau visualization to Tableau Public, where everyone can interact with the trajectory visualization and download the workbook and the underlying data to do further analysis. This year I wanted to not only add the 2013 data, but also provide a more powerful way of analyzing the dynamic changes, such as filtering the data. For example, consider the moves from 2012 to 2013 of some 21 vendors:

Gartner’s Magic Quadrant for Business intelligence, changes from 2012 to 2013

It might be helpful to filter the vendors in this diagram, for example to show just the niche players, or just those who improved in both vision and execution. To that end, I created a simple Tableau dashboard with four filters: a range of values for both the vision and execution scores, as well as a range of values for the changes in both scores. The underlying data is also displayed for reference and can be used to sort companies along those values.

Here is an example of the dashboard set to display the subset of 15 companies who improved at least one of their vision or execution scores without lowering the other.

Subset of companies who improved vision and/or execution over the last year.
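In pandas terms this filter is a compound boolean mask; the sketch below reuses the mq DataFrame with its d_execution / d_vision delta columns from the snippet in the 2014 post above.

    # Improved at least one score without lowering the other
    # (assuming mq also contains the prior year's scores, so deltas exist)
    mask = (
        (mq["d_execution"] >= 0) & (mq["d_vision"] >= 0)    # nothing lowered
        & ((mq["d_execution"] > 0) | (mq["d_vision"] > 0))  # something improved
    )
    print(mq.loc[mask & (mq["year"] == 2013), "company"].tolist())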

That’s more than 70% of the platforms, with the increase in vision more pronounced than that in execution. It is also considerably more than in previous years (2013: 15; 2012: 6; 2011: 6; 2010: 3; 2009: 9) – making this collective move to the top-right perhaps a theme of this year’s report.

Who changed Quadrants? Who moved in which dimension?

Last year Tibco (Spotfire) and Tableau were the only two platforms to change quadrants, becoming challengers. This year both of them “turned right” in their trajectory and crossed over into the leaders quadrant due to strong increases in their vision scores. (QlikTech had been on a similar trajectory, but already crossed into the leaders quadrant in 2012. It again strengthened both execution and vision this year.)

Another new challenger is LogiXML. Thanks to ease of use, enhancements from customer feedback, and a focus on the OEM market, its ability to execute increased substantially. From the Gartner report summary on LogiXML:

Ease of use goes hand-in-hand with cost as a key strength for LogiXML, which is reflected by its customers rating it above average in the survey. The company includes interfaces for business users and IT developers to create reports and dashboards. However, its IT-oriented, rapid development environment seems to be most compelling for its customers. The environment features extensive prebuilt elements for creating content with minimal coding, while its components and engine are highly embeddable, making LogiXML a strong choice for OEMs.

A few other niche players almost broke into new quadrants, including Alteryx (which had the biggest overall increase and almost broke into the visionary quadrant), as well as Actuate and Panorama Software.

The latter two stayed the same with regard to execution (as did SAP), while all three moved strongly to the right to improve their vision scores (forming the top 3 of vision improvement).

Information Builders and Oracle stayed where they were, changing neither their execution nor vision scores.

Microsoft and Pentaho stayed about the same on vision, but increased substantially in their execution scores. This propelled Microsoft to the top of the heap on execution, while it moved Pentaho from near the bottom to at least a more viable niche player position. Microsoft’s integration of BI capabilities into Excel, SQL Server and SharePoint, as well as its leveraging of cloud services and attractive price points, make it a strong contender, especially in the SMB space. Improvements to its ubiquitous Excel platform give it a unique position in the BI market. From the Gartner report:

Nowhere will Microsoft’s packaging strategy likely have a greater impact on the BI market than as a result of its recent and planned enhancements to Excel. Finally, with Office 2013, Excel is no longer the former 1997, 64K row-limited, tab-limited spreadsheet. It finally begins to deliver on Microsoft’s long-awaited strategic road map and vision to make Excel not only the most widely deployed BI tool, but also the most functional business-user-oriented BI capability for reporting, dashboards and visual-based data discovery analysis. Over the next year, Microsoft plans to introduce a number of high-value and competitive enhancements to Excel, including geospatial and 3D analysis, and self-service ETL with search across internal and external data sources.

The report then goes on to praise Microsoft for further enhancements (queries across relational and Hadoop data sources) that contribute to its strong product vision score and “positive movement in overall vision position”. This does not seem consistent with the presented Magic Quadrant, where Microsoft only moved to the top (execution), not to the right (vision). Perhaps another reason for Gartner to publish the underlying coordinate data and finally adopt this line of visualization with trajectories.

Dashboard with filters revealing two platforms deteriorating in both vision and execution

Only two vendors saw their scores deteriorate in both dimensions. MicroStrategy gave up some ranks, but remains in the leader quadrant. The report cites a steep sales decline in 3Q12 and the increased importance of predictive and prescriptive analytics in this year’s evaluation among the reasons:

MicroStrategy has the lowest usage of predictive analytics of all vendors in the Magic Quadrant. A reason for this behavior might be the user interface that is overfocused on report design conventions and lacks proper data mining workbench capabilities, such as analysis flow design, thus failing to appeal to power users. To address this matter, MicroStrategy should deliver a new high-end user interface for advanced users, or consumerize the analytic capabilities for mainstream usage by embedding them in Visual Insight.

The other vendor moving to the bottom-left is arcplan, which is now at the bottom of the heap in the niche players quadrant.

Who moved to the top-left?

With the dashboard at hand, you can also go back and do similar queries not just for the current year 2013, but any of the five previous years. For example, who has moved to the top-left – improved execution at the expense of reduced vision – over the years?

In 2013 those were Targit, Jaspersoft, and Board International. All three had a sharp drop in execution in the previous year, 2012. A plausible scenario is that these companies lost their focus on execution, saw their scores drop, and in an attempt to turn around focused on executing well with a smaller set of features (hence the lower vision).

In 2012 the only vendor to display a move to the top-left was QlikTech. They had some sales issues the prior year as well, although their trajectory in 2011 was only modestly lower in execution, mostly towards higher vision.

In 2011 Actuate and Information Builders moved to the top-left. Both had trajectories to the bottom-left the prior year (2010), with Actuate in particular losing a lot of ground. With the Year slider on the top-left of the dashboard one can then play out the trajectory while the company filters remain, showing only the filtered subset and their history. Actuate has completed a remarkable turnaround since then and is now positioned back roughly where it was in 2010.

Dashboard with analysis of top-left moving companies.

 

(Click on the image above or here to go to the interactive Public Tableau website.)

In 2010 five vendors moved to the top-left: Oracle, SAS, QlikTech, Tibco (Spotfire) and Panorama Software. In that case, however, none of them had shown a decrease in execution the previous year. That focus on execution may simply have been the result of the economic downturn in 2009.

Such exploratory analysis is hard to conceive without proper interactive data visualization. Given the focus of the vendors covered in the report, it seems somewhat anachronistic that Gartner does not leverage the capabilities of such interactive visualization itself. In the previous post on Global Risks we have seen how much value it can add to such thorough analysis. (Much of this dashboard should be applicable to risk analysis as well; the two-dimensional space simply changes from platform vision vs. execution to risk likelihood vs. impact!) If Gartner does not want to drop on its own execution and vision scores, it had better adopt such visualizations. It’s time.

 
Posted on February 12, 2013 in Industrial

Visualizing Global Risks 2013


A year ago we looked at Global Trends 2025, a 2008 report by the National Intelligence Council. The 120-page document made surprisingly little use of data visualization, given the well-funded and otherwise very detailed report.

By contrast, at the recent World Economic Forum 2013 in Davos, the Risk Response Network published the eighth edition of its annual report, Global Risks 2013. Its focus on national resilience fits well into the “Resilient Dynamism” theme of this year’s WEF in Davos. Here is a good two-minute synopsis of the Global Risks 2013 report.

We will look at the abundant use of data visualization in this work, which is published as an 80-page PDF file. The report links back to the companion website, which offers lots of additional material (such as videos) and a much more interactive experience (such as the Data Explorer). The website is a great example of the benefits of modern layout, with annotations, footnotes, references and figures broken out in a second column next to the main text.

The five global risk categories, each assigned its own color

One of the main ways to understand risks is to quantify them in two dimensions, namely likelihood and impact, say on a scale from 1 (min) to 5 (max). Each risk can then be visualized by its position in the square spanned by those two dimensions. Risk mitigation is often prioritized by the product of these two factors; in other words, the further right and/or up a risk sits, the more important it becomes to prepare for or mitigate it.
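As a tiny worked example of that prioritization (the risk names appear in the report, but the scores below are invented, not the survey’s):

    # Prioritize risks by the product likelihood x impact, on the 1-5 scales
    risks = {
        "Chronic fiscal imbalances": (4.2, 4.0),   # (likelihood, impact)
        "Severe income disparity":   (4.1, 3.8),
        "Critical systems failure":  (3.1, 4.1),
    }
    ranked = sorted(risks.items(), key=lambda kv: kv[1][0] * kv[1][1], reverse=True)
    for name, (likelihood, impact) in ranked:
        print(f"{name:<27} priority = {likelihood * impact:5.2f}")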

This work is based on a comprehensive survey of more than 1000 experts worldwide on a range of 50 risks across 5 broad categories. Each of these categories is assigned a color, which is then used consistently throughout the report. Based on the survey results the report uses some basic visualizations, such as a list of the top 5 risks by likelihood and impact, respectively.

Source for all figures: World Economic Forum (except where noted otherwise)

When comparing the position of a particular risk in the quadrant with the previous year(s), one can highlight the change. This is similar to what we have done with highlighting position changes in Gartner’s Magic Quadrant on Business Intelligence. Applied to this risk quadrant the report includes a picture like this for each of the five risk categories:

Economic risks: changes in likelihood and impact from 2012 to 2013

This vector field shows at a glance how many and which risks have grown by how much. The fact that a majority of the 50 risks show sizable moves to the top right is of course a big concern. Note that the graphic does not show the entire square from 1 through 5, just a sub-section, essentially the top-right quadrant.
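Such a vector field is straightforward to sketch with matplotlib’s annotation arrows; the risk names and positions below are made up for illustration:

    import matplotlib.pyplot as plt

    # Hypothetical (likelihood, impact) positions in 2012 and 2013
    moves = {
        "Risk A": ((3.4, 3.5), (3.7, 3.8)),
        "Risk B": ((3.9, 4.1), (4.1, 4.2)),
        "Risk C": ((3.6, 3.2), (3.5, 3.3)),
    }
    fig, ax = plt.subplots()
    for name, ((x0, y0), (x1, y1)) in moves.items():
        ax.annotate("", xy=(x1, y1), xytext=(x0, y0),
                    arrowprops=dict(arrowstyle="->"))  # arrow from 2012 to 2013
        ax.text(x1, y1, " " + name)
    ax.set_xlabel("Likelihood")
    ax.set_ylabel("Impact")
    ax.set_xlim(3, 4.5)   # show only the top-right sub-section, as in the report
    ax.set_ylim(3, 4.5)
    plt.show()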

On a more methodological note, I am not sure whether surveys are a very reliable instrument for identifying actual risks; they more likely capture the perception of risks. It is quite possible that some unknown risks – such as the unprecedented terrorist attacks in the US on 9/11 – outweigh the ones covered here. That said, the wisdom of crowds tends to be a good instrument for identifying the perception of known risks.

Note the “Severe income disparity” risk near the top-right, related to the phenomenon of economic inequality we have looked at in various posts on this Blog (Inequality and the World Economy or Underestimating Wealth Inequality).

The report also gives a tabular view of the top 5 risks over the last seven consecutive years:

Top 5 risks by likelihood and impact over seven consecutive years

This format provides a feel for the dominance of risk categories (frequency of colors, such as the dominance of blue = economic risks) and for year-over-year changes (little change from 2012 to 2013). The 2011 column on likelihood is a bit of an outlier, with four of five risks being green (= environmental) after four years without any green risk in the Top 5. I suspect that this was the result of the broad global media coverage after the March 2011 earthquake off the coast of Japan, with the resulting tsunami inflicting massive damage and loss of life, as well as the Fukushima nuclear reactor catastrophe. Again, this reinforces my belief that we are looking at the perception of risk rather than actual risk.

Another aggregate visualization of the risk landscape comes in the form of a matrix of heat-maps indicating the distribution of survey responses.

Matrix of heat-maps showing the distribution of survey responses

The darker the color of the tile, the more often that particular likelihood/impact combination was chosen in the survey. There is a clear positive correlation between likelihood and impact as perceived by the majority of the experts in the survey. From the report:

Still it is interesting to observe how for some risks, particularly technological risks such as critical systems failure, the answers are more distributed than for others – chronic fiscal imbalances are a good example. It appears that there is less agreement among experts over the former and stronger consensus over the latter.

The report includes many more variations on this theme, such as scatterplots of risk perception by year, gender, age, region of residence etc. Another line of analysis concerns the center of gravity, i.e. the degree of systemic connectivity between risks within each category, as well as the movement of those centers year over year.

Another set of interesting visualizations comes from the connections between risks. From the report:

The top 5 selected risk connections

The 10 most connected risks

Finally, the survey asked respondents to choose pairs of risks which they think are strongly interconnected. They were asked to pick a minimum of three and maximum of ten such connections.

Putting together all chosen paired connections from all respondents leads to the network diagram presented in Figure 37 – the Risk Interconnection Map. The diagram is constructed so that more connected risks are closer to the centre, while weakly connected risks are further out. The strength of the line depends on how many people had selected that particular combination.

529 different connections were identified by survey respondents out of the theoretical maximum of 1,225 combinations possible. The top selected combinations are shown in Figure 38.

It is also interesting to see which are the most connected risks (see Figure 39) and where the five centres of gravity are located in the network (see Figure 40).
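The construction described above – tallying pair votes into a weighted graph – can be sketched with networkx; the sample responses below are invented:

    from collections import Counter
    import networkx as nx

    # One list of chosen risk pairs per survey respondent (3 to 10 pairs each)
    responses = [
        [("fiscal imbalances", "income disparity"), ("water crisis", "food crisis")],
        [("fiscal imbalances", "income disparity"), ("cyber attacks", "systems failure")],
    ]
    votes = Counter(tuple(sorted(pair)) for resp in responses for pair in resp)

    # Edge weight = number of respondents who selected that connection
    G = nx.Graph()
    for (a, b), count in votes.items():
        G.add_edge(a, b, weight=count)

    # Most connected risks, analogous to Figure 39
    print(sorted(G.degree, key=lambda node_deg: node_deg[1], reverse=True))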

One such center of gravity graph (for geopolitical risks) is shown here:

Center of gravity graph for geopolitical risks

The Risk Interconnection Map puts it all together:

The Risk Interconnection Map (Figure 37)

Such fairly complex graphs are more intuitively understood in an interactive format. This is where the online Data Explorer comes in. It is a very powerful instrument to better understand the risk landscape, risk interconnections, risk rankings and national resilience analysis. There are panels to filter, the graphs respond to mouse-overs with more detail and there are ample details to explain the ideas behind the graphs.

The online Data Explorer

There are many more aspects to this report, including the appendices with survey results, national resilience rankings, three global risk scenarios, five X-factor risks, etc. For our purposes here suffice it to say that the use of advanced data visualizations together with online exploration of the data set is a welcome evolution of such public reports. A decade ago no amount of money could have bought the kind of interactive report and analysis tools which are now available for free. The clarity of the risk landscape picture that’s emerging is exciting, although the landscape itself is rather concerning.

 
Posted on January 31, 2013 in Industrial, Socioeconomic

Circos Data Visualization How-to Book

Earlier this year we looked at a powerful data visualization tool called Circos, developed by Martin Krzywinski of the Genome Sciences Centre in British Columbia. The previous post looked at an example of how this tool can be used to show complex connectivity pathways in the human neocortex, so-called connectograms.

Circos Book Cover

The Circos tool can be used interactively on the above website. In that mode you upload jobs via tabular data and configuration files and have limited control over the rendering of the resulting charts. For full expressive power and flexibility, Circos can also be downloaded for free and run on your own computer, with extensive customization control over the resulting charts.

I have been asked to review a new book titled “Circos Data Visualization How-to”, published by Packt Publishing here. Its main goal is to guide you through the download and installation process and get you started with Circos charts and their modification. Here is a brief review of the book.

Although originally developed for visualizing genomic data, Circos has been applied to many other complex data visualization projects, including in the social sciences. One such study was done by Tom Schenk, who analyzed the relationships between college majors and the professions those graduates ended up in. It appears that this work inspired him to write this book to help others use Circos.

I downloaded the book in Kindle format and read it on the Mac, owing to the color graphics and the much larger screen size. It is well structured and around 70 pages in printed form. The book focuses first on the download and install part, then presents a series of examples from a first chart to more complex ones using customizations such as colors, ribbons, heat maps or dynamic binding.

Flow Chart for creation of Circos charts

Circos is essentially a set of Perl modules combined with the GD graphics library.

The first part is on installing Circos, with a chapter each on Windows 7 and on Linux or Mac OS. Working on a Mac, I went the latter route. I ended up right in the weeds, and it took me about 4 hours to get everything installed and working. The description is derived from a Linux install and is generally somewhat terse. It assumes you have all prerequisite tools installed on your Mac, or at least that you are savvy enough to figure out what’s missing and where to get it. I had to dust off some of my Unix skills and go hunting for solutions via Google to a list of install problems:

  • directory permissions (I needed to wrap the exact instructions with sudo)
  • installing Xcode tools from Apple for my platform (make was not preinstalled)
  • understanding cause of error messages (Google searches, Google group on Circos)
  • locating and installing the GD graphics library (helpful installing-circos-on-os-x tips by Paulo Nuin)
  • version and location issues (many libraries are in ongoing development; some sources have moved)

Others may find this part a lot easier, but I would say there should be an extra chapter for the Mac with tips and explanations for some of these speed bumps. On the plus side, the Google group seems to be very active, and I found frequent and recent answers by Circos author Martin Krzywinski.

The next part of the book is easy to understand. One creates a simple hair-to-eye color relationship diagram. Then configuration files are introduced to customize colors and chart appearance. All required data and configuration files are also contained in the companion download from the Packt Publishing book page.

Chart of relationship between hair and eye colors
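To give a feel for what those files look like, here is a hedged sketch that writes a karyotype-style segment file, a link file and a bare-bones configuration from Python and then invokes the circos binary (assumed to be on the PATH after the installation described above). The counts follow the classic hair/eye color survey data, but treat the files as illustrative rather than the book’s exact examples.

    import pathlib
    import subprocess
    import textwrap

    pathlib.Path("data").mkdir(exist_ok=True)

    # Segments: one band per hair and eye color
    # (karyotype line format: chr - id label start end color)
    pathlib.Path("data/segments.txt").write_text(
        "chr - hair_black hair_black 0 108 black\n"
        "chr - hair_blond hair_blond 0 127 yellow\n"
        "chr - eye_brown eye_brown 0 220 vdgrey\n"
        "chr - eye_blue eye_blue 0 215 blue\n"
    )
    # Links: hair color -> eye color (format: seg1 start1 end1 seg2 start2 end2)
    pathlib.Path("data/links.txt").write_text(
        "hair_black 0 68 eye_brown 0 68\n"
        "hair_blond 0 94 eye_blue 0 94\n"
    )
    pathlib.Path("circos.conf").write_text(textwrap.dedent("""\
        karyotype = data/segments.txt
        <ideogram>
        <spacing>
        default = 0.01r
        </spacing>
        radius    = 0.9r
        thickness = 25p
        fill      = yes
        </ideogram>
        <links>
        <link>
        file          = data/links.txt
        radius        = 0.95r
        bezier_radius = 0r
        </link>
        </links>
        <image>
        <<include etc/image.conf>>
        </image>
        <<include etc/colors_fonts_patterns.conf>>
        <<include etc/housekeeping.conf>>
        """))
    subprocess.run(["circos", "-conf", "circos.conf"], check=True)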

The last part of the book goes into more advanced topics such as customizing labels, links and ribbons, formatting links with rules, reducing links through bundling, and adding data tracks as heat maps or histograms. This is the meat for those who intend to use Circos in more advanced ways. I did not spend a lot of time here, but found the examples to be useful.

Contributions by State and Political party during 2012 U.S. Presidential Elections

This section ends abruptly. One gets the feeling that there are other subtleties that could be explored and explained. A summary or outlook chapter would have been nice to wrap up the book and give perspective. For example, I would have liked to hear from the author how much time he spent with the various features during his college-majors-to-professions project.

In summary: this book will get you going with Circos on your own machine. Installing can be a challenge on the Mac, depending on how familiar you are with Unix and the open source tool stack. The examples for your first Circos charts are easy to follow and explain the data and configuration files. The more advanced features are briefly touched upon, but require more experimentation and time to understand and appreciate.

Circos author Martin Krzywinski writes on his website: “To get your feet wet and hands dirty, download Circos and read the tutorials, or dive into a full course on Circos.” The How-to book by Tom Schenk helps with this process, but you still need to come prepared. If you are a Unix power user, this should feel familiar. If you are a Mac user who rarely ever opens a Terminal, then you might be better off just using Circos via the tableviewer web interface.

Lastly, I would recommend buying the electronic version of this book, as you can cut & paste the code and leverage the companion code and documents. A printed version of this book would be of very limited use.

 
Posted on December 6, 2012 in Education, Scientific

Connectograms and Circos Visualization Tool


Yesterday (May 16) the Public Library of Science (PLoS) published a fascinating article titled “Mapping Connectivity Damage in the Case of Phineas Gage”. It analyzes the brain damage which the famous trauma victim sustained when an accident drove a steel rod through his skull. Railroad worker Phineas Gage survived the accident and lived for another 12 years, albeit with significant behavioral changes and anomalies. Those changes were severe enough that he had to discontinue his work, and he became estranged from his friends, who stated he was “no longer Gage”. This has become a much-studied case about the impact of brain damage on behavioral anomalies. Since the accident happened more than 150 years ago, there are no autopsy data or brain scans of Phineas Gage’s brain. So how did the scientists reconstruct the likely damage?

For a few years now there has been growing interest in the human connectome. Just as the genome is a map of human genes, the connectome is a map of the connectivity in the human brain. The human brain is enormously complex: most estimates put the number of neurons in the hundreds of billions and the synaptic interconnections in the hundreds of trillions! Using diffusion weighted imaging (DWI) and magnetic resonance imaging (MRI), one can identify detailed neuronal connectivity. This is such a challenging endeavor that it drives the development of many new technologies, including data visualization. The image resolution and post-processing power of modern instruments are now great enough to create detailed connectomes that show the major pathways of neuronal fibers within the human brain.

The authors, from the Laboratory of Neuro Imaging (LONI) in the Neurology Department at UCLA, have studied the connectomes of a population of N=110 healthy young males (similar in age and handedness to Phineas Gage at the time of his accident). From this they constructed a typical healthy connectome and visualized it as follows:

Circular representation of cortical anatomy of normal males (Source: PLoS ONE)

Details of the graphic are explained in the PLoS article. The outermost ring shows the various brain regions by lobe (fr – frontal, ins – insula, etc.). The left (right) half of the connectogram represents the left (right) hemisphere of the brain, and the brain stem is at the bottom (6 o’clock position) of the graph.
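The layout just described – one hemisphere per half-circle, brain stem at the 6 o’clock position – can be sketched in matplotlib polar coordinates; the region labels below are placeholders, not the parcellation used in the paper.

    import numpy as np
    import matplotlib.pyplot as plt

    lobes = ["fr", "ins", "tem", "par", "occ"]   # placeholder region labels

    fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
    # 12 o'clock is pi/2; sweep each hemisphere from top towards the bottom,
    # leaving the 6 o'clock slot (-pi/2) free for the brain stem.
    theta_left  = np.linspace(np.pi / 2 + 0.3, 3 * np.pi / 2 - 0.3, len(lobes))
    theta_right = np.linspace(np.pi / 2 - 0.3, -np.pi / 2 + 0.3, len(lobes))
    for name, tl, tr in zip(lobes, theta_left, theta_right):
        ax.text(tl, 1.0, "L-" + name, ha="center", va="center")
        ax.text(tr, 1.0, "R-" + name, ha="center", va="center")
    ax.text(-np.pi / 2, 1.0, "brain stem", ha="center", va="center")
    ax.set_xticks([])
    ax.set_yticks([])
    ax.set_ylim(0, 1.1)
    plt.show()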

Connectograms are circular representations introduced by LONI researchers in their NeuroImage article “Circular representation of human cortical networks for subject and population-level connectomic visualization“:

This article introduces an innovative framework for the depiction of human connectomics by employing a circular visualization method which is highly suitable to the exploration of central nervous system architecture. This type of representation, which we name a ‘connectogram’, has the capability of classifying neuroconnectivity relationships intuitively and elegantly.

Back to Phineas Gage: His skull has been preserved and is on display at a museum. Through sophisticated spatial and neurobiological reasoning the researchers reconstructed the pathway of the steel rod and thus the damaging effects on white matter structure.

Phineas Gage Skull with reconstructed steel rod pathway and damage (Source: PLoS ONE)

Based upon this geospatial model of the damaged brain, overlaid against the typical brain connectogram from the healthy population, they created another connectogram indicating the connections between brain regions lost or damaged in the accident.

Mean connectivity affected in Phineas Gage by the accident damage (Source: PLoS ONE)

From the article:

The lines in this connectogram graphic represent the connections between brain regions that were lost or damaged by the passage of the tamping iron. Fiber pathway damage extended beyond the left frontal cortex to regions of the left temporal, parietal, and occipital cortices as well as to basal ganglia, brain stem, and cerebellum. Inter-hemispheric connections of the frontal and limbic lobes as well as basal ganglia were also affected. Connections in grayscale indicate those pathways that were completely lost in the presence of the tamping iron, while those in shades of tan indicate those partially severed. Pathway transparency indicates the relative density of the affected pathway. In contrast to the morphometric measurements depicted in Fig. 2, the inner four rings of the connectogram here indicate (from the outside inward) the regional network metrics of betweenness centrality, regional eccentricity, local efficiency, clustering coefficient, and the percent of GM loss, respectively, in the presence of the tamping iron, in each instance averaged over the N = 110 subjects.

The point of the above quote is not neuroscientific precision. Experts can interpret these images and advance our understanding of how the brain works – I am certainly no expert in this field, not even close. The point is to show how advances in imaging and data visualization technologies enable interdisciplinary research which just a decade ago would have been impossible to conduct. There is also a somewhat artistic quality to these images, which reinforces the notion of data visualization being both art and science.

The tool used for these visualizations is called Circos. It was originally developed for genome and cancer research by Martin Krzywinski at the Genome Sciences Centre in Vancouver, Canada. Circos can be used for circular visualizations of any tabular data, and the above connectome visualization is a great application. Martin’s website is very interesting in terms of both visualization tools and projects. I have already started using Circos – which is available both for download and in an online tableviewer version – for some visualization experiments which I may blog about in the future.

 
Posted on May 17, 2012 in Scientific

 