Using Lumira and domain-specific KPIs to understand and pilot research institutions
This blog post summarizes part of the work I performed during my internship at SAP Labs France under the supervision of Jean-Christophe Pazzaglia.
Visibility of research is of paramount importance for the long-term sustainability of research organizations and universities: it helps them establish their brands, enhance the researcher experience, improve operational performance, and sustain the innovation that makes each institution unique.
Understanding the data surrounding research activities gives you the opportunity to identify your strengths and weaknesses and to gain a head start, so you can set the pace yourself. Adopting a common methodology will ensure performance and collaboration throughout the Higher Education and Research sector in monitoring, understanding, and predicting challenges.
The aim of our prototype is to support key stakeholders of research institutions in making decisions based on research-specific metrics: metrics that embrace the specificities of the research domain, since academic research is not primarily a profit-driven sector. We also kept in mind the need for tools that enable future exchanges and benchmarking, as the research domain is by nature both collaborative and competitive.
Our objectives can be summarized as follows:
- Addressing the specific requirements of research by taking into consideration domain-specific KPIs, such as the Snowball Metrics, on top of more traditional budget-related KPIs;
- Choosing to work with the standard data model CERIF instead of a homegrown data model, in order to let institutions reuse and exchange the information used to compute metrics, thereby enabling benchmarking;
- Exploring how data available on social networks and open resources can enrich an institution's self-perception, including online reputation, which is today a concern in every sector. This was made easy by Lumira's data-source extension capabilities. Our adaptor was based on XSLT to fetch data from websites providing research information in RSS or XHTML format, such as OpenAIRE, and we also developed a scenario exploiting the Twitter Data Extension.
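To illustrate the kind of work such an adaptor does, here is a minimal Python sketch that extracts publication entries from an RSS 2.0 feed. It is only an illustration of the parsing step, not the actual XSLT-based Lumira extension; the sample feed content is invented for the example.

```python
import xml.etree.ElementTree as ET

# Invented sample of an RSS 2.0 feed such as a research portal might expose.
SAMPLE_RSS = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Publications</title>
    <item><title>Paper A</title><link>http://example.org/a</link></item>
    <item><title>Paper B</title><link>http://example.org/b</link></item>
  </channel>
</rss>"""

def extract_items(rss_text):
    """Return (title, link) pairs for every <item> in an RSS 2.0 document."""
    root = ET.fromstring(rss_text)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

print(extract_items(SAMPLE_RSS))
```

In the real adaptor this transformation was expressed in XSLT, so that the same mechanism could serve any site exposing RSS or XHTML.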
Following a BDD (Behaviour-Driven Development) approach, our tests were the concrete realizations and presentations of the scenarios we had previously created, and our dashboards and visualizations were refined and improved through the successive presentations and exchanges.
With each version, every visualization became more focused on a single yet complex question: its scope narrowed, while the precision and value of its result grew.
The visualizations also became interactive: they are designed to make the best use of the drill-down function, keeping each filtered result as visually pleasant and informative as the source view.
Here are some of the dashboards we created as part of our scenario:
Budget Management Dashboard
This dashboard gives us, at a glance, an insight into the yearly allocated budget: how much of it has been spent as of today, and the corresponding budget consumption percentage, these metrics being presented in various point charts. In the table diagram, we can see which budget consumption percentages are the most critical.
What if you wanted to compare your budget consumption percentage as of today with the final budget consumption percentages of previous years? Don't worry, our dashboard enables that as well, showing the budget consumption of the two previous years together with the current situation and the allocated budget expectations for next year in a single radar chart.
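The consumption metric behind these charts is simple; the following sketch shows how it could be computed, with invented figures standing in for real institutional data:

```python
def consumption_pct(spent, allocated):
    """Budget consumption as a percentage of the allocated budget."""
    return 100.0 * spent / allocated

# Invented figures: final consumption of two past years vs. the current year to date.
years = {
    "2013":           consumption_pct(9.2e5, 1.0e6),
    "2014":           consumption_pct(1.05e6, 1.1e6),
    "2015 (to date)": consumption_pct(6.0e5, 1.2e6),
}
for year, pct in years.items():
    print(f"{year}: {pct:.1f}%")
```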
If you also want to know how each fund received by the institute has been spent, you can clearly see its “spending flow” through our Sankey diagram.
Thanks to the input control we added, we can filter this view (as well as all the others) by department. In the following example, we filtered the Budget Management Dashboard to keep only what is relevant to the EIT Department.
Portfolio Analysis Dashboard
In the first visualization we calculated cost approximations, using a heuristic based on:
• Average time spent by a researcher to create a publication or an IDF (Invention Disclosure Form), or to participate in a service delivery that has a social or economic impact
• Average annual salary of the institute's researchers
• Estimated administrative costs of publication
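One plausible instantiation of such a heuristic (the exact formula we used is not reproduced here; the weighting below and all figures are assumptions for illustration) converts the average time into a labour cost via an hourly rate and adds the administrative overhead:

```python
def estimated_cost(hours_spent, annual_salary, annual_working_hours=1600,
                   admin_cost=500.0):
    """Rough cost of one output (publication, IDF, service delivery):
    labour cost at the researcher's average hourly rate, plus a fixed
    administrative overhead. All parameters are illustrative assumptions."""
    hourly_rate = annual_salary / annual_working_hours
    return hours_spent * hourly_rate + admin_cost

# Invented example: 80 hours of work at a 48k annual salary.
print(f"{estimated_cost(80, 48000):.0f} EUR")
```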
The Portfolio Analysis, represented by the bubble chart, shows the percentage of budget consumption of each project against its percentage of time consumption (calculated as the time elapsed from the start date until today, or until the end date, divided by the total duration).
This chart also takes the initial budget into consideration (as the bubble radius), since overspending or underspending is perceived as much more critical for projects with large initial budgets.
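The two axes of each bubble can be sketched as follows; the project dates, spending, and reference date are invented for the example:

```python
from datetime import date

def time_consumption_pct(start, end, today=date(2015, 10, 1)):
    """Share of the planned project duration elapsed so far, capped at the
    end date (a project past its end date reads as 100%)."""
    elapsed = (min(today, end) - start).days
    total = (end - start).days
    return 100.0 * elapsed / total

def budget_consumption_pct(spent, initial):
    """Share of the initial project budget spent so far."""
    return 100.0 * spent / initial

# Invented project: runs Jan 2015 - Dec 2016, 300k spent of a 1M initial budget.
t = time_consumption_pct(date(2015, 1, 1), date(2016, 12, 31))
b = budget_consumption_pct(3.0e5, 1.0e6)
print(f"time {t:.0f}% vs budget {b:.0f}%")  # one bubble; radius ~ initial budget
```

A project whose budget percentage runs well ahead of its time percentage is overspending, and the bubble radius tells you how much that matters.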
Citation Impact & H-index Dashboard
Citation impact is a metric giving the average number of citations collected by a project's related publications. Additionally, we opted for another average, calculated over the projects' citation impacts, to represent the citation impact of the department.
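Both levels of averaging can be sketched in a few lines; the citation counts below are invented for the example:

```python
def citation_impact(citations_per_pub):
    """Average number of citations per publication for one project."""
    return sum(citations_per_pub) / len(citations_per_pub)

# Invented data: citations of each publication, grouped by project.
projects = {"P1": [12, 4, 8], "P2": [2, 6]}

impacts = {p: citation_impact(c) for p, c in projects.items()}
# Department impact: the average of the project-level impacts.
dept_impact = sum(impacts.values()) / len(impacts)
print(impacts, dept_impact)
```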
The h-index is an author-level metric that attempts to measure both the productivity and the citation impact of a scientist's or scholar's publications: a scholar has an index of h if h of their papers have each been cited at least h times.
Since the h-index can be influenced by the number of years of experience, we used its yearly variation in order to give younger researchers' efforts a chance to be perceived.
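The definition above translates directly into code; the citation counts are invented, but the computation is the standard one:

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Invented citation counts per paper at the end of two successive years.
h_2014 = h_index([10, 8, 5, 4, 3])      # -> 4
h_2015 = h_index([12, 10, 8, 6, 5, 3])  # -> 5
print("yearly variation:", h_2015 - h_2014)
```

The yearly variation (here +1) is what our dashboard plots, so that a young researcher's recent progress is visible even while their absolute h-index is still low.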
We also compared, in another bar chart, the number of citations a paper received with the number of its co-authors, to look for correlations between the two: does a large number of contributors to a research paper improve its citation impact?
Patent Story Dashboard
When an IDF is created, it is first submitted to the internal patent department, which evaluates the IDF and decides whether to submit it to the (regional/national…) patent office, which in turn validates or refuses the patent creation.
In this visualization, we can see the conversion of our IDFs into patents between 2009 and 2015, knowing that an IDF needs on average 4 years to be turned into a patent. We also computed a ratio, by department, showing how likely an IDF is to become an actual patent.
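That per-department ratio is simply patents granted over IDFs submitted; the counts below are invented for the example (restricting the IDFs to those old enough for the ~4-year conversion delay to have elapsed):

```python
# Invented counts: IDFs submitted long enough ago to have had a chance
# to convert, and patents actually granted, per department.
idfs    = {"EIT": 40, "Security": 25}
patents = {"EIT": 14, "Security": 5}

ratios = {dep: patents[dep] / idfs[dep] for dep in idfs}
for dep, r in ratios.items():
    print(f"{dep}: {r:.0%} of IDFs became patents")
```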
These dashboards were presented at EDUCAUSE 2015, a premier higher-education IT event that took place from the 28th to the 30th of October in Indianapolis, United States.
The prototype was shown to a selected number of customers to demonstrate how BI tools such as HANA and Lumira can easily be used to create solutions specific to the higher education and research sector, embracing the domain's specificities and addressing its needs and challenges.
For further details or follow-up, you can contact me or Jean-Christophe Pazzaglia.