In the education sector, unifying data can produce a range of valuable business assets, but the process has been distinctly difficult to manage. This applies equally to a university or large high school whose faculties are spread across a wide geographical area, and to a technical or community college with real diversity of students and subjects. Each department or faculty generally has its own budget, proprietary systems that have built up over many years, and a habit of managing its own data according to in-house rules.
This can make digital transformation much harder than it ought to be, with data tied up in silos, digitally dispersed and notoriously hard to access.
Speaking to Paul Moxon, Senior VP at data virtualisation company Denodo, reveals that there is much to be gained by taking the plunge and getting communal data working across an entire campus. Whereas traditional processes for accessing data require everyone to buy in and deposit the data in a central warehouse, virtualising that data means that it can stay in each individual faculty or campus, but still be accessed at great speed upon demand.
“The average campus will have assets distributed very widely, with data scattered across locations and different markets – even semantics can be different from one faculty to the next, with alternate phrases and references being used for what is essentially the same data,” states Moxon.
Further to this, individual faculties tend to have a sense of ownership about their data, which sets up further challenges.
“In some cases it is almost like a feudal fiefdom. There is an attitude that ‘this is my school, and I own the data.’”
This makes it very difficult to get timely information to executives and decision makers in a university, hampering decision-making processes.
“In one particular school that deploys our virtualised data solution, it used to take about four weeks to collect real data on how many people were on a particular campus – so the data was actually out of date before it was visualised. This is a classic scenario for schools and universities across the world.”
In another case, Moxon referenced a high school in the US that had deployed a virtualised data solution to keep on top of a student body numbering over 2,000, bringing student records, attendance, assessments and exam results, as well as performance analysis, all into one place for fast and easy reporting.
In 2015, Indiana University tasked its IT department with improving the timeliness of information, moving the institution away from ‘best guess’ decision making. This was to be a core part of its strategic plan for 2020 and beyond, with the aim of empowering decision makers with timely, accurate information.
Looking at the existing collection points for data, and how each faculty governed them, the Indiana IT department decided not to force teams to give up their data, but instead to build a logical architecture – in other words, leave data where it was, within student record systems, Salesforce, HR systems and so forth, and use data virtualisation to pull up individual parts of the data as required, visualising it when and where it was needed.
This eliminated the need to work through various compliance hurdles and centralise the data: data virtualisation meant the data could be accessed from wherever it already lived, effectively building a logical data warehouse.
“Virtualising the data meant that stakeholders – let’s say it was the Dean – can go and look at the platform, say ‘give me data on such-and-such topic’, then Denodo will go back to each individual faculty or admin system, find the data requested, and produce it on say, Tableau or another visualisation platform in pretty pictures for the Dean.”
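The pattern Moxon describes – one request fanning out to several faculty systems and the results being joined on demand – can be sketched in a few lines of Python. This is only an illustration of the federated-query idea, not Denodo's actual API; the source names, fields and data here are all invented:

```python
# Minimal sketch of federated querying: one query fans out to several
# faculty-level sources and the results are merged at request time,
# leaving each dataset where it lives. All names and data are hypothetical.

# Each list stands in for a separate faculty or admin system.
STUDENT_RECORDS = [
    {"id": 1, "name": "Ana", "faculty": "Science"},
    {"id": 2, "name": "Ben", "faculty": "Arts"},
]
ATTENDANCE = [
    {"id": 1, "attendance_pct": 92},
    {"id": 2, "attendance_pct": 78},
]

def federated_query(faculty):
    """Join data from both sources on demand, without centralising it."""
    by_id = {row["id"]: row for row in ATTENDANCE}
    return [
        {**student, **by_id.get(student["id"], {})}
        for student in STUDENT_RECORDS
        if student["faculty"] == faculty
    ]

print(federated_query("Science"))
```

The joined result could then be handed to a visualisation tool such as Tableau, as in the example Moxon gives.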
Regarding security, Moxon tells us that there is a security control built into the Denodo platform, which helps control what data is seen and by whom.
“I may be a professor of computer science, for example, or teaching digital media to high school students, so will need to have more access to data than others. It is a decision-support system whereby data is aggregated, so as a teacher or professor you don’t necessarily need to see records for an individual student, but other people – like maybe a counsellor or teaching assistant – may need to see grades for the student they are tutoring. It is also important to be able to hide information on request, if there is a need to protect an individual’s privacy.”
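The access model Moxon outlines – aggregate views for some roles, per-student detail only for those who need it – amounts to role-based filtering at the query layer. A minimal sketch in Python, where the role names, rules and data are illustrative assumptions rather than the Denodo platform's actual controls:

```python
# Sketch of role-based data access: teachers see only an aggregate,
# tutors see records only for their own students. Roles, rules and
# data are invented for illustration.

GRADES = [
    {"student": "Ana", "tutor": "Mr Lee", "grade": 85},
    {"student": "Ben", "tutor": "Ms Kaur", "grade": 62},
]

def view_grades(role, user=None):
    if role == "teacher":
        # Aggregated view only: no individual records are exposed.
        return {"average": sum(r["grade"] for r in GRADES) / len(GRADES)}
    if role == "tutor":
        # Detail view restricted to the students this tutor supports.
        return [r for r in GRADES if r["tutor"] == user]
    raise PermissionError("role not authorised to view grades")

print(view_grades("teacher"))          # aggregate only
print(view_grades("tutor", "Mr Lee"))  # one tutor's students
```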
While schools and universities are finding data virtualisation to be a very efficient way of accessing data without building a data warehouse or central repository, what they are actually empowered to do with that data is less obvious.
“Schools are using data virtualisation to make it quicker to access data, and over time they will probably find new ways to take advantage of that increased speed and access.”
Moxon suggests that this progression may lead to institutions using Artificial Intelligence and predictive learning to flag potential drop-outs and students whose marks are beginning to decline, thereby improving student satisfaction and retention rates.
What we are seeing now is really just the tip of the iceberg when you look at the potential for data use over the next few years.
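As a toy illustration of the kind of early-warning signal Moxon anticipates, a declining mark trend can be flagged with a simple slope check. This stands in for the predictive models the article alludes to; the threshold and data are invented:

```python
# Illustrative drop-out early-warning check: flag a student whose
# assessment marks are falling faster than a chosen threshold.
# The threshold and marks are hypothetical.

def mark_trend(marks):
    """Average change between consecutive assessment marks."""
    deltas = [b - a for a, b in zip(marks, marks[1:])]
    return sum(deltas) / len(deltas)

def at_risk(marks, threshold=-5):
    """True when marks are declining more steeply than the threshold."""
    return mark_trend(marks) <= threshold

print(at_risk([80, 72, 61]))  # steep decline -> True
print(at_risk([70, 72, 71]))  # stable -> False
```

A real deployment would of course train a model on historical outcomes rather than hand-pick a threshold, but the principle – timely data enabling intervention before a student drops out – is the same.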
“If you empower employees and academics with the information they need, they can do wonderful things. The people directly involved know best how they can make their own lives easier, make better decisions for their students. It is about being able to empower them with the info that they need, when and where they need it.”