Data Changemakers 1: James Wilkinson, CEO, Entity Group
The Data Changemakers series is a set of interviews and interactions with people who have spent their careers working in or around data and data management initiatives. They have a vision for the data journey and we want to understand what they have learnt and how that drives what they do today. What are their war stories and what advice can they give others embarking on the journey?
– Please describe a little about your own background and how you ended up working with data?
Getting into working with data was a happy accident. I had been working for Entity Group since 1989, when they were a leader in the introduction of e-commerce solutions to the retail industry, and in 1994 I joined Bachman Information Systems, a company established by Charlie Bachman, a pioneer of database management systems and data modelling, to work with data modelling and CASE tools. I’ve been working with data-related companies ever since.
– Would you say that you are a business person or a technical person or something else?
I am definitely more on the business side now but have a technical heritage as a developer, solutions architect and consultant on large mainframe and Unix systems.
– What is your current role and its main responsibilities as they relate to data?
I am the CEO of Entity Group. As Entity is specifically a data-focused organisation you could say that all of my responsibilities relate to data!
– What has been the most challenging data-related project you have worked on and why? What was your role in it and was the project a success and why?
Certainly, one of the most challenging projects I’ve worked on was ContactPoint, the national register of children. This was a national system bringing together data on eleven million children across many government agencies and hundreds of millions of records. It involved multiple sources, record formats and stakeholders, and it was really rewarding because of the social benefits it was designed to deliver.
I ran the consulting business for the primary master data software vendor, and we supplied the software and services to the prime contractor for the software implementation element of the solution. The project was a huge technical success in that it was delivered on time and to budget, and rewarding in that we felt we were building something that would help vulnerable children. The solution was cancelled after the 2010 General Election for a variety of political reasons but was, nevertheless, a hugely satisfying project to complete because it was so challenging. Interestingly, the shared services agenda in local government is now beginning to create similar data sharing initiatives between local government, health, social care and policing.
– What did you learn from the experience?
It very much brought home the importance of politics (small ‘p’!) when dealing with system implementations involving multiple stakeholders and data owners. This is true whether the project is small or large: getting everyone aligned and communicating, agreeing desired and priority outcomes, and then keeping them aligned is the most difficult piece.
– What do you think are the key trends in data management today and how do you think it will change the way we all do business?
Artificial Intelligence and Big Data require a data environment that will allow them to succeed. “Garbage in, garbage out” is still as true as it was when I started in the industry 30 years ago.
We are now seeing the trend towards Information Management platforms. For the last few years, information management has been seen as a combination of data governance, data description, data integration, data security, master data, reference data, data stewardship and so on, and enterprises have sought individual solutions to specific problems. I believe that the industry is now growing up and that there is recognition that good data management is a function of all of these working in an integrated manner. Hence the trend toward more comprehensive data management platforms.
Of particular interest to us right now is the revitalisation of the drive for better data governance and data description.
What I mean by this is that 30 years or so ago, we all knew about data dictionaries but no one was really interested because, well, they are pretty boring and almost impossible to maintain. There is, however, a recognition by senior business leaders that competitive advantage and regulatory compliance can be derived from capitalising on the existing data asset within the organisation through AI and Big Data initiatives. These initiatives, however, cannot succeed unless the underlying data is trusted and understood in terms of its provenance and quality. This, coupled with AI-based data discovery techniques, has led to a renaissance in data description solutions.
Data governance is equally important, and knowing how much you can trust information is as important as the information itself. Putting in place processes to measure data quality and to curate the data asset is the only way to ensure trust in the information that you use.
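To make the idea of measuring data quality concrete, here is a minimal, illustrative sketch in Python. The dataset, field names and rules (a completeness score and a basic email-validity check) are invented for the example and are not from Entity Group or any project mentioned in the interview.

```python
# Hypothetical example: scoring a tiny customer dataset against two
# simple data-quality rules -- completeness and format validity.
import re

CUSTOMERS = [
    {"name": "Ada Lovelace", "email": "ada@example.com", "postcode": "SW1A 1AA"},
    {"name": "Charles Babbage", "email": "", "postcode": "CB2 1TN"},
    {"name": "Grace Hopper", "email": "grace@example", "postcode": ""},
]

# Deliberately simple pattern: something@something.something
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def completeness(records, field):
    """Fraction of records where the given field is non-empty."""
    filled = sum(1 for r in records if r.get(field, "").strip())
    return filled / len(records)

def email_validity(records):
    """Fraction of non-empty emails matching the basic pattern."""
    emails = [r["email"] for r in records if r.get("email", "").strip()]
    if not emails:
        return 0.0
    return sum(1 for e in emails if EMAIL_RE.match(e)) / len(emails)

if __name__ == "__main__":
    print(f"email completeness:    {completeness(CUSTOMERS, 'email'):.2f}")
    print(f"postcode completeness: {completeness(CUSTOMERS, 'postcode'):.2f}")
    print(f"email validity:        {email_validity(CUSTOMERS):.2f}")
```

In a real data stewardship programme these scores would be tracked over time per source system, so that trust in the data can be stated and monitored rather than assumed.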
Cloud is the other area. Off-premise concerns are diminishing but have not gone away entirely. Data management solutions are notoriously difficult to implement and to get ROI from, so it is good to be able to offer them in the cloud via subscription.
At Entity Group we are seeing an uplift in these types of solution. People still have concerns about keeping data secure outside their firewall, but in many circumstances the ability to run data management as an operational rather than a capital expense outweighs this concern.
– How are the pressures and proof points different now than they have been?
They are different for sure. People now know that they need to capitalise on their data – they still don’t necessarily know how – but the recognition that data is an asset is definitely there. Putting in business metrics around their data, defining them and tying them to the technology metrics is much more in evidence but there is some way to go yet.
– What do you think are the opportunities and obligations of data management?
The opportunity is significant. Gaining a consistent view of an entity and its relationships, for example a customer, is hard to value precisely but likely to be high for every business. The converse is that obligations like GDPR are also huge: for example, the ability to report everything that is known about an individual, where it is held, why, and what you are doing with it.
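The two sides of this point can be sketched together in a few lines of Python. This is a hypothetical example only: the source systems, field names and matching rule (keying on a normalised email address) are invented, and real master data matching is far more sophisticated. The sketch merges records from two systems into a single view while recording provenance, which is exactly what a GDPR-style "what do we hold, and where" report needs.

```python
# Hypothetical example: a single view of a customer, merged from two
# invented source systems and keyed on a normalised email address.

def norm(email):
    """Normalise an email so the same person matches across systems."""
    return email.strip().lower()

CRM = [{"email": "Jo@Example.com", "name": "Jo Smith", "phone": "0123"}]
BILLING = [{"email": "jo@example.com", "address": "1 High St"}]

def single_view(*sources):
    """Merge (system_name, records) pairs into one dict per customer."""
    merged = {}
    for system, records in sources:
        for rec in records:
            key = norm(rec["email"])
            view = merged.setdefault(key, {"sources": []})
            view["sources"].append(system)     # provenance: where data is held
            for field, value in rec.items():
                view.setdefault(field, value)  # first source wins on conflict
    return merged

if __name__ == "__main__":
    views = single_view(("crm", CRM), ("billing", BILLING))
    # One consolidated record, with provenance, per customer
    print(views["jo@example.com"])
```

The `sources` list is the seed of the obligation side: given a subject access request, the organisation can say not just what it knows about Jo, but which systems hold it.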
– Staffing – with such a wide range of things we need skills in – how do you approach staffing? What needs to be in house? What can be external? What can be consumed aaS? CDOaaS?
Data management has to be owned in-house and it must have senior sponsorship. The execution of it needs to encompass parameters such as a framework, governance and a data strategy. You can get help to build these but internal people need to own them. The delivery against these parameters can either be done internally or by a third party with internal oversight and involvement. Realistically, other than for very large enterprises, it would be too difficult to hire and maintain a team with all of the relevant skills to ensure good data management discipline across an enterprise. For this reason, we believe that a managed service approach offered by a trusted provider such as Entity Group is the best way to go for most enterprises.
– What are we asking people to do – build things, or get value from pre-built things?
The important thing with data management is that the strategy, framework and objectives need to be agreed and they need to be achievable. It is very unlikely that any organisation would have – or want to have – all the necessary skills in house because hopefully they are only doing or revising this once every 5 years or so. Much of the execution can be outsourced to a trusted third party. However, each organisation does need a CDO type role to own and co-ordinate data initiatives. These people can also then research and buy pre-built technology assets that fit the data strategy – not the other way around.
– What advice would you give to someone embarking on a large data-related project today?
Don’t! (Laughs). Consider the complexity of the environment. Where are you today? Where do you want to be? Build a road-map to that destination incrementally. The increments need to be small enough to be achieved but valuable enough to make a difference – to be ‘value points’ as we call them. This is because these projects are 3-5 years overall and in that time people and organisations can change a great deal. For a project not to stall it needs to be delivering value regularly and often.
– Are there any particular skills or qualifications you consider to be vital to your success?
Diplomacy, stubbornness and tenacity. But seriously, successful pan-organisational data initiatives are often the first time that all of the departments of an organisation are forced to work together towards a common goal and air some of their dirty linen to each other for the common good. There are all sorts of political, operational and technical difficulties to be overcome in doing this. Hence it is essential to understand how data is used in an organisation, how it can be brought together and what value it can deliver over what time. This needs an almost impossible mix of business analysis, information architecture and project management skills.
– What are you best known for or what do you like doing outside of your working life?
I like sailing and I have spent a lot of time singing opera.
– If you were reading this article what question would you want me to ask you?
I’d ask: “What gets you out of bed in the morning?”. The answer would be that most organisations, most of the time, have a huge business asset right at their fingertips that they are failing to maximise the value of. Their data. I love being involved in projects that show them how to build a platform that helps them to do this and gives them competence and confidence in their data.