As a large Dutch pension fund service provider, PGGM had a lot of data flowing into and around its organization, but no clear strategy (or solution) for managing all that data.
With multiple versions of the truth flowing into the reporting process, creating client reports was a sensitive and time-consuming task. At the same time, ambiguous data definitions resulted in small inconsistencies in the positions, risk, and performance reports on which the front office relies for its investment decisions.
Additionally, while permitted timelines for delivering regulatory reports are getting shorter, PGGM struggled with its time-to-market for reports, caused in part by an unscalable reporting process that required additional resources in the reporting department with every new client. Finally, PGGM’s reporting process was very labor-intensive and therefore expensive.
In order to advance, PGGM realized it needed more control over its data.
Five key data challenges were holding PGGM back:
- Multiple versions of the truth flowing into the reporting process
- Ambiguous data definitions resulting in small inconsistencies
- An overly long time-to-market for regulatory reporting
- Reporting process was not scalable
- A very labor-intensive (and therefore expensive) reporting process
Partnering up with SimCorp
Traditionally, PGGM has been reluctant to be a first mover with new technology. In this case, however, the company believed in the roadmap put forward by SimCorp and wanted to be a part of it from the beginning. PGGM became a pilot client for SimCorp’s Data Warehouse.
Sander Tegelaar, Manager Investment Management IT and Implementation, was a member of the project board and had the executive role in the last phase of the project. As he puts it, “This gave us a seat at the table and the ability to influence many aspects of SimCorp’s Data Warehouse. For example, we pushed SimCorp to include the cost area. While there can be a few stop-and-go moments for any pilot client, we felt that the process went really well and that SimCorp was committed to the project and our input on it.”
While PGGM had tried to build in-house data solutions in the past, it quickly realized that this caused more issues than it solved. It made much more sense to go with a vendor using a standard data model, with expertise in how best to structure data.
Reflecting upon the importance of standardization, Sander Tegelaar says that the “implementation effort to populate this standard data model was limited due to the standard interface from our core SimCorp Dimension solution.” Additionally, the technical deliverables were limited to PGGM-specific structures (freecodes) and external data sources. While the learning curve of the data warehouse tooling was quite steep, “we didn’t need to bring in additional permanent staffing in order to operate the data warehouse,” explains Sander Tegelaar.
Sander Tegelaar from PGGM Investments
Having the business organization own the data warehouse
PGGM considered several perspectives when looking at ownership of their data warehouse:
- The business users should be the designated owner of all the populated fields in the data warehouse
- The business users needed to make a switch from requesting fixed (Crystal) reports to a situation where they could either create reports and dashboards of their own or request data model extensions
- The IT department should take responsibility for the overall data model, to avoid redundancy and to validate expansion requests for the data model
Data governance is key
One of Sander Tegelaar’s key takeaways from the project was that a strong data governance structure is vital to any successful data warehouse project. From a technical perspective, he explains, the data warehouse was very feasible. His greater challenge was building business awareness and buy-in on the importance of data. This is a common challenge for data projects and shouldn’t be underestimated.
Sander Tegelaar took up the battle, however, and convinced the entire company of the importance of embedding a structure in which company-wide data definitions, controls, and ownership were clearly defined. “I’m proud we made that happen, because otherwise, you may have had a really nice box, but filled with bad data, and that doesn’t help anyone,” explains Sander.
PGGM introduced a data governance board that includes C-level management. The board reviews data elements, with solutions analyzed by a preparation team consisting of business representatives from across the full investment chain. A data dictionary file defines the business owner of each data element; the prep team also defines the source of the data, how it is controlled, how it is updated, and how it is defined. Once the data dictionary file has been approved by the governance board, there is no further discussion about how teams should interpret that data element. Sander sums up, “It also stops us from having a lot of parallel data elements in the system. So that was a big takeaway.”