ESG Research: Optimising ESG data for real estate: Ensuring quality, auditability, and compliance
ESG Data Integrity
REspond:
In the fifth article of this series, we delve into the critical role of data in ESG solutions for real estate, focusing on how data is managed, the significance of data quality, and the auditability of these systems. Data is the backbone of effective software, which makes accurate and reliable information a necessity. Auditability is therefore key to ensuring data quality and producing reliable results.
Data quality
A commonly cited phrase when talking about data and systems is: “Garbage in, garbage out.” It is no secret that the quality of the data fed into software matters. Good data can yield strong results, but inaccurate or incomplete data will consistently lead to unreliable outcomes [1]. With this in mind, all of the ESG solutions for real estate we spoke to acknowledged the importance of data quality and detailed the methods and dimensions their platforms use to maintain it.
In our discussions with the solutions, interviewees emphasized that data quality is safeguarded by standards enforced when clients input data into the system. Information is initially collected through various methods, including manual uploads, APIs, portals and, in some cases, as part of a service. Standards are then applied to ensure data quality, such as validation rules that define acceptable ranges and trigger alerts when values fall outside these limits. Another standard mentioned involves flagging readings that are estimated or auto-filled.
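To make this concrete, here is a minimal sketch of what such a validation rule could look like, written in Python. The MeterReading structure, the metric names, and the acceptable ranges are our own illustrative assumptions, not taken from any interviewed platform.

```python
from dataclasses import dataclass

@dataclass
class MeterReading:
    meter_id: str
    value: float        # e.g. kWh for an electricity meter
    is_estimated: bool  # True when the reading was estimated or auto-filled

# Acceptable ranges per metric (illustrative values, not real thresholds)
ACCEPTABLE_RANGES = {
    "electricity_kwh": (0.0, 10_000.0),
    "water_m3": (0.0, 500.0),
}

def validate(reading: MeterReading, metric: str) -> list[str]:
    """Return data-quality alerts for a single reading."""
    alerts = []
    low, high = ACCEPTABLE_RANGES[metric]
    if not low <= reading.value <= high:
        alerts.append(f"{reading.meter_id}: {reading.value} outside [{low}, {high}]")
    if reading.is_estimated:
        alerts.append(f"{reading.meter_id}: flagged as estimated/auto-filled")
    return alerts

# Usage: a reading far above the acceptable range triggers an alert
print(validate(MeterReading("meter_42", 12_500.0, False), "electricity_kwh"))
```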
Additionally, an emerging trend became clear: solutions are increasingly data-agnostic. Of the twelve solutions interviewed, eleven described themselves as data-agnostic. This approach ensures flexibility, allowing solutions to integrate with a wide range of IoT devices and hardware, communicate with various systems, and compile a more comprehensive data set. It also makes these solutions more competitive, as they can cater to clients who may rely on different systems for their core business operations.
Completeness is, after all, one of the dimensions of data quality [2]. Solutions are therefore adapting to ensure that all necessary data is collected, guaranteeing the optimal functioning of their systems, as one interviewee highlighted: “But if you have a legacy energy management system which is not smart enough, then we can use our open platform. Quickly build a connection and get the data from that old legacy system that may not be strategic to us but that is important. So, we offer a full bouquet of data collection.”
Interestingly, this data-agnostic feature also extends to being hardware-agnostic, enabling the collection of information from hardware previously installed in a building. This may eliminate the need for high CapEx investment in new hardware when adopting a specific solution. As a result, these solutions become more appealing to real estate stakeholders looking for flexible and cost-effective options.
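As an illustration of what a data- and hardware-agnostic architecture can look like in practice, the sketch below defines a common connector interface that any source, from a modern IoT API to a legacy system's CSV export, could implement. The class and method names are hypothetical, not drawn from any specific solution.

```python
from abc import ABC, abstractmethod
import csv

class DataConnector(ABC):
    """Common interface the platform uses for every data source."""

    @abstractmethod
    def fetch_readings(self) -> list[tuple[str, float]]:
        """Return (meter_id, value) pairs, whatever the underlying source."""

class LegacyCsvConnector(DataConnector):
    """Connector for an old on-site system that can only export CSV files."""

    def __init__(self, path: str) -> None:
        self.path = path

    def fetch_readings(self) -> list[tuple[str, float]]:
        with open(self.path, newline="") as f:
            return [(row["meter_id"], float(row["value"]))
                    for row in csv.DictReader(f)]

# Because the platform only ever talks to DataConnector, a smart-meter API,
# a tenant portal upload, or this legacy CSV export are interchangeable.
def ingest(connectors: list[DataConnector]) -> dict[str, float]:
    readings: dict[str, float] = {}
    for connector in connectors:
        readings.update(connector.fetch_readings())
    return readings
```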
Data Enrichment
More than having a complete dataset, achieving data quality also means being consistent and accurate [2]. With that in mind, some solutions have created algorithms to maintain a steady inflow of data in case data uploads are interrupted. One of the solutions features an auto-fill mechanism designed to bridge data gaps in the event of temporary disruptions in data uploads: “A lot of these systems now are over IT networks, and it still happens on site that the IT team might change a firewall rule. So, all of your metering data just doesn't come in for a week, so you might have a gap for a week that you need to fill. In the short term, we can fill that with auto-filled estimated data. So again, in the short term, you can make sure that your visualizations are not interrupted by nasty zeros...” For users, the data-fill mechanism ensures smooth system performance in terms of data visualization and analysis, contributing to the overall user-friendliness of the system.
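A minimal sketch of such an auto-fill mechanism, assuming hourly meter data held in a pandas Series, might look as follows. The time-based interpolation and the estimated flag column are our own assumptions, not the vendor's actual algorithm.

```python
import pandas as pd

def autofill_gaps(readings: pd.Series) -> pd.DataFrame:
    """Fill missing readings with time-based estimates and flag them."""
    filled = readings.interpolate(method="time")  # simple estimate for gaps
    return pd.DataFrame({
        "value": filled,
        "estimated": readings.isna(),  # True where the value was auto-filled
    })

# Usage: a two-day hourly series with a simulated ten-hour outage
idx = pd.date_range("2024-01-01", periods=48, freq="h")
raw = pd.Series(range(48), index=idx, dtype="float64")
raw.iloc[10:20] = None  # e.g. a firewall change stops the metering feed
result = autofill_gaps(raw)
print(int(result["estimated"].sum()), "readings auto-filled")  # -> 10
```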
Aside from filling data gaps, 75% of the interviewed solutions have taken it a step further by enriching data, providing extra information to enhance knowledge for real estate clients. As one interviewee mentioned, their system integrates various types of information to generate new insights for users: “We know how many people are on that site, and what the correlation is between the number of people on-site and the actual energy consumption. So, the data is coming from two different systems. The data comes together, and you have an enrichment layer that is only provided because we can drive the data together from multiple systems.” Data enrichment to gain new insights can be achieved by incorporating open data sources into a client's data set or, as previously mentioned, by combining data from different systems to uncover fresh perspectives.
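The sketch below illustrates the kind of enrichment layer described in the quote: occupancy counts and energy readings from two hypothetical systems are joined on their timestamps so the correlation between people on site and consumption can be derived. All values and column names are invented for illustration.

```python
import pandas as pd

# Hypothetical exports from two separate systems, aligned by timestamp
timestamps = pd.date_range("2024-01-01 06:00", periods=5, freq="3h")
occupancy = pd.DataFrame({"people_on_site": [12, 85, 140, 95, 20]},
                         index=timestamps)
energy = pd.DataFrame({"kwh": [40.0, 210.0, 330.0, 250.0, 70.0]},
                      index=timestamps)

# The "enrichment layer": a combined view neither system holds on its own
enriched = occupancy.join(energy)
correlation = enriched["people_on_site"].corr(enriched["kwh"])
print(f"occupancy/energy correlation: {correlation:.2f}")  # close to 1.0 here
```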
Therefore, real estate clients can benefit from data enrichment by having systems that address data gaps, integrate information from various sources, and incorporate open data to generate new insights. However, the critical question remains: how can real estate professionals ensure that the outcomes derived from this processed data are accurate?
Auditability
When it comes to auditability, all the systems we spoke to mention features that support data auditing. One key feature is time-series data, which stores information at specific points in time. Other features include an audit trail of changes, cross-referencing different types of information to verify that data is correct, and allowing multiple users to review and verify the data. Additionally, some solutions rely on the certification level of the hardware used to collect the data, further enhancing reliability.
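As an illustration of the audit-trail feature, the sketch below keeps an append-only log of every change to a stored value, recording who changed what and when, so earlier values are never silently overwritten. The structure is our own simplification, not any vendor's design.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class AuditEntry:
    timestamp: datetime
    user: str
    meter_id: str
    old_value: Optional[float]  # None when the value is set for the first time
    new_value: float

class AuditedStore:
    """Stores current values plus an append-only history of every change."""

    def __init__(self) -> None:
        self.values: dict[str, float] = {}
        self.trail: list[AuditEntry] = []  # never modified, only appended to

    def set_value(self, user: str, meter_id: str, value: float) -> None:
        self.trail.append(AuditEntry(
            timestamp=datetime.now(timezone.utc),
            user=user,
            meter_id=meter_id,
            old_value=self.values.get(meter_id),
            new_value=value,
        ))
        self.values[meter_id] = value

# Usage: both the original entry and the correction remain reviewable
store = AuditedStore()
store.set_value("analyst_1", "meter_42", 103.5)
store.set_value("analyst_2", "meter_42", 101.0)  # correction, old value kept
print(len(store.trail))  # -> 2
```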
While these features aid in auditability, they are not the only methods for auditing data within a solution. The examples mentioned can be considered the first level of auditability, with features that most, if not all, systems we reviewed possess. However, the solutions that generate reports seem to employ more rigorous data auditing mechanisms. This is where differences between solutions emerge, as the level of auditability can vary significantly from one system to another.
Solutions that report according to a certain standard may have a stricter level of auditability because, as one interviewee explained: “a lot of these companies such as GRESB and companies that are looking to benchmark, or for ISO 50001 you need to prove that your data is auditable, and it's not sort of manipulated.” Therefore, to serve as a tool that enables users to comply with specific standards, data must undergo more rigorous auditing.
This additional audit may mean allowing (external) auditors into a system to perform checks or, as one interviewee put it, having a “data frame of information that our clients can easily download from the platform to submit to their auditors. So, it has been audited by, I think all the big four companies. They all use this template, and this format.”
It is important to mention, as another interviewee clearly stated: “I’m always a little bit careful with people that say our software is compliant. We are the enabler of compliance. ISO 50001 (the standard for energy management systems) is a classic example: organizations receive ISO 50001 because they have an energy management system in place, and then customers sometimes ask if our software is ISO 50001 certified. No, software is not ISO compliant, organizations are. But our system has everything they need to file that compliance, that they have the right energy management software.”
In this respect, the solutions that enable compliance and appear more advanced than their peers achieve this by generating detailed data reports and providing the granularity needed for precise reporting: “… all of the source data has this granular availability in terms of analysis. So again, you can go into each individual data set. You can see time stamp by time stamp what the data is at that period, whether it is an estimated value or just whether it was 90% good reads. We have very quick visualizations about the quality of data. You know whether it is 90% good reads, 95% good reads, but where we have some gaps that have been auto-filled. Again, they are flagged out within the granular data sets that can be presented as part of that audit.”
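To illustrate the kind of granular quality summary the interviewee describes, the sketch below computes the share of “good reads” versus auto-filled values in a dataset. The estimated flag column is the same illustrative convention used in the earlier auto-fill sketch, not a real platform's schema.

```python
import pandas as pd

def quality_summary(dataset: pd.DataFrame) -> str:
    """Summarize a dataset with 'value' and 'estimated' (auto-filled) columns."""
    good_pct = (~dataset["estimated"]).mean() * 100
    filled = int(dataset["estimated"].sum())
    return f"{good_pct:.0f}% good reads, {filled} auto-filled"

# Usage: one of five readings was auto-filled during a short outage
flags = pd.DataFrame({
    "value": [10.0, 11.0, 10.5, 10.7, 11.2],
    "estimated": [False, False, True, False, False],
})
print(quality_summary(flags))  # -> "80% good reads, 1 auto-filled"
```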
Importance of auditability
Data quality, data enrichment, and auditability are all important components for effective ESG solutions in real estate. High-quality data is the foundation for accurate analysis and decision-making, while data enrichment offers new insights that can drive better outcomes. Solutions that prioritize these aspects are better positioned to meet the growing demands of real estate stakeholders.
Additionally, auditability plays a role in ensuring that the data driving these insights is reliable and compliant with industry standards. As the industry continues to evolve, the ability of ESG platforms to adapt and maintain robust data audit trails will be key to their success in supporting real estate clients in achieving their sustainability and compliance goals.
Sources
[1] Wessel, R. (2024, January 15). Data quality: a game changer in real estate. GRESB. https://www.gresb.com/nl-en/data-quality-a-game-changer-in-real-estate/
[2] Gartner. (n.d.). Data Quality: Best Practices for Accurate Insights. Retrieved October 21, 2024, from https://www.gartner.com/en/data-analytics/topics/data-quality
Disclaimer: Please be aware that the information in this article is based on verbal communication with suppliers and has not been independently verified.