On-Demand: KPIs & Analytics Driving Real Estate Company Performance
August 06, 2020
EisnerAmper and REdirect Consulting discussed understanding the power of data and exploring critical solutions and processes for real estate companies.
Lisa Knee:Buildings themselves do not lend themselves to dynamic innovation like e-commerce or software businesses do. The industry is at a turning point and needs to reinvent itself through transformation, technology, and most important, flexibility. All sectors of real estate have been disrupted to a large extent, and using technology and data to provide insight is going to help make informed decisions. No sector will go extinct, but how they shift their focus while transforming their operations to survive and thrive is how we're going to grow into the future.
Today, we're going to discuss key data and analytics to help make those decisions. Thank you both, and thank you everyone for joining us. First I'd like to have Greg Fritsky from EisnerAmper start with introductions. Greg, if you could tell us a little bit about yourself.
Greg Fritsky:Thanks Lisa. Hi everybody. Greg Fritsky here. I'm the director at EisnerAmper focused on digital solutions and process automation. I've been with the firm for over two years, and I come out of Big Four auditing; I've been an accountant, been a consultant, been in the software space for many years. I've focused a lot of effort on helping clients automate and gain insights using data. More recently I've spent a lot of time helping people through these challenging times with digital resilience and leveraging technology to basically make your lives easier. It's a stressful time and my thoughts go out to all of you, but I'm glad we have some time today together. I think you'll find this very insightful. I appreciate you joining, and I'll turn it over to Josh.
Josh Malinoff:Thank you, Greg, and thank you EisnerAmper for the opportunity to partner with REdirect Consulting on this webinar. My name is Josh Malinoff. I'm a principal at REdirect Consulting. REdirect, just in case you're not familiar, is a real estate technology consulting firm. What we do, quite simply, is help real estate companies take advantage of the best technology available by helping identify and evaluate those solutions, and for real estate investment management and property management solutions, we help implement and support them. I have been with the firm for about 14 years and have had the pleasure of working with real estate technology for the past 25 years. I'm excited to speak with you today.
Lisa Knee:Wonderful. Thank you both. We've been speaking to a lot of clients and there's certainly been a lot of challenges out there today. Josh, I'm going to pose the first question to you. What are the overall challenges clients are facing with their data?
Josh Malinoff:Wow, Lisa, big question. There's a lot, right? I guess on a positive note, the technology available to real estate companies now to leverage and use that data is better than it's ever been before. It's easily accessible. We're going to talk about some opportunities to implement it. Where companies seem to struggle is getting the underlying information or data needed to populate those systems. There are challenges across a number of areas.
One of the biggest ones, as I imagine many of you can relate to, is where your data resides. It's probably, in most cases, not all in one place. It's very common to have disparate systems: different databases or systems that you may have implemented or built within your company. Even outside of systems, there's quite a bit of data, such as in spreadsheets or up on social media sites, for example, the reputation of your tenants or your properties, or people talking about these things.
That data is all over, and it needs to be gathered and accessed in a way that can be used in a centralized, normalized, clean format. That's where a lot of companies struggle. That data not only needs to get gathered, normalized and cleaned up, but that's at a point in time. What's going to end up happening is, as soon as you get that done, it needs to be maintained. That's where things like data governance come into play.
There are a lot of challenges in having data outside of systems, but that's probably the easiest one to fix. As many of you know, Excel is certainly a friend of all and has been used for a lot of things by a lot of people. Unfortunately it's not a data source. It's a very easy way for things to get out of sync. Those are just a few, Lisa.
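The consolidation Josh describes, pulling the same records out of disparate sources into one normalized, de-duplicated set, can be sketched in a few lines. The source systems, field names and figures below are invented for illustration, not taken from any real product:

```python
# A minimal sketch of pulling the same lease data from two hypothetical
# sources and normalizing it into one clean record set.
leasing_system = [
    {"TenantName": "Acme Corp ", "Suite": "100", "Rent": "30,000"},
]
finance_sheet = [
    {"tenant": "ACME CORP", "suite": "100", "rent": 30000},
    {"tenant": "Beta LLC", "suite": "200", "rent": 12500},
]

def normalize(name, suite, rent):
    """Canonical form: trimmed upper-case name, string suite, numeric rent."""
    if isinstance(rent, str):
        rent = float(rent.replace(",", ""))
    return {"tenant": name.strip().upper(), "suite": str(suite), "rent": rent}

records = [normalize(r["TenantName"], r["Suite"], r["Rent"]) for r in leasing_system]
records += [normalize(r["tenant"], r["suite"], r["rent"]) for r in finance_sheet]

# De-duplicate on (tenant, suite) so each lease appears once: a single
# source of truth instead of two diverging copies.
master = {(r["tenant"], r["suite"]): r for r in records}
print(sorted(master))  # [('ACME CORP', '100'), ('BETA LLC', '200')]
```

In practice this logic would read from the actual systems and spreadsheets rather than in-memory lists, and the de-duplication key would follow whatever uniquely identifies a lease in your data.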
Greg Fritsky:Just wanted to add to that, and I couldn't agree more. One of the top challenges: I think for those in the New York area, if you watch Fox News at 10 every night, they come on and tell you, "It's now 10:00 PM. Do you know where your children are?" I'm going to change that to, do you know where your data resides? That's probably one of the top challenges people are faced with today: the things we took for granted, like "just give me the report, give me some information, build this for me," are just not good enough anymore.
The information is stale; it's old. We need to be embracing change more quickly. We need to be able to factor things in more quickly and also look for new insights that differentiate ourselves. Everyone has access to the same information, and things have become a lot more aligned, with each organization trying to find ways to leverage technology to differentiate itself and find a competitive advantage. But I would say, and I agree with you Josh, there are so many different systems.
I have one particular client that has 87 different interfaces. 87 different interfaces! Can you imagine all that information being poured in, and ensuring that the data is accurate and complete? Finding a single source of the truth is very difficult. How many people on this call have had situations where they receive one piece of information and somebody else calls with something different from another group, where they have a different system and it's producing a different result? It's more common than we think, and it's a challenge.
There's a lack of what I would call a common taxonomy, which is naming data. If I call this a P&L, what does that mean for each part of the organization? Does it mean the same thing on the investment side as it does on the operational side? How do you read that? It's about finding a common taxonomy and making sure that you identify data and have a common thread in terms of how to report it. The other thing I would say is a challenge is just lack of experience. Let's face it, with these new tools that we hear about, we're all being pushed to learn more. Excel used to be the tool of choice, and it's still a fantastic tool, and we've learned how to use it quite a bit.
What I've seen over the years is how much of that time is really analytics and how much of it is more what I call data transformation, which is, think of all the steps you have to take before you get to a result to analyze. All that process that's required, all that time. Very little time to do the analysis. Then when we're done, what is the data telling me and how do I depict it in a way that's meaningful? That's just gaining experience and learning what the new tools are. I guess the last thing I'll say is that there's sort of this concept that this type of data is the responsibility of IT. It sits in their silo. I go to IT when I need something, I need them to produce the information.
But the reality is that we all have the responsibility for the data, we're all the data stewards, whether that be financial data, operational, marketing, HR. We each have responsibility for our respective areas. That challenge is on all of us. We need to continue to find ways to learn how to govern that and we'll talk a little bit about that in a bit. Those are some of the top challenges, I think. And understand that the tools are at the point now that the businesses can be enabled. These are things that you and I can start to use and leverage on a small scale and then develop those skillsets and get more comfortable with it.
Lisa Knee:Great. Assuming that people have the right systems implemented and in place, people are talking about KPIs, and metrics, and things to monitor. Learning which ones that we should be looking at and what's important to monitor, I think that's probably first and foremost in a lot of people's mind. Josh, I'm going to start with you. Are there certain operational KPIs that people need to be looking at within the real estate sector and making sure that their systems are helping them get that information available?
Josh Malinoff:Yes, absolutely. Before I get into some specific examples, I just want you to keep in mind that this is not a one-size-fits-all concept. Each real estate company is going to need to spend some time thinking about which KPIs will be beneficial to monitor, and which KPIs, if monitored correctly, will give you a competitive advantage. That is, again, just a point in time, because the number of KPIs and the individual KPIs that you monitor should be constantly evaluated: the economy changes, your tenants change, your portfolio changes and your competition changes.
So keep in mind that this is an ongoing concept, but the beauty of it is that once your data is organized, these tools allow you to very quickly add new KPIs and change KPIs on the fly. Greg and I are going to take you through some sample KPIs and talk a little bit about how they're relevant in today's environment. I'm going to start with the operational KPIs. If you own, operate or invest in real estate, you typically have either commercial assets, such as office buildings, industrial, retail and so on, or residential multifamily apartments, as well as some other asset classes of course, but they tend to fall into those two buckets.
Starting with commercial, one of the key things that everyone has top of mind at the moment is occupancy. That of course seems quite obvious, but one of the challenges people have is getting an updated, accurate occupancy. On the commercial side, that's a little less frequent; we'll talk in a moment about residential, where you're going to need to keep an eye on that constantly, daily. Tenant concentration, tenant risk: very top of mind for many of you right now, right? Depending on the mix of tenants you have across your overall real estate portfolio, you may be in a very good position right now or you may be experiencing some challenges.
Certain types of tenants and certain types of assets are absolutely performing better at the moment, and the better understanding you have of how your tenants are concentrated across your portfolio, the better you'll be able to make quick decisions and understand the underlying risks associated with your portfolio. Accounts receivable: again, extremely top of mind at the moment. Being able, on pretty much a daily basis at this point, to have a complete and accurate understanding of your rent collections is absolutely critical.
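To make the occupancy, tenant concentration and collections KPIs concrete, here is a minimal worked example; the rent-roll fields and figures are made up, not drawn from any particular system:

```python
# Illustrative KPI calculations on a small, hypothetical rent roll.
rent_roll = [
    {"tenant": "A", "sq_ft": 10000, "monthly_rent": 30000, "collected": 30000},
    {"tenant": "B", "sq_ft": 25000, "monthly_rent": 70000, "collected": 35000},
    {"tenant": "C", "sq_ft": 5000,  "monthly_rent": 15000, "collected": 15000},
]
total_leasable_sq_ft = 50000  # building total, including vacant space

# Occupancy: leased area as a share of total leasable area.
occupancy = sum(t["sq_ft"] for t in rent_roll) / total_leasable_sq_ft

# Tenant concentration: the largest tenant's share of total billed rent.
total_rent = sum(t["monthly_rent"] for t in rent_roll)
concentration = max(t["monthly_rent"] for t in rent_roll) / total_rent

# Collections: cash received as a share of rent billed.
collection_rate = sum(t["collected"] for t in rent_roll) / total_rent

print(f"occupancy={occupancy:.0%} concentration={concentration:.0%} "
      f"collections={collection_rate:.0%}")
```

The point is not the arithmetic, which is trivial, but that each figure depends on a clean, current rent roll; if the underlying data is stale or split across spreadsheets, every one of these KPIs drifts from reality.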
Valuation on the commercial side is a little bit lagging. Other than acquisitions and dispositions, we haven't really seen the overall impact on valuation of some properties yet, because valuations may take place quarterly or in some cases annually. There are some things you can do to start monitoring that more proactively and in a more timely way. I'll also just mention the retail side, and I would group this as traditional stores, establishments and malls, as well as the hospitality and restaurant space. There are a lot of interesting changes taking place as we speak around how landlords are restructuring leases with those tenants.
As you know, a lot of them at the moment are unfortunately not able to resume operations, or in cases where they are able to resume, they just can't do it at anywhere near the volume and capacity they could prior. What's happening is some landlords are actually restructuring lease agreements midstream from fixed rents to variable rents, and they're trying to figure out: if we're now going to get paid based on sales volume as opposed to a fixed, predictable monthly amount, we'd better have a technology solution to help get that information quickly and on time.
That's another one. Please ask any questions as we're going through this in the Q&A box and we'll do our best to get to them all. On the multifamily residential side, there's a lot more, I would say, time sensitivity to this. The asset class is doing fairly well at the moment, depending on the location and the class of asset. But overall there are some changes happening. There are tenants that are actually not turning over as much, because they're not necessarily going to go chase the next best thing at the moment in this space.
Take a closer look at marketing data, things such as traffic and advertising, and understand where that's happening and where it's changing. Are tenants shopping, or are they not? Are they happy where they are? Are they happy with the amenities? There are some interesting changes happening there that are actually beneficial to landlords, in that tenants aren't chasing the new amenities as much; the gym at the moment may not be the most important thing to people. Occupancy here is a more daily look. Understanding across the portfolio at any point in time whether any movement is happening is critical, so you can begin to adjust your marketing and potentially renegotiate terms proactively with your existing residents.
That leads to turnover. Service requests: there are some interesting things happening there. Landlords traditionally don't care too much; I don't want to generalize, but this tends not to get a ton of attention during the good times. If tenants are unhappy with the service level they're getting from the property, they tend, when things are good, not to worry quite as much. At the moment, though, slipping here could be enough to force turnover, which could be detrimental.
Being able to really look into the facility and tenant service request side of things on a real-time basis is becoming critical. I'll pause there and see if we have any quick questions, and then we'll turn it over to Greg to talk about the financial and predictive analytics as well. Lisa, would you mind if I take a quick question now, or would you like to hold off? We have one on KPIs.
Lisa Knee:Yeah, we can certainly take one now. I think one of the questions that came in was, what type of statistical analysis do you look at in your day-to-day operations?
Josh Malinoff:Right. Maybe just a little clarification on statistical tests. When I think of statistical tests, I'm thinking of KPIs and calculations around day-to-day operations. I wasn't sure if that's exactly what the question relates to, but there are benchmarks, which we haven't yet talked about, that go against these KPIs. That may be relevant to your question in that you have a calculation around concentration or occupancy or whatever it may be, and you want to benchmark it as closely as possible against your peers in the market, your competition, as well as in some cases national standards.
That's the kind of testing that would take place. Then I think there was a question on retail regarding REIT compliance, and yes, that is absolutely a factor. The restructuring of leases could be a whole separate webinar, because there's a lot happening there and we haven't yet fully understood the impact it's going to have on things like retesting and debt covenants, and so on. Perhaps EisnerAmper may comment on this as well. It is quite a hot topic at the moment.
Lisa Knee:Yeah, I agree. I think we could have an entire webinar just on the implications of lease modifications and amendments and what the implications are going to be for both REIT and non-REIT entities. I agree that could be a subject for a great second webinar. Greg, just to continue on KPIs: everyone's always asking, what should I look at? What metrics should I look at from an operational standpoint? But I also think, from a financial standpoint, and actually looking forward to the future, what are the key indicators there that people need to be focusing on in order to make those decisions?
Greg Fritsky:Thank you, Lisa. Very good question, and Josh, thank you for your insight. Many of those operational KPIs that Josh mentioned are definitely going to be part of your data sets, and obviously it's going to be important to draw on that information. But what I want to talk a little bit about is how you take that and translate it into a predictive model, if you will. One of the questions I get quite a bit is, where is all this going? How do I take information that I know today and predict for the unknown? That's the million-dollar question.
I took a look at it more from an investor's mindset. Oftentimes, investors, analysts have great information and data available to them. What do they use to draw the results? They're using historical information and they're using current information. But there's also unknowns. There's algorithms that have to be built. There's models that are used. That's statistical analysis. Taking that to the next level is predictive analytics, which essentially is using techniques like data mining and modeling and now even machine learning.
You hear more and more about AI driving a lot of these decisions, but it's about using those types of information to make certain assumptions and actually build some sort of future model, if you will, using that historical data to basically project out long-term. With regards to data, one of the things to think about is that there's both the known and the unknown. Data is often what we call structured and unstructured. There's information like reports and KPIs and many of the variables that you use today. But there are things that maybe you're not using or leveraging.
There are more technologies today, such as sensors, if you've ever heard of those. More organizations are using them to help measure different things, like usage: how many times is somebody actually accessing a building, or how many times is somebody using a machine? These sensors, through the internet, can give a lot of insight and information. Social media is being harvested all the time now, providing insights into trends and what people are thinking about. A lot of future decisions are based on human behavior, and sometimes, frankly, we just can't forecast what's going to happen.
With COVID more recently, what we have is the human experience that people just aren't going to shop. People aren't going to go into the office. People aren't going to make decisions that they would have made three or six months ago. That's very impactful. We didn't see that coming. However, there are certain things that we could measure, or could have been measuring throughout time, in terms of whether something of this nature could have been part of a model. What if demand shut off, for whatever reason? Whatever act of God could come our way.
That's how people derive these formulas. If you look at this graph, these are some of the metrics that even investors will look at in terms of assessing risk and making decisions on whether the data has value and to what extent. What I would say is that people are looking to drive more value. They're trying to pull in unstructured data and basically come up with predictive analytics of what the future holds. That's the whole data analytics play. It starts out with the structured and the known, and extrapolates into what the future could hold based on certain models.
The question is, does only a financial analyst have those types of tools and capabilities to determine that? No. The reality is those tools actually exist within some of the applications you may use today, like Yardi or some others. They may be in your ERP, or there may be something you need to build that's unique to your circumstance or your use case. I don't know if you're familiar with Power BI or Tableau or some of these other technologies. There are many new ones as well: Alteryx and some other end-user-friendly applications that allow you to manage data better, so you don't necessarily need to have somebody write a sophisticated data program in order to extract the information that you need. Actually, at the firm, we're using some of these tools currently to do some of our own analytics and help us with things like audit and tax planning. I'll pause there.
Lisa Knee:I just want to go back. Josh, I'm not sure if you saw this. Somebody clarified that they wanted to know about tests like chi-square and regression, things like that. I'm not sure if you want to jump in now and respond to that.
Josh Malinoff:Yeah. There are certainly tools available to do those calculations. We probably won't have time at the moment to get into all the different ways of doing statistical tests. But one thing we're going to continue to highlight, and Greg really just touched on it, is the first step, the walk-before-you-run part: getting that data in place, having your arms around it and being able to keep it current and updated. Once you do that, you get into these holy-grail-type areas like predictive analytics, which is amazing because it's looking at the trends and predicting a little bit where things may go based on certain variables.
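The trend-watching described here can start as simply as a least-squares line fitted to historical figures. This sketch uses invented monthly occupancy numbers and plain Python, though tools like Power BI, Tableau or Excel will do the same fit for you:

```python
# A minimal least-squares trend line over monthly occupancy figures.
# The data points are hypothetical, chosen only to show a declining trend.
months = [1, 2, 3, 4, 5, 6]
occupancy = [0.95, 0.94, 0.92, 0.91, 0.90, 0.88]

n = len(months)
mean_x = sum(months) / n
mean_y = sum(occupancy) / n

# slope = covariance(x, y) / variance(x), the standard simple-regression fit
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(months, occupancy)) \
        / sum((x - mean_x) ** 2 for x in months)
intercept = mean_y - slope * mean_x

# Project occupancy three months out (month 9), assuming the trend holds.
projection = intercept + slope * 9
print(f"slope per month: {slope:+.4f}, projected month-9 occupancy: {projection:.2%}")
```

A straight-line projection like this is the crudest form of predictive analytics; the point is that once the data is clean and centralized, even this much gives an early-warning signal, and richer models can be layered on top.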
The tools are so mature at this point, and so much easier to adopt and implement, that they're no longer the barrier. The barrier is your data strategy. I think we're going to get a chance later in this session to talk about some things you could do to get your arms around the data, which will ultimately lead you to be able to do everything from KPIs, statistical testing and covenant testing through to predictive analytics.
Lisa Knee:Yeah. I think that's the perfect segue now, to say: I understand what the challenges are, but how am I going to make these decisions to run my business better? There are things that I should be looking at and monitoring, but now what? Where do I go from here? What are the next steps to make sure that my technology and my infrastructure have what I need, so that I can be looking at these in real time? I think the biggest question clients are asking is, how do I make sure that I have the tools that I need and that I'm using them correctly?
Greg or Josh, I can start with either one of you, is to say, well, where do we go from here and how do we advise and help people to make sure that they have everything in place?
Greg Fritsky:Yeah, I can start. I would say that the beginning is always some level of assessment, right? I started by asking, where does your data reside? If you don't know, do some level of audit, a data audit, to understand where all your information assets sit. We'll get into controls in a little bit, but one of the things I always like to say is that we have a lot of controls in place around cash. We always say cash is king, and we're always focused on who has the cash.
Well, think of data as the new currency. You've got to get a handle on where it sits. What is the value I'm trying to derive from it? What are the outcomes? Part of that assessment is understanding what it is I want to build and what outcomes I want to achieve. How much information do I have, where does my information reside, and how can I build meaningful data sets? Oftentimes people see that as a very large task, but the reality is you can start on a small scale.
I often say to do almost a proof of concept, as you would with any technology-type initiative, defining what it is that you want to achieve. If it's some new model that's going to provide you with certain metrics around the next three months of forecasting your tenancy rates, or what your lease options are, or what your pricing options are, start with that and understand what data elements go into it. Then, obviously, work with folks that are of a data mindset to help define that and start to build some sort of outcome. Oftentimes we jump to what we want to see. We want the outcome, but we have to understand what's going into it. Some level of assessment is absolutely necessary. I don't know, Josh, if you want to add to that?
Josh Malinoff:Yeah, that's absolutely great. I would say it's a one-time process that most of you may need to go through to get your arms around everything and put forth a plan to get to these next steps you see up on the slide around implementation, and then tools, controls and benchmarking. That assessment initially tends to be a little more time consuming: understanding where all the data sources are and who owns the data, because you don't want to have two owners of the same data with two different spreadsheets that have different numbers or different information on them. You've got to get all of that to a single source. You have to see what you can consolidate and aggregate into maybe a database, so that it's normalized and kept clean.
You have to have some data governance around it. Then you've got this all in place and you can do all the great things around KPIs, analytics and so on. Then, as I mentioned in the KPI discussion, it's an evolving concept. This assessment, after you get it all under control, becomes a lot easier, because now you're just keeping your ears and eyes out there, talking to experts and so on about what's important to keep an eye on, and you have all the data already in place to be able to quickly make changes to your analytics and KPIs. I just want to highlight that it's an ongoing thing, but it's a little bit of upfront work. Then from there it becomes a lot less painful to maintain.
Lisa Knee:I want to jump in really quickly, sticking with the assessment topic. Sorry to throw you guys off a little bit. How often should someone make these assessments of their current system? If someone comes in and says, well, I had it done three years ago or five years ago, how current does this assessment need to be?
Greg Fritsky:Just speaking from experience, I would say, if you can do it quarterly, great. More likely, realistically, you should be going back and evaluating your plans at least every six months. Things are changing so quickly. I'm going through an evaluation right now for a client. We're all going into budget season soon, and it's always a good idea around this time of year to have some sense of what you might want to do strategically next year. Then I would say things change all the time. If there's a major change to your business, which we've all been through in the last three months, that's probably as good a time as any to evaluate. I would say as needed, but at a minimum probably every six months.
Josh Malinoff:I would answer it a little differently, Lisa. I think you might've been asking more about the tools, and that part doesn't need to change as frequently. If you go through a process of evaluating and selecting the right tools to be able to report on and analyze this information, you can typically hold on to those for several years. The other part of assessment we've been talking about is where the data resides, what data to track, and what KPIs to monitor and calculate. That, I agree with Greg, is something that should probably be looked at quarterly.
It doesn't mean you're going to wholesale change everything, but hey, four or five months ago we were in a very different place. If we didn't pay attention to this and just kept monitoring the same things in our current environment, that wouldn't necessarily make the most sense or give you the best results. It's environmental changes, changes in your investment strategy, changes in your investors. I would say there are a number of triggers out there that would cause you to take immediate action and review, and beyond that, a quarterly or biannual data assessment process would be recommended.
Greg Fritsky:I agree with that.
Lisa Knee: I just want to continue along the path. The next step, I guess, would be to look at your infrastructure and tools, as you were alluding to there as well, Josh, correct?
Josh Malinoff:Absolutely. Here's the big good news; I've mentioned it a couple of times before. This is not the hard part. The whole proptech world, and you've probably heard that word now more than you cared to, has exploded. There are more choices of technology solutions available to you that can often be plugged in very quickly without a big implementation, without having to program anything, and without even necessarily involving IT. What we typically recommend is that most real estate firms, whether they're owners, operators, managers, or investment firms, adopt a real estate ERP system.
In North America, the big ones are Yardi, MRI and RealPage. We work with all of those. The good news is most of those, all three of those, for example, have built-in data analysis and analytics tools, and are starting to incorporate predictive analytics right into the product. The reason that's a good first step is your data is already being posted in one place by those providers. You can immediately get access to it. You don't have to worry about integrating an external system or maintaining that integration. That being said, there are some amazing point solutions out there that can interface and connect to your ERP system. Of all these items, I think that's the easiest to work through, because there are quite a few great choices for you.
Greg Fritsky:Just to add to that, Josh, you're correct. Many of these ERP-type packages, beyond Yardi, are all incorporating them, but it boils down to the outcome. What is the information? Is that information useful to your specific use case, and do you understand what your requirements are? There are point solutions and third-party solutions, but it boils down to getting experience with these technologies, getting training, and ensuring that somebody on your team is focused on analytics as a discipline. That's something that just can't be overemphasized.
One question I get all the time is, what's the skill set my team needs to be able to handle this and manage these tools? The reality is that you don't have to go out and spend a lot of money on a data scientist. You probably have individuals in your organization today that have some tech savviness, if you will. Folks that are really good with macros and maybe some Visual Basic or just writing.
Lisa Knee:I want to jump back in there and ask you another quick question. People talk a lot about hidden data and how to uncover it. How do you know if you have this hidden data, and what is generally there that can be helpful for people in making their assessments and understanding how to run their businesses better, if the data's hidden? Who's going to tell someone, and how do we unlock this? We hear this term a lot, but for those of us who are just looking at reporting packages that are given to us, we're hoping that everything is in there. If there's something hidden... and I don't want to name any of the software providers, because we don't want to go there.
Josh Malinoff:Absolutely. We do a lot of software implementations for companies. One of the things that happens in almost every case is we bring everybody together from each team within the business: leasing, finance, construction, investor reporting. We all sit around a table. We put together what's called a core team, a project team. They all work for the same company. We go and start interviewing the team, and there are all these aha moments where person one did not know that person two was also doing a bunch of work on the exact same thing.
Other aha moments are around, oh wow, we didn't realize you were capturing that. We need that information and thought it would be a big deal to get, and it was sitting right across the hall, back when we used to go into office buildings. So part of it is really just the common practice of getting cross-functional teams together, understanding what they're doing on a regular basis and what data they own and maintain. And as much as Excel is a fantastic reporting tool, you can't use it as a database.
It's perfectly okay to take a report out of the system and format it for presentation, with fonts and graphics and pie charts and so on, but you don't want to change the data. We're finding too many organizations are using it in that fashion, and those are the hardest data silos, Lisa, the hidden data. It's in somebody's Excel spreadsheet, it's in their email box, it's in those types of places. Some of it's behavioral: putting together strategies to retire those Excel models and migrate them into systems with underlying databases that have good visibility and standards around keeping that data. Looks like Greg might've returned. Can you hear us okay, Greg?
Greg Fritsky:Yeah, I'm back. I'm one of those folks who was offline from the tropical storm for several days, and it's been sporadic since the power came back. My apologies, and hopefully nobody else is suffering through the effects, but it knocked me out for a little bit. I'm back.
Lisa Knee:I agree. We were just talking about unlocking hidden data. I know that I am definitely a guilty party to some of this hidden data Josh was referring to, housing information in Excel that doesn't need to be there instead of keeping it in the actual software and other programs so that it's shared data. Just to keep moving along on where we go from here and how we help move our clients along: what we've been hearing a lot is that there's a problem with turnover and infrastructure, or with having somebody hold all that institutional knowledge.
We always hear that someone has all the institutional knowledge in their head, right? It's understanding who keeps the information and how that information is shared among the group. Internal controls really matter, especially today, when people are working remotely and can't communicate the way they used to. It's making sure there either isn't too much turnover or too little, finding that happy medium where the internal control function still exists. How do we prepare for that and make sure the systems and processes are protected? Greg, I don't know if you want to start with that.
Greg Fritsky:Sure. It's a topic near and dear to me. Data governance is key. Once we have accurate and clean data, how do we keep it clean, how do we manage it, and how do we keep people focused on it? What I always recommend is assigning somebody the responsibility of data governance. It could be a consortium of several folks: folks who represent the business, folks who represent IT obviously, and I would say folks in risk and controls, always making sure that good controls are put in place. Again, it gets back to safeguarding data as an asset. Information can sit on your laptop, it can sit on a server, it can be what they call on premise, or it can be in the cloud.
But also think about whether you're using service providers. Service providers might have access to your clients' data. They may be doing certain functions for you, but they have access to that information, and then ultimately they may subsource it to someone else. Just because you have access to information doesn't mean you know where it's coming from; it could be coming from anywhere. First and most importantly, make sure those information assets are protected and safeguarded. Secondly, have an individual who makes sure the policies and procedures of your company are adhered to, and make sure that at the end of the day you test, you validate this information, you have safeguards, and you have a program in place.
Be aware of hacking. People could actually come in and corrupt your data, so have some sort of cyber monitoring protocols. Back to what Josh had mentioned, I do agree: you may do an assessment infrequently, but you should always be monitoring your data assets. That should be an ongoing activity, and you can put certain safeguard measures in place. That's one of the first and foremost things to think about. One other thing I'll mention is the concept of master data management, which is: how do I make sure that what comes in, I mentioned the 87 interfaces, matches the output?
Did I receive everything? Is it accurate? Are the files safeguarded? Some level of master data management needs to be in place to make sure that information remains safe, because only once we know the information is clean can we rely upon it. Then we can do all the wonderful things with the models and the predictive analytics and what have you. But I've seen so many clients come in wanting the outcome and the analytics, when their challenge is just getting their arms around the data and getting it harmonized, normalized, and validated.
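The master data management check Greg describes, confirming that what came in across the interfaces matches what was loaded, can be sketched as a simple reconciliation. This is an illustrative sketch only; the `record_id` field and in-memory rows are hypothetical stand-ins for real interface extracts:

```python
# Minimal master-data reconciliation sketch: confirm that the records
# received from an interface match what was actually loaded downstream.
# "record_id" is a hypothetical key field; in practice the rows would
# come from CSV extracts of the sending and receiving systems.
from collections import Counter

def count_keys(rows, key_field="record_id"):
    """Count occurrences of each record key in an extract."""
    return Counter(row[key_field] for row in rows)

def reconcile(inbound_rows, loaded_rows, key_field="record_id"):
    """Return (missing, extra): keys received but never loaded,
    and keys loaded but never received."""
    inbound = count_keys(inbound_rows, key_field)
    loaded = count_keys(loaded_rows, key_field)
    return inbound - loaded, loaded - inbound
```

Anything in `missing` or `extra` becomes an exception for someone to investigate, which is the "did I receive everything?" question answered mechanically.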
That's a lot of the blocking and tackling that has to take place. One final note, I know I keep saying one final note, but I'm going to mention robotic process automation, bots. There's a lot that people are doing with bots to validate this information. A bot can work 24/7. You can put in a bot to do a lot of this nasty data management, manage those data assets, and help with some of that on a continuous basis. It's definitely something to consider, and a great starting point as you go down that data journey.
Josh Malinoff:Yeah, Greg, I'm glad you brought that up. RPA, robotic process automation, is a newer term as it relates to the real estate industry, and a lot of people are still having a little trouble understanding what it means for them. One of the things about RPA and these bots is that they're very granular, task-oriented software robots. When you think about implementing technology, sometimes it's scary because you think of it as time-consuming, disruptive, and expensive. This concept Greg mentioned, RPA, you can do on a very small scale, very quickly, at low cost, with high return on investment.
Think about any manual process happening in your organization today, such as somebody logging into a website to look for market data on, let's say, a residential market. They could tell the robot to do that instead, and the robot would actually type it into the database, into the ERP system, so that the data is accurate and updated in a timely way. I'll just briefly mention, to piggyback a little on what Greg mentioned, keeping the data clean and audited.
There are tools available to help with that: auditing automation, which goes out and looks for things like variances period over period, or outliers in the data, and highlights those for you. It brings those exceptions right to you to take action on. That can help tremendously, because while people can curate all this very carefully, nothing beats an exception report for identifying these issues.
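The period-over-period variance check Josh describes can be sketched in a few lines. The account names and the 10% threshold below are illustrative assumptions, not any particular product's defaults:

```python
# Sketch of a period-over-period variance exception report: flag any
# line item whose relative change exceeds a threshold, so a person
# reviews only the exceptions rather than every number.

def variance_exceptions(prior, current, threshold=0.10):
    """Compare two {account: amount} dicts; return accounts whose
    relative change exceeds the threshold, with the change ratio."""
    exceptions = {}
    for account, prev in prior.items():
        curr = current.get(account, 0.0)
        if prev == 0:
            continue  # avoid divide-by-zero; treat new accounts separately
        change = (curr - prev) / abs(prev)
        if abs(change) > threshold:
            exceptions[account] = round(change, 4)
    return exceptions
```

A 50% jump in utilities would be flagged while a 5% rent change passes quietly, which is exactly the triage an exception report is for.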
Greg Fritsky:One other note around RPA, and this is a recent example of using it for analytics purposes. I have a retail client, and one of the challenges they have is evaluating price changes across their competitors. It's a very long, cumbersome process. It takes a lot of time, but it's very much hands-on doing: extracting data, pulling down data from different sites, putting it into Excel, working through the model, and using all of that to decide what they ultimately want to do with their pricing.
It's a different industry, but the same type of business problem. We recently built a bot that essentially does the work, 160 hours' worth of effort. It now runs on its own every Monday morning and they have the information. They pick up a report that literally took the whole team a day or two to build, and they have the result on an ongoing basis. That's the idea of using bots to do some of that heavy lifting so you can get to the results faster. Because if you're stuck in the details and the doing, you can never really use the data for what you needed it for, so it's definitely something to consider.
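The core of a pricing-comparison bot like Greg's can be sketched very simply. A real bot would log into competitor websites; here each "source" is just a callable returning prices, an assumption made so the sketch stays self-contained, and the item names are hypothetical:

```python
# Stripped-down sketch of a pricing-comparison bot: gather competitor
# quotes from several sources and produce one summary row per item,
# showing our price, the lowest competitor price, and the gap.

def build_price_report(our_prices, competitor_sources):
    """our_prices: {item: price}. competitor_sources: callables that
    each return {item: price} (standing in for scraped websites)."""
    report = []
    for item, ours in sorted(our_prices.items()):
        quotes = [src().get(item) for src in competitor_sources]
        quotes = [q for q in quotes if q is not None]
        lowest = min(quotes) if quotes else None
        gap = (ours - lowest) if lowest is not None else None
        report.append({"item": item, "ours": ours,
                       "lowest_competitor": lowest, "gap": gap})
    return report
```

Scheduled to run every Monday morning, a loop like this replaces the day or two of manual pull-and-paste work Greg mentions.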
Lisa Knee:That would be similar to a real-time rent comp, right? Someone would be able to pull that information in real time.
Josh Malinoff:That was one of our first RPA projects, that exact use case you described, where they had somebody logging into each of the apartment properties' websites, getting leasing, marketing, and pricing information and keying it into a system. That got automated with a bot. This topic, by the way, is one both EisnerAmper and REdirect can assist with. We happen to have a webinar the week after next on that exact topic if anyone's interested, and I know EisnerAmper has services as well, so I definitely encourage you to learn more about that.
Lisa Knee:It's certainly an interesting topic, because we all talk about how it saves time. Let's roll into our last topic, monitoring and benchmarking. Automation is certainly a way to help with monitoring and benchmarking, extracting data so that people can evaluate and do their assessments in real time.
Greg Fritsky:Yeah, I'll comment on that real quick. Data stewardship again. This is where you find the individuals who represent different parts of the business, who understand what the data sets are telling them, and who can leverage the tools on an ongoing basis. It's everybody's responsibility to be monitoring and using these capabilities. In terms of benchmarking, look at the historical information you've already gathered and where you can potentially leverage it to devise new strategies that will differentiate you. All of you have things that are proprietary, and ultimately the question is how you can build a model that will work to your benefit in the long term.
Lisa Knee:Great. I want to make sure we leave time for some of the questions we have. The first one is: in the real estate world, we often say we have our own language. One of the questions asks, is there a coding language specifically for real estate? Josh or Greg, do you want to address that?
Josh Malinoff:Right. If you don't mind, Greg, I'll take that one. The good news here, again, is that back in the day, not even five or ten years ago, you had to build these proprietary systems and do some programming; you might have had to have a pretty robust IT team within your real estate company. The mainstream applications now have the capability to do calculations using, typically, Microsoft SQL Server data analysis tools or a data warehouse language. There are different technologies around that.
The traditional coding approach, where you have a developer in house or outsource that to somebody, is typically no longer necessary for real estate KPIs and analytics. The tools you can buy off the shelf have the flexibility for you to do it yourself.
Lisa Knee:We did talk a little bit about timeliness, and some of these questions are really great, so thank you guys for typing them in. We talk about KPIs and monitoring them, but they're often stale. Making decisions on tenant collections data from April when we're in July, in today's environment, isn't really helpful. How do we keep that information current, especially to make decisions?
Greg Fritsky: I can speak from a benchmarking perspective. What I would say, again, is to get back to the predictive model. There's a lot of information on the web, and there are a lot of government agencies that track certain metrics. There's an economics site I've used in the past; I'll have to post it later. We've done regression analysis, especially in the financial services space, where you might be modeling something over time.
Regression analysis, using information from a website that you can pin and align your own data to, can often point you directly to an area where you can get more updated information. Oftentimes in real estate there's the micro and the macro. From the macro perspective, there are obviously a lot of government-type forecasts: things around interest rates, consumer demand. Even around real estate specifically, there are certain metrics that can be leveraged.
I would certainly look at some of those models and then ultimately see how we could build a model to leverage them: taking the information you have today and projecting against that benchmark. I've seen a number of companies use that approach to predict the future a bit, given that the information they have might be slightly dated.
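The projection approach Greg describes, pinning your metric to a public benchmark series and extrapolating from the benchmark's latest reading, can be sketched with ordinary least squares. The numbers and series here are illustrative only, not real market data:

```python
# Sketch of benchmark-based projection: fit a line relating your
# (possibly stale) metric to a public benchmark series, then estimate
# the current metric from the benchmark's most recent value.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def project(benchmark_history, metric_history, benchmark_now):
    """Estimate the current metric from the benchmark's latest value."""
    a, b = fit_line(benchmark_history, metric_history)
    return a + b * benchmark_now
```

If your collections data stops in April but the benchmark is published monthly, the fitted relationship gives a rough July estimate until the real number arrives.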
Josh Malinoff:Greg, I agree. Going back to Lisa's question around data being stale, it's a very common issue. People work hard to get the data in one place so it can be used for KPIs, analytics, and other calculations and reporting, and then for some organizations it's a huge effort to refresh it. That manual process of getting data aggregated and cleaned so it can be reported on is really the key thing you need to focus on eliminating. We look at that in three different layers; that's probably the best way to put it.
First and foremost, if there's an opportunity, get it right into the source system, into the ERP system. Most ERP systems have a lot of flexibility to extend what data points get captured. That's always the best option because everything lives in one consistent database along with all the other data points.
The second option is to integrate an external data source with that reporting tool or KPI tool, which usually sits on an underlying database. Most of these modern systems have what are called APIs, the ability to electronically send and synchronize information. There's also a concept called ETL, which is extract, transform, and load, because most of the time the data is not consistent between the two systems. Maybe the account number is different or the property code is different. These tools have the ability to transform the data as they bring it in.
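The transform step Josh mentions, translating one system's codes into another's, is the heart of ETL. A minimal sketch, with made-up account and property codes:

```python
# Minimal ETL sketch: extract rows from a source extract, transform
# source-system codes to the target system's codes via mapping tables,
# and collect the normalized rows for loading. All codes shown are
# hypothetical examples.

def etl(rows, account_map, property_map):
    """Translate account and property codes; skip unmapped rows
    (in practice those would land in an exception queue)."""
    loaded = []
    for row in rows:
        acct = account_map.get(row["acct"])
        prop = property_map.get(row["prop"])
        if acct is None or prop is None:
            continue  # no known mapping: do not load bad data
        loaded.append({"acct": acct, "prop": prop,
                       "amount": row["amount"]})
    return loaded
```

The mapping tables are exactly the "account number is different, property code is different" problem made explicit, which is why they deserve governance of their own.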
The third option, which is becoming quite popular in the real estate space, is what Greg and I were talking about a moment ago: just assign a bot to do it for you. It's a shortcut, because if you think about a rent roll, for example, there's a lot of underlying information needed to calculate and display a rent roll for a property, but all you really want is the rent roll itself. Rather than trying to integrate all the underlying units and leases and suites and so on, why not just grab that information from a report and get it into the system? That's where bots, RPA, really can help you. Those are the three layers, in priority order. The common theme is having them connected so the data isn't sitting in silos or out of sync.
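The "grab the report" shortcut amounts to parsing an already-calculated report export instead of integrating the underlying lease data. The column layout below is a made-up example of such a rent-roll export:

```python
# Sketch of the report-grabbing shortcut: a bot pulls the rent roll
# report (here, a CSV export) and normalizes just those summary rows
# for loading, rather than integrating every unit, lease, and suite.
# The "Unit" / "Tenant" / "Monthly Rent" columns are hypothetical.
import csv
import io

def parse_rent_roll(report_csv_text):
    """Turn a rent roll report export into normalized records."""
    records = []
    for row in csv.DictReader(io.StringIO(report_csv_text)):
        records.append({
            "unit": row["Unit"],
            "tenant": row["Tenant"],
            "monthly_rent": float(row["Monthly Rent"]),
        })
    return records
```

The trade-off is the one Josh implies: you get the finished numbers cheaply, but you depend on the report's format staying stable.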
Lisa Knee:Well, I'm looking at the clock and I see we're running out of time, but both of you have definitely given people a lot of action points for making sure their systems give them the right information so they can grow and run their businesses efficiently and effectively. I would like to thank you all for participating today.
Greg Fritsky:You're welcome.
Josh Malinoff:Thank you Lisa. Thank you everyone.
Greg Fritsky:Thanks for having us.
Lisa Knee:Wonderful. Again, we wish everyone safety first and foremost and health. I'm going to hand this over to Lexi. Thank you all for joining us today.
Greg Fritsky:Thank you.