By Maribeth Vander Weele

Beyond Subrecipient Monitoring: Using Data Analytics to Protect Grant Funds

When grant expenditures amount to billions of dollars, conducting subrecipient monitoring to detect non-compliance may not be enough. Enter the advanced oversight techniques of data analytics.

Data analytics provides important benefits:

  • It’s efficient. With data analytics, detecting red flags of fraud does not depend on receiving outside tips or the discovery of compromised transactions during sampling by Monitors. Monitoring can identify fraud, but that’s not its key purpose. Among other objectives, monitoring is designed to ensure that a program has internal controls in place and that the grant objectives are being achieved. Data analytics identifies red flags by slicing information in a different way and from a different perspective.

  • It’s risk-based. Data analytics allows the oversight team to focus resources on indicators of fraud, such as payroll checks made out to employees for more than 24 hours per day of work, duplication of benefits (such as when state employees on paid leave receive unemployment payments), payments to “ghost vendors”, disappearing equipment, and more.

  • It identifies trends. Data analytics provides a bird’s eye view of your program, lending insight into systemic problems.

  • It’s multidisciplinary. Creating a robust data analytics program requires more than just the information technology know-how to build algorithms. It also requires investigative skills, which guide the oversight team in designing analytics that ask the right questions. Program experience is another vital—and often overlooked—competency for building strong data analytics, especially during the brainstorming phase. Once the right questions are being asked, compliance officers can review areas of concern.

Where to Begin


At the Vander Weele Group, our data analytics framework is built on these fundamental questions:

  1. What assets are at risk?

  2. What risks threaten those assets?

  3. How are those assets at risk?

  4. What data speaks to the risks?

What assets are at risk?

The classic risk analysis methodology begins with a basic question: What assets are at risk? Said differently, what are we trying to protect?


In any federal or state grant-funded program, taxpayer dollars are a key asset, as are equipment, supplies, information and, of course, people.


What risks threaten those assets?

If taxpayer dollars are a key asset, then theft of those funds is an obvious risk.


Misappropriation can take many forms: embezzlement, payment to ineligible organizations or individuals, payments to phony companies, or overpayments to companies associated with political officials or government employees, to name a few examples.


Related to misappropriation of tax dollars is the theft of equipment and supplies purchased with grant funding. Generators, laptops, or furniture purchased with tax dollars can disappear from warehouses and inventories before they reach the intended user. Theft of information is also an important risk to consider.


Even more critical is the threat of program failure, not only in terms of compliance, but also performance. The Uniform Guidance specifically emphasizes the need for safeguards against the “risk that subaward performance goals are not being achieved” (2 CFR § 200.332(d)).

What does this look like?


• A road funded with grant dollars doesn’t get built.

• A grant-funded computer software application never materializes.

• School-to-work training equipment doesn’t function properly because quality materials were substituted with inferior ones.

• A small business or daycare center remains closed through the funded period despite pledging to do otherwise.


Categorizing types of risk and rating each based on importance—typically determined by multiplying the projected probability of a specific incident by its level of potential impact—is one way to prioritize these threats. Impact can be measured in terms of the dollar cost of assets that might be compromised. However, in worst-case scenarios—such as when grant-funded humanitarian aid does not reach its destination, or a program designed to combat child-trafficking fails at its objectives—human lives can be at risk.
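As a minimal illustration of that probability-times-impact scoring, here is a short sketch; the risk categories and the 1-to-5 scales are hypothetical examples, not a prescribed standard:

```python
# Risk-scoring sketch: score = projected probability x potential impact.
# The categories and 1-5 scales below are illustrative assumptions only.
risks = [
    {"risk": "Embezzlement of grant funds", "probability": 2, "impact": 5},
    {"risk": "Theft of grant-funded equipment", "probability": 3, "impact": 3},
    {"risk": "Subaward performance goals not achieved", "probability": 4, "impact": 5},
]

for r in risks:
    r["score"] = r["probability"] * r["impact"]

# Rank the highest-scoring risks first so oversight resources go where they matter most.
for r in sorted(risks, key=lambda x: x["score"], reverse=True):
    print(f'{r["risk"]}: {r["score"]}')
```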


How are the assets at risk?

After identifying assets at risk, it’s time to brainstorm how a breach could happen. What could go wrong? During this process, think like a thief. If grant-funded prescription medicines are your target, how would you divert them? If you wanted to benefit Uncle Eddy and receive a kickback for doing so, what scheme would you concoct?

An important part of brainstorming is reviewing what incidents have occurred in the program’s past or in the administration of similar programs. Look up old headlines. Talk to local investigators and program officers. What are they seeing? Are they hearing that grant-funded equipment is being dumped on the black market? Are rumors circling about a worker’s compensation scheme perpetrated by grant-funded employees?


What data speaks to the risks?

After you’ve exercised your creativity coming up with disaster scenarios, it’s time to come back down to earth and look at hard data. Specifically, you’ll need to determine what types of data are available to you and how they can be used to assess potential noncompliance, waste, fraud, or abuse. The answers will be highly dependent on the unique nature of each program you’re overseeing.


Let’s say you’ve heard of a worker’s compensation fraud ring among grant-funded employees. Try to locate data reporting how many people have taken leave under worker’s comp, how that number compares to previous years, what locations have the greatest number of worker’s comp claims, what shifts injuries are occurring on, and whether the recipients of worker’s comp benefits are using the same doctor.
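As a rough sketch of what those comparisons might look like in practice, assuming the claims data can be exported to a table with columns such as year, location, shift, and treating_physician (all hypothetical field names):

```python
import pandas as pd

# Hypothetical export of worker's comp claims; file and column names are assumptions.
claims = pd.read_csv("workers_comp_claims.csv")

# How many claims per year, and is the current year out of line with prior years?
print(claims.groupby("year").size())

# Which locations and shifts generate the most claims?
print(claims.groupby(["location", "shift"]).size().sort_values(ascending=False).head(10))

# Are many claimants using the same treating physician?
print(claims["treating_physician"].value_counts().head(10))
```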


Perhaps your risk assessment determined that products and equipment purchased with grant funds are at risk. You’re concerned about substitution of materials—whole milk for kids being substituted with skim milk, but at whole milk’s higher prices; inferior computers with knock-off parts being delivered instead of the high-end electronics the grant paid for; or vendors consistently delivering low-quality services or products.


In this case, begin by obtaining a complete data set related to product or equipment purchases. This should be as detailed as possible—make sure you have a record of invoice and PO numbers, transaction dates, copies of receipts, names of the people who approved each step of the purchase process, etc.


Create the Analyses


Now comes the fun part: building the analyses.

There are thousands of possible algorithms you can use to analyze your data. Depending on the type of information available to you, here are some indicators you may want to look for; brief sketches of how a few of them might be computed follow the lists below:


To identify black market diversion or product substitution:


Products or equipment which are:

  • Most frequently returned (frequent returns may indicate inferior quality) or with an unrealistic lack of claims/returns (which may indicate that return data is being falsified).

  • Most frequently overlooked for required inspections.

  • Most frequently marked with “quantity less than on receipt document”.

  • Most frequently marked with “quantity less than requested”.

  • Missing serial numbers.

  • Vendors or recipients associated with any products that meet the above criteria.

Vendors who:

  • Deliver the greatest number of products with an expired shelf life.

  • Have the greatest number of items marked as “unacceptable substitutes.”

  • Provide the products least likely to undergo mandated inspections.

Recipients (units, divisions, or individuals) who:

  • Have a high number (or unrealistically low number) of returns.

  • Most frequently fail to conduct required inspections of products or equipment.

  • Receive the products least likely to undergo mandated inspections.
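A minimal sketch of how a few of these product and vendor indicators might be pulled from transaction data, assuming a receiving/returns table with hypothetical columns such as product_id, vendor, returned, inspection_completed, and quantity_short (each coded 0/1 per receipt):

```python
import pandas as pd

# Hypothetical receiving/returns data; file and column names are assumptions.
receipts = pd.read_csv("grant_receipts.csv")

by_product = receipts.groupby("product_id").agg(
    return_rate=("returned", "mean"),   # unusually high, or suspiciously zero
    missed_inspection_rate=("inspection_completed", lambda s: 1 - s.mean()),
    short_quantity_rate=("quantity_short", "mean"),
)

# Products most often returned or shorted, and products never returned at all.
print(by_product.sort_values("return_rate", ascending=False).head(10))
print(by_product[by_product["return_rate"] == 0].head(10))

# The same cuts can be run by vendor or by receiving unit.
by_vendor = receipts.groupby("vendor")["inspection_completed"].mean().sort_values()
print(by_vendor.head(10))  # vendors whose deliveries are least likely to be inspected
```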

To identify potential conflicts of interest:

  • Instances in which a government agency or subrecipient agency employee and a vendor share the same address, cell phone number, home phone number, fax number, email address, or bank account number.

  • Instances in which a reference or an emergency contact for an employee is also a vendor.

  • Instances in which the health or life insurance beneficiary of an employee is also a vendor.
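One way to run these shared-identifier checks, assuming employee and vendor extracts that both carry address, phone, and bank_account columns (hypothetical names) already normalized to a common format:

```python
import pandas as pd

# Hypothetical, already-normalized extracts; file and column names are assumptions.
employees = pd.read_csv("employees.csv")   # includes name, address, phone, bank_account
vendors = pd.read_csv("vendors.csv")       # includes vendor_name, address, phone, bank_account

# Employees and vendors sharing the same address, phone number, or bank account.
for field in ["address", "phone", "bank_account"]:
    matches = employees.merge(vendors, on=field, suffixes=("_employee", "_vendor"))
    if not matches.empty:
        print(f"Potential conflicts on {field}:")
        print(matches[["name", "vendor_name", field]])
```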

To look for kickbacks and favoritism:

  • Analyze products that have the greatest percentage of increases in unit prices and the greatest increases in volume purchased.

  • Identify duplicates or overbilling.
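A sketch of that price-and-volume cut, assuming purchase history with hypothetical product_id, fiscal_year, unit_price, and quantity columns:

```python
import pandas as pd

purchases = pd.read_csv("purchases.csv")   # hypothetical extract; column names are assumptions

yearly = purchases.groupby(["product_id", "fiscal_year"]).agg(
    avg_unit_price=("unit_price", "mean"),
    total_quantity=("quantity", "sum"),
).reset_index().sort_values(["product_id", "fiscal_year"])

# Year-over-year percentage change in unit price and volume for each product.
yearly["price_change_pct"] = yearly.groupby("product_id")["avg_unit_price"].pct_change() * 100
yearly["volume_change_pct"] = yearly.groupby("product_id")["total_quantity"].pct_change() * 100

# Products with the steepest simultaneous increases in price and volume merit a closer look.
print(yearly.sort_values("price_change_pct", ascending=False).head(10))
```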

To test for payroll fraud:

  • Identify employees with significant amounts of missing information in the Human Resources database, e.g., emergency contact numbers, ethics disclosure statements, training records, background checks, employee evaluations, W-9s, health insurance elections, deductions, and so on. Such omissions can indicate “ghost” employees.

  • Identify paychecks issued on off days, weekends, or holidays, if payment on those days is not a usual practice.

  • Compare terminated employees, employees on workers compensation rolls, or employees receiving pension payouts with the payroll list.

  • Obtain lists of employees and overtime pay and investigate the circumstances around those with the highest amounts of overtime pay.

  • Investigate individuals being paid for more than 24 hours a day.

  • Look for duplicate paychecks.

  • Identify the number of checks per employee issued in the same year and look for outliers.
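A few of these payroll tests, sketched against a hypothetical combined timesheet and paycheck extract with employee_id, work_date, hours, check_number, and amount columns:

```python
import pandas as pd

payroll = pd.read_csv("payroll.csv")   # hypothetical extract; column names are assumptions
payroll["work_date"] = pd.to_datetime(payroll["work_date"])

# Employees paid for more than 24 hours in a single day.
hours_per_day = payroll.groupby(["employee_id", "work_date"])["hours"].sum()
print(hours_per_day[hours_per_day > 24])

# Duplicate paychecks: same employee, date, and amount appearing more than once.
dupes = payroll[payroll.duplicated(subset=["employee_id", "work_date", "amount"], keep=False)]
print(dupes.sort_values(["employee_id", "work_date"]))

# Paychecks dated on weekends (flag only if weekend payment is not a usual practice).
print(payroll[payroll["work_date"].dt.dayofweek >= 5].head(20))

# Outliers in the number of checks issued per employee per year.
checks_per_year = payroll.groupby(
    ["employee_id", payroll["work_date"].dt.year]
)["check_number"].nunique()
print(checks_per_year.sort_values(ascending=False).head(10))
```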

Prepare the Data


Before moving to the next phase—data analysis—you’ll need to complete another critical step: preparing or “normalizing” the data. In layman’s terms, this means ensuring the same item, legal entity, person, or phone number is consistently represented in the same way every time it appears in the data set.


For example, in legacy data systems, a phone number might be represented as 555-555-5555, (555) 555-5555, or 1 (555) 555-5555. In this case, you’ll need to select a single, standard format and convert all the entries accordingly. Other places you’ll routinely need to correct your data are street address fields and business name fields.


Even with sophisticated data analytics software and experienced analysts, normalizing data is a painstaking and time-consuming process. However, a single wrongly spaced, abbreviated, or punctuated entry can throw off your entire analysis, so triple check your data!
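A minimal normalization sketch for the phone-number example above, reducing every variant to the same ten digits (choosing ten digits as the standard is itself an assumption; pick whatever single format your team prefers):

```python
import re

def normalize_phone(raw: str) -> str:
    """Strip punctuation and a leading US country code so every variant compares equal."""
    digits = re.sub(r"\D", "", raw or "")
    if len(digits) == 11 and digits.startswith("1"):
        digits = digits[1:]   # drop the leading country code
    return digits

# "555-555-5555", "(555) 555-5555", and "1 (555) 555-5555" all normalize to "5555555555".
for raw in ["555-555-5555", "(555) 555-5555", "1 (555) 555-5555"]:
    print(normalize_phone(raw))
```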


Analyze the Data


Now you’re ready to test your theories.


Begin with simple analyses—e.g., sort purchases by dollar amount and identify the products or equipment with the highest-dollar volumes, the greatest percentage increases in prices, the widest range of pricing, and so forth. Move to further “cuts” in the data and more sophisticated analyses. Identify false positives and what causes them. Too many results may mean the algorithm requires “tweaking” to prioritize more significant indicators of fraud.
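As a starting point, those first simple cuts might look something like this, reusing the hypothetical purchase table from earlier (column names remain assumptions):

```python
import pandas as pd

purchases = pd.read_csv("purchases.csv")   # hypothetical cleaned and normalized extract

# Highest-dollar products or equipment.
total_spend = purchases.groupby("product_id")["amount"].sum().sort_values(ascending=False)
print(total_spend.head(10))

# Widest range of unit pricing for the same product.
price_spread = purchases.groupby("product_id")["unit_price"].agg(["min", "max"])
price_spread["range"] = price_spread["max"] - price_spread["min"]
print(price_spread.sort_values("range", ascending=False).head(10))
```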


Identify the Outliers


Once you find anomalies or outliers, isolate and aggregate them. Turn each into a “Data Sheet” or “Tip Sheet” with the date and identifying information. Store the information in individual files, folders, or as attachments in software.


Investigate


Investigate anomalies. In the worst-case scenario—suspected fraud—bring in or coordinate with law enforcement or dedicated investigative units to ensure that evidence is preserved properly and not “tainted” in case of a criminal proceeding. Keep in mind that a first-time interview with an individual suspected of committing fraud may be the only opportunity to obtain a confession, and that experienced, skilled interviewers are in the best position to obtain an admission of wrongdoing based on key investigative techniques.


The Sky’s The Limit


When it comes to large datasets, the possibilities for analysis are almost unlimited. Here are some additional important tips to get you started:

  • Build in plenty of time for the planning phase. Understand that experimentation (trial and error) is a normal part of the process.

  • Because of the complexity of diverse data sets, plan for data cleansing to be time-consuming.

  • Design IT systems with fields that lend themselves to analysis. If you’re in communication with system designers, provide feedback when additional fields are needed. Fields that summarize or aggregate data, yes/no checkboxes, and fields that explain the “why” behind information are particularly valuable. For example, perhaps your organization frequently has returns but does not track why an item was returned. Add to the inspection list a check box of reasons such as: damaged goods, wrong product, quantity less than requested, or other. Without creating too much complexity, the list provides powerful insight into flaws in products, supplies, or equipment and the vendors who provide them.

  • Use open-source research and document reviews to explore anomalies. If one vendor consistently shows up on the problem list, research online to see whether other organizations have had problems with that company, looking for patterns such as repeated complaints, litigation, poor reviews, or bankruptcies.

 

Overwhelmed? We can help.


If you’d like assistance building a data analytics program, contact the Vander Weele Group at info@vanderweelegroup.com.


For more grants oversight resources, visit our Resource Library.
