What is Poor Data Quality Costing You? New BI Tools Offer Proactive Insights
The cost of poor data quality is difficult to track, yet it weighs heavily on the collective mind of an organization. Starting at the top, 84% of CEOs are concerned about the quality of the data they are using to make decisions, according to Forbes magazine.
The impact of poor data quality is felt by employees at all levels. Businesses surveyed by Gartner believe poor data quality is responsible for an average of $15 million per year in losses, and 94% believe the data they hold is inaccurate.
Fortunately, new business intelligence (BI) tools can help transportation and logistics companies quickly identify problem areas and take proactive steps to improve data accuracy, consistency and completeness.
Learning from the Experts
Trimble’s BI group has created dashboards that use Domo as the front-end BI application. The interactive dashboards give users visibility and recommendations for how to proactively solve problem areas with data quality in their systems, such as data that is missing or incomplete.
Wes Briggs, Senior Manager of BI Implementation, and Michelle Reed, Senior Business Intelligence Analyst for Trimble Transportation, shared new tools that make it possible to quickly identify and correct data quality issues during an Aug. 16 session, “The Cost of Poor Data Quality,” at Trimble’s 2022 Insight Tech Conference + Expo in Orlando.
Taking Proactive Steps
Human error is a primary cause of poor data quality, such as when a technician forgets to pause a repair order while taking a lunch break. Employee turnover also contributes to inconsistencies in how data is entered and managed, Briggs explained.
Briggs showed a Client Implementation dashboard that identifies employees who are incorrectly entering data into TMT Asset Maintenance or another system during the implementation process. The dashboard also shows where errors are creeping into master data records for employees, customers, parts inventories and warranty records, among other areas.
Briggs showed a new Data Quality dashboard that identifies errors, at the system level, that take place over specified time periods. The dashboard gives recommendations for how to proactively correct problems. Reports can be sent directly to users who are responsible for the quality of data in fleet maintenance, operations, and other areas.
Briggs explained how the Data Quality dashboard identifies human errors by giving examples of how it can be used with the TMT Asset Maintenance system to find employee mistakes when opening and closing repair orders. Users can address such problems with training or by changing operating procedures, he said.
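A check like the one Briggs described can be approximated in a few lines. The sketch below is purely illustrative: the record layout, field names, and five-hour threshold are assumptions for demonstration, not TMT Asset Maintenance's actual schema or rules. It flags labor lines whose continuous duration is implausibly long, which often means a technician forgot to pause the repair order for a break.

```python
from datetime import datetime, timedelta

# Hypothetical repair-order labor lines: (order_id, start, end).
# Field names and values are illustrative, not the actual TMT schema.
labor_lines = [
    ("RO-1001", datetime(2022, 8, 1, 8, 0), datetime(2022, 8, 1, 10, 30)),
    ("RO-1002", datetime(2022, 8, 1, 11, 0), datetime(2022, 8, 1, 17, 45)),
]

# Any single stretch of work longer than this suggests an unpaused break.
MAX_CONTINUOUS = timedelta(hours=5)

def flag_unpaused(lines, limit=MAX_CONTINUOUS):
    """Return the IDs of labor lines that exceed the continuous-work limit."""
    return [order_id for order_id, start, end in lines if end - start > limit]

print(flag_unpaused(labor_lines))  # RO-1002 runs nearly seven hours unbroken
```

A report built on a rule like this is what lets a manager respond with targeted training or a procedure change rather than hunting through individual orders.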
The Data Quality dashboard also pinpoints areas where companies have missing or incomplete data, such as when assets in the TMT system do not have warranty information or have errors in odometer readings.
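Completeness checks of this kind reduce to simple rules over asset records. This is a minimal sketch under assumed field names (`warranty_end`, `odometer`), not the real TMT data model: it flags assets with no warranty information and assets whose odometer readings decrease over time, which indicates an entry error.

```python
# Hypothetical asset records; field names are illustrative, not TMT's schema.
assets = [
    {"unit": "T-101", "warranty_end": "2024-06-30",
     "odometer": [120500, 121300, 121900]},
    {"unit": "T-102", "warranty_end": None,
     "odometer": [98000, 97200, 99100]},  # reading dropped: likely a typo
]

def completeness_issues(records):
    """Return (unit, issue) pairs for missing or inconsistent asset data."""
    issues = []
    for rec in records:
        if not rec["warranty_end"]:
            issues.append((rec["unit"], "missing warranty information"))
        readings = rec["odometer"]
        # Odometer values should never go backward between readings.
        if any(later < earlier for earlier, later in zip(readings, readings[1:])):
            issues.append((rec["unit"], "odometer reading decreased"))
    return issues
```

Surfacing both problems in one pass mirrors how a dashboard can roll system-level errors up into a single view for the person who owns that data.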
Looking beyond maintenance, Briggs showed how the Data Quality dashboard can identify data quality issues in transportation management systems (TMS), such as when data is out of sequence during the flow of freight transactions from dispatch to invoicing, accounts receivable, and payment.
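A sequence check like the TMS example above can be sketched by validating that each load's transaction timestamps follow the expected order. The stage names and timestamp values below are hypothetical, mirroring the dispatch-to-payment flow described in the session rather than any real TMS schema.

```python
# Expected order of freight-transaction stages for a load.
STAGES = ["dispatched", "invoiced", "receivable_posted", "paid"]

# Hypothetical per-load event timestamps (smaller number = earlier).
loads = {
    "LOAD-1": {"dispatched": 1, "invoiced": 2, "receivable_posted": 3, "paid": 4},
    "LOAD-2": {"dispatched": 1, "invoiced": 5, "receivable_posted": 4, "paid": 6},
}

def out_of_sequence(load):
    """True if any recorded stage happened before the stage preceding it."""
    times = [load[stage] for stage in STAGES if stage in load]
    return any(later < earlier for earlier, later in zip(times, times[1:]))

bad_loads = [load_id for load_id, events in loads.items()
             if out_of_sequence(events)]
print(bad_loads)  # LOAD-2 posted to receivables before it was invoiced
```

Routing a list like `bad_loads` to the billing team each morning is the kind of proactive, targeted report the dashboards are designed to automate.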
Contact us today to learn more about using the latest BI technology to proactively identify and solve issues with data quality.