Why GP, NAV, and AX Manufacturers’ Data Is Not AI-Ready
pjoeckel • April 1, 2024
Going live on a new ERP platform without clean data is like spending a fortune on a new supercar and pouring sugar and sawdust into the gas tank.
It is an educated guess, but I will wager that the typical Microsoft Dynamics Great Plains (GP), NAV, and AX manufacturers' data is not AI or upgrade ready.
Let us look at a recent example of the data issues I have seen in my career helping companies select, implement, and fix ERP software projects.
I was advising a partner on why his project could not go live when I found additional data issues.
The first issue was that the client had transferred hundreds of thousands of routings to the new system. Simple math told me the required number was closer to one hundred. It turned out that the production department had created a new, unique routing for every part ever made. So an identical part produced multiple times was not tracked on the same routing, but on a new, unique routing each time.
The second and equally significant data problem was the number of duplicate items I suspected were creating major junk-data problems in the item file.
The client creates many engineer-to-order parts supported by a large engineering staff. When I suggested that we analyze the item file for duplicates before importing them to the new system, the VP of engineering was highly insulted and outraged. He assured me that his item master was pristine with zero duplicates because of the tight controls in his department.
However, I have seen multiple engineering departments where the system used to create a new engineering bill of materials made it easier to create a part on the fly than to find it in the existing item file.
Prepared to be proven wrong and pleasantly surprised by this group of engineers' efficiency, I asked for an export of the item file. To get ready, I brushed up on Excel's Boolean logic capabilities to find potential duplicate parts. I could have saved myself time and effort.
When I opened the item master in Excel, a cursory scan of the first page showed SEVEN different item numbers with the SAME item description. A simple sort and review of duplicate item numbers revealed hundreds of thousands of duplicates. In my experience, these are not unique dirty-data scenarios.
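The sort-and-review pass described above can be automated. Here is a minimal sketch in Python that groups item records by normalized description and flags any description shared by more than one item number; the item numbers and descriptions are hypothetical examples, not the client's data.

```python
# Flag item descriptions that map to more than one item number.
# Normalization (uppercase, collapsed whitespace) is an assumption about
# how loosely the duplicates were entered.
from collections import defaultdict

def find_duplicate_descriptions(items):
    """items: iterable of (item_number, description) tuples."""
    by_desc = defaultdict(set)
    for number, desc in items:
        # Normalize so "1/2 in. Hex Bolt" and "1/2 IN. HEX BOLT " collide.
        key = " ".join(desc.upper().split())
        by_desc[key].add(number)
    # Keep only descriptions shared by two or more item numbers.
    return {d: sorted(nums) for d, nums in by_desc.items() if len(nums) > 1}

items = [
    ("A-1001", "1/2 in. Hex Bolt"),
    ("A-2417", "1/2 IN. HEX BOLT"),
    ("B-0042", "Red Wagon Handle"),
]
dupes = find_duplicate_descriptions(items)
# -> {"1/2 IN. HEX BOLT": ["A-1001", "A-2417"]}
```

Even this crude check would have surfaced the seven-way duplicates on the first page of that item master in seconds.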
Poor data quality can significantly hinder the effectiveness of AI in forecasting and analysis within manufacturing. Here is a brief list of issues that may arise:
- Inaccurate Predictions: AI relies on historical data to make forecasts. If the data is flawed, predictions will be unreliable.
- Quality Control Problems: Bad data can lead to incorrect product quality assessments, resulting in undetected defects.
- Maintenance Challenges: Predictive maintenance becomes problematic if data about machine performance is incorrect or incomplete.
- Operational Inefficiency: AI uses data to optimize processes. Poor data leads to suboptimal decisions and processes.
- Adaptation Issues: Unreliable data-driven insights make it hard to adapt quickly to market changes.
- Decision-Making Errors: Strategic decisions based on poor data can lead to costly mistakes, especially around product costing and sales analysis.
These issues underscore the importance of high-quality data for AI applications in manufacturing. Eliminating duplicates is only the first, and simplest, step of a thorough data cleansing project that turns raw records into meaningful, valuable data.
Pro-Tip: If, during contract negotiation for your new ERP platform, someone makes the time- and money-saving suggestion that data can be migrated via Excel templates, expect massive change orders and a rescheduled go-live date roughly two-thirds of the way through your original budget and timeline.
Call us for a meaningful conversation on how to clean and convert data to be AI-ready.
HandsFree ERP
We are dedicated to ERP project excellence, with experienced people, innovative processes, and productivity tools like GYDE365-Discover. Our team brings over one hundred years of combined experience selecting and implementing strategic ERP platforms.

Peter Joeckel
With an IE/OR engineering degree and enterprise software implementation experience starting at Price Waterhouse, Peter Joeckel has been in the business application selection, implementation, and challenged project turn-around business for over thirty years. He credits his industrial engineering degree with his search for better processes and tools to implement complex business application platforms.
Most recently, he was the lead HandsFree client advisor in the Circle of ERP Excellence lounge and speaker at the Community Summit North America.
HandsFree ERP is dedicated to supporting clients with their ERP initiatives, enabling companies to seamlessly connect users with their ERP partners. By utilizing skilled professionals, streamlined processes, and cutting-edge tools, HandsFree ERP significantly boosts the success rates of ERP projects.

By Peter Joeckel • September 11, 2025
Most organizations think data migration is about moving records from A to B. They're wrong. It's about transforming business information into operational truth. Get it wrong, and you're just digitizing your problems at enterprise scale.

If you're a distributor or manufacturer, your business runs on inventory. Simple as that. Everything else (sales, purchasing, operations) revolves around making sure your inventory data is accurate. And yet, so many companies struggle with messy, outdated, or outright incorrect data, setting themselves up for major headaches when it comes time to implement or upgrade an ERP system.

For manufacturers and distributors, inventory is the heart of the business. Everything revolves around managing it effectively. In ERP terms, this involves three core processes:

1. Procure-to-Pay – Bringing inventory in from suppliers.
2. Manufacturing or Handling – Transforming or repackaging inventory.
3. Order-to-Cash – Shipping inventory out to customers.

At the heart of the problem are three core data sets: customers, suppliers, and inventory. Clean and accurate data here isn't optional. It's essential.

Let me paint you a picture of what poor data quality really costs:

- Financial processes failing because customer master data is inconsistent
- Supply chain grinding to a halt because item masters don't match across systems
- Month-end closing taking weeks because nobody trusts the numbers
- Compliance risks because audit trails are incomplete or incorrect

I've seen implementations declare success after migrating millions of records, only to discover they've built a perfect system running on garbage data. The result? Unreliable reporting, broken processes, and users creating shadow systems to track "real" data.

Here's what your implementation partner isn't telling you: data quality issues compound over time. Every day you operate with poor data, you're creating new problems that will need to be fixed later.
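One of the costs listed above, item masters that don't match across systems, is cheap to detect before migration. Here is a hedged sketch that diffs a legacy item master against the target system and reports items present in only one system or disagreeing on key attributes; the field names ("uom", "std_cost") and sample records are illustrative assumptions, not D365 F&O entities.

```python
# Diff two item masters keyed by item number and report discrepancies.
def diff_item_masters(legacy, target):
    """Each argument: dict of item_number -> {"uom": ..., "std_cost": ...}."""
    only_legacy = sorted(set(legacy) - set(target))   # missing from target
    only_target = sorted(set(target) - set(legacy))   # missing from legacy
    mismatched = sorted(                              # present in both, attributes differ
        item for item in set(legacy) & set(target)
        if legacy[item] != target[item]
    )
    return {"only_legacy": only_legacy,
            "only_target": only_target,
            "mismatched": mismatched}

legacy = {"A-1001": {"uom": "EA", "std_cost": 1.25},
          "A-2417": {"uom": "EA", "std_cost": 1.25}}
target = {"A-1001": {"uom": "EA", "std_cost": 1.30}}
report = diff_item_masters(legacy, target)
# A-2417 is missing from the target; A-1001 disagrees on standard cost
```

Running a report like this weekly during the project, rather than once at cutover, is what keeps the compounding problem described above from silently growing.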
It's like trying to build a skyscraper on quicksand: no matter how perfect your architecture, it is going to sink. The hard truth: no amount of system optimization can fix bad data. You're either managing data quality now, or you're managing data problems forever. And in D365 F&O, forever gets expensive very quickly.

Bills of Materials: The Science That Trips Everyone Up

For manufacturers, one of the biggest trouble spots is the Bill of Materials (BOM). Think of the BOM as a recipe: it defines exactly how components come together to make a finished product, like a "little red wagon." Each part must be accounted for, structured correctly, and contain only inventory items.

Here's where things go wrong:

- Many BOMs have too many levels or include non-inventory items like labor and overhead.
- Legacy systems often force companies to create Frankenstein part numbers that are confusing and error-prone.
- Process manufacturers with "recipes" face additional complexity because ingredient quality can fluctuate, affecting output consistency.

Moving this messy data into a modern ERP without cleaning it first can turn your new system into a nightmare rather than an improvement.

Routing: Where Art Meets Science

Beyond the BOM, there's routing: the step-by-step instructions for manufacturing a product. Routing data is critical for understanding capacity, scheduling, and cost management. Capturing work center setup times, labor, material, and overhead costs is key. Most companies simply don't have this data organized, which means ERP projects often start off on the wrong foot.

Planning Ahead: The Key to ERP Success

Waiting until the ERP project is live to clean and organize your data is a recipe for disaster. By then, your best engineers and data experts are fully occupied, leaving little time to fix deep-rooted issues. Forward-thinking manufacturers and distributors start data workshops well before the ERP implementation.
These workshops:

- Identify issues in customer, supplier, and inventory data
- Clean and structure BOMs and routings properly
- Establish proper part numbering and chart of accounts setups

Doing this ahead of time dramatically increases the chances of a smooth, successful ERP deployment, regardless of which system you choose.

Bottom line: messy data doesn't just slow you down; it can completely derail your ERP implementation. Start early, clean it up, and structure it correctly. Your future self (and your new ERP system) will thank you.
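As a closing illustration, the two BOM problems called out earlier (too many levels, and non-inventory items like labor coded as parts) can both be caught with a simple validation pass during a data workshop. This is a minimal sketch under stated assumptions: the nested-dict BOM structure, the MAX_DEPTH threshold, and the item names are all hypothetical, not a D365 F&O API.

```python
# Walk a BOM tree and collect problems: excessive nesting depth and
# components that are not in the inventory item master.
MAX_DEPTH = 5  # assumed policy limit on BOM levels

def validate_bom(bom, inventory_items, depth=1, problems=None):
    """bom: {"item": str, "components": [sub-BOMs]}; returns a problem list."""
    if problems is None:
        problems = []
    if depth > MAX_DEPTH:
        problems.append(f"{bom['item']}: nested deeper than {MAX_DEPTH} levels")
        return problems
    if bom["item"] not in inventory_items:
        problems.append(f"{bom['item']}: not an inventory item")
    for comp in bom.get("components", []):
        validate_bom(comp, inventory_items, depth + 1, problems)
    return problems

wagon = {"item": "RED-WAGON", "components": [
    {"item": "HANDLE-01", "components": []},
    {"item": "LABOR-ASSY", "components": []},  # labor masquerading as a part
]}
problems = validate_bom(wagon, {"RED-WAGON", "HANDLE-01", "WHEEL-04"})
# -> ["LABOR-ASSY: not an inventory item"]
```

A pass like this over every BOM in the legacy system turns "the engineering data is probably fine" into a concrete punch list before the migration clock starts running.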