I am interested to see if anyone has ever tried something similar to what I am thinking.
So, I’ve been at my new company for a month now and things are starting to come into focus. We are currently on Vantage 8 and plan to go to Epicor Kinetic Cloud. Almost nothing in Epicor is being used as intended. It is pretty bad. There is even an Access DB that one of the groups uses to manage their products and quoting.
I am currently thinking that I want to install the new environment, set up the base tables, convert Customers & Suppliers, and stop there. The parts are so bad that I do not see a viable path to converting them. We would keep the old environment going for all of the existing jobs/sales orders/etc. until they are completed, and at go-live we would start transacting in the new environment and convert parts/methods as we go.
Has anyone ever done something similar? I know that running parallel in both systems is an approach, but this would not mean doing the transactions in both systems. This would be “new” transactions in Kinetic, with anything already started in Vantage staying in Vantage.
I know this would be a huge pain to try to remember which system to be in (especially for the floor), but everything needs to be changed.
One other thing you could do is a cloud conversion pass to uplift the existing Vantage DB to the newer version, so that you can shut off the old equipment and don’t have to keep it running. You don’t have to convert anything and go live with it, but you can at least run a conversion on your existing environment and load it into another slot with read-only permission for every user, so they can access the older stuff without having to keep the old servers running.
Did you have Jobs that were being used as Sales Kits? The part for the job was what was on the sales order and each part in the “kit” was a subassembly on the job for the “parent”? Also, some of the subassemblies were just purchased parts? And some of the operations were just there for salaried employees to book time against? And accounting would add parts to the BOM for freight costs?
That’s all cursed, though it should be doable to strip the bad ops and assemblies out. There’s usually a lot of work baked into part methods/BOMs, so I’m loath to chuck that investment entirely. I’d plan on scrubbing them and pulling them forward, if at all possible.
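If the methods get exported (say, to CSV via a BAQ or DMT extract), part of that scrub can be automated before anyone converts anything. A rough sketch, assuming hypothetical column names and match patterns, that splits exported method rows into keepers and ones to review (the time buckets and freight lines described above):

```python
# Sketch: flag method/BOM rows that look like time buckets or freight
# add-ons rather than real production steps. Column names ("AsmSeq",
# "OprSeq", "Description") and the patterns are hypothetical -- adjust
# them to whatever your actual export looks like.

import re

# Patterns that mark ops/parts added for time-booking or freight costs
SUSPECT_PATTERNS = [
    r"\beng(ineering)?\s*time\b",
    r"\bfreight\b",
    r"\btime\s*bucket\b",
]

def split_method_rows(rows):
    """Split exported method rows into (keep, review) lists."""
    keep, review = [], []
    for row in rows:
        desc = row.get("Description", "").lower()
        if any(re.search(p, desc) for p in SUSPECT_PATTERNS):
            review.append(row)   # strip or re-home before conversion
        else:
            keep.append(row)
    return keep, review

rows = [
    {"AsmSeq": 0, "OprSeq": 10, "Description": "CNC Mill"},
    {"AsmSeq": 0, "OprSeq": 20, "Description": "Mech Eng Time Bucket"},
    {"AsmSeq": 1, "OprSeq": 10, "Description": "Freight - added by acctg"},
]
keep, review = split_method_rows(rows)
print([r["OprSeq"] for r in keep])  # -> [10]
print(len(review))                  # -> 2
```

It won’t catch everything, but it turns “scrub thousands of methods” into “review the rows the script flagged,” which is a much easier sell.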
How bad is this? Access is always a trainwreck, but I’m always curious to see if it’s “one car skipped the tracks” bad or “massive derailment that spilled tons of highly toxic and carcinogenic chemicals right into the drinking water” bad.
Sometimes I wish we could start over clean, but I can’t see how we could get free of the old data. Customers are constantly referring to old POs/quotes. The customer accounts themselves are a mess. On that note:
Make sure you take a hard look at customers and suppliers. We have all sorts of problems, most of which result in duplicates. A company orders from us 4 years ago, changes its name, orders from us again… new customer added. Multiple customer files in Epicor, one for each of the customer’s locations… Drop ships entered over and over for the same location, but each time the drop ship is on a different one of our dealers. Point being, now is the time to fix any problems with customers and suppliers as well.
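For that customer/supplier scrub, a quick script over an exported customer list can surface likely duplicates before anyone builds a load file. A minimal sketch, assuming hypothetical field names, that groups customers by a normalized name (lowercased, punctuation and common legal suffixes stripped):

```python
# Sketch: group likely-duplicate customers by a normalized name key.
# Field names ("CustID", "Name", "City") are hypothetical -- match them
# to your actual customer export.

import re
from collections import defaultdict

SUFFIXES = {"inc", "llc", "co", "corp", "ltd", "company"}

def name_key(name):
    """Lowercase, strip punctuation, drop common legal suffixes."""
    words = re.sub(r"[^a-z0-9 ]", "", name.lower()).split()
    return " ".join(w for w in words if w not in SUFFIXES)

customers = [
    {"CustID": "ACME01", "Name": "Acme Manufacturing, Inc.", "City": "Toledo"},
    {"CustID": "ACME02", "Name": "ACME Manufacturing LLC", "City": "Toledo"},
    {"CustID": "BOLT01", "Name": "Bolt Supply Co", "City": "Akron"},
]

groups = defaultdict(list)
for c in customers:
    groups[name_key(c["Name"])].append(c["CustID"])

dupes = {k: v for k, v in groups.items() if len(v) > 1}
print(dupes)  # -> {'acme manufacturing': ['ACME01', 'ACME02']}
```

Someone still has to decide which record survives the merge, but at least the candidates are in front of them instead of buried in the master file.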
I don’t see the advantage of running two systems. If there are process and data changes needed, the clean cutover point of an upgrade gives you a good opportunity to make them.
My recommendation would be to change only the structural things you see needing to be fixed and upgrade with minimal changes. You may only have a month’s worth of parts ready to go, but you can work the rest in over the first few months post go-live.
Running two systems would require you to do all that same work plus keep track of multiple systems.
I have worked on implementations and upgrades non-stop for the past 10 years; let me know if you want to talk more.
Two fractured sets of financials seems like a bigger nightmare to me. I’m not familiar with the legacy system you mentioned, but I am going through an on-prem Infor Visual to Kinetic Cloud migration now, and the DMT tool, data migration planning, and tribal knowledge between Engineering and Ops are getting me where I need to go to modernize and unravel the data entanglements, while having a bit of fun. If you want to get away from the fractured system, go all the way. Set your data migration map and test until you end up with a playbook that works through the sequencing and manipulations, until you have that “one true source” of business intelligence that comes with the cloud ERP. There’ll be no excuse to lean on an Access DB when Kinetic is in the cloud and anyone can report on good data as it happens. … Maybe I missed the mark, but I’ve been in IT project management doing implementations for 26+ years, and manufacturing outfits always ask if we can run legacy and modern systems concurrently for a few months. … That is always worse than taking the time to migrate good data and spend the time training in the new system before go-live.
Thanks for the responses everyone! I fully understand that best practice is to convert everything on the upgrade, but I am not really seeing that as viable in a reasonable amount of time. The business processes are so fundamentally broken, and so completely opposite of what should be done, that I am not seeing the path forward. Let me try to expand a little bit.
Here is an example of one of the jobs shipped this year that should have been a sales kit. Taking people through the mental exercise of trying to clean up the existing data seems insurmountable for a company that is coming up on its hundredth anniversary. The below is why I am thinking of just forcing people to start over in a clean system.
Here is another example. The important things to note: Asm 27, selected in blue, does not have a MOM; what you see was added manually or pulled from an old job (which more than likely was wrong). Asm 27 is also a circular reference back to its parent, Asm 2. And the op highlighted in yellow is just a bucket for Mechanical Engineers to book time against; it cannot be scheduled.
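A circular reference like Asm 27 pointing back at its parent Asm 2 is exactly the kind of thing worth scripting a check for across every job before deciding what (if anything) can be converted. A minimal sketch over child→parent assembly pairs; the sequence numbers here are made up to mimic the broken job above:

```python
# Sketch: detect circular references in an assembly parent/child
# structure. The edge map below is a hypothetical stand-in for data
# pulled from a job export (child AsmSeq -> parent AsmSeq).

def find_cycle(parent_of, start):
    """Walk up the parent chain from `start`; return the loop if one exists."""
    seen = []
    node = start
    while node in parent_of:
        if node in seen:
            return seen[seen.index(node):]  # the looping portion
        seen.append(node)
        node = parent_of[node]
    return None  # chain terminated cleanly at a root

# Mimics the broken job: Asm 27's parent is 2, and 2 points back at 27.
parent_of = {27: 2, 2: 27, 5: 0}

print(find_cycle(parent_of, 27))  # -> [27, 2]
print(find_cycle(parent_of, 5))   # -> None
```

Running something like this over every open job at least tells you how many are structurally unconvertible versus merely messy, which helps size the “start clean” argument for the board.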
And just for fun, here is one more Job. This job is closed.
I am not trying to be difficult, but I have never seen a system used so incorrectly. The sheer volume of questions and pushback I am going to receive, coupled with the board looking for a quick turnaround, is what is making me try to come up with the best way forward.
John, that’s why I’d just upgrade it as is and let them in there. Pay the money for one cloud conversion pass to get it uplifted and have them do quote to cash on that same scenario. And if it works it works. You’re out like a couple thousand bucks and they get to see what going forward with current data looks like.
One thing @bderuvo shared with me, and feel free to correct me @bderuvo , is that we should separate process improvements from the implementation and handle them at a later time… at least that’s how I understood it. He said it much more elegantly.
I know it hurts to see processes like that, but firsthand, what @bderuvo said to me rang true. We would get hung up for 3-4 weeks trying to do a process improvement, only to get bogged down doing more change management than implementation/upgrade work. It became overwhelming to the point where the majority of the time was spent trying to improve processes instead of upgrading.
I agree, but to what end? If nobody wants to improve processes, and you’re starting over in the hope that starting over will force the processes to change, I don’t know how that’s going to go when what we’re being told is that nobody wants to change right now:
You have your choice of nightmares. If the time and resources aren’t available to straighten out the data beforehand, but the WILL is there to correct the data as necessary, go that way. It won’t be easy, but at least it will be possible. In my (not-usually-so) humble opinion, bringing in the bad data will eventually cripple you since everyone will try to keep using it the same old way with the same old lack of good process.
Work HARD on developing good processes. If you have the best data you can have based on what’s available, good processes will be your best way of ensuring all the NEW data comes in good and stays good.