Archiving

Live since 2015, Ice.ChgLog is 23 GB… 34,775,700 rows.

GULP

1 Like

I guess my big dog is just a Chihuahua.

And don’t forget, just counting rows is WAY faster than actually doing anything with them. In actual SQL, if I run SELECT * on PartTran (no filters, no joins, nothing other than just reading the rows), I was at 3+ minutes and 1.8M records and counting. I stopped that query.

But if I run COUNT(1) on that table, it counts 3.5M records in under a second.

This just highlights the difference in the amount of work that needs to be done when different things happen in a query. Adding joins, filters, and sorts makes everything work a lot harder.
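A minimal sketch of the point above, using a toy SQLite table (not Epicor's actual schema) to show why COUNT(1) is so much cheaper than SELECT *: the count never materializes or transfers the row payloads.

```python
import sqlite3
import time

# Toy "parttran" table in an in-memory SQLite database. Table and column
# names are illustrative only, not Epicor's real schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE parttran (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany(
    "INSERT INTO parttran (payload) VALUES (?)",
    (("x" * 200,) for _ in range(100_000)),
)

t0 = time.perf_counter()
rows = conn.execute("SELECT * FROM parttran").fetchall()  # pulls every row's data
t_select = time.perf_counter() - t0

t0 = time.perf_counter()
(count,) = conn.execute("SELECT COUNT(1) FROM parttran").fetchone()
t_count = time.perf_counter() - t0

print(len(rows), count)    # both see 100000 rows
print(t_count < t_select)  # counting is almost always much faster
```

The gap only grows with row width and row count, which is why the 3.5M-row count finishes in a second while the full read runs for minutes.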

So purging some of that old data can make a big difference. Although, if you put an index on the date and then used table criteria to ignore anything older than a specific date (basically pretending there was a purge), it would probably be almost as fast as if you really did delete the records.
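The "pretend purge" idea above can be sketched like this, again with an illustrative SQLite table: index the date column, then filter every query to a cutoff so old rows are never touched.

```python
import sqlite3

# Illustrative table and column names, not Epicor's actual schema.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE parttran (id INTEGER PRIMARY KEY, tran_date TEXT, qty REAL)"
)
conn.executemany(
    "INSERT INTO parttran (tran_date, qty) VALUES (?, ?)",
    [("2015-06-01", 1.0), ("2019-03-15", 2.0), ("2023-11-02", 3.0)],
)

# The index is what makes the cutoff cheap: the range filter becomes an
# index seek instead of a full scan of all the old rows.
conn.execute("CREATE INDEX ix_parttran_date ON parttran (tran_date)")

cutoff = "2020-01-01"
recent = conn.execute(
    "SELECT COUNT(1) FROM parttran WHERE tran_date >= ?", (cutoff,)
).fetchone()[0]
print(recent)  # 1 -- only the 2023 row survives the cutoff
```

The old rows still take up disk space, which is the "bloated database" caveat in the next post, but query cost looks much closer to a real purge.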

However, it still ends up being a bloated database.

7 Likes

This brings back nightmares!! I did a Purge and Summarize a week before we moved from E9 to Kinetic, and it deleted our entire TranGLC table. We had to go back to the backup from two days earlier and redo everything… that was not fun. I’m not sure I could ever confidently pull the trigger on that again.

TRANGLC Deleted after Purge and Summarize! - Epicor ERP 9 - Epicor User Help Forum

3 Likes

I’ve run the purge against TEST databases a couple of times for clients. It appeared to work pretty well. I asked them to go through the test system and review old data. That never happened, and I generally got a “you can purge my data when you pry it out of my cold dead hands” response. So I never actually ran it against a PROD DB.

I’d think running it against a TEST DB, then running some things like financial statements, WIP reports, etc., would give you a pretty good idea whether it would work for you.

Purge and Summarize works fine. You have to run it one year at a time; I’ve never had a single year take more than a few hours (it depends on the type of data being purged). If you’re brave, or have a large window of time, maybe a few years at a time, but only one transaction type.

Oh I remember this @ralmon !!!

1 Like

I recently did a Purge & Summarise on Change Logs and PartTran data (only in Test). It was for a three-year date range (go live - 2020) and from memory was circa 20 million rows (we had some fields logging changes that we shouldn’t have had in that time period).

It took about 18 hours total, and the test system was almost unusable for the duration. The Purge and Summarise process seems to use as much server resource as is available; I’m not sure if there are any settings around this, but I’d hesitate to use it in Live because of it.

After it was complete we ran a shrink on the database and reduced it by around 15 GB, so it was worth it for us. I’ll definitely be using it again for more transaction types in Test and Pilot databases.

We run Database Purge and Summarize annually on our 200 GB database, on PartTran, TranGLC, and GLJrnDtl, and keep the last three years of data. I purge change logs monthly and keep the last two years. Prior to the annual purge, we append our SQL archive database from PROD, so that always has full records for data analysis… If you get pushback from folks who don’t want to lose that history, find out when something was last needed from more than three years ago. That’s old news! We don’t purge Quote, Order, PO, or Labor; those tables aren’t that big for us.

Upload your environment to the ERP Analyzer (ask your CAM), and you’ll get a report showing the top 25 tables (or you could do it in SQL). That showed us which tables made sense to purge.
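For the "or you could do it in SQL" part, here's a hedged, poor-man's version of a top-tables report: rank tables by row count. This uses a toy SQLite database; on SQL Server you'd query the catalog views for real size data instead.

```python
import sqlite3

# Toy database with a few illustrative tables (names borrowed from the
# thread; row counts are made up for the demo).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE PartTran (id INTEGER);
    CREATE TABLE TranGLC  (id INTEGER);
    CREATE TABLE GLJrnDtl (id INTEGER);
""")
conn.executemany("INSERT INTO PartTran VALUES (?)", [(i,) for i in range(500)])
conn.executemany("INSERT INTO TranGLC  VALUES (?)", [(i,) for i in range(300)])
conn.executemany("INSERT INTO GLJrnDtl VALUES (?)", [(i,) for i in range(100)])

# Enumerate tables, count rows in each, and sort descending.
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")]
sizes = sorted(
    ((t, conn.execute(f"SELECT COUNT(1) FROM {t}").fetchone()[0]) for t in tables),
    key=lambda kv: kv[1],
    reverse=True,
)
for name, n in sizes[:25]:
    print(f"{name:10s} {n:>8d}")
```

Row counts are a rough proxy; actual on-disk size per table is what the Analyzer report gives you, so treat this as a first cut.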

You definitely don’t want to have Epicor attempt a year (or ten!) of purging in one shot. I typically do each table in one- to three-month chunks and let the process finish. I do mine in the afternoon when database activity is lower, and purge a larger chunk on my way out the door (to finish overnight). I’m sure you could do it in PowerShell so it’s a click-and-forget process.
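The chunking idea above is easy to script in whatever language you automate with. As a sketch (Python here for consistency with the other examples), this just generates the one-month date windows; the actual purge call for each window is left out, since that part is specific to your environment.

```python
from datetime import date

def month_chunks(start: date, end: date):
    """Yield (first_day, first_day_of_next_month) windows covering [start, end)."""
    cur = date(start.year, start.month, 1)
    while cur < end:
        if cur.month == 12:
            nxt = date(cur.year + 1, 1, 1)
        else:
            nxt = date(cur.year, cur.month + 1, 1)
        yield (cur, min(nxt, end))  # clamp the last window to the end date
        cur = nxt

# One year of purging becomes twelve small, finishable runs.
chunks = list(month_chunks(date(2015, 1, 1), date(2016, 1, 1)))
print(len(chunks))  # 12 one-month windows
for lo, hi in chunks:
    # Placeholder: kick off the purge for [lo, hi) here, wait for it to
    # finish, then move on to the next window.
    pass
```

Running each window to completion before starting the next is the whole point: a failed or cancelled small chunk costs you an hour, not a weekend.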

For our test environments, I’ll purge everything (from the already-purged PROD environment) down to the 18-month minimum lookback to keep our SysAdmin happy about not running out of space, lol. It’s tedious doing it in 1-3 month chunks, but not impossible. After the purge and shrink, it got us down from 186 GB to 116 GB. Not bad, if you have a few test environments…

Also: if you purge GL, Epicor doesn’t care if it’s an open invoice on 5-year terms. You’ll need a datafix every year to add back the detail records for those invoices to fix the errors. Working As Designed. :man_facepalming:

6 Likes

A number of years ago, we dedicated a programmer to setting up archiving on our SAP R/3 system. SAP had a pretty mature archiving method of moving the data to text files, which lets an auditor look through all that ancient stuff with Notepad or a string search. The other part we did was generate documents like invoices to PDF files; your business will dictate what should be generated. I think people are missing this aspect in the discussion. Most times, in my experience, people want that old invoice or purchase order or sales order. The order header and detail records are great, but people seem to want the document that was generated from them.
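The archive-to-text idea above can be sketched as a simple dump-before-purge step: write the doomed rows out to a flat file an auditor can read in Notepad or grep. Table and column names here are illustrative, and the in-memory buffer stands in for a real archive file on disk.

```python
import csv
import io
import sqlite3

# Toy invoice table; real archiving would select the rows about to be purged.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoice (num TEXT, cust TEXT, total REAL)")
conn.executemany(
    "INSERT INTO invoice VALUES (?, ?, ?)",
    [("INV-001", "Acme", 120.50), ("INV-002", "Globex", 75.00)],
)

# Dump to CSV: human-readable, greppable, and tool-independent, which is
# the whole appeal of a text archive.
buf = io.StringIO()  # stand-in for open("archive_2015.csv", "w", newline="")
writer = csv.writer(buf)
writer.writerow(["num", "cust", "total"])
writer.writerows(conn.execute("SELECT num, cust, total FROM invoice"))

archive_text = buf.getvalue()
print(archive_text)
```

Note this only preserves the raw rows; as the post says, the rendered document (the PDF invoice as it looked when issued) is a separate artifact you have to generate and keep alongside the data.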

I know our AR invoice is our most-modified report style, so an invoice from 2017 will look different from an invoice from 2020, even when generated from the same records. This is a key difference that some accounting types will be concerned about.

6 Likes