Len,
As a lot of this data is fairly static, the best option is pre-loading
the data, then processing any changes afterwards, creating a diff of the
changes as required. The 7 days then do not need to be on the critical
path of the conversion window.
You can report on M2K and Epicor 9, showing any differences, and process
the changes.
I would only consider a UDT if the data does not fit anywhere else, and
in this case it does fit.
Purging is an option but not mandatory.
Pre-loading is the best method, leaving the genuinely dynamic items,
such as on-hand inventory and cost adjustments, for the cut-over process.
For one site we pre-loaded 400K parts and 600K BOMs; then, during the
week before go-live, we created a diff and processed the changes.
The easiest way of doing this was to keep the original import files and
rerun the extracts from the source system.
I then wrote a diff tool that took an MD5 hash of each row and compared
the two files, producing just the changed items. Other methods are
possible, such as querying both systems.
This was done for parts, BOMs, BOOs, etc.
In the end, just before go-live, we had only 5K records to process, i.e.
those that had actually changed during the week.
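The diff tool described above can be sketched roughly as follows. This is
a minimal illustration, not the actual tool: it assumes tab-delimited
extract files whose first field is the record key, and the function names
(row_hashes, diff_extracts) are made up for the example.

```python
import hashlib

def row_hashes(path, key_cols=1):
    """Map each row's key (the first key_cols fields) to an MD5 hash of the whole row."""
    hashes = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.rstrip("\n")
            if not line:
                continue
            key = tuple(line.split("\t")[:key_cols])
            hashes[key] = hashlib.md5(line.encode("utf-8")).hexdigest()
    return hashes

def diff_extracts(original_path, latest_path, key_cols=1):
    """Return keys of rows that were added or changed since the original extract."""
    old = row_hashes(original_path, key_cols)
    new = row_hashes(latest_path, key_cols)
    return [key for key, h in new.items() if old.get(key) != h]
```

Only the keys returned by diff_extracts need to be re-imported at cut-over;
deleted rows could be found the same way by checking old keys missing from new.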
Hopefully this will help.
Regards,
Stephen
From: vantage@yahoogroups.com [mailto:vantage@yahoogroups.com] On Behalf
Of Len Hartka
Sent: 03 May 2010 16:55
To: vantage@yahoogroups.com
Subject: [Vantage] Dealing with Big files in Migration
Good Day:
Sun is on M2K (Informix) but going to Epicor 9, 9.04.505B (maybe
9.05), on SQL (go-live date is 8/1/10?).
Working on Migration of data.
My IM (PART) file has 292,877 records, and my BOM (PARTMTL) is
1,065,255.
Using standard techniques, the Migration of data will take 7 to 11
days.
Obviously we cannot be down that long so we need another solution.
Epicor has said this is one of the biggest data sets they have had to
convert, which raises the question of why Sun should be so different.
The only explanation I can think of is that other companies purge their
files. I am specifically talking about the PART file and the PARTMTL
file.
The reason the PARTMTL (BOM) file is so big is that we keep a copy
of the BOM for every machine we ever made; all are different, and
customers buy spare parts regularly, so we need to be able to look up the
part#.
Now, in E9 we will create a JOBMTL record instead of a PARTMTL
record for each machine, but that just means the JOBMTL table will grow
just as big.
Question:
1. Do you purge records in IM or BOM, or JobMTL?
2. If purged, how do you 'archive' the records for occasional use?
3. One solution is to use E9's User_Defined_Tables (UDT's)- does anyone
have experience with them - do they work fairly seamlessly?
4. Most likely solution is to move 'static' data a week before go-live,
then Update the data on go-live date- did anyone do that and how did it
go?
5. Were you able to migrate data at a faster rate? In-House?
**************************************************************
Sun Automation Group
Celebrating
25 Years of Service
to the Corrugated Industry
**************************************************************
This e-mail and any attachments may contain proprietary and/or
confidential information. If you are not the intended recipient, please
notify the sender immediately by reply e-mail or at 410-472-2900 and
then delete the message without using, disseminating, or copying this
message or any portion thereof. With e-mail communications you are urged
to protect against viruses.