Dealing with Big files in Migration

Len,



As a lot of this data is fairly static, the best option is to pre-load
it, then process any changes afterwards, creating a diff as required.



The 7 days then do not need to be on the critical path of the
conversion window. You can report on M2K and Epicor 9, showing any
differences, and process just the changes.



I would only consider a UDT if the data does not fit anywhere else,
and in this case it does fit.

Purging is an option but not mandatory.



Pre-loading is the best method, leaving the genuinely dynamic items,
such as on-hand inventory and cost adjustments, as a cut-over process.



For one site we pre-loaded 400K parts and 600K BOMs, then during the
week before go-live we created a diff and processed the changes.

The easiest way of doing this was to keep the original import files and
re-run the extracts from the legacy system.

I then wrote a diff tool that took an MD5 hash of each row and compared
the two files, producing just the changed items; other methods are
possible, such as querying both systems.
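
If you stage both extracts in SQL tables, the same comparison can be
done in the database. The sketch below is only illustrative: PART_OLD
and PART_NEW are hypothetical staging tables, keyed by PARTNUM, with
the whole delimited row held in a ROWDATA column.

-- Return the keys that are new, or whose row content changed, between
-- the two extracts. (HASHBYTES input is limited to 8000 bytes on the
-- SQL Server versions of this era.)
SELECT N.PARTNUM
FROM PART_NEW N
LEFT JOIN PART_OLD O ON O.PARTNUM = N.PARTNUM
WHERE O.PARTNUM IS NULL
   OR HASHBYTES('MD5', N.ROWDATA) <> HASHBYTES('MD5', O.ROWDATA);

Only the rows this returns need to be re-imported during the cut-over
window.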



This was done for parts, BOMs, BOOs, etc.

In the end, just before go-live, we had only 5K records to process,
the ones that had actually changed during the week.



Hopefully this will help.



Regards,

Stephen

From: vantage@yahoogroups.com [mailto:vantage@yahoogroups.com] On Behalf
Of Len Hartka
Sent: 03 May 2010 16:55
To: vantage@yahoogroups.com
Subject: [Vantage] Dealing with Big files in Migration





Good Day:

Sun is on M2K (Informix), but going to Epicor 9, 9.04.505B (maybe
9.05), on SQL (go-live date is 8/1/10?).


We are working on the data migration.

My IM (PART) file has 292,877 records, and my BOM (PARTMTL) file has
1,065,255.

Using standard techniques, the migration of the data will take 7 to 11
days. Obviously we cannot be down that long, so we need another
solution.

Epicor has said this is one of the biggest data sets they have had to
do, which brings up the question of why Sun should be so different. The
only thing I can think of is that other companies purge their files. I
am specifically talking about the PART file and the PARTMTL file.

The reason the PARTMTL (BOM) file is so big is that we keep a copy of
the BOM for every machine we ever made, since all are different and
customers buy spare parts regularly, so we need to be able to look up
the part numbers. Now, in E9, we will create a JOBMTL record instead of
a PARTMTL record for each machine, but that just means the JOBMTL
records will get just as big.

Questions:

1. Do you purge records in IM, BOM, or JobMtl?
2. If you purge, how do you 'archive' for occasional use?
3. One solution is to use E9's User Defined Tables (UDTs) - does anyone
have experience with them? Do they work fairly seamlessly?
4. The most likely solution is to move 'static' data a week before
go-live, then update the data on the go-live date - did anyone do that,
and how did it go?
5. Were you able to migrate data at a faster rate? In-house?


**************************************************************
Sun Automation Group
Celebrating 25 Years of Service to the Corrugated Industry
**************************************************************


Going back to an earlier post of mine today: when we went live with
SQL Vantage I wrote import scripts for the Part master and was able to
import 1,000 records per minute. Epicor will use "Pervasive" to import,
I believe, and if I recall correctly it utilizes Service Connect, which
is not built for mass transactional projects. It's slow.



We are an engineer-to-order manufacturer, so we, too, had tons of
BOMs. Since we rarely use the same BOM twice, we opted not to import
our legacy BOMs. Instead we manually rebuilt the hundred or so that we
sometimes re-use. We worked that data entry into our conference room
testing, and it worked great for training.



When you import parts you'll hit these tables: Part, PartBin, PartCost,
PartPlant, PartWhse, and PlantWhse.



I also flagged all imported parts so I could easily identify those
records (specifically, I set a Number UD field = 777).
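
That flag also makes it easy to audit a load afterwards. A minimal
sketch, using the same UD fields the procedure below writes (Number20
on Part, Number10 on the child tables), counts what each run touched:

-- How many rows did the import write to each table?
SELECT 'Part' AS TBL, COUNT(*) AS CNT FROM PART WHERE NUMBER20 = 777
UNION ALL SELECT 'PartBin', COUNT(*) FROM PARTBIN WHERE NUMBER10 = 777
UNION ALL SELECT 'PartCost', COUNT(*) FROM PARTCOST WHERE NUMBER10 = 777
UNION ALL SELECT 'PartPlant', COUNT(*) FROM PARTPLANT WHERE NUMBER10 = 777
UNION ALL SELECT 'PartWhse', COUNT(*) FROM PARTWHSE WHERE NUMBER10 = 777
UNION ALL SELECT 'PlantWhse', COUNT(*) FROM PLANTWHSE WHERE NUMBER10 = 777;

The same flag gives you a clean way to back a trial run out (DELETE
FROM PART WHERE NUMBER20 = 777, and likewise for the child tables)
before re-running the import.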



Consider doing it in-house. A sample procedure for importing parts
would look like this:



-- Procedure to Import INVENTORY PARTS
--
-- TIMCO
-- Vic Drecchio
-- Created: 2008-12-04
-- Last Modified: 2009-01-01
--
-- ASSUMPTIONS:
--   SQL Table Name for Input: "INV_IN"
--   Structure: ID, ITEM, PARTNUM, DESC, TYPE, PRODGROUP, PARTCLASS,
--              UOM, ONHAND, UNITCOST, WAREHOUSE, BIN, IMPORT777
--

SET ANSI_NULLS ON
SET QUOTED_IDENTIFIER ON
GO

USE mfgsys803;

DECLARE @COMPANY   VARCHAR(10),
        @PLANT     VARCHAR(10),
        @1         INT,
        @RECFLAG   INT,
        @DUPFLAG   INT,
        @PARTCOUNT INT,
        @BINCOUNT  INT,
        @COSTCOUNT INT,
        @PLANTCNT  INT,
        @WHCOUNT   INT,
        @PWCOUNT   INT,
        @ROWID     INT,
        @STOCKFLAG INT;

SELECT @COMPANY   = 'Your Company',
       @PLANT     = 'Your Plant',
       @1         = 1,
       @RECFLAG   = 777, -- used to ID imported records
       @DUPFLAG   = 999, -- used to ID records skipped due to non-unique PNs
       @PARTCOUNT = 0,
       @BINCOUNT  = 0,
       @COSTCOUNT = 0,
       @PLANTCNT  = 0,
       @WHCOUNT   = 0,
       @PWCOUNT   = 0,
       @ROWID     = 0,
       @STOCKFLAG = 0;

-- Variables for the cursor fetch
DECLARE @ITEM      NVARCHAR(50),
        @PARTNUM   NVARCHAR(50),
        @DESC      NVARCHAR(50),
        @TYPE      NVARCHAR(20),
        @PRODGROUP NVARCHAR(20),
        @PARTCLASS NVARCHAR(20),
        @UOM       NVARCHAR(10),
        @ONHAND    FLOAT,
        @UNITCOST  FLOAT,
        @WAREHOUSE NVARCHAR(25),
        @BIN       NVARCHAR(25);

-- Get the list of records to import and scan through it with a cursor
DECLARE PARTLIST CURSOR FOR
    SELECT ID, ITEM, PARTNUM, [DESC], [TYPE], PRODGROUP,
           PARTCLASS, UOM, ONHAND, UNITCOST, WAREHOUSE, BIN
    FROM INV_IN;

OPEN PARTLIST

FETCH NEXT FROM PARTLIST
INTO @ROWID, @ITEM, @PARTNUM, @DESC, @TYPE, @PRODGROUP,
     @PARTCLASS, @UOM, @ONHAND, @UNITCOST, @WAREHOUSE, @BIN;

WHILE (@@FETCH_STATUS <> -1)
BEGIN
    -- PART: see if the part already exists in the Part master
    SELECT @PARTCOUNT = COUNT(*) FROM PART P WHERE P.PARTNUM = @PARTNUM

    IF @PARTCOUNT = 0
    BEGIN
        -- Unique part master record.
        -- Prepare the stock flag:
        --   TYPE = Purchased     -> stock
        --   TYPE = Manufactured  -> non-stock
        IF UPPER(LEFT(@TYPE, 1)) = 'M'
            SELECT @STOCKFLAG = 1
        ELSE
            SELECT @STOCKFLAG = 0

        INSERT INTO PART
            (COMPANY, PARTNUM, SEARCHWORD, PARTDESCRIPTION, CLASSID,
             IUM, PUM, TYPECODE, NONSTOCK, PURCHASINGFACTOR, UNITPRICE,
             PRICEPERCODE, INTERNALUNITPRICE, INTERNALPRICEPERCODE,
             PRODCODE, COSTMETHOD, SHORTCHAR10, NUMBER20)
        VALUES
            (@COMPANY, @PARTNUM, LEFT(@DESC, 8), @DESC, @PARTCLASS,
             @UOM, @UOM, UPPER(LEFT(@TYPE, 1)), @STOCKFLAG, @1, @UNITCOST,
             'E', @UNITCOST, 'E',
             @PRODGROUP, 'A', @ITEM, @RECFLAG);

        -- Put a 777 in the input table to show a unique part was inserted
        UPDATE INV_IN SET IMPORT777 = @RECFLAG WHERE ID = @ROWID
    END

    IF @PARTCOUNT > 0
    BEGIN
        -- Put a 999 in the input table to show the record was non-unique
        UPDATE INV_IN SET IMPORT777 = @DUPFLAG WHERE ID = @ROWID
    END

    SELECT @PARTCOUNT = 0 -- reset it

    -- PARTBIN: on-hand quantities only
    IF @ONHAND > 0
    BEGIN
        -- Check whether the part is already in this warehouse/bin
        SELECT @BINCOUNT = COUNT(*)
        FROM PARTBIN PB
        WHERE PB.PARTNUM = @PARTNUM AND
              PB.WAREHOUSECODE = @WAREHOUSE AND
              PB.BINNUM = @BIN

        IF @BINCOUNT > 0 -- part already exists in this WH/bin... add to it!
        BEGIN
            UPDATE PARTBIN
            SET ONHANDQTY = (ONHANDQTY + @ONHAND),
                NUMBER10  = @RECFLAG
            WHERE PARTNUM = @PARTNUM AND
                  WAREHOUSECODE = @WAREHOUSE AND
                  BINNUM = @BIN;
        END

        IF @BINCOUNT = 0 -- not in there yet! insert!
        BEGIN
            INSERT INTO PARTBIN
                (COMPANY, PARTNUM, WAREHOUSECODE, BINNUM, ONHANDQTY, NUMBER10)
            VALUES
                (@COMPANY, @PARTNUM, @WAREHOUSE, @BIN, @ONHAND, @RECFLAG);
        END

        SELECT @BINCOUNT = 0
    END

    -- PARTCOST
    SELECT @COSTCOUNT = COUNT(*) FROM PARTCOST PC
    WHERE PC.PARTNUM = @PARTNUM AND COSTID = @1

    IF @COSTCOUNT = 0
    BEGIN
        INSERT INTO PARTCOST
            (COMPANY, PARTNUM, COSTID, AVGMATERIALCOST, NUMBER10)
        VALUES
            (@COMPANY, @PARTNUM, @1, @UNITCOST, @RECFLAG);

        SELECT @COSTCOUNT = 0
    END

    -- PARTPLANT
    SELECT @PLANTCNT = COUNT(*) FROM PARTPLANT PP
    WHERE PP.PARTNUM = @PARTNUM AND PP.PLANT = @PLANT

    IF @PLANTCNT = 0
    BEGIN
        INSERT INTO PARTPLANT
            (COMPANY, PLANT, PARTNUM, PRIMWHSE, MRPRECALCNEEDED,
             PROCESSMRP, GENERATESUGG, SOURCETYPE, COSTMETHOD,
             KITSHIPCOMPLETE, KITPRICING, KITPRINTCOMPSPS,
             KITPRINTCOMPSINV, QTYBEARING, NUMBER10)
        VALUES
            (@COMPANY, @PLANT, @PARTNUM, @WAREHOUSE, @1,
             @1, @1, 'P', 'A',
             @1, 'P', @1,
             @1, @1, @RECFLAG);

        SELECT @PLANTCNT = 0
    END

    -- PARTWHSE
    SELECT @WHCOUNT = COUNT(*) FROM PARTWHSE PW
    WHERE PW.PARTNUM = @PARTNUM AND
          PW.WAREHOUSECODE = @WAREHOUSE

    IF @WHCOUNT = 0
    BEGIN
        INSERT INTO PARTWHSE
            (COMPANY, PARTNUM, WAREHOUSECODE, ONHANDQTY, NUMBER10)
        VALUES
            (@COMPANY, @PARTNUM, @WAREHOUSE, @ONHAND, @RECFLAG);
    END

    IF @WHCOUNT > 0
    BEGIN
        UPDATE PARTWHSE
        SET ONHANDQTY = (ONHANDQTY + @ONHAND),
            NUMBER10  = @RECFLAG
        WHERE PARTNUM = @PARTNUM AND
              WAREHOUSECODE = @WAREHOUSE;
    END

    SELECT @WHCOUNT = 0

    -- PLANTWHSE
    SELECT @PWCOUNT = COUNT(*) FROM PLANTWHSE PL
    WHERE PL.PARTNUM = @PARTNUM AND
          PL.WAREHOUSECODE = @WAREHOUSE;

    IF @PWCOUNT = 0
    BEGIN
        INSERT INTO PLANTWHSE
            (COMPANY, PLANT, PARTNUM, WAREHOUSECODE, PRIMBIN, NUMBER10)
        VALUES
            (@COMPANY, @PLANT, @PARTNUM, @WAREHOUSE, @BIN, @RECFLAG)
    END

    SELECT @PWCOUNT = 0

    -- Done with this iteration; get the next record
    FETCH NEXT FROM PARTLIST
    INTO @ROWID, @ITEM, @PARTNUM, @DESC, @TYPE, @PRODGROUP,
         @PARTCLASS, @UOM, @ONHAND, @UNITCOST, @WAREHOUSE, @BIN;
END
-- END OF BIG LOOP

-- REMEMBER TO UPDATE OH QTYS IN PARTWHSE FROM PARTBIN(S)

CLOSE PARTLIST
DEALLOCATE PARTLIST
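
On the last comment in the script, rolling PartBin quantities up into
PartWhse: one possible way to reconcile the two after the load is a
set-based update keyed on the 777 flag. This is a sketch under the same
schema assumptions as the procedure above, not a tested statement:

-- Set warehouse-level on-hand to the sum of its bin-level rows,
-- touching only the rows written by the import (NUMBER10 = 777).
UPDATE PW
SET PW.ONHANDQTY = B.BINQTY
FROM PARTWHSE PW
JOIN (SELECT PARTNUM, WAREHOUSECODE, SUM(ONHANDQTY) AS BINQTY
      FROM PARTBIN
      GROUP BY PARTNUM, WAREHOUSECODE) B
  ON B.PARTNUM = PW.PARTNUM
 AND B.WAREHOUSECODE = PW.WAREHOUSECODE
WHERE PW.NUMBER10 = 777;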



________________________________

From: vantage@yahoogroups.com [mailto:vantage@yahoogroups.com] On Behalf
Of Len Hartka
Sent: Monday, May 03, 2010 11:55 AM
To: vantage@yahoogroups.com
Subject: [Vantage] Dealing with Big files in Migration

<snip>
As I understand it, Pervasive and Service Connect are somewhat mutually exclusive routes for importing data as part of a migration/implementation. Pervasive SQL is a third-party integration tool that writes directly to the Progress layer (and perhaps to SQL if that's your environment) and doesn't invoke the Vantage business logic. Presumably Epicor has implemented data integrity checking as part of the Pervasive tool.

It is reportedly pretty fast, but it requires you to send Epicor your data, and they send you back a database. Depending on your implementation approach, this may or may not be what you want. Our implementation was highly iterative with our data, and the delays imposed by waiting for Epicor to turn around a new database copy weren't what we wanted.

Service Connect is transactional, uses the Web Services-wrapped business objects, and Epicor will sell you workflows that get data into the system with validation, etc. You can also use them after you go live to bring more data in, modify them for your needs, and so on. But Service Connect is slow, given the overhead of the Web Services, and not what you want if you're moving a lot of data into Vantage/Epicor.

There's now also a new Data Migration Tool for Epicor 9, but I don't know much about it, or whether it is based on Pervasive or something new.

-bws

--
Brian W. Spolarich ~ Manager, Information Services ~ Advanced Photonix / Picometrix
    bspolarich@... ~ 734-864-5618 ~ www.advancedphotonix.com


-----Original Message-----
From: vantage@yahoogroups.com [mailto:vantage@yahoogroups.com] On Behalf Of Vic Drecchio
Sent: Monday, May 03, 2010 12:09 PM
To: vantage@yahoogroups.com
Subject: RE: [Vantage] Dealing with Big files in Migration

<snip>