SQL Question / Direct Import

Jason,



I don't know all the specifics, but I believe some later versions of
Epicor have an option besides Service Connect for EDI. It may be
called EDI Direct, or something similarly named. That may be an option for
you to discuss with your CAM.



Kevin Simon



From: vantage@yahoogroups.com [mailto:vantage@yahoogroups.com] On Behalf Of
Elizabeth
Sent: Friday, May 25, 2012 1:46 PM
To: vantage@yahoogroups.com
Subject: [Vantage] Re: SQL Question / Direct Import





At a previous employer we did this through Service Connect (a consultant had
set it up). I don't think that would make your time window though... as I
recall we did about 15k lines in about 4 hours.

It all came to us as a flat text file, and the SC workflow did all the
folding, bending, spindling, and mutilating so the next morning it was all
there.

Ernie Lowell
Diba Industries

--- In vantage@yahoogroups.com, "Jason Dearth" <jdearth@...> wrote:
>
> We are still not completely satisfied with how long Direct Import (EDI)
> takes to process all of our incoming demand, and we might consider
> writing our own solution directly into SQL, as we did with our previous
> ERP system. Our goal is to get all of our demand into the system (new or
> updated) daily, within about a 2-hour window. We currently have about
> 75,000 schedules from our customers. I can currently only run this on
> the weekend; I have gotten it down from a 2-day affair to about one day,
> but that is still far short of what we need. This is a huge delta, so I
> doubt that 700 will get us the rest of the way, but I will be testing
> that over the next few weeks. (I know daily imports sound crazy, but our
> customers make changes every 4 hours that affect quantity and dock
> code... and we want to use an automated ASN system so we can cut our
> shipping transactions in half... several hundred transactions daily.)
>
>
>
> My question is: has anyone written directly to SQL in an Epicor table?
>
> Does this bypass the business objects Epicor uses if we do it from
> within the application?
>
> There appear to be 'checksum'-type fields in each table... is that
> true, and what impact would they have on the approach I am considering?
>
>
>
> I can download all the demand from the customer portals through the
> processing center to our servers in a total of about 8 minutes, writing
> the individual flat files required by Direct Import. We would like to
> query the necessary data from Epicor, do the validation and matching in
> memory, and then write to the tables. I think Epicor takes the approach
> of strictly processing each record through individually, which takes an
> incredible amount of time.
>
>
>
> I need to get this down to within our 2-hour window (or something close
> to it), but I don't want to do anything stupid and end up with data
> corruption either. Any advice would be greatly appreciated if it saves
> me exploration time.
>
>
>
> Thanks,
>
> Jason Dearth
>
> Ferco Aerospace Group
>
Firstly, writing to SQL will not inherently corrupt your data. No need to
worry about checksums or the like.

It does bypass all the business logic, for better or worse. While this
certainly makes loading data dramatically faster, it also makes it much
easier to hose your DB. If you're going this route, you need to be certain
that you know every change that gets made when a new schedule is imported.
Run a trace on the process, see what methods get called, and figure out
what each method does.

Generally I would never suggest bypassing the business objects, but if
you've got to do it, do it right. As long as your SQL script ends up making
the same changes to the DB as you would get by going through the BOs,
you're in good shape.
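The "figure out what each method does" step can be approximated by
snapshotting the affected tables before and after running the process the
normal way through the UI, then diffing the snapshots. A minimal sketch,
using SQLite as a stand-in for SQL Server; the `ShipDtl` table and its
columns here are simplified, hypothetical examples, not the real Epicor
schema:

```python
import sqlite3

def snapshot(conn, table):
    """Capture every row of a table as a set of tuples for diffing."""
    return set(conn.execute(f"SELECT * FROM {table}"))

def diff_snapshots(before, after):
    """Return (rows added, rows removed) between two snapshots."""
    return after - before, before - after

# Demo with a hypothetical, simplified schedule table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ShipDtl (PackNum INTEGER, PartNum TEXT, Qty REAL)")
conn.execute("INSERT INTO ShipDtl VALUES (1, 'A-100', 50)")

before = snapshot(conn, "ShipDtl")
# ... run the process through the UI here, then re-snapshot ...
conn.execute("INSERT INTO ShipDtl VALUES (2, 'B-200', 25)")
after = snapshot(conn, "ShipDtl")

added, removed = diff_snapshots(before, after)
print(added)    # rows the process created
print(removed)  # rows it deleted (or the old versions of updated rows)
```

Repeating this per table touched by the trace tells you exactly which
changes your direct-SQL script has to reproduce.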

On Thu, May 24, 2012 at 5:54 PM, Jason Dearth <jdearth@...> wrote:



--
*John Driggers*
*High End Dev, System Design, Profit Drinking*
*:: 904.404.9233*
*:: waffqle@...*
*:: NO FAXES*
If you've got the technical knowledge in house, or the money to hire a
consultant, then go with the business objects. I never recommend anyone
write to the DB unless they absolutely have to.
One of the things you'll run into is deadlocks, if you are writing to a
table or record while someone else is using the app.

Just my 2 cents.

*Jose C Gomez*
*Software Engineer*

*check out my new blog <http://www.usdoingstuff.com>*

T: 904.469.1524 mobile
E: jose@...
http://www.josecgomez.com

*Quis custodiet ipsos custodes?*

On Thu, May 24, 2012 at 6:38 PM, John Driggers <waffqle@...> wrote:

Jason,



I'm insane: I'm a huge fan of circumventing the slow business logic and
pumping data directly into SQL. It's safe, blazingly fast, and completely
accurate when orchestrated *perfectly*. But leave no room for mistakes;
otherwise you'll be rolling back to last night's backup.



As John said, run your traces. Test in your pilot/test/sandbox environment
three dozen times before implementing it live. There are tools out there to
help you analyze before-and-after differences in your DB's tables; use
them. Do your process the safe, condoned way via the UI, analyze the
before-and-after differences it made, and check table rowcounts. Use SQL
Profiler to watch what occurs, then build your SQL scripts to precisely
reproduce those changes. Use WITH (NOLOCK) hints on your SELECT queries.



If done right, it's safe. I've inserted and updated directly in SQL,
simulating various processes and mass record imports, more times than
Lindsay Lohan has been in court. That's a lot. ;-)



You can mitigate your risk by leaving yourself breadcrumbs: if you touch a
record, also update an unused UD field. I like setting a NumberXX field to
something like 999 when I touch a record directly. That way, during the
analysis and testing process, you can easily identify the records you've
hit. Breadcrumbs are also good when the potential "oops" happens and you
need to surgically undo something. It's always easier to fix something when
you have a flag set: "oops"... `UPDATE Whatever SET Whatever WHERE
Number01 = 999`.
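The breadcrumb pattern, sketched with SQLite standing in for SQL Server.
The `OrderRel` table is simplified and `Number01` is used as the spare UD
field, per the convention above; both are illustrative, not the real
Epicor schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE OrderRel (OrderNum INTEGER, Qty REAL, Number01 REAL)")
conn.executemany("INSERT INTO OrderRel VALUES (?, ?, 0)",
                 [(1, 10), (2, 20), (3, 30)])

# Direct write: make the change AND drop the breadcrumb in one statement.
conn.execute("UPDATE OrderRel SET Qty = Qty * 2, Number01 = 999 "
             "WHERE OrderNum IN (1, 3)")

# Later, every row the script touched is trivially identifiable...
touched = conn.execute(
    "SELECT OrderNum FROM OrderRel WHERE Number01 = 999 "
    "ORDER BY OrderNum").fetchall()
print(touched)  # [(1,), (3,)]

# ...and an "oops" can be surgically undone using only the flag.
conn.execute("UPDATE OrderRel SET Qty = Qty / 2, Number01 = 0 "
             "WHERE Number01 = 999")
```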



It's risky and rogue, but definitely doable. Be careful. :-)







Vic



From: vantage@yahoogroups.com [mailto:vantage@yahoogroups.com] On Behalf Of
Jason Dearth
Sent: Thursday, May 24, 2012 5:54 PM
To: vantage@yahoogroups.com
Subject: [Vantage] SQL Question / Direct Import







