Cloud Migration - replacing external data source

In our on-prem environment (10.1.500), we have a report that is built from two external tables, and those tables are filled/updated by an SP and some scheduled tasks. External data sourcing is going away (we are migrating to Gov Cloud), but I need to preserve the report. In a nutshell, it shows sales/cost data as daily, MTD, vs. Plan, and vs. Last Year, and it also shows Backlog. The report then slices and dices by division and also breaks out sections for focus customers.

I’ve built my two tables in UD Column Maintenance, and now I’ve got to figure out the best way to write to them. The 700-line SP does a lot of calculations with a pile of variables and temp tables, plus lots of INSERTs and MERGEs to build a sort of ‘master’ temp table, which finally gets written to one of the external tables. The other external table (sales plan) is updated by a monthly SQL task and is used in the SP as well.

I could use some advice on strategy. Nathan A. set me on a course to use the UD tables and BPMs, but it still seems like there are some forks in this road.

I’m a self-taught noob. Please help!

I think the UD tables and BPMs are going to be your best bet. You could try mimicking the SP in updateable BAQ(s) to assemble the data you want to add to the tables. Using a UBAQ lets you work in the Business Activity Query Designer, where you can build subqueries, calculations, etc., and it also lets you schedule when the query/updates will run. The UBAQ wouldn’t necessarily update anything itself, but it would allow you to put a BPM on the UBAQ’s GetList method via Updateable BAQ Directive Maintenance (hopefully you have this module, because this strategy requires it). That GetList directive would fire every time the UBAQ runs, and from there you can update a UD table with the UBAQ’s data.
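For the "update a UD table from the BPM" step, the custom code usually boils down to something like the sketch below. Treat it as a rough outline rather than working code: it assumes the target table is UD01, that it sits in a post-processing directive on the UBAQ's GetList, and the sales/cost values and key scheme are made-up stand-ins for whatever you pull out of the query's result tableset.

```csharp
// Rough sketch only -- UD01, the keys, and the values are placeholders.
// In the real directive, dailySales/dailyCost would come from the UBAQ's result tableset.
decimal dailySales = 12345.67m;
decimal dailyCost  =  9876.54m;

var ud01Svc = Ice.Assemblies.ServiceRenderer.GetService<Ice.Contracts.UD01SvcContract>(Db);

var ds = new Ice.Tablesets.UD01Tableset();
ud01Svc.GetaNewUD01(ref ds);                         // adds one blank row with defaults filled in

var row = ds.UD01[0];
row.Key1 = DateTime.Today.ToString("yyyy-MM-dd");    // example key scheme -- use your own
row.Key2 = "DIV01";                                  // hypothetical division code
row.Number01 = dailySales;
row.Number02 = dailyCost;

ud01Svc.Update(ref ds);                              // commits the new row to UD01
```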

Having trouble locating “Updateable BAQ Directive Maintenance”, licensed or not, in EAC.

I’m not sure which license it is (or which version it was introduced in), but a menu search should help (if you have the right permissions).

[screenshot of the menu path]

This is the path ours is at.

Oh, I see. I started looking here and thought I was maybe getting warm. Thanks a bunch for your input. I’m a little afraid of trying to re-birth this SP, but feeling like I might be on the right track at least.

Yeah it’s right there in the menu… hah. Thanks again.

Understandable. You may be able to call the SP in a custom code block, not sure.

Another option is to look at Epicor Functions since you’re on 10.2.500 already. They replace the UBAQ solution we had to use previously. You can then call the update routine as a REST call on a schedule that you choose.
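For what it's worth, the scheduled caller can be very small. The sketch below is untested and every name in it is a placeholder (server, company, library, function, credentials, API key), but it shows the general shape of hitting a Function through the v2 REST endpoint from a little console app you could run from Task Scheduler:

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class RefreshReportTables
{
    static async Task Main()
    {
        // Placeholders: your app server, company ID, function library, and function name.
        var url = "https://yourserver/YourInstance/api/v2/efx/YOURCOMPANY/SalesReport/RefreshReportTables";

        using var client = new HttpClient();

        // Basic auth with an Epicor account, plus the API key tied to the function library.
        var creds = Convert.ToBase64String(Encoding.ASCII.GetBytes("serviceUser:password"));
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", creds);
        client.DefaultRequestHeaders.Add("x-api-key", "your-api-key-here");

        // Empty JSON body, assuming the function takes no request parameters.
        var body = new StringContent("{}", Encoding.UTF8, "application/json");
        var response = await client.PostAsync(url, body);

        Console.WriteLine($"{(int)response.StatusCode}: {await response.Content.ReadAsStringAsync()}");
    }
}
```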

Mark W.

P.S. For the UBAQ, make sure your user ID has Advanced BAQ options enabled on the Options tab.

Not in the Cloud. The Stored Procedure won’t be there to call and there’s no access to the server to add it.

Circling back around to this randomly. I’ve studied some other examples, and I think I’d like to keep a UD table filled with three calendar years of dates (and a bunch of pre-calculated values: fiscal year, isWorkDay, isHoliday, etc.). This table would be rewritten any time Erp.ProdCalDay was updated, and I think it will have value beyond this one report. I have a SQL statement that works on-prem, but I need to convert it to a BPM, and I’m having trouble getting started. This looks close… Insert UD Table Records from a BPM … but I can’t see how to adapt it to my needs.
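In rough C# terms, the per-day logic I’m trying to end up with looks something like this (the fiscal-year rule and holiday list here are just stand-ins, and the real isWorkDay would come from Erp.ProdCalDay rather than a hard-coded weekend check):

```csharp
// Stand-in logic only, meant to live inside a BPM/Function custom code block.
var holidays = new HashSet<DateTime> { new DateTime(2025, 12, 25) };   // placeholder list

var start = new DateTime(DateTime.Today.Year - 1, 1, 1);
for (var day = start; day < start.AddYears(3); day = day.AddDays(1))
{
    int fiscalYear = day.Month >= 11 ? day.Year + 1 : day.Year;        // assumed Nov fiscal start
    bool isHoliday = holidays.Contains(day);
    bool isWorkDay = !isHoliday
                     && day.DayOfWeek != DayOfWeek.Saturday
                     && day.DayOfWeek != DayOfWeek.Sunday;

    // ...build one UD row per day here (Key1 = day.ToString("yyyy-MM-dd"), etc.)...
}
```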

Hi Justin,

Instead of doing all the work in a BPM, have you considered using an Epicor Function and then calling the Function from the BPM? You would have access to the UD table Business Object to do the data work.

Well, yes… but I’m no more practiced with functions than I am with authoring BPMs. I’ll spend some more time looking at that angle and see if something crystallizes. Thanks!

In either case, you would define the UD table dataset as a variable and then fill it by query or by code. Then you can do an Update (remember to clear the previous values and set your RowMod to add).
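Something along these lines is what I mean -- completely untested, UD02 and the column mapping are just examples, and it assumes the old rows have already been cleared out (same pattern, but with RowMod "D") and that it runs in a BPM or Function code block where Db and callContextClient are in scope:

```csharp
// Untested sketch of "fill the dataset in code, mark each row as an add, then Update".
var svc = Ice.Assemblies.ServiceRenderer.GetService<Ice.Contracts.UD02SvcContract>(Db);
var ds = new Ice.Tablesets.UD02Tableset();

for (var day = new DateTime(2025, 1, 1); day.Year < 2028; day = day.AddDays(1))
{
    var row = new Ice.Tablesets.UD02Row
    {
        Company    = callContextClient.CurrentCompany,
        Key1       = day.ToString("yyyy-MM-dd"),
        Key2       = "", Key3 = "", Key4 = "", Key5 = "",              // keys can't be left null
        Date01     = day,
        Number01   = day.Month >= 11 ? day.Year + 1 : day.Year,        // fiscal year stand-in
        CheckBox01 = day.DayOfWeek != DayOfWeek.Saturday && day.DayOfWeek != DayOfWeek.Sunday,
        RowMod     = "A"                                               // the "rowtype = add" part
    };
    ds.UD02.Add(row);
}

svc.Update(ref ds);   // one Update call writes the whole set
```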

Should work, says the guy not doing it…