DMT Template to SQL Call


Hey Folks,
Our implementation has me re-creating instances from a bare DB rather than copying an existing DB, because we have been changing many of our early setup and configuration choices as we learn more about the system. For instance, we were stuck with an incorrect currency code from early on, and even though we could change the way it was displayed, it made more sense to change the actual currency code during a DB refresh before we went live.

Because of this, and because I didn’t want to rely on building BAQs and maintaining them between systems, I endeavored to make a repo of SQL calls, PowerShell scripts, and playlist CSVs to help me, based on some of the work I’ve seen here on Epiusers (specifically this thread Cloning a company with BAQ Export / DMT - #5 by Paul_Millsaps).

I can’t be the only one to struggle through this, so I thought I’d post the SQL call I use to take a template created in DMT and turn it into a SQL query I can use to create a CSV. I will eventually post the larger project structure I use to automate the SQL-to-CSV process with PowerShell (you could take it further and run the DMTs too, but I merely save and run them via playlists).
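For context, the CSV-writing half of that automation is the easy part. Here’s a minimal sketch (in Python rather than the PowerShell I actually use; `rows_to_csv` and the sample values are just illustrative, and QUOTE_ALL is chosen to match the quoted headers DMT templates produce):

```python
import csv

def rows_to_csv(columns, rows, path):
    """Write query results to a CSV with every field quoted,
    matching the quoted header style of a DMT template."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f, quoting=csv.QUOTE_ALL)
        writer.writerow(columns)
        writer.writerows(rows)

# Sample values are made up; the columns match the User Codes template below.
columns = ["Company", "CodeTypeID", "CodeTypeDesc", "UDCodes#CodeTypeID",
           "UDCodes#CodeID", "UDCodes#IsActive", "UDCodes#CodeDesc"]
rows = [["EPIC06", "HOLD", "Hold Codes", "HOLD", "QA", "True", "Quality hold"]]
rows_to_csv(columns, rows, "UserCodes.csv")
```

In the real project the rows come from running the generated query against the DB; the point is only that the output needs to match what the DMT template expects.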

This was definitely a personal choice, and I’m not recommending everyone do it this way, but with 2024.2 allowing you to create a BAQ from SQL, this may at least save you a few clicks along the way if you are making BAQs for it.

The SQL call is at the bottom of this post, but as an example, if I build a User Codes template from DMT with the fields I want, I would have a CSV that looks like this:

"Company","CodeTypeID","CodeTypeDesc","UDCodes#CodeTypeID","UDCodes#CodeID","UDCodes#IsActive","UDCodes#CodeDesc"

The parent table name I need is UDCodeType (you can find it in the dropdown of the template builder in DMT where it says All).
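That header naming convention is what the script below keys off: a plain column name belongs to the parent table, while `Table#Column` names a column on a table that needs to be joined in. A quick Python illustration of the split (`classify` is a hypothetical helper, not part of the script):

```python
def classify(header, parent="UDCodeType"):
    """Map DMT template headers to (table, column) pairs:
    'CodeTypeID' belongs to the parent table, while
    'UDCodes#CodeID' means column CodeID on joined table UDCodes."""
    out = []
    for name in header.split(","):
        if "#" in name:
            table, col = name.split("#", 1)
        else:
            table, col = parent, name
        out.append((table, col))
    return out

header = "Company,CodeTypeID,CodeTypeDesc,UDCodes#CodeTypeID,UDCodes#CodeID"
print(classify(header))
```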

There are many ways to find the schema name, but if you already have SSMS or Azure Data Studio open, you might as well just check which schema the table is under. For UDCodeType it is Ice.

Next, copy and paste the query below, fill in your @tableName, @tableSchema, and @selectedColumns with the data we found above, and run it. If we ran it for User Codes we’d end up with the following result:

Started executing query at Line 1
(7 rows affected)
(1 row affected)

SELECT  
UDCodeType.Company,  
UDCodeType.CodeTypeID,  
UDCodeType.CodeTypeDesc,  
UDCodes.CodeTypeID AS [UDCodes#CodeTypeID],  
UDCodes.CodeID AS [UDCodes#CodeID],  
(CASE WHEN UDCodes.IsActive = 1 THEN 'True' ELSE 'False' END) AS [UDCodes#IsActive],  
UDCodes.CodeDesc AS [UDCodes#CodeDesc]  
FROM Ice.UDCodeType  
LEFT JOIN Ice.UDCodes ON /* Add your join condition here */

Total execution time: 00:00:00.187

In that result you can see the generated query:

SELECT  
UDCodeType.Company,  
UDCodeType.CodeTypeID,  
UDCodeType.CodeTypeDesc,  
UDCodes.CodeTypeID AS [UDCodes#CodeTypeID],  
UDCodes.CodeID AS [UDCodes#CodeID],  
(CASE WHEN UDCodes.IsActive = 1 THEN 'True' ELSE 'False' END) AS [UDCodes#IsActive],  
UDCodes.CodeDesc AS [UDCodes#CodeDesc]  
FROM Ice.UDCodeType  
LEFT JOIN Ice.UDCodes ON /* Add your join condition here */

The formatting is up to you, and so are the join conditions (for User Codes you’d use LEFT JOIN Ice.UDCodes ON UDCodes.CodeTypeID = UDCodeType.CodeTypeID). You can also see that it formats Boolean fields as True/False and names the returned columns to be compatible with DMT.

If it can’t find a field with the same name as the column header in the DMT template, it will comment it out in the SQL (see the Buyer example below).
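If it helps to see the decision logic outside of T-SQL, the per-column rules boil down to this (a Python sketch with a toy schema dict; `render_column` is illustrative, and in the real script the type lookup comes from INFORMATION_SCHEMA.COLUMNS):

```python
def render_column(name, parent, schema):
    """Mirror the generator's rules: unknown columns get commented out,
    bit columns become a True/False CASE expression, and joined
    (Table#Column) columns get a [Table#Column] alias for DMT."""
    table, col = name.split("#", 1) if "#" in name else (parent, name)
    dtype = schema.get((table, col))  # None means the column wasn't found
    if dtype is None:
        return f"    -- {name} -- Column not found in schema"
    expr = f"{table}.{col}"
    if dtype == "bit":
        expr = f"(CASE WHEN {expr} = 1 THEN 'True' ELSE 'False' END)"
    if "#" in name or dtype == "bit":
        return f"    {expr} AS [{name}]"
    return f"    {expr}"

schema = {("UDCodeType", "Company"): "nvarchar",
          ("UDCodes", "IsActive"): "bit"}
print(render_column("Company", "UDCodeType", schema))
print(render_column("UDCodes#IsActive", "UDCodeType", schema))
print(render_column("Bogus", "UDCodeType", schema))
```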

So what can you do with this, and why is it useful? I use it to extract, filter, manipulate, and clean data from our current test environment to load into our next one.

Did someone spell your name wrong when they entered you as a buyer? Do you want to make sure everyone has ConsolidatedPurchasing turned off? You could take the base Buyer query generated by the code and add a CASE to handle it:

SELECT
    Company,
    CASE
        WHEN BuyerID = 'DNOUNAN' THEN 'DNOONAN'
        ELSE BuyerID
    END AS BuyerID,
    -- PerConID,
    Name,
    POLimit,
    ApprovalPerson,
    --EMailAddress,
    'False' AS ConsolidatedPurchasing
FROM Erp.PurAgent

Do you want to make sure you don’t end up with duplicate Person/Contact records every time you enter people into the DB? You can join to the new DB based on known data (in this case, employee IDs that I’ve stored in a field our instance doesn’t use):

SELECT
    eb.Company,
    eb.EmpID,
    pc.PerConID AS PerConID,
    eb.FirstName,
    eb.LastName,
    eb.Name,
    eb.SupervisorID,
    eb.Shift,
    (CASE WHEN eb.Payroll = 1 THEN 'True' ELSE 'False' END) AS Payroll,
    (CASE WHEN eb.ShopSupervisor = 1 THEN 'True' ELSE 'False' END) AS ShopSupervisor,
    (CASE WHEN eb.ShipRecv = 1 THEN 'True' ELSE 'False' END) AS ShipRecv,
    (CASE WHEN eb.ProductionWorker = 1 THEN 'True' ELSE 'False' END) AS ProductionWorker,
    (CASE WHEN eb.CanRequestMaterial = 1 THEN 'True' ELSE 'False' END) AS CanRequestMaterial,
    (CASE WHEN eb.CanReportQty = 1 THEN 'True' ELSE 'False' END) AS CanReportQty,
    (CASE WHEN eb.CanReportScrapQty = 1 THEN 'True' ELSE 'False' END) AS CanReportScrapQty,
    (CASE WHEN eb.CanReportNCQty = 1 THEN 'True' ELSE 'False' END) AS CanReportNCQty,
    eb.ExpenseCode,
    eb.JCDept,
    eb.ResourceGrpID,
    eb.ResourceID,
    (CASE WHEN eb.AllowProduction = 1 THEN 'True' ELSE 'False' END) AS AllowProduction,
    eb.LaborRate
FROM Erp.EmpBasic eb
LEFT JOIN KineticUAT.Erp.PerCon pc ON pc.IssuerPrsnIDCode = eb.EmpID -- Take the Employee ID from the old DB, and match it to a PerCon record in the new DB's PerCon table. I use the IssuerPrsnIDCode because we don't use this field and it stays unchanged for us
WHERE eb.EmpStatus = 'A'

In the above example I could also change ResourceGrpID, JCDept, Shift, etc. if I need to make a change that was locked in on the old DB.

I realize this was a bit rambly; I’ll clean it up as time permits this week. If you have questions let me know, or find me at the conference Wednesday or Thursday.

Cheers,
Devon

The Super Useful SQL Call:

-- fill in the three variables below, then run the query through SSMS or Azure Data Studio, or your SQL tool of choice

DECLARE @tableName NVARCHAR(128) = ''; -- Name of the parent table for the DMT, e.g. UDCodeType for User Codes
DECLARE @tableSchema NVARCHAR(128) = 'Erp'; -- Erp, Ice, or whatever schema your table is in
DECLARE @selectedColumns NVARCHAR(MAX) = ''; -- The comma-separated list of DMT column names without quotes, including the join columns; e.g. for User Codes I would use Company,CodeTypeID,CodeTypeDesc,UDCodes#CodeTypeID,UDCodes#CodeID,UDCodes#IsActive,UDCodes#CodeDesc

-- reserved for the call below, don't touch these
DECLARE @columns NVARCHAR(MAX) = '';
DECLARE @joins NVARCHAR(MAX) = '';
DECLARE @sql NVARCHAR(MAX) = '';

-- Step 1: Split the selected column list into a temp table, trimming any stray spaces
SELECT LTRIM(RTRIM(value)) AS COLUMN_NAME
INTO #ColumnList
FROM STRING_SPLIT(@selectedColumns, ',');

-- Step 2: Identify columns that are bit (boolean-like), columns from other tables, and other columns
SELECT @columns = STRING_AGG(CAST(
    CASE 
        WHEN CHARINDEX('#', cl.COLUMN_NAME) > 0 THEN
            CASE
                WHEN jc.COLUMN_NAME IS NULL THEN
                    '    -- ' + cl.COLUMN_NAME + ' -- Column not found in schema'
                WHEN jc.DATA_TYPE = 'bit' THEN
                    '    (CASE WHEN ' + jt.TableName + '.' + PARSENAME(REPLACE(cl.COLUMN_NAME, '#', '.'), 1) + ' = 1 THEN ''True'' ELSE ''False'' END) AS [' + cl.COLUMN_NAME + ']'
                ELSE
                    '    ' + jt.TableName + '.' + PARSENAME(REPLACE(cl.COLUMN_NAME, '#', '.'), 1) + ' AS [' + cl.COLUMN_NAME + ']'
            END
        WHEN c.COLUMN_NAME IS NULL THEN
            '    -- ' + cl.COLUMN_NAME + ' -- Column not found in schema'
        WHEN c.DATA_TYPE = 'bit' THEN 
            '    (CASE WHEN ' + @tableName + '.' + c.COLUMN_NAME + ' = 1 THEN ''True'' ELSE ''False'' END) AS ' + c.COLUMN_NAME
        ELSE 
            '    ' + @tableName + '.' + c.COLUMN_NAME
    END AS NVARCHAR(MAX)), ',' + CHAR(10)
) -- @joins is built (one line per joined table) in Step 3 below
FROM #ColumnList cl
LEFT JOIN INFORMATION_SCHEMA.COLUMNS c ON c.COLUMN_NAME = cl.COLUMN_NAME AND c.TABLE_NAME = @tableName AND c.TABLE_SCHEMA = @tableSchema
LEFT JOIN INFORMATION_SCHEMA.COLUMNS jc ON jc.COLUMN_NAME = PARSENAME(REPLACE(cl.COLUMN_NAME, '#', '.'), 1) AND jc.TABLE_NAME = PARSENAME(REPLACE(cl.COLUMN_NAME, '#', '.'), 2) AND jc.TABLE_SCHEMA = @tableSchema
LEFT JOIN (
    SELECT DISTINCT PARSENAME(REPLACE(cl.COLUMN_NAME, '#', '.'), 2) AS TableName
    FROM #ColumnList cl
    WHERE CHARINDEX('#', cl.COLUMN_NAME) > 0
) jt ON jt.TableName = PARSENAME(REPLACE(cl.COLUMN_NAME, '#', '.'), 2);

-- Step 3: Ensure only one join per table
SELECT DISTINCT PARSENAME(REPLACE(cl.COLUMN_NAME, '#', '.'), 2) AS TableName
INTO #JoinTables
FROM #ColumnList cl
WHERE CHARINDEX('#', cl.COLUMN_NAME) > 0;

SELECT @joins = STRING_AGG(CAST(
    'LEFT JOIN ' + @tableSchema + '.' + jt.TableName + ' ON /* Add your join condition here */' AS NVARCHAR(MAX)), CHAR(10))
FROM #JoinTables jt;

-- Step 4: Construct the SELECT query
SET @sql = 'SELECT ' + CHAR(10) + @columns + CHAR(10) + 'FROM ' + @tableSchema + '.' + @tableName + CHAR(10) + ISNULL(@joins, '');

-- Step 5: Output the generated SQL query
-- SELECT @sql AS GeneratedSQL;
WHILE LEN(@sql) > 0
BEGIN
    PRINT LEFT(@sql, 4000);  -- PRINT only handles up to 4000 characters at a time
    SET @sql = SUBSTRING(@sql, 4001, LEN(@sql));  -- Continue from where left off
END

-- Cleanup
DROP TABLE #ColumnList;
DROP TABLE #JoinTables;

Hi Devon,
first of all, good luck - there is a lot of work ahead of you and quite a few ways to go wrong even with the best of intentions. Pay attention when importing contacts - if you import one person, delete them (via DMT), and import again, the two records will not be 100% identical.
Also, as a personal preference, I would build the BAQs needed for export, put them into dashboards for end users to see and validate, and once they say it’s ok, then export the data and import it into the new environment.
If you put the logic in the SQL query, the user sees something on the screen (consolidated purchasing), says it’s ok, and you import it without it - and then it might be wrong. It might work your way if you have 5, 6, or 20 rules, but if you have 300 and users change their minds (or don’t communicate with each other), then of course: who imported the wrong data, and who is to blame?
Devon 🙂
