All available tables for Kinetic for BAQs

Does anyone know of a way to access a list of all available tables in Kinetic that I can export to Excel and have on hand? I’m trying something out, and it would be helpful to have that list to refer back to for my project.

1 Like

Ice.ZDataTable should have everything you need

4 Likes

Awesome. I will check it out.

I guess I’m not familiar enough with Kinetic 2024.1 to know where to go to see the list of tables in Ice.ZDataTable.

Just run a BAQ on that table, and put in the field DBTableName. Add any other fields to get extra data.
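
If it helps to see it as plain SQL, this is roughly the query that BAQ boils down to (DBTableName is the field mentioned above; Description is an assumption from memory, so swap it out if it isn’t there in your release):

```sql
-- Rough sketch of the query behind a simple ZDataTable BAQ.
-- DBTableName is the field called out above; Description is assumed
-- from memory and may differ in your release.
SELECT
    ZDataTable.DBTableName,
    ZDataTable.Description
FROM Ice.ZDataTable
ORDER BY ZDataTable.DBTableName
```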

2 Likes

You could also just use the Data Dictionary Viewer to view this data without needing to create any BAQs

3 Likes

Sometimes I feel that the more I work in Kinetic, the less I understand it. Could you help me out and tell me where I find that?

2 Likes

Search for “Data Dictionary Viewer” in the application menu. It is under the System Setup → System Maintenance submenu.

Common feeling amongst us all. You’ve come to the right place to read, learn and commiserate with others in the same boat.

7 Likes

I’m just starting out learning programming, so I’m probably not the best judge of these things, but when they designed this software, did they just tell the programmers what they wanted it to do and let them decide how things worked and flowed, rather than sitting down with them step by step and essentially saying “make it more user friendly and simpler”? Just me nitpicking, I suppose.

2 Likes

Nitpicking…nah, not really. It’s a common complaint - all the cards/tabs/whatevers don’t always flow as smoothly as you’d expect, or as your previous ERP platform did. It takes time to adapt - either by getting used to their flow, or by redefining it to what you need.

We all start somewhere. Deep knowledge of Epicor doesn’t come overnight. We’ve been live for 2 years and I’ve barely scratched the surface. We’ve stayed fairly vanilla (with the exception of EDI/big-box retail requirements/mods)…but there are areas I’d love to have the time to enhance.

2 Likes

Well, it seems like a typical Monday. I’m trying to find the Data Dictionary Viewer, but it doesn’t come up in a search, and when I try to follow the directions through the erphelp Zendesk, I must not have permissions to go through System Setup > System Maintenance > Data Dictionary Viewer. So… when is Kinetic getting an AI that you can just tell what you want and have it build it automatically with minimal tweaking? :upside_down_face:

I agree!! Who knew it would be so difficult to run a report, or modify one to print a different label size… It’s 2025… it shouldn’t be this hard!

1 Like

Just wanted to jump back here, as it has been a while, and get some things squared away. The reason I was looking for a way to export the tables, indexes, and fields was that I decided to test building a custom agent in ChatGPT geared specifically toward teaching how to build BAQs, with step-by-step instructions for doing it in Epicor Kinetic. Having the tables, indexes, and fields on hand eliminates a lot of the agent’s guessing about what the fields I need are named when doing joins and other things. For example, ZDataSet has DataSetId but ZDataTable does not. Small things like that can be frustrating.

So what I have been doing is creating a spreadsheet with the more common tables and fields I use, uploading the Excel file to the agent along with the PDFs for specific Kinetic GUI screens and functions, and then telling it what kind of BAQ I want to create. If it tells me to use a table I haven’t used before, I update the Excel file, delete the old upload, and replace it with the updated one.

I’m still testing, but it has helped me build a few simpler BAQs that I use, with some calculated fields and such. It also explains along the way why tables need to be joined this way or that way. Since it still makes mistakes, I’m learning enough to spot where it might have gone wrong and go back and correct it. Just figured I’d share the use case with everyone.
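
For what it’s worth, instead of maintaining the spreadsheet entirely by hand, the table/field list can be pulled straight out of the data dictionary tables and exported to Excel. A rough sketch of the idea (the Ice.ZDataField column names here - DataTableID, FieldName, Description - are assumptions from memory, so verify them before relying on this):

```sql
-- Sketch: one row per table/field pair, ready to export to Excel.
-- Ice.ZDataField column names (DataTableID, FieldName, Description)
-- are assumed from memory; verify them in the Data Dictionary Viewer.
SELECT
    zt.DBTableName,
    zf.FieldName,
    zf.Description
FROM Ice.ZDataTable AS zt
INNER JOIN Ice.ZDataField AS zf
    ON zf.DataTableID = zt.DataTableID
ORDER BY zt.DBTableName, zf.FieldName
```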

By the way, has anyone ever run into this site while trying to google anything about SQL to BAQ for Kinetic? (BAQ Generation) Looks like I stumbled across some sort of testing page for something that Epicor is working on. One of the links on the page is using OpenAI.

Also, if anyone really wants to get a kick out of something, upload any PDF documentation or small YouTube tutorials about Kinetic to NotebookLM and, once it is done going through everything, use the Audio Studio. It turns the information in the docs and videos into a two-person podcast kind of thing that you can listen to. It’s cool and creepy at the same time.

Hi Shawn, I have a working JSON table/field/relationship schema that I’m using to feed an LLM for BAQ generation; combined with a very long set of instructions, it leads to pretty decent BAQs. The data is current as of 2025.1.8 (I wrote a function to pull it out using REST and strip the fields that aren’t relevant to LLM BAQ generation).

Here’s the instruction set I’m using:

You and I are both T-SQL, Epicor, and C# experts. We enjoy communicating in clear, concise language. Assume interrogatory prompts are intended to generate a BAQ/T-SQL query. Use the provided files as a reference when building SQL queries: tableschema.txt describes the Epicor SQL table layout, fieldschema.txt describes the Epicor SQL field layout, and DatasetRelationships_0.xlsx describes the parent and child fields/tables to use when forming a table relationship. When replying, do not re-phrase the prompt as part of the reply.

- When referencing a calculated field or expression elsewhere in the query (such as in the FROM, WHERE, GROUP BY, or ORDER BY clauses), always use the full expression instead of its column alias. Assume the query processor is not aware of the alias and requires the full expression in other areas.
- When a query needs columns not present in the current table, reference the relationship Excel file to determine how to join related tables to reach the desired column(s).
- Always use unique table aliases, even in subqueries and CTEs. If the same table appears in a subquery and the top-level query, it must use a different alias in each instance. Refer to tables with the notation [Schema].[TableName] AS [UniqueAlias], such as [Erp].[Part] AS [Part_1].
- Always use unique aliases when referencing a column. For example, if referencing Part.PartNum and InvcDtl.PartNum, reference the first as [Part].[PartNum] AS [Part_PartNum] and the second as [InvcDtl].[PartNum] AS [InvcDtl_PartNum] - always use the syntax SELECT [TableName].[FieldName] AS [TableName_FieldName].
- When creating a calculated field in a SELECT statement, the alias must always be [Calculated_ExpressionName], replacing "ExpressionName" with the assigned name of the calculated field. The expression must always be wrapped in parentheses.
- When referencing tables and columns inside a calculated field or expression (such as within SUM(), COUNT(), or custom calculations), use the syntax TableAlias.FieldName with no [ ] brackets around either the table name or the field name. Only use [ ] brackets for column or table aliases outside of calculated expressions, never inside aggregation or calculation syntax.
- Always include a WHERE clause stating "Company = @CurrentCompany" when the input prompt does not specify a company name. Use Company as a column in a join anywhere it is present in the provided relationship Excel file.
- When using STRING_AGG, always use the syntax STRING_AGG(CONVERT(nvarchar(max), Column), ',') to avoid possible errors. Default to concatenating with ', ' unless otherwise specified.
- When referencing InvcHead dollar amounts, reference DocInvoiceAmt.
- When limiting rows in T-SQL, use SELECT TOP (x) syntax.
- After generating a completed SQL statement, check fieldschema.txt to validate that every referenced column exists on the table it is referenced from; if a column is missing, use the relationship Excel to get it from another table, or re-formulate the query to use only columns present in fieldschema.txt. Run a triple check (and a test run) on the completed SQL for the same, and correct as applicable.
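
To make those rules concrete, here’s the shape of output they’re aimed at - a small hand-written example rather than model output, using Erp.Part and Erp.PartWhse as stand-ins (the PartWhse quantity column name is from memory, so treat the columns as illustrative):

```sql
-- Illustrative only: demonstrates the alias, calculated-field, and
-- @CurrentCompany conventions from the instructions above. Column names
-- (e.g. OnhandQty) are from memory and may not match your schema exactly.
SELECT TOP (100)
    [Part_1].[PartNum]          AS [Part_PartNum],
    [Part_1].[PartDescription]  AS [Part_PartDescription],
    (SUM(PartWhse_1.OnhandQty)) AS [Calculated_TotalOnHand]
FROM [Erp].[Part] AS [Part_1]
INNER JOIN [Erp].[PartWhse] AS [PartWhse_1]
    ON  [PartWhse_1].[Company] = [Part_1].[Company]
    AND [PartWhse_1].[PartNum] = [Part_1].[PartNum]
WHERE [Part_1].[Company] = @CurrentCompany  -- BAQ constant per the rules above
GROUP BY [Part_1].[PartNum], [Part_1].[PartDescription]
ORDER BY [Part_1].[PartNum]
```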

field-schema.txt (35.7 MB)
table-schema.txt (614.0 KB)
Dataset Relationships_0.xlsx (102.2 KB)

2 Likes

What LLM are you using for this?

Perplexity Pro, uploaded to a “space” - I’ve got a separate space configured for each topic (C# coding in BPMs/Functions, BAQs, etc.). The narrower the scope of the instructions plus reference data, the “better” the AI is at kicking out code with fewer hallucinations/mistakes.

I’m writing some Function code/App Studio customizations to better learn Epicor’s guts, and one of the things on my radar is a “dump your Epicor schema to some JSON files” button. That will be more useful than the static data I’ve assembled and provided here, since it can iterate through all your UD fields/tables and add them to the schema as well, bringing them into the AI model. Any time you add a field, you could update your model. There is also some dataset optimization to be done: I have SysRowID and SysRevID included, and they are largely irrelevant to the model, so we can save some tokens by stripping those too.
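
Until that button exists, the same idea can be roughed out straight from SQL - a quick hedged sketch (Ice.ZDataField and its column names are assumptions from memory; the point is just dropping SysRowID/SysRevID before the data ever reaches the model):

```sql
-- Sketch: dump the field schema as JSON, skipping the columns the model
-- doesn't need. Ice.ZDataField column names are assumed from memory.
SELECT
    zf.DataTableID,
    zf.FieldName,
    zf.Description
FROM Ice.ZDataField AS zf
WHERE zf.FieldName NOT IN ('SysRowID', 'SysRevID')
ORDER BY zf.DataTableID, zf.FieldName
FOR JSON PATH
```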

That, combined with a shared “Data Dictionary Overlay”, would let us layer additional metadata on tables/fields. Epicor’s out-of-the-box descriptions are not very descriptive, so the LLM can get stuck translating plain-English prompts to SQL.

It also performs better when it can see the output of Epicor’s BAQ Designer “Analyze” - and, again, the output from when the BAQ runs if there is some warning/error, etc. - so it benefits from API integration where you can set up a feedback loop and have it iterate on the prompt a few times. I’m planning to tinker with that when I get some extra $ to buy API tokens :slight_smile:

1 Like

I’ve been looking through the files you uploaded. These are great. Would you mind if I used them for the agent I’m working on?

Yes sir! That’s what it’s for! Would be interested to see what cool things come from it!

Epicor had a demo of an LLM tied into the BAQ Designer at Insights; it had some hiccups but was conceptually very cool. Prompt to data right in the LLM, without any intermediate SQL being visible to the user. It might be what’s at your labs link above!

1 Like