Filters not working on some BOs when called through the API, or inconsistent

Been working on some automation using the Kinetic APIs on 2023.2 and have been getting some strange behaviour. Just wondering if anyone has any insight on these, or can tell me what I’m doing wrong.

In some instances, I need to use $select and $filter to restrict the amount of data returned, but using the filters results in some odd behaviour. For example:

Using Erp.BO.PartSvc/PartUOMs to get a list of all UOM information for a given part, $filter=PartNum eq 'partnumber' works just fine, unless the part number contains a non-alphanumeric character like a hyphen or a period. Then the query returns status 200 OK but either has no content or returns everything, like every part number.
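For reference, here is roughly how I am building the request, with the OData string literal single-quoted and the whole filter URL-encoded (the part number and server name below are made up):

```python
# Build a PartUOMs request with a $filter on PartNum. The part number and
# server name here are hypothetical placeholders.
from urllib.parse import quote

part_num = "100-200.A"  # hypothetical part number with a hyphen and a period
# OData string literals are single-quoted; any embedded single quote
# must be doubled before encoding.
literal = part_num.replace("'", "''")
filter_expr = f"PartNum eq '{literal}'"

url = (
    "https://server/api/v1/Erp.BO.PartSvc/PartUOMs"
    "?$select=PartNum,UOMCode"
    "&$filter=" + quote(filter_expr, safe="")
)
print(url)
```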

Then on other BOs, like SalesOrder, applying any sort of filtering this way just returns an empty 200 OK response with no content. Just “value”.

I know there are URLs preconfigured with parameters, but I cannot use them for everything, as they do not always have the parameters I need to filter on, so I need to be able to use my own. The results, though, can be hit or miss.

Just returning everything and then sifting through the response in code is not really an option: due to the size of the returned JSON, it can take a while to get the response, which really slows down the application.

Anyone got any advice on this, or am I just doing something wrong?

The OData methods can be difficult to work with, and are not always reliable.

I assume, from what I quoted, that you don’t want to (or can’t) use the custom methods because not all of them filter on what you need.

In that case, I would recommend you make functions that do the filtering for you, and call those instead.


Also, please step back and give us more context on what you are doing and why. That may help us give better advice.

3 Likes

So what you are saying, is, I need to stop procrastinating and just get on with learning how to use functions in Kinetic…

Fiiiiiiiine.

To the documentation I go.

3 Likes

You won’t regret it. As a developer and an administrator, it really makes my life easier. And using these with the new Kinetic interface makes customizations on forms so much more powerful.

3 Likes

In this case, if you just want to fetch data, simply calling a BAQ might be easier than a function.

You can pass parameters and/or filters to the BAQ, which is nice.
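For example, a rough sketch of the URL I mean (the BAQ name, company ID, and server here are all hypothetical; the real paths are listed on your server’s REST help page):

```python
# Call a BAQ over REST with both a BAQ parameter and an OData filter.
# BAQ name (MyPartBaq), company (MYCOMPANY), and server are made up.
from urllib.parse import urlencode

BASE = "https://server/api/v2/odata/MYCOMPANY"  # hypothetical base URL

query = urlencode({
    "PartNum": "100-200.A",       # BAQ parameter defined in the BAQ itself
    "$filter": "OnHandQty gt 0",  # OData filter applied to the result set
    "$top": "50",                 # cap the number of rows returned
})
url = f"{BASE}/BaqSvc/MyPartBaq/Data?{query}"
print(url)
```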

4 Likes

Agreed. I have found it so much easier to only ever use two endpoints when calling Kinetic API:

- GET BAQ to retrieve data

- POST Function to create/update/delete (where the Function uses Business Object method calls to do what it needs to do)
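As a hedged sketch of the POST side (the library/function names, base URL, and body fields below are hypothetical; match them to your Function’s actual request signature):

```python
# Invoke an Epicor Function with a JSON body. Library/function names
# (MyLib/UpdatePart), company, server, and fields are all hypothetical.
import json
import urllib.request

BASE = "https://server/api/v2/efx/MYCOMPANY"  # hypothetical base URL

payload = {"partNum": "100-200.A", "newDescription": "Widget, 100mm"}
req = urllib.request.Request(
    f"{BASE}/MyLib/UpdatePart",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would actually send it; not executed here.
print(req.get_method(), req.full_url)
```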

4 Likes

@TomAlexander would you mind posting an example of the GET and POST as you are using them? I am not familiar with this approach. Thank you!

2 Likes

In the REST Help page you can browse the available BAQs and Functions.

Yes, that is the feedback I have had from a few people, but here is the issue I have with this.
If I want to develop something purely for internal use, then why even bother with the API approach? Just do it in functions. However, if I want to use the APIs to create an application that I can then give to someone else, I don’t want to have to say: here is the application :smiley:

and the 5 functions and 37 BAQs that you need to import into your environment to make it work… :no_mouth:

Kind of defeats the point a little.

Just my 2c

4 Likes

What are you thinking, @klincecum? :grinning:

1 Like

I’m always thinking. It’s a damn curse!

I’ll let you know tomorrow. Right now I’m thinking about food. :cut_of_meat:

3 Likes

Valid point! My perspective is more about individual customizations where I find it easier to do the “heavy lifting” inside Epicor (BAQs/Functions). But if you’re building a product it’s a different perspective.

3 Likes

Setting aside that functions are also APIs…this hits several of the hot architectural topics today: coupling vs. cohesion and monoliths vs. microservices.

In one of @klincecum’s Wiki posts, we talk about coupling and cohesion, so no need to repeat that here.

But monoliths (one big application) versus microservices (many small services) have been the hot topic in software architecture circles lately. The current take is that, like coupling and cohesion, we need balance, and the prevailing suggestion is to build a modular monolith to start with, then evolve the system for particular business capabilities as required. The key here is modularity.

This has been the battle between Unix/Linux and Windows, CISC and RISC, and in software architecture for years. Do you build small things that do one thing really well and assemble them as needed, or do you build specialty things that are optimized for one task but not easily changed? At least for now, Linux and ARM appear to be winning the day.

Companies like AWS, Google, and Microsoft are API-First organizations. This reduces coupling and allows them to react to changes and build new things quickly. At Microsoft, the API comes first. Next, they build a PowerShell script to interact with the API. Finally, they may add a UI. This provides automation out of the box but also a guided experience. Importantly, each piece is easily testable on its own and not only during end-to-end testing.

In the end, it’s all about balance. Do we need 37 BAQs or three? One Function Library or more? How much duplicate code do we want to manage if it’s in five single-purpose function libraries? :person_shrugging: What is easier to manage and evolve?

That depends on the point I imagine. What is the point being defeated here?

3 Likes

I was exaggerating to make a point, obviously.

But what I was getting at was more that the REST APIs are there but seem to have some issues when it comes to implementation. It’s fantastic that they exist and enable a world of possibilities to build products that interact directly with the ERP and your data. And there are obviously products that already do, like EKW and Sugar CRM etc., using the OData approach.

It was more my frustration that it SHOULD be possible to create an application that interacts with the system entirely through the REST APIs (and yes, you can). This would mean you could create a turnkey application that does a thing Kinetic cannot do itself, without requiring any modification to the client’s environment.
Then when I hit a bump in the road, the suggestion was to work around it with modifications inside the Kinetic application. However, depending on the context of what you are trying to build, that might not be an option.

So, my point was: if you cannot build a solution that interacts with the ERP entirely via REST without also having to build custom Functions or BAQs to work around the fact that some endpoints do not function as documented, then you may as well just build your solution as a collection of Functions, BAQs, and form customisations instead.

And that’s a shame, as the fact that the REST APIs are there is such a cool and powerful tool. And I want to use them to build cool stuff :grin:

Oh, I hear you, and I agree with you for the most part. My caution is: if we could control any application through an API in a Remote Procedure Call way, SHOULD we? :thinking:

Going back to at least the 90s, computer scientists have been studying distributed computing. During that time, they came up with the Fallacies of Distributed Computing, which are:

  • The network is reliable
  • Latency is zero
  • Bandwidth is infinite
  • The network is secure
  • Topology doesn’t change
  • There is one administrator
  • Transport cost is zero
  • The network is homogeneous

So while I like, no, love the idea of REST integration, I need to think about what to do if the network is not available, who’s sniffing the network, how much data I’m trying to push over it, how long I’m willing to wait, etc.
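As one small mitigation for the first fallacy, here is a minimal retry-with-backoff sketch; the flaky endpoint is just a stand-in for a real REST call:

```python
# Retry a call with exponential backoff when the network misbehaves.
# The "endpoint" below is a fake stand-in, not a real REST client.
import time

def call_with_retry(fn, attempts=4, base_delay=0.5):
    """Retry fn() on ConnectionError, doubling the delay each attempt."""
    for attempt in range(attempts):
        try:
            return fn()
        except ConnectionError:
            if attempt == attempts - 1:
                raise  # out of retries: surface the failure to the caller
            time.sleep(base_delay * (2 ** attempt))

# Usage: a fake endpoint that fails twice before succeeding.
calls = {"n": 0}
def flaky_fetch():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("the network is not reliable")
    return {"value": []}

result = call_with_retry(flaky_fetch, base_delay=0.01)
print(result, calls["n"])
```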

How do the various APIs (not just Kinetic’s) react in these situations? I’d like to build cool stuff too. Can I make it performant, reliable, and secure at the same time? That was the point of my reply above.