Been working on some automation using the Kinetic APIs on 2023.2 and have been getting some strange behaviour. Just wondering if anyone has any insight, or can tell me what I'm doing wrong.
In some instances I need to use $select and $filter to restrict the amount of data returned, but using the filters produces some odd behaviour. For example:
Using Erp.BO.PartSvc/PartUOMs to get a list of all UOM information for a given part, $filter=PartNum eq 'partnumber' works just fine, unless the part number contains a non-alphanumeric character like a hyphen or a period. In that case the query returns status 200 OK but with no content, or it returns everything, i.e. every part number.
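For what it's worth, one thing that has bitten me with OData filters is quoting and percent-encoding of the string literal. A minimal sketch in Python of building the filter clause defensively (the host and service path are placeholders, and `build_filter_url` is just an illustrative helper, not anything Epicor ships):

```python
from urllib.parse import quote

def build_filter_url(base_url: str, part_num: str) -> str:
    """Build an OData URL with a percent-encoded $filter clause.

    OData string literals use straight single quotes, and an embedded
    single quote is escaped by doubling it. Percent-encoding the whole
    clause keeps spaces and other special characters from being
    mangled in transit.
    """
    literal = part_num.replace("'", "''")   # OData single-quote escaping
    clause = f"PartNum eq '{literal}'"
    return f"{base_url}?$filter={quote(clause)}"

# Hypothetical host; a hyphenated part number survives encoding intact.
url = build_filter_url(
    "https://server/api/v1/Erp.BO.PartSvc/PartUOMs",
    "ABC-123.X",
)
```

If the 200-with-no-content behaviour persists even with a cleanly encoded clause, that points at the service side rather than the query.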
Then on other services like SalesOrder, applying any sort of filtering this way just returns an empty 200 OK response with no content, just "value".
I know there are URLs preconfigured with parameters, but I cannot use them for everything because they do not always expose the parameters I need to filter on. So I need to be able to build my own queries, and the results can be hit or miss.
Just returning everything and then sifting through the response in code is not really an option: because of the size of the returned JSON it can take a while to get the response, which really slows down the application.
Anyone got any advice on this, or am I just doing something wrong?
You won’t regret it. As a developer and an administrator, it really makes my life easier. And using these with the new Kinetic interface makes customizations on forms so much more powerful.
Yes that is the feedback I have had from a few people, but here is the issue I have with this.
If I want to develop something purely for internal use, then why would you even bother with the API approach? Just do it in Functions. However, if I want to use the APIs to create an application that I can then give to someone else, I don’t want to have to say, here is the application …
and the 5 Functions and 37 BAQs that you need to import into your environment to make it work…
Valid point! My perspective is more about individual customizations where I find it easier to do the “heavy lifting” inside Epicor (BAQs/Functions). But if you’re building a product it’s a different perspective.
Setting aside that functions are also APIs…this hits several of the hot architectural topics today: coupling vs. cohesion and monoliths vs. microservices.
Monoliths (one big application) and microservices (many services) have been a hot topic in software architecture circles lately. The latest take is that, like coupling and cohesion, we need balance, and the current suggestion is to build a modular monolith to start with, then evolve the system for particular business capabilities as required. The key here is modularity.
This has been the battle between Unix/Linux and Windows, CISC and RISC, and software architectures for years. Do you build small things that do one thing really well and assemble them as needed, or do you build specialty things that are optimized for one task but not easily changed? At least for now, Linux and ARM appear to be winning the day.
Companies like AWS, Google, and Microsoft are API-First organizations. This reduces coupling and allows them to react to changes and build new things quickly. At Microsoft, the API comes first. Next, they build a PowerShell script to interact with the API. Finally, they may add a UI. This provides automation out of the box but also a guided experience. Importantly, each piece is easily testable on its own and not only during end-to-end testing.
In the end, it’s all about balance. Do we need 37 BAQs or three? One Function Library or more? How much duplicate code do we want to manage if it’s in five single purpose function libraries? What is easier to manage and evolve?
That depends on the point I imagine. What is the point being defeated here?
But what I was getting at was more the fact that the REST APIs are there but seem to have some issues when it comes to implementation. It’s fantastic that they exist and enable a world of possibilities for building products that interact directly with the ERP and your data. And there are obviously products that already do this, like EKW and SugarCRM, that use the OData approach.
It was more my frustration that it SHOULD be possible to create an application that interacts with the system entirely through the REST APIs (and yes, you can). This would mean you could create a turnkey application that does something Kinetic cannot do itself, without requiring any modification to the client’s environment.
Then, when I hit a bump in the road, the suggestion was to work around it with modifications inside the Kinetic application. However, depending on the context of what you are trying to build, that might not be an option.
So my point was: if you cannot build a solution that interacts with the ERP entirely through REST without also having to build custom Functions or BAQs to work around the fact that some endpoints do not function as documented, then you may as well just build your solution as a collection of Functions, BAQs, and form customisations instead.
And that’s a shame, because the fact that the REST APIs are there is such a cool and powerful tool, and I want to use them to build cool stuff.
Oh, I hear you, and I agree with you for the most part. My caution is: if we could control any application through an API in a Remote Procedure Call way, SHOULD we?
Going back to at least the 90s, computer scientists have been studying distributed computing. During that time, they came up with the Fallacies of Distributed Computing, which are:
The network is reliable
Latency is zero
Bandwidth is infinite
The network is secure
Topology doesn’t change
There is one administrator
Transport cost is zero
The network is homogeneous
So while I like, no, love the idea of REST integration, I need to think about what to do if the network is not available, who’s sniffing the network, how much data am I trying to push over the network, how long am I willing to wait, etc.
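In practice, the "network is reliable" fallacy usually translates into bounded retries with backoff around every remote call. A minimal sketch of that idea (`call_with_retry` is illustrative, not part of any Kinetic client library):

```python
import time

def call_with_retry(fn, attempts: int = 3, base_delay: float = 0.01):
    """Retry a callable with exponential backoff on transient failures.

    Defends against an unreliable network by retrying a bounded number
    of times; if every attempt fails, the last error is re-raised so
    the caller can decide what to do.
    """
    for attempt in range(attempts):
        try:
            return fn()
        except ConnectionError:
            if attempt == attempts - 1:
                raise                      # out of attempts, surface the error
            time.sleep(base_delay * (2 ** attempt))  # exponential backoff

# Usage would wrap the actual REST call, e.g.:
#   call_with_retry(lambda: session.get(url, timeout=10))
```

The same wrapper is also where you would think about timeouts and how much data you are willing to pull per request.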
How do the various APIs (not just Kinetic) react in these situations? I’d like to build cool stuff too. Can I make it performant, reliable, and secure at the same time? That was the point of my reply above.