Function Generation from EFx -- Returned JSON Data Size Limit?

Hey folks,

I have an Epicor function that bundles up a PO dataset and converts it to JSON.

It works fine until the PO data gets too big, I think.

I can get it to work until the returned JSON object gets upwards of 47k (guessing a 50k limit), and then I get an error message something like the one below.

Here’s the serialization statement:

outResult = Newtonsoft.Json.JsonConvert.SerializeObject(jsonList, Newtonsoft.Json.Formatting.Indented);

It’s returned to the caller as a System.String.

Note the error message references ‘Context.BpmData[0].Character17’. I’m guessing it’s using that data as a workspace. Maybe I’m using it up?

Single POs start hitting errors around five receipt lines (and related records). I’m including receipts and invoices at the line and release level, so there’s a lot of duplication.

I can get four or five POs returned with single lines and receipts before the crash.

I could probably squeeze a bit of room out by not indenting, but I don’t think it would get me too far. The data requirements aren’t my own.

Any experience doing something like this? Anyone hit size limits? Hints, remedies, random suggestions?

Thanks,

Joe

Application Error

Exception caught in: Newtonsoft.Json

Error Detail 
============
Message: Unterminated string. Expected delimiter: ". Path 'Context.BpmData[0].Character17', line 1, position 351.
Program: Newtonsoft.Json.dll
Method: ReadStringIntoBuffer

Client Stack Trace 
==================
   at Newtonsoft.Json.JsonTextReader.ReadStringIntoBuffer(Char quote)
   at Newtonsoft.Json.JsonTextReader.ParseProperty()
   at Newtonsoft.Json.JsonTextReader.ParseObject()
   at Newtonsoft.Json.JsonTextReader.Read()
   at Ice.Api.Serialization.JsonReaderExtensions.ReadAndAssert(JsonReader reader)
   at Ice.Api.Serialization.IceTableConverter.CreateRow(JsonReader reader, IIceTable table, Lazy`1 lazyUDColumns, JsonSerializer serializer)
   at Ice.Api.Serialization.IceTableConverter.ReadJson(JsonReader reader, Type objectType, Object existingValue, JsonSerializer serializer)
   at Ice.Api.Serialization.IceTablesetConverter.ReadJson(JsonReader reader, Type objectType, Object existingValue, JsonSerializer serializer)
   at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.DeserializeConvertable(JsonConverter converter, JsonReader reader, Type objectType, Object existingValue)
   at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.SetPropertyValue(JsonProperty property, JsonConverter propertyConverter, JsonContainerContract containerContract, JsonProperty containerProperty, JsonReader reader, Object target)
   at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.PopulateObject(Object newObject, JsonReader reader, JsonObjectContract contract, JsonProperty member, String id)
   at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.CreateObject(JsonReader reader, Type objectType, JsonContract contract, JsonProperty member, JsonContainerContract containerContract, JsonProperty containerMember, Object existingValue)
   at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.CreateValueInternal(JsonReader reader, Type objectType, JsonContract contract, JsonProperty member, JsonContainerContract containerContract, JsonProperty containerMember, Object existingValue)
   at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.Deserialize(JsonReader reader, Type objectType, Boolean checkAdditionalContent)
   at Newtonsoft.Json.JsonSerializer.DeserializeInternal(JsonReader reader, Type objectType)
   at Epicor.ServiceModel.Channels.ImplBase.ProcessJsonReturnHeader[TOut](KeyValuePair`2 messageHeader)
   at Epicor.ServiceModel.Channels.ImplBase.SetResponseHeaders(HttpResponseHeaders httpResponseHeaders)
   at Epicor.ServiceModel.Channels.ImplBase.HandleContractAfterCall(HttpResponseHeaders httpResponseHeaders)
   at Epicor.ServiceModel.Channels.ImplBase.CallWithMultistepBpmHandling(String methodName, ProxyValuesIn valuesIn, ProxyValuesOut valuesOut, Boolean useSparseCopy)
   at Epicor.ServiceModel.Channels.ImplBase.Call(String methodName, ProxyValuesIn valuesIn, ProxyValuesOut valuesOut, Boolean useSparseCopy)
   at Ice.Proxy.BO.DynamicQueryImpl.GetList(DynamicQueryDataSet queryDS, QueryExecutionDataSet executionParams, Int32 pageSize, Int32 absolutePage, Boolean& hasMorePage)
   at Ice.Adapters.DynamicQueryAdapter.<>c__DisplayClass45_0.<GetList>b__0(DataSet datasetToSend)
   at Ice.Adapters.DynamicQueryAdapter.ProcessUbaqMethod(String methodName, DataSet updatedDS, Func`2 methodExecutor, Boolean refreshQueryResultsDataset)
   at Ice.Adapters.DynamicQueryAdapter.GetList(DynamicQueryDataSet queryDS, QueryExecutionDataSet execParams, Int32 pageSize, Int32 absolutePage, Boolean& hasMorePage)
   at Ice.UI.App.BAQDesignerEntry.BAQTransaction.TestCallListBckg()
   at Ice.UI.App.BAQDesignerEntry.BAQTransaction.<>c__DisplayClass226_0.<BeginExecute>b__0()
   at System.Threading.Tasks.Task.InnerInvoke()
   at System.Threading.Tasks.Task.Execute()

Are you, or something you are using, using that field?

The CallContextBpmData is stored in the headers, and there is a size limit.

I have sent and received megabytes of data through REST; I believe you have something else going on.

I’d share the code as well as give a little more context on what’s going on here.

3 Likes

The code never mentions any call context data. It just serializes a JSON string (that part seems to work okay) and then returns the string from the EFx back to the calling BPM.

It looks like Epicor is using the call context area to pass the data and it gets full at about 50k. Or something.

I think we’ll wind up writing the JSON to a UD table file attachment from inside the function to avoid those size issues anyway.

Thanks,

Joe

Like Kevin said, this is only for BPMData. If you pass data to and from the function as parameters, the size limit is far greater. Header data is limited by the web server used, not by Epicor.
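
For example, if you call the function over REST, the request and response parameters travel in the HTTP body, where size is effectively a non-issue. A rough sketch, assuming the standard /api/v2/efx route for Functions (every server name, ID, key, and parameter below is a placeholder):

    // Sketch only: call the Epicor Function over REST so the big JSON travels in
    // the response body, not in a header. All values here are placeholders.
    using System;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Text;
    using System.Threading.Tasks;

    class EfxRestCaller
    {
        static async Task Main()
        {
            var baseUrl  = "https://yourserver/YourInstance";   // placeholder
            var company  = "EPIC06";                            // placeholder
            var library  = "POTools";                           // hypothetical library ID
            var function = "GetPOJson";                         // hypothetical function ID

            using (var client = new HttpClient())
            {
                client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
                    "Basic", Convert.ToBase64String(Encoding.UTF8.GetBytes("user:password")));
                client.DefaultRequestHeaders.Add("X-API-Key", "your-api-key");

                // Request parameters go in the body as JSON; the function's response
                // parameters (e.g. outResult) come back in the body as JSON too.
                var body = new StringContent("{ \"poNum\": 12345 }", Encoding.UTF8, "application/json");
                var response = await client.PostAsync(
                    baseUrl + "/api/v2/efx/" + company + "/" + library + "/" + function, body);

                Console.WriteLine(await response.Content.ReadAsStringAsync());
            }
        }
    }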

Just for clarity, here are some items.

The function is being called from a BPM on an updatable dashboard, for testing.

The function signature:

[screenshot]

The JSON serialization code:

[screenshot]

If I add ‘outResult = “”;’ after that serialization statement, I don’t get an error, so I don’t think the issue is there. I think the issue lies where the string gets passed back through the output variable, or from there back into the BPM.

We’ll probably just send the JSON output to a file attachment so we don’t have that size limitation.
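
Probably something along these lines inside the function (the path is just an example, and the actual attachment record against the UD table would still be created through the normal attachment setup):

    // Sketch: write the JSON to a server-side file and hand back only the path,
    // so nothing big has to squeeze through the call context.
    string json = Newtonsoft.Json.JsonConvert.SerializeObject(jsonList, Newtonsoft.Json.Formatting.None);
    string fileName = $"PO_{poNum}_{DateTime.UtcNow:yyyyMMddHHmmss}.json";                   // poNum is illustrative
    string filePath = System.IO.Path.Combine(@"\\fileserver\EpicorDocs\POJson", fileName);   // example share

    System.IO.File.WriteAllText(filePath, json);

    outResult = filePath;   // return the small path instead of the big payload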

Thanks,

Joe

Looking at this error, it would appear that the Json you are trying to parse is invalid. You may want to dump the string out ahead of time and put it in a Json checker or VS Code to see the error. I don’t think you’re having a problem with overflow here. You may have a character in your BPMData that needs to be escaped in order to parse it. Newtonsoft has no knowledge of Epicor and would not use BPMData as a work area.
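
If it helps, something like this right after the serialize call would tell you whether the string itself is valid and where it breaks (just a sketch; badJsonInfo is an illustrative variable):

    // Sketch: re-parse the string we just produced so a malformed payload
    // surfaces here, with line/position info, instead of on the trip back.
    string badJsonInfo = "";
    try
    {
        Newtonsoft.Json.Linq.JToken.Parse(outResult);
    }
    catch (Newtonsoft.Json.JsonReaderException ex)
    {
        badJsonInfo = ex.Message;   // e.g. "Unterminated string ... line 1, position 351."
    }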

Nah, that’s the same error I got when I was testing call context limits.

[The Office GIF]

One thing that might help is not formatting the JSON as indented. JSON is JSON; spaces aren’t free.
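
e.g., the same call with the whitespace dropped:

    // Same serialization, minus the indentation whitespace.
    outResult = Newtonsoft.Json.JsonConvert.SerializeObject(jsonList, Newtonsoft.Json.Formatting.None);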

3 Likes

What does “CleanJsonObject” do?

I thought about the indenting, but when it’s failing at five lines with all the history and overhead, I figured one with 100 lines might hit the limit anyway. 🙂

I’ve got some industrial-strength cleaning going on while building the strings, and on the JSON object too, because I thought I must have bad data in there.

And I did have a bunch of quote marks for inches, new lines, etc.

Just too dang big to do it that way.

Hey, anyone here have experience writing a JSON file out to a UD table attachment?

Thanks,

Joe

What is your Show Message doing in the BPM? Is it showing the JSON result from the function?

If so, try removing it or changing it to show something simple rather than the JSON field.

The reason is that info messages are sent in the call context client section (I believe) of the REST headers, which causes the same issues people have seen when using call context BPM data.
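
For example, in custom code something like this keeps the message tiny (a sketch; jsonResult stands in for whatever BPM variable actually holds the function output):

    // Sketch: report only the length of the JSON, not the JSON itself,
    // since the info message rides along in the call context.
    this.PublishInfoMessage(
        $"Function returned {(jsonResult ?? string.Empty).Length} characters of JSON.",
        Ice.Common.BusinessObjectMessageType.Information,
        Ice.Bpm.InfoMessageDisplayMode.Individual,
        "", "");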

3 Likes

Joe,
I have a PO that has over 4K Releases
[screenshot]

Over 3 MB of source data,

Simple Function

[screenshot]

And it took a few seconds… but it came right up and worked

I think you may have either a weird BPM that is writing to BPMData on PO GetByID (or something like that) causing issues, or your cleanup routines aren’t keeping your data in good shape.

Attached is the EFX in case it helps
TEST.efxb (25.6 KB)

1 Like

There are methods for building the JSON without using strings and “cleaning” it as you go along; both Newtonsoft (its LINQ-to-JSON types) and System.Text.Json can do it.
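
For example, a rough sketch with Newtonsoft’s LINQ-to-JSON types (the PO/line shape here is invented just to show the idea; quotes, newlines, and the rest get escaped for you when the tree is serialized):

    using Newtonsoft.Json.Linq;

    // Build the structure as objects instead of concatenating strings;
    // quote marks for inches, embedded newlines, etc. are escaped automatically.
    var po = new JObject(
        new JProperty("PONum", 12345),
        new JProperty("Comment", "Bracket, 1/2\" x 3\"\nstainless"),
        new JProperty("Lines", new JArray(
            new JObject(
                new JProperty("POLine", 1),
                new JProperty("PartNum", "ABC-123"),
                new JProperty("Receipts", new JArray())))));

    outResult = po.ToString(Newtonsoft.Json.Formatting.None);   // the function's string output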

Food for thought.

1 Like

That’s my guess as well.

I hope it’s not just a red herring and we’re chasing our tails

I’ll check, thanks.

Sometimes it’s the dangdest things.

I disconnected the message and it seemed to work at 633+ MB.

Thanks, Ben, et al.

Joe

1 Like

In case anyone comes across this in the future.

Call context limits are reasonably big, but not huge, and info messages/Show Message output go through the call context. Putting too much data through there is a recipe for pain and confusion.

2 Likes

And the limits are not set by Epicor; it’s a limit in the HTTP protocol, so go yell at the W3C.

3 Likes