I had some users start getting this error when trying to log in last week. I am seeing some REST API errors in the event viewer on the AppServer, though they all look like Business Logic Errors. This has come up a few times in the past week.
A quick Google search yields a vast array of potential solutions. This is a .NET issue, and more specifically a machine/environment issue, not an Epicor issue.
Check your clock sync, restart your server, and check that both .NET and Windows are up to date.
Are you using a Terminal Server? If so, you may have too many clients leaving connections open. Check out Google for some suggestions.
Including this
Message: Insufficient winsock resources available to complete socket connection initiation.
Inner Exception Message: An operation on a socket could not be performed because the system lacked sufficient buffer space or because a queue was full 192.168.100.83:808
Program: CommonLanguageRuntimeLibrary
Method: HandleReturnMessage
Client Stack Trace
Server stack trace:
at System.ServiceModel.Channels.SocketConnectionInitiator.Connect(Uri uri, TimeSpan timeout)
at System.ServiceModel.Channels.BufferedConnectionInitiator.Connect(Uri uri, TimeSpan timeout)
at System.ServiceModel.Channels.ConnectionPoolHelper.EstablishConnection(TimeSpan timeout)
at System.ServiceModel.Channels.ClientFramingDuplexSessionChannel.OnOpen(TimeSpan timeout)
at System.ServiceModel.Channels.CommunicationObject.Open(TimeSpan timeout)
at System.ServiceModel.Channels.SecurityChannelFactory`1.ClientSecurityChannel`1.OnOpen(TimeSpan timeout)
at System.ServiceModel.Channels.CommunicationObject.Open(TimeSpan timeout)
at System.ServiceModel.Channels.ServiceChannel.OnOpen(TimeSpan timeout)
at System.ServiceModel.Channels.CommunicationObject.Open(TimeSpan timeout)
at System.ServiceModel.Channels.CommunicationObject.Open()
Exception rethrown at [0]:
at System.Runtime.Remoting.Proxies.RealProxy.HandleReturnMessage(IMessage reqMsg, IMessage retMsg)
at System.Runtime.Remoting.Proxies.RealProxy.PrivateInvoke(MessageData& msgData, Int32 type)
at System.ServiceModel.ICommunicationObject.Open()
at Epicor.ServiceModel.Channels.ChannelEntry`1.CreateNewChannel()
at Epicor.ServiceModel.Channels.ChannelEntry`1.CreateChannel()
at Epicor.ServiceModel.Channels.ChannelEntry`1.GetContract()
at Epicor.ServiceModel.Channels.ImplBase`1.GetChannel()
at Epicor.ServiceModel.Channels.ImplBase`1.HandleContractBeforeCall()
at Ice.Proxy.Lib.SessionModImpl.Login()
at Ice.Core.Session..ctor(String userID, String password, String asUrl, LicenseType licenseType, String pathToConfigurationFile, Boolean fwVerCheck, String companyID, String plantID)
at Ice.Core.Session..ctor(String userID, String password, LicenseType licenseType)
at IceShell.Apps.LogonDialog.logOn(String userID, String password, Boolean promptUpdatePassword)
at IceShell.Apps.LogonDialog.DoWorkLogon()
Inner Exception
An operation on a socket could not be performed because the system lacked sufficient buffer space or because a queue was full 192.168.100.83:808
at System.Net.Sockets.Socket.DoConnect(EndPoint endPointSnapshot, SocketAddress socketAddress)
at System.Net.Sockets.Socket.Connect(EndPoint remoteEP)
at System.ServiceModel.Channels.SocketConnectionInitiator.Connect(Uri uri, TimeSpan timeout)
Agreed that the source of the issue is the environment. We did find that StackOverflow post, and my infrastructure team has already been using it to troubleshoot. I also found a reference to the MaxUserPort setting in one of the Epicor Migration guides, which mentions the same error.
Appreciate the document and the reply. Sorry for the @ so early in the week; I just knew who could point me in the right direction.
I have seen this before. For us it was the CDC log reader service. It is used by Epicor Social, which we are not using, so we removed the service and that solved the issue.
With that service running we would get about 12 hours of runtime before getting the errors. Reboot the server and we got another 12 hours…
Edit: I used the TCPView tool to see what was using all the TCP sockets; from there I think I used the Process ID to find that it was the CDC Log Reader service. TCPView can be found here: TCPView for Windows - Sysinternals | Microsoft Learn
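If TCPView isn't handy, something like this psutil sketch (again assuming Python and the psutil package are installed, which is not something from this thread) would show roughly the same per-process socket counts that TCPView does, making it easy to spot the process hoarding connections:

# Group open TCP connections by owning PID, then resolve each PID to a
# process name. Run elevated so every process's sockets are visible.
from collections import Counter
import psutil  # pip install psutil

per_pid = Counter(conn.pid for conn in psutil.net_connections(kind="tcp") if conn.pid)
for pid, count in per_pid.most_common(10):
    try:
        name = psutil.Process(pid).name()
    except psutil.NoSuchProcess:
        name = "<exited>"
    print(f"{pid:>7}  {name:<35} {count} connections")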
So what was the solution?
Was it adding those registry keys?
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\services\Tcpip\Parameters\MaxUserPort = 60000
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\services\Tcpip\Parameters\TcpTimedWaitDelay = 30
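For anyone copying those values later: as I understand it, both are DWORDs under that key. MaxUserPort raises the top of the ephemeral (outbound) port range, TcpTimedWaitDelay is the number of seconds a closed socket sits in TIME_WAIT before its port can be reused, and a reboot is needed for either to take effect. You would normally set them with regedit or reg add, but purely as an illustration, a small Python sketch using the built-in winreg module would look like this:

# Illustrative only: set the two TCP registry values from Python.
# Requires an elevated (administrator) prompt and a reboot afterwards.
import winreg

TCPIP_PARAMS = r"SYSTEM\CurrentControlSet\Services\Tcpip\Parameters"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, TCPIP_PARAMS, 0, winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "MaxUserPort", 0, winreg.REG_DWORD, 60000)      # top of ephemeral port range
    winreg.SetValueEx(key, "TcpTimedWaitDelay", 0, winreg.REG_DWORD, 30)   # seconds spent in TIME_WAIT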