Docstar (ECM) System Requirements

We have been using Docstar (ECM) without IDC for the past 2 years. We are just now adding IDC and noticed the system requirements say that IDC needs to be its own separate server (with SQL DB) apart from the Docstar server we already have.

With only 2,000 transactions a month, is it really necessary that we have IDC on a separate server?

Just curious what other configurations there are out there. Is it really necessary to have 2 servers dedicated for Docstar?


@jdewitt6029 I don’t know if it’s needed since we are just getting started with AP Automation, but our consultant advised that we also needed a separate IDC server for 2,000 transactions a month, so I have ECM and IDC servers in both dev and production.

@jdewitt6029 My personal opinion would be no, you don’t need a separate server, but that also has a lot to do with how much work your DS instance will be doing. Do you have a lot of users in DS, or a lot of workflows happening at the same time as the AP Invoice import? Things like that…

If you’re virtualized then it’s a no-brainer - add a second server - but if you’re on physical servers, you could try it on one, then move it to another server if the performance is poor. Moving it wouldn’t be hard.

We were under the impression that IDC needed its own instance of SQL Server, and we were going to run it on the same server as the IDC application, but we are now being informed that IDC can reside on the same instance as our Epicor environment DBs, Enterprise Search DB, etc., and that IDC only does logging and stores system settings. Can anyone confirm that is correct?

I can confirm that - the IDC database is really only accessed to save the learned data; otherwise the load is on the services for OCR and processing. We have a VM for the IDC services, and the database resides on our main ERP SQL server. Same goes for Docstar: it has its own VM for services and image storage, but the database coexists on the ERP SQL Server. It really has very little traffic, generally speaking.

Fantastic, thanks Mike.

Good to know, we just signed on for it as well and were trying to figure out architecture.

Great minds think alike :wink:


Mike, what about the storage on the SQL server - the size and growth of the docstar database? What are your thoughts on that?

92k docs right now, DS database is 3 GB used. IDC database is 325MB used.
We’ve got 40 GB of docs to import later this month, so we’ll see what happens :slight_smile:

Note - we have lots of fields on our content types, and my workflows pull back all that data from ERP so that all documents are equally searchable inside Docstar, so my index data is a LOT more than someone using it to store the docs for (and only really accessible from) ERP.

Right on. I wanted to ask you what you planned to do with the 40 GB of docs - whether you were going to import them or not.

You use docstar company-wide right? You don’t use other document types within Epicor, yeah?

I find it gets messy if you have people attaching different ways within epicor, i.e. file path vs. docstar.

I wanted to cut over to just using docstar. Keep me posted about size when you import/index the 40K.

so we have a handful of doctypes inside ERP and really force the users to do everything via the ERP UI. This 40GB of stuff is from DocLink which we used prior to Docstar.

Then I’ve got another 20GB of stuff that is currently linked in ERP but using the ‘file store’ storage method, so they are in a big directory of PDFs on a shared drive. Once I can get a query built that will find and rename them based on the XfileRef/XFileAttach tables inside ERP, I can bring them into Docstar using a workflow, reattach them back to ERP, and drop the old references.
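As a rough sketch of that lookup, the idea might look like the snippet below. The table and column names in the SQL are assumptions (the real XfileRef/XFileAttach schema should be checked in your own ERP database first), and the filename convention just follows the company-type-transaction# pattern discussed in this thread:

```python
# Sketch only: the SQL column names below are assumptions, not the verified
# Epicor schema - check XFileRef/XFileAttach in your own database first.
LOOKUP_SQL = """
SELECT fr.Company, fa.RelatedToFile, fa.Key1, fr.XFileName
FROM dbo.XFileAttach AS fa
JOIN dbo.XFileRef    AS fr
  ON fr.Company = fa.Company AND fr.XFileRefNum = fa.XFileRefNum
"""

def takeup_name(company: str, doctype: str, tran_key: str, ext: str = ".pdf") -> str:
    """Build the 'company-type-transaction#' filename used to rename a file."""
    return f"{company}-{doctype}-{tran_key}{ext}"
```

For example, `takeup_name("EPIC03", "PO", "4551")` returns `"EPIC03-PO-4551.pdf"` (those key values are made up for illustration).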

Ah, I see… I would imagine that someone from docstar has something created that does just that, because we are also using the file store storage and I would like to move it all so that we have one central document management software that we can train on and adapt to.

Instead of having to learn two and adhere to two.

They have a utility for doing it, but only ‘they’ can use it. I know it works for DocLink (it’s what we are using), but I’m not sure about any other system - and if I recall correctly, it won’t work for the file store documents either - that’s why I’m working on a script to pull the data out of ERP and a workflow to consume it all.


Surely they have had to do this for one client before who uses file store…

And of course only they can use it :wink: .

I’ll ask when I talk to them next and let everyone know.


Right on!

To import into DocStar, you could create an XML file for each document with the meta-data in it and drop the two files (original and XML) in the take-up folder. You should have the meta-data (of some kind) in DocLink, but if not, you could IDC your DocLink PDFs…
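A very rough illustration of that document/XML pairing is below. The element names are placeholders, not DocStar’s actual take-up schema - check what field names your take-up folder configuration expects before using anything like this:

```python
# Placeholder sketch: write a metadata XML next to each document so the pair
# (original + XML) can be dropped into the take-up folder together.
# Element names here are illustrative, not DocStar's real take-up schema.
from pathlib import Path
from xml.etree import ElementTree as ET

def write_metadata_xml(doc_path: Path, fields: dict[str, str]) -> Path:
    """Create <same-name>.xml alongside the original document."""
    root = ET.Element("Document")
    for name, value in fields.items():
        ET.SubElement(root, name).text = value
    xml_path = doc_path.with_suffix(".xml")
    ET.ElementTree(root).write(xml_path, encoding="utf-8", xml_declaration=True)
    return xml_path
```

You would run this once per document while walking the DocLink export, then move both files into the take-up folder.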

No doubt we could build a workflow or metadata to do it. Just wondering if they have already created that for their customers and if they offer it.

I’m suggesting “yes” :wink:

FileStore is a problem though since there is no meta-data. One could go back to the linked document to get it though… :thinking:

The query I’m building is pulling the metadata from the associated transaction for the attachment. The query will be done by ERP doctype (quote, SO, PO, AP, AR, etc.) and will create the XML/CSV file for import. I’ve found a few variations on the theme, but I thought it might be simpler to use the script to pull the key field data (company-type-transaction#) and rename the file, then use the workflow to do all the work with a metadata datalink back to ERP, followed by an attach-to-ERP step. Or maybe even use the File ID# or GUID… I’ve almost got it all where I want it :slight_smile:

I don’t know PowerShell well enough, but I think it might work if it could do a SQL query - like calling the command-line SQL utility - then it could simply parse recursively through my shared folder, looking up the files, renaming them, and dropping them into the take-up folder. But I’d also want it to call the API to ‘delete attachment by ID#’ as well… that way it would be self-cleaning, one file at a time (in case it crashes).
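Whatever the language, the loop being described could be sketched like this. The `lookup` mapping and `delete_attachment` callback are stand-ins for the real SQL query and the real ERP attachment API (neither of which is shown here):

```python
# Sketch of the self-cleaning loop: rename each file-store PDF to its ERP key
# and move it into the take-up folder one file at a time, so a crash mid-run
# loses at most the file in flight. The lookup mapping and delete_attachment()
# are placeholders for the real SQL query and the real ERP attachment API.
from pathlib import Path
from typing import Callable

def sweep_file_store(
    share: Path,
    takeup: Path,
    lookup: dict[str, str],          # original filename -> "company-type-tran#" key
    delete_attachment: Callable[[str], None],
) -> list[str]:
    moved = []
    for pdf in sorted(share.rglob("*.pdf")):  # recurse through the shared folder
        key = lookup.get(pdf.name)
        if key is None:
            continue                          # no ERP record found; leave it alone
        dest = takeup / f"{key}{pdf.suffix}"
        pdf.rename(dest)                      # rename + drop into the take-up folder
        delete_attachment(pdf.name)           # drop the old ERP attachment reference
        moved.append(dest.name)
    return moved
```

Deleting the old reference immediately after each move is what makes it safe to just re-run the script after a crash: anything already processed is gone from both the share and the lookup results.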
