There are many ways to get information from a third-party system into ServiceNow and back: web services, JDBC, email, file transfer, and so on. If a customer wants to send a CSV file to their ServiceNow instance, they have a couple of choices: attach the file manually to a data source, or place the file on an FTP(S) server. Oddly, there is no option to simply have the mid server send the file directly to the ServiceNow instance to be processed by an import set. This is a very common complaint/recommendation that I hear from customers.
Placing a file on an FTP server is typically a good idea, especially when that file will be used by other vendors and you want a common location from which these CSV files can be retrieved. This is not always easy, however, when there is no public-facing FTP(S) server available and you simply need to send, for example, a CSV of users to ServiceNow.
Introducing the Scheduled File Importer.
The Scheduled File Importer gives the mid server the ability to send a file that exists on the same machine (or on a mapped drive) directly to a ServiceNow import set table that has been created previously in your instance.
When this update set is installed, a new section in the Mid Server Application will appear called File Importer. This section contains three modules, each of which I’ll cover briefly: Scheduled File Import, Scheduled File Transform, and File Import History.
Prerequisites:
- The Mid Server plugin needs to be installed.
- The CSV file should be added to a data source via an attachment and then imported. This will create an import set table into which you may import your data.
- A transform map should be defined for the import set table in order to map the file being sent from the mid server into a target record.
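Most of this mapping is simple field-to-field configuration and needs no code, but a transform map can also carry scripts. Purely as an illustration (the column name u_user_name is a hypothetical source field, not something the update set requires), a minimal onBefore transform script could skip any row that arrives without a user name:
// Illustrative onBefore transform script; 'source', 'target', and 'ignore' are provided by the transform engine
if (source.u_user_name.nil()) {
    ignore = true; // mark the row Ignored instead of inserting/updating a target record
}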
Scheduled File Import
This module allows an admin to schedule the import of a CSV file from a mid server. As shown below, it only has a few fields to define, each of which is pretty self-explanatory.
The only thing to note is that the import set table specified in this record should already exist in the system.
When a scheduled file import record is created, it will run on a scheduled interval that is defined by the record. When the data is imported, it is added to the import set table in a Pending state (not yet transformed). Because of this, you have the freedom to load many different files into a particular import set table before a transformation occurs.
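If you want to confirm that rows actually landed in the import set table before the transform runs, a quick background script check is enough. This is just a sketch, assuming a hypothetical import set table named u_imp_users:
// Count rows that have been loaded but not yet transformed
var pending = new GlideRecord('u_imp_users'); // hypothetical import set table
pending.addQuery('sys_import_state', 'pending');
pending.query();
gs.log('Rows waiting for transform: ' + pending.getRowCount());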
Scheduled File Transform
This module is merely a script that can be configured to run on a defined interval. It looks for all import set tables that are defined in any scheduled file import record and transforms them all.
A transform map must be defined for the import set table in order to process any of the pending records.
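For reference, here is a rough sketch of how loaded import set data can be transformed from a server-side script. This is not the update set's actual code; it assumes a hypothetical import set table named u_imp_users and relies on the undocumented global GlideImportSetTransformer API, which may behave differently on your release:
// Transform every import set for the table that has been loaded but not yet processed
var importSet = new GlideRecord('sys_import_set');
importSet.addQuery('table_name', 'u_imp_users'); // hypothetical import set table
importSet.addQuery('state', 'loaded'); // loaded but not yet transformed
importSet.query();
while (importSet.next()) {
    var transformer = new GlideImportSetTransformer();
    transformer.transformAllMaps(importSet); // run every transform map defined for this table
}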
File Import History
Whenever a scheduled file import is run, there will be two corresponding entries in the ecc_queue. The first entry is the outgoing communication to the MID server. In this case, it sends the MID server the command to fetch a file, gives it the file location, and tells it what import set to send the data to. The second entry contains the raw results of the command.
This module filters the ECC queue for entries that pertain to this solution. These entries can be quite valuable when debugging is required.
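If you want to build a similar filter yourself, the query below is a sketch of the idea. It assumes the probe name is SNFilePoster (the name of the MID server script include used by this solution); adjust the values if your copy differs:
// List the most recent ECC queue entries related to the file importer probe
var ecc = new GlideRecord('ecc_queue');
ecc.addQuery('topic', 'JavascriptProbe');
ecc.addQuery('name', 'SNFilePoster'); // assumed probe name
ecc.orderByDesc('sys_created_on');
ecc.query();
while (ecc.next()) {
    gs.log(ecc.queue + ' | ' + ecc.state + ' | ' + ecc.sys_created_on);
}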
Is it possible to use the MID server to do an import of user records from an LDAP server?
Not directly. If you created a CSV extract of your LDAP info and put it on a MID server then you could import that file using this technique.
This is a great addition and we are trying to use it in our environment. We have followed all of the above steps, but ran into an issue. We have three schedules set up, each with its own import set table and transform map. When the schedules run, only one of them gets transformed and the other two never get to that stage. We have checked to make sure everything is set up as it should be and can't find any errors or warnings in the log either. Is there anything you can think of that can help us resolve this?
How large a file can this function handle? What is the average load rate? In other words, approximately how many records can be loaded in one minute?
AdminPro,
Are you still having problems with all of the transforms running? If so, I’d be happy to look into it. Please send me your instance name and I will take a look at what is going on.
Regarding your second question, I don’t believe that we have run any benchmarks on this. Is the reason you’re asking because you’re seeing performance issues or are you just curious?
Thanks.
We have been using this solution for close to a month now; however, the file recently stopped being imported to our instance. The File Import History still shows that the data is being pushed up, but it never gets imported into our defined table. Any suggestions on how to troubleshoot this?
I would start by taking a look at the data in your import set table to see if the information is making it that far. If it is, then the scheduled file importer is probably working correctly and you’ll need to take a look at your transform map. If it isn’t, then you’ll want to take a look at the importer config and make sure that the MID server is up and operating correctly.
I’d like to also say that this is a great feature to add to an instance, but I do have some questions. Here’s my understanding of the process:
1. I've defined a single file import record, which is pulling a CSV file with ~49,000 records (people).
2. Manually executing the file import runs very fast, usually in just a few minutes.
3. In the import set table, all the records are loaded in a pending state, with no row numbers or import set relationship. This is the piece I'm a little confused by (why isn't there an import set created at this point?).
4. When I manually execute the scheduled transform script, it appears that an import set is then created, and each record in the import set table gets a row number and import set reference.
At this point, everything does work, but it is horribly slow. The scheduled transform has been running for about 25 minutes now, and has just barely processed 4000 records. What is the expected performance for loading records? This seems rather slow (even for a sandbox instance), and I am seeing repeated warnings in the system log about “Table handling an extremely large result set.”
Any insight would be greatly appreciated.
Thanks.
Mike
Hi,
Regarding transform maps on large data sets, it appears that execution time increases dramatically with size. For a 12,000-record transformation, it took us about 37 hours.
What we did to avoid this was to disable "Run business rules" on the transform map; now it takes only 10 minutes. But be careful: we were only able to do this because we had no important "on insert" business rules associated with our record type.
Regards,
Philippe
Hi Mark,
It looks like the table never gets populated after the scheduled file import runs. The messages I get say that it can access the file and that it processes the file, but nothing is being inserted into the custom table.
We confirmed the MID server is up. I am able to manually import the file to the same table and that works fine. It just doesn't work through the scheduled file import. Let me clarify: you can watch the scheduled file import run; it's just that nothing gets inserted into the table.
Also, could you clarify what you mean by "importer config"? Do you just mean our MID server configuration?
Thanks for your help.
Benjamin,
I have uploaded the latest version of the update set to replace the version that you have. It fixes some performance issues that earlier scripts had. Can I have you try this out in your demo instance and let me know if it resolves your issues?
Thanks!
Is there something I should do to upgrade the version?
I downloaded the new copy and just installed and committed it, but data is still not being imported into the temp table when I execute the scheduled file import.
We just changed our MID server to run on Linux because the integration with Netcool worked. We tested the file upload and we get the same response. The file never gets imported into the table.
Any suggestions on what I can do to troubleshoot? Here is the information from the File Import History:
SNFilePoster Log:
Initializing SNFilePoster
Running SNFilePoster execute
Arriving to postToImportSet function
Upload URL: ‘https://primedev.service-now.com/nav_to.dosys_import.do?sysparm_import_set_tablename=u_itim_update’
Response status code: 302
Response body:
Looking at your upload URL, it says to go to nav_to.dosys_import.do, which is incorrect. It should probably say
'https://primedev.service-now.com/sys_import.do?sysparm_import_set_tablename=u_itim_update'
Thanks, Jacob. That is what our problem was. Looks like the mid file import uses the glide.servlet.uri and we had modified it to get it to work with a redirect. That is where the nav_to.do was getting added.
I have just tried this and have a problem being reported in the MID server log. Can you advise what is causing this?
08/09/11 14:45:27 (241) Probe: JavascriptProbe:SNFilePoster Worker starting: JavascriptProbe
08/09/11 14:45:27 (382) Probe: JavascriptProbe:SNFilePoster Slow compile (141ms) of script: probe:SNFilePoster
08/09/11 14:45:27 (522) Probe: JavascriptProbe:SNFilePoster WARNING *** WARNING *** org.mozilla.javascript.EcmaError: “SNFilePoster” is not defined.
Caused by error in JavaScript probe ‘SNFilePoster’ at line 1
==> 1: var req = new SNFilePoster();req.getLog()
08/09/11 14:45:27 (538) Probe: JavascriptProbe:SNFilePoster Enqueuing: D:\ServiceNow\agent\work\monitors\ECCSender\output\ecc_queue.2aeab3fe24d81000dbfb09307d1280a7.xml
08/09/11 14:45:27 (538) Probe: JavascriptProbe:SNFilePoster Worker completed: JavascriptProbe time: 0:00:00.297
08/09/11 14:45:28 (335) ECCSender.1 Sending ecc_queue.2aeab3fe24d81000dbfb09307d1280a7.xml
Hmm, that does sound strange. Please try downloading the update set again and re-applying. You’re missing something.
OK, I downloaded update set version 2.3 and applied it all OK. However, this has not changed the problem.
08/10/11 11:51:21 (118) Probe: JavascriptProbe:SNFilePoster Worker starting: JavascriptProbe
08/10/11 11:51:21 (571) Probe: JavascriptProbe:SNFilePoster Slow compile (453ms) of script: probe:SNFilePoster
08/10/11 11:51:21 (884) Probe: JavascriptProbe:SNFilePoster WARNING *** WARNING *** org.mozilla.javascript.EcmaError: “SNFilePoster” is not defined.
Caused by error in JavaScript probe ‘SNFilePoster’ at line 1
==> 1: var req = new SNFilePoster();req.getLog()
08/10/11 11:51:21 (993) Probe: JavascriptProbe:SNFilePoster Enqueuing: D:\ServiceNow\agent\work\monitors\ECCSender\output\ecc_queue.177c9cdf245c1000dbfb09307d128041.xml
08/10/11 11:51:22 (290) Probe: JavascriptProbe:SNFilePoster Worker completed: JavascriptProbe time: 0:00:00.781
08/10/11 11:51:23 (009) ECCSender.1 Sending ecc_queue.177c9cdf245c1000dbfb09307d128041.xml
How does applying the update set to the instance affect the MID server? Is SNFilePoster a component that needs to be applied to the MID server?
Hi,
I have uploaded the update set as per the document without any errors. I used the link below to upload the update set:
https://servicenowguru.wpengine.com/service-now-general-knowledge/installing-sncguru-update-set/
My intention is to import the data from one Service-Now server to another Service-Now server or any external application.
As per the steps mentioned above:
“When this update set is installed, a new section in the Mid Server Application will appear called File Importer. This section contains three modules, each of which I’ll cover briefly: Scheduled File Import, Scheduled File Transform, and File Import History.”
I am not getting any new section under the MID Server application in the ServiceNow GUI.
Could you please help me out if I am missing something or need to do anything else?
Regards,
ND
Send me your instance name via the contact page and I’ll take a look.
I am also getting the above error.
Evaluation error: Caused by error in JavaScript probe ‘SNFilePoster’ at line 1
==> 1: var req = new SNFilePoster();req.getLog()
Please help
Please update your SNFilePoster mid server script include with the latest code to make sure that we’re debugging a true problem. You can find the latest code here: https://servicenowguru.wpengine.com/scheduled-file-poster-code/
Please update me with your results. Thanks!
Actually, I am seeing the same error in the ecc queue coming back from the mid server. Looking at the mid server agent log, there is an error about “SNFilePoster is not defined.”
I’m working on a june2011-05-25-2011__patch3-11-14-2011 instance, mid server version 11-22-2011_1328. I did download the code from https://servicenowguru.wpengine.com/scheduled-file-poster-code/ just to be sure, but that didn’t help.
Answer for Mike:
There is a change required in the access control on the ECC agent script include table.
Below are the details:
Add the mid_server role to the ACLs ecc_agent_script_include.* and ecc_agent_script_include for the read operation.
Go to the "Requires role" related list on the above ACLs and assign the mid_server role.
This should solve the issue.
Hope this works!!!
Unfortunately, those ACLs are already in place. Any other ideas?
Ack. Someone had reset all the passwords in the user table, including the account the mid server was using. Odd that I was seeing “SNFilePoster not defined” instead of “authentication failed” in the log.
Those ACLs are already there, but see if the "Requires role" related list includes the mid_server role.
Is there an option to delete the source file (csv) after import?
FYI: The process works perfectly!!! Should be part of the Out of Box product…
You can do this by simply running a post script to do the cleanup, if you wish. To do this, check the “Run Post-script” checkbox and a script field will appear. In that script field, you can enter a command that is specific for the machine where the command will run. In other words, if the mid server is running on a Windows machine, you’ll want to use a DOS command to delete the file. Otherwise, you’ll use a BASH command.
Thanks for the feedback!
I tried using the post-script to rename the file, but it doesn’t appear to do anything and there are no error messages in the history. I’ve tried different variations of DOS commands (i.e. with and without the cd\scripts).
cd\scripts
ren filename.csv filename1.csv
Nancy,
If you can give me a test user on your instance, I’d be happy to log in and take a look. You can email that to me at jacob@crossfuze.com.
Thanks,
Jacob.
Thanks Jacob for the information. I am going to give this a try!!
Hi Jacob,
I have installed this update set in my instance, but our FTP servers require a username and password to log in. So my question is: where do I put those credentials without breaking the code?
Kind Regards,
Ankush
This functionality doesn’t connect to an FTP server. See this line in the post above…
“The Scheduled File Importer gives the mid server the ability to send a file that exists on the same machine (or on a mapped drive).”
You need to install the MID server on the machine that hosts the files (or can access them via a mapped drive). As such, there is no need for any credentials other than what you would use in a regular MID server config.
Has this been tested on Aspen? I’ve been reliably getting an error with an import that previously worked as configured (85% sure, anyway)… File is not a normal file / status code 200.
Thanks!
I’m not aware of anything in Aspen that would break this, though there were some issues with the previous June 2011 release. Please ensure you’re using the latest version of the update set from this site (especially the latest version of the script include).
Yes, I’m using 2.5.
Per your request, I just installed this on Aspen. It looks like it runs just fine for me on the demo instance running Aspen. I'm not sure what issue you might be seeing. However, I'd double-check the path you have entered to make sure that it points to a valid file.
I have tried the update set, but got some errors in the preview, which indicate that there are 7 missing dictionary entries in the u_scheduled_file_import table. The missing entries are: run_dayofmonth, active, run_start, name, run_period, run_dayofweek and run_type. Also, the “Scheduled File Transform” module was not created on commit.
I also tried to apply it to one of the demo-instances and got the same result. Have you seen this before?
/Tomas
I just tried this on a demo instance and it seems to install the module just fine. Did you make sure to refresh your application list after installing the update set? You’ll need to do that in order to see the new module.
I did notice the preview errors that you mentioned… my guess is that since my table extends the system scheduled job table, the previewer script does not see the base table's fields when doing the evaluation. After I committed the update set, the list and forms all looked good.
Let me know if you can see the module after a refresh.
Thanks for your quick response! I located the missing module. It was set to inactive. Is it only activated when the import set table has a corresponding transform?
/Tomas
No, I think it should be active on the update set’s commit. I’m not sure why it didn’t come that way for you.
OK, it works fine now. By the way, is there any way to specify the format of the file (as in a data source)? My CSV includes international characters, and I therefore need to use UTF-8.
Thanks,
Tomas
Never mind. I figured out that I could set the format property to UTF-8 on the data source. It works now.
Thanks again for a great customization!
I just installed this update set. When looking at the import set run, it looks like the updates and ignores are correct. However, when I look at the records that should be updated, nothing changed. I then looked at the File Import History; the output processed, but the input is in the error state with a message of "No sensors defined".
Thanks,
Rose
I was able to get the updates to work. There was an issue with the CSV file setting a value to true; I had to change the true/false values from all caps to lowercase. I'm still getting the "No sensors defined" error message, but everything is updating as planned.
The file import seems to be working fine for me but I also get the “No Sensors Defined” error message in the file import history. Does anyone know if this is something we should ignore or is a valid error occurring? Thanks for responses.
I tried to use this utility to import user data from a pipe-delimited (|) CSV file. The data is not getting mapped to the fields even though there is a transform map present. Is there any way we can specify the delimiter?
I attempted to use this utility in Berlin (CA). Everything appeared to work, except no actual data was imported. I tried test loading 20 records from the data source and it said "Invalid table alias". There are no special characters in the import set table name. Any idea what would cause that error?
I'm experiencing the same problem since upgrading to Berlin. I believe the issue is that I have some fields at the end of my import file that are sometimes blank. The program that we are using to generate the CSV file simply leaves these fields off the record when they are at the end of the record. I still have my production instance on Aspen, and the exact same file works there, so I know this issue started with Berlin.
This may not be exactly your issue, but maybe it's a place to look.
I’ve opened a HI ticket and will post what I find out.
If SNC cannot fix this easily, then there are a few things to try.
1. See if the program generating the file can output empty fields at the end of a record.
2. Change the order of the fields in the record so that the last field always has a value. I know that blank fields in the middle of a record are processed properly but I think this is because they are formatted with quotes and commas as one would expect.
I’ve confirmed that the issue is with the blank fields not being represented in the csv file. When the transform runs I found an error in the Warning log stating that the import/transform is expecting 30 fields but found 28. Or something to that effect.
When I reported this to ServiceNow, they confirmed that this is an issue in Berlin, but it was not an issue in Aspen, where the file would process normally. You can manually add a , to represent each blank field, but this is not practical for a large file. I also found that opening the file in Excel, entering blanks in the missing fields of the first record, and then saving the file in .csv format resolved the issue. Apparently Excel then exports all columns, including blanks. Still, this is only practical for testing purposes and not something you want to do daily.
My only solution is to have the program generating the file fixed to include all the fields, even if they are blank.
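If the generating program can't be changed, one possible workaround is a small pre-processing step on the machine that produces the file, padding every row to the expected column count before the MID server picks it up. This is only a sketch (plain Node.js, with a hypothetical column count and file names, and a naive split that does not handle quoted commas):
// Pad each CSV row to a fixed column count so trailing blank fields are explicit
var fs = require('fs');
var EXPECTED_COLUMNS = 30; // hypothetical; set to the number of columns the transform expects
var lines = fs.readFileSync('users_export.csv', 'utf8').split(/\r?\n/);
var padded = lines.map(function (line) {
    if (line === '') return line; // leave a trailing blank line alone
    var missing = EXPECTED_COLUMNS - line.split(',').length; // naive: ignores quoted commas
    return missing > 0 ? line + new Array(missing + 1).join(',') : line;
});
fs.writeFileSync('users_export_padded.csv', padded.join('\n'));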
Jacob,
One of our clients upgraded to Berlin and I’d been using this particular file import functionality to pick up some user records to update fields not populated by LDAP. Ever since the upgrade the file has failed to import. I’ve checked the MID server logs and the ECC queue, and the probes have been launching. However they’ve been returning “Response status code: 200.” Was this particular integration broken by the Berlin release?
The free version that had been offered previously is no longer supported by Crossfuze. I have heard from some customers that Berlin may break the legacy application.
Our new commercial product, the Crossfuze Data Extract, is quite a bit better than the old free version and will work with legacy, current, and upcoming ServiceNow product releases. Please let me know if you’d like to see a demo of the new version.
Thanks,
Jacob
Hi,
I am loading data in SN using JDBC data source and scheduled import.
While loading the data, I am getting the error "Error during data load, skipping application module creation".
Any idea why this error occurs?
Is there any encryption being used for data transfer here?
The data is SSL encrypted during transit from the MID server to the ServiceNow instance.
Since we upgraded to Eureka, this function has stopped working.
ServiceNow technical support has advised the following:
The root cause is that the MID Server times out when it does not receive a response from the instance within 5 minutes.
The 5 minutes cannot be increased; it is a limit set by the storage device in the ServiceNow data center.
We tried reducing the size of our 2.4 MB file, but that made no difference; the file still would not upload.
I have been manually uploading until we find a fix or a supported method.
Hi Gareth, I would push back with SN support if I were you. They indicate that the issue is being caused by a setting on their data center storage device, but why would an instance upgrade cause that interaction to behave differently? I would assume that the storage device configuration is the same for their Dublin and Eureka instances. Seems like it’s the upgrade that has caused your issue, not the storage device config.
Any update on this from anyone? We have upgraded from Calgary directly to Eureka and our imports are no longer working.