Run a batch file (.bat or .vbs) after a Scribe Insight integration process runs in the Scribe Console

Four years ago, I wrote a post about running batch files before or after a Scribe DTS job. Here is a link to the old post:

http://www.mohamedibrahim.net/blog/2009/08/11/scribe-console-renaming-source-text-files-before-running-a-job-after-processing-and-regularly-changing-source-file-name/

Today, I was trying to do the same thing, and when I referred back to my post I was faced with a strange problem. I found that the Scribe Console integration process fires the Scribe DTS job and then, as post-processing, runs the batch file, but the process never runs again. No matter what you do, it will not run again until you reset it, change and re-apply it, or pause and then resume it. I found this really interesting. I tried a few options, including making sure Scribe has full access to the location, using a UNC path instead of a folder path, adding “Exit” to the end of the batch file, and a number of others, all to ensure the batch finishes and hands control back to the Scribe process. None of that worked: the process ran only once, and after the batch file had run, it never ran again. If I ran the batch separately it worked fine, and if I ran the process without the batch, all worked fine.

After a lot of trials and thinking, and I mean a lot of thinking, I remembered that I had exactly the same issue four years ago. I looked through my files and realised it was the exact same problem back then; I had just failed to mention it in my post four years ago (link above).

Checking my old files, I found that I had used the pre-job processing command instead of the post-processing command of the DTS package. So if I want a Scribe DTS job to run and then a batch file to copy the source file to an archive location, rename it, timestamp it and delete it from its original location, I do it the other way around. The solution is to set up your batch file to run before the job: the batch runs, copies the file to the archive folder, renames and timestamps it, and then the job runs. Once the job has run, the option to delete the event file after execution does the deletion for me (my source file is also my event file).

So the solution, in short, is: set up your batch file to execute before your Scribe Console DTS job runs, instead of after it. And it works…
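
For illustration, here is a minimal sketch of what such a pre-job batch file could look like. The paths and file names are assumptions for this example only, and the date/time substrings depend on your regional settings, so adjust both to your environment:

    @echo off
    rem Minimal pre-job batch sketch. The paths and file names below are
    rem assumptions for this example; adjust them to your environment.
    set SRC=C:\Scribe\Source\customers.txt
    set ARCHIVE=C:\Scribe\Archive

    rem Build a yyyymmdd-hhmm timestamp from %date% and %time%.
    rem The substring positions assume a dd/mm/yyyy regional date format.
    set STAMP=%date:~-4%%date:~3,2%%date:~0,2%-%time:~0,2%%time:~3,2%
    set STAMP=%STAMP: =0%

    rem Copy the source file to the archive folder with a timestamped name.
    rem The original is not deleted here: Scribe removes it via the
    rem "delete event file after execution" option once the job has run.
    copy "%SRC%" "%ARCHIVE%\customers-%STAMP%.txt"
    exit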

As for the different options I tried to get the batch to run after processing the DTS package, I found the following links on Scribe OpenMind useful for ideas on how to get it to work (unfortunately, none of them worked for me):

https://openmind.scribesoftware.com/topics/prepost-processing-commands-using-vbscript-vbs

https://openmind.scribesoftware.com/topics/pre-job-batch-script-not-running-in-file-based-int


Finally, I know that as an MVP I should report this back to Scribe, and I will try to do so. That said, I would not be surprised if it turns out I am simply doing something wrong in my batch file or configuration, but as you can see from the two OpenMind links above, a number of people have exactly the same issue.


Methods to bulk delete Microsoft Dynamics CRM records, and using Scribe Insight to perform a bulk delete of all CRM records

I’m sure many people have needed to do a bulk delete operation on Microsoft Dynamics CRM 4.0. You may have uploaded thousands of records from an imported file, migrated them through Scribe, or even used a .NET application to mass-create records.

Unfortunately, as far as I can see, there is no straightforward way to bulk delete records in Dynamics CRM 4.0 using its out-of-the-box functionality and interface.

To bulk delete records in Dynamics CRM 4.0, you have the following main options:

  • Get a third-party tool or CRM add-on to bulk delete records. This option is straightforward, but you may have to pay to purchase or use the tool, and it may have security implications. I would not recommend it to my clients, as the tool is most probably created by a small company or an individual that I do not know; it would therefore be rather difficult to put it on a live production environment or client server, let alone add it to CRM Online or to a CRM solution hosted by a partner.
  • Use the CRM SDK to write a .NET application (or a .NET console application) that runs and deletes all records for a specified entity or entities. This is a more robust way of doing it, but it may take more time and is probably not suitable for people who do not come from a .NET development background.
  • Use Scribe Insight. This is what this post is really about: using Scribe Insight to bulk delete Dynamics CRM records.

Please note: this is a workaround. It is not supported by Scribe, and the advice in this post is provided as is, with no warranty. I have tried it and it works perfectly, but I cannot guarantee it will produce the same acceptable results in any other environment.

Here is what you need to do:

  1. Create a new Scribe Workbench DTS (or job). Point it to your usual source file (even a sample one) and connect to CRM: either IFD forms for hosted CRM or a direct connection.
  2. Configure the target: create one Delete step on the target.
  3. Make sure the option “Allow multiple record matches on updates/deletes” is ticked under the All Steps tab.
  4. Under the Step Control tab, leave failures going to the next row, but change all the success results (Success (0), Success (1) and Success (>1)) to End Job. Select the success radio button at the bottom and write a message to your log, such as “All records deleted”.
  5. No data links are needed, since you are only deleting.
  6. On the lookup link, make the lookup condition impossible to satisfy, for example: Account Name = 123456789, or whatever.
  7. Run the DTS.

The job will read the first source line and then try to find that record at the target (remember, it is an update/delete). Since we have set up the lookup link to look for something “impossible to find”, the result of the update will be Success (0).

Once this happens, Scribe will go and delete all records for your chosen entity (or CRM table). This is a complete bulk delete of all CRM records using Scribe.

Remember, it’s a workaround… that works.

Scribe: Moving a DTS from one location to another and changing the source file location in Scribe

A challenging issue with Scribe is how to move a DTS (job) and its source files (when the source is a text batch file) from one location to another, or from one server to another, without losing mappings, corrupting data links, lookup criteria and user variables, losing field names and basically corrupting the whole DTS.

Again, I looked online and could barely find any information or help in this regard (probably my mistake, as I didn’t search properly!). I found that the key to moving Scribe project files, including sources, to a new location or a new server is always the QETXT.ini file. This file is vital for pointing the DTS at the source file it should be looking at. QETXT.ini holds all the field names, mapping S1 to the name you chose for the first source field, and so on. It also holds the source file name, the ODBC name and the table name. From there you can do almost everything.

When you move the files across, you will obviously need to re-point the DTS to the new source location, but by editing QETXT.ini you can put back all the source (S1 to field name) mappings and point to the new source file name, ODBC name and everything else.
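
To give a rough idea of what you are editing, here is an illustrative sketch of a QETXT.ini entry. The section and key names are my assumptions based on what the file stores (the table name, source file name, ODBC name and the S1-to-field-name mappings), so compare them against the QETXT.ini that Scribe generated for your own project:

    ; Illustrative sketch only -- section and key names are assumptions;
    ; check them against your own generated QETXT.ini.
    [CUSTOMERS_TXT]                        ; the table name the DTS refers to
    FILE=C:\Scribe\Source\customers.txt    ; source file: re-point this on a move
    DSN=ScribeTextDSN                      ; the ODBC name
    FLD1=S1,AccountName                    ; S1 mapped to the field name you chose
    FLD2=S2,City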

This has proved very efficient and has now worked across my whole deployment.

One more important piece of advice: always try to have a dedicated folder for every source file. If you have more than one DTS job, make sure the source for each DTS job is in a separate, dedicated folder. This ensures you have a separate QETXT.ini file for each of them, so you can easily update the information inside it. It will still work with one large QETXT.ini file, but it is always better to separate the sources and their associated QETXT.ini files. You can always manually split one large file into source-specific files and put each source in a separate folder later on (which is what I did after “inventing” this best practice of having separate source folders!).

Microsoft Dynamics CRM 4.0 Internet Facing Deployment (IFD): the Internal Network Address and Subnet Mask list of IP addresses

Hi,

I have been setting up a test Microsoft Dynamics CRM server at home with an Internet-facing deployment, to see how hosted CRM would work. I went through a few issues which, thankfully, I managed to resolve. I will write another post focusing specifically on the deployment and IFD, but for now I want to cover one quick issue I had.

After Microsoft Dynamics CRM was set up fine, I used the IFD tool (available free on the Microsoft Downloads website) to enable the Internet-facing feature for the CRM server. In the IFD tool there is a list of machines you can tell the tool about, called “IFD Internal Network Address and Subnet Mask”. My first thought on seeing this was that it lists the internal computers that are allowed access to the server. Reading through the IFD documentation, I found it is actually the opposite: it is the list of IP addresses of computers which are not allowed to access the Dynamics CRM server from the Internet (IFD). In other words, any computer with an IP address in this list will only be allowed to connect to the CRM server internally; if you attempt to access the CRM server externally from that machine, you will not even get to the authentication form.

So assume you have added a computer with the IP address 192.168.1.88 to this list in the IFD tool.

If you attempt to connect from this machine to the Dynamics CRM server using the internal address, for example http://localhost/CRM/, it will work fine.

But if you attempt to connect from this machine to the CRM server using the external address, for example http://www.yourcrmserverdomainname.com/, it will NOT work.

I would describe this list as the internal addresses that are banned from accessing the Dynamics CRM server externally.
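
One thing worth noting: each entry in the list is a network address plus a subnet mask, so a single entry can cover a whole range of machines rather than a single IP. For example:

    Internal Network Address: 192.168.1.0
    Subnet Mask:              255.255.255.0
    => covers 192.168.1.0 to 192.168.1.255, so every machine in that range
       is treated as internal-only and banned from the external URL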

I had this issue myself: I was trying to access the CRM server externally from another machine at home and could not, until I found out that I had to remove that machine’s IP address from the list in order to access it.

I hope this helps someone!

Mohamed Ibrahim

SAP and Microsoft Dynamics CRM Integration using SCRIBE

A few months ago, I was asked to research the possibility of integrating two systems for a client, specifically using SCRIBE software. The two systems are Microsoft CRM and SAP. I did intensive research on the subject and came up with a nine-page document detailing the answer.

The quick answer is yes: SCRIBE is a good application that can be used to integrate Microsoft Dynamics CRM and SAP.

If you want the report I created, studying the strengths of SCRIBE and the possibility of using it for such an integration, please request it via a comment on this page and I will email you the document. I will also email you a technical specification document that SCRIBE sent me which details this integration. As far as I know, this technical document is not publicly available on their website, but you can always request it directly from them.

My technical report also includes some example case studies in which a Microsoft CRM system has been integrated successfully with SAP. It also lists some white papers and technical documentation that cover the subject in general, as well as the specific integration between SAP and CRM using SCRIBE.

The document does not include any reference to the client, the exact project specification or any information that could be confidential. It’s just simple facts and findings on SCRIBE and the possibility of using it for SAP and CRM integration.

*****  Updated 18/02/2010:

The document is now available on the Scribe Insight blog as a guest blog post: http://blog.scribesoft.com/2010/02/guest-post-crmsap-integration-using-scribe.html

I can still send you the document if you want; just request it via a comment below, please.

****** Mohamed Ibrahim Mostafa

A list of important questions you need to answer before starting any integration solution or project

I am currently working on an integration solution for one of our clients: a general integration between two systems. The main thing for me was to come up with a list of questions I need answered before I can start planning and designing the integration solution.

I thought about a list of general questions that most (if not all) consultants working on any integration solution will need complete answers to before starting the design phase, let alone the development phase, of the project.

In my opinion, the list of questions is as follows (in no real order – just a braindump!):

  • How many environments do you have? Development, test and live (recommended)? Or is the project still in development, so the live environment can be used for development? Where will the test environment be later on?
  • Is this a direct or an indirect integration? Is it an instant, event-driven integration or a periodic, scheduled integration between the two systems? Are there queues for the data to be migrated?
  • What backup and restore operations can you do? The ability to back up and restore data is vital.
  • Which integration application or tool are you going to use, or is available? SCRIBE, SQL Server Integration Services, web services (Microsoft .NET web services), console applications, plugins? Which SDKs will you need? The CRM SDK and CRM API, for example?
  • The environment structure: how many physical servers? Where are these servers located? Where is the integration tool or application installed?
  • How and when can I get access to the environment? Access to all servers is required, including access to all databases and all applications. For example, access to the Microsoft Dynamics CRM application (via the web client) is essential to confirm that data imported into CRM has been migrated successfully.
  • What type and format are the extracts and data imports? CSV (comma-separated values), XML, IDoc, SQL flat files, batch files, etc.
  • Where will the extracts be picked up from? Directly by the tool, or via an FTP server? Is an SFTP server required?
  • Are there duplicates? If so, where, and what classifies a duplicate?
  • Are there data entry standards for each application in the overall integrated system?
  • Are there fields that are required in each system that is part of this integration?
  • Are there fields that aren’t used?
  • Are there any fields with null values?
  • What relationships does the data have? Are there fields which are dependent on others?
  • What are the primary and foreign keys of all tables in each system that will be part of the integrated solution? Any field that does not allow nulls or is business required (and preferably business recommended) must have data upon migration (defaults can be used where needed).
  • What is the overall high-level mapping between the different systems?
  • What is the value, length, and format of fields/columns in the source system? What is the corresponding value, length and format in the target system?
  • Are there any pick lists? A cross-reference is required to map source and target values (see the sample cross-reference after this list).
  • What data validation is required and is acceptable to the client and the project stakeholders?
  • Differencing: what are the business rules for differencing? What data does not need to be updated, and when? What data needs to be updated based on the business requirements?
  • Use default values for all required fields and columns in the target system to avoid causing any errors.
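
As a small illustration of the pick list point above, the cross-reference is typically just a lookup table between the source values and the target values. The values below are made up for the example:

    Source value    Target (CRM) picklist value
    Gold            1
    Silver          2
    Bronze          3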

This is the list I have thought of so far. I will keep updating it as and when I think of something important that needs to be considered.

Let me know if you have any comments or feedback on these questions, and tell me whether or not you find them helpful.

Thanks for reading.