ERP Software Upgrades: What’s in it for Finance?

Software upgrades are typically managed under the supervision of a company’s IT department. However, when upgrading an ERP system such as Microsoft Dynamics GP, Finance is heavily impacted. This raises the question of what would cause a Dynamics GP upgrade to be initiated in the first place.

As stated above, Finance needs to be involved because they will feel the heaviest impact. There are a number of benefits for Finance during an upgrade. It is important for the CFO, Controller, etc. to stay involved and lead their team in adopting new functionality, gaining productivity, and streamlining processes based on the expanded feature set.

The “What’s New” documents typically list over 100 changes per release, yet most organizations don’t take advantage of them.

With GPUG (the Dynamics GP User Group) Summit beginning in Tampa, FL tonight, it is time for sessions and for learning more about GP and its roadmap.

Tomorrow, I will be presenting a session, “Upgrades: What’s in it for Finance?”, at 2 pm. Come to this session and we’ll dig deeper into who initiates upgrade projects, why upgrades are done, and how Finance can benefit. I will add the presentation slide deck to the blog following the session.

Microsoft Dynamics GP Native BI

I am currently attending the Microsoft Partner Connections Event in Tampa, FL. Belinda Allen (@BelindaTheGPcsi) presented on the types of BI tools available natively within Microsoft Dynamics GP. This session got me thinking about how often I have worked with companies that ask for Business Intelligence, and what they mean by it. So often, people associate BI with either reports or dashboards. Belinda pointed out, and I agree, that BI is not about the reports or the dashboards but about data provided in a timely, meaningful way to analyze and direct decisions within the business.

There are so many cases where business owners request Business Intelligence but do not know what information they actually need. Belinda’s presentation was on target in saying partners need to help customers see what they need and what they can do. She suggested starting by asking what information would make it easier to create or achieve goals. From there, you can look at the problems you have and what information would help resolve them and / or prevent them from happening again. Finally, look at what you would have done differently if you had one additional piece of information.

Belinda described three types of BI:

  • Past information (Tactical)
  • Present information (Operational)
  • Future information (Strategic)

Management Reporter provides trend reports and analysis of the past, which helps you see what was done tactically.

Strategic reports include GP Budgets, the Excel Budgeting Wizard and Forecaster. These help you see what you want to accomplish going forward.

Operational BI monitors current processes for managers and front-line employees. This can be reported via built-in reports and inquiries within Dynamics GP. SmartList and SmartList Designer provide ad-hoc query capabilities. Reminders on the home page are also BI. Business Analyzer also provides graphical reports, which can be shown within GP or on the desktop.

Excel refreshable reports with Power Pivot allow the creation of dashboards within Excel using data connections.

This session was extremely helpful in developing a good approach for introducing BI. Building upon this, I think it opens companies up to seeing some information and then gradually asking for more. This is a great opportunity to help businesses use data better and further their growth.

I look forward to seeing more over the week on BI and Dynamics GP.

MDA: The Power to Link Data Across Modules

Recently, I was presented with the challenge of producing a customer profitability report out of Microsoft Dynamics GP. My first thought, of course, was that the report already exists, so why would I need to create one?

In looking further at how customer profitability was being tracked, I found the traditional sales model: the customer purchased product for a price, the product had an associated cost, and we could produce a profitability report based on the FIFO layers of inventory.

The next component that entered the picture, though, was shipping charges paid to a third-party carrier, as well as broker fees, marketing rebates, etc., which were being entered directly into accounts payable. Those needed to be added to or subtracted from the cost of the item when calculating customer profitability.

By using MDA (Multidimensional Analysis) to require a customer ID on the key general ledger accounts, we were able to code accounts payable transactions to customers. Taking it one step further, we used a code in the user-defined fields on the chart of accounts to categorize accounts by which column of the report they belong in.

By joining the GL transactions to the MDA transactions on the journal entry number, we could associate a customer with those specific journal entries or payables transactions. By combining that data with the sales data using the originating master ID, we were able to capture the sale and the cost in the more traditional distribution sense.

Once that data was gathered, I added it to a stored procedure taking a beginning date and an ending date as parameters so the user could select the period to review.

I then created an SSRS report which placed the data in the appropriate columns and added totals and percentages.

The SQL code for the stored procedure is below.

/****** Object:  StoredProcedure [dbo].[usp_CustomerProfitReport] Script Date: 05/05/2013 16:34 ******/
SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

ALTER PROCEDURE [dbo].[usp_CustomerProfitReport] (@StartDate Date, @EndDate Date)
AS
/*
  NAME: usp_CustomerProfitReport
  AUTHOR: Howard Swerdloff, Consultant
  DATE: May 5, 2013

  Execution: usp_CustomerProfitReport '01-01-2013 00:00:00.000', '03-31-2013 00:00:00.000'

  CHANGE LOG:
  DATE          AUTHOR              DESCRIPTION
  =========================================================================================
  May 5, 2013   Howard Swerdloff    Initial Development
*/

-- Select MDA Transactions

SELECT
  AIndex.ACTNUMST AS ACCOUNT,
  AMast.ACTDESCR AS [DESCRIPTION],
  AMast.USRDEFS1 AS CATEGORY,
  CTrx.SERIES, CTrx.JRNENTRY, CTrx.SOURCDOC, CTrx.REFRENCE, CTrx.DSCRIPTN,
  CTrx.TRXDATE, CTrx.TRXSORCE, CTrx.ORGNTSRC, CTrx.ORCTRNUM, CTrx.ORMSTRID,
  CTrx.ORMSTRNM, CTrx.ORDOCNUM, CTrx.ORTRXSRC, CTrx.OrigSeqNum, CTrx.SEQNUMBR,
  CTrx.DEBITAMT, CTrx.CRDTAMNT,
  MDtl.GROUPID, MDtl.CODEID, MDtl.CODEAMT,
  Cust.CUSTNAME, Cust.CUSTCLAS
FROM GL00100 AMast
  INNER JOIN GL00105 AIndex
    ON AMast.ACTINDX = AIndex.ACTINDX
  INNER JOIN GL20000 CTrx
    ON AMast.ACTINDX = CTrx.ACTINDX
  LEFT OUTER JOIN DTA10100 MHdr
    ON CTrx.JRNENTRY = MHdr.JRNENTRY
    AND CTrx.SEQNUMBR = MHdr.SEQNUMBR
  LEFT OUTER JOIN DTA10200 MDtl
    ON MHdr.DTAREF = MDtl.DTAREF
    AND MHdr.SEQNUMBR = MDtl.SEQNUMBR
  LEFT OUTER JOIN RM00101 Cust
    ON MDtl.CODEID = Cust.CUSTNMBR
WHERE AMast.USRDEFS1 NOT IN ('N/A', '')
  AND MHdr.DTAREF IS NOT NULL
  AND CTrx.TRXDATE BETWEEN @StartDate AND @EndDate

UNION ALL

-- Select Non MDA Sales Transactions

SELECT
  AIndex.ACTNUMST AS ACCOUNT,
  AMast.ACTDESCR AS [DESCRIPTION],
  AMast.USRDEFS1 AS CATEGORY,
  CTrx.SERIES, CTrx.JRNENTRY, CTrx.SOURCDOC, CTrx.REFRENCE, CTrx.DSCRIPTN,
  CTrx.TRXDATE, CTrx.TRXSORCE, CTrx.ORGNTSRC, CTrx.ORCTRNUM, CTrx.ORMSTRID,
  CTrx.ORMSTRNM, CTrx.ORDOCNUM, CTrx.ORTRXSRC, CTrx.OrigSeqNum, CTrx.SEQNUMBR,
  CTrx.DEBITAMT, CTrx.CRDTAMNT,
  MDtl.GROUPID, MDtl.CODEID, MDtl.CODEAMT,
  Cust.CUSTNAME, Cust.CUSTCLAS
FROM GL00100 AMast
  INNER JOIN GL00105 AIndex
    ON AMast.ACTINDX = AIndex.ACTINDX
  INNER JOIN GL20000 CTrx
    ON AMast.ACTINDX = CTrx.ACTINDX
  LEFT OUTER JOIN DTA10100 MHdr
    ON CTrx.JRNENTRY = MHdr.JRNENTRY
    AND CTrx.SEQNUMBR = MHdr.SEQNUMBR
  LEFT OUTER JOIN DTA10200 MDtl
    ON MHdr.DTAREF = MDtl.DTAREF
    AND MHdr.SEQNUMBR = MDtl.SEQNUMBR
  LEFT OUTER JOIN RM00101 Cust
    ON CTrx.ORMSTRID = Cust.CUSTNMBR
WHERE AMast.USRDEFS1 NOT IN ('N/A', '')
  AND MHdr.DTAREF IS NULL
  AND CTrx.SERIES = 3
  AND CTrx.TRXDATE BETWEEN @StartDate AND @EndDate

UNION ALL

-- Select Non MDA Non Sales Transactions

SELECT
  AIndex.ACTNUMST AS ACCOUNT,
  AMast.ACTDESCR AS [DESCRIPTION],
  AMast.USRDEFS1 AS CATEGORY,
  CTrx.SERIES, CTrx.JRNENTRY, CTrx.SOURCDOC, CTrx.REFRENCE, CTrx.DSCRIPTN,
  CTrx.TRXDATE, CTrx.TRXSORCE, CTrx.ORGNTSRC, CTrx.ORCTRNUM, CTrx.ORMSTRID,
  CTrx.ORMSTRNM, CTrx.ORDOCNUM, CTrx.ORTRXSRC, CTrx.OrigSeqNum, CTrx.SEQNUMBR,
  CTrx.DEBITAMT, CTrx.CRDTAMNT,
  MDtl.GROUPID, MDtl.CODEID, MDtl.CODEAMT,
  Cust.CUSTNAME, Cust.CUSTCLAS
FROM GL00100 AMast
  INNER JOIN GL00105 AIndex
    ON AMast.ACTINDX = AIndex.ACTINDX
  INNER JOIN GL20000 CTrx
    ON AMast.ACTINDX = CTrx.ACTINDX
  LEFT OUTER JOIN DTA10100 MHdr
    ON CTrx.JRNENTRY = MHdr.JRNENTRY
    AND CTrx.SEQNUMBR = MHdr.SEQNUMBR
  LEFT OUTER JOIN DTA10200 MDtl
    ON MHdr.DTAREF = MDtl.DTAREF
    AND MHdr.SEQNUMBR = MDtl.SEQNUMBR
  LEFT OUTER JOIN RM00101 Cust
    ON CTrx.ORMSTRID = Cust.CUSTNMBR
WHERE AMast.USRDEFS1 NOT IN ('N/A', '')
  AND MHdr.DTAREF IS NULL
  AND CTrx.SERIES != 3
  AND CTrx.TRXDATE BETWEEN @StartDate AND @EndDate

Dynamics GP SmartList Builder Security

SmartList is a common ad-hoc reporting tool within Dynamics GP, used by many customers. What happens when you try to add that one field that is not in the list of available columns, or you want that one piece of data that does not reside within Dynamics GP? This is where SmartList Builder comes in.

SmartList Builder adds the power to create your own SmartLists. Here, we want to address the number one question I get: can I modify an existing SmartList? The short answer is no. The longer answer is that you can create a new SmartList replicating the existing one with the changes you want, and then remove the old one through security. As we go through the security model, you will see how to do that.

There are several areas that need to be addressed when looking at SmartList security.


First, within SmartList Builder itself, there is a security menu. You have two choices: data connections and SQL tables. For purposes of this article, we will focus on SQL table security.


This is where you can grant SmartList Builder access to specific databases and / or tables and views. You will notice that all databases are listed, so you can grant access to tables and views outside of Dynamics GP. Often the key to creating the SmartList users are asking for is to first create a view that pulls the data together, including any unique calculations, case statements, etc. If a table relationship does not exist in GP out of the box, it needs to be defined in a SQL view, like the sketch below.
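For example, here is a minimal sketch of such a view. It joins the customer master to the posted receivables transactions to add a calculated gross profit column; the view name and the calculation are my own illustration, so verify the table and column names against your database before using it.

CREATE VIEW dbo.vw_CustomerSales
AS
-- Illustrative only: one row per posted receivables document, with the
-- customer name and class pulled in and a calculated gross profit column.
SELECT
  Cust.CUSTNMBR,
  Cust.CUSTNAME,
  Cust.CUSTCLAS,
  Trx.DOCNUMBR,
  Trx.DOCDATE,
  Trx.SLSAMNT,
  Trx.COSTAMNT,
  Trx.SLSAMNT - Trx.COSTAMNT AS GROSSPROFIT
FROM RM00101 Cust
  INNER JOIN RM20101 Trx
    ON Cust.CUSTNMBR = Trx.CUSTNMBR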

Once security is granted to the appropriate objects, SmartList Builder can be used to create the SmartList. Building the SmartList is outside the scope of this article, but will be covered in a future blog post.

After building a new SmartList, typically only sa and the users assigned to the Power User role will have access to it. The next step is to grant security within the Dynamics GP role based model.


In Security Task Setup, you can add to an existing task or create a new task. The task must be assigned to a role, and the role to a user. The idea is to select SmartList for the product, SmartList Object for the Type and SmartList Objects for the Series. This will give you a list of all SmartLists, including the ones you’ve created with SmartList Builder. Add a checkmark to the new ones you want the users to see. Remember, you can create multiple tasks, assigning some SmartLists to one group of users and other SmartLists to another group.

In the case mentioned earlier where you want to modify an existing SmartList, you can remove security from the existing SmartList and replace it with the new SmartList you’ve created.

GP 2010 Multicurrency Rate Updates

Microsoft Dynamics GP provides the ability to track multiple currencies, rate tables and exchange rates. The challenge has always been keeping the tables updated, and with what frequency. With out-of-the-box functionality, GP requires a user to go into the rate table maintenance screen and enter the rate table ID, the exchange rate, the effective date and the expiration date (the date through which the rate is valid). This requires looking up the currency FX rate before entering the data in GP, which can become a chore for someone to do at the beginning of each day, especially if there are multiple currencies and sources. You then need a backup for that person in case of vacation, sick time, etc.

There are many independent software vendors (ISVs) that provide automated mechanisms to update the rates. They can be tied to a source of data and set on a schedule for maintaining the rate table. Four items are critical in selecting how this is to be done:

  1. What are the local reporting requirements? E.g., many countries require you to use rates furnished by their “national bank”.
  2. What sources does the ISV connect to? Can they feed from multiple sources if countries require rates from their national banks?
  3. Does the source include all currencies you anticipate using?
  4. Are there subscription fees associated with the rate feed in addition to the price of the ISV product?

I have found that it is a good idea to contact the taxing authorities in each country directly to find out what the requirements are. In one country as an example, I found almost a 5% difference in the rates reported by online currency translation services and the national bank. The taxing authority in that country required all books and tax returns to be kept and reported using the rates of the national bank.

In researching ISV products, I found each had a set list of rate providers with whom they had pre-built integrations. In one case, the list did not include all the currencies I was using. In another, it had all the currencies I needed, but the source required me to use them for all international wire transfers, and even went as far as saying they would not provide the feed if I did not open an account with their bank.

Also, not many ISVs allow multiple sources of rates, which is needed where countries require a feed from their national bank.

I found that most sources of currency FX rates provide subscription services, many at no charge. Their technical support staff can give you the code required to call their web services and get the rates you need. You then need a developer to write a Windows service to call the provider’s service and update the rate table in GP. This turned out to be a very easy project for a developer I used, and once the overall model is built, it provides the ability to quickly and easily add new currency conversions and sources. A sketch of the final table update appears below.
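To give a sense of what the developer’s service ultimately does, here is a minimal sketch of that last step, assuming the rates land in the MC00100 exchange rate table in the DYNAMICS system database. The rate table ID, rate and dates are invented; confirm the table and column names in your own environment before relying on this.

-- Illustrative only: write one day's rate into the exchange rate table.
-- 'Z-US$-EUR-D' and 0.7645 are invented sample values.
INSERT INTO DYNAMICS.dbo.MC00100 (EXGTBLID, EXCHDATE, TIME1, XCHGRATE, EXPNDATE)
VALUES ('Z-US$-EUR-D', '2013-05-06', '1900-01-01 00:00:00.000', 0.7645, '2013-05-06')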

The key message here: if you do not want to look up and modify rates manually, understand your requirements, look carefully at the ISVs and their sources of currency feeds, and if it doesn’t look like it will work, do not worry, because a developer should be able to help easily. Don’t forget that you can always ask the ISV for a modification as well.

Posting takes a long time in GP 2010

When Dynamics GP posts to the general ledger, it utilizes a temporary table called PJOURNAL, which resides in each company database in your environment. Posting slows down when records remain in this table.

Per KB article 898982 (http://support.microsoft.com/kb/898982), you can delete all records from the PJOURNAL table in each company database. I have found that this significantly increases the speed of posting.
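As a minimal sketch of what the KB article describes, run the delete in each company database (TWO is the sample company here; substitute your own database names and take a backup first):

-- Clear the posting work table; repeat for each company database.
USE TWO
DELETE FROM dbo.PJOURNAL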

If you find that you are experiencing this issue, you can enable a job under SQL Server Agent called “Remove Posted PJournals”. This job goes through all of your company databases and clears the table. To enable the job, open SQL Server Management Studio and expand your server instance.

Expand SQL Server Agent, then Jobs. Find the job entitled “Remove Posted PJournals”, right-click it, choose Properties, and mark the “Enabled” check box.

By default, this job runs every 30 minutes. If you want to change the schedule, you can right-click the job, choose Properties and go to the Schedules page.
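If you prefer a script to clicking through SSMS, the same job can be enabled from T-SQL; a quick sketch, assuming the job exists under this exact name:

-- Enable the cleanup job without opening the job properties dialog.
EXEC msdb.dbo.sp_update_job
  @job_name = N'Remove Posted PJournals',
  @enabled = 1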

New GP 2013 Upgrade Blog Series

Kelly Youells from Microsoft has been busy publishing blogs and information on upgrading to GP 2013. She is planning a blog series as follows:

March 28 – Upgrade Planning – Let’s Get Started!
April 2 – Known Upgrade Issues
April 4 – Version Checks & Upgrades
April 9 – Database Upgrade – The Details!
April 11 – Modified Dictionary Upgrade
April 16 – Upgrade Troubleshooting
April 18 – Microsoft Dynamics GP is Upgraded – What’s Next?

Watch https://community.dynamics.com/gp/b/dynamicsgp/archive/2013/03/25/microsoft-dynamics-gp-2013-upgrade-blog-series-schedule.aspx for more information. Links will be updated as the blogs are published.

Thanks for all the great information, Kelly.

Changing Fixed Asset Depreciation Method Doesn’t Save

When changing the fixed asset depreciation method from straight-line to an accelerated method in the Asset Book window, clicking Save resets the depreciation method back to straight-line.

To understand the reason, it is important to understand the function of the switch-over method. The switch-over method provides a choice of no switch or straight-line. At the beginning of each year, Dynamics GP compares the switch-over method against the depreciation method chosen, and depreciation is calculated using whichever method provides the most depreciation. This is how an accelerated method normally behaves: declining balance yields more depreciation than straight-line in the early years, then switches to straight-line once straight-line would yield more.

For this reason, straight-line is not a valid switch-over method when an accelerated depreciation method is chosen. If changing from straight-line to any of the double declining balance methods, you need to choose no switch-over in order for the record to save.

Customizing Navigation Lists vs Personalizing

When looking at Navigation Lists in GP, there is often a question of what can be done that is visible to all users vs. what remains only for the user making the change. This is the difference between Navigation List personalization and customization.

The following are the types of things you can do via personalization:

  • Changes are visible only to you
  • Drag and drop columns into another sequence
  • Change the sort
  • Hide or show the information pane
  • Hide or show the filter area
  • Hide or show the action pane

With customization, you can do all of the above, plus:

  • Set who can see the changes
  • Set defaults for the list
  • Set which actions are available on the action pane

Dynamics GP 2010 Fixed Asset Projection Button Grayed Out

Recently, while a user was running the fixed asset projection routine in Dynamics GP 2010, the network connection was lost. After resolving the issue and going back into GP, the user found the Projection button grayed out in Fixed Asset Routines. There is a KB article on CustomerSource, and many blogs are available on this subject; most say to delete the records in the FA41900 table. This did not work.

I found a KB article on Microsoft’s public site, 856544 (http://support.microsoft.com/kb/856544#appliesto), which also indicated the records in the FA40203 table needed to be deleted. This resolved the issue; a sketch of both deletes is below.
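For reference, this is the cleanup the two KB articles describe, run against the affected company database. Back up the database first; the statements below are a sketch of the fix rather than a supported script.

-- Remove the stranded projection records (per the CustomerSource KB and blogs).
DELETE FROM dbo.FA41900
-- Remove the additional records indicated by KB 856544.
DELETE FROM dbo.FA40203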

Maintaining Dynamics GP Integrations in a Multi-user Environment

Microsoft Dynamics GP Integration Manager offers an easy-to-use interface for importing master or transactional data into Dynamics GP. When using text files, and without getting into VB scripting to prompt a user for the name of the file to be integrated, you are required to save the file in the same location with the same name in order for the integration to run without modification.

In larger organizations, where multiple people run the same integrations or you want to limit which integrations users have access to, this becomes a small challenge. There are a couple of different ways this can be handled.

Integration Manager has an option where you point to the location of the Integration Manager database (typically an .imd file). Each user can point to their own database, which allows them to save files in their own location with their own names. This presents a maintenance challenge, as you start accumulating multiple copies of the same integrations: as one changes, you need to update all of the similar integrations.

You can also have everyone point to a shared Integration Manager database but have multiple integrations for the same task. This still gives users the ability to save their files where they like, but it does not alleviate the issue of maintaining multiple integrations for the same task.

My preferred method is to have an Integration Manager database for each department on the network share. Each integration runs from a local source location, which could be the user’s local C: drive or their user folder on the network. This way there is one copy of each integration, but multiple people can run the same integration at the same time, since the source files are in different locations.

Each user can then have a shortcut to the .imd file on their desktop or have the Integration Manager configuration point to the database for their department.

For testing purposes and integration modifications, I suggest a master integration database that integrations can be imported to or from. This provides a test environment for whoever is writing or modifying the integrations.

Integrating Vendor EFT Information Into Dynamics GP

Recently, I was asked to integrate vendor information into Dynamics GP. At first glance, this is a straightforward integration that is done all the time. As I went over the file, however, I found one of the requirements was to include EFT information along with the address. These fields are not available using either the Dynamics GP or eConnect destination adapters.

It is possible, of course, to develop an application that calls web services or writes directly to the tables, and while that is a better solution, it would take significantly longer than writing something in Integration Manager.

The first step is to map the vendor information using standard Integration Manager mapping. Following that, you want to put a script in the “Before Document Commit” area of the integration object. This script runs after Integration Manager validates all other data in the record, but before the record is posted to the database.

The first step in the script is to declare the variables you will use to enter data in each field, using the Dim statement. Included in the variables should be objects for both the database connection and a recordset, so that records can be added to the EFT address master table (SY06000).

Dim objRec
Dim objConn

Set objRec = CreateObject("ADODB.Recordset")
Set objConn = CreateObject("ADODB.Connection")

The following is a sample of adding variables for the data fields.

Dim ADRSCODE
Dim EFTBankCode
Dim EFTTransitRoutingNumber

You then assign a value to each variable by using the variable name followed by an equals sign (=) and the value. In the case of Integration Manager, we want to read those values from the source file provided; the SourceFields collection is available for that. For example, assuming we set up a source object called vendors with a field called Address ID, we could set the value of ADRSCODE by using:

ADRSCODE = SourceFields("vendors.Address ID")

I also recommend using the Trim function to remove any leading or trailing spaces.

ADRSCODE = Trim(SourceFields("vendors.Address ID"))

A string variable (cmdString in the full script below) is used to build the SQL statement that is executed at the end of the script.

cmdString = "insert into [dbo].[SY06000] … "

Also, it is very important to close your connection at the end of the script. If you don’t, each record imported will open a new connection, and at some point you will run out of system resources because too many connections are open.

The following is an example of a script after all these components are put together. It is only a sample and is provided as is. You will need to modify it based on your specific field names and the fields you want to update. Be sure to change the database name and password in the connection string as well.

Dim objRec
Dim objConn
Dim cmdString
Dim VendorID
Dim ADRSCODE
Dim EFTBankType
Dim BankName
Dim EFTBankBranch
Dim IntlBankAcctNum
Dim Bnkctrcd
Dim CBankcd
Dim EFTBankCheckDigit
Dim RegCode1
Dim EFTTransitRoutingNo
Dim Curncyid
Dim EFTBankCode
Dim RegCode2

' Read and trim each value from the source file
VendorID = Trim(SourceFields("VendorsEFT.Vendor ID"))
ADRSCODE = Trim(SourceFields("VendorsEFT.Address ID"))
EFTBankType = Trim(SourceFields("VendorsEFT.EFTBankType"))
BankName = Trim(SourceFields("VendorsEFT.BankName"))
EFTBankBranch = Trim(SourceFields("VendorsEFT.EFTBankBranch"))
IntlBankAcctNum = Trim(SourceFields("VendorsEFT.IntlBankAcctNum"))
Bnkctrcd = Trim(SourceFields("VendorsEFT.BNKCTRCD"))
CBankcd = Trim(SourceFields("VendorsEFT.CBANKCD"))
EFTBankCheckDigit = Trim(SourceFields("VendorsEFT.EFTBankCheckDigit"))
RegCode1 = Trim(SourceFields("VendorsEFT.RegCode1"))
EFTTransitRoutingNo = Trim(SourceFields("VendorsEFT.EFTTransitRoutingNo"))
Curncyid = Trim(SourceFields("VendorsEFT.Currency ID"))
EFTBankCode = Trim(SourceFields("VendorsEFT.BANK CODE"))
RegCode2 = Trim(SourceFields("VendorsEFT.RegCode2"))

' Open the connection to the company database
Set objRec = CreateObject("ADODB.Recordset")
Set objConn = CreateObject("ADODB.Connection")
objConn.ConnectionString = "Provider=MSDASQL;DSN=Dynamics GP;Initial Catalog=TWO;User Id=sa;Password=PASSWORD"
objConn.Open

' Build the insert into the EFT address master table (SY06000)
cmdString = "INSERT INTO [TWO].[dbo].[SY06000]" & _
"([SERIES]" & _
",[CustomerVendor_ID] " & _
",[ADRSCODE] " & _
",[VENDORID] " & _
",[CUSTNMBR] " & _
",[EFTUseMasterID] " & _
",[EFTBankType] " & _
",[FRGNBANK] " & _
",[INACTIVE] " & _
",[BANKNAME]" & _
",[EFTBankAcct] " & _
",[EFTBankBranch] " & _
",[GIROPostType] " & _
",[EFTBankCode] " & _
",[EFTBankBranchCode] " & _
",[EFTBankCheckDigit] " & _
",[BSROLLNO] " & _
",[IntlBankAcctNum] " & _
",[SWIFTADDR] " & _
",[CustVendCountryCode] " & _
",[DeliveryCountryCode] " & _
",[BNKCTRCD] " & _
",[CBANKCD] " & _
",[ADDRESS1] " & _
",[ADDRESS2] " & _
",[ADDRESS3] " & _
",[ADDRESS4] " & _
",[RegCode1] " & _
",[RegCode2] " & _
",[BankInfo7] " & _
",[EFTTransitRoutingNo] " & _
",[CURNCYID] " & _
",[EFTTransferMethod] " & _
",[EFTAccountType] " & _
",[EFTPrenoteDate] " & _
",[EFTTerminationDate]) " & _
"VALUES " & _
"(4 " & _
",rtrim('" & VendorID & "')" & _
",rtrim('" & ADRSCODE & "')" & _
",rtrim('" & VendorID & "')" & _
",'' " & _
",rtrim('1') " & _
",rtrim('26') " & _
",rtrim('1') " & _
",rtrim('0') " & _
",rtrim('" & BankName & "') " & _
",'' " & _
",rtrim('" & EFTBankBranch & "') " & _
",1" & _
",rtrim('" & EFTBankCode & "') " & _
",'' " & _
",rtrim('" & EFTBankCheckDigit & "') " & _
",'' " & _
",rtrim('" & IntlBankAcctNum & "') " & _
",'' " & _
",'' " & _
",'' " & _
",rtrim('" & Bnkctrcd & "') " & _
",rtrim('" & CBankcd & "') " & _
",'' " & _
",'' " & _
",'' " & _
",'' " & _
",rtrim('" & RegCode1 & "') " & _
",rtrim('" & RegCode2 & "') " & _
",0" & _
",rtrim('" & EFTTransitRoutingNo & "') " & _
",rtrim('" & Curncyid & "') " & _
",2 " & _
",1 " & _
",rtrim('1900-01-01 00:00:00.000') " & _
",rtrim('1900-01-01 00:00:00.000')) "

' Execute the insert, then close the connection
Set objRec = objConn.Execute(cmdString)
objConn.Close

Installing ISV products to multiple instances of GP 10

I recently needed to install the Spanish version of Dynamics GP on the same terminal server as the English version. Since GP 10 allows you to install multiple instances, that was not an issue. Where I ran into problems was with certain ISV products that required installers rather than just dropping a .cnk file into the GP directory. When trying to install for the second time, I received a choice of uninstalling or repairing the existing installation.

The workaround I found was to install to the initial instance and then copy that folder to a backup location. I could then uninstall the application and reinstall it to the second GP instance. Once that was complete, I copied the backup folder back over the first instance’s GP folder. I then had the ISV product working in both instances.

Maintaining Test Environments for Dynamics GP

Often you hear about the importance of building test environments before upgrading. But what about maintaining the test environment for reporting, customizations, integrations and new functionality? It is important to keep a test system up and running that replicates your production environment as closely as possible. This gives users a place to go when they want to try something and check the results before proceeding in production.

To maintain the system, you can restore production data on a routine basis. Typically, monthly is good, unless there is an issue requiring testing; in that case, restore the most recent backup. Keep in mind that once the databases are restored, you will need to create the logins and reset the SQL passwords in order for users to log in. This is also a good way to make sure your SQL jobs are actually backing up to something that can be restored. A sketch of the restore is below.
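As a minimal sketch of the routine refresh, assuming the sample company database TWO and an invented backup path; afterward, remap the GP users’ logins (sp_change_users_login can help) so they can log in to the restored copy.

-- Refresh the test copy of a company database from a production backup.
-- Database name and backup path are illustrative only.
USE master
RESTORE DATABASE TWO
FROM DISK = N'D:\Backups\TWO_Full.bak'
WITH REPLACE, RECOVERY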

I also recommend working on reports using local reporting tools. Deploy them to test once you are comfortable that they work. You can then move them to production after the users sign off that the report is what they want.