Dynamic Code Blocks - Microsoft Dynamics GP

Updating Dynamics GP Comment ID in Sales Order Entry


A library of prewritten comments can be defined in Dynamics GP, each stored under a name known as the Comment ID. When a SOP document is entered, selecting or typing the comment ID stamps that order with the prewritten comment. Think of it like the old rubber stamps used to stamp comments onto documents with an ink pad. A COPY of the comment is put against the order in the SOP10106 table, in the field CMMTTEXT (and the other comment fields too, more on them later).

SOP Entry Comment ID Window showing manual text entry

If the source comment is changed at a later point in time, then any future documents will be stamped with the new text, but documents that have already been stamped will keep the old wording, as it is a COPY of the comment as it was when it was attached to the order. See later for ideas on “fixing” this.

It is also possible, and normal, to go in and edit the comment text, adding to or changing the text that was added from the comment ID. The comment ID name will remain in this case, so you can still see that the comment was applied at some point, and a pencil edit icon will be shown next to the comment ID to indicate it has been manually updated.

Pencil Edit Icon on Comment ID

If the comment ID name is removed from the field then the whole comment is erased, including any manual edits.

For the developers out there, note that the comment is added to the SOP10106 table immediately on entering the comment ID, before the order is saved. Also note that the (text type) CMMTTEXT field is broken up into chunks of 51 characters over a number of comment fields. This is to support report outputs that cannot handle the large text type field.

COMMENT_x fields 1-4 char(51) break up CMMTTEXT
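As a quick way to see this in the data, here is a minimal look-up query; the field names are as they appear in SOP10106, while the document number and SOP type are placeholder examples to substitute with your own:

SELECT SOPTYPE, SOPNUMBE, COMMNTID,
       CAST(CMMTTEXT AS VARCHAR(MAX)) AS FullComment,
       COMMENT_1, COMMENT_2, COMMENT_3, COMMENT_4  -- the 51 character report chunks
FROM SOP10106
WHERE SOPNUMBE = 'ORD00001'   -- placeholder document number
  AND SOPTYPE = 2             -- 2 = Order

Both the full CMMTTEXT and the COMMENT_x chunks come back, so you can see exactly what a report would pick up for the document.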

I want to update the existing comments from comment IDs

So something has changed about your standard comment and the comment in the Comment ID has been updated. That deals with future documents, but what about the existing sales documents that have been stamped with the previous comment?

There are two obvious solutions:

  • Use a mail merge Dynamics GP macro to update the comments for each order document of concern, by opening each order, removing the comment ID and re-applying it, then finally saving the order.
  • Use SQL to update the database directly.

The first method, the macro, will work if you have no in-house SQL skills. Be aware though that any manual edits to the comments will also be lost; check with the users that they are not in the habit of editing the text that has been stamped into the document.

The second method is fine if you are confident with SQL, or ask your Dynamics GP partner to help if you are at all concerned.

UPDATE sc
SET CMMTTEXT = REPLACE(CAST(CMMTTEXT AS VARCHAR(MAX)), 'Old Comment Text', 'New comment text')
FROM
SOP10100 so
JOIN
SOP10106 sc
ON so.SOPTYPE = sc.SOPTYPE AND so.SOPNUMBE = sc.SOPNUMBE
WHERE
CHARINDEX('Old Comment Text', CAST(CMMTTEXT AS VARCHAR(MAX))) > 0
AND so.COMMNTID = 'CommentIDName'

The above SQL will update only orders stamped with the comment ID in question. It replaces the old text with the new, leaving any extra text in place, so it is low risk in terms of losing data. However, if someone has edited the template text, the replace will not match. You may also need to be careful if you have comments that span multiple lines, as the line returns will need to be taken care of; for ease, you could source the old text from an existing record, as below.
 
DECLARE @oldCommentText VARCHAR(MAX)
SELECT @oldCommentText = CAST(CMMTTEXT AS VARCHAR(MAX)) FROM SOP10106
WHERE SOPNUMBE = '{sourcedoc}' AND SOPTYPE = {sourcetype}

UPDATE sc
SET CMMTTEXT = REPLACE(CAST(CMMTTEXT AS VARCHAR(MAX)), @oldCommentText, 'New comment text')
FROM
SOP10100 so
JOIN
SOP10106 sc
ON so.SOPTYPE = sc.SOPTYPE AND so.SOPNUMBE = sc.SOPNUMBE
WHERE
CHARINDEX(@oldCommentText, CAST(CMMTTEXT AS VARCHAR(MAX))) > 0
AND so.COMMNTID = 'CommentIDName'

Replace CommentIDName with the comment ID to target, and replace ‘Old comment text’ and ‘New comment text’ as appropriate. Please check this SQL before trying it and run it on a copy of your data in your test system first. These scripts are for ideas only and are not claimed to be a tested solution.

Hopefully this post will help someone, let me know with a comment if it did.


Using SQL UNPIVOT operator to reconcile Dynamics GP inventory items


Working with the Dynamics GP inventory tables you will encounter the field named QTYTYPE a lot. This is usually seen with its partner LOCNCODE.

Inventory in GP can reside in different locations (for example depots in New York, London and Sydney). Each location is then broken down further into five quantity types. Quantity types can be thought of as the condition/state of the item, indexed as follows:

1 = On Hand
2 = Returned
3 = In Use
4 = In Service
5 = Damaged

This is a way we can categorise where and what state/status the inventory is in.

The stock levels for each item are stored in the table IV00102, keyed by ITEMNMBR and LOCNCODE. To avoid creating five rows per item and location, the table architect decided to pivot the table, giving each quantity type its own column. The field names are as follows:

QTYTYPE   Field Name   Description
1         QTYONHND     On Hand
2         QTYRTRND     Returned
3         QTYINUSE     In Use
4         QTYINSVC     In Service
5         QTYDMGED     Damaged

This presents a problem, as the other inventory tables have a QTYTYPE field and hold different rows for different quantity types, so joining them to the IV00102 table becomes troublesome. One solution (I know there are others) is to use the UNPIVOT operator in T-SQL to unpivot the IV00102 table, causing the columns to present themselves as extra rows.

SELECT ITEMNMBR
,LOCNCODE
,CASE valuename
WHEN 'QTYONHND'
THEN 1
WHEN 'QTYRTRND'
THEN 2
WHEN 'QTYINUSE'
THEN 3
WHEN 'QTYINSVC'
THEN 4
WHEN 'QTYDMGED'
THEN 5
END AS QTYTYPE
,QtyValue
FROM IV00102
UNPIVOT(QtyValue FOR valuename IN (
QTYONHND
,QTYRTRND
,QTYINUSE
,QTYINSVC
,QTYDMGED
)) UnPiv
WHERE ITEMNMBR = '100XLG'

This gives rise to the following result set.

SQL results

See how each quantity type now has its own row rather than being a named column? We also used a CASE statement to map the column names back to their index numbers.

Example, reconciling inventory values using SQL

In the blog post Reconciling Quantity on Hand – SQL Script, Mahmood M. Alsaadi shows a script to reconcile inventory, so let us not reinvent the wheel and start with that script. The example provided in his post, at the time of writing, did not take into account the quantity types. Unfortunately I ended up debugging the SQL to work this out, only to then return to the original post comments and find someone else had done the same and pointed it out. However, the solution they proposed was to tie down QTYTYPE=1 so that only on-hand quantities are reconciled. I felt I could do better than that!

I built on the original script, adding in the UNPIVOT introduced above:

SELECT TRX_BALANCE.ITEMNMBR AS ItemNumber
,TRXLOCTN AS Location
,Master_Balance.QTYTYPE AS QTYTYPE
,BALANCE AS TRX_BALANCE
,QtyValue AS Master_Balance
,ATYALLOC AS Master_AllocatedQuantity
,QtyAvailable
,BALANCE - QtyValue AS Variance
FROM (
SELECT ITEMNMBR
,TRXLOCTN
,QTYTYPE
,SUM(QTYRECVD) - SUM(QTYSOLD) AS BALANCE
FROM dbo.IV10200
--WHERE IV10200.ITEMNMBR='40-322'
GROUP BY ITEMNMBR
,TRXLOCTN
,QTYTYPE
) AS TRX_BALANCE
LEFT OUTER JOIN (
SELECT ITEMNMBR
,LOCNCODE
,CASE valuename
WHEN 'QTYONHND'
THEN 1
WHEN 'QTYRTRND'
THEN 2
WHEN 'QTYINUSE'
THEN 3
WHEN 'QTYINSVC'
THEN 4
WHEN 'QTYDMGED'
THEN 5
END AS QTYTYPE
,QtyValue
,CASE valuename
WHEN 'QTYONHND'
THEN ATYALLOC
ELSE 0
END AS ATYALLOC
,CASE valuename
WHEN 'QTYONHND'
THEN QtyValue - ATYALLOC
ELSE 0
END AS QtyAvailable
FROM IV00102
UNPIVOT(QtyValue FOR valuename IN (
QTYONHND
,QTYRTRND
,QTYINUSE
,QTYINSVC
,QTYDMGED
)) IV00102Pivot
) AS Master_Balance ON TRX_BALANCE.ITEMNMBR = Master_Balance.ITEMNMBR
AND TRX_BALANCE.TRXLOCTN = Master_Balance.LOCNCODE
AND TRX_BALANCE.QTYTYPE = Master_Balance.QTYTYPE
WHERE BALANCE - QtyValue <> 0

This will output the item, location and quantity type of the items that have an incorrect inventory level in the IV00102 table (this is the table viewed when looking at the Dynamics GP item enquiry form).

This can be a useful script to set up as a scheduled SQL job to notify your GP admin that inventory needs reconciling, even listing the items that need it. This could also lead to generating a macro to do the reconcile… one for another day.
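As a rough sketch of such a job step, assuming Database Mail is configured, a mail profile exists (the 'GP Alerts' name below is made up), and the variance query above has been saved as a hypothetical view named dbo.IV_VarianceCheck:

-- Nightly check: email the GP admin only when variances exist.
IF EXISTS (SELECT 1 FROM dbo.IV_VarianceCheck)
BEGIN
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'GP Alerts',            -- placeholder Database Mail profile
        @recipients   = 'gpadmin@example.com',  -- placeholder recipient
        @subject      = 'Dynamics GP: items needing an inventory reconcile',
        @query        = 'SELECT ItemNumber, Location, QTYTYPE, Variance FROM dbo.IV_VarianceCheck';
END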

Dynamics GP Item Stock Enquiry Window taking too long to display data


If it takes a long time for the Item Stock Enquiry window to display data after entering an item number, this is a sign of large amounts of data in GP slowing things down.

Stock Inquiry

In my example, after entering an item number into the stock enquiry window, it takes nine minutes to display the data. That is not a typo; I timed it with my iPhone…

stopwatch

Why is this so slow?

The document that helps us is The Balance in Item Stock Inquiry is incorrect in Microsoft Dynamics GP. Under “More Information”, it explains that IV30300 (Transaction Amounts History detail) is loaded for the item and iterated through, totalling up the values to give the balance. Running the following SQL gives us the items with the most transactions in that table, and the item I’m looking at has 9,253 rows.

SELECT ITEMNMBR, COUNT(ITEMNMBR)
FROM IV30300
GROUP BY ITEMNMBR
ORDER BY 2 DESC

So this is more data than expected. The “problem” is that there is too much data for the way GP works. The solution is to trim the historical transactions. Removing history from the IV module will reduce the number of records the window has to iterate through and make the performance acceptable.
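Before trimming, it can help to see how far back the history goes for the slow item. A minimal sketch, assuming the transaction date column DOCDATE in IV30300 and a placeholder item number:

SELECT YEAR(DOCDATE) AS TrxYear, COUNT(*) AS TrxRows
FROM IV30300
WHERE ITEMNMBR = '100XLG'   -- placeholder, substitute the slow item
GROUP BY YEAR(DOCDATE)
ORDER BY TrxYear

That makes it easier to pick a sensible cut-off date for the history removal.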

Inventory Year End, Remove Sold Receipts and Cost Change History


Inventory year end closing window

FIFO has to be maintained somewhere, so goods receipts are added to table IV10200. As the inventory is consumed, the QTYSOLD field is incremented against the receipt until QTYRECVD = QTYSOLD; at this point the field RCPTSOLD is set from 0 to 1 to indicate that the receipt has been totally consumed. The order in which receipts are consumed depends on the cost model, FIFO etc.
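To see these receipt layers for a single item, a simple hedged query over IV10200 (the item number is a placeholder) is:

SELECT ITEMNMBR, TRXLOCTN, RCTSEQNM, DATERECD,
       QTYRECVD, QTYSOLD, QTYRECVD - QTYSOLD AS QtyRemaining, RCPTSOLD
FROM IV10200
WHERE ITEMNMBR = 'myitem'   -- placeholder item number
ORDER BY DATERECD, RCTSEQNM

Receipts with RCPTSOLD = 0 still carry remaining quantity; fully consumed receipts show QTYRECVD = QTYSOLD and RCPTSOLD = 1.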

As can be seen above, as part of the year end closing, the option is provided to remove any consumed receipts before a specified date. If this option is chosen, then GP will look for receipts with RCPTSOLD = 1 and DATERECD < {Date Provided} and remove them from the IV10200 table. It will also remove the records in IV10201 that represent the demand against the receipt. You can see this if you run a SQL trace while it runs.

DELETE FROM IV10201 WHERE ITEMNMBR = 'myitem' AND SRCRCTSEQNM = 1
 
I was worried whether this would affect the stock levels if I reconciled the inventory after removing the receipts. If you look at my other post Using SQL UNPIVOT operator to reconcile Dynamics GP inventory items, you will note that the inventory reconcile calculates the available stock from the IV10200 table using:
SUM(QTYRECVD) - SUM(QTYSOLD)
As we are only removing rows where these are equal (as the whole receipt is sold), they are rows that were not contributing to the stock quantity anyway, hence no effect on stock reconciles.
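A sanity check along those lines, before running the year end removal, is to confirm that the rows eligible for removal contribute nothing to the balance (the cut-off date here is a placeholder):

SELECT ITEMNMBR, SUM(QTYRECVD - QTYSOLD) AS QtyContribution
FROM IV10200
WHERE RCPTSOLD = 1
  AND DATERECD < '20101231'   -- placeholder cut-off date
GROUP BY ITEMNMBR
HAVING SUM(QTYRECVD - QTYSOLD) <> 0   -- expect this to return no rows

If that returns no rows, removing those receipts cannot change the reconciled quantities.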

In the earlier DELETE statement, SRCRCTSEQNM = 1 (Source Receipt Sequence Number) comes from the RCTSEQNM (Receipt Sequence Number) of the record removed from IV10200, binding demand and source – or so it looks at first sight.
 
hey wait…
 
The key of the receipts table IV10200 allows the same item to have the same shared RCTSEQNM if the date received is different. Indeed our data has this a lot of the time; see the table screenshot for an item below, and note how the item has multiple rows with RCTSEQNM = 1. I don’t think this should happen.
 
rectable
 
I had a look at our test copy of the live company, ran the above year end routine, and indeed it removed all records from IV10201 for the sequence number, even though I think they related to records still remaining in IV10200! I’ve seen some posts about people finding missing data in their systems but not being able to track down why or what makes it happen; I think perhaps this may be an answer. It may also be that I am interpreting what I am seeing wrongly, so I need more information on how the records relate.
 
Further investigation…
I don’t think I’m wrong; I have the feeling that we have some serious corruption in our table! Running the remove sold receipts routine and then reconciling one of the items affected gives us this result:
 
itemreconcile
 
GP has added a record back into IV10201 to replace the one that was taken by the year end utility; I think this is a good indication that this is not supposed to have happened.
GP creates a new record in IV10201 with the ORIGINDOCID field set to “RECON” (reconcile) for the full amount sold on the receipt, as it can no longer find the transaction details, which were removed by the year end utility.
 
I’m guessing that RCTSEQNM should auto increment for each receipt and that there should not be duplicate RCTSEQNM values for any item. The following script checks for instances where this has happened incorrectly.
 
SELECT
ITEMNMBR,
RCTSEQNM,
COUNT(*), MAX(DATERECD)
FROM IV10200
GROUP BY ITEMNMBR, RCTSEQNM
HAVING COUNT(RCTSEQNM) > 1
ORDER BY 4

(19498 row(s) affected)

Oh crikey! There are 19,498 item and sequence number combinations where this has occurred. Looking at the first part of the result set we see this is really bad: some of the sequence numbers have been used 735 times, and that is just at first glance, I would rather not dig deeper. With the dates added, I see that this happened between 1998 and 2009. I guess whatever the cause was went away after that year. So these days it does work how I would expect, and the records are tied between the 200 and 201 tables by RCTSEQNM (when the data is correct).
 
corruptseq
 
 
corruptseqdate
 
 
Where do I go from here?
So I need to clean up this mess; it looks like it may not be possible to recover from this and preserve the data. As it happens I want to trim these receipts back to seven years anyway, so if I do that the problem will, I think, solve itself. An item reconcile after the trim will rebuild dummy records for sold quantities if any are missing, so we end up in a better place than we are in now.
 
 
It is not a good idea to delete a receipt and its associated detail if it was consumed recently, as it makes processing returns more difficult once the receipt has been removed.
 
If you don’t want to run year end for inventory (which updates starting quantities and summaries), you could try writing a script. Due to the above issues I am unable to complete this script right now, as it will suffer a similar issue to the native functionality until I can find how to relate the tables correctly, but for those interested, here is my work in progress...
 
BEGIN TRANSACTION

DECLARE @DateThreshold DATETIME

SELECT @DateThreshold = '20021231'

DECLARE @ToRemove AS TABLE (
ITEMNMBR CHAR(31)
,TRXLOCTN CHAR(11)
,RCTSEQNM INT
,SRCRCTSEQNM INT
);

WITH CTE_AvailableToRemove
AS (
SELECT ITEMNMBR
,SRCRCTSEQNM
FROM IV10201
GROUP BY ITEMNMBR
,SRCRCTSEQNM
HAVING MAX(DOCDATE) < DATEADD(year, - 1, GETDATE())
)
INSERT INTO @ToRemove (
ITEMNMBR
,TRXLOCTN
,RCTSEQNM
,SRCRCTSEQNM
)
SELECT rcpt.ITEMNMBR
,rcpt.TRXLOCTN
,rcpt.RCTSEQNM
,rcpt.SRCRCTSEQNM
FROM IV10200 rcpt
JOIN IV10201 ld ON rcpt.RCTSEQNM = ld.SRCRCTSEQNM
JOIN CTE_AvailableToRemove ON rcpt.RCTSEQNM = CTE_AvailableToRemove.SRCRCTSEQNM
AND CTE_AvailableToRemove.ITEMNMBR = rcpt.ITEMNMBR
WHERE rcpt.RCPTSOLD = 1
AND (rcpt.DATERECD <= @DateThreshold)

DELETE ld
OUTPUT DELETED.[ITEMNMBR]
,DELETED.[TRXLOCTN]
,DELETED.[QTYTYPE]
,DELETED.[DOCDATE]
,DELETED.[RCTSEQNM]
,DELETED.[ORIGINDOCTYPE]
,DELETED.[ORIGINDOCID]
,DELETED.[LNSEQNBR]
,DELETED.[CMPNTSEQ]
,DELETED.[QTYSOLD]
,DELETED.[UNITCOST]
,DELETED.[IVIVINDX]
,DELETED.[IVIVOFIX]
,DELETED.[SRCRCTSEQNM]
,DELETED.[TRXREFERENCE]
,DELETED.[PCHSRCTY]
INTO Archive.[dbo].[IV10201]
FROM IV10201 ld
JOIN @ToRemove rm ON ld.ITEMNMBR = rm.ITEMNMBR
AND ld.TRXLOCTN = rm.TRXLOCTN
AND ld.RCTSEQNM = rm.RCTSEQNM

DELETE IV10200
OUTPUT DELETED.[ITEMNMBR]
,DELETED.[TRXLOCTN]
,DELETED.[DATERECD]
,DELETED.[RCTSEQNM]
,DELETED.[RCPTSOLD]
,DELETED.[QTYRECVD]
,DELETED.[QTYSOLD]
,DELETED.[QTYCOMTD]
,DELETED.[QTYRESERVED]
,DELETED.[FLRPLNDT]
,DELETED.[PCHSRCTY]
,DELETED.[RCPTNMBR]
,DELETED.[VENDORID]
,DELETED.[PORDNMBR]
,DELETED.[UNITCOST]
,DELETED.[QTYTYPE]
,DELETED.[Landed_Cost]
,DELETED.[NEGQTYSOPINV]
,DELETED.[VCTNMTHD]
,DELETED.[ADJUNITCOST]
,DELETED.[QTYONHND]
INTO ARCHIVE..IV10200
FROM IV10200
JOIN @ToRemove r ON IV10200.ITEMNMBR = r.ITEMNMBR
AND IV10200.RCTSEQNM = r.SRCRCTSEQNM
AND IV10200.TRXLOCTN = r.TRXLOCTN
WHERE RCPTSOLD = 1
AND (IV10200.DATERECD <= @DateThreshold)

ROLLBACK TRANSACTION



 

Uppercase SQL script using SSMS


I just realised I use this functionality a lot and thought that others might not know it was possible.

Highlight a lower case bit of text in the SQL editor window of SSMS and press CTRL+SHIFT+U and the text will go to upper case (L for lower case).

SQL Upper case

This is handy when someone else has written a SQL script with GP field names in lower case; I’m so used to seeing them in upper case that converting them speeds up my reading.

This is just an example. You may also highlight the whole script, or individual lines, and make them upper case; the above GIF is just showing the principle. In this particular case it would have been quicker to make the whole lot upper case, then go back and lower case the table aliases.

Disable Enhanced Intrastat Dynamics GP


I have fallen out with the GP Intrastat module; it has been causing me some problems. I decided it had to go, but I found it was not obvious how to switch it off…

Microsoft Dynamics GP>> Tools>>Setup>> Company>>Company

Select the options button

Dynamics GP Company Options Window

The company setup options window will allow access to the “Enable Intrastat Tracking” option. Try unchecking the checkbox. You will get the following dialog.

Please detach all debtors creditors and sites from all declarants before you unmark Intrastat Tracking

Microsoft Dynamics GP>> Tools>>Setup>> Company>>Enhanced Intrastat>>Setup

Intrastat Setup Window

In this window, select Unmark All, then OK the window.

Go back to the Company Options window; you should now be able to deselect the Intrastat module.

You must then log out of GP and back in again; this will remove the buttons and menu items related to the Intrastat module.

Please be aware that this will stop reporting of Intrastat figures; make certain that your Intrastat returns can be completed in some other way before disabling this module.

Writing off stock from Qty In Service from Dynamics GP using SQL


I have 9,791 items with QTY in service >0 that I want to "adjust" out of stock.

All these items have been financially accounted for and written off over the years, with the IV account manually cleared by journal each month. However, the items have been left in stock (why we do this is a story for another day).

My challenge is to remove the items from stock so that I can then delete all the inventory master records for items as they are all historical items that have been discontinued for over 10 years and I'm sick of seeing them in reports and enquiries.

To do this

  • I first need to create a stock transfer document for all the items and quantities, including bins and serial numbers
  • I need to create a stock adjustment for the same items, bin quantities and serial numbers to adjust the items out of stock
  • I need to post a reversing journal to undo the effect on the already-adjusted accounts

I really don’t like the idea of trying to write a GP macro to do this; it will take forever to run and is complicated by having to select serial numbers and bin numbers for each item.

I could go for a programming solution using eConnect, but looking at it, I don’t think the business rules are that difficult and the problem is quite constrained, so just building the transactions in SQL will do. I feel confident I won’t screw anything up here.

I will show you the scripts I’ve used, but be aware that I know this system well; you may have other modules installed that would affect things, or have GP set up in ways that will cause issues. So TEST! TEST! TEST! first, I can take no responsibility for what happens on your system with this script!

To generate a transfer, create an empty inventory transfer with no lines, take a note of the document number and set it as the variable in the script. Save the GP document and come out of it.

Run the first half of the SQL script to create transfers for ALL the items in inventory that have a type of “In Service”. Obviously this could be narrowed down to your own needs with some more where clauses.

Note that I have made some safe assumptions for the dataset I’m working with; some of the field values are hard-coded rather than looked up. For example, you would need to check the account indexes for your inventory accounts before running the script, but it provides a starting point.
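For example, a simple way to check account indexes is to query the account index master, GL00105, which holds the flattened account string (the account number below is a placeholder):

SELECT ACTINDX, ACTNUMST
FROM GL00105
WHERE ACTNUMST = '000-1300-00'   -- placeholder inventory account number

The ACTINDX values returned are what would replace the hard-coded account index values in the script that follows.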

Once the SQL has generated the transfer, post it. Once posted, run the second half of the SQL script; this will use the posted transfer to generate the inventory adjustment.

Now post the inventory adjustment.

Note: When I tried this it took over 19 hours on the test environment; after switching off the Intrastat module it only took a few minutes. See more on this at the bottom of this post.

Finally in our case we needed some reversing journals to put the accounts back to where they should have been, but that was only as we were not using the system correctly in the first place!

The SQL…



DECLARE @IVDOCNMBR CHAR(17)

-- Create a new Inventory Transfer Document, Save it then enter its number here
-- this will be the document that will be loaded with items
SELECT @IVDOCNMBR = '00017213'

-- Load all items that have an "in service" value to a stock transfer to change to "on hand"
INSERT INTO IV10001
SELECT @IVDOCNMBR
,3
,IV00101.ITEMNMBR
,(
ROW_NUMBER() OVER (
ORDER BY IV00101.ITEMNMBR
) - 1
) * 16384 + 16384
,IV40201.BASEUOFM
,IV40202.QTYBSUOM
,IV00102.QTYINSVC
,CURRCOST
,STNDCOST * IV00102.QTYINSVC
,IV00102.LOCNCODE
,IV00102.LOCNCODE
,4
,1
,2011
,523
,0x00000000
,DECPLCUR
,IV00101.DECPLQTY
,0
,''
FROM IV00101
JOIN IV00102 ON IV00101.ITEMNMBR = IV00102.ITEMNMBR
AND IV00102.RCRDTYPE = 2
JOIN IV40201 ON IV40201.UOMSCHDL = IV00101.UOMSCHDL
JOIN IV40202 ON IV40202.UOMSCHDL = IV40201.UOMSCHDL
AND IV40202.UOFM = IV40201.BASEUOFM
WHERE QTYINSVC > 0

INSERT INTO IV10003
SELECT IVDOCNBR
,IVDOCTYP
,LNSEQNBR
,(
ROW_NUMBER() OVER (
ORDER BY IV10001.ITEMNMBR
) - 1
) * 16384 + 16384
,IV10001.ITEMNMBR
,TRXLOCTN
,IV00112.BIN
,IV00112.BIN
,TRFQTYTY
,IV00112.QUANTITY
,0
FROM IV10001
JOIN IV00112 ON IV10001.ITEMNMBR = IV00112.ITEMNMBR
AND IV10001.TRXLOCTN = IV00112.LOCNCODE
AND IV10001.TRFQTYTY = IV00112.QTYTYPE
WHERE IVDOCNBR = @IVDOCNMBR
AND IV00112.QUANTITY - IV00112.ATYALLOC > 0

INSERT INTO IV10002
SELECT IVDOCNBR
,IVDOCTYP
,IV10001.ITEMNMBR
,IV00200.SERLNMBR
,1
,IV10001.LNSEQNBR
,(
ROW_NUMBER() OVER (
ORDER BY IV10001.ITEMNMBR
) - 1
) * 16384 + 16384
,IV00200.DATERECD
,IV00200.DTSEQNUM
,0
,0
,IV00200.BIN
,IV00200.BIN
,'1900-01-01'
,'1900-01-01'
FROM IV10001
JOIN IV00200 ON IV10001.ITEMNMBR = IV00200.ITEMNMBR
AND IV10001.TRXLOCTN = IV00200.LOCNCODE
AND IV10001.TRFQTYTY = IV00200.QTYTYPE
WHERE IVDOCNBR = @IVDOCNMBR
AND IV00200.SERLNSLD = 0

UPDATE IV00200
SET SERLNSLD = 1
FROM IV10001
JOIN IV10002 ON IV10001.IVDOCNBR = IV10002.IVDOCNBR
AND IV10001.IVDOCTYP = IV10002.IVDOCTYP
AND IV10001.LNSEQNBR = IV10002.LNSEQNBR
JOIN IV00200 ON IV10002.ITEMNMBR = IV00200.ITEMNMBR
AND IV10001.TRXLOCTN = IV00200.LOCNCODE
AND IV10002.DATERECD = IV00200.DATERECD
AND IV10002.DTSEQNUM = IV00200.DTSEQNUM
AND IV10002.SERLTNUM = IV00200.SERLNMBR
AND IV10001.TRFQTYTY = IV00200.QTYTYPE
WHERE IV10002.IVDOCNBR = @IVDOCNMBR

-----------------------------------------------------------------------------------
-- Post the generated document; once successfully posted, run the next part that follows
---- prepare the inventory adjustment by creating a new inventory adjustment and saving it (empty)
-- enter the previous document number for the stock transfer, and the empty adjustment number - an inventory adjustment
-- will be created from the transfer to adjust all that stock out.
DECLARE @IVDOCNMBR CHAR(17)

SELECT @IVDOCNMBR = '00017213' -- posted (in history) inventory transfer

DECLARE @IVDOCNMBR2 CHAR(17)

SELECT @IVDOCNMBR2 = '00054595' -- empty inventory adjustment

INSERT INTO IV10001
SELECT @IVDOCNMBR2
,1
,IV30300.ITEMNMBR
,(
ROW_NUMBER() OVER (
ORDER BY IV30300.ITEMNMBR
) - 1
) * 16384 + 16384
,IV30300.UOFM
,IV30300.QTYBSUOM
,- 1 * TRXQTY
,UNITCOST
,0
,IV30300.TRNSTLOC
,''
,0
,0
,525
,2008
,0x00000000
,DECPLCUR
,IV30300.DECPLQTY
,0
,''
FROM IV30300
WHERE IV30300.DOCNUMBR = @IVDOCNMBR
AND IV30300.DOCTYPE = 3

INSERT INTO IV10003
SELECT @IVDOCNMBR2
,1
,IV30300.LNSEQNBR
,IV30302.SEQNUMBR
,IV30302.ITEMNMBR
,IV30302.LOCNCODE
,TOBIN
,''
,IV30302.QTYTYPE
,IV30302.QTYSLCTD
,0
FROM IV30302
JOIN IV30300 ON IV30302.DOCNUMBR = IV30300.DOCNUMBR
AND IV30302.DOCTYPE = IV30300.DOCTYPE
AND IV30302.LNSEQNBR = IV30300.LNSEQNBR
WHERE IV30300.DOCNUMBR = @IVDOCNMBR

INSERT INTO IV10002
SELECT @IVDOCNMBR2
,1
,IV30400.ITEMNMBR
,[SERLTNUM]
,1
,LNSEQNBR
,SLTSQNUM
,DATERECD
,DTSEQNUM
,0
,0
,TOBIN
,''
,'1900-01-01'
,'1900-01-01'
FROM [IV30400]
JOIN IV00200 ON IV30400.ITEMNMBR = IV00200.ITEMNMBR
AND IV30400.SERLTNUM = IV00200.SERLNMBR
AND IV00200.SERLNSLD = 0
WHERE DOCNUMBR = @IVDOCNMBR

UPDATE IV00200
SET SERLNSLD = 1
FROM IV10001
JOIN IV10002 ON IV10001.IVDOCNBR = IV10002.IVDOCNBR
AND IV10001.IVDOCTYP = IV10002.IVDOCTYP
AND IV10001.LNSEQNBR = IV10002.LNSEQNBR
JOIN IV00200 ON IV10002.ITEMNMBR = IV00200.ITEMNMBR
AND IV10001.TRXLOCTN = IV00200.LOCNCODE
AND IV10002.DATERECD = IV00200.DATERECD
AND IV10002.DTSEQNUM = IV00200.DTSEQNUM
AND IV10002.SERLTNUM = IV00200.SERLNMBR
AND IV00200.QTYTYPE = 1
WHERE IV10002.IVDOCNBR = @IVDOCNMBR

ROLLBACK TRANSACTION

UPDATE IV00102
SET ATYALLOC = ATYALLOC + ABS(TRXQTY)
--SELECT *
FROM IV10001
JOIN IV00102 ON IV10001.ITEMNMBR = IV00102.ITEMNMBR
AND IV10001.TRXLOCTN = IV00102.LOCNCODE
AND TRFQTYTY = 0
WHERE IV10001.IVDOCNBR = @IVDOCNMBR2
AND IV10001.IVDOCTYP = 1

UPDATE IV00102
SET ATYALLOC = ATYALLOC + ABS(TRXQTY)
--SELECT *
FROM IV10001
JOIN IV00102 ON IV10001.ITEMNMBR = IV00102.ITEMNMBR
AND RCRDTYPE = 1
AND TRFQTYTY = 0
WHERE IV10001.IVDOCNBR = @IVDOCNMBR2
AND IV10001.IVDOCTYP = 1

 

Dynamics GP Intrastat module and slow posting of inventory transfer

 
When posting this large 3,000 line transaction with the Intrastat module switched on, it was still posting 18 hours later, at which point I killed the GP client process.
Looking at the SQL trace of what the GP client was doing before I killed it, it seemed that GP was iterating through each line of the transaction, looking up something in the Intrastat VAT table for every other line of the transaction as well. You could see the occasional call to IV00101 whereby the item number was moving on to the next number, but then it would sit for ages calling out to the VAT tables for all the items on the order before moving on again.
 
intrastat iv posting sql trace showing calls to ZDP_VAT stored proc
 
After disabling the Intrastat module, the transaction posted in a few minutes, so there is some interaction with that module causing a serious degradation in performance on our system. In one of my previous posts I mentioned that the Intrastat module also stops the Sales Order Processing auto-suggestion functionality from working. I am now looking to see if we can report Intrastat without the need for the module so that ultimately I can switch it off for good.
 
 
 
 

OLE Notes Migration


It was time to get our old OLE notes migrated. GP has stopped using OLE notes containers for attachments to notes in the system, mostly due to the need to work with the web browser hosting of GP. I didn’t want to import the attachments back into the new GP attachments feature, as no one had been screaming for the ones that had vanished after the upgrade (only a few enquiries). However I did want to make them available should someone need them enough (don’t ask me for the criteria for “enough”!). So the plan is to run only the first part of the migration tool, the extract, and not the import. For details of the tool see the post by Encore:

The OLE note migration utility in Dynamics GP

The OLE notes directory totalled 46GB with 30k documents; running the migration utility brought the extracted size down to 3GB. Evidence that the object containers were not efficient!

So the lesson is that you might want to extract your OLE notes even if you don’t intend to use them, as it will help your file sizes. In our case the storage is backed by a SAN that will be compressing things anyway, but extracting the files makes them usable straight from the directory.

What you get after extraction is a directory structure that starts with the Dynamics GP product ID, with a folder for each note index under that. The note index is the record ID for the note in the notes table. So for every product that uses notes and had attachments there should be a top level folder, then subfolders for the note attachments.

It looks like the same kind of scheme used by the newer GP attachments feature; see the post:

Document attach feature Dynamics GP database and BusObjKey formats

The directory name is the hex version of the note index that the contents of that directory link to.

This is good enough for me to recover any documents requested by looking at the note index and converting it.
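As a hedged illustration of that conversion (assuming the note index, NOTEINDX in the record notes table SY03900, fits in an integer; the exact padding of the folder name may differ):

DECLARE @noteindx INT = 1075   -- placeholder note index
SELECT CONVERT(VARCHAR(8), CONVERT(VARBINARY(4), @noteindx), 2) AS HexFolderName   -- returns 00000433

So given a note index from the database, the matching extracted folder can be located by converting the value to hex.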


Empty Matched To Shipment in Purchasing Invoice Entry of Dynamics GP


From time to time, on Euro transactions, we get a yellow triangle and an empty “Matched to Shipment” field in the Purchasing Invoice Entry window of GP.

Purchasing Invoice Entry Yellow Triangle

The problem is always the same: the fields

  • CURNCYID
  • CURRNIDX
  • XCHGRATE
  • RATECALC

are all either at their default values or missing for the item row in the POP10500 table.

I am still looking for the breakthrough clue as to what causes this; I’m guessing network outages or other hardware failures interrupting processing somewhere. In the meantime I just fix the issue when it comes up, but having fixed it a couple of times I wrote a script to help.
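To see whether any other documents are currently affected, a quick hedged check (the same tables and condition as the fix below, but across all POs) is:

SELECT pl.PONUMBER, pl.CURNCYID, pl.CURRNIDX, pl.XCHGRATE, pl.RATECALC
FROM POP10500 pl
JOIN POP10100 ph ON ph.PONUMBER = pl.PONUMBER
WHERE pl.CURNCYID = ''
  AND ph.CURNCYID <> ''

Any rows returned are line records that have lost their currency details while the PO header still has them.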

If this is the problem that you are seeing too then the following script can be used to fix it for a given PO Number.

BEGIN TRANSACTION

UPDATE pl
SET CURNCYID = ph.CURNCYID
,CURRNIDX = ph.CURRNIDX
,XCHGRATE = ph.XCHGRATE
,RATECALC = ph.RATECALC
FROM POP10100 ph
JOIN POP10500 pl ON ph.PONUMBER = pl.PONUMBER
WHERE ph.PONUMBER = '{your po number}'
AND pl.CURNCYID = ''

ROLLBACK TRANSACTION

As always, I can’t know your GP environment, so this should be thoroughly tested in a test environment before running the script on production data. I can’t take any responsibility for what might happen on your systems.

Let me know in the comments if this helped you; it motivates me to blog more…

Error: This document contains one or more posting holds, Dynamics GP Edit List


If you are experiencing this on SOP Sales Invoice posting edit lists, then maybe you are using drop ship items. Also perhaps you have the Shipping Notification Tool installed?

Error this document contains one or more posting holds

There is no posting hold as such on the document; the posting hold mechanism is being hijacked by the shipping tool to prevent posting of the document. It is actually because GP does not think the item in question has been shipped yet, and so it prevents the document from posting. Ensure that the item was marked as shipped using the ship notification window.

The setting in the INI file needs changing if you want to change this behaviour; details of the shipping notification window, and of the INI settings to change, are in this post:

 Dynamics GP Drop Shipping Sales invoices before purchase invoice with Shipment Notification Tool (SNT)

SQL to extract contiguous ranges for maintenance tasks in Dynamics GP


From time to time I find some SQL that really makes me smile and sit back and stare at it with awe. Today is one of those days; let me show you…

Many of the maintenance windows in Dynamics GP ask for ranges of numbers to be entered. They consist of a start number and an end number, the start and end points over which you would like to execute an operation.

It is a common thing in GP to use a macro mail merge to automate the user interface where repetitive operations are required. If a macro mail merge is being used for one of these windows, then it requires the start and end numbers over which we would like to perform the operation.

Let’s move to something more tangible, the real world example I’m working on. However don’t get tied down by my example; there are many other range windows in GP that this principle would work on too.

Remove Sales History Windows Dynamics GP

I want to surgically remove 901,872 historical sales documents from GP, using a macro. I am using a macro so that we are safe: the GP business logic gets applied by the UI. I could use SQL, but it is always best to try a macro first to benefit from the safety the UI gives us. This could take some time, supplying the document numbers and deleting them one by one.

A master number is an ID that ties together a string of related sales documents; it is held against all sales documents. For efficiency we notice that we can remove documents by master number, so all related documents, quote->order->invoice->return, will be deleted at the same time. This means looping through the remove operation of the window much less, as one master number removes many documents.

This brings us to 446,741 unique master numbers that need removing for the 901,872 sales documents (yes, I’ve checked and only three master numbers in my range link to related documents outside my range, so I’ve taken those out of this list; see the check sketched below). I could macro the window to set the “from” and “to” to the same master number, removing one set of related documents at a time, then moving to the next number and then the next. It would work, but testing showed I’d be here all week doing it. Besides, I can see somehow that I can be more clever than that!
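That check for master numbers which straddle the removal range can be done in SQL too; a minimal sketch, assuming here that the range is defined by document date (substitute your own criteria):

SELECT MSTRNUMB
FROM SOP30200
GROUP BY MSTRNUMB
HAVING MIN(CASE WHEN DOCDATE < '20100101' THEN 1 ELSE 0 END) = 0   -- has documents outside the range
   AND MAX(CASE WHEN DOCDATE < '20100101' THEN 1 ELSE 0 END) = 1   -- and documents inside it

Any master numbers returned span the boundary, so they can be excluded from the removal list.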

GP is more efficient on these types of operation if you can supply ranges of numbers to work on, so really I want to look at the list of 446,741 numbers and find all the contiguous ranges within the sequence, then identify the start and end numbers to put in the UI start and end fields. It is worth remembering that sometimes there will be just one number in a particular range, when there are gaps on both sides of the master number.

Hitting Google I found the following post, where they use recursive CTEs to find the ranges. Restricting my sequence to only 10,000 records, it took about three minutes to run. Letting it run on the full sequence, I ran out of patience (and probably server resources) before it got to the end. I did try to tune it with some indexed physical tables to improve performance, but ultimately recursive CTEs are terrible for large data sets like this.

How to find contiguous ranges with SQL

I went looking again and found another solution in this post.

How to find the boundaries of groups of contiguous sequential numbers?

There is genius insight in its script design. I use ROW_NUMBER() a lot myself for all sorts of weird problem solving, but had never seen it applied like this before.

I adapted the snippet of the SQL to my problem and came up with this:

DECLARE @MNTable TABLE (MSTRNUMB INT PRIMARY KEY);

INSERT INTO @MNTable
SELECT DISTINCT MSTRNUMB AS MSTRNUMB
FROM [dbo].[SOP30200] --WHERE Blah blah...
ORDER BY MSTRNUMB;

WITH CTE
AS (
SELECT ROW_NUMBER() OVER (
ORDER BY MSTRNUMB
) - MSTRNUMB AS Groups
,MSTRNUMB
FROM @MNTable
)
SELECT MIN(MSTRNUMB) AS [From]
,MAX(MSTRNUMB) AS [To]
FROM CTE
GROUP BY Groups
ORDER BY MIN(MSTRNUMB)
 
So from a query that never finished processing we go to this:
 

query times

Really, it is now down to a few seconds to get all the results!

sqlresults

We now have only 3,760 ranges from the 446,741 master numbers, so I only need the macro to iterate the remove window 3,760 times; that is much more acceptable!

So the mail merge macro now takes each row of this result set and uses the from and to values from the query to populate the boxes for the master number, then sets the report options and finally processes. Job done.

Back to the SQL for a moment…

The SQL is fascinating though; looking at the CTE data we can see how it works.

sql result cte

The row number increments, but subtracting the master number from the ROW_NUMBER creates a banding or grouping number. So long as each sequence number is only one greater than the previous number, the banding number stays the same within the group, because the relative difference between the ROW_NUMBER and the master number has not changed. If the master number jumps because of a gap, it will have gone up by more than the row number, so the grouping number changes to a new value.

This grouping number is then subsequently used to gather the max and min values for each section. I love the efficiency and simplicity of this solution. It also shows how many ways there can be to solve a problem.
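To see the banding on something tiny, here is a worked example on literal values; 100-102 form one band, 105-106 another, and 110 sits alone:

;WITH Nums (MSTRNUMB) AS (
    SELECT v FROM (VALUES (100), (101), (102), (105), (106), (110)) AS t(v)
),
CTE AS (
    SELECT MSTRNUMB,
           ROW_NUMBER() OVER (ORDER BY MSTRNUMB) - MSTRNUMB AS Groups
    FROM Nums
)
SELECT MIN(MSTRNUMB) AS [From], MAX(MSTRNUMB) AS [To]
FROM CTE
GROUP BY Groups
ORDER BY MIN(MSTRNUMB)
-- Returns three ranges: 100-102, 105-106 and 110-110

The Groups value is -99 for the first three numbers, -101 for the next two and -104 for the last one, which is exactly the banding described above.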

Installing Dynamics GP Intrastat Module


If you are looking for the Intrastat module, it is part of the main GP install, I think from GP 2013 when more of the product extras were bundled in, but certainly from GP 2015. When running the GP installer, the various features that can be installed are offered; find Enhanced Intrastat and click on the icon to change whether or not it is installed. This can also be performed retrospectively to add the feature to an existing install.

intrastat installer Dynamics GP

You must switch the module on in Company Options and use the Intrastat Setup window to get it going. Look for the user guide online or on the installation media for full details.

What do you call that window GP?


Steve Endow mentioned on Twitter the other day the naming of objects in GP when programming. The inconsistencies are astounding, I have to agree; you can spend half an hour trying to determine whether the object named almost like what you want is actually the object you want, then two minutes actually writing the mod! The fact that you end up using a tool to work out the object name of the window in the code tells the story.

api naming dynamics gp steve endow

What Steve might not realise is that it is even worse if you speak English rather than American. See that window he’s looking at in the tweet, look at it on my screen…

puchaseinv

So we have Enquiry rather than Inquiry, no big deal? Well, mostly no, we learn the translations, but it is confusing to new developers who have to get used to this translation issue. Sometimes the translated terms are not as obvious as this. Even so, it is an issue if you are looking at the object explorer ordered alphabetically or searching for the object.

There is fun though with Debtors and Customers. I still don’t really know why we can’t have customers too; for our users it would make more sense than the accounting term debtors!

Steve kindly furnished me with this screen shot of an American GP customer window.

GP2015CustMaint

Compare that to what we see with the English Debtor window. These are the same windows, with customer replaced everywhere; now imagine searching for this in Visual Studio, or for a field called debtor ID…

dmuk

In Visual Studio, when developing addins, we have RMCustomerMaintenance; good job I didn’t go searching for debtor when looking for the object, eh?

customerobject

So I conclude it is all fun and games developing Visual Studio addins for GP!

 

Thanks to Steve for inspiring this post with his tweet and helping with the screenshot.

FP: Couldn’t close table! Dynamics GP Error Solution


I talked in a previous blog post about the “syContentPageXMLCache cannot find table” error. This and its cousin, the “FP: Couldn’t close table!” error, are caused by something severing long-lived SQL connections between the client and server.

FP: Couldn't close table! 

This might be due to networking issues like bad routing, physical faults with NIC cards, faulty Ethernet cables or connectors, servers going to sleep, connectivity problems (WIFI) or many other potential causes.

The problem

I found myself involved in this today. A new employee started, and since they have been working for us, every time the Dynamics GP application is closed at the end of the day (or sooner) they get the “FP: Couldn’t close table!” error. They have also been getting other SQL related errors.

IT support had already tried:

  • New GP user ID for the user
  • Rebuilding the PC from the standard image
  • Swapping the newer PC hardware the new user got, for the same hardware the rest of the users are using, involving another new image rebuild.
  • Deleting the user’s AD profile and rebuilding it.
  • Swapping the Ethernet cables
  • Swapping to another wall port of a user that is known to work, also on another network switch
  • Checking power saving sleep options on the NIC

None of the above stopped the issue occurring. It came to a head when a quotation was entered that ended up pulling the wrong currency for pricing (as I suspect the SQL connection broke under the hood). I was stumped as to what the issue could possibly be, bearing in mind what had already been tried.

Resolving the issue

The user was instructed to email me screenshots of the errors the second they happened. Not long after, I came back from my lunch to see an email come in. Attached were the stereotypical errors caused by connection loss.

I connected to the event log on the offending machine, looked at recent history and found the problem.

Event viewer shows Kernel Power event shortly before problem

Although the power settings in Windows 10 looked OK from the top level screen, drilling into “Change advanced power settings” and checking through all the options showed that the “Allow hybrid sleep” setting was set to ON. Looking at the other machines in the area, they were all set to OFF.

Advanced power settings

So my current working assumption is that the user had come back from lunch, during which time the machine had snoozed, causing the SQL connection to drop, with the following errors on resuming using GP:

An unknown SQL error occurred.

A SQL network connection error occurred and your connection was cleared.

David Musgrave has in the past posted some more information on these kinds of issues that are worth a read:

TCP Chimney Setting and SQL Server Error: TCP Provider: An existing connection was forcibly closed by the remote host

More on SQL Server Connection issues with Microsoft Dynamics GP

Exporting from SmartList Dynamics GP “returned 29”


Are you experiencing a dialog box with the following text on attempting to export a Smartlist to Excel?

“RUN APPLICATION ERROR”

"C:\Users\user\AppData\Local\Temp\7\somefilename.xlsx returned 29"

This seems to be an error passed back from Excel regarding permissions issues, or a lack of communication from GP to Excel via DDE (Dynamic Data Exchange).

In Excel, under advanced options (File >> Options >> Advanced), the option “Ignore other applications that use Dynamic Data Exchange (DDE)” should be unchecked.

Ignore other applications that use Dynamic Data Exchange (DDE)

There are a couple of threads on this on the GP forums:

SmartList export to Excel Run Application Error "[filepath]" returned 29

Run application error 29 in SmartList


Dynamics GP eConnect and handling exceptions under Web API


Beware when using eConnect with ASP.NET Web API.


… as exceptions raised by eConnect will not be of type eConnectException,

thus our typical catch block, such as:

catch(eConnectException ex)

will not work, because all eConnectExceptions will actually cause the following exception to be raised/returned:

System.NullReferenceException: Object reference not set to an instance of an object.

  at Microsoft.Dynamics.GP.eConnect.EventLogHelper.AddExceptionHeader(String action, Object[] inputParameters, StringBuilder errorString)

This is because there is a flaw in the eConnect EventLogHelper.CreateEventLogEntry method. It attempts to extract Thread.CurrentPrincipal.Identity.Name as part of the output, but when eConnect is used in the context of ASP.NET Web API this is null.

As it is null, a new exception is raised inside the exception handler; the original exception is superseded by this System.NullReferenceException, which masks the original issue.

Explicitly setting the current thread’s identity name just before calling eConnect solves this problem, and so allows the eConnectException to be caught and processed as would normally be expected.

I discovered this whilst diagnosing a problem on the forums for someone, see: Econnect Exception Handling doesn't work in WEB API 2 application

Dynamics GP Stored procedure [smFormatStringsForExecs]


You’ve seen it often enough in GP stored procedures, but what is it doing?

EXEC @iStatus = smFormatStringsForExecs 
@I_vInputString = @I_charEndCustomer,
@O_cOutputString = @cEndCustomer OUTPUT,
@O_iErrorState = @O_iErrorState OUTPUT

The above code snippet is the common pattern it appears in. Until today, I had always assumed that this procedure was cleaning the input to the procedures for anti-SQL-injection purposes; alas, it seems not.

The procedure actually turns the passed string into a quoted string for use as parameters when building up SQL by concatenation within other GP stored procedures.

Example:

If we pass the string 6252''5 002     (with the trailing spaces) into the procedure, this is what we find:

DECLARE @return_value INT,
@O_cOutputString CHAR(255),
@O_iErrorState INT

EXEC @return_value = [dbo].[smFormatStringsForExecs]
@I_vInputString = N'6252''5 002 ',
@O_cOutputString = @O_cOutputString OUTPUT,
@O_iErrorState = @O_iErrorState OUTPUT

SELECT @O_cOutputString AS N'@O_cOutputString',
@O_iErrorState AS N'@O_iErrorState'

SELECT 'Return Value' = @return_value

2017-03-10_09-43-34

So the procedure has taken the trailing spaces out, wrapped the passed string in quotes, and doubled up any quotes that were in the string to escape them. Thus the output of the procedure can be used in string concatenation to build a SQL script dynamically.

So you don’t want to use the output of this procedure directly in a comparison, or it won’t work:

Example
i.e. don’t do this with the output from the format strings procedure…

DELETE FROM RM00101 WHERE CUSTNMBR = @O_cOutputString

-as it will not match any customer numbers (unless your customer numbers are in quotes!).
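By contrast, here is a minimal sketch of how the quoted output is intended to be used, spliced into a concatenated dynamic SQL statement (the customer lookup is a made-up example):

DECLARE @cCustomer CHAR(255), @iError INT, @cStatement VARCHAR(500)

EXEC dbo.smFormatStringsForExecs
    @I_vInputString = N'6252''5 002 ',
    @O_cOutputString = @cCustomer OUTPUT,
    @O_iErrorState = @iError OUTPUT

-- @cCustomer now holds '6252''5 002' complete with its surrounding quotes,
-- so it can be concatenated straight into a statement.
SELECT @cStatement = 'SELECT CUSTNMBR, CUSTNAME FROM RM00101 WHERE CUSTNMBR = ' + RTRIM(@cCustomer)
EXEC (@cStatement)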

Screwed Allocations in Dynamics GP


SOP Entry

This is purely an observational post about a phenomenon we see in GP from time to time.

The telesales team perform data entry in the SOP screens at a lightning pace, tabbing through windows and hammering information into the screens very quickly while the customer verbalises the order to them. Over the years we regularly see inconsistencies between the IV and SOP modules. Here is one that I captured that I’ve seen a few times.

The order default site is “1” and the user has no reason to sell from any location other than “1”. There is no stock in location “5”, nor any reason to look at that location. The order below is curious.

2017-03-29_09-44-36

I would say that somehow the user has accidentally entered the item order quantity into the SITE ID field; every time I see this, the quantity ordered is the same as the accidentally entered site ID, hence my assumption.

That alone is not odd, but look further…

2017-03-29_09-47-08

The stock has fully allocated to the order line.

2017-03-29_09-47-44

However, looking above, that item has no stock (and never has had) in location 5. Indeed the location allocation shows zero allocated.

2017-03-29_09-49-47

Now above we look at location “1”, where the stock should have come from, and we see 5 allocated. If we drill into that allocation we find no documents allocated; however, switching to location 5 shows our SOP document.

So it seems to me that the line was originally taken from location “1” and allocated, then somehow the user managed to change the site ID without GP attempting to reallocate the stock. If it did reallocate, it would have to back order the items, as there are none to allocate in location 5.

The order line has then been saved with 5 as the site ID, leaving corrupt figures between the SOP and IV modules.
I can only think there is a way, or a speed, at which the events on the form don’t fire correctly or in the right sequence that allows this to happen.
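A hedged query to hunt for the symptom described above, open SOP lines where a numeric site ID happens to equal the quantity ordered (TRY_CONVERT needs SQL Server 2012 or later, and the match is only a heuristic):

SELECT SOPNUMBE, SOPTYPE, ITEMNMBR, LOCNCODE, QUANTITY
FROM SOP10200
WHERE ISNUMERIC(RTRIM(LOCNCODE)) = 1
  AND TRY_CONVERT(DECIMAL(19,5), RTRIM(LOCNCODE)) = QUANTITY

Any hits are worth eyeballing in Sales Transaction Entry before trusting the allocations.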

VST Controls®™ for Dynamics GP Visual Studio Tools development


I’m so glad the day has finally arrived when I can talk about this innovation; I think this is one of the coolest things in .NET development with Dynamics GP for some time. I have been restless to give it a go since I was first told about it.

What is the fuss about?

Last night VST Controls 1.0 was released. This is a really cool FREE .NET developer control library from Envisage Software (makers of PostMaster) & Precipio Services. When I say control library, it is actually a helper tool (provided as a .dll for GP 2013+) that allows standard Windows visual controls to be layered on a Dynamics GP form, and allows the GP addin to communicate with those controls.

.NET development on Dynamics GP is limited to the standard set of Dexterity controls that GP provides in Modifier. Today users take for granted, and expect as standard, being able to use more advanced controls in their applications.
Product image on item maintenance window

Above shows a product photo on the Item Maintenance form of GP. To a Dynamics NAV developer, perhaps excitement at being able to place a picture from the database onto the form seems odd? Trust me, this is cool for us GP .NET developers.

VST Controls®™ solves some of the frustration caused by being restricted to the small subset of controls that GP provides to the developer as standard. Now with VST Controls on the scene, a developer’s creativity is released, as any control, including 3rd party controls, can be placed on top of a GP form. Not only that, it is done with a few lines of code, so it is very simple to achieve!

Often I have wanted to be able to add, say, a web browser control or picture control, or even hyperlinks, to GP forms, but have ended up having to “pop” a dialog window (a .NET form) to do so. In many cases this upsets the flow that the user experiences as they traverse the user interface. Layering the VST Controls on top of the GP windows should solve many of these issues.

Hey, this is not a substitute for the design time experience we would get if the GP Modifier designer suddenly supported .NET controls, but it is the next best thing. There are also going to be some limitations to what you can do using this technique; already I’ve noticed issues around the user resizing the GP form, but I’ll tolerate those limitations for what this opens up to me.

Go try it

So go try it – let your imagination go wild. It is totally free, no royalties; it is offered as a contribution to our GP community, so well done guys!

Fabrikam Day 12th April 2017


Around the world people are posting photos of local landmarks and a #FabrikamDay banner, just for fun and to celebrate a sense of community around our product, Microsoft Dynamics GP.

Here is my contribution in front of The Angel of the North

FabrikamDay  - North East England

The date is a random date that was in the future, chosen to be used for the test company data in Dynamics GP.

fabrikam date

For more information on this, read the Fabrikam Day! post by Amber, and the post by Njevity To Go:
Attention Dynamics GP Customers and Partners: We Need Your Help to celebrate Fabrikam Day!
