Control Addins in AL

Hey everyone,

Quick post today to highlight the updated documentation for building control addins in AL.

The docs have recently been updated to include much more information about the properties and methods that are available.

Check out the documentation here.

You will also want to check out the style guide to make sure your control addin fits the look and feel of the new Business Central theme.
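
For reference, here's roughly what a bare-bones control addin object looks like in AL. This is just a minimal sketch (the object name, script files, and members below are made up), so refer to the documentation above for the full list of available properties and methods:

controladdin MyMapControl
{
    // A few of the sizing properties covered in the docs
    RequestedHeight = 300;
    RequestedWidth = 500;
    VerticalStretch = true;
    HorizontalStretch = true;

    // JavaScript and stylesheet files shipped with the extension
    Scripts = 'scripts/mapControl.js';
    StyleSheets = 'styles/mapControl.css';

    // A method that AL code can call on the addin
    procedure LoadMap(Latitude: Decimal; Longitude: Decimal);

    // An event the addin raises back to AL
    event MapLoaded();
}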

That’s all for now, happy coding!


Converting an ISV solution to an extension: Update 2

I realize it's been a bit since my last update. I decided to take a week of vacation (no, not brought on by this conversion process!), and then decided to build a separate app to complement our ISV solution.

Back to the conversion project… I've been able to spend a little over a week's worth of time since my last update, so I've got a few things I want to cover.

Environment

I was asked a little while ago about the environment I'm using, so I'll quickly explain. I'm using a 100% Docker-based system running locally on my Windows 10 Surface Book 2. With the luxury of 16 GB of RAM on my machine, I'm able to dedicate 8 GB to the container, which allows for a nice smooth experience without having to switch to Windows Server. If you want to get into building your own Docker system, check out Freddy's post to get started.

As for the image, I'm using the Business Central insider builds. You may want to choose another image, or if you do not have access to the insider builds, refer to this post to determine what's right for you.

Where I’m at in the Process

At the end of the last post, I had tackled some of the easy things. Now I’ve been able to get through a large chunk of the more tedious things. Remember, I’m not necessarily fixing issues right now, I’m still working my way towards an extension that builds and publishes.

Issue 1 – Report DataItem Naming

One of the issues that came up was with the converted reports. It appears that in our original reports we had named various dataitems 'Labels'. Well, guess what is a reserved word in AL? This was an easy fix, as it just involved renaming the dataitem. Since then I've tried to reproduce the error but can't, so I'm wondering if it has already been fixed in the more recent AL Language extension that I'm now using.
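
For anyone curious, the fix was literally just giving the dataitem a different name. Something along these lines (the report, table, and names here are only for illustration, not our actual object):

report 50100 "Item Shelf Labels"
{
    // Layout file not shown; this is just to illustrate the dataitem rename.
    RDLCLayout = 'layouts/ItemShelfLabels.rdlc';

    dataset
    {
        // This dataitem was originally named 'Labels', which turns out to be a reserved word in AL.
        dataitem(ItemLabels; Item)
        {
            column(ItemNo; "No.") { }
            column(Description; Description) { }
        }
    }
}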

Issue 2 – TextConst Dropped in Conversion

This is a weird one. I'll mention it here in case anyone else comes across this, but somewhere in the conversion of my C/AL objects to AL files, a bunch of TextConst variables were dropped. I don't know why, and there's a possibility that it was something I did, but nonetheless, I had a bunch of files to go through in order to fix the missing variables.
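
(For what it's worth, in AL these are declared as Label variables rather than TextConst, so once you spot a missing one it's quick to re-declare.)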

Issue 3 – Trigger Modifications Unsupported…or are they?

This one was a bit annoying, but I understand why it gets flagged by the Txt2AL conversion tool. If you're converting an ISV solution, you're almost certain to run into this.

In our table and page extensions, there were loads of places where we had put code in various triggers, such as OnValidate, OnOpenPage, OnAfterGetCurrRecord, etc. All of these changes were marked as "Unsupported" during the conversion process.

In your extension object, you do get a nice indicator of what the change was. The example below shows that I had made a modification to the OnOpenPage trigger and added some code at the end to populate a 'User ID Filter'. I often found myself going back to C/Side just to confirm the exact modification and what code was around it.

[Screenshot: the 'Unsupported feature' indicator for the modified trigger in the converted extension object]

This is a bit misleading, though, as you can create the necessary trigger in the extension file yourself and then add your existing code to it. Using the example above, my extension file would now look like this:

[Screenshot: the extension object with the OnOpenPage trigger recreated and the original code added to it]
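
In code form, it ends up being something along these lines (the page and the filter code here are simplified stand-ins for my actual modification, so treat it as a sketch):

pageextension 50110 "Sales Order List Ext." extends "Sales Order List"
{
    trigger OnOpenPage()
    begin
        // This code originally sat at the end of the base page's OnOpenPage trigger.
        // Recreated in the page extension, it now runs after the base trigger has completed.
        Rec.SetRange("Assigned User ID", UserId);
    end;
}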

The caveat to this, and the reason why the conversion tool flags these changes, is that you have to be conscious of when the triggers are fired and where your original code was within the trigger.

If your original code was at the start of the OnOpenPage trigger, the new code that you add to the OnOpenPage trigger in the extension object will not fire until AFTER the "original" trigger fires. Given this, while you may be able to recreate the triggers you need and put your code in them, the code may not actually fire when you intended it to.

If your existing trigger code was somewhere in the middle of the old trigger code, you will need to rework your solution to work within the OnBefore/OnAfter trigger structure.

Some triggers (e.g. OnInit) are not available in extension objects, so you need to be aware of that as well. The available triggers also change depending on whether you are adding your own field/control to the page or modifying an existing one, as the sketch below shows.
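
To make that a bit more concrete, here's a rough sketch showing the OnBefore/OnAfter style triggers you get when modifying an existing control, next to the regular triggers you get on a control you add yourself (standard Item Card page, but the code itself is just for illustration):

pageextension 50111 "Item Card Ext." extends "Item Card"
{
    layout
    {
        // Modifying an existing control: the validation triggers come in an
        // OnBeforeValidate / OnAfterValidate pair that wraps the base page's own code.
        modify(Description)
        {
            trigger OnAfterValidate()
            begin
                Message('Description is now %1.', Rec.Description);
            end;
        }

        // Adding your own control: you get the normal field triggers instead, such as OnValidate.
        addlast(General)
        {
            field(MyCustomNote; MyCustomNote)
            {
                ApplicationArea = All;
                Caption = 'My Custom Note';

                trigger OnValidate()
                begin
                    // your own validation logic goes here
                end;
            }
        }
    }

    var
        MyCustomNote: Text[100];
}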

You can always see what triggers are available to you by pressing Ctrl+Space after the trigger keyword.

[Screenshot: IntelliSense showing the available triggers after pressing Ctrl+Space]

Knowing all of this information, you can see why it doesn’t make sense for the conversion to automatically assume all of your trigger code can simply be moved over ‘as-is’. This one’s going to require some developer know-how!

Alright, I think this is it for now. I have more updates to share, but at the risk of turning this post into a novel, I’ll save those for the next post.

Until next time, happy coding!

Converting an ISV solution to an extension: Update 1

Well, what can I say… this is going to take a while. 🙂

If you have no idea what I am talking about, start here.

After getting the objects converted to AL (using the standard process documented here), I was immediately greeted in VS Code by just over 8000 errors/warnings. Great start.

Luckily half of those were generated by the code analyzers. If you don’t know what they are, see here. Yes, at this point in the project I am considering that half of my issues are “only non-standard and poorly formatted code”. Gotta try and stay positive!

This brings me to my first “recommendation” in this process:

Turn off ALL the code analyzers.

The code analyzers are enabled by default, but can be turned off in your VS Code settings (see here). Once I get to a point where the “non-syntax” errors have been addressed, I will re-enable the analyzers and then deal with that barrel of fun. I’ve done this before in much smaller apps and very quickly got tired of typing ‘()’ 🙂
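
If I remember correctly, the specific setting is al.enableCodeAnalysis; set it to false in your user or workspace settings and the analyzers stay out of your way until you're ready for them.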

After disabling the analyzers my list of errors/warnings was brought “all the way down to” just over 4000. Oh wow I’m halfway done!! Ha ha.

The next thing I decided to tackle was another fairly easy one: dealing with Product Group Code. If you're not already aware, the Product Group Code feature was deprecated in favour of an improved parent/child Item Category structure. The Product Group Code fields, however, were left in the product (not sure I like that, but we can deal with it) and marked for removal (see the ObsoleteState property here). This causes any references to those fields to be flagged as warnings. Your code will still build, but this is a heads up that at some point in the future the fields will (probably!?) be removed. As I'm already head deep in things to fix and rework, I decided I'd rather not wait, and I will remove references to those fields now.
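
For anyone who hasn't seen it yet, a field that's been marked this way carries the ObsoleteState property, roughly like the made-up example below (this is not the actual base field definition, just an illustration of what produces the warning):

tableextension 50120 "My Item Ext." extends Item
{
    fields
    {
        field(50120; "My Legacy Code"; Code[10])
        {
            Caption = 'My Legacy Code';
            DataClassification = CustomerContent;
            // Pending means the field still exists, but every reference to it now produces a compiler warning.
            ObsoleteState = Pending;
            ObsoleteReason = 'Replaced by the parent/child Item Category structure.';
        }
    }
}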

For me, removing the references to the Product Group Code was pretty easy. We didn't have major functionality tied to that field, but other ISV solutions may require more effort to remove those references, or you may just choose not to deal with it now. Up to you.

That’s my update for now, back to fixing more issues. More updates to come!

Happy coding!

Converting an ISV solution to an extension: The Beginning

I’ve been a bit neglectful of this blog for the past couple of months, but with good reason. I’ve been jumping around multiple projects, and have now landed on a project that will involve taking a roughly 3000 object ISV solution and converting it to an extension.

Easy task? Far from it.
Challenging? Most definitely……but that’s why we do what we do right!?
Will I be able to convert it all to an extension? Not likely. (Yes, extensions still have limitations)

What I want to do is post my journey through this project. It's not going to be a step-by-step of what to do, as this is the first time I'm doing this. Up to this point, my extension experience was building them from scratch, or taking small pieces of existing features and recreating them as extensions. Nothing on this level yet. In all likelihood, I will make some mistakes, or perhaps do things in a less than ideal way, but that's what I hope to capture in these posts, so that hopefully others can learn from them… and so I don't make the same mistakes the next time I might have to do this. 🙂

Now first off, I want to be clear that having a 3000+ object extension is not ideal. It’s not the actual end-goal of this project. The overall solution will eventually be broken down into a set of smaller extensions, but as the current design and code is so tightly integrated, we’re moving it all to AL now and will break it apart later.

My “plan” (I use that word loosely) for this project when I began was to do the following:

  1. Convert all net new ISV objects to AL.
  2. Convert modified base tables and pages to AL (e.g. table/page extension files).
  3. Rework existing C/Side objects to move to an AL-based solution.

The first 2 steps leave me with a large portion of the solution in AL, but also a portion back in C/Side. The objects in C/Side would be the base codeunits, xmlports, reports, etc. that were modified by the ISV solution but can't be directly converted to AL. That's where step 3 comes into play. Reworking the objects to use events and/or redesigning the ISV solution is going to be required to clean up those objects. All in all, a lot of work, and I know for a fact that I will come across some design that cannot be replicated in an extension… so stay tuned to see how I'm going to deal with that!
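
To give a sense of what that rework looks like, code that used to be injected directly into a base codeunit generally ends up in an event subscriber along these lines (the publisher, event, and signature here are simplified illustrations and can vary by version, so treat this as a sketch):

codeunit 50130 "My ISV Subscribers"
{
    // Instead of modifying the base codeunit, subscribe to an event it publishes.
    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Sales-Post", 'OnAfterPostSalesDoc', '', false, false)]
    local procedure HandleAfterPostSalesDoc(var SalesHeader: Record "Sales Header")
    begin
        // The code that previously lived inside the modified base codeunit goes here.
    end;
}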

Look for more posts on this as I move through this process. I hope to show both the good and the bad so that people can get a sense of what they’re in for if they come across a similar project.

Happy coding!

Dynamics 365 Business Central in an Azure Container Instance

I recently came across this article, which talks about using the Azure Cloud Shell to create an Azure Container Instance that runs Dynamics 365 Business Central. I was intrigued because Azure Container Instances has just recently been released to the public and I just gotta try the new stuff! Thanks Andrey for that article!

What is an Azure Container Instance, you might be asking? If you've been keeping up with Dynamics 365 Business Central development, you have been using containers to create your environments. This requires Docker to run on either a Windows 10 or Windows Server 2016 machine that's either hosted or on-prem. Either way, you're carrying the overhead of the machine, physical or virtual. With Azure Container Instances, you can create the containers directly in Azure without that machine overhead. This 'should' translate to some sort of cost savings, but as my container has only been up for about 2 hours as of the time of this article, I don't yet know whether or how much I'll save.

In Andrey's post, he walks you through using the Azure Portal and Azure Cloud Shell to create the container. Being the 'lazy developer' that I am though, I prefer to do as little manual work as possible, so I thought I'd take a stab at building a PowerShell script that I can run locally and potentially automate the entire process. Yup, even opening my browser to paste the code into Azure Cloud Shell is apparently too much work for me. 🙂

Turns out, this is pretty easy to do. Using the Azure Resource Manager PowerShell module, we can easily connect to our Azure account and create the necessary container pieces.

Here’s how…

Connect-AzureRmAccount
The first thing we need to do is connect to our subscription and tenant. The user will be prompted for credentials when this command is executed. If you don’t know what your subscription and tenant IDs are, you can find instructions here for the subscription ID, and here for the tenant ID.

New-AzureRmResourceGroup
Once we’re connected we need to create the Azure Resource Group that will be used for our container instance.

New-AzureRmContainerGroup
Once the resource group is created we can create the container. This is where we get to set the parameters for the container. One change I made from Andrey's initial post is that I assigned the container a DnsNameLabel, which means we can use the Fqdn to access the container instead of the IP address. If you've used FreddyK's NavContainerHelper module, you'll also notice that the parameters here are similar to some of the ones used by the New-NavContainer cmdlet. Hey, maybe we can get some new additions to the module for this stuff!

OK… here's the actual code. It's pretty basic at this point in time. Just getting my feet wet to see how it goes.

Install-Module AzureRM

### SET VARIABLES
$azureSubID = ''
$azureTenantID = ''
$azureResourceGroupName = 'myResourceGroup'
$azureLocation = 'EastUS'
$containerEnvVariables = @{ACCEPT_EULA='Y';USESSL='N'}
$containerImage = 'microsoft/bcsandbox:us'
$containerName = 'myContainer'

### CONNECT TO AZURE ACCOUNT
Connect-AzureRmAccount -Environment AzureCloud -Force -Subscription $azureSubID -TenantId $azureTenantID

### CREATE RESOURCE GROUP
New-AzureRmResourceGroup -Name $azureResourceGroupName -Location $azureLocation

### CREATE CONTAINER
New-AzureRmContainerGroup -Image $containerImage `
 -Name $containerName `
 -ResourceGroupName $azureResourceGroupName `
 -Cpu 2 `
 -EnvironmentVariable $containerEnvVariables `
 -IpAddressType Public `
 -MemoryInGB 4 `
 -OsType Windows `
 -DnsNameLabel $containerName `
 -Port 80,443,7048,7049,8080 `
 -Location $azureLocation

Once you execute the above script, go grab a coffee. After about 15-20 minutes your container should be up and running. You can check on the state of your container using the following code:

Get-AzureRmContainerGroup -ResourceGroupName $azureResourceGroupName -Name $containerName

When you run the above code you’ll see various properties of the container. What you want to pay attention to are ProvisioningState and State, which will appear as ‘Creating‘ and ‘Pending‘ as shown below.

[Screenshot: container properties with ProvisioningState 'Creating' and State 'Pending']

Once the container has been created, you should see the following statuses:

[Screenshot: container properties after creation, including the Fqdn property]

Take note of the Fqdn property and save the address value. This is the address that you will need to use to connect to your Business Central environment later on.

Once your container has a State of ‘Running‘, you can check the container logs by using the following code:

Get-AzureRmContainerInstanceLog -ContainerGroupName $containerName -ResourceGroupName $azureResourceGroupName

Running the above code will show you the container logs, and again, if you’ve been using the NavContainerHelper, these logs will look very familiar to you:

[Screenshot: the container logs returned by Get-AzureRmContainerInstanceLog]

Remember!!!
When you connect to your container via Visual Studio Code or the Web Client, or to download the VSIX, you need to use the address from the Fqdn property of the container instance, and not the address values that you see in the container logs.

Insiders
If you have access to the private insider builds for Business Central, you need to provide credentials in order to access the Docker image registry. You can do that by adding the ‘-RegistryCredential‘ parameter and supplying a PSCredential object to the New-AzureRmContainerGroup command.

Oh, if you’re into this kind of thing, you can check out the Azure Container Instance SLA here. It’s a super fun read! 🙂

Thanks again to Andrey Baludin for his original post on Azure Container Instances!

Happy coding!

AL Extensions: Translate Your Solution With the Multilingual App Toolkit Editor

In this post, I showed you how you can use Microsoft Dynamics Lifecycle Services to translate your AL project into a variety of languages.

As with most things, there are multiple ways to go about it. This post will look at the Microsoft Multilingual App Toolkit. This toolkit is integrated into Visual Studio, but there is also a standalone version, called the Multilingual App Toolkit Editor.

With this tool you can manually do your translation and/or you can connect it to the Microsoft Translator service via Azure, which is what I will describe in this post.

Here’s how…

Download and install the Multilingual App Toolkit Editor.
https://developer.microsoft.com/en-us/windows/develop/multilingual-app-toolkit

If all you want to do is work offline and manually do your translations, you can stop here. Continue on if you want to connect to the translation service in Azure, but note that you do need an active Azure subscription for this.

Enable the Translator Text API in Azure.
Using the Azure portal, do the following to add the Translator Text API to your Azure subscription:

  1. Choose “Create a Resource“.
  2. Search for “Translator Text API” and select it.
  3. On the Translator Text API blade, press the Create button to begin configuring the subscription.
  4. Fill in the fields accordingly for the API by giving it a name, pricing tier, etc. Note that there is a free tier option that lets you translate up to 2 million characters per month. Additional pricing info can be found here.
  5. Press the Create button to deploy the API to your Azure subscription. This might take a few minutes.

Get your Translator Text API authentication key.
Once the API has been deployed, you need to get your subscription key so that the Multilingual Tool can authenticate and connect to it.

  1. In the Azure portal, select “All Resources” and select the Azure subscription that you deployed the API to.
  2. In the list of resources, click on the Translator Text API service that you deployed.
  3. In the Translator Text API blade, select Keys.
  4. Copy one of the 2 keys that are associated with the service. You will need this key value in the next step.

Add Multilingual App Toolkit Editor credentials.
Now that we have the Translator Text API up and running, and the Multilingual App Toolkit Editor installed, we need to configure the authentication. We do that using the Windows Credential Manager.

  1. On your Windows machine, launch Credential Manager.
  2. Select Windows Credentials.
  3. Select Add a generic credential.
  4. Enter the following values:
    • Internet or network address: Multilingual/MicrosoftTranslator
    • User name: Multilingual App Toolkit
    • Password: <the Translator Text API key that you retrieved earlier>
  5. Click OK.
  6. Close Credential Manager.

Ok, now that we have everything installed, deployed, and configured, we can open up the Multilingual App Toolkit Editor (search for 'Multilingual Editor' in your Start menu) and translate the XLF file from our AL project. You can learn about generating this file here.
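
As a quick reminder of where the entries in that file come from: the Captions and Labels in your AL code are what end up as trans-units in the generated '.g.xlf' file. For example, a label like the one below (object and variable names are made up) produces a trans-unit whose source value is 'There is nothing to post.':

codeunit 50140 "My Posting Checks"
{
    var
        // The label text becomes the 'source' value of a trans-unit in the .g.xlf file.
        NothingToPostErr: Label 'There is nothing to post.', Comment = 'Shown when the user tries to post an empty document';
}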

  1. Copy the auto-generated ‘.g.xlf‘ file to create a new file in the same folder. Rename the file based on the AL standards here.
  2. Edit the new file and update the ‘target-language‘ property to be the language that you are translating the file to (e.g. fr-CA).
  3. Close and save the file.
  4. Using the Multilingual App Toolkit Editor, select Open and select your new file.
  5. From the ribbon, select Translate > Translate All. The toolkit will now use the Translator Text API in Azure to translate the file based on the source and target languages. This might take a few minutes depending on the number of items that need to be translated in your solution.
  6. Once the translation is done you can manually review and edit any of the entries if you wish.
  7. Close and save the file.

Now you have your new translation file. Simply repeat the steps to generate each translation that your AL solution requires!

Submitting to AppSource
If you are submitting your solution to AppSource, even if you do not need multi-language support in your solution, you still must provide (at a minimum) a translation file for the base language (e.g. en-US) in which your solution is published.

Note that the auto-generated ‘.g.xlf’ file is NOT a properly formatted translation file and your solution will not pass validation if you do not create at least the base language file.

In the pic below you have the raw ‘.g.xlf’ file as it gets created during the AL project build process. As you can see, there is only a ‘source‘ property for the message control even though the ‘target-language‘ of the file is set to ‘en-US’:

[Screenshot: the raw .g.xlf file, where each trans-unit has only a 'source' entry]

In a properly formatted translation file, you will have both the ‘source‘ and the ‘target‘ properties:

[Screenshot: a properly formatted translation file, where each trans-unit has both 'source' and 'target' entries]

In addition to the formatting, you can’t rely on editing the ‘.g.xlf’ file because it gets overwritten each time you build your AL project.

In short, use the ‘.g.xlf’ file ONLY as the source for generating other translation files.

EDIT: I was just informed that fellow MVP Tobias Fenster‘s ALRunner VS Code extension can (among many things) convert the ‘.g.xlf’ translation file into a properly formatted file. Quick and easy! Check it out here!

Happy coding!

AL Extensions: Managing Version Numbering With VSTS Builds

I wanted to do this post to talk a bit more about something that Soren Klemmensen covered in Part 2 of the blog series on automated builds with VSTS. We wanted to keep those original posts as simple as possible, but there are a few more topics that we would like to expand upon.

In that particular post, he covered the PowerShell code needed to generate the AL extension using the ALC compiler tool, and in there is some interesting code around setting the version number of the extension package that gets created. That’s what I want to talk a bit more about here.

As we know, the version number that is assigned to the AL extension needs to be stored in the app.json that sits in the AL project, and that value needs to be there before the compile process can be executed, like this:

[Screenshot: the version property in app.json]

Typically, this means that the developer is updating that value as needed before they compile and package the extension.

Wouldn’t it be nice to not have to have the developer do that every build? They’ve already got so much to worry about! 🙂

Well…it just so happens that one of the many great features in VSTS build definitions is the ability to automatically generate build numbers. Using a combination of static values and tokens, you’re able to generate any build number that you would like and in pretty much any format you would like.

You can read up on what tokens are available here, and how you can use them to build whatever format of build number you want.

In VSTS, these build numbers can be used to tag checkins so that you can track which checkin represents a build, but we're going to leverage this and build ourselves a properly formatted version number that we will inject into the AL extension during the build process. This way, the developer can sit back, relax, and wait for the VSTS magic to happen!

Here’s how we’re going to do it…

1. Update build definition

We need to update our build definition to create our version number for us. We do that by populating the Build number format box on the Options tab of the build definition. Here’s where you can use the VSTS tokens and/or static text.

For this walkthrough, my version number needs to meet the following criteria:

  1. It will follow the standard #.#.#.# formatting that is required for an AL extension.
  2. It will be based on the date that the build is executed.
  3. I need to ensure I always get a unique version number even if I generate multiple builds in a day.
  4. This is the first version of my extension, so I want to make sure the version number begins with 1.0.

Given all of this criteria, I’ll populate the field like this:

[Screenshot: the Build number format field on the Options tab of the build definition]
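
The screenshot isn't reproduced here, but given the criteria above, the format string is along the lines of:

1.0.$(Date:yyyyMMdd).$(Rev:r)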

As you can see, I am just hard-coding the “1.0.” so that every version I get begins the same. I’m using the date token (note the format can be changed if you like) so that the version contains the date on which the build is executed. Last but not least, I’m using the revision token to ensure that every time the build is executed I will get a unique version number. If you’re implementing Continuous Integration (and you really should!) you will certainly at some point have multiple builds done within the same day.

NOTE: The revision token is 1-based, so in my example the first build on a given day will end with .1 and not .0 (not what I would have preferred but what can you do).

2. Update build script

Remember I mentioned Soren's post? In that post the following code is added to your build script. We only really need to update one line in this script, but I do want to talk about a few of the other lines so you know what's going on here. The specific line we're interested in is the one that sets the Version property:

$ExtensionAppJsonFile = ".\app.json"
$ExtensionAppJsonObject = Get-Content -Raw -Path $ExtensionAppJsonFile | ConvertFrom-Json
$Publisher = $ExtensionAppJsonObject.Publisher
$Name = $ExtensionAppJsonObject.Name
$ExtensionAppJsonObject.Version = '1.0.' + $env:Build_BuildID + '.0'
$ExtensionName = $Publisher + '_' + $Name + '_' + $ExtensionAppJsonObject.Version + '.app'
$ExtensionAppJsonObject | ConvertTo-Json | set-content $ExtensionAppJsonFile
$ALProjectFolder = $env:System_DefaultWorkingDirectory
$ALPackageCachePath = 'C:\ALBuild\Symbols'
Write-Host "Using Symbols Folder: " $ALPackageCachePath
$ALCompilerPath = 'C:\ALBuild\bin'
Write-Host "Using Compiler: " $ALCompilerPath
$AlPackageOutPath = Join-Path -Path $env:Build_StagingDirectory -ChildPath $ExtensionName
Write-Host "Using Output Folder: " $AlPackageOutPath
Set-Location -Path $ALCompilerPath
.\alc.exe /project:$ALProjectFolder /packagecachepath:$ALPackageCachePath /out:$AlPackageOutPath


The first two lines read the app.json file from the AL project into a PowerShell object using ConvertFrom-Json. This will allow us to read/write any of the properties within the json file:

$ExtensionAppJsonFile = ".\app.json"
$ExtensionAppJsonObject = Get-Content -Raw -Path $ExtensionAppJsonFile | ConvertFrom-Json

Now what we want to do is update the Version property with the build number that our VSTS build definition generated. We can get that value by using one of the environment variables that VSTS makes available while the build process is running. There are a bunch of environment variables available to us (see here for more info) but the one that we’re interested in is Build.BuildNumber. The value in this variable is the value that VSTS generated based on the Build number format that you populated in your build definition. If you leave that field blank, then this environment variable would be empty.

This is where we need to replace some of the code from Soren’s post. The original code (shown below) sets the version number to Build.BuildID, which is the identifier that VSTS assigns to each build:

$ExtensionAppJsonObject.Version = '1.0.' + $env:Build_BuildID + '.0'

Since we want to control the version number format using our build definition, this is where we need to make a small change. We’ll replace the above line of code with the following that will update the Version property to Build.BuildNumber:

$ExtensionAppJsonObject.Version = $env:Build_BuildNumber

Notes

  • To access VSTS environment variables in your PowerShell script, begin the variable name with ‘$env:’ and replace ‘.’ with ‘_’.
  • Because these environment variables are only generated during the build process, you will not be able to manually run your PowerShell script to test their values.

Finally, we need to write our changes to the app.json file, as so far we've only made the changes in the PowerShell object. We use ConvertTo-Json for that:

$ExtensionAppJsonObject | ConvertTo-Json | set-content $ExtensionAppJsonFile

That’s It!

After you’ve done the above updates to your build definition and script, when you generate your build either manually or via trigger, your packaged AL extension will have version numbers similar to the following:

1.0.20180407.1   (first build of the day)
1.0.20180407.2   (second build of the day)
1.0.20180407.3   (third build of the day)
1.0.20180407.n   (nth build of the day)

Source code management

One of the obvious questions if you implement your build process like this: if the version number is only updated at the time the build is done, what happens with source code management? Well, to be honest, I do not track the version number of my extension in source code management at all.

Personally, I don't see the value. When my build runs, it tags the checkin that represents that build, so at any point in time I can go back and find the source code for any build that I've done, and if I need to I can rebuild it.

So what do I keep in source code management? I keep all of my extension versions permanently set to ‘0.0.0.0’. This means that no developer ever needs to touch the version property during the development stage. It also helps identify if a database has a released version of my extension installed versus a pre-release development version, which does happen from time to time.

Hopefully you’ve found this useful and you’ve now updated your build definition to provide you the exact version number format that you want.

For a listing of all the automated build articles, see my post here.

That’s it for now, happy coding!