When a record in Dataverse changes, the updated value can represent a state change within a business process that needs to trigger further action. Consider the Tasks table: if the Activity Status column changes to “Completed”, notify the user who submitted the issue. In Power Automate we can support this type of process using the Dataverse When a row is added, modified or deleted trigger, specifically its Modified change type, which can target specific table columns and values. In this blog post we will take a closer look at the trigger options and show how you can get started with creating Flows that take action based on a value change within a Dataverse table.
The When a row is added, modified or deleted trigger is available when creating an automated cloud Flow and includes the following options:
Change type choices to scope the Flow execution to a particular action like add, modify, and delete.
Table name choices for all of the tables available to you within your Dataverse environment.
Scope choices to limit the Flow execution based on the selected record's ownership.
Business Unit: the Flow will only execute if the record is owned by a user in the same Business Unit.
Organization: the Flow will execute regardless of ownership, for any record in the Organization.
Parent: Child business unit: the Flow will only execute if the record is owned by a user in the same Business Unit or a child Business Unit.
User: the Flow will only execute if the record is owned by the user.
Select columns will limit the Flow execution to the logical column names entered. Multiple columns are allowed and should be separated by commas.
Filter rows will limit the Flow execution to only those records meeting the specified OData filter query.
Delay until postpones the Flow execution until the time specified. Expressions are not supported; specify a date column from the triggering record.
Run as sets the user context for the Flow's connections during execution.
Flow owner: User listed as the Flow owner.
Modifying user: User triggering the Flow.
Row owner: User listed as the record's owner.
Create Flow using the When a row is added, modified or deleted trigger
Create a new automated cloud flow using the Dataverse When a row is added, modified or deleted trigger.
In our example, the Flow triggers on the Tasks table when the Activity Status choice column is set to “Completed”.
Before configuring the trigger, get the column's Logical name (A) and the desired choice Value (B) from Dataverse > Tables > Tasks > Columns > Edit column.
Set the When a row is added, modified or deleted trigger as follows:
Change type to “Modified”
Table name to “Tasks”
Scope to “Organization”
Select columns to “statecode”
Filter rows to “statecode eq 1”
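Before saving the Flow, it can help to sanity-check the Filter rows syntax against the Dataverse Web API, which accepts the same OData filter grammar. A minimal Python sketch (the org URL is a placeholder, and the tasks entity-set name and API version are typical defaults that may differ in your environment):

```python
from urllib.parse import quote

# Hypothetical environment URL -- replace with your own org.
ORG_URL = "https://yourorg.crm.dynamics.com"

def build_task_query(filter_expr: str, columns: list[str]) -> str:
    """Build the Web API query equivalent to the trigger's Select columns
    and Filter rows settings, handy for checking the OData syntax."""
    select = ",".join(columns)
    return (f"{ORG_URL}/api/data/v9.2/tasks"
            f"?$select={select}&$filter={quote(filter_expr)}")

# Same filter as the trigger: Activity Status (statecode) equals 1.
url = build_task_query("statecode eq 1", ["subject", "statecode"])
print(url)
```

Requesting this URL with a valid bearer token should return exactly the rows that would fire the trigger.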
With this configuration the Flow will execute if anyone sets the Activity Status column to “Completed”. The rest of my demo build includes steps to look up record owner and record modifier information.
Add a Dataverse Get a row by ID action and set:
Table name to “Users”
Row ID to Dynamic Value “Owner (Value)” from the trigger
Add a Dataverse Get a row by ID action and set:
Table name to “Users”
Row ID to Dynamic Value “Modified By (Value)” from the trigger
Add an Outlook Send an email (V2) action and set:
To to Dynamic Value “Primary Email” from the “Get a row by ID” owner action
Subject to Dynamic Value “Subject” from the trigger
Body to any static or dynamic message of your choosing.
My completed Flow looks like the image below.
To test your Flow, create a new Task in Dataverse and then update the Activity Status to “Completed”. If all goes well, the Flow will execute and you will receive the email notification. This is a simple Flow pattern that is useful for triggering further action as a result of state changes, such as sending notifications, updating related records and starting sub-processes.
Thanks for reading!
NY
I’ve created enough Power Automate routines to know that they all eventually fail for one reason or another, and it’s important to add resiliency to your production Flows where you can. This can include:
Using service accounts for your connections. This reduces the impact of account actions like password changes or user deactivations.
Adding co-owners to the Flow. This increases the options for support and continuity if the primary owner is not available.
Handling errors whenever possible. This accounts for known issues with data, actions and other conditions that are acceptable within the business process and can be dealt with predictably in the Flow.
All of my production Flows include steps that will catch errors and let me know that there was a problem. The out-of-the-box Power Automate failure notification emails lack detail and are not real-time enough for my production routines. There are many ways of handling errors in Power Automate and, in this post, I will share how I do it for my own Flows using Scope actions and Microsoft Teams Adaptive Cards.
Scope: Encapsulate a block of actions and inherit the last terminal status (Succeeded, Failed, Cancelled) of actions inside.
At a high level, our Flow pattern will try to run the primary actions that make up the workflow. If there are errors, it catches them and takes follow-up actions for known issues. If there are still issues, it can run a final set of fail-safe actions. This type of exception handling is known as Try-Catch-Finally.
In Power Automate we will use Scope actions as containers for the Try, Catch and Finally blocks of work. In addition, our Flow will capture any debug information using a string variable that will be used for our Adaptive Card notification.
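This Scope arrangement mirrors the try/except/finally construct found in most programming languages. As a rough analogy only, here is the same shape in Python, with made-up stand-ins for the real Flow actions:

```python
debug_log: list[str] = []

def primary_actions():
    # Stand-in for the actions inside "Scope - Try"; simulate a failure.
    raise RuntimeError("Get a row by ID returned no record")

def notify_owner(log):
    # Stand-in for the Adaptive Card notification to the Flow owner.
    print("\n".join(log))

def run_flow():
    try:
        # "Scope - Try": the primary workflow actions.
        primary_actions()
    except Exception as err:
        # "Scope - Catch": runs only when something in Try failed.
        debug_log.append(f"Caught: {err}")
        notify_owner(debug_log)
    finally:
        # "Scope - Finally": always runs; central logging lives here.
        debug_log.append("Flow finished")

run_flow()
```

The Finally branch runs whether or not the Try branch failed, which is exactly the behavior the run after settings below reproduce in Power Automate.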
My typical Flow looks like the image below.
Part 1 – Add debug variable.
Add an Initialize variable action and set:
Name to “varDebug”.
Type to “String”.
Value to “List of action names and statuses:
“.
The added line spacing is intentional and makes the debug information more readable within the Adaptive Card.
Part 2 – Add Try block.
Add a Scope action, rename it to “Scope – Try” and place all the primary workflow actions within the Try block. This name is used in later actions, so precision is important.
Part 3 – Add Catch block.
Add another Scope action, rename it to “Scope – Catch” and then:
Click the Menu for Scope – Try. Hint: the ellipsis (…) button.
Select Configure run after.
Set Scope – Try to “has failed” and then click Done.
This setting ensures that the Catch block only runs if anything in the Try block has failed. Notice that the arrow between the Try and Catch Scope actions is now red, showing that it only runs on error.
Within Scope – Catch, add an Apply to each action and set:
Select an output from previous steps to the following expression: result('Scope_-_Try')
Add an Append to string variable action and set:
Name to “varDebug”.
Value to: “ – Action Name: @{items('Apply_to_each_-_varDebug')?['name']} – Action Status: @{items('Apply_to_each_-_varDebug')?['status']}
“.
Add a Compose action to the Apply to each loop, rename it to “Compose – varDebug” and set Inputs to the “varDebug” variable. This name is used in later actions, so precision is important.
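Conceptually, result('Scope_-_Try') returns an array of objects, one per action inside the scope, each carrying name and status properties among others. The Apply to each and Append to string variable pair above is doing roughly the following (a Python sketch over a made-up result payload):

```python
# Simulated output of result('Scope_-_Try'): one object per action
# in the scope, trimmed to the two properties the Flow uses.
scope_try_result = [
    {"name": "Get_a_row_by_ID", "status": "Succeeded"},
    {"name": "Send_an_email_(V2)", "status": "Failed"},
]

# varDebug starts with the header set in the Initialize variable action,
# then each loop iteration appends one name/status line.
var_debug = "List of action names and statuses:\n"
for item in scope_try_result:
    var_debug += (f" - Action Name: {item['name']}"
                  f" - Action Status: {item['status']}\n")

print(var_debug)
```

The final string is what the Compose – varDebug action captures and the Adaptive Card displays.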
Add a Post adaptive card in a chat or channel action and set:
Post as to “Flow bot”.
Post in to “Chat with Flow bot”.
Recipient to your Flow owner. In my sample Flow the owner is included in an object variable definition, hence the variables(…) reference in the sample image. For my production Flows I store owner info in a table, which is queried at run time.
Adaptive Card to JSON code listed in this text file.
Customize the following lines of code in the Adaptive Card to meet your needs:
C) Flow name “text”: “@{variables(‘varParameters’)[‘varFlowName’]}”
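For reference, a minimal Adaptive Card with just a title and the debug text might look roughly like the fragment below. This is an illustrative sketch, not the full card from the linked text file; the expression bindings follow the variable and action names used in this post:

```json
{
  "type": "AdaptiveCard",
  "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
  "version": "1.2",
  "body": [
    {
      "type": "TextBlock",
      "weight": "Bolder",
      "text": "Flow failure: @{variables('varParameters')['varFlowName']}"
    },
    {
      "type": "TextBlock",
      "wrap": true,
      "text": "@{outputs('Compose_-_varDebug')}"
    }
  ]
}
```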
Ensure that Outputs from the Compose – varDebug action display exactly as shown below. This is the captured debug information from an error.
My completed Scope – Catch block looks like the image below.
Any errors in the Try block are captured in the Apply to each loop, formatted into a single string using the Compose action and then sent to the Flow owner using the Adaptive Card. In the sample image below I get a breakdown of the Flow steps, statuses and quickly see where the failure occurred.
For my production Flows that are related to Microsoft Lists or Dataverse data I include an action button that links to the specific item / record on the Adaptive Card.
Part 4 – Add Finally block.
Add another Scope action, rename it to “Scope – Finally” and then:
Click the Menu for Scope – Finally. Hint: the ellipsis (…) button.
Select Configure run after.
Select all of the run after options (is successful, has failed, is skipped, has timed out) and then click Done.
This setting ensures that the Finally block always runs regardless of any failures in the Try and Catch blocks. The Finally block is ideally suited for central logging activities outside of the core business process that the Flow is supporting.
My completed Flow looks like the image below. Making a Flow fail can be tricky, so to test, consider updating a code-based action like Parse JSON or Post adaptive card in a chat or channel in the Try block with an unexpected character to force an error.
All things with technology eventually fail. Using a Try-Catch-Finally approach to exception handling in Power Automate will help to add resiliency to your production Flows.
Thanks for reading!
NY
I recently encountered a scenario where I had to extract multiple lines of text from a system-generated email using Power Automate. The key to this Flow pattern is the predictability of the system-generated message: I can count on line 1 always being line 1, line 2 always being line 2, and so on. The Flow was difficult to create but is simple in its final implementation, using the Html to text and Compose actions.
In the image below we can see my system generated email. Our Flow will extract lines 1-4.
I am using an Automated cloud flow with the When a new email arrives (V3) trigger, with the Subject Filter specifying the conditions under which the Flow will execute. In this case any email with a subject of “System email message” will trigger the Flow. The context of the trigger is based on the account defined in the connection, i.e. my inbox.
Next, I add an Html to text action and set the content to Body from the When a new email arrives (V3) trigger.
The Html to text action converts the email body (A) into readable and usable content (B).
Using a Compose action with an expression we extract our target text line.
The expression code is listed below and generally reads as:
Yellow: Replace the hidden line break characters with a pipe symbol using decodeUriComponent('%0A')
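In the Flow expression, decodeUriComponent('%0A') is simply a way to produce the newline character, and the expression then splits the converted body and picks a line by position. The same idea in Python (the sample body text is made up, and I split on the newline directly, which is equivalent to first swapping it for a pipe and splitting on that):

```python
from urllib.parse import unquote

# decodeUriComponent('%0A') in Power Automate yields the newline character.
newline = unquote("%0A")
assert newline == "\n"

# Simulated output of the Html to text action.
plain_body = "Line 1: Order 1234\nLine 2: Widget\nLine 3: Qty 2\nLine 4: Ship Friday"

# Split on the hidden line breaks and pick a target line by index,
# relying on the system email always having the same layout.
lines = plain_body.split(newline)
line_two = lines[1]
print(line_two)
```

Because the message format is predictable, indexing into the split result is safe; if the sender ever changes the layout, the indexes need revisiting.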
I’m sure there are more elegant ways of performing this operation, but it suits my needs and the value it adds outweighs any risk of refactoring if the system generated email ever changes. In my production version of this Flow, data is written to Dataverse for review and action as part of a larger business process and has error handling to let me know if things fail.
Thanks for reading!
NY
One of the great values of using Dataverse Model-driven apps is the ability to use the out-of-the-box forms. These forms supply the user interface to the tables and data within our app and can be customized to meet requirements without having to learn code or incur technical debt. The out-of-the-box forms can be added and customized as part of a solution and, if called for, new ones can be created. In this post, we will walk through adding an existing form to a solution, customizing the form and creating a new form.
There are four types of forms in Dataverse:
Main is the default Model-driven app user experience when viewing and editing data. In the sample image below, we see the Main form for the Account table.
Quick Create is a simplified version of the Main form and is used for in-context record creation without having to switch between table forms. In the sample image below, we see the Quick Create form for the Account table supplied in-context while in the Contact table.
Quick View forms are embedded into a Main form and show related read-only information on the record being viewed. In the sample image below, we see the Quick View Account form while in the main Sales table form.
Card forms are light-weight forms that are used in views for mobile experiences. In the sample image below, we see the Account Card in a mobile device experience.
Adding an existing form
Adding an existing form to an existing table is relatively simple. From your Power Apps solution, expand Tables, select your target table and then click Add existing.
Select Forms, select your target form and then click Add.
Our existing form is now added to the table forms within our solution.
Update an existing form
After adding an existing form, you can customize it through the Edit option in the Commands menu.
The form opens and you can add Components (UI elements), Table columns (fields), Form libraries (code components) and Business rules (data driven actions).
Adding a column is as simple as clicking the target field and then rearranging it on the form. In the example images below, I am adding the “Account Health” column to the form and then positioning it as I see fit.
There are quite a few Components worth exploring and adding as needed to your forms. The image below shows the out-of-the-box components. I like to use the 1-column section and 3-column tab to group like content and to split out subject matter on larger forms. For related tables I like to use the Quick view control to display relevant information from the “parent” to the “child” record.
Form libraries and Business rules are too complex a subject to explore in this blog post but are worth learning more about if you have the need.
Create a new form
Adding a new form is as simple as clicking New form, selecting your form type and then designing as desired.
When you create a new Main form, it will use the existing Main form as a starting point. In the sample image below, I have created a new Main form, customized to show only address information.
You can switch between forms by clicking the drop down menu next to the table name.
As a non-developer I can’t say enough good things about Dataverse Model-driven apps. The platform allows me to focus on understanding business problems and solving them without having to put the tool in front of the problem. Knowing how to add, customize and create new forms is part of the Model-driven app experience that adds tremendous value with a low investment of time and no technical debt.
In an earlier blog post, I wrote about Dataverse environments and how they are the container that stores the data, apps, processes, reports, and other components within a Dataverse solution. A Dataverse solution is the way that we package our application and all its parts into a single file that can be used for distribution or for application lifecycle management (ALM) processes. In this post I will show how to create a new Dataverse solution. Some of the content in this post comes from an article that I wrote for AvePoint called How to Get Started With Dataverse Solutions. This post is intended to be a lighter-weight version of the same content.
Click New solution and then click New publisher. A solution publisher shows who developed the app.
Complete the New publisher form by supplying the following information:
Display name: This is the publisher’s name shown in the solution
Name: This is the internal publisher’s name with no spaces or special characters
Prefix: Gets added to all custom solutions items and will help discern delivered Dataverse items from your custom items
Choice value prefix: This is a numeric value that gets added to all choice value internal identifiers. These become important when using Power Automate, but are not surfaced on the application front-end
Contact: This is the default organization information
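As an illustration of the Choice value prefix, new choice options typically get internal values built from the prefix followed by a four-digit sequence number. A hypothetical sketch (the prefix and the exact composition rule are assumptions worth verifying in your own environment):

```python
CHOICE_VALUE_PREFIX = 10000  # hypothetical prefix entered on the publisher

def choice_value(option_index: int) -> int:
    """Compose an internal choice value as <prefix><4-digit index>,
    which is how the prefix typically surfaces in Power Automate."""
    return CHOICE_VALUE_PREFIX * 10_000 + option_index

# First three options created under this publisher.
values = [choice_value(i) for i in range(3)]
print(values)
```

These numeric values are what you compare against in Flow conditions and OData filters, even though users only ever see the choice labels.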
Click Save to create the new publisher and return to the solution creation screen.
Complete the New solution form by supplying the following information:
Display name: This is the name shown in the solution
Name: This is the internal solution name with no spaces or special characters
Publisher: We created this during our earlier steps.
Click Create to create the new solution.
Our newly created solution is now available in the Solution listing.
Click into your solution and notice the default and new options available for us to add existing objects from the Environment or to create from new.
To get our solution kick-started, let’s add some commonly used existing tables: Account and Contact. Click Add existing and then click Table. Select the Account and Contact tables and then click Next.
Click Add to include the existing tables into our solution.
To see our tables in action we will create a Model-driven app by clicking New, selecting App and then clicking Model-driven app.
Supply a Name and then click Create.
Click Add page to include the Account and Contact tables.
Select Table based view and form and then click Next.
Select Account, Contact and ensure that Show in navigation is selected and then click Add.
Click Play to test your Model-driven app. Click Save and continue if prompted.
If you click the Accounts entry you will see the value that Dataverse and the Common Data Model brings to citizen developers and makers alike with all of the forms, views, tables, columns and so on already created. Value is further realized if data already exists in the shared objects like the Accounts table by maintaining a single source of truth.
I am a big fan of Dataverse and Model-driven apps as they allow me to focus on data models, automations and process flows that meet my user’s needs. Solutions are core to the Dataverse story as a means of packaging solution componentry into a single, manageable and portable container.
In future blog posts we will build out our Issue Tracker solution with customized views, forms and automations. Thanks for reading!
An environment in Dataverse is the container that stores the data, apps, processes, reports and other components within a Dataverse solution. A common use case for creating added Dataverse environments is to keep “production”, “test”, and “development” activities separated. In this blog post, we will walk through the steps of creating a new Dataverse environment and learn about the environment types and scenarios where you would use them along the way.
There are six environment types available for Dataverse and each is intended for a specific use. For example, the Production type is used for the long-term work of the organization, while the Trial type is used for short-term testing. Knowing the environment’s purpose will guide you in what type you will create. Here are the six environment types and example use cases:
Production: Intended for long-term storage and work. Use case: a production app running the business of an organization.
Default: Auto-created with the tenant. Use case: a user-created Flow that saves email attachments to OneDrive.
Sandbox: Intended for non-production use. Use case: a development area for an app.
Trial: Intended for short-term testing. Use case: feature testing in Dataverse.
Developer: Intended for single-user development. Use case: a learn-and-lab environment for Dataverse.
Microsoft Dataverse for Teams: Auto-created when Power Apps is added to Microsoft Teams. Use case: a custom app built for Teams.
For a more detailed explanation of the different environment types see the Microsoft Environment Overview page.
In our demo, we will create a new Sandbox environment that can be used for Dataverse development. Start by connecting to the Power Platform admin center at: https://admin.powerplatform.microsoft.com.
Click Environments
Click New
Supply an environment Name, select your target Region, set Create a database for this environment to Yes and then click Next
Review the available options, update as needed and then click Save
The provisioning process can be viewed in the State column
You can access the newly created environment through the Power Platform Admin Center for administrative actions like performing backups, applying updates and granting access.
To start creating and building solutions in your new Dataverse environment, go to https://make.powerapps.com/ and then select your target environment.
Despite being a relatively simple exercise, creating a new Dataverse environment does require the correct permissions and Microsoft licensing. See the Microsoft Who can create environments article for a detailed breakdown of requirements. If you plan on developing and deploying Dataverse solutions, do so in the right environment and the right environment is rarely the Default. For more information on planning your environments check out this post on the Power Apps blog Establishing an Environment Strategy for Microsoft Power Platform.
Thanks for reading!
NY
Dataverse is sometimes viewed as a database service, but it’s much more than just tables and views. Also included are apps, forms, processes, workflows, and other built-in and reusable objects that can be used to create and distribute applications.
One such inclusion is Power Apps, and with it the ability to create Canvas and Model-Driven applications. I especially like Model-Driven apps because they allow me to configure a solution based on a data model that reflects the business processes that I am trying to support.
Once an application is built, it invariably needs to move between Power Platform environments like “development,” “test,” and “production.” A Dataverse solution is the way that we package our application and all its parts into a single file that can be used for distribution or for application lifecycle management (ALM) processes.
In this post, we will walk through the steps of creating a Dataverse solution that can be moved between environments and explain some of the related concepts along the way. To get started, open Power Apps at https://make.powerapps.com/ and then select your target environment.
Create a new solution
Click Solutions and then click New solution.
Click New publisher. A solution publisher shows who developed the app.
Complete the New publisher form by supplying the following information:
Display name: This is the publisher’s name shown in the solution
Name: This is the internal publisher’s name with no spaces or special characters
Prefix: Gets added to all custom solutions items and will help discern delivered Dataverse items from your custom items
Choice value prefix: This is a numeric value that gets added to all choice value internal identifiers. These become important when using Power Automate, but are not surfaced on the application front-end
Contact: This is the default organization information
Click Save to create the new publisher and return to the solution creation screen.
Complete the New solution form by supplying the following information:
Display name: This is the name shown in the solution
Name: This is the internal solution name with no spaces or special characters
Publisher: We created this during our earlier steps.
Click Create to create the new solution.
Our new solution is now available in our environment.
When we click into the solution, we see the default object types like Apps, Chatbots, Cloud flows, and Tables. Clicking Add existing will allow us to import items that we previously created, or leverage items delivered as part of the Common Data Model (CDM). Note: CDM is a part of Dataverse.
For the purposes of this blog post, we will create an app with existing and custom tables to help track customer issues. Click Add existing and then click Table.
Select the Account and Contact (not shown below) tables and then click Next. These two tables are part of the CDM and have preconfigured columns, metadata, forms, views, and relationships that will help us build our solution faster.
Select Include all components for both tables and then click Add. Components include the columns, metadata, views, forms, keys, business rules, and so on.
Create a new table called Source by clicking New and then clicking Table.
Set the Display name to “Source” and select Enable attachments (including notes and files) and then click Save. Note: Dataverse creates a plural name for the table so try to name tables in singular form. Enabling attachments will allow us to use the Timeline control in our Model-Driven app.
Click the Source table and then click Add column to create the following new columns:
Source Name: Text with max length 100
Source Owner: Email with max length 100
Click Save Table to commit your changes.
Click Tables to return to the tables screen.
Create another table called Issue and set the Display name to “Issue” and select Enable attachments (including notes and files) and then click Save.
Note that our publisher prefix is added to the schema name, i.e. nys_Issue.
Add the following columns to the Issue table:
Folder Location: URL with max length 255
Issue Description: Text with max length 1,000
Priority: Choice with the following choices: Critical, High, Normal, Low
Status: Choice with a new choice having the following options: Blocked, In progress, Completed, Duplicate, By design, Won’t fix, New
Account: Lookup with related table set to Account
Contact: Lookup with related table set to Contact
Assigned To: Lookup with related table set to Contact
Date Reported: Date Only with default options
Due Date: Date Only with default options
Days Old: Whole Number with default options
Issue Source: Lookup with related table set to Source
Click Save Table to commit your changes.
Now that our data model is built, we can create our model-driven app by clicking Apps, New, App, and then selecting Model-driven app.
Select the Modern app designer (preview) experience and then click Create.
Enter “Issue Tracker” as the app name and click Create.
We can incorporate our data model into our app by adding Pages. Pages are the views and forms that come as part of Dataverse and require very little customization to deliver an excellent user experience. To get started, click Add page.
Set the page type to Table based view and form and click Next.
Select the Account, Contact, Issue and Source tables and then click Add.
Click Save and then click Publish to finish building the app.
There are many options to further customize the app, table views, and forms. See these articles for more information:
With a small amount of view and form customization, our app looks like these images below.
Export solution
Now that our app is built, let’s export the solution by clicking Export on the Overview section of the solution.
There are two options given to us prior to exporting:
Publish all changes will promote all changes made to our solution items. This is generally a good idea if all of your changes are complete and ready for use.
Check for issues is a system check to see if there is anything that is not ready for use.
Click Publish and wait for all items to be published. Click Check for issues and wait for the issue check to complete. You will receive an email notification of any findings with Check for issues. Click Next.
At this point you have the choice to export as:
Managed: The items within the solution cannot be changed in the destination environment. This is good for moving validated code to a production environment.
Unmanaged: The items in the solution can be changed as desired. This is good for sharing code with others.
In this example we will export as Managed. Select Managed and click Export.
After the export file is generated, you can download the file. Click Download to save the solution file locally.
If you were to inspect the export file, you would see the XML files that allow the solution to be rebuilt.
XML file snippet shown in the image below.
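A solution export is an ordinary zip archive whose well-known members include solution.xml (the manifest), customizations.xml and [Content_Types].xml. The sketch below builds a tiny stand-in archive in memory and inspects it the same way you would a real export file; the XML content shown is a trimmed illustration, not a real manifest:

```python
import io
import zipfile

# Build a minimal stand-in for an exported solution zip in memory.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("solution.xml",
                "<ImportExportXml><SolutionManifest/></ImportExportXml>")
    zf.writestr("customizations.xml", "<ImportExportXml/>")
    zf.writestr("[Content_Types].xml", "<Types/>")

# Inspect it the way you would a downloaded export file.
with zipfile.ZipFile(buf) as zf:
    names = zf.namelist()
    manifest = zf.read("solution.xml").decode()

print(names)
```

Against a real download you would pass the file path to zipfile.ZipFile instead; the manifest carries the solution name, version and publisher details confirmed during import.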
Import Solution
The process of importing your solution file is straightforward. Select your destination environment in Power Apps, click Solutions, and then click Import.
Click Browse and select your downloaded export file. Click Next.
The Import a solution screen will confirm the details that we specified in the export (name, type, publisher, and version). Click Import to start the process.
The import process can be viewed from the Solutions screen within Power Apps. Notice the grey banner near the top of the screen. It usually takes a few minutes for the import process to complete.
Once complete, the grey banner will turn to green showing a successful import.
Click the Issue Tracker solution and notice that all of our objects have been recreated in our destination environment. Also note that we are not able to edit any of the items directly as they are part of a managed solution.
Play your app to see the Issue Tracker app in action.
Data is not included in the solution export and import process. That’s a blog post for another day.
Conclusion
Dataverse solution files are the cornerstone of Power Platform ALM. Even at a basic level, manually exporting and importing provides a higher degree of maturity than rebuilding applications by hand, or worse, developing in a production environment. Look to Azure DevOps for advanced Power Platform ALM, where these mundane tasks are automated.
Have any other topics relating to Dataverse solutions or model-driven apps that you’d like to see covered? Let me know down below!
Thanks for reading!
NY
If you want to get started with Dataverse but do not have a license available, get a Power Apps Developer Plan. The Developer Plan gives you Power Apps, Power Automate, and Dataverse for non-production use. Learn more at: https://powerapps.microsoft.com/en-us/developerplan/
You have two options for signing up:
Get started free, use this choice if you are not using Power Apps, Power Automate or Dataverse.
Existing user? Add a dev environment, use this choice if you are already using Power Apps, Power Automate or Dataverse.
The setup experience in both cases looks like the images below. Important: a work or school account is needed.
Click Get started free or Existing user? Add a dev environment.
Enter your work or school email address and click Next.
Select your country or region, enter your phone number and click Get started.
Click Get started to confirm your details.
You will be prompted to select your country again; then click Accept.
Your new development environment will open in the Power Apps maker experience. The environment name will be your full name in possessive form followed by “Environment”. For example, “Norm Young’s Environment”. You cannot change this name. Outside of dataflows, you get all of the standard and premium features.
Not every organization will have access to Dataverse. Using the Power Apps Developer Plan will give you opportunities to discover the feature set and prove the value to your organization prior to buying more licenses. Sign-up today and get developing!
Thanks for reading.
NY
Despite this blog post’s catchy title, there is no magic upgrade button to move Microsoft Lists to Dataverse for Teams. There may be in the future, but until then migration translates to rebuild. A list rebuild in Dataverse will most likely be driven by a design requirement that Lists is not suited to, like complex table relationships or custom security configurations in the data. In this post I am redesigning my list to have a more normalized structure. I am using a customized version of the Microsoft Lists Issue Tracker template in this demo and will split the list into multiple tables. The Issue Tracker list structure lends itself to capturing process transactions. The Issue Source column stores the related apps and services that might have issues that need tracking. Moving Issue Source to a related table allows us to add more metadata about the source (like the person responsible), standardizes data entry, and makes for a better user experience by replacing manual data entry with a lookup. In a relational design view, Issue Source will have a 1-to-many relationship to the Issue Tracker table.
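As a rough sketch (the field names below are illustrative, not the actual Dataverse schema), the normalized design looks like this:

```python
from dataclasses import dataclass

@dataclass
class Source:
    """One row per app or service that can generate issues."""
    source_id: str      # e.g. "SRC-1000"
    name: str
    owner_email: str    # extra metadata enabled by normalization

@dataclass
class Issue:
    """Many issues can reference a single Source (a many-to-one lookup)."""
    issue_id: str       # e.g. "ISS-1000"
    title: str
    source: Source      # lookup column instead of free-text entry

teams = Source("SRC-1000", "Microsoft Teams", "owner@contoso.com")
issue = Issue("ISS-1000", "Cannot join meeting", teams)
```

The lookup replaces repeated free-text entry of the source name with a single reference to the Source row.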
To get started with our list migration, connect to Microsoft Teams, click More added apps, and then select Power Apps.
Click Start now on the How to create an app for your team screen.
Select your target team and then click Create.
Power Apps will create the Dataverse for Teams database and send you an email when complete. After the provisioning is complete you are prompted to supply an app name in the Power Apps Teams maker experience. Enter an app name and click Save.
We will create our new tables, Source and Issue Tracker, by clicking With data on the Start this screen form.
Our Source table has a 1-to-many relationship with the Issue Tracker table, which means we should build Source first. Enter a table name and click Create. Notice that Dataverse tables are usually named in singular form and then given a plural name within the create table experience.
We can add columns to our table by clicking the Add column button. I like to add an “ID” column to all of my tables using the Auto number column type with a string prefix. If the Name value changes in the table, I can supply data continuity through the ID column. To create the column, click Add column and set:
Name to “Source ID”
Type to “Auto number”
Autonumber type to “String prefixed number”
Prefix to “SRC”, this can be any value but should reflect the table name in some form
Minimum number of digits to “4”, this value should reflect the max number of entries you expect in this table
Seed value to “1000”, this can be any value and is usually influenced by a business requirement
Max length to “100”, the maximum number of characters allowed for the column value
Click Create
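Conceptually, the settings above generate a sequence of values like the following sketch (the exact separator Dataverse uses depends on the autonumber format; the hyphen here is an assumption for illustration):

```python
def autonumber_preview(prefix: str, seed: int, min_digits: int, count: int) -> list[str]:
    """Sketch of the values a string-prefixed autonumber column produces."""
    # Each new row increments the sequence; numbers are zero-padded to min_digits.
    return [f"{prefix}-{seed + i:0{min_digits}d}" for i in range(count)]

print(autonumber_preview("SRC", 1000, 4, 3))  # ['SRC-1000', 'SRC-1001', 'SRC-1002']
```

If the seed outgrows the minimum digit count, the value simply gets longer, which is why the minimum should reflect the number of entries you expect.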
The Source table will include a service owner column. Unlike Microsoft Lists, there is no Person column type, so we will use the Email column type instead. Click Close when complete to save the table changes and return to Power Apps.
Our next step is to create the Issue Tracker table.
Add the following columns (shown as column name: column type):
Issue ID: Auto number with Autonumber type set to “String prefixed number”, Prefix set to “ISS”, Minimum number of digits set to “4”, Seed value set to “1000”, and Max length set to “100”
Folder Location: URL with max length 255
Issue Description: Text with max length 1,000
Priority: Choice with the following choices: Critical, High, Normal, Low
Status: Choice with the following choices: Blocked, In progress, Completed, Duplicate, By design, Won’t fix, New
Assigned To: Email with max length 100
Date Reported: Date with default options
Due Date: Date with default options
Days Old: Number with default options
Issue Source: Lookup with the related table set to Source
Column creation images are shown below.
Click Close when complete to save the table changes and return to Power Apps.
Back in Power Apps, click With data in the Start this screen and select “Issues” to build our starter app.
Our starter app looks good but requires a few modifications to make it work for our design.
Click EditForm1 and then click Fields.
We will add the Issue Source column to our form by clicking Issue Source and then clicking Add. Use the Columns and Layout controls to lay out the form in your preferred way.
With very little modification our starter app is simple, clean and looking good.
We need to publish the app to Teams for general use. Click Publish to Teams and update the Name, Icon and fill colors as desired.
Our finished app looks like the image below.
Microsoft Lists are great, and I use them every day in my professional life. When requirements exceed what Lists are capable of then migrating to Dataverse for Teams is a smart choice. You don’t have to be a developer to take advantage of what is possible with the out-of-the-box Dataverse for Teams functionality. In future posts we will extend the functionality of our app with Power Automate and Power Apps.
Thanks for reading!
NY
Update: Soon after posting this article, Scott McKenzie let me know that: “Dataflows for Dataverse Teams is now available in preview. If you create a new Dataflow utilizing your existing SharePoint List, you have the ability to create the Entity with matching Fields and transfer the data rather than having to build the Entity and Fields from scratch.” If I were doing straightforward List to Dataverse for Teams migrations I would definitely look into Dataflows.
I’ve previously written about how to import data from Excel into Microsoft Lists. In this post, we will switch things up and export list data into Excel using Power Automate. In our scenario, we will export a subset of the list data into a date-named spreadsheet that is stored in Microsoft Teams daily.
Excel Preparation
Our solution requires an empty Excel file, formatted as a table, with columns that match our list schema. This file will serve as a template for the export process and will be named Template.xlsx. Take note of the table name within your template file.
Be sure to take note of the SharePoint site URL associated with the Teams site. Hint: From the Files tab, click Open in SharePoint and copy the site URL.
Flow Build
Create a new Flow from our list by clicking Integrate > Power Automate > See your flows.
Click New Flow > Scheduled cloud flow.
Supply a Flow name, i.e. “Export List to Excel”, set the recurrence to “Day” and click Create.
Create an Initialize variable action and set:
Name to “varFileName”
Type to “String”
Value to the following expression: concat('IssueTracker-', utcNow('yyyy-MM-dd'), '.xlsx')
Note: This expression generates the file name based on static text (“IssueTracker-“) and the current date. Change the static text as you see fit.
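For illustration, the same logic in Python (the `concat` and `utcNow` functions in the expression are Power Automate workflow functions):

```python
from datetime import datetime, timezone

def export_file_name(prefix: str = "IssueTracker-") -> str:
    """Mirrors concat('IssueTracker-', utcNow('yyyy-MM-dd'), '.xlsx')."""
    # utcNow('yyyy-MM-dd') formats the current UTC date, e.g. 2021-11-30.
    return prefix + datetime.now(timezone.utc).strftime("%Y-%m-%d") + ".xlsx"

print(export_file_name())  # e.g. IssueTracker-2021-11-30.xlsx
```

Running the Flow on consecutive days therefore produces one uniquely named file per day.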
Add another Initialize variable action and set:
Name to “varTableName”
Type to “String”
Value to the name of the table in the template file, i.e. “Table1”
Create a SharePoint Copy file action and set:
Current Site Address to our site address noted above
File to Copy, navigate to “/Shared Documents/General/Template.xlsx”
Note: “General” is the name of the Teams channel and can be changed to the folder name that corresponds with your desired Teams channel
Destination Site Address to our site address
Destination Folder to “/Shared Documents/General”
If another file is already there to “Copy with a new name”
Add a SharePoint Send an HTTP request to SharePoint action and set:
Site Address to our site address
Method to “POST”
Uri to "_api/web/lists/GetByTitle('Documents')/items(@{outputs('Copy_file_-_Template.xlsx')?['body/ItemId']})/validateUpdateListItem", where ItemId comes from the Copy file action
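The request body is not shown above; a common pattern with validateUpdateListItem (an assumption on my part, since the original body is not included) is to rename the copied file by updating its FileLeafRef field to the date-named value:

```json
{
  "formValues": [
    { "FieldName": "FileLeafRef", "FieldValue": "@{variables('varFileName')}" }
  ]
}
```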
Add a SharePoint Get items action and set the Site Address, List Name, and a Filter Query. Note: The filter query should match your requirements. In my scenario, I am only exporting active issues.
Create a new Excel Add a row into a table action and set:
Location to our site address
Document Library to Documents
File to the following expression: concat('/General/', variables('varFileName')) Note: The “General” folder is hardcoded in the expression and can be changed as needed
Table to the varTableName variable
Row to:
Pattern: { "Excel column name 1": List column name 1, "Excel column name 2": List column name 2 }
Example: { "Title": @{items('Apply_to_each')?['Title']}, "Issue description": @{items('Apply_to_each')?['Description']}, "Priority": @{items('Apply_to_each')?['Priority/Value']}, "Status": @{items('Apply_to_each')?['Status/Value']}, "Date reported": @{items('Apply_to_each')?['DateReported']}, "Due Date": @{items('Apply_to_each')?['Duedate']}, "Person or group the issue is assigned to": @{items('Apply_to_each')?['Assignedto/DisplayName']}, "Issue source": @{items('Apply_to_each')?['IssueSource']}, "Issue logged by": @{items('Apply_to_each')?['Issueloggedby/DisplayName']} }
Note: Column names should be updated to match your Excel table schema
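To illustrate the mapping (field names here come from my example and will differ in yours): choice columns expose their text through /Value and person columns through /DisplayName, which the sketch below imitates with nested dictionaries:

```python
def map_list_item_to_excel_row(item: dict) -> dict:
    """Sketch of how one SharePoint list item becomes one Excel table row."""
    return {
        "Title": item.get("Title"),
        # Choice columns return an object; the display text lives under "Value".
        "Priority": (item.get("Priority") or {}).get("Value"),
        "Status": (item.get("Status") or {}).get("Value"),
        # Person columns return an object; the name lives under "DisplayName".
        "Person or group the issue is assigned to":
            (item.get("Assignedto") or {}).get("DisplayName"),
    }

sample = {
    "Title": "Cannot join meeting",
    "Priority": {"Value": "High"},
    "Status": {"Value": "In progress"},
    "Assignedto": {"DisplayName": "Norm Young"},
}
print(map_list_item_to_excel_row(sample))
```

Missing columns simply map to empty cells, which keeps the Flow from failing on sparse list items.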
My completed Flow looks like the image below.
Save and run your Flow. If all goes well, your list rows will be added to a new Excel file with the current date in the filename.
Manually exporting list data is easy enough but automating the process can take a bit more work than expected. This Flow pattern can be extended to send the Excel file to users or for further integration with other apps and services.
Thanks for reading.
NY