I coded this for my personal use, but thought someone else might be interested too – https://github.com/ninethsense/WordPronounce
Power Automate: Send daily bugs report in Microsoft Teams from Azure DevOps Boards
You can easily adapt the same flow to send an email notification instead of Teams.
Sample output on Teams:
Development Steps:
Step 01 – Schedule
I would start with a Recurrence trigger (schedule). Always check the time zone – by default, the trigger runs on GMT/UTC.
Step 02 – Initialize Azure DevOps URL (optional)
I am using this URL for hyperlinking in the output; this step is optional.
Step 03 – Get bug list from DevOps boards
I suggest creating a ‘query’ first which returns filtered ‘bug’ results. Below is the query I have created and saved for this example blog.
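For reference, such a bug query can also be expressed in WIQL (the Azure DevOps work item query language). The column list and filter below are just an assumption for illustration – adjust them to match your own board:

```sql
SELECT [System.Id], [System.Title], [System.State], [System.AssignedTo]
FROM WorkItems
WHERE [System.WorkItemType] = 'Bug'
  AND [System.State] <> 'Closed'
ORDER BY [System.CreatedDate] DESC
```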
Next, get the results from this query:
Step 04 – HTML’ize
I wanted to format the bug list content as a table, and MS Teams messages support HTML. So I will create a table header and assign it to a new variable.
Step 05 – Convert rows to HTML rows
The next step converts each row from the DevOps boards query into a corresponding <TR>. You can omit or add columns as per your requirements.
Step 06 – Close HTML table
Once the rows are added to the HTML table, I close it and also add any footer notes.
Step 07 – Post message in Teams
Finally, send the message to Teams. I have used a group chat in my case. Additionally, I use a ‘Condition’ component to check whether the list is empty, so the flow can ‘terminate’ directly instead of posting an empty table in Teams.
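Steps 04–07 above can be sketched in Python to show the logic the flow implements. The field names (`id`, `title`, `state`), table columns, and URL pattern are illustrative assumptions, not actual Power Automate syntax:

```python
# Sketch of the flow's HTML-building logic (field names and columns are assumptions).
def build_bug_table(bugs, devops_url):
    """Return the HTML table for the Teams message, or None when there are no bugs."""
    if not bugs:
        # Mirrors the flow's Condition: terminate instead of posting an empty table.
        return None
    html = "<table><tr><th>ID</th><th>Title</th><th>State</th></tr>"  # Step 04: header
    for bug in bugs:  # Step 05: one <TR> per query result
        html += (
            f"<tr><td><a href='{devops_url}/_workitems/edit/{bug['id']}'>{bug['id']}</a></td>"
            f"<td>{bug['title']}</td><td>{bug['state']}</td></tr>"
        )
    html += "</table><p>This is an auto-generated daily report.</p>"  # Step 06: footer
    return html
```

The same pattern applies to the SharePoint variant of this flow below; only the data source changes.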
Flow view:
This is what my simple Power Automate flow looks like:
Power Automate: Send daily status post in Microsoft Teams from a SharePoint List
Assume you have a list in SharePoint and you want to post a summary, or the list as-is, to a chat group in Microsoft Teams. You can easily adapt the same flow to send an email notification instead of Teams.
Output in Teams:
Development Steps:
Step 01 – Schedule
I would start with a Recurrence trigger (schedule). Always check the time zone – by default, the trigger runs on GMT/UTC.
Step 02 – Get list items from SharePoint
Choose the site address and the list name.
Step 03 – HTML’ize
I wanted to format the SharePoint list content as a table, and MS Teams messages support HTML. So I will create a table header and assign it to a new variable.
Step 04 – Convert SharePoint rows to HTML rows
The next step converts each row from SharePoint into a corresponding <TR>. You can omit or add columns as per your requirements.
Step 05 – Close HTML table
Once the rows are added to the HTML table, I close it and also add any footer notes.
Step 06 – Post message in Teams
Finally, send the message to Teams. I have used a group chat in my case.
Flow view:
This is what my simple Power Automate flow looks like:
Introduction to Azure Monitor – Slides deck
Find the presentation deck I used at the Azure Developer Community event.
Recording–Azure Monitor session
Find the recording of my Azure Monitor introduction session, which I did for the Azure Developer Community event.
Web 3.0 is here…
While reading about Web3, it initially reminded me, for no particular reason, of the decentralized internet concept discussed in the famous Silicon Valley television series. Well, I thought I would write about my learnings from reading on this buzz topic. My love for the internet started with the sound of dial-up, so I was lucky enough to experience different browsers, sockets, chat apps, HTML, compatibilities and incompatibilities, and various phases of web standards. Initially the web was purely technology focused, but it has turned human focused now, thanks to evolving Customer Experience (CX), the priorities of design thinkers, and the involuntary digital revolution.
Web3 is based on Blockchain
Specifically, on decentralization – that is the major upgrade we get from Web 2.0. Data volume, importance, and complexity have grown to uncontrollable levels, and it has become the need of the hour for enterprises to keep a log book of what is happening with their data. Systems today are forced to check whether you are a human being and not a bot before allowing you to do any transaction: CAPTCHA, Multi-Factor Authentication (MFA), Single Sign-On, face recognition, and what not. Companies are investing hugely to protect their data by securing their systems, yet we have been limited to traditional authentication and role-based authorization as the enterprise standard. For many years, sensitive industries such as banking were reluctant to use cloud systems because of trust issues, and cloud vendors such as Microsoft Azure and Amazon Web Services had a hard time selling their products to enterprises. The consumer sector is still fearful of Alexa listening to their conversations, and industries are concerned about data leakage from their IoT devices. With Web 3.0, the proven architecture of crypto designs such as blockchain is going to significantly change this outlook, and on-premises systems will soon move completely to a decentralized mode.
It’s intelligent
Web 3.0 is expected to be more intelligent than the previous generations, because by now we have figured out what the internet can do and what we want from it. We have advanced much (though we are still immature in many areas) in data science, analytics, and predictions, so it is time for the systems we build to have these learnings available ‘by design’.
Is it just a hype?
No, it cannot be. The concepts do not propose anything unrealistic; rather, they vouch for the need of the hour.
More reading
Visit https://web3.foundation/ to read more. The technology stack page is very interesting.
My session on Azure Monitor
I will be talking about Azure Monitor at Azure Developer Community, Kochi event on Saturday, 18 Dec 2021. Join me if you are interested.
Register: https://www.meetup.com/azure-developer-community-kochi/events/282318713/
Azure Purview–notes
Reading notes
Where exactly in your organization is the data you are searching for? It is a usual fiasco in any big enterprise to go hunting for information when an employee resigns, because the ‘catalog’ knowledge resides with certain people in the organization, which creates a dependency.
Find my reading notes on Azure Purview:
Azure Purview provides:
Unified data governance service
Manage and govern
– on-premises,
– multi-cloud, and
– SaaS data
Create a
– holistic,
– up-to-date map of your data landscape
with
– automated data discovery,
– sensitive data classification,
– and end-to-end data lineage.
Unified Map
– Automate and Manage metadata from hybrid sources
– Classify data using built-in and custom classifiers
– Label sensitive data
– Integrate all your data systems using Apache Atlas API
Catalogue Insights:
– Asset Insights
– Glossary Insights
– Scan Insights
– Classification Insights
– Sensitive Label Insights
Docs
https://docs.microsoft.com/en-gb/azure/purview/overview
Supported data sources
https://docs.microsoft.com/en-us/azure/purview/purview-connector-overview
Pricing
https://azure.microsoft.com/en-in/pricing/details/azure-purview/
Questions for planning:
Scenarios:
Persona – Who are the users?
Source system – What are the data sources such as Azure Data Lake Storage Gen2 or Azure SQL Database?
Impact Area – What is the category of this scenario?
Detail scenarios – How do the users use Purview to solve problems?
Expected outcome – What is the success criteria?
Deployment:
What are the main organization data sources and data systems?
For data sources that are not supported yet by Purview, what are my options?
How many Purview instances do we need?
Who are the users?
Who can scan new data sources?
Who can modify content inside of Purview?
What process can I use to improve the data quality in Purview?
How to bootstrap the platform with existing critical assets, glossary terms, and contacts?
How to integrate with existing systems?
How to gather feedback and build a sustainable process?
Ref: https://docs.microsoft.com/en-gb/azure/purview/deployment-best-practices
TOGAF / Enterprise Architecture Study Group–WhatsApp
The group is active as of today, and is expected to stay active for the next 6 months. Please excuse me if the joining link is no longer working.
Is the choice of programming language important for Microservices?
A definite YES or NO answer is irrelevant here, because a microservice can be written in any programming language that supports some communication endpoint, such as an API.
Microservices are an application architecture style, and the programming language used is just one element of it. Let us revisit Martin Fowler’s definitions from 2014:
…a particular way of designing software applications as suites of independently deployable services
…developing a single application as a suite of small services, each running in its own process
…which may be written in different programming languages and use different data storage technologies.
Not all microservices need to follow all the characteristics defined by all the experts in the field, but at a minimum we have to make sure they do not violate any of them.
There are different deployment options available, such as Kubernetes or Azure Container Service. They use a remove-the-bad, spin-up-new approach to containers. To support this process, it is important to make sure our microservice applications are independent and stateless, so that an instance can be replaced without causing crashes or malfunctions.
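The stateless idea can be shown with a minimal sketch (in Python, with illustrative names): every request carries or references all the state it needs, so any container instance can serve it and be replaced at will.

```python
# Minimal sketch of a stateless handler: no module-level or instance state is
# read or written, so the orchestrator can kill this container and spin up a
# fresh one without losing anything. Durable state lives in an external store.
def handle_request(request: dict, store: dict) -> dict:
    key = request["key"]
    value = store.get(key, "not found")  # state comes from outside, not process memory
    return {"status": 200, "body": value}
```

Here `store` stands in for an external database or cache shared by all instances; keeping it outside the process is what makes the remove-and-respawn container model safe.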
Coming back to our question of programming languages, while we can use any languages or technology stack for writing Microservices, it is always good to keep these points in mind while selection:
- Unless necessary, try to avoid languages that require a virtual machine to run, such as the JVM for Java. Memory-resident platforms need boot-up and shutdown time, which can delay instance creation and destruction. I found Go handles this better.
- The smaller the better: container size matters. Some languages require a large runtime library to accompany the application for it to work properly, which makes the container image bigger.
- Application performance – some programming languages are faster at certain operations than others.
Well, it all boils down to the functional and technical requirements, and I believe the selection of programming language is not crucial in most cases. A quick Google search also suggests that people rely on (and are biased toward) their own skill set when building microservices, rather than treating the choice of language as a problem in front of them, which makes this question a low-priority concern.
Read about Microservices here – https://martinfowler.com/articles/microservices.html