My revised flow worked with a clickable link to the full article, but the published date was in Zulu/UTC. While that’s l33t as eff, I’m thinking there’s got to be an easy way to convert DateTime values between time zones. Enter the workflow dynamic content and the convertFromUtc function.
The dynamic content tool lets you build an expression with functions that operate on a combination of values from the object (e.g., the RSS Feed Item) and static values.
convertFromUtc(triggerOutputs()?['body/publishDate'],'Eastern Standard Time','f')
Once I had my function ready, I clicked the blue OK button, and it was inserted into my message body like so.
And I couldn’t leave well enough alone, so I replaced the nonsense with actual separate paragraphs. But the workflow took those out and put the line breaks back in. You win some; you lose some.
Here’s what it looks like now:
Room for improvement?
It doesn’t indicate the name of the blog anywhere, and although the blog name is part of the RSS feed, it isn’t a feed item property that I can select. Since I’m configuring this flow on a per-feed basis, I could just hardcode the source, but that won’t get updated if the name changes.
Another short post about getting RSS updates into a Microsoft Teams channel. I got a basic Flow working in my previous post, and I wanted to make a couple of changes to the message format. To make the link to the full article clickable, I had to click the “>” button on the action toolbar to show the HTML code of the message body, which let me add the dynamic content field Primary feed link as the HREF property of the link:
Once I’d added the dynamic content to the HTML, it didn’t seem possible to get back to the GUI editor.
I’m hoping this post appears as intended when the next polling cycle completes.
I’m attempting to use Microsoft Flow (now Power Automate) to update a channel in Microsoft Teams when an RSS feed is updated. Flow has a simple template just for this purpose, Post a message on Teams when a RSS feed is published.
I have used it and pointed it at this blog’s feed. We’ll see if this post appears in the target channel.
I haven’t played with any fancy formatting. We’ll start with the content and see where we end up.
And, it worked! Mostly. It turns out that the RSS connector polling interval is 30 minutes, and it didn’t appear to do an initial poll until 30 minutes after I created the flow. Once I found that bit of info, I waited and sure enough:
This is functional. The feed item link isn’t a link, though, which seems like a pretty helpful change to make. And I’d probably like to see when the item was actually posted. We’ll see if I can make those changes.
Additionally, I see that the item was posted by me, rather than by Microsoft Flow or the name of the feed. I don’t know if there’s a way to change that. More reading to do.
I have a Windows file server with thousands of shares. Occasionally, I create hidden shares for data migration or other administrative tasks. How do you find these shares?
Some websites suggest running Get-WmiObject -Class Win32_Share and piping the output to Where-Object to filter. That works, but the computer sends you every share object and the filtering happens on your end. If you’re querying a remote computer, that’s highly inefficient.
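For comparison, that client-side filtering approach looks something like this (a sketch; ServerName is a placeholder for your file server):

```powershell
# Client-side filtering: the remote computer serializes and sends back
# EVERY share object; Where-Object then discards most of them locally.
Get-WmiObject -Class Win32_Share -ComputerName ServerName |
    Where-Object { $_.Type -eq 0 -and $_.Name -like '*$' }
```

The Type and Name tests here mirror the WQL filter shown below; the difference is only where the filtering happens.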
Instead, we can specify a filter in the initial Get- cmdlet. I’m also going to switch to the Get-CimInstance cmdlet, which is optimized for remote execution.
PS Z:\> Get-CimInstance -ComputerName ServerName -ClassName Win32_Share -Filter 'Type = "0" AND Name LIKE "%$"'
The Filter parameter uses a WQL query to specify that I want regular shares (not administrative shares like C$ or IPC$; see the Win32_Share class doc for details) whose names end with a dollar sign. It may not return results much faster, but it sends much less data over the wire, which is especially important in remote scenarios.
This post and this Twitter thread describe a mechanism to prevent the latest ransomware cyberattack from running. It involves creating one (or three) files with specific names and with the Read-only attribute set. Although the instructions in the first post describe copying and renaming notepad.exe, any file, even an empty one, with the correct name and the Read-only attribute will suffice, if I read the Twitter thread correctly.
There are numerous ways to accomplish this in a large organization, including an SCCM package that either deploys some files, or that runs a script to create the files. However, I decided to use Group Policy File Preferences to copy a small text file to the three filenames described, including setting the Read-only attribute.
Using Group Policy File Preferences to create the files that will block the Petya (NotPetya) Ransomware.
This should be applied to the affected computers at their next Group Policy refresh, which will likely happen sooner than the reboot a start-up script would require.
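If you’d rather script the file creation than use Group Policy, a minimal PowerShell sketch might look like the following. The file names (perfc, perfc.dat, and perfc.dll under C:\Windows) are the ones widely circulated at the time; verify them against the linked post before deploying. The directory is parameterized here for illustration.

```powershell
# Create empty "vaccine" files and set the Read-only attribute on each.
function New-VaccineFile {
    param([string]$Directory = $env:windir)  # defaults to C:\Windows
    foreach ($name in 'perfc', 'perfc.dat', 'perfc.dll') {
        $path = Join-Path $Directory $name
        if (-not (Test-Path $path)) {
            # An empty file is sufficient; no need to copy notepad.exe.
            New-Item -Path $path -ItemType File -Force | Out-Null
        }
        (Get-Item $path).IsReadOnly = $true  # sets the Read-only attribute
    }
}
```

Run it as `New-VaccineFile` on each target (with rights to write to the Windows directory).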
I struggled with getting a new Server 2016 Remote Desktop Gateway Service running. I followed the official documentation from Microsoft, configuring two servers as a farm, and creating a single CAP and RAP identically on each server. But every time I tried to connect, I received an error message from the client that my account:
I love those error messages that say “Contact your network administrator for assistance.”
I found a corresponding entry in the Microsoft-Windows-TerminalServices-Gateway/Operational log with the following text:
The user “CAMPUS\[username]”, on client computer “132.198.xxx.yyy”, did not meet connection authorization policy requirements and was therefore not authorized to access the RD Gateway server. The authentication method used was: “NTLM” and connection protocol used: “HTTP”. The following error occurred: “23003”.
I double-checked the groups I had added to the CAP and verified the account I was using should be authorized. I even removed everything and inserted “Domain Users”, which still failed.
I also found entries corresponding to each failure in the System log, from the Network Policy Server (NPS), with Event ID 4402 claiming:
“There is no domain controller available for domain CAMPUS.”
I know the server has a valid connection to a domain controller (it logged me into the admin console). But I double-checked using NLTEST /SC_QUERY:CAMPUS. Yup; all good.
A few more Bingoogle searches later, I found a forum post about this NPS failure. The marked solution just points to a description of the Event ID, but one of the comments contains the actual fix: the Network Policy Server on the gateway systems needs to be registered. This step is not part of the official documentation, though on re-reading that doc, I now see that someone has mentioned it in the comments.
In this case, registration simply means adding the computer objects to the RAS and IAS Servers AD group (requires Domain Admin privs). Once I made this change, I was able to successfully connect to a server using the new remote desktop gateway service.
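For reference, the group membership change can also be scripted with the RSAT ActiveDirectory module; the gateway server names below are placeholders for your own hosts:

```powershell
# Requires the RSAT ActiveDirectory module and Domain Admin privileges.
Import-Module ActiveDirectory

# Add the gateway computer accounts to the registration group.
# 'RDGW01' and 'RDGW02' are placeholder computer names.
Add-ADGroupMember -Identity 'RAS and IAS Servers' `
    -Members (Get-ADComputer 'RDGW01'), (Get-ADComputer 'RDGW02')
```

Alternatively, the NPS console’s “Register server in Active Directory” action performs the same registration from each gateway server itself.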
Many thanks to TechNet forum user Herman Bonnie for posting the very helpful comment.
You may have noticed that Microsoft OneNote displays a little warning for notebooks stored in your Documents folder.
OneNote notebook warning “may not sync correctly.”
This is because Windows computers that are part of UVM’s Active Directory domain use a feature called Offline Files to make your Documents folder available to you when you’re not on the campus network. (see my Offline Files post for more info.)
The warning shows up because OneNote has its own file sync process, and layering another sync process under it can, in theory, interfere with OneNote’s syncing. In my many years of using OneNote, I’ve seen only one (maybe two) situations where this may have caused problems. That said, ignoring warnings is generally a bad idea; it makes it easier to miss an issue that really does need attention.
But there is another way: SharePoint.