The PnP team recently announced they were making changes to how PnP.PowerShell and the M365 CLI authenticate. The short story is that if you want to use the PnP.PowerShell module or the M365 CLI after September 9th, 2024, you'll need to create your own Application Registration, also known as an App Reg. I know, I know, that sounds confusing and scary. Fortunately there's nothing to it. I've got all the steps for you to follow along. You'll be ready to go in no time.

If you're reading this, I assume you already have the PnP.PowerShell module installed. Good, because you'll need that. The steps I'm going to show you will probably work with any version 2.0.0 or later, but I recommend updating to at least 2.9.0, which is the latest version as of this blog post. It's got some extra sauce in it to make this go more smoothly.

After you've got the PnP.PowerShell module installed, open up PowerShell. This module has supported using custom App Regs for a while, so all of the plumbing is already there. We need to run the Register-PnPAzureADApp cmdlet (also aliased as Register-PnPEntraIDApp; they're the same thing) to create our own App Reg. Example #1 from the help is what I use. Here's what it looks like:

Register-PnPAzureADApp -ApplicationName PnP.PowerShell -Tenant 1kgvf.onmicrosoft.com -Store CurrentUser -Interactive

The name of the App Reg we're creating is "PnP.PowerShell." My tenant is 1kgvf, but of course you'll use your own. It's going to create a certificate for us (we won't need it) and store it in the CurrentUser certificate store. And finally, since I'm using MFA, like every good user does, I use the -Interactive switch to do browser-based authentication. You'll have to log in as a Global Admin, or as a user a Global Admin has given permission to create App Regs.

After I log in, Azure gets to work doing what it does. Since I didn't pass any scopes to Register-PnPAzureADApp, it uses its default set. You'll get prompted to authenticate a second time and then asked to consent to them. We'll talk more about that in a bit. Once it's done, you'll get a confirmation screen. That's where you'll get the Client ID (also called AppID and AzureAppID, it's all the same) you'll need when you connect. In my example that's 001ed5a0-be10-4bc3-a40c-a1cad0c987b7. You can also get that ID by going into the Azure Portal and looking at Enterprise applications. Find your App Reg, click it, and you can copy the ID to your clipboard.

Now that you have your Client ID, let's use it to connect to Microsoft 365. In my case I would use this connect statement:

Connect-PnPOnline -Url https://1kgvf.sharepoint.com/ -Interactive -ClientId 001ed5a0-be10-4bc3-a40c-a1cad0c987b7

Of course you'll use your own tenant name and the Client ID that you created. The one I created isn't visible to your tenant. You'll get prompted for a username and password, and hopefully some MFA. Then you'll be connected to M365. As you can see, I'm connected and a quick function check looks like everything is working fine. Success!!!

Well, sort of… We talked before about scopes, and how the Register-PnPAzureADApp cmdlet used its default scopes since I didn't specify any. Those scopes cover SharePoint and Users, but not much else. For instance, if I try to get a list of the Teams in my tenant, I'm met with an authentication screen. After I authenticate (with a Global Admin) I get a page wanting more permission: now it wants access to Read all groups, and more User permissions. I clicked Consent and then Accept, and it returned my Teams to me.
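If you'd rather grant those extra permissions up front instead of consenting to them piecemeal, Register-PnPAzureADApp lets you spell out the scopes when you create the App Reg. Here's a minimal sketch of that. The parameter names and scope values below are from my reading of the cmdlet's help, not something I showed above, so check Get-Help Register-PnPAzureADApp -Detailed in your version before leaning on them.

# Sketch only: register the App Reg with explicit delegate scopes instead of the defaults.
# The scope names here (AllSites.FullControl, Group.Read.All, User.Read.All) are examples,
# not a definitive list of what your scripts will need.
Register-PnPAzureADApp -ApplicationName PnP.PowerShell -Tenant 1kgvf.onmicrosoft.com -Store CurrentUser -Interactive `
    -SharePointDelegatePermissions AllSites.FullControl `
    -GraphDelegatePermissions Group.Read.All, User.Read.All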
Either way, that consent dance adds some complication. If the user you normally use PnP.PowerShell with is a GA, then you're golden. Every time you stumble onto something you can't do, it'll prompt you and you'll consent. If you have a separate GA user, or someone else is doing the GA stuff for you, you'll have to go into the Azure Portal and add the additional scopes there. That sounds like something that would fit really well in another blog post.

There's another gotcha. By default, when Register-PnPAzureADApp creates the App Reg, it only gives the user that created it permission to use it. If anyone else needs to use it, you'll need to go into the Azure Portal, open the App Reg, go to the Users and Groups blade, and add the additional users. That won't give them any additional permissions anywhere in M365; it simply gives them permission to use this App Reg when using PnP.PowerShell.

I mentioned at the top that version 2.9.0 had some extra sauce to help with this. It can get a bit tedious adding the -ClientId SomeUglyGUID part every time you connect. This is particularly painful if you've got scripts and the like with connect statements that don't have the -ClientId parameter. In 2.9.0 and later, Connect-PnPOnline supports an environment variable, ENTRAID_APP_ID. If no -ClientId is specified and that variable has a value, Connect-PnPOnline will use it. That variable value will go away when you close PowerShell. If you put the $env:ENTRAID_APP_ID = "001ed5a0-be10-4bc3-a40c-a1cad0c987b7" line in your PowerShell profile, it will get populated every time you open PowerShell. You can also set a persistent environment variable for your user account with this line:

[System.Environment]::SetEnvironmentVariable('ENTRAID_APP_ID', '001ed5a0-be10-4bc3-a40c-a1cad0c987b7', 'User')

Or you can set it in the Control Panel. Again, you'll want to use your Client ID, not mine. I hope that helps. If you have any questions, leave me a comment.

tk
ShortURL: https://www.toddklindt.com/PoshMakeAppReg
I know Mondays can be rough. Fortunately I have something to help you through this particular Monday. I was recently interviewed on the "Mastering Mondays" podcast from the No More Bad Mondays guys. I had a blast catching up with Dave and Matt. We talk about our favorite tech and I recount an experience in my childhood that nearly scarred me for life. The time flew by. Give the episode a listen on your favorite platform and let me know what you think.
tk
ShortURL: https://www.toddklindt.com/HappyMonday
My buddies at Syskit and I were chatting recently and we realized we hadn't worked together in far, far too long. We decided that needed to be remedied right away! So last week I wrote a blog post for them called Using AI to write PowerShell scripts. It's even more fun than it sounds, I promise. In it I cover my process for using AI to write even cooler PowerShell scripts than I would be able to on my own. You can even use the free version of ChatGPT to write better PowerShell. Give it a read and let us know what you think.

tk
ShortURL: https://www.toddklindt.com/SyskitAIandPowerShell
During the last Ask Sympraxis call our friend Kasper Larsen relayed a question he had gotten recently: "Is it possible to run the PnP.PowerShell module if I'm not allowed to install it or PowerShell 7?" The question brought tears of sadness to my eyes. Then my indomitable spirit kicked in. "We'll help this person!", it said. And here we are.

The short answer to "Can you run PnP.PowerShell if you can't install anything?" is a resounding, "Yes! Heck yes you can!" The answer to "how" comes in two parts. The first is to download the PowerShell 7 zip file and run pwsh.exe out of there without installing it. The second part is to install the PnP.PowerShell module in the CurrentUser scope, so that it doesn't try to write anything to a protected directory. After that, run PnP.PowerShell cmdlets to your heart's content.

There's proof, the PowerShell way, that the user I'm logged in as isn't an admin. First, I download the PowerShell 7 zip file and extract it to a folder in my Downloads folder. Then I cd to that directory and run pwsh.exe. You can see from $PSVersionTable that we're running PowerShell 7. Now I install the PnP.PowerShell module for just my user with the line Install-Module PnP.PowerShell -Scope CurrentUser. After I run the install I use Connect-PnPOnline like I normally would. The module gets installed to your personal Documents directory, not to a protected one. If you want the whole dance in one place, there's a rough sketch at the end of this post.

One very important note is that you (or anyone) won't be able to connect if the PnP.PowerShell application registration hasn't been approved in your tenant. This blog post, "How to Register the PnP.PowerShell App Registration if You're not a Tenant Admin", covers it a bit. That App Registration is necessary in 99% of the use cases. You can connect and do a few SharePoint things without it, but that list is pretty short.

I'm not sure how often this will come up, but hopefully this blog post is at least interesting.

tk
ShortURL: https://www.toddklindt.com/PoshWithoutAdmin
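Here's the whole "no install" approach as one copy-and-paste-friendly sketch. The download URL and version number are just what was current when I put this together; grab the latest x64 zip link from the PowerShell releases page on GitHub, and swap in your own tenant URL.

# Download and extract PowerShell 7 without installing it (no admin rights needed).
$zip = "$env:USERPROFILE\Downloads\pwsh7.zip"
Invoke-WebRequest -Uri "https://github.com/PowerShell/PowerShell/releases/download/v7.4.1/PowerShell-7.4.1-win-x64.zip" -OutFile $zip
Expand-Archive -Path $zip -DestinationPath "$env:USERPROFILE\Downloads\pwsh7"
Set-Location "$env:USERPROFILE\Downloads\pwsh7"
.\pwsh.exe

# Then, inside that new PowerShell 7 session:
Install-Module PnP.PowerShell -Scope CurrentUser
Connect-PnPOnline -Url https://yourtenant.sharepoint.com -Interactive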
If you've been around the block with SharePoint or Microsoft 365 administration, you know that handling client credentials can sometimes feel like juggling with fire. When I start a new engagement with a client I generally get credentials to access their tenant. Of course they immediately go into our password management tool. I also do a lot of PowerShell scripting for my clients, so I save them to the Windows Credential Store too, so that I can connect with Connect-PnPOnline without having to enter them each time. And while that's not a lot of work, I thought I could streamline it. That's why I've put together a PowerShell command, Add-ClientCredential, that makes it a little easier.

What Does Add-ClientCredential Do?

In a nutshell, this PowerShell command is designed to streamline the process of adding client credentials in a SharePoint or Microsoft 365 environment. It stores your credential for https://tenant.sharepoint.com and https://tenant-admin.sharepoint.com. If you don't specify any credentials when you connect with Connect-PnPOnline, it will look for them in the Windows Credential Store. If you have one saved for the root of the tenant, https://tenant.sharepoint.com, it will also use that for other sites in the tenant, like https://tenant.sharepoint.com/sites/ToddisTheBest, if a credential is not saved for that specific site. Here's a quick example:

Add-ClientCredential -TenantName "contoso" -UserName admin@contoso.com

I like this method because my password will never show up in History or in a Transcript if one is running. Since we didn't pass it a password, it will prompt you for one. Then it will create credential entries for https://contoso.sharepoint.com, http://contoso.sharepoint.com, and https://contoso-admin.sharepoint.com. To pass it a password, do it like this:

Add-ClientCredential -TenantName "contoso" -UserName admin@contoso.com -Password (ConvertTo-SecureString "YourPassword" -AsPlainText -Force)

Keep in mind that the password will show up in plain text in PowerShell's Get-History, or in the Transcript file if you have that running. I built in some smarts so that if there is already a credential stored for "contoso" it will let you know and ask you if you want to overwrite it. If you want to get super fancy you can add the -TestCredential switch, which will test the credentials you gave it by logging in with them. All of this is shown if you run help Add-ClientCredential -Examples. In the background this function uses Add-PnPStoredCredential to store the credentials for you. It adds them for the root of the tenant and for the -admin URL. There's a little sketch of that at the end of this post.

How to Get Started

To get your hands on this little beauty, head over to my GitHub repository. You can download addclientcreds.psm1 itself, or clone the whole repo. Use Import-Module to import it into your PowerShell host and you're ready to go.

Wrapping Up

addclientcreds.psm1 is my attempt to put a little more simplicity and sanity into the world of SharePoint and Microsoft 365 administration. I hope you find it as useful as I do. As always, I welcome your feedback and questions. Drop a comment below or shoot me a message on Twitter @ToddKlindt.

tk
ShortURL: https://www.toddklindt.com/PoshAddClientCreds
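For the curious, here's roughly what's happening under the hood. This is a sketch of the idea, not the actual function; the real thing, with the overwrite checks and the -TestCredential logic, lives in addclientcreds.psm1.

# Sketch: store a credential for the tenant root and the -admin URL so
# Connect-PnPOnline can find them later without prompting.
$tenant = "contoso"
$user   = "admin@contoso.com"
$pw     = Read-Host "Password for $user" -AsSecureString
Add-PnPStoredCredential -Name "https://$tenant.sharepoint.com" -Username $user -Password $pw
Add-PnPStoredCredential -Name "https://$tenant-admin.sharepoint.com" -Username $user -Password $pw

# With those stored, this picks up the saved credential automatically:
Connect-PnPOnline -Url "https://$tenant.sharepoint.com"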
In my last masterpiece, Tackling SharePoint's 5000 Item Limit with PowerShell and Search, I show how to use PowerShell with Search to return more than 5000 items from a SharePoint list or library, even when SharePoint refuses to return more than 5000 items at a time. Pretty impressive, right? When demonstrating something like that you need a list or library with more than 5000 items in it. There are a lot of scripts out there that can create test data, but I needed something specific. So I opened up a new window in VS Code and got to coding.

The PowerShell function I wrote is Add-AttorneyFiles, which is designed to streamline the creation of attorney files and case folders. Lots and lots of them. This function has a few parameters to tailor the files and folders it creates. It accepts two mandatory parameters, AttorneyCount and CaseCount, specifying the number of attorneys and the number of case folders to create for each attorney. There are also a few switch parameters that let you customize the function's behavior further. You can choose to create a file in each case folder using the CreateStaticFile switch, or specify that only closed or client case folders should be created with the OnlyClosedCases and OnlyClientCases switches. You can also specify the name of the static file with the StaticFileName parameter. If no name is provided, the function defaults to creating a file named "readme.txt".

The function starts by checking whether a connection to a SharePoint site exists. If it does, the function creates attorney folders and case folders in a SharePoint library. The names for these attorney folders are generated randomly from a list of common first and last names. This randomness helps create a more realistic environment for testing. Once the list of attorney names is created, the function loops through each attorney, creating the appropriate case folders, either client case folders, closed case folders, or both, based on the parameters passed in. The function gives the folders and files it creates a randomly generated last and first name, along with a random case number, for a more realistic setup. If the CreateStaticFile switch is present, a static file is created in each case folder. The content of these static files is a random selection of words downloaded from a free online dictionary. That randomization also contributes to a more realistic testing environment. Here's an example of how you might use this function:

Add-AttorneyFiles -AttorneyCount 10 -CaseCount 5 -CreateStaticFile

This will create 10 attorney folders, each with 5 case folders, and a static file named "readme.txt" in each case folder. For my blog post I created a whole lot of autorun.inf files instead. By automating the setup of this testing environment, I was able to generate a high volume of test data in a format that accurately represented my client's data, without exposing any of it. I've uploaded the code to GitHub; check it out. There's also a stripped-down sketch of the general shape at the end of this post. Happy PowerShelling, and as always, feel free to drop any questions or comments below!

tk
ShortURL: https://www.toddklindt.com/PoshAttorneyFiles
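And here's that stripped-down sketch. It is not the actual Add-AttorneyFiles function, just the general shape of it, and it assumes a document library named AttorneyFiles plus a readme.txt sitting in the current directory.

# Sketch: create attorney folders, case folders, and a static file in each case folder.
# The real function generates random attorney names and case numbers; this one doesn't.
$attorneyCount = 10
$caseCount     = 5
foreach ($a in 1..$attorneyCount) {
    $attorney = "Attorney$a"
    Add-PnPFolder -Name $attorney -Folder "AttorneyFiles" | Out-Null
    foreach ($c in 1..$caseCount) {
        $case = "Case-{0:D4}" -f $c
        Add-PnPFolder -Name $case -Folder "AttorneyFiles/$attorney" | Out-Null
        Add-PnPFile -Path .\readme.txt -Folder "AttorneyFiles/$attorney/$case" | Out-Null
    }
}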
In the world of SharePoint, the 5000 item per view limit is a well-known challenge, the stuff of legend. To recap, while a SharePoint list or library can have up to 30 million items in it, SharePoint refuses to show you more than 5000 of them at a time. That's an API-level control to protect the backend, so you can't get around it in a view on a web page, in an API call through PowerShell or the CLI, anywhere. While there are several ways to navigate around this limit, such as using CAML queries, these methods often fall short when dealing with really large datasets. This blog post will explore a unique solution to this problem using PowerShell, specifically focusing on the use of Search to retrieve data.
The Challenge
Recently, I was working with a client, a law firm, that had a whopping 7.8 million items in a SharePoint document library. Not a great Information Architecture, of course, and we were helping them fix that. Among other horrors, over the years attorneys had copied the contents of CDs and DVDs to various places in SharePoint, creating a massive and complex data structure. The challenge was to find all of these so we could migrate them out or delete them. However, due to the sheer volume of data, there was no way to slice it with the normal tools and get back fewer than 5000 results. This is where PowerShell swoops in and saves the day.
The PowerShell Solution
The solution came in two parts. We were looking for the DVDs by looking for the autorun.inf file in the root. I discovered I couldn't use something like Get-PnPListItem to find all of the autorun.inf files because I couldn't find a way, with CAML or anything else, to pare the result set down below 5000. There were just too many files. However, I did discover that I could find them in the Search Center, which gave me the first idea: get them from Search in PowerShell. I used the Submit-PnPSearchQuery cmdlet to send a search query to SharePoint and retrieve the results. It gets them in batches of 500 (the maximum for a single search request), with the option of making multiple requests if necessary to retrieve all results. This worked pretty well, but it was tedious because I could only get 500 at a time and there were thousands. I had to modify the search, run it again, and append those results to the results from the previous searches. That was too much work, and it led to the second part. I wrote a PowerShell function called Submit-PnPSearchQueryAll. This function uses the Submit-PnPSearchQuery cmdlet to send a search query to SharePoint and retrieves all the results, paging through them and running multiple queries as needed. If the -ShowProgress switch is provided, the function will display the total number of results and a progress bar.
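For context, here's what a single Submit-PnPSearchQuery call, the thing my function wraps, looks like. The property names (TotalRows, ResultRows) are how I remember the PnPResultTable object; poke at your result with Get-Member if yours looks different.

# One page of results, the manual way. -MaxResults caps a single request at 500 rows,
# and -StartRow is how you'd page through to the next batch yourself.
$page = Submit-PnPSearchQuery -Query "autorun.inf" -StartRow 0 -MaxResults 500
$page.TotalRows          # total matches Search knows about
$page.ResultRows.Count   # how many came back in this batch (500 max)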
Before we look at the function itself, let’s see how it works. My usage looked like this: $AllResults = Submit-PnPSearchQueryAll -query "autorun.inf"
That stored every file named autorun.inf in the variable $AllResults. There are 5046 of them: $AllResults.Count
Since that’s a collection of objects I can treat them like any old object: $AllResults[1000]
and
$AllResults | Where-Object {$_.ParentLink -like "*AttorneyFiles/Johnson, Michael*" }
or
$AllResults | Where-Object {$_.ParentLink -like "*AttorneyFiles/Johnson, Michael*" } | select Path,ParentLink
and
$AllResults | Where-Object {$_.ParentLink -like "*AttorneyFiles/Johnson, Michael*" } | Export-Csv .\mj.csv
See all the fun you can have? Since the object we’re getting back is a PnPResultTable object, it doesn’t have all the same properties as a PnPListItem. When I wrote the function I had to decide which ones I needed. If you use this, you might need something different.
How It Works
The function begins by initializing variables for the starting row and the page size, which is set to 500. It then enters a loop where it performs the search query with Submit-PnPSearchQuery and retrieves the results. If the -ShowProgress switch is provided, it will display the total number of results on the first run and a progress bar for each subsequent run.
For each result, the function outputs a custom object with the desired properties. It then increments $startRow by $pageSize and continues the loop while $startRow is less than the total number of rows the query reported.
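Boiled down, the function looks something like this. It's a simplified sketch of the shape, not the full PSM1 (which is linked below), and the properties in the custom object are just the ones I happened to need.

function Submit-PnPSearchQueryAll {
    param(
        [Parameter(Mandatory)]
        [string]$Query
    )
    $startRow = 0
    $pageSize = 500
    do {
        # One page at a time, 500 rows per request.
        $results = Submit-PnPSearchQuery -Query $Query -StartRow $startRow -MaxResults $pageSize
        foreach ($row in $results.ResultRows) {
            # Emit just the properties I cared about; add your own as needed.
            [PSCustomObject]@{
                Title      = $row["Title"]
                Path       = $row["Path"]
                ParentLink = $row["ParentLink"]
            }
        }
        $startRow += $pageSize
    } while ($startRow -lt $results.TotalRows)
}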
Before I got this working I tried a couple of other approaches, but this one worked the best.
Grab the PSM1 file with Submit-PnPSearchQueryAll here.
Conclusion
This PowerShell function proved to be an effective solution to the SharePoint 5000 item limit, allowing us to retrieve all items from a massive SharePoint document library. It demonstrates the power and flexibility of PowerShell and SharePoint's Search functionality when dealing with large datasets. Whether you're dealing with millions of items or just want a more efficient way to retrieve data from SharePoint, consider giving this function a try.
tk
ShortURL: https://www.toddklindt.com/Posh5000ItemLimit
I know I have a checkered past with the Developer community. Back in the old days they could do some pretty awful things to my beloved on-prem SharePoint servers with their incessant BINing and GACing of things. Fortunately for all of us (including those defenseless servers) those days are behind us.

So, I just got off the Microsoft 365 & Power Platform Development Community call, and boy, do I have some cool stuff to share with you. I got the chance to show off this neat trick I've been working on: using GitHub Copilot and ChatGPT (or any AI you're into) to give your PowerShell scripting a serious boost. You're probably thinking, "AI and PowerShell, really?" But stick with me here. It's like having a co-pilot for your coding. Like someone smarter than you, looking over your shoulder, and there if you have questions. It's there to help you out, make things smoother, and let's be honest, who doesn't want to feel a bit like Tony Stark talking to JARVIS while coding?

During the demo, I took everyone on a little tour of how you can get GitHub Copilot and ChatGPT into the mix with your PowerShell development routine. It's all about using AI to help with the heavy lifting: coding, debugging, repetitive tasks, and even the dreaded documentation. I'm telling you, this is next-level stuff, and we're just scratching the surface of what AI can do for us.

But hey, don't just take my word for it. Give it a whirl! Play around with adding some AI into your PowerShell development and see how it goes. I used ChatGPT and GitHub Copilot for the demo, but you can pick any AI you're comfortable with. That's it from me for now. Keep an eye out for more cool AI posts.

Microsoft 365 & Power Platform Development Community call
My AI Demo

tk
ShortURL: https://www.toddklindt.com/PoshwithAI
Hello, SharePoint and PowerShell enthusiasts! Todd Klindt here, and I've got something exciting to share with you today. I recently had the opportunity to present a developer-focused demo on the Microsoft 365 & Power Platform Community channel. The topic? Dynamic parameter validation in PowerShell.

In this 13-minute demo, I walk you through the process of creating your own PowerShell cmdlet. This cmdlet lets you pull information from a site, list, text file, Azure, Graph, etc. and tab through it right in the PowerShell environment. I use the power of ValidateSet, ValidateScript, and ArgumentCompleter. To show off, the demo concludes with an interesting twist: I used ChatGPT to write the same code. All the code I used in the demo is available on my GitHub repos; you can find it at PoshPnP and PnP PowerShell. If you want a quick taste of the ArgumentCompleter piece without watching, there's a tiny sketch at the end of this post.

I hope you find this demo useful in your PowerShell journey. Remember, when you're writing PowerShell, try to be the tool maker, not the tool user. Happy coding!

Watch the full video here.

tk
ShortURL: https://www.toddklindt.com/PoshValidation
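And here's that tiny sketch. It's a simplified example of the ArgumentCompleter approach from the demo, not the demo code itself, and it assumes you already have a Connect-PnPOnline session so Get-PnPList has something to return.

# Tab completion for -ListName, populated live from the connected SharePoint site.
function Get-MyListInfo {
    param(
        [ArgumentCompleter({
            param($commandName, $parameterName, $wordToComplete, $commandAst, $fakeBoundParameters)
            (Get-PnPList).Title |
                Where-Object { $_ -like "$wordToComplete*" } |
                ForEach-Object { "'$_'" }   # quote titles so ones with spaces complete cleanly
        })]
        [string]$ListName
    )
    Get-PnPList -Identity $ListName
}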
Hi All, Today I want to share a handy PowerShell function I developed recently to enhance the readability of ShareGate log files. If you use ShareGate for SharePoint migrations or management tasks, you might be familiar with the extensive Excel logs ShareGate produces. While these logs are rich in details, they can sometimes be too rich in details. They often require some formatting for better readability, or to highlight the necessary details. When doing a lot of migrations I found myself doing the same steps over and over again to these logs. Being lazy, I thought, "Someone should automate this!" I sat down with my buddy PowerShell and this is what we came up with.

This function, Format-ShareGateLogFile, tackles this by opening a ShareGate log file in Excel format and applying a few changes. It does this by using Doug Finke's excellent ImportExcel PowerShell module. It adds a table to the first worksheet, formats the first column as "Date-Time", calculates the duration of the log file, and formats the duration as "[h]:mm:ss" in the last row. Finally, it saves the changes and closes the Excel file.

To use this function, you'll have to download Logfiles.psm1 from my GitHub repo. Then use Import-Module to import it into your PowerShell session. After that's done you can run Format-ShareGateLogFile. You will need to provide the path to the Excel file to format as an argument to the Path parameter, which is mandatory. For instance,

PS C:\> Format-ShareGateLogFile -Path "C:\path\to\ShareGateLogFile.xlsx"
This will format the Excel file located at "C:\path\to\ShareGateLogFile.xlsx" for readability. The function also accepts two optional switches: Open and HideColumns. If you use the Open switch, the function will open the formatted Excel file automatically after it has finished formatting. The HideColumns switch will hide specified columns (E-U, W-AR, AT-BA) in the Excel file. For instance, PS C:\> Format-ShareGateLogFile -Path "C:\path\to\ShareGateLogFile.xlsx" -HideColumns -Open
This will format the Excel file, hide the specified columns, and open the Excel file automatically after it has been formatted. One neat feature of the function is that it can accept pipeline input for the `Path` parameter. This means you can pipe in a series of file paths to the function and it will format each file in turn. For instance, PS C:\> Get-ChildItem -Path "C:\path\to\folder" -Filter "*.xlsx" | Format-ShareGateLogFile -HideColumns -Open
This will get all the Excel (.xlsx) files in the specified folder and, for each one, format it for readability, hide the specified columns, and open it automatically once it's been formatted. This function has saved me heaps of time while working with ShareGate log files, and I hope it does the same for you. If you're curious what it's doing under the covers, there's a rough sketch at the end of this post; the real code is in Logfiles.psm1. Happy scripting!

tk
ShortURL: https://www.toddklindt.com/PoshFormatSharegateLogs
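And here's that rough sketch of the kind of thing Format-ShareGateLogFile does with the ImportExcel module. The cmdlet names and parameters below are from memory of how I use ImportExcel, so treat this as illustration only and grab Logfiles.psm1 for the real, tested version.

# Sketch: open the log, turn the data into a table, format column A as Date-Time, save.
$path = "C:\path\to\ShareGateLogFile.xlsx"
$pkg  = Open-ExcelPackage -Path $path
$ws   = $pkg.Workbook.Worksheets[1]
Add-ExcelTable -Range $ws.Cells[$ws.Dimension.Address] -TableName "ShareGateLog"
Set-ExcelRange -Worksheet $ws -Range "A:A" -NumberFormat "Date-Time"
Close-ExcelPackage $pkg        # add -Show to pop it open in Excel, like the -Open switch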