
Create an easy-to-use Excel (CSV) file from a PowerShell export

Problem

I get rusty between coding projects, and I forget how to do what seem like basic things, such as building a table and exporting it to CSV for use in Excel or in another downstream process (automation, data integration, etc.).

PowerShell covered

  • Variables
  • A simple process for collecting an array of information
  • Foreach loop
  • @([pscustomobject]@{})
  • +=
  • Export-Csv

Logical Process

To build a file I can use in Excel, these are the steps involved:

  1. Create an empty variable to store the data for later export to the CSV/Table
  2. Run a process to retrieve the raw data that will be shaped into the CSV/Table
    1. for me this is usually Get-ADUser, Get-ADGroupMember, a SQL query, or a request to a SharePoint API for list data
    2. for this demonstration we use Get-Process because everyone uses it for demos 😋
  3. Use a foreach loop to process the raw data, gleaning only the data I need for the table
  4. Use += to append each looped occurrence to the CSV/Table
  5. @([pscustomobject]@{}) - this is the thing I forget the most. It is how to create a line item in the new table with the headers I want, putting the data in the right columns
  6. Export-Csv -NoTypeInformation is the final step to move the variable data to a file on the computer that can be opened in Excel

Code

# Initialize the table as an empty array so rows can be appended with +=
$data = @()

# Raw data to transform; any collection (Get-ADUser, a SQL query, etc.) works the same way
$processes = Get-Process

foreach ($eachProcess in $processes) {
    # Build one row per process with only the columns needed for the table
    $data += @(
        [pscustomobject]@{
            Name        = $eachProcess.Name
            CPU         = $eachProcess.CPU
            Id          = $eachProcess.Id
            ThreadCount = $eachProcess.Threads.Count
        }
    )
}

# Write the table to a CSV file that opens cleanly in Excel
$data | Export-Csv C:\users\public\Downloads\TableExample.csv -NoTypeInformation
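
To sanity-check the file without opening Excel, the CSV can be read back with Import-Csv (a quick optional check, not part of the original flow):

# Read the exported file back and show the first few rows
Import-Csv C:\users\public\Downloads\TableExample.csv | Select-Object -First 3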

Explanation

In the code, the $data variable is first created and initialized as an empty array (clearing out anything it might already hold), so rows can be appended to it later.

The output of Get-Process is stored in a variable called $processes. The $processes variable could be exported right away, but the purpose of this code is to evaluate and transform the data into a custom output. This is often done for data integrations, data transfers, and downstream processes that only require just-in-time data.
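
For comparison, the raw $processes data could be dumped straight to a file with no shaping at all (a minimal sketch; the RawProcesses.csv name is just an example, and the file will contain every property Get-Process returns):

# Unshaped export: every property of every process lands in the file
$processes | Export-Csv C:\users\public\Downloads\RawProcesses.csv -NoTypeInformation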

The foreach loop walks through each line of the $processes variable and collects only four items from each one. On every pass, $eachProcess holds the current line; when the loop reaches its closing bracket, $eachProcess is overwritten with the next line from $processes.
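
A stripped-down sketch of that loop-variable behavior, using a throwaway list of numbers instead of process data:

foreach ($item in 1..3) {
    # $item is overwritten on each pass: 1, then 2, then 3
    "Current value: $item"
}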

The '+=' operator tells PowerShell to append a new line (row) to the $data variable.
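
A minimal sketch of += on its own, using throwaway names ($rows, Name, Value) unrelated to the process example:

$rows = @()
$rows += [pscustomobject]@{ Name = 'first'; Value = 1 }
$rows += [pscustomobject]@{ Name = 'second'; Value = 2 }
$rows.Count   # 2 - each += appended one row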

For bullet 5, notice how the column heading precedes the equals sign and the object property of the $eachProcess variable is mapped to that heading. (If you want to know all the object properties of your $processes variable, you can use the Get-Member cmdlet: $processes | Get-Member. It can be very helpful for figuring out what you are looking for.)
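
As a quick sketch, Get-Member lists the properties available to map into the table, and Select-Object can preview the chosen columns before committing to them (the five-row preview is just an illustration):

# List the properties available on a process object
$processes | Get-Member -MemberType Property

# Preview a few rows with the columns chosen for the table
$processes | Select-Object -Property Name, CPU, Id -First 5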

Finally, $data can be exported to whatever location you need. The path above is just an example; chances are the data actually needs to land on a server or be emailed to a client.
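
Because the destination is just a path, pointing the export at a server share works the same way (the share below is hypothetical; substitute your own server and folder):

$destination = '\\fileserver\reports\TableExample.csv'
$data | Export-Csv $destination -NoTypeInformation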
