Remote Desktop Session Host servers hanging regularly?

In this post, we will discuss an issue faced by many folks who run a remote desktop farm in their environment. The RDSH (Remote Desktop Session Host) servers may hang quite often, especially during peak business hours. Out of all the servers in the farm, this can happen to any server, in no particular order.

Although there can be many reasons behind it, in some cases the situation becomes more problematic because Event Viewer does not reveal any definitive clue. There may be a few log entries that we can associate with the hung state, but none of them point in the right direction, for example:

1> The winlogon notification subscriber is taking long time to handle the notification event.

2> A timeout (120000 milliseconds) was reached while waiting for a transaction response from the SessionEnv service.

3> The server did not register with DCOM within the required timeout.

Here is a blog post bearing similarity to this case, where the issue is discussed: Remote Desktop Session Host server hangs/locks up (2008 R2 in vSphere 4.1).

Diagnosing the problem

To find the right direction in which to search for a solution, we can run a DCS (Data Collector Set) in Performance Monitor for CPU utilization and memory consumption on any or all of the RDSH servers. The configuration steps are explained in this link: Create a Data Collector Set to Monitor Performance Counters. This way, the next time a server dives into a hung state, we will have a picture of resource utilization from the moments just before.
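Alternatively, the same kind of collector can be sketched from the command line with logman. The counter names below are the standard CPU and memory counters; the collector name, sample interval and output path are example values to adjust:

```powershell
# Sketch: counter-based data collector sampling CPU and available memory
# every 15 seconds, written as CSV to C:\PerfLogs (name/paths are examples)
logman create counter RDSH_Perf -c "\Processor(_Total)\% Processor Time" "\Memory\Available MBytes" -si 15 -f csv -o "C:\PerfLogs\RDSH_Perf"
logman start RDSH_Perf
```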

If the RDSH servers are VMware based, we can analyze the VM-level logs and look for the following:

1> GuestMsg: Too many channels opened

2> GuestRpc: Channel 6, unable to send the reset rpc.

Also, the ESXi-level performance logs can tell us about resource utilization at the time of the VM’s hung state.

Another blog post from the VMware community might shed some light on the topic: VM of Esxi 6 crash after too many TSE connexion

Here we must discuss the TCP Chimney Offload, Receive Side Scaling (RSS), and Network Direct Memory Access (NetDMA) features that are available for the TCP/IP protocol in Windows Server 2008 onwards.

Disabling these three features resolves the issue in many cases: Information about the TCP Chimney Offload, Receive Side Scaling, and Network Direct Memory Access features in Windows Server 2008

1> Receive Side Scaling (RSS).

RSS enables network adapters to distribute the kernel-mode network processing load across multiple processor cores in multi-core computers.

2> Network Direct Memory Access (NetDMA)

NetDMA provides operating system support for direct memory access (DMA) offload. TCP/IP uses NetDMA to relieve the CPU from copying received data into application buffers, reducing CPU load.

3> TCP Chimney Offload

TCP Chimney Offload is a networking technology that helps transfer the workload from the CPU to a network adapter during network data transfer.
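Per the KB article referenced above, disabling all three from an elevated prompt can be sketched as follows. Note that NetDMA is controlled through the EnableTCPA registry value, so that change needs a reboot to take effect:

```powershell
# Disable TCP Chimney Offload and RSS globally
netsh int tcp set global chimney=disabled
netsh int tcp set global rss=disabled

# Disable NetDMA (EnableTCPA = 0); takes effect after a restart
Set-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters" `
    -Name EnableTCPA -Value 0 -Type DWord

# Verify the global TCP settings
netsh int tcp show global
```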

If this has not resolved the issue either, we can proceed further with another solution.

Windows System Resource Manager (WSRM)

Windows System Resource Manager (WSRM) can be used to allocate processor and memory resources to applications, users, Remote Desktop Services sessions, and Internet Information Services (IIS) application pools.

With Windows System Resource Manager for the Windows Server® 2012 operating system, you can manage server processor and memory usage with standard or custom resource policies.

Equal per session

Out of all the policies, this is the one that might serve our purpose. When the Equal_Per_Session resource allocation policy is managing the system, resources are allocated on an equal basis to each session connected to the system. This policy is intended for RD Session Host servers.

This used to work nicely prior to Windows Server 2012 R2, but disappointingly, Microsoft removed this feature beginning with Windows Server 2012 R2 and left us with no built-in alternative.

There seems to be a workaround for this too: WINDOWS SYSTEM RESOURCE MANAGER AND WINDOWS SERVER 2012 R2. In this blog, the author describes a way to keep using WSRM on Windows Server 2012 R2 and later editions. However, it requires the presence of a Windows Server 2012 machine, which is often not possible.

There is also a third-party tool, Process Lasso Server Edition, which claims to perform this task, but I haven’t tested it.

So now we head to the final section, which is more of a preventive step.

Microsoft Remote Desktop Services 2012 Management Pack for System Center 2012

The Remote Desktop Services Management Pack helps you manage your computers that are running Remote Desktop Services on Windows Server 2008 R2 by monitoring the health of the following Remote Desktop Services role services.

When there is a problem with the availability or performance of one of these components, Microsoft System Center Operations Manager 2007 uses the Remote Desktop Services Management Pack to detect the issue and alert you, so that you can diagnose and fix the problem.

To set this up, this link can be really helpful: Monitoring RDS 2012 with System Center Operations Manager 2012 (Part 1)

I will post more here if I find a more definitive solution to this issue.


Exporting Network Printers’ information using PowerShell

Printers in a Windows domain can be managed through the Print Server role of Windows Server. Network printers can also be published in Active Directory from the Print Management console. Therefore, we can extract information both on printers that are published in AD and on those that are present on the print servers. It is possible that not all printers on the print servers are published, hence the need for a report on all printers for proper classification and management.

Exporting Information from Active Directory

Using the PowerShell cmdlet Get-ADObject

From a PowerShell console, we can execute the command below to list all the printers published in the domain. The scope of the search can be limited using the -SearchScope parameter.

Get-ADObject -Filter {ObjectClass -eq "printQueue"} -SearchScope Subtree -Properties Location |
Select-Object Name,Location | Export-Csv Printers.csv -NoTypeInformation

Using CSV Directory Exchange utility

We can also make use of CSVDE to export the list of all printers published in AD to a CSV file. This command can be executed from both CMD and a PowerShell prompt, and the search scope can be limited using the -p switch.

csvde -r "(objectClass=printQueue)" -p subtree -l "name,location" -f "C:\printers.csv"

Exporting information from remote Print Servers

Now we will export the printers from the print servers and compare them to the report we exported earlier using CSVDE/Get-ADObject. The following script, which I have borrowed from the link below and modified slightly, is very effective in this regard. The usage and description are included in the comment block at the top of the script.

<#
.SYNOPSIS
 Script to create a Excel spreadsheet with detailed information about
 the printers installed on the server
.DESCRIPTION 
 Script was designed to give you a good description of how your print
 server(s) are installed and configured.

 * Requires Microsoft Excel be installed on the workstation you are running
 the script from.
.PARAMETER PrintServers
 Name of the server you wish to run the script again. Can also be an
 array of servers.
.OUTPUTS
 Excel spreadsheet
.EXAMPLE
 .\Export-PrinterInfo.ps1 -PrintServers "MyPrintServer"
.EXAMPLE
 .\Export-PrinterInfo.ps1 -PrintServers (Get-Content c:\scripts\myprintserverlist.txt)
.NOTES
 Author: Martin Pugh
 Twitter: @thesurlyadm1n
 Spiceworks: Martin9700
 Blog: www.thesurlyadmin.com

 Changelog:
 1.0 Initial Release
.LINK
 http://community.spiceworks.com/scripts/show/2186-export-printer-information-to-spreadsheet
#>
[CmdletBinding()]
Param (
 [Parameter(Mandatory=$true)]
 [string[]]$PrintServers = "yourPrintServer"
)

# Create new Excel workbook
Write-Verbose "$(Get-Date): Script begins!"
Write-Verbose "$(Get-Date): Opening Excel..."
[threading.thread]::CurrentThread.CurrentCulture = 'en-US'
$Excel = New-Object -ComObject Excel.Application
$Excel.Visible = $True
$WorkBook = $Excel.Workbooks.Add()
$WorkBook.worksheets.add()
$Sheet = $WorkBook.Worksheets.Item(1)
$Sheet.Name = "Printer Inventory"
#======================================================
$Sheet.Cells.Item(1,1) = "Print Server"
$Sheet.Cells.Item(1,2) = "Printer Name"
$Sheet.Cells.Item(1,3) = "Location"
$Sheet.Cells.Item(1,4) = "Comment"
$Sheet.Cells.Item(1,5) = "IP Address"
$Sheet.Cells.Item(1,6) = "Driver Name"
$Sheet.Cells.Item(1,7) = "Driver Version"
$Sheet.Cells.Item(1,8) = "Driver"
$Sheet.Cells.Item(1,9) = "Shared"
$Sheet.Cells.Item(1,10) = "Share Name"
$Sheet.Cells.Item(1,11) = "AD - Published"
#=======================================================
$intRow = 2
$range = $Sheet.UsedRange
$range.Interior.ColorIndex = 40
$range.Font.ColorIndex = 11
$range.Font.Bold = $True
#=======================================================

# Get printer information
ForEach ($PrintServer in $PrintServers)
{
    Write-Verbose "$(Get-Date): Working on $PrintServer..."
    $Printers = Get-WmiObject Win32_Printer -ComputerName $PrintServer
    ForEach ($Printer in $Printers)
    {
        If ($Printer.Name -notlike "Microsoft XPS*")
        {
            $Sheet.Cells.Item($intRow, 1) = $PrintServer
            $Sheet.Cells.Item($intRow, 2) = $Printer.Name
            $Sheet.Cells.Item($intRow, 3) = $Printer.Location
            $Sheet.Cells.Item($intRow, 4) = $Printer.Comment

            If ($Printer.PortName -notlike "*\*")
            {
                $Ports = Get-WmiObject Win32_TcpIpPrinterPort -Filter "name = '$($Printer.PortName)'" -ComputerName $PrintServer
                ForEach ($Port in $Ports)
                {
                    $Sheet.Cells.Item($intRow, 5) = $Port.HostAddress
                }
            }

            ####################
            $Drivers = Get-WmiObject Win32_PrinterDriver -Filter "__path like '%$($Printer.DriverName)%'" -ComputerName $PrintServer
            ForEach ($Driver in $Drivers)
            {
                $Drive = $Driver.DriverPath.Substring(0,1)
                $Sheet.Cells.Item($intRow, 7) = (Get-ItemProperty ($Driver.DriverPath.Replace("$Drive`:","\\$PrintServer\$Drive`$"))).VersionInfo.ProductVersion
                $Sheet.Cells.Item($intRow, 8) = Split-Path $Driver.DriverPath -Leaf
            }
            ####################
            $Sheet.Cells.Item($intRow, 6) = $Printer.DriverName
            $Sheet.Cells.Item($intRow, 9) = $Printer.Shared
            $Sheet.Cells.Item($intRow, 10) = $Printer.ShareName
            $Sheet.Cells.Item($intRow, 11) = $Printer.Published
            $intRow ++
        }
    }
}

$intRow ++ 
$Sheet.Cells.Item($intRow,1) = "Printer inventory completed"
$Sheet.Cells.Item($intRow,1).Font.Bold = $True
$Sheet.Cells.Item($intRow,1).Interior.ColorIndex = 40
$Sheet.Cells.Item($intRow,2).Interior.ColorIndex = 40
Write-Verbose "$(Get-Date): Completed!"

With the reports extracted by the means discussed in this article, we can have at hand a detailed report of the complete printer information in our domain.
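Comparing the two reports can be sketched with Compare-Object. The objects below are stand-ins for the rows of the exported CSV files (the printer names are made-up examples; in practice the inputs would come from Import-Csv on the reports generated above):

```powershell
# $fromServers / $fromAD stand in for Import-Csv output of the two reports
$fromServers = @(
    [pscustomobject]@{ Name = 'HP-Floor1' },
    [pscustomobject]@{ Name = 'Canon-Floor2' }
)
$fromAD = @(
    [pscustomobject]@{ Name = 'HP-Floor1' }
)

# '<=' marks printers present on a print server but not published in AD
$unpublished = Compare-Object $fromServers $fromAD -Property Name |
    Where-Object SideIndicator -eq '<=' |
    Select-Object -ExpandProperty Name
$unpublished   # -> Canon-Floor2
```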

Processing Windows Event Viewer logs with PowerShell

While working with the Windows Event Viewer, it is better to make use of the cmdlet

Get-WinEvent

This is more powerful than the Get-EventLog cmdlet, which has a limited scope.

Two major points of difference (courtesy: Managing event logs in PowerShell):

  • Get-WinEvent gives you much wider and deeper reach into the event logs. It can access log providers directly as well as tap into Windows event tracing logs. That said, it’s easier to delve into the content of classic event log entries with Get-EventLog.
  • For remoting, Get-WinEvent uses the built-in Windows event log remoting technology instead of PowerShell remoting. Thus, you’ll find that remote log queries run faster with Get-WinEvent than with Get-EventLog.

A comparison between the two can also be found in Processing Event Logs in PowerShell

$starttime = (Get-Date).AddDays(-27)
$endtime = (Get-Date).AddDays(-1)
Get-WinEvent -FilterHashtable @{LogName='Security';StartTime=$starttime;EndTime=$endtime} -MaxEvents 10 | Select-Object *

Piping Get-WinEvent to Select-Object *, as above, reveals all the properties we can make use of in a search. We can also limit the search by including the -MaxEvents switch, which shows the latest N events. The concept of the hashtable with reference to Get-WinEvent is explained well in Advanced Event Log Filtering Using PowerShell

Message : The computer attempted to validate the credentials for an account.

Authentication Package: MICROSOFT_AUTHENTICATION_PACKAGE_V1_0
Logon Account: usr
Source Workstation: SERVER01
Error Code: 0xC0000064
Id : 4776
Version : 0
Qualifiers :
Level : 0
Task : 14336
Opcode : 0
Keywords : -9218868437227405312345452
RecordId : 27939335804
ProviderName : Microsoft-Windows-Security-Auditing
ProviderId : 548XX25-5X78-4XX4-X5ba-3e3b0XXd
LogName : Security
ProcessId : 612
ThreadId : 156552
MachineName : SERVER01.DOMAIN.CORP
UserId :
TimeCreated : 10.09.2017 14:09:45
ActivityId :
RelatedActivityId :
ContainerLog : security
MatchedQueryIds : {}
Bookmark : System.Diagnostics.Eventing.Reader.EventBookmark
LevelDisplayName : Information
OpcodeDisplayName : Info
TaskDisplayName : Credential Validation
KeywordsDisplayNames : {Audit Failure}
Properties : {System.Diagnostics.Eventing.Reader.EventProperty, System.Diagnostics.Eventing.Reader.EventProperty,
System.Diagnostics.Eventing.Reader.EventProperty, System.Diagnostics.Eventing.Reader.EventProperty}

We should note here that when searching with the Get-WinEvent cmdlet, we must provide at least one of the three search parameters below, along with any combination of the remaining parameters.

LogName -> This can be seen on the “General” tab of the preview pane of the Windows Event Viewer console.

ProviderName -> This can be seen on the “Details” tab of the preview pane of the Windows Event Viewer console.

Path

Searching through old archived evtx files

Making use of Path parameter

If we have archived the old Event Viewer log files (in .evtx format) on a network path or a local hard drive, then in order to scan through them for a particular event ID we will find ourselves in a difficult situation, with very few tools at our disposal. In this situation the third parameter, Path, will help us.

$starttime = (Get-Date).AddDays(-27)
$endtime = (Get-Date).AddDays(-22)
$files = Get-ChildItem "C:\LOGS\*" |
    Where-Object { $_.LastWriteTime -gt $starttime -and $_.LastWriteTime -lt $endtime }

foreach ($file in $files) {
    if ($file.Name -notmatch "Temp") {
        $a = Get-WinEvent -FilterHashtable @{Path=$file.FullName; Id=4738} -ErrorAction SilentlyContinue |
            Select-Object Message, Id, Level, ProviderName, LogName, ProcessId, ThreadId, MachineName, UserId, TimeCreated, ContainerLog |
            Where-Object { $_.Message -like '*user1*' }
        $a
        $a | Out-File "C:\log.txt" -Append
    }
}

Parsing through a string in message field of an event log

In the example above, we have parsed through the text in the Message field of the event logs corresponding to event ID 4738, for an account “user1” that was changed. We could not have done this filtering from the hashtable itself, hence we resorted to the use of Where-Object.
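As a side note, an alternative to string-matching the Message text is to read the event's structured fields from its XML. With a live record we would call $event.ToXml(); the XML below is a hand-made stand-in mimicking the EventData section of a 4738 event (the field values are made up for illustration):

```powershell
# Stand-in for [xml]$event.ToXml() on a real 4738 record
$sample = [xml]@"
<Event>
  <EventData>
    <Data Name="TargetUserName">user1</Data>
    <Data Name="SubjectUserName">admin1</Data>
  </EventData>
</Event>
"@

# Pick the changed account out of the structured data
$target = $sample.Event.EventData.Data |
    Where-Object { $_.Name -eq 'TargetUserName' } |
    Select-Object -ExpandProperty '#text'
$target   # -> user1
```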

We will wrap the discussion here 🙂

Migrating from Windows Server 2012 R2 to Windows Server 2016

As a first step in migrating an old Windows server to a new Windows Server 2016 machine, it is always better to chalk out a strategy or a checklist. This will help ensure no hiccups come up during the process.

In this direction, the first step can be listing the roles and features installed on the server. At this stage we would also identify the things that will not automatically come over to the new server once it is brought into the domain and promoted.

In my example I am considering a server “server1” running Windows Server 2012 R2. It is an additional domain controller and also has the DNS and DHCP roles installed. “server1” is in a DHCP failover relationship (hot standby) with another DC, “server2”.

The following are the things that we need to configure manually on the new server, which is painstaking:

1> Installation of the same roles and features.

2> Conditional Forwarders (if any) configured at server level.

3> DHCP server level configurations.

4> IP configuration, which in this case will be the same as the old one.

Now we will try to automate these steps one by one. There are surely other ways to do this, such as the command-line approach and the GUI, but we will restrict ourselves to PowerShell here.

1> We can perform the steps below to automate the installation of roles and features; selecting all roles and features manually takes time and can also lead to errors.

First, using PowerShell, we create an answer file from the already existing server “server1”, containing the list of all installed roles and features.

Get-WindowsFeature |
? { $_.Installed -AND $_.SubFeatures.Count -eq 0 } |
Export-Clixml \\server3\c$\ServerFeatures.xml

Second, from an elevated PowerShell prompt, we import the same answer file on the new server “server3”, and it will automatically install all the roles and features in that file.

$ServerFeatures = Import-Clixml c:\ServerFeatures.xml
foreach ($feature in $ServerFeatures) {
Install-WindowsFeature -Name $feature.name
}

2> We can make use of the PowerShell module DnsShell to export DNS conditional forwarders from “server1” to “server3”. First we download the module (it is compatible with PowerShell version 4.0 as well) and then import it as follows.

import-module "C:\Program Files (x86)\WindowsPowerShell\Modules\DnsShell"

To make the module available to all users logging into the system, it should be placed in “C:\Program Files (x86)\WindowsPowerShell\Modules\”

Get-DnsZone -ZoneType Forwarder -Server server1 | New-DnsZone -Server server3

In this way we are piping the DNS conditional forwarders settings to the new server in one command.

3> Now we will look into transferring the server-level configuration that we might have on the old server “server1”. We will first execute the command below in PowerShell to create a backup of the DHCP configuration and leases.

export-dhcpserver -computername server1 -file "c:\dhcpexport.xml" -leases

Then we can import this exported file on the new server, either by specifying the UNC path of the file or by first manually copying the file to the new location. Kindly note that we need to specify the DHCP backup directory, which can be seen in the properties of the DHCP server in the DHCP console.

Import-DhcpServer -ComputerName server3 -File C:\dhcpexport.xml -BackupPath C:\dhcpbackup\ -ServerConfigOnly

We need not export the other configuration, because the old server “server1” is in a DHCP failover relationship with “server2”. Hence, when we reconfigure the failover between “server2” and “server3”, the complete scope configuration will get replicated to the new server “server3”, except for the server-level configuration. We can refer to this link for setting up a DHCP failover.
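As a hedged sketch, recreating a hot-standby failover relationship between “server2” and “server3” could look like the following (the relationship name, scope ID, reserve percentage and shared secret are placeholders to adjust):

```powershell
# Hot-standby failover: server2 active, server3 standby (values are placeholders)
Add-DhcpServerv4Failover -ComputerName server2 -PartnerServer server3 `
    -Name "server2-server3" -ScopeId 192.168.1.0 `
    -ServerRole Active -ReservePercent 5 -SharedSecret "P@ssw0rd"
```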

4> Before the old server “server1” is demoted, unjoined from the domain and shut down permanently, we should export the IP configuration of the server and store it in a migration store folder.

Export-SmigServerSetting -IPConfig -Path "c:\ipexport\" -Verbose

It would also be better to export the IP configuration to a text file from the command line.

IPConfig -all >  c:\ipconfig.txt

Now we can import the configuration on the new server “server3” by first copying both “ipconfig.txt” and the migration store folder to “server3”, and then executing the command below in PowerShell. We should perform this step only when the old server “server1” has been completely removed. We must also know the MAC addresses of the source and destination NICs.

Import-SmigServerSetting -IPConfig All -SourcePhysicalAddress "XX-XX-XX-XX-XX-XX" -TargetPhysicalAddress "YY-YY-YY-YY-YY-YY" -Path "c:\ipexport\" -Verbose

For detailed IP configuration migration steps, we can refer to this TechNet link: IP Configuration: Migrating IP Configuration Data

This concludes our discussion here.

Working with csv and xlsx files using PowerShell Part-2

Continuing from where we left in part-1, we’ll dive further in this topic.

Copy a range of cells from one excel sheet and paste in another

[threading.thread]::CurrentThread.CurrentCulture = 'en-US'
$xl1 = New-Object -ComObject Excel.Application
$xl1.visible = $true
$wb1 = $xl1.workbooks.open($xlsx3)
$xl1.Worksheets.item(1).name = "Worksheet1"
$ws1 = $wb1.worksheets.Item(1)

$ws1.Activate()

$lr1 = $ws1.UsedRange.rows.count
$r1 = $ws1.Range("A1:F$lr1")
$r1.copy()

[threading.thread]::CurrentThread.CurrentCulture = 'en-US'
$xl2 = New-Object -ComObject Excel.Application
$xl2.visible = $true
$wb2 = $xl2.workbooks.open($xlsx4)
$ws2 = $wb2.worksheets.Item(1)
$ws2.activate()

$r2 = $ws2.Range("A1:F$lr1")
$ws2.Paste($r2)

$wb1.save()
$wb2.save()
$xl1.Workbooks.Close()
$xl2.Workbooks.Close()
$xl1.Quit()
$xl2.Quit()

We can change the cell selection as per our need, e.g.

$lr1 = ($ws1.UsedRange.rows.count)-1

This counts all the rows ‘r’ and subtracts one, so the variable ‘$lr1’ stores the numeric value ‘r-1’, excluding the last row. After this we can just specify the selection range as

$r1 = $ws1.Range("A1:F$lr1")

This will select all the cells containing data, starting from cell ‘A1’ up to the second-to-last row. Instead of two steps, this can be combined into one as:

$r1 = $ws1.Range("A1:F$($ws1.UsedRange.rows.count-1)")

We can paste the selection anywhere in the worksheet by altering the range as below.

$lr2 = $r1.Rows.Count

This stores the count of rows contained in the copy selection that we have made in the variable ‘$lr2’.

$r2 = $ws2.Range("A2:F$lr2")
$ws2.Paste($r2)

Note that the source and destination cell ranges should be exactly the same size.

Now that we are discussing the Excel COM object and PowerShell, Powershell Excel Cookbook Ver 2 cannot go without mention. It covers almost everything one can come across while automating Excel with PowerShell.

Inserting values in specific cells

$ws1.cells.item(1,2) = "Hello"

This stores the string ‘Hello’ in cell ‘B1’. The index (1,2) corresponds to the cell at the intersection of the first row and the second column.

Sorting a column

$rsort1=$ws1.usedrange
$rsort2 = $ws1.Range("A2")
[void]$rsort1.sort($rsort2, 1)

First we select all the cells where data is present and store this range in the variable ‘$rsort1’. Then we pick a cell in the column on which we want the sorting to be done. So in this case, the rows will be sorted on column ‘A’, starting from row ‘2’, in ascending order; the parameter ‘1’ specifies ascending.

Summing values in each column and displaying the sum at the bottom of respective column

$r1 = $ws1.usedRange
$row1 = $r1.rows.count
$Sumrow1 = $row1 + 1
$r2 = $ws1.Range("B2:B$row1")
$r3 = $ws1.Range("C2:C$row1")
$r4 = $ws1.Range("D2:D$row1")
$r5 = $ws1.Range("E2:E$row1")
$r6 = $ws1.Range("F2:F$row1")
$functions = $xl1.WorkSheetfunction
$ws1.cells.item($Sumrow1,1) = "Total"
$ws1.cells.item($Sumrow1,2) = $functions.sum($r2)
$ws1.cells.item($Sumrow1,3) = $functions.sum($r3)
$ws1.cells.item($Sumrow1,4) = $functions.sum($r4)
$ws1.cells.item($Sumrow1,5) = $functions.sum($r5)
$ws1.cells.item($Sumrow1,6) = $functions.sum($r6)
[void]$r1.entireColumn.Autofit()

First we store the count of all the rows containing data in the variable ‘$row1’. Assuming that the first row contains the headers, we select from the 2nd row to the end. We then make use of the ‘WorksheetFunction’ property of the Excel object ‘$xl1’ we created to call its ‘Sum’ function, sum the values in columns ‘B’ to ‘F’, and store the results at the bottom of each column. After the operation, we can ‘AutoFit’ the entire data range. There is another way of adjusting the column width, shown below, by specifying the width explicitly.

$ws1.Columns.Item('a').columnWidth = 20

Sometimes, if the data type of the values is set to text/general, the summing operation might fail. So we have to set the data type beforehand, as follows.

$ws1.Columns.Item('a').Numberformat="0"
$ws1.Columns.Item('b').Numberformat="0"
$ws1.Columns.Item('c').Numberformat="0"
$ws1.Columns.Item('d').Numberformat="0"
$ws1.Columns.Item('e').Numberformat="0"
$ws1.Columns.Item('f').Numberformat="0"

Adding chart to worksheet

Let’s first select the chart and chart type that we will put in the worksheet.

$ch = $ws1.shapes.addChart().chart
$ch.chartType = 58

Then we will set the data range that we want to chart. We can also give a suitable title to the chart.

$ch.setSourceData($range1)
$ch.HasTitle = $true
$ch.ChartTitle.Text = "My chart"

Now we will select the cell range where we want to place the chart. If we want to adjust the chart size, we can do so by altering the range as below.

$RngToCover = $ws1.Range("A$($range1.Rows.Count + 5):F$($range1.Rows.Count + 23)")
$ChtOb = $ch.Parent
$ChtOb.Top = $RngToCover.Top
$ChtOb.Left = $RngToCover.Left
$ChtOb.Height = $RngToCover.Height
$ChtOb.Width = $RngToCover.Width

Cell formatting

Below are some basic formatting operations that can be done on a cell range.

$range.select()
$range.HorizontalAlignment = -4108
$range.VerticalAlignment = -4107
$range.BorderAround(1,4,1)
$range.WrapText = $false
$range.Orientation = 0
$range.AddIndent = $false
$range.IndentLevel = 0
$range.ShrinkToFit = $false
$range.ReadingOrder = -5002
$range.MergeCells = $false
$range.Borders.Item(12).Weight = 2
$range.Borders.Item(12).LineStyle = 1
$range.Borders.Item(12).ColorIndex = -4105

Working with csv and xlsx files using PowerShell Part-1

This is part of a series of posts discussing various operations that can be performed on .csv and .xlsx files using PowerShell.

Changing delimiter of csv file:

Import-Csv -Delimiter "," -Path $p|Export-Csv -Delimiter ";" -Path $p1 -NoTypeInformation

Replacing a character from a column in csv file:

Import-Csv -Delimiter ";" -Path $p1 -Header "1","2","3"|
Select-Object *| ForEach-Object {
"{0};{1};{2}" -f $_.1,($_.2).replace("Mr.",""),$_.3>> $p2
}

Removing header from a csv file:

Import-Csv -Delimiter ";" -Path $p2 -Header "1","2","3"|
Select-Object -Skip 1 >> $p3
}

Inserting headers in a headerless csv file:

import-csv -Delimiter ";" -Path $p4 -Header "Team","Player name","Status"|
Export-Csv -Delimiter ";" -Path $p5 -NoTypeInformation

Converting csv into xlsx file:

# Create a new Excel workbook with one empty sheet
[threading.thread]::CurrentThread.CurrentCulture = 'en-US'
$excel = New-Object -ComObject excel.application
$excel.DisplayAlerts=$false
$workbook = $excel.Workbooks.Add(1)
$worksheet = $workbook.worksheets.Item(1)

# Build the QueryTables.Add command and reformat the data
$delimiter = ";" # delimiter used in the source csv file
$TxtConnector = ("TEXT;" + $p5)
$Connector = $worksheet.QueryTables.add($TxtConnector,$worksheet.Range("A1"))
$query = $worksheet.QueryTables.item($Connector.name)
$query.TextFileOtherDelimiter = $delimiter
$query.TextFileParseType = 1
$query.TextFileColumnDataTypes = ,1 * $worksheet.Cells.Columns.Count
$query.AdjustColumnWidth = 1

# Execute & delete the import query
$query.Refresh()
$query.Delete()

# Save & close the Workbook as XLSX.
$workbook.SaveAs($xlsx)
$excel.Quit()

Filter columns of an excel file:

Now we will make use of the PSExcel module, which makes data manipulation very easy.

import-module "C:\Program Files (x86)\WindowsPowerShell\Modules\PSExcel-master\PSExcel"

[threading.thread]::CurrentThread.CurrentCulture = 'en-US'

Import-XLSX $xlsx -sheet Sheet1 |
Where-Object {($_.'Team' -in @("KKR","KVIP","CSK"))`
-and ($_.'Status' -notin @("bench","not selected"))}|
Export-XLSX $xlsx1

Create Pivot table from a worksheet of an excel file:

Import-XLSX $xlsx1 -sheet Sheet1 |
Export-XLSX $xlsx2 -PivotRows "Team" -PivotColumns "Status" -PivotValues "Player name" -ChartType pie

Merging 2 excel worksheets over a common column

$lt = Import-XLSX $xlsx1 -Sheet Sheet2 -header "Name","Employee Number","Department"

$rt = Import-XLSX $xlsx2 -Sheet Worksheet2 -header "Name","Salary"

Join-Object -Left $lt -Right $rt -LeftJoinProperty "Name" -RightJoinProperty "Name" -type AllInBoth -RightProperties "Salary"|
Export-XLSX $xlsx3 -WorksheetName Mergedoutput

The other options for the -Type parameter of Join-Object are:

1> AllInLeft

2> AllInRight

3> AllInBoth

4> OnlyIfInBoth

I’ll try to explore more operations that can be done specifically with xlsx files in part-2 of this series.

Collect disk space details from remote computers

In this post we will discuss some PowerShell scripts to get details such as used space, free space, and total size of all the drives on remote computers. The list of remote computers can be fed to the script as a text file. Let’s discuss the scripts one by one.

Using Win32_LogicalDisk

$CSVFile = "C:\Disk space report.csv"
"Computer Name;Volume Name;Drive Name;Size(GB);Free(GB);Used(GB);Used%(GB)" | out-file -FilePath $CSVFile -append

#domain admin's credentials
$cre = get-credential
$machines = get-content "C:\serverset.txt"
foreach ($machine in $machines) 
{
$disk=Get-WmiObject Win32_LogicalDisk -filter "DriveType=3" -computer $machine -Credential $cre| Select SystemName,DeviceID,VolumeName,Size,freespace
foreach ($d in $disk)
{
#to set language pack for current thread as en-US
[threading.thread]::CurrentThread.CurrentCulture = 'en-US'
$DeviceID=($d.DeviceID).Replace(":", "")

$VolumeName=$d.VolumeName
#'{0:N2}' formats the value to two decimal places
$size=('{0:N2}' -f ($d.size/1gb))

$freespace=('{0:N2}' -f ($d.freespace/1gb))
#used space is size minus free space, computed from the raw byte counts
#(the formatted strings cannot be subtracted reliably)
$usespace=('{0:N2}' -f (($d.size - $d.freespace)/1gb))

$usespacepc=('{0:N2}' -f (($d.size - $d.freespace)/$d.size*100))

$exporttocsv =$machine+";"+$DeviceID+";"+$VolumeName+";"+$size+";"+$freespace+";"+$usespace+";"+$usespacepc
$exporttocsv | Out-File -Append $CSVFile
  }
}

Note that while we can get the free space quite easily here, we need to perform calculations to get the used space.
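The calculation itself is just size minus free space; e.g. for a 100 GB drive with 25 GB free (values made up for illustration, using the same formatting as the script above):

```powershell
# set the language pack for the current thread to en-US, as in the script
[threading.thread]::CurrentThread.CurrentCulture = 'en-US'
$sizeBytes = 100GB
$freeBytes = 25GB

# used space in GB, formatted to two decimal places
$used = '{0:N2}' -f (($sizeBytes - $freeBytes)/1GB)
$used   # -> 75.00
```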

Using Get-PSDrive

Now we will try to do the same thing, but this time without any calculation, which can improve both speed and accuracy. We will make use of the Get-PSDrive cmdlet to get the used space and free space directly. However, we still need the Get-WmiObject cmdlet to get the total size of each drive and its volume name.

$CSVFile = "C:\Disk space report.csv"
"Computer Name;Drive Name;Volume Name;Size(GB);Free(GB);Used(GB);Used%(GB)" | out-file -FilePath $CSVFile -append
WriteHost"ComputerName;VolumeName;DriveName;Size(GB);Free(GB);Used(GB);Used(GB)"

$cre = get-credential
$machines = get-content "C:\serverset.txt"
foreach ($machine in $machines) {
#=================================================== 
#---To get Volume names and Total size, we will-----
#---use Win32_LogicalDisk, and to get Used Space----
#---and free space, we will use Get-PSDrive---------
#===================================================
$disk=Get-WmiObject Win32_LogicalDisk -filter "DriveType=3" -computer $machine -Credential $cre | Select Size,DeviceID,VolumeName

$disk1=Invoke-Command -ComputerName $machine {Get-PSDrive} -Credential $cre | Select-Object PSComputerName,Name,Used,Free,Provider|
where-object Provider -match "FileSystem"

foreach ($d in $disk)
{
[threading.thread]::CurrentThread.CurrentCulture = 'en-US'
$deviceid=($d.DeviceID).Replace(":", "")

foreach ($d1 in $disk1)
{

if ($d1.Name -match $deviceid){
$VolumeName = $d.VolumeName
[threading.thread]::CurrentThread.CurrentCulture = 'en-US'
$size=('{0:N2}' -f ($d.size/1gb))

$freespace=('{0:N2}' -f ($d1.Free/1gb))

$usespace=('{0:N2}' -f ($d1.Used/1gb))

#compute the percentage from the raw values, not the formatted strings
$usespacepc=('{0:N2}' -f ($d1.Used/$d.size*100))

$exporttocsv =$d1.PSComputerName+";"+$deviceid+";"+$VolumeName+";"+$size+";"+$freespace+";"+$usespace+";"+$usespacepc
$exporttocsv
$exporttocsv| Out-File -Append $CSVFile
Break }
    }
  }
}

Using Win32_Volume

Now we will see how we can make use of Win32_Volume to get the desired output.

$CSVFile = "C:\Disk space report.csv"
"Computer name,Drive name,Size(GB),Free(GB),Used(GB),% Used" | out-file -FilePath $CSVFile -append
$machines = get-content "C:\Users\umurarka\Desktop\servers.txt"
$cre = Get-Credential
foreach ($machine in $machines) 

{
$disk = Get-WmiObject -ComputerName $machine -Class Win32_Volume -ErrorAction SilentlyContinue
foreach ($d in $disk)
{

$size=('{0:N2}' -f [Math]::Round(($d.Capacity/1gb), 2))

$freespace=('{0:N2}' -f [Math]::Round(($d.FreeSpace/1gb), 2))

$usespace=('{0:N2}' -f [Math]::Round((($d.Capacity - $d.FreeSpace)/1gb), 2))

$usespacepc=('{0:N2}' -f [Math]::Round(((($d.Capacity - $d.FreeSpace)/$d.Capacity)*100), 2))

$exporttocsv = $machine+","+$d.DriveLetter+","+$size+","+$freespace+","+$usespace+","+$usespacepc
$exporttocsv| Out-File -Append $CSVFile
  }
}

A Fast Way of getting disk space report through UNC paths

The methods above sometimes fail with access-denied errors. To get around this, we can query UNC paths instead, which also turns out to be considerably faster than any of the previous methods.

#===================================================
#---function for getting disk space from UNC--------
#===================================================
[threading.thread]::CurrentThread.CurrentCulture = 'en-US'
function getDiskSpaceInfoUNC($p_UNCpath, $p_unit = 1GB, $p_format = '{0:N1}')
{
    # unit, one of --> 1kb, 1mb, 1gb, 1tb, 1pb
    $l_typeDefinition = @' 
        [DllImport("kernel32.dll", CharSet = CharSet.Auto, SetLastError = true)] 
        [return: MarshalAs(UnmanagedType.Bool)] 
        public static extern bool GetDiskFreeSpaceEx(string lpDirectoryName, 
            out ulong lpFreeBytesAvailable, 
            out ulong lpTotalNumberOfBytes, 
            out ulong lpTotalNumberOfFreeBytes); 
'@
    # Add-Type throws if the type already exists in the session, so compile it only once
    if (-not ('GetDiskFreeSpaceEx.Win32Utils' -as [type])) {
        Add-Type -MemberDefinition $l_typeDefinition -Name Win32Utils -Namespace GetDiskFreeSpaceEx | Out-Null
    }
    $l_type = [GetDiskFreeSpaceEx.Win32Utils]

    $freeBytesAvailable     = New-Object System.UInt64 # differs from totalNumberOfFreeBytes when per-user disk quotas are in place
    $totalNumberOfBytes     = New-Object System.UInt64
    $totalNumberOfFreeBytes = New-Object System.UInt64

    $l_result = $l_type::GetDiskFreeSpaceEx($p_UNCpath,([ref]$freeBytesAvailable),([ref]$totalNumberOfBytes),([ref]$totalNumberOfFreeBytes)) 

    $totalBytes     = if($l_result) { $totalNumberOfBytes    /$p_unit } else { '' }
    $totalFreeBytes = if($l_result) { $totalNumberOfFreeBytes/$p_unit } else { '' }

    New-Object PSObject -Property @{
        Path      = $p_UNCpath
        Total     =  [math]::round($totalBytes , 2 , 1)
        Used      =  [math]::round($totalBytes -  $totalFreeBytes , 2 , 1)
        Used_percent      = [math]::round((($totalBytes -  $totalFreeBytes)/$totalBytes)*100 , 2 , 1)
        Free      = [math]::round($totalFreeBytes , 2 , 1)
    } 
}

#===================================================
$CSVFile = "C:\Disk space report.csv"
$machines = get-content "C:\servers.txt"
foreach ($machine in $machines) 
 
{
#===================================================
#---With the help of Win32_Share, we will filter----
#---and refine our search results to include only---
#---the drive letters, e.g. C$----------------------
#===================================================
$ds = get-WmiObject -class Win32_Share -computer $machine -ErrorAction SilentlyContinue|
? {($_.description -NotLike "*printer*") -and ($_.name -notlike "SYSVOL") -and ($_.name -like "?$")} |
 select Name

 foreach ($d in $ds) {
 $p = join-path \\$machine\ $d.name

[threading.thread]::CurrentThread.CurrentCulture = 'en-US'
$info = getDiskSpaceInfoUNC $p
$info | ft -AutoSize
$info | ft -AutoSize -HideTableHeaders | Out-File -Append $CSVFile

  }
}
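For a one-off check, the function can also be called directly against a single administrative share (the server name below is a placeholder):

```powershell
# Query a single UNC path; Total/Used/Free come back in the unit
# passed as the second parameter (1GB by default).
getDiskSpaceInfoUNC '\\server01\C$' 1GB |
    Format-Table Path, Total, Used, Free, Used_percent -AutoSize
```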

Booting from ISO file on remote computer via ILO console on HP Proliant servers

Accessing an ISO file for a new OS installation on a new HP ProLiant server can be tricky if we don't have physical access to the server or if we intend to do this remotely.

On page 237 of the iLO 4 user guide available at HPE iLO 4 User Guide, I found the below paragraph, which can be an option for accessing the ISO image from the new physical server.

===============================================================

Using an image file through a URL (IIS/Apache) You can connect scripted media by using the .NET IRC or Java IRC.

Scripted media supports only 1.44 MB floppy disk images (.img) and CD/DVD-ROM images (.iso).

The image must be on a web server on the same network as iLO.

  1. Start the .NET IRC or Java IRC.
  2. Depending on the image type you will use, select Virtual Drives→URL Removable Media (.img) or Virtual Drives→URL CD-ROM/DVD (.iso). The Image file at URL dialog box opens.
  3. Enter the URL for the image file that you want to mount as a virtual drive, and then click Connect. The virtual drive activity LED does not show drive activity for URL-mounted virtual media.

================================================================

Following is the process flow which has been mentioned in the HP blog ILO 4 Scripted Media URL

  1. Use an existing web server, or install a new web server for the purpose of delivering the ISO files
  2. Create a folder to hold the ISO images
  3. Add ISO file(s)
  4. Enable directory browsing in Web Services. You can do this with the IIS Manager if it's a Windows web server. If you created a custom folder for the files, enable directory browsing on that folder.
  5. You must add a MIME type for the ISO extension. In Server 2008 IIS, you can do this from the HTTP Headers selection in IIS Manager.
     a> .ISO application/octet-stream
     b> .IMG application/octet-stream
  6. Login to the iLO of the target server, and open the remote console
  7. At the top of the window, click on Virtual Drives, and then select URL DVD ROM
  8. Input the HTTP path to the image file, including the file name. Click Connect and it will mount the drive. The path will resemble "http://hostname or IP/folder/filename.ISO"
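Step 5 can also be scripted. On IIS 7 and later, the MIME types can be registered with appcmd from an elevated command prompt (a sketch; adjust if the extensions are already registered on your server):

```shell
rem Register .iso and .img as downloadable static content in IIS.
rem appcmd lives in %windir%\system32\inetsrv.
%windir%\system32\inetsrv\appcmd set config /section:staticContent /+"[fileExtension='.iso',mimeType='application/octet-stream']"
%windir%\system32\inetsrv\appcmd set config /section:staticContent /+"[fileExtension='.img',mimeType='application/octet-stream']"
```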

Now the problem with this approach is network latency, especially when the infrastructure is globally spread.

So what can be the solution here? Virtual Folders?

Let's read the below two paragraphs taken from the user guide.

==============================================

Operating system considerations: Virtual Folder

  • Windows—A Virtual Folder appears automatically after Windows recognizes the mounting of the virtual USB device. You can use the folder the same way that you use a locally attached device. Virtual Folders are nonbootable. Attempting to boot from the Virtual Folder might prevent the server from starting.

===============================================

Virtual folders enable you to access, browse to, and transfer files from a client to a managed server. You can mount and dismount a local or networked directory that is accessible through the client. After you create a virtual image of a folder or directory, the server connects to the image as a USB storage device, enabling you to browse to the server and transfer the files from the virtual image to the server.

Using iLO Virtual Media The Virtual Folder is nonbootable and read-only; the mounted folder is static. Changes to the client folder are not replicated in the mounted folder.

=================================================

Apart from this, Virtual Folders also have a size limit of 2 GB.

So we come down to our next option -> Image File CD/DVD-ROM

For this option we first need to copy the ISO file to a temporary server on the same network where the new physical server is being installed. From this temporary server, we access the iLO of the new server and select the Virtual Drives option Image File CD/DVD-ROM. We then browse to the ISO file stored locally on the temporary server and mount it. Since the ISO file is now on the same network as the new server, the network latency is eliminated and access speed is greatly improved.

Getting folder owners’ details from remote computers

In this short article, we will try to get information about the owners of folders on remote computers and the permissions they have on them.

The below script goes one level down from the C: drive and then recursively gathers information about all files and subfolders within.

Get-ChildItem "C:\*" -Recurse | Get-Acl |
select Path, Owner -Expand Access |
select @{n="Path";e={$_.Path.Replace("Microsoft.PowerShell.Core\FileSystem::","")}}, Owner, IdentityReference, FileSystemRights, AccessControlType, IsInherited |
Export-Csv "C:\folder_owner_output.csv" -NoTypeInformation

This post has been majorly inspired by this blog post Find file / folder owner information using PowerShell or command prompt 

I recently got hold of another script that can be really useful if we are just interested in getting folder names and their owners. It can also fetch owner details for folders at a particular level of a folder hierarchy. In my case, I want owner details for folders exactly two levels below the root folder, using a UNC path. Including "$_.PSIsContainer -eq $True" restricts the search base to folders only and excludes files.

$arr = @()
gci "\\server01\*\*" | ? {$_.PSIsContainer -eq $True} | % {
$obj = New-Object PSObject
$obj | Add-Member NoteProperty Name $_.Name
$obj | Add-Member NoteProperty Owner ((Get-ACL $_.FullName).Owner)
$arr += $obj
}
$arr
$arr | Export-CSV -notypeinformation "C:\folder_owner_output.csv"
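The same result can be produced more idiomatically with a calculated property, which avoids building the array by hand (same placeholder UNC path as above):

```powershell
# Select-Object with a calculated 'Owner' property replaces the
# New-Object/Add-Member pattern; Export-Csv writes the same two columns.
Get-ChildItem "\\server01\*\*" | Where-Object { $_.PSIsContainer } |
    Select-Object Name, @{n='Owner'; e={ (Get-Acl $_.FullName).Owner }} |
    Export-Csv "C:\folder_owner_output.csv" -NoTypeInformation
```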

Outlook Automation

Although Microsoft does not recommend unattended automation of its Office products, we can leverage PowerShell to automate some common Outlook-related operations.

Searching for a particular subject line and then downloading its attachments

# 'yyyy-MM-dd' avoids the slashes that the culture-dependent short-date
# format would otherwise put into the file name
$date = get-date -format "yyyy-MM-dd"

$filepath = "C:\attachments\"
$attr = $date+"_report.csv"
$path = Join-Path $filepath $attr

$olFolderInbox = 6
[threading.thread]::CurrentThread.CurrentCulture = 'en-US'
$outlook = new-object -com outlook.application;
$ns = $outlook.GetNameSpace("MAPI");
$inbox = $ns.GetDefaultFolder($olFolderInbox)
$messages = $inbox.items

foreach ($message in $messages) {
    $msubject = $message.Subject
    write-host "Message Subject: $msubject"

    if ($msubject.Contains("Daily Report")) {
        $message.Attachments | foreach {
            Write-Host $_.FileName
            Write-Host "Attachment is saved at $path"
            $_.SaveAsFile(($path))
        }
        $TargetFolder = $inbox.Folders.Item('Processed')
        $message.Move($TargetFolder)
        break
    } else { continue }
}

Prior to running the above script, we must create a folder named "Processed" under the Outlook Inbox. Also, scheduling this in Task Scheduler can be tricky, as the account the script runs under must be logged in interactively for Outlook automation to work.
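Iterating over every item in the Inbox gets slow on large mailboxes. Outlook's Items.Restrict method can pre-filter the collection before the loop runs (a sketch, with the same "Daily Report" subject assumed):

```powershell
# Restrict returns only the items matching the filter string,
# so the loop touches far fewer messages than iterating the whole Inbox.
$olFolderInbox = 6
$outlook  = New-Object -ComObject Outlook.Application
$ns       = $outlook.GetNamespace("MAPI")
$inbox    = $ns.GetDefaultFolder($olFolderInbox)
$filtered = $inbox.Items.Restrict("[Subject] = 'Daily Report'")
foreach ($message in $filtered) {
    Write-Host "Matched:" $message.Subject
}
```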

Sending Mail

$From = "abc@company.com"
$to = "xyz@company.com"
$Cc = "pqr@company.com"

$Subject = "Daily Report"
$Body = "PFA The Excel Report containing details."
$SMTPServer = "server01.company.corp"

Send-MailMessage -From $From -to $To -Cc $Cc -Subject $Subject `
-Body $Body -SmtpServer $SMTPServer -Attachments $path

write-host "Mail sent with report attached"

In the above script we have not made use of a COM object. As a result, it can run unattended as a scheduled task, and execution is fast.

Now the same task can also be done via the .NET System.Net.Mail classes, as can be seen below.

$Username = "MyUserName";
$Password= "MyPassword";

function Send-ToEmail([string]$email, [string]$attachmentpath){

    $message = new-object Net.Mail.MailMessage;
    $message.From = "YourName@gmail.com";
    $message.To.Add($email);
    $message.Subject = "subject text here...";
    $message.Body = "body text here...";
    $attachment = New-Object Net.Mail.Attachment($attachmentpath);
    $message.Attachments.Add($attachment);

    $smtp = new-object Net.Mail.SmtpClient("smtp.gmail.com", "587");
    $smtp.EnableSSL = $true;
    $smtp.Credentials = New-Object System.Net.NetworkCredential($Username, $Password);
    $smtp.send($message);
    write-host "Mail Sent" ; 
    $attachment.Dispose();
 }
Send-ToEmail  -email "reciever@gmail.com" -attachmentpath $path;

The above script has been taken from a solution posted in how to send email with powershell