Getting Folder Details from Remote Computers using PowerShell

This post is all about getting details of shared folders from remote computers using a few different techniques. We can gather information such as the last modified time, ACLs and so on, but the task becomes time consuming in a large production environment once we have to calculate folder sizes.

Method 1: Measure-Object

This works for a small environment, but can take an unreasonably long time in a large one.

$CSVFile = "C:\logs.csv"
$machines = get-content "C:\servers.txt"
"Computer name,Foldername,Modified time,User/Group,Permissions" | 
out-file -FilePath $CSVFile -append
foreach ($machine in $machines) 
{
 $items = Get-ChildItem \\$machine\h$\USER , \\$machine\h$\USERS
 foreach ($item in $items)
  {
<#Check only folders and filter out files#>
   if ($item.PSIsContainer)
    {         
<#-----Get folder name--------------------#>
     $FoldName = $item.Name          
     Write-Host $FoldName
<#------Get last modified timestamp-------#>
     $ModTime = $item.lastwritetime          
     Write-Host $ModTime
<#----------Get folder size---------------#>
     $size = (Get-ChildItem $item.FullName -Recurse | Measure-Object -Property Length -Sum).Sum/1mb
<# We will collect information about ACL where Folder name 
then matches the name of user
Also we can employ any other condition for our purpose here #>
     $ACLs = get-acl $item.fullname | ForEach-Object { $_.Access  }
     Foreach ($ACL in $ACLs)
      {
        if ($ACL.IdentityReference -eq ("Company\" + $FoldName))
        {
         $OutInfo = $machine+","+$FoldName+","+$ModTime+","+$ACL.IdentityReference+","+$ACL.FileSystemRights+","+$size
         Write-Host $OutInfo
         Add-Content -Value $OutInfo -Path $CSVFile
        }
      }
    }
  } 
}

Method 2: FileSystemObject

Calculating folder size with the FileSystemObject COM object is noticeably faster and gives more accurate folder sizes.

<#----$path holds the full path of the folder, e.g. $item.FullName from Method 1----#>
$fso = New-Object -ComObject Scripting.FileSystemObject
$folder = $fso.GetFolder($path)
<#----Size in MB, rounded to 2 decimal places (third argument 1 = MidpointRounding.AwayFromZero)----#>
$size = [math]::round(($folder.size)/1mb , 2 , 1 )
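
As an illustration, here is a minimal sketch of how the FileSystemObject call could be dropped into the Method 1 loop in place of Measure-Object. The variable names ($items, $item) are carried over from Method 1; treat this as a sketch rather than a drop-in script.

<#--- Sketch: Method 1 loop with the FileSystemObject size calculation ---#>
$fso = New-Object -ComObject Scripting.FileSystemObject   # create the COM object once, outside the loop
foreach ($item in $items)
{
 if ($item.PSIsContainer)
  {
   $FoldName = $item.Name
   $ModTime = $item.LastWriteTime
   # FileSystemObject walks the folder natively, which is faster than piping every file to Measure-Object
   $size = [math]::round(($fso.GetFolder($item.FullName).Size)/1mb , 2 , 1)
   # ...the ACL collection and CSV output continue exactly as in Method 1
  }
}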

Method 3: PsExec

There are a few things to ensure before proceeding with this method. Firstly, the PsExec service on the target remote server must not be disabled or marked for deletion. Secondly, the account launching the script must have the appropriate administrative privileges. Thirdly, script execution must not be disabled on the remote computers via group policy.
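
Below is an optional, minimal pre-flight sketch of how those prerequisites could be checked from the launching machine. It assumes Windows PowerShell 5.1 (where Get-Service still supports -ComputerName); PSEXESVC is the service PsExec creates on the target, and the execution policy check assumes PowerShell remoting is available (if it is not, the policy can simply be confirmed through group policy instead).

<#--- Optional pre-flight checks before using the PsExec method ---#>
foreach ($machine in (Get-Content "C:\shared_folder\servers.txt"))
{
 # PSEXESVC is the temporary service PsExec installs; it must not be disabled or marked for deletion
 $svc = Get-Service -ComputerName $machine -Name "PSEXESVC" -ErrorAction SilentlyContinue
 if ($svc -and $svc.StartType -eq "Disabled")
  { Write-Host "$machine : PSEXESVC is disabled" -ForegroundColor Red }
 # Requires PowerShell remoting; otherwise verify the execution policy in the GPO
 $policy = Invoke-Command -ComputerName $machine -ScriptBlock { Get-ExecutionPolicy } -ErrorAction SilentlyContinue
 if ($policy -eq "Restricted")
  { Write-Host "$machine : script execution is restricted" -ForegroundColor Yellow }
}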

Step 1> To proceed with this method, first create a launcher script, say runme.ps1, as follows:

$CSVFile = "C:\shared_folder\logs.csv"
"File Server,P Folder Name,P Folder Size,Last Modified time,`
User Name,NTFS Access" | out-file -FilePath $CSVFile -append
$machines = get-content "C:\shared_folder\servers.txt"
ForEach ($machine In $machines) 
{
 if (Test-Connection -ComputerName $machine -Count 2 -ErrorAction SilentlyContinue)   
  {
   Write-Host "$machine is up" -ForegroundColor Green
<#-----& is an alternative to Start-Process-------
the -accepteula switch bypasses the EULA acceptance dialog #>
& C:\Windows\System32\psexec.exe \\$machine -u Company\admin -p ### -h -accepteula cmd.exe /c "\\server1\shared_folder\Script.bat"
  }
 else    
  {
   Write-Host "$machine is down" -ForegroundColor Red
   $info = $machine + "," + "is down"
   Add-content -value $info -Path $CSVFile
  }
}

Step 2> The above script calls out over the network to the following batch file, Script.bat:

@Echo Off
Net Use T: /delete /yes
Net Use T: \\server1\shared_folder
CMD /C powershell.exe -ExecutionPolicy Bypass -Command T:\Get_ACL.ps1
Net Use T: /delete /yes
Exit

Step 3> The batch file maps the folder containing the scripts as a network drive on the remote computer and then calls the following Get_ACL.ps1 file, which does the actual work of gathering the shared folder information. Note that we no longer use the network paths of the folders; instead we use the full local paths. Because the script runs locally on the remote computer, it avoids the delay caused by network latency.

$CSVFile = "T:\logs.csv"
$machine = hostname
$Folders = Get-ChildItem h:\USER , h:\USERS


After this we have our usual script, where we can employ either Method 1 or Method 2, as sketched below.
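
For completeness, here is a sketch of how the rest of Get_ACL.ps1 could look if we reuse the Method 2 size calculation locally. The column order follows the header written in Step 1, and the Company\ ACL match is the same assumption as in Method 1; treat it as an outline rather than the exact production script.

<#--- Sketch: remainder of Get_ACL.ps1 using the Method 2 size calculation ---#>
$fso = New-Object -ComObject Scripting.FileSystemObject
foreach ($Folder in $Folders)
{
 if ($Folder.PSIsContainer)
  {
   $FoldName = $Folder.Name
   $ModTime = $Folder.LastWriteTime
   $size = [math]::round(($fso.GetFolder($Folder.FullName).Size)/1mb , 2 , 1)
   $ACLs = Get-Acl $Folder.FullName | ForEach-Object { $_.Access }
   foreach ($ACL in $ACLs)
    {
     if ($ACL.IdentityReference -eq ("Company\" + $FoldName))
      {
       $OutInfo = $machine + "," + $FoldName + "," + $size + "," + $ModTime + "," + $ACL.IdentityReference + "," + $ACL.FileSystemRights
       Add-Content -Value $OutInfo -Path $CSVFile
      }
    }
  }
}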

As a concluding remark, this method is the fastest of the three covered in this post and should ease the burden on a lot of Windows administrators.

PowerShell script for obtaining DHCP scope usage statistics for all the DHCP servers in the domain

In this post I am presenting a solution to a common Windows admin task: pulling a report of DHCP scope usage statistics. The following script does just that.

Algorithm:

1> Get the list of all DHCP servers in the domain with the Get-DhcpServerInDC cmdlet.

2> Store the DHCP servers in a variable $machines.

3> Get the DHCP scopes for each DHCP server stored in $machines with the Get-DhcpServerv4Scope cmdlet.

4> Store the DHCP scopes in a variable $AllScopes.

5> Get the usage statistics for each DHCP scope with the Get-DhcpServerv4ScopeStatistics cmdlet.

6> Store the statistics for a particular scope in a variable $ScopeStats.

7> Append the collected data to a CSV file.

Script:

$CurrentDate = Get-Date -format dddd
$CSVFile = "C:\Dhcp_$CurrentDate.csv"
$machines = Get-DhcpServerInDC

"DHCPServer,ScopeName,ScopeID,FreeAddresses,AddressesInUse, `
PercentInUse" |out-file -FilePath $CSVFile -append

foreach ($machine in $machines)
{
 $DHCPName = $machine.DnsName
 $AllScopes = Get-DhcpServerv4Scope -ComputerName $DHCPName
 foreach ($scope in $AllScopes) 
    {
     $ScopeName = $scope.Name
     $ScopeId = $scope.ScopeId
     $ScopeStats = Get-DhcpServerv4ScopeStatistics`
     -ScopeId $ScopeId -ComputerName $DHCPName
     $ScopePercentInUse = $ScopeStats.PercentageInUse
     $Addfree = $ScopeStats.AddressesFree
     $AddUse = $ScopeStats.AddressesInUse
     $OutInfo = $DHCPName + "," + $ScopeName + "," + $ScopeId`
     + "," + $Addfree + "," + $AddUse + "," + $ScopePercentInUse
     Write-Host $OutInfo
     Add-Content -Value $OutInfo -Path $CSVFile
    }
}
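
The weekly extension below assumes the daily report above has been run once per day, so that Dhcp_Monday.csv through Dhcp_Sunday.csv exist. One way to arrange that, sketched here with an assumed script location of C:\Scripts\Get-DhcpStats.ps1, is a scheduled task:

<#--- Sketch: run the daily DHCP report every morning via a scheduled task ---#>
$action = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-ExecutionPolicy Bypass -File C:\Scripts\Get-DhcpStats.ps1"
$trigger = New-ScheduledTaskTrigger -Daily -At 6am
Register-ScheduledTask -TaskName "Daily DHCP scope report" -Action $action -Trigger $trigger
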
We can extend this solution to a scenario where we need the average DHCP scope usage statistics for a whole week. I agree that this is not the most elegant way to approach the requirement, but it gets the job done.

Algorithm:

1> Import the CSVs containing each day's DHCP scope data as arrays.

2> Combine the arrays into a single array and output it as a CSV file.

3> Remove duplicate columns from the CSV produced in the previous step by assigning headers to each column, selecting only the desired columns, and outputting a CSV file.

4> Calculate the average of each scope's statistics across all seven weekdays, add this data as columns to the CSV generated in the previous step, and output the final CSV file.

Script:

<#---Creating an array from each weekday's csv file 
and storing them in separate variables--#>

$csv1 = @(Get-Content "C:\Dhcp_Monday.csv")
$csv2 = @(Get-Content "C:\Dhcp_Tuesday.csv")
$csv3 = @(Get-Content "C:\Dhcp_Wednesday.csv")
$csv4 = @(Get-Content "C:\Dhcp_Thursday.csv")
$csv5 = @(Get-Content "C:\Dhcp_Friday.csv")
$csv6 = @(Get-Content "C:\Dhcp_Saturday.csv")
$csv7 = @(Get-Content "C:\Dhcp_Sunday.csv")

<#------Creating an array with combined csvs 
and giving output in csv-------------#>

$csv = @()
for ($i=0; $i -lt $csv1.Count; $i++) 
{
 $csv += $csv1[$i] + ', ' + $csv2[$i] + ', ' + $csv3[$i]+ ', ' + `
 $csv4[$i] + ', ' + $csv5[$i]+ ', ' + $csv6[$i] + ', ' + $csv7[$i]
}
$csv | Out-File "C:\consolidated.csv" -encoding default

<#---Removing duplicate or unwanted columns 
from combined csv from previous step----#>

$source = "C:\consolidated.csv"
$destination = "C:\consolidated2.csv"
$finaldestination = "C:\consolidatedfinal.csv"

(Import-CSV $source -Header 1,2,3,4,5,6,7,8,9,10,11,12,
13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,`
31,32,33,34,35,36,37,38,39,40,41,42 |
Select "1","2","3","4","5","6","10","11","12","16","17",`
"18","22","23","24","28","29","30","34","35","36","40",`
"41","42" |
 ConvertTo-Csv -NoTypeInformation |
 Select-Object -Skip 2) -replace '"' | Set-Content $destination

<#-----Performing averaging operation on 
selected columns and removing unwanted columns 
and giving final output in csv------#>

(Import-csv $destination -header "DHCP Server","Scope Name",`
"Scope ID",4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,`
21,22,23,24 | select "DHCP Server","Scope Name","Scope ID",`
@{Name="(Average of week) No. of IP Addresses free";`
Expression={[math]::Round(([Decimal]$_.4 + [Decimal]$_.7` 
+ [Decimal]$_.10 + [Decimal]$_.13 + [Decimal]$_.16 +`
[Decimal]$_.19 + [Decimal]$_.22)/7,2)}},`
@{Name="(Average of week) No. of IP Addresses in Use";`
Expression={[math]::Round(([Decimal]$_.5 + [Decimal]$_.8`
+ [Decimal]$_.11 + [Decimal]$_.14 + [Decimal]$_.17 +`
[Decimal]$_.20 + [Decimal]$_.23)/7,2)}},`
@{Name="(Average of week) Percent IP Addresses in Use";`
Expression={[math]::Round(([Decimal]$_.6 + [Decimal]$_.9`
+ [Decimal]$_.12 + [Decimal]$_.15 + [Decimal]$_.18 +`
[Decimal]$_.21 + [Decimal]$_.24)/7,2)}}|
ConvertTo-Csv -NoTypeInformation) -replace '"'|
Set-Content $finaldestination

<#-------Removing unwanted files---------#>

Remove-Item "C:\consolidated.csv"
Remove-Item "C:\consolidated2.csv"

<#-------Creating new folder with 
current month's and week's name----------#>

$week = (Get-WmiObject Win32_LocalTime).weekinmonth
$month = get-date -Format MMMM

New-Item -ItemType directory -Path "C:\$month Week$week"

<#-----Moving each weekday's dhcp report 
to the newly created folder----#>

Move-item "C:\Dhcp_Monday.csv" "C:\$month Week$week"
Move-item "C:\Dhcp_Tuesday.csv" "C:\$month Week$week"
Move-item "C:\Dhcp_Wednesday.csv" "C:\$month Week$week"
Move-item "C:\Dhcp_Thursday.csv" "C:$month Week$week"
Move-item "C:\Dhcp_Friday.csv" "C:$month Week$week"
Move-item "C:\Dhcp_Saturday.csv" "C:$month Week$week"
Move-item "C:\Dhcp_Sunday.csv" "C:$month Week$week"
Move-item "C:\consolidatedfinal.csv" "C:$month Week$week"

All thanks to Adam Dimech’s Coding Blog https://code.adonline.id.au/csv-to-xlsx-powershell/, we can convert the final csv report into a more usable xlsx file, as follows.

<#----Assign the desired path for 
storing final xlsx file to a variable----#>     

$week = (Get-WmiObject Win32_LocalTime).weekinmonth
$month = get-date -Format MMMM
$csv = "C:\consolidatedfinal.csv" #Location of the source file
$xlsx = "C:\$month Week$week\consolidated.xlsx"  
$delimiter = "," 

<#------Create a new Excel workbook 
with one empty sheet-------------#>

[threading.thread]::CurrentThread.CurrentCulture = 'en-US'
$excel = New-Object -ComObject excel.application 
$excel.DisplayAlerts=$false
$workbook = $excel.Workbooks.Add(1)
$worksheet = $workbook.worksheets.Item(1)

<#-----Build the QueryTables.Add command 
and reformat the data--------------#>

$TxtConnector = ("TEXT;" + $csv)
$Connector = $worksheet.QueryTables.`
add($TxtConnector,$worksheet.Range("A1"))
$query = $worksheet.QueryTables.item($Connector.name)
$query.TextFileOtherDelimiter = $delimiter
$query.TextFileParseType  = 1
$query.TextFileColumnDataTypes = ,1 * $worksheet.Cells.Columns.Count
$query.AdjustColumnWidth = 1

<#------Execute & delete the import query------#>

$query.Refresh()
$query.Delete()

<#------Save & close the Workbook as XLSX-------#>

$workbook.SaveAs($xlsx)
$excel.Quit() 

Thanks to a brilliant PowerShell module by Warren F http://ramblingcookiemonster.github.io/PSExcel-Intro/ , we can remove the inactive dhcp servers’ data from the final xlsx, as follows.

Import-XLSX "C:\$month Week$week\consolidated.xlsx"|

Where-Object {($_.'DHCP Server' -notin @("server1.abc.corp",`
"server2.abc.corp"))}|Export-XLSX -Path`
"C:\$month Week$week\$month Week$week dhcp scope report.xlsx" 

<#---Remove unwanted consolidated csv file---#>

Remove-Item "C:\$month Week$week\consolidated.xlsx"